WO2019100212A1 - Ultrasound system and method for planning ablation - Google Patents

Ultrasound system and method for planning ablation

Info

Publication number
WO2019100212A1
Authority
WO
WIPO (PCT)
Prior art keywords
ablation
real-time
path
ultrasound
Prior art date
Application number
PCT/CN2017/112174
Other languages
English (en)
French (fr)
Inventor
丛龙飞 (Cong Longfei)
张延慧 (Zhang Yanhui)
桑茂栋 (Sang Maodong)
Original Assignee
深圳迈瑞生物医疗电子股份有限公司 (Shenzhen Mindray Bio-Medical Electronics Co., Ltd.)
北京深迈瑞医疗电子技术研究院有限公司 (Beijing Shen Mindray Medical Electronics Technology Research Institute Co., Ltd.)
Priority date
Filing date
Publication date
Application filed by 深圳迈瑞生物医疗电子股份有限公司 (Shenzhen Mindray Bio-Medical Electronics Co., Ltd.) and 北京深迈瑞医疗电子技术研究院有限公司 (Beijing Shen Mindray Medical Electronics Technology Research Institute Co., Ltd.)
Priority to PCT/CN2017/112174 (WO2019100212A1)
Priority to CN201780094613.8A (CN111093516B)
Priority to CN202211667341.5A (CN115944392A)
Publication of WO2019100212A1
Priority to US16/879,732 (US20200281662A1)

Classifications

    • A61B5/6835 Supports or holders, e.g. articulated arms (detecting, measuring or recording means attached to or worn on the body surface)
    • A61B17/3403 Needle locating or guiding means
    • A61B18/12 Surgical instruments for heating tissue by passing a current through it, e.g. high-frequency current
    • A61B34/10 Computer-aided planning, simulation or modelling of surgical operations
    • A61B34/20 Surgical navigation systems; devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B34/25 User interfaces for surgical systems
    • A61B5/0035 Imaging apparatus adapted for acquisition of images from more than one imaging mode, e.g. combining MRI and optical tomography
    • A61B5/055 Magnetic resonance imaging [MRI]
    • A61B5/062 Determining position of a probe within the body using magnetic fields
    • A61B5/684 Indicating the position of the sensor on the body
    • A61B6/032 Transmission computed tomography [CT]
    • A61B6/5247 Combining image data from an ionising-radiation diagnostic technique and a non-ionising-radiation diagnostic technique, e.g. X-ray and ultrasound
    • A61B8/0841 Detecting or locating instruments by ultrasound
    • A61B8/085 Locating body or organic structures, e.g. tumours, calculi, blood vessels, nodules
    • A61B8/4254 Determining the position of the ultrasound probe using sensors mounted on the probe
    • A61B8/463 Displaying multiple images or images and diagnostic data on one display
    • A61B8/466 Displaying means adapted to display 3D data
    • A61B8/481 Diagnostic techniques involving the use of contrast agent, e.g. microbubbles introduced into the bloodstream
    • A61B8/485 Diagnostic techniques involving measuring strain or elastic properties
    • A61B8/5238 Combining image data of a patient, e.g. merging several images from different acquisition modes into one image
    • A61B8/5246 Combining images from the same or different imaging techniques, e.g. color Doppler and B-mode
    • G06T19/00 Manipulating 3D models or images for computer graphics
    • G06T19/003 Navigation within 3D models or images
    • A61B2017/3413 Needle locating or guiding means guided by ultrasound
    • A61B2034/101 Computer-aided simulation of surgical operations
    • A61B2034/105 Modelling of the patient, e.g. for ligaments or bones
    • A61B2034/107 Visualisation of planned trajectories or target regions
    • A61B2034/2051 Electromagnetic tracking systems
    • A61B2034/2063 Acoustic tracking systems, e.g. using ultrasound
    • A61B2090/364 Correlation of different images or relation of image positions in respect to the body
    • A61B2505/05 Surgical care
    • A61B2576/00 Medical imaging apparatus involving image processing or analysis
    • G06T2210/41 Medical (indexing scheme for image generation or computer graphics)

Definitions

  • The invention relates to a simulated interventional guidance apparatus based on a medical ultrasound imaging device, and to a method for evaluating the simulation effect.
  • Planning of interventional ablation surgery was proposed very early, but current interventional planning is based on three-dimensional data such as CT, MRI or three-dimensional ultrasound: for example, human tissues such as tumors and blood vessels are segmented in three dimensions from the volume data and displayed by three-dimensional reconstruction, and the ablation needle path is then set according to the processed medical image information.
  • For example, an image-guided surgical planning device has been proposed, in which the needle insertion point, angle, depth, power and ablation duration are set manually on the patient's three-dimensional image; a temperature field is calculated from the input microwave energy, and the three-dimensional temperature field is displayed fused with the patient's three-dimensional image.
  • Another approach provides sections of the three-dimensional model data showing the ablation treatment plan and equipment, using MPR (multi-planar reconstruction) display technology to present the three-dimensional image and the ablation volume.
  • However, the above planning schemes are based on static 3D data collected before surgery, and a path planned on such data differs greatly from the actual clinical interventional operation.
  • In the actual interventional treatment, the influence of body structures such as the ribs can make it impossible to insert the needle accurately along the planned path, which inevitably degrades the effect of the ablation treatment and increases the risk of the surgery.
  • According to one aspect, an ultrasound system for planning ablation is provided, which includes:
  • a transmitting circuit and a receiving circuit, which excite the ultrasonic probe to emit an ultrasonic beam to a detection object containing a specific tissue and receive the echoes of the ultrasonic beam to obtain an ultrasonic echo signal;
  • an image processing module, which obtains real-time ultrasound image data from the ultrasonic echo signal;
  • a navigation system, which includes a positioning device fixed on the ultrasonic probe; through the navigation system, the spatial orientation information of the positioning device fixed on the ultrasonic probe is obtained;
  • a memory, which stores a computer program run on the processor; and
  • a processor, by which a planned ablation path is determined on the real-time ultrasound image.
  • According to another aspect, an ultrasound imaging method for planning ablation is provided, which comprises:
  • recording in association the correspondence between the real-time ultrasound image and the spatial orientation information; and
  • determining a planned ablation path on the real-time ultrasound image.
  • According to a further aspect, an ultrasound system is provided, which comprises:
  • an ablation device, the ablation device being fixed on the ultrasound probe;
  • a transmitting circuit and a receiving circuit, which excite the ultrasonic probe to emit an ultrasonic beam to a detection object containing a specific tissue and receive the echoes of the ultrasonic beam to obtain an ultrasonic echo signal;
  • an image processing module, which obtains real-time ultrasound image data from the ultrasonic echo signal;
  • a navigation system, which includes a positioning device fixed on the ultrasonic probe; through the navigation system, the spatial orientation information of the positioning device fixed on the ultrasonic probe is obtained;
  • a memory, which stores a computer program run on the processor; and
  • a processor, by which the planned ablation path and the actual ablation path are stored in association.
  • FIG. 1 is a schematic diagram of the system architecture of an ultrasound system for planning ablation in accordance with some embodiments;
  • FIG. 2 is a schematic flow chart for the device of FIG. 1 in one embodiment;
  • FIG. 3 is a schematic flow chart for the device of FIG. 1 in other embodiments;
  • FIG. 4 is a schematic flow chart for the device of FIG. 1 in some embodiments;
  • FIG. 5 is a schematic flow chart for the device of FIG. 1 in a contrast (angiography) imaging mode;
  • FIG. 6 is a schematic flow chart for the device of FIG. 1 in a multi-frame joint ablation mode; and
  • FIG. 7 is a schematic diagram of a display interface of the device of FIG. 1 in one embodiment.
  • FIG. 1 shows a schematic view of the structure of an ultrasound system 100 for planning ablation in one embodiment; its specific structure is described below.
  • the ultrasound system 100 for planning ablation shown in FIG. 1 mainly includes an ultrasound probe 101, a transmitting circuit 103, a transmit/receive selection switch 102, a receiving circuit 104, a beamforming module 105, a signal processing module 116, and an image processing module 126.
  • the transmitting circuit 103 transmits a delayed-focused transmission pulse having a certain amplitude and polarity to the ultrasonic probe 101 through the transmission/reception selection switch 102.
  • The ultrasonic probe 101 is excited by the transmission pulses to emit ultrasonic waves (which may be any one of plane waves, focused waves or divergent waves) toward a test object containing a specific tissue (for example, a specific tissue in a human or animal body, a blood vessel thereof, etc.; the specific tissue herein includes tissue that needs to be removed by ablation in a human or animal body, for example tumor tissue). After a certain delay, the probe receives the ultrasonic echoes reflected back from the target area carrying information about the detection object, and converts these echoes back into electrical signals.
  • The receiving circuit 104 receives the electrical signals generated by the ultrasonic probe 101, obtains the ultrasonic echo signals, and sends them to the beamforming module 105.
  • The beamforming module 105 performs focus delay, weighting and channel summation on the ultrasonic echo signals, and then sends them to the signal processing module 116 for related signal processing.
  • the ultrasonic echo signals processed by the signal processing module 116 are sent to the image processing module 126.
  • the image processing module 126 performs different processing on the signals according to different imaging modes required by the user, obtains ultrasonic image data of different modes, and then forms ultrasonic images of different modes through logarithmic compression, dynamic range adjustment, digital scan conversion, and the like.
  • The transmitting circuit and the receiving circuit excite the ultrasonic probe to emit an ultrasonic beam to the detection object according to the ultrasound imaging parameter settings and receive the echoes of the ultrasonic beam to obtain an ultrasonic echo signal; from this echo signal the desired ultrasound image is obtained and displayed, revealing the tissue structure of the specific tissue and its surroundings.
  • The ultrasound image obtained by exciting the ultrasound probe at different orientations is called a real-time ultrasound image here.
  • The real-time ultrasound image changes as the orientation of the ultrasound probe is adjusted, and also changes as the ultrasound imaging parameters are adjusted.
  • The real-time ultrasound image therefore differs from a frozen image: a frozen image refers to stored image data acquired while the ultrasound imaging device is executing the freeze function.
  • The ultrasound imaging parameters mentioned herein refer to all parameters that the user can select during imaging of the ultrasound tissue image, such as TGC (time gain compensation), acoustic frequency, pulse repetition frequency (PRF), ultrasonic wave type, and dynamic range.
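  • As a concrete illustration only (not part of the patent), the user-adjustable parameters named above might be grouped as follows; every field name here is hypothetical:

```python
from dataclasses import dataclass

@dataclass
class ImagingParams:
    """Hypothetical grouping of the user-adjustable imaging parameters listed above."""
    tgc_db: list[float]       # time gain compensation, dB per depth zone
    frequency_mhz: float      # acoustic (transmit) frequency
    prf_hz: float             # pulse repetition frequency
    wave_type: str            # "plane", "focused" or "divergent"
    dynamic_range_db: float   # display dynamic range

params = ImagingParams(tgc_db=[0.0, 2.0, 4.0, 6.0], frequency_mhz=3.5,
                       prf_hz=1000.0, wave_type="focused", dynamic_range_db=60.0)
```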
  • the ultrasound system 100 for planning ablation further includes a display screen 130, a processor 140, and a memory 160 and a human-machine interaction device 150.
  • The processor 140 is configured to output the obtained ultrasound image to the display screen 130 for display; the processor 140 calls the computer program instructions stored in the memory 160 to display the ultrasound image on the display screen 130 and acquires, through the human-machine interaction device, the control instructions input by the user on the displayed ultrasound image.
  • the human-machine interaction device herein may include one of a keyboard, a scroll wheel, a mouse, a touch display screen, and the like, and the display screen 130 may also be a normal display screen or a touch display screen.
  • The human-machine interaction device 150 can also be a touch display screen, in which case the control command input by the user on the ultrasound image is acquired through it; the processor 140 can call the computer program instructions recorded in the memory 160 to detect the contact of the input object on the touch display screen and thereby determine the control command entered by the user on the displayed ultrasound image.
  • For example, the processor 140 calls the computer program instructions in the memory 160 to detect the contact of an input object (for example, an index finger, a thumb, a stylus or a dedicated touch-screen pen) with the ultrasound image first displayed on the touch display screen; the processor 140 can call a gesture detection module stored in the memory 160 to detect the control command produced when the user performs a contact operation on the graphical user interface through the input object.
  • In some embodiments the system includes a touch display having a graphical user interface (GUI), one or more processors, a memory, and one or more modules, programs or instruction sets stored in the memory for performing various functions; together these implement detection of manipulation input based on the graphical user interface (GUI) and obtain the relevant control instructions.
  • the user interacts with the graphical user interface primarily through gesture input on the touch display.
  • the gesture input herein may include any type of user gesture input that the device can detect by directly touching the touch display or proximity to the touch display.
  • The gesture input may be made with a finger of the user's right or left hand (e.g., an index finger or a thumb), or with an input object detectable by the touch display (e.g., a stylus or a dedicated touch-screen pen), and may select one location, multiple locations and/or multiple consecutive locations on the touch display; the operation actions may include contact, touch release, touch tap, long contact, rotational deployment, and the like.
  • The gesture detection module can detect the gesture input interaction between the input object and the touch display screen, such as determining whether contact has occurred, whether the gesture input is continuous, whether it corresponds to a predetermined gesture, and the operation position corresponding to the gesture input; determining whether that operation position has moved to the edge of the corresponding display area and whether the gesture input has been interrupted (e.g., whether contact has stopped); and determining the movement of the gesture input and tracking its movement trajectory.
  • The gesture detection module is stored in the memory, and the above gesture inputs are monitored through calls by one or more processors, so as to obtain the user's operation input instructions.
  • The control command input by the user is obtained through the keyboard, the scroll wheel, the touch display screen or another device of the human-machine interaction device; according to this command, the ultrasound imaging parameters of the ultrasonic probe can be adjusted, the working mode of the ultrasonic probe can be switched, or the spatial position of the probe can be adjusted.
  • Working modes include contrast imaging, elastography, and the like.
  • the ultrasound system 100 for planning ablation also includes a navigation system, which in FIG. 1 includes a magnetic field emission and signal receiving module 170 and a positioning device 111 secured to the ultrasound probe 101.
  • the ultrasound system 100 for planning ablation further includes a body positioning device 180.
  • the magnetic field emission and signal receiving module 170 is configured to generate a magnetic field and receive a signal fed back by the positioning device 111 located in the magnetic field, and obtain spatial orientation information of the positioning device 111 with respect to the magnetic field based on the feedback signal.
  • The spatial orientation information may be expressed in different coordinate-system representations, exhibiting at least one of position information and direction information relative to the magnetic field.
  • The magnetic field herein includes electromagnetic fields and the like.
  • the magnetic field emission and signal receiving module 170 is coupled to the positioning device 111 in a data line or wireless form.
  • the magnetic field emission and signal receiving module 170 is configured to emit a magnetic field and receive position information transmitted by the positioning device 111.
  • The specific positioning principle is that the positioning device 111 is placed within the magnetic field range; the positioning device (for example, a positioning coil) feeds back the magnetic field information of its current orientation to the magnetic field emission and signal receiving module 170, and this module calculates the current spatial coordinates of the positioning device 111.
  • For example, in a six-tuple (x, y, z, a, b, c), the first three coordinates are the spatial coordinates (i.e., position information) of the positioning device 111 relative to the magnetic field at the current time, and the last three parameters are the direction of the positioning device 111 relative to the magnetic field at the current time (i.e., orientation information).
  • The spatial coordinates and orientation information of an object can also be described in the form of Euler angles, quaternions, or matrices.
  • Thus the direction information and the spatial coordinates (i.e., position information) of the positioning device 111 relative to the magnetic field at the current time may be expressed jointly as (x, y, z, a, b, c), representing the spatial orientation information returned by the positioning device; alternatively, the spatial orientation information may be characterized using only (x, y, z) or only (a, b, c).
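  • For illustration (not taken from the patent), a pose reported as (x, y, z, a, b, c) can be packed into a single 4x4 homogeneous transform; the Euler-angle convention used below (intrinsic Z-Y-X) is an assumption and must match the actual tracker's documentation:

```python
import numpy as np

def pose_to_matrix(x, y, z, a, b, c):
    """Convert a tracker pose (position in mm, Euler angles in radians) into a
    4x4 homogeneous transform; Z-Y-X intrinsic rotation order is assumed."""
    ca, sa = np.cos(a), np.sin(a)
    cb, sb = np.cos(b), np.sin(b)
    cc, sc = np.cos(c), np.sin(c)
    Rz = np.array([[ca, -sa, 0.0], [sa, ca, 0.0], [0.0, 0.0, 1.0]])
    Ry = np.array([[cb, 0.0, sb], [0.0, 1.0, 0.0], [-sb, 0.0, cb]])
    Rx = np.array([[1.0, 0.0, 0.0], [0.0, cc, -sc], [0.0, sc, cc]])
    T = np.eye(4)
    T[:3, :3] = Rz @ Ry @ Rx     # orientation information (a, b, c)
    T[:3, 3] = [x, y, z]         # position information (x, y, z)
    return T
```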
  • The positioning device 111 is fixed to the ultrasonic probe 101, as shown in FIG. 1.
  • the positioning device 111 can return in real time the spatial orientation information, such as position information and/or direction information, at the current time in the magnetic field of the positioning device.
  • the positioning device 111 is used for binding with the ultrasound probe 101, and the spatial orientation information returned in real time may be equivalent to the current spatial orientation information of the ultrasound probe.
  • The binding can use a specially designed card-slot fixture that fastens the positioning device 111 at a fixed position on the surface of the ultrasonic probe 101, thereby forming the probe 121; it is also possible to build the positioning device 111 into the ultrasonic probe 101 when the probe is manufactured, likewise forming the integral probe 121.
  • In this way the mapping matrix Pi can be calculated; this matrix maps coordinates in the image space coordinate system of the probe 121 into the magnetic field coordinate system formed by the magnetic field, where i indicates the current time.
  • The mapping matrix Pi typically contains the following two parts: a fixed calibration matrix from the image coordinate system to the positioning device, and the real-time pose matrix of the positioning device relative to the magnetic field.
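  • A minimal sketch of that two-part decomposition (the calibration matrix is assumed to come from a prior probe-calibration procedure; all names are illustrative):

```python
import numpy as np

def mapping_matrix(T_sensor_to_field, C_image_to_sensor):
    """Pi maps image-space coordinates into the magnetic field coordinate system.
    T_sensor_to_field: real-time 4x4 pose of the positioning device (tracker).
    C_image_to_sensor: fixed 4x4 calibration from the image plane to the sensor."""
    return T_sensor_to_field @ C_image_to_sensor

def pixel_to_field(Pi, u, v, sx, sy):
    """Map pixel (u, v) of a 2D frame, with pixel spacing (sx, sy) in mm,
    into the magnetic field coordinate system."""
    p_image = np.array([u * sx, v * sy, 0.0, 1.0])  # homogeneous in-plane point
    return (Pi @ p_image)[:3]
```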
  • The navigation system formed by the positioning device 111 and the magnetic field emission and signal receiving module 170 can adopt related technologies from the field of navigation devices; for details, refer to the descriptions of the related art, which are not repeated here. Of course, the navigation system in this embodiment may be, but is not limited to, the foregoing magnetic positioning scheme: any technique that yields spatial orientation information locating the ultrasonic probe in real space can serve as the navigation system.
  • The body positioning device 180 is an optional component of the ultrasound system 100 for planning ablation; when placed within the range of the generated magnetic field, it can return its current spatial orientation information in real time. For the meaning of spatial orientation information, see the preceding description; for example, it can include position information and direction information (refer to the explanation of the positioning device 111 above).
  • The body positioning device 180 is configured to be placed on the surface of the measured body (for example, a human or animal body), and can be used to acquire the current spatial orientation information of the measured body or the motion information of its surface.
  • For example, the body positioning device 180 may be fixed on the surface of the measured body containing the detection object; the magnetic field emission and signal receiving module 170 receives the detection signal fed back by the body positioning device 180 located in the magnetic field, and from this signal obtains the spatial orientation information of the body positioning device 180 relative to the aforementioned magnetic field and/or the motion information of the surface of the measured body. The surface motion information mentioned here includes respiratory motion information of the measured body, such as the respiratory frequency.
  • The obtained detection information can be used to correct the information obtained by the positioning device 111 on the ultrasonic probe; for example, the spatial orientation information obtained by the positioning device 111 can be corrected based on the detection signal fed back by the body positioning device 180.
  • The body positioning device 180 can be fixed to the skin surface of the subject using double-sided adhesive, tape, a bandage or the like, and is kept at the same position on the skin surface during the scanning acquisition of the entire ultrasound image.
  • The aforementioned ultrasonic probe may be of different types, such as a two-dimensional convex array probe, a three-dimensional convex array probe, a four-dimensional array probe, a linear array probe, etc.; when different probes are used for ablation and ablation evaluation, the specific data processing techniques can be adjusted according to the probe type.
  • With the device described above, the processor 140 can obtain the real-time ultrasound image corresponding to the ultrasonic probe at any orientation within the magnetic field together with the spatial orientation information of that orientation, and the correspondence between the spatial orientation information and the real-time ultrasound image supports the display and analysis of real-time ultrasound images together with the related spatial orientation information.
  • The signal processing module 116 and the image processing module 126 of FIG. 1 may be integrated on one motherboard 106, or one or more of these modules may be implemented on one or more processor/controller chips.
  • The processor 140 and the memory 160 may be disposed on the motherboard 106 or independently of it.
  • The processor 140 can also be implemented on one or more processor/controller chips together with the signal processing module 116 and the image processing module 126 of FIG. 1.
  • The aforementioned magnetic field emission and signal receiving module 170 and the positioning device 111, optionally together with the body positioning device 180, may constitute a magnetic navigation positioning system.
  • In step 210 of FIG. 2, the transmitting and receiving circuits (103 and 104) excite the ultrasonic probe (101) to emit an ultrasonic beam to a detection object containing a specific tissue according to the set ultrasound imaging parameters, and receive the echoes of the ultrasonic beam to obtain an ultrasonic echo signal.
  • In step 220, the image processing module (126) obtains a real-time ultrasound image from the ultrasonic echo signal based on some or all of the aforementioned ultrasound imaging parameters.
  • the real-time ultrasound image herein may also be a different mode of ultrasound image as described above, such as a B image, a C image, a D image, etc., or other types of two-dimensional ultrasound images or three-dimensional ultrasound images.
  • the real-time ultrasound image obtained in this embodiment may be an ultrasound image obtained when the ultrasound probe is located at any one orientation, and the real-time ultrasound image may be a currently obtained one-frame ultrasound image or a continuous multi-frame ultrasound image.
  • The spatial orientation information of the positioning device fixed on the ultrasound probe is obtained through the navigation system.
  • the navigation system includes at least the aforementioned positioning device 111, and the positioning device 111 is fixed to the ultrasonic probe 101 as described above.
  • the magnetic field emission and signal receiving module 170 generates a magnetic field that covers the spatial extent of the positioning device 111.
  • the magnetic field emission and signal receiving module 170 receives the signal fed back by the positioning device 111 located in the magnetic field, and obtains the spatial orientation information of the positioning device 111 with respect to the magnetic field according to the signal fed back by the positioning device 111.
  • In step 240 of FIG. 2, the processor 140 calls a program in the memory to record, in association, the correspondence between the real-time ultrasound image and the spatial orientation information obtained in the foregoing steps, so that the mapping relationship between the image space coordinate system and the magnetic field space coordinate system can be obtained; using this mapping relationship, the real-time ultrasound image, the imported three-dimensional model data, and the ultrasonic probe can be mapped into the same coordinate system, which facilitates fused display.
  • The image space coordinate system is the coordinate space of the image pixels of the real-time ultrasound image acquired from the specific tissue through the ultrasonic probe, and the magnetic field space coordinate system is the coordinate space covered by the magnetic field.
  • the mapping relationship between the image space coordinate system and the magnetic field space coordinate system mentioned herein can be represented by the aforementioned mapping matrix.
  • The spatial orientation information of the ultrasonic probe can be acquired in real time by the magnetic field emission and signal receiving module, and the real-time ultrasound image corresponding to each moment is obtained by exciting the ultrasonic probe in real time; the ultrasound system for planning ablation can therefore correlate each real-time ultrasound image with the spatial orientation information of the ultrasound probe at the time of acquisition and store the pair. For example, when the ultrasound probe is used to acquire cine image data (the cine image data here may include, but is not limited to, multiple frames of consecutive two-dimensional ultrasound images), the spatial orientation information of the ultrasonic probe corresponding to each frame of the cine image data can be obtained.
  • With the positioning device, the body positioning device and the magnetic field emission and signal receiving module of the device shown in FIG. 1, the correspondence between any frame of the real-time ultrasound image and the spatial orientation information of the ultrasonic probe in the magnetic field space coordinate system can be correlated and recorded: the ultrasonic probe at a given orientation yields the real-time ultrasound image corresponding to that orientation together with the spatial orientation information of that orientation, and the two are stored to establish the correspondence.
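  • As a toy illustration (class and field names are ours, not the patent's), recording that correspondence can be as simple as storing synchronized frame/pose pairs:

```python
import numpy as np

class PoseRecorder:
    """Stores each acquired frame together with the mapping matrix Pi reported
    by the navigation system at the moment of acquisition."""
    def __init__(self):
        self.frames = []   # 2D ultrasound frames (H x W arrays)
        self.poses = []    # matching 4x4 mapping matrices Pi

    def record(self, frame: np.ndarray, Pi: np.ndarray):
        self.frames.append(frame.copy())
        self.poses.append(Pi.copy())
```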
  • In step 250, the processor 140 imports the three-dimensional model data of the specific tissue described above to obtain the view data of the specific tissue.
  • The three-dimensional model data mentioned herein includes at least one of size information, shape information, position information, and information on the distribution of surrounding blood vessels of the specific tissue; it may be single-frame static image data or cine image data, for example three-dimensional image data of the specific tissue from modalities such as CT, MRI, or three-dimensional contrast-enhanced ultrasound.
  • the three-dimensional model data may be pre-stored offline image data acquired by using other devices or the device of the embodiment, or may be three-dimensional image data acquired by the ultrasound system for planning ablation provided in the present embodiment.
  • the three-dimensional model data may be derived from image data acquired prior to surgery.
  • the three-dimensional model data of the specific tissue in the above step 250 is obtained by the following steps.
  • First, the cine image data containing the foregoing specific tissue is acquired by the ultrasonic probe; this cine image data may be obtained after the contrast agent has perfused the tissue.
  • The spatial orientation information corresponding to the ultrasonic probe at the different orientations used while acquiring the cine image data is acquired by the navigation system; the processor then maps each frame of the cine image data into the magnetic field space coordinate system according to that spatial orientation information and reconstructs a three-dimensional ultrasound image, thereby obtaining the three-dimensional model data of the aforementioned specific tissue.
  • The cine image data obtained in the contrast imaging mode is taken as an example below and described in conjunction with the flow shown in FIG. 5.
  • the transmitting circuit and the receiving circuit (103 and 104) excite the ultrasonic probe (101) to emit an ultrasonic beam to a detection object containing a specific tissue, and receive an echo of the ultrasonic beam to obtain an ultrasonic echo signal;
  • The image processing module obtains a real-time ultrasound image based on the ultrasonic echo signal, as specifically described for steps 210 and 220 of FIG. 2.
  • the magnetic field emission and signal receiving module 170 generates a magnetic field, and receives a signal fed back by the positioning device 111 located in the magnetic field, and obtains spatial orientation information of the positioning device 111 with respect to the magnetic field according to the signal fed back by the positioning device 111.
  • The ultrasound probe is moved over the subject to look for the specific tissue, and an observation section containing the specific tissue is selected.
  • Step 516 is then performed: a mode switching instruction input by the user may be received to enter the contrast imaging mode, the ultrasound contrast agent is injected into the subject (step 518), and when the specific tissue is perfused, the cine image data containing the specific tissue is acquired.
  • The cine image data here includes multiple frames of consecutive two-dimensional images.
  • The cine image data acquired at this time may be a cine loop acquired while the ultrasonic probe is swung in a fixed direction or placed at different positions in a region near the specific tissue.
  • The spatial orientation information corresponding to the ultrasonic probe in these different orientations can be obtained by the magnetic field emission and signal receiving module 170; since swinging the probe in a fixed direction or placing it at different positions near the specific tissue corresponds to different orientations in space, the spatial orientation information corresponding to each frame of the above cine image data is obtained.
  • In step 522, the spatial orientation information corresponding to each frame of the captured cine image data is obtained, for example in the form of the aforementioned mapping matrix Pi.
  • In step 524, each frame of the cine image data is mapped into the magnetic field space coordinate system according to the spatial orientation information corresponding to that frame, and a three-dimensional ultrasound contrast image is reconstructed, thereby obtaining one instance of the aforementioned three-dimensional model data (step 526).
  • The above reconstruction algorithm is the so-called freehand reconstruction based on the navigation system: the specific implementation maps each pixel of each frame into the magnetic field coordinate system using the mapping matrix Pi to form a three-dimensional image, and gaps are filled with an interpolation algorithm (e.g., nearest-neighbor or linear interpolation).
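  • The following sketch is illustrative only: the voxel grid, the nearest-voxel splatting with averaging of overlaps, and the omission of hole-filling interpolation are all simplifying assumptions:

```python
import numpy as np

def freehand_reconstruct(frames, poses, voxel_mm, origin_mm, vol_shape, sx, sy):
    """Freehand 3D reconstruction: splat each pixel of each tracked 2D frame
    into a 3D volume. frames: list of HxW grayscale frames; poses: matching
    4x4 matrices Pi (image -> field space); origin_mm: (3,) volume origin."""
    vol = np.zeros(vol_shape, dtype=np.float32)
    cnt = np.zeros(vol_shape, dtype=np.uint16)
    for frame, Pi in zip(frames, poses):
        h, w = frame.shape
        v, u = np.mgrid[0:h, 0:w]
        pts = np.stack([u.ravel() * sx, v.ravel() * sy,
                        np.zeros(h * w), np.ones(h * w)])   # homogeneous points
        world = (Pi @ pts)[:3]                              # map into field space
        idx = np.round((world - origin_mm[:, None]) / voxel_mm).astype(int)
        ok = np.all((idx >= 0) & (idx < np.array(vol_shape)[:, None]), axis=0)
        i, j, k = idx[:, ok]
        np.add.at(vol, (i, j, k), frame.ravel()[ok])        # accumulate intensities
        np.add.at(cnt, (i, j, k), 1)                        # count contributions
    return np.where(cnt > 0, vol / np.maximum(cnt, 1), 0.0) # average overlaps
```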
  • The aforementioned cine image data may also be obtained in B mode, M mode, and the like, and the three-dimensional ultrasound image reconstructed from it in the same way.
  • From the reconstruction, the three-dimensional body information of the specific tissue can be obtained, yielding the view data of the specific tissue.
  • For example, the three-dimensional body of the specific tissue is obtained by an image segmentation extraction step.
  • The image segmentation extraction step includes segmenting and extracting the shape, position information, size, depth and similar properties of the specific tissue from the aforementioned three-dimensional ultrasound image.
  • For example, in a manual segmentation method, the boundary of the specific tissue is drawn by hand on each two-dimensional slice of the three-dimensional image data, and the two-dimensional slice segmentation results are combined to generate the three-dimensional body (T) of the specific tissue.
  • A multi-needle combined ablation zone covers not only the tissue region itself but also a safety boundary.
  • Here T may denote either the specific tissue alone, or the specific tissue together with its safety boundary; in the latter case, the volume data of the latter contains the former, and both need to be ablated during the interventional ablation procedure.
  • The safety boundary of the three-dimensional body can be generated by expanding the tissue region of the specific tissue outward by a certain distance (the expansion algorithm can be a simple morphological filtering operation), as sketched below.
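  • A minimal sketch of that expansion, assuming a binary 3D tissue mask with isotropic voxels (scipy's standard morphology routines are used for illustration):

```python
import numpy as np
from scipy.ndimage import binary_dilation, generate_binary_structure

def with_safety_margin(tissue_mask, margin_mm, voxel_mm):
    """Dilate a binary 3D tissue mask outward by margin_mm, producing the
    ablation target T = specific tissue + safety boundary."""
    n_iter = max(1, int(round(margin_mm / voxel_mm)))  # margin in voxel steps
    struct = generate_binary_structure(rank=3, connectivity=1)
    return binary_dilation(tissue_mask, structure=struct, iterations=n_iter)
```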
  • The three-dimensional body of the specific tissue in this step may be the tissue region of the specific tissue itself, or the ablation region corresponding to the specific tissue, for example the range enclosed by the safety boundary corresponding to the specific tissue.
  • The three-dimensional body of the specific tissue contains at least one of the shape, position information, size, and depth of the specific tissue and the distribution of blood vessels around it, together with the safety boundary.
  • The view data mentioned in this embodiment includes at least one of the following: section image data of the specific tissue, three-dimensional image data of the specific tissue, two-dimensional graphic data formed from the three-dimensional body obtained by segmenting the specific tissue, and three-dimensional graphic or icon data formed from that three-dimensional body.
  • The graphic or icon data need not be real image data; it may be an icon that characterizes the three-dimensional body information on a figure or image, where the three-dimensional body information includes one or more of the shape, position information, size, and depth of the specific tissue and its safety boundary.
  • The processor 140 registers the three-dimensional model data with the real-time ultrasound image according to the result of the association record, i.e., the correspondence between the real-time ultrasound image and the spatial orientation information, so that the correspondence between the three-dimensional body and the real-time ultrasound image can be obtained, establishing a mapping relationship among the three-dimensional body, the real-time ultrasound image, and the space in which the positioning device is located.
  • a fused display can be performed, for example, in one embodiment, a real-time ultrasound image is displayed on a display screen, and view data about a specific tissue is fused on the ultrasound image.
  • The view data of the specific tissue may be mapped into the same coordinate system based on the spatial orientation information corresponding to the obtained real-time ultrasound images.
  • The processor may fuse and display, on the display screen, the real-time ultrasound image and the view data of the same specific tissue based on the registration results.
  • The imported 3D model data and the real-time ultrasound image are displayed by navigation-based fusion: the registration process registers the 3D model data with the real-time ultrasound image, so that some or all of the content of the 3D model data can be displayed in real time, linked with the real-time ultrasound image, for example the two-dimensional image section and the three-dimensional body in the three-dimensional model data that correspond to the current real-time ultrasound image.
  • For example, registration of CT/MRI data with real-time ultrasound can be used to establish the fusion of the three-dimensional data (including the specific tissue information) with the real-time ultrasound image.
  • There are many methods of image registration fusion, such as point registration, surface registration, and the like.
  • The surface fusion registration step may be: select an image section in the three-dimensional model data; use the ultrasonic probe to find the same section in the real-time ultrasound image on the measured body (the mapping matrix at this moment being Pt); and establish the mapping of the three-dimensional image data as Pt*M, where M denotes the transformation from the three-dimensional model data to the selected image section.
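  • An illustrative composition of that registration (both matrices are assumed to be 4x4 homogeneous transforms; the names are ours):

```python
import numpy as np

def model_to_field(M_model_to_plane, Pt_plane_to_field):
    """Compose the section-based registration: model data -> selected image
    plane (M), then image plane -> magnetic field space (Pt), i.e. Pt*M."""
    return Pt_plane_to_field @ M_model_to_plane

def field_point_to_image(Pi_image_to_field, p_field_xyz):
    """Bring a point from field space back into the current frame's image
    coordinates through the inverse of that frame's mapping matrix Pi."""
    p = np.append(np.asarray(p_field_xyz, dtype=float), 1.0)
    return (np.linalg.inv(Pi_image_to_field) @ p)[:3]
```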
  • In some embodiments, the three-dimensional view data or the two-dimensional view data representing the three-dimensional body may be extracted and merged with the real-time ultrasound image data, and the display may take at least the following three display modes: displaying the three-dimensional view data corresponding to the specific tissue together with the two-dimensional real-time ultrasound image; displaying the three-dimensional view data corresponding to the specific tissue together with the three-dimensional real-time ultrasound image; and displaying the two-dimensional view data corresponding to the specific tissue together with the real-time ultrasound image.
  • The above three display modes can be shown simultaneously in the same interface of the display screen.
  • For example, the three display modes are shown respectively in multiple display areas of the same interface and thus arranged side by side on the same interface.
  • The processor 140 fuses the real-time ultrasound image and the view data of the specific tissue, so that at least one fusion map can be formed.
  • For example, the view data of the aforementioned specific tissue is marked on a frame of the real-time ultrasound image to form one frame of the fusion map, the real-time ultrasound image and the view data of the specific tissue being displayed at corresponding places on the fusion map according to their image data in the same spatial coordinate system.
  • Specifically, a frame of the real-time ultrasound image is extracted and output, the position on it corresponding to the view data of the specific tissue is determined based on the registration result between the real-time ultrasound image and the view data of the specific tissue, and the view data of the specific tissue is superimposed on that frame at the determined position, forming at least one fusion map (a toy sketch of this superposition follows below).
  • the real-time ultrasound image in this embodiment may be a two-dimensional image or a three-dimensional image.
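  • A toy sketch of that superposition (purely illustrative; it assumes the segmented three-dimensional body is available as a binary mask in field space, with the same hypothetical grid parameters as the reconstruction sketch above):

```python
import numpy as np

def overlay_body_on_frame(frame, Pi, body_mask, voxel_mm, origin_mm, sx, sy):
    """Mark the segmented 3D body on a 2D frame: map every pixel through Pi
    into field space and test membership in the binary mask."""
    h, w = frame.shape
    v, u = np.mgrid[0:h, 0:w]
    pts = np.stack([u.ravel() * sx, v.ravel() * sy,
                    np.zeros(h * w), np.ones(h * w)])
    world = (Pi @ pts)[:3]
    idx = np.round((world - origin_mm[:, None]) / voxel_mm).astype(int)
    ok = np.all((idx >= 0) & (idx < np.array(body_mask.shape)[:, None]), axis=0)
    hit = np.zeros(h * w, dtype=bool)
    hit[ok] = body_mask[idx[0, ok], idx[1, ok], idx[2, ok]]
    rgb = np.stack([frame] * 3, axis=-1).astype(np.float32)
    rgb[hit.reshape(h, w), 0] = 255.0   # tint the tissue region in red
    return rgb
```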
  • the processor 140 displays the aforementioned real-time ultrasound image on the aforementioned display screen 130.
  • When the real-time ultrasound image is displayed on the display screen, it can be shown in split screens or in multiple windows.
  • the image corresponding to the real-time ultrasound image in the three-dimensional model data may also be displayed synchronously.
  • the manner of marking the aforementioned three-dimensional body on the aforementioned real-time ultrasonic image includes at least one of the following modes:
  • For example, in figure A of FIG. 7, the image data corresponding to the aforementioned real-time ultrasound image (713 in the figure) is extracted from the aforementioned three-dimensional model data and displayed, and the two-dimensional view data 712 of the specific tissue is drawn on the displayed image data 713 to highlight the region where the three-dimensional body is located.
  • the ablation zone 711 can also be marked in Figure A.
  • As another example, a fusion map in the magnetic field space coordinate system is established according to the correspondence between the real-time ultrasound image and the spatial orientation information (such as figure C in the lower left corner of FIG. 7); in fusion map C, the positional relationship between the ultrasonic probe 721 and the real-time ultrasound image 722 is displayed, and the three-dimensional view data 724 of the aforementioned specific tissue is marked at the corresponding position of the fusion map.
  • An interventional device, such as the ablation needle 725, and the ablation region 723 can also be marked in fusion map C.
  • the fusion map may be obtained by superimposing a three-dimensional volume mark on a two-dimensional real-time ultrasonic image (a three-dimensional body mark may also be a circle, as shown by B in FIG. 7), or may superimpose a three-dimensional body mark on a three-dimensional real-time ultrasonic image.
  • the obtained image may also be a combined image formed based on the spatial positional relationship of the real-time ultrasound image and the ultrasound probe (as shown by C in FIG. 7).
  • the view data of the specific tissue superimposed and displayed on the real-time ultrasound image may adopt the icon mark shown in FIG. 7 to mark the shape and position of the specific tissue at the corresponding position of the real-time ultrasound image.
  • For example, a straight line segment is used as the ablation device marker, and the ablation device marker and the predicted ablation region are displayed on the real-time ultrasound image data to represent the planned ablation path, the predicted ablation region being displayed at the tip of the ablation device marker.
  • When the position of the ablation device marker on the real-time ultrasound image changes, the predicted ablation region changes along with it; likewise, when the orientation of the ultrasound probe changes, the ablation device marker and/or the predicted ablation region change in concert.
  • a planned ablation path is determined on the real-time ultrasound image. For example, based on the real-time ultrasound image, the user's input on the real-time ultrasound image is received by the human-machine interaction device, and based on the user's input, the processor 140 determines the planned ablation path.
  • The planned ablation path referred to herein includes at least one of: the puncture guide line angle, the puncture guiding direction, the ablation needle entry path, the ablation path depth, the ablation power, the number of ablation needles, the expected working time, and the expected ablation range (or ablation region).
  • The ablation needle entry path may include information such as the needle distance, the needle insertion angle, the needle insertion depth, and the needle insertion position; an illustrative record of these fields is sketched below.
  • The user can set the ablation needle entry path based on the real-time ultrasound image displayed on the display screen, so that the interventional ablation path can be measured and positioned accurately; this enhances the accuracy of the planned ablation path, improves the ablation effect, and reduces surgical risk.
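The planned ablation path described above is essentially a bundle of related parameters. As a purely illustrative sketch (all field names and values are hypothetical and not taken from the patent), such a path could be recorded as follows:

```python
# Hypothetical record of a planned ablation path; field names are
# illustrative assumptions only, not the patent's data format.
from dataclasses import dataclass
from typing import Tuple

@dataclass
class PlannedAblationPath:
    guide_line_angle_deg: float                   # puncture guide line angle
    guide_direction: Tuple[float, float, float]   # puncture guiding direction (unit vector)
    entry_point_mm: Tuple[float, float, float]    # needle insertion position
    path_depth_mm: float                          # ablation path depth
    ablation_power_w: float                       # ablation power
    expected_work_time_s: float                   # expected working time
    needle_count: int = 1                         # number of ablation needles
    expected_range_mm: Tuple[float, float] = (30.0, 20.0)  # ellipsoid long/short axes

# Example: one path as it might be captured during planning.
path = PlannedAblationPath(
    guide_line_angle_deg=35.0,
    guide_direction=(0.0, 0.57, 0.82),
    entry_point_mm=(12.0, -4.5, 0.0),
    path_depth_mm=65.0,
    ablation_power_w=60.0,
    expected_work_time_s=300.0,
)
print(path)
```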
  • The ablation device referred to herein includes one or more ablation needles, an interventional catheter, or the like; the ablation needle is mainly used herein as an example.
  • The ablation path can be planned based on the real-time image displayed in the foregoing embodiment, the three-dimensional body of the specific tissue, and/or the image data corresponding to the real-time ultrasound image extracted from the three-dimensional model data.
  • For details, refer to the method flows shown in FIG. 3 and FIG. 4.
  • the processor displays the ablation device and/or the ablation path on the aforementioned fusion map according to the planned ablation path.
  • ablation needle 732 is shown in Figure B
  • ablation needle 725 is shown in Figure C.
  • the ablation regions (731, 733) are marked in Figure B.
  • the ablation zone 723 can also be labeled in the fusion map C.
  • the processor implements marking the ablation device and/or the ablation path on the fusion map in the following manner, for example, as shown in FIG. 7 .
  • the positional relationship between the ultrasound probe and the real-time ultrasound image is obtained; and the positional relationship between the ultrasound probe and the real-time ultrasound image is marked in the fusion map.
  • the real-time ultrasound image at this time may be a three-dimensional image or a two-dimensional image.
  • The scanning relationship between the scanning area of the ultrasound probe and the real-time ultrasound image is displayed, the three-dimensional view icon 724 of the specific target tissue is displayed at the corresponding scanning depth position, and the ablation region 723 is marked.
  • With the ablation needle 725 set for intervention, the user can clearly understand from the figure the positional correspondence among the ablation needle, the ultrasound probe, and the real-time ultrasound image in actual space, and thus grasp the actual planning situation more intuitively; figure B can also be combined to make more accurate planning and positioning.
  • The positional relationship between the ultrasound probe and the real-time ultrasound image is determined in combination with the position of the scanning plane of the ultrasound probe.
  • The angle at which the ablation device is fixed on the ultrasound probe may be used, or the relative angle relationship between the ultrasound probe and the ablation device may be combined with the real-time ultrasound image and the obtained spatial orientation information.
  • a probe icon (721) characterizing the ultrasound probe is displayed on the display screen, and the display position of the probe icon varies with the acquired spatial orientation information.
  • the ablation device indicia and/or the planned ablation path are displayed on the display screen in at least one of the following ways:
  • the display screen includes a plurality of windows, and the data within the plurality of windows changes in conjunction with changes in the position of the ultrasound probe.
  • the plurality of windows respectively display a plurality of results of the two-dimensional real-time ultrasound image or the three-dimensional real-time ultrasound image being fused with one of the three-dimensional view data and the two-dimensional view data.
  • the process of moving the ultrasound probe to different orientations can also be achieved by motor drive.
  • In steps 314 to 316 in FIG. 3, the navigation system records the spatial orientation information of the space in which the positioning device fixed on the ultrasound probe is located, that is, the spatial orientation information of orientation Q1, and associates the real-time ultrasound image with the spatial orientation information of orientation Q1, so that the correspondence between the image space coordinate system and the magnetic field space coordinate system is obtained.
  • The processor imports the three-dimensional model data of the specific tissue, obtains the view data of the specific tissue, and registers the three-dimensional model data with the real-time ultrasound image.
  • The mapping relationship between the three-dimensional model data and the real-time ultrasound image in the image space coordinate system or the magnetic field space coordinate system can thus be obtained, realizing the fusion of the view data of the specific tissue on the real-time ultrasound image.
  • For details, refer to step 250 and step 260 in FIG. 2.
  • In step 322 in FIG. 3, the real-time ultrasound image S1 corresponding to orientation Q1 is displayed, and the three-dimensional information of the specific tissue is displayed on the real-time ultrasound image S1, thereby marking the view data of the specific tissue at the corresponding position of the real-time ultrasound image S1 and forming a fusion map; the fusion map here may be diagram B or diagram C in FIG. 7.
  • The fusion display is based on the result of registering the three-dimensional model data with the real-time ultrasound image, as described above and not repeated here.
  • the processor receives an execution parameter or a planned ablation path for the at least one ablation needle.
  • the ablation device includes at least one ablation needle. According to the relative fixed positional relationship between the ablation needle and the ultrasonic probe, the ablation needle can be mapped on the real-time ultrasound image, thereby determining the planned ablation path of the ablation needle based on the tracking positioning of the real-time ultrasound probe and the real-time ultrasound image.
  • the execution parameters mentioned in this embodiment include at least one of ablation power, estimated working time, and number of ablation devices.
  • step 326 or the foregoing process of determining a planned ablation path on a real-time ultrasound image may employ the following steps:
  • The ablation device marker can be displayed at a first position of the real-time ultrasound image; an adjustment command for the ablation device marker is obtained; based on the adjustment command, the position of the ablation device marker on the real-time ultrasound image is changed to a second position; and the association information of the position change of the ablation device marker with the real-time ultrasound image, the view data, and/or the spatial orientation relationship is recorded to obtain a planned ablation path (a sketch of such association records follows below).
  • The planned ablation path includes at least a set of execution parameter information of the at least one ablation device corresponding to a plurality of consecutive position changes.
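As a hedged illustration of the adjust-and-record loop just described (the structures below are assumptions for illustration, not the patent's implementation), each accepted marker position can be stored together with the frame it was set on and the probe pose at that moment:

```python
# Minimal sketch: record each ablation-device-marker adjustment together with
# the frame index and the probe pose so a planned path can be replayed later.
# All structures are illustrative assumptions, not the patent's data format.
records = []

def record_marker_position(frame_id, marker_xy, probe_pose):
    """Append one association record (frame, marker position, probe pose)."""
    records.append({
        "frame_id": frame_id,        # which real-time ultrasound frame
        "marker_xy": marker_xy,      # marker position in image coordinates
        "probe_pose": probe_pose,    # (x, y, z, a, b, c) from the navigation system
    })

# First position, then an adjustment command moving the marker:
record_marker_position(101, (128, 240), (10.0, 2.0, 35.0, 0.0, 0.3, 1.2))
record_marker_position(102, (131, 236), (10.1, 2.0, 35.2, 0.0, 0.3, 1.2))
print(len(records), "association records")
```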
  • the following steps are included prior to determining the planned ablation path on the real-time ultrasound image or prior to step 326:
  • an execution parameter is obtained about the ablation device, and the execution parameter includes at least one of an ablation power, an estimated working time, and a number of ablation devices.
  • an input window or a pull-down menu for executing parameters is provided on the display interface for the user to input a selection instruction to set the execution parameters.
  • The predicted ablation region is then obtained based on the aforementioned execution parameters, and the predicted ablation region is displayed on the real-time ultrasound image to determine a planned ablation path, wherein the display of the predicted ablation region varies as the position of the ablation device marker changes.
  • In this mode, the user can set the planned ablation path based on the currently obtained real-time ultrasound image while the patient is being examined.
  • the previously stored planned ablation path is displayed, and the difference between the current planned ablation path and the prior planned ablation path is obtained.
  • the planned ablation path displayed on the display screen can include a first planned ablation path introduced by the processor with respect to the at least one ablation needle, for example, the first planned ablation path can be a pre-stored planned ablation path.
  • The planned ablation path in this embodiment may further include: a second planned ablation path obtained by the processor through the foregoing process of determining a planned ablation path on the real-time ultrasound image correspondingly obtained when the ultrasound probe is located at the current orientation.
  • For example, the second planned ablation path may be a planned ablation path for at least one ablation needle entered by the operator while performing ablation planning on the currently obtained real-time ultrasound image.
  • the user can input a second planned ablation path on the real-time ultrasound image through the human-machine interaction device, for example, setting a planned ablation path on the aforementioned real-time ultrasound image.
  • Information such as the planned ablation needle path can be set: the position of a certain tissue can be found on the human body based on the real-time ultrasound image, the needle insertion position determined, and the expected needle insertion path set.
  • The user marks and/or sets, on the real-time ultrasound image, information of the planned ablation path such as the puncture guide line angle of the ablation needle, the puncture guiding direction, the ablation needle entry path, the ablation path depth, the ablation power, the number of ablation needles, the expected working time, and the expected ablation range (or ablation region), to perform the ablation planning.
  • the difference between the two can be saved, recorded, and/or labeled to prompt the user.
  • the aforementioned first planned ablation path may be obtained based on the current system or may be obtained based on other systems.
  • "First" and "second" are only used to distinguish ablation paths with different acquisition times or sources, and do not change the attribute content of the ablation path itself; that is, the first planned ablation path and the second planned ablation path each can include at least one of: the puncture guide line angle, the puncture guiding direction, the ablation needle entry path, the ablation path depth, the ablation power, the number of ablation needles, the estimated working time, and the expected ablation range (or ablation region).
  • The ablation path can be an ablation path for at least one puncture entry of one ablation needle, or ablation paths for at least one puncture entry of each of a plurality of ablation needles.
  • Implementing the ablation path for the at least one ablation needle of the aforementioned step 326 includes receiving a first planned ablation path for a first ablation needle and receiving a second planned ablation path for a second ablation needle.
  • The first ablation needle and the second ablation needle may correspond to two different ablation needles, or to two punctures of the same ablation needle.
  • The processor determines the predicted ablation region based on the ablation path or execution parameters described above. Based on the ablation path, a tissue region containing the specific tissue and a safety boundary can be determined (see the related description above).
  • The operator can set the simulated ablation range of the ablation needle (or needles) for a given ablation power and working time (Si, the i-th ablation) based on clinical experience or the operating parameters provided by the manufacturer.
  • The expected ablation range of most ablation needles is an ellipsoid, so only the long-axis length and the short-axis length of the ellipsoid need to be set.
  • The operator can set, in advance in the system, the simulated ablation regions of various ablation needles under different powers and working times to construct a simple database.
  • The operator can then directly import the ablation needle parameters that have been set (ablation range, power setting, etc.), and the predicted ablation region is obtained based on the ablation path input by the user.
  • The foregoing embodiment further includes: establishing a relationship database between the expected ablation path, or the execution parameters, and the predicted ablation region according to the correspondence between them; the ablation path input by the operator (also the user) is then looked up in the associated relationship database to determine the corresponding predicted ablation region, as sketched below.
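A minimal sketch of such a relationship database, assuming (for illustration only) that each needle model stores ellipsoid long/short axis lengths keyed by power and working time; all entries are invented values:

```python
# Hypothetical lookup table: (needle_model, power_W, time_s) -> ellipsoid axes
# in millimetres (long axis, short axis). Values are invented for illustration.
ABLATION_DB = {
    ("needle_A", 50, 300): (30.0, 20.0),
    ("needle_A", 60, 300): (35.0, 24.0),
    ("needle_B", 60, 600): (42.0, 30.0),
}

def predicted_ablation_ellipsoid(model, power_w, time_s):
    """Return (long_axis_mm, short_axis_mm) for the given execution parameters,
    or None if the combination has not been entered into the database."""
    return ABLATION_DB.get((model, power_w, time_s))

print(predicted_ablation_ellipsoid("needle_A", 60, 300))  # (35.0, 24.0)
```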
  • According to the first planned ablation path, a first predicted ablation region can be obtained; likewise, according to the second planned ablation path, a second predicted ablation region can be determined.
  • the first ablation path and the second ablation path have been explained in the foregoing, and the description will not be repeated here.
  • In this embodiment, the first predicted ablation region may correspond to a planned ablation path determined based on the currently obtained real-time ultrasound image, while the second predicted ablation region corresponds to an imported pre-stored planned ablation path; alternatively, the first predicted ablation region corresponds to the planned ablation path of a first ablation device determined based on the currently obtained real-time ultrasound image, and the second predicted ablation region corresponds to the planned ablation path of a second ablation device determined based on the currently obtained real-time ultrasound image.
  • the first ablation device and the second ablation device can each be a different one of a plurality of ablation devices (eg, multiple ablation needles) employed in one ablation planning.
  • the processor outputs the predicted ablation region.
  • Several ways of outputting the predicted ablation region are provided below.
  • The predicted ablation region is marked on the real-time ultrasound image; for example, in one example, the first predicted ablation region and the second predicted ablation region are marked simultaneously on the real-time ultrasound image.
  • The red line range 731 is the second predicted ablation region, and the pink rendered range 733 is the first predicted ablation region.
  • Line 732 is the ablation needle.
  • The display mode of figure B can be used to show more clearly and visually the correspondence between the planned ablation range and the ablation range currently obtained based on the real-time ultrasound image.
  • The three-dimensional body described above may include the range encompassed by a safety boundary. Referring to figure B in FIG. 7, the overlapping relationship between the first predicted ablation region and/or the second predicted ablation region and the three-dimensional body 734 may be calculated separately, and the corresponding calculation result quantifies the relationship between the view data of the specific tissue and the ablation range, further allowing an accurate comparison of the planned ablation range against the ablation range currently obtained based on the real-time ultrasound image.
  • The first planned ablation path, being a pre-stored planned ablation path, may also be an ablation planning setting made on static 3D data in the conventional manner; combined with the method proposed in this embodiment, the difference between conventional planning and real-time-scan planning can be effectively compared, which improves surgical accuracy and the user experience.
  • The processor outputs the predicted ablation region by any one or a combination of the foregoing output modes.
  • The operator can perform the ablation planning setting in this manner. For example, in one of the embodiments based on FIG. 3, the planned ablation path in step 326 of FIG. 3 is the second planned ablation path input for the at least one ablation needle when the operator performs ablation planning on the currently obtained real-time ultrasound image; then, in subsequent step 336, the real-time ultrasound image, the spatial orientation information, and the second planned ablation path corresponding to each orientation of the ultrasound probe in the magnetic field are recorded, so that the ablation planning data is formed and stored in advance as the aforementioned first planned ablation path, to be used during subsequent ablation planning (see FIG. 4).
  • The positioning device is fixed on the ultrasound probe, so that preoperative puncture planning can be performed based on the relative position of the ablation device (such as the puncture ablation needle) and the probe, without the puncture needle actually entering the body.
  • The path planning is thus not based on an actual ablation needle inserted inside the body, which avoids increasing the patient's pain during preoperative planning and reduces the preoperative cost.
  • The relative position of the ablation device (such as the puncture ablation needle) to the probe includes: the distance between the end of the ablation device (such as the ablation needle tip) and the ultrasound probe, and the angle between the ablation device (such as the ablation needle) and the ultrasound probe.
  • the foregoing steps 280 and 290, or the foregoing steps 326 and 330 may further include the following steps:
  • The ablation device marker of the ablation device is displayed on the fusion map based on the relative position of the ablation device and the ultrasound probe; and the ablation path for the ablation device is set based on the real-time ultrasound image and the ablation device marker.
  • The ablation needle path of the ablation device can be planned by adjusting, on the fusion map, the position of the ablation device marker on the real-time ultrasound image.
  • The image position of the ablation device marker on the real-time ultrasound image is adjusted based on an adjustment command regarding the ablation device marker received by the processor from the user, to obtain a portion of the second planned ablation path.
  • The associated recording, in the aforementioned step 336, of the real-time ultrasound image, the spatial orientation information, and the planned ablation path obtained when the ultrasound probe is located at each orientation in the magnetic field can be implemented as follows: the correspondence among the marker characterizing the ablation device, the real-time ultrasound image, and the spatial orientation information is recorded in association, thereby forming preoperative data on the ablation planning. In this process, it is not necessary to install an ablation device such as an ablation needle on the ultrasound probe to achieve ablation planning, thereby reducing patient suffering.
  • Referring to FIG. 4 and FIG. 7B, when the device of this embodiment is used, a planning evaluation between the traditional planning path and the real-time planning path can be performed, providing the doctor with a corresponding reference for developing a more precise planning path.
  • the flow steps of an embodiment are illustrated in Figure 4, which can be used to compare the pros and cons of the planned ablation range with the ablation range obtained based on the current real-time ultrasound image.
  • In step 410 and step 412 in FIG. 4, the transmitting circuit and the receiving circuit excite the ultrasound probe to transmit an ultrasound beam to a detection object containing a specific tissue and receive the echo of the ultrasound beam to obtain an ultrasound echo signal, and the image processing module obtains a real-time ultrasound image from the ultrasound echo signal, as described in detail in steps 210 and 220, or steps 310 and 312, above.
  • the spatial orientation information of the space where the positioning device is located is obtained by using the navigation system. For details, refer to step 230 or step 314.
  • In step 416, the processor records in association the correspondence between the real-time ultrasound image and the spatial orientation information; for details, refer to step 240 or step 316. In this way, each frame of the real-time ultrasound image and the spatial orientation of the ultrasound probe in the magnetic field at the time that frame was acquired can be obtained.
  • In step 418 and step 420, the three-dimensional model of the specific tissue is imported to obtain the view data of the specific tissue, which is registered with the real-time ultrasound image. For details, refer to steps 250 and 260, or steps 318 and 320, described above.
  • In step 422, at least one frame of the real-time ultrasound image is displayed on the display screen, and the view data of the specific tissue is marked at the corresponding position of the real-time ultrasound image (step 424), so that the aforementioned fusion map can be formed. For examples of marking the view data of the specific tissue at the corresponding position of the real-time ultrasound image, refer to step 270, or steps 322 and 324.
  • the processor receives a second planned ablation path for the at least one ablation needle when the ultrasound probe is in the current orientation.
  • the processor determines a second projected ablation zone based on the second planned ablation path.
  • the processor imports a pre-stored first planned ablation path for the at least one ablation needle, and determines a first predicted ablation zone based on the first planned ablation path.
  • the first predicted ablation zone and the second predicted ablation zone are labeled at respective locations of the real-time ultrasound image, such as 733 and 731 in Figure B of Figure 7.
  • The processor quantifies the overlap between the first predicted ablation region and the second predicted ablation region, and in step 436 the processor outputs the quantified result of that overlap.
  • The first planned ablation path for the at least one ablation needle may be a second planned ablation path previously acquired using the method illustrated in FIG. 3, or a planned ablation path obtained when ablation planning was performed based on an offline ultrasound image.
  • The output of the quantified overlap between the first predicted ablation region and the second predicted ablation region may be a graphical display as shown in FIG. 7, or a textual display of the ratio; a voxel-based sketch of such a computation follows below.
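One simple, purely illustrative way to quantify the overlap between the first and second predicted ablation regions is to rasterize both ellipsoids on a common voxel grid and report the intersection ratio. This is a sketch under the assumption of axis-aligned ellipsoidal regions; the patent does not prescribe this implementation:

```python
import numpy as np

def ellipsoid_mask(shape, center, semi_axes):
    """Boolean voxel mask of an axis-aligned ellipsoid (illustrative model
    of a predicted ablation region)."""
    zz, yy, xx = np.indices(shape)
    z0, y0, x0 = center
    az, ay, ax = semi_axes
    return ((zz - z0) / az) ** 2 + ((yy - y0) / ay) ** 2 + ((xx - x0) / ax) ** 2 <= 1.0

grid = (64, 64, 64)
first = ellipsoid_mask(grid, center=(32, 32, 30), semi_axes=(15, 10, 10))
second = ellipsoid_mask(grid, center=(32, 34, 34), semi_axes=(15, 10, 10))

# Fraction of the first predicted region that the second one also covers.
overlap = np.logical_and(first, second).sum() / first.sum()
print(f"overlap of first predicted region covered by second: {overlap:.1%}")
```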
  • The processor can display the spatial positional relationship among the probe icon 721, the ablation device marker 725, and the scanning section of the real-time ultrasound image 722 obtained at the current time in the same window or the same fusion map. Further, the view data of the specific tissue is displayed in the same window or the same fusion map (724). Based on the registration result, the acquisition angle of the ultrasound image can be observed from the perspective of actual space, guiding the user in planning the ablation path.
  • The foregoing step of determining a planned ablation path on the real-time ultrasound image may further include the following steps: importing a pre-stored planned ablation path, such as the aforementioned second planned ablation path, onto the real-time ultrasound image; as the position of the ultrasound probe changes, obtaining, based on the changed real-time ultrasound image, an input for performing a change operation on the imported planned ablation path (for example, based on the foregoing acquisition instructions for the first planned ablation path, a corresponding planned ablation path is obtained for each of at least one frame of real-time ultrasound image obtained by the ultrasound probe at different times, so that a first planned ablation path may be obtained as the correction result of the second planned ablation path); and obtaining change data regarding the planned ablation path according to the change operation input.
  • Storing the planned ablation path, or the change data regarding the planned ablation path, establishes a planning ablation database for the specific tissue. The information recorded by the database includes at least one of: the planned ablation path and the execution parameters regarding ablation. The information recorded by the database may further include the association between one of the planned ablation path and the execution parameters regarding ablation and the spatial orientation information, the real-time ultrasound image, and the view data of the specific tissue. This facilitates the operation of importing a planned ablation path, or directly importing the corresponding planned ablation path according to the execution parameters input by the user.
  • Planning of multi-needle joint ablation can also be performed; for details, refer to the process shown in FIG. 6.
  • In step 610 and step 612 in FIG. 6, the transmitting circuit and the receiving circuit excite the ultrasound probe located at one orientation to transmit an ultrasound beam to the detection object containing the specific tissue and receive the echo of the ultrasound beam to obtain the ultrasound echo signal, and the image processing module obtains a real-time ultrasound image corresponding to the aforementioned orientation based on the ultrasound echo signal.
  • step 610 and step 612 refer to step 210 and step 220, or step 410 and step 412, which are described above.
  • step 614 of Figure 6 the spatial orientation information of the positioning device is obtained using a navigation system.
  • step 614 refer to step 230 or step 414 described above.
  • The processor records in association the correspondence between the real-time ultrasound image and the spatial orientation information corresponding to the orientation, so that the mapping relationship between the image space coordinate system and the magnetic field space coordinate system can be obtained.
  • step 616 refer to step 240 or step 416 described above.
  • The processor imports the three-dimensional model of the specific tissue to obtain the view data of the specific tissue, the view data including three-dimensional body information.
  • The three-dimensional model data is registered with the real-time ultrasound image, so that the mapping relationship between the three-dimensional model data and the image space coordinate system can be obtained. Based on these mapping relationships, the view data of the specific tissue, the ultrasound probe, and the real-time ultrasound image can be mapped into the same coordinate system, realizing fusion imaging.
  • For details of step 618 and step 620, refer to step 250 and step 260, or step 418 and step 420, described above.
  • In step 622 of FIG. 6, the processor displays a real-time ultrasound image and marks the view data of the specific tissue on the real-time ultrasound image; for example, three-dimensional or two-dimensional image data of the specific tissue can be marked at the corresponding location on the real-time ultrasound image, or an icon marking the specific tissue can be placed at the corresponding location, so that at least one fusion map can be obtained. For details of step 622, refer to the foregoing step 270, or steps 422 and 424.
  • The processor receives a planned ablation path, set for a first ablation needle, obtained correspondingly based on the real-time ultrasound image.
  • The processor determines a first portion of the predicted ablation region based on the aforementioned planned ablation path.
  • The processor calculates the overlapping relationship between the first portion and the aforementioned view data to obtain a first calculation result.
  • The processor outputs the aforementioned first calculation result.
  • The processor receives a planned ablation path, set for a second ablation needle, obtained correspondingly based on the real-time ultrasound image.
  • The processor determines a second portion of the predicted ablation region based on the correspondingly obtained planned ablation path.
  • The processor calculates the overlapping relationship between the union of the second portion and the first portion and the aforementioned view data to obtain a second calculation result.
  • The processor updates the output first calculation result to the aforementioned second calculation result.
  • The first ablation needle and the second ablation needle may correspond to two different ablation needles, or to two puncture insertion settings of the same ablation needle.
  • The two planned ablation paths mentioned herein may be input after adjusting the first ablation needle and the second ablation needle respectively, or may be input as two ablation paths corresponding to two adjustments of the same ablation needle.
  • When the first ablation needle and the second ablation needle are two different ablation needles, this embodiment can demonstrate the stepwise three-dimensional-body ablation planning process of multi-needle joint ablation; gradually displaying the corresponding proportion of the ablation range provides users with good data support, and the move from traditional human judgment to computer-aided judgment makes the ablation planning more scientific and more accurate.
  • The output of the first calculation result and the second calculation result may be a rendering output on the fusion map, or a display output in text form, and may be updated and displayed as the ablation planning proceeds.
  • the process of setting the path of the simulated ablation needle can be seen in Figure 3.
  • The operator moves the probe on the patient's body surface, finds the appropriate position and orientation according to the envisaged intervention plan, and obtains the desired ultrasound section of the specific tissue, i.e., the predicted ablation needle position.
  • The probe angle or the puncture guide line angle is adjusted so that the puncture guide line passes through the specific tissue (see FIG. 7).
  • The ablation path, such as the needle insertion depth, is adjusted by the knobs, buttons, or touch screen of the ultrasound control panel of the human-machine interaction device, and the simulated ablation needle is displayed entering the specific tissue (such as a certain tissue region) along the puncture guide line; after confirmation, the simulated ellipsoid ablation region is displayed at the simulated ablation needle tip (723 in FIG. 7).
  • The system automatically stores the currently set ultrasound section, puncture guide line angle, simulated ablation needle path, path depth, simulated ablation region, and other ablation path information together with the real-time ultrasound image. Since the magnetic positioning device is bound to the probe, the mapping matrix Pi from the space represented by the positioning device on the probe to the physical space of the magnetic field can be obtained in real time.
  • The coordinates of the simulated ablation region in the ultrasound image space can thus be mapped to the physical space in which the magnetic field is located, thereby obtaining the coordinates of the simulated ablation region in the magnetic field space coordinate system, that is, Pi*Si.
  • The percentage of ablation residual can be displayed in real time to quantify the overlapping relationship between the aforementioned three-dimensional body and the predicted ablation region, that is, the percentage of the entire three-dimensional body occupied by the residual (not yet ablated) region of the specific tissue (such as a certain tissue together with its safety boundary). Denoting the three-dimensional body by Ts, the real-time percentage A at the k-th ablation can be expressed, for example, as A = |Ts − (P1*S1 ∪ … ∪ Pk*Sk)| / |Ts| × 100%, where |·| denotes volume.
  • The k-th ablation may be k ablations of one ablation needle, or ablations by k ablation needles.
  • The parameter A quantitatively displays the current ablation effect in real time and may be used for the calculation in the foregoing steps 332 and 334; this manner can also be used for the calculation of the first calculation result and the second calculation result in FIG. 6. It is likewise possible to calculate the ablated volume percentage and the minimum number of ablation needles, i.e., how many predicted ablation regions joined together can contain a certain tissue region Ts; a voxel-based sketch of the residual percentage follows below.
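The residual percentage described above can be sketched on a voxel grid, assuming (for illustration) that the three-dimensional body Ts and each mapped ablation region Pi*Si are represented as boolean masks in the magnetic field space:

```python
import numpy as np

def residual_percentage(tissue_mask, ablation_masks):
    """Percentage of the three-dimensional body Ts (tissue plus safety
    boundary) not yet covered by the union of the k predicted ablation
    regions; a voxel-based sketch of the parameter A described above."""
    covered = np.zeros_like(tissue_mask, dtype=bool)
    for m in ablation_masks:          # union of Pi*Si regions, i = 1..k
        covered |= m
    residual = tissue_mask & ~covered
    return 100.0 * residual.sum() / tissue_mask.sum()

# Toy example on an 8x8x8 grid (masks are invented for illustration):
ts = np.zeros((8, 8, 8), dtype=bool); ts[2:6, 2:6, 2:6] = True
s1 = np.zeros_like(ts); s1[2:6, 2:6, 2:4] = True
print(residual_percentage(ts, [s1]))  # 50.0
```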
  • A real-time image, the multi-needle joint ablation region, and the tissue target can be displayed on the display screen; the information to be displayed includes the relationship among the two-dimensional image, the three-dimensional image data, and the three-dimensional target (i.e., the three-dimensional body). As shown in FIG. 7, this is a multi-window linkage display platform.
  • One of the windows fusion-displays the real-time two-dimensional image together with the intersection of the simulated ablation region and the three-dimensional body, that is, the intersection of the simulated ablation region, a certain tissue (safety boundary), and the real-time image is superimposed and displayed on the real-time image (as shown in the upper right of FIG. 7).
  • Three-dimensional display technology such as volume rendering (VR) or surface rendering (SSD) can also be used to display real-time images, multi-needle joint ablation regions, a certain tissue target, etc. (Fig. 7 bottom left panel C).
  • the device can guide the operator to perform interventional ablation along the already set path.
  • The operator selects a path set in the simulation process to obtain the parameters recorded when the path was set: probe orientation information Ri, mapping matrix Pi, view data of the specific tissue (for example, three-dimensional volume data Si), and the like.
  • Similarly, the probe orientation information Rk, the mapping matrix Pk, and the three-dimensional volume data Sk corresponding to the current real-time probe can be obtained. Based on the interrelationship between the two sets of data, the operator can be guided to ablate along the simulated set path.
  • Quantitative parameters such as the distance and azimuth angle between the current probe and the simulated set probe can be calculated; in the 3D display system, the relative positions of the two probe models can be displayed directly on the fusion map (for example, in the lower left window of FIG. 7, two probe models are displayed simultaneously: one indicates the position of the current ultrasound probe, and the other indicates the position of the simulated set probe).
  • The coincidence rate between the predicted ablation region corresponding to the simulated set probe and the predicted ablation region corresponding to the real-time probe can also be displayed in real time, i.e., ((Pi*Si) ∩ (Pk*Sk)) / (Pi*Si), where Pi*Si can be used to obtain the registered three-dimensional body.
  • The fusion display shows the intersection of the real-time image with the simulated set ablation region and the current predicted ablation region, that is, the simulated set predicted ablation region and the current predicted ablation region are marked with different colors on the real-time image.
  • In one of the embodiments, an ultrasound apparatus for guiding the planning design can be presented, showing the expected ablation effect when multiple needles ablate simultaneously, that is, guiding the probe orientation, the actual needle insertion depth, and the needle insertion angle used when the doctor inserts the needle during actual interventional ablation.
  • A planning mode (the simulation system mode mentioned below) is established in the ultrasound system.
  • The planned ablation path is determined by obtaining changing real-time ultrasound image data as the position of the ultrasound probe changes and, based on the changing real-time ultrasound image, determining a planned ablation path on the real-time ultrasound image.
  • The planning mode (including the navigation system mode) is started or entered; the ultrasound probe is moved to select real-time imaging of the specific tissue to obtain first real-time ultrasound image data, for example, selecting to image a section of the specific tissue.
  • From the data set of ablation paths, at least two planned ablation paths corresponding to the moving ultrasound probe can be obtained; at least two predicted ablation regions are obtained at the corresponding positions according to the planned ablation paths, and the at least two predicted ablation regions obtained by the moving ultrasound probe are superimposed and displayed on the real-time ultrasound image to form a joint predicted ablation region. For example, the ultrasound probe is moved to select other locations to guide the ablation needle into the tissue target, and the joint predicted ablation region is displayed.
  • The ultrasound contrast imaging mode is selected, the ultrasound contrast agent is injected, and the contrast image is acquired.
  • The probe is swung in a fixed direction, or a segment of cine image data containing different locations in the vicinity of the tissue is stored.
  • The joint predicted ablation region and the contrast image are fusion-displayed: based on the probe orientation information (mapping matrix Pi) corresponding to each frame in the stored cine, each frame image is mapped to the magnetic field space coordinate system, and the intersection of the contrast image and the joint predicted ablation region is displayed. That is, the intersection of the simulated ablation region, a certain tissue (safety boundary), and the real-time image is superimposed and displayed on the real-time image (as shown in the upper right panel B of FIG. 7). By observing whether the contrast section of a certain tissue is contained within the color-rendered joint predicted ablation region, it can be judged whether the currently set ablation needles can complete the ablation; a sketch of such a containment check follows below.
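As an illustrative sketch of the containment check just described, assume (hypothetically) that the joint predicted region is a list of axis-aligned ellipsoids in magnetic field space and that the contrast section of the tissue has already been mapped to 3-D points via the per-frame matrices Pi:

```python
import numpy as np

def inside_any_ellipsoid(points, ellipsoids):
    """For each 3-D point (already mapped to magnetic field space via Pi),
    test membership in the union of predicted ablation ellipsoids.
    Each ellipsoid is (center, semi_axes); axis-aligned for simplicity."""
    points = np.asarray(points, dtype=float)
    inside = np.zeros(len(points), dtype=bool)
    for center, semi in ellipsoids:
        d = (points - np.asarray(center)) / np.asarray(semi)
        inside |= (d ** 2).sum(axis=1) <= 1.0
    return inside

tissue_pts = [(0, 0, 0), (25, 0, 0)]          # contrast section samples (toy data)
joint_region = [((0, 0, 0), (15, 10, 10)),    # two predicted ablation ellipsoids
                ((10, 0, 0), (15, 10, 10))]
flags = inside_any_ellipsoid(tissue_pts, joint_region)
print("fully contained:", bool(flags.all()))
```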
  • Other ultrasound imaging modes may also be employed in place of the ultrasound contrast imaging mode in this embodiment.
  • an ultrasound system capable of comparing and verifying the predicted ablation zone with the actual ablation zone is also provided in this embodiment.
  • an ultrasound system is provided that includes an ultrasound probe, an ablation device, a transmit circuit and a receive circuit, an image processing module, a navigation system, a display screen, a memory, and a processor.
  • The ablation device is attached to the ultrasound probe; for example, the ablation device and the ultrasound probe are fixed at a predetermined angle.
  • The transmitting circuit and the receiving circuit excite the ultrasound probe to transmit an ultrasound beam to a detection object containing a specific tissue, and receive the echo of the ultrasound beam to obtain an ultrasound echo signal.
  • the image processing module obtains real-time ultrasound image data based on the ultrasound echo signals.
  • the navigation system includes a positioning device fixed on the ultrasonic probe, and the spatial orientation information of the space in which the positioning device fixed on the ultrasonic probe is located is obtained through the navigation system.
  • the memory stores a computer program running on the processor; and, when the processor executes the program, the following steps are implemented:
  • recording in association the real-time ultrasound image data and the spatial orientation information corresponding to the real-time ultrasound image data.
  • The pre-stored planned ablation path can be obtained by the methods described above; refer to the related content.
  • The foregoing obtaining of the actual ablation path information of the ablation device by combining the real-time ultrasound image data and the spatial orientation information includes: obtaining, based on segmentation of the real-time ultrasound image data, the corresponding position information of the ablation device entering the specific tissue, and/or obtaining the user's input information based on the real-time ultrasound image, to determine the actual ablation path.
  • Information such as the actual needle insertion angle and direction of the ablation device is obtained, thereby determining the actual ablation path.
  • The ablation path may include one of the guiding direction of ablation, the depth of the ablation path, the expected ablation region, and the execution parameters, where the execution parameters include at least one of: ablation power, estimated working time, and number of ablation devices.
  • Information about the execution parameters involved in the actual ablation path may also be determined on the real-time ultrasound image; for example, the user's input information is obtained based on the real-time ultrasound image, thereby determining the actual ablation path.
  • When superimposing the planned ablation path on the real-time ultrasound image data, reference may be made to the embodiment shown in FIG. 7: for example, the ablation device marker and the predicted ablation region are displayed on the real-time ultrasound image data to characterize the planned ablation path, with the predicted ablation region shown at the end of the ablation device marker. When the position of the ablation device marker changes on the real-time ultrasound image, the predicted ablation region changes in conjunction; or, when the orientation of the ultrasound probe changes, the ablation device marker and/or the predicted ablation region also change in conjunction.
  • the actual ablation device marker and/or the actual ablation region may be displayed with a marker to characterize the actual ablation path.
  • the actual ablation zone is displayed at the end of the actual ablation device marker.
  • The actual ablation device marker and the planned ablation device marker mentioned above can be differentiated by features such as color and/or line style.
  • The actual ablation path and the planned ablation path can likewise be differentiated by color and/or line style.
  • The difference between the actual ablation path and the planned ablation path may be recorded: for example, simultaneously recording, in the same frame of the ultrasound image, the position difference between the ablation device corresponding to the planned ablation path and the actual ablation device, the difference in needle insertion angle between the ablation device corresponding to the planned ablation path and the actual ablation device, and the regional difference between the predicted ablation region corresponding to the planned ablation path and the actual ablation region corresponding to the actual ablation path, and so on.
  • The corresponding real-time ultrasound image data and the view data of the specific tissue are also stored, or the related real-time ultrasound image is stored in association with the associated record of the planned ablation path and the actual ablation path information.
  • the processor executes the program to import the pre-stored planned ablation path and superimpose the planned ablation path on the real-time ultrasound image data in the following manner:
  • the three-dimensional model data is registered with the real-time ultrasonic image data according to the real-time ultrasonic image data and the spatial orientation information of the associated record;
  • the planned ablation path is superimposed on the results of the real-time ultrasound image and the view data fusion display.
  • The planned ablation path and the actual ablation path are superimposed and displayed on the real-time ultrasound image data, together with the differences between them: for example, displaying on the real-time ultrasound image data the position difference between the ablation device corresponding to the planned ablation path and the actual ablation device, the difference in needle insertion angle between the ablation device corresponding to the planned ablation path and the actual ablation device, and the regional difference between the predicted ablation region corresponding to the planned ablation path and the actual ablation region, and so on.
  • The processor calculates the overlapping relationship between the predicted ablation region corresponding to the planned ablation path and the actual ablation region corresponding to the actual ablation path, quantifies the overlapping relationship, and outputs the result for display on the display screen.
  • The processor calculates the overlapping relationship between the predicted ablation region, or the actual ablation region, and the view data of the specific tissue, quantifies the overlapping relationship, and outputs the result for display on the display screen.
  • The overlapping relationship between the planned ablation path and the actual ablation path may also be displayed, for example, the relationship between the needle insertion angles and the relationship between the positions of the ablation devices, and so on; a sketch of such difference measures follows below.
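A sketch of how the position and angle differences between a planned and an actual ablation path could be quantified, under the assumption (for illustration only) that each needle path is represented by a tip point and a unit direction vector:

```python
import numpy as np

def path_difference(planned_tip, planned_dir, actual_tip, actual_dir):
    """Return (tip distance in mm, insertion-angle difference in degrees)
    between a planned and an actual ablation path; illustrative only."""
    planned_dir = np.asarray(planned_dir) / np.linalg.norm(planned_dir)
    actual_dir = np.asarray(actual_dir) / np.linalg.norm(actual_dir)
    tip_dist = float(np.linalg.norm(np.asarray(actual_tip) - np.asarray(planned_tip)))
    cos_a = float(np.clip(np.dot(planned_dir, actual_dir), -1.0, 1.0))
    return tip_dist, float(np.degrees(np.arccos(cos_a)))

dist_mm, angle_deg = path_difference(
    planned_tip=(10.0, 20.0, 55.0), planned_dir=(0.0, 0.6, 0.8),
    actual_tip=(11.5, 21.0, 54.0), actual_dir=(0.0, 0.5, 0.87),
)
print(f"tip offset {dist_mm:.1f} mm, angle difference {angle_deg:.1f} deg")
```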
  • A similar procedure is shown in FIG. 5.
  • The contrast agent is injected to enter the contrast imaging mode.
  • After ablation, the lesion is not perfused with contrast agent in the contrast-enhanced ultrasound image, contrasting markedly with the surrounding normal tissue.
  • The contrast images obtained after ablation can also be reconstructed into three-dimensional ultrasound contrast data using a navigation-based freehand reconstruction algorithm.
  • By fusion-comparing the joint predicted ablation region with the three-dimensional ultrasound contrast data, the difference between the simulated plan and the actual ablation effect can be analyzed.
  • imaging modalities such as elastography mode, and the like, may also be employed in other embodiments.
  • Three-dimensional ultrasound image data at the specific tissue location is obtained after surgery; the three-dimensional ultrasound image data is displayed; and the planned ablation path is superimposed on the three-dimensional ultrasound image data.
  • For the reconstruction method of the three-dimensional ultrasound image data, reference may be made to the contrast-image method mentioned above, which is not described in detail here.
  • Superimposing the ablation path on the three-dimensional ultrasound image data may adopt a stereoscopic display mode; for example, in diagram C in FIG. 7, the three-dimensionally displayed predicted ablation region is superimposed on the three-dimensional ultrasound image data, the ablation region usually being an ellipsoid.
  • Other information in the planned ablation path can be displayed by marking the position of the ablation device, such as the needle insertion angle or guiding angle, the needle insertion direction or guiding direction, and the needle insertion or ablation path depth.
  • The system can not only clinically verify the operator's preoperative design, but can also be used to estimate and evaluate the ablation effect of an ablation needle already inserted into the human body, and to evaluate and verify the actual ablation effect.
  • The processor can also obtain the spatial orientation information corresponding to the ultrasound probe at the current orientation to obtain real-time position information, and, according to the real-time ultrasound image obtained when the ultrasound probe is located at the current orientation, extract the pre-stored spatial orientation information corresponding to one orientation of the ultrasound probe (which can be extracted from the pre-stored planning data) to obtain reference information, and simultaneously display the aforementioned real-time position information and reference information.
  • The positions of two ultrasound probes are marked in the lower left panel C of FIG. 7: one is the position of the ultrasound probe generated during the ablation planning, and the other is the current orientation of the ultrasound probe; this can prompt the operator how to adjust the position of the ultrasound probe.
  • By marking the positional relationship between the ultrasound probe and the real-time ultrasound image in the fusion map, and marking the reference information at the corresponding position in the fusion map, the aforementioned simultaneous display of the real-time position information and the reference information is realized.
  • FIG. 2 to FIG. 6 each provide only one sequence of execution steps; various modifications may be obtained by adjusting the order of the steps in FIG. 2 to FIG. 6 as described above, and the above steps are not limited to being executed only in the illustrated sequences.
  • The steps can be interchanged provided the basic logic is satisfied, the execution order can be changed, and one or more steps can be repeated before the final step or steps are executed; all such variations are based on the embodiments provided herein. Of course, different steps can be completed by different execution subjects, as in the related descriptions above.
  • The technical solution of the present invention, in essence or in the part contributing to the prior art, may be embodied in the form of a software product carried on a non-transitory computer-readable storage carrier (e.g., ROM, magnetic disk, optical disk, hard disk, server cloud space), comprising a number of instructions for causing a terminal device (which may be a mobile phone, a computer, a server, a network device, etc.) to execute the system structures and methods of the various embodiments of the present invention.
  • Also provided is a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, can at least implement the method steps of the various embodiments mentioned above.

Abstract

An ultrasound system (100) and method for planning ablation, which obtain the spatial orientation information of a positioning device (111) fixed on an ultrasound probe (101), and associatively record the spatial orientation relationship between the spatial orientation information and the ultrasound images obtained by real-time scanning with the ultrasound probe (101), thereby performing ablation puncture planning and improving planning accuracy.

Description

Ultrasound system and method for planning ablation

Technical Field
The present invention relates to a simulated interventional guidance device based on medical ultrasound imaging equipment and a method for evaluating the simulation effect.
Background Art
To treat cancer, tumor tissue in the human body must be removed or killed. Conventional resection surgery causes great damage to the patient's body, and many patients cannot undergo resection. Owing to advances in imaging technology and the rapid development of minimally invasive interventional devices, image-guided interventional therapy, especially ultrasound-guided interventional ablation therapy, has become one of the most important clinical treatments for tumors. Currently common interventional ablation methods include radiofrequency ablation (RFA), microwave ablation, laser ablation, and cryoablation. These methods kill tumor cells in the human body through minimally invasive intervention, but due to limitations of the ablation modality itself, each interventional ablation needle can only kill tumor cells in a limited region. When the tumor is large, or multiple tumors are discretely distributed, multi-needle or repeated-insertion ablation is required. To ensure that tumor cells are killed accurately without damaging, or while minimally damaging, normal tissue, the ablation needle insertion path must be planned and simulated in advance, and the predicted ablation region must be evaluated and verified preoperatively.
Interventional ablation surgery planning was proposed long ago, but current interventional surgery planning is performed on three-dimensional data such as CT, MRI, or three-dimensional ultrasound. For example, based on three-dimensional data of the human body, three-dimensional segmentation, extraction, and three-dimensional reconstruction display of human tissues such as tumors and blood vessels are performed, and the ablation needle insertion path is set according to the processed medical image information. Another approach proposes an image-guided surgery planning apparatus that, based on the patient's three-dimensional image, manually sets the ablation needle insertion point, angle, depth, power, ablation duration, and so on, calculates the temperature field based on the input microwave energy, and fusion-displays the three-dimensional temperature field with the patient's three-dimensional image. In addition, a further approach provides an ablation treatment planning method and device that uses MPR display technology to show sections of the three-dimensional image and of the 3D model data of the ablation volume. The above planning schemes are all based on static 3D data acquired preoperatively, and the path planned on such data differs greatly from actual clinical interventional operation. During actual interventional therapy, due to the influence of human tissues such as the ribs, the needle cannot be inserted accurately along the planned path, which inevitably affects the ablation treatment effect and increases surgical risk.
Summary of the Invention
Based on this, it is necessary to address the problems existing in the prior art and provide an ultrasound system for planning ablation and an evaluation method.
In one of the embodiments, an ultrasound system for planning ablation is provided, comprising:
an ultrasound probe;
a transmitting circuit and a receiving circuit, which excite the ultrasound probe to transmit an ultrasound beam to a detection object containing a specific tissue and receive echoes of the ultrasound beam to obtain ultrasound echo signals;
an image processing module, which obtains real-time ultrasound image data according to the ultrasound echo signals;
a navigation system, which includes a positioning device fixed on the ultrasound probe, the spatial orientation information of the space in which the positioning device fixed on the ultrasound probe is located being obtained through the navigation system;
a display screen;
a memory, which stores a computer program that runs on a processor; and,
a processor, which implements the following steps when executing the program:
associatively recording the real-time ultrasound image data and the spatial orientation information corresponding to the real-time ultrasound image data;
importing three-dimensional model data of the specific tissue to obtain view data of the specific tissue;
registering the three-dimensional model data with the real-time ultrasound image data according to the result of the associated recording;
based on the registration result, fusion-displaying, on the display screen, the real-time ultrasound image and the view data for the same specific tissue; and,
determining a planned ablation path on the real-time ultrasound image.
In one of the embodiments, an ultrasound imaging method for planning ablation is also provided, comprising:
obtaining a real-time ultrasound image through an ultrasound probe;
obtaining, through a navigation system, the spatial orientation information of the space in which a positioning device fixed on the ultrasound probe is located;
associatively recording the correspondence between the real-time ultrasound image and the spatial orientation information;
importing three-dimensional model data of the specific tissue to obtain view data of the specific tissue;
registering the three-dimensional model data with the real-time ultrasound image data according to the result of the associated recording;
based on the registration result, fusion-displaying, on the display screen, the real-time ultrasound image and the view data for the same specific tissue; and,
determining a planned ablation path on the real-time ultrasound image.
In one of the embodiments, an ultrasound system is also provided, comprising:
an ultrasound probe;
an ablation device, which is fixed on the ultrasound probe;
a transmitting circuit and a receiving circuit, which excite the ultrasound probe to transmit an ultrasound beam to a detection object containing a specific tissue and receive echoes of the ultrasound beam to obtain ultrasound echo signals;
an image processing module, which obtains real-time ultrasound image data according to the ultrasound echo signals;
a navigation system, which includes a positioning device fixed on the ultrasound probe, the spatial orientation information of the space in which the positioning device fixed on the ultrasound probe is located being obtained through the navigation system;
a display screen;
a memory, which stores a computer program that runs on a processor; and,
a processor, which implements the following steps when executing the program:
associatively recording the real-time ultrasound image data and the spatial orientation information corresponding to the real-time ultrasound image data;
displaying the real-time ultrasound image data;
obtaining an actual ablation path of the ablation device by combining the real-time ultrasound image data and the spatial orientation information;
importing a pre-stored planned ablation path;
superimposing and displaying the planned ablation path on the real-time ultrasound image data;
superimposing and displaying the actual ablation path on the real-time ultrasound image data; and,
associatively storing the planned ablation path and the actual ablation path.
Brief Description of the Drawings
FIG. 1 is a schematic diagram of the system architecture of an ultrasound system for planning ablation according to some embodiments;
FIG. 2 is a schematic flowchart of the device of FIG. 1 in one of the embodiments;
FIG. 3 is a schematic flowchart of the device of FIG. 1 in other embodiments;
FIG. 4 is a schematic flowchart of the device of FIG. 1 in some of the embodiments;
FIG. 5 is a schematic flowchart of the device of FIG. 1 in contrast imaging mode;
FIG. 6 is a schematic flowchart of the device of FIG. 1 in multi-frame joint ablation mode;
FIG. 7 is a schematic diagram of a display interface of the device of FIG. 1 in one of the embodiments.
Detailed Description
The present invention is further described in detail below through specific embodiments in conjunction with the accompanying drawings. Similar elements in different embodiments use associated, similar reference numerals. In the following embodiments, many details are described so that the present application can be better understood. However, those skilled in the art will readily recognize that some of these features may be omitted in different cases, or may be replaced by other elements, materials, or methods. In some cases, certain operations related to the present application are not shown or described in the specification, in order to prevent the core of the present application from being overwhelmed by excessive description; for those skilled in the art, a detailed description of these related operations is not necessary, as the related operations can be fully understood from the description in the specification together with general technical knowledge in the art.
In addition, the features, operations, or characteristics described in the specification may be combined in any appropriate manner to form various embodiments. Likewise, the steps or actions in the method descriptions may be reordered or adjusted in a manner obvious to those skilled in the art. Therefore, the various orders in the specification and drawings are only for the clear description of a certain embodiment and do not imply a required order, unless it is otherwise stated that a certain order must be followed.
The serial numbers assigned herein to components, such as "first" and "second", are only used to distinguish the described objects and do not carry any sequential or technical meaning. "Connection" and "coupling" as used in this application include both direct and indirect connection (coupling) unless otherwise specified.
FIG. 1 shows a schematic structural diagram of an ultrasound system 100 for planning ablation in one embodiment, with the specific structure as follows. The ultrasound system 100 for planning ablation shown in FIG. 1 mainly includes: an ultrasound probe 101, a transmitting circuit 103, a transmit/receive selection switch 102, a receiving circuit 104, a beamforming module 105, a signal processing module 116, and an image processing module 126. During ultrasound imaging, the transmitting circuit 103 sends delay-focused transmit pulses with a certain amplitude and polarity to the ultrasound probe 101 through the transmit/receive selection switch 102. Excited by the transmit pulses, the ultrasound probe 101 transmits ultrasound waves (which may be any of plane waves, focused waves, or divergent waves) to a detection object containing a specific tissue (for example, a specific tissue and its blood vessels in a human or animal body; the specific tissue herein includes tissue in a human or animal body that needs to be cleared by ablation, such as tumor tissue), receives, after a certain delay, ultrasound echoes carrying information about the detection object reflected from the target region, and reconverts these ultrasound echoes into electrical signals. The receiving circuit 104 receives the electrical signals generated by the ultrasound probe 101, obtains ultrasound echo signals, and sends these ultrasound echo signals to the beamforming module 105. The beamforming module 105 performs processing such as focusing delay, weighting, and channel summation on the ultrasound echo signals, and then sends the ultrasound echo signals to the signal processing module 116 for related signal processing. The ultrasound echo signals processed by the signal processing module 116 are sent to the image processing module 126. The image processing module 126 processes the signals differently according to the imaging mode required by the user to obtain ultrasound image data of different modes, and then forms ultrasound images of different modes through processing such as logarithmic compression, dynamic range adjustment, and digital scan conversion, such as B images, C images, D images, etc., or other types of two-dimensional or three-dimensional ultrasound images. In the actual scanning and acquisition of ultrasound images, the above transmitting and receiving circuits excite the ultrasound probe to transmit an ultrasound beam to the detection object according to the set ultrasound imaging parameters and receive the echoes of the ultrasound beam to obtain ultrasound echo signals, from which the desired ultrasound image is obtained for display, showing the specific tissue and its surrounding tissue structure. During real-time scanning and acquisition, the ultrasound images obtained by exciting the ultrasound probe at different orientations may be called real-time ultrasound images; a real-time ultrasound image may change as the orientation of the ultrasound probe is adjusted, and may also change as the ultrasound imaging parameters are adjusted. A real-time ultrasound image differs from a frozen image, which refers to the image data acquired and stored when the ultrasound imaging device executes the freeze function. The ultrasound imaging parameters mentioned herein involve all parameters that the user can independently select during the imaging of ultrasound tissue images, for example, TGC (Time Gain Compensation), acoustic frequency, pulse repetition frequency (PRF), ultrasound wave type, dynamic range, and so on.
The ultrasound system 100 for planning ablation further includes: a display screen 130, a processor 140, a memory 160, and a human-machine interaction device 150. The processor 140 outputs the obtained ultrasound image to the display screen 130 for display, and invokes computer program instructions recorded in the memory 160 to display the ultrasound image on the display screen 130 and to obtain, through the human-machine interaction device, control instructions input by the user on the displayed ultrasound image. The human-machine interaction device herein may include one of a keyboard, a trackball, a mouse, a touch display screen, and the like; the display screen 130 may be an ordinary display screen or a touch display screen. If the display screen 130 is a touch display screen, the human-machine interaction device 150 may also be the touch display screen; in that case, obtaining the user's control instructions input on the ultrasound image through the human-machine interaction device may mean that the processor 140 invokes the computer program instructions recorded in the memory 160 to detect the contact of an input object on the touch display screen, thereby determining the control instructions input by the user on the displayed ultrasound image.
Regarding the processor 140 invoking the computer program instructions recorded in the memory 160 to detect the contact of an input object (for example, an index finger, a thumb, a stylus, a dedicated touch-screen pen, etc.) on the touch display screen: the ultrasound image may first be displayed on the touch display screen, and the processor 140 may invoke a gesture detection module stored in the memory 160 to detect the control instructions obtained when the user performs contact operations on the graphical user interface through the input object. In several embodiments, the system includes a touch display screen with a graphical user interface (GUI), one or more processors, a memory, and one or more modules, programs, or instruction sets stored in the memory for performing various functions, which together implement GUI-based manipulation input detection and the acquisition of related control instructions. In some embodiments of the present invention, the user interacts with the graphical user interface mainly through gesture input on the touch display screen. The gesture input herein may include any type of user gesture input that the device can detect through direct contact with, or proximity to, the touch display screen. For example, the gesture input may be an action of the user selecting one position, multiple positions, and/or multiple consecutive positions on the touch display screen using a finger of the right or left hand (for example, the index finger, thumb, etc.) or an input object detectable by the touch display screen (for example, a stylus or a dedicated touch-screen pen), and may include operation actions such as contact, release of a touch, a touch tap, long contact, and rotate-and-spread. The gesture detection module can detect the gesture input of the interaction between the input object and the touch display screen, for example: determining whether contact has occurred, determining whether the gesture input is continuous, determining whether it corresponds to a predetermined gesture, determining the operation position corresponding to the gesture input, determining whether the operation position corresponding to the gesture input has moved to the edge of the corresponding display region, determining whether the gesture input has been interrupted (for example, whether contact has stopped), determining the movement of the gesture input and tracking its movement trajectory, and determining the movement rate (magnitude), movement velocity (magnitude and direction), and/or movement acceleration (change in magnitude and/or direction) of the operation position corresponding to the gesture input, the movement trajectory, and so on. The gesture detection module is stored in the memory, and the monitoring of the above gesture input is implemented through invocation by one or more processors to obtain the user's operation input instructions.
无论是通过人机交互设备中的键盘、滚轮,还是通过触摸显示屏等设备来获得用户输入的控制指令,根据该控制指令可以调整超声探头的超声成像参数,或者切换超声探头的工作模式,或者调整探头的空间位置。工作模式包括造影成像、弹性成像等等。
此外，用于规划消融的超声系统100还包括导航系统，在图1中导航系统包括磁场发射与信号接收模块170和固定在超声探头101上的定位装置111。此外，在其中一个实施例中，用于规划消融的超声系统100还包括对象定位装置180（例如人体定位装置）。
磁场发射与信号接收模块170用于产生磁场，并接收位于磁场中的定位装置111反馈回的信号，根据该反馈的信号获得定位装置111相对于磁场的空间方位信息。空间方位信息可以采用不同的坐标系表现形式，用来展现相对于磁场的位置信息和方位信息中的至少之一。本文中的磁场包括电磁场等。例如，在其中一个实施例中，磁场发射与信号接收模块170以数据线或者无线的形式连接定位装置111。磁场发射与信号接收模块170用于发射磁场并接收定位装置111传回来的位置信息。具体定位原理为定位装置111放在磁场范围内，定位装置（例如定位线圈）把当前方位的磁场相关信息反馈给磁场发射与信号接收模块170，该模块计算出定位装置111的当前空间坐标和方向，如(x,y,z,a,b,c)，前三个坐标为定位装置111在当前时刻相对于磁场的空间坐标（即位置信息），后三个参数为定位装置111在当前时刻相对于磁场的方向信息（即方位信息）。同样可以采用欧拉角、四元数以及矩阵的形式描述物体的空间坐标和方位信息。在后续描述中，可以采用(x,y,z,a,b,c)的方式表达定位装置111在当前时刻相对于磁场的方向信息和空间坐标（即位置信息），并联合表征定位装置返回的空间方位信息。当然，也可以仅采用(x,y,z)或(a,b,c)来表征空间方位信息。
定位装置111如图1所示，其固定在超声探头101上。定位装置111放置在前述磁场的范围内时能实时返回该定位装置在磁场中当前时刻的空间方位信息，例如位置信息和/或方向信息。定位装置111用于与超声探头101绑定在一起，实时返回的空间方位信息可以等效于超声探头的当前空间方位信息。绑定方式可以是设计专门的卡槽装置，把定位装置111卡在超声探头101表面的某个位置，从而形成探头121；也可以在加工探头的同时把定位装置111内置在超声探头101的内部，从而形成整体探头121。基于超声探头的空间方位信息（例如，可以采用矩阵形式表示），可以计算映射矩阵Pi，该映射矩阵用于将探头121的图像空间坐标系中的坐标映射到磁场所形成的磁场坐标系中，i表示当前时刻。映射矩阵Pi可以包含以下两个部分：
第一部分为映射矩阵A把超声探头的图像空间坐标系映射到定位装置111所在磁场空间坐标系;第二部分为定位装置111当前时刻所在的磁场中的方位信息Ri,即Pi=Ri*A。上述通过定位装置111和磁场发射与信号接收模块170形成的导航系统,可以采用导航装置领域内的相关技术,具体可参见相关领域的说明,在此不再详细说明。当然,本实施例中的导航系统可以但不限于采用前述磁场定位方式,只要是可以用来定位超声探头在实际空间中的空间方位信息都可以用于导航系统。
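为便于理解上述映射矩阵Pi=Ri*A的含义，下面给出一个简化的示意代码（Python/NumPy草图；其中欧拉角次序、标定矩阵A的取值等均为演示性假设，并非对实际导航系统实现方式的限定）：

```python
import numpy as np

def pose_to_matrix(x, y, z, a, b, c):
    """将定位装置返回的 (x,y,z,a,b,c) 转换为 4x4 齐次变换矩阵 Ri。
    这里假设 (a,b,c) 为绕 Z/Y/X 轴的欧拉角（弧度），仅作演示。"""
    ca, cb, cc = np.cos([a, b, c])
    sa, sb, sc = np.sin([a, b, c])
    Rz = np.array([[ca, -sa, 0], [sa, ca, 0], [0, 0, 1]])
    Ry = np.array([[cb, 0, sb], [0, 1, 0], [-sb, 0, cb]])
    Rx = np.array([[1, 0, 0], [0, cc, -sc], [0, sc, cc]])
    Ri = np.eye(4)
    Ri[:3, :3] = Rz @ Ry @ Rx     # 方向信息
    Ri[:3, 3] = [x, y, z]         # 位置信息
    return Ri

# A 为把图像空间坐标系映射到定位装置坐标系的标定矩阵（由标定流程预先获得）
A = np.eye(4)                                        # 演示用占位值
Ri = pose_to_matrix(10.0, 20.0, 30.0, 0.1, 0.0, 0.2)
Pi = Ri @ A                                          # 映射矩阵 Pi = Ri * A

pixel = np.array([64.0, 128.0, 0.0, 1.0])            # 图像空间中某像素的齐次坐标
point_in_field = Pi @ pixel                          # 该像素在磁场空间坐标系中的坐标
```

实际系统中，矩阵A通常由探头与定位装置之间的标定流程预先获得，此处仅以单位阵占位演示矩阵的组合方式。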
对象定位装置180为用于规划消融的超声系统100的可选装置，该装置放置在前述产生的磁场范围内时能实时返回该装置当前空间方位信息，本实施例中的空间方位信息的解释说明可参见前文相关说明，例如可以包括位置信息和方向信息（可参见前述定位装置111的解释说明）。对象定位装置180用于放置在被测体（例如人体或者动物）的表面，可以用于获取当前被测体的空间方位信息，或者获取被测体表面的运动信息。对象定位装置180可以固定在含有上述检测对象的被测体表面，磁场发射与信号接收模块170接收位于前述磁场中的对象定位装置180反馈回的检测信号，根据该检测信号获得对象定位装置180相对于前述磁场的空间方位信息、和/或被测体表面的运动信息，本文中提到的被测体表面的运动信息包括被测体的呼吸运动信息，例如呼吸频率。获得的检测信息可以用于对超声探头上定位装置111获得的信息进行校正，例如，根据对象定位装置180反馈的检测信号，纠正依据定位装置111获得的空间方位信息。该对象定位装置180可以使用双面胶、胶带、绷带等固定到被测体的皮肤表面，在整个超声图像的扫描采集过程中该对象定位装置180维持在皮肤表面的位置不动。
前述提到的超声探头可以是不同型号的探头，如二维凸阵探头、三维凸阵探头、四维阵探头、线阵探头等等，采用不同的探头进行消融效果评估时，具体的数据处理技术可根据探头类型的不同进行调整。
处理器140根据前述提供的设备可以获取超声探头位于磁场中任意一个方位时对应的实时超声图像和与该方位对应的空间方位信息,通过空间方位信息和实时超声图像之间的对应关联关系,可以支持对实时超声图像以及相关空间方位信息进行显示和分析等处理。在本发明的其中一些实施例中,图1中的信号处理模块116和图像处理模块126可以集成在一个主板106上,或者其中的一个或两个以上(本文中以上包括本数)的模块集成在一个或多个处理器/控制器芯片上实现。当然,在图1所示的实施例中,处理器140和存储器160可以设置在主板106上,也可以独立于主板106设置。处理器140还可以与图1中的信号处理模块116和图像处理模块126集成在一个或多个处理器/控制器芯片上实现。前述磁场发射与信号接收模块170和定位装置111,或者还可以包括对象定位装置180,可以构成磁导航定位系统。
基于上述图1所提供的用于规划消融的超声系统100的硬件环境，以下将结合图2至图6详细描述有关超声介入手术模拟引导与评估方法的各个实施例的执行流程。
一、基于图1中提供的带有导航系统的超声成像设备,提供了一种可以基于实时超声图像进行消融介入规划的方法,具体过程参见图2所示。
在图2中的步骤210中，发射电路和接收电路(103和104)激励超声探头(101)，根据设定的超声成像参数向含有特定组织的检测对象发射超声波束，并接收上述超声波束的回波，获得超声回波信号。并且，在图2的步骤220中，图像处理模块(126)依据前述超声成像参数的部分或者全部，根据超声回波信号获得实时超声图像。本文的实时超声图像还可以是前文所述的不同模式的超声图像，如B图像，C图像，D图像等等，或者其他类型的二维超声图像或三维超声图像。本实施例获得的实时超声图像可以是超声探头位于任意一个方位时所获得的超声图像，而且实时超声图像可以是当前获得的一帧超声图像，也可以是连续的多帧超声图像。
在图2的步骤230中,通过导航系统获得固定在超声探头上的定位装置所在空间的空间方位信息。导航系统至少包括前述定位装置111,定位装置111按照前述方式择一固定在超声探头101上。磁场发射与信号接收模块170产生一磁场,可覆盖包含定位装置111的空间范围。磁场发射与信号接收模块170接收位于磁场中的定位装置111反馈回的信号,根据定位装置111反馈的信号获得定位装置111相对于磁场的空间方位信息。
在图2的步骤240中，处理器140调用存储器中的程序来根据前述步骤中的空间方位信息，关联记录前述实时超声图像与空间方位信息之间的对应关系，从而可以获得图像空间坐标系与前述磁场所在的磁场空间坐标系之间的映射关系，利用这一映射关系可以将实时超声图像与导入的三维模型数据和超声探头中的其中之一进行空间方位映射，并映射到同一个坐标系下，便于融合显示。图像空间坐标系表示为通过超声探头采集特定组织而获得的实时超声图像所形成的图像像素所在的坐标系空间，而前述磁场所在的磁场空间坐标系为磁场范围内的坐标系空间。本文中提到的图像空间坐标系与磁场空间坐标系之间的映射关系可以通过前述映射矩阵表示。在本实施例中，通过磁场发射与信号接收模块可以实时获取超声探头所在空间中的空间方位信息，而实时激活超声探头则对应获得每个时刻对应的实时超声图像，因此利用用于规划消融的超声系统可以将实时超声图像与采集实时超声图像时超声探头对应的空间方位信息进行关联记录，并存储，例如，当利用超声探头获得电影图像数据（本文的电影图像数据可以包含多帧连续的二维超声图像，但不限于此）时，可以对应获得电影图像数据中每一帧图像对应的超声探头的空间方位信息。通过图1所示设备中的定位装置、对象定位装置和磁场发射与信号接收模块，可以关联记录任意一帧实时超声图像与前述超声探头在磁场空间坐标系中的空间方位信息之间的对应关系。例如，通过激励位于一个方位的超声探头对应获得与前述一个方位关联对应的实时超声图像，以及与前述一个方位关联对应的空间方位信息，存储这些信息后建立对应关系。
在图2的步骤250中,处理器140导入前述特定组织的三维模型数据,获得特定组织的视图数据。本文提到的三维模型数据至少包含特定组织的大小信息、形状信息、位置信息、和周围组织的血管分布信息中的其中之一,而三维模型数据可以是单帧静态的图像数据,也可以是电影图像数据,例如,含有特定组织的CT、MRI、三维超声造影等模态三维图像数据。
这些三维模型数据可以是预先存储的利用其它设备或者本实施例设备获取的离线图像数据，也可以是现场基于本实施例中提供的用于规划消融的超声系统获取的三维图像数据。在其中一个实施例中，三维模型数据可以来源于术前采集的图像数据。又例如，在其中一个实施例中，通过以下步骤获得上述步骤250中的特定组织的三维模型数据。
首先,通过超声探头获取包含前述特定组织的电影图像数据,前述电影图像数据可以是在造影剂灌注之后获得,具体的获取方式可参见图5所示的实施例。然后,通过导航系统获取采集前述电影图像数据时前述超声探头位于不同方位时对应的空间方位信息,其次,处理器根据前述超声探头位于不同方位时对应的空间方位信息,将前述电影图像数据中每一帧图像映射到磁场空间坐标系中,重建三维超声图像,用以获得前述特定组织的三维模型数据。
以下以造影成像模式下获得的前述电影图像数据为例,结合图5所示流程进行具体说明。
执行图5中的步骤510，发射电路和接收电路(103和104)激励超声探头(101)向含有特定组织的检测对象发射超声波束，并接收超声波束的回波，获得超声回波信号；执行图5中的步骤512，图像处理模块根据超声回波信号获得实时超声图像，具体参见图2中的步骤210和220。此外，执行步骤514，磁场发射与信号接收模块170产生磁场，并接收位于磁场中的定位装置111反馈回的信号，根据定位装置111反馈的信号获得定位装置111相对于磁场的空间方位信息，具体参见图2中的步骤230。执行前述步骤后，移动超声探头在被测体上查找特定组织，选择包含特定组织的观察切面。之后，执行步骤516，可以接收用户输入的模式切换指令，进入造影成像模式，即可以给被测体注射超声造影剂（步骤518），当在特定组织灌注时，获取包含特定组织的电影图像数据（步骤520），本文中的电影图像数据包含多帧连续的二维图像。为了能更好地在后续的步骤中重建更加精确的三维模型数据，此时获取的电影图像数据可以是超声探头沿固定方向摆动或者位于特定组织附近区域不同位置处对应获得的预定时长的电影图像数据，因此，在采集上述电影图像数据的过程中，通过磁场发射与信号接收模块170可以获得超声探头位于不同方位时对应的空间方位信息，超声探头沿固定方向摆动或者位于特定组织附近区域不同位置均对应处于空间中的不同方位，所以，上述电影图像数据中的每一帧图像对应关联的空间方位信息将被获得。执行步骤522，获取采集电影图像数据中每一帧图像对应关联的空间方位信息，例如可以采用前述映射矩阵Pi表示。再执行步骤524，根据采集的电影图像数据中每一帧图像对应关联的空间方位信息，将电影图像数据中每一帧图像映射到磁场空间坐标系中，重建三维超声造影图像，从而获得前述三维模型数据的其中一种（步骤526）。上述重建算法即是通常所说的基于导航系统的自由臂（free hand）重建，具体实现是基于前述映射矩阵Pi把当前每一帧图像中的每一个像素映射到磁场空间坐标系中，形成三维空间中的点集，基于这个点集采用插值算法（最近邻、线性）可以生成三维长方体数据，从而获得三维超声造影图像。当然，本文的其他实施例中不限于采用前述实施例中的造影成像模式，还可以基于M模式、B模式等等来获得前述电影图像数据，从而进行三维超声图像的重建。
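作为对上述自由臂（free hand）重建的补充说明，下面给出一个采用最近邻插入的简化草图（Python/NumPy；假设像素间距已折算进映射矩阵Pi、且体数据原点与磁场坐标原点对齐，均为演示性假设）：

```python
import numpy as np

def freehand_reconstruct(frames, mats, vol_shape, voxel_size):
    """自由臂(free hand)重建的最近邻插入草图。
    frames: 多帧二维图像；mats: 每帧对应的 4x4 映射矩阵 Pi；
    vol_shape/voxel_size: 输出三维长方体的体素网格尺寸与体素边长。"""
    vol = np.zeros(vol_shape, dtype=np.float32)
    cnt = np.zeros(vol_shape, dtype=np.int32)
    for img, Pi in zip(frames, mats):
        h, w = img.shape
        ys, xs = np.mgrid[0:h, 0:w]
        pts = np.stack([xs.ravel(), ys.ravel(),
                        np.zeros(h * w), np.ones(h * w)])    # 帧内像素位于 z=0 平面
        world = Pi @ pts                                     # 映射到磁场空间坐标系
        idx = np.round(world[:3] / voxel_size).astype(int)   # 最近邻体素索引
        ok = np.all((idx >= 0) & (idx < np.array(vol_shape)[:, None]), axis=0)
        np.add.at(vol, (idx[0, ok], idx[1, ok], idx[2, ok]), img.ravel()[ok])
        np.add.at(cnt, (idx[0, ok], idx[1, ok], idx[2, ok]), 1)
    return vol / np.maximum(cnt, 1)   # 多次命中取平均；空洞可再用线性插值填充
```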
基于导入的特定组织的三维模型数据，可以获得前述特定组织的三维体信息，得到特定组织的视图数据。例如，在导入特定组织的三维模型数据之后，通过图像分割提取步骤，获得前述特定组织的三维体。图像分割提取步骤包括从前述三维超声图像中分割提取特定组织形状、位置信息、组织大小、和组织深度等等。分割方法有很多种，例如手动分割方法：在三维图像数据的每一帧二维层片上手动描记特定组织边界，由这些二维层片分割结果生成特定组织的三维体（T）。同样也可以采用交互式（如半自动）或者全自动的分割算法，可参见相关的图像分割技术，在此不做详细说明。在实际消融时，不但要消融某一组织等特定组织所在的区域，同时需要消融掉特定组织周围一定区域内的组织，即特定组织的安全边界。安全边界是指在介入某一组织消融过程中，一般要求消融区域覆盖某一组织边缘向外扩展5mm左右的距离，以保证对整个某一组织的完全消融，即从特定组织所在区域向外扩展5mm左右距离后所获得的边界。因此多针联合的消融区域不但要覆盖某一组织区域还要覆盖安全边界。同样用符号T表示某一组织等特定组织和特定组织的安全边界，后者体数据包含前者，而且在介入消融过程中两者都需要被消融。在图像处理领域，三维体的安全边界可以由特定组织的组织区域向外膨胀（膨胀算法为简单的形态滤波算法）一定距离生成。因此，本步骤中的特定组织的三维体可以是特定组织的组织区域，也可以是特定组织对应的消融区域，例如特定组织对应的安全边界所囊括的范围。特定组织的三维体至少包含了特定组织形状、位置信息、组织大小、组织深度、特定组织周围的血管分布信息等等信息、和安全边界中的其中之一。本实施例中提到的视图数据至少包括以下内容之一：针对特定组织的切面图像数据、针对特定组织的三维图像数据、依据特定组织分割获得的三维体形成的二维图示数据、和依据特定组织分割获得的三维体形成的三维图示或图标数据。图示或图标数据可以并非真实的图像数据，可以是用来在图或图像上表征三维体信息的图标，这里的三维体信息包括：特定组织的形状、位置信息、组织大小、组织深度、和安全边界等信息内容中之一或者两个以上的组合。
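针对上文提到的由组织区域向外膨胀约5mm生成安全边界的形态滤波做法，下面给出一个示意性实现（Python/SciPy草图；体素各向同性等均为演示假设）：

```python
import numpy as np
from scipy import ndimage

def add_safety_margin(tissue_mask, voxel_size_mm, margin_mm=5.0):
    """由特定组织的三维体向外膨胀约 5mm 生成含安全边界的三维体 T。
    tissue_mask: bool 型三维数组；voxel_size_mm: 各向同性体素边长（演示假设）。"""
    r = int(round(margin_mm / voxel_size_mm))            # 膨胀半径对应的体素数
    zz, yy, xx = np.mgrid[-r:r + 1, -r:r + 1, -r:r + 1]
    ball = (xx ** 2 + yy ** 2 + zz ** 2) <= r ** 2       # 球形结构元素
    return ndimage.binary_dilation(tissue_mask, structure=ball)

# 示例：由分割得到的组织区域生成含安全边界的待消融三维体
T = np.zeros((64, 64, 64), dtype=bool)
T[28:36, 28:36, 28:36] = True
T_with_margin = add_safety_margin(T, voxel_size_mm=1.0)
```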
在图2的步骤260中,处理器140根据上述关联记录的结果,例如实时超声图像与前述空间方位信息之间的对应关系,将三维模型数据与实时超声图像进行配准,从而可以获得三维体与实时超声图像之间的对应关系,建立三维体、实时超声图像以及定位装置所在空间三者之间的映射关系。基于这三者的映射关系,可以进行融合显示,例如,在其中一个实施例中,在显示屏上显示实时超声图像,并在超声图像上融合显示关于特定组织的视图数据。在其中一个实施例中,根据前述提到的映射关系,基于获得的实时超声图像对应的空间方位信息,可以将实时超声图像、特定组织的视图数据映射到同一坐标系中。
在图2的步骤270中,处理器可以基于配准后的结果,在显示屏上将针对同一特定组织的实时超声图像和视图数据进行融合显示。
把导入的三维模型数据与实时超声图像基于导航融合联动显示,即通过配准步骤把三维模型数据与实时超声图像进行配准,使得可以实时联动显示实时超声图像与三维模型数据中的部分或全部内容,例如,与实时超声图像联动显示三维模型数据中与当前实时超声图像对应的二维图像切面、三维体等信息。
在三维模型数据与实时超声图像的配准融合环节，可以采用CT/MRI与实时超声配准融合的操作，建立三维超声数据（包含特定组织信息）与实时超声图像的融合联动。图像配准融合的方案有很多种，例如：点点配准、面面配准、垂直面配准等。其中面面融合配准步骤可以是，在三维模型数据中选择一个图像切面，使用超声探头在被测体上找出同样的实时超声图像中的图像切面（此时映射矩阵为Pt），建立三维模型数据与当前超声实时图像所在图像空间坐标系的第二映射关系M。若将三维模型数据中每个像素映射到磁场空间坐标系，则映射方式为Pt*M；而把特定组织的三维体映射到磁场空间坐标系时，则映射关系为Ts=Pt*M*T，其中Ts表示磁场空间坐标系中的三维体。
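上述Ts=Pt*M*T的映射过程可以用如下示意代码表达（Python/NumPy草图，矩阵均以4x4齐次形式表示，数值仅为演示）：

```python
import numpy as np

def map_volume_to_field(T_points_h, M, Pt):
    """把三维模型空间中的三维体点集 T 映射到磁场空间坐标系：Ts = Pt * M * T。
    T_points_h: 4xN 齐次坐标点集；M: 第二映射关系；Pt: 配准切面时刻的映射矩阵。"""
    return Pt @ M @ T_points_h

# 演示：三个体素中心点（每列为一个点的齐次坐标，数值随意）
T_pts = np.array([[10.0, 20.0, 5.0],
                  [11.0, 20.0, 6.0],
                  [12.0, 21.0, 7.0],
                  [1.0, 1.0, 1.0]])
Ts = map_volume_to_field(T_pts, M=np.eye(4), Pt=np.eye(4))
```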
可以从三维体与实时超声图像进行配准之后的结果中，抽取表征三维体的三维视图数据或二维视图数据分别与实时超声图像数据进行融合显示，可以形成至少以下三种显示方式：显示特定组织对应的三维视图数据和二维实时超声图像；显示特定组织对应的三维视图数据和三维实时超声图像；和，显示特定组织对应的二维视图数据和实时超声图像。当然上述三种显示方式可以同时在显示屏的同一界面中显示，例如，上述三种显示方式分别显示在同一界面的多个显示区域中，然后排布显示在同一界面上。
在图2的步骤270中,处理器140融合显示实时超声图像和特定组织的视图数据,从而可以形成至少一幅融合图。在其中一个实施例中,在一帧实时超声图像上标记显示前述特定组织的视图数据从而形成一帧融合图。当然,在标记前述特定组织的三维体时,还可以根据前述Ts=Pt*M*T,将三维体映射到磁场空间坐标系中,从而实现将特定组织的视图数据与实时图像可以映射到同一空间坐标系中,然后根据空间坐标系对应的图像数据,在融合图上对应显示实时超声图像和特定组织的视图数据。例如,在其中一个实施例中,提取一帧实时超声图像,输出显示该帧实时超声图像,基于实时超声图像和特定组织的视图数据的配准结果,确定特定组织的视图数据在实时超声图像上的对应位置,根据确定的对应位置在该帧实时超声图像上叠加显示特定组织的视图数据,形成至少一幅融合图。本实施例中的实时超声图像可以是二维图像、也可以是三维图像。
处理器140在前述显示屏130上显示前述实时超声图像。在显示屏上显示实时超声图像时,可以分屏显示或者多窗口显示。此外,在显示实时超声图像的同时,还可以同步显示三维模型数据中与实时超声图像对应的图像。
在其中一个实施例中,参见图7所示,在前述实时超声图像上标记显示前述三维体的方式至少包括以下方式之一:
1、参见位于图7中右上角位置的图B，在前述实时超声图像（图中的扇形区域735）上绘制前述特定组织的二维视图数据（图中的蓝色圆形区域734），用以在实时超声图像上突出显示前述三维体所在的区域。此外还可以在图B中标记消融区域（731、733）以及介入装置（如消融针732）。
2、参见位于图7中左上角位置的图A,根据前述实时超声图像(图中735),从前述三维模型数据中提取与前述实时超声图像对应的图像数据(图中713)并显示,在显示的图像数据713上绘制前述特定组织的二维视图数据712,用以在前述三维模型数据中提取的图像数据上突出显示前述三维体所在的区域。此外还可以在图A中标记消融区域711。
3、参见位于图7中左下角位置的图C，根据实时超声图像与空间方位信息之间的对应关系建立磁场空间坐标系下的融合图（如位于左下角的图C），在融合图C中显示超声探头721与实时超声图像722之间的位置关系，以及在融合图C的相应位置处标记前述特定组织的三维视图数据724。此外，在其中一个实施例中，如图7所示，在融合图C中还可以标记介入装置（如消融针725）、消融区域723。
无论是图7中的图B,还是图C,都可以称之为是本文提到的融合图。融合图可以是在二维实时超声图像上叠加三维体标记(三维体标记也可以是一个圆,如图7中的B所示)获得的,也可以是在三维实时超声图像上叠加三维体标记获得的,或者还可以是基于实时超声图像和超声探头的空间位置关系形成的组合图像(如图7中的C所示)。
可见,在一些实施例中,在实时超声图像上叠加显示的特定组织的视图数据,可以采用图7中所示的图标标记,用以在实时超声图像的相应位置上标记特定组织的形状、位置信息、组织大小、组织深度、和安全边界等信息中的之一或两个以上的组合信息。
图7中采用直线段来表征消融设备标记,在实时超声图像数据上通过标记显示消融设备标记和预计消融区域来表征规划消融路径,在消融设备标记的末端显示预计消融区域,当消融设备标记在实时超声图像上的位置发生变化时,则预计消融区域联动变化;或者当超声探头的方位发生变化时、消融设备标记和/或预计消融区域也会联动变化。
在图2的步骤280中,在实时超声图像上确定规划消融路径。例如,根据实时超声图像,通过人机交互设备接收用户在实时超声图像上的输入,基于用户的输入,处理器140确定规划消融路径。
本文提到的规划消融路径至少包括穿刺引导线角度、穿刺引导方向、消融针进入路径、消融路径深度、预计消融区域、消融功率、消融针数量、预计工作时长、预计消融范围（或消融区域）等信息中的其中之一。而消融针进入路径可以包括进针距离、进针角度和进针深度、进针位置等信息。在本实施例中，根据显示屏上融合显示的实时超声图像和特定组织的视图数据，用户可以基于在显示屏上显示的实时超声图像，来设置消融针进入路径，从而可以精确地定位被测体介入消融时的路径，提升规划消融路径的精确度，提高消融效果，降低手术风险。本文中提到的消融设备包括一个或多个消融针，或者介入导管等等，本文中主要以消融针为例进行说明。通过本实施例可以基于前述实施例中显示的实时图像、特定组织的三维体、和/或前述三维模型数据中提取与前述实时超声图像对应的图像数据，来规划消融路径。在基于实时超声图像进行消融信息的设置时，具体可参见图3和图4所示的方法流程。
在图2的步骤290中,处理器根据规划消融路径,在前述融合图上显示消融设备和/或消融路径。例如,如图7所示,图B中消融针732,图C中消融针725。在图B中标记消融区域(731、733)。在融合图C中还可以标记消融区域723。
当然根据不同的规划情况还可以对应采用不同的标记显示方式，例如，在其中一个实施例中，处理器采用以下方式实现在融合图上标记消融设备和/或消融路径，例如如图7中的C所示，基于实时超声图像与空间方位信息之间的对应关系，获得超声探头与实时超声图像之间的位置关系；和，在融合图中标记超声探头与实时超声图像之间的位置关系。此时的实时超声图像可以是三维图像，也可以是二维图像。在图7中C的融合图中，显示超声探头的扫描区域与实时超声图像之间的扫描关系，并在相应的扫描深度位置融合显示前述特定目标组织的三维视图图标724，同时标记消融区域723，以及所设置的消融针725，从图中可以清楚地了解消融针与超声探头、实时超声图像在实际空间中的位置对应关系，方便用户更加直观地了解到实际规划的情况，还可以结合图B来进行更加准确的规划定位。基于实时超声图像与空间方位信息之间的对应关系，结合超声探头的扫描平面所在位置来确定超声探头与实时超声图像之间的位置关系。此外，在其中一个实施例中，在标记消融设备时，可以根据消融设备固定在超声探头上的角度，也可以是超声探头与消融设备的相对角度关系，结合实时超声图像与所获得的空间方位信息来换算消融设备在实时超声图像中的位置，或者消融设备与探头以及实时图像之间的方位关系。
在一些实施例中,如图7所示,在显示屏上显示表征超声探头的探头图标(721),探头图标的显示位置随采集到的空间方位信息的变化而变化。
在一些实施例中,在显示屏上至少采用以下方式之一来显示消融设备标记和/或规划消融路径:
在前述获得的实时超声图像上跟随消融设备标记显示预计消融区域,例如,当基于用户输入的指令调节消融设备标记的深度或角度等执行参数时,则预计消融区域的显示也将发生改变;
计算特定组织的视图数据与预计消融区域的重叠关系,并输出所述重叠关系的计算结果,具体可参见后面的相关实施例的说明;和,
在同一个融合图或同一个窗口中绘制探头图标和在实时超声图像的相应位置处标记预计消融区域,例如图7中的图C。
在一些实施例中,显示屏上包括多个窗口,多个窗口内的数据随超声探头位置的变化而联动变化。
在一些实施例中,多个窗口分别显示二维实时超声图像或三维实时超声图像与三维视图数据和二维视图数据中两者之一进行融合显示的多个结果。
二、基于图1和图2所示实施例,在进行步骤280在实时超声图像上确定关于消融设备的消融路径的过程中可参见图3所示的过程。
执行图3中的步骤310至步骤312,利用图1中的发射电路和接收电路(103和104)通过激励位于一个方位Q1上的超声探头,向含有特定组织的检测对象发射超声波束,并接收该超声波束的回波,获得超声回波信号,根据该超声回波信号可以获得与前述一个方位Q1关联对应的实时超声图像S1。用户在被测体表面移动超声探头或者摆动超声探头时,超声探头将在不同的方位Qn(n=1,2,…,N,N为大于1的自然数)上被激励用以获得不同方位对应的至少一帧实时图像Sn(n=1,2,…,N,N为大于1的自然数)。当然,将超声探头移动到不同方位的过程也可以通过电机驱动来实现。
执行图3中的步骤314至步骤316,通过导航系统获得固定在超声探头上的定位装置所在空间的空间方位信息,即方位Q1的空间方位信息,关联记录实时超声图像与方位Q1的空间方位信息的对应关系,获得图像空间坐标系与磁场空间坐标系之间的映射关系。步骤314至步骤316的具体实现可参见图2中的步骤230和步骤240。
执行图3中的步骤318和步骤320，处理器导入特定组织的三维模型数据，获得特定组织的视图数据，将三维模型数据与实时超声图像进行配准，可以获得三维模型数据与实时超声图像在图像空间坐标系或者在磁场空间坐标系上的映射关系，从而实现在实时超声图像上融合显示特定组织的视图数据，具体可参见图2中的步骤250和步骤260。
执行图3中的步骤322，显示与方位Q1关联对应的实时超声图像S1，在实时超声图像S1上融合显示特定组织的三维信息，从而在实时超声图像S1的相应位置上标记特定组织的视图数据（步骤324），形成融合图，这里的融合图可以是图7中的B图或者C图。融合显示的方式基于三维模型数据与实时超声图像进行配准的结果，在此不再重复说明，可参见前文的相关说明。
执行图3中的步骤326,基于显示的实时超声图像,处理器接收关于至少一个消融针的执行参数或规划消融路径。消融设备包括至少一个消融针。根据消融针与超声探头的相对固定位置关系,可以在实时超声图像上映射显示消融针,从而基于实时超声探头的跟踪定位,以及实时超声图像确定消融针的规划消融路径。本实施例提到的执行参数至少包括:消融功率、预计工作时长、和消融设备的个数等中的其中之一。
例如,在其中一个实施例中,步骤326或前述在实时超声图像上确定规划消融路径的过程可以采用如下步骤:
根据消融设备与超声探头的相对位置,可以在实时超声图像的第一位置处显示消融设备标记,获取关于消融设备标记的调整指令;基于该调整指令,变更消融设备标记在实时超声图像上的位置至第二位置处;记录消融设备标记的位置变化与实时超声图像、视图数据和/或空间方位关系的关联信息,获得规划消融路径。规划消融路径至少包括多个连续位置变化下对应的至少一个消融设备的执行参数信息集合。
此外,在另一些实施例中,在实时超声图像上确定规划消融路径之前或者步骤326之前还包括以下步骤:
首先,获取关于消融设备的执行参数,执行参数至少包括:消融功率、预计工作时长、和消融设备的个数等中的其中之一。例如,在显示界面上提供关于执行参数的输入窗口或下拉菜单,用以供用户输入选择指令,从而对执行参数进行设置。
然后,根据前述执行参数获得预计消融区域,在实时超声图像上显示预计消融区域,用以确定规划消融路径,其中预计消融区域的显示随消融设备标记的位置变化而变化。
结合前述的关于用户在实时超声图像上输入的消融设备标记的位置变化、以及相关执行参数的选择指令，使得用户在基于当前获得的实时超声图像对病人进行检查时，即可在当前模式下进行规划消融路径的设置。
基于此，为了实现两次规划消融路径的对比，还可以在其中一个实施例中，显示在先存储的规划消融路径，获得当前规划消融路径与在先规划消融路径之间的差别。例如，在显示屏上显示的规划消融路径，可以包括：处理器导入的关于至少一个消融针的第一规划消融路径，例如，第一规划消融路径可以为预先存储的规划消融路径。本实施例中的规划消融路径还可以包括：基于超声探头位于当前方位时对应获得的实时超声图像，处理器基于前述在实时超声图像上确定规划消融路径的过程接收的关于至少一个消融针的第二规划消融路径，例如，第二规划消融路径可以是操作者针对当前获得的实时超声图像而进行消融规划时输入的关于至少一个消融针的规划消融路径。用户可以通过人机交互设备在实时超声图像上输入第二规划消融路径，例如在前述显示的实时超声图像上设置规划消融路径。通过图3所示的流程可以设定规划消融针路径等信息，并基于实时超声图像在人体上查找某一组织的位置，确定进针位置，设定预计进针路径。根据实时超声图像，用户在实时超声图像上标记和/或设置消融针的穿刺引导线角度、穿刺引导方向、消融针进入路径、消融路径深度、消融功率、消融针数量、预计工作时长、预计消融范围（或消融区域）等规划消融路径的信息，从而进行消融规划设置。通过同时在显示屏上显示第一规划消融路径和第二规划消融路径，可以保存、记录、和/或标记两者之间的差别，用以提示用户。前述第一规划消融路径可以是基于当前系统获得的，也可以是基于其他系统获得的。
上述“第一”和“第二”仅仅是用于文字上用以区分消融路径的获取时间或来源不同,并不会对消融路径本身属性内容带来改变,也就是说,第一规划消融路径和第二规划消融路径均可以至少包括穿刺引导线角度、穿刺引导方向、消融针进入路径、消融路径深度、消融功率、消融针数量、预计工作时长、和预计消融范围(或消融区域)等信息中的其中之一。
在其中一个实施例中,消融路径可以是关于一个消融针对应的至少一次穿刺进入的消融路径,或者是关于多个消融针的至少一次穿刺进入的消融路径。
例如，在其中一个实施例中，实现前述步骤326的接收关于至少一个消融针的消融路径包括：接收关于第一消融针对应的第一规划消融路径以及接收关于第二消融针对应的第二规划消融路径。第一消融针和第二消融针可以分别对应两个不同的消融针，或者分别对应同一个消融针的前后两次穿刺进入设置。
执行图3中的步骤328，处理器根据上述消融路径或执行参数，确定预计消融区域。基于上述消融路径，可以确定包含特定组织的组织区域和安全边界（可参见前文相关说明）。操作者可以根据临床经验或者厂家提供的工作参数，设定消融针在给定的消融功率和工作时间等消融路径下的模拟消融范围（Si，第i次消融）。大多数消融针的预计消融范围为椭球，因此只需要设定椭球的长轴长度和短轴长度。操作者可以根据临床经验，在系统中事先设定好各种消融针在不同功率和工作时间下的模拟消融区域，构建一个简单的资料库。在临床应用时，操作者可以直接导入已经设定的消融针相关参数（消融范围以及功率设定等），然后再根据用户输入的消融路径来获得预计消融区域。在其中一个实施例中，前述实施例中还包括：根据预计消融路径或执行参数与预计消融区域之间的对应关系，建立两者的关联关系数据库，然后基于操作者（也可以是用户）输入的上述消融路径，查找关联关系数据库确定相应的预计消融区域。例如，根据步骤326中获得的第一规划消融路径，可以获得第一预计消融区域；同样地，根据第二规划消融路径，可以确定第二预计消融区域。第一消融路径和第二消融路径已在前文有相关解释，在此不再重复说明。可见，本实施例中提到的第一预计消融区域为基于当前获得的实时超声图像确定的规划消融路径所对应的预计消融区域，第二预计消融区域为导入的预先存储的规划消融路径所对应的预计消融区域；或者，第一预计消融区域为基于当前获得的实时超声图像确定的第一消融设备对应的规划消融路径所对应的预计消融区域，第二预计消融区域为基于当前获得的实时超声图像确定的第二消融设备对应的规划消融路径所对应的预计消融区域。第一消融设备和第二消融设备可以分别为一次消融规划所采用的多个消融设备（例如多个消融针）中的不同部分。
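对于上文所述“预先设定不同功率和工作时间下的椭球消融范围并构建资料库”的做法，下面给出一个示意性草图（Python；资料库中的数值与函数接口均为演示性假设，实际参数应依据厂家提供的工作参数或临床经验设定）：

```python
import numpy as np

# 简易资料库：以（消融功率W, 工作时长s）为键，值为预计消融椭球的（长轴, 短轴）mm
# 下列数值仅为演示假设，实际应依据厂家参数或临床经验录入
ABLATION_DB = {(30, 300): (30.0, 20.0),
               (40, 300): (35.0, 25.0),
               (40, 600): (40.0, 30.0)}

def ellipsoid_region(shape, voxel_mm, tip, direction, power, seconds):
    """生成预计消融区域 Si：以消融针尖端 tip 为中心、长轴沿进针方向的椭球掩膜。"""
    long_axis, short_axis = ABLATION_DB[(power, seconds)]
    a, b = long_axis / 2.0, short_axis / 2.0                # 半轴长(mm)
    d = np.asarray(direction, float)
    d /= np.linalg.norm(d)
    zz, yy, xx = np.mgrid[0:shape[0], 0:shape[1], 0:shape[2]]
    p = (np.stack([zz, yy, xx], axis=-1) - tip) * voxel_mm  # 各体素相对针尖的偏移(mm)
    t = p @ d                                               # 沿针方向的分量
    r2 = np.maximum(np.sum(p * p, axis=-1) - t * t, 0.0)    # 垂直分量的平方
    return (t / a) ** 2 + r2 / b ** 2 <= 1.0                # 椭球内部即预计消融区域

Si = ellipsoid_region((96, 96, 96), 1.0, tip=(48, 48, 48),
                      direction=(0.0, 0.0, 1.0), power=40, seconds=300)
```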
执行图3中的步骤330,处理器输出上述预计消融区域。以下提供几种输出预计消融区域的方式。
1、在上述实时超声图像上标记显示上述预计消融区域（图3中的步骤330），例如，在其中一个实例中，在上述实时超声图像上同时标记第一预计消融区域和第二预计消融区域。例如图7中的图B所示，红色的线条范围731为第二预计消融区域，粉红色渲染的范围733为第一预计消融区域。线条732为消融针。采用图B的显示方式可以更加清晰地可视化规划的消融范围与当前基于实时超声图像而获得的消融范围之间的对应关系。
2、计算上述三维体与预计消融区域的重叠关系，并输出上述重叠关系的计算结果（图3中的步骤332和步骤334）。上述三维体可以包括安全边界所囊括的范围。参见图7中的图B所示，可以分别计算第一预计消融区域和/或第二预计消融区域与上述三维体734之间的重叠关系，获得相应的计算结果，用于量化特定组织的视图数据与消融范围之间的关系，更进一步地精确比较规划的消融范围与当前基于实时超声图像而获得的消融范围之间的优劣（一个简化的重叠计算草图见本列表之后）。当然，作为预先存储的规划消融路径的第一规划消融路径，还可以是利用传统方式基于静态3D数据来进行的消融规划设置，结合本实施例提出的方式可以有效地比较传统方式与实时扫描进行消融规划时的区别，从而能提升手术精确度，并提高用户体验。
3、在前述提到的融合图中绘制超声探头与实时超声图像之间的位置关系,以及在融合图的相应位置处标记预计消融区域。参见图7中的图C所示,在前述提到的融合图(图C)中绘制超声探头图标721与前述实时超声图像722之间的位置关系,以及在融合图的相应位置处标记预计消融区域723。
当然在实际的消融区域的显示实施例中,可以借鉴前述三种方式,并通过前述三种输出方式中的任意一种或者两种的结合来实现处理器输出上述预计消融区域。
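与上述第2种输出方式相对应，三维体与预计消融区域重叠关系的一种简化计算方式如下（Python/NumPy草图，基于体素化的布尔掩膜，仅为演示）：

```python
import numpy as np

def coverage_percent(Ts_mask, S_mask):
    """计算三维体（含安全边界）Ts 被预计消融区域 S 覆盖的体积百分比。"""
    inter = np.logical_and(Ts_mask, S_mask).sum()
    return 100.0 * inter / max(int(Ts_mask.sum()), 1)

# 例如分别量化第一、第二预计消融区域对同一三维体的覆盖情况：
# cov1 = coverage_percent(Ts, S1)
# cov2 = coverage_percent(Ts, S2)
```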
此外,上述实施例中,通过接收超声探头位于当前方位时关于至少一个消融针的第二规划消融路径,从而确定第二预计消融区域,那么操作者可以通过此方式来进行消融规划设置,例如,在其中一个实施例中,基于图3所示的实施例,图3中的步骤326中规划消融路径为操作者针对当前获得的实时超声图像而进行消融规划时输入的关于至少一个消融针的第二规划消融路径,然后在后续的步骤336中关联记录超声探头位于磁场中每个方位时对应获得的实时超声图像、空间方位信息和第二规划消融路径,从而可以获得预先存储的前述第一消融路径,形成消融规划数据,并在后续的消融规划过程中使用该第一消融路径(参见图4)。返回前述步骤310,继续基于超声探头的不同方位,可以进行不同方位下的消融路径的设定,从而形成消融针的规划路径,完整显示消融手术的规划安排,以及消融效果的预估计。
当然从前文可知，本实施例中将定位装置固定在超声探头上，从而可以脱离穿刺针来进行跟踪，在术前进行穿刺规划时可以基于消融设备（如穿刺消融针）与探头的相对位置，利用前述设备和方法步骤来模拟进行消融设备的路径规划，而并不是基于真实的消融针介入到被测体内部进行规划，这样可以避免在术前规划时增加病人的痛苦，降低术前成本。消融设备（如穿刺消融针）与探头的相对位置包括：消融设备的端部（如消融针针尖）与超声探头的距离、消融设备（如消融针）与超声探头的安装夹角等等信息。在其中一个实施例中，前述步骤280和步骤290，或者前述步骤326和步骤330还可以包括以下步骤：
根据消融设备与超声探头的相对位置,在融合图上显示消融设备的消融设备标记;和,基于实时超声图像和消融设备标记,设定关于消融设备的消融路径。
例如，可以通过在融合图上调节消融设备标记在实时超声图像上的位置，从而规划消融针等消融设备的消融进针路径。根据处理器接收的用户输入的关于消融设备标记的调整命令，调整实时超声图像上消融设备标记的图像位置，从而获得第二规划消融路径的部分。在其中一个实施例中，在上述步骤336中关联记录超声探头位于磁场中每个方位时对应获得的实时超声图像、空间方位信息和规划消融路径时，可以通过以下方式来获得：关联记录表征消融设备的模拟标记、实时超声图像、和空间方位信息之间的对应关系，从而形成关于消融规划的术前数据。而在这一过程中，无需在超声探头上安装消融设备（例如消融针）即可实现消融规划，从而减少病人痛苦。
基于前文所述,如图4和图7中的图B所示,在利用本实施例的设备时还可以进行传统规划路径和实时规划路径之间的规划评估,从而提供给医生进行相应的参考,用以制定更加精确的规划路径。图4中给出了一种实施例的流程步骤,采用图4所示的实施例可以比较规划的消融范围与基于当前实时超声图像而获得的消融范围之间的优劣。
图4中的步骤410和步骤412中，发射电路和接收电路激励超声探头向含有特定组织的检测对象发射超声波束，并接收超声波束的回波获得超声回波信号，图像处理模块根据超声回波信号获得实时超声图像，具体可参见前文所述的步骤210和步骤220，或者步骤310和312。步骤414中，利用导航系统获得定位装置所在空间的空间方位信息，具体可参见前文所述的步骤230，或者步骤314。步骤416中，处理器关联记录实时超声图像与空间方位信息之间的对应关系，具体可参见前文所述的步骤240或者步骤316。通过执行步骤416可以获得每一帧实时超声图像与采集该超声图像时超声探头在磁场中的空间方位。步骤418和步骤420中，导入特定组织的三维模型数据，获得特定组织的视图数据，将三维模型数据与实时超声图像进行配准，具体可参见前文所述的步骤250和步骤260，或者步骤318和320。步骤422中，在显示屏上显示至少一帧实时超声图像，并在实时超声图像的相应位置上标记显示特定组织的视图数据（步骤424），可以形成前述的融合图，具体可参见前文所述的步骤270，或者步骤322和324。
接着,在图4中的步骤426中,处理器接收超声探头位于当前方位时关于至少一个消融针的第二规划消融路径。步骤428中,处理器根据第二规划消融路径确定第二预计消融区域。
在图4中的步骤430中,处理器导入预先存储的关于至少一个消融针的第一规划消融路径,根据第一规划消融路径确定第一预计消融区域。步骤432中,在实时超声图像的相应位置处标记第一预计消融区域和第二预计消融区域,例如图7中图B中733和731。
图4中的步骤434中,处理器量化第一预计消融区域和第二预计消融区域之间的重叠情况,在步骤436中处理器输出第一预计消融区域和第二预计消融区域之间重叠情况的量化结果。步骤426至步骤436中关于第一预计消融区域和第二预计消融区域的具体说明可参见前文关于步骤326至步骤330中的相关说明。参见前文的相关说明,关于至少一个消融针的第一消融路径可以是采用图3所示的方法获取的第二消融路径,或者是基于离线超声图像进行消融规划时获得的规划消融路径。第一预计消融区域和第二预计消融区域之间重叠情况的量化结果的输出可以是如图7所示的图形显示,或者文本显示比值。
如图7所示的图C,处理器可以在同一个窗口或同一个融合图中显示探头图标721、消融设备标记725以及当前时刻获得的实时超声图像722所在扫描截面之间的空间位置关系。更进一步的,在同一个窗口或同一个融合图中显示特定组织的视图数据(724)。基于配准结果可以从实际空间角度的视角来观察超声图像的采集角度,从而指导用户进行消融路径的规划操作。
在一些实施例中，前述在实时超声图像上确定规划消融路径的过程中还可以包括以下步骤：在实时超声图像内导入预先存储的规划消融路径，例如前述第二规划消融路径；随超声探头的位置变化，基于变化的实时超声图像获取针对导入的规划消融路径进行变更操作的输入，例如基于前述关于第一规划消融路径的获取说明，针对超声探头在不同时刻获得的至少一帧实时超声图像分别得到相应的规划消融路径，基于第二规划消融路径的修正结果可以获得第一规划消融路径，从而得到关于第二规划消融路径的变更操作的输入；根据变更操作输入，获得关于规划消融路径的变更数据。
在一些实施例中,存储规划消融路径或关于规划消融路径的变更数据,建立关于特定组织的规划消融数据库,所述数据库记录的信息至少包括以下内容之一:规划消融路径和关于消融的执行参数中的其中之一,所述数据库记录的信息还包括:规划消融路径和关于消融的执行参数中的其中之一与所述空间方位信息、实时超声图像和所述特定组织的视图数据中的其中之一的关联关系。便于进行导入规划消融路径的操作,或依据用户输入的执行参数直接导入相应的规划消融路径。
三、基于前述各个实施例和模拟规划设备,还可以用于多针联合消融的规划情形,具体可以参见图6所示的流程。
图6中的步骤610和步骤612中，通过发射电路和接收电路激励位于一个方位上的超声探头向含有特定组织的检测对象发射超声波束，并接收超声波束的回波获得超声回波信号，图像处理模块根据超声回波信号获得与前述方位关联对应的实时超声图像。步骤610和步骤612的具体实现方式可参见前文所述的步骤210和步骤220，或者步骤410和步骤412。
图6中的步骤614中，利用导航系统获得定位装置的空间方位信息。步骤614的具体实现方式可参见前文所述的步骤230，或者步骤414。图6中的步骤616中，处理器关联记录与前述方位关联对应的实时超声图像与空间方位信息之间的对应关系，从而可以获得图像空间坐标系与磁场空间坐标系之间的映射关系。步骤616的具体实现方式可参见前文所述的步骤240，或者步骤416。通过执行步骤616可以获得每一帧实时超声图像与采集该超声图像时超声探头在磁场中的方位。图6中的步骤618和步骤620中，处理器导入特定组织的三维模型数据，获得特定组织的视图数据，视图数据包括三维体信息，具体可参见前文相关说明，将三维模型数据与实时超声图像进行配准，从而可以获得三维模型数据与图像空间坐标系之间的映射关系，基于这些映射关系可以将特定组织的视图数据、超声探头、实时超声图像均映射在同一个坐标系下，并实现融合成像。步骤618和步骤620的具体实现方式可参见前文所述的步骤250和步骤260，或者步骤418和步骤420。图6中的步骤622中，处理器显示实时超声图像，并在实时超声图像上标记显示特定组织的视图数据，例如，可以在实时超声图像上的相应位置处标记特定组织的三维或二维图像数据，或者在实时超声图像上的相应位置处标记特定组织的图标，可以获得至少一个融合图。步骤622的具体实现方式可参见前文所述的步骤270，或者步骤422和424。
图6中的步骤624中，根据实时超声图像，处理器接收设定第一消融针对应获得的规划消融路径。图6中的步骤626中，处理器根据前述规划消融路径，确定预计消融区域的第一部分。图6中的步骤628中，处理器计算前述第一部分与前述视图数据的重叠关系，获得第一计算结果。图6中的步骤630中，处理器输出前述第一计算结果。图6中的步骤632中，根据实时超声图像，处理器接收设定第二消融针对应获得的规划消融路径。图6中的步骤634中，处理器根据前述再次获得的规划消融路径，确定预计消融区域的第二部分。图6中的步骤636中，处理器计算前述第二部分和第一部分与前述视图数据的重叠关系，获得第二计算结果。图6中的步骤638中，处理器将输出的第一计算结果更新为前述第二计算结果。第一消融针和第二消融针可以分别对应两个不同的消融针，或者分别对应同一个消融针的前后两次穿刺进入设置，因此，这里提到的两次规划消融路径的获得，可以是分别调整第一个消融针和第二个消融针后分别对应获得的两次规划消融路径的输入，也可以分别是调整同一个消融针后分别对应获得的两次消融路径的输入。在其中一个实施例中，第一消融针和第二消融针为两个不同的消融针，那么采用本实施例之后可以展现多针联合消融过程中逐步进行三维体消融规划处理的过程，并逐步地显示相应的消融范围比例，为用户提供良好的数据支持，由传统的人为判断转变为计算机辅助判断的过程，使得消融规划更加科学，更加精确。上述第一计算结果和第二计算结果的输出可以是在融合图上进行渲染输出，也可以是采用文本的方式输出显示，并且还可以根据消融规划的过程进行更新显示。
进行设定模拟消融针路径的流程可以参见图3所示。首先操作者在患者体表移动探头，按照构想的介入方案找到合适的位置和方向，获取特定组织的期望超声切面，即预计的消融进针位置。结合穿刺架与穿刺引导线，调整探头角度或者穿刺引导线角度，使得穿刺引导线经过特定组织（如图7中的图B）。通过人机交互设备的超声控制面板旋钮、按键或者触摸屏等调整进针深度等消融路径，屏幕上显示模拟消融针沿穿刺引导线进入特定组织（如某一组织区域），确定后在模拟消融针尖端显示模拟椭球消融区域（如图7中图C的723）。确定时系统自动存储当前设定的超声切面、穿刺引导线角度、模拟消融针路径、路径深度、模拟消融区域等消融路径及实时超声图像。由于在探头上绑定有磁定位装置，因此探头上定位装置所表示的空间相对于磁场的物理空间映射矩阵Pi可以实时获得。通过映射矩阵Pi，可以把模拟消融区域在超声图像空间中的坐标映射到磁场所在的物理空间，从而获得模拟消融区域在磁场空间坐标系中的坐标，即Pi*Si。在需要进行多针消融时（一根针多次或者多根针同时）重复前述环节操作设定多根模拟消融针（具体可参见图6所示流程），多针的联合消融区域B为下述公式所示：
$$B=\bigcup_{i}\left(P_i \cdot S_i\right)$$
（即多个消融区域的并集）。当
$T_s \subseteq B$ 时，特定组织（如某一组织（安全边界））被完全消融。在多针消融的过程中，可以实时显示消融残余百分比用以量化表征前述三维体与预计消融区域的重叠关系，即特定组织（如某一组织（安全边界））消融残余区域占整个三维体的百分比，即第k次消融时的实时百分比A可以表示为：
$$A=\frac{V\left(T_s\right)-V\left(T_s\cap\bigcup_{i=1}^{k}\left(P_i \cdot S_i\right)\right)}{V\left(T_s\right)}\times 100\%$$
其中 $V(\cdot)$ 表示区域的体积。
其中，第k次消融可以是基于一个消融针的k次消融，也可以是基于k个消融针的消融，该参数A实时定量显示当前消融效果，可以用于作为前述步骤332和步骤334的计算方式，也可以用于图6中的第一计算结果和第二计算结果的计算方式。同样也可以计算已消融体积百分比参数，以及计算最小消融针数，即计算最少需要多少个预计消融区域联合在一起（$\bigcup_{i}\left(P_i \cdot S_i\right)$）可以包含某一组织区域 $T_s$。
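上述消融残余百分比A的计算可以用如下示意代码表达（Python/NumPy草图；Ts与各Pi*Si均假设已体素化到同一磁场空间网格，仅为演示）：

```python
import numpy as np

def residual_percent(Ts_mask, ablation_masks):
    """第k次消融后的消融残余百分比 A：
    联合消融区域 B 为各 Pi*Si 的并集，A = (V(Ts) - V(Ts∩B)) / V(Ts) * 100%。"""
    B = np.zeros_like(Ts_mask)
    for S in ablation_masks:               # 依次并入每根（次）消融针的预计消融区域
        B |= S
    covered = np.logical_and(Ts_mask, B).sum()
    total = max(int(Ts_mask.sum()), 1)
    return 100.0 * (total - covered) / total

# A 为 0 时表示 Ts 已被联合消融区域完全覆盖，即特定组织（含安全边界）被完全消融
```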
图3至图6所示的各个实施例中，都可以在显示屏上显示实时图像、多针联合消融区域、某一组织目标，上述需要显示的信息包含了二维图像、三维图像数据、三维形状目标（即三维体）之间的关系，如图7所示的多窗口联动显示平台。窗口之一融合显示实时二维图像与模拟消融区域以及三维体的相交区域，即把模拟消融区域和某一组织（安全边界）与实时图像相交区域彩色叠加在实时图像上显示（如图7右上图B）。也可以采用体绘制（VR）或者面绘制（SSD）等三维显示技术来显示实时图像、多针联合消融区域、某一组织目标等（如图7左下图C）。
通过本设备可以引导操作者沿已经设定路径进行介入消融。操作者选择一个模拟过程中设定的路径可以得到设定该路径时的参数：探头方位信息Ri、映射矩阵Pi、特定组织的视图数据（例如三维体数据Si）等；同时获取当前实时探头对应的探头方位信息Rk、映射矩阵Pk、三维体数据Sk。基于两组数据间的相互关系可以引导操作者沿着模拟设定的路径进行消融。例如：利用超声探头当前的空间方位Rk和预先存储的消融路径（如消融针进入路径）中探头空间方位信息Ri，可以计算当前探头和模拟设定探头之间的距离、方位夹角等定量参数，可以在三维显示系统中，在融合图上直接显示两个探头模型的相对位置（例如在图7左下窗口，同时显示两个探头模型，其中一个721表示当前超声探头的位置，用另一个探头模型表示模拟设定探头的位置）。也可以实时显示模拟设定的预计消融区域与实时探头对应的预计消融区域的重合率，即((Pi*Si)∩(Pk*Sk))/(Pi*Si)，其中Pi*Si可以用来获得配准后的三维体。或者如图7右上窗口图B，融合显示实时图像与模拟设定消融区域、以及预计消融区域的相交情况，即在实时图像上以不同颜色标记已设定预计消融区域和当前预计消融区域。
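上文提到的当前探头与模拟设定探头之间的距离、方位夹角以及预计消融区域重合率，可按如下方式示意计算（Python/NumPy草图，探头方位以4x4齐次矩阵表示，均为演示性假设）：

```python
import numpy as np

def probe_pose_diff(Ri, Rk):
    """比较模拟设定探头方位 Ri 与当前探头方位 Rk（均为 4x4 齐次矩阵）：
    返回两探头间的距离与方位夹角（度），用于引导操作者复位探头。"""
    dist = np.linalg.norm(Ri[:3, 3] - Rk[:3, 3])
    R = Ri[:3, :3].T @ Rk[:3, :3]
    angle = np.degrees(np.arccos(np.clip((np.trace(R) - 1.0) / 2.0, -1.0, 1.0)))
    return dist, angle

def region_overlap_rate(Si_mask, Sk_mask):
    """预计消融区域重合率 ((Pi*Si)∩(Pk*Sk))/(Pi*Si) 的体素化草图。"""
    return np.logical_and(Si_mask, Sk_mask).sum() / max(int(Si_mask.sum()), 1)
```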
基于前述各个实施例提供的关于规划消融轨迹的方法和系统，可以在其中一个实施例中提出一种用于引导规划设计的超声设备，显示多针同时消融时的预计消融效果，即记录医生在实际介入消融进行引导进针时的探头方位、实际进针深度和进针角度。根据前述内容可见，在超声系统中建立了一个规划模式（如下文提到的模拟系统模式）。
在其中一些实施例中，通过随超声探头的位置变化获得变化的实时超声图像数据，并基于变化的实时超声图像，获取前述规划消融路径，从而在实时超声图像上确定规划消融路径。例如，在一个实施例中，启动或进入规划模式（包括导航系统模式）；移动超声探头选择对特定组织进行实时成像，获得第一实时超声图像数据，例如选择对特定组织的某一个切面进行成像，获得超声切面图像；在显示的第一实时超声图像数据上显示导入的特定组织的视图数据，用以基于超声引导消融针进入某一特定组织目标，根据消融针进针路径和消融路径深度可以设定预计模拟消融范围，从而确定规划消融路径的部分。获取超声探头的位置变更信息，获得特定组织的另一位置处的第二实时成像数据，在第二实时超声图像数据上确定规划消融路径的另一部分，从而形成规划消融路径的数据集。基于规划消融路径的数据集，可以获得至少两次移动超声探头所对应获得的规划消融路径，根据规划消融路径获得相应位置处的至少两个预计消融区域，在实时超声图像上叠加显示至少两次移动超声探头所对应获得的预计消融区域，形成联合预计消融区域。例如移动超声探头选择其它位置引导消融针进入某一组织目标，并显示联合预计消融区域
$$B=\bigcup_{i}\left(P_i \cdot S_i\right)$$
然后，选择超声造影成像模式，注射超声造影剂，采集造影图像。在某一组织灌注时，沿固定方向摆动探头，或者在包含某一组织附近区域的不同位置处采集一段电影图像数据，并存储。融合显示联合预计消融区域与造影图像，基于存储的电影中每帧图像对应的探头方位信息（映射矩阵Pi），把每帧图像映射到磁场空间坐标系，并显示造影图像与联合预计消融区域的相交情况，即把模拟消融区域和某一组织（安全边界）与实时图像相交区域彩色叠加在实时图像上显示（如图7右上图B）。通过观察某一组织的不同位置造影切面是否被彩色预计联合消融区域包含，即可判断当前设定的消融针是否能完成消融。当然，在本实施例中的超声造影成像模式还可以替换为其他超声成像模式。
四、基于前述的超声系统和超声成像方法，本实施例中还提供了一种可以对比验证预计消融区域与实际消融区域的超声系统。在其中一个实施例中，提供了一种超声系统，其包括：超声探头、消融设备、发射电路和接收电路、图像处理模块、导航系统、显示屏、存储器和处理器。消融设备固定在超声探头上，例如消融设备与超声探头以预设角度固定。发射电路和接收电路，通过激励超声探头向含有特定组织的检测对象发射超声波束，并接收超声波束的回波，获得超声回波信号。图像处理模块根据超声回波信号获得实时超声图像数据。导航系统包括定位装置，定位装置固定在超声探头上，通过导航系统获得固定在超声探头上的定位装置所在空间的空间方位信息。存储器存储处理器上运行的计算机程序；和，处理器执行程序时实现以下步骤：
关联记录实时超声图像数据和实时超声图像数据对应的空间方位信息;
显示实时超声图像数据;
结合实时超声图像数据和空间方位信息,获得关于消融设备的实际消融路径;
导入预先存储的规划消融路径;
在实时超声图像数据上叠加显示规划消融路径;
在实时超声图像数据上叠加显示实际消融路径信息;和,
关联存储规划消融路径和实际消融路径信息。
在本实施例中,预先存储的规划消融路径可以采用前述内容的方法获得,可参见前文相关内容。
前述结合实时超声图像数据和空间方位信息，获得关于消融设备的实际消融路径信息的过程包括：基于实时超声图像数据分割获得关于消融设备进入到特定组织内的相应位置信息，和/或基于实时超声图像获得用户的输入信息，从而确定实际消融路径。此外，根据前述空间方位信息，或者结合消融设备与超声探头之间的实际安装固定角度，得到关于消融设备的实际进针角度、方向等信息，从而确定实际消融路径。有关消融路径，可以包括消融的引导方向、消融路径深度、预计消融区域、和执行参数等信息中的其中之一，而执行参数至少包括：消融功率、预计工作时长、和消融设备的个数等信息中的其中之一。此外，关于实际消融路径中涉及的执行参数等相关信息，可以在实时超声图像上确定，例如，基于实时超声图像获得用户的输入信息，从而确定实际消融路径。这一过程可以参见前文中关于在实时超声图像上确定规划消融路径的过程，方法相同，在此不再赘述。
在实时超声图像数据上叠加显示规划消融路径时,可参见图7所示的实施例,例如,在实时超声图像数据上通过标记显示消融设备标记和预计消融区域来表征规划消融路径,在消融设备标记的末端显示预计消融区域,当消融设备标记在实时超声图像上的位置发生变化时,则预计消融区域联动变化;或者当超声探头的方位发生变化时、消融设备标记和/或预计消融区域也会联动变化。
同理,在实时超声图像数据上叠加显示实际消融路径信息时,也可以采用标记显示实际消融设备标记和/或实际消融区域来表征实际消融路径。在实际消融设备标记的末端显示实际消融区域,当实际消融设备标记在实时超声图像上的位置发生变化时,则实际消融区域联动变化;或者当超声探头的方位发生变化时、实际消融设备标记和/或实际消融区域也会联动变化。
当然,对于实际消融设备标记,和前述提到的消融设备标记可以通过色彩、和/或线性等特征来区分显示,同样地,实际消融路径和规划消融路径也可以通过色彩、和/或线性等特征来区分显示。
在关联存储规划消融路径和实际消融路径信息时，可以包括记录实际消融路径与规划消融路径之间的差异。例如，同时记录规划消融路径对应的消融设备和实际消融设备在同一帧超声图像中的位置和/或位置差异，同时记录规划消融路径对应的消融设备和实际消融设备的进针角度差异，同时记录规划消融路径和实际消融路径对应的预计消融区域和实际消融区域的差别，等等。
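对于上述进针角度差异与位置差异的记录，一个简化的计算草图如下（Python/NumPy；方向向量与针尖坐标的获取方式依赖于具体实现，此处仅为演示）：

```python
import numpy as np

def needle_angle_diff_deg(dir_planned, dir_actual):
    """计算规划进针方向与实际进针方向之间的夹角（度）。"""
    a = np.asarray(dir_planned, float)
    b = np.asarray(dir_actual, float)
    a /= np.linalg.norm(a)
    b /= np.linalg.norm(b)
    return float(np.degrees(np.arccos(np.clip(a @ b, -1.0, 1.0))))

def tip_position_diff(tip_planned, tip_actual):
    """计算同一坐标系下规划针尖与实际针尖位置之间的欧氏距离。"""
    return float(np.linalg.norm(np.asarray(tip_planned, float) -
                                np.asarray(tip_actual, float)))
```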
当然,在关联记录规划消融路径和实际消融路径信息的同时,还关联存储对应的实时超声图像数据,或者,在关联记录规划消融路径和实际消融路径信息的同时,还关联存储对应的实时超声图像数据和特定组织的视图数据。
在一些实施例中,处理器执行程序时采用以下方式实现导入预先存储的规划消融路径和在实时超声图像数据上叠加显示规划消融路径:
导入特定组织的三维模型数据,获得特定组织的视图数据;
根据关联记录的实时超声图像数据和空间方位信息,将三维模型数据与实时超声图像数据进行配准;
基于配准后的结果,在显示屏上将针对同一特定组织的实时超声图像和视图数据进行融合显示;
导入预先存储的规划消融路径;和,
在实时超声图像和视图数据融合显示的结果上叠加显示规划消融路径。
以上实施例中的相关步骤可以参见前文中图1至图7的相关说明,在此不再赘述。
在一些实施例中，在实时超声图像数据上叠加显示规划消融路径和实际消融路径，以及规划消融路径和实际消融路径之间的差异。例如，在实时超声图像数据上显示规划消融路径对应的消融设备和实际消融设备的位置差异，规划消融路径对应的消融设备和实际消融设备的进针角度差异，规划消融路径和实际消融路径对应的预计消融区域和实际消融区域的区域差异，等等。在其中一个实施例中，处理器计算规划消融路径对应的预计消融区域和实际消融路径对应的实际消融区域之间的重叠关系，量化该重叠关系，并输出显示在显示屏上。此外，在量化重叠关系的过程中，还可以加入计算预计消融区域或实际消融区域与特定组织的视图数据之间的重叠关系，量化该重叠关系，并输出显示在显示屏上。当然，除了显示消融区域之间的重叠关系，在其他实施例中，还可以显示规划消融路径和实际消融路径的重叠关系，例如，进针角度的重叠关系、消融设备的位置之间的重叠关系，等等。
与图5所示流程相似，在其中一个实施例中，在完成消融手术后，注射造影剂，进入造影成像模式。消融后消融灶在超声造影图像中没有造影剂灌注，与周围正常组织对比明显。同样可以把消融术后获得的造影图像采用基于导航的自由臂重建算法重建成三维超声造影数据，对比显示联合预计消融区域和三维超声造影数据可以分析模拟系统与实际消融效果的差异。当然，在其他实施例中还可以采用其他成像模式，例如，弹性成像模式等等。因此，在其中一个实施例中，获得关于手术后特定组织位置处的三维超声图像数据；显示三维超声图像数据；和，在三维超声图像数据上叠加规划消融路径。关于三维超声图像数据的重建方式可以参照前文提到的造影图像的方法，在此不再详细说明。在三维超声图像数据上叠加规划消融路径可以采用立体显示方式，例如，图7中的图C，在三维超声图像数据上叠加显示三维显示的预计消融区域，通常消融区域为椭球型。当然，除了显示三维的预计消融区域，还可以通过标记消融设备的位置来显示规划消融路径中的其他信息，例如，进针角度或引导角度、进针方向或引导方向、进针或消融路径的深度等信息。
本文描述了对某一组织消融临床应用的超声引导系统。该系统不但可以对操作者术前设计的方案进行临床验证，同时可以用于对已经插入人体的消融针进行消融效果估计与评价，以及评估验证实际消融效果。例如，在其中一个实施例中，处理器还可以获得前述超声探头位于当前方位时对应的空间方位信息，获得实时位置信息；根据前述超声探头位于当前方位时对应获得的实时超声图像，抽取预先存储的前述超声探头位于其中一个方位时对应获得的空间方位信息（可以从预先存储的规划数据中抽取），获得参考信息；和，同时显示前述实时位置信息和前述参考信息。如图7的左下图C中标记两个超声探头的位置，一个是消融规划时产生的超声探头位置，一个是超声探头当前所在的方位，这样可以提示操作者如何调整超声探头位置。例如，根据前述实时位置信息，在前述融合图中标记前述超声探头与前述实时超声图像之间的位置关系，和，在前述融合图内的相应位置处标记前述参考信息，从而实现前述同时显示前述实时位置信息和前述参考信息。
图2至图6分别提供的仅仅是一种步骤间的流程执行顺序，还可以基于前文中对图2至图6中的各个步骤进行顺序调整获得各种变形方案，上述各个步骤不限于仅按照图2至图6的顺序执行，步骤间在满足基本逻辑的情况下可以相互置换、更改执行顺序，还可以重复执行其中的一个或多个步骤后，再执行最后一个或多个步骤，这些方案均属于依据本文提供的实施例进行的变形方案。当然，其中不同的步骤可以采用不同的执行主体来完成，例如前文的相关说明。通过以上的实施方式的描述，本领域的技术人员可以清楚地了解到上述实施例方法可借助软件加必需的通用硬件平台的方式来实现，当然也可以通过硬件，但很多情况下前者是更佳的实施方式。基于这样的理解，本发明的技术方案本质上或者说对现有技术做出贡献的部分可以以软件产品的形式体现出来，该计算机软件产品承载在一个非易失性计算机可读存储载体（如ROM、磁碟、光盘、硬盘、服务器云空间）中，包括若干指令用以使得一台终端设备（可以是手机，计算机，服务器，或者网络设备等）执行本发明各个实施例的系统结构和方法。例如，一种计算机可读存储介质，其上存储有计算机程序，该计算机程序被处理器执行时至少可以用于实现前文中提到的各个实施例中的方法步骤。
以上实施例仅表达了几种实施方式,其描述较为具体和详细,但并不能因此而理解为对本发明专利范围的限制。应当指出的是,对于本领域的普通技术人员来说,在不脱离本发明构思的前提下,还可以做出若干变形和改进,这些都属于本发明的保护范围。因此,本发明专利的保护范围应以所附权利要求为准。

Claims (35)

  1. 一种用于规划消融的超声系统，其特征在于，所述系统包括：
    超声探头;
    发射电路和接收电路,通过激励所述超声探头向含有特定组织的检测对象发射超声波束,并接收所述超声波束的回波,获得超声回波信号;
    图像处理模块,所述图像处理模块根据所述超声回波信号获得实时超声图像数据;
    导航系统,所述导航系统包括定位装置,所述定位装置固定在所述超声探头上,通过导航系统获得固定在超声探头上的定位装置所在空间的空间方位信息;
    显示屏;
    存储器,所述存储器存储处理器上运行的计算机程序;和,
    处理器,所述处理器执行所述程序时实现以下步骤:
    关联记录所述实时超声图像数据和所述实时超声图像数据对应的所述空间方位信息;
    导入所述特定组织的三维模型数据,获得所述特定组织的视图数据;
    根据所述关联记录的结果,将所述三维模型数据与所述实时超声图像数据进行配准;
    基于配准后的结果,在所述显示屏上将针对同一特定组织的实时超声图像和视图数据进行融合显示;和,
    在所述实时超声图像上确定规划消融路径。
  2. 根据权利要求1所述的用于规划消融的超声系统，其特征在于，所述处理器执行所述程序时还实现以下过程：在所述显示屏上显示消融设备标记和/或规划消融路径。
  3. 根据权利要求2所述的用于规划消融的超声系统，其特征在于，所述处理器执行所述程序时采用以下方式实现所述在所述显示屏上显示消融设备标记和/或规划消融路径：
    在所述显示屏上显示表征超声探头的探头图标,所述探头图标的显示位置随所述空间方位信息的变化而变化。
  4. 根据权利要求1所述的用于规划消融的超声系统，其特征在于，所述规划消融路径至少包括消融的引导方向、消融路径深度、和预计消融区域中的其中之一。
  5. 根据权利要求1所述的用于规划消融的超声系统，其特征在于，所述处理器执行所述程序时采用以下方式实现所述在所述实时超声图像上确定规划消融路径：
    根据消融设备与超声探头的相对位置,在所述实时超声图像的第一位置处显示消融设备标记;和,
    获取关于消融设备标记的调整指令;
    基于所述调整指令,变更所述消融设备标记在所述实时超声图像上的位置至第二位置处;
    记录所述消融设备标记的位置变化与所述实时超声图像、所述视图数据和/或所述空间方位关系的关联信息,获得所述规划消融路径。
  6. 根据权利要求1或5所述的用于规划消融的超声系统,其特征在于,所述处理器执行所述程序时在所述实时超声图像上确定规划消融路径之前还包括:
    获取关于消融设备的执行参数,所述执行参数至少包括:消融功率、预计工作时长、和消融设备的个数中的其中之一;
    根据所述执行参数获得预计消融区域,在所述实时超声图像上显示所述预计消融区域,用以确定所述规划消融路径,其中所述预计消融区域的显示随消融设备标记的位置变化而变化。
  7. 根据权利要求1所述的用于规划消融的超声系统,其特征在于,所述处理器执行所述程序时还包括:
    显示探头图标、消融设备标记以及当前时刻获得的实时超声图像所在扫描截面之间的空间位置关系。
  8. 根据权利要求1所述的用于规划消融的超声系统,其特征在于,所述在所述显示屏上将针对同一特定组织的实时超声图像和视图数据进行融合显示中至少包括以下情况之一:
    显示所述特定组织对应的三维视图数据和二维实时超声图像;
    显示所述特定组织对应的三维视图数据和三维实时超声图像;和,
    显示所述特定组织对应的二维视图数据和实时超声图像。
  9. 根据权利要求1所述的用于规划消融的超声系统,其特征在于,所述处理器执行所述程序实现所述在所述显示屏上显示消融设备标记和/或规划消融路径时至少包括以下方式之一:
    在所述实时超声图像上跟随消融设备标记显示预计消融区域;
    计算所述视图数据与预计消融区域的重叠关系,并输出所述重叠关系的计算结果;
    绘制探头图标和在所述实时超声图像的相应位置处标记预计消融区域;和,
    根据规划消融路径,同时标记第一预计消融区域和第二预计消融区域之间的重叠情况。
  10. 根据权利要求1所述的用于规划消融的超声系统,其特征在于,所述显示屏上包括多个窗口,多个窗口内的数据随超声探头位置的变化而联动变化。
  11. 根据权利要求1所述的用于规划消融的超声系统,其特征在于,所述消融设备包括至少一个消融针,所述处理器执行所述程序时至少采用以下方式之一实现所述在所述实时超声图像上确定规划消融路径:
    在所述实时超声图像内导入预先存储的规划消融路径;
    随超声探头的位置变化,基于变化的实时超声图像获取针对导入的规划消融路径进行变更操作的输入;和,
    根据所述输入,获得关于规划消融路径的变更数据。
  12. 根据权利要求1或11所述的用于规划消融的超声系统,其特征在于,所述处理器执行所述程序时还包括以下步骤:
    存储所述规划消融路径或关于规划消融路径的变更数据,建立关于特定组织的规划消融数据库。
  13. 根据权利要求9所述的用于规划消融的超声系统，其特征在于，所述第一预计消融区域为基于当前获得的实时超声图像确定的规划消融路径所对应的预计消融区域，第二预计消融区域为导入的预先存储的规划消融路径所对应的预计消融区域；或者，
    所述第一预计消融区域为基于当前获得的实时超声图像确定的第一消融设备对应的规划消融路径所对应的预计消融区域，第二预计消融区域为基于当前获得的实时超声图像确定的第二消融设备对应的规划消融路径所对应的预计消融区域。
  14. 根据权利要求1或2所述的用于规划消融的超声系统,其特征在于,所述处理器执行所述程序时通过以下步骤获得所述特定组织的三维模型数据:
    获取包含所述特定组织的电影图像数据,
    获取采集所述电影图像数据中每一帧图像对应关联的空间方位信息,和,
    根据采集所述电影图像数据中每一帧图像对应关联的空间方位信息，将所述电影图像数据中每一帧图像映射到磁场空间坐标系中，重建三维超声图像，用以获得所述特定组织的三维模型数据。
  15. 根据权利要求1所述的用于规划消融的超声系统，其特征在于，所述系统还包括：
    用于固定在含有所述检测对象的被测体表面的对象定位装置,所述磁场发射与信号接收模块接收位于所述磁场中的对象定位装置反馈回的检测信号,根据所述检测信号获得所述对象定位装置相对于所述磁场的空间方位信息、和/或所述被测体表面的运动信息。
  16. 一种用于规划消融的超声成像方法,其特征在于,所述方法包括:
    通过超声探头获得实时超声图像;
    通过导航系统获得固定在超声探头上的定位装置所在空间的空间方位信息;
    关联记录所述实时超声图像与所述空间方位信息之间的对应关系,
    导入所述特定组织的三维模型数据,获得所述特定组织的视图数据;
    根据所述关联记录的结果,将所述三维模型数据与所述实时超声图像数据进行配准;
    基于配准后的结果,在所述显示屏上将针对同一特定组织的实时超声图像和视图数据进行融合显示;和,
    在所述实时超声图像上确定规划消融路径。
  17. 根据权利要求16所述的超声成像方法,其特征在于,所述方法还包括:在所述显示屏上显示消融设备标记和/或规划消融路径。
  18. 根据权利要求17所述的超声成像方法，其特征在于，所述在所述显示屏上显示消融设备标记和/或规划消融路径包括：
    在所述显示屏上显示表征超声探头的探头图标,所述探头图标的显示位置随所述空间方位信息的变化而变化。
  19. 根据权利要求16所述的超声成像方法,其特征在于,所述规划消融路径至少包括消融的引导方向、消融路径深度、和预计消融区域中的其中之一。
  20. 根据权利要求16所述的超声成像方法,其特征在于,所述在所述实时超声图像上确定规划消融路径包括:
    根据消融设备与超声探头的相对位置，在所述实时超声图像的第一位置处显示消融设备标记；和，
    获取关于消融设备标记的调整指令;
    基于所述调整指令,变更所述消融设备标记在所述实时超声图像上的位置至第二位置处;
    记录所述消融设备标记的位置变化与所述实时超声图像、所述视图数据和/或所述空间方位关系的关联信息,获得所述规划消融路径。
  21. 根据权利要求16或20所述的超声成像方法,其特征在于,所述在所述实时超声图像上确定规划消融路径之前还包括:
    获取关于消融设备的执行参数,所述执行参数至少包括:消融功率、预计工作时长、和消融设备的个数中的其中之一;
    根据所述执行参数获得预计消融区域,在所述实时超声图像上显示所述预计消融区域,用以确定所述规划消融路径,其中所述预计消融区域的显示随消融设备标记的位置变化而变化。
  22. 根据权利要求16所述的超声成像方法,其特征在于,所述方法还包括:显示探头图标、消融设备标记以及当前时刻获得的实时超声图像所在扫描截面之间的空间位置关系。
  23. 根据权利要求16所述的超声成像方法,其特征在于,所述在所述显示屏上将针对同一特定组织的实时超声图像和视图数据进行融合显示中至少包括以下情况之一:
    显示所述特定组织对应的三维视图数据和二维实时超声图像;
    显示所述特定组织对应的三维视图数据和三维实时超声图像;和,
    显示所述特定组织对应的二维视图数据和实时超声图像。
  24. 根据权利要求16所述的超声成像方法,其特征在于,所述在所述显示屏上显示消融设备标记和/或规划消融路径时至少包括以下方式之一:
    在所述实时超声图像上跟随消融设备标记显示预计消融区域;
    计算所述视图数据与预计消融区域的重叠关系,并输出所述重叠关系的计算结果;
    绘制探头图标和在所述实时超声图像的相应位置处标记预计消融区域;和,
    根据规划消融路径,同时标记第一预计消融区域和第二预计消融区域之间的重叠情况。
  25. 根据权利要求16所述的超声成像方法，其特征在于，所述显示屏上包括多个窗口，多个窗口内的数据随超声探头位置的变化而联动变化。
  26. 根据权利要求16所述的超声成像方法,其特征在于,所述消融设备包括至少一个消融针,至少采用以下方式之一实现所述在所述实时超声图像上确定规划消融路径:
    在所述实时超声图像内导入预先存储的规划消融路径;
    随超声探头的位置变化,基于变化的实时超声图像获取针对导入的规划消融路径进行变更操作的输入;和,
    根据所述输入,获得关于规划消融路径的变更数据。
  27. 根据权利要求24所述的超声成像方法，其特征在于，所述第一预计消融区域为基于当前获得的实时超声图像确定的规划消融路径所对应的预计消融区域，第二预计消融区域为导入的预先存储的规划消融路径所对应的预计消融区域；或者，
    所述第一预计消融区域为基于当前获得的实时超声图像确定的第一消融设备对应的规划消融路径所对应的预计消融区域，第二预计消融区域为基于当前获得的实时超声图像确定的第二消融设备对应的规划消融路径所对应的预计消融区域。
  28. 根据权利要求16或17所述的超声成像方法，其特征在于，所述方法通过以下步骤获得所述特定组织的三维模型数据：
    获取包含所述特定组织的电影图像数据,
    获取采集所述电影图像数据中每一帧图像对应关联的空间方位信息,和,
    根据采集所述电影图像数据中每一帧图像对应关联的空间方位信息,将所述电影图像数据中每一帧图像映射到磁场空间坐标系中,重建三维超声图像,用以获得所述特定组织的三维模型数据。
  29. 根据权利要求16所述的超声成像方法,其特征在于,所述方法中,通过固定在含有所述检测对象的被测体表面的对象定位装置获得检测信号,根据所述检测信号矫正所述空间方位信息。
  30. 根据权利要求16所述的超声成像方法,其特征在于,所述在所述实时超声图像上确定规划消融路径包括:
    随超声探头的位置变化获得变化的实时超声图像数据;和,
    基于变化的实时超声图像,获取规划消融路径。
  31. 根据权利要求16所述的超声成像方法,其特征在于,所述在所述实时超声图像上确定规划消融路径还包括:
    获得至少两次移动超声探头所对应获得的规划消融路径;
    根据规划消融路径获得至少两个预计消融区域;和,
    在实时超声图像上叠加显示所述至少两个预计消融区域。
  32. 一种超声系统,其特征在于,所述系统包括:
    超声探头;
    消融设备,所述消融设备固定在超声探头上;
    发射电路和接收电路,通过激励所述超声探头向含有特定组织的检测对象发射超声波束,并接收所述超声波束的回波,获得超声回波信号;
    图像处理模块,所述图像处理模块根据所述超声回波信号获得实时超声图像数据;
    导航系统,所述导航系统包括定位装置,所述定位装置固定在所述超声探头上,通过导航系统获得固定在超声探头上的定位装置所在空间的空间方位信息;
    显示屏;
    存储器,所述存储器存储处理器上运行的计算机程序;和,
    处理器,所述处理器执行所述程序时实现以下步骤:
    关联记录所述实时超声图像数据和所述实时超声图像数据对应的所述空间方位信息;
    显示所述实时超声图像数据;
    结合所述实时超声图像数据和所述空间方位信息,获得消融设备的实际消融路径;
    导入预先存储的规划消融路径;
    在所述实时超声图像数据上叠加显示规划消融路径;
    在所述实时超声图像数据上叠加显示实际消融路径;和,
    关联存储所述规划消融路径和实际消融路径。
  33. 根据权利要求32所述的超声系统,其特征在于,所述处理器执行所述程序时采用以下方式实现所述导入预先存储的规划消融路径和在所述实时超声图像数据上叠加显示规划消融路径:
    导入所述特定组织的三维模型数据,获得所述特定组织的视图数据;
    根据所述关联记录的所述实时超声图像数据和所述空间方位信息,将所述三维模型数据与所述实时超声图像数据进行配准;
    基于配准后的结果,在所述显示屏上将针对同一特定组织的实时超声图像和视图数据进行融合显示;
    导入预先存储的规划消融路径;和,
    在所述实时超声图像和视图数据融合显示的结果上叠加显示所述规划消融路径。
  34. 根据权利要求32所述的超声系统,其特征在于,所述处理器执行所述程序时采用以下方式在所述实时超声图像数据上叠加显示规划消融路径和实际消融路径:
    在所述实时超声图像数据上叠加显示规划消融路径和实际消融路径,并显示规划消融路径和实际消融路径的重叠关系。
  35. 根据权利要求32所述的超声系统,其特征在于,所述处理器执行所述程序时还包括以下过程:
    获得关于手术后特定组织位置处的三维超声图像数据;
    显示三维超声图像数据;和,
    在所述三维超声图像数据上叠加三维显示的预计消融区域。