WO2022027251A1 - Three-dimensional display method and ultrasound imaging system - Google Patents

Three-dimensional display method and ultrasound imaging system


Publication number
WO2022027251A1
Authority
WO
WIPO (PCT)
Prior art keywords
lesion
ablation
dimensional
dimensional model
area
Prior art date
Application number
PCT/CN2020/106893
Other languages
English (en)
Chinese (zh)
Inventor
于开欣
丛龙飞
王超
周文兵
Original Assignee
深圳迈瑞生物医疗电子股份有限公司
Priority date
Filing date
Publication date
Application filed by 深圳迈瑞生物医疗电子股份有限公司 filed Critical 深圳迈瑞生物医疗电子股份有限公司
Priority to CN202080102793.1A priority Critical patent/CN115811961A/zh
Priority to PCT/CN2020/106893 priority patent/WO2022027251A1/fr
Publication of WO2022027251A1 publication Critical patent/WO2022027251A1/fr

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B18/00 Surgical instruments, devices or methods for transferring non-mechanical forms of energy to or from the body

Definitions

  • the present application relates to the technical field of ultrasound imaging, and more particularly, to a three-dimensional display method and an ultrasound imaging system.
  • Real-time ultrasound-guided percutaneous tumor ablation has the advantages of high curative effect, minimal invasiveness and quick postoperative recovery, and its role in tumor treatment is becoming increasingly important.
  • The current interventional treatment of tumor ablation is mainly based on two-dimensional ultrasound image guidance: the doctor finds the approximate location of the tumor area through real-time ultrasound images or contrast-enhanced ultrasound images, roughly estimates the two-dimensional plane where the largest diameter of the tumor is located, and then develops an ablation plan and guides the ablation based on the two-dimensional image.
  • Three-dimensional visualization of stereoscopic structure images can display areas that are difficult to show in two-dimensional images and provide objective anatomical information.
  • The three-dimensional visualization display window can objectively and accurately display the tumor, the positional relationship of the ablation needle, and the current ablation state of the lesion, allowing the doctor to intuitively plan the tumor operation, optimize the operation plan and improve operating skills, thereby improving the safety of the operation.
  • However, the doctor can only see the ablation state of the 3D tumor displayed at the current angle when performing tumor ablation, and cannot see the ablation state of the posterior side of the tumor or of other angles. After an ablation is complete, it is necessary to manually rotate the view to find the next angle suitable for ablation, which causes inconvenience in clinical applications.
  • An aspect of an embodiment of the present invention provides a three-dimensional display method, the method includes:
  • the current view of the three-dimensional display window, which includes the three-dimensional model of the lesion and the three-dimensional model of the ablation lesion under the current viewing angle, is automatically rotated from the current viewing angle to a target viewing angle at which the ablation area meets the predetermined requirements for ablating the lesion, and the three-dimensional model of the lesion and the three-dimensional model of the ablation lesion under the target viewing angle are displayed.
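  • The selection of such a target viewing angle can be sketched as follows. This is a hedged illustration only; the function and variable names are invented, not taken from the application. One simple strategy is to sample candidate viewing directions and score each by how much of the not-yet-ablated lesion surface faces the viewer:

```python
import numpy as np

def pick_target_view(normals, ablated_mask, n_candidates=200):
    """Illustrative sketch: choose a viewing direction that exposes the
    largest amount of unablated lesion surface.

    normals:      (N, 3) outward unit normals of lesion-surface points
    ablated_mask: (N,) bool, True where the surface is already ablated
    Returns a unit vector pointing from the scene toward the viewer.
    """
    rng = np.random.default_rng(0)
    # Sample candidate view directions roughly uniformly on the unit sphere.
    cand = rng.normal(size=(n_candidates, 3))
    cand /= np.linalg.norm(cand, axis=1, keepdims=True)

    remaining = normals[~ablated_mask]  # normals of the unablated surface
    # A surface point is treated as visible from direction v when its
    # outward normal has a positive component toward the viewer.
    scores = (remaining @ cand.T > 0.0).sum(axis=0)
    return cand[np.argmax(scores)]
```

A real system would also account for occlusion and for the predetermined ablation requirements mentioned above; this sketch only conveys the sample-and-score idea.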
  • Another aspect of the embodiments of the present application provides a three-dimensional display method, the method includes:
  • the current view includes the three-dimensional model of the lesion and the three-dimensional model of the ablation lesion under the current viewing angle;
  • the remaining three-dimensional display windows of the two or more windows display views from other perspectives, and the views from other perspectives include the three-dimensional model of the lesion and the three-dimensional model of the ablation lesion from other perspectives.
  • Yet another aspect of the embodiments of the present application provides an ultrasonic imaging system, and the ultrasonic imaging system includes:
  • a transmitting circuit used to excite the ultrasonic probe to transmit ultrasonic waves to the lesion
  • a receiving circuit configured to control the ultrasonic probe to receive the ultrasonic echo returned from the lesion to obtain an ultrasonic echo signal
  • processor for:
  • the current view of the three-dimensional display window is rotated from the current viewing angle to a target viewing angle at which the ablation area meets the predetermined requirements for ablating the lesion, and the three-dimensional model of the lesion and the three-dimensional model of the ablation lesion under the target viewing angle are displayed;
  • the display is used for displaying the three-dimensional model of the lesion and the three-dimensional model of the ablation lesion.
  • The three-dimensional display solution of the present application can automatically rotate the viewing angle for observing the three-dimensional model of the lesion and the three-dimensional model of the ablation lesion, which greatly facilitates the ablation operation.
  • FIG. 1 shows a schematic block diagram of an ultrasound imaging system according to an embodiment of the present invention
  • FIG. 2 shows a schematic flowchart of a three-dimensional display method according to an embodiment of the present invention
  • FIG. 3 shows a schematic diagram of spatial transformation in a three-dimensional display method according to an embodiment of the present invention
  • FIG. 4 shows a schematic diagram of a three-dimensional display window in a three-dimensional display method according to an embodiment of the present invention
  • FIG. 5 shows a schematic diagram of a main window and an auxiliary window in a three-dimensional display method according to an embodiment of the present invention
  • FIG. 6 shows a schematic diagram of the interface shown in FIG. 5 after rotating the viewing angle according to an embodiment of the present invention
  • FIG. 7 shows a schematic flowchart of a three-dimensional display method according to another embodiment of the present invention.
  • FIG. 8 shows a schematic diagram of a display interface in a three-dimensional display method according to an embodiment of the present invention.
  • FIG. 1 shows a schematic structural block diagram of an ultrasound imaging system 100 according to an embodiment of the present application.
  • The ultrasound imaging system 100 includes an ultrasound probe 110, a transmitting circuit 112, a receiving circuit 114, a processor 118, and a display 120. Further, the ultrasound imaging system may also include a beamforming circuit 116 and a transmit/receive selection switch 122, and the transmitting circuit 112 and the receiving circuit 114 may be connected to the ultrasound probe 110 through the transmit/receive selection switch 122.
  • the ultrasonic probe 110 includes an array of multiple transducer array elements, which are used for transmitting ultrasonic waves according to electrical signals, or converting received ultrasonic echoes into electrical signals.
  • Multiple transducers can be arranged in a row to form a linear array, or arranged in a two-dimensional matrix to form an area array, and multiple transducers can also form a convex array, a phased array, etc.
  • the arrangement of the array elements is not limited.
  • the transducer can transmit ultrasonic waves according to the excitation electrical signal, or convert the received ultrasonic waves into electrical signals, so each transducer can be used to transmit ultrasonic waves to the tissue in the target area, and can also be used to receive ultrasonic echoes returned by the tissue.
  • the transmitter circuit 112 and the receiver circuit 114 can control which transducers are used for transmitting ultrasonic waves and which transducers are used for receiving ultrasonic waves, or control the transducers in time slots for transmitting ultrasonic waves or receiving ultrasonic echoes.
  • All transducers participating in ultrasonic emission can be excited by electrical signals at the same time to emit ultrasonic waves simultaneously; alternatively, the transducers participating in ultrasonic emission can be excited by several electrical signals with certain time intervals, so as to continuously emit ultrasonic waves at certain time intervals.
  • the ultrasonic probe 110 is provided with a positioning sensor. When the ultrasonic probe 110 moves, the specific position of the current ultrasonic fan can be known according to the coordinate change of the positioning sensor.
  • a puncture frame is installed on the ultrasound probe 110 for fixing the ablation needle during the ablation procedure, and the angle and position of the ablation needle can be known based on the angle of the puncture frame and the depth of the ablation needle.
  • In some embodiments, a positioning sensor (e.g., a Vtrax sensor) is provided on the ablation needle; when the ablation needle moves, the angle and position of the ablation needle can be known according to the coordinate change of the positioning sensor.
  • the transmit circuit 112 transmits the delayed focused transmit pulses to the ultrasound probe 110 through the transmit/receive selection switch 122 .
  • The ultrasonic probe 110 is stimulated by the transmission pulses to transmit an ultrasonic beam to the target area of the object under test and, after a certain delay, receives the ultrasonic echo carrying tissue information reflected from the target area and reconverts the ultrasonic echo into an electrical signal.
  • The receiving circuit 114 receives the electrical signals generated by the ultrasonic probe 110 to obtain ultrasonic echo signals and sends these ultrasonic echo signals to the beamforming circuit 116, which performs focusing delay, weighting, channel summation and other processing on the ultrasonic echo data before sending it to the processor 118.
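  • The focusing delay mentioned above can be illustrated with a small sketch. This is a hedged simplification (parameter names and the geometry are assumptions, not taken from the application): for a linear array receiving echoes from a focal point, each channel is delayed so that echoes from the focus line up before summation.

```python
import numpy as np

def focus_delays(element_x, focus_x, focus_z, c=1540.0):
    """Per-channel receive focusing delays for a linear array (sketch).

    element_x: (N,) lateral element positions in metres
    focus_x, focus_z: focal-point coordinates in metres (z is depth)
    c: assumed speed of sound in soft tissue, m/s
    Returns delays in seconds; channels with the shortest path to the
    focus get the largest delay so that all channels align in time.
    """
    # Path length from each element to the focal point.
    dist = np.hypot(np.asarray(element_x, float) - focus_x, focus_z)
    return (dist.max() - dist) / c
```

For a focus centred under the array, the edge elements (longest path) get zero delay and the centre elements get the largest, so all channels sum coherently.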
  • the processor 118 performs signal detection, signal enhancement, data conversion, logarithmic compression and other processing on the ultrasonic echo data to form an ultrasonic image.
  • the ultrasound images obtained by the processor 118 can be displayed on the display 120 or stored in a memory.
  • the processor 118 may be implemented as software, hardware, firmware, or any combination thereof, and may use single or multiple application specific integrated circuits (ASICs), single or multiple general-purpose integrated circuits, single or multiple microprocessors, single or multiple programmable logic devices, or any combination of the foregoing circuits and/or devices, or other suitable circuits or devices. Also, the processor 118 may control other components in the ultrasound imaging system 100 to perform corresponding steps of the methods in the various embodiments in this specification.
  • The display 120 is connected to the processor 118. The display 120 may be a touch display screen, a liquid crystal display screen, etc.; or it may be an independent display device, such as a liquid crystal display or a television set, independent of the ultrasound imaging system 100; or it may be the display of an electronic device such as a smartphone or a tablet.
  • the number of displays 120 may be one or more.
  • the display 120 may include a main screen and a touch screen, the main screen is mainly used for displaying ultrasound images, and the touch screen is mainly used for human-computer interaction.
  • Display 120 may display ultrasound images obtained by processor 118 .
  • The display 120 can also provide a graphical interface for human-computer interaction while displaying the ultrasound image; one or more controlled objects are set on the graphical interface, and a human-computer interaction device is provided for the user to input operating instructions to control these controlled objects, so as to perform the corresponding control operations.
  • an icon is displayed on a graphical interface, and the icon can be operated by using a human-computer interaction device to perform a specific function, such as rotating the angle of view of the current view.
  • the ultrasound imaging system 100 may further include other human-computer interaction devices other than the display 120, which are connected to the processor 118.
  • the processor 118 may be connected to the human-computer interaction device through an external input/output port.
  • The external input/output port can be a wireless communication module, a wired communication module, or a combination of the two.
  • External input/output ports may also be implemented based on USB, bus protocols such as CAN, and/or wired network protocols, and the like.
  • The human-computer interaction device may include an input device for detecting the user's input information; for example, the input information may be a control instruction for the ultrasonic transmission/reception sequence, a manipulation input instruction such as drawing a point, line or frame on the ultrasonic image, or another instruction type.
  • The input device may include one or a combination of a keyboard, a mouse, a scroll wheel, a trackball, a mobile input device (e.g., a mobile device with a touch display screen, a cell phone, etc.), a multi-function knob, and the like.
  • the human-computer interaction apparatus may also include an output device such as a printer.
  • the ultrasound imaging system 100 may also include memory for storing instructions executed by the processor, storing received ultrasound echoes, storing ultrasound images, and the like.
  • the memory may be a flash memory card, solid state memory, hard disk, or the like. It may be volatile memory and/or non-volatile memory, removable memory and/or non-removable memory, and the like.
  • The components included in the ultrasound imaging system 100 shown in FIG. 1 are only illustrative; the system may include more or fewer components. This application is not limited thereto.
  • FIG. 2 is a schematic flowchart of a three-dimensional display method 200 according to an embodiment of the present application.
  • the three-dimensional display method 200 in this embodiment of the present application can be used for surgical planning of percutaneous ablation surgery.
  • the three-dimensional display method 200 includes the following steps:
  • Step S210 acquiring an ultrasound image collected for the lesion.
  • the lesion is the lesion area of the target part of the measured object.
  • The measured object can be a patient who needs to undergo ablation. The target part of the measured object can be the liver, with the lesion area being a liver tumor area; the target part can also be the prostate, thyroid, breast or another site, with the lesion area being the lesion area of that site.
  • the three-dimensional display solution of the present application is mainly described by taking the target site as the liver site as an example, but it should be understood that this is only an example, and the three-dimensional display solution of the present application can also be used for any other site.
  • The transmit circuit 112 may excite the ultrasonic probe 110 through the transmit/receive selection switch 122 to transmit ultrasonic waves to the target part of the measured object at regular intervals, and the ultrasonic probe 110 receives, via the receiving circuit 114, the ultrasonic echoes returned from the target part of the measured object and converts them into ultrasonic echo signals.
  • the beamforming module 116 may perform signal processing, and then send the beamformed ultrasound echo data to the processor 118 for related processing, thereby obtaining an ultrasound image.
  • The processor 118 can perform different processing on the ultrasonic echo signals to obtain ultrasound data of different modes, and then, through logarithmic compression, dynamic range adjustment, digital scan conversion, etc., form ultrasound images of different modes, such as two-dimensional ultrasound images including B images, C images and D images.
  • the operator can use the ultrasound probe to scan the target part of the object to be measured.
  • When the lesion area is found, the ultrasound image can be frozen to obtain the ultrasound image collected for the lesion.
  • Step S220 registering the ultrasound image with a pre-acquired three-dimensional image containing the lesion.
  • The operator may import the three-dimensional image containing the lesion into the ultrasound imaging system in advance before starting the ultrasound measurement; the import method includes, but is not limited to, importing through a storage medium such as a USB flash drive or a CD-ROM, or importing through network transmission.
  • Three-dimensional images containing lesions can be obtained by medical imaging equipment such as computed tomography (CT), magnetic resonance imaging (MRI), positron emission tomography (PET), digital X-ray imaging equipment, ultrasound equipment, digital subtraction angiography (DSA) equipment, and optical imaging equipment.
  • the registration of the ultrasound image and the three-dimensional image is to seek the spatial transformation relationship between the ultrasound image and the three-dimensional image, so that the geometric relationship between the corresponding points of the ultrasound image and the three-dimensional image is in one-to-one correspondence.
  • Registration may include rigid body registration or non-rigid body registration.
  • The positioning sensor fixed on the ultrasound probe can continuously provide position information as the ultrasound probe moves, and the 6-DOF spatial orientation of the ultrasound probe can be obtained through the magnetic positioning controller; using the image information together with the magnetic positioning information, the ultrasound image and the three-dimensional image can be registered and fused.
  • the processor may be connected to the positioning sensor provided on the ultrasonic probe through a wired or wireless manner to acquire probe position information.
  • the positioning sensor can use any type of structure or principle, such as an optical positioning sensor or a magnetic field positioning sensor, to position the ultrasonic probe.
  • The spatial transformation relationship between the ultrasound image and the three-dimensional image can be expressed as X_sec = P · R_probe · A · X_us, where X_us is the coordinate of a point in the ultrasound image space; X_sec is the coordinate of the corresponding point in the three-dimensional image space; A is the transformation matrix from the ultrasound image space (coordinates expressed as X_us, Y_us, Z_us) to the positioning sensor space (coordinates expressed as X_sensor, Y_sensor, Z_sensor); R_probe is the transformation matrix from the positioning sensor space to the world coordinate space (coordinates expressed as X_MG, Y_MG, Z_MG); and P is the transformation matrix from the world coordinate space to the three-dimensional image space.
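  • As a hedged sketch of this transformation chain (the 4x4 homogeneous matrices below are illustrative placeholders, not values from the application), a point can be mapped from ultrasound image space to three-dimensional image space by multiplying through A, R_probe and P:

```python
import numpy as np

def to_h(p):
    """Append 1 to make a 3-D point homogeneous."""
    return np.append(np.asarray(p, float), 1.0)

def us_to_3d(x_us, A, R_probe, P):
    """Map a point from ultrasound image space to 3-D image space via
    X_sec = P @ R_probe @ A @ X_us, using 4x4 homogeneous matrices.

    A:       ultrasound image space -> positioning sensor space
    R_probe: positioning sensor space -> world coordinate space
    P:       world coordinate space -> 3-D image space
    """
    x = P @ R_probe @ A @ to_h(x_us)
    return x[:3] / x[3]
```

For example, with identity A and P and an R_probe that is a pure translation, a point is simply shifted by the probe offset.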
  • the positioning sensor is fixed on the ultrasonic probe.
  • the registration methods used in the embodiments of the present application may include automatic registration, interactive registration, manual registration, or any combination of the above three methods.
  • the registration may include registration based on anatomical features or registration based on geometric features, registration based on pixel grayscale correlation, registration based on external localization landmarks, and the like. Registration may also include any other suitable registration method.
  • Registering the ultrasound image and the three-dimensional image specifically includes: aligning the ultrasound image with the corresponding slice in the three-dimensional image, and determining the coordinate transformation matrix from the ultrasound image space to the three-dimensional image space according to the coordinates of points in the ultrasound image and the coordinates of the corresponding points in the three-dimensional image.
  • the above-mentioned alignment operation can be performed manually by the user, that is, the user's manual alignment operation is received, so as to align the ultrasound image with the corresponding slice in the three-dimensional image.
  • Alternatively, the same tissue can be identified in the ultrasound image and the three-dimensional image for automatic alignment; when the target site is a liver site, the identified same tissue is, for example, a blood vessel, the liver capsule, or the like.
  • a spatial transformation matrix can be calculated from the coordinates of the coincident points.
  • The feature points in the ultrasound image and the three-dimensional image may be determined first; feature points generally have some of the following properties: translation invariance, rotation invariance, scale invariance, insensitivity to illumination and insensitivity to modality.
  • the properties of the feature points are determined by the feature point extraction method.
  • the features of the feature points are extracted, and the features can be generated by neighborhood gradient histogram, neighborhood autocorrelation, grayscale, etc.
  • Finally, the feature points of the ultrasound image are matched with the feature points of the three-dimensional image, and a spatial transformation matrix is calculated based on the matched feature points.
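  • Given matched feature points, a spatial transformation can be estimated in closed form. The sketch below is an assumption for illustration: a rigid transform estimated by the standard Kabsch/Procrustes method, which the application does not name explicitly. It computes a rotation R and translation t such that dst ≈ R · src + t:

```python
import numpy as np

def rigid_from_matches(src, dst):
    """Least-squares rigid transform (R, t) with dst ≈ R @ src + t,
    estimated from matched feature points (Kabsch/Procrustes method).

    src, dst: (N, 3) arrays of corresponding points, N >= 3.
    """
    src = np.asarray(src, float); dst = np.asarray(dst, float)
    cs, cd = src.mean(axis=0), dst.mean(axis=0)
    H = (src - cs).T @ (dst - cd)            # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))   # guard against reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = cd - R @ cs
    return R, t
```

Non-rigid registration would require a richer model (e.g. a deformation field), but the rigid case conveys how matched points yield a transformation matrix.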
  • Registration can also be based on in vitro markers. The in vitro markers are, for example, one or more metal markers set on the body surface of the patient, which form obvious light spots in the three-dimensional image, from which the marker positions are obtained. A sensor is provided on the ultrasonic probe, through which the position of each metal marker can be obtained, and the registration of the ultrasound image and the three-dimensional image can be realized by aligning the markers in the three-dimensional image with the markers in the ultrasound image.
  • When the three-dimensional image of the lesion area is a three-dimensional ultrasound image, automatic registration can be performed based on the position information of the ultrasound image and the three-dimensional ultrasound image.
  • the 3D image may be acquired by a volume probe, or reconstructed by a convex array or linear array probe with a magnetic navigation device based on the Freehand 3D ultrasound reconstruction technology, or scanned by an area array probe.
  • The 3D ultrasound image reconstructed based on the magnetic navigation position information can be a reconstructed 3D ultrasound image obtained by freehand scanning of an ultrasound cine with positioning information on-site; since the position information is obtained during scanning, the P matrix above can be obtained automatically.
  • Because breathing movement shifts the positions of the soft tissue and the lesion, a respiration correction function is introduced during the registration process to perform respiration correction.
  • The added T(t) is a spatial mapping for respiration correction, and T(t) changes with time; the spatial transformation relationship between the ultrasound image and the three-dimensional image is then expressed by the formula X_sec = P · T(t) · R_probe · A · X_us.
  • the position deviation caused by the breathing movement can also be corrected by making the patient breathe smoothly.
  • The ultrasound image and the three-dimensional image may also be fused according to the correspondence between the ultrasound image and the three-dimensional image obtained in step S220, and the fusion of the ultrasound image and the three-dimensional image is displayed in the fusion display window of the display interface.
  • The ultrasound image space coordinate system can be mapped to the three-dimensional image space coordinate system based on the registration relationship matrix. Since the positioning sensor is installed on the ultrasound probe, when the ultrasound probe moves, the specific positional relationship between the current ultrasound fan and the lesion located in the three-dimensional image space coordinate system can be known according to the coordinate change of the positioning sensor.
  • Step S230, segmenting the lesion in the three-dimensional image containing the lesion, and displaying the three-dimensional model of the lesion in the three-dimensional display window based on the registration result and the segmentation result.
  • a three-dimensional display window is set on the display interface of the display for displaying the three-dimensional model of the lesion and the three-dimensional model of the ablation lesion to be described below.
  • the 3D contour of the lesion is first segmented in the 3D image.
  • any suitable method may be used to segment the lesions in the three-dimensional image, including but not limited to automatic segmentation, manual segmentation, or interactive segmentation.
  • the automatic segmentation method may adopt one or more of random walk model, region growing, graph cut algorithm, pattern recognition, Markov field, adaptive threshold and other methods.
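  • As a minimal sketch of one of the automatic methods listed above, region growing (shown here on a 2-D slice for brevity; the function name, the 4-connectivity and the intensity-threshold criterion are illustrative assumptions, not the application's algorithm):

```python
from collections import deque

def region_grow(image, seed, tol):
    """Simple region growing on a 2-D slice (illustrative sketch).

    Grows from `seed` (row, col), accepting 4-connected neighbours whose
    intensity differs from the seed intensity by at most `tol`.
    Returns the set of (row, col) pixels in the grown region.
    """
    rows, cols = len(image), len(image[0])
    ref = image[seed[0]][seed[1]]          # reference intensity at the seed
    region, queue = {seed}, deque([seed])
    while queue:
        r, c = queue.popleft()
        for nr, nc in ((r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)):
            if (0 <= nr < rows and 0 <= nc < cols
                    and (nr, nc) not in region
                    and abs(image[nr][nc] - ref) <= tol):
                region.add((nr, nc))
                queue.append((nr, nc))
    return region
```

In 3-D the same idea applies with 6-connected voxel neighbours; production systems typically combine it with the other listed methods.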
  • Manual segmentation involves the user delineating the edges of the lesion on multiple 2D slices of the 3D data and interpolating between the edges of each pair of slices, or outlining the edges of the lesion on every 2D slice and then generating the three-dimensional contour of the lesion from these 2D edges.
  • Interactive segmentation methods add user interaction as algorithm input in the segmentation process, so that objects with high-level semantics in the image can be completely extracted.
  • a preliminary segmentation range can be selected by the user; then, the three-dimensional contour of the lesion is automatically segmented within the preliminary segmentation range.
  • For example, the user can draw some points or lines within the preliminary segmentation range; the interactive segmentation algorithm takes the points or lines drawn by the user as input, automatically establishes a weighted graph of the similarity between each pixel and the foreground or background, and distinguishes foreground from background by solving the minimum cut, thereby determining the three-dimensional contour of the lesion.
  • surface reconstruction is performed according to the three-dimensional contour obtained by the segmentation to generate a three-dimensional model of the lesion, such as the three-dimensional model 401 of the lesion as shown in FIG. 4 .
  • The surface rendering method can be used to first reconstruct the surface of the 3D target structure from the 3D data, that is, to reconstruct the surface of the object according to the segmentation results and contour lines, and then use a reasonable illumination model and texture mapping method to generate a realistic 3D surface entity. It can be understood that, since the image displayed by the display is still a two-dimensional plane, surface rendering is actually a projection of the three-dimensional object onto the two-dimensional plane, similar to "photographing" the three-dimensional object from a certain viewpoint and displaying its image on the photo.
  • the ultrasonic image can be displayed in the 3D display window, and the 3D model of the lesion can be displayed at the location of the lesion in the ultrasonic image.
  • the positional relationship between the reconstructed 3D model of the lesion and the real-time ultrasound image can reflect the location, size, geometry of the lesion and its relationship with the surrounding tissue.
  • The coordinate X_us_tumor of the lesion position in the real-time ultrasound image is calculated based on the registration matrix between the three-dimensional image and the ultrasound image, and the three-dimensional model of the lesion is superimposed and displayed at the corresponding location in the three-dimensional display window.
  • Step S240, acquiring the ablation parameters of the ablation lesion, and displaying the 3D model of the ablation lesion at the corresponding position in the 3D display window according to the acquired ablation parameters and the registration result obtained in step S220, where the corresponding position is the location of the ablation focus determined based on the registration result and the acquired ablation parameters.
  • This embodiment of the present application does not limit the execution order of step S230 and step S240.
  • The image space of the three-dimensional display window can be the image space corresponding to the three-dimensional image, the image space corresponding to the ultrasound image, or any other image space, as long as it is ensured that the three-dimensional model of the lesion and the three-dimensional model of the ablation lesion are displayed in the same image space.
  • In the following, displaying the two three-dimensional models in the image space of the three-dimensional image is taken as an example for description.
  • Three-dimensional models of ablation lesions include single-needle ablation models or multi-needle combined ablation models.
  • the single-needle ablation model is a three-dimensional model of the ablation lesion displayed in the three-dimensional display window according to the single ablation needle inserted into the lesion based on the above-mentioned registration result and the acquired ablation parameters.
  • the multi-needle combined ablation model is a three-dimensional model of the ablation lesion displayed in the three-dimensional display window according to at least two ablation needles inserted into the lesion sequentially or simultaneously based on the above-mentioned registration results and the acquired ablation parameters.
  • the single-needle ablation model of the ablation lesion displayed in the three-dimensional display window is a single ellipsoid
  • the multi-needle combined ablation model includes multiple ellipsoids.
  • the three-dimensional model of the ablation lesion may be displayed in a different color from the three-dimensional model of the lesion to facilitate the distinction between the two.
  • both the three-dimensional model of the ablation lesion and the three-dimensional model of the lesion can be displayed as translucent figures, so as to facilitate the observation of the overlapping areas of the two.
  • FIG. 4 shows a three-dimensional model 402 of an ablation lesion in one embodiment.
  • the operator can intuitively determine the ablation area, that is, the overlapping area of the 3D model of the ablation lesion and the 3D model of the lesion.
  • the overlapping area may be displayed in a different color than the three-dimensional model of the ablation lesion and the three-dimensional model of the lesion.
  • Specifically, the coordinates of the center of the ablation focus in the ultrasound image may be determined first, and the coordinates of the center of the ablation focus in the three-dimensional image determined according to the coordinate transformation matrix from the ultrasound image space to the three-dimensional image space; then, according to the coordinates of the center of the ablation focus in the three-dimensional image and the size of the ablation lesion, a three-dimensional model of the ablation lesion is drawn in the image space of the three-dimensional image.
  • the coordinates of the center of the ablation focus in the ultrasonic image can be determined according to the angle of the puncture frame and the depth of the ablation path.
  • the coordinates of the center of the ablation lesion may be set separately for each of the ablation models.
  • in addition to the center coordinates of the ablation focus, the ultrasound imaging system also needs to obtain certain ablation parameters set by the operator. For example, the operator also needs to set the power of the ablation needle and the continuous ablation time, and obtain the ablation range of the ablation needle, to ensure that the ablation area contains the entire three-dimensional model of the lesion and its safety margin.
  • the safety margin refers to the requirement that, during the ablation process, the ablation lesion cover the edge of the lesion and extend outward by a certain distance, to ensure complete ablation of the entire lesion.
  • the operator can input the given power of the ablation operation and the ablation duration, and obtain the size of the ablation area according to the above-mentioned working parameters.
  • the operator may also first set the required ablation area range, that is, a preset ablation area, and select the corresponding given power and ablation duration and other working parameters according to the set ablation area range. Since the ablation range of the ablation needle is usually an ellipsoid, in this step, when setting the range of the ablation region, it is only necessary to set the length of the long axis and the length of the short axis of the ellipsoid.
  • the shape of the ablation focus is not limited to an ellipsoid, but can also include a sphere or a cylinder.
  • the operator can set the shape of the ablation focus according to the shape of the focus, and set different parameters according to the shape of the ablation focus.
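As a concrete illustration of the shape settings and safety margin described above, the following sketch (not from the original document; the axis-aligned ellipsoid test and the radial treatment of the margin are simplifying assumptions) checks whether an ellipsoidal ablation focus covers every lesion point plus the safety margin:

```python
import numpy as np

def covers_with_margin(lesion_pts, center, radii, margin):
    """Return True if every lesion point, pushed radially outward from the
    ellipsoid center by `margin`, still falls inside the axis-aligned
    ellipsoidal ablation focus (simplified margin handling)."""
    pts = np.asarray(lesion_pts, dtype=float)
    v = pts - center
    dist = np.linalg.norm(v, axis=1, keepdims=True)
    scale = (dist + margin) / np.where(dist == 0, 1.0, dist)  # avoid /0 at center
    expanded = center + v * scale
    d = (expanded - center) / radii
    return bool(((d ** 2).sum(axis=1) <= 1.0).all())
```

A planning step could then enlarge the preset ablation region, or add needles, until such a check passes.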
  • the drawing method of the three-dimensional model of the ablation lesion can also be surface rendering. Assuming that the shape of the ablation focus is an ellipsoid, the operator can set the long diameter, short diameter, needle-tip distance, path depth, etc. of the ablation focus according to the actual ablation needle model, and the system uses these parameters to draw the ablation focus at the coordinate origin, where the needle-tip distance is the distance between the heat source of the ablation needle and the needle tip, and the center point of the ablation focus is located at the depth of the ablation needle's puncture path minus the needle-tip distance.
  • the puncture frame angle θ needs to be set first, and the position of the ablation focus in the current ultrasonic sector, T_us_ablate, i.e. (x_us_ablate, y_us_ablate), is calculated using the set puncture frame angle θ and the path depth d, where:
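The formula itself is not reproduced in this text. Under the conventional geometric assumption that the focus lies at path depth d along a ray tilted by the puncture frame angle θ from the probe's axial direction, the position (x_us_ablate, y_us_ablate) could be computed as in this sketch:

```python
import math

def ablation_focus_in_sector(theta_deg, depth):
    """Position of the ablation focus in the ultrasound sector, assuming the
    focus lies at `depth` along a ray tilted by the puncture frame angle
    theta from the axial (y) direction -- an assumed convention."""
    theta = math.radians(theta_deg)
    x = depth * math.sin(theta)  # lateral offset in the sector
    y = depth * math.cos(theta)  # axial depth in the sector
    return x, y
```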
  • the coordinates of the center of the ablation focus in the ultrasound image may be determined according to a positioning sensor disposed on the ablation needle.
  • the angle and depth of the ablation lesion are obtained from the coordinate changes of the positioning sensor; that is, the coordinates of the ablation lesion in the positioning-sensor space are determined according to the positioning sensor, converted to the ultrasonic image space using the transformation matrix from the positioning sensor to the ultrasonic image space, and finally mapped to the three-dimensional image space according to the registration matrix between the ultrasonic image space and the three-dimensional image space, to realize three-dimensional visualization.
  • the coordinates of the ablation focus in the ultrasound space are denoted T_us_ablate, and its coordinates are:
  • the operator moves the ultrasonic probe installed with the positioning sensor, and the position of the ablation focus in the three-dimensional image space will move with the change of R probe in the mapping relationship matrix.
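The chain of mappings described above (positioning-sensor space, then ultrasound image space via the sensor-to-image transform and probe pose R_probe, then the three-dimensional image space via the registration matrix) can be sketched with 4×4 homogeneous transforms; the composition order shown is an illustrative assumption:

```python
import numpy as np

def map_focus_to_3d(p_sensor, T_sensor_to_us, R_probe, T_us_to_3d):
    """Map ablation-focus coordinates from positioning-sensor space into the
    3D image space via the ultrasound image space, using 4x4 homogeneous
    transforms; the composition order is an assumption for illustration."""
    T_s = np.asarray(T_sensor_to_us, dtype=float)
    R_p = np.asarray(R_probe, dtype=float)
    T_r = np.asarray(T_us_to_3d, dtype=float)
    p = np.append(np.asarray(p_sensor, dtype=float), 1.0)  # homogeneous point
    p_us = R_p @ (T_s @ p)   # T_us_ablate: focus in ultrasound image space
    p_3d = T_r @ p_us        # focus in 3D image space (registration matrix)
    return p_3d[:3]
```

As the probe moves, re-evaluating this mapping with the updated R_probe moves the drawn focus accordingly.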
  • the operator may click to save, that is, it is considered that the current position of the ablation focus has been ablated.
  • two or more three-dimensional display windows may be displayed on the display interface, wherein one window displays the current view and the other windows display views from other perspectives, so that the operator can observe the lesion and ablation foci from multiple perspectives simultaneously.
  • the above two or more three-dimensional display windows include a main window 501 and an auxiliary window 502 , where the main window 501 is used to display the current view, and the auxiliary window 502 is used to display views from other perspectives.
  • the auxiliary window 502 can be superimposed and displayed on the corner of the main window 501 as shown in FIG. 5 to save layout.
  • the auxiliary window 502 can also be displayed side by side with the main window 501 .
  • the size of the main window 501 is larger than the size of the auxiliary window 502, but not limited to this, and the sizes of the two may also be the same.
  • the other viewing angles displayed by the at least one auxiliary window include a reverse viewing angle of the current viewing angle displayed by the main window.
  • the main window 501 in FIG. 5 displays the perspective of the front side of the lesion
  • the auxiliary window 502 displays the perspective of the rear side of the lesion.
  • auxiliary windows can also be set to display other perspectives such as the left, right, and bottom sides of the lesion, so that the operator can more comprehensively understand the ablation conditions for different angles of the lesion.
  • the angle relationship between the main window and the auxiliary window may be preset, and the angle of view of the auxiliary window is determined according to the angle of view of the main window and the angle relationship.
  • the preset angular relationship between the main window and the auxiliary window is an angle difference of 30° in the clockwise direction, and the auxiliary window relative to the main window displays a view from a viewing angle rotated 30° clockwise.
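The preset angle relationship above can be applied directly; a minimal sketch (the function name and degree convention are illustrative assumptions):

```python
def auxiliary_view_angle(main_angle_deg, offset_deg=30.0):
    """Viewing angle of the auxiliary window, derived from the main window's
    angle plus a preset clockwise offset (30 degrees in the example above);
    wraps around at 360 degrees."""
    return (main_angle_deg + offset_deg) % 360.0
```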
  • other viewing angles displayed in the auxiliary window are the target viewing angles described below.
  • the auxiliary window may display the view at the viewing angle with the largest unablated area, or the view at the viewing angle requiring the smallest rotation.
  • the corresponding views under each viewing angle may be sequentially displayed in the multiple auxiliary windows according to the size of the unablated region or the size of the rotation angle.
  • Step S250 Determine the ablation area according to the spatial relationship between the three-dimensional model of the lesion and the three-dimensional model of the ablation lesion.
  • the ablation area is the overlapping area of the 3D model of the lesion and the 3D model of the ablation lesion. Based on the above registration results, the two models can be mapped to the same image space, and their overlapping area can be calculated and determined as the ablation area.
  • whether each coordinate point of the three-dimensional model of the lesion falls within the range of the three-dimensional model of the ablation focus is determined: a lesion position that falls within that range is considered ablated, while a lesion position outside it is considered not ablated. The set of coordinate points falling within the range of the three-dimensional model of the ablation focus is determined as the ablated area; through traversal, connected-domain calculation and similar methods, the ablated coordinate area {P1, P2, ...} and the non-ablated coordinate area {N1, N2, ...} of the lesion can be calculated.
  • based on the ablated coordinate area and the unablated coordinate area, the z-axis (depth) extent in the current view, as well as quantities such as the ablation ratio and the maximum ablated connected area, can be determined. It can be understood that, in other embodiments, since the ablation area is the overlapping area of the 3D model of the lesion and the 3D model of the ablation lesion, based on similar principles, the ablation area can also be determined by judging whether each coordinate point of the 3D model of the ablation lesion falls within the confines of the 3D model of the lesion.
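The point-by-point traversal described above can be sketched with vectorized operations; the axis-aligned ellipsoidal focus and the helper name are assumptions for illustration:

```python
import numpy as np

def split_ablated(lesion_pts, center, radii):
    """Split lesion voxel coordinates into ablated / unablated sets against an
    axis-aligned ellipsoidal ablation focus (a point is ablated when
    sum(((p - c) / r)^2) <= 1), and also return the ablation ratio."""
    pts = np.asarray(lesion_pts, dtype=float)
    d = (pts - center) / radii
    inside = (d ** 2).sum(axis=1) <= 1.0
    ablated = pts[inside]       # {P1, P2, ...}
    unablated = pts[~inside]    # {N1, N2, ...}
    ratio = float(inside.mean()) if len(pts) else 0.0
    return ablated, unablated, ratio
```

Connected-component labeling of the unablated set would then yield the maximum ablated or unablated connected area.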
  • Step S260: when the spatial relationship between the three-dimensional model of the lesion and the three-dimensional model of the ablation lesion satisfies the rotation condition, automatically rotate the current view of the three-dimensional model of the lesion and the three-dimensional model of the ablation lesion in the three-dimensional display window from the current viewing angle to a target viewing angle at which the ablation area meets the predetermined requirements of the ablation lesion, and display the three-dimensional model of the lesion and the three-dimensional model of the ablation lesion under the target viewing angle.
  • the spatial relationship between the three-dimensional model of the lesion and the three-dimensional model of the ablation lesion satisfying the rotation condition includes: in the three-dimensional display window, the inclusion relationship between the spatial position of the three-dimensional model of the lesion and the spatial position of the three-dimensional model of the ablation lesion at any viewing angle satisfies a preset condition. For example, it may be judged whether the mutually included portion of the two spatial positions at any viewing angle reaches a certain proportion; when the included portion of the spatial position of the 3D model of the lesion and the spatial position of the 3D model of the ablation lesion at another viewing angle reaches a certain proportion, it is judged that the rotation condition is satisfied.
  • alternatively, whether the rotation condition is satisfied can be judged by determining whether the spatial position of the 3D model of the lesion and the spatial position of the 3D model of the ablation lesion at any viewing angle include each other; for example, if at another viewing angle the spatial positions of the two models do not include each other, it is determined that the rotation condition is satisfied.
  • it can also be determined whether a target of interest exists in the mutually included portion of the spatial position of the 3D model of the lesion and the spatial position of the 3D model of the ablation lesion at any viewing angle; if so, it is judged that the rotation condition is satisfied.
  • whether the rotation condition is satisfied is determined according to the projected area of the ablation region under the current viewing angle.
  • the projected area mentioned herein refers to the display area, at the current viewing angle, of the view of the 3D model shown in the display window of the display.
  • the projected area of the ablation area at the current viewing angle can also be compared with the projected area of the 3D model of the ablation lesion to determine whether most of the lesion at the current viewing angle has been ablated; that is, if the proportion of the projected area of the ablation region in the projected area of the three-dimensional model of the ablation lesion exceeds a predetermined threshold, it is determined that the rotation condition is satisfied.
  • if the ablation area seen at the current viewing angle is still too small, or the ablation area cannot be seen at the current viewing angle at all, the overlapping area of the ablation lesion and the lesion area may be located in the opposite direction of the current viewing angle or in another direction, so the viewing angle of the current view needs to be rotated to better observe the ablation area.
  • if the proportion of the projected area of the ablation area in the projected area of the three-dimensional model of the lesion is smaller than a predetermined threshold, or the proportion of the projected area of the ablation area in the projected area of the three-dimensional model of the ablation lesion is smaller than the predetermined threshold, it is determined that the rotation condition is satisfied.
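As a sketch of the projected-area test just described (the threshold value and function name are illustrative; this follows the variant in which a too-small proportion triggers rotation, while the paragraphs above also describe the complementary variant):

```python
def rotation_needed(proj_area_ablation, proj_area_lesion, threshold=0.3):
    """Rotation condition from projected areas at the current viewing angle:
    rotate when the ablation area's projection is too small a share of the
    lesion's projection (the 0.3 threshold is illustrative)."""
    if proj_area_lesion == 0:
        return True  # degenerate projection; treat as needing rotation
    return proj_area_ablation / proj_area_lesion < threshold
```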
  • the user can also determine whether the spatial relationship between the three-dimensional model of the lesion and the three-dimensional model of the ablation lesion satisfies the rotation condition.
  • the rotation of the viewing angle can be triggered by manually clicking the automatic rotation function key.
  • a rotation operation is performed to rotate the current view to a target viewing angle in which the ablation area meets the predetermined requirements of the ablation lesion, so as to better observe the ablation effect.
  • the target viewing angle at which the ablation area meets the predetermined requirements of the ablation lesion may be a viewing angle that is beneficial for the user to observe the ablation area to ablate the lesion.
  • the target viewing angle may be a viewing angle with a smaller projected area of the ablation region.
  • the projected area of the ablation area can be compared with the projected area of the 3D model of the lesion; that is, the target viewing angle is the viewing angle with the smallest ratio of the projected area of the ablation area to the projected area of the 3D model of the lesion, i.e., the current view is rotated to the target viewing angle with the largest unablated area.
  • at least one viewing angle at which the ratio of the projected area of the ablation area to the projected area of the three-dimensional model of the lesion is smaller than or equal to a predetermined threshold can be determined. When there is exactly one such viewing angle, it is taken as the target viewing angle with the largest unablated area; when there is more than one, one of them can be selected as the target viewing angle, for example the viewing angle requiring the smallest rotation angle.
  • the projected area of the ablation area can also be compared with the projected area of the 3D model of the ablation lesion to find a viewing angle with a smaller projected area of the ablation area, that is, the target viewing angle is the projected area of the ablation area and the The viewing angle with the smallest ratio of the projected area of the three-dimensional model of the ablation lesion, or the target viewing angle is the viewing angle where the ratio of the projected area of the ablation region to the projected area of the three-dimensional model of the ablation lesion is less than or equal to a predetermined threshold.
  • the angle between the current viewing angle and the target viewing angle is calculated as the rotation angle of the view, and the viewing angle of the view is rotated according to the calculated rotation angle.
  • the coordinates of the projection of the unablated region of the three-dimensional model of the lesion in the current view can be obtained by traversing the coordinate depth of the three-dimensional model of the lesion, and then the center position of the projection of the unablated region of the lesion displayed in the current view can be calculated.
  • the second coordinate (x2, y2) of the center position of the projection of the unablated area of the lesion under the target viewing angle is calculated, and the included angle θ between the line connecting the first coordinate (x1, y1) to the coordinate origin and the line connecting the second coordinate (x2, y2) to the coordinate origin is determined; this included angle θ is used as the rotation angle.
  • the line between the coordinate origin and the center point of the unablated area is used as the starting and ending point of rotation.
  • alternatively, the line connecting the coordinate origin to the center point of the lesion projection can be used as the start and end reference of the rotation; that is, the third coordinate of the center position of the lesion projection under the current viewing angle is determined, the included angle between the line connecting the third coordinate to the coordinate origin and the line connecting the second coordinate to the coordinate origin is calculated, and this included angle is used as the rotation angle of the rotation.
  • the viewing angle with the smallest required rotation angle may be determined as the target viewing angle.
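The angle computation described above (the included angle between the origin-to-center lines at the current and target viewing angles) can be sketched as:

```python
import math

def rotation_angle(p_current, p_target):
    """Included angle (degrees) between the origin->p_current and
    origin->p_target lines, where the points are the projected centers of the
    unablated region at the current and target viewing angles."""
    x1, y1 = p_current
    x2, y2 = p_target
    dot = x1 * x2 + y1 * y2
    norm = math.hypot(x1, y1) * math.hypot(x2, y2)
    # clamp to [-1, 1] to guard against floating-point drift before acos
    return math.degrees(math.acos(max(-1.0, min(1.0, dot / norm))))
```

Evaluating this for each candidate target viewing angle and picking the smallest result implements the "smallest required rotation angle" selection.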
  • two or more three-dimensional display windows are displayed on the display interface, and one window (for example, the main window) displays the current view under the current viewing angle; in one example, when the current view is rotated, the views in the other windows are rotated synchronously.
  • the main window 501 displays the perspective of the anterior side of the lesion
  • the auxiliary window 502 displays the perspective of the rear of the lesion. If the perspective in the main window 501 is rotated to the rear of the lesion, the viewing angle in the auxiliary window 502 will be rotated to the anterior side of the lesion, as shown in FIG. 6 . If the viewing angle in the main window 501 is rotated to the left side of the lesion, the viewing angle in the auxiliary window 502 will correspondingly rotate to the right side of the lesion.
  • the views from other perspectives can also be fixed. For example, if the viewing angle in the main window 501 is rotated to the left side of the lesion, the viewing angle in the auxiliary window can remain at the rear side of the lesion.
  • steps S240 to S260 may be repeatedly performed to continue generating ablation lesions based on the rotated view until the lesions are completely ablated.
  • the three-dimensional display method 200 can automatically rotate the viewing angle for observing the three-dimensional model of the lesion and the three-dimensional model of the ablation lesion, which greatly facilitates the ablation operation.
  • FIG. 7 another embodiment of the present application provides a three-dimensional display method 700, which includes:
  • step S710 two or more three-dimensional display windows are displayed on the display interface
  • step S720 a current view is displayed in one of the two or more three-dimensional display windows, wherein the current view includes a three-dimensional model of the lesion and a three-dimensional model of the ablation lesion under the current viewing angle;
  • step S730 the remaining windows of the two or more windows display views from other perspectives, and the views from the other perspectives include the three-dimensional model of the lesion and the ablation lesion from the other perspectives. 3D model.
  • the current view further includes an ultrasound image
  • the ultrasound image, the three-dimensional model of the lesion, and the three-dimensional model of the ablation lesion in the same three-dimensional display window are in the same image space, for example, may be in the image space of the ultrasound image or within the image space of a 3D image.
  • the views from other viewing angles may also include ultrasound images, and the three-dimensional models of the lesions, the three-dimensional models of ablation lesions and the ultrasound images from other viewing angles are also in the same image space.
  • FIG. 8 shows a schematic diagram of a display interface 800 of the three-dimensional display method 700 according to an embodiment of the present invention.
  • the ultrasound window 803 and the fusion display window 804 are also displayed; the ultrasound window 803 is used to display an ultrasound image, for example a real-time ultrasound image, and the fusion display window 804 is used to display a fusion image of the ultrasound image and the three-dimensional image containing the lesion.
  • other suitable layouts may also be used for the first three-dimensional display window 801 , the second three-dimensional display window 802 , the ultrasound window 803 and the fusion display window 804 .
  • the three-dimensional display method 700 according to the embodiment of the present invention adopts a multi-viewing angle display manner of two or more windows, which is beneficial for the user to better know the ablation conditions of different viewing angles.
  • an embodiment of the present invention further provides an ultrasound imaging system 100 , and the ultrasound imaging system 100 can be used to implement the above-mentioned three-dimensional display method 200 .
  • the ultrasound imaging system 100 may include some or all of the following components: an ultrasound probe 110, a transmitting circuit 112, a receiving circuit 114, a beamforming circuit 116, a processor 118, a display 120, a transmit/receive selection switch 122, and a memory 124.
  • for the details of these components, reference may be made to the description above. Only the main functions of the ultrasound imaging system 100 are described below, and details that have already been described are omitted.
  • the transmitting circuit 112 is used to excite the ultrasonic probe 110 to transmit ultrasonic waves to the lesion;
  • the receiving circuit 114 is used to control the ultrasonic probe to receive the ultrasonic echo returned from the lesion to obtain an ultrasonic echo signal;
  • the processor 118 is used to: process the ultrasonic echo signal to obtain an ultrasound image; register the ultrasound image with a pre-acquired three-dimensional image containing the lesion; segment the lesion in the three-dimensional image containing the lesion; display, based on the result of the registration and the result of the segmentation, the three-dimensional model of the lesion in the three-dimensional display window of the display 120; and acquire the ablation parameters of the ablation lesion and display the three-dimensional model of the ablation lesion in the three-dimensional display window according to the registration result and the acquired ablation parameters;
  • the ablation area is determined according to the spatial relationship between the 3D model of the lesion and the 3D model of the ablation lesion; when the rotation condition is met, the current view is automatically rotated from the current viewing angle to a target viewing angle at which the ablation area meets the predetermined requirements of the ablation lesion, and the three-dimensional model of the lesion and the three-dimensional model of the ablation lesion are displayed under the target viewing angle.
  • the ablation area is an overlapping area of the three-dimensional model of the lesion and the three-dimensional model of the ablation lesion.
  • the satisfying the rotation condition includes: the spatial relationship between the three-dimensional model of the lesion and the three-dimensional model of the ablation lesion satisfies the rotation condition, or receiving a rotation instruction input by a user.
  • the spatial relationship between the three-dimensional model of the lesion and the three-dimensional model of the ablation lesion satisfying a rotation condition includes: in the three-dimensional display window, the spatial position of the three-dimensional model of the lesion at any viewing angle and The inclusion of the spatial position of the three-dimensional model of the ablation focus satisfies a preset condition.
  • the spatial relationship between the three-dimensional model of the lesion and the three-dimensional model of the ablation lesion satisfies a rotation condition, including: in the current viewing angle, the projected area of the ablation region in the three-dimensional display window The projected area of the three-dimensional model of the lesion or the projected area of the ablation lesion satisfies a preset condition.
  • satisfying a preset condition includes: the proportion of the projected area of the ablation area in the projected area of the three-dimensional model of the lesion exceeds a predetermined threshold, or the projected area of the ablation area is within The proportion of the projected area of the three-dimensional model of the ablation lesion exceeds a predetermined threshold.
  • the target viewing angle at which the ablation area meets the predetermined requirements of the ablation lesion includes: the viewing angle at which the ratio of the projected area of the ablation area to the projected area of the three-dimensional model of the lesion is the smallest, or the viewing angle at which the ratio of the projected area of the ablation area to the projected area of the three-dimensional model of the ablation lesion is lower than a predetermined threshold.
  • rotating the current view from the current viewing angle to a target viewing angle where the projection of the ablation area meets a predetermined requirement includes: determining the first coordinates of the center position of the unablated area of the three-dimensional model of the lesion under the current viewing angle; determining the target viewing angle under the The second coordinate of the center position of the unablated area of the three-dimensional model of the lesion; determine the angle between the line connecting the first coordinate and the coordinate origin and the line connecting the second coordinate and the coordinate origin, and use the angle as the rotation angle of the rotation .
  • rotating the current view from the current viewing angle to a target viewing angle where the projection of the ablation area meets a predetermined requirement includes: determining the third coordinate of the center position of the three-dimensional model of the lesion under the current viewing angle; determining the three-dimensional image of the lesion under the target viewing angle The second coordinate of the center position of the unablated region of the model; the included angle between the line connecting the third coordinate and the coordinate origin and the line connecting the second coordinate and the coordinate origin is determined, and the included angle is used as the rotation angle of the rotation.
  • the processor 118 is further configured to display two or more three-dimensional display windows on the display interface of the display 120, wherein one window displays the current view, and the other windows display views from other viewing angles.
  • the two or more windows include a main window and an auxiliary window, wherein the main window is used to display the current view, and the auxiliary window is used to display views from other perspectives, and the size of the main window may be larger than that of the auxiliary window.
  • the views in other perspectives are fixed, or the views in other perspectives can also be rotated synchronously.
  • a computer storage medium is also provided, on which program instructions are stored; when the program instructions are run by a computer or a processor, they are used to execute the corresponding steps of the three-dimensional display method 200 of the embodiments of the present application.
  • the storage medium may include, for example, a memory card of a smartphone, a storage component of a tablet computer, a hard disk of a personal computer, read only memory (ROM), erasable programmable read only memory (EPROM), portable compact disk read only memory (CD-ROM), USB memory, or any combination of the above storage media.
  • the computer-readable storage medium can be any combination of one or more computer-readable storage media.
  • a computer program is also provided, and the computer program can be stored in the cloud or on a local storage medium.
  • the computer program is run by a computer or a processor, it is used to execute the corresponding steps of the three-dimensional display method of the embodiments of the present application.
  • the three-dimensional display method and the ultrasound imaging system can automatically rotate the viewing angle for observing the three-dimensional model of the lesion and the three-dimensional model of the ablation lesion, which greatly facilitates the surgeon's surgical operation.
  • the disclosed apparatus and method may be implemented in other manners.
  • the device embodiments described above are only illustrative.
  • the division of the units is only a logical function division; in actual implementation, there may be other division methods. For example, multiple units or components may be combined or integrated into another device, or some features may be omitted or not implemented.
  • Various component embodiments of the present application may be implemented in hardware, or in software modules running on one or more processors, or in a combination thereof.
  • a microprocessor or a digital signal processor (DSP) may be used in practice to implement some or all functions of some modules according to the embodiments of the present application.
  • the present application can also be implemented as a program of apparatus (eg, computer programs and computer program products) for performing part or all of the methods described herein.
  • Such a program implementing the present application may be stored on a computer-readable medium, or may be in the form of one or more signals. Such signals may be downloaded from Internet sites, or provided on carrier signals, or in any other form.

Landscapes

  • Health & Medical Sciences (AREA)
  • Surgery (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Biomedical Technology (AREA)
  • Otolaryngology (AREA)
  • Engineering & Computer Science (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Medical Informatics (AREA)
  • Molecular Biology (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Ultra Sonic Daignosis Equipment (AREA)

Abstract

The invention relates to a three-dimensional display method and an ultrasound imaging system. The method comprises the steps of: obtaining an ultrasound image acquired for a lesion and registering the ultrasound image with a three-dimensional image containing the lesion; segmenting the lesion in the three-dimensional image and displaying a three-dimensional model of the lesion in a three-dimensional display window according to the registration result and the segmentation result; displaying a three-dimensional model of an ablation lesion in the three-dimensional display window according to the registration result and an acquired ablation parameter; determining an ablation area according to a spatial relationship between the three-dimensional model of the lesion and the three-dimensional model of the ablation lesion; and, if the spatial relationship between the three-dimensional model of the lesion and the three-dimensional model of the ablation lesion satisfies a rotation condition, automatically rotating the current view from the current viewing angle to a target viewing angle at which the ablation area meets predetermined requirements of the ablation lesion, and displaying the three-dimensional model of the lesion and the three-dimensional model of the ablation lesion at the target viewing angle. The system comprises an ultrasound probe (110), a transmitting circuit (112), a receiving circuit (114), a processor (118) and a display (120). According to this three-dimensional display solution, the viewing angle for observing the three-dimensional model of a lesion and the three-dimensional model of an ablation lesion can be rotated automatically, which facilitates user operations.
PCT/CN2020/106893 2020-08-04 2020-08-04 Three-dimensional display method and ultrasound imaging system WO2022027251A1 (fr)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202080102793.1A CN115811961A (zh) 2020-08-04 2020-08-04 Three-dimensional display method and ultrasound imaging system
PCT/CN2020/106893 WO2022027251A1 (fr) 2020-08-04 2020-08-04 Three-dimensional display method and ultrasound imaging system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2020/106893 WO2022027251A1 (fr) 2020-08-04 2020-08-04 Three-dimensional display method and ultrasound imaging system

Publications (1)

Publication Number Publication Date
WO2022027251A1 (fr)

Family

ID=80118674

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2020/106893 WO2022027251A1 (fr) 2020-08-04 2020-08-04 Three-dimensional display method and ultrasound imaging system

Country Status (2)

Country Link
CN (1) CN115811961A (fr)
WO (1) WO2022027251A1 (fr)

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114469309A (zh) * 2022-02-16 2022-05-13 上海睿刀医疗科技有限公司 Placement and ablation apparatus, strategy acquisition method, electronic device, and storage medium
CN114820731A (zh) * 2022-03-10 2022-07-29 青岛海信医疗设备股份有限公司 Registration method for CT images and three-dimensional body-surface images, and related apparatus
CN116523802A (zh) * 2023-07-04 2023-08-01 天津大学 Enhancement and optimization method for liver ultrasound images
CN117503344A (zh) * 2023-12-12 2024-02-06 中国人民解放军总医院第一医学中心 Power confirmation method and apparatus for multiple puncture needles, electronic device, and storage medium
CN117689567A (zh) * 2024-01-31 2024-03-12 广州索诺康医疗科技有限公司 Ultrasound image scanning method and apparatus
CN117853570A (zh) * 2024-03-08 2024-04-09 科普云医疗软件(深圳)有限公司 Anesthesia puncture auxiliary positioning method

Families Citing this family (2)

Publication number Priority date Publication date Assignee Title
CN116725673B (zh) * 2023-08-10 2023-10-31 卡本(深圳)医疗器械有限公司 Ultrasound puncture navigation system based on three-dimensional reconstruction and multimodal medical image registration
CN117883179A (zh) * 2024-01-24 2024-04-16 天津市鹰泰利安康医疗科技有限责任公司 High-frequency electroporation pulse ablation apparatus and image-guidance method therefor

Citations (8)

Publication number Priority date Publication date Assignee Title
CN101859341A (zh) * 2009-04-13 2010-10-13 盛林 Image-guided ablation therapy surgical planning apparatus
CN103971574A (zh) * 2014-04-14 2014-08-06 中国人民解放军总医院 Ultrasound-guided tumor puncture training and simulation system
CN104605926A (zh) * 2013-11-05 2015-05-13 深圳迈瑞生物医疗电子股份有限公司 Ultrasound interventional ablation system and working method thereof
CN108720921A (zh) * 2017-03-07 2018-11-02 韦伯斯特生物官能(以色列)有限公司 Automatic tracking and adjustment of the viewing angle during catheter ablation treatment
CN110769750A (zh) * 2017-04-18 2020-02-07 波士顿科学医学有限公司 Electroanatomical mapping tools facilitated by activation waveforms
US20200046435A1 (en) * 2018-08-10 2020-02-13 Covidien Lp Systems and methods for ablation visualization
CN111012484A (zh) * 2020-01-06 2020-04-17 南京康友医疗科技有限公司 Real-time ablation area imaging system
US20200155086A1 (en) * 2017-05-02 2020-05-21 Apn Health, Llc Determining and displaying the 3d location and orientation of a cardiac-ablation balloon

Patent Citations (8)

Publication number Priority date Publication date Assignee Title
CN101859341A (zh) * 2009-04-13 2010-10-13 盛林 Image-guided ablation therapy surgical planning apparatus
CN104605926A (zh) * 2013-11-05 2015-05-13 深圳迈瑞生物医疗电子股份有限公司 Ultrasound interventional ablation system and working method thereof
CN103971574A (zh) * 2014-04-14 2014-08-06 中国人民解放军总医院 Ultrasound-guided tumor puncture training and simulation system
CN108720921A (zh) * 2017-03-07 2018-11-02 韦伯斯特生物官能(以色列)有限公司 Automatic tracking and adjustment of the viewing angle during catheter ablation treatment
CN110769750A (zh) * 2017-04-18 2020-02-07 波士顿科学医学有限公司 Electroanatomical mapping tools facilitated by activation waveforms
US20200155086A1 (en) * 2017-05-02 2020-05-21 Apn Health, Llc Determining and displaying the 3d location and orientation of a cardiac-ablation balloon
US20200046435A1 (en) * 2018-08-10 2020-02-13 Covidien Lp Systems and methods for ablation visualization
CN111012484A (zh) * 2020-01-06 2020-04-17 南京康友医疗科技有限公司 Real-time ablation area imaging system

Cited By (10)

Publication number Priority date Publication date Assignee Title
CN114469309A (zh) * 2022-02-16 2022-05-13 上海睿刀医疗科技有限公司 Placement and ablation apparatus, strategy acquisition method, electronic device, and storage medium
CN114469309B (zh) * 2022-02-16 2022-10-21 上海睿刀医疗科技有限公司 Ablation apparatus, electrode needle placement strategy acquisition method, electronic device, and storage medium
CN114820731A (zh) * 2022-03-10 2022-07-29 青岛海信医疗设备股份有限公司 Registration method for CT images and three-dimensional body-surface images, and related apparatus
CN116523802A (zh) * 2023-07-04 2023-08-01 天津大学 Enhancement and optimization method for liver ultrasound images
CN116523802B (zh) * 2023-07-04 2023-08-29 天津大学 Enhancement and optimization method for liver ultrasound images
CN117503344A (zh) * 2023-12-12 2024-02-06 中国人民解放军总医院第一医学中心 Power confirmation method and apparatus for multiple puncture needles, electronic device, and storage medium
CN117689567A (zh) * 2024-01-31 2024-03-12 广州索诺康医疗科技有限公司 Ultrasound image scanning method and apparatus
CN117689567B (zh) * 2024-01-31 2024-05-24 广州索诺康医疗科技有限公司 Ultrasound image scanning method and apparatus
CN117853570A (zh) * 2024-03-08 2024-04-09 科普云医疗软件(深圳)有限公司 Anesthesia puncture auxiliary positioning method
CN117853570B (zh) * 2024-03-08 2024-05-10 科普云医疗软件(深圳)有限公司 Anesthesia puncture auxiliary positioning method

Also Published As

Publication number Publication date
CN115811961A (zh) 2023-03-17

Similar Documents

Publication Publication Date Title
WO2022027251A1 (fr) Three-dimensional display method and ultrasound imaging system
WO2019100212A1 (fr) Ultrasound system and ablation planning method
US8303502B2 (en) Method and apparatus for tracking points in an ultrasound image
US8102392B2 (en) Image processing/displaying apparatus having free moving control unit and limited moving control unit and method of controlling the same
US20170095226A1 (en) Ultrasonic diagnostic apparatus and medical image diagnostic apparatus
US20160174934A1 (en) Method and system for guided ultrasound image acquisition
US20100121189A1 (en) Systems and methods for image presentation for medical examination and interventional procedures
US20230103969A1 (en) Systems and methods for correlating regions of interest in multiple imaging modalities
US10755453B2 (en) Image processing apparatus, image processing method, and ultrasound imaging apparatus having image processing unit
US20080234570A1 (en) System For Guiding a Medical Instrument in a Patient Body
JP6615603B2 (ja) 医用画像診断装置および医用画像診断プログラム
WO2014200099A1 (fr) Ultrasonic diagnostic device
US20130257910A1 (en) Apparatus and method for lesion diagnosis
US20180360427A1 (en) Ultrasonic diagnostic apparatus and medical image processing apparatus
JP2018057428A (ja) 超音波診断装置および超音波診断支援プログラム
JP7171168B2 (ja) 医用画像診断装置及び医用画像処理装置
US20220036545A1 (en) Breast mapping and abnormality localization
KR20170086311A (ko) 의료 영상 장치 및 그 동작방법
CN111836584B (zh) Contrast-enhanced ultrasound imaging method, ultrasound imaging apparatus, and storage medium
US20230098305A1 (en) Systems and methods to produce tissue imaging biomarkers
KR20130110544A (ko) 초음파 영상 상에 의료용 기구를 표시하는 방법 및 장치
KR20160041803A (ko) 영상 처리 장치, 영상 처리 장치 제어 방법 및 의료 영상 장치
CN115998334A (zh) Display method for ablation effect and ultrasound imaging system
Zenbutsu et al. 3D ultrasound assisted laparoscopic liver surgery by visualization of blood vessels
JP7299100B2 (ja) 超音波診断装置及び超音波画像処理方法

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20948856

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20948856

Country of ref document: EP

Kind code of ref document: A1