CN116416383A - Dynamic model building method, simulation operation device, equipment and medium - Google Patents


Info

Publication number
CN116416383A
CN116416383A
Authority
CN
China
Prior art keywords
model
medical image
dynamic
image
dynamic model
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202310372301.6A
Other languages
Chinese (zh)
Inventor
Name withheld at the inventor's request
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Xinji Medical Robot Co ltd
Original Assignee
Shenzhen Xinji Medical Robot Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Xinji Medical Robot Co ltd
Priority to CN202310372301.6A
Publication of CN116416383A
Legal status: Pending


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 17/00 Three dimensional [3D] modelling, e.g. data description of 3D objects
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/30 Determination of transform parameters for the alignment of images, i.e. image registration
    • G PHYSICS
    • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H 50/00 ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics
    • G16H 50/50 ICT specially adapted for medical diagnosis, medical simulation or medical data mining, for simulation or modelling of medical disorders
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/10 Image acquisition modality
    • G06T 2207/10004 Still image; Photographic image
    • G06T 2207/10012 Stereo images

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Medical Informatics (AREA)
  • Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Databases & Information Systems (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Biomedical Technology (AREA)
  • Data Mining & Analysis (AREA)
  • Geometry (AREA)
  • Computer Graphics (AREA)
  • Pathology (AREA)
  • Software Systems (AREA)
  • Epidemiology (AREA)
  • General Health & Medical Sciences (AREA)
  • Primary Health Care (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The present application relates to a dynamic model building method, a simulation operation method, an apparatus, a computer device, a storage medium, and a computer program product. The method comprises the following steps: acquiring a first medical image and a second medical image of the target region before an operation; registering the first medical image and the second medical image to obtain a registration result; reconstructing the first medical image in the registration result to obtain a static model; processing the second medical image in the registration result to obtain a dynamic mathematical equation model with time as the independent variable; establishing a mapping relation between each dynamic mathematical equation model and the static model; and processing the static model according to the mapping relation to obtain a dynamic model that changes in real time. By adopting the method, a more accurate dynamic model can be generated.

Description

Dynamic model building method, simulation operation device, equipment and medium
Technical Field
The present application relates to the field of computer technology, and in particular, to a dynamic model building method, a simulation operation method, a device, a computer apparatus, a storage medium, and a computer program product.
Background
Transcatheter aortic valve replacement (TAVR) is a minimally invasive interventional aortic valve procedure that restores normal pumping function to the heart by placing a prosthetic valve at the diseased aortic valve.
Prior to a TAVR procedure, pre-operative screening is required, including pre-operative evaluation and protocol design. The pre-operative evaluation comprises a clinical evaluation and an imaging evaluation: the clinical evaluation mainly assesses the necessity and feasibility of a TAVR operation, while the imaging evaluation mainly supports pre-operative planning and the selection of a suitable valve model.
The imaging evaluation is the core of TAVR pre-operative evaluation. The existing imaging evaluation means uses multi-slice spiral computed tomography (MSCT, hereafter CT) to reconstruct tomographic images in three dimensions, so that the valve can be measured quantitatively in a three-dimensional environment, and a suitable prosthetic valve size is selected according to the quantitative measurement result.
However, the three-dimensional model reconstructed from the tomographic images is static, so simulations based on this static model are inaccurate.
Disclosure of Invention
Based on this, in view of the above technical problems, it is necessary to provide a dynamic model building method that yields a more accurate model, a simulation operation method, an apparatus, a computer device, a computer-readable storage medium and a computer program product.
In a first aspect, the present application provides a dynamic model building method, where the method includes:
acquiring a first medical image and a second medical image of the target region before an operation;
registering the first medical image and the second medical image to obtain a registration result;
reconstructing the first medical image in the registration result to obtain a static model;
processing the second medical image in the registration result to obtain a dynamic mathematical equation model taking time as an independent variable;
establishing a mapping relation between each dynamic mathematical equation model and the static model;
and processing the static model according to the mapping relation to obtain a dynamic model that changes in real time.
In a second aspect, the present application further provides a simulation operation method, the simulation operation method including:
acquiring a dynamic model generated according to the dynamic model building method;
acquiring a medical instrument model;
and controlling the medical instrument model in the dynamic model to perform operation simulation.
In a third aspect, the present application further provides a dynamic model building apparatus, where the apparatus includes:
a medical image acquisition module for acquiring a first medical image and a second medical image of the target region before an operation;
The registration module is used for registering the first medical image and the second medical image to obtain a registration result;
the reconstruction module is used for reconstructing the first medical image in the registration result to obtain a static model;
the model processing module is used for processing the second medical image in the registration result to obtain a dynamic mathematical equation model taking time as an independent variable;
the mapping relation establishing module is used for establishing the mapping relation between each dynamic mathematical equation model and the static model;
and the dynamic model generation module is used for processing the static model according to the mapping relation to obtain a dynamic model which changes in real time.
In a fourth aspect, the present application also provides a simulation operation device including:
the dynamic model acquisition module is used for acquiring the dynamic model generated by the dynamic model building device;
the medical instrument model acquisition module is used for acquiring a medical instrument model;
and the operation simulation module is used for controlling the medical instrument model in the dynamic model to perform operation simulation.
In a fifth aspect, the present application further provides a dynamic model building system, the system comprising an image processing device and an image acquisition device, the image processing device being in communication with the image acquisition device;
the image acquisition device is used for acquiring a first medical image and a second medical image of the target region before an operation;
the image processing device is used for realizing the dynamic model building method.
In a sixth aspect, the present application further provides a simulation operation system, where the simulation operation system includes a display device and the dynamic model building system described above;
the display device is used for acquiring the dynamic model generated based on the dynamic model building system and the medical instrument model, and displaying simulation operation performed by controlling the medical instrument model in the dynamic model.
In a seventh aspect, the present application also provides a computer-readable storage medium. The computer readable storage medium has stored thereon a computer program which, when executed by a processor, implements the steps of the method described above.
The dynamic model building method, the device, the computer equipment, the storage medium and the computer program product register the first medical image and the second medical image acquired before the operation. A static model is then built from the first medical image, a dynamic mathematical equation model with time as the independent variable is obtained from the second medical image, and the static model is corrected by this equation model to obtain a dynamic model. The resulting dynamic model is more accurate.
Drawings
FIG. 1 is a schematic diagram of a dynamic modeling system in one embodiment;
FIG. 2 is a schematic diagram of a simulated operating system in one embodiment;
FIG. 3 is a flow diagram of a dynamic model building method in one embodiment;
FIG. 4 is a schematic diagram of an application scenario of a dynamic model in one embodiment;
FIG. 5 is a schematic diagram of the creation and application of a dynamic model in one embodiment;
FIG. 6 is a flow diagram of a simulation operation method in one embodiment;
FIG. 7 is a schematic diagram of the components of a simulation operation system in one embodiment;
FIG. 8 is a schematic diagram of the components of a simulation operation system in another embodiment;
FIG. 9 is a diagram of data loading in a simulation operation system in one embodiment;
FIG. 10 is a schematic diagram of static model creation in a dynamic model creation method in one embodiment;
FIG. 11 is a schematic diagram of image registration in a dynamic model building method in one embodiment;
FIG. 12 is a schematic diagram of static model creation in a dynamic model creation method in one embodiment;
FIG. 13 is a schematic diagram of dynamic model creation in a dynamic model creation method in one embodiment;
FIG. 14 is a schematic view of image processing in a dynamic model building method in one embodiment;
FIG. 15 is a schematic diagram of dynamic mathematical equation modeling in a dynamic model building method in one embodiment;
FIG. 16 is a schematic diagram of a simulation operation method in one embodiment;
FIG. 17 is a schematic diagram of a simulation operation method in another embodiment;
FIG. 18 is a schematic diagram of a simulation operation method in another embodiment;
FIG. 19 is a schematic diagram of a simulation operation method in another embodiment;
FIG. 20 is an internal structural view of a computer device in one embodiment.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, the present application will be further described in detail with reference to the accompanying drawings and examples. It should be understood that the specific embodiments described herein are for purposes of illustration only and are not intended to limit the present application.
As shown in connection with fig. 1, in one embodiment of the present application, a dynamic model building system is provided that includes an image processing device 104 and an image acquisition device 102, the image processing device 104 being in communication with the image acquisition device 102. Wherein the image processing device 104 herein may be understood as a processor. Wherein the image processing device 104 will acquire the first medical image and the second medical image prior to the operation of the target region by means of the image acquisition device 102. The image processing device 104 registers the first medical image and the second medical image to obtain a registration result; reconstructing the first medical image in the registration result to obtain a static model; processing the second medical image in the registration result to obtain a dynamic mathematical equation model taking time as an independent variable; establishing a mapping relation between each dynamic mathematical equation model and a static model; and processing the static model according to the mapping relation to obtain a dynamic model with real-time change.
In one embodiment, the image acquisition device comprises an ultrasound device and a three-dimensional image acquisition device. A scan image of the target region may be acquired by the ultrasound device; optionally, the scan image comprises ultrasound data of the target region over at least one cardiac cycle. A scan image of the target region may also be acquired by the three-dimensional image acquisition device, this scan image being a static image of the target region. Optionally, the three-dimensional image acquisition device is a multi-slice spiral computed tomography (MSCT) device.
As shown in connection with FIG. 2, in one embodiment of the present application, a simulation operation system is provided that includes a display device 106 and a dynamic model building system 108. The display device 106 is configured to obtain the dynamic model and the medical instrument model generated by the dynamic model building system 108, so as to perform operation simulation by controlling the medical instrument model in the dynamic model. Alternatively, the display device 106 may be a general display device, a virtual reality device, an augmented reality device, or a mixed reality device, and may support on-site as well as remote display; the virtual reality device may be VR glasses.
In one embodiment, as shown in fig. 3, a dynamic model building method is provided, which is applied to the image processing apparatus in fig. 1. In this embodiment, the method includes the steps of:
Step 202, acquiring a first medical image and a second medical image of the target region before an operation.
Here the target region is the region to be operated on; for example, where the operative region is the heart, the target region may comprise the region of the aortic valve of the heart. The first medical image is data acquired by the CT device and can comprise conventional image data and/or CTA image data. Optionally, the first medical image is CT image data corresponding to the region from the femoral artery in the thigh to the carotid artery. The second medical image is 4D ultrasound image data acquired by the ultrasound equipment. Optionally, the second medical image is three-dimensional ultrasound data comprising at least one complete cardiac cycle acquired by the ultrasound device.
Optionally, before performing the operation, the image processing device acquires CT image data corresponding to the region from the femoral artery in the thigh to the carotid artery of the user, and also acquires three-dimensional ultrasound data of one complete cardiac cycle of the user.
And step 204, registering the first medical image and the second medical image to obtain a registration result.
The registration result is a first medical image set and a second medical image set that are located in the same coordinate system and whose feature point positions match each other.
Optionally, the image processing device performs feature registration on the first medical image and the second medical image through a feature point matching algorithm to obtain a first medical image set and a second medical image set which are located under the same coordinate system and have matched feature point positions, wherein the registration process can be regarded as a primary filtering process of the first medical image and the second medical image, and the first medical image and the second medical image which can be registered are respectively stored in the first medical image set and the second medical image set.
And 206, reconstructing the first medical image in the registration result to obtain a static model.
The static model here is a static heart model of the user. Specifically, a target in the first medical image is identified, the target is segmented from the first medical image, and then three-dimensional reconstruction is performed on the segmented target to obtain the static model. To improve efficiency, target identification may be restricted to the part of the first medical image covered by the registration result, without processing the non-registered part. That is, the portion to be reconstructed is extracted from the first medical image according to the registration result, then that portion is subjected to target segmentation, and finally the segmented target is subjected to three-dimensional reconstruction.
Optionally, the image processing device performs image segmentation and three-dimensional reconstruction on the first medical image in the registration result to obtain a static model. It should be noted that a static cardiac tissue has a unique name label, so that the static model generated in the present application may include a plurality of static models, and optionally, the static model includes at least one of three-dimensional models of an aorta, a vein, a pulmonary artery, a pericardium, a left atrium, a left ventricle, a right atrium, a right ventricle, a left atrioventricular valve, a right atrioventricular valve, a left aortic valve, a right aortic valve, and the like.
And step 208, processing the second medical image in the registration result to obtain a dynamic mathematical equation model taking time as an independent variable.
The dynamic mathematical equation models with time as the independent variable describe the motion of each cardiac tissue of the heart corresponding to the target region.
Optionally, the image processing device performs image segmentation and mathematical model construction on the second medical image in the registration result to obtain a dynamic mathematical equation model taking time as an independent variable.
Optionally, the image processing device identifies a target in the second medical image, segments the target from the second medical image, and then builds a mathematical model based on the time stamp sequence and the segmented target to obtain a dynamic mathematical equation model with time as the independent variable. To improve efficiency, target identification may be restricted to the part of the second medical image covered by the registration result, without processing the non-registered part. That is, the portion for which a mathematical model is to be constructed is first extracted from the second medical image according to the registration result, that portion is then subjected to target segmentation, and finally a mathematical model is constructed for the segmented target. For example, a mathematical equation is constructed for each extracted heart tissue according to its time stamps, yielding a dynamic equation model of each heart tissue with time as the independent variable; optionally, each dynamic mathematical equation model can be named by the heart tissue name.
Step 210, a mapping relationship between each dynamic mathematical equation model and the static model is established.
Optionally, the image processing device establishes a mapping relationship between each heart tissue dynamic mathematical equation model and a static heart model, and one static heart tissue model corresponds to one dynamic mathematical equation. The static heart model corresponds to each static heart tissue. In one alternative embodiment, the mapping relationship is established by the name of the static heart tissue model and the name of the dynamic mathematical equation model.
And 212, processing the static model according to the mapping relation to obtain a dynamic model which changes in real time.
Optionally, the image processing device performs matrix transformation operations such as rotation, translation and scaling on each static heart tissue model according to its corresponding dynamic mathematical equation, completing the construction of the dynamic heart model (a dynamic model that changes in real time). That is, the static model of each heart tissue can be animated according to its corresponding dynamic mathematical equation, realizing the conversion from a static model to a dynamic model.
In practical application, a simulated blood flow model can be constructed, so that the dynamic heart model and the simulated blood flow model are visually displayed, and visual simulation of heart dynamic change can be realized.
In the dynamic model building method, the first medical image and the second medical image before operation are registered, so that a static model is built based on the first medical image, a dynamic mathematical equation model taking time as an independent variable is obtained based on the second medical image, and the dynamic model is obtained by correcting the static model based on the dynamic mathematical equation model taking time as the independent variable, and the obtained dynamic model is more accurate.
In one embodiment, acquiring the first medical image and the second medical image prior to the operation of the target region comprises: acquiring three-dimensional image data of a target area as a first medical image by three-dimensional medical imaging equipment before operation; ultrasound data of the target region is acquired as a second medical image by the ultrasound imaging device prior to operation.
Wherein the three-dimensional medical imaging device is a CT device.
Optionally, before operating on the aortic valve, the image processing device acquires conventional image data or CTA image data covering the region from the femoral artery in the thigh to the carotid artery via the CT device as the first medical image. The image processing device acquires 4D ultrasound image data via the ultrasound imaging device before operating on the aortic valve and takes it as the second medical image. The 4D ultrasound data is three-dimensional ultrasound data comprising one complete cardiac cycle.
In the dynamic model building method, the acquired first medical image has high resolution, while the acquired second medical image is real-time and provides continuous three-dimensional images within a certain time window.
In one embodiment, registering the first medical image and the second medical image results in a registration result, comprising: performing feature registration on the first medical image and the second medical image to obtain a first registration relationship; and processing the first medical image or the second medical image based on the first registration relation to obtain a first target medical image and a second target medical image which are positioned under the same coordinate system and have the characteristic point positions matched with each other as registration results.
Optionally, the image processing device performs fusion registration of the 4D ultrasound image data (second medical image) and the CT image data (first medical image) through a feature point matching algorithm, so as to obtain a 4D ultrasound image set (second target medical image) and the CT image data (first target medical image) which are located under the same coordinate system and have matched feature point positions.
Optionally, the image processing device performs feature extraction on the 4D ultrasound images and generates feature descriptors corresponding to those features, obtaining the 4D ultrasound image feature descriptors. The image processing device likewise extracts features of the CT images and generates their feature descriptors, obtaining the CT image feature descriptors. The image processing device then performs feature matching according to the degree of similarity between the 4D ultrasound image feature descriptors and the CT image feature descriptors, and performs matrix transformation operations on the 4D ultrasound data and the CT image data respectively, obtaining a registered 4D ultrasound image set and a registered CT image data set. The matrix transformation comprises translation and rotation parameters.
In the dynamic model building method, the first medical image and the second medical image are registered to obtain the first target medical image and the second target medical image which are positioned under the same coordinate system and have the characteristic point positions matched, so that the registered first medical image and the registered second medical image are positioned under the same coordinate system, and the subsequent data processing is facilitated.
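The patent does not disclose the concrete registration algorithm. As an illustrative sketch only, once corresponding 3D feature points have been matched between the two modalities, the rigid transform (rotation plus translation) that brings them into the same coordinate system can be estimated with the classical Kabsch method; all names below are of our choosing, not from the patent:

```python
import numpy as np

def rigid_registration(src_pts, dst_pts):
    """Estimate rotation R and translation t mapping src_pts onto dst_pts
    from matched 3D feature points (Kabsch algorithm)."""
    src_c = src_pts.mean(axis=0)
    dst_c = dst_pts.mean(axis=0)
    H = (src_pts - src_c).T @ (dst_pts - dst_c)   # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    D = np.diag([1.0, 1.0, d])                    # guard against reflection
    R = Vt.T @ D @ U.T
    t = dst_c - R @ src_c
    return R, t

# Toy example: recover a known rotation and translation.
rng = np.random.default_rng(0)
pts = rng.random((20, 3))
angle = np.pi / 6
R_true = np.array([[np.cos(angle), -np.sin(angle), 0.0],
                   [np.sin(angle),  np.cos(angle), 0.0],
                   [0.0, 0.0, 1.0]])
t_true = np.array([1.0, -2.0, 0.5])
moved = pts @ R_true.T + t_true
R, t = rigid_registration(pts, moved)
assert np.allclose(R, R_true, atol=1e-8)
assert np.allclose(t, t_true, atol=1e-8)
```

In practice multimodal CT/ultrasound registration typically also needs descriptor matching and outlier rejection before this step; the sketch covers only the final transform estimation.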
In one embodiment, reconstructing the first medical image in the registration result to obtain a static model includes: dividing the registered first medical image to obtain a target object; and carrying out three-dimensional reconstruction on the target object to obtain a static model.
Wherein the target object is each cardiac tissue of the heart.
Optionally, as shown in fig. 12, the image processing device uses a neural network method (e.g., a U-Net) to perform image segmentation on the registered CT image dataset, obtaining a heart tissue model as a 3D image containing a plurality of label attributes. The image processing device then performs three-dimensional reconstruction on the heart tissue model based on a voxel-level surface reconstruction algorithm (Marching Cubes) to obtain a static heart model (static model).
In the dynamic model building method, a static model with high resolution is built.
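The segmentation output described above (a 3D image with per-tissue label attributes) can be split into one binary mask per cardiac tissue before meshing each mask (e.g. with Marching Cubes) into a named static sub-model. The label values and tissue names below are hypothetical; the patent does not specify them:

```python
import numpy as np

# Hypothetical label values for a segmented CT volume; the actual label map
# produced by the network is not specified in the patent text.
TISSUE_LABELS = {"Aorta": 1, "LeftAtrium": 2, "LeftVentricle": 3}

def split_tissues(label_volume, labels=TISSUE_LABELS):
    """Split a labelled 3D volume into one binary mask per cardiac tissue;
    each named mask can then be reconstructed into a static sub-model."""
    return {name: (label_volume == value) for name, value in labels.items()}

# Toy 3D volume: small voxel blocks standing in for two tissues.
vol = np.zeros((8, 8, 8), dtype=np.uint8)
vol[2:5, 2:5, 2:5] = 1            # 'aorta' block, 3x3x3 voxels
vol[5:7, 5:7, 5:7] = 2            # 'left atrium' block, 2x2x2 voxels
masks = split_tissues(vol)
assert masks["Aorta"].sum() == 27
assert masks["LeftAtrium"].sum() == 8
assert not masks["LeftVentricle"].any()
```

Keying the masks by tissue name also prepares the name-based mapping between static sub-models and dynamic equations used later in the method.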
In one embodiment, processing the second medical image in the registration results to obtain a dynamic mathematical equation model with time as an argument comprises: arranging the registered second medical images according to a time sequence, and carrying out image segmentation on the registered second medical images arranged according to the time sequence to obtain a target object corresponding to each registered second medical image; dividing a target object to obtain a plurality of parts; extracting characteristic points of each part; and constructing a dynamic mathematical equation model taking time as an independent variable of each part based on the positions and the corresponding time of the extracted characteristic points.
Optionally, the image processing device performs cardiac tissue identification segmentation on the registered 4D ultrasound image set according to the time stamp sequence, and extracts a multi-time sequence cardiac tissue ultrasound image set of pericardium, each chamber of heart and valve (each cardiac tissue). The image processing device constructs a mathematical equation for each extracted heart tissue according to the time stamp sequence to obtain a dynamic mathematical equation model taking time as an independent variable of each heart tissue.
Optionally, the image processing device traverses each frame of the 4D ultrasound image set and performs the following operations: (1) segment the frame using a U-Net-based method to obtain a three-dimensional image containing a plurality of heart tissue labels; (2) perform multi-label segmentation on that three-dimensional image to obtain the heart image model of the frame. Executing operations (1) and (2) on every frame finally yields a multi-time-sequence three-dimensional image set. The image processing device then extracts feature points of each picture along the cross section of the ultrasound image at each time stamp, constructs a mathematical equation from the positions and times of the extracted feature points, and optimizes the equation to obtain the heart tissue mathematical model (a dynamic mathematical equation model with time as the independent variable).
In the dynamic model building method, a dynamic mathematical equation model taking time as an independent variable is built, so that the beating of the heart can be dynamically represented.
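The patent leaves the form of the "mathematical equation with time as the independent variable" open. One plausible choice for periodic cardiac motion, shown here purely as an assumption, is a truncated Fourier series fitted by least squares to a time-stamped feature-point trajectory:

```python
import numpy as np

def design_matrix(t, period, n_harmonics=2):
    """Columns 1, cos(k w t), sin(k w t) for k = 1..n_harmonics."""
    cols = [np.ones_like(t)]
    for k in range(1, n_harmonics + 1):
        w = 2.0 * np.pi * k / period
        cols += [np.cos(w * t), np.sin(w * t)]
    return np.column_stack(cols)

def fit_periodic_motion(t, x, period, n_harmonics=2):
    """Least-squares fit of x(t) as a truncated Fourier series over one
    cardiac cycle: a dynamic equation with time as the independent variable."""
    A = design_matrix(t, period, n_harmonics)
    coef, *_ = np.linalg.lstsq(A, x, rcond=None)
    return coef

# Toy trajectory: a feature point oscillating once per 0.8 s cardiac cycle.
period = 0.8
t = np.linspace(0.0, period, 50, endpoint=False)
x = 3.0 + 0.5 * np.sin(2.0 * np.pi * t / period)
coef = fit_periodic_motion(t, x, period)
x_hat = design_matrix(t, period) @ coef
assert np.allclose(x_hat, x, atol=1e-8)
```

The fitted coefficients can then be evaluated at any time t, which is what allows the static model to be animated continuously rather than only at the sampled ultrasound time stamps.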
In one embodiment, establishing a mapping relationship between each dynamic mathematical equation model and the static model includes: acquiring a static sub-model of each part in the static model; and establishing a mapping relation between each dynamic mathematical equation model and each static submodel.
Optionally, the image processing device establishes the mapping relationship between each heart tissue dynamic mathematical equation model and the static heart model by name, one static heart tissue model corresponding to one dynamic mathematical equation. Each static heart tissue model has a unique name identification, such as "LeftAtrium" for the left atrium static model and "RightAtrium" for the right atrium static model. Each dynamic mathematical model has a unique name identification that can only be selected from the existing static heart tissue model names, for example "LeftAtrium" identifying the left atrium dynamic mathematical model and "RightAtrium" identifying the right atrium dynamic mathematical model. In this way the mapping relationship between each heart tissue dynamic mathematical equation model and the static heart model can be constructed from the dynamic mathematical model name and the static heart tissue name.
Matrix transformation operations such as rotation, translation and scaling are then performed on each static heart tissue model (static sub-model) according to its corresponding dynamic mathematical equation, completing the construction of the dynamic heart model. Optionally, the image processing device also builds a simulated blood flow model.
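Under the scheme above, each static sub-model's vertices would be re-posed at every time step using parameters produced by its dynamic equation. A minimal sketch, where the scale-rotate-translate order, the Z-axis rotation and the example parameter functions are assumptions for illustration:

```python
import numpy as np

def transform_submodel(vertices, t, scale_fn, rot_z_fn, trans_fn):
    """Apply scale -> rotate (about Z) -> translate to an (N, 3) vertex array,
    with each parameter supplied by the tissue's dynamic equation at time t."""
    s = scale_fn(t)
    theta = rot_z_fn(t)
    c, si = np.cos(theta), np.sin(theta)
    R = np.array([[c, -si, 0.0], [si, c, 0.0], [0.0, 0.0, 1.0]])
    return (vertices * s) @ R.T + np.asarray(trans_fn(t))

# Illustrative dynamic equations: pulsating scale, no rotation, fixed offset.
verts = np.array([[1.0, 0.0, 0.0], [0.0, 1.0, 0.0]])
moved = transform_submodel(
    verts, t=0.25,
    scale_fn=lambda t: 1.0 + 0.1 * np.sin(2 * np.pi * t),
    rot_z_fn=lambda t: 0.0,
    trans_fn=lambda t: (0.0, 0.0, 5.0),
)
```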
In the dynamic model building method, a dynamic heart model with a high degree of simulation is established from the mapping relationship between the static model and the dynamic model, combining the high resolution of the static model with the real-time variation of the dynamic mathematical model, thereby improving the simulation effect of the dynamic model.
In one embodiment, a simulation operation method is provided, applied to the display device of FIG. 2. In this embodiment, the method includes the following steps: acquiring a dynamic model generated according to the dynamic model building method; acquiring a medical instrument model; and performing operation simulation by controlling the medical instrument model in the dynamic model.
The medical instrument models include, but are not limited to, the medical instruments and medical consumables required to perform aortic valve replacement; for example, the medical instrument models may include TAVR surgical consumable models as well as conventional instrument models.
Optionally, the display device acquires a dynamic model generated according to a dynamic model building method; acquiring a medical instrument model; and controlling the medical instrument model in the dynamic model according to the instruction of the user to perform operation simulation of the aortic valve replacement. In the operation simulation method, the operation simulation is performed based on the dynamic model, so that the proficiency and the accuracy of the operation are improved.
In one embodiment, the operation simulation by controlling the medical instrument model in the dynamic model includes: the process of operating the simulation is visualized, the process of operating the simulation including the motion process of the dynamic model and the medical instrument model.
In the operation simulation method, the process of operation simulation is visualized, so that the whole course of the simulated operation can be observed.
In one embodiment, the medical device model is at least one of a guidewire model, a catheter model, a balloon model, and a valve model; operational simulation by controlling a medical instrument model in a dynamic model, comprising at least one of: controlling a guide wire model in a dynamic model to simulate guide wire puncture operation; controlling a guide wire model or a catheter model in the dynamic model to simulate a guide wire or catheter placing operation and/or an exiting operation; controlling a balloon model to simulate balloon pre-expansion operation in a dynamic model; controlling the balloon model in the dynamic model to simulate balloon dilation operation; and controlling the valve model in the dynamic model to simulate valve loading and placement operations.
In TAVR surgery, the key to successful guide wire puncture is to find a suitable vascular access and then smoothly puncture the guide wire into the blood vessel along the femoral artery. To help medical workers become familiar with the guide wire puncture operation in TAVR surgery as quickly as possible, a guide wire puncture simulation method as shown in fig. 16 is designed in combination with the operational requirements of threading the guide wire in TAVR surgery. Optionally, simulating a guide wire puncture operation by controlling the guide wire in the dynamic model includes: the display equipment acquires and visualizes a target puncture needle, and controls and displays the puncture of the target puncture needle into a blood vessel of the dynamic model according to a target instruction; the puncturing process includes at least one of collision detection, angle calculation, and bleeding simulation.
Wherein the process of simulating the operation is visualized, comprising at least one of: calculating the distance and the contact position between the guide wire model and the corresponding blood vessel of the dynamic model in the puncturing process, generating a collision detection result based on the distance and the contact position, and visualizing the collision detection result; calculating the topological relation and/or the angular relation of the blood vessels corresponding to the guide wire model or the catheter model and the dynamic model, and visualizing the topological relation and/or the angular relation; visualizing the length/speed information of the guide wire model or the catheter model entering or exiting the blood vessel corresponding to the dynamic model; calculating the topological relation between the balloon model and the corresponding blood vessel of the dynamic model, and visualizing the topological relation; and calculating a distance and a contact position between the valve model and a corresponding blood vessel of the dynamic model, generating a collision detection result based on the distance and the contact position, and visualizing the collision detection result.
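One plausible reading of the collision detection described above treats the vessel as a tube of constant radius around a centerline polyline; the tube model, radii and function names are simplifying assumptions for illustration:

```python
import math

def point_segment_distance(p, a, b):
    """Shortest distance from point p to segment ab (3-D tuples),
    plus the closest point on the segment."""
    ab = [b[i] - a[i] for i in range(3)]
    ap = [p[i] - a[i] for i in range(3)]
    denom = sum(c * c for c in ab)
    u = 0.0 if denom == 0 else max(0.0, min(1.0, sum(ap[i] * ab[i] for i in range(3)) / denom))
    closest = [a[i] + u * ab[i] for i in range(3)]
    return math.dist(p, closest), tuple(closest)

def check_wall_collision(tip, centerline, vessel_radius):
    """Flag a collision when the guide-wire tip leaves the vessel lumen,
    and report the distance and contact position on the centerline."""
    dist, contact = min(
        (point_segment_distance(tip, a, b) for a, b in zip(centerline, centerline[1:])),
        key=lambda r: r[0],
    )
    return {"collision": dist > vessel_radius, "distance": dist, "contact": contact}

centerline = [(0.0, 0.0, 0.0), (0.0, 0.0, 10.0)]
inside = check_wall_collision((1.0, 0.0, 5.0), centerline, vessel_radius=2.0)
outside = check_wall_collision((3.0, 0.0, 5.0), centerline, vessel_radius=2.0)
```

The resulting dictionary stands in for the "collision detection result" that the embodiment sends to the visualization module.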
For the convenience of understanding of those skilled in the art, several common simulation procedures are shown in connection with fig. 17 to 19, but those skilled in the art will recognize that the dynamic model of the present application may be applied not only in the simulation procedures listed below, but also in other simulation procedures and preoperative planning, etc., and is not limited thereto.
Alternatively, as shown in fig. 17, the guide wire puncture operation is simulated by controlling the guide wire in the dynamic model, comprising the following 6 steps:
step 1: the virtual reality device acquires the puncture needle and the guide wire model selected by the user, and highlights the selected puncture needle and guide wire model on a VR display connected with the virtual reality device.
Step 2: translating the guidewire model. When the user presses the puncture needle control button on the VR controller connected to the virtual reality device to translate the guide wire needle head to the body surface above the femoral artery, the virtual reality device performs collision detection, angle calculation and bleeding simulation in real time. Collision detection means that the virtual reality device solves, in real time, the distance and contact position between the puncture needle head and the blood vessel wall during puncture, and sends the generated collision detection result model to the visualization module for processing and visual display. Angle calculation means that the virtual reality device calculates, in real time, the included angle between the femoral artery blood vessel at the cross section where the guide wire needle is located and the transverse, sagittal and coronal planes of the body data, calculates the included angle between the guide wire needle and those planes, and sends the generated information to the visualization module for processing and visual display. Bleeding simulation means that the virtual reality device simulates the blood overflow caused by a vessel rupture due to improper operation of the puncture needle during guide wire puncture.
Step 3: fine-tuning the angle of the puncture needle. When the user presses the four puncture needle direction buttons on the VR controller connected to the virtual reality device, the posture of the puncture needle can be finely adjusted. When the angle between the puncture needle and the femoral artery is 20-30 degrees, proceed to step 4 and execute its operation; if during fine-tuning the position is found to be wrong, return to step 2 and execute its operation.
Step 4: pushing the puncture needle into the blood vessel. Pushing the puncture needle push rod on the VR controller forward pushes the puncture needle into the skin and blood vessel, and the puncture needle indicator lamp on the VR controller displays green; pulling the puncture needle push rod backwards withdraws the puncture needle from the skin and blood vessel, and the indicator lamp displays blue. While pushing the puncture needle into the blood vessel, if the needle head has not yet entered the blood vessel, the operations of steps 2 and 3 can still be carried out.
Step 5: the puncture was successful. When the puncture needle contacts the blood vessel, the contact point turns blue, and the puncture needle operating rod is pushed forward continuously to push the puncture needle into the blood vessel completely; after the puncture needle completely enters the blood vessel, the puncture stopping operation button is pressed to stop the puncture needle from entering.
Step 6: withdrawing the puncture needle. The puncture needle button is pressed to slowly withdraw the puncture needle from the femoral artery, leaving only the catheter sheathing the puncture needle in the femoral artery; the virtual reality device updates and displays the simulation effect in real time during withdrawal, finally completing the percutaneous puncture operation.
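The 20-30 degree gate in step 3 reduces to the angle between two direction vectors. A hedged sketch, with vector conventions and names assumed for illustration:

```python
import math

def needle_vessel_angle(needle_dir, vessel_dir):
    """Angle in degrees between the puncture-needle axis and the vessel axis."""
    dot = sum(n * v for n, v in zip(needle_dir, vessel_dir))
    nn = math.sqrt(sum(c * c for c in needle_dir))
    nv = math.sqrt(sum(c * c for c in vessel_dir))
    return math.degrees(math.acos(max(-1.0, min(1.0, dot / (nn * nv)))))

def ready_to_advance(needle_dir, vessel_dir, lo=20.0, hi=30.0):
    """Step 3 gate: advance to step 4 only when the angle lies in [lo, hi]."""
    return lo <= needle_vessel_angle(needle_dir, vessel_dir) <= hi

vessel = (0.0, 0.0, 1.0)  # femoral artery axis (assumed frame)
needle_ok = (0.0, math.sin(math.radians(25)), math.cos(math.radians(25)))
needle_bad = (0.0, math.sin(math.radians(45)), math.cos(math.radians(45)))
```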
Optionally, simulating a guidewire or catheter placement operation and/or exit operation by controlling the guide wire or catheter in the dynamic model includes: the virtual reality device acquires, in real time, the topological relation between the guide wire and/or catheter and the vessel wall and heart valve as the guide wire and/or catheter enters the blood vessel and valve of the dynamic model, generates a topological relation model, and displays it; calculates the included angle between the femoral artery blood vessel at the cross section of the dynamic model where the guide wire and/or catheter is located and the transverse, sagittal and coronal planes of the body data, and sends the angles to the visualization module for processing and visual display; and updates and displays, in real time, the visual effect of the guide wire and/or catheter entering or exiting the blood vessel of the dynamic model, together with the corresponding length, speed and other information.
Optionally, simulating a guidewire or catheter placement operation and/or an exit operation by controlling the guidewire or catheter in a dynamic model, comprising the following seven steps:
Step 1: starting guide wire pushing. The user presses the guide wire pushing button, the guide wire pushing indicator light on the VR controller displays red, and the virtual reality device sends the guide wire to the tail of the puncture catheter.
Step 2: pushing the guide wire. The user pushes the guide wire push rod forward, and the virtual reality device moves the guide wire slowly from the tail of the puncture catheter into the femoral artery blood vessel and along the aorta to a position close to the heart.
The user pulls the guide wire push rod backwards, and the guide wire moves in the opposite direction; the user can control the guide wire pushing speed by changing the angle between the guide wire push rod and the horizontal plane of the control console.
Step 3: the guidewire passes through the valve. 1) When the distance between the guide wire and the atrioventricular valve is a set warning distance, the guide wire push rod is reset immediately, and the guide wire stops moving; 2) When the atrioventricular valve shown on the VR display is in the diastolic state, immediately advancing the guidewire pushrod in order to allow the guidewire to successfully pass through the valve while the valve is in the diastolic state; 3) After the guide wire successfully passes through the valve, the guide wire pushing state indicator lamp controlled by the VR controller is green; 4) When the guide wire is positioned in the middle of the ventricle, the starting button on the push rod control console is reset, and the guide wire puncture and valve passing operation are completed.
Step 4: catheter pushing is initiated. The user presses the catheter push button and the catheter push indicator on the VR controller appears red and the virtual reality device sends the catheter to the tail of the puncture catheter.
Step 5: the catheter is pushed. The user manipulates the catheter pushrod on the VR controller and the virtual reality device controls the catheter to move along the guidewire in the vessel and to reach the near valve position in the same way as the guidewire pushing.
Step 6: the catheter passes through the valve. The user manipulates the catheter push rod on the VR controller in the same way as the guidewire push.
Step 7: withdrawing the guide wire. The user pulls the guide wire push rod backwards, and the virtual reality device controls the guide wire to withdraw slowly from the heart, the blood vessel and the body.
The principles of the above-described simulated guide wire or catheter placement operation and/or withdrawal operation include the following. Topological relation model solving: the topological relations among the guide wire/catheter, the vessel wall and the heart valve are used, in real time, to generate a topological relation model as the guide wire and catheter enter the vessel and valve, which is sent to the visualization module for processing and visual display. Angle calculation: the included angles between the femoral artery blood vessel at the cross section where the guide wire is located and the transverse, sagittal and coronal planes of the body data are calculated in real time, along with the included angles between the guide wire and those planes, and the generated information is sent to the visualization module for processing and visual display. Visualization: besides the above topological model and angle information, the visual effect of the guide wire and catheter entering/exiting the blood vessel and the corresponding length, speed and other information are updated and displayed in real time.
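The angle calculation with respect to the transverse, sagittal and coronal planes can be sketched as the line-plane angle arcsin(|d·n|/|d|) for each plane normal n; the axis conventions below are an assumption, since the text does not state the coordinate frame:

```python
import math

# Assumed unit normals of the three anatomical planes of the body data.
PLANES = {
    "transverse": (0.0, 0.0, 1.0),  # axial / cross section
    "sagittal": (1.0, 0.0, 0.0),
    "coronal": (0.0, 1.0, 0.0),
}

def line_plane_angles(direction):
    """Angle in degrees between a guide-wire direction and each body plane,
    computed as arcsin(|d . n| / |d|) for plane normal n."""
    mag = math.sqrt(sum(c * c for c in direction))
    out = {}
    for name, n in PLANES.items():
        dot = abs(sum(d * c for d, c in zip(direction, n)))
        out[name] = math.degrees(math.asin(min(1.0, dot / mag)))
    return out

angles = line_plane_angles((0.0, 0.0, 1.0))  # wire running head-to-foot
```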
Optionally, simulating balloon pre-inflation and/or inflation operations by controlling the balloon in a dynamic model, comprising: the balloon pre-inflation/balloon inflation procedure is simulated from the dynamic model and the target balloon.
Alternatively, as shown in fig. 18, the balloon pre-expansion and/or expansion operation is simulated by controlling the balloon in a dynamic model, comprising the following 6 steps:
step 1: the balloon dilation is initiated. The user presses a balloon dilation start button on the VR controller, a balloon dilation start indicator on the VR controller is displayed in red, the selected balloon model is highlighted on the VR display, and the virtual reality device translates the balloon to the tail end of the pigtail catheter.
Step 2: pushing the balloon into the vessel. The user pushes the balloon push rod on the VR console forward, the balloon movement indicator light displays green, and the virtual reality device slowly sends the balloon along the pigtail catheter into the aortic blood vessel. After the balloon enters the aortic blood vessel, the forward pushing state of the balloon push rod is maintained, and the balloon moves slowly within the aortic blood vessel to the position of the atrioventricular valve ring. When the balloon reaches the valve ring position, the balloon push rod is reset and the balloon movement indicator lamp displays blue, marking the end of balloon pushing.
Step 3: the balloon is inflated. The user presses the balloon expansion button on the VR control console, the balloon expansion indicator lamp is displayed green, the virtual reality equipment simulates the expansion of the balloon in the atrium, and information such as the contact surface condition and the distance between the balloon and the inner wall of the blood vessel is calculated in real time. When the distance between the balloon and the inner wall of the blood vessel is lower than the set warning distance, the balloon expansion indicator light is displayed in red.
Step 4: stopping balloon dilation. The user presses the balloon expansion button on the VR console again, the balloon expansion indicator turns off, the virtual reality device stops the balloon expansion, and information such as the state of the atrium wall being pressed by the balloon is visualized in real time.
Step 5: contracting the balloon. The user presses the balloon contraction button on the VR console, the balloon contraction indicator light displays green, and the virtual reality device contracts the expanded balloon, transmitting the contraction process to the VR display device for real-time visualization while calculating, in real time, information such as the contact surface condition and distance between the balloon and the vessel inner wall. When the balloon has contracted to fit closely against the catheter, the balloon contraction indicator light displays blue and the virtual reality device stops the balloon contraction.
Step 6: the balloon is withdrawn. The user presses a balloon exit start button on the VR controller and a balloon exit start indicator on the VR controller appears red. The user operates the balloon push rod on the VR control console backwards, the balloon movement indicator lamp is displayed green, and the virtual reality device slowly withdraws the balloon backwards from the atrium and the aortic blood vessel along the pigtail catheter; when the balloon returns to the original position, the balloon movement indicator light is displayed in blue; after the balloon returns to the original position, the balloon push rod is reset, the user presses the balloon exit start button on the VR controller again, and the virtual reality device closes all the indicator lights for balloon exit.
During the whole balloon placement, expansion, contraction, exit and other processes, the processor of the virtual reality device performs balloon visualization simulation and vessel wall expansion simulation in real time.
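The warning-distance logic used during balloon expansion (step 3) can be sketched with an idealized coaxial balloon and vessel; the radii, units and threshold values are illustrative assumptions:

```python
def balloon_status(balloon_radius, vessel_radius, warning_distance):
    """Gap between the balloon surface and the vessel inner wall, plus the
    red/green warning state described for the balloon-inflation step."""
    gap = vessel_radius - balloon_radius
    return {"gap": gap, "warning": gap < warning_distance}

# Inflate in 0.5 mm increments and watch for the warning light turning red.
states = [balloon_status(r / 2.0, vessel_radius=12.0, warning_distance=1.0)
          for r in range(0, 25)]
```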
Optionally, simulating valve loading and placement operations by controlling the valve in a dynamic model includes: simulating a prosthetic valve loading and placement process according to the dynamic model and the target valve, the placement process including a second collision detection.
Alternatively, as shown in fig. 19, the valve loading and placement operation is simulated by controlling the valve in a dynamic model, comprising the following 5 steps:
step 1: valve loading is initiated. The user presses the start valve loading button on the VR controller, the valve loading indicator is shown in red, the selected prosthetic valve model is highlighted on the VR display, and the model and size information of the valve is displayed in the upper left corner of the stereoscopic display, and the virtual reality device translates the prosthetic valve to the tail end of the pigtail catheter.
Step 2: pushing the valve into the vessel. The user pushes forward the valve movement push rod on the VR controller, the valve movement indicator light is displayed green, the virtual reality device slowly advances the prosthetic valve from the pigtail catheter into the femoral artery vessel, and when the prosthetic valve completely enters the femoral artery vessel, the valve movement indicator light is displayed yellow.
Step 3: pushing the valve to the heart valve site. The user continues to push the valve movement push rod on the VR controller forward, the valve movement indicator light remains yellow, and the virtual reality device slowly advances the prosthetic valve from the femoral artery blood vessel to the heart valve site. When the distance between the prosthetic valve and the heart valve reaches the set warning value, the valve movement indicator light displays red and the associated area is visually displayed in the set color.
Step 4: the valve stops moving. The user resets the valve movement push rod on the VR controller, the valve movement indicator light is displayed in blue, and the prosthetic valve in the virtual reality device stops moving.
Step 5: and (5) valve placement. The user presses a valve release button on the VR controller and the virtual reality device automatically releases the prosthetic valve at the current location.
In the whole process, the processor of the virtual reality equipment performs collision detection simulation in real time; when the valve loading is started to the valve stopping movement process, the system performs valve loading simulation in real time; during valve release, the system performs a valve release simulation; after valve release, the system automatically performs valve working simulation.
In a specific embodiment, the dynamic model established by the dynamic model building method has three main application scenarios, as shown in fig. 4: simulation verification of preoperative diagnosis and treatment schemes, surgical simulation training for clinicians, and TAVR anatomy classes in medical institutions. The preoperative diagnosis and treatment scheme simulation verification scenario includes: real-time dynamic heart model simulation (the dynamic model) and surgical process simulation (the simulation operation method) for TAVR preoperative planning. Combining the advantages of CT image data (the first medical image) and 4D ultrasound image data (the second medical image) for dynamic heart simulation of TAVR surgery overcomes the limitation of performing the TAVR surgical simulation process with CT image data or ultrasound image data alone, facilitates simulation verification of a TAVR surgical design scheme by medical staff, and improves the safety and reliability of surgical planning. The clinician surgical simulation training scenario includes: a simple, low-cost TAVR preoperative simulation device (the simulation operating system) that provides clinicians with a simulation training method and equipment close to a real TAVR operation. The TAVR preoperative simulation device comprises the dynamic model, the virtual reality equipment and the medical instrument model of the above embodiments.
The medical institution TAVR anatomy class scenario includes: a simulation system designed according to the TAVR surgical workflow and aligned with teaching materials, which meets the simulation requirements of TAVR surgery; clinical students in medical institutions can use the simulation operating system of the above embodiments for practical operation training, improving the standardization of actual TAVR surgical practice.
In a specific embodiment, as shown in fig. 5, a dynamic model building method includes: image registration fusion, image segmentation and three-dimensional reconstruction, and construction of a dynamic heart tissue model (the dynamic model). It should be noted that the image processing apparatus acquires the CT image data (first medical image) and the ultrasound image data (second medical image) before performing the step 1 image registration fusion. Based on the dynamic model, the TAVR preoperative simulation process is realized in combination with VR control equipment and devices. The TAVR preoperative simulation includes loading external models and surgical simulation. Loading external models includes loading conventional instrument models and TAVR consumable models. Conventional instrument models include, but are not limited to, three-dimensional surface models of scalpels, surgical scissors, forceps, sutures, and the like; TAVR consumable models include, but are not limited to, three-dimensional models of puncture needles, guide wires, catheters, balloons, prosthetic valves, and the like. After an external model is imported, parameters such as its center position, angular posture and size can be obtained, and dynamic change of the external model is achieved by updating one or more of these parameters: the center position is updated through the center coordinates of the external model, the angular posture through the rotation matrix of any of the three X, Y, Z axes, and the size through the scaling factor of any of the three X, Y, Z axes.
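The three external-model parameter groups described above (center coordinates, per-axis rotation matrix, per-axis scaling factors) might be updated like this; the model representation is a hypothetical sketch, rotating only about the Z axis for brevity:

```python
import math

def update_pose(model, center=None, rot_z_deg=None, scale=None):
    """Update any of the three parameter groups of an imported external model:
    center coordinates, one per-axis rotation matrix, per-axis scale factors."""
    if center is not None:
        model["center"] = tuple(center)
    if rot_z_deg is not None:
        t = math.radians(rot_z_deg)
        model["rot_z"] = ((math.cos(t), -math.sin(t), 0.0),
                          (math.sin(t), math.cos(t), 0.0),
                          (0.0, 0.0, 1.0))
    if scale is not None:
        model["scale"] = tuple(scale)
    return model

guidewire = {"name": "guidewire", "center": (0.0, 0.0, 0.0),
             "rot_z": None, "scale": (1.0, 1.0, 1.0)}
update_pose(guidewire, center=(5.0, 0.0, 2.0), rot_z_deg=90.0, scale=(1.0, 1.0, 2.0))
```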
In one particular embodiment, as shown in FIG. 6, a simulation operating system includes: an image processing unit, a VR controller and a VR display. VR controller: a human-machine interaction device for TAVR surgical simulation, composed of 5 parts: push rods, control buttons, indicator lights, a power supply and electronic elements; a schematic diagram of the VR controller is shown in fig. 7. The push rods consist of a puncture needle push rod, a guide wire push rod, a catheter push rod, a balloon push rod and a prosthetic valve push rod, used respectively to control the movement, movement speed and movement direction of the puncture needle, guide wire, catheter, balloon, prosthetic valve and other models during the TAVR surgical process. The control buttons comprise a plurality of buttons for simulating the actions of surgical instruments in the TAVR surgical process. The indicator lights correspond to the push rods and mark the advancing direction and other indication states of the push rods. The power supply provides continuous power for the VR controller. The electronic elements convert the commands of the push rods and control buttons into electrical signals and transmit them to the image processing unit for processing. Image processing unit (image processing apparatus): provides image processing and control of the TAVR surgical simulation, composed of 6 sub-modules: a data management module, a three-dimensional reconstruction module, a heart controller, a valve implantation simulator, a VR control module and a VR display module; each part executes an independent function, and together they complete the TAVR preoperative surgical simulation.
The data management module manages the patient image data and the generated heart models. The three-dimensional reconstruction module registers the heart CT image data and the 4D ultrasound image data, performs image segmentation on the registered heart CT image data, and finally performs three-dimensional reconstruction on the segmented multi-label heart model. The heart controller performs image recognition, extraction and mathematical modeling on the registered 4D ultrasound image data, extracts images of the heart valves and four chambers from the heart ultrasound data, constructs a heart mathematical equation model of the morphological changes of the heart valves and four chambers, and finally establishes the mapping relationship between the heart mathematical model and the three-dimensional model to construct a heart tissue model that changes in real time. The valve implantation simulator controls the overall process of simulating a TAVR procedure. The VR control module realizes the data communication transmission of human-machine interaction. The VR display module processes the images to be visualized. VR display: provides graphic and image visualization display devices for the simulation of the whole TAVR surgical process, including but not limited to a stereoscopic display, an MPR display and a VR helmet. The stereoscopic display provides a visual effect of the TAVR surgical simulation image with left-right parallax; the stereoscopic effect of the TAVR surgical simulation image can be fully displayed when 3D glasses are worn. The MPR display provides a visual display of multi-view images and is used together with the stereoscopic display, with the MPR display placed on the left side and the stereoscopic display configured on the right side. The VR helmet provides the stereoscopic display effect of the TAVR surgical simulation process on another device.
VR display: a visual display device for graphics and images of the whole TAVR surgical simulation process, composed of a display screen, a graphics card, a sensor and a power supply; its categories include but are not limited to a stereoscopic display, an MPR display and a VR helmet.
The display screen displays the final visualization; the graphics card accelerates the rendered image; the sensor receives external electrical signals; and the power supply provides power for the display.
In one particular embodiment, as shown in FIG. 9, the simulation operating system data loading and visualization includes the following. The data required by the simulation operating system includes CT image data (first medical image), heart ultrasound image data (second medical image), TAVR surgical consumable models (medical instrument models) and conventional instrument models (medical instrument models). The CT image data is obtained by CT equipment before the operation and may include conventional image data and CTA image data. The ultrasound image data is 4D ultrasound image data obtained by ultrasound equipment, containing three-dimensional ultrasound data of the heart and blood vessels over a complete heartbeat period. The TAVR surgical consumable models are guide wire, catheter and prosthetic valve models of different types, created with 3D modeling software in accordance with TAVR surgical specifications. The conventional instrument models are created with 3D modeling software at 1:1 scale with reference to the surgical instruments used in TAVR surgery. The virtual reality device uses the data management module to load and manage the three-dimensional images and surgical simulation models; data loading is shown in fig. 9. First, the TAVR surgical consumable models, conventional instrument models, patient CT image data and heart ultrasound image data are loaded, and the loaded heart ultrasound image data are preprocessed by sorting them in time order. If the loaded CT image data includes several different ranges of data for the patient, the system preprocesses the loaded CT image data, keeping only the large-range image data extending from the femoral artery to the carotid artery. The loaded three-dimensional data are then stored in a three-dimensional data pool.
And finally, sending the three-dimensional data to a VR display module for visual management, and determining whether to perform visualization on VR display equipment by the VR display module.
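The loading and preprocessing described above (time-ordering the 4D ultrasound frames and keeping only the widest-coverage CT series) can be sketched as follows. The frame and series types here are hypothetical stand-ins for the patent's data structures, not an actual implementation:

```python
from dataclasses import dataclass, field

@dataclass
class UltrasoundFrame:
    timestamp: float            # acquisition time within the heartbeat cycle (s)
    voxels: list = field(default_factory=list)  # placeholder for 3D voxel data

@dataclass
class CTSeries:
    description: str
    z_min_mm: float             # lower bound of the scan along the body axis
    z_max_mm: float             # upper bound of the scan along the body axis

def preprocess(frames, ct_series):
    """Sort 4D ultrasound frames by timestamp and keep only the CT series
    with the widest coverage (e.g. femoral artery to carotid artery)."""
    ordered = sorted(frames, key=lambda f: f.timestamp)
    widest = max(ct_series, key=lambda s: s.z_max_mm - s.z_min_mm)
    return ordered, widest
```

The ordered frames and the selected CT series would then be stored in the three-dimensional data pool.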
In a specific embodiment, as shown in fig. 10, the registration fusion and three-dimensional reconstruction of the CT image data (first medical image) and the ultrasound image data (second medical image) includes: selecting, from the three-dimensional data pool of the data management module, CT images and cardiac ultrasound image data (4D ultrasound image data) covering a complete heartbeat cycle; performing fusion registration of the 4D ultrasound image data and the CT image data in the three-dimensional reconstruction module through a feature point matching algorithm (the image registration fusion method is shown in fig. 11) to obtain CT image data (the first target medical image in the registration result) and a 4D ultrasound image set (the second target medical image in the registration result) that lie in the same coordinate system with mutually matched feature point positions; and returning the registered 4D ultrasound image set to the three-dimensional data pool to replace the original 4D ultrasound image data. Image segmentation and three-dimensional reconstruction are then performed on the registered CT image data (the image segmentation and three-dimensional reconstruction method is shown in fig. 12) to obtain a static heart model (static model) composed of three-dimensional models such as the aorta, veins, pulmonary artery, pericardium, left atrium, left ventricle, right atrium, right ventricle, left ventricular valve, right ventricular valve, left arterial valve, and right arterial valve. The static heart model is stored in the three-dimensional data pool module, completing the registration fusion and three-dimensional reconstruction of the image data.
In a specific embodiment, as shown in fig. 11, registering the first medical image and the second medical image to obtain a registration result includes: extracting features from the 4D ultrasound image (second medical image) and generating its feature descriptors to obtain 4D ultrasound image feature descriptors; extracting features from the CT image (first medical image) and generating its feature descriptors to obtain CT image feature descriptors; and performing feature matching according to the degree of similarity between the 4D ultrasound image feature descriptors and the CT image feature descriptors, then performing matrix transformation operations on the 4D ultrasound data and the CT image data respectively to obtain a registered CT image data set (first target medical image) and a registered 4D ultrasound image set (second target medical image). The matrix transformation includes translation and rotation operating parameters.
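The feature-point registration described here ultimately reduces to estimating a rigid transform (rotation plus translation) from matched descriptor pairs. Below is a minimal sketch of that last step using the standard Kabsch/Procrustes solution; the choice of algorithm is an assumption, since the patent does not name one:

```python
import numpy as np

def estimate_rigid_transform(src, dst):
    """Estimate rotation R and translation t mapping matched feature points
    src -> dst (Kabsch algorithm), as would be used to bring the 4D
    ultrasound set and the CT data into one coordinate system."""
    src = np.asarray(src, float)
    dst = np.asarray(dst, float)
    cs, cd = src.mean(axis=0), dst.mean(axis=0)
    # Cross-covariance of the centered point sets.
    H = (src - cs).T @ (dst - cd)
    U, _, Vt = np.linalg.svd(H)
    # Correct for a possible reflection so R is a proper rotation.
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    D = np.diag([1.0] * (src.shape[1] - 1) + [d])
    R = Vt.T @ D @ U.T
    t = cd - R @ cs
    return R, t
```

Applying `R @ p + t` to every source point then yields the two image sets in a common coordinate system with matched feature positions.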
In a specific embodiment, as shown in fig. 13, the establishment of the dynamic model includes: selecting the registered 4D ultrasound image set (second target medical image) from the three-dimensional data pool; performing cardiac tissue identification and segmentation on the acquired 4D ultrasound image set in timestamp order (the specific method is shown in fig. 14), and extracting multi-time-series cardiac tissue ultrasound image sets of the pericardium, each heart chamber, and the valves; constructing, for each extracted cardiac tissue, a mathematical equation in timestamp order to obtain, for each cardiac tissue, a dynamic mathematical equation model with time as the independent variable; extracting the static heart model from the three-dimensional data pool; establishing a mapping relation between each cardiac tissue dynamic mathematical equation model (dynamic mathematical equation model with time as the independent variable) and the static heart model (static model), where one static cardiac tissue model corresponds to one dynamic mathematical equation; sending an instruction to the heart simulator, which recognizes it and starts the simulation; and the heart controller performing matrix transformation operations such as rotation, translation, and scaling on each static cardiac tissue model according to that model and its corresponding dynamic mathematical equation, thereby completing the construction of the dynamic heart model while also constructing a simulated blood flow model.
The constructed dynamic heart model and the simulated blood flow model are transmitted to the VR display module for visualization processing. The heart controller then sequentially repeats the matrix transformation operations such as rotation, translation, and scaling of each static cardiac tissue model according to that model and its corresponding dynamic mathematical equation, completing the construction of the dynamic heart model while also constructing the simulated blood flow model, and the constructed dynamic heart model and simulated blood flow model are sent to the VR display module for visualization processing, thereby realizing the visual simulation of the heart's dynamic changes.
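One way to read the matrix-transformation step above is as a time-parameterized scale/rotate/translate applied to each static tissue mesh. The sketch below is illustrative only: the `dynamic_pose` equation is an invented toy cyclic motion, not the patent's fitted model:

```python
import numpy as np

def dynamic_pose(t, period=0.8):
    """Toy 'dynamic mathematical equation' evaluated at time t: returns a
    scale factor, a rotation angle (rad), and a translation for one cardiac
    tissue over a heartbeat of the given period. Invented for illustration."""
    phase = 2.0 * np.pi * (t % period) / period
    scale = 1.0 + 0.08 * np.sin(phase)          # systole/diastole volume change
    angle = 0.02 * np.sin(phase)                # slight cyclic twist
    shift = np.array([0.0, 0.0, 1.5 * np.sin(phase)])  # base-apex motion (mm)
    return scale, angle, shift

def transform_vertices(verts, t):
    """Apply the scale/rotate/translate of dynamic_pose(t) to the vertices
    of one static tissue mesh, yielding its pose at time t."""
    s, a, d = dynamic_pose(t)
    Rz = np.array([[np.cos(a), -np.sin(a), 0.0],
                   [np.sin(a),  np.cos(a), 0.0],
                   [0.0, 0.0, 1.0]])
    return s * (np.asarray(verts, float) @ Rz.T) + d
```

Evaluating `transform_vertices` for each tissue at successive timestamps and re-rendering the result is the loop that the heart controller repeats to animate the dynamic heart model.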
In a specific embodiment, as shown in fig. 14, cardiac tissue identification and segmentation based on the 4D ultrasound image set (processing of the second medical image in the registration result) includes traversing each frame of the 4D ultrasound image set (second target medical image) and performing the following operations: segmenting the frame based on the U-Net method to obtain a three-dimensional image containing a plurality of cardiac tissue labels; and performing multi-label segmentation on the three-dimensional image containing the plurality of cardiac tissue labels to obtain the cardiac image model of that frame. After these steps have been executed for every frame of the 4D ultrasound image set, the multi-time-series three-dimensional image set is finally obtained.
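The per-frame traversal can be sketched as below. A real system would call a trained U-Net at the segmentation step; the intensity-band labeler here is a stand-in used purely so that the traversal and the multi-label split are concrete:

```python
import numpy as np

def segment_frame(volume, thresholds):
    """Stand-in for the U-Net step: label each voxel by intensity band.
    thresholds maps label -> (lo, hi). Returns an int label volume
    (0 = background)."""
    labels = np.zeros(volume.shape, dtype=np.int32)
    for lab, (lo, hi) in thresholds.items():
        labels[(volume >= lo) & (volume < hi)] = lab
    return labels

def split_labels(label_volume):
    """Multi-label split: one binary mask per non-background tissue label."""
    return {int(l): (label_volume == l)
            for l in np.unique(label_volume) if l != 0}

def segment_sequence(frames, thresholds):
    """Traverse every frame of the 4D set in timestamp order, producing the
    multi-time-series set of per-tissue masks."""
    return [split_labels(segment_frame(f, thresholds)) for f in frames]
```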
In one specific embodiment, as shown in fig. 15, a mathematical equation (including a dynamic mathematical equation model with time as the independent variable) is constructed for each extracted cardiac tissue in timestamp order by the following method: extracting the feature points of each image along the cross section from the ultrasound image at each timestamp in the sequence; constructing a mathematical equation of position against time based on the extracted feature points; and optimizing the mathematical equation to obtain the cardiac tissue mathematical model.
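A natural concrete form for a "mathematical equation of position against time" over a periodic heartbeat is a truncated Fourier series fitted by least squares. This choice of equation family is an assumption; the patent leaves it open:

```python
import numpy as np

def fit_periodic_motion(times, positions, period, n_harmonics=2):
    """Least-squares fit of x(t) = a0 + sum_k (ak cos(kwt) + bk sin(kwt))
    for each coordinate of a tissue feature point over one heartbeat.
    Returns a callable model(t) -> positions."""
    t = np.asarray(times, float)
    w = 2.0 * np.pi / period
    cols = [np.ones_like(t)]
    for k in range(1, n_harmonics + 1):
        cols += [np.cos(k * w * t), np.sin(k * w * t)]
    A = np.stack(cols, axis=1)                       # design matrix
    coef, *_ = np.linalg.lstsq(A, np.asarray(positions, float), rcond=None)

    def model(tq):
        tq = np.asarray(tq, float)
        c = [np.ones_like(tq)]
        for k in range(1, n_harmonics + 1):
            c += [np.cos(k * w * tq), np.sin(k * w * tq)]
        return np.stack(c, axis=1) @ coef
    return model
```

Fitting one such model per tissue yields the per-tissue dynamic mathematical equation models that drive the static heart model.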
It should be understood that, although the steps in the flowcharts of the embodiments described above are shown sequentially as indicated by the arrows, these steps are not necessarily performed in that order. Unless explicitly stated herein, the steps are not strictly limited to the order shown and may be performed in other orders. Moreover, at least some of the steps in these flowcharts may include multiple sub-steps or stages, which are not necessarily performed at the same time but may be performed at different times; nor must these sub-steps or stages be performed sequentially, as they may be performed in turn or alternately with at least some of the other steps or sub-steps or stages.
Based on the same inventive concept, the embodiments of the application also provide a dynamic model building device and a simulation operation device for implementing the dynamic model building method and simulation operation method described above. The implementation of the solution provided by these devices is similar to that described for the methods above, so for the specific limitations in the embodiments of the dynamic model building device and the simulation operation device provided below, reference may be made to the limitations of the dynamic model building method and the simulation operation method above; details are not repeated here.
In one embodiment, there is provided a dynamic model building apparatus including:
a medical image acquisition module for acquiring a first medical image and a second medical image before the operation of the target region;
the registration module is used for registering the first medical image and the second medical image to obtain a registration result;
the reconstruction module is used for reconstructing the first medical image in the registration result to obtain a static model;
the model processing module is used for processing the second medical image in the registration result to obtain a dynamic mathematical equation model taking time as an independent variable;
the mapping relation establishing module is used for establishing the mapping relation between each dynamic mathematical equation model and the static model;
and the dynamic model generation module is used for processing the static model according to the mapping relation to obtain a dynamic model which changes in real time.
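The six modules above can be wired together as a simple pipeline. The class and stage names below are hypothetical; each stage is injected as a callable so that the data flow (images → registration → static model + equations → mapping → dynamic model) is explicit and testable:

```python
class DynamicModelBuilder:
    """Hypothetical wiring of the six modules of the dynamic model
    building apparatus; each stage is an injected callable."""

    def __init__(self, acquire, register, reconstruct, derive_equations,
                 map_parts, animate):
        self.acquire = acquire                  # medical image acquisition module
        self.register = register                # registration module
        self.reconstruct = reconstruct          # reconstruction module
        self.derive_equations = derive_equations  # model processing module
        self.map_parts = map_parts              # mapping relation establishing module
        self.animate = animate                  # dynamic model generation module

    def build(self, target_region):
        first, second = self.acquire(target_region)
        reg = self.register(first, second)
        static = self.reconstruct(reg["first"])
        equations = self.derive_equations(reg["second"])
        mapping = self.map_parts(equations, static)
        return self.animate(static, mapping)
```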
In one embodiment, the medical image acquisition module is further configured to acquire, by the three-dimensional medical imaging device, three-dimensional image data of the target area as the first medical image before the operation; and acquiring ultrasonic data of the target area as a second medical image by ultrasonic imaging equipment before operation.
In one embodiment, the registration module is further configured to perform feature registration on the first medical image and the second medical image to obtain a first registration relationship; and processing the first medical image or the second medical image based on the first registration relation to obtain a first target medical image and a second target medical image which are positioned under the same coordinate system and have the characteristic point positions matched with each other as registration results.
In one embodiment, the reconstructing module is further configured to segment the registered first medical image to obtain a target object; and carrying out three-dimensional reconstruction on the target object to obtain the static model.
In one embodiment, the model processing module is further configured to arrange the registered second medical images according to a time sequence, and perform image segmentation on the registered second medical images arranged according to the time sequence, so as to obtain a target object corresponding to each registered second medical image; dividing the target object to obtain a plurality of parts; extracting characteristic points of each part; and constructing a dynamic mathematical equation model taking time as an independent variable of each part based on the extracted position and the corresponding time of the characteristic point.
In one embodiment, the mapping relation establishing module is further configured to obtain a static sub-model of each part in the static model; and establishing a mapping relation between each dynamic mathematical equation model and each static submodel.
In one embodiment, there is provided a simulation operation apparatus including:
the dynamic model acquisition module is used for acquiring the dynamic model generated by the dynamic model building device;
the medical instrument model acquisition module is used for acquiring a medical instrument model;
and the operation simulation module is used for controlling the medical instrument model in the dynamic model to perform operation simulation.
In one embodiment, the operation simulation module is further configured to visualize an operation simulated process, where the operation simulated process includes a motion process of the dynamic model and the medical instrument model.
In one embodiment, the medical device model is at least one of a guidewire model, a catheter model, a balloon model, and a valve model; the above operation simulation module is further configured to perform an operation simulation by controlling the medical instrument model in the dynamic model by at least one of: controlling the guide wire model in the dynamic model to simulate guide wire puncture operation; controlling the guidewire model or the catheter model in the dynamic model to simulate a guidewire or catheter placement operation and/or an exit operation; controlling the balloon model in the dynamic model to simulate balloon pre-expansion operation; controlling the balloon model in the dynamic model to simulate a balloon dilation operation; and controlling the valve model in the dynamic model to simulate at least one of valve loading, valve placement operations, and valve operation.
In one embodiment, the operation simulation module is further configured to visualize the process of operation simulation by at least one of: calculating the distance and the contact position between the guide wire model and the corresponding blood vessel of the dynamic model in the puncturing process, generating a collision detection result based on the distance and the contact position, and visualizing the collision detection result; calculating the topological relation and/or the angular relation of the blood vessels corresponding to the guide wire model or the catheter model and the dynamic model, and visualizing the topological relation and/or the angular relation; visualizing the length/speed information of the guide wire model or the catheter model entering or exiting the blood vessel corresponding to the dynamic model; calculating the topological relation between the balloon model and the corresponding blood vessel of the dynamic model, and visualizing the topological relation; and calculating a distance and a contact position between the valve model and a corresponding blood vessel of the dynamic model, generating a collision detection result based on the distance and the contact position, and visualizing the collision detection result.
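The distance/contact-position collision check described for the guide wire and valve models can be sketched as a nearest-pair test between the instrument's sample points and the vessel wall point cloud. This is a brute-force version for illustration; a real system would likely use a spatial index such as a BVH or k-d tree:

```python
import numpy as np

def collision_check(instrument_pts, vessel_pts, clearance=0.5):
    """Nearest-pair collision test between an instrument polyline (e.g. a
    guide wire) and a vessel wall point cloud. Returns the minimum
    distance, the contact pair indices, and whether the clearance
    threshold is violated (i.e. a collision to visualize)."""
    g = np.asarray(instrument_pts, float)
    v = np.asarray(vessel_pts, float)
    # Pairwise distances between all instrument and vessel sample points.
    d = np.linalg.norm(g[:, None, :] - v[None, :, :], axis=2)
    i, j = np.unravel_index(np.argmin(d), d.shape)
    dist = float(d[i, j])
    return {"distance": dist,
            "instrument_idx": int(i),
            "vessel_idx": int(j),
            "collision": dist < clearance}
```

The returned contact pair gives the contact position to highlight, and the boolean flag drives the collision-detection visualization.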
The above-described dynamic model building apparatus and each module in the simulation operation apparatus may be implemented in whole or in part by software, hardware, or a combination thereof. The above modules may be embedded in hardware in, or independent of, a processor in the computer device, or may be stored in software form in a memory of the computer device so that the processor can call and execute the operations corresponding to the above modules. In one embodiment, a computer device is provided, which may be a terminal whose internal structure diagram may be as shown in fig. 20. The computer device includes a processor, a memory, an input/output interface, a communication interface, a display unit, and an input means. The processor, the memory, and the input/output interface are connected through a system bus, and the communication interface, the display unit, and the input device are connected to the system bus through the input/output interface. The processor of the computer device provides computing and control capabilities. The memory of the computer device includes a non-volatile storage medium and an internal memory. The non-volatile storage medium stores an operating system and a computer program, and the internal memory provides an environment for running the operating system and the computer program in the non-volatile storage medium. The input/output interface of the computer device is used to exchange information between the processor and external devices. The communication interface of the computer device is used for wired or wireless communication with an external terminal; the wireless mode may be implemented through WIFI, a mobile cellular network, NFC (near field communication), or other technologies. The computer program, when executed by the processor, implements a dynamic model building method.
The display unit of the computer device is used to form a visual picture and may be a display screen, a projection device, or a virtual reality imaging device. The display screen may be a liquid crystal display screen or an electronic ink display screen. The input device of the computer device may be a touch layer covering the display screen, a key, trackball, or touchpad provided on the housing of the computer device, or an external keyboard, touchpad, mouse, or the like.
It will be appreciated by those skilled in the art that the structure shown in fig. 20 is merely a block diagram of a portion of the structure associated with the present application and is not limiting of the computer device to which the present application applies, and that a particular computer device may include more or fewer components than shown, or may combine some of the components, or have a different arrangement of components.
In an embodiment, there is also provided a computer device comprising a memory and a processor, the memory having stored therein a computer program, the processor implementing the steps of the method embodiments described above when the computer program is executed.
In one embodiment, a computer-readable storage medium is provided, on which a computer program is stored which, when executed by a processor, carries out the steps of the method embodiments described above.
In an embodiment, a computer program product is provided, comprising a computer program which, when executed by a processor, implements the steps of the method embodiments described above.
Those skilled in the art will appreciate that implementing all or part of the above-described methods may be accomplished by a computer program stored on a non-transitory computer-readable storage medium, which, when executed, may include the steps of the embodiments of the methods described above. Any reference to memory, database, or other medium used in the various embodiments provided herein may include at least one of non-volatile and volatile memory. The non-volatile memory may include Read-Only Memory (ROM), magnetic tape, floppy disk, flash memory, optical memory, high-density embedded non-volatile memory, resistive random access memory (ReRAM), magnetoresistive random access memory (Magnetoresistive Random Access Memory, MRAM), ferroelectric memory (Ferroelectric Random Access Memory, FRAM), phase change memory (Phase Change Memory, PCM), graphene memory, and the like. Volatile memory can include random access memory (Random Access Memory, RAM), external cache memory, and the like. By way of illustration and not limitation, RAM is available in a variety of forms, such as static random access memory (Static Random Access Memory, SRAM) or dynamic random access memory (Dynamic Random Access Memory, DRAM). The databases referred to in the various embodiments provided herein may include at least one of relational and non-relational databases. The non-relational databases may include, but are not limited to, blockchain-based distributed databases and the like. The processors referred to in the embodiments provided herein may be, without limitation, general-purpose processors, central processing units, graphics processors, digital signal processors, programmable logic units, quantum-computing-based data processing logic units, and the like.
The technical features of the above embodiments may be arbitrarily combined, and all possible combinations of the technical features in the above embodiments are not described for brevity of description, however, as long as there is no contradiction between the combinations of the technical features, they should be considered as the scope of the description.
The above examples represent only a few embodiments of the present application; they are described in relative detail but are not to be construed as limiting the scope of the present application. It should be noted that various modifications and improvements can be made by those skilled in the art without departing from the spirit of the present application, and these all fall within the protection scope of the present application. Accordingly, the scope of protection of the present application shall be subject to the appended claims.

Claims (16)

1. A method for dynamic model creation, the method comprising:
acquiring a first medical image and a second medical image before the operation of the target area;
registering the first medical image and the second medical image to obtain a registration result;
reconstructing the first medical image in the registration result to obtain a static model;
processing the second medical image in the registration result to obtain a dynamic mathematical equation model taking time as an independent variable;
establishing a mapping relation between each dynamic mathematical equation model and the static model;
and processing the static model according to the mapping relation to obtain a dynamic model with real-time change.
2. The dynamic model building method according to claim 1, wherein the acquiring the first medical image and the second medical image before the operation of the target region includes:
acquiring three-dimensional image data of a target area as a first medical image by three-dimensional medical imaging equipment before operation;
and acquiring ultrasonic data of the target area as a second medical image by ultrasonic imaging equipment before operation.
3. The method of claim 1, wherein registering the first medical image and the second medical image to obtain a registration result comprises:
performing feature registration on the first medical image and the second medical image to obtain a first registration relationship;
and processing the first medical image or the second medical image based on the first registration relation to obtain a first target medical image and a second target medical image which are positioned under the same coordinate system and have the characteristic point positions matched with each other as registration results.
4. The method according to claim 1, wherein reconstructing the first medical image in the registration result to obtain a static model includes:
dividing the registered first medical image to obtain a target object;
and carrying out three-dimensional reconstruction on the target object to obtain the static model.
5. The method according to claim 4, wherein the processing the second medical image in the registration result to obtain a dynamic mathematical equation model with time as an argument comprises:
arranging the registered second medical images according to a time sequence, and carrying out image segmentation on the registered second medical images arranged according to the time sequence to obtain a target object corresponding to each registered second medical image;
dividing the target object to obtain a plurality of parts;
extracting characteristic points of each part;
and constructing a dynamic mathematical equation model taking time as an independent variable of each part based on the extracted position and the corresponding time of the characteristic point.
6. The method for building a dynamic model according to claim 5, wherein said building a mapping relationship between each of said dynamic mathematical equation models and said static model comprises:
acquiring a static sub-model of each part in the static model;
and establishing a mapping relation between each dynamic mathematical equation model and each static submodel.
7. A method of analog operation, the method comprising:
acquiring the dynamic model generated by the dynamic model building method according to any one of claims 1 to 6;
acquiring a medical instrument model;
and controlling the medical instrument model in the dynamic model to perform operation simulation.
8. The simulated operation method according to claim 7, wherein the operation simulation by controlling the medical device model in the dynamic model includes:
visualizing a process of operational simulation comprising a course of motion of the dynamic model and the medical instrument model.
9. The simulated operating method of claim 8, wherein the medical device model is at least one of a guidewire model, a catheter model, a balloon model, and a valve model; the operation simulation by controlling the medical instrument model in the dynamic model comprises at least one of the following:
controlling the guide wire model in the dynamic model to simulate guide wire puncture operation;
controlling the guidewire model or the catheter model in the dynamic model to simulate a guidewire or catheter placement operation and/or an exit operation;
controlling the balloon model in the dynamic model to simulate balloon pre-expansion operation;
controlling the balloon model in the dynamic model to simulate a balloon dilation operation; and
controlling the valve model in the dynamic model to simulate at least one of valve loading, valve placement operations, and valve operation.
10. The simulated operation method of claim 9, wherein visualizing the process of operating the simulation comprises at least one of:
calculating the distance and the contact position between the guide wire model and the corresponding blood vessel of the dynamic model in the puncturing process, generating a collision detection result based on the distance and the contact position, and visualizing the collision detection result;
calculating the topological relation and/or the angular relation of the blood vessels corresponding to the guide wire model or the catheter model and the dynamic model, and visualizing the topological relation and/or the angular relation;
visualizing the length/speed information of the guide wire model or the catheter model entering or exiting the blood vessel corresponding to the dynamic model;
Calculating the topological relation between the balloon model and the corresponding blood vessel of the dynamic model, and visualizing the topological relation; and
and calculating the distance and the contact position between the valve model and the corresponding blood vessel of the dynamic model, generating a collision detection result based on the distance and the contact position, and visualizing the collision detection result.
11. A dynamic model building apparatus, the apparatus comprising:
a medical image acquisition module for acquiring a first medical image and a second medical image before the operation of the target region;
the registration module is used for registering the first medical image and the second medical image to obtain a registration result;
the reconstruction module is used for reconstructing the first medical image in the registration result to obtain a static model;
the model processing module is used for processing the second medical image in the registration result to obtain a dynamic mathematical equation model taking time as an independent variable;
the mapping relation establishing module is used for establishing the mapping relation between each dynamic mathematical equation model and the static model;
and the dynamic model generation module is used for processing the static model according to the mapping relation to obtain a dynamic model which changes in real time.
12. An analog operation device, characterized in that the analog operation device comprises:
a dynamic model acquisition module for acquiring the dynamic model generated by the dynamic model building apparatus according to claim 11;
the medical instrument model acquisition module is used for acquiring a medical instrument model;
and the operation simulation module is used for controlling the medical instrument model in the dynamic model to perform operation simulation.
13. A dynamic model building system, characterized in that the system comprises an image processing device and an image acquisition device, wherein the image processing device is communicated with the image acquisition device;
the image acquisition device is used for acquiring a first medical image and a second medical image before the operation of the target area;
the image processing apparatus is configured to implement the dynamic model building method according to any one of claims 1 to 6.
14. The system of claim 13, wherein the image acquisition device comprises an ultrasound device and a three-dimensional image acquisition device.
15. A simulated operating system comprising a display device and the dynamic model building system of claim 13 or 14;
the display device is configured to acquire the dynamic model generated based on the dynamic model building system according to claim 13 or 14 and the medical instrument model, and to display the simulation operation performed by controlling the medical instrument model in the dynamic model.
16. A computer readable storage medium, on which a computer program is stored, characterized in that the computer program, when being executed by a processor, implements the steps of the method of any of claims 1 to 6 or 7 to 9.
CN202310372301.6A 2023-04-04 2023-04-04 Dynamic model building method, simulation operation device, equipment and medium Pending CN116416383A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310372301.6A CN116416383A (en) 2023-04-04 2023-04-04 Dynamic model building method, simulation operation device, equipment and medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202310372301.6A CN116416383A (en) 2023-04-04 2023-04-04 Dynamic model building method, simulation operation device, equipment and medium

Publications (1)

Publication Number Publication Date
CN116416383A 2023-07-11

Family

ID=87059207

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310372301.6A Pending CN116416383A (en) 2023-04-04 2023-04-04 Dynamic model building method, simulation operation device, equipment and medium

Country Status (1)

Country Link
CN (1) CN116416383A (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117593470A * 2024-01-18 2024-02-23 深圳奥雅设计股份有限公司 Street view reconstruction method and system based on AI model
CN117593470B * 2024-01-18 2024-04-02 深圳奥雅设计股份有限公司 Street view reconstruction method and system based on AI model
CN117934230A * 2024-03-21 2024-04-26 中山市人民医院 VR-based ECMO guide wire insertion operation training method

Similar Documents

Publication Publication Date Title
ES2704080T3 (en) Biometric simulation device, method to control the biometric simulation device, and program to control the biometric simulation device
US11183296B1 (en) Method and apparatus for simulated contrast for CT and MRI examinations
CN105992996B (en) Dynamic and interactive navigation in surgical environment
CN116416383A (en) Dynamic model building method, simulation operation device, equipment and medium
RU2627147C2 (en) Real-time display of vasculature views for optimal device navigation
CN107067398B (en) Completion method and device for missing blood vessels in three-dimensional medical model
CN107296650A (en) Intelligent operation accessory system based on virtual reality and augmented reality
CN114903591A (en) Virtual reality or augmented reality visualization of 3D medical images
US20040009459A1 (en) Simulation system for medical procedures
CN104272348B (en) For the imaging device and method being imaged to object
CN101887487B (en) Model generator for cardiological diseases
CN115005981A (en) Surgical path planning method, system, equipment, medium and surgical operation system
US11660142B2 (en) Method for generating surgical simulation information and program
US11672603B2 (en) System for patient-specific intervention planning
US11017531B2 (en) Shell-constrained localization of vasculature
CN115105207A (en) Operation holographic navigation method and system based on mixed reality
CN112967786B (en) Construction method and system of anatomical navigation based on multimode image and interactive equipment
CN113100935A (en) Preoperative puncture path planning method and training system for lung puncture operation
CN101477706A (en) Simulated operation planning method based on medical image
CN113017832A (en) Puncture surgery simulation method based on virtual reality technology
CN113302660A (en) Method for visualizing dynamic anatomical structures
CN114831729A (en) Left auricle plugging simulation system for ultrasonic cardiogram and CT multi-mode image fusion
KR20190088419A (en) Program and method for generating surgical simulation information
Jia et al. A hybrid catheter localisation framework in echocardiography based on electromagnetic tracking and deep learning segmentation
CN202815838U (en) Ablation treatment image guiding device with image three dimensional processing apparatus

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination