CN113303905B - Interventional operation simulation method based on video image feedback - Google Patents

Interventional operation simulation method based on video image feedback

Info

Publication number
CN113303905B
CN113303905B (application number CN202110577825.XA)
Authority
CN
China
Prior art keywords
ablation
simulation
image
weighting
video
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202110577825.XA
Other languages
Chinese (zh)
Other versions
CN113303905A (en
Inventor
肖煜东
刘军
冷浩群
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Second Xiangya Hospital of Central South University
Original Assignee
Second Xiangya Hospital of Central South University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Second Xiangya Hospital of Central South University filed Critical Second Xiangya Hospital of Central South University
Priority to CN202110577825.XA priority Critical patent/CN113303905B/en
Publication of CN113303905A publication Critical patent/CN113303905A/en
Application granted granted Critical
Publication of CN113303905B publication Critical patent/CN113303905B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/10Computer-aided planning, simulation or modelling of surgical operations
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09BEDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B23/00Models for scientific, medical, or mathematical purposes, e.g. full-sized devices for demonstration purposes
    • G09B23/28Models for scientific, medical, or mathematical purposes, e.g. full-sized devices for demonstration purposes for medicine
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/10Computer-aided planning, simulation or modelling of surgical operations
    • A61B2034/101Computer-aided simulation of surgical operations

Landscapes

  • Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • Surgery (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Chemical & Material Sciences (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Animal Behavior & Ethology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Biomedical Technology (AREA)
  • Medicinal Chemistry (AREA)
  • Robotics (AREA)
  • Algebra (AREA)
  • Computational Mathematics (AREA)
  • Molecular Biology (AREA)
  • Mathematical Analysis (AREA)
  • Mathematical Optimization (AREA)
  • Mathematical Physics (AREA)
  • Pure & Applied Mathematics (AREA)
  • Business, Economics & Management (AREA)
  • Educational Administration (AREA)
  • Educational Technology (AREA)
  • Theoretical Computer Science (AREA)
  • Instructional Devices (AREA)
  • Surgical Instruments (AREA)

Abstract

The invention provides an interventional operation simulation method based on video image feedback, comprising the following steps: (1) constructing a human body simulation model that includes at least a simulated tissue layer and a simulated organ, at least the simulated organ being made of a transparent material; (2) arranging a video monitoring device for the ablation process on one side of the simulated organ, the video monitoring device being capable of video image acquisition in at least two wave bands; (3) during a surgical simulation with the ablation device, signalling the video monitoring device, via an ablation switch of the ablation device, to acquire video in at least one of the two acquisition bands.

Description

Interventional operation simulation method based on video image feedback
Technical field:
The invention relates to the technical field of medical auxiliary training, and in particular to an interventional operation simulation method based on video image feedback.
Background art:
Chronic viral hepatitis and cirrhosis can both eventually progress to primary liver cancer, for which surgical resection is the only radical treatment. In recent years, with the development of multidisciplinary techniques, ablation therapy has become one of the important options for treating liver cancer. For early-stage, isolated liver cancer, especially in patients in good physical condition, surgery remains the preferred approach; ablation therapy should be considered when a patient cannot tolerate surgery or when tumors are multiple and surgical resection is difficult. Ablation is a non-vascular, minimally invasive treatment that eradicates or substantially destroys tumors chemically or with thermal energy, and for some liver cancers it achieves results comparable to surgical resection and liver transplantation. The most common clinical applications are radiofrequency ablation and microwave ablation, which are suitable for tumors with a maximum diameter of no more than 3 cm. Ablation is also suitable for patients who relapse after liver cancer resection, whose liver function is unsuitable for surgical resection, or whose tumor site is particularly difficult to remove.
Percutaneous tumor ablation induces tumor cell necrosis and inactivates local tumor tissue by percutaneous puncture combined with chemical ablation, thermal ablation or cryoablation under the guidance of medical imaging equipment such as ultrasound, CT or magnetic resonance imaging. During treatment, the imaging equipment provides surgical navigation so that the tumor is accurately located and destroyed while organ and tissue function is protected as far as possible; the procedure therefore features a small wound, good curative effect, a short recovery period and mild complications. For liver cancers with a tumor diameter of less than 3 cm, ablation has a well-established curative effect and a small wound, requires no laparotomy, and avoids the trauma of major surgery; postoperative recovery is fast, patients can be discharged 1 to 2 days after the procedure, and the impact on quality of life is small; safety is high, and the incidence of postoperative complications is lower than that of open abdominal surgery.
Although ablation has many advantages, it is challenging for physicians: because ablation cannot be performed under direct naked-eye vision, an auxiliary visualization tool is needed, and there is no systematic practice tool for ablation procedures. Some high-end manufacturers have introduced virtual training systems, but such systems ultimately differ from real objects: they cannot provide a real operating object, a real hands-on experience or real feedback, and compared with practical practice they do little to exercise qualities such as the trainee's composure. Moreover, virtual training systems usually require various motion-sensing devices, sensors and the like, especially equipment supplied by foreign medical device manufacturers, and are therefore expensive and hard to popularize.
In practice, doctors often have to accumulate experience by practising directly in the clinic, which not only affects patients' confidence in their doctors but also easily leads to doctor-patient disputes once clinical problems or recurrences occur.
Therefore, a systematic ablation-operation training method would effectively improve doctors' operating skills and allow them to accumulate sufficient experience before clinical operations.
Summary of the invention:
In view of the above problems in the prior art, the present invention aims to provide a training method for microwave (or radiofrequency, etc.) ablation procedures that is based on real objects and provides direct video feedback. For image processing, the invention uses coaxial imaging, overflow correction, thermal-conductivity correction and similar means, so that the simulated environment approximates the real environment as closely as possible.
In one aspect, the present invention provides an interventional operation simulation method based on video image feedback (which may also be regarded as a method for constructing an operation simulation environment), the method comprising:
(1) constructing a human body simulation model that includes at least a simulated tissue layer and a simulated organ, at least the simulated organ being made of a transparent material;
(2) arranging a video monitoring device for the ablation process on one side of the simulated organ, the video monitoring device being capable of video image acquisition in at least two wave bands;
(3) during a surgical simulation with the ablation device, signalling the video monitoring device, via an ablation switch of the ablation device, to acquire video in at least one of the two acquisition bands.
Preferably, the first of the two bands is a visible light band or a shorter-wavelength light band, and the second band is an infrared band.
Preferably, the simulation process includes a puncturing stage and an ablation stage. The ablation switch of the ablation device is communicatively connected to the video monitoring device for mode switching: the puncturing stage uses the first band for video acquisition, while the ablation stage uses the second band or acquires a fused image of the first and second bands, and the fused image is presented to the user.
Preferably, the fused image is formed as follows:
(3.1) extracting the RGB values of the visible light image and weighting each of them, the weighting coefficient of each value being no more than 45% or 50%;
(3.2) acquiring an infrared image with an infrared camera and weighting its RGB values, the weighting coefficients being no less than 45% or 50%;
(3.3) fusing each frame of the infrared image with the visible light image pixel by pixel according to the synchronization signals, adding the RGB values of the two images channel by channel.
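As an illustration of steps (3.1) to (3.3), the following Python/NumPy sketch performs the weighted, pixel-wise fusion of one synchronized visible/infrared frame pair. The function name, the particular coefficients (0.45 for the visible channels, 0.55 for the infrared channels) and the final clipping to [0, 255] are assumptions chosen for demonstration, not values prescribed by the invention.

```python
import numpy as np

def fuse_frames(visible_rgb, infrared_rgb,
                w_vis=(0.45, 0.45, 0.45), w_ir=(0.55, 0.55, 0.55)):
    """Pixel-wise weighted fusion of a synchronized visible/infrared frame pair.

    Both inputs are HxWx3 uint8 arrays of the same resolution; the frames are
    assumed to have been captured on the same synchronization signal (step 3.3).
    """
    vis = visible_rgb.astype(np.float32) * np.asarray(w_vis, np.float32)   # step (3.1)
    ir = infrared_rgb.astype(np.float32) * np.asarray(w_ir, np.float32)    # step (3.2)
    fused = vis + ir                                                       # step (3.3)
    return np.clip(fused, 0, 255).astype(np.uint8)
```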
Preferably, the R_fused, G_fused and B_fused values of each pixel in the fused image are compared with corresponding overflow thresholds to determine the proportion of overflowing pixels among all pixels; when the proportion of overflowing pixels exceeds a preset threshold for any one of the R, G or B channels, the pixel values of that channel are multiplied by an anti-overflow coefficient, which is:
[Anti-overflow coefficient formula, reproduced only as an image in the original publication; it is expressed in terms of the overflow threshold Q, the average pixel value A and the overflow ratio Y.]
where Q is the overflow threshold, A is the average pixel value, and Y is the overflow ratio. Multiplying by this coefficient reduces the pixel values of the affected channel; the higher the overflow ratio, the larger the reduction, which effectively avoids overflow. The overflow ratio is then computed again iteratively; if overflow still occurs, the coefficient is reduced by 0.05-0.1 (i.e. 5%-10%) each time. If the overflow ratio still exceeds the limit after three reductions, the frame is judged abnormal and discarded, and the pixel value is replaced by extrapolating the arithmetic progression formed by the corresponding pixel in the previous three frames. For example, with a pixel value range of [0, 255], if a pixel value is abnormal and its G values in the previous three frames were 172, 176 and 184, the average frame-to-frame difference is 6 and the current value is set to 190. This preserves the trend of the image while avoiding excessive distortion.
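The overflow handling just described can be sketched as follows. Since the exact anti-overflow formula is published only as an image, anti_overflow_coefficient below is merely a placeholder that reproduces the stated behaviour (the coefficient shrinks as the overflow ratio grows); the numeric thresholds, the helper names and the fixed 0.05 reduction step are illustrative assumptions.

```python
import numpy as np

OVERFLOW_THRESHOLD_Q = 250      # assumed per-channel overflow threshold
OVERFLOW_RATIO_LIMIT = 0.02     # assumed limit on the share of overflowing pixels

def anti_overflow_coefficient(q, a, y):
    """Placeholder for the patented coefficient f(Q, A, Y): any value in (0, 1] that
    decreases as the overflow ratio y grows reproduces the described behaviour."""
    return min(1.0, (q / max(a, 1e-6)) * (1.0 - y))

def suppress_overflow(channel, prev_frames):
    """Iterative overflow suppression for one channel of the fused image.

    `prev_frames` must hold at least the three previous frames of this channel,
    used as a fallback when the current frame is discarded."""
    ch = channel.astype(np.float32)
    k = anti_overflow_coefficient(OVERFLOW_THRESHOLD_Q, float(ch.mean()),
                                  float((ch > OVERFLOW_THRESHOLD_Q).mean()))
    for _ in range(3):                                  # at most three reductions
        if float((ch * k > OVERFLOW_THRESHOLD_Q).mean()) <= OVERFLOW_RATIO_LIMIT:
            return np.clip(ch * k, 0, 255).astype(np.uint8)
        k -= 0.05                                       # reduce by 0.05-0.1 each time
    # Still overflowing after three reductions: discard the frame and extrapolate the
    # arithmetic progression of the previous three frames (e.g. 172, 176, 184 -> 190).
    f1, _, f3 = (f.astype(np.float32) for f in prev_frames[-3:])
    return np.clip(f3 + (f3 - f1) / 2.0, 0, 255).astype(np.uint8)
```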
Preferably, the method further comprises: establishing a mapping between pixels of the ablation monitoring device and temperature, and, based on this mapping, highlighting or marking specific temperature values of interest to the physician during the ablation process.
Preferably, the method further comprises: measuring the thermal conductivity of the simulated-organ material and of the animal liver material, and weighting the infrared image according to the materials before fusion, the weighting being as follows:
[Material-correction formula, reproduced only as an image in the original publication; it relates the corrected pixel value P' to P using the thermal conductivities, specific heats and densities of the animal tissue and of the simulation material.]
where P' is the corrected pixel value, P is the pixel value before correction, λ1 and λ2 are the thermal conductivity of the tested animal tissue and of the simulation material, c1 and c2 are the specific heat of the tested animal tissue and of the simulation material, and ρ1 and ρ2 are the density of the tested animal tissue (e.g. liver) and of the simulation material, respectively.
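Because the correction formula itself appears only as an image, the sketch below merely illustrates one plausible reading of it: the infrared pixel value is rescaled by the ratio of the thermal diffusivities α = λ/(ρ·c) of the animal tissue and of the simulation material. This is an assumption for illustration, not the formula of the invention.

```python
def thermal_correction(p, lam_tissue, c_tissue, rho_tissue, lam_sim, c_sim, rho_sim):
    """Rescale an infrared pixel value P for the different thermal behaviour of the
    phantom material, assuming the correction is a ratio of thermal diffusivities."""
    alpha_tissue = lam_tissue / (rho_tissue * c_tissue)   # tested animal tissue (e.g. liver)
    alpha_sim = lam_sim / (rho_sim * c_sim)               # simulation material (e.g. silica gel)
    return p * (alpha_tissue / alpha_sim)
```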
Preferably, the method further comprises repairing the punctured simulated tissue layer with a repair patch, and removing and replacing the simulated organ by detaching the separable upper-body and lower-body models.
Preferably, the video monitoring device collects images of the different wave bands by coaxial, beam-splitting acquisition.
In another aspect, the present invention provides an operation training system for minimally invasive ablation procedures, the training system comprising: a human simulation model, an ablation device, an operating table, an ablation monitoring device and an ablation needle state display device. The human simulation model is arranged on the operating table and includes at least an upper body model; the ablation monitoring device is arranged on the visceral side of the upper body model, inside which there are simulated organs made of a transparent material. The ablation monitoring device includes a visible light imaging device and an infrared thermal imaging device, both directed at the target organ; together with the ablation needle state display device, the camera device and the thermal imager are used to deliver different types of images based on the control signals of the ablation device.
With this operation training simulation method, the user can practice the procedure repeatedly. The video-based feedback of insertion depth lets the user learn the correspondence between hand movements and the advance of the puncture needle, making it easier to control the amplitude of the movements, to understand the instrument motion they cause, and to develop a better sense of control. By triggering the ablation switch, the display switches between the visible image and the infrared image of the target organ, or between the visible image and the fused image, so that the ablation time and the temperature change in the ablation area are displayed visually and the user better understands the effect of the operation on the target tumor. Existing approaches usually distinguish the ablation state only through color change, which cannot accurately reflect the ablation temperature; through thermal-conductivity correction, temperature-sensor measurement and mapping, the temperature change caused by ablation can be reflected effectively and realistically.
Description of the drawings:
FIG. 1 is a schematic flow chart of the operation training method of the present invention.
FIG. 2 is a schematic diagram of the arrangement of the various devices in practice of the training method of the present invention.
Fig. 3 is a schematic view of the optical path structure of the ablation monitoring apparatus used in the present invention.
Fig. 4 is a schematic structural view of a repair patch used in the present invention.
FIG. 5 is a schematic view of a preferred simulated tissue layer and inner substrate construction.
Detailed description of embodiments:
As shown in fig. 1, the method for simulating an interventional operation based on video image feedback of the present invention comprises:
(1) constructing a human body simulation model that includes at least a simulated tissue layer and a simulated organ, at least the simulated organ being made of a transparent material;
(2) arranging a video monitoring device for the ablation process on one side of the simulated organ, the video monitoring device being capable of video image acquisition in at least two wave bands;
(3) during a surgical simulation with the ablation device, controlling the video monitoring device, via an ablation switch of the ablation device, to acquire video in at least one of the two acquisition bands. As shown in the figure, before ablation starts, i.e. during the puncture stage, images are acquired with visible light; after ablation starts, images are acquired and displayed with infrared light, or a fused image is displayed.
The fused image is formed as follows:
(3.1) extracting the RGB values of the visible light image and weighting each of them, the weighting coefficient of each value being no more than 45% or 50%;
(3.2) acquiring an infrared image with an infrared camera and weighting its RGB values, the weighting coefficients being no less than 45% or 50%;
(3.3) fusing each frame of the infrared image with the visible light image pixel by pixel according to the synchronization signals, adding the RGB values of the two images channel by channel.
As shown in fig. 2, the system used in this embodiment of the training method for minimally invasive ablation operations includes: the manikin 100, the ablation device 200, the operating table 300, the ablation monitoring device 400, and the ablation needle state display device 500.
The operating table 300 is constructed substantially the same as, or similar to, a real operating table in order to give the trainee as realistic an experience as possible.
For example, for liver cancer treatment training, the phantom 100 may be divided into two parts: an upper body part containing the visceral tissues, which is a more realistic, higher-quality model, and a lower body part (which may be omitted), which is an ordinary, lower-cost model, thereby further reducing cost. The upper body part and the lower body part can be assembled together and disassembled, so that when the upper body part needs to be replaced the lower body part does not have to be replaced at the same time. More preferably, a further model part (which may be omitted) is included and is detachably assembled with the upper body model.
A hollow region is provided below the phantom 100 (below the upper body model); it is located in a non-visceral region (in particular a non-liver region) adjacent to the liver. The hollow region serves to position the ablation monitoring device 400, whose mounting height is matched to the height of the target monitoring region. At least the interior (in particular the organs) of the upper body model of the manikin 100 is made of a transparent material, such as transparent silica gel.
The ablation monitoring device 400 includes a visible light imaging device 401 and/or an infrared thermal imaging device 402, both directed at the target organ. The target organ is a silica gel model or is made of another transparent or translucent material; more preferably, its heat capacity and thermal conductivity are close to those of human tissue, and simulated tumors of various shapes and structures can be inserted into it as required.
The imaging focal points of the visible light imaging device 401 and the infrared thermal imaging device 402 are directed to a target simulated lesion region of a target organ.
In one preferred implementation, the visible light imaging device 401 and the infrared thermal imaging device 402 are arranged side by side and capture images of the target organ and the simulated tumor from their respective angles and in their respective wave bands. A problem with this arrangement is that the user needs to observe both images at the same time: because the two devices shoot side by side, their viewing angles are not identical but offset by some distance horizontally or vertically, so the contents of the two images do not cover exactly the same region. This tends to increase the user's observation burden.
To address this problem, another preferred implementation uses the optical path structure shown in fig. 3 for coaxial, beam-splitting image acquisition. As shown, the ablation monitoring device 400 includes a visible light imaging device 401, an infrared thermal imaging device 402, an optical lens 403, a beam splitter 404, a reflector 405 and a housing 406. The optical lens 403 is located at the frontmost end of the housing and seals its opening; the beam splitter 404 is located behind the optical lens 403 and receives the incident light from it, reflecting one band and transmitting the other. The beam splitter 404 is coated with a reflective film for visible or infrared light, so that one of the two bands is reflected and the other transmitted: for example, the visible light is transmitted to the visible light imaging device 401, while the infrared light is reflected to the reflector 405 and then onward to the infrared imaging device 402.
In this way, the visible light imaging device 401 and the infrared imaging device 402 can share the same field of view.
Further, to make observation more convenient for the user, the operation flow of the surgical simulation training method may be divided into two modes, a puncturing mode and an ablation mode. In the puncturing mode, the image obtained by the visible light imaging device 401 is presented to the user on the display device 500, so that the user learns the correspondence between hand movements and the advance of the puncture needle, can control the amplitude of the movements more easily, understands the instrument motion they cause, and develops a better sense of control.
When the puncture reaches the preset position, the ablation mode is started: the user triggers the control switch of the ablation needle, the control switch simultaneously sends a mode-switching signal to the ablation monitoring device, and the ablation monitoring device fuses the image obtained by the visible light imaging device 401 with the image obtained by the infrared imaging device 402.
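The mode switching can be pictured with the following minimal Python sketch. The class, the grab() method of the camera objects and the reuse of the fuse_frames() helper sketched earlier are hypothetical names introduced only for illustration; the invention itself only requires that the ablation switch also act as the mode-switching signal.

```python
from enum import Enum, auto

class Mode(Enum):
    PUNCTURE = auto()   # visible-light acquisition only
    ABLATION = auto()   # synchronized visible + infrared, fused for display

class AblationMonitor:
    """Minimal controller reacting to the ablation switch of the ablation device."""

    def __init__(self, visible_cam, infrared_cam):
        self.visible_cam = visible_cam
        self.infrared_cam = infrared_cam
        self.mode = Mode.PUNCTURE

    def on_ablation_switch(self, switched_on):
        # The ablation-needle control switch doubles as the mode-switching signal.
        self.mode = Mode.ABLATION if switched_on else Mode.PUNCTURE

    def next_frame(self):
        vis = self.visible_cam.grab()          # triggered by the sync signal
        if self.mode is Mode.PUNCTURE:
            return vis
        ir = self.infrared_cam.grab()          # same sync signal
        return fuse_frames(vis, ir)            # weighted fusion, sketched above
```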
The specific fusion procedure is as follows. A synchronous trigger signal drives the shutters of the visible light imaging device 401 and the infrared imaging device 402, and the visible and infrared images of the corresponding frame are acquired on that signal. The RGB values of the visible light image are extracted and weighted unequally, the weight assigned to the R value being smaller than those of G and B; the weighting coefficients are w_R1, w_G1 and w_B1, and preferably the weight assigned to the R value is about 10-20% smaller than the weights of the G and B values. The individual RGB weighting coefficients may be set between 30% and 60%, preferably no more than 50%. The RGB values of the infrared image obtained by the infrared imaging device 402 are then extracted; since infrared light is invisible to the human eye, the infrared imaging device 402 converts the captured infrared image into an image visible to the eye, and it is this converted infrared image that is processed here. The RGB values of the infrared image are weighted unequally, the weighting coefficient of the R value being larger than those of G and B; the coefficients are w_R2, w_G2 and w_B2. This weighting shifts the tone of the visible image toward shorter wavelengths and the tone of the image formed by the infrared imaging device toward longer wavelengths, so that once the ablation stage begins the infrared image reflects the heat conduction in the ablated area more clearly, and it also reduces the chance that the pixel values of the weighted and fused images overflow. The two image acquisition devices are controlled to output images of the same resolution, which allows better fusion; of course, before fusion the two images may be registered using a conventional image registration method.
The weighted R1 value of the visible light image is added to the weighted R2 value of the corresponding pixel of the infrared image, the weighted G1 value of the visible light image is added to the weighted G2 value of the infrared image, and the weighted B1 value of the visible light image is added to the weighted B2 value of the infrared image, giving the RGB values of the corresponding pixel of the fused image: R_fused, G_fused, B_fused.
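Matching the output resolutions and registering the two images before fusion could look like the sketch below. ECC alignment is one conventional registration method substituted here purely for illustration; the patent does not prescribe a particular algorithm, and the function name is an assumption.

```python
import cv2
import numpy as np

def match_and_register(visible_bgr, infrared_bgr):
    """Resize the infrared frame to the visible frame's resolution and register it."""
    h, w = visible_bgr.shape[:2]
    ir = cv2.resize(infrared_bgr, (w, h), interpolation=cv2.INTER_LINEAR)

    vis_gray = cv2.cvtColor(visible_bgr, cv2.COLOR_BGR2GRAY).astype(np.float32)
    ir_gray = cv2.cvtColor(ir, cv2.COLOR_BGR2GRAY).astype(np.float32)

    # Estimate a Euclidean transform with the ECC criterion and warp the infrared frame.
    warp = np.eye(2, 3, dtype=np.float32)
    criteria = (cv2.TERM_CRITERIA_EPS | cv2.TERM_CRITERIA_COUNT, 100, 1e-6)
    _, warp = cv2.findTransformECC(vis_gray, ir_gray, warp, cv2.MOTION_EUCLIDEAN, criteria)
    return cv2.warpAffine(ir, warp, (w, h),
                          flags=cv2.INTER_LINEAR + cv2.WARP_INVERSE_MAP)
```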
Preferably, the R_fused, G_fused and B_fused values of each pixel in the fused image are compared with corresponding overflow thresholds to determine the proportion of overflowing pixels among all pixels. Taking possible noise spikes into account, when the proportion of overflowing pixels exceeds a predetermined threshold for any one of the R, G or B channels, the pixel values of that channel are multiplied by an anti-overflow coefficient. The anti-overflow coefficient can be set based on the average pixel value of that channel, the overflow threshold and the overflow ratio, for example as:
[Anti-overflow coefficient formula, reproduced only as an image in the original publication; it is expressed in terms of the overflow threshold Q, the average pixel value A and the overflow ratio Y.]
where Q is the overflow threshold, A is the average pixel value, and Y is the overflow ratio; multiplying by this coefficient reduces the pixel values of the affected channel and avoids overflow. The overflow ratio is then computed again iteratively; if overflow still occurs, the coefficient is reduced by 0.05-0.1 (i.e. 5%-10%) each time. If the overflow ratio still exceeds the limit after three reductions, the frame is judged abnormal and discarded, and the pixel value is replaced by extrapolating the arithmetic progression formed by the corresponding pixel in the previous three frames. For example, with a pixel value range of [0, 255], if a pixel value is abnormal and its G values in the previous three frames were 172, 176 and 184, the average frame-to-frame difference is 6 and the current value is set to 190. This preserves the trend of the image while avoiding excessive distortion.
Further, a mapping between the pixels of the ablation monitoring device and temperature is established. A temperature sensor is arranged at a preset position (distance) in the simulated organ; the ablation electrode of the ablation needle is inserted into the simulated organ, keeping a distance of 1-4 cm from the temperature sensor, and the power of the ablation electrode is gradually increased. Measurement trigger signals are sent to the temperature sensor and to the ablation monitoring device: the temperature sensor measures the actual temperature at its location, while the ablation monitoring device obtains a fused image of the ablation electrode and the temperature sensor. Image-extraction software extracts the pixel values of the simulated organ near the temperature sensor, a correspondence between pixel values and temperature values is established, and repeated tests yield the temperature value corresponding to each pixel-value range, forming a mapping table between pixel values and temperature values under the current parameter conditions. Based on this table, specific temperature values (specific pixel values) of interest to the physician during ablation are marked or displayed with pixel expansion. For example, for the pixel value corresponding to a specific temperature such as 120 degrees Celsius, an isotherm is formed, and the pixels on the isotherm and their neighbours are assigned a higher brightness value, or the 3 × 3 neighbourhood of each pixel on the isotherm is assigned the same RGB values, forming a distinct marked region. Several parameter systems and corresponding mappings can be established for different tumors or experimental conditions.
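A minimal sketch of such a pixel-to-temperature mapping and of the isotherm marking is given below. The calibration-pair format, the linear interpolation of the lookup table and the 120 °C target value are illustrative assumptions; the invention only requires that repeated sensor measurements yield a table from pixel-value ranges to temperatures.

```python
import numpy as np

def build_temperature_lut(calibration_pairs):
    """Build a 256-entry pixel-value -> temperature lookup table from (pixel, deg C)
    calibration pairs recorded by the temperature sensor at increasing ablation power."""
    pixels, temps = zip(*sorted(calibration_pairs))
    return np.interp(np.arange(256), pixels, temps)

def isotherm_mask(fused_gray, lut, target_temp=120.0, tolerance=1.0):
    """Boolean mask of the pixels lying on the target isotherm (fused_gray is uint8),
    dilated to a 3x3 neighbourhood so the marked region stands out clearly."""
    temp_map = lut[fused_gray]                       # per-pixel temperature estimate
    mask = np.abs(temp_map - target_temp) <= tolerance
    # Dilate horizontally, then vertically: marks the full 3x3 neighbourhood.
    d = mask.copy()
    d[:, 1:] |= mask[:, :-1]; d[:, :-1] |= mask[:, 1:]
    out = d.copy()
    out[1:, :] |= d[:-1, :]; out[:-1, :] |= d[1:, :]
    return out
```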
Further, the thermal conductivity, specific heat and density of the simulated organ (tumor) material are measured, as are the thermal conductivity, specific heat and density of the animal liver (tumor), and the pixel values are weighted on the basis of the materials as follows:
[Material-correction formula, reproduced only as an image in the original publication; it relates the corrected pixel value P' to P using the thermal conductivities, specific heats and densities of the animal tissue and of the simulation material.]
where P' is the corrected pixel value, P is the pixel value before correction, λ1 and λ2 are the thermal conductivity of the tested animal tissue and of the simulation material, c1 and c2 are the specific heat of the tested animal tissue and of the simulation material, and ρ1 and ρ2 are the density of the tested animal tissue (e.g. liver) and of the simulation material, respectively.
More preferably, to reduce cost further, a replaceable organ model and/or an upper-body surface tissue model are provided; both are made of two-component silica gel. Preferably, the upper body model comprises tissue layers and a skeleton (the skeleton forms the skeletal contours and the tissue layers, in particular the outer tissue layer, cover it; in the puncture region of interest there is no skeletal layer, since the puncture must be made through the rib spaces or the frameless abdominal region). The tissue layers comprise an outer tissue layer 3 and an inner substrate layer 4, which are attached to each other and sealed together at both ends; their interiors are fixed together at intervals and generally lie against each other, with a fluid channel formed between the two.
The inner substrate is attached to the inner side of the outer tissue layer; a number of attachment fixing points are arranged at intervals between the inner substrate and the outer tissue layer by bonding or other fixing means, so that the inner substrate and the simulated tissue layer are fixed together, for example with dispensed dots of flexible glue. The periphery of the inner substrate is sealed to the inner wall of the simulated tissue layer, and a filling layer is formed between them; a filling liquid input port 8 and a filling liquid suction port 9 are arranged at the two ends of the joined region and both communicate with the filling layer between the inner substrate and the simulated tissue layer. Preferably, the inner substrate is made of a material with little tack, such as flexible silicone, to ensure a tight fit between the inner substrate and the simulated tissue layer. Preferably, the backing layer is provided only on the side of the simulated tissue layer adjacent to the anterior chest and not on the other side, since most surgical ablation punctures are made from the anterior side.
More preferably, the system further includes a repair patch 501. The repair patch is circular or square; one side has a smooth surface, and at least the peripheral portion of the other side is adhesive. A circular repair block 502 is movably bonded to the middle of this adhesive surface, and the first component of a two-component silica gel is attached to the repair block (in this embodiment, the repair block comprises the silica-gel first component inside a cover film that can be ruptured by external squeezing). This first component matches the second-component silica gel fed in through the filling liquid input port, the two together forming the two-component silica gel. Preferably, the first component in the circular repair block is a silica gel curing agent, or the block is filled with a silica gel curing agent, which matches the silica gel liquid fed in through the filling liquid input port; the two are respectively the two components of the two-component silica gel. More preferably, the two-component silica gel is a colorless or single-colour flexible gel.
Preferably, the two-component silica gel consists of a first component and a second component, the mass ratio of the second component to the first component being 10:1. The first component comprises 2-3 parts of cross-linking agent, 1-4 parts of coupling agent, 0.1-0.3 part of catalyst, 5 parts of plasticizer and 3 parts of chain extender; the second component comprises 100-120 parts of hydroxyl-terminated polysiloxane, 80 parts of filler and 30 parts of plasticizer. The filler may be ground calcium carbonate, silica micropowder or aluminium hydroxide; the plasticizer is methyl silicone oil; and the viscosity of the hydroxyl-terminated polysiloxane is 400-650 mPa·s. The cross-linking agent is at least one of methyl orthosilicate or ethyl silicate, methyltriethoxysilane and tetraisopropoxysilane; the coupling agent is vinyltrimethoxysilane or aminopropyltrimethoxysilane; the chain extender may be a conventional chain extender such as dimethyldiethylsilane or aminopropylmethyldimethoxysilane; and the catalyst is dibutyltin dilaurate.
The repairing process is as follows. A repair patch 101 is applied to each side of every puncture hole, i.e. on the inner side of the inner lining layer and on the outer side of the simulated tissue layer; the middle of each repair patch carries a circular repair block or is coated with the first component of the repair liquid, and the patch is placed opposite the puncture hole. The filling liquid input port is then opened, and the filling liquid suction port opens naturally to vent any residual air in the filling layer. The second-component silica gel is slowly fed into the filling layer through the filling liquid input port; if the filling is uneven, it can be adjusted by pressing by hand. Because the inner substrate layer and the simulated tissue layer were originally attached and fixed together at intervals, a fluid channel forms between them once the second-component gel is filled in, and the second component is carried to every position between the two layers, including every puncture site; the air remaining in the punctures is squeezed toward the other side of the filling layer, so that the filling liquid enters the punctures and contacts the repair block, or the first component of the repair liquid, on the repair patch. Once every puncture hole is filled with the second component, the filling liquid input port is closed, the filling liquid suction port is opened, and the filling liquid is slowly drawn out until the inner substrate layer and the simulated tissue layer are essentially attached to each other again. Because the input port and the suction port are on opposite sides of the simulation equipment, the residual air in the original puncture holes is pumped out during suction and the second-component gel is left in the pits of the puncture holes. At this point the repair blocks over the puncture holes can be gently squeezed and kneaded so that the first-component gel on the repair blocks mixes thoroughly with the second component; the assembly is then left to stand until curing is complete, repairing the inner substrate layer and the simulated tissue layer, and any second component remaining in the repair blocks can be further drawn off. After curing, each repair becomes an additional fixing point between the two layers. With this liquid-curing repair, the repaired position shows almost no visible boundary and blends well with the original material. Most preferably, the inner backing layer and the simulated tissue layer are made of the same material as the repair material, i.e. they too are made of two-component silicone or another similar two-component curing material. In another preferred implementation, the repair patch is separable from the inner simulated tissue layer and is torn off when the repair is complete.
While the principles of the invention have been described in detail in connection with the preferred embodiments thereof, it will be understood by those skilled in the art that the foregoing embodiments are merely illustrative of exemplary implementations of the invention and are not limiting of the scope of the invention.

Claims (5)

1. An interventional operation simulation method based on video image feedback is characterized by comprising the following steps:
(1) constructing a human body simulation model that includes at least a simulated tissue layer and a simulated organ, at least the simulated organ being made of a transparent material;
(2) arranging a video monitoring device for the ablation process on one side of the simulated organ, the video monitoring device being capable of video image acquisition in at least two wave bands;
(3) during the operation simulation with the ablation device, instructing the video monitoring device, via an ablation switch of the ablation device, to perform video acquisition in the two acquisition bands, wherein the first of the two bands is a visible light band or a shorter-wavelength light band and the second band is an infrared band,
the simulation process comprises a puncture stage and an ablation stage, the ablation switch of the ablation device is communicatively connected to the video monitoring device for mode switching, the puncture stage uses the first band for video acquisition, and the ablation stage acquires visible light images and infrared images based on synchronization signals and fuses the corresponding frame images, the fused image being presented to the user; the RGB values of the visible light image are extracted and weighted unequally, the weight assigned to the R value being smaller than those of G and B, with weighting coefficients w_R1, w_G1 and w_B1 respectively; the RGB values of the infrared image are weighted unequally, the weighting coefficient of the R value being larger than those of G and B, with weighting coefficients w_R2, w_G2 and w_B2 respectively,
the R_fused, G_fused and B_fused values of each pixel in the fused image are compared with corresponding overflow thresholds to determine the proportion of overflowing pixels among all pixels, and when, for any one of the R, G or B channels, the proportion of overflowing pixels exceeds a preset threshold, the pixel values of that channel are multiplied by an anti-overflow coefficient, which is:
[Anti-overflow coefficient formula, reproduced only as an image in the original publication; it is expressed in terms of the overflow threshold Q, the average pixel value A and the overflow ratio Y.]
wherein Q is the overflow threshold, A is the average pixel value, and Y is the overflow ratio,
the method further comprises: measuring the thermal conductivity of the simulated-organ material and of the animal liver material, and weighting the infrared image according to the materials before fusion, the weighting being as follows:
[Material-correction formula, reproduced only as an image in the original publication; it relates the corrected pixel value P' to P using the thermal conductivities, specific heats and densities of the animal tissue and of the simulation material.]
where P' is the corrected pixel value, P is the pixel value before correction, λ1 and λ2 are the thermal conductivity of the tested animal tissue and of the simulation material, c1 and c2 are the specific heat of the tested animal tissue and of the simulation material, and ρ1 and ρ2 are the density of the tested animal tissue and of the simulation material, respectively.
2. The method for simulating an interventional operation based on video image feedback according to claim 1, wherein the fused image is formed as follows:
(3.1) extracting the RGB values of the visible light image and weighting each of them, the weighting coefficient of each value being no more than 45% or 50%;
(3.2) acquiring an infrared image with an infrared camera and weighting its RGB values, the weighting coefficients being no less than 45% or 50%;
(3.3) fusing each frame of the infrared image with the visible light image pixel by pixel according to the synchronization signals, adding the RGB values of the two images channel by channel.
3. The method of claim 2, further comprising: establishing a mapping between pixels of the ablation monitoring device and temperature, and highlighting or marking, based on this mapping, specific temperature values of interest to the physician during the ablation process.
4. The video-image-feedback-based interventional procedure simulation method of claim 1, further comprising repairing the punctured simulated tissue layer with a repair patch, and removing and replacing the simulated organ by detaching the separable upper body portion model and lower body portion model.
5. The method for simulating an interventional operation based on video image feedback of claim 1, wherein the video monitoring device collects images of the different wave bands by means of coaxial, beam-splitting acquisition.
CN202110577825.XA 2021-05-26 2021-05-26 Interventional operation simulation method based on video image feedback Active CN113303905B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110577825.XA CN113303905B (en) 2021-05-26 2021-05-26 Interventional operation simulation method based on video image feedback

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110577825.XA CN113303905B (en) 2021-05-26 2021-05-26 Interventional operation simulation method based on video image feedback

Publications (2)

Publication Number Publication Date
CN113303905A CN113303905A (en) 2021-08-27
CN113303905B true CN113303905B (en) 2022-07-01

Family

ID=77374874

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110577825.XA Active CN113303905B (en) 2021-05-26 2021-05-26 Interventional operation simulation method based on video image feedback

Country Status (1)

Country Link
CN (1) CN113303905B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114569252B (en) * 2022-03-02 2024-01-30 中南大学 Master-slave mapping proportion control system and method for surgical robot

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101902950A (en) * 2007-12-21 2010-12-01 爱尔康折射视界公司 Virtual microscope system for monitoring the progress of corneal ablative surgery and associated methods
CN103236213A (en) * 2013-04-19 2013-08-07 上海交通大学 Atrial fibrillation catheter ablation simulation based on optical binocular positioning
CN103489360A (en) * 2012-06-12 2014-01-01 韦伯斯特生物官能(以色列)有限公司 Physical heart simulator
CN105726117A (en) * 2014-12-29 2016-07-06 韦伯斯特生物官能(以色列)有限公司 Spectral Sensing Of Ablation
CN105744883A (en) * 2013-11-20 2016-07-06 乔治华盛顿大学 Systems and methods for hyperspectral analysis of cardiac tissue
CN106572842A (en) * 2014-06-24 2017-04-19 阿帕玛医疗公司 Tissue ablation and monitoring thereof
CN107452000A (en) * 2017-08-31 2017-12-08 天津大学 Verify the experimental facilities of ultrasonic temperature imaging accuracy
CN109998451A (en) * 2019-04-30 2019-07-12 东北大学 A kind of photo-thermal therapy device of based endoscopic imaging guidance
CN110365878A (en) * 2019-07-04 2019-10-22 华为技术有限公司 A kind of photographic device and method
CN110472658A (en) * 2019-07-05 2019-11-19 哈尔滨工程大学 A kind of the level fusion and extracting method of the detection of moving-target multi-source

Family Cites Families (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH07177308A (en) * 1993-12-20 1995-07-14 Ricoh Co Ltd Method for controlling double-sided scanner
JPH07271544A (en) * 1994-03-29 1995-10-20 Toshiba Corp Image information processor
US7009627B2 (en) * 2001-11-21 2006-03-07 Canon Kabushiki Kaisha Display apparatus, and image signal processing apparatus and drive control apparatus for the same
US7460733B2 (en) * 2004-09-02 2008-12-02 Siemens Medical Solutions Usa, Inc. System and method for registration and modeling of deformable shapes by direct factorization
KR100933282B1 (en) * 2008-01-16 2009-12-22 연세대학교 산학협력단 Color restoration method and system
JP2013038504A (en) * 2011-08-04 2013-02-21 Sony Corp Imaging device, image processing method and program
KR102081133B1 (en) * 2013-12-30 2020-04-14 엘지디스플레이 주식회사 Method And apparatus Controlling Luminance Of Organic Light Emitting Diode Display Device
JP2015152645A (en) * 2014-02-10 2015-08-24 シナプティクス・ディスプレイ・デバイス合同会社 Image processing apparatus, image processing method, display panel driver, and display apparatus
CN104505053B (en) * 2015-01-04 2017-03-15 京东方科技集团股份有限公司 Show signal conversion method and device
JP5935196B1 (en) * 2015-07-08 2016-06-15 Eizo株式会社 Image processing apparatus, display apparatus, and program
CN106683056A (en) * 2016-12-16 2017-05-17 凯迈(洛阳)测控有限公司 Airborne photoelectric infrared digital image processing method and apparatus thereof
CN108961299B (en) * 2017-05-18 2021-03-02 北京金山云网络技术有限公司 Foreground image obtaining method and device
CN107205120B (en) * 2017-06-30 2019-04-09 维沃移动通信有限公司 A kind of processing method and mobile terminal of image
CN108347560A (en) * 2018-01-17 2018-07-31 浙江大华技术股份有限公司 A kind of anti-sun of video camera is burnt method, video camera and readable storage medium storing program for executing
CN112001873A (en) * 2020-08-27 2020-11-27 中广核贝谷科技有限公司 Data generation method based on container X-ray image
CN112686820A (en) * 2020-12-29 2021-04-20 北京旷视科技有限公司 Virtual makeup method and device and electronic equipment

Patent Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101902950A (en) * 2007-12-21 2010-12-01 爱尔康折射视界公司 Virtual microscope system for monitoring the progress of corneal ablative surgery and associated methods
CN103489360A (en) * 2012-06-12 2014-01-01 韦伯斯特生物官能(以色列)有限公司 Physical heart simulator
CN103236213A (en) * 2013-04-19 2013-08-07 上海交通大学 Atrial fibrillation catheter ablation simulation based on optical binocular positioning
CN105744883A (en) * 2013-11-20 2016-07-06 乔治华盛顿大学 Systems and methods for hyperspectral analysis of cardiac tissue
CN106572842A (en) * 2014-06-24 2017-04-19 阿帕玛医疗公司 Tissue ablation and monitoring thereof
CN105726117A (en) * 2014-12-29 2016-07-06 韦伯斯特生物官能(以色列)有限公司 Spectral Sensing Of Ablation
CN107452000A (en) * 2017-08-31 2017-12-08 天津大学 Verify the experimental facilities of ultrasonic temperature imaging accuracy
CN109998451A (en) * 2019-04-30 2019-07-12 东北大学 A kind of photo-thermal therapy device of based endoscopic imaging guidance
CN110365878A (en) * 2019-07-04 2019-10-22 华为技术有限公司 A kind of photographic device and method
CN110472658A (en) * 2019-07-05 2019-11-19 哈尔滨工程大学 A kind of the level fusion and extracting method of the detection of moving-target multi-source

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Selective photothermolysis of the skin and methods and techniques for evaluating phototherapy and thermal damage; 龚玮 (Gong Wei); China Excellent Master's and Doctoral Dissertations Full-text Database, Medicine and Health Sciences; 2011-12-15; pp. 25-27 *

Also Published As

Publication number Publication date
CN113303905A (en) 2021-08-27

Similar Documents

Publication Publication Date Title
AU2021266318B2 (en) Simulated tissue models and methods
JP2020024473A (en) Incisable simulation tissue
US10573201B2 (en) Method of producing a phantom and phantom
CN113303905B (en) Interventional operation simulation method based on video image feedback
CA3146636A1 (en) Simulated tissue structure for surgical training
JP2015505679A (en) Intervention image guidance by fusing ultrasound images
WO2006057296A1 (en) Ultrasonographic device
WO2018168261A1 (en) Control device, control method, and program
WO2012014437A1 (en) Image processor, image processing method and image processing program
CN113380093B (en) Operation training system for microwave ablation operation
Terry et al. An integrated port camera and display system for laparoscopy
CN105931549A (en) Manufacturing method of left atrial appendage closure simulation system and apparatus thereof
US20190130791A1 (en) Method of assessing the performance of a human or robot carrying out a medical procedure and assessment tool
Kingston et al. Hysteroscopic training: the butternut pumpkin model
CN210727882U (en) Laparoscope external view mirror device applying optical coherence tomography technology
JPS6312365Y2 (en)
Brown et al. Comparison of conventional and gaze-down imaging in laparoscopic task performance
CN218833265U (en) Visual stomach tube device
CN216388413U (en) Laparoscope ultrasonic puncture simulation training device
CN219230109U (en) Real-time dynamic reduction system for nasal bone fracture under ultrasonic guidance
CN113313988B (en) Repeatedly-usable and repairable operation simulation equipment
JP7053873B2 (en) Image processing equipment and endoscopic system
CN210667427U (en) Teaching model device for simulating retrograde cholangiopancreatography through endoscope
US6716170B2 (en) Body channel motion picture production system
JP2023003261A (en) Medical image processing device, operation method thereof, and endoscope system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant