CN116650022B - Method and system for assisting in positioning uterine focus by fusion of ultrasonic and endoscopic images - Google Patents
- Publication number
- CN116650022B CN116650022B CN202310954717.9A CN202310954717A CN116650022B CN 116650022 B CN116650022 B CN 116650022B CN 202310954717 A CN202310954717 A CN 202310954717A CN 116650022 B CN116650022 B CN 116650022B
- Authority
- CN
- China
- Prior art keywords
- data
- patient
- module
- focus
- uterine cavity
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B8/00—Diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/48—Diagnostic techniques
- A61B8/481—Diagnostic techniques involving the use of contrast agent, e.g. microbubbles introduced into the bloodstream
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B8/00—Diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/08—Detecting organic movements or changes, e.g. tumours, cysts, swellings
- A61B8/0833—Detecting organic movements or changes, e.g. tumours, cysts, swellings involving detecting or locating foreign bodies or organic structures
- A61B8/085—Detecting organic movements or changes, e.g. tumours, cysts, swellings involving detecting or locating foreign bodies or organic structures for locating body or organic structures, e.g. tumours, calculi, blood vessels, nodules
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B8/00—Diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/12—Diagnosis using ultrasonic, sonic or infrasonic waves in body cavities or body tracts, e.g. by using catheters
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B8/00—Diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/52—Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/5215—Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves involving processing of medical diagnostic data
- A61B8/5238—Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves involving processing of medical diagnostic data for combining image data of patient, e.g. merging several images from different acquisition modes into one image
- A61B8/5261—Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves involving processing of medical diagnostic data for combining image data of patient, e.g. merging several images from different acquisition modes into one image combining images from different diagnostic modalities, e.g. ultrasound and X-ray
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T17/00—Three dimensional [3D] modelling, e.g. data description of 3D objects
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/0002—Inspection of images, e.g. flaw detection
- G06T7/0012—Biomedical image inspection
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/70—Determining position or orientation of objects or cameras
- G06T7/73—Determining position or orientation of objects or cameras using feature-based methods
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10068—Endoscopic image
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10132—Ultrasound image
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20212—Image combination
- G06T2207/20221—Image fusion; Image merging
Abstract
The invention relates to the technical field of image recognition, and in particular to a method and system for assisting in positioning a uterine focus by fusing ultrasonic and endoscopic images. The method comprises the following steps: first, patient data are acquired by contrast radiography, and the uterine cavity size and in-vivo position data of the patient are obtained with ultrasonic equipment; the data are then processed by a preprocessing module and transmitted to a composition module, which generates three-dimensional model data and two-dimensional plane grid data; these data are transmitted to a laser grid generating device and to the operation terminal of the chief surgeon, and the positioning of the patient's focus is finally completed. In use, the invention can examine the patient multiple times with conventional detection devices, thereby obtaining patient data in multiple dimensions; the data are processed by the image fusion processing module to yield three-dimensional model data and two-dimensional plane grid data, which greatly assist the doctor's judgment and support the doctor in performing the incision.
Description
Technical Field
The invention relates to the technical field of positioning uterine foci by image fusion technology, and in particular to a method and system for assisting in positioning a uterine focus by fusing ultrasonic and endoscopic images.
Background
In the prior art, the position of a uterine focus is determined by examining the focus in the patient's uterus with related equipment such as ultrasound and an endoscope, combined with the doctor's own judgment; such positioning depends on the doctor's subjective assessment and is difficult to transfer precisely to the operating field.
The present invention has been made in view of this.
Disclosure of Invention
The invention aims to provide a method and system for assisting in positioning a uterine focus by fusing ultrasonic and endoscopic images, so as to solve the problems noted in the background art.
In order to achieve the above object, one of the objects of the present invention is to provide a method for assisting in locating a uterine focus by fusion of ultrasound and endoscopic images, comprising the steps of:
S1, intrauterine radiography and data acquisition: first, a contrast examination of the patient's uterine cavity is performed by ultrasound in cooperation with an endoscope, and data are collected;
S2, image fusion processing: the position of the patient's uterine cavity is detected by ultrasonic equipment to obtain the uterine cavity size and in-vivo position data of the patient, and the acquired data are transmitted to the composition module in the image fusion processing module;
S3, contrast examination of lesion positions: the lesion position data, morphological data, size data and echo characteristic data obtained from the contrast examination by ultrasound and endoscope are transmitted to the preprocessing module in the image fusion processing module for processing;
S4, preprocessing of lesion data: the preprocessing module in the image fusion processing module preprocesses the lesion position data, morphological data, size data and echo characteristic data, and transmits the preprocessed data to the composition module;
S5, fusion modeling of lesion positions: the composition module fuses the lesion position data, morphological data, size data and echo characteristic data processed by the preprocessing module with the acquired uterine cavity size and in-vivo position data of the patient, and generates three-dimensional model data and two-dimensional plane grid data;
S6, model transmission: the three-dimensional model data and two-dimensional plane grid data of the patient generated by the composition module are transmitted through the transmission module to the laser grid generating device and to the operation terminal of the chief surgeon;
S7, auxiliary treatment: finally, during the surgical treatment of the patient, the doctor observes the three-dimensional model data in combination with the laser grid generating device, thereby completing the positioning of the patient's focus.
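The S1–S7 flow above can be sketched in code. This is an illustrative outline only: every name here (`LesionRecord`, `preprocess`, `compose`, `run_pipeline`) and the 10 mm grid-cell size are my assumptions, and each stage is a stub standing in for the real imaging modules.

```python
from dataclasses import dataclass, field

@dataclass
class LesionRecord:
    """One lesion observation from the contrast examination (S1/S3)."""
    position: tuple            # (x, y, z) in uterine-cavity coordinates, mm
    morphology: str            # e.g. "polypoid"
    size_mm: float
    echo: dict = field(default_factory=dict)   # echo characteristic data

def preprocess(records):
    """S4: keep only complete records (stand-in for the preprocessing module)."""
    return [r for r in records if r.position and r.morphology and r.size_mm > 0]

def compose(records, cavity_size_mm, body_position_mm):
    """S5: fuse lesion records with cavity size/position into the two products."""
    model_3d = {"cavity_size": cavity_size_mm,
                "body_position": body_position_mm,
                "lesions": [r.position for r in records]}
    grid_2d = {"cells": [(r.position[0] // 10, r.position[1] // 10)
                         for r in records]}    # 10 mm square cells (assumed)
    return model_3d, grid_2d

def run_pipeline(records, cavity_size_mm, body_position_mm):
    """S1-S7 end to end: preprocess, then compose; transmission to the laser
    grid device and the surgeon's terminal (S6-S7) is elided here."""
    return compose(preprocess(records), cavity_size_mm, body_position_mm)
```

For example, an incomplete record (missing morphology) is dropped in S4, and only the valid lesion reaches the composed model and grid.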
As a further improvement of the technical scheme, in step S1, the contrast examination of the patient's uterine cavity by ultrasound in cooperation with the endoscope specifically comprises the following steps:
S11: first, a routine examination of the patient is performed with a special vaginal probe at a frequency of 5 to 7 MHz;
S12: after the examination is confirmed to be error-free, the probe is withdrawn and sterilized;
S13: a double-lumen contrast tube is then inserted through the patient's cervix, normal saline is injected into the balloon, and the balloon is pulled down to the internal cervical os;
S14: the sterilized special vaginal probe (5 to 7 MHz) is then placed into the patient's vagina again;
S15: at the same time, physiological saline is injected into the uterine cavity to expand and fill it, and the lesion position data, morphological data, size data and echo characteristic data are observed and recorded;
S16: finally, suspicious lesions may be sampled at the doctor's request, and the probe is withdrawn after examination, observation, recording and sampling are completed.
As a further improvement of the technical scheme, in step S2, by collecting the uterine cavity size and in-vivo position data of the patient, the focus can be accurately positioned in the subsequent positioning process according to the physical characteristics of each individual patient.
As a further improvement of the technical scheme, in step S3, the preprocessing module screens the lesion position data, morphological data, size data and echo characteristic data obtained from the ultrasonic and endoscopic contrast examination, so as to retain only accurate and complete records of each.
As a further improvement of the technical scheme, in step S5, the composition module is used to construct a three-dimensional model of the interior of the patient's uterine cavity, together with the position, shape and size of the focus within it. The generated two-dimensional plane grid data divide the plan view of the patient's uterine cavity and focus position into a number of square cells of equal size, and the specific position of the patient's focus is then marked in the grid view with a dotted outline.
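The grid construction just described can be sketched as follows; the function names and the millimetre units are illustrative assumptions, not part of the patent.

```python
def grid_lines(width_mm, height_mm, cell_mm):
    """Positions of the vertical and horizontal lines that divide the
    plan view into equal square cells."""
    xs = list(range(0, width_mm + 1, cell_mm))
    ys = list(range(0, height_mm + 1, cell_mm))
    return xs, ys

def lesion_cell(point_mm, cell_mm):
    """Index (column, row) of the square cell containing a lesion point;
    this is the cell whose outline would be drawn with a dotted line."""
    x, y = point_mm
    return (x // cell_mm, y // cell_mm)
```

A 40 mm by 30 mm plan view with 10 mm cells, for instance, yields five vertical and four horizontal grid lines, and a lesion at (25, 17) falls in cell (2, 1).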
As a further improvement of the technical scheme, the specific steps of constructing the three-dimensional model by the composition module are as follows:
S51: first, the collected image data, lesion position data, morphological data, size data and echo characteristic data are read into the composition module;
S52: next, the image data are processed and positioned by the composition module;
S53: the image data are then synthesized and three-dimensional modeling is performed to obtain a complete three-dimensional model of the interior of the uterine cavity;
S54: the required anatomical feature points of the three-dimensional model are then picked up automatically in sequence, giving the anatomical feature points of a standard three-dimensional model;
S55: the initial feature points are then projected in three-dimensional space along the X, Y and Z directions, the projection points are mapped onto the image to construct the actual feature points, and the displacement of each feature point is calculated by the formula

\(d_i = p_i' - p_i\),

where \(p_i'\) is the actual feature point and \(p_i\) is the initial feature point;
S56: the displacements of the known feature points are then used to construct an interpolation function; using radial basis functions, the interpolation function takes the form

\(f(x) = \sum_{j=1}^{n} \lambda_j\,\varphi(\lVert x - x_j \rVert)\),

and requiring \(f(x_i) = d_i\) at every known feature point yields the system of analytical equations

\(\Phi\lambda = d\), where \(\Phi_{ij} = \varphi(\lVert x_i - x_j \rVert)\),

which is solved as a linear system;
S57: finally, each vertex of the initial model is substituted into the radial basis interpolation function, the displacements of the remaining feature points are calculated, and the new coordinates of the grid vertices in the actual model are obtained, completing the construction of the three-dimensional model.
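As described, S55–S57 amount to a radial-basis-function warp: solve a linear system from the known feature-point displacements, then evaluate the interpolant at every other vertex. A minimal numpy sketch follows; the linear kernel φ(r) = r and the small ridge term are my assumptions, since the patent does not specify the basis function.

```python
import numpy as np

def rbf_warp(initial_pts, actual_pts, query_pts):
    """Move query_pts by the displacement field interpolated from the
    known pairs (initial -> actual), using phi(r) = r as the basis."""
    P = np.asarray(initial_pts, dtype=float)
    D = np.asarray(actual_pts, dtype=float) - P          # d_i = p'_i - p_i
    Phi = np.linalg.norm(P[:, None, :] - P[None, :, :], axis=-1)
    # Solve Phi @ lam = d; tiny ridge keeps the system well conditioned.
    lam = np.linalg.solve(Phi + 1e-9 * np.eye(len(P)), D)
    Q = np.asarray(query_pts, dtype=float)
    Phi_q = np.linalg.norm(Q[:, None, :] - P[None, :, :], axis=-1)
    return Q + Phi_q @ lam          # new coordinates of the grid vertices
```

By construction, evaluating the warp back at the initial feature points reproduces the actual ones (up to the ridge perturbation); any other grid vertex is moved by the interpolated displacement.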
As a further improvement of the technical scheme, in step S6, after the three-dimensional model data and two-dimensional plane grid data generated by the composition module are transmitted through the transmission module to the operation terminal of the chief surgeon, the surgeon can observe the patient's uterine cavity and the position, size and shape of the lesion from every direction through dedicated three-dimensional software at the operation terminal.
As a further improvement of the technical scheme, in step S7, grid lines of the uterine cavity and a dotted outline of the focus are projected onto the patient's abdomen by the laser grid generating device; combined with the three-dimensional model data, the depth of the focus within the uterine cavity and the depth of the uterine cavity within the abdomen are observed, thereby completing the positioning of the patient's focus.
As a further improvement of the technical scheme, the image fusion processing module comprises a preprocessing module, a composition module and a transmission module. The preprocessing module is wirelessly connected with the ultrasonic equipment and the endoscope equipment, and electrically connected with the composition module; the composition module is wirelessly connected with the ultrasonic equipment that collects the patient's uterine cavity size and in-vivo position data, and electrically connected with the transmission module; the transmission module is wirelessly connected with the laser grid generating device and with the operation terminal of the chief surgeon.
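The connection topology just described can be written down as plain data. The node names below are hypothetical labels; the link types (wireless vs. electrical) follow the paragraph above.

```python
# (source, destination) -> connection type, per the system description.
LINKS = {
    ("ultrasound_device",    "preprocessing_module"): "wireless",
    ("endoscope_device",     "preprocessing_module"): "wireless",
    ("ultrasound_device",    "composition_module"):   "wireless",  # cavity size / position feed
    ("preprocessing_module", "composition_module"):   "electrical",
    ("composition_module",   "transmission_module"):  "electrical",
    ("transmission_module",  "laser_grid_device"):    "wireless",
    ("transmission_module",  "surgeon_terminal"):     "wireless",
}

def upstream(node):
    """All modules that feed data directly into `node`."""
    return sorted(src for (src, dst) in LINKS if dst == node)
```

Writing the topology this way makes the double feed into the composition module explicit: it receives preprocessed lesion data over the electrical link and the cavity size/position data directly from the ultrasonic equipment.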
Compared with the prior art, the invention has the beneficial effects that:
1. In this method for assisting in positioning a uterine focus by fusing ultrasonic and endoscopic images, the patient can be examined multiple times with conventional detection devices, yielding patient data in multiple dimensions. These data are processed by the image fusion processing module to produce three-dimensional model data and two-dimensional plane grid data, which greatly assist the doctor's judgment and support the doctor in performing the incision.
2. In this method, the composition module expresses the three-dimensional model data and the two-dimensional plane grid data visually. Through the two-dimensional plane grid data, the position of the patient's focus can be observed more intuitively and accurately during the operation, which makes it easier for the doctor to treat the focus and to cut at the correct position. At the same time, a model is built from the three-dimensional model data; compared with traditional observation, viewing the three-dimensional model lets the doctor observe the position, size and shape of the patient's focus more comprehensively and in finer detail, which greatly facilitates the operation.
3. In this method, the laser grid generating device directly and accurately marks the uterine cavity area and the focus area on the patient's abdomen during the operation, so that the chief surgeon can locate the uterine cavity and the focus more precisely when positioning the incision. Compared with the traditional practice of drawing lines on the abdomen, the positions of the uterine cavity and the focus can be confirmed more accurately; combined with the three-dimensional model data, the depth of the focus within the uterine cavity and the depth of the uterine cavity within the abdomen can be known precisely, so the patient's focus can be positioned accurately during the operation, improving positioning accuracy and practicability.
Drawings
FIG. 1 is a schematic flow chart of the method of the present invention;
FIG. 2 is a schematic diagram of the system structure of the present invention.
Detailed Description
The following description of the embodiments of the present invention will be made clearly and completely with reference to the accompanying drawings, in which it is apparent that the embodiments described are only some embodiments of the present invention, but not all embodiments. All other embodiments, which can be made by those skilled in the art based on the embodiments of the invention without making any inventive effort, are intended to be within the scope of the invention.
Example 1
Referring to FIG. 1, an object of this embodiment is to provide a method for assisting in positioning a uterine focus by fusing ultrasonic and endoscopic images, comprising the following steps:
S1, intrauterine radiography and data acquisition: first, a contrast examination of the patient's uterine cavity is performed by ultrasound in cooperation with an endoscope, and data are collected;
S2, image fusion processing: the position of the patient's uterine cavity is detected by ultrasonic equipment to obtain the uterine cavity size and in-vivo position data of the patient, and the acquired data are transmitted to the composition module in the image fusion processing module;
S3, contrast examination of lesion positions: the lesion position data, morphological data, size data and echo characteristic data obtained from the contrast examination by ultrasound and endoscope are transmitted to the preprocessing module in the image fusion processing module for processing;
S4, preprocessing of lesion data: the preprocessing module in the image fusion processing module preprocesses the lesion position data, morphological data, size data and echo characteristic data, and transmits the preprocessed data to the composition module;
S5, fusion modeling of lesion positions: the composition module fuses the lesion position data, morphological data, size data and echo characteristic data processed by the preprocessing module with the acquired uterine cavity size and in-vivo position data of the patient, and generates three-dimensional model data and two-dimensional plane grid data;
S6, model transmission: the three-dimensional model data and two-dimensional plane grid data of the patient generated by the composition module are transmitted through the transmission module to the laser grid generating device and to the operation terminal of the chief surgeon;
S7, auxiliary treatment: finally, during the surgical treatment of the patient, the doctor observes the three-dimensional model data in combination with the laser grid generating device, thereby completing the positioning of the patient's focus.
In step S1, the contrast examination of the patient's uterine cavity by ultrasound in cooperation with the endoscope specifically comprises the following steps:
S11: first, a routine examination of the patient is performed with a special vaginal probe at a frequency of 5 to 7 MHz;
S12: after the examination is confirmed to be error-free, the probe is withdrawn and sterilized;
S13: a double-lumen contrast tube is then inserted through the patient's cervix, normal saline is injected into the balloon, and the balloon is pulled down to the internal cervical os;
S14: the sterilized special vaginal probe (5 to 7 MHz) is then placed into the patient's vagina again;
S15: at the same time, physiological saline is injected into the uterine cavity to expand and fill it, and the lesion position data, morphological data, size data and echo characteristic data are observed and recorded;
S16: finally, suspicious lesions may be sampled at the doctor's request, and the probe is withdrawn after examination, observation, recording and sampling are completed.
In step S3, the preprocessing module screens the lesion position data, morphological data, size data and echo characteristic data obtained from the ultrasonic and endoscopic contrast examination, retaining only accurate and complete data. The preprocessing module thus reduces the workload of the composition module in the subsequent image fusion process, so that the composition module works more efficiently both when constructing images and when locating the uterine focus, which helps the patient receive timely treatment.
In step S5, the composition module constructs a three-dimensional model of the interior of the patient's uterine cavity, together with the position, shape and size of the focus within it. The generated two-dimensional plane grid data divide the plan view of the patient's uterine cavity and focus position into a number of square cells of equal size, and the specific position of the focus is then marked in the grid view with a dotted outline. Through the two-dimensional plane grid data, the position of the focus can be observed more intuitively and accurately during the operation, making it easier for the doctor to treat the focus and to cut at the correct position.
The composition module builds a three-dimensional model as follows:
S51: first, the collected image data, lesion position data, morphological data, size data and echo characteristic data are read into the composition module;
S52: next, the image data are processed and positioned by the composition module;
S53: the image data are then synthesized and three-dimensional modeling is performed to obtain a complete three-dimensional model of the interior of the uterine cavity;
S54: the required anatomical feature points of the three-dimensional model are then picked up automatically in sequence, giving the anatomical feature points of a standard three-dimensional model;
S55: the initial feature points are then projected in three-dimensional space along the X, Y and Z directions, the projection points are mapped onto the image to construct the actual feature points, and the displacement of each feature point is calculated by the formula

\(d_i = p_i' - p_i\),

where \(p_i'\) is the actual feature point and \(p_i\) is the initial feature point;
S56: the displacements of the known feature points are then used to construct an interpolation function; using radial basis functions, the interpolation function takes the form

\(f(x) = \sum_{j=1}^{n} \lambda_j\,\varphi(\lVert x - x_j \rVert)\),

and requiring \(f(x_i) = d_i\) at every known feature point yields the system of analytical equations

\(\Phi\lambda = d\), where \(\Phi_{ij} = \varphi(\lVert x_i - x_j \rVert)\),

which is solved as a linear system;
S57: finally, each vertex of the initial model is substituted into the radial basis interpolation function, the displacements of the remaining feature points are calculated, and the new coordinates of the grid vertices in the actual model are obtained, completing the construction of the three-dimensional model.
In step S6, after the three-dimensional model data and two-dimensional plane grid data generated by the composition module are transmitted through the transmission module to the operation terminal of the chief surgeon, the surgeon can observe the patient's uterine cavity and the position, size and shape of the lesion from every direction through dedicated three-dimensional software at the operation terminal. Compared with traditional observation, viewing the three-dimensional model lets the surgeon observe the position, size and shape of the focus more comprehensively and in finer detail, which greatly facilitates the operation.
In step S7, grid lines of the uterine cavity and a dotted outline of the focus are projected onto the patient's abdomen by the laser grid generating device; combined with the three-dimensional model data, the depth of the focus within the uterine cavity and the depth of the uterine cavity within the abdomen are observed, completing the positioning of the patient's focus. In use, the laser grid generating device projects the uterine-cavity grid lines and the dotted focus outline directly and accurately onto the patient's abdomen during the operation, so the chief surgeon can confirm the positions of the uterine cavity and the focus more accurately when positioning the incision than with the traditional practice of drawing lines on the abdomen. Combined with the three-dimensional model data, the depth of the focus within the uterine cavity and the depth of the uterine cavity within the abdomen can be known precisely, so the focus can be positioned accurately during the operation, improving positioning accuracy and practicability.
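The two depths read off in S7 are simple differences along the laser projection axis. A sketch follows; the coordinate convention (a single axis with values increasing downward from the abdominal surface) and the function name are my assumptions.

```python
def read_depths(abdomen_surface_z, cavity_wall_z, lesion_z):
    """Depth of the uterine cavity below the abdominal wall, and of the
    lesion below the near cavity wall, all in the same units (e.g. mm).
    Coordinates are measured along the laser projection axis."""
    cavity_depth = cavity_wall_z - abdomen_surface_z
    lesion_depth = lesion_z - cavity_wall_z
    return cavity_depth, lesion_depth
```

For example, with the abdominal surface at 0 mm, the near cavity wall at 35 mm and the lesion at 47 mm along the projection axis, the cavity lies 35 mm deep and the lesion a further 12 mm inside it.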
The specific use process in this embodiment is as follows: first, the uterine cavity of the patient is subjected to contrast examination by ultrasound in combination with an endoscope, and data are acquired. The position of the uterine cavity of the patient is detected by the ultrasonic device to obtain the uterine cavity size and in-vivo position data of the patient, and the acquired data are transmitted to the composition module in the image fusion processing module, so that the focus can be positioned accurately according to the physical factors of different patients during the positioning process. The lesion position data, morphological data, size data and echo characteristic data obtained in the contrast examination by ultrasound and endoscope are then transmitted to the preprocessing module in the image fusion processing module, which screens these data, retains the accurate and complete lesion position data, morphological data, size data and echo characteristic data, and transmits the preprocessed data to the composition module. In use, the composition module constructs a three-dimensional model of the interior of the uterine cavity of the patient together with the position, shape and size of the focus in the uterine cavity; the generated two-dimensional plane grid data divides the plane graph of the uterine cavity and focus position of the patient into a plurality of square grids of the same size, and the specific position of the focus is marked in the grid graph with a dotted line. The transmission module then transmits the three-dimensional model data and the
two-dimensional plane grid data of the patient generated by the composition module to the laser grid generating device and the operation terminal of the attending doctor. During surgical treatment of the patient, the doctor observes the three-dimensional model data through the laser grid generating device, thereby positioning the focus of the patient and completing the method for assisting in positioning a uterine focus by fusing ultrasonic and endoscopic images.
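The two-dimensional plane grid described above divides the plane graph of the uterine cavity into square cells of the same size and marks the cells occupied by the focus. The sketch below illustrates one plausible way to compute that grid; the function name, cell size and coordinates are invented for illustration and are not taken from the patent.

```python
# Illustrative sketch of the two-dimensional plane grid: the plane graph of the
# uterine cavity is divided into equal square cells, and the cells overlapped by
# the focus are flagged so its position can be drawn as a dotted outline.
# All names and sizes are assumptions, not the patent's implementation.

def grid_cells_covering(plane_w, plane_h, cell, lesion_box):
    """Return (row, col) indices of equal square cells overlapped by lesion_box.

    plane_w, plane_h -- size of the uterine-cavity plane graph in pixels
    cell             -- side length of each square grid cell
    lesion_box       -- (x0, y0, x1, y1) bounding box of the focus
    """
    x0, y0, x1, y1 = lesion_box
    cols = plane_w // cell
    rows = plane_h // cell
    covered = []
    for r in range(rows):
        for c in range(cols):
            cx0, cy0 = c * cell, r * cell
            cx1, cy1 = cx0 + cell, cy0 + cell
            # keep the cell if it overlaps the lesion bounding box
            if cx0 < x1 and cx1 > x0 and cy0 < y1 and cy1 > y0:
                covered.append((r, c))
    return covered

# 400x400 plane graph, 100-pixel cells, focus in the upper-left region
print(grid_cells_covering(400, 400, 100, (50, 50, 150, 150)))
```

On this toy input the focus spans the four upper-left cells, so the dotted outline would be drawn across cells (0,0), (0,1), (1,0) and (1,1).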
Referring to fig. 2, a second object of the present embodiment is to provide a system for assisting in positioning a uterine focus by fusing ultrasonic and endoscopic images. The system includes an image fusion processing module comprising a preprocessing module, a composition module and a transmission module. The preprocessing module is wirelessly connected with the ultrasonic device and the endoscopic device and is electrically connected with the composition module; the composition module is wirelessly connected with the ultrasonic device to acquire the uterine cavity size and in-vivo position data of the patient and is electrically connected with the transmission module; and the transmission module is wirelessly connected with the laser grid generating device and the operation terminal of the attending doctor.
The foregoing has shown and described the basic principles, principal features and advantages of the invention. It will be understood by those skilled in the art that the present invention is not limited to the above-described embodiments, and that the above-described embodiments and descriptions are only preferred embodiments of the present invention, and are not intended to limit the invention, and that various changes and modifications may be made therein without departing from the spirit and scope of the invention as claimed. The scope of the invention is defined by the appended claims and equivalents thereof.
Claims (6)
1. A system for assisting in positioning uterine lesions by fusion of ultrasonic and endoscopic images, characterized in that: the system comprises an image fusion processing module, wherein the image fusion processing module comprises a preprocessing module, a composition module and a transmission module; the preprocessing module is wirelessly connected with an ultrasonic device and an endoscopic device; the preprocessing module is electrically connected with the composition module; the composition module is wirelessly connected with the ultrasonic device for acquiring the uterine cavity size and in-vivo position data of a patient; the composition module is electrically connected with the transmission module; and the transmission module is wirelessly connected with a laser grid generating device and an operation terminal of an attending doctor;
the system operates according to the steps comprising:
s1, intrauterine radiography and data acquisition: firstly, performing contrast examination on the uterine cavity of a patient by matching ultrasound with an endoscope, and collecting data; the method comprises the following specific steps:
s11: firstly, performing a routine examination of the patient with a dedicated transvaginal probe operating at 5 to 7 MHz;
s12: secondly, withdrawing the probe after the examination is confirmed to be correct, and sterilizing the probe;
s13: then inserting a double-lumen contrast catheter through the cervix of the patient, injecting normal saline into the balloon, and pulling the balloon down to the internal cervical os;
s14: then placing the sterilized dedicated transvaginal probe operating at 5 to 7 MHz into the vagina of the patient again;
s15: simultaneously injecting normal saline into the uterine cavity to expand and fill it, and observing and recording lesion position data, morphological data, size data and echo characteristic data;
s16: finally, sampling the suspicious lesion as required by the doctor, and withdrawing the probe after the examination, observation, recording and sampling are completed;
s2, image fusion processing: detecting the position of the uterine cavity of the patient through ultrasonic equipment to obtain the size and in-vivo position data of the uterine cavity of the patient, and transmitting the acquired data to a composition module in an image fusion processing module;
s3, contrast examination of lesion positions: transmitting lesion position data, morphological data, size data and echo characteristic data subjected to contrast examination through ultrasound and an endoscope to a preprocessing module in an image fusion processing module for processing;
s4, preprocessing lesion data: the preprocessing module in the image fusion processing module preprocesses the lesion position data, morphological data, size data and echo characteristic data, and transmits the preprocessed data to the composition module;
s5, fusion modeling of lesion positions: the composition module fuses lesion position data, morphological data, size data and echo characteristic data which are processed by the preprocessing module, and acquired uterine cavity size and in-vivo position data of a patient, and generates three-dimensional model data and two-dimensional plane grid data; comprising the following steps:
s51: firstly, reading the collected image data, lesion position data, morphological data, size data and echo characteristic data into a composition module;
s52: processing and positioning the image data through a composition module;
s53: synthesizing the image data, and performing three-dimensional modeling to obtain a complete three-dimensional model of the interior of the uterine cavity;
s54: then automatically picking up the needed anatomical feature points of the three-dimensional model in sequence, thereby obtaining the anatomical feature points of the standard three-dimensional model;
s55: then projecting in three-dimensional space along the X, Y and Z directions according to the initial feature points, mapping the projection points in the X, Y and Z directions onto the image to construct the actual feature points, and calculating the displacement of each feature point by the formula: d_i = p_i' - p_i; wherein p_i' is the actual feature point and p_i is the initial feature point;
s56: and then using the displacements of the plurality of known feature points to construct an interpolation function based on radial basis functions, the interpolation function having the form f(x) = Σ_i λ_i φ(‖x − x_i‖); substituting the known feature points yields the linear system Φλ = d, and solving this linear system of analytical equations gives the coefficients λ_i;
S6, model transmission: transmitting the three-dimensional model data and the two-dimensional plane grid data of the patient generated by the composition module to the laser grid generating device and the operation terminal of the attending doctor through the transmission module;
s7, auxiliary treatment: finally, during the surgical treatment of the patient, the doctor observes the three-dimensional model data through the laser grid generating device, thereby positioning the focus of the patient.
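Steps s55 and s56 together describe a standard radial-basis-function (RBF) scattered-data interpolation: displacements measured at known feature points determine coefficients λ_i through a linear system, after which the interpolant predicts displacement anywhere. The one-dimensional sketch below assumes a Gaussian kernel (the claim does not name a specific basis function); all function names and values are illustrative.

```python
import math

# Minimal RBF-interpolation sketch of steps s55-s56, under assumed notation:
# d_i = p_i' - p_i is the measured displacement at each feature point,
# f(x) = sum_i lambda_i * phi(|x - x_i|) is the interpolant, and the
# coefficients lambda_i come from solving the linear system Phi @ lam = d.

def phi(r, eps=1.0):
    """Gaussian radial basis function (an assumed kernel choice)."""
    return math.exp(-(eps * r) ** 2)

def solve_rbf_1d(xs, displacements, eps=1.0):
    """Solve the n x n system Phi @ lam = d by Gaussian elimination (1-D case)."""
    n = len(xs)
    A = [[phi(abs(xs[i] - xs[j]), eps) for j in range(n)] for i in range(n)]
    d = list(displacements)
    # forward elimination (no pivoting; fine for this small, well-posed system)
    for k in range(n):
        for i in range(k + 1, n):
            f = A[i][k] / A[k][k]
            for j in range(k, n):
                A[i][j] -= f * A[k][j]
            d[i] -= f * d[k]
    # back substitution
    lam = [0.0] * n
    for i in range(n - 1, -1, -1):
        s = sum(A[i][j] * lam[j] for j in range(i + 1, n))
        lam[i] = (d[i] - s) / A[i][i]
    return lam

def interpolate(x, xs, lam, eps=1.0):
    """Evaluate f(x) = sum_i lam_i * phi(|x - x_i|)."""
    return sum(l * phi(abs(x - xi), eps) for l, xi in zip(lam, xs))

xs = [0.0, 1.0, 2.0]    # initial feature point coordinates (toy data)
d = [0.5, -0.2, 0.1]    # measured displacements p_i' - p_i
lam = solve_rbf_1d(xs, d)
# by construction the interpolant reproduces the known displacements exactly
print(round(interpolate(0.0, xs, lam), 6))  # -> 0.5
```

Because the coefficients solve Φλ = d exactly, f(x_j) = d_j at every known feature point, which is what lets the composition module deform the standard model onto the patient-specific one.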
2. The ultrasound and endoscopic image fusion assisted uterine lesion localization system according to claim 1, wherein: in the step S2, the uterine cavity size and in-vivo position data of the patient are collected, so that the focus can be precisely positioned according to the physical factors of different patients in the subsequent positioning process.
3. The ultrasound and endoscopic image fusion assisted uterine lesion localization system according to claim 2, wherein: in the step S3, the preprocessing module screens the lesion position data, morphological data, size data and echo characteristic data obtained in the contrast examination by ultrasound and endoscope, and retains the lesion position data, morphological data, size data and echo characteristic data whose accuracy meets a predetermined requirement.
4. A system for assisting in locating uterine lesions by ultrasound and endoscopic image fusion according to claim 3, wherein: in the step S5, the composition module constructs a three-dimensional model of the interior of the uterine cavity of the patient, the three-dimensional model comprising the position, shape and size of the focus in the uterine cavity; the generated two-dimensional plane grid data divides the plane graph of the uterine cavity and focus position of the patient into a plurality of square grids of the same size, and the specific position of the focus of the patient is then marked with a dotted line.
5. The ultrasound and endoscopic image fusion assisted uterine lesion localization system according to claim 4, wherein: in the step S6, after the three-dimensional model data and the two-dimensional plane grid data generated by the composition module are transmitted to the operation terminal of the attending doctor through the transmission module, the attending doctor can observe the uterine cavity, lesion position, size and morphology of the patient from all directions through dedicated three-dimensional software at the operation terminal.
6. The ultrasound and endoscopic image fusion assisted uterine lesion localization system of claim 5, wherein: in the step S7, grid lines of the uterine cavity and a dotted outline of the focus are projected onto the abdomen of the patient by the laser grid generating device, and the depth of the focus within the uterine cavity and the depth of the uterine cavity within the abdomen of the patient are read from the three-dimensional model data, thereby completing the positioning of the focus of the patient.
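Taken together, claims 1 to 6 fix a single data flow between the three modules: contrast-examination data are screened by the preprocessing module, fused with the cavity geometry by the composition module, and fanned out by the transmission module to the laser grid device and the operation terminal. The sketch below is a schematic of that flow; every function name and all toy values are invented for illustration and are not the patented implementation.

```python
# Schematic sketch of the claimed module chain (invented names, toy data):
# preprocessing -> composition -> transmission, matching steps S2-S6.

def preprocessing_module(contrast_data):
    """S3/S4: keep only complete lesion records (position, morphology, size, echo)."""
    return [d for d in contrast_data if None not in d.values()]

def composition_module(cavity_data, lesion_data):
    """S5: fuse cavity geometry with the screened lesion data into model outputs."""
    return {"model_3d": {"cavity": cavity_data, "lesions": lesion_data},
            "grid_2d": {"cell": 1.0,
                        "lesions": [d["position"] for d in lesion_data]}}

def transmission_module(fused):
    """S6: address the outputs to the two claimed receivers."""
    return {"laser_grid_device": fused, "operation_terminal": fused}

cavity = {"size": (7.0, 4.5, 3.0), "position": (0.0, 0.0, 5.0)}
lesions = [{"position": (1.2, 0.4, 5.8), "morphology": "polypoid",
            "size": 1.1, "echo": "hyper"},
           {"position": (2.0, 1.0, 6.0), "morphology": None,   # incomplete record
            "size": 0.6, "echo": "hypo"}]

out = transmission_module(composition_module(cavity, preprocessing_module(lesions)))
print(len(out["operation_terminal"]["model_3d"]["lesions"]))  # -> 1
```

The incomplete second record is dropped at the preprocessing stage, so both receivers see the same single screened lesion, mirroring how claim 3's screening feeds claim 4's modeling and claim 5's display.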
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202310954717.9A CN116650022B (en) | 2023-08-01 | 2023-08-01 | Method and system for assisting in positioning uterine focus by fusion of ultrasonic and endoscopic images |
Publications (2)
Publication Number | Publication Date |
---|---|
CN116650022A CN116650022A (en) | 2023-08-29 |
CN116650022B true CN116650022B (en) | 2023-11-24 |
Family
ID=87715778
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202310954717.9A Active CN116650022B (en) | 2023-08-01 | 2023-08-01 | Method and system for assisting in positioning uterine focus by fusion of ultrasonic and endoscopic images |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN116650022B (en) |
Citations (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2002017729A (en) * | 2000-07-11 | 2002-01-22 | Toshiba Corp | Endoscope ultrasonograph |
CN101216953A (en) * | 2008-01-04 | 2008-07-09 | 西北工业大学 | Skull 3D model construction method |
JP2010088699A (en) * | 2008-10-09 | 2010-04-22 | National Center For Child Health & Development | Medical image processing system |
CN106264618A (en) * | 2016-08-30 | 2017-01-04 | 冯庆宇 | A kind of uterine ultrasound ripple endoscopic system |
CN110023883A (en) * | 2016-10-31 | 2019-07-16 | 医达科技公司 | Method and system for interactive gridding placement and measurement that lesion removes |
CN115886999A (en) * | 2022-11-11 | 2023-04-04 | 中国科学院大学宁波华美医院 | Operation guiding method, device and control system based on simulation virtual technology |
CN115953377A (en) * | 2022-12-28 | 2023-04-11 | 中国科学院苏州生物医学工程技术研究所 | Digestive tract ultrasonic endoscope image fusion method and system |
CN116370077A (en) * | 2023-03-01 | 2023-07-04 | 深圳微美机器人有限公司 | Navigation method and device of ultrasonic endoscope probe, computer equipment and storage medium |
CN116468727A (en) * | 2023-06-19 | 2023-07-21 | 湖南科迈森医疗科技有限公司 | Method and system for assisting in judging high-risk endometrial hyperplasia based on endoscopic image recognition |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||