CN102999902A - Optical navigation positioning system based on CT (computed tomography) registration results and navigation method thereof - Google Patents
- Publication number
- CN102999902A
- Authority
- CN
- China
- Legal status: Granted
Landscapes
- Apparatus For Radiation Diagnosis (AREA)
- Ultrasonic Diagnosis Equipment (AREA)
Abstract
Disclosed are an optical navigation positioning system based on CT (computed tomography) registration results and a navigation method thereof. The system comprises a preoperative CT image import module, an image segmentation module, a body-surface initial registration module, a module for registering the preoperative CT image with the intraoperative two-dimensional ultrasound image, and an intraoperative navigation module. By combining virtual reality with intraoperative ultrasound, intraoperative positioning errors caused by factors such as respiration are compensated, so that the target point of coronary artery bypass grafting is accurately positioned and navigated. The heart and coronary vessel tree in the preoperative cardiac CT image data are manually segmented and reconstructed, and an augmented virtual-reality environment fusing the real endoscope view with a virtual endoscope view is built with the help of the optical navigation apparatus and CT-ultrasound-based intraoperative registration error correction, thereby achieving accurate positioning and navigation of the target point of coronary artery bypass grafting.
Description
Technical field
The present invention relates to a system and method in the field of information processing, and more particularly to a navigation positioning system, and a navigation method thereof, based on the registration of intraoperative ultrasound to preoperative CT for assisting coronary artery bypass surgery.
Background art
The incidence of cardiovascular disease in China has risen year by year in recent years, and coronary heart disease is the most common cardiovascular disease. Among the treatments for coronary heart disease, coronary artery bypass grafting is at present the main mature therapy apart from drug therapy and interventional therapy. However, traditional coronary artery bypass surgery requires a mid-sternal incision and, when necessary, supporting procedures such as extracorporeal circulation; its shortcomings include a large incision, slow recovery, and frequent complications. Navigation-assisted minimally invasive coronary bypass surgery, as a novel therapeutic mode, requires only several finger-width incisions in the chest wall and can complete the operation with special surgical instruments, meeting requirements such as a cosmetically acceptable incision, little trauma, fast recovery, and few complications.
One of the main surgical difficulties of navigation-assisted minimally invasive coronary artery bypass grafting is how to localize the target point quickly and accurately; improper or wrong localization directly affects the success and long-term outcome of the operation. At present, localization in minimally invasive coronary bypass surgery relies mainly on preoperative images, fails to use real-time information reflecting the true state of the operative region, and cannot resolve intraoperative errors caused by factors such as respiration, skin-marker displacement and posture change, so the localization results are unsatisfactory.
A search of the prior art finds that in "3D-image guidance for minimally invasive robotic coronary artery bypass" (Heart Surg Forum, 2000), T.M. Peters first attempted to apply three-dimensional images to preoperative planning of cardiac surgery. By segmenting CT images acquired from patients before the operation, he obtained surface models of the heart and skeleton and built a preliminary surgical guidance system for minimally invasive bypass surgery. The system also provides a virtual endoscope and uses preoperative registration so that the relative positions of the heart model and the virtual endoscope in the system are consistent with those of the real endoscope and the skeleton in reality. However, this system is a prototype with many deficiencies, such as the lack of effective fusion between the real 2D endoscopic image and the virtual 3D scene.
In "Flexible calibration of actuated stereoscopic endoscope for overlay in robot assisted surgery" (MICCAI 2002, LNCS 2488, T. Dohi and R. Kikinis (eds.), pp. 25-34), Mourgues and Coste-Manière attempted to overlay, with a certain transparency, a three-dimensional model of an animal heart and of its coronary arteries onto the endoscopic image as a virtual endoscope, which helped localize the target point intraoperatively. However, the coronary-tree model is static during the operation, and accurately fusing the three-dimensional coronary-tree model with the endoscopic image depends on identifying landmark points observed in the endoscope, so the robustness and accuracy of this method are far from ideal.
Chinese patent CN1650813 describes "a robotic surgery localization method based on optical positioning for a surgical navigation system". The method first places three fiducials on the robot base, selects these three fiducials with the pointer of an optical tracker, has the robot probe and the optical-tracker pointer acquire the same spatial coordinate system simultaneously while docked, and finally exchanges coordinates between the optical tracker and the robot system. This technique is mainly used in neurosurgery, where anatomical deformation is small and accuracy is easier to guarantee; it cannot be applied to the cardiovascular field, where anatomical deformation is large.
Chinese patent CN101703409A describes "a system and method for ultrasound-guided robot-assisted treatment". The system comprises a surgical robot, a two-dimensional ultrasound scanner, a magnetic localizer and a workstation; by combining them, it achieves effective detection and automated treatment of the affected area, reducing the surgeon's workload and improving the accuracy of the operation. However, the method uses a magnetic localizer, which is easily disturbed by the environment, instruments and other devices and is therefore prone to error. Moreover, three-dimensionally reconstructed ultrasound suffers from image deformation, so guiding the operation with it may introduce deviations; the amount of information it produces is limited and insufficient for delicate operations such as precise cardiac surgery.
It can be seen that, although robot-assisted treatment systems have begun to be applied clinically, their development in surgical target localization is still imperfect and many clinical problems remain to be solved.
Summary of the invention
Aiming at the above shortcomings of the prior art, the present invention proposes an optical navigation positioning system based on CT registration results and a navigation method thereof. By combining virtual reality with intraoperative ultrasound, the intraoperative positioning error caused by factors such as respiration is compensated, so that the target point of off-pump coronary artery bypass grafting is accurately positioned and navigated. The heart and coronary vessel tree in the preoperative CT image data are manually segmented and reconstructed; then, with the help of the optical navigation instrument and intraoperative registration error correction based on the preoperative CT image and the two-dimensional ultrasound image, an augmented virtual-reality environment fusing the real endoscope with a virtual endoscope is built, achieving accurate positioning and navigation of the target point of off-pump coronary artery bypass grafting.
The present invention is achieved by the following technical solutions:
The present invention relates to an optical navigation positioning system based on CT registration results, comprising: a preoperative CT image import module, an image segmentation module, a body-surface initial registration module, a preoperative-CT/intraoperative-two-dimensional-ultrasound registration module, and an intraoperative navigation module. The preoperative CT image import module receives the DICOM-format image files obtained by preoperative imaging examination, generates the preoperative image package and outputs it to the image segmentation module and the body-surface initial registration module. The preoperative-CT/intraoperative-ultrasound registration module is connected to the image segmentation module and the body-surface initial registration module, from which it receives the three-dimensional dynamic coronary tree with the surgical target point and the body-surface initial registration result, and outputs the preoperative CT transition matrices to the intraoperative navigation module, which outputs accurate surgical navigation information.
The preoperative image package comprises: the preoperative CT images of the target area, a number of DICOM-format slices covering one or several cardiac cycles, and the three-dimensional stereoscopic image reconstructed from the DICOM data, wherein the preoperative CT data of one cardiac cycle consist of preoperative images at several phases.
The three-dimensional dynamic coronary tree with the surgical target point comprises the dynamic heart model and the vessel-tree model generated by the image segmentation module.
The preoperative image import module comprises: a DICOM image reading unit and a preoperative image three-dimensional rendering unit. The DICOM image reading unit imports the series of DICOM-format image files of one cardiac cycle of the heart obtained by preoperative imaging examination, parses the DICOM data and transfers them to the preoperative image three-dimensional rendering unit, which reconstructs the DICOM data into a three-dimensional stereoscopic image, merges it into the preoperative image package, and outputs the package to the image segmentation module and the body-surface initial registration module.
The image segmentation module comprises: a heart segmentation unit and a vessel-tree segmentation unit. The heart segmentation unit manually outlines, frame by frame and based on clinical experience, the heart contour in the cardiac CT images of one cardiac cycle in the preoperative image package, and reconstructs the dynamic heart model from the resulting segmentation. The vessel-tree segmentation unit manually outlines, frame by frame and based on clinical experience, the vessel-tree contour in the vessel-tree CT images of the cycle and manually marks the surgical target point according to clinical demand; the vessel-tree model reconstructed from this segmentation is output, together with the dynamic heart model, to the preoperative-CT/intraoperative-ultrasound registration module as the three-dimensional dynamic coronary tree with the surgical target point.
The body-surface initial registration module comprises: a fiducial selection unit and a transform-matrix calculation unit. The fiducial selection unit selects the fiducial points placed on the body surface of the surgical subject and obtains their coordinates both in the image coordinate system of the preoperative CT data and in the spatial coordinate system of the real surgical subject. From these image coordinates and spatial coordinates, the transform-matrix calculation unit applies a rigid registration algorithm to register the two coordinate systems, i.e. the image space and the surgical-subject space, computes the initial transition matrix from the two coordinate sets as the body-surface initial registration result, and outputs it to the preoperative-CT/intraoperative-ultrasound registration module.
The two-dimensional ultrasonic image registration module comprises in the front CT image of described art and the art: two-dimensional ultrasonic image reads in the unit in the art, CT image registration unit and ECG (electrocardiogram before two-dimensional ultrasonic image and the art, cardiogram) reads in the unit, wherein: two-dimensional ultrasonic image reads in the two-dimensional ultrasound input picture of unit collection and the current heartbeat phase place that cardiogram reads in the unit collection in the front CT image registration unit reception of two-dimensional ultrasonic image and the art art, to mate with corresponding several phase places and current heartbeat phase place in the Three-Dimensional Dynamic coronary artery tree of surgical target point by the input ECG signal, calculate transition matrix between the two and export navigation module in the art to, wherein the heart of each transition matrix phase place in corresponding cardiac cycle and set with the three-dimensional coronary artery of surgical target point.
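The phase matching described above can be sketched as a small lookup from the ECG position within the cardiac cycle to the index of the nearest preoperative CT phase. This is an illustrative sketch, not the patent's implementation; the function name and the fractional-cycle interface are assumptions:

```python
def match_phase(t_in_cycle, cycle_length, n_phases):
    """Map the current position within the cardiac cycle (taken from the
    ECG reading unit) to the index of the nearest preoperative CT phase."""
    frac = (t_in_cycle % cycle_length) / cycle_length  # 0.0 .. 1.0
    return round(frac * n_phases) % n_phases           # wrap phase N back to 0
```

The modulo on the return value makes the end of the cycle wrap back to phase 0, since the CT phases sample one periodic heartbeat.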
The intraoperative navigation module comprises: a transition-matrix selection unit and a navigation display unit. The transition-matrix selection unit selects the transition matrix corresponding to the current heartbeat phase and passes it to the navigation display unit, which computes and visually displays the distance and relative position between the current virtual instrument and the actual target point, thereby guiding the operation to accurate completion.
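The quantity the navigation display unit shows, the distance and relative position between the virtual instrument and the target point, amounts to mapping both points through the phase-matched transform and taking their difference. A minimal sketch, assuming a 4x4 homogeneous matrix representation (the names are illustrative, not the patent's API):

```python
import numpy as np

def instrument_to_target(T_phase, tip_img, target_img):
    """Map the instrument tip and the surgical target point from image
    space into tracker space with the phase-matched homogeneous
    transform, then return the distance and offset vector to display."""
    def to_world(p):
        ph = np.append(np.asarray(p, dtype=float), 1.0)  # homogeneous point
        return (T_phase @ ph)[:3]
    tip_w, tgt_w = to_world(tip_img), to_world(target_img)
    offset = tgt_w - tip_w
    return float(np.linalg.norm(offset)), offset
```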
The present invention also relates to the navigation method of the above system, comprising the following steps:
Step 1: after the heart segmentation unit in the image segmentation module obtains a series of CT image data (DICOM format) of one cardiac cycle of the target area through the preoperative CT image import module, it manually outlines the heart contour frame by frame in the cardiac CT images of that cycle based on clinical experience, obtains the segmentation result, and reconstructs the dynamic heart model from it.
Step 2: the vessel-tree segmentation unit in the image segmentation module manually outlines, frame by frame and based on clinical experience, the vessel-tree contour in the vessel-tree CT images of the cycle in the preoperative image package, marks the surgical target point manually according to clinical demand, obtains the segmentation result, and reconstructs the vessel-tree model from it, thereby constructing a three-dimensional virtual scene containing the dynamic vessel-tree model; this virtual scene constitutes the virtual endoscopic image.
Step 3: the body-surface initial registration module uses body-surface registration to obtain the transition matrix between the preoperative CT image coordinate system and the coordinate system of the surgical subject's real-time two-dimensional ultrasound image, i.e. a one-to-one mapping between feature points expressed in the two spatial coordinate systems, finally establishing the correspondence between the preoperative CT image and the subject's real-time two-dimensional ultrasound image. Specifically, several fiducial points are chosen in the preoperative CT image, the corresponding points are found on the subject in real space, their real-space coordinates are obtained with the optical navigation instrument, and from the position coordinates of the two matched point sets the transition matrix T between the two spaces is solved.
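The patent does not spell out how T is solved from the two groups of matched fiducial coordinates; a common closed-form least-squares construction for such paired-point rigid registration (the Kabsch/Umeyama method, via SVD of the cross-covariance matrix) would look like this sketch:

```python
import numpy as np

def paired_point_rigid_registration(image_pts, world_pts):
    """Least-squares rigid transform (rotation + translation, no scaling)
    mapping image-space fiducials onto their world-space counterparts.
    Returns a 4x4 homogeneous matrix."""
    P = np.asarray(image_pts, dtype=float)
    Q = np.asarray(world_pts, dtype=float)
    cp, cq = P.mean(axis=0), Q.mean(axis=0)
    H = (P - cp).T @ (Q - cq)                  # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    sign = np.sign(np.linalg.det(Vt.T @ U.T))  # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, sign]) @ U.T # proper rotation, det = +1
    t = cq - R @ cp
    T = np.eye(4)
    T[:3, :3], T[:3, 3] = R, t
    return T
```

At least three non-collinear fiducials are needed for a unique solution; the 6-8 evenly distributed markers mentioned later in the patent comfortably over-determine it.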
Step 4: the input ECG signal is used to match the current heartbeat phase against the corresponding phases of the three-dimensional dynamic coronary tree with the surgical target point, taking the above transition matrix T as the initial transform. Let Ti (i = 1, 2, ..., N) be the transition matrix obtained, on the basis of T, after correcting the preoperative CT image of phase i in the image package, where N is the number of CT phases in the preoperative CT data of one cardiac cycle; then for the CT image of phase i, Ti x T further corrects the registration error between that phase's CT image and the subject's real-time two-dimensional ultrasound image, yielding one group of transition matrices that correct T. The concrete steps comprise:
4.1) A series of real-time images of the surgical subject's heart is acquired with an ultrasound probe; each acquired real-time image corresponds to one preoperative CT image, and the preoperative CT image of each phase corresponds to a series of two-dimensional ultrasound images. For the preoperative CT data of phase i, the surface contour of the heart wall is extracted.
4.2) The heart-wall contour is extracted from each two-dimensional ultrasound image corresponding to phase i; the inner walls extracted from this series of ultrasound images form one point set, from which feature points are taken, and the heart-wall surface contour extracted from the preoperative CT image forms the other point set.
4.3) The point set from the two-dimensional ultrasound images is registered to the point set from the preoperative CT image by the iterative closest point (ICP) algorithm, giving the transition matrix Ti between the two point sets, which serves as the initial matrix for the subsequent fine registration.
Step 5: the real endoscopic image and the virtual endoscopic image are fused by means of external optical positioning.
External optical positioning here means using an NDI optical localizer for infrared-reflective tracking, thereby achieving three-dimensional localization of the object under study.
The fusion means: the two scenes are blended with different transparencies to form an augmented virtual-reality environment, so that the virtual scene containing the beating-heart model seen by the virtual endoscope is consistent with the real scene seen by the real endoscope. At the same time, the transition matrix Ti corresponding to the current heartbeat phase is selected from the group of transition matrices obtained in step 4.3, establishing the mapping between image space and real space; this mapping is finally used to display in real time the distance and relative position between the instrument and the target point, thereby guiding the operation to accurate completion.
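Reading steps 3-5 together, each per-phase correction amounts to composing the ICP result Ti with the body-surface matrix T (Ti x T) and selecting by the current heartbeat phase. A minimal sketch, under the assumption that transforms are stored as 4x4 homogeneous matrices indexed by phase:

```python
import numpy as np

def translation(tx, ty, tz):
    """Helper: 4x4 homogeneous translation matrix."""
    M = np.eye(4)
    M[:3, 3] = [tx, ty, tz]
    return M

def corrected_transform(T, phase_corrections, phase):
    """Select the ICP correction Ti for the current heartbeat phase and
    compose it with the initial body-surface matrix T, i.e. Ti x T."""
    return phase_corrections[phase] @ T

def map_point(M, p):
    """Apply a 4x4 homogeneous transform to a 3-D point."""
    return (M @ np.append(np.asarray(p, dtype=float), 1.0))[:3]
```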
Technical effect
Advantages of the present invention include: 1. intraoperative ultrasound is introduced into the navigation of robot-assisted coronary bypass surgery for the first time, and the error caused by factors such as respiration is corrected through registration of the intraoperative two-dimensional ultrasound image with the preoperative CT image; 2. the long-standing clinical problem of locating the grafting target point uncertainly by personal experience is solved.
Description of drawings
Fig. 1 is a schematic diagram of the module structure of the present invention.
Fig. 2 is a schematic diagram of several three-dimensional heart views of the embodiment (containing the coronary tree and the corresponding ECG positions).
Fig. 3 is a schematic diagram of the optical navigation instrument and the registration module.
Fig. 4 is a schematic flow chart of the embodiment.
Fig. 5 is a schematic fusion diagram of the embodiment;
in the figure: a is the real endoscopic image, b is the virtual endoscopic image, and c is the image obtained by superimposing the two.
Embodiment
An embodiment of the present invention is elaborated below. The embodiment is implemented on the premise of the technical solution of the present invention, and detailed implementation and concrete operating procedures are given, but the protection scope of the present invention is not limited to the following embodiment.
Embodiment 1
As shown in Fig. 1, the present embodiment comprises: a preoperative CT image import module, an image segmentation module, a body-surface initial registration module, a preoperative-CT/intraoperative-two-dimensional-ultrasound registration module, and an intraoperative navigation module. The preoperative CT image import module receives the DICOM-format image files obtained by preoperative imaging examination, generates the preoperative image package and outputs it to the image segmentation module and the body-surface initial registration module. The preoperative-CT/intraoperative-ultrasound registration module is connected to the image segmentation module and the body-surface initial registration module, from which it receives the three-dimensional dynamic coronary tree with the surgical target point and the body-surface initial registration result, and outputs the preoperative CT transition matrices to the intraoperative navigation module, which outputs accurate surgical navigation information.
The preoperative image package comprises: the preoperative CT images of the target area, a number of DICOM-format slices covering one or several cardiac cycles, and the three-dimensional stereoscopic image reconstructed from the DICOM data.
The three-dimensional dynamic coronary tree with the surgical target point comprises the dynamic heart model and the vessel-tree model generated by the image segmentation module.
The preoperative CT image import module comprises: a DICOM image reading unit and a preoperative image three-dimensional rendering unit. The DICOM image reading unit imports the series of DICOM-format image files of one cardiac cycle of the heart obtained by preoperative imaging examination, parses the DICOM data and transfers them to the preoperative image three-dimensional rendering unit, which reconstructs the DICOM data into a three-dimensional stereoscopic image, merges it into the preoperative image package, and outputs the package to the image segmentation module and the body-surface initial registration module.
The image segmentation module comprises: a heart segmentation unit and a vessel-tree segmentation unit. The heart segmentation unit manually outlines, frame by frame and based on clinical experience, the heart contour in the cardiac CT images of one cardiac cycle in the preoperative image package, and reconstructs the dynamic heart model from the resulting segmentation. The vessel-tree segmentation unit manually outlines, frame by frame and based on clinical experience, the vessel-tree contour in the vessel-tree CT images of the cycle and manually marks the surgical target point according to clinical demand; the vessel-tree model reconstructed from this segmentation is output, together with the dynamic heart model, to the preoperative-CT/intraoperative-ultrasound registration module as the three-dimensional dynamic coronary tree with the surgical target point shown in Fig. 2.
The body-surface initial registration module comprises: a fiducial selection unit and a transform-matrix calculation unit. The fiducial selection unit selects the fiducial points placed on the body surface of the surgical subject and obtains their coordinates both in the image coordinate system of the CT data and in the spatial coordinate system. From these image coordinates and spatial coordinates, the transform-matrix calculation unit applies a rigid registration algorithm to register the two coordinate systems, i.e. the image space and the surgical-subject space, computes the initial transition matrix from the two coordinate sets as the body-surface initial registration result, and outputs it to the preoperative-CT/intraoperative-ultrasound registration module.
The fiducial points mean: when the preoperative CT data are acquired, 6-8 metal marker points are stuck on the body surface of the surgical subject, evenly distributed on the thoracic surface; the metal markers appear highlighted in the preoperative CT data.
The spatial coordinates are specifically obtained by providing an NDI Spectra optical navigation instrument in the fiducial selection unit.
The preoperative-CT/intraoperative-ultrasound registration module comprises: an intraoperative two-dimensional ultrasound reading unit, a preoperative-CT/ultrasound registration unit, and an ECG (electrocardiogram) reading unit. The registration unit receives the two-dimensional ultrasound input images acquired by the ultrasound reading unit and the current heartbeat phase acquired by the ECG reading unit; the input ECG signal is used to match the current heartbeat phase against the corresponding phases of the three-dimensional dynamic coronary tree with the surgical target point, and the preoperative CT transition matrices between the two are computed and output to the intraoperative navigation module, each group of preoperative CT transition matrices corresponding to one current heartbeat phase and to the corresponding phase of the three-dimensional dynamic coronary tree with the surgical target point.
The intraoperative navigation module comprises: a transition-matrix selection unit and a navigation display unit. The transition-matrix selection unit selects, from the preoperative CT transition matrices, the group corresponding to the current heartbeat phase and passes it to the navigation display unit, which computes and visually displays the distance and relative position between the current virtual instrument and the actual target point, thereby guiding the operation to accurate completion.
The navigation method comprises the following steps:
Step 1: after the heart segmentation unit in the image segmentation module obtains a series of CT image data (DICOM format) of one cardiac cycle of the target area through the preoperative CT image import module, it manually outlines the heart contour frame by frame in the cardiac CT images of that cycle based on clinical experience, obtains the segmentation result, and reconstructs the dynamic heart model from it.
Step 2: the vessel-tree segmentation unit in the image segmentation module manually outlines, frame by frame and based on clinical experience, the vessel-tree contour in the vessel-tree CT images of the cycle in the preoperative image package, marks the surgical target point manually according to clinical demand, obtains the segmentation result, and reconstructs the vessel-tree model from it, thereby constructing a three-dimensional virtual scene containing the dynamic vessel-tree model; this virtual scene constitutes the virtual endoscopic image.
Step 3: the body-surface initial registration module uses body-surface registration to obtain the transition matrix between the image coordinate system of the preoperative CT data and the spatial coordinate system of the real surgical subject, i.e. a one-to-one mapping between feature points expressed in the two spatial coordinate systems, finally establishing the correspondence between the preoperative CT image and the subject's real-time two-dimensional ultrasound image. Specifically, several fiducial points are chosen in the preoperative CT image, the corresponding points are found on the subject in real space, their real-space coordinates are obtained with the optical navigation instrument, and from the position coordinates of the two matched point sets the transition matrix between the two spaces is solved.
The body-surface registration adopts a rigid registration algorithm: in two-dimensional space, a point (x1, y1) is mapped by the rigid-body transform to the point (x2, y2) according to

    x2 = x1 cos θ - y1 sin θ + tx
    y2 = x1 sin θ + y1 cos θ + ty

where θ is the rotation angle and (tx, ty)^T is the translation. The transition matrix T satisfies [X2] = T[X1], where X1 and X2 are the coordinates of a given point in the surgical subject's two-dimensional ultrasound image space and in the preoperative CT image data space, respectively.
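In matrix form, the 2-D rigid transform above is a 3x3 homogeneous matrix; the following sketch merely restates the formula in code:

```python
import math

def rigid_transform_2d(theta, tx, ty):
    """Homogeneous 3x3 matrix of the 2-D rigid transform:
    rotation by theta followed by translation (tx, ty)."""
    c, s = math.cos(theta), math.sin(theta)
    return [[c, -s, tx],
            [s,  c, ty],
            [0.0, 0.0, 1.0]]

def transform_point(T, x, y):
    """Apply T to (x, y): x2 = x cos(theta) - y sin(theta) + tx, and
    y2 = x sin(theta) + y cos(theta) + ty."""
    return (T[0][0] * x + T[0][1] * y + T[0][2],
            T[1][0] * x + T[1][1] * y + T[1][2])
```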
Step 4: since the preoperative CT data of one cardiac cycle consist of preoperative images at several phases, the input ECG signal is used to match the current heartbeat phase against the corresponding phases of the three-dimensional dynamic coronary tree with the surgical target point. The concrete steps comprise:
4.1) by ultrasonic probe heart is gathered the surgical object realtime graphic of a series of surgical objects, the surgical object realtime graphic that collects of each width of cloth CT image before corresponding some corresponding arts all wherein, before the art of each phase place the CT image corresponding a series of two-dimensional ultrasonic image, for CT view data before the art of phase place i, extract the surface profile of wall of the heart, namely for CT image i before the art of a phase place, Ti * T can further proofread and correct the registration error between this phase place CT image and the surgical object Real-time Two-dimensional ultrasonoscopy, wherein N is the phase place number of the front CT image of art in the front CT view data of cardiac cycle art, and idiographic flow is:
A calibrated ultrasound probe acquires two-dimensional ultrasound images during the operation; T transforms the two-dimensional ultrasound image coordinate system into the preoperative CT image data coordinate system, fusing the ultrasound images with the preoperative CT images. Let Ti (i = 1, 2, …, N) be the correction matrix, built on the basis of the transformation matrix T, for the preoperative CT image of each phase in the image package; that is, for the preoperative CT image of phase i and its corresponding Ti in the group of intraoperative transformation matrices, Ti × T further corrects the registration error between the CT image of this phase and the real-time two-dimensional ultrasound image of the surgical object. The concrete flow is: the calibrated ultrasound probe acquires a series of two-dimensional ultrasound images of the heart; thanks to the ECG signal, each acquired two-dimensional ultrasound image corresponds to a preoperative CT image of some phase, so that afterwards the preoperative CT image of each phase corresponds to a series of two-dimensional ultrasound images. For the preoperative CT data of phase i, the surface contour of the heart wall is extracted; then the heart-wall contour is extracted from each two-dimensional ultrasound image corresponding to this phase. The inner-wall contours extracted from this series of two-dimensional ultrasound images form one point set, and the heart-wall surface contour extracted from the preoperative CT image forms the other. The iterative closest point (ICP) algorithm registers the point set from the two-dimensional ultrasound images onto the point set from the preoperative CT image, yielding a new transformation matrix Ti, and Ti × T improves the precision further on the basis of T.
The calibrated ultrasound probe means: to integrate the real-time two-dimensional ultrasound images into the navigation system, the transformation matrix from the two-dimensional ultrasound image coordinate system to the navigator coordinate system must be found. Let TM_td←ui be the transformation matrix from the two-dimensional ultrasound image coordinate system to the coordinate system of the tracking device fixed on the ultrasound probe (optical reflective balls or an electromagnetic sensor), and TM_w←td be the transformation matrix from the tracking-device coordinate system on the probe to the world coordinate system (the navigator coordinate system). The coordinates of a point in the two-dimensional ultrasound image are then transformed into world coordinates by

    (x_w, y_w, z_w, 1)^T = TM_w←td · TM_td←ui · (s_x·u, s_y·v, 0, 1)^T

where (u, v) are the point's coordinates in the two-dimensional ultrasound image coordinate system, (s_x, s_y) are the scale factors along the x and y axes, and (x_w, y_w, z_w) are its coordinates in the world coordinate system. Calibrating the ultrasound probe means solving exactly for the transformation matrix TM_td←ui from the two-dimensional ultrasound image coordinate system to the coordinate system of the tracking device fixed on the probe.
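The chained transformation that this calibration enables, mapping an ultrasound pixel to world (navigator) coordinates, might look like the following NumPy sketch. Names such as `TM_td_ui` follow the text's subscripts; in practice the matrices would come from probe calibration and from the live navigator reading.

```python
import numpy as np

def ultrasound_pixel_to_world(u, v, s, TM_td_ui, TM_w_td):
    """Map a 2-D ultrasound pixel (u, v) to world (navigator) coordinates.

    s         : (s_x, s_y) pixel-to-mm scale factors of the image axes
    TM_td_ui  : 4x4 calibration matrix, image -> probe tracking device
    TM_w_td   : 4x4 tracked pose, tracking device -> world, from the navigator
    """
    p_img = np.array([s[0] * u, s[1] * v, 0.0, 1.0])  # image plane lies at z = 0
    p_w = TM_w_td @ TM_td_ui @ p_img                  # chain the two transforms
    return p_w[:3]
```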
After the above steps are finished, the iterative closest point (ICP) algorithm registers the point set on the two-dimensional ultrasound images onto the point set on the preoperative CT image and obtains the transformation matrix Ti between the two point sets, which serves as the initial matrix for the subsequent fine registration. The concrete steps comprise: suppose P and Q are the two point sets to be registered, with p_i and q_i the points of the two sets, i = 1, …, n; the key of the registration problem is to solve for the rotation R and translation T that minimize

    E(R, T) = Σ_{i=1..n} || q_i − (R·p_i + T) ||²
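A minimal ICP loop matching this description, pairing a brute-force closest-point search with a closed-form rigid solve for the R and T just described, could be sketched as follows. The function name and parameters are illustrative; practical implementations add outlier rejection and residual-based convergence checks.

```python
import numpy as np

def icp(P, Q, iters=50, tol=1e-10):
    """Minimal iterative closest point: register moving set P (n,3) onto
    fixed set Q (m,3). Returns a 4x4 homogeneous matrix Ti such that
    applying Ti to P brings it onto Q."""
    T = np.eye(4)
    src = P.astype(float).copy()
    prev_err = np.inf
    for _ in range(iters):
        # closest-point correspondences (brute force, for clarity)
        d2 = ((src[:, None, :] - Q[None, :, :]) ** 2).sum(axis=2)
        idx = d2.argmin(axis=1)
        tgt = Q[idx]
        # closed-form rigid step (Kabsch) for these correspondences
        cP, cQ = src.mean(axis=0), tgt.mean(axis=0)
        H = (src - cP).T @ (tgt - cQ)
        U, _, Vt = np.linalg.svd(H)
        D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
        R = Vt.T @ D @ U.T
        t = cQ - R @ cP
        src = src @ R.T + t
        step = np.eye(4)
        step[:3, :3], step[:3, 3] = R, t
        T = step @ T                       # accumulate the total transform
        err = np.sqrt(d2[np.arange(len(src)), idx]).mean()
        if abs(prev_err - err) < tol:      # stop when residual stops improving
            break
        prev_err = err
    return T
```

Because the initial body-surface transform T already brings the contours close, the closest-point correspondences are mostly correct from the first iteration, which is what makes this refinement converge.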
After the initial ICP registration is finished, the preoperative CT image of each phase i has obtained its own transformation matrix Ti × T. In the subsequent navigation stage, fine registration is carried out continuously in real time, each time with Ti × T as the initial transformation matrix. For the preoperative CT image of a given phase, the goal is to find the optimal transformation matrix, denoted T′i, that maximizes a similarity measure; this finally compensates the errors caused by factors such as breathing and thus meets the positioning requirements for the target vessel. The similarity measure adopted in this process is normalized mutual information.
Normalized mutual information is defined as

    NMI(M, R) = ( H(M) + H(R) ) / H(M, R)

where M is the grey-level point set obtained by sampling the preoperative CT in the region covered by the two-dimensional ultrasound image, and R is the grey-level point set of the real-time two-dimensional ultrasound image. H(M) is the Shannon entropy of M,

    H(M) = − Σ_{i_M} p(i_M) · log p(i_M)

where i_M denotes a grey value of the pixels of image M and p(i_M) is the probability that a pixel of image M has grey value i_M. H(R) is the Shannon entropy of R,

    H(R) = − Σ_{i_R} p(i_R) · log p(i_R)

where i_R denotes a grey value of the pixels of image R and p(i_R) is the probability that a pixel of image R has grey value i_R. H(M, R) is the joint entropy of M and R,

    H(M, R) = − Σ_{i_M, i_R} p(i_M, i_R) · log p(i_M, i_R)

where p(i_M, i_R) is the probability that a pixel has grey value i_M in image M and grey value i_R in image R. For the image M to be registered and the image R, let I_M(X_M) and I_R(X_R) be the grey-level functions of M and R respectively, with X_M and X_R the coordinates in the image spaces of M and R; then X_R = T′i × X_M, i.e. I_R(X_R) = I_R(T′i × X_M).
The whole registration process thus searches for the T′i that makes NMI(M, R) maximal. Since Ti × T is obtained by registration with the intraoperative two-dimensional ultrasound images, it already has a small error and is a good initial position for the optimization, so the search converges quickly to the optimal solution, yielding the final accurate transformation matrix.
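Normalized mutual information as defined above can be estimated from a joint grey-level histogram. The following NumPy sketch is illustrative (the bin count and function name are this sketch's choices):

```python
import numpy as np

def nmi(m, r, bins=32):
    """Normalized mutual information NMI(M, R) = (H(M) + H(R)) / H(M, R),
    estimated from the joint grey-level histogram of two same-size images."""
    joint, _, _ = np.histogram2d(np.ravel(m), np.ravel(r), bins=bins)
    p_mr = joint / joint.sum()      # joint probability p(i_M, i_R)
    p_m = p_mr.sum(axis=1)          # marginal p(i_M)
    p_r = p_mr.sum(axis=0)          # marginal p(i_R)
    def shannon(p):
        p = p[p > 0]                # 0 log 0 := 0
        return float(-(p * np.log(p)).sum())
    return (shannon(p_m) + shannon(p_r)) / shannon(np.ravel(p_mr))
```

NMI lies between 1 (independent images) and 2 (identical images), which is what makes it a convenient maximization target for the fine registration.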
In the fifth step, the real endoscope image and the virtual endoscope image are fused by means of external optical positioning.
As shown in Figure 5, the fusion means: by blending the two scenes with different transparencies, an augmented virtual reality environment is formed in which the virtual scene seen by the virtual endoscope, containing the beating-heart model, is consistent with the real scene seen by the real endoscope. At the same time, the transformation matrix Ti corresponding to the current heartbeat phase is selected from the group of transformation matrices obtained in step 4.3, thereby establishing the mapping between image space and real space; this mapping is finally used to display in real time the distance and relative position between the instrument and the target point, guiding accurate execution of the operation.
As shown in Figure 3, the external optical positioning means infrared-reflection localization with an NDI optical locator, realizing three-dimensional localization of the research object. The NDI optical navigation instrument comprises: a light source 8, a receiver 9 and a registration tool 10, where the maximum range between light source and receiver is 3000 mm, covering an area of up to 1470 × 1856 mm². The registration tool consists of reflective balls 11 and a long registration needle 12 at its front end. Before the CT scan, 6 to 8 metal marker points are attached evenly to the patient, so that the metal markers appear highlighted in the preoperative CT image data. During the operation the patient lies on the operating table in the same position; after the vital-signs monitoring equipment is connected and general anesthesia is applied, the coordinates of the metal marker points in the image coordinate system are obtained from the image, while the NDI optical navigation instrument acquires the coordinates of the metal marker points in the real research-object space; the rigid registration algorithm then registers these two coordinate systems, achieving registration between the image space and the surgical-object space.
Through the position information of the endoscope and of the robotic arm imported in real time by the optical locator, the software can display the relative position of the robotic arm and the model on the computer screen in real time. To fuse the real and virtual endoscope views, the transformation matrix from the world coordinate system to the endoscope coordinate system and the matrix from the endoscope coordinate system to the endoscope projection coordinate system must be found; that is, the endoscope must be calibrated. After endoscope calibration, feeding in the endoscope's position, orientation and related information makes the virtual endoscope agree exactly with the state of the real endoscope, so that the virtual scene containing the beating-heart model seen by the virtual endoscope is consistent with the real scene seen by the real endoscope. Blending the two scenes with different transparencies then forms an augmented virtual reality environment that effectively guides accurate execution of the operation.
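The transparency-based fusion of the two scenes amounts to alpha blending of a real and a virtual frame; a minimal NumPy sketch (frame shapes and the `alpha` parameter are illustrative):

```python
import numpy as np

def blend(real_frame, virtual_frame, alpha=0.5):
    """Overlay the virtual-endoscope rendering on the real endoscope frame
    with adjustable transparency alpha in [0, 1].
    Both frames: uint8 arrays of identical shape, e.g. (H, W, 3)."""
    out = (1.0 - alpha) * real_frame.astype(np.float32) \
          + alpha * virtual_frame.astype(np.float32)
    return np.clip(out, 0, 255).astype(np.uint8)
```

Setting `alpha` near 0 shows the real scene almost unchanged; values near 1 emphasize the virtual heart and vessel models.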
The above endoscope calibration: an endoscopic image is the result of projecting a 3D scene onto a 2D projection plane,

    λ · [u, v, 1]^T = M_int · M_ext · X

where X = [x, y, z, 1]^T is the homogeneous coordinate of a point in the 3D scene, with x, y, z its coordinates on the x, y and z axes; [u, v, 1]^T is the coordinate of this point in the 2D projection plane, with u and v its coordinates on the two image axes; and λ is the homogeneous scale factor of the 2D projection-plane coordinate system. The formula contains a transformation matrix M_ext from the world coordinate system to the endoscope coordinate system and a transformation matrix M_int from the endoscope coordinate system to the endoscope projection coordinate system. M_ext is a 4 × 4 transformation matrix and can be expressed as

    M_ext = [ r1  r2  r3  t_x
              r4  r5  r6  t_y
              r7  r8  r9  t_z
              0   0   0   1  ]

where r1 … r9 are the rotation factors and (t_x, t_y, t_z) is the translation vector. M_int can be expressed as

    M_int = [ f   0    u0  0
              0   s·f  v0  0
              0   0    1   0 ]

where f is the distance from the lens focus to the mirror-surface center, s is the aspect ratio of the lens field of view, and (u0, v0) is the coordinate of the mirror-surface center in the 2D projection coordinate system. Calibrating the endoscope means determining the two matrices M_int and M_ext.
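The projection model just described, with M_ext and M_int determined by calibration, can be exercised as in the following NumPy sketch. The 3 × 4 intrinsic layout below is one common parameterization of f, s and (u0, v0), chosen for illustration.

```python
import numpy as np

def intrinsics(f, s, u0, v0):
    """3x4 intrinsic matrix with focal length f, aspect ratio s and
    principal point (u0, v0)."""
    return np.array([[f,   0.0,   u0, 0.0],
                     [0.0, s * f, v0, 0.0],
                     [0.0, 0.0,  1.0, 0.0]])

def project(X_world, M_ext, M_int):
    """Project a 3-D world point through the calibrated endoscope model:
    lambda * [u, v, 1]^T = M_int @ M_ext @ [x, y, z, 1]^T."""
    Xh = np.append(np.asarray(X_world, dtype=float), 1.0)
    p = M_int @ M_ext @ Xh          # (u*lambda, v*lambda, lambda)
    return p[:2] / p[2]             # divide out the homogeneous factor
```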
With this positioning technique, the target location on the coronary artery can be reached comparatively quickly and accurately, with a mean accuracy of about 3 mm, greatly reducing the time spent searching for the diseased coronary artery in robot-assisted coronary bypass grafting and making the operation safer and more efficient.
A salient point of the present invention is that intraoperative ultrasound is introduced into robotic coronary artery bypass navigation to correct the errors caused by factors such as breathing and heartbeat in the registration between the preoperative CT image and the real-time two-dimensional ultrasound image of the surgical object.
The obtainable beneficial effects are: 1. an optical navigation positioning method based on intraoperative ultrasound and preoperative CT registration results is realized; 2. the long-standing clinical problem of locating the bypass target point by personal experience is solved, making the surgical operation safer and more efficient.
Claims (10)
1. An optical navigation positioning system based on CT registration results, characterized by comprising: a preoperative CT image import module, an image segmentation module, a body-surface initial registration module, a preoperative-CT-and-intraoperative-two-dimensional-ultrasound registration module, and an intraoperative navigation module, wherein: the preoperative CT image import module receives the DICOM-format image files obtained by preoperative imaging examination, generates the preoperative image package and outputs it to the image segmentation module and the body-surface initial registration module respectively; the preoperative-CT-and-intraoperative-two-dimensional-ultrasound registration module is connected to the image segmentation module and the body-surface initial registration module, receives the three-dimensional dynamic coronary artery tree carrying the surgical target point and the body-surface initial registration result, and outputs the preoperative-CT transformation matrices to the intraoperative navigation module, which outputs accurate surgical navigation information;
The preoperative image package comprises: the preoperative CT images, in DICOM format, of the target area over one or several cardiac cycles, and the three-dimensional stereo image reconstructed from the DICOM data, wherein the preoperative CT image data of one cardiac cycle consist of preoperative images of several phases;
The three-dimensional dynamic coronary artery tree carrying the surgical target point comprises: the dynamic heart model and the vascular tree model generated by the image segmentation module.
2. The system according to claim 1, characterized in that the preoperative CT image import module comprises: a DICOM image read-in unit and a preoperative image three-dimensional rendering unit, wherein: the DICOM image read-in unit imports the series of DICOM-format image files of one cardiac cycle of the heart obtained by preoperative imaging examination, parses the DICOM data and transfers them to the preoperative image three-dimensional rendering unit, which reconstructs the DICOM data into a three-dimensional stereo image, merges it into the preoperative image package, and outputs the package to the image segmentation module and the body-surface initial registration module respectively.
3. The system according to claim 1, characterized in that the image segmentation module comprises: a heart segmentation unit and a vascular-tree segmentation unit, wherein: the heart segmentation unit manually segments, frame by frame and based on clinical experience, the heart in the preoperative CT images of one cardiac cycle in the preoperative image package, and reconstructs the dynamic heart model from the resulting segmentation of the preoperative heart CT images; the vascular-tree segmentation unit manually segments, frame by frame and based on clinical experience, the vascular tree in the preoperative CT images of one cardiac cycle in the preoperative image package, reconstructs the vascular tree model from the resulting segmentation, and outputs it together with the dynamic heart model, as the three-dimensional dynamic coronary artery tree carrying the surgical target point, to the preoperative-CT-and-intraoperative-two-dimensional-ultrasound registration module.
4. The system according to claim 1, characterized in that the body-surface initial registration module comprises: a registration-marker selection unit and a transformation-matrix calculation unit, wherein: the registration-marker selection unit selects the registration marker points arranged on the surgical object's body surface and obtains their image-coordinate-system coordinates in the preoperative CT image data and their spatial coordinates; the transformation-matrix calculation unit uses the rigid registration algorithm on the image-coordinate-system coordinates and the spatial coordinates to register the two coordinate systems, i.e. to register the image space and the surgical-object space, computes from these two groups of coordinates the initial transformation matrix as the body-surface initial registration result, and outputs it to the preoperative-CT-and-intraoperative-two-dimensional-ultrasound registration module.
5. The system according to claim 1, characterized in that the preoperative-CT-and-intraoperative-two-dimensional-ultrasound registration module comprises: an intraoperative two-dimensional ultrasound image read-in unit, a preoperative-CT-and-two-dimensional-ultrasound registration unit and an electrocardiogram read-in unit, wherein: the registration unit receives the two-dimensional ultrasound input images acquired by the intraoperative ultrasound read-in unit and the current heartbeat phase acquired by the electrocardiogram read-in unit, matches the phases of the three-dimensional dynamic coronary artery tree carrying the surgical target point to the current heartbeat phase through the input ECG signal, computes the preoperative-CT transformation matrices between the two and outputs them to the intraoperative navigation module, wherein each group of preoperative-CT transformation matrices corresponds to one current heartbeat phase and to the corresponding phase of the three-dimensional dynamic coronary artery tree carrying the surgical target point.
6. The system according to claim 1, characterized in that the intraoperative navigation module comprises: a transformation-matrix selection unit and a navigation display unit, wherein: the transformation-matrix selection unit selects, according to the current heartbeat phase, the corresponding group among the preoperative-CT transformation matrices and outputs the selected group to the navigation display unit, which calculates the distance and relative position between the current virtual instrument and the actual target point and displays them as video, thereby guiding accurate completion of the operation.
7. A navigation method for the system according to any one of the preceding claims, characterized by comprising the following steps:
In the first step, after the preoperative CT image import module obtains the preoperative CT image data of the target area, a series of DICOM-format files covering one cardiac cycle, the heart segmentation unit of the image segmentation module manually outlines the heart contour frame by frame, based on clinical experience, in the preoperative heart CT images of this cardiac cycle, obtains the segmentation result, and reconstructs the dynamic heart model from it;
In the second step, the vascular-tree segmentation unit of the image segmentation module manually outlines, frame by frame and based on clinical experience, the vascular-tree contour in the preoperative CT images of one cardiac cycle in the preoperative image package, manually marks the surgical target point according to clinical demand, obtains the segmentation result, reconstructs the vascular tree model from it, and constructs a three-dimensional virtual scene containing the dynamic vascular tree model; this three-dimensional virtual scene constitutes the virtual endoscope image;
In the third step, the body-surface initial registration module uses body-surface registration to obtain the transformation matrix between the image coordinate system of the preoperative CT data and the spatial coordinates of the real surgical object; that is, coordinate points in two different spatial coordinate systems are mapped onto each other to establish a one-to-one feature correspondence, ultimately relating the preoperative CT image to the real-time two-dimensional ultrasound image of the surgical object;
In the fourth step, the phases of the three-dimensional dynamic coronary artery tree carrying the surgical target point are matched to the current heartbeat phase through the input ECG signal;
In the fifth step, the real endoscope image and the virtual endoscope image are fused by means of external optical positioning: by blending the two scenes with different transparencies, an augmented virtual reality environment is formed in which the virtual scene containing the beating-heart model seen by the virtual endoscope is consistent with the real scene seen by the real endoscope; at the same time, the transformation matrix Ti corresponding to the current heartbeat phase is selected from the group of transformation matrices obtained in step 4.3, thereby establishing the mapping between image space and real space; this mapping is finally used to display in real time the distance and relative position between the instrument and the target point, thereby guiding accurate execution of the operation.
8. The method according to claim 7, characterized in that the third step specifically comprises: choosing several registration marker points in the preoperative CT image, finding in real space the points corresponding to the registration marker points in the image, obtaining their real-space coordinates with the optical navigation instrument, and solving, from these two groups of coordinates of one-to-one corresponding points, the transformation matrix between the two spaces.
9. The method according to claim 7, characterized in that the fourth step specifically means: with the transformation matrix T as the initial transformation matrix, let Ti be the correction matrix obtained on the basis of T for the preoperative CT image of each phase; then for the preoperative CT image of phase i, Ti × T further corrects the registration error between the CT image of this phase and the real-time two-dimensional ultrasound image of the surgical object, where N is the number of preoperative image phases in the preoperative CT data of one cardiac cycle.
10. The method according to claim 7, characterized in that the fourth step comprises:
4.1) acquiring with the ultrasound probe a series of real-time images of the surgical object's heart, wherein each acquired real-time image corresponds to a preoperative CT image of some phase, so that the preoperative CT image of each phase corresponds to a series of two-dimensional ultrasound images; for the preoperative CT data of phase i, extracting the surface contour of the heart wall;
4.2) extracting the heart-wall contour from each two-dimensional ultrasound image corresponding to phase i, the inner-wall contours extracted from this series of two-dimensional ultrasound images forming one point set, from which the feature-point set is extracted, while the heart-wall surface contour extracted from the preoperative CT image forms the other point set;
4.3) registering, by the iterative closest point algorithm, the point set on the two-dimensional ultrasound images onto the point set on the preoperative CT image, obtaining the transformation matrix Ti between the two point sets, which serves as the initial matrix for the subsequent fine registration.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201210454220.2A CN102999902B (en) | 2012-11-13 | 2012-11-13 | Optical guidance positioning navigation method based on CT registration result |
Publications (2)
Publication Number | Publication Date |
---|---|
CN102999902A true CN102999902A (en) | 2013-03-27 |
CN102999902B CN102999902B (en) | 2016-12-21 |
CN117204950A (en) * | 2023-09-18 | 2023-12-12 | 普密特(成都)医疗科技有限公司 | Endoscope position guiding method, device, equipment and medium based on image characteristics |
TWI836491B (en) * | 2021-11-18 | 2024-03-21 | 瑞鈦醫療器材股份有限公司 | Method and navigation system for registering two-dimensional image data set with three-dimensional image data set of body of interest |
2012-11-13: CN application CN201210454220.2A filed; granted as patent CN102999902B (status: Active)
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN1748646A (en) * | 2004-08-12 | 2006-03-22 | 通用电气公司 | Method and apparatus for medical intervention procedure planning and location and navigation of an intervention tool |
CN101681504A (en) * | 2006-11-27 | 2010-03-24 | 皇家飞利浦电子股份有限公司 | System and method for fusing real-time ultrasound images with pre-acquired medical images |
CN102224525A (en) * | 2008-11-25 | 2011-10-19 | 皇家飞利浦电子股份有限公司 | Image provision for registration |
Non-Patent Citations (6)
Title |
---|
G.P. Penney et al.: "Registration of freehand 3D ultrasound and magnetic resonance liver images", Medical Image Analysis * |
J.H. Kaspersen et al.: "Three-Dimensional Ultrasound-Based Navigation Combined with Preoperative CT During Abdominal Interventions: A Feasibility Study", Cardiovasc Intervent Radiol * |
Jasbir Sra: "Cardiac Image Registration", Journal of Atrial Fibrillation * |
Junfeng Cai et al.: "The implementation of an integrated computer-assisted system for minimally invasive cardiac surgery", The International Journal of Medical Robotics and Computer Assisted Surgery * |
Xishi Huang et al.: "Dynamic 2D Ultrasound and 3D CT Image Registration of the Beating Heart", IEEE Transactions on Medical Imaging * |
Yiyong Sun et al.: "Image Guidance of Intracardiac Ultrasound with Fusion of Pre-operative Images", MICCAI 2007 * |
Cited By (115)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN103169445A (en) * | 2013-04-16 | 2013-06-26 | 苏州朗开医疗技术有限公司 | Navigation method and system for endoscope |
CN103169445B (en) * | 2013-04-16 | 2016-07-06 | 苏州朗开医疗技术有限公司 | Endoscope navigation method and system |
CN103295455A (en) * | 2013-06-19 | 2013-09-11 | 北京理工大学 | Ultrasonic training system based on CT (Computed Tomography) image simulation and positioning |
CN103371870A (en) * | 2013-07-16 | 2013-10-30 | 深圳先进技术研究院 | Multimode image based surgical operation navigation system |
WO2015039302A1 (en) * | 2013-09-18 | 2015-03-26 | Shenzhen Mindray Bio-Medical Electronics Co., Ltd | Method and system for guided ultrasound image acquisition |
CN105828876A (en) * | 2013-12-18 | 2016-08-03 | 皇家飞利浦有限公司 | System and method for ultrasound and computed tomography image registration for sonothrombolysis treatment |
CN103971574A (en) * | 2014-04-14 | 2014-08-06 | 中国人民解放军总医院 | Ultrasonic guidance tumor puncture training simulation system |
CN103948361A (en) * | 2014-04-14 | 2014-07-30 | 中国人民解放军总医院 | Marking-point-free endoscope positioning and tracking method and system |
CN103948361B (en) * | 2014-04-14 | 2016-10-05 | 中国人民解放军总医院 | Marker-free endoscope positioning and tracking method and system |
CN105078514A (en) * | 2014-04-22 | 2015-11-25 | 重庆海扶医疗科技股份有限公司 | Construction method and device of three-dimensional model, image monitoring method and device |
CN107072720A (en) * | 2014-09-19 | 2017-08-18 | 株式会社高永科技 | The coordinate system integration method of optical tracking system and optical tracking system |
US11206998B2 (en) | 2014-09-19 | 2021-12-28 | Koh Young Technology Inc. | Optical tracking system for tracking a patient and a surgical instrument with a reference marker and shape measurement device via coordinate transformation |
CN107072720B (en) * | 2014-09-19 | 2020-11-20 | 株式会社高迎科技 | Optical tracking system and coordinate system integration method of optical tracking system |
CN104323860A (en) * | 2014-11-07 | 2015-02-04 | 刘弘毅 | Navigation path planning device and method |
CN104323860B (en) * | 2014-11-07 | 2018-08-31 | 常州朗合医疗器械有限公司 | Navigation path planning device and method |
WO2016154715A1 (en) * | 2015-03-31 | 2016-10-06 | Centre For Imaging Technology Commercialization (Cimtec) | Method and system for registering ultrasound and computed tomography images |
WO2016192671A1 (en) * | 2015-06-05 | 2016-12-08 | Chen Chieh Hsiao | Intra operative tracking method |
CN106794044B (en) * | 2015-06-05 | 2019-09-27 | 钛隼生物科技股份有限公司 | Intraoperative tracking method |
CN106794044A (en) * | 2015-06-05 | 2017-05-31 | 陈阶晓 | Intraoperative tracking method |
KR101933132B1 (en) | 2015-06-05 | 2019-03-15 | 치 시아오 첸 | Intraoperative tracking method |
TWI595437B (en) * | 2015-06-05 | 2017-08-11 | 鈦隼生物科技股份有限公司 | Intraoperative tracking method |
US9827053B2 (en) | 2015-06-05 | 2017-11-28 | Chieh-Hsiao Chen | Intraoperative tracking method |
KR20170033858A (en) * | 2015-06-05 | 2017-03-27 | 치 시아오 첸 | Intraoperative tracking method |
EP3145420A4 (en) * | 2015-06-05 | 2018-04-04 | Chen, Chieh Hsiao | Intra operative tracking method |
US10713802B2 (en) | 2015-08-05 | 2020-07-14 | Shenzhen Mindray Bio-Medical Electronics Co., Ltd. | Ultrasonic image processing system and method and device thereof, ultrasonic diagnostic device |
WO2017020281A1 (en) * | 2015-08-05 | 2017-02-09 | 深圳迈瑞生物医疗电子股份有限公司 | Ultrasonic image processing system and method and device thereof, ultrasonic diagnostic device |
CN105411679B (en) * | 2015-11-23 | 2017-07-14 | 中国科学院深圳先进技术研究院 | Puncture path planning and correction method and device |
CN105411679A (en) * | 2015-11-23 | 2016-03-23 | 中国科学院深圳先进技术研究院 | Puncture path planning and correction method and device |
CN108475428A (en) * | 2015-12-22 | 2018-08-31 | 皇家飞利浦有限公司 | The coronary artery segmentation of cardiac module guiding |
CN107456278A (en) * | 2016-06-06 | 2017-12-12 | 北京理工大学 | Endoscopic surgery navigation method and system |
WO2017211087A1 (en) * | 2016-06-06 | 2017-12-14 | 北京理工大学 | Endoscopic surgery navigation method and system |
CN107481272A (en) * | 2016-06-08 | 2017-12-15 | 瑞地玛医学科技有限公司 | Radiotherapy planning image registration and fusion method and system |
US11185307B2 (en) | 2016-06-20 | 2021-11-30 | Bfly Operations, Inc. | Augmented reality interface for assisting a user to operate an ultrasound device |
US11861887B2 (en) | 2016-06-20 | 2024-01-02 | Bfly Operations, Inc. | Augmented reality interface for assisting a user to operate an ultrasound device |
US11564657B2 (en) | 2016-06-20 | 2023-01-31 | Bfly Operations, Inc. | Augmented reality interface for assisting a user to operate an ultrasound device |
US11670077B2 (en) | 2016-06-20 | 2023-06-06 | Bflyoperations, Inc. | Augmented reality interface for assisting a user to operate an ultrasound device |
CN109310396A (en) * | 2016-06-20 | 2019-02-05 | 蝴蝶网络有限公司 | For assisting the automated graphics of user's operation Vltrasonic device to obtain |
US11806087B2 (en) | 2016-08-18 | 2023-11-07 | Envizion Medical Ltd. | Insertion device positioning guidance system and method |
US11389254B2 (en) | 2016-08-18 | 2022-07-19 | Envizion Medical Ltd. | Insertion device positioning guidance system and method |
CN106485739A (en) * | 2016-09-22 | 2017-03-08 | 哈尔滨工业大学 | Point set registration method based on L2 distance |
CN106485739B (en) * | 2016-09-22 | 2019-06-11 | 哈尔滨工业大学 | Point set registration method based on L2 distance |
WO2018076503A1 (en) * | 2016-10-28 | 2018-05-03 | 苏州朗开医疗技术有限公司 | Positioning system and medical positioning system for diagnosing of target object in body |
CN106725852A (en) * | 2016-12-02 | 2017-05-31 | 上海精劢医疗科技有限公司 | The operation guiding system of lung puncture |
CN106890025A (en) * | 2017-03-03 | 2017-06-27 | 浙江大学 | Minimally invasive surgery navigation system and navigation method |
CN111163837A (en) * | 2017-07-28 | 2020-05-15 | 医达科技公司 | Method and system for surgical planning in a mixed reality environment |
CN111163837B (en) * | 2017-07-28 | 2022-08-02 | 医达科技公司 | Method and system for surgical planning in a mixed reality environment |
CN107610109A (en) * | 2017-09-06 | 2018-01-19 | 艾瑞迈迪医疗科技(北京)有限公司 | Method for displaying image, the apparatus and system of endoscope micro-wound navigation |
CN107689045A (en) * | 2017-09-06 | 2018-02-13 | 艾瑞迈迪医疗科技(北京)有限公司 | Method for displaying image, the apparatus and system of endoscope micro-wound navigation |
CN107854177A (en) * | 2017-11-18 | 2018-03-30 | 上海交通大学医学院附属第九人民医院 | A kind of ultrasound and CT/MR image co-registrations operation guiding system and its method based on optical alignment registration |
CN108309450A (en) * | 2017-12-27 | 2018-07-24 | 刘洋 | Locator system and method for surgical navigation |
CN108272502A (en) * | 2017-12-29 | 2018-07-13 | 战跃福 | CT three-dimensional imaging-guided ablation needle operation method and system |
CN108324369A (en) * | 2018-02-01 | 2018-07-27 | 艾瑞迈迪医疗科技(北京)有限公司 | Face-based intraoperative registration method and neuronavigation device |
CN110368026A (en) * | 2018-04-13 | 2019-10-25 | 北京柏惠维康医疗机器人科技有限公司 | Surgical assistance device and system |
CN110368026B (en) * | 2018-04-13 | 2021-03-12 | 北京柏惠维康医疗机器人科技有限公司 | Operation auxiliary device and system |
CN110403698A (en) * | 2018-04-28 | 2019-11-05 | 北京柏惠维康医疗机器人科技有限公司 | Instrument intervention device and system |
CN110403698B (en) * | 2018-04-28 | 2020-10-30 | 北京柏惠维康科技有限公司 | Instrument intervention device and system |
US11364179B2 (en) | 2018-04-30 | 2022-06-21 | Envizion Medical Ltd. | Insertion device positioning guidance system and method |
CN108836479A (en) * | 2018-05-16 | 2018-11-20 | 山东大学 | Medical image registration method and surgical navigation system |
CN108836479B (en) * | 2018-05-16 | 2020-01-24 | 山东大学 | Medical image registration method and surgical navigation system |
CN109035414A (en) * | 2018-06-20 | 2018-12-18 | 深圳大学 | Method, apparatus, device and storage medium for generating augmented reality surgical images |
WO2020046199A1 (en) * | 2018-08-29 | 2020-03-05 | Agency For Science, Technology And Research | Lesion localization in an organ |
CN108992084A (en) * | 2018-09-07 | 2018-12-14 | 广东工业大学 | Method for imaging by using combination of CT system and ultrasonic system and CT-ultrasonic inspection equipment |
CN108992084B (en) * | 2018-09-07 | 2023-08-01 | 广东工业大学 | Method for imaging by using combination of CT system and ultrasonic system and CT-ultrasonic inspection equipment |
CN109345632A (en) * | 2018-09-17 | 2019-02-15 | 深圳达闼科技控股有限公司 | Image acquisition method, related apparatus and readable storage medium |
CN111053964A (en) * | 2018-10-17 | 2020-04-24 | 易美逊医疗有限公司 | Insertion device positioning guidance system and method |
US11779403B2 (en) | 2018-10-17 | 2023-10-10 | Envizion Medical Ltd. | Insertion device positioning guidance system and method |
US11382701B2 (en) | 2018-10-17 | 2022-07-12 | Envizion Medical Ltd. | Insertion device positioning guidance system and method |
CN110070788A (en) * | 2019-03-18 | 2019-07-30 | 叶哲伟 | A kind of human body 3D meridian point method for visualizing based on mixed reality |
CN110075429A (en) * | 2019-04-26 | 2019-08-02 | 上海交通大学 | Ultrasonic transducer navigation method, navigation device, electronic device and readable storage medium |
CN111858151A (en) * | 2019-04-29 | 2020-10-30 | Emc知识产权控股有限公司 | Method and system for prioritizing storage of critical data objects during backup operations |
CN110478050A (en) * | 2019-08-23 | 2019-11-22 | 北京仁馨医疗科技有限公司 | 3-D image and scope image fusing method, apparatus and system based on CT/MRI data |
CN110443749A (en) * | 2019-09-10 | 2019-11-12 | 真健康(北京)医疗科技有限公司 | Dynamic registration method and device |
CN110731821B (en) * | 2019-09-30 | 2021-06-01 | 艾瑞迈迪医疗科技(北京)有限公司 | Method and guide bracket for minimally invasive tumor ablation based on CT/MRI |
CN110731821A (en) * | 2019-09-30 | 2020-01-31 | 艾瑞迈迪医疗科技(北京)有限公司 | Method and guide bracket for minimally invasive tumor ablation based on CT/MRI |
CN110706357B (en) * | 2019-10-10 | 2023-02-24 | 青岛大学附属医院 | Navigation system |
CN110706357A (en) * | 2019-10-10 | 2020-01-17 | 青岛大学附属医院 | Navigation system |
CN110751681B (en) * | 2019-10-18 | 2022-07-08 | 西南科技大学 | Augmented reality registration method, device, equipment and storage medium |
CN110751681A (en) * | 2019-10-18 | 2020-02-04 | 西南科技大学 | Augmented reality registration method, device, equipment and storage medium |
CN110931121A (en) * | 2019-11-29 | 2020-03-27 | 重庆邮电大学 | HoloLens-based remote surgery guidance device and operation method |
CN113129342A (en) * | 2019-12-31 | 2021-07-16 | 无锡祥生医疗科技股份有限公司 | Multi-modal fusion imaging method, device and storage medium |
CN111415404A (en) * | 2020-03-16 | 2020-07-14 | 广州柏视医疗科技有限公司 | Positioning method and device for intraoperative preset area, storage medium and electronic equipment |
CN111415404B (en) * | 2020-03-16 | 2021-06-29 | 广州柏视医疗科技有限公司 | Positioning method and device for intraoperative preset area, storage medium and electronic equipment |
CN113643226B (en) * | 2020-04-27 | 2024-01-19 | 成都术通科技有限公司 | Labeling method, labeling device, labeling equipment and labeling medium |
CN113643226A (en) * | 2020-04-27 | 2021-11-12 | 成都术通科技有限公司 | Labeling method, device, equipment and medium |
CN111724420A (en) * | 2020-05-14 | 2020-09-29 | 北京天智航医疗科技股份有限公司 | Intraoperative registration method and device, storage medium and server |
CN111612778B (en) * | 2020-05-26 | 2023-07-11 | 上海交通大学 | Preoperative CTA and intraoperative X-ray coronary artery registration method |
CN111612778A (en) * | 2020-05-26 | 2020-09-01 | 上海交通大学 | Preoperative CTA and intraoperative X-ray coronary artery registration method |
CN111870344A (en) * | 2020-05-29 | 2020-11-03 | 中山大学肿瘤防治中心(中山大学附属肿瘤医院、中山大学肿瘤研究所) | Preoperative navigation method, system and terminal equipment |
CN111759463A (en) * | 2020-07-31 | 2020-10-13 | 南京普爱医疗设备股份有限公司 | Method for improving positioning precision of surgical mechanical arm |
CN112002018A (en) * | 2020-08-18 | 2020-11-27 | 云南省第一人民医院 | Intraoperative position navigation system, device and method based on mixed reality |
CN112155733A (en) * | 2020-09-29 | 2021-01-01 | 苏州微创畅行机器人有限公司 | Readable storage medium, bone modeling and registering system and bone surgery system |
CN112155733B (en) * | 2020-09-29 | 2022-01-28 | 苏州微创畅行机器人有限公司 | Readable storage medium, bone modeling and registering system and bone surgery system |
CN112370161B (en) * | 2020-10-12 | 2022-07-26 | 珠海横乐医学科技有限公司 | Operation navigation method and medium based on ultrasonic image characteristic plane detection |
CN112370161A (en) * | 2020-10-12 | 2021-02-19 | 珠海横乐医学科技有限公司 | Operation navigation method and medium based on ultrasonic image characteristic plane detection |
CN115697178B (en) * | 2020-10-27 | 2024-05-10 | 瑞德医疗机器股份有限公司 | Surgical support device |
CN115697178A (en) * | 2020-10-27 | 2023-02-03 | 瑞德医疗机器股份有限公司 | Operation support device |
CN112331311B (en) * | 2020-11-06 | 2022-06-03 | 青岛海信医疗设备股份有限公司 | Method and device for fusion display of video and preoperative model in laparoscopic surgery |
CN112331311A (en) * | 2020-11-06 | 2021-02-05 | 青岛海信医疗设备股份有限公司 | Method and device for fusion display of video and preoperative model in laparoscopic surgery |
CN113012230A (en) * | 2021-03-30 | 2021-06-22 | 华南理工大学 | Method for placing surgical guide plate under auxiliary guidance of AR in operation |
CN113012230B (en) * | 2021-03-30 | 2022-09-23 | 华南理工大学 | Method for placing surgical guide plate under auxiliary guidance of AR in operation |
CN113349931A (en) * | 2021-06-18 | 2021-09-07 | 云南微乐数字医疗科技有限公司 | Lesion registration method for a high-precision surgical navigation system |
CN113349931B (en) * | 2021-06-18 | 2024-06-04 | 云南微乐数字医疗科技有限公司 | Lesion registration method for a high-precision surgical navigation system |
CN113425411A (en) * | 2021-08-04 | 2021-09-24 | 成都科莱弗生命科技有限公司 | Method and device for lesion positioning navigation |
CN113610826A (en) * | 2021-08-13 | 2021-11-05 | 推想医疗科技股份有限公司 | Puncture positioning method and device, electronic device and storage medium |
TWI836491B (en) * | 2021-11-18 | 2024-03-21 | 瑞鈦醫療器材股份有限公司 | Method and navigation system for registering two-dimensional image data set with three-dimensional image data set of body of interest |
CN116019554A (en) * | 2021-12-15 | 2023-04-28 | 商丘市第一人民医院 | Spatial registration acceleration method and system in spinal surgery navigation |
CN116019554B (en) * | 2021-12-15 | 2024-03-05 | 商丘市第一人民医院 | Spatial registration acceleration method and system in spinal surgery navigation |
CN114404041A (en) * | 2022-01-19 | 2022-04-29 | 上海精劢医疗科技有限公司 | C-shaped arm imaging parameter calibration system and method |
CN114404041B (en) * | 2022-01-19 | 2023-11-14 | 上海精劢医疗科技有限公司 | C-arm imaging parameter calibration system and method |
CN114387320B (en) * | 2022-03-25 | 2022-07-19 | 武汉楚精灵医疗科技有限公司 | Medical image registration method, device, terminal and computer-readable storage medium |
CN114387320A (en) * | 2022-03-25 | 2022-04-22 | 武汉楚精灵医疗科技有限公司 | Medical image registration method, device, terminal and computer-readable storage medium |
CN115553818A (en) * | 2022-12-05 | 2023-01-03 | 湖南省人民医院(湖南师范大学附属第一医院) | Myocardial biopsy system based on fusion positioning |
CN117204950A (en) * | 2023-09-18 | 2023-12-12 | 普密特(成都)医疗科技有限公司 | Endoscope position guiding method, device, equipment and medium based on image characteristics |
CN117204950B (en) * | 2023-09-18 | 2024-05-10 | 普密特(成都)医疗科技有限公司 | Endoscope position guiding method, device, equipment and medium based on image characteristics |
CN116965848A (en) * | 2023-09-25 | 2023-10-31 | 中南大学 | Three-dimensional ultrasonic imaging method, system, equipment and storage medium |
Also Published As
Publication number | Publication date |
---|---|
CN102999902B (en) | 2016-12-21 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN102999902B (en) | Optical guidance positioning navigation method based on CT registration result | |
US10898057B2 (en) | Apparatus and method for airway registration and navigation | |
AU2007221876B2 (en) | Registration of images of an organ using anatomical features outside the organ | |
US20190272632A1 (en) | Method and a system for registering a 3d pre acquired image coordinates system with a medical positioning system coordinate system and with a 2d image coordinate system | |
CN102949240B (en) | Image-guided lung interventional operation system | |
JP2966089B2 (en) | Interactive device for local surgery inside heterogeneous tissue | |
Devernay et al. | Towards endoscopic augmented reality for robotically assisted minimally invasive cardiac surgery | |
RU2594811C2 (en) | Visualisation for navigation instruction | |
CN101797182A (en) | Augmented reality-based navigation system for minimally invasive nasal endoscopic surgery |
EP1933710A1 (en) | Sensor guided catheter navigation system | |
Merritt et al. | Real-time CT-video registration for continuous endoscopic guidance | |
Deligianni et al. | Nonrigid 2-D/3-D registration for patient specific bronchoscopy simulation with statistical shape modeling: Phantom validation | |
Wahle et al. | 3D heart-vessel reconstruction from biplane angiograms | |
CN105616003B (en) | Soft tissue 3D visual tracking method based on radial spline interpolation |
CN115105204A (en) | Laparoscope augmented reality fusion display method | |
Stoyanov et al. | Current issues of photorealistic rendering for virtual and augmented reality in minimally invasive surgery | |
TW499308B (en) | A new method for registration of computerized brain atlas and CT image | |
Lu et al. | Virtual-real registration of augmented reality technology used in the cerebral surgery lesion localization | |
Zhang et al. | Design and Implementation of An Assisted Positioning System for Carotid Endarterectomy Based on Mixed Reality | |
CN113902882A (en) | Augmented reality assisting method and device for vascular interventional operation | |
Singla | Intra-operative ultrasound-based augmented reality for laparoscopic surgical guidance |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
C06 | Publication | ||
PB01 | Publication | ||
C10 | Entry into substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
C14 | Grant of patent or utility model | ||
GR01 | Patent grant |