CN106236264A - Gastrointestinal surgery navigation method and system based on optical tracking and image matching - Google Patents

Gastrointestinal surgery navigation method and system based on optical tracking and image matching

Info

Publication number
CN106236264A
Authority
CN
China
Prior art keywords
data
image
camera lens
virtual image
instruments
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201610717521.8A
Other languages
Chinese (zh)
Other versions
CN106236264B (en)
Inventor
李国新
陈韬
蒋振刚
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Individual
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Individual filed Critical Individual
Priority to CN201610717521.8A priority Critical patent/CN106236264B/en
Publication of CN106236264A publication Critical patent/CN106236264A/en
Application granted granted Critical
Publication of CN106236264B publication Critical patent/CN106236264B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B17/00Surgical instruments, devices or methods, e.g. tourniquets
    • A61B2017/00743Type of operation; Specification of treatment sites
    • A61B2017/00818Treatment of the gastro-intestinal system

Landscapes

  • Apparatus For Radiation Diagnosis (AREA)
  • Image Processing (AREA)

Abstract

The invention discloses a gastrointestinal surgery navigation method and system based on optical tracking and image matching. The navigation method comprises the following steps: acquiring scan image data of the patient covering at least the surgical site; a medical instrument carries a lens, and when the instrument enters the surgical site a real-time optical image of the area in front of the lens is obtained through the lens; a tracking element tracks the position data of the instrument lens and matches this position data against the previously acquired scan image data to obtain a virtual image corresponding to the optical image; the real-time optical image from the lens and the virtual image are then output to provide real-time navigation of the operation. The invention enables dynamic navigation during surgery and provides a real-time tracking capability for medical instruments in gastrointestinal procedures.

Description

Gastrointestinal surgery navigation method and system based on optical tracking and image matching
Technical field
The present invention relates to a gastrointestinal surgery navigation method and system based on optical tracking and image matching.
Background technology
With the maturation of techniques and equipment, laparoscopy has been widely adopted in many fields, particularly gastrointestinal surgery. However, the way a laparoscope is operated deprives the surgeon of the fine tactile feedback available in traditional open surgery, so the visual identification of anatomical structures under the laparoscope becomes critically important. Most laparoscopes provide only a 2D view and lack depth perception; 3D lenses are available on the market but remain too expensive for wide adoption. Laparoscopy also has inherent limitations: the field of view is narrowed from roughly 160 degrees in open surgery to about 70 degrees, and this tunnel-like view prevents the surgeon from effectively observing the multiple intra-abdominal organs and instruments at the same time, greatly reducing awareness of the overall situation. Gastrointestinal procedures, and gastric cancer surgery in particular, are operations guided by the peripheral blood vessels, so the course and anatomical variation of these vessels are of great significance to the surgical strategy.
In recent years, computer-assisted techniques, especially three-dimensional reconstruction, have been applied more and more widely to gastrointestinal surgery and are of definite value in identifying anatomy during laparoscopic operations. At present, three-dimensional reconstruction at home and abroad is mainly used for surgical planning and surgical navigation. Surgical planning generally refers to preoperative simulation and rehearsal of the operation on the three-dimensional reconstructed model, while surgical navigation refers to guiding the operation intraoperatively by observing the reconstructed anatomical model. Such guidance information is usually presented on a computer screen or as a 3D-printed model, but both forms are essentially "static" navigation.
Summary of the invention
It is an object of the present invention to provide a gastrointestinal surgery navigation method and system based on optical tracking and image matching. The invention enables dynamic navigation during the operation and provides a real-time tracking capability for gastrointestinal surgery.
Its technical scheme is as follows:
A gastrointestinal surgery navigation method combining optical tracking and image matching,
the method comprising the following steps:
acquiring scan image data of the patient covering at least the surgical site;
a medical instrument carries a lens; when the instrument enters the surgical site, a real-time optical image of the area in front of the lens is obtained through the lens; a tracking element tracks the position data of the instrument lens and matches this position data against the previously acquired scan image data to obtain a virtual image corresponding to the optical image;
outputting the real-time optical image from the lens and the virtual image to provide real-time navigation of the operation.
Further, the previously acquired scan image data includes: image data covering at least the surgical site and positioning data of reference points set on the patient's body;
the aforementioned position data is matched against the positioning data, and the virtual image corresponding to the positioning data is displayed.
Further, tracking marker points are provided on the medical instrument, and the tracking element obtains the position data of the instrument lens by tracking these marker points.
Further, the real-time optical image from the lens and the virtual image are fused to form a fused image, and this fused image is output.
Further, the scan image data includes sub-scene data for at least two sub-scenes; when the medical instrument enters the surgical region, its position data is matched against the positioning data of each sub-scene, and the virtual image of the corresponding sub-scene data is displayed.
Further, the sub-scenes include: a central region, a lower-right region, a lower-left region, an upper-right region and a liver-stomach region;
the sub-scene data are: lower-left region data, lower-right region data, upper-right region data, central region data and liver-stomach region data. In the aforementioned step, the position data of the instrument lens is matched against the positioning data in the lower-left region data, lower-right region data, upper-right region data, central region data or liver-stomach region data, and the corresponding virtual image is displayed.
Further, in the aforementioned step, the medical instrument enters the lower-left region, lower-right region, upper-right region, central region and liver-stomach region in sequence, and the corresponding virtual image is displayed.
Further, the scan image data includes sub-data, and this sub-data includes sub-virtual images. In the aforementioned step, feature points in the optical image are tracked; when the feature points of a sub-region of the optical image flip or move, the sub-virtual image of that region is called up, superimposed on the aforementioned virtual image and output.
Further, in the aforementioned step, after the real-time optical image in front of the lens is obtained, feature points are extracted from the optical image and put into correspondence with feature points in the virtual image; incorrect feature points are identified and removed. Alternatively, the deviation between the optical image and the virtual image is computed, and the virtual image is corrected according to this deviation.
A medical-instrument gastrointestinal surgery navigation system combining optical tracking and image matching,
the system comprising:
a medical instrument carrying a lens, for entering the surgical region and acquiring an optical image of the surgical region;
a tracking element, for tracking the position data of the instrument lens;
a storage unit, for storing the previously acquired scan image data;
a matching unit, for matching the position data of the lens against the scan image data and obtaining the virtual image corresponding to the position data;
an output unit, for synchronously outputting the real-time optical image from the lens and the virtual image.
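Purely as an illustration of how the five claimed units could relate to one another, a minimal Python sketch follows; the class, attribute and method names are invented for this sketch and the matching and output logic are stubbed.

```python
from dataclasses import dataclass, field
from typing import Dict
import numpy as np

@dataclass
class StorageUnit:
    """Holds the pre-acquired scan image data, keyed by sub-scene name (stub)."""
    scan_image_data: Dict[str, np.ndarray] = field(default_factory=dict)

@dataclass
class TrackingElement:
    """Delivers the current position data of the instrument lens (stubbed)."""
    def lens_position(self) -> np.ndarray:
        return np.zeros(3)

@dataclass
class MatchingUnit:
    storage: StorageUnit
    def virtual_image(self, lens_position: np.ndarray) -> np.ndarray:
        # Match the lens position against the stored scan data (stub).
        return np.zeros((720, 1280, 3), dtype=np.uint8)

@dataclass
class OutputUnit:
    def show(self, optical: np.ndarray, virtual: np.ndarray) -> None:
        # Synchronously output both images; a real system would display them.
        pass
```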
The advantages and principles of the present invention are explained below:
1. The navigation method requires scan image data of the patient, covering at least the surgical site, to be acquired in advance. During the operation the medical instrument acquires the optical image in front of its lens while the tracking element tracks the position of the instrument; the lens position data is matched against the scan image data to obtain the virtual image corresponding to the optical image, achieving real-time dynamic navigation of the operation.
2. The scan image data acquired in advance includes both image data and the corresponding positioning data. The preliminary scan can be performed by CT or other modalities; reference points may be placed on the patient during scanning, or features of the patient's body may serve as reference points, so that the image data is positioned accurately and can be matched precisely to the optical image.
3. To make it easier for medical staff to observe the progress of the operation, the real-time optical image from the lens and the virtual image are fused into a fused image, which is then output.
4. Because the abdominal viscera are structurally complex, are not fixed in position and deform easily, the surgical scene is divided into several sub-scenes to improve the matching precision and achieve high-accuracy navigation.
5. For gastrointestinal surgery the following five sub-scenes are used: central region, lower-right region, lower-left region, upper-right region and liver-stomach region. Distinguishing these five sub-scenes better meets the practical needs of the operation.
6. The tracking element tracks the position of the medical instrument, and the surgical scene is matched against the scan image data, but the precision of this matching still leaves room for improvement. Feature points can therefore be extracted from the optical image in front of the lens and put into correspondence with feature points in the virtual image, and a further correction is applied according to the deviation between the two, so as to achieve precise matching.
7. Feature points in the optical image are tracked; when the feature points of a sub-region of the optical image flip or move, the sub-virtual image of that region is called up, superimposed on the aforementioned virtual image and output. In this way the method adapts to changes in organ and vessel positions during the operation and achieves still more accurate navigation.
8. When the optical image is matched against the virtual image, incorrect feature points are removed or corrected according to the deviation, so as to achieve an accurate fusion result.
Brief description of the drawings
Fig. 1 is a flow chart of the gastrointestinal surgery navigation method described in the embodiment of the present invention.
Detailed description of the invention
Embodiments of the invention are described in detail below.
As shown in Fig. 1, the laparoscopic gastrointestinal surgery navigation method based on optical tracking and image matching comprises the following steps:
Before the operation, the medical staff acquire the patient's scan image data of the surgical site by CT scanning (this scan image data includes at least image data of the surgical site and positioning data of reference points set on the patient's body). The scan image data is stored in the storage unit; before the operation it is imported into the processor of the navigation system, and a three-dimensional virtual model is reconstructed;
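A minimal sketch of reconstructing a three-dimensional virtual model from a CT volume using marching cubes (scikit-image); the synthetic sphere below stands in for the patient's CT series, which a real system would load from DICOM. This is an illustrative assumption, not the reconstruction pipeline prescribed by the patent.

```python
import numpy as np
from skimage import measure

# Synthetic CT volume: a sphere stands in for the patient scan (assumption).
z, y, x = np.mgrid[-32:32, -32:32, -32:32]
volume = (np.sqrt(x**2 + y**2 + z**2) < 20).astype(np.float32)

# Extract an iso-surface; verts/faces form the 3-D virtual model that the
# navigation system stores and later renders as the virtual image.
verts, faces, normals, values = measure.marching_cubes(volume, level=0.5,
                                                       spacing=(1.0, 1.0, 1.0))
print(f"virtual model: {len(verts)} vertices, {len(faces)} triangles")
```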
During the operation, the medical instrument (in this embodiment a laparoscope, though it may also be a surgical blade or another instrument carrying a lens) enters the surgical site; the lens captures video, yielding the real-time optical image in front of it;
the tracking element tracks the position of the instrument lens, and a coarse match is performed on the basis of this position;
a fine match is then achieved through scene segmentation and image feature matching;
finally, scene fusion is performed: the optical image and the virtual model are fused and output.
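A minimal sketch of this coarse-match / fine-match / fusion loop, with every data source stubbed by a dummy function; all names are illustrative and stand in for the scanner, laparoscope feed, tracker, matcher and display of the described system.

```python
import numpy as np

# Dummy stand-ins for the real data sources; names are illustrative only.
def grab_frame():               return np.zeros((720, 1280, 3), np.uint8)  # real-time optical image
def read_tracker_pose():        return np.eye(4)                           # lens pose from the tracking element
def coarse_match(model, pose):  return model                               # coarse match from the tracked position
def fine_match(model, frame):   return model                               # fine match via scene segmentation / features
def render(model):              return np.zeros((720, 1280, 3), np.uint8)  # virtual image of the matched model
def fuse(optical, virtual):     return optical                             # scene fusion (overlay), stubbed

def navigation_step(virtual_model):
    optical = grab_frame()                      # video acquisition in front of the lens
    pose = read_tracker_pose()                  # optical tracking of the instrument lens
    model = coarse_match(virtual_model, pose)   # coarse match based on the tracked position
    model = fine_match(model, optical)          # fine match by scene segmentation + feature matching
    return fuse(optical, render(model))         # fused image for output

if __name__ == "__main__":
    fused = navigation_step(virtual_model={})   # virtual_model: reconstructed 3-D model (stub)
```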
These steps are described in detail below:
The medical instrument carries a lens; when it enters the surgical site, the real-time optical image in front of the lens is obtained through the lens. The tracking element tracks the position data of the instrument lens, and this position data is matched against the positioning data in the previously acquired scan image data to obtain the virtual image corresponding to the optical image; the real-time optical image from the lens and the virtual image are output to provide real-time navigation of the operation.
Tracking marker points are provided on the medical instrument, and the tracking element obtains the position data of the instrument lens by tracking these marker points. (Before this, the instrument lens must be calibrated. The calibration steps are as follows: a planar checkerboard calibration target of 12×9 squares with a side length of 20 mm is used, and the lens output resolution is 1280×720. 1. Ten images of the checkerboard are captured from different angles. 2. All checkerboard corners are detected, and the relationship between the spatial points and the corresponding image points is used to solve the intrinsic parameters of the camera, such as the focal length and the coordinate centre. 3. The distortion parameters of the camera are solved. 4. The radial distortion parameters thus obtained are used to recover an undistorted image.)
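A sketch of this calibration step using OpenCV, following the parameters given above (12×9 checkerboard with 20 mm squares, ten views, 1280×720 output). It assumes that the 12×9 grid of squares yields 11×8 inner corners and that the calibration images are stored as calib_*.png; both are assumptions made for the example.

```python
import glob
import numpy as np
import cv2

pattern = (11, 8)        # 12x9 squares -> 11x8 inner corners (assumption)
square_mm = 20.0

# Object points of one board view, in millimetres, on the z = 0 plane.
objp = np.zeros((pattern[0] * pattern[1], 3), np.float32)
objp[:, :2] = np.mgrid[0:pattern[0], 0:pattern[1]].T.reshape(-1, 2) * square_mm

obj_points, img_points = [], []
for path in glob.glob("calib_*.png"):                       # step 1: the ten checkerboard views
    gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    found, corners = cv2.findChessboardCorners(gray, pattern)   # step 2: detect all corners
    if found:
        corners = cv2.cornerSubPix(
            gray, corners, (11, 11), (-1, -1),
            (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 30, 1e-3))
        obj_points.append(objp)
        img_points.append(corners)

# Steps 2-3: solve the intrinsics (focal length, principal point) and distortion.
rms, K, dist, rvecs, tvecs = cv2.calibrateCamera(obj_points, img_points,
                                                 (1280, 720), None, None)

# Step 4: use the distortion parameters to recover an undistorted image.
frame = cv2.imread("calib_0.png")
if frame is not None:
    undistorted = cv2.undistort(frame, K, dist)
```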
A real-time fused image is formed by fusing the real-time optical image from the lens with the virtual image, and this fused image is output.
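A minimal sketch of forming the fused image by alpha-blending the live optical frame with the rendered virtual image; same-size BGR arrays are assumed, and the 0.6 blend weight is an arbitrary choice for the example, not a value taken from the patent.

```python
import numpy as np
import cv2

def fuse(optical: np.ndarray, virtual: np.ndarray, alpha: float = 0.6) -> np.ndarray:
    """Overlay the rendered virtual image on the live optical image."""
    if virtual.shape != optical.shape:                      # resize if needed
        virtual = cv2.resize(virtual, (optical.shape[1], optical.shape[0]))
    return cv2.addWeighted(optical, alpha, virtual, 1.0 - alpha, 0.0)

# Dummy frames stand in for the laparoscope feed and the rendered virtual model.
optical_frame = np.zeros((720, 1280, 3), dtype=np.uint8)
virtual_frame = np.zeros((720, 1280, 3), dtype=np.uint8)
fused = fuse(optical_frame, virtual_frame)
```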
When the patient's scan image data is acquired by CT scanning, the scan image data includes sub-scene data for several sub-scenes. When the medical instrument enters the surgical region, its position data is matched against the positioning data of each sub-scene, and the virtual image of the corresponding sub-scene data is obtained. The details are as follows:
The scan image data is divided into five sub-scenes: the lower-left region (around the left gastroepiploic vessels), the lower-right region (below the pylorus), the upper-right region (above the pylorus, the hepatoduodenal ligament), the central region (the celiac artery and its branches) and the liver-stomach region (between the liver and the stomach). When the medical instrument enters the surgical region, its position data is matched against the positioning data of each sub-scene, and the virtual image of the corresponding sub-scene data is displayed. This approach reduces the registration error and ensures the accuracy of the operation.
The medical instrument enters the five sub-scenes in the following order: lower-left region, lower-right region, upper-right region, central region, liver-stomach region.
When the medical instrument first enters each sub-scene, it pauses briefly while the current optical image is matched against the virtual image; the corresponding sub-scene is then entered and the virtual image of the corresponding sub-scene data is displayed.
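A minimal sketch of matching the tracked lens position to one of the five sub-scenes; the region centre coordinates below are placeholders standing in for the positioning data that would actually come from the preoperative scan.

```python
import numpy as np

# Placeholder sub-scene centres (mm); real positioning data comes from the scan.
SUB_SCENES = {
    "lower_left":    np.array([-60.0, -40.0, 0.0]),
    "lower_right":   np.array([ 60.0, -40.0, 0.0]),
    "upper_right":   np.array([ 60.0,  40.0, 0.0]),
    "central":       np.array([  0.0,   0.0, 0.0]),
    "liver_stomach": np.array([  0.0,  70.0, 0.0]),
}

def current_sub_scene(lens_position: np.ndarray) -> str:
    """Return the name of the sub-scene closest to the tracked lens position."""
    return min(SUB_SCENES, key=lambda name: np.linalg.norm(SUB_SCENES[name] - lens_position))

# The virtual image of the matching sub-scene would then be displayed.
print(current_sub_scene(np.array([55.0, -35.0, 5.0])))   # -> "lower_right"
```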
During navigation, after the real-time optical image in front of the lens is obtained, feature points are extracted from the optical image and put into correspondence with feature points in the virtual image; incorrect feature points are identified and removed. Alternatively, the deviation between the optical image and the virtual image is computed and the virtual image is corrected according to this deviation, so as to achieve accurate positioning. Specifically, an image registration method based on feature points of binocular vision images is used to match the three-dimensional model to the binocular vision images of the medical instrument. First, the virtual instrument image at the corresponding position is generated from the three-dimensional model and the instrument position; the Harris feature extraction algorithm is then used to detect feature points in both the virtual image and the optical image. Incorrect registration points are filtered out on the basis of feature cross-correlation and the invariance of the tissue structure, and finally a thin-plate spline (TPS) transformation computed from the correctly registered feature points yields the registered image. The SAM registration algorithm based on multi-scale Harris corners is a new algorithm whose steps are: first, image edge information is extracted using the wavelet multi-scale product; a multi-scale Harris corner detector is then introduced to extract corner features; transformation parameters are estimated and a similarity measure function is defined to determine the optimal matching points; finally, the transformation parameters are solved by least squares. Its advantages are: 1. higher registration accuracy and speed; 2. edge detection with the wavelet multi-scale product reduces the interference of noise on feature point extraction; 3. introducing a scale-space representation of the corners enables registration between images of different resolutions.
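A hedged sketch of the feature-based correction described above. It substitutes readily available building blocks for the ones named in the text: OpenCV Harris corners for the Harris detector, normalized cross-correlation template matching for the feature cross-correlation, RANSAC homography fitting for the mismatch filtering, and SciPy's thin-plate-spline interpolator for the TPS transform; the wavelet multi-scale step is omitted. It assumes opencv-python and scipy are installed and is not the patent's exact algorithm.

```python
import numpy as np
import cv2
from scipy.interpolate import RBFInterpolator

def harris_points(gray: np.ndarray, n: int = 200) -> np.ndarray:
    """Harris corners as an (N, 2) array of (x, y) positions."""
    pts = cv2.goodFeaturesToTrack(gray, maxCorners=n, qualityLevel=0.01,
                                  minDistance=8, useHarrisDetector=True, k=0.04)
    return pts.reshape(-1, 2) if pts is not None else np.empty((0, 2), np.float32)

def match_by_correlation(virtual_gray, optical_gray, pts, patch=15, min_score=0.7):
    """For each virtual-image corner, find the best NCC match in the optical image."""
    src, dst = [], []
    h, w = virtual_gray.shape
    r = patch // 2
    for x, y in pts.astype(int):
        if r <= x < w - r and r <= y < h - r:
            templ = virtual_gray[y - r:y + r + 1, x - r:x + r + 1]
            res = cv2.matchTemplate(optical_gray, templ, cv2.TM_CCOEFF_NORMED)
            _, score, _, loc = cv2.minMaxLoc(res)
            if score > min_score:                      # keep only well-correlated points
                src.append([x, y])
                dst.append([loc[0] + r, loc[1] + r])
    return np.float32(src), np.float32(dst)

def correct_virtual_points(src, dst):
    """Drop mismatched pairs with RANSAC, then fit a thin-plate-spline mapping
    from virtual-image coordinates to optical-image coordinates."""
    if len(src) < 4:
        return None
    _, inliers = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)   # mismatch rejection
    good = inliers.ravel().astype(bool)
    return RBFInterpolator(src[good], dst[good], kernel="thin_plate_spline")
```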
During matching, the problem of mutual occlusion also has to be solved. It is handled in two parts, offline processing and online processing: 1. In the offline processing, the left and right images are captured first and the depth value of every pixel in the scene is computed; the depth values are then refined so that a relatively coarse occlusion edge can be extracted. At the same time the value of every pixel in the scene is computed in the HSV colour space, and image enhancement such as sharpening is applied to obtain a clearer contour; the coarse occlusion edge and the contour information are then fused to obtain a higher-precision occlusion edge. 2. In the online processing, feature points are tracked first; the displacement of the target contour is computed from the displacement of the feature points to obtain an approximate contour; the precise contour of the target object is then sought within a band-shaped region centred on the approximate contour; and finally a redraw technique is used to obtain a virtual-real composite image with the correct occlusion relationship. For the next frame, the target contour obtained in the current frame is used as the initial contour, and the above steps are repeated.
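A minimal sketch of just the online part of this occlusion handling (feature-point tracking and coarse contour propagation); the offline depth/HSV step and the final redraw are only indicated by comments. Array shapes follow OpenCV's optical-flow convention, and the function name is illustrative.

```python
import numpy as np
import cv2

def propagate_contour(prev_gray, curr_gray, prev_pts, prev_contour):
    """One online step: track the feature points with LK optical flow and shift
    the target contour by their median displacement to obtain the approximate
    contour for the current frame."""
    prev_pts = np.asarray(prev_pts, dtype=np.float32).reshape(-1, 1, 2)
    curr_pts, status, _ = cv2.calcOpticalFlowPyrLK(prev_gray, curr_gray, prev_pts, None)
    ok = status.ravel() == 1
    displacement = np.median((curr_pts - prev_pts)[ok], axis=0)      # shape (1, 2)
    approx_contour = np.asarray(prev_contour, dtype=np.float32) + displacement
    # A real system would then search a band-shaped region around approx_contour
    # for the precise object contour and redraw the virtual model so that the
    # virtual-real composite has the correct occlusion relationship.
    return approx_contour, curr_pts[ok].reshape(-1, 2)
```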
Because the abdominal viscera are not fixed, are highly mobile and deform easily, the difficulty lies in handling deformation. The solution is as follows: after the patient has been CT-scanned in advance and the three-dimensional virtual model has been reconstructed, sub-data can be created by further processing; this sub-data includes sub-virtual images (for example, the virtual image of a section of blood vessel when it is in a vertical position). During the operation, the feature points in the optical image are tracked; when the feature points of a sub-region of the optical image flip or move, the sub-virtual image of that region (for example, the virtual image of the vessel in the vertical position) is called up, superimposed on the aforementioned virtual image and output.
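An illustrative sketch of the sub-virtual-image substitution: when the tracked feature points of a region move (or flip) beyond a threshold, the alternative sub-virtual image for that region is selected for overlay. The region name, the 15-pixel threshold and the dummy images are assumptions made for the example.

```python
import numpy as np

# Each region has alternative sub-virtual images, e.g. a vessel in its normal
# pose and in a vertical pose (dummy arrays here; placeholders only).
SUB_VIRTUAL_IMAGES = {
    ("left_gastroepiploic_vessel", "normal"):   np.zeros((720, 1280, 3), np.uint8),
    ("left_gastroepiploic_vessel", "vertical"): np.zeros((720, 1280, 3), np.uint8),
}

def select_sub_virtual(region: str, prev_pts: np.ndarray, curr_pts: np.ndarray,
                       move_threshold: float = 15.0) -> np.ndarray:
    """Return the sub-virtual image to superimpose for one region."""
    mean_shift = float(np.linalg.norm(curr_pts - prev_pts, axis=1).mean())
    state = "vertical" if mean_shift > move_threshold else "normal"
    return SUB_VIRTUAL_IMAGES[(region, state)]
```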
The present embodiment has the following advantages:
1. The navigation method requires scan image data of the patient, covering at least the surgical site, to be acquired in advance. During the operation the medical instrument acquires the optical image in front of its lens while the tracking element tracks the position of the instrument; the lens position data is matched against the scan image data to obtain the virtual image corresponding to the optical image, achieving real-time dynamic navigation of the operation.
2. The scan image data acquired in advance includes both image data and the corresponding positioning data. The preliminary scan can be performed by CT or other modalities; reference points may be placed on the patient during scanning, or features of the patient's body may serve as reference points, so that the image data is positioned accurately and can be matched precisely to the optical image.
3. To make it easier for medical staff to observe the progress of the operation, the real-time optical image from the lens and the virtual image are fused into a fused image, which is then output.
4. Because the abdominal viscera are structurally complex, are not fixed in position and deform easily, registration errors can arise between the optical image in front of the lens and the reconstructed virtual image; to improve the registration precision, the surgical scene is divided into several sub-scenes.
5. For gastrointestinal surgery the following five sub-scenes are used: central region, lower-right region, lower-left region, upper-right region and liver-stomach region. Distinguishing these five sub-scenes better meets the practical needs of the operation.
6. The tracking element tracks the position of the medical instrument, and the surgical scene is matched against the scan image data, but the precision of this matching still leaves room for improvement. Feature points can therefore be extracted from the optical image in front of the lens and put into correspondence with feature points in the virtual image, and a further correction is applied according to the deviation between the two, so as to achieve precise matching.
7. Feature points in the optical image are tracked; when the feature points of a sub-region of the optical image flip or move, the sub-virtual image of that region is called up, superimposed on the aforementioned virtual image and output. In this way the method adapts to changes in organ and vessel positions during the operation and achieves still more accurate navigation.
8. When the optical image is matched against the virtual image, incorrect feature points are removed or corrected according to the deviation, so as to achieve an accurate fusion result.
The above are merely specific embodiments of the present invention and do not limit its scope of protection; any substitution or improvement made without departing from the concept of the present invention falls within the scope of protection of the present invention.

Claims (10)

  1. A gastrointestinal surgery navigation method combining optical tracking and image matching, characterised in that
    the method comprises the following steps:
    acquiring scan image data of the patient covering at least the surgical site;
    a medical instrument carries a lens; when the instrument enters the surgical site, a real-time optical image of the area in front of the lens is obtained through the lens; a tracking element tracks the position data of the instrument lens, and this position data is matched against the previously acquired scan image data to obtain a virtual image corresponding to the optical image;
    outputting the real-time optical image from the lens and the virtual image to provide real-time navigation of the operation.
  2. The gastrointestinal surgery navigation method combining optical tracking and image matching according to claim 1, characterised in that
    the previously acquired scan image data includes: image data covering at least the surgical site and positioning data of reference points set on the patient's body;
    the aforementioned position data is matched against the positioning data, and the virtual image corresponding to the positioning data is displayed.
  3. The gastrointestinal surgery navigation method combining optical tracking and image matching according to claim 1, characterised in that
    tracking marker points are provided on the medical instrument, and the tracking element obtains the position data of the instrument lens by tracking the marker points.
  4. The gastrointestinal surgery navigation method combining optical tracking and image matching according to claim 1, characterised in that
    the real-time optical image from the lens and the virtual image are fused to form a fused image, and the fused image is output.
  5. The gastrointestinal surgery navigation method combining optical tracking and image matching according to any one of claims 1 to 4, characterised in that
    the scan image data includes sub-scene data for at least two sub-scenes; when the medical instrument enters the surgical region, its position data is matched against the positioning data of each sub-scene, and the virtual image of the corresponding sub-scene data is displayed.
  6. The gastrointestinal surgery navigation method combining optical tracking and image matching according to claim 5, characterised in that
    the sub-scenes include: a central region, a lower-right region, a lower-left region, an upper-right region and a liver-stomach region;
    the sub-scene data are: lower-left region data, lower-right region data, upper-right region data, central region data and liver-stomach region data; in the aforementioned step, the position data of the instrument lens is matched against the positioning data in the lower-left region data, lower-right region data, upper-right region data, central region data or liver-stomach region data, and the corresponding virtual image is displayed.
  7. The gastrointestinal surgery navigation method combining optical tracking and image matching according to claim 6, characterised in that
    in the aforementioned step, the medical instrument enters the lower-left region, lower-right region, upper-right region, central region and liver-stomach region in sequence, and the corresponding virtual image is displayed.
  8. The gastrointestinal surgery navigation method combining optical tracking and image matching according to any one of claims 1 to 4, characterised in that
    the scan image data includes sub-data, and the sub-data includes sub-virtual images; in the aforementioned step, feature points in the optical image are tracked, and when the feature points of a sub-region of the optical image flip or move, the sub-virtual image of that region is called up, superimposed on the aforementioned virtual image and output.
  9. The gastrointestinal surgery navigation method combining optical tracking and image matching according to any one of claims 1 to 4, characterised in that
    in the aforementioned step, after the real-time optical image in front of the lens is obtained, feature points are extracted from the optical image and put into correspondence with feature points in the virtual image, and incorrect feature points are identified and removed; or the deviation between the optical image and the virtual image is computed, and the virtual image is corrected according to this deviation.
  10. A medical-instrument gastrointestinal surgery navigation system combining optical tracking and image matching, characterised in that
    the system comprises:
    a medical instrument carrying a lens, for entering the surgical region and acquiring an optical image of the surgical region;
    a tracking element, for tracking the position data of the instrument lens;
    a storage unit, for storing previously acquired scan image data;
    a matching unit, for matching the position data of the lens against the scan image data and obtaining the virtual image corresponding to the position data;
    an output unit, for synchronously outputting the real-time optical image from the lens and the virtual image.
CN201610717521.8A 2016-08-24 2016-08-24 Gastrointestinal surgery navigation method and system based on optical tracking and image matching Active CN106236264B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201610717521.8A CN106236264B (en) 2016-08-24 2016-08-24 Gastrointestinal surgery navigation method and system based on optical tracking and image matching

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201610717521.8A CN106236264B (en) 2016-08-24 2016-08-24 Gastrointestinal surgery navigation method and system based on optical tracking and image matching

Publications (2)

Publication Number Publication Date
CN106236264A true CN106236264A (en) 2016-12-21
CN106236264B CN106236264B (en) 2020-05-08

Family

ID=57594757

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201610717521.8A Active CN106236264B (en) 2016-08-24 2016-08-24 Gastrointestinal surgery navigation method and system based on optical tracking and image matching

Country Status (1)

Country Link
CN (1) CN106236264B (en)


Patent Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20040034300A1 (en) * 2002-08-19 2004-02-19 Laurent Verard Method and apparatus for virtual endoscopy
CN1839408A (en) * 2003-08-21 2006-09-27 皇家飞利浦电子股份有限公司 Device and method for combined display of angiograms and current x-ray images
CN101375805A (en) * 2007-12-29 2009-03-04 清华大学深圳研究生院 Method and system for guiding operation of electronic endoscope by auxiliary computer
CN102821670A (en) * 2010-03-31 2012-12-12 富士胶片株式会社 Endoscope observation supporting system and method, and device and programme
US20120059248A1 (en) * 2010-08-20 2012-03-08 Troy Holsing Apparatus and method for airway registration and navigation
CN103371870A (en) * 2013-07-16 2013-10-30 深圳先进技术研究院 Multimode image based surgical operation navigation system
CN103489178A (en) * 2013-08-12 2014-01-01 中国科学院电子学研究所 Method and system for image registration
CN103530872A (en) * 2013-09-18 2014-01-22 北京理工大学 Mismatching deleting method based on angle constraint
CN103948432A (en) * 2014-04-30 2014-07-30 深圳先进技术研究院 Algorithm for augmented reality of three-dimensional endoscopic video and ultrasound image during operation

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107680103A (en) * 2017-09-12 2018-02-09 南方医科大学南方医院 The method that actual situation for stomach cancer hysteroscope intelligent operation real-time navigation system blocks processing mixed reality automatically
CN107704661A (en) * 2017-09-13 2018-02-16 南方医科大学南方医院 Construction method for the mixed finite element deformation model of stomach cancer endoscope-assistant surgery real-time navigation system
CN109223177A (en) * 2018-07-30 2019-01-18 艾瑞迈迪医疗科技(北京)有限公司 Image display method, device, computer equipment and storage medium
CN110478039A (en) * 2019-07-24 2019-11-22 常州锦瑟医疗信息科技有限公司 A kind of medical equipment tracking system based on mixed reality technology
CN111658141A (en) * 2020-05-07 2020-09-15 南方医科大学南方医院 Gastrectomy port position navigation system, gastrectomy port position navigation device and storage medium
CN111658141B (en) * 2020-05-07 2023-07-25 南方医科大学南方医院 Gastrectomy port position navigation system, gastrectomy port position navigation device and storage medium
CN113786239A (en) * 2021-08-26 2021-12-14 哈尔滨工业大学(深圳) Method and system for tracking and real-time early warning of surgical instruments under stomach and digestive tract

Also Published As

Publication number Publication date
CN106236264B (en) 2020-05-08

Similar Documents

Publication Publication Date Title
US11717376B2 (en) System and method for dynamic validation, correction of registration misalignment for surgical navigation between the real and virtual images
CN106236264A (en) The gastrointestinal procedures air navigation aid of optically-based tracking and images match and system
US10593052B2 (en) Methods and systems for updating an existing landmark registration
JP6976266B2 (en) Methods and systems for using multi-view pose estimation
Grasa et al. Visual SLAM for handheld monocular endoscope
CN106952347B (en) Ultrasonic surgery auxiliary navigation system based on binocular vision
US9066086B2 (en) Methods for generating stereoscopic views from monoscopic endoscope images and systems using the same
WO2017211087A1 (en) Endoscopic surgery navigation method and system
US20070276234A1 (en) Systems and Methods for Intraoperative Targeting
US11896441B2 (en) Systems and methods for measuring a distance using a stereoscopic endoscope
CN112641514B (en) Minimally invasive interventional navigation system and method
CN109276296A (en) A kind of puncture needle method for visualizing based on two-dimensional ultrasound image
JP2013517909A (en) Image-based global registration applied to bronchoscopy guidance
CN112184653B (en) Binocular endoscope-based focus three-dimensional size measuring and displaying method
CN108113629B (en) Hard tube endoscope rotation angle measuring method and device
Zhou et al. Visual tracking of laparoscopic instruments
CN103006332A (en) Scalpel tracking method and device and digital stereoscopic microscope system
CN109345632B (en) Method for acquiring image, related device and readable storage medium
CN106236263A (en) The gastrointestinal procedures air navigation aid decomposed based on scene and system
Wang et al. Stereo video analysis for instrument tracking in image-guided surgery
US20240206980A1 (en) Volumetric filter of fluoroscopic sweep video
US20220101533A1 (en) Method and system for combining computer vision techniques to improve segmentation and classification of a surgical site
US20230215059A1 (en) Three-dimensional model reconstruction
WO2022190366A1 (en) Shape measurement system for endoscope and shape measurement method for endoscope
Fuertes et al. Augmented reality system for keyhole surgery-performance and accuracy validation

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant