CN111870211A - Three-dimensional endoscope with instrument pose navigation function and navigation method thereof - Google Patents


Info

Publication number
CN111870211A
Authority
CN
China
Prior art keywords
curved surface
camera
endoscope
dimensional
navigation
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202010739975.1A
Other languages
Chinese (zh)
Inventor
龙忠杰
张勤俭
郭恒冰
左云波
刘宝国
苏鹏
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Information Science and Technology University
Beijing Cancer Hospital
Original Assignee
Beijing Information Science and Technology University
Beijing Cancer Hospital
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Information Science and Technology University, Beijing Cancer Hospital filed Critical Beijing Information Science and Technology University
Priority to CN202010739975.1A
Publication of CN111870211A
Legal status: Pending (Current)

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B1/00 Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
    • A61B1/04 Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor combined with photographic or television appliances
    • A61B1/045 Control thereof
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B1/00 Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
    • A61B1/00147 Holding or positioning arrangements
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B1/00 Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
    • A61B1/06 Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor with illuminating arrangements
    • A61B1/07 Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor with illuminating arrangements using light-conductive means, e.g. optical fibres
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/20 Analysis of motion
    • G06T7/246 Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/80 Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10068 Endoscopic image

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Surgery (AREA)
  • Physics & Mathematics (AREA)
  • Biomedical Technology (AREA)
  • Animal Behavior & Ethology (AREA)
  • Radiology & Medical Imaging (AREA)
  • Optics & Photonics (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Biophysics (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Medical Informatics (AREA)
  • Molecular Biology (AREA)
  • Pathology (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Multimedia (AREA)
  • Instruments For Viewing The Inside Of Hollow Bodies (AREA)

Abstract

The invention relates to a three-dimensional endoscope with instrument pose navigation and a navigation method thereof. The navigation method comprises the following steps: the front end of the three-dimensional endoscope is placed a certain distance from the measured surface and projects a line laser; the structured light on the measured surface is captured, and the structure points on the camera plane are extracted and computed to obtain the three-dimensional coordinates of the surface point cloud in the endoscope coordinate system; these coordinates are transformed from the endoscope coordinate system into the electromagnetic-sensor coordinate system; the surface point cloud in the electromagnetic coordinate system is rendered to generate a solid surface; the weighted average of the three-dimensional coordinates of all surface points of the generated solid surface is computed to obtain the spatial coordinates of the surface's center of gravity G; and the real-time pose vector of the instrument on the current surface is obtained from the spatial coordinates of G, completing the navigation. The invention achieves three-dimensional reconstruction and real-time reproduction of the measured surface, and its navigation is continuous and immune to occlusion and signal interference.

Description

Three-dimensional endoscope with instrument pose navigation function and navigation method thereof
Technical Field
The invention relates to the field of surgical navigation equipment, in particular to a three-dimensional endoscope with instrument pose navigation and a navigation method thereof.
Background
An endoscope is an optical instrument that is inserted through a small surgical incision to help the physician view the surface structures of internal organs or tissues. At present, endoscopes with 30°-70° viewing angles are widely used for intraoperative auxiliary examination. However, a conventional endoscope provides only a two-dimensional image without depth information of the surgical field; it can observe only the extent of the surgical field and cannot provide stereoscopic perception of the surgical target, in particular perception of the relative distance between the endoscope and the target object. Moreover, conventional endoscopes are complex in construction and expensive, yet, lacking relative-distance perception, they frequently collide with hard objects in the body (such as bone), which can easily damage the distal lens of the endoscope. For instrument navigation, the two common approaches are optoelectronic tracking and electromagnetic control. The former can navigate continuously only if the arm does not block the optical signal; otherwise the navigation information is interrupted. The latter usually requires an electromagnetic guide to control the motion trajectory of the endoscope (as commonly seen in capsule endoscopes). Both lack navigation stability and ease of operation.
Disclosure of Invention
In view of the above problems, an object of the present invention is to provide a three-dimensional endoscope with instrument pose navigation and a navigation method thereof, which achieve three-dimensional reconstruction and real-time reproduction of the measured surface with navigation that is continuous and immune to occlusion or signal interference.
In order to achieve this purpose, the invention adopts the following technical solution: a three-dimensional endoscope navigation method with instrument pose navigation, comprising the following steps. S1: place the front end of the three-dimensional endoscope at a certain distance from the measured surface and project a line laser; capture the structured light on the measured surface, extract and compute the structure points on the camera plane, and obtain the three-dimensional coordinates of the surface point cloud in the endoscope coordinate system,

(X_i^c, Y_i^c, Z_i^c),

where c denotes the endoscope coordinate system and i indexes the laser points. S2: transform the coordinates of the measured surface from the endoscope coordinate system into the electromagnetic-sensor coordinate system. S3: render the surface point cloud in the electromagnetic coordinate system to generate a solid surface. S4: compute the weighted average of the three-dimensional coordinates of all surface point clouds of the generated solid surface to obtain the spatial coordinates of the surface's center of gravity G. S5: obtain the real-time pose vector of the instrument on the current surface from the spatial coordinates of G, completing the navigation.
Further, in step S1, the distance is 3-10 mm.
Further, in step S1, the structured light of the measured surface is collected by a camera.
Further, in step S1, the optical axis of the camera is defined as the Z axis (the depth direction), the upward direction of the camera as the X axis, and the rightward direction as the Y axis. The three-dimensional coordinates of each detected laser point in the endoscope coordinate system are then:

Z_i^c = f·B / (x − c_x)

X_i^c = (x − c_x)·Z_i^c / f

Y_i^c = (y − c_y)·Z_i^c / f

where B is the baseline length, i.e., the distance between the front end of the optical fiber and the front end of the camera; f is the focal length of the camera; (x, y) are the two-dimensional coordinates of the laser point on the camera plane; and (c_x, c_y) are the coordinates of the projection center of the camera plane.
Further, in step S2, the conversion formula is:

[X_i^s, Y_i^s, Z_i^s]^T = R_s · [X_i^c, Y_i^c, Z_i^c]^T + T

where R_s is the three-dimensional rotation matrix of the electromagnetic sensor, with R_s = R_z·R_y·R_x; R_z, R_y, and R_x are the 3×3 rotation matrices about the Z, Y, and X axes, respectively; T is the displacement matrix between the camera and the electromagnetic sensor; and s denotes the electromagnetic-sensor coordinate system.
Further, in step S3, a Marching Cubes optimization method is used to render the surface point cloud in the electromagnetic coordinate system under OpenGL and generate the solid surface.
Further, the method for generating the solid surface comprises the following steps:

S31: let the vertices of two adjacent grid cells be C_i and C_{i-1}; the spacing parameter of the point-cloud lattice is then t = C_i − C_{i-1}.

S32: for each point cloud inside the cube whose side length is C_iC_{i-1}, compute its projection point d_i on the side C_iC_{i-1}; if d_i lies on the left of the midpoint of that side, assign the point cloud to vertex C_i; otherwise, assign it to C_{i-1}.

S33: extract the triangular surfaces satisfying a preset condition according to the distribution of the vertices, and generate an 8-bit binary index from the extracted triangular surfaces.

S34: obtain the surface-extraction case from the 8-bit binary index, complete the surface rendering, and generate the visualized surface.
Further, in step S5, the navigation method comprises the following steps:

S51: search for three point clouds on the circle centered at the center of gravity G with radius r; if and only if they form an equilateral triangle, store the three-dimensional coordinates of the three point clouds and denote the triangle's vertices as P_1, P_2, P_3.

S52: compute the cross product of any two sides of the equilateral triangle to obtain the normal vector within the small neighborhood of the triangle; the normal vector passing through the center of gravity G is the real-time pose vector of the instrument on the current surface, and the coordinates of G give the instrument's current position.

S53: display and reproduce the normal vector and the real-time instrument pose vector of step S52 on a display screen via OpenGL, completing the navigation.
A three-dimensional endoscope for implementing the above navigation method comprises a camera, an optical fiber, an electromagnetic sensor, and an endoscope probe tube. The camera, optical fiber, and electromagnetic sensor are all arranged inside the endoscope probe tube in a triangular layout, and the triangular geometry formed by the three is stored in the host computer. A helium-neon laser and a coupling mirror are arranged at one end of the optical fiber; the line laser projected by the helium-neon laser passes through the coupling mirror, exits from the optical fiber, and is projected onto the measured surface to form structured light conforming to that surface, which is captured by the camera and transmitted to the host computer. The electromagnetic sensor collects the current relative position and attitude information of the endoscope and transmits it to the host computer, tracking the motion state of the camera.
Further, the electromagnetic sensor and the camera are fixed in a parallel constraint relationship, and the electromagnetic sensor is positioned between the camera and the optical fiber; the geometric connecting line between the camera and the optical fiber and the geometric connecting line between the optical fiber and the electromagnetic sensor are of a splayed structure.
Owing to the above technical solution, the invention has the following advantages. 1. The invention transmits the laser through an optical fiber and tracks the camera's motion state with a 6-degree-of-freedom micro electromagnetic sensor of 1 mm diameter, fixed side by side with the camera in a splayed layout; this effectively compresses the endoscope tip to within a diameter of 5-6 mm, giving a compact structure. 2. The invention achieves three-dimensional reconstruction and real-time three-dimensional reproduction of the measured surface; the measurement requires neither a reference plane nor calibration, and measurement, reconstruction, and three-dimensional reproduction are completed within 5 seconds. 3. The navigation of the invention is continuous and immune to occlusion and signal interference.
Drawings
Fig. 1 is a schematic view of the overall structure of a three-dimensional endoscope according to the present invention.
FIG. 2 is a schematic diagram of the normal vector of the center of gravity G and the real-time pose vector of the tool on the three-dimensional rendering of the measured surface; wherein the dashed line represents the normal vector of the center of gravity G and the solid line represents the real-time pose vector of the tool.
Detailed Description
In the description of the present invention, it is to be understood that the terms "upper", "lower", "inside", "outside", and the like, indicate orientations or positional relationships based on the orientations or positional relationships shown in the drawings, are only for convenience in describing the present invention and simplifying the description, and do not indicate or imply that the referred device or element must have a specific orientation, be constructed in a specific orientation, and be operated, and thus, should not be construed as limiting the present invention. The invention is described in detail below with reference to the figures and examples.
As shown in fig. 1, the present invention provides a three-dimensional endoscope with instrument pose navigation, which comprises a camera 1, an optical fiber 2, an electromagnetic sensor 3 (an electromagnetic position tracker), and an endoscope probe tube 6. The camera 1, optical fiber 2, and electromagnetic sensor 3 are all arranged inside the endoscope probe tube 6 in a triangular layout, and the triangular geometry formed by the three is stored in the host computer. A helium-neon laser and a coupling mirror are arranged at one end of the optical fiber; the line laser 4 projected by the helium-neon laser passes through the coupling mirror, exits from the optical fiber 2, and is projected onto the measured surface 5 to form structured light conforming to the measured surface 5, which is captured by the camera 1 and transmitted to the host computer. The electromagnetic sensor 3 collects the current relative position (3 degrees of freedom) and attitude (3 degrees of freedom) of the endoscope and transmits them to the host computer, thereby tracking the motion state of the camera 1.
In the above embodiment, the camera 1 is a miniature OVM6946 camera with a diameter of 2 mm, a resolution of 160,000 pixels, a 120° viewing angle, a 400 × 400 image plane, and 30 frames per second; it is waterproof and equipped with an LED light source with adjustable brightness.
In the above embodiments, the electromagnetic sensor 3 is a 6-degree-of-freedom micro electromagnetic sensor having a diameter on the order of 1 mm. For example, in the present embodiment, the electromagnetic sensor 3 is a Micro Sensor 1.8 from Polhemus, USA, measuring 17.3 mm in length and 1.8 mm in diameter.
In the above embodiments, the diameter of the endoscope probe 6 is 5-6 mm.
In the above embodiments, the camera 1, optical fiber 2, and electromagnetic sensor 3 are arranged in a triangular layout; inside the endoscope probe tube 6, the electromagnetic sensor 3 and the camera 1 are fixed in a parallel constrained relationship, with the electromagnetic sensor 3 located between the camera 1 and the optical fiber 2. The geometric line between the camera 1 and the optical fiber 2 and that between the optical fiber 2 and the electromagnetic sensor 3 form a splayed structure, which enlarges the parallax during measurement and improves the sensitivity and accuracy of the measurement.
Based on the three-dimensional endoscope, the invention also provides a three-dimensional endoscope navigation method with instrument pose navigation, which comprises the following steps:
s1, placing the front end of the three-dimensional endoscope at a certain distance from the measured surface to project line laser 4, collecting the structured light of the measured surface, respectively extracting and calculating the structural points on the camera plane (i.e. image plane), and obtaining the three-dimensional coordinate value of the curved surface point cloud under the endoscope coordinate system
Figure BDA0002606392320000041
Where c denotes an endoscope coordinate system and i denotes an index of the number of laser points.
Preferably, the certain distance may be 3-10 mm;
preferably, the structured light of the measured surface can be collected by using the camera 1;
the optical axis of the camera is defined as the Z axis, i.e., the depth direction, the upper side of the camera is defined as the X axis, and the right side is the Y axis. Then the three-dimensional coordinate value of each detected laser point in the endoscope coordinate system is:
Figure BDA0002606392320000042
Figure BDA0002606392320000043
Figure BDA0002606392320000044
in the formula, B is the length of a base line, namely the distance value between the front end of the optical fiber and the front end of the camera; f is the focal length of the camera; (x, y) is a two-dimensional coordinate value of the laser point on the camera plane; (c)x,cy) Is the projected center coordinate of the camera plane.
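The per-point triangulation described above can be sketched in a few lines of Python. This is an illustrative reconstruction assuming the standard pinhole model with the symbols defined in the text (B, f, (x, y), (c_x, c_y)); the function and parameter names are ours, not the patent's:

```python
def laser_point_to_3d(x, y, B, f, cx, cy):
    """Triangulate one detected laser pixel (x, y) into endoscope-frame
    coordinates. Hypothetical helper: assumes a pinhole camera with focal
    length f (in pixels), principal point (cx, cy), and fiber-to-camera
    baseline B."""
    Z = f * B / (x - cx)      # depth from the pixel's horizontal offset
    X = (x - cx) * Z / f      # back-projection onto the camera X axis
    Y = (y - cy) * Z / f      # back-projection onto the camera Y axis
    return X, Y, Z
```

Applied to every pixel on the extracted laser stripe, this yields one point of the surface point cloud per detected laser point.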
S2: transform the coordinates of the measured surface from the endoscope coordinate system into the coordinate system of the electromagnetic sensor 3, in which they are then expressed.

The conversion formula is:

[X_i^s, Y_i^s, Z_i^s]^T = R_s · [X_i^c, Y_i^c, Z_i^c]^T + T

where R_s is the three-dimensional rotation matrix of the electromagnetic sensor, with R_s = R_z·R_y·R_x; R_z, R_y, and R_x are the 3×3 rotation matrices about the Z, Y, and X axes, respectively; T is the displacement matrix between the camera and the electromagnetic sensor; and s denotes the electromagnetic-sensor coordinate system.
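The camera-to-sensor transformation can be sketched as follows. This is a minimal pure-Python illustration: the Euler-angle composition R_s = R_z·R_y·R_x follows the text, while the helper names and the angle inputs are assumptions of ours:

```python
import math

def rot_s(ax, ay, az):
    """Build Rs = Rz * Ry * Rx (3x3 nested lists) from rotation angles
    about the X, Y, and Z axes, in radians."""
    cx, sx = math.cos(ax), math.sin(ax)
    cy, sy = math.cos(ay), math.sin(ay)
    cz, sz = math.cos(az), math.sin(az)
    Rx = [[1, 0, 0], [0, cx, -sx], [0, sx, cx]]
    Ry = [[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]]
    Rz = [[cz, -sz, 0], [sz, cz, 0], [0, 0, 1]]
    def matmul(A, B):
        return [[sum(A[i][k] * B[k][j] for k in range(3)) for j in range(3)]
                for i in range(3)]
    return matmul(matmul(Rz, Ry), Rx)

def camera_to_sensor(p_c, angles, t):
    """Apply p_s = Rs * p_c + T to one endoscope-frame point p_c,
    where t is the camera-to-sensor displacement."""
    Rs = rot_s(*angles)
    return [sum(Rs[i][j] * p_c[j] for j in range(3)) + t[i] for i in range(3)]
```

With zero rotation the transform reduces to a pure translation, which makes the convention easy to check before plugging in real sensor readings.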
S3: render the surface point cloud in the electromagnetic coordinate system to generate a solid surface.

Preferably, a Marching Cubes optimization method is used to render the surface point cloud in the electromagnetic coordinate system under OpenGL and generate the solid surface.

The solid surface is generated by the following steps:

S31: let the vertices of two adjacent grid cells be C_i and C_{i-1}; the spacing parameter of the point-cloud lattice is then t = C_i − C_{i-1}.

S32: for each point cloud inside the cube whose side length is C_iC_{i-1}, compute its projection point d_i on the side C_iC_{i-1}; if d_i lies on the left of the midpoint of that side, assign the point cloud to vertex C_i; otherwise, assign it to C_{i-1}.

S33: extract the triangular surfaces satisfying a preset condition according to the distribution of the vertices, and generate an 8-bit binary index from the extracted triangular surfaces; within a cube, a vertex through which a triangle passes contributes the bit value 1, and otherwise 0.

S34: obtain the surface-extraction case from the 8-bit binary index, thereby completing the surface rendering and generating the visualized surface.
S4: compute the weighted average of the three-dimensional coordinates of all surface point clouds of the generated solid surface to obtain the spatial coordinates of the surface's center of gravity G.
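The weighted average in S4 can be sketched as below. The weights are not specified in the text, so this illustration defaults to uniform weights; the names are ours:

```python
def surface_centroid(points, weights=None):
    """Weighted average of an iterable of (X, Y, Z) surface points,
    giving the center of gravity G. Uniform weights are assumed when
    none are supplied."""
    points = list(points)
    if weights is None:
        weights = [1.0] * len(points)
    w_total = sum(weights)
    return tuple(sum(w * p[k] for w, p in zip(weights, points)) / w_total
                 for k in range(3))
```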
S5: obtain the real-time pose vector of the instrument on the current surface from the spatial coordinates of the center of gravity G, completing the navigation. This comprises the following steps:

S51: search for three point clouds on the circle centered at G with radius r (preferably, r = 1 mm); if and only if they form an equilateral triangle, store the three-dimensional coordinates of the three point clouds and denote the triangle's vertices as P_1, P_2, P_3.

S52: compute the cross product of any two sides of the equilateral triangle obtained in step S51 to get the normal vector within the small neighborhood of the triangle; the normal vector passing through G is the real-time pose vector of the instrument on the current surface, and the coordinates of G give the instrument's current position.

S53: reproduce the normal vector and the real-time instrument pose vector of step S52 on a display screen via OpenGL (as shown in FIG. 2), completing the navigation.
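The cross-product step S52 amounts to computing the unit normal of the small triangle P1P2P3. A minimal sketch (illustrative helper, not the patent's implementation):

```python
def surface_normal(p1, p2, p3):
    """Unit normal of the triangle (p1, p2, p3): cross product of two
    edge vectors, normalized. Drawn through the center of gravity G,
    this is the instrument's real-time pose vector."""
    u = [p2[k] - p1[k] for k in range(3)]    # edge P1 -> P2
    v = [p3[k] - p1[k] for k in range(3)]    # edge P1 -> P3
    n = [u[1] * v[2] - u[2] * v[1],
         u[2] * v[0] - u[0] * v[2],
         u[0] * v[1] - u[1] * v[0]]
    norm = sum(c * c for c in n) ** 0.5
    return [c / norm for c in n]
```

Note that the sign of the normal depends on the vertex ordering; a consistent winding (e.g., counter-clockwise seen from outside the surface) keeps the pose vector pointing away from the tissue.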
The above embodiments are only for illustrating the present invention; the structure, size, arrangement position, and shape of each component may be varied. On the basis of the technical solution of the invention, improvements and equivalent transformations of individual components made according to the principle of the invention shall not be excluded from the protection scope of the invention.

Claims (10)

1. A three-dimensional endoscope navigation method with instrument pose navigation is characterized by comprising the following steps:
s1, placing the front end of the three-dimensional endoscope at a certain distance from the measured surface to project line laser, collecting the structured light of the measured surface, respectively extracting and calculating the structural points on the camera plane, and obtaining the three-dimensional coordinate value of the curved surface point cloud under the endoscope coordinate system
Figure FDA0002606392310000011
Wherein c represents an endoscope coordinate system, and i represents the index of the number of laser points;
S2, converting the coordinates of the measured surface from the endoscope coordinate system into the electromagnetic-sensor coordinate system;
S3, rendering the surface point cloud in the electromagnetic coordinate system to generate a solid surface;
S4, calculating the weighted average of the three-dimensional coordinates of all surface point clouds of the generated solid surface to obtain the spatial coordinates of the center of gravity G of the surface;
and S5, acquiring the real-time pose vector of the instrument on the current surface from the spatial coordinates of the center of gravity G, completing the navigation.
2. The navigation method of claim 1, wherein: in the step S1, the certain distance is 3-10 mm.
3. The navigation method of claim 1, wherein: in step S1, the structured light of the measured surface is collected by a camera.
4. The navigation method of claim 1, wherein: in step S1, the optical axis of the camera is defined as the Z axis (the depth direction), the upward direction of the camera as the X axis, and the rightward direction as the Y axis; the three-dimensional coordinates of each detected laser point in the endoscope coordinate system are then:

Z_i^c = f·B / (x − c_x)

X_i^c = (x − c_x)·Z_i^c / f

Y_i^c = (y − c_y)·Z_i^c / f

wherein B is the baseline length, i.e., the distance between the front end of the optical fiber and the front end of the camera; f is the focal length of the camera; (x, y) are the two-dimensional coordinates of the laser point on the camera plane; and (c_x, c_y) are the coordinates of the projection center of the camera plane.
5. The navigation method of claim 1, wherein: in step S2, the conversion formula is:

[X_i^s, Y_i^s, Z_i^s]^T = R_s · [X_i^c, Y_i^c, Z_i^c]^T + T

wherein R_s is the three-dimensional rotation matrix of the electromagnetic sensor, with R_s = R_z·R_y·R_x; R_z, R_y, and R_x are the 3×3 rotation matrices about the Z, Y, and X axes, respectively; T is the displacement matrix between the camera and the electromagnetic sensor; and s denotes the electromagnetic-sensor coordinate system.
6. The navigation method of claim 1, wherein: in step S3, a Marching Cubes optimization method is used to render the surface point cloud in the electromagnetic coordinate system under OpenGL and generate a solid surface.
7. The navigation method according to claim 6, wherein the solid surface is generated by the following steps:
S31, letting the vertices of two adjacent grid cells be C_i and C_{i-1}, the spacing parameter of the point-cloud lattice being t = C_i − C_{i-1};
S32, for each point cloud inside the cube whose side length is C_iC_{i-1}, computing its projection point d_i on the side C_iC_{i-1}; if d_i lies on the left of the midpoint of that side, assigning the point cloud to vertex C_i, and otherwise to C_{i-1};
S33, extracting the triangular surfaces satisfying a preset condition according to the distribution of the vertices, and generating an 8-bit binary index from the extracted triangular surfaces;
and S34, obtaining the surface-extraction case from the 8-bit binary index, completing the surface rendering, and generating the visualized surface.
8. The navigation method according to claim 1, wherein in step S5, the navigation specifically comprises the following steps:
S51, searching for three point clouds on the circle centered at the center of gravity G with radius r; if and only if they form an equilateral triangle, storing the three-dimensional coordinates of the three point clouds and denoting the triangle's vertices as P_1, P_2, P_3;
S52, computing the cross product of any two sides of the equilateral triangle to obtain the normal vector within the small neighborhood of the triangle, the normal vector passing through the center of gravity G being the real-time pose vector of the instrument on the current surface, and the coordinates of G being the instrument's current position;
and S53, displaying and reproducing the normal vector and the real-time instrument pose vector of step S52 on a display screen via OpenGL, completing the navigation.
9. A three-dimensional endoscope for implementing the navigation method according to any one of claims 1 to 8, comprising: a camera, an optical fiber, an electromagnetic sensor, and an endoscope probe tube; the camera, the optical fiber, and the electromagnetic sensor all being arranged inside the endoscope probe tube in a triangular layout, the triangular geometry formed by the three being stored in a host computer; a helium-neon laser and a coupling mirror being arranged at one end of the optical fiber, the line laser projected by the helium-neon laser passing through the coupling mirror, exiting from the optical fiber, and being projected onto the measured surface to form structured light conforming to the measured surface, which is captured by the camera and transmitted to the host computer; the electromagnetic sensor collecting the current relative position and attitude information of the endoscope and transmitting it to the host computer, thereby tracking the motion state of the camera.
10. The three-dimensional endoscope according to claim 9, wherein said electromagnetic sensor is fixed in parallel constrained relationship with said camera head, and said electromagnetic sensor is located between said camera head and said optical fiber; the geometric connecting line between the camera and the optical fiber and the geometric connecting line between the optical fiber and the electromagnetic sensor are of a splayed structure.
CN202010739975.1A 2020-07-28 2020-07-28 Three-dimensional endoscope with instrument pose navigation function and navigation method thereof Pending CN111870211A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010739975.1A CN111870211A (en) 2020-07-28 2020-07-28 Three-dimensional endoscope with instrument pose navigation function and navigation method thereof

Publications (1)

Publication Number Publication Date
CN111870211A true CN111870211A (en) 2020-11-03

Family

ID=73201520

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010739975.1A Pending CN111870211A (en) 2020-07-28 2020-07-28 Three-dimensional endoscope with instrument pose navigation function and navigation method thereof

Country Status (1)

Country Link
CN (1) CN111870211A (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105030331A (en) * 2015-04-24 2015-11-11 长春理工大学 Position sensor and three-dimension laparoscope camera calibration device and method
CN107485447A (en) * 2017-08-09 2017-12-19 北京信息科技大学 Utensil pose guider and method in a kind of art towards knee cartilage transplantation
CN208319312U (en) * 2017-08-09 2019-01-04 北京信息科技大学 Utensil pose navigation device in a kind of art towards knee cartilage transplantation
CN109620104A (en) * 2019-01-10 2019-04-16 深圳市资福医疗技术有限公司 Capsule endoscope and its localization method and system

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113925441A (en) * 2021-12-17 2022-01-14 极限人工智能有限公司 Imaging method and imaging system based on endoscope
CN113925441B (en) * 2021-12-17 2022-05-03 极限人工智能有限公司 Imaging method and imaging system based on endoscope

Similar Documents

Publication Publication Date Title
CN108289598B (en) Drawing system
EP2825087B1 (en) Otoscanner
Fuchs et al. Augmented reality visualization for laparoscopic surgery
US8248414B2 (en) Multi-dimensional navigation of endoscopic video
US7824328B2 (en) Method and apparatus for tracking a surgical instrument during surgery
EP2001390B1 (en) System for 3-d tracking of surgical instrument in relation to patient body
US20080071142A1 (en) Visual navigation system for endoscopic surgery
CN106890025A (en) A kind of minimally invasive operation navigating system and air navigation aid
KR101799281B1 (en) Endoscope for minimally invasive surgery
US20080071141A1 (en) Method and apparatus for measuring attributes of an anatomical feature during a medical procedure
CN102448398A (en) Distance-based position tracking method and system
CN105979879A (en) Virtual image with optical shape sensing device perspective
CN102762142A (en) Laser enhanced reconstruction of 3d surface
CN110264504A (en) A kind of three-dimensional registration method and system for augmented reality
CN113689577B (en) Method, system, equipment and medium for matching virtual three-dimensional model with entity model
CN112184653B (en) Binocular endoscope-based focus three-dimensional size measuring and displaying method
CN107485447B (en) Device and method for navigating pose of surgical instrument for knee cartilage grafting
CN111870211A (en) Three-dimensional endoscope with instrument pose navigation function and navigation method thereof
CN208319312U (en) Utensil pose navigation device in a kind of art towards knee cartilage transplantation
CN114159163B (en) Magnetic navigation system facing soft lens
CN110074810A (en) The portable engaging member of the movement of calibration object in panorama, computerized tomography or cephalometry x-ray imaging
JP2001293006A (en) Surgical navigation apparatus
Payandeh et al. Application of imaging to the laparoscopic surgery
Chang et al. Optical characterization and calibration of gastroscope

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
WD01 Invention patent application deemed withdrawn after publication

Application publication date: 20201103