CN113349928A - Augmented reality surgical navigation device and method for flexible instrument - Google Patents


Info

Publication number
CN113349928A
Authority
CN
China
Prior art keywords
flexible instrument
augmented reality
coordinate system
dimensional
conversion matrix
Prior art date
Legal status
Granted
Application number
CN202110552899.8A
Other languages
Chinese (zh)
Other versions
CN113349928B (en)
Inventor
马龙飞
廖洪恩
张欣然
Current Assignee
Tsinghua University
Original Assignee
Tsinghua University
Priority date
Filing date
Publication date
Application filed by Tsinghua University filed Critical Tsinghua University
Priority to CN202110552899.8A priority Critical patent/CN113349928B/en
Publication of CN113349928A publication Critical patent/CN113349928A/en
Application granted granted Critical
Publication of CN113349928B publication Critical patent/CN113349928B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B 34/20 Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B 34/25 User interfaces for surgical systems
    • A61B 2034/2046 Tracking techniques
    • A61B 2034/2061 Tracking techniques using shape-sensors, e.g. fiber shape sensors with Bragg gratings

Landscapes

  • Health & Medical Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Surgery (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Robotics (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Biomedical Technology (AREA)
  • Medical Informatics (AREA)
  • Molecular Biology (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Human Computer Interaction (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The invention provides an augmented reality surgical navigation device and method for a flexible instrument. The device comprises a flexible instrument shape sensing module and an augmented reality module. The flexible instrument shape sensing module comprises a grating sensor array and an optical fiber sensing demodulator; it detects the shape of the flexible instrument and sends the detected three-dimensional shape information to the augmented reality module. The augmented reality module comprises a binocular camera and a three-dimensional display screen; it acquires the spatial position of the flexible instrument through the binocular camera and, from the three-dimensional shape information and the spatial position, generates a three-dimensional image of the flexible instrument for display on the three-dimensional display screen. The grating sensor array is composed of 4 optical fibers wound helically inside or on the surface of the flexible instrument. The invention can accurately locate the position and shape of the flexible instrument inside the body.

Description

Augmented reality surgical navigation device and method for flexible instrument
Technical Field
The invention relates to the technical field of medical instruments, and in particular to an augmented reality surgical navigation device and method for a flexible instrument.
Background
During minimally invasive surgery, the spatial information of a medical instrument inside the patient's body is essential for its accurate manipulation. However, many medical instruments inevitably change shape after entering the body, which makes them difficult to localize in space: ablation needles, biopsy needles, and intramedullary nails deform under stress, while flexible instruments such as endoscopes and soft robots actively change shape to follow natural orifices of the human body or for convenience of operation.
Current techniques for acquiring the shape of a flexible instrument mainly rely on visual images, medical images, or electromagnetic tracking. Visual-image-based methods are convenient to use but cannot handle occluded environments. Medical images make three-dimensional information difficult to obtain: the currently conventional intraoperative ultrasound or intraoperative X-ray fluoroscopic positioning places high demands on the doctor's operating skill and greatly increases the difficulty of the procedure, and X-ray fluoroscopy exposes both doctors and patients to a large radiation dose. Electromagnetic sensors are easily disturbed by surrounding ferromagnetic materials and require electromagnetically compatible surgical instruments and accessories, so their working conditions are demanding and their cost is high. Constrained by the narrow working environment inside the body, existing shape-acquisition methods leave the doctor unable to directly observe the shape of the flexible instrument in vivo and unable to position it accurately.
Therefore, there is a need for an augmented reality surgical navigation device and method for flexible instruments that addresses the above issues.
Disclosure of Invention
In view of the problems in the prior art, the invention provides an augmented reality surgical navigation device and method for a flexible instrument.
The invention provides an augmented reality operation navigation device for a flexible instrument, which comprises a flexible instrument shape sensing module and an augmented reality module, wherein:
the flexible instrument shape sensing module comprises a grating sensor array and an optical fiber sensing demodulator, and is used for detecting the shape of the flexible instrument and sending the detected three-dimensional shape information to the augmented reality module; the grating sensor array is composed of 4 optical fibers, gratings are inscribed at equal intervals along each optical fiber, each group of corresponding gratings forms a square region, and the cross section of the flexible instrument and the square region share the same central axis;
the augmented reality module comprises a binocular camera and a three-dimensional display screen and is used for acquiring the spatial position of the flexible instrument through the binocular camera and generating a three-dimensional image corresponding to the flexible instrument according to the three-dimensional shape information and the spatial position to display on the three-dimensional display screen;
wherein, referenced to the same cross section of the flexible instrument, the optical fibers are wound helically inside or on the surface of the flexible instrument.
According to the augmented reality surgical navigation device for the flexible instrument, the grating sensor array is composed of 4 Bragg optical fibers.
According to the augmented reality surgical navigation device for a flexible instrument provided by the invention, an optical marker is arranged within the shooting range of the binocular camera and fixed at the proximal end of the flexible instrument, so that the spatial position of the flexible instrument can be tracked in real time.
According to the augmented reality operation navigation device for the flexible instrument, the augmented reality module further comprises a semi-transparent semi-reflecting mirror, and the semi-transparent semi-reflecting mirror is used for projecting the three-dimensional image to a real scene, so that the three-dimensional image and the current position of the flexible instrument are superposed to generate an augmented reality scene.
According to the augmented reality surgical navigation device for the flexible instrument, the augmented reality module further comprises a supporting shell used for fixing the augmented reality module, and the three-dimensional display, the binocular camera and the half-mirror are arranged on the surface of the supporting shell.
According to the augmented reality operation navigation device for the flexible instrument, the surface of the supporting shell is connected with the fixing support for fixing the augmented reality module; the fixed support is provided with a rotating shaft, and the rotating shaft is used for adjusting the angle of the augmented reality module.
The invention also provides an augmented reality surgical navigation method based on any one of the above augmented reality surgical navigation devices for a flexible instrument, comprising the following steps:
acquiring three-dimensional shape information of the flexible instrument based on the flexible instrument shape sensing module, wherein the three-dimensional shape information is obtained from the curvature and torsion information of the optical fibers in the flexible instrument shape sensing module;
calibrating an optical marker arranged at the proximal end of the flexible instrument through the binocular camera to obtain a first conversion matrix and a second conversion matrix, and tracking the optical marker to obtain the spatial position of the flexible instrument; the first conversion matrix converts the spatial coordinate system of the flexible instrument into the coordinate system of the optical marker, and the second conversion matrix converts the coordinate system of the optical marker into the coordinate system of the binocular camera;
and according to the three-dimensional shape information and the spatial position, performing coordinate system conversion through the first and second conversion matrices to obtain the position of the flexible instrument in the binocular camera coordinate system, performing three-dimensional modeling, and displaying the generated three-dimensional image on the three-dimensional display screen.
According to the augmented reality surgical navigation method for a flexible instrument provided by the invention, the method further comprises the following steps:
calibrating the optical marker of the target projection area through the binocular camera to obtain a third conversion matrix, wherein the third conversion matrix is used for converting a coordinate system of the binocular camera to a projection coordinate system;
and according to the three-dimensional shape information and the space position, converting a coordinate system through the first conversion matrix, the second conversion matrix and the third conversion matrix, and projecting the generated three-dimensional image to the target projection area through a half-transmitting half-reflecting mirror, so that the three-dimensional image and the current position of the flexible instrument are superposed to generate an augmented reality scene.
The invention also provides an electronic device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, wherein the processor implements the steps of any of the above-mentioned augmented reality surgical navigation methods for a flexible instrument when executing the program.
The present invention also provides a non-transitory computer readable storage medium having stored thereon a computer program which, when executed by a processor, performs the steps of the method for augmented reality surgical navigation for a flexible instrument as described in any one of the above.
According to the augmented reality surgical navigation device and method for a flexible instrument, the shape sensing module obtains the three-dimensional shape of the flexible instrument from grating optical fiber sensors arranged in the instrument, so that the three-dimensional model of the flexible instrument deforms in real time and the position and shape of the instrument inside the body can be accurately located.
Drawings
In order to more clearly illustrate the technical solutions of the present invention or the prior art, the drawings needed for the description of the embodiments or the prior art will be briefly described below, and it is obvious that the drawings in the following description are some embodiments of the present invention, and those skilled in the art can also obtain other drawings according to the drawings without creative efforts.
FIG. 1 is a schematic structural diagram of an augmented reality surgical navigation device for a flexible instrument provided by the present invention;
FIG. 2 is a schematic diagram of the overall structure of the augmented reality surgical navigation device for a flexible instrument provided by the invention;
FIG. 3 is a schematic flow chart of a method for augmented reality surgical navigation of a flexible instrument according to the present invention;
FIG. 4 is a schematic representation of a four fiber optic helically deployed flexible instrument provided by the present invention;
FIG. 5 is a schematic cross-sectional view of the position of a grating node in the flexible instrument provided by the present invention;
fig. 6 is a schematic structural diagram of an electronic device provided in the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention clearer, the technical solutions of the present invention will be clearly and completely described below with reference to the accompanying drawings, and it is obvious that the described embodiments are some, but not all embodiments of the present invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
Limited by the narrow working environment inside the body, it is difficult for doctors to directly observe the shape of the flexible instrument in vivo, and the instrument cannot be accurately positioned. In addition, the navigation screen of existing surgical navigation systems is usually far from the surgical area; the operator manipulates the surgical instrument according to the display on the screen, relying heavily on proprioception and clinical experience. This hand-eye incoordination adds uncertainty to the surgical operation.
Before the operation, the transformation matrix between the fiber-sensing global coordinate system and the optical marker coordinate system is obtained by calibration, as is the transformation matrix between the binocular camera coordinate system and the augmented reality projection coordinate system. After this setup, the flexible instrument augmented reality surgical navigation device is fixed to the operating table or a mobile trolley using the fixing bracket. During the operation, the binocular camera and the flexible instrument sensing module track the shape and spatial position of the flexible instrument in real time, and the half mirror projects the three-dimensional image of the flexible instrument onto the surgical area in real time to visually guide the operation. The flexible instrument is an instrument with an elongated structure, such as an endoscope or a soft robot; the invention places no particular limitation on this. The distal end of the flexible instrument is the end that enters the body; the proximal end is the end that remains outside the body.
Fig. 1 is a schematic structural diagram of an augmented reality surgical navigation device for a flexible instrument, and as shown in fig. 1, the augmented reality surgical navigation device for a flexible instrument includes a flexible instrument shape sensing module 101 and an augmented reality module 102, where:
the flexible instrument shape sensing module 101 comprises a grating sensor array and an optical fiber sensing demodulator, and is used for detecting the shape of the flexible instrument and sending the three-dimensional shape information obtained by detection to the augmented reality module 102; the grating sensor array is composed of 4 optical fibers, gratings are engraved at equal distances in each optical fiber, the gratings at the equal distances are constructed into a square area, and the cross section of the flexible instrument and the square area are positioned on the same central shaft;
the augmented reality module 102 includes a binocular camera and a three-dimensional display screen, and is configured to acquire a spatial position of the flexible instrument through the binocular camera, and generate a three-dimensional image corresponding to the flexible instrument according to the three-dimensional shape information and the spatial position, and display the three-dimensional image on the three-dimensional display screen;
wherein the optical fiber is arranged in a spiral surrounding manner on the inner part or the surface of the flexible instrument based on the same cross section of the flexible instrument.
In the invention, the grating sensor array of the flexible instrument shape sensing module 101 is an array of fiber grating sensors precisely integrated with the flexible instrument in a clockwise or counterclockwise helical winding, either on the surface of the instrument or inside it.
An optical positioning sensor is fixed at the proximal end of the flexible instrument, so that the binocular camera of the augmented reality module 102 positions the pose of the flexible instrument in real time. The binocular camera is integrated on the augmented reality module and calibrated, so that the three-dimensional image of the flexible instrument and the in-situ superposition display of the flexible instrument are realized, and the problem of difficult positioning caused by deformation of the flexible instrument in the body is solved.
According to the augmented reality surgical navigation device for a flexible instrument, the shape sensing module obtains the three-dimensional shape of the flexible instrument from grating optical fiber sensors arranged in the instrument, so that the three-dimensional model of the flexible instrument deforms in real time and the position and shape of the instrument inside the body can be accurately located.
Fig. 2 is a schematic structural diagram of an augmented reality navigation device for a flexible instrument according to the present invention, and as shown in fig. 2, the present invention discloses an augmented reality navigation device for a flexible instrument, which includes a flexible instrument shape sensing module 101 and an augmented reality module 102, where the flexible instrument shape sensing module 101 includes a fiber sensing demodulator 1011 and a fiber bragg grating sensor array 1012, and optionally, the grating sensor array 1012 is composed of 4 bragg fibers.
The augmented reality module 102 includes a binocular camera 1021 and a three-dimensional display 1022, optionally, an optical marker 1013 is disposed within a shooting range of the binocular camera 1021, and the optical marker 1013 is disposed at a proximal end of the flexible instrument 1014 for tracking a spatial position of the flexible instrument 1014 in real time.
Optionally, the augmented reality module 102 further includes a half mirror 1023, and the half mirror 1023 is used for projecting the three-dimensional image to the real scene, so that the three-dimensional image and the current position of the flexible instrument 1014 are superposed to generate the augmented reality scene.
Optionally, the augmented reality module 102 further includes a support housing 1024 for fixing the augmented reality module 102, and the three-dimensional display 1022, the binocular camera 1021 and the half mirror 1023 are disposed on a surface of the support housing 1024.
Optionally, a fixing bracket 103 is connected to a surface of the supporting housing 1024, and is used for fixing the augmented reality module 102; the fixed support 103 is provided with a rotating shaft, and the angle of the augmented reality module 102 is adjusted through the rotating shaft.
In the present invention, as shown in fig. 2, the binocular camera 1021, the three-dimensional display screen 1022 and the half mirror 1023 are integrated by the support housing 1024 and remain relatively fixed. The half mirror 1023 reflects the three-dimensional image displayed by the three-dimensional display 1022 into the observer's eyes while letting the observer see the real scene through it, so that a spatially fused augmented reality scene is perceived. Specifically, the spatial relationship between the binocular camera coordinate system and the projection coordinate system is calibrated with an optical marker (placed in the target projection area, which may be the minimally invasive surgical area), yielding the transformation matrix T_Tra→Dis from the binocular camera coordinate system Tra to the projection coordinate system Dis.
By implanting four Bragg fibers 1012 in the flexible instrument 1014, the flexible instrument shape sensing module 101 can accurately compute the shape P_Ins of the flexible instrument 1014; after the instrument enters the body, the three-dimensional flexible instrument model is deformed accordingly and displayed on the three-dimensional display 1022. In the present invention, an optical marker 1013 is fixed at the proximal end of the flexible instrument 1014 to track its spatial position in real time. The transformation matrix T_Ins→Mar from the flexible instrument coordinate system Ins to the optical marker coordinate system Mar is obtained by pre-calibration.

The three-dimensionally modeled flexible instrument can then be overlaid in situ on the real flexible instrument through the following coordinate transformation, yielding the augmented reality scene:

P_Dis = T_Tra→Dis · T_Mar→Tra · T_Ins→Mar · P_Ins

where T_Mar→Tra denotes the transformation matrix from the optical marker coordinate system Mar to the binocular camera coordinate system Tra.
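The overlay transform chain described above can be sketched with 4x4 homogeneous matrices. This is an illustrative sketch, not the patent's implementation: the numeric transforms below are made up for the example, whereas in the device they come from the pre-operative calibrations.

```python
import numpy as np

def make_T(R, t):
    """Build a 4x4 homogeneous transform from a rotation R and translation t."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

# Toy calibration results (assumptions): two pure translations and one rotation.
T_ins2mar = make_T(np.eye(3), np.array([0.0, 0.0, 10.0]))   # Ins -> Mar
T_mar2tra = make_T(np.eye(3), np.array([5.0, 0.0, 0.0]))    # Mar -> Tra
Rz90 = np.array([[0.0, -1.0, 0.0],
                 [1.0,  0.0, 0.0],
                 [0.0,  0.0, 1.0]])
T_tra2dis = make_T(Rz90, np.zeros(3))                       # Tra -> Dis

# A point of the sensed shape P_Ins, in homogeneous coordinates.
p_ins = np.array([1.0, 2.0, 3.0, 1.0])
# P_Dis = T_Tra->Dis @ T_Mar->Tra @ T_Ins->Mar @ P_Ins
p_dis = T_tra2dis @ T_mar2tra @ T_ins2mar @ p_ins
```

Chaining the matrices right-to-left mirrors the formula in the text: the instrument point is first expressed in the marker frame, then in the camera frame, and finally in the projection frame.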
In minimally invasive surgery the flexible instrument performs its operation inside the body, where it is difficult to observe directly. By displaying the three-dimensional image of the flexible instrument in situ in space, the operator gains a see-through effect and can be intuitively guided through the operation, avoiding hand-eye incoordination. Moreover, the augmented reality module does not contact the human body and can be used without sterilization, making the navigation device convenient to set up.
Fig. 3 is a schematic flow chart of the augmented reality surgical navigation method for a flexible instrument, and as shown in fig. 3, the augmented reality surgical navigation method for an augmented reality surgical navigation device for a flexible instrument according to the embodiment of the present invention includes:
step 301, acquiring three-dimensional shape information of a flexible instrument based on a flexible instrument shape sensing module, wherein the three-dimensional shape information is acquired through curvature information and flexibility information of an optical fiber in the flexible instrument shape sensing module.
In the present invention, FIG. 4 is a schematic diagram of a flexible instrument with four helically laid optical fibers. Referring to FIG. 4, four optical fibers P_j (j = 1, 2, 3, 4) are wound helically on the flexible instrument, which enables accurate shape measurement on thicker instruments, allows large-angle flexible bending, and reduces damage to the fibers from pulling.

Furthermore, gratings are inscribed at equal intervals along the fibers, and the four grating sensors at corresponding positions in the four fibers form a group distributed over a square region whose center lies on the centerline of the flexible instrument; that is, the cross section of the instrument and the square region share the same central axis. The grating positions of adjacent groups in each fiber are rotated 180 degrees about the centerline, so the gratings of different fibers can be combined into four virtual optical fibers l_j (j = 1, 2, 3, 4) parallel to the centerline. The invention constructs a centerline parameter equation from the strain measured by each grating and the dimensional parameters of the flexible instrument. Taking a Bragg fiber as an example, the details are as follows:
The wavelength λ_B of the Bragg reflected light is related to the grating period Λ and the effective refractive index n_eff of the grating by:

λ_B = 2 n_eff Λ

Since strain changes the grating period Λ, the wavelength λ_B changes as well. The grating strain ε and the wavelength change Δλ_B satisfy:

Δλ_B / λ_B = (1 − P_ε) ε

where P_ε is the photoelastic coefficient. From this relationship, the grating strain ε can be calculated from the difference Δλ_B of the Bragg wavelength before and after the environmental change (i.e., before and after the flexible instrument deforms):

ε = Δλ_B / ((1 − P_ε) λ_B)
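The wavelength-strain relations above can be sketched as two small helpers. This is an illustrative sketch, not the patent's code; the numeric values for n_eff, Λ and P_ε are typical silica-FBG assumptions, not figures from the patent.

```python
def bragg_wavelength(n_eff: float, period_nm: float) -> float:
    """Bragg condition: lambda_B = 2 * n_eff * Lambda."""
    return 2.0 * n_eff * period_nm

def strain_from_shift(delta_lambda_nm: float, lambda_b_nm: float,
                      p_eps: float = 0.22) -> float:
    """Invert dLambda_B / lambda_B = (1 - P_eps) * eps for the grating strain."""
    return delta_lambda_nm / ((1.0 - p_eps) * lambda_b_nm)

# Typical silica FBG around 1550 nm (assumed values).
lam = bragg_wavelength(n_eff=1.45, period_nm=535.0)   # about 1551.5 nm
eps = strain_from_shift(delta_lambda_nm=1.2, lambda_b_nm=lam)
```

A 1.2 nm shift on a ~1551.5 nm grating with P_ε = 0.22 corresponds to a strain on the order of 1e-3, the magnitude range these sensors typically resolve.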
In the invention, the cross section of the flexible instrument always remains circular, and each segment of its centerline has constant curvature and constant torsion. The instrument is therefore divided into segments (i segments, each segment i representing one section of the instrument), with the four virtual fibers equidistant from the centerline. l_i, the length of segment i of the centerline between two adjacent groups of gratings, can therefore be written as:

l_i = ∫ ‖S_i′(t)‖ dt

where S_i(t) is the centerline parameter equation.
FIG. 5 is a cross-sectional view of the positions of the grating nodes in the flexible instrument. As shown in FIG. 5, the four virtual fibers lie on a cylinder and are curves equidistant from the centerline. Thus, in each segment i, the theoretical length of a virtual fiber can be obtained as:

l̃_{j,i} = ∫ ‖S_{j,i}′(t)‖ dt,  with  S_{j,i}(t) = S_i(t) + d_{j,i} N_{j,i}(t)

where the subscript i = 1, 2, ..., N denotes the segment between each group of gratings and j = 1, 2, 3, 4 the four virtual fibers; l̃_{j,i} is the theoretical length of the j-th virtual fiber of the i-th segment, computed from the geometric position relation; d_{j,i} is the offset distance of the j-th virtual fiber of the i-th segment from the centerline, with N_{j,i}(t) the unit offset direction.
In addition, the Bragg gratings measure the strain ε_{j,i} at each grating location, so the measured length l′_{j,i} of each virtual fiber is:

l′_{j,i} = (1 + ε_{j,i}) l_i

where i = 1, 2, ..., N, j = 1, 2, 3, 4, and l_i is the length of the i-th segment. The theoretically calculated length l̃_{j,i} should equal the measured length l′_{j,i}:

l̃_{j,i} = l′_{j,i}
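The equality between theoretical and measured fiber lengths over-determines the local bending, so the unknowns can be solved by least squares. The sketch below illustrates this principle with the standard constant-curvature strain model ε_j = −κ d cos(θ_j − φ) for a fiber at radial offset d and angular position θ_j when the segment bends with curvature κ toward direction φ. This model and all numeric values are assumptions for illustration, not the patent's exact equations.

```python
import math
import numpy as np

d = 0.5e-3                                        # radial fiber offset (m), assumed
thetas = np.deg2rad([0.0, 90.0, 180.0, 270.0])    # angular positions of the 4 fibers

def synth_strains(kappa, phi):
    """Forward model: strain of each offset fiber under a bend (kappa, phi)."""
    return -kappa * d * np.cos(thetas - phi)

def fit_curvature(strains):
    """Least-squares fit of eps_j = a*cos(theta_j) + b*sin(theta_j),
    where a = -d*kappa*cos(phi) and b = -d*kappa*sin(phi)."""
    A = np.column_stack([np.cos(thetas), np.sin(thetas)])
    (a, b), *_ = np.linalg.lstsq(A, strains, rcond=None)
    kappa = math.hypot(a, b) / d
    phi = math.atan2(-b, -a)
    return kappa, phi

# Synthesize strains for a 2 m^-1 bend toward 30 degrees, then recover it.
eps = synth_strains(kappa=2.0, phi=math.radians(30.0))
kappa_hat, phi_hat = fit_curvature(eps)
```

With four fibers and two unknowns the system is redundant, which is what makes the fit robust to noise on individual grating readings.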
The unknown parameters a_i, b_i can be obtained by solving the above equations. The curvature κ_i and torsion τ_i of segment i of the flexible instrument are then given by:

κ_i = ‖S_i′(t) × S_i″(t)‖ / ‖S_i′(t)‖³

τ_i = (S_i′(t) × S_i″(t)) · S_i‴(t) / ‖S_i′(t) × S_i″(t)‖²
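These are the standard curvature and torsion formulas for a parametric curve, and they can be checked numerically against a curve whose κ and τ are known in closed form. The helix below is an assumed test case, not data from the patent: for S(t) = (a cos t, a sin t, b t), κ = a/(a²+b²) and τ = b/(a²+b²).

```python
import numpy as np

def curvature_torsion(d1, d2, d3):
    """kappa = |S' x S''| / |S'|^3,  tau = (S' x S'') . S''' / |S' x S''|^2."""
    c = np.cross(d1, d2)
    kappa = np.linalg.norm(c) / np.linalg.norm(d1) ** 3
    tau = float(np.dot(c, d3)) / float(np.dot(c, c))
    return kappa, tau

# Derivatives of the helix S(t) = (a cos t, a sin t, b t) at t = 0.7.
a, b, t = 2.0, 1.0, 0.7
d1 = np.array([-a * np.sin(t),  a * np.cos(t), b])    # S'(t)
d2 = np.array([-a * np.cos(t), -a * np.sin(t), 0.0])  # S''(t)
d3 = np.array([ a * np.sin(t), -a * np.cos(t), 0.0])  # S'''(t)
kappa, tau = curvature_torsion(d1, d2, d3)
# closed form: kappa = 2/5 = 0.4, tau = 1/5 = 0.2
```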
the curvature and the flexibility of each section node of the flexible instrument are obtained through the measurement algorithm and then converted into the global position of each nodeAnd calculating the shape of the flexible instrument. The shape of the flexible instrument centerline is defined by a set of discrete nodes OiIt is shown as the center of the cross-section.
To obtain the global position coordinates of each node, a global coordinate system {x, y, z} and a local coordinate system {x_i, y_i, z_i} for each node (i.e., each segment) are defined. In the present invention, the global coordinate system is fixed at the proximal base of the flexible instrument and coincides with the first local coordinate system {x_1, y_1, z_1}.

The local coordinate system {x_i, y_i, z_i} is fixed at node O_i and differs from the Frenet coordinate system only by a rotation α_i, where the z_i axis is tangent to the centerline at node O_i. In Frenet coordinate system i (with origin fixed at point O_i), the curvature vector of node O_i can be expressed as [0, κ_i, τ_i]. Converted into the local coordinate system {x_i, y_i, z_i}, it becomes:

[κ_i cos α_i, κ_i sin α_i, τ_i].
Further, the two components κ_i cos α_i and κ_i sin α_i of the curvature vector correspond to the local x_i and y_i directions, respectively, and τ_i is the torsion along the local z_i direction. The position p_i and orientation r_i of node O_i in the local coordinate system {x_i, y_i, z_i} can then be calculated. According to the piecewise constant-curvature method, the global position P_i and global orientation R_i of node O_i can be expressed as:

P_i = R_{i−1} p_i + P_{i−1},  R_i = R_{i−1} r_i

Because the centerline of a continuously deforming flexible instrument is in most cases a smooth curve, the changes in curvature and torsion are relatively uniform and free of abrupt jumps. Therefore, the three-dimensional shape P_Ins of the flexible instrument can be recovered in the global coordinate system by interpolating the positions and orientations of the discrete nodes.
Step 302, calibrating an optical marker arranged at the proximal end of the flexible instrument through a binocular camera to obtain a first conversion matrix and a second conversion matrix, and tracking the optical marker to obtain the spatial position of the flexible instrument; the first conversion matrix is used for converting the spatial coordinate system of the flexible instrument into the coordinate system of the optical marker, and the second conversion matrix is used for converting the coordinate system of the optical marker into the coordinate system of the binocular camera;
Step 303, according to the three-dimensional shape information and the spatial position, performing coordinate system conversion through the first conversion matrix and the second conversion matrix to obtain the position of the flexible instrument in the binocular camera coordinate system, performing three-dimensional modeling, and displaying the generated three-dimensional image on a three-dimensional display screen.
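Steps 302 and 303 amount to chaining two rigid transforms and applying the result to the sensed centerline. A minimal sketch follows; the names T_mar_ins and T_cam_mar and all numerical values are invented placeholders for calibration results, not data from the patent.

```python
import numpy as np

def make_T(R, t):
    """Build a 4x4 homogeneous transform from a 3x3 rotation and a translation."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

def transform_points(T, pts):
    """Apply a 4x4 homogeneous transform to an (n, 3) array of points."""
    homo = np.hstack([pts, np.ones((len(pts), 1))])
    return (homo @ T.T)[:, :3]

# hypothetical calibration results (identity rotations, pure offsets)
T_mar_ins = make_T(np.eye(3), [0.0, 0.0, 0.10])  # first matrix: Ins -> Mar
T_cam_mar = make_T(np.eye(3), [0.50, 0.0, 0.0])  # second matrix: Mar -> Cam

# sensed centerline nodes in the instrument frame (from the shape sensing step)
P_ins = np.array([[0.0, 0.0, 0.0],
                  [0.0, 0.0, 0.05]])
# position of the instrument in the binocular camera frame
P_cam = transform_points(T_cam_mar @ T_mar_ins, P_ins)
```

The composed matrix T_cam_mar @ T_mar_ins plays the role of the two conversion matrices applied in sequence; the resulting points P_cam are what would be fed into the three-dimensional modeling and display step.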
Optionally, on the basis of the foregoing embodiment, the method further includes:
calibrating the optical marker of the target projection area through the binocular camera to obtain a third conversion matrix, wherein the third conversion matrix is used for converting a coordinate system of the binocular camera to a projection coordinate system;
and according to the three-dimensional shape information and the space position, converting a coordinate system through the first conversion matrix, the second conversion matrix and the third conversion matrix, and projecting the generated three-dimensional image to the target projection area through a half-transmitting half-reflecting mirror, so that the three-dimensional image and the current position of the flexible instrument are superposed to generate an augmented reality scene.
In the present invention, the minimally invasive surgical area is taken as the target projection area, and an optical marker is used to calibrate the spatial position relationship between the coordinate system of the binocular camera and the projection coordinate system, yielding the transformation matrix T_Tra^Dis from the binocular camera coordinate system Tra to the projection coordinate system Dis.
After the shape P_Ins of the flexible instrument is accurately calculated, the conversion matrix T_Ins^Mar from the spatial coordinate system Ins of the flexible instrument to the coordinate system Mar of the optical marker is obtained through pre-calibration.
Further, the coordinate system conversion formula is:

P_Dis = T_Tra^Dis · T_Mar^Tra · T_Ins^Mar · P_Ins,

and the three-dimensionally modeled flexible instrument model is superimposed in situ on the real flexible instrument in space to obtain the augmented reality scene, wherein T_Mar^Tra denotes the transformation matrix from the optical marker coordinate system Mar to the binocular camera coordinate system Tra.
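The projection chain above multiplies three homogeneous transforms right-to-left before applying them to instrument-frame points. The sketch below illustrates the composition; all rotations and offsets are invented placeholders for calibration results, not values from the patent.

```python
import numpy as np

def make_T(R, t):
    """4x4 homogeneous transform from rotation R and translation t."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

# hypothetical calibration results for the three matrices in the chain
T_mar_ins = make_T(np.eye(3), [0.0, 0.0, 0.10])   # Ins -> Mar (pre-calibrated)
T_tra_mar = make_T(np.eye(3), [0.20, 0.0, 0.0])   # Mar -> Tra (tracked marker)
R_flip = np.diag([-1.0, 1.0, -1.0])               # 180-degree turn about the y axis
T_dis_tra = make_T(R_flip, [0.0, 0.0, 1.0])       # Tra -> Dis (projection frame)

# full chain: instrument coordinates -> projection (display) coordinates
T_dis_ins = T_dis_tra @ T_tra_mar @ T_mar_ins

tip_ins = np.array([0.0, 0.0, 0.05, 1.0])         # a centerline point, homogeneous
tip_dis = T_dis_ins @ tip_ins                     # where to draw it in the projection
```

Because every factor is a rigid transform, the composed T_dis_ins can be precomputed once per tracking frame and applied to every node of the instrument model before projection through the half mirror.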
In minimally invasive surgery, a flexible instrument enters the body to perform the operation and is difficult to observe directly, which makes the instrument hard to position and causes hand-eye incoordination. To solve these problems, the three-dimensional shape of the flexible instrument inside the body is obtained through the flexible instrument shape sensing module, so that the three-dimensional model of the flexible instrument deforms in real time; the projection position of the flexible instrument in space is adjusted in real time through the augmented reality module, and the three-dimensional model is superimposed in situ on the real flexible instrument, realizing in-situ augmented reality real-time guidance. Because no X-ray radiation is involved, the dependence on operator experience is reduced, and the safety and efficiency of the operation are enhanced.
According to the augmented reality surgical navigation method for the flexible instrument provided by the invention, the three-dimensional shape of the flexible instrument is obtained by arranging fiber Bragg grating sensors in the flexible instrument within the flexible instrument shape sensing module, so that the three-dimensional model of the flexible instrument deforms in real time and the position and shape of the flexible instrument inside the body can be accurately located.
Fig. 6 is a schematic structural diagram of an electronic device provided in the present invention, and as shown in fig. 6, the electronic device may include: a processor (processor)601, a communication Interface (Communications Interface)602, a memory (memory)603 and a communication bus 604, wherein the processor 601, the communication Interface 602 and the memory 603 complete communication with each other through the communication bus 604. The processor 601 may invoke logic instructions in the memory 603 to perform an augmented reality surgical navigation method for a flexible instrument, the method comprising: acquiring three-dimensional shape information of a flexible instrument based on a flexible instrument shape sensing module, wherein the three-dimensional shape information is acquired through curvature information and flexibility information of optical fibers in the flexible instrument shape sensing module; calibrating an optical marker arranged at the near end of the flexible instrument through a binocular camera to obtain a first conversion matrix and a second conversion matrix, and tracking the optical marker to obtain the spatial position of the flexible instrument; the first conversion matrix is used for converting a space coordinate system of the flexible instrument into a coordinate system of the optical marker, and the second conversion matrix is used for converting the coordinate system of the optical marker into a coordinate system of the binocular camera; and according to the three-dimensional shape information and the space position, converting a coordinate system through the first conversion matrix and the second conversion matrix, acquiring the position of the flexible instrument in the binocular camera, performing three-dimensional modeling, and displaying the generated three-dimensional image on a three-dimensional display screen.
In addition, the logic instructions in the memory 603 may be implemented in the form of software functional units and stored in a computer readable storage medium when the logic instructions are sold or used as independent products. Based on such understanding, the technical solution of the present invention may be embodied in the form of a software product, which is stored in a storage medium and includes instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method according to the embodiments of the present invention. And the aforementioned storage medium includes: a U-disk, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk, and other various media capable of storing program codes.
In another aspect, the present invention also provides a computer program product comprising a computer program stored on a non-transitory computer readable storage medium, the computer program comprising program instructions which, when executed by a computer, enable the computer to perform the method for augmented reality surgical navigation for a flexible instrument provided by the above methods, the method comprising: acquiring three-dimensional shape information of a flexible instrument based on a flexible instrument shape sensing module, wherein the three-dimensional shape information is acquired through curvature information and flexibility information of optical fibers in the flexible instrument shape sensing module; calibrating an optical marker arranged at the near end of the flexible instrument through a binocular camera to obtain a first conversion matrix and a second conversion matrix, and tracking the optical marker to obtain the spatial position of the flexible instrument; the first conversion matrix is used for converting a space coordinate system of the flexible instrument into a coordinate system of the optical marker, and the second conversion matrix is used for converting the coordinate system of the optical marker into a coordinate system of the binocular camera; and according to the three-dimensional shape information and the space position, converting a coordinate system through the first conversion matrix and the second conversion matrix, acquiring the position of the flexible instrument in the binocular camera, performing three-dimensional modeling, and displaying the generated three-dimensional image on a three-dimensional display screen.
In yet another aspect, the present invention also provides a non-transitory computer readable storage medium having stored thereon a computer program, which when executed by a processor is implemented to perform the method for augmented reality surgical navigation for a flexible instrument provided in the above embodiments, the method comprising: acquiring three-dimensional shape information of a flexible instrument based on a flexible instrument shape sensing module, wherein the three-dimensional shape information is acquired through curvature information and flexibility information of optical fibers in the flexible instrument shape sensing module; calibrating an optical marker arranged at the near end of the flexible instrument through a binocular camera to obtain a first conversion matrix and a second conversion matrix, and tracking the optical marker to obtain the spatial position of the flexible instrument; the first conversion matrix is used for converting a space coordinate system of the flexible instrument into a coordinate system of the optical marker, and the second conversion matrix is used for converting the coordinate system of the optical marker into a coordinate system of the binocular camera; and according to the three-dimensional shape information and the space position, converting a coordinate system through the first conversion matrix and the second conversion matrix, acquiring the position of the flexible instrument in the binocular camera, performing three-dimensional modeling, and displaying the generated three-dimensional image on a three-dimensional display screen.
The above-described embodiments of the apparatus are merely illustrative, and the units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of the present embodiment. One of ordinary skill in the art can understand and implement it without inventive effort.
Through the above description of the embodiments, those skilled in the art will clearly understand that each embodiment can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware. With this understanding in mind, the above-described technical solutions may be embodied in the form of a software product, which can be stored in a computer-readable storage medium such as ROM/RAM, magnetic disk, optical disk, etc., and includes instructions for causing a computer device (which may be a personal computer, a server, or a network device, etc.) to execute the methods described in the embodiments or some parts of the embodiments.
Finally, it should be noted that: the above examples are only intended to illustrate the technical solution of the present invention, but not to limit it; although the present invention has been described in detail with reference to the foregoing embodiments, it will be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; and such modifications or substitutions do not depart from the spirit and scope of the corresponding technical solutions of the embodiments of the present invention.

Claims (10)

1. An augmented reality surgical navigation device for a flexible instrument, comprising a flexible instrument shape sensing module and an augmented reality module, wherein:
the flexible instrument shape sensing module comprises a grating sensor array and an optical fiber sensing demodulator, and is used for detecting the shape of the flexible instrument and sending the three-dimensional shape information obtained by detection to the augmented reality module; the grating sensor array is composed of 4 optical fibers, gratings are inscribed at equal intervals in each optical fiber, the equally spaced gratings form a square area, and the cross section of the flexible instrument and the square area are positioned on the same central axis;
the augmented reality module comprises a binocular camera and a three-dimensional display screen and is used for acquiring the spatial position of the flexible instrument through the binocular camera and generating a three-dimensional image corresponding to the flexible instrument according to the three-dimensional shape information and the spatial position to display on the three-dimensional display screen;
wherein the optical fiber is arranged in a spiral surrounding manner on the inner part or the surface of the flexible instrument based on the same cross section of the flexible instrument.
2. The augmented reality surgical navigation device for a flexible instrument of claim 1, wherein the grating sensor array is composed of 4 fiber Bragg grating fibers.
3. The augmented reality surgical navigation device for a flexible instrument of claim 1, wherein an optical marker is disposed within a capture range of the binocular camera and is disposed at a proximal end of the flexible instrument for tracking a spatial position of the flexible instrument in real time.
4. The augmented reality surgical navigation device for a flexible instrument of claim 1, wherein the augmented reality module further comprises a half mirror for projecting the three-dimensional image to a real scene so that the three-dimensional image and the current position of the flexible instrument are superimposed to generate an augmented reality scene.
5. The augmented reality surgical navigation device for a flexible instrument of claim 4, wherein the augmented reality module further comprises a support housing for securing the augmented reality module, the three-dimensional display, the binocular camera and the half mirror being disposed on a surface of the support housing.
6. The augmented reality surgical navigation device for a flexible instrument of claim 5, wherein a fixing bracket is connected to a surface of the support housing for fixing the augmented reality module; the fixing bracket is provided with a rotating shaft, and the rotating shaft is used for adjusting the angle of the augmented reality module.
7. An augmented reality surgical navigation method based on the augmented reality surgical navigation device for the flexible instrument according to any one of claims 1 to 6, characterized by comprising:
acquiring three-dimensional shape information of a flexible instrument based on a flexible instrument shape sensing module, wherein the three-dimensional shape information is acquired through curvature information and flexibility information of optical fibers in the flexible instrument shape sensing module;
calibrating an optical marker arranged at the near end of the flexible instrument through a binocular camera to obtain a first conversion matrix and a second conversion matrix, and tracking the optical marker to obtain the spatial position of the flexible instrument; the first conversion matrix is used for converting a space coordinate system of the flexible instrument into a coordinate system of the optical marker, and the second conversion matrix is used for converting the coordinate system of the optical marker into a coordinate system of the binocular camera;
and according to the three-dimensional shape information and the space position, converting a coordinate system through the first conversion matrix and the second conversion matrix, acquiring the position of the flexible instrument in the binocular camera, performing three-dimensional modeling, and displaying the generated three-dimensional image on a three-dimensional display screen.
8. The method for augmented reality surgical navigation for a flexible instrument of claim 7, further comprising:
calibrating the optical marker of the target projection area through the binocular camera to obtain a third conversion matrix, wherein the third conversion matrix is used for converting a coordinate system of the binocular camera to a projection coordinate system;
and according to the three-dimensional shape information and the space position, converting a coordinate system through the first conversion matrix, the second conversion matrix and the third conversion matrix, and projecting the generated three-dimensional image to the target projection area through a half-transmitting half-reflecting mirror, so that the three-dimensional image and the current position of the flexible instrument are superposed to generate an augmented reality scene.
9. An electronic device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, wherein the processor when executing the computer program implements the steps of the method for augmented reality surgical navigation for a flexible instrument according to any one of claims 7 to 8.
10. A non-transitory computer readable storage medium having stored thereon a computer program, wherein the computer program when executed by a processor implements the steps of the method for augmented reality surgical navigation for a flexible instrument according to any one of claims 7 to 8.
CN202110552899.8A 2021-05-20 2021-05-20 Augmented reality surgical navigation device for flexible instrument Active CN113349928B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110552899.8A CN113349928B (en) 2021-05-20 2021-05-20 Augmented reality surgical navigation device for flexible instrument

Publications (2)

Publication Number Publication Date
CN113349928A true CN113349928A (en) 2021-09-07
CN113349928B CN113349928B (en) 2023-01-24

Family

ID=77527040


Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113907867A (en) * 2021-12-16 2022-01-11 北京微刀医疗科技有限公司 Irreversible electroporation ablation needle and irreversible electroporation ablation needle visualization system
WO2024077889A1 (en) * 2022-10-13 2024-04-18 中国科学院自动化研究所 Operation prompting method and apparatus for flexible instrument, device, and storage medium


Patent Citations (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20070156019A1 (en) * 2005-12-30 2007-07-05 Larkin David Q Robotic surgery system including position sensors using fiber bragg gratings
CN103153223A (en) * 2010-10-08 2013-06-12 皇家飞利浦电子股份有限公司 Flexible tether with integrated sensors for dynamic instrument tracking
US20130188855A1 (en) * 2010-10-08 2013-07-25 Koninklijke Philips Electronics N.V. Flexible tether with integrated sensors for dynamic instrument tracking
US20130308138A1 (en) * 2011-01-28 2013-11-21 Koninklijke Philips N.V. Fiber optic sensor for determining 3d shape
CN107690302A (en) * 2015-04-06 2018-02-13 直观外科手术操作公司 The system and method for registration compensation in the surgical operation of image guiding
US20180143373A1 (en) * 2015-06-01 2018-05-24 Leibniz-Institut Für Photonische Technologien E.V. Multimode optical fibers and methods for providing a light transmission system using such fibers
US20190216572A1 (en) * 2016-07-11 2019-07-18 Taiwan Main Orthopaedic Biotechnology Co., Ltd. Image guided augmented reality method and a surgical navigation of wearable glasses using the same
CN106249881A (en) * 2016-07-21 2016-12-21 江苏奥格视特信息科技有限公司 Augmented reality view field space and virtual three-dimensional target dynamic method for registering
CN111417353A (en) * 2017-10-10 2020-07-14 威布鲁尼克斯公司 Surgical shape sensing fiber optic apparatus and method
CN109827518A (en) * 2017-11-23 2019-05-31 桂林电子科技大学 Fiber integrated interferometer parallel-connection structure three-dimensional spatial distribution formula changing sensor
WO2020172413A1 (en) * 2019-02-20 2020-08-27 Humanetics Innovative Solutions, Inc. Optical fiber system having helical core structure for detecting forces during a collision test
CN110638527A (en) * 2019-07-01 2020-01-03 中国科学院苏州生物医学工程技术研究所 Operation microscopic imaging system based on optical coherence tomography augmented reality
CN111265299A (en) * 2020-02-19 2020-06-12 上海理工大学 Operation navigation method based on optical fiber shape sensing
CN111728697A (en) * 2020-07-21 2020-10-02 中国科学技术大学 Operation training and navigation device based on head-mounted three-dimensional augmented reality equipment

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
崔曦雯;陈芳;韩博轩;马聪;马龙飞: "虚拟内窥镜图像增强膝关节镜手术导航系统", 《中国生物医学工程学报》 *




Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant