CN113349928B - Augmented reality surgical navigation device for flexible instrument - Google Patents
- Publication number: CN113349928B (application CN202110552899.8A)
- Authority: CN (China)
- Prior art keywords: flexible instrument, augmented reality, coordinate system, conversion matrix, dimensional
- Legal status: Active
Classifications
- A—HUMAN NECESSITIES > A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE > A61B—DIAGNOSIS; SURGERY; IDENTIFICATION > A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/20—Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
- A61B34/25—User interfaces for surgical systems
- A61B2034/2046—Tracking techniques
- A61B2034/2061—Tracking techniques using shape-sensors, e.g. fiber shape sensors with Bragg gratings
Abstract
The invention provides an augmented reality surgical navigation device and method for a flexible instrument. The device comprises a flexible instrument shape sensing module and an augmented reality module. The flexible instrument shape sensing module comprises a grating sensor array and an optical fiber sensing demodulator, and is used for detecting the shape of the flexible instrument and sending the detected three-dimensional shape information to the augmented reality module. The augmented reality module comprises a binocular camera and a three-dimensional display screen; it acquires the spatial position of the flexible instrument through the binocular camera, generates a three-dimensional image of the flexible instrument from the three-dimensional shape information and the spatial position, and displays the image on the three-dimensional display screen. The grating sensor array is composed of four optical fibers arranged helically inside or on the surface of the flexible instrument. The invention can accurately locate the position and shape of the flexible instrument inside the body.
Description
Technical Field
The invention relates to the technical field of medical instruments, and in particular to an augmented reality surgical navigation device and method for a flexible instrument.
Background
During minimally invasive surgery, spatial information about the medical instrument inside the patient's body is essential for its accurate operation. However, many medical instruments inevitably change shape after entering the body, which makes them difficult to localize in space: ablation needles, biopsy needles, and intramedullary nails deform under stress, while flexible instruments such as endoscopes and soft robots actively change shape to follow a natural orifice of the human body or for convenience of operation.
Current technologies for acquiring the shape of a flexible instrument mainly comprise visual imaging, medical imaging, and electromagnetic tracking. Visual-image-based methods are convenient to use but cannot handle environments with obstacles. Medical images yield three-dimensional information only with difficulty: conventional intraoperative ultrasound or X-ray fluoroscopic positioning places high demands on the operating skill of the physician and greatly increases the difficulty of the operation, and fluoroscopic imaging exposes both physician and patient to a large radiation dose. Electromagnetic sensors are easily disturbed by surrounding ferromagnetic materials and require electromagnetically compatible surgical instruments and accessories, making the working environment demanding and the cost high. Constrained by the narrow working environment inside the body, the existing methods for acquiring the shape of a flexible instrument therefore leave the physician unable to directly observe the instrument's shape in vivo or to position it accurately.
Therefore, there is a need for an augmented reality surgical navigation device and method for flexible instruments that addresses the above issues.
Disclosure of Invention
To address the problems in the prior art, the invention provides an augmented reality surgical navigation device and method for a flexible instrument.
The invention provides an augmented reality surgical navigation device for a flexible instrument, comprising a flexible instrument shape sensing module and an augmented reality module, wherein:
the flexible instrument shape sensing module comprises a grating sensor array and an optical fiber sensing demodulator, and is used for detecting the shape of the flexible instrument and sending the detected three-dimensional shape information to the augmented reality module; the grating sensor array is composed of four optical fibers, gratings are inscribed at equal intervals along each fiber, each set of corresponding gratings delimits a square region, and the cross section of the flexible instrument and the square region lie on the same central axis;
the augmented reality module comprises a binocular camera and a three-dimensional display screen, and is used for acquiring the spatial position of the flexible instrument through the binocular camera and for generating, from the three-dimensional shape information and the spatial position, a three-dimensional image of the flexible instrument to be displayed on the three-dimensional display screen;
wherein the optical fibers are arranged in a helically wound manner, inside the flexible instrument or on its surface, about a common cross section of the flexible instrument.
According to the augmented reality surgical navigation device for the flexible instrument provided by the invention, the grating sensor array is composed of four fiber Bragg grating (FBG) optical fibers.
According to the augmented reality surgical navigation device for the flexible instrument provided by the invention, an optical marker is arranged within the shooting range of the binocular camera and attached at the proximal end of the flexible instrument, so that the spatial position of the flexible instrument can be tracked in real time.
According to the augmented reality surgical navigation device for the flexible instrument provided by the invention, the augmented reality module further comprises a half-mirror (semi-transparent, semi-reflective) for projecting the three-dimensional image into the real scene, so that the three-dimensional image is superimposed on the current position of the flexible instrument to generate an augmented reality scene.
According to the augmented reality surgical navigation device for the flexible instrument, the augmented reality module further comprises a supporting shell used for fixing the augmented reality module, and the three-dimensional display, the binocular camera and the half-mirror are arranged on the surface of the supporting shell.
According to the augmented reality surgical navigation device for the flexible instrument provided by the invention, a fixing bracket for securing the augmented reality module is connected to the surface of the supporting shell; the fixing bracket is provided with a rotating shaft for adjusting the angle of the augmented reality module.
The invention also provides an augmented reality surgical navigation method based on any one of the above augmented reality surgical navigation devices for a flexible instrument, comprising the following steps:
acquiring three-dimensional shape information of the flexible instrument based on the flexible instrument shape sensing module, wherein the three-dimensional shape information is obtained from the curvature and torsion information of the optical fibers in the flexible instrument shape sensing module;
calibrating an optical marker arranged at the proximal end of the flexible instrument with the binocular camera to obtain a first conversion matrix and a second conversion matrix, and tracking the optical marker to obtain the spatial position of the flexible instrument; the first conversion matrix converts the spatial coordinate system of the flexible instrument into the coordinate system of the optical marker, and the second conversion matrix converts the coordinate system of the optical marker into the coordinate system of the binocular camera;
and obtaining, from the three-dimensional shape information and the spatial position, the position of the flexible instrument in the binocular camera coordinate system by applying the first and second conversion matrices, performing three-dimensional modeling, and displaying the generated three-dimensional image on the three-dimensional display screen.
According to the augmented reality surgical navigation method for a flexible instrument provided by the invention, the method further comprises the following steps:
calibrating the optical marker of the target projection area through the binocular camera to obtain a third conversion matrix, wherein the third conversion matrix is used for converting a coordinate system of the binocular camera to a projection coordinate system;
and converting coordinate systems, according to the three-dimensional shape information and the spatial position, through the first, second and third conversion matrices, and projecting the generated three-dimensional image onto the target projection area through the half-mirror, so that the three-dimensional image is superimposed on the current position of the flexible instrument to generate an augmented reality scene.
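The navigation loop described by the steps above can be sketched as follows. This is a minimal illustration, not the patented implementation: `sense_shape` and `track_marker` are stand-ins for the fiber-grating demodulator and the binocular tracker, and every matrix value is a hypothetical placeholder.

```python
import numpy as np

def sense_shape():
    """Stand-in for the shape sensing module: sampled centerline points of
    the flexible instrument in its own coordinate system Ins (a synthetic
    quarter-circle arc here)."""
    s = np.linspace(0.0, np.pi / 2, 20)
    return np.c_[np.zeros_like(s), 1.0 - np.cos(s), np.sin(s)]

def track_marker():
    """Stand-in for binocular marker tracking: returns the first (Ins->Mar)
    and second (Mar->Tra) conversion matrices as 4x4 homogeneous transforms
    (pure translations, arbitrary values)."""
    T_mar_ins = np.eye(4); T_mar_ins[2, 3] = 0.05
    T_tra_mar = np.eye(4); T_tra_mar[0, 3] = 0.30
    return T_mar_ins, T_tra_mar

def navigate_frame():
    """One navigation cycle: sense shape, track marker, map the instrument
    into the binocular camera frame for rendering on the 3-D display."""
    pts = sense_shape()
    T_mar_ins, T_tra_mar = track_marker()
    M = T_tra_mar @ T_mar_ins                    # chained Ins -> Tra
    homo = np.c_[pts, np.ones(len(pts))]
    return (homo @ M.T)[:, :3]                   # instrument in camera frame
```

In a real system `navigate_frame` would run once per camera frame, feeding the transformed point set to the three-dimensional modeling and display stage.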
The invention also provides an electronic device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, wherein the processor implements the steps of any of the above-mentioned augmented reality surgical navigation methods for a flexible instrument when executing the program.
The present invention also provides a non-transitory computer readable storage medium having stored thereon a computer program which, when executed by a processor, performs the steps of the method for augmented reality surgical navigation for a flexible instrument as described in any one of the above.
With the augmented reality surgical navigation device and method for a flexible instrument provided by the invention, the grating fiber sensors arranged in the flexible instrument allow the shape sensing module to obtain the instrument's three-dimensional shape, so that the three-dimensional model of the instrument deforms in real time and the position and shape of the instrument inside the body can be accurately located.
Drawings
In order to more clearly illustrate the present invention or the technical solutions in the prior art, the drawings used in the embodiments or the description of the prior art will be briefly described below, and it is obvious that the drawings in the following description are some embodiments of the present invention, and other drawings can be obtained by those skilled in the art without creative efforts.
FIG. 1 is a schematic structural diagram of an augmented reality surgical navigation device for a flexible instrument provided by the present invention;
FIG. 2 is a schematic diagram of the overall structure of the augmented reality surgical navigation device for a flexible instrument provided by the invention;
FIG. 3 is a schematic flow chart of a method for augmented reality surgical navigation of a flexible instrument according to the present invention;
FIG. 4 is a schematic diagram of a flexible instrument with four helically deployed optical fibers provided by the present invention;
FIG. 5 is a schematic cross-sectional view of the position of a grating node in the flexible instrument provided by the present invention;
fig. 6 is a schematic structural diagram of an electronic device provided in the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, the technical solutions of the present invention will be clearly and completely described below with reference to the accompanying drawings, and it is obvious that the described embodiments are some, but not all embodiments of the present invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
Constrained by the narrow working environment inside the body, doctors find it difficult to directly observe the shape of the flexible instrument in vivo and cannot position it accurately. In addition, the navigation screen of existing surgical navigation systems is usually far from the surgical area; the operator manipulates the surgical instruments according to the display on the screen, relying heavily on proprioception and clinical experience, which creates hand-eye incoordination and adds uncertainty to the surgical operation.
Before the operation, a transformation matrix between the fiber-sensing global coordinate system and the optical marker coordinate system is obtained through calibration, as is a transformation matrix between the binocular camera coordinate system and the augmented reality projection coordinate system. After this setup is completed, the flexible instrument augmented reality surgical navigation device is fixed to the operating table or a mobile trolley using the fixing bracket. During the operation, the binocular camera and the flexible instrument sensing module track the shape and spatial position of the flexible instrument in real time, and the half-mirror projects the three-dimensional image of the instrument onto the surgical area in real time to guide the operation visually. Note that the flexible instrument is any instrument with an elongated, strip-shaped structure, such as an endoscope or a soft robot; the invention is not limited in this respect. The distal end of the flexible instrument is the end that enters the body, and the proximal end is the end that remains outside after insertion.
Fig. 1 is a schematic structural diagram of an augmented reality surgical navigation device for a flexible instrument, and as shown in fig. 1, the present invention provides an augmented reality surgical navigation device for a flexible instrument, which includes a flexible instrument shape sensing module 101 and an augmented reality module 102, wherein:
the flexible instrument shape sensing module 101 comprises a grating sensor array and an optical fiber sensing demodulator, and is used for detecting the shape of the flexible instrument and sending the detected three-dimensional shape information to the augmented reality module 102; the grating sensor array is composed of four optical fibers, gratings are inscribed at equal intervals along each fiber, each set of corresponding gratings delimits a square region, and the cross section of the flexible instrument and the square region lie on the same central axis;
the augmented reality module 102 comprises a binocular camera and a three-dimensional display screen, and is used for acquiring the spatial position of the flexible instrument through the binocular camera, and generating a three-dimensional image corresponding to the flexible instrument according to the three-dimensional shape information and the spatial position to be displayed on the three-dimensional display screen;
wherein the optical fibers are arranged in a helically wound manner, inside the flexible instrument or on its surface, about a common cross section of the flexible instrument.
In the invention, the grating sensor array of the flexible instrument shape sensing module 101 is an array of multiple fiber grating sensors precisely integrated with the flexible instrument in a clockwise or anticlockwise helical winding, either on the surface of the flexible instrument or inside it.
An optical positioning marker is fixed at the proximal end of the flexible instrument so that the binocular camera of the augmented reality module 102 can localize the pose of the instrument in real time. The binocular camera is integrated into the augmented reality module and calibrated, enabling in-situ superimposed display of the three-dimensional image on the real flexible instrument and solving the positioning difficulty caused by deformation of the instrument inside the body.
According to the augmented reality surgical navigation device for the flexible instrument provided by the invention, the grating fiber sensors arranged in the flexible instrument allow the shape sensing module to obtain the instrument's three-dimensional shape, so that the three-dimensional model of the instrument deforms in real time and the position and shape of the instrument inside the body can be accurately located.
Fig. 2 is a schematic diagram of the overall structure of the augmented reality surgical navigation device for a flexible instrument provided by the present invention. Referring to fig. 2, the device includes a flexible instrument shape sensing module 101 and an augmented reality module 102, where the flexible instrument shape sensing module 101 includes an optical fiber sensing demodulator 1011 and a fiber grating sensor array 1012; optionally, the grating sensor array 1012 is formed by four Bragg fibers.
The augmented reality module 102 includes a binocular camera 1021 and a three-dimensional display 1022, optionally, an optical marker 1013 is disposed within a shooting range of the binocular camera 1021, and the optical marker 1013 is disposed at a proximal end of the flexible instrument 1014 for tracking a spatial position of the flexible instrument 1014 in real time.
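The patent does not detail how the binocular camera 1021 recovers the 3-D position of the optical marker 1013; a common approach for a calibrated stereo pair is linear (DLT) triangulation. The sketch below is illustrative only — the camera intrinsics and the 10 cm baseline are made-up values, not taken from the patent.

```python
import numpy as np

def triangulate(P1, P2, x1, x2):
    """DLT triangulation of one 3-D point from a calibrated stereo pair.

    P1, P2: 3x4 projection matrices of the left and right cameras.
    x1, x2: (u, v) pixel coordinates of the same marker point in each view.
    Returns the 3-D point minimizing the algebraic reprojection residual.
    """
    A = np.array([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    _, _, Vt = np.linalg.svd(A)      # solution = right singular vector of
    X = Vt[-1]                       # the smallest singular value
    return X[:3] / X[3]              # dehomogenize

# Hypothetical stereo rig: identical intrinsics, 10 cm horizontal baseline.
K = np.array([[800.0, 0.0, 320.0], [0.0, 800.0, 240.0], [0.0, 0.0, 1.0]])
P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = K @ np.hstack([np.eye(3), np.array([[-0.1], [0.0], [0.0]])])
```

As a sanity check, projecting a known point through both cameras and triangulating it back recovers the point exactly in the noise-free case.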
Optionally, the augmented reality module 102 further includes a half mirror 1023, and the half mirror 1023 is used for projecting the three-dimensional image to the real scene, so that the three-dimensional image and the current position of the flexible instrument 1014 are superposed to generate the augmented reality scene.
Optionally, the augmented reality module 102 further includes a support housing 1024 for fixing the augmented reality module 102, and the three-dimensional display 1022, the binocular camera 1021 and the half mirror 1023 are disposed on a surface of the support housing 1024.
Optionally, a fixing bracket 103 is connected to a surface of the supporting housing 1024, and is used for fixing the augmented reality module 102; the fixed support 103 is provided with a rotating shaft, and the angle of the augmented reality module 102 is adjusted through the rotating shaft.
In the present invention, as can be seen in fig. 2, the binocular camera 1021, the three-dimensional display screen 1022 and the half mirror 1023 are integrated by the support housing 1024 and remain fixed relative to one another. The half mirror 1023 reflects the three-dimensional image shown on the three-dimensional display 1022 into the observer's eyes while letting the observer see the real scene through it, so that a spatially fused augmented reality scene is perceived. Specifically, an optical marker placed in the target projection area (which may be the minimally invasive surgery area) is used to calibrate the spatial relationship between the binocular camera coordinate system and the projection coordinate system, yielding the transformation matrix ${}^{Dis}T_{Tra}$ from the binocular camera coordinate system Tra to the projection coordinate system Dis. By implanting the four Bragg fibers 1012 into the flexible instrument 1014, the flexible instrument shape sensing module 101 can accurately compute the shape $P_{Ins}$ of the flexible instrument 1014; after the instrument is introduced into the body, the three-dimensionally modeled instrument is deformed correspondingly and displayed on the three-dimensional display 1022. The proximal end of the flexible instrument 1014 carries a fixed optical marker 1013 for tracking the instrument's spatial position in real time.
The conversion matrix ${}^{Mar}T_{Ins}$ from the flexible instrument spatial coordinate system Ins to the optical marker coordinate system Mar is obtained through pre-calibration. The three-dimensionally modeled flexible instrument can then be overlaid in situ on the real instrument in space, obtaining the augmented reality scene, through the coordinate system conversion formula:

$P_{Dis} = {}^{Dis}T_{Tra}\,{}^{Tra}T_{Mar}\,{}^{Mar}T_{Ins}\,P_{Ins}$,

where ${}^{Tra}T_{Mar}$ denotes the transformation matrix from the optical marker coordinate system Mar to the binocular camera coordinate system Tra.
In minimally invasive surgery, the flexible instrument operates inside the body and is difficult to observe directly. Displaying its three-dimensional image in situ gives the operator a see-through effect and intuitively guides the surgical operation without hand-eye incoordination. Because the augmented reality module does not contact the human body, it can be used without sterilization and is convenient to set up; the augmented reality surgical navigation device for the flexible instrument thus greatly reduces the radiation dose and the operation time.
Fig. 3 is a schematic flow chart of the augmented reality surgical navigation method for a flexible instrument provided by the present invention. As shown in fig. 3, the augmented reality surgical navigation method based on the augmented reality surgical navigation device for a flexible instrument according to the embodiment of the present invention includes:

301, acquiring three-dimensional shape information of the flexible instrument based on the flexible instrument shape sensing module, wherein the three-dimensional shape information is obtained from the curvature and torsion information of the optical fibers in the module.
In the present invention, FIG. 4 is a schematic diagram of a flexible instrument with four helically laid optical fibers. Referring to FIG. 4, four optical fibers $P_j$ (j = 1, 2, 3, 4) are laid helically on the flexible instrument. This enables accurate shape measurement of thicker flexible instruments and large-angle flexible bending, while reducing damage to the fibers caused by traction.
Furthermore, gratings are inscribed at equal intervals along the optical fibers, and the four grating sensors at corresponding positions in the four fibers are distributed as one grating group over a square region whose center lies on the centerline of the flexible instrument; that is, the cross section of the instrument and the square region share the same central axis. In each optical fiber, adjacent grating positions are rotated 180 degrees around the centerline, so that gratings from different fibers can be combined into four virtual optical fibers $l_j$ (j = 1, 2, 3, 4) parallel to the centerline. In the invention, a centerline parameter equation is constructed from the strain measured by each grating and the dimensional parameters of the flexible instrument. Taking a Bragg fiber as an example, the details are as follows:
The wavelength $\lambda_B$ of the Bragg reflected light is related to the grating period $\Lambda$ and the effective refractive index $n_{eff}$ of the grating by:

$\lambda_B = 2\,n_{eff}\,\Lambda$;
since the grating period Λ will change due to grating strain, the wavelength λ B A change will occur. Grating strain epsilon and wavelength change delta lambda B The relationship of (A) is:
wherein, P ε Is the photoelastic coefficient. From the above relationship, the grating strain epsilon can be measured by the difference Δ λ of the Bragg wavelength before and after the environmental change (i.e. before and after the flexible instrument deforms) B To calculate:
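As a numerical check of the two relations above, the short sketch below computes the Bragg wavelength and inverts the wavelength shift to strain. The values of $n_{eff}$, $\Lambda$ and the photoelastic coefficient $P_\varepsilon \approx 0.22$ are typical textbook figures for silica fiber, not taken from the patent.

```python
def bragg_wavelength(n_eff, period):
    """lambda_B = 2 * n_eff * Lambda (grating period and result in metres)."""
    return 2.0 * n_eff * period

def strain_from_shift(d_lambda, lambda_b, p_eps=0.22):
    """Invert d_lambda = (1 - P_eps) * lambda_B * epsilon for the strain."""
    return d_lambda / ((1.0 - p_eps) * lambda_b)

lam = bragg_wavelength(1.45, 530e-9)   # ~1537 nm, in the telecom C-band
# A 1 microstrain deformation shifts the wavelength by roughly 1.2 pm,
# which is why a dedicated fiber sensing demodulator is needed:
shift = (1.0 - 0.22) * lam * 1e-6
```

The round trip (strain to shift to strain) recovers the input exactly, which is a useful self-test when wiring up real demodulator data.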
in the present invention, the cross section of the flexible instrument is always kept circular, and the center line of the flexible instrument has the properties of constant curvature and constant flexibility, therefore, the present invention defines the flexible instrument as a plurality of segments (dividing the flexible instrument into segments)i segments, each line segment i representing one of the segments of the flexible instrument), the four virtual optical fibers are equidistant from the centerline. l. the i The length of the line segment i representing the center line between two adjacent sets of gratings can therefore be written in the form:
wherein S is i (t) represents a centerline parameter equation.
FIG. 5 is a schematic cross-sectional view of the grating node positions in the flexible instrument provided by the present invention. As shown in FIG. 5, the four virtual optical fibers lie on a cylinder and are curves equidistant from the centerline. Thus, in each segment i, the theoretical length $l^{theo}_{j,i}$ of the j-th virtual fiber can be obtained from the geometric relationship between the centerline parameter equation and the fiber offset, as the arc length of the offset curve:

$l^{theo}_{j,i} = \int \big\lVert S_i'(t) + d_{j,i}\,n_{j,i}'(t)\big\rVert\,dt$;

where the subscript i = 1, 2, ..., N denotes the segment between grating groups, j = 1, 2, 3, 4 denotes the four virtual fibers, $d_{j,i}$ is the offset distance of the j-th virtual fiber of segment i from the centerline, and $n_{j,i}(t)$ is the unit vector from the centerline to that fiber.
In addition, the Bragg gratings measure the strain $\varepsilon_{j,i}$ at each grating location, so the measured length $l'_{j,i}$ of each virtual fiber can be obtained as:

$l'_{j,i} = (1 + \varepsilon_{j,i})\,l_i$;

where the subscripts are i = 1, 2, ..., N and j = 1, 2, 3, 4, and $l_i$ denotes the length of the i-th segment. The theoretically calculated length should equal the measured length: $l^{theo}_{j,i} = l'_{j,i}$.
The unknown parameters $a_i$, $b_i$ can be obtained by solving the above equations, and from them the curvature $\kappa_i$ and torsion $\tau_i$ of segment i of the flexible instrument are calculated.
the curvature and the flexibility of each section node of the flexible instrument are obtained through the measurement algorithm, and then the curvature and the flexibility are converted into the global positions of each node, so that the shape of the flexible instrument is calculated. The shape of the flexible instrument centerline is defined by a set of discrete nodes O i It is shown as the center of the cross-section.
To obtain the global position coordinates of each node, a global coordinate system {x, y, z} and a local coordinate system $\{x_i, y_i, z_i\}$ for each node (i.e. each segment) are defined. In the present invention, the global coordinate system is fixed at the proximal base of the flexible instrument and coincides with the local coordinate system $\{x_1, y_1, z_1\}$.
The local coordinate system $\{x_i, y_i, z_i\}$ is fixed at node $O_i$ and differs from the Frenet frame only by a rotation $\alpha_i$, where the $z_i$ axis is tangent to the centerline at $O_i$. In Frenet frame i (origin fixed at point $O_i$), the curvature vector of node $O_i$ can be expressed as $[0, \kappa_i, \tau_i]$. Converting the curvature vector into the local coordinate system $\{x_i, y_i, z_i\}$ gives:

$[\kappa_i \cos\alpha_i,\ \kappa_i \sin\alpha_i,\ \tau_i]$.
Further, in the present invention, the two components $\kappa_i \cos\alpha_i$ and $\kappa_i \sin\alpha_i$ of the curvature vector correspond to the local $x_i$ and $y_i$ directions respectively, and $\tau_i$ is the torsion about the local $z_i$ direction. The position $\hat{p}_i$ and orientation $\hat{R}_i$ of node $O_i$ in the local coordinate system $\{x_i, y_i, z_i\}$ can then be calculated.
According to the piecewise constant-curvature method, the global position $P_i$ and global orientation $R_i$ of node $O_i$ can be expressed recursively as:

$P_i = P_{i-1} + R_{i-1}\,\hat{p}_i, \qquad R_i = R_{i-1}\,\hat{R}_i$.
because the centerline of a flexible instrument with a continuous shape change is a smooth curve in most cases, the change in curvature and flexure is relatively uniform and without abrupt changes. Therefore, interpolation is carried out according to the positions and the directions of the discrete nodes, and the three-dimensional shape P of the flexible instrument can be recovered in the global coordinate system Ins 。
302, calibrating the optical marker arranged at the proximal end of the flexible instrument with the binocular camera to obtain a first conversion matrix and a second conversion matrix, and tracking the optical marker to obtain the spatial position of the flexible instrument; the first conversion matrix converts the spatial coordinate system of the flexible instrument into the coordinate system of the optical marker, and the second conversion matrix converts the coordinate system of the optical marker into the coordinate system of the binocular camera.
Optionally, on the basis of the foregoing embodiment, the method further includes:
calibrating the optical marker of the target projection area through the binocular camera to obtain a third conversion matrix, wherein the third conversion matrix is used for converting a coordinate system of the binocular camera to a projection coordinate system;
and according to the three-dimensional shape information and the space position, converting a coordinate system through the first conversion matrix, the second conversion matrix and the third conversion matrix, and projecting the generated three-dimensional image to the target projection area through a semi-transparent and semi-reflective mirror, so that the three-dimensional image and the current position of the flexible instrument are superposed to generate an augmented reality scene.
In the invention, taking the minimally invasive surgery area as the target projection area, the spatial relationship between the binocular camera coordinate system and the projection coordinate system is calibrated via the optical marker, yielding the conversion matrix ${}^{Dis}T_{Tra}$ from the binocular camera coordinate system Tra to the projection coordinate system Dis. After accurately computing the shape $P_{Ins}$ of the flexible instrument, the conversion matrix ${}^{Mar}T_{Ins}$ from the instrument's spatial coordinate system Ins to the optical marker coordinate system Mar is obtained through pre-calibration. The three-dimensionally modeled flexible instrument can then be superimposed in situ on the real instrument through the coordinate system conversion formula $P_{Dis} = {}^{Dis}T_{Tra}\,{}^{Tra}T_{Mar}\,{}^{Mar}T_{Ins}\,P_{Ins}$, where ${}^{Tra}T_{Mar}$ denotes the transformation from the optical marker coordinate system Mar to the binocular camera coordinate system Tra, thereby obtaining the augmented reality scene.
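The chained coordinate conversion can be sketched with 4x4 homogeneous matrices. The matrices below are hypothetical identity-rotation calibration results used only to show the composition order Ins -> Mar -> Tra -> Dis; real calibrations would carry full rotations.

```python
import numpy as np

def make_transform(R, t):
    """Build a 4x4 homogeneous transform from rotation R (3x3) and
    translation t (3,)."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

def instrument_to_projection(P_ins, T_mar_ins, T_tra_mar, T_dis_tra):
    """Map instrument-frame points (N x 3) into the projection frame Dis
    by composing the three conversion matrices."""
    M = T_dis_tra @ T_tra_mar @ T_mar_ins      # Ins -> Dis
    homo = np.hstack([P_ins, np.ones((len(P_ins), 1))])
    return (homo @ M.T)[:, :3]

# Hypothetical calibration results (pure translations for illustration):
T_mar_ins = make_transform(np.eye(3), np.array([0.0, 0.0, 0.05]))
T_tra_mar = make_transform(np.eye(3), np.array([0.30, 0.0, 0.0]))
T_dis_tra = make_transform(np.eye(3), np.array([0.0, -0.10, 0.0]))
```

With identity rotations the translations simply accumulate, which makes the composition order easy to verify by hand before substituting real calibration data.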
In minimally invasive surgery, the flexible instrument operates inside the body and is difficult to observe directly, which makes the instrument hard to position and causes hand-eye incoordination. The invention addresses these problems: the three-dimensional shape of the flexible instrument in the body is obtained through the flexible instrument shape sensing module, so that the three-dimensional model of the flexible instrument deforms in real time; the projection position of the flexible instrument in space is adjusted in real time through the augmented reality module, so that the three-dimensional model is superimposed on the real flexible instrument in situ, realizing real-time in-situ augmented reality guidance. Because no X-ray radiation is involved, the dependence on operator experience is reduced, and the safety and efficiency of the operation are enhanced.
According to the augmented reality surgical navigation method for a flexible instrument provided by the invention, the three-dimensional shape of the flexible instrument is obtained by arranging the grating optical fiber sensors in the flexible instrument through the flexible instrument shape sensing module, so that the three-dimensional model of the flexible instrument deforms in real time and the position and shape of the flexible instrument in the body can be accurately located.
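The grating-based shape sensing rests on the standard fiber Bragg grating relation Δλ_B = λ_B(1 − P_ε)ε, so strain follows directly from the measured wavelength shift. A minimal sketch — the numbers and the ≈0.22 photoelastic coefficient typical of silica fiber are illustrative assumptions, not values from the patent:

```python
def fbg_strain(lambda_b_nm, delta_lambda_nm, p_eps=0.22):
    """Axial strain from a Bragg wavelength shift:
    delta_lambda / lambda_B = (1 - p_eps) * strain."""
    return delta_lambda_nm / (lambda_b_nm * (1.0 - p_eps))

# A 1550 nm grating shifted by 1.2 pm corresponds to roughly one microstrain.
eps = fbg_strain(1550.0, 1.2e-3)
```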
Fig. 6 is a schematic structural diagram of an electronic device provided in the present invention, and as shown in fig. 6, the electronic device may include: a processor (processor) 601, a communication Interface (Communications Interface) 602, a memory (memory) 603 and a communication bus 604, wherein the processor 601, the communication Interface 602 and the memory 603 complete communication with each other through the communication bus 604. The processor 601 may invoke logic instructions in memory 603 to perform a method of augmented reality surgical navigation for a flexible instrument, the method comprising: acquiring three-dimensional shape information of a flexible instrument based on a flexible instrument shape sensing module, wherein the three-dimensional shape information is acquired through curvature information and flexibility information of an optical fiber in the flexible instrument shape sensing module; calibrating an optical marker arranged at the near end of the flexible instrument through a binocular camera to obtain a first conversion matrix and a second conversion matrix, and tracking the optical marker to obtain the spatial position of the flexible instrument; the first conversion matrix is used for converting a space coordinate system of the flexible instrument into a coordinate system of the optical marker, and the second conversion matrix is used for converting the coordinate system of the optical marker into a coordinate system of the binocular camera; and according to the three-dimensional shape information and the space position, converting a coordinate system through the first conversion matrix and the second conversion matrix, acquiring the position of the flexible instrument in the binocular camera, performing three-dimensional modeling, and displaying the generated three-dimensional image on a three-dimensional display screen.
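The per-segment curvature κ_i and torsion τ_i mentioned in the method can be turned into global node positions by integrating the Frenet–Serret frame segment by segment. A sketch assuming piecewise-constant curvature and torsion with a simple Euler update — the patent does not specify its integration scheme, so this is only one plausible realization:

```python
import numpy as np

def reconstruct_centerline(kappas, taus, seg_lens, steps=100):
    """Integrate the Frenet-Serret frame (T, N, B) with piecewise-constant
    curvature/torsion to recover global node positions along the centerline."""
    p = np.zeros(3)
    T = np.array([0.0, 0.0, 1.0])   # tangent
    N = np.array([1.0, 0.0, 0.0])   # normal
    B = np.cross(T, N)              # binormal
    nodes = [p.copy()]
    for k, tau, L in zip(kappas, taus, seg_lens):
        ds = L / steps
        for _ in range(steps):
            p = p + T * ds
            T, N = T + k * N * ds, N + (-k * T + tau * B) * ds
            T /= np.linalg.norm(T)
            N -= T * (N @ T)        # re-orthogonalise the frame
            N /= np.linalg.norm(N)
            B = np.cross(T, N)
        nodes.append(p.copy())
    return np.array(nodes)

# A straight 30 mm segment, then a quarter-circle bend of radius 1/kappa = 50 mm.
nodes = reconstruct_centerline([0.0, 1 / 50.0], [0.0, 0.0], [30.0, np.pi * 50 / 2])
```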
In addition, the logic instructions in the memory 603 may be implemented in the form of software functional units and, when sold or used as independent products, stored in a computer-readable storage medium. Based on such understanding, the technical solution of the present invention may be embodied in the form of a software product, which is stored in a storage medium and includes instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method according to the embodiments of the present invention. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, or an optical disk.
In another aspect, the present invention also provides a computer program product comprising a computer program stored on a non-transitory computer readable storage medium, the computer program comprising program instructions which, when executed by a computer, enable the computer to perform the method for augmented reality surgical navigation for a flexible instrument provided by the above methods, the method comprising: acquiring three-dimensional shape information of a flexible instrument based on a flexible instrument shape sensing module, wherein the three-dimensional shape information is acquired through curvature information and flexibility information of optical fibers in the flexible instrument shape sensing module; calibrating an optical marker arranged at the near end of the flexible instrument through a binocular camera to obtain a first conversion matrix and a second conversion matrix, and tracking the optical marker to obtain the spatial position of the flexible instrument; the first conversion matrix is used for converting a space coordinate system of the flexible instrument into a coordinate system of the optical marker, and the second conversion matrix is used for converting the coordinate system of the optical marker into a coordinate system of a binocular camera; and according to the three-dimensional shape information and the space position, converting a coordinate system through the first conversion matrix and the second conversion matrix, acquiring the position of the flexible instrument in the binocular camera, performing three-dimensional modeling, and displaying a generated three-dimensional image on a three-dimensional display screen.
In yet another aspect, the present invention also provides a non-transitory computer readable storage medium having stored thereon a computer program, which when executed by a processor is implemented to perform the method for augmented reality surgical navigation for a flexible instrument provided in the above embodiments, the method comprising: acquiring three-dimensional shape information of a flexible instrument based on a flexible instrument shape sensing module, wherein the three-dimensional shape information is acquired through curvature information and flexibility information of an optical fiber in the flexible instrument shape sensing module; calibrating an optical marker arranged at the near end of the flexible instrument through a binocular camera to obtain a first conversion matrix and a second conversion matrix, and tracking the optical marker to obtain the spatial position of the flexible instrument; the first conversion matrix is used for converting a space coordinate system of the flexible instrument into a coordinate system of the optical marker, and the second conversion matrix is used for converting the coordinate system of the optical marker into a coordinate system of the binocular camera; and according to the three-dimensional shape information and the space position, converting a coordinate system through the first conversion matrix and the second conversion matrix, acquiring the position of the flexible instrument in the binocular camera, performing three-dimensional modeling, and displaying the generated three-dimensional image on a three-dimensional display screen.
The above-described embodiments of the apparatus are merely illustrative, and the units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one position, or may be distributed on multiple network units. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of the present embodiment. One of ordinary skill in the art can understand and implement it without inventive effort.
Through the above description of the embodiments, those skilled in the art will clearly understand that each embodiment may be implemented by software plus a necessary general hardware platform, and may also be implemented by hardware. Based on the understanding, the above technical solutions substantially or otherwise contributing to the prior art may be embodied in the form of a software product, which may be stored in a computer-readable storage medium, such as ROM/RAM, magnetic disk, optical disk, etc., and includes several instructions for causing a computer device (which may be a personal computer, a server, or a network device, etc.) to execute the method according to the various embodiments or some parts of the embodiments.
Finally, it should be noted that: the above examples are only intended to illustrate the technical solution of the present invention, but not to limit it; although the present invention has been described in detail with reference to the foregoing embodiments, it will be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; and such modifications or substitutions do not depart from the spirit and scope of the corresponding technical solutions of the embodiments of the present invention.
Claims (9)
1. An augmented reality surgical navigation device for a flexible instrument, comprising a flexible instrument shape sensing module and an augmented reality module, wherein:
the flexible instrument shape sensing module comprises a grating sensor array and an optical fiber sensing demodulator, and is used for detecting the shape of the flexible instrument and sending the detected three-dimensional shape information to the augmented reality module; the grating sensor array is composed of 4 optical fibers, gratings are inscribed at equal distances in each optical fiber, the equally spaced gratings form a square area, and the cross section of the flexible instrument and the square area are located on the same central axis;
the augmented reality module comprises a binocular camera and a three-dimensional display screen and is used for acquiring the spatial position of the flexible instrument through the binocular camera and generating a three-dimensional image corresponding to the flexible instrument according to the three-dimensional shape information and the spatial position to display the three-dimensional image on the three-dimensional display screen;
wherein the optical fibers are arranged helically inside or on the surface of the flexible instrument about the same cross section of the flexible instrument;
the augmented reality operation navigation device for the flexible instrument is specifically used for:
based on a flexible instrument shape sensing module, acquiring three-dimensional shape information of a flexible instrument, wherein the three-dimensional shape information is acquired through curvature information and flexibility information of optical fibers in the flexible instrument shape sensing module, and the method specifically comprises the following steps:
according to the Bragg wavelength difference Δλ_B before and after the flexible instrument is deformed, the grating strain ε is calculated by the formulas:

λ_B = 2·n_eff·Λ

Δλ_B = λ_B·(1 − P_ε)·ε

wherein λ_B is the wavelength of the Bragg reflected light, Λ is the grating period, n_eff is the effective refractive index of the grating, and P_ε is the photoelastic coefficient;
the unknown parameters a_i, b_i are solved through the following formula:

l′_j,i = (1 + ε_j,i)·l_i

wherein S_i(t) represents the centerline parameter equation; l_i represents the length of segment i of the centerline between two adjacent groups of gratings, with subscript i = 1, 2, …, N indexing the segments between grating groups; j = 1, 2, 3, 4 indexes the four virtual optical fibers; d_j,i represents the distance by which the j-th virtual fiber of the i-th segment is offset from the centerline; ε_j,i is the strain at the grating location; and l′_j,i represents the measured length of the j-th virtual fiber of the i-th segment, whose theoretically calculated counterpart is obtained from the centerline parameters;
the curvature information κ_i and the torsion information τ_i of segment i of the flexible instrument are then calculated from the solved centerline parameters;
converting the curvature information and the torsion information of each segment node of the flexible instrument into the global position of each node, and calculating the three-dimensional shape information of the flexible instrument;
calibrating an optical marker arranged at the near end of the flexible instrument through a binocular camera to obtain a first conversion matrix and a second conversion matrix, and tracking the optical marker to obtain the spatial position of the flexible instrument; the first conversion matrix is used for converting a space coordinate system of the flexible instrument into a coordinate system of the optical marker, and the second conversion matrix is used for converting the coordinate system of the optical marker into a coordinate system of a binocular camera;
and according to the three-dimensional shape information and the space position, converting a coordinate system through the first conversion matrix and the second conversion matrix, acquiring the position of the flexible instrument in the binocular camera, performing three-dimensional modeling, and displaying the generated three-dimensional image on a three-dimensional display screen.
2. The augmented reality surgical navigation device for a flexible instrument of claim 1, wherein the grating sensor array is composed of 4 fiber Bragg grating optical fibers.
3. The augmented reality surgical navigation device for a flexible instrument of claim 1, wherein an optical marker is disposed within a capture range of the binocular camera and is disposed at a proximal end of the flexible instrument for tracking a spatial position of the flexible instrument in real time.
4. The augmented reality surgical navigation device for a flexible instrument of claim 1, wherein the augmented reality module further comprises a half mirror for projecting the three-dimensional image to a real scene so that the three-dimensional image and the current position of the flexible instrument are superimposed to generate an augmented reality scene.
5. The augmented reality surgical navigation device for a flexible instrument of claim 4, wherein the augmented reality module further comprises a support housing for securing the augmented reality module, the three dimensional display screen, the binocular camera and the half mirror being disposed on a surface of the support housing.
6. The augmented reality surgical navigation device for a flexible instrument of claim 5, wherein a fixing bracket is connected to a surface of the support housing for fixing the augmented reality module; and the fixed support is provided with a rotating shaft, and the rotating shaft is used for adjusting the angle of the augmented reality module.
7. The augmented reality surgical navigation device for a flexible instrument of claim 1, wherein the device is further configured to:
calibrating the optical marker of the target projection area through the binocular camera to obtain a third conversion matrix, wherein the third conversion matrix is used for converting a coordinate system of the binocular camera to a projection coordinate system;
and according to the three-dimensional shape information and the spatial position, converting the coordinate system through the first conversion matrix, the second conversion matrix and the third conversion matrix, and projecting the generated three-dimensional image onto the target projection area through a semi-transparent, semi-reflective mirror, so that the three-dimensional image is superimposed on the current position of the flexible instrument to generate an augmented reality scene.
8. An electronic device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, wherein the processor implements the following steps when executing the computer program:
based on a flexible instrument shape sensing module, acquiring three-dimensional shape information of a flexible instrument, wherein the three-dimensional shape information is acquired through curvature information and flexibility information of an optical fiber in the flexible instrument shape sensing module, and the method specifically comprises the following steps:
according to the Bragg wavelength difference Δλ_B before and after the flexible instrument is deformed, the grating strain ε is calculated by the formulas:

λ_B = 2·n_eff·Λ

Δλ_B = λ_B·(1 − P_ε)·ε

wherein λ_B is the wavelength of the Bragg reflected light, Λ is the grating period, n_eff is the effective refractive index of the grating, and P_ε is the photoelastic coefficient;
the unknown parameters a_i, b_i are solved through the following formula:

l′_j,i = (1 + ε_j,i)·l_i

wherein S_i(t) represents the centerline parameter equation; l_i represents the length of segment i of the centerline between two adjacent groups of gratings, with subscript i = 1, 2, …, N indexing the segments between grating groups; j = 1, 2, 3, 4 indexes the four virtual optical fibers; d_j,i represents the distance by which the j-th virtual fiber of the i-th segment is offset from the centerline; ε_j,i is the strain at the grating location; and l′_j,i represents the measured length of the j-th virtual fiber of the i-th segment, whose theoretically calculated counterpart is obtained from the centerline parameters;
the curvature information κ_i and the torsion information τ_i of segment i of the flexible instrument are then calculated from the solved centerline parameters;
converting the curvature information and the torsion information of each segment node of the flexible instrument into the global position of each node, and calculating the three-dimensional shape information of the flexible instrument;
calibrating an optical marker arranged at the near end of the flexible instrument through a binocular camera to obtain a first conversion matrix and a second conversion matrix, and tracking the optical marker to obtain the spatial position of the flexible instrument; the first conversion matrix is used for converting a space coordinate system of the flexible instrument into a coordinate system of the optical marker, and the second conversion matrix is used for converting the coordinate system of the optical marker into a coordinate system of the binocular camera;
and according to the three-dimensional shape information and the space position, converting a coordinate system through the first conversion matrix and the second conversion matrix, acquiring the position of the flexible instrument in the binocular camera, performing three-dimensional modeling, and displaying a generated three-dimensional image on a three-dimensional display screen.
9. A non-transitory computer readable storage medium having a computer program stored thereon, wherein the computer program, when executed by a processor, performs the steps of:
based on a flexible instrument shape sensing module, acquiring three-dimensional shape information of a flexible instrument, wherein the three-dimensional shape information is acquired through curvature information and flexibility information of optical fibers in the flexible instrument shape sensing module, and the method specifically comprises the following steps:
according to the Bragg wavelength difference Δλ_B before and after the flexible instrument is deformed, the grating strain ε is calculated by the formulas:

λ_B = 2·n_eff·Λ

Δλ_B = λ_B·(1 − P_ε)·ε

wherein λ_B is the wavelength of the Bragg reflected light, Λ is the grating period, n_eff is the effective refractive index of the grating, and P_ε is the photoelastic coefficient;
the unknown parameters a_i, b_i are solved through the following formula:

l′_j,i = (1 + ε_j,i)·l_i

wherein S_i(t) represents the centerline parameter equation; l_i represents the length of segment i of the centerline between two adjacent groups of gratings, with subscript i = 1, 2, …, N indexing the segments between grating groups; j = 1, 2, 3, 4 indexes the four virtual optical fibers; d_j,i represents the distance by which the j-th virtual fiber of the i-th segment is offset from the centerline; ε_j,i is the strain at the grating location; and l′_j,i represents the measured length of the j-th virtual fiber of the i-th segment, whose theoretically calculated counterpart is obtained from the centerline parameters;
the curvature information κ_i and the torsion information τ_i of segment i of the flexible instrument are then calculated from the solved centerline parameters;
converting the curvature information and the torsion information of each segment node of the flexible instrument into the global position of each node, and calculating the three-dimensional shape information of the flexible instrument;
calibrating an optical marker arranged at the near end of the flexible instrument through a binocular camera to obtain a first conversion matrix and a second conversion matrix, and tracking the optical marker to obtain the spatial position of the flexible instrument; the first conversion matrix is used for converting a space coordinate system of the flexible instrument into a coordinate system of the optical marker, and the second conversion matrix is used for converting the coordinate system of the optical marker into a coordinate system of the binocular camera;
and according to the three-dimensional shape information and the space position, converting a coordinate system through the first conversion matrix and the second conversion matrix, acquiring the position of the flexible instrument in the binocular camera, performing three-dimensional modeling, and displaying the generated three-dimensional image on a three-dimensional display screen.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202110552899.8A CN113349928B (en) | 2021-05-20 | 2021-05-20 | Augmented reality surgical navigation device for flexible instrument |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202110552899.8A CN113349928B (en) | 2021-05-20 | 2021-05-20 | Augmented reality surgical navigation device for flexible instrument |
Publications (2)
Publication Number | Publication Date |
---|---|
CN113349928A CN113349928A (en) | 2021-09-07 |
CN113349928B true CN113349928B (en) | 2023-01-24 |
Family
ID=77527040
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202110552899.8A Active CN113349928B (en) | 2021-05-20 | 2021-05-20 | Augmented reality surgical navigation device for flexible instrument |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN113349928B (en) |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113907867A (en) * | 2021-12-16 | 2022-01-11 | 北京微刀医疗科技有限公司 | Irreversible electroporation ablation needle and irreversible electroporation ablation needle visualization system |
Family Cites Families (13)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7930065B2 (en) * | 2005-12-30 | 2011-04-19 | Intuitive Surgical Operations, Inc. | Robotic surgery system including position sensors using fiber bragg gratings |
WO2012046202A1 (en) * | 2010-10-08 | 2012-04-12 | Koninklijke Philips Electronics N.V. | Flexible tether with integrated sensors for dynamic instrument tracking |
JP6270483B2 (en) * | 2011-01-28 | 2018-01-31 | コーニンクレッカ フィリップス エヌ ヴェKoninklijke Philips N.V. | 3D shape reconstruction for optical tracking of elongated devices |
EP3280312A4 (en) * | 2015-04-06 | 2019-02-20 | Intuitive Surgical Operations Inc. | Systems and methods of registration compensation in image guided surgery |
GB201509418D0 (en) * | 2015-06-01 | 2015-07-15 | Univ Dundee | Fibre based imaging |
CN109416841B (en) * | 2016-07-11 | 2023-03-31 | 台湾骨王生技股份有限公司 | Method for enhancing image fidelity and application thereof method for surgical guidance on wearable glasses |
CN106249881B (en) * | 2016-07-21 | 2019-03-19 | 江苏奥格视特信息科技有限公司 | Augmented reality view field space and virtual three-dimensional target dynamic method for registering |
US20210186648A1 (en) * | 2017-10-10 | 2021-06-24 | Yan Xia | Surgical shape sensing fiber optic apparatus and method thereof |
CN109827518B (en) * | 2017-11-23 | 2021-09-28 | 桂林电子科技大学 | Three-dimensional space distributed deformation sensor with fiber integrated interferometer parallel structure |
US11885699B2 (en) * | 2019-02-20 | 2024-01-30 | Humanetics Innovative Solutions, Inc. | Optical fiber system having helical core structure for detecting forces during a collision test |
CN110638527B (en) * | 2019-07-01 | 2021-06-01 | 中国科学院苏州生物医学工程技术研究所 | Operation microscopic imaging system based on optical coherence tomography augmented reality |
CN111265299B (en) * | 2020-02-19 | 2023-08-18 | 上海理工大学 | Operation navigation system based on optical fiber shape sensing |
CN111728697A (en) * | 2020-07-21 | 2020-10-02 | 中国科学技术大学 | Operation training and navigation device based on head-mounted three-dimensional augmented reality equipment |
2021-05-20 — application CN202110552899.8A (CN) filed; granted as CN113349928B; status: Active
Also Published As
Publication number | Publication date |
---|---|
CN113349928A (en) | 2021-09-07 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP7050733B2 (en) | Virtual image with viewpoint of optical shape detector | |
US20230380716A1 (en) | Systems and methods for deformation compensation using shape sensing | |
US11779400B2 (en) | Combining strain-based shape sensing with catheter control | |
US11376075B2 (en) | Systems and methods for non-rigid deformation of tissue for virtual navigation of interventional tools | |
JP7149923B2 (en) | Method and system for absolute three-dimensional measurement using shape sensors that are insensitive to torsion | |
US11266466B2 (en) | Shape sensor systems with redundant sensing | |
US20180116732A1 (en) | Real-time Three Dimensional Display of Flexible Needles Using Augmented Reality | |
US10376178B2 (en) | Systems and methods for registration of a medical device using rapid pose search | |
CN109452930B (en) | Registration system and method for medical devices using reduced search space | |
CN100534378C (en) | 3D positioning system and method in endoscopic main body in medical use | |
CN105934215A (en) | Robotic control of imaging devices with optical shape sensing | |
CN106999153A (en) | In the case of end is unfixed using optic shape sensing to ultrasonic probe from motion tracking and registration | |
US20210186648A1 (en) | Surgical shape sensing fiber optic apparatus and method thereof | |
CN113349928B (en) | Augmented reality surgical navigation device for flexible instrument | |
US20240060770A1 (en) | Method for shape sensing an optical fiber | |
Ha et al. | Comparative study on electromagnetic tracking and fiber Bragg grating-based catheter shape sensing |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||