WO2012001549A1 - Robotic control of an oblique endoscope for fov images - Google Patents


Info

Publication number
WO2012001549A1
Authority
WO
WIPO (PCT)
Prior art keywords
endoscope
volume
image
workspace
robot
Prior art date
Application number
PCT/IB2011/052334
Other languages
French (fr)
Inventor
Aleksandra Popovic
Paul Thienphrapa
Original Assignee
Koninklijke Philips Electronics N.V.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority to US61/359,842 priority Critical
Application filed by Koninklijke Philips Electronics N.V. filed Critical Koninklijke Philips Electronics N.V.
Publication of WO2012001549A1 publication Critical patent/WO2012001549A1/en


Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 1/00 Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
    • A61B 1/00002 Operational features of endoscopes
    • A61B 1/00004 Operational features of endoscopes characterised by electronic signal processing
    • A61B 1/00009 Operational features of endoscopes characterised by electronic signal processing of image signals
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 1/00 Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
    • A61B 1/00163 Optical arrangements
    • A61B 1/00174 Optical arrangements characterised by the viewing angles
    • A61B 1/00183 Optical arrangements characterised by the viewing angles for variable viewing angles
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B 34/30 Surgical robots
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 90/00 Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B 1/00 - A61B 50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B 90/36 Image-producing devices or illumination devices not otherwise provided for
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 17/00 Surgical instruments, devices or methods, e.g. tourniquets
    • A61B 17/00234 Surgical instruments, devices or methods, e.g. tourniquets for minimally invasive surgery
    • A61B 2017/00238 Type of minimally invasive operation
    • A61B 2017/00243 Type of minimally invasive operation cardiac
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 17/00 Surgical instruments, devices or methods, e.g. tourniquets
    • A61B 17/00234 Surgical instruments, devices or methods, e.g. tourniquets for minimally invasive surgery
    • A61B 2017/00238 Type of minimally invasive operation
    • A61B 2017/00278 Transorgan operations, e.g. transgastric
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B 34/30 Surgical robots
    • A61B 2034/301 Surgical robots for introducing or steering flexible instruments inserted into the body, e.g. catheters or endoscopes
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B 34/30 Surgical robots
    • A61B 2034/302 Surgical robots specifically adapted for manipulations within body cavities, e.g. within abdominal or thoracic cavities
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 90/00 Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B 1/00 - A61B 50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B 90/36 Image-producing devices or illumination devices not otherwise provided for
    • A61B 90/361 Image-producing devices, e.g. surgical cameras
    • A61B 2090/3612 Image-producing devices, e.g. surgical cameras with images taken automatically
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 90/00 Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B 1/00 - A61B 50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B 90/36 Image-producing devices or illumination devices not otherwise provided for
    • A61B 2090/364 Correlation of different images or relation of image positions in respect to the body
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 90/00 Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B 1/00 - A61B 50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B 90/36 Image-producing devices or illumination devices not otherwise provided for
    • A61B 2090/364 Correlation of different images or relation of image positions in respect to the body
    • A61B 2090/367 Correlation of different images or relation of image positions in respect to the body creating a 3D dataset from 2D images using position information

Abstract

A robot unit (10) employs an oblique endoscope (12) for generating a video stream (13) of a workspace volume (40), and a robot (11) for moving the endoscope (12) within the workspace volume (40). A control unit (20) employs a robot controller (21), a sweep control module (23) and an image reconstructor (22). The robot controller (21) commands the robot (11) to execute one or more image acquisition sweeps of the endoscope (12) within the workspace volume (40), each image acquisition sweep including one or more rotational motions of the endoscope (12) within the workspace volume (40). The sweep control module (23) links each endoscopic image generated during the image acquisition sweep(s) to a corresponding rotational pose of the endoscope (12) within the workspace volume (40). The image reconstructor (22) reconstructs a volume image of the workspace volume (40) from the linking of the generated endoscopic images to the corresponding rotational poses of the endoscope (12) within the workspace volume (40).

Description

ROBOTIC CONTROL OF AN OBLIQUE ENDOSCOPE FOR FOV IMAGES
The present invention generally relates to robotic control of an oblique endoscope for the generation of high-resolution, large field-of-view ("FOV") images. The present invention specifically relates to endoscopic image feedback for robotic control of endoscope position/orientation and speed in imaging a volume (e.g., an anatomical region).
An endoscope is a device having the ability to image from inside a body. Examples of an endoscope include, but are not limited to, any type of medical scope (e.g., bronchoscope, colonoscope, laparoscope, etc.). In particular, a rigid endoscope consists of a series of lenses located along the endoscope shaft.
Minimally invasive surgery is performed through small ports. Endoscopes are often used to provide visual feedback of the surgical site. For example, in totally endoscopic heart surgery, endoscopes are used to provide intra-operative real-time visualization of cardiac arteries. Due to their size, usually ≤ 10 mm, and relative distance to an anatomical object under consideration, endoscopes provide visualization of a small area only. For a surgeon, this may pose problems in understanding a relative position of the viewed area.
To improve the field-of-view, endoscopes used in surgery are often oblique, with the lens at an angle to the endoscope shaft. This allows a surgeon to perform a manual visual sweep of an anatomical region by rotating the endoscope around its axis and/or changing an insertion angle of the endoscope in the anatomical region. From the sweep, the surgeon uses a series of endoscopic images viewed on the screen to build a mental map of the surgical site. However, a manual sweep of the surgical area using an oblique-viewing endoscope can be a tiring process, and it is prone to errors given the complexity of the hand-eye coordination involved.
In the art, robotic systems have been used to hold the endoscope while a surgeon controls the robotic system using a computer mouse or other input devices. However, while attaching a robot to the endoscope may improve manipulation of the endoscope, the prior art does not address the difficulty of building a mental map of the surgical site.
The present invention provides methods for obtaining a complete view of a surgical workspace in minimally invasive surgery using a robotically manipulated oblique endoscope. These methods involve moving the endoscope until the image acquisition sweeps cover the desired workspace volume. The sequence of images collected, combined with the associated endoscope positions and orientations (derived from known robot kinematics), is used to reconstruct a visual, graphical, or other representation of the workspace volume. While facilitating an opportunity to obtain maximum image coverage of a workspace volume, these methods further facilitate an opportunity to maximize the reconstruction quality of the endoscopic images, minimize the running time, and comply with any physical limits imposed by the insertion point and endoscope cables.
One form of the present invention is a robotic imaging system employing a robotic unit and a control unit.
The robotic unit includes an oblique endoscope for generating a video stream including a plurality of endoscopic images of a workspace volume, and a robot for moving the endoscope within the workspace volume.
The control unit includes a robot controller, a sweep control module and an image reconstructor. The robot controller commands the robot to execute one or more image acquisition sweeps of the endoscope within the workspace volume, each image acquisition sweep including one or more rotational motions of the endoscope within the workspace volume. The sweep control module links each endoscopic image generated during the image acquisition sweep(s) to a corresponding rotational pose of the endoscope within the workspace volume. The image reconstructor reconstructs a volume image of the workspace volume from a linking of the generated endoscopic images to the corresponding rotational poses of the endoscope within the workspace volume.
A second form of the present invention includes a robotic imaging method for an oblique endoscope generating a video stream including a plurality of endoscopic images of a workspace volume. The robotic method involves a commanding of a robot to execute one or more image acquisition sweeps of the endoscope within the workspace volume, each image acquisition sweep including the robot controlling one or more rotational motions of the endoscope within the workspace volume. The robotic imaging method further includes a linking of each endoscopic image generated during the image acquisition sweep(s) to a corresponding rotational pose of the endoscope within the workspace volume, and a reconstruction of a volume image of the workspace volume from the linking of the generated endoscopic images to the corresponding rotational poses of the endoscope within the workspace volume.
The foregoing forms and other forms of the present invention as well as various features and advantages of the present invention will become further apparent from the following detailed description of various embodiments of the present invention read in conjunction with the accompanying drawings. The detailed description and drawings are merely illustrative of the present invention rather than limiting, the scope of the present invention being defined by the appended claims and equivalents thereof.
FIG. 1 illustrates an exemplary embodiment of a robotic imaging system in accordance with the present invention.
FIGS. 2A-2E illustrate various exemplary imaging modes of the robotic imaging system of FIG. 1.
FIG. 3 illustrates an exemplary image acquisition sweep in accordance with the present invention.
FIG. 4 illustrates an exemplary image processing as known in the art.
FIGS. 5A-5D illustrate an exemplary generation of a volume image from endoscopic images in accordance with the present invention.
FIG. 6 illustrates a flowchart representative of an exemplary embodiment of a robotic imaging method in accordance with the present invention.
FIG. 7 illustrates a flowchart representative of an exemplary embodiment of an image feedback method in accordance with the present invention.
FIGS. 8A-8E illustrate an exemplary execution of the flowcharts illustrated in FIGS. 6 and 7 in accordance with the present invention.
FIGS. 9A-9C illustrate an exemplary display of a volume image and an endoscopic image in accordance with the present invention.
As shown in FIG. 1, a robotic imaging system employs a robotic unit 10 and a control unit 20 for any robotic procedure involving a reconstruction of a volume image from two (2) or more endoscopic images of a workspace volume. Examples of such robotic procedures include, but are not limited to, medical procedures, assembly line procedures and procedures involving mobile robots. In particular, the robotic system may be utilized for medical procedures including, but not limited to, minimally invasive cardiac surgery (e.g., coronary artery bypass grafting or mitral valve replacement), minimally invasive abdominal surgery (laparoscopy) (e.g., prostatectomy or cholecystectomy), and natural orifice translumenal endoscopic surgery.
Robotic unit 10 includes a robot 11 and an oblique endoscope 12 rigidly attached to the robot 11.
Robot 11 is broadly defined herein as any robotic device structurally configured with motorized control of one or more joints for maneuvering an end-effector as desired for the particular robotic procedure. In practice, robot 11 may have a minimum of five (5) degrees-of-freedom including an end-effector translation, an end-effector axis rotation, and three (3) degrees of rotational freedom of the joints.
Endoscope 12 is broadly defined herein as any device having an oblique field-of-view for imaging within a workspace volume. Examples of endoscope 12 for purposes of the present invention include, but are not limited to, any type of scope, flexible or rigid (e.g., endoscope, arthroscope, bronchoscope, choledochoscope, colonoscope, cystoscope, duodenoscope, gastroscope, hysteroscope, laparoscope, laryngoscope, neuroscope, otoscope, push enteroscope, rhinolaryngoscope, sigmoidoscope, sinuscope, thoracoscope, etc.) and any device similar to a scope that is equipped with an image system (e.g., a nested cannula with imaging). The imaging is local, and surface images may be obtained optically with fiber optics, lenses, or miniaturized (e.g., CCD-based) imaging systems.
For purposes of the present invention, the term "workspace volume" is broadly defined herein as any three-dimensional space whereby endoscope 12 may be inserted therein for imaging object(s) within the workspace volume and/or for imaging the boundary of the workspace volume, the term "endoscopic image" is broadly defined herein as an image of a workspace volume within a field-of-view of endoscope 12, and the term "volume image" is broadly defined herein as a stitching of two (2) or more endoscopic images of the workspace volume.
In practice, endoscope 12 is mounted to the end-effector of robot 11. A pose of the end-effector of robot 11 is a position and an orientation of the end-effector within a coordinate system of robot 11. With endoscope 12 being inserted within a workspace volume, any given pose of the field-of-view of endoscope 12 within the workspace volume corresponds to a distinct pose of the end-effector within the robotic coordinate system. Consequently, each individual endoscopic image generated by endoscope 12 within the workspace volume may be linked to a corresponding pose of endoscope 12 within the workspace volume.
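The per-frame linking above can be sketched as a small data structure that tags each video frame with the end-effector pose at capture time. The `Pose`, `TaggedFrame`, and `link_frame` names are illustrative, not from the patent:

```python
import time
from dataclasses import dataclass, field

@dataclass
class Pose:
    """End-effector pose in the robot coordinate system (hypothetical layout)."""
    position: tuple       # (x, y, z), e.g. in mm
    orientation: tuple    # e.g. a quaternion (qw, qx, qy, qz)

@dataclass
class TaggedFrame:
    """An endoscopic image linked to the distinct pose that produced it."""
    image: object         # one frame from the endoscope video stream
    pose: Pose            # end-effector pose at capture time
    timestamp: float = field(default_factory=time.time)

def link_frame(video_frame, robot_pose):
    """Associate one video frame with the current robot pose."""
    return TaggedFrame(image=video_frame, pose=robot_pose)
```

A list of such tagged frames is exactly what a frame-stitching reconstructor needs as input.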
A description of FIGS. 2A-2E will now be provided herein to facilitate operational understanding of robotic unit 10. Specifically, an endoscope 30 is inserted within a spherical workspace volume 40 and a robot (not shown) executes an image acquisition sweep involving one or more rotational motions of endoscope 30 within workspace volume 40 and an endoscopic image acquisition involving a fixed posing of endoscope 30 relative to a predetermined area of workspace volume 40.
As shown in FIG. 2A, endoscope 30 is inserted through an insertion point 41 to a depth within workspace volume 40 whereby a field-of-view 32 is located within a lower hemisphere 40b of workspace volume 40. Additionally, endoscope 30 has an endoscopic axis 31 coinciding with a workspace axis 42 that is perpendicular to insertion point 41. During an image acquisition sweep, endoscope 30 is spun about endoscopic axis 31 and the sweep encompasses ≤ 360° of rotation to acquire two or more endoscopic images of workspace volume 40.
As shown in FIG. 2B, endoscope 30 is inserted through insertion point 41 to a depth within workspace volume 40 whereby field-of-view 32 is located within an upper hemisphere 40a of workspace volume 40. Again, endoscopic axis 31 coincides with workspace axis 42 whereby, during an image acquisition sweep, endoscope 30 is spun about endoscopic axis 31 and the sweep encompasses ≤ 360° of rotation to acquire two or more endoscopic images of workspace volume 40.
As shown in FIG. 2C, endoscope 30 is inserted through insertion point 41 to a depth within workspace volume 40 whereby field-of-view 32 is located within lower hemisphere 40b of workspace volume 40, and endoscope 30 is tilted relative to workspace axis 42. During an image acquisition sweep, endoscope 30 is simultaneously spun about endoscopic axis 31 and revolved around workspace axis 42 with the sweep encompassing ≤ 360° of rotation to acquire two or more endoscopic images of workspace volume 40. Alternatively, endoscope 30 may be exclusively spun around endoscopic axis 31 or exclusively revolved around workspace axis 42.
As shown in FIG. 2D, endoscope 30 is inserted through insertion point 41 to a depth within workspace volume 40 whereby field-of-view 32 is located within upper hemisphere 40a of workspace volume 40, and endoscope 30 is tilted relative to workspace axis 42. During an image acquisition sweep, endoscope 30 is simultaneously spun about endoscopic axis 31 and revolved around workspace axis 42 with the sweep encompassing ≤ 360° of rotation to acquire two or more endoscopic images of workspace volume 40. Again, alternatively, endoscope 30 may be exclusively spun around endoscopic axis 31 or exclusively revolved around workspace axis 42.
As shown in FIG. 2E, endoscope 30 is inserted through insertion point 41 and endoscope 30 is tilted relative to workspace axis 42 whereby field-of-view 32 is directed at a remote center of motion ("RCM") point 42 opposing insertion point 41. The endoscopic image acquisition involves endoscope 30 acquiring an endoscopic image centered on RCM 42.
In practice, the workspace volume may or may not be spherical. Nonetheless, in either case, those having ordinary skill in the art will appreciate how to execute image acquisition sweeps and endoscopic image acquisitions for workspace volumes of various shapes and dimensions from the description of FIGS. 2A-2E.
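The sweeps of FIGS. 2A-2D amount to sampling rotational poses along up to 360° of rotation, with the tilted cases (FIGS. 2C/2D) synchronizing a spin about the endoscopic axis with a revolution about the workspace axis. A minimal sketch, with a hypothetical `sweep_poses` helper, might generate those angle pairs as follows:

```python
def sweep_poses(n_frames, total_angle_deg=360.0, tilt_deg=0.0):
    """Rotational poses for one image acquisition sweep (illustrative).

    Each pose is a pair (spin_deg, revolve_deg): a spin about the
    endoscopic axis and, when the endoscope is tilted, a matching
    revolution about the workspace axis so the oblique lens keeps
    facing the nearest surface.
    """
    step = total_angle_deg / n_frames
    poses = []
    for k in range(n_frames):
        spin = k * step
        # Synchronize the revolution with the spin only in the tilted case.
        revolve = spin if tilt_deg != 0.0 else 0.0
        poses.append((spin, revolve))
    return poses
```

A robot controller would map each (spin, revolve) pair to joint commands; that mapping depends on the particular robot kinematics and is not modeled here.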
Referring back to FIG. 1, control unit 20 includes a robot controller 21 and an image reconstructor 22.
Robot controller 21 is broadly defined herein as any controller structurally configured to provide one or more robot control commands ("RCC") 25 to robot 11 for controlling a pose of the end-effector of robot 11 as desired for an image acquisition sweep or an endoscopic image acquisition. More particularly, robot control commands 25 dictate definitive movements of each robotic joint as needed to achieve a desired rotation of the end-effector of robot 11 during an image acquisition sweep, and thus a desired rotation of endoscope 12 within a workspace volume during the image acquisition sweep. Similarly, robot control commands 25 dictate definitive movements of each robotic joint as needed to achieve a desired fixed pose of the end-effector of robot 11 during an endoscopic image acquisition, and thus a desired fixed pose of endoscope 12 within a workspace volume during the endoscopic image acquisition.
Image reconstructor 22 is broadly defined herein as any device structurally configured to execute any frame-stitching algorithm for generating a volume image from two (2) or more overlapping endoscopic images. In practice, image reconstructor 22 may or may not require a calibration of endoscope 12.
For example, one rotation of endoscope 30 as shown in FIGS. 2A-2D may produce a ring-shaped series 50 of overlapping endoscopic images 51 as a frustum is projected on a plane as shown in FIG. 3. In the consecutive overlapping endoscopic images 51, it is possible to observe rotation of features, such as, for example, the rotation of features of endoscopic images 51a and 51b as shown in FIG. 4.
Specifically, lines 52 and 53 are drawn next to two prominent features of endoscopic images 51a and 51b. As the endoscope rotates, the features rotate as some disappear from endoscopic image 51a to endoscopic image 51b, and some new features appear within endoscopic image 51b. If the observed surface of series 50 were actually a plane as shown, knowing only the robot poses during an image acquisition sweep would be sufficient to reconstruct the map (mosaic) of the surface of series 50. However, since the shape and depth of the observed object are not known, image features have to be reconstructed and tracked between frames. In one embodiment of image reconstructor 22, the velocity in space Vx and Vy of each feature at any given point may be retrieved from derivatives of the image in space (Ix and Iy) and time (It) in accordance with the following equation [1]:
(Ix * Vx) + (Iy * Vy) - It = 0    [1]
A solution to equation [1] is a velocity field, i.e. optical flow. Specifically, equation [1] may be solved using a windowed approach, approximating the derivatives in the window only and assuming limited feature movement between the frames. The expected optical flow generated by the endoscope rotation should lie on concentric circles with a variable vector length depending on the speed of rotation and the depth at each point. However, noise and imperfections in feature tracking will cause the vectors to have both length and angular error. The mean error across the image may be quantified and used to correct the speed of the endoscope. If the endoscope is being raised from the volume, the optical flow will point outward from the center of the image, and the same errors as in the rotational case may be defined as well.
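A windowed least-squares solution of equation [1], together with a comparison of the measured flow against the expected concentric-circle flow of a pure rotation, might look as follows. This is a sketch using NumPy: the sign convention follows equation [1] as written, and the expected-flow model omega * (-y, x) about the image center is an assumption, not the patent's formulation:

```python
import numpy as np

def window_flow(Ix, Iy, It):
    """Least-squares solution of (Ix*Vx) + (Iy*Vy) - It = 0 over one window.

    Ix, Iy, It are arrays of spatial and temporal derivatives sampled
    inside the window; feature motion is assumed small between frames.
    """
    A = np.stack([Ix.ravel(), Iy.ravel()], axis=1)  # one row per pixel
    b = It.ravel()                                  # sign follows eq. [1]
    v, *_ = np.linalg.lstsq(A, b, rcond=None)
    return v                                        # (Vx, Vy)

def flow_error(v, x, y, omega):
    """Error of a measured flow vector against the expected
    concentric-circle flow of a pure rotation at angular rate omega
    about the image center: v_expected = omega * (-y, x)."""
    expected = omega * np.array([-y, x])
    return np.linalg.norm(np.asarray(v) - expected)
```

Averaging `flow_error` over the image gives the mean error that the text proposes to feed back into the endoscope speed.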
As to image mosaicing in one rotation, optical flow is used to align consecutive endoscopic images, knowing which features overlap and which features belong to new image segments. Given the ratio between the speed of rotation and the frame rate of video capture, it is expected that more than two frames would overlap in each segment. To improve robustness of mosaicing, each pixel of the mosaic may be computed across many views by finding the mean or median of the observed values. The mean variation of those values (e.g., standard deviation) across the image may be used to evaluate the quality of reconstruction. Alternatively, to improve image quality and sharpness, pixel values may be weighted by their distance from the frame border, weighting pixels close to the center more highly.
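The per-pixel fusion and border weighting described above can be sketched as follows; the function names and the normalization of the weight are illustrative choices, not specified by the patent:

```python
import numpy as np

def fuse_pixels(samples, mode="median"):
    """Fuse the values observed for one mosaic pixel across many
    overlapping frames. The spread (standard deviation) serves as a
    per-pixel reconstruction-quality score."""
    samples = np.asarray(samples, dtype=float)
    value = np.median(samples) if mode == "median" else samples.mean()
    return value, samples.std()

def border_weight(px, py, width, height):
    """Weight a pixel by its distance from the frame border, so that
    pixels near the frame center dominate the blend (sharper mosaic)."""
    d = min(px, py, width - 1 - px, height - 1 - py)
    return d / (min(width, height) / 2.0)  # roughly 0 at border, ~1 at center
```

The median variant is the more robust choice here, since a single frame corrupted by a specular highlight barely shifts the median while it can drag the mean noticeably.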
As to image mosaicing at different depths for multiple image series, mosaicing may be performed in exactly the same manner as between consecutive endoscopic images. In this case, instead of stitching a single image series, the same method may be used to stitch multiple image series together.
FIGS. 5A-5D illustrate a sequential stitching of endoscopic frames. First, an endoscopic image 61 as shown in FIG. 5B is stitched to an endoscopic image 60 as shown in FIG. 5A, whereby endoscopic image 60 is highlighted to emphasize the image stitching. Next, an endoscopic image 62 as shown in FIG. 5C is stitched to endoscopic image 61, whereby endoscopic image 60 is again highlighted to emphasize the image stitching. Further image stitching results in a volume image 63 as shown in FIG. 5D, whereby endoscopic image 60 is again highlighted to emphasize the image stitching.
Referring back to FIG. 1, control unit 20 includes a sweep control module 23 broadly defined herein as any module structurally configured for linking each endoscopic image generated during an image acquisition sweep or an endoscopic image acquisition to a corresponding pose of endoscope 12 within the workspace volume as will be further explained in connection with the subsequent description of FIG. 6 herein. Additionally, sweep control module 23 may be structurally configured for controlling a rotational speed and/or a depth of endoscope 12 during each image acquisition sweep of the workspace volume as will be further explained in connection with the subsequent description of FIG. 7 herein. In practice, sweep control module 23 may be implemented by hardware, software and/or firmware integrated within robot controller 21, integrated within image reconstructor 22, distributed between robot controller 21 and image reconstructor 22, or installed on a separate device connected to robot controller 21 and/or image reconstructor 22.
FIG. 6 illustrates a flowchart 70 representative of a robotic imaging method of the present invention. A stage S71 of flowchart 70 encompasses an initial depth positioning and workspace axis orientation of endoscope 12, performed manually by a user of the system or automatically by sweep control module 23. In practice, the initial depth positioning and workspace axis orientation of endoscope 12 is dependent upon the robotic application of the system.
A stage S72 of flowchart 70 encompasses an execution of an image acquisition sweep as commanded by robot controller 21 and a processing of video stream 13 by image reconstructor 22 and sweep control module 23. During this stage, sweep control module 23 links each endoscopic image generated during the image acquisition sweep(s) to a corresponding rotational pose of endoscope 12 within the workspace volume to enable image reconstructor 22 to properly stitch the endoscopic images.
If multiple image acquisition sweeps are to be executed, then flowchart 70 loops through stages S71-S73, whereby sweep control module 23 depth positions and workspace-axis orients the endoscope within the workspace volume in a manner that ensures all endoscopic images of the image acquisition sweeps sufficiently overlap to facilitate a reconstruction of the volume image.
If only one image acquisition sweep is to be executed, or upon completion of the multiple acquisition sweeps, an optional stage S74 of flowchart 70 encompasses a depth positioning and workspace axis orientation of the endoscope within the workspace volume to acquire an endoscopic image of a centered RCM or other predetermined area of the workspace volume.
In practice, the number of image acquisition sweeps and the requirement for stage S74 are dependent upon the robotic procedure and the size and shape of the workspace volume.
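Stages S71-S74 of flowchart 70 can be summarized as a control loop. In this sketch, `robot` and `endoscope` are hypothetical interfaces standing in for robot controller 21, sweep control module 23, and endoscope 12; the method names are assumptions:

```python
def acquire_volume(robot, endoscope, n_sweeps, capture_rcm_view=True):
    """Sketch of flowchart 70: position (S71), sweep and link each
    frame to its pose (S72), repeat with sufficient overlap (S73),
    then optionally acquire the centered RCM view (S74)."""
    tagged_frames = []
    for sweep in range(n_sweeps):
        robot.set_depth_and_orientation(sweep)   # S71: ensure sweeps overlap
        for frame, pose in robot.execute_sweep(endoscope):  # S72
            tagged_frames.append((frame, pose))  # link image to pose
    if capture_rcm_view:                         # S74 (optional)
        robot.aim_at_rcm()
        tagged_frames.append(endoscope.capture_with_pose())
    return tagged_frames                         # input to the reconstructor
```

The returned list of (frame, pose) pairs is exactly what image reconstructor 22 consumes to stitch the volume image.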
FIGS. 8A-8E illustrate an exemplary execution of flowchart 70. Specifically, FIG. 8A shows endoscope 90 inserted through an insertion point 101 of a workspace volume 100 until a tip of endoscope 90 comes as close to the bottom of workspace volume 100 as possible without any object within workspace volume 100 or light reflections obscuring the video stream being generated by endoscope 90. This step allows imaging of the largest area of the bottom surface of workspace volume 100. Determination of the insertion depth may be done manually by observation or automatically by sweep control module 23 tracking one or more anatomical features in the image.
Once at this configuration as shown in FIG. 8B, endoscope 90 is completely rotated once about its endoscopic axis in order to acquire an image sweep of a circular frame series near the bottom of workspace volume 100. The projection sweep traces an annulus 110 when viewed from above. A width of annulus 110, and thus the area covered by the sweep, depends on the viewing angle of oblique endoscope 90 and a distance from a tip of endoscope 90 to the imaged surface. An RCM 102 of workspace volume 100 is not covered by this first sweep, so it will be addressed later in the process.
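The dependence of the annulus width on the viewing angle and the tip-to-surface distance can be approximated with a flat-surface projection sketch. The geometry here (oblique viewing direction at `view_angle_deg` from the shaft, half field-of-view `half_fov_deg`, flat floor at `tip_height` below the tip) is an idealization, not the patent's model:

```python
import math

def annulus_radii(tip_height, view_angle_deg, half_fov_deg):
    """Approximate inner/outer radius of the annulus traced by one
    full rotation of an oblique endoscope above a flat surface.

    The near edge of the field-of-view projects at tan(view - fov/2)
    and the far edge at tan(view + fov/2) times the tip height.
    """
    lo = math.radians(view_angle_deg - half_fov_deg)
    hi = math.radians(view_angle_deg + half_fov_deg)
    r_inner = tip_height * math.tan(lo)
    r_outer = tip_height * math.tan(hi)
    return r_inner, r_outer
```

Raising the tip (larger `tip_height`) widens the annulus, which is why each raising step in FIGS. 8C-8D must be small enough to keep successive annuli overlapping.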
One complete rotation provides an adequate overlap between the initial and final endoscopic images for image reconstructor 22 to reconstruct the continuous strip. For example, one full rotation around the endoscopic axis would typically take five (5) to ten (10) seconds. Endoscopes used in the surgical field usually perform at approximately thirty (30) frames per second. Thus, at every ten (10) degrees of rotation, there will be approximately eight (8) endoscopic images.
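The frame-count arithmetic above (approximately thirty frames per second, one full rotation in five to ten seconds) can be checked directly; at the ten-second end, each 10° segment collects roughly eight frames:

```python
def frames_per_segment(fps, rotation_seconds, segment_deg=10.0):
    """Frames captured within each segment_deg of one full 360° rotation."""
    total_frames = fps * rotation_seconds
    return total_frames * segment_deg / 360.0

# 30 fps, 10 s per full rotation -> 300 frames total, ~8.3 frames per 10 degrees
```

At the five-second end of the stated range the count drops to roughly four frames per 10°, which still leaves multiple overlapping frames per segment for the mosaicing step.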
As shown in FIG. 8C, with its orientation fixed, endoscope 90 is raised out of workspace volume 100 in preparation for the next sweep. This time endoscope 90 is rotated about its endoscopic axis in the direction opposite that of the previous sweep as shown in FIG. 8D. This is to prevent cables from becoming excessively tangled. Viewed from above, the aggregate covered area is now two concentric annuli 110 and 111 with some overlap between them. The amount of overlap should be chosen so as to facilitate reconstruction, and is controlled by the amount that endoscope 90 is raised. As before, this decision may be made by observation, or automatically by sweep control module 23 tracking anatomical feature(s) in the image.
The process of raising endoscope 90 and performing a rotating sweep in the opposite direction continues until the view becomes obscured by workspace volume 100 at insertion point 101, or, in other words, until the ability to gather new data from a perpendicular insertion angle is exhausted.
If, in the first iteration, endoscope 90 is not touching the bottom of workspace volume 100, a circular area representing the projection of RCM 102 onto the surface will not be visible. To visualize this area, endoscope 90 needs to be tilted whereby endoscope 90 is parallel to the bottom surface of workspace volume 100 as shown in FIG. 8E.
Depending on the robotic procedure and user's preference, maintaining endoscope 90 perpendicular to insertion point 101 may be sufficient to visualize all structures relevant to the robotic procedure during each image acquisition sweep. Alternatively or concurrently, endoscope 90 may be oriented as shown in FIGS. 2C and/or 2D during one or more of the image acquisition sweeps.
Specifically, for the upper half of workspace volume 100, endoscope 90 is tilted about the insertion point 101 to bring the lens closer to workspace volume 100. The amount of tilt is such that there is still some overlap with a previous sweep. With endoscope 90 now angled relative to the insertion plane, the field-of-view of endoscope 90 is then revolved around the workspace axis perpendicular to the plane through the insertion point 101. Endoscope 90 is simultaneously spun about its shaft as well, so the net effect is that the lens of endoscope 90 faces the surface of workspace volume 100 nearest a tip of endoscope 90 at all times. The resulting projection sweep takes the shape of an annulus as before when viewed from above.
Next, endoscope 90 is simultaneously raised and angled in order to view yet higher sections of workspace volume 100. Then, the two simultaneous rotations (revolution about the insertion plane normal and spin about the shaft) are performed in the opposite direction to alleviate cabling concerns. This process continues until either no further information may be gleaned or an angle of endoscope 90 reaches a limit imposed by workspace volume 100. The final area left to be covered is the region at the bottom of workspace volume 100, in line with a vector perpendicular to the insertion plane and through insertion point 101. Again, at this point the tip of endoscope 90 should be near insertion point 101, so it is tilted and rotated such that the lens of endoscope 90 faces the bottom. The image should be large enough to cover the remaining area. Once the image is acquired, endoscope 90 is removed from workspace volume 100 and the sequences of images are reconstructed into a coherent volume image, thus completing the method.
Similarly, for the lower hemisphere of workspace volume 100, endoscope 90 is inserted at a right angle to the insertion plane to the lowest depth possible within workspace volume 100. The first sweep is identical to the sweep shown in FIG. 8B. However, at each endoscope raising step, the raising motion is combined with a tilt of endoscope 90 about insertion point 101 in order to reduce the disparity between the lens-to-surface distances of successive sweeps. Each raise and tilt is done to set up a new sweep that covers new surfaces with sufficient overlap with the surfaces covered by previous sweeps. Each sweep after the initial sweep involves the simultaneous rotations about the insertion plane normal and the shaft, synchronized in a way that causes the lens of endoscope 90 to face the surface closest to endoscope 90. The sweeps are repeated until no new surfaces may be seen or endoscope 90 cannot be tilted any further. Endoscope 90 is then oriented to capture the volume floor before being removed to complete the procedure.
Referring back to FIG. 6, flowchart 70 may be performed with a fixed rotational speed of the endoscope for each sweep and a fixed depth distance of the endoscope between sweeps. Alternatively, FIG. 7 illustrates a flowchart 80 representative of an image feedback method of the present invention that facilitates an adjustment to the rotational speed of the endoscope for each sweep and the depth distance of the endoscope between sweeps.
Specifically, a stage S81 of flowchart 80 encompasses sweep control module 23 initially setting the rotational speed of the endoscope for the initial sweep and the lowest depth of the endoscope for the initial sweep, and robot controller 21 issuing robot commands 25 to achieve the set depth of the endoscope within the workspace volume and rotating the endoscope at the set rotational speed for the first sweep. After the first sweep, a stage S82 of flowchart 80 encompasses sweep control module 23 analyzing the quality of the on-going reconstruction of the volume image. Flowchart 80 will remain in a loop of stages S81 and S82 for each subsequent sweep until sweep control module 23 determines that the quality of the on-going reconstruction of the volume image is unacceptable. In that case, a stage S83 of flowchart 80 encompasses module 23 adjusting the rotational speed of the endoscope for subsequent sweeps and/or adjusting the depth distance between subsequent sweeps. In practice, flowchart 80 may be integrated into stage S71 of flowchart 70 (FIG. 6).
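The S81-S83 feedback loop can be sketched in a few lines. The quality scores, threshold, and multiplication factor below are illustrative assumptions; the text leaves the exact quality measure and adjustment rule to the implementation:

```python
# Minimal sketch of the image-feedback loop of flowchart 80: sweep at
# the current settings (S81), score the on-going reconstruction (S82),
# and slow the rotation / shrink the depth step when the score falls
# below a user threshold (S83).

def feedback_sweeps(scores, speed=1.0, depth_step=1.0,
                    threshold=0.5, factor=0.5):
    """scores: per-sweep reconstruction-quality values in [0, 1].
    Returns the (speed, depth_step) settings used after each sweep."""
    history = []
    for quality in scores:                 # stage S82: assess quality
        if quality < threshold:            # stage S83: adjust settings
            speed *= factor                # smoother frame transitions
            depth_step *= factor           # more overlap between sweeps
        history.append((speed, depth_step))
    return history
```

Starting at the robot's maximum speed and halving on a poor score reproduces the behavior described for stage S71: fastest possible operation at first, slowing only when the mosaic quality demands it.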
For example, with an initial execution of stage S71, the rotational speed of endoscope 12 (FIG. 1) is set to a maximum rotational speed of robot 11 (FIG. 1) to assure the fastest possible operation. For each frame, image reconstructor 22 (FIG. 1) performs mosaicing as previously described herein and module 23 (FIG. 1) evaluates the quality of reconstruction, using vectors, mean values, or both. If the quality of reconstruction is below a user-specified threshold, then module 23 informs robot controller 21 (FIG. 1) to reduce the rotational speed of endoscope 12 to assure a smoother frame transition, which influences the optical flow computation through the window size and the reconstruction through allowing intensity information from more frames. The speed may be reduced using a linear relation to image quality or by a predefined multiplication factor.
In practice, the rotational speed may be decreased even if the quality improves. Considering that reconstruction will be best close to the object, it is expected that the rotational speed will decrease over time.
The control of endoscope depth may be performed analogously. Specifically, the depth of endoscope 12 is not a continuous value; the endoscope is raised by a specific amount after each sweep, so less overlap and lower quality of reconstruction are expected between sweeps than within a sweep. The overlap between consecutive sweeps decreases as the raise between sweeps increases, and an overlap of roughly 50% between consecutive sweeps can assure enough information to perform a successful sweep. Therefore, the amount of overlap may be set to an initial value (e.g., > 50%) with the initial execution of stage S71. As the algorithm computes the mosaic, the depth increment may be decreased to achieve better image mosaicing.
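The relationship between the depth increment and the sweep-to-sweep overlap can be captured in a simple model. The band-coverage geometry below is an assumption for illustration (the patent does not specify it): if each sweep images a band of height `h` on the cavity wall and the endoscope is raised by `d` between sweeps, the fractional overlap is `(h - d) / h`.

```python
# Simple assumed model of inter-sweep overlap: adjacent sweeps image
# bands of height band_height on the wall, offset by depth_increment.

def overlap_fraction(band_height, depth_increment):
    """Fractional overlap between consecutive sweep bands, in [0, 1]."""
    if depth_increment >= band_height:
        return 0.0                    # bands no longer touch
    return (band_height - depth_increment) / band_height
```

Under this model, halving the depth increment from half the band height to a quarter raises the overlap from 50% to 75%, which is the direction the feedback loop exploits when it shrinks the depth step to improve mosaicing.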
The quality control of image mosaicing may be performed by: 1) comparing expected optical flow vector orientations with observed optical flow vector orientations, in which case the error indicator may be the angle between the expected and observed vectors, or 2) comparing pixel values between two or more images after the images are transformed into one reference frame. Both measures are computed per pixel and may be generalized to one mosaicing step by taking the average, root mean square, or any other statistical method of combining errors known in the art.
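The two per-pixel checks above can be sketched as follows; the flat-list image representation and the choice of RMS as the combining statistic are illustrative assumptions:

```python
import math

# Sketches of the two mosaicing quality measures: (1) the angle between
# expected and observed optical flow vectors, and (2) a root-mean-square
# intensity difference between two images already transformed into one
# reference frame.

def flow_angle_error(expected, observed):
    """Absolute angle (radians) between two 2-D flow vectors."""
    def ang(v):
        return math.atan2(v[1], v[0])
    d = ang(observed) - ang(expected)
    return abs((d + math.pi) % (2 * math.pi) - math.pi)  # wrap to [0, pi]

def rms_pixel_error(img_a, img_b):
    """RMS intensity difference of two registered images (flat lists)."""
    n = len(img_a)
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(img_a, img_b)) / n)
```

Either per-pixel error (or both) can then be averaged over a mosaicing step and compared against the user threshold driving the speed and depth adjustments.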
Referring back to FIG. 1, control unit 20 further includes an image display module 24 broadly defined herein as any module structurally configured for displaying an image of a video stream of the workspace volume within a reconstructed volume image. For example, a volume image 121 of heart 120 may be reconstructed as shown in FIG. 9A, whereby a video stream 122 from endoscope 12 is highlighted within reconstructed volume image 121 as endoscope 12 is being controlled via robot 11, as shown in FIGS. 9B and 9C. This enables a user of endoscope 12 to visualize volume image 121 as needed while the user is performing a procedure via endoscope 12. In practice, image display module 24 may be implemented by hardware, software and/or firmware integrated within robot controller 21, integrated within image reconstructor 22, distributed between robot controller 21 and image reconstructor 22, or installed on a separate device connected to robot controller 21 and/or image reconstructor 22.
From the description of FIGS. 1-9 herein, those having ordinary skill in the art will appreciate the numerous benefits of the present invention including, but not limited to, an opportunity to maximize the reconstruction quality and coverage of a volume image, to minimize a running time of a reconstruction of a volume image, and to comply with any physical limits imposed by the insertion point of the workspace volume and endoscope cables. Furthermore, while FIG. 8 was described herein with the endoscope being raised after each sweep, in practice the endoscope may be lowered after each sweep.
Although the present invention has been described with reference to exemplary aspects, features and implementations, the disclosed systems and methods are not limited to such exemplary aspects, features and/or implementations. Rather, as will be readily apparent to persons skilled in the art from the description provided herein, the disclosed systems and methods are susceptible to modifications, alterations and enhancements without departing from the spirit or scope of the present invention. Accordingly, the present invention expressly encompasses such modifications, alterations and enhancements within the scope hereof.

Claims

1. A robot imaging system, comprising:
a robot unit (10) including
an oblique endoscope (12) operable for generating a video stream (13) including a plurality of endoscopic images of a workspace volume (40), and
a robot (11) operable for moving the endoscope (12) within the workspace volume (40); and
a control unit (20) including
a robot controller (21) operable for commanding the robot (11) in executing at least one image acquisition sweep of the endoscope (12) within the workspace volume (40), wherein each image acquisition sweep includes at least one rotational motion by the endoscope (12) within the workspace volume (40),
a sweep control module (23) operable for linking each endoscopic image generated during the at least one image acquisition sweep to a corresponding rotational pose of the endoscope (12) within the workspace volume (40), and
an image reconstructor (22) operable for reconstructing a volume image of the workspace volume (40) from a linking of the generated endoscopic images to the
corresponding rotational poses of the endoscope (12) within the workspace volume (40).
2. The robot imaging system of claim 1, wherein the sweep control module (23) is further operable for controlling at least one of a rotational speed and a depth of the endoscope (12) within the workspace volume (40) during each image acquisition sweep as a function of a quality of a reconstruction of the volume image.
3. The robot imaging system of claim 2, wherein the sweep control module (23) is further operable for testing the quality of the reconstruction of the volume image as a function of at least one of an error between an expected optical flow vector orientation and an observed optical flow vector orientation and a pixel comparison between two endoscopic images transformed into a single reference frame.
4. The robot imaging system of claim 1,
wherein a reconstruction of the volume image of the workspace volume (40) includes weighting pixels within the endoscopic images; and wherein, for each endoscopic image, the weighted pixels decrease in value from a center of the endoscopic image to each frame border of the endoscopic image.
5. The robot imaging system of claim 1, wherein the at least one rotational motion of the endoscope (12) includes a spin rotation of the endoscope (12) about an endoscopic axis extending through the endoscope (12).
6. The robot imaging system of claim 1, wherein the at least one rotational motion of the endoscope (12) includes:
the endoscope (12) being tilted relative to a workspace axis perpendicular to an insertion point of the endoscope (12) into the workspace volume (40), and
a revolving rotation of the endoscope (12) about the workspace axis.
7. The robot imaging system of claim 1,
wherein the robot controller (21) is further operable for commanding the robot (11) to execute an endoscopic image acquisition of the endoscope (12) within the workspace volume (40);
wherein the endoscopic image acquisition includes the robot (11) fixing a pose of the endoscope (12) relative to a predetermined area of the workspace volume (40);
wherein the sweep control module (23) is further operable for linking an endoscopic image generated during the endoscopic image acquisition to a fixed pose of the endoscope (12) relative to a predetermined area of the workspace volume (40); and
wherein the image reconstructor (22) is further operable for reconstructing the volume image of the workspace volume (40) from a linking of the generated endoscopic image to the fixed pose of the endoscope (12) within the workspace volume (40).
8. The robot imaging system of claim 1, wherein the control unit (20) further includes: an image display module (24) operable, subsequent to a reconstruction of the volume image, for highlighting the video stream (13) within a display of the reconstructed volume image.
9. A control unit (20) for a robot unit (10) including an oblique endoscope (12) for generating a video stream (13) including a plurality of endoscopic images of a workspace volume (40) and a robot (11) for moving the endoscope (12) within the workspace volume (40), the control unit (20) comprising:
a robot controller (21) operable for commanding the robot (11) in executing at least one image acquisition sweep of the endoscope (12) within the workspace volume (40), wherein each image acquisition sweep includes at least one rotational motion by the endoscope (12) within the workspace volume (40);
a sweep control module (23) operable for linking each endoscopic image generated during the at least one image acquisition sweep to a corresponding rotational pose of the endoscope (12) within the workspace volume (40); and
an image reconstructor (22) operable for reconstructing a volume image of the workspace volume (40) from a linking of the generated endoscopic images to the corresponding rotational poses of the endoscope (12) within the workspace volume (40).
10. The control unit (20) of claim 9, wherein the sweep control module (23) is further operable for controlling at least one of a rotational speed and a depth of the endoscope (12) within the workspace volume (40) during each image acquisition sweep as a function of a quality of a reconstruction of the volume image.
11. The control unit (20) of claim 10, wherein the sweep control module (23) is further operable for testing the quality of the reconstruction of the volume image as a function of at least one of an error between an expected optical flow vector orientation and an observed optical flow vector orientation and a pixel comparison between two endoscopic images transformed into a single reference frame.
12. The control unit (20) of claim 9,
wherein a reconstruction of the volume image of the workspace volume (40) includes weighting pixels within the endoscopic images; and
wherein, for each endoscopic image, the weighted pixels decrease in value from a center of the endoscopic image to each frame border of the endoscopic image.
13. The control unit (20) of claim 9, wherein the at least one rotational motion of the endoscope (12) includes a spin rotation of the endoscope (12) about an endoscopic axis extending through the endoscope (12).
14. The control unit (20) of claim 9, wherein the at least one rotational motion of the endoscope (12) includes:
the endoscope (12) being tilted relative to a workspace axis perpendicular to an insertion point of the endoscope (12) into the workspace volume (40), and
a revolving rotation of the endoscope (12) about the workspace axis.
15. The control unit (20) of claim 9,
wherein the robot controller (21) is further operable for commanding the robot (11) to execute an endoscopic image acquisition of the endoscope (12) within the workspace volume (40);
wherein the endoscopic image acquisition includes the robot (11) fixing a pose of the endoscope (12) relative to a predetermined area of the workspace volume (40);
wherein the sweep control module (23) is further operable for linking an endoscopic image generated during the endoscopic image acquisition to a fixed pose of the endoscope (12) relative to a predetermined area of the workspace volume (40); and
wherein the image reconstructor (22) is further operable for reconstructing the volume image of the workspace volume (40) from a linking of the generated endoscopic image to the fixed pose of the endoscope (12) within the workspace volume (40).
16. The control unit (20) of claim 9, wherein the control unit (20) further includes:
an image display module (24) operable, subsequent to a reconstruction of the volume image, for highlighting the video stream (13) within a display of the reconstructed volume image.
17. A robot imaging method for an oblique endoscope (12) generating a video stream (13) including a plurality of endoscopic images of a workspace volume (40), the robot imaging method comprising:
commanding a robot (11) to execute at least one image acquisition sweep of the endoscope (12) within the workspace volume (40),
wherein each image acquisition sweep includes the robot (11) controlling at least one rotational motion by the endoscope (12) within the workspace volume (40);
linking each endoscopic image generated during the at least one image acquisition sweep to a corresponding rotational pose of the endoscope (12) within the workspace volume (40); and reconstructing a volume image of the workspace volume (40) from the linking of the endoscopic images with the corresponding rotational poses of the endoscope (12) within the workspace volume (40).
18. The robot imaging method of claim 17, further comprising: controlling at least one of a rotational speed and a depth of the endoscope (12) within the workspace volume (40) during each image acquisition sweep as a function of a quality of a reconstruction of the volume image.
19. The robot imaging method of claim 17, further comprising:
commanding the robot (11) to execute an endoscopic image acquisition of the endoscope (12) within the workspace volume (40),
wherein the endoscopic image acquisition includes the robot (11) fixing a pose of the endoscope (12) relative to a predetermined area of the workspace volume (40);
linking an endoscopic image generated during the endoscopic image acquisition to a fixed pose of the endoscope (12) relative to a predetermined area of the workspace volume (40); and
reconstructing the volume image of the workspace volume (40) from the linkage of the generated endoscopic image to a fixed pose of the endoscope (12) within the workspace volume (40).
20. The robot imaging method of claim 17, further comprising:
subsequent to a reconstruction of the volume image, highlighting the video stream
(13) within a display of the reconstructed volume image.
PCT/IB2011/052334 2010-06-30 2011-05-27 Robotic control of an oblique endoscope for fov images WO2012001549A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
US35984210P 2010-06-30 2010-06-30
US61/359,842 2010-06-30

Publications (1)

Publication Number Publication Date
WO2012001549A1 (en) 2012-01-05


Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IB2011/052334 WO2012001549A1 (en) 2010-06-30 2011-05-27 Robotic control of an oblique endoscope for fov images

Country Status (2)

Country Link
TW (1) TW201206387A (en)
WO (1) WO2012001549A1 (en)


Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20040210105A1 (en) * 2003-04-21 2004-10-21 Hale Eric Lawrence Method for capturing and displaying endoscopic maps
US20090005640A1 (en) * 2007-06-28 2009-01-01 Jens Fehre Method and device for generating a complete image of an inner surface of a body cavity from multiple individual endoscopic images



Cited By (8)

Publication number Priority date Publication date Assignee Title
WO2013181684A1 (en) * 2012-06-05 2013-12-12 Optimized Ortho Pty Ltd A method, guide, guide indicia generation means, computer readable storage medium, reference marker and impactor for aligning an implant
WO2014001980A1 (en) * 2012-06-28 2014-01-03 Koninklijke Philips N.V. Enhanced visualization of blood vessels using a robotically steered endoscope
CN104411226A (en) * 2012-06-28 2015-03-11 皇家飞利浦有限公司 Enhanced visualization of blood vessels using a robotically steered endoscope
JP2015526133A (en) * 2012-06-28 2015-09-10 コーニンクレッカ フィリップス エヌ ヴェ Improving blood vessel visualization using a robot-operated endoscope
RU2689767C2 (en) * 2012-06-28 2019-05-28 Конинклейке Филипс Н.В. Improved imaging of blood vessels using a robot-controlled endoscope
EP3119325B1 (en) 2014-03-17 2018-12-12 Intuitive Surgical Operations, Inc. Systems and methods for control of imaging instrument orientation
US10548459B2 (en) 2014-03-17 2020-02-04 Intuitive Surgical Operations, Inc. Systems and methods for control of imaging instrument orientation
US11123149B2 (en) 2015-10-09 2021-09-21 Covidien Lp Methods of using an angled endoscope for visualizing a body cavity with robotic surgical systems

Also Published As

Publication number Publication date
TW201206387A (en) 2012-02-16

Similar Documents

Publication Publication Date Title
US9603508B2 (en) Method for capturing and displaying endoscopic maps
RU2594813C2 (en) Robot control for an endoscope from blood vessel tree images
US9129422B2 (en) Combined surface reconstruction and registration for laparoscopic surgery
US8504136B1 (en) See-through abdomen display for minimally invasive surgery
US10945796B2 (en) Robotic control of surgical instrument visibility
Noonan et al. A stereoscopic fibroscope for camera motion and 3D depth recovery during minimally invasive surgery
RU2692206C2 (en) Robotic control of endoscope based on anatomical features
WO2012001549A1 (en) Robotic control of an oblique endoscope for fov images
CN109069207B (en) Robot system, control unit thereof, and computer-readable storage medium
EP3463032B1 (en) Image-based fusion of endoscopic image and ultrasound images
CN108601626A (en) Robot guiding based on image
Liu et al. Global and local panoramic views for gastroscopy: an assisted method of gastroscopic lesion surveillance
CN109715106A (en) Control device, control method and medical system
JP2019511931A (en) Alignment of Surgical Image Acquisition Device Using Contour Signature
US20160331475A1 (en) Continuous image integration for robotic surgery
JP2018521774A (en) Endoscopic guidance from interactive planar slices of volume images
US10702346B2 (en) Image integration and robotic endoscope control in X-ray suite
US20110176715A1 (en) Four-dimensional volume imaging system
WO2020054566A1 (en) Medical observation system, medical observation device and medical observation method
Wittenberg et al. 3-D reconstruction of the sphenoid sinus from monocular endoscopic views: First results
WO2018154601A1 (en) Multi-camera imaging and visualization system for minimally invasive surgery
Lerotic et al. The use of super resolution in robotic assisted minimally invasive surgery

Legal Events

Date Code Title Description
121 EP: the EPO has been informed by WIPO that EP was designated in this application (Ref document number: 11730074; Country of ref document: EP; Kind code of ref document: A1)
NENP Non-entry into the national phase (Ref country code: DE)
122 EP: PCT application not entered into the European phase (Ref document number: 11730074; Country of ref document: EP; Kind code of ref document: A1)