JP4152402B2 - Surgery support device - Google Patents

Surgery support device

Info

Publication number
JP4152402B2
Authority
JP
Japan
Prior art keywords
image
endoscope
position
surgical
display
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
JP2005189842A
Other languages
Japanese (ja)
Other versions
JP2007007041A
Inventor
秀和 仲本 (Hidekazu Nakamoto)
誠 橋爪 (Makoto Hashizume)
Original Assignee
株式会社日立メディコ (Hitachi Medical Corporation)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 株式会社日立メディコ (Hitachi Medical Corporation)
Priority to JP2005189842A
Publication of JP2007007041A
Application granted
Publication of JP4152402B2
Application status: Active
Anticipated expiration

Description

  The present invention relates to an advanced endoscopic surgery support apparatus that supports surgery using an endoscope by means of displayed images, and more particularly to a surgery support apparatus combined with an imaging apparatus such as an MRI apparatus or a CT apparatus.

In recent years, surgery support devices that assist medical procedures, particularly surgical treatment, using an endoscope or a surgical robot have been developed and put into practical use. For example, Patent Document 1 describes projecting and displaying 3D data, obtained by extracting a surgical target part from a three-dimensional image, onto an image captured by a camera in accordance with the direction and position of the camera. It also describes measuring the position and orientation of a manipulator that operates a surgical instrument and projecting a 3D image of the manipulator onto the image obtained by the imaging unit. Patent Document 2 describes superimposing information created by a surgery support image creation means onto an endoscopic image, the surgery support image showing the movable range of the surgical instrument and the manipulator.
[Patent Document 1] Japanese Unexamined Patent Publication No. 9-19441
[Patent Document 2] JP 2001-104333 A

  In surgical treatment using an operation support system such as those described above, the positions of the surgical instrument and the endoscope change from moment to moment, and the endoscopic video changes accordingly. In addition, the position at which the endoscope is arranged may be changed as appropriate. In the conventional technology, however, if the position of the surgical instrument or endoscope changes after the surgical support image has been created, the surgical support image must be recreated according to the new endoscope position and displayed with the positions realigned; it is therefore difficult to display a support image in real time following changes in the endoscope position and the surgical instrument position. For the same reason, it has also been difficult to confirm in real time the distance between a specific surgical instrument, including the endoscope, and a specific region of the subject (for example, a region that the surgical tool must not contact).

  Therefore, it is an object of the present invention to provide a surgery support apparatus that can display an image of the surgical target region superimposed at the corresponding position on the endoscopic video in real time, following changes in the endoscope and the surgical tool, and that can display the distance between a surgical tool and a specific region of the subject in real time.

  The surgery support apparatus of the present invention that solves the above problems comprises an endoscope, display means for displaying the video of the endoscope, and image creation means for creating, from three-dimensional volume image data of the subject, a surgery support image to be superimposed on the video displayed on the display means. The apparatus further comprises three-dimensional position detection means for sequentially detecting the position of the endoscope, and coordinate integration means for integrating the coordinates of the three-dimensional position detection means with the image coordinates of the image creation means. The apparatus is characterized in that the image creation means, based on the position information of the endoscope obtained from the coordinate integration means, creates from the specific area image data extracted from the three-dimensional volume image data a projection image whose projection surface is a specific plane determined by the position information of the endoscope, and superimposes it on the endoscopic video at the time the endoscope position is detected.

In the surgery support apparatus of the present invention, the projection image created by the image creation means is, for example, a projection onto a plane that contains the focal position of the endoscope and is orthogonal to the traveling direction of the endoscope.
In the surgery support apparatus of the present invention, preferably, the three-dimensional position detection means also detects the position of the surgical tool applied to the subject, and the image creation means, based on the surgical tool position information detected by the three-dimensional position detection means and converted by the coordinate integration means, superimposes an image of the surgical tool at the position on the surgery support image corresponding to the detected surgical tool position and displays it on the display means.

In the surgery support apparatus of the present invention, the image creation means creates the projection image of the specific region by changing its projection plane in real time in accordance with changes in the endoscopic video accompanying movement of the endoscope, and displays it on the display means.
In addition, the surgery support apparatus of the present invention preferably includes a calculation unit that calculates the distance between the specific region and the surgical instrument, and displays on the display means the distance calculated by the calculation unit and/or a warning when that distance falls within a threshold value.
The surgery support apparatus of the present invention may further include an imaging unit that acquires the three-dimensional volume image data. In this case, the imaging unit can also serve as the image creation means.

  According to the present invention, the position of the endoscope is sequentially detected by the three-dimensional position detection means, and the detected position information is given to the image creation means as coordinate information unified, by the coordinate integration means, with the coordinates of the image creation means that creates the surgery support image. The image creation means can therefore superimpose the specific area image in real time, maintaining its relative position in real space with respect to the endoscope, as the position and orientation of the endoscope change. As a result, information necessary for surgical support can be provided to the surgeon and the accuracy of treatment can be improved. In addition, since the specific area image superimposed on the endoscopic video maintains its relative position in real space, the distance to the surgical instrument can be calculated alongside the surgery support image display, making it possible to promptly warn when the surgical instrument comes close to blood vessels or organs that it must not contact.

  Further, according to the present invention, an imaging apparatus such as an MRI apparatus or a CT apparatus can also serve as the image creation means. In this case, by using the three-dimensional position detection means that detects the endoscope position, imaging such as an interactive scan, which images a desired part of the subject designated with a pointing tool, can be performed in a timely manner.

Hereinafter, embodiments of the present invention will be described with reference to the drawings.
FIG. 1 is a perspective view showing the overall configuration of an advanced endoscopic surgery support apparatus according to the present invention, and FIG. 2 is its block diagram. This operation support apparatus performs surgical treatment of a patient using an endoscope and a small robot while using images obtained by an imaging apparatus such as an MRI apparatus as one of the operation support images. It comprises an imaging device 10 for capturing tomographic images of a subject, an endoscope 20 for observing the surgical target region of the subject, a small surgical robot 30 for operating surgical instruments, a three-dimensional position detection device 40 for detecting the positions of the endoscope 20 and the surgical instruments operated by the robot 30, an image creation unit 50 for creating the surgery support image superimposed on the endoscopic video, a display device 70 for displaying the endoscopic video and the surgery support image, and a control unit 60 that performs display control of the display device 70. The imaging device 10, the endoscope 20, and the robot 30 are each provided with operation units 80 (81, 82, 83) through which an operator inputs operation commands and conditions.

  In the embodiment shown in FIG. 1, the imaging apparatus 10 is a vertical magnetic field, open MRI apparatus in which a pair of magnets 13 and 15 are arranged above and below the imaging space, and imaging can be performed with the bed 12, on which the subject 11 is placed, positioned in the imaging space. The configuration of the MRI apparatus and its imaging method are the same as those of a conventional MRI apparatus, and their description is omitted here.

  The endoscope 20 includes a scope unit and an image processing unit. The scope unit includes an objective lens and a small camera and is inserted into the subject 11. The image processing unit captures the image data from the small camera, records it in the video recording device 21, and transfers it to the control unit 60 for display on the display device 70 as endoscopic video. The endoscope 20 may be either a monocular endoscope equipped with a single camera or a stereo endoscope equipped with a pair of cameras; if a display device 70 that supports two-viewpoint stereo video display is used, both can be accommodated. The endoscope 20 is fixed to a drive mechanism attached to the MRI apparatus and can be rotated and translated by operations from the operation unit 83. In addition, the endoscope 20 is fitted with an instrument (not shown; a surgical instrument following instrument) that moves following the operation of the endoscope and whose position can be detected by the three-dimensional position detection device 40.

  The small robot 30 has a plurality of robot arms that are rotatably connected to each other. By changing the connection angles of these robot arms, a surgical tool such as a scalpel fixed to the tip of the robot arm can be moved and rotated in any direction by an arbitrary amount. The robot 30 is likewise fitted with an instrument (a surgical instrument following instrument) that moves following its movement and can be detected by the three-dimensional position detection device 40.

  As the mechanism for driving the endoscope 20 and the robot (robot arm) 30, rails are attached to the bed 12 of the MRI apparatus 10 along both side edges, and arc-shaped arms (an endoscope arm 25 and a robot arm 35) movable along the rails are attached. The scope unit of the endoscope 20 and the robot arm are attached to these arms. When a plurality of surgical tools are used for the operation, a plurality of robots are attached to one or more arms. In the example shown in FIG. 1, two surgical tools are fixed to one arm.

  The three-dimensional position detection device 40 detects the real-space positions of the surgical instrument followers fixed to the endoscope 20 and the robot 30. In this embodiment, it comprises a position detection device 42 having a pair of infrared cameras 41, and a processing unit 43 that applies processing such as a predetermined coordinate conversion to the position signals detected by the position detection device 42.

  The position detection device 42 includes an infrared source that generates infrared rays, and detects the infrared rays emitted from the source and reflected by predetermined reflectors (markers), thereby detecting the position of the target to which the reflectors are attached. On each of the above-described surgical instrument followers fixed to the endoscope and the robot, markers are fixed at no fewer than three locations. From the three position coordinates obtained from the markers, three-dimensional position information such as the position of the tip of the endoscope to which the follower is fixed and the direction of the endoscope can be obtained. The position detection device 42 is fixed by a support arm 45 to the gantry that covers the upper magnet 13 of the MRI apparatus 10, and can be set to an arbitrary position by moving the support arm 45.

  A reference tool 47 is fixed to the gantry of the MRI apparatus 10 in order to associate directions and positions in the camera coordinates captured by the infrared cameras 41 with the apparatus coordinate system (real-space coordinates) of the MRI apparatus. Three markers 48 (for example, reflecting spheres that reflect the infrared rays emitted from the infrared cameras) detectable by the infrared cameras 41 are fixed to the reference tool 47. By detecting the positions of these three markers, the three-dimensional position of the cameras with respect to the origin of the apparatus coordinates can be detected, and the camera coordinates can be associated with the apparatus coordinates. The infrared cameras 41 send the position information detected at a predetermined sampling rate to the processing unit 43, and the processing unit 43 converts the detected position information (camera coordinate position information) into position information in the apparatus coordinate system. A coordinate conversion matrix for this conversion is stored in advance in the processing unit 43, so coordinate-converted position information is obtained simultaneously with position detection by the position detection device. The processing unit 43 can be implemented, for example, on a personal computer.
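As an illustrative sketch (not part of the patent disclosure), the coordinate conversion performed by the processing unit 43 can be modeled as applying a precomputed 4x4 homogeneous transformation matrix to positions detected in the camera coordinate system. All function names and numeric values below are hypothetical.

```python
import numpy as np

def to_homogeneous(points):
    """Append a 1 to each 3-D point so a 4x4 matrix can be applied."""
    points = np.asarray(points, dtype=float)
    ones = np.ones((points.shape[0], 1))
    return np.hstack([points, ones])

def camera_to_device(points_cam, T_cam_to_dev):
    """Transform Nx3 camera-coordinate points into device coordinates."""
    homo = to_homogeneous(points_cam)      # N x 4
    transformed = homo @ T_cam_to_dev.T    # rotation + translation in one step
    return transformed[:, :3]

# Example: a pure translation of +100 mm along the device x axis.
T = np.eye(4)
T[0, 3] = 100.0
marker_cam = [[10.0, 20.0, 30.0]]
print(camera_to_device(marker_cam, T))  # [[110. 20. 30.]]
```

In practice the matrix would be estimated once from the three reference-tool markers and then reused for every sample, which is consistent with the description that conversion happens simultaneously with detection.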

  The image creation unit 50 processes the three-dimensional image data of the subject 11 imaged by the MRI apparatus 10 to extract three-dimensional segmentation image data (hereinafter abbreviated as 3D-SG image data) of a specific region of the subject, and, based on the position of the endoscope 20 obtained from the three-dimensional position detection device 40, applies projection processing to the 3D-SG image data to create the surgery support image. The image creation unit 50 also creates a pre-registered surgical tool image (virtual surgical tool) and incorporates it into the surgery support image based on the surgical tool position obtained from the three-dimensional position detection device 40. The image creation unit 50 stores virtual surgical tool data created in advance for a plurality of surgical tools, and when a surgical tool to be used for surgery is selected and registered, the corresponding virtual surgical tool data is used. The virtual surgical tool data may be two-dimensional data schematically representing the surgical tool, or three-dimensional data; in the latter case, projection processing is performed in the same manner as for the 3D-SG image data.

  The control unit 60 includes a display control unit 61, a distance calculation unit 62, and the like. The display control unit 61 superimposes the surgery support image created by the image creation unit 50 on the endoscopic video and causes the display device 70 to display it, and also displays on the display device 70 the calculation result of the distance calculation unit 62 and any accompanying warning. The distance calculation unit 62 calculates the distance between the surgical instrument position obtained from the three-dimensional position detection device 40 and the specific region of the subject. Since the surgical tool position is given as information converted into the apparatus coordinate system, while the position (image coordinates) of the specific region is associated with the apparatus coordinates of the MRI apparatus, the distance between the two can be calculated immediately from the position information. The user can select and set in advance the specific region for which the distance to the surgical instrument should be calculated, and, for the warning display, can set the threshold at which a warning should be issued.

  The functions of the control unit 60 may be implemented independently of the MRI apparatus 10, but the control unit provided in the MRI apparatus 10 can also serve these functions. The control unit 60 includes an operation unit through which the user inputs various conditions and commands; when the control unit of the MRI apparatus 10 also serves as the control unit 60, these commands are input from the operation unit 81 of the MRI apparatus 10.

  The display device 70 includes one or more monitors. In the present embodiment, an operator monitor 71 and an operating room monitor 72 are provided. The operator monitor 71 can also serve as a monitor for displaying a GUI or the like of the input unit (operation unit 81).

Next, the operation of the surgery support apparatus having the above configuration will be described. FIG. 3 is an example of a flow showing a procedure of endoscopic surgery using the surgery support apparatus of the present embodiment.
As a preparatory stage for surgery, first, after attaching a surgical instrument follower to the endoscope 20 and the robot arm (steps 101 and 102), the endoscope and the robot arm are installed inside the MRI apparatus (step 103). Next, the subject is imaged by the MRI apparatus, and a three-dimensional volume image is obtained for the region of the subject including the surgical site (step 104). Segmentation of a necessary part is performed using this three-dimensional volume image (step 105).

  For the segmentation, a known region-extraction method can be adopted, for example, specifying a point sequence on the contour of a region and connecting the points by spline interpolation, or a threshold method. Spline interpolation or the like can also be applied to data between slices. As shown in FIG. 4, the three-dimensional volume image is DICOM data determined by the position in the slice plane and the slice direction, and the specific area image data (3D-SG image data) 401, 402, and 403 resulting from such segmentation are also DICOM data. Since the DICOM coordinates are associated with the apparatus coordinates of the MRI apparatus, once a position in the apparatus coordinates is known, the corresponding coordinates in the DICOM data are determined. The segmentation is performed, for example, on the organ to be operated on and nearby organs and blood vessels, and each set of 3D-SG image data is stored in the memory of the image creation unit 50. In the illustrated example, 3D-SG image data 401, 402, and 403 are created for three regions 404 to 406, respectively.
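The threshold method mentioned above can be sketched as follows. This is a minimal illustration on a tiny hypothetical intensity volume, not the patent's actual implementation; real MRI volumes and DICOM handling are far more involved.

```python
import numpy as np

def threshold_segmentation(volume, low, high):
    """Return a binary mask of voxels whose intensity lies in [low, high]."""
    volume = np.asarray(volume)
    return (volume >= low) & (volume <= high)

# Hypothetical 2x2x2 intensity volume.
volume = np.array([[[0, 50], [120, 200]],
                   [[80, 130], [90, 10]]])
mask = threshold_segmentation(volume, 80, 150)
print(mask.sum())  # 4 voxels fall inside the intensity window
```

The resulting binary mask, indexed in the same voxel grid as the volume, plays the role of the 3D-SG image data: because the grid is tied to DICOM (and hence apparatus) coordinates, each True voxel has a known real-space position.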

  Next, registration of the endoscope 20 and the robot arm is performed using the three-dimensional volume image (step 106). Registration is the operation by which the processing unit 43 of the three-dimensional position detection device 40 integrates the positions of the surgical instrument followers (their markers) attached to the endoscope 20 and the robot arm into the apparatus coordinates (real-space coordinates). Specifically, the position of the reference tool 47 (its three markers) and the positions of the surgical instrument followers (three markers each) fixed to the endoscope 20 and the robot arm are detected by the three-dimensional position detection device 40, whereby the positions of the surgical instrument followers in the apparatus coordinate system are detected. In addition, by inputting the positional relationship between the markers of each surgical instrument follower and the focal point of the endoscope 20 or the distal end of the surgical instrument (such as a scalpel) fixed to the robot arm, the relationship between the marker placement and the points to be monitored is stored in the processing unit 43. Once such registration is performed, the position and orientation of the endoscope 20 and the surgical instrument in real-space coordinates can thereafter be detected by detecting the positions of the markers of the surgical instrument followers.
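The idea of recovering a monitored point (such as the endoscope focal point or a scalpel tip) from three marker positions can be illustrated with a rigid-body sketch: the markers define a local frame, the offset of the monitored point is stored once in that frame, and the point is later reconstructed from newly detected marker positions. All names are hypothetical and this assumes the three markers are non-collinear.

```python
import numpy as np

def marker_frame(m1, m2, m3):
    """Build an orthonormal frame (origin + rotation) from three markers."""
    m1, m2, m3 = map(np.asarray, (m1, m2, m3))
    x = (m2 - m1) / np.linalg.norm(m2 - m1)
    n = np.cross(m2 - m1, m3 - m1)           # normal of the marker plane
    z = n / np.linalg.norm(n)
    y = np.cross(z, x)
    return m1, np.column_stack([x, y, z])    # origin, rotation matrix

def register_tip(markers, tip_world):
    """At registration time, express the tip position in the marker frame."""
    origin, R = marker_frame(*markers)
    return R.T @ (np.asarray(tip_world, dtype=float) - origin)

def locate_tip(markers, tip_local):
    """During surgery, recover the tip from newly detected marker positions."""
    origin, R = marker_frame(*markers)
    return origin + R @ tip_local

markers0 = [[0, 0, 0], [1, 0, 0], [0, 1, 0]]
tip_local = register_tip(markers0, [0.5, 0.5, 2.0])
# Markers translated by (10, 0, 0): the monitored tip moves with them.
markers1 = [[10, 0, 0], [11, 0, 0], [10, 1, 0]]
print(locate_tip(markers1, tip_local))  # approximately [10.5, 0.5, 2.0]
```

This matches the description that, after registration, tracking only the three markers suffices to know the position and orientation of the endoscope or surgical tool.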

  When these preparation steps are completed, robotic surgery using the endoscope is started (step 107). The movements of the endoscope and the robot are detected continually by the three-dimensional position detection device 40, and their positions are sent to the image creation unit 50 as position information converted into image coordinates. The image creation unit 50 creates a projection image (rendered image) from the 3D-SG image data stored in the memory, based on the direction of the endoscope and its focal position sent from the three-dimensional position detection device 40. The projection plane of the projection image is, for example, a plane centered on the endoscope focal point and orthogonal to the direction of the endoscope. FIG. 4 shows segmentation images 408 to 410 created as projection images from the 3D-SG image data. The image creation unit 50 also creates an image 407 in which the virtual surgical tool is arranged in the image space based on the orientation and position information of the robot arm. When the virtual surgical tool data is three-dimensional, a projection image projected onto the same projection plane as the segmentation images is created.
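Projection onto a plane centered on the endoscope focal point and orthogonal to the endoscope direction can be sketched as an orthographic projection. This is a simplifying assumption for illustration (the patent defines the plane but not the full rendering model), and all names and coordinates are hypothetical.

```python
import numpy as np

def project_to_plane(points, focal, direction):
    """Orthographically project 3-D points onto the plane through `focal`
    whose normal is the endoscope's viewing direction."""
    points = np.asarray(points, dtype=float)
    d = np.asarray(direction, dtype=float)
    d = d / np.linalg.norm(d)
    offsets = points - np.asarray(focal, dtype=float)
    dist_along = offsets @ d                # signed distance along the normal
    return points - np.outer(dist_along, d)

# Endoscope looking along +z with its focal point at the origin:
# points keep their x/y coordinates and are flattened onto z = 0.
pts = np.array([[1.0, 2.0, 5.0], [3.0, -1.0, -4.0]])
print(project_to_plane(pts, [0, 0, 0], [0, 0, 1]))
```

Re-running this with a new focal point and direction each time the endoscope moves gives the real-time projection-plane update described in the text.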

The display control unit 61 superimposes the surgery support image created by the image creation unit 50 on the endoscopic video and displays it on the display device 70 (412 to 414). At this time, the size of the projection image is adjusted to the magnification of the endoscope: when the magnification of the endoscope is ×1, the projection image has the same size as in real space, and when it is ×2, it is twice the real-space size. FIG. 5 shows the segmentation image and the virtual surgical tools superimposed on the endoscopic video. Since the position coordinates of the endoscope and the virtual surgical tools are integrated with the apparatus coordinates of the MRI apparatus, the segmentation image 503, the virtual surgical tools 504 and 505, and the endoscope 502 are displayed as surgery support images (507, 508, 509) on the endoscopic video 506 while maintaining their relative positions in real space, as shown in the figure.
Such creation and superposed display of the projection image from the segmentation image are executed in step with the sequential detection of the positions of the endoscope and the robot. That is, as the endoscope moves and its video changes, a segmentation image and a virtual surgical tool for the changed position are created and displayed in a superimposed manner.
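The magnification adjustment described above (×1 meaning real-space size, ×2 meaning double) amounts to scaling the projected overlay about a reference point in the image. A minimal sketch with hypothetical names, assuming the scaling center is the image point corresponding to the endoscope axis:

```python
import numpy as np

def scale_projection(points_2d, center, magnification):
    """Scale projected overlay points about `center` so their size matches
    the endoscope magnification (1.0 = real-space size, 2.0 = double)."""
    pts = np.asarray(points_2d, dtype=float)
    return center + magnification * (pts - center)

# A point 2 units right of and 2 units above the center, at magnification x2,
# moves to 4 units right and 4 units above.
print(scale_projection([[12.0, 8.0]], np.array([10.0, 10.0]), 2.0))
```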

  As an example, as shown in FIG. 6, when the endoscope 603 is installed from the foot side with respect to the surgical target region (target) 602 of the subject 601, the relationship between the endoscope 603 and the target 602 is uniquely defined in the apparatus coordinate system, and a projection image 605 is created with a predetermined plane determined by the endoscope position as the projection plane and superimposed on the endoscope video 606. When the endoscope is instead installed from the head side, the position coordinates of the endoscope 603 in the apparatus coordinate system change, and a projection image 610 on the new projection plane is created and superimposed on the endoscope video 611. That is, the superimposed projection image is always taken from the same side as the image the endoscope itself captures.

Also, when the surgical instrument position changes, the virtual surgical instrument is displayed at the position on the video corresponding to the changed position. Since the position coordinates of the endoscope and the virtual surgical instrument are integrated with the apparatus coordinates of the MRI apparatus, this image processing can be performed in real time.
The segmentation image (projection image) is a planar image, but each organ or part is depicted in it three-dimensionally. The surgeon can therefore easily grasp the three-dimensional positional relationships between the segmented organs and parts within the displayed endoscopic video.

  FIG. 4 shows three segmentation images being created and displayed, but the creation and display of segmentation images can be selected as necessary (step 109). For example, as the endoscope advances, the blood vessels and organs that the surgical tool must not contact may differ. In that case, the operator selects the segmentation image to be displayed via the operation unit 81, and only the selected segmentation image is displayed. Alternatively, all segmentation images can be hidden.

  The distance calculation unit 62 calculates the distance between the surgical tool fixed to the robot arm and the specific region of the subject (a region specified in advance by the user) based on the position of the robot arm obtained from the three-dimensional position detection device 40, and displays it on the display device 70 (step 110). The distance can be calculated from the position information of the surgical tool in real-space coordinates and the position information of the segmentation image of the specific region. When the calculated distance is less than or equal to a preset threshold value, a warning is issued with an image or sound (step 111).
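The distance calculation and threshold warning of steps 110 and 111 can be sketched as a nearest-point computation over the points of the segmented region. This is a simplification (names, the point-cloud representation, and the sample coordinates are all hypothetical assumptions):

```python
import numpy as np

def min_distance(tool_tip, region_points):
    """Smallest Euclidean distance from the tool tip to the region."""
    diffs = np.asarray(region_points, dtype=float) - np.asarray(tool_tip, dtype=float)
    return float(np.min(np.linalg.norm(diffs, axis=1)))

def check_proximity(tool_tip, region_points, threshold_mm):
    """Return the distance and whether it has crossed the warning threshold."""
    d = min_distance(tool_tip, region_points)
    return d, d <= threshold_mm

# Hypothetical segmented region (e.g. voxel centers of a vessel) and tool tip.
region = [[0, 0, 0], [10, 0, 0], [0, 10, 0]]
dist, warn = check_proximity([3, 4, 0], region, threshold_mm=6.0)
print(dist, warn)  # 5.0 True -> within threshold, warning should be issued
```

Because both positions are already in the shared apparatus coordinate system, this check can run at every position-detection sample, which is why the warning can be issued promptly.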

  Further, in the present embodiment, an interactive scan is executed by user selection (step 112). The interactive scan is a function that captures in real time a tomographic image containing a position on the subject designated by the user with a pointing tool. In an ordinary MRI apparatus, a desired position is designated with an indicator that can be detected by the three-dimensional position detector, and the MRI apparatus receives this position information and images a cross section containing the indicated direction of the indicator. Here, since the robot arm is provided with a surgical instrument follower detectable by the three-dimensional position detector, the interactive scan can be executed using this follower as the indicator. The tomographic image captured by the interactive scan can be displayed on the display device 70 alongside the endoscopic video.

  The endoscopic video and the projection image superimposed on it are images from the endoscope's viewpoint, whereas the tomographic image obtained by the interactive scan has a different plane as its slice plane, designated via the robot arm. It is therefore possible to image, for example, a part behind the endoscope that is difficult to see, or internal tissue that cannot be seen with the endoscope at all. As a result, the surgeon obtains multifaceted information together with the endoscopic video, leading to improved surgical accuracy.

  Although not shown in the flow of FIG. 3, in this embodiment it is also possible to perform surgical navigation using the three-dimensional volume data as necessary. Surgical navigation is a well-known surgery support function: a desired position on the subject is designated with an indicator detectable by the three-dimensional position detector, and a cross-sectional image containing the designated position is created from the three-dimensional volume image acquired in advance and displayed. Here too, the robot arm can be used in place of the pointing tool.

  While interactive scanning and surgical navigation are performed as appropriate in this way, the surgery support image superimposed on the endoscopic video is updated and displayed in real time until the operation is completed (step 113).

  Next, an example of a GUI for executing the present embodiment is shown in FIG. In the example shown in the figure, an endoscope image is displayed in the approximate center of the screen 701, and two surgical tools 713 and 714 and a segmentation image 708 are superimposed on the screen.

  On the right side of the video display screen, a warning display screen 702, a distance display screen 703, and a three-dimensional display screen 704 are provided. The warning display screen 702 indicates, for example, that a surgical instrument is approaching a specific region, in characters and graphics accompanied by a warning sound. The distance display screen 703 displays the distance between the surgical tool and the specific region, and this display is updated as the distance changes. The three-dimensional display screen 704 displays the real-space arrangement of the segmented specific regions and the surgical instruments.

  An operation screen (GUI) is displayed in the area 705 below the video display screen. "Scan1" and "Scan2" 706 are operation buttons for starting imaging; one of a plurality of imaging methods (here, two) can be selected. "Segmentation: ON/OFF" 707 is a button for selecting display or non-display of the segmentation images. In this example, display or non-display of all segmentation images is selected together; it is also possible to designate one of the regions shown on the three-dimensional display screen 704 and select its display or non-display individually. "Operate Tool: ON/OFF" 712 is a button for selecting display or non-display of the surgical tools; here too, display or non-display of all of the surgical tools may be selectable together, or it may be possible to select display or non-display of a designated tool only. "ISC: ON/OFF" 717 is an operation button for executing the interactive scan, and "Navigation: ON/OFF" 717 is an operation button for executing navigation.

  The operator can give instructions for imaging, display, and the like via the GUI displayed on the monitor, and can proceed with the operation while viewing the video, distances, and other information displayed on the monitor.

  According to the present embodiment, since the coordinates of the endoscope and the robot arm (and the surgical tool fixed to it) are integrated with the apparatus coordinates of the MRI apparatus, the surgery support image can be updated and superimposed in real time as the endoscopic video changes. In addition, the distance between the surgical instrument position and the specific region can be calculated in a very short time, so that a warning can be promptly issued when the surgical instrument approaches a specific part that it must not contact.

  In the above embodiment, the case where the imaging apparatus is an MRI apparatus has been described as an example, but other imaging apparatuses, such as a CT apparatus or a PET apparatus, can be adopted instead. Furthermore, although the present embodiment describes a surgery support apparatus that incorporates an imaging apparatus such as an MRI apparatus, the imaging apparatus is not essential to the surgery support apparatus of the present invention: the image creation unit may also take in three-dimensional volume image data captured by an external imaging apparatus and create the surgery support image from it. In this case, the coordinates of the three-dimensional volume image must be associated with the real-space coordinates in which the operation is performed. Such association can be carried out, for example, by imaging the subject with fixed markers that can be detected by the imaging device capturing the three-dimensional volume image, and then detecting the positions of those markers with a three-dimensional position detector in the real space where the surgery is performed.
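The marker-based association of image coordinates with real-space coordinates can be sketched as a rigid point-set registration. This is a sketch under the assumption that the same markers (at least three, non-collinear) are located both in the volume image and by the three-dimensional position detector; the function name `rigid_registration` and the Kabsch/SVD method are assumptions, not quoted from the patent.

```python
import numpy as np

def rigid_registration(image_pts, world_pts):
    """Least-squares rigid transform (Kabsch/SVD): find rotation R and
    translation t with  R @ image_pts[i] + t  ~=  world_pts[i],
    mapping marker positions in volume-image coordinates to the
    real-space coordinates measured by the 3-D position detector."""
    src_c = image_pts - image_pts.mean(axis=0)   # centre both point sets
    dst_c = world_pts - world_pts.mean(axis=0)
    U, _, Vt = np.linalg.svd(src_c.T @ dst_c)    # SVD of the cross-covariance
    d = np.sign(np.linalg.det(Vt.T @ U.T))       # guard against a reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = world_pts.mean(axis=0) - R @ image_pts.mean(axis=0)
    return R, t
```

The recovered (R, t) pair is then applied to every voxel of the surgery support image before superimposition.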

  In this embodiment, the case where the position of the robot to which the surgical tool is fixed is detected by the same three-dimensional position detector that detects the endoscope position has been described. However, if the robot control system holds the position information of the robot arm, that information can be used instead. In that case as well, by registering the robot arm position in the processing unit of the three-dimensional position detector using an indicator the detector can detect, and by creating a coordinate conversion matrix between the robot coordinates and the coordinates of the three-dimensional position detection device, it becomes possible to create a surgery support image and superimpose it on the endoscopic image using the position information from the robot control system, just as when the robot position is detected by the three-dimensional position detection device.
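Using robot-arm positions from the robot control system then reduces to composing homogeneous transforms. The following is a minimal sketch, assuming the calibration described above has produced a 4x4 matrix `T_detector_robot` (robot coordinates to position-detector coordinates) and that the coordinate integration unit supplies a second matrix `T_image_detector`; all names are hypothetical.

```python
import numpy as np

def homogeneous(R, t):
    """Pack a 3x3 rotation and a translation into a 4x4 homogeneous transform."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

def robot_point_in_image(T_image_detector, T_detector_robot, p_robot):
    """Map a robot-arm position (robot coordinates) into the image
    coordinates used for the surgery support image by composing the
    calibrated transform chain."""
    T_image_robot = T_image_detector @ T_detector_robot
    p = np.append(p_robot, 1.0)               # homogeneous point
    return (T_image_robot @ p)[:3]
```

Once `T_detector_robot` is calibrated with the indicator, every position report from the robot control system can be mapped through this chain without querying the optical detector again.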

A diagram schematically showing the overall external appearance of the surgery support apparatus of the present invention
A block diagram showing one embodiment of the surgery support apparatus of the present invention
A flowchart showing the surgical procedure using the surgery support apparatus of the present invention
A diagram explaining the segmentation and projection processing in the image creation unit
A diagram explaining the superimposition of the surgery support image created by the image creation unit and the endoscopic image
A diagram explaining the relationship between a change in the endoscope position and the projection processing
A diagram showing an example of the GUI of the operation unit in the surgery support apparatus of the present invention

Explanation of symbols

DESCRIPTION OF SYMBOLS
10 ... Imaging device
20 ... Endoscope
30 ... Small surgical robot
40 ... Three-dimensional position detection device
50 ... Image creation unit
60 ... Control unit
70 ... Display device
80 ... Operation unit

Claims (7)

  1. A surgery support apparatus comprising:
    an endoscope;
    display means for displaying an image of the endoscope;
    image creation means for creating, using three-dimensional volume image data of a subject, a surgery support image to be displayed superimposed on the video displayed on the display means;
    three-dimensional position detection means for sequentially detecting the position of the endoscope; and
    coordinate integration means for integrating the coordinates of the three-dimensional position detection means with the image coordinates of the image creation means,
    wherein the image creation means includes a memory for storing, in the image coordinates, three-dimensional specific region image data of each organ or part extracted from the three-dimensional volume image data, accepts a display selection of a desired organ or region among the three-dimensional specific region image data, creates, based on the position information of the endoscope obtained from the coordinate integration means and for the three-dimensional specific region image data selected for display, a specific region projection image whose projection plane is a specific plane determined by the endoscope position information, and displays the projection image superimposed on the endoscope image at the time of endoscope position detection.
  2. The surgery support apparatus according to claim 1,
    wherein the projection image created by the image creation means is a projection image on a plane that includes the focal position of the endoscope and is orthogonal to the traveling direction of the endoscope.
  3. The surgery support apparatus according to claim 1 or 2,
    wherein the three-dimensional position detection means also detects the position of a surgical instrument applied to the subject, and
    the image creation means, based on the surgical tool position information detected by the three-dimensional position detection means and converted by the coordinate integration means, superimposes a surgical tool image on the surgery support image at a position corresponding to the detected surgical tool position and displays it on the display means.
  4. The surgery support apparatus according to any one of claims 1 to 3,
    wherein the image creation means creates the projection image of the specific region by changing its projection plane in real time in accordance with changes in the endoscope image accompanying movement of the endoscope, and displays the projection image on the display means.
  5. The surgery support apparatus according to any one of claims 1 to 4, further comprising calculation means for calculating the distance between the specific region and a surgical instrument,
    wherein the apparatus causes the display means to display the distance calculated by the calculation means and/or a warning when the distance falls within a threshold value.
  6. The surgery support apparatus according to any one of claims 1 to 5,
    further comprising imaging means for acquiring the three-dimensional volume image data.
  7. The surgery support apparatus according to claim 6,
    wherein the imaging means also serves as the image creation means.
JP2005189842A 2005-06-29 2005-06-29 Surgery support device Active JP4152402B2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2005189842A JP4152402B2 (en) 2005-06-29 2005-06-29 Surgery support device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
JP2005189842A JP4152402B2 (en) 2005-06-29 2005-06-29 Surgery support device

Publications (2)

Publication Number Publication Date
JP2007007041A JP2007007041A (en) 2007-01-18
JP4152402B2 true JP4152402B2 (en) 2008-09-17

Family

ID=37746280

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2005189842A Active JP4152402B2 (en) 2005-06-29 2005-06-29 Surgery support device

Country Status (1)

Country Link
JP (1) JP4152402B2 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2013145730A1 (en) 2012-03-29 2013-10-03 Panasonic Corporation Surgery assistance device and surgery assistance program

Families Citing this family (20)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2173248B1 (en) * 2007-04-16 2014-11-05 NeuroArm Surgical, Ltd. Methods, devices, and systems relating to cameras configured to be positioned within the bore of a magnet and mr bore space illumination
JP5410021B2 (en) * 2008-01-22 2014-02-05 株式会社日立メディコ Medical diagnostic imaging equipment
JP5154961B2 (en) * 2008-01-29 2013-02-27 テルモ株式会社 Surgery system
JP5561458B2 (en) * 2008-03-18 2014-07-30 国立大学法人浜松医科大学 Surgery support system
JP5823096B2 (en) * 2009-02-10 2015-11-25 株式会社東芝 X-ray diagnostic apparatus and image processing method
BRPI1007726A2 (en) * 2009-05-18 2017-01-31 Koninl Philips Electronics Nv Image-to-image registration method, Image-to-image registration system, Guided endoscopy camera position calibration method and Guided endoscopy camera calibration system
JP5551957B2 (en) * 2010-03-31 2014-07-16 富士フイルム株式会社 Projection image generation apparatus, operation method thereof, and projection image generation program
JP5485853B2 (en) * 2010-10-14 2014-05-07 株式会社日立メディコ Medical image display device and medical image guidance method
EP2829218B1 (en) 2012-03-17 2017-05-03 Waseda University Image completion system for in-image cutoff region, image processing device, and program therefor
JP5807826B2 (en) * 2012-03-29 2015-11-10 パナソニックヘルスケア株式会社 Surgery support device and surgery support program
EP2837326A4 (en) 2012-09-07 2016-02-24 Olympus Corp Medical apparatus
GB2505926A (en) * 2012-09-14 2014-03-19 Sony Corp Display of Depth Information Within a Scene
JP6221166B2 (en) * 2012-10-22 2017-11-01 ジーイー・メディカル・システムズ・グローバル・テクノロジー・カンパニー・エルエルシー Display device, medical device, and program
JP6265627B2 (en) * 2013-05-23 2018-01-24 オリンパス株式会社 Endoscope apparatus and method for operating endoscope apparatus
JP5781135B2 (en) * 2013-09-27 2015-09-16 エフ・エーシステムエンジニアリング株式会社 3D navigation video generation device
JP5611441B1 (en) 2013-11-29 2014-10-22 スキルアップジャパン株式会社 Image processing apparatus for microscope and medical microscope system
EP3084747A4 (en) 2013-12-20 2017-07-05 Intuitive Surgical Operations, Inc. Simulator system for medical procedure training
JP2015220643A (en) * 2014-05-19 2015-12-07 株式会社東芝 Stereoscopic observation device
EP3258871B1 (en) * 2015-02-20 2018-10-24 Koninklijke Philips N.V. Medical system, apparatus and method for shape sensing
WO2019155931A1 (en) * 2018-02-09 2019-08-15 ソニー株式会社 Surgical system, image processing device, and image processing method

Also Published As

Publication number Publication date
JP2007007041A (en) 2007-01-18

Similar Documents

Publication Publication Date Title
US8248413B2 (en) Visual navigation system for endoscopic surgery
US6675032B2 (en) Video-based surgical targeting system
KR101296215B1 (en) Method and system for performing 3-d tool tracking by fusion of sensor and/or camera derived data during minimally invasive robotic surgery
US10357319B2 (en) Robotic system display method for displaying auxiliary information
CN102802550B (en) The medical robotic system of the auxiliary view comprising the movement limit scope of extending the connected apparatus entering guide far-end is provided
EP1103229B1 (en) System and method for use with imaging devices to facilitate planning of interventional procedures
CN102170835B (en) Medical robotic system providing computer generated auxiliary views of a camera instrument for controlling the positioning and orienting of its tip
US7824328B2 (en) Method and apparatus for tracking a surgical instrument during surgery
EP1395194B1 (en) A guide system
US8792963B2 (en) Methods of determining tissue distances using both kinematic robotic tool position information and image-derived position information
US8108072B2 (en) Methods and systems for robotic instrument tool tracking with adaptive fusion of kinematics information and image information
US10013808B2 (en) Surgeon head-mounted display apparatuses
US6690960B2 (en) Video-based surgical targeting system
US9107698B2 (en) Image annotation in image-guided medical procedures
US20020082498A1 (en) Intra-operative image-guided neurosurgery with augmented reality visualization
US20060079745A1 (en) Surgical navigation with overlay on anatomical images
US20080071143A1 (en) Multi-dimensional navigation of endoscopic video
US8147503B2 (en) Methods of locating and tracking robotic instruments in robotic surgical systems
US6612980B2 (en) Anatomical visualization system
JP6293777B2 (en) Collision avoidance during controlled movement of image capture device and operable device movable arm
DE60212313T2 (en) Apparatus for ultrasound imaging of a biopsy cannula
US20040254454A1 (en) Guide system and a probe therefor
KR101320379B1 (en) Auxiliary image display and manipulation on a computer display in a medical robotic system
KR20080089376A (en) Medical robotic system providing three-dimensional telestration
KR101720047B1 (en) Virtual measurement tool for minimally invasive surgery

Legal Events

Date Code Title Description
A977 Report on retrieval

Free format text: JAPANESE INTERMEDIATE CODE: A971007

Effective date: 20070904

A131 Notification of reasons for refusal

Free format text: JAPANESE INTERMEDIATE CODE: A131

Effective date: 20070911

A521 Written amendment

Free format text: JAPANESE INTERMEDIATE CODE: A523

Effective date: 20071109

A131 Notification of reasons for refusal

Free format text: JAPANESE INTERMEDIATE CODE: A131

Effective date: 20080408

A521 Written amendment

Free format text: JAPANESE INTERMEDIATE CODE: A523

Effective date: 20080605

TRDD Decision of grant or rejection written
A01 Written decision to grant a patent or to grant a registration (utility model)

Free format text: JAPANESE INTERMEDIATE CODE: A01

Effective date: 20080701

A61 First payment of annual fees (during grant procedure)

Free format text: JAPANESE INTERMEDIATE CODE: A61

Effective date: 20080701

R150 Certificate of patent or registration of utility model

Free format text: JAPANESE INTERMEDIATE CODE: R150

FPAY Renewal fee payment (event date is renewal date of database)

Free format text: PAYMENT UNTIL: 20110711

Year of fee payment: 3

FPAY Renewal fee payment (event date is renewal date of database)

Free format text: PAYMENT UNTIL: 20120711

Year of fee payment: 4

FPAY Renewal fee payment (event date is renewal date of database)

Free format text: PAYMENT UNTIL: 20130711

Year of fee payment: 5

S533 Written request for registration of change of name

Free format text: JAPANESE INTERMEDIATE CODE: R313533

S111 Request for change of ownership or part of ownership

Free format text: JAPANESE INTERMEDIATE CODE: R313111

R350 Written notification of registration of transfer

Free format text: JAPANESE INTERMEDIATE CODE: R350