JP4472085B2 - Surgical navigation system - Google Patents


Info

Publication number
JP4472085B2
Authority
JP
Japan
Prior art keywords
position
image
unit
surgical
image information
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Fee Related
Application number
JP2000017135A
Other languages
Japanese (ja)
Other versions
JP2001204738A (en)
Inventor
浩二 下村
剛明 中村
均 唐沢
敬司 塩田
正宏 工藤
康雄 後藤
昌章 植田
俊哉 菅井
Original Assignee
オリンパス株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by オリンパス株式会社
Priority claimed from JP2000017135A
Publication of JP2001204738A
Application granted
Publication of JP4472085B2
Application status: Expired - Fee Related
Anticipated expiration

Description

[0001]
BACKGROUND OF THE INVENTION
The present invention relates to a surgical navigation system that guides the direction of a surgical operation by displaying the position of a treatment device in use, such as a surgical instrument, on biological image information shown on a display unit.
[0002]
[Prior art]
Conventionally, as disclosed for example in JP-A-5-305073, tomographic images obtained by CT (computed tomography) or MRI (magnetic resonance imaging) are synthesized by a computer and displayed on a display unit such as a monitor, either as tomographic slices or as a stereoscopic view. In addition, the shapes of treatment devices used in surgery, such as surgical instruments and endoscopes, are calibrated in advance, position-detection markers are attached to these devices, and their positions are detected externally using infrared light, so that the position of the device in use is displayed on the above-mentioned biological image information. Devices that navigate in this way have been developed.
[0003]
Japanese Patent Publication No. 6-43893, Japanese Patent Publication No. 3-28686, and the like disclose techniques for recognizing the position and shape of a subject. In addition, Japanese Patent Application Laid-Open No. 7-261094 discloses a surgical microscope capable of displaying an endoscopic image picture-in-picture within the microscope image.
[0004]
In addition, by using an image mixer, a plurality of images can easily be combined and displayed in a desired form on a TV monitor, a head-mounted display, or the like.
[0005]
[Problems to be solved by the invention]
However, the conventional configuration has the following problems.
(1) It is impossible to display where in the living body the tissue seen in the endoscopic image on the monitor is located.
[0006]
(2) Treatment tools generally used for treating the surgical site in the patient's body, and treatment devices such as endoscopes, are displayed only as simplified bar-shaped marks on the microscope image on the monitor. For this reason, the shape of the treatment device cannot be displayed accurately. In particular, when part of the treatment device lies outside the microscope image, that part may be in contact with living tissue without the operator noticing. In such a case, there is a risk of inadvertently damaging the tissue.
[0007]
(3) When a plurality of treatment devices are prepared in advance and the treatment proceeds while exchanging the device in use, calibration must be repeated every time a device is exchanged. Because exchanging treatment devices therefore takes time, it is difficult to perform the operation efficiently.
[0008]
The present invention has been made in view of the above circumstances. Its object is to provide a surgical navigation system that can display, when an endoscope is used, the position on the living tissue of the image obtained as the endoscopic image; that can synthesize and display the shape of a treatment device in use, such as a treatment instrument or an endoscope, on an image of the living tissue, thereby preventing a portion that lies outside, for example, the microscope image from inadvertently contacting the tissue; and that eliminates the need for recalibration even when the treatment instrument or another treatment device such as an endoscope is exchanged, so that the operation can be carried out efficiently.
[0009]
[Means for Solving the Problems]
The invention according to claim 1 is a surgical navigation system comprising: a surgical microscope for observing a surgical site in a patient's body; a position measuring unit for measuring the position of a treatment device used for treating the surgical site; an endoscope having a distance measuring means for measuring the distance from the distal end of an insertion portion inserted into the patient's body to a target of the surgical site; an image information display unit that displays image information of the surgical site in the patient's body; a storage unit that stores biological image information measured in advance; and an image information composition unit that combines the position information from the position measuring unit, the distance information from the distance measuring means, and the biological image information recorded in advance in the storage unit to construct a reference image, and that composites and displays, on the screen of the image information display unit, the positions of the endoscope and the surgical microscope, the distance to the target, and information on targets that lie in blind spots of the surgical microscope within the surgical site, as analyzed from the endoscopic image data.
In the invention of claim 1, the position of the treatment device used for treating the surgical site in the patient's body is measured by the position measuring unit, and the distance from the distal end of the insertion portion inserted into the patient's body to the target of the surgical site is measured by the distance measuring means of the endoscope. Biological image information measured in advance is held in the storage unit, and the position information from the position measuring unit, the distance information from the distance measuring means, and the biological image information recorded in the storage unit are combined by the image information composition unit to construct a reference image. Furthermore, the positions of the endoscope and the surgical microscope, the distance to the target, and information on targets that lie in blind spots of the surgical microscope within the surgical site, analyzed from the endoscopic image data, are composited and displayed on the screen of the image information display unit.
[0010]
The invention according to claim 2 is the surgical navigation system according to claim 1, wherein the image information composition unit has means for storing the shape of the treatment device in advance, combines the position and shape information of the treatment device with the biological image information recorded in advance, and displays the result on the screen of the image information display unit so that the shape and position of the device can be discerned.
In the invention of claim 2, the shape of the treatment device is stored in advance in the image information composition unit, the position and shape information of the treatment device is combined with the biological image information recorded in advance, and the shape and position of the device are displayed so as to be distinguishable on the screen of the image information display unit.
[0011]
The invention according to claim 3 is a surgical navigation system comprising: a surgical microscope for observing a surgical site in a patient's body; a position measuring unit for measuring the position of a treatment device used for treating the surgical site; an endoscope having a distance measuring means for measuring the distance from the distal end of an insertion portion inserted into the patient's body to a target of the surgical site; an image information display unit that displays image information of the surgical site in the patient's body; a storage unit that stores biological image information measured in advance; and an image information composition unit that combines the position information from the position measuring unit, the distance information from the distance measuring means, and the biological image information recorded in advance in the storage unit to construct a reference image, and that composites and displays, on the screen of the image information display unit, the positions of the endoscope and the surgical microscope, the distance to the target, and information on targets that lie in blind spots of the surgical microscope within the surgical site, as analyzed from the endoscopic image data; wherein a plurality of treatment devices are selectively exchanged as appropriate for use in treating the surgical site in the patient's body, and each treatment device is provided, at a predetermined set position from its distal end, with a marker for identification by the position measuring unit.
In the invention of claim 3, the position of the treatment device used for treating the surgical site in the patient's body is measured by the position measuring unit, and the distance from the distal end of the insertion portion inserted into the patient's body to the target of the surgical site is measured by the distance measuring means of the endoscope. Biological image information measured in advance is held in the storage unit, and the position information from the position measuring unit, the distance information from the distance measuring means, and the biological image information recorded in the storage unit are combined by the image information composition unit to construct a reference image. The positions of the endoscope and the surgical microscope, the distance to the target, and information on targets that lie in blind spots of the surgical microscope within the surgical site, analyzed from the endoscopic image data, are composited and displayed on the screen of the image information display unit. Furthermore, when a plurality of treatment devices are selectively exchanged for treating the surgical site, the position measuring unit can identify the type of each treatment device from the marker located at the predetermined set position from its tip.
[0012]
DETAILED DESCRIPTION OF THE INVENTION
Hereinafter, a first embodiment of the present invention will be described with reference to FIGS. FIG. 1 shows the schematic configuration of the entire system of a surgical microscope using the surgical navigation system of the present embodiment. In FIG. 1, reference numeral 1 denotes a surgical microscope installed in the operating room, 2 denotes the mirror body of the surgical microscope 1, and 61 denotes a surgical bed on which a patient 32 is placed. The gantry 3 of the surgical microscope 1 comprises a base 4 that can move on the floor and a support column 5 standing on the base 4. The gantry 3 is disposed at the head end of the surgical bed 61 in the operating room (that is, the side where the head 32a of the patient 32 on the bed 61 is located).
[0013]
Further, a support mechanism 62 that supports the mirror body 2 of the surgical microscope 1 so as to be movable in an arbitrary direction is provided on the upper portion of the support column 5. The support mechanism 62 is provided with a first arm 6, a second arm 7, and a third arm 8. Here, an illumination light source (not shown) is built in the first arm 6. One end portion of the first arm 6 is attached to the upper portion of the support column 5 so as to be rotatable about a substantially vertical axis O1.
[0014]
Furthermore, one end of the second arm 7 is attached to the other end of the first arm 6 so as to be rotatable about a substantially vertical axis O2. The second arm 7 is formed by a pantograph arm comprising a link mechanism and a spring member for balance adjustment, and can be moved up and down.
[0015]
Further, one end of the third arm 8 is attached to the other end of the second arm 7 so as to be rotatable about a substantially vertical axis O3, and the mirror body 2 of the surgical microscope 1 is connected to the other end of the third arm 8. The third arm 8 is also supported so as to be rotatable about two axes O4 and O5 lying in mutually orthogonal directions on a substantially horizontal plane. The mirror body 2 is thus supported by the third arm 8 so that it can be tilted in the front-rear direction relative to the operator's observation direction about the axis O4, and in the left-right direction about the axis O5.
[0016]
In addition, an electromagnetic brake (not shown) is provided on the bearing portion of each of the rotation shafts O1 to O5 of the support mechanism 62. This electromagnetic brake is connected to an electromagnetic brake power circuit (not shown) built in the column 5. Further, this electromagnetic brake power supply circuit is connected to a switch 10 provided on a grip 9 integrally fixed to the mirror body 2 as shown in FIG.
[0017]
The electromagnetic brakes of the rotation shafts O1 to O5 are switched on and off by the switch 10. For example, when the switch 10 is turned on, the electromagnetic brakes of the rotation shafts O1 to O5 are released, the support mechanism 62 is held in the unlocked state, and the position of the mirror body 2 can be adjusted freely in space. When the switch 10 is turned off, the electromagnetic brakes engage, the support mechanism 62 is switched to the locked state, and the position of the mirror body 2 is fixed.
[0018]
FIG. 3 shows the schematic configuration of the mirror body 2 of the surgical microscope 1. The mirror body 2 is provided with one objective lens 11 and a pair of left and right observation optical systems 14A and 14B. A variable magnification optical system 12, left and right imaging lenses 13a and 13b, and left and right eyepieces 14a and 14b are arranged in sequence on the observation optical axes of the left and right observation optical systems 14A and 14B. The pair of left and right observation optical systems 14A and 14B constitute a stereoscopic observation optical system.
[0019]
Further, the image forming surfaces of the imaging lenses 13a and 13b are disposed at the focal positions of the eyepiece lenses 14a and 14b, respectively. Reference numeral 16 in FIG. 3 denotes a position sensor that detects the lens position of the objective lens 11. The objective lens 11 is connected to a motor (not shown) and supported so as to be movable in the optical axis direction, and its position along the optical axis can be detected by the position sensor 16.
[0020]
Further, in the system of the surgical microscope 1 of the present embodiment, as shown in FIG. 4, treatment devices such as a treatment tool 33 (for example, a bipolar probe, an ultrasonic aspirator, or forceps) and an endoscope 34 are used together with the surgical microscope 1 during surgery. The surgical navigation system of the present embodiment further includes a substantially C-shaped head frame 35 that surrounds the head 32a of the patient 32, and a position detection sensor (position measuring unit) 36 that measures the positions of treatment devices such as the treatment tool 33 and the endoscope 34 used to treat the surgical site in the body of the patient 32.
[0021]
Here, the head frame 35 is fixed to the skull of the patient 32, and is also fixed to the surgical bed 61 by a fixing frame (not shown) so that it cannot move.
[0022]
The head frame 35 is provided with a plurality of head frame markers 37. As the head frame marker 37, an active type that emits infrared light or a passive marker that is a simple protrusion is arbitrarily selected and used.
[0023]
Further, the position detection sensor 36 is connected to a position and each set value calculation circuit 38, to which an image composition circuit (image information composition unit) 39 is in turn connected. Connected to the image composition circuit 39 are a storage device 40, such as the storage unit of a workstation, various input devices (controllers) such as a keyboard 41, a mouse 42, a touch pen 43, and a laser pointer (not shown), and various display devices (image information display units) such as a TV monitor 44 and a head mounted display 45.
[0024]
The head frame 35 is attached to the head 32a of the patient 32 in advance before surgery. Then, a tomographic image of the living body is photographed by MRI, CT or the like with the head frame 35 attached, and the biological image data 46 photographed at this time is recorded in the storage device 40. As a result, the position data of the head frame 35 is synthesized with the biological image data 46, and subsequent navigation can be performed with the head frame 35 as a reference.
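The patent does not specify how the head-frame position data is synthesized with the biological image data 46. As an illustrative sketch only (function and variable names are hypothetical), pairing the measured head-frame marker positions with their locations in the MRI/CT volume amounts to a paired-point rigid registration, which can be solved with the Kabsch algorithm:

```python
import numpy as np

def register_rigid(frame_pts, image_pts):
    """Least-squares rigid registration (Kabsch algorithm): find R, t such
    that R @ frame_pts[i] + t approximates image_pts[i] for each paired
    marker position (head-frame coordinates vs. image coordinates)."""
    a = np.asarray(frame_pts, float)
    b = np.asarray(image_pts, float)
    ca, cb = a.mean(axis=0), b.mean(axis=0)
    # Cross-covariance of the centered point sets
    H = (a - ca).T @ (b - cb)
    U, _, Vt = np.linalg.svd(H)
    # Guard against a reflection solution (det = -1)
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = cb - R @ ca
    return R, t
```

Once R and t are known, any position measured relative to the head frame 35 can be mapped into the coordinate system of the biological image data, so subsequent navigation can indeed use the head frame as the reference.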
[0025]
Further, the position detection sensor 36 has an infrared sensor matched to the head frame marker 37 of the head frame 35 described above and to the other marker types described later, and can detect the position and orientation of each marker.
[0026]
In the endoscope 34 for observing the inside of the living body, a proximal operation portion 34b is connected to the proximal end of an elongated insertion portion 34a that is inserted into the patient's body. The operation portion 34b is provided with a plurality of endoscope markers 47, which function in the same way as the head frame markers 37. By performing calibration in advance, the position detection sensor 36 can capture the endoscope markers 47 and thereby determine the relative position and orientation of the endoscope 34 with respect to the head frame 35.
[0027]
Furthermore, the type of the endoscope 34 can be identified from the arrangement of the plurality of endoscope markers 47, or from a combination of active and passive markers. Data such as the shape, viewing direction, and angle of view of the endoscope 34, input in advance, are then read out, and the outer shape, viewing direction, and angle of view of the endoscope 34 are displayed on a display device such as the TV monitor 44 or the head mounted display 45.
[0028]
A model identification marker 49 is provided at the proximal end of the insertion portion 34a of the endoscope 34. The model identification marker 49 makes it possible to determine which model the endoscope 34 is, and hence to retrieve per-model data such as oblique or direct view, angle of view, and shape dimensions. The model identification marker 49 is distinguished by, for example, color or number, and the position detection sensor 36 determines its position and type.
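The model identification described above is essentially a lookup from a detected marker code to pre-registered model data. The following sketch is purely illustrative; all names, codes, and values are hypothetical, since the patent does not specify a data format:

```python
from dataclasses import dataclass

@dataclass
class EndoscopeModel:
    name: str
    view: str              # "direct" or "oblique"
    angle_of_view: float   # degrees
    shape_length: float    # mm, insertion-portion effective length

# Hypothetical registry keyed by the code read from the model
# identification marker (e.g. a color or a number).
MODEL_TABLE = {
    "red-1": EndoscopeModel("scope-A", "direct", 70.0, 180.0),
    "blue-2": EndoscopeModel("scope-B", "oblique", 90.0, 250.0),
}

def identify(marker_code):
    """Return the pre-registered model data for a detected marker code,
    so the display can show the correct outline, viewing direction, and
    angle of view without recalibration."""
    return MODEL_TABLE[marker_code]
```

With such a table, exchanging scopes reduces to detecting the new marker code and re-reading the stored model data.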
[0029]
In addition, current endoscopes are generally provided with a focus adjustment function, a zoom function, and the like. Therefore, the setting data of these functions of the endoscope 34 according to the present embodiment is sent to the position / setting value calculation circuit 38 and used for construction of a navigation image.
[0030]
Furthermore, the endoscope 34 used with the surgical navigation system of the present embodiment has a distance detection function (distance measuring means), for example using spot light and an ultrasonic sensor as disclosed in Japanese Patent Laid-Open No. 3-28686, previously filed by the present applicant. As shown in FIG. 6, the distance S1 between the distal end of the insertion portion 34a of the endoscope 34 and the object (target of the surgical site) 48 can thus be measured.
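Given the tip pose of the endoscope (from the markers) and the measured distance S1, the position of the object 48 can be estimated by stepping S1 along the viewing axis. This is a minimal geometric sketch under that assumption; the names are hypothetical and not from the patent:

```python
import numpy as np

def target_position(tip_pos, view_dir, s1):
    """Estimate the position of the object (target 48) in head-frame
    coordinates: start at the endoscope tip and move the measured
    distance s1 along the (normalized) viewing direction."""
    d = np.asarray(view_dir, float)
    d = d / np.linalg.norm(d)
    return np.asarray(tip_pos, float) + s1 * d
```

The resulting point can then be drawn into the 3D and 2D navigation views in the same coordinate system as the biological image data.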
[0031]
Further, in the surgical microscope 1 used together with the endoscope 34, as shown in FIG. 2, a plurality of signal members are provided at predetermined positions on the side surface of the mirror body 2; in this embodiment, three microscope markers 18a, 18b, and 18c are fixed integrally. As with the other markers, an active type that emits infrared rays or a passive marker consisting of a simple protrusion may be selected as desired. By detecting the microscope markers 18a, 18b, and 18c with the position detection sensor 36, the position of the mirror body 2 can be determined either from the relative position between the head frame 35 and the mirror body 2 together with the rotation angle of each arm as variables, or directly from the relative position of the markers 18a to 18c with respect to the head frame 35.
[0032]
Note that the treatment instrument 33 such as a bipolar probe, an ultrasonic aspirator, and forceps is also provided with a treatment instrument marker 50 having the same configuration as the head frame marker 37 and the endoscope marker 47.
[0033]
Next, the operation of the above configuration will be described. When using the surgical navigation system of the present embodiment, the biological image data 46 taken in advance before surgery is transmitted from the storage device 40 to the image composition circuit 39, and a reference image is constructed. The image at this time may be a 3D image, a 2D image (front, side, or top view) as described later, or a combination of these as necessary. As long as the 3D structure inside the skull (brain) can be discerned, the display format of the 3D image is not particularly limited; it may be a wire-frame image or a 3D skeleton display.
[0034]
Thereafter, the operator 51 grasps the grip 9 of the mirror body 2 of the surgical microscope 1 and presses the switch 10 to release the electromagnetic brakes on the axes O1 to O5, then moves the mirror body 2 so that the focal position is placed at the surgical site to be observed.
[0035]
A light beam emitted from the surgical site during observation with the surgical microscope 1 enters the mirror body 2. The light beam entering through the objective lens 11 is observed at the desired magnification through the variable magnification optical system 12, the imaging lenses 13a and 13b, and the eyepieces 14a and 14b. When the focal position does not coincide with the observed site, the objective lens 11 is driven by the motor (not shown) to perform focusing.
[0036]
During observation with the surgical microscope 1, the position detection sensor 36 detects the microscope markers 18a, 18b, and 18c on the mirror body 2, and the detection signals are transmitted to the position and each set value calculation circuit 38 for signal processing. In this way, the position and orientation of the mirror body 2 in the living body coordinate system are detected.
[0037]
Further, the position information of the objective lens 11 is transmitted by the position sensor 16 to the position and each set value calculation circuit 38. The circuit calculates the relative position of the focal point with respect to the mirror body 2 from the position information of the objective lens 11, and then calculates the focal position in the living body coordinate system from the position and orientation of the mirror body 2 in that coordinate system and the relative position of the focal point with respect to the mirror body 2. The output signal from the position and each set value calculation circuit 38 is input to the image composition circuit 39, and the focal position is displayed, superimposed in the living body coordinate system, on the image based on the three-dimensional image data on a display device such as the TV monitor 44 or the head mounted display 45.
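The calculation just described is a chain of coordinate transforms: the sensor observes both the head frame and the mirror body, and the focal point is known in mirror-body coordinates. A minimal sketch with 4x4 homogeneous transforms (names hypothetical; the patent does not give the math explicitly):

```python
import numpy as np

def pose(R, t):
    """Build a 4x4 homogeneous transform from rotation R and translation t."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

def focal_in_body_coords(T_sensor_frame, T_sensor_mirror, focal_local):
    """Map the focal point (known in mirror-body coordinates, via the
    objective-lens position sensor 16) into the head-frame / living-body
    coordinate system through the position detection sensor's frame."""
    p = np.append(np.asarray(focal_local, float), 1.0)
    # frame <- sensor <- mirror: compose the two observed poses
    T_frame_mirror = np.linalg.inv(T_sensor_frame) @ T_sensor_mirror
    return (T_frame_mirror @ p)[:3]
```

For instance, if both poses are pure translations along z (sensor-to-frame 100 mm, sensor-to-mirror 300 mm) and the focal point lies 250 mm ahead of the mirror body, the focal position ends up 450 mm from the head frame along z.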
[0038]
Through the above operation, the operator 51 views the screen of the display device such as the TV monitor 44 or the head mounted display 45 and can observe an image in which the focal position is superimposed on the image of the surgical site based on the three-dimensional image data. By observing this display, the observation position of the microscope 1 can be known on the image based on the three-dimensional image data.
[0039]
Further, during the operation, treatment devices such as the treatment tool 33 and the endoscope 34 are inserted into the surgical site of the patient's head 32a. For example, when the treatment tool 33 is inserted into the surgical site, the position detection sensor 36 detects the position of the head frame 35 and establishes the reference three-dimensional coordinates. The position of the treatment tool 33 is then detected with respect to these reference coordinates, the type of each treatment tool 33 is recognized as described above, and the type data of the treatment tool 33 is transmitted to the position and each set value calculation circuit 38. The shape data of the treatment tool 33 recorded in advance in the storage device 40 is then read out and transmitted to the image composition circuit 39.
[0040]
When the endoscope 34 is used as the treatment device, its data is transmitted to the image composition circuit 39 in the same way as for the treatment tool 33, and the endoscopic image is transmitted as well.
[0041]
Further, the data of the mirror body 2 of the surgical microscope 1 is also transmitted to the image composition circuit 39 as described above. The microscope image is likewise transmitted to the image composition circuit 39, and a brain image without blind spots is constructed by combining it with the endoscopic image and the image based on the biological image data 46 described above.
[0042]
Furthermore, the shape data and position data of each treatment tool 33 and endoscope 34 are synthesized onto the brain image. The synthesized image is displayed on a display device such as the TV monitor 44 or the head mounted display 45 and, as necessary, is inserted picture-in-picture into the microscope image described later.
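At its simplest, the picture-in-picture step pastes one rendered view into a corner of another. The following is an illustrative sketch with arrays standing in for video frames (the patent's image mixer hardware is not specified):

```python
import numpy as np

def picture_in_picture(main_img, inset_img, top, left):
    """Composite an inset view (e.g. the synthesized navigation image)
    into the main microscope image; both are HxWx3 uint8 arrays.
    The main image is not modified in place."""
    out = main_img.copy()
    h, w = inset_img.shape[:2]
    out[top:top + h, left:left + w] = inset_img
    return out
```

A real mixer would also handle scaling and borders, but the coordinate bookkeeping is the same.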
[0043]
These controls are instructed by a controller such as a keyboard 41, a mouse 42, a touch pen 43, or a laser pointer (not shown).
[0044]
FIG. 7 shows a display example of the display screen of the TV monitor 44. In the left half of the display screen, the 3D display unit 52 is arranged, and the surgical site of the patient's head 32a is displayed in 3D. A top image display unit 53, a side image display unit 54, and a front image display unit 55 are arranged in the upper right part of the display screen, and a microscope image display unit 56 is arranged in the lower right part. The top image display unit 53 shows a 2D top view of the surgical site of the patient's head 32a, the side image display unit 54 shows a 2D side view, the front image display unit 55 shows a 2D front view, and the microscope image display unit 56 shows a 2D microscope image of the surgical site.
[0045]
In addition, the 3D display unit 52 uses a wire-frame display to show, three-dimensionally and in combination, the state of the craniotomy, the interior of the head, and the object (such as a tumor). The endoscope 34 and other devices in use are displayed as three-dimensional outlines so that their insertion status, their positional relationship with the object 48, and their contact with surrounding living tissue can be judged. The top image display unit 53, the side image display unit 54, and the front image display unit 55 show similarly synthesized 2D images so that these conditions can be confirmed more reliably.
[0046]
Further, the microscope image display unit 56 shows the state of the object 48 that lies in a blind spot and cannot be seen from the mirror body 2 of the surgical microscope 1; this is synthesized and displayed by analysis from data such as the positions of the endoscope 34 and the surgical microscope 1 described above, the focal position, the distance S1 to the object, the zoom setting, and the endoscopic image. An endoscopic image display unit 57 is arranged above the microscope image display unit 56 and shows the endoscopic image from the endoscope 34.
[0047]
Therefore, the above configuration has the following effects. In the present embodiment, when the endoscope 34 is used, the position on the living tissue of the image obtained as the endoscopic image can be displayed on the display screen of the TV monitor 44.
[0048]
Furthermore, the shapes of the treatment devices in use, such as the treatment tool 33 and the endoscope 34, are stored, and the actual shape of each device is synthesized and displayed on the living tissue image on the display screen of the TV monitor 44. This makes it possible to prevent a part of a treatment device such as the treatment tool 33 or the endoscope 34 that lies outside, for example, the microscope image from inadvertently coming into contact with the living tissue.
[0049]
In addition, a plurality of endoscope markers 47 and treatment tool markers 50 are provided on the operation portion 34b of the endoscope 34 and on the treatment tools, and the type of the endoscope 34 or the treatment tool 33 can be identified from the arrangement of the markers 47 and 50, or from a combination of active and passive markers. Therefore, even when the treatment tool 33 or a treatment device such as the endoscope 34 is exchanged, no new calibration is required.
[0050]
FIG. 8 shows a second embodiment of the present invention. In the present embodiment, the configuration of the display screen displayed on the eyepiece portion of the mirror body 2 of the surgical microscope 1 according to the first embodiment (see FIGS. 1 to 7) is changed as follows.
[0051]
That is, the microscope image display screen 71 of the eyepiece portion of the mirror body 2 of the surgical microscope 1 of the present embodiment is provided with a microscope image display unit 72 that displays the actual microscope image obtained by the mirror body 2, and an object virtual display unit 73 that virtually displays the object 48 as in the first embodiment.
[0052]
Furthermore, an arbitrary image can be displayed together on the object virtual display unit 73 of the microscope image display screen 71 using picture-in-picture; for example, the position can be confirmed in a 3D display, or an endoscopic image can be shown.
[0053]
Therefore, in this embodiment, an operator viewing the microscope image display screen 71 at the eyepiece portion of the mirror body 2 of the surgical microscope 1 during the operation can confirm a necessary image without taking his or her eyes off the eyepiece portion.
[0054]
FIG. 9 shows a third embodiment of the present invention. In this embodiment, a plurality of endoscopes used together with the surgical microscope 1 of the first embodiment (see FIGS. 1 to 7) have insertion portions 34a of different effective lengths; here, two types of endoscopes 34A and 34B are provided.
[0055]
That is, in the present embodiment, the insertion portion 34a2 of one endoscope 34B is longer than the insertion portion 34a1 of the other endoscope 34A. An endoscope marker 47a is arranged at a fixed set position L1 from the distal end of the insertion portion 34a1 of the endoscope 34A. Similarly, an endoscope marker 47b is arranged at a fixed set position L2 from the distal end of the insertion portion 34a2 of the endoscope 34B. Here, the installation position L1 of the endoscope marker 47a on the insertion portion 34a1 of the endoscope 34A and the installation position L2 of the endoscope marker 47b on the insertion portion 34a2 of the endoscope 34B are set to the same position.
[0056]
Therefore, the above configuration has the following effect. That is, in the present embodiment, because the installation positions L1 and L2 of the endoscope markers 47a and 47b of the two types of endoscopes 34A and 34B are set to the same position, either endoscope can be used immediately, without recalibration, even if the two endoscopes 34A and 34B are exchanged during the operation.
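The reason a common marker position removes the need for recalibration can be sketched as follows: if every endoscope carries its marker at the same distance L from the distal end, the tip position follows from the tracked marker pose by a single fixed translation along the insertion axis. The numeric value of L and the vector representation below are illustrative assumptions.

```python
# Sketch of tip localization from a tracked marker, assuming a common
# marker-to-tip distance L1 = L2. The value of L_MM is invented.
L_MM = 120.0  # assumed common marker-to-tip distance in millimetres

def tip_position(marker_pos, axis_unit):
    """Tip = marker position advanced by L_MM along the insertion axis.
    `axis_unit` must be a unit vector pointing from marker toward tip."""
    return tuple(p + L_MM * a for p, a in zip(marker_pos, axis_unit))

# Swapping endoscope 34A for 34B changes the tracked marker_pos and
# axis_unit, but the same formula applies; no new calibration step.
print(tip_position((10.0, 0.0, 0.0), (0.0, 0.0, 1.0)))  # → (10.0, 0.0, 120.0)
```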
[0057]
Furthermore, the present invention is not limited to the above-described embodiment, and various modifications can be made without departing from the scope of the present invention.
Next, other characteristic technical matters of the present application are appended as follows.
(Appendix 1) A position display device comprising:
means for measuring the position of a device to be used;
an endoscope having distance detection means for measuring the distance from its tip to a target; and
means for displaying image information,
the position display device having a function of synthesizing the position information, the distance information, and biological image information recorded in advance, and displaying the result on a screen.
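The synthesis in Appendix 1 can be sketched, under assumed names and geometry, as locating the observed target from the tracked tip position, the viewing direction, and the measured tip-to-target distance, and then selecting the pre-recorded image slice nearest that point.

```python
# Hedged sketch of the Appendix 1 synthesis. All names, the vector
# representation, and the slice-spacing value are illustrative; the
# appendix states only that position, distance, and pre-recorded
# biological image information are combined for display.

def locate_target(tip, view_dir, distance):
    """Target point = tip + distance * viewing direction (unit vector)."""
    return tuple(t + distance * v for t, v in zip(tip, view_dir))

def to_slice_index(target_z, slice_spacing_mm=1.0):
    """Pick the pre-recorded CT/MRI slice nearest the target's z coordinate."""
    return round(target_z / slice_spacing_mm)

target = locate_target((0.0, 0.0, 50.0), (0.0, 0.0, 1.0), 15.0)
print(target, to_slice_index(target[2]))  # → (0.0, 0.0, 65.0) 65
```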
[0058]
(Appendix 2) A position display device comprising:
means for measuring the position of a device to be used;
means for storing the shape of the device in advance; and
means for displaying image information,
wherein the information on the position and shape of the device is synthesized with biological image information recorded in advance, and the shape and position of the device are displayed on a screen in a distinguishable manner.
[0059]
(Appendix 3) A position display device that synthesizes the measured position of a device to be used with biological image information recorded in advance, and displays the position of the device,
wherein, when a plurality of devices are to be used, each device is provided with a marker, identifiable by the measuring means, at a specified position from its tip,
so that the distance from the marker to the tip remains constant even when the device in use is replaced.
[0060]
(Appendix 4) The device according to Appendix 1,
wherein the display content synthesizes 2D or 3D biological image information, the position information of the endoscope, and the position of the biological tissue located at the center of the visual field of the endoscope,
so that the position of the endoscopic image within the biological tissue can be displayed.
[0061]
(Appendix 5) The device according to Appendix 2,
wherein the display content synthesizes 2D or 3D biological image information with 2D or 3D shape information of the device used,
so that the position of the device used with respect to the living tissue can be displayed.
[0062]
(Appendix 6) The device according to Appendix 3,
wherein the type of device used can be identified from the mounting position or type of the marker.
[0063]
(Appendix 7) The device according to any one of Appendices 1 to 6,
wherein the image information can be arbitrarily inserted into or combined with a microscope image of a surgical microscope.
[0064]
(Prior art of Appendices 1 to 3) Conventionally, as in JP-A-5-305073, tomographic images obtained in advance by CT or MRI are synthesized by a computer for tomographic or stereoscopic display. By calibrating in advance the shapes of devices used for surgery, such as treatment tools and endoscopes, attaching position detection markers to those devices, and detecting their positions from the outside using infrared rays or the like, such systems display the positions of the devices in use on the above-mentioned biological image information; in neurosurgery in particular, they display the position of a brain tumor synthesized on a microscopic image, or navigate the direction in which the surgery proceeds.
[0065]
In addition, there have been devices that recognize the position and shape of a subject, as in Japanese Patent Publication No. 6-43893 and Japanese Patent Publication No. 3-28686, and surgical microscopes capable of displaying an endoscopic image picture-in-picture within the microscopic image, as in JP-A-7-261994.
[0066]
In addition, on TV monitors and head-mounted displays, a plurality of images could easily be combined and displayed in a desired form by using an image mixer.
[0067]
(Problems to be solved by Appendices 1 to 3) The problems of the prior art that the present invention seeks to solve are as follows.
- It is impossible to display in which part of the living body the tissue viewed in the endoscopic image is located.
- Commonly used devices such as treatment tools and endoscopes can only be displayed on the screen as a simplified bar shape, not with their actual shape, so damage to living tissue by a part that is outside the microscope image cannot be prevented.
- Recalibration is required each time the device in use is changed.
[0068]
(Effects of Appendices 1 to 3)
- When an endoscope is used, the position on the living tissue of the image obtained as the endoscopic image can be displayed.
- By storing the shapes of the devices to be used, such as treatment instruments and endoscopes, and synthesizing and displaying their actual shapes on the biological tissue image, inadvertent damage to living tissue by a part that is outside the microscopic image, for example, can be prevented.
- Recalibration is no longer necessary even when devices such as treatment tools and endoscopes are replaced.
[0069]
【Effect of the Invention】
According to the present invention, when an endoscope is used, the position on the living tissue of the image obtained as the endoscopic image can be displayed, and the shapes of the treatment devices used, such as a treatment tool or an endoscope, can be synthesized and displayed on the tissue image. This makes it possible, for example, to prevent a part that is outside the microscopic image from inadvertently damaging living tissue, and no new recalibration is needed even when a treatment device such as a treatment tool or an endoscope is replaced, so the operation can be performed efficiently.
[Brief description of the drawings]
FIG. 1 is a schematic configuration diagram of an entire system of a surgical microscope using a surgical navigation system according to a first embodiment of the present invention.
FIG. 2 is a front view showing a body of the surgical microscope according to the first embodiment.
FIG. 3 is a schematic configuration diagram of the inside of a body of the surgical microscope according to the first embodiment.
FIG. 4 is a schematic configuration diagram of the entire surgical navigation system according to the first embodiment.
FIG. 5 is a perspective view showing a main configuration of the surgical navigation system according to the first embodiment.
FIG. 6 is a longitudinal sectional view of a main part showing a use state of the endoscope compatible with the surgical navigation system of the first embodiment.
FIG. 7 is an explanatory diagram for explaining a display example of the display device in the surgical navigation system according to the first embodiment.
FIG. 8 is a plan view showing a microscope image of an eyepiece part of a surgical microscope according to a second embodiment of the present invention.
FIG. 9 is a side view showing a plurality of endoscopes having different effective lengths used in the surgical navigation system according to the third embodiment of the present invention.
[Explanation of symbols]
1 Surgical microscope
33 Treatment tool (treatment device)
34 Endoscope (treatment device)
36 Position detection sensor (position measurement unit)
39 Image composition circuit (image information composition unit)
40 Storage device (storage unit)
44 TV monitor (image information display)
45 Head mounted display (image information display)

Claims (3)

  1. A surgical navigation system comprising:
    a surgical microscope for observing a surgical site inside a patient's body;
    a position measuring unit for measuring the position of a treatment device used for treatment of the surgical site in the body;
    an endoscope having a distance measuring unit for measuring the distance from the distal end of an insertion portion, inserted into the patient's body, to a target at the surgical site;
    an image information display unit for displaying image information of the surgical site in the patient's body;
    a storage unit for holding biological image information measured in advance; and
    an image information synthesis unit that combines the position information from the position measuring unit, the distance information from the distance measuring unit, and the biological image information recorded in advance in the storage unit to construct a reference image, and synthesizes and displays, on the screen of the image information display unit, the positions of the endoscope and the surgical microscope, the distance from the target, and information on a target at the surgical site that cannot be seen from the surgical microscope because it lies in a blind spot, as analyzed from the data of the endoscopic image.
  2. The surgical navigation system according to claim 1, wherein the image information synthesis unit has means for storing the shape of the treatment device in advance, synthesizes the position and shape information of the treatment device with the biological image information recorded in advance, and displays the shape and position of the treatment device on the screen of the image information display unit so as to be distinguishable.
  3. A surgical navigation system comprising:
    a surgical microscope for observing a surgical site inside a patient's body;
    a position measuring unit for measuring the position of a treatment device used for treatment of the surgical site in the body;
    an endoscope having a distance measuring unit for measuring the distance from the distal end of an insertion portion, inserted into the patient's body, to a target at the surgical site;
    an image information display unit for displaying image information of the surgical site in the patient's body;
    a storage unit for holding biological image information measured in advance; and
    an image information synthesis unit that combines the position information from the position measuring unit, the distance information from the distance measuring unit, and the biological image information recorded in advance in the storage unit to construct a reference image, and synthesizes and displays, on the screen of the image information display unit, the positions of the endoscope and the surgical microscope, the distance from the target, and information on a target at the surgical site that cannot be seen from the surgical microscope, as analyzed from the data of the endoscopic image,
    wherein a plurality of treatment devices for treatment of the surgical site inside the patient's body are selectively exchanged as necessary, and
    a marker for identification by the position measuring unit is provided at a predetermined set position from the tip of each treatment device.
JP2000017135A 2000-01-26 2000-01-26 Surgical navigation system Expired - Fee Related JP4472085B2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2000017135A JP4472085B2 (en) 2000-01-26 2000-01-26 Surgical navigation system

Publications (2)

Publication Number Publication Date
JP2001204738A JP2001204738A (en) 2001-07-31
JP4472085B2 true JP4472085B2 (en) 2010-06-02

Family

ID=18544167

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2000017135A Expired - Fee Related JP4472085B2 (en) 2000-01-26 2000-01-26 Surgical navigation system

Country Status (1)

Country Link
JP (1) JP4472085B2 (en)

Families Citing this family (21)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20010034530A1 (en) * 2000-01-27 2001-10-25 Malackowski Donald W. Surgery system
JP4615842B2 (en) * 2003-10-09 2011-01-19 オリンパス株式会社 Endoscope system and endoscope image processing apparatus
JP4695357B2 (en) * 2004-07-20 2011-06-08 オリンパス株式会社 Surgery system
JP4916114B2 (en) * 2005-01-04 2012-04-11 オリンパス株式会社 Endoscope device
WO2008093517A1 (en) 2007-01-31 2008-08-07 National University Corporation Hamamatsu University School Of Medicine Device for displaying assistance information for surgical operation, method for displaying assistance information for surgical operation, and program for displaying assistance information for surgical operation
JP5283157B2 (en) * 2007-03-20 2013-09-04 国立大学法人静岡大学 Surgery support information display device, surgery support information display method, and surgery support information display program
FR2917598B1 (en) * 2007-06-19 2010-04-02 Medtech Multi-applicative robotic platform for neurosurgery and method of recaling
JP5561458B2 (en) 2008-03-18 2014-07-30 国立大学法人浜松医科大学 Surgery support system
JP5379454B2 (en) * 2008-11-21 2013-12-25 オリンパスメディカルシステムズ株式会社 Medical observation apparatus and medical observation system
JP5569711B2 (en) 2009-03-01 2014-08-13 国立大学法人浜松医科大学 Surgery support system
JP5675227B2 (en) 2010-08-31 2015-02-25 富士フイルム株式会社 Endoscopic image processing apparatus, operation method, and program
JP5302285B2 (en) 2010-10-28 2013-10-02 シャープ株式会社 Stereoscopic video output device, stereoscopic video output method, stereoscopic video output program, computer-readable recording medium, and stereoscopic video display device
JP5796982B2 (en) 2011-03-31 2015-10-21 オリンパス株式会社 Surgery system control device and control method
JP2013252387A (en) * 2012-06-08 2013-12-19 Canon Inc Medical image processing apparatus
JP6108812B2 (en) * 2012-12-17 2017-04-05 オリンパス株式会社 Insertion device
US10433763B2 (en) 2013-03-15 2019-10-08 Synaptive Medical (Barbados) Inc. Systems and methods for navigation and simulation of minimally invasive therapy
JP5781135B2 (en) * 2013-09-27 2015-09-16 エフ・エーシステムエンジニアリング株式会社 3D navigation video generation device
JP6257371B2 (en) * 2014-02-21 2018-01-10 オリンパス株式会社 Endoscope system and method for operating endoscope system
JP6311393B2 (en) * 2014-03-28 2018-04-18 セイコーエプソン株式会社 Information processing apparatus, information processing method, and information processing system
JP6094531B2 (en) * 2014-06-02 2017-03-15 中国電力株式会社 Work support system using virtual images
KR20180047881A (en) * 2016-11-01 2018-05-10 한국전기연구원 Probe Apparatus and Method for Diagnosing Brain Tumor in Real-time using Electromagnetic Wave



Legal Events

Date Code Title Description
A621 Written request for application examination

Free format text: JAPANESE INTERMEDIATE CODE: A621

Effective date: 20061113

A131 Notification of reasons for refusal

Free format text: JAPANESE INTERMEDIATE CODE: A131

Effective date: 20091110

A521 Written amendment

Free format text: JAPANESE INTERMEDIATE CODE: A523

Effective date: 20100108

TRDD Decision of grant or rejection written
A01 Written decision to grant a patent or to grant a registration (utility model)

Free format text: JAPANESE INTERMEDIATE CODE: A01

Effective date: 20100209

A61 First payment of annual fees (during grant procedure)

Free format text: JAPANESE INTERMEDIATE CODE: A61

Effective date: 20100303

FPAY Renewal fee payment (event date is renewal date of database)

Free format text: PAYMENT UNTIL: 20130312

Year of fee payment: 3

FPAY Renewal fee payment (event date is renewal date of database)

Free format text: PAYMENT UNTIL: 20140312

Year of fee payment: 4

LAPS Cancellation because of no payment of annual fees