JP2009226072A - Method and device for surgical operation support - Google Patents

Method and device for surgical operation support

Info

Publication number
JP2009226072A
Authority
JP
Japan
Prior art keywords
image
blood vessel
means
warning
operation
Prior art date
Legal status
Granted
Application number
JP2008076412A
Other languages
Japanese (ja)
Other versions
JP5160276B2 (en)
Inventor
Yoshiyuki Kunuki
義幸 九貫
Original Assignee
Fujifilm Corp
富士フイルム株式会社
Priority date
Filing date
Publication date
Application filed by Fujifilm Corp (富士フイルム株式会社)
Priority to JP2008076412A
Publication of JP2009226072A
Application granted
Publication of JP5160276B2
Status: Expired - Fee Related

Abstract

To enable, when performing a surgical operation, blood vessels lying within a predetermined depth of the tissue surface to be superimposed on the normal image in which the surface of the living tissue is shown.
A subject to which a blood-vessel contrast agent has been administered is alternately irradiated with excitation light in a specific wavelength region and with visible light, and an imaging means alternately acquires a fluorescence image (during excitation-light irradiation) and a normal image (during visible-light irradiation). The acquired fluorescence image is thresholded with a predetermined threshold to extract a blood vessel image, and a composite image is created by superimposing the extracted blood vessel image on the acquired normal image. The composite image created in this way is displayed as a moving image. As a result, the surface of the living tissue can be observed, and blood vessels lying within a predetermined depth of that surface can also be observed on the composite image.
[Selection] Figure 6

Description

  The present invention relates to a surgery support method and apparatus, and more particularly to a technique for supporting endoscopic surgery.

  When performing surgery, sufficient care must be taken not to damage blood vessels. However, because blood vessels lying beyond a certain depth from the surface of the subject's body tissue cannot be seen, there is a risk of injuring them (bleeding risk).

  Endoscopic surgery, in particular, has the advantage of being minimally invasive, but because the surgical field is narrow it is difficult to grasp the positional relationship between the tip of a treatment tool such as an electric scalpel or ultrasonic scalpel and the blood vessels. The bleeding risk is therefore higher than in open surgery, and heavy bleeding is difficult to control. For this reason, endoscopic surgery is required to be performed by an operator who is sufficiently experienced in endoscopic procedures.

  Conventionally, a surgery support system that supports surgery based on an image of an affected area captured by an imaging apparatus has been proposed (Patent Document 1). In the surgery support system described in Patent Document 1, the type of line or surface that approximates the surgical route is designated, reference points of the surgical route are input on a captured stereoscopic image, a surgical route of the designated line or surface passing through all of the input reference points is calculated, and the surgical route is displayed together with the stereoscopic image.

  Further, as a conventional technique for imaging focused on blood vessels, there is the invention described in Patent Document 2. In the invention described in Patent Document 2, a fluorescent dye such as indocyanine green (ICG) is administered to an animal, an angiographic image is obtained by irradiating light of a specific wavelength that excites the fluorescent dye, and the patency of the blood vessel portion is evaluated from the angiographic image.

  On the other hand, the invention described in Patent Document 3 supports treatment for removing an obstruction from a tubular passage in the body, such as a blood vessel. In the invention described in Patent Document 3, the distance between a target such as a stent and the tip of the catheter tube is measured from the output of a sensor provided at the tip of the catheter, and the positional relationship between the target and the tip of the catheter tube (sensor) is displayed to inform the user.

  Conventionally, white light (illumination light) and excitation light are alternately irradiated to a living tissue every field period, and an illumination light image and a fluorescence image are alternately captured every field period by an imaging unit. An electronic endoscope system that combines an illumination light image captured with a fluorescent image and a fluorescent image has been proposed (Patent Document 4).

This electronic endoscope system exploits the autofluorescence emitted when living tissue is irradiated with light of a specific wavelength (excitation light) such as ultraviolet rays. Because autofluorescence is weaker in lesion tissue than in normal tissue, the system is used to observe abnormalities in living tissue. In particular, the illumination light image and the fluorescence image are synthesized so that lesion tissue can be identified simply by looking at the synthesized image.
Patent Document 1: Japanese Patent Laid-Open No. 2007-75198
Patent Document 2: Japanese National Publication No. 2003-510121
Patent Document 3: Japanese National Publication No. 2001-510354
Patent Document 4: Japanese Patent Laid-Open No. 2004-254899

  In the invention described in Patent Document 1, the doctor designates the type of line or surface that approximates the surgical route and inputs the number of reference points corresponding to that type (for example, four reference points in the case of a spherical surface); a surgical route of the designated line or surface passing through all of the input reference points is then calculated and presented to the doctor. It does not present an optimal surgical route that avoids damaging blood vessels.

  The invention described in Patent Document 2 evaluates the patency of blood vessels and the blood flow in a tissue part during or after surgery by examining angiographic images; it does not help prevent blood vessels from being damaged during surgery. Although the angiographic image is useful for the operation, tissue portions other than the blood vessels through which blood is flowing cannot be confirmed on it, and therefore the operation is not performed while viewing the angiographic image.

  The invention described in Patent Document 3 can inform the user of the positional relationship between a target such as a stent and the tip (sensor) of the catheter tube, but it cannot inform the user of the positional relationship between a treatment tool such as an electric scalpel and a blood vessel.

  The invention described in Patent Document 4 obtains a composite image by combining a white light image (illumination light image) and a fluorescence image, but the fluorescence image in Patent Document 4 is an image of living tissue autofluorescing when irradiated with light of a specific wavelength such as ultraviolet light, not an angiographic image. As the method of combining the white light image and the fluorescence image, the color difference signal of each pixel uses the color difference signal of the white light image, while the luminance signal of each pixel is obtained by adding the luminance signal of the white light image and the luminance signal of the fluorescence image at a predetermined ratio.

  An object of the present invention is to provide a surgery support method and apparatus that, when performing a surgical operation, can display blood vessels lying within a predetermined depth of the tissue surface superimposed on the normal image in which the surface of the living tissue is shown, and can thereby significantly reduce the risk of damaging blood vessels (bleeding risk).

  Another object of the present invention is to provide a surgery support method and apparatus capable of issuing a warning and alerting the operator when a surgical treatment instrument approaches a blood vessel, thereby preventing bleeding problems in advance.

  Still another object of the present invention is to provide a surgery support method and apparatus that can present an appropriate surgical route that does not approach blood vessels simply from an input of the start point and end point of the operation, so that the treatment instrument can be operated more safely during surgery.

  In order to achieve the above objects, the surgery support method according to claim 1 includes: a step of repeatedly irradiating a subject, to which a blood-vessel contrast agent has been administered, with excitation light in a specific wavelength region for causing the contrast agent to emit light, at predetermined time intervals; a step of continuously photographing the subject irradiated with the excitation light by an imaging means to acquire a fluorescence image; a step of continuously irradiating the subject with visible light during the periods in which the excitation light is not irradiated; a step of continuously photographing the subject irradiated with the visible light by the imaging means to acquire a normal image; a step of thresholding the acquired fluorescence image with a predetermined threshold to extract a blood vessel image; a step of creating a composite image in which the extracted blood vessel image is superimposed on the acquired normal image; and a step of continuously displaying the created composite image on a display means as a moving image.

  That is, in a normal image only blood vessels on the surface of the living tissue can be observed, whereas in a fluorescence image, obtained by irradiating a subject to which a blood-vessel contrast agent has been administered with excitation light in a specific wavelength region so that the contrast agent emits light, blood vessels lying within a predetermined depth of the tissue surface can be observed. The fluorescence image is thresholded with a predetermined threshold to extract a blood vessel image, and a composite image is created and displayed by superimposing the blood vessel image on the normal image; the surface of the living tissue can thus be observed while blood vessels lying within the predetermined depth can also be observed on the composite image.

  According to claim 2, in the surgery support method according to claim 1, the extracted blood vessel image is a blood vessel image including density information of the fluorescence image, or a binarized blood vessel image.

  According to claim 3, in the surgery support method according to claim 1 or 2, the step of creating the composite image key-composites the blood vessel image with the normal image using, as a key signal, a binarized signal that distinguishes the blood vessel image extracted by the threshold processing from its background. That is, the composite is formed so that only the blood vessel image is pasted onto the normal image.

  According to claim 4, in the surgery support method according to any one of claims 1 to 3, the step of displaying the composite image simultaneously displays at least one of the normal image, the fluorescence image, and the blood vessel image on the display means together with the composite image. This allows images other than the composite image (the normal image, fluorescence image, or blood vessel image) to be referred to as well.

  According to claim 5, in the surgery support method according to claim 1, the fluorescence image includes an image of blood vessels lying within a predetermined depth of the surface of the living tissue, the depth depending on the transmission characteristics of the excitation light and on the light-emission characteristics and concentration of the blood-vessel contrast agent.

  That is, since the depth of blood vessel that can be imaged by the imaging means is determined by parameters such as the transmission characteristics of the excitation light and the light-emission characteristics and concentration of the blood-vessel contrast agent, it is preferable to set these parameters so that blood vessels down to the target depth can be captured.

  According to a sixth aspect of the present invention, in the surgical support method according to the fifth aspect, the predetermined depth is equal to or greater than a depth that is incised at a time by a treatment instrument for endoscopic surgery. This ensures that the blood vessel is not damaged when a portion where no blood vessel is observed on the composite image is incised.

  According to claim 7, the surgery support method according to any one of claims 1 to 6 further includes: a step of detecting the distal end position of a surgical treatment tool based on the normal image; a step of calculating the actual distance from the detected distal end position of the treatment tool to the nearest blood vessel based on that position and the position information of blood vessels in the extracted blood vessel image; and a step of issuing a warning when the calculated actual distance is less than or equal to a predetermined distance.

  That is, the distal end position of the treatment instrument is detected by image processing of the normal image, the position of the blood vessel closest to that detected position is found, and the actual distance is calculated from the distance between these positions on the imaging surface of the imaging means. In converting the distance on the imaging surface into an actual distance, the conversion is based on the imaging distance to the surface of the living tissue or on a reference scale on that surface (for example, the size of the tip of the treatment tool). When the calculated actual distance is less than or equal to a predetermined distance, a warning is issued to notify the operator that the distal end of the treatment instrument has approached a blood vessel.

  According to an eighth aspect of the present invention, in the operation support method according to the seventh aspect, the step of issuing the warning is characterized in that a warning character is superimposed and displayed on the display means.

  As shown in claim 9, in the operation support method according to claim 7 or 8, the step of issuing the warning is characterized in that the calculated actual distance is superimposed and displayed on the display means. By displaying the actual distance, it is possible to notify that the distal end position of the treatment instrument has approached the blood vessel.

  According to claim 10, in the surgery support method according to claim 8, the step of issuing the warning performs, when displaying the warning characters or the actual distance, at least one of: changing the color of the warning characters or the actual distance, blinking them, repeatedly enlarging and reducing them, and displaying them enlarged. A warning displayed in such a form is easy to notice.

  According to claim 11, in the surgery support method according to claim 7, the step of issuing the warning displays the warning characters or the actual distance in the vicinity of the distal end position of the treatment instrument shown on the display means. Displaying the warning near the distal end of the treatment instrument (that is, the part being watched) ensures that it is not overlooked.

  According to a twelfth aspect of the present invention, in the surgery support method according to any one of the seventh to eleventh aspects, the step of issuing the warning generates a warning sound or a warning voice.

  According to claim 13, the surgery support method according to any one of claims 1 to 12 further includes: a step of receiving input of the start point and end point of the operation on the screen of the display means; a step of calculating, based on the position information of the start point and end point and the position information of blood vessels in the extracted blood vessel image, a surgical route that connects the start point and end point without passing through blood vessels or the dangerous regions in their vicinity; and a step of displaying the calculated surgical route superimposed on the composite image.

  That is, simply by inputting the start point and end point of the operation, a surgical route that does not pass through blood vessels or the dangerous regions in their vicinity (an appropriate route that does not approach blood vessels) can be presented, so the treatment tool can be operated more safely during surgery.

  According to claim 14, in the surgery support method according to claim 13, the step of calculating the surgical route calculates, as the surgical route, the safest route or the shortest route connecting the start point and end point of the operation. A plurality of surgical route candidates may be displayed for the doctor to choose from, and it is preferable that the presented route can be corrected as appropriate.
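  The patent does not specify a particular route-planning algorithm. As one illustrative sketch (all function and parameter names are hypothetical, not taken from the specification), the route can be computed on the extracted binary vessel mask by dilating it into a "dangerous region" and running a breadth-first search over the remaining free pixels between the input start and end points:

```python
# Illustrative sketch only; not from the patent. Assumes a binary vessel mask
# (True = vessel) extracted from the fluorescence image, plus user-selected
# start/end pixels. The dangerous region is modeled as a dilation of the mask.
from collections import deque
import numpy as np
from scipy import ndimage

def plan_route(vessel_mask: np.ndarray, start, end, danger_px: int = 10):
    """Shortest pixel path from start to end avoiding vessels and their vicinity."""
    danger = ndimage.binary_dilation(vessel_mask > 0, iterations=danger_px)
    free = ~danger
    if not (free[start] and free[end]):
        return None                      # start or end lies in the danger region
    prev = {start: None}
    queue = deque([start])
    while queue:                         # 4-connected breadth-first search
        y, x = queue.popleft()
        if (y, x) == end:
            break
        for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)):
            if (0 <= ny < free.shape[0] and 0 <= nx < free.shape[1]
                    and free[ny, nx] and (ny, nx) not in prev):
                prev[(ny, nx)] = (y, x)
                queue.append((ny, nx))
    if end not in prev:
        return None                      # no route that stays clear of vessels
    path, node = [], end
    while node is not None:              # walk predecessors back to the start
        path.append(node)
        node = prev[node]
    return path[::-1]
```

  Replacing the uniform step cost with a cost that decreases with distance from the vessel mask (for example, Dijkstra over `ndimage.distance_transform_edt(~vessel_mask)`) would be one possible reading of the "safest route" of claim 14, as opposed to the shortest route computed above.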

  According to claim 15, the surgery support method according to any one of claims 1 to 12 further includes: a step of receiving input of the start point and end point of the operation on the screen of the display means; a step of calculating, based on the position information of the start point and end point and the position information of blood vessels in the extracted blood vessel image, a safe region excluding the blood vessels and the dangerous regions in their vicinity; and a step of displaying the calculated safe region on the display means in an identifiable manner.

  In other words, simply by inputting the start point and end point of the operation, safe regions (regions that do not approach blood vessels) excluding the blood vessels and the dangerous regions in their vicinity can be presented, so the treatment tool can be operated more safely during surgery. The safe and dangerous regions can be made distinguishable by changing the density or color of the composite image.

  The surgery support apparatus according to claim 16 comprises: first light source means for repeatedly irradiating a subject, to which a blood-vessel contrast agent has been administered, with excitation light in a specific wavelength region for causing the contrast agent to emit light, at predetermined time intervals; second light source means for continuously irradiating the subject with visible light during the non-irradiation periods of the excitation light; imaging means for alternately and repeatedly photographing the subject in synchronization with the irradiation of excitation light or visible light by the first and second light source means; image acquisition means for acquiring, among the images photographed by the imaging means, those photographed in synchronization with the excitation-light irradiation as fluorescence images and those photographed in synchronization with the visible-light irradiation as normal images; blood vessel image extraction means for extracting a blood vessel image by thresholding the fluorescence image acquired by the image acquisition means with a predetermined threshold; composite image creation means for creating a composite image in which the blood vessel image extracted by the blood vessel image extraction means is superimposed on the normal image acquired by the image acquisition means; and composite image output means for outputting the composite image created by the composite image creation means to display means as a moving image.

  That is, a subject to which a blood-vessel contrast agent has been administered is alternately and continuously irradiated with excitation light in a specific wavelength region and with visible light, and the same imaging device repeatedly photographs the subject in synchronization with the excitation-light or visible-light irradiation to acquire fluorescence images and normal images. The fluorescence image is thresholded with a predetermined threshold to extract a blood vessel image, and a composite image is created and displayed by superimposing the blood vessel image on the normal image; the surface of the living tissue can thus be observed while blood vessels lying within a predetermined depth of that surface can also be observed on the composite image.

  According to a seventeenth aspect of the present invention, in the surgery support device according to the sixteenth aspect, the first light source means generates excitation light in a near-infrared wavelength region. Because excitation light in the near-infrared wavelength region passes through hemoglobin and water, it reaches a predetermined depth below the surface of the living tissue, and if a blood vessel exists within that depth, the blood-vessel contrast agent is excited and emits fluorescence. Blood vessels lying within the predetermined depth of the tissue surface can therefore be photographed as a fluorescence image.

  According to claim 18, in the surgery support apparatus according to claim 16 or 17, the blood vessel image extraction means extracts a blood vessel image including density information of the fluorescence image, or a binarized blood vessel image.

  According to claim 19, in the surgery support apparatus according to any one of claims 16 to 18, the composite image creation means includes means for generating, as a key signal, a binarized signal that distinguishes the blood vessel image extracted by the threshold processing from its background, and means for key-compositing the blood vessel image with the normal image based on the generated key signal.

  According to claim 20, in the surgery support apparatus according to any one of claims 16 to 19, the composite image output means simultaneously displays at least one of the normal image, the fluorescence image, and the blood vessel image on the display means together with the composite image. Which image is displayed together with the composite image may be selectable as appropriate.

  According to claim 21, the surgery support apparatus according to any one of claims 16 to 20 further comprises: treatment instrument detection means for detecting the distal end position of a surgical treatment instrument based on the normal image; distance calculation means for calculating the actual distance from the detected distal end position of the treatment instrument to the nearest blood vessel based on that position and the position information of blood vessels in the extracted blood vessel image; and warning generation means for issuing a warning when the calculated actual distance is less than or equal to a predetermined distance.

  According to a twenty-second aspect of the present invention, in the surgery support device according to the twenty-first aspect, the warning generation unit displays a warning character superimposed on the display unit.

  According to a twenty-third aspect of the present invention, in the surgery support device according to the twenty-first or twenty-second aspect, the warning generating means displays the calculated actual distance superimposed on the display means.

  According to claim 24, in the surgery support apparatus according to claim 22 or 23, the warning generation means performs, when displaying the warning characters or the actual distance, at least one of: changing their color, blinking them, repeatedly enlarging and reducing them, and displaying them enlarged.

  According to claim 25, in the surgery support apparatus according to any one of claims 22 to 24, the warning generation means displays the warning characters or the actual distance in the vicinity of the distal end position of the treatment instrument shown on the display means.

  According to a twenty-sixth aspect of the present invention, in the surgery support device according to any one of the twenty-first to twenty-fifth aspects, the warning generation means emits a warning sound or a warning voice.

  According to claim 27, the surgery support apparatus according to claim 16 further comprises: input means for inputting the start point and end point of the operation on the screen of the display means; surgical route calculation means for calculating, based on the position information of the start point and end point input by the input means and the position information of blood vessels in the blood vessel image extracted by the blood vessel image extraction means, a surgical route that connects the start point and end point without passing through blood vessels or the dangerous regions in their vicinity; and surgical route display means for displaying the calculated surgical route superimposed on the composite image.

  According to claim 28, in the surgery support apparatus according to claim 27, the surgical route calculation means calculates, as the surgical route, the safest route or the shortest route connecting the start point and end point of the operation.

  According to claim 29, the surgery support apparatus according to any one of claims 16 to 26 further comprises: input means for inputting the start point and end point of the operation on the screen of the display means; safe region calculation means for calculating, based on the position information of the start point and end point input by the input means and the position information of blood vessels in the blood vessel image extracted by the blood vessel image extraction means, a safe region excluding the blood vessels and the dangerous regions in their vicinity; and safe region display means for displaying the calculated safe region on the display means in an identifiable manner.

  According to claim 30, in the surgery support apparatus according to claim 16, the imaging means is provided in an endoscope, and the first light source means and the second light source means emit the excitation light or the visible light from the distal end of the endoscope.

  According to the present invention, when performing a surgical operation, a normal image allowing good observation of the surface of the living tissue can be displayed, and blood vessels lying within a predetermined depth of that surface, which cannot be seen in the normal image, can be displayed superimposed on it; the risk of damaging blood vessels (bleeding risk) can thereby be greatly reduced.

  In addition, when a surgical treatment instrument approaches a blood vessel, a warning can be issued to alert the operator, thereby preventing bleeding problems in advance. Furthermore, an appropriate surgical route that does not approach blood vessels can be presented simply by inputting the start point and end point of the operation, so the treatment instrument can be operated more safely during surgery.

  DESCRIPTION OF EMBODIMENTS Hereinafter, preferred embodiments of a surgery support method and apparatus according to the present invention will be described with reference to the accompanying drawings.

<Appearance of surgery support device>
FIG. 1 is an external view showing an embodiment of a surgery support apparatus according to the present invention.

  As shown in FIG. 1, the surgery support apparatus 10 mainly includes a laparoscope 100, which is a kind of endoscope, a processor 200, a light source device 300, and a monitor device 400. Note that the processor 200 may be configured to incorporate the light source device 300.

  The laparoscope 100 is detachably attached to the processor 200 and the light source device 300 via an electrical connector 110 and a light guide (LG) connector 120, respectively. An image showing the subject imaged by the laparoscope 100 is appropriately subjected to image processing by the processor 200 and then output to the monitor device 400 where it is observed by the operator.

  FIG. 2 is a schematic diagram of laparoscopic surgery using the laparoscope 100. In laparoscopic surgery, several holes are opened in the abdominal wall, the distal end 100A of the insertion portion of the laparoscope 100, an electric scalpel 30 used for endoscopic surgery, and treatment tools such as forceps are inserted through trocars 20, and carbon dioxide gas or air is introduced to inflate the abdominal wall.

  The surgeon advances the operation by operating the treatment tool while observing the surgical target site imaged by the laparoscope 100 with the monitor device 400.

<Internal configuration of surgery support device>
FIG. 3 is a block diagram showing the internal configuration of the surgery support apparatus 10.

[Laparoscope]
An objective lens 130, an image sensor (CCD) 140, and an illumination lens 150 are disposed at the insertion portion distal end 100A of the laparoscope 100.

  The objective lens 130 forms an image of the subject on the light receiving surface of the CCD 140, and the CCD 140 converts the subject image formed on the light receiving surface into an electric signal at each light receiving element. The CCD 140 of this embodiment is a color CCD in which color filters of the three primary colors red (R), green (G), and blue (B) are arranged pixel by pixel in a predetermined pattern (Bayer arrangement or honeycomb arrangement).

  Further, inside the laparoscope 100, a wiring 160 for driving the CCD 140 and taking out the CCD output is provided, and a light guide 170 is provided.

  One end 170A of the light guide 170 is connected to the light source device 300 via the LG connector 120, and the other end 170B of the light guide 170 faces the illumination lens 150. The light emitted from the light source device 300 is emitted from the illumination lens 150 via the light guide 170 and illuminates the visual field range of the objective lens 130.

  The laparoscope 100 of this embodiment has the same configuration as a general laparoscope except that an infrared cut filter is not provided on the front surface of the CCD 140.

[Processor]
The processor 200 mainly includes a central processing unit (CPU) 210, an analog front end (AFE) 220, an image input controller 222, a normal image processing unit 224, a fluorescence image processing unit 226, an image composition unit 230, a CCD driver 240, a timing generator (TG) 242, a character generator (CG) 244, a memory 246, a video output unit 248, an audio processing unit 250, a speaker 252, and an operation unit 254.

  The CPU 210 has a built-in program ROM in which various data necessary for control are recorded in addition to the control program executed by the CPU 210. The CPU 210 controls each unit by reading a control program recorded in the program ROM into the memory 246 based on an instruction input such as a shooting instruction from the operation unit 254 and sequentially executing the control program. The memory 246 is used as a program execution processing area, a temporary storage area for image data, and various work areas.

  The CCD 140 in the laparoscope 100 outputs the charges accumulated in each pixel as a serial image signal line by line in synchronization with the vertical transfer clock and horizontal transfer clock supplied from the TG 242 via the CCD driver 240. The CPU 210 controls the driving of the CCD 140 by controlling the TG 242.

  The operation unit 254 includes a switch for instructing the start and end of imaging, a switch for instructing calculation of a surgical route, which will be described later, and a pointing device such as a mouse for inputting the start point and end point of the surgical route.

  The image signal output from the CCD 140 is an analog signal, and this analog image signal is taken into the AFE 220. The AFE 220 includes a correlated double sampling circuit (CDS), an automatic gain control circuit (AGC), and an AD converter (ADC). The CDS removes noise contained in the image signal, the AGC amplifies the noise-removed image signal with a predetermined gain, and the ADC converts the analog image signal into a digital image signal having a predetermined bit depth.

  The image input controller 222 has a built-in line buffer having a predetermined capacity, and stores an image signal for one frame output from the AFE 220. The image signal for one frame accumulated in the image input controller 222 is stored in the memory 246 via the bus 256.

  In addition to the CPU 210, memory 246, and image input controller 222, the normal image processing unit 224, fluorescent image processing unit 226, image composition unit 230, CG244, video output unit 248, and the like are connected to the bus 256. Information can be transmitted and received between each other via a bus 256.

  The image signal for one frame stored in the memory 246 is taken into the normal image processing unit 224 or the fluorescence image processing unit 226 and subjected to necessary image processing. The images processed by the normal image processing unit 224 and the fluorescence image processing unit 226 are combined by the image combining unit 230. Details of the normal image processing unit 224, the fluorescence image processing unit 226, and the image composition unit 230 will be described later.

  The synthesized image synthesized by the image synthesis unit 230 is converted into a video signal for the monitor device 400 by the video output unit 248 and output to the monitor device 400.

  Further, the CG 244 generates warning characters and the like according to a command from the CPU 210 and outputs them to the image composition unit 230, and the sound processing unit 250 causes the speaker 252 to emit a warning sound such as a beep or a warning voice according to a command from the CPU 210.

[Light source device]
The light source device 300 mainly includes a white light source 310, a rotary filter 320, a diaphragm 330, a condenser lens 340, a motor drive circuit 350, a motor 360, and an automatic light amount adjustment circuit (ALC) 370, and has the function of making visible light and excitation light in a specific wavelength region (near-infrared region) alternately incident on the light guide 170.

  As the light source 310, for example, a halogen lamp can be used. White light emitted from the halogen lamp has a wavelength range of 400 nm to 1800 nm. The rotary filter 320 transmits only visible light or transmits only near-infrared excitation light according to the rotational position.

  FIG. 4 is a plan view of the rotary filter 320. As shown in the figure, the rotary filter 320 is provided with an infrared cut filter 322 and a near-infrared bandpass filter 324. When the infrared cut filter 322 is positioned in front of the light source 310, only visible light (400 nm to 700 nm) is transmitted; when the near-infrared bandpass filter (near-infrared BPF) 324 is positioned in front of the light source 310, only excitation light in the near-infrared region (for example, around 800 nm) is transmitted.

  The motor drive circuit 350 outputs a drive signal to the motor 360 to rotate the rotary filter 320 at 30 revolutions per second, and controls its phase so that the infrared cut filter 322 and the near-infrared BPF 324 are switched in synchronization with the vertical synchronization signal from the TG 242.

  The light transmitted through the rotary filter 320 is guided to the end face of the light guide 170 through the diaphragm 330 and the condenser lens 340.

  The ALC 370 controls the diaphragm 330 based on the brightness information of the captured image applied from the CPU 210, and adjusts the amount of light incident on the light guide 170 so that the captured image is maintained at a constant brightness. This prevents halation or the like from occurring.

  When visible light is made incident on the light guide 170 by the light source device 300 having the above configuration, the laparoscope 100 can capture a color image (normal image); when excitation light is made incident on the light guide 170, the laparoscope 100 can capture a fluorescence image of living tissue that fluoresces under the excitation light.

[Method for obtaining fluorescent image including blood vessel image]
In order to obtain information from inside living tissue using light, it is necessary to avoid light in wavelength ranges that are absorbed by the tissue. As shown in FIG. 5, in the visible wavelength range of 700 nm or less there is absorption by hemoglobin, and in the wavelength range of 1000 nm or more there is absorption by water, so light in these ranges cannot be used. Light in the 700 nm to 1000 nm range (the near-infrared range) is referred to as the "biological spectroscopic window" because it passes through living tissue relatively well. In other words, the above-described near-infrared excitation light penetrates living tissue relatively well.

  In the present invention, in order to observe a blood vessel inside a living tissue, a blood vessel contrast agent is administered to a subject, and a fluorescence image including a blood vessel image is photographed by irradiating excitation light in the near infrared region. As the angiographic contrast agent, fluorescent reagent ICG (indocyanine green) having an excitation light wavelength of 785 nm and a fluorescence wavelength of 805 nm, and fluorescent reagent Cy7 having an excitation light wavelength of 747 nm and a fluorescence wavelength of 776 nm can be used.

<First Embodiment>
In the present invention, normal images and fluorescent images are taken alternately. When a fluorescent image is taken, the living tissue is irradiated with near-infrared excitation light as described above.

  FIG. 6 is a flowchart showing the first embodiment of the operation support method and apparatus according to the present invention.

  First, the blood-vessel contrast agent is administered intravenously to the subject (step S10). Since the contrast agent is metabolized in the body, it must be administered so as to keep its blood concentration constant. It is also preferable to administer it so that the blood concentration is appropriate (that is, so that the fluorescence intensity is at its peak).

  Subsequently, the normal image and the fluorescence image are alternately photographed for each frame in synchronization with the vertical synchronization signal (VD signal) having a period of 1/60 seconds (step S12). That is, as shown in FIG. 7A, in synchronization with the VD signal, visible light and excitation light are alternately emitted from the light source device 300, and the subject is irradiated through the light guide 170 and the illumination lens 150. As a result, the normal image exposure (photographing) and the fluorescence image exposure (photographing) are alternately performed by the CCD 140 (FIGS. 7B and 7F).
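  As a rough sketch of this alternating acquisition (not taken from the patent; the assignment of even frames to visible light and odd frames to excitation light is an assumption), the frames captured in VD order can be demultiplexed by parity into the normal and fluorescence streams:

```python
# Minimal sketch: split an interleaved frame sequence into normal/fluorescence
# streams, assuming even VD periods were exposed under visible light and odd
# periods under near-infrared excitation light (this assignment is an assumption).
def demux_frames(frames):
    normal, fluorescence = [], []
    for i, frame in enumerate(frames):
        (normal if i % 2 == 0 else fluorescence).append(frame)
    return normal, fluorescence
```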

  When a normal image has been exposed, the image signal (normal image) is read from the CCD 140 in synchronization with the next VD signal (FIG. 7C), and normal image processing is then performed by the normal image processing unit 224 shown in FIG. 3 (FIG. 7D, step S14).

  The normal image processing unit 224 includes a linear matrix circuit, a white balance correction circuit, a gamma correction circuit, a synchronization circuit, and the like, and processes R, G, and B image signals indicating normal images input by these circuits.

  On the other hand, when a fluorescence image has been exposed, the image signal (fluorescence image) is read from the CCD 140 in synchronization with the next VD signal (FIG. 7G), and fluorescence image processing and blood vessel image generation processing are then performed by the fluorescence image processing unit 226 shown in FIG. 3 (FIG. 7H, steps S16 and S18).

  The fluorescence image processing unit 226 first processes the input R, G, and B image signals representing the fluorescence image with a gamma correction circuit, a synchronization circuit, and the like, and generates a luminance signal (a signal having only density information) from the R, G, and B image signals after the synchronization processing. A blood vessel image is then generated from this fluorescence image having only density information.

  FIG. 8 is a flowchart showing a blood vessel image generation (blood vessel image extraction) process in step S18 of FIG.

  As shown in FIG. 8, the fluorescence image processed in step S16 is input (step S30). Next, the input fluorescence image is thresholded with a predetermined threshold, and either the density information of the pixels whose density is equal to or higher than the threshold (pixels receiving light from fluorescing tissue) or a binarized image obtained by binarizing at the threshold is extracted (step S32). The threshold is preferably as small as possible within the range in which pixel density information can still be distinguished from noise.

  Subsequently, image processing for extracting only the blood vessel image from the extracted density information or binarized image is performed (step S34). For example, a filtering process is performed to remove image portions and noise components whose features differ from those of blood vessels (an elongated, continuous shape, etc.).
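  A minimal sketch of steps S32 to S34 follows, assuming the fluorescence image is available as an 8-bit luminance array; the threshold value and the shape criterion (elongation measured from each connected component's bounding box) are illustrative choices, not values given in the patent.

```python
# Sketch of steps S32-S34: threshold the fluorescence luminance image, then keep
# only connected components whose shape is elongated like a vessel.
import numpy as np
from scipy import ndimage

def extract_vessels(fluor: np.ndarray, thresh: int = 30, min_elongation: float = 3.0):
    binary = fluor >= thresh                      # step S32: threshold processing
    labels, n = ndimage.label(binary)             # connected components
    keep = np.zeros_like(binary)
    for i, sl in enumerate(ndimage.find_objects(labels), start=1):
        h = sl[0].stop - sl[0].start
        w = sl[1].stop - sl[1].start
        # step S34: discard blobs that are not elongated/continuous like a vessel
        if max(h, w) / max(1, min(h, w)) >= min_elongation:
            keep[sl] |= (labels[sl] == i)
    density = np.where(keep, fluor, 0)            # vessel image with density info
    return keep, density                          # binarized and density versions
```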

  As shown in FIGS. 7E and 7I, the normal image and the fluorescence image (blood vessel image) processed as described above are each output for two frame periods so as to form continuous frames, and the simultaneously output normal image and blood vessel image are supplied to the image synthesis unit 230 shown in FIG. 3 and combined there.

  9A and 9B are schematic diagrams of a normal image and a blood vessel image output to the image composition unit 230, respectively.

  FIG. 10 is a block diagram showing an internal configuration of the image composition unit 230. As shown in the figure, the image composition unit 230 includes mixers 232, 234, and 236 and a color conversion unit 238.

  The blood vessel image consisting of density information is supplied to the color conversion unit 238, which color-converts the input blood vessel image into a preset color (for example, red) and outputs the converted blood vessel image to the mixer 232. The color conversion unit 238 converts to a graded color (bright red to dark red) corresponding to the density information of the blood vessel image.

  The normal image is supplied to the other input of the mixer 232, which combines the normal image and the blood vessel image. In this combination, the blood vessel image is key-composited onto the normal image using the binary image corresponding to the blood vessel image (a binary signal distinguishing the blood vessel image from its background) as the key signal. As a result, only the blood vessel image is pasted onto the normal image, and the normal image remains visible where there are no blood vessels.
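  A minimal sketch of the color conversion and key compositing performed by the color conversion unit 238 and the mixer 232, assuming the images are NumPy arrays; the particular red gradation mapping is an illustrative assumption.

```python
# Sketch of the color conversion (density -> red gradation) and key compositing
# (vessel pixels replace the normal image, background pixels pass it through).
import numpy as np

def composite(normal_rgb: np.ndarray, vessel_density: np.ndarray, key: np.ndarray):
    # Map vessel density to a red gradation: low density -> dark red,
    # high density -> bright red (assumed mapping).
    red = np.zeros_like(normal_rgb)
    red[..., 0] = np.clip(vessel_density, 64, 255)
    # Key composition: where the key is set, paste the vessel color; elsewhere
    # the normal image remains visible.
    return np.where(key[..., None] > 0, red, normal_rgb)
```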

  The synthesized image synthesized in this way is added to the mixer 234. Warning character information is added to the other input of the mixer 234 from the CG 244 as necessary, and the mixer 234 can synthesize the warning character on the composite image.

  The composite image output from the mixer 234 is added to the mixer 236. Surgical route information is added to the other input of the mixer 236 from the CPU 210 so that the mixer 236 can synthesize the surgical route on the composite image. Details of the combined display of the warning character and the surgical route will be described later.

  Returning to FIG. 6, the synthesized image synthesized by the image synthesizing unit 230 as described above is output to the monitor device 400 via the video output unit 248 and displayed on the monitor device 400 (step S22). FIG. 11 shows an example of a monitor screen on which a composite image is displayed.

  Next, whether the operation has ended is determined based on whether an end instruction has been input from the operation unit 254; if the operation has not ended, the process returns to step S12, and if it has ended, the surgery support processing is terminated (step S24).

  In this embodiment, the blood vessel image is an image having a gradation corresponding to the density information, but is not limited to this, and may be a binarized image. In this case, the blood vessel image can be easily confirmed, but since there is no density difference in the blood vessel image, information on the depth direction of the blood vessel cannot be obtained.

[Other display examples of composite image]
FIGS. 12 and 13 each show another example of a monitor screen on which an image including the composite image is displayed.

  On the monitor screen shown in FIG. 12, the composite image A and the normal image B are displayed side by side. In the monitor screen shown in FIG. 13, the composite image A, the normal image B, and the blood vessel image C are displayed side by side. Note that the monitor screens shown in FIGS. 11, 12, and 13 may be appropriately switched by an operation on the operation unit 254.

[Other Embodiments of Surgery Support Device]
In the laparoscope 100 of the above embodiment, the CCD 140 having R, G, and B color filters is used. However, the present invention is not limited to this, and a monochrome CCD (not shown) may be used.

  In this case, when visible light is incident on the light guide 170 from the light source device 300, R, G, and B visible light are sequentially incident.

  That is, the light source device 300 uses the rotary filter 329 shown in FIG. 14 instead of the rotary filter 320 shown in FIG. 4. The rotary filter 329 is provided with an R filter 328R, a G filter 328G, a B filter 328B, and a near-infrared BPF 324, each occupying an angular range of about 90 degrees. An infrared cut filter (not shown) is also superimposed on the R filter 328R, the G filter 328G, and the B filter 328B.

  When the R filter 328R, the G filter 328G, the B filter 328B, and the near-infrared BPF 324 are respectively positioned in front of the light source 310, the rotary filter 329 transmits R, G, and B visible light and near-infrared excitation light in turn.

  By rotating the rotary filter 329 at 15 revolutions per second, frame-sequential light in which R light, G light, B light, and excitation light switch every 1/60-second period can be generated.

  FIG. 15 is a timing chart of the signal processing when the subject is irradiated with the frame-sequential R light, G light, B light, and excitation light.

  As shown in FIG. 15A, R light, G light, B light, and excitation light are emitted in order from the light source device 300 in synchronization with the VD signal, and normal image exposures with the R, G, and B light and a fluorescence image exposure with the excitation light are performed (FIGS. 15B and 15F).

  After each normal image exposure with R, G, or B light and each fluorescence image exposure with excitation light, the image signal is read out from the CCD 140 in synchronization with the next VD signal (FIGS. 15C and 15G), and after readout, image processing is performed by the normal image processing unit 224 and the fluorescence image processing unit 226 (FIGS. 15D and 15H).

  In the case of R, G, and B frame-sequential image signal processing, the synchronization processing corresponding to the color filter arrangement is unnecessary; the other image processing is almost the same as the processing described above using the image signal obtained from the color CCD 140. In FIG. 15D, the image processing is started after the reading of the three R, G, and B color images is completed, but the processing may instead be started on each color image as it is read.

  When the image processing by the normal image processing unit 224 and the fluorescence image processing unit 226 is completed, the normal image and the fluorescence image (blood vessel image) are each output four frames at a time so as to form continuous frames, as shown in FIG. 15.

In FIG. 15D, one color image is generated from the three images R1, G1, B1, and the next color image is generated from the next three images R2, G2, B2. Alternatively, a color image may be generated from the three most recent single-color images each time a new one is acquired, that is, R1G1B1 → G1B1R2 → B1R2G2 → ..., which yields a smoother live-view image.
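  One way to picture this rolling update (a sketch, not the patent's implementation; the channel labels and class name are assumptions) is to keep the three most recent single-color frames and rebuild a full-color image whenever any one of them is refreshed:

```python
# Sketch: maintain the latest R, G, B frames and emit a new color image each
# time any one of them is updated (R1G1B1 -> G1B1R2 -> B1R2G2 -> ...).
import numpy as np

class RollingColor:
    def __init__(self):
        self.latest = {"R": None, "G": None, "B": None}

    def update(self, channel: str, frame: np.ndarray):
        self.latest[channel] = frame
        if all(v is not None for v in self.latest.values()):
            # Stack the most recent R, G, B planes into one color image.
            return np.dstack([self.latest["R"], self.latest["G"], self.latest["B"]])
        return None   # not enough single-color frames yet to form a color image
```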

[Other Embodiments of Light Source Device]
FIG. 16 is a block diagram showing another embodiment of a light source device applicable to the present invention. Parts common to the light source device 300 shown in FIG. 3 are given the same reference numerals, and detailed description of them is omitted.

  The light source device 300′ illustrated in FIG. 16 is the light source device 300 illustrated in FIG. 3 with a laser control unit 500, a semiconductor laser 510, a reflection mirror 520, and a half mirror 530 added. As the rotary filter 329, a filter in which the near-infrared BPF 324 portion of the rotary filter 320 shown in FIG. 4 is blocked is used.

  When the light source device 300′ having the above configuration emits visible light, the white light emitted from the light source 310 is limited to the visible wavelength range by the infrared cut filter of the rotary filter 329, and the visible light transmitted through the infrared cut filter is incident on the entrance end face of the light guide 170 through the diaphragm 330, the condenser lens 340, and the half mirror 530.

  On the other hand, when the excitation light is emitted, the laser control unit 500 controls the semiconductor laser 510 to emit laser light intermittently. The light emission period of the semiconductor laser 510 is controlled to synchronize with the light shielding period in which the white light emitted from the light source 310 is shielded by the rotary filter 329.

  The semiconductor laser 510 can emit near-infrared laser light (excitation light) in the vicinity of 800 nm, and this excitation light is incident on the incident end face of the light guide 170 via the reflection mirror 520 and the half mirror 530.

  In addition, a white light emitting diode can be used instead of the light source 310 shown in FIG. 16, and the rotary filter 329 and the like can be omitted by controlling the white light emitting diode on / off.

<Second Embodiment>
FIG. 17 is a flowchart showing a second embodiment of the surgery support method and apparatus according to the present invention.

  In the second embodiment, a warning is displayed when a surgical instrument and a blood vessel are close to each other on the screen on which the composite image is displayed according to the first embodiment. Hereinafter, a process for displaying a warning will be described.

In FIG. 17, a normal image is first input (step S50), and the input normal image is image-processed to detect the tip position PO of the electric scalpel (step S52). To detect the tip position PO, an image of the electric scalpel is first detected in the normal image. Known methods can be used for this, such as edge detection, shape pattern detection, vectorization of the feature points of the electric scalpel followed by approximate matching of the feature-point vectors, or detection based on the hue characteristics of the electric scalpel being used.

Once the electric scalpel has been detected, the tip position PO can be found by tracing the edge of the detected scalpel.

Next, the distance L from the detected tip position PO of the electric scalpel to the nearest blood vessel is calculated (step S54). The distance L is calculated as follows.

As shown in FIG. 18, the blood vessel image is scanned radially outward from the tip position PO of the electric scalpel 30. For example, the pixel values of the blood vessel image are scanned from PO in the 12 o'clock direction, the position of the first pixel at which a blood-vessel pixel value appears is taken as the vessel wall, and the distance on the imaging plane of the CCD 140 between PO and that vessel wall is calculated. This is repeated while rotating the scan direction by a predetermined angle through a full revolution, and the minimum of the calculated distances is taken as the shortest distance L.
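  A minimal sketch of this radial scan on the binarized vessel image follows (distances in pixels; the angular step and maximum search radius are assumptions, not values from the patent).

```python
# Sketch of step S54: from the scalpel tip, step outward along rays at fixed
# angular increments and record the first vessel pixel hit on each ray; the
# minimum such radius is the shortest on-image distance L (in pixels).
import math
import numpy as np

def shortest_vessel_distance(vessel_mask: np.ndarray, tip, step_deg: float = 5.0,
                             max_radius: int = 400):
    h, w = vessel_mask.shape
    best = None
    for deg in np.arange(0.0, 360.0, step_deg):
        # deg = 0 points toward 12 o'clock (up); angles increase clockwise.
        dy, dx = -math.cos(math.radians(deg)), math.sin(math.radians(deg))
        for r in range(1, max_radius):
            y, x = int(round(tip[0] + r * dy)), int(round(tip[1] + r * dx))
            if not (0 <= y < h and 0 <= x < w):
                break
            if vessel_mask[y, x]:
                best = r if best is None else min(best, r)
                break
    return best   # None if no vessel is found within max_radius
```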

  Next, the actual distance L′ corresponding to the distance L calculated above is computed (step S56). The actual distance L′ can be obtained, for example, by detecting the imaging distance to the surface of the living tissue and converting based on that distance, or by detecting on the normal image the size of a reference scale of known dimensions (for example, the tip of the treatment tool) and converting based on that size.

  As a method for detecting the imaging distance to the surface of the living tissue, a laser beam is emitted from the tip of the laparoscope 100 toward the center of the captured image. Since the position of the bright spot of the laser beam changes according to the photographing distance to the surface of the living tissue, the photographing distance can be obtained by detecting the position of the bright spot of the laser beam on the screen. In addition, by emitting a plurality of laser beams, the distances at a plurality of locations on the surface of the living tissue can be obtained.

  When the photographing distance X has been detected, the actual distance L′ can be obtained by multiplying the distance L on the image plane by the ratio X/f of the photographing distance X to the focal length f of the objective lens 130.

The actual distance L′ thus calculated from the tip position PO of the electric scalpel 30 to the nearest blood vessel is then compared with a reference distance Lref to determine whether the situation is dangerous, that is, whether L′ is less than or equal to Lref (step S58).

If the determination result is "Yes" (L′ ≤ Lref), a warning is displayed on the composite image (step S60). If the determination result is "No" (L′ > Lref), the process skips step S60 and proceeds to step S62 without displaying a warning.
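  A sketch of steps S56 to S58, using the conversion described above (L′ = L · X / f); the pixel pitch used to express L in sensor units and the reference distance are illustrative assumptions.

```python
# Sketch of steps S56-S58: convert the on-image distance L (pixels) into an
# actual distance L' using the photographing distance X and focal length f,
# then decide whether a warning should be shown.
def needs_warning(L_pixels: float, pixel_pitch_mm: float,
                  X_mm: float, f_mm: float, L_ref_mm: float = 5.0) -> bool:
    L_image_mm = L_pixels * pixel_pitch_mm      # distance on the imaging surface
    L_actual_mm = L_image_mm * (X_mm / f_mm)    # scaling described in the text
    return L_actual_mm <= L_ref_mm              # "Yes" branch of step S58
```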

  FIG. 19 shows an example of the warning display. In the example shown, the warning text “Approaching!” is displayed, and a leader line indicates the approaching blood vessel while the actual distance (“2 mm” in this example) is displayed at the same time. A pointer such as an arrow may also be displayed at the position of the blood vessel, or a dot may be placed at the point being measured. The actual distance may also be displayed as a bar or a meter.

  To display the warning text and the actual distance on the monitor device 400, the CPU 210 outputs a signal indicating the warning text and the actual distance to the character generator (CG) 244, and the CG 244 outputs the corresponding character information. As shown in FIG. 10, the mixer 234 of the image composition unit 230 superimposes the warning text and the actual distance on the composite image based on the character information input from the CG 244. As a result, a composite image combined with the warning text and the actual distance can be displayed on the screen of the monitor device 400.

  The warning text and the actual distance may be displayed with a changed color, blinked, repeatedly enlarged and reduced, or shown enlarged. This draws attention to the warning text and the actual distance and prevents the warning from being overlooked.

  In this embodiment, the warning character and the actual distance are displayed simultaneously. However, the present invention is not limited to this, and one of the warning character and the actual distance may be displayed.

  In addition to, or instead of, displaying the warning text and the actual distance, the CPU 210 may output a warning voice command to the audio processing unit 250, which then outputs an audio signal to the speaker 252 so that a warning tone or a warning voice is produced.

  Returning to FIG. 17, in step S62 it is determined whether the operation has ended, based on whether an operation-end instruction has been input from the operation unit 254. If the operation has not ended, the process proceeds to step S64; if it has ended, the surgery support process is terminated.

In steps S64 and S66, a new normal image is input and the tip position P O of the electric knife is detected, in the same manner as in steps S50 and S52. It is then determined whether the tip position P O has moved from its previous position (step S68): if it has moved, the process returns to step S54; if it has not moved, the process proceeds to step S58.

<Third Embodiment>
FIG. 20 is a flowchart showing a third embodiment of the surgery support method and apparatus according to the present invention.

  The third embodiment is characterized in that a safe surgical route is calculated and superimposed on the screen on which the composite image is displayed according to the first embodiment. Hereinafter, processing for displaying a surgical route and the like will be described.

In FIG. 20, the surgeon inputs a desired start point position P A and end point position P B of the operation while viewing the composite image (step S70). The start point position P A and the end point position P B are input using the operation unit 254 and the screen of the monitor device 400. For example, the operation mode of the surgery support apparatus 10 is first set to a start/end point input mode; the mouse pointer is then moved to the desired start point on the screen of the monitor device 400, and the start point position P A is set by clicking the mouse. The end point position P B is set in the same way.

FIG. 21A shows an example of a monitor screen when the start point position P A and the end point position P B are set. In addition, it is preferable to display a marker at each of the set start point position P A and end point position P B.

After the start point position P A and the end point position P B have been set, the operator inputs an instruction to generate a surgical route from the operation unit 254, and the CPU 210 calculates a surgical route from the start point position P A to the end point position P B that does not pass through any dangerous region (step S72).

An example of the surgical route calculation is described with reference to FIG. 21. In the example shown in FIG. 21(B), midpoints P 1 to P 7 between blood vessels are obtained in order from the start point position P A toward the end point position P B. For example, to obtain the midpoint P 1, the blood vessels flanking, or crossing, the line segment connecting the start point position P A and the end point position P B are found. In the example of FIG. 21(B), the blood vessels X 1 and X 2 are found, and the position at which these blood vessels X 1 and X 2 come closest to each other is determined. The midpoint P 1 between the blood vessels X 1 and X 2 at that closest position is then calculated.

Similarly, to obtain the midpoint P 2, the blood vessels flanking, or crossing, the line segment connecting the midpoint P 1 obtained above and the end point position P B are found. In the example of FIG. 21(B), the blood vessels X 2 and X 3 are found, the position at which these blood vessels X 2 and X 3 come closest is determined, and the midpoint P 2 at that closest position is calculated.

On the other hand, when the midpoint P 3 obtained as described above is connected to the end point position P B by a straight line, the line segment crosses the blood vessel X 5. In this case, the position at which the crossed blood vessel X 5 comes closest to the blood vessel X 4, which lies on the side used to bypass X 5, is found, and the midpoint P 4 between these positions is calculated.

When the midpoints P 1 to P 7 have all been calculated in this manner, a surgical route C connecting the start point position P A and the end point position P B and passing through each of the midpoints P 1 to P 7 is calculated, as shown in FIG. 21(C). The surgical route C passing through these points is calculated, for example by spline interpolation, so that it forms a smooth curve as a whole. Note that the actual distance between each midpoint and the neighboring blood vessels is also calculated, and if this actual distance is less than or equal to a threshold set in advance as a dangerous distance, another surgical route is calculated.
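
The spline smoothing through the waypoints can be illustrated with SciPy (illustrative only; it assumes the start point, the midpoints P 1 to P 7, and the end point are available as image coordinates):

```python
import numpy as np
from scipy.interpolate import splprep, splev

def smooth_route(waypoints, n_samples=200):
    """Fit a smooth parametric spline through P_A, P_1..P_7, P_B and return
    sampled points along the resulting route C."""
    pts = np.asarray(waypoints, dtype=float)
    # s=0 forces the curve through every waypoint; k is reduced for short lists
    tck, _ = splprep([pts[:, 0], pts[:, 1]], k=min(3, len(pts) - 1), s=0)
    u = np.linspace(0.0, 1.0, n_samples)
    x, y = splev(u, tck)
    return np.stack([x, y], axis=1)
```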

Returning to FIG. 20, when the calculation of the surgical route is complete, image patches along the surgical route (for example, images in the vicinity of the start point position P A, the end point position P B, and each of the midpoints P 1 to P 7) are stored together with the surgical route, in association with their positions on the screen (step S74).

  Next, whether the image has changed is determined by comparison with the stored images (step S76). Such a change is caused by a change in the imaging range of the laparoscope 100 or by movement of the imaged tissue during the operation.

  If the image does not change, the surgical route calculated in step S72 is displayed superimposed on the composite image displayed on the monitor device 400 (step S80).

  To display the surgical route on the monitor device 400, surgical route information indicating the route is output from the CPU 210 to the image composition unit 230. As shown in FIG. 10, the mixer 236 of the image composition unit 230 superimposes the surgical route on the composite image based on the surgical route information input from the CPU 210. The composite image with the superimposed surgical route can thus be displayed on the screen of the monitor device 400.

If, on the other hand, it is determined that the image has changed, the shift of each image patch relative to the images stored in step S74 is calculated, the positions of the start point P A, the end point P B, and each of the midpoints P 1 to P 7 are corrected by the corresponding shift amounts, and the surgical route is recalculated based on the corrected positions (step S78). In step S80, the recalculated surgical route is displayed.
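
One plausible realization of step S78 (an assumption, not the method prescribed by the disclosure) is normalized cross-correlation of each stored patch against the new frame; the sketch below presumes the waypoints lie far enough from the image border that the search window always contains the template:

```python
import cv2

def track_waypoint(prev_patch, new_frame_gray, prev_xy, search=60):
    """Find where a stored grayscale patch centred on a waypoint has moved to
    in the new frame; returns the corrected position and the shift."""
    ph, pw = prev_patch.shape
    x, y = prev_xy
    h, w = new_frame_gray.shape
    x0, y0 = max(0, x - pw // 2 - search), max(0, y - ph // 2 - search)
    x1, y1 = min(w, x + pw // 2 + search), min(h, y + ph // 2 + search)
    window = new_frame_gray[y0:y1, x0:x1]
    res = cv2.matchTemplate(window, prev_patch, cv2.TM_CCOEFF_NORMED)
    _, _, _, (mx, my) = cv2.minMaxLoc(res)               # best match location
    new_x, new_y = x0 + mx + pw // 2, y0 + my + ph // 2
    return (new_x, new_y), (new_x - x, new_y - y)
```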

  Thereafter, it is determined whether the operation has ended, based on the presence or absence of an operation-end instruction from the operation unit 254. If the operation has not ended, the process returns to step S76; if it has ended, the surgery support process is terminated (step S82).

[Modification of Third Embodiment]
After the surgical route has been calculated in step S72 of FIG. 20, it may be displayed on the monitor screen so that it can be corrected based on an instruction input by the operator (for example, by dragging the route with the mouse). Alternatively, as shown in FIG. 22, the actual distance between the surgical route and the blood vessels may be calculated, and the actual distance or a warning may be displayed at locations requiring attention. In this way, when an incision is made along the surgical route, particular care can be taken at the locations that require it.

Furthermore, the input method of the start point position P A and the end point position P B of the operation is not limited to this embodiment. For example, in the case of a monitor device with a touch panel, it may be performed by a touch pen or the like.

Also, as shown in FIG. 23, after the start point position P A and the end point position P B of the operation have been input and the surgical route is displayed (FIGS. 23(A) and (B)), a plurality of points on the surgical route may be marked with the electric knife. The points to be marked are selected from the start point position P A, the end point position P B, and the points needed to calculate the surgical route (Q 1 to Q 6). After marking is finished, image patches containing these points and their positions are stored instead of the images stored in step S74 of FIG. 20. This makes it easier to extract the feature points on the surgical route, so the displayed route can be updated more easily as the image changes.

[Other calculation and display methods for surgical route]
FIG. 24 is a diagram for explaining another calculation method of the surgical route.

  The calculation method shown in FIG. 21 routes the surgical path through the midpoints between blood vessels and therefore yields the safest surgical route, whereas the calculation method shown in FIG. 24 yields the shortest surgical route that does not pass through any dangerous area.

As shown in FIG. 24, when the start point position P A and the end point position P B of the operation are designated, the blood vessels X 1, X 2, ... near the line segment connecting the start point position P A and the end point position P B are found (FIG. 24(A)). Danger areas A 1, A 2, ... lying within a certain distance of the blood vessels X 1, X 2, ... are then set (FIG. 24(B)). The danger areas A 1, A 2, ... are set based on a preset actual distance between a surgical route and a blood vessel wall that is regarded as dangerous.

Next, the start point position P A and the end point position P B of the operation are connected by a straight line (FIG. 24(C)). If this line segment does not pass through any of the danger areas A 1, A 2, ..., it is adopted as the surgical route; if it passes through one of the danger areas A 1, A 2, ..., a detour route is calculated as follows.

That is, as shown in FIG. 24(D), the point P 1 reachable from the start point position P A by the shortest path that does not pass through the danger area A 1 is determined, and the straight segment connecting the position P A and the position P 1 is taken as part of the surgical route. Next, the point P 2 reachable from the position P 1 by the shortest path that does not pass through the danger area A 2 is determined, and then the point P 1 ′ reachable from the position P 2 by the shortest path that does not pass through the danger area A 1 is obtained. The surgical route is then defined as the path that follows the arc of the danger area A 1 between the positions P 1 and P 1 ′ and connects the position P 1 ′ and the position P 2 by a straight line. Similarly, the surgical route follows the arc of the danger area A 2 between the positions P 2 and P 2 ′ and connects the position P 2 ′ and the position P B by a straight line (FIG. 24(E)).

  This makes it possible to calculate the shortest surgical route that does not pass through the dangerous area.
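
Geometrically, the detour of FIGS. 24(D) and (E) is a taut path made of straight segments that just graze the circular danger areas plus arcs along their boundaries. Tangent points such as P 1 and P 1 ′ can be computed as in the following sketch, which idealizes a danger area as a circle (an assumption made here purely for illustration):

```python
import math

def tangent_points(px, py, cx, cy, r):
    """Tangent points from an external point (px, py) to a circular danger
    area centred at (cx, cy) with radius r; a grazing straight segment of the
    route passes through one of them."""
    dx, dy = cx - px, cy - py
    d = math.hypot(dx, dy)
    if d <= r:
        raise ValueError("point lies inside the danger area")
    a = math.atan2(dy, dx)            # direction from the point to the centre
    b = math.asin(r / d)              # half-angle between the two tangents
    t = math.sqrt(d * d - r * r)      # distance to either tangent point
    return [(px + t * math.cos(a - b), py + t * math.sin(a - b)),
            (px + t * math.cos(a + b), py + t * math.sin(a + b))]
```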

Alternatively, as shown in FIG. 25, instead of calculating a surgical route, a safe region between the start point position P A and the end point position P B of the operation may be calculated, and the calculated safe region and the dangerous region (the area other than the safe region) may be displayed in a distinguishable manner.

The safe region can be obtained by calculating the dangerous region. For example, a danger region is set around each blood vessel in the vicinity of the line segment connecting the start point position P A and the end point position P B, in the same manner as described above, and the overall dangerous region is calculated as the union of these danger regions. The area other than the calculated dangerous region is then obtained as the safe region.
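
One simple way to realize this (an assumption, not the only possibility) is a distance transform of the binary vessel image: every pixel closer to a vessel than the dangerous distance belongs to the dangerous region, and the remainder is the safe region. Here danger_radius_px would come from converting the dangerous actual distance into pixels:

```python
import cv2
import numpy as np

def safe_area_mask(vessel_mask, danger_radius_px):
    """Split the image into safe and dangerous regions based on the distance
    of each pixel to the nearest extracted vessel pixel."""
    # distanceTransform measures the distance to the nearest zero pixel, so
    # vessels must be zero and background non-zero in the input.
    dist = cv2.distanceTransform((vessel_mask == 0).astype(np.uint8),
                                 cv2.DIST_L2, 5)
    danger = dist <= danger_radius_px
    return ~danger, danger            # safe mask, danger mask
```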

  The safe area and the dangerous area can be displayed so as to be distinguishable by changing the density and color of the composite image.

  In addition, a plurality of surgical routes, such as the safest route and the shortest route, may be displayed simultaneously in a distinguishable manner, or only a route selected from the plurality of routes may be displayed. The selected surgical route may also be highlighted, or the non-selected routes may be displayed inconspicuously (for example, by changing their color or thinning their lines).

[Other embodiments]
The embodiments above describe the use of a laparoscope, but the present invention is not limited to this; various endoscopes (upper gastrointestinal endoscope, small intestine endoscope, colonoscope, thoracoscope, laryngoscope, bronchoscope, cystoscope, cholangioscope, arthroscope, and so on) may be used. In short, any device can be used as long as it can capture both a normal image and a fluorescence image.

  Further, when the blood vessel image is extracted, it is preferable to measure the actual blood vessel diameter using the technique described in the second embodiment and to remove from the blood vessel image, by image processing, thin blood vessels (such as capillaries) whose bleeding can be stopped by coagulation with an electric knife or the like.
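
Such thin vessels could, for example, be removed with a morphological opening whose structuring element matches the smallest diameter to keep; the sketch below is one possible realization under that assumption (the diameter-to-pixel conversion is taken to come from the actual-distance calculation of the second embodiment):

```python
import cv2
import numpy as np

def drop_thin_vessels(vessel_mask, min_diameter_px):
    """Erase vessels thinner than min_diameter_px from the binary vessel
    image; an opening with a disc of that size removes thinner structures."""
    k = max(1, int(min_diameter_px))
    kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (k, k))
    return cv2.morphologyEx(vessel_mask.astype(np.uint8), cv2.MORPH_OPEN, kernel)
```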

  Furthermore, the type and blood concentration of the angiographic contrast agent, and the type and intensity of the excitation light, must be set so that blood vessels down to at least the depth incised in one pass of the treatment tool (for example, about 3 mm in the case of an electric knife) can be imaged.

  Furthermore, the normal image and the fluorescence image need not be captured strictly alternately, one by one; one fluorescence image may be captured after every several consecutive normal images. This is because the normal image is used as a live-view image in which the actual treatment tool and the like appear and therefore requires real-time performance, whereas the fluorescence image is used only to extract the blood vessel image and poses no problem as long as it does not deviate significantly from the normal image.

  The treatment tool is not limited to an electric knife; surgical instruments such as an ultrasonic knife, a microwave knife, a laser knife, and a cryosurgical knife are also conceivable.

  Furthermore, the present invention is not limited to the above examples, and it goes without saying that various improvements and modifications may be made without departing from the scope of the present invention.

FIG. 1 is an external view showing an embodiment of a surgery support apparatus according to the present invention.
FIG. 2 is a block diagram showing the internal configuration of the surgery support apparatus.
FIG. 3 is a chart showing how images in the image recording unit are classified by group.
FIG. 4 is a plan view of the rotary filter.
FIG. 5 is a graph showing the relationship between the wavelength of light and the light absorption rate of living tissue.
FIG. 6 is a flowchart showing the first embodiment of the surgery support method and apparatus according to the present invention.
FIG. 7 is a timing chart of signal processing when the subject is irradiated with visible light and excitation light alternately.
FIG. 8 is a flowchart showing the blood vessel image generation (blood vessel image extraction) process.
FIGS. 9A and 9B are schematic diagrams of a normal image and a blood vessel image, respectively, output to the image composition unit 230.
FIG. 10 is a block diagram showing the internal configuration of the image composition unit.
FIG. 11 is a diagram showing an example of a monitor screen on which a composite image is displayed.
FIG. 12 is a diagram showing another example of a monitor screen on which an image including a composite image is displayed.
FIG. 13 is a diagram showing still another example of a monitor screen on which an image including a composite image is displayed.
FIG. 14 is a plan view showing another example of the rotary filter.
FIG. 15 is a timing chart of signal processing when the subject is irradiated with frame-sequential light of R light, G light, B light, and excitation light.
FIG. 16 is a block diagram showing another embodiment of a light source device applied to the present invention.
FIG. 17 is a flowchart showing the second embodiment of the surgery support method and apparatus according to the present invention.
FIG. 18 is a diagram used to explain the method of calculating the distance from the tip position of the electric knife to the nearest blood vessel.
FIG. 19 shows an example of a warning display.
FIG. 20 is a flowchart showing the third embodiment of the surgery support method and apparatus according to the present invention.
FIG. 21 is a diagram used to explain a method for calculating a surgical route.
FIG. 22 is a diagram showing an example of a monitor screen displaying blood vessels and a surgical route.
FIG. 23 is a diagram used to explain a method of designating points on a surgical route.
FIG. 24 is a diagram used to explain another method of calculating the surgical route.
FIG. 25 is a diagram showing a display example in which the safe area and the dangerous area during surgery are displayed in an identifiable manner.

Explanation of symbols

  DESCRIPTION OF SYMBOLS 10 ... Surgery support apparatus, 30 ... Electric knife, 100 ... Laparoscope, 140 ... CCD, 170 ... Light guide, 200 ... Processor, 210 ... Central processing unit (CPU), 224 ... Normal image processing part, 226 ... Fluorescence image processing , 230 ... Image composition part, 244 ... Character generator (CG), 248 ... Video output part, 250 ... Audio processing part, 252 ... Speaker, 254 ... Operation part, 300 ... Light source device, 310 ... Light source, 320, 329 ... Rotating filter, 322... Infrared cut filter, 324. Near infrared bandpass filter, 350... Mode drive circuit, 360... Motor, 400. 530 ... Half mirror

Claims (30)

  1. Continuously irradiating a subject to which an angiographic contrast agent is administered with excitation light in a specific wavelength region for causing the angiographic contrast agent to emit light at a predetermined time interval;
    A step of continuously photographing the subject irradiated with the excitation light by the imaging means and acquiring a fluorescence image;
    Irradiating the subject with visible light continuously during a non-irradiation period of the excitation light;
    Continuously capturing images of the subject irradiated with the visible light by the imaging means, and obtaining a normal image;
    Extracting the blood vessel image by thresholding the acquired fluorescent image with a predetermined threshold;
    Creating a composite image in which the extracted blood vessel image is superimposed on the acquired normal image;
    Continuously displaying the created composite image as a moving image on a display means;
    A surgical operation support method comprising:
  2.   The operation support method according to claim 1, wherein the extracted blood vessel image is a blood vessel image including density information of the fluorescent image, or a binarized blood vessel image.
  3. The step of creating the composite image includes
    The surgery support method according to claim 1 or 2, wherein the blood vessel image is key-synthesized with the normal image using a binarized signal indicating the blood vessel image extracted by the threshold processing and the background thereof as a key signal. .
  4.   The surgery support method according to any one of claims 1 to 3, wherein the step of displaying the composite image includes simultaneously displaying, on the display means, the composite image and at least one of the normal image, the fluorescence image, and the blood vessel image.
  5.   The surgery support method according to any one of claims 1 to 4, wherein the fluorescence image includes an image of blood vessels existing from the surface of the living tissue of the subject down to a predetermined depth, based on the transmission characteristics of the excitation light and on the emission characteristics and concentration of the angiographic contrast agent.
  6.   The operation support method according to claim 5, wherein the predetermined depth is equal to or greater than a depth incised at once by a treatment instrument for endoscopic surgery.
  7. Detecting a tip position of a surgical treatment tool based on the normal image;
    Calculating an actual distance from the tip position of the treatment tool to the nearest blood vessel based on the detected tip position of the treatment tool and the position information of the blood vessel in the extracted blood vessel image;
    A step of issuing a warning when the calculated actual distance is equal to or less than a predetermined distance;
    The operation support method according to claim 1, further comprising:
  8.   The operation support method according to claim 7, wherein in the step of issuing the warning, a warning character is superimposed and displayed on the display unit.
  9.   The operation support method according to claim 7 or 8, wherein in the step of issuing the warning, the calculated actual distance is superimposed and displayed on the display means.
  10.   In the step of issuing the warning, when displaying the warning character or the actual distance, a display for changing the color of the warning character or the actual distance, a display for blinking the warning character or the actual distance, the warning character or the actual distance The operation support method according to claim 8 or 9, wherein at least one of display that repeats enlargement / reduction and display that enlarges the warning character or the actual distance is performed.
  11.   The operation support according to any one of claims 7 to 10, wherein in the step of issuing the warning, the warning character or the actual distance is displayed in the vicinity of the distal end position of the treatment instrument displayed on the display means. Method.
  12.   The operation support method according to any one of claims 7 to 11, wherein the step of issuing the warning generates a warning sound or a warning voice.
  13. Receiving an operation start point and an end point on the screen of the display means;
    Calculating, based on the received position information of the start and end points of the operation and the position information of the blood vessels in the extracted blood vessel image, a surgical route that connects the start point and the end point of the operation without passing through the blood vessels or the dangerous regions near the blood vessels;
    Displaying the calculated surgical route superimposed on the composite image;
    The operation support method according to claim 1, further comprising:
  14.   The operation support method according to claim 13, wherein the step of calculating the operation route calculates a safest route connecting the start point and end point of the operation or a shortest route as the operation route.
  15. Receiving an operation start point and an end point on the screen of the display means;
    Calculating a safety region other than the blood vessel and the dangerous region near the blood vessel based on the received position information of the start and end points of the surgery and the blood vessel position information in the extracted blood vessel image;
    Displaying the calculated safety area on the display means in an identifiable manner;
    The operation support method according to claim 1, further comprising:
  16. First light source means for continuously irradiating a subject to which an angiographic contrast agent has been administered with excitation light in a specific wavelength region for causing the angiographic contrast agent to emit light at a predetermined time interval;
    Second light source means for continuously irradiating the subject with visible light during a non-irradiation period of the excitation light;
    Imaging means for alternately and repeatedly imaging a subject in synchronization with irradiation of excitation light or visible light by the first and second light source means;
    Image acquisition means for acquiring, from among the images captured by the imaging means, an image captured in synchronization with the excitation light irradiation as a fluorescence image and an image captured in synchronization with the visible light irradiation as a normal image;
    A blood vessel image extracting means for extracting a blood vessel image by performing threshold processing on the fluorescence image acquired by the image acquiring means with a predetermined threshold;
    Synthetic image creation means for creating a composite image in which the blood vessel image extracted by the blood vessel image extraction means is superimposed on the normal image acquired by the image acquisition means;
    A composite image output means for outputting the composite image created by the composite image creation means as a moving image to the display means;
    An operation support apparatus comprising:
  17.   The surgical operation support apparatus according to claim 16, wherein the first light source means generates excitation light in a near-infrared wavelength region.
  18.   The surgery support apparatus according to claim 16 or 17, wherein the blood vessel image extraction unit extracts a blood vessel image including density information of the fluorescent image or a binarized blood vessel image.
  19.   The composite image creating means creates a key signal composed of a binarized signal indicating the blood vessel image extracted by the threshold processing and the background thereof, and adds the blood vessel to the normal image based on the created key signal. The surgery support apparatus according to claim 16, further comprising a key synthesizing unit for images.
  20.   The composite image output means causes the display means to simultaneously display at least one of the normal image, the fluorescence image, and the blood vessel image together with the composite image. The surgical operation support device described.
  21. A treatment instrument detection means for detecting a tip position of a surgical treatment instrument based on the normal image;
    Distance calculating means for calculating an actual distance from the distal end position of the treatment instrument to the nearest blood vessel based on the detected distal end position of the treatment instrument and blood vessel position information in the extracted blood vessel image;
    Warning generating means for issuing a warning when the calculated actual distance is equal to or less than a predetermined distance;
    The surgery support apparatus according to claim 16, further comprising:
  22.   The operation support apparatus according to claim 21, wherein the warning generation unit displays a warning character superimposed on the display unit.
  23.   The surgery support apparatus according to claim 21 or 22, wherein the warning generation unit displays the calculated actual distance on the display unit in a superimposed manner.
  24.   The surgery support apparatus according to claim 22 or 23, wherein, when displaying the warning character or the actual distance, the warning generation means performs at least one of a display that changes the color of the warning character or the actual distance, a display that blinks the warning character or the actual distance, a display that repeatedly enlarges and reduces the warning character or the actual distance, and a display that enlarges the warning character or the actual distance.
  25.   The operation support apparatus according to any one of claims 22 to 24, wherein the warning generation unit displays the warning character or the actual distance in the vicinity of the distal end position of the treatment instrument displayed on the display unit. .
  26.   The operation support apparatus according to any one of claims 21 to 25, wherein the warning generation unit generates a warning sound or a warning sound.
  27. Input means for inputting the start point and end point of the operation on the screen of the display means;
    Surgical route calculation means for calculating, based on the position information of the start and end points of the operation input by the input means and the position information of the blood vessels in the blood vessel image extracted by the blood vessel image extraction means, a surgical route that connects the start point and the end point of the operation without passing through the blood vessels or the dangerous regions near the blood vessels;
    Surgical route display means for displaying the calculated surgical route superimposed on the composite image;
    The surgery support apparatus according to any one of claims 16 to 26, further comprising:
  28.   28. The surgery support apparatus according to claim 27, wherein the surgery route calculation means calculates the safest route connecting the start point and the end point of the surgery or the shortest route as the surgery route.
  29. Input means for inputting the start point and end point of the operation on the screen of the display means;
    Safety area calculation means for calculating, based on the position information of the start and end points of the operation input by the input means and the position information of the blood vessels in the extracted blood vessel image, a safety area other than the blood vessels and the dangerous regions near the blood vessels;
    Safety area display means for displaying the calculated safety area on the display means in an identifiable manner;
    The surgery support apparatus according to any one of claims 16 to 26, further comprising:
  30.   The imaging means is an imaging means provided in an endoscope, and the first light source means and the second light source means generate the excitation light or visible light from the distal end of the endoscope. The surgery support apparatus according to any one of claims 16 to 29.
JP2008076412A 2008-03-24 2008-03-24 Image display method and apparatus Expired - Fee Related JP5160276B2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2008076412A JP5160276B2 (en) 2008-03-24 2008-03-24 Image display method and apparatus

Publications (2)

Publication Number Publication Date
JP2009226072A true JP2009226072A (en) 2009-10-08
JP5160276B2 JP5160276B2 (en) 2013-03-13

Family

ID=41242181

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2008076412A Expired - Fee Related JP5160276B2 (en) 2008-03-24 2008-03-24 Image display method and apparatus

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2002074330A (en) * 2000-09-01 2002-03-15 Fuji Photo Film Co Ltd Fluorescent diagnosis image display device
JP2006509573A (en) * 2002-12-13 2006-03-23 イエトメド リミテッド Optical inspection method and apparatus particularly useful for distinguishing between tumor and normal tissue in real time during surgery
JP2008018172A (en) * 2006-07-14 2008-01-31 Hitachi Medical Corp Surgery supporting system

Cited By (24)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9320437B2 (en) 2009-12-22 2016-04-26 Genial Light Co., Ltd. Intravital observation device
JP2012050617A (en) * 2010-08-31 2012-03-15 Fujifilm Corp Image acquisition method and device
JP2012065698A (en) * 2010-09-21 2012-04-05 Fujifilm Corp Operation support system, and operation support method using the same
JP2012145556A (en) * 2010-12-24 2012-08-02 Ai-Phase Co Ltd Data image recorder, thermal analysis apparatus, data image recording method, normalization method of image data, calculation method of thermophysical property and display method of recorded image
JP2012170641A (en) * 2011-02-22 2012-09-10 Olympus Corp Fluorescence observation apparatus
WO2012147820A1 (en) * 2011-04-28 2012-11-01 オリンパス株式会社 Fluorescent observation device and image display method therefor
JP6053673B2 (en) * 2011-04-28 2016-12-27 オリンパス株式会社 Fluorescence observation apparatus and image display method thereof
US9513219B2 (en) 2011-04-28 2016-12-06 Olympus Corporation Fluoroscopy apparatus and image display method therefor
WO2013014901A1 (en) * 2011-07-27 2013-01-31 富士フイルム株式会社 Photoacoustic imaging system and device, and probe unit used therein
CN103732153A (en) * 2011-07-27 2014-04-16 富士胶片株式会社 Photoacoustic imaging system and device, and probe unit used therein
JP2013027481A (en) * 2011-07-27 2013-02-07 Fujifilm Corp Photoacoustic imaging system and apparatus, and probe unit used therefor
JP2015531271A (en) * 2012-09-14 2015-11-02 ソニー株式会社 Surgical image processing system, surgical image processing method, program, computer-readable recording medium, medical image processing apparatus, and image processing inspection apparatus
US10219701B2 (en) 2013-03-29 2019-03-05 Olympus Corporation Fluorescence observation apparatus
WO2014156493A1 (en) * 2013-03-29 2014-10-02 オリンパス株式会社 Fluorescence observation device
JP2014226341A (en) * 2013-05-23 2014-12-08 オリンパス株式会社 Endoscope apparatus and operation method of endoscope apparatus
WO2014188740A1 (en) * 2013-05-23 2014-11-27 オリンパス株式会社 Endoscopic device and endoscopic device operation method
US10456040B2 (en) 2014-09-18 2019-10-29 Shimadzu Corporation Imaging device
JP6138395B1 (en) * 2015-07-02 2017-05-31 オリンパス株式会社 Drive device
CN107405062A (en) * 2015-07-02 2017-11-28 奥林巴斯株式会社 Drive device
WO2017002516A1 (en) * 2015-07-02 2017-01-05 オリンパス株式会社 Drive device
JP2017012665A (en) * 2015-07-06 2017-01-19 オリンパス株式会社 Endoscopic examination data recording system
JP2017012666A (en) * 2015-07-06 2017-01-19 オリンパス株式会社 Endoscopic examination data recording system
WO2018225316A1 (en) * 2017-06-05 2018-12-13 オリンパス株式会社 Medical control device
WO2018235179A1 (en) * 2017-06-21 2018-12-27 オリンパス株式会社 Image processing device, endoscope device, method for operating image processing device, and image processing program

Also Published As

Publication number Publication date
JP5160276B2 (en) 2013-03-13

Legal Events

Date      Code  Title (Description)
20100615  A621  Written request for application examination (JAPANESE INTERMEDIATE CODE: A621)
20120418  A977  Report on retrieval (JAPANESE INTERMEDIATE CODE: A971007)
20120424  A131  Notification of reasons for refusal (JAPANESE INTERMEDIATE CODE: A131)
20120625  A521  Written amendment (JAPANESE INTERMEDIATE CODE: A523)
20120806  A02   Decision of refusal (JAPANESE INTERMEDIATE CODE: A02)
20120828  A521  Written amendment (JAPANESE INTERMEDIATE CODE: A523)
20121010  A911  Transfer of reconsideration by examiner before appeal (zenchi) (JAPANESE INTERMEDIATE CODE: A911)
          TRDD  Decision of grant or rejection written
20121126  A01   Written decision to grant a patent or to grant a registration (utility model) (JAPANESE INTERMEDIATE CODE: A01)
20121212  A61   First payment of annual fees (during grant procedure) (JAPANESE INTERMEDIATE CODE: A61)
          R150  Certificate of patent or registration of utility model (JAPANESE INTERMEDIATE CODE: R150)
          FPAY  Renewal fee payment, event date is renewal date of database (PAYMENT UNTIL: 20151221; Year of fee payment: 3)
          LAPS  Cancellation because of no payment of annual fees