WO2005084525A1 - Image processing device - Google Patents

Image processing device Download PDF

Info

Publication number
WO2005084525A1
WO2005084525A1 (PCT/JP2005/003880; JP2005003880W)
Authority
WO
WIPO (PCT)
Prior art keywords
image
input
images
image processing
selection screen
Prior art date
Application number
PCT/JP2005/003880
Other languages
French (fr)
Japanese (ja)
Inventor
Takechiyo Nakamitsu
Original Assignee
Olympus Corporation
Application filed by Olympus Corporation filed Critical Olympus Corporation
Publication of WO2005084525A1 publication Critical patent/WO2005084525A1/en

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B1/00 Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
    • A61B1/04 Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor combined with photographic or television appliances
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B1/00 Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
    • A61B1/00002 Operational features of endoscopes
    • A61B1/00004 Operational features of endoscopes characterised by electronic signal processing
    • A61B1/00009 Operational features of endoscopes characterised by electronic signal processing of image signals during a use of endoscope
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B1/00 Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
    • A61B1/00002 Operational features of endoscopes
    • A61B1/00043 Operational features of endoscopes provided with output arrangements
    • A61B1/00045 Display arrangement
    • A61B1/0005 Display arrangement combining images e.g. side-by-side, superimposed or tiled
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B8/00 Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B8/52 Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B8/5215 Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves involving processing of medical diagnostic data
    • A61B8/5238 Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves involving processing of medical diagnostic data for combining image data of patient, e.g. merging several images from different acquisition modes into one image
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B8/00 Diagnosis using ultrasonic, sonic or infrasonic waves
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B8/00 Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B8/12 Diagnosis using ultrasonic, sonic or infrasonic waves in body cavities or body tracts, e.g. by using catheters

Definitions

  • the present invention relates to an image processing device that processes a plurality of medical images.
  • a general endoscopic surgery system includes an endoscope for observation, a camera head connected to the endoscope, an endoscope camera device that processes the image signal captured by the camera head, a light source device that supplies illumination light to the subject, a monitor that displays the subject image, an insufflation device used to inflate the abdominal cavity, and a high-frequency cautery device (hereinafter, electric scalpel)
  • the camera head captures an optical image of the subject, and the imaging signal of the subject image is processed by the endoscope camera device.
  • the endoscope camera device generates an endoscope image from an imaging signal.
  • This endoscope image is output to a monitor from the endoscope camera device.
  • the treatment site on the endoscope image is displayed on the monitor, and various treatments are performed while observing the treatment site on the monitor.
  • each of these devices is mounted on an endoscope system trolley together with a system control device, as disclosed for example in Japanese Patent Application Laid-Open No. 7-303654, and used as an endoscope system with improved operability.
  • in a conventional system, only buttons 1001 indicating the input channels connected to the system are displayed on the display unit of the centralized operation panel 1000, so the main operator can only tell the number of input channels from the buttons 1001 and, when wanting to check the subject image monitored by another operator, cannot select the input channel of the desired subject image appropriately and quickly.
  • the present invention has been made in view of the above circumstances, and its object is to provide an image processing apparatus with which a plurality of input images can be selected more appropriately and quickly.
  • the image processing apparatus of the present invention comprises image input means for inputting a plurality of images; image capture means for capturing the plurality of images input from the image input means; image selection screen generation means for generating an image selection screen in which reduced images of the plurality of images captured by the image capture means are allocated to the respective input paths of the image input means; image selection means for selecting a reduced image on the generated image selection screen; and image output means for outputting the image of the selected reduced image to display means (a schematic code sketch of this pipeline follows this list).
  • FIG. 1 is a configuration diagram showing a configuration of a medical endoscope system according to a first embodiment of the present invention.
  • FIG. 2 is a block diagram showing the configuration of the system controller in FIG. 1.
  • FIG. 3 is a diagram showing an image selection screen displayed on the centralized operation panel in FIG. 1.
  • FIG. 4 is a diagram showing a key input screen displayed on a centralized operation panel according to Embodiment 2 of the present invention.
  • FIG. 5 is a diagram showing an image selection screen displayed on a centralized operation panel according to Embodiment 2 of the present invention.
  • FIG. 6 is a first diagram showing an image selection screen displayed on a centralized operation panel according to Embodiment 3 of the present invention.
  • FIG. 7 is a second diagram illustrating an image selection screen displayed on the centralized operation panel according to the third embodiment of the present invention.
  • FIGS. 1 to 3 relate to Embodiment 1: FIG. 1 is a configuration diagram showing the configuration of the medical endoscope system, FIG. 2 is a block diagram showing the configuration of the system controller in FIG. 1, and FIG. 3 is a diagram showing the image selection screen displayed on the centralized operation panel in FIG. 1.
  • the first endoscope camera device 110a, the first light source device 111a, the electric scalpel 112, the insufflation device 113, the VTR 114, the ultrasonic observation device 130, the in-hospital network device 131, and the external storage device 132 are connected to a system controller 100, a medical image processing device that controls the entire system.
  • the second endoscope camera device 110b, the second light source device 111b, the third endoscope camera device 110c, and the third light source device 111c are connected to the relay unit 101.
  • the relay unit 101 and the system controller 100 are connected by a relay cable 102, and each device is centrally controlled by the system controller 100.
  • a centralized operation panel 103 operated by a nurse or the like in a non-sterile area and a remote controller (hereinafter, remote control) 104 operated by an operator in a sterile area can operate the system controller 100.
  • when the system controller 100 is operated from the centralized operation panel 103 or the remote control 104, the system controller 100 can, for example, control the centralized display panel 105 beside the monitor and display intraoperative data on the centralized display panel 105.
  • the centralized operation panel 103 includes a display unit 103a such as a liquid crystal display and a touch sensor 103b integrally provided on the display unit (see FIG. 2).
  • the centralized operation panel 103 has a display function of displaying the status and operation switches of each device on the display unit 103a as a setting screen, and an operation function of operating those switches by touching predetermined areas of the touch sensor 103b.
  • the centralized operation panel 103 uses the display function to show, on the screen of the display unit 103a, the desired ones among the status displays and operation switches of the devices connected to the system controller 100.
  • Each controlled device connected to the system controller 100 transmits data to the system controller 100 via a communication line.
  • the system controller 100 displays a display on each device on the display unit 103a of the centralized operation panel 103.
  • the system controller 100 enables operation input with the touch sensor 103b of the centralized operation panel 103.
  • various images, including those from the endoscope camera devices 110a and 110b, are displayed on the observation monitors 118a and 118b via the system controller 100.
  • the system controller 100 has a switcher processing unit 1.
  • the switcher processing unit 1 receives the input images from the first endoscope camera device 110a, the VTR 114, and the ultrasonic observation device 130, as well as the input images from the second endoscope camera device 110b and the third endoscope camera device 110c supplied via the relay unit 101 and the relay cable 102, and switches them for output to the observation monitor 118a and the observation monitor 118b under the control of the CPU unit 2.
  • under the control of the CPU unit 2, the switcher processing unit 1 outputs, among the input images, for example the input image from the second endoscope camera device 110b to the observation monitor 118a as a live observation image.
  • likewise, the switcher processing unit 1 outputs, for example, the input image from the first endoscope camera device 110a to the observation monitor 118b as a live observation image.
  • the switcher processing unit 1 outputs each item of input image data to the internal bus 3 under the control of the CPU unit 2.
  • each item of input image data output to the internal bus 3 is captured in a time-division manner by the capture processing unit 4 under the control of the CPU unit 2.
  • the CPU 2 causes the internal recording unit 5 to record the image data captured by the capture processing unit 4 via the internal bus 3.
  • the control state information of devices such as the first endoscope camera device 110a, the first light source device 111a, the electric scalpel 112, the insufflation device 113, and the external storage device 132 is output to the internal bus 3 via the I/F conversion unit 6.
  • the control state information is monitored by the CPU 2 via the internal bus 3.
  • the control state information of devices such as the second endoscope camera device 110b, the second light source device 111b, the third endoscope camera device 110c, and the third light source device 111c is input to the relay unit 101 via the I/F conversion unit 7.
  • this control state information is transmitted from the relay unit 101 over the relay cable 102 and output to the internal bus 3 via the serial I/F unit 12 in the system controller 100.
  • the control state information is monitored by the CPU 2 via the internal bus 3.
  • the in-hospital network device 131 is connected to the system controller 100 via the LAN I/F unit 8.
  • various medical information obtained through an in-hospital server (not shown), for example medical image data such as the patient's preoperative examination data and CT images, is output to the internal bus 3 via the LAN I/F unit 8.
  • the CPU 2 monitors this medical information via the internal bus 3.
  • operation data from the remote control 104 is output to the internal bus 3 via the remote control I/F unit 9, and the CPU unit 2 controls each device based on the operation data from the remote control 104 received via the internal bus 3.
  • the CPU unit 2 controls the graphic unit 10 to display various control screens on the display unit 103a of the centralized operation panel 103; the touch sensor 103b detects operations made with a finger or the like on the control screen displayed on the display unit 103a and outputs the corresponding operation data to the internal bus 3 via the touch sensor I/F unit 11.
  • based on this operation data, the CPU unit 2 performs various controls and receives instructions from the centralized operation panel 103, and also displays, on the centralized display panel 105, various data screens based on the controls and instructions made on the centralized operation panel 103 or on operation instructions from the remote control 104.
  • the centralized display panel 105 is configured to be able to display the images captured by the capture processing unit 4 as moving images; with this configuration, the centralized display panel 105, like the observation monitors 118a and 118b, can display the input image from the first endoscope camera device 110a, the ultrasonic observation device 130, the second endoscope camera device 110b, or the third endoscope camera device 110c as a live moving image.
  • the main operator can, while observing the endoscope images on the observation monitors 118a and 118b, simultaneously refer to the moving image of a desired device on the centralized display panel 105 as a reference moving image.
  • the first endoscope camera device 110a and the ultrasonic observation device 130 are connected to the system controller 100.
  • the second endoscope camera device 110b and the third endoscope camera device 110c are connected via the relay unit 101 and the relay cable 102.
  • the CPU unit 2 generates thumbnail images from the captured image data recorded in the internal recording unit 5 and, as shown in FIG. 3, displays on the display unit 103a of the centralized operation panel 103 an image selection screen 200 made up of thumbnail images 201, one for each input channel of the system controller 100.
  • the image selection screen 200 lets the main operator easily see, from the thumbnail images 201, to which channel each of the live images from the endoscope camera devices and other devices input to the system controller 100 is connected; for example, the main operator instructs a nurse or the like during the procedure, and the nurse operates the touch sensor 103b to display the desired live image on the centralized display panel 105 as a reference moving image.
  • since the thumbnail images 201 for the individual input channels are displayed on the display unit 103a of the centralized operation panel 103, the desired live image can be selected as the reference moving image more appropriately and quickly from the plurality of live input images supplied to the system.
  • FIG. 4 is a diagram showing a key input screen displayed on the centralized operation panel
  • FIG. 5 is a diagram showing an image selection screen displayed on the centralized operation panel.
  • FIG. 4 shows a state where “CAMERA 1” indicating an input image from the first endoscope camera device 110a is input to the channel “Video A”.
  • when characters describing the input image have been entered for each input channel on the key input screen 300, the image selection screen 200, with the entered character information superimposed on the thumbnail images 201, is displayed on the display unit 103a of the centralized operation panel 103.
  • the main operator instructs a nurse or the like during the procedure, and the nurse operates the touch sensor 103b to select the thumbnail image 201 on which the character information is superimposed.
  • this selection operation displays the desired reference moving image on the centralized display panel 105 while the endoscope images are being observed during the procedure.
  • a desired live image can be selected more easily than in the first embodiment.
  • FIG. 6 is a first diagram illustrating an image selection screen displayed on the centralized operation panel
  • FIG. 7 is a second diagram illustrating an image selection screen displayed on the centralized operation panel.
  • the third embodiment is almost the same as the second embodiment, and only different points will be described.
  • the thumbnail images 201 displayed on the image selection screen 200 consist of, as shown in FIG. 6, thumbnails of the live images from the first endoscope camera device 110a, the ultrasonic observation device 130, the second endoscope camera device 110b, and the third endoscope camera device 110c, as in the first and second embodiments, together with a thumbnail of the recorded image recorded on the VTR 114 and a thumbnail of the patient's preoperative CT image input via the in-hospital network device 131.
  • the character information superimposed on the thumbnail of the preoperative CT image can include, in addition to the character information entered on the key input screen 300, character information read from the file information storing the preoperative CT image, for example the file name "CT001.jpg".
  • not only a live image but also a recorded image can be selected more appropriately and promptly as a reference moving image.
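
To make the means-plus-function summary above concrete, here is a minimal Python sketch of the claimed pipeline (image input, capture, selection-screen generation, selection, output). The class name, the callable interfaces, and the dictionary of input paths are illustrative assumptions rather than details taken from the patent.

```python
class ImageProcessingDevice:
    """Sketch of the claimed pipeline: input -> capture -> selection screen -> select -> output."""

    def __init__(self, sources, capture, reduce_image, output_to_display):
        self.sources = sources                        # image input means: input path -> frame source
        self.capture = capture                        # image capture means
        self.reduce_image = reduce_image              # produces the reduced (thumbnail) image
        self.output_to_display = output_to_display    # image output means

    def generate_selection_screen(self):
        # Image selection screen generation means: one reduced image per input path.
        return {path: self.reduce_image(self.capture(source))
                for path, source in self.sources.items()}

    def select(self, path):
        # Image selection means + image output means: show the chosen path's image.
        self.output_to_display(self.capture(self.sources[path]))
```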

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Surgery (AREA)
  • Engineering & Computer Science (AREA)
  • General Health & Medical Sciences (AREA)
  • Molecular Biology (AREA)
  • Pathology (AREA)
  • Radiology & Medical Imaging (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Biophysics (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Medical Informatics (AREA)
  • Veterinary Medicine (AREA)
  • Animal Behavior & Ethology (AREA)
  • Physics & Mathematics (AREA)
  • Public Health (AREA)
  • Optics & Photonics (AREA)
  • Signal Processing (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Endoscopes (AREA)
  • Ultrasonic Diagnosis Equipment (AREA)

Abstract

An image processing device includes a CPU unit capable of controlling a graphic unit so as to display various control screens on the display unit of a centralized operation panel. When a first endoscope camera device, an ultrasonic observation device, a second endoscope camera device, and a third endoscope camera device are connected, the input images from these devices are captured in a time-division manner by a capture processing unit under the control of the CPU unit. The CPU unit generates thumbnail images from the captured image data and displays, on the display unit of the centralized operation panel, an image selection screen formed by the thumbnail images for each input channel of the system controller. The input images supplied to the system can thus be selected appropriately and rapidly.

Description

Specification
Image processing device
Technical field
[0001] The present invention relates to an image processing device that processes a plurality of medical images.
Background art
[0002] A general endoscopic surgery system includes an endoscope for observation, a camera head connected to the endoscope, an endoscope camera device that processes the image signal captured by the camera head, a light source device that supplies illumination light to the subject, a monitor that displays the subject image, an insufflation device used to inflate the abdominal cavity, and a high-frequency cautery device (hereinafter, electric scalpel), a treatment device for resecting or coagulating living tissue.
[0003] When a procedure is started using such an endoscopic surgery system, an endoscope is first inserted into the site to be examined. Illumination light from the light source device is then applied to the subject through the endoscope, and the endoscope transmits an optical image of the subject to the camera head.
[0004] The camera head captures the optical image of the subject, and the imaging signal of the subject image is processed by the endoscope camera device, which generates an endoscope image from the imaging signal. This endoscope image is output from the endoscope camera device to a monitor. The treatment site in the endoscope image is displayed on the monitor, and various treatments are performed while observing the treatment site on the monitor.
[0005] Usually, each of these devices is mounted on an endoscope system trolley together with a system control device, as disclosed for example in Japanese Patent Application Laid-Open No. 7-303654, and used as an endoscope system with improved operability.
[0006] In endoscopic surgery, however, a procedure may be performed by a plurality of operators. In such a case, a plurality of endoscopes are inserted into the examined site, so a subject image must be displayed for each operator's endoscope. The system controller of a conventional endoscopic surgery system, however, cannot indicate through which channel of the system each of the subject images from the operators' endoscopes is being input.
[0007] That is, as shown in FIG. 8, in a conventional endoscopic surgery system only buttons 1001 indicating the input channels connected to the system are displayed on the display unit of, for example, the centralized operation panel 1000. The main operator can therefore only tell the number of input channels from the buttons 1001 and, when wishing to check the subject image being monitored by another operator, cannot select the input channel of the desired subject image appropriately and quickly.
[0008] The present invention has been made in view of the above circumstances, and an object of the invention is to provide an image processing apparatus with which a plurality of input images can be selected more appropriately and quickly.
Disclosure of the invention
Means for solving the problem
[0009] An image processing apparatus according to the present invention comprises: image input means for inputting a plurality of images; image capture means for capturing the plurality of images input from the image input means; image selection screen generation means for generating an image selection screen in which reduced images of the plurality of images captured by the image capture means are allocated to the respective input paths of the image input means; image selection means for selecting a reduced image on the image selection screen generated by the image selection screen generation means; and image output means for outputting the image corresponding to the reduced image selected by the image selection means to display means.
[0010] According to the present invention, the plurality of input images supplied to the apparatus can be selected more appropriately and quickly.
Brief Description of Drawings
[0011] FIG. 1 is a configuration diagram showing the configuration of a medical endoscope system according to Embodiment 1 of the present invention.
FIG. 2 is a block diagram showing the configuration of the system controller in FIG. 1.
FIG. 3 is a diagram showing an image selection screen displayed on the centralized operation panel in FIG. 1.
FIG. 4 is a diagram showing a key input screen displayed on the centralized operation panel according to Embodiment 2 of the present invention.
FIG. 5 is a diagram showing an image selection screen displayed on the centralized operation panel according to Embodiment 2 of the present invention.
FIG. 6 is a first diagram showing an image selection screen displayed on the centralized operation panel according to Embodiment 3 of the present invention.
FIG. 7 is a second diagram showing an image selection screen displayed on the centralized operation panel according to Embodiment 3 of the present invention.
FIG. 8 is a diagram showing an image selection screen displayed on a conventional centralized operation panel.
BEST MODE FOR CARRYING OUT THE INVENTION
[0012] Preferred embodiments of the present invention will now be described with reference to the drawings.
[0013] (Embodiment 1)
FIGS. 1 to 3 relate to Embodiment 1 of the present invention: FIG. 1 is a configuration diagram showing the configuration of a medical endoscope system, FIG. 2 is a block diagram showing the configuration of the system controller in FIG. 1, and FIG. 3 is a diagram showing an image selection screen displayed on the centralized operation panel in FIG. 1.
[0014] As shown in FIG. 1, a first endoscope camera device 110a, a first light source device 111a, a high-frequency cautery device (hereinafter, electric scalpel) 112, an insufflation device 113, a VTR 114, an ultrasonic observation device 130, an in-hospital network device 131, and an external storage device 132, which constitute the medical endoscope system, are connected to a system controller 100, a medical image processing device that controls the entire system. A second endoscope camera device 110b, a second light source device 111b, a third endoscope camera device 110c, and a third light source device 111c are connected to a relay unit 101.
[0015] The relay unit 101 and the system controller 100 are connected by a relay cable 102, and each device is centrally controlled by the system controller 100.
[0016] A centralized operation panel 103 operated by a nurse or the like in the non-sterile area and a remote controller (hereinafter, remote control) 104 operated by the operator in the sterile area can be used to operate the system controller 100.
[0017] When the system controller 100 is operated from the centralized operation panel 103 or the remote control 104, the system controller 100 can, for example, control the centralized display panel 105 beside the monitor and display intraoperative data on the centralized display panel 105.
[0018] The centralized operation panel 103 includes a display unit 103a such as a liquid crystal display and a touch sensor 103b provided integrally on the display unit (see FIG. 2). The centralized operation panel 103 has a display function of displaying the status and operation switches of each device on the display unit 103a as a setting screen, and an operation function of operating those switches by touching predetermined areas of the touch sensor 103b.
[0019] In this embodiment, the centralized operation panel 103 uses the display function to show, on the screen of the display unit 103a, the desired ones among the status displays and operation switches of the devices connected to the system controller 100.
[0020] Each controlled device connected to the system controller 100 transmits data to the system controller 100 via a communication line. When communication with a device is established, the system controller 100 displays the information for that device on the display unit 103a of the centralized operation panel 103 and enables operation input through the touch sensor 103b of the centralized operation panel 103.
[0021] Various images, including those from the endoscope camera devices 110a and 110b, are displayed on the observation monitors 118a and 118b via the system controller 100.
[0022] As shown in FIG. 2, the system controller 100 has a switcher processing unit 1. The switcher processing unit 1 receives the input images from the first endoscope camera device 110a, the VTR 114, and the ultrasonic observation device 130, as well as the input images from the second endoscope camera device 110b and the third endoscope camera device 110c supplied via the relay unit 101 and the relay cable 102, and switches them for output to the observation monitor 118a and the observation monitor 118b under the control of the CPU unit 2.
[0023] Under the control of the CPU unit 2, the switcher processing unit 1 outputs, among the input images, for example the input image from the second endoscope camera device 110b to the observation monitor 118a as a live observation image.
[0024] Likewise, under the control of the CPU unit 2, the switcher processing unit 1 outputs, among the input images, for example the input image from the first endoscope camera device 110a to the observation monitor 118b as a live observation image.
[0025] Further, the switcher processing unit 1 outputs each item of input image data to the internal bus 3 under the control of the CPU unit 2.
[0026] Each item of input image data output to the internal bus 3 is captured in a time-division manner by the capture processing unit 4 under the control of the CPU unit 2. The CPU unit 2 causes the internal recording unit 5 to record, via the internal bus 3, the image data captured by the capture processing unit 4.
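
As a rough illustration of this time-division capture, the following Python sketch polls each input channel in turn and keeps the newest frame in an internal record. The channel table, the frame-grabbing callables, and the polling period are assumptions made for the example, not details taken from the patent.

```python
import time

def run_capture_loop(channels, internal_record, period_s=0.2):
    """Poll each input channel in turn (time-division) and store the latest frame.

    channels: dict mapping a channel name to a zero-argument callable that
    returns the most recent frame for that input path, or None if unavailable.
    """
    while True:
        for name, grab_frame in channels.items():
            frame = grab_frame()               # e.g. read from a capture board
            if frame is not None:
                internal_record[name] = frame  # plays the role of the internal recording unit
        time.sleep(period_s)                   # one capture cycle per period
```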
[0027] The control state information of devices such as the first endoscope camera device 110a, the first light source device 111a, the electric scalpel 112, the insufflation device 113, and the external storage device 132 is output to the internal bus 3 via the I/F conversion unit 6, and the CPU unit 2 monitors this control state information via the internal bus 3.
[0028] The control state information of devices such as the second endoscope camera device 110b, the second light source device 111b, the third endoscope camera device 110c, and the third light source device 111c is input to the relay unit 101 via the I/F conversion unit 7. This control state information is transmitted from the relay unit 101 over the relay cable 102 and output to the internal bus 3 via the serial I/F unit 12 in the system controller 100, and the CPU unit 2 monitors it via the internal bus 3.
[0029] The in-hospital network device 131 is connected to the system controller 100 via the LAN I/F unit 8. Various medical information obtained through an in-hospital server (not shown), for example medical image data such as the patient's preoperative examination data and CT images, is output to the internal bus 3 via the LAN I/F unit 8, and the CPU unit 2 monitors this medical information via the internal bus 3.
[0030] Further, operation data from the remote control 104 is output to the internal bus 3 via the remote control I/F unit 9, and the CPU unit 2 controls each device based on the operation data received from the remote control 104 via the internal bus 3.
[0031] The CPU unit 2 controls the graphic unit 10 to display various control screens on the display unit 103a of the centralized operation panel 103. The touch sensor 103b detects operations made with a finger or the like on the control screen displayed on the display unit 103a of the centralized operation panel 103, and outputs the corresponding operation data to the internal bus 3 via the touch sensor I/F unit 11.
[0032] Based on this operation data, the CPU unit 2 performs various controls and receives instructions from the centralized operation panel 103. The CPU unit 2 also displays, on the centralized display panel 105, various data screens based on the controls and instructions made on the centralized operation panel 103 or on operation instructions from the remote control 104.
[0033] The centralized display panel 105 is configured to be able to display the images captured by the capture processing unit 4 as moving images. With this configuration, the centralized display panel 105, like the observation monitors 118a and 118b, can display the input image from the first endoscope camera device 110a, the ultrasonic observation device 130, the second endoscope camera device 110b, or the third endoscope camera device 110c as a live moving image.
[0034] In other words, the main operator can, while observing the endoscope images on the observation monitors 118a and 118b, simultaneously refer to the moving image of a desired device on the centralized display panel 105 as a reference moving image.
[0035] The operation of this embodiment configured as described above will now be explained.
[0036] The first endoscope camera device 110a and the ultrasonic observation device 130 are connected to the system controller 100, and the second endoscope camera device 110b and the third endoscope camera device 110c are connected via the relay unit 101 and the relay cable 102.
[0037] Under the control of the CPU unit 2, the input images from these devices are captured in a time-division manner by the capture processing unit 4, and the captured image data is recorded in the internal recording unit 5 via the internal bus 3.
[0038] The CPU unit 2 then generates thumbnail images from the captured image data recorded in the internal recording unit 5 and, as shown in FIG. 3, displays on the display unit 103a of the centralized operation panel 103 an image selection screen 200 made up of thumbnail images 201, one for each input channel of the system controller 100.
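
A minimal sketch of how the captured frames could be reduced and grouped into one thumbnail per input channel for the selection screen; Pillow is used purely for illustration, and the thumbnail size is an assumed value.

```python
from PIL import Image  # assumption: captured frames are held as PIL images

THUMB_SIZE = (160, 120)  # assumed reduced-image size

def build_image_selection_screen(internal_record: dict[str, Image.Image]) -> dict[str, Image.Image]:
    """Return one reduced image (thumbnail) per input channel."""
    screen = {}
    for channel, frame in internal_record.items():
        thumb = frame.copy()
        thumb.thumbnail(THUMB_SIZE)  # in-place reduction, preserves aspect ratio
        screen[channel] = thumb
    return screen
```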
[0039] The image selection screen 200 therefore lets the main operator easily see, from the thumbnail images 201, to which channel each of the live images from the endoscope camera devices and other devices input to the system controller 100 is connected. For example, the main operator instructs a nurse or the like during the procedure, and the nurse operates the touch sensor 103b to display the desired live image on the centralized display panel 105 as a reference moving image.
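
When one of these thumbnails is touched, the corresponding channel is routed to the centralized display panel as the reference moving image. The sketch below illustrates one way such a selection handler could look; the `switcher` and `display_panel` objects are hypothetical stand-ins for the switcher processing unit and the centralized display panel, not interfaces defined in the patent.

```python
def on_thumbnail_selected(channel, switcher, display_panel):
    """Handle a touch on a thumbnail by routing that channel to the reference display.

    switcher: hypothetical interface with a route(source, destination) method.
    display_panel: hypothetical handle for the centralized display panel.
    """
    switcher.route(source=channel, destination=display_panel)
    return channel  # the channel now shown as the reference moving image
```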
[0040] As described above, in this embodiment, when a desired reference moving image is to be displayed on the centralized display panel 105 during observation of the endoscope images, the thumbnail images 201 for the individual input channels are displayed on the display unit 103a of the centralized operation panel 103, so the desired live image can be selected as the reference moving image more appropriately and quickly from the plurality of live input images supplied to the system.
[0041] (Embodiment 2)
FIGS. 4 and 5 relate to Embodiment 2 of the present invention: FIG. 4 is a diagram showing a key input screen displayed on the centralized operation panel, and FIG. 5 is a diagram showing an image selection screen displayed on the centralized operation panel.
[0042] Embodiment 2 is almost the same as Embodiment 1, so only the differences will be described.
[0043] In this embodiment, as shown in FIG. 4, a key input screen 300 is displayed on the display unit 103a of the centralized operation panel 103 before the procedure, and characters describing the input image are entered on this key input screen 300 for each input channel of the system. FIG. 4 shows the state in which "CAMERA 1", indicating the input image from the first endoscope camera device 110a, has been entered for the channel "Video A".
[0044] When characters describing the input image have been entered for each input channel of the system on the key input screen 300, the image selection screen 200 is displayed on the display unit 103a of the centralized operation panel 103 with, as shown in FIG. 5, the entered character information superimposed on the thumbnail image 201 of each input channel.
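
As an illustration of how the character information entered on the key input screen could be superimposed on a channel's thumbnail, the following sketch draws the label onto the image with Pillow. The strip height, colours, and use of the default bitmap font are assumptions made for the example.

```python
from PIL import Image, ImageDraw

def annotate_thumbnail(thumb: Image.Image, label: str) -> Image.Image:
    """Overlay the channel label (e.g. "CAMERA 1" for channel "Video A") on a thumbnail."""
    annotated = thumb.copy()
    draw = ImageDraw.Draw(annotated)
    draw.rectangle([(0, 0), (annotated.width, 16)], fill="black")  # label strip
    draw.text((2, 2), label, fill="white")                         # default bitmap font
    return annotated

# e.g. annotate_thumbnail(screen["Video A"], "CAMERA 1")
```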
[0045] The main operator then instructs a nurse or the like during the procedure, and the nurse operates the touch sensor 103b to select the thumbnail image 201 on which the character information is superimposed. This selection operation displays the desired reference moving image on the centralized display panel 105 while the endoscope images are being observed during the procedure.
[0046] Thus, in addition to the effects of Embodiment 1, this embodiment makes it even easier to select the desired live image.
[0047] (Embodiment 3)
FIGS. 6 and 7 relate to Embodiment 3 of the present invention: FIG. 6 is a first diagram showing an image selection screen displayed on the centralized operation panel, and FIG. 7 is a second diagram showing an image selection screen displayed on the centralized operation panel.
[0048] Embodiment 3 is almost the same as Embodiment 2, so only the differences will be described.
[0049] In this embodiment, as shown in FIG. 6, the thumbnail images 201 displayed on the image selection screen 200 consist of thumbnails of the live images from the first endoscope camera device 110a, the ultrasonic observation device 130, the second endoscope camera device 110b, and the third endoscope camera device 110c, as in Embodiments 1 and 2, together with a thumbnail of the recorded image recorded on the VTR 114 and a thumbnail of the patient's preoperative CT image input via the in-hospital network device 131.
[0050] In this case, the character information superimposed on the thumbnail of the preoperative CT image can include, in addition to the character information entered on the key input screen 300, character information read from the file information storing the preoperative CT image, for example the file name "CT001.jpg".
[0051] When an image is being recorded on the VTR 114, characters such as "recording" can be displayed on the thumbnail of the image being recorded, as shown in FIG. 7.
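
The text superimposed on a thumbnail in this embodiment can thus combine several pieces of information. The sketch below composes such a label from the user-entered text, an optional file name read from the stored file information, and a recording flag; the separator and exact formatting are assumptions.

```python
def compose_thumbnail_label(channel: str, user_label: str = "",
                            file_name: str = "", recording: bool = False) -> str:
    """Build the character string superimposed on a thumbnail.

    user_label comes from the key input screen, file_name (e.g. "CT001.jpg") from
    the file information of a stored image, and recording marks a VTR channel
    that is currently recording.
    """
    parts = [user_label or channel]
    if file_name:
        parts.append(file_name)
    if recording:
        parts.append("recording")
    return " / ".join(parts)

# e.g. compose_thumbnail_label("VTR", user_label="VTR 1", recording=True) -> "VTR 1 / recording"
```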
[0052] Thus, in addition to the effects of Embodiment 2, this embodiment allows not only live images but also recorded images to be selected as the reference moving image more appropriately and quickly.
[0053] The present invention is not limited to the embodiments described above, and various changes and modifications can be made without departing from the spirit of the invention.

Claims

Claims
[1] An image processing apparatus comprising:
image input means for inputting a plurality of images;
image capture means for capturing the plurality of images input from the image input means;
image selection screen generation means for generating an image selection screen in which reduced images of the plurality of images captured by the image capture means are allocated to the respective input paths of the image input means;
image selection means for selecting a reduced image on the image selection screen generated by the image selection screen generation means; and
image output means for outputting the image corresponding to the reduced image selected by the image selection means to display means.
[2] The image processing apparatus according to claim 1, wherein the plurality of images include at least an intraoperative live image.
[3] The image processing apparatus according to claim 1, further comprising character input means for inputting character data to be superimposed on the reduced images.
[4] The image processing apparatus according to claim 2, further comprising character input means for inputting character data to be superimposed on the reduced images.
[5] The image processing apparatus according to claim 3, wherein the character data includes information identifying the image input means.
[6] The image processing apparatus according to claim 4, wherein the character data includes information identifying the image input means.
[7] The image processing apparatus according to claim 5, wherein the image input means includes an endoscope camera device.
[8] The image processing apparatus according to claim 6, wherein the image input means includes an endoscope camera device.
[9] An image processing method comprising:
an image input step of inputting a plurality of images;
an image capture step of capturing the plurality of images input from the image input means;
an image selection screen generation step of generating an image selection screen in which reduced images of the plurality of images captured in the image capture step are allocated to the respective input paths of the image input means;
an image selection step of selecting a reduced image on the image selection screen generated in the image selection screen generation step; and
an image output step of outputting the image corresponding to the reduced image selected in the image selection step to display means.
[10] The image processing method according to claim 9, wherein the plurality of images include at least an intraoperative live image.
[11] The image processing method according to claim 9, further comprising a character input step of inputting character data to be superimposed on the reduced images.
[12] The image processing method according to claim 10, further comprising a character input step of inputting character data to be superimposed on the reduced images.
[13] The image processing method according to claim 11, wherein the character data includes information identifying the image input means.
[14] The image processing method according to claim 12, wherein the character data includes information identifying the image input means.
PCT/JP2005/003880 2004-03-08 2005-03-07 Image processing device WO2005084525A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2004-064517 2004-03-08
JP2004064517A JP2005245961A (en) 2004-03-08 2004-03-08 Image processing apparatus

Publications (1)

Publication Number Publication Date
WO2005084525A1 true WO2005084525A1 (en) 2005-09-15

Family

ID=34918191

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2005/003880 WO2005084525A1 (en) 2004-03-08 2005-03-07 Image processing device

Country Status (2)

Country Link
JP (1) JP2005245961A (en)
WO (1) WO2005084525A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2008006180A1 (en) * 2006-07-10 2008-01-17 Katholieke Universiteit Leuven Endoscopic vision system

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2007047457A2 (en) * 2005-10-13 2007-04-26 Volcano Corporation Component-based catheter lab intravascular ultrasound system
KR101071015B1 (en) 2007-12-27 2011-10-06 삼성메디슨 주식회사 Ultrasound diagnostic device and method for displaying images
FR2928257B1 (en) 2008-03-04 2011-01-14 Super Sonic Imagine ELECTRONIC SYSTEM FOR DOUBLE SCREEN DISPLAY.
JP5941762B2 (en) * 2012-06-14 2016-06-29 オリンパス株式会社 Manipulator system
US20180271613A1 (en) * 2015-10-02 2018-09-27 Sony Corporation Medical control apparatus, control method, program, and medical control system

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH0568668A (en) * 1991-09-12 1993-03-23 Olympus Optical Co Ltd Image recorder for medical use
JPH07184850A (en) * 1993-12-28 1995-07-25 Olympus Optical Co Ltd Image processor
JPH07303654A (en) * 1994-05-12 1995-11-21 Olympus Optical Co Ltd System control device
JP2000222417A (en) * 1999-01-29 2000-08-11 Olympus Optical Co Ltd Image filing device
JP2002233499A (en) * 2001-02-08 2002-08-20 Olympus Optical Co Ltd Endoscopic operation system

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH0568668A (en) * 1991-09-12 1993-03-23 Olympus Optical Co Ltd Image recorder for medical use
JPH07184850A (en) * 1993-12-28 1995-07-25 Olympus Optical Co Ltd Image processor
JPH07303654A (en) * 1994-05-12 1995-11-21 Olympus Optical Co Ltd System control device
JP2000222417A (en) * 1999-01-29 2000-08-11 Olympus Optical Co Ltd Image filing device
JP2002233499A (en) * 2001-02-08 2002-08-20 Olympus Optical Co Ltd Endoscopic operation system

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2008006180A1 (en) * 2006-07-10 2008-01-17 Katholieke Universiteit Leuven Endoscopic vision system
US8911358B2 (en) 2006-07-10 2014-12-16 Katholieke Universiteit Leuven Endoscopic vision system

Also Published As

Publication number Publication date
JP2005245961A (en) 2005-09-15

Similar Documents

Publication Publication Date Title
JP5160025B2 (en) Surgery system
JP4869951B2 (en) Medical device data analyzer
US20030046562A1 (en) Remote medical supporting system
JP2008000282A (en) Procedure image recording control system and surgery system
JP2004181229A (en) System and method for supporting remote operation
JP2006081664A (en) Medical system and method for controlling medical system
JP2009207872A (en) Medical control device and its system
JP2005111080A (en) Surgery support system
JP2006000538A (en) Operating theater controlling system
WO2005084525A1 (en) Image processing device
US8154589B2 (en) Medical operation system for verifying and analyzing a medical operation
US10130240B2 (en) Medical system
JP2000271147A (en) Remote surgery support system
US9782060B2 (en) Medical system
JP2000245738A (en) Remote operation supporting system
JP2006288956A (en) Surgery system
JP4445598B2 (en) Endoscope visual field control system
JP2007082630A (en) Integrated operation room control system
JP2006000537A (en) Endoscope system
JP2004313341A (en) Medical device system
JP2001238205A (en) Endoscope system
JP2005334090A (en) Endoscopy system
JP2005143918A (en) Remote operation support system
JP2008173398A (en) Medical apparatus control system
JP2003339736A (en) Medical control system

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A1

Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BW BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE EG ES FI GB GD GE GH GM HR HU ID IL IN IS KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NA NI NO NZ OM PG PH PL PT RO RU SC SD SE SG SK SL SM SY TJ TM TN TR TT TZ UA UG US UZ VC VN YU ZA ZM ZW

AL Designated countries for regional patents

Kind code of ref document: A1

Designated state(s): BW GH GM KE LS MW MZ NA SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IS IT LT LU MC NL PL PT RO SE SI SK TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG

121 Ep: the epo has been informed by wipo that ep was designated in this application
NENP Non-entry into the national phase

Ref country code: DE

WWW Wipo information: withdrawn in national office

Country of ref document: DE

122 Ep: pct application non-entry in european phase