WO2018180573A1 - Surgical image processing apparatus, image processing method, and surgical system - Google Patents
Surgical image processing apparatus, image processing method, and surgical system
- Publication number
- WO2018180573A1 (PCT/JP2018/010391)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image
- update
- image processing
- surgical
- post
- Prior art date
Classifications
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/00002—Operational features of endoscopes
- A61B1/00057—Operational features of endoscopes provided with means for testing or calibration
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/00002—Operational features of endoscopes
- A61B1/00004—Operational features of endoscopes characterised by electronic signal processing
- A61B1/00009—Operational features of endoscopes characterised by electronic signal processing of image signals during a use of endoscope
- A61B1/000094—Operational features of endoscopes characterised by electronic signal processing of image signals during a use of endoscope extracting biological structures
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/00002—Operational features of endoscopes
- A61B1/00004—Operational features of endoscopes characterised by electronic signal processing
- A61B1/00009—Operational features of endoscopes characterised by electronic signal processing of image signals during a use of endoscope
- A61B1/000095—Operational features of endoscopes characterised by electronic signal processing of image signals during a use of endoscope for image enhancement
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/00002—Operational features of endoscopes
- A61B1/00043—Operational features of endoscopes provided with output arrangements
- A61B1/00045—Display arrangement
- A61B1/0005—Display arrangement combining images e.g. side-by-side, superimposed or tiled
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
- A61B90/36—Image-producing devices or illumination devices not otherwise provided for
- A61B90/37—Surgical systems with images on a monitor during operation
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F8/00—Arrangements for software engineering
- G06F8/60—Software deployment
- G06F8/61—Installation
- G06F8/63—Image based installation; Cloning; Build to order
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F8/00—Arrangements for software engineering
- G06F8/60—Software deployment
- G06F8/65—Updates
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F9/00—Arrangements for program control, e.g. control units
- G06F9/06—Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
- G06F9/44—Arrangements for executing specific programs
- G06F9/451—Execution arrangements for user interfaces
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T1/00—General purpose image data processing
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/0002—Inspection of images, e.g. flaw detection
- G06T7/0012—Biomedical image inspection
- G06T7/0014—Biomedical image inspection using an image reference approach
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H30/00—ICT specially adapted for the handling or processing of medical images
- G16H30/40—ICT specially adapted for the handling or processing of medical images for processing medical images, e.g. editing
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H40/00—ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices
- G16H40/40—ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the management of medical equipment or devices, e.g. scheduling maintenance or upgrades
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/63—Control of cameras or camera modules by using electronic viewfinders
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/63—Control of cameras or camera modules by using electronic viewfinders
- H04N23/633—Control of cameras or camera modules by using electronic viewfinders for displaying additional information relating to control or operation of the camera
- H04N23/634—Warning indications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/80—Camera processing pipelines; Components thereof
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/222—Studio circuitry; Studio devices; Studio equipment
- H04N5/262—Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects ; Cameras specially adapted for the electronic generation of special effects
- H04N5/2624—Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects ; Cameras specially adapted for the electronic generation of special effects for obtaining an image which is composed of whole input images, e.g. splitscreen
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/222—Studio circuitry; Studio devices; Studio equipment
- H04N5/262—Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects ; Cameras specially adapted for the electronic generation of special effects
- H04N5/2628—Alteration of picture size, shape, position or orientation, e.g. zooming, rotation, rolling, perspective, translation
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/222—Studio circuitry; Studio devices; Studio equipment
- H04N5/262—Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects ; Cameras specially adapted for the electronic generation of special effects
- H04N5/265—Mixing
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/18—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2203/00—Indexing scheme relating to G06F3/00 - G06F3/048
- G06F2203/048—Indexing scheme relating to G06F3/048
- G06F2203/04803—Split screen, i.e. subdividing the display area or the window area into separate subareas
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
- G06F3/04842—Selection of displayed objects or displayed text elements
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
- G06F3/04845—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range for image manipulation, e.g. dragging, rotation, expansion or change of colour
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F8/00—Arrangements for software engineering
- G06F8/30—Creation or generation of source code
- G06F8/38—Creation or generation of source code for implementing user interfaces
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2200/00—Indexing scheme for image data processing or generation, in general
- G06T2200/24—Indexing scheme for image data processing or generation, in general involving graphical user interfaces [GUIs]
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10068—Endoscopic image
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20081—Training; Learning
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20092—Interactive image processing based on input by user
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30168—Image quality inspection
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V2201/00—Indexing scheme relating to image or video recognition or understanding
- G06V2201/03—Recognition of patterns in medical or anatomical images
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/50—Constructional details
- H04N23/555—Constructional details for picking-up images in sites, inaccessible due to their dimensions or hazardous conditions, e.g. endoscopes or borescopes
Definitions
- The present technology relates to a surgical image processing apparatus, an image processing method, and a surgical system, and in particular to a surgical image processing apparatus, an image processing method, and a surgical system that make it easy to compare images before and after a software update.
- In an endoscope system, various types of image processing are performed on the image signals captured by the endoscope. This image processing is implemented in software and executed by a processor, such as a GPU, operating according to a predetermined program.
- Patent Document 1 discloses an endoscope system that performs maintenance such as software update via a communication line.
- For this reason, the image quality of the displayed image may change before and after a software update.
- In such a case, the doctor may want to compare the image before the change in image quality with the image after the change.
- The present technology has been made in view of such a situation, and makes it easy to compare images before and after a software update.
- The surgical image processing apparatus of the present technology includes an image processing unit that performs image processing by software on a surgical site image, and a display control unit that controls display of the surgical site image that has undergone the image processing.
- The image processing unit generates a pre-update processed image, obtained by applying the pre-update image processing to the surgical site image, and a post-update processed image, obtained by applying the post-update image processing to the surgical site image.
- The display control unit controls display of at least a part of at least one of the pre-update processed image and the post-update processed image.
- A difference calculation unit that calculates a difference between the pre-update processed image and the post-update processed image may further be provided, and the display control unit can control display of information indicating that the difference is larger than a predetermined value.
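The patent does not specify how the difference calculation unit compares the two images. A minimal Python sketch, assuming a mean-absolute-pixel-difference metric and an arbitrary warning threshold (both the function name and the metric are illustrative assumptions, not taken from the patent), might look like:

```python
import numpy as np

def difference_exceeds(pre_img: np.ndarray, post_img: np.ndarray,
                       threshold: float = 5.0) -> bool:
    """Return True when the mean absolute pixel difference between the
    pre-update and post-update processed images exceeds `threshold`."""
    diff = np.abs(pre_img.astype(np.float64) - post_img.astype(np.float64))
    return float(diff.mean()) > threshold

# Example: two uniform 8-bit frames differing by 10 levels per pixel.
pre = np.full((4, 4), 100, dtype=np.uint8)
post = np.full((4, 4), 110, dtype=np.uint8)
print(difference_exceeds(pre, post))  # mean difference is 10, above the threshold
```

When the function returns True, the display control unit could overlay a warning so the doctor knows the update visibly changed the image.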
- A scene detection unit that detects a predetermined scene in the image may further be provided, and the display control unit can display, for the detected scene, whichever of the pre-update processed image and the post-update processed image is selected by the user.
- A feature amount extraction unit that extracts a feature amount of the detected scene may further be provided, together with a recording control unit that records history information associating the extracted feature amount with selection information indicating which of the pre-update processed image and the post-update processed image the user selected for the detected scene.
- A learning unit may further be provided that learns, for each feature amount of a scene and based on the history information, which of the pre-update processed image and the post-update processed image the user selects for the detected scene.
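The learning method is left open in the patent (it only mentions that machine learning may be used). As one hypothetical realization, the learning unit could quantize each scene's scalar feature amount into bins and keep a majority vote of the user's past selections per bin; the class name, binning scheme, and "pre"/"post" labels below are all assumptions for illustration:

```python
from collections import Counter, defaultdict
from typing import Optional

class SelectionLearner:
    """Toy learner: for each quantized scene feature amount, tally which
    image (pre- or post-update) the user selected, and predict the majority."""

    def __init__(self, n_bins: int = 10):
        self.n_bins = n_bins
        self.history = defaultdict(Counter)  # feature bin -> selection counts

    def _bin(self, feature: float) -> int:
        # Quantize a scalar feature in [0, 1) into a histogram bin.
        return min(int(feature * self.n_bins), self.n_bins - 1)

    def record(self, feature: float, selection: str) -> None:
        """Store one entry of history information (feature + selection)."""
        self.history[self._bin(feature)][selection] += 1

    def query(self, feature: float) -> Optional[str]:
        """Answer an inquiry: the majority selection for this feature, if any."""
        counts = self.history.get(self._bin(feature))
        return counts.most_common(1)[0][0] if counts else None

learner = SelectionLearner()
learner.record(0.12, "post")   # for similar scenes the user mostly chose
learner.record(0.15, "post")   # the post-update processing...
learner.record(0.14, "pre")    # ...with one exception
print(learner.query(0.13))     # majority vote in that bin
```

The inquiry unit described below would then call `query` with the feature amount of a scene detected in another image and let the display control unit show the predicted preference.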
- An inquiry unit may further be provided that queries the learning result corresponding to the feature amount of a predetermined scene detected in another image, and based on the queried learning result, the display control unit can display one of the pre-update processed image and the post-update processed image for the predetermined scene in the other image.
- The image processing method of the present technology is performed by an image processing apparatus including an image processing unit that performs image processing by software on a surgical site image and a display control unit that controls display of the surgical site image that has undergone the image processing. The method includes generating a pre-update processed image, obtained by applying the pre-update image processing to the surgical site image, and a post-update processed image, obtained by applying the post-update image processing to the surgical site image, and controlling display of at least a part of at least one of the pre-update processed image and the post-update processed image.
- The surgical system of the present technology includes a surgical imaging apparatus that acquires a surgical site image, and a surgical image processing apparatus having an image processing unit that performs image processing by software on the surgical site image and a display control unit that controls display of the surgical site image that has undergone the image processing. The image processing unit generates a pre-update processed image, obtained by applying the pre-update image processing to the surgical site image, and a post-update processed image, obtained by applying the post-update image processing to the surgical site image, and the display control unit controls display of at least a part of at least one of the pre-update processed image and the post-update processed image.
- a pre-update processed image obtained by performing the image processing before the software update on the surgical site image and a post-update processed image obtained by performing the image processing after the software update on the surgical site image are generated.
- the display of at least a part of at least one of the pre-update processed image and the post-update processed image is controlled.
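In essence, the same frame is run through both software versions and the two results are shown together. A minimal Python sketch of that idea follows; the two `process_*` functions are hypothetical stand-ins for the pre- and post-update image processing, and horizontal side-by-side tiling is just one of the display layouts the patent contemplates:

```python
import numpy as np

def process_v1(img: np.ndarray) -> np.ndarray:
    # Stand-in for the pre-update image processing (here: identity).
    return img

def process_v2(img: np.ndarray) -> np.ndarray:
    # Stand-in for the post-update image processing (here: a brightness lift).
    return np.clip(img.astype(np.int32) + 20, 0, 255).astype(np.uint8)

def compose_side_by_side(surgical_image: np.ndarray) -> np.ndarray:
    """Run both software versions on the same frame and tile the results
    horizontally so the doctor can compare them on one monitor."""
    pre = process_v1(surgical_image)
    post = process_v2(surgical_image)
    return np.hstack([pre, post])

frame = np.full((2, 3), 100, dtype=np.uint8)
out = compose_side_by_side(frame)
print(out.shape)  # left half is the pre-update image, right half post-update
```

Because both outputs come from the identical input frame, any visible difference between the two halves is attributable to the software update alone.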
- FIG. 1 is a diagram illustrating an example of a schematic configuration of an endoscopic surgery system 100 to which the technology according to the present disclosure can be applied.
- FIG. 1 shows an operator (doctor) 167 performing surgery on a patient 171 on a patient bed 169 using the endoscopic surgery system 100.
- The endoscopic surgery system 100 includes an endoscope 101, other surgical tools 117, a support arm device 127 that supports the endoscope 101, and a cart 137 on which various devices for endoscopic surgery are mounted.
- Trocars 125a to 125d are punctured into the abdominal wall, and the lens barrel 103 of the endoscope 101 and the other surgical tools 117 are inserted into the body cavity of the patient 171 through the trocars 125a to 125d.
- an insufflation tube 119, an energy treatment tool 121, and forceps 123 are inserted into the body cavity of the patient 171.
- the energy treatment device 121 is a treatment device that performs incision and peeling of a tissue, sealing of a blood vessel, or the like by a high-frequency current or ultrasonic vibration.
- The illustrated surgical tools 117 are merely an example, and various surgical tools generally used in endoscopic surgery, such as tweezers or a retractor, may be used as the surgical tools 117.
- the image of the surgical site in the body cavity of the patient 171 captured by the endoscope 101 is displayed on the display device 141.
- the surgeon 167 performs a treatment such as excision of the affected part using the energy treatment tool 121 and the forceps 123 while viewing the image of the surgical part displayed on the display device 141 in real time.
- During surgery, the insufflation tube 119, the energy treatment device 121, and the forceps 123 are supported by the operator 167 or an assistant.
- the support arm device 127 includes an arm portion 131 extending from the base portion 129.
- The arm portion 131 includes joint portions 133a, 133b, and 133c and links 135a and 135b, and is driven under the control of the arm control device 145.
- The endoscope 101 is supported by the arm portion 131, which controls its position and posture, so that the endoscope 101 is stably fixed in place.
- The endoscope 101 includes a lens barrel 103, a region of predetermined length from the distal end of which is inserted into the body cavity of the patient 171, and a camera head 105 (imaging apparatus) connected to the proximal end of the lens barrel 103.
- In the example of FIG. 1, the endoscope 101 is configured as a so-called rigid endoscope having a rigid lens barrel 103; however, the endoscope 101 may instead be configured as a so-called flexible endoscope having a flexible lens barrel 103.
- An opening into which an objective lens is fitted is provided at the tip of the lens barrel 103.
- A light source device 143 is connected to the endoscope 101. Light generated by the light source device 143 is guided to the tip of the lens barrel 103 by a light guide extending inside the lens barrel 103, and is emitted through the objective lens toward the observation target in the body cavity of the patient 171.
- The endoscope 101 may be a forward-viewing endoscope, an oblique-viewing endoscope, or a side-viewing endoscope.
- An optical system and an image sensor are provided inside the camera head 105, and reflected light (observation light) from the observation target is condensed on the image sensor by the optical system. Observation light is photoelectrically converted by the imaging device, and an electrical signal corresponding to the observation light, that is, an image signal corresponding to the observation image is generated.
- the image signal is transmitted to a camera control unit (CCU: Camera Control Unit) 139 as RAW data.
- the camera head 105 has a function of adjusting the magnification and the focal length by appropriately driving the optical system.
- a plurality of image sensors may be provided in the camera head 105 in order to cope with, for example, stereoscopic viewing (3D display).
- a plurality of relay optical systems are provided inside the lens barrel 103 in order to guide observation light to each of the plurality of imaging elements.
- The CCU 139 includes a processor such as a CPU (Central Processing Unit) and a GPU (Graphics Processing Unit), which may also be used for general-purpose computing on GPU (GPGPU), and centrally controls the operations of the endoscope 101 and the display device 141. Specifically, the CCU 139 performs, on the image signal received from the camera head 105, various types of image processing for displaying an image based on that image signal, such as development processing (demosaic processing). The CCU 139 provides the processed image signal to the display device 141, and also transmits a control signal to the camera head 105 to control its driving.
- the control signal can include information regarding imaging conditions such as magnification and focal length.
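The development (demosaic) processing mentioned above converts the RAW Bayer-mosaic signal from the image sensor into a full-color image. The patent does not describe a specific algorithm; the Python sketch below shows only the idea, using a deliberately crude nearest-neighbor scheme over an assumed RGGB pattern (real CCUs use far more sophisticated interpolation):

```python
import numpy as np

def demosaic_nearest(raw: np.ndarray) -> np.ndarray:
    """Crude nearest-neighbor demosaic of an RGGB Bayer mosaic.

    Each 2x2 cell (R G / G B) is collapsed to one RGB value, which is
    then replicated over the cell. Height and width must be even.
    """
    r = raw[0::2, 0::2]                                   # red samples
    g = ((raw[0::2, 1::2].astype(np.uint16)
          + raw[1::2, 0::2]) // 2).astype(raw.dtype)      # average both greens
    b = raw[1::2, 1::2]                                   # blue samples
    h, w = raw.shape
    rgb = np.zeros((h, w, 3), dtype=raw.dtype)
    for ch, plane in enumerate((r, g, b)):
        # Replicate each cell value back to full sensor resolution.
        rgb[:, :, ch] = np.repeat(np.repeat(plane, 2, axis=0), 2, axis=1)
    return rgb

raw = np.array([[200, 120],
                [130,  60]], dtype=np.uint8)              # one RGGB cell
print(demosaic_nearest(raw)[0, 0])
```

Because demosaicing is one of the software-implemented processing steps, a change to its interpolation is exactly the kind of update whose before/after results the present technology lets the doctor compare.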
- the display device 141 displays an image based on an image signal subjected to image processing by the CCU 139 under the control of the CCU 139.
- When the endoscope 101 supports high-resolution imaging such as 4K (3840 horizontal pixels × 2160 vertical pixels) or 8K (7680 horizontal pixels × 4320 vertical pixels) and/or 3D display, a display device 141 capable of the corresponding high-resolution display and/or 3D display is used.
- a more immersive feeling can be obtained by using a display device 141 having a size of 55 inches or more.
- a plurality of display devices 141 having different resolutions and sizes may be provided depending on applications.
- the light source device 143 includes a light source such as an LED (Light Emitting Diode), and supplies irradiation light to the endoscope 101 when photographing a surgical site.
- the arm control device 145 is configured by a processor such as a CPU, for example, and operates according to a predetermined program to control driving of the arm portion 131 of the support arm device 127 according to a predetermined control method.
- the input device 147 is an input interface for the endoscopic surgery system 100.
- the user can input various information and instructions to the endoscopic surgery system 100 via the input device 147.
- the user inputs various types of information related to the operation, such as the patient's physical information and information about the surgical technique, via the input device 147.
- The user also uses the input device 147 to instruct driving of the arm unit 131, changes to the imaging conditions of the endoscope 101 (type of irradiation light, magnification, focal length, etc.), driving of the energy treatment device 121, and the like.
- the type of the input device 147 is not limited, and the input device 147 may be various known input devices.
- the input device 147 for example, a mouse, a keyboard, a touch panel, a switch, a foot switch 157, and / or a lever can be applied.
- the touch panel may be provided on the display surface of the display device 141.
- the input device 147 may be a device worn by the user, such as a glasses-type wearable device or an HMD (Head Mounted Display). In this case, various inputs are performed according to the user's gesture and line of sight detected by these devices.
- the input device 147 may include a camera capable of detecting the user's movement, and various inputs may be performed according to the user's gesture and line of sight detected from the video captured by the camera.
- the input device 147 may include a microphone that can pick up a user's voice, and various inputs may be performed by voice through the microphone.
- Since the input device 147 can accept various kinds of information without contact, a user belonging to the clean area (for example, the operator 167) can operate devices belonging to the unclean area in a non-contact manner.
- In addition, since the user can operate a device without releasing the surgical tool from his or her hand, user convenience is improved.
- the treatment instrument control device 149 controls the drive of the energy treatment instrument 121 for tissue cauterization, incision, or blood vessel sealing.
- The insufflation device 151 supplies gas into the body cavity of the patient 171 via the insufflation tube 119.
- the recorder 153 is a device that can record various types of information related to surgery.
- the printer 155 is a device that can print various types of information related to surgery in various formats such as text, images, or graphs.
- FIG. 2 is a block diagram illustrating an example of functional configurations of the camera head 105 and the CCU 139.
- the camera head 105 includes a lens unit 107, an imaging unit 109, a driving unit 111, a communication unit 113, and a camera head control unit 115 as functions thereof.
- the CCU 139 includes a communication unit 159, an image processing unit 161, and a control unit 163 as its functions.
- the camera head 105 and the CCU 139 are connected to each other via a transmission cable 165 so that they can communicate with each other.
- the lens unit 107 is an optical system provided at a connection portion with the lens barrel 103. Observation light captured from the tip of the lens barrel 103 is guided to the camera head 105 and enters the lens unit 107.
- the lens unit 107 is configured by combining a plurality of lenses including a zoom lens and a focus lens. The optical characteristics of the lens unit 107 are adjusted so that the observation light is condensed on the light receiving surface of the image sensor of the imaging unit 109.
- the zoom lens and the focus lens are configured such that their positions on the optical axis are movable in order to adjust the magnification and focus of the captured image.
- the image pickup unit 109 is configured by an image pickup device, and is arranged at the rear stage of the lens unit 107.
- the observation light that has passed through the lens unit 107 is collected on the light receiving surface of the image sensor, and an image signal corresponding to the observation image is generated by photoelectric conversion.
- the image signal generated by the imaging unit 109 is provided to the communication unit 113.
- as the image pickup element constituting the image pickup unit 109, for example, an element capable of color imaging having a Bayer array, such as a CMOS (Complementary Metal Oxide Semiconductor) image sensor, is used.
- the image sensor that configures the image capturing unit 109 may be configured to include a pair of image sensors for acquiring right-eye and left-eye image signals corresponding to 3D display. By performing the 3D display, the operator 167 can more accurately grasp the depth of the living tissue in the surgical site.
- when the imaging unit 109 is configured as a multi-plate type, a plurality of lens unit 107 systems are provided corresponding to the respective imaging elements.
- the imaging unit 109 is not necessarily provided in the camera head 105.
- the imaging unit 109 may be provided inside the lens barrel 103 immediately after the objective lens.
- the driving unit 111 includes an actuator, and moves the zoom lens and the focus lens of the lens unit 107 by a predetermined distance along the optical axis under the control of the camera head control unit 115. Thereby, the magnification and the focus of the image captured by the imaging unit 109 can be adjusted as appropriate.
- the communication unit 113 includes a communication device for transmitting and receiving various types of information to and from the CCU 139.
- the communication unit 113 transmits the image signal obtained from the imaging unit 109 as RAW data to the CCU 139 via the transmission cable 165.
- the image signal is preferably transmitted by optical communication in order to display a captured image of the surgical site with low latency.
- this is because the operator 167 performs the operation while observing the state of the affected part through the captured image, and a moving image of the surgical site is required to be displayed in as close to real time as possible for safer and more reliable surgery.
- the communication unit 113 is provided with a photoelectric conversion module that converts an electrical signal into an optical signal.
- the image signal is converted into an optical signal by the photoelectric conversion module, and then transmitted to the CCU 139 via the transmission cable 165.
- the communication unit 113 receives a control signal for controlling the driving of the camera head 105 from the CCU 139.
- the control signal includes, for example, information specifying the frame rate of the captured image, information specifying the exposure value at the time of imaging, and / or information specifying the magnification and focus of the captured image.
- the communication unit 113 provides the received control signal to the camera head control unit 115.
- the control signal from the CCU 139 may also be transmitted by optical communication.
- the communication unit 113 is provided with a photoelectric conversion module that converts an optical signal into an electric signal.
- the control signal is converted into an electric signal by the photoelectric conversion module and then provided to the camera head control unit 115.
- the imaging conditions such as the frame rate, exposure value, magnification, and focus are automatically set by the control unit 163 of the CCU 139 based on the acquired image signal. That is, a so-called AE (Auto-Exposure) function, AF (Auto-Focus) function, and AWB (Auto-White Balance) function are mounted on the endoscope 101.
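The AE function, for instance, can be illustrated with a minimal, hypothetical sketch (the function name, the proportional-control rule, and all parameters below are illustrative assumptions, not the implementation described here): the exposure value is nudged toward a target mean luminance computed from the image signal.

```python
def auto_exposure_step(frame, exposure, target_mean=118.0, gain=0.01):
    # frame: 2-D list of luminance values (0-255).
    pixels = [p for row in frame for p in row]
    mean_luma = sum(pixels) / len(pixels)
    # Proportional correction: brighten when too dark, darken when too bright.
    return exposure + gain * (target_mean - mean_luma)

# An underexposed frame should raise the exposure value.
dark_frame = [[30.0] * 8 for _ in range(8)]
new_exposure = auto_exposure_step(dark_frame, exposure=1.0)
```

In practice, the detection processing result (described below for the image processing unit 161) would supply such statistics per frame.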
- the camera head control unit 115 controls driving of the camera head 105 based on a control signal from the CCU 139 received via the communication unit 113. For example, the camera head control unit 115 controls driving of the imaging element of the imaging unit 109 based on information specifying the frame rate of the captured image and / or information specifying the exposure at the time of imaging. For example, the camera head control unit 115 appropriately moves the zoom lens and the focus lens of the lens unit 107 via the drive unit 111 based on information specifying the magnification and focus of the captured image.
- the camera head control unit 115 may further have a function of storing information for identifying the lens barrel 103 and the camera head 105.
- the camera head 105 can be resistant to autoclave sterilization by arranging the lens unit 107, the imaging unit 109, and the like in a sealed structure with high airtightness and waterproofness.
- the communication unit 159 is configured by a communication device for transmitting and receiving various types of information to and from the camera head 105.
- the communication unit 159 receives an image signal transmitted from the camera head 105 via the transmission cable 165.
- the image signal can be suitably transmitted by optical communication.
- the communication unit 159 is provided with a photoelectric conversion module that converts an optical signal into an electric signal.
- the communication unit 159 provides the image processing unit 161 with the image signal converted into an electrical signal.
- the communication unit 159 transmits a control signal for controlling the driving of the camera head 105 to the camera head 105.
- the control signal may also be transmitted by optical communication.
- the image processing unit 161 performs various types of image processing on the image signal that is RAW data transmitted from the camera head 105.
- the image processing includes, for example, development processing, image quality enhancement processing (band enhancement processing, super-resolution processing, NR (Noise Reduction) processing, and / or camera shake correction processing), and / or enlargement processing (electronic zoom processing).
- the image processing unit 161 also performs detection processing on the image signal for AE, AF, and AWB.
- the image processing unit 161 is configured by a processor such as a CPU, a GPU, or a GPGPU, and the above-described image processing and detection processing are performed by the processor operating according to a predetermined program.
- when the image processing unit 161 is configured by a plurality of GPUs, it appropriately divides the information related to the image signal and performs image processing in parallel on the plurality of GPUs.
- the control unit 163 performs various controls relating to imaging of the surgical site by the endoscope 101 and display of the captured image. For example, the control unit 163 generates a control signal for controlling the driving of the camera head 105. Here, when the imaging condition is input by the user, the control unit 163 generates a control signal based on the input by the user.
- further, when the imaging conditions are set automatically, the control unit 163 appropriately calculates the optimum exposure value, focal length, and white balance according to the result of the detection processing by the image processing unit 161, and generates a control signal.
- the control unit 163 causes the display device 141 to display an image of the surgical site based on the image signal that has undergone image processing by the image processing unit 161.
- at this time, the control unit 163 recognizes various objects in the surgical site image using various image recognition techniques. For example, by detecting the shape and color of the edges of objects included in the surgical site image, the control unit 163 can recognize surgical instruments 117 such as the forceps 123, specific body parts, bleeding, mist during use of the energy treatment instrument 121, and so on.
- when displaying the surgical site image on the display device 141, the control unit 163 superimposes various types of surgery support information on the surgical site image using the recognition result. The surgery support information is displayed superimposed and presented to the operator 167, so that the surgery can be performed more safely and reliably.
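The recognition-and-superimposition idea above can be sketched in a simplified, hypothetical form: pixels matching a detection criterion (standing in for the edge-shape or color detection just described) are marked so that support information is overlaid on the image. The function name and the threshold criterion are illustrative assumptions.

```python
def highlight_regions(frame, predicate, marker=255):
    # frame: 2-D list of pixel values; predicate picks pixels that belong to a
    # recognized object (e.g. an instrument edge or a bleeding-like color).
    # Matching pixels are replaced by a marker value, superimposing the
    # support information on the surgical site image.
    return [[marker if predicate(p) else p for p in row] for row in frame]

frame = [[10, 200, 10], [10, 10, 210]]
overlay = highlight_regions(frame, lambda p: p > 180)
```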
- the transmission cable 165 connecting the camera head 105 and the CCU 139 is an electric signal cable corresponding to electric signal communication, an optical fiber corresponding to optical communication, or a composite cable thereof.
- here, wired communication is performed using the transmission cable 165, but communication between the camera head 105 and the CCU 139 may be performed wirelessly.
- when communication between the two is performed wirelessly, there is no need to install the transmission cable 165 in the operating room, so the situation in which movement of the medical staff in the operating room is hindered by the transmission cable 165 can be eliminated.
- the image quality of the image may change before and after the SW update.
- the doctor may want to compare the image before the image quality changes with the image after the image quality changes.
- there is a risk that such a change in image quality may affect doctors' intraoperative diagnosis and operation of surgical tools.
- for example, when SW that adds camera shake correction processing is installed by the update, the image of the surgical site is stabilized and its visibility improves in many surgical scenes.
- also, for example, when fine tissue such as a blood vessel or a nerve is displayed with emphasis, the visibility of that portion can be improved, but other portions may conversely become difficult to see.
- the image processing apparatus 201 in FIG. 3 corresponds to the above-described CCU 139; it performs image processing on the image signal of the surgical site image (input image) input from, for example, the camera head 105, and outputs the processed image signal to the display device 202.
- the display device 202 corresponds to the display device 141 described above, and displays an image based on the image signal output from the image processing device 201.
- the image processing apparatus 201 includes an image processing unit 211 and a display control unit 212.
- the image processing unit 211 corresponds to the image processing unit 161 described above, and performs various types of image processing on the image signal of the input image. These image processes are realized by software (SW), and a processor such as a GPU operates according to a predetermined program to execute these image processes.
- the SW for realizing these image processes is updated by installing a predetermined program from a recording medium, for example.
- a program may be installed via a network such as the Internet.
- the image processing unit 211 includes a post-SW update image processing unit 221 and a pre-SW update image processing unit 222.
- the post-SW update image processing unit 221 outputs an image signal of a post-update processed image obtained by applying, to the input image, the image processing after the SW update, that is, the image processing realized by the latest version of the SW.
- the pre-SW update image processing unit 222 outputs an image signal of a pre-update processed image obtained by applying, to the input image, the image processing before the SW update, that is, the image processing realized by the previous version of the SW.
- when the post-SW update image processing unit 221 and the pre-SW update image processing unit 222 are configured by two GPUs, the image processing after the SW update and the image processing before the SW update can be performed in parallel, that is, simultaneously.
- alternatively, the post-SW update image processing unit 221 and the pre-SW update image processing unit 222 may be configured by one GPU so that the image processing after the SW update and the image processing before the SW update are performed serially.
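As a rough sketch of the parallel configuration, two worker threads can stand in for the two GPUs, with trivial stand-in filters for the pre- and post-update SW (all names and the filters themselves are hypothetical):

```python
from concurrent.futures import ThreadPoolExecutor

def process_pre_update(frame):
    # Stand-in for the previous SW version's image processing.
    return [p + 1 for p in frame]

def process_post_update(frame):
    # Stand-in for the latest SW version's image processing.
    return [p * 2 for p in frame]

def process_both(frame):
    # Run both versions on the same input frame in parallel,
    # mirroring the two-GPU configuration described above.
    with ThreadPoolExecutor(max_workers=2) as pool:
        post = pool.submit(process_post_update, frame)
        pre = pool.submit(process_pre_update, frame)
        return post.result(), pre.result()

post_img, pre_img = process_both([1, 2, 3])
```

Running the two pipelines serially on one device would simply call the two functions in sequence.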
- the display control unit 212 displays at least a part of at least one of the post-update processed image and the pre-update processed image on the display device 202, based on the image signals output from the image processing unit 211 (the post-SW update image processing unit 221 and the pre-SW update image processing unit 222). Further, the display control unit 212 controls the display of the post-update processed image and the pre-update processed image on the display device 202 in accordance with a user operation.
- FIG. 4 is a flowchart illustrating the flow of the surgical site image display processing.
- in step S21, the post-SW update image processing unit 221 performs the image processing after the SW update (image processing realized by the latest version of the SW) on the input surgical site image.
- in step S22, the pre-SW update image processing unit 222 performs the image processing before the SW update (image processing realized by the previous version of the SW) on the input surgical site image.
- the processing in steps S21 and S22 is performed simultaneously in parallel, but may be performed serially as described above.
- the image signal of the post-update processed image subjected to the image processing after SW update and the image signal of the pre-update processed image subjected to the image processing prior to SW update are output to the display control unit 212.
- in step S23, the display control unit 212 displays the post-update processed image and the pre-update processed image on the display device 202, based on the image signals output from the post-SW update image processing unit 221 and the pre-SW update image processing unit 222.
- as described above, both the image after its image quality is changed by the SW update and the image before the change are displayed as the surgical site image captured by the endoscope 101, so an operator performing surgery using the endoscope 101 can easily compare these images.
- FIG. 5 shows a first display example of the post-update processed image and the pre-update processed image.
- FIG. 5 shows a state in which the post-update processed image 251 and the pre-update processed image 252 are simultaneously displayed on one screen.
- the post-update processed image 251 displayed in the upper left area of the screen is displayed larger than the pre-update processed image 252 displayed in the upper right area of the screen. Thereby, an operator performing surgery using the endoscope 101 can easily compare the post-update processed image 251 and the pre-update processed image 252.
- the display sizes of the post-update processed image 251 and the pre-update processed image 252 on the screen can be adjusted according to an operation of the user (operator). Specifically, the area in which the post-update processed image 251 is displayed and the area in which the pre-update processed image 252 is displayed may be switched on the screen, so that either image can be displayed at the larger size.
- FIG. 6 shows a second display example of the post-update processed image and the pre-update processed image.
- FIG. 6 shows a state in which the user (operator) selected from the post-update processing image 251 and the pre-update processing image 252 is displayed.
- for example, when display of the surgical site image is started, the pre-update processed image 252 is displayed on the display device 202. This avoids confusing the user with an image whose image quality has changed at the moment display of the surgical site image starts.
- the pre-update processed image 252 and the post-update processed image 251 are switched and displayed by a user operation.
- the user interface for switching the display is preferably provided at a location the operator can reach instantly, such as the camera head 105 of the endoscope 101 or the surgical instrument 117.
- FIG. 7 shows a third display example of the post-update processed image and the pre-update processed image.
- FIG. 7 shows a state in which the post-update processed image 251 and the pre-update processed image 252 are added pixel by pixel at a predetermined ratio and displayed.
- specifically, the value obtained by multiplying the pixel value of each pixel of the pre-update processed image 252 by a predetermined value α (α < 1) and the value obtained by multiplying the pixel value of each pixel of the post-update processed image 251 by (1 - α) are summed, and the output image 261 is displayed with that sum as the pixel value of each pixel.
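The weighted addition can be sketched directly (a hypothetical helper; the text does not specify an implementation):

```python
def blend(pre_img, post_img, alpha):
    # Output pixel = alpha * pre-update pixel + (1 - alpha) * post-update pixel,
    # matching the weighted sum described for the output image 261.
    assert 0.0 <= alpha <= 1.0
    return [alpha * p + (1.0 - alpha) * q for p, q in zip(pre_img, post_img)]

# alpha = 0.25 weights the post-update image more heavily.
output = blend([100.0, 200.0], [200.0, 100.0], alpha=0.25)
```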
- FIG. 8 shows a fourth display example of the post-update processed image and the pre-update processed image.
- FIG. 8 shows a state in which the first area of the post-update processed image 251 and the second area other than the first area of the pre-update processed image 252 are combined and displayed.
- in the left part of FIG. 8, an image of the portion corresponding to the area D1 in the post-update processed image 251 is displayed in the right area D1 of the screen divided into left and right, and an image of the portion corresponding to the area D2 in the pre-update processed image 252 is displayed in the left area D2.
- the width w of the region D1, in other words, the boundary between the region D1 and the region D2, can be adjusted by an operation of the user (operator). Further, the images displayed in the region D1 and the region D2 may be switched by a user operation.
- in the right part of FIG. 8, an image of the portion corresponding to the region D3 in the post-update processed image 251 is displayed in a region including the center of the screen, specifically a circular region D3 centered on the center of the screen, and an image of the portion corresponding to the region D4 in the pre-update processed image 252 is displayed in the peripheral region D4 outside the region D3.
- the radius r of the region D3, in other words, the boundary between the region D3 and the region D4, can be adjusted by an operation of the user (operator).
- the images displayed in the region D3 and the region D4 may be switched by a user operation.
- the shape of the region D3 is not limited to a circle and may be other shapes such as a rectangle.
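The left/right compositing of FIG. 8 can be sketched as follows (a hypothetical helper; the circular region D3 could be handled analogously with a per-pixel distance test against the radius r):

```python
def composite_split(post_img, pre_img, w):
    # post_img / pre_img: 2-D lists of equal size.
    # The right-hand region D1 of width w shows the post-update image;
    # the remaining left-hand region D2 shows the pre-update image.
    out = []
    for post_row, pre_row in zip(post_img, pre_img):
        cols = len(post_row)
        out.append(pre_row[:cols - w] + post_row[cols - w:])
    return out

post = [[1, 1, 1, 1] for _ in range(2)]
pre = [[0, 0, 0, 0] for _ in range(2)]
combined = composite_split(post, pre, w=1)
```

Adjusting w by a user operation then simply moves the boundary between the two regions.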
- FIG. 9 illustrates a configuration example of an image processing apparatus according to the second embodiment of the present technology.
- the image processing device 271 in FIG. 9 performs image processing on the image signal of the input image and outputs it to the display devices 202-1 and 202-2.
- the display devices 202-1 and 202-2 each display an image based on the image signal output from the image processing device 271.
- the image processing device 271 includes an image processing unit 211 and a display control unit 281.
- the image processing unit 211 is the same as the configuration shown in FIG.
- the display control unit 281 displays the post-update processed image on the display device 202-1 and the pre-update processed image on the display device 202-2, based on the image signals output from the image processing unit 211 (the post-SW update image processing unit 221 and the pre-SW update image processing unit 222).
- as described above, since the image after its image quality is changed by the SW update and the image before the change are each displayed on one of the two display devices as the surgical site image captured by the endoscope 101, an operator performing surgery using the endoscope 101 can easily compare these images.
- FIG. 10 illustrates a configuration example of an image processing device according to the third embodiment of the present technology.
- the image processing apparatus in FIG. 10 includes an image processing unit 311 and a display control unit 312.
- the image processing unit 311 includes a post-SW update image processing unit 321, a pre-SW update image processing unit 322, and a difference calculation unit 323.
- the post-SW update image processing unit 321, like the post-SW update image processing unit 221 in FIG. 3, outputs an image signal of the post-update processed image obtained by performing the image processing after the SW update on the input image.
- the pre-SW update image processing unit 322, like the pre-SW update image processing unit 222 in FIG. 3, outputs an image signal of the pre-update processed image obtained by performing the image processing before the SW update on the input image.
- the difference calculation unit 323 calculates the difference between the pre-update processing image and the post-update processing image, and supplies the calculation result to the display control unit 312.
- the display control unit 312 displays at least a part of at least one of the post-update processed image and the pre-update processed image on the display device 202, based on the image signals output from the image processing unit 311 (the post-SW update image processing unit 321 and the pre-SW update image processing unit 322).
- the display control unit 312 includes an OSD processing unit 331.
- the OSD processing unit 331 performs on-screen display (OSD) of information corresponding to the calculation result from the image processing unit 311 (difference calculation unit 323) on the screen of the display device 202.
- FIG. 11 is a flowchart illustrating the flow of the surgical site image display processing.
- in step S31, the post-SW update image processing unit 321 performs the image processing after the SW update (image processing realized by the latest version of the SW) on the input surgical site image.
- in step S32, the pre-SW update image processing unit 322 performs the image processing before the SW update (image processing realized by the previous version of the SW) on the input surgical site image.
- steps S31 and S32 are performed simultaneously in parallel, but may be performed serially as described above.
- the image signal of the post-update processed image subjected to the image processing after SW update and the image signal of the pre-update processed image subjected to the image processing before SW update are output to the display control unit 312 and the difference calculation unit 323.
- the display control unit 312 displays only the post-update processed image on the display device 202 based on the image signal output from the post-SW update image processing unit 321.
- in step S33, the difference calculation unit 323 calculates the difference between the pre-update processed image and the post-update processed image. Specifically, the difference calculation unit 323 calculates the difference between the pixel values of corresponding pixels of the two images and sums the differences over the image. The calculation result is supplied to the display control unit 312.
- in step S34, the display control unit 312 determines whether the difference is greater than a predetermined value based on the calculation result from the difference calculation unit 323.
- if it is determined that the difference is not greater than the predetermined value, the process of step S34 is repeated, and the post-update processed image continues to be displayed on the display device 202.
- on the other hand, if it is determined that the difference is greater than the predetermined value, the process proceeds to step S35.
- in step S35, the OSD processing unit 331 of the display control unit 312 performs OSD display of information notifying that the difference is larger than the predetermined value.
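Steps S33 and S34 can be sketched as follows (hypothetical helper names; the text specifies only a per-pixel difference summed over the image, compared against a predetermined value):

```python
def image_difference(pre_img, post_img):
    # Sum of per-pixel absolute differences between the two processed images.
    return sum(abs(p - q) for p, q in zip(pre_img, post_img))

def should_notify(pre_img, post_img, threshold):
    # True when the image-quality difference exceeds the predetermined value,
    # which would trigger the OSD notification of step S35.
    return image_difference(pre_img, post_img) > threshold

flag = should_notify([10, 10, 10], [10, 30, 10], threshold=15)
```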
- specifically, as illustrated on the left side of the figure, the OSD processing unit 331 superimposes (OSD-displays) a notification image 361, notifying that the difference is larger than the predetermined value, on the post-update processed image displayed on the display device 202.
- the user can recognize that there is a difference in image quality between the post-update processed image and the pre-update processed image in the scene currently being viewed.
- when the user performs a predetermined operation, such as touching the portion of the notification image 361 on the screen of the display device 202, a selection screen for selecting a display mode for comparing the post-update processed image and the pre-update processed image is displayed.
- the display control unit 312 displays a selection screen provided with buttons 381 to 384 for selecting one of four display modes. In each of the four display modes, for example, the display described in display examples 1 to 4 above is performed.
- as described above, when there is a large difference between the image whose image quality has been changed by the SW update and the image before the change, the operator performing surgery using the endoscope 101 can display these images in a desired display form and easily compare them.
- FIG. 13 illustrates a configuration example of an image processing device according to the fourth embodiment of the present technology.
- the image processing apparatus 401 in FIG. 13 includes an image processing unit 411, a display control unit 412, a scene detection unit 413, a feature amount extraction unit 414, a recording control unit 415, a history information recording unit 416, a learning unit 417, and a learning information recording unit 418.
- the image processing unit 411 includes a post-SW update image processing unit 421 and a pre-SW update image processing unit 422.
- the post-SW update image processing unit 421, like the post-SW update image processing unit 221 in FIG. 3, outputs an image signal of the post-update processed image obtained by performing the image processing after the SW update on the input image.
- the pre-SW update image processing unit 422, like the pre-SW update image processing unit 222 in FIG. 3, outputs an image signal of the pre-update processed image obtained by performing the image processing before the SW update on the input image.
- the display control unit 412 displays, on the display device 202, the one selected by the user from the post-update processed image and the pre-update processed image, based on the image signals output from the image processing unit 411 (the post-SW update image processing unit 421 and the pre-SW update image processing unit 422).
- the display control unit 412 includes an OSD processing unit 431.
- the OSD processing unit 431 performs OSD display, on the screen of the display device 202, of information corresponding to an importance flag from the scene detection unit 413 described later.
- the scene detection unit 413 detects a scene of high importance in the input image. When such a scene is detected, the scene detection unit 413 supplies an importance flag indicating that the scene is of high importance to the display control unit 412 and the feature amount extraction unit 414.
- the feature amount extraction unit 414 extracts a feature amount of a highly important scene detected in the input image based on the importance flag from the scene detection unit 413, and supplies it to the recording control unit 415.
- the recording control unit 415 associates the feature amount from the feature amount extraction unit 414 with the selection information from the display control unit 412 and records it in the history information recording unit 416 as history information.
- the selection information is information indicating which one of the post-update processed image and the pre-update processed image has been selected by the user for a highly important scene detected in the input image.
- based on the history information recorded in the history information recording unit 416, the learning unit 417 learns, for each feature amount, which of the post-update processed image and the pre-update processed image was selected by the user for the highly important scenes detected in the input image.
- the learning result obtained by learning is recorded in the learning information recording unit 418 as learning information.
- FIG. 14 is a flowchart illustrating the flow of the surgical site image display processing.
- in step S41, the post-SW update image processing unit 421 performs the image processing after the SW update (image processing realized by the latest version of the SW) on the input image.
- in step S42, the pre-SW update image processing unit 422 performs the image processing before the SW update (image processing realized by the previous version of the SW) on the input image.
- steps S41 and S42 are performed simultaneously in parallel, but may be performed serially as described above.
- the image signal of the post-update processed image subjected to the image processing after SW update and the image signal of the pre-update processed image subjected to the image processing prior to SW update are output to the display control unit 412.
- the display control unit 412 displays both the post-update processed image and the pre-update processed image on the display device 202 based on the image signal output from the image processing unit 411.
- in step S43, the scene detection unit 413 determines whether a scene of high importance is detected in the input image.
- the highly important scene is, for example, a scene in which tissue incision or detachment is performed, and such scenes are detected based on the shape of the surgical instrument and the color of the surgical site shown in the input image (surgical site image).
- step S43 If it is determined that a scene with high importance is not detected, the process of step S43 is repeated, and both the post-update processed image and the pre-update processed image continue to be displayed on the display device 202.
- on the other hand, if it is determined that a scene of high importance is detected, the scene detection unit 413 supplies an importance flag to the display control unit 412 and the feature amount extraction unit 414, and the process proceeds to step S44.
- in step S44, the feature amount extraction unit 414 extracts the feature amount of the highly important scene detected in the input image based on the importance flag from the scene detection unit 413, and supplies it to the recording control unit 415.
- the feature amount of a highly important scene is information corresponding to the shape of the surgical instrument, the color of the surgical site, and the like, which serve as the basis for detecting the scene.
- in step S45, the OSD processing unit 431 of the display control unit 412 performs OSD display of information indicating that the currently displayed scene is a highly important scene, based on the importance flag from the scene detection unit 413.
- step S44 and step S45 may be performed in parallel.
- thereafter, the image processing apparatus 401 causes the user to select which of the post-update processed image and the pre-update processed image is to be displayed, and the process proceeds to step S46.
- in step S46, the display control unit 412 causes the display device 202 to display only the one selected by the user from the post-update processed image and the pre-update processed image.
- the display control unit 412 supplies the recording control unit 415 with selection information indicating which one of the post-update processed image and the pre-update processed image is selected by the user (displayed on the display device 202).
- in step S47, the recording control unit 415 associates the feature amount from the feature amount extraction unit 414 with the selection information from the display control unit 412, and records them in the history information recording unit 416 as history information.
- in step S48, based on the history information recorded in the history information recording unit 416, the learning unit 417 learns, for each feature amount, which of the post-update processed image and the pre-update processed image was selected by the user for the highly important scenes detected in the input image. Here, for each feature amount, which of the post-update processed image and the pre-update processed image is selected more often by the user is learned by machine learning or the like.
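The learning in step S48 can be sketched under a simplifying assumption: "learning" is reduced to a per-feature majority count of the user's selections, a stand-in for the machine learning mentioned above (the class and method names are hypothetical):

```python
from collections import Counter, defaultdict

class SelectionLearner:
    # Hypothetical sketch of step S48: for each scene feature amount,
    # count which image (post-update or pre-update) the user selected,
    # and report the majority choice as the learned preference.
    def __init__(self):
        self.history = defaultdict(Counter)

    def record(self, feature, selection):
        # Corresponds to appending one history-information entry (step S47).
        self.history[feature][selection] += 1

    def preferred(self, feature):
        counts = self.history.get(feature)
        return counts.most_common(1)[0][0] if counts else None

learner = SelectionLearner()
learner.record("incision", "post_update")
learner.record("incision", "post_update")
learner.record("incision", "pre_update")
choice = learner.preferred("incision")
```

The fifth embodiment described below could then consult such a learned preference to select the image automatically.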
- With the above processing, the surgeon performing an operation using the endoscope 101 can easily compare, for a highly important scene, the image whose quality was changed by the SW update with the image from before the change, and can select and display the more suitable one. In addition, for highly important scenes, it is possible to learn which display mode, that of the image after the quality change due to the SW update or that of the image before the change, is more suitable for the operator.
- FIG. 15 illustrates a configuration example of an image processing apparatus according to the fifth embodiment of the present technology.
- Using the learning result described above, the image processing apparatus 501 in FIG. 15 automatically selects and displays, for a highly important scene in a newly input surgical site image, whichever of the post-update processed image and the pre-update processed image is more suitable for the surgeon.
- the image processing apparatus 501 includes an image processing unit 411, a display control unit 412, a scene detection unit 413, a feature amount extraction unit 414, a learning information recording unit 418, and a learning information inquiry unit 511.
- The image processing unit 411, the display control unit 412, the scene detection unit 413, the feature amount extraction unit 414, and the learning information recording unit 418 are the same as those of the image processing apparatus 401 of the fourth embodiment, so their description is omitted.
- The learning information inquiry unit 511 queries the learning information recorded in the learning information recording unit 418 for the learning information corresponding to the feature amount of the highly important scene detected in the newly input image.
- Based on the queried learning information, the learning information inquiry unit 511 supplies the selection information associated with the feature amount to the display control unit 412.
- (Flow of surgical site image display processing) Next, the flow of the surgical site image display processing by the image processing apparatus 501 will be described with reference to the flowchart of FIG. 16.
- The processing in FIG. 16 is started when a surgical site image (moving image) obtained by imaging the surgical site is input to the image processing apparatus 501 from the endoscope 101 (camera head 105).
- The processing in steps S51 to S54 in FIG. 16 is basically the same as the processing in steps S41 to S44 in FIG. 14.
- In step S54, when the feature amount of the highly important scene detected in the input image is extracted, the feature amount is supplied to the learning information inquiry unit 511.
- In step S55, the learning information inquiry unit 511 queries the learning information recorded in the learning information recording unit 418 for the learning information corresponding to the feature amount from the feature amount extraction unit 414.
- Based on the queried learning information, the learning information inquiry unit 511 supplies the selection information associated with the feature amount from the feature amount extraction unit 414 to the display control unit 412.
- In step S56, based on the selection information from the learning information inquiry unit 511, the display control unit 412 causes the display device 202 to display only the image indicated by the selection information, from among the post-update processed image and the pre-update processed image.
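The query in steps S55 and S56 then reduces to a lookup into previously learned per-feature selection information. The following is a minimal sketch with hypothetical names; `LearningInfoInquiry`, the string feature keys, and the fallback default are assumptions for illustration, not details from the patent.

```python
class LearningInfoInquiry:
    """Returns selection information for a scene feature amount (cf. steps S55-S56)."""

    def __init__(self, learning_info, default="post_update"):
        # learning_info: mapping from feature key to preferred image
        self.learning_info = learning_info
        self.default = default  # used when a feature amount was never learned

    def query(self, feature_key):
        return self.learning_info.get(feature_key, self.default)


def display_for_scene(feature_key, inquiry, post_image, pre_image):
    """Return only the image indicated by the queried selection information."""
    return post_image if inquiry.query(feature_key) == "post_update" else pre_image


inquiry = LearningInfoInquiry({"forceps/red": "pre_update"})
print(display_for_scene("forceps/red", inquiry, "POST", "PRE"))   # PRE (learned preference)
print(display_for_scene("unseen/scene", inquiry, "POST", "PRE"))  # POST (fallback default)
```

The fallback for unseen feature amounts is a design choice not specified in the text; defaulting to the post-update image simply keeps the newest processing active when no preference has been learned.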
- In the above, the case where the technology according to the present disclosure is applied to an endoscopic surgery system has been described.
- However, the system to which the technology according to the present disclosure can be applied is not limited to such an example.
- For example, the technology according to the present disclosure may be applied to a flexible endoscope system for examination or a microscope surgery system.
- The present technology can also have the following configurations.
- A surgical image processing apparatus including: an image processing unit that performs image processing by software on a surgical site image;
- and a display control unit that controls display of the surgical site image subjected to the image processing,
- in which the image processing unit generates a pre-update processed image obtained by performing the image processing before a software update on the surgical site image and a post-update processed image obtained by performing the image processing after the software update on the surgical site image,
- and the display control unit controls display of at least a part of at least one of the pre-update processed image and the post-update processed image.
- The surgical image processing apparatus in which the pre-update processed image and the post-update processed image are added pixel by pixel at a predetermined ratio and displayed.
- The surgical image processing apparatus according to (1) or (2), in which the display control unit combines and displays a first region of the pre-update processed image and a second region of the post-update processed image other than the first region.
- The first region is one of two regions into which the screen is divided left and right,
- and the surgical image processing apparatus according to (8), in which the second region is the other of the two regions into which the screen is divided left and right.
- The first region is a central region including the center of the screen, and the second region is a peripheral region outside the central region on the screen.
- The apparatus further including a difference calculation unit that calculates a difference between the pre-update processed image and the post-update processed image,
- in which the display control unit controls display of information notifying the user of that fact when the difference is larger than a predetermined value.
- The apparatus further including a scene detection unit that detects a predetermined scene in the image,
- in which the display control unit displays, for the detected scene, whichever of the pre-update processed image and the post-update processed image is selected by the user.
- The surgical image processing apparatus according to (1) or (2).
- The apparatus further including: a feature amount extraction unit that extracts a feature amount of the detected scene; and a recording control unit that associates the extracted feature amount with selection information indicating which of the pre-update processed image and the post-update processed image was selected for the detected scene, and records them as history information.
- The surgical image processing apparatus further including: (15) a learning unit that learns, based on the history information, for each feature amount of the scene, which of the pre-update processed image and the post-update processed image was selected by the user for the detected scene.
- The surgical image processing apparatus further including: (16) an inquiry unit that queries a learning result corresponding to the feature amount of the predetermined scene detected in another image,
- in which the display control unit displays, based on the queried learning result, either the pre-update processed image or the post-update processed image for the predetermined scene in the other image. The surgical image processing apparatus according to (15).
- An image processing method in which a surgical image processing apparatus including an image processing unit that performs image processing by software on a surgical site image and a display control unit that controls display of the surgical site image subjected to the image processing generates a pre-update processed image obtained by performing the image processing before a software update on the surgical site image and a post-update processed image obtained by performing the image processing after the software update on the surgical site image, the method including a step of controlling display of at least a part of at least one of the pre-update processed image and the post-update processed image.
- A surgery system including: a surgical imaging apparatus that acquires a surgical site image; and a surgical image processing apparatus having an image processing unit that performs image processing by software on the surgical site image and a display control unit that controls display of the surgical site image subjected to the image processing, in which the image processing unit generates a pre-update processed image obtained by performing the image processing before a software update on the surgical site image and a post-update processed image obtained by performing the image processing after the software update on the surgical site image, and the display control unit controls display of at least a part of at least one of the pre-update processed image and the post-update processed image.
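As a rough illustration of the display modes enumerated above, the per-pixel addition in (7), the left/right compositing in (8) and (9), and the difference check in (12) can be sketched on tiny grayscale frames. This is a hedged sketch, not the patent's implementation: pure-Python lists of pixel values are used for brevity, whereas a real device would process full-resolution color video, typically on a GPU.

```python
def blend(pre, post, ratio=0.5):
    """(7): add the pre- and post-update images pixel by pixel at a predetermined ratio."""
    return [[ratio * a + (1 - ratio) * b for a, b in zip(row_pre, row_post)]
            for row_pre, row_post in zip(pre, post)]


def split_left_right(pre, post):
    """(8)/(9): left half from the pre-update image, right half from the post-update image."""
    width = len(pre[0])
    return [row_pre[:width // 2] + row_post[width // 2:]
            for row_pre, row_post in zip(pre, post)]


def difference_exceeds(pre, post, threshold):
    """(12): notify when the mean absolute per-pixel difference exceeds a threshold."""
    total = sum(abs(a - b)
                for row_pre, row_post in zip(pre, post)
                for a, b in zip(row_pre, row_post))
    pixels = len(pre) * len(pre[0])
    return total / pixels > threshold


# toy 2x4 grayscale frames (values are arbitrary illustration data)
pre = [[10, 10, 10, 10], [10, 10, 10, 10]]
post = [[30, 30, 30, 30], [30, 30, 30, 30]]

print(blend(pre, post, 0.5))          # every pixel becomes 20.0
print(split_left_right(pre, post))    # [[10, 10, 30, 30], [10, 10, 30, 30]]
print(difference_exceeds(pre, post, 15))  # True: mean difference is 20
```

The "predetermined ratio" and the difference metric (here a mean absolute difference) are left unspecified in the text; other metrics such as a per-pixel maximum would fit the same claim wording.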
- 201 image processing device, 211 image processing unit, 212 display control unit, 221 post-SW-update image processing unit, 222 pre-SW-update image processing unit, 281 display control unit, 301 image processing device, 311 image processing unit, 312 display control unit, 321 post-SW-update image processing unit, 322 pre-SW-update image processing unit, 323 difference calculation unit, 331 OSD processing unit, 401 image processing device, 411 image processing unit, 412 display control unit, 413 scene detection unit, 414 feature amount extraction unit, 415 recording control unit, 416 history information recording unit, 417 learning unit, 418 learning information recording unit, 421 post-SW-update image processing unit, 422 pre-SW-update image processing unit, 431 OSD processing unit, 511 learning information inquiry unit
Landscapes
- Engineering & Computer Science (AREA)
- Health & Medical Sciences (AREA)
- Life Sciences & Earth Sciences (AREA)
- Surgery (AREA)
- Signal Processing (AREA)
- Physics & Mathematics (AREA)
- Medical Informatics (AREA)
- Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
- General Health & Medical Sciences (AREA)
- Radiology & Medical Imaging (AREA)
- Public Health (AREA)
- Theoretical Computer Science (AREA)
- Biomedical Technology (AREA)
- Multimedia (AREA)
- Software Systems (AREA)
- Veterinary Medicine (AREA)
- Heart & Thoracic Surgery (AREA)
- Pathology (AREA)
- Molecular Biology (AREA)
- Animal Behavior & Ethology (AREA)
- General Engineering & Computer Science (AREA)
- Biophysics (AREA)
- Optics & Photonics (AREA)
- General Physics & Mathematics (AREA)
- Epidemiology (AREA)
- Primary Health Care (AREA)
- Computer Security & Cryptography (AREA)
- Business, Economics & Management (AREA)
- General Business, Economics & Management (AREA)
- Quality & Reliability (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Human Computer Interaction (AREA)
- Gynecology & Obstetrics (AREA)
- Oral & Maxillofacial Surgery (AREA)
- Endoscopes (AREA)
Abstract
Description
2. Regarding the software update
3. First embodiment
4. Second embodiment
5. Third embodiment
6. Fourth embodiment
7. Fifth embodiment
First, an overview of an endoscopic surgery system to which the present technology is applied will be described.
Incidentally, in the endoscopic surgery system described above, various types of image processing are applied to the images captured by the endoscope. This image processing is realized by software: a processor such as a GPU executes the image processing by operating in accordance with a predetermined program.
(Configuration example of the image processing apparatus)
First, a configuration example of the image processing apparatus according to the first embodiment of the present technology will be described with reference to FIG. 3.
Next, the flow of the surgical site image display processing by the image processing apparatus 201 will be described with reference to the flowchart of FIG. 4. The processing in FIG. 4 is started when a surgical site image (moving image) obtained by imaging the surgical site is input to the image processing apparatus 201 from the endoscope 101 (camera head 105).
FIG. 5 shows a first display example of the post-update processed image and the pre-update processed image.
FIG. 6 shows a second display example of the post-update processed image and the pre-update processed image.
FIG. 7 shows a third display example of the post-update processed image and the pre-update processed image.
FIG. 8 shows a fourth display example of the post-update processed image and the pre-update processed image.
(Configuration example of the image processing apparatus)
FIG. 9 illustrates a configuration example of the image processing apparatus according to the second embodiment of the present technology.
(Configuration example of the image processing apparatus)
FIG. 10 illustrates a configuration example of the image processing apparatus according to the third embodiment of the present technology.
Next, the flow of the surgical site image display processing by the image processing apparatus 301 will be described with reference to the flowchart of FIG. 11. The processing in FIG. 11 is started when a surgical site image (moving image) obtained by imaging the surgical site is input to the image processing apparatus 301 from the endoscope 101 (camera head 105).
(Configuration example of the image processing apparatus)
FIG. 13 illustrates a configuration example of the image processing apparatus according to the fourth embodiment of the present technology.
Next, the flow of the surgical site image display processing by the image processing apparatus 401 will be described with reference to the flowchart of FIG. 14. The processing in FIG. 14 is started when a surgical site image (moving image) obtained by imaging the surgical site is input to the image processing apparatus 401 from the endoscope 101 (camera head 105).
(Configuration example of the image processing apparatus)
FIG. 15 illustrates a configuration example of the image processing apparatus according to the fifth embodiment of the present technology.
Next, the flow of the surgical site image display processing by the image processing apparatus 501 will be described with reference to the flowchart of FIG. 16. The processing in FIG. 16 is started when a surgical site image (moving image) obtained by imaging the surgical site is input to the image processing apparatus 501 from the endoscope 101 (camera head 105).
(1)
A surgical image processing apparatus including:
an image processing unit that performs image processing by software on a surgical site image; and
a display control unit that controls display of the surgical site image subjected to the image processing,
in which the image processing unit generates a pre-update processed image obtained by performing the image processing before a software update on the surgical site image and a post-update processed image obtained by performing the image processing after the software update on the surgical site image, and
the display control unit controls display of at least a part of at least one of the pre-update processed image and the post-update processed image.
(2)
The surgical image processing apparatus according to (1), in which the image processing unit performs the image processing before the software update and the image processing after the software update in parallel.
(3)
The surgical image processing apparatus according to (1) or (2), in which the display control unit displays the pre-update processed image and the post-update processed image simultaneously on one screen.
(4)
The surgical image processing apparatus according to (3), in which the display control unit adjusts the display sizes of the pre-update processed image and the post-update processed image on the screen in accordance with a user operation.
(5)
The surgical image processing apparatus according to (1) or (2), in which the display control unit displays the pre-update processed image and the post-update processed image on different display devices.
(6)
The surgical image processing apparatus according to (1) or (2), in which the display control unit displays whichever of the pre-update processed image and the post-update processed image is selected by a user.
(7)
The surgical image processing apparatus according to (1) or (2), in which the display control unit adds the pre-update processed image and the post-update processed image pixel by pixel at a predetermined ratio and displays the result.
(8)
The surgical image processing apparatus according to (1) or (2), in which the display control unit combines and displays a first region of the pre-update processed image and a second region of the post-update processed image other than the first region.
(9)
The surgical image processing apparatus according to (8), in which the first region is one of two regions into which a screen is divided left and right, and the second region is the other of the two regions.
(10)
The surgical image processing apparatus according to (8), in which the first region is a central region including a center of a screen, and the second region is a peripheral region outside the central region on the screen.
(11)
The surgical image processing apparatus according to any one of (8) to (10), in which a boundary between the first region and the second region is determined by a user operation.
(12)
The surgical image processing apparatus according to (1) or (2), further including a difference calculation unit that calculates a difference between the pre-update processed image and the post-update processed image, in which the display control unit controls display of information notifying a user of that fact when the difference is larger than a predetermined value.
(13)
The surgical image processing apparatus according to (1) or (2), further including a scene detection unit that detects a predetermined scene in the image, in which the display control unit displays, for the detected scene, whichever of the pre-update processed image and the post-update processed image is selected by a user.
(14)
The surgical image processing apparatus according to (13), further including: a feature amount extraction unit that extracts a feature amount of the detected scene; and a recording control unit that associates the extracted feature amount with selection information indicating which of the pre-update processed image and the post-update processed image was selected by the user for the detected scene, and records them as history information.
(15)
The surgical image processing apparatus according to (14), further including a learning unit that learns, based on the history information, for each feature amount of the scene, which of the pre-update processed image and the post-update processed image was selected by the user for the detected scene.
(16)
The surgical image processing apparatus according to (15), further including an inquiry unit that queries a learning result corresponding to the feature amount of the predetermined scene detected in another image, in which the display control unit displays, based on the queried learning result, either the pre-update processed image or the post-update processed image for the predetermined scene in the other image.
(17)
An image processing method in which a surgical image processing apparatus including an image processing unit that performs image processing by software on a surgical site image and a display control unit that controls display of the surgical site image subjected to the image processing generates a pre-update processed image obtained by performing the image processing before a software update on the surgical site image and a post-update processed image obtained by performing the image processing after the software update on the surgical site image, the method including a step of controlling display of at least a part of at least one of the pre-update processed image and the post-update processed image.
(18)
A surgery system including: a surgical imaging apparatus that acquires a surgical site image; and a surgical image processing apparatus having an image processing unit that performs image processing by software on the surgical site image and a display control unit that controls display of the surgical site image subjected to the image processing, in which the image processing unit generates a pre-update processed image obtained by performing the image processing before a software update on the surgical site image and a post-update processed image obtained by performing the image processing after the software update on the surgical site image, and the display control unit controls display of at least a part of at least one of the pre-update processed image and the post-update processed image.
Claims (18)
- A surgical image processing apparatus including: an image processing unit that performs image processing by software on a surgical site image; and a display control unit that controls display of the surgical site image subjected to the image processing, wherein the image processing unit generates a pre-update processed image obtained by performing the image processing before a software update on the surgical site image and a post-update processed image obtained by performing the image processing after the software update on the surgical site image, and the display control unit controls display of at least a part of at least one of the pre-update processed image and the post-update processed image.
- The surgical image processing apparatus according to claim 1, wherein the image processing unit performs the image processing before the software update and the image processing after the software update in parallel.
- The surgical image processing apparatus according to claim 1, wherein the display control unit displays the pre-update processed image and the post-update processed image simultaneously on one screen.
- The surgical image processing apparatus according to claim 3, wherein the display control unit adjusts the display sizes of the pre-update processed image and the post-update processed image on the screen in accordance with a user operation.
- The surgical image processing apparatus according to claim 1, wherein the display control unit displays the pre-update processed image and the post-update processed image on different display devices.
- The surgical image processing apparatus according to claim 1, wherein the display control unit displays whichever of the pre-update processed image and the post-update processed image is selected by a user.
- The surgical image processing apparatus according to claim 1, wherein the display control unit adds the pre-update processed image and the post-update processed image pixel by pixel at a predetermined ratio and displays the result.
- The surgical image processing apparatus according to claim 1, wherein the display control unit combines and displays a first region of the pre-update processed image and a second region of the post-update processed image other than the first region.
- The surgical image processing apparatus according to claim 8, wherein the first region is one of two regions into which a screen is divided left and right, and the second region is the other of the two regions.
- The surgical image processing apparatus according to claim 8, wherein the first region is a central region including a center of a screen, and the second region is a peripheral region outside the central region on the screen.
- The surgical image processing apparatus according to claim 8, wherein a boundary between the first region and the second region is determined by a user operation.
- The surgical image processing apparatus according to claim 1, further including a difference calculation unit that calculates a difference between the pre-update processed image and the post-update processed image, wherein the display control unit controls display of information notifying a user of that fact when the difference is larger than a predetermined value.
- The surgical image processing apparatus according to claim 1, further including a scene detection unit that detects a predetermined scene in the image, wherein the display control unit displays, for the detected scene, whichever of the pre-update processed image and the post-update processed image is selected by a user.
- The surgical image processing apparatus according to claim 13, further including: a feature amount extraction unit that extracts a feature amount of the detected scene; and a recording control unit that associates the extracted feature amount with selection information indicating which of the pre-update processed image and the post-update processed image was selected by the user for the detected scene, and records them as history information.
- The surgical image processing apparatus according to claim 14, further including a learning unit that learns, based on the history information, for each feature amount of the scene, which of the pre-update processed image and the post-update processed image was selected by the user for the detected scene.
- The surgical image processing apparatus according to claim 15, further including an inquiry unit that queries a learning result corresponding to the feature amount of the predetermined scene detected in another image, wherein the display control unit displays, based on the queried learning result, either the pre-update processed image or the post-update processed image for the predetermined scene in the other image.
- An image processing method in which a surgical image processing apparatus including an image processing unit that performs image processing by software on a surgical site image and a display control unit that controls display of the surgical site image subjected to the image processing generates a pre-update processed image obtained by performing the image processing before a software update on the surgical site image and a post-update processed image obtained by performing the image processing after the software update on the surgical site image, the method including a step of controlling display of at least a part of at least one of the pre-update processed image and the post-update processed image.
- A surgery system including: a surgical imaging apparatus that acquires a surgical site image; and a surgical image processing apparatus having an image processing unit that performs image processing by software on the surgical site image and a display control unit that controls display of the surgical site image subjected to the image processing, wherein the image processing unit generates a pre-update processed image obtained by performing the image processing before a software update on the surgical site image and a post-update processed image obtained by performing the image processing after the software update on the surgical site image, and the display control unit controls display of at least a part of at least one of the pre-update processed image and the post-update processed image.
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201880020194.8A CN110536629B (zh) | 2017-03-31 | 2018-03-16 | 手术图像处理设备、图像处理方法和手术系统 |
JP2019509271A JP7115467B2 (ja) | 2017-03-31 | 2018-03-16 | 手術用画像処理装置、画像処理方法、及び、手術システム |
US16/496,452 US11483473B2 (en) | 2017-03-31 | 2018-03-16 | Surgical image processing apparatus, image processing method, and surgery system |
EP18774672.2A EP3603479B1 (en) | 2017-03-31 | 2018-03-16 | Surgical image processing device, image processing method, and surgery system |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2017-072244 | 2017-03-31 | ||
JP2017072244 | 2017-03-31 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2018180573A1 true WO2018180573A1 (ja) | 2018-10-04 |
Family
ID=63675571
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2018/010391 WO2018180573A1 (ja) | 2017-03-31 | 2018-03-16 | 手術用画像処理装置、画像処理方法、及び、手術システム |
Country Status (5)
Country | Link |
---|---|
US (1) | US11483473B2 (ja) |
EP (1) | EP3603479B1 (ja) |
JP (1) | JP7115467B2 (ja) |
CN (1) | CN110536629B (ja) |
WO (1) | WO2018180573A1 (ja) |
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2020110278A1 (ja) * | 2018-11-30 | 2020-06-04 | オリンパス株式会社 | 情報処理システム、内視鏡システム、学習済みモデル、情報記憶媒体及び情報処理方法 |
JPWO2021241695A1 (ja) * | 2020-05-27 | 2021-12-02 | ||
WO2024111147A1 (ja) * | 2022-11-25 | 2024-05-30 | パナソニックIpマネジメント株式会社 | 画像補正装置 |
JP7545375B2 (ja) | 2021-09-27 | 2024-09-04 | 株式会社日立国際電気 | カメラシステム |
Families Citing this family (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113011418B (zh) * | 2021-02-09 | 2024-02-23 | 杭州海康慧影科技有限公司 | 确定图像中待处理区域的方法、装置、设备 |
CN118678007A (zh) * | 2024-08-19 | 2024-09-20 | 杭州海康威视数字技术股份有限公司 | 一种视频处理设备以及视频信号传输方法 |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2002176582A (ja) * | 2000-12-06 | 2002-06-21 | Fuji Photo Film Co Ltd | 電子機器 |
WO2006093225A1 (ja) * | 2005-03-03 | 2006-09-08 | Nikon Corporation | カメラ付携帯電話および画像機器 |
JP2008186294A (ja) * | 2007-01-30 | 2008-08-14 | Toshiba Corp | ソフトウェア更新装置及びソフトウェア更新システム |
JP2009226169A (ja) | 2008-03-25 | 2009-10-08 | Olympus Medical Systems Corp | 撮像システムおよび撮像システムのメンテナンス方法。 |
JP2013090194A (ja) * | 2011-10-19 | 2013-05-13 | Sony Corp | サーバ装置、画像送信方法、端末装置、画像受信方法、プログラムおよび画像処理システム |
Family Cites Families (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2001034775A (ja) * | 1999-05-17 | 2001-02-09 | Fuji Photo Film Co Ltd | 履歴画像表示方法 |
US7747960B2 (en) * | 2006-09-06 | 2010-06-29 | Stereotaxis, Inc. | Control for, and method of, operating at least two medical systems |
US20130174042A1 (en) | 2011-12-30 | 2013-07-04 | Samsung Electronics Co., Ltd. | Display apparatus, upgrading apparatus, display system and control method thereof |
JP2014138294A (ja) * | 2013-01-17 | 2014-07-28 | Sony Corp | 画像処理装置、および画像処理方法、並びにプログラム |
US9270919B2 (en) * | 2013-09-24 | 2016-02-23 | Karl Storz Imaging, Inc. | Simultaneous display of two or more different sequentially processed images |
WO2015105951A1 (en) | 2014-01-08 | 2015-07-16 | Board Of Regents Of The University Of Texas System | System and method for intraoperative fluorescence imaging in ambient light |
JP6339872B2 (ja) | 2014-06-24 | 2018-06-06 | オリンパス株式会社 | 画像処理装置、内視鏡システム及び画像処理方法 |
EP3265011A1 (en) * | 2015-03-01 | 2018-01-10 | Aris MD, Inc. | Reality-augmented morphological procedure |
JP2017041831A (ja) * | 2015-08-21 | 2017-02-23 | 株式会社リコー | 通信システム、通信管理システム、通信管理方法、及びプログラム |
-
2018
- 2018-03-16 JP JP2019509271A patent/JP7115467B2/ja active Active
- 2018-03-16 US US16/496,452 patent/US11483473B2/en active Active
- 2018-03-16 WO PCT/JP2018/010391 patent/WO2018180573A1/ja active Application Filing
- 2018-03-16 EP EP18774672.2A patent/EP3603479B1/en active Active
- 2018-03-16 CN CN201880020194.8A patent/CN110536629B/zh active Active
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2002176582A (ja) * | 2000-12-06 | 2002-06-21 | Fuji Photo Film Co Ltd | 電子機器 |
WO2006093225A1 (ja) * | 2005-03-03 | 2006-09-08 | Nikon Corporation | カメラ付携帯電話および画像機器 |
JP2008186294A (ja) * | 2007-01-30 | 2008-08-14 | Toshiba Corp | ソフトウェア更新装置及びソフトウェア更新システム |
JP2009226169A (ja) | 2008-03-25 | 2009-10-08 | Olympus Medical Systems Corp | 撮像システムおよび撮像システムのメンテナンス方法。 |
JP2013090194A (ja) * | 2011-10-19 | 2013-05-13 | Sony Corp | サーバ装置、画像送信方法、端末装置、画像受信方法、プログラムおよび画像処理システム |
Non-Patent Citations (1)
Title |
---|
See also references of EP3603479A4 |
Cited By (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2020110278A1 (ja) * | 2018-11-30 | 2020-06-04 | オリンパス株式会社 | 情報処理システム、内視鏡システム、学習済みモデル、情報記憶媒体及び情報処理方法 |
JPWO2020110278A1 (ja) * | 2018-11-30 | 2021-10-28 | オリンパス株式会社 | 情報処理システム、内視鏡システム、学習済みモデル、情報記憶媒体及び情報処理方法 |
JP7127785B2 (ja) | 2018-11-30 | 2022-08-30 | オリンパス株式会社 | 情報処理システム、内視鏡システム、学習済みモデル、情報記憶媒体及び情報処理方法 |
US11907849B2 (en) | 2018-11-30 | 2024-02-20 | Olympus Corporation | Information processing system, endoscope system, information storage medium, and information processing method |
JPWO2021241695A1 (ja) * | 2020-05-27 | 2021-12-02 | ||
JP7463507B2 (ja) | 2020-05-27 | 2024-04-08 | 富士フイルム株式会社 | 内視鏡画像処理装置 |
JP7545375B2 (ja) | 2021-09-27 | 2024-09-04 | 株式会社日立国際電気 | カメラシステム |
WO2024111147A1 (ja) * | 2022-11-25 | 2024-05-30 | パナソニックIpマネジメント株式会社 | 画像補正装置 |
Also Published As
Publication number | Publication date |
---|---|
JPWO2018180573A1 (ja) | 2020-02-06 |
EP3603479A1 (en) | 2020-02-05 |
EP3603479A4 (en) | 2020-03-18 |
EP3603479B1 (en) | 2021-04-28 |
US11483473B2 (en) | 2022-10-25 |
CN110536629B (zh) | 2022-04-15 |
US20210112197A1 (en) | 2021-04-15 |
JP7115467B2 (ja) | 2022-08-09 |
CN110536629A (zh) | 2019-12-03 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP7067467B2 (ja) | 医療用情報処理装置、情報処理方法、医療用情報処理システム | |
WO2018180573A1 (ja) | 手術用画像処理装置、画像処理方法、及び、手術システム | |
WO2018123613A1 (ja) | 医療用画像処理装置、医療用画像処理方法、プログラム | |
US20200113413A1 (en) | Surgical system and surgical imaging device | |
US11883120B2 (en) | Medical observation system, medical signal processing device, and medical signal processing device driving method | |
WO2017145606A1 (ja) | 画像処理装置、画像処理方法及び内視鏡システム | |
JP7092111B2 (ja) | 撮像装置、映像信号処理装置および映像信号処理方法 | |
US10778889B2 (en) | Image pickup apparatus, video signal processing apparatus, and video signal processing method | |
WO2019167555A1 (ja) | 映像信号処理装置、映像信号処理方法および撮像装置 | |
US11778325B2 (en) | Image processing apparatus, image processing method, and image processing program | |
JP7456385B2 (ja) | 画像処理装置、および画像処理方法、並びにプログラム | |
WO2020203164A1 (ja) | 医療システム、情報処理装置及び情報処理方法 | |
US20210228061A1 (en) | Medical observation system, medical observation apparatus, and drive method of medical observation apparatus | |
WO2018043205A1 (ja) | 医療用画像処理装置、医療用画像処理方法、プログラム | |
US11979670B2 (en) | Image processing apparatus, imaging apparatus, image processing method, and program for blending plurality of image signals based on a peaking signal |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 18774672 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 2019509271 Country of ref document: JP Kind code of ref document: A |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2018774672 Country of ref document: EP |
|
ENP | Entry into the national phase |
Ref document number: 2018774672 Country of ref document: EP Effective date: 20191031 |