WO2021044900A1 - Operation system, image processing device, image processing method, and program - Google Patents

Operation system, image processing device, image processing method, and program

Info

Publication number
WO2021044900A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
region
treatment system
unit
area
Application number
PCT/JP2020/031967
Other languages
French (fr)
Japanese (ja)
Inventor
真人 山根
雄生 杉江
Original Assignee
ソニー株式会社 (Sony Corporation)
Application filed by ソニー株式会社 (Sony Corporation)
Publication of WO2021044900A1 publication Critical patent/WO2021044900A1/en

Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 1/00: Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
    • A61B 1/04: Instruments of this kind combined with photographic or television appliances
    • A61B 1/045: Control thereof

Definitions

  • This technology relates to treatment systems, image processing devices, image processing methods, and programs that can be applied to surgery, examinations, etc.
  • the purpose of this technology is to provide a treatment system, an image processing device, an image processing method, and a program that make it possible to sufficiently grasp the target image.
  • the treatment system includes an imaging unit and an image processing unit.
  • the imaging unit can photograph the object to be treated.
  • the image processing unit enlarges a first region in the captured image captured by the imaging unit, and reduces the amount of information in a second region, different from the first region in the captured image, so that a predetermined area within the second region is emphasized.
  • the first area in the captured image in which the treatment target is captured is enlarged.
  • the amount of information in the second region, which is different from the first region in the captured image, is reduced so that the predetermined region in the second region is emphasized. This makes it possible to sufficiently grasp the target image.
  • the image processing apparatus includes an image acquisition unit and an image processing unit.
  • the image acquisition unit acquires an image including the treatment target.
  • the image processing unit enlarges a first region in the image, and reduces the amount of information in a second region, different from the first region in the image, so that a predetermined region in the second region is emphasized.
  • the image processing method is an image processing method executed by a computer system, and includes acquiring an image including a treatment target.
  • in this method, the first region in the image is enlarged, and the amount of information in a second region different from the first region in the image is reduced so that a predetermined region in the second region is emphasized.
  • a program causes a computer system to perform a step of enlarging the first region in the image and reducing the amount of information in the second region different from the first region in the image so that the predetermined region in the second region is emphasized.
  • FIG. 1 is a diagram showing an example of a schematic configuration of an endoscopic surgery system 100 to which the technique according to the present disclosure can be applied.
  • FIG. 1 shows how an operator (doctor) 32 is performing surgery on a patient 34 on a patient bed 33 using the endoscopic surgery system 100.
  • the endoscopic surgery system 100 is composed of an endoscope 1, other surgical tools 9, a support arm device 14 that supports the endoscope 1, and a cart 19 on which various devices for endoscopic surgery are mounted.
  • trocars 13a to 13d are punctured into the abdominal wall.
  • the lens barrel 2 of the endoscope 1 and the other surgical tools 9 are inserted into the body cavity of the patient 34 through the trocars 13a to 13d.
  • a pneumoperitoneum tube 10, an energy treatment tool 11, and forceps 12 are inserted into the body cavity of the patient 34.
  • the energy treatment tool 11 is a treatment tool that cuts and peels tissue, seals a blood vessel, or the like by using a high-frequency current or ultrasonic vibration.
  • the surgical tools 9 shown in the figure are merely an example; as the surgical tools 9, various instruments generally used in endoscopic surgery, such as tweezers and retractors, may be used.
  • the image of the surgical site in the body cavity of the patient 34 taken by the endoscope 1 is displayed on the display device 21.
  • the surgeon 32 uses the energy treatment tool 11 and the forceps 12 to perform a procedure such as excising the affected portion while viewing the image of the surgical portion displayed on the display device 21 in real time.
  • the pneumoperitoneum tube 10, the energy treatment tool 11, and the forceps 12 are supported by the surgeon 32, an assistant, or the like during the operation.
  • the support arm device 14 includes an arm portion 16 extending from the base portion 15.
  • the arm portion 16 is composed of joint portions 17a, 17b, 17c, and links 18a, 18b, and is driven by control from the arm control device 23.
  • the endoscope 1 is supported by the arm portion 16, and its position and posture are controlled. The position of the endoscope 1 can thereby be fixed stably.
  • the endoscope 1 is composed of a lens barrel 2 in which a region having a predetermined length from the tip is inserted into the body cavity of the patient 34, and a camera head 3 connected to the base end of the lens barrel 2.
  • here, the endoscope 1 configured as a so-called rigid scope having a rigid lens barrel 2 is illustrated, but the endoscope 1 may be configured as a so-called flexible scope having a flexible lens barrel 2.
  • the tip of the lens barrel 2 is provided with an opening in which an objective lens is fitted.
  • a light source device 22 is connected to the endoscope 1; the light generated by the light source device 22 is guided to the tip of the lens barrel by a light guide extending inside the lens barrel 2, and is irradiated through the objective lens toward the observation target in the body cavity of the patient 34.
  • the endoscope 1 may be a forward-viewing endoscope, an oblique-viewing endoscope, or a side-viewing endoscope.
  • An optical system and an image sensor are provided inside the camera head 3, and the reflected light (observation light) from the observation target is focused on the image sensor by the optical system.
  • the observation light is photoelectrically converted by the image sensor, and an electric signal corresponding to the observation light, that is, an image signal corresponding to the observation image is generated.
  • the image signal is transmitted as RAW data to the camera control unit (CCU: Camera Control Unit) 20.
  • the camera head 3 is equipped with a function of adjusting the magnification and the focal length by appropriately driving the optical system.
  • the camera head 3 may be provided with a plurality of image pickup elements.
  • a plurality of relay optical systems are provided inside the lens barrel 2 in order to guide the observation light to each of the plurality of image pickup elements.
  • the CCU 20 is composed of a CPU (Central Processing Unit), a GPU (Graphics Processing Unit), and the like, and controls the operations of the endoscope 1 and the display device 21 in an integrated manner. Specifically, the CCU 20 performs various image processing for displaying an image based on the image signal, such as development processing (demosaic processing), on the image signal received from the camera head 3. The CCU 20 provides the display device 21 with the image signal that has undergone the image processing. Further, the CCU 20 transmits a control signal to the camera head 3 and controls the driving thereof.
  • the control signal may include information about imaging conditions such as magnification and focal length.
  • the display device 21 displays an image based on the image signal processed by the CCU 20 under the control of the CCU 20.
  • the endoscope 1 is compatible with high-resolution shooting such as 4K (3840 horizontal pixels x 2160 vertical pixels) or 8K (7680 horizontal pixels x 4320 vertical pixels), and / or 3D display.
  • a display device 21 capable of displaying a high resolution and / or a device capable of displaying in 3D can be used.
  • when high-resolution shooting such as 4K or 8K is supported, a more immersive feeling can be obtained by using a display device 21 having a size of 55 inches or more.
  • a plurality of display devices 21 having different resolutions and sizes may be provided depending on the application.
  • the light source device 22 is composed of a light source such as an LED (Light Emitting Diode), for example, and supplies irradiation light for photographing the surgical site to the endoscope 1.
  • the arm control device 23 is composed of a processor such as a CPU, and operates according to a predetermined program to control the drive of the arm portion 16 of the support arm device 14 according to a predetermined control method.
  • the input device 24 is an input interface for the endoscopic surgery system 100.
  • the user can input various information and input instructions to the endoscopic surgery system 100 via the input device 24.
  • the user inputs various information related to the surgery, such as physical information of the patient and information about the surgical procedure, via the input device 24.
  • the user inputs, via the input device 24, an instruction to drive the arm unit 16, an instruction to change the imaging conditions (type of irradiation light, magnification, focal length, etc.) of the endoscope 1, an instruction to drive the energy treatment tool 11, and the like.
  • the type of the input device 24 is not limited, and the input device 24 may be various known input devices.
  • the input device 24 for example, a mouse, a keyboard, a touch panel, a switch, a foot switch 27 and / or a lever and the like can be applied.
  • the touch panel may be provided on the display surface of the display device 21.
  • the input device 24 may be a device worn by the user, such as a glasses-type wearable device or an HMD (Head Mounted Display); in that case, various inputs are made according to the user's gestures and line of sight detected by these devices.
  • the input device 24 includes a camera capable of detecting the movement of the user, and various inputs are performed according to the gesture and the line of sight of the user detected from the image captured by the camera.
  • the input device 24 includes a microphone capable of picking up the user's voice, and various inputs are performed by voice through the microphone.
  • since the input device 24 is configured to be able to accept various kinds of information in a non-contact manner, a user belonging to a clean area (for example, the operator 32) can operate devices belonging to an unclean area without contact.
  • the user can also operate the devices without taking their hands off the surgical tools they are holding, which improves user convenience.
  • the treatment tool control device 25 controls the drive of the energy treatment tool 11 for cauterizing, incising, sealing a blood vessel, or the like of a tissue.
  • the pneumoperitoneum device 26 sends gas into the body cavity through the pneumoperitoneum tube 10 in order to inflate the body cavity of the patient 34 for the purpose of securing the field of view of the endoscope 1 and securing the working space of the operator.
  • the recorder 35 is a device capable of recording various information related to surgery.
  • the printer 36 is a device capable of printing various information related to surgery in various formats such as text, images, and graphs.
  • the support arm device 14 includes a base portion 15 as a base and an arm portion 16 extending from the base portion 15.
  • the arm portion 16 is composed of a plurality of joint portions 17a, 17b, 17c and a plurality of links 18a, 18b connected by the joint portions 17b.
  • in FIG. 1, the configuration of the arm portion 16 is shown in a simplified manner. Actually, the shapes, numbers, and arrangements of the joint portions 17a to 17c and the links 18a and 18b, the directions of the rotation axes of the joint portions 17a to 17c, and the like may be appropriately set so that the arm portion 16 has the desired degrees of freedom.
  • the arm portion 16 can be preferably configured to have at least 6 degrees of freedom.
  • the endoscope 1 can be moved freely within the movable range of the arm portion 16, so that the lens barrel 2 of the endoscope 1 can be inserted into the body cavity of the patient 34 from a desired direction.
  • Actuators are provided in the joint portions 17a to 17c, and the joint portions 17a to 17c are configured to be rotatable around a predetermined rotation axis by driving the actuator.
  • by controlling the drive of the actuators with the arm control device 23, the rotation angles of the joint portions 17a to 17c are controlled and the drive of the arm portion 16 is controlled. Control of the position and posture of the endoscope 1 can thereby be realized.
  • the arm control device 23 can control the drive of the arm unit 16 by various known control methods such as force control or position control.
  • the arm control device 23 appropriately controls the drive of the arm portion 16 in response to the operation input.
  • the position and orientation of the endoscope 1 may be controlled.
  • the endoscope 1 at the tip of the arm portion 16 can be moved from an arbitrary position to an arbitrary position, and then fixedly supported at the moved position.
  • the arm portion 16 may be operated by a so-called master slave method. In this case, the arm portion 16 can be remotely controlled by the user via an input device 24 installed at a location away from the operating room.
  • when force control is applied, the arm control device 23 may perform so-called power assist control, in which it receives an external force from the user and drives the actuators of the joint portions 17a to 17c so that the arm portion 16 moves smoothly in response to that force. As a result, when the user moves the arm portion 16 while touching it directly, the arm portion 16 can be moved with a relatively light force. The endoscope 1 can therefore be moved more intuitively with a simpler operation, and user convenience is improved.
  • generally, in endoscopic surgery, the endoscope 1 has been supported by a doctor called a scopist.
  • by using the support arm device 14, the position of the endoscope 1 can be fixed more reliably without manpower, so that an image of the surgical site can be obtained stably and the surgery can be performed smoothly.
  • the arm control device 23 does not necessarily have to be provided in the cart 19. Further, the arm control device 23 does not necessarily have to be one device.
  • the arm control device 23 may be provided in each of the joint portions 17a to 17c of the arm portion 16 of the support arm device 14, and the arm portion 16 is driven by the plurality of arm control devices 23 cooperating with each other. Control may be realized.
  • the light source device 22 supplies the endoscope 1 with irradiation light for photographing the surgical site.
  • the light source device 22 is composed of, for example, an LED, a laser light source, or a white light source composed of a combination thereof.
  • when a white light source is configured by combining RGB laser light sources, the output intensity and output timing of each color (each wavelength) can be controlled with high accuracy, so the white balance of the captured image can be adjusted in the light source device 22.
  • it is also possible to irradiate the observation target with laser light from each of the RGB laser light sources in a time-division manner and to control the drive of the image sensor of the camera head 3 in synchronization with the irradiation timing, thereby capturing images corresponding to each of R, G, and B in a time-division manner. According to this method, a color image can be obtained without providing a color filter on the image sensor.
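The frame-sequential color capture described above can be illustrated with a minimal NumPy sketch: three monochrome frames, taken under sequential R, G, and B illumination by a sensor without a color filter, are stacked into one color image. The capture order is an assumption, not something the text fixes.

```python
import numpy as np

def merge_frame_sequential_rgb(frame_r, frame_g, frame_b):
    """Stack three monochrome frames, captured under sequential R, G
    and B laser illumination, into one color image (assumed R->G->B
    capture order)."""
    return np.stack([frame_r, frame_g, frame_b], axis=-1)  # H x W x RGB
```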
  • the drive of the light source device 22 may be controlled so as to change the intensity of the output light at predetermined time intervals.
  • by controlling the drive of the image sensor of the camera head 3 in synchronization with the timing of changes in the light intensity, acquiring images in a time-division manner, and synthesizing them, a so-called high-dynamic-range image free of blocked-up shadows and blown-out highlights can be generated.
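A minimal sketch of such a merge, assuming a simple rule the patent does not specify: blown-out pixels in the bright frame are replaced with gain-matched pixels from the dark frame.

```python
import numpy as np

def merge_hdr_pair(dark, bright, sat_level=240):
    """Naive HDR merge of two frames captured while the light-source
    intensity alternates. The merge rule and saturation level are
    illustrative assumptions."""
    dark_f = dark.astype(np.float32)
    bright_f = bright.astype(np.float32)
    saturated = bright_f >= sat_level          # blown-out highlights
    ok = ~saturated
    # Gain that maps dark-frame brightness onto the bright frame.
    gain = (bright_f[ok].mean() / max(dark_f[ok].mean(), 1e-6)) if ok.any() else 1.0
    merged = np.where(saturated, dark_f * gain, bright_f)
    return np.clip(merged, 0, 255).astype(np.uint8)
```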
  • the light source device 22 may be configured to be able to supply light in a predetermined wavelength band corresponding to special light observation.
  • in special light observation, for example, so-called narrow band imaging is performed, in which light in a band narrower than the irradiation light for normal observation (that is, white light) is emitted by utilizing the wavelength dependence of light absorption in body tissue, and a predetermined tissue such as a blood vessel in the surface layer of the mucous membrane is photographed with high contrast.
  • fluorescence observation may be performed in which an image is obtained by fluorescence generated by irradiating with excitation light.
  • in fluorescence observation, the body tissue is irradiated with excitation light and the fluorescence from the body tissue is observed (autofluorescence observation), or a reagent such as indocyanine green (ICG) is locally injected into the body tissue and the body tissue is irradiated with excitation light corresponding to the fluorescence wavelength of the reagent to obtain a fluorescence image.
  • the light source device 22 may be configured to be capable of supplying narrow band light and / or excitation light corresponding to such special light observation.
  • FIG. 2 is a block diagram showing an example of the functional configuration of the camera head 3 and the CCU 20 shown in FIG.
  • the camera head 3 has a lens unit 4, an imaging unit 5, a driving unit 6, a communication unit 7, and a camera head control unit 8 as its functions.
  • the CCU 20 has a communication unit 28, an image processing unit 29, and a control unit 30 as its functions.
  • the camera head 3 and the CCU 20 are connected by a transmission cable 31 so as to be able to communicate in both directions.
  • the lens unit 4 is an optical system provided at a connection portion with the lens barrel 2.
  • the observation light taken in from the tip of the lens barrel 2 is guided to the camera head 3 and incident on the lens unit 4.
  • the lens unit 4 is configured by combining a plurality of lenses including a zoom lens and a focus lens.
  • the optical characteristics of the lens unit 4 are adjusted so as to collect the observation light on the light receiving surface of the image sensor of the image pickup unit 5.
  • the zoom lens and the focus lens are configured so that their positions on the optical axis can be moved in order to adjust the magnification and the focus of the captured image.
  • the image pickup unit 5 is composed of an image pickup element and is arranged after the lens unit 4.
  • the observation light that has passed through the lens unit 4 is focused on the light receiving surface of the image pickup device, and an image signal corresponding to the observation image is generated by photoelectric conversion.
  • the image signal generated by the imaging unit 5 is provided to the communication unit 7.
  • the image pickup element of the image pickup unit 5 is, for example, a CMOS (Complementary Metal Oxide Semiconductor) image sensor. As the image pickup element, for example, one capable of capturing a high-resolution image of 4K or higher may be used.
  • when 3D display is supported, the image pickup unit 5 is configured to have a pair of image pickup elements for acquiring image signals for the right eye and the left eye, respectively.
  • the 3D display enables the operator 32 to more accurately grasp the depth of the biological tissue in the surgical site.
  • when the image pickup unit 5 is configured as a multi-plate type, a plurality of lens units 4 are provided corresponding to the respective image pickup elements.
  • the imaging unit 5 does not necessarily have to be provided on the camera head 3.
  • the imaging unit 5 may be provided inside the lens barrel 2 immediately after the objective lens.
  • the drive unit 6 is composed of an actuator, and the zoom lens and focus lens of the lens unit 4 are moved by a predetermined distance along the optical axis under the control of the camera head control unit 8. As a result, the magnification and focus of the image captured by the imaging unit 5 can be adjusted as appropriate.
  • the communication unit 7 is composed of a communication device for transmitting and receiving various information to and from the CCU 20.
  • the communication unit 7 transmits the image signal obtained from the image pickup unit 5 as RAW data to the CCU 20 via the transmission cable 31.
  • the image signal is transmitted by optical communication.
  • this is because the surgeon 32 performs the surgery while observing the condition of the affected area through the captured image, so that, for safer and more reliable surgery, the moving image of the surgical site is required to be displayed in real time as much as possible.
  • the communication unit 7 is provided with a photoelectric conversion module that converts an electric signal into an optical signal.
  • the image signal is converted into an optical signal by the photoelectric conversion module, and then transmitted to the CCU 20 via the transmission cable 31.
  • the communication unit 7 receives a control signal for controlling the drive of the camera head 3 from the CCU 20.
  • the control signal includes information about the imaging conditions, such as information specifying the frame rate of the captured image, information specifying the exposure value at the time of imaging, and/or information specifying the magnification and focus of the captured image.
  • the communication unit 7 provides the received control signal to the camera head control unit 8.
  • the control signal from the CCU 20 may also be transmitted by optical communication.
  • the communication unit 7 is provided with a photoelectric conversion module that converts an optical signal into an electric signal, and the control signal is converted into an electric signal by the photoelectric conversion module and then provided to the camera head control unit 8.
  • the imaging conditions such as the frame rate, exposure value, magnification, and focus are automatically set by the control unit 30 of the CCU 20 based on the acquired image signal. That is, the so-called AE (Auto Exposure) function, AF (Auto Focus) function, and AWB (Auto White Balance) function are mounted on the endoscope 1.
  • the camera head control unit 8 controls the drive of the camera head 3 based on the control signal from the CCU 20 received via the communication unit 7. For example, the camera head control unit 8 controls the drive of the image pickup device of the image pickup unit 5 based on the information to specify the frame rate of the captured image and / or the information to specify the exposure at the time of imaging. Further, for example, the camera head control unit 8 appropriately moves the zoom lens and the focus lens of the lens unit 4 via the drive unit 6 based on the information that the magnification and the focus of the captured image are specified.
  • the camera head control unit 8 may further have a function of storing information for identifying the lens barrel 2 and the camera head 3.
  • the camera head 3 can be made resistant to autoclave sterilization.
  • the communication unit 28 is composed of a communication device for transmitting and receiving various types of information to and from the camera head 3.
  • the communication unit 28 receives an image signal transmitted from the camera head 3 via the transmission cable 31.
  • the image signal can be suitably transmitted by optical communication.
  • the communication unit 28 is provided with a photoelectric conversion module that converts an optical signal into an electric signal.
  • the communication unit 28 provides the image processing unit 29 with an image signal converted into an electric signal.
  • the communication unit 28 transmits a control signal for controlling the drive of the camera head 3 to the camera head 3.
  • the control signal may also be transmitted by optical communication.
  • the image processing unit 29 performs various image processing on the image signal which is the RAW data transmitted from the camera head 3.
  • the image processing includes various known signal processing such as development processing, high image quality processing (band enhancement processing, super-resolution processing, NR (noise reduction) processing, and/or camera shake correction processing), and/or enlargement processing (electronic zoom processing).
  • the image processing unit 29 performs detection processing on the image signal for performing AE, AF, and AWB.
  • the image processing unit 29 is composed of a processor such as a CPU or GPU, and when the processor operates according to a predetermined program, the above-mentioned image processing and detection processing can be performed.
  • when the image processing unit 29 is composed of a plurality of GPUs, it appropriately divides the information related to the image signal and performs image processing in parallel on the plurality of GPUs.
  • the control unit 30 performs various controls related to the imaging of the surgical site by the endoscope 1 and the display of the captured image. For example, the control unit 30 generates a control signal for controlling the drive of the camera head 3. At this time, if imaging conditions have been input by the user, the control unit 30 generates the control signal based on that input. Alternatively, when the endoscope 1 is equipped with the AE function, the AF function, and the AWB function, the control unit 30 appropriately calculates the optimum exposure value, focal length, and white balance according to the result of the detection processing by the image processing unit 29 and generates the control signal.
  • control unit 30 causes the display device 21 to display an image of the surgical unit based on the image signal that has been image-processed by the image processing unit 29.
  • the control unit 30 recognizes various objects in the surgical site image by using various image recognition techniques.
  • the control unit 30 can recognize a surgical tool such as forceps, a specific biological site, bleeding, mist generated when using the energy treatment tool 11, and the like by detecting the shape, color, and the like of the edges of objects included in the surgical site image.
  • the control unit 30 uses the recognition result to superimpose and display various surgical support information on the image of the surgical site. By superimposing the surgery support information and presenting it to the surgeon 32, it becomes possible to proceed with the surgery more safely and surely.
  • the transmission cable 31 that connects the camera head 3 and the CCU 20 is an electric signal cable that supports electric signal communication, an optical fiber that supports optical communication, or a composite cable thereof.
  • the communication is performed by wire using the transmission cable 31, but the communication between the camera head 3 and the CCU 20 may be performed wirelessly.
  • when the communication between the two is performed wirelessly, it is not necessary to lay the transmission cable 31 in the operating room, so the situation in which the movement of the medical staff in the operating room is hindered by the transmission cable 31 can be eliminated.
  • the endoscopic surgery system 100 corresponds to an embodiment of the treatment system according to the present technology.
  • the camera head 3 corresponds to an imaging unit capable of imaging an object to be treated, and is configured as a part of the endoscope 1.
  • the image processing unit 29 corresponds to an embodiment of the image processing unit according to the present technology. Further, the CCU 20 having the image processing unit 29 corresponds to one embodiment of the image processing apparatus according to the present technology.
  • the image processing unit 29 has hardware necessary for configuring a computer, such as a processor such as a CPU or GPU and a memory such as ROM or RAM.
  • the image processing method according to the present technology is executed when the CPU or the like loads and executes the control program (program according to the present technology) recorded in the ROM or the like into the RAM.
  • the specific configuration of the image processing unit 29 is not limited, and any hardware such as FPGA (Field Programmable Gate Array) and ASIC (Application Specific Integrated Circuit) may be used. It is also possible to realize the image processing unit 29 as a software block by executing a predetermined program by the CPU or the like of the CCU 20.
  • FIG. 3 is a block diagram showing a configuration example of the image processing unit 29.
  • the CPU or the like of the image processing unit 29 executes a predetermined program, whereby an image acquisition unit 40, an attention area setting unit 41, an enlargement processing unit 42, an emphasis area setting unit 43, an emphasis processing unit 44, an information amount control unit 45, and an image composition unit 46 are realized as functional blocks.
  • dedicated hardware such as an IC (integrated circuit) may be used.
  • the program is installed in the image processing unit 29 via, for example, various recording media. Alternatively, the program may be installed via the Internet or the like.
  • the type of recording medium on which the program is recorded is not limited, and any computer-readable recording medium may be used. For example, any recording medium for recording data non-temporarily may be used.
  • the image acquisition unit 40 acquires a captured image of the treatment target captured by the endoscope 1.
  • the procedure includes various medical actions such as examination, surgery, and diagnosis.
  • medical practice includes, for example, acts that cause, or may cause, harm to the human body unless performed with the medical judgment and skill of a doctor.
  • the treatment includes measurement of body temperature and blood pressure; treatments that do not require specialized judgment and skill, such as for minor cuts and burns; replacement of gauze soiled with filth; adjustment of dosage; and attachment of treatment equipment such as casts.
  • it also includes various medical activities such as blood collection for transfusion, adjustment of oral tracheal tubes, and management and operation of pacemakers.
  • the treatment target includes various living organisms such as a human body, pet animals such as dogs and cats, and livestock such as cows and pigs. In addition, a part of the living body such as the arm and internal organs is also included in the treatment target.
  • the image also includes a still image and a moving image. Of course, a plurality of frame images included in the moving image are also included in the image.
  • the acquisition of an image includes the acquisition of an image signal including image information. Further, the data format of the image information is not limited and may be set arbitrarily.
  • the image acquisition unit 40 acquires an image taken in the body cavity of the human body to be operated on by endoscopy. The acquired captured image is output to the attention area setting unit 41.
  • the attention area setting unit 41 sets the attention area with respect to the captured image.
  • the area in the image is defined by a pixel area composed of one or more pixels.
  • the area of interest is an area of interest for the operator 32 during the procedure. For example, an area including a lesion of the human body to be treated is included. Alternatively, it includes an area where an incision or the like is performed by a surgical tool such as a scalpel.
  • any region that the operator 32 wants to pay attention to may be set as the region of interest.
  • the region of interest may be set by, for example, the operator 32.
  • a region of interest may be automatically set for the input captured image.
  • the method of setting the region of interest is not limited, and any algorithm may be used.
  • the attention area setting unit 41 also sets, in the area other than the attention area, an area in which an emphasis area (described later) is to be set.
  • in the present embodiment, the entire region around the region of interest in the captured image is set as the region in which the emphasis region is to be set (hereinafter referred to as the peripheral region).
  • the present technology is not limited to this, and a part of the captured image other than the region of interest may be set as the region in which the emphasis region is to be set.
  • the enlargement processing unit 42 can enlarge all or a part of the captured image.
  • the enlargement processing unit 42 expands the area of interest set by the area of interest setting unit 41.
  • the emphasis area setting unit 43 sets an emphasis area with respect to the peripheral area.
  • the emphasized area is an area to be emphasized within the peripheral area. For example, an area corresponding to a portion that requires attention when performing a treatment is set as an emphasized area, such as a region corresponding to an instrument used in the treatment, a bleeding site, or a site where damage should be avoided. Regions corresponding to all of these may be set as emphasized regions, or a region corresponding to at least one of them may be set as an emphasized region. In addition, a region corresponding to any part that requires attention when performing the treatment may be set as an emphasized area.
  • the instruments used for the treatment are surgical tools such as scalpels, tweezers, and forceps.
  • the present invention is not limited to this, and various surgical tools used in general treatment may be used.
  • the bleeding site is a site where bleeding is occurring from the treatment target.
  • the bleeding site includes an injured site where bleeding is occurring and blood pooled around the injured site.
  • the site to avoid damage is an important organ of the living body such as an artery.
  • organs such as the retina, whose damage may have a great influence on the living body, are also included.
  • a portion corresponding to a medical product such as gauze may be set as an emphasis area.
  • the region of interest corresponds to the first region in the captured image.
  • the peripheral region corresponds to a second region different from the first region.
  • the emphasized region corresponds to a predetermined region in the second region.
  • the method of setting the emphasized area is not limited, and any technique such as image recognition processing may be used.
  • any image recognition method such as edge detection or pattern matching may be used, and the algorithm is not particularly limited.
  • an arbitrary machine learning algorithm using DNN (Deep Neural Network) or the like may be used.
  • for example, AI (artificial intelligence) that performs machine learning may be used to set the emphasized area.
  • a learning unit and an identification unit are provided for setting the emphasized area.
  • the learning unit and the recognition unit may be constructed in, for example, the emphasis area setting unit, or may be constructed in another device capable of communicating with the CCU 20.
  • the learning unit performs machine learning based on the input information (learning data) and outputs the learning result.
  • the identification unit identifies (determines, predicts, etc.) the input information based on the input information and the learning result.
  • a neural network or deep learning is used as a learning method in the learning unit.
  • a neural network is a model that imitates a human brain neural circuit, and is composed of three types of layers: an input layer, an intermediate layer (hidden layer), and an output layer.
  • Deep learning is a model that uses a multi-layered neural network, and it is possible to learn complex patterns hidden in a large amount of data by repeating characteristic learning in each layer. Deep learning is used, for example, to identify objects in images and words in sounds. Of course, it can also be applied to the setting of the emphasized area.
  • a neurochip / neuromorphic chip incorporating the concept of a neural network can be used.
  • Machine learning problem settings include supervised learning, unsupervised learning, semi-supervised learning, reinforcement learning, reverse reinforcement learning, active learning, and transfer learning.
  • in supervised learning, features are learned based on given labeled learning data (teacher data). This makes it possible to derive labels for unknown data.
  • in unsupervised learning, a large amount of unlabeled learning data is analyzed to extract feature amounts, and clustering is performed based on the extracted features. This makes it possible to analyze trends and predict the future from a huge amount of unknown data.
  • semi-supervised learning is a mixture of supervised learning and unsupervised learning: after features are learned by supervised learning, a huge amount of training data is given by unsupervised learning, and learning is repeated while feature amounts are calculated automatically.
  • reinforcement learning deals with the problem of observing the current state of an agent in an environment and deciding what action to take. The agent selects actions, obtains rewards from the environment, and learns the course of action that maximizes the reward through a series of actions. By learning the optimum solution in a certain environment in this way, it is possible to reproduce human judgment and to make a computer acquire judgment that exceeds it. It is also possible to generate virtual sensing data by machine learning: for example, predicting one kind of sensing data from another and using it as input information, such as generating position information from input image information; generating different sensing data from a plurality of sensing data; or predicting required information and generating predetermined information from sensing data.
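As noted above, the text leaves the emphasis-area recognition algorithm open (edge detection, pattern matching, DNNs, and so on). As a minimal classical stand-in, the sketch below segments a bleeding-like area by a red-hue threshold in HSV space; the threshold values are assumptions, not values from the text.

```python
import cv2

def detect_bleeding_emphasis_mask(image_bgr):
    """Illustrative stand-in for the emphasis-area recognition step:
    segment red (bleeding-like) pixels by HSV thresholding. Thresholds
    are assumed, not specified by the text."""
    hsv = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2HSV)
    # Red wraps around hue 0 in OpenCV's 0-179 hue range.
    low_reds = cv2.inRange(hsv, (0, 120, 70), (10, 255, 255))
    high_reds = cv2.inRange(hsv, (170, 120, 70), (180, 255, 255))
    return cv2.bitwise_or(low_reds, high_reds)  # 255 = emphasis candidate
```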
  • the emphasized area setting unit 43 corresponds to an area setting unit that sets a predetermined area with respect to the second area.
  • the setting of each area executed by the attention area setting unit 41 and the emphasis area setting unit 43 may function in one block.
  • an area setting unit having the functions of an attention area setting unit 41 and an emphasis area setting unit 43 may be configured, and an attention area, a peripheral area, and an emphasis area may be set by the area setting unit.
  • the enhancement processing unit 44 can perform enhancement processing on all or a part of the captured image.
  • the emphasis processing unit 44 performs the emphasis processing on the emphasis region.
  • the emphasis process is a process of emphasizing the emphasized area in the peripheral area.
  • the enhancement process for the enhancement region includes at least one of enlargement of the image, colorization of the image, increase of the gradation value of the image, or conversion of the display format of the image.
  • image colorization is a process of converting a grayscale image into a color image. For example, when the peripheral area including the emphasized area is converted into grayscale, colorization is a process of rendering the emphasized area in color.
  • the colorization of an image includes at least one of a process of filling the emphasized area with a predetermined color or a process of coloring the boundary of the emphasized area with a predetermined color.
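A sketch of the two colorization variants just described: with the peripheral region converted to grayscale, the emphasized area is either filled with a predetermined color or its boundary is traced in that color. The color and line width are assumptions.

```python
import cv2

def colorize_emphasis(periphery_bgr, mask, color=(0, 0, 255), fill=True):
    """Grayscale the peripheral region and keep only the emphasized
    area in color, either filled or outlined."""
    gray = cv2.cvtColor(periphery_bgr, cv2.COLOR_BGR2GRAY)
    out = cv2.cvtColor(gray, cv2.COLOR_GRAY2BGR)       # 3-channel gray
    if fill:
        out[mask > 0] = color                          # fill variant
    else:
        contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                       cv2.CHAIN_APPROX_SIMPLE)
        cv2.drawContours(out, contours, -1, color, 2)  # boundary variant
    return out
```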
  • Increasing the gradation value of an image is a process of increasing the gradation of an image.
  • increasing the gradation value of an image is a process of changing the image corresponding to the emphasized region to 256 gradations when the image corresponding to the peripheral region has 128 gradations.
  • the gradation value of the emphasized region is equal to or less than the gradation value of the region of interest.
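The 128-versus-256 gradation example above amounts to a re-quantization of the pixel values; a minimal sketch:

```python
import numpy as np

def set_gradations(image, levels):
    """Re-quantize an 8-bit image to the given number of gray levels,
    e.g. levels=128 for the peripheral region while the emphasized and
    attention areas keep 256 levels (the values used in the example
    above)."""
    step = 256 // levels                      # e.g. 2 for 128 levels
    return ((image.astype(np.uint16) // step) * step).astype(np.uint8)
```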
  • the conversion of the display format of an image is a process of outputting the image in a different display format. For example, when the peripheral area is converted into a painting-like (animation-like) style, the emphasized area is output in the same display format as when it was captured, like the attention area. Of course, it is not limited to this.
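One plausible realization of the painting-like conversion mentioned above; the text names no specific filter, so OpenCV's stylization filter and its parameters are assumptions.

```python
import cv2

def painting_style(periphery_bgr):
    """Display-format conversion of the peripheral region into a
    painting-like style (illustrative choice of filter and parameters)."""
    return cv2.stylization(periphery_bgr, sigma_s=60, sigma_r=0.45)
```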
  • the information amount control unit 45 controls the amount of information of the captured image.
  • the amount of information is typically defined by the amount of image data of the captured image.
  • the amount of information can be specified by the number of bits of the image, the number of pixels of the image, and the like.
  • the amount of information may be defined based on the gradation value and the like.
  • the reduction in the amount of information includes arbitrary processing for reducing the amount of image data of the captured image. For example, as processing for reducing the amount of information in the peripheral area, reduction of the image, grayscaling of the image, reduction of the gradation value of the image, conversion of the display format of the image, cropping of the image, and the like are executed. Cropping of an image is a process of hiding a part of the image; for example, a 4K image is cut out from an 8K captured image, or a part of the peripheral region is hidden by being superimposed with another region.
  • when the captured image is a three-dimensional image, converting the peripheral region from the three-dimensional image to a two-dimensional image is also included in the processing for reducing the amount of information in the peripheral region.
  • the emphasis processing unit 44 emphasizes the emphasized area.
  • the information amount control unit 45 reduces the amount of information in the peripheral region. Therefore, the amount of information in the peripheral area is reduced so that the emphasized area is emphasized.
  • any process that can reduce the amount of information in the peripheral area so that the emphasized area is emphasized may be executed.
  • the image synthesizing unit 46 electronically synthesizes the captured image.
  • the area of interest enlarged by the enlargement processing unit 42, the emphasized area emphasized by the emphasis processing unit 44, and the peripheral area whose amount of information has been reduced by the information amount control unit 45 are combined by the image composition unit 46.
  • the combined composite image is output to the display device 21 via the communication unit 28.
  • the surgeon 32 can perform the treatment while checking the composite image displayed on the display device 21.
  • the information amount control unit 45 may reduce the amount of information only in the region of the peripheral area that differs from the emphasized area. In this case, as a result, the emphasized area is emphasized within the peripheral area, so this processing is included in the emphasis processing for the emphasized area. Further, when the amount of information of only the area different from the emphasized area is reduced, the amount of information of the peripheral area as a whole is reduced; this processing is therefore also included in the processing of reducing the amount of information of the peripheral area so that the emphasized area is emphasized.
  • FIG. 4 is a flowchart showing an example of setting and processing of each area.
  • FIG. 5 is a schematic view showing an example of setting the area of the captured image to be treated.
  • the captured image 50 of the patient 34 captured by the endoscope 1 is input (step 101). For example, as shown in FIG. 5, a photographed image 50 in which the surgical portion in the body cavity of the patient 34 is photographed is displayed on the display device 21. As the captured image 50, the display device 21 displays the inside of the body cavity of the patient 34 including the lesion portion photographed by the endoscope 1, the operator 32's hand, the surgical tool 51, the bleeding site 52, and the like. In addition, various information related to surgery such as physical information of the patient 34 input via the input device 24 and information about the surgical procedure may be displayed on the captured image 50.
  • the attention area setting unit 41 sets the attention area 53 and the peripheral area 54 in the input captured image 50 (step 102).
  • the area where the operator 32 is performing the treatment is set as the area of interest 53, and the other area is set as the peripheral area 54. Further, in the present embodiment, the region of interest 53 and the peripheral region 54 are separated and processed separately.
  • FIG. 6 is a schematic view showing an enlarged example of the region of interest 53.
  • the area of interest 53 is expanded by the enlargement processing unit 42 (step 103).
  • enlarging an image means displaying (the content of) the inside of the image in an area larger than the original display area (pixel area).
  • the method of expanding the region of interest 53 is not limited.
  • the entire region of interest 53 may be magnified by a predetermined magnification.
  • alternatively, a different magnification may be assigned to each of a plurality of regions in the region of interest 53, and each region may be enlarged according to the assigned magnification.
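A minimal sketch of the electronic zoom of step 103, assuming a uniform factor (the per-subregion variant above would apply different factors):

```python
import cv2

def enlarge_attention_area(captured, roi, magnification=2.0):
    """Crop the attention area 53 and resample it to `magnification`
    times its pixel size. The uniform factor is an assumption."""
    x, y, w, h = roi                                  # pixel rectangle
    crop = captured[y:y + h, x:x + w]
    return cv2.resize(crop, (int(w * magnification), int(h * magnification)),
                      interpolation=cv2.INTER_LINEAR)
```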
  • the emphasis area setting unit 43 determines whether or not there is an emphasis area in the peripheral area 54 (step 104).
  • the region corresponding to the surgical tool 51 and the bleeding site 52 in the captured image 50 is set as the emphasized region by the emphasized region setting unit 43.
  • FIG. 7 is a schematic view showing an example in which the enhancement process is executed on the enhancement region in the peripheral region 54.
  • the emphasis processing unit 44 performs the emphasis processing on the emphasis regions 55 (56 and 57) (step 105).
  • the emphasis processing unit 44 executes an enhancement process of filling the emphasis area 56 corresponding to the surgical tool 51 and the emphasis area 57 corresponding to the bleeding site 52 with predetermined colors. For example, the emphasis region 56 corresponding to the surgical tool 51 is filled with a color more conspicuous than the inside of the body cavity of the human body.
  • for the emphasis region 57 corresponding to the bleeding site 52, an enhancement process of filling with red, or a color close to red imitating the color of blood, is executed.
  • the method of executing the emphasis processing is not limited.
  • the operator 32 may assign an arbitrary color to each emphasis area 55.
  • the type of the surgical tool 51 or the like may be recognized by image recognition or the like, and the enhancement process assigned to each surgical tool may be executed.
  • different emphasis processing may be executed when the surgical tools 51 are close to each other.
  • the method of setting the emphasized region 57 is not limited.
  • the reliability of the area may be set as an index indicating how much the recognition result of image recognition can be trusted.
  • the corresponding region may be set as the emphasized region depending on whether or not the reliability exceeds a predetermined threshold value.
  • a plurality of threshold values may be set, and the reliability of each emphasized region may be determined stepwise. For example, if the reliability exceeds the highest threshold, the emphasized area may be filled with a dark color. Further, for example, a color luminance level for filling the emphasized area may be assigned for each threshold value.
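The stepwise mapping from recognition reliability to fill luminance suggested above can be sketched as follows; the threshold and luminance values are illustrative assumptions.

```python
def emphasis_fill_luminance(reliability,
                            thresholds=(0.5, 0.75, 0.9),
                            luminances=(80, 160, 255)):
    """Map a recognition-reliability score to a fill-color luminance,
    one level per threshold. Returns None when no threshold is
    exceeded (the region is then not emphasized)."""
    chosen = None
    for t, lum in zip(thresholds, luminances):
        if reliability >= t:
            chosen = lum
    return chosen
```

For instance, `emphasis_fill_luminance(0.8)` would select the middle luminance level, while a score below 0.5 would leave the region unemphasized.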
  • FIG. 8 is a schematic diagram showing an example in which the control of the amount of information is executed for the peripheral region 54.
  • the information amount control unit 45 controls the amount of information with respect to the peripheral area 54 including the emphasized area on which the emphasis processing is executed (step 106).
  • the peripheral area 54 is reduced as a process for reducing the amount of information in the peripheral area 54. Reducing an image means displaying (the content of) an image in an area smaller than the original display area.
  • the method of reducing the peripheral area 54 is not limited. For example, the entire peripheral area 54 may be reduced by a predetermined factor. Alternatively, a different factor may be assigned to each of a plurality of areas in the peripheral area 54, and each area may be reduced according to the assigned factor.
  • grayscale of the peripheral region 54 is executed as a process of reducing the amount of information in the peripheral region 54. That is, the emphasis processing unit 44 and the information amount control unit 45 execute a process of reducing the amount of information in the peripheral area 54 and an emphasis process on the emphasis area 55. As a result, even if the peripheral region 54 is reduced, the emphasized region 55 can be sufficiently grasped.
  • the combination of the process of reducing the amount of information and the emphasis process for the emphasis area 55 may be arbitrarily executed.
  • the peripheral region 54 may be converted into a painting style, and the surgical instrument 51 and the bleeding site 52 may be emphasized with a predetermined color.
  • in FIG. 8, the region of interest 53 is shown by a dotted line. The display is not limited to this; when the emphasis processing and the control of the amount of information are performed on the peripheral region 54, the region of interest 53 need not be shown.
  • FIG. 9 is a schematic diagram showing an example of a composite image.
  • the image synthesizing unit 46 generates a composite image 58 in which the region of interest 53 and the peripheral region 54 are combined (step 107). Further, the generated composite image 58 is output to the display device 21 (step 108). As a result, the surgeon 32 can perform the treatment while checking the composite image 58 displayed on the display device 21 in real time.
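An end-to-end sketch of steps 102 to 107 under simplifying assumptions (centered layout, red emphasis fill, grayscale periphery); the patent fixes none of these choices.

```python
import cv2

def compose_view(captured, roi, emphasis_mask,
                 roi_scale=2.0, periphery_scale=0.5):
    """Sketch of steps 102-107. `captured` is a BGR frame, `roi` is
    (x, y, w, h) for the attention area 53, `emphasis_mask` is a uint8
    mask of the emphasis areas 55. Assumes the enlarged ROI fits the
    display."""
    x, y, w, h = roi
    # Step 103: electronic zoom of the attention area 53.
    zoomed = cv2.resize(captured[y:y + h, x:x + w], None,
                        fx=roi_scale, fy=roi_scale)
    # Steps 105/106: reduce periphery information (grayscale + shrink)
    # while filling the emphasis areas with a conspicuous color.
    gray = cv2.cvtColor(captured, cv2.COLOR_BGR2GRAY)
    periphery = cv2.cvtColor(gray, cv2.COLOR_GRAY2BGR)
    periphery[emphasis_mask > 0] = (0, 0, 255)        # assumed red fill
    small = cv2.resize(periphery, None, fx=periphery_scale,
                       fy=periphery_scale)
    # Step 107: composite 58 -- periphery as backdrop, enlarged ROI on top.
    canvas = cv2.resize(small, (captured.shape[1], captured.shape[0]))
    zh, zw = zoomed.shape[:2]
    cy, cx = (canvas.shape[0] - zh) // 2, (canvas.shape[1] - zw) // 2
    canvas[cy:cy + zh, cx:cx + zw] = zoomed
    return canvas
```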
  • FIGS. 5 to 8 illustrate each process of steps 101 to 106.
  • in the present embodiment, the images including the attention region 53 and the peripheral region 54 shown in FIGS. 5 to 8 are not displayed on the display device 21; only the composite image 58 generated by the image compositing unit 46 is displayed on the display device 21. Not limited to this, an image including the region of interest 53 and the peripheral region 54 at each of these processing steps may be displayed on the display device 21.
  • the region of interest 53 in the captured image 50 in which the patient 34 is captured is enlarged.
  • the amount of information in the peripheral region different from the attention region 53 in the captured image 50 is reduced so that the emphasized region 55 in the peripheral region 54 is emphasized. This makes it possible to fully grasp the target image.
  • the area of interest may be magnified by an electronic zoom.
  • for example, if a small lesion is enlarged by electronic zoom, the field of view becomes narrower than with same-magnification display, and information outside the region of interest cannot be displayed. Therefore, during electronic zooming, there is a problem that instruments cannot be operated outside the region of interest and conditions outside the region of interest, such as bleeding around the lesion, cannot be confirmed.
  • the area of interest is set in the surgical field image, and the area of interest is enlarged and displayed.
  • the peripheral area other than the area of interest is reduced and displayed.
  • emphasis processing is performed on the emphasis area so that the necessary information can be extracted even from the reduced display.
  • even while the area of interest is enlarged and displayed, it is therefore possible to grasp surrounding conditions outside the area of interest, such as the state of the surgical instruments, the state of bleeding, and the condition of the tissue.
  • the information amount control unit 45 reduces the amount of information in the peripheral area 54.
  • the enhancement process may be executed for the emphasis region 55.
  • the reduction of the peripheral area 54 may be executed by an arbitrary method.
  • the peripheral area 54 may be reduced so that the entire peripheral area 54 can be displayed in an area other than the enlarged area of interest 53 in the captured image 50. This makes it possible to display the composite image without any loss.
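One way to realize the lossless layout just described: choose the largest reduction factor that lets the entire peripheral band fit in the display margin left around the enlarged, centered attention area. This assumes the captured image and the display share dimensions and that the ROI is strictly smaller than the display; all of it is an illustrative assumption.

```python
def periphery_scale_for_full_view(disp_w, disp_h, roi_w, roi_h, roi_scale):
    """Largest scale in (0, 1] at which the full peripheral band fits
    the margin around the enlarged, centered attention area."""
    margin_w = (disp_w - roi_w * roi_scale) / 2   # band left and right
    margin_h = (disp_h - roi_h * roi_scale) / 2   # band above and below
    if margin_w <= 0 or margin_h <= 0:
        raise ValueError("enlarged attention area does not fit the display")
    scale_w = margin_w / ((disp_w - roi_w) / 2)   # original band widths
    scale_h = margin_h / ((disp_h - roi_h) / 2)
    return min(1.0, scale_w, scale_h)
```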
  • the emphasis processing unit 44 executed the emphasis processing on all of the emphasis areas. Not limited to this, only surgical tools and bleeding sites that greatly affect the treatment may be emphasized. Further, the surgeon 32 may decide whether or not the emphasis processing is executed.
  • the attention area 53 is set to a rectangular shape by the attention area setting unit 41.
  • the region of interest may be set in any shape.
  • a circular area of interest having a predetermined radius may be set based on the location where the treatment is performed.
  • the region of interest 53 and the peripheral region 54 are separated and processed separately.
  • the process is not limited to this, and the process may be executed in the region of interest and the peripheral region in one captured image.
  • the technology according to this disclosure can be applied to various products.
  • the technique according to the present disclosure may be applied to a microsurgery system used for so-called microsurgery, which is performed while magnifying and observing a minute part of a patient.
  • FIG. 10 is a diagram showing an example of a schematic configuration of a microscopic surgery system 200 to which the technique according to the present disclosure can be applied.
  • the microscope surgery system 200 is composed of a microscope device 201, a control device 217, and a display device 219.
  • the “user” means an operator, an assistant, or any other medical staff who uses the microsurgery system 200.
  • the microscope device 201 includes a microscope unit 203 for magnifying and observing an observation target (patient's surgical unit), an arm unit 209 that supports the microscope unit 203 at the tip, and a base unit 215 that supports the base end of the arm unit 209. Has.
  • the microscope unit 203 is composed of a substantially cylindrical tubular portion 205, an imaging unit (not shown) provided inside the tubular portion 205, and an operation unit 207 provided on a part of the outer periphery of the tubular portion 205.
  • the microscope unit 203 is an electronic imaging type microscope unit (a so-called video type microscope unit) that electronically captures a captured image with the imaging unit.
  • a cover glass is provided on the opening surface at the lower end of the tubular portion 205 to protect the internal imaging portion.
  • the light from the observation target (hereinafter, also referred to as observation light) passes through the cover glass and is incident on the imaging portion inside the tubular portion 205.
  • a light source made of, for example, an LED or the like may be provided inside the tubular portion 205, and light may be emitted from the light source to the observation target through the cover glass at the time of imaging.
  • the image pickup unit is composed of an optical system that collects the observation light and an image pickup element that receives the observation light collected by the optical system.
  • the optical system is composed of a combination of a plurality of lenses including a zoom lens and a focus lens, and its optical characteristics are adjusted so as to form an image of observation light on a light receiving surface of an image pickup device.
  • the image sensor receives the observation light and performs photoelectric conversion to generate a signal corresponding to the observation light, that is, an image signal corresponding to the observation image.
  • as the image pickup element, for example, an element having a Bayer array and capable of color photographing is used.
  • the image sensor may be various known image sensors such as a CMOS image sensor or a CCD (Charge Coupled Device) image sensor.
  • the image signal generated by the image sensor is transmitted to the control device 217 as RAW data.
  • the transmission of this image signal is preferably performed by optical communication.
  • this is because the surgeon performs the surgery while observing the condition of the affected area through the captured image, so for safer and more reliable surgery the moving image of the surgical site should be displayed in real time as far as possible.
  • by transmitting the image signal by optical communication, it is possible to display the captured image with low latency.
  • the imaging unit may have a drive mechanism for moving the zoom lens and the focus lens of the optical system along the optical axis. By appropriately moving the zoom lens and the focus lens by the drive mechanism, the magnification of the captured image and the focal length at the time of imaging can be adjusted. Further, the imaging unit may be equipped with various functions that can be generally provided in an electronic imaging type microscope unit, such as an AE function and an AF function.
  • the image pickup unit may be configured as a so-called single-plate type image pickup unit having one image pickup element, or may be configured as a so-called multi-plate type image pickup unit having a plurality of image pickup elements.
  • each image pickup element may generate an image signal corresponding to each of RGB, and a color image may be obtained by synthesizing them.
  • the image pickup unit may be configured to have a pair of image pickup elements for acquiring image signals for the right eye and the left eye corresponding to stereoscopic vision (3D display), respectively.
  • the 3D display enables the operator to more accurately grasp the depth of the biological tissue in the surgical site.
  • a plurality of optical systems may be provided corresponding to each image pickup element.
  • the operation unit 207 is an input means that is composed of, for example, a cross lever or a switch, and receives a user's operation input.
  • the user can input an instruction to change the magnification of the observation image and the focal length to the observation target via the operation unit 207.
  • the magnification and focal length can be adjusted by appropriately moving the zoom lens and the focus lens by the drive mechanism of the imaging unit according to the instruction.
  • the user can input an instruction to switch the operation mode (all-free mode and fixed mode described later) of the arm unit 209 via the operation unit 207.
  • the operation unit 207 is preferably provided at a position where the user can easily operate it with a finger while holding the tubular portion 205, so that the operation unit 207 can be operated even while the user is moving the tubular portion 205.
  • the arm portion 209 is configured by a plurality of links (first link 213a to sixth link 213f) rotatably connected to each other by a plurality of joint portions (first joint portion 211a to sixth joint portion 211f).
  • the first joint portion 211a has a substantially cylindrical shape, and at its tip (lower end) supports the upper end of the tubular portion 205 of the microscope unit 203 so that it can rotate around a rotation axis (first axis O1) parallel to the central axis of the tubular portion 205.
  • the first joint portion 211a may be configured such that the first axis O1 coincides with the optical axis of the imaging portion of the microscope unit 203. This makes it possible to change the field of view so as to rotate the captured image by rotating the microscope unit 203 around the first axis O1.
  • the first link 213a fixedly supports the first joint portion 211a at the tip.
  • the first link 213a is a rod-shaped member having a substantially L-shape; one side on its tip side extends in a direction orthogonal to the first axis O1, and the end of that side is connected to the first joint portion 211a so as to abut the upper end of the outer periphery of the first joint portion 211a.
  • the second joint portion 211b is connected to the end of the other side of the base end side of the substantially L-shape of the first link 213a.
  • the second joint portion 211b has a substantially cylindrical shape, and at its tip rotatably supports the base end of the first link 213a around a rotation axis (second axis O2) orthogonal to the first axis O1.
  • the tip of the second link 213b is fixedly connected to the base end of the second joint portion 211b.
  • the second link 213b is a rod-shaped member having a substantially L-shape; one side on its tip side extends in a direction orthogonal to the second axis O2, and the end of that side is fixedly connected to the base end of the second joint portion 211b.
  • the third joint portion 211c is connected to the other side of the base end side of the substantially L-shape of the second link 213b.
  • the third joint portion 211c has a substantially cylindrical shape, and at its tip rotatably supports the base end of the second link 213b around a rotation axis (third axis O3) orthogonal to both the first axis O1 and the second axis O2.
  • the tip of the third link 213c is fixedly connected to the base end of the third joint portion 211c.
  • the third link 213c is configured so that its tip side has a substantially cylindrical shape, and the base end of the third joint portion 211c is fixedly connected to the tip of that cylindrical shape so that both have substantially the same central axis.
  • the base end side of the third link 213c has a prismatic shape, and the fourth joint portion 211d is connected to the end portion thereof.
  • the fourth joint portion 211d has a substantially cylindrical shape, and at its tip rotatably supports the base end of the third link 213c around a rotation axis (fourth axis O4) orthogonal to the third axis O3.
  • the tip of the fourth link 213d is fixedly connected to the base end of the fourth joint portion 211d.
  • the fourth link 213d is a rod-shaped member extending substantially linearly; it extends so as to be orthogonal to the fourth axis O4, and is fixedly connected to the fourth joint portion 211d so that the end of its tip abuts the side surface of the substantially cylindrical fourth joint portion 211d.
  • a fifth joint portion 211e is connected to the base end of the fourth link 213d.
  • the fifth joint portion 211e has a substantially cylindrical shape, and on its tip side rotatably supports the base end of the fourth link 213d around a rotation axis (fifth axis O5) parallel to the fourth axis O4.
  • the tip of the fifth link 213e is fixedly connected to the base end of the fifth joint portion 211e.
  • the fourth axis O4 and the fifth axis O5 are rotation axes capable of moving the microscope unit 203 in the vertical direction.
  • the height of the microscope unit 203, that is, the distance between the microscope unit 203 and the observation target, can be adjusted by rotating the configuration on the tip side, including the microscope unit 203, around the fourth axis O4 and the fifth axis O5.
  • the fifth link 213e is configured by combining a first member having a substantially L-shape, one side of which extends in the vertical direction and the other side in the horizontal direction, with a rod-shaped second member that extends vertically downward from the horizontally extending portion of the first member.
  • the base end of the fifth joint portion 211e is fixedly connected to the vicinity of the upper end of the portion extending in the vertical direction of the first member of the fifth link 213e.
  • the sixth joint portion 211f is connected to the base end (lower end) of the second member of the fifth link 213e.
  • the sixth joint portion 211f has a substantially cylindrical shape, and on the tip end side thereof, the base end of the fifth link 213e is rotatably supported around a rotation axis (sixth axis O6) parallel to the vertical direction.
  • the tip of the sixth link 213f is fixedly connected to the base end of the sixth joint portion 211f.
  • the sixth link 213f is a rod-shaped member extending in the vertical direction, and its base end is fixedly connected to the upper surface of the base portion 215.
  • the rotatable range of the first joint portion 211a to the sixth joint portion 211f is appropriately set so that the microscope unit 203 can perform a desired movement.
  • a total of six degrees of freedom, three translational and three rotational, can thus be realized for the movement of the microscope unit 203.
  • the position and posture of the microscope unit 203 can therefore be freely controlled within the movable range of the arm portion 209, making it possible to observe the surgical site from all angles and to perform the surgery more smoothly.
  • the configuration of the arm portion 209 shown in the figure is merely an example; the number and shape (length) of the links constituting the arm portion 209, the number of joints, their arrangement, the directions of the rotation axes, and the like may be appropriately designed so that the desired degrees of freedom can be realized.
  • in order to move the microscope unit 203 freely, the arm portion 209 is preferably configured to have six degrees of freedom, but it may also be configured with more degrees of freedom (that is, redundant degrees of freedom).
  • when redundant degrees of freedom exist, the posture of the arm portion 209 can be changed while the position and posture of the microscope unit 203 remain fixed; more convenient control for the operator can then be realized, for example by controlling the posture of the arm portion 209 so that it does not interfere with the field of view of the operator looking at the display device 219 (a forward-kinematics sketch follows).
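To illustrate how six rotary joints determine the position and posture of the microscope unit, here is a minimal forward-kinematics sketch in Python/NumPy. The joint axes and link offsets are placeholder values, not the actual geometry of the first axis O1 to the sixth axis O6.

```python
import numpy as np

def rot(axis, theta):
    """4x4 homogeneous rotation about a unit axis through the origin."""
    x, y, z = axis
    c, s, t = np.cos(theta), np.sin(theta), 1.0 - np.cos(theta)
    T = np.eye(4)
    T[:3, :3] = [[t*x*x + c,   t*x*y - s*z, t*x*z + s*y],
                 [t*x*y + s*z, t*y*y + c,   t*y*z - s*x],
                 [t*x*z - s*y, t*y*z + s*x, t*z*z + c]]
    return T

def trans(v):
    """4x4 homogeneous translation."""
    T = np.eye(4)
    T[:3, 3] = v
    return T

def microscope_pose(joint_angles, axes, link_offsets):
    """Chain the six joint rotations and link offsets from the base portion
    to obtain the pose of the microscope unit in base coordinates."""
    T = np.eye(4)
    for theta, axis, offset in zip(joint_angles, axes, link_offsets):
        T = T @ rot(axis, theta) @ trans(offset)
    return T

# Example with hypothetical geometry: all axes vertical, 0.2 m links.
pose = microscope_pose([0.0] * 6, [(0, 0, 1)] * 6, [(0, 0, 0.2)] * 6)
```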
  • the first joint portion 211a to the sixth joint portion 211f may be provided with an actuator equipped with a drive mechanism such as a motor and an encoder or the like for detecting the rotation angle at each joint portion.
  • the posture of the arm portion 209, that is, the position and posture of the microscope unit 203, can be controlled by the control device 217 appropriately controlling the drive of each actuator provided in the first joint portion 211a to the sixth joint portion 211f.
  • the control device 217 can grasp the current posture of the arm portion 209 and the current position and posture of the microscope unit 203 based on the rotation angle of each joint portion detected by the encoders.
  • using the grasped information, the control device 217 calculates a control value (for example, a rotation angle or generated torque) for each joint that realizes the movement of the microscope unit 203 in response to an operation input from the user, and drives the drive mechanism of each joint according to that control value.
  • the control method of the arm unit 209 by the control device 217 is not limited, and various known control methods such as force control or position control may be applied.
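As a toy illustration of such position control with encoder feedback, the following sketch computes one proportional torque command per joint; the gain and torque limit are invented values, and a real arm controller (with force control, gravity compensation, and so on) is far more involved.

```python
def update_joint_commands(target_angles, encoder_angles, kp=2.0, max_torque=5.0):
    """One proportional control step per joint: derive a torque command from
    the error between the target and the encoder-measured rotation angle."""
    commands = []
    for target, measured in zip(target_angles, encoder_angles):
        torque = kp * (target - measured)                 # control value
        commands.append(max(-max_torque, min(max_torque, torque)))
    return commands
```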
  • the control device 217 may appropriately control the drive of the arm portion 209 according to the operation input, thereby controlling the position and posture of the microscope unit 203.
  • by this control, the microscope unit 203 can be moved from an arbitrary position to another arbitrary position and then fixedly supported at the new position.
  • as the input device, it is preferable to apply one that can be operated even while the operator holds a surgical tool in hand, such as a foot switch.
  • the operation input may be performed in a non-contact manner based on the gesture detection and the line-of-sight detection using a wearable device or a camera provided in the operating room.
  • the arm portion 209 may be operated by a so-called master slave method.
  • the arm portion 209 can be remotely controlled by the user via an input device installed at a location away from the operating room.
  • so-called power assist control may be performed, in which the actuators of the first joint portion 211a to the sixth joint portion 211f are driven so that the arm portion 209 receives an external force from the user and moves smoothly following that force.
  • the microscope unit 203 can be moved with a relatively light force. Therefore, the microscope unit 203 can be moved more intuitively and with a simpler operation, and the convenience of the user can be improved.
  • the drive of the arm portion 209 may be controlled so as to perform a pivot operation.
  • the pivot operation is an operation of moving the microscope unit 203 so that the optical axis of the microscope unit 203 always faces a predetermined point in space (hereinafter, referred to as a pivot point). According to the pivot operation, the same observation position can be observed from various directions, so that the affected part can be observed in more detail.
  • when the microscope unit 203 is configured so that its focal length cannot be adjusted, the pivot operation is preferably performed with the distance between the microscope unit 203 and the pivot point fixed; in this case, the distance may be adjusted to the fixed focal length of the microscope unit 203.
  • the microscope unit 203 then moves on a hemispherical surface centered on the pivot point with a radius corresponding to the focal length (schematically illustrated in FIG. 10), and a clear captured image is obtained even when the observation direction is changed.
  • alternatively, the control device 217 may calculate the distance between the microscope unit 203 and the pivot point based on the rotation angle of each joint portion detected by the encoders, and automatically adjust the focal length of the microscope unit 203 based on the calculation result.
  • when the microscope unit 203 is provided with an AF function, the focal length may be adjusted automatically by the AF function each time the distance between the microscope unit 203 and the pivot point changes due to the pivot operation.
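The geometry of the pivot operation can be sketched as follows: the camera stays on a hemisphere of fixed radius around the pivot point, its optical axis aimed at that point, with the focal length kept equal to the radius. The angle parameterization and the world up-vector are assumptions, and the sketch assumes the microscope is not exactly overhead.

```python
import numpy as np

def pivot_pose(pivot, radius, azimuth, elevation):
    """Place the camera on a hemisphere of the given radius around the pivot
    point, optical axis aimed at the pivot, focal length equal to the radius."""
    pivot = np.asarray(pivot, dtype=float)
    p = pivot + radius * np.array([np.cos(elevation) * np.cos(azimuth),
                                   np.cos(elevation) * np.sin(azimuth),
                                   np.sin(elevation)])
    z = (pivot - p) / radius                  # optical axis toward the pivot
    x = np.cross([0.0, 0.0, 1.0], z)          # degenerate if exactly overhead
    x /= np.linalg.norm(x)
    y = np.cross(z, x)
    R = np.stack([x, y, z], axis=1)           # camera orientation
    return p, R, radius                       # radius doubles as focal length
```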
  • the first joint portion 211a to the sixth joint portion 211f may be provided with brakes for restraining their rotation.
  • the operation of the brake can be controlled by the control device 217.
  • when the position and posture of the microscope unit 203 are to be fixed, the control device 217 operates the brakes of the joint portions.
  • the posture of the arm portion 209, that is, the position and posture of the microscope unit 203, can thereby be fixed without driving the actuators, so power consumption can be reduced.
  • the control device 217 may release the brake of each joint unit and drive the actuator according to a predetermined control method.
  • Such a brake operation can be performed in response to an operation input by the user via the above-mentioned operation unit 207.
  • when the user wants to move the position and posture of the microscope unit 203, he or she operates the operation unit 207 to release the brakes of the joint portions.
  • the operation mode of the arm portion 209 shifts to a mode in which the joint portions can freely rotate (all-free mode).
  • after moving the microscope unit 203, the user operates the operation unit 207 again, and the operation mode shifts to a mode in which the rotation of each joint portion is restricted (fixed mode).
  • the control device 217 comprehensively controls the operation of the microscope surgery system 200 by controlling the operations of the microscope device 201 and the display device 219.
  • the control device 217 controls the drive of the arm portion 209 by operating the actuators of the first joint portion 211a to the sixth joint portion 211f according to a predetermined control method.
  • the control device 217 changes the operation mode of the arm portion 209 by controlling the operation of the brakes of the first joint portion 211a to the sixth joint portion 211f.
  • the control device 217 generates image data for display by performing various kinds of signal processing on the image signal acquired by the imaging unit of the microscope unit 203 of the microscope device 201, and displays that image data on the display device 219.
  • as the signal processing, various known processes may be performed, for example development processing (demosaic processing), image quality enhancement processing (band enhancement, super-resolution, NR (noise reduction), and/or camera shake correction), and/or enlargement processing (that is, electronic zoom). A toy version of this pipeline is sketched below.
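The sketch below is a minimal Python/OpenCV rendering of such a development pipeline; the Bayer pattern, the denoising call, and the sharpening weights are placeholders, since the actual CCU processing is not specified at this level of detail.

```python
import cv2

def develop_for_display(raw_bayer, zoom=1.0):
    """Toy development: demosaic the RAW Bayer data, denoise (NR), sharpen
    (band enhancement), and apply an electronic zoom by crop-and-resize."""
    img = cv2.cvtColor(raw_bayer, cv2.COLOR_BayerRG2BGR)        # demosaic
    img = cv2.fastNlMeansDenoisingColored(img)                  # NR processing
    blur = cv2.GaussianBlur(img, (0, 0), 3)
    img = cv2.addWeighted(img, 1.5, blur, -0.5, 0)              # band enhancement
    if zoom > 1.0:                                              # electronic zoom
        h, w = img.shape[:2]
        ch, cw = int(h / zoom), int(w / zoom)
        y0, x0 = (h - ch) // 2, (w - cw) // 2
        img = cv2.resize(img[y0:y0 + ch, x0:x0 + cw], (w, h))
    return img
```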
  • the communication between the control device 217 and the microscope unit 203 and the communication between the control device 217 and the first joint portion 211a to the sixth joint portion 211f may be wired communication or wireless communication.
  • in the case of wired communication, communication by electric signal or optical communication may be performed.
  • the transmission cable used for the wired communication may be configured as an electric signal cable, an optical fiber, or a composite cable thereof depending on the communication method.
  • in the case of wireless communication, it is not necessary to lay a transmission cable in the operating room, which eliminates situations where a transmission cable hinders the movement of medical staff in the operating room.
  • the control device 217 may be a processor such as a CPU or GPU, or a microcomputer or a control board on which a processor and a storage element such as a memory are mixedly mounted.
  • the various functions described above can be realized by operating the processor of the control device 217 according to a predetermined program.
  • in the illustrated example, the control device 217 is provided as a device separate from the microscope device 201, but the control device 217 may instead be installed inside the base portion 215 of the microscope device 201 and configured integrally with it. Alternatively, the control device 217 may be composed of a plurality of devices.
  • for example, microcomputers, control boards, and the like may be arranged in the microscope unit 203 and in the first joint portion 211a to the sixth joint portion 211f of the arm portion 209 and connected to each other so as to be able to communicate, whereby functions similar to those of the control device 217 may be realized.
  • the display device 219 is provided in the operating room and displays an image corresponding to the image data generated by the control device 217 under the control of the control device 217. That is, the display device 219 displays an image of the surgical site taken by the microscope unit 203.
  • the display device 219 may display various information related to the surgery, such as physical information of the patient and information about the surgical procedure, in place of the image of the surgical site or together with the image of the surgical site. In this case, the display of the display device 219 may be appropriately switched by an operation by the user.
  • a plurality of display devices 219 may be provided, and each of the plurality of display devices 219 may display an image of the surgical site and various information related to the surgery.
  • various known display devices such as a liquid crystal display device or an EL (Electro Luminescence) display device may be applied.
  • FIG. 11 is a diagram showing a state of surgery using the microscopic surgery system 200 shown in FIG.
  • FIG. 11 schematically shows how the surgeon 221 is performing surgery on the patient 225 on the patient bed 223 using the microsurgery system 200.
  • the control device 217 is omitted from the configuration of the microscope surgery system 200, and the microscope device 201 is shown in a simplified manner.
  • the image of the surgical site taken by the microscope device 201 is enlarged and displayed on the display device 219 installed on the wall surface of the operating room by using the microscope surgery system 200.
  • the display device 219 is installed at a position facing the operator 221.
  • the operator 221 observes the state of the surgical site through the image shown on the display device 219, and performs various treatments on the affected area, such as excision.
  • the microscope device 201 can also function as a support arm device that supports another observation device or another surgical tool at its tip instead of the microscope unit 203. By supporting such observation devices and surgical tools with a support arm device, their position can be fixed more stably and the burden on the medical staff can be reduced compared with the case where medical staff support them by hand.
  • the technique according to the present disclosure may be applied to a support arm device that supports a configuration other than such a microscope unit.
  • the microscopic surgery system 200 corresponds to an embodiment of the treatment system according to the present technology.
  • the microscope unit 203 corresponds to an imaging unit capable of imaging a treatment target, and is configured as a part of the microscope device 201.
  • the control device 217 has the same function as the image processing unit 29 of the CCU 20. That is, the control device 217 corresponds to an embodiment of the image processing device according to the present technology.
  • the treatment system, image processing device, image processing method, and program according to the present technology may be executed by a plurality of computers, communicably connected via a network or the like, operating in conjunction with one another, and the image processing device according to the present technology may be constructed in this way. That is, the treatment system, image processing device, image processing method, and program according to the present technology can be executed not only in a computer system composed of a single computer but also in a computer system in which a plurality of computers operate in conjunction with each other.
  • the system means a set of a plurality of components (devices, modules (parts), etc.), and it does not matter whether or not all the components are in the same housing. Therefore, a plurality of devices housed in separate housings and connected via a network, and one device in which a plurality of modules are housed in one housing are both systems.
  • execution of the image processing device, image processing method, and program according to the present technology by a computer system includes, for example, both the case where the setting of the region of interest, the determination of the emphasis region, the control of the amount of information, and the like are executed by a single computer, and the case where each process is executed by a different computer. Execution of each process by a predetermined computer also includes causing another computer to execute part or all of the process and acquiring the result. That is, the treatment system, image processing device, image processing method, and program according to the present technology can also be applied to a cloud computing configuration in which one function is shared and processed jointly by a plurality of devices via a network.
  • in the present disclosure, concepts that define shape, size, positional relationship, state, and the like, such as “center”, “same”, “orthogonal”, “parallel”, “cylindrical shape”, and “tubular shape”, include “substantially centered”, “substantially the same”, “substantially orthogonal”, “substantially parallel”, “substantially cylindrical”, and “substantially tubular”; states included within a predetermined range (for example, a ±10% range) based on the exact state are also included.
  • the effects described in this disclosure are merely examples and are not limited, and other effects may be obtained.
  • the description of the plurality of effects described above does not necessarily mean that those effects are exerted at the same time. It means that at least one of the above-mentioned effects can be obtained depending on the conditions and the like, and of course, there is a possibility that an effect not described in the present disclosure may be exhibited.
  • the present technology can also adopt the following configurations.
  • (1) A treatment system including: an imaging unit capable of photographing a treatment target; and an image processing unit that enlarges a first region in a captured image captured by the imaging unit, and reduces the amount of information in a second region different from the first region in the captured image so that a predetermined region in the second region is emphasized.
  • (2) The treatment system according to (1), in which the image processing unit executes a process of reducing the amount of information in the second region and an enhancement process for the predetermined region.
  • (3) The treatment system according to (2), in which the image processing unit executes, as the process of reducing the amount of information in the second region, at least one of reduction of the image, conversion of the image to grayscale, reduction of the gradation values of the image, conversion of the display format of the image, or cutting out of the image.
  • (4) The treatment system according to (2) or (3), in which the image processing unit executes, as the enhancement process for the predetermined region, at least one of enlargement of the image, colorization of the image, increase of the gradation values of the image, or conversion of the display format of the image (a code sketch of these operations follows this list).
  • (5) The treatment system according to (3) or (4), in which the conversion of the display format includes converting the image into a painting style.
  • (6) The treatment system according to any one of (1) to (5), in which the captured image includes a three-dimensional image, and the image processing unit converts the second region into a two-dimensional image as the process of reducing the amount of information in the second region.
  • (7) The treatment system according to (3), in which the image processing unit reduces the second region so that the entire second region can be displayed in the region other than the enlarged first region in the captured image.
  • (8) The treatment system according to any one of (2) to (7), in which the image processing unit executes the enhancement process on the predetermined region after reducing the amount of information in the second region.
  • (9) The treatment system according to any one of (2) to (7), in which the image processing unit reduces the amount of information in the second region after executing the enhancement process on the predetermined region.
  • (10) The treatment system according to (1), in which the image processing unit reduces the amount of information of only the regions of the second region other than the predetermined region.
  • (11) The treatment system according to any one of (1) to (10), further including an area setting unit that sets the predetermined region with respect to the second region.
  • (12) The treatment system according to (11), in which the area setting unit sets, as the predetermined region, a region corresponding to a portion that requires attention when performing the treatment.
  • (13) The treatment system according to (11) or (12), in which the area setting unit sets, as the predetermined region, a region corresponding to at least one of an instrument used for the treatment, a bleeding site, or a site where damage should be avoided.
  • (14) The treatment system according to any one of (1) to (13), in which the imaging unit is configured as a part of an endoscope.
  • (15) The treatment system according to any one of (1) to (13), in which the imaging unit is configured as a part of a surgical microscope.
  • (16) The treatment system according to any one of (1) to (15), in which the treatment includes surgery.
  • (17) The treatment system according to any one of (1) to (16), in which the treatment target includes a living body.
  • (18) An image processing device including: an image acquisition unit that acquires an image including a treatment target; and an image processing unit that enlarges a first region in the image, and reduces the amount of information in a second region different from the first region in the image so that a predetermined region in the second region is emphasized.
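To make the operations listed in configurations (3) and (4) concrete, here is a small Python/OpenCV sketch of representative reduction and enhancement operations; the scale factor, the number of gradation levels, and the saturation gain are illustrative values only.

```python
import cv2
import numpy as np

def reduce_information(img, scale=0.5, levels=8):
    """Configuration (3) examples: reduce the image, convert it to
    grayscale, and reduce its gradation values by quantization."""
    out = cv2.resize(img, None, fx=scale, fy=scale)      # reduction
    out = cv2.cvtColor(out, cv2.COLOR_BGR2GRAY)          # grayscale
    step = 256 // levels
    return (out // step) * step                          # fewer gradations

def emphasize(img, gain=1.3):
    """Configuration (4) examples: enlarge the image and boost its color."""
    out = cv2.resize(img, None, fx=2.0, fy=2.0)          # enlargement
    hsv = cv2.cvtColor(out, cv2.COLOR_BGR2HSV).astype(np.float32)
    hsv[..., 1] = np.clip(hsv[..., 1] * gain, 0, 255)    # colorization
    return cv2.cvtColor(hsv.astype(np.uint8), cv2.COLOR_HSV2BGR)
```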

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Surgery (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Biomedical Technology (AREA)
  • Optics & Photonics (AREA)
  • Pathology (AREA)
  • Radiology & Medical Imaging (AREA)
  • Biophysics (AREA)
  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Medical Informatics (AREA)
  • Molecular Biology (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Endoscopes (AREA)

Abstract

An operation system pertaining to an embodiment of the present invention comprises an imaging unit and an image processing unit. The imaging unit is capable of capturing an image of an operation object. The image processing unit enlarges a first region in a captured image captured by the imaging unit, and reduces the amount of information of a second region different from the first region in the captured image so that a prescribed region in the second region is emphasized. An object image can thereby be adequately understood.

Description

With the endoscope device described in Patent Document 1, a captured image of the inside of a living body is acquired. In this captured image, a region of interest including a lesion is set based on the position of a treatment tool for treating the lesion in the living body, and the region of interest is enlarged relative to the other regions while the angle of view of the captured image is maintained. As a result, the entire captured image and the region of interest are displayed in the same image, and the lesion is displayed at an appropriate size (paragraphs [0047] and [0062] and FIGS. 7 and 13 of Patent Document 1, etc.).
Japanese Patent No. 5855358
Thus, for treatments and the like using an endoscope or similar instrument, there is a demand for a technique that makes it possible to sufficiently grasp the image of the target.
A program according to one form of the present technology causes a computer system to execute the following steps: a step of acquiring an image including a treatment target; and a step of enlarging a first region in the image and reducing the amount of information in a second region different from the first region in the image so that a predetermined region in the second region is emphasized.
FIG. 1 is a diagram showing an example of the schematic configuration of an endoscopic surgery system. FIG. 2 is a block diagram showing an example of the functional configuration of a camera head and a CCU. FIG. 3 is a block diagram showing a configuration example of an image processing unit. FIG. 4 is a flowchart showing an example of the setting and processing of each region. FIG. 5 is a schematic diagram showing an example of setting the regions of a captured image of a treatment target. FIG. 6 is a schematic diagram showing an example in which the region of interest is enlarged. FIG. 7 is a schematic diagram showing an example in which enhancement processing is executed on the emphasis region in the peripheral region. FIG. 8 is a schematic diagram showing an example in which control of the amount of information is executed on the peripheral region. FIG. 9 is a schematic diagram showing an example of a composite image. FIG. 10 is a diagram showing an example of the schematic configuration of a microscopic surgery system. FIG. 11 is a diagram showing a state of surgery using the microscopic surgery system.
Hereinafter, embodiments of the present technology will be described with reference to the drawings.
FIG. 1 is a diagram showing an example of the schematic configuration of an endoscopic surgery system 100 to which the technique according to the present disclosure can be applied. FIG. 1 shows an operator (doctor) 32 performing surgery on a patient 34 on a patient bed 33 using the endoscopic surgery system 100. As shown in the figure, the endoscopic surgery system 100 includes an endoscope 1, other surgical tools 9, a support arm device 14 that supports the endoscope 1, and a cart 19 on which various devices for endoscopic surgery are mounted.
In endoscopic surgery, instead of cutting the abdominal wall to open the abdomen, a plurality of tubular opening devices called trocars 13a to 13d are punctured into the abdominal wall. Then, the lens barrel 2 of the endoscope 1 and the other surgical tools 9 are inserted into the body cavity of the patient 34 through the trocars 13a to 13d. In the illustrated example, a pneumoperitoneum tube 10, an energy treatment tool 11, and forceps 12 are inserted into the body cavity of the patient 34 as the other surgical tools 9. The energy treatment tool 11 is a treatment tool that cuts and peels tissue, seals blood vessels, and the like by means of high-frequency current or ultrasonic vibration. However, the illustrated surgical tools 9 are merely an example, and various surgical tools generally used in endoscopic surgery, such as tweezers and retractors, may be used.
An image of the surgical site in the body cavity of the patient 34 taken by the endoscope 1 is displayed on the display device 21. While viewing the image of the surgical site displayed on the display device 21 in real time, the surgeon 32 uses the energy treatment tool 11 and the forceps 12 to perform procedures such as excising the affected area. Although not shown, the pneumoperitoneum tube 10, the energy treatment tool 11, and the forceps 12 are supported by the surgeon 32, an assistant, or the like during the operation.
[Support arm device]
The support arm device 14 includes an arm portion 16 extending from the base portion 15. In the illustrated example, the arm portion 16 is composed of joint portions 17a, 17b, and 17c and links 18a and 18b, and is driven under the control of the arm control device 23. The endoscope 1 is supported by the arm portion 16, and its position and posture are controlled. Stable fixation of the position of the endoscope 1 can thereby be realized.
[Endoscope]
The endoscope 1 is composed of a lens barrel 2, whose region of predetermined length from the tip is inserted into the body cavity of the patient 34, and a camera head 3 connected to the base end of the lens barrel 2. The illustrated example shows the endoscope 1 configured as a so-called rigid endoscope having a rigid lens barrel 2, but the endoscope 1 may instead be configured as a so-called flexible endoscope having a flexible lens barrel 2.
An opening into which an objective lens is fitted is provided at the tip of the lens barrel 2. A light source device 22 is connected to the endoscope 1; light generated by the light source device 22 is guided to the tip of the lens barrel by a light guide extending inside the lens barrel 2, and is irradiated through the objective lens toward the observation target in the body cavity of the patient 34. The endoscope 1 may be a forward-viewing endoscope, an oblique-viewing endoscope, or a side-viewing endoscope.
An optical system and an image sensor are provided inside the camera head 3, and reflected light (observation light) from the observation target is focused on the image sensor by the optical system. The observation light is photoelectrically converted by the image sensor to generate an electric signal corresponding to the observation light, that is, an image signal corresponding to the observation image. The image signal is transmitted as RAW data to the camera control unit (CCU: Camera Control Unit) 20. The camera head 3 is equipped with a function of adjusting the magnification and the focal length by appropriately driving its optical system.
For example, a plurality of image sensors may be provided in the camera head 3 in order to support stereoscopic viewing (3D display) or the like. In this case, a plurality of relay optical systems are provided inside the lens barrel 2 in order to guide the observation light to each of the plurality of image sensors.
[Various devices mounted on the cart]
The CCU 20 is composed of a CPU (Central Processing Unit), a GPU (Graphics Processing Unit), and the like, and controls the operations of the endoscope 1 and the display device 21 in an integrated manner. Specifically, the CCU 20 performs various kinds of image processing for displaying an image based on the image signal received from the camera head 3, such as development processing (demosaic processing). The CCU 20 provides the display device 21 with the image signal that has undergone this image processing. The CCU 20 also transmits a control signal to the camera head 3 to control its driving. The control signal may include information about imaging conditions such as magnification and focal length.
Under the control of the CCU 20, the display device 21 displays an image based on the image signal processed by the CCU 20. When the endoscope 1 supports high-resolution shooting such as 4K (3840 horizontal pixels x 2160 vertical pixels) or 8K (7680 horizontal pixels x 4320 vertical pixels) and/or 3D display, a display device 21 capable of high-resolution display and/or 3D display can be used accordingly. For high-resolution shooting such as 4K or 8K, using a display device 21 with a size of 55 inches or more gives a greater sense of immersion. A plurality of display devices 21 with different resolutions and sizes may also be provided depending on the application.
The light source device 22 is composed of a light source such as an LED (light emitting diode), and supplies the endoscope 1 with irradiation light for photographing the surgical site.
The arm control device 23 is composed of a processor such as a CPU, and operates according to a predetermined program to control the drive of the arm portion 16 of the support arm device 14 in accordance with a predetermined control method.
The input device 24 is an input interface for the endoscopic surgery system 100. Through the input device 24, the user can input various kinds of information and instructions to the endoscopic surgery system 100. For example, the user inputs various kinds of information related to the surgery, such as the patient's physical information and information about the surgical procedure. The user also inputs, for example, an instruction to drive the arm portion 16, an instruction to change the imaging conditions of the endoscope 1 (type of irradiation light, magnification, focal length, etc.), and an instruction to drive the energy treatment tool 11.
The type of the input device 24 is not limited, and it may be any of various known input devices, for example a mouse, a keyboard, a touch panel, a switch, a foot switch 27, and/or a lever. When a touch panel is used as the input device 24, it may be provided on the display surface of the display device 21.
Alternatively, the input device 24 may be a device worn by the user, such as a glasses-type wearable device or an HMD (Head Mounted Display), in which case various inputs are made according to the user's gestures and line of sight detected by these devices. The input device 24 may also include a camera capable of detecting the user's movement, with various inputs made according to the gestures and line of sight detected from the captured video, or a microphone capable of picking up the user's voice, with various inputs made by voice through the microphone. Because the input device 24 is configured to accept various kinds of information in a non-contact manner, a user belonging to the clean area (for example, the surgeon 32) can operate equipment belonging to the unclean area without contact. The user can also operate equipment without releasing the surgical tool in hand, which improves user convenience.
The treatment tool control device 25 controls the drive of the energy treatment tool 11 for cauterizing or incising tissue, sealing blood vessels, and the like. The pneumoperitoneum device 26 sends gas into the body cavity of the patient 34 through the pneumoperitoneum tube 10 to inflate the body cavity, in order to secure the field of view of the endoscope 1 and the working space of the operator. The recorder 35 is a device capable of recording various kinds of information related to the surgery. The printer 36 is a device capable of printing various kinds of information related to the surgery in various formats such as text, images, and graphs.
Hereinafter, particularly characteristic configurations of the endoscopic surgery system 100 will be described in more detail.
[Support arm device]
The support arm device 14 includes a base portion 15 serving as a base and an arm portion 16 extending from the base portion 15. In the illustrated example, the arm portion 16 is composed of a plurality of joint portions 17a, 17b, and 17c and a plurality of links 18a and 18b connected by the joint portion 17b, but in FIG. 1 the configuration of the arm portion 16 is shown in simplified form. In practice, the shapes, number, and arrangement of the joint portions 17a to 17c and the links 18a and 18b, the directions of the rotation axes of the joint portions 17a to 17c, and the like may be set appropriately so that the arm portion 16 has the desired degrees of freedom. For example, the arm portion 16 can suitably be configured to have six or more degrees of freedom. This allows the endoscope 1 to be moved freely within the movable range of the arm portion 16, so that the lens barrel 2 of the endoscope 1 can be inserted into the body cavity of the patient 34 from the desired direction.
The joint portions 17a to 17c are provided with actuators, and each is configured to be rotatable around a predetermined rotation axis by driving its actuator. The drive of the actuators is controlled by the arm control device 23, whereby the rotation angles of the joint portions 17a to 17c are controlled and the drive of the arm portion 16 is controlled. Control of the position and posture of the endoscope 1 can thereby be realized. At this time, the arm control device 23 can control the drive of the arm portion 16 by various known control methods such as force control or position control.
For example, when the surgeon 32 appropriately performs an operation input via the input device 24 (including the foot switch 27), the arm control device 23 may appropriately control the drive of the arm portion 16 in response to the input, and the position and posture of the endoscope 1 may be controlled. By this control, the endoscope 1 at the tip of the arm portion 16 can be moved from an arbitrary position to another arbitrary position and then fixedly supported at the new position. The arm portion 16 may also be operated by a so-called master-slave method, in which case the arm portion 16 can be remotely operated by the user via an input device 24 installed at a location away from the operating room.
When force control is applied, the arm control device 23 may perform so-called power assist control, in which it receives an external force from the user and drives the actuators of the joint portions 17a to 17c so that the arm portion 16 moves smoothly following that force. This allows the user to move the arm portion 16 with a relatively light force while touching it directly, so the endoscope 1 can be moved more intuitively and with a simpler operation, improving user convenience.
In general, in endoscopic surgery the endoscope 1 has been supported by a doctor called a scopist. Using the support arm device 14 instead makes it possible to fix the position of the endoscope 1 more reliably without relying on human hands, so an image of the surgical site can be obtained stably and the surgery can be performed smoothly.
The arm control device 23 does not necessarily have to be provided on the cart 19, and does not necessarily have to be a single device. For example, an arm control device 23 may be provided in each of the joint portions 17a to 17c of the arm portion 16 of the support arm device 14, and the drive control of the arm portion 16 may be realized by a plurality of arm control devices 23 cooperating with one another.
[Light source device]
The light source device 22 supplies the endoscope 1 with irradiation light for photographing the surgical site. The light source device 22 is composed of, for example, an LED, a laser light source, or a white light source composed of a combination of these. When the white light source is composed of a combination of RGB laser light sources, the output intensity and output timing of each color (each wavelength) can be controlled with high precision, so the white balance of the captured image can be adjusted in the light source device 22. In this case it is also possible to capture images corresponding to each of R, G, and B in a time-division manner by irradiating the observation target with laser light from each of the RGB laser light sources in turn and controlling the drive of the image sensor of the camera head 3 in synchronization with the irradiation timing. With this method, a color image can be obtained without providing a color filter on the image sensor.
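As a sketch of the time-division color synthesis just described, three sequentially captured monochrome exposures can simply be stacked into one color image; a real system would also have to compensate for motion between the exposures, which this one-liner ignores.

```python
import numpy as np

def synthesize_color(frame_r, frame_g, frame_b):
    """Stack sequential R, G, and B exposures into an H x W x 3 color image;
    no color filter array is needed on the image sensor."""
    return np.stack([frame_r, frame_g, frame_b], axis=-1)
```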
The drive of the light source device 22 may also be controlled so that the intensity of the output light is changed at predetermined intervals. By controlling the drive of the image sensor of the camera head 3 in synchronization with the timing of the intensity changes, acquiring images in a time-division manner, and combining them, an image with a high dynamic range free of blocked-up shadows and blown-out highlights can be generated.
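A toy fusion of two such frames might look as follows; the clipping threshold and the assumed 2x exposure ratio between the alternating frames are invented for the sketch.

```python
import numpy as np

def fuse_exposures(dark_frame, bright_frame):
    """Toy high-dynamic-range fusion: use the brightly lit frame except where
    it clips, and fill those highlights from the darker frame."""
    bright = bright_frame.astype(np.float32)
    dark = dark_frame.astype(np.float32)
    w = np.clip((bright - 200.0) / 55.0, 0.0, 1.0)    # ~1 where bright clips
    fused = (1.0 - w) * bright + w * (dark * 2.0)      # assumed 2x exposure ratio
    return np.clip(fused, 0.0, 255.0).astype(np.uint8)
```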
 The light source device 22 may also be configured to supply light in a predetermined wavelength band corresponding to special light observation. In special light observation, for example, so-called narrow band imaging is performed, in which the wavelength dependence of light absorption in body tissue is exploited to photograph predetermined tissue, such as blood vessels in the mucosal surface layer, with high contrast by irradiating light in a narrower band than the irradiation light used in normal observation (that is, white light). Alternatively, fluorescence observation, in which an image is obtained from fluorescence generated by irradiation with excitation light, may be performed. In fluorescence observation, the body tissue may be irradiated with excitation light and the fluorescence from that body tissue observed (autofluorescence observation), or a reagent such as indocyanine green (ICG) may be locally injected into the body tissue and the tissue irradiated with excitation light corresponding to the fluorescence wavelength of the reagent to obtain a fluorescence image. The light source device 22 may be configured to be capable of supplying narrow band light and/or excitation light corresponding to such special light observation.
 [Camera head and CCU]
 The functions of the camera head 3 and the CCU 20 of the endoscope 1 will now be described in more detail with reference to FIG. 2. FIG. 2 is a block diagram showing an example of the functional configuration of the camera head 3 and the CCU 20 shown in FIG. 1.
 Referring to FIG. 2, the camera head 3 has, as its functions, a lens unit 4, an imaging unit 5, a drive unit 6, a communication unit 7, and a camera head control unit 8. The CCU 20 has, as its functions, a communication unit 28, an image processing unit 29, and a control unit 30. The camera head 3 and the CCU 20 are connected by a transmission cable 31 so as to be able to communicate bidirectionally.
 First, the functional configuration of the camera head 3 will be described. The lens unit 4 is an optical system provided at the connection with the lens barrel 2. Observation light taken in from the tip of the lens barrel 2 is guided to the camera head 3 and enters the lens unit 4. The lens unit 4 is configured by combining a plurality of lenses including a zoom lens and a focus lens. The optical characteristics of the lens unit 4 are adjusted so as to focus the observation light onto the light receiving surface of the image sensor of the imaging unit 5. The zoom lens and the focus lens are configured so that their positions on the optical axis can be moved in order to adjust the magnification and focus of the captured image.
 The imaging unit 5 is composed of an image sensor and is arranged downstream of the lens unit 4. Observation light that has passed through the lens unit 4 is focused onto the light receiving surface of the image sensor, and an image signal corresponding to the observation image is generated by photoelectric conversion. The image signal generated by the imaging unit 5 is provided to the communication unit 7.
 As the image sensor constituting the imaging unit 5, for example, a CMOS (Complementary Metal Oxide Semiconductor) image sensor having a Bayer array and capable of color imaging is used. An image sensor capable of capturing high-resolution images of 4K or higher may also be used. Obtaining an image of the surgical site at high resolution allows the operator 32 to grasp the state of the surgical site in more detail and to proceed with the surgery more smoothly.
 The imaging unit 5 may also be configured with a pair of image sensors for acquiring right-eye and left-eye image signals corresponding to 3D display. 3D display enables the operator 32 to grasp the depth of biological tissue in the surgical site more accurately. When the imaging unit 5 is of a multi-sensor type, a plurality of lens unit 4 systems are provided, one for each image sensor.
 The imaging unit 5 does not necessarily have to be provided in the camera head 3. For example, the imaging unit 5 may be provided inside the lens barrel 2, immediately behind the objective lens.
 The drive unit 6 is composed of actuators and, under the control of the camera head control unit 8, moves the zoom lens and the focus lens of the lens unit 4 by predetermined distances along the optical axis. The magnification and focus of the image captured by the imaging unit 5 can thereby be adjusted as appropriate.
 The communication unit 7 is composed of a communication device for transmitting and receiving various types of information to and from the CCU 20. The communication unit 7 transmits the image signal obtained from the imaging unit 5 as RAW data to the CCU 20 via the transmission cable 31. This image signal is preferably transmitted by optical communication so that the captured image of the surgical site is displayed with low latency. During surgery, the operator 32 operates while observing the condition of the affected area through the captured image, so for safer and more reliable surgery the moving image of the surgical site should be displayed in as close to real time as possible. When optical communication is used, the communication unit 7 is provided with a photoelectric conversion module that converts an electric signal into an optical signal. The image signal is converted into an optical signal by this module and then transmitted to the CCU 20 via the transmission cable 31.
 The communication unit 7 also receives, from the CCU 20, control signals for controlling the drive of the camera head 3. Such a control signal contains information on imaging conditions, for example information specifying the frame rate of the captured image, information specifying the exposure value at the time of imaging, and/or information specifying the magnification and focus of the captured image. The communication unit 7 provides the received control signal to the camera head control unit 8. The control signal from the CCU 20 may also be transmitted by optical communication; in this case, the communication unit 7 is provided with a photoelectric conversion module that converts an optical signal into an electric signal, and the control signal is converted into an electric signal by this module and then provided to the camera head control unit 8.
 The above imaging conditions, such as frame rate, exposure value, magnification, and focus, are set automatically by the control unit 30 of the CCU 20 based on the acquired image signal. That is, the endoscope 1 is equipped with so-called AE (Auto Exposure), AF (Auto Focus), and AWB (Auto White Balance) functions.
 The camera head control unit 8 controls the drive of the camera head 3 based on the control signal received from the CCU 20 via the communication unit 7. For example, the camera head control unit 8 controls the drive of the image sensor of the imaging unit 5 based on information specifying the frame rate of the captured image and/or information specifying the exposure at the time of imaging. As another example, the camera head control unit 8 moves the zoom lens and the focus lens of the lens unit 4 as appropriate, via the drive unit 6, based on information specifying the magnification and focus of the captured image. The camera head control unit 8 may further have a function of storing information for identifying the lens barrel 2 and the camera head 3.
 By arranging components such as the lens unit 4 and the imaging unit 5 in a sealed structure with high airtightness and waterproofness, the camera head 3 can be made resistant to autoclave sterilization.
 Next, the functional configuration of the CCU 20 will be described. The communication unit 28 is composed of a communication device for transmitting and receiving various types of information to and from the camera head 3. The communication unit 28 receives the image signal transmitted from the camera head 3 via the transmission cable 31. As described above, this image signal can suitably be transmitted by optical communication, in which case the communication unit 28 is provided with a photoelectric conversion module that converts an optical signal into an electric signal. The communication unit 28 provides the image signal converted into an electric signal to the image processing unit 29.
 The communication unit 28 also transmits, to the camera head 3, control signals for controlling the drive of the camera head 3. These control signals may likewise be transmitted by optical communication.
 The image processing unit 29 performs various types of image processing on the image signal, which is RAW data transmitted from the camera head 3. This image processing includes various known signal processing such as development processing, image quality enhancement processing (band enhancement processing, super-resolution processing, NR (noise reduction) processing, and/or camera shake correction processing), and/or enlargement processing (electronic zoom processing). The image processing unit 29 also performs detection processing on the image signal for AE, AF, and AWB.
 The image processing unit 29 is composed of a processor such as a CPU or GPU, and the above-described image processing and detection processing can be performed by the processor operating according to a predetermined program. When the image processing unit 29 is composed of a plurality of GPUs, it divides the information pertaining to the image signal as appropriate and performs image processing in parallel on these GPUs.
 The control unit 30 performs various controls relating to the imaging of the surgical site by the endoscope 1 and the display of the captured image. For example, the control unit 30 generates control signals for controlling the drive of the camera head 3. If imaging conditions have been input by the user, the control unit 30 generates the control signal based on that input. Alternatively, when the endoscope 1 is equipped with the AE, AF, and AWB functions, the control unit 30 calculates the optimum exposure value, focal length, and white balance as appropriate according to the result of the detection processing by the image processing unit 29 and generates the control signal.
 The control unit 30 also causes the display device 21 to display an image of the surgical site based on the image signal processed by the image processing unit 29. In doing so, the control unit 30 recognizes various objects in the surgical site image using various image recognition techniques. For example, by detecting the edge shapes, colors, and so on of objects included in the surgical site image, the control unit 30 can recognize surgical tools such as forceps, specific biological sites, bleeding, mist generated when the energy treatment tool 11 is used, and the like. When displaying the image of the surgical site on the display device 21, the control unit 30 uses the recognition results to superimpose various types of surgery support information on the image of the surgical site. Superimposing the surgery support information and presenting it to the operator 32 makes it possible to proceed with the surgery more safely and reliably.
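 As one non-limiting sketch of such color-based recognition, a bleeding region can be detected by simple thresholding (the threshold values below are illustrative assumptions, not values taken from the present disclosure):

    import cv2
    import numpy as np

    def detect_bleeding_mask(bgr_image):
        # Convert to HSV so that "red and strongly saturated" is easy to express.
        hsv = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2HSV)
        # Red wraps around the hue axis in OpenCV (0-180), so two ranges are used.
        lower = cv2.inRange(hsv, (0, 120, 70), (10, 255, 255))
        upper = cv2.inRange(hsv, (170, 120, 70), (180, 255, 255))
        mask = cv2.bitwise_or(lower, upper)
        # Morphological opening removes speckle noise so only coherent regions remain.
        return cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))

 A real system would combine such cues with edge shape and learned models, as described later.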
 The transmission cable 31 connecting the camera head 3 and the CCU 20 is an electric signal cable supporting electric signal communication, an optical fiber supporting optical communication, or a composite cable thereof.
 In the illustrated example, communication is performed by wire using the transmission cable 31, but the communication between the camera head 3 and the CCU 20 may be performed wirelessly. When the communication between the two is wireless, the transmission cable 31 no longer needs to be laid in the operating room, eliminating situations in which the movement of medical staff in the operating room is hindered by the cable.
 In the present embodiment, the endoscopic surgery system 100 corresponds to the treatment system. The camera head 3 corresponds to an imaging unit capable of photographing the treatment target and is configured as a part of the endoscope 1.
 [Configuration of the image processing unit]
 The image processing unit 29 will now be described in detail. The image processing unit 29 corresponds to an embodiment of the image processing unit according to the present technology, and the CCU 20 having the image processing unit 29 corresponds to an embodiment of the image processing apparatus according to the present technology.
 In the present embodiment, the image processing unit 29 has the hardware necessary for configuring a computer, such as a processor (CPU, GPU, etc.) and memory (ROM, RAM, etc.). The image processing method according to the present technology is executed when the CPU or the like loads a control program (the program according to the present technology) recorded in the ROM or the like into the RAM and executes it.
 The specific configuration of the image processing unit 29 is not limited, and arbitrary hardware such as an FPGA (Field Programmable Gate Array) or ASIC (Application Specific Integrated Circuit) may be used.
 It is also possible to realize the image processing unit 29 as a software block by having the CPU or the like of the CCU 20 execute a predetermined program.
 FIG. 3 is a block diagram showing a configuration example of the image processing unit 29.
 In the present embodiment, the CPU or the like of the image processing unit 29 executes a predetermined program to realize, as functional blocks, an image acquisition unit 40, an attention area setting unit 41, an enlargement processing unit 42, an emphasis area setting unit 43, an emphasis processing unit 44, an information amount control unit 45, and an image composition unit 46. Of course, dedicated hardware such as an IC (integrated circuit) may be used to realize each block.
 The program is installed in the image processing unit 29 via, for example, various recording media, or may be installed via the Internet or the like.
 The type of recording medium on which the program is recorded is not limited, and any computer-readable recording medium may be used, for example any recording medium that records data non-transitorily.
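 The way these functional blocks cooperate can be sketched as follows. This is a minimal, non-limiting Python sketch; process_frame and each callable passed to it are hypothetical placeholders for the units 40 to 46 described below, not an implementation from the present disclosure:

    def process_frame(captured, set_regions, enlarge, emphasize, reduce_info, compose):
        # set_regions -> attention area setting unit 41 / emphasis area setting unit 43
        # enlarge     -> enlargement processing unit 42
        # emphasize   -> emphasis processing unit 44
        # reduce_info -> information amount control unit 45
        # compose     -> image composition unit 46
        attention_rect, peripheral_mask, emphasis_mask = set_regions(captured)
        enlarged = enlarge(captured, attention_rect)
        # In this embodiment emphasis is applied first and the information
        # amount is reduced afterwards; the reverse order is also possible.
        peripheral = emphasize(captured, emphasis_mask)
        peripheral = reduce_info(peripheral, peripheral_mask)
        return compose(enlarged, peripheral)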
 The image acquisition unit 40 acquires a captured image of the treatment target photographed by the endoscope 1.
 In the present disclosure, treatment includes various medical acts such as examination, surgery, and diagnosis. Medical acts include, for example, acts that cause, or may cause, harm to the human body unless performed with the medical judgment and skill of a doctor.
 Treatment also includes various medically related acts such as measuring body temperature and blood pressure, procedures that do not require specialized judgment or skill (such as treating minor cuts and burns), replacing soiled gauze, adjusting dosages, fitting treatment devices such as casts, collecting blood for transfusion, adjusting oral tracheal tubes, and managing and operating pacemakers.
 The treatment target includes various living bodies, for example the human body, companion animals such as dogs and cats, and livestock such as cows and pigs. Parts of a living body, such as an arm or an internal organ, are also included in the treatment target.
 An image includes both still images and moving images; of course, the individual frame images included in a moving image are also images. Acquiring an image includes acquiring an image signal containing the image information, and the data format of the image information is not limited and may be set arbitrarily.
 In the present embodiment, the image acquisition unit 40 acquires an image captured inside the body cavity of the human body undergoing endoscopic surgery. The acquired captured image is output to the attention area setting unit 41.
 The attention area setting unit 41 sets an attention area in the captured image. A region in an image is defined by a pixel region consisting of one or more pixels.
 The attention area is the area of interest to the operator 32 during the treatment. It includes, for example, an area containing a lesion of the human body being treated, or an area where a procedure such as an incision is being performed with a surgical tool such as a scalpel. Any other area the operator 32 wishes to focus on may be set as the attention area.
 The attention area may be set by the operator 32, for example, or may be set automatically for the input captured image. The method of setting the attention area is not limited, and any algorithm may be used.
 The attention area setting unit 41 also sets an area, other than the attention area, that is the target for setting the emphasis area described later.
 In the present embodiment, the entire area around the attention area in the captured image is set as the area targeted for setting the emphasis area (hereinafter referred to as the peripheral area). Of course, this is not limiting, and a partial area of the captured image other than the attention area may instead be set as the area targeted for setting the emphasis area.
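 As a minimal, non-limiting sketch, assuming the attention area is given as an operator-specified rectangle, the peripheral area can be derived as its complement:

    import numpy as np

    def split_regions(image, attention_rect):
        # attention_rect = (x, y, w, h), e.g. a rectangle specified by the operator.
        x, y, w, h = attention_rect
        attention = image[y:y + h, x:x + w]
        # Boolean mask selecting every pixel outside the attention area.
        peripheral_mask = np.ones(image.shape[:2], dtype=bool)
        peripheral_mask[y:y + h, x:x + w] = False
        return attention, peripheral_mask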
 The enlargement processing unit 42 can enlarge all or part of the captured image. In the present embodiment, the enlargement processing unit 42 enlarges the attention area set by the attention area setting unit 41.
 The emphasis area setting unit 43 sets an emphasis area within the peripheral area.
 The emphasis area is the area to be emphasized within the peripheral area. For example, an area corresponding to a portion that requires attention during the treatment is set as an emphasis area.
 For example, areas corresponding to instruments used in the treatment, a bleeding site, or a site where damage should be avoided are set as emphasis areas. Areas corresponding to all of these may be set as emphasis areas, or an area corresponding to at least one of them may be set as an emphasis area.
 In addition, an area corresponding to any other portion that requires attention during the treatment may be set as an emphasis area.
 The instruments used in the treatment are surgical tools such as scalpels, tweezers, and forceps. This is not limiting, and the various surgical tools used in general treatment may be included.
 A bleeding site is a site on the treatment target where bleeding is occurring. In the present embodiment, the bleeding site includes both the injured location where the bleeding originates and the spilled blood around it.
 A site where damage should be avoided is a vital organ of the living body, such as an artery. It also includes organs, such as the retina, whose damage could seriously affect the living body.
 In addition, a portion corresponding to a medical supply such as gauze, for example, may also be set as an emphasis area.
 In the present embodiment, the attention area corresponds to the first region in the captured image, the peripheral area corresponds to the second region different from the first region, and the emphasis area corresponds to the predetermined region within the second region.
 The method of setting the emphasis area is not limited, and any technique such as image recognition processing may be used. For example, any image recognition method such as edge detection or pattern matching may be used, and the algorithm is not particularly limited.
 An arbitrary machine learning algorithm using, for example, a DNN (Deep Neural Network) may also be used. For example, using AI (artificial intelligence) that performs deep learning makes it possible to improve the accuracy with which the emphasis area is set.
 For example, a learning unit and an identification unit are provided for setting the emphasis area. The learning unit and the identification unit may be constructed, for example, within the emphasis area setting unit, or within another device capable of communicating with the CCU 20.
 The learning unit performs machine learning based on input information (learning data) and outputs the learning result. The identification unit identifies the input information (makes judgments, predictions, and the like) based on the input information and the learning result.
 For the learning method in the learning unit, for example, a neural network or deep learning is used. A neural network is a model imitating the neural circuits of the human brain and consists of three types of layers: an input layer, intermediate (hidden) layers, and an output layer.
 Deep learning is a model using a multi-layered neural network; by repeating characteristic learning in each layer, it can learn complex patterns hidden in large amounts of data.
 Deep learning is used, for example, to identify objects in images and words in speech, and can of course also be applied to setting the emphasis area.
 As a hardware structure for realizing such machine learning, a neurochip/neuromorphic chip incorporating the concept of a neural network may be used.
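 As one non-limiting sketch of such a learned identification unit, a hypothetical segmentation model can be used to derive the emphasis mask (the model, its predict interface, and its class channels are all assumptions made for illustration, not part of the present disclosure):

    import numpy as np

    def emphasis_mask_from_model(model, image, threshold=0.5):
        # `model` is assumed to map an HxWx3 image to HxWxC per-pixel class
        # probabilities, with hypothetical channel 0 = instrument and
        # channel 1 = bleeding; any DNN framework could provide it.
        probs = model.predict(image[np.newaxis, ...])[0]
        instrument = probs[..., 0] > threshold
        bleeding = probs[..., 1] > threshold
        return instrument | bleeding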
 Machine learning problem settings include supervised learning, unsupervised learning, semi-supervised learning, reinforcement learning, inverse reinforcement learning, active learning, transfer learning, and so on.
 In supervised learning, for example, features are learned based on given labeled learning data (teacher data). This makes it possible to derive labels for unknown data.
 In unsupervised learning, a large amount of unlabeled learning data is analyzed to extract features, and clustering is performed based on the extracted features. This makes it possible to analyze trends and predict the future based on huge amounts of unknown data.
 Semi-supervised learning is a mixture of supervised and unsupervised learning: after features have been learned by supervised learning, a huge amount of training data is given by unsupervised learning, and learning is repeated while the features are calculated automatically.
 Reinforcement learning deals with the problem of an agent in an environment observing the current state and deciding what action to take. The agent obtains rewards from the environment by selecting actions and learns a policy that maximizes the reward obtained through a series of actions. By learning the optimal solution in an environment in this way, it becomes possible to reproduce human judgment and to have a computer acquire judgment exceeding that of humans.
 Machine learning can also generate virtual sensing data. For example, one kind of sensing data can be predicted from another and used as input information, such as generating position information from input image information. It is also possible to generate other sensing data from multiple sensing data, and to predict required information and generate predetermined information from sensing data.
 A learning algorithm may be applied to any of the processes in the present disclosure.
 In the present embodiment, the emphasis area setting unit 43 corresponds to a region setting unit that sets the predetermined region with respect to the second region.
 The area settings executed by the attention area setting unit 41 and the emphasis area setting unit 43 may function as a single block. For example, a region setting unit having the functions of both the attention area setting unit 41 and the emphasis area setting unit 43 may be configured, and the attention area, the peripheral area, and the emphasis area may all be set by that region setting unit.
 The emphasis processing unit 44 can perform emphasis processing on all or part of the captured image. In the present embodiment, the emphasis processing unit 44 performs emphasis processing on the emphasis area.
 Emphasis processing is processing that makes the emphasis area stand out within the peripheral area. In the present embodiment, the emphasis processing applied to the emphasis area includes at least one of enlarging the image, colorizing the image, increasing the gradation value of the image, or converting the display format of the image.
 Colorizing the image is processing that converts a grayscale image into a color image. For example, when the peripheral area including the emphasis area has been converted to grayscale, colorization converts the emphasis area back into a color image. Colorization also includes at least one of filling the emphasis area with a predetermined color or coloring the boundary of the emphasis area with a predetermined color.
 Increasing the gradation value of the image is processing that increases the number of gradations of the image. For example, when the image corresponding to the peripheral area has 128 gradations, the image corresponding to the emphasis area may be given 256 gradations. Typically, the gradation value of the emphasis area is less than or equal to that of the attention area, although this is not limiting.
 Converting the display format of the image is processing that outputs the image in a different display format. For example, when the peripheral area has been converted to a painterly (animation-like) style, the emphasis area may be output in the same display format as the attention area, that is, as captured. Of course, this is not limiting.
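 For example, the combination of grayscale conversion and color filling described above might be sketched as follows (a minimal, non-limiting Python/OpenCV illustration):

    import cv2

    def emphasize_region(bgr_image, emphasis_mask, fill_bgr=(0, 0, 255)):
        # Reduce the peripheral content to grayscale, then fill the emphasis
        # area with a conspicuous color (red here, as for a bleeding site).
        gray = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2GRAY)
        out = cv2.cvtColor(gray, cv2.COLOR_GRAY2BGR)
        out[emphasis_mask] = fill_bgr
        return out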
 The information amount control unit 45 controls the amount of information of the captured image.
 The amount of information is typically defined by the amount of image data of the captured image. For example, the amount of information can be defined by the number of bits of the image, the number of pixels of the image, and so on. It may also be defined based on gradation values and the like.
 Reducing the amount of information includes any processing that reduces the amount of image data of the captured image. For example, as processing for reducing the amount of information in the peripheral area, reduction (shrinking) of the image, grayscale conversion, reduction of the gradation value, conversion of the display format, cropping of the image, and so on may be executed.
 Cropping is processing that hides part of an image. For example, a 4K image may be cut out from an 8K captured image. Cropping also includes the case where, when the enlarged portion of the attention area overlaps the peripheral area, part of the peripheral area becomes hidden by the superimposition.
 When the captured image is a three-dimensional image, converting the peripheral area from a three-dimensional image to a two-dimensional image is also included in the processing for reducing the amount of information in the peripheral area.
 In the present embodiment, the emphasis processing unit 44 emphasizes the emphasis area, and the information amount control unit 45 reduces the amount of information in the peripheral area. The amount of information in the peripheral area is therefore reduced in such a way that the emphasis area remains emphasized.
 In addition, any other processing capable of reducing the amount of information in the peripheral area while keeping the emphasis area emphasized may be executed.
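 As a minimal, non-limiting sketch, several of these reductions can be combined as follows (the scale and gradation values are illustrative assumptions):

    import cv2

    def reduce_information(bgr_image, scale=0.5, levels=128):
        # Spatial reduction: display the content in a smaller pixel region.
        small = cv2.resize(bgr_image, None, fx=scale, fy=scale,
                           interpolation=cv2.INTER_AREA)
        # Grayscale conversion removes the color channels.
        gray = cv2.cvtColor(small, cv2.COLOR_BGR2GRAY)
        # Gradation reduction: quantize 256 levels down to `levels`.
        step = 256 // levels
        return (gray // step) * step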
 The image composition unit 46 electronically composites the captured image. In the present embodiment, the image composition unit 46 composites the attention area enlarged by the enlargement processing unit 42, the emphasis area emphasized by the emphasis processing unit 44, and the peripheral area whose amount of information has been reduced by the information amount control unit 45.
 The resulting composite image is output to the display device 21 via the communication unit 28. The operator 32 can perform the treatment while checking the composite image displayed on the display device 21.
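 The composition step might be sketched as follows (a minimal, non-limiting illustration; both inputs are assumed to be 3-channel BGR images, and the canvas size and centered placement are assumptions):

    import cv2
    import numpy as np

    def compose(enlarged_attention, reduced_peripheral, canvas_hw=(1080, 1920)):
        h, w = canvas_hw
        # The reduced peripheral image is used as the background of the canvas.
        canvas = cv2.resize(reduced_peripheral, (w, h))
        # The enlarged attention area is superimposed at the center; peripheral
        # pixels hidden underneath it count as "cropped away" information.
        ah, aw = enlarged_attention.shape[:2]
        y0, x0 = (h - ah) // 2, (w - aw) // 2
        canvas[y0:y0 + ah, x0:x0 + aw] = enlarged_attention
        return canvas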
 The information amount control unit 45 may reduce the amount of information only in the part of the peripheral area other than the emphasis area. In that case, the emphasis area is, as a result, emphasized within the peripheral area. Processing that reduces the amount of information only in the part of the peripheral area other than the emphasis area is therefore included in the emphasis processing for the emphasis area.
 Moreover, when the amount of information is reduced only in the part of the peripheral area other than the emphasis area, the amount of information of the peripheral area as a whole decreases. Such processing is therefore also included in processing that reduces the amount of information of the peripheral area in such a way that the emphasis area is emphasized.
 FIG. 4 is a flowchart showing an example of the setting and processing of each area. FIG. 5 is a schematic view showing an example of setting the areas of a captured image of the treatment target.
 A captured image 50 of the patient 34 photographed by the endoscope 1 is input (step 101). For example, as shown in FIG. 5, a captured image 50 of the surgical site in the body cavity of the patient 34 is displayed on the display device 21.
 The display device 21 displays, as the captured image 50, the inside of the body cavity of the patient 34 including the lesion photographed by the endoscope 1, the hand of the operator 32, a surgical tool 51, a bleeding site 52, and the like. Various information related to the surgery, such as physical information on the patient 34 input via the input device 24 and information about the surgical procedure, may also be displayed on the captured image 50.
 The attention area setting unit 41 sets an attention area 53 and a peripheral area 54 in the input captured image 50 (step 102). In the present embodiment, as shown in FIG. 5, the area where the operator 32 is performing the treatment is set as the attention area 53, and the remaining area is set as the peripheral area 54. In the present embodiment, the attention area 53 and the peripheral area 54 are separated and processed separately.
 FIG. 6 is a schematic view showing an example in which the attention area 53 has been enlarged.
 As shown in FIG. 6, the enlargement processing unit 42 enlarges the attention area 53 (step 103).
 In the present disclosure, enlarging an image means displaying the content of the image in an area larger than its original display area (pixel region).
 In the present embodiment, the method of enlarging the attention area 53 is not limited. For example, the entire attention area 53 may be enlarged at a predetermined magnification, or different magnifications may be assigned to each of a plurality of sub-regions within the attention area 53 and each sub-region enlarged according to its assigned magnification.
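 For example, the enlargement by electronic zoom might be sketched as follows (a minimal, non-limiting illustration using a single uniform magnification):

    import cv2

    def enlarge_attention(image, rect, magnification=2.0):
        # Electronic zoom: crop the attention area, then resize it so that its
        # content occupies a larger display region than the original.
        x, y, w, h = rect
        crop = image[y:y + h, x:x + w]
        return cv2.resize(crop, None, fx=magnification, fy=magnification,
                          interpolation=cv2.INTER_LINEAR)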
 The emphasis area setting unit 43 determines whether there is an emphasis area in the peripheral area 54 (step 104). In the present embodiment, as shown in FIG. 5, the emphasis area setting unit 43 sets the areas corresponding to the surgical tool 51 and the bleeding site 52 in the captured image 50 as emphasis areas.
 FIG. 7 is a schematic view showing an example in which emphasis processing has been executed on the emphasis areas in the peripheral area 54.
 As shown in FIG. 7, the emphasis processing unit 44 performs emphasis processing on the emphasis areas 55 (56 and 57) (step 105). In the present embodiment, the emphasis processing unit 44 fills the emphasis area 56 corresponding to the surgical tool 51 and the emphasis area 57 corresponding to the bleeding site 52 with predetermined colors.
 For example, the emphasis area 56 corresponding to the surgical tool 51 is filled with a color more conspicuous than the inside of the body cavity, while the emphasis area 57 corresponding to the bleeding site 52 is filled with red or a near-red color imitating the color of blood.
 In the present embodiment, the method of executing the emphasis processing is not limited. For example, the operator 32 may assign an arbitrary color to each emphasis area 55. The type of the surgical tool 51 may be recognized by image recognition or the like, and emphasis processing assigned per tool type may be executed. Different emphasis processing may also be executed when surgical tools 51 are close to each other.
 Nor is the method of setting the emphasis area 57 limited in the present embodiment. For example, a reliability score may be set for a region as an index of how much the image recognition result can be trusted, and whether the region is set as an emphasis area may be determined according to whether the reliability exceeds a predetermined threshold. A plurality of thresholds may be set so that the reliability of each emphasis area is judged in stages: for example, the emphasis area may be filled with a dark color when the reliability exceeds the highest threshold, or a fill-color luminance level may be assigned to each threshold, as sketched below.
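 For example, the staged judgment by thresholds might be sketched as follows (the threshold values and colors are illustrative assumptions):

    def fill_color_for_reliability(reliability):
        # Illustrative thresholds only: the higher the reliability of the
        # recognition result, the stronger the fill color in BGR.
        if reliability > 0.9:
            return (0, 0, 255)      # saturated red
        if reliability > 0.7:
            return (80, 80, 255)    # lighter red
        if reliability > 0.5:
            return (160, 160, 255)  # pale red
        return None                 # below every threshold: no emphasis fill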
 FIG. 8 is a schematic view showing an example in which control of the amount of information has been executed on the peripheral area 54.
 The information amount control unit 45 controls the amount of information of the peripheral area 54, including the emphasis areas on which the emphasis processing has been executed (step 106). In the present embodiment, reduction (shrinking) of the peripheral area 54 is executed as processing for reducing its amount of information.
 Reducing an image means displaying the content of the image in an area smaller than its original display area. The method of shrinking the peripheral area 54 is not limited. For example, the entire peripheral area 54 may be shrunk at a predetermined ratio, or different ratios may be assigned to each of a plurality of sub-regions within the peripheral area 54 and each sub-region shrunk according to its assigned ratio.
 In the present embodiment, grayscale conversion of the peripheral area 54 is also executed as processing for reducing its amount of information. That is, the emphasis processing unit 44 and the information amount control unit 45 execute both the processing that reduces the amount of information of the peripheral area 54 and the emphasis processing on the emphasis areas 55. As a result, the emphasis areas 55 can be sufficiently grasped even after the peripheral area 54 has been shrunk.
 The combination of information-reducing processing and emphasis processing on the emphasis areas 55 may be chosen arbitrarily. For example, the peripheral area 54 may be converted to a painterly style, with the surgical tool 51 and the bleeding site 52 emphasized in predetermined colors.
 In the examples shown in FIGS. 7 and 8, the attention area 53 is drawn with a dotted line. This is not limiting; the attention area 53 need not be drawn when the emphasis processing and the control of the amount of information are performed on the peripheral area 54.
 FIG. 9 is a schematic view showing an example of the composite image.
 As shown in FIG. 9, the image composition unit 46 generates a composite image 58 in which the attention area 53 and the peripheral area 54 are combined (step 107). The generated composite image 58 is output to the display device 21 (step 108). The operator 32 can thus perform the treatment while checking the composite image 58 displayed on the display device 21 in real time.
 In the examples shown in FIGS. 5 to 8, each of the processes of steps 101 to 106 is illustrated. In the present embodiment, the intermediate images including the attention area 53 and the peripheral area 54 shown in FIGS. 5 to 8 are not displayed on the display device 21; only the composite image 58 generated by the image composition unit 46 is displayed. This is not limiting, however, and the image including the attention area 53 and the peripheral area 54 at each of these processing stages may also be displayed on the display device 21.
 As described above, in the endoscopic surgery system 100 according to the present embodiment, the attention area 53 in the captured image 50 of the patient 34 is enlarged, and the amount of information in the peripheral area 54, which differs from the attention area 53 in the captured image 50, is reduced in such a way that the emphasis areas 55 within the peripheral area 54 are emphasized. This makes it possible to sufficiently grasp the target image.
 When delicate surgery is performed with a video endoscope or a microscope, the area of interest is sometimes magnified by electronic zoom. If, for example, a small lesion is magnified by electronic zoom, the field of view becomes narrower than in unmagnified display, and information outside the attention area can no longer be displayed. There has therefore been the problem that, during electronic zoom, instrument operation outside the attention area and conditions outside the attention area, such as bleeding around the lesion, cannot be confirmed.
 With the present technology, therefore, an attention area is set in the surgical field image and displayed enlarged, while the peripheral area other than the attention area is displayed reduced. Emphasis processing is applied to the emphasis areas so that the necessary information can still be extracted from the reduced display. As a result, even while the attention area is displayed enlarged, the surrounding conditions outside the attention area, such as the state of the surgical tools, the state of bleeding, and the condition of the tissue, can be grasped.
 <Other Embodiments>
 The present technology is not limited to the embodiments described above, and various other embodiments can be realized.
 In the embodiment above, the information amount control unit 45 reduced the amount of information of the peripheral area 54 after the emphasis processing unit 44 executed the emphasis processing on the emphasis areas 55. This is not limiting; the emphasis processing may instead be executed on the emphasis areas 55 after the amount of information of the peripheral area 54 has been reduced.
 In the embodiment above, the peripheral area 54 was shrunk by an arbitrary method. This is not limiting; the peripheral area 54 may be shrunk so that the entire peripheral area 54 can be displayed in the region of the captured image 50 outside the enlarged attention area 53. This makes it possible to display the composite image without any loss.
 In the embodiment above, the emphasis processing unit 44 executed the emphasis processing on all of the emphasis areas. This is not limiting; only surgical tools and bleeding sites that greatly affect the treatment may be emphasized, or the operator 32 may decide whether the emphasis processing is executed.
 In the embodiment above, the attention area setting unit 41 set the attention area 53 as a rectangle. This is not limiting; the attention area may be set in any shape. For example, a circular attention area with a predetermined radius may be set around the location where the treatment is being performed.
 In the embodiment above, the attention area 53 and the peripheral area 54 were separated and processed separately. This is not limiting; the processing may instead be executed on the attention area and the peripheral area within a single captured image.
 The technology according to the present disclosure can be applied to various products. For example, it may be applied to a microscopic surgery system used for so-called microsurgery, which is performed while magnifying and observing a minute part of a patient.
 FIG. 10 is a diagram showing an example of a schematic configuration of a microscopic surgery system 200 to which the technology according to the present disclosure can be applied. Referring to FIG. 10, the microscopic surgery system 200 is composed of a microscope device 201, a control device 217, and a display device 219. In the following description of the microscopic surgery system 200, "user" means the operator, an assistant, or any other medical staff using the microscopic surgery system 200.
 The microscope device 201 has a microscope unit 203 for magnifying and observing the observation target (the surgical site of the patient), an arm unit 209 that supports the microscope unit 203 at its tip, and a base unit 215 that supports the base end of the arm unit 209.
 The microscope unit 203 is composed of a substantially cylindrical tubular portion 205, an imaging unit (not shown) provided inside the tubular portion 205, and an operation unit 207 provided in a partial region of the outer periphery of the tubular portion 205. The microscope unit 203 is an electronic imaging type microscope unit (a so-called video microscope unit) that electronically captures an image with the imaging unit.
 筒状部205の下端の開口面には、内部の撮像部を保護するカバーガラスが設けられる。観察対象からの光(以下、観察光ともいう)は、当該カバーガラスを通過して、筒状部205の内部の撮像部に入射する。なお、筒状部205の内部には例えばLED等からなる光源が設けられてもよく、撮像時には、当該カバーガラスを介して、当該光源から観察対象に対して光が照射されてもよい。 A cover glass is provided on the opening surface at the lower end of the tubular portion 205 to protect the internal imaging portion. The light from the observation target (hereinafter, also referred to as observation light) passes through the cover glass and is incident on the imaging portion inside the tubular portion 205. A light source made of, for example, an LED or the like may be provided inside the tubular portion 205, and light may be emitted from the light source to the observation target through the cover glass at the time of imaging.
The imaging unit is composed of an optical system that condenses the observation light and an image sensor that receives the observation light condensed by the optical system. The optical system is configured by combining a plurality of lenses including a zoom lens and a focus lens, and its optical characteristics are adjusted so that the observation light forms an image on the light receiving surface of the image sensor. The image sensor receives the observation light and photoelectrically converts it, thereby generating a signal corresponding to the observation light, that is, an image signal corresponding to the observation image. As the image sensor, for example, one having a Bayer array and capable of color imaging is used. The image sensor may be any of various known image sensors, such as a CMOS image sensor or a CCD (Charge Coupled Device) image sensor. The image signal generated by the image sensor is transmitted to the control device 217 as RAW data. The transmission of this image signal may preferably be performed by optical communication. At the surgical site, the surgeon performs surgery while observing the state of the affected area through the captured image, so for safer and more reliable surgery the moving image of the surgical site needs to be displayed in as close to real time as possible. Transmitting the image signal by optical communication makes it possible to display the captured image with low latency.
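As an illustration of the development (demosaic) step later applied to such RAW data, assuming OpenCV and a BG-start Bayer layout (the actual sensor layout determines the correct conversion code):

    import cv2

    def develop(raw):
        # 'raw' is the single-channel Bayer mosaic received as RAW data.
        # COLOR_BayerBG2BGR assumes a BG-start layout, used here only
        # as an example; each layout has its own conversion code.
        return cv2.cvtColor(raw, cv2.COLOR_BayerBG2BGR)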
The imaging unit may have a drive mechanism that moves the zoom lens and the focus lens of its optical system along the optical axis. By appropriately moving the zoom lens and the focus lens with the drive mechanism, the magnification of the captured image and the focal length at the time of imaging can be adjusted. The imaging unit may also be equipped with various functions that can generally be provided in an electronic imaging type microscope unit, such as an AE function and an AF function.
The imaging unit may be configured as a so-called single-chip imaging unit having one image sensor, or as a so-called multi-chip imaging unit having a plurality of image sensors. When the imaging unit is of the multi-chip type, for example, image signals corresponding to R, G, and B may be generated by the respective image sensors, and a color image may be obtained by combining them. Alternatively, the imaging unit may be configured to have a pair of image sensors for acquiring image signals for the right eye and the left eye corresponding to stereoscopic vision (3D display). 3D display enables the operator to grasp the depth of the living tissue at the surgical site more accurately. When the imaging unit is of the multi-chip type, a plurality of optical systems may be provided corresponding to the respective image sensors.
The operation unit 207 is an input means composed of, for example, a cross lever or switches, and receives a user's operation input. For example, the user can input, via the operation unit 207, an instruction to change the magnification of the observation image and the focal length to the observation target. The magnification and the focal length can be adjusted by the drive mechanism of the imaging unit appropriately moving the zoom lens and the focus lens in accordance with the instruction. The user can also input, via the operation unit 207, an instruction to switch the operation mode of the arm unit 209 (the all-free mode and the fixed mode described later). When the user intends to move the microscope unit 203, it is assumed that the user moves the microscope unit 203 while gripping the tubular portion 205. Therefore, the operation unit 207 is preferably provided at a position where it can easily be operated with a finger while the user grips the tubular portion 205, so that it can be operated even while the user is moving the tubular portion 205.
The arm unit 209 is configured by a plurality of links (a first link 213a to a sixth link 213f) rotatably connected to each other by a plurality of joints (a first joint portion 211a to a sixth joint portion 211f).
The first joint portion 211a has a substantially cylindrical shape, and at its tip (lower end) supports the upper end of the tubular portion 205 of the microscope unit 203 so as to be rotatable about a rotation axis (first axis O1) parallel to the central axis of the tubular portion 205. Here, the first joint portion 211a can be configured so that the first axis O1 coincides with the optical axis of the imaging unit of the microscope unit 203. As a result, rotating the microscope unit 203 about the first axis O1 makes it possible to change the field of view so as to rotate the captured image.
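In image terms, a rotation about the first axis O1 is an in-plane rotation of the captured frame; a sketch (function and variable names are illustrative):

    import cv2

    def rotate_view(image, angle_deg):
        # Rotation about O1 (coincident with the optical axis) appears
        # as a pure rotation of the image about its centre.
        h, w = image.shape[:2]
        m = cv2.getRotationMatrix2D((w / 2, h / 2), angle_deg, 1.0)
        return cv2.warpAffine(image, m, (w, h))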
The first link 213a fixedly supports the first joint portion 211a at its tip. Specifically, the first link 213a is a rod-shaped member having a substantially L shape; one side on its tip side extends in a direction orthogonal to the first axis O1, and the end of that side is connected to the first joint portion 211a so as to abut the upper end of the outer periphery of the first joint portion 211a. The second joint portion 211b is connected to the end of the other side on the base end side of the substantially L shape of the first link 213a.
The second joint portion 211b has a substantially cylindrical shape, and at its tip supports the base end of the first link 213a so as to be rotatable about a rotation axis (second axis O2) orthogonal to the first axis O1. The tip of the second link 213b is fixedly connected to the base end of the second joint portion 211b.
The second link 213b is a rod-shaped member having a substantially L shape; one side on its tip side extends in a direction orthogonal to the second axis O2, and the end of that side is fixedly connected to the base end of the second joint portion 211b. The third joint portion 211c is connected to the other side on the base end side of the substantially L shape of the second link 213b.
The third joint portion 211c has a substantially cylindrical shape, and at its tip supports the base end of the second link 213b so as to be rotatable about a rotation axis (third axis O3) orthogonal to both the first axis O1 and the second axis O2. The tip of the third link 213c is fixedly connected to the base end of the third joint portion 211c. By rotating the configuration on the tip side including the microscope unit 203 about the second axis O2 and the third axis O3, the microscope unit 203 can be moved so as to change its position in the horizontal plane. In other words, controlling the rotation about the second axis O2 and the third axis O3 makes it possible to move the field of view of the captured image within a plane.
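Treating the O2/O3 chain as a two-link planar arm gives a toy model of this in-plane motion (link lengths l1 and l2 are placeholders; the disclosure gives no numeric geometry):

    import numpy as np

    def planar_position(l1, l2, th2, th3):
        # Horizontal position reached for rotations th2 about O2 and
        # th3 about O3, in the standard two-link planar-arm model.
        x = l1 * np.cos(th2) + l2 * np.cos(th2 + th3)
        y = l1 * np.sin(th2) + l2 * np.sin(th2 + th3)
        return x, y

Sweeping th2 and th3 traces the horizontal positions the microscope unit can reach, which is exactly the in-plane movement of the field of view described above.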
The third link 213c is configured so that its tip side has a substantially cylindrical shape, and the base end of the third joint portion 211c is fixedly connected to the tip of that cylindrical shape so that both have substantially the same central axis. The base end side of the third link 213c has a prismatic shape, and the fourth joint portion 211d is connected to its end.
The fourth joint portion 211d has a substantially cylindrical shape, and at its tip supports the base end of the third link 213c so as to be rotatable about a rotation axis (fourth axis O4) orthogonal to the third axis O3. The tip of the fourth link 213d is fixedly connected to the base end of the fourth joint portion 211d.
The fourth link 213d is a rod-shaped member that extends substantially linearly; it extends so as to be orthogonal to the fourth axis O4, and is fixedly connected to the fourth joint portion 211d so that the end of its tip abuts the substantially cylindrical side surface of the fourth joint portion 211d. The fifth joint portion 211e is connected to the base end of the fourth link 213d.
The fifth joint portion 211e has a substantially cylindrical shape, and on its tip side supports the base end of the fourth link 213d so as to be rotatable about a rotation axis (fifth axis O5) parallel to the fourth axis O4. The tip of the fifth link 213e is fixedly connected to the base end of the fifth joint portion 211e. The fourth axis O4 and the fifth axis O5 are rotation axes that can move the microscope unit 203 in the vertical direction. By rotating the configuration on the tip side including the microscope unit 203 about the fourth axis O4 and the fifth axis O5, the height of the microscope unit 203, that is, the distance between the microscope unit 203 and the observation target, can be adjusted.
The fifth link 213e is configured by combining a first member having a substantially L shape, one side of which extends in the vertical direction and the other side of which extends in the horizontal direction, with a rod-shaped second member that extends vertically downward from the horizontally extending portion of the first member. The base end of the fifth joint portion 211e is fixedly connected near the upper end of the vertically extending portion of the first member of the fifth link 213e. The sixth joint portion 211f is connected to the base end (lower end) of the second member of the fifth link 213e.
The sixth joint portion 211f has a substantially cylindrical shape, and on its tip side supports the base end of the fifth link 213e so as to be rotatable about a rotation axis (sixth axis O6) parallel to the vertical direction. The tip of the sixth link 213f is fixedly connected to the base end of the sixth joint portion 211f.
The sixth link 213f is a rod-shaped member extending in the vertical direction, and its base end is fixedly connected to the upper surface of the base unit 215.
The rotatable ranges of the first joint portion 211a to the sixth joint portion 211f are appropriately set so that the microscope unit 203 can perform a desired movement. As a result, in the arm unit 209 having the configuration described above, movement with a total of six degrees of freedom, three translational and three rotational, can be realized for the microscope unit 203. By configuring the arm unit 209 so that six degrees of freedom are realized for the movement of the microscope unit 203 in this way, the position and posture of the microscope unit 203 can be freely controlled within the movable range of the arm unit 209. Therefore, the surgical site can be observed from any angle, and surgery can be performed more smoothly.
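A generic forward-kinematics sketch for such a six-joint serial chain, assuming each joint rotates about its local z axis followed by a fixed link offset (the 4x4 offsets depend on the actual arm geometry, which the disclosure does not give numerically):

    import numpy as np

    def rot_z(theta):
        c, s = np.cos(theta), np.sin(theta)
        return np.array([[c, -s, 0, 0],
                         [s,  c, 0, 0],
                         [0,  0, 1, 0],
                         [0,  0, 0, 1]])

    def forward_kinematics(joint_angles, link_offsets):
        # Compose joint rotations and fixed link offsets into the pose
        # (position and orientation) of the microscope unit 203.
        t = np.eye(4)
        for theta, link in zip(joint_angles, link_offsets):
            t = t @ rot_z(theta) @ link
        return t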
The illustrated configuration of the arm unit 209 is merely an example; the number and shape (length) of the links constituting the arm unit 209, as well as the number of joints, their arrangement positions, the directions of the rotation axes, and the like, may be appropriately designed so that a desired degree of freedom can be realized. For example, as described above, in order to move the microscope unit 203 freely, the arm unit 209 is preferably configured to have six degrees of freedom, but the arm unit 209 may be configured to have a larger degree of freedom (that is, redundant degrees of freedom). When redundant degrees of freedom exist, the posture of the arm unit 209 can be changed while the position and posture of the microscope unit 203 are fixed. Therefore, control that is more convenient for the operator can be realized, for example controlling the posture of the arm unit 209 so that it does not interfere with the field of view of the operator looking at the display device 219.
Here, the first joint portion 211a to the sixth joint portion 211f may be provided with actuators equipped with a drive mechanism such as a motor and an encoder for detecting the rotation angle at each joint. By appropriately controlling the drive of each actuator provided in the first joint portion 211a to the sixth joint portion 211f with the control device 217, the posture of the arm unit 209, that is, the position and posture of the microscope unit 203, can be controlled. Specifically, the control device 217 can grasp the current posture of the arm unit 209 and the current position and posture of the microscope unit 203 based on the information about the rotation angle of each joint detected by the encoders. Using this information, the control device 217 calculates a control value for each joint (for example, a rotation angle or a generated torque) that realizes the movement of the microscope unit 203 in response to an operation input from the user, and drives the drive mechanism of each joint in accordance with the control value. At this time, the control method of the arm unit 209 by the control device 217 is not limited, and various known control methods such as force control or position control may be applied.
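As one concrete instance of the "various known control methods" mentioned, a pure proportional position controller over the encoder readings could look like this (the gain and the interfaces are assumptions for illustration):

    def joint_commands(target_angles, encoder_angles, kp=2.0):
        # Control value per joint: proportional to the angle error
        # reported by that joint's encoder.
        return [kp * (t - e) for t, e in zip(target_angles, encoder_angles)]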
For example, when the operator appropriately performs an operation input via an input device (not shown), the drive of the arm unit 209 may be appropriately controlled by the control device 217 in accordance with the operation input, and the position and posture of the microscope unit 203 may thereby be controlled. With this control, the microscope unit 203 can be moved from an arbitrary position to another arbitrary position and then fixedly supported at the position after the movement. In consideration of the operator's convenience, the input device is preferably one that can be operated even while the operator holds a surgical tool, such as a foot switch. The operation input may also be performed in a non-contact manner based on gesture detection or line-of-sight detection using a wearable device or a camera provided in the operating room. This allows even a user belonging to the clean area to operate a device belonging to the unclean area with a higher degree of freedom. Alternatively, the arm unit 209 may be operated by a so-called master-slave method. In this case, the arm unit 209 can be remotely operated by the user via an input device installed at a location away from the operating room.
When force control is applied, so-called power assist control may be performed, in which the actuators of the first joint portion 211a to the sixth joint portion 211f are driven so that the arm unit 209 receives an external force from the user and moves smoothly following that external force. As a result, when the user grips the microscope unit 203 and tries to move its position directly, the microscope unit 203 can be moved with a relatively light force. Therefore, the microscope unit 203 can be moved more intuitively with a simpler operation, improving the user's convenience.
The drive of the arm unit 209 may also be controlled so that it performs a pivot operation. Here, the pivot operation is an operation of moving the microscope unit 203 so that the optical axis of the microscope unit 203 always faces a predetermined point in space (hereinafter referred to as a pivot point). The pivot operation makes it possible to observe the same observation position from various directions, enabling a more detailed observation of the affected area. When the microscope unit 203 is configured so that its focal length cannot be adjusted, the pivot operation is preferably performed with the distance between the microscope unit 203 and the pivot point fixed. In this case, the distance between the microscope unit 203 and the pivot point may be adjusted to the fixed focal length of the microscope unit 203. As a result, the microscope unit 203 moves on a hemisphere centered on the pivot point with a radius corresponding to the focal length, and a clear captured image is obtained even when the observation direction is changed. On the other hand, when the microscope unit 203 is configured so that its focal length can be adjusted, the pivot operation may be performed with the distance between the microscope unit 203 and the pivot point variable. In this case, for example, the control device 217 may calculate the distance between the microscope unit 203 and the pivot point based on the information about the rotation angle of each joint detected by the encoders, and automatically adjust the focal length of the microscope unit 203 based on the calculation result. Alternatively, if the microscope unit 203 is provided with an AF function, the focal length may be automatically adjusted by the AF function every time the distance between the microscope unit 203 and the pivot point changes due to the pivot operation.
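For the variable-distance case, the quantity the control device 217 needs is just the camera-to-pivot distance; a sketch (the camera position is assumed to come from forward kinematics, as in the sketch above, and the names are illustrative):

    import numpy as np

    def pivot_focus_target(camera_pos, pivot_point):
        # During a pivot motion the working distance equals the distance
        # to the pivot point, so this value can be fed to the focus
        # drive or to the AF function as its target.
        return np.linalg.norm(np.asarray(camera_pos, dtype=float) -
                              np.asarray(pivot_point, dtype=float))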
The first joint portion 211a to the sixth joint portion 211f may also be provided with brakes that restrain their rotation. The operation of the brakes can be controlled by the control device 217. For example, when the position and posture of the microscope unit 203 are to be fixed, the control device 217 operates the brakes of the respective joints. As a result, the posture of the arm unit 209, that is, the position and posture of the microscope unit 203, can be fixed without driving the actuators, so power consumption can be reduced. When the position and posture of the microscope unit 203 are to be moved, the control device 217 may release the brakes of the respective joints and drive the actuators in accordance with a predetermined control method.
Such brake operation can be performed in response to an operation input by the user via the operation unit 207 described above. When the user wants to move the position and posture of the microscope unit 203, the user operates the operation unit 207 to release the brakes of the respective joints. As a result, the operation mode of the arm unit 209 shifts to a mode in which each joint can rotate freely (all-free mode). When the user wants to fix the position and posture of the microscope unit 203, the user operates the operation unit 207 to engage the brakes of the respective joints. As a result, the operation mode of the arm unit 209 shifts to a mode in which the rotation of each joint is restrained (fixed mode).
The control device 217 comprehensively controls the operation of the microscopic surgery system 200 by controlling the operations of the microscope device 201 and the display device 219. For example, the control device 217 controls the drive of the arm unit 209 by operating the actuators of the first joint portion 211a to the sixth joint portion 211f in accordance with a predetermined control method. Further, for example, the control device 217 changes the operation mode of the arm unit 209 by controlling the operation of the brakes of the first joint portion 211a to the sixth joint portion 211f. Further, for example, the control device 217 generates image data for display by applying various kinds of signal processing to the image signal acquired by the imaging unit of the microscope unit 203 of the microscope device 201, and causes the display device 219 to display that image data. As the signal processing, various kinds of known signal processing may be performed, such as development processing (demosaic processing), image quality enhancement processing (band enhancement processing, super-resolution processing, NR (noise reduction) processing, and/or camera shake correction processing), and/or enlargement processing (that is, electronic zoom processing).
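Of the listed signal-processing steps, electronic zoom is the simplest to sketch: crop around the centre, then upscale back to the original size (the centring and interpolation choices are illustrative, not prescribed by the disclosure):

    import cv2

    def electronic_zoom(image, factor):
        # Crop a 1/factor-sized central window, then resize it back up.
        h, w = image.shape[:2]
        ch, cw = int(h / factor), int(w / factor)
        y0, x0 = (h - ch) // 2, (w - cw) // 2
        crop = image[y0:y0 + ch, x0:x0 + cw]
        return cv2.resize(crop, (w, h), interpolation=cv2.INTER_LINEAR)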
The communication between the control device 217 and the microscope unit 203, and the communication between the control device 217 and the first joint portion 211a to the sixth joint portion 211f, may be wired or wireless. In the case of wired communication, communication by electric signals may be performed, or optical communication may be performed. In this case, the transmission cable used for wired communication may be configured as an electric signal cable, an optical fiber, or a composite cable of these, depending on the communication method. In the case of wireless communication, on the other hand, there is no need to lay a transmission cable in the operating room, so the situation in which the movement of medical staff in the operating room is hindered by a transmission cable can be eliminated.
The control device 217 may be a processor such as a CPU or a GPU, or a microcomputer or a control board on which a processor and a storage element such as a memory are mounted together. The various functions described above can be realized by the processor of the control device 217 operating in accordance with a predetermined program. In the illustrated example, the control device 217 is provided as a device separate from the microscope device 201, but the control device 217 may be installed inside the base unit 215 of the microscope device 201 and configured integrally with the microscope device 201. Alternatively, the control device 217 may be composed of a plurality of devices. For example, microcomputers, control boards, and the like may be disposed in the microscope unit 203 and in each of the first joint portion 211a to the sixth joint portion 211f of the arm unit 209, and connected to one another so as to be able to communicate, thereby realizing functions similar to those of the control device 217.
The display device 219 is provided in the operating room and, under the control of the control device 217, displays an image corresponding to the image data generated by the control device 217. That is, the display device 219 displays the image of the surgical site captured by the microscope unit 203. The display device 219 may display, in place of or together with the image of the surgical site, various kinds of information related to the surgery, such as physical information of the patient and information about the surgical procedure. In this case, the display of the display device 219 may be switched appropriately by an operation by the user. Alternatively, a plurality of display devices 219 may be provided, and the image of the surgical site and the various kinds of information related to the surgery may be displayed on each of the plurality of display devices 219. As the display device 219, various known display devices, such as a liquid crystal display device or an EL (Electro Luminescence) display device, may be applied.
FIG. 11 is a diagram showing a state of surgery using the microscopic surgery system 200 shown in FIG. 10. FIG. 11 schematically shows the surgeon 221 performing surgery on a patient 225 on a patient bed 223 using the microscopic surgery system 200. In FIG. 11, for simplicity, the control device 217 is omitted from the configuration of the microscopic surgery system 200, and the microscope device 201 is shown in a simplified manner.
As shown in FIG. 11, at the time of surgery, the image of the surgical site captured by the microscope device 201 is displayed in an enlarged manner, using the microscopic surgery system 200, on the display device 219 installed on the wall of the operating room. The display device 219 is installed at a position facing the surgeon 221, and the surgeon 221 performs various treatments on the surgical site, such as excision of the affected area, while observing the state of the surgical site through the image displayed on the display device 219.
An example of the microscopic surgery system 200 to which the technology according to the present disclosure can be applied has been described above. Although the microscopic surgery system 200 has been described here as an example, the system to which the technology according to the present disclosure can be applied is not limited to this example. For example, the microscope device 201 can also function as a support arm device that supports, at its tip, another observation device or another surgical tool instead of the microscope unit 203. By supporting such an observation device or surgical tool with a support arm device, the position can be fixed more stably than when medical staff support it by hand, and the burden on the medical staff can be reduced. The technology according to the present disclosure may be applied to a support arm device that supports a configuration other than such a microscope unit.
In the present embodiment, the microscopic surgery system 200 corresponds to the treatment system. The microscope unit 203 corresponds to an imaging unit capable of imaging the treatment target, and is configured as a part of the microscope device 201. The control device 217 has functions similar to those of the image processing unit 29 of the CCU 20. That is, the control device 217 corresponds to an embodiment of the image processing device according to the present technology.
The treatment system, image processing device, image processing method, and program according to the present technology may be executed, and the image processing device according to the present technology may be constructed, by a plurality of computers that are communicably connected via a network or the like and that operate in conjunction with one another.
That is, the treatment system, image processing device, image processing method, and program according to the present technology can be executed not only in a computer system composed of a single computer but also in a computer system in which a plurality of computers operate in conjunction with one another. In the present disclosure, a system means a set of a plurality of components (devices, modules (parts), etc.), and it does not matter whether or not all the components are in the same housing. Therefore, a plurality of devices housed in separate housings and connected via a network, and one device in which a plurality of modules are housed in one housing, are both systems.
Execution of the treatment system, image processing device, image processing method, and program according to the present technology by a computer system includes both the case where, for example, the setting of the area of interest, the determination of the emphasis area, and the control of the amount of information are executed by a single computer, and the case where each process is executed by different computers.
Execution of each process by a predetermined computer also includes causing another computer to execute part or all of the process and acquiring the result.
That is, the treatment system, image processing device, image processing method, and program according to the present technology can also be applied to a cloud computing configuration in which one function is shared and jointly processed by a plurality of devices via a network.
The configurations of the attention area setting unit, the emphasis area setting unit, the emphasis processing unit, the information amount control unit, and the like, and the control flows of the communication system and the like described with reference to the drawings, are merely one embodiment and can be arbitrarily modified without departing from the spirit of the present technology. That is, any other configurations, algorithms, and the like for implementing the present technology may be adopted.
In the present disclosure, concepts that define shape, size, positional relationship, state, and the like, such as "central," "same," "orthogonal," "parallel," "columnar," and "cylindrical," include "substantially central," "substantially the same," "substantially orthogonal," "substantially parallel," "substantially columnar," "substantially cylindrical," and the like.
For example, states included in a predetermined range (for example, a range of ±10%) based on "completely central," "completely the same," "completely orthogonal," "completely parallel," "completely columnar," "completely cylindrical," and the like are also included.
The effects described in the present disclosure are merely examples and are not limited, and other effects may also be obtained. The description of the plurality of effects above does not necessarily mean that those effects are exhibited simultaneously; it means that at least one of the effects described above is obtained depending on conditions and the like, and of course effects not described in the present disclosure may also be exhibited.
The present technology can also adopt the following configurations.
(1) A treatment system including:
an imaging unit capable of photographing a treatment target; and
an image processing unit that enlarges a first region in a captured image captured by the imaging unit, and reduces an amount of information in a second region of the captured image different from the first region so that a predetermined region within the second region is emphasized.
(2) The treatment system according to (1), in which
the image processing unit executes a process of reducing the amount of information in the second region and an emphasis process on the predetermined region.
(3) The treatment system according to (2), in which
the image processing unit executes, as the process of reducing the amount of information in the second region, at least one of reduction of an image, grayscale conversion of an image, reduction of gradation values of an image, conversion of a display format of an image, or cropping of an image.
(4) The treatment system according to (2) or (3), in which
the image processing unit executes, as the emphasis process on the predetermined region, at least one of enlargement of an image, colorization of an image, increase of gradation values of an image, or conversion of a display format of an image.
(5) The treatment system according to (3) or (4), in which
the conversion of the display format includes converting an image into a painterly style.
(6) The treatment system according to any one of (2) to (5), in which
the captured image includes a three-dimensional image, and
the image processing unit converts the second region into a two-dimensional image as the process of reducing the amount of information in the second region.
(7) The treatment system according to (3), in which
the image processing unit reduces the second region so that the entire second region can be displayed in a region of the captured image other than the enlarged first region.
(8) The treatment system according to (2) or (7), in which
the image processing unit executes the emphasis process on the predetermined region after reducing the amount of information in the second region.
(9) The treatment system according to any one of (2) to (7), in which
the image processing unit reduces the amount of information in the second region after executing the emphasis process on the predetermined region.
(10) The treatment system according to (1), in which
the image processing unit reduces the amount of information of only a region of the second region other than the predetermined region.
(11) The treatment system according to any one of (1) to (10), further including
a region setting unit that sets the predetermined region within the second region.
(12) The treatment system according to (11), in which
the region setting unit sets, as the predetermined region, a region corresponding to a portion that requires attention when the treatment is performed.
(13) The treatment system according to (11) or (12), in which
the region setting unit sets, as the predetermined region, a region corresponding to at least one of an instrument used for the treatment, a bleeding site, or a site where damage should be avoided.
(14) The treatment system according to any one of (1) to (13), further including
an endoscope, in which
the imaging unit is configured as a part of the endoscope.
(15) The treatment system according to any one of (1) to (13), further including
a surgical microscope, in which
the imaging unit is configured as a part of the surgical microscope.
(16) The treatment system according to any one of (1) to (15), in which
the treatment includes surgery.
(17) The treatment system according to any one of (1) to (16), in which
the treatment target includes a living body.
(18) An image processing device including:
an image acquisition unit that acquires an image including a treatment target; and
an image processing unit that enlarges a first region in the image, and reduces an amount of information in a second region of the image different from the first region so that a predetermined region within the second region is emphasized.
(19) An image processing method executed by a computer system, including:
acquiring an image including a treatment target; and
enlarging a first region in the image, and reducing an amount of information in a second region of the image different from the first region so that a predetermined region within the second region is emphasized.
(20) A program that causes a computer system to execute:
a step of acquiring an image including a treatment target; and
a step of enlarging a first region in the image, and reducing an amount of information in a second region of the image different from the first region so that a predetermined region within the second region is emphasized.
1 ... Endoscope
3 ... Camera head
9 ... Surgical tool
20 ... CCU
29 ... Image processing unit
40 ... Image acquisition unit
42 ... Enlargement processing unit
44 ... Emphasis processing unit
45 ... Information amount control unit
50 ... Captured image
53 ... Area of interest
54 ... Peripheral area
55 ... Emphasis area
100 ... Endoscopic surgery system
200 ... Microscopic surgery system
217 ... Control device

Claims (20)

1. A treatment system comprising:
an imaging unit capable of photographing a treatment target; and
an image processing unit that enlarges a first region in a captured image captured by the imaging unit, and reduces an amount of information in a second region of the captured image different from the first region so that a predetermined region within the second region is emphasized.

2. The treatment system according to claim 1, wherein
the image processing unit executes a process of reducing the amount of information in the second region and an emphasis process on the predetermined region.

3. The treatment system according to claim 2, wherein
the image processing unit executes, as the process of reducing the amount of information in the second region, at least one of reduction of an image, grayscale conversion of an image, reduction of gradation values of an image, conversion of a display format of an image, or cropping of an image.

4. The treatment system according to claim 2, wherein
the image processing unit executes, as the emphasis process on the predetermined region, at least one of enlargement of an image, colorization of an image, increase of gradation values of an image, or conversion of a display format of an image.

5. The treatment system according to claim 3, wherein
the conversion of the display format includes converting an image into a painterly style.

6. The treatment system according to claim 2, wherein
the captured image includes a three-dimensional image, and
the image processing unit converts the second region into a two-dimensional image as the process of reducing the amount of information in the second region.

7. The treatment system according to claim 3, wherein
the image processing unit reduces the second region so that the entire second region can be displayed in a region of the captured image other than the enlarged first region.

8. The treatment system according to claim 2, wherein
the image processing unit executes the emphasis process on the predetermined region after reducing the amount of information in the second region.

9. The treatment system according to claim 2, wherein
the image processing unit reduces the amount of information in the second region after executing the emphasis process on the predetermined region.

10. The treatment system according to claim 1, wherein
the image processing unit reduces the amount of information of only a region of the second region other than the predetermined region.

11. The treatment system according to claim 1, further comprising
a region setting unit that sets the predetermined region within the second region.

12. The treatment system according to claim 11, wherein
the region setting unit sets, as the predetermined region, a region corresponding to a portion that requires attention when the treatment is performed.

13. The treatment system according to claim 11, wherein
the region setting unit sets, as the predetermined region, a region corresponding to at least one of an instrument used for the treatment, a bleeding site, or a site where damage should be avoided.

14. The treatment system according to claim 1, further comprising
an endoscope, wherein
the imaging unit is configured as a part of the endoscope.

15. The treatment system according to claim 1, further comprising
a surgical microscope, wherein
the imaging unit is configured as a part of the surgical microscope.

16. The treatment system according to claim 1, wherein
the treatment includes surgery.

17. The treatment system according to claim 1, wherein
the treatment target includes a living body.

18. An image processing device comprising:
an image acquisition unit that acquires an image including a treatment target; and
an image processing unit that enlarges a first region in the image, and reduces an amount of information in a second region of the image different from the first region so that a predetermined region within the second region is emphasized.

19. An image processing method executed by a computer system, comprising:
acquiring an image including a treatment target; and
enlarging a first region in the image, and reducing an amount of information in a second region of the image different from the first region so that a predetermined region within the second region is emphasized.

20. A program that causes a computer system to execute:
a step of acquiring an image including a treatment target; and
a step of enlarging a first region in the image, and reducing an amount of information in a second region of the image different from the first region so that a predetermined region within the second region is emphasized.
PCT/JP2020/031967 2019-09-02 2020-08-25 Operation system, image processing device, image processing method, and program WO2021044900A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2019159457 2019-09-02
JP2019-159457 2019-09-02

Publications (1)

Publication Number Publication Date
WO2021044900A1 true WO2021044900A1 (en) 2021-03-11

Family

ID=74852763

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2020/031967 WO2021044900A1 (en) 2019-09-02 2020-08-25 Operation system, image processing device, image processing method, and program

Country Status (1)

Country Link
WO (1) WO2021044900A1 (en)

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH07334665A (en) * 1994-06-06 1995-12-22 Ge Yokogawa Medical Syst Ltd Method and device for picture display
JP2012245157A (en) * 2011-05-27 2012-12-13 Olympus Corp Endoscope apparatus
JP2013042301A (en) * 2011-08-12 2013-02-28 Casio Comput Co Ltd Image processor, image processing method, and program
JP2013507182A (en) * 2009-10-07 2013-03-04 インテュイティブ サージカル オペレーションズ, インコーポレイテッド Method and apparatus for displaying enhanced imaging data on a clinical image
JP2013066241A (en) * 2011-06-09 2013-04-11 Toshiba Corp Image processing system and method
WO2017115442A1 (en) * 2015-12-28 2017-07-06 オリンパス株式会社 Image processing apparatus, image processing method, and image processing program
WO2018131141A1 (en) * 2017-01-13 2018-07-19 オリンパス株式会社 Endoscope image processing device and endoscope image processing method
WO2019012911A1 (en) * 2017-07-14 2019-01-17 富士フイルム株式会社 Medical image processing device, endoscope system, diagnosis assistance device, and medical operation assistance device



Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 20860881; Country of ref document: EP; Kind code of ref document: A1)
NENP Non-entry into the national phase (Ref country code: DE)
122 Ep: pct application non-entry in european phase (Ref document number: 20860881; Country of ref document: EP; Kind code of ref document: A1)
NENP Non-entry into the national phase (Ref country code: JP)