WO2021044900A1 - Surgical system, image processing device, image processing method, and program - Google Patents


Info

Publication number
WO2021044900A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
region
treatment system
unit
area
Prior art date
Application number
PCT/JP2020/031967
Other languages
English (en)
Japanese (ja)
Inventor
真人 山根
雄生 杉江
Original Assignee
Sony Corporation
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sony Corporation
Publication of WO2021044900A1

Links

Images

Classifications

    • A — HUMAN NECESSITIES
    • A61 — MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B — DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B1/00Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
    • A61B1/04Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor combined with photographic or television appliances
    • A61B1/045Control thereof

Definitions

  • This technology relates to treatment systems, image processing devices, image processing methods, and programs that can be applied to surgery, examinations, etc.
  • The purpose of the present technology is to provide a treatment system, an image processing device, an image processing method, and a program that make it possible to sufficiently grasp the target image.
  • the treatment system includes an imaging unit and an image processing unit.
  • the imaging unit can photograph the object to be treated.
  • The image processing unit enlarges a first region in the captured image captured by the imaging unit, and reduces the amount of information in a second region, different from the first region in the captured image, so that a predetermined region within the second region is emphasized.
  • the first area in the captured image in which the treatment target is captured is enlarged.
  • the amount of information in the second region, which is different from the first region in the captured image, is reduced so that the predetermined region in the second region is emphasized. This makes it possible to fully grasp the target image.
  • the image processing apparatus includes an image acquisition unit and an image processing unit.
  • the image acquisition unit acquires an image including the treatment target.
  • The image processing unit enlarges a first region in the image, and reduces the amount of information in a second region, different from the first region in the image, so that a predetermined region in the second region is emphasized.
  • the image processing method is an image processing method executed by a computer system, and includes acquiring an image including a treatment target.
  • In the image processing method, the first region in the image is enlarged, and the amount of information in the second region, different from the first region in the image, is reduced so that the predetermined region in the second region is emphasized.
  • A program according to one embodiment causes a computer system to execute the following steps: acquiring an image including a treatment target; and enlarging the first region in the image and reducing the amount of information in the second region, different from the first region in the image, so that the predetermined region in the second region is emphasized.
  • FIG. 1 is a diagram showing an example of a schematic configuration of an endoscopic surgery system 100 to which the technique according to the present disclosure can be applied.
  • FIG. 1 shows how an operator (doctor) 32 is performing surgery on a patient 34 on a patient bed 33 using the endoscopic surgery system 100.
  • The endoscopic surgery system 100 is composed of an endoscope 1, other surgical tools 9, a support arm device 14 that supports the endoscope 1, and a cart 19 on which various devices for endoscopic surgery are mounted.
  • In endoscopic surgery, trocars 13a to 13d are punctured into the abdominal wall.
  • The lens barrel 2 of the endoscope 1 and the other surgical tools 9 are inserted into the body cavity of the patient 34 through the trocars 13a to 13d.
  • In the illustrated example, a pneumoperitoneum tube 10, an energy treatment tool 11, and forceps 12 are inserted into the body cavity of the patient 34 as the other surgical tools 9.
  • the energy treatment tool 11 is a treatment tool that cuts and peels tissue, seals a blood vessel, or the like by using a high-frequency current or ultrasonic vibration.
  • The surgical tools 9 shown in the figure are merely an example, and various surgical tools generally used in endoscopic surgery, such as tweezers and a retractor, may be used as the surgical tools 9.
  • the image of the surgical site in the body cavity of the patient 34 taken by the endoscope 1 is displayed on the display device 21.
  • The surgeon 32 uses the energy treatment tool 11 and the forceps 12 to perform a procedure such as excising the affected part while viewing the image of the surgical site displayed on the display device 21 in real time.
  • the pneumoperitoneum tube 10, the energy treatment tool 11, and the forceps 12 are supported by the surgeon 32, an assistant, or the like during the operation.
  • the support arm device 14 includes an arm portion 16 extending from the base portion 15.
  • the arm portion 16 is composed of joint portions 17a, 17b, 17c, and links 18a, 18b, and is driven by control from the arm control device 23.
  • The endoscope 1 is supported by the arm portion 16, and its position and posture are controlled. Thereby, the position of the endoscope 1 can be stably fixed.
  • the endoscope 1 is composed of a lens barrel 2 in which a region having a predetermined length from the tip is inserted into the body cavity of the patient 34, and a camera head 3 connected to the base end of the lens barrel 2.
  • In the illustrated example, the endoscope 1 is configured as a so-called rigid scope having a rigid lens barrel 2, but the endoscope 1 may be configured as a so-called flexible scope having a flexible lens barrel 2.
  • the tip of the lens barrel 2 is provided with an opening in which an objective lens is fitted.
  • A light source device 22 is connected to the endoscope 1, and the light generated by the light source device 22 is guided to the tip of the lens barrel by a light guide extending inside the lens barrel 2, and is irradiated toward the observation target in the body cavity of the patient 34 through the objective lens.
  • The endoscope 1 may be a forward-viewing endoscope, an oblique-viewing endoscope, or a side-viewing endoscope.
  • An optical system and an image sensor are provided inside the camera head 3, and the reflected light (observation light) from the observation target is focused on the image sensor by the optical system.
  • the observation light is photoelectrically converted by the image sensor, and an electric signal corresponding to the observation light, that is, an image signal corresponding to the observation image is generated.
  • The image signal is transmitted as RAW data to the camera control unit (CCU: Camera Control Unit) 20.
  • the camera head 3 is equipped with a function of adjusting the magnification and the focal length by appropriately driving the optical system.
  • the camera head 3 may be provided with a plurality of image pickup elements.
  • a plurality of relay optical systems are provided inside the lens barrel 2 in order to guide the observation light to each of the plurality of image pickup elements.
  • the CCU 20 is composed of a CPU (Central Processing Unit), a GPU (Graphics Processing Unit), and the like, and controls the operations of the endoscope 1 and the display device 21 in an integrated manner. Specifically, the CCU 20 performs various image processing for displaying an image based on the image signal, such as development processing (demosaic processing), on the image signal received from the camera head 3. The CCU 20 provides the display device 21 with the image signal that has undergone the image processing. Further, the CCU 20 transmits a control signal to the camera head 3 and controls the driving thereof.
  • the control signal may include information about imaging conditions such as magnification and focal length.
  • the display device 21 displays an image based on the image signal processed by the CCU 20 under the control of the CCU 20.
  • When the endoscope 1 is compatible with high-resolution imaging such as 4K (3840 horizontal pixels x 2160 vertical pixels) or 8K (7680 horizontal pixels x 4320 vertical pixels), and/or with 3D display, a display device 21 capable of high-resolution display and/or 3D display can be used accordingly.
  • When the display device 21 is compatible with high-resolution imaging such as 4K or 8K, a more immersive feeling can be obtained by using a display device 21 having a size of 55 inches or more.
  • a plurality of display devices 21 having different resolutions and sizes may be provided depending on the application.
  • The light source device 22 is composed of, for example, a light source such as an LED (Light Emitting Diode), and supplies irradiation light for photographing the surgical site to the endoscope 1.
  • the arm control device 23 is composed of a processor such as a CPU, and operates according to a predetermined program to control the drive of the arm portion 16 of the support arm device 14 according to a predetermined control method.
  • the input device 24 is an input interface for the endoscopic surgery system 100.
  • the user can input various information and input instructions to the endoscopic surgery system 100 via the input device 24.
  • the user inputs various information related to the surgery, such as physical information of the patient and information about the surgical procedure, via the input device 24.
  • For example, the user inputs, via the input device 24, an instruction to drive the arm portion 16, an instruction to change the imaging conditions of the endoscope 1 (type of irradiation light, magnification, focal length, etc.), an instruction to drive the energy treatment tool 11, and the like.
  • the type of the input device 24 is not limited, and the input device 24 may be various known input devices.
  • As the input device 24, for example, a mouse, a keyboard, a touch panel, a switch, a foot switch 27, and/or a lever can be used.
  • the touch panel may be provided on the display surface of the display device 21.
  • Alternatively, the input device 24 may be a device worn by the user, such as a glasses-type wearable device or an HMD (Head Mounted Display), and various inputs are made according to the user's gestures and line of sight detected by these devices.
  • the input device 24 includes a camera capable of detecting the movement of the user, and various inputs are performed according to the gesture and the line of sight of the user detected from the image captured by the camera.
  • the input device 24 includes a microphone capable of picking up the user's voice, and various inputs are performed by voice through the microphone.
  • Since the input device 24 is configured to be able to input various information in a non-contact manner, a user belonging to a clean area (for example, the operator 32) can operate a device belonging to an unclean area in a non-contact manner.
  • In addition, the user can operate the device without taking his or her hand off the surgical tool being held, which improves user convenience.
  • the treatment tool control device 25 controls the drive of the energy treatment tool 11 for cauterizing, incising, sealing a blood vessel, or the like of a tissue.
  • The pneumoperitoneum device 26 sends gas into the body cavity through the pneumoperitoneum tube 10 in order to inflate the body cavity of the patient 34 for the purpose of securing the field of view of the endoscope 1 and securing the working space of the operator.
  • the recorder 35 is a device capable of recording various information related to surgery.
  • the printer 36 is a device capable of printing various information related to surgery in various formats such as text, images, and graphs.
  • the support arm device 14 includes a base portion 15 as a base and an arm portion 16 extending from the base portion 15.
  • the arm portion 16 is composed of a plurality of joint portions 17a, 17b, 17c and a plurality of links 18a, 18b connected by the joint portions 17b.
  • In the figure, the configuration of the arm portion 16 is shown in a simplified manner. Actually, the shapes, numbers, and arrangements of the joint portions 17a to 17c and the links 18a and 18b, the directions of the rotation axes of the joint portions 17a to 17c, and the like may be set appropriately so that the arm portion 16 has a desired degree of freedom.
  • the arm portion 16 can be preferably configured to have at least 6 degrees of freedom.
  • As a result, the endoscope 1 can be moved freely within the movable range of the arm portion 16, so that the lens barrel 2 of the endoscope 1 can be inserted into the body cavity of the patient 34 from a desired direction.
  • Actuators are provided in the joint portions 17a to 17c, and the joint portions 17a to 17c are configured to be rotatable around a predetermined rotation axis by driving the actuator.
  • By controlling the drive of the actuators with the arm control device 23, the rotation angles of the joint portions 17a to 17c are controlled and the drive of the arm portion 16 is controlled. Thereby, control of the position and posture of the endoscope 1 can be realized.
  • the arm control device 23 can control the drive of the arm unit 16 by various known control methods such as force control or position control.
  • For example, the arm control device 23 appropriately controls the drive of the arm portion 16 in response to an operation input, and the position and posture of the endoscope 1 may thereby be controlled.
  • the endoscope 1 at the tip of the arm portion 16 can be moved from an arbitrary position to an arbitrary position, and then fixedly supported at the moved position.
  • the arm portion 16 may be operated by a so-called master slave method. In this case, the arm portion 16 can be remotely controlled by the user via an input device 24 installed at a location away from the operating room.
  • When force control is applied, the arm control device 23 may perform so-called power assist control, in which it receives an external force from the user and drives the actuators of the joint portions 17a to 17c so that the arm portion 16 moves smoothly in accordance with that external force. As a result, when the user moves the arm portion 16 while directly touching it, the arm portion 16 can be moved with a relatively light force. Therefore, the endoscope 1 can be moved more intuitively and with a simpler operation, and user convenience can be improved.
  • In general endoscopic surgery, the endoscope 1 has been supported by a doctor called a scopist.
  • By contrast, by using the support arm device 14, the position of the endoscope 1 can be fixed more reliably without manpower, so that an image of the surgical site can be obtained stably and the surgery can be performed smoothly.
  • the arm control device 23 does not necessarily have to be provided in the cart 19. Further, the arm control device 23 does not necessarily have to be one device.
  • For example, the arm control device 23 may be provided in each of the joint portions 17a to 17c of the arm portion 16 of the support arm device 14, and drive control of the arm portion 16 may be realized by the plurality of arm control devices 23 cooperating with each other.
  • the light source device 22 supplies the endoscope 1 with irradiation light for photographing the surgical site.
  • the light source device 22 is composed of, for example, an LED, a laser light source, or a white light source composed of a combination thereof.
  • When a white light source is configured by a combination of RGB laser light sources, the output intensity and output timing of each color (each wavelength) can be controlled with high accuracy, so that the white balance of the captured image can be adjusted in the light source device 22.
  • In this case, the laser light from each of the RGB laser light sources may be irradiated onto the observation target in a time-division manner, and the drive of the image sensor of the camera head 3 may be controlled in synchronization with the irradiation timing, so that images corresponding to each of R, G, and B are captured in a time-division manner. According to this method, a color image can be obtained without providing a color filter on the image sensor.
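  • As a purely illustrative aside (not part of the disclosure), the idea of assembling a color image from time-divided R, G, and B exposures can be sketched in a few lines of Python with NumPy; the function name and the assumption that three aligned monochrome frames are available are introduced here only for illustration.

```python
import numpy as np

def merge_time_division_frames(frame_r, frame_g, frame_b):
    """Merge three monochrome frames captured under time-divided R, G, B
    illumination into one color image (a sketch of the idea; the frame
    names and shapes are assumptions, not part of the disclosure)."""
    # Each input is a 2-D (H, W) monochrome frame taken while only one
    # laser color was illuminating the scene.
    assert frame_r.shape == frame_g.shape == frame_b.shape
    # Stack into an (H, W, 3) RGB image; no color filter array is needed
    # because the color separation comes from the illumination timing.
    return np.stack([frame_r, frame_g, frame_b], axis=-1)
```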
  • the drive of the light source device 22 may be controlled so as to change the intensity of the output light at predetermined time intervals.
  • By controlling the drive of the image sensor of the camera head 3 in synchronization with the timing of the change in light intensity, acquiring images in a time-division manner, and synthesizing those images, a so-called high-dynamic-range image without blocked-up shadows or blown-out highlights can be generated.
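  • The time-division exposure idea behind such high-dynamic-range generation can likewise be sketched as follows; this is a minimal illustration assuming two alternating exposures with a known exposure ratio, not the method actually claimed.

```python
import numpy as np

def merge_alternating_exposures(low_exp, high_exp, exposure_ratio, sat_level=0.95):
    """Combine a short-exposure frame and a long-exposure frame captured in
    time division into one frame with a wider dynamic range (illustrative only).
    low_exp / high_exp are float arrays in [0, 1]; exposure_ratio is the assumed,
    known ratio between the two exposure times."""
    # Bring the short exposure to the same brightness scale as the long one.
    low_scaled = low_exp * exposure_ratio
    # Where the long exposure is saturated (blown-out highlights), trust the
    # scaled short exposure; elsewhere keep the long exposure (less noise).
    mask = high_exp >= sat_level
    return np.where(mask, low_scaled, high_exp)
```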
  • the light source device 22 may be configured to be able to supply light in a predetermined wavelength band corresponding to special light observation.
  • In special light observation, for example, so-called narrow band imaging is performed, in which a predetermined tissue such as a blood vessel in the surface layer of the mucous membrane is photographed with high contrast by irradiating light in a band narrower than the irradiation light (that is, white light) used in normal observation, utilizing the wavelength dependence of light absorption in body tissue.
  • fluorescence observation may be performed in which an image is obtained by fluorescence generated by irradiating with excitation light.
  • In fluorescence observation, the body tissue may be irradiated with excitation light to observe fluorescence from the body tissue itself (autofluorescence observation), or a reagent such as indocyanine green (ICG) may be locally injected into the body tissue and the body tissue may be irradiated with excitation light corresponding to the fluorescence wavelength of that reagent to obtain a fluorescence image.
  • the light source device 22 may be configured to be capable of supplying narrow band light and / or excitation light corresponding to such special light observation.
  • FIG. 2 is a block diagram showing an example of the functional configuration of the camera head 3 and the CCU 20 shown in FIG.
  • the camera head 3 has a lens unit 4, an imaging unit 5, a driving unit 6, a communication unit 7, and a camera head control unit 8 as its functions.
  • the CCU 20 has a communication unit 28, an image processing unit 29, and a control unit 30 as its functions.
  • the camera head 3 and the CCU 20 are connected by a transmission cable 31 so as to be able to communicate in both directions.
  • the lens unit 4 is an optical system provided at a connection portion with the lens barrel 2.
  • the observation light taken in from the tip of the lens barrel 2 is guided to the camera head 3 and incident on the lens unit 4.
  • the lens unit 4 is configured by combining a plurality of lenses including a zoom lens and a focus lens.
  • the optical characteristics of the lens unit 4 are adjusted so as to collect the observation light on the light receiving surface of the image sensor of the image pickup unit 5.
  • the zoom lens and the focus lens are configured so that their positions on the optical axis can be moved in order to adjust the magnification and the focus of the captured image.
  • the image pickup unit 5 is composed of an image pickup element and is arranged after the lens unit 4.
  • the observation light that has passed through the lens unit 4 is focused on the light receiving surface of the image pickup device, and an image signal corresponding to the observation image is generated by photoelectric conversion.
  • the image signal generated by the imaging unit 5 is provided to the communication unit 7.
  • As the image pickup element constituting the image pickup unit 5, for example, a CMOS (Complementary Metal Oxide Semiconductor) image sensor is used. As the image pickup element, for example, one capable of capturing a high-resolution image of 4K or higher may be used.
  • The image pickup unit 5 may be configured to have a pair of image pickup elements for acquiring image signals for the right eye and the left eye corresponding to 3D display.
  • the 3D display enables the operator 32 to more accurately grasp the depth of the biological tissue in the surgical site.
  • When the image pickup unit 5 is configured as a multi-plate type, a plurality of lens units 4 are provided corresponding to the respective image pickup elements.
  • the imaging unit 5 does not necessarily have to be provided on the camera head 3.
  • the imaging unit 5 may be provided inside the lens barrel 2 immediately after the objective lens.
  • the drive unit 6 is composed of an actuator, and the zoom lens and focus lens of the lens unit 4 are moved by a predetermined distance along the optical axis under the control of the camera head control unit 8. As a result, the magnification and focus of the image captured by the imaging unit 5 can be adjusted as appropriate.
  • the communication unit 7 is composed of a communication device for transmitting and receiving various information to and from the CCU 20.
  • the communication unit 7 transmits the image signal obtained from the image pickup unit 5 as RAW data to the CCU 20 via the transmission cable 31.
  • the image signal is transmitted by optical communication.
  • This is because the surgeon 32 performs the surgery while observing the condition of the affected area through the captured image, and for safer and more reliable surgery, the moving image of the surgical site is required to be displayed in real time as much as possible.
  • the communication unit 7 is provided with a photoelectric conversion module that converts an electric signal into an optical signal.
  • the image signal is converted into an optical signal by the photoelectric conversion module, and then transmitted to the CCU 20 via the transmission cable 31.
  • the communication unit 7 receives a control signal for controlling the drive of the camera head 3 from the CCU 20.
  • The control signal includes information about imaging conditions, such as information specifying the frame rate of the captured image, information specifying the exposure value at the time of imaging, and/or information specifying the magnification and focus of the captured image.
  • the communication unit 7 provides the received control signal to the camera head control unit 8.
  • the control signal from the CCU 20 may also be transmitted by optical communication.
  • the communication unit 7 is provided with a photoelectric conversion module that converts an optical signal into an electric signal, and the control signal is converted into an electric signal by the photoelectric conversion module and then provided to the camera head control unit 8.
  • the imaging conditions such as the frame rate, exposure value, magnification, and focus are automatically set by the control unit 30 of the CCU 20 based on the acquired image signal. That is, the so-called AE (Auto Exposure) function, AF (Auto Focus) function, and AWB (Auto White Balance) function are mounted on the endoscope 1.
  • The camera head control unit 8 controls the drive of the camera head 3 based on the control signal from the CCU 20 received via the communication unit 7. For example, the camera head control unit 8 controls the drive of the image pickup element of the image pickup unit 5 based on the information specifying the frame rate of the captured image and/or the information specifying the exposure at the time of imaging. Further, for example, the camera head control unit 8 appropriately moves the zoom lens and the focus lens of the lens unit 4 via the drive unit 6 based on the information specifying the magnification and focus of the captured image.
  • the camera head control unit 8 may further have a function of storing information for identifying the lens barrel 2 and the camera head 3.
  • the camera head 3 can be made resistant to autoclave sterilization.
  • the communication unit 28 is composed of a communication device for transmitting and receiving various types of information to and from the camera head 3.
  • the communication unit 28 receives an image signal transmitted from the camera head 3 via the transmission cable 31.
  • the image signal can be suitably transmitted by optical communication.
  • the communication unit 28 is provided with a photoelectric conversion module that converts an optical signal into an electric signal.
  • the communication unit 28 provides the image processing unit 29 with an image signal converted into an electric signal.
  • the communication unit 28 transmits a control signal for controlling the drive of the camera head 3 to the camera head 3.
  • the control signal may also be transmitted by optical communication.
  • the image processing unit 29 performs various image processing on the image signal which is the RAW data transmitted from the camera head 3.
  • The image processing includes various known signal processing such as development processing, image quality enhancement processing (band enhancement processing, super-resolution processing, NR (noise reduction) processing, and/or camera shake correction processing, etc.), and/or enlargement processing (electronic zoom processing).
  • the image processing unit 29 performs detection processing on the image signal for performing AE, AF, and AWB.
  • the image processing unit 29 is composed of a processor such as a CPU or GPU, and when the processor operates according to a predetermined program, the above-mentioned image processing and detection processing can be performed.
  • When the image processing unit 29 is composed of a plurality of GPUs, it appropriately divides the information related to the image signal and performs image processing in parallel using the plurality of GPUs.
  • The control unit 30 performs various controls related to the imaging of the surgical site by the endoscope 1 and the display of the captured image. For example, the control unit 30 generates a control signal for controlling the drive of the camera head 3. At this time, if the imaging conditions are input by the user, the control unit 30 generates the control signal based on the input by the user. Alternatively, when the endoscope 1 is equipped with the AE function, the AF function, and the AWB function, the control unit 30 appropriately calculates the optimum exposure value, focal length, and white balance according to the result of the detection processing by the image processing unit 29, and generates the control signal.
  • In addition, the control unit 30 causes the display device 21 to display an image of the surgical site based on the image signal that has undergone image processing by the image processing unit 29.
  • the control unit 30 recognizes various objects in the surgical site image by using various image recognition techniques.
  • The control unit 30 can recognize a surgical tool such as forceps, a specific biological part, bleeding, mist generated when the energy treatment tool 11 is used, and the like by detecting the shape, color, and the like of the edges of objects included in the surgical site image.
  • the control unit 30 uses the recognition result to superimpose and display various surgical support information on the image of the surgical site. By superimposing the surgery support information and presenting it to the surgeon 32, it becomes possible to proceed with the surgery more safely and surely.
  • the transmission cable 31 that connects the camera head 3 and the CCU 20 is an electric signal cable that supports electric signal communication, an optical fiber that supports optical communication, or a composite cable thereof.
  • the communication is performed by wire using the transmission cable 31, but the communication between the camera head 3 and the CCU 20 may be performed wirelessly.
  • When the communication between the two is performed wirelessly, it is not necessary to lay the transmission cable 31 in the operating room, so that the situation in which the movement of the medical staff in the operating room is hindered by the transmission cable 31 can be eliminated.
  • In the present embodiment, the endoscopic surgery system 100 corresponds to an embodiment of the treatment system according to the present technology.
  • the camera head 3 corresponds to an imaging unit capable of imaging an object to be treated, and is configured as a part of the endoscope 1.
  • the image processing unit 29 corresponds to an embodiment of the image processing unit according to the present technology. Further, the CCU 20 having the image processing unit 29 corresponds to one embodiment of the image processing apparatus according to the present technology.
  • the image processing unit 29 has hardware necessary for configuring a computer, such as a processor such as a CPU or GPU and a memory such as ROM or RAM.
  • The image processing method according to the present technology is executed when the CPU or the like loads the control program (the program according to the present technology) recorded in the ROM or the like into the RAM and executes it.
  • the specific configuration of the image processing unit 29 is not limited, and any hardware such as FPGA (Field Programmable Gate Array) and ASIC (Application Specific Integrated Circuit) may be used. It is also possible to realize the image processing unit 29 as a software block by executing a predetermined program by the CPU or the like of the CCU 20.
  • FIG. 3 is a block diagram showing a configuration example of the image processing unit 29.
  • When the CPU or the like of the image processing unit 29 executes a predetermined program, an image acquisition unit 40, an attention area setting unit 41, an enlargement processing unit 42, an emphasis area setting unit 43, an emphasis processing unit 44, an information amount control unit 45, and an image composition unit 46 are realized as functional blocks.
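  • The division of roles among these functional blocks can be pictured with the following minimal sketch (Python with NumPy and OpenCV assumed); the function name, the rectangular representation of the attention area, and the particular processing choices are illustrative assumptions, not the claimed implementation.

```python
import numpy as np
import cv2

def process_frame(captured, attention_rect, emphasis_masks, zoom=2.0):
    """Sketch of the flow of FIG. 3: enlarge the attention area, reduce the
    information amount of the peripheral area, emphasize the emphasis areas,
    and composite the result. attention_rect = (x, y, w, h);
    emphasis_masks = [(boolean_mask, bgr_color), ...]."""
    # attention area setting unit 41 / enlargement processing unit 42
    x, y, w, h = attention_rect
    roi_big = cv2.resize(captured[y:y + h, x:x + w], None, fx=zoom, fy=zoom)

    # information amount control unit 45: grayscale conversion stands in here
    # for the reduction of the information amount of the peripheral area.
    gray = cv2.cvtColor(captured, cv2.COLOR_BGR2GRAY)
    peripheral = cv2.cvtColor(gray, cv2.COLOR_GRAY2BGR)

    # emphasis processing unit 44: fill each emphasis area with a color so
    # that it stands out against the information-reduced peripheral area.
    for mask, color in emphasis_masks:
        peripheral[mask] = color

    # image composition unit 46: one possible layout, the enlarged attention
    # area next to the reduced peripheral view, scaled to a common height.
    hgt = roi_big.shape[0]
    new_w = max(1, int(peripheral.shape[1] * hgt / peripheral.shape[0]))
    small = cv2.resize(peripheral, (new_w, hgt))
    return np.hstack([roi_big, small])
```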
  • dedicated hardware such as an IC (integrated circuit) may be used.
  • the program is installed in the image processing unit 29 via, for example, various recording media. Alternatively, the program may be installed via the Internet or the like.
  • the type of recording medium on which the program is recorded is not limited, and any computer-readable recording medium may be used. For example, any recording medium for recording data non-temporarily may be used.
  • the image acquisition unit 40 acquires a captured image of the treatment target captured by the endoscope 1.
  • the procedure includes various medical actions such as examination, surgery, and diagnosis.
  • medical practice includes, for example, an act that causes or may cause harm to the human body unless it is done by the medical judgment and skill of a doctor.
  • The treatment also includes medical activities such as measurement of body temperature and blood pressure, treatment that does not require specialized judgment and skill (such as treatment of minor cuts and burns), replacement of gauze soiled with filth, adjustment of dosage, attachment of treatment equipment such as casts, blood collection for transfusion, adjustment of oral tracheal tubes, and management and operation of pacemakers.
  • the treatment target includes various living organisms such as a human body, pet animals such as dogs and cats, and livestock such as cows and pigs. In addition, a part of the living body such as the arm and internal organs is also included in the treatment target.
  • the image also includes a still image and a moving image. Of course, a plurality of frame images included in the moving image are also included in the image.
  • the acquisition of an image includes the acquisition of an image signal including image information. Further, the data format of the image information is not limited and may be set arbitrarily.
  • In the present embodiment, the image acquisition unit 40 acquires a captured image taken inside the body cavity of the human body undergoing endoscopic surgery. The acquired captured image is output to the attention area setting unit 41.
  • the attention area setting unit 41 sets the attention area with respect to the captured image.
  • the area in the image is defined by a pixel area composed of one or more pixels.
  • the area of interest is an area of interest for the operator 32 during the procedure. For example, an area including a lesion of the human body to be treated is included. Alternatively, it includes an area where an incision or the like is performed by a surgical tool such as a scalpel.
  • any region that the operator 32 wants to pay attention to may be set as the region of interest.
  • the region of interest may be set by, for example, the operator 32.
  • a region of interest may be automatically set for the input captured image.
  • the method of setting the region of interest is not limited, and any algorithm may be used.
  • The attention area setting unit 41 also sets, outside the attention area, an area in which an emphasis area (described later) is to be set.
  • In the present embodiment, the entire area around the attention area in the captured image is set as the area in which the emphasis area is to be set (hereinafter referred to as the peripheral area).
  • the present invention is not limited to this, and a part of the captured image other than the region of interest may be set as the region to be set as the emphasis region.
  • the enlargement processing unit 42 can enlarge all or a part of the captured image.
  • the enlargement processing unit 42 expands the area of interest set by the area of interest setting unit 41.
  • the emphasis area setting unit 43 sets an emphasis area with respect to the peripheral area.
  • The emphasized area is an area to be emphasized within the peripheral area. For example, an area corresponding to a portion that requires attention when performing a treatment is set as an emphasized area, such as an area corresponding to an instrument used in the treatment, a bleeding site, or a site where damage should be avoided. Areas corresponding to all of these may be set as emphasized areas, or an area corresponding to at least one of them may be set as an emphasized area. In addition, an area corresponding to any other portion that requires attention when performing the treatment may be set as an emphasized area.
  • the instruments used for the treatment are surgical tools such as scalpels, tweezers, and forceps.
  • the present invention is not limited to this, and various surgical tools used in general treatment may be used.
  • the bleeding site is a site where bleeding is occurring from the treatment target.
  • The bleeding site includes an injured site where bleeding is occurring and the blood pooled around the injured site.
  • the site to avoid damage is an important organ of the living body such as an artery.
  • Organs such as the retina, where damage may have a great influence on the living body, are also included.
  • a portion corresponding to a medical product such as gauze may be set as an emphasis area.
  • the region of interest corresponds to the first region in the captured image.
  • the peripheral region corresponds to a second region different from the first region.
  • the emphasized region corresponds to a predetermined region in the second region.
  • the method of setting the emphasized area is not limited, and any technique such as image recognition processing may be used.
  • any image recognition method such as edge detection or pattern matching may be used, and the algorithm is not particularly limited.
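  • As one hypothetical stand-in for the recognition step left open above, fixed color thresholds could be used to propose candidate emphasized areas; the thresholds and the function name below are arbitrary assumptions for illustration, not the recognition method of the disclosure.

```python
import cv2
import numpy as np

def detect_candidate_emphasis_regions(bgr):
    """Very crude stand-in for the unspecified recognition step: return boolean
    masks for 'bleeding-like' and 'instrument-like' pixels using fixed HSV
    thresholds (the threshold values are arbitrary illustrative choices)."""
    hsv = cv2.cvtColor(bgr, cv2.COLOR_BGR2HSV)
    # Reddish, fairly saturated pixels as a bleeding-site candidate.
    red1 = cv2.inRange(hsv, (0, 120, 60), (10, 255, 255))
    red2 = cv2.inRange(hsv, (170, 120, 60), (180, 255, 255))
    bleeding_mask = (red1 | red2) > 0
    # Bright, low-saturation pixels as a metallic-instrument candidate.
    tool_mask = cv2.inRange(hsv, (0, 0, 200), (180, 40, 255)) > 0
    return bleeding_mask, tool_mask
```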
  • an arbitrary machine learning algorithm using DNN (Deep Neural Network) or the like may be used.
  • For example, AI (artificial intelligence) that performs machine learning may be used. In that case, for example, a learning unit and an identification unit are provided for setting the emphasized area.
  • The learning unit and the identification unit may be constructed in, for example, the emphasis area setting unit 43, or may be constructed in another device capable of communicating with the CCU 20.
  • the learning unit performs machine learning based on the input information (learning data) and outputs the learning result.
  • the identification unit identifies (determines, predicts, etc.) the input information based on the input information and the learning result.
  • a neural network or deep learning is used as a learning method in the learning unit.
  • a neural network is a model that imitates a human brain neural circuit, and is composed of three types of layers: an input layer, an intermediate layer (hidden layer), and an output layer.
  • Deep learning is a model that uses a multi-layered neural network, and it is possible to learn complex patterns hidden in a large amount of data by repeating characteristic learning in each layer. Deep learning is used, for example, to identify objects in images and words in sounds. Of course, it can also be applied to the setting of the emphasized area.
  • a neurochip / neuromorphic chip incorporating the concept of a neural network can be used.
  • Machine learning problem settings include supervised learning, unsupervised learning, semi-supervised learning, reinforcement learning, reverse reinforcement learning, active learning, and transfer learning.
  • supervised learning features are learned based on given labeled learning data (teacher data). This makes it possible to derive labels for unknown data.
  • unsupervised learning a large amount of unlabeled learning data is analyzed to extract features, and clustering is performed based on the extracted features. This makes it possible to analyze trends and predict the future based on a huge amount of unknown data.
  • Semi-supervised learning is a mixture of supervised learning and unsupervised learning: after features are learned by supervised learning, a huge amount of training data is given by unsupervised learning, and learning is repeated while feature amounts are calculated automatically.
  • Reinforcement learning deals with the problem of an agent observing the current state of an environment and deciding what action to take. The agent obtains rewards from the environment by selecting actions and learns how to obtain the most rewards through a series of actions. By learning the optimum solution in a certain environment in this way, it is possible to reproduce human judgment and to make a computer acquire judgment that exceeds human judgment. It is also possible to generate virtual sensing data by machine learning: for example, predicting one kind of sensing data from another and using it as input information, such as generating position information from input image information. It is also possible to generate different sensing data from a plurality of sensing data, and to predict required information and generate predetermined information from sensing data.
  • the emphasized area setting unit 43 corresponds to an area setting unit that sets a predetermined area with respect to the second area.
  • The setting of each area executed by the attention area setting unit 41 and the emphasis area setting unit 43 may be performed by a single block.
  • an area setting unit having the functions of an attention area setting unit 41 and an emphasis area setting unit 43 may be configured, and an attention area, a peripheral area, and an emphasis area may be set by the area setting unit.
  • the enhancement processing unit 44 can perform enhancement processing on all or a part of the captured image.
  • the emphasis processing unit 44 performs the emphasis processing on the emphasis region.
  • the emphasis process is a process of emphasizing the emphasized area in the peripheral area.
  • the enhancement process for the enhancement region includes at least one of enlargement of the image, colorization of the image, increase of the gradation value of the image, or conversion of the display format of the image.
  • Image colorization is a process of converting a grayscale image into a color image. For example, when the peripheral area including the emphasized area is converted to grayscale, colorization of the image is a process in which the emphasized area is converted into (or kept as) a color image.
  • the colorization of an image includes at least one of a process of filling the emphasized area with a predetermined color or a process of coloring the boundary of the emphasized area with a predetermined color.
  • Increasing the gradation value of an image is a process of increasing the gradation of an image.
  • increasing the gradation value of an image is a process of changing the image corresponding to the emphasized region to 256 gradations when the image corresponding to the peripheral region has 128 gradations.
  • the gradation value of the emphasized region is equal to or less than the gradation value of the region of interest.
  • The conversion of the display format of an image is a process of outputting the image in a different display format. For example, when the peripheral area is converted into a painting-like (animation-like) style, the emphasized area is output in the same display format as the captured image, the same as the attention area. Of course, the conversion is not limited to this.
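  • One possible concrete form of the emphasis processing described above (a grayscale periphery with the emphasized area kept in color and its boundary colored) might look like the following sketch; OpenCV is assumed, and the outline color is an arbitrary choice, not a requirement of the disclosure.

```python
import cv2
import numpy as np

def emphasize_in_grayscale_periphery(bgr, emphasis_mask, outline_color=(0, 0, 255)):
    """Sketch of one emphasis process from the text: the peripheral image is
    converted to grayscale while the emphasized area keeps its original colors,
    and the boundary of the emphasized area is colored (red here, arbitrarily)."""
    gray3 = cv2.cvtColor(cv2.cvtColor(bgr, cv2.COLOR_BGR2GRAY), cv2.COLOR_GRAY2BGR)
    out = gray3.copy()
    out[emphasis_mask] = bgr[emphasis_mask]  # keep color inside the emphasized area
    # Color the boundary of the emphasized area using a morphological gradient.
    kernel = np.ones((3, 3), np.uint8)
    boundary = cv2.morphologyEx(emphasis_mask.astype(np.uint8),
                                cv2.MORPH_GRADIENT, kernel) > 0
    out[boundary] = outline_color
    return out
```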
  • the information amount control unit 45 controls the amount of information of the captured image.
  • the amount of information is typically defined by the amount of image data of the captured image.
  • the amount of information can be specified by the number of bits of the image, the number of pixels of the image, and the like.
  • the amount of information may be defined based on the gradation value and the like.
  • The reduction in the amount of information includes an arbitrary process for reducing the amount of image data of the captured image. For example, as processing for reducing the amount of information in the peripheral area, image reduction, grayscale conversion of the image, reduction of the gradation value of the image, conversion of the display format of the image, cropping of the image, and the like are executed. Cropping of the image is a process of hiding a part of the image.
  • For example, a 4K image is cut out from an 8K captured image.
  • Alternatively, the image may be cropped so that a part of the peripheral area is hidden by being overlapped (superimposed).
  • When the captured image is a three-dimensional image, converting the peripheral area from a three-dimensional image to a two-dimensional image is also included in the process of reducing the amount of information in the peripheral area.
  • the emphasis processing unit 44 emphasizes the emphasized area.
  • the information amount control unit 45 reduces the amount of information in the peripheral region. Therefore, the amount of information in the peripheral area is reduced so that the emphasized area is emphasized.
  • any process that can reduce the amount of information in the peripheral area so that the emphasized area is emphasized may be executed.
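  • For illustration, two of the information-reduction processes listed above (reducing the gradation value and reducing the image size) could be sketched as follows; the quantization level and scale factor are arbitrary example values, not values taken from the disclosure.

```python
import cv2
import numpy as np

def reduce_information(peripheral_bgr, levels=128, scale=0.5):
    """Sketch of reducing the information amount of the peripheral area:
    quantize the gradation from 256 to `levels` values and shrink the image
    by `scale` (both parameter values are arbitrary examples)."""
    step = 256 // levels
    quantized = (peripheral_bgr // step) * step  # fewer gradation values
    reduced = cv2.resize(quantized, None, fx=scale, fy=scale,
                         interpolation=cv2.INTER_AREA)
    return reduced
```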
  • the image synthesizing unit 46 electronically synthesizes the captured image.
  • the area of interest enlarged by the enlargement processing unit 42, the emphasized area emphasized by the emphasis processing unit 44, and the peripheral area in which the amount of information is reduced by the information amount control unit 45 are combined by the image composition unit 46. It is synthesized.
  • the combined composite image is output to the display device 21 via the communication unit 28.
  • the surgeon 32 can perform the treatment while checking the composite image displayed on the display device 21.
  • The information amount control unit 45 may reduce the amount of information only in an area of the peripheral area that is different from the emphasized area. In that case, as a result, the emphasized area is emphasized within the peripheral area, so this process is included in the emphasis process for the emphasized area. Further, when the amount of information of only the area different from the emphasized area is reduced, the amount of information of the peripheral area as a whole is reduced, so this process is also included in the process of reducing the amount of information of the peripheral area so that the emphasized area is emphasized.
  • FIG. 4 is a flowchart showing an example of setting and processing of each area.
  • FIG. 5 is a schematic view showing an example of setting the area of the captured image to be treated.
  • the captured image 50 of the patient 34 captured by the endoscope 1 is input (step 101). For example, as shown in FIG. 5, a photographed image 50 in which the surgical portion in the body cavity of the patient 34 is photographed is displayed on the display device 21. As the captured image 50, the display device 21 displays the inside of the body cavity of the patient 34 including the lesion portion photographed by the endoscope 1, the operator 32's hand, the surgical tool 51, the bleeding site 52, and the like. In addition, various information related to surgery such as physical information of the patient 34 input via the input device 24 and information about the surgical procedure may be displayed on the captured image 50.
  • the attention area setting unit 41 sets the attention area 53 and the peripheral area 54 in the input captured image 50 (step 102).
  • the area where the operator 32 is performing the treatment is set as the area of interest 53, and the other area is set as the peripheral area 54. Further, in the present embodiment, the region of interest 53 and the peripheral region 54 are separated and processed separately.
  • FIG. 6 is a schematic view showing an enlarged example of the region of interest 53.
  • the area of interest 53 is expanded by the enlargement processing unit 42 (step 103).
  • enlarging an image means displaying (the content of) the inside of the image in an area larger than the original display area (pixel area).
  • the method of expanding the region of interest 53 is not limited.
  • the entire region of interest 53 may be magnified by a predetermined magnification.
  • Alternatively, a different magnification may be assigned to each of a plurality of regions within the region of interest 53, and each region may be enlarged according to its assigned magnification, as in the sketch below.
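  • A sketch of that per-region magnification option, assuming an arbitrary split of the region of interest into vertical thirds, is shown here; the split and the magnification values are illustrative assumptions only.

```python
import cv2
import numpy as np

def enlarge_with_nonuniform_magnification(roi, center_mag=3.0, edge_mag=1.5):
    """Sketch of assigning different magnifications to parts of the region of
    interest 53: the middle third is enlarged more strongly than the left and
    right thirds. The split and magnification values are arbitrary assumptions."""
    h, w = roi.shape[:2]
    parts = [roi[:, : w // 3], roi[:, w // 3 : 2 * w // 3], roi[:, 2 * w // 3 :]]
    mags = [edge_mag, center_mag, edge_mag]
    out_h = int(h * center_mag)  # common output height for all parts
    scaled = [cv2.resize(p, (max(1, int(p.shape[1] * m)), out_h))
              for p, m in zip(parts, mags)]
    return np.hstack(scaled)
```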
  • the emphasis area setting unit 43 determines whether or not there is an emphasis area in the peripheral area 54 (step 104).
  • the region corresponding to the surgical tool 51 and the bleeding site 52 in the captured image 50 is set as the emphasized region by the emphasized region setting unit 43.
  • FIG. 7 is a schematic view showing an example in which the enhancement process is executed on the enhancement region in the peripheral region 54.
  • the emphasis processing unit 44 performs the emphasis processing on the emphasis regions 55 (56 and 57) (step 105).
  • In the present embodiment, the emphasis processing unit 44 executes an enhancement process of filling the emphasis area 56 corresponding to the surgical tool 51 and the emphasis area 57 corresponding to the bleeding site 52 with predetermined colors. For example, the emphasis area 56 corresponding to the surgical tool 51 is filled with a color that is more conspicuous than the inside of the body cavity of the human body.
  • For the emphasis area 57 corresponding to the bleeding site 52, for example, an enhancement process of filling with red, or a color close to red that imitates the color of blood, is executed.
  • the method of executing the emphasis processing is not limited.
  • the operator 32 may assign an arbitrary color to each emphasis area 55.
  • the type of the surgical tool 51 or the like may be recognized by image recognition or the like, and the enhancement process assigned to each surgical tool may be executed.
  • different emphasis processing may be executed when the surgical tools 51 are close to each other.
  • the method of setting the emphasized region 57 is not limited.
  • For example, a reliability may be set for each area as an index indicating how much the recognition result of the image recognition can be trusted.
  • the corresponding region may be set as the emphasized region depending on whether or not the reliability exceeds a predetermined threshold value.
  • A plurality of threshold values may be set, and the reliability of each emphasized area may be evaluated stepwise, as in the sketch below. For example, if the reliability exceeds the highest threshold, the emphasized area may be filled with a dark color. Further, for example, a luminance level of the fill color may be assigned to each threshold value.
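  • The stepwise use of reliability thresholds could be sketched as follows; the threshold values and fill strengths are arbitrary examples, and below the lowest threshold the area is simply not emphasized.

```python
def fill_strength_from_reliability(reliability,
                                   thresholds=(0.5, 0.7, 0.9),
                                   strengths=(0.0, 0.3, 0.6, 1.0)):
    """Map the reliability of a recognized area to the strength (for example
    the luminance or opacity) of its fill color using multiple thresholds.
    The threshold and strength values are arbitrary illustrative examples."""
    # Count how many thresholds the reliability meets or exceeds (0..3),
    # then look up the corresponding fill strength.
    level = sum(reliability >= t for t in thresholds)
    return strengths[level]

# Example: a recognition result with reliability 0.75 exceeds two thresholds,
# so the area is filled at strength 0.6.
print(fill_strength_from_reliability(0.75))  # 0.6
```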
  • FIG. 8 is a schematic diagram showing an example in which the control of the amount of information is executed for the peripheral region 54.
  • the information amount control unit 45 controls the amount of information with respect to the peripheral area 54 including the emphasized area on which the emphasis processing is executed (step 106).
  • the peripheral area 54 is reduced as a process for reducing the amount of information in the peripheral area 54. Reducing an image means displaying (the content of) an image in an area smaller than the original display area.
  • the method of reducing the peripheral area 54 is not limited. For example, the entire peripheral area 54 may be reduced by a predetermined magnification. Alternatively, a different magnification may be assigned to each of the plurality of areas in the peripheral area 54, or the magnification may be reduced according to the assigned magnification.
  • Further, grayscale conversion of the peripheral area 54 is executed as a process of reducing the amount of information in the peripheral area 54. That is, the emphasis processing unit 44 and the information amount control unit 45 execute the process of reducing the amount of information in the peripheral area 54 and the emphasis process on the emphasis area 55. As a result, even if the peripheral area 54 is reduced, the emphasis area 55 can be sufficiently grasped.
  • the combination of the process of reducing the amount of information and the emphasis process for the emphasis area 55 may be arbitrarily executed.
  • the peripheral region 54 may be converted into a painting style, and the surgical instrument 51 and the bleeding site 52 may be emphasized with a predetermined color.
  • In FIG. 8, the region of interest 53 is indicated by a dotted line. Not limited to this, the region of interest 53 may be omitted when the emphasis processing and the control of the amount of information are performed on the peripheral area 54.
  • FIG. 9 is a schematic diagram showing an example of a composite image.
  • the image synthesizing unit 46 generates a composite image 58 in which the region of interest 53 and the peripheral region 54 are combined (step 107). Further, the generated composite image 58 is output to the display device 21 (step 108). As a result, the surgeon 32 can perform the treatment while checking the composite image 58 displayed on the display device 21 in real time.
  • each process of steps 101 to 106 is shown.
  • In the present embodiment, the images including the attention area 53 and the peripheral area 54 shown in FIGS. 5 to 8 are not displayed on the display device 21; only the composite image 58 generated by the image composition unit 46 is displayed on the display device 21. Not limited to this, an image including the attention area 53 and the peripheral area 54 at each of these processing steps may be displayed on the display device 21.
  • As described above, the region of interest 53 in the captured image 50 in which the patient 34 is captured is enlarged.
  • the amount of information in the peripheral region different from the attention region 53 in the captured image 50 is reduced so that the emphasized region 55 in the peripheral region 54 is emphasized. This makes it possible to fully grasp the target image.
  • the area of interest may be magnified by an electronic zoom.
  • For example, if a small lesion is enlarged by electronic zoom, the field of view becomes narrower than in same-magnification display, and information outside the region of interest can no longer be displayed. Therefore, there is a problem that, during electronic zooming, it is not possible to operate instruments outside the region of interest or to confirm conditions outside the region of interest, such as bleeding around the lesion.
  • the area of interest is set in the surgical field image, and the area of interest is enlarged and displayed.
  • the peripheral area other than the area of interest is reduced and displayed.
  • Furthermore, emphasis processing is performed on the emphasized area so that the necessary information can still be extracted from the reduced display.
  • As a result, even when the area of interest is enlarged and displayed, it is possible to grasp surrounding conditions such as the state of the surgical instruments, the state of bleeding, and the condition of the tissue outside the area of interest.
  • the information amount control unit 45 reduces the amount of information in the peripheral area 54.
  • the enhancement process may be executed for the emphasis region 55.
  • the reduction of the peripheral area 54 was executed by an arbitrary method.
  • For example, the peripheral area 54 may be reduced so that the entire peripheral area 54 can be displayed in the area other than the enlarged area of interest 53 in the captured image 50, as in the sketch below. This makes it possible to display the composite image without any missing portions.
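  • One way to compute such a reduction factor, assuming a side-by-side layout of the enlarged region of interest and the reduced peripheral view, is sketched below; the layout is an assumption introduced here, since the text only requires that the entire peripheral area remain visible.

```python
def peripheral_reduction_factor(frame_w, roi_w, zoom):
    """Reduction factor for the peripheral area so that the whole captured
    frame, shrunk, fits in the width left over next to the enlarged region
    of interest (side-by-side layout assumed for illustration)."""
    enlarged_w = roi_w * zoom
    remaining_w = max(frame_w - enlarged_w, 0)
    return min(remaining_w / frame_w, 1.0)

# Example: in a 3840-pixel-wide frame, a 960-pixel-wide ROI enlarged 2x leaves
# 1920 pixels, so the peripheral view is reduced to 0.5 of its original width.
print(peripheral_reduction_factor(3840, 960, 2.0))  # 0.5
```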
  • the emphasis processing unit 44 executed the emphasis processing on all of the emphasis areas. Not limited to this, only surgical tools and bleeding sites that greatly affect the treatment may be emphasized. Further, the surgeon 32 may decide whether or not the emphasis processing is executed.
  • the attention area 53 is set to a rectangular shape by the attention area setting unit 41.
  • the region of interest may be set in any shape.
  • a circular area of interest having a predetermined radius may be set based on the location where the treatment is performed.
  • the region of interest 53 and the peripheral region 54 are separated and processed separately.
  • the process is not limited to this, and the process may be executed in the region of interest and the peripheral region in one captured image.
  • the technology according to this disclosure can be applied to various products.
  • the technique according to the present disclosure may be applied to a microsurgery system used for so-called microsurgery, which is performed while magnifying and observing a minute part of a patient.
  • FIG. 10 is a diagram showing an example of a schematic configuration of a microscopic surgery system 200 to which the technique according to the present disclosure can be applied.
  • the microscope surgery system 200 is composed of a microscope device 201, a control device 217, and a display device 219.
  • the “user” means an operator, an assistant, or any other medical staff who uses the microsurgery system 200.
  • The microscope device 201 has a microscope unit 203 for magnifying and observing an observation target (the surgical site of the patient), an arm unit 209 that supports the microscope unit 203 at its tip, and a base unit 215 that supports the base end of the arm unit 209.
  • The microscope unit 203 is composed of a substantially cylindrical tubular portion 205, an imaging unit (not shown) provided inside the tubular portion 205, and an operation unit 207 provided in a part of the outer periphery of the tubular portion 205.
  • the microscope unit 203 is an electronic imaging type microscope unit (so-called video type microscope unit) that electronically captures an captured image by the imaging unit.
  • a cover glass is provided on the opening surface at the lower end of the tubular portion 205 to protect the internal imaging portion.
  • the light from the observation target (hereinafter, also referred to as observation light) passes through the cover glass and is incident on the imaging portion inside the tubular portion 205.
  • a light source made of, for example, an LED or the like may be provided inside the tubular portion 205, and light may be emitted from the light source to the observation target through the cover glass at the time of imaging.
  • the image pickup unit is composed of an optical system that collects the observation light and an image pickup element that receives the observation light collected by the optical system.
  • the optical system is composed of a combination of a plurality of lenses including a zoom lens and a focus lens, and its optical characteristics are adjusted so as to form an image of observation light on a light receiving surface of an image pickup device.
  • the image sensor receives the observation light and performs photoelectric conversion to generate a signal corresponding to the observation light, that is, an image signal corresponding to the observation image.
  • As the image pickup element, for example, one having a Bayer array and capable of color photographing is used.
  • the image sensor may be various known image sensors such as a CMOS image sensor or a CCD (Charge Coupled Device) image sensor.
  • the image signal generated by the image sensor is transmitted to the control device 217 as RAW data.
  • The transmission of this image signal is preferably performed by optical communication.
  • This is because the surgeon performs the surgery while observing the condition of the affected area in the captured image, so for safer and more reliable surgery the moving image of the surgical site should be displayed in as close to real time as possible.
  • By transmitting the image signal by optical communication, the captured image can be displayed with low latency.
  • the imaging unit may have a drive mechanism for moving the zoom lens and the focus lens of the optical system along the optical axis. By appropriately moving the zoom lens and the focus lens by the drive mechanism, the magnification of the captured image and the focal length at the time of imaging can be adjusted. Further, the imaging unit may be equipped with various functions that can be generally provided in an electronic imaging type microscope unit, such as an AE function and an AF function.
  • the image pickup unit may be configured as a so-called single-plate type image pickup unit having one image pickup element, or may be configured as a so-called multi-plate type image pickup unit having a plurality of image pickup elements.
  • In the case of the multi-plate type, for example, each image pickup element may generate an image signal corresponding to one of R, G, and B, and a color image may be obtained by combining them.
  • the image pickup unit may be configured to have a pair of image pickup elements for acquiring image signals for the right eye and the left eye corresponding to stereoscopic vision (3D display), respectively.
  • the 3D display enables the operator to more accurately grasp the depth of the biological tissue in the surgical site.
  • In that case, a plurality of optical systems may be provided, one corresponding to each image pickup element.
  • the operation unit 207 is an input means that is composed of, for example, a cross lever or a switch, and receives a user's operation input.
  • the user can input an instruction to change the magnification of the observation image and the focal length to the observation target via the operation unit 207.
  • the magnification and focal length can be adjusted by appropriately moving the zoom lens and the focus lens by the drive mechanism of the imaging unit according to the instruction.
  • the user can input an instruction to switch the operation mode (all-free mode and fixed mode described later) of the arm unit 209 via the operation unit 207.
  • The operation unit 207 is preferably provided at a position where the user can easily operate it with a finger while holding the tubular portion 205, so that it can be operated even while the user is moving the tubular portion 205.
  • The arm portion 209 is configured by connecting a plurality of links (first link 213a to sixth link 213f) so as to be rotatable with respect to one another via a plurality of joint portions (first joint portion 211a to sixth joint portion 211f).
  • The first joint portion 211a has a substantially cylindrical shape, and at its tip (lower end) supports the upper end of the tubular portion 205 of the microscope unit 203 so that it can rotate around a rotation axis (first axis O1) parallel to the central axis of the tubular portion 205.
  • The first joint portion 211a may be configured such that the first axis O1 coincides with the optical axis of the imaging unit of the microscope unit 203. This makes it possible to rotate the field of view of the captured image by rotating the microscope unit 203 around the first axis O1.
  • the first link 213a fixedly supports the first joint portion 211a at the tip.
  • The first link 213a is a rod-shaped member having a substantially L-shape; one arm on its tip side extends in a direction orthogonal to the first axis O1, and the end of that arm is connected to the first joint portion 211a so as to abut the upper end of the outer periphery of the first joint portion 211a.
  • The second joint portion 211b is connected to the end of the other arm, on the base end side of the substantially L-shaped first link 213a.
  • The second joint portion 211b has a substantially cylindrical shape, and at its tip rotatably supports the base end of the first link 213a around a rotation axis (second axis O2) orthogonal to the first axis O1.
  • the tip of the second link 213b is fixedly connected to the base end of the second joint portion 211b.
  • The second link 213b is a rod-shaped member having a substantially L-shape; one arm on its tip side extends in a direction orthogonal to the second axis O2, and the end of that arm is fixedly connected to the base end of the second joint portion 211b.
  • The third joint portion 211c is connected to the other arm, on the base end side of the substantially L-shaped second link 213b.
  • The third joint portion 211c has a substantially cylindrical shape, and at its tip rotatably supports the base end of the second link 213b around a rotation axis (third axis O3) orthogonal to the first axis O1 and the second axis O2.
  • the tip of the third link 213c is fixedly connected to the base end of the third joint portion 211c.
  • The tip side of the third link 213c has a substantially cylindrical shape, and the base end of the third joint portion 211c is fixedly connected to the tip of that cylinder so that the two share substantially the same central axis.
  • the base end side of the third link 213c has a prismatic shape, and the fourth joint portion 211d is connected to the end portion thereof.
  • The fourth joint portion 211d has a substantially cylindrical shape, and at its tip rotatably supports the base end of the third link 213c around a rotation axis (fourth axis O4) orthogonal to the third axis O3.
  • the tip of the fourth link 213d is fixedly connected to the base end of the fourth joint portion 211d.
  • The fourth link 213d is a rod-shaped member that extends substantially linearly and orthogonally to the fourth axis O4, and the end of its tip is fixedly connected to the fourth joint portion 211d so as to abut the side surface of the substantially cylindrical fourth joint portion 211d.
  • a fifth joint portion 211e is connected to the base end of the fourth link 213d.
  • The fifth joint portion 211e has a substantially cylindrical shape, and on its tip side rotatably supports the base end of the fourth link 213d around a rotation axis (fifth axis O5) parallel to the fourth axis O4.
  • the tip of the fifth link 213e is fixedly connected to the base end of the fifth joint portion 211e.
  • the fourth axis O4 and the fifth axis O5 are rotation axes capable of moving the microscope unit 203 in the vertical direction.
  • The height of the microscope unit 203, that is, the distance between the microscope unit 203 and the observation target, can be adjusted by rotating the configuration on the tip side, including the microscope unit 203, around the fourth axis O4 and the fifth axis O5.
  • The fifth link 213e is configured by combining a first member having a substantially L-shape, one side of which extends in the vertical direction and the other side of which extends in the horizontal direction, with a rod-shaped second member that extends vertically downward from the horizontally extending portion of the first member.
  • the base end of the fifth joint portion 211e is fixedly connected to the vicinity of the upper end of the portion extending in the vertical direction of the first member of the fifth link 213e.
  • the sixth joint portion 211f is connected to the base end (lower end) of the second member of the fifth link 213e.
  • the sixth joint portion 211f has a substantially cylindrical shape, and on the tip end side thereof, the base end of the fifth link 213e is rotatably supported around a rotation axis (sixth axis O6) parallel to the vertical direction.
  • the tip of the sixth link 213f is fixedly connected to the base end of the sixth joint portion 211f.
  • the sixth link 213f is a rod-shaped member extending in the vertical direction, and its base end is fixedly connected to the upper surface of the base portion 215.
  • the rotatable range of the first joint portion 211a to the sixth joint portion 211f is appropriately set so that the microscope unit 203 can perform a desired movement.
  • With this configuration of the arm portion 209, a total of six degrees of freedom (three translational and three rotational) can be realized for the movement of the microscope unit 203.
  • Since the position and posture of the microscope unit 203 can thus be controlled freely within the movable range of the arm portion 209, the surgical site can be observed from any angle and the surgery can be performed more smoothly.
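  • As a rough illustration of how six revolute joints give the microscope unit six degrees of freedom, the sketch below composes the joint rotations into a single pose. The axis directions loosely follow the description above, but the link offsets, the zero posture, and the joint angles are invented placeholders, not the actual geometry of the arm portion 209.

```python
import numpy as np

def rot(axis, angle):
    """Rotation matrix for a rotation of `angle` radians about a unit axis."""
    x, y, z = axis / np.linalg.norm(axis)
    c, s, C = np.cos(angle), np.sin(angle), 1.0 - np.cos(angle)
    return np.array([[c + x * x * C, x * y * C - z * s, x * z * C + y * s],
                     [y * x * C + z * s, c + y * y * C, y * z * C - x * s],
                     [z * x * C - y * s, z * y * C + x * s, c + z * z * C]])

# Assumed joint axes in the zero posture, loosely following O1..O6 above:
# O1 vertical (optical axis), O2 and O3 horizontal, O5 parallel to O4, O6 vertical.
# Listed from the first joint (microscope side) to the sixth joint (base side).
AXES = [np.array(a, float) for a in
        [(0, 0, 1), (1, 0, 0), (0, 1, 0), (1, 0, 0), (1, 0, 0), (0, 0, 1)]]
# Placeholder offsets (metres) between successive joints, same ordering; not real values.
OFFSETS = [np.array(o, float) for o in
           [(0, 0, 0.05), (0.1, 0, 0), (0.1, 0, 0), (0, 0.3, 0), (0, 0.3, 0), (0, 0, 0.4)]]
TOOL = np.array([0.0, 0.0, -0.1])  # placeholder offset from the first joint to the microscope unit

def forward_kinematics(joint_angles):
    """Compose the six joint rotations (radians, first to sixth joint) into
    the pose of the microscope unit in base coordinates."""
    R = np.eye(3)
    p = np.zeros(3)
    # Walk from the base (sixth joint) out towards the microscope unit (first joint).
    for axis, offset, q in zip(reversed(AXES), reversed(OFFSETS), reversed(list(joint_angles))):
        p = p + R @ offset
        R = R @ rot(axis, q)
    return R, p + R @ TOOL

R, p = forward_kinematics(np.deg2rad([10, 20, -15, 30, -10, 45]))
print("microscope position:", p)
```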
  • The configuration of the arm portion 209 shown in the figure is merely an example; the number and shape (length) of the links constituting the arm portion 209, the number of joints, their arrangement, the directions of the rotation axes, and the like may be designed as appropriate so that the desired degrees of freedom can be realized.
  • In order to move the microscope unit 203 freely, the arm portion 209 is preferably configured with six degrees of freedom, but it may also be configured with more degrees of freedom (that is, redundant degrees of freedom).
  • When redundant degrees of freedom exist, the posture of the arm portion 209 can be changed while the position and orientation of the microscope unit 203 are fixed. This allows control that is more convenient for the operator, for example controlling the posture of the arm portion 209 so that it does not interfere with the field of view of the operator looking at the display device 219.
  • Each of the first joint portion 211a to the sixth joint portion 211f may be provided with an actuator having a drive mechanism such as a motor, together with an encoder or the like that detects the rotation angle of the joint.
  • The posture of the arm portion 209, that is, the position and posture of the microscope unit 203, can then be controlled by the control device 217 appropriately controlling the drive of each actuator provided in the first joint portion 211a to the sixth joint portion 211f.
  • Specifically, the control device 217 can grasp the current posture of the arm portion 209 and the current position and posture of the microscope unit 203 based on the information about the rotation angle of each joint portion detected by the encoders.
  • The control device 217 uses the grasped information to calculate a control value (for example, a rotation angle or a generated torque) for each joint that realizes the movement of the microscope unit 203 corresponding to the operation input from the user, and drives the drive mechanism of each joint according to that control value.
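  • A minimal sketch of one such control cycle follows, assuming the encoder reports the joint angle and the actuator accepts a torque command; the PD law, gains, and limit are placeholders standing in for whatever control method the control device 217 actually applies.

```python
from dataclasses import dataclass

@dataclass
class JointGains:
    kp: float   # proportional gain on the angle error
    kd: float   # derivative gain on the angular velocity

def joint_torque_command(target_angle, measured_angle,
                         measured_velocity, gains, torque_limit=10.0):
    """Compute a control value (here a generated torque) for one joint from
    the target angle derived from the user's operation input and the angle
    reported by the encoder. A simple PD law stands in for the actual
    control method applied by the control device 217."""
    error = target_angle - measured_angle
    torque = gains.kp * error - gains.kd * measured_velocity
    # Clamp to a safe actuator limit (placeholder value).
    return max(-torque_limit, min(torque_limit, torque))

# Example: one control cycle for a single joint.
command = joint_torque_command(target_angle=0.50, measured_angle=0.42,
                               measured_velocity=0.1,
                               gains=JointGains(kp=40.0, kd=4.0))
print(f"torque command: {command:.2f} N·m")
```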
  • the control method of the arm unit 209 by the control device 217 is not limited, and various known control methods such as force control or position control may be applied.
  • For example, the control device 217 may appropriately control the drive of the arm portion 209 according to an operation input, thereby controlling the position and posture of the microscope unit 203.
  • With this control, the microscope unit 203 can be moved from an arbitrary position to another arbitrary position and then fixedly supported at the new position.
  • As the input device, it is preferable to use one that can be operated even while the operator holds a surgical tool in hand, such as a foot switch.
  • The operation input may also be performed in a non-contact manner based on gesture detection or line-of-sight detection using a wearable device or a camera provided in the operating room.
  • the arm portion 209 may be operated by a so-called master slave method.
  • the arm portion 209 can be remotely controlled by the user via an input device installed at a location away from the operating room.
  • So-called power assist control may also be performed, in which the actuators of the first joint portion 211a to the sixth joint portion 211f are driven so that the arm portion 209 receives an external force from the user and moves smoothly in accordance with that force.
  • This allows the microscope unit 203 to be moved with a relatively light force, so it can be moved more intuitively and with a simpler operation, improving convenience for the user.
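  • The sketch below illustrates power assist in an admittance style: a sensed external force is converted into a small commanded motion so the arm follows the user's hand. The force input, damping value, and time step are assumptions, not values from this disclosure.

```python
import numpy as np

def power_assist_step(external_force, current_position,
                      damping=20.0, dt=0.01):
    """One control step of a simple admittance law: the arm is commanded to
    move in the direction of the external force applied by the user, with a
    velocity proportional to that force. `external_force` (N, 3-vector) is
    assumed to come from joint torque sensors or a wrist force sensor."""
    velocity = np.asarray(external_force, float) / damping   # m/s
    return current_position + velocity * dt

pos = np.zeros(3)
for _ in range(100):                      # user pushes gently along +x for 1 s
    pos = power_assist_step([2.0, 0.0, 0.0], pos)
print("displacement after 1 s:", pos)
```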
  • the drive of the arm portion 209 may be controlled so as to perform a pivot operation.
  • the pivot operation is an operation of moving the microscope unit 203 so that the optical axis of the microscope unit 203 always faces a predetermined point in space (hereinafter, referred to as a pivot point). According to the pivot operation, the same observation position can be observed from various directions, so that the affected part can be observed in more detail.
  • When the microscope unit 203 is configured so that its focal length cannot be adjusted, it is preferable that the pivot operation be performed with the distance between the microscope unit 203 and the pivot point kept fixed. In this case, that distance may be adjusted to the fixed focal length of the microscope unit 203.
  • The microscope unit 203 then moves on a hemisphere (schematically illustrated in FIG. C1) centered on the pivot point with a radius corresponding to the focal length, and a clear captured image is obtained even if the observation direction is changed.
  • When the microscope unit 203 is configured so that its focal length can be adjusted, the control device 217 may calculate the distance between the microscope unit 203 and the pivot point based on the information about the rotation angle of each joint portion detected by the encoders, and automatically adjust the focal length of the microscope unit 203 based on the calculation result.
  • Alternatively, if the microscope unit 203 is provided with an AF function, the focal length may be automatically adjusted by the AF function each time the distance between the microscope unit 203 and the pivot point changes due to the pivot operation.
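  • The geometry of the pivot operation can be sketched as follows: with the pivot point and the focal length fixed, each viewing direction corresponds to one position on the hemisphere, and the optical axis is pointed back at the pivot point. The pose representation and function names are illustrative assumptions.

```python
import numpy as np

def pivot_pose(pivot_point, focal_length, azimuth, elevation):
    """Position and optical-axis direction of the microscope unit during a
    pivot operation: the unit stays on a hemisphere of radius equal to the
    focal length, centred on the pivot point, with its optical axis always
    pointing at the pivot point. Angles are in radians; elevation 0 looks
    horizontally, pi/2 places the unit directly above, looking straight down."""
    # Direction from the pivot point up towards the microscope unit.
    d = np.array([np.cos(elevation) * np.cos(azimuth),
                  np.cos(elevation) * np.sin(azimuth),
                  np.sin(elevation)])
    position = np.asarray(pivot_point, float) + focal_length * d
    optical_axis = -d          # the unit looks back at the pivot point
    return position, optical_axis

# Observe the same point from two directions at a fixed 300 mm working distance.
for az in (0.0, np.pi / 2):
    pos, axis = pivot_pose(pivot_point=[0.0, 0.0, 0.0], focal_length=0.3,
                           azimuth=az, elevation=np.deg2rad(60))
    print(pos, axis)
```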
  • The first joint portion 211a to the sixth joint portion 211f may each be provided with a brake that restrains their rotation.
  • the operation of the brake can be controlled by the control device 217.
  • When the position and posture of the microscope unit 203 are to be fixed, the control device 217 operates the brake of each joint portion.
  • As a result, the posture of the arm portion 209, that is, the position and posture of the microscope unit 203, can be fixed without driving the actuators, so power consumption can be reduced.
  • When the position and posture of the microscope unit 203 are to be moved, the control device 217 may release the brake of each joint portion and drive the actuators according to a predetermined control method.
  • Such a brake operation can be performed in response to an operation input by the user via the above-mentioned operation unit 207.
  • When the user wants to move the position and posture of the microscope unit 203, he or she operates the operation unit 207 to release the brake of each joint portion.
  • As a result, the operation mode of the arm portion 209 shifts to a mode in which the joint portions can rotate freely (all-free mode).
  • When the user wants to fix the position and posture of the microscope unit 203, he or she operates the operation unit 207 to engage the brake of each joint portion, and the operation mode of the arm portion 209 shifts to a mode in which the rotation of each joint portion is restricted (fixed mode).
  • the control device 217 comprehensively controls the operation of the microscope surgery system 200 by controlling the operations of the microscope device 201 and the display device 219.
  • the control device 217 controls the drive of the arm portion 209 by operating the actuators of the first joint portion 211a to the sixth joint portion 211f according to a predetermined control method.
  • the control device 217 changes the operation mode of the arm portion 209 by controlling the operation of the brakes of the first joint portion 211a to the sixth joint portion 211f.
  • The control device 217 generates image data for display by performing various kinds of signal processing on the image signal acquired by the imaging unit of the microscope unit 203 of the microscope device 201, and causes the display device 219 to display that image data.
  • As the signal processing, various known kinds of signal processing may be performed, for example development processing (demosaic processing), high image quality processing (band enhancement processing, super-resolution processing, NR (noise reduction) processing, and/or camera shake correction processing, etc.), and/or enlargement processing (that is, electronic zoom processing).
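  • As a rough illustration of such a chain, the sketch below runs development (demosaicing), simple band enhancement (unsharp masking), noise reduction, and electronic zoom using standard OpenCV calls; the Bayer pattern and parameter values are assumptions, and super-resolution and camera shake correction are omitted.

```python
import cv2
import numpy as np

def develop_and_display(raw_bayer, zoom=1.5):
    """Illustrative signal chain: RAW -> demosaic -> band enhancement ->
    noise reduction -> electronic zoom. `raw_bayer` is assumed to be a
    uint8 single-channel frame with a BG Bayer pattern."""
    # Development processing (demosaic).
    bgr = cv2.cvtColor(raw_bayer, cv2.COLOR_BayerBG2BGR)

    # Band enhancement via unsharp masking (a simple stand-in).
    blurred = cv2.GaussianBlur(bgr, (0, 0), sigmaX=2.0)
    sharpened = cv2.addWeighted(bgr, 1.5, blurred, -0.5, 0)

    # Noise reduction.
    denoised = cv2.fastNlMeansDenoisingColored(sharpened, None, 5, 5, 7, 21)

    # Enlargement processing (electronic zoom): crop the centre and resize.
    h, w = denoised.shape[:2]
    ch, cw = int(h / zoom), int(w / zoom)
    y0, x0 = (h - ch) // 2, (w - cw) // 2
    zoomed = cv2.resize(denoised[y0:y0 + ch, x0:x0 + cw], (w, h),
                        interpolation=cv2.INTER_LINEAR)
    return zoomed
```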
  • the communication between the control device 217 and the microscope unit 203 and the communication between the control device 217 and the first joint portion 211a to the sixth joint portion 211f may be wired communication or wireless communication.
  • In the case of wired communication, communication by electric signals may be performed, or optical communication may be performed.
  • the transmission cable used for the wired communication may be configured as an electric signal cable, an optical fiber, or a composite cable thereof depending on the communication method.
  • In the case of wireless communication, since there is no need to lay a transmission cable in the operating room, the situation in which a transmission cable hinders the movement of medical staff in the operating room can be avoided.
  • The control device 217 may be a processor such as a CPU or GPU, or a microcomputer or control board on which a processor and a storage element such as a memory are mounted together.
  • the various functions described above can be realized by operating the processor of the control device 217 according to a predetermined program.
  • In the illustrated example, the control device 217 is provided as a device separate from the microscope device 201, but the control device 217 may instead be installed inside the base portion 215 of the microscope device 201 and configured integrally with the microscope device 201. Alternatively, the control device 217 may be composed of a plurality of devices.
  • For example, a microcomputer, a control board, or the like may be arranged in each of the microscope unit 203 and the first joint portion 211a to the sixth joint portion 211f of the arm portion 209, and functionality similar to that of the control device 217 may be realized by connecting them so that they can communicate with one another.
  • the display device 219 is provided in the operating room and displays an image corresponding to the image data generated by the control device 217 under the control of the control device 217. That is, the display device 219 displays an image of the surgical site taken by the microscope unit 203.
  • the display device 219 may display various information related to the surgery, such as physical information of the patient and information about the surgical procedure, in place of the image of the surgical site or together with the image of the surgical site. In this case, the display of the display device 219 may be appropriately switched by an operation by the user.
  • a plurality of display devices 219 may be provided, and each of the plurality of display devices 219 may display an image of the surgical site and various information related to the surgery.
  • As the display device 219, various known display devices such as a liquid crystal display device or an EL (electroluminescence) display device may be used.
  • FIG. 11 is a diagram showing a state of surgery using the microscopic surgery system 200 shown in FIG.
  • FIG. 11 schematically shows how the surgeon 221 is performing surgery on the patient 225 on the patient bed 223 using the microsurgery system 200.
  • the control device 217 is omitted from the configuration of the microscope surgery system 200, and the microscope device 201 is shown in a simplified manner.
  • When the microscopic surgery system 200 is used, the image of the surgical site taken by the microscope device 201 is enlarged and displayed on the display device 219 installed on the wall surface of the operating room.
  • the display device 219 is installed at a position facing the operator 221.
  • The operator 221 observes the state of the surgical site in the image shown on the display device 219 and performs various treatments on it, such as excision of the affected area.
  • The microscope device 201 can also function as a support arm device that supports another observation device or another surgical tool at its tip instead of the microscope unit 203. By supporting such observation devices and surgical tools with a support arm device, their position can be fixed more stably and the burden on the medical staff can be reduced compared with the case where medical staff support them by hand.
  • the technique according to the present disclosure may be applied to a support arm device that supports a configuration other than such a microscope unit.
  • The microscopic surgery system 200 corresponds to an embodiment of the treatment system according to the present technology.
  • the microscope unit 203 corresponds to an imaging unit capable of imaging a treatment target, and is configured as a part of the microscope device 201.
  • the control device 217 has the same function as the image processing unit 29 of the CCU 20. That is, the control device 217 corresponds to an embodiment of the image processing device according to the present technology.
  • The treatment system, image processing device, image processing method, and program according to the present technology may also be executed by a plurality of computers that are communicably connected via a network or the like and operate in conjunction with one another, thereby constructing the image processing device according to the present technology. That is, the treatment system, image processing device, image processing method, and program according to the present technology can be executed not only by a computer system composed of a single computer but also by a computer system in which a plurality of computers operate in conjunction with each other.
  • In the present disclosure, a system means a set of a plurality of components (devices, modules (parts), etc.), and it does not matter whether or not all the components are in the same housing. Therefore, a plurality of devices housed in separate housings and connected via a network, and a single device in which a plurality of modules are housed in one housing, are both systems.
  • Execution of the image processing device, image processing method, and program according to the present technology by a computer system includes both the case where, for example, the setting of the area of interest, the determination of the emphasis area, and the control of the amount of information are executed by a single computer, and the case where each process is executed by a different computer. Execution of each process by a predetermined computer also includes causing another computer to execute a part or all of that process and acquiring the result. That is, the treatment system, image processing device, image processing method, and program according to the present technology can also be applied to a cloud computing configuration in which one function is shared and jointly processed by a plurality of devices via a network.
  • In the present disclosure, concepts that define shape, size, positional relationship, state, and the like, such as “center”, “same”, “orthogonal”, “parallel”, “cylindrical shape”, and “tubular shape”, are concepts that include “substantially centered”, “substantially the same”, “substantially orthogonal”, “substantially parallel”, “substantially cylindrical”, “substantially tubular”, and the like. For example, states that fall within a predetermined range (for example, a range of ±10%) of the exact condition are also included.
  • the effects described in this disclosure are merely examples and are not limited, and other effects may be obtained.
  • The description of a plurality of effects above does not necessarily mean that those effects are exhibited simultaneously; it means that at least one of the above effects can be obtained depending on conditions and the like, and effects not described in the present disclosure may of course also be exhibited.
  • the present technology can also adopt the following configurations.
  • (1) A treatment system including: an imaging unit capable of photographing a treatment target; and an image processing unit that enlarges a first region in a captured image captured by the imaging unit and reduces the amount of information in a second region different from the first region in the captured image so that a predetermined region in the second region is emphasized.
  • The treatment system according to (1), wherein the image processing unit executes a process of reducing the amount of information in the second region and an enhancement process for the predetermined region.
  • A treatment system in which the image processing unit executes, as the process of reducing the amount of information in the second region, at least one of reduction of the image, grayscale conversion of the image, reduction of the gradation value of the image, conversion of the display format of the image, or cutting out of the image.
  • A treatment system in which the image processing unit executes, as the enhancement process for the predetermined region, at least one of enlargement of the image, colorization of the image, increase of the gradation value of the image, or conversion of the display format of the image.
  • A treatment system in which the conversion of the display format includes converting the image into a painting style.
  • A treatment system in which the captured image includes a three-dimensional image, and the image processing unit converts the second region into a two-dimensional image as the process of reducing the amount of information in the second region.
  • (7) The treatment system according to (3), wherein the image processing unit reduces the second region so that the entire second region can be displayed in the region other than the enlarged first region in the captured image.
  • A treatment system in which the image processing unit executes the enhancement process on the predetermined region after reducing the amount of information in the second region.
  • A treatment system in which the image processing unit reduces the amount of information in the second region after executing the enhancement process on the predetermined region.
  • The treatment system according to (1), wherein the image processing unit reduces the amount of information of only the regions of the second region other than the predetermined region.
  • (11) The treatment system according to any one of (1) to (10), further including an area setting unit that sets the predetermined region with respect to the second region.
  • A treatment system in which the area setting unit sets, as the predetermined region, a region corresponding to a portion that requires attention when performing the treatment.
  • A treatment system in which the area setting unit sets, as the predetermined region, a region corresponding to at least one of an instrument used for the treatment, a bleeding site, or a site where damage should be avoided.
  • A treatment system in which the imaging unit is configured as a part of an endoscope.
  • A treatment system in which the imaging unit is configured as a part of a surgical microscope.
  • The treatment system according to any one of (1) to (15), wherein the treatment includes surgery.
  • The treatment system according to any one of (1) to (16), wherein the treatment target includes a living body.
  • An image processing device including: an image acquisition unit that acquires an image including a treatment target; and an image processing unit that enlarges a first region in the image and reduces the amount of information in a second region different from the first region in the image so that a predetermined region in the second region is emphasized.

Abstract

A treatment system according to an embodiment of the present invention includes an imaging unit and an image processing unit. The imaging unit is capable of capturing an image of a treatment target. The image processing unit enlarges a first region in a captured image captured by the imaging unit, and reduces the amount of information in a second region different from the first region in the captured image so that a predetermined region of the second region is emphasized. This makes it possible to grasp the image of the target well.
PCT/JP2020/031967 2019-09-02 2020-08-25 Système opératoire, dispositif de traitement d'image, procédé de traitement d'image et programme WO2021044900A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2019159457 2019-09-02
JP2019-159457 2019-09-02

Publications (1)

Publication Number Publication Date
WO2021044900A1 true WO2021044900A1 (fr) 2021-03-11

Family

ID=74852763

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2020/031967 WO2021044900A1 (fr) 2019-09-02 2020-08-25 Système opératoire, dispositif de traitement d'image, procédé de traitement d'image et programme

Country Status (1)

Country Link
WO (1) WO2021044900A1 (fr)

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH07334665A (ja) * 1994-06-06 1995-12-22 Ge Yokogawa Medical Syst Ltd 画像表示方法及び画像表示装置
JP2013507182A (ja) * 2009-10-07 2013-03-04 インテュイティブ サージカル オペレーションズ, インコーポレイテッド 臨床画像上に強調された画像化データを表示するための方法および装置
JP2012245157A (ja) * 2011-05-27 2012-12-13 Olympus Corp 内視鏡装置
JP2013066241A (ja) * 2011-06-09 2013-04-11 Toshiba Corp 画像処理システム及び方法
JP2013042301A (ja) * 2011-08-12 2013-02-28 Casio Comput Co Ltd 画像処理装置、画像処理方法及びプログラム
WO2017115442A1 (fr) * 2015-12-28 2017-07-06 オリンパス株式会社 Appareil de traitement d'image, procédé de traitement d'image et programme de traitement d'image
WO2018131141A1 (fr) * 2017-01-13 2018-07-19 オリンパス株式会社 Dispositif de traitement d'images d'endoscope et procédé de traitement d'images d'endoscope
WO2019012911A1 (fr) * 2017-07-14 2019-01-17 富士フイルム株式会社 Dispositif de traitement d'image médicale, système d'endoscope, dispositif d'aide au diagnostic, et dispositif d'aide à une intervention médicale

Similar Documents

Publication Publication Date Title
JP7067467B2 (ja) 医療用情報処理装置、情報処理方法、医療用情報処理システム
CN110325093A (zh) 医疗用臂系统、控制装置与控制方法
JP7088185B2 (ja) 医療用システム、医療用装置および制御方法
CN111278344B (zh) 手术臂系统和手术臂控制系统
US11540700B2 (en) Medical supporting arm and medical system
WO2018168261A1 (fr) Dispositif de commande, procédé de commande et programme
JP7095693B2 (ja) 医療用観察システム
JP2021013412A (ja) 医療用観察システム、制御装置及び制御方法
US20230172438A1 (en) Medical arm control system, medical arm control method, medical arm simulator, medical arm learning model, and associated programs
US20220008156A1 (en) Surgical observation apparatus, surgical observation method, surgical light source device, and surgical light irradiation method
CN113993478A (zh) 医疗工具控制系统、控制器和非暂时性计算机可读存储器
WO2021049438A1 (fr) Bras de support médical et système médical
JPWO2019092950A1 (ja) 画像処理装置、画像処理方法および画像処理システム
WO2021049220A1 (fr) Bras de support médical et système médical
JP7092111B2 (ja) 撮像装置、映像信号処理装置および映像信号処理方法
WO2019181242A1 (fr) Endoscope et système de bras
WO2020203164A1 (fr) Système médical, dispositif de traitement d'informations, et procédé de traitement d'informations
WO2021044900A1 (fr) Système opératoire, dispositif de traitement d'image, procédé de traitement d'image et programme
WO2021256168A1 (fr) Système de traitement d'image médicale, dispositif de commande d'image chirurgicale et procédé de commande d'image chirurgicale
WO2017221491A1 (fr) Dispositif, système et procédé de commande
WO2021140923A1 (fr) Dispositif de génération d'images médicales, procédé de génération d'images médicales, et programme de génération d'images médicales
JP2023507063A (ja) 手術中に画像取込装置を制御するための方法、装置、およびシステム
JPWO2020045014A1 (ja) 医療システム、情報処理装置及び情報処理方法
WO2018043205A1 (fr) Dispositif de traitement d'image médicale, procédé de traitement d'image médicale, et programme
WO2022004250A1 (fr) Système médical, dispositif de traitement d'informations et procédé de traitement d'informations

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20860881

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20860881

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP