WO2019202827A1 - Image processing system, image processing apparatus, image processing method, and program

Info

Publication number: WO2019202827A1
Application number: PCT/JP2019/004770
Authority: WIPO (PCT)
Prior art keywords: image processing, white light, image, blood, bleeding
Other languages: English (en), Japanese (ja)
Inventors: 穂 高橋, 菊地 大介, 貴美 水倉
Applicant: ソニー株式会社 (Sony Corporation)

Classifications

    • A61B 1/00: Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
    • A61B 1/045: Such instruments combined with photographic or television appliances; Control thereof
    • H04N 7/18: Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
    • G06T 7/579: Depth or shape recovery from multiple images from motion
    • G06T 7/73: Determining position or orientation of objects or cameras using feature-based methods

Definitions

  • The present disclosure relates to an image processing system, an image processing apparatus, an image processing method, and a program.
  • Patent Document 1 discloses a technique that uses depth information to make it easier to obtain information on three-dimensional structure and on the positional relationship between objects.
  • Patent Document 1 further discloses analyzing the thickness of a blood pool produced by bleeding and detecting, as the bleeding position, any position where the obtained thickness is equal to or greater than a predetermined thickness.
  • The present disclosure therefore proposes an image processing system, an image processing apparatus, an image processing method, and a program that can more accurately identify the bleeding site of a surgical site during surgery.
  • According to the present disclosure, an image processing system is provided that includes: a light source device that has at least a white light source emitting white light and that irradiates a surgical site, i.e., a part of a living body on which surgery is performed; a surgical camera that acquires white light images of the surgical site irradiated with the white light; and an image processing device that uses a plurality of the white light images acquired at different time points to identify a bleeding position in the surgical site based on the temporal change of the surgical site between the white light images.
  • According to the present disclosure, an image processing apparatus is also provided that includes a specifying unit that uses a plurality of white light images, obtained by imaging at different time points a surgical site illuminated by the white light from a light source device having at least a white light source, and that specifies a bleeding position in the surgical site based on the temporal change of the surgical site between the plurality of white light images.
  • According to the present disclosure, an image processing method is also provided that includes imaging at different time points a surgical site illuminated by the white light from a light source device having at least a white light source, and specifying a bleeding position in the surgical site based on the temporal change of the surgical site between the plurality of white light images so obtained.
  • According to the present disclosure, a program is also provided that causes a computer, capable of communicating with both the light source device and the surgical camera described above, to function as a specifying unit that specifies a bleeding position in the surgical site based on the temporal change of the surgical site between a plurality of white light images acquired by the surgical camera at different time points.
  • In other words, the surgical camera acquires white light images of the surgical site at different time points, and the image processing apparatus identifies the bleeding position in the surgical site based on the temporal change of the surgical site between the plurality of white light images acquired at those time points.
  • FIG. 1 is a block diagram illustrating a configuration of an image processing system according to an embodiment of the present disclosure. Further drawings are an explanatory diagram of the feature points extracted from a captured image of a camera, a schematic diagram of a method of predicting the appearance position of a feature point according to the embodiment, and an explanatory diagram of a method of acquiring an image with the surgical camera whose position and posture have been estimated.
  • In the following description, a plurality of constituent elements having substantially the same functional configuration may be distinguished by appending different letters to the same reference numeral.
  • When it is not necessary to distinguish between such constituent elements, only the common reference numeral is given.
  • FIG. 1 is a diagram illustrating an example of a schematic configuration of an endoscopic surgery system 1 to which the technology according to the present disclosure can be applied.
  • The endoscopic surgery system 1 is used for endoscopic surgery performed in the medical field in place of open abdominal surgery.
  • In endoscopic surgery, instead of cutting open the abdominal wall, opening devices called trocars 200a to 200d are attached to the abdominal wall near the affected area 230.
  • The surgical camera 110, the energy treatment tool 210, and the forceps 220 are inserted into the patient's body cavity through the trocars 200a to 200d. While viewing the image of the affected area 230 captured by the surgical camera 110 in real time, the operator performs treatment such as excising the affected area 230 with the energy treatment tool 210 or the like.
  • Although not shown, the surgical camera 110, the energy treatment tool 210, and the forceps 220 are supported by an operator, an assistant, a scopist, a robot, or the like during the operation.
  • In the operating room, a cart 190 equipped with devices for endoscopic surgery, a bed 240 on which the patient lies, and a foot switch 250 are arranged.
  • The cart 190 is equipped with, for example, a camera control unit (CCU) 100, a light source device 130, an output device 140, a treatment instrument control device 150, an insufflation device 160, a recorder 170, and a printer 180 as medical devices.
  • The surgical camera 110 has a function of acquiring an image of the surgical site on which surgery is being performed.
  • The surgical camera 110 acquires an image corresponding to the type of light emitted onto the surgical site from the light source device 130 described later.
  • The surgical camera 110 acquires, for example, a white light image, which is an image of the surgical site irradiated with white light; a near-infrared light image, which is an image of the surgical site irradiated with near-infrared light having a wavelength in the near-infrared band; a laser light image, in which the surgical site is irradiated with laser light of a predetermined wavelength; and the like.
  • The white light image and the near-infrared light image may be obtained by providing the surgical camera 110 with a plurality of image sensors, so that the white light image and the near-infrared light image are captured by different image sensors.
  • Alternatively, the white light image and the near-infrared light image may be captured by a single image sensor by switching the light source.
  • An image signal of the affected area 230 captured by the surgical camera 110 is transmitted to the CCU 100 as RAW data via a camera cable.
  • The acquired image may also be stored in the storage unit 1250 shown in FIG. 2.
  • The surgical camera 110 and the CCU 100 may be connected by a wireless communication method instead of the camera cable.
  • As the surgical camera 110, an endoscope or a surgical microscope may be used.
  • In FIG. 1, the surgical camera 110 is illustrated as a so-called rigid scope having a rigid lens barrel.
  • However, the surgical camera 110 may instead be configured as a so-called flexible scope having a flexible lens barrel.
  • As the camera cable, a wired transmission cable such as an electrical signal cable compatible with electrical signal communication, an optical fiber compatible with optical communication, or a composite cable thereof may be used.
  • The light source device 130 is connected to the surgical camera 110; light generated by the light source device 130 is guided to the distal end of the lens barrel by a light guide extending inside the lens barrel of the surgical camera 110, and is irradiated toward the observation target in the body cavity of the patient via the objective lens.
  • The CCU 100 includes a CPU (Central Processing Unit), a GPU (Graphics Processing Unit), and the like, and comprehensively controls the operations of the surgical camera 110 and the output device 140. Specifically, the CCU 100 performs, on the image signal received from the surgical camera 110, various kinds of image processing for displaying an image based on that signal, such as development processing (demosaic processing). The CCU 100 provides the processed image signal to the output device 140. In addition, the CCU 100 transmits a control signal to the surgical camera 110 to control its driving.
  • The control signal can include information on imaging conditions such as magnification and focal length.
  • The light source device 130 has a white light source that emits white light.
  • The light source device 130 may also be configured to supply light of a predetermined wavelength band corresponding to special light observation, as necessary.
  • The light source device 130 preferably includes, for example, at least one of a near-infrared light source that emits near-infrared light having a wavelength in the near-infrared band and a laser light source that emits laser light of a predetermined wavelength.
  • The light source device 130 may include a plurality of near-infrared light sources that emit near-infrared light of mutually different wavelengths.
  • The light source device 130 supplies irradiation light to the surgical camera 110 when the surgical site is photographed.
  • Using such light, specification of the bleeding position of the surgical site, calculation of the bleeding amount, and the like may be performed.
  • So-called narrow band imaging (NBI) may also be performed, in which predetermined tissue such as blood vessels in the mucosal surface layer is imaged with high contrast.
  • Alternatively, fluorescence observation may be performed, in which an image is obtained from fluorescence generated by irradiation with excitation light.
  • In fluorescence observation, the body tissue may be irradiated with excitation light so that fluorescence from the body tissue itself is observed (autofluorescence observation), or a reagent such as indocyanine green (ICG) may be locally administered to the body tissue and the tissue irradiated with excitation light corresponding to the fluorescence wavelength of the reagent.
  • The light source device 130 can be configured to supply narrowband light and/or excitation light corresponding to such special light observation.
  • The output device 140 displays an image based on an image signal subjected to image processing by the CCU 100, under the control of the CCU 100. The output device 140 also emits predetermined sounds under the control of the CCU 100.
  • The treatment instrument control device 150 controls the driving of the energy treatment instrument 210 for tissue ablation, incision, blood vessel sealing, and the like.
  • The treatment instrument control device 150 is, for example, a high-frequency output device that outputs a high-frequency current to the energy treatment instrument 210 that cuts the affected area 230.
  • The insufflation device 160 includes an air supply/intake unit and inflates the patient's body cavity, for example the abdominal region, by feeding gas into the patient's body. When the body cavity is inflated by the insufflation device 160, the field of view of the surgical camera 110 and the operator's working space are secured.
  • The recorder 170 is a device that can record various types of information related to the surgery.
  • The recorder 170 may record images acquired by the surgical camera 110, and may also record, as history information, the various types of information acquired by the feature point detection unit 1210 and the feature amount detection unit 1220 of the CCU 100 described later.
  • The recorder 170 may also appropriately record various parameters, the progress of processing, and the like that need to be saved when the image processing system 10 according to the present embodiment performs its processing.
  • The printer 180 is a device that can print various types of information related to the surgery in various formats such as text, images, or graphs.
  • The foot switch 250 controls the CCU 100 and the treatment instrument control device 150, using an operation by the operator or an assistant as a trigger signal.
  • Control of the CCU 100 and the treatment instrument control device 150 is not limited to the foot switch 250, and may instead be performed by a touch panel, voice input, or the like.
  • The schematic configuration of the endoscopic surgery system 1 to which the technology according to the present disclosure can be applied has been described above.
  • The technology according to the present disclosure can be applied not only to the endoscopic surgery system described above but also to a microscope surgery system used for surgery performed under a microscope.
  • FIG. 2 is a diagram illustrating a configuration example of the image processing system 10 according to the embodiment of the present disclosure.
  • The image processing system 10 according to the embodiment of the present disclosure can be implemented, for example, as one function of the endoscopic surgery system 1 illustrated in FIG. 1.
  • In the following, a detailed description is given taking as an example the case where the image processing system 10 is mounted on the endoscopic surgery system 1.
  • The image processing system 10 according to the embodiment of the present disclosure includes an image processing device 120, as illustrated in FIG. 2.
  • The image processing apparatus 120 is mounted, for example, on the CCU 100 described above.
  • The image processing apparatus 120 performs various kinds of processing in cooperation with the surgical camera 110, the light source device 130, and the output device 140 mounted in the endoscopic surgery system 1.
  • The image processing apparatus 120 according to the embodiment of the present disclosure may instead perform its processing in cooperation not with the surgical camera 110, the light source device 130, and the output device 140 of the endoscopic surgery system 1 illustrated in FIG. 1, but with other devices having functions equivalent to theirs.
  • Using a plurality of images of the surgical site, i.e., the part of a living body on which surgery is performed, acquired by the surgical camera 110 at different time points while illuminated by the light source device having at least a white light source, the image processing apparatus 120 specifies a bleeding position in the surgical site based on the temporal change of the surgical site between the plurality of white light images.
  • The image processing device 120 can also specify the bleeding position and the like at the surgical site by further using at least one type of image other than the white light image (for example, a near-infrared light image or a laser light image) acquired by the surgical camera 110 at different points in time.
  • As illustrated in FIG. 2, the image processing apparatus 120 includes a feature point detection unit 1210, a feature amount detection unit 1220, an output control unit 1230, a light source switching unit 1240, a storage unit 1250, and a communication unit 1260.
  • The feature point detection unit 1210 analyzes each of a plurality of white light images acquired by the surgical camera 110 and detects, from each of them, "feature points" that characterize the white light image.
  • The feature point detection unit 1210 according to the embodiment of the present disclosure includes a three-dimensional information acquisition unit 1211 and a blood detection unit 1212, as illustrated in FIG. 2.
  • The three-dimensional information acquisition unit 1211 acquires three-dimensional information of the region displayed in a white light image acquired by the surgical camera 110.
  • The blood detection unit 1212 detects blood that may be present in the region displayed in the white light image, and acquires blood distribution information regarding the distribution of that blood.
  • Here, the three-dimensional information refers to position information on the position of the imaged region acquired by the surgical camera 110 and shape information on the three-dimensional shape of the region displayed in the image.
  • The feature point detection unit 1210 transmits the acquired three-dimensional information and blood distribution information to the feature amount detection unit 1220.
  • The three-dimensional information acquisition unit 1211 estimates the position and posture of the surgical camera 110 by analyzing the white light images acquired by the surgical camera 110, and acquires three-dimensional information of the portion displayed in the white light image in three-dimensional space. Such estimation of the position and posture of the surgical camera 110 and grasping of the position of the portion displayed in the white light image can be performed by a known self-position estimation technique such as SLAM (Simultaneous Localization and Mapping).
  • FIG. 3 is an explanatory diagram of the feature points extracted from a captured image of a camera.
  • As shown in FIG. 3, a general image, including an image of a surgical site, contains various feature points that characterize the structures of, and positional relationships between, the objects present in the image.
  • By tracking such feature points, the three-dimensional information acquisition unit 1211 can grasp the position of the region displayed in the image and acquire three-dimensional information even when the surgical camera 110 moves due to the patient's body movement during the operation or due to operation by the operator.
  • FIG. 4 is a schematic diagram for explaining the method of predicting the appearance position of a feature point FP.
  • When the surgical camera 110 located at position C1 images an object P and acquires an image of area A1, a feature point FP1 relating to the object P is extracted from the displayed area A1 of the acquired image.
  • When the surgical camera 110 moves from position C1 to position C2, it extracts a feature point FP2 relating to the object P present in the display area A2 seen from position C2. Likewise, when the surgical camera 110 moves from position C2 to position C3, it extracts a feature point FP3 relating to the object P present in the display area A3 seen from position C3.
  • When the surgical camera 110 is next expected to move to position C4, the appearance position of the feature point in the display area A4 can be predicted based on the position of the object P estimated from the display position of the feature point FP1 in area A1, the display position of the feature point FP2 in area A2, and the display position of the feature point FP3 in area A3, together with the positional relationship between position C3 and position C4; the predicted position is the display position of the feature point FP4 in area A4.
  • By accumulating such predictions, the three-dimensional information acquisition unit 1211 can estimate the position and posture of the surgical camera 110 using the above self-position estimation technique. As a result, by storing the estimated position continually over a certain period, the three-dimensional information acquisition unit 1211 can grasp the position of the surgical site even when the surgical camera 110 is moved by the patient's body movement during the operation and the surgical site is not displayed on the output device 140.
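  • As an illustrative sketch (not part of the original disclosure), the following Python code shows the kind of feature-point detection and inter-frame tracking that such self-position estimation builds on, here using OpenCV's ORB detector and Lucas-Kanade optical flow; a full SLAM pipeline would additionally triangulate the tracked points and estimate the camera pose from them.

        import cv2
        import numpy as np

        def track_feature_points(prev_frame_bgr, next_frame_bgr):
            # Detect feature points in one white light frame and follow them into
            # the next frame; the point correspondences are the raw material for
            # estimating camera position/posture and 3D structure.
            prev_gray = cv2.cvtColor(prev_frame_bgr, cv2.COLOR_BGR2GRAY)
            next_gray = cv2.cvtColor(next_frame_bgr, cv2.COLOR_BGR2GRAY)

            orb = cv2.ORB_create(nfeatures=500)
            keypoints = orb.detect(prev_gray, None)
            pts = np.float32([kp.pt for kp in keypoints]).reshape(-1, 1, 2)

            # Pyramidal Lucas-Kanade optical flow tracks each point into the
            # next frame; status == 1 marks successfully tracked points.
            next_pts, status, _err = cv2.calcOpticalFlowPyrLK(
                prev_gray, next_gray, pts, None)
            good = status.ravel() == 1
            return pts[good].reshape(-1, 2), next_pts[good].reshape(-1, 2)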
  • FIGS. 5A and 5B are explanatory diagrams of a method of acquiring an image of the surgical site with the surgical camera 110 whose position and posture have been estimated.
  • Because the position and posture of the surgical camera 110 have been estimated, the three-dimensional information acquisition unit 1211 can predict the position of a region even when that region is not currently displayed. The user can therefore move the surgical camera 110 to a desired position and display the region of blood B on the output device 140; furthermore, as shown in FIG. 5B, since the position of the image display region is grasped, a specific part can be enlarged and displayed.
  • The blood detection unit 1212 detects blood in the region displayed in a white light image, based on the color information constituting the white light image acquired by the surgical camera 110 and the three-dimensional information acquired by the three-dimensional information acquisition unit 1211, and acquires blood distribution information. Specifically, a discriminator for identifying, from a white light image, the blood present in that image is created in advance, and blood distribution information is obtained by applying the created discriminator to the white light image acquired by the surgical camera 110.
  • FIG. 6A is an explanatory diagram of an example of a method for detecting blood.
  • FIG. 6B is a schematic diagram illustrating an example of a feature plane created to detect blood.
  • FIG. 6C is a schematic diagram illustrating an image in which a blood distribution is displayed.
  • Specifically, FIG. 6A shows, as a learning image, an image in which blood B is present on part of an organ surface.
  • In FIG. 6A, a point P1 indicates a portion learned as blood, and a point P2 indicates a portion learned as an organ region.
  • The discriminator is created based on, for example, color information such as the RGB values of the blood region and of the regions other than the blood region displayed in the learning image, the surface roughness calculated from the three-dimensional information of the blood region and from the three-dimensional information of the organ region where the organ surface is displayed, the standard deviation of the surface roughness, and the like.
  • The blood region and the organ region are learned as described above, and learning result data is created.
  • From the learning result data, a feature amount plane is created, for example as shown in FIG. 6B, with color information on the vertical axis and surface roughness on the horizontal axis, and an identification surface f is specified that indicates the boundary between the feature amount region occupied by the combinations of color information and surface roughness exhibited by the blood region and the feature amount region occupied by the combinations exhibited by the organ region. A discriminator that identifies the blood present in an image from the image need only be constructed in this way, for example.
  • When detecting blood, the blood detection unit 1212 uses the constructed discriminator (for example, the identification surface f in the feature amount plane shown in FIG. 6B) to specify the locations where blood is present in the image acquired by the surgical camera 110. Specifically, the blood detection unit 1212 divides the image acquired by the surgical camera 110 into ranges of a predetermined size, and calculates, for each divided range of the image, the RGB values and the surface roughness derived from the three-dimensional information acquired by the three-dimensional information acquisition unit 1211. The blood detection unit 1212 then plots the obtained combination of RGB values and surface roughness for each divided range on the feature amount plane and, based on the information of the identification surface f, determines in which region the plot falls, i.e., whether each divided range of the image is blood. In this way, as shown in FIG. 6C, blood distribution information on the distribution of blood B in the image acquired by the surgical camera 110 can be acquired.
  • The blood detection unit 1212 can also detect the presence of a foreign substance by the above method. For example, when a surgical instrument is left in the surgical site, the color information and surface roughness of the area where the surgical instrument is displayed differ greatly from those of the blood region and of the organ region. Accordingly, the positions plotted on the feature plane for the area where the surgical instrument is displayed differ significantly from the positions plotted for the blood region and the organ region, so the blood detection unit 1212 can detect the presence of the foreign substance. The presence of a foreign object can also be detected using various known object recognition techniques instead of the discriminator-based method described above; for example, by recognizing the various surgical tools used in the surgery as objects, it is possible to detect whether a surgical tool has been left behind.
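  • A minimal sketch of such a two-feature discriminator, assuming synthetic training data and a linear SVM in place of whatever learning method an actual implementation would use (the patent does not prescribe one):

        import numpy as np
        from sklearn.svm import SVC

        rng = np.random.default_rng(0)

        # Synthetic learning data on the feature amount plane: each sample is
        # (color feature, surface roughness) for one divided image range. Here
        # blood patches are assumed redder and smoother than organ patches.
        blood = np.column_stack([rng.normal(0.8, 0.05, 200),
                                 rng.normal(0.2, 0.05, 200)])
        organ = np.column_stack([rng.normal(0.5, 0.05, 200),
                                 rng.normal(0.6, 0.10, 200)])
        X = np.vstack([blood, organ])
        y = np.array([1] * 200 + [0] * 200)  # 1 = blood, 0 = organ

        # The linear decision boundary plays the role of the identification
        # surface f separating the two feature amount regions.
        clf = SVC(kernel="linear").fit(X, y)

        # Classify new divided ranges by where they fall on the plane.
        patches = np.array([[0.78, 0.22], [0.52, 0.55]])
        print(clf.predict(patches))  # expected: [1 0]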
  • The feature amount detection unit 1220 detects feature amounts related to blood in each region displayed in the image, using at least the three-dimensional information and the blood distribution information acquired from the feature point detection unit 1210. As illustrated in FIG. 2, the feature amount detection unit 1220 according to the embodiment of the present disclosure includes a bleeding determination unit 1221, a bleeding position specifying unit 1222, and a blood volume calculation unit 1223. The bleeding determination unit 1221 determines whether or not bleeding has occurred in the region displayed in the image acquired by the surgical camera 110.
  • The bleeding position specifying unit 1222 has a function of using a plurality of images acquired at different time points and specifying a bleeding position in the surgical site based on the temporal change of the surgical site between the plurality of images.
  • The blood volume calculation unit 1223 calculates the volume of the blood displayed in the image acquired by the surgical camera 110.
  • The bleeding determination unit 1221 determines whether bleeding has occurred in the region displayed in the image based on the blood distribution information of a plurality of white light images acquired at different time points. Specifically, the bleeding determination unit 1221 calculates, from the blood distribution information of the white light image at each time point, the area of the region where blood is present and the amount of change in that area, and can determine that bleeding has occurred when the amount of change in the area of the blood region exceeds a predetermined threshold. The bleeding determination unit 1221 can also determine whether or not bleeding has occurred based on the amount of change in blood volume calculated by the blood volume calculation unit 1223 described later.
  • The area of the blood region can be calculated from the number of pixels of the blood region in the acquired white light image and the magnification at which the white light image is displayed, and the amount of change in the area can be calculated from the temporal change of the calculated area of the blood region.
  • FIG. 7 is an explanatory diagram of the method for determining whether or not bleeding has occurred.
  • The graph shown in the upper part of FIG. 7 shows the transition of the area of the blood region in the acquired white light images, with the area of the blood region on the vertical axis and time on the horizontal axis.
  • The graph shown in the middle part of FIG. 7 shows the transition of the amount of change in the area of the blood region in the acquired white light images, with the amount of change on the vertical axis and time on the horizontal axis.
  • The lower part of FIG. 7 shows the white light images acquired at times t1 to t6.
  • The interval between consecutive times t1 to t6 is, for example, the time for 10 frames to elapse.
  • A threshold T is set for the amount of change in the area of the blood region (this threshold T may be set in advance by a third party or set by the user), and when the amount of change in the area of the blood region exceeds the threshold T, the bleeding determination unit 1221 can determine that bleeding has occurred. In FIG. 7, bleeding is determined to have occurred at time t2.
  • To obtain the amount of change, the bleeding determination unit 1221 performs image processing on the white light image at each time point and calculates the difference in the area of the blood region between the images at consecutive time points.
  • That is, the amount of change in the area of the blood region is the difference in the area of the blood region between the images at consecutive time points, as shown in the lower part of FIG. 7. Consequently, if the position of the surgical camera 110 deviates from its position at the previous time point, the blood distribution displayed in the image shifts accordingly, and the area of the blood region at the predetermined surgical site and its amount of change may not be calculated accurately. In that case, the area of the blood region can be calculated accurately by correcting the positional shift based on the three-dimensional information acquired by the three-dimensional information acquisition unit 1211.
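  • A sketch of the area-change threshold check described above, assuming position-corrected, time-ordered boolean blood masks as input (the mask extraction itself would come from the blood detection unit); the parameter names are illustrative:

        import numpy as np

        def detect_bleeding_onset(blood_masks, pixel_area_mm2, threshold_mm2):
            # Blood-region area at each time point, converted to physical units
            # via the per-pixel area implied by the display magnification.
            areas = [mask.sum() * pixel_area_mm2 for mask in blood_masks]
            # Bleeding is judged to occur at the first time point whose area
            # increase over the previous time point exceeds the threshold T.
            for t in range(1, len(areas)):
                if areas[t] - areas[t - 1] > threshold_mm2:
                    return t
            return None  # no change exceeded the threshold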
  • The bleeding position specifying unit 1222 specifies the bleeding position by tracing the transition of the blood distribution, based on the blood distribution information at each of the time points at which the plurality of white light images were acquired.
  • FIG. 8 is an explanatory diagram of the method of specifying a bleeding position.
  • The graph shown in the upper part of FIG. 8 shows the area of the blood region in the acquired white light images, with the area of the blood region on the vertical axis and time on the horizontal axis.
  • The middle part of FIG. 8 shows the white light images acquired at times t1 to t6.
  • The interval between consecutive times t1 to t6 is, for example, the time for 10 frames to elapse.
  • The lower part of FIG. 8 shows the differences between the white light images shown in the middle part of FIG. 8.
  • The leftmost diagram in the lower part of FIG. 8 shows the difference between the white light image acquired at time t3 and the white light image acquired at time t2.
  • The second diagram from the left in the lower part of FIG. 8 shows the difference between the white light image acquired at time t4 and the white light image acquired at time t3.
  • Likewise, the third and fourth diagrams from the left in the lower part of FIG. 8 show, respectively, the difference between the white light image acquired at time t5 and the white light image acquired at time t4, and the difference between the white light image acquired at time t6 and the white light image acquired at time t5.
  • The bleeding position specifying unit 1222 can specify the bleeding position by tracing the blood region displayed in the white light image at each time point back in the time direction.
  • In the example of FIG. 8, a difference in the blood region first appears at time t2, at which the bleeding occurred.
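  • The backtracking idea can be sketched as follows (a hypothetical helper, assuming the same position-corrected masks as above): the centroid of the earliest inter-frame difference in the blood region approximates the bleeding position.

        import numpy as np

        def locate_bleeding_position(blood_masks):
            # Scan through the time-ordered masks; the first frame in which new
            # blood pixels appear marks when the bleeding started, and the
            # centroid of those new pixels approximates the bleeding position.
            for t in range(1, len(blood_masks)):
                new_blood = blood_masks[t] & ~blood_masks[t - 1]
                if new_blood.any():
                    ys, xs = np.nonzero(new_blood)
                    return t, (float(ys.mean()), float(xs.mean()))
            return None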
  • In this way, the user can recognize the bleeding position more accurately than with conventional techniques.
  • Furthermore, the bleeding position specifying unit 1222 can grasp the transition of the blood distribution over time. The bleeding position specifying unit 1222 can therefore estimate the future blood distribution by comparing the blood region displayed in the white light image at a past time point with the blood region displayed in the white light image at the latest time point, and calculating the amount of change of the blood region in the direction in which the blood region is expanding.
  • For example, as shown in the graph in the upper part of FIG. 8, the bleeding position specifying unit 1222 can estimate the area of the blood region at an upcoming time t7, and, as shown at the right of the middle part of FIG. 8, can estimate the blood distribution at the future time t7.
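  • One crude way to sketch such an estimate (an assumption on our part; the patent does not specify the extrapolation method) is to grow the latest blood mask preferentially in the direction in which it has recently expanded:

        import numpy as np
        from scipy.ndimage import binary_dilation

        def estimate_future_blood(mask_past, mask_latest, steps=1):
            # Pixels where blood appeared between the two time points define
            # the direction of expansion.
            growth = mask_latest & ~mask_past
            near_growth = binary_dilation(growth, iterations=3)
            predicted = mask_latest.copy()
            for _ in range(steps):
                # Expand the region by one pixel ring, but only where the
                # frontier lies near the recent growth.
                frontier = binary_dilation(predicted) & ~predicted
                predicted |= frontier & near_growth
            return predicted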
  • In specifying the bleeding position, the bleeding position specifying unit 1222 may also perform image processing on the white light image at each time point and calculate the difference in the area of the blood region between the white light images at consecutive time points.
  • The bleeding position specifying unit 1222 can also specify the bleeding position using a plurality of laser light images acquired at different time points by irradiating the surgical site with laser light of a predetermined wavelength from the light source device 130.
  • In this case, the light source device 130 has a laser light source that generates laser light of high coherence. When the laser light irradiates the blood, it is scattered by the blood; because the phases of the scattered light differ from one another, the scattered light interferes randomly on the imaging surface of the surgical camera 110, producing a spotted light intensity distribution pattern (a so-called speckle pattern).
  • This light intensity distribution pattern can be used as blood flow information.
  • The closer a region is to the bleeding position, the greater the movement of the outflowing blood; accordingly, in a speckle image obtained from the speckle pattern described above, regions closer to the bleeding position should exhibit higher luminance values. The bleeding position specifying unit 1222 can therefore calculate a speckle image by various known methods and specify the bleeding position based on the distribution of luminance values in the calculated speckle image.
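  • As a sketch of one such known method (laser speckle contrast analysis, chosen here as an assumption; the patent leaves the method open), a flow index can be computed from the local contrast of the raw speckle frame, with faster-moving blood blurring the speckle and so lowering the contrast:

        import numpy as np
        from scipy.ndimage import uniform_filter

        def speckle_flow_map(raw_speckle, window=7):
            # Local speckle contrast K = sigma / mean over a sliding window.
            img = raw_speckle.astype(np.float64)
            mean = uniform_filter(img, size=window)
            mean_sq = uniform_filter(img * img, size=window)
            sigma = np.sqrt(np.maximum(mean_sq - mean * mean, 0.0))
            contrast = sigma / np.maximum(mean, 1e-9)
            # Fast flow blurs the speckle and lowers K, so 1/K serves as a
            # simple flow index; its peaks suggest candidate bleeding positions.
            return 1.0 / np.maximum(contrast, 1e-9)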
  • The bleeding position specifying unit 1222 may also specify the bleeding position more accurately by specifying it with each of the various methods described above and judging the information on the bleeding position obtained from those methods in an integrated manner.
  • The blood volume calculation unit 1223 calculates the volume of the blood displayed in the image acquired by the surgical camera 110.
  • For this calculation, a white light image, i.e., an image of the surgical site irradiated with white light, or a combination of a white light image and a near-infrared light image, is used.
  • For example, the blood volume calculation unit 1223 may assume that the thickness of the blood displayed in the white light image is constant, and take the product of that blood thickness and the area of the blood region displayed in the white light image as the blood volume in the displayed region.
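  • Numerically, the constant-thickness approximation is just area times assumed thickness (the values below are illustrative, not from the patent):

        # Constant-thickness blood volume estimate from a white light image.
        blood_area_mm2 = 1200.0   # measured area of the blood region
        thickness_mm = 0.5        # assumed uniform blood thickness
        volume_ml = blood_area_mm2 * thickness_mm / 1000.0  # 1 mL = 1000 mm^3
        print(f"approximate blood volume: {volume_ml:.2f} mL")  # 0.60 mL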
  • The blood volume calculation unit 1223 can also calculate the blood volume using near-infrared light images, i.e., images of the surgical site irradiated with near-infrared light. In this case, a plurality of near-infrared light images may be used for the calculation.
  • A blood volume calculation method using near-infrared light images is described below with reference to FIGS. 9A to 9D, 10A, and 10B.
  • The left diagram of FIG. 9A is a schematic side view of the surgical site irradiated with white light from above.
  • The left diagrams of FIGS. 9B, 9C, and 9D are schematic side views of the surgical site irradiated with near-infrared light of wavelength λ1, wavelength λ2, and wavelength λ3, respectively.
  • The right diagram of FIG. 9A is a schematic diagram showing the image, acquired by the surgical camera 110, of the surgical site irradiated with white light.
  • The right diagrams of FIGS. 9B, 9C, and 9D are schematic diagrams showing the images, acquired by the surgical camera 110, of the surgical site irradiated with near-infrared light of wavelengths λ1, λ2, and λ3, respectively.
  • The left diagram of FIG. 10A is a schematic diagram showing a white light image, and the right diagram of FIG. 10A is a schematic diagram showing the blood region detected by the blood detection unit 1212 using that white light image.
  • The left diagram of FIG. 10B is a schematic diagram showing a near-infrared light image of the same region as the image shown in the left diagram of FIG. 10A, and the right diagram of FIG. 10B is a schematic diagram showing the blood region detected by the blood detection unit 1212 using that near-infrared light image.
  • Since light of wavelengths corresponding to red is reflected at the surface of a substance, the white light WL is reflected at the surfaces of blood B1, blood B2, and blood B3, which have different thicknesses, as shown in FIG. 9A. Blood B1, blood B2, and blood B3 are therefore all displayed in the acquired white light image.
  • Near-infrared light, on the other hand, is absorbed by hemoglobin in blood.
  • Near-infrared light of shorter wavelength is absorbed more strongly by hemoglobin in blood, and near-infrared light of longer wavelength is absorbed less strongly; therefore, as shown in FIG. 9B, in the near-infrared light image acquired by irradiation with near-infrared light of wavelength λ1, the blood is displayed.
  • In the near-infrared light image acquired by irradiation with near-infrared light of wavelength λ2, as shown in FIG. 9C, most of the near-infrared light of wavelength λ2 passes through blood B1 and blood B2 and is reflected at the surface of the organ beneath the blood; blood B1 and blood B2 are therefore not recognized by the surgical camera 110. In blood B3, which is thicker than blood B1 and blood B2, most of the near-infrared light of wavelength λ2 is absorbed; in the near-infrared light image acquired by irradiation with near-infrared light of wavelength λ2, blood B3 is therefore displayed differently from the portions other than blood B3.
  • In the near-infrared light image acquired by irradiation with near-infrared light of wavelength λ3, as shown in FIG. 9D, most of the near-infrared light of wavelength λ3 passes through even the thicker blood B3 and is reflected at the surface of the organ beneath the blood; none of blood B1, blood B2, and blood B3 is therefore displayed. Accordingly, as shown in the left diagram of FIG. 10A, blood B1 and blood B2 are both displayed in the white light image, and as shown in the right diagram of FIG. 10A, the blood detection unit 1212 can detect both blood B1 and blood B2. On the other hand, as shown in FIG. 10B, in the near-infrared light image the blood detection unit 1212 can detect only blood B2.
  • In this way, which blood is displayed in a near-infrared light image varies with the wavelength of the irradiated near-infrared light. It is therefore possible to associate wavelength with blood thickness by taking the differences between a plurality of near-infrared light images obtained by irradiating the surgical site with near-infrared light of different wavelengths from the light source device 130.
  • From these differences, the thickness of the blood can be calculated. The product of each calculated blood thickness and the area of the blood region having that thickness can then be obtained, and the sum of the products can be used as the blood volume in the displayed region.
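  • A sketch of this wavelength-difference volume estimate, under the assumption that the wavelengths are ordered so that each longer wavelength penetrates deeper and therefore reveals only thicker blood (so each successive mask is ideally a subset of the previous one), with representative per-band thicknesses chosen by calibration (all names and values are illustrative):

        import numpy as np

        def blood_volume_from_nir(nir_masks, band_thickness_mm, pixel_area_mm2):
            # nir_masks: blood masks ordered from the shortest to the longest
            # wavelength. band_thickness_mm[i]: representative thickness of
            # blood visible at wavelength i but no longer at wavelength i + 1.
            volume_mm3 = 0.0
            for i, mask in enumerate(nir_masks):
                if i + 1 < len(nir_masks):
                    band = mask & ~nir_masks[i + 1]   # thickness band i
                else:
                    band = mask  # thickest blood, visible even at the longest wavelength
                volume_mm3 += band.sum() * pixel_area_mm2 * band_thickness_mm[i]
            return volume_mm3 / 1000.0  # mm^3 to mL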
  • This blood volume calculation method using near-infrared light images can calculate the blood volume more accurately than the method using a white light image alone.
  • Furthermore, the blood volume calculation unit 1223 can calculate the amount of change in blood volume by performing the above processing on near-infrared light images acquired at a plurality of time points. Since white light is reflected at the surface of blood, a white light image may be used in place of the near-infrared light image obtained by irradiation with near-infrared light of a certain wavelength. In this way, the blood volume calculation unit 1223 can compare the blood volumes calculated in at least two of the plurality of near-infrared light images and white light images, and calculate the amount of change in blood volume in the region displayed in the near-infrared light image or the white light image.
  • The output control unit 1230 controls the content of the information output by the output device 140 and the output method. For example, the output control unit 1230 causes the output device 140 to output information such as the image acquired by the surgical camera 110, the three-dimensional information of the image, a display or audio notification informing the user that bleeding has occurred, a display of the bleeding position, and the bleeding amount.
  • The types of information that the output control unit 1230 causes the output device 140 to output may be set in advance by the user. Further, when the output control unit 1230 receives image information acquired by the surgical camera 110, three-dimensional information, information on the bleeding position, information on the blood volume, or the like, it may transmit an output instruction to the output device 140.
  • FIG. 11 is a schematic diagram illustrating an example of the information displayed on the output device 140.
  • FIG. 12 is a schematic diagram illustrating another example of the information output to the output device 140.
  • FIG. 11 shows an example in which a real-time image acquired by the surgical camera 110 (MONITOR 1), an image in which a display indicating the bleeding position is superimposed on the real-time image (MONITOR 2), a moving image of the period before and after the bleeding (MONITOR 3), an enlarged image of the bleeding position (MONITOR 4), and a graph of the bleeding amount are displayed on the output device 140.
  • The output control unit 1230 may also cause the output device 140 to display a near-infrared light image, or a laser light image, i.e., an image of the surgical site irradiated with laser light. Further, the output control unit 1230 may cause the output device 140 to display a white light image on which a display indicating the blood region is superimposed, and may cause the output device 140 to output a display or a warning sound when the bleeding amount exceeds a predetermined value.
  • The output control unit 1230 may also cause the output device 140 to display the future blood distribution estimated by the bleeding position specifying unit 1222.
  • By having the output device 140 display an image in which a display indicating the bleeding position is superimposed on the real-time image, or a moving image of the period before and after the bleeding, even a bleeding site that is conventionally difficult to find because it is covered with a thick pool of accumulated blood can be recognized easily, and the user can start hemostasis more quickly.
  • By checking the displayed bleeding amount, the user can assess the patient's vital signs and determine, for example, the timing of a transition from endoscopic surgery to open surgery or of the start of a blood transfusion.
  • By displaying the estimated future blood distribution on the output device 140, the user can recognize the state of the surgical site before blood accumulates there.
  • The output control unit 1230 may also cause the output device 140 to display an image on which displays indicating different bleeding positions are superimposed according to the degree of change of the blood region in the vicinity of each bleeding position or the moving speed of the blood; for example, the display indicating a bleeding position located near blood whose degree of change or moving speed is large may be highlighted.
  • A bleeding position near blood whose degree of change or moving speed is large is often the bleeding position that should be treated first. By having the output device 140 highlight such bleeding positions as described above, the output control unit 1230 enables the user to perform treatment such as hemostasis quickly.
  • When a bleeding position specified in the past has moved out of the region displayed in the real-time image acquired by the surgical camera 110, the output control unit 1230 can superimpose on the real-time image a display d indicating the direction of that bleeding position.
  • The output images, sounds, and the like described above may be selectively output together with the real-time image, with the user setting the output contents in advance.
  • The image in which a display indicating the bleeding position is superimposed on the real-time image, the moving image of the period before and after the bleeding, the enlarged image of the bleeding position, and the like may be displayed by the output control unit 1230 transmitting an output instruction to the output device 140 when the bleeding determination unit 1221 determines that bleeding has occurred.
  • Bleeding-related information includes all information related to bleeding at the surgical site, such as the information on the occurrence of bleeding described above, the area of the blood region, the amount of change in that area, the blood volume, the amount of change in blood volume, the display of the bleeding position, the moving image of the period before and after the bleeding, and the display indicating the direction of the bleeding position.
  • The light source switching unit 1240 switches the light emitted from the light source device 130. For example, a switching instruction is transmitted from an input device operated by the user to the light source switching unit 1240. In response to this switching instruction, the light source device 130 switches among the white light source, the near-infrared light source, and the laser light source, and irradiates the surgical site with the light desired by the user.
  • The storage unit 1250 records the images acquired by the surgical camera 110.
  • The storage unit 1250 also appropriately stores various programs, databases, and the like used when the image processing apparatus 120 performs the various kinds of processing described above. The storage unit 1250 may further record, as history information, the various types of information acquired by the feature point detection unit 1210 and the feature amount detection unit 1220, and may appropriately record the various parameters, the progress of processing, and the like that need to be saved when, for example, the blood detection unit 1212, the bleeding determination unit 1221, the bleeding position specifying unit 1222, and the blood volume calculation unit 1223 perform their respective processing.
  • Such recording is not limited to the processing executed by the blood detection unit 1212, the bleeding determination unit 1221, the bleeding position specifying unit 1222, and the blood volume calculation unit 1223; various parameters, the progress of processing, and the like that need to be saved when the image processing system 10 according to the present embodiment performs any processing may be recorded as appropriate.
  • The storage unit 1250 can be freely read from and written to by the communication unit 1260, the feature point detection unit 1210, the feature amount detection unit 1220, and the like.
  • The storage unit 1250 may store images while overwriting them, retaining each image for a length of time set in advance by the user. The storage unit 1250 may also store images so that the images within a user-preset time before and after the time at which the bleeding determination unit 1221 determines that bleeding occurred are not overwritten.
  • The communication unit 1260 has a function of performing information communication with the surgical camera 110 via a network. Specifically, the communication unit 1260 receives images from the surgical camera 110. The communication unit 1260 also has a function of transmitting a control signal to the surgical camera 110 to control its driving.
  • The control signal can include information on imaging conditions such as magnification and focal length.
  • First, the light source switching unit 1240 sets the light source of the light source device 130 to the white light source, and the surgical site on which surgery is performed is irradiated with white light.
  • Next, the surgical camera 110 acquires a white light image of the surgical site.
  • The acquired white light image is transmitted to the feature point detection unit 1210, the output control unit 1230, and the storage unit 1250 via the communication unit 1260 of the image processing apparatus 120.
  • The output control unit 1230 transmits the received white light image to the output device 140, and the output device 140 displays the received white light image as a real-time image. The white light image transmitted to the storage unit 1250 is recorded for the time preset by the user.
  • The three-dimensional information acquisition unit 1211 provided in the feature point detection unit 1210 acquires three-dimensional information on the position and shape of the region displayed in the acquired white light image.
  • The blood detection unit 1212 provided in the feature point detection unit 1210 detects the blood in the region displayed in the image, based on the color information constituting the white light image acquired by the surgical camera 110 and the three-dimensional information acquired by the three-dimensional information acquisition unit 1211, and generates blood distribution information.
  • Specifically, the blood detection unit 1212 divides the image acquired by the surgical camera 110 into ranges of a predetermined size, and calculates, for each divided range of the image, the surface roughness from the three-dimensional information acquired by the three-dimensional information acquisition unit 1211.
  • The RGB values of each divided range of the image and the calculated surface roughness are plotted on the feature plane, with color information on the vertical axis and surface roughness on the horizontal axis, and it is determined whether or not each divided range is a blood region.
  • The three-dimensional information and the blood distribution information are transmitted to the feature amount detection unit 1220 and the storage unit 1250.
  • The blood distribution information recorded in the storage unit 1250 is transmitted to the output control unit 1230 together with the white light image recorded in the storage unit 1250.
  • The blood distribution information is applied to the white light image, and the output control unit 1230 outputs an image in which a display indicating the blood region is superimposed on the white light image.
  • Next, the blood volume calculation unit 1223 provided in the feature amount detection unit 1220 calculates the blood volume using near-infrared light images.
  • First, the light source switching unit 1240 switches the light source of the light source device 130 from the white light source to the near-infrared light source. The light source is switched, for example, by the user operating the foot switch 250.
  • The light source device 130 then irradiates the surgical site with near-infrared light of different wavelengths.
  • The blood volume calculation unit 1223 calculates the blood thickness by taking the differences between the plurality of near-infrared light images obtained by irradiating the surgical site with near-infrared light of different wavelengths.
  • The blood volume calculation unit 1223 obtains the product of each calculated blood thickness and the area of the blood region having that thickness, and takes the sum of the products as the blood volume in the displayed region.
  • The blood volume calculation unit 1223 also compares the blood volumes calculated from two or more images, and calculates the amount of change in blood volume in the region displayed in the near-infrared light image or the white light image.
  • The bleeding determination unit 1221 determines the occurrence of bleeding from the amount of change in blood volume calculated by the blood volume calculation unit 1223. Specifically, a threshold is set by the user for the amount of change in blood volume, and the bleeding determination unit 1221 determines that bleeding has occurred when the amount of change in blood volume exceeds that threshold. The bleeding determination unit 1221 may also determine whether or not bleeding has occurred based on the amount of change in the area of the blood region. Specifically, the bleeding determination unit 1221 performs image processing on the white light image at each time point and calculates the difference in the area of the blood region between the images at consecutive time points; a threshold is set in advance by the user for the amount of change in the area of the blood region, and the bleeding determination unit 1221 may determine that bleeding has occurred when the amount of change exceeds that threshold.
  • the bleeding position specifying unit 1222 reads out the plurality of white light images stored in the storage unit 1250 and the blood distribution information at each time point at which those white light images were acquired.
  • the bleeding position specifying unit 1222 uses the read white light images and blood distribution information to grasp the transition of the blood distribution state. By tracing this transition back in time, the position where the blood first appeared is specified as the bleeding position, as in the sketch below.
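One simple way to realize this backtracking, assuming the stored blood distribution information is a time-ordered list of the block-wise boolean maps introduced earlier, is to walk the list from the oldest frame and take the centroid of the first blood that appears; the disclosure does not pin down the exact rule, so this is only an illustration:

```python
import numpy as np

def specify_bleeding_position(blood_maps):
    """Trace the stored blood-distribution maps back in time and return
    (frame index, (row, col)) for the block where blood first appeared,
    or None if no blood was observed in any stored frame.

    blood_maps: list of boolean arrays, oldest first.
    """
    for t, m in enumerate(blood_maps):
        if m.any():
            rows, cols = np.nonzero(m)
            return t, (float(rows.mean()), float(cols.mean()))
    return None
```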
  • information such as the blood volume calculated by the blood volume calculation unit 1223, the time at which bleeding occurred, and the specified bleeding position is transmitted to at least one of the output control unit 1230 and the storage unit 1250.
  • the output control unit 1230 outputs to the output device 140, as appropriate, the real-time image transmitted from the surgical camera 110, the amount of bleeding transmitted from the blood volume calculation unit 1223, an image on which the bleeding position specified by the bleeding position specifying unit 1222 is displayed, and the like.
  • the image displaying the bleeding position is read from the storage unit 1250. The above processing is executed as needed; a hypothetical end-to-end loop is sketched below.
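Tying the sketches above together, a per-frame loop might look like the following. The camera and output interfaces are stand-ins, and using the blood-region area as a proxy for volume in the test is an assumption made for brevity:

```python
def process_stream(frames, depths, block=32):
    """Hypothetical per-frame pipeline: detect blood, test for bleeding,
    trace back the bleeding position, and yield the annotated image."""
    blood_maps, volumes = [], []
    for rgb, depth in zip(frames, depths):
        blood_map = classify_blood_regions(rgb, depth, block=block)
        blood_maps.append(blood_map)
        # Blood-region area (in blocks) used as a stand-in for volume here.
        volumes.append(float(blood_map.sum()))
        if len(volumes) >= 2 and bleeding_occurred(volumes[-2], volumes[-1],
                                                   volume_threshold=5.0):
            found = specify_bleeding_position(blood_maps)
            if found is not None:
                t, pos = found
                print(f"bleeding suspected; blood first seen at frame {t}, block {pos}")
        yield overlay_blood_regions(rgb, blood_map, block=block)
```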
  • FIG. 13 is a block diagram illustrating a hardware configuration example of the image processing apparatus 120 according to an embodiment of the present disclosure.
  • the image processing apparatus 120 includes, for example, a CPU 1270, a ROM 1271, a RAM 1272, a host bus 1273, a bridge 1274, an external bus 1275, an interface 1276, an input device 1277, a display device 1278, an audio output device 1279, a storage device 1280, a drive 1281, a connection port 1282, and a removable storage medium 1283.
  • the hardware configuration shown here is an example, and some of the components may be omitted.
  • the hardware configuration may further include components other than the components shown here.
  • the CPU 1270 functions as, for example, an arithmetic processing device or a control device, and controls all or part of the operation of each component based on various programs recorded in the ROM 1271, the RAM 1272, the storage device 1280, or the removable storage medium 1283.
  • the ROM 1271 is a means for storing programs read by the CPU 1270, data used for calculations, and the like.
  • the RAM 1272 temporarily or permanently stores, for example, a program read by the CPU 1270 and various parameters that change as appropriate when the program is executed.
  • the CPU 1270, the ROM 1271, and the RAM 1272 are connected to each other via, for example, a host bus 1273 capable of high-speed data transmission.
  • the host bus 1273 is connected to the external bus 1275, which has a relatively low data transmission speed, via the bridge 1274, for example.
  • the external bus 1275 is connected to various components via an interface 1276.
  • as the input device 1277, for example, a mouse, a keyboard, a touch panel, a button, a switch, a lever, or the like can be used. Furthermore, a remote controller capable of transmitting control signals using infrared rays or other radio waves may be used as the input device 1277.
  • the input device 1277 includes a voice input device such as a microphone.
  • the foot switch 250 is preferably used as the input device 1277.
  • the display device 1278 is, for example, a display device such as a CRT (Cathode Ray Tube), an LCD, or an organic EL display, or a printer, and the audio output device 1279 is an audio output device such as a speaker or headphones.
  • each of the display device 1278 and the audio output device 1279 is a device capable of visually or audibly notifying the user of acquired information.
  • the storage device 1280 is a device for storing various data.
  • as the storage device 1280, for example, a magnetic storage device such as a hard disk drive (HDD), a semiconductor storage device, an optical storage device, or a magneto-optical storage device is used.
  • the drive 1281 is a device that reads information recorded on a removable storage medium 1283 such as a magnetic disk, an optical disk, a magneto-optical disk, or a semiconductor memory, or writes information to the removable storage medium 1283.
  • the removable storage medium 1283 is, for example, a DVD medium, a Blu-ray (registered trademark) medium, an HD DVD medium, various semiconductor storage media, and the like.
  • the removable storage medium 1283 may be, for example, an IC card on which a non-contact type IC chip is mounted, an electronic device, or the like.
  • the connection port 1282 is a port for connecting an external connection device 902, such as a USB (Universal Serial Bus) port, an IEEE 1394 port, a SCSI (Small Computer System Interface) port, an RS-232C port, or an optical audio terminal.
  • according to the technique of the present disclosure, it is possible to grasp the bleeding site of the surgical site more accurately during surgery. In addition, the bleeding position can be estimated stably even when the patient moves, and the user can recognize changes in the amount of bleeding. As a result, the user can quickly start the hemostasis operation, the operation time is shortened, and the amount of bleeding can be suppressed. Furthermore, since the user can recognize the amount of bleeding, it becomes possible to check vitals and to judge the timing of a transition from endoscopic surgery to laparotomy, the start of blood transfusion, and the like. This, in turn, makes it possible to accelerate the patient's post-operative recovery.
  • (1) An image processing system comprising: a light source device that has at least a white light source for emitting white light and irradiates a surgical site that is a part of a living body on which surgery is performed; a surgical camera that acquires a white light image of the surgical site irradiated with the white light by the light source device; and an image processing device that uses a plurality of the white light images acquired at different time points and identifies a bleeding position in the surgical site based on a temporal change of the surgical site between the plurality of white light images.
  • (2) The image processing system according to (1), wherein the image processing device acquires three-dimensional information of a region displayed in the white light image, detects blood present in the region displayed in the white light image based on the color information constituting the white light image and the three-dimensional information to obtain blood distribution information relating to the blood distribution, and identifies the bleeding position by tracing back the transition of the blood distribution state based on the blood distribution information at each time point when the plurality of white light images were acquired.
  • (3) The image processing system according to (1) or (2), wherein the image processing device calculates, based on the blood distribution information at each time point, at least one of the change in the area of the region where the blood exists, the blood volume present in the region displayed in the white light image, and the amount of change in the blood volume.
  • (4) The image processing system according to any one of (1) to (3), wherein the image processing device determines that bleeding has occurred in the region displayed in the white light image when the amount of change in the area exceeds a predetermined threshold.
  • (5) The image processing system according to any one of (1) to (3), wherein the image processing device determines that bleeding has occurred in the region displayed in the white light image when the amount of change in the blood volume exceeds a predetermined threshold.
  • (6) The image processing system according to (5), wherein the light source device has a near-infrared light source that emits near-infrared light having a wavelength belonging to the near-infrared light band, the surgical camera further acquires a plurality of near-infrared light images of the surgical site irradiated with the near-infrared light, and the image processing device calculates the blood volume and the change in the blood volume in a region displayed in the near-infrared light image or the white light image based on at least two or more of the near-infrared light images and the white light images.
  • (7) The image processing system according to any one of (1) to (3), wherein the image processing device estimates a future blood distribution based on the transition of the blood distribution at a plurality of different time points.
  • (8) The image processing system according to (1) or (2), wherein the image processing device detects the presence of a foreign substance in the region displayed in the white light image based on the three-dimensional information.
  • (9) The image processing system according to (1), wherein the light source device includes a laser light source that emits laser light having a predetermined wavelength, the surgical camera further acquires a laser light image of the surgical site irradiated with the laser light, and the image processing device acquires blood flow information related to blood flow using the laser light image acquired in advance and specifies the bleeding position based on the blood flow information.
  • (10) When the image processing device determines that bleeding has occurred in the region displayed in the white light image, the image processing device outputs bleeding information related to the bleeding.
  • the image processing device outputs the identified bleeding position by a predetermined method.
  • the image processing device outputs the white light image from a starting point that is a predetermined time after the time point at which the white light image in which bleeding was determined to have occurred was acquired, until a predetermined time elapses from that starting point.
  • the image processing device outputs a predetermined signal notifying the occurrence of bleeding when at least one of the blood volume and the amount of change in the blood volume exceeds a predetermined threshold.
  • the image processing device displays a display indicating the bleeding position superimposed on the white light image at the latest time point (the image processing system according to (1)).
  • the image processing device changes the display indicating the bleeding position in the region displayed in the white light image in accordance with the amount of change in the blood volume or the amount of change in the area of the region where the blood exists (the image processing system according to any one of (1) to (3)).
  • the image processing device outputs a display indicating the direction of the bleeding position when the bleeding position deviates from the region displayed in the white light image acquired by the surgical camera.
  • the image processing device outputs the change in the amount of bleeding in the region displayed in the white light image.
  • the image processing device outputs an estimated future blood distribution based on the transition of the blood distribution at a plurality of different time points.
  • An image processing method comprising: using a plurality of white light images obtained at different time points and specifying a bleeding position in the surgical site based on a temporal change of the surgical site between the plurality of white light images.
  • A program for causing a computer, capable of mutual communication with both a light source device that has at least a white light source emitting white light and irradiates a surgical site that is a part of a living body on which surgery is performed, and a surgical camera that acquires a white light image of the surgical site irradiated with the white light by the light source device, to function as a specifying unit that uses a plurality of the white light images acquired at different time points by the surgical camera and specifies a bleeding position in the surgical site based on a temporal change of the surgical site between the plurality of images.

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Surgery (AREA)
  • Engineering & Computer Science (AREA)
  • Biomedical Technology (AREA)
  • Molecular Biology (AREA)
  • Pathology (AREA)
  • Radiology & Medical Imaging (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Biophysics (AREA)
  • Physics & Mathematics (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Medical Informatics (AREA)
  • Optics & Photonics (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Endoscopes (AREA)
  • Closed-Circuit Television Systems (AREA)
  • Image Analysis (AREA)

Abstract

The problem addressed by the present invention is to provide an image processing system, an image processing device, an image processing method, and a program that make it possible to identify bleeding locations in a surgical site more precisely during surgery. To this end, the image processing system of the present invention comprises: a light source device that includes at least a white light source emitting white light and that irradiates, with the white light, a surgical site that is a part of a living body on which surgery is performed; a surgical camera that obtains white light images of the surgical site irradiated with the white light by the light source device; and an image processing device that uses a plurality of white light images obtained at different times to identify a bleeding location in the surgical site on the basis of changes over time in the surgical site between the plurality of white light images.
PCT/JP2019/004770 2018-04-17 2019-02-12 Système de traitement d'image, dispositif de traitement d'image, procédé de traitement d'image, et programme WO2019202827A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2018079066A JP2021112220A (ja) 2018-04-17 2018-04-17 画像処理システム、画像処理装置、画像処理方法及びプログラム
JP2018-079066 2018-04-17

Publications (1)

Publication Number Publication Date
WO2019202827A1 true WO2019202827A1 (fr) 2019-10-24

Family

ID=68239509

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2019/004770 WO2019202827A1 (fr) 2018-04-17 2019-02-12 Système de traitement d'image, dispositif de traitement d'image, procédé de traitement d'image, et programme

Country Status (2)

Country Link
JP (1) JP2021112220A (fr)
WO (1) WO2019202827A1 (fr)


Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2024004013A1 (fr) * 2022-06-28 2024-01-04 国立研究開発法人国立がん研究センター Programme, procédé de traitement d'informations, et dispositif de traitement d'informations


Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2011036372A (ja) * 2009-08-10 2011-02-24 Tohoku Otas Kk 医療画像記録装置
JP2011142929A (ja) * 2010-01-12 2011-07-28 National Institute Of Advanced Industrial Science & Technology 低侵襲血管新生計測装置
JP2015529489A (ja) * 2012-07-25 2015-10-08 インテュイティブ サージカル オペレーションズ, インコーポレイテッド 手術用システムにおける効率的且つインタラクティブな出血検出

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP3666166A1 (fr) * 2018-12-10 2020-06-17 Covidien LP Système et procédé pour générer un modèle tridimensionnel d'un site chirurgical
US11045075B2 (en) 2018-12-10 2021-06-29 Covidien Lp System and method for generating a three-dimensional model of a surgical site
US11793402B2 (en) 2018-12-10 2023-10-24 Covidien Lp System and method for generating a three-dimensional model of a surgical site
JP6875038B1 (ja) * 2020-01-30 2021-05-19 アナウト株式会社 コンピュータプログラム及び映像再生方法
WO2021152948A1 (fr) * 2020-01-30 2021-08-05 アナウト株式会社 Programme informatique et procédé de lecture d'image
JP2021121095A (ja) * 2020-01-30 2021-08-19 アナウト株式会社 コンピュータプログラム及び映像再生方法
WO2022190740A1 (fr) 2021-03-09 2022-09-15 富士フイルム株式会社 Système d'endoscope et son procédé de fonctionnement
WO2022191128A1 (fr) 2021-03-09 2022-09-15 富士フイルム株式会社 Système d'endoscope et son procédé de fonctionnement
WO2022191129A1 (fr) 2021-03-09 2022-09-15 富士フイルム株式会社 Système d'endoscope et son procédé de fonctionnement

Also Published As

Publication number Publication date
JP2021112220A (ja) 2021-08-05

Similar Documents

Publication Publication Date Title
WO2019202827A1 (fr) Système de traitement d'image, dispositif de traitement d'image, procédé de traitement d'image, et programme
US11776144B2 (en) System and method for determining, adjusting, and managing resection margin about a subject tissue
US20220095903A1 (en) Augmented medical vision systems and methods
JP6834184B2 (ja) 情報処理装置、情報処理装置の作動方法、プログラム及び医療用観察システム
JP4698966B2 (ja) 手技支援システム
CN115243636A (zh) 关联可视化数据和供电外科器械数据的外科系统
CN115279298A (zh) 通过外科系统分析外科手术趋势并提供用户推荐
US8355043B2 (en) Medical apparatus
CN115087406A (zh) 根据手术烟雾云特征的自适应外科系统控制
CN114901203A (zh) 外科系统的自适应可视化
EP3426128B1 (fr) Dispositif de traitement d'image, système de chirurgie endoscopique et procédé de traitement d'image
JP2015523102A (ja) 手術及び介入処置での追跡及び制御の為のデュアルモードステレオイメージングシステム
JP2019162339A (ja) 手術支援システムおよび表示方法
JP2020527374A (ja) 医用イメージングシステム、方法およびコンピュータプログラム製品
JPWO2018168261A1 (ja) 制御装置、制御方法、及びプログラム
US20220218427A1 (en) Medical tool control system, controller, and non-transitory computer readable storage
JP2018515159A (ja) 人間又は動物の体内の関心構造を照射する装置、システム、及び方法
WO2020165978A1 (fr) Dispositif d'enregistrement d'image, procédé d'enregistrement d'image et programme d'enregistrement d'image
WO2016185912A1 (fr) Appareil de traitement d'image, procédé de traitement d'image et système chirurgical
CN116096309A (zh) 腔内机器人(elr)系统和方法
US20210295980A1 (en) Medical image processing apparatus, trocar, medical observation system, image processing method, and computer readable recording medium
WO2018198255A1 (fr) Dispositif de traitement d'image, procédé de commande pour dispositif de traitement d'image, et programme
JP7146318B1 (ja) コンピュータプログラム、学習モデルの生成方法、及び手術支援装置
US20220409300A1 (en) Systems and methods for providing surgical assistance based on operational context
WO2020009127A1 (fr) Système d'observation médicale, dispositif d'observation médicale et procédé de commande de dispositif d'observation médicale

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19787946

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19787946

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP