WO2019065111A1 - Medical image processing system, endoscope system, diagnosis support device, and medical service support device - Google Patents

Medical image processing system, endoscope system, diagnosis support device, and medical service support device

Info

Publication number
WO2019065111A1
Authority
WO
WIPO (PCT)
Prior art keywords
diagnostic information
unit
image
information
attention area
Prior art date
Application number
PCT/JP2018/032663
Other languages
English (en)
Japanese (ja)
Inventor
徹郎 江畑
Original Assignee
富士フイルム株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 富士フイルム株式会社 filed Critical 富士フイルム株式会社
Priority to JP2019544482A priority Critical patent/JP6796725B2/ja
Publication of WO2019065111A1 publication Critical patent/WO2019065111A1/fr

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 1/00 Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 1/00 Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
    • A61B 1/04 Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes, combined with photographic or television appliances
    • A61B 1/045 Control thereof
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00 Measuring for diagnostic purposes; Identification of persons

Definitions

  • the present invention relates to a medical image processing system for extracting a region of interest from a medical image, an endoscope system, a diagnosis support device, and a medical service support device.
  • In recent years, diagnostic information about a pathological condition has been obtained by extracting, from a medical image, a region of interest that may contain a lesion and performing image analysis on the extracted region.
  • In the endoscope system of Patent Document 1, whether the state of the observation target is lesional or normal is judged from the distance, in a feature space based on pixel values, between the region where the pixel values of a normal mucous membrane are distributed and the specific region where the pixel values of the observation target imaged by the endoscope are distributed.
  • However, depending on imaging conditions such as the imaging angle, the distance to the observation target, the scale (e.g. size) of the observation target, brightness, defocus blur, motion blur, and water mist, as well as on differences among patients, incorrect diagnostic information may be calculated for the extracted attention area. In such a case, usability may be reduced, and serious misdiagnosis may even occur.
  • In addition, the examination content and the lesions desired to be displayed as attention areas may differ for each patient.
  • Moreover, when an expert performs the examination, an obvious lesion can be recognized as a lesion without being automatically extracted as a region of interest; in such a case, automatically extracting and displaying it as a region of interest actually reduces usability.
  • An object of the present invention is to provide a medical image processing system, an endoscope system, a diagnosis support device, and a medical service support device that suppress the presentation of erroneous diagnostic information to the user and can present appropriate diagnostic information to the user.
  • The medical image processing system of the present invention comprises: an image acquisition unit that acquires a medical image obtained by imaging an observation target; an attention area extraction unit that extracts, from the medical image, an attention area including a target of interest; a diagnostic information determination unit that determines, for each attention area, diagnostic information related to the target of interest; a display unit that displays the attention area and the diagnostic information determined for it; a diagnostic information designation unit that designates first diagnostic information from among the diagnostic information displayed on the display unit; and a display control unit that performs control so that the first attention area for which the diagnostic information determination unit determined the first diagnostic information is not displayed on the display unit.
  • The designation of the first diagnostic information is preferably performed by a position designation operation that designates the position of the first diagnostic information.
  • The designation of the first diagnostic information may also be made by a switching operation that instructs the display unit to switch among a plurality of pieces of diagnostic information and a confirmation operation that selects the first diagnostic information from among the diagnostic information shown in the switching display.
  • It is preferable to provide a similar image storage unit that stores similar images, which are images similar to the image of the first attention area, together with similar image diagnostic information, i.e. the diagnostic information determined for those similar images; the display unit displays the similar images and the similar image diagnostic information, and the diagnostic information designation unit designates the first diagnostic information from among the similar image diagnostic information stored in the similar image storage unit.
  • the display unit preferably displays a list of similar images and similar image diagnostic information.
  • It is preferable to provide a specific attention target tracking unit that tracks the first attention target included in the first attention area, with the display control unit performing control so that the tracked first attention target is not displayed on the display unit as the first attention area.
  • The endoscope system of the present invention comprises an endoscope for imaging an observation target, an image acquisition unit for acquiring a medical image obtained by imaging the observation target, and an attention area extraction unit for extracting an attention area including the target of interest from the medical image.
  • the diagnostic support apparatus of the present invention includes the medical image processing system of the present invention described above.
  • The medical service support apparatus of the present invention includes the medical image processing system of the present invention described above.
  • According to the present invention, it is possible to suppress the presentation of erroneous diagnostic information to the user and to present appropriate diagnostic information to the user.
  • Brief description of the drawings: an external view of the endoscope system; a block diagram of the endoscope system; a block diagram showing the functions of the image processing unit; an image of the monitor displaying a plurality of attention areas and the diagnostic information determined for each attention area; an explanatory drawing showing the designation …
  • an image of the monitor displaying similar images and similar image diagnostic information as a list; an explanatory drawing showing the correction processing; an explanatory drawing showing the tracking processing of the first attention target.
  • the endoscope system 10 includes an endoscope 12, a light source device 14, a processor device 16, a monitor 18, and a user interface 19.
  • the endoscope 12 emits illumination light to an object to be observed, and captures an image of the object illuminated by the illumination light.
  • the light source device 14 generates illumination light for irradiating a subject.
  • the processor device 16 performs system control, image processing, and the like of the endoscope system 10.
  • the monitor 18 is a display unit that displays an image output from the processor device 16.
  • the user interface 19 is an input device for performing setting input and the like to the processor device 16 and the like, and includes a keyboard KB, a mouse MS, a foot switch FS and the like.
  • the user interface 19 is not limited to the keyboard KB, the mouse MS, and the foot switch FS, and may be a graphical user interface, voice input, a touch display, or the like. Further, the medical image processing system of the present invention includes an image acquisition unit 54 and an image processing unit 61 provided in the processor device 16, a monitor 18, and a user interface 19.
  • The endoscope 12 includes an insertion portion 12a to be inserted into a subject, an operation portion 12b provided at the proximal end of the insertion portion 12a, and a bending portion 12c and a distal end portion 12d provided on the distal end side of the insertion portion 12a.
  • the bending portion 12c is bent by operating the angle knob 12e of the operation portion 12b.
  • Bending the bending portion 12c causes the distal end 12d to face in a desired direction.
  • the tip end 12d is provided with an injection port (not shown) for injecting air, water or the like toward the subject.
  • the operation unit 12b is provided with a zoom operation unit 13a.
  • a forceps channel (not shown) for inserting a treatment tool or the like is provided from the insertion portion 12a to the distal end portion 12d. The treatment tool is inserted into the forceps channel from the forceps inlet 12f.
  • the light source device 14 includes a light source unit 20 and a light source control unit 22.
  • the light source unit 20 emits illumination light for illuminating a subject.
  • the light source unit 20 includes one or more light sources.
  • the light source control unit 22 controls the drive of the light source unit 20.
  • the light source control unit 22 independently controls the timing of turning on or off the light source constituting the light source unit 20, the light emission amount at the time of lighting, and the like. As a result, the light source unit 20 can emit plural types of illumination lights having different light emission amounts and light emission timings.
  • the illumination light emitted by the light source unit 20 is incident on the light guide 41.
  • The light guide 41 is incorporated in the endoscope 12 and the universal cord (not shown), and propagates the illumination light to the distal end 12d of the endoscope 12.
  • the universal cord is a cord that connects the endoscope 12 to the light source device 14 and the processor device 16.
  • As the light guide 41, a multimode fiber can be used. As an example, it is possible to use a thin fiber cable having a core diameter of 105 µm, a cladding diameter of 125 µm, and an overall diameter, including the protective outer layer, of φ0.3 to 0.5 mm.
  • An illumination optical system 30a and an imaging optical system 30b are provided at the distal end 12d of the endoscope 12.
  • The illumination optical system 30a has an illumination lens 45, and illumination light is emitted toward the subject through the illumination lens 45.
  • The imaging optical system 30b includes an objective lens 46, a zoom lens 47, and an image sensor 48.
  • The image sensor 48 images the subject using light returning from the subject via the objective lens 46 and the zoom lens 47 (not only reflected illumination light but also scattered light, fluorescence emitted from the subject, fluorescence from a drug administered to the subject, and the like).
  • The zoom lens 47 is moved by operating the zoom operation unit 13a, magnifying or reducing the subject imaged by the image sensor 48.
  • The image sensor 48 is, for example, a color sensor having primary color filters, and has three types of pixels: B pixels (blue pixels) with blue color filters, G pixels (green pixels) with green color filters, and R pixels (red pixels) with red color filters. The blue color filter transmits mainly violet to blue light. The green color filter transmits mainly green light. The red color filter transmits mainly red light. When a subject is imaged using the primary color image sensor 48, up to three types of images can therefore be obtained simultaneously: a B image (blue image) from the B pixels, a G image (green image) from the G pixels, and an R image (red image) from the R pixels.
  • As the image sensor 48, a charge-coupled device (CCD) sensor or a complementary metal-oxide-semiconductor (CMOS) sensor can be used.
  • the complementary color sensor includes, for example, a cyan pixel provided with a cyan color filter, a magenta pixel provided with a magenta color filter, a yellow pixel provided with a yellow color filter, and a green pixel provided with a green color filter.
  • the image obtained from the pixels of each color when using a complementary color sensor can be converted into a B image, a G image, and an R image by performing complementary-primary color conversion.
  • a monochrome sensor without a color filter can be used as the image sensor 48. In this case, an image of each color can be obtained by sequentially imaging the subject using illumination light of each color such as BGR.
  • the processor device 16 includes a central control unit 52, an image acquisition unit 54, an image processing unit 61, and a display control unit 66.
  • the central control unit 52 performs overall control of the endoscope system 10 such as synchronous control of the irradiation timing of the illumination light and the imaging timing.
  • The central control unit 52 also receives the various settings that are input and passes them to each part of the endoscope system 10, such as the light source control unit 22, the image sensor 48, or the image processing unit 61.
  • the image acquisition unit 54 acquires, from the image sensor 48, an image obtained by imaging a subject.
  • the image acquired by the image acquisition unit 54 is an image acquired by a medical device such as the endoscope 12 and thus is referred to as a medical image.
  • the image acquisition unit 54 includes a DSP (Digital Signal Processor) 56, a noise reduction unit 58, and a conversion unit 59, and uses these to perform various processes on the acquired medical image as needed.
  • the DSP 56 performs various processing such as defect correction processing, offset processing, gain correction processing, linear matrix processing, gamma conversion processing, demosaicing processing, and YC conversion processing, as necessary, on the acquired medical image.
  • the defect correction process is a process of correcting the pixel value of the pixel corresponding to the defective pixel of the image sensor 48.
  • the offset process is a process of reducing the dark current component from the image subjected to the defect correction process and setting an accurate zero level.
  • the gain correction process is a process of adjusting the signal level of each image by multiplying the image subjected to the offset process by the gain.
  • the linear matrix processing is processing for improving the color reproducibility of the image subjected to the offset processing, and the gamma conversion processing is processing for adjusting the brightness and saturation of the image after the linear matrix processing.
  • demosaicing processing is processing for interpolating the pixel value of a missing pixel, and is applied to an image after gamma conversion processing.
  • the missing pixels are pixels having no pixel value due to the arrangement of the color filters (because pixels of other colors are arranged in the image sensor 48).
  • For example, the demosaicing process interpolates the B image to generate pixel values at the positions of the G and R pixels of the image sensor 48.
  • the YC conversion process is a process of converting the image after the demosaicing process into a luminance channel Y, a color difference channel Cb, and a color difference channel Cr.
  • the noise reduction unit 58 performs noise reduction processing on the luminance channel Y, the color difference channel Cb, and the color difference channel Cr using, for example, a moving average method or a median filter method.
  • the conversion unit 59 reconverts the luminance channel Y, the color difference channel Cb, and the color difference channel Cr after the noise reduction processing into an image of each color of BGR.
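  As an illustration of the YC conversion and noise-reduction steps just described, here is a minimal sketch in Python, assuming standard BT.601 luma/chroma coefficients and a median filter; the patent names the channels and the moving-average/median methods but specifies neither the conversion matrix nor the filter parameters, so those are assumptions.

      import numpy as np
      from scipy.ndimage import median_filter

      def yc_convert(rgb):
          """Convert an RGB image (H, W, 3), float in [0, 1], into the
          luminance channel Y and color difference channels Cb and Cr."""
          r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
          y = 0.299 * r + 0.587 * g + 0.114 * b   # luminance channel Y
          cb = 0.564 * (b - y)                    # color difference channel Cb
          cr = 0.713 * (r - y)                    # color difference channel Cr
          return y, cb, cr

      def denoise_and_reconvert(y, cb, cr, size=3):
          """Noise-reduce each channel (as in noise reduction unit 58), then
          convert back to an RGB image (as in conversion unit 59)."""
          y, cb, cr = (median_filter(c, size=size) for c in (y, cb, cr))
          r = y + 1.402 * cr
          b = y + 1.772 * cb
          g = (y - 0.299 * r - 0.114 * b) / 0.587
          return np.stack([r, g, b], axis=-1)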
  • the image processing unit 61 performs various types of image processing on the medical image acquired by the image acquisition unit 54. Further, the image processing unit 61 extracts an attention area from the medical image, and calculates diagnosis information for supporting diagnosis of the observation target from the extracted attention area. The extraction of the region of interest and the calculation of the diagnostic information will be described later.
  • the display control unit 66 converts the medical image or the diagnostic information sent from the image processing unit 61 into a format suitable for display on the monitor 18 and outputs the converted format to the monitor 18. As a result, at least the medical image and the diagnostic information are displayed on the monitor 18.
  • The image processing unit 61 includes an attention area extraction unit 70, a diagnostic information determination unit 72, a diagnostic information designation unit 74, a similar image storage unit 76, a diagnostic information correction unit 78, and a specific attention target tracking unit 79.
  • the attention area extraction unit 70 extracts an attention area to be noted as a target of examination or diagnosis from the medical image.
  • the attention area extraction unit 70 calculates a specific feature amount from the medical image. Then, an area in which the calculated specific feature quantity satisfies the specific condition is extracted as the area of interest.
  • the attention area extracted by the attention area extraction unit 70 is not limited to a two-dimensional area such as the surface of the observation target. For example, in addition to the surface of the observation target, a three-dimensional region in the depth direction (infiltration) of the observation target may be extracted as a region of interest.
  • As a specific feature quantity, there is, for example, "ln(G/B)", the log ratio of the G image to the B image, used as a feature quantity for extracting regions of superficial blood vessels.
  • Blood vessel index values for blood vessels, which will be described later, or duct index values for duct structures may also be used.
  • As the specific feature quantity, besides applying a convolutional neural network to the medical image, a feature quantity obtained from the color information of the medical image, the gradient of pixel values, or the like may be used.
  • The gradient of pixel values changes depending on, for example, the shape of the subject (global unevenness of the mucous membrane, local depressions or bulges, etc.) and the color (colors associated with inflammation, hemorrhage, redness, whitening due to atrophy, etc.).
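  As a concrete illustration of extracting an attention area from a specific feature quantity, the following sketch builds on the ln(G/B) feature named above. The threshold value, its direction, and the epsilon guard are placeholder assumptions; the actual "specific condition" would be tuned to the target (e.g. superficial blood vessels).

      import numpy as np

      def extract_attention_mask(g_img, b_img, threshold=-0.1):
          """Return a boolean mask of pixels whose specific feature quantity
          ln(G/B) satisfies an assumed specific condition (here: below a
          threshold)."""
          eps = 1e-6                                 # avoid division by zero
          feature = np.log((g_img + eps) / (b_img + eps))
          return feature < threshold                 # candidate attention area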
  • The attention area extracted by the attention area extraction unit 70 includes, for example, a lesion area typified by cancer, a benign tumor area, and an inflammation area (including, besides so-called inflammation, parts with changes such as bleeding or atrophy).
  • the attention area extraction unit 70 extracts an area including at least one of a lesion area, a benign tumor area, an inflammation area, a marking area, and a biopsy performing area as an attention area.
  • the attention area extraction unit 70 may perform a removal process of removing an excessively dark area and an excessively bright area which interfere with accurate calculation of the diagnostic information in the medical image.
  • the lower limit value and the upper limit value are set for each of the B image, the G image, and the R image in the medical image. Then, an area below the lower limit value is extracted as an excessively dark area, and is removed from each image. Similarly, a region exceeding the upper limit value is extracted as a too bright region and removed from each image. A region of interest is extracted from a medical image from which regions that are too dark and regions that are too bright are removed. Note that the removal of an area that is too dark and an area that is too bright may not be performed depending on the state of the medical image.
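  A minimal sketch of this removal process, with illustrative lower and upper limit values (the patent gives no concrete numbers); pixels outside the limits in any of the B, G, and R images are excluded before attention-area extraction.

      import numpy as np

      def valid_pixel_mask(b_img, g_img, r_img, lower=10, upper=245):
          """True where a pixel is neither too dark nor too bright in every
          one of the B, G, and R planes (8-bit pixel values assumed)."""
          mask = np.ones_like(b_img, dtype=bool)
          for plane in (b_img, g_img, r_img):
              mask &= (plane >= lower) & (plane <= upper)
          return mask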
  • the diagnosis information determination unit 72 calculates various index values from the attention area extracted by the attention area extraction unit 70, and determines diagnosis information for supporting diagnosis of a lesion based on the calculated various index values.
  • the various index values include blood vessel index values related to blood vessels such as blood vessel density and blood vessel travel patterns, and ductal index values related to duct structure. Examples of diagnostic information include the type and size of a lesion.
  • the calculated diagnostic information is displayed on the monitor 18 by the display control unit 66 together with the medical image.
  • the image in the attention area and the diagnostic information determined for the attention area are associated with each other and sequentially stored in the similar image storage unit 76.
  • the attention area extraction unit 70 extracts the attention area a and the attention area b
  • diagnostic information is determined for each attention area a, b.
  • the monitor 18 displays "cancer” determined as the type of the lesion and "10 mm” determined as the size of the lesion.
  • “adenoma” determined as the type of the lesion and “7 mm” determined as the size of the lesion are displayed on the monitor 18.
  • The diagnostic information designation unit 74 designates, as the first diagnostic information, diagnostic information that the user considers to be an incorrect determination, or that concerns a lesion so obvious that the user considers its display on the monitor 18 unnecessary.
  • the display control unit 66 does not display the attention area determined as the first diagnostic information on the monitor 18. This makes it possible to suppress the presentation of erroneous diagnostic information to the user, and present appropriate diagnostic information to the user.
  • The designation of the first diagnostic information is preferably performed by a position designation operation that designates the position of the first diagnostic information being displayed on the monitor 18.
  • a user interface 19 that receives an instruction regarding specification of the first diagnostic information is used.
  • The position designation operation may also be performed by a screen tap operation on the monitor 18. Furthermore, when a gaze detection device for the user is connected to the endoscope system 10, the position designation operation can be performed based on the user's gaze information.
  • the first diagnostic information may be directly designated by the user interface 19, or the first attention area determined to be the first diagnostic information may be designated.
  • For example, the pointer PT displayed on the monitor 18 is operated with the mouse MS and aligned with the position of the diagnostic information to be designated as the first diagnostic information.
  • If the first diagnostic information is the "cancer" of attention area a, the pointer PT is placed on "cancer" and the mouse MS is clicked, whereby "cancer" is designated as the first diagnostic information.
  • Alternatively, the first diagnostic information may be designated by a switching operation that instructs the monitor 18 to switch among the plurality of pieces of diagnostic information and a confirmation operation that selects and confirms the first diagnostic information from among the diagnostic information shown in the switching display.
  • The foot switch FS and the diagnostic information designation scope switch 13b provided on the operation unit 12b of the endoscope 12 are used for the switching operation and the confirmation operation.
  • In the switching display, one attention area among the plurality of attention areas (attention area a) and the diagnostic information determined for it (type: "cancer", size: "10 mm") are first displayed.
  • As the confirmation operation, the confirmation switch DB of the foot switch FS is pressed once, or the diagnostic information designation scope switch 13b is pressed twice in succession, whereby the diagnostic information (type: "cancer", size: "10 mm") is confirmed as the first diagnostic information.
  • When the diagnostic information (type: "cancer", size: "10 mm") determined for attention area a is not to be used as the first diagnostic information, other diagnostic information can be designated instead.
  • In that case, pressing the switching switch SB of the foot switch FS once, or pressing the diagnostic information designation scope switch 13b once, switches the display, as shown in FIG. 6B, to a different attention area b and the diagnostic information determined for it (type: "adenoma", size: "7 mm").
  • To designate this information, the confirmation switch DB is pressed once, or the diagnostic information designation scope switch 13b is pressed twice in succession, as the confirmation operation.
  • In addition to designating the first diagnostic information from among the diagnostic information determined for the attention areas in the diagnostic information designation unit 74, the following may be done: if there is a similar image resembling the image of the first attention area, the similar image diagnostic information, i.e. the diagnostic information determined for that similar image, is also presented to the user, and the first diagnostic information may be designated from among the similar image diagnostic information. In this case, after the first diagnostic information is designated from the diagnostic information determined for the attention area, a similar image resembling the image of the first attention area and the similar image diagnostic information associated with it are extracted from the similar image storage unit 76, either automatically or through the user operating the user interface 19. The extraction of the similar image is performed by pattern matching with the image of the first attention area or the like (a sketch of such a lookup is given below).
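  A minimal sketch of such a similar-image lookup, assuming the similar image storage unit holds (image, diagnostic information) pairs and realizing "pattern matching" as normalized cross-correlation on grayscale images resized to a common size. The storage layout, the 64 x 64 size, and the use of OpenCV are illustrative assumptions, not details from the patent.

      import cv2
      import numpy as np

      def find_similar(query, store, top_k=5):
          """Return the top-k (score, image, info) entries of `store` most
          similar to the first attention area image `query` (grayscale)."""
          q = cv2.resize(query, (64, 64)).astype(np.float32)
          scored = []
          for img, info in store:
              c = cv2.resize(img, (64, 64)).astype(np.float32)
              # On equal-size inputs, matchTemplate with TM_CCOEFF_NORMED
              # yields a single normalized correlation score.
              score = float(cv2.matchTemplate(c, q, cv2.TM_CCOEFF_NORMED)[0, 0])
              scored.append((score, img, info))
          scored.sort(key=lambda t: t[0], reverse=True)
          return scored[:top_k]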
  • the similar image 80 extracted from the similar image storage unit 76 and the similar image diagnostic information 82 associated with the similar image 80 are displayed in a list on the monitor 18.
  • the monitor 18 also displays a pointer PT for selecting as the first diagnostic information from among the similar image diagnostic information.
  • the user operates the mouse MS while referring to the similar image and the similar image diagnostic information listed on the monitor 18.
  • The pointer PT is set on the similar image diagnostic information to be designated as the first diagnostic information, and the mouse is clicked.
  • the similar image diagnostic information indicated by the pointer PT is designated as the first diagnostic information.
  • the display control unit 66 performs control not to display the attention area determined as the designated similar image diagnostic information on the monitor 18.
  • In addition to, or instead of, the designation of the first diagnostic information, the diagnostic information correction unit 78 performs correction processing that corrects the first diagnostic information, which the user considers an erroneous determination among the plurality of pieces of diagnostic information, into second diagnostic information that the user considers correct.
  • After the first diagnostic information is designated from among the diagnostic information determined for the attention area, a correction screen is displayed on the monitor 18, either automatically or through the user operating the user interface 19.
  • a correction candidate list is displayed for the information to be corrected among the diagnostic information.
  • the correction candidate list is displayed for the type of lesion.
  • When the user finds the correct second diagnostic information in the correction candidate list, the user operates the mouse MS to set the pointer PT on it and clicks.
  • the first diagnostic information is corrected to the correct second diagnostic information.
  • The display control unit 66 performs control so that the attention area determined as the first diagnostic information is not displayed on the monitor 18, while the attention area determined as the second diagnostic information is displayed on the monitor 18.
  • When the user designates, from among the plurality of pieces of diagnostic information, the first diagnostic information considered to be an erroneous determination, the specific attention target tracking unit 79 performs tracking processing that tracks the first attention target included in the first attention area. By tracking the first attention target in this way, even if it disappears from the monitor 18 and is later displayed again, it is prevented from being displayed as an attention area.
  • As shown in FIG. 9A, attention area a and its diagnostic information (type: "cancer", size: "10 mm") are not displayed on the monitor 18 (in FIG. 9A, the dotted line indicates elements not displayed on the monitor 18).
  • Since attention area a is the first attention area, the first attention target 90 included in attention area a is tracked by the specific attention target tracking unit 79.
  • The medical image displayed on the monitor 18 changes greatly due to operation of the endoscope 12 and movement of the observation target, and the first attention target 90 disappears from the monitor 18. Even in this case, the tracking processing of the first attention target 90 continues. Then, as shown in FIG. 9C, when the first attention target 90 appears on the monitor 18 again, it is not displayed on the monitor 18 as an attention area even though the attention area extraction unit 70 extracts it as one. The diagnostic information determined for that attention area is likewise not displayed on the monitor 18.
  • For the tracking processing, the image of the first attention target may be saved as a template, or its feature quantity may be calculated and saved, and matching against the saved template or feature quantity may be used to identify the target. It may also be determined whether the position of the first attention target appearing again in the screen is the same as its position before disappearing from the screen.
  • Alternatively, by determining the movement amount of the endoscope 12 using an inertial sensor such as a gyro sensor, a magnetic sensor, or an acceleration sensor, the position at which the first attention target will reappear on the screen can be estimated; based on this estimate, it may be determined whether the position of the reappearing first attention target is the same as its position before it disappeared from the screen.
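  A minimal sketch of the template-based variant of this tracking, assuming grayscale frames and an illustrative match threshold; the patent leaves the matching method open, so this is only one possible realization.

      import cv2
      import numpy as np

      class SpecificTargetTracker:
          def __init__(self, template, threshold=0.8):
              self.template = template.astype(np.float32)  # saved image of the first attention target
              self.threshold = threshold                   # assumed match threshold

          def find_in_frame(self, frame):
              """Return the (x, y) location of the first attention target in
              `frame`, or None while it is not visible (tracking continues)."""
              res = cv2.matchTemplate(frame.astype(np.float32), self.template,
                                      cv2.TM_CCOEFF_NORMED)
              _, max_val, _, max_loc = cv2.minMaxLoc(res)
              return max_loc if max_val >= self.threshold else None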
  • the endoscope 12 obtains a medical image by imaging an observation target.
  • the image acquisition unit 54 of the processor device 16 acquires a medical image acquired by the endoscope 12.
  • the attention area extraction unit 70 extracts the attention area from the medical image.
  • the diagnostic information determination unit 72 determines diagnostic information for supporting diagnosis of a lesion.
  • the attention area and the diagnostic information determined for the attention area are displayed on the monitor 18.
  • The user refers to the attention areas and the diagnostic information displayed on the monitor 18 and confirms whether first diagnostic information exists, that is, diagnostic information the user considers a misjudgment, or that concerns a lesion so obvious that it does not need to be displayed on the monitor 18. If no first diagnostic information exists, the user indicates via the user interface 19, such as the mouse MS or the foot switch FS, that all the displayed diagnostic information is correct. On the other hand, when the user determines that first diagnostic information exists, the first diagnostic information is designated via the user interface 19 such as the mouse MS or the foot switch FS. After the designation of the first diagnostic information, the first attention area determined to be the first diagnostic information is no longer displayed on the monitor 18 (a minimal sketch of this display control follows below). The above process is repeated until the diagnosis using attention areas is completed. Switching whether or not to perform diagnosis using attention areas is done via the user interface 19.
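  The loop above amounts to filtering the extracted attention areas against the set of designated first diagnostic information before anything is drawn. A minimal sketch under assumed data structures (region/info pairs and a set of designated info; the names are illustrative, not from the patent):

      def select_displayable(attention_areas, first_diagnostic_info):
          """Keep only the (region, info) pairs whose diagnostic information
          has not been designated as first diagnostic information."""
          return [(region, info) for region, info in attention_areas
                  if info not in first_diagnostic_info]

      # Example mirroring attention areas a and b described earlier: the user
      # designated ("cancer", "10 mm"), so only attention area b remains.
      areas = [("attention area a", ("cancer", "10 mm")),
               ("attention area b", ("adenoma", "7 mm"))]
      designated = {("cancer", "10 mm")}
      print(select_displayable(areas, designated))
      # -> [('attention area b', ('adenoma', '7 mm'))]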
  • In the drawings, the shape of the region of interest ROI is represented by a square (rectangle), but the shape may be other than rectangular, such as an ellipse or a circle. This is because, in the above embodiment, the attention area is extracted based on the feature quantity, so its shape changes according to the distribution of the feature quantity.
  • Examples of the blood vessel index values calculated by the diagnostic information determination unit 72 include blood vessel density, blood vessel thickness, number of blood vessels, number of branches, branch angle, distance between branch points, number of crossings, change in thickness, length, interval, depth with respect to the mucous membrane, height difference, inclination, area, contrast, color, change in color, meandering degree, blood concentration, oxygen saturation, proportion of arteries, proportion of veins, concentration of an administered dye, travel pattern, and blood flow.
  • the blood vessel density is represented by the proportion of blood vessels contained in a specific area in the image.
  • The thickness of a blood vessel (blood vessel diameter) is the distance between the boundary lines of the blood vessel and the mucous membrane, and is counted, for example, as the number of pixels along the lateral direction of the vessel, across the vessel from one edge of the extracted vessel to the other. The thickness is therefore a pixel count, but it can be converted to a unit of length such as µm when the imaging distance, zoom magnification, and the like at the time the medical image was captured are known (see the conversion example below).
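  A worked example of that unit conversion, with an assumed scale of 7.0 µm per pixel; in practice the scale would follow from the imaging distance and zoom magnification.

      def vessel_diameter_um(pixel_count, um_per_pixel=7.0):
          """Convert a vessel thickness counted in pixels to micrometres."""
          return pixel_count * um_per_pixel

      print(vessel_diameter_um(12))  # 12 px -> 84.0 um under the assumed scale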
  • the number of blood vessels is the number of blood vessels extracted in the entire medical image or in the region of interest.
  • the number of blood vessels is calculated using, for example, the number of branch points of the extracted blood vessels (the number of branches), the number of intersections with other blood vessels (the number of intersections), or the like.
  • the bifurcation angle of a blood vessel is an angle which two blood vessels make at a bifurcation point.
  • the distance between bifurcation points is a linear distance between any bifurcation point and its neighboring bifurcation point, or a length along a blood vessel from any bifurcation point to its neighboring bifurcation point.
  • the number of crossings of blood vessels is the number of crossing points where blood vessels having different submucosal depths cross on a medical image. More specifically, the number of crossings of blood vessels is the number of blood vessels at relatively shallow positions under the submucosa crossing blood vessels at deep positions.
  • the change in the thickness of the blood vessel is blood vessel information related to the variation in the thickness of the blood vessel, and is also referred to as the aperture unequal degree.
  • the change in thickness of the blood vessel is, for example, the rate of change in the diameter of the blood vessel (also referred to as the degree of dilation).
  • The temporal change between the thickness of a blood vessel extracted from a medical image acquired in a past examination and the thickness of the same blood vessel extracted from a medical image obtained in a subsequent new examination may also be used as the change in blood vessel thickness.
  • As the change in thickness, the proportion of the small-diameter portion or the proportion of the large-diameter portion may be calculated; the small-diameter portion is a portion whose thickness is equal to or less than a threshold, and the large-diameter portion is a portion whose thickness exceeds the threshold.
  • The complexity of the change in blood vessel thickness (hereinafter, "complexity of thickness change") is blood vessel information that indicates how complex the change in thickness is. It is calculated by combining a plurality of pieces of blood vessel information representing the change in thickness (i.e., the change rate of the blood vessel diameter, the proportion of the small-diameter portion, or the proportion of the large-diameter portion).
  • the complexity of the thickness change can be determined, for example, by the product of the change rate of the blood vessel diameter and the ratio of the small diameter portion.
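  A minimal sketch of these thickness-change measures on a vessel thickness profile (thickness in pixels sampled along the vessel). The definition of the change rate as (max - min) / max and the threshold value are assumptions; the patent names the quantities but not their exact formulas.

      import numpy as np

      def thickness_change_metrics(profile, thin_threshold=5):
          """Return (change_rate, thin_ratio, wide_ratio, complexity) for a
          1-D thickness profile; assumes at least one nonzero thickness."""
          profile = np.asarray(profile, dtype=float)
          change_rate = (profile.max() - profile.min()) / profile.max()
          thin_ratio = float(np.mean(profile <= thin_threshold))  # small-diameter portion
          wide_ratio = float(np.mean(profile > thin_threshold))   # large-diameter portion
          complexity = change_rate * thin_ratio  # example combination from the text
          return change_rate, thin_ratio, wide_ratio, complexity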
  • the length of the blood vessel is the number of pixels counted along the longitudinal direction of the extracted blood vessel.
  • The blood vessel interval is the number of pixels representing the mucous membrane lying between the edges of extracted blood vessels. When only one blood vessel is extracted, the blood vessel interval has no value.
  • the blood vessel depth is measured relative to the mucous membrane (more specifically, the surface of the mucous membrane).
  • the depth of the blood vessel relative to the mucous membrane can be calculated, for example, based on the color of the blood vessel.
  • Blood vessels close to the surface of the mucous membrane are rendered in magenta, while blood vessels far from the surface and deep below the submucosa are rendered in cyan; therefore, the depth of a blood vessel with respect to the mucous membrane is calculated for each pixel based on the balance of the R, G, and B color signals of the pixels extracted as the blood vessel.
  • the height difference of the blood vessel is the size of the difference in the depth of the blood vessel.
  • the height difference of one blood vessel to be noticed is obtained by the difference between the depth (maximum depth) of the deepest part of the blood vessel and the depth (minimum depth) of the shallowest part. When the depth is constant, the height difference is zero.
  • the blood vessel may be divided into a plurality of sections, and the inclination of the blood vessel may be calculated in each section.
  • The area of a blood vessel is the number of pixels extracted as the blood vessel, or a value proportional to that number.
  • the area of the blood vessel is calculated within the region of interest, outside the region of interest, or for the entire medical image.
  • the contrast of the blood vessel is the relative contrast to the mucous membrane to be observed.
  • The contrast of a blood vessel is calculated, for example, as Yv/Ym or (Yv - Ym)/(Yv + Ym), using the luminance Yv of the blood vessel and the luminance Ym of the mucous membrane.
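  Both contrast formulas in code, where y_v and y_m would be mean luminance values sampled from vessel pixels and mucosa pixels (assumed positive):

      def vessel_contrast(y_v, y_m):
          """Return the two contrast measures defined above."""
          ratio = y_v / y_m                       # Yv / Ym
          michelson = (y_v - y_m) / (y_v + y_m)   # (Yv - Ym) / (Yv + Ym)
          return ratio, michelson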
  • the color of a blood vessel is each value of RGB of the pixel showing a blood vessel.
  • the change in blood vessel color is the difference or ratio between the maximum value and the minimum value of each of the RGB values of the pixel representing the blood vessel.
  • For example, the ratio of the maximum to the minimum pixel value of the B pixels representing the blood vessel, of the G pixels, or of the R pixels represents the change in color of the blood vessel.
  • the color change of the blood vessel and the color of the blood vessel may be calculated for each value such as cyan, magenta, yellow, and green by converting into a complementary color.
  • the meandering degree of a blood vessel is blood vessel information that represents the width of a range in which the blood vessel travels in a meandering manner.
  • the meandering degree of the blood vessel is, for example, the smallest rectangular area (number of pixels) including the blood vessel for which the meandering degree is calculated. Further, the ratio of the length of the blood vessel to the straight distance between the start point and the end point of the blood vessel may be the meander degree of the blood vessel.
  • The blood concentration of a blood vessel is blood vessel information proportional to the amount of hemoglobin contained in the blood vessel. Since the ratio G/R of the pixel value of the G pixel to that of the R pixel representing the blood vessel is proportional to the amount of hemoglobin, the blood concentration can be calculated for each pixel by computing G/R.
  • Blood vessel oxygen saturation is the amount of oxygenated hemoglobin relative to the total amount of hemoglobin (total amount of oxygenated hemoglobin and reduced hemoglobin).
  • The oxygen saturation can be calculated using a medical image obtained by imaging the observation target with light of a specific wavelength band (for example, blue light with a wavelength of about 470 ± 10 nm) for which the absorption coefficients of oxyhemoglobin and reduced hemoglobin differ greatly. When such light is used, the pixel values of the B pixels representing a blood vessel correlate with the oxygen saturation, so the oxygen saturation of each pixel representing the vessel can be calculated using, for example, a table associating pixel values with oxygen saturation.
  • the proportion of arteries is the ratio of the number of pixels of arteries to the number of pixels of all blood vessels.
  • the ratio of veins is the ratio of the number of pixels of veins to the number of pixels of all blood vessels.
  • Arteries and veins can be distinguished by oxygen saturation. For example, if blood vessels with an oxygen saturation of 70% or more are classified as arteries and those with less than 70% as veins, the extracted blood vessels can be divided into arteries and veins, and the proportions of each can be calculated.
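  A minimal sketch of this artery/vein split on a per-pixel oxygen saturation map, using the 70% threshold from the text; the map and mask representations are assumptions.

      import numpy as np

      def artery_vein_ratios(sat_map, vessel_mask):
          """sat_map: per-pixel oxygen saturation in [0, 1]; vessel_mask:
          boolean mask of pixels extracted as blood vessels."""
          vessel_sat = sat_map[vessel_mask]
          if vessel_sat.size == 0:
              return 0.0, 0.0
          artery = float(np.mean(vessel_sat >= 0.70))  # proportion of artery pixels
          vein = float(np.mean(vessel_sat < 0.70))     # proportion of vein pixels
          return artery, vein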
  • the concentration of the administered dye is the concentration of the dye dispersed to the observation subject or the concentration of the dye injected into the blood vessel by intravenous injection.
  • The concentration of the administered dye is calculated, for example, from the ratio of the pixel value of the dye's color to the pixel values of other colors. For example, when a dye that colors blue is administered, the ratio B/G of the B image to the G image, the ratio B/R of the B image to the R image, and the like represent the concentration of the dye fixed (or temporarily attached) to the observation target.
  • the travel pattern of the blood vessel is blood vessel information regarding the travel direction of the blood vessel.
  • the traveling pattern of the blood vessel is, for example, an average angle (traveling direction) of the blood vessel with respect to a reference line set arbitrarily, dispersion of an angle formed by the blood vessel with respect to the reference line arbitrarily set (variation in traveling direction), and the like.
  • Blood flow in blood vessels is the number of red blood cells per unit time.
  • When an ultrasound probe is used in combination, the blood flow can be determined by calculating the Doppler shift frequency of each pixel representing a blood vessel of the medical image from the signal obtained by the ultrasound probe.
  • In the above embodiment, the present invention is applied to an endoscope system that processes endoscopic images, which are one type of medical image, but the present invention can also be applied to medical image processing apparatuses that process medical images other than endoscopic images.
  • the present invention is also applicable to a diagnosis support apparatus for performing diagnosis support to a user using a medical image.
  • the present invention can be applied to a medical service support apparatus for supporting medical services such as diagnostic reports using medical images.
  • The medical image is preferably a normal light image obtained by irradiating light of the white band, or light of a plurality of wavelength bands serving as the white-band light.
  • The medical image may be a special light image obtained by irradiating light of a specific wavelength band; the specific wavelength band is preferably narrower than the white band.
  • The specific wavelength band is preferably included in the blue or green band of the visible range.
  • It is preferable that the specific wavelength band include the band of 390 nm to 450 nm or 530 nm to 550 nm, and that the light of the specific wavelength band have a peak wavelength within that band.
  • The specific wavelength band is preferably included in the red band of the visible range.
  • It is preferable that the specific wavelength band include the band of 585 nm to 615 nm or 610 nm to 730 nm, and that the light of the specific wavelength band have a peak wavelength within that band.
  • It is preferable that the specific wavelength band include wavelength bands in which the absorption coefficients of oxyhemoglobin and reduced hemoglobin differ, and that the light of the specific wavelength band have a peak wavelength in such a band.
  • It is preferable that the specific wavelength band include 400 ± 10 nm, 440 ± 10 nm, 470 ± 10 nm, or 600 nm to 750 nm, and that the light of the specific wavelength band have a peak wavelength within one of these bands.
  • The medical image may be an in-vivo image obtained by imaging the inside of a living body, and the in-vivo image preferably has information on fluorescence emitted from a fluorescent substance in the living body.
  • The fluorescence is preferably obtained by irradiating the living body with excitation light having a peak wavelength in the band of 390 nm to 470 nm.
  • The medical image may be an in-vivo image obtained by imaging the inside of a living body, and the specific wavelength band may be a wavelength band of infrared light.
  • It is preferable that the specific wavelength band include 790 nm to 820 nm or 905 nm to 970 nm, and that the light of the specific wavelength band have a peak wavelength within that band.
  • It is preferable that the image acquisition unit have a special light image acquisition unit that acquires a special light image having a signal of the specific wavelength band based on a normal light image obtained by irradiating white-band light or light of a plurality of wavelength bands serving as the white-band light, and that the medical image be this special light image.
  • The signal of the specific wavelength band is obtained by calculation based on the RGB or CMY color information included in the normal light image.
  • It is also preferable that an arithmetic image generation unit that generates an arithmetic image by calculation be provided, and that the medical image be this arithmetic image.
  • V-LED Violet Light Emitting Diode
  • B-LED Blue Light Emitting Diode
  • G-LED Green Light Emitting Diode
  • R-LED Red Light Emitting Diode
  • the V-LED 20a emits violet light V in a wavelength band of 380 nm to 420 nm.
  • the B-LED 20b emits blue light B in a wavelength band of 420 nm to 500 nm.
  • The wavelength cut filter 23 cuts off at least the wavelength components on the longer-wavelength side of 450 nm.
  • the blue light Bx transmitted through the wavelength cut filter 23 is in the wavelength range of 420 to 460 nm.
  • The reason light in the wavelength range longer than 460 nm is cut is that it is a factor that reduces the contrast of the blood vessels to be observed.
  • the wavelength cut filter 23 may reduce the light in the wavelength range longer than 460 nm instead of cutting the light in the wavelength range longer than 460 nm.
  • the G-LED 20c emits green light G having a wavelength band ranging from 480 nm to 600 nm.
  • the R-LED 20d emits red light R in a wavelength band of 600 nm to 650 nm.
  • the light source device 14 When light in a white band (white light) is emitted, all the V-LED 20a, B-LED 20b, G-LED 20c, and R-LED 20d are turned on. As a result, as shown in FIG. 12, the light source device 14 emits white light including violet light V, blue light Bx, green light G, and red light R. White light is almost white because it has a certain intensity or more from the blue band to the red band.
  • When specific light having a peak wavelength in the wavelength band of 440 ± 10 nm is emitted as the light of the specific wavelength band, green light G and red light R are emitted at emission amounts different from those used for white light, as shown in the corresponding figure.
  • the illumination light may be emitted using a laser light source and a phosphor.
  • The light source unit 20 includes a blue laser light source (denoted "445LD", where LD represents "Laser Diode") 104 that emits blue laser light having a peak wavelength of 445 ± 10 nm, and a blue-violet laser light source (denoted "405LD") 106 that emits blue-violet laser light having a peak wavelength of 405 ± 10 nm.
  • the illumination optical system 30a is provided with a phosphor 110 to which blue laser light or blue-violet laser light from the light guide 24 is incident.
  • the phosphor 110 is excited by blue laser light to emit fluorescence.
  • part of the blue laser light transmits without exciting the phosphor 110.
  • the blue-violet laser light transmits without exciting the phosphor 110.
  • the light emitted from the phosphor 110 illuminates the body to be observed through the illumination lens 32.
  • When white light is emitted, the blue laser light source 104 is turned on and mainly blue laser light is incident on the phosphor 110; white light is generated by combining the blue laser light with the fluorescence excited and emitted from the phosphor 110 by the blue laser light.
  • When emitting specific light having a peak wavelength in the wavelength band of 440 ± 10 nm as the light of the specific wavelength band, both the blue laser light source 104 and the blue-violet laser light source 106 are turned on, so that blue-violet laser light and blue laser light are both incident on the phosphor 110. The specific light is generated by combining the blue-violet laser light, the blue laser light, and the fluorescence excited and emitted from the phosphor 110 by the blue laser light.
  • The half width of the blue laser light or the blue-violet laser light is preferably about ±10 nm.
  • a broad area type InGaN-based laser diode can be used, and an InGaNAs-based laser diode or a GaNAs-based laser diode can also be used.
  • a light emitter such as a light emitting diode may be used as the light source.
  • As the phosphor 110, it is preferable to use one configured to include a phosphor that absorbs part of the blue laser light and is excited to emit light of plural colors from green to yellow (for example, a YAG phosphor or a phosphor such as BAM (BaMgAl10O17)).
  • The illumination light may also be emitted using a broadband light source such as a xenon lamp together with a rotary filter.
  • the light source unit 20 is provided with the wide band light source 202, the rotation filter 204, and the filter switching unit 206.
  • a diaphragm 203 is provided between the broadband light source 202 and the rotary filter 204.
  • the diaphragm control unit 205 adjusts the area of the opening of the diaphragm 203.
  • the aperture control unit 205 controls the aperture 203 based on the light control signal from the processor device 16.
  • the broadband light source 202 is a xenon lamp, a white LED, or the like, and emits broadband light ranging in wavelength from blue to red.
  • The rotary filter 204 includes a white light filter 210 provided on the inner side closest to the rotation axis, and a specific light filter 212 provided on the outer side of the white light filter 210 (see FIG. 18).
  • the filter switching unit 206 moves the rotary filter 204 in the radial direction. Specifically, when the white light is emitted, the filter switching unit 206 inserts the white light filter 210 into the optical path of the broadband light. When the light (specific light) in a specific wavelength band is emitted, the filter switching unit 206 inserts the filter 212 for specific light into the optical path of the broadband light.
  • The white light filter 210 is provided with a B filter 210a, a G filter 210b, and an R filter 210c along the circumferential direction.
  • The B filter 210a transmits broadband blue light B in the wavelength range of 400 to 500 nm out of the broadband light.
  • The G filter 210b transmits green light G out of the broadband light.
  • The R filter 210c transmits red light R out of the broadband light. Therefore, when white light is emitted, rotating the rotary filter 204 sequentially irradiates blue light B, green light G, and red light R as the white light.
  • The specific light filter 212 is provided with a Bn filter 212a and a G filter 212b along the circumferential direction.
  • The Bn filter 212a transmits blue narrowband light Bn of 400 to 450 nm out of the broadband light.
  • The G filter 212b transmits green light G out of the broadband light. Therefore, when the specific light is emitted, rotating the rotary filter 204 sequentially irradiates the blue narrowband light Bn and the green light G toward the observation target as the specific light.
  • When white light is used, each time the observation target is illuminated with blue light B, green light G, or red light R, the monochrome image sensor captures an image of the observation target. An image including the components of white light is generated from the resulting B image, G image, and R image.
  • When the specific light is used, the observation target is likewise imaged by the monochrome image sensor, and an image is generated from the Bn image and G image obtained by this imaging.
  • The hardware structure of the processing units that execute various processes, such as the attention area extraction unit 70, the diagnostic information determination unit 72, the diagnostic information designation unit 74, the similar image storage unit 76, the diagnostic information correction unit 78, and the specific attention target tracking unit 79 included in the image processing unit 61, is realized by various processors as described below.
  • CPU (Central Processing Unit)
  • FPGA (Field Programmable Gate Array)
  • PLD (Programmable Logic Device)
  • One processing unit may be configured of one of these various processors, or configured of a combination of two or more processors of the same type or different types (for example, a plurality of FPGAs or a combination of a CPU and an FPGA) It may be done.
• A plurality of processing units may also be configured by one processor. As a first example, as represented by computers such as clients and servers, one processor may be configured by a combination of one or more CPUs and software, and this processor may function as the plurality of processing units.
• As a second example, as represented by a system on chip (SoC), a processor may be used that realizes the functions of an entire system including the plurality of processing units on a single IC (integrated circuit) chip. More specifically, the hardware structure of these various processors is circuitry in which circuit elements such as semiconductor elements are combined.
• Reference signs: endoscope system; 12 endoscope; 12a insertion part; 12b operation part; 12c curved part; 12d tip part; 12e angle knob; 12f forceps entrance; 13a zoom operation part; 13b scope switch for diagnostic information specification; 14 light source device; 16 processor device; 18 monitor; 19 user interface; 20 light source unit; 22 light source control unit; 30a illumination optical system; 30b imaging optical system; 41 light guide; 45 illumination lens; 46 objective lens; 47 zoom lens; 48 image sensor; 52 central control unit; 54 image acquisition unit; 56 DSP (Digital Signal Processor); 58 noise reduction unit; 59 conversion unit; 61 image processing unit; 66 display control unit; 70 attention area extraction unit; 72 diagnosis information determination unit; 74 diagnosis information specification unit; 76 similar image storage unit; 78 diagnosis information correction unit; 79 specific attention target tracking unit; 80 similar image; 82 similar image diagnostic information; 90 first target

Abstract

The present invention relates to a medical image processing system, an endoscope system, a diagnosis support device, and a medical service support device that can prevent incorrect diagnostic information from being presented to a user and can present appropriate diagnostic information according to the user. An attention area extraction unit extracts, from a medical image, an attention area including an attention target. A diagnostic information determination unit determines diagnostic information about the attention target for each attention area. A monitor displays the attention area and the diagnostic information determined for that attention area. A diagnostic information designation unit designates first diagnostic information from among the diagnostic information displayed on a display unit. A display control unit performs control such that a first attention area, for which the first diagnostic information has been determined, is not displayed on the monitor.
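Read as an algorithm, the display control described in this abstract reduces to a simple filtering rule. The sketch below is a hypothetical rendering of that rule under stated assumptions, not the patented implementation; AttentionArea, diagnostic_info, and areas_to_display are invented names.

```python
from dataclasses import dataclass

@dataclass
class AttentionArea:
    region_id: int
    bbox: tuple            # (x, y, width, height) within the medical image
    diagnostic_info: str   # diagnostic information determined for this area

def areas_to_display(areas, first_info):
    """Suppress every attention area whose determined diagnostic information
    matches the first diagnostic information designated by the user; all
    remaining areas stay visible on the monitor with their diagnostic info."""
    return [area for area in areas if area.diagnostic_info != first_info]

# Hypothetical usage: the user designates "hyperplastic" as the first
# diagnostic information, so area 2 is removed from the display.
areas = [AttentionArea(1, (10, 10, 40, 40), "adenoma"),
         AttentionArea(2, (80, 60, 30, 30), "hyperplastic")]
print([a.region_id for a in areas_to_display(areas, "hyperplastic")])  # [1]
```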
PCT/JP2018/032663 2017-09-26 2018-09-03 Medical image processing system, endoscope system, diagnosis support device, and medical service support device WO2019065111A1

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2019544482A JP6796725B2 2017-09-26 2018-09-03 Medical image processing system, endoscope system, diagnosis support device, and medical service support device

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2017-184278 2017-09-26
JP2017184278 2017-09-26

Publications (1)

Publication Number Publication Date
WO2019065111A1 2019-04-04

Family

ID=65901240

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2018/032663 WO2019065111A1 2018-09-03 2019-04-04 Medical image processing system, endoscope system, diagnosis support device, and medical service support device

Country Status (2)

Country Link
JP (1) JP6796725B2
WO (1) WO2019065111A1

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2004159739A * 2002-11-11 2004-06-10 Ge Medical Systems Global Technology Co Llc Image processing method and apparatus
JP2005328977A * 2004-05-19 2005-12-02 Hitachi Medical Corp Image diagnosis support apparatus and method
JP2008036028A * 2006-08-03 2008-02-21 Olympus Medical Systems Corp Image display apparatus
JP2010213760A * 2009-03-13 2010-09-30 Toshiba Corp Image processing apparatus and image processing program
JP2011255006A * 2010-06-09 2011-12-22 Olympus Corp Image processing apparatus, endoscope apparatus, program, and image processing method
JP2013153883A * 2012-01-27 2013-08-15 Canon Inc Image processing apparatus, imaging system, and image processing method
JP2016214312A * 2015-05-14 2016-12-22 キヤノン株式会社 Image processing apparatus, image processing method, and program
WO2016199273A1 * 2015-06-11 2016-12-15 オリンパス株式会社 Endoscope device and method for operating endoscope device

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2021172288A1 * 2020-02-26 2021-09-02 富士フイルム富山化学株式会社 Drug identification device
JPWO2021172288A1 2020-02-26 2021-09-02
JP7282978B2 2020-02-26 2023-05-29 富士フイルム富山化学株式会社 Drug identification device
JPWO2021205777A1 2020-04-08 2021-10-14
WO2021205777A1 * 2020-04-08 2021-10-14 富士フイルム株式会社 Processor device and method of operating same
JP7447243B2 2020-04-08 2024-03-11 富士フイルム株式会社 Processor device and method of operating same
WO2023058388A1 * 2021-10-04 2023-04-13 富士フイルム株式会社 Information processing device, information processing method, endoscope system, and report creation support device
WO2024034253A1 * 2022-08-12 2024-02-15 富士フイルム株式会社 Endoscope system and method of operating same

Also Published As

Publication number Publication date
JPWO2019065111A1 2020-10-22
JP6796725B2 2020-12-09

Similar Documents

Publication Publication Date Title
JP6785941B2 Endoscope system and method of operating same
US10959606B2 Endoscope system and generating emphasized image based on color information
US10231658B2 Endoscope system, processor device for endoscope system, operation method for endoscope system, and operation method for processor device
US10891737B2 Medical image processing device, endoscope system, diagnosis support device, and medical service support device
US11426054B2 Medical image processing system, endoscope system, diagnosis support apparatus, and medical service support apparatus
JP6538634B2 Processor device, endoscope system, and method of operating processor device
JP6796725B2 Medical image processing system, endoscope system, diagnosis support device, and medical service support device
WO2017057414A1 Image processing device, endoscope system, and image processing method
JP6724154B2 Processor device, endoscope system, and method of operating processor device
US20180206738A1 Endoscope system and method of operating endoscope system
JPWO2017057573A1 Image processing apparatus, endoscope system, and image processing method
JP6924837B2 Medical image processing system, endoscope system, diagnosis support device, and medical service support device
WO2018159082A1 Endoscope system, processor device, and method of operating endoscope system
JP6763025B2 Processor device and method of operating processor device
JP6850358B2 Medical image processing system, endoscope system, diagnosis support device, and medical service support device

Legal Events

121 EP: the EPO has been informed by WIPO that EP was designated in this application (ref document number: 18862913; country of ref document: EP; kind code of ref document: A1)

ENP: entry into the national phase (ref document number: 2019544482; country of ref document: JP; kind code of ref document: A)

NENP: non-entry into the national phase (ref country code: DE)

122 EP: PCT application non-entry in European phase (ref document number: 18862913; country of ref document: EP; kind code of ref document: A1)