CN115191909A - Medical image processing apparatus and method for operating the same - Google Patents


Info

Publication number
CN115191909A
Authority
CN
China
Prior art keywords
color
information
medical image
brightness
color information
Prior art date
Legal status
Pending
Application number
CN202210353152.4A
Other languages
Chinese (zh)
Inventor
久保雅裕
Current Assignee
Fujifilm Corp
Original Assignee
Fujifilm Corp
Priority date
Filing date
Publication date
Application filed by Fujifilm Corp filed Critical Fujifilm Corp
Publication of CN115191909A publication Critical patent/CN115191909A/en

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 - Image analysis
    • G06T 7/90 - Determination of colour characteristics
    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B - DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 1/00 - Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
    • A61B 1/00002 - Operational features of endoscopes
    • A61B 1/00004 - Operational features of endoscopes characterised by electronic signal processing
    • A61B 1/00009 - Operational features of endoscopes characterised by electronic signal processing of image signals during a use of endoscope
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 1/00 - General purpose image data processing
    • G06T 1/20 - Processor architectures; Processor configuration, e.g. pipelining
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 5/00 - Image enhancement or restoration
    • G06T 5/70 - Denoising; Smoothing
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 - Indexing scheme for image analysis or image enhancement
    • G06T 2207/10 - Image acquisition modality
    • G06T 2207/10068 - Endoscopic image

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Surgery (AREA)
  • Optics & Photonics (AREA)
  • Biomedical Technology (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Signal Processing (AREA)
  • Pathology (AREA)
  • Radiology & Medical Imaging (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Biophysics (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Medical Informatics (AREA)
  • Molecular Biology (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Endoscopes (AREA)

Abstract

The invention provides a medical image processing apparatus and an operating method thereof that can emphasize small differences in shade information without greatly impairing the original color tone of an observation target. Referring to the B/G ratio (X1), the G/R ratio (Y1), and the 1st G image signal G(L1), a point Sa in the B/G(X1), G/R(Y1) space is shifted in the 1st color direction D1 for pixels where G(L1) exceeds the upper limit value, and a point Sb is shifted in the 2nd color direction D2 for pixels where G(L1) falls below the lower limit value, thereby obtaining B/G(X2) and G/R(Y2).

Description

Medical image processing apparatus and method for operating the same
Technical Field
The present invention relates to a medical image processing apparatus for performing color emphasis processing on an observation target and an operation method thereof.
Background
In the medical field in recent years, an endoscope system including a light source device, an endoscope, and a processor device is widely used. In an endoscope system, illumination light is irradiated from an endoscope to an observation target, and an image of the observation target is displayed on a monitor based on RGB image signals obtained by imaging the observation target irradiated with the illumination light by an imaging element of the endoscope.
Diagnosis from an endoscopic image of the observation target relies on color, shape, and the like. In this regard, Patent Document 1 emphasizes the fine color difference between a normal portion and an abnormal portion by performing processing that expands the color difference between them. In Patent Document 2, even when the imaged object contains a mixture of bright and dark regions, the color balance of normal mucosa is maintained in both the bright and the dark regions. Further, although outside the endoscope field, Patent Document 3 acquires brightness information for each hue region from the image signal of a captured image obtained by a digital camera or the like, and performs exposure control based on the acquired brightness information, thereby optimizing the saturation of the captured image.
Patent document 1: japanese patent No. 5932894
Patent document 2: japanese patent No. 6580778
Patent document 3: japanese patent laid-open publication No. 2013-77879
As a method for enhancing an image, lightness and hue are generally treated as independent properties to be enhanced. Specifically, an RGB color image is projected into a color space such as Lab, HSV, or LHS, and lightness, hue, and saturation are controlled as independent quantities. In this framework, when shade information such as blood concentration or pigment concentration is the target of emphasis, brightness and contrast can generally be adjusted. However, when the differences in shade information are small, adjusting brightness or contrast alone does not sufficiently improve visibility.
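The conventional approach described here can be sketched as follows; this fragment is illustrative only (not from the patent) and simply scales the V channel in HSV space, leaving hue and saturation untouched, which is why small shade differences remain hard to see:

```python
import colorsys

def brighten(rgb, gain=1.2):
    """Conventional enhancement: adjust brightness alone in HSV space.

    rgb is an (r, g, b) tuple with components in [0, 1]. Hue and
    saturation are left unchanged; only V is scaled (and clipped)."""
    h, s, v = colorsys.rgb_to_hsv(*rgb)
    return colorsys.hsv_to_rgb(h, s, min(1.0, v * gain))
```

A neutral gray simply becomes a lighter gray; no hue information is added that could separate nearby shade levels.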
On the other hand, to emphasize shade information, pseudo-color display such as a heat map, which assigns colors unrelated to the original color tone, may be considered. In this case, shade information can be emphasized by assigning colors so that minute differences in it are easy to distinguish, but the original hue is lost, making it difficult to read information other than shade from the pseudo-color display.
Disclosure of Invention
An object of the present invention is to provide a medical image processing apparatus, and an operating method thereof, that can emphasize small differences in shade information without greatly impairing the original color tone of the observation target.
The medical image processing apparatus of the present invention includes a processor that performs the following: acquiring a color 1st medical image; acquiring, for each pixel constituting the color 1st medical image, 1st brightness information concerning brightness, one of the 3 attributes of color, from the image signal values, and 1st color information concerning the color information that remains when brightness is removed from the 3 attributes of color; referring to the 1st color information and the 1st brightness information, changing the 1st color information in the 1st color direction when the 1st brightness information exceeds a predetermined upper limit value, and changing it in the 2nd color direction, which is in a complementary or opposite color relationship to the 1st color direction, when the 1st brightness information falls below a predetermined lower limit value, thereby obtaining 2nd color information; synthesizing the 1st color information and the 2nd color information according to a 1st synthesis coefficient set from the 1st color information, thereby obtaining synthesized color information; converting the 1st brightness information according to a brightness conversion coefficient set from the 1st color information, thereby obtaining 2nd brightness information; synthesizing the 1st brightness information and the 2nd brightness information according to a 2nd synthesis coefficient set from the 1st color information, thereby obtaining synthesized brightness information; and converting the synthesized brightness information and the synthesized color information into a 2nd medical image whose image signal values are changed.
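As a concrete sketch of this per-pixel flow, the following Python fragment treats the color information as a single scalar for brevity; every numeric threshold, step, and coefficient here is invented for illustration, since the claim defines them only abstractly:

```python
def emphasize_pixel(c1, l1, upper=0.8, lower=0.2,
                    step=0.1, k1=0.5, conv=1.1, k2=0.5):
    """Toy per-pixel version of the claimed processing.

    c1: 1st color information (scalar stand-in), l1: 1st brightness info.
    All parameter values are placeholders, not from the patent."""
    # 2nd color information: shift in the 1st color direction when bright,
    # in the complementary 2nd color direction when dark, else unchanged
    if l1 > upper:
        c2 = c1 + step
    elif l1 < lower:
        c2 = c1 - step
    else:
        c2 = c1
    combined_color = k1 * c2 + (1.0 - k1) * c1        # 1st synthesis coefficient
    l2 = conv * l1                                    # brightness conversion coefficient
    combined_brightness = k2 * l2 + (1.0 - k2) * l1   # 2nd synthesis coefficient
    return combined_color, combined_brightness
```

Pixels between the two limits keep their color information exactly, which is how the original color tone is largely preserved.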
Preferably, the color indicated by the 1st color direction is yellow, and the color indicated by the 2nd color direction is blue. Preferably, the 1st synthesis coefficient and the 2nd synthesis coefficient are determined from the color information within a 1st color gamut in which the 1st color information falls within a predetermined range, and have fixed values outside the 1st color gamut. Preferably, the 1st and 2nd synthesis coefficients are greater than 0 within the 1st color gamut and change in accordance with a 1st distance or 1st angle determined from the color information, and are set to 0 outside the 1st color gamut.
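A minimal sketch of such a synthesis coefficient, assuming a circular 1st color gamut and a linear falloff with the 1st distance (neither the gamut shape nor the falloff is specified by the text):

```python
import math

def first_synthesis_coefficient(x1, y1, center=(0.5, 0.5), radius=0.3):
    """Illustrative 1st synthesis coefficient over (x1, y1) color info.

    Greater than 0 inside an assumed circular 1st color gamut, decreasing
    with the 1st distance from the gamut center, and fixed at 0 outside."""
    r1 = math.hypot(x1 - center[0], y1 - center[1])   # 1st distance
    if r1 >= radius:
        return 0.0          # fixed value outside the 1st color gamut
    return 1.0 - r1 / radius  # > 0 and distance-dependent inside
```

Because the coefficient vanishes outside the gamut, the color shift is blended in only for pixels whose color lies in the targeted region (e.g. high-concentration blood), and all other pixels pass through unchanged.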
Preferably, the processor performs the following processing: in a 2nd color gamut in which the 2nd color information falls within a predetermined range, 3rd color information is acquired by changing a 2nd distance or 2nd angle determined from the color information. Preferably, the 1st color information is any one of: the ratio G/R of the green signal G to the red signal R of the color 1st medical image and the ratio B/G of the blue signal B to the green signal G of the color 1st medical image; the color-difference signals Cr and Cb obtained from the color 1st medical image; the hue and saturation obtained from the color 1st medical image; and a*b* of CIE1976 L*a*b* obtained from the color 1st medical image.
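For illustration, the alternative forms of 1st color information listed above can be computed from a single RGB pixel as follows; the Cr/Cb weights used here are the BT.601 ones, which is an assumption, since the text does not fix a particular variant:

```python
import colorsys

def first_color_info(r, g, b, eps=1e-6):
    """Compute several candidate forms of 1st color information for one
    RGB pixel with components in [0, 1]. eps guards against division
    by zero; the CIE a*b* variant is omitted for brevity."""
    bg = b / (g + eps)                          # B/G signal ratio
    gr = g / (r + eps)                          # G/R signal ratio
    y = 0.299 * r + 0.587 * g + 0.114 * b       # luma (BT.601, assumed)
    cr, cb = 0.713 * (r - y), 0.564 * (b - y)   # color-difference signals
    h, s, _v = colorsys.rgb_to_hsv(r, g, b)     # hue and saturation
    return {"B/G": bg, "G/R": gr, "Cr": cr, "Cb": cb, "H": h, "S": s}
```

Each pair (B/G with G/R, Cr with Cb, H with S) spans a two-dimensional color plane with brightness removed, which is what the claim requires of the 1st color information.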
Preferably, the 1st color gamut is a color gamut corresponding to the color of high-concentration blood. Alternatively, the 1st color gamut is preferably a color gamut corresponding to the color of a mucous membrane in which a dye is dispersed.
The method for operating a medical image processing apparatus according to the present invention includes the steps of: acquiring a color 1st medical image; acquiring, for each pixel constituting the color 1st medical image, 1st brightness information concerning brightness, one of the 3 attributes of color, from the image signal values, and 1st color information concerning the color information that remains when brightness is removed from the 3 attributes of color; referring to the 1st color information and the 1st brightness information, changing the 1st color information in the 1st color direction when the 1st brightness information exceeds a predetermined upper limit value, and changing it in the 2nd color direction, which is in a complementary or opposite color relationship to the 1st color direction, when the 1st brightness information falls below a predetermined lower limit value, thereby acquiring 2nd color information; synthesizing the 1st color information and the 2nd color information with a 1st synthesis coefficient set from the 1st color information, thereby obtaining synthesized color information; converting the 1st brightness information with a brightness conversion coefficient set from the 1st color information, thereby obtaining 2nd brightness information; synthesizing the 1st brightness information and the 2nd brightness information with a 2nd synthesis coefficient set from the 1st color information, thereby obtaining synthesized brightness information; and converting the synthesized brightness information and the synthesized color information into a color 2nd medical image whose image signal values are changed.
Effects of the invention
According to the present invention, slight differences in shade information can be emphasized without significantly impairing the original color tone of the observation target.
Drawings
Fig. 1 is an external view of an endoscope system according to embodiment 1.
Fig. 2 is a block diagram showing functions of the endoscope system according to embodiment 1.
Fig. 3 is a graph showing emission spectra of violet light V, blue light B, green light G, and red light R.
Fig. 4 is a block diagram showing the function of the shade information enhancement processing unit when the 1st color information is the B/G ratio or the G/R ratio.
Fig. 5 is a block diagram showing the function of the 1st color change processing unit when the 1st color information is the B/G ratio or the G/R ratio.
Fig. 6 is an explanatory diagram showing the state before and after processing in the 1st color change processing unit when the 1st color information is the B/G ratio or the G/R ratio.
Fig. 7 is an explanatory diagram showing the 1st color gamut when the 1st color information is the B/G ratio or the G/R ratio.
Fig. 8 is an explanatory diagram showing the distribution of the 1st synthesis coefficient.
Fig. 9 is an explanatory diagram showing a color gamut corresponding to the color of high-concentration blood.
Fig. 10 is an explanatory diagram showing a color gamut corresponding to the color of a mucous membrane with a blue dye dispersed therein.
Fig. 11 is a block diagram showing the function of the color information synthesis unit.
Fig. 12 is a block diagram showing the function of the 2nd color change processing unit.
Fig. 13 is an explanatory diagram showing the 2nd color gamut.
Fig. 14 is a graph showing the relationship between the 2nd distance R2in and the 2nd distance R2out.
Fig. 15 is a graph showing the relationship between the 2nd angle θ2in and the 2nd angle θ2out.
Fig. 16 is an explanatory diagram showing functions of the brightness information acquisition section, the brightness information synthesis section, and the color image conversion section.
Fig. 17 is a flowchart showing a series of flows of the special light observation mode.
Fig. 18 is a block diagram showing the function of the shade information enhancement processing unit when the 1st color information is the color-difference signals Cr and Cb.
Fig. 19 is an explanatory diagram showing the state before and after processing in the 1st color change processing unit when the 1st color information is the color-difference signals Cr and Cb.
Fig. 20 is an explanatory diagram showing the 1st color gamut when the 1st color information is the color-difference signals Cr and Cb.
Fig. 21 is a block diagram showing the function of the shade information enhancement processing unit when the 1st color information is hue H and saturation S.
Fig. 22 is an explanatory diagram showing the state before and after processing in the 1st color change processing unit when the 1st color information is hue H and saturation S.
Fig. 23 is an explanatory diagram showing the 1st color gamut when the 1st color information is hue H and saturation S.
Fig. 24 is a block diagram showing functions of the endoscope system according to embodiment 2.
Fig. 25 is a graph showing the emission spectrum of normal light.
Fig. 26 is a graph showing the emission spectrum of the special light.
Fig. 27 is a block diagram showing functions of the endoscope system according to embodiment 3.
Fig. 28 is a plan view showing a rotary filter.
Fig. 29 is a graph showing the distribution of the 2nd synthesis coefficient.
Detailed Description
[ embodiment 1 ]
As shown in fig. 1, an endoscope system 10 according to embodiment 1 includes an endoscope 12, a light source device 13, a processor device 14, a display 15, and a user interface 16. The endoscope 12 is optically connected to the light source device 13 and electrically connected to the processor device 14. The light source device 13 supplies illumination light to the endoscope 12.
The endoscope 12 irradiates an observation target with illumination light, images the observation target, and acquires an endoscopic image. The endoscope 12 includes an insertion portion 12a to be inserted into the body of the subject, an operation portion 12b provided at the proximal end of the insertion portion 12a, and a bending portion 12c and a distal end portion 12d provided on the distal end side of the insertion portion 12a. The bending portion 12c bends in response to operation of the operation portion 12b. The distal end portion 12d irradiates the observation target with illumination light and receives reflected light from the observation target to image it; it is directed in a desired direction by the bending operation of the bending portion 12c. The operation portion 12b is provided with: a mode switching switch 12f for switching the operation mode; a still image acquisition instruction switch 12g for instructing acquisition of a still image of the observation target; and a zoom operation unit 12h for operating the zoom lens 21b.
The processor device 14 is electrically connected to a display 15 and a user interface 16. The processor device 14 receives endoscopic images from the endoscope 12. The display 15 outputs and displays an image, information, or the like of the observation target processed by the processor device 14. The user interface 16 has a keyboard, a mouse, a touch panel, a microphone, and the like, and has a function of accepting an input operation such as function setting. The expansion processor device 17 is electrically connected to the processor device 14.
The endoscope system 10 has a normal observation mode and a special light observation mode, which are switched by the mode switching switch 12f. The normal observation mode displays on the display 15 a normal image obtained by irradiating the observation target with normal light such as white light. The special light observation mode displays on the display 15 a shade-emphasized image, in which light and dark portions are emphasized, obtained by irradiating the observation target with special light including blue narrow-band light or the like. In the present embodiment, the shade-emphasized image is generated from a special image signal (described later) obtained by irradiating the observation target with special light, but it may instead be generated from an image signal obtained by irradiating the observation target with illumination light of various other wavelength regions, such as normal light.
As shown in fig. 2, the light source device 13 includes a light source unit 20 having a V-LED (Violet Light Emitting Diode) 20a, a B-LED (Blue Light Emitting Diode) 20b, a G-LED (Green Light Emitting Diode) 20c, and an R-LED (Red Light Emitting Diode) 20d, a light source control unit 21 for controlling the driving of the 4-color LEDs 20a to 20d, and an optical path coupling unit 23 for coupling the optical paths of the 4 colors of light emitted from the 4-color LEDs 20a to 20d. The light coupled by the optical path coupling unit 23 is irradiated into the subject via the light guide 24 inserted through the insertion portion 12a and the illumination lens 33. An LD (Laser Diode) may be used instead of the LEDs.
The light guide 24 is incorporated in the endoscope 12 and the universal cord (the cord connecting the endoscope 12, the light source device 13, and the processor device 14), and transmits the light coupled by the optical path coupling unit 23 to the distal end portion 12d of the endoscope 12. A multimode fiber can be used as the light guide 24; for example, a small-diameter optical fiber cable having a core diameter of 105 μm, a cladding diameter of 125 μm, and an overall diameter of 0.3 to 0.5 mm including a protective sheath layer can be used.
An illumination optical system 30 and an imaging optical system 31 are provided at the distal end portion 12d of the endoscope 12. The illumination optical system 30 has an illumination lens 33, and light from the light guide 24 is irradiated to the observation target via the illumination lens 33. The imaging optical system 31 includes an objective lens 35 and an image sensor 36. Reflected light from the observation target enters the image sensor 36 via the objective lens 35, so that a reflected image of the observation target is formed on the image sensor 36.
The image sensor 36 is a color image sensor that captures the reflected image of the subject and outputs image signals. The image sensor 36 is preferably a CCD (Charge Coupled Device) image sensor, a CMOS (Complementary Metal-Oxide Semiconductor) image sensor, or the like. The image sensor 36 used in the present invention is a color image sensor for obtaining image signals of the 3 colors R (red), G (green), and B (blue), that is, a so-called RGB image sensor including R pixels provided with R filters, G pixels provided with G filters, and B pixels provided with B filters. The R pixels output the R image signal, the G pixels output the G image signal, and the B pixels output the B image signal.
In the case of the normal observation mode, the image sensor 36 outputs a normal image signal by photographing an observation target irradiated with normal light. In the special light observation mode, the image sensor 36 captures an observation target irradiated with special light and outputs a special image signal.
The image sensor 36 may be a so-called complementary-color image sensor including complementary color filters of C (cyan), M (magenta), Y (yellow), and G (green) instead of an RGB color image sensor. When a complementary-color image sensor is used, 4-color CMYG image signals are output, so it is necessary to convert them into 3-color RGB image signals by complementary-to-primary color conversion. The image sensor 36 may also be a monochrome image sensor without color filters; in this case, the light source control unit 21 lights blue light B, green light G, and red light R in a time-division manner, and synchronization processing must be added to the processing of the image signals.
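The complementary-to-primary conversion can be sketched as below, assuming the ideal textbook relations Cy = G + B, Mg = R + B, and Ye = R + G; an actual sensor would use its own calibrated conversion matrix, so this is illustrative only:

```python
def cmyg_to_rgb(cy, mg, ye, gr):
    """Approximate complementary-to-primary color conversion.

    Assumes ideal complementary responses Cy = G + B, Mg = R + B,
    Ye = R + G and solves the linear system for R, G, B. The direct
    G sample gr is averaged in to reduce noise."""
    r = (ye + mg - cy) / 2.0
    g = (ye + cy - mg) / 2.0
    b = (cy + mg - ye) / 2.0
    return r, (g + gr) / 2.0, b
```

With exact complementary inputs the original primaries are recovered; real filter responses overlap, which is why a tuned matrix replaces this idealized solve in practice.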
The image signal output from the image sensor 36 is sent to the CDS/AGC circuit 40. The CDS/AGC circuit 40 performs correlated double sampling (CDS) and automatic gain control (AGC) on the image signal, which is an analog signal. The image signal that has passed through the CDS/AGC circuit 40 is converted into a digital image signal by an A/D (Analog/Digital) converter 42. The A/D-converted digital image signal is input to the processor device 14.
The processor device 14 includes an image signal input unit 45, a DSP (Digital Signal Processor) 46, a noise removal unit 47, a signal switching unit 48, a normal image processing unit 49, a shade information enhancement processing unit 50, and a video signal generation unit 51.
The image signal input unit 45 receives a digital image signal from the endoscope 12. The DSP46 performs various signal processes such as a defect correction process, an offset process, a gain process, a matrix process, a gamma conversion process, and a demosaicing process on the received image signal. In the defect correction processing, the signal of the defective pixel of the image sensor 36 is corrected. In the offset processing, a dark current component is removed from the image signal subjected to the defect correction processing, and an accurate zero level is set. In the gain processing, the signal level is adjusted by multiplying the offset-processed image signal by a certain gain.
The gain processing differs between the normal observation mode and the special light observation mode. In the gain processing of the normal observation mode, the R, G, and B image signals of the normal image signal are multiplied by the R, G, and B gain coefficients for normal observation, respectively. In the gain processing of the special light observation mode, the R, G, and B image signals of the special image signal are multiplied by the R, G, and B gain coefficients for special observation, respectively.
The gain-processed image signal is subjected to matrix processing for improving color reproducibility. The matrix processing is also different between the normal observation mode and the special light observation mode. As for the matrix processing in the normal observation mode, the matrix processing for normal observation is performed on the normal image signal. In the matrix processing in the special light observation mode, special observation matrix processing is performed on the special image signal.
Then, the brightness or saturation is adjusted by the gamma conversion process. The image signal after the matrix processing is subjected to demosaicing processing (also referred to as isotropic processing or synchronization processing), and a signal of a missing color in each pixel is generated by interpolation. By this demosaicing process, all pixels have signals of RGB colors.
The noise removal unit 47 removes noise from the image signal that has undergone gamma conversion and the other processing in the DSP 46 by applying noise removal processing (for example, a moving-average method or a median filtering method). The noise-removed image signal is sent to the signal switching unit 48.
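The median-filtering option named above can be sketched on a 1-D signal for brevity (the device applies it to 2-D image signals; edge samples are replicated here, one of several possible boundary policies):

```python
import statistics

def median_filter(signal, k=3):
    """1-D median filter with window size k (odd) and edge replication.

    Replaces each sample with the median of its k-sample neighborhood,
    suppressing impulse noise while preserving step edges."""
    half = k // 2
    padded = [signal[0]] * half + list(signal) + [signal[-1]] * half
    return [statistics.median(padded[i:i + k]) for i in range(len(signal))]
```

Unlike a moving average, the median removes an isolated spike completely instead of smearing it over its neighbors.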
The signal switching unit 48 sends the normal image signal to the normal image processing unit 49 when the mode switching switch 12f is set to the normal observation mode, and sends the special image signal to the shade information enhancement processing unit 50 when it is set to the special light observation mode.
The normal image processing unit 49 performs image processing for normal observation, including structure enhancement processing for normal observation and the like, on the normal image signal. The normal image signal that has undergone this processing is input from the normal image processing unit 49 to the video signal generation unit 51 as a normal image. The shade information enhancement processing unit 50 generates a shade-emphasized image from the special image signal; its details will be described later. The shade-emphasized image is input to the video signal generation unit 51.
The video signal generation unit 51 converts the normal image or the shade-emphasized image input from the normal image processing unit 49 or the shade information enhancement processing unit 50 into a video signal for display on the display 15. Based on this video signal, the display 15 displays the normal image or the shade-emphasized image.
As shown in fig. 3, the light source unit 20, composed of the V-LED 20a, B-LED 20b, G-LED 20c, and R-LED 20d provided in the light source device 13, emits light with the following emission spectra. The V-LED 20a generates violet light V with a center wavelength of 405 ± 10 nm and a wavelength range of 380 to 420 nm. The B-LED 20b generates blue light B with a center wavelength of 460 ± 10 nm and a wavelength range of 420 to 500 nm. The G-LED 20c generates green light G with a wavelength range of 480 to 600 nm. The R-LED 20d generates red light R with a center wavelength of 620 to 630 nm and a wavelength range of 600 to 650 nm.
In the normal observation mode, the light source control unit 21 lights the V-LED 20a, B-LED 20b, G-LED 20c, and R-LED 20d. The observation target is thus irradiated, as normal light, with light mixing the 4 colors violet light V, blue light B, green light G, and red light R. In the normal observation mode, the light source control unit 21 controls the LEDs 20a to 20d so that the light amount ratio among violet light V, blue light B, green light G, and red light R is Vc:Bc:Gc:Rc. In the special light observation mode, the light source control unit 21 controls the LEDs 20a to 20d so that the light amount ratio among violet light V, blue light B, green light G, and red light R is Vs:Bs:Gs:Rs.
In this specification, the light amount ratio includes the case where the ratio of at least one semiconductor light source is 0 (zero). It therefore includes cases where one, two, or more of the semiconductor light sources are not lit. For example, when the light amount ratio among violet light V, blue light B, green light G, and red light R is 1:0:0:0, only one semiconductor light source is lit and the other 3 are not; this too is treated as a light amount ratio.
As shown in fig. 4, the shade information enhancement processing unit 50 includes an image acquisition unit 60, a color and brightness information acquisition unit 61, a 1st color change processing unit 62, a synthesis coefficient setting unit 63, a color information synthesis unit 65, a 2nd color change processing unit 66, a brightness information acquisition unit 67, a brightness information synthesis unit 68, and a color image conversion unit 69.
In the processor device 14, programs related to the various processes of the shade information enhancement processing unit 50 are stored in a program memory. The functions of the image acquisition unit 60, the color and brightness information acquisition unit 61, the 1st color change processing unit 62, the synthesis coefficient setting unit 63, the color information synthesis unit 65, the 2nd color change processing unit 66, the brightness information acquisition unit 67, the brightness information synthesis unit 68, and the color image conversion unit 69 are realized by a control unit (not shown), constituted by a processor, executing these programs.
The image acquisition unit 60 acquires the R, G, and B image signals of the special image signal as the color 1st medical image. Among the special image signals acquired by the image acquisition unit 60, the R image signal is set as the 1st R image signal, the G image signal as the 1st G image signal, and the B image signal as the 1st B image signal. The image acquisition unit 60 sends the special image signal to the color and brightness information acquisition unit 61.
The image acquired by the image acquisition unit 60 is not limited to the 1st R, 1st G, and 1st B image signals of the special image signal, i.e., the color endoscopic image obtained by the endoscope 12; it may be a color 1st medical image obtained by various other medical apparatuses, such as a color ultrasonic image obtained by an ultrasonic diagnostic apparatus. Accordingly, the medical image processing apparatus of the present invention includes, in addition to the endoscope processor device 14 of the present embodiment, an ultrasonic processor device that processes ultrasonic images.
For each pixel constituting the special image signal, the color and brightness information acquisition section 61 acquires, from the image signal values, 1 st brightness information regarding brightness, which is one of the 3 attributes of color, and acquires 1 st color information regarding the color information that remains after the brightness is removed from the 3 attributes of color. The color and brightness information acquiring section 61 includes a signal ratio calculating section 61a for acquiring the 1 st color information. When the 1 st color information is acquired by the signal ratio calculating section 61a, the 1 st brightness information is preferably an image signal suitable for representing the brightness of each pixel in the special image signal. In the present embodiment, the 1 st G image signal of the special image signal serves as the 1 st brightness information, and is hereinafter expressed as G(L1) to distinguish it from other brightness information (2 nd brightness information) (see fig. 5).
The signal ratio calculating section 61a performs difference processing on the Log-converted 1 st G image signal and 1 st B image signal (logG - logB = log(G/B) = -log(B/G)), thereby calculating the B/G ratio (the B/G ratio is expressed by omitting "-log" from -log(B/G)). The signal ratio calculating section 61a also performs difference processing on the Log-converted 1 st R image signal and 1 st G image signal (logR - logG = log(R/G) = -log(G/R)), thereby calculating the G/R ratio. The G/R ratio, like the B/G ratio, is expressed by omitting "-log" from -log(G/R). The B/G ratio and the G/R ratio calculated by the signal ratio calculating section 61a are set as the 1 st color information. Hereinafter, the B/G ratio and the G/R ratio calculated by the signal ratio calculating section 61a are expressed as the B/G ratio (X1) and the G/R ratio (Y1) to distinguish them from other color information (2 nd color information, 3 rd color information) (refer to fig. 5).
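The log-difference processing performed by the signal ratio calculating section 61a can be sketched as follows. This is an illustrative reconstruction, not the patent's implementation: the log base is not specified in the text, so natural logarithms are assumed, and the function name is hypothetical.

```python
import numpy as np

def signal_ratios(b, g, r):
    """Compute the B/G and G/R ratios by log-difference processing.
    The "B/G ratio" of the patent is log(G/B) = -log(B/G), i.e. the
    leading minus sign is omitted; likewise for the G/R ratio."""
    b, g, r = (np.asarray(x, dtype=float) for x in (b, g, r))
    bg_ratio = np.log(g) - np.log(b)   # logG - logB = log(G/B) = -log(B/G)
    gr_ratio = np.log(r) - np.log(g)   # logR - logG = log(R/G) = -log(G/R)
    return bg_ratio, gr_ratio
```

Both ratios are computed per pixel, so the inputs are the full image planes in practice.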
The B/G ratio and the G/R ratio are obtained for each pixel from the pixel values of the pixels at the same position in the B image signal, the G image signal, and the R image signal. Further, since the B/G ratio is related to the blood vessel depth (the distance from the mucosal surface to the position where a specific blood vessel is located), the B/G ratio varies depending on the blood vessel depth. Since the G/R ratio is related to the blood volume (hemoglobin index), the G/R ratio varies with variation in the blood volume. Further, instead of the B/G ratio and the G/R ratio, a* and b* of the CIE1976 L*a*b* color space obtained from the color 1 st medical image can be used. In this case, a* and b* correspond to the 1 st color information, and L* corresponds to the 1 st brightness information.
The 1 st color change processing section 62 refers to the 1 st color information and the 1 st brightness information, changes the 1 st color information in the 1 st color direction in the case where the 1 st brightness information exceeds a predetermined upper limit value, and changes the 1 st color information in the 2 nd color direction having a relationship of complementary color or inverse color with the 1 st color direction in the case where the 1 st brightness information is less than a predetermined lower limit value, thereby acquiring the 2 nd color information.
Specifically, as shown in fig. 5 and 6, for a pixel Pa in which the pixel value of the G image signal G(L1) exceeds the upper limit value, the 1 st color change processing unit 62 changes the value Sa indicating the B/G ratio and the G/R ratio corresponding to the pixel Pa in the 1 st color direction D1. The values representing the B/G ratio and the G/R ratio after the change in the 1 st color direction D1 are set as Sa*. The change in the 1 st color direction D1 is preferably performed by changing the moving radius and the angle in a space obtained by polar-coordinate conversion of the B/G ratio and the G/R ratio.
Then, the 1 st color change processing unit 62 refers to the pixel value of the G image signal G(L1), and, for a pixel Pb whose pixel value indicating the brightness is smaller than the lower limit value, changes the value Sb indicating the B/G ratio and the G/R ratio corresponding to the pixel Pb in the 2 nd color direction D2. The values representing the B/G ratio and the G/R ratio after the change in the 2 nd color direction D2 are set as Sb*. The B/G ratio and the G/R ratio obtained by the changes performed by the 1 st color change processing unit 62 are defined as the B/G ratio (X2) and the G/R ratio (Y2). This makes it possible to perform a color change for lightness and darkness emphasis, which emphasizes the brightness information by a color change in, for example, a color gamut corresponding to the color of high-concentration blood or a color gamut corresponding to the color of a mucous membrane over which a dye has been dispersed.
In addition, in order to emphasize a region in which both bright portions and dark portions exist, the color represented by the 1 st color direction D1 is preferably complementary to, or the inverse of, the color represented by the 2 nd color direction D2. For example, the color indicated by the 1 st color direction is preferably yellow, and the color indicated by the 2 nd color direction is preferably blue. Since yellow is itself perceived as a bright color, a bright pixel region such as the pixel Pa is preferably changed in the yellow direction. On the other hand, since blue is perceived as dark, a dark pixel region such as the pixel Pb is preferably changed in the blue direction in order to suppress brightness. In this way, the difference between the bright portion and the dark portion is additionally expressed by a combination of hue and saturation, so the change in lightness and darkness is easily recognized.
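The threshold-based change of the 1 st color information described above can be sketched as below. The numeric thresholds, the direction vectors standing in for the yellow and blue directions, and the use of a simple vector offset (rather than the polar-coordinate change the patent prefers) are all assumptions for illustration.

```python
import numpy as np

# Hypothetical parameters: the patent gives no numeric values.
UPPER, LOWER = 200, 50       # brightness thresholds on G(L1) pixel values
D1 = np.array([0.1, 0.1])    # 1 st color direction (toward yellow), assumed
D2 = -D1                     # 2 nd color direction: inverse color of D1

def first_color_change(x1, y1, g_l1):
    """Shift the 1 st color information (X1, Y1) in the 1 st color
    direction for bright pixels and in the 2 nd (inverse) direction for
    dark pixels, yielding the 2 nd color information (X2, Y2)."""
    x2, y2 = x1.astype(float).copy(), y1.astype(float).copy()
    bright = g_l1 > UPPER
    dark = g_l1 < LOWER
    x2[bright] += D1[0]; y2[bright] += D1[1]
    x2[dark] += D2[0]; y2[dark] += D2[1]
    return x2, y2
```

Pixels whose brightness lies between the two limits keep their original color information.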
The synthesis coefficient setting unit 63 sets a 1 st synthesis coefficient for synthesizing the 1 st color information and the 2 nd color information described later, or a 2 nd synthesis coefficient for synthesizing the 1 st brightness information and the 2 nd brightness information described later, based on the 1 st color information. The 1 st synthesis coefficient and the 2 nd synthesis coefficient are preferably determined from the color information in the 1 st color gamut in which the 1 st color information is within a predetermined range, and have fixed values outside the 1 st color gamut.
Specifically, as shown in fig. 7, in the signal ratio space formed by the B/G ratio and the G/R ratio, the 1 st color gamut is defined by the 1 st centers XC1 and YC1, the 1 st distance R1 from the 1 st centers XC1 and YC1, and the 1 st angle θ1 from the 1 st center line CL1 passing through the 1 st centers XC1 and YC1.
The 1 st synthesis coefficient f1 and the 2 nd synthesis coefficient f2 are preferably greater than 0 in the 1 st color gamut, change in correspondence with the 1 st distance R1 or the 1 st angle θ1, and are set to 0 outside the 1 st color gamut. For example, as shown in fig. 8, the 1 st synthesis coefficient f1 is preferably set so that it is largest at the 1 st centers XC1 and YC1 and becomes smaller the farther from the 1 st centers XC1 and YC1. On the other hand, as shown in fig. 29, the 2 nd synthesis coefficient f2 is preferably set so that it is largest at the 1 st centers XC1 and YC1 and changes with the distance from the 1 st centers XC1 and YC1.
In addition, when the 1 st color gamut is a color gamut corresponding to a color of high-concentration blood such as a bleeding region, it is preferable to determine the 1 st color gamut in the 1 st quadrant of the signal ratio space composed of the B/G ratio and the G/R ratio. In this case, as shown in fig. 9, the 1 st color gamut 72 is preferably set as an overlapping region of a distance region 72a determined by the 1 st distance R1 and an angle region 72b determined by the 1 st angle θ 1 in the 1 st quadrant of the signal ratio space. When the 1 st color gamut corresponds to the color of the mucous membrane to which the blue-based coloring matter such as indigo carmine is dispersed, the 1 st color gamut is preferably determined in the 3 rd quadrant of the signal ratio space. In this case, as shown in fig. 10, the 1 st color gamut 73 is preferably set in the 3 rd quadrant of the signal ratio space as an overlapping area of a distance area 73a determined by the 1 st distance R1 and an angle area 73b determined by the 1 st angle θ 1.
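A synthesis coefficient of this shape, largest at the 1 st centers, decreasing with the 1 st distance, and fixed at 0 outside the gamut, can be sketched as follows. The numeric center, radius, angle bound, choice of a linear falloff, and orientation of the center line are illustrative assumptions.

```python
import numpy as np

# Assumed gamut parameters; the patent defines the 1 st color gamut by a
# center (XC1, YC1), a distance R1, and an angle theta1, without numbers.
XC1, YC1 = 0.5, 0.3
R1_MAX = 0.2                 # 1 st distance bounding the gamut
THETA1_MAX = np.pi / 6       # 1 st angle about the center line (assumed axis)

def f1(x, y):
    """1 st synthesis coefficient: maximal at the 1 st centers,
    decreasing linearly with the 1 st distance, and 0 outside the
    gamut bounded by R1_MAX and THETA1_MAX."""
    r = np.hypot(x - XC1, y - YC1)
    theta = np.arctan2(y - YC1, x - XC1)   # angle from the assumed center line
    inside = (r <= R1_MAX) & (np.abs(theta) <= THETA1_MAX)
    return np.where(inside, 1.0 - r / R1_MAX, 0.0)
```

The 2 nd synthesis coefficient f2 would be built the same way over the same gamut, differing only in its radial profile.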
The color information combining section 65 obtains the combined color information by combining the 1 st color information and the 2 nd color information according to the 1 st synthesis coefficient f1. Specifically, as shown in fig. 11, the combined color information is obtained by adding, to each pixel value of the B/G ratio (X1) and the G/R ratio (Y1) as the 1 st color information, the f1 × B/G ratio (X2) and the f1 × G/R ratio (Y2) obtained by multiplying each pixel value of the B/G ratio (X2) and the G/R ratio (Y2) as the 2 nd color information by the 1 st synthesis coefficient f1. That is, of the B/G ratio (X2) and the G/R ratio (Y2), the portions in the region where the coefficient is larger than 0 are added to the B/G ratio (X1) and the G/R ratio (Y1). The B/G ratio and the G/R ratio obtained by the color information synthesizing section 65 are referred to as the B/G ratio (MX) and the G/R ratio (MY).
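The per-pixel synthesis above reduces to one line of arithmetic; a minimal sketch, assuming the inputs are already aligned scalars or arrays of the same shape:

```python
def synthesize_color(x1, y1, x2, y2, f1_coef):
    """Combined color information: (MX, MY) = (X1 + f1*X2, Y1 + f1*Y2).
    f1_coef may be a scalar or a per-pixel map of the 1 st synthesis
    coefficient; outside the 1 st color gamut it is 0, so the 1 st
    color information passes through unchanged there."""
    return x1 + f1_coef * x2, y1 + f1_coef * y2
```

With f1_coef = 0 the output equals the 1 st color information, matching the fixed value outside the 1 st color gamut.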
The 2 nd color change processing part 66 acquires the 3 rd color information by changing the 2 nd distance or the 2 nd angle determined from the color information within the 2 nd color gamut in which the 2 nd color information is within the predetermined range. As shown in fig. 12, in the 2 nd color change processing section 66, in the signal ratio space formed by B/G (MX) and G/R (MY), the B/G ratio (X3) and the G/R ratio (Y3) are obtained by changing the 2 nd distance R2 or the 2 nd angle θ 2.
The 2 nd color gamut determined by the 2 nd distance R2 or the 2 nd angle θ2 is a region for emphasizing the color difference of a predetermined color emphasis target. The color emphasis target preferably includes, for example, a region that differs in color from a normal portion due to a lesion such as atrophic gastritis (an atrophic mucosa, a deep blood vessel, redness, or the like). In the 2 nd color change processing section 66, since the change in hue or saturation is nonlinearly expanded or contracted, the color change associated with the difference in lightness and darkness is further emphasized, and therefore the lightness and darkness information can be obtained more easily.
As shown in fig. 13, the 2 nd color gamut is defined, similarly to the 1 st color gamut, by the 2 nd distance R2 from the 2 nd centers XC2 and YC2 or the 2 nd angle θ2 from the 2 nd center line CL2 passing through the 2 nd centers XC2 and YC2 in the signal ratio space. In the 2 nd color change processing section 66, in order to change the 2 nd distance R2 or the 2 nd angle θ2, it is necessary to convert the B/G ratio (MX) and the G/R ratio (MY) into polar coordinates.
The 2 nd distance changing process of changing the 2 nd distance R2 is preferably performed as follows. As shown in fig. 14, the 2 nd distance R2out is output with respect to the input of the 2 nd distance R2in for coordinates included in the 2 nd color gamut. Here, the smaller the 2 nd distance R2, the lower the saturation, and the larger the 2 nd distance R2, the higher the saturation. The 2 nd distance change processing uses an S-shaped conversion curve TR representing the input/output relationship.
In the 2 nd distance change processing, within the 2 nd color gamut, the output value R2out is made smaller than the input value R2in in the low saturation range LR where the 2 nd distance R2 is small, while the output value R2out is made larger than the input value R2in in the high saturation range HR. This makes it possible to lower the saturation of an observation target included in the low saturation range and to increase the saturation of an observation target included in the high saturation range. By such saturation emphasis, the saturation difference between a plurality of observation targets can be increased. Outside the 2 nd color gamut, the output value R2out is the same as the input value R2in.
The 2 nd angle changing process of changing the 2 nd angle θ2 is preferably performed as follows. As shown in fig. 15, the 2 nd angle θ2out is output with respect to the input of the 2 nd angle θ2in for coordinates included in the 2 nd color gamut. The farther the 2 nd angle θ2 is from 0, the larger the color difference from the 2 nd center line CL2 becomes. The 2 nd angle change processing uses an S-shaped conversion curve TA representing the input/output relationship.
In the 2 nd angle change processing, within the 2 nd color gamut, the output value θ2out is made smaller than the input value θ2in in the hue range RA indicated by a negative 2 nd angle θ2, while the output value θ2out is made larger than the input value θ2in in the hue range RB indicated by a positive 2 nd angle θ2. This can increase the hue difference between an observation target included in the negative hue range RA and an observation target included in the positive hue range RB. Outside the 2 nd color gamut, the output value θ2out is the same as the input value θ2in.
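One way to realize such an S-shaped conversion curve is with a scaled tanh; the patent only shows TR and TA graphically, so the curve shape, gamut span, pivot, and gain below are all assumptions, and for brevity the sketch omits the pass-through behavior outside the 2 nd color gamut.

```python
import numpy as np

R2_LOW, R2_HIGH = 0.0, 1.0   # assumed span of 2 nd distances in the gamut
PIVOT = 0.5                  # boundary between low/high saturation (assumed)
GAIN = 2.0                   # steepness of the S-curve (assumed)

def s_curve(r2_in):
    """S-shaped conversion TR: outputs fall below the input in the low
    saturation range and above it in the high saturation range, so the
    saturation difference between observation targets is expanded.
    The endpoints and the pivot map to themselves."""
    span = R2_HIGH - R2_LOW
    t = (np.asarray(r2_in, dtype=float) - PIVOT) / span   # t in [-0.5, 0.5]
    return PIVOT + 0.5 * span * np.tanh(2 * GAIN * t) / np.tanh(GAIN)
```

The same curve applied to θ2 instead of R2 gives the 2 nd angle change: negative angles are pushed further negative, positive angles further positive.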
As shown in fig. 16, the brightness information acquisition section 67 converts the 1 st brightness information by the brightness conversion coefficient set according to the 1 st color information, thereby acquiring the 2 nd brightness information. The lightness conversion coefficient Tc is preferably the same as the 1 st synthesis coefficient f 1. Specifically, the G image signal G (L2) (G (L2) = Tc × G (L1)) as the 2 nd brightness information is calculated by multiplying the pixel value of each pixel of the G image signal G (L1) by the brightness conversion coefficient Tc.
The brightness information combining section 68 combines the 1 st brightness information and the 2 nd brightness information by the 2 nd synthesis coefficient set based on the 1 st color information, thereby acquiring the combined brightness information. Specifically, the combined brightness information G(LM) is calculated by multiplying the pixel value of each pixel of the G image signal G(L2) as the 2 nd brightness information by the 2 nd synthesis coefficient f2 and adding the pixel value of each pixel of the G image signal G(L1) as the 1 st brightness information (G(LM) = G(L1) + f2 × G(L2)).
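The two brightness formulas of sections 67 and 68 can be sketched together; this is a direct transcription of G(L2) = Tc × G(L1) and G(LM) = G(L1) + f2 × G(L2), with the function name assumed.

```python
def synthesize_brightness(g_l1, tc, f2):
    """2 nd brightness information: G(L2) = Tc * G(L1).
    Combined brightness: G(LM) = G(L1) + f2 * G(L2).
    tc and f2 are the per-pixel coefficients set from the 1 st color
    information (tc is preferably the same as f1 per the patent)."""
    g_l2 = tc * g_l1
    return g_l1 + f2 * g_l2
```

With f2 = 0 (outside the 1 st color gamut) the combined brightness is simply G(L1).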
The color image converting section 69 converts the 3 rd color information and the combined brightness information into the color 2 nd medical image. Specifically, the B/G ratio (X3) is converted into the 2 nd B image signal by performing an operation based on the G image signal G(LM) and the B/G ratio (X3). Then, the G/R ratio (Y3) is converted into the 2 nd R image signal by performing an operation based on the G image signal G(LM) and the G/R ratio (Y3). The G image signal G(LM) is output as the 2 nd G image signal without special conversion. These 2 nd B, 2 nd G, and 2 nd R image signals constitute the color 2 nd medical image. The color image conversion unit 69 outputs the color 2 nd medical image as a lightness and darkness emphasized image.
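The "operation based on" G(LM) and the ratios is not spelled out in the text. Assuming the ratios were formed with natural logarithms as log(G/B) and log(R/G), as in the earlier sketch, the inversion would be:

```python
import numpy as np

def to_color_image(g_lm, x3, y3):
    """Invert the signal-ratio representation back to RGB signals.
    Assumes X3 = log(G/B) and Y3 = log(R/G) with natural logs; the
    patent does not state the exact operation of section 69."""
    b2 = g_lm * np.exp(-x3)   # from X3 = log(G/B): B = G * exp(-X3)
    r2 = g_lm * np.exp(y3)    # from Y3 = log(R/G): R = G * exp(Y3)
    return b2, g_lm, r2       # 2 nd B, 2 nd G, 2 nd R image signals
```

Feeding the ratios computed earlier back through this function reproduces the original channels when no emphasis was applied.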
Next, a series of flows of the special light observation mode will be described with reference to a flowchart of fig. 17. The mode changeover switch 12f is operated to switch to the special light observation mode. Thereby, the special light is irradiated to the observation object. The special image signal is obtained by photographing the observation target with the image sensor 36. The image acquiring unit 60 acquires the 1 st R image signal, the 1 st G image signal, and the 1 st B image signal of the special image signal as the color 1 st medical image.
The color and brightness information acquiring section 61 acquires the B/G ratio (X1) and the G/R ratio (Y1) as the 1 st color information from the 1 st R image signal, the 1 st G image signal, and the 1 st B image signal of the special image signal. The color and brightness information acquiring unit 61 acquires the 1 st G image signal G (L1) as the 1 st brightness information.
The 1 st color change processing section 62 refers to the B/G ratio (X1), the G/R ratio (Y1), and the 1 st G image signal G (L1), and changes the values indicating the B/G ratio (X1) and the G/R ratio (Y1) in the 1 st color direction (yellow direction) when the 1 st G image signal G (L1) exceeds the upper limit value, and changes the values indicating the B/G ratio (X1) and the G/R ratio (Y1) in the 2 nd color direction (blue direction) when the 1 st G image signal G (L1) is less than the lower limit value. Thereby, the B/G ratio (X2) and the G/R ratio (Y2) are acquired as the 2 nd color information.
The color information synthesizing unit 65 synthesizes the B/G ratio (X1), the G/R ratio (Y1), the B/G ratio (X2), and the G/R ratio (Y2) with the preset 1 st synthesis coefficient f1 to obtain the B/G ratio (MX) and the G/R ratio (MY) as the combined color information. The color image conversion section 69 converts these into the color 2 nd medical image based on the B/G ratio (MX) and the G/R ratio (MY). The color 2 nd medical image is displayed on the display 15 as a lightness and darkness emphasized image. The above series of processing is repeated while the special light observation mode continues.
In the above embodiment, the lightness and darkness emphasized image is obtained using the B/G ratio (X1), the G/R ratio (Y1), and the 1 st G image signal G(L1) calculated by the signal ratio calculating unit 61a, but the lightness and darkness emphasized image may also be obtained using other 1 st color information and 1 st brightness information.
For example, the color difference signals Cr and Cb may be acquired from the color 1 st medical image as the 1 st color information, and the luminance signal Y may be acquired as the 1 st brightness information. As shown in fig. 18, the acquisition of the color difference signals Cr and Cb and the luminance signal Y is performed by the luminance-color-difference-signal converting section 61b. Except that the luminance-color-difference signal conversion section 61b acquires the color difference signals Cr and Cb and the luminance signal Y, the processing is the same as in the above-described embodiment using the B/G ratio and the G/R ratio. Thus, the color difference signals Cb and Cr correspond to the B/G ratio and the G/R ratio, and the luminance signal Y corresponds to the 1 st G image signal.
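Acquisition of Y, Cb, and Cr from an RGB pixel might look like the following. The BT.601 coefficients are an assumption, since the patent does not name the conversion standard used by section 61b.

```python
def rgb_to_ycbcr(r, g, b):
    """Luminance Y serves as the 1 st brightness information; the color
    difference signals Cb and Cr serve as the 1 st color information.
    Coefficients follow ITU-R BT.601 (assumed, not stated in the patent)."""
    y = 0.299 * r + 0.587 * g + 0.114 * b   # luminance
    cb = 0.564 * (b - y)                    # blue color difference
    cr = 0.713 * (r - y)                    # red color difference
    return y, cb, cr
```

For a gray pixel both color differences vanish, which is why (Cb, Cr) plays the same role here as the signal-ratio pair did before.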
When the color difference signals Cr and Cb and the luminance signal Y are used, as shown in fig. 19, the 1 st color change processing section 62 refers to the pixel value of the luminance signal Y, and, for a pixel Pa whose pixel value representing the brightness exceeds the upper limit value, changes the value Sa indicating the color difference signals Cr and Cb corresponding to the pixel Pa in the 1 st color direction D1. The values of the color difference signals Cr and Cb after the change in the 1 st color direction D1 are set to Sa*. The change in the 1 st color direction is preferably performed by changing the moving radius and the angle in the space obtained by polar-coordinate conversion of the color difference signals Cr and Cb.
The 1 st color change processing unit 62 refers to the pixel value of the luminance signal Y, and, for a pixel Pb whose pixel value representing the brightness is smaller than the lower limit value, changes the value Sb indicating the color difference signals Cr and Cb corresponding to the pixel Pb in the 2 nd color direction D2. The values representing the color difference signals Cr and Cb after the change in the 2 nd color direction D2 are set to Sb*.
In addition, when the color gamut corresponding to the color of high-concentration blood is set as the 1 st color gamut, the 1 st color gamut 72 is preferably determined in the 2 nd quadrant of the Cr, Cb space. As shown in fig. 20, the 1 st color gamut 72 is preferably set as an overlapping region of a distance region 72a determined by the 1 st distance R1 and an angle region 72b determined by the 1 st angle θ1 in the 2 nd quadrant of the Cr, Cb space.
Further, from the color 1 st medical image, the hue H (Hue) and the saturation S (Saturation) may be acquired as the 1 st color information, and the value V (Value) may be acquired as the 1 st brightness information. As shown in fig. 21, the hue H, the saturation S, and the value V are acquired by the HSV converting section 61c. Except that the hue H, the saturation S, and the value V are obtained by the HSV converting part 61c, the processing is the same as in the above embodiment using the B/G ratio and the G/R ratio. Thus, the hue H and the saturation S correspond to the B/G ratio and the G/R ratio, and the value V corresponds to the 1 st G image signal.
When the hue H, the saturation S, and the value V are used, as shown in fig. 22, the 1 st color change processing unit 62 refers to the pixel value of the value V, and, for a pixel Pa whose pixel value representing the value exceeds the upper limit value, changes the value Sa indicating the hue H and the saturation S corresponding to the pixel Pa in the 1 st color direction D1. The values indicating the hue H and the saturation S after the change in the 1 st color direction D1 are set to Sa*.
The 1 st color change processing unit 62 refers to the pixel value of the value V, and, for a pixel Pb whose pixel value representing the value is smaller than the lower limit value, changes the value Sb indicating the hue H and the saturation S corresponding to the pixel Pb in the 2 nd color direction D2. The values representing the hue H and the saturation S after the change in the 2 nd color direction D2 are set to Sb*.
In addition, when the color gamut corresponding to the color of high concentration blood is set to the 1 st color gamut, the 1 st color gamut 72 is preferably determined in the 2 nd quadrant in the HS space. As shown in fig. 23, the 1 st color gamut 72 is preferably set in the HS space to a region having a hue range and a saturation range of a color gamut corresponding to a color of high-concentration blood.
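The HSV acquisition of section 61c can be illustrated with the Python standard library; this assumes per-pixel channel values in the 0..1 range, and the function name is hypothetical.

```python
import colorsys

def hsv_first_info(r, g, b):
    """Acquire (H, S) as the 1 st color information and V as the 1 st
    brightness information for one pixel with channels in 0..1."""
    h, s, v = colorsys.rgb_to_hsv(r, g, b)
    return (h, s), v
```

A real device would apply this per pixel over the whole color 1 st medical image rather than one triple at a time.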
[ 2 nd embodiment ]
In embodiment 2, the observation target is illuminated using a laser light source and a phosphor instead of the 4-color LEDs 20a to 20d shown in embodiment 1. Otherwise, embodiment 2 is the same as embodiment 1.
As shown in fig. 24, in the endoscope system 100 according to embodiment 2, the light source device 13 is provided with a blue laser light source (indicated as "445LD" in fig. 24) 104 that emits a blue laser beam having a central wavelength of 445 ± 10nm and a blue-violet laser light source (indicated as "405LD" in fig. 24) 106 that emits a blue-violet laser beam having a central wavelength of 405 ± 10nm, instead of the 4-color LEDs 20a to 20d. The light emission from the semiconductor light emitting elements of the light sources 104 and 106 is individually controlled by the light source control unit 108, and the ratio of the light amount of the light emitted from the blue laser light source 104 to the light amount of the light emitted from the blue-violet laser light source 106 can be freely changed.
In the normal observation mode, the light source control unit 108 drives the blue laser light source 104. In contrast, in the case of the special light observation mode, both the blue laser light source 104 and the blue-violet laser light source 106 are driven. The laser beams emitted from the light sources 104 and 106 are incident on the light guide 24 via optical members (neither shown) such as a condenser lens, an optical fiber, and a combiner.
The half-value width of the blue laser beam or the blue-violet laser beam is preferably about ±10 nm. As the blue laser light source 104 and the blue-violet laser light source 106, broad-area InGaN-based laser diodes can be used, and InGaNAs-based laser diodes or GaNAs-based laser diodes can also be used. The light source may also be configured to use a light emitting body such as a light emitting diode.
In the illumination optical system 30, a phosphor 110 on which the blue laser beam or the blue-violet laser beam from the light guide 24 is incident is provided in addition to the illumination lens 33. When the phosphor 110 is irradiated with the blue laser beam, fluorescence is emitted from the phosphor 110, and a part of the blue laser beam passes directly through the phosphor 110. The blue-violet laser beam passes through without exciting the phosphor 110. The light emitted from the phosphor 110 is irradiated to the observation target via the illumination lens 33.
Here, in the normal observation mode, since mainly the blue laser beam is incident on the phosphor 110, as shown in fig. 25, the observation target is irradiated with normal light obtained by multiplexing the blue laser beam and the fluorescent light excited and emitted from the phosphor 110 by the blue laser beam. On the other hand, in the special light observation mode, since both the blue-violet laser beam and the blue laser beam are incident on the phosphor 110, as shown in fig. 26, special light in which excitation light as the blue-violet laser beam, the blue laser beam, and fluorescence excited and emitted from the phosphor 110 by the blue laser beam are multiplexed is irradiated to the observation target.
The phosphor 110 preferably includes a plurality of phosphors that absorb part of the blue laser beam and are excited to emit green to yellow light (for example, a YAG-based phosphor or a BAM (BaMgAl10O17) phosphor). As in this configuration example, when a semiconductor light emitting element is used as the excitation light source of the phosphor 110, high-intensity white light can be obtained with high luminous efficiency, the intensity of the white light can be easily adjusted, and variations in the color temperature and chromaticity of the white light can be kept small.
[ embodiment 3 ]
In embodiment 3, the observation target is illuminated using a broadband light source such as a xenon lamp and a rotary filter instead of the 4-color LEDs 20a to 20d shown in embodiment 1. Further, instead of the color imaging sensor 36, the observation target is imaged by a monochrome imaging sensor. The processor device 14 is not provided with the signal switching unit 48, and the image signal output from the noise removing unit 47 is input to the normal image processing unit 49 and to the lightness and darkness information enhancement processing unit 50 directly connected to the normal image processing unit 49.
In embodiment 3, in the normal observation mode, the normal image processing unit 49 operates, and the lightness and darkness information enhancement processing unit 50 directly transmits the normal image output from the normal image processing unit 49 to the video signal generation unit 51. On the other hand, in the special light observation mode, the lightness and darkness information enhancement processing unit 50 operates, and the normal image processing unit 49 directly sends the special image signal output from the noise removal unit 47 to the lightness and darkness information enhancement processing unit 50. Except for the above, embodiment 3 is the same as embodiment 1.
As shown in fig. 27, in the endoscope system 200 according to embodiment 3, the light source device 13 is provided with a broadband light source 202, a rotary filter 204, and a filter switching unit 205, instead of the 4-color LEDs 20a to 20d. Further, in the imaging optical system 30b, a monochrome imaging sensor 206 not provided with a color filter is provided instead of the color imaging sensor 36.
The broadband light source 202 is a xenon lamp, a white LED, or the like, and emits white light having a wavelength range from blue to red. The rotary filter 204 includes a normal observation mode filter 208 provided on the inner side and a special light observation mode filter 209 provided on the outer side (see fig. 28). The filter switching unit 205 moves the rotary filter 204 in the radial direction, inserting the normal observation mode filter 208 of the rotary filter 204 into the optical path of the white light when the mode changeover switch 12f is set to the normal observation mode, and inserting the special light observation mode filter 209 of the rotary filter 204 into the optical path of the white light when the special light observation mode is set.
As shown in fig. 28, the normal observation mode filter 208 is provided with a B filter 208a that transmits blue light in white light, a G filter 208B that transmits green light in white light, and an R filter 208c that transmits red light in white light, along the circumferential direction. Accordingly, in the normal observation mode, the rotary filter 204 is rotated, and thereby blue light, green light, and red light are alternately irradiated to the observation target.
The special light observation mode filter 209 is provided, along the circumferential direction, with a Bn filter 209a that transmits blue narrow-band light of a specific wavelength in the white light, a G filter 209b that transmits green light in the white light, and an R filter 209c that transmits red light in the white light. Accordingly, in the special light observation mode, the rotary filter 204 rotates, whereby blue narrow-band light, green light, and red light are alternately irradiated to the observation target. An image signal based on the blue narrow-band light is used as the B image signal of the special image signal, an image signal based on the green light is used as the G image signal of the special image signal, and an image signal based on the red light is used as the R image signal of the special image signal.
In the above embodiment, a two-dimensional LUT (Look-Up Table) may be used that, when input values (two pieces of color information such as the B/G ratio and the G/R ratio) are input to the 1 st color change processing unit 62 or the 2 nd color change processing unit 66, outputs the output values (two pieces of color information such as the B/G ratio and the G/R ratio) associated with them.
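A minimal sketch of such a two-dimensional LUT follows, with an assumed grid size and value range, and identity contents standing in for the precomputed color-change outputs a real device would store.

```python
import numpy as np

N = 64                               # grid resolution (assumed)
XS = np.linspace(-2.0, 2.0, N)       # B/G-ratio axis (assumed range)
YS = np.linspace(-2.0, 2.0, N)       # G/R-ratio axis (assumed range)
# Identity tables; in practice each entry would hold the output of the
# 1 st/2 nd color change processing for that (B/G, G/R) grid point.
LUT_X, LUT_Y = np.meshgrid(XS, YS, indexing="ij")

def lut_lookup(x, y):
    """Quantize an input (B/G, G/R) pair onto the grid and return the
    stored output pair (nearest-above grid point, clamped at edges)."""
    i = np.clip(np.searchsorted(XS, x), 0, N - 1)
    j = np.clip(np.searchsorted(YS, y), 0, N - 1)
    return LUT_X[i, j], LUT_Y[i, j]
```

Replacing the per-pixel arithmetic with a table lookup like this trades memory for a fixed, data-independent processing cost.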
In the above embodiment, the hardware structure of the processing units (processing units) such as the normal image processing unit 49, the lightness and darkness information enhancement processing unit 50, the video signal generating unit 51, the image acquiring unit 60, the color and brightness information acquiring unit 61, the 1 st color change processing unit 62, the synthesis coefficient setting unit 63, the color information synthesizing unit 65, the 2 nd color change processing unit 66, the brightness information acquiring unit 67, the brightness information synthesizing unit 68, the color image converting unit 69, the signal ratio calculating unit 61a, the luminance-color difference signal converting unit 61b, the HSV converting unit 61c, and the light source control unit 108 is any of the following various processors. The various processors include: a CPU (Central Processing Unit), which is a general-purpose processor that executes software (programs) to function as various processing units; a Programmable Logic Device (PLD) such as an FPGA (Field Programmable Gate Array), which is a processor whose circuit configuration can be changed after manufacture; a dedicated electric circuit, which is a processor having a circuit configuration designed exclusively to execute various kinds of processing; and the like.
One processing unit may be constituted by one of these various processors, or may be constituted by a combination of two or more processors of the same kind or different kinds (for example, a plurality of FPGAs or a combination of a CPU and an FPGA). Further, a plurality of processing units may be configured by one processor. First, as represented by a computer such as a client or a server, there is a system in which one processor is configured by a combination of one or more CPUs and software, and the processor functions as a plurality of processing units. Secondly, there is a System using a processor in which the functions of the entire System including a plurality of processing units are realized by one IC (Integrated Circuit) Chip, as typified by a System On Chip (SoC) or the like. In this manner, as the hardware configuration, various processing units are configured using one or more of the various processors described above.
More specifically, the hardware structure of these various processors is circuitry in which circuit elements such as semiconductor elements are combined. The hardware structure of the storage unit is a storage device such as an HDD (Hard Disk Drive) or an SSD (Solid State Drive).
Description of the symbols
10-endoscope system, 12-endoscope, 12a-insertion section, 12b-operation section, 12c-bending section, 12d-tip section, 12f-mode switch, 12g-still image acquisition instruction switch, 12h-zoom operation section, 13-light source device, 14-processor device, 15-display, 16-user interface, 20-light source section, 20a-V-LED, 20b-B-LED, 20c-G-LED, 20d-R-LED, 21-light source control section, 23-light path coupling section, 24-light guide, 30-illumination optical system, 31-image pickup optical system, 33-illumination lens, 35-objective lens, 36-image pickup sensor, 40-CDS/AGC circuit, 42-A/D converter, 45-image signal input section, 46-DSP, 47-noise removal section, 48-signal switching section, 49-normal image processing section, 50-depth information enhancement processing section, 51-video signal generating section, 60-image acquiring section, 61-color and brightness information acquiring section, 61a-signal ratio calculating section, 61b-luminance/color difference signal converting section, 61c-HSV converting section, 62-1st color change processing section, 63-synthesis coefficient setting section, 65-color information synthesizing section, 66-2nd color change processing section, 67-brightness information acquiring section, 68-brightness information synthesizing section, 69-color image converting section, 72, 73-1st color gamut, 72a, 73a-distance region, 72b, 73b-angle region, 100-endoscope system, 104-blue laser light source, 106-blue-violet laser light source, 108-light source control section, 110-phosphor, 200-endoscope system, 202-broadband light source, 204-rotation filter, 205-filter switching section, 206-image pickup sensor, 208-filter for normal observation mode, 208a-B filter, 208b-G filter, 208c-R filter, 209-filter for special light observation mode, 209a-Bn filter, 209b-G filter, 209c-R filter, Sa, Sa*-value, Sb, Sb*-value, D1-1st color direction, D2-2nd color direction, XC1, YC1-1st center, R1-1st distance, θ1-1st angle, CL1-1st centerline, XC2, YC2-2nd center, R2-2nd distance, θ2-2nd angle, CL2-2nd centerline, TR, TA-S-shaped transition curve, LR-low saturation range, HR-high saturation range, RA-negative side hue range, RB-positive side hue range.

Claims (12)

1. A medical image processing apparatus
comprising a processor,
wherein the processor performs the following processing:
acquiring a color 1st medical image;
acquiring, for each pixel constituting the color 1st medical image, from an image signal value, 1st brightness information relating to brightness, which is one of the three attributes of color, and 1st color information relating to the color information remaining after brightness is removed from the three attributes of color;
referring to the 1st color information and the 1st brightness information, and, in a case where the 1st brightness information exceeds a predetermined upper limit value, changing the 1st color information in a 1st color direction, and, in a case where the 1st brightness information falls below a predetermined lower limit value, changing the 1st color information in a 2nd color direction having a complementary or opposite color relationship with the 1st color direction, thereby acquiring 2nd color information;
synthesizing the 1st color information and the 2nd color information with a 1st synthesis coefficient set based on the 1st color information, thereby acquiring synthesized color information;
converting the 1st brightness information with a brightness conversion coefficient set based on the 1st color information, thereby acquiring 2nd brightness information;
synthesizing the 1st brightness information and the 2nd brightness information with a 2nd synthesis coefficient set based on the 1st color information, thereby acquiring synthesized brightness information; and
converting the synthesized brightness information and the synthesized color information into a color 2nd medical image in which the image signal values are changed.
2. The medical image processing apparatus according to claim 1,
the color represented by the 1st color direction is yellow, and the color represented by the 2nd color direction is blue.
3. The medical image processing apparatus according to claim 1,
the 1st synthesis coefficient and the 2nd synthesis coefficient are determined in accordance with the color information within a 1st color gamut in which the 1st color information falls within a predetermined range, and have fixed values outside the 1st color gamut.
4. The medical image processing apparatus according to claim 2,
the 1st synthesis coefficient and the 2nd synthesis coefficient are determined in accordance with the color information within a 1st color gamut in which the 1st color information falls within a predetermined range, and have fixed values outside the 1st color gamut.
5. The medical image processing apparatus according to claim 3,
the 1st synthesis coefficient and the 2nd synthesis coefficient are greater than 0 within the 1st color gamut, change in accordance with a 1st distance or a 1st angle determined from the color information, and are set to 0 outside the 1st color gamut.
6. The medical image processing apparatus according to claim 4,
the 1st synthesis coefficient and the 2nd synthesis coefficient are greater than 0 within the 1st color gamut, change in accordance with the 1st distance or the 1st angle determined from the color information, and are set to 0 outside the 1st color gamut.
7. The medical image processing apparatus according to any one of claims 1 to 6,
the processor performs the following processing:
within a 2nd color gamut in which the 2nd color information falls within a predetermined range, acquiring 3rd color information by changing a 2nd distance or a 2nd angle determined from the color information.
8. The medical image processing apparatus according to any one of claims 1 to 6,
the 1st color information is any one of:
a ratio R/G of a red signal R of the color 1st medical image to a green signal G of the color 1st medical image, and a ratio B/G of a blue signal B of the color 1st medical image to the green signal G;
color difference signals Cr and Cb obtained from the color 1st medical image;
hue and saturation obtained from the color 1st medical image; and
a*b* in CIE1976 L*a*b* obtained from the color 1st medical image.
9. The medical image processing apparatus according to claim 7,
the 1st color information is any one of:
a ratio R/G of a red signal R of the color 1st medical image to a green signal G of the color 1st medical image, and a ratio B/G of a blue signal B of the color 1st medical image to the green signal G;
color difference signals Cr and Cb obtained from the color 1st medical image;
hue and saturation obtained from the color 1st medical image; and
a*b* in CIE1976 L*a*b* obtained from the color 1st medical image.
10. The medical image processing apparatus according to claim 3 or 4,
the 1st color gamut is a color gamut corresponding to a color of high-concentration blood.
11. The medical image processing apparatus according to claim 3 or 4,
the 1st color gamut is a color gamut corresponding to a color of a mucous membrane on which a pigment has been sprayed.
12. A method of operating a medical image processing apparatus, comprising, by a processor:
a step of acquiring a color 1st medical image;
a step of acquiring, for each pixel constituting the color 1st medical image, from an image signal value, 1st brightness information relating to brightness, which is one of the three attributes of color, and 1st color information relating to the color information remaining after brightness is removed from the three attributes of color;
a step of referring to the 1st color information and the 1st brightness information, and, in a case where the 1st brightness information exceeds a predetermined upper limit value, changing the 1st color information in a 1st color direction, and, in a case where the 1st brightness information falls below a predetermined lower limit value, changing the 1st color information in a 2nd color direction having a complementary or opposite color relationship with the 1st color direction, thereby acquiring 2nd color information;
a step of synthesizing the 1st color information and the 2nd color information with a 1st synthesis coefficient set based on the 1st color information, thereby acquiring synthesized color information;
a step of converting the 1st brightness information with a brightness conversion coefficient set based on the 1st color information, thereby acquiring 2nd brightness information;
a step of synthesizing the 1st brightness information and the 2nd brightness information with a 2nd synthesis coefficient set based on the 1st color information, thereby acquiring synthesized brightness information; and
a step of converting the synthesized brightness information and the synthesized color information into a color 2nd medical image in which the image signal values are changed.
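The per-pixel pipeline recited in claims 1 and 12 can be sketched as follows. This is only an illustrative sketch, not the patented implementation: it uses HSV, one of the representations named in claim 8 (V as the 1st brightness information, H and S as the 1st color information), and every threshold, coefficient, hue value, and function name below is a hypothetical placeholder (in the patent, the synthesis coefficients depend on the 1st color information rather than being constants).

```python
import colorsys

# Hypothetical brightness limits and color directions (per claim 2, the
# 1st direction is yellow and the 2nd is blue); hues are HSV hue in [0, 1).
UPPER, LOWER = 0.8, 0.3
YELLOW_H, BLUE_H = 1 / 6, 2 / 3

def process_pixel(r, g, b, k1=0.5, k2=0.5, v_gain=0.9, shift=0.05):
    """Sketch of the claim-1 flow for one RGB pixel in [0, 1]."""
    # 1st brightness information (v) and 1st color information (h, s).
    h, s, v = colorsys.rgb_to_hsv(r, g, b)
    # 2nd color information: nudge the hue toward yellow when the pixel
    # is too bright, toward blue (the complementary direction) when too dark.
    h2, s2 = h, s
    if v > UPPER:
        h2 = h + shift * (YELLOW_H - h)
    elif v < LOWER:
        h2 = h + shift * (BLUE_H - h)
    # Synthesized color information: blend 1st and 2nd color information
    # with the 1st synthesis coefficient k1 (constant here for simplicity).
    hc = (1 - k1) * h + k1 * h2
    sc = (1 - k1) * s + k1 * s2
    # 2nd brightness information via a brightness conversion coefficient,
    # then synthesized brightness via the 2nd synthesis coefficient k2.
    v2 = min(1.0, v * v_gain)
    vc = (1 - k2) * v + k2 * v2
    # Convert the synthesized information back to an RGB signal value,
    # i.e. one pixel of the color 2nd medical image.
    return colorsys.hsv_to_rgb(hc, sc, vc)
```

A mid-brightness pixel passes through with only its brightness blended, while pixels outside the brightness limits additionally receive the hue shift, which is the depth-enhancement effect the claims describe.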
CN202210353152.4A 2021-04-06 2022-04-02 Medical image processing apparatus and method for operating the same Pending CN115191909A (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2021064956A JP2022160298A (en) 2021-04-06 2021-04-06 Medical image processing device and operation method thereof
JP2021-064956 2021-04-06

Publications (1)

Publication Number Publication Date
CN115191909A 2022-10-18

Family

ID=83574475

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210353152.4A Pending CN115191909A (en) 2021-04-06 2022-04-02 Medical image processing apparatus and method for operating the same

Country Status (2)

Country Link
JP (1) JP2022160298A (en)
CN (1) CN115191909A (en)

Also Published As

Publication number Publication date
JP2022160298A (en) 2022-10-19

Similar Documents

Publication Publication Date Title
JP6285383B2 (en) Image processing apparatus, endoscope system, operation method of image processing apparatus, and operation method of endoscope system
JP5460506B2 (en) Endoscope apparatus operating method and endoscope apparatus
JP5460507B2 (en) Endoscope apparatus operating method and endoscope apparatus
JP6243364B2 (en) Endoscope processor, operation method, and control program
US11089949B2 (en) Endoscope system and method of operating same
JP6690003B2 (en) Endoscope system and operating method thereof
JP7203477B2 (en) Endoscope system and its operating method
US9734592B2 (en) Medical image processing device and method for operating the same
JP5972312B2 (en) Medical image processing apparatus and operating method thereof
JP6556076B2 (en) Endoscopic image signal processing apparatus and method, and program
CN111712178B (en) Endoscope system and working method thereof
JP7096788B2 (en) How to operate a medical image processing device, an endoscope system, and a medical image processing device
JP7116223B2 (en) Endoscope system and its operating method
CN115191909A (en) Medical image processing apparatus and method for operating the same
JP6669539B2 (en) Image processing apparatus, operation method of image processing apparatus, and image processing program
US11304600B2 (en) Light source device, endoscope system, and method of operating light source device
JP6855610B2 (en) Image processing device, operation method of image processing device, and image processing program

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination