WO2022181263A1 - Medical image processing device, medical image processing method, and program - Google Patents

Medical image processing device, medical image processing method, and program Download PDF

Info

Publication number
WO2022181263A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
phosphor
depth position
fluorescence
position information
Application number
PCT/JP2022/003924
Other languages
French (fr)
Japanese (ja)
Inventor
穂 高橋
健太郎 深沢
大介 菊地
Original Assignee
Sony Group Corporation
Application filed by Sony Group Corporation
Priority to CN202280014492.2A (publication CN116887760A)
Publication of WO2022181263A1

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 1/00 Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 10/00 Other methods or instruments for diagnosis, e.g. instruments for taking a cell sample, for biopsy, for vaccination diagnosis; Sex determination; Ovulation-period determination; Throat striking implements
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01N INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N 21/00 Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N 21/62 Systems in which the material investigated is excited whereby it emits light or causes a change in wavelength of the incident light
    • G01N 21/63 Systems in which the material investigated is excited whereby it emits light or causes a change in wavelength of the incident light, optically excited
    • G01N 21/64 Fluorescence; Phosphorescence
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/50 Depth or shape recovery
    • G06T 7/514 Depth or shape recovery from specularities

Definitions

  • the present disclosure relates to a medical image processing apparatus, a medical image processing method, and a program.
  • In fluorescence observation, biological tissue is labeled with a fluorescent reagent and a fluorescence image of the biological tissue is observed.
  • By using a fluorescent reagent, biological structures such as blood vessels, blood flow, lymphatic flow, and tumors, which are difficult to see or photograph with the naked eye under white light, can be easily visualized.
  • An operator can perform accurate surgery (that is, fluorescence-guided surgery) by performing a procedure while confirming a fluorescence image together with a normal observation image obtained under white light.
  • However, fluorescence images are blurred by scattering of the fluorescence in living tissue.
  • The deeper the phosphor lies, the greater the degree of blurring of the fluorescence image tends to be. Therefore, it may be difficult to clearly grasp the boundaries of the phosphors in the fluorescence image.
  • Patent Document 1 discloses an apparatus that determines the depth position of a blood vessel using a plurality of spectroscopic images with different wavelength ranges, and applies blood vessel enhancement processing according to the depth position to the fluorescence image of the blood vessel.
  • the apparatus of Patent Document 1 acquires a spectroscopic image based on a normal observation image captured by irradiating a blood vessel with white light, and determines the depth position of the blood vessel based on the spectroscopic image.
  • However, the objects, such as blood vessels, whose depth position can be determined by the device of Patent Document 1 are limited to those that appear in the normal observation image. That is, the apparatus of Patent Document 1 cannot determine the depth position of an object that does not appear in the normal observation image. Therefore, it cannot determine the depth position of an object that appears in the fluorescence image but not in the normal observation image.
  • the device of Patent Literature 1 cannot, for example, determine the depth position of deep tissue that is far from the surface of the living tissue.
  • the present disclosure provides an advantageous technique for acquiring information based on the depth position of an observation target site in living tissue.
  • One aspect of the present disclosure is a medical image processing apparatus comprising: an image acquisition unit that acquires a fluorescence image obtained by photographing biological tissue containing a fluorescent substance while irradiating the biological tissue with excitation light; and a depth position information acquisition unit that acquires depth position information related to the depth position of the phosphor based on the fluorescence image, wherein the depth position information acquisition unit analyzes the fluorescence image to acquire spread information indicating the image intensity distribution of the phosphor in the fluorescence image, and compares the spread information with a spread function representing image intensity distribution in living tissue to acquire the depth position information.
  • the image acquisition unit may acquire a visible light image obtained by photographing the living tissue while irradiating the living tissue with visible light, and the depth position information acquisition unit may analyze the visible light image to estimate the type of the living tissue and obtain a spread function corresponding to the estimated type of living tissue.
  • the spread information may be the brightness distribution of the phosphor in the fluorescence image, and the spread function may be a line spread function based on the brightness distribution of the phosphor and the depth position information.
  • the spread function may include, as a parameter, a scattering coefficient determined according to the fluorescence wavelength of the phosphor, and the depth position information acquisition unit may acquire the scattering coefficient corresponding to the fluorescence wavelength and obtain the depth position information based on the spread information and the spread function reflecting that scattering coefficient.
  • the biological tissue includes a plurality of fluorescent substances having different fluorescence wavelengths
  • the image acquisition unit captures a fluorescent image obtained by irradiating the biological tissue with excitation light of each of the plurality of fluorescent substances and photographing the biological tissue.
  • the depth position information acquisition unit may acquire depth position information for each of the plurality of phosphors based on the fluorescence image.
  • the depth position information acquisition unit may acquire relative depth position information indicating the relative relationship of depth positions between the plurality of phosphors based on the depth position information of each of the plurality of phosphors.
  • the medical image processing apparatus may include an image quality adjustment unit that performs sharpening processing on the fluorescence image according to the depth position information.
  • the medical image processing apparatus may include an observation image generation unit that generates an observation image; the image acquisition unit may acquire a visible light image obtained by photographing the living tissue while irradiating the living tissue with visible light, and the observation image generation unit may generate the observation image by superimposing the portion of the fluorescence image that corresponds to the phosphor, after the sharpening process, on the visible light image.
  • the biological tissue contains a plurality of fluorescent substances having different fluorescence wavelengths
  • the image acquisition unit acquires a fluorescent image obtained by photographing the biological tissue while irradiating the biological tissue with excitation light of each of the plurality of fluorescent substances.
  • the depth position information acquisition unit acquires depth position information for each of the plurality of phosphors based on the fluorescence image;
  • an observation image may be generated by superimposing, on the visible light image, the portions of the fluorescence image corresponding to the plurality of phosphors after the relative brightness among the plurality of phosphors has been adjusted.
  • the medical image processing apparatus may include an observation image generation unit that generates an observation image; the image acquisition unit may acquire a visible light image obtained by photographing the living tissue while irradiating the living tissue with visible light, and the observation image generation unit may analyze the fluorescence image after the sharpening process to specify the range of the phosphor in the living tissue, and generate an observation image in which the portion of the visible light image corresponding to the range of the phosphor is emphasized.
  • the observation image generation unit may generate an observation image in which a portion corresponding to the range of the phosphor in the visible light image is emphasized according to the depth position information.
  • Another aspect of the present disclosure relates to a medical image processing method including: acquiring a fluorescence image obtained by photographing biological tissue containing a fluorescent substance while irradiating the biological tissue with excitation light; analyzing the fluorescence image to acquire spread information indicating the image intensity distribution of the phosphor in the fluorescence image; and acquiring depth position information related to the depth position of the phosphor by comparing the spread information with a spread function representing image intensity distribution in living tissue.
  • Another aspect of the present disclosure relates to a program that causes a computer to execute: a procedure for acquiring a fluorescence image obtained by photographing living tissue containing a fluorescent substance while irradiating the living tissue with excitation light; a procedure for analyzing the fluorescence image to acquire spread information indicating the image intensity distribution of the phosphor in the fluorescence image; and a procedure for acquiring depth position information related to the depth position of the phosphor by comparing the spread information with a spread function representing image intensity distribution in living tissue.
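For illustration, the method steps above (acquire a fluorescence image, extract spread information, compare it with a spread function, obtain depth position information) can be sketched end to end. The linear depth-to-blur model and all names below are assumptions for this toy sketch, not the patent's actual spread function.

```python
import numpy as np

# Toy spread function: maps phosphor depth (mm) to blur width (px).
# A real system would use a tissue-specific line spread function.
def spread_function(depth_mm):
    return 2.0 + 3.0 * depth_mm

def acquire_spread_info(profile):
    """Measure the blur width (standard deviation, in pixels) of a
    one-dimensional luminance profile: this is the 'spread information'."""
    x = np.arange(profile.size, dtype=float)
    total = profile.sum()
    mean = (x * profile).sum() / total
    return np.sqrt(((x - mean) ** 2 * profile).sum() / total)

def estimate_depth(measured_width_px):
    """Compare the spread information with the spread function over a grid
    of candidate depths and return the best-matching depth."""
    depths = np.linspace(0.0, 10.0, 1001)
    return depths[np.argmin(np.abs(spread_function(depths) - measured_width_px))]

# A phosphor profile blurred to sigma = 8 px maps back to 2 mm depth here.
x = np.arange(128, dtype=float)
profile = np.exp(-0.5 * ((x - 64.0) / 8.0) ** 2)
depth = estimate_depth(acquire_spread_info(profile))
```

The grid search stands in for whatever inversion of the spread function an actual implementation would use.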
  • FIG. 1 is a block diagram showing an example of a medical observation system.
  • FIG. 2 is a block diagram showing an example of a medical image processing apparatus.
  • FIG. 3 shows an example of a visible light image (normal observation image) of living tissue.
  • FIG. 4 shows an example of a fluorescence image of living tissue.
  • FIG. 5 shows an example of a fluorescence image (that is, a sharp fluorescence image) obtained by subjecting the fluorescence image shown in FIG. 4 to sharpening processing.
  • FIG. 6 is a cross-sectional view along the XZ plane of an example of living tissue.
  • FIG. 7 is a cross-sectional view along the XY plane of the living tissue shown in FIG. 6.
  • FIG. 8 shows an example of a function (that is, an approximation function) obtained by approximating the one-dimensional luminance distribution along the observation reference line of the phosphor based on the Gaussian function.
  • FIG. 9 is a diagram showing an example of an observation image according to the first embodiment;
  • FIG. 10 is a diagram showing another example of an observation image according to the first embodiment;
  • FIG. 11 is a flow chart showing an example of the medical image processing method according to the first embodiment.
  • FIG. 12 shows an example of the relationship between wavelength (horizontal axis) and fluorescence intensity (vertical axis) of two types of phosphors (ie, first phosphor and second phosphor).
  • FIG. 13 shows an example of a fluorescent image of living tissue taken before surgery with the first fluorescent reagent mixed in blood.
  • FIG. 14 shows an example of a fluorescent image of living tissue taken after surgery in a state in which blood is mixed with a second fluorescent reagent different from the first fluorescent reagent.
  • FIG. 15 is a flow chart showing an example of a medical image processing method according to the second embodiment.
  • FIG. 16 is a diagram showing an example of an observation image according to the second embodiment.
  • FIG. 17 is a diagram showing another example of an observation image according to the second embodiment;
  • FIG. 18 is a flow chart showing an example of a medical image processing method according to the third embodiment.
  • FIG. 19 is a diagram showing an example of an observation image according to the third embodiment. The remaining figures schematically show the overall configuration of a microscope system and two examples of an imaging system.
  • FIG. 1 is a block diagram showing an example of a medical observation system 10.
  • FIG. 2 is a block diagram showing an example of the medical image processing apparatus 12.
  • FIG. 3 shows an example of a visible light image (normal observation image) 100 of a living tissue 200.
  • FIG. 4 shows an example of a fluorescence image 101 of a biological tissue 200.
  • FIG. 5 shows an example of a fluorescence image 101 (that is, a sharp fluorescence image 102) obtained by performing sharpening processing (scattering suppression processing) on the fluorescence image 101 shown in FIG.
  • the medical observation system 10 shown in FIG. 1 includes an imaging unit 11, a medical image processing device 12 and an output unit 13.
  • the imaging unit 11, the medical image processing apparatus 12, and the output unit 13 may be provided integrally or may be provided separately.
  • controllers of two or more of the imaging unit 11, the medical image processing apparatus 12, and the output unit 13 may be configured by a common control unit.
  • the imaging unit 11, the medical image processing apparatus 12, and the output unit 13 are equipped with a transmitting/receiving section (not shown), and can transmit/receive data between them by wire and/or wirelessly.
  • the imaging unit 11 acquires a visible light image 100 (see FIG. 3) of the living tissue 200, which contains a fluorescent substance 202 (for example, in a blood vessel 201), by photographing the living tissue 200 while irradiating it with visible light.
  • the imaging unit 11 acquires a fluorescence image 101 (see FIG. 4) of the living tissue 200 by photographing the living tissue 200 while irradiating the living tissue 200 with excitation light.
  • the imaging unit 11 acquires both the visible light image 100 and the fluorescence image 101 of the biological tissue 200 to be observed.
  • the imaging unit 11 shown in FIG. 1 includes a camera controller 21 , a camera storage section 22 , an imaging section 23 , a light irradiation section 24 and a sample support section 25 .
  • the camera controller 21 controls components of the imaging unit 11 .
  • the camera storage unit 22 stores various data and programs.
  • a component of the imaging unit 11 (for example, the camera controller 21) can appropriately read, rewrite, update, and delete various data and programs in the camera storage section 22.
  • the sample support section 25 supports the sample of the biological tissue 200 to be observed while positioning it at a predetermined observation position. Arrangement of the sample of the living tissue 200 at the observation position may be manually performed by hand, or may be performed mechanically by a transfer device (not shown).
  • a sample of biological tissue 200 supported by the sample support portion 25 contains a fluorescent substance 202 labeled with a fluorescent reagent (eg, ICG (Indocyanine Green), 5-ALA, or fluorescein).
  • the living tissue 200 may include two or more phosphors 202 as described later.
  • the light irradiation unit 24 irradiates the living tissue 200 positioned at the observation position with imaging light (that is, visible light and excitation light).
  • the light irradiation section 24 has a visible light irradiation section 24a and an excitation light irradiation section 24b.
  • the visible light irradiation unit 24a emits visible light (especially white light) toward the observation position.
  • the excitation light irradiation unit 24b emits excitation light for fluorescence excitation of the phosphor 202 toward the observation position.
  • When fluorescent reagents having different excitation wavelengths may be used, the excitation light irradiation unit 24b can emit multiple types of excitation light at those excitation wavelengths. In this case, the excitation light irradiation unit 24b can selectively emit excitation light having a wavelength corresponding to the fluorescent reagent that is actually used.
  • the visible light irradiator 24a and the excitation light irradiator 24b may be composed of separate devices, or may be partially or wholly composed of a common device.
  • the imaging unit 23 captures the image of the living tissue 200 positioned at the observation position to acquire captured images (that is, the visible light image 100 and the fluorescence image 101).
  • the imaging unit 23 has a visible light imaging unit 23a and an excitation light imaging unit 23b.
  • the visible light imaging unit 23 a acquires a visible light image 100 of the biological tissue 200 .
  • the excitation light imaging unit 23b acquires the fluorescence image 101 of the biological tissue 200.
  • the visible light imaging unit 23a and the excitation light imaging unit 23b may be configured by separate devices, or may be partially or wholly configured by a common device.
  • the visible light image 100 and the fluorescence image 101 of the biological tissue 200 acquired in this way are transmitted from the imaging unit 11 to the medical image processing apparatus 12 under the control of the camera controller 21.
  • a specific transmission method of the visible light image 100 and the fluorescence image 101 from the imaging unit 11 to the medical image processing apparatus 12 is not limited.
  • the visible light image 100 and the fluorescence image 101 may be transmitted directly from the imaging unit 23 to the medical image processing apparatus 12 immediately after imaging, or may be transmitted to the medical image processing apparatus 12 via a device other than the imaging unit 23 (for example, the camera controller 21).
  • the visible light image 100 and the fluorescence image 101 may be temporarily stored in the camera storage unit 22 and then read out from the camera storage unit 22 and transmitted to the medical image processing apparatus 12 .
  • the medical image processing apparatus 12 analyzes the captured image of the living tissue 200 (especially the fluorescence image 101) and acquires the depth position information of the phosphor 202. The medical image processing apparatus 12 also generates observation images based on the visible light image 100 and the fluorescence image 101 .
  • In the observation image, the locations of the phosphors 202 in the living tissue 200 are visibly displayed.
  • the specific images and other information included in the observed image are not limited.
  • the output unit 13 has an output controller 31 , an output storage section 32 and a display device 33 .
  • the output controller 31 controls the components of the output unit 13.
  • the output storage unit 32 stores various data and programs.
  • the components of the output unit 13 (for example, the output controller 31) can appropriately read, rewrite, update, and delete various data and programs in the output storage section 32.
  • the display device 33 displays observation images sent from the medical image processing device 12 .
  • An observer such as an operator can confirm the range of the phosphor 202 in the living tissue 200 by viewing the observation image displayed on the display device 33 .
  • As shown in FIG. 2, the medical image processing apparatus 12 includes functional units such as an image processing controller 40, an image acquisition unit 41, a depth position information acquisition unit 42, an image quality adjustment unit 43, an observation image generation unit 44, and a processing storage unit 45. These functional units can be configured by arbitrary hardware and/or software, and two or more functional units may be realized by a common processing unit.
  • the image acquisition unit 41 acquires the visible light image 100 and the fluorescence image 101 of the biological tissue 200 (including the phosphor 202) from the imaging unit 11.
  • the image acquisition unit 41 either transmits the acquired visible light image 100 and fluorescence image 101 directly to the other processing units (the image processing controller 40, depth position information acquisition unit 42, image quality adjustment unit 43, and observation image generation unit 44), or temporarily stores them in the processing storage unit 45.
  • the other processing units of the medical image processing apparatus 12 (the image processing controller 40, depth position information acquisition unit 42, image quality adjustment unit 43, and observation image generation unit 44) may acquire the visible light image 100 and the fluorescence image 101 from the processing storage unit 45 as necessary.
  • the depth position information acquisition unit 42 acquires depth position information related to the depth position of the phosphor 202 based on the fluorescence image 101 . That is, the depth position information acquisition unit 42 derives depth position information of the phosphor 202 based on the degree of blurring of the phosphor 202 in the fluorescence image 101 .
  • the depth position information referred to here is information associated with the depth position of the phosphor 202, typically information directly indicating that depth position (for example, the absolute value of the distance (depth) from the tissue surface 200a). However, the depth position information may instead indicate the depth position of the phosphor 202 indirectly, or may be other information derived from the depth position and other information.
  • the depth position information acquisition unit 42 of this embodiment analyzes the fluorescence image 101 and acquires spread information indicating the image intensity distribution of the phosphor 202 in the fluorescence image 101 . Then, the depth position information acquisition unit 42 acquires depth position information of the phosphor 202 by comparing the spread information of the phosphor 202 with the spread function representing the image intensity distribution in the biological tissue 200 .
  • the spread information of the phosphor 202 indicates the degree of actual blurring of the phosphor 202 in the fluorescence image 101 .
  • the blurring of the phosphors 202 in the fluorescence image 101 is caused by light scattering in the biological tissue 200, and therefore varies depending on the depth position of the phosphors 202.
  • the spread function is a function in which the degree of blurring of the phosphor 202 in the fluorescence image 101 is formulated according to the depth position of the phosphor 202 .
  • the depth position information acquisition unit 42 derives the depth position of the phosphor 202 by comparing the spread information indicating the actual degree of blurring of the phosphor 202 with the formulated spread function.
  • FIG. 6 is a cross-sectional view of an example of the biological tissue 200 along the XZ plane.
  • FIG. 7 is a cross-sectional view along the XY plane of the biological tissue 200 shown in FIG.
  • FIG. 8 shows an example of a function (that is, an approximation function) obtained by approximating the one-dimensional luminance distribution of the phosphor 202 along the observation reference line 203 based on the Gaussian function.
  • the "X-axis", “Y-axis” and “Z-axis” are mutually perpendicular, and the direction along the Z-axis (ie, the Z-axis direction) indicates the depth direction.
  • Due to light scattering in the biological tissue 200, the phosphor 202 appears in the fluorescence image 101 over a wider range than its actual range.
  • the depth position information acquisition unit 42 acquires information on the "range of the phosphor 202 in the fluorescence image 101" wider than the actual range of the phosphor 202 as "spread information”.
  • the one-dimensional luminance distribution of the phosphor 202 in the fluorescence image 101 can be obtained as "spread information of the phosphor 202".
  • the depth position information acquisition unit 42 of the present embodiment analyzes the luminance distribution of the fluorescence image 101 and obtains the one-dimensional luminance distribution along the linear portion of the phosphor 202 (that is, the observation reference line 203) having the length L in the X-axis direction shown in FIG. 7.
  • the linear portion having the maximum length in the X-axis direction of the phosphor 202 is set as the observation reference line 203 .
  • the depth position information acquisition unit 42 of the present embodiment fits a statistically modeled function (for example, a probability distribution function such as a Gaussian function or a Lorentzian function) to the spread information of the phosphor 202 acquired in this way, thereby obtaining an approximation function.
  • By using such an approximation function, the comparison between the spread information and the spread function can be simplified, reducing the processing load.
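For illustration, the fitting step can be sketched as follows. This is a lightweight moment-based stand-in for fitting a Gaussian to the one-dimensional luminance profile; the function name and the use of intensity moments (rather than iterative least-squares fitting) are assumptions for this sketch, not the patent's implementation.

```python
import numpy as np

def fit_gaussian_moments(x, y):
    """Approximate a one-dimensional luminance profile y(x) by a Gaussian
    using intensity moments: the weighted mean gives the profile center,
    and the weighted standard deviation gives the spread (blur) width."""
    total = y.sum()
    mu = (x * y).sum() / total
    sigma = np.sqrt(((x - mu) ** 2 * y).sum() / total)
    return mu, sigma

# Synthetic luminance profile: Gaussian centered at 1.0 with sigma 2.0.
x = np.linspace(-10.0, 10.0, 401)
y = np.exp(-(x - 1.0) ** 2 / (2.0 * 2.0 ** 2))
mu, sigma = fit_gaussian_moments(x, y)
```

The recovered `sigma` plays the role of the approximation function's spread parameter, which is then compared against the spread function.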
  • the spread function is typically represented by a point spread function (PSF), which is a response function for a point light source, or a line spread function (LSF), which is a response function for a line light source.
  • the point spread function (PSF(ξ)) and the line spread function (LSF(ξ)) are expressed, for example, by the following equations.
  • d represents the depth position of the phosphor 202 (that is, the position in the depth direction from the tissue surface 200a to the phosphor 202 (in particular, the observation reference line 203)).
  • ⁇ a represents an absorption coefficient, and is determined according to the type of the biological tissue 200 (for example, an organ such as the liver; more specifically, the medium (composition) forming the biological tissue 200).
  • ⁇ s ′ represents an equivalent scattering coefficient and is determined according to the type of living tissue 200 and the fluorescence wavelength of the phosphor 202 .
  • ⁇ s ′ can be determined in association with the wavelength of the excitation light.
  • In many cases the difference between the excitation wavelength and the fluorescence wavelength is small, and that difference can sometimes be substantially ignored with respect to the degree of blurring of the phosphor 202. In such cases, the excitation wavelength can be regarded as the fluorescence wavelength, and μs′ can be defined in relation to the excitation wavelength.
  • ⁇ s ′ can be defined in association with the fluorescent reagent.
  • ⁇ s ′ is indirectly associated with the fluorescence wavelength.
  • the processing storage unit 45 of the present embodiment pre-stores many values of "μa" in association with the type of living tissue 200.
  • the processing storage unit 45 also stores in advance many values of "μs′" in association with the type of living tissue 200 and the fluorescence wavelength of the phosphor 202.
  • a specific value of "μa" can be read from the processing storage unit 45 according to the type of the actual living tissue 200.
  • a specific value of "μs′" can be read from the processing storage unit 45 according to the actual type of biological tissue 200 and the actual fluorescence wavelength of the phosphor 202.
  • "L" represents the length of the phosphor 202 on the XY plane perpendicular to the depth direction (see FIG. 7). In this embodiment, "L" is represented by the X-axis-direction length of the observation reference line 203 representing the phosphor 202.
  • "μd" is determined based on the absorption coefficient ("μa") and the equivalent scattering coefficient ("μs′"), as expressed by Equation 2 above.
  • the spread functions (that is, the point spread function "PSF(ξ)" and the line spread function "LSF(ξ)") have "d", "μa", "μs′" and "L" as parameters. Therefore, once "μa", "μs′" and "L" are fixed at certain values, each spread function is expressed as a function of "d" alone, and, as shown in Equation 4 above, the depth position "d" of the phosphor 202 can be represented by a function "f(ξ)" of the position "ξ" of the phosphor 202 in the XY plane.
  • the depth position information acquisition unit 42 of the present embodiment obtains the length "L" of the phosphor 202 under observation from the "range of the phosphor 202 in the fluorescence image 101" specified by analyzing the fluorescence image 101. It also reads the corresponding "μa" and "μs′" from the processing storage unit 45 according to the type of the living tissue 200 and the fluorescence wavelength of the phosphor 202. It then obtains "LSF(ξ)" reflecting the corresponding values of "L", "μa" and "μs′", and derives "f(ξ)" based on that "LSF(ξ)".
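For illustration, the coefficient lookup and their combination can be sketched as follows. The numeric values are placeholders standing in for the data in the processing storage unit 45, and the combined coefficient shown is the standard diffusion-theory effective attenuation, sqrt(3·μa·(μa+μs′)), which may differ from the parameter defined by the patent's Equation 2.

```python
import math

# Placeholder coefficient tables standing in for the processing storage
# unit 45. The values below are illustrative, not measured data.
MU_A = {"liver": 0.35, "kidney": 0.20}                       # absorption [1/mm]
MU_S_PRIME = {("liver", 830): 1.10, ("kidney", 830): 1.40}   # reduced scattering [1/mm]

def read_coefficients(tissue_type, fluorescence_wavelength_nm):
    """Look up mu_a by tissue type and mu_s' by tissue type plus
    fluorescence wavelength, mirroring the stored-data lookup above."""
    return MU_A[tissue_type], MU_S_PRIME[(tissue_type, fluorescence_wavelength_nm)]

def effective_attenuation(mu_a, mu_s_prime):
    """Standard diffusion-theory effective attenuation coefficient; the
    patent's own combined parameter may be defined differently."""
    return math.sqrt(3.0 * mu_a * (mu_a + mu_s_prime))

mu_a, mu_s = read_coefficients("liver", 830)
mu_eff = effective_attenuation(mu_a, mu_s)
```

Keying the scattering table on (tissue, wavelength) pairs reflects the text's point that μs′ depends on both the tissue type and the fluorescence wavelength.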
  • the depth position information acquisition unit 42 can acquire the type of the living tissue 200 and the fluorescence wavelength of the phosphor 202 by any method.
  • the depth position information acquisition unit 42 of the present embodiment estimates the type of the living tissue 200 by analyzing the visible light image 100, and determines the fluorescence wavelength of the phosphor 202 based on information sent from the imaging unit 11. However, the depth position information acquisition unit 42 may instead acquire the type of the biological tissue 200 and the fluorescence wavelength of the phosphor 202 based on information manually input by the operator to the medical observation system 10 (for example, to the medical image processing apparatus 12).
  • In this way, the depth position information acquisition unit 42 obtains, as the spread function, a "line spread function based on the luminance distribution of the phosphor 202 and its depth position" that is determined according to the type of the biological tissue 200. The depth position information acquisition unit 42 then derives the depth position information of the phosphor 202 by comparing the luminance-based spread information of the phosphor 202 (the approximation function) with this line spread function.
  • the image quality adjustment unit 43 (see FIG. 2) performs sharpening processing on the fluorescence image 101 according to the depth position information of the phosphor 202 .
  • sharpening processing is performed using the inverse of the spread function that indicates the degree of blurring of the phosphor 202 , and the image restoration filter derived from the spread function is applied to the fluorescence image 101 .
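A common way to realize such an image restoration filter is Wiener deconvolution, which regularizes the inverse of the spread function so that noise is not amplified at frequencies where its spectrum is small. The sketch below assumes the spread function is available as a small PSF kernel; the regularized form shown is the standard Wiener filter, offered as an illustration rather than the exact filter of this embodiment.

```python
import numpy as np

def wiener_sharpen(fluorescence_img, psf, snr=100.0):
    """Restore a blurred fluorescence image with a Wiener filter.

    A plain inverse filter 1/PSF amplifies noise wherever the PSF spectrum
    is small, so the regularized form H* / (|H|^2 + 1/SNR) is used instead.
    """
    # Zero-pad the PSF to the image size and center it at the origin so
    # FFT-based filtering aligns with the image grid.
    pad = np.zeros_like(fluorescence_img, dtype=float)
    ph, pw = psf.shape
    pad[:ph, :pw] = psf
    pad = np.roll(pad, (-(ph // 2), -(pw // 2)), axis=(0, 1))
    H = np.fft.fft2(pad)
    G = np.fft.fft2(fluorescence_img)
    W = np.conj(H) / (np.abs(H) ** 2 + 1.0 / snr)
    return np.real(np.fft.ifft2(W * G))
```

With the PSF taken from the spread function at the derived depth, this is one way the sharpening could be "optimized based on the depth position".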
  • the image quality adjustment unit 43 may perform other sharpening processing on the fluorescence image 101, or may perform arbitrary image quality adjustment processing on the visible light image 100 and/or the fluorescence image 101.
  • the observation image generation unit 44 generates an observation image 103 .
  • the observation image 103 is not limited as long as the location of the phosphor 202 in the living tissue 200 is represented visually.
  • FIG. 9 is a diagram showing an example of the observation image 103 according to the first embodiment.
  • FIG. 10 is a diagram showing another example of the observation image 103 according to the first embodiment.
  • the observation image generator 44 can generate the observation image 103 by emphasizing the portion corresponding to the range of the phosphor 202 in the visible light image 100 (see FIG. 9).
  • the observation image generator 44 can more accurately identify the range of the phosphor 202 in the biological tissue 200 by analyzing the fluorescence image 101 after undergoing the sharpening process. Then, the observation image generation unit 44 can generate the observation image 103 by performing processing for emphasizing the corresponding portion of the range of the phosphor 202 in the visible light image 100 .
  • an observer observing the observation image 103 can clearly confirm the range of the phosphor 202 on the visible light image 100 .
  • the observation image generation unit 44 can generate the observation image 103 by superimposing the portion of the fluorescence image 101 corresponding to the phosphor 202, after the sharpening process, on the corresponding portion of the visible light image 100 (see FIG. 10).
  • an observer observing the observation image 103 can visually recognize the fluorescence state of the phosphor 202 on the visible light image 100 .
  • the observation image generation unit 44 may generate the observation image 103 by superimposing, on the visible light image 100, the fluorescence image 101 of the phosphor 202 after performing image processing (for example, processing for adjusting color, gradation, and brightness).
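The superimposition described above can be sketched as a simple intensity-weighted alpha blend. The threshold, tint color, and alpha value below are arbitrary illustrative choices, not values from the embodiment.

```python
import numpy as np

def superimpose(visible_rgb, fluorescence, threshold=0.1,
                tint=(0.0, 1.0, 0.0), alpha=0.6):
    """Blend the fluorescing region of a fluorescence image onto a visible
    light image.  Pixels below `threshold` are treated as background and
    left untouched; fluorescing pixels are tinted (green here, an arbitrary
    choice) with intensity-weighted alpha blending."""
    out = visible_rgb.astype(float).copy()
    mask = fluorescence > threshold
    weight = alpha * np.clip(fluorescence, 0.0, 1.0)
    for c in range(3):
        chan = out[..., c]
        chan[mask] = ((1.0 - weight[mask]) * chan[mask]
                      + weight[mask] * tint[c])
    return out
```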
  • Information directly or indirectly indicating the depth position of the phosphor 202 may be reflected in the observed image 103 (see FIG. 9).
  • the observer of the observation image 103 can intuitively grasp the depth position information of the phosphor 202 .
  • Information indicating the depth position information of the phosphor 202 can be reflected in the observed image 103 in any form.
  • a portion corresponding to the range of the phosphor 202 may be emphasized according to the depth position information.
  • the observation image generation unit 44 may adjust the color, pattern, shading, and/or brightness of the portion corresponding to the range of the phosphor 202 in the observation image 103 according to the depth position information of the corresponding phosphor 202.
  • the observation image 103 may include an index display indicating the relationship between the highlighted display of the phosphor 202 and the depth position information of the phosphor 202 .
  • the depth position of the phosphor 202 is represented by the shade of the filling of the phosphor 202, and a bar-like index display indicating the relationship between that shade and the depth position (that is, the distance from the tissue surface 200a, 0 mm to 20 mm) is included in the observed image 103.
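One minimal way to realize the shade-based depth display is a linear mapping from the 0 mm to 20 mm depth range onto a fill shade. The direction of the mapping (shallower drawn darker) is an arbitrary choice for illustration.

```python
def depth_to_shade(depth_mm, d_min=0.0, d_max=20.0):
    """Map a phosphor depth (distance from the tissue surface, 0-20 mm as
    in the described index bar) to a fill shade in [0, 1], where shallower
    phosphors are drawn darker (1.0) and deeper ones lighter (0.0)."""
    t = (depth_mm - d_min) / (d_max - d_min)
    t = min(max(t, 0.0), 1.0)   # clamp out-of-range depths to the bar ends
    return 1.0 - t
```

The same mapping would drive both the phosphor fill and the index bar, keeping the two consistent.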
  • the imaging conditions for the visible light image 100 and/or the fluorescent image 101 of the biological tissue 200 may be optimized according to the depth position information of the phosphor 202 . That is, the imaging unit 11 shown in FIG. 1 may adjust the imaging conditions based on the “depth position information of the phosphor 202 ” sent from the medical image processing apparatus 12 . In this case, the imaging unit 11 may adjust, for example, the driving conditions of the camera (including the imaging element) and the state of the imaging light applied to the living tissue 200 (for example, the brightness of the excitation light and/or visible light).
  • FIG. 11 is a flow chart showing an example of the medical image processing method according to the first embodiment.
  • the image acquisition unit 41 of the medical image processing apparatus 12 acquires the visible light image 100 and the fluorescence image 101 of the biological tissue 200 from the imaging unit 11 (S1 in FIG. 11).
  • the depth position information acquisition unit 42 analyzes the visible light image 100 to identify the type of the living tissue 200 (S2).
  • a specific method of analyzing the visible light image 100 for specifying the type of the living tissue 200 is not limited, and the type of the living tissue 200 can be specified by using a known image processing method.
  • the depth position information acquisition unit 42 acquires a line spread function corresponding to the type of the living tissue 200 (S3).
  • the depth position information acquisition unit 42 of the present embodiment acquires the line spread function reflecting the corresponding absorption coefficient and equivalent scattering coefficient read from the processing storage unit 45, and the length L of the phosphor 202 (in particular, the length along the observation reference line 203).
  • the depth position information acquisition unit 42 analyzes the fluorescence image 101 and acquires line spread information regarding the luminance of the phosphor 202 in the fluorescence image 101 (S4).
  • a specific analysis method of the fluorescence image 101 for acquiring the line spread information of the phosphor 202 is not limited, and the line spread information of the phosphor 202 (in particular, along the observation reference line 203) can be acquired by using a known image processing method.
  • the depth position information acquiring unit 42 compares the line spread information of the phosphor 202 with the line spread function to derive the depth position of the phosphor 202 (S5).
  • the image quality adjustment unit 43 performs sharpening processing (that is, light scattering suppression processing) optimized based on the depth position of the phosphor 202 on the fluorescence image 101 (S6).
  • observation image generator 44 generates an observation image 103 from the visible light image 100 and the fluorescence image 101 (especially the fluorescence image 101 after the sharpening process) (S7).
  • the image quality adjustment unit 43 and the observation image generation unit 44 may perform arbitrary image processing to adjust the image quality of the visible light image 100, the fluorescence image 101, and/or the observation image 103 of the living tissue 200.
  • the observed image 103 is sent from the medical image processing device 12 to the output unit 13 and displayed on the display device 33 .
  • Information indicating the depth position of the phosphor 202 is sent from the medical image processing apparatus 12 to the imaging unit 11 as necessary.
  • the depth position of the phosphor 202 can be acquired based on the degree of image blur of the phosphor 202 in the fluorescence image 101 .
  • not only the phosphor 202 that appears in the visible light image 100 but also the phosphor 202 that does not appear in the visible light image 100 can have its depth position appropriately acquired. Moreover, the depth position of the phosphor 202 in the biological tissue 200 can be acquired without providing a special device.
  • image processing (such as sharpening processing) optimized for the depth position of the phosphor 202 can be performed.
  • the visibility of the phosphor 202 in the fluorescence image 101 can be improved, and the range of the phosphor 202 in the biological tissue 200 can be specified more accurately.
  • an observer such as an operator can easily and accurately grasp the state of the phosphor 202 in the biological tissue 200 from the observation image 103.
  • the living tissue 200 includes a plurality of phosphors 202 having mutually different fluorescence wavelengths.
  • two or more types of fluorescent reagents are used to label one or more types of tissues, so that two or more types of phosphors 202 having mutually different fluorescence wavelengths are included in the living tissue 200 to be observed.
  • FIG. 12 shows an example of the relationship between the wavelength (horizontal axis) and the fluorescence intensity (vertical axis) of two types of phosphors 202 (that is, the first phosphor 202a and the second phosphor 202b).
  • the first phosphor 202a and the second phosphor 202b shown in FIG. 12 are excited in different wavelength ranges, and have different peak fluorescence wavelengths (that is, fluorescence wavelengths showing maximum fluorescence intensity). Specifically, the peak fluorescence wavelength Pw1 of the first phosphor 202a is shorter than the peak fluorescence wavelengths Pw2 and Pw3 of the second phosphor 202b.
  • when both the liver segment and the liver cancer region are fluorescently colored using one type of fluorescent reagent, it cannot be determined whether a boundary of the fluorescent region indicates the boundary between the liver segment and the liver cancer region.
  • the liver segment can be labeled with the first fluorescent reagent
  • the liver cancer region can be labeled with the second fluorescent reagent having a different fluorescence wavelength (especially peak fluorescence wavelength) from the first fluorescent reagent.
  • an observer such as an operator can grasp each of the liver segment (first phosphor 202a) and the liver cancer region (second phosphor 202b) from the fluorescence image while clearly distinguishing the boundary between the two. Therefore, the operator can more accurately grasp the position and range of the liver cancer region in the entire liver, and can appropriately remove the liver cancer while suppressing expansion of the tissue excision range.
  • FIG. 13 shows an example of a fluorescent image 101 of a biological tissue 200 taken before surgery with the first fluorescent reagent mixed in blood.
  • FIG. 14 shows an example of a fluorescent image 101 of a biological tissue 200 taken after surgery in a state in which a second fluorescent reagent different from the first fluorescent reagent is mixed with blood.
  • Fluorescent reagents mixed with blood before surgery may remain in blood vessels (especially at the observation site) even after surgery. Therefore, if the fluorescent reagent mixed into blood before surgery and the fluorescent reagent mixed into blood after surgery are the same, it cannot be determined whether the fluorescence observed after surgery is due to the reagent administered before surgery or the reagent administered after surgery.
  • since the fluorescence wavelength of the first fluorescent reagent mixed into blood before surgery and the fluorescence wavelength of the second fluorescent reagent mixed into blood after surgery differ, the fluorescence emitted by the first fluorescent reagent (see FIG. 13) and the fluorescence emitted by the second fluorescent reagent (see FIG. 14) can be clearly distinguished. Therefore, the state of blood flow after surgery can be quickly and appropriately observed from the fluorescence image 101 without performing treatment (that is, washing treatment) to remove the first fluorescent reagent from the observation target site before mixing the second fluorescent reagent into the blood.
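Attributing observed fluorescence to the pre-operative or post-operative reagent then reduces to checking which emission band its peak wavelength falls into. A sketch, with invented band edges:

```python
def classify_reagent(peak_wavelength_nm,
                     band1=(780, 810), band2=(820, 860)):
    """Attribute observed fluorescence to the pre-operative (first) or
    post-operative (second) reagent by its peak emission wavelength.
    The band edges here are illustrative, not taken from the document."""
    if band1[0] <= peak_wavelength_nm <= band1[1]:
        return "first reagent (pre-op)"
    if band2[0] <= peak_wavelength_nm <= band2[1]:
        return "second reagent (post-op)"
    return "unknown"
```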
  • it is preferable that the display of the multiple phosphors 202 in the observation image 103 is changed according to the type of phosphor 202 (that is, the type of fluorescent reagent).
  • when an observation image 103 is generated by superimposing a fluorescence image 101 on a visible light image 100, if there is a large difference in brightness between the plurality of types of phosphors 202 in the fluorescence image 101, it is difficult to visually recognize the darker phosphors 202 in the fluorescence image 101 and the observation image 103.
  • by reducing the difference in brightness between the phosphors 202, each phosphor 202 can be made easier to see.
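The brightness adjustment between phosphors can be sketched as per-region normalization to a common target level. `labels` is a hypothetical integer mask identifying each phosphor's region (for example, obtained by thresholding each fluorescence wavelength channel), and the target value is arbitrary.

```python
import numpy as np

def equalize_phosphor_brightness(fluor_img, labels, target=0.8):
    """Scale each phosphor region so its peak brightness matches `target`.

    `labels` is an integer mask (0 = background, 1..N = phosphor regions)."""
    out = fluor_img.astype(float).copy()
    for lab in np.unique(labels):
        if lab == 0:
            continue                       # leave the background untouched
        region = labels == lab
        peak = out[region].max()
        if peak > 0:
            out[region] *= target / peak   # bring this region's peak to target
    return out
```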
  • the relative depth positional relationship between the first phosphor 202a and the second phosphor 202b cannot be grasped from the fluorescence image 101 alone.
  • the fluorescence wavelengths of the first phosphor 202a and the second phosphor 202b are different from each other. Therefore, even if the degree of blurring in the fluorescence image 101 is simply compared between the first phosphor 202a and the second phosphor 202b, it is difficult to determine which of the two is positioned closer to the tissue surface.
  • the display color, pattern, shade, and/or brightness of each phosphor 202 in the observation image 103 may be adjusted according to the depth position information of the corresponding phosphor 202.
  • the depth position of each phosphor 202 can be grasped from the observed image 103, and the relative depth positional relationship between the phosphors 202 can be grasped.
  • when the observed image 103 is generated by highlighting the corresponding ranges of the phosphors 202 in the visible light image 100, if all the phosphors 202 are highlighted in the same way regardless of their types, the relative depth positional relationship between the phosphors 202 cannot be discerned. Further, even if the depth position information of each phosphor 202 is reflected in the observation image 103, if all the phosphors 202 are highlighted in the same way regardless of type, the type of each phosphor 202 cannot be identified in the observation image 103.
  • each phosphor 202 in the observation image 103 may be emphasized according to both the depth position information and the type of phosphor 202.
  • both the depth position of each phosphor 202 and the type of each phosphor 202 can be grasped from the observed image 103 .
  • FIG. 15 is a flowchart showing an example of a medical image processing method according to the second embodiment.
  • the living tissue 200 includes two phosphors 202 (a first phosphor 202a and a second phosphor 202b) will be described below.
  • the processing performed on a single phosphor 202 in the above-described first embodiment is performed for each of the plurality of phosphors 202 (the first phosphor 202a and the second phosphor 202b).
  • the image acquisition unit 41 of the medical image processing apparatus 12 acquires the visible light image 100 and the fluorescence image 101 of the biological tissue 200 from the imaging unit 11 (S11 in FIG. 15).
  • the visible light image 100 of the living tissue 200 is obtained by the imaging unit 11 performing visible light imaging once.
  • the fluorescence image 101 of the biological tissue 200 may be obtained by the imaging unit 11 performing fluorescence imaging once, or may be obtained by performing fluorescence imaging a plurality of times.
  • if both the first phosphor 202a and the second phosphor 202b can be appropriately excited by the excitation light emitted at once from the light irradiation section 24 (especially the excitation light irradiation section 24b) of the imaging unit 11, the fluorescence image 101 can be obtained by a single fluorescence photography.
  • the depth position information acquisition unit 42 analyzes the visible light image 100 to identify the type of the living tissue 200 (S12).
  • the depth position information acquisition unit 42 acquires the line spread function corresponding to the type of the living tissue 200 (S13).
  • the depth position information acquisition unit 42 of the present embodiment acquires a line spread function for the first phosphor 202a and a line spread function for the second phosphor 202b.
  • the depth position information acquisition unit 42 analyzes the fluorescence image 101 and acquires line spread information regarding luminance for each of the first phosphor 202a and the second phosphor 202b (S14).
  • the depth position information acquisition unit 42 compares the line spread information of each of the first phosphor 202a and the second phosphor 202b with the corresponding line spread function, thereby deriving the depth positions of the first phosphor 202a and the second phosphor 202b (S15).
  • the depth position information acquiring unit 42 of the present embodiment acquires depth position information of each of the plurality of phosphors 202 (the first phosphor 202a and the second phosphor 202b) based on the fluorescence image 101.
  • the image quality adjustment unit 43 performs, on each of the first phosphor 202a and the second phosphor 202b, sharpening processing optimized based on the depth position of the respective phosphor (S16).
  • processing for adjusting the brightness between the first phosphor 202a and the second phosphor 202b in the fluorescence image 101 is performed on the fluorescence image 101 (S17).
  • the brightness of the first phosphor 202a and the second phosphor 202b can be appropriately adjusted.
  • This brightness adjustment processing may be performed by the image quality adjustment unit 43 or may be performed by the observation image generation unit 44 .
  • the image quality adjustment section 43 substantially functions as the observation image generation section 44 as well.
  • the observed image 103 is generated (S18). That is, the observation image generator 44 generates the observation image 103 from the visible light image 100 and the fluorescence image 101 (especially the fluorescence image 101 after the sharpening process).
  • a specific method for generating the observed image 103 is not particularly limited.
  • the observation image generation unit 44 may generate the observation image 103 by superimposing, on the visible light image 100, the portions of the fluorescence image 101 corresponding to the plurality of phosphors 202, after the relative brightness adjustment among the plurality of phosphors 202 has been performed.
  • the observation image generation unit 44 may generate the observation image 103 by performing processing for emphasizing the corresponding portion of the range of the phosphor 202 in the visible light image 100 .
  • FIG. 16 is a diagram showing an example of an observation image 103 according to the second embodiment.
  • FIG. 17 is a diagram showing another example of the observation image 103 according to the second embodiment.
  • the biological tissue 200 to be observed contains three phosphors 202 (first phosphor 202a, second phosphor 202b, and third phosphor 202c).
  • the observation image generation unit 44 generates the observation image 103 by emphasizing the portions corresponding to the ranges of the phosphors 202a, 202b, and 202c in the visible light image 100, as in the example described above (see FIG. 16).
  • the shading or color of the portion of the observation image 103 corresponding to the range of the phosphors 202 is adjusted according to the depth position information of the corresponding phosphors 202 .
  • the overlapping phosphors 202b and 202c can be distinguished and visually recognized.
  • the observation image generation unit 44 can generate an observation image 103 showing a state in which the phosphors 202a, 202b, and 202c are projected onto the XZ plane (see FIG. 17). For example, based on the derived depth positions of the phosphors 202a, 202b, and 202c, the observation image 103 shown in FIG. 17 can be generated by projecting the phosphors 202a, 202b, and 202c onto the XZ plane passing through the observation reference line 203. If the thickness of each of the phosphors 202a, 202b, and 202c in the Z-axis direction is unknown, the thickness of each phosphor may be provisionally determined according to its length (L).
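The XZ-plane projection in FIG. 17 can be sketched as rasterizing each phosphor's x-range at its derived depth, guessing the thickness from the length L when it is unknown, as the text suggests. The dictionary keys, pixel scale, and thickness heuristic here are invented for illustration.

```python
import numpy as np

def project_to_xz(phosphors, width_px=100, depth_px=50, mm_per_px=0.5):
    """Render a simple XZ cross-section: each phosphor becomes a block
    spanning its x-range at its derived depth position.

    `phosphors` is a list of dicts with hypothetical keys
    x0_mm, x1_mm, depth_mm, and id (an integer display label)."""
    img = np.zeros((depth_px, width_px), dtype=int)
    for p in phosphors:
        length_mm = p["x1_mm"] - p["x0_mm"]
        # Thickness in Z is unknown, so guess it from the length L.
        thickness_mm = p.get("thickness_mm", 0.25 * length_mm)
        x0 = int(round(p["x0_mm"] / mm_per_px))
        x1 = int(round(p["x1_mm"] / mm_per_px))
        z0 = int(round(p["depth_mm"] / mm_per_px))
        z1 = min(depth_px, z0 + max(1, int(round(thickness_mm / mm_per_px))))
        img[z0:z1, x0:x1] = p["id"]   # later entries are drawn on top
    return img
```

The resulting label grid can then be colored or shaded per phosphor for display.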
  • the relative depth positional relationship of the phosphors 202a, 202b, and 202c can be intuitively grasped from the observation image 103.
  • each of the phosphors 202a, 202b, and 202c is displayed with shading according to the depth position.
  • a color-coded display may be performed.
  • when a plurality of phosphors 202 are contained in the biological tissue 200, the observer can grasp the relative depth positional relationship between the plurality of phosphors 202 from the observation image 103.
  • an observer such as an operator can easily and accurately grasp the relative positional relationship between the phosphors 202 in the living tissue 200 from the observed image 103 .
  • the operator can perform surgery while grasping the relative positional relationship between the phosphors 202, thereby improving the accuracy and stability of the procedure.
  • in the third embodiment, the relative depth relationship between the phosphors 202 is derived without directly deriving the depth position of each phosphor 202. That is, in the above-described first and second embodiments, the absolute value of the depth position of each phosphor 202 is derived, whereas in the third embodiment, the relative depth positional relationship between the phosphors 202 is derived.
  • it is assumed that, among the parameters of the above-described spread function (see Equations 1 to 4 above), corresponding values can be obtained for the equivalent scattering coefficient (μs′) and the length (L) of the phosphor 202, but not for the absorption coefficient (μa).
  • the processing storage unit 45 stores data of the equivalent scattering coefficient (μs′) but does not store the absorption coefficient (μa).
  • the depth position information acquisition unit 42 analyzes the fluorescence image 101 to obtain the spread information of each of the plurality of phosphors 202 in the fluorescence image 101, and acquires the length "L" of each phosphor 202.
  • the depth position information acquisition unit 42 analyzes the visible light image 100 to acquire the type of the living tissue 200, and acquires the equivalent scattering coefficient (μs′) for each phosphor 202 from the processing storage unit 45 based on the type of the living tissue 200 and the fluorescence wavelength of each phosphor 202.
  • for each phosphor 202, a relational expression based on the spread information and the spread function, including the depth position (d) and the absorption coefficient (μa) of the phosphor 202 as unknown parameters, can be obtained. When the living tissue 200 includes two phosphors 202, two such relational expressions based on spread information and the spread function are obtained.
  • Each of the plurality of relational expressions obtained in this manner represents the relationship between the depth position of the corresponding phosphor 202 and its degree of blur, regardless of the fluorescence wavelength of the corresponding phosphor 202.
  • the value of the absorption coefficient (μa) is determined according to the type of living tissue 200. Therefore, the absorption coefficients (μa) of the plurality of phosphors 202 contained in the same living tissue 200 show the same value.
  • the substantially unknown parameters are three: the depth position (d) of the first phosphor 202a, the depth position (d) of the second phosphor 202b, and the common absorption coefficient (μa). From the two relational expressions involving these three unknown parameters, the relative relationship between the depth position of the first phosphor 202a and the depth position of the second phosphor 202b can be derived.
  • the depth position information acquisition unit 42 of the present embodiment acquires relative depth position information indicating the relative relationship between the depth positions of the plurality of phosphors 202, based on the plurality of relational expressions obtained for each phosphor 202 and the spread information of each phosphor 202.
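The idea that two relational expressions in three unknowns (the depth of each phosphor plus a common μa) still pin down the relative depth ordering can be illustrated numerically: for every candidate μa, invert each phosphor's measured blur to a depth and check that the ordering is the same throughout. The blur model below is a hypothetical surrogate for Equations 1 to 4, not the patent's actual spread function, and the coefficient ranges are invented.

```python
import numpy as np

def blur_sigma(depth_mm, mu_a, mu_s_prime):
    # Hypothetical surrogate for how blur width grows with depth
    # (stand-in for the source's Equations 1 to 4).
    return (0.3 + 0.1 * mu_s_prime * depth_mm) * (1.0 + 0.5 * mu_a)

def depth_for_sigma(sigma, mu_a, mu_s_prime):
    # Invert blur_sigma for depth at a fixed candidate mu_a.
    return (sigma / (1.0 + 0.5 * mu_a) - 0.3) / (0.1 * mu_s_prime)

def relative_depth(sigma1, mu_s1, sigma2, mu_s2,
                   mu_a_candidates=np.linspace(0.01, 0.5, 50)):
    """Return +1 if phosphor 1 is deeper than phosphor 2, -1 if shallower,
    0 if the ordering is ambiguous over the candidate mu_a range."""
    signs = set()
    for mu_a in mu_a_candidates:
        d1 = depth_for_sigma(sigma1, mu_a, mu_s1)
        d2 = depth_for_sigma(sigma2, mu_a, mu_s2)
        signs.add(1 if d1 > d2 else -1)
    return signs.pop() if len(signs) == 1 else 0
```

Even though the absolute depths d1 and d2 shift with the unknown μa, their ordering stays fixed, which is exactly the relative depth position information described above.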
  • FIG. 18 is a flowchart showing an example of a medical image processing method according to the third embodiment. A case will be described below in which the living tissue 200 includes two phosphors 202 (a first phosphor 202a and a second phosphor 202b).
  • the image acquisition unit 41 of the medical image processing apparatus 12 acquires the visible light image 100 and the fluorescence image 101 of the biological tissue 200 from the imaging unit 11 (S21 in FIG. 18).
  • the depth position information acquisition unit 42 analyzes the visible light image 100 to identify the type of the living tissue 200 (S22).
  • the depth position information acquisition unit 42 acquires a line spread function corresponding to the type of living tissue 200 for each of the first phosphor 202a and the second phosphor 202b (S23).
  • the depth position information acquisition unit 42 analyzes the fluorescence image 101 to acquire line spread information regarding luminance for each of the first phosphor 202a and the second phosphor 202b (S24).
  • the depth position information acquisition unit 42 compares the line spread information of each of the first phosphor 202a and the second phosphor 202b with the corresponding line spread function, thereby deriving the depth positions of the first phosphor 202a and the second phosphor 202b (S25).
  • the depth position information acquisition unit 42 acquires depth position information of each of the first phosphor 202 a and the second phosphor 202 b based on the fluorescence image 101 .
  • the depth position information acquisition unit 42 can derive the relative depth position information between the first phosphor 202a and the second phosphor 202b by comparing the depth position information of the first phosphor 202a with that of the second phosphor 202b.
  • the image quality adjustment unit 43 of the present embodiment performs processing on the fluorescence image 101 to adjust the brightness between the first phosphor 202a and the second phosphor 202b in the fluorescence image 101 (S26).
  • observation image generation unit 44 generates an observation image 103 from the visible light image 100 and the fluorescence image 101 (especially the fluorescence image 101 after brightness adjustment) (S27).
  • the observation image generation unit 44 can generate the observation image 103 by superimposing, on the visible light image 100, the portions of the fluorescence image 101 corresponding to the plurality of phosphors 202, after the relative brightness adjustment among the plurality of phosphors 202 has been performed.
  • the observation image generation unit 44 can generate the observation image 103 by acquiring the range of each phosphor 202 from the fluorescence image 101 and emphasizing the portion in the visible light image 100 corresponding to the range of each phosphor 202.
  • FIG. 19 is a diagram showing an example of an observation image 103 according to the third embodiment.
  • the filling colors of the first phosphor 202a and the second phosphor 202b change according to the relative depth positions of the first phosphor 202a and the second phosphor 202b.
  • the display density of each phosphor 202 in the observation image 103 represents the relative depth position, but the relative depth positions between the phosphors 202 may be represented in the observed image 103 in any other form.
  • when a plurality of phosphors 202 are contained in the biological tissue 200, the observer can grasp the relative depth positional relationship between the plurality of phosphors 202 from the observation image 103.
  • a configuration example of the microscope system of the present disclosure is shown in FIG.
  • a microscope system 5000 shown in FIG. 20 includes a microscope device 5100 , a control section 5110 and an information processing section 5120 .
  • a microscope device 5100 includes a light irradiation section 5101 , an optical section 5102 , and a signal acquisition section 5103 .
  • the microscope device 5100 may further include a sample placement section 5104 on which the biological sample S is placed.
  • the configuration of the microscope apparatus is not limited to that shown in FIG. 20.
  • the light irradiation unit 5101 may exist outside the microscope apparatus 5100; for example, an external light source may be used as the light irradiation unit 5101.
  • the light irradiation section 5101 may be arranged such that the sample mounting section 5104 is sandwiched between the light irradiation section 5101 and the optical section 5102, or may be arranged on the side where the optical section 5102 exists.
  • the microscope apparatus 5100 may be configured for one or more of bright field observation, phase contrast observation, differential interference observation, polarization observation, fluorescence observation, and dark field observation.
  • the microscope system 5000 may be configured as a so-called WSI (Whole Slide Imaging) system or digital pathology system, and can be used for pathological diagnosis.
  • Microscope system 5000 may also be configured as a fluorescence imaging system, in particular a multiplex fluorescence imaging system.
  • the microscope system 5000 may be used to perform intraoperative pathological diagnosis or remote pathological diagnosis.
  • the microscope device 5100 can acquire data of the biological sample S obtained from the subject of the surgery and send the data to the information processing unit 5120.
  • the microscope device 5100 can transmit the acquired data of the biological sample S to the information processing device 5120 located in a place (another room, building, or the like) away from the microscope device 5100 .
  • the information processing device 5120 receives and outputs the data.
  • a user of the information processing device 5120 can make a pathological diagnosis based on the output data.
  • the biological sample S may be a sample containing a biological component.
  • the biological components may be tissues, cells, liquid components of a living body (blood, urine, etc.), cultures, or living cells (cardiomyocytes, nerve cells, fertilized eggs, etc.).
  • the biological sample may be a solid, a specimen fixed with a fixative such as paraffin, or a solid formed by freezing.
  • the biological sample can be a section of the solid.
  • a specific example of the biological sample is a section of a biopsy sample.
  • the biological sample may be one that has undergone processing such as staining or labeling.
  • the treatment may be staining for indicating the morphology of biological components or for indicating substances (surface antigens, etc.) possessed by biological components; examples include HE (Hematoxylin-Eosin) staining and immunohistochemistry staining.
  • the biological sample may be treated with one or more reagents, and the reagents may be fluorescent dyes, chromogenic reagents, fluorescent proteins, or fluorescently labeled antibodies.
  • the specimen may be one prepared for the purpose of pathological diagnosis or clinical examination from a specimen or tissue sample collected from the human body. Moreover, the specimen is not limited to the human body, and may be derived from animals, plants, or other materials.
  • the properties of the specimen may differ depending on the type of tissue used (such as an organ or cell), the type of target disease, the subject's attributes (such as age, sex, blood type, or race), or the subject's lifestyle habits (such as eating habits, exercise habits, or smoking habits).
  • the specimens may be managed with identification information (bar code information, QR code (trademark) information, etc.) that allows each specimen to be identified.
  • the light irradiation unit 5101 includes a light source for illuminating the biological sample S and an optical section for guiding the light emitted from the light source to the specimen.
  • the light source may irradiate the biological sample with visible light, ultraviolet light, or infrared light, or a combination thereof.
  • the light source may be one or more of halogen lamps, laser light sources, LED lamps, mercury lamps, and xenon lamps. A plurality of types and/or wavelengths of light sources may be used in fluorescence observation, and may be appropriately selected by those skilled in the art.
  • the light irradiator may have a transmissive, reflective, or episcopic (coaxial or lateral) configuration.
  • the optical section 5102 is configured to guide the light from the biological sample S to the signal acquisition section 5103 .
  • the optical section can be configured to allow the microscope device 5100 to observe or image the biological sample S.
  • Optical section 5102 may include an objective lens.
  • the type of objective lens may be appropriately selected by those skilled in the art according to the observation method.
  • the optical section may include a relay lens for relaying the image magnified by the objective lens to the signal acquisition section.
  • the optical unit may further include optical components other than the objective lens and the relay lens, such as an eyepiece lens, a phase plate, and a condenser lens.
  • the optical section 5102 may further include a wavelength separation section configured to separate light having a predetermined wavelength from the light from the biological sample S.
  • the wavelength separation section can be configured to selectively allow light of a predetermined wavelength or wavelength range to reach the signal acquisition section.
  • the wavelength separator may include, for example, one or more of a filter that selectively transmits light, a polarizing plate, a prism (Wollaston prism), and a diffraction grating.
  • the optical components included in the wavelength separation section may be arranged, for example, on the optical path from the objective lens to the signal acquisition section.
  • the wavelength separation unit is provided in the microscope apparatus when fluorescence observation is performed, particularly when an excitation light irradiation unit is included.
  • the wavelength separator may be configured to separate fluorescent lights from each other, or to separate white light from fluorescent light.
  • the signal acquisition unit 5103 can be configured to receive light from the biological sample S and convert the light into an electrical signal, particularly a digital electrical signal.
  • the signal acquisition unit may be configured to acquire data on the biological sample S based on the electrical signal.
  • the signal acquisition unit may be configured to acquire image data of the biological sample S (in particular a still image, a time-lapse image, or a moving image), and in particular data of an image magnified by the optical unit.
  • the signal acquisition unit includes one or more imaging elements, such as CMOS or CCD, having a plurality of pixels arranged one-dimensionally or two-dimensionally.
  • the signal acquisition unit may include an image sensor for acquiring a low-resolution image and an image sensor for acquiring a high-resolution image, or an image sensor for sensing (for example, for AF) and an image sensor for outputting images for observation. In addition to the plurality of pixels, the image sensor may include a signal processing section (including one, two, or three of a CPU, a DSP, and a memory) that performs signal processing using the pixel signals from each pixel, and an output control section that controls the output of the image data generated from the pixel signals and of the processed data generated by the signal processing section. Furthermore, the image sensor may include an asynchronous event detection sensor that detects, as an event, a change exceeding a predetermined threshold in the brightness of a pixel that photoelectrically converts incident light. An image sensor including the plurality of pixels, the signal processing section, and the output control section may preferably be configured as a one-chip semiconductor device.
  • the control unit 5110 controls imaging by the microscope device 5100 .
  • the control unit can drive the movement of the optical unit 5102 and/or the sample placement unit 5104 to adjust the positional relationship between the optical unit and the sample placement unit.
  • the control unit 5110 can move the optical unit and/or the sample placement unit in a direction toward or away from each other (for example, the optical axis direction of the objective lens). Further, the control section may move the optical section and/or the sample placement section in any direction on a plane perpendicular to the optical axis direction.
  • the control unit may control the light irradiation unit 5101 and/or the signal acquisition unit 5103 for imaging control.
  • the sample mounting section 5104 may be configured such that the position of the biological sample on the sample mounting section can be fixed, and may be a so-called stage.
  • the sample mounting section 5104 can be configured to move the position of the biological sample in the optical axis direction of the objective lens and/or in a direction perpendicular to the optical axis direction.
  • the information processing section 5120 can acquire data (such as imaging data) acquired by the microscope device 5100 from the microscope device 5100 .
  • the information processing section can perform image processing on the imaging data.
  • the image processing may include color separation processing.
  • the color separation process can include a process of extracting data of a light component of a predetermined wavelength or wavelength range from the captured data to generate image data, a process of removing data of a light component of a predetermined wavelength or wavelength range from the captured data, and the like.
  • the image processing may include autofluorescence separation processing for separating the autofluorescence component and dye component of the tissue section, and fluorescence separation processing for separating the wavelengths between dyes having different fluorescence wavelengths.
  • autofluorescence signals extracted from one specimen may be used to remove the autofluorescence component from the image information of another specimen.
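The color separation and autofluorescence separation described above can be sketched as linear spectral unmixing: each pixel's measured spectrum is modeled as a mixture of known reference spectra, per-component abundances are recovered by least squares, and the autofluorescence component is then subtracted. This is a minimal illustration, not the disclosed implementation; the reference spectra below are invented, and a real system would measure them for the reagents and tissue in use.

```python
import numpy as np

# Hypothetical reference spectra (columns): dye A, dye B, tissue autofluorescence,
# sampled at 4 detection wavelength bands. Invented values for illustration.
M = np.array([
    [0.9, 0.1, 0.3],
    [0.4, 0.3, 0.3],
    [0.1, 0.8, 0.2],
    [0.0, 0.2, 0.2],
])

def unmix(pixel_spectra):
    """Least-squares unmixing: recover per-pixel abundances of each component.

    pixel_spectra: (n_pixels, n_bands) array of measured intensities.
    Returns an (n_pixels, n_components) array of abundances.
    """
    coeffs, *_ = np.linalg.lstsq(M, pixel_spectra.T, rcond=None)
    return coeffs.T

# Synthesize one pixel: 2 units of dye A, 1 of dye B, 0.5 autofluorescence.
true_abund = np.array([2.0, 1.0, 0.5])
pixel = M @ true_abund
abund = unmix(pixel[None, :])[0]

# Autofluorescence separation: subtract the estimated autofluorescence
# contribution from the measured spectrum.
pixel_clean = pixel - M[:, 2] * abund[2]
```

With noise-free synthetic data and linearly independent reference spectra, the least-squares solve recovers the abundances exactly; real data would add non-negativity constraints and noise handling.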
  • the information processing section 5120 may transmit data for imaging control to the control section 5110, and the control section 5110 receiving the data may control imaging by the microscope apparatus 5100 according to the data.
  • the information processing section 5120 may be configured as an information processing device such as a general-purpose computer, and may include a CPU, RAM, and ROM.
  • the information processing section may be included in the housing of the microscope device 5100 or may be outside the housing. Also, various processes or functions by the information processing unit may be realized by a server computer or cloud connected via a network.
  • a method of imaging the biological sample S by the microscope device 5100 may be appropriately selected by a person skilled in the art according to the type of the biological sample and the purpose of imaging. An example of the imaging method will be described below.
  • the microscope device can first identify an imaging target region.
  • the imaging target region may be specified so as to cover the entire region where the biological sample exists, or so as to cover a target portion of the biological sample (a target tissue section, a target cell, or a target lesion portion).
  • the microscope device divides the imaging target region into a plurality of divided regions of a predetermined size and sequentially images each divided region, thereby obtaining an image of each divided region.
  • the microscope device identifies an imaging target region R that covers the entire biological sample S.
  • the microscope device divides the imaging target region R into 16 divided regions.
  • the microscope device can then image the segmented region R1, and then image any region included in the imaging target region R, such as a region adjacent to the segmented region R1. Then, image capturing of the divided areas is performed until there are no unimaged divided areas. Areas other than the imaging target area R may also be imaged based on the captured image information of the divided areas. After imaging a certain divided area, the positional relationship between the microscope device and the sample mounting section is adjusted in order to image the next divided area. The adjustment may be performed by moving the microscope device, moving the sample placement unit, or moving both of them.
  • the image capturing device that captures each divided area may be a two-dimensional image sensor (area sensor) or a one-dimensional image sensor (line sensor).
  • the signal acquisition section may capture an image of each divided area via the optical section. The imaging of each divided area may be performed continuously while moving the microscope device and/or the sample mounting section, or the movement of the microscope device and/or the sample mounting section may be stopped while each divided area is imaged.
  • the imaging target area may be divided so that the divided areas partially overlap each other, or the imaging target area may be divided so that the divided areas do not overlap.
  • Each divided area may be imaged multiple times by changing imaging conditions such as focal length and/or exposure time.
  • the information processing device can generate image data of a wider area by synthesizing a plurality of adjacent divided areas. By performing the synthesizing process over the entire imaging target area, it is possible to obtain an image of a wider area of the imaging target area. Also, image data with lower resolution can be generated from the image of the divided area or the image subjected to the synthesis processing.
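The tile synthesis described above, which merges adjacent divided areas into a wider image and derives lower-resolution data from it, can be sketched as below. The simple average blend in overlaps and block-average downscaling are illustrative stand-ins for whatever stitching and pyramid generation a real whole-slide scanner uses; tile offsets are assumed known.

```python
import numpy as np

def stitch(tiles, positions, canvas_shape):
    """Blend tiles into one mosaic, averaging wherever tiles overlap.

    tiles: list of 2D arrays; positions: list of (row, col) top-left offsets.
    """
    acc = np.zeros(canvas_shape, dtype=float)
    weight = np.zeros(canvas_shape, dtype=float)
    for tile, (r, c) in zip(tiles, positions):
        h, w = tile.shape
        acc[r:r + h, c:c + w] += tile
        weight[r:r + h, c:c + w] += 1.0
    return acc / np.maximum(weight, 1.0)  # avoid division by zero off-tile

def downscale(img, factor):
    """Generate lower-resolution image data by block averaging."""
    h, w = img.shape
    return img[:h - h % factor, :w - w % factor] \
        .reshape(h // factor, factor, w // factor, factor).mean(axis=(1, 3))

# Two 4x6 tiles overlapping by 2 columns on a 4x10 canvas.
left = np.full((4, 6), 10.0)
right = np.full((4, 6), 20.0)
mosaic = stitch([left, right], [(0, 0), (0, 4)], (4, 10))
small = downscale(mosaic, 2)
```

The overlap columns average to the mean of the two tiles, which is why dividing the imaging target area into partially overlapping divided areas makes seams less visible.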
  • the microscope device can first identify an imaging target region.
  • the imaging target region may be specified so as to cover the entire region where the biological sample exists, or so as to cover a target portion of the biological sample (a target tissue section or a target cell-containing portion).
  • the microscope device captures an image by scanning a part of the imaging target area (also referred to as a "divided scan area") in one direction (also referred to as the "scanning direction") within a plane perpendicular to the optical axis.
  • after the scanning of one divided scan area is completed, the next divided scan area adjacent to it is scanned; these scanning operations are repeated until the entire imaging target area has been imaged.
  • as shown in the figure, the microscope device identifies a region (gray portion) in which the tissue section exists in the biological sample S as an imaging target area Sa. The microscope device then scans the divided scan area Rs in the imaging target area Sa in the Y-axis direction. After completing the scanning of the divided scan area Rs, it scans the next divided scan area in the X-axis direction. This operation is repeated until scanning is completed for the entire imaging target area Sa.
  • the positional relationship between the microscope device and the sample placement section is adjusted for scanning each divided scan area and for imaging the next divided scan area after imaging a certain divided scan area. The adjustment may be performed by moving the microscope device, moving the sample placement unit, or moving both of them.
  • the imaging device that captures each divided scan area may be a one-dimensional imaging device (line sensor) or a two-dimensional imaging device (area sensor).
  • the signal acquisition section may capture an image of each divided area via an enlarging optical system.
  • the imaging of each divided scan area may be performed continuously while moving the microscope device and/or the sample placement unit.
  • the imaging target area may be divided so that the divided scan areas partially overlap each other, or the imaging target area may be divided so that the divided scan areas do not overlap.
  • Each divided scan area may be imaged multiple times by changing imaging conditions such as focal length and/or exposure time.
  • the information processing apparatus can combine a plurality of adjacent divided scan areas to generate image data of a wider area. By performing the synthesizing process over the entire imaging target area, it is possible to obtain an image of a wider area of the imaging target area. Further, image data with lower resolution can be generated from the image of the divided scan area or the image subjected to the synthesis processing.
  • the technical categories that embody the above technical ideas are not limited.
  • the above technical ideas may be embodied by a computer program for causing a computer to execute one or more procedures (steps) included in the method of manufacturing or using the above apparatus.
  • the above technical idea may be embodied by a computer-readable non-transitory recording medium in which such a computer program is recorded.
  • the present disclosure can also take the following configuration.
  • the image acquiring unit acquires a visible light image obtained by photographing the living tissue while irradiating the living tissue with visible light,
  • the depth position information acquisition unit estimates the type of the biological tissue by analyzing the visible light image and acquires the spread function corresponding to the estimated type of the biological tissue (the medical image processing apparatus according to item 1).
  • the spread information is the luminance distribution of the phosphor in the fluorescence image, and the spread function is a line spread function based on the luminance distribution of the phosphor and the depth position information.
  • the spread function includes, as a parameter, a scattering coefficient determined according to the fluorescence wavelength of the phosphor; the depth position information acquisition unit acquires the scattering coefficient corresponding to the fluorescence wavelength and acquires the depth position information based on the spread information and the spread function in which the scattering coefficient is reflected (the medical image processing apparatus according to any one of items 1 to 3).
  • the biological tissue contains a plurality of fluorescent substances having mutually different fluorescent wavelengths
  • the image acquisition unit acquires the fluorescence image obtained by photographing the biological tissue while irradiating it with the excitation light of each of the plurality of phosphors,
  • the medical image processing apparatus according to any one of items 1 to 4, wherein the depth position information acquiring unit acquires the depth position information of each of the plurality of phosphors based on the fluorescence image.
  • the depth position information acquisition unit acquires, based on the depth position information of each of the plurality of phosphors, relative depth position information indicating the relative relationship of the depth positions among the plurality of phosphors (the medical image processing apparatus according to item 5).
  • An observation image generation unit that generates an observation image
  • the image acquiring unit acquires a visible light image obtained by photographing the living tissue while irradiating the living tissue with visible light
  • the medical image processing apparatus according to item 7, wherein, in the observation image, a portion corresponding to the phosphor of the fluorescence image after undergoing the sharpening process is superimposed on the visible light image.
  • the biological tissue contains a plurality of fluorescent substances having mutually different fluorescent wavelengths
  • the image acquisition unit acquires the fluorescence image obtained by photographing the biological tissue while irradiating the biological tissue with the excitation light of each of the plurality of fluorescent substances
  • the depth position information acquisition unit acquires the depth position information of each of the plurality of phosphors based on the fluorescence image
  • the observation image generation unit adjusts the relative brightness among the plurality of phosphors in the portions of the fluorescence image corresponding to the plurality of phosphors, and generates the observation image by superimposing, on the visible light image, the portions of the fluorescence image corresponding to the plurality of phosphors after the relative brightness adjustment.
  • An observation image generation unit that generates an observation image
  • the image acquiring unit acquires a visible light image obtained by photographing the living tissue while irradiating the living tissue with visible light,
  • the observation image generation unit analyzes the fluorescence image after the sharpening process to specify the range of the phosphor in the living tissue, and generates the observation image in which the portion of the visible light image corresponding to the range of the phosphor is emphasized (the medical image processing apparatus according to item 7).
  • An imaging unit that obtains a fluorescence image by imaging a biological tissue containing a fluorescent substance while irradiating the biological tissue with excitation light; and a medical image processing device that analyzes the fluorescence image
  • a medical observation system in which the medical image processing device includes an image acquisition unit that acquires the fluorescence image and a depth position information acquisition unit that acquires depth position information related to the depth position of the phosphor based on the fluorescence image, the depth position information acquisition unit analyzing the fluorescence image to acquire spread information indicating the image intensity distribution of the phosphor in the fluorescence image and acquiring the depth position information by comparing the spread information against the spread function representing the image intensity distribution in the living tissue.
  • a display device for displaying an observation image in which the location of the phosphor in the biological tissue is visibly displayed; the imaging unit acquires a visible light image by imaging the living tissue while irradiating it with visible light.
  • [Item 15] A medical image processing method comprising: acquiring a fluorescence image obtained by photographing a living tissue containing a phosphor while irradiating the living tissue with excitation light; analyzing the fluorescence image to acquire spread information indicating the image intensity distribution of the phosphor in the fluorescence image; and acquiring depth position information related to the depth position of the phosphor by comparing the spread information against the spread function representing the image intensity distribution in the living tissue.
  • Imaging unit 12 medical image processing device 13 output unit 21 camera controller 22 camera storage unit 23 imaging unit 24 light irradiation unit 25 sample support unit 31 output controller 32 output storage unit 33 display device 40 image processing controller 41 Image acquisition unit 42 Depth position information acquisition unit 43 Image quality adjustment unit 44 Observation image generation unit 45 Processing storage unit 100 Visible light image 101 Fluorescence image 102 Sharp fluorescence image 103 Observation image 200 Living tissue 200a Tissue surface 201 Blood vessel 202 Phosphor 203 Observation reference line

Abstract

[Problem] To provide a technique advantageous for acquiring information based on the depth position of an observation target site of a biological tissue. [Solution] This medical image processing device is provided with: an image acquisition unit for acquiring a fluorescence image obtained by imaging a biological tissue including a phosphor while irradiating the biological tissue with excitation light; and a depth position information acquisition unit for acquiring depth position information relating to the depth position of the phosphor on the basis of the fluorescence image. The depth position information acquisition unit acquires spread information indicating the image intensity distribution of the phosphor in the fluorescence image by analyzing the fluorescence image, and acquires the depth position information by collating the spread information with a spread function representing the image intensity distribution in the biological tissue.

Description

MEDICAL IMAGE PROCESSING APPARATUS, MEDICAL IMAGE PROCESSING METHOD, AND PROGRAM
The present disclosure relates to a medical image processing apparatus, a medical image processing method, and a program.
In the technical field of biological observation, living tissue is sometimes labeled with a fluorescent reagent and a fluorescence image of the tissue is observed. By using a fluorescent reagent, biological tissues such as blood vessels, blood flow, lymphatic flow, and tumors, which are difficult to see or photograph with the naked eye under white light, can be clearly visualized. An operator can perform accurate surgery (that is, fluorescence-guided surgery) by carrying out the procedure while checking the fluorescence image together with a normal observation image obtained under white light.
On the other hand, fluorescence images are blurred by fluorescence scattering in living tissue. In particular, the degree of blurring tends to increase as the distance from the tissue surface to the observation target (phosphor), such as a blood vessel, increases. It may therefore be difficult to clearly identify the boundaries of the phosphor in the fluorescence image. In addition, it is not easy to accurately and visually judge, from the fluorescence image, the depth position of the phosphor in the living tissue.
Patent Document 1 discloses an apparatus that determines the depth position of a blood vessel using a plurality of spectroscopic images with different wavelength ranges, and applies blood vessel enhancement processing according to that depth position to the fluorescence image of the blood vessel.
JP 2010-51350 A
The apparatus of Patent Document 1 acquires a spectroscopic image based on a normal observation image captured while irradiating a blood vessel with white light, and determines the depth position of the blood vessel based on the spectroscopic image.
Consequently, the objects, such as blood vessels, whose depth position the apparatus of Patent Document 1 can determine are limited to objects that can appear in the normal observation image. That is, the apparatus of Patent Document 1 cannot determine the depth position of an object that does not appear in the normal observation image. It therefore cannot determine the depth position of an object that appears in the fluorescence image but not in the normal observation image.
For example, the apparatus of Patent Document 1 therefore cannot determine the depth position of deep tissue that is far from the surface of the living tissue.
The present disclosure provides a technique that is advantageous for acquiring information based on the depth position of an observation target site in living tissue.
One aspect of the present disclosure relates to a medical image processing apparatus including: an image acquisition unit that acquires a fluorescence image obtained by photographing a living tissue containing a phosphor while irradiating the living tissue with excitation light; and a depth position information acquisition unit that acquires depth position information related to the depth position of the phosphor based on the fluorescence image. The depth position information acquisition unit analyzes the fluorescence image to acquire spread information indicating the image intensity distribution of the phosphor in the fluorescence image, and acquires the depth position information by comparing the spread information against a spread function representing the image intensity distribution in the living tissue.
The image acquisition unit may acquire a visible light image obtained by photographing the living tissue while irradiating it with visible light, and the depth position information acquisition unit may estimate the type of the living tissue by analyzing the visible light image and acquire a spread function corresponding to the estimated type.
The spread information may be the luminance distribution of the phosphor in the fluorescence image, and the spread function may be a line spread function based on the luminance distribution of the phosphor and the depth position information.
The spread function may include, as a parameter, a scattering coefficient determined according to the fluorescence wavelength of the phosphor. The depth position information acquisition unit may acquire the scattering coefficient corresponding to the fluorescence wavelength, and acquire the depth position information based on the spread information and the spread function in which the scattering coefficient is reflected.
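As a rough illustration of how a spread function with a wavelength-dependent scattering coefficient can be inverted to a depth position, the sketch below models the line spread function (LSF) as a Gaussian whose width grows linearly with depth. The linear model, the scattering coefficients, and the constant `k` are assumptions made for illustration, not values from the disclosure; a real system would calibrate the spread function per tissue type and wavelength.

```python
import numpy as np

# Assumed scattering coefficients per fluorescence wavelength [1/mm]
# (invented values for illustration).
MU_S = {820: 0.8, 700: 1.2}

def lsf_sigma(depth_mm, wavelength_nm, k=0.5):
    """Modeled Gaussian LSF width for a phosphor at a given depth."""
    return k * MU_S[wavelength_nm] * depth_mm

def measure_sigma(profile, x):
    """Estimate the Gaussian width of a 1-D luminance profile (2nd moment)."""
    p = profile / profile.sum()
    mean = (x * p).sum()
    return np.sqrt(((x - mean) ** 2 * p).sum())

def estimate_depth(profile, x, wavelength_nm, k=0.5):
    """Invert the spread model: compare the measured spread information
    against the spread function to recover the depth position."""
    return measure_sigma(profile, x) / (k * MU_S[wavelength_nm])

# Simulate the blurred luminance profile of a phosphor 3 mm deep at 820 nm.
x = np.linspace(-10, 10, 2001)          # position across the line [mm]
sigma_true = lsf_sigma(3.0, 820)
profile = np.exp(-x**2 / (2 * sigma_true**2))
depth = estimate_depth(profile, x, 820)
```

Because the model is monotone in depth, matching the measured width to the spread function yields a unique depth estimate; with a calibrated, possibly nonlinear spread function the inversion would instead be a table lookup or a 1-D fit.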
The living tissue may contain a plurality of phosphors having mutually different fluorescence wavelengths. The image acquisition unit may acquire a fluorescence image obtained by photographing the living tissue while irradiating it with the excitation light of each of the plurality of phosphors, and the depth position information acquisition unit may acquire the depth position information of each of the plurality of phosphors based on the fluorescence image.
The depth position information acquisition unit may acquire, based on the depth position information of each of the plurality of phosphors, relative depth position information indicating the relative relationship of the depth positions among the plurality of phosphors.
The medical image processing apparatus may include an image quality adjustment unit that performs a sharpening process on the fluorescence image according to the depth position information.
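One minimal form such a depth-dependent sharpening process could take is unsharp masking, where a phosphor judged to be deeper (and hence more strongly blurred by scattering) is corrected with a larger blur radius and gain. The 1-D sketch below is an illustrative stand-in; the disclosure does not specify the sharpening algorithm, and a real implementation might use deconvolution against the depth-dependent spread function instead.

```python
import numpy as np

def gaussian_kernel(sigma, radius=None):
    """Normalized 1-D Gaussian kernel."""
    radius = radius or int(3 * sigma)
    x = np.arange(-radius, radius + 1)
    k = np.exp(-x**2 / (2 * sigma**2))
    return k / k.sum()

def unsharp_1d(signal, sigma, amount):
    """Sharpen by subtracting a Gaussian-smoothed copy; sigma and amount
    would be chosen from the phosphor's estimated depth position."""
    low = np.convolve(signal, gaussian_kernel(sigma), mode="same")
    return signal + amount * (signal - low)

# Blurred step edge standing in for the rim of a deep fluorescent vessel.
edge = np.convolve(np.repeat([0.0, 1.0], 50), gaussian_kernel(4.0), mode="same")
sharp = unsharp_1d(edge, sigma=4.0, amount=1.5)

def edge_slope(s):
    """Maximum local gradient, a simple proxy for edge sharpness."""
    return np.abs(np.diff(s)).max()
```

The sharpened profile has a steeper transition at the phosphor boundary while flat regions away from the edge are left unchanged, which matches the stated goal of making phosphor boundaries easier to grasp.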
The medical image processing apparatus may include an observation image generation unit that generates an observation image. The image acquisition unit may acquire a visible light image obtained by photographing the living tissue while irradiating it with visible light, and in the observation image, the portion of the fluorescence image corresponding to the phosphor after the sharpening process may be superimposed on the visible light image.
The living tissue may contain a plurality of phosphors having mutually different fluorescence wavelengths. The image acquisition unit may acquire a fluorescence image obtained by photographing the living tissue while irradiating it with the excitation light of each of the plurality of phosphors, and the depth position information acquisition unit may acquire the depth position information of each of the plurality of phosphors based on the fluorescence image. The observation image generation unit may adjust the relative brightness among the plurality of phosphors in the portions of the fluorescence image corresponding to the plurality of phosphors, and generate the observation image by superimposing those portions, after the adjustment, on the visible light image.
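The superposition with relative brightness adjustment described above can be sketched as follows: each phosphor channel is normalized to its own peak so that phosphors of different intrinsic brightness are shown at comparable levels, then pseudocolored and alpha-blended onto the visible light image wherever fluorescence is present. The pseudocolors, the peak normalization, and the blending rule are illustrative assumptions, not the disclosed method.

```python
import numpy as np

def overlay(visible_rgb, fluor_channels, colors, alpha=0.6):
    """Superimpose the phosphor portions of fluorescence channels onto
    a visible-light RGB image (values in 0..255).

    Each channel is first normalized to its own maximum (the relative
    brightness adjustment), then blended toward its assigned pseudocolor
    with per-pixel strength proportional to the normalized fluorescence.
    """
    out = visible_rgb.astype(float).copy()
    for chan, color in zip(fluor_channels, colors):
        norm = chan / chan.max() if chan.max() > 0 else chan
        mask = norm[..., None]                       # per-pixel blend weight
        out = out * (1 - alpha * mask) + alpha * mask * 255 * np.asarray(color)
    return np.clip(out, 0, 255)

visible = np.full((4, 4, 3), 100.0)        # flat gray stand-in image
f1 = np.zeros((4, 4)); f1[1, 1] = 0.2      # dim phosphor
f2 = np.zeros((4, 4)); f2[2, 2] = 5.0      # bright phosphor
obs = overlay(visible, [f1, f2], colors=[(0, 1, 0), (1, 0, 1)])
```

After normalization the dim and bright phosphors reach the same peak blend strength, so both appear equally prominent in the observation image despite a 25x difference in raw intensity.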
The medical image processing apparatus may include an observation image generation unit that generates an observation image. The image acquisition unit may acquire a visible light image obtained by photographing the living tissue while irradiating it with visible light, and the observation image generation unit may analyze the fluorescence image after the sharpening process to specify the range of the phosphor in the living tissue, and generate an observation image in which the portion of the visible light image corresponding to the range of the phosphor is emphasized.
The observation image generation unit may generate an observation image in which the portion of the visible light image corresponding to the range of the phosphor is emphasized according to the depth position information.
Another aspect of the present disclosure relates to a medical image processing method including: acquiring a fluorescence image obtained by photographing a living tissue containing a phosphor while irradiating the living tissue with excitation light; analyzing the fluorescence image to acquire spread information indicating the image intensity distribution of the phosphor in the fluorescence image; and acquiring depth position information related to the depth position of the phosphor by comparing the spread information against a spread function representing the image intensity distribution in the living tissue.
 本開示の他の態様は、コンピュータに、蛍光体を含む生体組織に励起光を照射しつつ生体組織を撮影することで得られる蛍光画像を取得する手順と、蛍光画像を解析して、蛍光画像における蛍光体の像強度分布を示す拡がり情報を取得する手順と、生体組織における像強度分布を表す拡がり関数に対し、拡がり情報を照らし合わせることで、蛍光体の深さ位置に関連する深さ位置情報を取得する手順と、を実行させるためのプログラムに関する。 Another aspect of the present disclosure relates to a program for causing a computer to execute: a procedure of acquiring a fluorescence image obtained by photographing a living tissue containing a phosphor while irradiating the living tissue with excitation light; a procedure of analyzing the fluorescence image to acquire spread information indicating the image intensity distribution of the phosphor in the fluorescence image; and a procedure of acquiring depth position information related to the depth position of the phosphor by comparing the spread information against a spread function representing image intensity distribution in the living tissue.
図1は、医療用観察システムの一例を示すブロック図である。FIG. 1 is a block diagram showing an example of a medical observation system. 図2は、医療用画像処理装置の一例を示すブロック図である。FIG. 2 is a block diagram showing an example of a medical image processing apparatus. 図3は、生体組織の可視光画像(通常観察画像)の一例を示す。FIG. 3 shows an example of a visible light image (normal observation image) of living tissue. 図4は、生体組織の蛍光画像の一例を示す。FIG. 4 shows an example of a fluorescence image of living tissue. 図5は、図4に示す蛍光画像に対して鮮鋭化処理を行うことで得られる蛍光画像(すなわち鮮鋭蛍光画像)の一例を示す。FIG. 5 shows an example of a fluorescence image (that is, a sharp fluorescence image) obtained by subjecting the fluorescence image shown in FIG. 4 to sharpening processing. 図6は、生体組織の一例のXZ平面に沿う断面図である。FIG. 6 is a cross-sectional view along the XZ plane of an example of living tissue. 図7は、図6に示す生体組織のXY平面に沿う断面図である。FIG. 7 is a cross-sectional view along the XY plane of the living tissue shown in FIG. 6. 図8は、蛍光体の観察基準線に沿う1次元輝度分布をガウス関数に基づいて近似することで得られる関数(すなわち近似関数)の一例を示す。FIG. 8 shows an example of a function (that is, an approximation function) obtained by approximating the one-dimensional luminance distribution along the observation reference line of the phosphor based on a Gaussian function. 図9は、第1実施形態に係る観察画像の一例を示す図である。FIG. 9 is a diagram showing an example of an observation image according to the first embodiment. 図10は、第1実施形態に係る観察画像の他の例を示す図である。FIG. 10 is a diagram showing another example of an observation image according to the first embodiment. 図11は、第1実施形態に係る医療用画像処理方法の一例を示すフローチャートである。FIG. 11 is a flowchart showing an example of the medical image processing method according to the first embodiment. 図12は、2種類の蛍光体(すなわち第1蛍光体及び第2蛍光体)の波長(横軸)と蛍光強度(縦軸)との間の関係例を示す。FIG. 12 shows an example of the relationship between the wavelength (horizontal axis) and the fluorescence intensity (vertical axis) of two types of phosphors (that is, a first phosphor and a second phosphor). 図13は、第1蛍光試薬が血液に混入された状態で、手術前に撮影された生体組織の蛍光画像の一例を示す。FIG. 13 shows an example of a fluorescence image of living tissue taken before surgery with the first fluorescent reagent mixed in blood. 図14は、第1蛍光試薬とは異なる第2の蛍光試薬が血液に混入された状態で、手術後に撮影された生体組織の蛍光画像の一例を示す。FIG. 14 shows an example of a fluorescence image of living tissue taken after surgery with a second fluorescent reagent, different from the first fluorescent reagent, mixed in blood. 図15は、第2実施形態に係る医療用画像処理方法の一例を示すフローチャートである。FIG. 15 is a flowchart showing an example of the medical image processing method according to the second embodiment. 図16は、第2実施形態に係る観察画像の一例を示す図である。FIG. 16 is a diagram showing an example of an observation image according to the second embodiment. 図17は、第2実施形態に係る観察画像の他の例を示す図である。FIG. 17 is a diagram showing another example of an observation image according to the second embodiment. 図18は、第3実施形態に係る医療用画像処理方法の一例を示すフローチャートである。FIG. 18 is a flowchart showing an example of the medical image processing method according to the third embodiment. 図19は、第3実施形態に係る観察画像の一例を示す図である。FIG. 19 is a diagram showing an example of an observation image according to the third embodiment. 顕微鏡システムの全体構成を概略的に示す図である。A diagram schematically showing the overall configuration of a microscope system. 撮像方式の例を示す図である。A diagram showing an example of an imaging scheme. 撮像方式の例を示す図である。A diagram showing an example of an imaging scheme.
 以下、図面を参照して、本開示の典型的な実施形態を例示的に説明する。 Hereinafter, exemplary embodiments of the present disclosure will be exemplified with reference to the drawings.
[第1実施形態]
[医療用観察システム]
 図1は、医療用観察システム10の一例を示すブロック図である。図2は、医療用画像処理装置12の一例を示すブロック図である。図3は、生体組織200の可視光画像(通常観察画像)100の一例を示す。図4は、生体組織200の蛍光画像101の一例を示す。図5は、図4に示す蛍光画像101に対して鮮鋭化処理(散乱抑制処理)を行うことで得られる蛍光画像101(すなわち鮮鋭蛍光画像102)の一例を示す。
[First embodiment]
[Medical observation system]
FIG. 1 is a block diagram showing an example of a medical observation system 10. FIG. 2 is a block diagram showing an example of a medical image processing apparatus 12. FIG. 3 shows an example of a visible light image (normal observation image) 100 of a living tissue 200. FIG. 4 shows an example of a fluorescence image 101 of the living tissue 200. FIG. 5 shows an example of a fluorescence image 101 (that is, a sharp fluorescence image 102) obtained by performing sharpening processing (scattering suppression processing) on the fluorescence image 101 shown in FIG. 4.
 図1に示す医療用観察システム10は、撮影ユニット11、医療用画像処理装置12及び出力ユニット13を備える。 The medical observation system 10 shown in FIG. 1 includes an imaging unit 11, a medical image processing device 12 and an output unit 13.
 撮影ユニット11、医療用画像処理装置12及び出力ユニット13は、一体的に設けられてもよいし、別体として設けられてもよい。例えば、撮影ユニット11、医療用画像処理装置12及び出力ユニット13のうちの2以上のコントローラが、共通の制御ユニットにより構成されていてもよい。 The imaging unit 11, the medical image processing apparatus 12, and the output unit 13 may be provided integrally or may be provided separately. For example, controllers of two or more of the imaging unit 11, the medical image processing apparatus 12, and the output unit 13 may be configured by a common control unit.
 撮影ユニット11、医療用画像処理装置12及び出力ユニット13は送受信部(図示省略)を具備し、有線及び/又は無線によって相互間でデータ類の送受信を行うことができる。 The imaging unit 11, the medical image processing apparatus 12, and the output unit 13 are equipped with a transmitting/receiving section (not shown), and can transmit/receive data between them by wire and/or wirelessly.
 撮影ユニット11は、血管201等の蛍光体202を含む生体組織200に可視光を照射しつつ生体組織200を撮影することで、生体組織200の可視光画像100(図3参照)を取得する。また撮影ユニット11は、生体組織200に励起光を照射しつつ、生体組織200を撮影することで、生体組織200の蛍光画像101(図4参照)を取得する。 The imaging unit 11 acquires a visible light image 100 (see FIG. 3) of the living tissue 200 by irradiating the living tissue 200 including the fluorescent substance 202 such as the blood vessel 201 with visible light while photographing the living tissue 200 . The imaging unit 11 acquires a fluorescence image 101 (see FIG. 4) of the living tissue 200 by photographing the living tissue 200 while irradiating the living tissue 200 with excitation light.
 このように撮影ユニット11は、観察対象である生体組織200に関し、可視光画像100及び蛍光画像101の両方を取得する。 In this way, the imaging unit 11 acquires both the visible light image 100 and the fluorescence image 101 of the biological tissue 200 to be observed.
 図1に示す撮影ユニット11は、カメラコントローラ21、カメラ記憶部22、撮像部23、光照射部24及び試料支持部25を備える。 The imaging unit 11 shown in FIG. 1 includes a camera controller 21 , a camera storage section 22 , an imaging section 23 , a light irradiation section 24 and a sample support section 25 .
 カメラコントローラ21は、撮影ユニット11の構成要素を制御する。 The camera controller 21 controls components of the imaging unit 11 .
 カメラ記憶部22は、各種データ及びプログラムを記憶する。撮影ユニット11の構成要素(例えばカメラコントローラ21)は、カメラ記憶部22における各種データ及びプログラムの読み出し、書き換え更新及び削除を、適宜行うことができる。 The camera storage unit 22 stores various data and programs. The components of the imaging unit 11 (for example, the camera controller 21) can appropriately read, rewrite, update, and delete various data and programs in the camera storage unit 22.
 試料支持部25は、カメラコントローラ21の制御下で、観察対象である生体組織200の試料を、所定の観察位置に位置づけつつ支持する。生体組織200の試料の観察位置への配置は、人手により手動的に行われてもよいし、移送装置(図示省略)により機械的に行われてもよい。 Under the control of the camera controller 21, the sample support section 25 supports the sample of the biological tissue 200 to be observed while positioning it at a predetermined observation position. Arrangement of the sample of the living tissue 200 at the observation position may be manually performed by hand, or may be performed mechanically by a transfer device (not shown).
 試料支持部25により支持される生体組織200の試料は、蛍光試薬(例えばICG(Indocyanine Green)、5-ALA或いはフルオレセイン)により標識された蛍光体202を含む。 A sample of biological tissue 200 supported by the sample support portion 25 contains a fluorescent substance 202 labeled with a fluorescent reagent (eg, ICG (Indocyanine Green), 5-ALA, or fluorescein).
 本実施形態は、観察対象の生体組織200に蛍光体202が1つのみ含まれる場合を想定するが、後述のように生体組織200は2以上の蛍光体202を含んでもよい。 Although this embodiment assumes that only one phosphor 202 is included in the living tissue 200 to be observed, the living tissue 200 may include two or more phosphors 202 as described later.
 光照射部24は、カメラコントローラ21の制御下で、観察位置に位置づけられている生体組織200に対して撮影光(すなわち可視光及び励起光)を照射する。 Under the control of the camera controller 21, the light irradiation unit 24 irradiates the living tissue 200 positioned at the observation position with imaging light (that is, visible light and excitation light).
 光照射部24は、可視光照射部24a及び励起光照射部24bを有する。可視光照射部24aは、可視光(特に白色光)を、観察位置に向けて発する。励起光照射部24bは、蛍光体202を蛍光励起するための励起光を、観察位置に向けて発する。 The light irradiation section 24 has a visible light irradiation section 24a and an excitation light irradiation section 24b. The visible light irradiation unit 24a emits visible light (especially white light) toward the observation position. The excitation light irradiation unit 24b emits excitation light for fluorescence excitation of the phosphor 202 toward the observation position.
 対象部位を蛍光化するために励起波長の異なる複数種類の蛍光試薬が使われる可能性がある場合、励起光照射部24bは、それらの励起波長を持つ複数種類の励起光を発することができる。この場合、励起光照射部24bは、実際に使用される蛍光試薬に応じた波長を持つ励起光を、選択的に発することができる。 When there is a possibility that multiple types of fluorescent reagents with different excitation wavelengths are used to fluoresce the target site, the excitation light irradiation unit 24b can emit multiple types of excitation light with those excitation wavelengths. In this case, the excitation light irradiation unit 24b can selectively emit excitation light having a wavelength corresponding to the fluorescent reagent that is actually used.
 可視光照射部24a及び励起光照射部24bは、互いに別々のデバイスにより構成されてもよいし、一部又は全体が共通のデバイスにより構成されてもよい。 The visible light irradiator 24a and the excitation light irradiator 24b may be composed of separate devices, or may be partially or wholly composed of a common device.
 撮像部23は、カメラコントローラ21の制御下で、観察位置に位置づけられている生体組織200の撮影を行って撮影画像(すなわち可視光画像100及び蛍光画像101)を取得する。 Under the control of the camera controller 21, the imaging unit 23 captures the image of the living tissue 200 positioned at the observation position to acquire captured images (that is, the visible light image 100 and the fluorescence image 101).
 撮像部23は、可視光撮像部23a及び励起光撮像部23bを有する。可視光撮像部23aは、生体組織200の可視光画像100を取得する。励起光撮像部23bは、生体組織200の蛍光画像101を取得する。 The imaging unit 23 has a visible light imaging unit 23a and an excitation light imaging unit 23b. The visible light imaging unit 23a acquires the visible light image 100 of the living tissue 200. The excitation light imaging unit 23b acquires the fluorescence image 101 of the living tissue 200.
 可視光撮像部23a及び励起光撮像部23bは、互いに別々のデバイスにより構成されてもよいし、一部又は全体が共通のデバイスにより構成されてもよい。 The visible light imaging unit 23a and the excitation light imaging unit 23b may be configured by separate devices, or may be partially or wholly configured by a common device.
 このようにして取得された生体組織200の可視光画像100及び蛍光画像101は、カメラコントローラ21の制御下で、撮影ユニット11から医療用画像処理装置12に送信される。 The visible light image 100 and the fluorescence image 101 of the biological tissue 200 acquired in this way are transmitted from the imaging unit 11 to the medical image processing apparatus 12 under the control of the camera controller 21.
 撮影ユニット11から医療用画像処理装置12への可視光画像100及び蛍光画像101の具体的な送信方法は、限定されない。可視光画像100及び蛍光画像101は、撮影直後に撮像部23から医療用画像処理装置12に直接的に送信されてもよいし、撮像部23以外の装置(例えばカメラコントローラ21)から医療用画像処理装置12に送信されてもよい。例えば、可視光画像100及び蛍光画像101は、一旦、カメラ記憶部22に保存された後に、カメラ記憶部22から読み出されて医療用画像処理装置12に送信されてもよい。 The specific method of transmitting the visible light image 100 and the fluorescence image 101 from the imaging unit 11 to the medical image processing apparatus 12 is not limited. The visible light image 100 and the fluorescence image 101 may be transmitted directly from the imaging unit 23 to the medical image processing apparatus 12 immediately after imaging, or may be transmitted to the medical image processing apparatus 12 from a device other than the imaging unit 23 (for example, the camera controller 21). For example, the visible light image 100 and the fluorescence image 101 may be temporarily stored in the camera storage unit 22 and then read out from the camera storage unit 22 and transmitted to the medical image processing apparatus 12.
 医療用画像処理装置12は、生体組織200の撮影画像(特に蛍光画像101)を解析して、蛍光体202の深さ位置情報を取得する。また医療用画像処理装置12は、可視光画像100及び蛍光画像101に基づいて観察画像を生成する。 The medical image processing apparatus 12 analyzes the captured image of the living tissue 200 (especially the fluorescence image 101) and acquires the depth position information of the phosphor 202. The medical image processing apparatus 12 also generates observation images based on the visible light image 100 and the fluorescence image 101 .
 本実施形態の観察画像では、生体組織200における蛍光体202の場所が視認可能に表される。ただし、観察画像に含まれる具体的な画像及びその他の情報は、限定されない。 In the observation image of the present embodiment, the locations of the phosphors 202 in the living tissue 200 are visibly displayed. However, the specific images and other information included in the observed image are not limited.
 医療用画像処理装置12の機能構成例及び画像処理例の詳細は後述する。 Details of an example of the functional configuration and an example of image processing of the medical image processing apparatus 12 will be described later.
 出力ユニット13は、出力コントローラ31、出力記憶部32及びディスプレイ装置33を有する。 The output unit 13 has an output controller 31 , an output storage section 32 and a display device 33 .
 出力コントローラ31は、出力ユニット13の構成要素を制御する。 The output controller 31 controls the components of the output unit 13.
 出力記憶部32は、各種データ及びプログラムを記憶する。出力ユニット13の構成要素(例えば出力コントローラ31)は、出力記憶部32における各種データ及びプログラムの読み出し、書き換え更新及び削除を、適宜行うことができる。 The output storage unit 32 stores various data and programs. The components of the output unit 13 (for example, the output controller 31) can appropriately read, rewrite, update, and delete various data and programs in the output storage unit 32.
 ディスプレイ装置33は、医療用画像処理装置12から送られてくる観察画像を表示する。術者等の観察者は、ディスプレイ装置33に表示される観察画像を見ることで、生体組織200における蛍光体202の範囲を確認することができる。 The display device 33 displays observation images sent from the medical image processing device 12 . An observer such as an operator can confirm the range of the phosphor 202 in the living tissue 200 by viewing the observation image displayed on the display device 33 .
[医療用画像処理装置]
 次に、医療用画像処理装置12の機能構成例及び画像処理例について説明する。
[Medical image processing device]
Next, a functional configuration example and an image processing example of the medical image processing apparatus 12 will be described.
 図2に示す医療用画像処理装置12は、画像処理コントローラ40、画像取得部41、深さ位置情報取得部42、画質調整部43、観察画像生成部44及び処理記憶部45を備える。医療用画像処理装置12が備えるこれらの機能部は任意のハードウェア及び/又はソフトウェアにより構成可能であり、2以上の機能部が共通の処理ユニットにより実現されてもよい。 The medical image processing apparatus 12 shown in FIG. 2 includes an image processing controller 40, an image acquisition unit 41, a depth position information acquisition unit 42, an image quality adjustment unit 43, an observation image generation unit 44, and a processing storage unit 45. These functional units of the medical image processing apparatus 12 can be configured by arbitrary hardware and/or software, and two or more functional units may be realized by a common processing unit.
 画像取得部41は、生体組織200(蛍光体202を含む)の可視光画像100及び蛍光画像101を、撮影ユニット11から取得する。 The image acquisition unit 41 acquires the visible light image 100 and the fluorescence image 101 of the biological tissue 200 (including the phosphor 202) from the imaging unit 11.
 画像取得部41は、取得した可視光画像100及び蛍光画像101を、他の処理部(画像処理コントローラ40、深さ位置情報取得部42、画質調整部43及び観察画像生成部44)に直接的に送ってもよいし、処理記憶部45に一旦保存してもよい。医療用画像処理装置12の他の処理部(画像処理コントローラ40、深さ位置情報取得部42、画質調整部43及び観察画像生成部44)は、必要に応じて、可視光画像100及び蛍光画像101を処理記憶部45から取得してもよい。 The image acquisition unit 41 may send the acquired visible light image 100 and fluorescence image 101 directly to the other processing units (the image processing controller 40, the depth position information acquisition unit 42, the image quality adjustment unit 43, and the observation image generation unit 44), or may temporarily store them in the processing storage unit 45. The other processing units of the medical image processing apparatus 12 (the image processing controller 40, the depth position information acquisition unit 42, the image quality adjustment unit 43, and the observation image generation unit 44) may acquire the visible light image 100 and the fluorescence image 101 from the processing storage unit 45 as necessary.
 深さ位置情報取得部42は、蛍光画像101に基づいて、蛍光体202の深さ位置に関連する深さ位置情報を取得する。すなわち深さ位置情報取得部42は、蛍光画像101における蛍光体202のボケの程度に基づいて、蛍光体202の深さ位置情報を導き出す。 The depth position information acquisition unit 42 acquires depth position information related to the depth position of the phosphor 202 based on the fluorescence image 101 . That is, the depth position information acquisition unit 42 derives depth position information of the phosphor 202 based on the degree of blurring of the phosphor 202 in the fluorescence image 101 .
 ここで言う深さ位置情報は、蛍光体202の深さ位置に関連づけられる情報であり、典型的には蛍光体202の深さ位置を直接的に示す情報(例えば組織表面200aからの距離(深さ)の絶対値)である。ただし深さ位置情報は、蛍光体202の深さ位置を間接的に示す情報であってもよいし、深さ位置及び他の情報から導き出される別の情報であってもよい。 The depth position information referred to here is information associated with the depth position of the phosphor 202, and is typically information directly indicating the depth position of the phosphor 202 (for example, the absolute value of the distance (depth) from the tissue surface 200a). However, the depth position information may be information indirectly indicating the depth position of the phosphor 202, or may be other information derived from the depth position and other information.
 本実施形態の深さ位置情報取得部42は、蛍光画像101を解析して、蛍光画像101における蛍光体202の像強度分布を示す拡がり情報を取得する。そして深さ位置情報取得部42は、生体組織200における像強度分布を表す拡がり関数に対し、蛍光体202の拡がり情報を照らし合わせることで、蛍光体202の深さ位置情報を取得する。 The depth position information acquisition unit 42 of this embodiment analyzes the fluorescence image 101 and acquires spread information indicating the image intensity distribution of the phosphor 202 in the fluorescence image 101 . Then, the depth position information acquisition unit 42 acquires depth position information of the phosphor 202 by comparing the spread information of the phosphor 202 with the spread function representing the image intensity distribution in the biological tissue 200 .
 蛍光体202の拡がり情報は、蛍光画像101における蛍光体202の実際のボケの程度を示す。蛍光画像101における蛍光体202のボケは、生体組織200中の光散乱に起因するため、蛍光体202の深さ位置に応じて変わる。一方、拡がり関数は、蛍光画像101における蛍光体202のボケの程度が、蛍光体202の深さ位置に応じて定式化された関数である。 The spread information of the phosphor 202 indicates the actual degree of blurring of the phosphor 202 in the fluorescence image 101. The blurring of the phosphor 202 in the fluorescence image 101 is caused by light scattering in the living tissue 200, and therefore varies depending on the depth position of the phosphor 202. The spread function, on the other hand, is a function in which the degree of blurring of the phosphor 202 in the fluorescence image 101 is formulated according to the depth position of the phosphor 202.
 したがって深さ位置情報取得部42は、蛍光体202の実際のボケの程度を示す拡がり情報を、定式化された拡がり関数に照らし合わせることで、蛍光体202の深さ位置を導き出す。 Therefore, the depth position information acquisition unit 42 derives the depth position of the phosphor 202 by comparing the spread information indicating the actual degree of blurring of the phosphor 202 with the formulated spread function.
 図6は、生体組織200の一例のXZ平面に沿う断面図である。図7は、図6に示す生体組織200のXY平面に沿う断面図である。図8は、蛍光体202の観察基準線203に沿う1次元輝度分布をガウス関数に基づいて近似することで得られる関数(すなわち近似関数)の一例を示す。「X軸」、「Y軸」及び「Z軸」は相互に直角を成し、Z軸に沿う方向(すなわちZ軸方向)は深さ方向を示す。 FIG. 6 is a cross-sectional view of an example of the biological tissue 200 along the XZ plane. FIG. 7 is a cross-sectional view along the XY plane of the biological tissue 200 shown in FIG. FIG. 8 shows an example of a function (that is, an approximation function) obtained by approximating the one-dimensional luminance distribution of the phosphor 202 along the observation reference line 203 based on the Gaussian function. The "X-axis", "Y-axis" and "Z-axis" are mutually perpendicular, and the direction along the Z-axis (ie, the Z-axis direction) indicates the depth direction.
 図6に示すように、生体組織200の表面(すなわち「組織表面」)200aから「d」で表される深さに位置する蛍光体202から発せられた蛍光は、組織表面200aに到達するまでの間に、生体組織200により散乱される。その結果、組織表面200aにおいて観察される蛍光体202の蛍光像はボケて、蛍光体202の実際のサイズ(図7参照)よりも大きくなる。 As shown in FIG. 6, fluorescence emitted from a phosphor 202 located at a depth "d" below the surface of the living tissue 200 (that is, the "tissue surface" 200a) is scattered by the living tissue 200 before it reaches the tissue surface 200a. As a result, the fluorescence image of the phosphor 202 observed at the tissue surface 200a is blurred and appears larger than the actual size of the phosphor 202 (see FIG. 7).
 したがって蛍光画像101において、蛍光体202は、実際の範囲よりも拡がった範囲で表れる。 Therefore, in the fluorescence image 101, the phosphor 202 appears in a wider range than the actual range.
 深さ位置情報取得部42は、実際の蛍光体202の範囲よりも広範な「蛍光画像101における蛍光体202の範囲」の情報を「拡がり情報」として取得する。 The depth position information acquisition unit 42 acquires information on the "range of the phosphor 202 in the fluorescence image 101" wider than the actual range of the phosphor 202 as "spread information".
 典型的には、蛍光画像101における蛍光体202の1次元輝度分布を、「蛍光体202の拡がり情報」として取得することができる。本実施形態の深さ位置情報取得部42は、蛍光画像101の輝度分布解析を行って、図7に示すX軸方向に沿った長さLを有する蛍光体202の線状部分(すなわち観察基準線203)に関する1次元輝度分布を取得する。なお、図6及び図7に示す例では、蛍光体202のうちX軸方向に最大長さを有する線状部分が、観察基準線203に設定される。 Typically, the one-dimensional luminance distribution of the phosphor 202 in the fluorescence image 101 can be acquired as the "spread information of the phosphor 202". The depth position information acquisition unit 42 of the present embodiment performs a luminance distribution analysis of the fluorescence image 101 to acquire a one-dimensional luminance distribution for a linear portion of the phosphor 202 (that is, the observation reference line 203) having a length L along the X-axis direction shown in FIG. 7. In the examples shown in FIGS. 6 and 7, the linear portion of the phosphor 202 having the maximum length in the X-axis direction is set as the observation reference line 203.
 本実施形態の深さ位置情報取得部42は、このようにして取得される蛍光体202の拡がり情報を、統計的にモデル化された関数(例えばガウス関数やローレンツ関数などの確率分布関数)を基準に近似して、近似関数を取得する。 The depth position information acquisition unit 42 of the present embodiment approximates the spread information of the phosphor 202 acquired in this way with a statistically modeled function (for example, a probability distribution function such as a Gaussian function or a Lorentz function) to obtain an approximation function.
 このようにして得られる近似関数(図8参照)を「蛍光体202の拡がり情報」として使うことで、拡がり情報と拡がり関数(特に線拡がり関数)との間の照合を簡単にして、処理負荷を軽減することができる。 By using the approximation function obtained in this way (see FIG. 8) as the "spread information of the phosphor 202", the comparison between the spread information and the spread function (in particular, the line spread function) is simplified, and the processing load can be reduced.
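As a reference, the Gaussian approximation of the one-dimensional luminance distribution described above can be sketched in Python. This sketch is not part of the disclosure: the profile data are synthetic, and the moment-based parameter estimation is an illustrative assumption (a least-squares fit against the Gaussian model could equally be used).

```python
import numpy as np

def gaussian(x, amplitude, center, sigma):
    """Gaussian model used to approximate the 1D luminance profile."""
    return amplitude * np.exp(-((x - center) ** 2) / (2.0 * sigma ** 2))

def approximate_spread_info(positions, luminance):
    """Approximate the luminance profile along the observation reference
    line by a Gaussian, estimating its parameters from weighted moments."""
    weights = np.clip(luminance, 0.0, None)   # luminance acts as the weight
    center = np.sum(weights * positions) / np.sum(weights)
    variance = np.sum(weights * (positions - center) ** 2) / np.sum(weights)
    return luminance.max(), center, np.sqrt(variance)

# Synthetic blurred profile with a known width (sigma = 1.5).
x = np.linspace(-6.0, 6.0, 241)
profile = gaussian(x, 1.0, 0.0, 1.5)

amp, center, sigma = approximate_spread_info(x, profile)
```

The recovered `sigma` then serves as the compact "spread information" that is compared against the spread function in the later step.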
 一方、拡がり関数は、典型的には、点光源に関する応答関数である点拡がり関数(PSF: Point Spread Function)或いは線光源に関する応答関数である線拡がり関数(LSF: Line Spread Function)によって表される。実際の蛍光体202は長さを持つため、本実施形態では拡がり関数として線拡がり関数が用いられる。 The spread function, on the other hand, is typically represented by a point spread function (PSF), which is the response function for a point light source, or a line spread function (LSF), which is the response function for a line light source. Since the actual phosphor 202 has a length, the line spread function is used as the spread function in the present embodiment.
 点拡がり関数(PSF(ρ))及び線拡がり関数(LSF(ρ))は、例えば以下の式により表される。
Figure JPOXMLDOC01-appb-M000001
Figure JPOXMLDOC01-appb-M000002
Figure JPOXMLDOC01-appb-M000003
Figure JPOXMLDOC01-appb-M000004
The point spread function (PSF(ρ)) and the line spread function (LSF(ρ)) are expressed, for example, by the following equations.
Figure JPOXMLDOC01-appb-M000001
Figure JPOXMLDOC01-appb-M000002
Figure JPOXMLDOC01-appb-M000003
Figure JPOXMLDOC01-appb-M000004
 上記の点拡がり関数(PSF(ρ))及び線拡がり関数(LSF(ρ))に関する式1~式4において、「ρ」は、深さ方向(Z軸方向)に対して直角を成す平面(すなわちXY平面)における位置を表す。 In Equations 1 to 4 for the point spread function (PSF(ρ)) and the line spread function (LSF(ρ)), "ρ" represents a position in the plane perpendicular to the depth direction (Z-axis direction) (that is, the XY plane).
 「d」は、蛍光体202の深さ位置(すなわち組織表面200aから蛍光体202(特に観察基準線203)までの深さ方向の位置)を表す。 "d" represents the depth position of the phosphor 202 (that is, the position in the depth direction from the tissue surface 200a to the phosphor 202 (in particular, the observation reference line 203)).
 「μa」は、吸収係数を表し、生体組織200の種類(例えば肝臓等の臓器;より具体的には生体組織200を構成する媒質(組成))に応じて定められる。 "μa" represents the absorption coefficient, and is determined according to the type of the living tissue 200 (for example, an organ such as the liver; more specifically, the medium (composition) constituting the living tissue 200).
 「μs’」は、等価散乱係数を表し、生体組織200の種類及び蛍光体202の蛍光波長に応じて定められる。 "μs′" represents the equivalent scattering coefficient, and is determined according to the type of the living tissue 200 and the fluorescence wavelength of the phosphor 202.
 なお、励起光の波長と蛍光体202の蛍光波長とは所定の対応関係があるため、「μs’」は励起光の波長と対応づけて定められうる。殆どの蛍光試薬に関して励起波長及び蛍光波長の差は小さく、蛍光体202のボケの程度に関して励起波長と蛍光波長との差を実質的に無視しても差し支えないこともある。そのような場合には、励起波長を蛍光波長とみなし、「μs’」を励起波長と対応づけて定めることが可能である。また、複数の蛍光試薬が使用される可能性があり且つ各蛍光試薬の蛍光波長が一意に定められる場合、「μs’」は蛍光試薬に対応づけて定められうる。この場合、「μs’」は、間接的に蛍光波長に対応づけられることになる。 Since there is a predetermined correspondence between the wavelength of the excitation light and the fluorescence wavelength of the phosphor 202, "μs′" can also be determined in association with the wavelength of the excitation light. For most fluorescent reagents, the difference between the excitation wavelength and the fluorescence wavelength is small, and this difference may be substantially negligible with respect to the degree of blurring of the phosphor 202. In such a case, the excitation wavelength can be regarded as the fluorescence wavelength, and "μs′" can be determined in association with the excitation wavelength. Also, when a plurality of fluorescent reagents may be used and the fluorescence wavelength of each fluorescent reagent is uniquely determined, "μs′" can be determined in association with the fluorescent reagent. In this case, "μs′" is indirectly associated with the fluorescence wavelength.
 このように「μa」及び「μs’」の具体的な値は、予め取得可能であり、データベースに保存しておくことが可能である。本実施形態の処理記憶部45は、多数の「μa」のデータを、生体組織200の種類に対応づけて予め記憶する。同様に処理記憶部45は、多数の「μs’」のデータを、生体組織200の種類及び蛍光体202の蛍光波長に対応づけて予め記憶する。 Specific values of "μa" and "μs′" can thus be obtained in advance and stored in a database. The processing storage unit 45 of the present embodiment stores in advance many "μa" data entries in association with the type of living tissue 200. Similarly, the processing storage unit 45 stores in advance many "μs′" data entries in association with the type of living tissue 200 and the fluorescence wavelength of the phosphor 202.
 そのため「μa」の具体的な値は、実際の生体組織200の種類に応じて、処理記憶部45から読み出し可能である。同様に、「μs’」の具体的な値は、実際の生体組織200の種類及び実際の蛍光体202の蛍光波長に応じて、処理記憶部45から読み出し可能である。 Therefore, a specific value of "μa" can be read out from the processing storage unit 45 according to the actual type of the living tissue 200. Similarly, a specific value of "μs′" can be read out from the processing storage unit 45 according to the actual type of the living tissue 200 and the actual fluorescence wavelength of the phosphor 202.
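The reading of "μa" and "μs′" from pre-stored data can be sketched as a simple keyed lookup. This Python sketch is illustrative only; the table names and all numeric values are placeholders, not measured optical properties of any real tissue.

```python
# Hypothetical coefficient tables; the numeric values are placeholders.
MU_A = {  # absorption coefficient mu_a [1/mm], keyed by tissue type
    "liver": 0.03,
    "kidney": 0.02,
}
MU_S_PRIME = {  # equivalent scattering coefficient mu_s' [1/mm],
    # keyed by (tissue type, fluorescence wavelength [nm])
    ("liver", 830): 1.1,
    ("kidney", 830): 1.3,
}

def lookup_coefficients(tissue_type, fluorescence_wavelength_nm):
    """Read mu_a and mu_s' for the given tissue type and fluorescence
    wavelength, mirroring the pre-stored database in the processing
    storage unit."""
    mu_a = MU_A[tissue_type]
    mu_s_prime = MU_S_PRIME[(tissue_type, fluorescence_wavelength_nm)]
    return mu_a, mu_s_prime

mu_a, mu_s_prime = lookup_coefficients("liver", 830)
```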
 「L」は、深さ方向に対して直角を成すXY平面における、蛍光体202の長さを表す(図7参照)。本実施形態では、蛍光体202を代表する観察基準線203のX軸方向長さによって、「L」が表される。 "L" represents the length of the phosphor 202 on the XY plane perpendicular to the depth direction (see FIG. 7). In this embodiment, “L” is represented by the X-axis direction length of the observation reference line 203 representing the phosphor 202 .
 「κd」は、上記式2により表されるように、吸収係数(μa)及び等価散乱係数(μs’)に基づいて定められる。 "κd" is determined based on the absorption coefficient (μa) and the equivalent scattering coefficient (μs′), as expressed by Equation 2 above.
 上記の式1~式4から明らかなように、拡がり関数(すなわち点拡がり関数「PSF(ρ)」及び線拡がり関数「LSF(ρ)」)は、「d」、「μa」、「μs’」及び「L」をパラメータとして含む。したがって、「μa」、「μs’」及び「L」が特定の値に確定すれば、拡がり関数は「d」の関数として表され、上記の式4に示すように蛍光体202の深さ位置「d」を、蛍光体202のXY平面における位置「ρ」の関数(「f(ρ)」)により表すことができる。 As is clear from Equations 1 to 4 above, the spread functions (that is, the point spread function "PSF(ρ)" and the line spread function "LSF(ρ)") include "d", "μa", "μs′", and "L" as parameters. Therefore, once "μa", "μs′", and "L" are fixed to specific values, the spread function is expressed as a function of "d", and as shown in Equation 4 above, the depth position "d" of the phosphor 202 can be expressed as a function ("f(ρ)") of the position "ρ" of the phosphor 202 in the XY plane.
 本実施形態の深さ位置情報取得部42は、蛍光画像101を解析することで特定される「蛍光画像101中の蛍光体202の範囲」から、観察対象の蛍光体202の長さ「L」を取得する。また深さ位置情報取得部42は、生体組織200の種類及び蛍光体202の蛍光波長に応じて、対応の「μ」及び「μ’」を処理記憶部45から読み出して取得する。そして深さ位置情報取得部42は、「L」、「μ」及び「μ’」の対応値が反映された「LSF(ρ)」を取得し、当該「LSF(ρ)」に基づいて「f(ρ)」を導き出す。 The depth position information acquisition unit 42 of the present embodiment acquires the length "L" of the phosphor 202 to be observed from the "range of the phosphor 202 in the fluorescence image 101" specified by analyzing the fluorescence image 101. The depth position information acquisition unit 42 also reads out and acquires the corresponding "μa" and "μs′" from the processing storage unit 45 according to the type of the living tissue 200 and the fluorescence wavelength of the phosphor 202. Then, the depth position information acquisition unit 42 acquires "LSF(ρ)" in which the corresponding values of "L", "μa", and "μs′" are reflected, and derives "f(ρ)" based on that "LSF(ρ)".
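The matching of the measured spread information against the depth-parameterized spread function can be sketched as a grid search over candidate depths. The sketch below is hypothetical: since Equations 1 to 4 are not reproduced here, a simple stand-in model in which the blur width grows monotonically with depth, scaled by an effective attenuation of the diffusion-theory form κd = sqrt(3·μa·(μa + μs′)), is used in place of the actual line spread function.

```python
import numpy as np

def model_blur_width(depth_mm, mu_a, mu_s_prime):
    """Stand-in for the blur width predicted by LSF(rho) at a given depth.
    kappa_d follows the common diffusion-theory form; the linear
    width-vs-depth relation is a placeholder, not the patented formula."""
    kappa_d = np.sqrt(3.0 * mu_a * (mu_a + mu_s_prime))
    return depth_mm / kappa_d

def estimate_depth(measured_width_mm, mu_a, mu_s_prime, max_depth_mm=10.0):
    """Grid-search the candidate depth whose predicted blur width best
    matches the width measured from the fitted spread information."""
    depths = np.linspace(0.01, max_depth_mm, 1000)
    widths = model_blur_width(depths, mu_a, mu_s_prime)
    return depths[np.argmin(np.abs(widths - measured_width_mm))]

d_est = estimate_depth(measured_width_mm=2.0, mu_a=0.03, mu_s_prime=1.1)
```

In practice the grid search would evaluate the actual LSF(ρ) of Equations 1 to 4 at each candidate depth, or use the closed-form f(ρ) of Equation 4 directly.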
 なお深さ位置情報取得部42は、生体組織200の種類及び蛍光体202の蛍光波長を、任意の方法で取得可能である。 The depth position information acquisition unit 42 can acquire the type of the living tissue 200 and the fluorescence wavelength of the phosphor 202 by any method.
 本実施形態の深さ位置情報取得部42は、可視光画像100を解析することで生体組織200の種類を推定し、撮影ユニット11から送られてくる情報に基づいて蛍光体202の蛍光波長を判別する。ただし深さ位置情報取得部42は、操作者によって医療用観察システム10(例えば医療用画像処理装置12)に手動的に入力される情報に基づいて、生体組織200の種類や蛍光体202の蛍光波長を取得してもよい。 The depth position information acquisition unit 42 of the present embodiment estimates the type of the living tissue 200 by analyzing the visible light image 100, and determines the fluorescence wavelength of the phosphor 202 based on information sent from the imaging unit 11. However, the depth position information acquisition unit 42 may acquire the type of the living tissue 200 and the fluorescence wavelength of the phosphor 202 based on information manually input to the medical observation system 10 (for example, the medical image processing apparatus 12) by an operator.
 このようにして深さ位置情報取得部42は、生体組織200の種類に応じて定められる「蛍光体202の輝度分布と、深さ位置情報とに基づく線拡がり関数」を「蛍光体202の深さ位置に基づく拡がり関数」として取得する。そして深さ位置情報取得部42は、蛍光体202の輝度に基づく線拡がり情報(近似関数)と、線拡がり関数とに基づいて、蛍光体202の深さ位置情報を導き出す。 In this way, the depth position information acquisition unit 42 acquires, as the "spread function based on the depth position of the phosphor 202", the "line spread function based on the luminance distribution of the phosphor 202 and the depth position information" determined according to the type of the living tissue 200. Then, the depth position information acquisition unit 42 derives the depth position information of the phosphor 202 based on the line spread information (the approximation function) based on the luminance of the phosphor 202 and the line spread function.
 画質調整部43(図2参照)は、蛍光体202の深さ位置情報に応じた鮮鋭化処理を、蛍光画像101に対して行う。 The image quality adjustment unit 43 (see FIG. 2) performs sharpening processing on the fluorescence image 101 according to the depth position information of the phosphor 202 .
 鮮鋭化処理の具体的な方法は限定されない。典型的には、蛍光体202のボケの程度を示す拡がり関数の逆関数を使った鮮鋭化処理が行われ、上述の拡がり関数から導き出される画像復元フィルタが蛍光画像101に適用される。 The specific method of sharpening processing is not limited. Typically, sharpening processing is performed using the inverse of the spread function that indicates the degree of blurring of the phosphor 202 , and the image restoration filter derived from the spread function is applied to the fluorescence image 101 .
 The image quality adjustment unit 43 may perform other sharpening processing on the fluorescence image 101, and may perform arbitrary image quality adjustment processing on the visible light image 100 and/or the fluorescence image 101.
 The observation image generation unit 44 generates an observation image 103. The observation image 103 is not limited as long as the location of the phosphor 202 in the living tissue 200 is visibly represented.
 FIG. 9 is a diagram showing an example of the observation image 103 according to the first embodiment. FIG. 10 is a diagram showing another example of the observation image 103 according to the first embodiment.
 As one example, the observation image generation unit 44 can generate the observation image 103 by emphasizing the portion of the visible light image 100 corresponding to the range of the phosphor 202 (see FIG. 9).
 In particular, in the fluorescence image 101 that has undergone the sharpening processing, blurring of the phosphor 202 is reduced and a more accurate range of the phosphor 202 appears. Therefore, by analyzing the fluorescence image 101 after the sharpening processing, the observation image generation unit 44 can more accurately identify the range of the phosphor 202 in the living tissue 200. The observation image generation unit 44 can then generate the observation image 103 by emphasizing the corresponding locations of the range of the phosphor 202 in the visible light image 100.
 In this case, an observer viewing the observation image 103 can clearly confirm the range of the phosphor 202 on the visible light image 100.
 As another example, the observation image generation unit 44 can generate the observation image 103 by superimposing the portion of the sharpened fluorescence image 101 corresponding to the phosphor 202 onto the corresponding location of the visible light image 100 (see FIG. 10).
 In this case, an observer viewing the observation image 103 can visually recognize the fluorescence state of the phosphor 202 on the visible light image 100. Note that the observation image generation unit 44 may generate the observation image 103 by superimposing, on the visible light image 100, the fluorescence image 101 of the phosphor 202 after image processing (for example, adjustment of color, shading, and brightness).
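 A minimal sketch of this superposition, assuming the fluorescence image has been reduced to a single normalized channel; the green pseudo-colour tint and the blend strength `alpha` are illustrative choices only, not specified by the embodiment.

```python
import numpy as np

def overlay_fluorescence(visible_rgb, fluor, alpha=0.6, tint=(0.0, 1.0, 0.0)):
    """Superimpose a single-channel fluorescence image on an RGB visible
    light image. Pixels without fluorescence keep the visible image as-is;
    fluorescent pixels are blended toward the pseudo-colour tint."""
    peak = fluor.max()
    weight = (alpha * fluor / peak if peak > 0 else fluor)[..., None]
    colour = np.asarray(tint, dtype=float)[None, None, :]
    return (1.0 - weight) * visible_rgb + weight * colour
```

 Weighting by the local fluorescence intensity keeps the visible-light anatomy readable under weak signal while still marking strong signal clearly.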
 Information directly or indirectly indicating the depth position information of the phosphor 202 may be reflected in the observation image 103 (see FIG. 9). In this case, an observer of the observation image 103 can intuitively grasp the depth position information of the phosphor 202.
 Information indicating the depth position of the phosphor 202 can be reflected in the observation image 103 in any form. For example, in the observation image 103, the portion corresponding to the range of the phosphor 202 may be emphasized according to the depth position information. As one example, the observation image generation unit 44 may adjust the color, pattern, shading, and/or brightness of the portion of the observation image 103 corresponding to the range of the phosphor 202 according to the depth position information of the corresponding phosphor 202.
 In this case, the observation image 103 may include an index display indicating the relationship between the highlighted display of the phosphor 202 and the depth position information of the phosphor 202. In the example shown in FIG. 9, the depth position of the phosphor 202 is represented by the shading with which the phosphor 202 is filled, and a bar-shaped index display indicating the relationship between the shading and the depth position (that is, the distance from the tissue surface 200a, 0 mm to 20 mm) is included in the observation image 103.
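 The mapping from depth position to fill shade can be sketched as below; the linear mapping over the 0 mm to 20 mm range of Fig. 9 is an assumption (any monotone mapping paired with the bar-shaped index display would serve equally).

```python
import numpy as np

def depth_to_shade(depth_mm, d_min=0.0, d_max=20.0):
    """Map a depth position (distance from the tissue surface 200a, in mm)
    to a fill shade in [0, 1]: 1.0 is the lightest shade at the surface,
    0.0 the darkest at d_max, mirroring a bar-shaped index display."""
    t = (np.clip(depth_mm, d_min, d_max) - d_min) / (d_max - d_min)
    return 1.0 - t
```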
 Furthermore, the imaging conditions of the visible light image 100 and/or the fluorescence image 101 of the living tissue 200 may be optimized according to the depth position information of the phosphor 202. That is, the imaging unit 11 shown in FIG. 1 may adjust the imaging conditions based on the "depth position information of the phosphor 202" sent from the medical image processing apparatus 12. In this case, the imaging unit 11 may adjust, for example, the driving conditions of the camera (including the imaging element) and the state of the imaging light irradiated onto the living tissue 200 (for example, the brightness of the excitation light and/or the visible light).
 Next, an example of a medical image processing method using the above-described medical observation system 10 (in particular, the medical image processing apparatus 12) will be described.
 FIG. 11 is a flowchart showing an example of the medical image processing method according to the first embodiment.
 First, the image acquisition unit 41 of the medical image processing apparatus 12 acquires the visible light image 100 and the fluorescence image 101 of the living tissue 200 from the imaging unit 11 (S1 in FIG. 11).
 Then, the depth position information acquisition unit 42 analyzes the visible light image 100 to identify the type of the living tissue 200 (S2). The specific method of analyzing the visible light image 100 to identify the type of the living tissue 200 is not limited, and the type of the living tissue 200 can be identified using a known image processing method.
 The depth position information acquisition unit 42 then acquires a line spread function corresponding to the type of the living tissue 200 (S3). As described above, the depth position information acquisition unit 42 of the present embodiment acquires a line spread function that reflects the corresponding absorption coefficient and equivalent scattering coefficient read from the processing storage unit 45, and the length L of the phosphor 202 (in particular, of the observation reference line 203).
 The depth position information acquisition unit 42 then analyzes the fluorescence image 101 and acquires line spread information on the luminance of the phosphor 202 in the fluorescence image 101 (S4). The specific method of analyzing the fluorescence image 101 to acquire the line spread information of the phosphor 202 is not limited, and the line spread information of the phosphor 202 (the luminance distribution along the observation reference line 203 of the phosphor 202) can be acquired using a known image processing method.
 The depth position information acquisition unit 42 then derives the depth position of the phosphor 202 by comparing the line spread information of the phosphor 202 against the line spread function (S5).
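 Steps S3 to S5 amount to matching the measured luminance profile against a family of line spread functions indexed by depth. The sketch below uses a Gaussian whose width grows with depth and an equivalent scattering coefficient `mu_s`, and whose amplitude decays with an absorption coefficient `mu_a`, as a stand-in for the tissue-specific function; the model and parameter names are illustrative assumptions, not the embodiment's actual function.

```python
import numpy as np

def candidate_lsf(x, depth, mu_a=0.05, mu_s=1.0):
    """Stand-in line spread function: blur width grows with depth and the
    equivalent scattering coefficient, amplitude decays with absorption.
    The actual function in the embodiment depends on the tissue type."""
    sigma = 1.0 + mu_s * depth
    lsf = np.exp(-x**2 / (2.0 * sigma**2)) * np.exp(-mu_a * depth)
    return lsf / lsf.max()

def estimate_depth(x, measured_profile, depths):
    """S5 sketch: compare the measured line spread (normalized luminance
    profile across the phosphor) against the candidate LSF of each depth
    and return the best least-squares match."""
    profile = measured_profile / measured_profile.max()
    errors = [np.sum((profile - candidate_lsf(x, d)) ** 2) for d in depths]
    return depths[int(np.argmin(errors))]
```

 Normalizing both profiles by their peaks makes the match depend on the blur shape rather than the absolute brightness, which varies with reagent concentration and exposure.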
 The image quality adjustment unit 43 then performs sharpening processing (that is, light scattering suppression processing) optimized based on the depth position of the phosphor 202 on the fluorescence image 101 (S6). As a result, a clear fluorescence image 101 of the phosphor 202 with suppressed blurring can be obtained, and the range of the phosphor 202 in the fluorescence image 101 can be brought closer to the actual range of the phosphor 202.
 The observation image generation unit 44 then generates the observation image 103 from the visible light image 100 and the fluorescence image 101 (in particular, the fluorescence image 101 after the sharpening processing) (S7).
 The image quality adjustment unit 43 and the observation image generation unit 44 may perform arbitrary image processing to adjust the image quality and other states of the visible light image 100, the fluorescence image 101, and/or the observation image 103 of the living tissue 200.
 Thereafter, the observation image 103 is sent from the medical image processing apparatus 12 to the output unit 13 and displayed on the display device 33. In addition, information indicating the depth position of the phosphor 202 is sent from the medical image processing apparatus 12 to the imaging unit 11 as necessary.
 As described above, according to the present embodiment, the depth position of the phosphor 202 can be acquired based on the degree of image blurring of the phosphor 202 in the fluorescence image 101.
 Therefore, the depth position can be appropriately acquired not only for a phosphor 202 that appears in the visible light image 100 but also for a phosphor 202 that does not appear in it (for example, a phosphor 202 located deep in the living tissue 200). In addition, the depth position of the phosphor 202 in the living tissue 200 can be acquired without providing any special device.
 In addition, image processing optimized for the depth position of the phosphor 202 (such as sharpening processing) can be performed.
 This improves the visibility of the phosphor 202 in the fluorescence image 101 and makes it possible to identify the range of the phosphor 202 in the living tissue 200 more accurately. Furthermore, by generating the observation image 103 based on such a fluorescence image 101, an observer such as an operator can easily and accurately grasp the state of the phosphor 202 in the living tissue 200 from the observation image 103.
 As a result, the operator can perform surgery (for example, endoscopic surgery) with a more accurate technique, and residual cancer or other lesions left behind in surgery, as well as damage to normal tissue, can be effectively reduced.
[Second Embodiment]
 In the second embodiment described below, elements identical or corresponding to those of the above-described first embodiment are given the same reference numerals, and detailed description thereof is omitted.
 The present embodiment assumes a case in which the living tissue 200 contains a plurality of phosphors 202 having mutually different fluorescence wavelengths. For example, two or more types of fluorescent reagents are used to label one or more types of tissue (phosphors 202), and two or more types of phosphors 202 having mutually different fluorescence wavelengths are contained in the living tissue 200 to be observed.
 In the following description, when a plurality of phosphors are referred to collectively without distinguishing their types, they are simply denoted as "phosphor 202."
 FIG. 12 shows an example of the relationship between wavelength (horizontal axis) and fluorescence intensity (vertical axis) for two types of phosphors 202 (that is, a first phosphor 202a and a second phosphor 202b).
 The first phosphor 202a and the second phosphor 202b shown in FIG. 12 are excited in mutually different wavelength ranges, and their peak fluorescence wavelengths (that is, the fluorescence wavelengths showing the maximum fluorescence intensity) differ from each other. Specifically, the peak fluorescence wavelength Pw1 of the first phosphor 202a is shorter than the peak fluorescence wavelengths Pw2 and Pw3 of the second phosphor 202b.
 By using a plurality of types of fluorescent reagents to cause a plurality of types of phosphors 202 to fluoresce in this way, the plurality of types of phosphors 202 can be represented simultaneously, distinguished from one another, in a single observation image 103. In such an observation image 103, an observer such as an operator can observe the plurality of types of phosphors 202 simultaneously while distinguishing them from one another.
 Such simultaneous observation of a plurality of types of phosphors 202 can be applied to various uses, and the need for it is expected to grow in fields such as surgery.
 Application examples of simultaneous observation of a plurality of types of phosphors 202 are given below.
 A first application example is labeling different locations in the living tissue 200 with different fluorescent reagents. One case in hepato-biliary-pancreatic surgery, for example, is performing identification of a liver segment and identification of a liver cancer region separately.
 When both a liver segment and a liver cancer region are caused to fluoresce using a single type of fluorescent reagent, it cannot be determined whether the boundary of a fluorescent location indicates the boundary of the liver segment or of the liver cancer region.
 On the other hand, the liver segment can be labeled with a first fluorescent reagent, and the liver cancer region can be labeled with a second fluorescent reagent having a fluorescence wavelength (in particular, a peak fluorescence wavelength) different from that of the first fluorescent reagent. In this case, an observer such as the operator can grasp, from the fluorescence image, each of the liver segment (first phosphor 202a) and the liver cancer region (second phosphor 202b) while clearly distinguishing the boundary between them. The operator can therefore grasp the position and extent of the liver cancer region in the entire liver more accurately, and can appropriately remove the liver cancer while limiting enlargement of the tissue excision range.
 A second application example is labeling a single location in the living tissue 200 with different fluorescent reagents. One example is observing the state of blood flow before and after surgery.
 FIG. 13 shows an example of a fluorescence image 101 of the living tissue 200 taken before surgery with a first fluorescent reagent mixed into the blood. FIG. 14 shows an example of a fluorescence image 101 of the living tissue 200 taken after surgery with a second fluorescent reagent, different from the first fluorescent reagent, mixed into the blood.
 A fluorescent reagent mixed into the blood before surgery may remain in the blood vessels (in particular, in the site to be observed) even after surgery. Therefore, if the fluorescent reagent mixed into the blood before surgery and the fluorescent reagent mixed into the blood after surgery are the same, it cannot be determined whether the fluorescence observed after surgery comes from the fluorescent reagent administered before surgery or from the fluorescent reagent administered after surgery.
 On the other hand, when the fluorescence wavelength of the first fluorescent reagent mixed into the blood before surgery differs from that of the second fluorescent reagent mixed into the blood after surgery, the fluorescence emitted by the second fluorescent reagent (see FIG. 14) can be clearly distinguished from the fluorescence emitted by the first fluorescent reagent (see FIG. 13). The state of blood flow after surgery can therefore be observed quickly and appropriately from the fluorescence image 101, without any treatment to remove the first fluorescent reagent from the site to be observed (that is, a washing treatment) before mixing the second fluorescent reagent into the blood.
 From the viewpoint of meeting the need for simultaneous observation of a plurality of types of phosphors 202 as described above, it is preferable to vary the display of the plurality of phosphors 202 in the observation image 103 according to the type of phosphor 202 (that is, the type of fluorescent reagent).
 When the observation image 103 is generated by superimposing the fluorescence image 101 on the visible light image 100, a large difference in brightness between the plurality of types of phosphors 202 in the fluorescence image 101 makes the phosphors 202 difficult to see in the fluorescence image 101 and the observation image 103.
 Therefore, by adjusting the brightness of the portion of the fluorescence image 101 corresponding to each phosphor 202 according to the type of that phosphor 202, the difference in brightness between the phosphors 202 can be reduced, making the phosphors 202 easier to see.
 In addition, when the first phosphor 202a and the second phosphor 202b overlap each other in the fluorescence image 101, the relative depth positional relationship between the first phosphor 202a and the second phosphor 202b cannot be grasped from the fluorescence image 101. In particular, because the first phosphor 202a and the second phosphor 202b have different fluorescence wavelengths, their fluorescence scatters differently in the living tissue 200. Therefore, simply comparing the degree of blurring in the fluorescence image 101 between the first phosphor 202a and the second phosphor 202b does not make it possible to judge which of the two is located higher.
 Therefore, the display color, pattern, shading, and/or brightness of each phosphor 202 in the observation image 103 may be adjusted according to the depth position information of the corresponding phosphor 202. In this case, the depth position of each phosphor 202, as well as the relative depth positional relationship between the phosphors 202, can be grasped from the observation image 103.
 Also, when the observation image 103 is generated by highlighting the corresponding ranges of the phosphors 202 in the visible light image 100, if all the phosphors 202 are highlighted in the same way regardless of type, the relative depth positional relationship between the phosphors 202 cannot be identified in the observation image 103. Likewise, even if the depth position information of each phosphor 202 is reflected in the observation image 103, if all the phosphors 202 are highlighted in the same way regardless of type, the type of each phosphor 202 cannot be identified in the observation image 103.
 Therefore, the display of each phosphor 202 in the observation image 103 may be emphasized according to both the depth position information and the type of the phosphor 202. In this case, both the depth position and the type of each phosphor 202 can be grasped from the observation image 103. The relative depth positional relationship between the phosphors 202, and between the types of phosphors 202, can also be grasped from the observation image 103.
 Next, an example of a medical image processing method using the medical observation system 10 of the second embodiment (in particular, the medical image processing apparatus 12) will be described.
 FIG. 15 is a flowchart showing an example of the medical image processing method according to the second embodiment.
 A case in which the living tissue 200 contains two phosphors 202 (the first phosphor 202a and the second phosphor 202b) is described below.
 In the second embodiment, the processing performed on a single phosphor 202 in the above-described first embodiment is basically performed on each of the plurality of phosphors 202 (the first phosphor 202a and the second phosphor 202b).
 First, the image acquisition unit 41 of the medical image processing apparatus 12 acquires the visible light image 100 and the fluorescence image 101 of the living tissue 200 from the imaging unit 11 (S11 in FIG. 15).
 The visible light image 100 of the living tissue 200 is obtained by the imaging unit 11 performing visible light imaging once.
 On the other hand, the fluorescence image 101 of the living tissue 200 may be obtained by the imaging unit 11 performing fluorescence imaging once, or by performing fluorescence imaging a plurality of times. When both the first phosphor 202a and the second phosphor 202b can be appropriately excited by the excitation light emitted at one time from the light irradiation unit 24 (in particular, the excitation light irradiation unit 24b) of the imaging unit 11, the fluorescence image 101 can be obtained by a single fluorescence imaging. On the other hand, when the first phosphor 202a and the second phosphor 202b cannot both be appropriately excited by the excitation light emitted at one time from the excitation light irradiation unit 24b, a fluorescence image 101 of the first phosphor 202a and a fluorescence image 101 of the second phosphor 202b are obtained by separate fluorescence imaging.
 Then, the depth position information acquisition unit 42 analyzes the visible light image 100 to identify the type of the living tissue 200 (S12).
 The depth position information acquisition unit 42 then acquires line spread functions corresponding to the type of the living tissue 200 (S13). The depth position information acquisition unit 42 of the present embodiment acquires a line spread function for the first phosphor 202a and a line spread function for the second phosphor 202b.
 The depth position information acquisition unit 42 then analyzes the fluorescence image 101 and acquires line spread information on the luminance of each of the first phosphor 202a and the second phosphor 202b (S14).
 The depth position information acquisition unit 42 then derives the depth positions of the first phosphor 202a and the second phosphor 202b by comparing the line spread information of each phosphor against the corresponding line spread function (S15). In this way, the depth position information acquisition unit 42 of the present embodiment acquires the depth position information of each of the plurality of phosphors 202 (the first phosphor 202a and the second phosphor 202b) based on the fluorescence image 101.
 Then, the image quality adjustment unit 43 performs sharpening processing optimized based on the depth positions of the first phosphor 202a and the second phosphor 202b on each of the first phosphor 202a and the second phosphor 202b (S16).
 After that, processing for adjusting the mutual brightness of the first phosphor 202a and the second phosphor 202b in the fluorescence image 101 is performed on the fluorescence image 101 (S17). As a result, even when the emission intensities of the first phosphor 202a and the second phosphor 202b differ greatly, the brightness of the first phosphor 202a and the second phosphor 202b is appropriately adjusted.
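 A minimal sketch of this inter-phosphor brightness adjustment (S17), assuming each phosphor's region of the fluorescence image is available as a boolean mask; the per-region peak normalization is one possible realization, not prescribed by the embodiment.

```python
import numpy as np

def balance_phosphor_brightness(fluor, masks, target=1.0):
    """S17 sketch: apply a per-phosphor gain so that each labeled region
    reaches the same target peak brightness, reducing the visibility gap
    between strongly and weakly emitting fluorescent reagents."""
    out = fluor.astype(float).copy()
    for mask in masks:                 # one boolean mask per phosphor
        peak = out[mask].max()
        if peak > 0:
            out[mask] *= target / peak
    return out
```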
 この明るさ調整処理は、画質調整部43により行われてもよいし、観察画像生成部44により行われてもよい。明るさ調整が画質調整部43により行われる場合、画質調整部43は、実質的に観察画像生成部44としても機能することになる。 This brightness adjustment processing may be performed by the image quality adjustment unit 43 or may be performed by the observation image generation unit 44 . When the brightness adjustment is performed by the image quality adjustment section 43, the image quality adjustment section 43 substantially functions as the observation image generation section 44 as well.
 このようにして蛍光画像101の複数の蛍光体202に対応する部分に対し、複数の蛍光体202間における相対的な明るさの調整が行われた後、観察画像103が生成される(S18)。すなわち観察画像生成部44によって、可視光画像100及び蛍光画像101(特に鮮鋭化処理後の蛍光画像101)から観察画像103が生成される。 After adjusting the relative brightness among the plurality of phosphors 202 in this manner for the portion corresponding to the plurality of phosphors 202 in the fluorescence image 101, the observed image 103 is generated (S18). . That is, the observation image generator 44 generates the observation image 103 from the visible light image 100 and the fluorescence image 101 (especially the fluorescence image 101 after the sharpening process).
 観察画像103の具体的な生成方法は、特に限定されない。例えば観察画像生成部44は、複数の蛍光体202間における相対的な明るさの調整が行われた後の蛍光画像101の複数の蛍光体202に対応する部分を、可視光画像100に重畳して観察画像103を生成してもよい。また観察画像生成部44は、可視光画像100において、蛍光体202の範囲の対応箇所を強調する処理を行うことで、観察画像103を生成してもよい。 A specific method for generating the observed image 103 is not particularly limited. For example, the observation image generation unit 44 superimposes, on the visible light image 100, portions corresponding to the plurality of phosphors 202 of the fluorescence image 101 after relative brightness adjustment among the plurality of phosphors 202 has been performed. may be used to generate the observed image 103 . Further, the observation image generation unit 44 may generate the observation image 103 by performing processing for emphasizing the corresponding portion of the range of the phosphor 202 in the visible light image 100 .
 図16は、第2実施形態に係る観察画像103の一例を示す図である。図17は、第2実施形態に係る観察画像103の他の例を示す図である。図16及び図17に示す例では、観察対象の生体組織200に3つの蛍光体202(第1蛍光体202a、第2蛍光体202b及び第3蛍光体202c)が含まれる。 FIG. 16 is a diagram showing an example of an observation image 103 according to the second embodiment. FIG. 17 is a diagram showing another example of the observation image 103 according to the second embodiment. In the example shown in FIGS. 16 and 17, the biological tissue 200 to be observed contains three phosphors 202 (first phosphor 202a, second phosphor 202b, and third phosphor 202c).
 一例として、観察画像生成部44は、上述の図9に示す例と同様に、可視光画像100において蛍光体202a、202b、202cの範囲に対応する部分を強調することで、観察画像103を生成することができる(図16参照)。 As an example, the observation image generation unit 44 generates the observation image 103 by emphasizing the portions corresponding to the ranges of the phosphors 202a, 202b, and 202c in the visible light image 100, as in the example shown in FIG. (See FIG. 16).
 図16に示す例では、観察画像103の蛍光体202の範囲に対応する部分の濃淡又は色が、対応の蛍光体202の深さ位置情報に応じて調整される。これにより、観察画像103において蛍光体(図16では第2蛍光体202b及び第3蛍光体202c)同士が重なっていても、重なり合う蛍光体202b、202cを区別して視認することができる。 In the example shown in FIG. 16, the shading or color of the portion of the observation image 103 corresponding to the range of the phosphors 202 is adjusted according to the depth position information of the corresponding phosphors 202 . As a result, even if the phosphors (the second phosphor 202b and the third phosphor 202c in FIG. 16) overlap each other in the observation image 103, the overlapping phosphors 202b and 202c can be distinguished and visually recognized.
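One simple way to realize the depth-dependent shading described above is to map each phosphor's derived depth position to a grey level, so that shallower phosphors are drawn brighter. This is only a sketch of the idea; the display range `d_min`/`d_max` and the linear mapping are assumptions, not values specified in the patent.

```python
def depth_shade(depth, d_min=0.0, d_max=10.0):
    # Map a phosphor's derived depth position to a grey level in [0, 1]:
    # shallower phosphors (small depth) -> brighter shade.  d_min and
    # d_max are an assumed display range.
    t = (depth - d_min) / (d_max - d_min)
    t = min(max(t, 0.0), 1.0)   # clamp to the display range
    return 1.0 - t
```

Because each overlapping phosphor receives a different shade, the regions where the second and third phosphors overlap remain visually distinguishable.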
 他の例として、観察画像生成部44は、蛍光体202a、202b、202cをXZ平面に投影した状態を示す観察画像103を生成することができる(図17参照)。例えば、導出された各蛍光体202a、202b、202cの深さ位置に基づいて、観察基準線203を通るXZ平面上に各蛍光体202a、202b、202cを投影することで、図17に示すような観察画像103を生成することができる。なお各蛍光体202a、202b、202cのZ軸方向の厚みが不明の場合、各蛍光体の長さ(L)に応じて各蛍光体の厚みを仮定的に定めてもよい。 As another example, the observation image generation unit 44 can generate an observation image 103 showing the phosphors 202a, 202b, and 202c projected onto the XZ plane (see FIG. 17). For example, by projecting each of the phosphors 202a, 202b, and 202c onto the XZ plane passing through the observation reference line 203 based on their derived depth positions, the observation image 103 shown in FIG. 17 can be generated. If the thickness of each of the phosphors 202a, 202b, and 202c in the Z-axis direction is unknown, the thickness of each phosphor may be hypothetically determined according to its length (L).
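A minimal sketch of this XZ projection, under the assumption stated in the text that the unknown Z-axis thickness is set proportional to each phosphor's length L (the dictionary keys and the `thickness_ratio` constant are illustrative assumptions):

```python
def project_to_xz(phosphors, thickness_ratio=0.2):
    # Project each phosphor onto the XZ plane through the observation
    # reference line as a rectangle (x extent, z extent).  The Z-axis
    # thickness is unknown, so it is hypothetically set proportional to
    # the phosphor length L; thickness_ratio is an assumed constant.
    rects = []
    for p in phosphors:   # each p: dict with x0, x1, depth (d), length (L)
        thickness = thickness_ratio * p["length"]
        rects.append({"x0": p["x0"], "x1": p["x1"],
                      "z0": p["depth"], "z1": p["depth"] + thickness})
    return rects
```

The resulting rectangles can then be rasterized into an image such as FIG. 17, making the relative depth relationship of the phosphors directly visible.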
 この場合、蛍光体202a、202b、202cの相対的な深さ位置関係を、観察画像103から直感的に把握することができる。 In this case, the relative depth positional relationship of the phosphors 202a, 202b, and 202c can be intuitively grasped from the observation image 103.
 図17に示す例では、各蛍光体202a、202b、202cが深さ位置に応じた濃淡で表示されているが、他の基準(例えば蛍光体(蛍光試薬)の種類)に応じた濃淡表示や色分け表示がされてもよい。 In the example shown in FIG. 17, each of the phosphors 202a, 202b, and 202c is displayed with shading according to its depth position, but shading or color-coded display according to another criterion (for example, the type of phosphor (fluorescent reagent)) may be used instead.
 以上説明したように本実施形態によれば、複数の蛍光体202が生体組織200に含まれる場合に、観察者は、観察画像103に表された各蛍光体202の深さ位置情報から、複数の蛍光体202間の相対的な深さ位置の関係を把握することができる。 As described above, according to the present embodiment, when a plurality of phosphors 202 are contained in the biological tissue 200, the observer can grasp the relative depth position relationship among the plurality of phosphors 202 from the depth position information of each phosphor 202 represented in the observation image 103.
 したがって術者等の観察者は、観察画像103から、生体組織200における蛍光体202間の相対的な位置関係を簡単且つ正確に把握することができる。その結果、術者は、蛍光体202間の相対的な位置関係を把握しつつ手術を行うことができ、手技の精度及び安定性を向上させることができる。 Therefore, an observer such as an operator can easily and accurately grasp the relative positional relationship between the phosphors 202 in the living tissue 200 from the observed image 103 . As a result, the operator can perform surgery while grasping the relative positional relationship between the phosphors 202, thereby improving the accuracy and stability of the procedure.
[第3実施形態]
 第3実施形態において、上述の第1実施形態及び第2実施形態と同一又は対応の要素には、同一の符号を付し、その詳細な説明は省略する。
[Third embodiment]
In the third embodiment, elements that are the same as or correspond to those of the first and second embodiments described above are denoted by the same reference numerals, and detailed description thereof will be omitted.
 本実施形態では、蛍光波長が異なる複数種類の蛍光体202を生体組織200が含む場合に、各蛍光体202の深さ位置を直接的に導出することなく、蛍光体202間の深さ位置の相対的な関係が導出される。すなわち、上述の第1実施形態及び第2実施形態では、各蛍光体202の深さ位置の絶対値が導出されるが、第3実施形態では蛍光体202間の相対的な深さ位置関係が導出される。 In the present embodiment, when the biological tissue 200 contains a plurality of types of phosphors 202 having different fluorescence wavelengths, the relative depth positional relationship between the phosphors 202 is derived without directly deriving the depth position of each phosphor 202. That is, whereas the first and second embodiments described above derive the absolute value of the depth position of each phosphor 202, the third embodiment derives the relative depth positional relationship between the phosphors 202.
 一例として、上述の拡がり関数(上記の式1~式4参照)のパラメータのうち、等価散乱係数(μs')及び蛍光体202の長さ(L)は対応の値を取得可能であるが、吸収係数(μa)については対応の値が取得できないケースを想定する。そのようなケースとして、例えば、処理記憶部45が等価散乱係数(μs')のデータを記憶しているが、吸収係数(μa)を記憶していない場合が考えられる。 As an example, assume a case in which, among the parameters of the spread function described above (see Equations 1 to 4 above), corresponding values can be obtained for the equivalent scattering coefficient (μs') and the length (L) of the phosphor 202, but no corresponding value can be obtained for the absorption coefficient (μa). Such a case can occur, for example, when the processing storage unit 45 stores data of the equivalent scattering coefficient (μs') but does not store the absorption coefficient (μa).
 この場合、上述の第1実施形態及び第2実施形態と同様に、深さ位置情報取得部42は、蛍光画像101を解析して、蛍光画像101における複数の蛍光体202の各々の拡がり情報を取得し、各蛍光体202の長さ「L」を取得する。 In this case, as in the first and second embodiments described above, the depth position information acquisition unit 42 analyzes the fluorescence image 101 to acquire the spread information of each of the plurality of phosphors 202 in the fluorescence image 101 and to acquire the length "L" of each phosphor 202.
 また深さ位置情報取得部42は、可視光画像100を解析して生体組織200の種類を取得し、生体組織200の種類及び各蛍光体202の蛍光波長に基づいて各蛍光体202の等価散乱係数(μs')を処理記憶部45から取得する。 The depth position information acquisition unit 42 also analyzes the visible light image 100 to acquire the type of the biological tissue 200, and acquires the equivalent scattering coefficient (μs') of each phosphor 202 from the processing storage unit 45 based on the type of the biological tissue 200 and the fluorescence wavelength of each phosphor 202.
 その結果、生体組織200に含まれる複数の蛍光体202の各々に関し、蛍光体202の深さ位置(d)及び吸収係数(μa)を未知のパラメータとして含む「拡がり情報及び拡がり関数に基づく関係式」を得ることができる。例えば、2つの蛍光体202(すなわち第1蛍光体202a及び第2蛍光体202b)が生体組織200に含まれる場合、拡がり情報及び拡がり関数に基づく上述の関係式が2つ得られる。 As a result, for each of the plurality of phosphors 202 contained in the biological tissue 200, a "relational expression based on the spread information and the spread function" that includes the depth position (d) and the absorption coefficient (μa) of the phosphor 202 as unknown parameters can be obtained. For example, when two phosphors 202 (that is, the first phosphor 202a and the second phosphor 202b) are contained in the biological tissue 200, two such relational expressions based on the spread information and the spread function are obtained.
 このようにして得られる複数の関係式の各々は、対応の蛍光体202の蛍光波長によることなく、対応の蛍光体202の深さ位置とボケの程度を表す。 Each of the plurality of relational expressions obtained in this manner represents the relationship between the depth position of the corresponding phosphor 202 and the degree of blur, independently of the fluorescence wavelength of the corresponding phosphor 202.
 ここで、吸収係数(μ)の値は生体組織200の種類に応じて定まる。そのため、同一の生体組織200に含まれる複数の蛍光体202の吸収係数(μ)は、同じ値を示す。 Here, the value of the absorption coefficient (μ a ) is determined according to the type of living tissue 200 . Therefore, the absorption coefficients (μ a ) of the plurality of phosphors 202 contained in the same living tissue 200 show the same value.
 したがって、上記の「拡がり情報及び拡がり関数に基づく複数の関係式」を互いに比較することによって、それぞれの蛍光体202の深さ位置の相対関係を把握することが可能である。 Therefore, by comparing the above "plurality of relational expressions based on spread information and spread function" with each other, it is possible to grasp the relative relationship between the depth positions of the respective phosphors 202 .
 例えば、第1蛍光体202a及び第2蛍光体202bに関する2つの上記関係式を比較する場合、実質的に未知のパラメータは、第1蛍光体202aの深さ位置(d)、第2蛍光体202bの深さ位置(d)、及び共通の吸収係数(μa)である。これらの3つの未知のパラメータを含む2つの関係式によれば、第1蛍光体202aの深さ位置と、第2蛍光体202bの深さ位置との間の相対関係を導き出すことが可能である。 For example, when the two relational expressions for the first phosphor 202a and the second phosphor 202b are compared, the effectively unknown parameters are the depth position (d) of the first phosphor 202a, the depth position (d) of the second phosphor 202b, and the common absorption coefficient (μa). From the two relational expressions containing these three unknown parameters, the relative relationship between the depth position of the first phosphor 202a and the depth position of the second phosphor 202b can be derived.
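The comparison described above can be sketched numerically. Since Equations 1 to 4 are not reproduced in this section, the sketch below uses a hypothetical stand-in spread function in which blur width grows with depth and varies with an effective attenuation coefficient; the functional form, the candidate μa sweep, and all numeric values are assumptions for illustration only. The key point it demonstrates is that even with μa unknown, the shared-μa constraint can pin down the *ordering* of the two depths.

```python
import math

def blur_width(d, mu_a, mu_s_prime, L):
    # Hypothetical stand-in for the patent's spread function (Equations
    # 1-4): blur width grows with depth d; mu_eff is the diffusion-theory
    # effective attenuation coefficient.  Purely illustrative.
    mu_eff = math.sqrt(3.0 * mu_a * (mu_a + mu_s_prime))
    return L + 2.0 * d / mu_eff

def depth_from_width(width, mu_a, mu_s_prime, L):
    # Invert the stand-in model for d (it is monotone in d).
    mu_eff = math.sqrt(3.0 * mu_a * (mu_a + mu_s_prime))
    return max(width - L, 0.0) * mu_eff / 2.0

def relative_depth_order(w1, L1, mu_s1, w2, L2, mu_s2, mu_a_candidates):
    # mu_a is unknown but shared by both phosphors in the same tissue, so
    # sweep candidate values and record the sign of d1 - d2 for each; a
    # single surviving sign means the ordering of the two depth positions
    # is determined even though the absolute depths are not.
    signs = set()
    for mu_a in mu_a_candidates:
        d1 = depth_from_width(w1, mu_a, mu_s1, L1)
        d2 = depth_from_width(w2, mu_a, mu_s2, L2)
        signs.add((d1 > d2) - (d1 < d2))
    return signs
```

With equal equivalent scattering coefficients, the phosphor whose measured blur exceeds its length by more is deeper for every candidate μa, so the sweep returns a single sign.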
 このように本実施形態の深さ位置情報取得部42は、それぞれの蛍光体202に関して得られる上述の複数の関係式と、それぞれの蛍光体202の拡がり情報とに基づいて、複数の蛍光体202間における深さ位置の相対関係を示す相対深さ位置情報を取得する。 In this way, the depth position information acquisition unit 42 of the present embodiment acquires relative depth position information indicating the relative relationship between the depth positions of the plurality of phosphors 202, based on the plurality of relational expressions obtained for the respective phosphors 202 and the spread information of the respective phosphors 202.
 図18は、第3実施形態に係る医療用画像処理方法の一例を示すフローチャートである。以下では、生体組織200が2つの蛍光体202(第1蛍光体202a及び第2蛍光体202b)を含む場合について説明する。 FIG. 18 is a flowchart showing an example of a medical image processing method according to the third embodiment. A case will be described below in which the living tissue 200 includes two phosphors 202 (a first phosphor 202a and a second phosphor 202b).
 まず医療用画像処理装置12の画像取得部41によって、生体組織200の可視光画像100及び蛍光画像101が、撮影ユニット11から取得される(図18のS21)。 First, the image acquisition unit 41 of the medical image processing apparatus 12 acquires the visible light image 100 and the fluorescence image 101 of the biological tissue 200 from the imaging unit 11 (S21 in FIG. 18).
 そして、深さ位置情報取得部42が可視光画像100を解析することで、生体組織200の種類が特定される(S22)。 Then, the depth position information acquisition unit 42 analyzes the visible light image 100 to identify the type of the living tissue 200 (S22).
 そして深さ位置情報取得部42によって、第1蛍光体202a及び第2蛍光体202bの各々に関し、生体組織200の種類に応じた線拡がり関数が取得される(S23)。 Then, the depth position information acquisition unit 42 acquires a line spread function corresponding to the type of living tissue 200 for each of the first phosphor 202a and the second phosphor 202b (S23).
 そして深さ位置情報取得部42が蛍光画像101を解析することで、第1蛍光体202a及び第2蛍光体202bの各々に関し、輝度に関する線拡がり情報が取得される(S24)。 Then, the depth position information acquisition unit 42 analyzes the fluorescence image 101 to acquire line spread information regarding luminance for each of the first phosphor 202a and the second phosphor 202b (S24).
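One straightforward way to turn the fluorescence image into the "line spread information regarding luminance" of step S24 is to take a 1-D luminance profile across each phosphor and measure its width at a fraction of the peak (FWHM at 0.5). This measurement rule is an illustrative assumption; the patent does not restrict how the spread information is quantified.

```python
import numpy as np

def line_spread_width(profile, level=0.5):
    # Width of a luminance profile taken across a phosphor in the
    # fluorescence image, measured at `level` times the peak value
    # (full width at half maximum when level=0.5).  This scalar is one
    # possible form of the line spread information compared against the
    # line spread function in step S25.
    profile = np.asarray(profile, dtype=float)
    above = np.where(profile >= level * profile.max())[0]
    return int(above[-1] - above[0] + 1)   # width in pixels
```

A more blurred (deeper) phosphor yields a wider profile, so this width increases with depth.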
 そして深さ位置情報取得部42が、第1蛍光体202a及び第2蛍光体202bの各々の線拡がり情報を、対応の線拡がり関数に照らし合わせることで、第1蛍光体202aと第2蛍光体202bとの間の相対深さ位置情報が導出される(S25)。一例として、深さ位置情報取得部42は、蛍光画像101に基づいて、第1蛍光体202a及び第2蛍光体202bのそれぞれの深さ位置情報を取得する。そして深さ位置情報取得部42は、第1蛍光体202aの深さ位置情報と、第2蛍光体202bの深さ位置情報とを比較することで、第1蛍光体202aと第2蛍光体202bとの間の相対深さ位置情報を導き出すことができる。 Then, the depth position information acquisition unit 42 compares the line spread information of each of the first phosphor 202a and the second phosphor 202b against the corresponding line spread function, whereby the relative depth position information between the first phosphor 202a and the second phosphor 202b is derived (S25). As an example, the depth position information acquisition unit 42 acquires the depth position information of each of the first phosphor 202a and the second phosphor 202b based on the fluorescence image 101. The depth position information acquisition unit 42 can then derive the relative depth position information between the first phosphor 202a and the second phosphor 202b by comparing the depth position information of the first phosphor 202a with the depth position information of the second phosphor 202b.
 その後、本実施形態の画質調整部43は、蛍光画像101における第1蛍光体202a及び第2蛍光体202bの相互間の明るさを調整する処理を、蛍光画像101に対して行う(S26)。 After that, the image quality adjustment unit 43 of the present embodiment performs processing on the fluorescence image 101 to adjust the brightness between the first phosphor 202a and the second phosphor 202b in the fluorescence image 101 (S26).
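One possible form of the brightness adjustment in S26 is to scale each phosphor region so that its peak luminance matches that of the brightest region, compensating for the dimming of the deeper phosphor. The peak-matching rule and the mask-based region representation are assumptions of this sketch; the patent does not fix a specific adjustment method.

```python
import numpy as np

def equalize_phosphor_brightness(fluorescence, masks):
    # fluorescence: H x W luminance array (fluorescence image 101)
    # masks:        list of H x W boolean arrays, one per phosphor region
    # Scale each phosphor region so that its peak luminance matches the
    # brightest region's peak (one hypothetical realization of S26).
    out = fluorescence.astype(float).copy()
    peaks = [float(out[m].max()) for m in masks]
    target = max(peaks)
    for mask, peak in zip(masks, peaks):
        if peak > 0.0:
            out[mask] *= target / peak
    return out
```

After this adjustment the first and second phosphors appear with comparable brightness regardless of their depth difference, which is the precondition for the observation image generated in S27.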
 そして観察画像生成部44によって、可視光画像100及び蛍光画像101(特に明るさ調整後の蛍光画像101)から観察画像103が生成される(S27)。 Then, the observation image generation unit 44 generates an observation image 103 from the visible light image 100 and the fluorescence image 101 (especially the fluorescence image 101 after brightness adjustment) (S27).
 例えば、観察画像生成部44は、複数の蛍光体202間における相対的な明るさの調整が行われた後の蛍光画像101の複数の蛍光体202に対応する部分を、可視光画像100に重畳して観察画像103を生成することができる。或いは、観察画像生成部44は、蛍光画像101から各蛍光体202の範囲を取得し、各蛍光体202の範囲に対応する可視光画像100中の部分を強調することで、観察画像103を生成することができる。 For example, the observation image generation unit 44 can generate the observation image 103 by superimposing, on the visible light image 100, the portions of the fluorescence image 101 corresponding to the plurality of phosphors 202 after the relative brightness among the plurality of phosphors 202 has been adjusted. Alternatively, the observation image generation unit 44 can generate the observation image 103 by acquiring the range of each phosphor 202 from the fluorescence image 101 and emphasizing the portions of the visible light image 100 corresponding to the ranges of the phosphors 202.
 図19は、第3実施形態に係る観察画像103の一例を示す図である。 FIG. 19 is a diagram showing an example of an observation image 103 according to the third embodiment.
 図19に示す観察画像103では、第1蛍光体202a及び第2蛍光体202bの相対深さ位置に応じて、第1蛍光体202a及び第2蛍光体202bの塗りつぶし色が変わる。第1蛍光体202aの深さ位置と第2蛍光体202bの深さ位置とが近いほど、図19に示すバー状指標表示の中央に近い色で、第1蛍光体202a及び第2蛍光体202bは塗りつぶされる。一方、第1蛍光体202aの深さ位置と第2蛍光体202bの深さ位置とが遠いほど、図19に示すバー状指標表示の両端に近い色で、第1蛍光体202a及び第2蛍光体202bは塗りつぶされる。 In the observation image 103 shown in FIG. 19, the fill colors of the first phosphor 202a and the second phosphor 202b change according to their relative depth positions. The closer the depth position of the first phosphor 202a is to that of the second phosphor 202b, the closer the colors filling the first phosphor 202a and the second phosphor 202b are to the center of the bar-shaped index display shown in FIG. 19. Conversely, the farther the depth position of the first phosphor 202a is from that of the second phosphor 202b, the closer the colors filling the first phosphor 202a and the second phosphor 202b are to the two ends of the bar-shaped index display shown in FIG. 19.
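The bar-shaped index behavior described above can be sketched as a mapping from the depth difference to two positions on the bar: equal depths place both fills at the center, and a large difference pushes them toward the two ends. The normalization constant `d_range` and the symmetric linear mapping are assumptions of this sketch.

```python
def bar_index_positions(d1, d2, d_range):
    # d1, d2:  depth positions of the first and second phosphors
    # d_range: assumed depth difference at which the fills reach the ends
    # Returns the two fill positions on the bar-shaped index in [0, 1],
    # symmetric about the center (0.5).
    t = min(abs(d1 - d2) / d_range, 1.0) / 2.0
    return 0.5 - t, 0.5 + t
```

Each returned position can then be looked up in the bar's color gradient to obtain the fill color for the corresponding phosphor.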
 なお図19に示す例では、観察画像103中の各蛍光体202の表示濃度によって相対深さ位置が表されているが、他の表示状態(例えば濃淡や明るさ等)に基づいて蛍光体202間の相対深さ位置が観察画像103中に表されてもよい。 In the example shown in FIG. 19, the relative depth positions are represented by the display density of each phosphor 202 in the observation image 103; however, the relative depth positions between the phosphors 202 may instead be represented in the observation image 103 by other display states (for example, shading, brightness, or the like).
 以上説明したように本実施形態によれば、複数の蛍光体202が生体組織200に含まれる場合に、観察者は、観察画像103に表された各蛍光体202の深さ位置情報から、複数の蛍光体202間の相対的な深さ位置の関係を把握することができる。 As described above, according to the present embodiment, when a plurality of phosphors 202 are contained in the biological tissue 200, the observer can grasp the relative depth position relationship among the plurality of phosphors 202 from the depth position information of each phosphor 202 represented in the observation image 103.
[応用例]
 以下に、上述の医療用観察システム10、医療用画像処理装置12及び医療用画像処理方法を応用可能な顕微鏡システムの一例について説明する。なお、上述の医療用観察システム10、医療用画像処理装置12及び医療用画像処理方法は、下述の顕微鏡システム以外の任意のシステム、装置及び方法等に対しても応用可能である。
[Application example]
An example of a microscope system to which the above-described medical observation system 10, medical image processing apparatus 12, and medical image processing method can be applied will be described below. The medical observation system 10, the medical image processing apparatus 12, and the medical image processing method described above can also be applied to any system, apparatus, method, etc. other than the microscope system described below.
 本開示の顕微鏡システムの構成例を図20に示す。図20に示される顕微鏡システム5000は、顕微鏡装置5100、制御部5110、及び情報処理部5120を含む。顕微鏡装置5100は、光照射部5101、光学部5102、及び信号取得部5103を備えている。顕微鏡装置5100はさらに、生体由来試料Sが配置される試料載置部5104を備えていてよい。なお、顕微鏡装置の構成は図20に示されるものに限定されず、例えば、光照射部5101は、顕微鏡装置5100の外部に存在してもよく、例えば顕微鏡装置5100に含まれない光源が光照射部5101として利用されてもよい。また、光照射部5101は、光照射部5101と光学部5102とによって試料載置部5104が挟まれるように配置されていてよく、例えば、光学部5102が存在する側に配置されてもよい。顕微鏡装置5100は、明視野観察、位相差観察、微分干渉観察、偏光観察、蛍光観察、及び暗視野観察のうちの1又は2以上で構成されてよい。 A configuration example of the microscope system of the present disclosure is shown in FIG. 20. A microscope system 5000 shown in FIG. 20 includes a microscope device 5100, a control section 5110, and an information processing section 5120. The microscope device 5100 includes a light irradiation section 5101, an optical section 5102, and a signal acquisition section 5103. The microscope device 5100 may further include a sample placement section 5104 on which the biological sample S is placed. The configuration of the microscope device is not limited to that shown in FIG. 20; for example, the light irradiation section 5101 may exist outside the microscope device 5100, and a light source not included in the microscope device 5100 may be used as the light irradiation section 5101. The light irradiation section 5101 may be arranged such that the sample placement section 5104 is sandwiched between the light irradiation section 5101 and the optical section 5102, or may be arranged, for example, on the side where the optical section 5102 is located. The microscope device 5100 may be configured for one or more of bright field observation, phase contrast observation, differential interference observation, polarization observation, fluorescence observation, and dark field observation.
 顕微鏡システム5000は、いわゆるWSI(Whole Slide Imaging)システム又はデジタルパソロジーシステムとして構成されてよく、病理診断のために用いられうる。また、顕微鏡システム5000は、蛍光イメージングシステム、特には多重蛍光イメージングシステムとして構成されてもよい。 The microscope system 5000 may be configured as a so-called WSI (Whole Slide Imaging) system or digital pathology system, and can be used for pathological diagnosis. Microscope system 5000 may also be configured as a fluorescence imaging system, in particular a multiplex fluorescence imaging system.
 例えば、顕微鏡システム5000は、術中病理診断又は遠隔病理診断を行うために用いられてよい。当該術中病理診断では、手術が行われている間に、顕微鏡装置5100が、当該手術の対象者から取得された生体由来試料Sのデータを取得し、そして、当該データを情報処理部5120へと送信しうる。当該遠隔病理診断では、顕微鏡装置5100は、取得した生体由来試料Sのデータを、顕微鏡装置5100とは離れた場所(別の部屋又は建物など)に存在する情報処理装置5120へと送信しうる。そして、これらの診断において、情報処理装置5120は、当該データを受信し、出力する。出力されたデータに基づき、情報処理装置5120のユーザが、病理診断を行いうる。 For example, the microscope system 5000 may be used to perform intraoperative pathological diagnosis or remote pathological diagnosis. In intraoperative pathological diagnosis, while surgery is being performed, the microscope device 5100 can acquire data of the biological sample S obtained from the subject of the surgery and transmit the data to the information processing section 5120. In remote pathological diagnosis, the microscope device 5100 can transmit the acquired data of the biological sample S to the information processing device 5120 located at a place (another room, building, or the like) away from the microscope device 5100. In these diagnoses, the information processing device 5120 receives and outputs the data. A user of the information processing device 5120 can make a pathological diagnosis based on the output data.
(生体由来試料)
 生体由来試料Sは、生体成分を含む試料であってよい。前記生体成分は、生体の組織、細胞、生体の液状成分(血液や尿等)、培養物、又は生細胞(心筋細胞、神経細胞、及び受精卵など)であってよい。
 前記生体由来試料は、固形物であってよく、パラフィンなどの固定試薬によって固定された標本又は凍結により形成された固形物であってよい。前記生体由来試料は、当該固形物の切片でありうる。前記生体由来試料の具体的な例として、生検試料の切片を挙げることができる。
(Biological sample)
The biological sample S may be a sample containing a biological component. The biological components may be tissues, cells, liquid components of a living body (blood, urine, etc.), cultures, or living cells (cardiomyocytes, nerve cells, fertilized eggs, etc.).
The biological sample may be a solid, a specimen fixed with a fixative such as paraffin, or a solid formed by freezing. The biological sample can be a section of the solid. A specific example of the biological sample is a section of a biopsy sample.
 前記生体由来試料は、染色又は標識などの処理が施されたものであってよい。当該処理は、生体成分の形態を示すための又は生体成分が有する物質(表面抗原など)を示すための染色であってよく、HE(Hematoxylin-Eosin)染色、免疫組織化学(Immunohistochemistry)染色を挙げることができる。前記生体由来試料は、1又は2以上の試薬により前記処理が施されたものであってよく、当該試薬は、蛍光色素、発色試薬、蛍光タンパク質、又は蛍光標識抗体でありうる。 The biological sample may be one that has undergone processing such as staining or labeling. The processing may be staining for indicating the morphology of biological components or for indicating substances (surface antigens, etc.) possessed by biological components; examples include HE (Hematoxylin-Eosin) staining and immunohistochemistry staining. The biological sample may be one that has been subjected to the processing with one or more reagents, and the reagents may be fluorescent dyes, chromogenic reagents, fluorescent proteins, or fluorescently labeled antibodies.
 前記標本は、人体から採取された検体または組織サンプルから病理診断または臨床検査などを目的に作製されたものであってよい。また、前記標本は、人体に限らず、動物、植物、又は他の材料に由来するものであってもよい。前記標本は、使用される組織(例えば臓器または細胞など)の種類、対象となる疾病の種類、対象者の属性(例えば、年齢、性別、血液型、または人種など)、または対象者の生活習慣(例えば、食生活、運動習慣、または喫煙習慣など)などにより性質が異なる。前記標本は、各標本それぞれ識別可能な識別情報(バーコード情報又はQRコード(商標)情報等)を付されて管理されてよい。 The specimen may be one prepared from a sample or tissue collected from a human body for the purpose of pathological diagnosis, clinical examination, or the like. The specimen is not limited to the human body, and may be derived from an animal, a plant, or another material. The properties of the specimen differ depending on the type of tissue used (for example, an organ or cells), the type of target disease, the attributes of the subject (for example, age, sex, blood type, or race), or the lifestyle of the subject (for example, eating habits, exercise habits, or smoking habits). The specimens may be managed with identification information (barcode information, QR code (trademark) information, or the like) that allows each specimen to be identified.
(光照射部)
 光照射部5101は、生体由来試料Sを照明するための光源、および光源から照射された光を標本に導く光学部である。光源は、可視光、紫外光、若しくは赤外光、又はこれらの組合せを生体由来試料に照射しうる。光源は、ハロゲンランプ、レーザ光源、LEDランプ、水銀ランプ、及びキセノンランプのうちの1又は2以上であってよい。蛍光観察における光源の種類及び/又は波長は、複数でもよく、当業者により適宜選択されてよい。光照射部は、透過型、反射型又は落射型(同軸落射型若しくは側射型)の構成を有しうる。
(light irradiation part)
The light irradiation unit 5101 is a light source for illuminating the biological sample S and an optical unit for guiding the light irradiated from the light source to the specimen. The light source may irradiate the biological sample with visible light, ultraviolet light, or infrared light, or a combination thereof. The light source may be one or more of halogen lamps, laser light sources, LED lamps, mercury lamps, and xenon lamps. A plurality of types and/or wavelengths of light sources may be used in fluorescence observation, and may be appropriately selected by those skilled in the art. The light irradiator may have a transmissive, reflective, or episcopic (coaxial or lateral) configuration.
(光学部)
 光学部5102は、生体由来試料Sからの光を信号取得部5103へと導くように構成される。光学部は、顕微鏡装置5100が生体由来試料Sを観察又は撮像することを可能とするように構成されうる。
 光学部5102は、対物レンズを含みうる。対物レンズの種類は、観察方式に応じて当業者により適宜選択されてよい。また、光学部は、対物レンズによって拡大された像を信号取得部に中継するためのリレーレンズを含んでもよい。光学部は、前記対物レンズ及び前記リレーレンズ以外の光学部品、接眼レンズ、位相板、及びコンデンサレンズなど、をさらに含みうる。
 また、光学部5102は、生体由来試料Sからの光のうちから所定の波長を有する光を分離するように構成された波長分離部をさらに含んでよい。波長分離部は、所定の波長又は波長範囲の光を選択的に信号取得部に到達させるように構成されうる。波長分離部は、例えば、光を選択的に透過させるフィルタ、偏光板、プリズム(ウォラストンプリズム)、及び回折格子のうちの1又は2以上を含んでよい。波長分離部に含まれる光学部品は、例えば対物レンズから信号取得部までの光路上に配置されてよい。波長分離部は、蛍光観察が行われる場合、特に励起光照射部を含む場合に、顕微鏡装置内に備えられる。波長分離部は、蛍光同士を互いに分離し又は白色光と蛍光とを分離するように構成されうる。
(Optical part)
The optical section 5102 is configured to guide the light from the biological sample S to the signal acquisition section 5103 . The optical section can be configured to allow the microscope device 5100 to observe or image the biological sample S.
The optical section 5102 may include an objective lens. The type of objective lens may be appropriately selected by those skilled in the art according to the observation method. The optical section may also include a relay lens for relaying the image magnified by the objective lens to the signal acquisition section. The optical section may further include optical components other than the objective lens and the relay lens, such as an eyepiece lens, a phase plate, and a condenser lens.
In addition, the optical section 5102 may further include a wavelength separation section configured to separate light having a predetermined wavelength from the light from the biological sample S. The wavelength separation section can be configured to selectively allow light of a predetermined wavelength or wavelength range to reach the signal acquisition section. The wavelength separator may include, for example, one or more of a filter that selectively transmits light, a polarizing plate, a prism (Wollaston prism), and a diffraction grating. The optical components included in the wavelength separation section may be arranged, for example, on the optical path from the objective lens to the signal acquisition section. The wavelength separation unit is provided in the microscope apparatus when fluorescence observation is performed, particularly when an excitation light irradiation unit is included. The wavelength separator may be configured to separate fluorescent light from each other or white light and fluorescent light.
(信号取得部)
 信号取得部5103は、生体由来試料Sからの光を受光し、当該光を電気信号、特にはデジタル電気信号へと変換することができるように構成されうる。信号取得部は、当該電気信号に基づき、生体由来試料Sに関するデータを取得することができるように構成されてよい。信号取得部は、生体由来試料Sの像(画像、特には静止画像、タイムラプス画像、又は動画像)のデータを取得することができるように構成されてよく、特に光学部によって拡大された画像のデータを取得するように構成されうる。信号取得部は、1次元又は2次元に並んで配列された複数の画素を備えている1つ又は複数の撮像素子、CMOS又はCCDなど、を含む。信号取得部は、低解像度画像取得用の撮像素子と高解像度画像取得用の撮像素子とを含んでよく、又は、AFなどのためのセンシング用撮像素子と観察などのための画像出力用撮像素子とを含んでもよい。撮像素子は、前記複数の画素に加え、各画素からの画素信号を用いた信号処理を行う信号処理部(CPU、DSP、及びメモリのうちの1つ、2つ、又は3つを含む)、及び、画素信号から生成された画像データ及び信号処理部により生成された処理データの出力の制御を行う出力制御部を含みうる。更には、撮像素子は、入射光を光電変換する画素の輝度変化が所定の閾値を超えたことをイベントとして検出する非同期型のイベント検出センサを含み得る。前記複数の画素、前記信号処理部、及び前記出力制御部を含む撮像素子は、好ましくは1チップの半導体装置として構成されうる。
(Signal acquisition part)
The signal acquisition section 5103 can be configured to receive light from the biological sample S and convert the light into an electrical signal, in particular a digital electrical signal. The signal acquisition section may be configured to acquire data on the biological sample S based on the electrical signal. The signal acquisition section may be configured to acquire data of an image of the biological sample S (in particular a still image, a time-lapse image, or a moving image), and in particular may be configured to acquire data of an image magnified by the optical section. The signal acquisition section includes one or more imaging elements, such as CMOS or CCD sensors, each having a plurality of pixels arranged one-dimensionally or two-dimensionally. The signal acquisition section may include an imaging element for acquiring low-resolution images and an imaging element for acquiring high-resolution images, or may include an imaging element for sensing (for example, for AF) and an imaging element for image output (for example, for observation). In addition to the plurality of pixels, the imaging element may include a signal processing section (including one, two, or three of a CPU, a DSP, and a memory) that performs signal processing using pixel signals from each pixel, and an output control section that controls output of image data generated from the pixel signals and of processed data generated by the signal processing section. Furthermore, the imaging element may include an asynchronous event detection sensor that detects, as an event, that a change in the luminance of a pixel that photoelectrically converts incident light has exceeded a predetermined threshold. The imaging element including the plurality of pixels, the signal processing section, and the output control section can preferably be configured as a one-chip semiconductor device.
(制御部)
 制御部5110は、顕微鏡装置5100による撮像を制御する。制御部は、撮像制御のために、光学部5102及び/又は試料載置部5104の移動を駆動して、光学部と試料載置部との間の位置関係を調節しうる。制御部5110は、光学部及び/又は試料載置部を、互いに近づく又は離れる方向(例えば対物レンズの光軸方向)に移動させうる。また、制御部は、光学部及び/又は試料載置部を、前記光軸方向と垂直な面におけるいずれかの方向に移動させてもよい。制御部は、撮像制御のために、光照射部5101及び/又は信号取得部5103を制御してもよい。
(control part)
The control unit 5110 controls imaging by the microscope device 5100 . For imaging control, the control unit can drive the movement of the optical unit 5102 and/or the sample placement unit 5104 to adjust the positional relationship between the optical unit and the sample placement unit. The control unit 5110 can move the optical unit and/or the sample placement unit in a direction toward or away from each other (for example, the optical axis direction of the objective lens). Further, the control section may move the optical section and/or the sample placement section in any direction on a plane perpendicular to the optical axis direction. The control unit may control the light irradiation unit 5101 and/or the signal acquisition unit 5103 for imaging control.
(試料載置部)
 試料載置部5104は、生体由来試料の試料載置部上における位置が固定できるように構成されてよく、いわゆるステージであってよい。試料載置部5104は、生体由来試料の位置を、対物レンズの光軸方向及び/又は当該光軸方向と垂直な方向に移動させることができるように構成されうる。
(Sample placement section)
The sample mounting section 5104 may be configured such that the position of the biological sample on the sample mounting section can be fixed, and may be a so-called stage. The sample mounting section 5104 can be configured to move the position of the biological sample in the optical axis direction of the objective lens and/or in a direction perpendicular to the optical axis direction.
(情報処理部)
 情報処理部5120は、顕微鏡装置5100が取得したデータ(撮像データなど)を、顕微鏡装置5100から取得しうる。情報処理部は、撮像データに対する画像処理を実行しうる。当該画像処理は、色分離処理を含んでよい。当該色分離処理は、撮像データから所定の波長又は波長範囲の光成分のデータを抽出して画像データを生成する処理、又は、撮像データから所定の波長又は波長範囲の光成分のデータを除去する処理などを含みうる。また、当該画像処理は、組織切片の自家蛍光成分と色素成分を分離する自家蛍光分離処理や互いに蛍光波長が異なる色素間の波長を分離する蛍光分離処理を含みうる。前記自家蛍光分離処理では、同一ないし性質が類似する前記複数の標本のうち、一方から抽出された自家蛍光シグナルを用いて他方の標本の画像情報から自家蛍光成分を除去する処理を行ってもよい。
 情報処理部5120は、制御部5110に撮像制御のためのデータを送信してよく、当該データを受信した制御部5110が、当該データに従い顕微鏡装置5100による撮像を制御してもよい。
(Information processing department)
The information processing section 5120 can acquire data (imaging data and the like) acquired by the microscope device 5100 from the microscope device 5100. The information processing section can perform image processing on the imaging data. The image processing may include color separation processing. The color separation processing can include processing of extracting data of light components of a predetermined wavelength or wavelength range from the imaging data to generate image data, processing of removing data of light components of a predetermined wavelength or wavelength range from the imaging data, and the like. The image processing may also include autofluorescence separation processing for separating the autofluorescence component and the dye component of a tissue section, and fluorescence separation processing for separating the wavelengths of dyes having mutually different fluorescence wavelengths. The autofluorescence separation processing may include processing of using an autofluorescence signal extracted from one of a plurality of specimens having identical or similar properties to remove the autofluorescence component from the image information of another specimen.
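One common way to realize color separation of the kind described above is least-squares linear unmixing: each measured pixel spectrum is expressed as a mixture of known reference spectra (dyes and/or an autofluorescence component) and the estimated contributions are returned. This is a generic technique offered as a sketch; the patent does not specify this particular algorithm.

```python
import numpy as np

def linear_unmix(pixel_spectra, endmember_spectra):
    # Least-squares color separation.
    # pixel_spectra:     N x B array (B spectral bands per pixel)
    # endmember_spectra: K x B array (one reference spectrum per component,
    #                    e.g. each dye and the autofluorescence component)
    # Returns an N x K array of estimated abundances per pixel.
    A = np.asarray(endmember_spectra, dtype=float).T   # B x K design matrix
    Y = np.asarray(pixel_spectra, dtype=float).T       # B x N observations
    coeffs, *_ = np.linalg.lstsq(A, Y, rcond=None)
    return coeffs.T
```

Removing the column corresponding to the autofluorescence component from the reconstruction then yields an image with the autofluorescence contribution suppressed, in the spirit of the autofluorescence separation processing described above.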
The information processing section 5120 may transmit data for imaging control to the control section 5110, and the control section 5110 receiving the data may control imaging by the microscope apparatus 5100 according to the data.
 情報処理部5120は、汎用のコンピュータなどの情報処理装置として構成されてよく、CPU、RAM、及びROMを備えていてよい。情報処理部は、顕微鏡装置5100の筐体内に含まれていてよく、又は、当該筐体の外にあってもよい。また、情報処理部による各種処理又は機能は、ネットワークを介して接続されたサーバコンピュータ又はクラウドにより実現されてもよい。 The information processing section 5120 may be configured as an information processing device such as a general-purpose computer, and may include a CPU, RAM, and ROM. The information processing section may be included in the housing of the microscope device 5100 or may be outside the housing. Also, various processes or functions by the information processing unit may be realized by a server computer or cloud connected via a network.
 顕微鏡装置5100による生体由来試料Sの撮像の方式は、生体由来試料の種類及び撮像の目的などに応じて、当業者により適宜選択されてよい。当該撮像方式の例を以下に説明する。 A method of imaging the biological sample S by the microscope device 5100 may be appropriately selected by a person skilled in the art according to the type of the biological sample and the purpose of imaging. An example of the imaging method will be described below.
 撮像方式の一つの例は以下のとおりである。顕微鏡装置は、まず、撮像対象領域を特定しうる。当該撮像対象領域は、生体由来試料が存在する領域全体をカバーするように特定されてよく、又は、生体由来試料のうちの目的部分(目的組織切片、目的細胞、又は目的病変部が存在する部分)をカバーするように特定されてもよい。次に、顕微鏡装置は、当該撮像対象領域を、所定サイズの複数の分割領域へと分割し、顕微鏡装置は各分割領域を順次撮像する。これにより、各分割領域の画像が取得される。
 図21に示されるように、顕微鏡装置は、生体由来試料S全体をカバーする撮像対象領域Rを特定する。そして、顕微鏡装置は、撮像対象領域Rを16の分割領域へと分割する。そして、顕微鏡装置は分割領域R1の撮像を行い、そして次に、その分割領域R1に隣接する領域など、撮像対象領域Rに含まれる領域の内いずれか領域を撮像しうる。そして、未撮像の分割領域がなくなるまで、分割領域の撮像が行われる。なお、撮像対象領域R以外の領域についても、分割領域の撮像画像情報に基づき、撮像しても良い。
 或る分割領域を撮像した後に次の分割領域を撮像するために、顕微鏡装置と試料載置部との位置関係が調整される。当該調整は、顕微鏡装置の移動、試料載置部の移動、又は、これらの両方の移動により行われてよい。この例において、各分割領域の撮像を行う撮像装置は、2次元撮像素子(エリアセンサ)又は1次元撮像素子(ラインセンサ)であってよい。信号取得部は、光学部を介して各分割領域を撮像してよい。また、各分割領域の撮像は、顕微鏡装置及び/又は試料載置部を移動させながら連続的に行われてよく、又は、各分割領域の撮像に際して顕微鏡装置及び/又は試料載置部の移動が停止されてもよい。各分割領域の一部が重なり合うように、前記撮像対象領域の分割が行われてよく、又は、重なり合わないように前記撮像対象領域の分割が行われてもよい。各分割領域は、焦点距離及び/又は露光時間などの撮像条件を変えて複数回撮像されてもよい。
 また、情報処理装置は、隣り合う複数の分割領域が合成して、より広い領域の画像データを生成しうる。当該合成処理を、撮像対象領域全体にわたって行うことで、撮像対象領域について、より広い領域の画像を取得することができる。また、分割領域の画像、または合成処理を行った画像から、より解像度の低い画像データを生成しうる。
One example of an imaging scheme is as follows. The microscope device may first identify an imaging target region. The imaging target region may be specified so as to cover the entire region where the biological sample exists, or so as to cover a target portion of the biological sample (a portion where a target tissue section, target cell, or target lesion exists). Next, the microscope device divides the imaging target region into a plurality of divided regions of a predetermined size and sequentially images each divided region. As a result, an image of each divided region is obtained.
As shown in FIG. 21, the microscope device identifies an imaging target region R that covers the entire biological sample S. The microscope device then divides the imaging target region R into 16 divided regions. The microscope device images the divided region R1 first, and may then image any other region included in the imaging target region R, such as a region adjacent to the divided region R1. Imaging of divided regions is repeated until no unimaged divided region remains. Regions other than the imaging target region R may also be imaged based on the captured-image information of the divided regions.
After one divided region is imaged, the positional relationship between the microscope device and the sample placement section is adjusted so that the next divided region can be imaged. The adjustment may be performed by moving the microscope device, moving the sample placement section, or moving both. In this example, the imaging device that captures each divided region may be a two-dimensional image sensor (area sensor) or a one-dimensional image sensor (line sensor). The signal acquisition section may image each divided region via the optical section. The imaging of the divided regions may be performed continuously while the microscope device and/or the sample placement section is moved, or the movement of the microscope device and/or the sample placement section may be stopped each time a divided region is imaged. The imaging target region may be divided so that the divided regions partially overlap one another, or so that they do not overlap. Each divided region may be imaged multiple times under different imaging conditions, such as focal length and/or exposure time.
Further, the information processing device can generate image data of a wider area by combining the images of a plurality of adjacent divided regions. By performing this combining process over the entire imaging target region, an image of a wider area of the imaging target region can be obtained. Image data with a lower resolution can also be generated from the images of the divided regions or from the combined image.
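The tiling-and-synthesis procedure described above can be sketched in a few lines. The sketch below is purely illustrative (the region size, tile size, raster order, and constant-valued stand-in images are assumptions, not part of the disclosure): the imaging target region is divided into divided regions of a predetermined size, each divided region is "imaged" in turn, the results are combined into a wider-area image, and a lower-resolution image is derived by downsampling.

```python
import numpy as np

def split_into_tiles(region_h, region_w, tile_h, tile_w):
    """Divide the imaging target region into top-left coordinates of
    divided regions of a predetermined size, in raster order."""
    return [(y0, x0)
            for y0 in range(0, region_h, tile_h)
            for x0 in range(0, region_w, tile_w)]

def stitch_tiles(tiles, tile_images, region_h, region_w):
    """Combine the individually captured divided-region images into one
    wider-area image (the synthesis step)."""
    canvas = np.zeros((region_h, region_w), dtype=np.float32)
    for (y0, x0), img in zip(tiles, tile_images):
        h, w = img.shape
        canvas[y0:y0 + h, x0:x0 + w] = img
    return canvas

# A 32x32 region split into 16 divided regions of 8x8, as in FIG. 21.
tiles = split_into_tiles(32, 32, 8, 8)
# Stand-in for sequential imaging: tile i is a constant image of value i.
images = [np.full((8, 8), i, dtype=np.float32) for i in range(len(tiles))]
wide = stitch_tiles(tiles, images, 32, 32)
low = wide[::2, ::2]   # lower-resolution image derived from the synthesis
```

Overlapping divided regions would only require blending in `stitch_tiles`; the non-overlapping split is the simplest case the text permits.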
 撮像方式の他の例は以下のとおりである。顕微鏡装置は、まず、撮像対象領域を特定しうる。当該撮像対象領域は、生体由来試料が存在する領域全体をカバーするように特定されてよく、又は、生体由来試料のうちの目的部分(目的組織切片又は目的細胞が存在する部分)をカバーするように特定されてもよい。次に、顕微鏡装置は、撮像対象領域の一部の領域(「分割スキャン領域」ともいう)を、光軸と垂直な面内における一つの方向(「スキャン方向」ともいう)へスキャンして撮像する。当該分割スキャン領域のスキャンが完了したら、次に、前記スキャン領域の隣の分割スキャン領域を、スキャンする。これらのスキャン動作が、撮像対象領域全体が撮像されるまで繰り返される。
 図22に示されるように、顕微鏡装置は、生体由来試料Sのうち、組織切片が存在する領域(グレーの部分)を撮像対象領域Saとして特定する。そして、顕微鏡装置は、撮像対象領域Saのうち、分割スキャン領域Rsを、Y軸方向へスキャンする。顕微鏡装置は、分割スキャン領域Rsのスキャンが完了したら、次に、X軸方向における隣の分割スキャン領域をスキャンする。撮像対象領域Saの全てについてスキャンが完了するまで、この動作が繰り返しされる。
 各分割スキャン領域のスキャンのために、及び、或る分割スキャン領域を撮像した後に次の分割スキャン領域を撮像するために、顕微鏡装置と試料載置部との位置関係が調整される。当該調整は、顕微鏡装置の移動、試料載置部の移動、又は、これらの両方の移動により行われてよい。この例において、各分割スキャン領域の撮像を行う撮像装置は、1次元撮像素子(ラインセンサ)又は2次元撮像素子(エリアセンサ)であってよい。信号取得部は、拡大光学系を介して各分割領域を撮像してよい。また、各分割スキャン領域の撮像は、顕微鏡装置及び/又は試料載置部を移動させながら連続的に行われてよい。各分割スキャン領域の一部が重なり合うように、前記撮像対象領域の分割が行われてよく、又は、重なり合わないように前記撮像対象領域の分割が行われてもよい。各分割スキャン領域は、焦点距離及び/又は露光時間などの撮像条件を変えて複数回撮像されてもよい。
 また、情報処理装置は、隣り合う複数の分割スキャン領域が合成して、より広い領域の画像データを生成しうる。当該合成処理を、撮像対象領域全体にわたって行うことで、撮像対象領域について、より広い領域の画像を取得することができる。また、分割スキャン領域の画像、または合成処理を行った画像から、より解像度の低い画像データを生成しうる。
Another example of an imaging scheme is as follows. The microscope device may first identify an imaging target region. The imaging target region may be specified so as to cover the entire region where the biological sample exists, or so as to cover a target portion of the biological sample (a portion where a target tissue section or target cell exists). Next, the microscope device scans a part of the imaging target region (also referred to as a "divided scan area") in one direction (also referred to as the "scanning direction") within a plane perpendicular to the optical axis to capture an image. After the scan of one divided scan area is completed, the divided scan area adjacent to it is scanned next. These scanning operations are repeated until the entire imaging target region has been imaged.
As shown in FIG. 22, the microscope device identifies the region of the biological sample S in which the tissue section exists (the gray portion) as the imaging target region Sa. The microscope device then scans a divided scan area Rs of the imaging target region Sa in the Y-axis direction. After completing the scan of the divided scan area Rs, the microscope device scans the adjacent divided scan area in the X-axis direction. This operation is repeated until scanning is completed for the entire imaging target region Sa.
The positional relationship between the microscope device and the sample placement section is adjusted for scanning each divided scan area, and for imaging the next divided scan area after one divided scan area has been imaged. The adjustment may be performed by moving the microscope device, moving the sample placement section, or moving both. In this example, the imaging device that captures each divided scan area may be a one-dimensional image sensor (line sensor) or a two-dimensional image sensor (area sensor). The signal acquisition section may image each divided scan area via a magnifying optical system. The imaging of each divided scan area may be performed continuously while the microscope device and/or the sample placement section is moved. The imaging target region may be divided so that the divided scan areas partially overlap one another, or so that they do not overlap. Each divided scan area may be imaged multiple times under different imaging conditions, such as focal length and/or exposure time.
Further, the information processing device can generate image data of a wider area by combining the images of a plurality of adjacent divided scan areas. By performing this combining process over the entire imaging target region, an image of a wider area of the imaging target region can be obtained. Image data with a lower resolution can also be generated from the images of the divided scan areas or from the combined image.
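The strip-scan variant can be sketched similarly. Everything below is a hypothetical model (the strip width, scan order, and the use of an in-memory array as the "sample" are arbitrary illustrative choices): each divided scan area is acquired one scan line at a time along the Y axis, the adjacent strip along the X axis follows, and the strips are finally combined into the full image.

```python
import numpy as np

def scan_strips(sample, strip_w):
    """Acquire the imaging target region as vertical strips: each divided
    scan area is read one scan line at a time along the Y axis (line-sensor
    model), then the adjacent strip along the X axis follows."""
    h, w = sample.shape
    strips = []
    for x0 in range(0, w, strip_w):
        x1 = min(x0 + strip_w, w)
        strip = np.empty((h, x1 - x0), dtype=sample.dtype)
        for y in range(h):              # one scan line per step
            strip[y, :] = sample[y, x0:x1]
        strips.append(strip)
    return strips

sample = np.arange(6 * 9, dtype=np.float32).reshape(6, 9)
strips = scan_strips(sample, 3)                 # three divided scan areas
merged = np.concatenate(strips, axis=1)         # synthesis into the full image
```

With a real line sensor, the inner loop would be driven by stage motion rather than array indexing, but the data flow is the same.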
 本明細書で開示されている実施形態及び変形例はすべての点で例示に過ぎず限定的には解釈されないことに留意されるべきである。上述の実施形態及び変形例は、添付の特許請求の範囲及びその趣旨を逸脱することなく、様々な形態での省略、置換及び変更が可能である。例えば上述の実施形態及び変形例が全体的に又は部分的に組み合わされてもよく、また上述以外の実施形態が上述の実施形態又は変形例と組み合わされてもよい。また、本明細書に記載された本開示の効果は例示に過ぎず、その他の効果がもたらされてもよい。 It should be noted that the embodiments and modifications disclosed in this specification are merely illustrative in all respects and should not be construed as limiting. The embodiments and variations described above can be omitted, substituted, and modified in various ways without departing from the scope and spirit of the appended claims. For example, the above-described embodiments and modifications may be wholly or partially combined, and embodiments other than those described above may be combined with the above-described embodiments or modifications. Also, the advantages of the disclosure described herein are merely exemplary, and other advantages may be achieved.
 上述の技術的思想を具現化する技術的カテゴリーは限定されない。例えば上述の装置を製造する方法或いは使用する方法に含まれる1又は複数の手順(ステップ)をコンピュータに実行させるためのコンピュータプログラムによって、上述の技術的思想が具現化されてもよい。またそのようなコンピュータプログラムが記録されたコンピュータが読み取り可能な非一時的(non-transitory)な記録媒体によって、上述の技術的思想が具現化されてもよい。 The technical categories that embody the above technical ideas are not limited. For example, the above technical ideas may be embodied by a computer program for causing a computer to execute one or more procedures (steps) included in the method of manufacturing or using the above apparatus. Also, the above technical idea may be embodied by a computer-readable non-transitory recording medium in which such a computer program is recorded.
 本開示は以下の構成を取ることもできる。 The present disclosure can also take the following configuration.
[項目1]
 蛍光体を含む生体組織に励起光を照射しつつ前記生体組織を撮影することで得られる蛍光画像を取得する画像取得部と、
 前記蛍光画像に基づいて、前記蛍光体の深さ位置に関連する深さ位置情報を取得する深さ位置情報取得部と、を備え、
 前記深さ位置情報取得部は、
 前記蛍光画像を解析して、前記蛍光画像における前記蛍光体の像強度分布を示す拡がり情報を取得し、
 前記生体組織における像強度分布を表す拡がり関数に対し、前記拡がり情報を照らし合わせることで、前記深さ位置情報を取得する、医療用画像処理装置。
[Item 1]
an image acquisition unit that acquires a fluorescence image obtained by photographing a biological tissue containing a fluorescent substance while irradiating the biological tissue with excitation light;
a depth position information acquisition unit that acquires depth position information related to the depth position of the phosphor based on the fluorescence image;
The depth position information acquisition unit,
analyzing the fluorescence image to acquire spread information indicating the image intensity distribution of the phosphor in the fluorescence image;
A medical image processing apparatus for obtaining the depth position information by comparing the spread information with the spread function representing the image intensity distribution in the living tissue.
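As a rough illustration of Item 1, the depth estimate can be obtained by matching the measured spread of the phosphor image against a depth-parameterized spread function. The code below is a toy model, not the disclosed spread function: it assumes a Gaussian line spread whose width grows linearly with depth through a scattering coefficient `mu_s` (cf. Item 4), and fits by least squares over a set of candidate depths.

```python
import numpy as np

def line_spread(depth_mm, x, mu_s=10.0):
    """Toy line spread function: a Gaussian whose width grows with the
    depth of the phosphor, scaled by a scattering coefficient mu_s.
    (Hypothetical model; the real spread function is tissue-dependent.)"""
    sigma = 0.05 + depth_mm / mu_s
    return np.exp(-x ** 2 / (2.0 * sigma ** 2))

def estimate_depth(profile, x, candidate_depths, mu_s=10.0):
    """Compare the measured spread (normalized intensity profile across
    the phosphor image) against the spread function at each candidate
    depth and return the best match (least squares)."""
    profile = profile / profile.max()
    errors = [np.sum((profile - line_spread(d, x, mu_s)) ** 2)
              for d in candidate_depths]
    return candidate_depths[int(np.argmin(errors))]

x = np.linspace(-2.0, 2.0, 201)
observed = 37.0 * line_spread(1.0, x)          # phosphor at 1.0 mm depth
depths = np.linspace(0.1, 3.0, 30)             # candidate depths in mm
estimated = estimate_depth(observed, x, depths)
```

Item 2's tissue-type estimation would amount to selecting a different `line_spread` (or different `mu_s`) per estimated tissue type before running the same comparison.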
[項目2]
 前記画像取得部は、前記生体組織に可視光を照射しつつ前記生体組織を撮影することで得られる可視光画像を取得し、
 前記深さ位置情報取得部は、
 前記可視光画像を解析することで、前記生体組織の種類を推定し、
 推定された前記生体組織の種類に応じた前記拡がり関数を取得する項目1に記載の医療用画像処理装置。
[Item 2]
The image acquiring unit acquires a visible light image obtained by photographing the living tissue while irradiating the living tissue with visible light,
The depth position information acquisition unit,
estimating the type of the biological tissue by analyzing the visible light image;
The medical image processing apparatus according to item 1, wherein the spread function corresponding to the estimated type of the biological tissue is obtained.
[項目3]
 前記拡がり情報は、前記蛍光画像における前記蛍光体の輝度分布であり、
 前記拡がり関数は、前記蛍光体の輝度分布と、前記深さ位置情報と、に基づく線拡がり関数である項目1又は2に記載の医療用画像処理装置。
[Item 3]
the spread information is the luminance distribution of the phosphor in the fluorescence image;
3. The medical image processing apparatus according to item 1 or 2, wherein the spread function is a line spread function based on the luminance distribution of the phosphor and the depth position information.
[項目4]
 前記拡がり関数は、前記蛍光体の蛍光波長に応じて定まる散乱係数をパラメータとして含み、
 前記深さ位置情報取得部は、
 前記蛍光波長に対応する散乱係数を取得し、
 前記散乱係数が反映された前記拡がり関数と、前記拡がり情報とに基づいて、前記深さ位置情報を取得する項目1~3のいずれかに記載の医療用画像処理装置。
[Item 4]
The spreading function includes as a parameter a scattering coefficient determined according to the fluorescence wavelength of the phosphor,
The depth position information acquisition unit,
obtaining a scattering coefficient corresponding to the fluorescence wavelength;
4. The medical image processing apparatus according to any one of items 1 to 3, wherein the depth position information is acquired based on the spread function reflecting the scattering coefficient and the spread information.
[項目5]
 前記生体組織は、蛍光波長が相互に異なる複数の蛍光体を含み、
 前記画像取得部は、前記複数の蛍光体のそれぞれの前記励起光を前記生体組織に照射して前記生体組織を撮影することで得られる前記蛍光画像を取得し、
 前記深さ位置情報取得部は、前記蛍光画像に基づいて、前記複数の蛍光体のそれぞれの前記深さ位置情報を取得する項目1~4のいずれかに記載の医療用画像処理装置。
[Item 5]
the biological tissue contains a plurality of fluorescent substances having mutually different fluorescent wavelengths,
The image acquisition unit acquires the fluorescence image obtained by irradiating the biological tissue with the excitation light of each of the plurality of fluorophores and photographing the biological tissue,
5. The medical image processing apparatus according to any one of items 1 to 4, wherein the depth position information acquiring unit acquires the depth position information of each of the plurality of phosphors based on the fluorescence image.
[項目6]
 前記深さ位置情報取得部は、前記複数の蛍光体のそれぞれの前記深さ位置情報に基づいて、前記複数の蛍光体間における前記深さ位置の相対関係を示す相対深さ位置情報を取得する項目5に記載の医療用画像処理装置。
[Item 6]
6. The medical image processing apparatus according to item 5, wherein the depth position information acquisition unit acquires relative depth position information indicating a relative relationship of the depth positions among the plurality of phosphors based on the depth position information of each of the plurality of phosphors.
[項目7]
 前記深さ位置情報に応じた鮮鋭化処理を前記蛍光画像に対して行う画質調整部を備える項目1~6のいずれかに記載の医療用画像処理装置。
[Item 7]
7. The medical image processing apparatus according to any one of items 1 to 6, further comprising an image quality adjustment unit that performs sharpening processing on the fluorescence image according to the depth position information.
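Item 7's depth-dependent sharpening can be illustrated with a simple unsharp mask whose gain is scaled by the estimated depth: deeper phosphors are blurred more by scattering, so they receive stronger sharpening. The box blur, the linear gain schedule, and the `gain_per_mm` constant are assumptions made for illustration; the item only requires that the sharpening strength follow the depth position information.

```python
import numpy as np

def box_blur(img, k=3):
    """Small box blur used to build the unsharp mask."""
    pad = k // 2
    p = np.pad(img, pad, mode="edge")
    out = np.zeros_like(img, dtype=np.float32)
    for dy in range(k):
        for dx in range(k):
            out += p[dy:dy + img.shape[0], dx:dx + img.shape[1]]
    return out / (k * k)

def sharpen_by_depth(fluor, depth_mm, gain_per_mm=0.8):
    """Unsharp masking whose gain grows with the estimated depth
    (hypothetical linear schedule)."""
    gain = gain_per_mm * depth_mm
    return fluor + gain * (fluor - box_blur(fluor))

spot = np.zeros((7, 7), dtype=np.float32)
spot[3, 3] = 1.0                       # image of a single phosphor point
shallow = sharpen_by_depth(spot, 0.0)  # zero depth: no sharpening applied
deep = sharpen_by_depth(spot, 2.0)     # 2 mm deep: strong sharpening
```

A deconvolution against the depth-matched spread function would be the more principled choice; the unsharp mask is merely the shortest sketch of "sharpening processing according to the depth position information".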
[項目8]
 観察画像を生成する観察画像生成部を備え、
 前記画像取得部は、前記生体組織に可視光を照射しつつ前記生体組織を撮影することで得られる可視光画像を取得し、
 前記観察画像では、前記鮮鋭化処理を受けた後の前記蛍光画像の前記蛍光体に対応する部分が前記可視光画像に重畳される項目7に記載の医療用画像処理装置。
[Item 8]
An observation image generation unit that generates an observation image,
The image acquiring unit acquires a visible light image obtained by photographing the living tissue while irradiating the living tissue with visible light,
8. The medical image processing apparatus according to item 7, wherein, in the observation image, a portion corresponding to the phosphor of the fluorescence image after undergoing the sharpening process is superimposed on the visible light image.
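Item 8's superimposition can be sketched as a threshold-and-blend: only pixels of the (sharpened) fluorescence image that belong to the phosphor are overlaid on the visible light image. The threshold, marker color, and alpha value below are arbitrary illustrative choices, not values from the disclosure.

```python
import numpy as np

def overlay(visible_rgb, fluor, threshold=0.2, color=(0.0, 1.0, 0.0), alpha=0.6):
    """Superimpose only the phosphor-corresponding portion of the
    fluorescence image onto the visible light image: pixels above the
    threshold are alpha-blended with a marker color."""
    out = visible_rgb.astype(np.float32).copy()
    mask = fluor > threshold                    # the phosphor region
    for c in range(3):
        channel = out[..., c]                   # view into `out`
        channel[mask] = ((1.0 - alpha) * channel[mask]
                         + alpha * color[c] * fluor[mask])
    return out

visible = np.full((4, 4, 3), 0.5, dtype=np.float32)   # plain gray scene
fluor = np.zeros((4, 4), dtype=np.float32)
fluor[1, 1] = 1.0                                     # one bright phosphor pixel
observation = overlay(visible, fluor)
```

The per-phosphor brightness adjustment of Item 9 would simply scale each phosphor's `fluor` values before this blend.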
[項目9]
 前記生体組織は、蛍光波長が相互に異なる複数の蛍光体を含み、
 前記画像取得部は、前記複数の蛍光体のそれぞれの前記励起光を前記生体組織に照射しつつ前記生体組織を撮影することで得られる前記蛍光画像を取得し、
 前記深さ位置情報取得部は、前記蛍光画像に基づいて、前記複数の蛍光体のそれぞれの前記深さ位置情報を取得し、
 前記観察画像生成部は、
 前記蛍光画像の前記複数の蛍光体に対応する部分に対し、前記複数の蛍光体間における相対的な明るさの調整を行い、
 前記複数の蛍光体間における相対的な明るさの調整が行われた後の前記蛍光画像の前記複数の蛍光体に対応する部分を、前記可視光画像に重畳して前記観察画像を生成する項目8に記載の医療用画像処理装置。
[Item 9]
the biological tissue contains a plurality of fluorescent substances having mutually different fluorescent wavelengths,
The image acquisition unit acquires the fluorescence image obtained by photographing the biological tissue while irradiating the biological tissue with the excitation light of each of the plurality of fluorescent substances,
The depth position information acquisition unit acquires the depth position information of each of the plurality of phosphors based on the fluorescence image,
The observation image generation unit
Adjusting the relative brightness between the plurality of phosphors in the portion corresponding to the plurality of phosphors of the fluorescence image,
generates the observation image by superimposing, on the visible light image, the portions of the fluorescence image corresponding to the plurality of phosphors after the relative brightness among the plurality of phosphors has been adjusted. The medical image processing apparatus according to item 8.
[項目10]
 観察画像を生成する観察画像生成部を備え、
 前記画像取得部は、前記生体組織に可視光を照射しつつ前記生体組織を撮影することで得られる可視光画像を取得し、
 前記観察画像生成部は、
 前記鮮鋭化処理を受けた後の前記蛍光画像を解析して、前記生体組織における前記蛍光体の範囲を特定し、
 前記可視光画像において前記蛍光体の範囲に対応する部分が強調された前記観察画像を生成する項目7に記載の医療用画像処理装置。
[Item 10]
An observation image generation unit that generates an observation image,
The image acquiring unit acquires a visible light image obtained by photographing the living tissue while irradiating the living tissue with visible light,
The observation image generation unit
analyzing the fluorescence image after undergoing the sharpening process to specify the range of the phosphor in the living tissue;
8. The medical image processing apparatus according to item 7, which generates the observation image in which a portion corresponding to the range of the phosphor in the visible light image is emphasized.
[項目11]
 前記観察画像生成部は、前記可視光画像の前記蛍光体の範囲に対応する部分が、前記深さ位置情報に応じて強調された前記観察画像を生成する項目10に記載の医療用画像処理装置。
[Item 11]
11. The medical image processing apparatus according to item 10, wherein the observation image generation unit generates the observation image in which the portion of the visible light image corresponding to the range of the phosphor is emphasized according to the depth position information.
[項目12]
 蛍光体を含む生体組織に励起光を照射しつつ前記生体組織を撮影することで蛍光画像を取得する撮影ユニットと、
 前記蛍光画像を解析する医療用画像処理装置と、を備え、
 前記医療用画像処理装置は、
 前記蛍光画像を取得する画像取得部と、
 前記蛍光画像に基づいて、前記蛍光体の深さ位置に関連する深さ位置情報を取得する深さ位置情報取得部と、を有し、
 前記深さ位置情報取得部は、
 前記蛍光画像を解析して、前記蛍光画像における前記蛍光体の像強度分布を示す拡がり情報を取得し、
 前記生体組織における像強度分布を表す拡がり関数に対し、前記拡がり情報を照らし合わせることで、前記深さ位置情報を取得する、医療用観察システム。
[Item 12]
an imaging unit that obtains a fluorescence image by imaging a biological tissue containing a fluorescent substance while irradiating the biological tissue with excitation light;
and a medical image processing device that analyzes the fluorescence image,
The medical image processing device is
an image acquisition unit that acquires the fluorescence image;
a depth position information acquisition unit that acquires depth position information related to the depth position of the phosphor based on the fluorescence image;
The depth position information acquisition unit,
analyzing the fluorescence image to acquire spread information indicating the image intensity distribution of the phosphor in the fluorescence image;
A medical observation system that acquires the depth position information by comparing the spread information with the spread function representing the image intensity distribution in the living tissue.
[項目13]
 前記生体組織における前記蛍光体の場所が視認可能に表された観察画像を表示するディスプレイ装置を備え、
 前記撮影ユニットは、前記生体組織に可視光を照射しつつ前記生体組織を撮影することで可視光画像を取得し、
 前記医療用画像処理装置は、前記可視光画像及び前記蛍光画像に基づいて前記観察画像を生成する観察画像生成部を有する、項目12に記載の医療用観察システム。
[Item 13]
A display device for displaying an observation image in which the location of the phosphor in the biological tissue is visibly displayed,
The imaging unit acquires a visible light image by imaging the living tissue while irradiating the living tissue with visible light,
13. The medical observation system according to item 12, wherein the medical image processing apparatus includes an observation image generation unit that generates the observation image based on the visible light image and the fluorescence image.
[項目14]
 前記撮影ユニットは、前記深さ位置情報に基づいて撮影条件を調整する、項目12又は13に記載の医療用観察システム。
[Item 14]
14. The medical observation system according to item 12 or 13, wherein the imaging unit adjusts imaging conditions based on the depth position information.
[項目15]
 蛍光体を含む生体組織に励起光を照射しつつ前記生体組織を撮影することで得られる蛍光画像を取得する工程と、
 前記蛍光画像を解析して、前記蛍光画像における前記蛍光体の像強度分布を示す拡がり情報を取得する工程と、
 前記生体組織における像強度分布を表す拡がり関数に対し、前記拡がり情報を照らし合わせることで、前記蛍光体の深さ位置に関連する深さ位置情報を取得する工程と、
を含む医療用画像処理方法。
[Item 15]
A medical image processing method comprising:
acquiring a fluorescence image obtained by photographing a living tissue containing a fluorescent substance while irradiating the living tissue with excitation light;
analyzing the fluorescence image to acquire spread information indicating the image intensity distribution of the phosphor in the fluorescence image; and
acquiring depth position information related to the depth position of the phosphor by comparing the spread information against a spread function representing the image intensity distribution in the living tissue.
[項目16]
 コンピュータに、
 蛍光体を含む生体組織に励起光を照射しつつ前記生体組織を撮影することで得られる蛍光画像を取得する手順と、
 前記蛍光画像を解析して、前記蛍光画像における前記蛍光体の像強度分布を示す拡がり情報を取得する手順と、
 前記生体組織における像強度分布を表す拡がり関数に対し、前記拡がり情報を照らし合わせることで、前記蛍光体の深さ位置に関連する深さ位置情報を取得する手順と、
を実行させるためのプログラム。
[Item 16]
A program for causing a computer to execute:
a procedure of acquiring a fluorescence image obtained by photographing a living tissue containing a fluorescent substance while irradiating the living tissue with excitation light;
a procedure of analyzing the fluorescence image to acquire spread information indicating the image intensity distribution of the phosphor in the fluorescence image; and
a procedure of acquiring depth position information related to the depth position of the phosphor by comparing the spread information against a spread function representing the image intensity distribution in the living tissue.
10 医療用観察システム
11 撮影ユニット
12 医療用画像処理装置
13 出力ユニット
21 カメラコントローラ
22 カメラ記憶部
23 撮像部
24 光照射部
25 試料支持部
31 出力コントローラ
32 出力記憶部
33 ディスプレイ装置
40 画像処理コントローラ
41 画像取得部
42 深さ位置情報取得部
43 画質調整部
44 観察画像生成部
45 処理記憶部
100 可視光画像
101 蛍光画像
102 鮮鋭蛍光画像
103 観察画像
200 生体組織
200a 組織表面
201 血管
202 蛍光体
203 観察基準線
10 medical observation system
11 imaging unit
12 medical image processing device
13 output unit
21 camera controller
22 camera storage unit
23 imaging section
24 light irradiation unit
25 sample support unit
31 output controller
32 output storage unit
33 display device
40 image processing controller
41 image acquisition unit
42 depth position information acquisition unit
43 image quality adjustment unit
44 observation image generation unit
45 processing storage unit
100 visible light image
101 fluorescence image
102 sharpened fluorescence image
103 observation image
200 living tissue
200a tissue surface
201 blood vessel
202 phosphor
203 observation reference line

Claims (13)

  1.  蛍光体を含む生体組織に励起光を照射しつつ前記生体組織を撮影することで得られる蛍光画像を取得する画像取得部と、
     前記蛍光画像に基づいて、前記蛍光体の深さ位置に関連する深さ位置情報を取得する深さ位置情報取得部と、を備え、
     前記深さ位置情報取得部は、
     前記蛍光画像を解析して、前記蛍光画像における前記蛍光体の像強度分布を示す拡がり情報を取得し、
     前記生体組織における像強度分布を表す拡がり関数に対し、前記拡がり情報を照らし合わせることで、前記深さ位置情報を取得する、医療用画像処理装置。
    an image acquisition unit that acquires a fluorescence image obtained by photographing a living tissue containing a fluorescent substance while irradiating the living tissue with excitation light;
    a depth position information acquisition unit that acquires depth position information related to the depth position of the phosphor based on the fluorescence image;
    The depth position information acquisition unit,
    analyzing the fluorescence image to acquire spread information indicating the image intensity distribution of the phosphor in the fluorescence image;
    A medical image processing apparatus that acquires the depth position information by comparing the spread information with the spread function representing the image intensity distribution in the living tissue.
  2.  前記画像取得部は、前記生体組織に可視光を照射しつつ前記生体組織を撮影することで得られる可視光画像を取得し、
     前記深さ位置情報取得部は、
     前記可視光画像を解析することで、前記生体組織の種類を推定し、
     推定された前記生体組織の種類に応じた前記拡がり関数を取得する請求項1に記載の医療用画像処理装置。
    The image acquiring unit acquires a visible light image obtained by photographing the living tissue while irradiating the living tissue with visible light,
    The depth position information acquisition unit,
    estimating the type of the biological tissue by analyzing the visible light image;
    2. The medical image processing apparatus according to claim 1, wherein the spread function corresponding to the estimated type of the biological tissue is obtained.
  3.  前記拡がり情報は、前記蛍光画像における前記蛍光体の輝度分布であり、
     前記拡がり関数は、前記蛍光体の輝度分布と、前記深さ位置情報と、に基づく線拡がり関数である請求項1に記載の医療用画像処理装置。
    the spread information is the luminance distribution of the phosphor in the fluorescence image;
    2. The medical image processing apparatus according to claim 1, wherein the spread function is a line spread function based on the luminance distribution of the phosphor and the depth position information.
  4.  前記拡がり関数は、前記蛍光体の蛍光波長に応じて定まる散乱係数をパラメータとして含み、
     前記深さ位置情報取得部は、
     前記蛍光波長に対応する散乱係数を取得し、
     前記散乱係数が反映された前記拡がり関数と、前記拡がり情報とに基づいて、前記深さ位置情報を取得する請求項1に記載の医療用画像処理装置。
    The spreading function includes as a parameter a scattering coefficient determined according to the fluorescence wavelength of the phosphor,
    The depth position information acquisition unit,
    obtaining a scattering coefficient corresponding to the fluorescence wavelength;
    2. The medical image processing apparatus according to claim 1, wherein the depth position information is acquired based on the spread function reflecting the scattering coefficient and the spread information.
  5.  前記生体組織は、蛍光波長が相互に異なる複数の蛍光体を含み、
     前記画像取得部は、前記複数の蛍光体のそれぞれの前記励起光を前記生体組織に照射して前記生体組織を撮影することで得られる前記蛍光画像を取得し、
     前記深さ位置情報取得部は、前記蛍光画像に基づいて、前記複数の蛍光体のそれぞれの前記深さ位置情報を取得する請求項1に記載の医療用画像処理装置。
    the biological tissue contains a plurality of fluorescent substances having mutually different fluorescent wavelengths,
    The image acquisition unit acquires the fluorescence image obtained by irradiating the biological tissue with the excitation light of each of the plurality of fluorophores and photographing the biological tissue,
    The medical image processing apparatus according to claim 1, wherein the depth position information acquisition unit acquires the depth position information of each of the plurality of phosphors based on the fluorescence image.
6.  前記深さ位置情報取得部は、前記複数の蛍光体のそれぞれの前記深さ位置情報に基づいて、前記複数の蛍光体間における前記深さ位置の相対関係を示す相対深さ位置情報を取得する請求項5に記載の医療用画像処理装置。 The medical image processing apparatus according to claim 5, wherein the depth position information acquisition unit acquires relative depth position information indicating a relative relationship of the depth positions among the plurality of phosphors based on the depth position information of each of the plurality of phosphors.
  7.  前記深さ位置情報に応じた鮮鋭化処理を前記蛍光画像に対して行う画質調整部を備える請求項1に記載の医療用画像処理装置。 The medical image processing apparatus according to claim 1, comprising an image quality adjustment unit that performs sharpening processing on the fluorescence image according to the depth position information.
  8.  観察画像を生成する観察画像生成部を備え、
     前記画像取得部は、前記生体組織に可視光を照射しつつ前記生体組織を撮影することで得られる可視光画像を取得し、
     前記観察画像では、前記鮮鋭化処理を受けた後の前記蛍光画像の前記蛍光体に対応する部分が前記可視光画像に重畳される請求項7に記載の医療用画像処理装置。
    An observation image generation unit that generates an observation image,
    The image acquiring unit acquires a visible light image obtained by photographing the living tissue while irradiating the living tissue with visible light,
    8. The medical image processing apparatus according to claim 7, wherein in the observation image, a portion corresponding to the phosphor of the fluorescence image after undergoing the sharpening process is superimposed on the visible light image.
  9.  前記生体組織は、蛍光波長が相互に異なる複数の蛍光体を含み、
     前記画像取得部は、前記複数の蛍光体のそれぞれの前記励起光を前記生体組織に照射しつつ前記生体組織を撮影することで得られる前記蛍光画像を取得し、
     前記深さ位置情報取得部は、前記蛍光画像に基づいて、前記複数の蛍光体のそれぞれの前記深さ位置情報を取得し、
     前記観察画像生成部は、
     前記蛍光画像の前記複数の蛍光体に対応する部分に対し、前記複数の蛍光体間における相対的な明るさの調整を行い、
     前記複数の蛍光体間における相対的な明るさの調整が行われた後の前記蛍光画像の前記複数の蛍光体に対応する部分を、前記可視光画像に重畳して前記観察画像を生成する請求項8に記載の医療用画像処理装置。
    the biological tissue contains a plurality of fluorescent substances having mutually different fluorescent wavelengths,
    The image acquisition unit acquires the fluorescence image obtained by photographing the biological tissue while irradiating the biological tissue with the excitation light of each of the plurality of fluorescent substances,
    The depth position information acquisition unit acquires the depth position information of each of the plurality of phosphors based on the fluorescence image,
    The observation image generation unit
    Adjusting the relative brightness between the plurality of phosphors in the portion corresponding to the plurality of phosphors of the fluorescence image,
    superimposes, on the visible light image, the portions of the fluorescence image corresponding to the plurality of phosphors after the relative brightness among the plurality of phosphors has been adjusted, to generate the observation image. The medical image processing apparatus according to claim 8.
  10.  観察画像を生成する観察画像生成部を備え、
     前記画像取得部は、前記生体組織に可視光を照射しつつ前記生体組織を撮影することで得られる可視光画像を取得し、
     前記観察画像生成部は、
     前記鮮鋭化処理を受けた後の前記蛍光画像を解析して、前記生体組織における前記蛍光体の範囲を特定し、
     前記可視光画像において前記蛍光体の範囲に対応する部分が強調された前記観察画像を生成する請求項7に記載の医療用画像処理装置。
    An observation image generation unit that generates an observation image,
    The image acquiring unit acquires a visible light image obtained by photographing the living tissue while irradiating the living tissue with visible light,
    The observation image generation unit
    analyzing the fluorescence image after undergoing the sharpening process to specify the range of the phosphor in the living tissue;
    8. The medical image processing apparatus according to claim 7, which generates the observation image in which a portion corresponding to the range of the phosphor in the visible light image is emphasized.
11.  前記観察画像生成部は、前記可視光画像の前記蛍光体の範囲に対応する部分が、前記深さ位置情報に応じて強調された前記観察画像を生成する請求項10に記載の医療用画像処理装置。 The medical image processing apparatus according to claim 10, wherein the observation image generation unit generates the observation image in which the portion of the visible light image corresponding to the range of the phosphor is emphasized according to the depth position information.
  12.  蛍光体を含む生体組織に励起光を照射しつつ前記生体組織を撮影することで得られる蛍光画像を取得する工程と、
     前記蛍光画像を解析して、前記蛍光画像における前記蛍光体の像強度分布を示す拡がり情報を取得する工程と、
     前記生体組織における像強度分布を表す拡がり関数に対し、前記拡がり情報を照らし合わせることで、前記蛍光体の深さ位置に関連する深さ位置情報を取得する工程と、
    を含む医療用画像処理方法。
    A medical image processing method comprising:
    acquiring a fluorescence image obtained by photographing a living tissue containing a fluorescent substance while irradiating the living tissue with excitation light;
    analyzing the fluorescence image to acquire spread information indicating the image intensity distribution of the phosphor in the fluorescence image; and
    acquiring depth position information related to the depth position of the phosphor by comparing the spread information against a spread function representing the image intensity distribution in the living tissue.
  13.  コンピュータに、
     蛍光体を含む生体組織に励起光を照射しつつ前記生体組織を撮影することで得られる蛍光画像を取得する手順と、
     前記蛍光画像を解析して、前記蛍光画像における前記蛍光体の像強度分布を示す拡がり情報を取得する手順と、
     前記生体組織における像強度分布を表す拡がり関数に対し、前記拡がり情報を照らし合わせることで、前記蛍光体の深さ位置に関連する深さ位置情報を取得する手順と、
    を実行させるためのプログラム。
    A program for causing a computer to execute:
    a procedure of acquiring a fluorescence image obtained by photographing a living tissue containing a fluorescent substance while irradiating the living tissue with excitation light;
    a procedure of analyzing the fluorescence image to acquire spread information indicating the image intensity distribution of the phosphor in the fluorescence image; and
    a procedure of acquiring depth position information related to the depth position of the phosphor by comparing the spread information against a spread function representing the image intensity distribution in the living tissue.
PCT/JP2022/003924 2021-02-24 2022-02-02 Medical image processing device, medical image processing method, and program WO2022181263A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202280014492.2A CN116887760A (en) 2021-02-24 2022-02-02 Medical image processing apparatus, medical image processing method, and program

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2021-027800 2021-02-24
JP2021027800A JP2022129194A (en) 2021-02-24 2021-02-24 Medical image processing device, medical image processing method, and program

Publications (1)

Publication Number Publication Date
WO2022181263A1 true WO2022181263A1 (en) 2022-09-01

Family

ID=83048098

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2022/003924 WO2022181263A1 (en) 2021-02-24 2022-02-02 Medical image processing device, medical image processing method, and program

Country Status (3)

Country Link
JP (1) JP2022129194A (en)
CN (1) CN116887760A (en)
WO (1) WO2022181263A1 (en)

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2009153970A (en) * 2007-12-05 2009-07-16 Fujifilm Corp Image processing system, image processing method, and program


Also Published As

Publication number Publication date
CN116887760A (en) 2023-10-13
JP2022129194A (en) 2022-09-05

KR20180062411A (en) Method of acquiring an image of a biological sample

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application
    Ref document number: 22759304; Country of ref document: EP; Kind code of ref document: A1
WWE Wipo information: entry into national phase
    Ref document number: 202280014492.2; Country of ref document: CN
NENP Non-entry into the national phase
    Ref country code: DE
122 Ep: pct application non-entry in european phase
    Ref document number: 22759304; Country of ref document: EP; Kind code of ref document: A1