WO2022259647A1 - Information processing device, information processing method, and microscope system - Google Patents

Information processing device, information processing method, and microscope system

Info

Publication number
WO2022259647A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
optical system
information
processing unit
processing
Prior art date
Application number
PCT/JP2022/008728
Other languages
French (fr)
Japanese (ja)
Inventor
悠策 中島
信裕 林
寿一 白木
拓哉 大嶋
祐伍 勝木
Original Assignee
Sony Group Corporation
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sony Group Corporation
Publication of WO2022259647A1

Classifications

    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B21/00Microscopes
    • G02B21/36Microscopes arranged for photographic purposes or projection purposes or digital imaging or video purposes including associated control and data processing arrangements

Definitions

  • the present disclosure relates to an information processing device, an information processing method, and a microscope system.
  • the microscope system can electronically store and share images captured as digital data.
  • An image that is such digital data can be subjected to processing that is not possible with an optical microscope.
  • a microscope system reconstructs a WSI image (WSI: Whole Slide Imaging) of a pathological specimen by connecting high-magnification pathological images.
  • microscope systems that capture digital data may be limited by their optics. Electronically reducing or enlarging a WSI image therefore scales an image captured under one specific set of optical conditions, so the result differs from the image actually observed through an optical microscope. Doctors (especially pathologists) accustomed to optical microscopes may consequently sense a shift in their diagnostic criteria when diagnosing from an image captured as digital data. The present disclosure therefore provides an information processing device, an information processing method, and a microscope system that can provide an image better matched to the viewer's preference.
  • an information processing device is provided that includes: an acquisition unit that acquires a first image captured through a first optical system; and a processing unit that processes the first image, using optical system information related to a second optical system different from the first optical system, to generate a second image.
  • the first image is composed of a plurality of first images taken by changing the shooting position in the optical axis direction of the first optical system
  • when the numerical aperture of the second optical system is smaller than the numerical aperture of the first optical system, the processing unit may generate the second image, based on the plurality of first images, so that it contains more image information in the optical axis direction than the first image.
  • the first image is composed of a plurality of first images taken by changing the shooting position in the optical axis direction of the first optical system,
  • the processing unit may generate the second image, based on the plurality of first images, so that it contains more image information in the optical axis direction than the first image.
  • the processing unit may change the addition method used to combine the plurality of first images according to the optical system information.
  • the processing unit may process the first image by multiplying each of the plurality of first images by a different coefficient based on the optical system information and adding the results.
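As an illustration of this coefficient-weighted addition, the following sketch combines a focal stack into one synthesized image. The function name, the NumPy formulation, and the normalization of the weights are assumptions for illustration, not details taken from the patent.

```python
import numpy as np

def synthesize_second_image(stack, coefficients):
    """Combine a focal stack (2D images captured at different Z
    positions through the first optical system) into a single image
    approximating capture through a second optical system.

    `coefficients` holds one weight per focal plane, derived from the
    optical system information (illustrative, not from the patent).
    """
    stack = np.asarray(stack, dtype=np.float64)
    coefficients = np.asarray(coefficients, dtype=np.float64)
    if len(coefficients) != len(stack):
        raise ValueError("one coefficient per focal plane is required")
    # Normalize, then take the weighted sum over the Z (focal) axis.
    weights = coefficients / coefficients.sum()
    return np.tensordot(weights, stack, axes=1)
```

Changing the coefficient vector changes how much each focal plane contributes, which is one concrete reading of "changing the addition method" according to the optical system information.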
  • when the numerical aperture of the second optical system is smaller than the numerical aperture of the first optical system, the processing unit may generate the second image, based on the plurality of first images, so that it has image information corresponding to an image with a deeper depth of field than the first image.
  • when the imaging magnification of the second optical system is lower than the imaging magnification of the first optical system, the processing unit may generate the second image, based on the plurality of first images, so that it has image information corresponding to an image with a deeper depth of field than the first image.
  • the processing unit may generate the second image by interlocking the numerical aperture of the second optical system with the imaging magnification of the second optical system.
  • the first image is composed of a plurality of first images taken by changing the shooting position in the optical axis direction of the first optical system
  • the processing unit may change the value of the coefficient according to the distance from the optical axis of the second optical system based on the characteristics of the objective lens of the second optical system.
  • the processing unit may change the value of the coefficient so that the image information on the objective lens side increases as the distance from the optical axis of the first optical system increases.
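A hedged sketch of how such radius-dependent coefficients might be modeled: a per-pixel Gaussian weight over the focal planes whose peak plane shifts toward the objective-lens side as the distance from the optical axis grows. The Gaussian form, the `strength` parameter, and the linear shift are assumptions for illustration; an actual profile would come from the measured field curvature of the second objective lens.

```python
import numpy as np

def radial_coefficients(shape, num_planes, strength=0.5):
    """Per-pixel, per-plane weights whose peak focal plane shifts with
    distance from the optical axis (illustrative field-curvature model).

    Returns an array of shape (num_planes, H, W) normalized so the
    weights over the planes sum to 1 at every pixel.
    """
    h, w = shape
    yy, xx = np.mgrid[0:h, 0:w]
    cy, cx = (h - 1) / 2.0, (w - 1) / 2.0
    # Normalized radius: 0 on the optical axis, 1 at the image corner.
    r = np.hypot(yy - cy, xx - cx) / np.hypot(cy, cx)
    # Peak plane index shifts linearly with radius.
    peak = (num_planes - 1) / 2.0 + strength * (num_planes - 1) / 2.0 * r
    planes = np.arange(num_planes).reshape(-1, 1, 1)
    weights = np.exp(-0.5 * (planes - peak) ** 2)
    return weights / weights.sum(axis=0, keepdims=True)
```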
  • the processing unit may change either the density or the spatial frequency of the first image according to the optical system information.
  • the processing unit may reduce the spatial frequency of the second image below the spatial frequency of the first image.
  • the processing unit may perform processing to lower the spatial frequency of the first image when the imaging magnification of the second optical system is lower than the imaging magnification of the first optical system.
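To illustrate lowering the spatial frequency when the second system's magnification is lower, a minimal frequency-domain low-pass whose cutoff scales with the magnification ratio could look like the sketch below. The hard cutoff is used only for brevity; a realistic implementation would follow the lens MTF rather than truncate frequencies abruptly.

```python
import numpy as np

def lower_spatial_frequency(image, mag_first, mag_second):
    """If the second optical system's imaging magnification is lower
    than the first's, suppress high spatial frequencies accordingly
    (illustrative hard low-pass; cutoff scales with the ratio)."""
    if mag_second >= mag_first:
        return image.astype(np.float64)
    ratio = mag_second / mag_first  # < 1.0
    f = np.fft.fftshift(np.fft.fft2(image))
    h, w = image.shape
    yy, xx = np.mgrid[0:h, 0:w]
    cy, cx = h // 2, w // 2
    r = np.hypot(yy - cy, xx - cx)       # distance from DC after shift
    f[r > ratio * min(cy, cx)] = 0       # drop frequencies above cutoff
    return np.real(np.fft.ifft2(np.fft.ifftshift(f)))
```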
  • the processing unit may perform processing to increase the density of the first image when the numerical aperture of the second optical system is smaller than the numerical aperture of the first optical system.
  • the processing unit may perform processing to increase the density of the first image when the imaging magnification of the second optical system is lower than the imaging magnification of the first optical system.
  • the processing unit may change the processing applied to the first image according to information selected by a selection unit.
  • a display control unit may be further provided for displaying a first display image corresponding to the first image and a second display image corresponding to the second image processed by the processing unit side by side on the display unit.
  • the first display image may be an image in which a change in numerical aperture is suppressed even if the imaging magnification is changed
  • the second display image may be an image in which the numerical aperture is changed according to the change in imaging magnification.
  • an information processing method is provided comprising: acquiring a first image captured through a first optical system; and processing the first image, using optical system information related to a second optical system different from the first optical system, to generate a second image.
  • a microscope system is provided comprising: a microscope device to which a first objective lens is connected and which acquires a first image of living tissue captured at at least two focal positions; and an information processing device, wherein the information processing device includes an acquisition unit that acquires the first image, and a processing unit that processes the first image using optical system information related to a second optical system different from the first optical system of the microscope device.
  • FIG. 4A and 4B are diagrams showing an example of an imaging method
  • FIG. 4A and 4B are diagrams showing an example of an imaging method
  • FIG. 3 is a block diagram showing a more detailed configuration example of the microscope system 5000
  • FIG. 4 is a diagram showing an objective lens and an aperture stop of an optical system of an optical section
  • FIG. 4 is a diagram schematically showing a first image captured under the control of a sensor control section and a stage control section
  • FIG. 2 is a block diagram showing a detailed configuration example of an image synthesizing unit
  • 4A and 4B are diagrams for schematically explaining a processing example of an optical processing unit;
  • FIG. 4 is a diagram schematically showing the processing order of an image synthesizing unit; a diagram showing an example in which a first image and a second image are displayed side by side.
  • FIG. 4 is a diagram showing an example of a UI screen generated by a UI control unit; a diagram showing the processing sequence when a mode button is selected; a diagram showing an example of a UI used for issuing an instruction.
  • FIG. 5 is a diagram showing an example of a UI screen related to image processing generated by a UI control unit;
  • FIG. 6 is a diagram showing an example of a UI screen related to special processing generated by the UI control unit;
  • FIG. 17 shows a processing system corresponding to the mode selected in FIG. 16;
  • FIG. 10 is a diagram showing an example in which the focal position shifts according to the distance from the optical axis of the objective lens of the second optical system;
  • FIG. 4 is a diagram showing an example of experimentally generated coefficients related to formulas (1) and (2);
  • 4 is a flowchart showing an example of processing of the microscope system according to the embodiment;
  • 5 is a flowchart showing an example of processing of the microscope system when magnification and numerical aperture are interlocked;
  • FIG. 2 is a block diagram showing a configuration example of a microscope system according to Modification 1 of the embodiment;
  • FIG. 10 is a diagram showing an example of a UI screen generated by a UI control unit according to Modification 1 of the first embodiment;
  • FIG. 5 is a block diagram showing a configuration example of a microscope system according to Modification 2 of the first embodiment;
  • 9 is a flowchart showing a processing example of the microscope system according to modification 2 of the first embodiment;
  • FIG. 1 shows a configuration example of the microscope system of the present disclosure.
  • a microscope system 5000 shown in FIG. 1 includes a microscope device 5100 , a control section 5110 and an information processing section 5120 .
  • a microscope device 5100 includes a light irradiation section 5101 , an optical section 5102 , and a signal acquisition section 5103 .
  • the microscope device 5100 may further include a sample placement section 5104 on which the biological sample S is placed. Note that the configuration of the microscope apparatus is not limited to that shown in FIG. 1; for example, a light source located outside the microscope device may be used as the light irradiation unit 5101.
  • the light irradiation section 5101 may be arranged such that the sample mounting section 5104 is sandwiched between the light irradiation section 5101 and the optical section 5102, or may be arranged on the side where the optical section 5102 exists.
  • the microscope apparatus 5100 may be configured for one or more of bright field observation, phase contrast observation, differential interference observation, polarization observation, fluorescence observation, and dark field observation.
  • the microscope system 5000 may be configured as a so-called WSI (Whole Slide Imaging) system or a digital pathology system, and can be used for pathological diagnosis.
  • Microscope system 5000 may also be configured as a fluorescence imaging system, in particular a multiplex fluorescence imaging system.
  • the microscope system 5000 may be used to perform intraoperative pathological diagnosis or remote pathological diagnosis.
  • the microscope device 5100 can acquire data of the biological sample S obtained from the subject of the surgery and transmit the data to the information processing unit 5120.
  • the microscope device 5100 can transmit the acquired data of the biological sample S to the information processing device 5120 located in a place (another room, building, or the like) away from the microscope device 5100 .
  • the information processing device 5120 receives and outputs the data.
  • a user of the information processing device 5120 can make a pathological diagnosis based on the output data.
  • the biological sample S may be a sample containing a biological component.
  • the biological components may be tissues, cells, liquid components of a living body (blood, urine, etc.), cultures, or living cells (cardiomyocytes, nerve cells, fertilized eggs, etc.).
  • the biological sample may be a solid, a specimen fixed with a fixative such as paraffin, or a solid formed by freezing.
  • the biological sample can be a section of the solid.
  • a specific example of the biological sample is a section of a biopsy sample.
  • the biological sample may be one that has undergone processing such as staining or labeling.
  • the treatment may be staining for showing the morphology of biological components or for showing substances (surface antigens, etc.) possessed by biological components; examples include HE (Hematoxylin-Eosin) staining and immunohistochemistry staining.
  • the biological sample may be treated with one or more reagents, and the reagents may be fluorescent dyes, chromogenic reagents, fluorescent proteins, or fluorescently labeled antibodies.
  • the specimen may be one prepared for the purpose of pathological diagnosis or clinical examination from a specimen or tissue sample collected from the human body. Moreover, the specimen is not limited to the human body, and may be derived from animals, plants, or other materials.
  • the properties of the specimen may differ depending on the type of tissue used (such as an organ or cell), the type of target disease, the subject's attributes (such as age, sex, blood type, or race), or the subject's lifestyle habits (for example, eating habits, exercise habits, or smoking habits).
  • the specimens may be managed with identification information (bar code information, QR code (trademark) information, etc.) that allows each specimen to be identified.
  • the light irradiation unit 5101 includes a light source for illuminating the biological sample S and an optical section for guiding the light emitted from the light source to the specimen.
  • the light source may irradiate the biological sample with visible light, ultraviolet light, or infrared light, or a combination thereof.
  • the light source may be one or more of halogen lamps, laser light sources, LED lamps, mercury lamps, and xenon lamps. A plurality of types and/or wavelengths of light sources may be used in fluorescence observation, and may be appropriately selected by those skilled in the art.
  • the light irradiator may have a transmissive, reflective, or episcopic (coaxial or lateral) configuration.
  • the optical section 5102 is configured to guide the light from the biological sample S to the signal acquisition section 5103 .
  • the optical section can be configured to allow the microscope device 5100 to observe or image the biological sample S.
  • Optical section 5102 may include an objective lens.
  • the type of objective lens may be appropriately selected by those skilled in the art according to the observation method.
  • the optical section may include a relay lens for relaying the image magnified by the objective lens to the signal acquisition section.
  • the optical unit may further include optical components other than the objective lens and the relay lens, such as an eyepiece lens, a phase plate, and a condenser lens.
  • the optical section 5102 may further include a wavelength separation section configured to separate light having a predetermined wavelength from the light from the biological sample S.
  • the wavelength separation section can be configured to selectively allow light of a predetermined wavelength or wavelength range to reach the signal acquisition section.
  • the wavelength separator may include, for example, one or more of a filter that selectively transmits light, a polarizing plate, a prism (Wollaston prism), and a diffraction grating.
  • the optical components included in the wavelength separation section may be arranged, for example, on the optical path from the objective lens to the signal acquisition section.
  • the wavelength separation unit is provided in the microscope apparatus when fluorescence observation is performed, particularly when an excitation light irradiation unit is included.
  • the wavelength separator may be configured to separate fluorescent light from each other or white light and fluorescent light.
  • the signal acquisition unit 5103 can be configured to receive light from the biological sample S and convert the light into an electrical signal, particularly a digital electrical signal.
  • the signal acquisition unit may be configured to acquire data on the biological sample S based on the electrical signal.
  • the signal acquisition unit may be configured to acquire data of an image of the biological sample S (in particular, a still image, a time-lapse image, or a moving image), especially data of an image magnified by the optical unit.
  • the signal acquisition unit includes one or more imaging elements, such as CMOS or CCD, having a plurality of pixels arranged one-dimensionally or two-dimensionally.
  • the signal acquisition unit may include an image sensor for acquiring a low-resolution image and an image sensor for acquiring a high-resolution image, or may include an image sensor for sensing (such as AF) and an image sensor for outputting images for observation. In addition to the plurality of pixels, the image sensor may include a signal processing unit (including one, two, or three of a CPU, a DSP, and memory) that performs signal processing using the pixel signals from each pixel, and an output control unit that controls output of the image data generated from the pixel signals and of the processed data generated by the signal processing unit. Furthermore, the image sensor may include an asynchronous event detection sensor that detects, as an event, a change in the brightness of a pixel photoelectrically converting incident light that exceeds a predetermined threshold. An image sensor including the plurality of pixels, the signal processing unit, and the output control unit may preferably be configured as a one-chip semiconductor device.
  • the control unit 5110 controls imaging by the microscope device 5100 .
  • the control unit can drive the movement of the optical unit 5102 and/or the sample placement unit 5104 to adjust the positional relationship between the optical unit and the sample placement unit.
  • the control unit 5110 can move the optical unit and/or the sample placement unit in a direction toward or away from each other (for example, the optical axis direction of the objective lens). Further, the control section may move the optical section and/or the sample placement section in any direction on a plane perpendicular to the optical axis direction.
  • the control unit may control the light irradiation unit 5101 and/or the signal acquisition unit 5103 for imaging control.
  • the sample mounting section 5104 may be configured such that the position of the biological sample on the sample mounting section can be fixed, and may be a so-called stage.
  • the sample mounting section 5104 can be configured to move the position of the biological sample in the optical axis direction of the objective lens and/or in a direction perpendicular to the optical axis direction.
  • the information processing section 5120 can acquire data (such as imaging data) acquired by the microscope device 5100 from the microscope device 5100 .
  • the information processing section can perform image processing on the imaging data.
  • the image processing may include color separation processing.
  • the color separation process can include a process of extracting data of light components of a predetermined wavelength or wavelength range from the captured data to generate image data, a process of removing data of light components of a predetermined wavelength or wavelength range from the captured data, and the like.
  • the image processing may include autofluorescence separation processing for separating the autofluorescence component and dye component of the tissue section, and fluorescence separation processing for separating the wavelengths between dyes having different fluorescence wavelengths.
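The fluorescence separation step can be pictured as linear spectral unmixing: if each dye's reference emission spectrum is known, per-pixel dye abundances follow from least squares. This sketch shows only the core idea; a practical separation such as the one described here would add constraints (non-negativity, an autofluorescence term), which are omitted below.

```python
import numpy as np

def unmix(pixels, spectra):
    """Linear spectral unmixing sketch.

    pixels:  (num_channels, num_pixels) measured intensities
    spectra: (num_channels, num_dyes) reference emission spectra
    returns: (num_dyes, num_pixels) estimated dye abundances
    """
    # Solve spectra @ abundances ~= pixels in the least-squares sense.
    abundances, *_ = np.linalg.lstsq(spectra, pixels, rcond=None)
    return abundances
```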
  • autofluorescence signals extracted from one specimen may be used to remove autofluorescence components from the image information of another specimen.
  • the information processing section 5120 may transmit data for imaging control to the control section 5110, and the control section 5110 receiving the data may control imaging by the microscope apparatus 5100 according to the data.
  • the information processing section 5120 may be configured as an information processing device such as a general-purpose computer, and may include a CPU, RAM, and ROM.
  • the information processing section may be included in the housing of the microscope device 5100 or may be outside the housing. Also, various processes or functions by the information processing unit may be realized by a server computer or cloud connected via a network.
  • a method of imaging the biological sample S by the microscope device 5100 may be appropriately selected by a person skilled in the art according to the type of the biological sample and the purpose of imaging. An example of the imaging method will be described below.
  • the microscope device can first identify an imaging target region.
  • the imaging target region may be specified so as to cover the entire region where the biological sample exists, or may be specified so as to cover a target portion of the biological sample (a target tissue section, a target cell, or a target lesion portion).
  • the microscope device divides the imaging target region into a plurality of divided regions of a predetermined size and sequentially images each divided region. As a result, an image of each divided area is obtained.
  • the microscope device specifies an imaging target region R that covers the entire biological sample S.
  • the microscope device divides the imaging target region R into 16 divided regions.
  • the microscope device can then image the segmented region R1, and then image any region included in the imaging target region R, such as a region adjacent to the segmented region R1. Then, image capturing of the divided areas is performed until there are no unimaged divided areas. Areas other than the imaging target area R may also be imaged based on the captured image information of the divided areas. After imaging a certain divided area, the positional relationship between the microscope device and the sample mounting section is adjusted in order to image the next divided area. The adjustment may be performed by moving the microscope device, moving the sample placement unit, or moving both of them.
  • the image capturing device that captures each divided area may be a two-dimensional image sensor (area sensor) or a one-dimensional image sensor (line sensor).
  • the signal acquisition section may capture an image of each divided area via the optical section.
  • the imaging of each divided area may be performed continuously while moving the microscope device and/or the sample mounting section, or the movement of the microscope apparatus and/or the sample mounting section may be performed during the imaging of each divided area. may be stopped.
  • the imaging target area may be divided so that the divided areas partially overlap each other, or the imaging target area may be divided so that the divided areas do not overlap.
  • Each divided area may be imaged multiple times by changing imaging conditions such as focal length and/or exposure time.
  • the information processing device can generate image data of a wider area by synthesizing a plurality of adjacent divided areas. By performing the synthesis processing over the entire imaging target area, it is possible to obtain an image of a wider area of the imaging target area. Also, image data with lower resolution can be generated from the image of the divided area or the image subjected to the synthesis processing.
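The synthesis of adjacent divided regions and the derivation of lower-resolution image data can be sketched as follows. Non-overlapping, equally sized tiles in row-major order are assumed for simplicity; real WSI stitching would also register and blend overlapping tile borders.

```python
import numpy as np

def stitch_tiles(tiles, grid_shape):
    """Assemble divided-region images (all the same size) into one
    wide-area image, and derive a half-resolution version by 2x2
    block averaging (a sketch of the synthesis step)."""
    rows, cols = grid_shape
    th, tw = tiles[0].shape
    mosaic = np.zeros((rows * th, cols * tw), dtype=np.float64)
    for idx, tile in enumerate(tiles):
        r, c = divmod(idx, cols)          # row-major tile placement
        mosaic[r * th:(r + 1) * th, c * tw:(c + 1) * tw] = tile
    # Lower-resolution image data via 2x2 block averaging.
    h2, w2 = mosaic.shape[0] // 2, mosaic.shape[1] // 2
    low = mosaic[:h2 * 2, :w2 * 2].reshape(h2, 2, w2, 2).mean(axis=(1, 3))
    return mosaic, low
```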
  • the microscope device can first identify an imaging target region.
  • the imaging target region may be specified so as to cover the entire region where the biological sample exists, or may be specified so as to cover a target portion of the biological sample (a target tissue section or a target cell-containing portion).
  • the microscope device scans a part of the imaging target area (also referred to as a "divided scan area") in one direction (also referred to as a "scanning direction") in a plane perpendicular to the optical axis to capture an image.
  • after the scanning of one divided scan area is completed, the next divided scan area adjacent to it is scanned. These scanning operations are repeated until the entire imaging target area is imaged.
  • as shown in the figure, the microscope device specifies a region (gray portion) in which a tissue section exists in the biological sample S as an imaging target region Sa. Then, the microscope device scans the divided scan area Rs in the imaging target area Sa in the Y-axis direction. After completing the scanning of the divided scan region Rs, the microscope device scans the next divided scan region in the X-axis direction. This operation is repeated until scanning is completed for the entire imaging target area Sa.
  • the positional relationship between the microscope device and the sample placement section is adjusted for scanning each divided scan area and for imaging the next divided scan area after imaging a certain divided scan area. The adjustment may be performed by moving the microscope device, moving the sample placement unit, or moving both of them.
  • the imaging device that captures each divided scan area may be a one-dimensional imaging device (line sensor) or a two-dimensional imaging device (area sensor).
  • the signal acquisition section may capture an image of each divided area via an enlarging optical system.
  • the imaging of each divided scan area may be performed continuously while moving the microscope device and/or the sample placement unit.
  • the imaging target area may be divided such that the divided scan areas partially overlap each other, or the imaging target area may be divided so that the divided scan areas do not overlap.
  • Each divided scan area may be imaged multiple times by changing imaging conditions such as focal length and/or exposure time.
  • the information processing apparatus can combine a plurality of adjacent divided scan areas to generate image data of a wider area. By performing the synthesis processing over the entire imaging target area, it is possible to obtain an image of a wider area of the imaging target area. Further, image data with lower resolution can be generated from the image of the divided scan area or the image subjected to the synthesis processing.
  • FIG. 4 is a block diagram showing a more detailed configuration example of the microscope system 5000. As shown in FIG. 4, the microscope system 5000 further includes an image accumulation unit (storage unit) 502 and an operation unit 506.
  • for the storage unit 502, a storage device such as a non-volatile semiconductor memory or a hard disk drive is used.
  • Various control parameters, programs, and the like according to this embodiment may be stored in advance in the storage unit 502 .
  • the storage unit 502 accumulates images acquired by the signal acquisition unit (sensor) 5103 .
  • the operation unit 506 is composed of, for example, a keyboard and a mouse.
  • the operation unit 506 inputs an instruction signal according to the operation of an observer, e.g., a pathologist, to the information processing device 5120a.
  • the microscope device 5100 a has a sensor control section 508 and a stage control section 514 .
  • the information processing device 5120a has an image acquisition unit 515, an image display unit 516, a UI control unit 518, a display control unit 520, and an image synthesizing unit 522. Details of the information processing device 5120a will be described later.
  • FIG. 5 is a diagram showing an objective lens 5102a and an aperture stop 5102b of the optical system of the optical section 5102 according to this embodiment.
  • a region 100 indicates a depth region in the depth direction along the optical axis L100 of the optical system to be imaged in one imaging. Note that the optical system of the optical unit 5102 according to this embodiment corresponds to the first optical system.
  • the sensor control unit 508 and the stage control unit 514 perform control to change the focal position of the objective lens 5102a in the Z direction along the optical axis L100 of the optical system and to capture a plurality of images. That is, the stage controller 514 changes the relative distance between the objective lens 5102a and the stage 5104, for example, with the aperture stop 5102b in a first state.
  • This aperture stop 5102b defines the thickness of the light beam, that is, the numerical aperture (NA). That is, in the first state, the aperture diaphragm 5102b is fixed in an open state in which the numerical aperture (NA) is expressed as, for example, "large”.
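The relationship between the aperture stop and the numerical aperture mentioned here follows the standard definition NA = n·sin(θ), with θ the half-angle of the admitted light cone and n the refractive index of the medium (1.0 for air). A one-line helper makes the dependence explicit; opening the stop widens θ and thus raises NA:

```python
import math

def numerical_aperture(n_medium, half_angle_deg):
    """NA = n * sin(theta): theta is the half-angle of the light cone
    admitted by the aperture stop, n the refractive index of the
    immersion medium (1.0 for air)."""
    return n_medium * math.sin(math.radians(half_angle_deg))
```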
  • one objective lens 5102a is illustrated in order to simplify the description, but the present invention is not limited to this. For example, a plurality of objective lenses may be provided.
  • the sensor control unit 508 captures an image and supplies it to the storage unit 502 in accordance with the change of the focal position.
  • next, the sensor control unit 508 and the stage control unit 514 move the stage 5104 in the horizontal direction, and the control for changing the relative distance between the objective lens 5102a and the stage 5104 in the Z direction along the optical axis L100 is repeated.
  • FIG. 6 is a diagram schematically showing the first image G100 captured under the control of the sensor control section 508 and the stage control section 514.
  • the first image G100 is composed of a plurality of first images G100a captured by changing the focal position.
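The capture loop implied by FIG. 6 — stepping the relative Z distance and recording one first image G100a per step — can be sketched as below. The `stage` and `sensor` driver objects and their method names are hypothetical stand-ins for the stage control unit 514 and sensor control unit 508, not an actual API:

```python
def acquire_focal_stack(stage, sensor, z_start, z_step, num_planes):
    """Capture the plurality of first images by stepping the relative
    distance between objective lens and stage along the optical axis.

    `stage` and `sensor` are hypothetical driver objects exposing
    `move_z(z)` and `capture()` (assumed interface for illustration).
    """
    stack = []
    for i in range(num_planes):
        stage.move_z(z_start + i * z_step)  # change the focal position
        stack.append(sensor.capture())      # store one first image G100a
    return stack
```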
  • the image acquisition unit 515 acquires the captured image stored in the storage unit 502.
  • the image acquisition unit 515 may be configured to acquire a captured image from the sensor 5103 .
  • the storage unit may be configured in the information processing device 5120a.
  • the image display unit 516 is, for example, a monitor.
  • the image display unit 516 displays an image generated by the image synthesizing unit 522 and a user interface (UI screen), as will be described later.
  • the UI control unit 518 causes the image display unit 516 to display an input UI screen for inputting an instruction signal from the observer.
  • the UI control unit 518 causes the image display unit 516 to display, for example, the UI screens shown in the figures described later.
  • the UI control unit 518 then supplies the observer's instruction signal to the image synthesizing unit 522 .
  • the display control unit 520 causes the image display unit 516 to display the UI screen generated by the UI control unit 518 and the image processed by the image synthesizing unit 522 as predetermined display images.
  • the display image is subjected to gamma conversion, for example.
  • the image synthesis unit 522 processes the image acquired by the image acquisition unit 515 using optical system information related to a second optical system different from the first optical system (see FIG. 5).
  • the second optical system means an optical system having a form different from that of the first optical system when capturing an image to be processed.
  • an optical system of an optical microscope is called a second optical system.
  • an optical system having the same objective lens 5102a but a different numerical aperture of the aperture stop 5102b or a different magnification is referred to as a second optical system.
  • an optical system using an objective lens different from the objective lens 5102a is referred to as a second optical system. Note that the image synthesizing unit 522 according to this embodiment corresponds to the processing unit.
  • FIG. 7 is a block diagram showing a detailed configuration example of the image synthesizing unit 522.
  • the image synthesizing section 522 has an image area synthesizing section 522a, an optical processing section 522b, a color processing section 522c, and a frequency processing section 522d.
  • the image synthesizing unit 522 stitches together the first images G100a captured in each region to generate a WSI (Whole Slide Imaging) image for each focal position.
  • WSI images corresponding to the number of first images G100a forming the first image G100 are generated. For example, when the first image G100a is captured at 20 different focal positions, 20 WSI images are generated.
  • Based on the information of the second optical system, the optical processing unit 522b uses the first image G100 captured by the first optical system to generate a second image corresponding to an image captured by the second optical system.
  • FIG. 8 is a diagram showing an example of the second optical system 5104.
  • the second optical system 5104 has an objective lens 5104a and an aperture stop 5104b.
  • a region 200 indicates a depth region in the depth direction along the optical axis L200 of the second optical system 5104 that is imaged in one imaging.
  • the aperture stop 5104b is fixed in an aperture state in which the width of the light beam (numerical aperture NA) is, for example, "small".
  • the aperture stop 5104b is narrower than the aperture stop 5102b (see FIG. 4). Therefore, the depth of field is deeper than that of the first optical system 5102 (see FIG. 4).
  • Information about the second optical system 5104 includes information about the characteristics of the objective lens 5104a, the aperture stop 5104b, and the magnification. Therefore, by using the information of the second optical system 5104, it is possible to calculate the depth of field when an image is captured by the second optical system. Alternatively, information on the focal position of the objective lens 5104a and information on the depth of field may be stored in advance in the storage unit 502 in association with the numerical aperture and magnification information of the second optical system.
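Since the depth of field is stated to be computable from the numerical aperture and magnification information of the second optical system, the idea can be sketched with the classical approximation DOF ≈ n·λ/NA². The actual relation used by the system is not given in the text, so the formula and the numeric values below are illustrative assumptions.

```python
def depth_of_field_um(wavelength_um, numerical_aperture, refractive_index=1.0):
    """Approximate depth of field (micrometres) from the numerical aperture.

    Classical approximation DOF ~ n * lambda / NA**2; the relation actually
    stored for the second optical system 5104 may differ (assumption).
    """
    return refractive_index * wavelength_um / numerical_aperture ** 2

# A narrower aperture stop (smaller NA) yields a deeper depth of field,
# consistent with the description of aperture stop 5104b above.
dof_first = depth_of_field_um(0.55, 0.8)   # wider aperture (first optical system)
dof_second = depth_of_field_um(0.55, 0.4)  # narrower aperture (second optical system)
```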
  • FIG. 9 is a diagram schematically explaining a processing example of the optical processing section 522b.
  • the optical axis L100 of the first optical system 5102 is shown shifted to facilitate understanding of the explanation, but the two optical axes are actually coaxial.
  • As indicated by the area 100, the depth of field of the first optical system 5102 is shallower than the area 200 indicating the depth of field of the second optical system 5104 (see FIG. 8). In addition, as will be described later with reference to FIG. 19, images are also captured while shifting in a direction perpendicular to the optical axis L100.
  • FIG. 10 is a diagram schematically showing the processing order of the image synthesizing unit 522.
  • the optical processing unit 522b adds a plurality of first images G100a to generate a second image G200a corresponding to the image captured by the second optical system 5104 (see FIG. 8).
  • the optical processing unit 522b generates the second image G200a according to, for example, formula (1): i = Σ ai·Ii, where i denotes the second image, Ii denotes the first image G100a at focal position i, and ai denotes the coefficient corresponding to the first image Ii.
  • the optical processing unit 522b generates the second image G200a by convolutionally combining the plurality of first images G100a based on the information of the second optical system 5104.
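The weighted addition of formula (1), i = Σ ai·Ii, can be sketched as a per-pixel weighted sum over the focal stack. Here the coefficients ai are supplied directly rather than derived from the second optical system's information, so the values are illustrative assumptions.

```python
import numpy as np

def synthesize_second_image(first_images, coefficients):
    """Per-pixel weighted sum over a focal stack, per formula (1): i = sum(ai * Ii)."""
    stack = np.stack(first_images, axis=0).astype(float)
    weights = np.asarray(coefficients, dtype=float).reshape(-1, 1, 1)
    return (stack * weights).sum(axis=0)

# Three focal slices G100a with equal weights emulate the wider addition
# range used when the second aperture is narrower (deeper depth of field).
slices = [np.full((4, 4), v) for v in (3.0, 6.0, 9.0)]
second = synthesize_second_image(slices, [1 / 3, 1 / 3, 1 / 3])
```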
  • the lower diagram of FIG. 10 is a diagram showing the result of processing according to the aperture area of the aperture stop 5104b, that is, the numerical aperture NA. That is, the numerical aperture NA increases as the aperture increases.
  • When the aperture of the second optical system 5104 is narrower than the aperture of the first optical system 5102, the optical processing unit 522b increases the addition range of the plurality of first images G100a in the Z direction.
  • That is, when the aperture of the second optical system 5104 is narrower than the aperture of the first optical system 5102, the optical processing unit 522b processes the first image G100 based on the plurality of first images G100a to generate a second image G200a having more image information in the optical axis direction L100 than a single first image G100a. In other words, the second image G200a has image information corresponding to an image with a deeper depth of field than a single first image G100a.
  • the display control section 520 causes the image display section 516 to display the second image G200a.
  • Conversely, when the aperture of the second optical system 5104 is wider than the aperture of the first optical system 5102, the optical processing unit 522b processes the first image G100 based on the plurality of first images G100a to generate a second image having less image information in the optical axis direction L100 than a single first image G100a.
  • FIG. 11 is a diagram showing an example in which the first image G100a and the second image G200a are displayed side by side.
  • the display control section 520 may cause the image display section 516 to display the second image G200a while the observer is viewing the first image G100.
  • the display image corresponding to the first image G100a and the display image corresponding to the second image G200a are displayed side by side on the screens W100a and W200a.
  • This enables the pathologist to objectively grasp the difference in diagnostic criteria between the first image G100a and the second image G200a.
  • As pathologists gain experience in image interpretation with this system, they can grasp the diagnostic criteria corresponding to the first image G100a, and more accurate diagnosis becomes possible even when using only the first image G100a.
  • Alternatively, a single first image G100a may be processed by the color processing unit 522c and the frequency processing unit 522d to obtain a second image G200a, and the second image G200a may be displayed on the image display unit 516.
  • the second image G200a processed using the first image G100 may be displayed on the image display unit 516 when all imaging in the depth direction Z is completed.
  • the optical processing unit 522b may pre-calculate the coefficient ai as the information of the second optical system 5104 and store it in the storage unit 502. In this case, the optical processing unit 522b reads out the coefficient ai from the storage unit 502 and performs processing for generating the second image G200a.
  • This coefficient can be calculated using the optical characteristic information of the objective lens 5104a and the aperture stop 5104b of the second optical system 5104. Alternatively, it may be calculated in advance by computer simulation based on the information of the objective lens 5104a and the aperture stop 5104b of the second optical system 5104.
  • the optical processor 522b is also capable of deconvolution. Let Ik be the first image and let ik be the second image, as shown in equation (2).
  • An example of expression (2) is a case where the second image ik has a larger numerical aperture than the first image G100a.
  • In equation (2), the coefficient ai indicates the coefficient corresponding to the first image Ik. That is, the optical processing unit 522b inverts the matrix expressed by equation (2), calculates new coefficients using the inverse matrix, and generates the second image ik using the plurality of first images G100a. In this way, the optical processing unit 522b can generate a second image G200a with a larger numerical aperture than the first image G100a by the deconvolution operation.
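The deconvolution of equation (2) can be sketched as inverting a mixing matrix that relates the observed first images to the second images. The explicit matrix form and the numeric values below are illustrative assumptions consistent with the inverse-matrix description above.

```python
import numpy as np

def deconvolve_stack(first_images, mixing_matrix):
    """Invert the relation of equation (2), I = A @ i, to get i = A^-1 @ I.

    first_images: list of K observed images Ik (equal shapes);
    mixing_matrix: K x K coefficient matrix A (assumed invertible).
    """
    stack = np.stack([img.ravel() for img in first_images]).astype(float)
    a_inv = np.linalg.inv(np.asarray(mixing_matrix, dtype=float))
    restored = a_inv @ stack
    shape = first_images[0].shape
    return [row.reshape(shape) for row in restored]

# Mix two "true" images with a known matrix, then recover them exactly.
truth = [np.array([[1.0, 2.0]]), np.array([[3.0, 4.0]])]
A = [[0.8, 0.2], [0.3, 0.7]]
observed = [A[0][0] * truth[0] + A[0][1] * truth[1],
            A[1][0] * truth[0] + A[1][1] * truth[1]]
recovered = deconvolve_stack(observed, A)
```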
  • the color processing unit 522c changes either the density or color of the first image G100a or the second image generated by the optical processing unit 522b according to the information of the second optical system 5104. In this case, either the density or the color may be changed with respect to the synthesized second image. Alternatively, either the density or the color may be changed for a single sheet of the first image G100a. For example, when the aperture of the second optical system 5104 is narrower than the aperture of the first optical system 5102, the color processing unit 522c performs processing to increase the density. Alternatively, when the aperture of the second optical system 5104 is narrower than the aperture of the first optical system 5102, the color processing unit 522c performs processing to change the color to a darker side.
  • Conversely, when the aperture of the second optical system 5104 is wider than the aperture of the first optical system 5102, the color processing unit 522c performs processing to lower the density. This makes it possible to bring the color tone closer to that of an image captured by the second optical system 5104.
  • the frequency processing unit 522d performs processing for changing the spatial frequency (MTF) of the first image G100a or of the second image generated by the optical processing unit 522b according to the information of the second optical system 5104. For example, when the aperture of the second optical system 5104 is narrower than the aperture of the first optical system 5102, the frequency processing unit 522d makes the spatial frequency (MTF) lower than that of the first image G100a. On the other hand, when the aperture of the second optical system 5104 is wider than the aperture of the first optical system 5102, the frequency processing unit 522d makes the spatial frequency (MTF) higher than that of the first image G100a. This makes it possible to bring the spatial frequency (MTF) closer to that of an image captured by the second optical system 5104.
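The density and spatial-frequency (MTF) adjustments described for the color processing unit 522c and the frequency processing unit 522d could be sketched as a darkening step plus a blur. The specific gain value and the box-blur kernel are illustrative assumptions, not the patent's actual operations.

```python
import numpy as np

def adjust_for_narrower_aperture(img, density_gain=1.2, blur_radius=1):
    """Sketch for a second optical system with a narrower aperture:
    raise density (darken) and lower the spatial frequency (box blur
    standing in for the MTF change).  Parameters are assumptions.
    """
    darker = np.clip(img / density_gain, 0.0, 1.0)  # higher density
    k = 2 * blur_radius + 1
    kernel = np.ones(k) / k
    # Separable box blur attenuates high spatial frequencies (lower MTF).
    blurred = np.apply_along_axis(
        lambda r: np.convolve(r, kernel, mode="same"), 1, darker)
    blurred = np.apply_along_axis(
        lambda c: np.convolve(c, kernel, mode="same"), 0, blurred)
    return blurred

img = np.zeros((5, 5))
img[2, 2] = 1.0  # single bright pixel
out = adjust_for_narrower_aperture(img)
```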
  • FIG. 12 is a diagram showing an example of a UI screen generated by the UI control unit 518. The observer designates mode buttons U100 to U104 displayed on the image display unit 516 via the operation unit 506. Note that the UI control unit 518 according to this embodiment corresponds to the selection unit.
  • FIG. 13 is a diagram showing the processing sequence when the mode button U100 is selected.
  • the vertical axis indicates numerical aperture, and the horizontal axis indicates magnification.
  • a processing sequence MU100 indicates a processing sequence when the mode button U100 is designated, and a processing sequence MU200 indicates a processing sequence when the mode button U102 is designated.
  • the UI control unit 518 may cause the image display unit 516 to display the example image shown in FIG. 13 in conjunction with the UI screens shown in FIGS. 12 and 14 . As a result, the operator can more objectively grasp the details of the operation.
  • FIG. 14 is a diagram showing a UI example for instructing the amount and magnification of the aperture.
  • a region U140 is a button for inputting a numerical aperture NA as a quantity related to aperture
  • a region U142 is a button for inputting a magnification.
  • the observer designates mode buttons U140 and U142 via the operation unit 506 and inputs numerical values. Also, the observer designates one of the mode buttons U144 and U145 via the operation unit 506 and inputs a selection signal.
  • When the mode button U100 is selected, the optical processing unit 522b outputs the image captured by the first optical system 5102 without processing. That is, the first mode is a mode that does not perform processing for making the image captured by the first optical system 5102 correspond to the processing result of the second optical system.
  • The processing sequence in this case is MU100 (see FIG. 13), and the numerical aperture is fixed at, for example, 0.7.
  • the optical processing unit 522b changes the imaging magnification to generate the second image G200a.
  • When the mode button U102 is selected, the optical processing unit 522b processes and outputs the first image G100 using information 1 of the second optical system. That is, the second mode is a mode in which processing is performed to make the image captured by the first optical system 5102 correspond to the imaging result of the second optical system based on information 1.
  • Here, information 1 of the second optical system means, for example, information of the second optical system of the optical microscope of model number 100.
  • the MU200 is selected as the process sequence, and the numerical aperture and the magnification are linked. Therefore, for example, if a magnification of 60 times is input, the numerical aperture (NA) is set to 0.8. On the other hand, if a numerical aperture (NA) of 0.8 is input, a magnification of 60 is set. In this way, the inputs are interlocked with each other.
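The interlock between the magnification input and the numerical aperture input (e.g. 60 times ↔ NA 0.8) can be sketched with a preset lookup table, where entering either value fills in the other. Table entries other than (60, 0.8) are illustrative assumptions.

```python
# Preset magnification-to-NA table for processing sequence MU200.
# Only the (60, 0.8) pair is stated in the text; the rest are assumptions.
NA_BY_MAGNIFICATION = {10: 0.3, 20: 0.5, 40: 0.7, 60: 0.8}

def linked_settings(magnification=None, numerical_aperture=None):
    """Return (magnification, NA) with the unspecified value filled in
    from the preset table, mimicking the interlocked inputs U140/U142."""
    if magnification is not None:
        return magnification, NA_BY_MAGNIFICATION[magnification]
    if numerical_aperture is not None:
        for mag, na in NA_BY_MAGNIFICATION.items():
            if abs(na - numerical_aperture) < 1e-9:
                return mag, na
    raise ValueError("no preset matches the given input")
```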
  • In this case, the optical processing unit 522b generates a second image G200a using the first image G100 according to information 1 of the second optical system.
  • When the mode button U145 is selected, the input via the mode button U140 and the input via the mode button U142 can be performed independently.
  • In this case, the optical processing unit 522b increases the addition range of the plurality of first images G100a in the Z direction. That is, based on the plurality of first images G100a, the optical processing unit 522b processes the first image G100 to generate a second image G200a having image information corresponding to an image with a deeper depth of field than a single first image G100a.
  • the display control section 520 causes the image display section 516 to display the second image G200a.
  • When the mode button U104 is selected, the optical processing unit 522b uses the first image G100 to generate a second image G200a according to information 2 of the second optical system, in the same way as when the mode button U102 is selected. That is, the third mode is a mode in which processing is performed to make the image captured by the first optical system 5102 correspond to the imaging result of the second optical system based on information 2.
  • When the mode button U106 is selected, for example, information on the second optical system of the model number 300 digital microscope is read from the storage unit 502. The optical processing unit 522b then generates a second image G200a using the first image G100 according to information 3 of the second optical system. That is, the fourth mode is a mode in which processing is performed to make the image captured by the first optical system 5102 correspond to the imaging result of the second optical system based on information 3.
  • In this way, preset information of the second optical system is read from the storage unit 502 via the mode buttons U100 to U106, and the optical processing unit 522b generates a second image G200a using the first image G100 according to the read information of the second optical system. This makes it possible to easily generate captured images corresponding to various microscopes.
  • FIG. 15 is a diagram showing an example of a UI screen related to image processing generated by the UI control unit 518.
  • the observer designates mode buttons U152 to U156 displayed on the image display unit 516 via the operation unit 506.
  • For example, when the mode button U152 is selected, the color processing unit 522c and the frequency processing unit 522d change any one of the density, the color, and the spatial frequency of the first image G100a or of the second image G200a generated by the optical processing unit 522b, according to information 1 of the second optical system. That is, the fifth mode is a mode in which the first image G100a captured by the first optical system 5102 is processed so that any one of its density, color, and spatial frequency corresponds to the processing result of the second optical system based on information 1.
  • Similarly, when the mode button U154 is selected, the color processing unit 522c and the frequency processing unit 522d change any one of the density, the color, and the spatial frequency of the first image G100a or of the second image G200a generated by the optical processing unit 522b. That is, the sixth mode is a mode in which the image captured by the first optical system 5102 is processed so that any one of its density, color, and spatial frequency corresponds to the processing result of the second optical system.
  • Likewise, when the mode button U156 is selected, the seventh mode is a mode in which the image captured by the first optical system 5102 is processed so that any one of its density, color, and spatial frequency corresponds to the processing result of the second optical system.
  • FIG. 16 is a diagram showing an example of a UI screen related to special processing generated by the UI control unit 518.
  • FIG. 17 shows processing sequences corresponding to the modes selected in FIG. 16.
  • When the mode button U162 is selected, the optical processing unit 522b performs processing according to the processing sequence MU200, as when the mode button U102 (see FIG. 12) is selected.
  • That is, the standard mode is a mode corresponding to the imaging result of the second optical system.
  • When the mode button U164 is selected, the linkage between the numerical aperture input and the magnification input on the input UI (see FIG. 14) is changed so that a higher numerical aperture is linked to each magnification. As a result, in the high NA mode, a second image G200a with a deeper depth of field than in the standard mode is generated according to the processing sequence MU200H. That is, the high NA mode is a mode that generates a second image G200a with a deeper depth of field than the standard mode.
  • When the mode button U166 is selected, the linkage between the numerical aperture input and the magnification input on the input UI (see FIG. 14) is changed so that a lower numerical aperture is linked to each magnification.
  • As a result, in the low NA mode, a second image G200a with a shallower depth of field than in the standard mode is generated according to the processing sequence MU200L. That is, the low NA mode is a mode for generating a second image G200a with a shallower depth of field than the standard mode.
  • FIG. 18 is a diagram showing an example in which the focal position shifts according to the distance from the optical axis L300 of the objective lens 5108a of the second optical system.
  • the focal position of the objective lens 5108a of the second optical system is indicated by line L5108a, and the optical axis is indicated by L300.
  • the objective lens 5108a of the second optical system is, for example, an aspherical lens, and the line L5108a varies in distance from the surface layer of the sample according to the distance from the optical axis L300 of the second optical system.
  • the line L5108a approaches the surface layer of the sample as the distance from the optical axis L300 increases.
  • In this case, the optical processing unit 522b changes the value of the coefficient ai (see formula (1)) according to the distance from the optical axis L300 of the second optical system, based on the line L5108a representing the focal position characteristic of the objective lens 5108a of the second optical system. More specifically, the optical processing unit 522b changes the value of the coefficient ai so that the image information on the objective lens 5108a side increases as the distance from the optical axis L300 increases.
  • the special mode is a mode in which the second image G200a is generated from the group of images G100 captured by the first optical system 5102 according to the focal position of the objective lens 5108a of the second optical system.
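The special-mode adjustment above, where the coefficients ai shift weight toward objective-lens-side focal slices as the distance from the optical axis L300 grows, might be sketched as follows. The linear shift and the Gaussian weight profile are illustrative assumptions; the patent only states that ai changes with the radial distance.

```python
import numpy as np

def radial_coefficients(num_slices, r, r_max, base_focus_index):
    """Coefficients ai over a focal stack whose peak drifts toward the
    objective lens (lower slice index) as the radial distance r grows,
    emulating line L5108a approaching the sample surface off-axis.
    """
    shifted_focus = base_focus_index - (r / r_max) * 2.0  # assumed linear drift
    idx = np.arange(num_slices)
    a = np.exp(-0.5 * (idx - shifted_focus) ** 2)          # assumed Gaussian profile
    return a / a.sum()                                     # normalize the weights

on_axis = radial_coefficients(9, r=0.0, r_max=1.0, base_focus_index=4)
at_edge = radial_coefficients(9, r=1.0, r_max=1.0, base_focus_index=4)
```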
  • FIG. 19 is a diagram showing an example of experimentally generated coefficients related to formulas (1) and (2).
  • the checker chart is placed on the stage 5104 and imaged.
  • images are taken with a digital camera at magnifications of 10, 20, and 40 (steps S100, S102, and S104).
  • Next, the coefficients (see formula (1)) for changing the imaging magnification from 10 times to 20 times are calculated (steps S106 and S108).
  • the image synthesizing unit 522 performs scanner imaging at an imaging magnification of 40 (step S110). Then, using the coefficients generated in steps S106 and S108, a convolution operation is performed on an image captured at an imaging magnification of 40 (steps S112 and S114). Then, an inverse matrix calculation is performed to generate coefficients for deconvolution calculation with imaging magnifications from 20 times to 40 times and from 10 times to 40 times (steps S116, S118, S120). Similarly, the conversion of each density and color tone is measured, and the parameters of the color processing unit 522c when the imaging magnification is changed from 20 times to 40 times and from 10 times to 40 times are acquired.
  • the doctor sets the imaging magnification using the UI shown in FIG. 14, for example (step S128).
  • the optical processing unit 522b reads the coefficient corresponding to the imaging magnification from the storage unit 502 and uses the first image G100 to generate the second image G200a (step S130).
  • FIG. 20 is a flowchart showing a processing example of the microscope system 5000 according to this embodiment. As shown in FIG. 20, first, a pathological specimen is placed on the stage 5104 (step S200). Next, under the control of the control unit 5110a, the sensor 5103 captures an image of the pathological specimen (step S202).
  • The control unit 5110a determines whether or not a predetermined number of images have been captured (step S203). If it is determined that the predetermined number of images have not been captured (N in step S203), the position of the stage 5104 is changed in the optical axis direction Z via the stage control unit 514, and the processing from step S202 is repeated.
  • On the other hand, if it is determined that the predetermined number of images have been captured (Y in step S203), the first image G100 is accumulated in the image accumulation unit (storage unit) 502 (step S206).
  • the magnification and numerical aperture are input via the UI (see FIG. 14) (step S208).
  • The image synthesizing unit 522 reads the first image G100 from the image storage unit (storage unit) 502 (step S210) and generates a second image G200a according to the input magnification and numerical aperture (step S212).
  • the display control unit 520 causes the image display unit 516 to display the second image G200a (step S214), and ends the overall processing.
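The overall flow of FIG. 20 (capture a predetermined number of focal slices, accumulate them, synthesize a second image, display it) can be sketched as a simple loop. Here capture_slice, synthesize, and display are hypothetical stand-ins for the sensor 5103, the image synthesizing unit 522, and the image display unit 516.

```python
def run_acquisition(capture_slice, num_slices, synthesize, display):
    """Sketch of steps S200-S214: repeat imaging while the focal position
    changes, then synthesize and display the second image."""
    first_images = []
    for n in range(num_slices):          # S202-S203: repeat until the
        first_images.append(capture_slice(n))  # predetermined count is reached
    # S206: accumulate; S210/S212: generate second image; S214: display.
    second_image = synthesize(first_images)
    display(second_image)
    return second_image

shown = []
result = run_acquisition(
    capture_slice=lambda n: [n, n],      # dummy focal slices
    num_slices=3,
    synthesize=lambda imgs: sum(v for img in imgs for v in img),
    display=shown.append,
)
```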
  • FIG. 21 is a flowchart showing a processing example of the microscope system 5000 when the mode button U144 is pressed and the magnification and numerical aperture are linked.
  • When the mode button U144 is selected, the UI control unit 518 generates the numerical aperture according to the designated magnification (step S310).
  • a second image G200a is generated using the first image G100 read from the image accumulation unit (storage unit) 502 according to the numerical aperture to be used (step S312). In this way, when the magnification and the numerical aperture are interlocked, it becomes unnecessary to input the numerical aperture, for example, and it is possible to improve the processing efficiency.
  • As described above, in the microscope system 5000 according to this embodiment, the image acquisition unit 515 acquires from the image storage unit 502 the first image G100 captured via the first optical system 5102, and the image synthesizing unit 522 processes the first image G100 to generate the second image G200a. This makes it possible to generate a second image G200a corresponding to an image captured by the second optical system 5104 using the first image G100 captured by the first optical system 5102. Therefore, even an observer accustomed to using the second optical system 5104 can diagnose an image more easily.
  • The microscope system 5000 according to Modification 1 of the first embodiment differs from the microscope system 5000 according to the first embodiment in that the optical processing unit 522b can perform processing using the information of the field stop on the illumination side. Differences from the microscope system 5000 according to the first embodiment will be described below.
  • FIG. 22 is a block diagram showing a configuration example of the microscope system 5000 according to Modification 1 of the first embodiment.
  • the microscope system 5000 further includes an illumination-side field stop 530 and an illumination NA control section (illumination-side aperture section) 540 that controls the illumination-side field stop 530 .
  • the field stop 530 is arranged at a position conjugate with the aperture stop 5102b (see FIG. 5). Therefore, the field stop 530 on the illumination side can adjust the luminous flux, that is, the numerical aperture NA, like the aperture stop 5102b.
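As background for why the illumination-side stop behaves like the aperture stop, under partially coherent illumination the lateral resolution limit is commonly written d = λ / (NA_objective + NA_illumination), so the numerical aperture set by field stop 530 changes the effective resolution. This classical formula is general optics knowledge, not taken from the patent text.

```python
def resolution_limit_um(wavelength_um, na_objective, na_illumination):
    """Partially coherent resolution limit d = lambda / (NA_obj + NA_ill),
    a classical approximation used here for illustration only."""
    return wavelength_um / (na_objective + na_illumination)

d_open = resolution_limit_um(0.55, 0.8, 0.8)     # illumination stop fully open
d_stopped = resolution_limit_um(0.55, 0.8, 0.2)  # illumination stop narrowed
```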
  • FIG. 23 is a diagram showing an example of a UI screen generated by the UI control unit 518 according to Modification 1 of the first embodiment. It differs from the example of the UI screen according to the first embodiment (see FIG. 14) in that a mode button U146 is added.
  • the UI control unit 518 supplies the instructed illumination-side numerical aperture to the optical processing unit 522b of the image synthesizing unit 522 when the mode button U146 is selected.
  • In this case, the optical processing unit 522b generates a second image G200a using the first image G100 read from the image storage unit (storage unit) 502, based on the illumination-side numerical aperture and the numerical aperture of the aperture stop 5102b (see FIG. 5) of the second optical system.
  • More specifically, the coefficients shown in formula (1) are used to generate the second image G200a.
  • As described above, in Modification 1 of the first embodiment, the optical processing unit 522b generates the second image G200a based on the illumination-side numerical aperture and the numerical aperture of the aperture stop 5102b (see FIG. 5) of the second optical system. This makes it possible to more accurately generate a second image G200a corresponding to an image captured by the second optical system 5104, using the first image G100 captured with illumination corresponding to the illumination-side numerical aperture.
  • The microscope system 5000 according to Modification 2 of the first embodiment differs from the microscope system 5000 according to the first embodiment in that the second image G200a is generated without using the operation unit 506. Differences from the microscope system 5000 according to the first embodiment will be described below.
  • FIG. 24 is a block diagram showing a configuration example of a microscope system 5000 according to modification 2 of the first embodiment.
  • The image synthesizing unit 522 uses the first image G100 accumulated in the image accumulation unit 502 to generate, according to preset parameters, a second image G200a corresponding to an image captured by the second optical system 5104.
  • the image synthesizing unit 522 accumulates the generated second image G200a in the synthetic image accumulating unit 545.
  • the composite image storage unit 545 is connected to an in-hospital viewer system, for example, via a network.
  • FIG. 25 is a flowchart showing a processing example of the microscope system 5000 according to modification 2 of the first embodiment.
  • The image synthesizing unit 522 generates a second image G200a using the first image G100 read from the image storage unit (storage unit) 502 according to the preset magnification and numerical aperture (step S406). Subsequently, the image synthesizing unit 522 accumulates the generated second image G200a in the composite image accumulation unit 545.
  • the viewer system displays the second image G200a accumulated in the composite image accumulation unit 545 (step S214), and ends the overall processing.
  • As described above, in Modification 2 of the first embodiment, the image synthesizing unit 522 generates the second image G200a using the first image G100 read from the image storage unit (storage unit) 502 according to the preset magnification and numerical aperture. As a result, the second image G200a can be generated for the set specimen without the intervention of an observer, and pathological images can be generated more efficiently.
  • This technology can be configured as follows.
  • (1) An information processing device comprising:
  • an acquisition unit that acquires a first image captured through a first optical system; and
  • a processing unit that processes the first image to generate a second image using optical system information related to a second optical system different from the first optical system.
  • The information processing apparatus according to (1), wherein the first image is composed of a plurality of first images captured by changing the imaging position in the optical axis direction of the first optical system, and wherein, when the numerical aperture of the second optical system is smaller than the numerical aperture of the first optical system, the processing unit generates the second image based on the plurality of first images so as to have more image information in the optical axis direction than the first image.
  • The information processing apparatus, wherein the first image is composed of a plurality of first images captured by changing the imaging position in the optical axis direction of the first optical system, and the processing unit generates the second image based on the plurality of first images so as to have more image information in the optical axis direction than the first image.
  • The information processing apparatus according to (5), wherein the processing unit generates the second image based on the plurality of first images so as to have image information corresponding to an image with a deep depth of field.
  • The information processing apparatus according to (6), wherein, when the imaging magnification of the second optical system is lower than the imaging magnification of the first optical system, the processing unit generates the second image based on the plurality of first images so as to have image information corresponding to an image with a deep depth of field.
  • The information processing apparatus according to (5), wherein the first image is composed of a plurality of first images captured by changing the imaging position in the optical axis direction of the first optical system, and the processing unit changes the value of the coefficient according to the distance from the optical axis of the second optical system based on the characteristics of the objective lens of the second optical system.
  • The information processing apparatus, wherein the processing unit changes the value of the coefficient so that the image information on the objective lens side increases as the distance from the optical axis of the first optical system increases.
  • the information processing apparatus, wherein the processing unit lowers the spatial frequency of the second image below the spatial frequency of the first image when the numerical aperture of the second optical system is smaller than the numerical aperture of the first optical system.
  • the information processing apparatus, wherein the processing unit performs processing to lower the spatial frequency of the first image when the imaging magnification of the second optical system is lower than the imaging magnification of the first optical system.
  • the information processing apparatus, wherein the processing unit performs processing to increase the density of the first image when the numerical aperture of the second optical system is smaller than the numerical aperture of the first optical system.
  • the information processing apparatus, wherein the processing unit performs processing to increase the density of the first image when the imaging magnification of the second optical system is lower than the imaging magnification of the first optical system.
  • (16) further comprising a selection unit that selects, as the information related to the second optical system, from a plurality of pieces of different optical system information;
  • the information processing apparatus according to (1), wherein the processing unit changes the processing applied to the first image according to the information selected by the selection unit.
  • (17) further comprising a display control unit that displays, side by side on a display unit, a first display image corresponding to the first image and a second display image corresponding to the second image processed by the processing unit; the information processing apparatus according to (1).
  • the first display image is an image in which variation of the numerical aperture is suppressed even when the imaging magnification is changed,
  • and the second display image is an image in which the numerical aperture is varied according to the change in imaging magnification.
  • (18) an information processing method comprising: an acquisition step of acquiring a first image captured through a first optical system; and a processing step of processing the first image using optical system information related to a second optical system different from the first optical system.
  • (19) a microscope system comprising: a microscope device to which a first objective lens is connected and which acquires a first image of a biological tissue captured at at least two focal positions;
  • and an information processing device,
  • wherein the information processing device includes an acquisition unit that acquires the first image, and a processing unit that processes the first image using optical system information related to a second optical system different from the first optical system in the microscope device.
  • 515 image acquisition unit
  • 522 image synthesis unit (processing unit)
  • 5000 microscope system
  • 5102 first optical system
  • 5104 second optical system
  • 5120, 5120a information processing device
  • 5100 microscope device
  • G100 first image
  • G100a first image
  • G200a second image
  • L100 optical axis of the first optical system
  • L200 optical axis of the second optical system

Abstract

[Problem] To provide an information processing device, an information processing method, and a microscope system capable of providing an image better suited to the observer's preference. [Solution] This information processing device comprises: an acquisition unit that acquires a first image captured via a first optical system; and a processing unit that processes the first image to generate a second image, using optical system information relating to a second optical system different from the first optical system.

Description

Information processing device, information processing method, and microscope system
The present disclosure relates to an information processing device, an information processing method, and a microscope system.
A microscope system can electronically store and share images captured as digital data. Such digital images can be subjected to processing that is not possible with an optical microscope. For example, a microscope system reconstructs a whole slide imaging (WSI) image of a pathological specimen by stitching together high-magnification pathological images.
JP 2013-007849 A
However, a microscope system that captures digital data may be constrained by its optical system. For this reason, electronically reducing or enlarging a WSI image, for example, means reducing or enlarging an image captured under specific optical conditions, producing a difference from the image actually observed through an optical microscope. As a result, doctors (especially pathologists) accustomed to optical microscopes may perceive a shift in their diagnostic criteria when making a diagnosis from an image captured as digital data.
The present disclosure therefore provides an information processing device, an information processing method, and a microscope system that can provide an image better suited to the observer's preference.
In order to solve the above problems, according to the present disclosure there is provided an information processing device comprising:
an acquisition unit that acquires a first image captured through a first optical system; and
a processing unit that processes the first image to generate a second image, using optical system information related to a second optical system different from the first optical system.
The first image may be composed of a plurality of first images captured while changing the imaging position in the optical axis direction of the first optical system, and
the processing unit may generate the second image based on the plurality of first images so that image information in the optical axis direction is increased relative to the first images when the numerical aperture of the second optical system is smaller than the numerical aperture of the first optical system.
The first image may be composed of a plurality of first images captured while changing the imaging position in the optical axis direction of the first optical system, and
the processing unit may generate the second image based on the plurality of first images so that image information in the optical axis direction is increased relative to the first images when the magnification of the second optical system is lower than the magnification of the first optical system.
The processing unit may change the addition method used to add the plurality of first images according to the optical system information.
The processing unit may process the first image by multiplying each of the plurality of first images by a different coefficient based on the optical system information and adding the results.
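As a hedged illustration of this coefficient-weighted addition (the specific weights, the normalization, and the function name are assumptions for illustration, not taken from this disclosure), a minimal sketch might look like:

```python
import numpy as np

def synthesize_second_image(z_stack, coefficients):
    """Weighted sum of a z-stack of first images.

    z_stack: list of 2-D arrays captured at different positions along
             the optical axis of the first optical system.
    coefficients: one weight per slice, derived (by assumption) from the
             optical system information of the simulated second optical system.
    """
    if len(z_stack) != len(coefficients):
        raise ValueError("one coefficient is required per slice")
    coefficients = np.asarray(coefficients, dtype=float)
    # Normalize so the synthesized image keeps the original brightness range.
    coefficients = coefficients / coefficients.sum()
    stack = np.stack([np.asarray(s, dtype=float) for s in z_stack])
    # Sum over the slice axis: result has the same shape as one slice.
    return np.tensordot(coefficients, stack, axes=1)

# Example: three slices, with the center slice weighted most heavily.
slices = [np.full((4, 4), v) for v in (10.0, 20.0, 30.0)]
blended = synthesize_second_image(slices, [0.25, 0.5, 0.25])
```

Changing the coefficient vector is then one concrete way to realize the "change of addition method" mentioned above.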
When the numerical aperture of the second optical system is smaller than the numerical aperture of the first optical system, the processing unit may generate the second image based on the plurality of first images so that it has image information corresponding to an image with a deeper depth of field than the first images.
When the imaging magnification of the second optical system is lower than the imaging magnification of the first optical system, the processing unit may generate the second image based on the plurality of first images so that it has image information corresponding to an image with a deeper depth of field than the first images.
The processing unit may generate the second image with the numerical aperture of the second optical system and the imaging magnification of the second optical system linked to each other.
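One way to picture this linkage is that numerical aperture typically rises with objective magnification, and a smaller NA yields a deeper depth of field. The objective table and the Berek-style depth-of-field approximation d = lambda*n/NA^2 + n*e/(M*NA) below are generic microscopy assumptions used only for illustration; they are not values from this disclosure:

```python
def linked_na(magnification):
    """Return a typical dry-objective NA for a given magnification (assumed table)."""
    table = {4: 0.13, 10: 0.30, 20: 0.75, 40: 0.95}
    return table[magnification]

def depth_of_field_um(na, magnification, wavelength_um=0.55,
                      refractive_index=1.0, sensor_resolution_um=6.5):
    """Depth-of-field estimate: d = lambda*n/NA^2 + n*e/(M*NA)."""
    return (wavelength_um * refractive_index / na ** 2
            + refractive_index * sensor_resolution_um / (magnification * na))

# A lower-magnification (hence lower-NA) simulated second optical system
# has a deeper depth of field, which is why more image information along
# the optical axis should be retained in the second image.
dof_10x = depth_of_field_um(linked_na(10), 10)
dof_40x = depth_of_field_um(linked_na(40), 40)
```

Under these assumptions, selecting a magnification for the second optical system automatically fixes its NA and hence the target depth of field.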
The first image may be composed of a plurality of first images captured while changing the imaging position in the optical axis direction of the first optical system, and
the processing unit may change the value of the coefficient according to the distance from the optical axis of the second optical system, based on the characteristics of the objective lens of the second optical system.
The processing unit may change the value of the coefficient so that image information on the objective lens side increases as the distance from the optical axis of the first optical system increases.
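A hedged sketch of such radially varying coefficients follows. The Gaussian weighting and the linear shift of the emphasized slice toward the objective-lens side (assumed here to be the lower slice indices) are illustrative assumptions, intended to mimic a field-curvature-like characteristic of the simulated objective:

```python
import numpy as np

def radial_coefficients(num_slices, r, r_max, shift_per_r=2.0):
    """Per-slice weights whose emphasis moves toward the objective-lens-side
    slices as the radius r from the optical axis grows toward r_max."""
    center = (num_slices - 1) / 2.0 - shift_per_r * (r / r_max)
    idx = np.arange(num_slices)
    w = np.exp(-0.5 * (idx - center) ** 2)  # Gaussian emphasis around 'center'
    return w / w.sum()                      # normalized so weights sum to 1

on_axis = radial_coefficients(7, r=0.0, r_max=1.0)
edge = radial_coefficients(7, r=1.0, r_max=1.0)
# On axis the middle slice dominates; at the field edge the weight has
# shifted toward the objective-lens side of the z-stack.
```

Each pixel (or image region) would then be blended with the coefficient set matching its distance from the optical axis.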
The processing unit may perform processing to change either the density or the spatial frequency of the first image according to the optical system information.
When the numerical aperture of the second optical system is smaller than the numerical aperture of the first optical system, the processing unit may perform processing to lower the spatial frequency of the second image below the spatial frequency of the first image.
When the imaging magnification of the second optical system is lower than the imaging magnification of the first optical system, the processing unit may perform processing to lower the spatial frequency of the first image.
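Lowering the spatial frequency can be approximated by a low-pass filter, for example a separable Gaussian blur whose strength would grow as the simulated NA or magnification shrinks. The sigma choice and kernel construction below are illustrative assumptions, not the method specified by this disclosure:

```python
import numpy as np

def gaussian_kernel1d(sigma, radius):
    """Normalized 1-D Gaussian kernel of length 2*radius + 1."""
    x = np.arange(-radius, radius + 1, dtype=float)
    k = np.exp(-0.5 * (x / sigma) ** 2)
    return k / k.sum()

def lowpass(image, sigma):
    """Separable Gaussian low-pass: attenuates spatial frequencies above ~1/sigma."""
    radius = max(1, int(3 * sigma))
    k = gaussian_kernel1d(sigma, radius)
    padded = np.pad(image, radius, mode="edge")
    # Filter rows, then columns (separable 2-D convolution).
    rows = np.apply_along_axis(lambda r: np.convolve(r, k, mode="valid"), 1, padded)
    return np.apply_along_axis(lambda c: np.convolve(c, k, mode="valid"), 0, rows)

img = np.zeros((9, 9))
img[4, 4] = 1.0              # a single bright pixel (highest spatial frequency)
blurred = lowpass(img, sigma=1.0)
```

The impulse is spread over neighboring pixels while its total energy is preserved, which is the intended "lower spatial frequency" effect.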
When the numerical aperture of the second optical system is smaller than the numerical aperture of the first optical system, the processing unit may perform processing to increase the density of the first image.
When the imaging magnification of the second optical system is lower than the imaging magnification of the first optical system, the processing unit may perform processing to increase the density of the first image.
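Increasing the image density (making the image appear darker and denser) can be sketched as a gamma-style tone adjustment. Tying the gamma value to the ratio of the two numerical apertures is purely an assumed mapping for illustration:

```python
import numpy as np

def increase_density(image, na_first, na_second):
    """Darken the image when the simulated second NA is below the first NA.

    image: float array with values in [0, 1].
    The gamma = na_first / na_second mapping is an illustrative assumption.
    """
    if na_second >= na_first:
        return image
    gamma = na_first / na_second          # > 1 pushes mid-tones darker
    return np.clip(image, 0.0, 1.0) ** gamma

img = np.array([[0.2, 0.5],
                [0.8, 1.0]])
denser = increase_density(img, na_first=0.75, na_second=0.30)
# Mid-tones drop (density rises) while the black and white endpoints are kept.
```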
The information processing device may further include a selection unit that selects, as the information related to the second optical system, from a plurality of pieces of different optical system information, and
the processing unit may change the processing applied to the first image according to the information selected by the selection unit.
The information processing device may further include a display control unit that displays, side by side on a display unit, a first display image corresponding to the first image and a second display image corresponding to the second image processed by the processing unit.
The first display image may be an image in which variation of the numerical aperture is suppressed even when the imaging magnification is changed, and the second display image may be an image in which the numerical aperture is varied according to the change in imaging magnification.
In order to solve the above problems, according to the present disclosure there is provided an information processing method comprising:
an acquisition step of acquiring a first image captured through a first optical system; and
a processing step of processing the first image using optical system information related to a second optical system different from the first optical system.
In order to solve the above problems, according to the present disclosure there is provided a microscope system comprising:
a microscope device to which a first objective lens is connected and which acquires a first image of a biological tissue captured at at least two focal positions; and
an information processing device,
wherein the information processing device includes:
an acquisition unit that acquires the first image; and
a processing unit that processes the first image using optical system information related to a second optical system different from the first optical system in the microscope device.
Fig. 1 schematically shows the overall configuration of the microscope system.
Fig. 2 shows an example of an imaging method.
Fig. 3 shows an example of an imaging method.
Fig. 4 is a block diagram showing a more detailed configuration example of the microscope system 5000.
Fig. 5 shows the objective lens and aperture stop of the optical system of the optical section.
Fig. 6 schematically shows a first image captured under the control of the sensor control section and the stage control section.
Fig. 7 is a block diagram showing a detailed configuration example of the image synthesis unit.
Fig. 8 shows an example of the second optical system.
Fig. 9 schematically illustrates a processing example of the optical processing unit.
Fig. 10 schematically shows the processing order of the image synthesis unit.
Fig. 11 shows an example of displaying the first image and the second image side by side.
Fig. 12 shows an example of a UI screen generated by the UI control unit.
Fig. 13 shows the processing sequence when a mode button is selected.
Fig. 14 shows a UI example for specifying an aperture-related quantity and a magnification.
Fig. 15 shows an example of an image-processing-related UI screen generated by the UI control unit.
Fig. 16 shows an example of a special-processing-related UI screen generated by the UI control unit.
Fig. 17 shows the processing system corresponding to the mode selected in Fig. 16.
Fig. 18 shows an example in which the focal position shifts according to the distance from the optical axis of the objective lens of the second optical system.
Fig. 19 shows an example of experimentally generating the coefficients related to equations (1) and (2).
Fig. 20 is a flowchart showing a processing example of the microscope system according to the embodiment.
Fig. 21 is a flowchart showing a processing example of the microscope system when the magnification and the numerical aperture are linked.
Fig. 22 is a block diagram showing a configuration example of a microscope system according to Modification 1 of the embodiment.
Fig. 23 shows an example of a UI screen generated by the UI control unit according to Modification 1 of the first embodiment.
Fig. 24 is a block diagram showing a configuration example of a microscope system according to Modification 2 of the first embodiment.
Fig. 25 is a flowchart showing a processing example of the microscope system according to Modification 2 of the first embodiment.
Embodiments of an information processing device, an information processing method, and an imaging system will be described below with reference to the drawings. The description focuses on the main components of the information processing device, the information processing method, and the imaging system; components and functions not illustrated or described here may also exist. The following description does not exclude such components or functions.
(First embodiment)
Fig. 1 shows a configuration example of the microscope system of the present disclosure. A microscope system 5000 shown in Fig. 1 includes a microscope device 5100, a control section 5110, and an information processing section 5120. The microscope device 5100 includes a light irradiation section 5101, an optical section 5102, and a signal acquisition section 5103. The microscope device 5100 may further include a sample placement section 5104 on which a biological sample S is placed. Note that the configuration of the microscope device is not limited to that shown in Fig. 1; for example, the light irradiation section 5101 may exist outside the microscope device 5100, and a light source not included in the microscope device 5100 may be used as the light irradiation section 5101. The light irradiation section 5101 may be arranged so that the sample placement section 5104 is sandwiched between the light irradiation section 5101 and the optical section 5102, and may, for example, be arranged on the side where the optical section 5102 exists. The microscope device 5100 may be configured for one or more of bright-field observation, phase-contrast observation, differential interference observation, polarization observation, fluorescence observation, and dark-field observation.
The microscope system 5000 may be configured as a so-called whole slide imaging (WSI) system or digital pathology system, and can be used for pathological diagnosis. The microscope system 5000 may also be configured as a fluorescence imaging system, in particular a multiplex fluorescence imaging system.
For example, the microscope system 5000 may be used to perform intraoperative pathological diagnosis or remote pathological diagnosis. In intraoperative pathological diagnosis, while surgery is being performed, the microscope device 5100 acquires data of a biological sample S obtained from the subject of the surgery and transmits the data to the information processing section 5120. In remote pathological diagnosis, the microscope device 5100 can transmit the acquired data of the biological sample S to an information processing device 5120 located at a place away from the microscope device 5100 (such as another room or building). In these diagnoses, the information processing device 5120 receives and outputs the data, and a user of the information processing device 5120 can make a pathological diagnosis based on the output data.
(Biological sample)
The biological sample S may be a sample containing biological components. The biological components may be tissues, cells, liquid components of a living body (blood, urine, etc.), cultures, or living cells (cardiomyocytes, nerve cells, fertilized eggs, etc.).
The biological sample may be a solid, such as a specimen fixed with a fixing reagent such as paraffin or a solid formed by freezing. The biological sample can be a section of such a solid; a specific example is a section of a biopsy sample.
The biological sample may be one that has undergone processing such as staining or labeling. The processing may be staining for showing the morphology of biological components or for showing substances possessed by biological components (such as surface antigens); examples include HE (hematoxylin-eosin) staining and immunohistochemistry staining. The biological sample may have been processed with one or more reagents, and the reagents may be fluorescent dyes, chromogenic reagents, fluorescent proteins, or fluorescently labeled antibodies.
The specimen may be one prepared from a sample or tissue collected from the human body for the purpose of pathological diagnosis, clinical examination, or the like. The specimen is not limited to the human body and may be derived from animals, plants, or other materials. The properties of the specimen differ depending on the type of tissue used (for example, organ or cells), the type of target disease, the subject's attributes (for example, age, sex, blood type, or race), or the subject's lifestyle habits (for example, diet, exercise, or smoking). Specimens may be managed with identification information (bar code information, QR Code (trademark) information, or the like) that allows each specimen to be identified.
(Light irradiation section)
The light irradiation section 5101 comprises a light source for illuminating the biological sample S and an optical section that guides light emitted from the light source to the specimen. The light source may irradiate the biological sample with visible light, ultraviolet light, infrared light, or a combination thereof. The light source may be one or more of a halogen lamp, a laser light source, an LED lamp, a mercury lamp, and a xenon lamp. Multiple types and/or wavelengths of light source may be used in fluorescence observation and may be selected appropriately by those skilled in the art. The light irradiation section may have a transmissive, reflective, or epi-illumination (coaxial epi-illumination or side-illumination) configuration.
(Optical section)
The optical section 5102 is configured to guide light from the biological sample S to the signal acquisition section 5103. The optical section can be configured to allow the microscope device 5100 to observe or image the biological sample S.
The optical section 5102 may include an objective lens. The type of objective lens may be selected appropriately by those skilled in the art according to the observation method. The optical section may also include a relay lens for relaying the image magnified by the objective lens to the signal acquisition section. The optical section may further include optical components other than the objective lens and the relay lens, such as an eyepiece lens, a phase plate, and a condenser lens.
The optical section 5102 may further include a wavelength separation section configured to separate light having a predetermined wavelength from the light coming from the biological sample S. The wavelength separation section can be configured to selectively allow light of a predetermined wavelength or wavelength range to reach the signal acquisition section. The wavelength separation section may include, for example, one or more of a filter that selectively transmits light, a polarizing plate, a prism (e.g., a Wollaston prism), and a diffraction grating. The optical components included in the wavelength separation section may be arranged, for example, on the optical path from the objective lens to the signal acquisition section. The wavelength separation section is provided in the microscope device when fluorescence observation is performed, particularly when an excitation light irradiation section is included. The wavelength separation section may be configured to separate fluorescent light components from each other, or to separate white light from fluorescent light.
(Signal acquisition section)
The signal acquisition section 5103 can be configured to receive light from the biological sample S and convert the light into an electrical signal, in particular a digital electrical signal. The signal acquisition section may be configured to acquire data on the biological sample S based on the electrical signal. The signal acquisition section may be configured to acquire data of an image of the biological sample S (in particular a still image, a time-lapse image, or a moving image), and especially data of the image magnified by the optical section. The signal acquisition section includes one or more imaging elements, such as CMOS or CCD sensors, each having a plurality of pixels arranged one-dimensionally or two-dimensionally. The signal acquisition section may include an imaging element for acquiring low-resolution images and an imaging element for acquiring high-resolution images, or an imaging element for sensing (such as AF) and an imaging element for image output (such as observation). In addition to the plurality of pixels, the imaging element may include a signal processing section (including one, two, or all three of a CPU, a DSP, and memory) that performs signal processing using the pixel signals from each pixel, and an output control section that controls the output of the image data generated from the pixel signals and of the processed data generated by the signal processing section. Furthermore, the imaging element may include an asynchronous event detection sensor that detects, as an event, that a change in the brightness of a pixel photoelectrically converting incident light has exceeded a predetermined threshold. The imaging element including the plurality of pixels, the signal processing section, and the output control section may preferably be configured as a one-chip semiconductor device.
(Control section)
The control section 5110 controls imaging by the microscope device 5100. For imaging control, the control section can drive the movement of the optical section 5102 and/or the sample placement section 5104 to adjust the positional relationship between the optical section and the sample placement section. The control section 5110 can move the optical section and/or the sample placement section toward or away from each other (for example, along the optical axis direction of the objective lens). The control section may also move the optical section and/or the sample placement section in any direction within a plane perpendicular to the optical axis direction. The control section may control the light irradiation section 5101 and/or the signal acquisition section 5103 for imaging control.
(Sample placement section)
The sample placement section 5104 may be configured so that the position of the biological sample on the sample placement section can be fixed, and may be a so-called stage. The sample placement section 5104 can be configured to move the position of the biological sample in the optical axis direction of the objective lens and/or in a direction perpendicular to that optical axis direction.
(Information processing section)
The information processing section 5120 can acquire data (such as imaging data) acquired by the microscope device 5100 from the microscope device 5100. The information processing section can perform image processing on the imaging data. The image processing may include color separation processing. The color separation processing may include processing that extracts data of light components of a predetermined wavelength or wavelength range from the imaging data to generate image data, processing that removes data of light components of a predetermined wavelength or wavelength range from the imaging data, and the like. The image processing may also include autofluorescence separation processing, which separates the autofluorescence component and the dye component of a tissue section, and fluorescence separation processing, which separates the wavelengths of dyes having mutually different fluorescence wavelengths. In the autofluorescence separation processing, of a plurality of specimens that are identical or similar in properties, an autofluorescence signal extracted from one specimen may be used to remove the autofluorescence component from the image information of the other specimen.
The information processing section 5120 may transmit data for imaging control to the control section 5110, and the control section 5110, having received the data, may control imaging by the microscope device 5100 according to the data.
The information processing section 5120 may be configured as an information processing device such as a general-purpose computer, and may include a CPU, a RAM, and a ROM. The information processing section may be included in the housing of the microscope device 5100, or may be outside the housing. The various processes or functions of the information processing section may also be realized by a server computer or a cloud connected via a network.
The method of imaging the biological sample S with the microscope device 5100 may be appropriately selected by a person skilled in the art according to the type of biological sample, the purpose of imaging, and so on. Examples of such imaging methods are described below.
One example of an imaging method is as follows. The microscope device can first specify an imaging target region. The imaging target region may be specified so as to cover the entire region in which the biological sample exists, or so as to cover a target portion of the biological sample (a portion in which a target tissue section, target cell, or target lesion exists). Next, the microscope device divides the imaging target region into a plurality of divided regions of a predetermined size and sequentially images each divided region. An image of each divided region is thereby acquired.
As shown in FIG. 1, the microscope device specifies an imaging target region R that covers the entire biological sample S, and divides the imaging target region R into 16 divided regions. The microscope device then images the divided region R1, and can next image any region included in the imaging target region R, such as a region adjacent to the divided region R1. Imaging of divided regions is repeated until no unimaged divided region remains. A region other than the imaging target region R may also be imaged on the basis of the captured-image information of the divided regions.
After one divided region is imaged, the positional relationship between the microscope device and the sample mounting section is adjusted in order to image the next divided region. The adjustment may be performed by moving the microscope device, moving the sample mounting section, or both. In this example, the imaging device that images each divided region may be a two-dimensional imaging element (area sensor) or a one-dimensional imaging element (line sensor). The signal acquisition section may image each divided region via the optical section. Imaging of the divided regions may be performed continuously while the microscope device and/or the sample mounting section is moved, or the movement of the microscope device and/or the sample mounting section may be stopped for the imaging of each divided region. The imaging target region may be divided so that the divided regions partially overlap, or so that they do not overlap. Each divided region may be imaged multiple times under different imaging conditions, such as focal length and/or exposure time.
The information processing device can also generate image data of a wider region by combining the images of a plurality of adjacent divided regions. By performing this combining processing over the entire imaging target region, an image of a wider area of the imaging target region can be acquired. Image data with a lower resolution can also be generated from the images of the divided regions or from the combined image.
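As a minimal illustration of this combining step, the sketch below composes a 4 x 4 grid of equally sized, non-overlapping divided-region images (row-major order) into one wider image. The function name and the NumPy array representation are assumptions for illustration only; a real stitching pipeline would additionally handle overlap, registration, and blending.

```python
import numpy as np

def compose_tiles(tiles, rows, cols):
    """Compose a row-major grid of equally sized, non-overlapping
    divided-region images into one wider image."""
    th, tw = tiles[0].shape[:2]
    out = np.zeros((rows * th, cols * tw) + tiles[0].shape[2:],
                   dtype=tiles[0].dtype)
    for idx, tile in enumerate(tiles):
        r, c = divmod(idx, cols)
        out[r * th:(r + 1) * th, c * tw:(c + 1) * tw] = tile
    return out

# 16 divided regions (4 x 4), as in the example of FIG. 1
tiles = [np.full((8, 8), i, dtype=np.uint8) for i in range(16)]
wsi = compose_tiles(tiles, rows=4, cols=4)
assert wsi.shape == (32, 32)
```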
Another example of an imaging method is as follows. The microscope device can first specify an imaging target region. The imaging target region may be specified so as to cover the entire region in which the biological sample exists, or so as to cover a target portion of the biological sample (a portion in which a target tissue section or target cell exists). Next, the microscope device scans a part of the imaging target region (also called a "divided scan region") in one direction (also called a "scan direction") within a plane perpendicular to the optical axis and captures an image. When the scan of that divided scan region is completed, the divided scan region adjacent to it is scanned next. These scanning operations are repeated until the entire imaging target region has been imaged.
As shown in FIG. 3, the microscope device specifies the region of the biological sample S in which a tissue section exists (the gray portion) as the imaging target region Sa. The microscope device then scans a divided scan region Rs in the imaging target region Sa in the Y-axis direction. After completing the scan of the divided scan region Rs, the microscope device scans the adjacent divided scan region in the X-axis direction. This operation is repeated until scanning is completed for the entire imaging target region Sa.
The positional relationship between the microscope device and the sample mounting section is adjusted for scanning each divided scan region, and for imaging the next divided scan region after one divided scan region has been imaged. The adjustment may be performed by moving the microscope device, moving the sample mounting section, or both. In this example, the imaging device that images each divided scan region may be a one-dimensional imaging element (line sensor) or a two-dimensional imaging element (area sensor). The signal acquisition section may image each divided scan region via a magnifying optical system. Imaging of the divided scan regions may be performed continuously while the microscope device and/or the sample mounting section is moved. The imaging target region may be divided so that the divided scan regions partially overlap, or so that they do not overlap. Each divided scan region may be imaged multiple times under different imaging conditions, such as focal length and/or exposure time.
The information processing device can also generate image data of a wider region by combining a plurality of adjacent divided scan regions. By performing this combining processing over the entire imaging target region, an image of a wider area of the imaging target region can be acquired. Image data with a lower resolution can also be generated from the images of the divided scan regions or from the combined image.
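The lower-resolution image data mentioned above can be sketched as simple block-averaging. The helper name is hypothetical and the method deliberately minimal; production viewers typically build a full multi-level pyramid with proper pre-filtering.

```python
import numpy as np

def downsample(img, factor):
    """Generate lower-resolution image data by averaging
    factor x factor blocks of the input image."""
    h, w = img.shape[:2]
    h2, w2 = h // factor * factor, w // factor * factor   # crop to a multiple
    blocks = img[:h2, :w2].reshape(h2 // factor, factor, w2 // factor, factor)
    return blocks.mean(axis=(1, 3))

img = np.arange(64, dtype=float).reshape(8, 8)
low = downsample(img, 2)
assert low.shape == (4, 4)
assert low[0, 0] == (0 + 1 + 8 + 9) / 4   # mean of the top-left 2x2 block
```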
FIG. 4 is a block diagram showing a more detailed configuration example of the microscope system 5000. As shown in FIG. 4, the microscope system 5000 further includes an image accumulation unit (storage unit) 502 and an operation unit 506.
A storage device such as a non-volatile semiconductor memory or a hard disk drive is used as the storage unit 502. Various control parameters, programs, and the like according to this embodiment may be stored in the storage unit 502 in advance. The storage unit 502 also accumulates the images acquired by the signal acquisition unit (sensor) 5103.
The operation unit 506 is composed of, for example, a keyboard and a mouse. The operation unit 506 inputs an instruction signal according to the operation of an observer, for example a pathologist, to the information processing device 5120a.
The microscope device 5100a has a sensor control unit 508 and a stage control unit 514. The information processing device 5120a has an image acquisition unit 515, an image display unit 516, a UI control unit 518, a display control unit 520, and an image synthesis unit 522. Details of the information processing device 5120a are described later.
FIG. 5 is a diagram showing the objective lens 5102a and the aperture stop 5102b of the optical system of the optical section 5102 according to this embodiment. A region 100 indicates the depth region, along the optical axis L100 of the optical system, that is imaged in one imaging operation. The optical system of the optical section 5102 according to this embodiment corresponds to the first optical system.
As shown in FIG. 5, under the control of the control unit 5110a, the sensor control unit 508 and the stage control unit 514 perform control to change the focal position of the objective lens 5102a of the optical system in the Z direction along the optical axis L100 and to capture a plurality of images. That is, the stage control unit 514 changes the relative distance between, for example, the objective lens 5102a and the stage 5104 with the aperture stop 5102b in a first state. The aperture stop 5102b defines the thickness of the light beam, that is, the numerical aperture (NA). In the first state, the aperture stop 5102b is fixed in an aperture state in which the numerical aperture (NA) is, for example, "large". Although a single objective lens 5102a is illustrated in FIG. 5 for simplicity of description, this is not limiting; for example, the objective lens 5102a may be composed of a plurality of lenses.
The sensor control unit 508 captures an image at each changed focal position and supplies it to the storage unit 502. In one example of the control of this embodiment, in order to generate a WSI (Whole Slide Imaging) image, the sensor control unit 508 and the stage control unit 514 repeat control in which the stage 5104 is moved in the horizontal direction and, at each moved position, the relative distance between the objective lens 5102a and the stage 5104 is changed in the Z direction along the axis L100.
FIG. 6 is a diagram schematically showing a first image G100 captured under the control of the sensor control unit 508 and the stage control unit 514. The first image G100 is composed of a plurality of first images G100a captured while changing the focal position.
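The capture control described above (move the stage horizontally, then vary the objective-to-stage distance along the optical axis and capture one image per focal position) can be sketched as follows. `_Stage` and `_Sensor` are hypothetical stand-ins for the stage control unit 514 and the sensor control unit 508, not an actual driver API.

```python
class _Stage:
    """Hypothetical stage-driver stand-in (stage control unit 514)."""
    def move_xy(self, x, y):
        self.xy = (x, y)
    def move_z(self, z):
        self.z = z

class _Sensor:
    """Hypothetical sensor stand-in (sensor control unit 508);
    capture() returns a dummy frame id instead of image data."""
    def __init__(self):
        self.frames = 0
    def capture(self):
        self.frames += 1
        return self.frames

def capture_focal_stack(stage, sensor, xy_positions, z_positions):
    """For each lateral (XY) position, capture one image per focal
    position, yielding a Z-stack of first images G100a per position."""
    stacks = {}
    for xy in xy_positions:
        stage.move_xy(*xy)
        stack = []
        for z in z_positions:
            stage.move_z(z)   # change the relative distance along L100
            stack.append(sensor.capture())
        stacks[xy] = stack
    return stacks

stage, sensor = _Stage(), _Sensor()
stacks = capture_focal_stack(stage, sensor, [(0, 0), (1, 0)], [0.0, 0.5, 1.0])
assert sensor.frames == 6          # 2 lateral positions x 3 focal planes
```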
As shown again in FIG. 4, the image acquisition unit 515 acquires the captured images stored in the storage unit 502. The image acquisition unit 515 may instead be configured to acquire the captured images from the sensor 5103; in this case, the storage unit may be provided in the information processing device 5120a.
The image display unit 516 is, for example, a monitor. As described later, the image display unit 516 displays the images generated by the image synthesis unit 522 and a user interface (UI screen).
The UI control unit 518 causes the image display unit 516 to display an input UI screen for inputting an instruction signal from the observer. The UI control unit 518 causes the image display unit 516 to display, for example, the UI screens shown in FIGS. 12, 14, and 15, which are described later. The UI control unit 518 then supplies the observer's instruction signal to the image synthesis unit 522.
The display control unit 520 causes the image display unit 516 to display the UI screen generated by the UI control unit 518 and the images processed by the image synthesis unit 522 as predetermined display images. The display images are subjected to, for example, gamma conversion.
The image synthesis unit 522 processes the images acquired by the image acquisition unit 515 using optical system information related to a second optical system different from the first optical system (see FIG. 5). The second optical system means an optical system of a form different from that of the first optical system used when the image to be processed was captured. For example, the optical system of an optical microscope is referred to as a second optical system. An optical system in which the objective lens 5102a is the same but either the numerical aperture of the aperture stop 5102b or the magnification differs is also referred to as a second optical system, as is an optical system that uses an objective lens different from the objective lens 5102a. The image synthesis unit 522 according to this embodiment corresponds to the processing unit.
FIG. 7 is a block diagram showing a detailed configuration example of the image synthesis unit 522. As shown in FIG. 7, the image synthesis unit 522 has an image region synthesis unit 522a, an optical processing unit 522b, a color processing unit 522c, and a frequency processing unit 522d.
As shown in FIG. 3, the image synthesis unit 522 stitches together the first images G100a captured in the respective regions at each focal position to generate WSI (Whole Slide Imaging) images. That is, WSI images are generated in a number corresponding to the number of first images G100a constituting the first image G100. For example, when the first images G100a are captured at 20 different focal positions, 20 WSI images are generated.
A processing example of the optical processing unit 522b is described with reference to FIGS. 8 to 10. Based on the information on the second optical system, the optical processing unit 522b uses the first image G100 captured by the first optical system to generate a second image corresponding to an image captured by the second optical system.
FIG. 8 is a diagram showing an example of the second optical system 5104. As shown in FIG. 8, the second optical system 5104 has, for example, an objective lens 5104a and an aperture stop 5104b. A region 200 indicates the depth region, along the optical axis L200 of the second optical system 5104, that is imaged in one imaging operation. The aperture stop 5104b is fixed in an aperture state in which the thickness of the light beam (numerical aperture NA) is, for example, "small". The aperture stop 5104b is narrower than the aperture stop 5102b (see FIG. 4), so the depth of field is deeper than that of the first optical system 5102 (see FIG. 4).
The information on the second optical system 5104 includes characteristic information on the objective lens 5104a, the aperture stop 5104b, the magnification, and the like. By using the information on the second optical system 5104, it is therefore possible to calculate the depth of field for imaging with the second optical system. Alternatively, the focal position from the objective lens 5104a and the depth-of-field information may be stored in advance in the storage unit 502 as information on the second optical system, in association with the numerical aperture and the magnification.
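For example, the depth of field can be estimated from the numerical aperture and the magnification. The formula below is a commonly used microscopy approximation (DOF = n*lambda/NA^2 + n*e/(M*NA)) and the function and parameter names are illustrative assumptions, not values taken from this disclosure.

```python
def depth_of_field_um(wavelength_um, na, magnification, n=1.0, pixel_um=0.0):
    """Estimate depth of field (micrometres) from NA and magnification
    using the common approximation DOF = n*lambda/NA^2 + n*e/(M*NA),
    where e is the detector resolution element (pixel size)."""
    return n * wavelength_um / na**2 + n * pixel_um / (magnification * na)

# the lower-NA second optical system has the deeper depth of field
dof_second = depth_of_field_um(0.55, na=0.3, magnification=10)
dof_first = depth_of_field_um(0.55, na=0.7, magnification=10)
assert dof_second > dof_first
```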
FIG. 9 is a diagram schematically explaining a processing example of the optical processing unit 522b. In FIG. 9, the optical axis L100 of the first optical system 5102 is drawn shifted for ease of understanding, but the two axes are actually coaxial. As shown in FIG. 9, the depth of field of the first optical system 5102, indicated by the region 100, is shallower than the region 200 indicating the depth of field of the second optical system 5104 (see FIG. 8). When imaging the checker chart of FIG. 19, described later, the optical axis L100 may be shifted in a direction orthogonal to the optical axis L100 as shown in FIG. 9, in order to capture the information needed to calculate the coefficients.
FIG. 10 is a diagram schematically showing the processing order of the image synthesis unit 522. As shown in FIG. 10, the optical processing unit 522b adds a plurality of first images G100a to generate a second image G200a corresponding to an image captured by the second optical system 5104 (see FIG. 8).
In this case, the optical processing unit 522b generates the second image G200a in accordance with, for example, equation (1). Here, i denotes the second image, Ii denotes the first image G100a at focal position i, and ai denotes the coefficient corresponding to the first image Ii. In this manner, the optical processing unit 522b convolves and combines the plurality of first images G100a based on the information on the second optical system 5104 to generate the second image G200a.
i = Σ_i a_i I_i    (1)
The lower part of FIG. 10 shows the results of processing according to the aperture area of the aperture stop 5104b, that is, the numerical aperture NA; the numerical aperture NA increases as the aperture widens. As can be seen, the optical processing unit 522b increases the addition range of the plurality of first images G100a in the Z direction as the numerical aperture NA of the second optical system 5104 becomes lower than the numerical aperture NA of the first optical system 5102.
As can also be seen, when the aperture of the second optical system 5104 is narrower than the aperture of the first optical system 5102, the optical processing unit 522b processes the first image G100, based on the plurality of first images G100a, to generate a second image in which the image information in the optical axis direction L100 is increased relative to a single first image G100a. That is, when the aperture of the second optical system 5104 is narrower than the aperture of the first optical system 5102, the optical processing unit 522b processes the first image G100 to generate the second image G200a so as to have image information corresponding to an image with a deeper depth of field than a single first image G100a, and the display control unit 520 causes the image display unit 516 to display the second image G200a. Conversely, when the aperture of the second optical system 5104 is wider than the aperture of the first optical system 5102, the optical processing unit 522b processes the first image G100, based on the plurality of first images G100a, to generate a second image in which the image information in the optical axis direction L100 is reduced relative to a single first image G100a.
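The weighted addition of equation (1) can be sketched as follows: the coefficients a_i select how wide a Z range of first images contributes, and a wider range of non-zero coefficients emulates a lower second NA (deeper depth of field). The function and variable names are assumptions for illustration.

```python
import numpy as np

def synthesize_second_image(first_images, coefficients):
    """Weighted addition over the focal stack, i = sum_i a_i * I_i
    (equation (1)); `coefficients` encodes the Z addition range."""
    acc = np.zeros_like(first_images[0], dtype=float)
    for a, img in zip(coefficients, first_images):
        acc += a * img
    return acc

# three focal planes with constant values 1, 2, 3
stack = [np.full((4, 4), v, dtype=float) for v in (1.0, 2.0, 3.0)]
# uniform coefficients over the whole stack (wide Z range, low second NA)
out = synthesize_second_image(stack, [1 / 3, 1 / 3, 1 / 3])
assert np.allclose(out, 2.0)
```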
FIG. 11 is a diagram showing an example in which the first image G100a and the second image G200a are displayed side by side. As shown in FIG. 11, the display control unit 520 may cause the image display unit 516 to display the second image G200a while the first image G100 is being observed. In this way, the display image corresponding to the first image G100a and the display image corresponding to the second image G200a are displayed side by side on screens W100a and W200a. This enables the pathologist to objectively grasp the difference in diagnostic criteria between the first image G100a and the second image G200a. As the pathologist accumulates interpretation experience with this system, the diagnostic criteria corresponding to the first image G100a are grasped, and a more accurate diagnosis becomes possible even when only the first image G100a is used.
When the second image is displayed on the image display unit 516 while the first image G100 is being observed, a second image G200a obtained by applying the processing of the color processing unit 522c and the frequency processing unit 522d to a single first image G100a may be displayed on the image display unit 516. In this case, when all imaging in the depth direction Z is completed, the second image G200a processed using the first image G100 may be displayed on the image display unit 516.
The optical processing unit 522b may calculate the coefficients ai in advance as information on the second optical system 5104 and store them in the storage unit 502. In this case, the optical processing unit 522b reads the coefficients ai from the storage unit 502 and performs the processing for generating the second image G200a. These coefficients can be calculated using the optical characteristic information of the objective lens 5104a and the aperture stop 5104b of the second optical system 5104, or may be calculated in advance by computer simulation based on the information on the objective lens 5104a and the aperture stop 5104b of the second optical system 5104.
The optical processing unit 522b is also capable of deconvolution. As shown in equation (2), let Ik be the first image and ik be the second image. The example of equation (2) is a case in which the second image ik has a larger numerical aperture than the first image G100a, and the coefficient ai corresponds to the first image Ik. That is, the optical processing unit 522b inverts the matrix equation of (2), calculates new coefficients from the inverse matrix, and generates the second images ik using the plurality of first images G100a. In this way, the optical processing unit 522b can generate, by deconvolution, a second image G200a with a larger numerical aperture than the first image G100a.
I = A i    (2)

where I = (I_1, …, I_n)^T is the stack of first images, i = (i_1, …, i_n)^T is the stack of second images, and A is the matrix of the coefficients ai.
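The inverse-matrix step can be sketched per pixel as a small linear solve. The coefficient matrix A below is an illustrative assumption, not values from this disclosure; it plays the role of the forward relation I = A i, and inverting it recovers the higher-NA second images.

```python
import numpy as np

# Hypothetical coefficient matrix A relating the stack of first images I
# to the stack of second images i as I = A @ i (equation (2)).
A = np.array([[0.6, 0.3, 0.1],
              [0.2, 0.6, 0.2],
              [0.1, 0.3, 0.6]])

i_true = np.array([1.0, 4.0, 2.0])   # per-pixel values of the second images
I = A @ i_true                       # what the first optical system records
i_rec = np.linalg.solve(A, I)        # inverse-matrix (deconvolution) step
assert np.allclose(i_rec, i_true)
```

In practice a regularized inverse would be used instead of an exact solve, since A is typically ill-conditioned for real optical blur.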
The color processing unit 522c performs processing to change either the density or the color of the first image G100a, or of the second image generated by the optical processing unit 522b, according to the information on the second optical system 5104. In this case, either the density or the color may be changed for the combined second image, or for a single first image G100a. For example, when the aperture of the second optical system 5104 is narrower than the aperture of the first optical system 5102, the color processing unit 522c performs processing to increase the density, or to shift the color toward the dark side. Conversely, when the aperture of the second optical system 5104 is wider than the aperture of the first optical system 5102, the color processing unit 522c performs processing to lower the density. This makes it possible to bring the color tone closer to that of an image captured by the second optical system 5104.
The frequency processing unit 522d performs processing to change the spatial frequency response (MTF) of the first image G100a, or of the second image generated by the optical processing unit 522b, according to the information on the second optical system 5104. For example, when the aperture of the second optical system 5104 is narrower than the aperture of the first optical system 5102, the frequency processing unit 522d performs processing to lower the spatial frequency response (MTF) of the first image G100a, or of the second image generated by the optical processing unit 522b, below that of the first image G100a. Conversely, when the aperture of the second optical system 5104 is wider than the aperture of the first optical system 5102, the frequency processing unit 522d performs processing to raise the spatial frequency response (MTF) above that of the first image G100a. This makes it possible to bring the spatial frequency response (MTF) closer to that of an image captured by the second optical system 5104.
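One simple way to emulate such an MTF change is separable convolution: a normalized low-pass kernel lowers the spatial-frequency response (narrower second aperture), while a sharpening kernel would raise it (wider second aperture). The helper and the kernel below are illustrative assumptions, not the actual filter of the disclosure.

```python
import numpy as np

def adjust_mtf(img, kernel):
    """Change the spatial-frequency response by separable convolution
    with a 1-D kernel applied along rows and then columns."""
    k = np.asarray(kernel, dtype=float)
    out = np.apply_along_axis(lambda r: np.convolve(r, k, mode='same'), 1, img)
    return np.apply_along_axis(lambda c: np.convolve(c, k, mode='same'), 0, out)

img = np.zeros((5, 5))
img[2, 2] = 1.0                              # unit impulse
low = adjust_mtf(img, [0.25, 0.5, 0.25])     # low-pass: spreads the impulse
assert low[2, 2] == 0.25                     # peak attenuated
assert abs(low.sum() - 1.0) < 1e-9           # total energy preserved
```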
A processing example of the optical processing unit 522b via the UI control unit 518 is described with reference to FIGS. 12 to 14. FIG. 12 is a diagram showing an example of a UI screen generated by the UI control unit 518. The observer designates one of the mode buttons U100 to U104 displayed on the image display unit 516 via the operation unit 506. The UI control unit 518 according to this embodiment corresponds to the selection unit.
FIG. 13 is a diagram showing the processing sequence when the mode button U100 is selected. The vertical axis indicates the numerical aperture, and the horizontal axis indicates the magnification. A processing sequence MU100 indicates the processing sequence when the mode button U100 is designated, and a processing sequence MU200 indicates the processing sequence when the mode button U102 is designated. The UI control unit 518 may also cause the image display unit 516 to display the example image of FIG. 13 in conjunction with the UI screens of FIGS. 12 and 14, which allows the operator to grasp the operation content more objectively.
 FIG. 14 is a diagram showing an example of a UI for specifying an aperture-related quantity and a magnification. Field U140 is a button for entering the numerical aperture NA as the aperture-related quantity, and field U142 is a button for entering the magnification. The observer selects the fields U140 and U142 via the operation unit 506 and enters numerical values. The observer also selects one of the mode buttons U144 and U145 via the operation unit 506 to input a selection signal.
 Returning to FIG. 12, when the mode button U100 is selected, the optical processing unit 522b outputs the image captured by the first optical system 5102 without processing it. That is, the first mode is a mode in which no processing is performed to make the image captured by the first optical system 5102 correspond to the processing result of the second optical system.
 The processing sequence in this case is MU100 (see FIG. 13). Since the numerical aperture is fixed at, for example, 0.7, the field U140 displays 0.7 and input via the field U140 is, for example, rejected. On the other hand, when an imaging magnification is entered via the field U142, the optical processing unit 522b changes the imaging magnification and generates the second image G200a.
 When the mode button U102 is selected, the optical processing unit 522b processes the first image G100 using the information on the second optical system 5102-1 and outputs the result. That is, the second mode is a mode in which processing is performed to make the image captured by the first optical system 5102 correspond to the processing result of the second optical system 5102-1. Here, the second optical system 5102-1 means, for example, that processing uses the information of the second optical system of an optical microscope of model number 100.
 As shown in FIG. 13, when the mode button U102 is selected and the mode button U144 is also selected, the processing sequence MU200 is used, and the numerical aperture and the magnification are linked. For example, entering a magnification of 60x sets the numerical aperture (NA) to 0.8; conversely, entering a numerical aperture (NA) of 0.8 sets the magnification to 60x. The two inputs are thus interlocked. The optical processing unit 522b generates the second image G200a from the first image G100 in accordance with the information on the second optical system 5102-1.
 On the other hand, when the mode button U102 is selected and the mode button U145 is selected, the input via the field U140 and the input via the field U142 can be made independently of each other.
 As can be seen from the above, when the numerical aperture and the magnification are linked, the optical processing unit 522b increases the range over which the plurality of first images G100a are added along the Z direction as the magnification of the second optical system 5104 becomes lower than that of the first optical system 5102. Likewise, as the magnification of the second optical system 5104 becomes lower than that of the first optical system 5102, the optical processing unit 522b processes the first image G100 so that the second image G200a, generated from the plurality of first images G100a, has image information corresponding to an image with a deeper depth of field than a single first image G100a. The display control unit 520 then causes the image display unit 516 to display the second image G200a.
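A minimal sketch of the magnification-linked Z addition described above; the linear rule mapping the magnification ratio to the number of added slices, and the plain averaging of the chosen slices, are assumptions made for illustration only.

```python
def synthesize_efi(z_stack, mag_first, mag_second):
    """Extended-depth-of-field synthesis sketch: as the target (second)
    magnification drops below the capture (first) magnification, more
    Z slices are averaged into the output. `z_stack` is a list of
    equally sized 1-D slices ordered along the optical axis Z."""
    ratio = mag_first / mag_second               # > 1 when target mag is lower
    n = min(len(z_stack), max(1, round(ratio)))  # slices to add (assumed rule)
    mid = len(z_stack) // 2                      # center the window on focus
    lo = max(0, mid - n // 2)
    chosen = z_stack[lo:lo + n]
    length = len(z_stack[0])
    return [sum(s[i] for s in chosen) / len(chosen) for i in range(length)]
```

At equal magnifications only the in-focus slice is used; at a quarter of the capture magnification the whole three-slice stack contributes, deepening the apparent depth of field.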
 Returning again to FIG. 12, when the mode button U104 is selected, the information of the second optical system of an optical microscope of model number 200, different from model number 100, is read from the storage unit 502, for example. The optical processing unit 522b then generates the second image G200a from the first image G100 in accordance with the information on the second optical system 5102-2, in the same way as when the mode button U102 is selected. That is, the third mode is a mode in which processing is performed to make the image captured by the first optical system 5102 correspond to the imaging result of the second optical system 5102-2.
 When the mode button U106 is selected, the information of the second optical system of, for example, a digital microscope of model number 300 is read from the storage unit 502. The optical processing unit 522b then generates the second image G200a from the first image G100 in accordance with the information on the second optical system 5102-3. That is, the fourth mode is a mode in which processing is performed to make the image captured by the first optical system 5102 correspond to the imaging result of the second optical system 5102-3.
 In this way, preset information on the second optical system 5102 is read from the storage unit 502 via the mode buttons U100 to U106, and the optical processing unit 522b generates the second image G200a from the first image G100 in accordance with the information that was read. This makes it possible to easily generate captured images corresponding to a variety of microscopes.
 FIG. 15 is a diagram showing an example of a UI screen related to image processing generated by the UI control unit 518. The observer selects one of the mode buttons U152 to U156 displayed on the image display unit 516 via the operation unit 506. As shown in FIG. 15, when, for example, the mode button U152 is selected, the color processing unit 522c and the frequency processing unit 522d change any of the density, color, and spatial frequency of the first image G100a, or of the second image G200a generated by the optical processing unit 522b, in accordance with the information on the second optical system 5102-1. That is, the fifth mode is a mode in which processing is performed to make any of the density, color, and spatial frequency of the first image G100a captured by the first optical system 5102 correspond to the processing result of the second optical system 5102-1.
 Similarly, when the mode button U154 is selected, the color processing unit 522c and the frequency processing unit 522d change any of the density, color, and spatial frequency of the first image G100a, or of the second image G200a generated by the optical processing unit 522b, in accordance with the information on the second optical system 5102-2. That is, the sixth mode is a mode in which processing is performed to make any of the density, color, and spatial frequency of the image captured by the first optical system 5102 correspond to the processing result of the second optical system 5102-2.
 Similarly, when the mode button U156 is selected, the color processing unit 522c and the frequency processing unit 522d change any of the density, color, and spatial frequency of the first image G100a, or of the second image G200a generated by the optical processing unit 522b, in accordance with the information on the second optical system 5102-3. That is, the seventh mode is a mode in which processing is performed to make any of the density, color, and spatial frequency of the image captured by the first optical system 5102 correspond to the processing result of the second optical system 5102-3.
 FIG. 16 is a diagram showing an example of a UI screen related to special processing generated by the UI control unit 518, and FIG. 17 is a diagram showing the processing sequences corresponding to the modes selected in FIG. 16. For example, when the mode button U162 is selected while the mode button U102 (see FIG. 12) is selected, the optical processing unit 522b performs processing according to the processing sequence MU200. The standard mode is thus a mode that makes the image correspond to the imaging result of the second optical system 5102.
 On the other hand, when the mode button U164 is selected, the linkage between the numerical aperture and the magnification in the input UI (see FIG. 14) is changed so that, for the same magnification, a higher numerical aperture is set than when the mode button U162 is selected. Consequently, in the high-NA mode, the processing sequence MU200H is followed and a second image G200a with a deeper depth of field than in the standard mode is generated. That is, the high-NA mode is a mode that generates a second image G200a with a deeper depth of field than the standard mode.
 On the other hand, when the mode button U166 is selected, the linkage between the numerical aperture and the magnification in the input UI (see FIG. 14) is changed so that, for the same magnification, a lower numerical aperture is set than when the mode button U162 is selected. Consequently, in the low-NA mode, the processing sequence MU200L is followed and a second image G200a with a shallower depth of field than in the standard mode is generated. That is, the low-NA mode is a mode that generates a second image G200a with a shallower depth of field than the standard mode.
 On the other hand, when the mode button U168 is selected, the processing illustrated in FIG. 18 is performed. FIG. 18 is a diagram showing an example in which the focal position shifts according to the distance from the optical axis L300 of the objective lens 5108a of the second optical system. The focal position of the objective lens 5108a of the second optical system is indicated by the line L5108a, and the optical axis is indicated by L300.
 As shown in FIG. 18, the objective lens 5108a of the second optical system is, for example, an aspherical lens, and the distance of the line L5108a from the surface of the sample varies with the distance from the optical axis L300 of the second optical system. For example, the line L5108a approaches the surface of the sample as the distance from the optical axis L300 increases.
 When the mode button U168 is selected, the optical processing unit 522b changes the values of the coefficients ai (see Eq. (1)) according to the distance from the optical axis L300 of the second optical system, based on the line L5108a that represents the focal position characteristic of the objective lens 5108a of the second optical system. More specifically, the optical processing unit 522b changes the values of the coefficients ai so that the image information on the objective lens 5108a side increases as the distance from the optical axis L300 increases; that is, the values of the coefficients ai on the objective lens 5108a side are increased as the distance from the axis L300 increases. As a result, in the second image G200a, more information from the first images G100a corresponding to the focal position of the objective lens 5108a is added, and a captured image close to the characteristics of the objective lens 5108a is constructed. That is, the special mode is a mode that generates the second image G200a from the group of images G100 captured by the first optical system 5102 according to the focal position of the objective lens 5108a of the second optical system.
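The radial variation of the coefficients ai described above can be illustrated as follows. The linear shift of the in-focus slice index toward the objective side and the 1/(1+|i-focus|) weighting profile are assumed stand-ins for the real lens characteristic represented by the line L5108a.

```python
def radial_coefficients(num_slices, r, r_max):
    """Per-slice weights a_i (cf. Eq. (1)) as a function of the distance r
    from the optical axis. Index num_slices-1 is taken to be the
    objective-lens side of the Z stack; on the axis (r = 0) the focus
    sits at the stack center, and it shifts toward the objective side
    as r grows (assumed mapping)."""
    focus = (num_slices - 1) / 2 + (r / r_max) * (num_slices - 1) / 2
    raw = [1.0 / (1.0 + abs(i - focus)) for i in range(num_slices)]
    total = sum(raw)
    return [w / total for w in raw]  # normalized so the a_i sum to 1
```

Off axis, the objective-side slices receive larger weights, so their image information dominates the weighted sum, matching the behavior the special mode calls for.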
 FIG. 19 is a diagram showing an example of generating the coefficients related to Eqs. (1) and (2) experimentally. As shown in FIG. 19, a checker chart is placed on the stage 5104 and imaged. In this case, images are captured with, for example, a digital camera at imaging magnifications of 10x, 20x, and 40x (steps S100, S102, S104). Based on these data, the coefficients (see Eq. (1)) used when changing the imaging magnification, for example from 40x to 20x and from 40x to 10x, are respectively calculated (steps S106, S108).
 Next, the image synthesizing unit 522 performs scanner imaging at an imaging magnification of 40x (step S110). Convolution operations using the coefficients generated in steps S106 and S108 are then applied to the image captured at the 40x imaging magnification (steps S112, S114). Inverse matrix operations are then performed to generate the coefficients for the deconvolution operations from 20x to 40x and from 10x to 40x (steps S116, S118, S120). Similarly, the density and color-tone conversions are measured, and the parameters of the color processing unit 522c for changing the imaging magnification from 20x to 40x and from 10x to 40x are acquired.
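The calibration pipeline of FIG. 19 can be sketched in miniature: a measured blur kernel is turned into a circulant convolution matrix (playing the role of the forward coefficients of steps S106 and S108), and its inverse supplies the deconvolution coefficients of steps S116 to S120. The 3-tap kernel, the tiny 1-D signal, and the plain Gauss-Jordan inversion are illustrative; real images would involve 2-D kernels and regularized inversion.

```python
def convolution_matrix(kernel, n):
    # Circulant matrix C such that matvec(C, x) is the circular
    # convolution of x with kernel (the magnification-lowering blur).
    m = [[0.0] * n for _ in range(n)]
    half = len(kernel) // 2
    for i in range(n):
        for j, k in enumerate(kernel):
            m[i][(i + j - half) % n] += k
    return m

def invert(m):
    # Gauss-Jordan inverse with partial pivoting (no singularity handling).
    n = len(m)
    a = [row[:] + [1.0 if i == j else 0.0 for j in range(n)]
         for i, row in enumerate(m)]
    for col in range(n):
        p = max(range(col, n), key=lambda r: abs(a[r][col]))
        a[col], a[p] = a[p], a[col]
        d = a[col][col]
        a[col] = [v / d for v in a[col]]
        for r in range(n):
            if r != col:
                f = a[r][col]
                a[r] = [v - f * w for v, w in zip(a[r], a[col])]
    return [row[n:] for row in a]

def matvec(m, v):
    return [sum(a * b for a, b in zip(row, v)) for row in m]
```

Applying the forward matrix blurs a signal; applying the inverse matrix to the blurred signal recovers the original, which is the round trip the coefficient generation in FIG. 19 relies on.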
 These data are then stored in the storage unit 502 in advance. The doctor then sets the imaging magnification via, for example, the UI shown in FIG. 14 (step S128). The optical processing unit 522b reads the coefficients corresponding to the imaging magnification from the storage unit 502 and generates the second image G200a from the first image G100 (step S130).
 FIG. 20 is a flowchart showing a processing example of the microscope system 5000 according to this embodiment. As shown in FIG. 20, first, a pathological specimen is placed on the stage 5103 (step S200). Next, under the control of the control unit 5110a, the sensor 5103 captures an image of the pathological specimen (step S202).
 Next, the control unit 5110a determines whether the predetermined number of images has been captured (step S203). If it determines that the predetermined number has not been captured (No in step S203), it changes the position of the stage 5103 in the optical axis direction Z via the stage control unit 514 and repeats the processing from step S202.
 On the other hand, if it determines that the predetermined number of images has been captured (Yes in step S203), it stores the first images G100 in the image storage unit (storage unit) 502 (step S206).
 Next, under the control of the UI control unit 518, the magnification and the numerical aperture are entered via the UI (see FIG. 14) (step S208).
 Next, the image synthesizing unit 522 generates the second image G200a from the first image G100 read from the image storage unit (storage unit) 502 (step S212), according to the entered magnification and numerical aperture (step S210).
 The display control unit 520 then causes the image display unit 516 to display the second image G200a (step S214), and the overall processing ends.
 FIG. 21 is a flowchart showing a processing example of the microscope system 5000 when the mode button U144 is pressed and the magnification and the numerical aperture are linked. As shown in FIG. 21, when the mode button U144 is selected, the UI control unit 518 generates the numerical aperture according to the specified magnification (step S310), and the image synthesizing unit 522 generates the second image G200a from the first image G100 read from the image storage unit (storage unit) 502, according to the specified magnification and the linked numerical aperture (step S312). Linking the magnification and the numerical aperture in this way eliminates, for example, the need to enter the numerical aperture, which improves processing efficiency.
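The magnification-to-NA linkage of step S310 can be sketched as a lookup with linear interpolation along the MU200 curve. Apart from the one pair stated in the text (60x corresponding to NA 0.8), the sample points below are invented for illustration.

```python
# Assumed (magnification, NA) sample points on the MU200 curve;
# only the 60x -> 0.8 pair comes from the description above.
MU200_CURVE = [(10, 0.3), (20, 0.5), (40, 0.7), (60, 0.8), (100, 0.95)]

def na_from_magnification(mag):
    """Numerical aperture generated from the specified magnification
    when mode button U144 ties the two inputs together (step S310)."""
    pts = MU200_CURVE
    if mag <= pts[0][0]:
        return pts[0][1]
    for (m0, na0), (m1, na1) in zip(pts, pts[1:]):
        if mag <= m1:  # linear interpolation between neighboring points
            t = (mag - m0) / (m1 - m0)
            return na0 + t * (na1 - na0)
    return pts[-1][1]  # clamp beyond the last sample
```

With such a table the observer supplies only the magnification, and the matching NA is derived automatically, as the flowchart of FIG. 21 describes.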
 As described above, according to this embodiment, the image acquisition unit 515 acquires from the image storage unit 502 the first image G100 captured via the first optical system 5102, and the image synthesizing unit 522 processes the first image G100 using the optical system information related to the second optical system 5104, which differs from the first optical system 5102, to generate the second image G200a. This makes it possible to use the first image G100 captured via the first optical system 5102 to generate a second image G200a corresponding to an image captured by the second optical system 5104. An observer accustomed to using the second optical system 5104 can therefore diagnose the images more easily.
(Modification 1 of the first embodiment)
 The microscope system 5000 according to Modification 1 of the first embodiment differs from the microscope system 5000 according to the first embodiment in that the optical processing unit 522b can perform processing using information on the field stop on the illumination side. The differences from the microscope system 5000 according to the first embodiment are described below.
 FIG. 22 is a block diagram showing a configuration example of the microscope system 5000 according to Modification 1 of the first embodiment. As shown in FIG. 22, the microscope system 5000 further includes an illumination-side field stop 530 and an illumination NA control unit (illumination-side stop unit) 540 that controls the illumination-side field stop 530. The field stop 530 is arranged at a position conjugate with the aperture stop 5102b (see FIG. 5). The illumination-side field stop 530 can therefore adjust the luminous flux, that is, the numerical aperture NA, in the same way as the aperture stop 5102b.
 FIG. 23 is a diagram showing an example of a UI screen generated by the UI control unit 518 according to Modification 1 of the first embodiment. It differs from the UI screen example according to the first embodiment (see FIG. 14) in that a mode button U146 is added.
 As shown in FIG. 23, when the mode button U146 is selected, the UI control unit 518 supplies the specified illumination-side numerical aperture to the optical processing unit 522b of the image synthesizing unit 522. The optical processing unit 522b generates the second image G200a from the first image G100 read from the image storage unit (storage unit) 502, based on the illumination-side numerical aperture and the numerical aperture of the aperture stop 5102b of the second optical system (see FIG. 5). For example, when the corresponding numerical aperture of the aperture stop 5102b of the second optical system (see FIG. 5) is smaller than the illumination-side numerical aperture, the second image G200a is generated using the coefficients shown in Eq. (1). On the other hand, when the corresponding numerical aperture of the aperture stop 5102b of the second optical system (see FIG. 5) is larger than the illumination-side numerical aperture, the second image G200a is generated using the coefficients obtained from the inverse transformation matrix of the coefficients shown in Eq. (2).
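The branch just described amounts to selecting between the forward Eq. (1) coefficients and the inverse-matrix Eq. (2) coefficients by comparing the two numerical apertures. The sketch below performs only that selection; the function name and the placeholder coefficient lists are assumptions for illustration.

```python
def select_coefficients(na_illumination, na_target, forward, inverse):
    """Pick the coefficient set used by the optical processing unit 522b
    in Modification 1 (assumed interface): the forward (Eq. (1)) set when
    the target aperture stop is the narrower one, the inverse-matrix
    (Eq. (2)) set when it is the wider one, none when they match."""
    if na_target < na_illumination:
        return forward
    if na_target > na_illumination:
        return inverse
    return None  # equal apertures: no correction needed
```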
 As described above, according to this modification, the optical processing unit 522b generates the second image based on the illumination-side numerical aperture and the numerical aperture of the aperture stop 5102b of the second optical system (see FIG. 5). This makes it possible to generate, with higher accuracy, a second image G200a corresponding to an image captured by the second optical system 5104 from a first image G100 captured under illumination corresponding to the illumination-side numerical aperture.
(Modification 2 of the first embodiment)
 The microscope system 5000 according to Modification 2 of the first embodiment differs from the microscope system 5000 according to the first embodiment in that the second image G200a is generated without going through the operation unit 506. The differences from the microscope system 5000 according to the first embodiment are described below.
 FIG. 24 is a block diagram showing a configuration example of the microscope system 5000 according to Modification 2 of the first embodiment. As shown in FIG. 24, in the microscope system 5000 according to Modification 2 of the first embodiment, the image synthesizing unit 522 uses the first images G100 accumulated in the image storage unit 502 to generate, in accordance with preset parameters, the second image G200a corresponding to an image captured by the second optical system 5104. The image synthesizing unit 522 then stores the generated second image G200a in the composite image storage unit 545. The composite image storage unit 545 is connected to an in-hospital viewer system via, for example, a network.
 FIG. 25 is a flowchart showing a processing example of the microscope system 5000 according to Modification 2 of the first embodiment. As shown in FIG. 25, the image synthesizing unit 522 generates the second image G200a from the first image G100 read from the image storage unit (storage unit) 502, according to the preset magnification and numerical aperture (step S406). The image synthesizing unit 522 then stores the generated second image G200a in the composite image storage unit 545.
 Then, for example, the viewer system displays the second image G200a stored in the composite image storage unit 545 (step S214), and the overall processing ends.
 As described above, according to this modification, the image synthesizing unit 522 generates the second image G200a from the first image G100 read from the image storage unit (storage unit) 502, according to the preset magnification and numerical aperture. This makes it possible to generate the second image G200a from the loaded specimen without the intervention of an observer, so that pathological images can be generated more efficiently.
 Note that the present technology can also be configured as follows.
 (1) An information processing device comprising:
 an acquisition unit that acquires a first image captured via a first optical system; and
 a processing unit that performs processing that generates a second image by processing the first image using optical system information related to a second optical system different from the first optical system.
 (2) The information processing device according to (1), wherein the first image is composed of a plurality of first images captured while changing the imaging position in the optical axis direction of the first optical system, and
 the processing unit generates the second image based on the plurality of first images so that, when the numerical aperture of the second optical system is smaller than the numerical aperture of the first optical system, the image information in the optical axis direction is increased relative to the first images.
 (3) The information processing device according to (1), wherein the first image is composed of a plurality of first images captured while changing the imaging position in the optical axis direction of the first optical system, and
 the processing unit generates the second image based on the plurality of first images so that, when the magnification of the second optical system is lower than that of the first optical system, the image information in the optical axis direction is increased relative to the first images.
 (4) The information processing device according to (2) or (3), wherein the processing unit changes the addition method for adding the plurality of first images according to the optical system information.
 (5) The information processing device according to (4), wherein the processing unit processes the first image by multiplying each of the plurality of first images by a different coefficient based on the optical system information and adding the results.
 (6) The information processing device according to (5), wherein, when the numerical aperture of the second optical system is smaller than the numerical aperture of the first optical system, the processing unit generates the second image based on the plurality of first images so as to have image information corresponding to an image with a deeper depth of field than the first images.
 (7) The information processing device according to (6), wherein, when the imaging magnification of the second optical system is lower than the imaging magnification of the first optical system, the processing unit generates the second image based on the plurality of first images so as to have image information corresponding to an image with a deeper depth of field than the first images.
 (8) The information processing device according to (7), wherein the processing unit generates the second image by linking the numerical aperture of the second optical system and the imaging magnification of the second optical system.
 (9) The information processing device according to (5), wherein the first image is composed of a plurality of first images captured while changing the imaging position in the optical axis direction of the first optical system, and
 the processing unit changes the values of the coefficients according to the distance from the optical axis of the second optical system, based on the characteristics of the objective lens of the second optical system.
 (10) The information processing device according to (5), wherein the processing unit changes the values of the coefficients so that image information on the objective-lens side increases as the distance from the optical axis of the first optical system increases.
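Clauses (9)-(10) vary the per-plane coefficients with distance from the optical axis, so that planes nearer the objective lens contribute more toward the image periphery, mimicking the field curvature of the target objective. One hedged way to sketch this, with `lens_curvature` as a hypothetical stand-in for the objective-lens characteristic (lower z indices are taken to be the objective-lens side):

```python
import numpy as np

def radial_blend(stack, base_coeffs, lens_curvature=0.1):
    """Blend a z-stack with coefficients that depend on the distance from
    the optical axis (clauses (9)-(10)). Larger `lens_curvature` shifts
    the z-weights toward the objective-lens side faster as radius grows."""
    stack = np.asarray(stack, dtype=float)
    zc, h, w = stack.shape
    yy, xx = np.mgrid[0:h, 0:w]
    r = np.hypot(yy - (h - 1) / 2.0, xx - (w - 1) / 2.0)  # distance from axis
    z = np.arange(zc, dtype=float)[:, None, None]
    centre = (zc - 1) / 2.0 - lens_curvature * r          # per-pixel z centre
    weights = np.exp(-(z - centre) ** 2 / 2.0)
    weights *= np.asarray(base_coeffs, dtype=float)[:, None, None]
    weights /= weights.sum(axis=0, keepdims=True)         # normalize per pixel
    return (weights * stack).sum(axis=0)
```

On the optical axis the weights stay symmetric around the middle plane; toward the corners they tilt toward the first planes of the stack.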
 (11) The information processing device according to (1), wherein the processing unit performs processing that changes either the density or the spatial frequency of the first image according to the optical system information.
 (12) The information processing device according to (11), wherein the processing unit performs processing that lowers the spatial frequency of the second image below the spatial frequency of the first image when the numerical aperture of the second optical system is smaller than the numerical aperture of the first optical system.
 (13) The information processing device according to (11), wherein the processing unit performs processing that lowers the spatial frequency of the first image when the imaging magnification of the second optical system is lower than the imaging magnification of the first optical system.
 (14) The information processing device according to (11), wherein the processing unit performs processing that increases the density of the first image when the numerical aperture of the second optical system is smaller than the numerical aperture of the first optical system.
 (15) The information processing device according to (11), wherein the processing unit performs processing that increases the density of the first image when the imaging magnification of the second optical system is lower than the imaging magnification of the first optical system.
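Clauses (11)-(15) adjust the first image's spatial frequency and density when the target optics have a smaller numerical aperture or lower magnification than the capture optics. A sketch of one possible interpretation; the NA-proportional Fourier cutoff and the `density_gain` value are assumptions, not values from the publication:

```python
import numpy as np

def simulate_second_optics(image, na_first, na_second, mag_first, mag_second,
                           density_gain=1.2):
    """When the second (target) optical system has a smaller NA or lower
    imaging magnification than the first (capture) system, lower the
    image's spatial frequencies and raise its density."""
    out = np.asarray(image, dtype=float)
    if na_second < na_first or mag_second < mag_first:
        # Low-pass in the Fourier domain; microscope resolution scales
        # roughly linearly with NA, so scale the radial cutoff by NA ratio.
        h, w = out.shape
        f = np.fft.fftshift(np.fft.fft2(out))
        yy, xx = np.mgrid[0:h, 0:w]
        r = np.hypot(yy - h // 2, xx - w // 2)
        cutoff = 0.5 * min(h, w) * (na_second / na_first)
        f[r > cutoff] = 0.0
        out = np.real(np.fft.ifft2(np.fft.ifftshift(f)))
        # "Raising the density" is modelled here as increasing optical
        # density, i.e. darkening the transmitted-light image.
        out = np.clip(out / density_gain, 0.0, None)
    return out
```

When the two optical systems match, the image passes through unchanged; otherwise fine detail is suppressed and the image darkens, approximating the dimmer, lower-resolution view of a smaller-NA objective.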
 (16) The information processing device according to (1), further comprising a selection unit that selects from a plurality of pieces of different optical system information as the information related to the second optical system, wherein the processing unit changes the processing applied to the first image according to the information selected by the selection unit.
 (17) The information processing device according to (1), further comprising a display control unit that displays, side by side on a display unit, a first display image corresponding to the first image and a second display image corresponding to the second image processed by the processing unit.
 (18) The information processing device according to (17), wherein the first display image is an image in which variation of the numerical aperture is suppressed even when the imaging magnification is changed, and the second display image is an image in which the numerical aperture is varied according to the change in imaging magnification.
 (19) An information processing method comprising:
 an acquisition step of acquiring a first image captured via a first optical system; and
 a processing step of processing the first image using optical system information related to a second optical system different from the first optical system.
 (20) A microscope system comprising:
 a microscope device to which a first objective lens is connected and which acquires a first image captured at at least two focal positions of a biological tissue; and
 an information processing device,
 wherein the information processing device includes:
 an acquisition unit that acquires the first image; and
 a processing unit that processes the first image using optical system information related to a second optical system different from the first optical system of the microscope device.
 Aspects of the present disclosure are not limited to the individual embodiments described above, and include the various modifications that those skilled in the art may conceive; the effects of the present disclosure are likewise not limited to what is described above. That is, various additions, changes, and partial deletions are possible without departing from the conceptual idea and spirit of the present disclosure as derived from the content defined in the claims and equivalents thereof.
 515: image acquisition unit, 522: image synthesis unit (processing unit), 5000: microscope system, 5102: first optical system, 5104: second optical system, 5120, 5120a: information processing device, 5100: microscope device, G100: first image, G100a: one of the plurality of first images, G200a: second image, L100: optical axis of the first optical system, L200: optical axis of the second optical system.

Claims (20)

  1.  An information processing device comprising:
     an acquisition unit that acquires a first image captured via a first optical system; and
     a processing unit that performs processing that processes the first image using optical system information related to a second optical system different from the first optical system to generate a second image.
  2.  The information processing device according to claim 1, wherein
     the first image is composed of a plurality of first images captured while changing the imaging position along the optical axis of the first optical system, and
     when the numerical aperture of the second optical system is smaller than the numerical aperture of the first optical system, the processing unit generates, based on the plurality of first images, the second image so that it contains more image information in the optical axis direction than the first images.
  3.  The information processing device according to claim 1, wherein
     the first image is composed of a plurality of first images captured while changing the imaging position along the optical axis of the first optical system, and
     when the magnification of the second optical system is lower than that of the first optical system, the processing unit generates, based on the plurality of first images, the second image so that it contains more image information in the optical axis direction than the first images.
  4.  The information processing device according to claim 2, wherein the processing unit changes the addition method used to add the plurality of first images according to the optical system information.
  5.  The information processing device according to claim 4, wherein the processing unit processes the first image by multiplying each of the plurality of first images by a different coefficient based on the optical system information and adding the results.
  6.  The information processing device according to claim 5, wherein, when the numerical aperture of the second optical system is smaller than the numerical aperture of the first optical system, the processing unit generates the second image based on the plurality of first images so that it has image information corresponding to an image with a deeper depth of field than the first images.
  7.  The information processing device according to claim 6, wherein, when the imaging magnification of the second optical system is lower than the imaging magnification of the first optical system, the processing unit generates the second image based on the plurality of first images so that it has image information corresponding to an image with a deeper depth of field than the first images.
  8.  The information processing device according to claim 7, wherein the processing unit generates the second image by linking the numerical aperture of the second optical system with the imaging magnification of the second optical system.
  9.  The information processing device according to claim 5, wherein
     the first image is composed of a plurality of first images captured while changing the imaging position along the optical axis of the first optical system, and
     the processing unit changes the values of the coefficients according to the distance from the optical axis of the second optical system, based on characteristics of the objective lens of the second optical system.
  10.  The information processing device according to claim 5, wherein the processing unit changes the values of the coefficients so that image information on the objective-lens side increases as the distance from the optical axis of the first optical system increases.
  11.  The information processing device according to claim 1, wherein the processing unit performs processing that changes either the density or the spatial frequency of the first image according to the optical system information.
  12.  The information processing device according to claim 11, wherein the processing unit performs processing that lowers the spatial frequency of the second image below the spatial frequency of the first image when the numerical aperture of the second optical system is smaller than the numerical aperture of the first optical system.
  13.  The information processing device according to claim 11, wherein the processing unit performs processing that lowers the spatial frequency of the first image when the imaging magnification of the second optical system is lower than the imaging magnification of the first optical system.
  14.  The information processing device according to claim 11, wherein the processing unit performs processing that increases the density of the first image when the numerical aperture of the second optical system is smaller than the numerical aperture of the first optical system.
  15.  The information processing device according to claim 11, wherein the processing unit performs processing that increases the density of the first image when the imaging magnification of the second optical system is lower than the imaging magnification of the first optical system.
  16.  The information processing device according to claim 1, further comprising a selection unit that selects from a plurality of pieces of different optical system information as the information related to the second optical system, wherein
     the processing unit changes the processing applied to the first image according to the information selected by the selection unit.
  17.  The information processing device according to claim 1, further comprising a display control unit that causes a display unit to display, side by side, a first display image corresponding to the first image and a second display image corresponding to the second image processed by the processing unit.
  18.  The information processing device according to claim 17, wherein the first display image is an image in which variation of the depth of field is suppressed even when the imaging magnification is changed, and the second display image is an image in which the depth of field is varied according to the change in imaging magnification.
  19.  An information processing method comprising:
     an acquisition step of acquiring a first image captured via a first optical system; and
     a processing step of processing the first image using optical system information related to a second optical system different from the first optical system.
  20.  A microscope system comprising:
     a microscope device to which a first objective lens is connected and which acquires a first image captured at at least two focal positions of a biological tissue; and
     an information processing device,
     wherein the information processing device includes:
     an acquisition unit that acquires the first image; and
     a processing unit that processes the first image using optical system information related to a second optical system different from the first optical system of the microscope device.
PCT/JP2022/008728 2021-06-09 2022-03-02 Information processing device, information processing method, and microscope system WO2022259647A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2021-096824 2021-06-09
JP2021096824 2021-06-09

Publications (1)

Publication Number Publication Date
WO2022259647A1

Family

ID=84425146

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2022/008728 WO2022259647A1 (en) 2021-06-09 2022-03-02 Information processing device, information processing method, and microscope system

Country Status (1)

Country Link
WO (1) WO2022259647A1 (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2002247439A (en) * 2001-02-13 2002-08-30 Ricoh Co Ltd Image input unit, image input method, and computer readable recording medium stored with program to allow computer to execute the method
JP2013117848A (en) * 2011-12-02 2013-06-13 Canon Inc Image processing apparatus and image processing method
JP2014071207A (en) * 2012-09-28 2014-04-21 Canon Inc Image processing apparatus, imaging system, and image processing system


Similar Documents

Publication Publication Date Title
EP3776458B1 (en) Augmented reality microscope for pathology with overlay of quantitative biomarker data
JP6947841B2 (en) Augmented reality microscope for pathology
Yagi et al. Digital imaging in pathology: the case for standardization
RU2522123C2 (en) System and method for enhanced predictive autofocusing
JP5715371B2 (en) Imaging system and imaging method with improved depth of field
CN107850754A (en) The image-forming assembly focused on automatically with quick sample
JP5651423B2 (en) Imaging system and imaging method with improved depth of field
JP5826561B2 (en) Microscope system, specimen image generation method and program
JP2011085932A (en) System and method for imaging with enhanced depth of field
WO2022176396A1 (en) Information processing device, information processing method, computer program, and medical diagnosis system
WO2021177446A1 (en) Signal acquisition apparatus, signal acquisition system, and signal acquisition method
WO2022259647A1 (en) Information processing device, information processing method, and microscope system
WO2022050109A1 (en) Image processing device, image processing method, and image processing system
WO2022209262A1 (en) Lighting device for biological specimen observation device, biological specimen observation device, lighting device for observation device, and observation system
WO2022181263A1 (en) Medical image processing device, medical image processing method, and program
WO2022209349A1 (en) Lighting device for observation device, observation device, and observation system
WO2022202233A1 (en) Information processing device, information processing method, information processing system and conversion model
WO2022209443A1 (en) Medical image analysis device, medical image analysis method, and medical image analysis system
WO2022259648A1 (en) Information processing program, information processing device, information processing method, and microscope system
WO2020075226A1 (en) Image processing device operation method, image processing device, and image processing device operation program
WO2022201992A1 (en) Medical image analysis device, medical image analysis method, and medical image analysis system
JP2004151263A (en) Microscope device
WO2023149296A1 (en) Information processing device, biological sample observation system, and image generation method
WO2023276219A1 (en) Information processing device, biological sample observation system, and image generation method
WO2021220857A1 (en) Image processing device, image processing method, and image processing system

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 22819837; Country of ref document: EP; Kind code of ref document: A1)
NENP Non-entry into the national phase (Ref country code: DE)