WO2022259647A1 - Information processing device, information processing method, and microscope system - Google Patents


Info

Publication number
WO2022259647A1
WO2022259647A1 (PCT/JP2022/008728)
Authority
WO
WIPO (PCT)
Prior art keywords
image
optical system
information
processing unit
processing
Prior art date
Application number
PCT/JP2022/008728
Other languages
English (en)
Japanese (ja)
Inventor
悠策 中島
信裕 林
寿一 白木
拓哉 大嶋
祐伍 勝木
Original Assignee
Sony Group Corporation (ソニーグループ株式会社)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sony Group Corporation (ソニーグループ株式会社)
Publication of WO2022259647A1


Classifications

    • G: PHYSICS
    • G02: OPTICS
    • G02B: OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B21/00: Microscopes
    • G02B21/36: Microscopes arranged for photographic purposes or projection purposes or digital imaging or video purposes including associated control and data processing arrangements

Definitions

  • the present disclosure relates to an information processing device, an information processing method, and a microscope system.
  • the microscope system can electronically store and share images captured as digital data.
  • An image that is such digital data can be subjected to processing that is not possible with an optical microscope.
  • a microscope system reconstructs a WSI image (WSI: Whole Slide Imaging) of a pathological specimen by connecting high-magnification pathological images.
  • microscope systems that capture digital data may have limitations in their optics. For this reason, electronically reducing or enlarging a WSI image, for example, amounts to reducing or enlarging an image captured under specific optical conditions, which differs from the actual image observed through an optical microscope. As a result, doctors (especially pathologists) accustomed to optical microscopes may feel that their diagnostic criteria shift when making a diagnosis from an image captured as digital data. The present disclosure therefore provides an information processing device, an information processing method, and a microscope system that can provide an image better suited to the viewer's preferences.
  • an information processing device is provided that includes: an acquisition unit that acquires a first image captured through a first optical system; and a processing unit that processes the first image, using optical system information related to a second optical system different from the first optical system, to generate a second image.
  • the first image may be composed of a plurality of first images captured while changing the shooting position in the optical axis direction of the first optical system, and, when the numerical aperture of the second optical system is smaller than that of the first optical system, the processing unit may generate the second image based on the plurality of first images so that it contains more image information in the optical axis direction than the first image.
  • the first image may be composed of a plurality of first images captured while changing the shooting position in the optical axis direction of the first optical system, and the processing unit may generate the second image based on the plurality of first images so that it contains more image information in the optical axis direction than the first image.
  • the processing unit may change an addition method for adding the plurality of first images according to the optical system information.
  • the processing unit may process the first image by multiplying each of the plurality of first images by a different coefficient based on the optical system information and adding the results.
  • when the numerical aperture of the second optical system is smaller than that of the first optical system, the processing unit may generate the second image based on the plurality of first images so that it has image information corresponding to an image with a deeper depth of field than the first image.
  • when the imaging magnification of the second optical system is lower than that of the first optical system, the processing unit may generate the second image based on the plurality of first images so that it has image information corresponding to an image with a deeper depth of field than the first image.
  • the processing unit may generate the second image by linking the numerical aperture of the second optical system and the imaging magnification of the second optical system.
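The coefficient-weighted addition of the Z-stack described in the preceding paragraphs can be sketched as below. This is a minimal NumPy illustration, not the publication's implementation: the function name, the specific coefficient values, and the brightness-preserving normalization are all assumptions.

```python
import numpy as np

def synthesize_second_image(stack, coeffs):
    """Weighted sum of a Z-stack of first images.

    stack  : (n_z, H, W) array of first images captured at different
             focal positions along the optical axis.
    coeffs : (n_z,) weights derived from the second optical system's
             information; spreading weight across planes mixes in more
             optical-axis information (emulating a smaller NA and a
             deeper depth of field), while a peaked profile keeps the
             result close to a single in-focus plane.
    """
    coeffs = np.asarray(coeffs, dtype=float)
    coeffs = coeffs / coeffs.sum()  # normalize so overall brightness is preserved
    return np.tensordot(coeffs, stack, axes=1)

# Illustration with a 5-plane stack of random data.
stack = np.random.rand(5, 64, 64)
deep = synthesize_second_image(stack, np.ones(5))                     # flat weights
shallow = synthesize_second_image(stack, [0.0, 0.1, 0.8, 0.1, 0.0])  # peaked weights
```

Changing the weight profile per the optical system information corresponds to the claim that the addition method changes according to that information.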
  • the first image may be composed of a plurality of first images captured while changing the shooting position in the optical axis direction of the first optical system,
  • the processing unit may change the value of the coefficient according to the distance from the optical axis of the second optical system based on the characteristics of the objective lens of the second optical system.
  • the processing unit may change the value of the coefficient so that the image information on the objective lens side increases as the distance from the optical axis of the first optical system increases.
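A radially varying coefficient, as in the two paragraphs above, can be sketched by shifting the weight profile toward the objective-lens side as the distance from the optical axis grows (emulating, e.g., field curvature of the second optical system). The `curvature` parameter and the linear interpolation scheme are hypothetical, introduced only for illustration.

```python
import numpy as np

def radial_coeffs(stack_shape, base_coeffs, curvature=0.5):
    """Per-pixel weights whose profile shifts along Z as the distance
    from the optical axis increases. `curvature` (max shift in planes
    at the image corner) is an assumed tuning parameter."""
    n_z, h, w = stack_shape
    yy, xx = np.mgrid[0:h, 0:w]
    cy, cx = (h - 1) / 2.0, (w - 1) / 2.0
    r = np.hypot(yy - cy, xx - cx) / np.hypot(cy, cx)  # 0 on axis, 1 at corner
    shift = curvature * r                              # (h, w) plane shift
    z = np.arange(n_z)[:, None, None]
    base = np.asarray(base_coeffs, dtype=float)
    # linearly interpolate the base profile at the shifted plane index
    idx = np.clip(z - shift, 0, n_z - 1)
    lo = np.floor(idx).astype(int)
    hi = np.clip(lo + 1, 0, n_z - 1)
    frac = idx - lo
    coeffs = base[lo] * (1 - frac) + base[hi] * frac
    return coeffs / coeffs.sum(axis=0, keepdims=True)  # per-pixel normalization

def synthesize_with_field_curvature(stack, base_coeffs, curvature=0.5):
    c = radial_coeffs(stack.shape, base_coeffs, curvature)
    return (c * stack).sum(axis=0)

stack = np.random.rand(5, 32, 32)
out = synthesize_with_field_curvature(stack, [0.0, 0.1, 0.8, 0.1, 0.0])
```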
  • the processing unit may change either the density or the spatial frequency of the first image according to the optical system information.
  • the processing unit may reduce the spatial frequency of the second image below the spatial frequency of the first image.
  • the processing unit may perform processing to lower the spatial frequency of the first image when the imaging magnification of the second optical system is lower than the imaging magnification of the first optical system.
  • the processing unit may perform processing to increase the density of the first image when the numerical aperture of the second optical system is smaller than the numerical aperture of the first optical system.
  • the processing unit may perform processing to increase the density of the first image when the imaging magnification of the second optical system is lower than the imaging magnification of the first optical system.
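Lowering the spatial frequency when the second optical system's magnification is lower can be sketched as a frequency-domain low-pass. The mapping from the magnification ratio to the cutoff radius below is an assumption for illustration, not a relation stated in the publication.

```python
import numpy as np

def lowpass(first_image, mag_first, mag_second):
    """Suppress spatial frequencies above what the (lower-magnification)
    second optical system would resolve. The cutoff mapping is assumed."""
    img = first_image.astype(float)
    ratio = mag_second / mag_first        # < 1 when the target magnification is lower
    if ratio >= 1.0:
        return img                        # nothing to remove
    f = np.fft.fftshift(np.fft.fft2(img))
    h, w = img.shape
    yy, xx = np.mgrid[0:h, 0:w]
    r = np.hypot(yy - h / 2, xx - w / 2)
    cutoff = ratio * min(h, w) / 2        # keep only the low-frequency disc
    f[r > cutoff] = 0
    return np.fft.ifft2(np.fft.ifftshift(f)).real

img = np.random.rand(64, 64)
out = lowpass(img, mag_first=40, mag_second=20)
```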
  • the processing unit may change the processing applied to the first image according to information selected by a selection unit.
  • a display control unit may be further provided for displaying a first display image corresponding to the first image and a second display image corresponding to the second image processed by the processing unit side by side on the display unit.
  • the first display image may be an image in which a change in numerical aperture is suppressed even if the imaging magnification is changed
  • the second display image may be an image in which the numerical aperture is changed according to the change in imaging magnification.
  • an information processing method is provided, comprising: acquiring a first image captured through a first optical system; and processing the first image to generate a second image using optical system information related to a second optical system different from the first optical system.
  • a microscope system is provided, comprising: a microscope device to which a first objective lens is connected and which acquires a first image of living tissue captured at two or more focal positions; and an information processing device, wherein the information processing device includes an acquisition unit that acquires the first image, and a processing unit that processes the first image using optical system information related to a second optical system different from the first optical system in the microscope device.
  • FIG. 4A and 4B are diagrams showing an example of an imaging method
  • FIG. 4A and 4B are diagrams showing an example of an imaging method
  • FIG. 3 is a block diagram showing a more detailed configuration example of the microscope system 5000
  • FIG. 4 is a diagram showing an objective lens and an aperture stop of an optical system of an optical section
  • FIG. 4 is a diagram schematically showing a first image captured under the control of a sensor control section and a stage control section
  • FIG. 2 is a block diagram showing a detailed configuration example of an image synthesizing unit
  • 4A and 4B are diagrams for schematically explaining a processing example of an optical processing unit;
  • FIG. 4 is a diagram schematically showing the processing order of an image synthesizing unit;
  • a diagram showing an example in which a first image and a second image are displayed side by side;
  • FIG. 4 is a diagram showing an example of a UI screen generated by a UI control unit;
  • a diagram showing a processing sequence when a mode button is selected;
  • a diagram showing an example of a UI for issuing an instruction;
  • FIG. 5 is a diagram showing an example of a UI screen related to image processing generated by a UI control unit;
  • FIG. 6 is a diagram showing an example of a UI screen related to special processing generated by the UI control unit;
  • FIG. 17 shows a processing system corresponding to the mode selected in FIG. 16;
  • FIG. 10 is a diagram showing an example in which the focal position shifts according to the distance from the optical axis of the objective lens of the second optical system;
  • FIG. 4 is a diagram showing an example of experimentally generated coefficients related to formulas (1) and (2);
  • 4 is a flowchart showing an example of processing of the microscope system according to the embodiment;
  • 5 is a flowchart showing an example of processing of the microscope system when magnification and numerical aperture are interlocked;
  • FIG. 2 is a block diagram showing a configuration example of a microscope system according to Modification 1 of the embodiment;
  • FIG. 10 is a diagram showing an example of a UI screen generated by a UI control unit according to Modification 1 of the first embodiment;
  • FIG. 5 is a block diagram showing a configuration example of a microscope system according to Modification 2 of the first embodiment;
  • 9 is a flowchart showing a processing example of the microscope system according to modification 2 of the first embodiment;
  • FIG. 1 shows a configuration example of the microscope system of the present disclosure.
  • a microscope system 5000 shown in FIG. 1 includes a microscope device 5100, a control section 5110, and an information processing section 5120.
  • a microscope device 5100 includes a light irradiation section 5101, an optical section 5102, and a signal acquisition section 5103.
  • the microscope device 5100 may further include a sample placement section 5104 on which the biological sample S is placed. Note that the configuration of the microscope apparatus is not limited to that shown in FIG. 1; for example, a light source located outside the microscope device may be used as the light irradiation section 5101.
  • the light irradiation section 5101 may be arranged such that the sample mounting section 5104 is sandwiched between the light irradiation section 5101 and the optical section 5102, or may be arranged on the same side as the optical section 5102, for example.
  • the microscope apparatus 5100 may be configured for one or more of bright field observation, phase contrast observation, differential interference observation, polarization observation, fluorescence observation, and dark field observation.
  • the microscope system 5000 may be configured as a so-called WSI (Whole Slide Imaging) system or a digital pathology system, and can be used for pathological diagnosis.
  • Microscope system 5000 may also be configured as a fluorescence imaging system, in particular a multiplex fluorescence imaging system.
  • the microscope system 5000 may be used to perform intraoperative pathological diagnosis or remote pathological diagnosis.
  • the microscope device 5100 acquires data of the biological sample S obtained from the subject of the surgery and can transmit the data to the information processing unit 5120.
  • the microscope device 5100 can transmit the acquired data of the biological sample S to the information processing device 5120 located in a place (another room, building, or the like) away from the microscope device 5100 .
  • the information processing device 5120 receives and outputs the data.
  • a user of the information processing device 5120 can make a pathological diagnosis based on the output data.
  • the biological sample S may be a sample containing a biological component.
  • the biological components may be tissues, cells, liquid components of a living body (blood, urine, etc.), cultures, or living cells (cardiomyocytes, nerve cells, fertilized eggs, etc.).
  • the biological sample may be a solid, a specimen fixed with a fixative such as paraffin, or a solid formed by freezing.
  • the biological sample can be a section of the solid.
  • a specific example of the biological sample is a section of a biopsy sample.
  • the biological sample may be one that has undergone processing such as staining or labeling.
  • the treatment may be staining for indicating the morphology of biological components or for indicating substances (surface antigens, etc.) possessed by biological components; examples include HE (Hematoxylin-Eosin) staining and immunohistochemistry staining.
  • the biological sample may be treated with one or more reagents, and the reagents may be fluorescent dyes, chromogenic reagents, fluorescent proteins, or fluorescently labeled antibodies.
  • the specimen may be one prepared for the purpose of pathological diagnosis or clinical examination from a specimen or tissue sample collected from the human body. Moreover, the specimen is not limited to the human body, and may be derived from animals, plants, or other materials.
  • the properties of the specimen may differ depending on the type of tissue used (such as an organ or cell), the type of target disease, the subject's attributes (such as age, sex, blood type, or race), or the subject's lifestyle habits (for example, eating habits, exercise habits, or smoking habits).
  • the specimens may be managed with identification information (bar code information, QR code (trademark) information, etc.) that allows each specimen to be identified.
  • the light irradiation unit 5101 includes a light source for illuminating the biological sample S and an optical section for guiding the light emitted from the light source to the specimen.
  • the light source may irradiate the biological sample with visible light, ultraviolet light, or infrared light, or a combination thereof.
  • the light source may be one or more of halogen lamps, laser light sources, LED lamps, mercury lamps, and xenon lamps. A plurality of types and/or wavelengths of light sources may be used in fluorescence observation, and may be appropriately selected by those skilled in the art.
  • the light irradiator may have a transmissive, reflective, or episcopic (coaxial or lateral) configuration.
  • the optical section 5102 is configured to guide the light from the biological sample S to the signal acquisition section 5103 .
  • the optical section can be configured to allow the microscope device 5100 to observe or image the biological sample S.
  • Optical section 5102 may include an objective lens.
  • the type of objective lens may be appropriately selected by those skilled in the art according to the observation method.
  • the optical section may include a relay lens for relaying the image magnified by the objective lens to the signal acquisition section.
  • the optical unit may further include optical components other than the objective lens and the relay lens, such as an eyepiece lens, a phase plate, and a condenser lens.
  • the optical section 5102 may further include a wavelength separation section configured to separate light having a predetermined wavelength from the light from the biological sample S.
  • the wavelength separation section can be configured to selectively allow light of a predetermined wavelength or wavelength range to reach the signal acquisition section.
  • the wavelength separator may include, for example, one or more of a filter that selectively transmits light, a polarizing plate, a prism (Wollaston prism), and a diffraction grating.
  • the optical components included in the wavelength separation section may be arranged, for example, on the optical path from the objective lens to the signal acquisition section.
  • the wavelength separation unit is provided in the microscope apparatus when fluorescence observation is performed, particularly when an excitation light irradiation unit is included.
  • the wavelength separator may be configured to separate fluorescent light from each other or white light and fluorescent light.
  • the signal acquisition unit 5103 can be configured to receive light from the biological sample S and convert the light into an electrical signal, particularly a digital electrical signal.
  • the signal acquisition unit may be configured to acquire data on the biological sample S based on the electrical signal.
  • the signal acquisition unit may be configured to acquire image data (particularly a still image, a time-lapse image, or a moving image) of the biological sample S, and in particular may be configured to acquire data of an image magnified by the optical unit.
  • the signal acquisition unit includes one or more imaging elements, such as CMOS or CCD, having a plurality of pixels arranged one-dimensionally or two-dimensionally.
  • the signal acquisition unit may include an image sensor for acquiring low-resolution images and another for acquiring high-resolution images, or an image sensor for sensing (such as AF) and another for outputting images for observation. In addition to the plurality of pixels, the image sensor may include a signal processing unit (including one or more of a CPU, a DSP, and memory) that performs signal processing using the pixel signals from each pixel, and an output control unit that controls output of the image data generated from the pixel signals and of the processed data generated by the signal processing unit. Furthermore, the imaging device may include an asynchronous event-detection sensor that detects, as an event, a change in the brightness of a pixel photoelectrically converting incident light that exceeds a predetermined threshold. The imaging device including the plurality of pixels, the signal processing section, and the output control section may preferably be configured as a one-chip semiconductor device.
  • the control unit 5110 controls imaging by the microscope device 5100 .
  • the control unit can drive the movement of the optical unit 5102 and/or the sample placement unit 5104 to adjust the positional relationship between the optical unit and the sample placement unit.
  • the control unit 5110 can move the optical unit and/or the sample placement unit in a direction toward or away from each other (for example, the optical axis direction of the objective lens). Further, the control section may move the optical section and/or the sample placement section in any direction on a plane perpendicular to the optical axis direction.
  • the control unit may control the light irradiation unit 5101 and/or the signal acquisition unit 5103 for imaging control.
  • the sample mounting section 5104 may be configured such that the position of the biological sample on the sample mounting section can be fixed, and may be a so-called stage.
  • the sample mounting section 5104 can be configured to move the position of the biological sample in the optical axis direction of the objective lens and/or in a direction perpendicular to the optical axis direction.
  • the information processing section 5120 can acquire data (such as imaging data) acquired by the microscope device 5100 from the microscope device 5100 .
  • the information processing section can perform image processing on the imaging data.
  • the image processing may include color separation processing.
  • the color separation process can include processing for extracting data of a light component of a predetermined wavelength or wavelength range from the captured data to generate image data, processing for removing data of such a light component from the captured data, and the like.
  • the image processing may include autofluorescence separation processing for separating the autofluorescence component and dye component of the tissue section, and fluorescence separation processing for separating the wavelengths between dyes having different fluorescence wavelengths.
  • autofluorescence signals extracted from one specimen may be used to remove autofluorescence components from the image information of another specimen.
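Fluorescence and autofluorescence separation of the kind described above is commonly implemented as linear spectral unmixing. The minimal least-squares sketch below is not the publication's algorithm; the spectra, abundances, and function name are invented for illustration.

```python
import numpy as np

def unmix(pixels, spectra):
    """Least-squares linear unmixing: model each observed pixel spectrum
    as a mixture of known component spectra (e.g. dye spectra plus an
    autofluorescence spectrum measured on an unstained specimen).

    pixels  : (n_pixels, n_channels) observed spectra
    spectra : (n_components, n_channels) reference spectra
    returns : (n_pixels, n_components) estimated abundances
    """
    coeffs, *_ = np.linalg.lstsq(spectra.T, pixels.T, rcond=None)
    return coeffs.T

# Synthetic check: two known spectra mixed with known abundances.
spectra = np.array([[1.0, 0.2, 0.0],    # hypothetical dye spectrum
                    [0.1, 0.5, 1.0]])   # hypothetical autofluorescence spectrum
true_ab = np.array([[0.7, 0.3],
                    [0.2, 0.8]])
pixels = true_ab @ spectra
est = unmix(pixels, spectra)
```

Subtracting the estimated autofluorescence component from each pixel then leaves the dye signal, which is the removal step the paragraph above describes.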
  • the information processing section 5120 may transmit data for imaging control to the control section 5110, and the control section 5110 receiving the data may control imaging by the microscope apparatus 5100 according to the data.
  • the information processing section 5120 may be configured as an information processing device such as a general-purpose computer, and may include a CPU, RAM, and ROM.
  • the information processing section may be included in the housing of the microscope device 5100 or may be outside the housing. Also, various processes or functions by the information processing unit may be realized by a server computer or cloud connected via a network.
  • a method of imaging the biological sample S by the microscope device 5100 may be appropriately selected by a person skilled in the art according to the type of the biological sample and the purpose of imaging. An example of the imaging method will be described below.
  • the microscope device can first identify an imaging target region.
  • the imaging target region may be specified so as to cover the entire region where the biological sample exists, or may be specified so as to cover a target portion (a target tissue section, target cell, or target lesion portion) of the biological sample.
  • the microscope device divides the imaging target region into a plurality of divided regions of a predetermined size, and the microscope device sequentially images each divided region. As a result, an image of each divided area is obtained.
  • the microscope device specifies an imaging target region R that covers the entire biological sample S.
  • the microscope device divides the imaging target region R into 16 divided regions.
  • the microscope device can then image the segmented region R1, and then image any region included in the imaging target region R, such as a region adjacent to the segmented region R1. Then, image capturing of the divided areas is performed until there are no unimaged divided areas. Areas other than the imaging target area R may also be imaged based on the captured image information of the divided areas. After imaging a certain divided area, the positional relationship between the microscope device and the sample mounting section is adjusted in order to image the next divided area. The adjustment may be performed by moving the microscope device, moving the sample placement unit, or moving both of them.
  • the image capturing device that captures each divided area may be a two-dimensional image sensor (area sensor) or a one-dimensional image sensor (line sensor).
  • the signal acquisition section may capture an image of each divided area via the optical section.
  • the imaging of each divided area may be performed continuously while moving the microscope device and/or the sample mounting section, or the movement of the microscope apparatus and/or the sample mounting section may be performed during the imaging of each divided area. may be stopped.
  • the imaging target area may be divided so that the divided areas partially overlap each other, or the imaging target area may be divided so that the divided areas do not overlap.
  • Each divided area may be imaged multiple times by changing imaging conditions such as focal length and/or exposure time.
  • the information processing device can generate image data of a wider area by synthesizing a plurality of adjacent divided areas. By performing the synthesis processing over the entire imaging target area, it is possible to obtain an image of a wider area of the imaging target area. Also, image data with lower resolution can be generated from the image of the divided area or the image subjected to the synthesis processing.
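The merging of adjacent divided areas and the generation of lower-resolution image data can be sketched as follows. This NumPy sketch assumes non-overlapping tiles and uses block averaging for the lower-resolution level; real WSI stitching must register and blend overlapping tiles, which is omitted here.

```python
import numpy as np

def stitch_tiles(tiles, grid):
    """Assemble non-overlapping divided-area images into one mosaic.

    tiles : dict mapping (row, col) -> (h, w) array
    grid  : (rows, cols) of the divided imaging target region
    """
    rows, cols = grid
    h, w = next(iter(tiles.values())).shape
    mosaic = np.zeros((rows * h, cols * w))
    for (r, c), tile in tiles.items():
        mosaic[r * h:(r + 1) * h, c * w:(c + 1) * w] = tile
    return mosaic

def downsample(image, factor=2):
    """One pyramid step: lower-resolution level by block averaging."""
    h, w = image.shape
    return image[:h - h % factor, :w - w % factor] \
        .reshape(h // factor, factor, w // factor, factor).mean(axis=(1, 3))

# 4x4 grid of 16x16 tiles, each filled with its own index for clarity.
tiles = {(r, c): np.full((16, 16), r * 4 + c, dtype=float)
         for r in range(4) for c in range(4)}
wsi = stitch_tiles(tiles, (4, 4))
thumb = downsample(wsi, factor=4)
```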
  • the microscope device can first identify an imaging target region.
  • the imaging target region may be specified so as to cover the entire region where the biological sample exists, or may be specified so as to cover a target portion (a target tissue section or target cell-containing portion) of the biological sample.
  • the microscope device scans a part of the imaging target area (also referred to as a "divided scan area") in one direction (also referred to as a "scanning direction") in a plane perpendicular to the optical axis to capture an image.
  • after the scanning of one divided scan area is completed, the next adjacent divided scan area is scanned; these scanning operations are repeated until the entire imaging target area is imaged.
  • as shown in the figure, the microscope device specifies a region (gray portion) in which a tissue section exists in the biological sample S as an imaging target region Sa. Then, the microscope device scans the divided scan area Rs in the imaging target area Sa in the Y-axis direction. After completing the scanning of the divided scan region Rs, the microscope device scans the next divided scan region in the X-axis direction. This operation is repeated until scanning is completed for the entire imaging target area Sa.
  • the positional relationship between the microscope device and the sample placement section is adjusted for scanning each divided scan area and for imaging the next divided scan area after imaging a certain divided scan area. The adjustment may be performed by moving the microscope device, moving the sample placement unit, or moving both of them.
  • the imaging device that captures each divided scan area may be a one-dimensional imaging device (line sensor) or a two-dimensional imaging device (area sensor).
  • the signal acquisition section may capture an image of each divided area via an enlarging optical system.
  • the imaging of each divided scan area may be performed continuously while moving the microscope device and/or the sample placement unit.
  • the imaging target area may be divided such that the divided scan areas partially overlap each other, or the imaging target area may be divided so that the divided scan areas do not overlap.
  • Each divided scan area may be imaged multiple times by changing imaging conditions such as focal length and/or exposure time.
  • the information processing apparatus can combine a plurality of adjacent divided scan areas to generate image data of a wider area. By performing the synthesis processing over the entire imaging target area, it is possible to obtain an image of a wider area of the imaging target area. Further, image data with lower resolution can be generated from the image of the divided scan area or the image subjected to the synthesis processing.
  • FIG. 4 is a block diagram showing a more detailed configuration example of the microscope system 5000. As shown in FIG. 4, the microscope system 5000 further includes an image accumulation unit (storage unit) 502 and an operation unit 506.
  • for the storage unit 502, a storage device such as a non-volatile semiconductor memory or a hard disk drive is used.
  • Various control parameters, programs, and the like according to this embodiment may be stored in advance in the storage unit 502 .
  • the storage unit 502 accumulates images acquired by the signal acquisition unit (sensor) 5103 .
  • the operation unit 506 is composed of, for example, a keyboard and a mouse.
  • the operation unit 506 inputs an instruction signal corresponding to the operation of an observer, e.g. a pathologist, to the information processing device 5120a.
  • the microscope device 5100a has a sensor control section 508 and a stage control section 514.
  • the information processing device 5120a has an image acquisition unit 515, an image display unit 516, a UI control unit 518, a display control unit 520, and an image synthesizing unit 522. Details of the information processing device 5120a will be described later.
  • FIG. 5 is a diagram showing an objective lens 5102a and an aperture stop 5102b of the optical system of the optical section 5102 according to this embodiment.
  • a region 100 indicates a depth region in the depth direction along the optical axis L100 of the optical system to be imaged in one imaging. Note that the optical system of the optical unit 5102 according to this embodiment corresponds to the first optical system.
  • the sensor control unit 508 and the stage control unit 514 perform control so as to capture a plurality of images while changing the focal position of the objective lens 5102a in the Z direction along the optical axis L100 of the optical system. That is, the stage controller 514 changes the relative distance between the objective lens 5102a and the stage 5104 with, for example, the aperture diaphragm 5102b in a first state.
  • This aperture stop 5102b defines the thickness of the light beam, that is, the numerical aperture (NA). That is, in the first state, the aperture diaphragm 5102b is fixed in an open state in which the numerical aperture (NA) is expressed as, for example, "large”.
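Two standard textbook relations, not stated in the publication itself, make the aperture stop's role concrete: the numerical aperture of an objective with acceptance half-angle θ in a medium of refractive index n, and the approximate dependence of the depth of field on NA for light of wavelength λ:

```latex
\mathrm{NA} = n \sin\theta, \qquad
d_{\mathrm{DOF}} \approx \frac{n\,\lambda}{\mathrm{NA}^{2}}
```

Halving the NA thus roughly quadruples the depth of field, which is consistent with the earlier claims that a smaller second-system NA corresponds to a second image with more image information in the optical axis direction.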
  • one objective lens 5102a is illustrated in order to simplify the description, but the present invention is not limited to this; for example, a plurality of objective lenses may be provided.
  • the sensor control unit 508 captures an image and supplies it to the storage unit 502 in accordance with the change of the focal position.
  • the sensor control unit 508 and the stage control unit 514 then move the stage 5104 in the horizontal direction, and the control for changing the relative distance between the objective lens 5102a and the stage 5104 in the Z direction along the optical axis L100 is repeated.
  • FIG. 6 is a diagram schematically showing the first image G100 captured under the control of the sensor control section 508 and the stage control section 514.
  • the first image G100 is composed of a plurality of first images G100a captured by changing the focal position.
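The capture of the first image G100 as a stack of focal planes can be sketched with hypothetical stand-ins for the controllers. The publication's sensor control unit 508 and stage control unit 514 are hardware controllers; `FakeStage`, `FakeSensor`, and their methods are invented here purely to show the acquisition loop.

```python
import numpy as np

class FakeStage:
    """Hypothetical stand-in for the stage control interface."""
    def __init__(self):
        self.z = 0.0
    def move_z(self, z):
        self.z = z  # set focal position along the optical axis

class FakeSensor:
    """Hypothetical stand-in for the sensor control interface."""
    def __init__(self, shape=(32, 32)):
        self.shape = shape
    def capture(self):
        return np.zeros(self.shape)  # a real sensor would return an exposure

def acquire_z_stack(stage, sensor, z_start, z_step, n_planes):
    """Capture first images while stepping the focal position: the
    aperture stop stays fixed (first state) and only the relative
    objective-stage distance changes between exposures."""
    stack = []
    for i in range(n_planes):
        stage.move_z(z_start + i * z_step)
        stack.append(sensor.capture())
    return np.stack(stack)

stack = acquire_z_stack(FakeStage(), FakeSensor(),
                        z_start=-10.0, z_step=1.0, n_planes=21)
```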
  • the image acquisition unit 515 acquires the captured image stored in the storage unit 502.
  • the image acquisition unit 515 may be configured to acquire a captured image from the sensor 5103 .
  • the storage unit may be configured in the information processing device 5120a.
  • the image display unit 516 is, for example, a monitor.
  • the image display unit 516 displays an image generated by the image synthesizing unit 522 and a user interface (UI screen), as will be described later.
  • the UI control unit 518 causes the image display unit 516 to display an input UI screen for inputting an instruction signal from the observer.
  • the UI control unit 518 causes the image display unit 516 to display, for example, the UI screens shown in the figures described later.
  • the UI control unit 518 then supplies the observer's instruction signal to the image synthesizing unit 522 .
  • the display control unit 520 causes the image display unit 516 to display the UI screen generated by the UI control unit 518 and the image processed by the image synthesizing unit 522 as predetermined display images.
  • the display image is subjected to gamma conversion, for example.
  • the image synthesis unit 522 processes the image acquired by the image acquisition unit 515 using optical system information related to a second optical system different from the first optical system (see FIG. 5).
  • the second optical system means an optical system having a form different from that of the first optical system when capturing an image to be processed.
  • an optical system of an optical microscope is called a second optical system.
  • an optical system having the same objective lens 5102a but a different numerical aperture of the aperture stop 5102b or a different magnification is referred to as a second optical system.
  • an optical system using an objective lens different from the objective lens 5102a is referred to as a second optical system. Note that the image synthesizing unit 522 according to this embodiment corresponds to the processing unit.
  • FIG. 7 is a block diagram showing a detailed configuration example of the image synthesizing unit 522.
  • the image synthesizing section 522 has an image area synthesizing section 522a, an optical processing section 522b, a color processing section 522c, and a frequency processing section 522d.
  • the image synthesizing unit 522 stitches and synthesizes the first images G100a having different focal positions captured in each region to generate a WSI (Whole Slide Imaging) image.
  • WSI images corresponding to the number of first images G100a forming the first image G100 are generated. For example, when the first image G100a is captured at 20 different focal positions, 20 WSI images are generated.
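The per-plane stitching described above can be sketched as follows. This is a minimal illustration assuming equally sized, non-overlapping tiles laid out row by row; a real scanner would align and blend overlapping tile edges, and the function names are hypothetical.

```python
import numpy as np

def stitch_plane(tiles, grid_shape):
    """Assemble one WSI plane from a row-major list of equally sized,
    non-overlapping tiles (real systems blend overlapping edges)."""
    rows, cols = grid_shape
    th, tw = tiles[0].shape[:2]
    wsi = np.zeros((rows * th, cols * tw) + tiles[0].shape[2:], dtype=tiles[0].dtype)
    for idx, tile in enumerate(tiles):
        r, c = divmod(idx, cols)
        wsi[r * th:(r + 1) * th, c * tw:(c + 1) * tw] = tile
    return wsi

def stitch_stack(tile_stacks, grid_shape):
    """tile_stacks[z] holds the tiles for focal plane z; one WSI per plane,
    matching the text's 'one WSI image per focal position'."""
    return [stitch_plane(tiles, grid_shape) for tiles in tile_stacks]
```

Calling `stitch_stack` with 20 focal planes would yield the 20 WSI images mentioned in the example above.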
  • Based on the information of the second optical system, the optical processing unit 522b uses the first image G100 captured by the first optical system to generate a second image corresponding to an image captured by the second optical system.
  • FIG. 8 is a diagram showing an example of the second optical system 5104.
  • the second optical system 5104 has an objective lens 5104a and an aperture stop 5104b.
  • a region 200 indicates a depth region in the depth direction along the optical axis L200 of the second optical system 5104 that is imaged in one imaging.
  • the aperture stop 5104b is fixed in an aperture state in which the thickness of the light beam (numerical aperture NA) is, for example, "small".
  • the aperture stop 5104b is narrower than the aperture stop 5102b (see FIG. 4). Therefore, the depth of field is deeper than that of the first optical system 5102 (see FIG. 4).
  • Information about the second optical system 5104 includes information about the characteristics of the objective lens 5104a, the aperture stop 5104b, and the magnification. Therefore, by using the information of the second optical system 5104, it is possible to calculate the depth of field when an image is captured by the second optical system. Alternatively, information on the focal position from the objective lens 5104a and information on the depth of field may be stored in advance in the storage unit 502 in association with the numerical aperture and magnification information of the second optical system.
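As a rough illustration of how depth of field could be calculated from the numerical aperture, the sketch below uses the common textbook approximation DOF ≈ n·λ/NA². The constants and function name are illustrative; the actual system would store vendor-specific values in the storage unit 502.

```python
def depth_of_field_um(wavelength_um=0.55, na=0.7, refractive_index=1.0):
    """Approximate diffraction-limited depth of field: n * lambda / NA^2.
    A textbook approximation, not the patent's stored lookup values."""
    return refractive_index * wavelength_um / (na ** 2)
```

A narrower aperture (smaller NA) yields a deeper depth of field, consistent with the comparison between the aperture stops 5104b and 5102b above.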
  • FIG. 9 is a diagram schematically explaining a processing example of the optical processing section 522b.
  • the optical axis L100 of the first optical system 5102 is shown shifted to make the explanation easier to follow, but it is actually coaxial with the optical axis L200.
  • as indicated by the area 100, the depth of field of the first optical system 5102 is shallower than the area 200, which indicates the depth of field of the second optical system 5104 (see FIG. 8). In addition, as shown in FIG. 19, which will be described later, images are also captured at positions shifted in a direction perpendicular to the optical axis L100.
  • FIG. 10 is a diagram schematically showing the processing order of the image synthesizing unit 522.
  • the optical processing unit 522b adds a plurality of first images G100a to generate a second image G200a corresponding to the image captured by the second optical system 5104 (see FIG. 8).
  • the optical processing unit 522b generates the second image G200a according to, for example, formula (1): I' = Σ_i a_i · I_i, where I' denotes the second image, I_i denotes the first image G100a at focal position i, and a_i denotes the coefficient corresponding to the first image I_i.
  • the optical processing unit 522b generates the second image G200a by convolutionally combining the plurality of first images G100a based on the information of the second optical system 5104.
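The weighted addition of formula (1) can be sketched as follows. The focal stack, shapes, and hand-picked coefficients a_i are hypothetical; in the described system the coefficients would come from the information of the second optical system 5104.

```python
import numpy as np

def synthesize_second_image(stack, coeffs):
    """Formula (1): I' = sum_i a_i * I_i over the focal stack.
    stack: (Z, H, W) array of first images G100a; coeffs: length-Z weights.
    Spreading non-zero a_i over more Z planes emulates the deeper depth
    of field of a narrower second-system aperture."""
    coeffs = np.asarray(coeffs, dtype=float)
    return np.tensordot(coeffs, np.asarray(stack, dtype=float), axes=1)
```

For example, `synthesize_second_image(stack, [0.25, 0.5, 0.25])` blends three focal planes, while `[0, 1, 0]` would simply pass through the middle plane.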
  • the lower diagram of FIG. 10 is a diagram showing the result of processing according to the aperture area of the aperture stop 5104b, that is, the numerical aperture NA. That is, the numerical aperture NA increases as the aperture increases.
  • the optical processing unit 522b increases the addition range of the plurality of first images G100a in the Z direction.
  • when the aperture of the second optical system 5104 is narrower than the aperture of the first optical system 5102, the optical processing unit 522b processes the first image G100 based on the plurality of first images G100a to generate a second image G200a containing more image information in the optical axis direction L100 than a single first image G100a, that is, image information corresponding to an image with a deeper depth of field.
  • the display control section 520 causes the image display section 516 to display the second image G200a.
  • conversely, when the aperture of the second optical system 5104 is wider than the aperture of the first optical system 5102, the optical processing unit 522b processes the first image G100 based on the plurality of first images G100a to generate a second image whose image information in the optical axis direction L100 is reduced relative to a single first image G100a.
  • FIG. 11 is a diagram showing an example in which the first image G100a and the second image G200a are displayed side by side.
  • the display control section 520 may cause the image display section 516 to display the second image G200a side by side with the first image G100 so that both can be observed.
  • the display image corresponding to the first image G100a and the display image corresponding to the second image G200a are displayed side by side on the screens W100a and W200a.
  • This enables the pathologist to objectively grasp the difference in diagnostic criteria between the first image G100a and the second image G200a.
  • pathologists gain experience in image interpretation with this system, they can grasp the diagnostic criteria corresponding to the first image G100a, and even when using only the first image G100a, more accurate diagnosis becomes possible.
  • alternatively, a single first image G100a may be processed by the color processing unit 522c and the frequency processing unit 522d, and the resulting second image G200a may be displayed on the image display unit 516.
  • the second image G200a processed using the first image G100 may be displayed on the image display unit 516 when all imaging in the depth direction Z is completed.
  • the optical processing unit 522b may pre-calculate the coefficient ai as the information of the second optical system 5104 and store it in the storage unit 502. In this case, the optical processing unit 522b reads out the coefficient ai from the storage unit 502 and performs processing for generating the second image G200a.
  • This coefficient can be calculated using the optical characteristic information of the objective lens 5104a and the aperture stop 5104b of the second optical system 5104. Alternatively, it may be calculated in advance by computer simulation based on the information of the objective lens 5104a and the aperture stop 5104b of the second optical system 5104.
  • the optical processing unit 522b is also capable of deconvolution. As shown in equation (2), let I_k denote the first images, i_k the second images, and a_i the coefficients relating them.
  • An example of equation (2) is the case where the second image i_k has a larger numerical aperture than the first image G100a.
  • That is, the optical processing unit 522b inverts the matrix expressed by equation (2), calculates new coefficients using the inverse matrix, and generates the second image i_k from the plurality of first images G100a. In this way, the deconvolution operation allows the optical processing unit 522b to generate a second image G200a with a larger numerical aperture than the first image G100a.
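The inverse-matrix step can be sketched as follows. The small coefficient matrix is invented for illustration (equation (2) itself is not given numerically in the text), and numpy's `linalg.inv` stands in for the inverse-matrix calculation.

```python
import numpy as np

def deconvolve(second_images, coeff_rows):
    """Recover the first-image stack by inverting the coefficient matrix
    of equation (2): row k holds the a_i such that i_k = sum_i A[k, i] * I_i.
    second_images: (Z, H, W) array; returns the (Z, H, W) first images."""
    A = np.asarray(coeff_rows, dtype=float)
    A_inv = np.linalg.inv(A)  # the 'inverse matrix' described in the text
    z, h, w = np.shape(second_images)
    flat = np.asarray(second_images, dtype=float).reshape(z, -1)
    return (A_inv @ flat).reshape(z, h, w)
```

This only works when the coefficient matrix is invertible; a real implementation would likely regularize ill-conditioned matrices.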
  • the color processing unit 522c changes either the density or color of the first image G100a or the second image generated by the optical processing unit 522b according to the information of the second optical system 5104. In this case, either the density or the color may be changed with respect to the synthesized second image. Alternatively, either the density or the color may be changed for a single sheet of the first image G100a. For example, when the aperture of the second optical system 5104 is narrower than the aperture of the first optical system 5102, the color processing unit 522c performs processing to increase the density. Alternatively, when the aperture of the second optical system 5104 is narrower than the aperture of the first optical system 5102, the color processing unit 522c performs processing to change the color to a darker side.
  • on the other hand, when the aperture of the second optical system 5104 is wider than the aperture of the first optical system 5102, the color processing unit 522c performs processing to lower the density. This makes it possible to bring the color tone closer to that of an image captured by the second optical system 5104.
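A toy version of such density processing is sketched below. The 10% step and the darken/lighten directions are illustrative assumptions; the real correction amounts would be derived from the information of the second optical system.

```python
import numpy as np

def adjust_density(img, second_na, first_na, step=0.1):
    """Hypothetical density tweak: darken (increase density) when the
    second aperture is narrower, lighten when it is wider.
    img holds intensities in [0, 1], where lower values are denser."""
    img = np.asarray(img, dtype=float)
    if second_na < first_na:
        out = img * (1.0 - step)  # narrower aperture: increase density
    elif second_na > first_na:
        out = img * (1.0 + step)  # wider aperture: lower density
    else:
        out = img
    return np.clip(out, 0.0, 1.0)
```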
  • the frequency processing unit 522d performs processing to change the spatial frequency characteristic (MTF) of the first image G100a or of the second image generated by the optical processing unit 522b, according to the information of the second optical system 5104. For example, when the aperture of the second optical system 5104 is narrower than the aperture of the first optical system 5102, the frequency processing unit 522d lowers the spatial frequency (MTF) below that of the first image G100a. Conversely, when the aperture of the second optical system 5104 is wider than the aperture of the first optical system 5102, the frequency processing unit 522d raises the spatial frequency (MTF) above that of the first image G100a. This makes it possible to bring the spatial frequency (MTF) closer to that of an image captured by the second optical system 5104.
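The spatial-frequency adjustment could look roughly like the following sketch, where a separable moving-average blur stands in for the true MTF model (lowering high frequencies) and an unsharp mask stands in for raising them; kernel size and strength are arbitrary placeholders.

```python
import numpy as np

def _blur(img, k=3):
    """Separable moving-average low-pass (a stand-in for the true MTF)."""
    kern = np.ones(k) / k
    pad = k // 2
    tmp = np.apply_along_axis(
        lambda r: np.convolve(np.pad(r, pad, mode='edge'), kern, 'valid'), 1, img)
    return np.apply_along_axis(
        lambda c: np.convolve(np.pad(c, pad, mode='edge'), kern, 'valid'), 0, tmp)

def adjust_mtf(img, second_na, first_na, amount=1.0):
    """Lower spatial frequency when the second NA is smaller; raise it
    via an unsharp mask when the second NA is larger."""
    img = np.asarray(img, dtype=float)
    low = _blur(img)
    if second_na < first_na:
        return low
    if second_na > first_na:
        return img + amount * (img - low)  # boost high frequencies
    return img
```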
  • FIG. 12 is a diagram showing an example of a UI screen generated by the UI control unit 518. The observer designates the mode buttons U100 to U104 displayed on the image display unit 516 via the operation unit 506. Note that the UI control unit 518 according to this embodiment corresponds to the selection unit.
  • FIG. 13 is a diagram showing the processing sequence when the mode button U100 is selected.
  • the vertical axis indicates numerical aperture, and the horizontal axis indicates magnification.
  • a processing sequence MU100 indicates a processing sequence when the mode button U100 is designated, and a processing sequence MU200 indicates a processing sequence when the mode button U102 is designated.
  • the UI control unit 518 may cause the image display unit 516 to display the example image shown in FIG. 13 in conjunction with the UI screens shown in FIGS. 12 and 14 . As a result, the operator can more objectively grasp the details of the operation.
  • FIG. 14 is a diagram showing a UI example for instructing the amount and magnification of the aperture.
  • a region U140 is a button for inputting a numerical aperture NA as a quantity related to aperture
  • a region U142 is a button for inputting a magnification.
  • the observer designates mode buttons U140 and U142 via the operation unit 506 and inputs numerical values. Also, the observer designates one of the mode buttons U144 and U145 via the operation unit 506 and inputs a selection signal.
  • when the mode button U100 is selected, the optical processing unit 522b outputs the image captured by the first optical system 5102 without processing. That is, the first mode is a mode that does not perform processing for making the image captured by the first optical system 5102 correspond to the processing result of a second optical system.
  • the processing sequence in this case is MU100 (see FIG. 13), and the numerical aperture is fixed at, for example, 0.7.
  • the optical processing unit 522b changes the imaging magnification to generate the second image G200a.
  • when the mode button U102 is selected, the optical processing unit 522b processes and outputs the first image G100 using the information of second optical system 1. That is, the second mode is a mode in which the image captured by the first optical system 5102 is processed to correspond to the imaging result of second optical system 1.
  • Here, second optical system 1 means, for example, the second optical system of the optical microscope of model number 100.
  • the MU200 is selected as the process sequence, and the numerical aperture and the magnification are linked. Therefore, for example, if a magnification of 60 times is input, the numerical aperture (NA) is set to 0.8. On the other hand, if a numerical aperture (NA) of 0.8 is input, a magnification of 60 is set. In this way, the inputs are interlocked with each other.
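The interlocking described above amounts to a lookup in both directions. The sketch below uses an invented preset table: the 60×/NA 0.8 pair follows the text, while the other entries are placeholders.

```python
# Hypothetical interlock table pairing magnification with numerical aperture.
# Only the 60x -> NA 0.8 pair is stated in the text; the rest are examples.
NA_BY_MAGNIFICATION = {10: 0.3, 20: 0.5, 40: 0.7, 60: 0.8, 100: 0.95}

def na_for_magnification(mag):
    """Entering a magnification sets the linked numerical aperture."""
    return NA_BY_MAGNIFICATION[mag]

def magnification_for_na(na, tol=1e-6):
    """Entering a numerical aperture sets the linked magnification."""
    for mag, v in NA_BY_MAGNIFICATION.items():
        if abs(v - na) < tol:
            return mag
    raise ValueError(f"no preset magnification for NA {na}")
```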
  • the optical processing unit 522b then generates a second image G200a using the first image G100 according to the information of second optical system 1.
  • in this case, the input via the mode button U140 and the input via the mode button U142 can be performed independently.
  • for example, when a smaller numerical aperture is input, the optical processing unit 522b increases the addition range of the plurality of first images G100a in the Z direction. In other words, based on the plurality of first images G100a, the optical processing unit 522b processes the first image G100 to generate a second image G200a having image information corresponding to an image with a deeper depth of field than a single first image G100a.
  • the display control section 520 causes the image display section 516 to display the second image G200a.
  • when the mode button U104 is selected, the optical processing unit 522b uses the first image G100 to generate the second image G200a according to the information of second optical system 2, in the same way as when the mode button U102 is selected. That is, the third mode is a mode in which the image captured by the first optical system 5102 is processed to correspond to the imaging result of second optical system 2.
  • when the mode button U106 is selected, for example, information on the second optical system of the model number 300 digital microscope is read from the storage unit 502. The optical processing unit 522b then generates a second image G200a using the first image G100 according to the information of second optical system 3. That is, the fourth mode is a mode in which the image captured by the first optical system 5102 is processed to correspond to the imaging result of second optical system 3.
  • in this way, preset information of a second optical system is read from the storage unit 502 via the mode buttons U100 to U106, and the optical processing unit 522b generates a second image G200a using the first image G100 according to the read information of the second optical system. This makes it possible to easily generate captured images corresponding to various microscopes.
  • FIG. 15 is a diagram showing an example of a UI screen related to image processing generated by the UI control unit 518.
  • the observer designates mode buttons U152 to U156 displayed on the image display unit 516 via the operation unit 506.
  • for example, when the mode button U152 is selected, the color processing unit 522c and the frequency processing unit 522d perform processing to change any one of the density, color, and spatial frequency of the first image G100a or of the second image G200a generated by the optical processing unit 522b, according to the information of second optical system 1. That is, the fifth mode is a mode in which the first image G100a captured by the first optical system 5102 is processed to correspond to the processing result of second optical system 1 in any one of density, color, and spatial frequency.
  • similarly, when the mode button U154 is selected, the color processing unit 522c and the frequency processing unit 522d perform processing to change any one of the density, color, and spatial frequency of the first image G100a or of the second image G200a generated by the optical processing unit 522b, according to the information of second optical system 2. That is, the sixth mode is a mode in which the image captured by the first optical system 5102 is processed to correspond to the processing result of second optical system 2 in any one of density, color, and spatial frequency.
  • likewise, the seventh mode is a mode in which the image captured by the first optical system 5102 is processed to correspond to the processing result of second optical system 3 in any one of density, color, and spatial frequency.
  • FIG. 16 is a diagram showing an example of a UI screen related to special processing generated by the UI control unit 518, and FIG. 17 shows the processing sequences corresponding to the modes selected in FIG. 16.
  • when the mode button U162 is selected, the same processing as when the mode button U102 (see FIG. 12) is selected, that is, processing according to the processing sequence MU200, is performed by the optical processing unit 522b. That is, the standard mode is a mode corresponding to the imaging result of the second optical system.
  • when the mode button U164 is selected, the linkage between the numerical aperture and the magnification in the input UI (see FIG. 14) is changed so that a higher numerical aperture is linked to each magnification. As a result, in the high NA mode, according to the processing sequence MU200H, a second image G200a with a deeper depth of field than in the standard mode is generated. That is, the high NA mode is a mode that generates a second image G200a with a deeper depth of field than the standard mode.
  • similarly, when the mode button U166 is selected, the linkage between the numerical aperture and the magnification in the input UI (see FIG. 14) is changed so that a lower numerical aperture is linked to each magnification.
  • In the low NA mode, according to the processing sequence MU200L, a second image G200a with a shallower depth of field than in the standard mode is generated. That is, the low NA mode is a mode for generating a second image G200a with a shallower depth of field than the standard mode.
  • FIG. 18 is a diagram showing an example in which the focal position shifts according to the distance from the optical axis L300 of the objective lens 5108a of the second optical system.
  • the focal position of the objective lens 5108a of the second optical system is indicated by line L5108a, and the optical axis is indicated by L300.
  • the objective lens 5108a of the second optical system is, for example, an aspherical lens, and the line L5108a varies in distance from the surface layer of the sample according to the distance from the optical axis L300 of the second optical system.
  • the line L5108a approaches the surface layer of the sample as the distance from the optical axis L300 increases.
  • based on the line L5108a, the optical processing unit 522b changes the value of the coefficient a_i (see formula (1)) according to the distance from the optical axis L300 of the second optical system, reflecting the focal position characteristic of the objective lens 5108a of the second optical system. More specifically, the optical processing unit 522b changes the value of the coefficient a_i so that the image information on the objective lens 5108a side increases as the distance from the optical axis L300 increases.
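A possible shape for such radially varying coefficients is sketched below. The Gaussian weighting, the linear radius-to-shift mapping, and the convention that higher plane indices lie on the objective-lens side are all illustrative assumptions.

```python
import numpy as np

def radial_coeffs(num_planes, radius, max_radius, base_plane):
    """Shift the weight centre toward the objective-lens side (taken here
    as higher plane indices) as the distance from the optical axis grows,
    mimicking the field-curvature line L5108a of an aspherical lens."""
    shift = (radius / max_radius) * (num_planes - 1 - base_plane)
    centre = base_plane + shift
    idx = np.arange(num_planes)
    w = np.exp(-0.5 * (idx - centre) ** 2)  # Gaussian weights around centre
    return w / w.sum()                      # normalize so sum(a_i) == 1
```

Pixels near the axis would then be synthesized mostly from `base_plane`, while off-axis pixels draw increasingly from planes on the objective-lens side.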
  • the special mode is a mode in which the second image G200a is generated from the group of images G100 captured by the first optical system 5102 according to the focal position of the objective lens 5108a of the second optical system.
  • FIG. 19 is a diagram showing an example of experimentally generated coefficients related to formulas (1) and (2).
  • the checker chart is placed on the stage 5104 and imaged.
  • images are taken with a digital camera at magnifications of 10×, 20×, and 40× (steps S100, S102, and S104).
  • the coefficients (see formula (1)) for changing the imaging magnification from 10× to 20× are calculated (steps S106 and S108).
  • the image synthesizing unit 522 performs scanner imaging at an imaging magnification of 40× (step S110). Then, using the coefficients generated in steps S106 and S108, a convolution operation is performed on the image captured at an imaging magnification of 40× (steps S112 and S114). Next, an inverse matrix calculation is performed to generate coefficients for the deconvolution operation from 20× to 40× and from 10× to 40× (steps S116, S118, and S120). Similarly, the changes in density and color tone are measured, and the parameters used by the color processing unit 522c when the imaging magnification is changed from 20× to 40× and from 10× to 40× are acquired.
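One way such coefficients could be obtained experimentally is an ordinary least-squares fit of formula (1) against a reference capture; the sketch below uses synthetic data in place of the checker-chart images, and the fitting approach itself is an assumption rather than the patent's stated procedure.

```python
import numpy as np

def fit_coefficients(stack, target):
    """Fit the a_i of formula (1) so that sum_i a_i * I_i best matches an
    image captured at the target setting (least squares over all pixels)."""
    z = len(stack)
    A = np.asarray(stack, dtype=float).reshape(z, -1).T  # pixels x planes
    b = np.asarray(target, dtype=float).ravel()
    coeffs, *_ = np.linalg.lstsq(A, b, rcond=None)
    return coeffs
```

The fitted coefficients could then be stored in the storage unit 502 keyed by magnification and numerical aperture, as the text describes.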
  • the doctor sets the imaging magnification using the UI shown in FIG. 14, for example (step S128).
  • the optical processing unit 522b reads the coefficients corresponding to the imaging magnification from the storage unit 502 and uses the first image G100 to generate the second image G200a (step S130).
  • FIG. 20 is a flowchart showing a processing example of the microscope system 5000 according to this embodiment. As shown in FIG. 20, first, a pathological specimen is placed on the stage 5104 (step S200). Next, under the control of the control unit 5110a, the sensor 5103 captures an image of the pathological specimen (step S202).
  • The control unit 5110a then determines whether or not a predetermined number of images have been captured (step S203). If it is determined that the predetermined number of images have not been captured (No in step S203), the position of the stage 5104 is changed in the optical axis direction Z via the stage control unit 514, and the processing from step S202 is repeated.
  • if it is determined in step S203 that the predetermined number of images have been captured (Yes in step S203), the first image G100 is accumulated in the image accumulation unit (storage unit) 502 (step S206).
  • the magnification and numerical aperture are input via the UI (see FIG. 14) (step S208).
  • the image synthesizing unit 522 generates a second image G200a using the first image G100 read from the image storage unit (storage unit) 502 according to the input magnification and numerical aperture (steps S210 and S212).
  • the display control unit 520 causes the image display unit 516 to display the second image G200a (step S214), and ends the overall processing.
  • FIG. 21 is a flowchart showing a processing example of the microscope system 5000 when the mode button U144 is pressed and the magnification and numerical aperture are linked.
  • when the mode button U144 is selected, the UI control unit 518 sets the numerical aperture according to the designated magnification (step S310).
  • a second image G200a is then generated using the first image G100 read from the image accumulation unit (storage unit) 502 according to the set numerical aperture (step S312). In this way, when the magnification and the numerical aperture are interlocked, it becomes unnecessary to input the numerical aperture separately, which improves processing efficiency.
  • as described above, in the microscope system 5000 according to this embodiment, the image acquiring unit 515 acquires the first image G100 captured via the first optical system 5102 from the image storage unit 502, and the image synthesizing unit 522 processes the first image G100 to generate the second image G200a. This makes it possible to generate a second image G200a corresponding to an image captured by the second optical system 5104 from the first image G100 captured by the first optical system 5102. Therefore, even an observer accustomed to using the second optical system 5104 can more easily diagnose the image.
  • the microscope system 5000 according to Modification 1 of the first embodiment differs from the microscope system 5000 according to the first embodiment in that the optical processing unit 522b can perform processing using the information of the field stop on the illumination side. Differences from the microscope system 5000 according to the first embodiment will be described below.
  • FIG. 22 is a block diagram showing a configuration example of a microscope system 5000 according to Modification 1 of the first embodiment.
  • the microscope system 5000 further includes an illumination-side field stop 530 and an illumination NA control section (illumination-side aperture section) 540 that controls the illumination-side field stop 530 .
  • the field stop 530 is arranged at a position conjugate with the aperture stop 5102b (see FIG. 5). Therefore, the field stop 530 on the illumination side can adjust the luminous flux, that is, the numerical aperture NA, like the aperture stop 5102b.
  • FIG. 23 is a diagram showing an example of a UI screen generated by the UI control unit 518 according to Modification 1 of the first embodiment. It differs from the example of the UI screen according to the first embodiment (see FIG. 14) in that a mode button U146 is added.
  • the UI control unit 518 supplies the instructed illumination-side numerical aperture to the optical processing unit 522b of the image synthesizing unit 522 when the mode button U146 is selected.
  • the optical processing unit 522b generates a second image G200a using the first image G100 read from the image storage unit (storage unit) 502, based on the illumination-side numerical aperture and the numerical aperture of the aperture stop 5102b (see FIG. 5) of the second optical system.
  • In this case, the coefficients shown in equation (1) are used to generate the second image G200a.
  • as described above, in Modification 1 of the first embodiment, the optical processing unit 522b generates the second image G200a based on the illumination-side numerical aperture and the numerical aperture of the aperture stop 5102b (see FIG. 5) of the second optical system. This makes it possible to more accurately generate a second image G200a corresponding to an image captured by the second optical system 5104, using the first image G100 captured with illumination corresponding to the illumination-side numerical aperture.
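For intuition on why the illumination-side numerical aperture also matters, the sketch below uses the textbook partially coherent imaging approximation, where the finest resolvable period scales roughly as λ/(NA_obj + NA_ill) with the illumination NA capped at the objective NA. This is a generic optics approximation, not the patent's actual model.

```python
def cutoff_resolution_um(wavelength_um, objective_na, illumination_na):
    """Approximate finest resolvable period in partially coherent imaging:
    lambda / (NA_obj + min(NA_ill, NA_obj)). Stopping down the
    illumination-side stop therefore coarsens the effective resolution."""
    eff_ill = min(illumination_na, objective_na)
    return wavelength_um / (objective_na + eff_ill)
```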
  • the microscope system 5000 according to Modification 2 of the first embodiment differs from the microscope system 5000 according to the first embodiment in that the second image G200a is generated without using the operation unit 506. Differences from the microscope system 5000 according to the first embodiment will be described below.
  • FIG. 24 is a block diagram showing a configuration example of a microscope system 5000 according to modification 2 of the first embodiment.
  • the image synthesizing unit 522 uses the first image G100 accumulated in the image accumulation unit 502 according to preset parameters to generate a second image G200a corresponding to an image captured by the second optical system 5104.
  • the image synthesizing unit 522 accumulates the generated second image G200a in the synthetic image accumulating unit 545.
  • the composite image storage unit 545 is connected to an in-hospital viewer system, for example, via a network.
  • FIG. 25 is a flowchart showing a processing example of the microscope system 5000 according to modification 2 of the first embodiment.
  • the image synthesizing unit 522 generates a second image G200a using the first image G100 read from the image storage unit (storage unit) 502 according to the preset magnification and numerical aperture (step S406). Subsequently, the image synthesizing unit 522 accumulates the generated second image G200a in the synthetic image accumulation unit 545.
  • the viewer system displays the second image G200a accumulated in the composite image accumulation unit 545 (step S214), and ends the overall processing.
  • as described above, in Modification 2 of the first embodiment, the image synthesizing unit 522 generates the second image G200a using the first image G100 read from the image storage unit (storage unit) 502 according to the preset magnification and numerical aperture. As a result, the second image G200a can be generated for each set specimen without observer intervention, and pathological images can be generated more efficiently.
  • Note that the present technology can also be configured as follows.
  • (1) An information processing device comprising: an acquisition unit that acquires a first image captured via a first optical system; and a processing unit that processes the first image using optical system information related to a second optical system different from the first optical system to generate a second image.
  • (2) The information processing device according to (1), wherein the first image is composed of a plurality of first images captured while changing the imaging position in the optical axis direction of the first optical system, and the processing unit, when the numerical aperture of the second optical system is smaller than the numerical aperture of the first optical system, generates the second image based on the plurality of first images so as to increase the image information in the optical axis direction relative to the first image.
  • the first image is composed of a plurality of first images captured while changing the imaging position in the optical axis direction of the first optical system, and the processing unit increases the image information in the optical axis direction relative to the first image based on the plurality of first images.
  • The information processing device according to (5), wherein the processing unit generates the second image based on the plurality of first images so as to have image information corresponding to an image with a deep depth of field.
  • The information processing device according to (6), wherein, when the imaging magnification of the second optical system is lower than the imaging magnification of the first optical system, the processing unit generates the second image based on the plurality of first images so as to have image information corresponding to an image with a deep depth of field.
  • the first image is composed of a plurality of first images taken by changing the shooting position in the optical axis direction of the first optical system,
  • the information processing device according to (5), wherein the processing unit changes the value of the coefficient according to the distance from the optical axis of the second optical system based on the characteristics of the objective lens of the second optical system. .
  • The information processing device, wherein the processing unit changes the value of the coefficient so that the image information on the objective lens side increases as the distance from the optical axis of the first optical system increases.
  • the processing unit reduces the spatial frequency of the second image below the spatial frequency of the first image when the numerical aperture of the second optical system is smaller than the numerical aperture of the first optical system.
  • the processing unit performs processing to reduce the spatial frequency of the first image when the imaging magnification of the second optical system is lower than the imaging magnification of the first optical system.
  • the information processing apparatus, wherein the processing unit performs processing to increase the density of the first image when the numerical aperture of the second optical system is smaller than the numerical aperture of the first optical system.
  • the information processing apparatus, wherein the processing unit performs processing to increase the density of the first image when the imaging magnification of the second optical system is lower than the imaging magnification of the first optical system.
  • (16) the information processing apparatus according to (1), further comprising a selection unit that selects one of a plurality of pieces of different optical system information as the information related to the second optical system, wherein the processing unit changes the processing applied to the first image according to the information selected by the selection unit.
  • (17) the information processing apparatus according to (1), further comprising a display control unit that causes a display unit to display, side by side, a first display image corresponding to the first image and a second display image corresponding to the second image processed by the processing unit.
  • the first display image is an image in which a change in the numerical aperture is suppressed even when the imaging magnification is changed, and
  • the second display image is an image in which the numerical aperture varies according to the change in the imaging magnification.
  • an information processing method comprising: acquiring a first image captured via a first optical system; and processing the first image, using optical system information related to a second optical system different from the first optical system, to generate a second image.
  • a microscope system comprising: a microscope device that uses a first objective lens to obtain a first image of biological tissue captured at at least two focal positions; and an information processing device,
  • wherein the information processing device includes an acquisition unit that acquires the first image, and a processing unit that processes the first image using optical system information related to a second optical system different from the first optical system in the microscope device.
  • 515 image acquisition unit
  • 522 image synthesis unit (processing unit)
  • 5000 microscope system
  • 5102 first optical system
  • 5104 second optical system
  • 5120, 5120a information processing device
  • 5100 microscope device
  • G100 first image
  • G100a first image
  • G200a second image
  • L100 optical axis of the first optical system
  • L200 optical axis of the second optical system
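The Z-stack claims above describe combining a plurality of first images, captured at different positions along the optical axis, into a second image that carries the depth information of different target optics (e.g. a deeper depth of field at a smaller numerical aperture). A minimal focus-stacking sketch of that idea follows; it is not taken from the patent, and the function name, the gradient-based sharpness measure, and the `(Z, H, W)` array layout are illustrative assumptions:

```python
import numpy as np

def focus_stack(z_stack):
    """Combine a Z-stack into one extended-depth-of-field image.

    z_stack: array of shape (Z, H, W), grayscale slices captured at
    different focal positions along the optical axis.
    For each pixel, keep the value from the slice whose local contrast
    (squared gradient magnitude) is highest -- a simple sharpness criterion.
    """
    z_stack = np.asarray(z_stack, dtype=float)
    # Per-slice sharpness map: squared in-plane gradient magnitude.
    gy, gx = np.gradient(z_stack, axis=(1, 2))
    sharpness = gx**2 + gy**2
    # Index of the sharpest slice at each (row, col) position.
    best = np.argmax(sharpness, axis=0)
    h, w = best.shape
    rows, cols = np.meshgrid(np.arange(h), np.arange(w), indexing="ij")
    return z_stack[best, rows, cols]
```

Real systems typically smooth the sharpness maps and blend slices rather than hard-selecting one per pixel, but the per-pixel argmax above is the core of the technique.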

Abstract

The problem addressed by the present invention is to provide an information processing device, an information processing method, and a microscope system capable of providing an image better suited to the viewer's preference. To this end, the invention provides an information processing device comprising: an acquisition unit that acquires a first image captured via a first optical system; and a processing unit that processes the first image, using optical system information related to a second optical system different from the first optical system, to generate a second image.
PCT/JP2022/008728 2021-06-09 2022-03-02 Information processing device, information processing method, and microscope system WO2022259647A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2021096824 2021-06-09
JP2021-096824 2021-06-09

Publications (1)

Publication Number Publication Date
WO2022259647A1 (fr) 2022-12-15

Family

ID=84425146

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2022/008728 WO2022259647A1 (fr) 2021-06-09 2022-03-02 Information processing device, information processing method, and microscope system

Country Status (1)

Country Link
WO (1) WO2022259647A1 (fr)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2002247439A * 2001-02-13 2002-08-30 Ricoh Co Ltd Image input device, image input method, and computer-readable recording medium storing a program for executing the method on a computer
JP2013117848A * 2011-12-02 2013-06-13 Canon Inc Image processing apparatus and image processing method
JP2014071207A * 2012-09-28 2014-04-21 Canon Inc Image processing apparatus, imaging system, and image processing system

Similar Documents

Publication Publication Date Title
EP3776458B1 Augmented reality microscope for pathology with overlay of quantitative biomarker data
CN110476101B Augmented reality microscope for pathology
Yagi et al. Digital imaging in pathology: the case for standardization
RU2522123C2 System and method for improved autofocusing with prediction
JP5715371B2 Imaging system and imaging method with enhanced depth of field
CN107850754A Imaging assembly with rapid sample autofocus
JP5826561B2 Microscope system, specimen image generation method, and program
JP2011085932A Imaging system and imaging method with enhanced depth of field
US20110091125A1 System and method for imaging with enhanced depth of field
WO2021177446A1 Signal acquisition apparatus, signal acquisition system, and signal acquisition method
JP2022126373A Information processing device, information processing method, computer program, and medical diagnosis system
WO2022259647A1 Information processing device, information processing method, and microscope system
WO2022050109A1 Image processing device, image processing method, and image processing system
WO2022209262A1 Illumination device for a biological sample observation device, biological sample observation device, illumination device for an observation device, and observation system
WO2022181263A1 Medical image processing device, medical image processing method, and program
WO2022209349A1 Illumination device for an observation device, observation device, and observation system
WO2022202233A1 Information processing device, information processing method, information processing system, and conversion model
WO2022209443A1 Medical image analysis device, medical image analysis method, and medical image analysis system
WO2022259648A1 Information processing program, information processing device, information processing method, and microscope system
WO2020075226A1 Operation method of an image processing device, image processing device, and operation program for an image processing device
WO2022201992A1 Medical image analysis device, medical image analysis method, and medical image analysis system
JP2004151263A Microscope apparatus
WO2023149296A1 Information processing device, biological sample observation system, and image production method
WO2023276219A1 Information processing device, biological sample observation system, and image production method
WO2021220857A1 Image processing device, image processing method, and image processing system

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22819837

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE