WO2020066041A1 - Microscope system - Google Patents

Microscope system

Info

Publication number
WO2020066041A1
WO2020066041A1 (PCT application PCT/JP2018/047494)
Authority
WO
WIPO (PCT)
Prior art keywords
image
projection
microscope
microscope system
sample
Prior art date
Application number
PCT/JP2018/047494
Other languages
French (fr)
Japanese (ja)
Inventor
中田 竜男
佐々木 浩
城田 哲也
Original Assignee
オリンパス株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by オリンパス株式会社
Priority to JP2020547902A (granted as JP7150867B2)
Priority to EP18935405.3A (published as EP3988986A4)
Priority to CN201880097737.6A (granted as CN112703440B)
Publication of WO2020066041A1
Priority to US17/196,705 (published as US20210215923A1)

Classifications

    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B 21/00 Microscopes
    • G02B 21/36 Microscopes arranged for photographic purposes or projection purposes or digital imaging or video purposes including associated control and data processing arrangements
    • G02B 21/368 details of associated display arrangements, e.g. mounting of LCD monitor
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B 21/00 Microscopes
    • G02B 21/02 Objectives
    • G02B 21/025 Objectives with variable magnification
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B 21/00 Microscopes
    • G02B 21/18 Arrangements with more than one light path, e.g. for comparing two specimens
    • G02B 21/20 Binocular arrangements
    • G02B 21/22 Stereoscopic arrangements
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B 21/00 Microscopes
    • G02B 21/36 Microscopes arranged for photographic purposes or projection purposes or digital imaging or video purposes including associated control and data processing arrangements
    • G02B 21/364 Projection microscopes
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B 21/00 Microscopes
    • G02B 21/36 Microscopes arranged for photographic purposes or projection purposes or digital imaging or video purposes including associated control and data processing arrangements
    • G02B 21/365 Control or image processing arrangements for digital or video microscopes
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B 21/00 Microscopes
    • G02B 21/0004 Microscopes specially adapted for specific applications
    • G02B 21/0012 Surgical microscopes
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B 21/00 Microscopes
    • G02B 21/36 Microscopes arranged for photographic purposes or projection purposes or digital imaging or video purposes including associated control and data processing arrangements
    • G02B 21/365 Control or image processing arrangements for digital or video microscopes
    • G02B 21/367 Control or image processing arrangements for digital or video microscopes providing an output produced by processing a plurality of individual source images, e.g. image tiling, montage, composite images, depth sectioning, image comparison

Definitions

  • the disclosure herein relates to a microscope system.
  • WSI (Whole Slide Imaging) technology is a technology that creates a digital image of the entire specimen on a slide glass.
  • the WSI technology is described in Patent Document 1, for example.
  • Techniques such as the WSI technology, in which a plurality of images are tiled to form a high-resolution image larger than the field of view of the microscope, are also used in industrial applications. For example, in quality control, one application is inspecting and evaluating the microstructure of a material of an industrial part.
  • An object according to one aspect of the present invention is to provide a new technology that reduces the burden on an operator by assisting operations such as diagnosis, inspection, and evaluation performed based on an optical image obtained by an optical microscope.
  • a microscope system according to one aspect includes: an eyepiece; an objective lens that guides light from a sample to the eyepiece; an imaging lens that is disposed on an optical path between the eyepiece and the objective lens and forms an optical image of the sample based on the light from the sample; an imaging device that acquires digital image data of the sample based on the light from the sample; a projection device that projects a projection image onto the image plane, between the imaging lens and the eyepiece, on which the optical image is formed; and a control device that manages microscope information including at least a first magnification at which the sample is projected onto the image plane, a second magnification at which the sample is projected onto the imaging device, a third magnification at which the projection device is projected onto the image plane, a size of the imaging device, and a size of the projection device.
  • the burden on the operator can be reduced.
  • FIG. 1 is a diagram showing the configuration of a microscope system 1.
  • FIG. 2 is a diagram for explaining microscope information MI.
  • FIG. 3 is a diagram showing the configuration of a control device 10.
  • FIG. 4 is an example of a flowchart of an image projection process performed by the microscope system 1.
  • FIG. 5 is an example of an image viewed from the eyepiece 104 of the microscope system 1.
  • FIG. 6 is another example of an image viewed from the eyepiece 104 of the microscope system 1.
  • FIG. 7 is still another example of an image viewed from the eyepiece 104 of the microscope system 1.
  • FIG. 8 is another example of a flowchart of an image projection process performed by the microscope system 1.
  • FIGS. 9 to 12 are still other examples of images viewed from the eyepiece 104 of the microscope system 1.
  • FIG. 13 is still another example of a flowchart of the image projection process performed by the microscope system 1.
  • FIG. 14 is a diagram for explaining binning.
  • FIG. 15 is a diagram for explaining a capture range.
  • FIG. 16 is a diagram showing the configuration of a microscope system 2.
  • FIG. 17 is a diagram showing the configuration of a microscope system 3.
  • FIG. 18 is a diagram showing the configuration of a microscope 400.
  • FIG. 19 is a diagram showing the configuration of a microscope 600.
  • FIG. 20 is a diagram showing the configuration of a microscope 700.
  • FIG. 1 is a diagram showing a configuration of a microscope system 1 according to the present embodiment.
  • FIG. 2 is a diagram for explaining the microscope information MI.
  • FIG. 3 is a diagram illustrating a configuration of the control device 10.
  • the microscope system 1 is a microscope system for observing a sample by looking through the eyepiece 104, and includes at least an objective lens 102, an imaging lens 103, an eyepiece 104, an imaging device 140, a projection device 131, and a control device 10.
  • the microscope system 1 projects a projection image, using the projection device 131, onto the image plane on which an optical image of the sample is formed by the objective lens 102 and the imaging lens 103. Various information can thereby be provided to a user of the microscope system 1 who observes the sample through the eyepiece 104. The microscope system 1 can therefore assist the user in performing work while observing the sample via the optical image. Further, in the microscope system 1, the control device 10 manages the microscope information MI, and the microscope system 1 can appropriately perform projection control for projecting the projection image onto the image plane by using the microscope information MI managed by the control device 10.
  • the microscope system 1 includes a microscope 100, a control device 10, an input device 40, and an audio output device 50, as shown in FIG. Note that the microscope system 1 may include a display device and the like in addition to the above.
  • the microscope 100 is, for example, an upright microscope, and includes a microscope main body 110, a lens barrel 120, an intermediate lens barrel 130, and an imaging device 140. Note that the microscope 100 may be an inverted microscope.
  • the microscope main body 110 includes a stage 101 on which a sample is mounted, an objective lens (objective lens 102, objective lens 102a) that guides light from the sample to the eyepiece 104, an epi-illumination optical system, and a transmission illumination optical system.
  • Stage 101 may be a manual stage or an electric stage. It is desirable that a plurality of objective lenses having different magnifications are mounted on the revolver.
  • the microscope main body 110 may include at least one of an epi-illumination optical system and a transmission illumination optical system.
  • the microscope main body 110 further includes a turret 111 for switching the microscope method.
  • a turret 111 for example, a fluorescent cube used in a fluorescence observation method, a half mirror used in a bright field observation method, and the like are arranged.
  • the microscope main body 110 may be provided with an optical element used in a specific microscope method so that it can be inserted into and removed from the optical path.
  • the microscope main body 110 may include, for example, a DIC prism, a polarizer, and an analyzer used in the differential interference observation method.
  • the lens barrel 120 is a trinocular barrel on which the eyepiece 104 and the imaging device 140 are mounted.
  • An imaging lens 103 is provided in the lens barrel 120.
  • the imaging lens 103 is arranged on an optical path between the objective lens 102 and the eyepiece 104.
  • the imaging lens 103 forms an optical image of the sample on an image plane between the eyepiece 104 and the imaging lens 103 based on light from the sample. That is, the objective lens 102 and the imaging lens 103 project the object plane OP1 shown in FIG. 2 onto the image plane IP1.
  • the imaging lens 103 also forms an optical image of the sample on the image plane IP1a between the imaging element 141 and the imaging lens 103 based on light from the sample. That is, the objective lens 102 and the imaging lens 103 also project the object plane OP1 shown in FIG. 2 onto the image plane IP1a.
  • the projection magnification for projecting the object plane OP1 onto the image plane IP1 and the image plane IP1a is a projection magnification α, calculated as (focal length of the imaging lens 103) / (focal length of the objective lens 102).
  • the imaging lens 103 also forms a later-described projection image on these image planes (image plane IP1 and image plane IP1a) based on light from the projection device 131. That is, the projection lens 133 and the imaging lens 103 project the display surface OP2 shown in FIG. 2 onto the image surfaces IP1 and IP1a. Accordingly, the projection image is superimposed on the optical image on the image plane, so that the user of the microscope system 1 can see the superimposed image in which the projection image is superimposed on the optical image by looking through the eyepiece 104.
  • the projection magnification for projecting the display surface OP2 onto the image plane IP1 and the image plane IP1a is a projection magnification γ, calculated as (focal length of the imaging lens 103) / (focal length of the projection lens 133).
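As a concrete illustration (not part of the disclosure), the two magnification relations above are simple focal-length ratios. The function names and focal-length values below are assumptions for the sake of the example.

```python
# Illustrative sketch: the projection magnifications described above,
# expressed as focal-length ratios. Focal lengths are example values in mm.

def sample_to_image_plane_magnification(f_imaging_lens: float, f_objective: float) -> float:
    # Object plane OP1 -> image planes IP1/IP1a (the "first magnification")
    return f_imaging_lens / f_objective

def projector_to_image_plane_magnification(f_imaging_lens: float, f_projection_lens: float) -> float:
    # Display surface OP2 -> image planes IP1/IP1a (the "third magnification")
    return f_imaging_lens / f_projection_lens

# Example: a 180 mm imaging lens with a 9 mm objective gives a 20x optical image.
print(sample_to_image_plane_magnification(180.0, 9.0))     # 20.0
print(projector_to_image_plane_magnification(180.0, 90.0)) # 2.0
```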
  • the imaging lens 103 has a function of changing the focal length without changing the position of the image plane, a function of changing the position of the image plane without changing the focal length, or a function of changing the position of the image plane and the focal length independently of each other.
  • examples of lenses that realize these functions include a lens that moves at least some of the lenses forming the imaging lens 103 in the optical axis direction, and an active lens that changes, for example by electrical control, at least one of the radius of curvature and the refractive index of at least some of the lenses of the optical system forming the imaging lens 103.
  • the active lens may be, for example, a liquid lens.
  • the intermediate lens barrel 130 is provided between the microscope main body 110 and the lens barrel 120.
  • the intermediate lens barrel 130 includes a projection device 131, a light deflecting element 132, and a projection lens 133.
  • the projection device 131 is a device that projects a projection image on an image plane on which an optical image is formed in accordance with a command from the control device 10.
  • the projection device 131 is, for example, a projector using a liquid crystal device, a projector using a digital mirror device, a projector using LCOS, and the like.
  • the size of the display surface OP2 of the projection device 131 is size B.
  • the display surface OP2 is a surface from which the projection device 131 emits light.
  • the size of the projection device 131 refers to the size of a region from which the projection device 131 emits light, and specifically, for example, a diagonal length.
  • the light deflector 132 deflects the light emitted from the projection device 131 toward the image plane, and guides the light to the eyepiece.
  • the light deflecting element 132 is, for example, a beam splitter such as a half mirror or a dichroic mirror, and different types of beam splitters may be used depending on the microscope method.
  • a variable beam splitter that changes the transmittance and the reflectance may be used.
  • the light deflecting element 132 is arranged on an optical path between the objective lens 102 and the eyepiece 104, more specifically, between the objective lens 102 and the imaging lens 103.
  • the projection lens 133 is a lens that guides light from the projection device 131 to the imaging lens 103.
  • the projection lens 133 may be a lens having a function of changing at least one of the position of the image plane and the focal length, for example, an active lens. By changing the focal length of the projection lens 133, the size of the projection image can be adjusted independently of the size of the optical image.
  • the imaging device 140 is, for example, a digital camera, and includes an imaging element 141 and an adapter lens 142.
  • the imaging device 140 acquires digital image data of the sample based on light from the sample.
  • the image sensor 141 is an example of a photodetector that detects light from a sample.
  • the image sensor 141 is a two-dimensional image sensor, such as a CCD image sensor or a CMOS image sensor.
  • the image sensor 141 detects light from the sample and converts the light into an electric signal.
  • the size of the image sensor 141 on the image plane IP2 is size A.
  • the image plane IP2 is a light receiving surface of the image sensor 141.
  • the size of the image sensor 141 refers to the size of an effective pixel area of the image sensor 141, and specifically, for example, a diagonal length.
  • the digital image represented by the digital image data acquired by the imaging device 140 may include a projection image in addition to the optical image of the sample.
  • the imaging device 140 can acquire digital image data of a sample that does not include a projection image.
  • the adapter lens 142 projects the optical image formed on the image plane IP1a to the image sensor 141. That is, the image plane IP1a is projected onto the image plane IP2.
  • the projection magnification for projecting the object plane OP1 onto the image plane IP2 is a projection magnification β.
  • the input device 40 outputs an operation signal according to a user's input operation to the control device 10.
  • the input device 40 is, for example, a keyboard, but may include a mouse, a joystick, a touch panel, and the like.
  • the input device 40 includes a voice input device 41 that receives a voice input.
  • the voice input device 41 is, for example, a microphone.
  • the sound output device 50 outputs a sound according to an instruction from the control device 10.
  • the audio output device 50 is, for example, a speaker.
  • the control device 10 controls the entire microscope system 1.
  • the control device 10 is connected to the microscope 100, the input device 40, and the audio output device 50.
  • the control device 10 mainly includes an imaging control unit 21, an image analysis unit 22, a projection image generation unit 23, a projection control unit 24, and a communication control unit 25 as components related to control of the projection device 131.
  • the control device 10 further includes an information management unit 30.
  • the imaging control unit 21 acquires digital image data of a sample from the imaging device 140 by controlling the imaging device 140.
  • the imaging control unit 21 may control the imaging device 140 so that the exposure period of the imaging device 140 and the projection period of the projection device 131 do not overlap.
  • the digital image data acquired by the imaging control unit 21 is output to the image analysis unit 22, the projection image generation unit 23, and the communication control unit 25.
  • the image analysis unit 22 analyzes the digital image data acquired by the imaging control unit 21 and outputs the analysis result to the projection image generation unit 23.
  • the content of the analysis processing performed by the image analysis unit 22 is not particularly limited.
  • the analysis process may be, for example, a process of counting the number of cells in a digital image, or a process of graphing a time change such as the number of cells and the cell density. Further, the process may be a process of automatically detecting a region of interest based on a threshold value of luminance, or a process of recognizing a shape of a structure shown in a digital image, calculating a center of gravity, or the like.
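One of the analyses named above, detecting a region of interest by a luminance threshold and computing a center of gravity, can be sketched as follows. This is a minimal illustration, not the patent's algorithm; the function name and test image are assumptions.

```python
import numpy as np

# Minimal sketch of threshold-based region-of-interest detection with a
# center-of-gravity (centroid) computation, as mentioned in the text.

def region_of_interest(image: np.ndarray, threshold: float):
    """Return the centroid (row, col) of pixels brighter than threshold, or None."""
    mask = image > threshold
    if not mask.any():
        return None
    ys, xs = np.nonzero(mask)
    return float(ys.mean()), float(xs.mean())

img = np.zeros((5, 5))
img[1:3, 2:4] = 200.0  # a bright 2x2 patch
print(region_of_interest(img, 100.0))  # (1.5, 2.5)
```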
  • the image analysis unit 22 may, for example, classify one or more structures appearing in the digital image represented by the digital image data into one or more classes, and output an analysis result including information specifying the positions of the structures classified into at least one of the one or more classes. More specifically, the image analysis unit 22 may classify the cells shown in the digital image according to staining intensity, and generate class information indicating the class into which each cell is classified and position information specifying the outline of each cell or the outline of its nucleus.
  • the structures classified into at least one class are objects that serve as a basis for determination in pathological diagnosis by a pathologist.
  • the image analysis unit 22 may track the region of interest in the sample based on, for example, digital image data.
  • the analysis result output by the image analysis unit 22 includes the position information of the attention area.
  • the attention area to be tracked may be determined by analyzing the digital image data, or may be determined by the user using the input device 40 to specify.
  • the projection image generation unit 23 generates projection image data representing a projection image.
  • the projection image data generated by the projection image generation unit 23 is output to the projection control unit 24 and the communication control unit 25.
  • the projection image generation unit 23 generates projection image data based on at least the microscope information MI managed by the information management unit 30. Further, the projection image generation unit 23 may generate projection image data based on the microscope information MI and the analysis result of the image analysis unit 22. Further, the projection image generation unit 23 may generate projection image data based on the microscope information MI and data received by the communication control unit 25 from an external system. Further, the projection image generation unit 23 may generate projection image data based on the microscope information MI and the input information from the input device 40.
  • the projection control unit 24 controls the projection of the projection image on the image plane by controlling the projection device 131.
  • the projection control unit 24 may control the projection device 131 according to the setting of the microscope system 1.
  • the projection control unit 24 may determine whether to project the projection image onto the image plane according to the setting of the microscope system 1, and may control the projection device 131 so that the projection device 131 projects the projection image onto the image plane only when that setting is enabled. That is, whether or not the projection image is projected onto the image plane can be changed by a setting of the microscope system 1.
  • the projection control unit 24 may control the projection device 131 so that, for example, the light emission period of the projection device 131 and the exposure period of the image sensor 141 do not overlap. Thereby, it is possible to prevent the projection image from appearing in the digital image.
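The non-overlap condition above is a simple interval test. The sketch below (hypothetical names, arbitrary time units) only makes the condition concrete.

```python
# Hypothetical sketch of the timing condition described above: the projection
# device's light-emission period must not overlap the image sensor's exposure
# period, so the projection image does not appear in the digital image.
# Each period is a (start, end) tuple in a common time base.

def periods_overlap(exposure: tuple, emission: tuple) -> bool:
    return max(exposure[0], emission[0]) < min(exposure[1], emission[1])

# Exposure during 0-10 ms, projector emitting during 10-15 ms: no overlap,
# so the digital image is free of the projection image.
print(periods_overlap((0, 10), (10, 15)))  # False
print(periods_overlap((0, 10), (5, 15)))   # True
```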
  • the communication control unit 25 exchanges data with a system outside the microscope system 1.
  • the microscope system 1 is connected to an external system via a network such as the Internet.
  • the communication control unit 25 may, for example, transmit image data to an external system and receive an analysis result of the image data. Further, the communication control unit 25 may receive, for example, operation information input by a user of an external system.
  • the information management unit 30 manages the microscope information MI.
  • the microscope information MI includes at least a projection magnification α that is a first magnification at which the sample is projected onto the image plane IP1, a projection magnification β that is a second magnification at which the sample is projected onto the imaging device 140, a projection magnification γ that is a third magnification at which the projection device 131 is projected onto the image plane IP1, a size A of the imaging device 140, and a size B of the projection device 131.
  • the projection magnification α, the projection magnification β, the projection magnification γ, the size A of the imaging device 140, and the size B of the projection device 131 are information used to project the projection image at a desired position on the optical image at a desired size, and are hereinafter referred to as basic information.
  • the microscope information MI may include other information in addition to the basic information.
  • the microscope information MI may include, for example, information on the microscope method used for forming the optical image. Further, the microscope information MI may include a combination of coordinate information in a direction orthogonal to the optical axis of the objective lens 102 and coordinate information in the optical axis direction indicating the position of the stage 101 in the focused state.
  • the microscope information MI may include the field number (FN) of the eyepiece 104, the objective field number (OFN) of the objective lens 102, the number of effective pixels of the imaging device 140, the number of pixels of the projection device 131, and the like.
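The microscope information MI described above could be represented, for example, as a simple record. The field names and default values below are illustrative assumptions; the patent does not prescribe a data layout.

```python
from dataclasses import dataclass
from typing import Optional

# Minimal sketch of the microscope information MI managed by the information
# management unit 30. Field names are illustrative, not from the patent.

@dataclass
class MicroscopeInfo:
    # Basic information, used to place the projection image at a desired
    # position and size on the optical image:
    alpha: float            # first magnification: sample -> image plane IP1
    beta: float             # second magnification: sample -> imaging device
    gamma: float            # third magnification: projection device -> image plane IP1
    sensor_size_mm: float   # size A: diagonal of the sensor's effective pixel area
    display_size_mm: float  # size B: diagonal of the projector's emitting area
    # Optional extras mentioned in the text:
    microscopy_method: Optional[str] = None       # e.g. "brightfield", "fluorescence"
    eyepiece_field_number: Optional[float] = None

mi = MicroscopeInfo(alpha=20.0, beta=10.0, gamma=2.0,
                    sensor_size_mm=16.0, display_size_mm=18.5,
                    microscopy_method="brightfield")
print(mi.alpha / mi.gamma)  # ratio used when sizing on-image annotations
```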
  • the control device 10 may be a general-purpose device or a dedicated device.
  • the physical configuration of the control device 10 is not particularly limited; for example, the control device 10 may have the configuration shown in FIG. 3.
  • the control device 10 may include a processor 10a, a memory 10b, an auxiliary storage device 10c, an input/output interface 10d, a medium drive device 10e, and a communication control device 10f, which are connected to one another by a bus 10g.
  • the processor 10a is an arbitrary processing circuit including, for example, a CPU (Central Processing Unit).
  • the processor 10a executes a program stored in the memory 10b, the auxiliary storage device 10c, or the storage medium 10h to perform programmed processing, thereby realizing the above-described components related to the control of the projection device 131 (the imaging control unit 21, the image analysis unit 22, the projection image generation unit 23, and so on).
  • the processor 10a may be configured using a dedicated processor such as an ASIC or an FPGA.
  • the memory 10b is a working memory of the processor 10a.
  • the memory 10b is an arbitrary semiconductor memory such as a random access memory (RAM).
  • the auxiliary storage device 10c is a nonvolatile memory such as an EPROM (Erasable Programmable ROM) or a hard disk drive.
  • the input / output interface 10d exchanges information with external devices (the microscope 100, the input device 40, and the audio output device 50).
  • the medium drive device 10e can output data stored in the memory 10b and the auxiliary storage device 10c to the storage medium 10h, and can read out programs and data from the storage medium 10h.
  • the storage medium 10h is any portable storage medium.
  • the storage medium 10h includes, for example, an SD card, a USB (Universal Serial Bus) flash memory, a CD (Compact Disc), a DVD (Digital Versatile Disc), and the like.
  • the communication control device 10f inputs and outputs information to and from a network.
  • the communication control device 10f is, for example, a NIC (Network Interface Card) or a wireless LAN (Local Area Network) card.
  • the bus 10g connects the processor 10a, the memory 10b, the auxiliary storage device 10c, and the like so that data can be exchanged with each other.
  • FIG. 4 is a flowchart of the image projection process performed by the microscope system 1.
  • FIG. 5 is an example of an image viewed from the eyepiece 104 of the microscope system 1.
  • an image projection method of the microscope system 1 will be described with reference to FIGS. 4 and 5.
  • the microscope system 1 projects the optical image of the sample on the image plane IP1 (Step S1).
  • the light from the sample captured by the objective lens 102 is condensed by the imaging lens 103 on the image plane IP1, and an optical image of the sample is formed.
  • the optical image O1 is projected on the region R1 on the image plane IP1.
  • the region R1 indicates a region on the image plane IP1 where the light beam from the objective lens 102 enters.
  • the region R2 indicates a region on the image plane IP1 which can be seen by looking through the eyepiece lens 104.
  • the microscope system 1 acquires the microscope information MI (step S2).
  • the projection image generation unit 23 acquires the microscope information MI from the information management unit 30.
  • the microscope system 1 generates projection image data (Step S3).
  • the projection image generation unit 23 generates projection image data based on the microscope information MI acquired in step S2.
  • the projection image generation unit 23 uses the projection magnification α, the projection magnification γ, and the size B of the projection device 131 included in the microscope information MI to generate projection image data representing a projection image P1 including a scale, as shown in an image V2 in FIG. 5. More specifically, from the relationship between the projection magnification α and the projection magnification γ, a unit length on the object plane OP1 corresponds to α/γ times that length on the display surface OP2. Since the size per pixel of the projection device 131 is known from the size B of the projection device 131, it is also known how many pixels on the display surface OP2 correspond to a unit length on the object plane OP1.
  • the projection image generation unit 23 uses these relationships to generate projection image data for projecting the scale on the optical image O1 with an accurate size.
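These relationships can be sketched as a short computation: a length on the object plane scales by the ratio of the first magnification (sample to image plane) to the third magnification (projector to image plane), and the projector's pixel pitch, derivable from size B and the pixel count, converts that to pixels. All names and values below are illustrative assumptions.

```python
# Hypothetical sketch of the scale computation described above. A unit length
# on the object plane OP1 corresponds to (first magnification / third
# magnification) times that length on the display surface OP2; dividing by
# the projector's pixel pitch gives the scale length in projector pixels.

def scale_bar_pixels(length_on_sample_mm: float,
                     mag_sample_to_image: float,
                     mag_projector_to_image: float,
                     projector_pixel_pitch_mm: float) -> int:
    length_on_display_mm = (length_on_sample_mm
                            * mag_sample_to_image / mag_projector_to_image)
    return round(length_on_display_mm / projector_pixel_pitch_mm)

# Example: a 10 um (0.01 mm) scale bar, 20x sample magnification, 2x projector
# magnification, and 0.01 mm projector pixels -> a 10-pixel scale bar.
print(scale_bar_pixels(0.01, 20.0, 2.0, 0.01))  # 10
```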
  • the microscope system 1 projects the projection image on the image plane IP1 (Step S4).
  • the projection control unit 24 controls the projection device 131 based on the projection image data generated in step S3, so that the projection device 131 projects the projection image on the image plane.
  • the projection image P1 including the scale is superimposed on the optical image O1.
  • a projection image is projected on an image plane IP1 on which an optical image is formed.
  • the microscope system 1 generates projection image data using microscope information including a projection magnification and the like.
  • information having a desired size with respect to the optical image can be displayed at a desired position with respect to the optical image using the projection image. Therefore, according to the microscope system 1, it is possible to assist the user in performing the work while observing the sample with the optical image, and it is possible to reduce the work load on the user.
  • the microscope system 1 does not require expensive equipment unlike a WSI system that performs pathological diagnosis based on digital images. Therefore, according to the microscope system 1, the burden on the user can be reduced while avoiding a significant increase in equipment cost.
  • FIGS. 6 and 7 show another example of an image viewed from the eyepiece 104 of the microscope system 1.
  • FIG. 5 shows an example in which the projection image P1 including the scale is projected onto the image plane IP1, but the projection image projected onto the image plane IP1 is not limited to the projection image P1.
  • the projection device 131 may project a projection image P2 including setting information of the microscope system 1.
  • the projection image generation unit 23 can also generate the projection image data representing the projection image P2 based on the microscope information.
  • the projection image generation unit 23 may determine the background color of the projection image P2 based on the information on the microscope method included in the microscope information. For example, during bright-field observation, the background color of the optical image is white, and thus the background color of the projection image P2 may be similarly white. Further, the background color of the projection image P2 may be intentionally different from the background color of the optical image. Further, the projection image P2 may be projected only for a period instructed by the user. For example, when the user instructs display by voice, the projection image generation unit 23 may generate projection image data, and the projection device 131 may project the projection image P2 for a predetermined period (for example, 10 seconds). When the user instructs the display by voice, the microscope information may be output from the voice output device 50 as voice.
  • the projection device 131 may project a projection image P3 including a navigation display prompting the user to perform an operation.
  • The projection image data representing the projection image P3 is also generated by the projection image generation unit 23 based on the microscope information. For this reason, the projection device 131 can project the projection image P3 at a position where both the product included in the optical image O2 and the navigation display included in the projection image P3 can be confirmed simultaneously. The contents of the navigation display may also be output as audio from the audio output device 50. Further, when the user, following the navigation display, inputs the product number (the product identification information) by voice, the projection image generation unit 23 may generate projection image data based on the microscope information and the product number input from the audio input device 41.
  • As shown in an image V5 in FIG. 7, the projection device 131 may project the projection image P4 onto the optical image O2, and the imaging device 140 may further acquire, based on the light from the sample and the light from the projection device 131, superimposed image data representing a superimposed image in which the projection image P4 is superimposed on the optical image O2.
  • the projection image P4 is a barcode that specifies the product indicated by the optical image O2.
  • By including in the projection image an image related to the product identified by the product number, the product and its identification information can be recorded at the same time.
  • the microscope system 1 may perform the image projection processing shown in FIG. 8 instead of the image projection processing shown in FIG.
  • FIG. 8 is another example of the flowchart of the image projection process performed by the microscope system 1.
  • FIG. 9 is still another example of an image viewed from the eyepiece 104 of the microscope system 1.
  • the microscope system 1 first projects an optical image of the sample onto the image plane IP1 (Step S11), and further acquires microscope information (Step S12).
  • Steps S11 and S12 are the same as steps S1 and S2 shown in FIG.
  • the microscope system 1 acquires digital image data of the sample (step S13).
  • the imaging device 140 acquires digital image data by imaging the sample based on light from the sample.
  • the microscope system 1 analyzes the digital image data (Step S14).
  • the image analysis unit 22 analyzes the digital image data and generates, for example, information that assists in pathological diagnosis. Specifically, the nuclei of the cells are specified by analysis, and classified according to the staining intensity.
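The classification step described here can be illustrated with a small sketch. The embodiment does not specify a particular algorithm, so everything below is a hypothetical simplification: it assumes the nuclei have already been segmented into a label image, and it simply bins each nucleus by its mean staining intensity (function name and thresholds are assumptions, not part of the disclosure):

```python
import numpy as np

def classify_nuclei(stain, labels, thresholds=(0.2, 0.5, 0.8)):
    """Assign each labeled nucleus a staining class.

    stain  : 2-D float array of staining intensity in [0, 1]
    labels : 2-D int array, 0 = background, 1..N = nucleus IDs
    Returns {nucleus_id: class}, where class 0 = negative .. 3 = strong.
    """
    classes = {}
    for nid in np.unique(labels):
        if nid == 0:
            continue  # skip background
        mean = float(stain[labels == nid].mean())
        # number of thresholds below the mean = staining class
        classes[int(nid)] = int(np.searchsorted(thresholds, mean))
    return classes
```

A weakly stained nucleus (mean below the first threshold) gets class 0, a strongly stained one (mean above the last threshold) gets class 3; the projection image generator could then assign one color per class.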
  • When the analysis is completed, the microscope system 1 generates projection image data (Step S15).
  • The projection image generation unit 23 generates, for example, projection image data representing a projection image P5 in which the nuclei of the cells are color-coded according to the staining intensity, as shown in an image V6 in FIG. 9. More specifically, from the projection magnification and the size A of the imaging device 140 included in the microscope information, the relationship between each pixel of the digital image (the imaging device 140) and the corresponding position on the object plane OP1 is known.
  • the projection image generation unit 23 generates projection image data representing a projection image P5 including an image of a color corresponding to the size of the nucleus at the position of the nucleus of the cell included in the optical image O1 using these relationships.
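The pixel-to-position relationship used here can be sketched as a chain of scalings. This is an illustrative assumption rather than the patent's implementation: a camera pixel is referred back to the object plane via the camera-side magnification and pixel pitch, carried forward to the image plane via the first magnification, and converted into a projector pixel via the third magnification and the projector panel pitch (all names and parameter values are hypothetical):

```python
def camera_px_to_projector_px(cam_xy, cam_center, cam_pitch_um, mag_cam,
                              mag_image_plane, mag_proj, proj_pitch_um, proj_center):
    # camera pixel -> position on the object plane (micrometers)
    obj = [(c - c0) * cam_pitch_um / mag_cam for c, c0 in zip(cam_xy, cam_center)]
    # object plane -> image plane (micrometers), via the first magnification
    img = [o * mag_image_plane for o in obj]
    # image plane -> projector pixel, via the third magnification and panel pitch
    return tuple(i / (mag_proj * proj_pitch_um) + p0
                 for i, p0 in zip(img, proj_center))
```

With this mapping, a nucleus found at a camera pixel can be marked at the matching projector pixel so that the mark lands on the nucleus in the optical image.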
  • the microscope system 1 projects the projection image on the image plane IP1 (Step S16).
  • the projection control unit 24 controls the projection device 131 based on the projection image data generated in step S15, so that the projection device 131 projects the projection image on the image plane.
  • the projection image P5 obtained by color-coding the nuclei of the cells by the staining intensity is superimposed on the optical image O1.
  • the microscope system 1 can provide more useful information to the user. For this reason, the work load on the user can be further reduced.
  • FIG. 10 is still another example of an image viewed from the eyepiece 104 of the microscope system 1.
  • The projection device 131 may project a projection image P3 including a navigation display prompting the user to perform an operation. Then, when the user, following the navigation display, inputs the product number (the product identification information) by voice, the image analysis unit 22 may inspect the sample using the microscope information, the digital image data, and the product number input from the voice input device 41, and output the inspection result as an analysis result. Thereafter, the projection image generation unit 23 may generate projection image data representing a projection image P6 based on the analysis result and the microscope information.
  • The projection image P6 includes the inspection result, as shown in an image V7 of FIG. 10.
  • Since the inspection is performed by image analysis and the inspection result is displayed, the work load on the user can be further reduced.
  • The control device 10 may create a check sheet for the inspection items of the sample based on the analysis result and record the check sheet in, for example, the auxiliary storage device 10c. The check sheet is desirably recorded in association with an image of the sample.
  • Information on the inspection target of each product may be stored in the auxiliary storage device 10c in advance, or may be obtained from an external device via a network using the communication control device 10f.
  • The method of acquiring the product identification number, which is the product identification information, is not limited to voice input.
  • The image analysis unit 22 may specify the product identification information by analyzing the digital image data acquired by the imaging device 140. Then, the image analysis unit 22 may inspect the product based on the product identification information, the microscope information, and the digital image data, and output the inspection result to the projection image generation unit 23 as an analysis result.
  • When the image analysis unit 22 specifies the product identification information by analyzing the digital image data, the projection image generation unit 23 may generate projection image data representing a projection image P4 that includes an image related to the product identified by the product identification information, for example, as illustrated in the image V5 in FIG. 7. The image related to the product is desirably, for example, an image including the product code, the product number, or the inspection procedure of the product.
  • FIG. 11 shows still another example of an image viewed from the eyepiece 104 of the microscope system 1.
  • the projection device 131 may project a projection image P7 including a map image.
  • The map image is an image of the sample covering a region wider than the actual field of view corresponding to the optical image formed on the image plane.
  • the map image may be, for example, an image of a sample acquired using an objective lens having a lower magnification than the objective lens 102.
  • Alternatively, the map image may be generated by tiling a plurality of images, as in a whole slide image.
  • the image analysis unit 22 may specify a position corresponding to the optical image on the map image by comparing the digital image with a map image acquired in advance, and output the position to the projection image generation unit 23 as an analysis result.
  • the projection image generation unit 23 may generate a projection image P7 including the mark C at a position corresponding to the optical image of the map image by using the analysis result.
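Matching the current digital image against the pre-acquired map image could be done, for example, by normalized cross-correlation. The disclosure does not name an algorithm, so the following is a deliberately naive sketch (an exhaustive search; a real system would use FFT-based or pyramid matching, and all names here are hypothetical):

```python
import numpy as np

def locate_in_map(field, map_img):
    """Return ((x, y), score): top-left of the best match of `field` in `map_img`."""
    fh, fw = field.shape
    f = (field - field.mean()) / (field.std() + 1e-9)  # zero-mean, unit-variance
    best, best_xy = -2.0, (0, 0)
    for y in range(map_img.shape[0] - fh + 1):
        for x in range(map_img.shape[1] - fw + 1):
            w = map_img[y:y + fh, x:x + fw]
            wn = (w - w.mean()) / (w.std() + 1e-9)
            score = float((f * wn).mean())  # normalized cross-correlation
            if score > best:
                best, best_xy = score, (x, y)
    return best_xy, best
```

The returned position, converted to map coordinates, is where the mark C would be drawn on the map image in the projection image P7.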
  • the region R1 where the optical image O3 is projected may be made sufficiently smaller than the region R2.
  • the projection image P8 may be projected outside the region R1 by further changing the focal length of the projection lens 133.
  • the region R3 indicates a region on the image plane IP1 where the light beam from the projection lens 133 enters.
  • The region R4 indicates the region of the image plane IP1 onto which the projection device 131 is projected. That is, FIG. 11 shows a state where the image circle formed on the image plane IP1 by the light from the projection device 131 is larger than the image circle formed on the image plane IP1 by the light from the sample.
  • FIG. 12 is another example of an image viewed from the eyepiece 104 of the microscope system 1.
  • When an optical image O4 having a higher magnification than the optical image O3 is projected onto the region R1, as shown in an image V9 in FIG. 12, the projection image generation unit 23 may generate projection image data such that the size of the mark C on the image plane IP1 is maintained, and the projection device 131 may project the projection image P8.
  • the microscope system 1 may perform the image projection processing shown in FIG. 13 instead of the image projection processing shown in FIGS.
  • FIG. 13 is still another example of the flowchart of the image projection process performed by the microscope system 1.
  • the microscope system 1 first projects an optical image of the sample onto the image plane IP1 (Step S21), and further acquires microscope information (Step S22). Steps S21 and S22 are the same as steps S11 and S12 shown in FIG.
  • the microscope system 1 changes the image acquisition setting (step S23).
  • the control device 10 changes the image acquisition setting based on the microscope information acquired in step S22.
  • Thereafter, the microscope system 1 acquires digital image data (step S24), analyzes the digital image data (step S25), generates projection image data (step S26), and projects the projection image onto the image plane (step S27).
  • the microscope system 1 can further reduce the work load on the user.
  • a specific example will be described.
  • The control device 10 may determine the detection sensitivity of the imaging device 140, for example, by estimating the brightness of the digital image based on the projection magnification, and the imaging control unit 21 may change the detection-sensitivity setting of the imaging device 140 accordingly. Specifically, for example, the amplification factor may be changed, or a binning process may be performed in which a plurality of pixels PX1 are collectively treated as one pixel PX2, as shown in FIG. 14. As a result, the brightness required for image analysis is secured, and a highly accurate analysis result can be obtained.
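The binning process, in which a block of pixels is treated as one pixel, can be sketched in NumPy. This is illustrative only; on real hardware binning is performed on the sensor itself before readout:

```python
import numpy as np

def bin_pixels(img, k=2):
    """Sum k-by-k pixel blocks into single pixels (k x k binning)."""
    h, w = img.shape
    img = img[:h - h % k, :w - w % k]  # crop to multiples of k
    h2, w2 = img.shape
    # reshape into (rows, k, cols, k) blocks, then sum within each block
    return img.reshape(h2 // k, k, w2 // k, k).sum(axis=(1, 3))
```

Each output pixel carries the summed signal of k×k input pixels, trading resolution for brightness, which is the effect the control device exploits when the estimated image brightness is low.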
  • The control device 10 may change the reflectance of the light deflection element 132, for example, by estimating the brightness of the optical image based on the projection magnification. Specifically, when the brightness of the optical image is low, the transmittance of the light deflection element 132 may be increased (that is, its reflectance reduced). This suppresses the loss of light from the sample in the light deflection element 132 and secures the brightness of the optical image.
  • The control device 10 may determine, based on, for example, the projection magnification and the size A of the image sensor 141, the pixels from which signals are read out among the effective pixels of the image sensor 141, and the setting of the read range of the imaging device 140 may be changed accordingly. For example, as shown in FIG. 15, when the area A3 irradiated with the light beam from the sample is smaller than the area A1 composed of the effective pixels, signals may be read only from the pixels included in both areas (area A2). This makes it possible to acquire digital image data in a shorter time than when signals are read from all of the effective pixels.
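Restricting readout to the area actually illuminated by the sample can be sketched as computing a read window from the sensor geometry and the image-circle size. The helper below is hypothetical: it assumes a centered circular image of known diameter (which in practice would be derived from the projection magnification) and returns the bounding pixel rectangle:

```python
import math

def read_window(sensor_w, sensor_h, pitch_um, image_circle_diam_um):
    """Pixel rectangle (x, y, width, height) bounding the sample's image circle."""
    r = image_circle_diam_um / (2 * pitch_um)  # circle radius in pixels
    cx, cy = sensor_w / 2, sensor_h / 2        # circle assumed centered on sensor
    x0, y0 = max(0, math.floor(cx - r)), max(0, math.floor(cy - r))
    x1 = min(sensor_w, math.ceil(cx + r))
    y1 = min(sensor_h, math.ceil(cy + r))
    return x0, y0, x1 - x0, y1 - y0
```

Reading only this window instead of the full effective-pixel array shortens the frame time roughly in proportion to the reduced row count.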
  • the image acquisition setting may be changed based on the microscope information and the digital image data.
  • The brightness of the digital image may be detected based on the digital image data, and based on the result, the illumination intensity, the emission intensity of the projection device 131, the reflectance of the light deflection element 132, and the like may be adjusted so as to adjust the brightness of the optical image and the projection image.
  • the image acquisition setting may be changed based on the microscope information and the analysis result.
  • For example, when the analysis result includes a result of tracking a region of interest, the control device 10 may control the stage 101, which is a motorized stage, based on the tracking result so that the region of interest is located on the optical axis of the objective lens 102. That is, the image acquisition position may be changed based on the analysis result.
  • FIG. 16 is a diagram illustrating a configuration of the microscope system 2 according to the present embodiment.
  • the microscope system 2 is different from the microscope system 1 in that a microscope 200 is provided instead of the microscope 100.
  • the microscope 200 includes an intermediate lens barrel 150 instead of the intermediate lens barrel 130.
  • the intermediate lens barrel 150 is provided with an imaging device 140 and a light deflecting element 143 in addition to the projection device 131, the light deflecting element 132, and the projection lens 133.
  • the light deflector 143 deflects light from the sample toward the image sensor 141.
  • the light deflection element 143 is, for example, a beam splitter such as a half mirror.
  • the light deflection element 143 is arranged on an optical path between the light deflection element 132 and the objective lens 102. Accordingly, it is possible to prevent light from the projection device 131 from being incident on the image sensor 141.
  • According to the microscope system 2, the same effects as those of the microscope system 1 can be obtained. In addition, by housing the projection device 131 and the imaging device 140 in the intermediate lens barrel 150, the components for projecting a projection image onto the image plane can be configured as one unit, so an existing microscope system can be easily expanded.
  • FIG. 17 is a diagram illustrating a configuration of the microscope system 3 according to the present embodiment.
  • the microscope system 3 differs from the microscope system 1 in that a microscope 300 is provided instead of the microscope 100.
  • the microscope 300 includes an intermediate lens barrel 160 instead of the intermediate lens barrel 130.
  • The intermediate lens barrel 160 is provided with the imaging device 140 and the light deflection element 143 in addition to the projection device 131, the light deflection element 132, and the projection lens 133.
  • the light deflector 143 deflects light from the sample toward the image sensor 141.
  • the light deflection element 143 is, for example, a beam splitter such as a half mirror.
  • the light deflecting element 143 is arranged on an optical path between the imaging lens 103 and the light deflecting element 132.
  • FIG. 18 is a diagram illustrating a configuration of a microscope 400 according to the present embodiment.
  • the microscope system according to the present embodiment is the same as the microscope system 1 except that a microscope 400 is provided instead of the microscope 100.
  • the microscope 400 is different from the microscope 100 in that the microscope 400 includes an active type autofocus device 500. Other points are the same as those of the microscope 100.
  • the autofocus device 500 includes a laser 501, a collimating lens 502, a shielding plate 503, a polarizing beam splitter 504, a quarter-wave plate 505, a dichroic mirror 506, an imaging lens 507, and a two-segment detector 508.
  • the laser light emitted from the laser 501 is collimated by the collimating lens 502, and half of the light beam is blocked by the shielding plate 503.
  • the other half of the light beam is reflected by the polarizing beam splitter 504, enters the objective lens 102 via the quarter-wave plate 505 and the dichroic mirror 506, and is irradiated on the sample by the objective lens 102.
  • the laser beam reflected by the sample passes through the objective lens 102, the dichroic mirror 506, and the quarter-wave plate 505, and is incident on the polarization beam splitter 504 again.
  • The laser light entering the polarizing beam splitter 504 for the second time has, since first being reflected by the polarizing beam splitter 504, passed through the quarter-wave plate 505 twice, and therefore has a polarization direction orthogonal to that at its first incidence on the polarizing beam splitter 504. The laser light therefore passes through the polarizing beam splitter 504 and is then focused onto the two-segment detector 508 by the imaging lens 507.
  • the light amount distribution detected by the two-segment detector 508 changes according to the amount of deviation from the in-focus state. Therefore, by adjusting the distance between the stage 101 and the objective lens 102 according to the distribution of the amount of light detected by the two-segment detector 508, a focused state can be achieved.
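The focus-error principle of the two-segment detector can be sketched as a normalized difference signal driving a stage correction. The sign convention and the gain are setup-dependent assumptions, not values from the disclosure:

```python
def focus_error(a, b):
    """Normalized imbalance of the two detector halves.

    ~0 at focus; the sign indicates the direction of defocus
    (the actual sign convention depends on the optical setup).
    """
    return (a - b) / (a + b + 1e-12)  # epsilon avoids division by zero

def autofocus_step(z_um, a, b, gain_um=10.0):
    """One closed-loop step: move the stage against the error signal."""
    return z_um - gain_um * focus_error(a, b)
```

Iterating `autofocus_step` while re-reading the detector halves drives the stage-to-objective distance toward the balanced (in-focus) condition.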
  • In the microscope 400, when the stage 101 moves in a direction perpendicular to the optical axis of the objective lens 102, the autofocus device 500 performs autofocus processing. Thereby, the work load on the user can be further reduced as compared with the microscope system 1.
  • FIG. 19 is a diagram illustrating a configuration of a microscope 600 according to the present embodiment.
  • the microscope system according to the present embodiment is the same as the microscope system 1 except that a microscope 600 is provided instead of the microscope 100.
  • the microscope 600 is an inverted microscope.
  • the microscope 600 includes a light source 601 and a condenser lens 602 as a transmission illumination optical system.
  • the microscope 600 has an objective lens 603 at a position facing the condenser lens 602.
  • a beam splitter 604, a beam splitter 606, an imaging lens 609, a beam splitter 610, a relay lens 612, and an eyepiece 613 are arranged on the optical axis of the objective lens 603.
  • the microscope 600 further includes a light source 605.
  • the illumination light emitted from the light source 605 is deflected by the beam splitter 604 toward the sample.
  • the microscope 600 further includes a projection device 607 and a projection lens 608.
  • The light from the projection device 607 enters the beam splitter 606 via the projection lens 608 and is deflected toward the eyepiece 613.
  • a projection image is projected on the image plane between the imaging lens 609 and the relay lens 612 by the light from the projection device 607.
  • the microscope 600 further includes an image sensor 611.
  • the image sensor 611 detects light from the sample reflected by the beam splitter 610 and outputs digital image data.
  • FIG. 20 is a diagram illustrating a configuration of a microscope 700 according to the present embodiment.
  • the microscope system according to the present embodiment is the same as the microscope system 1 except that a microscope 700 is provided instead of the microscope 100.
  • the microscope 700 is a stereo microscope.
  • the microscope 700 includes a light source 712, a collector lens 711, and a condenser lens 710 as a transmission illumination optical system.
  • the microscope 700 includes a light source 707 and an objective lens 708 as an epi-illumination optical system.
  • the microscope 700 includes a light source 709 that is an external illumination light source.
  • the microscope 700 further includes two sets of imaging lenses (702a, 702b) for condensing light from the objective lens 708 to form an intermediate image and two sets of eyepieces (701a, 701b).
  • the eyepiece 701a and the imaging lens 702a are an optical system for the right eye
  • the eyepiece 701b and the imaging lens 702b are an optical system for the left eye.
  • The microscope 700 further includes a projection device 703a as a first projection device and a projection lens 704a on an optical path branched from the optical path for the right eye, and a projection device 703b as a second projection device and a projection lens 704b on an optical path branched from the optical path for the left eye.
  • The microscope 700 further includes, on an optical path branched from the optical path for the right eye, an imaging device 710a, which is a first imaging device that acquires first digital image data of the sample based on light from the sample, and, on an optical path branched from the optical path for the left eye, an imaging device 710b, which is a second imaging device that acquires second digital image data of the sample based on light from the sample.
  • the imaging device 710a includes an imaging lens 706a and an imaging device 705a
  • the imaging device 710b includes an imaging lens 706b and an imaging device 705b.
  • the image analysis unit 22 can perform stereo measurement based on the microscope information, the first digital image data, and the second digital image data, and can output height information of the sample as an analysis result.
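Converting a left/right disparity into a height value can be sketched by simple triangulation. The geometry below (a single convergence angle, uniform pixel pitch and magnification) is a hypothetical simplification of an actual stereo calibration, which the disclosure leaves unspecified:

```python
import math

def height_from_disparity(d_px, pixel_pitch_um, mag, convergence_deg):
    """Height (um) from the left/right disparity of a feature, in pixels.

    The disparity is referred to the object plane via the pixel pitch and
    magnification, then divided by the tangent of the convergence angle
    between the left- and right-eye optical paths.
    """
    shift_um = d_px * pixel_pitch_um / mag
    return shift_um / math.tan(math.radians(convergence_deg))
```

Applying this per matched feature between the first and second digital image data yields the height information output as the analysis result.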
  • the projection image generation unit 23 generates projection image data representing a projection image as a three-dimensional image based on the microscope information and the analysis result.
  • the projection device 703a and the projection device 703b can project a projection image, which is a three-dimensional image, onto an image plane.
  • The light source 709 may irradiate the sample with a phase pattern.
  • the image analysis unit 22 can analyze the first digital image data and the second digital image data and output the point cloud data as an analysis result.
  • the projection image generation unit 23 generates projection image data representing a projection image as a three-dimensional image based on the microscope information and the analysis result.
  • the projection device 703a and the projection device 703b can project a projection image, which is a three-dimensional image, onto an image plane.
  • In the embodiments described above, the microscope includes an imaging device; however, the above-described technology may also be applied to, for example, a scanning microscope. In that case, the microscope may include a photodetector such as a photomultiplier tube (PMT) instead of the imaging device.
  • The imaging lens 103 and the projection lens 133 may be lenses with variable focal lengths; that is, the microscope system may include a lens that changes at least one of the first projection magnification, the second projection magnification, and the third projection magnification.
  • the image analysis unit 22 may perform the analysis process using a predetermined algorithm, or may perform the analysis process using a trained neural network.
  • The parameters of the trained neural network may be generated by training the neural network in a device different from the microscope system, and the control device 10 may download the generated parameters and apply them to the image analysis unit 22.
  • the image analysis unit 22 may be provided outside the microscope system.
  • the communication control unit 25 may transmit digital image data to an external system, and the external system including the image analysis unit 22 may analyze the digital image data.
  • the communication control unit 25 may receive the analysis result of the digital image data from the external system, and the projection image generation unit 23 may generate the projection image data based on the received analysis result and the microscope information.
  • 1 Microscope system
1 Microscope system
10 Control device
10a Processor
10b Memory
10c Auxiliary storage device
10d Input/output interface
10e Medium drive device
10f Communication control device
10g Bus
10h Storage medium
21 Imaging control unit
22 Image analysis unit
23 Projection image generation unit
24 Projection control unit
25 Communication control unit
30 Information management unit
40 Input device
41 Audio input device
50 Audio output device
100, 200, 300, 400, 600, 700 Microscope
101 Stage
102, 102a, 603, 708 Objective lens
103, 507, 609, 702a, 702b, 706a, 706b Imaging lens
104, 613, 701a, 701b Eyepiece
110 Microscope main body
111 Turret
120 Lens barrel
130, 150, 160 Intermediate lens barrel
131, 607, 703a, 703b Projection device
132, 143 Light deflection element
133, 608, 704a, 704b Projection lens
140, 710a, 710b Imaging device
141, 611, 705a, 705b Image sensor
142 Adapter lens
500 Autofocus device

Abstract

This microscope system 1 is provided with an ocular lens 104, an objective lens 102 which guides light from a sample towards the ocular lens 104, an imaging lens 103 which is arranged on the optical path between the ocular lens 104 and the objective lens 102 and which forms an optical image of the sample, an imaging device 140 which acquires digital image data of the sample, a projection device 131 which projects a projection image onto an image plane, where the optical image is formed, between the imaging lens 103 and the ocular lens 104, and a control device 10. The control device 10 manages microscope information including at least a first magnification at which the sample is projected on the image plane, a second magnification at which the sample is projected on the imaging device 140, a third magnification at which the projection device 131 is projected on the image plane, the size of the imaging device 140, and the size of the projection device 131.

Description

Microscope system

The disclosure herein relates to a microscope system.
WSI (Whole Slide Imaging) technology has attracted attention as one of the technologies for reducing the burden on pathologists in pathological diagnosis. WSI technology creates a digital image of the entire specimen on a slide glass. WSI technology is described in Patent Document 1, for example.
Techniques such as WSI technology, in which a plurality of images are tiled to image a region wider than the field of view of the microscope at high resolution, are also used in industrial applications, for example inspecting and evaluating the microstructure of the material of an industrial part for quality control.
According to the techniques described above, it is possible to observe an arbitrary region of the object while viewing a high-resolution image displayed on a monitor. This reduces the burden on the operator in diagnosis, inspection, evaluation, and the like.
Patent Document 1: JP 2001-519944 A
On the other hand, there is a continuing need to view the optical image of a sample through the eyepiece. This is because digital images are generally inferior to optical images in color reproducibility and dynamic range. For example, in pathological diagnosis, color and shading information is extremely important, so there is a need to make diagnoses using optical images. Moreover, if digital images were required to have color reproducibility and dynamic range comparable to optical images, the microscope system would become very expensive, which would limit the users able to introduce such a system.
An object of one aspect of the present invention is to provide a new technology that reduces the burden on operators by assisting work such as diagnosis, inspection, and evaluation performed based on optical images obtained with an optical microscope.
A microscope system according to one aspect of the present invention includes: an eyepiece; an objective lens that guides light from a sample to the eyepiece; an imaging lens that is disposed on an optical path between the eyepiece and the objective lens and forms an optical image of the sample based on the light from the sample; an imaging device that acquires digital image data of the sample based on the light from the sample; a projection device that projects a projection image onto the image plane, between the imaging lens and the eyepiece, on which the optical image is formed; and a control device that manages microscope information including at least a first magnification at which the sample is projected onto the image plane, a second magnification at which the sample is projected onto the imaging device, a third magnification at which the projection device is projected onto the image plane, the size of the imaging device, and the size of the projection device.
According to the above aspect, the burden on the operator can be reduced by assisting work such as diagnosis, inspection, and evaluation performed based on an optical image obtained with an optical microscope.
FIG. 1 is a diagram showing the configuration of the microscope system 1.
FIG. 2 is a diagram for explaining the microscope information MI.
FIG. 3 is a diagram showing the configuration of the control device 10.
FIG. 4 is an example of a flowchart of the image projection process performed by the microscope system 1.
FIG. 5 is an example of an image viewed through the eyepiece 104 of the microscope system 1.
FIG. 6 is another example of an image viewed through the eyepiece 104 of the microscope system 1.
FIG. 7 is still another example of an image viewed through the eyepiece 104 of the microscope system 1.
FIG. 8 is another example of the flowchart of the image projection process performed by the microscope system 1.
FIGS. 9 to 12 are still other examples of images viewed through the eyepiece 104 of the microscope system 1.
FIG. 13 is still another example of the flowchart of the image projection process performed by the microscope system 1.
FIG. 14 is a diagram for explaining binning.
FIG. 15 is a diagram for explaining the capture range.
FIG. 16 is a diagram showing the configuration of the microscope system 2.
FIG. 17 is a diagram showing the configuration of the microscope system 3.
FIG. 18 is a diagram showing the configuration of the microscope 400.
FIG. 19 is a diagram showing the configuration of the microscope 600.
FIG. 20 is a diagram showing the configuration of the microscope 700.
[First Embodiment]
FIG. 1 is a diagram showing the configuration of a microscope system 1 according to the present embodiment. FIG. 2 is a diagram for explaining microscope information MI. FIG. 3 is a diagram showing the configuration of a control device 10. The microscope system 1 is a microscope system in which a sample is observed through an eyepiece 104, and includes at least an objective lens 102, an imaging lens 103, the eyepiece 104, an imaging device 140, a projection device 131, and the control device 10.
The microscope system 1 uses the projection device 131 to project a projection image onto the image plane on which an optical image of the sample is formed by the objective lens 102 and the imaging lens 103. This makes it possible to provide various information to a user of the microscope system 1 who observes the sample as an optical image through the eyepiece 104. The microscope system 1 can therefore assist the user in work performed while observing the sample as an optical image. Furthermore, in the microscope system 1, the control device 10 manages the microscope information MI. By using the microscope information MI managed by the control device 10, the microscope system 1 can appropriately control the projection of the projection image onto the image plane.
Hereinafter, a specific example of the configuration of the microscope system 1 will be described in detail with reference to FIGS. 1 to 3. As shown in FIG. 1, the microscope system 1 includes a microscope 100, the control device 10, an input device 40, and an audio output device 50. In addition to these, the microscope system 1 may also include a display device and the like.
The microscope 100 is, for example, an upright microscope, and includes a microscope main body 110, a lens barrel 120, an intermediate lens barrel 130, and the imaging device 140. Note that the microscope 100 may instead be an inverted microscope.
The microscope main body 110 includes a stage 101 on which the sample is placed, objective lenses (an objective lens 102 and an objective lens 102a) that guide light from the sample to the eyepiece 104, an epi-illumination optical system, and a transmitted-illumination optical system. The stage 101 may be a manual stage or a motorized stage. A plurality of objective lenses with different magnifications are desirably mounted on the revolver. Note that the microscope main body 110 need only include at least one of the epi-illumination optical system and the transmitted-illumination optical system.
The microscope main body 110 further includes a turret 111 for switching the microscopy method. For example, fluorescence cubes used in fluorescence observation and a half mirror used in bright-field observation are arranged on the turret 111. In addition, the microscope main body 110 may include optical elements used in a specific microscopy method that can be inserted into and removed from the optical path. Specifically, the microscope main body 110 may include, for example, a DIC prism, a polarizer, and an analyzer used in differential interference contrast observation.
The lens barrel 120 is a trinocular tube to which the eyepiece 104 and the imaging device 140 are attached. The imaging lens 103 is provided inside the lens barrel 120 and is arranged on the optical path between the objective lens 102 and the eyepiece 104.
The imaging lens 103 forms an optical image of the sample, based on light from the sample, on the image plane between the imaging lens 103 and the eyepiece 104. That is, the objective lens 102 and the imaging lens 103 project the object plane OP1 shown in FIG. 2 onto the image plane IP1. The imaging lens 103 also forms an optical image of the sample, based on light from the sample, on the image plane IP1a between the imaging lens 103 and an imaging element 141; that is, the objective lens 102 and the imaging lens 103 also project the object plane OP1 onto the image plane IP1a. The magnification at which the object plane OP1 is projected onto the image plane IP1 and the image plane IP1a is the projection magnification α, calculated as (focal length of the imaging lens 103) / (focal length of the objective lens 102).
The imaging lens 103 also forms, on these image planes (the image plane IP1 and the image plane IP1a), a projection image described later based on light from the projection device 131. That is, a projection lens 133 and the imaging lens 103 project the display surface OP2 shown in FIG. 2 onto the image planes IP1 and IP1a. Because the projection image is thereby superimposed on the optical image at the image plane, the user of the microscope system 1 can see, by looking through the eyepiece 104, a superimposed image in which the projection image overlies the optical image. The magnification at which the display surface OP2 is projected onto the image planes IP1 and IP1a is the projection magnification γ, calculated as (focal length of the imaging lens 103) / (focal length of the projection lens 133).
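The two magnification formulas above can be sketched in a few lines. This is an illustrative computation only; the focal-length values below are assumed examples, not values taken from this publication.

```python
# Illustrative sketch: computing the projection magnifications alpha and
# gamma from the focal lengths defined above. All values are assumed.

def projection_magnification(f_imaging_mm: float, f_front_mm: float) -> float:
    """Magnification = focal length of the imaging lens 103 divided by the
    focal length of the lens in front of it (objective lens 102 for alpha,
    projection lens 133 for gamma)."""
    return f_imaging_mm / f_front_mm

f_imaging = 180.0    # imaging lens 103 (assumed)
f_objective = 18.0   # objective lens 102 (assumed, e.g. a 10x objective)
f_projection = 90.0  # projection lens 133 (assumed)

alpha = projection_magnification(f_imaging, f_objective)   # sample -> IP1
gamma = projection_magnification(f_imaging, f_projection)  # projector -> IP1
print(alpha, gamma)  # 10.0 2.0
```

With these assumed focal lengths, a feature on the object plane OP1 appears 10 times larger at the image plane IP1, while the projector's display surface OP2 appears 2 times larger there.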
Note that the imaging lens 103 has a function of varying the focal length without changing the position of the image plane, a function of varying the position of the image plane without changing the focal length, or a function of varying the position of the image plane and the focal length independently of each other. Lenses that realize these functions include a lens that moves at least some of the lenses constituting the imaging lens 103 in the optical-axis direction. They also include an active lens that varies, for example under electrical control, at least one of the radius of curvature and the refractive index of at least some of the lenses of the optical system constituting the imaging lens 103. The active lens may be, for example, a liquid lens.
The intermediate lens barrel 130 is provided between the microscope main body 110 and the lens barrel 120, and includes the projection device 131, a light deflection element 132, and the projection lens 133.
The projection device 131 is a device that, in accordance with commands from the control device 10, projects the projection image onto the image plane on which the optical image is formed. The projection device 131 is, for example, a projector using a liquid crystal device, a projector using a digital mirror device, or a projector using LCOS. The size of the projection device 131 at the display surface OP2 is size B. Here, the display surface OP2 is the surface from which the projection device 131 emits light, and the size of the projection device 131 means the size of the region from which the projection device 131 emits light, specifically, for example, its diagonal length.
The light deflection element 132 deflects light emitted from the projection device 131 toward the image plane and guides it to the eyepiece 104. The light deflection element 132 is, for example, a beam splitter such as a half mirror or a dichroic mirror, and different types of beam splitters may be used depending on the microscopy method. A variable beam splitter with variable transmittance and reflectance may also be used as the light deflection element 132. The light deflection element 132 is arranged on the optical path between the objective lens 102 and the eyepiece 104, more specifically, between the objective lens 102 and the imaging lens 103.
The projection lens 133 is a lens that guides light from the projection device 131 to the imaging lens 103. Like the imaging lens 103, the projection lens 133 may be a lens with a function of varying at least one of the image-plane position and the focal length, for example, an active lens. Changing the focal length of the projection lens 133 makes it possible to adjust the size of the projection image independently of the size of the optical image.
The imaging device 140 is, for example, a digital camera, and includes the imaging element 141 and an adapter lens 142. The imaging device 140 acquires digital image data of the sample based on light from the sample.
The imaging element 141 is an example of a photodetector that detects light from the sample. The imaging element 141 is a two-dimensional image sensor, for example, a CCD image sensor or a CMOS image sensor, and detects light from the sample and converts it into an electric signal. The size of the imaging element 141 at the image plane IP2 is size A. Here, the image plane IP2 is the light-receiving surface of the imaging element 141, and the size of the imaging element 141 means the size of its effective pixel area, specifically, for example, its diagonal length.
While the projection device 131 is projecting the projection image onto the image plane, light from the projection device 131 also enters the imaging device 140. Consequently, the digital image represented by the digital image data acquired by the imaging device 140 may include the projection image in addition to the optical image of the sample. However, by adjusting the projection period of the projection device 131 and the exposure period of the imaging device 140, the imaging device 140 can acquire digital image data of the sample that does not include the projection image.
The adapter lens 142 projects the optical image formed on the image plane IP1a onto the imaging element 141; that is, it projects the image plane IP1a onto the image plane IP2. The magnification at which the object plane OP1 is projected onto the image plane IP2 is the projection magnification β.
The input device 40 outputs, to the control device 10, operation signals corresponding to the user's input operations. The input device 40 is, for example, a keyboard, but may also include a mouse, a joystick, a touch panel, and the like. The input device 40 further includes a voice input device 41, for example a microphone, that accepts voice input. The audio output device 50, for example a speaker, outputs sound in accordance with instructions from the control device 10.
The control device 10 controls the microscope system 1 as a whole and is connected to the microscope 100, the input device 40, and the audio output device 50. As components mainly related to the control of the projection device 131, the control device 10 includes, as shown in FIG. 1, an imaging control unit 21, an image analysis unit 22, a projection image generation unit 23, a projection control unit 24, and a communication control unit 25. The control device 10 further includes an information management unit 30.
By controlling the imaging device 140, the imaging control unit 21 acquires digital image data of the sample from the imaging device 140. The imaging control unit 21 may, for example, control the imaging device 140 so that the exposure period of the imaging device 140 and the projection period of the projection device 131 do not overlap. The digital image data acquired by the imaging control unit 21 is output to the image analysis unit 22, the projection image generation unit 23, and the communication control unit 25.
The image analysis unit 22 analyzes the digital image data acquired by the imaging control unit 21 and outputs the analysis result to the projection image generation unit 23. The content of the analysis processing performed by the image analysis unit 22 is not particularly limited. The analysis processing may be, for example, processing that counts the number of cells appearing in the digital image, or processing that graphs temporal changes in the cell count, cell density, and so on. It may also be processing that automatically detects a region of interest based on a luminance threshold, or processing that performs shape recognition, centroid calculation, and the like on structures appearing in the digital image.
The image analysis unit 22 may also, for example, classify one or more structures appearing in the digital image represented by the digital image data into one or more classes, and output an analysis result including information specifying the positions of the structures classified into at least one of those classes. More specifically, the image analysis unit 22 may classify cells appearing in the digital image according to staining intensity, and generate an analysis result including class information into which each cell is classified and position information specifying the outline of the cell or the outline of its nucleus. In that case, the structures classified into at least one class are desirably objects that serve as grounds for judgment in pathological diagnosis by a pathologist.
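The shape of the analysis result described above (a class plus position information per cell) can be sketched as follows. This is not the patent's algorithm; the intensity thresholds, class names, and cell records are all invented examples.

```python
# Hypothetical sketch of an analysis result: each detected cell is
# classified by staining intensity and reported together with position
# information (here, its outline points). Thresholds are assumed.

def classify_cells(cells, strong=180, weak=100):
    """cells: list of dicts with 'intensity' (0-255) and 'outline'
    (list of (x, y) points). Returns one class + outline per cell."""
    results = []
    for cell in cells:
        if cell["intensity"] >= strong:
            label = "strongly stained"
        elif cell["intensity"] >= weak:
            label = "weakly stained"
        else:
            label = "unstained"
        results.append({"class": label, "outline": cell["outline"]})
    return results

cells = [
    {"intensity": 210, "outline": [(10, 10), (14, 10), (12, 14)]},
    {"intensity": 120, "outline": [(30, 22), (34, 22), (32, 26)]},
    {"intensity": 40,  "outline": [(50, 40), (54, 40), (52, 44)]},
]
for r in classify_cells(cells):
    print(r["class"], r["outline"][0])
```

A result in this form gives the projection image generation unit 23 both what to draw (the class) and where to draw it (the outline coordinates).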
The image analysis unit 22 may also track a region of interest within the sample based on, for example, the digital image data. In this case, the analysis result output by the image analysis unit 22 includes position information of the region of interest. The region of interest to be tracked may be determined by analyzing the digital image data, or may be determined by the user specifying it with the input device 40.
The projection image generation unit 23 generates projection image data representing the projection image, and outputs the generated projection image data to the projection control unit 24 and the communication control unit 25. The projection image generation unit 23 generates the projection image data based on at least the microscope information MI managed by the information management unit 30. It may also generate the projection image data based on the microscope information MI together with the analysis result from the image analysis unit 22, with data that the communication control unit 25 has received from an external system, or with input information from the input device 40.
The projection control unit 24 controls the projection of the projection image onto the image plane by controlling the projection device 131. The projection control unit 24 may, for example, control the projection device 131 according to the settings of the microscope system 1. Specifically, the projection control unit 24 may decide, according to the settings of the microscope system 1, whether to project the projection image onto the image plane, and may control the projection device 131 so that it projects the projection image onto the image plane when the microscope system 1 has a predetermined setting. That is, whether the projection image is projected onto the image plane can be changed by the settings of the microscope system 1.
The projection control unit 24 may also control the projection device 131 so that, for example, the light-emission period of the projection device 131 and the exposure period of the imaging element 141 do not overlap. This prevents the projection image from appearing in the digital image.
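The non-overlap control described above can be illustrated with a minimal scheduling sketch: the projector emits only between camera exposures. The frame count and period lengths are invented; real hardware would use trigger signals rather than a software timeline.

```python
# Minimal sketch (assumed timings): alternate the exposure period of the
# imaging element 141 with the emission period of the projection device
# 131 so that the two periods never overlap.

def schedule(frames, exposure_ms, projection_ms):
    """Return (start, end, device) intervals for alternating exposure
    and projection periods; by construction they do not overlap."""
    timeline, t = [], 0.0
    for _ in range(frames):
        timeline.append((t, t + exposure_ms, "camera"))       # element 141 exposes
        t += exposure_ms
        timeline.append((t, t + projection_ms, "projector"))  # device 131 emits
        t += projection_ms
    return timeline

tl = schedule(frames=2, exposure_ms=20.0, projection_ms=10.0)
# consecutive intervals share at most an endpoint, i.e. no overlap
assert all(tl[i][1] <= tl[i + 1][0] for i in range(len(tl) - 1))
print(tl[0], tl[-1])
```

Because no exposure interval contains any projector emission, the acquired digital image data contains only the optical image of the sample.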
The communication control unit 25 exchanges data with systems outside the microscope system 1; the microscope system 1 is connected to external systems via a network such as the Internet. The communication control unit 25 may, for example, transmit image data to an external system and receive the analysis result of that image data. It may also receive, for example, operation information entered by a user of the external system.
The information management unit 30 manages the microscope information MI. As shown in FIG. 2, the microscope information MI includes at least the projection magnification α, the first magnification at which the sample is projected onto the image plane IP1; the projection magnification β, the second magnification at which the sample is projected onto the imaging device 140; the projection magnification γ, the third magnification at which the projection device 131 is projected onto the image plane IP1; the size A of the imaging device 140; and the size B of the projection device 131. The projection magnifications α, β, and γ, the size A of the imaging device 140, and the size B of the projection device 131 are information used to project the projection image at a desired position and with a desired size relative to the optical image, and are hereinafter referred to as basic information. In addition to the basic information, the microscope information MI may include other information, for example, information on the microscopy method used to form the optical image. The microscope information MI may also include a combination of coordinate information in the direction orthogonal to the optical axis of the objective lens 102 and coordinate information in the optical-axis direction, indicating the position of the stage 101 in the in-focus state. Furthermore, the microscope information MI may include the field number (FN) of the eyepiece 104, the field number (OFN) of the objective lens 102, the number of effective pixels of the imaging device 140, the number of pixels of the projection device 131, and the like.
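One way the basic information and optional extras of the microscope information MI might be held is a simple record type. This is only a sketch; the field names and example values are illustrative, not part of the publication.

```python
# Hypothetical container for the microscope information MI described
# above. Field names and all numeric values are assumed examples.
from dataclasses import dataclass, field

@dataclass
class MicroscopeInfo:
    alpha: float        # first magnification: sample -> image plane IP1
    beta: float         # second magnification: sample -> imaging device 140
    gamma: float        # third magnification: projection device 131 -> IP1
    size_a_mm: float    # size A: diagonal of the imaging element 141
    size_b_mm: float    # size B: diagonal of the projection device 131
    extras: dict = field(default_factory=dict)  # e.g. microscopy method,
                                                # stage coordinates, FN/OFN

mi = MicroscopeInfo(alpha=10.0, beta=5.0, gamma=2.0,
                    size_a_mm=11.0, size_b_mm=15.0,
                    extras={"method": "brightfield"})
print(mi.alpha / mi.gamma)  # 5.0
```

The ratio α/γ printed at the end is exactly the quantity the projection image generation unit 23 needs later when converting sample-plane lengths into display-surface lengths.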
Note that the control device 10 may be a general-purpose device or a dedicated device. Although the control device 10 is not limited to this configuration, it may have, for example, the physical configuration shown in FIG. 3. Specifically, the control device 10 may include a processor 10a, a memory 10b, an auxiliary storage device 10c, an input/output interface 10d, a medium drive device 10e, and a communication control device 10f, all connected to one another by a bus 10g.
The processor 10a is an arbitrary processing circuit including, for example, a CPU (Central Processing Unit). The processor 10a may realize the components related to the control of the projection device 131 described above (the imaging control unit 21, the image analysis unit 22, the projection image generation unit 23, and so on) by executing programs stored in the memory 10b, the auxiliary storage device 10c, or a storage medium 10h to perform programmed processing. The processor 10a may also be configured using a dedicated processor such as an ASIC or an FPGA.
The memory 10b is the working memory of the processor 10a, and is an arbitrary semiconductor memory such as a RAM (Random Access Memory). The auxiliary storage device 10c is a nonvolatile memory such as an EPROM (Erasable Programmable ROM) or a hard disk drive. The input/output interface 10d exchanges information with external devices (the microscope 100, the input device 40, and the audio output device 50).
The medium drive device 10e can output data stored in the memory 10b and the auxiliary storage device 10c to the storage medium 10h, and can read programs, data, and the like from the storage medium 10h. The storage medium 10h is an arbitrary portable storage medium, and includes, for example, an SD card, a USB (Universal Serial Bus) flash memory, a CD (Compact Disc), and a DVD (Digital Versatile Disc).
The communication control device 10f inputs and outputs information to and from a network. For example, a NIC (Network Interface Card), a wireless LAN (Local Area Network) card, or the like can be adopted as the communication control device 10f. The bus 10g connects the processor 10a, the memory 10b, the auxiliary storage device 10c, and the other components so that they can exchange data with one another.
The microscope system 1 configured as described above performs the image projection processing shown in FIG. 4. FIG. 4 is a flowchart of the image projection processing performed by the microscope system 1, and FIG. 5 is an example of the image seen through the eyepiece 104 of the microscope system 1. The image projection method of the microscope system 1 will be described below with reference to FIGS. 4 and 5.
First, the microscope system 1 projects the optical image of the sample onto the image plane IP1 (step S1). Here, the imaging lens 103 condenses the light from the sample captured by the objective lens 102 onto the image plane IP1 and forms the optical image of the sample. As a result, an optical image O1 is projected onto a region R1 on the image plane IP1, as shown, for example, in image V1 of FIG. 5. The region R1 is the region on the image plane IP1 on which the light beam from the objective lens 102 is incident, and a region R2 is the region on the image plane IP1 that can be seen by looking through the eyepiece 104.
Next, the microscope system 1 acquires the microscope information MI (step S2). Here, the projection image generation unit 23 acquires the microscope information MI from the information management unit 30.
Thereafter, the microscope system 1 generates the projection image data (step S3). Here, the projection image generation unit 23 generates the projection image data based on the microscope information MI acquired in step S2. For example, using the projection magnification α, the projection magnification γ, and the size B of the projection device 131 included in the microscope information MI, the projection image generation unit 23 generates projection image data representing a projection image P1 that includes a scale, as shown in image V2 of FIG. 5. More specifically, from the relationship between the projection magnifications α and γ, a unit length on the object plane OP1 corresponds to a length of α/γ on the display surface OP2. Furthermore, since the size per pixel of the projection device 131 is known from its size B, the number of pixels on the display surface OP2 corresponding to a unit length on the object plane OP1 is also known. Using these relationships, the projection image generation unit 23 generates projection image data for projecting the scale onto the optical image O1 at an accurate size.
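The scale computation in step S3 can be made concrete with a short worked example. The magnifications and the projector pixel pitch below are assumed values for illustration, not values from this publication.

```python
# Worked sketch of the step-S3 relationship: a length L on the object
# plane OP1 appears as alpha*L on the image plane IP1; to draw the same
# length from the display surface OP2 (magnified by gamma onto IP1), the
# drawn length must be (alpha/gamma)*L. All numbers are assumed.

def scale_bar_pixels(length_on_sample_um, alpha, gamma, pixel_pitch_um):
    """Number of projector pixels needed so a scale bar of
    length_on_sample_um appears at the correct size on IP1."""
    length_on_display_um = length_on_sample_um * alpha / gamma
    return length_on_display_um / pixel_pitch_um

alpha = 10.0       # sample -> IP1 (assumed)
gamma = 2.0        # projector -> IP1 (assumed)
pixel_pitch = 8.0  # projector pixel pitch in micrometers, derived in
                   # practice from size B and the projector pixel count

print(scale_bar_pixels(100.0, alpha, gamma, pixel_pitch))  # 62.5
```

Under these assumptions, a 100 µm scale bar on the sample must be drawn 500 µm long on the display surface OP2, i.e. about 62.5 projector pixels.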
Finally, the microscope system 1 projects the projection image onto the image plane IP1 (step S4). Here, the projection control unit 24 controls the projection device 131 based on the projection image data generated in step S3, so that the projection device 131 projects the projection image onto the image plane. As a result, the projection image P1 including the scale is superimposed on the optical image O1, as shown in image V2 of FIG. 5.
In the microscope system 1, the projection image is projected onto the image plane IP1 on which the optical image is formed, so the user can receive information useful for the work at hand without taking an eye away from the eyepiece 104. In addition, the microscope system 1 generates the projection image data using microscope information that includes the projection magnifications and the like, which makes it possible to use the projection image to display information at a desired position, and with a desired size, relative to the optical image. The microscope system 1 can therefore assist the user in work performed while observing the sample as an optical image, and can reduce the user's workload.
 さらに、顕微鏡システム1では、デジタル画像に基づいて病理診断を行うWSIシステムとは異なり高価な機器を必要としない。従って、顕微鏡システム1によれば、大幅な機器コストの上昇を回避しながら利用者の負担を軽減することができる。 Furthermore, the microscope system 1 does not require expensive equipment unlike a WSI system that performs pathological diagnosis based on digital images. Therefore, according to the microscope system 1, the burden on the user can be reduced while avoiding a significant increase in equipment cost.
　図6及び図7は、顕微鏡システム1の接眼レンズ104から見える画像の別の例である。図5では、像面IP1にスケールを含む投影画像P1を投影する例を示したが、像面IP1に投影する投影画像は、投影画像P1に限らない。 FIGS. 6 and 7 show other examples of images viewed through the eyepiece 104 of the microscope system 1. FIG. 5 shows an example in which the projection image P1 including the scale is projected onto the image plane IP1, but the projection image projected onto the image plane IP1 is not limited to the projection image P1.
 例えば、図6の画像V3に示すように、投影装置131は、顕微鏡システム1の設定情報を含む投影画像P2を投影しても良い。投影画像P2を表す投影画像データについても、投影画像生成部23が顕微鏡情報に基づいて生成することができる。 For example, as shown in an image V3 of FIG. 6, the projection device 131 may project a projection image P2 including setting information of the microscope system 1. The projection image generation unit 23 can also generate the projection image data representing the projection image P2 based on the microscope information.
 なお、投影画像生成部23は、顕微鏡情報に含まれる顕鏡法の情報に基づいて、投影画像P2の背景色を決定してもよい。例えば、明視野観察中であれば、光学画像の背景色は白色であるので、投影画像P2も同様に背景色を白色にしてもよい。また、投影画像P2の背景色を光学画像の背景色とは意図的に異なる色にしてもよい。また、投影画像P2は、利用者から指示された期間だけ投影されても良い。例えば、利用者が音声で表示を指示すると、投影画像生成部23が投影画像データを生成して、投影装置131が所定期間(例えば、10秒間)だけ投影画像P2を投影してもよい。また、利用者が音声で表示を指示すると、顕微鏡情報が音声として音声出力装置50から出力されても良い。 The projection image generation unit 23 may determine the background color of the projection image P2 based on the information on the microscope method included in the microscope information. For example, during bright-field observation, the background color of the optical image is white, and thus the background color of the projection image P2 may be similarly white. Further, the background color of the projection image P2 may be intentionally different from the background color of the optical image. Further, the projection image P2 may be projected only for a period instructed by the user. For example, when the user instructs display by voice, the projection image generation unit 23 may generate projection image data, and the projection device 131 may project the projection image P2 for a predetermined period (for example, 10 seconds). When the user instructs the display by voice, the microscope information may be output from the voice output device 50 as voice.
　また、図7の画像V4に示すように、投影装置131は、利用者に操作を促すナビゲーション表示を含む投影画像P3を投影しても良い。投影画像P3を表す投影画像データについても、投影画像生成部23が顕微鏡情報に基づいて生成する。このため、投影装置131は、例えば、光学画像O2に含まれる製品と投影画像P3に含まれるナビゲーション表示の両方を同時に確認できる位置に投影画像P3を投影することができる。また、ナビゲーション表示の内容は、音声出力装置50から音声として出力されても良い。さらに、利用者がナビゲーション表示に従って製品識別情報である製品No.を音声入力すると、投影画像生成部23が、顕微鏡情報と音声入力装置41から入力された製品No.とに基づいて投影画像P4を表す投影画像データを生成する。そして、投影装置131が、図7の画像V5に示すように、光学画像O2上に投影画像P4を投影し、さらに、撮像装置140が、試料からの光と投影装置131からの光に基づいて、光学画像O2に投影画像P4が重畳された重畳画像を表す重畳画像データを取得しても良い。なお、投影画像P4は、光学画像O2が示す製品を特定するバーコードである。このように、製品No.で識別された製品に関連する画像を投影画像に含めることで、製品と製品の情報を一度に記録することができる。 Further, as shown in image V4 of FIG. 7, the projection device 131 may project a projection image P3 that includes a navigation display prompting the user to perform an operation. The projection image data representing the projection image P3 is also generated by the projection image generation unit 23 based on the microscope information. The projection device 131 can therefore project the projection image P3 at a position where the product included in the optical image O2 and the navigation display included in the projection image P3 can both be confirmed at the same time. The contents of the navigation display may also be output as audio from the audio output device 50. Furthermore, when the user voice-inputs the product No., which is product identification information, in accordance with the navigation display, the projection image generation unit 23 generates projection image data representing a projection image P4 based on the microscope information and the product No. input from the voice input device 41. Then, as shown in image V5 of FIG. 7, the projection device 131 projects the projection image P4 onto the optical image O2, and the imaging device 140 may further acquire, based on the light from the sample and the light from the projection device 131, superimposed image data representing a superimposed image in which the projection image P4 is superimposed on the optical image O2. Note that the projection image P4 is a barcode that identifies the product shown in the optical image O2. By including an image related to the product identified by the product No. in the projection image in this way, the product and the product information can be recorded at the same time.
　顕微鏡システム1は、図4に示す画像投影処理の代わりに図8に示す画像投影処理を行ってもよい。図8は、顕微鏡システム1が行う画像投影処理のフローチャートの別の例である。図9は、顕微鏡システム1の接眼レンズ104から見える画像の更に別の例である。 The microscope system 1 may perform the image projection processing shown in FIG. 8 instead of the image projection processing shown in FIG. 4. FIG. 8 is another example of a flowchart of the image projection processing performed by the microscope system 1. FIG. 9 is yet another example of an image viewed through the eyepiece 104 of the microscope system 1.
　図8に示す画像投影処理では、顕微鏡システム1は、まず、試料の光学画像を像面IP1に投影し(ステップS11)、さらに、顕微鏡情報を取得する(ステップS12)。ステップS11とステップS12の処理は、図4に示すステップS1とステップS2と同様である。 In the image projection processing shown in FIG. 8, the microscope system 1 first projects an optical image of the sample onto the image plane IP1 (step S11) and further acquires the microscope information (step S12). Steps S11 and S12 are the same as steps S1 and S2 shown in FIG. 4.
 次に、顕微鏡システム1は、試料のデジタル画像データを取得する(ステップS13)。ここでは、撮像装置140が、試料からの光に基づいて試料を撮像することでデジタル画像データを取得する。 Next, the microscope system 1 acquires digital image data of the sample (step S13). Here, the imaging device 140 acquires digital image data by imaging the sample based on light from the sample.
 その後、顕微鏡システム1は、デジタル画像データを解析する(ステップS14)。ここでは、画像解析部22が、デジタル画像データを解析して、例えば、病理診断を補助する情報を生成する。具体的には、解析により細胞の核を特定し、その染色強度に応じてクラス分けする。 Thereafter, the microscope system 1 analyzes the digital image data (Step S14). Here, the image analysis unit 22 analyzes the digital image data and generates, for example, information that assists in pathological diagnosis. Specifically, the nuclei of the cells are specified by analysis, and classified according to the staining intensity.
　解析が終了すると、顕微鏡システム1は、投影画像データを生成する(ステップS15)。ここでは、投影画像生成部23が、顕微鏡情報と画像解析部22から出力された解析結果とに基づいて、例えば、図9の画像V6に示すような、細胞の核を染色強度により色分けする投影画像P5を表す投影画像データを生成する。より詳細には、顕微鏡情報に含まれる投影倍率β及び撮像装置140のサイズAから、デジタル画像（撮像装置140）の各画素と物体面OP1の位置の関係は既知である。さらに、投影倍率α、投影倍率γ及び投影装置131のサイズBから、投影装置131の各画素と物体面OP1の位置の関係も既知である。投影画像生成部23は、これらの関係を用いて、光学画像O1に含まれる細胞の核の位置に核の大きさにあわせた色の画像を含む投影画像P5を表す投影画像データを生成する。 When the analysis is completed, the microscope system 1 generates projection image data (step S15). Here, based on the microscope information and the analysis result output from the image analysis unit 22, the projection image generation unit 23 generates projection image data representing, for example, a projection image P5 in which cell nuclei are color-coded by staining intensity, as shown in image V6 of FIG. 9. More specifically, the relationship between each pixel of the digital image (imaging device 140) and a position on the object plane OP1 is known from the projection magnification β and the size A of the imaging device 140 included in the microscope information. Furthermore, the relationship between each pixel of the projection device 131 and a position on the object plane OP1 is also known from the projection magnification α, the projection magnification γ, and the size B of the projection device 131. Using these relationships, the projection image generation unit 23 generates projection image data representing the projection image P5, which contains, at the position of each cell nucleus included in the optical image O1, a colored image matched to the size of that nucleus.
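The two pixel-to-object-plane relationships described here imply a direct camera-pixel-to-projector-pixel mapping through the object plane OP1, which is how an analysis result found in the digital image can be drawn at the matching place in the projection image. The one-dimensional sketch below uses hypothetical function names and pixel pitches; they are assumptions for illustration only.

```python
def camera_to_object(px_cam, cam_pitch_mm, beta):
    """Object-plane coordinate (mm) seen by camera pixel `px_cam`.
    beta: magnification from the object plane OP1 to the image sensor."""
    return px_cam * cam_pitch_mm / beta

def object_to_projector(x_obj_mm, proj_pitch_mm, alpha, gamma):
    """Projector pixel whose projection lands at object-plane `x_obj_mm`.
    alpha: OP1 -> IP1 magnification; gamma: projector plane -> IP1."""
    return x_obj_mm * (alpha / gamma) / proj_pitch_mm

def camera_to_projector(px_cam, cam_pitch_mm, beta, proj_pitch_mm, alpha, gamma):
    """Compose the two known relationships via the object plane OP1."""
    x_obj = camera_to_object(px_cam, cam_pitch_mm, beta)
    return object_to_projector(x_obj, proj_pitch_mm, alpha, gamma)
```

With assumed pitches of 6.5 µm (camera) and 8 µm (projector), β = 10, α = 10, and γ = 0.5, camera pixel 100 maps to projector pixel 162.5, so a nucleus detected there would be overdrawn at that projector position.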
 最後に、顕微鏡システム1は、投影画像を像面IP1に投影する(ステップS16)。ここでは、投影制御部24がステップS15で生成された投影画像データに基づいて投影装置131を制御することで、投影装置131が投影画像を像面に投影する。これにより、図9の画像V6に示すように、光学画像O1上に、細胞の核を染色強度により色分けした投影画像P5が重畳される。 Finally, the microscope system 1 projects the projection image on the image plane IP1 (Step S16). Here, the projection control unit 24 controls the projection device 131 based on the projection image data generated in step S15, so that the projection device 131 projects the projection image on the image plane. Thereby, as shown in the image V6 of FIG. 9, the projection image P5 obtained by color-coding the nuclei of the cells by the staining intensity is superimposed on the optical image O1.
 このように、顕微鏡情報に加えて解析結果を用いて投影画像データを生成することで、顕微鏡システム1は、更に有益な情報を利用者に提供することができる。このため、利用者の作業負担をさらに軽減することができる。 As described above, by generating the projection image data using the analysis result in addition to the microscope information, the microscope system 1 can provide more useful information to the user. For this reason, the work load on the user can be further reduced.
　図10は、顕微鏡システム1の接眼レンズ104から見える画像の更に別の例である。図10のV4に示すように、投影装置131は、利用者に操作を促すナビゲーション表示を含む投影画像P3を投影しても良い。そして、利用者がナビゲーション表示に従って製品識別情報である製品No.を音声入力すると、画像解析部22は、顕微鏡情報と、デジタル画像データと、音声入力装置41から入力された製品No.と、に基づいて試料を検査し、その検査結果を解析結果として出力してもよい。その後、投影画像生成部23は、解析結果と顕微鏡情報とに基づいて投影画像P6を表す投影画像データを生成してもよい。投影画像P6は、図10の画像V7に示すように、試料の検査の合否を表す画像を含んでいる。このように、画像解析により検査を行い、検査結果を表示することで、利用者の作業負担をさらに軽減することができる。特に、検査に顕微鏡情報を用いることで検査対象物の長さなどを精度良く測定することができる。このため、高い精度で検査を行うことができる。さらに、制御装置10は、解析結果に基づいて試料の検査項目についてのチェックシートを作成し、例えば、補助記憶装置10cに記録してもよい。なお、チェックシートは、試料の画像と関連付けて記録することが望ましい。また、各製品の検査対象の情報（例えば、製品の各部のサイズ、形状など）は、予め補助記憶装置10cに記憶されていてもよく、また、通信制御装置10fを用いてネットワークを経由して外部の装置から取得してもよい。 FIG. 10 is yet another example of an image viewed through the eyepiece 104 of the microscope system 1. As shown by V4 in FIG. 10, the projection device 131 may project a projection image P3 including a navigation display prompting the user to perform an operation. Then, when the user voice-inputs the product No., which is product identification information, in accordance with the navigation display, the image analysis unit 22 may inspect the sample based on the microscope information, the digital image data, and the product No. input from the voice input device 41, and output the inspection result as an analysis result. Thereafter, the projection image generation unit 23 may generate projection image data representing a projection image P6 based on the analysis result and the microscope information. As shown in image V7 of FIG. 10, the projection image P6 includes an image indicating whether the sample passed or failed the inspection. Performing the inspection by image analysis and displaying the inspection result in this way further reduces the user's workload. In particular, using the microscope information for the inspection makes it possible to measure the length and the like of the inspection target with high precision, so the inspection can be performed with high accuracy. Furthermore, the control device 10 may create a check sheet for the inspection items of the sample based on the analysis result and record it, for example, in the auxiliary storage device 10c. The check sheet is desirably recorded in association with an image of the sample. In addition, information on the inspection target of each product (for example, the size and shape of each part of the product) may be stored in the auxiliary storage device 10c in advance, or may be acquired from an external device via a network using the communication control device 10f.
　図10では、音声入力によって製品識別情報である製品No.を取得する例を示したが、製品識別情報の取得方法は、音声入力に限らない。例えば、試料に製品識別情報が付されている場合には、画像解析部22は、撮像装置140で取得したデジタル画像データを解析することで製品識別情報を特定してもよい。そして、画像解析部22は、製品識別情報と顕微鏡情報とデジタル画像データとに基づいて製品を検査し、検査結果を解析結果として投影画像生成部23へ出力しても良い。 FIG. 10 shows an example in which the product No., which is product identification information, is acquired by voice input, but the method of acquiring the product identification information is not limited to voice input. For example, when product identification information is attached to the sample, the image analysis unit 22 may identify the product identification information by analyzing the digital image data acquired by the imaging device 140. The image analysis unit 22 may then inspect the product based on the product identification information, the microscope information, and the digital image data, and output the inspection result to the projection image generation unit 23 as an analysis result.
　また、画像解析部22がデジタル画像データを解析することで製品識別情報を特定した場合、投影画像生成部23は、例えば、図7の画像V5に示すように、製品識別情報で識別される製品に関連する画像を含む投影画像P4を表す投影画像データを生成してもよい。なお、製品に関連する画像は、例えば、製品の製品コード、製品の製品番号、又は、製品の検査手順を含む画像であることが望ましい。 When the image analysis unit 22 identifies the product identification information by analyzing the digital image data, the projection image generation unit 23 may generate projection image data representing a projection image P4 that includes an image related to the product identified by the product identification information, for example, as shown in image V5 of FIG. 7. The image related to the product is desirably an image including, for example, the product code of the product, the product number of the product, or the inspection procedure for the product.
 図11は、顕微鏡システム1の接眼レンズ104から見える画像の更に別の例である。図11のV8に示すように、投影装置131は、マップ画像を含む投影画像P7を投影しても良い。マップ画像とは、試料の画像であって、像面に形成されている光学画像に対応する実視野よりも広い領域を写した画像のことである。このため、マップ画像は、例えば、対物レンズ102よりも低倍の対物レンズを用いて取得した試料の画像であってもよい。また、Whole Slide Imageのような複数の画像をタイリングすることで生成された画像であってもよい。画像解析部22は、デジタル画像と予め取得したマップ画像とを比較することで、マップ画像上における光学画像に対応する位置を特定し、解析結果として投影画像生成部23へ出力してもよい。投影画像生成部23は、解析結果を用いることで、マップ画像の光学画像に対応する位置にマークCを含む投影画像P7を生成しても良い。 FIG. 11 shows still another example of an image viewed from the eyepiece 104 of the microscope system 1. As indicated by V8 in FIG. 11, the projection device 131 may project a projection image P7 including a map image. The map image is an image of the sample, which is an image of a region wider than the actual field of view corresponding to the optical image formed on the image plane. For this reason, the map image may be, for example, an image of a sample acquired using an objective lens having a lower magnification than the objective lens 102. Further, the image may be an image generated by tiling a plurality of images such as Whole Slide Image. The image analysis unit 22 may specify a position corresponding to the optical image on the map image by comparing the digital image with a map image acquired in advance, and output the position to the projection image generation unit 23 as an analysis result. The projection image generation unit 23 may generate a projection image P7 including the mark C at a position corresponding to the optical image of the map image by using the analysis result.
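The text does not specify how the digital image is compared with the pre-acquired map image; one minimal way it could be implemented is an exhaustive template search. The pure-Python sketch below, with a hypothetical function name and a tiny grayscale example, finds the position in the map with the lowest sum of squared differences (SSD) and is illustrative only.

```python
def locate_in_map(map_img, field_img):
    """Top-left (row, col) in `map_img` where `field_img` fits best (min SSD).

    Both arguments are 2-D lists of grayscale values; `map_img` plays the role
    of the wide map image and `field_img` the live digital image of the
    current field of view.
    """
    mh, mw = len(map_img), len(map_img[0])
    fh, fw = len(field_img), len(field_img[0])
    best_ssd, best_pos = None, (0, 0)
    for r in range(mh - fh + 1):          # slide the field over every offset
        for c in range(mw - fw + 1):
            ssd = sum((map_img[r + i][c + j] - field_img[i][j]) ** 2
                      for i in range(fh) for j in range(fw))
            if best_ssd is None or ssd < best_ssd:
                best_ssd, best_pos = ssd, (r, c)
    return best_pos
```

A real system would likely use normalized correlation or a coarse-to-fine pyramid to tolerate brightness and magnification differences; the mark C would then be drawn in the projection image at the returned position.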
　また、結像レンズ103の焦点距離を変更することで、図11に示すように、光学画像O3が投影される領域R1を領域R2に対して十分に小さくしてもよい。その上で、さらに、投影レンズ133の焦点距離を変更することで、投影画像P8を領域R1の外側に投影しても良い。なお、領域R3は、投影レンズ133からの光束が入射する像面IP1上の領域を示している。領域R4は、投影装置131が像面IP1に投影される領域を示している。つまり、図11には、投影装置131からの光によって像面IP1に形成されるイメージサークルが試料からの光によって像面IP1に形成されるイメージサークルよりも大きい様子が示されている。 Alternatively, by changing the focal length of the imaging lens 103, the region R1 onto which the optical image O3 is projected may be made sufficiently smaller than the region R2, as shown in FIG. 11. In addition, by further changing the focal length of the projection lens 133, the projection image P8 may be projected outside the region R1. The region R3 indicates the region on the image plane IP1 on which the light beam from the projection lens 133 is incident. The region R4 indicates the region onto which the projection device 131 is projected on the image plane IP1. That is, FIG. 11 shows a state in which the image circle formed on the image plane IP1 by the light from the projection device 131 is larger than the image circle formed on the image plane IP1 by the light from the sample.
 図12は、顕微鏡システム1の接眼レンズ104から見える画像の更に別の例である。図11に示す状態から対物レンズをより高い倍率を有する対物レンズに切り換えると、図12の画像V9に示すように、領域R1には光学画像O3よりも高い倍率の光学画像O4が投影される。このとき、投影画像生成部23は、マークCの像面IP1における大きさが維持されるように投影画像データを生成し、投影装置131が投影画像P8を投影してもよい。 FIG. 12 is another example of an image viewed from the eyepiece 104 of the microscope system 1. When the objective lens is switched from the state shown in FIG. 11 to an objective lens having a higher magnification, an optical image O4 having a higher magnification than the optical image O3 is projected on the region R1, as shown in an image V9 in FIG. At this time, the projection image generation unit 23 may generate projection image data such that the size of the mark C on the image plane IP1 is maintained, and the projection device 131 may project the projection image P8.
　顕微鏡システム1は、図4及び図8に示す画像投影処理の代わりに図13に示す画像投影処理を行ってもよい。図13は、顕微鏡システム1が行う画像投影処理のフローチャートの更に別の例である。 The microscope system 1 may perform the image projection processing shown in FIG. 13 instead of the image projection processing shown in FIGS. 4 and 8. FIG. 13 is yet another example of a flowchart of the image projection processing performed by the microscope system 1.
　図13に示す画像投影処理では、顕微鏡システム1は、まず、試料の光学画像を像面IP1に投影し(ステップS21)、さらに、顕微鏡情報を取得する(ステップS22)。ステップS21とステップS22の処理は、図8に示すステップS11とステップS12と同様である。 In the image projection processing shown in FIG. 13, the microscope system 1 first projects an optical image of the sample onto the image plane IP1 (step S21) and further acquires the microscope information (step S22). Steps S21 and S22 are the same as steps S11 and S12 shown in FIG. 8.
 次に、顕微鏡システム1は、画像取得設定を変更する(ステップS23)。ここでは、制御装置10は、ステップS22で取得した顕微鏡情報に基づいて、画像取得設定を変更する。 Next, the microscope system 1 changes the image acquisition setting (step S23). Here, the control device 10 changes the image acquisition setting based on the microscope information acquired in step S22.
　その後、顕微鏡システム1は、デジタル画像データを取得し(ステップS24)、デジタル画像データを解析する(ステップS25)。さらに、顕微鏡システム1は、投影画像データを生成し(ステップS26)、投影画像を像面に投影する(ステップS27)。ステップS24からステップS27の処理は、図8に示すステップS13からステップS16と同様である。 Thereafter, the microscope system 1 acquires digital image data (step S24) and analyzes the digital image data (step S25). Further, the microscope system 1 generates projection image data (step S26) and projects the projection image onto the image plane (step S27). Steps S24 to S27 are the same as steps S13 to S16 shown in FIG. 8.
　このように、顕微鏡情報に基づいて画像取得設定を変更することで、顕微鏡システム1は、利用者の作業負担をさらに軽減することができる。以下、具体例について説明する。 By changing the image acquisition settings based on the microscope information in this way, the microscope system 1 can further reduce the user's workload. Specific examples are described below.
　制御装置10は、例えば、投影倍率βに基づいてデジタル画像の明るさを推定することで、撮像装置140の検出感度を決定してもよく、撮像制御部21が撮像装置140の検出感度の設定を変更してもよい。具体的には、例えば、増幅率を変更してもよく、図14に示すように複数の画素PX1をひとまとめにして1つの画素PX2として扱うビニング処理を行ってもよい。これにより、画像解析に必要とされる明るさが確保されるため、精度の高い解析結果を得ることができる。 For example, the control device 10 may determine the detection sensitivity of the imaging device 140 by estimating the brightness of the digital image based on the projection magnification β, and the imaging control unit 21 may change the detection-sensitivity setting of the imaging device 140. Specifically, for example, the amplification factor may be changed, or a binning process may be performed in which a plurality of pixels PX1 are treated collectively as one pixel PX2, as shown in FIG. 14. This secures the brightness required for image analysis, so a highly accurate analysis result can be obtained.
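The binning of FIG. 14, in which several pixels PX1 are read out as one pixel PX2, amounts to summing n × n blocks of the image: the signal per output pixel grows at the cost of resolution. A minimal sketch, with a hypothetical function name:

```python
def bin_pixels(img, n):
    """Sum each n x n block of `img` (a 2-D list) into one output pixel.

    Image height and width must be multiples of n; each output value is the
    total signal of the n*n input pixels it covers, so the binned image is
    brighter per pixel than the original.
    """
    h, w = len(img), len(img[0])
    assert h % n == 0 and w % n == 0, "image must tile evenly into n x n blocks"
    return [[sum(img[r + i][c + j] for i in range(n) for j in range(n))
             for c in range(0, w, n)]
            for r in range(0, h, n)]
```

For instance, 2 × 2 binning of a uniform image quadruples the per-pixel signal while halving the resolution in each direction.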
 また、制御装置10は、例えば、投影倍率αに基づいて光学画像の明るさを推定することで、光偏向素子132の反射率を変更してもよい。具体的には、例えば、光学画像の明るさが低い場合には、光偏向素子132の透過率を高めて反射率を低下させてもよい。これにより、光偏向素子132での試料からの光の損失を抑えて光学画像の明るさを確保することが可能となる。 The control device 10 may change the reflectance of the light deflecting element 132 by estimating the brightness of the optical image based on the projection magnification α, for example. Specifically, for example, when the brightness of the optical image is low, the transmittance of the light deflection element 132 may be increased to reduce the reflectance. This makes it possible to suppress the loss of light from the sample in the light deflection element 132 and to secure the brightness of the optical image.
　また、制御装置10は、例えば、投影倍率βと撮像素子141のサイズAに基づいて、撮像素子141が有する有効画素のうち信号を読み出す画素を決定してもよく、撮像制御部21が撮像装置140の読み出し範囲の設定を変更してもよい。例えば、図15に示すように、有効画素からなる領域A1に対して、試料からの光束が照射される領域A3が小さい場合には、例えば、領域A2に含まれる画素から信号を読み出しても良い。これにより、有効画素全体から信号を読み出す場合に比べてデジタル画像データを短時間で取得することができる。 The control device 10 may also determine, for example based on the projection magnification β and the size A of the imaging element 141, which of the effective pixels of the imaging element 141 to read signals from, and the imaging control unit 21 may change the read-out range setting of the imaging device 140. For example, as shown in FIG. 15, when the region A3 irradiated with the light beam from the sample is smaller than the region A1 consisting of the effective pixels, signals may be read only from the pixels included in the region A2. This allows the digital image data to be acquired in a shorter time than when signals are read from all the effective pixels.
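Choosing the read-out range as described, so that only the pixels actually covered by the light from the sample are read, can be sketched as below. The function name, the field diameter, and the pixel counts are illustrative assumptions, not values from the patent.

```python
def readout_window(num_pixels, pixel_pitch_mm, beta, field_diameter_mm):
    """Centered half-open pixel range [start, stop) covering the illuminated area.

    The sample field of diameter `field_diameter_mm` covers
    field_diameter_mm * beta on the sensor; reading only those pixels
    shortens acquisition compared with reading all effective pixels.
    """
    illuminated_mm = field_diameter_mm * beta
    illuminated_px = min(num_pixels, int(round(illuminated_mm / pixel_pitch_mm)))
    start = (num_pixels - illuminated_px) // 2  # center the window on the sensor
    return start, start + illuminated_px
```

With a hypothetical 2048-pixel row at 6.5 µm pitch, β = 10, and a 0.5 mm field, only about 769 of the 2048 pixels need to be read.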
　なお、以上では、顕微鏡情報に基づいて画像取得設定を変更する例を示したが、顕微鏡情報とデジタル画像データに基づいて画像取得設定を変更してもよい。例えば、デジタル画像データに基づいてデジタル画像の明るさを検出し、その結果を踏まえて、照明強度、投影装置131の発光強度、光偏向素子132の反射率、などを調整して、光学画像と投影画像の明るさを調整しても良い。また、顕微鏡情報と解析結果に基づいて画像取得設定を変更してもよい。例えば、注目領域を追跡する解析を行う場合であれば、制御装置10は、追跡結果に基づいて、注目領域が対物レンズ102の光軸上に位置するように、電動ステージであるステージ101を制御してもよい。つまり、解析結果に基づいて画像取得位置を変更してもよい。 The above examples change the image acquisition settings based on the microscope information, but the image acquisition settings may also be changed based on the microscope information and the digital image data. For example, the brightness of the digital image may be detected from the digital image data, and the illumination intensity, the emission intensity of the projection device 131, the reflectance of the light deflection element 132, and the like may be adjusted accordingly to adjust the brightness of the optical image and the projection image. The image acquisition settings may also be changed based on the microscope information and the analysis result. For example, when performing an analysis that tracks a region of interest, the control device 10 may control the stage 101, which is a motorized stage, based on the tracking result so that the region of interest is positioned on the optical axis of the objective lens 102. That is, the image acquisition position may be changed based on the analysis result.
[第2実施形態]
 図16は、本実施形態に係る顕微鏡システム2の構成を示した図である。顕微鏡システム2は、顕微鏡100の代わりに顕微鏡200を備える点が、顕微鏡システム1とは異なっている。顕微鏡200は、中間鏡筒130の代わりに中間鏡筒150を備えている。中間鏡筒150には、投影装置131、光偏向素子132、及び、投影レンズ133に加えて、撮像装置140と、光偏向素子143が設けられている。
[Second embodiment]
FIG. 16 is a diagram illustrating a configuration of the microscope system 2 according to the present embodiment. The microscope system 2 is different from the microscope system 1 in that a microscope 200 is provided instead of the microscope 100. The microscope 200 includes an intermediate lens barrel 150 instead of the intermediate lens barrel 130. The intermediate lens barrel 150 is provided with an imaging device 140 and a light deflecting element 143 in addition to the projection device 131, the light deflecting element 132, and the projection lens 133.
 光偏向素子143は、試料からの光を撮像素子141に向けて偏向する。光偏向素子143は、例えば、ハーフミラーなどのビームスプリッタである。光偏向素子143は、光偏向素子132と対物レンズ102の間の光路上に配置されている。これにより、投影装置131からの光が撮像素子141へ入射することを回避することができる。 The light deflector 143 deflects light from the sample toward the image sensor 141. The light deflection element 143 is, for example, a beam splitter such as a half mirror. The light deflection element 143 is arranged on an optical path between the light deflection element 132 and the objective lens 102. Accordingly, it is possible to prevent light from the projection device 131 from being incident on the image sensor 141.
　本実施形態に係る顕微鏡システム2によっても、顕微鏡システム1と同様の効果を得ることができる。また、投影装置131と撮像装置140を中間鏡筒150内に組み込むことで、投影画像を像面に投影するための機器を一つのユニットとして構成することができる。このため、既存の顕微鏡システムを容易に拡張することができる。 The microscope system 2 according to the present embodiment can also obtain the same effects as the microscope system 1. In addition, by incorporating the projection device 131 and the imaging device 140 into the intermediate lens barrel 150, the equipment for projecting the projection image onto the image plane can be configured as a single unit. The existing microscope system can therefore be easily extended.
[第3実施形態]
 図17は、本実施形態に係る顕微鏡システム3の構成を示した図である。顕微鏡システム3は、顕微鏡100の代わりに顕微鏡300を備える点が、顕微鏡システム1とは異なっている。顕微鏡300は、中間鏡筒130の代わりに中間鏡筒160を備えている。中間鏡筒160には、投影装置131、光偏向素子132、及び、投影レンズ133に加えて、撮像装置140と、光偏向素子143が設けられている。
[Third embodiment]
FIG. 17 is a diagram illustrating a configuration of the microscope system 3 according to the present embodiment. The microscope system 3 differs from the microscope system 1 in that a microscope 300 is provided instead of the microscope 100. The microscope 300 includes an intermediate lens barrel 160 instead of the intermediate lens barrel 130. The intermediate barrel 160 is provided with an imaging device 140 and a light deflecting element 143 in addition to the projection device 131, the light deflecting device 132, and the projection lens 133.
 光偏向素子143は、試料からの光を撮像素子141に向けて偏向する。光偏向素子143は、例えば、ハーフミラーなどのビームスプリッタである。光偏向素子143は、結像レンズ103と光偏向素子132の間の光路上に配置されている。 The light deflector 143 deflects light from the sample toward the image sensor 141. The light deflection element 143 is, for example, a beam splitter such as a half mirror. The light deflecting element 143 is arranged on an optical path between the imaging lens 103 and the light deflecting element 132.
　本実施形態に係る顕微鏡システム3によっても、顕微鏡システム1と同様の効果を得ることができる。 The microscope system 3 according to the present embodiment can also obtain the same effects as the microscope system 1.
[第4実施形態]
 図18は、本実施形態に係る顕微鏡400の構成を示した図である。なお、本実施形態に係る顕微鏡システムは、顕微鏡100の代わりに顕微鏡400を備える点を除き、顕微鏡システム1と同様である。
[Fourth embodiment]
FIG. 18 is a diagram illustrating a configuration of a microscope 400 according to the present embodiment. The microscope system according to the present embodiment is the same as the microscope system 1 except that a microscope 400 is provided instead of the microscope 100.
 顕微鏡400は、アクティブ方式のオートフォーカス装置500を備えている点が、顕微鏡100とは異なっている。その他の点は、顕微鏡100と同様である。 The microscope 400 is different from the microscope 100 in that the microscope 400 includes an active type autofocus device 500. Other points are the same as those of the microscope 100.
　オートフォーカス装置500は、レーザ501、コリメートレンズ502、遮蔽板503、偏光ビームスプリッタ504、1/4波長板505、ダイクロイックミラー506、結像レンズ507、2分割ディテクタ508を備えている。レーザ501から出射したレーザ光は、コリメートレンズ502でコリメートされた後に、遮蔽板503でその半分の光束が遮断される。残りの半分の光束は、偏光ビームスプリッタ504で反射し、1/4波長板505、ダイクロイックミラー506を経由して、対物レンズ102に入射し、対物レンズ102によって試料に照射される。試料で反射したレーザ光は、対物レンズ102、ダイクロイックミラー506、1/4波長板505を経由して、再び、偏光ビームスプリッタ504へ入射する。偏光ビームスプリッタ504へ2度目に入射するレーザ光は、偏光ビームスプリッタ504で反射して以降、1/4波長板505を2度通過している。従って、偏光ビームスプリッタ504へ1度目に入射したときの偏光方向とは直交する偏光方向を有している。このため、レーザ光は偏光ビームスプリッタ504を透過する。その後、レーザ光は、結像レンズ507により2分割ディテクタ508へ照射される。 The autofocus device 500 includes a laser 501, a collimating lens 502, a shielding plate 503, a polarizing beam splitter 504, a quarter-wave plate 505, a dichroic mirror 506, an imaging lens 507, and a two-segment detector 508. The laser light emitted from the laser 501 is collimated by the collimating lens 502, and half of the light beam is then blocked by the shielding plate 503. The remaining half of the light beam is reflected by the polarizing beam splitter 504, passes through the quarter-wave plate 505 and the dichroic mirror 506, enters the objective lens 102, and is irradiated onto the sample by the objective lens 102. The laser light reflected by the sample passes through the objective lens 102, the dichroic mirror 506, and the quarter-wave plate 505, and enters the polarizing beam splitter 504 again. By the time it enters the polarizing beam splitter 504 for the second time, the laser light has passed through the quarter-wave plate 505 twice since being reflected by the polarizing beam splitter 504. Its polarization direction is therefore orthogonal to the polarization direction at the first incidence on the polarizing beam splitter 504, so the laser light is transmitted through the polarizing beam splitter 504. Thereafter, the laser light is directed onto the two-segment detector 508 by the imaging lens 507.
The light amount distribution detected by the two-segment detector 508 changes according to the amount of deviation from the in-focus state. Therefore, by adjusting the distance between the stage 101 and the objective lens 102 according to the distribution of the amount of light detected by the two-segment detector 508, a focused state can be achieved.
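One common way to turn the two-segment detector reading into a control signal (an assumption for illustration, not stated explicitly in the text) is the normalized difference of the two halves: it is zero in focus and changes sign with the direction of defocus, so the stage-objective distance can be servoed toward zero error.

```python
def focus_error(i1, i2):
    """Normalized difference of the two detector halves; 0.0 means in focus.

    i1, i2: light amounts on the two segments of the detector. The sign of
    the result indicates the direction of the defocus.
    """
    total = i1 + i2
    return 0.0 if total == 0 else (i1 - i2) / total

def adjust_stage(z_mm, i1, i2, gain_mm=0.01):
    """One proportional servo step moving the stage to reduce the error."""
    return z_mm - gain_mm * focus_error(i1, i2)
```

The gain and the proportional control law here are hypothetical; a real autofocus loop would be calibrated against the actual detector response of the device.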
　本実施形態に係る顕微鏡システムは、ステージ101が対物レンズ102の光軸と直交する方向に移動したときに、オートフォーカス装置500によりオートフォーカス処理を行う。これにより、顕微鏡システム1に比べて、更に利用者の作業負担を軽減することができる。 In the microscope system according to the present embodiment, when the stage 101 moves in a direction orthogonal to the optical axis of the objective lens 102, the autofocus device 500 performs autofocus processing. This can reduce the user's workload even further than the microscope system 1.
[第5実施形態]
 図19は、本実施形態に係る顕微鏡600の構成を示した図である。なお、本実施形態に係る顕微鏡システムは、顕微鏡100の代わりに顕微鏡600を備える点を除き、顕微鏡システム1と同様である。
[Fifth Embodiment]
FIG. 19 is a diagram illustrating a configuration of a microscope 600 according to the present embodiment. The microscope system according to the present embodiment is the same as the microscope system 1 except that a microscope 600 is provided instead of the microscope 100.
 顕微鏡600は、倒立顕微鏡である。顕微鏡600は、透過照明光学系として、光源601とコンデンサレンズ602を備えている。顕微鏡600は、コンデンサレンズ602と対向する位置に対物レンズ603を備えている。対物レンズ603の光軸上には、ビームスプリッタ604、ビームスプリッタ606、結像レンズ609、ビームスプリッタ610、リレーレンズ612、接眼レンズ613が並んでいる。 The microscope 600 is an inverted microscope. The microscope 600 includes a light source 601 and a condenser lens 602 as a transmission illumination optical system. The microscope 600 has an objective lens 603 at a position facing the condenser lens 602. A beam splitter 604, a beam splitter 606, an imaging lens 609, a beam splitter 610, a relay lens 612, and an eyepiece 613 are arranged on the optical axis of the objective lens 603.
　顕微鏡600は、さらに、光源605を備える。光源605から出射した照明光は、ビームスプリッタ604によって試料に向けて偏向される。顕微鏡600は、さらに、投影装置607と投影レンズ608を備える。投影装置607からの光は、投影レンズ608を経由して入射するビームスプリッタ606で接眼レンズ613へ向けて偏向される。これにより、結像レンズ609とリレーレンズ612の間の像面に、投影装置607からの光によって投影画像が投影される。顕微鏡600は、さらに、撮像素子611を備えている。撮像素子611は、ビームスプリッタ610で反射した試料からの光を検出して、デジタル画像データを出力する。 The microscope 600 further includes a light source 605. Illumination light emitted from the light source 605 is deflected toward the sample by the beam splitter 604. The microscope 600 further includes a projection device 607 and a projection lens 608. Light from the projection device 607 enters the beam splitter 606 via the projection lens 608 and is deflected toward the eyepiece 613. As a result, a projection image is projected by the light from the projection device 607 onto the image plane between the imaging lens 609 and the relay lens 612. The microscope 600 further includes an imaging element 611. The imaging element 611 detects the light from the sample reflected by the beam splitter 610 and outputs digital image data.
　本実施形態に係る顕微鏡システムによっても、顕微鏡システム1と同様の効果を得ることができる。 The microscope system according to the present embodiment can also obtain the same effects as the microscope system 1.
[第6実施形態]
 図20は、本実施形態に係る顕微鏡700の構成を示した図である。なお、本実施形態に係る顕微鏡システムは、顕微鏡100の代わりに顕微鏡700を備える点を除き、顕微鏡システム1と同様である。
[Sixth embodiment]
FIG. 20 is a diagram illustrating a configuration of a microscope 700 according to the present embodiment. The microscope system according to the present embodiment is the same as the microscope system 1 except that a microscope 700 is provided instead of the microscope 100.
 The microscope 700 is a stereo microscope. The microscope 700 includes a light source 712, a collector lens 711, and a condenser lens 710 as a transmission illumination optical system. The microscope 700 includes a light source 707 and an objective lens 708 as an epi-illumination optical system. The microscope 700 also includes a light source 709, which is an external illumination light source.
 The microscope 700 further includes two imaging lenses (702a, 702b) that condense light from the objective lens 708 to form intermediate images, and two eyepieces (701a, 701b). The eyepiece 701a and the imaging lens 702a form the optical system for the right eye, and the eyepiece 701b and the imaging lens 702b form the optical system for the left eye.
 The microscope 700 further includes a projection device 703a, which is a first projection device, and a projection lens 704a on an optical path branched from the optical path for the right eye, and a projection device 703b, which is a second projection device, and a projection lens 704b on an optical path branched from the optical path for the left eye.
 The microscope 700 further includes, on an optical path branched from the optical path for the right eye, an imaging device 710a, which is a first imaging device that acquires first digital image data of the sample based on light from the sample, and, on an optical path branched from the optical path for the left eye, an imaging device 710b, which is a second imaging device that acquires second digital image data of the sample based on the light from the sample. The imaging device 710a includes an imaging lens 706a and an image sensor 705a, and the imaging device 710b includes an imaging lens 706b and an image sensor 705b.
 With the microscope system according to the present embodiment as well, the same effects as those of the microscope system 1 can be obtained. Furthermore, in the present embodiment, the image analysis unit 22 can perform stereo measurement based on the microscope information, the first digital image data, and the second digital image data, and can output height information of the sample as an analysis result. The projection image generation unit 23 then generates projection image data representing a projection image that is a three-dimensional image, based on the microscope information and the analysis result. As a result, in the microscope system according to the present embodiment, the projection devices 703a and 703b can project a projection image that is a three-dimensional image onto the image plane.
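As an illustrative sketch only (not part of the patent), the stereo measurement described above can be reduced to standard two-view triangulation: given the disparity between the left and right images, the baseline between the two optical paths, the effective focal length, and the sensor pixel pitch (all of which would come from the managed microscope information), a per-pixel height map follows from Z = f·B/d. All names and parameter values below are hypothetical stand-ins.

```python
# Hypothetical sketch of disparity-to-height conversion for the stereo
# measurement performed by the image analysis unit. Not the patented
# implementation; just the standard triangulation relation Z = f * B / d.
import numpy as np

def height_map(disparity_px, baseline_mm, focal_mm, pixel_pitch_mm):
    """Convert a per-pixel disparity map (in pixels) to height (in mm).

    The disparity is first converted to millimeters via the pixel pitch,
    so that Z = f * B / d uses consistent length units throughout.
    Pixels with zero or negative disparity are marked invalid (NaN).
    """
    d_mm = np.asarray(disparity_px, dtype=float) * pixel_pitch_mm
    with np.errstate(divide="ignore", invalid="ignore"):
        z = np.where(d_mm > 0, focal_mm * baseline_mm / d_mm, np.nan)
    return z
```

For example, with a 5 mm baseline, a 50 mm effective focal length, and a 5 µm pixel pitch, a uniform 10-pixel disparity maps to a constant height plane; zero-disparity pixels come back as NaN rather than infinity.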
 Furthermore, in the microscope system according to the present embodiment, the light source 709 may irradiate the sample with a phase pattern. The image analysis unit 22 can then analyze the first digital image data and the second digital image data and output point cloud data as an analysis result. The projection image generation unit 23 then generates projection image data representing a projection image that is a three-dimensional image, based on the microscope information and the analysis result. As a result, in the microscope system according to the present embodiment, the projection devices 703a and 703b can project a projection image that is a three-dimensional image onto the image plane.
 The embodiments described above are specific examples presented to facilitate understanding of the invention, and embodiments of the invention are not limited to these. Various modifications and changes can be made to the microscope system without departing from the scope of the claims.
 In the embodiments described above, examples in which the microscope includes an imaging device have been described; however, the technique described above may also be applied to, for example, a scanning microscope. In that case, the microscope may include a photodetector such as a photomultiplier tube (PMT) instead of the imaging device.
 In the embodiments described above, examples in which the imaging lens 103 is a lens with a variable focal length and in which the projection lens 133 is a lens with a variable focal length have been described; however, in the microscope system, another lens may be the variable-focus lens. It is desirable that the microscope system include a lens that varies at least one of the first projection magnification, the second projection magnification, and the third projection magnification.
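The three projection magnifications managed as microscope information (sample to image plane, sample to imaging device, and projection device to image plane) are exactly what is needed to register an overlay: a feature found at a camera pixel must be drawn at the projector pixel that lands on the same point of the optical image. The following is a hypothetical sketch of that coordinate mapping, not the patented implementation; all names are illustrative.

```python
# Hypothetical sketch: mapping a camera pixel to the projector pixel that
# overlays the same sample point on the image plane.
#   m1: magnification sample -> image plane (first magnification)
#   m2: magnification sample -> imaging device (second magnification)
#   m3: magnification projection device -> image plane (third magnification)

def camera_to_projector(x_cam_px, y_cam_px, m1, m2, m3,
                        cam_pitch_um, proj_pitch_um):
    """Map camera pixel coordinates to projector pixel coordinates.

    A sample point p appears at m2 * p on the camera sensor and at
    m1 * p on the image plane; a projector pixel q lands at m3 * q on
    the image plane. The overlay condition m3 * q = m1 * p gives the
    scale below, with pixel pitches converting pixels to lengths.
    """
    scale = (m1 / (m2 * m3)) * (cam_pitch_um / proj_pitch_um)
    return x_cam_px * scale, y_cam_px * scale
```

With m1 = 10, m2 = 5, m3 = 2, and equal pixel pitches, the scale is 1 and camera and projector coordinates coincide; halving the projector pitch would double the projector-pixel coordinates accordingly.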
 The image analysis unit 22 may perform the analysis processing using a predetermined algorithm, or may perform the analysis processing using a trained neural network. The parameters of the trained neural network may be generated by training the neural network on a device different from the microscope system; the control device 10 may download the generated parameters and apply them to the image analysis unit 22.
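A minimal sketch of the train-elsewhere, apply-here arrangement described above, under stated assumptions: the network architecture, the parameter names, and the in-memory parameter dictionary are all hypothetical stand-ins (in practice the parameters would arrive as a downloaded file, e.g. via `np.load`). This is not the patented analysis unit, only an illustration of applying externally generated parameters.

```python
# Hypothetical sketch: the control device applies downloaded, externally
# trained parameters to a tiny two-layer analysis model. Architecture and
# parameter names ("w1", "b1", "w2", "b2") are illustrative assumptions.
import numpy as np

class TinyAnalyzer:
    def __init__(self, params):
        # params: dict of weights produced by training on another machine
        self.w1, self.b1 = params["w1"], params["b1"]
        self.w2, self.b2 = params["w2"], params["b2"]

    def predict(self, features):
        h = np.maximum(0.0, features @ self.w1 + self.b1)  # ReLU layer
        return h @ self.w2 + self.b2                        # linear output

# Built in-memory here to keep the sketch self-contained; a real system
# would download these parameters and load them from storage.
params = {
    "w1": np.eye(3), "b1": np.zeros(3),
    "w2": np.ones((3, 1)), "b2": np.zeros(1),
}
analyzer = TinyAnalyzer(params)
score = analyzer.predict(np.array([1.0, 2.0, 3.0]))  # -> array([6.])
```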
 The image analysis unit 22 may also be provided outside the microscope system. For example, the communication control unit 25 may transmit the digital image data to an external system, and the external system including the image analysis unit 22 may analyze the digital image data. The communication control unit 25 may receive the analysis result of the digital image data from the external system, and the projection image generation unit 23 may generate the projection image data based on the received analysis result and the microscope information.
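The external-analysis round trip described above can be sketched as plain control flow, with the network transport injected as a function so the example stays self-contained. Everything here is a hypothetical illustration: the payload shape, the result keys, and the stand-in external system are assumptions, not the patented protocol.

```python
# Hypothetical sketch of off-loading analysis to an external system and
# building projection data from the returned result. The transport is an
# injected callable; in practice it would be a network call.

def analyze_remotely(digital_image_data, send_and_receive):
    """Send image data out and return the external analysis result."""
    return send_and_receive({"image": digital_image_data})

def generate_projection_data(analysis_result, microscope_info):
    """Combine the received result with the managed microscope information."""
    return {"overlay": analysis_result["labels"],
            "scale": microscope_info["first_magnification"]}

# Stand-in external system that "labels" each image element.
fake_external = lambda payload: {"labels": ["cell"] * len(payload["image"])}

result = analyze_remotely([0.1, 0.9], fake_external)
projection = generate_projection_data(result, {"first_magnification": 10})
# projection == {"overlay": ["cell", "cell"], "scale": 10}
```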
1, 2, 3                    Microscope system
10                         Control device
10a                        Processor
10b                        Memory
10c                        Auxiliary storage device
10d                        Input/output interface
10e                        Medium drive device
10f                        Communication control device
10g                        Bus
10h                        Storage medium
21                         Imaging control unit
22                         Image analysis unit
23                         Projection image generation unit
24                         Projection control unit
25                         Communication control unit
30                         Information management unit
40                         Input device
41                         Voice input device
50                         Voice output device
100, 200, 300, 400, 600, 700  Microscope
101                        Stage
102, 102a, 603, 708        Objective lens
103, 507, 609, 702a, 702b, 706a, 706b  Imaging lens
104, 613, 701a, 701b       Eyepiece
110                        Microscope body
111                        Turret
120                        Lens barrel
130, 150, 160              Intermediate lens barrel
131, 607, 703a, 703b       Projection device
132, 143                   Light deflection element
133, 608, 704a, 704b       Projection lens
140, 710a, 710b            Imaging device
141, 611, 705a, 705b       Image sensor
142                        Adapter lens
500                        Autofocus device
501                        Laser
502                        Collimating lens
503                        Shielding plate
504                        Polarizing beam splitter
505                        Quarter-wave plate
506                        Dichroic mirror
508                        Two-segment detector
601, 605, 707, 709, 712    Light source
602, 710                   Condenser lens
604, 606, 610              Beam splitter
612                        Relay lens
711                        Collector lens
A1 to A3, R1 to R4         Region
C                          Mark
IP1, IP1a, IP2             Image plane
MI                         Microscope information
O1 to O4                   Optical image
OP1                        Object plane
OP2                        Display plane
P1 to P8                   Projection image
PX1, PX2                   Pixel
V1 to V9                   Image

Claims (28)

  1.  A microscope system comprising:
     an eyepiece;
     an objective lens that guides light from a sample to the eyepiece;
     an imaging lens that is arranged on an optical path between the eyepiece and the objective lens and that forms an optical image of the sample based on the light from the sample;
     an imaging device that acquires digital image data of the sample based on the light from the sample;
     a projection device that projects a projection image onto an image plane, between the imaging lens and the eyepiece, on which the optical image is formed; and
     a control device that manages microscope information including at least a first magnification at which the sample is projected onto the image plane, a second magnification at which the sample is projected onto the imaging device, a third magnification at which the projection device is projected onto the image plane, a size of the imaging device, and a size of the projection device.
  2.  The microscope system according to claim 1, wherein
     the control device comprises a projection image generation unit that generates projection image data representing the projection image based on at least the microscope information.
  3.  The microscope system according to claim 2, wherein
     the control device further comprises an image analysis unit that analyzes the digital image data acquired by the imaging device, and
     the projection image generation unit generates the projection image data based on an analysis result of the image analysis unit and the microscope information.
  4.  The microscope system according to claim 2 or 3, wherein
     the microscope information further includes information on the microscopy method used for forming the optical image, and
     the projection image generation unit determines a background color of the projection image based on the information on the microscopy method.
  5.  The microscope system according to any one of claims 1 to 4, wherein
     the control device further comprises an imaging control unit that controls the imaging device, and
     the imaging control unit determines a detection sensitivity of the imaging device based on the microscope information.
  6.  The microscope system according to any one of claims 1 to 4, wherein
     the control device further comprises an imaging control unit that controls the imaging device, and
     the imaging control unit determines, based on the microscope information, pixels from which signals are to be read out from among effective pixels of the imaging device.
  7.  The microscope system according to any one of claims 1 to 6, further comprising
     a light deflection element that is arranged on an optical path between the objective lens and the eyepiece and that guides light from the projection device to the eyepiece, wherein
     the control device changes a reflectance of the light deflection element based on the microscope information.
  8.  The microscope system according to any one of claims 1 to 7, further comprising
     a stage on which the sample is placed, wherein
     the microscope information further includes a combination of coordinate information in a direction orthogonal to an optical axis of the objective lens and coordinate information in a direction of the optical axis, the combination indicating a position of the stage in a focused state.
  9.  The microscope system according to any one of claims 1 to 8, further comprising
     a voice output device that outputs the microscope information by voice.
  10.  The microscope system according to any one of claims 1 to 9, further comprising
     a lens that varies at least one of the first magnification, the second magnification, and the third magnification.
  11.  The microscope system according to claim 10, wherein
     the lens is the imaging lens.
  12.  The microscope system according to any one of claims 1 to 11, wherein
     an image circle formed on the image plane by light from the projection device is larger than an image circle formed on the image plane by the light from the sample.
  13.  The microscope system according to any one of claims 1 to 4, wherein
     the control device further comprises an imaging control unit that controls the imaging device, and
     the imaging control unit controls the imaging device such that an exposure period of the imaging device and a projection period of the projection image do not overlap.
  14.  The microscope system according to claim 3, wherein
     the image analysis unit tracks a region of interest in the sample based on the digital image data.
  15.  The microscope system according to claim 14, further comprising
     a motorized stage on which the sample is placed, wherein
     the control device controls the motorized stage based on a tracking result of the image analysis unit such that the region of interest is located on an optical axis of the objective lens.
  16.  The microscope system according to claim 3, wherein
     the imaging device comprises:
      a first imaging device that acquires first digital image data of the sample based on the light from the sample; and
      a second imaging device that acquires second digital image data of the sample based on the light from the sample, and
     the image analysis unit performs stereo measurement based on the microscope information, the first digital image data, and the second digital image data, and outputs height information of the sample as the analysis result.
  17.  The microscope system according to claim 16, wherein
     the projection image generation unit generates the projection image data based on the microscope information and the analysis result, the projection image represented by the projection image data being a three-dimensional image,
     the projection device comprises a first projection device and a second projection device, and
     the first projection device and the second projection device project the projection image onto the image plane.
  18.  The microscope system according to claim 3, further comprising
     a light source that irradiates the sample with a phase pattern, wherein
     the image analysis unit analyzes the digital image data and outputs point cloud data of the sample as the analysis result,
     the projection image generation unit generates the projection image data based on the microscope information and the analysis result, the projection image represented by the projection image data being a three-dimensional image,
     the projection device comprises a first projection device and a second projection device, and
     the first projection device and the second projection device project the projection image onto the image plane.
  19.  The microscope system according to any one of claims 2 to 4, further comprising
     a voice input device, wherein
     the projection image generation unit generates the projection image data based on product identification information input from the voice input device and the microscope information, and
     the projection image includes an image related to a product identified by the product identification information.
  20.  The microscope system according to claim 3, wherein
     the image analysis unit analyzes the digital image data to identify product identification information attached to the sample,
     the projection image generation unit generates the projection image data based on the product identification information identified by the image analysis unit and the microscope information, and
     the projection image includes an image related to a product identified by the product identification information.
  21.  The microscope system according to claim 19 or 20, wherein
     the image related to the product includes a product code of the product, a product number of the product, or an inspection procedure for the product.
  22.  The microscope system according to any one of claims 19 to 21, wherein
     the imaging device acquires, based on the light from the sample and light from the projection device, superimposed image data representing a superimposed image in which the projection image is superimposed on the optical image.
  23.  The microscope system according to claim 3, further comprising
     a voice input device, wherein
     the image analysis unit inspects the sample based on product identification information input from the voice input device, the microscope information, and the digital image data, and outputs an inspection result as the analysis result, and
     the projection image includes an image indicating pass or fail of the inspection of the sample.
  24.  The microscope system according to claim 3, wherein
     the image analysis unit
      analyzes the digital image data to identify product identification information attached to the sample, and
      inspects the sample based on the identified product identification information, the microscope information, and the digital image data, and outputs an inspection result as the analysis result, and
     the projection image includes an image indicating pass or fail of the inspection of the sample.
  25.  The microscope system according to claim 23 or 24, wherein
     the control device creates a check sheet for inspection items of the sample based on the inspection result.
  26.  The microscope system according to any one of claims 23 to 25, wherein
     the image analysis unit analyzes the digital image data using a neural network trained for the sample.
  27.  The microscope system according to any one of claims 2 to 4, further comprising
     a communication control unit that exchanges data with an external system connected to the microscope system via a network, wherein
     the projection image generation unit generates the projection image data based on data the communication control unit receives from the external system and the microscope information.
  28.  The microscope system according to any one of claims 2 to 4, further comprising
     a communication control unit that exchanges data with an external system connected to the microscope system via a network, wherein
     the communication control unit
      transmits the digital image data to the external system, and
      receives an analysis result of the digital image data, and
     the projection image generation unit generates the projection image data based on the analysis result the communication control unit receives from the external system and the microscope information.
PCT/JP2018/047494 2018-09-28 2018-12-25 Microscope system WO2020066041A1 (en)

Priority Applications (4)

Application Number Priority Date Filing Date Title
JP2020547902A JP7150867B2 (en) 2018-09-28 2018-12-25 microscope system
EP18935405.3A EP3988986A4 (en) 2018-09-28 2018-12-25 Microscope system
CN201880097737.6A CN112703440B (en) 2018-09-28 2018-12-25 Microscope system
US17/196,705 US20210215923A1 (en) 2018-09-28 2021-03-09 Microscope system

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2018183761 2018-09-28
JP2018-183761 2018-09-28

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US17/196,705 Continuation US20210215923A1 (en) 2018-09-28 2021-03-09 Microscope system

Publications (1)

Publication Number Publication Date
WO2020066041A1 true WO2020066041A1 (en) 2020-04-02

Family

ID=69951314

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2018/047494 WO2020066041A1 (en) 2018-09-28 2018-12-25 Microscope system

Country Status (5)

Country Link
US (1) US20210215923A1 (en)
EP (1) EP3988986A4 (en)
JP (1) JP7150867B2 (en)
CN (1) CN112703440B (en)
WO (1) WO2020066041A1 (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP4270083A1 (en) 2022-04-28 2023-11-01 Evident Corporation Microscope system, projection unit, and image projection method
EP4303638A1 (en) 2022-06-20 2024-01-10 Evident Corporation Lens-barrel device and microscope system

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP3988988A4 (en) 2018-09-28 2023-09-13 Evident Corporation Microscope system, projection unit, and image projection method
WO2020066040A1 (en) * 2018-09-28 2020-04-02 オリンパス株式会社 Microscope system, projection unit, and image projection method
EP4345776A2 (en) 2018-09-28 2024-04-03 Evident Corporation Microscope system, projection unit, and image projection method
WO2020138279A1 (en) 2018-12-28 2020-07-02 オリンパス株式会社 Microscope system
DE102022117270B3 (en) * 2022-07-12 2023-10-26 Leica Microsystems Cms Gmbh Imaging device with a camera adapter, method and computer program product

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH0580255A (en) * 1991-09-19 1993-04-02 Nippon Telegr & Teleph Corp <Ntt> Optical microscopic system device
JPH07253548A (en) * 1994-03-15 1995-10-03 Nikon Corp Device and method for automatically pursuing sample image
JPH0829694A (en) * 1994-07-20 1996-02-02 Nikon Corp Microscope with image processor
JPH11242189A (en) * 1997-12-25 1999-09-07 Olympus Optical Co Ltd Method and device for forming image
JP2001519944A (en) 1997-03-03 2001-10-23 バクス リサーチ ラボラトリーズ インコーポレイテッド Method and apparatus for acquiring and reconstructing a magnified image of a sample from a computer controlled microscope
JP2005351916A (en) * 2004-06-08 2005-12-22 Olympus Corp Binocular microscope device
JP2006292999A (en) * 2005-04-11 2006-10-26 Direct Communications:Kk Slide image data generation device and slide image data
JP2008090072A (en) * 2006-10-03 2008-04-17 Keyence Corp Magnified image observation system, confocal microscope, image data transfer method, three-dimensional focused image forming method, data transfer program, three-dimensional focused image forming program, computer recordable recording medium, and apparatus with program recorded
JP2016133668A (en) * 2015-01-20 2016-07-25 オリンパス株式会社 Pattern projection device, pattern projection method and phase modulation amount setting method

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE10243852B4 (en) * 2002-09-20 2006-01-26 Carl Zeiss Microscopy system and microscopy method
DE102014205038B4 (en) * 2014-02-19 2015-09-03 Carl Zeiss Meditec Ag Visualization devices with calibration of a display and calibration methods for display in a visualization device
DE102015103426B4 (en) * 2015-03-09 2020-07-02 Carl Zeiss Meditec Ag Microscope system and method for automated alignment of a microscope
JP2019532352A (en) * 2016-08-28 2019-11-07 オーグメンティクス メディカル リミテッド System for histological examination of tissue specimens
DE102017105941B3 (en) * 2017-03-20 2018-05-17 Carl Zeiss Meditec Ag Surgical microscope with an image sensor and a display and method for operating a surgical microscope


Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP4270083A1 (en) 2022-04-28 2023-11-01 Evident Corporation Microscope system, projection unit, and image projection method
EP4303638A1 (en) 2022-06-20 2024-01-10 Evident Corporation Lens-barrel device and microscope system

Also Published As

Publication number Publication date
US20210215923A1 (en) 2021-07-15
EP3988986A1 (en) 2022-04-27
JPWO2020066041A1 (en) 2021-08-30
EP3988986A4 (en) 2023-09-06
CN112703440A (en) 2021-04-23
JP7150867B2 (en) 2022-10-11
CN112703440B (en) 2022-12-23

Similar Documents

Publication Publication Date Title
JP7150867B2 (en) Microscope system
JP5677473B2 (en) Focus adjustment device and focus adjustment method
CN101303269B (en) Optical system evaluation apparatus, optical system evaluation method and program thereof
WO2020066042A1 (en) Microscope system, projection unit, and image projection method
JP6798625B2 (en) Quantitative phase image generation method, quantitative phase image generator and program
US9804029B2 (en) Microspectroscopy device
US20150185460A1 (en) Image forming method and image forming apparatus
JP6534658B2 (en) Scanning microscope and method of determining point spread function (PSF) of scanning microscope
US8294728B2 (en) Process for generating display images from acquired recorded images, and means for carrying out the process
CN103168265A (en) Imaging systems and associated methods thereof
JP6363477B2 (en) 3D shape measuring device
KR101505745B1 (en) Dual detection confocal reflecting microscope and method of detecting information on height of sample using same
US20200355902A1 (en) Microscopy device
JP6969655B2 (en) Quantitative phase image generator
US11954172B2 (en) One-to-many randomizing interference microscope
US9366567B2 (en) Focusing device including a differential interference prism
JP2021056508A (en) Method for imaging object using imaging device and microscope
JP2009192721A (en) Confocal microscope
JP2013222086A (en) Microscope device
JP2012220780A (en) Imaging system

Legal Events

Date Code Title Description
121 EP: the EPO has been informed by WIPO that EP was designated in this application
Ref document number: 18935405
Country of ref document: EP
Kind code of ref document: A1

ENP Entry into the national phase
Ref document number: 2020547902
Country of ref document: JP
Kind code of ref document: A

NENP Non-entry into the national phase
Ref country code: DE

ENP Entry into the national phase
Ref document number: 2018935405
Country of ref document: EP
Effective date: 20210428