US20190268573A1 - Digital microscope apparatus for reimaging blurry portion based on edge detection - Google Patents

Digital microscope apparatus for reimaging blurry portion based on edge detection Download PDF

Info

Publication number
US20190268573A1
Authority
US
United States
Prior art keywords
image
imaging
area
focus position
observation
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US16/411,487
Inventor
Goh Matsunobu
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Sony Corp
Original Assignee
Sony Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sony Corp filed Critical Sony Corp
Priority to US16/411,487
Publication of US20190268573A1

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00Television systems
    • H04N7/18Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
    • H04N7/183Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a single remote source
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B21/00Microscopes
    • G02B21/0004Microscopes specially adapted for specific applications
    • G02B21/0016Technical microscopes, e.g. for inspection or measuring in industrial production processes
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B21/00Microscopes
    • G02B21/06Means for illuminating specimens
    • G02B21/08Condensers
    • G02B21/14Condensers affording illumination for phase-contrast observation
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B21/00Microscopes
    • G02B21/18Arrangements with more than one light path, e.g. for comparing two specimens
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B21/00Microscopes
    • G02B21/24Base structure
    • G02B21/241Devices for focusing
    • G02B21/244Devices for focusing using image analysis techniques
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B21/00Microscopes
    • G02B21/24Base structure
    • G02B21/241Devices for focusing
    • G02B21/245Devices for focusing using auxiliary sources, detectors
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B21/00Microscopes
    • G02B21/36Microscopes arranged for photographic purposes or projection purposes or digital imaging or video purposes including associated control and data processing arrangements
    • G02B21/365Control or image processing arrangements for digital or video microscopes
    • G02B21/367Control or image processing arrangements for digital or video microscopes providing an output produced by processing a plurality of individual source images, e.g. image tiling, montage, composite images, depth sectioning, image comparison
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B7/00Mountings, adjusting means, or light-tight connections, for optical elements
    • G02B7/28Systems for automatic generation of focusing signals
    • G02B7/34Systems for automatic generation of focusing signals using different areas in a pupil plane
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B7/00Mountings, adjusting means, or light-tight connections, for optical elements
    • G02B7/28Systems for automatic generation of focusing signals
    • G02B7/36Systems for automatic generation of focusing signals using image sharpness techniques, e.g. image processing techniques for generating autofocus signals

Definitions

  • the present disclosure relates to a digital microscope apparatus that captures an enlarged image of a sample such as a biological sample as an observation image, and to an imaging method for the digital microscope apparatus and a program.
  • as an autofocusing (AF) system of the related art, there is known a system in which a focal position of the objective lens in the enlarging imaging system is moved in an optical axis direction at predetermined intervals, images are captured at the respective moved positions, and a position at which an image having the highest contrast in the captured images is captured is detected as an in-focus position (see, for example, Japanese Patent Application Laid-open No. 2011-197283).
  • This type of focusing system is called “contrast AF”.
  • the contrast AF performs repetitive movements and evaluations of the focal position of the objective lens in order to search for an optimal focal position. Consequently, it takes a relatively long time to obtain the focal position.
  • a microscope apparatus that adopts a “phase difference AF” in which light taken through an objective lens is split into two light beams by a splitter lens and the position and direction of a focal point are determined based on a distance between the two formed images is also disclosed (see, for example, Japanese Patent Application Laid-open No. 2011-090222).
  • the phase difference AF can provide a focal position at higher speed than the contrast AF without the search for a focal position.
  • on the other hand, there is a possibility that the accuracy of the phase difference AF is lowered due to the size of an object within the imaging surface or due to the number of tissues.
  • according to an embodiment of the present disclosure, there is provided a digital microscope apparatus including: an observation image capturing unit configured to capture an observation image of each of a plurality of small areas, an area containing a sample on a glass slide being partitioned by the plurality of small areas; and a controller configured to set at least one evaluation area for the observation image of each of the plurality of small areas, the observation image being captured by the observation image capturing unit, to perform an edge detection on the at least one evaluation area, and to calculate, using results of the edge detection on two evaluation areas that are closest between two of the observation images adjacently located in a connected image, a difference in blur evaluation amount between the two observation images, the connected image being obtained by connecting the observation images according to the partition.
  • the controller may be configured to determine, as a blurred boundary part, a boundary part located between the observation images and having a difference in blur evaluation amount that is out of a predetermined allowable range.
  • the controller may be configured to determine, based on a result of the determination of the blurred boundary part, a small area to be reimaged by the observation image capturing unit and a condition of the reimaging for the small area to be reimaged.
  • the controller may be configured to determine the small area surrounded by the blurred boundary parts as the small area to be reimaged, and to determine to switch an autofocusing method, for the condition of the reimaging. This allows an observation image to be obtained by a more appropriate autofocusing method in accordance with the state of the sample.
  • the controller may be configured to determine, when determining that the blurred boundary parts exist in n small areas successively located in a predetermined one axial direction (where n is an integer of 3 or more), the n small areas as small areas to be reimaged, to change an imaging range in the one axial direction of the observation image capturing unit into 1/m of the imaging range at a first imaging (where m is an integer of 2 or more), for the condition of the reimaging, and to determine to divide each of the small areas into m areas for reimaging.
  • the digital microscope apparatus may further include a thumbnail image capturing unit configured to capture an entire image of the sample on the glass slide, in which the controller may be configured to generate, when the blurred boundary part is determined again in the connected image containing the observation image of the small area that is reimaged by the observation image capturing unit, an image that clearly specifies a position of the blurred boundary part in a thumbnail image captured by the thumbnail image capturing unit and to display the image on a display.
  • according to another embodiment of the present disclosure, there is provided an imaging method for a digital microscope apparatus, the method including: setting at least one evaluation area for an observation image of each of a plurality of small areas, the observation image being captured by an observation image capturing unit configured to capture an observation image of each of the plurality of small areas, an area containing a sample on a glass slide being partitioned by the plurality of small areas; performing an edge detection on the at least one evaluation area; and calculating, using results of the edge detection on two evaluation areas that are closest between two of the observation images adjacently located in a connected image, a difference in blur evaluation amount between the two observation images, the connected image being obtained by connecting the plurality of observation images according to the partition.
  • according to still another embodiment of the present disclosure, there is provided a program causing a computer to operate as a controller configured to set at least one evaluation area for an observation image of each of a plurality of small areas, the observation image being captured by an observation image capturing unit configured to capture an observation image of each of the plurality of small areas, an area containing a sample on a glass slide being partitioned by the plurality of small areas; to perform an edge detection on the at least one evaluation area; and to calculate, using results of the edge detection on two evaluation areas that are closest between two of the observation images adjacently located in a connected image, a difference in blur evaluation amount between the two observation images, the connected image being obtained by connecting the plurality of observation images according to the partition.
  • FIG. 1 is a diagram showing a whole configuration of a digital microscope apparatus according to an embodiment;
  • FIG. 2 is a functional block diagram of an integration controller in the digital microscope apparatus shown in FIG. 1;
  • FIG. 3 is a flowchart showing a flow of an imaging operation by the digital microscope apparatus shown in FIG. 1;
  • FIG. 4 is a diagram partially showing an observation image converted into a grayscale image;
  • FIG. 5 is a diagram showing a specific setting example of evaluation areas in the observation image shown in FIG. 4 and determination results of blurred boundary parts;
  • FIG. 6 is a diagram showing an example of an observation image, for which small areas to be reimaged are determined by a second determination method;
  • FIG. 7 is a diagram showing the determination results of the blurred boundary parts in the observation image shown in FIG. 6;
  • FIG. 8 is a diagram showing a relationship among an x-z-axis cross section of a part of a sample arranged at a large inclination angle with respect to a glass slide surface, a focal position, an imaging range for each focal position, an allowable blur evaluation amount, an in-focus area, and a blurred area;
  • FIG. 9 is an explanatory diagram of reimaging;
  • FIG. 10 is a diagram showing a display example in which the position of a blurred boundary part is clearly specified in a thumbnail image;
  • FIG. 11 is an explanatory diagram of reimaging by a modified example 1;
  • FIG. 12 is a diagram showing another setting method for the evaluation area;
  • FIG. 13 is a diagram showing another setting method for the evaluation area;
  • FIG. 14 is a diagram showing another setting method for the evaluation area;
  • FIG. 15 is a flowchart of an imaging operation for describing a modified example 4; and
  • FIG. 16 is a diagram showing an example of a plurality of small areas that are surrounded by the blurred boundary parts and connected to one another.
  • FIG. 1 is a diagram showing a whole configuration of a digital microscope apparatus 100 according to this embodiment.
  • the digital microscope apparatus 100 includes a thumbnail image capturing unit 10 , an observation image capturing unit 20 , a phase difference image acquiring unit 30 , a stage 40 , and a controller 50 .
  • the thumbnail image capturing unit 10 captures an entire image of a preparation PRT on which a sample SPL is provided (this image being hereinafter referred to as a “thumbnail image”).
  • the observation image capturing unit 20 captures an image obtained by magnifying the sample SPL provided on the preparation PRT at a predetermined magnification (the image being hereinafter referred to as an “observation image”).
  • the phase difference image acquiring unit 30 captures a phase difference image containing information on the amount and orientation of a displacement in an optical axis direction between a focal point of an objective lens 23 of the observation image capturing unit 20 and the sample SPL on the preparation PRT.
  • the stage 40 moves the preparation PRT placed thereon to a position for imaging by the thumbnail image capturing unit 10 and a position for imaging by the observation image capturing unit 20 .
  • the stage 40 is configured to be movable by a stage drive mechanism 41 in a direction of an optical axis (z-axis direction) of the objective lens 23 of the observation image capturing unit 20 and also in a direction (x-axis direction and y-axis direction) orthogonal to the direction of the optical axis. Additionally, the stage 40 is desirably movable also in a direction inclining with respect to a plane orthogonal to the direction of the optical axis. Furthermore, for the movement in the optical axis direction, a configuration in which the objective lens 23 is moved vertically by using, for example, a piezo-stage may be provided.
  • the preparation PRT is obtained by fixing the sample SPL to a glass slide by a predetermined fixing method.
  • the sample SPL is a biological sample that includes tissue slices of connective tissues such as blood, epithelial tissues, or tissues including both of the above, or smear cells.
  • the tissue slices or smear cells are subjected to various types of staining as necessary.
  • staining examples include not only general staining represented by HE (hematoxylin-eosin) staining, Giemsa staining, Papanicolaou staining, Ziehl-Neelsen staining, and Gram staining but also fluorescent staining such as FISH (Fluorescence In-Situ Hybridization) and an enzyme antibody technique.
  • the digital microscope apparatus 100 is additionally equipped with a preparation stock loader 70 that stores the preparations PRT each containing the sample SPL and loads the stored preparations PRT one by one onto the stage 40 . It should be noted that the preparation stock loader 70 may be integrated into the digital microscope apparatus 100 .
  • hereinafter, the thumbnail image capturing unit 10, observation image capturing unit 20, phase difference image acquiring unit 30, and controller 50 described above will be described.
  • the thumbnail image capturing unit 10 includes a light source 11 , an objective lens 12 , and an imaging device 13 as shown in FIG. 1 .
  • the light source 11 is provided on a surface of the stage 40 , which is on the opposite side to the surface on which the preparation is arranged.
  • the light source 11 can switch between light (bright field illumination light) for illuminating a sample SPL on which general staining is performed and light (dark field illumination light) for illuminating a sample SPL on which special staining is performed. Further, the light source 11 may apply only one of the bright field illumination light and the dark field illumination light. In this case, two types of light sources, i.e., the light source that applies the bright field illumination light and the light source that applies the dark field illumination light, are provided as the light sources 11 . It should be noted that the light source that applies the dark field illumination light may be provided on the surface side of the stage 40 on which the preparation is arranged (hereinafter, the surface being also referred to as preparation arrangement surface).
  • the objective lens 12 is arranged on the preparation arrangement surface side of the stage 40 , with the normal line of a reference position of the thumbnail image capturing unit 10 on the preparation arrangement surface being as an optical axis SR. Transmitted light that has been transmitted through the preparation PRT arranged on the stage 40 is collected by the objective lens 12 and forms an image onto the imaging device 13 that is provided behind the objective lens 12 (that is, in a traveling direction of the illumination light).
  • Light covering an imaging range in which the entire preparation PRT placed on the preparation arrangement surface of the stage 40 is included is focused onto the imaging device 13 to form an image.
  • the image formed onto the imaging device 13 is a thumbnail image of the entire preparation PRT.
  • the observation image capturing unit 20 includes a light source 21 , a condenser lens 22 , the objective lens 23 , an imaging device 24 , a condenser lens drive mechanism 25 , and a beam splitter 26 .
  • the light source 21 applies the bright field illumination light.
  • the light source 21 is provided on the surface of the stage 40 , which is on the opposite side to the preparation arrangement surface. Further, at a position different from the light source 21 (for example, on the preparation arrangement surface side), a light source (not shown) that applies the dark field illumination light is provided.
  • the condenser lens 22 is a lens that collects the bright field illumination light applied from the light source 21 or dark field illumination light applied from a dark field illumination light source to guide the light to the preparation PRT on the stage 40 .
  • the condenser lens 22 is arranged between the light source 21 and the stage 40 , with the normal line of a reference position of the observation image capturing unit 20 on the preparation arrangement surface being as an optical axis ER.
  • the condenser lens drive mechanism 25 changes the position of the condenser lens 22 on the optical axis ER by driving the condenser lens 22 along a direction of the optical axis ER.
  • the objective lens 23 is arranged on the preparation arrangement surface side of the stage 40 , with the normal line of the reference position of the observation image capturing unit 20 on the preparation arrangement surface being as the optical axis ER.
  • the objective lens 23 is appropriately replaced, so that the image of the sample SPL can be enlarged and captured at various magnifications. Further, in the case where an imaging lens is arranged in the infinite conjugate system, a magnification can be changed by appropriately replacing the imaging lens. Transmitted light that has been transmitted through the preparation PRT arranged on the stage 40 is collected by the objective lens 23 and reaches the beam splitter 26 .
  • the beam splitter 26 splits the transmitted light that has been transmitted through the objective lens 23 into reflected light that proceeds to the imaging device 24 and transmitted light that proceeds to a field lens 32 in the phase difference image acquiring unit 30 .
  • an image of a small imaging range on the preparation arrangement surface of the stage 40 is formed in accordance with a pixel size of the imaging device 24 and a magnification of the objective lens 23 .
  • the phase difference image acquiring unit 30 includes the field lens 32 , a separator lens 33 , and an imaging device 34 .
  • the field lens 32 collects the reflected light that has been transmitted through the beam splitter 26 and guides the reflected light to the separator lens 33 that is provided behind the field lens 32 (on a traveling direction side of the reflected light).
  • the separator lens 33 divides the light beam guided from the field lens 32 into two light beams.
  • the divided light beams form a set of subject images on an imaging surface of the imaging device 34 provided behind the separator lens 33 (on the traveling direction side of the reflected light).
  • on the imaging device 34, a set of subject images that has been transmitted through the separator lens 33 is formed. A phase difference exists between the set of formed subject images because light beams in various directions, which are emitted from the field lens 32, enter the separator lens 33.
  • the set of subject images is referred to as a “phase difference image”.
  • the beam splitter 26 is provided between the objective lens 23 and the imaging device 24 in the above description, but a light beam branching unit that branches a light beam is not limited to the beam splitter.
  • for example, a movable mirror or the like may be used as the light beam branching unit.
  • the phase difference image acquiring unit 30 is arranged on the optical axis ER of the objective lens 23 , and the imaging device 24 of the observation image capturing unit 20 is arranged at a position on which the reflected light split by the beam splitter 26 is incident.
  • alternatively, the imaging device 24 of the observation image capturing unit 20 may be arranged on the optical axis ER of the objective lens 23, and the phase difference image acquiring unit 30 may be arranged at a position on which the reflected light split by the beam splitter 26 is incident.
  • the optical system of the phase difference image acquiring unit 30 (the phase difference AF optical system) may be a different optical system in which equivalent functions can be achieved by using a condenser lens and twin lenses instead of the field lens, the separator lens, and the like.
  • each of the imaging devices provided to the thumbnail image capturing unit 10 , the observation image capturing unit 20 , and the phase difference image acquiring unit 30 may be a one-dimensional imaging device (line sensor) or a two-dimensional imaging device (area sensor).
  • the controller 50 includes an integration controller 51 , an illumination controller 52 , a stage drive controller 53 , a condenser lens drive controller 54 , a phase difference image capturing controller 57 , a thumbnail image capturing controller 56 , an observation image capturing controller 55 , a storage unit 58 , a development unit 59 , an image coding unit 60 , a communication unit 61 , and a display controller 62 .
  • the integration controller 51 is constituted of hardware elements of a computer including, for example, a CPU (Central Processing Unit), a ROM (Read Only Memory), and a RAM (Random Access Memory).
  • the integration controller 51 may be constituted of a dedicated IC such as an FPGA (field programmable gate array).
  • the integration controller 51 exchanges various signals with the illumination controller 52 , the stage drive controller 53 , the condenser lens drive controller 54 , the observation image capturing controller 55 , the thumbnail image capturing controller 56 , the phase difference image capturing controller 57 , the storage unit 58 , the development unit 59 , the image coding unit 60 , the communication unit 61 , and the display controller 62 , and executes various types of arithmetic processing and control to acquire an observation image.
  • Various programs and various types of data for the various types of arithmetic processing and control are loaded to the RAM.
  • the CPU executes the programs loaded to the RAM.
  • the ROM stores the programs and data loaded to the RAM.
  • the illumination controller 52 , the stage drive controller 53 , the condenser lens drive controller 54 , the phase difference image capturing controller 57 , the thumbnail image capturing controller 56 , and the observation image capturing controller 55 may be each constituted of hardware elements of a computer including, for example, a CPU, a ROM, and a RAM or may be constituted of a dedicated IC such as an FPGA.
  • the development unit 59 and the image coding unit 60 are each constituted of hardware elements of a computer including, for example, a CPU, a ROM, and a RAM.
  • the development unit 59 and the image coding unit 60 may be constituted of a GPU (Graphics Processing Unit).
  • the illumination controller 52 controls the light sources 11 and 21 according to an instruction on an illumination method for the sample SPL.
  • the instruction is given from the integration controller 51 .
  • specifically, the illumination controller 52 sets parameters such as the intensity of the illumination light of the light sources 11 and 21, and selects the type of light source (a light source for a bright field, a light source for a dark field, and the like) according to the instruction from the integration controller 51.
  • Examples of the light source for a bright field include a light source that applies visible light.
  • examples of the light source for a dark field include a light source that applies light having a wavelength that can excite a fluorescent marker used in special staining.
  • the stage drive controller 53 drives the stage drive mechanism 41 to move the stage 40 in a stage surface direction (in x- and y-axis directions) in such a way that the entire preparation PRT falls within the imaging range of the imaging device 13 .
  • the stage drive controller 53 drives the stage drive mechanism 41 to move the stage 40 in the z-axis direction in such a way that the objective lens 12 is focused on the entire preparation PRT.
  • the stage drive controller 53 drives the stage drive mechanism 41 to move the stage 40 in the stage surface direction in such a way that the instructed small area of the sample SPL falls within the imaging range of the imaging device 24.
  • the stage drive controller 53 drives the stage drive mechanism 41 to move the stage 40 in the z-axis direction in such a way that the objective lens 23 is focused on the sample SPL.
  • the condenser lens drive controller 54 controls the condenser lens drive mechanism 25 based on information on an illumination field stop from the integration controller 51 , to adjust the illumination light from the light source 21 so as to be applied only to a small area of the sample SPL as an imaging target.
  • the information on the illumination field stop includes the amount of defocus and the orientation of defocus. Those pieces of information are obtained based on a distance between the set of phase difference images generated by the phase difference image acquiring unit 30 .
  • the phase difference image capturing controller 57 acquires signals of the set of phase difference images that are formed on the imaging surface of the imaging device 34 provided to the phase difference image acquiring unit 30 , and supplies the signals to the integration controller 51 .
  • the integration controller 51 calculates the amount of defocus and the orientation of defocus of the focal point of the objective lens 23 of the observation image capturing unit 20 with respect to the sample SPL, based on the distance between the set of phase difference images that are acquired from the phase difference image capturing controller 57 . Based on those pieces of information, the integration controller 51 generates control information for the stage 40 and supplies the control information to the stage drive controller 53 .
  • the stage drive controller 53 drives the stage drive mechanism 41 to move the stage 40 in the z-axis direction based on the control information from the integration controller 51 .
  • in such a manner, the phase difference autofocusing (AF) is performed.
  • based on a signal corresponding to a thumbnail image formed on the imaging surface of the imaging device 13 of the thumbnail image capturing unit 10, the thumbnail image capturing controller 56 generates data corresponding to the thumbnail image and supplies the data to the integration controller 51.
  • the integration controller 51 detects an area including the sample SPL as a sample area from the thumbnail image acquired by the thumbnail image capturing controller 56 . Additionally, the integration controller 51 partitions the sample area into a plurality of areas in a mesh pattern, the plurality of areas each having a size corresponding to the field of view of the observation image capturing unit 20 , and performs processing such as setting each of the areas as an area corresponding to one-time imaging by the observation image capturing unit 20 , and the like. Hereinafter, this area is referred to as a “small area”.
  • the observation image capturing controller 55 generates, based on a signal corresponding to an observation image of each small area that is formed on the imaging surface of the imaging device 24 of the observation image capturing unit 20 , raw data corresponding to the observation image of each small area and supplies the raw data to the integration controller 51 .
  • the integration controller 51 supplies the raw data of each small area, which has been acquired from the observation image capturing controller 55 , to the development unit 59 , so that the development unit 59 executes development processing.
  • the integration controller 51 connects the data of the observation images of the respective small areas, which have been developed by the development unit 59, to generate a large image for each sample SPL, and performs processing such as dividing the generated large image for each sample SPL into units of a predetermined resolution called tiles. Further, the integration controller 51 supplies each of the generated tiles to the image coding unit 60, causes the image coding unit 60 to generate image data in a predetermined compression coding format, and causes the storage unit 58 to store the image data.
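  • As a rough illustration of this tiling step, the following sketch (Python; the 256-pixel tile size is a hypothetical value common in whole-slide viewers, not one stated in the patent) divides a stitched large image into fixed-resolution tiles:

```python
import numpy as np

def tile_image(large: np.ndarray, tile: int = 256):
    """Yield ((x, y), block) pairs covering the stitched large image."""
    h, w = large.shape[:2]
    for y in range(0, h, tile):
        for x in range(0, w, tile):
            # Edge tiles may be smaller than tile x tile pixels.
            yield (x, y), large[y:y + tile, x:x + tile]

# Each yielded block would be handed to the image coding unit 60 for
# compression coding (e.g. JPEG) and stored via the storage unit 58.
```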
  • the storage unit 58 stores various types of setting information or programs for controlling the digital microscope apparatus 100 , tile groups in the predetermined compression coding format, and the like.
  • the development unit 59 develops the raw data of the observation image of each small area that has been captured by the observation image capturing unit 20 .
  • the image coding unit 60 codes the image data for each tile in the predetermined image compression format.
  • as the predetermined image compression format, for example, JPEG (Joint Photographic Experts Group) is used; however, compression coding formats other than JPEG may be adopted.
  • the tiles stored in the storage unit 58 are accumulated in an image management server 81 by the communication unit 61 through a network 80 .
  • the image management server 81 sends one or more appropriate tiles to a viewer terminal 82.
  • the viewer terminal 82 generates an observation image for display by using the one or more tiles acquired from the image management server 81 and displays the observation image on a display of the viewer terminal 82 .
  • the display controller 62 generates screen data to be displayed on a display 90 that is connected to the digital microscope apparatus 100 .
  • in the digital microscope apparatus 100, phase difference autofocusing and contrast autofocusing are implemented as autofocus functions for automatically focusing the objective lens 23 of the observation image capturing unit 20 on the sample SPL serving as an imaging target.
  • the integration controller 51 instructs the phase difference image capturing controller 57 to capture a phase difference image.
  • the phase difference image capturing controller 57 takes in signals of a set of phase difference images that are formed side by side on the imaging surface of the imaging device 34 from the phase difference image acquiring unit 30 , and obtains a phase difference between those phase difference images.
  • when the focal point of the objective lens 23 is farther than an appropriate surface, identical areas of the observed surfaces on the two phase difference images move so as to be separated from each other toward the outward direction of the imaging device 24.
  • conversely, when the focal point of the objective lens 23 is closer than the appropriate surface, the identical areas of the observed surfaces on the two phase difference images move so as to be close to each other toward the inward direction of the imaging device 24.
  • the integration controller 51 obtains a distance between the identical areas of the observed surfaces on the two phase difference images as a phase difference.
  • the integration controller 51 obtains, based on the obtained phase difference, the amount of defocus and the orientation of defocus of the focal point of the objective lens 23 with respect to the sample SPL.
  • the integration controller 51 generates control information for the stage 40 based on the obtained amount and orientation of defocus and supplies the control information to the stage drive controller 53 .
  • the stage drive controller 53 drives the stage drive mechanism 41 to move the stage 40 in the z-axis direction based on the control information from the integration controller 51 .
  • in such a manner, the phase difference autofocusing that focuses the objective lens 23 of the observation image capturing unit 20 on the sample SPL is performed.
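  • A minimal sketch of this defocus computation (Python; the function name, sign convention, and calibration constants are assumptions for illustration, since the patent gives no formulas) might look like:

```python
# All constants below are hypothetical calibration values; a real system
# calibrates them for its own optical configuration.
def defocus_from_phase_difference(separation_px: float,
                                  in_focus_separation_px: float,
                                  um_per_px: float) -> float:
    """Defocus in micrometers; the sign encodes the orientation of defocus
    (illustrative convention: positive = focal point beyond the sample)."""
    return (separation_px - in_focus_separation_px) * um_per_px

# The integration controller would then pass the corresponding z correction
# to the stage drive controller.
dz = defocus_from_phase_difference(103.5, 100.0, 1.8)
print(f"defocus: {dz:+.2f} um")  # defocus: +6.30 um
```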
  • the phase difference autofocusing allows a focal position to be obtained at higher speed than the contrast autofocusing, without the search for a focal position. On the other hand, there is a possibility that the accuracy is lowered due to the size of an object within the imaging surface or due to the number of tissues.
  • the contrast autofocusing is a method in which a focal position is searched for in a hill climbing method by use of the observation image capturing unit 20 .
  • the integration controller 51 displaces the focal position of the objective lens 23 by a predetermined distance and causes the observation image capturing unit 20 to perform imaging at respective focal positions.
  • the integration controller 51 determines a focal position when an image having a highest contrast in the captured images is captured, as an optimal focal position.
  • the contrast autofocusing provides higher accuracy in focal point than the phase difference autofocusing.
  • the contrast autofocusing involves repetitive movements and evaluations of the focal position of the objective lens. Consequently, it takes a relatively long time to obtain the focal position (focal position search time).
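  • The hill-climbing search can be sketched as follows (Python with OpenCV; the variance-of-Laplacian contrast metric and the `capture_at` callback are stand-ins for the apparatus interfaces, not elements described in the patent):

```python
import cv2
import numpy as np

def contrast_score(gray: np.ndarray) -> float:
    # Variance of the Laplacian: a common image-sharpness (contrast) proxy.
    return float(cv2.Laplacian(gray, cv2.CV_64F).var())

def contrast_af(capture_at, z_start: float, z_end: float, step: float) -> float:
    """Step the focal position, score each captured image, return the best z.
    capture_at(z) is assumed to move the focal position to z and return a
    grayscale image from the observation image capturing unit."""
    best_z, best_score = z_start, -1.0
    z = z_start
    while z <= z_end:
        score = contrast_score(capture_at(z))
        if score > best_score:
            best_z, best_score = z, score
        z += step
    return best_z
```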
  • a focal point is detected for each small area by the phase difference autofocusing or the contrast autofocusing, so that an observation image is captured.
  • however, images may be captured in an insufficient in-focus state depending on the areas.
  • in such a case, the observation image of a small area is blurred as a whole or partially.
  • conventionally, a blurred part of such an observation image has been discovered only when a user observes the large image generated by connecting the observation images of the respective small areas by stitching processing, because the blurred part is more prominent there than in the adjacent observation images.
  • the digital microscope apparatus 100 has the following functions.
  • the digital microscope apparatus 100 sets one or more evaluation areas for the observation image of each small area, which has been captured by the observation image capturing unit 20 , and performs an edge detection on each evaluation area. Subsequently, the digital microscope apparatus 100 calculates a difference in blur evaluation amount between two observation images by using results of the edge detection on two evaluation areas.
  • here, the two evaluation areas are the pair located closest to each other between two observation images adjacently located in the connected image, the connected image being obtained by connecting a plurality of observation images.
  • the digital microscope apparatus 100 determines a boundary part between those observation images, in which a difference in blur evaluation amount between the two observation images is out of a predetermined allowable range, as a blurred boundary part. Based on the determination results, the digital microscope apparatus 100 determines a small area to be reimaged by the observation image capturing unit 20 and a reimaging condition for such a small area.
  • FIG. 2 is a functional block diagram of the integration controller 51 .
  • the integration controller 51 includes a sample area detection unit 511 , a small area setting unit 512 , a stitching unit 513 , a blur evaluation unit 514 , and a reimaging controller 515 . Those functions are achieved when a CPU in the integration controller 51 executes a program loaded to a RAM.
  • the sample area detection unit 511 detects an area including the sample SPL as a sample area.
  • the sample area detection unit 511 determines a sample area in the thumbnail image based on a distribution of pixels whose luminance values drastically change, for example.
  • the pixels whose luminance values drastically change are detected by, for example, a method of detecting a boundary of samples by edge detection. It should be noted that a user can manually perform the determination of sample areas by observing the thumbnail image on a monitor.
  • the small area setting unit 512 partitions the sample area, which has been detected by the sample area detection unit 511 , into a plurality of areas in a mesh pattern in a size unit corresponding to the field of view of the observation image capturing unit 20 and sets each of the partitioned areas as an area (small area) corresponding to one-time imaging by the observation image capturing unit 20 .
  • the small area setting unit 512 gives information on the position of each small area to the stage drive controller 53 . Based on the given information on the position of each small area, the stage drive controller 53 drives the stage drive mechanism 41 to move the stage 40 such that the small areas sequentially fall within the imaging range of the observation image capturing unit 20 . Subsequently, the small area setting unit 512 instructs the observation image capturing controller 55 to capture images of the small areas.
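  • A minimal sketch of this mesh partitioning (Python; the function name and stage units are illustrative assumptions) might look like:

```python
from typing import List, Tuple

def partition_sample_area(bbox: Tuple[int, int, int, int],
                          fov_w: int, fov_h: int
                          ) -> List[Tuple[int, int, int, int]]:
    """Divide the sample-area bounding box (x, y, w, h) into a mesh of
    small areas, each matching the field of view of the observation
    image capturing unit."""
    x0, y0, w, h = bbox
    return [(x, y, fov_w, fov_h)
            for y in range(y0, y0 + h, fov_h)
            for x in range(x0, x0 + w, fov_w)]

# Example: a 10 mm x 6 mm sample area with a 2 mm x 2 mm field of view
# yields a 5 x 3 mesh, i.e. 15 small areas (units here are micrometers).
print(len(partition_sample_area((0, 0, 10_000, 6_000), 2_000, 2_000)))  # 15
```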
  • the stitching unit 513 connects observation images of the plurality of small areas, which have been captured by the observation image capturing controller 55 , and generates a large image (connected image) in units of the sample SPL.
  • the blur evaluation unit 514 sets one or more evaluation areas for the observation image of each small area and performs an edge detection on each evaluation area, to calculate a difference in blur evaluation amount between two observation images by using results of the edge detection on two evaluation areas.
  • here, the two evaluation areas are the pair located closest to each other between two observation images adjacently located in the connected image.
  • the blur evaluation unit 514 determines a boundary part between those two observation images, in which a difference in blur evaluation amount between the two observation images is out of a predetermined allowable range, as a blurred boundary part.
  • the blur evaluation unit 514 determines, based on the determination results of the blurred boundary part, a small area to be reimaged by the observation image capturing unit 20 and a reimaging condition for such a small area.
  • the above-mentioned series of processing by the blur evaluation unit 514 is referred to as “blur evaluation”.
  • the reimaging controller 515 receives the small area to be reimaged and the reimaging condition, which have been determined by the blur evaluation unit 514 , and controls the stage drive controller 53 , the observation image capturing controller 55 , and the phase difference image capturing controller 57 to execute reimaging of the small area to be reimaged.
  • FIG. 3 is a flowchart showing a flow of such an imaging operation.
  • a thumbnail image captured by the thumbnail image capturing unit 10 is supplied to the integration controller 51 .
  • the sample area detection unit 511 detects a sample area from the thumbnail image (Step S101).
  • the small area setting unit 512 of the integration controller 51 sets a plurality of small areas for the sample area detected by the sample area detection unit 511 , and instructs the observation image capturing controller 55 to capture an image while giving information on the positions of the plurality of small areas to the stage drive controller 53 .
  • the stage 40 is moved such that each small area falls within an imaging range of the observation image capturing unit 20, autofocusing is performed, and the small area is imaged by the observation image capturing unit 20 (Step S102).
  • the autofocusing method executed when the observation image is captured may be either the phase difference autofocusing or the contrast autofocusing.
  • the autofocusing method for the first imaging may be appropriately selected by a user, for example.
  • the stitching unit 513 of the integration controller 51 connects the observation images of the respective small areas in the x- and y-axis directions and generates a large image (connected image) in units of the sample SPL (Step S103).
  • the blur evaluation by the blur evaluation unit 514 will be performed as follows.
  • the blur evaluation unit 514 converts the large image (connected image) in units of the sample SPL into a grayscale image having a predetermined bit number for the purpose of speed-up of processing (Step S104). At that time, for the purpose of further speed-up of processing, the large image (connected image) may be reduced in size before the conversion into a grayscale image, or an observation image converted into a grayscale image may be reduced in size. Alternatively, only the evaluation area may be cut out and converted into a grayscale image.
  • next, the blur evaluation unit 514 performs an edge detection by filtering on the size-reduced observation image or the observation image converted into a grayscale image (Step S105).
  • for the edge detection, for example, the Canny edge detection method or the like may be used; however, the present disclosure is not limited to this.
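  • Steps S104 and S105 could be sketched as follows (Python with OpenCV; the scale factor and Canny thresholds are illustrative values, not taken from the patent):

```python
import cv2

def detect_edges(connected_bgr, scale: float = 0.25):
    """Downscale for speed, convert to grayscale (Step S104), then run
    Canny edge detection (Step S105). Returns a binary edge map."""
    small = cv2.resize(connected_bgr, None, fx=scale, fy=scale,
                       interpolation=cv2.INTER_AREA)
    gray = cv2.cvtColor(small, cv2.COLOR_BGR2GRAY)
    return cv2.Canny(gray, threshold1=50, threshold2=150)
```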
  • the blur evaluation unit 514 sets a plurality of evaluation areas for the size-reduced observation image or the observation image converted into a grayscale image (Step S106).
  • One or more evaluation areas are set for the observation image of each small area. It is desirable that the plurality of evaluation areas be set in the vicinity of the outer circumferential area of the observation image of each small area.
  • FIG. 4 is a diagram partially showing an observation image converted into a grayscale image.
  • Parts A, B, C, D, and E are observation images of small areas. Practically, the observation images of small areas other than the parts A to E are also connected in the x-axis direction and the y-axis direction, but the illustration thereof is omitted in FIG. 4.
  • FIG. 5 is a diagram showing a specific setting example of the evaluation areas in the observation image shown in FIG. 4.
  • Evaluation areas A1 to A4 are set in the observation image of a small area A.
  • Evaluation areas B1 to B4 are set in the observation image of a small area B.
  • Evaluation areas C1 to C4 are set in the observation image of a small area C.
  • Evaluation areas D1 to D4 are set in the observation image of a small area D.
  • Evaluation areas E1 to E4 are set in the observation image of a small area E.
  • each evaluation area is set at the middle part of each of the four sides in the circumferential area of the observation image of each of the small areas A, B, C, D, and E.
  • next, the blur evaluation unit 514 calculates a difference in blur evaluation amount between the observation images of two adjacent small areas by using the edge detection results described above. Specifically, the blur evaluation unit 514 first calculates an edge density d from the edge detection results (Step S107).
  • the edge density d is, for example, the ratio of the part of the evaluation area whose edge intensity obtained from the edge detection exceeds a threshold value, or an average value of the edge intensity in the evaluation area.
  • in a blurred image, the edge density d becomes smaller because components with a high spatial frequency are attenuated.
  • in an in-focus image, the edge density d becomes higher because many components with a high spatial frequency are contained.
  • subsequently, the blur evaluation unit 514 calculates the absolute value of the difference Δd in edge density d between two evaluation areas.
  • here, the two evaluation areas are the pair located closest to each other between the adjacent observation images of the small areas in the connected image.
  • the blur evaluation unit 514 sets this absolute value as the difference in blur evaluation amount between those two observation images (Step S108).
  • in the example shown in FIG. 5, the pairs of closest evaluation areas between the adjacent observation images of the small areas in the connected image are A4 and B1, B2 and C3, B4 and D1, and B3 and E2.
  • the absolute values of the differences in edge density are thus obtained for the pair A4 and B1, the pair B2 and C3, the pair B4 and D1, and the pair B3 and E2, respectively.
  • the blur evaluation unit 514 determines a boundary part between two observation images, for which the absolute value of the difference Δd in edge density d is larger than a predetermined threshold value ε, as a “blurred boundary part” whose difference in blur evaluation amount is out of the predetermined allowable range (Step S109). Specifically, if imaging is performed at an appropriate focal position for each of two adjacent small areas, the two closest evaluation areas between the adjacent observation images of those small areas should have substantially the same edge density d. When the difference Δd in edge density between the two evaluation areas is large, this means that the boundary part of the observation image of at least one of the small areas is blurred.
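  • Steps S107 to S109 could be sketched as follows (Python; the edge-density definition uses the ratio variant described above, and the threshold value is an illustrative number):

```python
import numpy as np

def edge_density(edges: np.ndarray, roi) -> float:
    """Edge density d of one evaluation area: the fraction of edge pixels
    within the ROI (x, y, w, h) of a binary edge map (Step S107)."""
    x, y, w, h = roi
    patch = edges[y:y + h, x:x + w]
    return float(np.count_nonzero(patch)) / patch.size

def is_blurred_boundary(d_a: float, d_b: float, eps: float = 0.05) -> bool:
    """Steps S108-S109: flag the boundary when |d_a - d_b| exceeds the
    threshold epsilon (0.05 is an illustrative value)."""
    return abs(d_a - d_b) > eps

# e.g. across the A|B boundary in FIG. 5:
#   blurred_AB = is_blurred_boundary(edge_density(edges, roi_A4),
#                                    edge_density(edges, roi_B1))
```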
  • in addition, the blur evaluation unit 514 determines whether a sample exists in the connected image or not and excludes evaluation areas where no sample exists from the blur evaluation. For example, the blur evaluation unit 514 can determine a position at which a sample exists by detecting an area equal to the background, where no samples exist in the connected image, or by detecting an area containing color information derived from the staining of samples.
  • it should be noted that an image used for this determination may be a thumbnail image captured by the thumbnail image capturing unit 10 or an observation image captured by the imaging device 13.
  • the boundary part between the small area A and the small area B, the boundary part between the small area C and the small area B, the boundary part between the small area D and the small area B, and the boundary part between the small area E and the small area B are determined as a blurred boundary part 3AB, a blurred boundary part 3BC, a blurred boundary part 3BD, and a blurred boundary part 3BE, respectively.
  • next, based on the determination results of the blurred boundary parts, the blur evaluation unit 514 determines a small area to be reimaged by the observation image capturing unit 20 and a reimaging condition for the small area to be reimaged. More specifically, they are determined in the following manner.
  • when no blurred boundary part is determined (Step S110), the imaging of this preparation PRT is assumed to have been performed successfully, and thus the imaging processing for this preparation PRT is terminated. Subsequently, the imaging of the next preparation PRT is executed.
  • the determination method for a small area to be reimaged includes the following methods.
  • First determination method: to determine at least one small area surrounded by the blurred boundary parts as a small area to be reimaged.
  • Second determination method: to determine, when the blurred boundary parts exist in n small areas successively located in a predetermined one axial direction (x-axis or y-axis direction), the n small areas as small areas to be reimaged, where n is an integer of 3 or more.
  • according to the first determination method, when a small area is surrounded by the blurred boundary parts, the blur evaluation unit 514 determines that small area as a small area to be reimaged. For example, as shown in FIG. 5, the observation image of the small area B is surrounded by the blurred boundary parts 3AB, 3BC, 3BD, and 3BE in all directions. In such a case, the observation image of the small area B is an overall blurred image in many cases.
  • the blur evaluation unit 514 determines the small area B, which is surrounded in all directions by the blurred boundary parts, as a small area to be reimaged, and instructs the reimaging controller 515 to perform reimaging of the small area B under a first condition.
  • likewise, in the case where a plurality of small areas surrounded by the blurred boundary parts are connected to one another, the blur evaluation unit 514 determines those small areas as small areas to be reimaged.
  • FIG. 16 is a diagram showing an example of a plurality of small areas that are surrounded by the blurred boundary parts and connected to one another.
  • the small areas shown in FIG. 16 are denoted by shot numbers of 1 to 25.
  • in this example, the six small areas with the shot numbers of 7, 8, 12, 13, 14, and 19 are determined as a plurality of small areas that are surrounded by the blurred boundary parts and connected to one another.
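  • A hypothetical sketch of the first determination method (Python; the grid representation of the mesh is an assumption made for illustration) flags small areas whose every boundary with an existing neighbor is blurred:

```python
from typing import FrozenSet, Set, Tuple

Cell = Tuple[int, int]  # (row, col) of a small area in the mesh

def surrounded_cells(cells: Set[Cell],
                     blurred: Set[FrozenSet[Cell]]) -> Set[Cell]:
    """Return small areas whose every boundary with an existing neighbor
    is a blurred boundary part."""
    out: Set[Cell] = set()
    for (r, c) in cells:
        nbrs = [n for n in ((r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1))
                if n in cells]
        if nbrs and all(frozenset({(r, c), n}) in blurred for n in nbrs):
            out.add((r, c))
    return out
```

  • Grouping mutually adjacent flagged cells into one set, as with the shot numbers 7, 8, 12, 13, 14, and 19 in FIG. 16, would then be a standard connected-component pass over the returned cells.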
  • the reimaging controller 515 controls the stage drive controller 53, the observation image capturing controller 55, and the phase difference image capturing controller 57 to change the autofocusing method and reimage the small area B according to the first condition (Step S112). For example, in the case where focusing is performed by the phase difference autofocusing at the first imaging, the reimaging controller 515 performs control so as to search for a focal position by the contrast autofocusing at the reimaging. Conversely, in the case where the focal position is searched for by the contrast autofocusing at the first imaging, the reimaging controller 515 performs control so as to perform focusing by the phase difference autofocusing at the reimaging.
  • for example, the imaging is performed by the phase difference autofocusing in order to give priority to efficiency at the first imaging, and reimaging is performed by switching to the contrast autofocusing as necessary. This allows an improvement in imaging efficiency for a large number of preparations PRT to be expected.
  • according to the second determination method, when the blurred boundary parts exist in n small areas successively located in a predetermined one axial direction (where n is an integer of 3 or more), the blur evaluation unit 514 determines the n small areas as small areas to be reimaged.
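  • The second determination method can be sketched as a scan for runs of consecutive blurred boundaries along one axis (Python; the data layout is an assumption made for illustration):

```python
from typing import List

def runs_to_reimage(blurred_between: List[bool], n: int = 3) -> List[range]:
    """blurred_between[i] is True when the boundary between small areas i
    and i+1 in one row (or column) is blurred. Returns index ranges of
    small areas to reimage: runs spanning at least n areas, i.e. at least
    n-1 consecutive blurred boundaries."""
    runs, start = [], None
    for i, b in enumerate(blurred_between + [False]):  # sentinel terminator
        if b and start is None:
            start = i
        elif not b and start is not None:
            if (i - start) + 1 >= n:  # number of areas spanned by the run
                runs.append(range(start, i + 1))
            start = None
    return runs

# Two consecutive blurred boundaries in a row of four areas flag areas 0..2:
print(runs_to_reimage([True, True, False], n=3))  # [range(0, 3)]
```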
  • FIG. 6 is a diagram showing an example of the observation image, for which small areas to be reimaged are determined by the second determination method.
  • in FIG. 6, evaluation areas are set as described above.
  • in the determination results shown in FIG. 7, the absolute values of the differences in edge density at the boundary parts between the small areas B, C, and E successively located in the x-axis direction exceed the threshold value ε, while those at the other boundary parts do not.
  • accordingly, the small areas B, C, and E are determined as small areas to be reimaged.
  • the blur evaluation unit 514 instructs the reimaging controller 515 to perform reimaging of the small areas B, C, and E under a second condition.
  • Factors causing periodic blurs in a predetermined one axial direction (x-axis or y-axis direction) in such a manner include the posture of the sample SPL, which is arranged at a large inclination angle with respect to a glass slide surface, and the like.
  • further, the end of the sample may be partially inclined at a large angle. Such a case also easily causes periodic blurs in a predetermined one axial direction (x-axis or y-axis direction).
  • FIG. 8 is a diagram showing a relationship among an x-z-axis cross section of a part (including the small areas B, C, and E) of the sample SPL arranged at a large inclination angle with respect to the glass slide surface, shown in the lower part of the diagram, focal positions z_focus1, z_focus2, and z_focus3, imaging ranges for the respective focal positions, allowable blur evaluation amounts δz, in-focus areas, and blurred areas.
  • in FIG. 8, the vertical axis indicates the optical axis direction z, and the horizontal axis indicates the x-axis direction.
  • the position z_focus1 indicates a focal position when the small area C is imaged, the position z_focus2 indicates a focal position when the small area B is imaged, and the position z_focus3 indicates a focal position when the small area E is imaged.
  • the blur evaluation amount is proportional to the difference between the focal position and the position of the sample SPL. Further, a part whose blur evaluation amount exceeds the range of a predetermined allowable blur evaluation amount δz due to the inclination of the sample SPL is defined as a “blurred area”. A part whose blur evaluation amount falls within the range of the allowable blur evaluation amount δz is defined as an “in-focus area”.
  • in this case, the blurred areas and the in-focus areas appear side by side, with boundaries interposed between the small areas, over the small areas successively located in the one axial direction (in this case, the x-axis direction).
  • as a result, the blurred boundary parts exist between the n small areas arranged in the one axial direction (in this case, the x-axis direction).
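  • As a rough worked example (with illustrative numbers not taken from the patent): if a small area spans w = 1 mm in the x-axis direction and the sample surface is inclined at about 0.1 degrees to the glass slide, the height variation across one small area is w·tan(0.1°) ≈ 1.7 μm. If the allowable blur evaluation amount is δz = ±1 μm around the focal position, the outer portions of the small area fall outside the in-focus range, producing the alternating in-focus and blurred areas of FIG. 8; halving the imaging range (m = 2) halves the per-exposure height variation to about 0.9 μm, which fits within δz.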
  • the blur evaluation unit 514 instructs the reimaging controller 515 to perform reimaging of the small areas B, C, and E under the second condition.
  • the reimaging controller 515 controls the stage drive controller 53, the observation image capturing controller 55, and the phase difference image capturing controller 57 to change the imaging range in the one axial direction of the observation image capturing unit 20 into 1/m of the imaging range at the first imaging and to divide one small area into m areas for reimaging (Step S114).
  • the autofocusing method at the reimaging is the same as that of the first imaging.
  • the value m is an integer of 2 or more.
  • FIG. 9 is a diagram showing a specific example of the reimaging.
  • FIG. 9 shows an example in which the imaging range in the x-axis direction is changed into 1/2 of the imaging range at the first imaging, and each of the small areas B, C, and E to be reimaged is reimaged at 1/2 size in the x-axis direction by two times of imaging.
  • in other words, this example shows a case where each small area is divided into two areas arranged in the x-axis direction and reimaging is performed at a focal position obtained by autofocusing for each of those areas.
  • as a result, the maximum blur evaluation amount falls within the range of the allowable blur evaluation amount δz in each of the small areas B, C, and E, and a good observation image without a blurred area is obtained.
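  • The reimaging condition under the second method can be sketched as follows (Python; coordinates are hypothetical stage units, with m = 2 as in FIG. 9):

```python
from typing import List, Tuple

def split_for_reimaging(area: Tuple[int, int, int, int],
                        m: int = 2) -> List[Tuple[int, int, int, int]]:
    """Divide a small area (x, y, w, h) into m strips of width w/m along
    the x axis; each strip is refocused and imaged separately."""
    x, y, w, h = area
    sub_w = w // m
    return [(x + i * sub_w, y, sub_w, h) for i in range(m)]

# With m = 2 (as in FIG. 9) the per-exposure height variation of the tilted
# sample is halved, so it can fit within the allowable blur amount.
print(split_for_reimaging((0, 0, 2_000, 2_000), m=2))
# [(0, 0, 1000, 2000), (1000, 0, 1000, 2000)]
```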
  • Examples of the method of changing the imaging range of the observation image capturing unit 20 into 1/m of the imaging range include the following methods.
  • one example is a method of changing the feed amount of the stage 40 at the imaging in the case where a line sensor is used as the imaging device 24 of the observation image capturing unit 20.
  • after the reimaging, the blur evaluation unit 514 performs the blur evaluation again as in Steps S107 to S109 described above (Step S115).
  • when no blurred boundary part is determined in this blur evaluation, the imaging of this preparation PRT is assumed to have been performed successfully, and thus the imaging processing for this preparation PRT is terminated. Subsequently, the imaging of the next preparation PRT is executed.
  • when a blurred boundary part is determined again, the reimaging controller 515 considers that the search for a focal position has failed even with the changed autofocusing method and performs control to clearly specify the positions of the blurred boundary parts on the screen on which a thumbnail image of the sample is displayed (Step S117).
  • FIG. 10 is a diagram showing a display example in which the position of a blurred boundary part is clearly specified in a thumbnail image.
  • the display 90 may be a display apparatus directly connected to the controller 50 of the digital microscope apparatus 100 .
  • in a thumbnail image 91, the position of the blurred boundary part is clearly specified by a synthetic image 92 such as a frame line.
  • the method of clearly specifying the position of the blurred boundary part is not limited to use of the synthetic image 92 .
  • the luminance, color, and the like of the observation image at the position of the blurred boundary part may be changed, so that the user can recognize the position of the blurred boundary part.
  • an operating element with which a user can change the setting of the threshold value ε may be provided.
  • changing the setting of the threshold value ε allows the blurred boundary part to be displayed in a state where the user easily recognizes it. This improves efficiency when the user checks the position of the blurred boundary part, for example, in the case where the allowable blur amount fluctuates depending on the sample.
  • the adjustment of the threshold value ε can also be expected to improve the detection accuracy of the blurred boundary part in accordance with the sample.
  • while viewing the display of the position of the blurred boundary part, the user visually checks the blurred boundary part on the preparation PRT or checks it by using an enlarged image. If dirt and the like are mixed in the preparation PRT, for example, the focal position is manually adjusted to perform reimaging.
  • 1. Operation to validate the determination result by the first determination method. 2. Operation to select whether to validate the determination result by the first determination method or to validate the determination result by the second determination method according to the setting value of n. For example, when n is 3, the determination result by the first determination method is set to be valid, and when n is 4 or more, the determination result by the second determination method is set to be valid.
  • the imaging range in the one axial direction is changed into 1/m of the imaging range at the first imaging, and one small area is reimaged by m times of imaging.
  • a control operation of inclining the stage 40 or the optical system of the observation image capturing unit 20 so as to cancel the inclination of the sample SPL may be performed.
  • a control operation of inclining the stage 40 or the optical system of the observation image capturing unit 20 at an appropriate position so as to locally cancel the inclination may be performed.
  • FIG. 11 is an explanatory diagram of reimaging by the modified example 1.
  • the stage 40 or the optical system of the observation image capturing unit 20 is inclined so as to cancel the inclination of the sample SPL. This may allow the sample SPL to fall within the range of the allowable blur evaluation amount Δz over the entire imaging range and allow a good observation image without a blurred area to be obtained.
  • the stage 40 or the optical system of the observation image capturing unit 20 is first inclined so that the right-hand side in FIG. 11 is raised by a predetermined angle, and then the blur evaluation is performed again.
  • the stage 40 or the optical system of the observation image capturing unit 20 only needs to be inclined inversely, so that the left-hand side in FIG. 11 is raised by a predetermined angle, to perform the imaging and the blur evaluation again. Additionally, the inclination angle may be increased stepwise to repeat the reimaging and the reevaluation.
  • the evaluation area in the observation image is set at the middle part of each of the four sides of the small area as shown in FIG. 5 .
  • evaluation areas A1 to A4, B1 to B4, C1 to C4, D1 to D4, and E1 to E4 may be set at the four corners of the small areas A, B, C, D, and E, respectively.
  • the two closest evaluation areas among adjacent small areas in the connected image are two pairs, i.e., A3 and B1, and A4 and B2, in the case of the small area A and the small area B.
  • the blur evaluation for the other small areas is similarly performed.
  • the two closest evaluation areas among adjacent small areas in the connected image are two pairs, i.e., B1 and C2, and B3 and C4.
  • the blur evaluation is performed by the same calculation method as that described above.
  • three or more evaluation areas A1 to A8, B1 to B8, C1 to C8, D1 to D8, and E1 to E8 may be set for respective sides of the observation images of the small areas A, B, C, D, and E, respectively.
  • the calculation method for the blur evaluation in this case is also the same calculation method as that described above.
  • the evaluation areas are not necessarily set near the outer circumferential area of the small area.
  • the small areas A, B, C, D, and E are each provided with a total of four intersecting straight lines, constituted of straight lines that divide the small area into two in the y-axis direction and straight lines that divide it into two in the x-axis direction.
  • Evaluation areas A1 to A4, B1 to B4, C1 to C4, D1 to D4, and E1 to E4 may be assigned to those four lines of the respective areas.
  • in the case where one of the two closest evaluation areas is a sample area and the other contains an area where no sample exists (non-sample area), the difference Δd in edge density between the non-sample area and the sample area tends to increase.
  • as a result, the boundary part between those small areas may be erroneously determined as a blurred boundary part.
  • the sample area detection unit 511 detects a sample area and this sample area is set as an imaging target for an observation image.
  • since the imaging is performed in units of the imaging range of the observation image capturing unit 20, a small area containing a non-sample part is actually imaged in some cases.
  • the blur evaluation unit 514 acquires positional information of a sample area from the sample area detection unit 511 .
  • the positional information of the sample area naturally has a much higher resolution than the imaging range of the observation image capturing unit 20 .
  • the blur evaluation unit 514 selects a position in which a non-sample area is not contained in the evaluation area based on the positional information of the sample area, and sets an evaluation area. Alternatively, the size or form of the evaluation area may be changed.
  • the non-sample area can be prevented from being contained in the evaluation area, and this allows an increase in accuracy of the blur evaluation.
  • the observation images of the plurality of small areas are connected to generate a connected image, and then the processing of blur evaluation by the blur evaluation unit 514 is performed.
  • the present disclosure is not limited thereto.
  • a gray scale conversion (Step S203), an edge detection (Step S204), the setting of an evaluation area (Step S205), and the calculation of an edge density d (Step S206) may be executed in units of the observation image of the small area, and then the observation images of a plurality of small areas may be connected (Step S207). Subsequently, the calculation of a difference Δd in edge density (Step S208) and the determination of a blurred boundary part (Step S209) are similarly performed.
  • the autofocusing method is changed, the imaging range is changed to 1/m, and the observation image of one small area reimaged by m imaging operations is reevaluated; when a blurred boundary part still remains as a result of this evaluation, the position of the blurred boundary part is clearly specified on the screen on which a thumbnail image of the sample is displayed.
  • the present disclosure is not limited thereto. In the case where a blurred boundary part is detected by the blur evaluation performed on the observation image obtained at the first imaging, the position of the blurred boundary part may be clearly specified on the screen on which a thumbnail image of the sample is displayed.
  • a digital microscope apparatus including:

Abstract

A digital microscope apparatus includes: an observation image capturing unit configured to capture an observation image of each of a plurality of small areas, an area containing a sample on a glass slide being partitioned by the plurality of small areas; and a controller configured to set at least one evaluation area for the observation image of each of the plurality of small areas, the observation image being captured by the observation image capturing unit, to perform an edge detection on the at least one evaluation area, and to calculate, using results of the edge detection on two evaluation areas that are closest between two of the observation images adjacently located in a connected image, a difference in blur evaluation amount between the two observation images, the connected image being obtained by connecting the observation images according to the partition.

Description

    CROSS REFERENCE TO RELATED APPLICATIONS
  • This application is a continuation application of U.S. patent application Ser. No. 14/199,682, filed Mar. 6, 2014, which claims the benefit of Japanese Priority Patent Application JP 2013-050520 filed Mar. 13, 2013, the entire contents of which are incorporated herein by reference.
  • BACKGROUND
  • The present disclosure relates to a digital microscope apparatus that captures an enlarged image of a sample such as a biological sample as an observation image, and to an imaging method for the digital microscope apparatus and a program.
  • In the past, in order to observe the whole of a sample with use of a digital microscope apparatus, images of small areas that partition an area containing the sample on a glass slide are captured by an enlarging imaging system and such images of the respective small areas are connected to one another to generate one large image.
  • For a focusing system in which an objective lens of the enlarging imaging system is focused on a pathological sample serving as an imaging target, autofocusing (AF) is adopted. For example, the following autofocusing is disclosed: a focal position of the objective lens in the enlarging imaging system is moved in an optical axis direction at predetermined intervals, images are captured at the respective moved positions, and a position at which an image having the highest contrast in the captured images is captured is detected as an in-focus position (see, for example, Japanese Patent Application Laid-open No. 2011-197283). This type of focusing system is called “contrast AF”.
  • Although providing relatively high accuracy in focal point, the contrast AF performs repetitive movements and evaluations of the focal position of the objective lens in order to search for an optimal focal position. Consequently, it takes a relatively long time to obtain the focal position.
  • In this regard, a microscope apparatus that adopts a “phase difference AF” in which light taken in through an objective lens is split into two light beams by a splitter lens and the position and direction of a focal point are determined based on a distance between the two formed images is also disclosed (see, for example, Japanese Patent Application Laid-open No. 2011-090222). The phase difference AF can provide a focal position at higher speed than the contrast AF, without the search for a focal position. On the other hand, there is a possibility that the accuracy is lowered depending on the size of an object within the imaging surface or on the number of tissues.
  • SUMMARY
  • In the digital microscope apparatus, there has been a demand to acquire images of many samples with as high quality and at as high speed as possible, but this demand has not been met sufficiently.
  • In view of the circumstances as described above, it is desirable to provide a digital microscope apparatus, an imaging method therefor, and a program that are capable of acquiring images of many samples with as high quality and at as high speed as possible.
  • According to an embodiment of the present disclosure, there is provided a digital microscope apparatus including: an observation image capturing unit configured to capture an observation image of each of a plurality of small areas, an area containing a sample on a glass slide being partitioned by the plurality of small areas; and a controller configured to set at least one evaluation area for the observation image of each of the plurality of small areas, the observation image being captured by the observation image capturing unit, to perform an edge detection on the at least one evaluation area, and to calculate, using results of the edge detection on two evaluation areas that are closest between two of the observation images adjacently located in a connected image, a difference in blur evaluation amount between the two observation images, the connected image being obtained by connecting the observation images according to the partition.
  • The controller may be configured to determine, as a blurred boundary part, a boundary part located between the observation images and having a difference in blur evaluation amount that is out of a predetermined allowable range.
  • The controller may be configured to determine, based on a result of the determination of the blurred boundary part, a small area to be reimaged by the observation image capturing unit and a condition of the reimaging for the small area to be reimaged.
  • More specifically, the controller may be configured to determine the small area surrounded by the blurred boundary parts as the small area to be reimaged, and to determine to switch an autofocusing method, for the condition of the reimaging. This allows an observation image to be obtained by a more appropriate autofocusing method in accordance with the state of the sample.
  • The controller may be configured to determine, when determining that the blurred boundary parts exist in n small areas successively located in a predetermined one axial direction (where n is an integer of 3 or more), the n small areas as small areas to be reimaged, to change an imaging range in the one axial direction of the observation image capturing unit into 1/m of the imaging range at a first imaging (where m is an integer of 2 or more), for the condition of the reimaging, and to determine to divide each of the small areas into m areas for reimaging.
  • This also allows a good observation image of the sample to be obtained from, for example, a preparation in which the sample is sealed in at a large inclination angle with respect to a glass slide surface.
  • Further, the digital microscope apparatus according to the embodiment of the present disclosure may further include a thumbnail image capturing unit configured to capture an entire image of the sample on the glass slide, in which the controller may be configured to generate, when the blurred boundary part is determined again in the connected image containing the observation image of the small area that is reimaged by the observation image capturing unit, an image that clearly specifies a position of the blurred boundary part in a thumbnail image captured by the thumbnail image capturing unit and to display the image on a display.
  • This allows a user to, for example, visually recognize the blurred boundary part on the preparation while checking the position of the blurred boundary part in the connected image, and if dirt and the like are mixed in the preparation, to manually adjust a focal position to perform reimaging, for example.
  • According to another embodiment of the present disclosure, there is provided an imaging method for a digital microscope apparatus, the imaging method including: setting at least one evaluation area for an observation image of each of a plurality of small areas, the observation image being captured by an observation image capturing unit configured to capture an observation image of each of the plurality of small areas, an area containing a sample on a glass slide being partitioned by the plurality of small areas; performing an edge detection on the at least one evaluation area; and calculating, using results of the edge detection on two evaluation areas that are closest between two of the observation images adjacently located in a connected image, a difference in blur evaluation amount between the two observation images, the connected image being obtained by connecting the plurality of observation images according to the partition.
  • According to another embodiment of the present disclosure, there is provided a program causing a computer to operate as a controller configured to set at least one evaluation area for an observation image of each of a plurality of small areas, the observation image being captured by an observation image capturing unit configured to capture an observation image of each of the plurality of small areas, an area containing a sample on a glass slide being partitioned by the plurality of small areas; to perform an edge detection on the at least one evaluation area; and to calculate, using results of the edge detection on two evaluation areas that are closest between two of the observation images adjacently located in a connected image, a difference in blur evaluation amount between the two observation images, the connected image being obtained by connecting the plurality of observation images according to the partition.
  • As described above, according to the present disclosure, it is possible to acquire images of many samples with as high quality and at as high speed as possible.
  • These and other objects, features and advantages of the present disclosure will become more apparent in light of the following detailed description of best mode embodiments thereof, as illustrated in the accompanying drawings.
  • BRIEF DESCRIPTION OF DRAWINGS
  • FIG. 1 is a diagram showing a whole configuration of a digital microscope apparatus according to an embodiment;
  • FIG. 2 is a functional block diagram of an integration controller in the digital microscope apparatus shown in FIG. 1;
  • FIG. 3 is a flowchart showing a flow of an imaging operation by the digital microscope apparatus shown in FIG. 1;
  • FIG. 4 is a diagram partially showing an observation image converted into a grayscale image;
  • FIG. 5 is a diagram showing a specific setting example of evaluation areas in the observation image shown in FIG. 4 and determination results of blurred boundary parts;
  • FIG. 6 is a diagram showing an example of an observation image, for which small areas to be reimaged are determined by a second determination method;
  • FIG. 7 is a diagram showing the determination results of the blurred boundary parts in the observation image shown in FIG. 6;
  • FIG. 8 is a diagram showing a relationship among an x-z-axis cross section of a part of a sample arranged at a large inclination angle with respect to a glass slide surface, a focal position, an imaging range for each focal position, an allowable blur evaluation amount, an in-focus area, and a blurred area;
  • FIG. 9 is an explanatory diagram of reimaging;
  • FIG. 10 is a diagram showing a display example in which the position of a blurred boundary part is clearly specified in a thumbnail image;
  • FIG. 11 is an explanatory diagram of reimaging by a modified example 1;
  • FIG. 12 is a diagram showing another setting method for the evaluation area;
  • FIG. 13 is a diagram showing another setting method for the evaluation area;
  • FIG. 14 is a diagram showing another setting method for the evaluation area;
  • FIG. 15 is a flowchart of an imaging operation for describing a modified example 4; and
  • FIG. 16 is a diagram showing an example of a plurality of small areas that are surrounded by the blurred boundary parts and connected to one another.
  • DETAILED DESCRIPTION OF EMBODIMENTS
  • Hereinafter, embodiments of the present disclosure will be described with reference to the drawings.
  • First Embodiment
  • FIG. 1 is a diagram showing a whole configuration of a digital microscope apparatus 100 according to this embodiment.
  • Whole Configuration
  • The digital microscope apparatus 100 includes a thumbnail image capturing unit 10, an observation image capturing unit 20, a phase difference image acquiring unit 30, a stage 40, and a controller 50.
  • The thumbnail image capturing unit 10 captures an entire image of a preparation PRT on which a sample SPL is provided (this image being hereinafter referred to as a “thumbnail image”).
  • The observation image capturing unit 20 captures an image obtained by magnifying the sample SPL provided on the preparation PRT at a predetermined magnification (the image being hereinafter referred to as an “observation image”).
  • The phase difference image acquiring unit 30 captures a phase difference image containing information on the amount and orientation of a displacement in an optical axis direction between a focal point of an objective lens 23 of the observation image capturing unit 20 and the sample SPL on the preparation PRT.
  • The stage 40 moves the preparation PRT placed thereon to a position for imaging by the thumbnail image capturing unit 10 and a position for imaging by the observation image capturing unit 20. The stage 40 is configured to be movable by a stage drive mechanism 41 in a direction of an optical axis (z-axis direction) of the objective lens 23 of the observation image capturing unit 20 and also in a direction (x-axis direction and y-axis direction) orthogonal to the direction of the optical axis. Additionally, the stage 40 is desirably movable also in a direction inclining with respect to a plane orthogonal to the direction of the optical axis. Furthermore, for the movement in the optical axis direction, a configuration in which the objective lens 23 is moved vertically by using, for example, a piezo-stage may be provided.
  • It should be noted that the preparation PRT is obtained by fixing the sample SPL to a glass slide by a predetermined fixing method. The sample SPL is a biological sample that includes tissue slices of connective tissues such as blood, epithelial tissues, and tissues including both of the above, or the like, or includes smear cells. The tissue slices or smear cells are subjected to various types of staining as necessary. Examples of staining include not only general staining represented by HE (hematoxylin-eosin) staining, Giemsa staining, Papanicolaou staining, Ziehl-Neelsen staining, and Gram staining, but also fluorescent staining such as FISH (Fluorescence In-Situ Hybridization) and an enzyme antibody technique.
  • The digital microscope apparatus 100 is additionally equipped with a preparation stock loader 70 that stores the preparations PRT each containing the sample SPL and loads the stored preparations PRT one by one onto the stage 40. It should be noted that the preparation stock loader 70 may be integrated into the digital microscope apparatus 100.
  • Next, the details of the thumbnail image capturing unit 10, observation image capturing unit 20, phase difference image acquiring unit 30, and controller 50 described above will be described.
  • Thumbnail Image Capturing Unit 10
  • The thumbnail image capturing unit 10 includes a light source 11, an objective lens 12, and an imaging device 13 as shown in FIG. 1.
  • The light source 11 is provided on a surface of the stage 40, which is on the opposite side to the surface on which the preparation is arranged. The light source 11 can switch between light (bright field illumination light) for illuminating a sample SPL on which general staining is performed and light (dark field illumination light) for illuminating a sample SPL on which special staining is performed. Further, the light source 11 may apply only one of the bright field illumination light and the dark field illumination light. In this case, two types of light sources, i.e., the light source that applies the bright field illumination light and the light source that applies the dark field illumination light, are provided as the light sources 11. It should be noted that the light source that applies the dark field illumination light may be provided on the surface side of the stage 40 on which the preparation is arranged (hereinafter, the surface being also referred to as preparation arrangement surface).
  • The objective lens 12 is arranged on the preparation arrangement surface side of the stage 40, with the normal line of a reference position of the thumbnail image capturing unit 10 on the preparation arrangement surface being as an optical axis SR. Transmitted light that has been transmitted through the preparation PRT arranged on the stage 40 is collected by the objective lens 12 and forms an image onto the imaging device 13 that is provided behind the objective lens 12 (that is, in a traveling direction of the illumination light).
  • Further, instead of the above-mentioned finite conjugate arrangement, an infinite conjugate arrangement may be provided, in which an imaging lens is arranged behind the objective lens 12 and an image is formed onto the imaging device 13.
  • Light covering an imaging range in which the entire preparation PRT placed on the preparation arrangement surface of the stage 40 is included is focused onto the imaging device 13 to form an image. The image formed onto the imaging device 13 is a thumbnail image of the entire preparation PRT.
  • Observation Image Capturing Unit 20
  • As shown in FIG. 1, the observation image capturing unit 20 includes a light source 21, a condenser lens 22, the objective lens 23, an imaging device 24, a condenser lens drive mechanism 25, and a beam splitter 26.
  • The light source 21 applies the bright field illumination light. The light source 21 is provided on the surface of the stage 40, which is on the opposite side to the preparation arrangement surface. Further, at a position different from the light source 21 (for example, on the preparation arrangement surface side), a light source (not shown) that applies the dark field illumination light is provided.
  • The condenser lens 22 is a lens that collects the bright field illumination light applied from the light source 21 or dark field illumination light applied from a dark field illumination light source to guide the light to the preparation PRT on the stage 40. The condenser lens 22 is arranged between the light source 21 and the stage 40, with the normal line of a reference position of the observation image capturing unit 20 on the preparation arrangement surface being as an optical axis ER.
  • The condenser lens drive mechanism 25 changes the position of the condenser lens 22 on the optical axis ER by driving the condenser lens 22 along a direction of the optical axis ER.
  • The objective lens 23 is arranged on the preparation arrangement surface side of the stage 40, with the normal line of the reference position of the observation image capturing unit 20 on the preparation arrangement surface being as the optical axis ER. In the observation image capturing unit 20, the objective lens 23 is appropriately replaced, so that the image of the sample SPL can be enlarged and captured at various magnifications. Further, in the case where an imaging lens is arranged in the infinite conjugate system, a magnification can be changed by appropriately replacing the imaging lens. Transmitted light that has been transmitted through the preparation PRT arranged on the stage 40 is collected by the objective lens 23 and reaches the beam splitter 26.
  • The beam splitter 26 splits the transmitted light that has been transmitted through the objective lens 23 into reflected light that proceeds to the imaging device 24 and transmitted light that proceeds to a field lens 32 in the phase difference image acquiring unit 30.
  • On the imaging device 24, an image of a small imaging range on the preparation arrangement surface of the stage 40 is formed in accordance with a pixel size of the imaging device 24 and a magnification of the objective lens 23.
  • Phase Difference Image Acquiring Unit 30
  • As shown in FIG. 1, the phase difference image acquiring unit 30 includes the field lens 32, a separator lens 33, and an imaging device 34.
  • The field lens 32 collects the light that has been transmitted through the beam splitter 26 and guides the transmitted light to the separator lens 33 that is provided behind the field lens 32 (on a traveling direction side of the transmitted light).
  • The separator lens 33 divides the light beam guided from the field lens 32 into two light beams. The divided light beams form a set of subject images on an imaging surface of the imaging device 34 provided behind the separator lens 33 (on the traveling direction side of the transmitted light).
  • On the imaging device 34, a set of subject images that has been transmitted through the separator lens 33 is formed. A phase difference exists between the set of formed subject images because light beams in various directions, which are emitted from the field lens 32, enter the separator lens 33. In the following description, the set of subject images is referred to as a “phase difference image”.
  • It should be noted that the beam splitter 26 is provided between the objective lens 23 and the imaging device 24 in the above description, but the light beam branching unit that branches a light beam is not limited to a beam splitter. A movable mirror or the like may be used instead of the beam splitter.
  • Additionally, in the above description, the phase difference image acquiring unit 30 is arranged on the optical axis ER of the objective lens 23, and the imaging device 24 of the observation image capturing unit 20 is arranged at a position on which the reflected light split by the beam splitter 26 is incident. Conversely, the imaging device 24 of the observation image capturing unit 20 may be arranged on the optical axis ER of the objective lens 23 and the phase difference image acquiring unit 30 may be arranged at a position on which the reflected light split by the beam splitter 26 is incident.
  • Further, in the above-mentioned description, the configuration in which the field lens, the separator lens, and the imaging device are provided is shown as a phase difference AF optical system in the phase difference image acquiring unit 30, but the present disclosure is not limited to such an example. Such a phase difference AF optical system may be a different optical system in which equivalent functions can be achieved by using a condenser lens and twin lenses instead of the field lens, the separator lens, and the like.
  • Furthermore, each of the imaging devices provided to the thumbnail image capturing unit 10, the observation image capturing unit 20, and the phase difference image acquiring unit 30 may be a one-dimensional imaging device (line sensor) or a two-dimensional imaging device (area sensor).
  • Controller 50
  • The controller 50 includes an integration controller 51, an illumination controller 52, a stage drive controller 53, a condenser lens drive controller 54, a phase difference image capturing controller 57, a thumbnail image capturing controller 56, an observation image capturing controller 55, a storage unit 58, a development unit 59, an image coding unit 60, a communication unit 61, and a display controller 62.
  • The integration controller 51 is constituted of hardware elements of a computer including, for example, a CPU (Central Processing Unit), a ROM (Read Only Memory), and a RAM (Random Access Memory). Alternatively, the integration controller 51 may be constituted of a dedicated IC such as an FPGA (field programmable gate array). The integration controller 51 exchanges various signals with the illumination controller 52, the stage drive controller 53, the condenser lens drive controller 54, the observation image capturing controller 55, the thumbnail image capturing controller 56, the phase difference image capturing controller 57, the storage unit 58, the development unit 59, the image coding unit 60, the communication unit 61, and the display controller 62, and executes various types of arithmetic processing and control to acquire an observation image. Various programs and various types of data for the various types of arithmetic processing and control are loaded to the RAM. The CPU executes the programs loaded to the RAM. The ROM stores the programs and data loaded to the RAM.
  • The illumination controller 52, the stage drive controller 53, the condenser lens drive controller 54, the phase difference image capturing controller 57, the thumbnail image capturing controller 56, and the observation image capturing controller 55 may be each constituted of hardware elements of a computer including, for example, a CPU, a ROM, and a RAM or may be constituted of a dedicated IC such as an FPGA.
  • The development unit 59 and the image coding unit 60 are each constituted of hardware elements of a computer including, for example, a CPU, a ROM, and a RAM. Alternatively, the development unit 59 and the image coding unit 60 may be constituted of a GPU (Graphics Processing Unit).
  • The illumination controller 52 controls the light sources 11 and 21 according to an instruction on an illumination method for the sample SPL, the instruction being given from the integration controller 51. For example, according to the instruction from the integration controller 51, the illumination controller 52 sets the intensity of the illumination light of the light sources 11 and 21 and selects the type of light source, such as a light source for a bright field or a light source for a dark field. Examples of the light source for a bright field include a light source that applies visible light. Examples of the light source for a dark field include a light source that applies light having a wavelength capable of exciting a fluorescent marker used in special staining.
  • For example, when receiving an instruction from the integration controller 51 to capture a thumbnail image, the stage drive controller 53 drives the stage drive mechanism 41 to move the stage 40 in a stage surface direction (in x- and y-axis directions) in such a way that the entire preparation PRT falls within the imaging range of the imaging device 13. The stage drive controller 53 drives the stage drive mechanism 41 to move the stage 40 in the z-axis direction in such a way that the objective lens 12 is focused on the entire preparation PRT. Further, when receiving an instruction from the integration controller 51 to capture an observation image, the stage drive controller 53 drives the stage drive mechanism 41 to move the stage 40 in the stage surface direction in such a way that the small area of the instructed sample SPL falls within the imaging range of the imaging device 24. The stage drive controller 53 drives the stage drive mechanism 41 to move the stage 40 in the z-axis direction in such a way that the objective lens 23 is focused on the sample SPL.
  • The condenser lens drive controller 54 controls the condenser lens drive mechanism 25 based on information on an illumination field stop from the integration controller 51, to adjust the illumination light from the light source 21 so as to be applied only to a small area of the sample SPL as an imaging target. The information on the illumination field stop includes the amount of defocus and the orientation of defocus. Those pieces of information are obtained based on a distance between the set of phase difference images generated by the phase difference image acquiring unit 30.
  • The phase difference image capturing controller 57 acquires signals of the set of phase difference images that are formed on the imaging surface of the imaging device 34 provided to the phase difference image acquiring unit 30, and supplies the signals to the integration controller 51. The integration controller 51 calculates the amount of defocus and the orientation of defocus of the focal point of the objective lens 23 of the observation image capturing unit 20 with respect to the sample SPL, based on the distance between the set of phase difference images that are acquired from the phase difference image capturing controller 57. Based on those pieces of information, the integration controller 51 generates control information for the stage 40 and supplies the control information to the stage drive controller 53. The stage drive controller 53 drives the stage drive mechanism 41 to move the stage 40 in the z-axis direction based on the control information from the integration controller 51. Thus, the phase difference AF (autofocusing) in which the objective lens 23 of the observation image capturing unit 20 is focused on the sample SPL is performed.
  • Based on a signal corresponding to a thumbnail formed on the imaging surface of the imaging device 13 of the thumbnail image capturing unit 10, the thumbnail image capturing controller 56 generates data corresponding to the thumbnail image and supplies the data to the integration controller 51. The integration controller 51 detects an area including the sample SPL as a sample area from the thumbnail image acquired by the thumbnail image capturing controller 56. Additionally, the integration controller 51 partitions the sample area into a plurality of areas in a mesh pattern, the plurality of areas each having a size corresponding to the field of view of the observation image capturing unit 20, and performs processing such as setting each of the areas as an area corresponding to one-time imaging by the observation image capturing unit 20, and the like. Hereinafter, this area is referred to as a “small area”.
  • The observation image capturing controller 55 generates, based on a signal corresponding to an observation image of each small area that is formed on the imaging surface of the imaging device 24 of the observation image capturing unit 20, raw data corresponding to the observation image of each small area and supplies the raw data to the integration controller 51. The integration controller 51 supplies the raw data of each small area, which has been acquired from the observation image capturing controller 55, to the development unit 59, so that the development unit 59 executes development processing. The integration controller 51 connects the data of the observation images of respective small areas, which have been developed by the development unit 59, to generate a large image for each sample SPL, and performs processing of dividing the generated large image for each sample SPL into units of a predetermined resolution that is called tile, and other processing. Further, the integration controller 51 supplies each of the generated tiles to the image coding unit 60 and causes the image coding unit 60 to generate image data in a predetermined compression coding format and the storage unit 58 to store the image data.
  • The storage unit 58 stores various types of setting information or programs for controlling the digital microscope apparatus 100, tile groups in the predetermined compression coding format, and the like.
  • The development unit 59 develops the raw data of the observation image of each small area that has been captured by the observation image capturing unit 20.
  • The image coding unit 60 codes the image data for each tile in the predetermined image compression format. Here, for example, JPEG (Joint Photographic Experts Group) is adopted as the image compression format. As a matter of course, compression coding formats other than JPEG may be adopted.
  • The tiles stored in the storage unit 58 are accumulated in an image management server 81 by the communication unit 61 through a network 80. In response to a request from a viewer terminal 82, the image management server 81 sends one or more appropriate tiles to the viewer terminal 82. The viewer terminal 82 generates an observation image for display by using the one or more tiles acquired from the image management server 81 and displays the observation image on a display of the viewer terminal 82.
  • The display controller 62 generates screen data to be displayed on a display 90 that is connected to the digital microscope apparatus 100.
  • Autofocusing of Observation Image Capturing System
  • In the digital microscope apparatus 100 according to this embodiment, a phase difference autofocusing and a contrast autofocusing are implemented as autofocus functions of automatically focusing the objective lens 23 of the observation image capturing unit 20 on the sample SPL serving as an imaging target.
  • In the phase difference autofocusing, the integration controller 51 instructs the phase difference image capturing controller 57 to capture a phase difference image. When receiving the instruction, the phase difference image capturing controller 57 takes in signals of a set of phase difference images that are formed side by side on the imaging surface of the imaging device 34 from the phase difference image acquiring unit 30, and obtains a phase difference between those phase difference images.
  • In the observation image capturing unit 20, as the focal point of the objective lens 23 is farther than an appropriate surface, identical areas of the observed surfaces on the two phase difference images move so as to be separated from each other toward the outward direction of the imaging device 34. Conversely, as the focal point of the objective lens 23 is closer than the appropriate surface, the identical areas of the observed surfaces on the two phase difference images move so as to be close to each other toward the inward direction of the imaging device 34. The integration controller 51 obtains a distance between the identical areas of the observed surfaces on the two phase difference images as a phase difference.
  • The integration controller 51 obtains, based on the obtained phase difference, the amount of defocus and the orientation of defocus of the focal point of the objective lens 23 with respect to the sample SPL. The integration controller 51 generates control information for the stage 40 based on the obtained amount and orientation of defocus and supplies the control information to the stage drive controller 53. The stage drive controller 53 drives the stage drive mechanism 41 to move the stage 40 in the z-axis direction based on the control information from the integration controller 51. Thus, the phase difference autofocusing that focuses the objective lens 23 of the observation image capturing unit 20 on the sample SPL is performed.
  • The phase difference autofocusing allows a focal position to be obtained at higher speed than the contrast autofocusing, without the search for a focal position. On the other hand, there is a possibility that the accuracy is lowered depending on the size of an object within the imaging surface or on the number of tissues.
  • In contrast, the contrast autofocusing is a method in which a focal position is searched for by a hill-climbing method using the observation image capturing unit 20. In the contrast autofocusing, the integration controller 51 displaces the focal position of the objective lens 23 by a predetermined distance and causes the observation image capturing unit 20 to perform imaging at the respective focal positions. The integration controller 51 determines the focal position at which the image having the highest contrast among the captured images is captured as the optimal focal position.
  • In general, it is thought that the contrast autofocusing provides higher accuracy in focal point than the phase difference autofocusing. However, because the focal position is searched for, the contrast autofocusing involves repetitive movements and evaluations of the focal position of the objective lens. Consequently, it takes a relatively long time to obtain the focal position (focal position search time).
  • Additionally, there is a case where sufficient accuracy in focal point may not be obtained by any of the phase difference autofocusing and the contrast autofocusing. For example, in the case where the sample SPL is sealed in the preparation PRT at a large inclination angle with respect to the glass slide surface, there is a possibility that such blurring that is out of the allowable range partially occurs at any height position set for the focal point.
  • As described above, in the digital microscope apparatus, a focal point is detected for each small area by the phase difference autofocusing or the contrast autofocusing, so that an observation image is captured. Thus, images may be captured in an insufficient in-focus state depending on the area. In this case, the observation image of a small area is blurred as a whole or partially. In the past, however, such a blurred part has been discovered only when a user observes the large image generated by connecting the observation images of the respective small areas by stitching processing, because the blurred part is more prominent than the adjacent observation images.
  • In order to improve such circumstances, the digital microscope apparatus 100 according to this embodiment has the following functions.
  • Specifically, the digital microscope apparatus 100 according to this embodiment sets one or more evaluation areas for the observation image of each small area, which has been captured by the observation image capturing unit 20, and performs an edge detection on each evaluation area. Subsequently, the digital microscope apparatus 100 calculates a difference in blur evaluation amount between two observation images by using results of the edge detection on two evaluation areas. The two evaluation areas are located closest among any pairs of adjacent observation images in the connected image obtained by connecting a plurality of observation images. The digital microscope apparatus 100 determines a boundary part between those observation images, in which a difference in blur evaluation amount between the two observation images is out of a predetermined allowable range, as a blurred boundary part. Based on the determination results, the digital microscope apparatus 100 determines a small area to be reimaged by the observation image capturing unit 20 and a reimaging condition for such a small area.
  • Hereinafter, the details of such functions will be described.
  • Function of Integration Controller 51
  • FIG. 2 is a functional block diagram of the integration controller 51. As shown in FIG. 2, the integration controller 51 includes a sample area detection unit 511, a small area setting unit 512, a stitching unit 513, a blur evaluation unit 514, and a reimaging controller 515. Those functions are achieved when a CPU in the integration controller 51 executes a program loaded to a RAM.
  • In the thumbnail image that has been acquired by the thumbnail image capturing controller 56 using the thumbnail image capturing unit 10, the sample area detection unit 511 detects an area including the sample SPL as a sample area. The sample area detection unit 511 determines a sample area in the thumbnail image based on a distribution of pixels whose luminance values drastically change, for example. The pixels whose luminance values drastically change are detected by, for example, a method of detecting a boundary of samples by edge detection. It should be noted that a user can manually perform the determination of sample areas by observing the thumbnail image on a monitor.
  • The small area setting unit 512 partitions the sample area, which has been detected by the sample area detection unit 511, into a plurality of areas in a mesh pattern in a size unit corresponding to the field of view of the observation image capturing unit 20 and sets each of the partitioned areas as an area (small area) corresponding to one-time imaging by the observation image capturing unit 20. The small area setting unit 512 gives information on the position of each small area to the stage drive controller 53. Based on the given information on the position of each small area, the stage drive controller 53 drives the stage drive mechanism 41 to move the stage 40 such that the small areas sequentially fall within the imaging range of the observation image capturing unit 20. Subsequently, the small area setting unit 512 instructs the observation image capturing controller 55 to capture images of the small areas.
  • The stitching unit 513 connects observation images of the plurality of small areas, which have been captured by the observation image capturing controller 55, and generates a large image (connected image) in units of the sample SPL.
  • The blur evaluation unit 514 sets one or more evaluation areas for the observation image of each small area and performs an edge detection on each evaluation area, to calculate a difference in blur evaluation amount between two observation images by using results of the edge detection on two evaluation areas. The two evaluation areas are located closest among any pairs of adjacent observation images in the connected image. The blur evaluation unit 514 determines a boundary part between those two observation images, in which a difference in blur evaluation amount between the two observation images is out of a predetermined allowable range, as a blurred boundary part. The blur evaluation unit 514 determines, based on the determination results of the blurred boundary part, a small area to be reimaged by the observation image capturing unit 20 and a reimaging condition for such a small area. Hereinafter, the above-mentioned series of processing by the blur evaluation unit 514 is referred to as “blur evaluation”.
  • The reimaging controller 515 receives the small area to be reimaged and the reimaging condition, which have been determined by the blur evaluation unit 514, and controls the stage drive controller 53, the observation image capturing controller 55, and the phase difference image capturing controller 57 to execute reimaging of the small area to be reimaged.
  • Flow of Imaging Operation
  • Next, the flow of the imaging operation by the digital microscope apparatus 100 according to this embodiment will be described particularly focusing on the operations of the blur evaluation unit 514 and the reimaging controller 515.
  • FIG. 3 is a flowchart showing a flow of such an imaging operation.
  • First, a thumbnail image captured by the thumbnail image capturing unit 10 is supplied to the integration controller 51. When the integration controller 51 acquires the thumbnail image, the sample area detection unit 511 detects a sample area from the thumbnail image (Step S101).
  • Next, the small area setting unit 512 of the integration controller 51 sets a plurality of small areas for the sample area detected by the sample area detection unit 511, and instructs the observation image capturing controller 55 to capture an image while giving information on the positions of the plurality of small areas to the stage drive controller 53. Thus, the stage 40 is moved such that each small area falls within an imaging range of the observation image capturing unit 20, autofocusing is performed, and the small area is imaged by the observation image capturing unit 20 (Step S102). The autofocusing method executed when the observation image is captured may be any of the phase difference autofocusing and the contrast autofocusing. The autofocusing method for the first imaging may be appropriately selected by a user, for example.
  • Subsequently, the stitching unit 513 of the integration controller 51 connects the observation images of the respective small areas in the x- and y-axis directions and generates a large image (connected image) in units of the sample SPL (Step S103).
  • Next, the blur evaluation by the blur evaluation unit 514 will be performed as follows.
  • Blur Evaluation by Blur Evaluation Unit 514
  • The blur evaluation unit 514 converts the large image (connected image) in units of the sample SPL into a grayscale image having a predetermined bit number for the purpose of speed-up of processing (Step S104). At that time, for the purpose of further speed-up of processing, the large image (connected image) may be reduced in size before the conversion into a grayscale image, or an observation image converted into a grayscale image may be reduced in size. Alternatively, only the evaluation area may be cut out and converted into a grayscale image.
  • Next, the blur evaluation unit 514 performs an edge detection by filtering on the size-reduced observation image or the observation image converted into a grayscale image (Step S105). For the edge detection, for example, the Canny edge detection method or the like may be used. However, the present disclosure is not limited to this.
  • Next, the blur evaluation unit 514 sets a plurality of evaluation areas for the size-reduced observation image or the observation image converted into a grayscale image (Step S106). One or more evaluation areas are set for the observation image of each small area. It is desirable that the plurality of evaluation areas be set in the vicinity of the outer circumferential area of the observation image of each small area.
  • A specific setting method for the evaluation area will be described.
  • FIG. 4 is a diagram partially showing an observation image converted into a grayscale image. Parts A, B, C, D, and E are observation images of small areas. Practically, the observation images of small areas other than the parts A to E are also connected in the x-axis direction and the y-axis direction, but the illustration thereof is omitted in FIG. 4.
  • FIG. 5 is a diagram showing a specific setting example of the evaluation area in the observation image shown in FIG. 4. Evaluation areas A1 to A4 are set in the observation image of a small area A. Evaluation areas B1 to B4 are set in the observation image of a small area B. Evaluation areas C1 to C4 are set in the observation image of a small area C. Evaluation areas D1 to D4 are set in the observation image of a small area D. Evaluation areas E1 to E4 are set in the observation image of a small area E. In this setting example, each evaluation area is set at the middle part of each of the four sides in the circumferential area of the observation image of each of the small areas A, B, C, D, and E.
  • Subsequently, for the respective evaluation areas, the blur evaluation unit 514 calculates a difference in blur evaluation amount between the observation images of two adjacent small areas by using the edge detection results described above. Specifically, the blur evaluation unit 514 first calculates an edge density d from the edge detection results (Step S107). The edge density d is, for example, the ratio of the part of the evaluation area whose edge intensity obtained from the edge detection exceeds a threshold value, or the average value of the edge intensity in the evaluation area. In general, as the blur evaluation amount becomes larger, the edge density d becomes smaller because components with a high spatial frequency attenuate. As the blur evaluation amount becomes smaller, the edge density d becomes higher because many components with a high spatial frequency are contained.
  • Next, the blur evaluation unit 514 calculates an absolute value of a difference Δd in edge density d between two evaluation areas. The two evaluation areas are located closest among any pairs of adjacent observation images of the small areas in the connected image. The blur evaluation unit 514 sets the absolute value as a difference in blur evaluation amount between those two observation images (Step S108).
  • For example, according to the setting example for the evaluation area shown in FIG. 5, the two closest evaluation areas in the adjacent observation images of the small areas in the connected image are A4 and B1, B2 and C3, B4 and D1, and B3 and E2. Here, an absolute value of a difference in edge density between A4 and B1 is represented by |ΔdA4-B1|, an absolute value of a difference in edge density between B2 and C3 is represented by |ΔdB2-C3|, an absolute value of a difference in edge density between B4 and D1 is represented by |ΔdB4-D1|, and an absolute value of a difference in edge density between B3 and E2 is represented by |ΔdB3-E2|.
  • The blur evaluation unit 514 determines a boundary part between those two evaluation areas, in which the absolute value of the difference Δd in edge density d is larger than a predetermined threshold value ε, as a “blurred boundary part” whose difference in blur evaluation amount is out of the predetermined allowable range (Step S109). Specifically, if imaging is performed at an appropriate focal position for each of the two adjacent small areas, the two closest evaluation areas between the adjacent observation images of those small areas should have the same edge density d. When the difference Δd in edge density between the two evaluation areas is large, this means that a boundary part of the observation image of at least one of the small areas is blurred at minimum.
  • Further, also in the case where no sample exists in one of the small areas, the difference Δd in edge density becomes large. It should be noted that this can be avoided by the blur evaluation unit 514 determining whether a sample exists in the connected image and excluding the corresponding evaluation area from the blur evaluation. For example, the blur evaluation unit 514 can determine a position at which a sample exists by detecting an area equivalent to the background, where no sample exists in the connected image, or by detecting an area containing color information derived from the staining of samples. In this case, the image used for this determination may be a thumbnail image captured by the thumbnail image capturing unit 10 or an observation image captured by the imaging device 13.
  • In the case where the relationship between each difference in edge density and the threshold value ε is expressed, for example, as |Δd_{A4-B1}| > ε, |Δd_{B2-C3}| > ε, |Δd_{B4-D1}| > ε, and |Δd_{B3-E2}| > ε, as shown in FIG. 5, the boundary part between the small area A and the small area B, the boundary part between the small area C and the small area B, the boundary part between the small area D and the small area B, and the boundary part between the small area E and the small area B are determined as a blurred boundary part 3AB, a blurred boundary part 3BC, a blurred boundary part 3BD, and a blurred boundary part 3BE, respectively.
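  • A short illustrative continuation of the sketches above covers Step S109: each |Δd| is compared against the threshold value ε, and the boundary parts that exceed it are flagged as blurred boundary parts. The numeric value of ε is an assumed placeholder, tuned per sample type in practice.

      EPSILON = 0.05  # assumed placeholder for the threshold value ε

      def blurred_boundaries(diffs: dict, eps: float = EPSILON) -> list:
          """Return the boundary pairs whose |Δd| exceeds ε (Step S109)."""
          return [pair for pair, delta in diffs.items() if delta > eps]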
  • Subsequently, based on the determination results of those blurred boundary parts, the blur evaluation unit 514 determines a small area to be reimaged by the observation image capturing unit 20 or determines a reimaging condition for a small area to be reimaged. More specifically, in the following manner, a small area to be reimaged and a reimaging condition for the small area to be reimaged are determined.
  • In the case where the blurred boundary parts are not detected (NO in Step S110), the imaging of this preparation PRT is assumed to be successfully performed, and thus the imaging processing for this preparation PRT is terminated. Subsequently, the imaging of the next preparation PRT is executed.
  • Determination Method for Small Area to Be Reimaged and Determination Method for Reimaging Condition
  • The determination method for a small area to be reimaged includes the following methods.
  • 1. First determination method to determine at least one small area surrounded by the blurred boundary parts as a small area to be reimaged.
    2. Second determination method to determine, when the blurred boundary parts exist in n small areas successively located in a predetermined one axial direction (x-axis or y-axis direction), the n small areas as small areas to be reimaged, where n is an integer of 3 or more.
  • Regarding First Determination Method
  • When there is a small area surrounded in all directions by the blurred boundary parts (YES in Step S111), the blur evaluation unit 514 determines that small area as a small area to be reimaged. For example, as shown in FIG. 5, the observation image of the small area B is surrounded by the blurred boundary parts 3AB, 3BC, 3BD, and 3BE in all directions. In such a case, the observation image of the small area B is an overall blurred image in many cases. The blur evaluation unit 514 determines the small area B, which is surrounded in all directions by the blurred boundary parts, as a small area to be reimaged, and instructs the reimaging controller 515 to perform reimaging of the small area B under a first condition.
  • Additionally, in the case where there are a plurality of connected small areas surrounded by the blurred boundary parts, the blur evaluation unit 514 determines those small areas as small areas to be reimaged.
  • FIG. 16 is a diagram showing an example of a plurality of small areas that are surrounded by the blurred boundary parts and connected to one another. For convenience of description, the small areas shown in FIG. 16 are denoted by shot numbers of 1 to 25. Here, 6 small areas with the shot numbers of 7, 8, 12, 13, 14, and 19 are determined as a plurality of small areas surrounded by the blurred boundary parts and connected to one another.
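  • Under assumed data structures (a neighbor map over the small-area grid and a set of blurred boundaries stored as unordered pairs), the first determination method and the grouping of FIG. 16 could be sketched as follows; this is an illustration, not the disclosed implementation.

      def is_surrounded(area, neighbors, blurred):
          """True if every boundary between `area` and its existing neighbors is blurred."""
          return all(frozenset((area, nb)) in blurred for nb in neighbors[area])

      def connected_groups(candidates, neighbors):
          """Group mutually connected surrounded areas (cf. FIG. 16) by flood fill."""
          groups, seen = [], set()
          for start in candidates:
              if start in seen:
                  continue
              group, stack = set(), [start]
              while stack:
                  cur = stack.pop()
                  if cur in group:
                      continue
                  group.add(cur)
                  stack.extend(nb for nb in neighbors[cur] if nb in candidates)
              seen |= group
              groups.append(group)
          return groups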
  • When receiving an instruction of reimaging of the small area B under the first condition, the reimaging controller 515 controls the stage drive controller 53, the observation image capturing controller 55, and the phase difference image capturing controller 57 to change the autofocusing method and reimage the small area B according to the first condition (Step S112). For example, in the case where focusing is performed by the phase difference autofocusing at the first imaging, the reimaging controller 515 performs control so as to search for a focal position by the contrast autofocusing at the reimaging. Conversely, in the case where the focal position is searched for by the contrast autofocusing at the first imaging, the reimaging controller 515 performs control so as to perform focusing by the phase difference autofocusing at the reimaging.
  • Consequently, a good observation image can be obtained efficiently by a more appropriate autofocusing method in accordance with the state of the observation target. For example, the first imaging is performed by the phase difference autofocusing in order to give priority to efficiency, and reimaging is performed by switching to the contrast autofocusing as necessary. This can be expected to improve imaging efficiency when a large number of preparations PRT are processed.
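  • The first condition thus reduces to swapping whichever autofocusing method was used at the first imaging; a trivial sketch, with the method names chosen for illustration:

      def reimaging_af_method(first_method: str) -> str:
          """Swap the autofocusing method for reimaging under the first condition."""
          return "contrast" if first_method == "phase_difference" else "phase_difference"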
  • Regarding Second Determination Method
  • When determining that there are blurred boundary parts in n small areas successively located in a predetermined one axial direction (x-axis or y-axis direction) (YES in Step S113), the blur evaluation unit 514 determines the n small areas as small areas to be reimaged. Here, n is an integer of 3 or more.
  • FIG. 6 is a diagram showing an example of the observation image for which small areas to be reimaged are determined by the second determination method. As shown in FIG. 7, the evaluation areas are set as described above. In this example, the expressions |Δd_{B2-C3}| > ε and |Δd_{B3-E2}| > ε are satisfied, while the expressions |Δd_{A4-B1}| > ε and |Δd_{B4-D1}| > ε are not satisfied. Consequently, in this example, as shown in FIG. 7, it is determined that the blurred boundary parts 3BC and 3BE exist between the three small areas B, C, and E, which are successively located in the x-axis direction. As a result, the small areas B, C, and E are determined as small areas to be reimaged. When determining the small areas B, C, and E to be reimaged in this way, the blur evaluation unit 514 instructs the reimaging controller 515 to perform reimaging of the small areas B, C, and E under a second condition.
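  • For one row of small areas ordered along the scanned axis, the second determination method could be sketched as a scan for runs of successive blurred boundaries spanning at least n areas; the data layout is an assumption. For the FIG. 7 example, a row (C, B, E) with blurred boundaries C-B and B-E marks all three areas, matching the determination described above.

      def runs_to_reimage(row, blurred, n=3):
          """row: small-area ids in axis order; blurred: set of frozenset boundary pairs."""
          marked, run = set(), [row[0]]
          for prev, cur in zip(row, row[1:]):
              if frozenset((prev, cur)) in blurred:
                  run.append(cur)            # the blur-separated run grows
              else:
                  if len(run) >= n:
                      marked |= set(run)     # n or more successive areas: reimage them
                  run = [cur]
          if len(run) >= n:
              marked |= set(run)
          return marked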
  • Factors causing such periodic blurs in a predetermined one axial direction (x-axis or y-axis direction) include, for example, the posture of the sample SPL when it is arranged at a large inclination angle with respect to the glass slide surface.
  • Further, even when most of the sample SPL is parallel to the glass slide surface, an end of the sample may be partially inclined at a large angle. Such a case also easily produces periodic blurs in a predetermined one axial direction (x-axis or y-axis direction).
  • FIG. 8 is a diagram showing the relationships among an x-z cross section of a part of the sample SPL (including the small areas B, C, and E) arranged at a large inclination angle with respect to the glass slide surface, shown in the lower part of the diagram, the focal positions z_focus1, z_focus2, and z_focus3, the imaging ranges for the respective focal positions, the allowable blur evaluation amount δz, the in-focus areas, and the blurred areas.
  • In FIG. 8, the vertical axis indicates the optical axis direction z, and the horizontal axis indicates the x-axis direction. The position z_focus1 indicates a focal position when the small area C is imaged, the position z_focus2 indicates a focal position when the small area B is imaged, and the position z_focus3 indicates a focal position when the small area E is imaged. The blur evaluation amount is proportional to the difference between the focal position and the position of the sample SPL. Further, a part whose blur evaluation amount exceeds the range of a predetermined allowable blur evaluation amount δz due to the inclination of the sample SPL is defined as a “blurred area”. A part whose blur evaluation amount falls within the range of the allowable blur evaluation amount δz is defined as an “in-focus area”.
  • As shown in FIG. 8, when the sample SPL is arranged at a large inclination angle with respect to the glass slide surface, the blurred areas and the in-focus areas appear alternately, with boundaries interposed between them, across the small areas successively located in one axial direction (in this case, the x-axis direction). Specifically, the blurred boundary parts exist between the n small areas arranged in that one axial direction, and thus the determination condition of the second determination method described above is satisfied.
  • In such a manner, when determining the n small areas B, C, and E to be reimaged by the second determination method (YES in Step S113), the blur evaluation unit 514 instructs the reimaging controller 515 to perform reimaging of the small areas B, C, and E under the second condition.
  • When receiving the instruction of reimaging under the second condition, the reimaging controller 515 controls the stage drive controller 53, the observation image capturing controller 55, and the phase difference image capturing controller 57 to change the imaging range in the one axial direction of the observation image capturing unit 20 into 1/m of the imaging range at the first imaging and to divide one small area into m areas for reimaging (Step S114). It should be noted that the autofocusing method at the reimaging is the same as that of the first imaging. The value m is an integer of 2 or more.
  • FIG. 9 is a diagram showing a specific example of the reimaging. FIG. 9 shows an example in which the imaging range in the x-axis direction is changed into ½ of the imaging range at the first imaging, and each of the small areas B, C, and E to be reimaged is captured in two imaging operations, each covering ½ of the area in the x-axis direction. As shown in FIG. 9, each small area is divided into two areas arranged in the x-axis direction, and reimaging is performed at a focal position obtained by autofocusing for each of those areas. As a result, the maximum blur evaluation amount falls within the range of the allowable blur evaluation amount δz in each of the small areas B, C, and E, and a good observation image without blurred areas is obtained.
  • Although the case where the imaging range in the x-axis direction is changed into 1/m has been described, in the case where the in-focus areas and the blurred areas appear alternately over n small areas successively located in the y-axis direction, it is only necessary to change the imaging range in the y-axis direction into 1/m of the imaging range at the first imaging and to divide each of the small areas to be reimaged into m areas for reimaging.
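  • Computing the reimaging sub-ranges of Step S114 amounts to splitting a small area's extent along the chosen axis into m equal strips, each imaged with its own autofocus pass; a minimal sketch in assumed stage coordinates:

      def split_imaging_range(start: float, width: float, m: int = 2):
          """Split a small area's extent along one axis into m equal strips (Step S114)."""
          strip = width / m
          return [(start + i * strip, strip) for i in range(m)]

      # Example: a small area 1.0 unit wide starting at x = 3.0, with m = 2,
      # yields [(3.0, 0.5), (3.5, 0.5)].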
  • Examples of the method of changing the imaging range of the observation image capturing unit 20 into 1/m of the imaging range include the following methods.
  • 1. A method of changing a feed amount of the stage 40 at the imaging, in the case where a line sensor is used as the imaging device 34 of the observation image capturing unit 20.
    2. A method of controlling the width of a read area in an x-axis direction or y-axis direction of an area sensor to be 1/m of the width of the read area at the first imaging, in the case where the area sensor is used as the imaging device 34 of the observation image capturing unit 20.
  • For the observation image that has been captured again as described above, the blur evaluation unit 514 performs the blur evaluation as in Steps S107 to S109 described above (Step S115). As a result of the blur evaluation, when it is determined that no blurred boundary parts remain (NO in Step S116), the imaging of this preparation PRT is assumed to be successfully performed, and thus the imaging processing for this preparation PRT is terminated. Subsequently, the imaging of the next preparation PRT is executed. Alternatively, when it is determined that one or more blurred boundary parts remain (YES in Step S116), the reimaging controller 515 considers that the search for a focal position has failed even with the changed autofocusing method and performs control to clearly specify the positions of the blurred boundary parts on the screen on which a thumbnail image of the sample is displayed (Step S117).
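  • The re-evaluation pass of Steps S115 to S117 could be summarized by the following assumed control flow, where evaluate stands for the pipeline of Steps S107 to S109 and highlight_on_thumbnail for the display control of Step S117; both names are placeholders.

      def reevaluate_and_report(connected_image, evaluate, highlight_on_thumbnail):
          """Re-run the blur evaluation; flag any remaining blurred boundary parts."""
          remaining = evaluate(connected_image)   # Step S115, as in Steps S107 to S109
          if remaining:
              highlight_on_thumbnail(remaining)   # Step S117: mark them on the thumbnail
          return not remaining                    # True means the imaging succeeded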
  • FIG. 10 is a diagram showing a display example in which the position of a blurred boundary part is clearly specified in a thumbnail image. Here, the display 90 may be a display apparatus directly connected to the controller 50 of the digital microscope apparatus 100. In a thumbnail image 91, the position of the blurred boundary part is clearly specified by a synthetic image 92 such as a frame line. It should be noted that the method of clearly specifying the position of the blurred boundary part is not limited to use of the synthetic image 92. The luminance, color, and the like of the observation image at the position of the blurred boundary part may be changed, so that the user can recognize the position of the blurred boundary part.
  • Further, the user interface on the screen, in which the position of the blurred boundary part is clearly specified and displayed on the thumbnail image, may be provided with an operating element (a knob etc.) with which the user can change the setting of the threshold value ε. Changing the setting of the threshold value ε allows the blurred boundary part to be displayed in a state in which the user can easily recognize it. This improves efficiency when the user checks the position of the blurred boundary part, for example, in the case where the allowable blur amount fluctuates depending on the sample. Additionally, adjusting the threshold value ε can be expected to improve the detection accuracy of the blurred boundary part for the sample at hand.
  • The user visually checks a blurred boundary part on the preparation PRT, or checks it by using an enlarged image, while viewing the display of the position of the blurred boundary part. If, for example, dirt or the like is mixed in the preparation PRT, the focal position is manually adjusted to perform reimaging.
  • Incidentally, depending on the setting value of n in the second determination method, there is a possibility that the determination conditions of the first determination method and the second determination method are simultaneously satisfied. In the case where both determination conditions are simultaneously satisfied, one of the following operations is performed.
  • 1. Operation to validate the determination result by the first determination method.
    2. Operation to select whether to validate the determination result by the first determination method or validate the determination result by the second determination method according to the setting value of n. For example, when n is 3, the determination result by the first determination method is set to be valid, and when n is 4 or more, the determination result by the second determination method is set to be valid.
  • According to this embodiment, as described above, the following effects are obtained.
  • 1. It is possible to satisfactorily obtain the difference in blur evaluation amount between observation images of two small areas that are adjacent to each other in the connected image.
    2. It is possible to satisfactorily and efficiently determine, based on the difference in blur evaluation amount between the observation images of the small areas, whether a boundary part between the observation images is a blurred boundary part serving as a boundary part between a blurred area and an in-focus area.
    3. It is possible to satisfactorily and efficiently determine the blurred boundary part and, as a result of this, satisfactorily determine a small area to be reimaged.
    4. By using an autofocusing method at reimaging, which is different from the autofocusing method at the first imaging, it is possible to obtain an observation image by a more appropriate autofocusing method that corresponds to the state of a sample.
    5. It is possible to obtain an observation image of a sample even from a preparation PRT in which the sample SPL is sealed at a large inclination angle with respect to the glass slide surface, by reimaging such a preparation PRT with the imaging range in a predetermined one axial direction of the observation image capturing unit 20 changed into 1/m of the imaging range at the first imaging and with one small area divided into m areas.
  • Modified Example 1
  • In the embodiment described above, when it is determined that the blurred boundary parts exist between the n small areas that are successively arranged in a predetermined one axial direction (x-axis or y-axis direction), the imaging range in the one axial direction is changed into 1/m of the imaging range at the first imaging, and one small area is reimaged in m imaging operations. Instead of such control, a control operation of inclining the stage 40 or the optical system of the observation image capturing unit 20 so as to cancel the inclination of the sample SPL may be performed.
  • Alternatively, in the case where the sample SPL is partially inclined, a control operation of inclining the stage 40 or the optical system of the observation image capturing unit 20 at an appropriate position so as to locally cancel the inclination may be performed.
  • FIG. 11 is an explanatory diagram of reimaging according to the modified example 1. As shown in FIG. 11, the stage 40 or the optical system of the observation image capturing unit 20 is inclined so as to cancel the inclination of the sample SPL. This may allow the sample SPL to fall within the range of the allowable blur evaluation amount δz over the entire imaging range and allow a good observation image without a blurred area to be obtained. In this case, the stage 40 or the optical system of the observation image capturing unit 20 is first inclined so that its right-hand side in FIG. 11 is raised by a predetermined angle, and then the blur evaluation is performed again. When a blurred boundary part remains, the stage 40 or the optical system of the observation image capturing unit 20 only needs to be inclined inversely, so that its left-hand side in FIG. 11 is raised by a predetermined angle, and the imaging and the blur evaluation are performed again. Additionally, the inclination angle may be increased stepwise, repeating reimaging and reevaluation.
  • Modified Example 2
  • For the evaluation area in the observation image, in the first embodiment, the evaluation area is set at the middle part of each of the four sides of the small area as shown in FIG. 5. However, the present disclosure is not limited thereto. As shown in FIG. 12, evaluation areas A1 to A4, B1 to B4, C1 to C4, D1 to D4, and E1 to E4 may be set at the four corners of the small areas A, B, C, D, and E, respectively. In this case, the closest evaluation areas between adjacent small areas in the connected image form two pairs; for the small area A and the small area B, these are A3 and B1, and A4 and B2. When the absolute value of the difference in edge density between A3 and B1 is represented by |Δd_{A3-B1}|, the absolute value of the difference in edge density between A4 and B2 is represented by |Δd_{A4-B2}|, and (|Δd_{A3-B1}| + |Δd_{A4-B2}|)/2 > ε, the boundary part between the small area A and the small area B is determined as a blurred boundary part.
  • The blur evaluation for the other small areas is similarly performed. For example, in the case of the small area B and the small area C, the two closest evaluation areas among adjacent small areas in the connected image are two pairs, i.e., B1 and C2, and B3 and C4. The blur evaluation is performed by the same calculation method as that described above.
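  • With corner evaluation areas, each boundary therefore has two closest pairs, and the averaged |Δd| is compared against ε; a hypothetical sketch using the pair names of FIG. 12:

      # Closest corner pairs per boundary, as described for FIG. 12 (illustrative).
      CORNER_PAIRS = {("A", "B"): [("A3", "B1"), ("A4", "B2")],
                      ("B", "C"): [("B1", "C2"), ("B3", "C4")]}

      def corner_boundary_is_blurred(boundary, densities, eps=0.05):
          """Average the two pairwise |Δd| values and compare against ε."""
          deltas = [abs(densities[p] - densities[q]) for p, q in CORNER_PAIRS[boundary]]
          return sum(deltas) / len(deltas) > eps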
  • Alternatively, as shown in FIG. 13, three or more evaluation areas A1 to A8, B1 to B8, C1 to C8, D1 to D8, and E1 to E8 may be set for respective sides of the observation images of the small areas A, B, C, D, and E, respectively. The calculation method for the blur evaluation in this case is also the same calculation method as that described above.
  • Further, the evaluation areas are not necessarily set near the outer circumferential area of the small area. For example, as shown in FIG. 14, each of the small areas A, B, C, D, and E is provided with a total of four straight lines that intersect one another, i.e., two straight lines dividing the small area into two in the y-axis direction and two straight lines dividing it into two in the x-axis direction. Evaluation areas A1 to A4, B1 to B4, C1 to C4, D1 to D4, and E1 to E4 may be assigned to those four lines of the respective areas.
  • Modified Example 3
  • It is desirable to ensure, as much as possible, that the evaluation area is a sample area. The difference Δd in edge density between an area where no sample exists (a non-sample area) and a sample area tends to be large. Thus, in the case where one of the two closest evaluation areas between the observation images of small areas includes a non-sample area, the boundary part between those small areas may be erroneously determined as a blurred boundary part even if the observation images of both small areas are in focus.
  • For example, the sample area detection unit 511 detects a sample area, and this sample area is set as the imaging target for the observation image. However, since imaging is performed in units of the imaging range of the observation image capturing unit 20, a small area containing a non-sample part is, in practice, also imaged in some cases.
  • In this regard, the following processing can be additionally performed.
  • 1. The blur evaluation unit 514 acquires positional information of a sample area from the sample area detection unit 511. The positional information of the sample area naturally has a much higher resolution than the imaging range of the observation image capturing unit 20.
    2. When setting an evaluation area for the small area, the blur evaluation unit 514 selects a position in which a non-sample area is not contained in the evaluation area based on the positional information of the sample area, and sets an evaluation area. Alternatively, the size or form of the evaluation area may be changed.
  • Thus, the non-sample area can be prevented from being contained in the evaluation area, and this allows an increase in accuracy of the blur evaluation.
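  • A hedged sketch of the placement step described above: candidate positions along a side of the small area are tested against a boolean sample mask built from the positional information of the sample area, and a position whose window lies entirely inside the sample area is chosen. The mask representation, window size, and names are assumptions.

      import numpy as np

      def place_evaluation_area(sample_mask: np.ndarray, candidates, size=(32, 32)):
          """Pick the first candidate (top, left) whose window contains no non-sample pixels."""
          h, w = size
          for top, left in candidates:                     # e.g. positions slid along one side
              window = sample_mask[top:top + h, left:left + w]
              if window.shape == (h, w) and window.all():  # entirely inside the sample area
                  return (top, left)
          return None                                      # no safe position: exclude this side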
  • Modified Example 4
  • In the first embodiment described above, the observation images of the plurality of small areas are connected to generate a connected image, and then the processing of blur evaluation by the blur evaluation unit 514 is performed. However, the present disclosure is not limited thereto.
  • For example, as shown in FIG. 15, the grayscale conversion (Step S203), the edge detection (Step S204), the setting of evaluation areas (Step S205), and the calculation of the edge density d (Step S206) may be executed in units of the observation image of a small area, and then the observation images of the plurality of small areas may be connected (Step S207). Subsequently, the calculation of the difference Δd in edge density (Step S208) and the determination of blurred boundary parts (Step S209) are performed in the same manner as described above.
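  • The reordered pipeline of FIG. 15 could be outlined as follows, with each step passed in as a placeholder callable, since the implementations are not fixed here:

      def per_tile_then_connect(tiles, to_gray, detect_edges, set_areas, edge_density, connect):
          """Run Steps S203 to S206 per small-area image, then connect (Step S207)."""
          per_tile_densities = []
          for tile in tiles:
              gray = to_gray(tile)                 # Step S203: grayscale conversion
              edges = detect_edges(gray)           # Step S204: edge detection
              areas = set_areas(gray)              # Step S205: evaluation areas
              per_tile_densities.append(           # Step S206: edge density d per area
                  {name: edge_density(edges, a) for name, a in areas.items()})
          connected = connect(tiles)               # Step S207: connect the images
          return connected, per_tile_densities     # Steps S208 and S209 follow as before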
  • Modified Example 5
  • In the embodiment described above, when a blurred boundary part remains after the autofocusing method is changed, or after the imaging range is changed into 1/m and the observation image of one small area reimaged in m imaging operations is reevaluated, the position of the blurred boundary part is clearly specified on the screen on which a thumbnail image of the sample is displayed. The present disclosure is not limited thereto. In the case where a blurred boundary part is detected by the blur evaluation performed on the observation image obtained at the first imaging, the position of the blurred boundary part may also be clearly specified on the screen on which the thumbnail image of the sample is displayed.
  • It should be noted that the present disclosure can take the following configurations.
  • (1) A digital microscope apparatus, including:
      • an observation image capturing unit configured to capture an observation image of each of a plurality of small areas, an area containing a sample on a glass slide being partitioned by the plurality of small areas; and
      • a controller configured
        • to set at least one evaluation area for the observation image of each of the plurality of small areas, the observation image being captured by the observation image capturing unit,
        • to perform an edge detection on the at least one evaluation area, and
          • to calculate, using results of the edge detection on two evaluation areas that are closest between two of the observation images adjacently located in a connected image, a difference in blur evaluation amount between the two observation images, the connected image being obtained by connecting the observation images according to the partition.
  • (2) The digital microscope apparatus according to (1), in which
      • the controller is configured to determine, as a blurred boundary part, a boundary part located between the observation images and having a difference in blur evaluation amount that is out of a predetermined allowable range.
  • (3) The digital microscope apparatus according to (2), in which the controller is configured to determine, based on a result of the determination of the blurred boundary part, a small area to be reimaged by the observation image capturing unit and a condition of the reimaging for the small area to be reimaged.
  • (4) The digital microscope apparatus according to any one of (1) to (3), in which
      • the controller is configured to determine the small area surrounded by the blurred boundary parts as the small area to be reimaged, and to determine to switch an autofocusing method, for the condition of the reimaging.
  • (5) The digital microscope apparatus according to any one of (1) to (4), in which
      • the controller is configured
        • to determine, when determining that the blurred boundary parts exist in n small areas successively located in a predetermined one axial direction (where n is an integer of 3 or more), the n small areas as small areas to be reimaged,
        • to change an imaging range in the one axial direction of the observation image capturing unit into 1/m of the imaging range at a first imaging (where m is an integer of 2 or more), for the condition of the reimaging, and
        • to determine to divide each of the small areas into m areas for reimaging.
  • (6) The digital microscope apparatus according to any one of (1) to (5), further including a thumbnail image capturing unit configured to capture an entire image of the sample on the glass slide, in which
      • the controller is configured to generate, when the blurred boundary part is determined again in the connected image containing the observation image of the small area that is reimaged by the observation image capturing unit, an image that clearly specifies a position of the blurred boundary part in a thumbnail image captured by the thumbnail image capturing unit and to display the image on a display.
  • It should be understood by those skilled in the art that various modifications, combinations, sub-combinations and alterations may occur depending on design requirements and other factors insofar as they are within the scope of the appended claims or the equivalents thereof.

Claims (20)

1-8. (canceled)
9. An imaging control apparatus, comprising:
a circuitry configured to:
calculate a first in-focus position based on a phase difference image of a first area of an object captured by an imaging device;
search for a second in-focus position based on the first in-focus position; and
cause the imaging device to capture an output image of the object at the second in-focus position.
10. The imaging control apparatus according to claim 9, wherein the first in-focus position includes a position of an optical axis direction.
11. The imaging control apparatus according to claim 9, wherein the output image is a pathological image.
12. The imaging control apparatus according to claim 9, wherein the output image is an image captured of a living body sample irradiated with at least one of a bright field illumination light or a dark field illumination light.
13. The imaging control apparatus according to claim 9, wherein the circuitry is further configured to store the second in-focus position in a memory.
14. The imaging control apparatus according to claim 9, wherein the circuitry is further configured to move a stage holding the object to capture a second area of the object.
15. The imaging control apparatus according to claim 14, wherein the second area is an area adjacent to the first area.
16. The imaging control apparatus according to claim 9, wherein the output image is an image of a biological sample that includes at least one of blood, epithelial tissues, or tissues including the epithelial tissues and smear cells.
17. The imaging control apparatus according to claim 9, wherein the phase difference image includes information on an amount and orientation of a displacement in an optical axis direction between a focal point of an objective lens of the imaging device and the object.
18. The imaging control apparatus according to claim 9, wherein the imaging device further comprises a splitter lens that splits a light beam used to calculate the first in-focus position.
19. An imaging control method, comprising:
calculating a first in-focus position based on a phase difference image of a first area of an object captured by an imaging device;
searching for a second in-focus position based on the first in-focus position; and
capturing, with the imaging device, an output image of the object at the second in-focus position.
20. An imaging apparatus, comprising:
an optical assembly configured to split a light transmitted through or reflected from an object into a first light guided to a first image sensor and a second light guided to a second image sensor;
the first image sensor configured to obtain a phase difference image of the object to determine a first in-focus position; and
the second image sensor configured to obtain an image of the object at a second in-focus position, wherein the second in-focus position is determined based on the first in-focus position.
21. The imaging apparatus according to claim 20, wherein the first light is transmitted through a field lens and a separator lens, and forms a set of images of the object on the first image sensor.
22. The imaging apparatus according to claim 20, wherein the first light is transmitted through a condenser lens and twin lenses, and forms a set of images of the object on the first image sensor.
23. The imaging apparatus according to claim 20, wherein the optical assembly is a beam splitter or a movable mirror.
24. The imaging apparatus according to claim 20, wherein at least one of the first image sensor or the second image sensor is arranged on an optical axis of an objective lens of the imaging apparatus.
25. The imaging apparatus according to claim 20, wherein the first image sensor and the second image sensor are one of a one-dimensional imaging device or a two-dimensional imaging device.
26. The imaging apparatus according to claim 20, further comprising a thumbnail image sensor configured to obtain a thumbnail image of the object.
27. The imaging apparatus according to claim 20, wherein the imaging apparatus is a microscope.
US16/411,487 2013-03-13 2019-05-14 Digital microscope apparatus for reimaging blurry portion based on edge detection Abandoned US20190268573A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US16/411,487 US20190268573A1 (en) 2013-03-13 2019-05-14 Digital microscope apparatus for reimaging blurry portion based on edge detection

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
JP2013050520A JP2014178357A (en) 2013-03-13 2013-03-13 Digital microscope device, imaging method of the same and program
JP2013-050520 2013-03-13
US14/199,682 US10313637B2 (en) 2013-03-13 2014-03-06 Digital microscope apparatus for reimaging blurry portion based on edge detection
US16/411,487 US20190268573A1 (en) 2013-03-13 2019-05-14 Digital microscope apparatus for reimaging blurry portion based on edge detection

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US14/199,682 Continuation US10313637B2 (en) 2013-03-13 2014-03-06 Digital microscope apparatus for reimaging blurry portion based on edge detection

Publications (1)

Publication Number Publication Date
US20190268573A1 true US20190268573A1 (en) 2019-08-29

Family

ID=51502424

Family Applications (2)

Application Number Title Priority Date Filing Date
US14/199,682 Active 2035-05-12 US10313637B2 (en) 2013-03-13 2014-03-06 Digital microscope apparatus for reimaging blurry portion based on edge detection
US16/411,487 Abandoned US20190268573A1 (en) 2013-03-13 2019-05-14 Digital microscope apparatus for reimaging blurry portion based on edge detection

Family Applications Before (1)

Application Number Title Priority Date Filing Date
US14/199,682 Active 2035-05-12 US10313637B2 (en) 2013-03-13 2014-03-06 Digital microscope apparatus for reimaging blurry portion based on edge detection

Country Status (3)

Country Link
US (2) US10313637B2 (en)
JP (1) JP2014178357A (en)
CN (1) CN104049351A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110780440A (en) * 2019-11-12 2020-02-11 四川沃文特生物技术有限公司 Photographic microscope and method for rapidly photographing by using same

Families Citing this family (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2014178474A (en) * 2013-03-14 2014-09-25 Sony Corp Digital microscope apparatus, focusing position searching method therefor, and program
CN104038699B (en) * 2014-06-27 2016-04-06 努比亚技术有限公司 The reminding method of focusing state and filming apparatus
US10613308B2 (en) * 2016-11-30 2020-04-07 Yuefeng YIN Method and microscope for measuring and calculating heights on curved surface of microscope slide
JP2020202748A (en) * 2017-08-31 2020-12-24 富士フイルム株式会社 Photographing processing apparatus, control method of photographing processing apparatus, and photographing processing program
KR102523559B1 (en) * 2017-09-29 2023-04-19 라이카 바이오시스템즈 이미징 인크. A digital scanning apparatus
JP7184077B2 (en) * 2018-03-22 2022-12-06 ソニーグループ株式会社 CONTROLLER AND METHOD AND OPERATING MICROSCOPE SYSTEM
JP7028099B2 (en) * 2018-08-02 2022-03-02 日本電信電話株式会社 Candidate area estimation device, candidate area estimation method, and program
CN109995998B (en) * 2019-01-03 2020-06-12 中国科学院生物物理研究所 Automatic focusing method suitable for scanning/transmission electron microscope imaging
DE102019113540A1 (en) * 2019-05-21 2020-11-26 Carl Zeiss Microscopy Gmbh Light microscope with automatic focusing
CN110764244B (en) * 2019-11-05 2022-03-18 安图实验仪器(郑州)有限公司 Automatic focusing method for microscope tabletting microscopic examination

Citations (21)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US3838275A (en) * 1973-07-18 1974-09-24 Honeywell Inc Detecting apparatus for determining when image is in focus
US5426521A (en) * 1991-12-24 1995-06-20 Research Development Corporation Of Japan Aberration correction method and aberration correction apparatus
US5583342A (en) * 1993-06-03 1996-12-10 Hamamatsu Photonics K.K. Laser scanning optical system and laser scanning optical apparatus
US5936253A (en) * 1996-12-05 1999-08-10 Nikon Corporation Position detector and microlithography apparatus comprising same
US6023338A (en) * 1996-07-12 2000-02-08 Bareket; Noah Overlay alignment measurement of wafers
US20020027708A1 (en) * 2000-06-30 2002-03-07 Lin Charles P. Fiber-coupled multiplexed confocal microscope
US20070206105A1 (en) * 2006-03-01 2007-09-06 Hamamatsu Photonics K.K. Image acquiring apparatus, image acquiring method, and image acquiring program
US20090086314A1 (en) * 2006-05-31 2009-04-02 Olympus Corporation Biological specimen imaging method and biological specimen imaging apparatus
US20090206234A1 (en) * 2006-07-12 2009-08-20 Toyo Boseki Kabushiki Kaisha Analyzer and use thereof
US20100172020A1 (en) * 2008-10-14 2010-07-08 Burnham Institute For Medical Research Automated scanning cytometry using chromatic aberrtation for multiplanar image acquisition
US20100231922A1 (en) * 2006-12-21 2010-09-16 Howard Hughes Medical Institute Three-dimensional interferometric microscopy
US20100309365A1 (en) * 2009-06-05 2010-12-09 Canon Kabushiki Kaisha Image pickup apparatus having improved contrast autofocus accuracy
US20110188053A1 (en) * 2010-02-01 2011-08-04 Illumina, Inc. Focusing methods and optical systems and assemblies using the same
US20110262123A1 (en) * 2010-04-27 2011-10-27 Canon Kabushiki Kaisha Focus detection apparatus
US20110317259A1 (en) * 2010-06-28 2011-12-29 Sony Corporation Microscope and focusing method
US20120212662A1 (en) * 2011-02-21 2012-08-23 Sony Corporation Imaging device and imaging apparatus
US20120268647A1 (en) * 2011-04-21 2012-10-25 Canon Kabushiki Kaisha Image pickup apparatus and control method thereof
US20120312957A1 (en) * 2009-10-19 2012-12-13 Loney Gregory L Imaging system and techniques
US20130010179A1 (en) * 2011-06-29 2013-01-10 Nikon Corporation Focus adjustment device and imaging apparatus
US20130044203A1 (en) * 2004-11-02 2013-02-21 Cascade Microtech, Inc. Optically enhanced digital imaging system
US20130242173A1 (en) * 2012-03-16 2013-09-19 Canon Kabushiki Kaisha Focusing apparatus and method of controlling focusing apparatus

Family Cites Families (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6711283B1 (en) * 2000-05-03 2004-03-23 Aperio Technologies, Inc. Fully automatic rapid microscope slide scanner
US7792338B2 (en) * 2004-08-16 2010-09-07 Olympus America Inc. Method and apparatus of mechanical stage positioning in virtual microscopy image capture
JP4799428B2 (en) * 2007-01-22 2011-10-26 株式会社東芝 Image processing apparatus and method
US8086037B2 (en) * 2008-02-15 2011-12-27 Microsoft Corporation Tiling and merging framework for segmenting large images
JP5672688B2 (en) 2009-10-23 2015-02-18 ソニー株式会社 Focusing device, focusing method, focusing program, and microscope
JP2011197283A (en) 2010-03-18 2011-10-06 Sony Corp Focusing device, focusing method, focusing program, and microscope
US8396269B2 (en) * 2010-04-08 2013-03-12 Digital Pathco LLC Image quality assessment including comparison of overlapped margins
US9297995B2 (en) * 2011-02-11 2016-03-29 University Of South Florida Automatic stereological analysis of biological tissue including section thickness determination
JP2013011856A (en) * 2011-06-01 2013-01-17 Canon Inc Imaging system and control method thereof
JP5898322B2 (en) * 2011-10-12 2016-04-06 ベンタナ メディカル システムズ, インコーポレイテッド Multifocal interferometer image acquisition
JP2013201530A (en) * 2012-03-23 2013-10-03 Canon Inc Imaging device and control method of the same

Also Published As

Publication number Publication date
JP2014178357A (en) 2014-09-25
US10313637B2 (en) 2019-06-04
CN104049351A (en) 2014-09-17
US20140267675A1 (en) 2014-09-18

Similar Documents

Publication Publication Date Title
US20190268573A1 (en) Digital microscope apparatus for reimaging blurry portion based on edge detection
US11320642B2 (en) Information processing apparatus, information processing method, and information processing program
US11156823B2 (en) Digital microscope apparatus, method of searching for in-focus position thereof, and program
US9088729B2 (en) Imaging apparatus and method of controlling same
KR102411099B1 (en) Real-time autofocus scanning
US10073258B2 (en) Microscope system
US11454781B2 (en) Real-time autofocus focusing algorithm
US10429632B2 (en) Microscopy system, microscopy method, and computer-readable recording medium
US9575305B2 (en) Digital microscope apparatus, information processing method, and information processing program
EP3625611B1 (en) Dual processor image processing
JP2015102694A (en) Alignment device, microscopic system, alignment method, and alignment program
WO2024014079A1 (en) Cell observation device and imaging method used in cell observation device

Legal Events

Date Code Title Description
STPP Information on status: patent application and granting procedure in general Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION
STPP Information on status: patent application and granting procedure in general Free format text: NON FINAL ACTION MAILED
STPP Information on status: patent application and granting procedure in general Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER
STPP Information on status: patent application and granting procedure in general Free format text: FINAL REJECTION MAILED
STPP Information on status: patent application and granting procedure in general Free format text: NON FINAL ACTION MAILED
STPP Information on status: patent application and granting procedure in general Free format text: NOTICE OF ALLOWANCE MAILED -- APPLICATION RECEIVED IN OFFICE OF PUBLICATIONS
STCB Information on status: application discontinuation Free format text: ABANDONED -- FAILURE TO PAY ISSUE FEE