WO2014046138A1 - Sample analysis device, sample analysis method, sample analysis program, and particle track analysis device - Google Patents


Info

Publication number: WO2014046138A1
Application number: PCT/JP2013/075183 (JP2013075183W)
Authority: WO (WIPO / PCT)
Prior art keywords: image, sample, luminance value, unit, approach
Other languages: French (fr), Japanese (ja)
Inventor: 洋介 梅島
Original Assignee / Applicant: Seiko Precision Inc. (セイコープレシジョン株式会社)
Priority date: (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Publication of WO2014046138A1

Classifications

    • G  PHYSICS
    • G01  MEASURING; TESTING
    • G01B  MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00  Measuring arrangements characterised by the use of optical techniques
    • G01B11/30  Measuring arrangements characterised by the use of optical techniques for measuring roughness or irregularity of surfaces
    • G  PHYSICS
    • G01  MEASURING; TESTING
    • G01N  INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N21/00  Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N21/84  Systems specially adapted for particular applications
    • G01N21/88  Investigating the presence of flaws or contamination
    • G01N21/8851  Scan or image signal processing specially adapted therefor, e.g. for scan signal adjustment, for detecting different kinds of defects, for compensating for structures, markings, edges
    • G  PHYSICS
    • G01  MEASURING; TESTING
    • G01N  INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N21/00  Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N21/84  Systems specially adapted for particular applications
    • G01N21/88  Investigating the presence of flaws or contamination
    • G01N21/94  Investigating contamination, e.g. dust
    • G  PHYSICS
    • G02  OPTICS
    • G02B  OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B21/00  Microscopes
    • G02B21/36  Microscopes arranged for photographic purposes or projection purposes or digital imaging or video purposes including associated control and data processing arrangements
    • G02B21/365  Control or image processing arrangements for digital or video microscopes

Definitions

  • The present invention relates to a sample analysis device, a sample analysis method, a sample analysis program, and a particle track analysis device.
  • Patent Documents 1 and 2 disclose techniques for detecting scratches on, and dust attached to, a sample such as a film with a microscope, a scanner, a sensor, or the like.
  • JP 2003-232746 A; Japanese Patent Laid-Open No. 11-215319.
  • Although the techniques of Patent Documents 1 and 2 can detect that a sample has a concave portion (a scratch in the sample) or a convex portion (dust attached to the sample), they cannot distinguish between the concave portion and the convex portion.
  • The present invention has been made in view of the above circumstances, and an object thereof is to detect concave portions and convex portions of a sample and to discriminate between them.
  • A sample analysis device according to the present invention comprises: an emission unit that emits visible light; a support unit that is arranged on the optical axis of the visible light emitted by the emission unit and supports a sample through which the visible light is transmitted; an imaging unit that has an optical system and captures an image of the sample supported by the support unit; a focal-position moving unit that moves the focal position of the optical system along the optical axis of the visible light between an approach position, which is closer to the emission unit than a reference position on the imaging surface (the surface of the sample on the imaging-unit side), and a separation position, which is farther from the emission unit than the reference position;
  • concave detecting means that detects, as a concave portion recessed from the imaging surface, a portion whose luminance value in the approach image (the image of the sample captured by the imaging unit with the focal position moved to the approach position) is larger than a preset luminance value of its peripheral portion and whose luminance value in the separated image (the image of the sample captured by the imaging unit with the focal position moved to the separation position) is smaller than the luminance value of the peripheral portion; and
  • convex detecting means that detects, as a convex portion protruding from the imaging surface, a portion whose luminance value in the separated image is larger than the luminance value of the peripheral portion and whose luminance value in the approach image is smaller than the luminance value of the peripheral portion.
  • The device may further comprise: difference acquisition means that acquires an approach-separation difference image, obtained by subtracting the luminance value of each pixel of the separated image from the luminance value of the corresponding pixel of the approach image, and a separation-approach difference image, obtained by subtracting the luminance value of each pixel of the approach image from the luminance value of the corresponding pixel of the separated image; and
  • closed-region extracting means that extracts, from an image, a closed region whose luminance value differs from that of its peripheral portion.
  • The closed-region extracting means may extract closed regions from the measurement image, which is the image of the sample captured by the imaging unit with the focal position moved from the reference position along the optical axis by a preset measurement distance, as well as closed regions whose luminance values in the approach-separation difference image and in the separation-approach difference image are positive.
  • The concave detecting means may then detect, as the concave portion, a closed region in the measurement image that is located at the position of a closed region whose luminance value in the approach-separation difference image is positive, and the convex detecting means may detect, as the convex portion, a closed region in the measurement image that is located at the position of a closed region whose luminance value in the separation-approach difference image is positive.
  • When a closed region in the measurement image has multiple nested boundaries, the boundary of the region having the lowest average luminance value may be taken as the boundary of the closed region in the measurement image.
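  • As a rough, non-authoritative illustration of this classification rule, the Python sketch below (helper names and masks are assumptions, not taken from the patent) compares the mean luminance of a closed region in the approach (Near) and separated (Far) images with the mean luminance of its peripheral pixels:

```python
import numpy as np

def classify_region(near_img, far_img, region_mask, periphery_mask):
    """Classify one closed region as 'concave', 'convex', or 'unknown'.

    near_img / far_img : 2-D luminance arrays captured with the focal
    position at the approach and separation positions, respectively.
    region_mask / periphery_mask : boolean masks selecting the region and
    its surrounding (peripheral) pixels.  All names are illustrative.
    """
    near_region = near_img[region_mask].mean()
    far_region = far_img[region_mask].mean()
    near_periph = near_img[periphery_mask].mean()
    far_periph = far_img[periphery_mask].mean()

    # Brighter than the periphery when the focus moves toward the emission
    # side and darker when it moves away: the region behaves like a concave
    # lens, i.e. a recess such as an etch pit.
    if near_region > near_periph and far_region < far_periph:
        return "concave"
    # Brighter when the focus moves away and darker when it approaches:
    # the region behaves like a convex lens, i.e. a protrusion such as dust.
    if far_region > far_periph and near_region < near_periph:
        return "convex"
    return "unknown"
```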
  • A sample analysis method according to the present invention comprises: an imaging step of capturing an image of the sample with an imaging unit having an optical system while visible light is transmitted through the sample disposed on the optical axis;
  • a focal-position moving step of moving the focal position of the optical system along the optical axis of the visible light between an approach position, which is closer to the light source than a reference position on the imaging surface (the surface of the sample on the imaging-unit side), and a separation position, which is farther from the light source than the reference position;
  • a concave detecting step of detecting, as a concave portion recessed from the imaging surface, a portion whose luminance value in the approach image (the image captured at the approach position) is larger than a preset luminance value of its peripheral portion and whose luminance value in the separated image (the image captured at the separation position) is smaller than the luminance value of the peripheral portion; and
  • a convex detecting step of detecting, as a convex portion protruding from the imaging surface, a portion whose luminance value in the separated image is larger than the luminance value of the peripheral portion and whose luminance value in the approach image is smaller than the luminance value of the peripheral portion.
  • A sample analysis program according to the present invention causes a computer to function as: imaging means that captures an image of the sample with an imaging unit having an optical system while visible light passes through the sample disposed on the optical axis; focal-position moving means that moves the focal position of the optical system along the optical axis of the visible light between an approach position, which is closer to the light source than a reference position on the imaging surface (the surface of the sample on the imaging-unit side), and a separation position, which is farther from the light source than the reference position;
  • concave detecting means that detects, as a concave portion recessed from the imaging surface, a portion whose luminance value in the approach image (the image captured at the approach position) is larger than a preset luminance value of its peripheral portion and whose luminance value in the separated image (the image captured at the separation position) is smaller than the luminance value of the peripheral portion; and
  • convex detecting means that detects, as a convex portion protruding from the imaging surface, a portion whose luminance value in the separated image is larger than the luminance value of the peripheral portion and whose luminance value in the approach image is smaller than the luminance value of the peripheral portion.
  • A particle track analysis apparatus according to the present invention comprises: an emission unit that emits visible light; a support unit that is disposed on the optical axis of the visible light emitted by the emission unit and supports a track detection solid through which the visible light is transmitted; an imaging unit that has an optical system and captures an image of the track detection solid, as the sample, supported by the support unit;
  • a focal-position moving unit that moves the focal position of the optical system along the optical axis of the visible light between an approach position, which is closer to the emission unit than a reference position on the imaging surface (the surface of the track detection solid on the imaging-unit side), and a separation position, which is farther from the emission unit than the reference position; and
  • foreign-matter removing means that removes from the image of the track detection solid, as foreign matter that is attached to and protrudes from the imaging surface, a portion whose luminance value in the separated image (the image of the track detection solid captured by the imaging unit with the focal position moved to the separation position) is larger than a preset luminance value of its peripheral portion and whose luminance value in the approach image (the image captured with the focal position moved to the approach position) is smaller than the luminance value of the peripheral portion.
  • According to the present invention, the concave portions and convex portions of a sample can be detected and distinguished from each other.
  • Brief description of the drawings: (A) is a diagram showing a state in which the focal position of the objective lens is moved to the separation position; (B) is a diagram showing a state in which the focal position is moved to the measurement position; (C) is a diagram showing a state in which the focal position is moved to the approach position.
  • (H) is a diagram showing the separation-approach difference image. A further figure is an explanatory drawing of a target having multiple boundaries.
  • (A) is a diagram showing the state in which targets with a high degree of coincidence with a track have been extracted from the approach-separation difference image; (B) is a diagram showing the state in which targets with a high degree of coincidence with foreign matter have been extracted from the separation-approach difference image; (C) is an image captured with the focal position moved to the measurement position.
  • (A) is a diagram showing a state in which foreign matter is attached to the imaging surface with the focal position in the state of FIG. 6(A); (B) is a diagram showing the separated image captured in the state of (A); (C) is a diagram showing a state in which a particle track is formed in the imaging surface with the focal position in the state of FIG. 6(A); (D) is a diagram showing the separated image captured in the state of (C); (E) is a diagram showing the foreign matter with the focal position in the state of FIG. 6(C); (F) is a diagram showing the approach image captured in the state of (E); (G) is a diagram showing the particle track with the focal position in the state of FIG. 6(C); (H) is a diagram showing the approach image captured in the state of (G).
  • Further figures are: a flowchart of the sample analysis process; a flowchart of the target extraction process for the measurement image; a flowchart of the target extraction process for each difference image; a flowchart of the track extraction process; a diagram showing the binarized measurement image; and a block diagram showing an example of the hardware configuration of the control computer.
  • The sample analyzer 100 includes a microscope unit 110 and a control computer 150.
  • The microscope unit 110 includes a moving unit 2 that movably supports the sample 1, a microscope 3 that magnifies the image of the sample 1, and an image sensor 4 that captures the image of the sample magnified by the microscope 3.
  • The moving unit 2 and the microscope 3 are supported by an L-shaped gantry 7.
  • Sample 1 is a track detection solid.
  • The track detection solid is an organic plastic plate.
  • In the portion through which radiation has passed, the polymer bonds are damaged.
  • When this damaged portion is etched with a predetermined solution, a minute recess called an etch pit is generated.
  • This etch pit represents a track of the radiation.
  • An etch pit differs in shape (for example, a perfect circle or an ellipse), diameter, and depth depending on the amount and direction of the incident radiation. For this reason, by observing the number and shapes of the etch pits generated in the sample 1 with a microscope, it is possible to acquire information such as the amount and direction of the incident radiation.
  • Dust may adhere to the surface of the sample 1.
  • When an image of the sample 1 to which dust is attached is captured, it is necessary to discriminate between etch pits and dust in the captured image.
  • The moving unit 2 is installed on the horizontal portion of the L-shaped gantry 7.
  • The moving unit 2 includes an XY stage 21, a sample stage (support unit) 22, a condenser lens 23, and an encoder 24.
  • The sample stage 22 supports the sample 1.
  • The sample stage 22 is installed on the XY stage 21.
  • The XY stage 21 is driven by an X-axis motor 131 and a Y-axis motor 132 shown in FIG. 5, described later, and moves horizontally in the left-right and front-back directions (the X-axis and Y-axis directions).
  • The X-axis motor 131 and the Y-axis motor 132 are controlled by the control computer 150.
  • The condenser lens 23 is installed under the sample stage 22.
  • The encoder 24 supplies the control computer 150 with pulses corresponding to the amounts of movement of the XY stage 21 in the X-axis and Y-axis directions.
  • The encoder 24 includes an X-axis encoder 24X and a Y-axis encoder 24Y. Position correction (shift correction) in the X-axis and Y-axis directions for the area images acquired by the image sensor 4, described later, is performed based on the outputs of the X-axis encoder 24X and the Y-axis encoder 24Y, respectively.
  • The microscope 3 includes an objective lens (optical system) 31, an illumination unit 32 that irradiates the sample 1, a control imaging unit 33, a lens barrel 34, an eyepiece 35 for visual observation, and a Z-axis control unit (focal-position moving unit) 36.
  • The objective lens 31 includes a plurality of lenses having different magnifications (for example, a 10x lens and a 20x lens).
  • The plurality of lenses can be switched by rotating the revolver 37.
  • The illumination unit 32 includes an epi-illumination unit 32a and a transmitted illumination unit 32b.
  • The epi-illumination unit 32a irradiates the sample 1 with light emitted from a built-in light source and bent along the optical axis of the microscope 3. In this case, the reflected light from the sample 1 enters the objective lens 31.
  • The transmitted illumination unit 32b is provided on the horizontal portion of the L-shaped gantry 7, under the sample stage 22.
  • The transmitted illumination unit 32b irradiates the sample 1 from below with light emitted from a light source built into the transmitted illumination unit 32b or with light from an external light source introduced through the optical fiber 8. In this case, the transmitted light from the sample 1 enters the objective lens 31.
  • The light with which the transmitted illumination unit 32b irradiates the sample 1 is collected by the condenser lens 23 and emitted onto the sample 1 as parallel light (collected parallel light). For this reason, the transmitted light from the sample 1 is parallel light.
  • The transmitted illumination unit 32b and the condenser lens 23 constitute the emission unit (output unit) 41 shown in FIG. 2 and in FIG. 9, described later.
  • The lens barrel 34 supports the eyepiece 35 for visual observation and the image sensor 4.
  • A side portion of the lens barrel 34 is attached to the upright portion of the L-shaped gantry 7 via the Z-axis control unit 36.
  • The Z-axis control unit 36 moves the microscope 3 in the vertical direction (the Z-axis direction, i.e., the optical-axis direction).
  • The Z-axis control unit 36 includes a slide plate 36a fixed to the gantry 7, a slide plate 36b fixed to the lens barrel 34, and a Z-axis motor 361.
  • One of the slide plates 36a and 36b carries the Z-axis motor 361, a pinion gear 363 fixed to the rotation shaft 362 of the Z-axis motor 361, and an operation knob 364 arranged coaxially with the rotation shaft 362.
  • The other of the slide plates 36a and 36b carries a rack 365.
  • A rack-and-pinion mechanism is formed by engaging the pinion gear 363 with the rack 365.
  • The pinion gear 363 is rotated by driving the Z-axis motor 361 or by turning the operation knob 364, thereby driving the rack 365.
  • As a result, the slide plate 36b slides vertically with respect to the slide plate 36a fixed to the gantry 7, and the lens barrel 34 moves in the vertical direction (Z-axis direction).
  • The eyepiece 35 for visual observation makes it possible to observe the sample 1 visually by deflecting, with a prism, the optical axis of the light incident from the objective lens 31.
  • The image sensor 4 is housed in a case and is detachably attached to the tip of the lens barrel 34.
  • A C mount or an F mount is used as the mounting portion.
  • The image sensor 4 includes CCD (Charge Coupled Device) elements arranged on the XY plane.
  • The image sensor 4 captures the portion of the image of the sample 1, formed within the visual field VF of the microscope 3, that falls within the area AR corresponding to the imaging range of the image sensor 4.
  • The image sensor 4 sequentially images the sample 1, which is moved in the X-axis or Y-axis direction by the moving unit 2, for each area AR corresponding to its imaging range, and transmits the image data (area image data) of each area AR to the control computer 150 via a cable. That is, the image sensor 4 scans the image of the sample 1 magnified by the microscope 3 by capturing images while moving relative to the sample 1, and supplies the captured area image data to the control computer 150.
  • The control computer 150 is, for example, a computer. As illustrated in FIG. 1, the control computer 150 includes an arithmetic processing unit 51, a display unit 52, and an image recording device 53.
  • The arithmetic processing unit 51 sets the imaging region, performs movement control and position control of the moving unit 2 in the X-axis and Y-axis directions, movement control of the microscope 3 in the Z-axis direction, and imaging control of the image sensor 4,
  • fetches the area image data supplied from the image sensor 4, and creates the entire image of the imaging region based on the area image data.
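  • How the entire image is assembled from the area images is not detailed here; the following sketch merely illustrates one plausible way to place tiles captured on a regular X-Y grid (the grid layout, 8-bit tiles, and function names are assumptions):

```python
import numpy as np

def assemble_whole_image(area_images, grid_rows, grid_cols, tile_h, tile_w):
    """Place each area image at its (row, col) grid position.

    area_images : dict mapping (row, col) -> 2-D uint8 array of shape
    (tile_h, tile_w).  Shift correction from the X/Y encoders would adjust
    these positions in the real device; it is omitted here for brevity.
    """
    whole = np.zeros((grid_rows * tile_h, grid_cols * tile_w), dtype=np.uint8)
    for (row, col), tile in area_images.items():
        whole[row * tile_h:(row + 1) * tile_h,
              col * tile_w:(col + 1) * tile_w] = tile
    return whole
```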
  • FIG. 5 shows a functional block diagram centering on the arithmetic processing unit 51.
  • the arithmetic processing unit 51 is connected to the moving unit 2, the Z-axis control unit 36, the imaging unit 123, the image RAM 124, the image recording device 53, the data communication unit 125, and the like.
  • the moving unit 2 includes an X-axis drive unit 121, a Y-axis drive unit 122, an X-axis encoder 24X, and a Y-axis encoder 24Y.
  • the X-axis drive unit 121 drives the X-axis motor 131 according to the control of the arithmetic processing unit 51 to move the XY stage 21 in the X-axis direction.
  • the Y-axis drive unit 122 drives the Y-axis motor 132 according to the control of the arithmetic processing unit 51, and moves the XY stage 21 in the Y-axis direction orthogonal to the X-axis direction.
  • the X-axis encoder 24X is composed of a linear encoder or the like, and outputs a pulse train corresponding to the moving direction and distance when the XY stage 21 moves in the X-axis direction.
  • the arithmetic processing unit 51 determines the moving direction and moving distance of the XY stage 21 in the X-axis direction by determining the number of pulses, the phase, and the like.
  • the Y-axis encoder 24Y is composed of a linear encoder or the like, and outputs a pulse train corresponding to the moving direction and distance when the XY stage 21 moves in the Y-axis direction.
  • the arithmetic processing unit 51 determines the moving direction and moving distance of the XY stage 21 in the Y-axis direction by determining the number of pulses, the phase, and the like.
  • the Z-axis control unit 36 includes the aforementioned Z-axis motor 361, a Z-axis drive unit 366, and a Z-axis encoder 367.
  • the Z-axis drive unit 366 drives the Z-axis motor 361 according to the control of the arithmetic processing unit 51 to move the microscope 3 in the Z-axis direction (vertical direction).
  • the Z-axis encoder 367 is composed of a linear encoder or the like, and outputs a pulse train corresponding to the moving direction and distance when the microscope 3 moves in the Z-axis direction.
  • the arithmetic processing unit 51 determines the moving direction and moving distance of the microscope 3 by determining the number of pulses, the phase, and the like.
  • By detecting the contrast of the image acquired by the image sensor 4 and performing autofocus processing, the Z-axis control unit 36 can detect the imaging surface (the sample surface to be imaged), which is the surface of the sample 1 on the side close to the objective lens 31, and can move the focal position of the objective lens 31 included in the imaging unit 123 to the imaging surface.
  • The focal position of the objective lens 31 in the state where it coincides with the imaging surface is referred to as the reference position.
  • As shown in FIG. 6(A), the Z-axis control unit 36 sets, as the separation position, a position separated from the reference position by a preset separation distance on the side closer to the objective lens;
  • as shown in FIG. 6(B), it sets, as the measurement position, a position separated from the reference position by a preset measurement distance;
  • and, as shown in FIG. 6(C), it sets, as the approach position, a position below the reference position (farther from the objective lens) by a preset approach distance.
  • The separation position and the approach position are set based on the thickness of the sample 1, the assumed particle tracks, the size of dust attached to the sample 1, and the like.
  • The imaging unit 123 includes the objective lens 31, the image sensor 4, and the like. Based on the output signal of the arithmetic processing unit 51, the imaging unit 123 captures a plurality of images of the sample 1, each with a different focal position of the objective lens 31, and supplies them to the arithmetic processing unit 51. Specifically, the imaging unit 123 captures the separated image (Far image) 201 shown in FIG. 6(D) with the focal position aligned with the separation position, the measurement image (Msr: Measure image) 202 shown in FIG. 6(E) with the focal position aligned with the measurement position, and the approach image (Near image) 203 shown in FIG. 6(F) with the focal position aligned with the approach position, and supplies these images to the arithmetic processing unit 51.
  • the image RAM 124 functions as a work area for the arithmetic processing unit 51.
  • the image recording device 53 is composed of a mass storage device or the like, and records captured images.
  • the data communication unit 125 transmits / receives various data to / from an external device.
  • the arithmetic processing unit 51 includes a difference acquisition unit 511, a target object extraction unit 512, a foreign matter removal unit 513, and a track extraction unit 514.
  • The difference acquisition unit 511 acquires the approach-separation difference image (Near-Far image, difference image for concave detection) 204 (see FIG. 6(G)), in which the luminance value of each pixel is the value obtained by subtracting the luminance value of the corresponding pixel of the separated image 201 from the luminance value of that pixel of the approach image 203.
  • The difference acquisition unit 511 also acquires the separation-approach difference image (Far-Near image, difference image for convex detection) 205 (see FIG. 6(H)), in which the luminance value of each pixel is the value obtained by subtracting the luminance value of the corresponding pixel of the approach image 203 from the luminance value of that pixel of the separated image 201.
  • Before acquiring the approach-separation difference image 204 and the separation-approach difference image 205, the difference acquisition unit 511 of the present embodiment applies smoothing processing to the approach image 203 and the separated image 201 to smooth the contour lines in the images. Note that, for ease of understanding, the difference images 204 and 205 shown in FIGS. 6(G) and 6(H) are displayed so that only the portions whose luminance values are positive (positive information) differ from black.
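  • A minimal sketch of this difference-image acquisition, assuming 8-bit grayscale images and using a Gaussian blur as a stand-in for the unspecified smoothing step:

```python
import numpy as np
import cv2  # OpenCV, used here only for the illustrative smoothing step

def difference_images(near_img, far_img, blur_ksize=5):
    """Return the Near-Far (concave detection) and Far-Near (convex
    detection) difference images, keeping only positive values."""
    near = cv2.GaussianBlur(near_img, (blur_ksize, blur_ksize), 0).astype(np.int16)
    far = cv2.GaussianBlur(far_img, (blur_ksize, blur_ksize), 0).astype(np.int16)

    near_far = np.clip(near - far, 0, 255).astype(np.uint8)  # image 204
    far_near = np.clip(far - near, 0, 255).astype(np.uint8)  # image 205
    return near_far, far_near
```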
  • The target extraction unit (closed-region extraction unit) 512 extracts from each image, as a target (Obj: object, closed region), a portion whose luminance value differs from that of the other portions (a portion whiter or blacker than its surroundings). The target extraction unit 512 extracts the targets in the measurement image 202 shown in FIG. 6(E), the targets whose luminance values in the approach-separation difference image 204 shown in FIG. 6(G) are positive, and the targets whose luminance values in the separation-approach difference image 205 shown in FIG. 6(H) are positive.
  • When extracting targets in the measurement image 202, the target extraction unit 512 first applies smoothing processing to the measurement image 202 to clarify the boundaries (edges) surrounding the portions whose luminance values differ from the other portions. Next, the target extraction unit 512 executes boundary extraction processing (edge extraction processing) for extracting the boundaries in the measurement image 202, and labels each set of connected boundaries one by one.
  • When an outer boundary surrounding an inner boundary exists, that is, when a target having multiple (nested) boundaries is extracted, the target extraction unit 512 executes boundary removal processing to remove the unnecessary boundaries. Specifically, the average luminance value inside each boundary is measured, the boundary with the lowest average interior luminance value is taken as the boundary of the target in the measurement image 202, and the remaining boundaries are removed.
  • For example, the target 202a in the measurement image 202 shown in FIG. 7 has an outer boundary 202b, an intermediate boundary 202c, and an inner boundary 202d, as indicated by the broken lines in FIG. 7. The target extraction unit 512 therefore first calculates the average luminance values of the region inside the boundary 202b, the region inside the boundary 202c, and the region inside the boundary 202d. Each average luminance value is taken over the entire region inside the respective boundary; for example, when obtaining the average luminance value of the region inside the boundary 202c, the luminance values of the pixels inside the boundary 202d are included in the calculation.
  • In this example, the average luminance value of the region inside the intermediate boundary 202c is the lowest, so the boundary of the target 202a becomes the intermediate boundary 202c, and the remaining boundaries 202b and 202d are removed. If the average luminance value of the region inside the intermediate boundary 202c and that of the region inside the inner boundary 202d were the same, the intermediate boundary 202c, which lies outside the inner boundary 202d, would become the boundary of the target 202a, and the inner boundary 202d would be removed.
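  • A sketch of this multiple-boundary removal, using OpenCV contour retrieval as a stand-in for the boundary labeling (the actual edge-extraction method is not specified in the text, and the tie-breaking rule for equal averages is omitted for brevity):

```python
import numpy as np
import cv2

def keep_darkest_boundary(gray, edge_img):
    """Among the nested boundaries of one target, keep the boundary whose
    interior has the lowest average luminance in the measurement image.

    gray     : the (smoothed) measurement image as a 2-D array.
    edge_img : binary 8-bit edge image containing the nested boundaries of
               that target (e.g. 202b, 202c, 202d in the example above).
    """
    contours, _ = cv2.findContours(edge_img, cv2.RETR_LIST, cv2.CHAIN_APPROX_SIMPLE)
    best_contour, best_mean = None, None
    for cnt in contours:
        mask = np.zeros_like(gray, dtype=np.uint8)
        cv2.drawContours(mask, [cnt], -1, 255, thickness=-1)  # fill the interior
        mean_val = gray[mask > 0].mean()  # average over the whole interior region
        if best_mean is None or mean_val < best_mean:
            best_contour, best_mean = cnt, mean_val
    return best_contour  # the remaining boundaries are discarded
```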
  • When extracting the targets whose luminance values in the difference images 204 and 205 are positive, the target extraction unit 512 determines whether or not the light amount of each pixel exceeds a preset threshold value and, based on the determination result, executes binarization extraction processing that binarizes the luminance value of each pixel into a white value or a black value.
  • The target extraction unit 512 then labels, one by one, each set of pixels with white luminance values that are connected in pixel units.
  • The target extraction unit 512 uses a technique called ellipse fitting, described in JP 2005-293299 A, to exclude from the targets any labeled pixel set that is not considered to be a track or a foreign object.
  • Specifically, the target extraction unit 512 creates an approximate ellipse that approximates the outline of a pixel set, based on the coordinates of the pixels forming the boundary of the labeled pixel set.
  • The target extraction unit 512 then measures, for each pixel coordinate on the boundary, the shortest distance to the outer periphery (outline) of the approximate ellipse in its vicinity; for example, a pixel is determined to be valid when this shortest distance is within a preset threshold value.
  • When the number of pixels determined to be valid reaches a predetermined threshold, the target extraction unit 512 determines that the degree of coincidence with a track or a foreign object is high, and extracts the pixel set as a target.
  • When the number of pixels determined to be valid does not reach the predetermined threshold, the degree of coincidence is low, and the target extraction unit 512 excludes the pixel set from the targets.
  • For example, the portions surrounded by the solid lines in FIGS. 8(A) and 8(B) have degrees of coincidence exceeding the threshold value and are extracted as targets whose luminance values in the difference images 204 and 205 are positive.
  • The portions surrounded by the dotted lines in FIGS. 8(A) and 8(B) have degrees of coincidence equal to or less than the threshold value and are excluded from the targets.
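  • A sketch of such a coincidence check, with the point-to-ellipse distance approximated by sampling points along the fitted outline (all thresholds and the sampling approach are assumptions, not values from the patent or the cited document):

```python
import numpy as np
import cv2

def matches_ellipse(boundary_pts, dist_thresh=2.0, valid_ratio=0.8, samples=360):
    """Fit an ellipse to the boundary of a labeled pixel set and count how
    many boundary pixels lie close to the fitted outline.

    boundary_pts : (N, 2) array of (x, y) boundary coordinates.
    """
    if len(boundary_pts) < 5:          # cv2.fitEllipse needs at least 5 points
        return False
    (cx, cy), (w, h), angle = cv2.fitEllipse(boundary_pts.astype(np.float32))

    # Sample points along the fitted ellipse outline.
    t = np.linspace(0.0, 2.0 * np.pi, samples, endpoint=False)
    rad = np.deg2rad(angle)
    ex = cx + (w / 2) * np.cos(t) * np.cos(rad) - (h / 2) * np.sin(t) * np.sin(rad)
    ey = cy + (w / 2) * np.cos(t) * np.sin(rad) + (h / 2) * np.sin(t) * np.cos(rad)
    ellipse_pts = np.stack([ex, ey], axis=1)

    # Shortest distance from each boundary pixel to the sampled outline.
    d = np.linalg.norm(boundary_pts[:, None, :] - ellipse_pts[None, :, :], axis=2)
    valid = (d.min(axis=1) <= dist_thresh).sum()
    return valid >= valid_ratio * len(boundary_pts)
```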
  • The foreign matter removing unit 513 removes, from the measurement image 202, foreign matter attached to the imaging surface.
  • The foreign matter removing unit 513 includes a convex detection unit 513a.
  • When the foreign matter 1a shown in FIGS. 9(A) and 9(E) is attached to the imaging surface, the collected parallel light striking the foreign matter 1a is refracted, as indicated by the solid lines in FIGS. 9(A) and 9(E), and a focal point is formed above the imaging surface. That is, the foreign matter 1a acts like a convex lens.
  • When the focal position of the objective lens 31 is moved to the separation position, the focal position of the objective lens 31 and the focal point of the transmitted light at the portion corresponding to the foreign matter 1a come close to each other, as indicated by the broken line in FIG. 9(A), and the amount of light increases. For this reason, in the separated image 201a shown in FIG. 9(B), the luminance value of the portion 201b corresponding to the foreign matter 1a is larger than the luminance value of the peripheral portion (closer to white).
  • When the focal position of the objective lens 31 is moved to the approach position, the focal position of the objective lens 31 and the focal point of the transmitted light at the portion corresponding to the foreign matter 1a are far apart, as indicated by the broken line in FIG. 9(E), and the amount of light decreases. Therefore, in the approach image 203a shown in FIG. 9(F), the luminance value of the portion 203b corresponding to the foreign matter 1a is smaller than the luminance value of the peripheral portion (closer to black).
  • The foreign matter removing unit 513 detects the convex portions protruding from the imaging surface with the convex detection unit 513a described below, and removes the detected convex portions as foreign matter 1a.
  • The convex detection unit 513a detects, as convex portions protruding from the imaging surface, the portions 201b and 203b whose luminance values in the separated image 201 are larger than the luminance value of the peripheral portion and whose luminance values in the approach image 203 are smaller than the luminance value of the peripheral portion. For example, when detecting convex portions from the separation-approach difference images 205 and 205a shown in FIGS. 6(H) and 8(B), the convex detection unit 513a detects all the targets with positive luminance values in the separation-approach difference images 205 and 205a (removal Obj) as convex portions.
  • The track extraction unit 514 extracts particle tracks formed as recesses in the imaging surface.
  • The track extraction unit 514 includes a concave detection unit 514a and a non-extraction-target removal unit 514b.
  • When the particle track 1b shown in FIGS. 9(C) and 9(G) is formed in the imaging surface, the parallel light striking the track 1b is refracted and scattered, as indicated by the solid lines in FIGS. 9(C) and 9(G), and a focal point is formed below the imaging surface. That is, the particle track 1b acts like a concave lens.
  • When the focal position of the objective lens 31 is moved to the separation position, the focal position of the objective lens 31 and the focal point of the transmitted light at the portion corresponding to the particle track 1b are far apart, as indicated by the broken line in FIG. 9(C), and the amount of light decreases. For this reason, in the separated image 201c shown in FIG. 9(D), the luminance value of the portion 201d corresponding to the particle track 1b is smaller than the luminance value of the peripheral portion.
  • When the focal position of the objective lens 31 is moved to the approach position, the focal position of the objective lens 31 and the focal point of the transmitted light at the portion corresponding to the particle track 1b come close to each other, as indicated by the broken line in FIG. 9(G), and the amount of light increases. Therefore, in the approach image 203c, the luminance value of the portion 203d corresponding to the particle track 1b is larger than the luminance value of the peripheral portion.
  • The track extraction unit 514 detects the recesses in the imaging surface with the concave detection unit 514a described below, and extracts the detected recesses as particle tracks 1b. Further, since the purpose of this embodiment is to extract the particle tracks 1b, the track extraction unit 514 uses the non-extraction-target removal unit 514b, described below, to discriminate targets in the measurement image 202 that differ from particle tracks 1b, based on their boundary length (edge length), area, and the like, and to remove them from the extraction targets.
  • The concave detection unit 514a detects, as concave portions recessed from the imaging surface, the portions 203d and 201d whose luminance values in the approach image 203c are larger than the luminance value of the peripheral portion and whose luminance values in the separated image 201c are smaller than the luminance value of the peripheral portion. For example, when detecting concave portions from the approach-separation difference images 204 and 204a shown in FIGS. 6(G) and 8(A), the concave detection unit 514a detects all the targets with positive luminance values in the approach-separation difference images 204 and 204a (extraction Obj) as concave portions.
  • The non-extraction-target removal unit 514b removes, as non-extraction targets, targets in the measurement image 202 that differ from particle tracks 1b.
  • Specifically, the non-extraction-target removal unit 514b removes targets in the measurement image 202 whose boundary length is shorter than a preset boundary threshold value LL. The non-extraction-target removal unit 514b also removes targets in the measurement image 202 whose area is smaller than a preset area minimum value SL or larger than a preset area maximum value SH.
  • The area of a target in the measurement image 202 is obtained as the area of the ellipse given by the ellipse fitting process, which obtains an ellipse approximating the outline of the target using a predetermined number of pixels extracted from its boundary. Further, the non-extraction-target removal unit 514b excludes targets in the measurement image 202 for which the value obtained by dividing the minor axis of the obtained ellipse by its major axis is smaller than a preset ellipse threshold value OD.
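  • A sketch of this screening step, with the threshold names (LL, SL, SH, OD) taken from the text and all numeric values left as caller-supplied parameters:

```python
import math

def is_track_candidate(boundary_len, ellipse_w, ellipse_h, LL, SL, SH, OD):
    """Return True if a target in the measurement image survives the
    boundary-length, area, and axis-ratio screening; False if it should be
    removed as a non-extraction target (e.g. foreign matter)."""
    if boundary_len < LL:                            # boundary too short
        return False
    major, minor = max(ellipse_w, ellipse_h), min(ellipse_w, ellipse_h)
    area = math.pi * (major / 2.0) * (minor / 2.0)   # area of the fitted ellipse
    if area < SL or area > SH:                       # too small or too large
        return False
    if minor / major < OD:                           # too elongated
        return False
    return True
```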
  • In order not to miss particle tracks that should be extracted (to prevent counting omissions), the track extraction unit 514 of the present embodiment extracts tracks larger than a predetermined size in the measurement image 202 unconditionally, without collation against the difference images 204 and 205.
  • Specifically, among the targets in the measurement image 202 whose area is not less than the area minimum value SL and not more than the area maximum value SH, the track extraction unit 514 extracts those whose area is larger than a preset area threshold value SE as particle tracks without matching them against the targets with positive luminance values in the approach-separation difference image 204 (unconditional extraction).
  • The track extraction unit 514 of the present embodiment obtains the centroid coordinates of each target in the measurement image 202, obtains the distance between these coordinates and the centroid coordinates of each target with a positive luminance value in the approach-separation difference image 204, and stores this value as a track distance.
  • Similarly, the track extraction unit 514 obtains the distance between the centroid coordinates of the target in the measurement image 202 and the centroid coordinates of each target with a positive luminance value in the separation-approach difference image 205,
  • and stores this value as a foreign-object distance.
  • When at least one of the obtained track distances is shorter than a preset distance threshold value GD and all of the obtained foreign-object distances are equal to or greater than the distance threshold value GD,
  • the track extraction unit 514 extracts the target in the measurement image 202 as a particle track.
  • The distance threshold value GD is set to a value that is larger than any of the obtained track distances and smaller than any of the obtained foreign-object distances.
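  • A sketch of this centroid-distance test, assuming the centroids of the measurement-image target and of the positive targets in the two difference images have already been computed:

```python
import numpy as np

def is_particle_track(target_centroid, track_centroids, foreign_centroids, GD):
    """target_centroid   : (x, y) centroid of a target in the measurement image.
    track_centroids   : centroids of positive targets in the Near-Far image 204.
    foreign_centroids : centroids of positive targets in the Far-Near image 205.
    GD                : the distance threshold value described in the text.
    """
    def dists(centroids):
        if len(centroids) == 0:
            return np.array([np.inf])
        return np.linalg.norm(np.asarray(centroids) - np.asarray(target_centroid), axis=1)

    track_nearby = (dists(track_centroids) < GD).any()     # a recess is nearby
    foreign_nearby = (dists(foreign_centroids) < GD).any() # a protrusion is nearby
    return track_nearby and not foreign_nearby
```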
  • For example, the track extraction unit 514 obtains the centroid coordinates COD1202a of the target 1202a in the measurement image.
  • The track extraction unit 514 then obtains, for example, the centroid coordinates COD1202b of the target 1202b as the centroid coordinates of a target with a positive luminance value in FIG. 8(A), which is the approach-separation difference image, obtains the distance between COD1202a and COD1202b, and sets the obtained distance as a track distance GD1.
  • Likewise, the track extraction unit 514 obtains, for example, the centroid coordinates COD1202c of the target 1202c as the centroid coordinates of a target with a positive luminance value in FIG. 8(B), which is the separation-approach difference image, obtains the distance between COD1202a and COD1202c,
  • and sets the obtained distance as a foreign-object distance GD2.
  • The centroid coordinates COD1202a of the target 1202a in the measurement image and the centroid coordinates COD1202b of the target 1202b in the approach-separation difference image coincide if there is no measurement error.
  • The track distance GD1 is therefore very small and is not more than the distance threshold value GD, so it is determined that a target closer than the distance threshold value GD exists in the approach-separation difference image.
  • In FIG. 8(B), which is the separation-approach difference image,
  • the target corresponding to the target 1202a has been excluded from the targets, and the closest remaining target is 1202c.
  • The foreign-object distance GD2 is larger than the track distance GD1 and larger than the distance threshold value GD, so it is determined that no centroid within the distance threshold value GD exists in the separation-approach difference image.
  • Consequently, the track extraction unit 514 determines that, for the target 1202a in the measurement image, at least one of the obtained track distances is shorter than the preset distance threshold value GD and all of the obtained foreign-object distances are equal to or greater than the distance threshold value GD, and extracts the target 1202a as a particle track.
  • In the measurement image 202e, the portions indicated by solid lines are extracted as particle tracks, and the portions indicated by dotted lines are excluded as foreign matter or the like.
  • For example, the excluded portion 202f indicated by the dotted line at the center of the measurement image 202e is detected as a concave portion or a convex portion,
  • but is excluded because a foreign-object distance smaller than the distance threshold value GD exists for it.
  • The sample analysis process starts when the sample 1 is supported on the upper surface of the sample stage 22 and the imaging region of the sample 1 has been set.
  • First, the arithmetic processing unit 51 moves the XY stage 21, via the moving unit 2, to the area of the sample 1 where imaging is to start (step S101).
  • The arithmetic processing unit 51 detects the imaging surface by autofocus processing and moves the focal position of the objective lens 31 to the reference position via the Z-axis control unit 36 (step S102).
  • The arithmetic processing unit 51 moves the focal position of the objective lens 31 to the separation position via the Z-axis control unit 36 and captures the separated image 201 with the imaging unit 123 (step S103).
  • The arithmetic processing unit 51 moves the focal position of the objective lens 31 to the measurement position via the Z-axis control unit 36 and captures the measurement image 202 with the imaging unit 123 (step S104).
  • The arithmetic processing unit 51 moves the focal position of the objective lens 31 to the approach position via the Z-axis control unit 36 and captures the approach image 203 with the imaging unit 123 (step S105).
  • The arithmetic processing unit 51 then executes the target extraction process for the measurement image shown in FIG. 11 (step S106), the target extraction process for each difference image shown in FIG. 12 (step S107),
  • and the track extraction process shown in FIG. 13 (step S108), each described later.
  • The arithmetic processing unit 51 determines whether or not all areas have been imaged (step S109); when they have not (step S109; No), it moves the XY stage 21 to the next area via the moving unit 2 (step S110)
  • and repeats the processes of steps S102 to S108. When all areas have been imaged (step S109; Yes), the arithmetic processing unit 51 ends the sample analysis process.
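  • The per-area flow of steps S101 to S110 could be summarized by the following driver-loop sketch; all hardware-facing calls (move_to, autofocus, at, capture) and the process object are hypothetical placeholders, not part of the patent:

```python
def analyze_sample(areas, stage, focus, camera, process):
    """High-level sketch of the per-area loop (steps S101-S110).

    areas : iterable of (x, y) stage coordinates covering the imaging region.
    stage, focus, camera, process : hypothetical wrappers for the XY stage,
    the Z-axis focus control, the imaging unit, and the image-processing steps.
    """
    tracks = []
    for x, y in areas:
        stage.move_to(x, y)                           # S101 / S110: move to the area
        focus.autofocus()                             # S102: focal position -> reference position
        far = camera.capture(focus.at("separation"))  # S103: separated (Far) image
        msr = camera.capture(focus.at("measurement")) # S104: measurement image
        near = camera.capture(focus.at("approach"))   # S105: approach (Near) image
        msr_targets = process.extract_measurement_targets(msr)           # S106
        diff_targets = process.extract_difference_targets(near, far)     # S107
        tracks.extend(process.extract_tracks(msr_targets, diff_targets)) # S108
    return tracks                                     # S109: ends when all areas are imaged
```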
  • In the target extraction process for the measurement image, the target extraction unit 512 performs smoothing processing (step S201) and boundary extraction processing (step S202) on the captured measurement image 202,
  • and labels, one by one, the boundaries that are connected in pixel units (step S203).
  • The target extraction unit 512 determines whether any of the labeled boundaries are multiple (nested) boundaries (step S204); when they are (step S204; Yes),
  • it removes the boundaries other than the boundary whose interior region has the lowest average luminance value (step S205) and repeats step S204.
  • The target extraction unit 512 ends the target extraction process for the measurement image when there are no multiple boundaries or when all multiple boundaries have been removed (step S204; No).
  • In the target extraction process for each difference image, the difference acquisition unit 511 performs smoothing processing on the separated image 201 and the approach image 203 (step S301), and then acquires the approach-separation difference image 204 and the separation-approach difference image 205 (step S302).
  • The target extraction unit 512 extracts and labels the targets with positive luminance values in the difference images 204 and 205 (step S303).
  • The target extraction unit 512 removes, from among the labeled targets, those whose degree of coincidence with a track or a foreign object is equal to or less than the threshold value (step S304).
  • The target extraction unit 512 acquires the centroid coordinates of the remaining labeled targets (step S305) and ends the target extraction process for each difference image.
  • In the track extraction process, the track extraction unit 514 first selects one target with a labeled boundary in the measurement image 202 (step S401). Next, the non-extraction-target removal unit 514b determines whether or not the boundary length is shorter than the boundary threshold value LL (step S402). If it is shorter (step S402; Yes), the selected target is removed as foreign matter (step S410). If the boundary length is equal to or greater than the boundary threshold value LL (step S402; No), the non-extraction-target removal unit 514b obtains an ellipse approximating the outline of the target by the ellipse fitting process (step S403).
  • The non-extraction-target removal unit 514b determines whether or not the area of the ellipse obtained in step S403 is smaller than the area minimum value SL (step S404). If it is smaller (step S404; Yes), the selected target is removed as foreign matter (step S410).
  • If the area of the ellipse is equal to or greater than the area minimum value SL (step S404; No), the non-extraction-target removal unit 514b determines whether or not the area of the ellipse is larger than the area maximum value SH (step S405). If it is larger (step S405; Yes), the non-extraction-target removal unit 514b removes the selected target as foreign matter (step S410).
  • If the area of the ellipse is equal to or less than the area maximum value SH (step S405; No), it is determined whether or not the value obtained by dividing the minor axis of the ellipse by its major axis is smaller than the ellipse threshold value OD (step S406).
  • If the divided value is smaller than the ellipse threshold value OD (step S406; Yes), the non-extraction-target removal unit 514b removes the selected target as foreign matter (step S410).
  • Otherwise, the track extraction unit 514 determines whether or not the area of the ellipse is larger than the area threshold value SE (step S407). If it is larger (step S407; Yes), the track extraction unit 514 extracts the selected target as a particle track (step S411).
  • If the area of the ellipse is equal to or less than the area threshold value SE (step S407; No), the track extraction unit 514 obtains the track distances, which are the distances between the centroid coordinates of the target in the measurement image 202 and the centroid coordinates of the labeled targets with positive luminance values in the approach-separation difference image 204, and the foreign matter removing unit 513 obtains the foreign-object distances, which are the distances between the centroid coordinates of the target in the measurement image 202 and the centroid coordinates of the labeled targets with positive luminance values in the separation-approach difference image 205 (step S408).
  • The track extraction unit 514 determines whether there is a track distance shorter than the distance threshold value GD and no foreign-object distance shorter than the distance threshold value GD (step S409).
  • If this condition is not satisfied, the foreign matter removing unit 513 removes the selected target as foreign matter (step S410).
  • If the condition is satisfied, the track extraction unit 514 extracts the selected target as a particle track (step S411).
  • The track extraction unit 514 then determines whether or not all targets in the measurement image 202 have been selected (step S412). If not (step S412; No), the process returns to step S401 and the next target is selected; if all have been selected (step S412; Yes), the track extraction process ends.
  • As described above, in the sample analyzer 100 of the present embodiment, by moving the focal position of the objective lens 31 in the Z-axis direction, it is possible to detect the convex portions protruding from the imaging surface (portions acting as convex lenses) and the concave portions recessed from the imaging surface (portions acting as concave lenses).
  • The sample analyzer 100 of the present embodiment can therefore automatically detect and remove the foreign matter 1a attached to the imaging surface when analyzing the track detection solid, and can extract only the particle tracks 1b recessed from the imaging surface.
  • In the sample analyzer 100 of the present embodiment, only parallel light (collected parallel light) is emitted from the condenser lens 23 onto the sample 1 to analyze the sample;
  • emitting collected diffused light onto the sample 1 is unnecessary.
  • The sample analyzer 100 of the present embodiment can therefore omit the configuration for making collected diffused light incident on the film (a diffuser plate and a second light source unit), and the manufacturing cost of a device that detects convex portions and concave portions can be reduced.
  • Further, since the separation position and the approach position are adjusted so that the contrast (for example, the difference in luminance values) between the convex and concave portions and the other portions is increased, it is possible to accurately detect the convex portions (for example, foreign matter 1a) and the concave portions (for example, particle tracks 1b).
  • Furthermore, targets with positive luminance values are extracted from the difference images 204 and 205 acquired from the separated image 201 and the approach image 203,
  • so that the contrast between the concave and convex portions (the targets with positive luminance values) and the other portions (the black portions other than the targets) is further increased.
  • The sample analyzer 100 of the present embodiment can therefore detect the convex portions (for example, foreign matter 1a) and the concave portions (for example, particle tracks 1b) with higher accuracy.
  • In addition, the targets in the so-called just-focused measurement image 202 are extracted, and those located at the positions of targets with positive luminance values in the difference images 204 and 205 are detected as convex portions or concave portions, so that the convex and concave portions can be detected with still higher accuracy.
  • In the sample analyzer 100 of the present embodiment, when the boundary of a target in the measurement image 202 is multiple (nested), the boundaries other than the boundary whose interior region has the lowest average luminance value are removed.
  • The sample analyzer 100 of the present embodiment can therefore prevent erroneous detection in which the same convex or concave portion in the measurement image 202 is redundantly detected as a plurality of convex or concave portions.
  • As a result, the sample analyzer 100 of the present embodiment can accurately remove the foreign matter 1a and the like from the measurement image 202 and extract only the particle tracks 1b with high accuracy.
  • the control computer 150 of the sample analyzer 100 includes a control unit 151, a main storage unit 152, an external storage unit 153, an operation unit 154, a display unit 52, an input / output unit 156, and a transmission / reception unit 157.
  • the main storage unit 152, the external storage unit 153, the operation unit 154, the display unit 52, the input / output unit 156, and the transmission / reception unit 157 are all connected to the control unit 151 via the internal bus 160.
  • The control unit 151 includes a CPU (Central Processing Unit) and the like and, in accordance with a control program 158 stored in the external storage unit 153, executes the processes of the difference acquisition unit 511, the target extraction unit 512, the foreign matter removal unit 513, and the track extraction unit 514 of the sample analyzer 100.
  • the arithmetic processing unit 51 is configured by the control unit 151.
  • the main storage unit 152 includes a RAM (Random-Access Memory) and the like, loads the control program 158 stored in the external storage unit 153, and functions as a work area of the control unit 151.
  • the image RAM 124 is configured by the main storage unit 152.
  • The external storage unit 153 includes a non-volatile memory such as a flash memory, a hard disk, a DVD-RAM (Digital Versatile Disc Random-Access Memory), or a DVD-RW (Digital Versatile Disc ReWritable), and stores in advance the program that the control unit 151 executes to control the processing of the sample analyzer 100. The external storage unit 153 also supplies the stored data to the control unit 151 in accordance with instructions from the control unit 151, and stores data supplied from the control unit 151.
  • The image recording device 53 is constituted by the external storage unit 153.
  • the operation unit 154 includes a keyboard, a pointing device such as a mouse, and an interface device that connects the keyboard and the pointing device to the internal bus 160.
  • The display unit 52 includes a CRT (Cathode Ray Tube) or an LCD (Liquid Crystal Display), and displays the images 201 to 205, the operation screen of the sample analyzer 100, and the like.
  • the input / output unit 156 includes a serial interface or a parallel interface.
  • the input / output unit 156 is connected to the microscope unit 110.
  • The transmission/reception unit 157 includes a network termination device or a wireless communication device connected to a network, and a serial interface or a LAN (Local Area Network) interface connected thereto.
  • The control program 158 executes the processing of the difference acquisition unit 511, the target extraction unit 512, the foreign matter removal unit 513, and the track extraction unit 514 of the sample analyzer 100, using the control unit 151, the main storage unit 152, the external storage unit 153, the operation unit 154, the display unit 52, the input/output unit 156, the transmission/reception unit 157, and the like as resources.
  • in the embodiment, the sample 1 is a track detection solid and the particle tracks 1b are extracted from it; however, the sample 1 may instead be a film as described in Patent Documents 1 and 2, and the device may be configured to detect scratches (recesses) in the film and dust (protrusions) on it.
  • in the embodiment, the image sensor 4 is used in the imaging unit 123; however, a CMOS (Complementary Metal-Oxide Semiconductor) sensor, a line sensor, or the like may be used instead.
  • in the embodiment, one separation position, one measurement position, and one approach position are set; however, two or more of each may be set. In that case, the measurement image that is most in focus may be selected, the separation image in which the target has the highest luminance value may be selected from among the plurality of separation images, and the approach image in which the target with positive luminance values has the maximum luminance value may be selected from among the plurality of approach images.
  • in the embodiment, boundary extraction processing is executed to extract the boundaries in the measurement image 202 and the boundaries connected in pixel units are labeled one by one; however, the boundaries of the targets may instead be labeled after the measurement image 202 has been converted into a monochrome image 202g by binarization processing.
  • in the embodiment, a target that is extracted as having positive luminance values in the approach-separation difference image 204 but does not satisfy the other conditions is not extracted as a particle track 1b; however, the configuration is not limited to this. For example, the corresponding portion of the measurement image 202 may be subjected to image processing again (near re-image processing), and whether that portion is a particle track 1b may be determined from the result.
  • in the embodiment, for each target in the selected measurement image 202, the distances to all of the labeled targets with positive luminance values in the difference images 204 and 205 are computed, and it is determined whether there is a track distance smaller than the distance threshold GD and no foreign-matter distance smaller than the distance threshold GD; however, the present invention is not limited to this. For example, it may be determined whether the track distance is smaller than the distance threshold GD while following the order of the label numbers. In that case, depending on the label number of the target in each of the difference images 204 and 205 that is determined to be separated from the target in the measurement image 202 by a track distance smaller than the distance threshold GD, the computational load can be made lower than in the embodiment.
  • the light does not necessarily have to be parallel light; any visible light with which targets can be extracted using the approach-separation difference image and the separation-approach difference image may be used.
  • when the functions of the sample analyzer 100 are realized by sharing between an OS (operating system) and an application program, or by cooperation between the OS and an application program (the sample analysis program), only the application program portion may be stored in a recording medium or memory.
  • the application program may also be posted on a bulletin board system (BBS) on a communication network and distributed via the network. The application program may then be installed on a computer and started, and the above-described processing may be executed by running it under the control of the OS in the same manner as other application programs.

Abstract

A concavity detection unit (514a) detects, as a recess recessed from an imaging surface, a portion (203d, 201d) at which the luminance value of an approach image (203c), captured with the focal position of an objective lens (31) moved to an approach position, is large and the luminance value of a separation image (201c), captured with the focal position moved to a separation position, is small. A convexity detection unit (513a) detects, as a protrusion projecting from the imaging surface, a portion (201b, 203b) at which the luminance value of the separation image (201a) is large and the luminance value of the approach image (203a) is small.

Description

Sample analysis device, sample analysis method, sample analysis program, and particle track analysis device
 The present invention relates to a sample analysis device, a sample analysis method, a sample analysis program, and a particle track analysis device.
 Techniques for detecting scratches on a sample such as a film, or dust adhering to it, with a microscope, a scanner, a sensor, or the like are disclosed in, for example, Patent Documents 1 and 2.
Patent Document 1: JP 2003-232746 A
Patent Document 2: Japanese Patent Laid-Open No. 11-215319
 The techniques described in Patent Documents 1 and 2 can detect that a sample has either a recess (a scratch on the sample) or a protrusion (dust adhering to the sample), but they cannot distinguish between the recesses and the protrusions of the sample.
 The present invention has been made in view of the above circumstances, and an object thereof is to detect recesses and protrusions of a sample and to distinguish between them.
 In order to achieve the above object, a sample analysis device according to a first aspect of the present invention includes:
 an emission unit that emits visible light;
 a support unit that is disposed on the optical axis of the visible light emitted by the emission unit and supports a sample through which the visible light is transmitted;
 an imaging unit that has an optical system and captures an image of the sample supported by the support unit;
 a focus moving unit that moves the focal position of the optical system along the optical axis of the visible light between an approach position that is closer to the emission unit than a reference position on an imaging surface, which is the surface of the sample on the imaging unit side, and a separation position that is farther from the emission unit than the reference position;
 concavity detection means for detecting, as a recess recessed from the imaging surface, a portion at which the luminance value of an approach image, which is an image of the sample captured by the imaging unit with the focus moving unit having moved the focal position to the approach position, is larger than the luminance value of a preset peripheral portion, and at which the luminance value of a separation image, which is an image of the sample captured by the imaging unit with the focus moving unit having moved the focal position to the separation position, is smaller than the luminance value of the peripheral portion; and
 convexity detection means for detecting, as a protrusion projecting from the imaging surface, a portion at which the luminance value of the separation image is larger than the luminance value of the peripheral portion and the luminance value of the approach image is smaller than the luminance value of the peripheral portion.
 The above sample analysis device may further include:
 difference acquisition means for acquiring an approach-separation difference image obtained by subtracting the luminance value of each corresponding pixel of the separation image from the luminance value of each pixel of the approach image, and a separation-approach difference image obtained by subtracting the luminance value of each corresponding pixel of the approach image from the luminance value of each pixel of the separation image; and
 closed region extraction means for extracting, from an image, a closed region that is a portion whose luminance value differs from that of the peripheral portion,
 wherein the closed region extraction means extracts a closed region in a measurement image, which is an image of the sample captured by the imaging unit at a measurement position to which the focus moving unit has moved the focal position along the optical axis by a preset measurement distance from the reference position, a closed region in which the luminance value in the approach-separation difference image is a positive value, and a closed region in which the luminance value in the separation-approach difference image is a positive value,
 the concavity detection means detects, as the recess, a closed region in the measurement image that is located at the position of a closed region in which the luminance value in the approach-separation difference image is a positive value, and
 the convexity detection means detects, as the protrusion, a closed region in the measurement image that is located at the position of a closed region in which the luminance value in the separation-approach difference image is a positive value.
 In the above sample analysis device, when the closed region extraction means extracts, as boundaries surrounding a portion of the measurement image whose luminance value differs from that of other portions, an inner boundary and an outer boundary surrounding the inner boundary, the boundary of the region having the lowest average luminance value inside it may be used as the boundary of the closed region in the measurement image.
 In order to achieve the above object, a sample analysis method according to a second aspect of the present invention includes:
 an imaging step of capturing an image of a sample with an imaging unit having an optical system while visible light passes through the sample disposed on an optical axis;
 a focus moving step of moving, via a focus moving unit, the focal position of the optical system along the optical axis of the visible light between an approach position that is closer to a light source than a reference position on an imaging surface, which is the surface of the sample on the imaging unit side, and a separation position that is farther from the light source than the reference position;
 a concavity detection step of detecting, as a recess recessed from the imaging surface, a portion at which the luminance value of an approach image, which is an image of the sample captured by the imaging unit with the focus moving unit having moved the focal position to the approach position, is larger than the luminance value of a preset peripheral portion, and at which the luminance value of a separation image, which is an image of the sample captured by the imaging unit with the focus moving unit having moved the focal position to the separation position, is smaller than the luminance value of the peripheral portion; and
 a convexity detection step of detecting, as a protrusion projecting from the imaging surface, a portion at which the luminance value of the separation image is larger than the luminance value of the peripheral portion and the luminance value of the approach image is smaller than the luminance value of the peripheral portion.
 In order to achieve the above object, a sample analysis program according to a third aspect of the present invention causes a computer to function as:
 imaging means for capturing an image of a sample with an imaging unit having an optical system while visible light passes through the sample disposed on an optical axis;
 focus moving means for moving the focal position of the optical system along the optical axis of the visible light between an approach position that is closer to a light source than a reference position on an imaging surface, which is the surface of the sample on the imaging unit side, and a separation position that is farther from the light source than the reference position;
 concavity detection means for detecting, as a recess recessed from the imaging surface, a portion at which the luminance value of an approach image, which is an image of the sample captured by the imaging unit with the focus moving means having moved the focal position to the approach position, is larger than the luminance value of a preset peripheral portion, and at which the luminance value of a separation image, which is an image of the sample captured by the imaging unit with the focus moving means having moved the focal position to the separation position, is smaller than the luminance value of the peripheral portion; and
 convexity detection means for detecting, as a protrusion projecting from the imaging surface, a portion at which the luminance value of the separation image is larger than the luminance value of the peripheral portion and the luminance value of the approach image is smaller than the luminance value of the peripheral portion.
 In order to achieve the above object, a particle track analysis device according to a fourth aspect of the present invention includes:
 an emission unit that emits visible light;
 a support unit that is disposed on the optical axis of the visible light emitted by the emission unit and supports a track detection solid through which the visible light is transmitted;
 an imaging unit that has an optical system and captures an image of the track detection solid as a sample supported by the support unit;
 a focus moving unit that moves the focal position of the optical system along the optical axis of the visible light between an approach position that is closer to the emission unit than a reference position on an imaging surface, which is the surface of the track detection solid on the imaging unit side, and a separation position that is farther from the emission unit than the reference position;
 foreign matter removal means for removing from the image of the track detection solid, as foreign matter adhering to the imaging surface and projecting from it, a portion at which the luminance value of a separation image, which is an image of the track detection solid captured by the imaging unit with the focus moving unit having moved the focal position to the separation position, is larger than the luminance value of a preset peripheral portion, and at which the luminance value of an approach image, which is an image of the track detection solid captured by the imaging unit with the focus moving unit having moved the focal position to the approach position, is smaller than the luminance value of the peripheral portion; and
 track extraction means for extracting from the image of the track detection solid, as a particle track formed as a recess in the imaging surface, a portion at which the luminance value of the approach image is larger than the luminance value of the peripheral portion and the luminance value of the separation image is smaller than the luminance value of the peripheral portion.
 According to the present invention, recesses and protrusions of a sample can be detected, and the recesses and the protrusions can be distinguished from each other.
FIG. 1 is a front view of the sample analysis device according to the present embodiment.
FIG. 2 is a side view of the microscope unit.
FIG. 3 is a diagram showing the relationship between the field of view of the microscope and the imaging range of the image sensor.
FIG. 4 is a diagram showing a configuration example of the Z-axis control unit.
FIG. 5 is a functional block diagram of the sample analysis device.
FIG. 6(A) is a diagram showing the state in which the focal position of the objective lens has moved to the separation position, (B) the state in which it has moved to the measurement position, (C) the state in which it has moved to the approach position, (D) the separation image, (E) the measurement image, (F) the approach image, (G) the approach-separation difference image, and (H) the separation-approach difference image.
FIG. 7 is an explanatory diagram of a target having multiple boundaries.
FIG. 8(A) is a diagram showing the state in which targets with a high degree of match with tracks have been extracted from the approach-separation difference image, (B) the state in which targets with a high degree of match with foreign matter have been extracted from the separation-approach difference image, and (C) an image captured with the focal position moved to the measurement position.
FIG. 9(A) is a diagram showing a state in which foreign matter adheres to the imaging surface in the state of FIG. 6(A), (B) a separation image captured in the state of (A), (C) a state in which a particle track is formed in the imaging surface in the state of FIG. 6(A), (D) a separation image captured in the state of (C), (E) a state in which foreign matter adheres to the imaging surface in the state of FIG. 6(C), (F) an approach image captured in the state of (E), (G) a state in which a particle track is formed in the imaging surface in the state of FIG. 6(C), and (H) an approach image captured in the state of (G).
FIG. 10 is a flowchart of the sample analysis process.
FIG. 11 is a flowchart of the target extraction process for the measurement image.
FIG. 12 is a flowchart of the target extraction process for each difference image.
FIG. 13 is a flowchart of the track extraction process.
FIG. 14 is a diagram showing a binarized measurement image.
FIG. 15 is a block diagram showing an example of the hardware configuration of the control computer.
 Hereinafter, a sample analysis device (particle track analysis device) 100 according to an embodiment of the present invention will be described with reference to FIGS. 1 to 15.
 As shown in FIG. 1, the sample analysis device 100 includes a microscope unit 110 and a control computer 150.
 As shown in FIG. 2, the microscope unit 110 includes a moving unit 2 that movably supports the sample 1, a microscope 3 that magnifies an image of the sample 1, and an image sensor 4 that captures the image of the sample magnified by the microscope 3. The moving unit 2 and the microscope 3 are supported by an L-shaped gantry 7.
 The sample 1 is a track detection solid. The track detection solid is a plate made of organic plastic. When radiation passes through the track detection solid, the polymer bonds are damaged in the portion through which the radiation has passed. When this damaged portion is etched with a predetermined solution, minute recesses called etch pits are formed. An etch pit indicates the track of the radiation. Etch pits differ in shape (for example, circular or elliptical), diameter, and depth depending on the incident amount and incident direction of the radiation. Therefore, by observing the number and shape of the etch pits formed in the sample 1 with a microscope, information such as the incident amount and incident direction of the radiation can be obtained.
 Dust may adhere to the surface of the sample 1. When an image of the sample 1 with dust on it is captured, the etch pits and the dust in the captured image must be distinguished from each other.
 The moving unit 2 is installed on the horizontal portion of the L-shaped gantry 7. The moving unit 2 includes an X-Y stage 21, a sample stage (support unit) 22, a condenser lens 23, and an encoder 24.
 The sample stage 22 supports the sample 1. The sample stage 22 is installed on the X-Y stage 21.
 The X-Y stage 21 is driven by an X-axis motor 131 and a Y-axis motor 132 shown in FIG. 5, described later, and moves horizontally in the left-right and front-back directions (the X-axis direction and the Y-axis direction). As the X-Y stage 21 moves, the sample stage 22 and the sample 1 supported on it also move. The X-axis motor 131 and the Y-axis motor 132 are controlled by the control computer 150.
 The condenser lens 23 is installed under the sample stage 22.
 The encoder 24 supplies the control computer 150 with pulses corresponding to the amounts of movement of the X-Y stage 21 in the X-axis direction and the Y-axis direction. The encoder 24 includes an X-axis encoder 24X and a Y-axis encoder 24Y. Position correction (shift correction) in the X-axis direction and the Y-axis direction for the area images acquired by the image sensor 4, described later, is performed based on the outputs of the X-axis encoder 24X and the Y-axis encoder 24Y, respectively.
 The microscope 3 includes an objective lens (optical system) 31, an illumination unit 32 that illuminates the sample 1, a control imaging unit 33, a lens barrel 34, an eyepiece 35 for visual observation, and a Z-axis control unit (focus moving unit) 36.
 The objective lens 31 includes a plurality of lenses with different magnifications (for example, a 10x lens and a 20x lens). These lenses can be switched by rotating a revolver 37.
 The illumination unit 32 includes an epi-illumination unit 32a and a transmitted illumination unit 32b.
 The epi-illumination unit 32a bends the light emitted by its built-in light source so that it follows the optical axis of the microscope 3 and irradiates the sample 1 with it. In this case, the light reflected from the sample 1 enters the objective lens 31.
 The transmitted illumination unit 32b is built into the horizontal portion of the L-shaped gantry 7, under the sample stage 22. The transmitted illumination unit 32b irradiates the sample 1 from below with light emitted by its built-in light source or with light from an external light source introduced through an optical fiber 8. In this case, the light transmitted through the sample 1 enters the objective lens 31. In the present embodiment, the light emitted toward the sample 1 from the transmitted illumination unit 32b is collected by the condenser lens 23 and irradiates (is emitted onto) the sample 1 as parallel light. The light transmitted through the sample 1 is therefore parallel light.
 The transmitted illumination unit 32b and the condenser lens 23 constitute an emission unit 41, as shown in FIG. 2 and in FIG. 9 described later.
 The lens barrel 34 supports the eyepiece 35 for visual observation and the image sensor 4. A side portion of the lens barrel 34 is attached to the upright portion of the L-shaped gantry 7 via the Z-axis control unit 36.
 The Z-axis control unit 36 moves the position of the microscope 3 in the vertical direction (the Z-axis direction, the optical-axis direction). The Z-axis control unit 36 includes a slide plate 36a fixed to the gantry 7, a slide plate 36b fixed to the lens barrel 34, and a Z-axis motor 361.
 As shown in FIG. 4, one of the slide plates 36a and 36b carries the Z-axis motor 361, a pinion gear 363 fixed to the rotation shaft 362 of the Z-axis motor 361, and an operation knob 364 arranged coaxially with the rotation shaft 362.
 The other of the slide plates 36a and 36b carries a rack 365. The pinion gear 363 engages with the rack 365 to form a rack-and-pinion mechanism. When the Z-axis motor 361 or the operation knob 364 is rotated, the pinion gear 363 rotates and drives the rack 365. When the rack 365 is driven, the slide plate 36b slides vertically relative to the slide plate 36a fixed to the gantry 7, and the lens barrel 34 moves in the vertical direction (the Z-axis direction).
 The eyepiece 35 for visual observation tilts the optical axis of the light entering from the objective lens 31 with a prism, enabling visual observation of the sample 1.
 The image sensor 4 is housed in a case and is detachably attached to the tip of the lens barrel 34. The mount is, for example, a C mount or an F mount.
 The image sensor 4 includes CCD (Charge Coupled Device) elements arranged on the X-Y plane.
 As shown in FIG. 3, the image sensor 4 captures, within the image of the sample 1 captured in the field of view VF of the microscope 3, the region included in the area AR that is the imaging range of the image sensor 4. The image sensor 4 sequentially captures the sample 1, which is moved in the X-axis direction or the Y-axis direction by the moving unit 2, area AR by area AR, and transmits the image data of each area AR (area image data) to the control computer 150 via a connecting cord. That is, the image sensor 4 scans the image of the sample 1 magnified by the microscope 3 by capturing it while moving relative to the sample 1, and supplies the captured area image data to the control computer 150.
 The control computer 150 is, for example, a computer.
 As shown in FIG. 1, the control computer 150 includes an arithmetic processing unit 51, a display unit 52, and an image recording device 53.
 The arithmetic processing unit 51 sets the imaging region, controls the movement and position of the moving unit 2 in the X-axis and Y-axis directions, controls the movement of the microscope 3 in the Z-axis direction, controls the imaging by the image sensor 4, takes in the area image data supplied from the image sensor 4, creates a whole image of the imaging region based on the area image data, and so on.
 FIG. 5 shows a functional block diagram centered on the arithmetic processing unit 51.
 As shown in FIG. 5, the moving unit 2, the Z-axis control unit 36, the imaging unit 123, the image RAM 124, the image recording device 53, the data communication unit 125, and the like are connected to the arithmetic processing unit 51.
 The moving unit 2 includes an X-axis drive unit 121, a Y-axis drive unit 122, the X-axis encoder 24X, and the Y-axis encoder 24Y.
 The X-axis drive unit 121 drives the X-axis motor 131 under the control of the arithmetic processing unit 51 and moves the X-Y stage 21 in the X-axis direction.
 The Y-axis drive unit 122 drives the Y-axis motor 132 under the control of the arithmetic processing unit 51 and moves the X-Y stage 21 in the Y-axis direction, orthogonal to the X-axis direction.
 The X-axis encoder 24X is composed of a linear encoder or the like, and outputs a pulse train corresponding to the direction and distance of movement when the X-Y stage 21 moves in the X-axis direction. The arithmetic processing unit 51 determines the movement direction and movement distance of the X-Y stage 21 in the X-axis direction by evaluating the number of pulses, their phase, and so on.
 The Y-axis encoder 24Y is composed of a linear encoder or the like, and outputs a pulse train corresponding to the direction and distance of movement when the X-Y stage 21 moves in the Y-axis direction. The arithmetic processing unit 51 determines the movement direction and movement distance of the X-Y stage 21 in the Y-axis direction by evaluating the number of pulses, their phase, and so on.
 The Z-axis control unit 36 includes the Z-axis motor 361 described above, a Z-axis drive unit 366, and a Z-axis encoder 367. The Z-axis drive unit 366 drives the Z-axis motor 361 under the control of the arithmetic processing unit 51 and moves the microscope 3 in the Z-axis direction (the vertical direction). The Z-axis encoder 367 is composed of a linear encoder or the like, and outputs a pulse train corresponding to the direction and distance of movement when the microscope 3 moves in the Z-axis direction. The arithmetic processing unit 51 determines the movement direction and movement distance of the microscope 3 by evaluating the number of pulses, their phase, and so on.
 The Z-axis control unit 36 can detect the imaging surface of the sample 1 (the sample surface, the surface to be imaged), which is the surface on the side close to the objective lens 31, by detecting the contrast of the image acquired by the image sensor 4 and performing autofocus processing, and can move the focal position of the objective lens 31 of the imaging unit 123 to this imaging surface. In this specification, the focal position of the objective lens 31 in the state where it coincides with the imaging surface is referred to as the reference position.
 The Z-axis control unit 36 can also move the focal position of the objective lens 31, based on output signals from the arithmetic processing unit 51, between a separation position shown in FIG. 6(A), which is above the reference position (on the side closer to the objective lens) by a preset separation distance, a measurement position shown in FIG. 6(B), which is away from the reference position by a preset measurement distance, and an approach position shown in FIG. 6(C), which is below the reference position (on the side farther from the objective lens) by a preset approach distance. In the present embodiment, the focal position of the objective lens 31 in a state where it is focused on a portion recessed from the imaging surface of the sample 1 further than the approach position (a focal position with high sharpness, a just-in-focus position) is set as the measurement position.
 In the present embodiment, the separation position and the approach position are set based on the thickness of the sample 1, the expected particle tracks, the size of dust adhering to the sample 1, and the like.
 The imaging unit 123 includes the objective lens 31, the image sensor 4, and the like. Based on output signals from the arithmetic processing unit 51, the imaging unit 123 captures a plurality of images of the sample 1 with the focal position of the objective lens 31 differing between the images, and supplies them to the arithmetic processing unit 51. Specifically, the imaging unit 123 captures a separation image (Far image) 201, shown in FIG. 6(D), with the focal position set to the separation position; a measurement image (Msr: Measure image) 202, shown in FIG. 6(E), with the focal position set to the measurement position; and an approach image (Near image) 203, shown in FIG. 6(F), with the focal position set to the approach position, and supplies these images to the arithmetic processing unit 51.
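 The three-image acquisition sequence just described can be summarised by the following minimal Python sketch. move_focus_to and capture_frame are hypothetical stand-ins for the Z-axis control unit 36 and the imaging unit 123 (they are not part of any real API), and the z values are assumed to come from the separation, measurement, and approach distances discussed above.

    def acquire_image_set(z_far, z_measure, z_near, move_focus_to, capture_frame):
        """Capture the separation (Far), measurement (Measure) and approach (Near) images."""
        frames = {}
        for name, z in (("far", z_far), ("measure", z_measure), ("near", z_near)):
            move_focus_to(z)                # drive the focal position to the target height
            frames[name] = capture_frame()  # grab one frame of the sample at that focal position
        return frames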
 The image RAM 124 functions as a work area for the arithmetic processing unit 51.
 The image recording device 53 is composed of a mass storage device or the like and records the captured images.
 The data communication unit 125 transmits and receives various data to and from external devices.
 The arithmetic processing unit 51 includes a difference acquisition unit 511, a target extraction unit 512, a foreign matter removal unit 513, and a track extraction unit 514.
 The difference acquisition unit 511 acquires an approach-separation difference image (Near-Far image, difference image for concavity detection; see FIG. 6(G)) 204, in which the luminance value of each pixel is the value obtained by subtracting the luminance value of the corresponding pixel of the separation image 201 from the luminance value of that pixel of the approach image 203. The difference acquisition unit 511 also acquires a separation-approach difference image (Far-Near image, difference image for convexity detection; see FIG. 6(H)) 205, in which the luminance value of each pixel is the value obtained by subtracting the luminance value of the corresponding pixel of the approach image 203 from the luminance value of that pixel of the separation image 201. Before acquiring the approach-separation difference image 204 and the separation-approach difference image 205, the difference acquisition unit 511 of the present embodiment applies smoothing processing to the approach image 203 and the separation image 201 to smooth the contour lines in the images.
 Note that, for ease of understanding, the difference images 204 and 205 shown in FIGS. 6(G) and 6(H) display only the portions whose luminance values are positive (positive information) in a color other than black.
 The target extraction unit (closed region extraction unit) 512 extracts, from each image, portions whose luminance values differ from the rest of the image (portions whiter or blacker than their surroundings) as targets (Obj: objects, closed regions). The target extraction unit 512 extracts the targets in the measurement image 202 shown in FIG. 6(E), the targets with positive luminance values in the approach-separation difference image 204 shown in FIG. 6(G), and the targets with positive luminance values in the separation-approach difference image 205 shown in FIG. 6(H).
 When extracting the targets in the measurement image 202, the target extraction unit 512 first applies smoothing processing to the measurement image 202 to sharpen the boundaries (edges) surrounding the portions of the measurement image 202 whose luminance values differ from the rest of the image. Next, the target extraction unit 512 executes boundary extraction processing (edge extraction processing) to extract the boundaries in the measurement image 202, and labels each set of boundary pixels connected in pixel units one by one.
 In images of the same target captured at a plurality of different focal positions, refraction of light can cause only a single boundary to appear at one focal position while multiple boundaries (multiple circles) appear at another focal position. Alternatively, multiple boundaries may appear for the same target at all focal positions. When an outer boundary surrounding an inner boundary exists, that is, when a target having multiple boundaries (multiple circles) has been extracted, the target extraction unit 512 executes boundary removal processing to remove the unnecessary boundaries. Specifically, it measures the average luminance value (the mean of the luminance values) inside each boundary, takes the boundary whose interior has the lowest average luminance value as the boundary of the target in the measurement image 202, and removes the remaining boundaries.
 For example, the target 202a in the measurement image 202 shown in FIG. 7 has an outer boundary 202b, an intermediate boundary 202c, and an inner boundary 202d, as indicated by the broken lines in FIG. 7. The target extraction unit 512 therefore first calculates the average luminance values of the region inside the boundary 202b, the region inside the boundary 202c, and the region inside the boundary 202d. The average luminance value is computed over the entire region inside each boundary. Specifically, when the average luminance value of the region inside the boundary 202c is obtained, the luminance values of the region inside the boundary 202d are also included in the calculation. In the case of the target 202a, the region inside the intermediate boundary 202c has the lowest average luminance value, so the boundary of the target 202a becomes the intermediate boundary 202c and the remaining boundaries 202b and 202d are removed.
 If the average luminance value of the region inside the intermediate boundary 202c and that of the region inside the inner boundary 202d were the same, the intermediate boundary 202c, which lies outside the inner boundary 202d, would become the boundary of the target 202a and the inner boundary 202d would be removed.
 When extracting the targets with positive luminance values in the difference images 204 and 205, the target extraction unit 512 first determines whether the light quantity of each pixel in the measurement image 202 exceeds a preset threshold and, based on the result, executes binarization processing that converts the luminance value of each pixel into either a white value or a black value. Next, the target extraction unit 512 labels, one by one, each set of pixels with the white luminance value that are connected in pixel units. Since the purpose of the present embodiment is to remove foreign matter and extract particle tracks, the target extraction unit 512 then uses the elliptic fitting technique described in JP 2005-293299 A to exclude, from the targets, labeled pixel sets that cannot be regarded as tracks or foreign matter.
 Specifically, the target extraction unit 512 creates an approximate ellipse that approximates the outline of each labeled pixel set, based on the coordinates of the pixels forming the boundary of the set. The target extraction unit 512 then measures the shortest distance from each boundary pixel to the circumference (boundary) of the approximate ellipse near it and judges the pixel to be valid when, for example, this shortest distance is within a preset threshold. When the number of pixels judged valid reaches a predetermined threshold or more, the target extraction unit 512 judges that the degree of match with a track or foreign matter is high and extracts the pixel set as a target. When the number of valid pixels does not reach the predetermined threshold and the degree of match is judged to be low, the target extraction unit 512 excludes the pixel set from the targets.
 For example, when binarization processing is applied to the difference images 204a and 205a shown in FIGS. 8(A) and 8(B), the portions surrounded by the solid lines in FIGS. 8(A) and 8(B) have a degree of match exceeding the threshold and are extracted as targets, whereas the portions surrounded by the dotted lines in FIGS. 8(A) and 8(B) have a degree of match at or below the threshold and are excluded from the targets.
 In the present embodiment, after the barycentric coordinates of the extracted targets with positive luminance values in the difference images 204 and 205 have been acquired, the extraction processing of the targets with positive luminance values in the difference images 204 and 205 ends.
 The foreign matter removal unit 513 removes foreign matter adhering to the imaging surface from the measurement image 202. The foreign matter removal unit 513 has a convexity detection unit 513a.
 When foreign matter 1a shown in FIGS. 9(A) and 9(E) adheres to the imaging surface, the parallel light striking the foreign matter 1a is refracted, as shown by the solid lines in FIGS. 9(A) and 9(E), and a focal point is formed above the imaging surface. In other words, the foreign matter 1a acts like a convex lens.
 Consequently, when the focal position of the objective lens 31 has moved to the separation position, the focal position of the objective lens 31 and the focal position of the transmitted light at the portion corresponding to the foreign matter 1a are close to each other, as shown by the broken line in FIG. 9(A), and the light quantity is large. In the separation image 201a shown in FIG. 9(B), captured in this state, the luminance value of the portion 201b corresponding to the foreign matter 1a is therefore larger than the luminance value of the peripheral portion (closer to white).
 When the focal position of the objective lens 31 has moved to the approach position, the focal position of the objective lens 31 and the focal position of the transmitted light at the portion corresponding to the foreign matter 1a are far from each other, as shown by the broken line in FIG. 9(E), and the light quantity is small. In the approach image 203a shown in FIG. 9(F), captured in this state, the luminance value of the portion 203b corresponding to the foreign matter 1a is therefore smaller than the luminance value of the peripheral portion (closer to black).
 As a result, the portions 201b and 203b at which the luminance value in the separation image 201a is larger than the luminance value of the peripheral portion and the luminance value in the approach image 203a is smaller than the luminance value of the peripheral portion can be regarded as foreign matter 1a adhering to the imaging surface and removed. The foreign matter removal unit 513 therefore detects protrusions projecting from the imaging surface with the convexity detection unit 513a described below and removes the detected protrusions as foreign matter 1a.
 The convexity detection unit 513a detects, as protrusions projecting from the imaging surface, the portions 201b and 203b at which the luminance value in the separation image 201 is larger than the luminance value of the peripheral portion and the luminance value of the approach image 203 is smaller than the luminance value of the peripheral portion.
 For example, when the convexity detection unit 513a detects protrusions from the separation-approach difference images 205 and 205a shown in FIGS. 6(H) and 8(B), all targets with positive luminance values (removal Obj) in the separation-approach difference images 205 and 205a are detected as protrusions.
 The track extraction unit 514 extracts particle tracks formed as recesses in the imaging surface. The track extraction unit 514 has a concavity detection unit 514a and a non-extraction-target removal unit 514b.
 When a particle track 1b shown in FIGS. 9(C) and 9(G) is formed in the imaging surface, the parallel light striking the particle track 1b is refracted and scattered, as shown by the solid lines in FIGS. 9(C) and 9(G), and a focal point is formed below the imaging surface. In other words, the particle track 1b acts like a concave lens.
 Consequently, when the focal position of the objective lens 31 has moved to the separation position, the focal position of the objective lens 31 and the focal position of the transmitted light at the portion corresponding to the particle track 1b are far from each other, as shown by the broken line in FIG. 9(C), and the light quantity is small. In the separation image 201c shown in FIG. 9(D), captured in this state, the luminance value of the portion 201d corresponding to the particle track 1b is therefore smaller than the luminance value of the peripheral portion. When the focal position of the objective lens 31 has moved to the approach position, the focal position of the objective lens 31 and the focal position of the transmitted light at the portion corresponding to the particle track 1b are close to each other, as shown by the broken line in FIG. 9(G), and the light quantity is large. In the approach image 203c shown in FIG. 9(H), captured in this state, the luminance value of the portion 203d corresponding to the particle track 1b is therefore larger than the luminance value of the peripheral portion. When a particle track 1b is formed in the imaging surface, the brightness of the corresponding portion is thus inverted compared with the case where foreign matter 1a adheres to the imaging surface (a black-and-white-inverted image is obtained).
 As a result, the portions 203d and 201d at which the luminance value in the approach image 203c is larger than the luminance value of the peripheral portion and the luminance value in the separation image 201c is smaller than the luminance value of the peripheral portion can be regarded as a particle track 1b recessed from the imaging surface and extracted. The track extraction unit 514 therefore detects recesses in the imaging surface with the concavity detection unit 514a described below and extracts the detected recesses as particle tracks 1b.
 Since the purpose of the present embodiment is to extract particle tracks 1b, the track extraction unit 514 also uses the non-extraction-target removal unit 514b described below to identify targets in the measurement image 202 that differ from particle tracks 1b, based on the boundary length (edge length), the area, and the like, and to remove them from the extraction targets.
 The concave detection unit 514a detects, as recesses in the imaging surface, the portions 203d, 201d whose luminance value in the approach image 203c is larger than the luminance value of the surrounding portion and whose luminance value in the separation image 201c is smaller than the luminance value of the surrounding portion.
 For example, when the concave detection unit 514a detects recesses from the approach-separation difference images 204, 204a shown in FIGS. 6(G) and 8(A), every target (extracted object) with a positive luminance value in the approach-separation difference images 204, 204a is detected as a recess.
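 The pixel-level rule described above can be illustrated with a short sketch in Python with NumPy. The function and variable names below are illustrative assumptions, not terminology from this disclosure, and the comparison against a single background luminance value is a simplification of the comparison with the surrounding portion.

    import numpy as np

    def classify_relief(approach, separation, background):
        """Classify pixels as recessed (concave) or protruding (convex).

        approach, separation: grayscale images captured with the focal
        position moved toward / away from the emission unit.
        background: luminance value assumed for the flat surrounding surface.
        """
        approach = approach.astype(np.int32)
        separation = separation.astype(np.int32)

        # Recess (particle track): brighter than the surroundings in the
        # approach image AND darker than the surroundings in the separation image.
        concave = (approach > background) & (separation < background)

        # Protrusion (foreign matter): the opposite light/dark pattern.
        convex = (separation > background) & (approach < background)
        return concave, convex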
 The non-extraction target removal unit 514b removes, as non-extraction targets, those targets in the measurement image 202 that are not particle tracks 1b.
 The non-extraction target removal unit 514b removes targets in the measurement image 202 whose boundary length is shorter than a preset boundary threshold LL.
 The non-extraction target removal unit 514b also removes targets in the measurement image 202 whose area is smaller than a preset minimum area SL or larger than a preset maximum area SH.
 In this embodiment, the area of a target in the measurement image 202 is obtained as the area of an ellipse determined by an ellipse fitting process that approximates the outline of the target using a predetermined number of pixels extracted from its boundary.
 Furthermore, the non-extraction target removal unit 514b excludes targets in the measurement image 202 for which the minor axis of the fitted ellipse divided by its major axis is smaller than a preset ellipse threshold OD.
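 The disclosure does not specify the ellipse fitting algorithm of step S403, so the following Python sketch uses one simple stand-in: approximating the outline by the principal axes of the second moments of the boundary pixels. The function name, the scale factor, and the assumption of roughly uniform sampling along the outline are all assumptions of this sketch.

    import numpy as np

    def approximate_ellipse(boundary_points):
        """Approximate a target outline by an ellipse from second moments.

        boundary_points: (N, 2) array of (x, y) pixel coordinates on the
        labeled boundary. Returns (major_axis, minor_axis, area).
        """
        pts = np.asarray(boundary_points, dtype=np.float64)
        centered = pts - pts.mean(axis=0)
        cov = np.cov(centered.T)                   # 2x2 covariance of the outline
        eigvals = np.sort(np.linalg.eigvalsh(cov))[::-1]
        # For points spread roughly uniformly along an ellipse outline, the
        # variance along a principal axis is (semi-axis)^2 / 2, so the full
        # axis length is 2 * sqrt(2 * eigenvalue).
        major, minor = 2.0 * np.sqrt(2.0 * eigvals)
        area = np.pi * major * minor / 4.0
        return major, minor, area

 The area and the minor-to-major ratio computed this way can then be compared against SL, SH, and OD as described above.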
 To prevent particle tracks from being missed (to prevent counting omissions), the track extraction unit 514 of this embodiment unconditionally extracts tracks in the measurement image 202 that are larger than a predetermined size, without checking them against the difference images 204, 205. Specifically, among the targets in the measurement image 202 whose area is between the minimum area SL and the maximum area SH, the track extraction unit 514 extracts those larger than a preset area threshold SE as particle tracks without matching them against targets having positive luminance values in the approach-separation difference image 204 (that is, it extracts them unconditionally).
 The track extraction unit 514 of this embodiment also obtains the centroid coordinates of a target in the measurement image 202, obtains the distance between those coordinates and the centroid coordinates of a target having a positive luminance value in the approach-separation difference image 204, and stores this value as a track distance.
 Next, the track extraction unit 514 obtains the distance between the centroid coordinates of the target in the measurement image 202 and the centroid coordinates of a target having a positive luminance value in the separation-approach difference image 205, and stores this value as a foreign-matter distance.
 If at least one of the obtained track distances is shorter than a preset distance threshold GD and all of the obtained foreign-matter distances are equal to or greater than the distance threshold GD, the track extraction unit 514 extracts the target in the measurement image 202 as a particle track. The distance threshold GD is set to a value larger than every obtained track distance and smaller than every obtained foreign-matter distance.
 As an example, consider the case where the target is 1202a in FIG. 8(C), a measurement image.
 First, the track extraction unit 514 obtains the centroid coordinates COD1202a of the target 1202a. Next, it obtains the centroid coordinates of a target having a positive luminance value in FIG. 8(A), the approach-separation difference image, for example the centroid coordinates COD1202b of 1202b, obtains the distance between COD1202b and COD1202a, and takes this distance as the track distance GD1.
 Next, the track extraction unit 514 obtains the centroid coordinates of a target having a positive luminance value in FIG. 8(B), the separation-approach difference image, for example the centroid coordinates COD1202c of 1202c, obtains the distance between COD1202a and COD1202c, and takes this distance as the foreign-matter distance GD2.
 In this case, absent measurement error, the centroid coordinates COD1202a of the target 1202a in the measurement image coincide with the centroid coordinates COD1202b of the target 1202b in the approach-separation difference image. The track distance GD1 is therefore very small and falls at or below the distance threshold GD, so it is determined that the approach-separation difference image contains a target within the distance threshold GD.
 In FIG. 8(B), the separation-approach difference image, on the other hand, the target corresponding to the target 1202a has been excluded, and the closest remaining target is 1202c. The foreign-matter distance GD2 is therefore larger than the track distance GD1 and larger than the distance threshold GD, so it is determined that the separation-approach difference image contains no target within the distance threshold GD.
 For the target 1202a in the measurement image, the track extraction unit 514 thus determines that at least one of the obtained track distances is shorter than the preset distance threshold GD and that all of the obtained foreign-matter distances are equal to or greater than the distance threshold GD, and it extracts the target 1202a as a particle track.
 For example, in the measurement image 202e shown in FIG. 8(C), the portions drawn with solid lines are extracted as particle tracks, and the portions drawn with dotted lines are excluded as foreign matter or the like. The excluded portion 202f drawn with a dotted line at the center of the measurement image 202e is detected both as a recess and as a protrusion, as shown in FIGS. 8(A) and 8(B), and is excluded because its track distance and its foreign-matter distance are both smaller than the distance threshold GD.
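 A small worked example of this distance test in Python. The disclosure gives no numeric coordinates, so the centroid values and the threshold GD below are made-up figures chosen only to mirror the FIG. 8 situation.

    import math

    # Made-up coordinates mirroring the FIG. 8 example (not from the disclosure).
    COD1202a = (120.0, 85.0)   # target in the measurement image
    COD1202b = (120.4, 84.7)   # nearest positive target in the approach-separation difference image
    COD1202c = (160.0, 40.0)   # nearest positive target in the separation-approach difference image
    GD = 5.0                   # assumed distance threshold in pixels

    GD1 = math.dist(COD1202a, COD1202b)   # track distance, roughly 0.5 pixels
    GD2 = math.dist(COD1202a, COD1202c)   # foreign-matter distance, roughly 60 pixels
    is_track = GD1 < GD and GD2 >= GD     # True: 1202a is extracted as a particle track
    print(GD1, GD2, is_track)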
 Next, the operation of the sample analysis device 100 of this embodiment will be described with reference to FIGS. 10 to 13.
 As shown in FIG. 10, the sample analysis process starts when the sample 1 is supported on the upper surface of the sample stage 22 and an imaging region of the sample 1 has been set.
 First, the arithmetic processing unit 51 moves the X-Y stage 21, via the moving unit 2, to the area of the sample 1 where imaging is to start (step S101).
 Next, the arithmetic processing unit 51 detects the imaging surface by autofocus processing and moves the focal position of the objective lens 31 to the reference position via the Z-axis control unit 36 (step S102).
 Next, the arithmetic processing unit 51 moves the focal position of the objective lens 31 to the separation position via the Z-axis control unit 36 and captures the separation image 201 with the imaging unit 123 (step S103).
 Next, the arithmetic processing unit 51 moves the focal position of the objective lens 31 to the measurement position via the Z-axis control unit 36 and captures the measurement image 202 with the imaging unit 123 (step S104).
 Next, the arithmetic processing unit 51 moves the focal position of the objective lens 31 to the approach position via the Z-axis control unit 36 and captures the approach image 203 with the imaging unit 123 (step S105).
 Next, the arithmetic processing unit 51 executes the target extraction process for the measurement image shown in FIG. 11 (step S106), the target extraction process for each difference image shown in FIG. 12 (step S107), and the track extraction process shown in FIG. 13 (step S108), each of which is described below.
 Next, the arithmetic processing unit 51 determines whether all areas have been imaged (step S109). If not (step S109; No), it moves the X-Y stage 21 to the next area via the moving unit 2 (step S110) and repeats steps S102 to S108.
 When all areas have been imaged (step S109; Yes), the arithmetic processing unit 51 ends the sample analysis process.
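 The overall control flow of FIG. 10 (steps S101 to S110) can be summarized by the following Python sketch. All of the callables and focus offsets are placeholders supplied by the caller; they are assumptions of this sketch and do not correspond to an interface defined in the disclosure.

    def analyze_sample(areas, move_stage, autofocus, set_focus, capture,
                       extract_measurement_targets, extract_difference_targets,
                       extract_tracks, separation_offset, measurement_offset,
                       approach_offset):
        """Mirror of the FIG. 10 flow; every argument is caller-supplied."""
        tracks = []
        for area in areas:                                    # S101 / S110
            move_stage(area)
            reference = autofocus()                           # S102: find the imaging surface
            set_focus(reference + separation_offset)
            separation_image = capture()                      # S103
            set_focus(reference + measurement_offset)
            measurement_image = capture()                     # S104
            set_focus(reference + approach_offset)
            approach_image = capture()                        # S105
            targets = extract_measurement_targets(measurement_image)          # S106
            diff_targets = extract_difference_targets(separation_image,
                                                      approach_image)         # S107
            tracks.extend(extract_tracks(targets, diff_targets))              # S108
        return tracks                                         # S109: all areas imaged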
 In the target extraction process for the measurement image shown in FIG. 11, the target extraction unit 512 first applies smoothing processing (step S201) and boundary extraction processing (step S202) to the captured measurement image 202, and then labels, one by one, the boundaries that are connected in units of pixels (step S203).
 Next, the target extraction unit 512 determines whether any of the labeled boundaries are multiple (nested circles) (step S204). If so (step S204; Yes), it removes every boundary other than the one whose enclosed region has the lowest average luminance value (step S205) and repeats step S204.
 When there are no multiple boundaries, or when all multiple boundaries have been removed (step S204; No), the target extraction unit 512 ends the target extraction process for the measurement image.
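 Steps S204 and S205 can be sketched as follows in Python. The boundary objects, the containment test, and the mean-luminance helper are assumed inputs of this sketch; how they are computed is left to the surrounding labeling step.

    def remove_nested_boundaries(boundaries, contains, mean_interior_luminance):
        """Keep, within each nested group of boundaries, only the boundary
        whose enclosed region has the lowest average luminance (S204-S205).

        boundaries: list of labeled boundary objects (opaque here).
        contains(a, b): assumed helper, True if boundary a encloses boundary b.
        mean_interior_luminance(b): assumed helper, average luminance inside b.
        """
        kept = list(boundaries)
        changed = True
        while changed:                      # repeat S204 until no nesting remains
            changed = False
            for a in kept:
                group = [b for b in kept
                         if b is a or contains(a, b) or contains(b, a)]
                if len(group) > 1:          # multiple (nested) boundaries found
                    darkest = min(group, key=mean_interior_luminance)
                    kept = [b for b in kept if b not in group or b is darkest]
                    changed = True
                    break
        return kept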
 In the target extraction process for each difference image shown in FIG. 12, the difference acquisition unit 511 first applies smoothing processing to the separation image 201 and the approach image 203 (step S301), and then acquires the approach-separation difference image 204 and the separation-approach difference image 205 (step S302).
 Next, the target extraction unit 512 extracts and labels the targets having positive luminance values in each of the difference images 204, 205 (step S303).
 Next, the target extraction unit 512 removes, from the labeled targets, those whose degree of coincidence with a track or a foreign matter is at or below a threshold (step S304).
 The target extraction unit 512 then acquires the centroid coordinates of the remaining labeled targets (step S305) and ends the target extraction process for each difference image.
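 A condensed Python sketch of steps S301 to S303 and S305 using NumPy and SciPy. The uniform smoothing kernel, the use of scipy.ndimage, and the omission of the coincidence-based removal of step S304 are simplifications assumed here.

    import numpy as np
    from scipy import ndimage

    def difference_targets(separation, approach, smooth_size=3):
        """Smooth both images, form the two signed difference images, label
        the positive regions, and return their centroid coordinates."""
        sep = ndimage.uniform_filter(separation.astype(np.float64), smooth_size)
        app = ndimage.uniform_filter(approach.astype(np.float64), smooth_size)

        approach_minus_separation = app - sep   # positive where recesses brighten
        separation_minus_approach = sep - app   # positive where protrusions brighten

        centroids = {}
        for name, diff in (("concave", approach_minus_separation),
                           ("convex", separation_minus_approach)):
            mask = diff > 0
            labels, n = ndimage.label(mask)
            centroids[name] = ndimage.center_of_mass(mask, labels, range(1, n + 1))
        return centroids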
 In the track extraction process shown in FIG. 13, the track extraction unit 514 first selects one target in the measurement image 202 that has a labeled boundary (step S401).
 Next, the non-extraction target removal unit 514b determines whether the boundary length is shorter than the boundary threshold LL (step S402). If it is (step S402; Yes), the selected target is removed as foreign matter (step S410).
 If the boundary length is equal to or greater than the boundary threshold LL (step S402; No), the non-extraction target removal unit 514b obtains an ellipse approximating the contour of the target by ellipse fitting (step S403).
 Next, the non-extraction target removal unit 514b determines whether the area of the ellipse obtained in step S403 is smaller than the minimum area SL (step S404). If it is (step S404; Yes), the selected target is removed as foreign matter (step S410).
 If the area of the ellipse is equal to or greater than the minimum area SL (step S404; No), the non-extraction target removal unit 514b determines whether the area of the ellipse is larger than the maximum area SH (step S405).
 If it is (step S405; Yes), the non-extraction target removal unit 514b removes the selected target as foreign matter (step S410). If the area of the ellipse is equal to or smaller than the maximum area SH (step S405; No), the unit determines whether the minor axis of the ellipse divided by its major axis is smaller than the ellipse threshold OD (step S406).
 If the divided value is smaller than the ellipse threshold OD (step S406; Yes), the non-extraction target removal unit 514b removes the selected target as foreign matter (step S410).
 If the divided value is equal to or greater than the ellipse threshold OD (step S406; No), the track extraction unit 514 determines whether the area of the ellipse is larger than the area threshold SE (step S407).
 If the area of the ellipse is larger than the area threshold SE (step S407; Yes), the track extraction unit 514 extracts the selected target as a particle track (step S411).
 If the area of the ellipse is equal to or smaller than the area threshold SE (step S407; No), the track extraction unit 514 obtains the track distances, that is, the distances between the centroid coordinates of the target in the measurement image 202 and the centroid coordinates of the labeled targets having positive luminance values in the approach-separation difference image 204, and the foreign matter removal unit 513 obtains the foreign-matter distances, that is, the distances between the centroid coordinates of the target in the measurement image 202 and the centroid coordinates of the labeled targets having positive luminance values in the separation-approach difference image 205 (step S408).
 Next, the track extraction unit 514 determines whether there is a track distance shorter than the distance threshold GD and no foreign-matter distance shorter than the distance threshold GD (step S409).
 If there is no track distance shorter than the distance threshold GD, or if there is a foreign-matter distance shorter than the distance threshold GD (step S409; No), the foreign matter removal unit 513 removes the selected target as foreign matter (step S410).
 If there is a track distance shorter than the distance threshold GD and no foreign-matter distance shorter than the distance threshold GD (step S409; Yes), the track extraction unit 514 extracts the selected target as a particle track (step S411).
 Next, the track extraction unit 514 determines whether every target in the measurement image 202 has been selected (step S412). If not (step S412; No), it returns to step S401 and selects the next target; if every target has been selected (step S412; Yes), the track extraction process ends.
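 The per-target decision chain of FIG. 13 (steps S402 to S411) can be strung together as a single Python function. The dictionary field names, the ellipse-area formula from the fitted axes, and the thresholds passed as parameters are assumptions of this sketch; only the order and direction of the tests follow the flowchart.

    import math

    def classify_target(target, concave_centroids, convex_centroids,
                        LL, SL, SH, OD, SE, GD):
        """Return 'track' or 'foreign' for one measurement-image target.

        target: dict with 'boundary_length', 'major_axis', 'minor_axis'
        and 'centroid' (x, y); all field names are illustrative.
        concave_centroids / convex_centroids: centroids of positive targets
        in the approach-separation / separation-approach difference images.
        """
        if target['boundary_length'] < LL:                                   # S402
            return 'foreign'
        area = math.pi * target['major_axis'] * target['minor_axis'] / 4.0   # S403
        if area < SL or area > SH:                                           # S404, S405
            return 'foreign'
        if target['minor_axis'] / target['major_axis'] < OD:                 # S406
            return 'foreign'
        if area > SE:                                                        # S407: large enough to
            return 'track'                                                   # extract unconditionally
        cx, cy = target['centroid']                                          # S408
        track_dists = [math.hypot(cx - x, cy - y) for x, y in concave_centroids]
        foreign_dists = [math.hypot(cx - x, cy - y) for x, y in convex_centroids]
        if any(d < GD for d in track_dists) and all(d >= GD for d in foreign_dists):
            return 'track'                                                   # S409; Yes -> S411
        return 'foreign'                                                     # S409; No -> S410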
 As described above, according to the sample analysis device 100 of this embodiment, moving the focal position of the objective lens 31 in the Z-axis direction makes it possible to detect both protrusions projecting from the imaging surface (portions functioning as convex lenses) and recesses in the imaging surface (portions functioning as concave lenses). As a result, when analyzing a solid-state track detector, the sample analysis device 100 of this embodiment can automatically detect and remove foreign matter 1a adhering to the imaging surface and extract only the particle tracks 1b recessed in the imaging surface.
 In the sample analysis device 100 of this embodiment, only parallel light from the condenser lens 23 is directed onto the sample 1 for analysis, so a configuration for directing converging diffused light onto the sample 1, as in Patent Documents 1 and 2, is unnecessary. The sample analysis device 100 of this embodiment can therefore omit the components required to make converging diffused light incident on the film (a diffuser plate and a second light source unit), which reduces the manufacturing cost of detecting protrusions and recesses.
 In the sample analysis device 100 of this embodiment, the separation position and the approach position can also be adjusted so that the contrast (for example, the difference in luminance values) between the protrusions or recesses and the other portions becomes large, allowing protrusions (for example, foreign matter 1a) and recesses (for example, particle tracks 1b) to be detected with high accuracy.
 In particular, the sample analysis device 100 of this embodiment extracts the targets having positive luminance values in the difference images 204, 205 obtained from the separation image 201 and the approach image 203, which further increases the contrast between the protrusions or recesses (targets with positive luminance values) and the other portions (the black portions other than the targets). The sample analysis device 100 of this embodiment can therefore detect protrusions (for example, foreign matter 1a) and recesses (for example, particle tracks 1b) with even higher accuracy.
 Furthermore, the sample analysis device 100 of this embodiment extracts the targets in the in-focus measurement image 202 and detects as protrusions or recesses those located at the positions of targets having positive luminance values in the difference images 204, 205, so protrusions and recesses can be detected with still higher accuracy.
 In the sample analysis device 100 of this embodiment, when the boundaries of a target in the measurement image 202 are multiple (nested circles), every boundary other than the one whose enclosed region has the lowest average luminance value is removed. The sample analysis device 100 of this embodiment can therefore prevent the erroneous detection in which a single protrusion or recess in the measurement image 202 is counted more than once as multiple protrusions or recesses.
 Furthermore, to extract particle tracks 1b formed as circular recesses, the sample analysis device 100 of this embodiment removes, from the targets having positive luminance values in the difference images 204, 205, those with a low degree of coincidence with a track or foreign matter, and uses the thresholds LL, SE, SL, SH, OD, and GD to remove, as non-extraction targets, the targets that are not particle tracks 1b. As a result, the sample analysis device 100 of this embodiment can accurately remove foreign matter 1a and the like from the measurement image 202 and accurately extract only the particle tracks 1b.
 As shown in FIG. 15, the control computer 150 of the sample analysis device 100 includes a control unit 151, a main storage unit 152, an external storage unit 153, an operation unit 154, a display unit 52, an input/output unit 156, and a transmission/reception unit 157.
 The main storage unit 152, the external storage unit 153, the operation unit 154, the display unit 52, the input/output unit 156, and the transmission/reception unit 157 are all connected to the control unit 151 via an internal bus 160.
 The control unit 151 includes a CPU (Central Processing Unit) or the like and, in accordance with a control program 158 stored in the external storage unit 153, executes the processes of the difference acquisition unit 511, the target extraction unit 512, the foreign matter removal unit 513, and the track extraction unit 514 of the sample analysis device 100. The arithmetic processing unit 51 is implemented by the control unit 151.
 The main storage unit 152 includes a RAM (Random Access Memory) or the like, loads the control program 158 stored in the external storage unit 153, and serves as the work area of the control unit 151. The image RAM 124 is implemented by the main storage unit 152.
 The external storage unit 153 includes nonvolatile memory such as flash memory, a hard disk, a DVD-RAM (Digital Versatile Disc Random Access Memory), or a DVD-RW (Digital Versatile Disc ReWritable), and stores in advance the program that causes the control unit 151 to perform the processing of the sample analysis device 100. In accordance with instructions from the control unit 151, the external storage unit 153 supplies stored data to the control unit 151 and stores data supplied from the control unit 151. The image storage device 53 is implemented by the external storage unit 153.
 The operation unit 154 includes a keyboard, a pointing device such as a mouse, and an interface device that connects the keyboard and the pointing device to the internal bus 160.
 The display unit 52 includes a CRT (Cathode Ray Tube), an LCD (Liquid Crystal Display), or the like, and displays the images 201 to 205, operation screens of the sample analysis device 100, and so on.
 The input/output unit 156 includes a serial interface or a parallel interface and is connected to the microscope unit 110.
 The transmission/reception unit 157 includes a network terminating device or a wireless communication device for connecting to a network, and a serial interface or a LAN (Local Area Network) interface for connecting to it.
 The control program 158 performs the processes of the difference acquisition unit 511, the target extraction unit 512, the foreign matter removal unit 513, and the track extraction unit 514 of the sample analysis device 100 shown in FIG. 5 by using the control unit 151, the main storage unit 152, the external storage unit 153, the operation unit 154, the display unit 52, the input/output unit 156, the transmission/reception unit 157, and the like as resources.
 In this embodiment, the sample 1 is a solid-state track detector and particle tracks 1b are extracted. However, the sample 1 may instead be, for example, a film as described in Patent Documents 1 and 2, and scratches (recesses) in the film or dust (protrusions) on it may be detected.
 In this embodiment, the image sensor 4 is used as the imaging unit 123, but a line sensor or the like may be used instead.
 In this embodiment, a CCD sensor is used as the image sensor 4, but a CMOS (Complementary Metal-Oxide-Semiconductor) sensor or the like may be used instead.
 In this embodiment, one separation position, one measurement position, and one approach position are set, but two or more of each may be set. In that case, the best-focused of the captured measurement images may be selected, the separation image in which the luminance value of the target is largest may be selected from the separation images, and the approach image in which the luminance value of the target having a positive luminance value is largest may be selected from the approach images.
 In this embodiment, boundary extraction processing (edge extraction processing) is applied to the measurement image 202 and the boundaries connected in units of pixels are labeled one by one. Alternatively, as shown in FIG. 14, the measurement image 202 may first be converted into a monochrome image 202g by binarization and the boundaries of the targets may then be labeled.
 In this embodiment, in the track extraction process, even a target extracted as a target having a positive luminance value in the approach-separation difference image 204 is not extracted as a particle track 1b unless it has also been extracted as a target in the measurement image 202; however, the invention is not limited to this. For example, when a target has been extracted as a target having a positive luminance value in the approach-separation difference image 204, the corresponding portion of the measurement image 202 may be re-processed (local re-image processing) to determine whether to extract it as a particle track 1b.
 In this embodiment, in the track extraction process, the distances between the selected target in the measurement image 202 and all of the labeled targets having positive luminance values in the difference images 204, 205 are obtained first, and it is then determined whether there is a track distance smaller than the distance threshold GD and no foreign-matter distance smaller than the distance threshold GD; however, the invention is not limited to this. For example, the track distances may be computed in label-number order, checking each against the distance threshold GD as it is computed. In that case, depending on the label number of the target in the difference images 204, 205 found to lie within the distance threshold GD of the target in the measurement image 202, the computational load can be lower than in the embodiment.
 The hardware configuration, flowcharts, thresholds, parameters, and the like described above are examples and may be changed or modified as desired.
 Although this embodiment uses parallel light for illumination, the light need not be parallel; any visible light with which targets can be extracted using the approach-separation difference image and the separation-approach difference image may be used.
 When the functions of the sample analysis device 100 are realized by dividing the work between an OS (operating system) and an application program, or by cooperation between the OS and an application program (sample analysis program), only the application program portion may be stored in the external storage unit 153, a recording medium, a storage device, or the like.
 The application program (sample analysis program) may also be superimposed on a carrier wave and distributed via a communication network. For example, the application program may be posted on a bulletin board system (BBS) on a communication network and distributed via the network. The application program may then be installed on a computer, started, and executed under the control of the OS in the same manner as other application programs, so that the processing described above can be carried out.
 The present invention is capable of various embodiments and modifications without departing from its broad spirit and scope. The embodiment described above is intended to illustrate the present invention and does not limit its scope. That is, the scope of the present invention is indicated not by the embodiment but by the claims, and various modifications made within the scope of the claims and within the meaning of inventions equivalent thereto are regarded as falling within the scope of the present invention.
 This application is based on Japanese Patent Application No. 2012-206493 filed on September 20, 2012. The specification, claims, and entire drawings of Japanese Patent Application No. 2012-206493 are incorporated herein by reference.
DESCRIPTION OF REFERENCE NUMERALS
1 … Sample
1a … Foreign matter
1b … Particle track
22 … Sample stage
36 … Z-axis control unit
41 … Emission unit
100 … Sample analysis device
123 … Imaging unit
201, 201a, 201c … Separation image
202, 202a, 202e, 202g … Measurement image
202b to 202d … Boundary
203, 203a, 203c … Approach image
204, 204a … Approach-separation difference image
205, 205a … Separation-approach difference image
511 … Difference acquisition unit
512 … Target extraction unit
513 … Foreign matter removal unit
513a … Convex detection unit
514 … Track extraction unit
514a … Concave detection unit

Claims (6)

  1.  A sample analysis device comprising:
      an emission unit that emits visible light;
      a support unit that is disposed on the optical axis of the visible light emitted by the emission unit and supports a sample through which the visible light is transmitted;
      an imaging unit that has an optical system and captures an image of the sample supported by the support unit;
      a focus moving unit that moves a focal position of the optical system, along the optical axis of the visible light, between an approach position that is closer to the emission unit than a reference position of an imaging surface, the imaging surface being the surface of the sample on the imaging unit side, and a separation position that is farther from the emission unit than the reference position;
      concave detection means for detecting, as a recess recessed from the imaging surface, a portion whose luminance value in an approach image, which is an image of the sample captured by the imaging unit at the approach position with the focus moving unit having moved the focal position to the approach position, is larger than a preset luminance value of a peripheral portion, and whose luminance value in a separation image, which is an image of the sample captured by the imaging unit at the separation position with the focus moving unit having moved the focal position to the separation position, is smaller than the luminance value of the peripheral portion; and
      convex detection means for detecting, as a protrusion projecting from the imaging surface, a portion whose luminance value in the separation image is larger than the luminance value of the peripheral portion and whose luminance value in the approach image is smaller than the luminance value of the peripheral portion.
  2.  The sample analysis device according to claim 1, further comprising:
      difference acquisition means for acquiring an approach-separation difference image obtained by subtracting the luminance value of each pixel of the separation image from the luminance value of the corresponding pixel of the approach image, and a separation-approach difference image obtained by subtracting the luminance value of each pixel of the approach image from the luminance value of the corresponding pixel of the separation image; and
      closed-region extraction means for extracting, from an image, a closed region whose luminance value differs from that of the peripheral portion,
      wherein the closed-region extraction means extracts a closed region in a measurement image, which is an image of the sample captured by the imaging unit at a measurement position with the focus moving unit having moved the focal position along the optical axis to the measurement position displaced from the reference position by a preset measurement distance, a closed region having a positive luminance value in the approach-separation difference image, and a closed region having a positive luminance value in the separation-approach difference image,
      the concave detection means detects, as the recess, a closed region in the measurement image that is located at the position of a closed region having a positive luminance value in the approach-separation difference image, and
      the convex detection means detects, as the protrusion, a closed region in the measurement image that is located at the position of a closed region having a positive luminance value in the separation-approach difference image.
  3.  The sample analysis device according to claim 2, wherein, when the closed-region extraction means extracts, from the measurement image, an inner boundary and an outer boundary surrounding the inner boundary as boundaries surrounding a portion whose luminance value differs from that of other portions, the closed-region extraction means takes, as the boundary of the closed region in the measurement image, the boundary of the region whose average luminance value within the boundary is lowest.
  4.  A sample analysis method comprising:
      an imaging step of capturing an image of a sample with an imaging unit having an optical system when visible light is transmitted through the sample disposed on an optical axis;
      a focus moving step of moving a focal position of the optical system, via a focus moving unit, along the optical axis of the visible light between an approach position that is closer to a light source than a reference position of an imaging surface, the imaging surface being the surface of the sample on the imaging unit side, and a separation position that is farther from the light source than the reference position;
      a concave detection step of detecting, as a recess recessed from the imaging surface, a portion whose luminance value in an approach image, which is an image of the sample captured by the imaging unit at the approach position with the focus moving unit having moved the focal position to the approach position, is larger than a preset luminance value of a peripheral portion, and whose luminance value in a separation image, which is an image of the sample captured by the imaging unit at the separation position with the focus moving unit having moved the focal position to the separation position, is smaller than the luminance value of the peripheral portion; and
      a convex detection step of detecting, as a protrusion projecting from the imaging surface, a portion whose luminance value in the separation image is larger than the luminance value of the peripheral portion and whose luminance value in the approach image is smaller than the luminance value of the peripheral portion.
  5.  A sample analysis program causing a computer to function as:
      imaging means for capturing an image of a sample with an imaging unit having an optical system when visible light is transmitted through the sample disposed on an optical axis;
      focus moving means for moving a focal position of the optical system, along the optical axis of the visible light, between an approach position that is closer to a light source than a reference position of an imaging surface, the imaging surface being the surface of the sample on the imaging unit side, and a separation position that is farther from the light source than the reference position;
      concave detection means for detecting, as a recess recessed from the imaging surface, a portion whose luminance value in an approach image, which is an image of the sample captured by the imaging unit at the approach position with the focus moving means having moved the focal position to the approach position, is larger than a preset luminance value of a peripheral portion, and whose luminance value in a separation image, which is an image of the sample captured by the imaging unit at the separation position with the focus moving means having moved the focal position to the separation position, is smaller than the luminance value of the peripheral portion; and
      convex detection means for detecting, as a protrusion projecting from the imaging surface, a portion whose luminance value in the separation image is larger than the luminance value of the peripheral portion and whose luminance value in the approach image is smaller than the luminance value of the peripheral portion.
  6.  A particle track analysis device comprising:
      an emission unit that emits visible light;
      a support unit that is disposed on the optical axis of the visible light emitted by the emission unit and supports a solid-state track detector through which the visible light is transmitted;
      an imaging unit that has an optical system and captures an image of the solid-state track detector as a sample supported by the support unit;
      a focus moving unit that moves a focal position of the optical system, along the optical axis of the visible light, between an approach position that is closer to the emission unit than a reference position of an imaging surface, the imaging surface being the surface of the solid-state track detector on the imaging unit side, and a separation position that is farther from the emission unit than the reference position;
      foreign matter removal means for removing, from the image of the solid-state track detector, as foreign matter adhering to and projecting from the imaging surface, a portion whose luminance value in a separation image, which is an image of the solid-state track detector captured by the imaging unit at the separation position with the focus moving unit having moved the focal position to the separation position, is larger than a preset luminance value of a peripheral portion, and whose luminance value in an approach image, which is an image of the solid-state track detector captured by the imaging unit at the approach position with the focus moving unit having moved the focal position to the approach position, is smaller than the luminance value of the peripheral portion; and
      track extraction means for extracting, from the image of the solid-state track detector, as a particle track formed as a recess in the imaging surface, a portion whose luminance value in the approach image is larger than the luminance value of the peripheral portion and whose luminance value in the separation image is smaller than the luminance value of the peripheral portion.
PCT/JP2013/075183 2012-09-20 2013-09-18 Sample analysis device, sample analysis method, sample analysis program, and particle track analysis device WO2014046138A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2012206493A JP5960006B2 (en) 2012-09-20 2012-09-20 Sample analyzer, sample analysis method, sample analysis program, and particle track analyzer
JP2012-206493 2012-09-20

Publications (1)

Publication Number Publication Date
WO2014046138A1 true WO2014046138A1 (en) 2014-03-27

Family

ID=50341448

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2013/075183 WO2014046138A1 (en) 2012-09-20 2013-09-18 Sample analysis device, sample analysis method, sample analysis program, and particle track analysis device

Country Status (2)

Country Link
JP (1) JP5960006B2 (en)
WO (1) WO2014046138A1 (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP7292457B1 (en) 2022-03-14 2023-06-16 三菱電機株式会社 Surface profile inspection method

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH06118026A (en) * 1992-10-01 1994-04-28 Toyo Seikan Kaisha Ltd Method for inspecting vessel inner surface
JP2001027611A (en) * 1999-07-13 2001-01-30 Lasertec Corp Flow inspecting apparatus
JP2004191215A (en) * 2002-12-12 2004-07-08 Talk Engineering Kk Inspection device for mirror body
JP2011027443A (en) * 2009-07-21 2011-02-10 Ryuze Inc Simple telecentric lens device, and method and apparatus for inspecting minute uneven flaw of flat platelike transparent object using the same
JP2011095512A (en) * 2009-10-30 2011-05-12 Fujitsu Ltd Confocal microscope

Also Published As

Publication number Publication date
JP2014062936A (en) 2014-04-10
JP5960006B2 (en) 2016-08-02

Similar Documents

Publication Publication Date Title
JP5703609B2 (en) Microscope and region determination method
US9235040B2 (en) Biological sample image acquiring apparatus, biological sample image acquiring method, and biological sample image acquiring program
JP5644447B2 (en) Microscope, region determination method, and program
US10890750B2 (en) Observation system, observation program, and observation method
US9322782B2 (en) Image obtaining unit and image obtaining method
US8654188B2 (en) Information processing apparatus, information processing system, information processing method, and program
US9338408B2 (en) Image obtaining apparatus, image obtaining method, and image obtaining program
US20130188033A1 (en) Recording medium having observation program recorded therein and observation apparatus
JP6069825B2 (en) Image acquisition apparatus, image acquisition method, and image acquisition program
WO2010128670A1 (en) Focus control device, and incubation and observation device
JP2018512609A (en) Method, system and apparatus for automatically focusing a microscope on a substrate
CN102156976A (en) Arithmetically operating device, arithmetically operating method, arithmetically operating program, and microscope
US10379335B2 (en) Illumination setting method, light sheet microscope apparatus, and recording medium
JP2014203038A (en) Analysis apparatus, analysis program, and analysis system
JP2021502594A (en) Safety light curtain that disables the rotation of the carousel
US20200351414A1 (en) Slide rack determination system
US10551607B2 (en) Imaging apparatus and method and imaging control program
WO2021148465A1 (en) Method for outputting a focused image through a microscope
WO2014046138A1 (en) Sample analysis device, sample analysis method, sample analysis program, and particle track analysis device
CN112322713B (en) Imaging method, device and system and storage medium
JP2021512346A (en) Impact rescanning system
US20190033192A1 (en) Observation Device
JP6312410B2 (en) Alignment apparatus, microscope system, alignment method, and alignment program
JP4344862B2 (en) Method and apparatus for automatic detection of observation object
WO2024014079A1 (en) Cell observation device and imaging method used in cell observation device

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 13838677

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 13838677

Country of ref document: EP

Kind code of ref document: A1