US9007452B2 - Magnification observation device, magnification observation method, and magnification observation program


Info

Publication number: US9007452B2
Application number: US13/558,430
Authority: US (United States)
Prior art keywords: image data, imaging, image, unit, magnification
Legal status: Active, expires
Other languages: English (en)
Other versions: US20130050464A1
Inventor: Woobum Kang
Current Assignee: Keyence Corp
Original Assignee: Keyence Corp
Application filed by Keyence Corp; assigned to KEYENCE CORPORATION (assignor: KANG, WOOBUM)
Publication of US20130050464A1
Application granted
Publication of US9007452B2

Classifications

    All under G PHYSICS; G02 OPTICS; G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS; G02B 21/00 Microscopes:
    • G02B 21/26 Stages; Adjusting means therefor (under G02B 21/24 Base structure)
    • G02B 21/362 Mechanical details, e.g. mountings for the camera or image sensor, housings (under G02B 21/36 Microscopes arranged for photographic purposes or projection purposes or digital imaging or video purposes including associated control and data processing arrangements)
    • G02B 21/365 Control or image processing arrangements for digital or video microscopes (under G02B 21/36)
    • G02B 21/367 Control or image processing arrangements for digital or video microscopes providing an output produced by processing a plurality of individual source images, e.g. image tiling, montage, composite images, depth sectioning, image comparison (under G02B 21/36)
    • G02B 21/025 Objectives with variable magnification (under G02B 21/02 Objectives)

Definitions

  • the present invention relates to a magnification observation device, a magnification observation method, and a magnification observation program.
  • Japanese Unexamined Patent Publication No. 2008-139795 discloses a fluorescence microscope system in which images of a plurality of regions of an object are joined together to generate one synthetic wide-region image. According to this fluorescence microscope system, it is possible to obtain an image of a wider region of an object than a region corresponding to a visual field of an object lens with the lowest magnification.
  • a sample placement part on which an object is placed can be moved, and a plurality of regions of the object can be imaged, to thereby obtain a plurality of images. Thereafter, the obtained plurality of images are joined together. This leads to generation of a synthetic wide-region image, and the generated synthetic wide-region image is displayed in a display part.
  • a user can view the generated synthetic wide-region image, to thereby recognize that part of the regions has not been appropriately imaged. In this case, the user performs re-imaging on all the regions to regenerate a synthetic wide-region image.
  • a magnification observation device which images an object to display an image of the object
  • the device including: an imaging part that respectively images a plurality of unit regions of an object on a previously set imaging condition, to generate a plurality of pieces of image data respectively corresponding to the plurality of unit regions; a positional information generating part that generates positional information indicative of respective positions of the plurality of unit regions; a storage part that stores the plurality of pieces of image data generated by the imaging part along with the positional information generated by the positional information generating part; a connecting part that connects the plurality of pieces of image data stored in the storage part, to generate connected image data; a display part that displays images of the object including the plurality of unit regions as a region presentation image; an accepting part that accepts from a user a selection instruction for selecting any of the plurality of unit regions with the region presentation image being displayed by the display part; and a control part that controls the imaging part so as to generate image data corresponding to the selected unit region by re-imaging the selected unit region on an imaging condition different from the previously set imaging condition, based on the selection instruction accepted by the accepting part and the positional information stored in the storage part, and stores the generated image data replaceably with the image data corresponding to the selected unit region among the image data corresponding to the plurality of unit regions stored in the storage part.
  • a plurality of unit regions of an object are respectively imaged on a previously set imaging condition, to generate a plurality of pieces of image data respectively corresponding to the plurality of unit regions. Further, there is generated positional information indicative of respective positions of the plurality of unit regions. The generated plurality of pieces of image data are stored along with the positional information, and the stored plurality of pieces of image data are connected, to generate connected image data.
  • An image of the object including the plurality of unit regions is displayed as a region presentation image.
  • based on the accepted selection instruction and the stored positional information, the selected unit region is re-imaged on an imaging condition different from the previously set imaging condition, to generate image data corresponding to the selected unit region.
  • the image data generated by the re-imaging is stored replaceably with the image data corresponding to the selected unit region among the image data corresponding to the stored plurality of unit regions. Hence, the image data corresponding to the selected unit region among the stored image data can be replaced with the image data generated by the re-imaging, to thereby generate the connected image data corresponding to the plurality of unit regions.
  • the image data corresponding to the selected unit region is replaceable with the image data generated by re-imaging on the imaging condition different from the previously set imaging condition. Therefore, when an imaging condition for part of the unit regions is not appropriate, an image of that unit region can be obtained on an appropriate imaging condition.
  • control part may replace the image data corresponding to the selected unit region among the image data corresponding to the plurality of unit regions stored in the storage part with the image data generated by the re-imaging.
  • the image data corresponding to the selected unit region among the stored image data corresponding to the plurality of unit regions is replaced with the image data generated by the re-imaging.
  • the accepting part may further accept from the user an adjustment instruction for adjusting the imaging condition of the imaging part to an imaging condition different from the previously set imaging condition, and when the selection instruction is accepted by the accepting part, the control part may control the imaging part so as to perform the re-imaging on the imaging condition adjusted in accordance with the adjustment instruction, thereby generating image data.
  • re-imaging can be performed while the imaging condition for each unit region is adjusted to an imaging condition different from the previously set imaging condition. This allows replacement of the image data corresponding to the selected unit region with the image data generated by the re-imaging on the adjusted imaging condition. Therefore, when an imaging condition for part of the unit regions is not appropriate, an image of that unit region can be obtained on an appropriate imaging condition.
  • the connecting part may sequentially connect the generated image data to previously generated image data corresponding to another unit region, and the display part may sequentially display images of a plurality of unit regions as the region presentation image based on the image data sequentially connected by the connecting part.
  • the images of the plurality of unit regions are sequentially displayed as the region presentation image in the display part.
  • the user can view the region presentation image displayed in the display part, to thereby select with ease a unit region that needs re-imaging. Further, the user can recognize with ease which region among the plurality of unit regions is a currently imaged unit region.
  • when the selection instruction is accepted by the accepting part during the time until all the unit regions are imaged, the control part may control the connecting part so as to suspend connection of the image data.
  • this prevents connection of a plurality of pieces of image data including inappropriate image data from being continued, and eliminates wasteful time in generating the connected image data. Consequently, appropriate connected image data can be efficiently generated.
  • the imaging part can image the object at a first magnification and a second magnification lower than the first magnification, and may respectively image the plurality of unit regions at the first magnification to generate a plurality of pieces of image data corresponding to the plurality of unit regions, and the display part may display, as the region presentation image, images based on image data generated by imaging at the second magnification by the imaging part.
  • the control part may automatically set the previously set imaging condition for imaging each unit region based on image data generated by imaging each unit region before imaging on the previously set imaging condition, and determine whether or not the imaging condition for each unit region has been normally set, and the control part may control the display part so as to display in the region presentation image an indicator for identifying an image of a unit region determined not to have been normally set with the imaging condition.
  • each unit region is imaged on an appropriate imaging condition without the user performing an operation for adjusting the imaging condition. Further, the user can view the indicator displayed on the region presentation image, to thereby recognize with ease the image of the unit region having not been normally set with the imaging condition. Therefore, the unit region can be selected based on the indicator, to thereby re-image with ease the unit region determined not to have been normally set with the imaging condition.
  • the magnification observation device may further include a determination part which determines whether or not the image of each unit region satisfies the previously set condition based on the image data corresponding to each unit region, and the control part may control the display part so as to display in the region presentation image an indicator for identifying an image of a unit region having been determined by the determination part not to satisfy the previously set condition.
  • the user can view the indicator displayed on the region presentation image, to thereby recognize with ease the image of the unit region in which an image satisfying the previously set condition has not been obtained. Therefore, the unit region can be selected based on the indicator, to thereby re-image with ease the unit region in which an image satisfying the previously set condition has not been obtained.
  • the imaging part may sequentially image the plurality of unit regions to sequentially generate image data corresponding to each unit region, and the control part may control the imaging part so as to sequentially re-image the unit region selected based on the selection instruction accepted by the accepting part and subsequent unit regions, thus generating image data corresponding to the selected unit region and the subsequent unit regions.
  • the selected unit region and the subsequent unit regions are sequentially re-imaged. Therefore, when inappropriate images have been obtained with respect to a plurality of unit regions, the selected unit region and the plurality of subsequent unit regions are re-imaged by the user issuing a one-time selection instruction. This simplifies the operation for the selection instruction for the unit region, performed by the user.
  • a magnification observation method for imaging an object to display an image of the object, including the steps of: respectively imaging a plurality of unit regions of an object on a previously set imaging condition, to generate a plurality of pieces of image data respectively corresponding to the plurality of unit regions; generating positional information indicative of respective positions of the plurality of unit regions; storing the generated plurality of pieces of image data along with the generated positional information; connecting the stored plurality of pieces of image data, to generate connected image data; displaying images of the object including the plurality of unit regions as a region presentation image; accepting a selection instruction for selecting any of the plurality of unit regions with the region presentation image in a displayed state; and generating image data corresponding to the selected unit region by re-imaging the selected unit region on an imaging condition different from the previously set imaging condition based on the accepted selection instruction and the stored positional information, and storing the generated image data replaceably with the image data corresponding to the selected unit region among the image data corresponding to the stored plurality of unit regions.
  • a plurality of unit regions of an object are respectively imaged on a previously set imaging condition, to generate a plurality of pieces of image data respectively corresponding to the plurality of unit regions. Further, there is generated positional information indicative of respective positions of the plurality of unit regions. The generated plurality of pieces of image data are stored along with the generated positional information, and the stored plurality of pieces of image data are connected, to generate connected image data.
  • An image of the object including the plurality of unit regions is displayed as a region presentation image.
  • based on the accepted selection instruction and the stored positional information, the selected unit region is re-imaged on an imaging condition different from the previously set imaging condition, to generate image data corresponding to the selected unit region.
  • the image data generated by the re-imaging is stored replaceably with the image data corresponding to the selected unit region among the image data corresponding to the stored plurality of unit regions. Hence, the image data corresponding to the selected unit region among the stored image data can be replaced with the image data generated by the re-imaging, to thereby generate the connected image data corresponding to the plurality of unit regions.
  • the image data corresponding to the selected unit region is replaceable with the image data generated by the re-imaging on the imaging condition different from the previously set imaging condition. Therefore, when an imaging condition for part of the unit regions is not appropriate, images of those unit regions can be obtained on appropriate imaging conditions.
  • a magnification observation program for causing a processing apparatus to execute a process of imaging an object to display an image of the object, the program including the processes of: respectively imaging a plurality of unit regions of an object on a previously set imaging condition, to generate a plurality of pieces of image data respectively corresponding to the plurality of unit regions; generating positional information indicative of respective positions of the plurality of unit regions; storing the generated plurality of pieces of image data along with the generated positional information; connecting the stored plurality of pieces of image data, to generate connected image data; displaying images of the object including the plurality of unit regions as a region presentation image; accepting a selection instruction for selecting any of the plurality of unit regions with the region presentation image in a displayed state; and generating image data corresponding to the selected unit region by re-imaging the selected unit region on an imaging condition different from the previously set imaging condition based on the accepted selection instruction and the stored positional information, and storing the image data generated by the re-imaging replaceably with the image data corresponding to the selected unit region among the image data corresponding to the stored plurality of unit regions.
  • a plurality of unit regions of an object are respectively imaged on a previously set imaging condition, to generate a plurality of pieces of image data respectively corresponding to the plurality of unit regions.
  • Positional information indicative of respective positions of the plurality of unit regions is also generated.
  • the generated plurality of pieces of image data are stored along with the generated positional information, and the stored plurality of pieces of image data are connected, to generate connected image data.
  • An image of the object including the plurality of unit regions is displayed as a region presentation image.
  • based on the accepted selection instruction and the stored positional information, the selected unit region is re-imaged on an imaging condition different from the previously set imaging condition, to generate image data corresponding to the selected unit region.
  • the image data generated by the re-imaging is stored replaceably with the image data corresponding to the selected unit region among the image data corresponding to the stored plurality of unit regions. Hence, the image data corresponding to the selected unit region among the stored image data can be replaced with the image data generated by the re-imaging, to thereby generate the connected image data corresponding to the plurality of unit regions.
  • the image data corresponding to the selected unit region is replaceable with the image data generated by re-imaging on the imaging condition different from the previously set imaging condition. Therefore, when an imaging condition for part of the unit regions is not appropriate, an image of that unit region can be obtained on an appropriate imaging condition.
  • thus, in the case of re-imaging an object to obtain connected image data corresponding to a plurality of unit regions, the connected image data can be efficiently obtained in a short period of time.
  • FIG. 1 is a block diagram showing a configuration of a magnification observation device according to a first embodiment;
  • FIG. 2 is a perspective view showing a microscope of the magnification observation device according to the first embodiment;
  • FIG. 3 is a schematic view showing a state where an imaging unit of the microscope is fixed parallel to a Z-direction;
  • FIG. 4 is a schematic view showing a state where the imaging unit of the microscope is inclined at a desired angle from the Z-direction;
  • FIG. 5 is a view showing a display example of a display part at the time of imaging an observation object;
  • FIG. 6 is a view showing another display example of the display part at the time of imaging the observation object;
  • FIGS. 7 to 11 are views each showing a display example of the display part during a magnification observation process according to the first embodiment;
  • FIGS. 12 and 13 are flowcharts of the magnification observation process according to the first embodiment;
  • FIG. 14 is a view showing a display example of the display part during a magnification observation process according to a second embodiment; and
  • FIGS. 15 and 16 are flowcharts of the magnification observation process according to the second embodiment.
  • a magnification observation device, a magnification observation method, and a magnification observation program according to a first embodiment will be described with reference to the drawings.
  • FIG. 1 is a block diagram showing a configuration of a magnification observation device according to the first embodiment.
  • two directions orthogonal to each other within a horizontal plane are taken as an X-direction and a Y-direction, and the direction perpendicular to the X-direction and the Y-direction (vertical direction) is taken as a Z-direction.
  • a magnification observation device 300 is provided with a microscope 100 and an image processing apparatus 200 .
  • the microscope 100 includes an imaging unit 10 , a stage unit 20 , and a rotational angle sensor 30 .
  • the imaging unit 10 includes a color CCD (charge coupled device) 11 , a half mirror 12 , an object lens 13 , a zoom adjusting part 13 a, a magnification detecting part 13 b , an A/D converter (analog/digital converter) 15 , an illumination light source 16 , and a lens driving part 17 .
  • the stage unit 20 includes a stage 21 , a stage driving part 22 , and a stage supporting part 23 . An observation object S is placed on the stage 21 .
  • the illumination light source 16 is, for example, a halogen lamp or a white light LED (light-emitting diode) which generates white light.
  • White light generated by the illumination light source 16 is reflected by the half mirror 12 , and thereafter collected by the object lens 13 onto the observation object S on the stage 21 .
  • the white light reflected by the observation object S is transmitted through the object lens 13 and the half mirror 12 , and incident on the color CCD 11 .
  • the color CCD 11 has a plurality of pixels for red that receive red wavelength light, a plurality of pixels for green that receive green wavelength light, and a plurality of pixels for blue that receive blue wavelength light.
  • the plurality of pixels for red, the plurality of pixels for green, and the plurality of pixels for blue are two-dimensionally arrayed. From each of the pixels in the color CCD 11 , an electric signal corresponding to a light receiving amount is outputted.
  • the output signal of the color CCD 11 is converted to a digital signal by the A/D converter 15 .
  • the digital signal outputted from the A/D converter 15 is sequentially provided as image data to the image processing apparatus 200 .
  • instead of the color CCD 11 , an imaging element such as a CMOS (complementary metal oxide semiconductor) image sensor may be used.
  • the object lens 13 is a zoom lens.
  • the zoom adjusting part 13 a changes a magnification of the object lens 13 by control of the image processing apparatus 200 .
  • the magnification detecting part 13 b detects the magnification of the object lens 13 , and provides a detection result to the image processing apparatus 200 .
  • the magnification of the object lens 13 is adjustable by the image processing apparatus 200 to an arbitrary magnification within a fixed range.
  • the zoom adjusting part 13 a may be operated by the user, to adjust the magnification of the object lens 13 .
  • the adjusted magnification of the object lens 13 is detected by the magnification detecting part 13 b , and provided to the image processing apparatus 200 .
  • the object lens 13 is provided movably in the Z-direction.
  • the lens driving part 17 moves the object lens 13 in the Z-direction by control of the image processing apparatus 200 . Thereby, a focal position of the imaging unit 10 moves in the Z-direction.
  • the stage 21 is rotatably provided on the stage supporting part 23 around an axis in the Z-direction.
  • the stage driving part 22 moves the stage 21 in an x-direction and a y-direction, described later, relatively with respect to the stage supporting part 23 based on a movement command signal (drive pulse) provided from the image processing apparatus 200 .
  • the stage driving part 22 uses a stepping motor.
  • the rotational angle sensor 30 detects a rotational angle of the stage 21 , and provides the image processing apparatus 200 with an angle detection signal indicating the detected angle.
  • in this manner, the position of the stage 21 in the X-direction and the Y-direction and the rotational angle thereof are acquired.
  • the image processing apparatus 200 includes an interface 210 , a CPU (central processing unit) 220 , a ROM (read only memory) 230 , a storage unit 240 , an input unit 250 , a display part 260 , and an operation memory 270 .
  • the storage unit 240 is made up of a hard disk and the like. In the storage unit 240 , a later-described magnification observation program is stored, and a variety of data such as image data provided from the microscope 100 through the interface 210 are also stored. A detail of the magnification observation program will be described later.
  • the input unit 250 includes a keyboard and a pointing device. As the pointing device, a mouse, a joystick, or the like is used.
  • the display part 260 is configured, for example, by a liquid crystal display panel or an organic EL (electroluminescent) panel.
  • the operation memory 270 is made up of a RAM (random access memory), and used for processing a variety of data.
  • the CPU 220 executes the magnification observation program stored in the storage unit 240 , to perform image processing based on image data by means of the operation memory 270 , and displays an image based on the image data in the display part 260 . Further, the CPU 220 controls the color CCD 11 , the zoom adjusting part 13 a , the illumination light source 16 , the lens driving part 17 , and the stage driving part 22 of the microscope 100 through the interface 210 .
  • FIG. 2 is a perspective view showing the microscope 100 of the magnification observation device 300 according to the first embodiment.
  • the X-direction, the Y-direction, and the Z-direction are indicated by arrows.
  • the microscope 100 has a base 1 .
  • a first supporting base 2 is attached onto the base 1 , and a second supporting base 3 is attached to the front surface of the first supporting base 2 so as to be embedded thereinto.
  • a connecting part 4 is rotatably attached to the top edge of the first supporting base 2 around a rotational axis R 1 extending in the Y-direction.
  • a rotational column 5 is attached to the connecting part 4 .
  • the rotational column 5 is inclinable within a vertical plane parallel to the Z-direction with the rotational axis R 1 taken as a fulcrum point in association with rotation of the connecting part 4 .
  • the user can fix the connecting part 4 to the first supporting base 2 by means of a fixing knob 9 .
  • a circular supporting part 7 is attached to the front surface of a connecting part 6 .
  • a substantially tubular imaging unit 10 is attached to the supporting part 7 .
  • a light axis R 2 of the imaging unit 10 is parallel to the Z-direction.
  • the supporting part 7 has a plurality of adjustment screws 41 for moving the imaging unit 10 within a horizontal plane. It is possible to adjust a position of the imaging unit 10 such that the light axis R 2 of the imaging unit 10 vertically intersects with a rotational axis R 1 by means of the plurality of adjustment screws 41 .
  • a slider 8 is attached, slidably in the Z-direction, to the front surface of the second supporting base 3 on the base 1 .
  • an adjustment knob 42 is provided on the side surface of the second supporting base 3 , and a position of the slider 8 in the Z-direction (height direction) is adjustable by the adjustment knob 42 .
  • the stage supporting part 23 of the stage unit 20 is attached onto the slider 8 .
  • the stage 21 is rotatably provided around a rotational axis R 3 in the Z-direction with respect to the stage supporting part 23 . Further, the x-direction and the y-direction intersecting with each other within the horizontal plane are set on the stage 21 .
  • the stage 21 is provided movably in the x-direction and the y-direction by the stage driving part 22 of FIG. 1 .
  • when the stage 21 rotates around the rotational axis R 3 , the x-direction and the y-direction of the stage 21 also rotate. This leads to inclination of the x-direction and the y-direction of the stage 21 within the horizontal plane with respect to the X-direction and the Y-direction.
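Because the x-direction and the y-direction of the stage 21 can thus be inclined with respect to the X-direction and the Y-direction, converting stage coordinates into global coordinates is a plain two-dimensional rotation. The following is a minimal sketch of that conversion; the function name is hypothetical, and the assumption that the rotation angle comes directly from the rotational angle sensor 30 is illustrative rather than taken from the patent.

```python
import math

def stage_to_global(x: float, y: float, theta_rad: float) -> tuple:
    """Convert stage coordinates (x-direction, y-direction) to global
    (X-direction, Y-direction) coordinates for a stage rotated by theta
    around the rotational axis R3. A standard 2-D rotation; the device's
    actual bookkeeping may differ."""
    X = x * math.cos(theta_rad) - y * math.sin(theta_rad)
    Y = x * math.sin(theta_rad) + y * math.cos(theta_rad)
    return X, Y

# e.g. a point 1 mm along the stage x-direction with the stage rotated
# by 30 degrees: stage_to_global(1.0, 0.0, math.radians(30)) -> (0.866, 0.5)
```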
  • An imaging range (visual field range) of the imaging unit 10 varies depending on a magnification of the imaging unit 10 .
  • the imaging range of the imaging unit 10 is referred to as a unit region.
  • the stage 21 can be moved in the x-direction and the y-direction, to thereby acquire image data of a plurality of unit regions.
  • the image data of the plurality of unit regions can be connected, to thereby display images of the plurality of unit regions in the display part 260 of FIG. 1 .
  • while the imaging range of the imaging unit 10 is referred to as the unit region in the present embodiment as thus described, the unit region is not necessarily the imaging range of the imaging unit 10 .
  • part of regions within the imaging range of the imaging unit 10 may be taken as a unit region. In this case, the unit region is smaller than the imaging range of the imaging unit 10 .
  • FIG. 3 is a schematic view showing a state where the imaging unit 10 of the microscope 100 is fixed parallel to the Z-direction.
  • FIG. 4 is a schematic view showing a state where the imaging unit 10 of the microscope 100 is inclined at a desired angle from the Z-direction.
  • the fixing knob 9 is fastened, to fix the connecting part 4 to the first supporting base 2 .
  • the light axis R 2 of the imaging unit 10 vertically intersects with the rotational axis R 1 while being in a parallel state to the Z-direction.
  • the light axis R 2 of the imaging unit 10 is vertical to the surface of the stage 21 .
  • the fixing knob 9 is loosened to make the connecting part 4 rotatable around the rotational axis R 1 , and the rotational column 5 inclinable with the rotational axis R 1 taken as a fulcrum point. Therefore, as shown in FIG. 4 , the light axis R 2 of the imaging unit 10 is inclinable at an arbitrary angle ⁇ with respect to the Z-direction. In this case, the light axis R 2 of the imaging unit 10 vertically intersects with the rotational axis R 1 . Similarly, the light axis R 2 of the imaging unit 10 is inclinable at an arbitrary angle on the side opposite to the side in FIG. 4 with respect to the Z-direction.
  • a height of the surface of an observation object on the stage 21 can be made to agree with a height of the rotational axis R 1 , to thereby observe the same portion of the observation object in a vertical direction and an oblique direction.
  • two magnifications of the object lens 13 , which are different from each other within the range adjustable by the zoom adjusting part 13 a of FIG. 1 , are referred to as a first magnification and a second magnification.
  • the second magnification is lower than the first magnification.
  • FIG. 5 is a view showing a display example of the display part 260 at the time of imaging the observation object S
  • FIG. 6 is a view showing another display example of the display part 260 at the time of imaging the observation object S.
  • as shown in FIGS. 5 and 6 , an image display region 410 , a condition setting region 420 , and a pointer p are displayed on a screen of the display part 260 .
  • an image based on image data is displayed in the image display region 410 , and a first magnification set button 421 , a second magnification set button 422 and a connection process button 423 are displayed in the condition setting region 420 .
  • the user operates the first magnification set button 421 by use of the input unit 250 of FIG. 1 .
  • the zoom adjusting part 13 a is thus controlled, to adjust the magnification of the object lens 13 to the first magnification.
  • the CPU 220 displays an image corresponding to the unit region of the observation object S in the image display region 410 of the display part 260 .
  • the user operates the second magnification set button 422 by use of the input unit 250 of FIG. 1 .
  • the zoom adjusting part 13 a is thus controlled, to adjust the magnification of the object lens 13 to the second magnification.
  • the CPU 220 displays an image corresponding to the unit region of the observation object S in the image display region 410 of the display part 260 .
  • since the second magnification is lower than the first magnification, the unit region in the case where the first magnification is set is smaller than the unit region in the case where the second magnification is set. Setting the second magnification thus enlarges the range of the image of the observation object S displayed in the image display region 410 .
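The inverse relation between magnification and the size of the unit region can be made concrete with a small worked example. The sketch below assumes the visual field is simply the image sensor dimensions divided by the total magnification; the sensor size and the magnification values are made-up numbers, not taken from the patent.

```python
def unit_region_mm(sensor_w_mm: float, sensor_h_mm: float,
                   magnification: float) -> tuple:
    """Approximate the unit region (visual field) at a given magnification,
    assuming visual field = sensor size / total magnification."""
    return sensor_w_mm / magnification, sensor_h_mm / magnification

# For a hypothetical 4.8 mm x 3.6 mm sensor: at a first magnification of
# 40x the unit region is 0.12 mm x 0.09 mm, while at a second magnification
# of 20x it is 0.24 mm x 0.18 mm, so the displayed range is enlarged.
print(unit_region_mm(4.8, 3.6, 40))  # (0.12, 0.09)
```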
  • a dotted line is provided to the portion of the image corresponding to the unit region of the observation object S at the time when the magnification of the object lens 13 is the first magnification.
  • the user can move the stage 21 in the x-direction or the y-direction, to thereby change the range of the images of the observation object S displayed in the image display region 410 .
  • the user operates the connection process button 423 by use of the input unit 250 of FIG. 1 . Thereby, a later-mentioned magnification observation process is started.
  • the stage 21 can be moved in the x-direction and the y-direction, to thereby acquire image data of a plurality of unit regions, and the image data of the plurality of unit regions can be connected, to thereby display images of the plurality of unit regions in the display part 260 of FIG. 1 .
  • a process of connecting image data of a plurality of unit regions and displaying the images of the plurality of unit regions in the display part 260 as thus described is referred to as a magnification observation process.
  • the user operates the connection process button 423 of FIG. 5 or 6 , to start the magnification observation process.
  • FIGS. 7 to 11 are views each showing a display example of the display part 260 during the magnification observation process according to the first embodiment.
  • the CPU 220 of FIG. 1 acquires a digital signal provided from the A/D converter 15 of FIG. 1 as image data corresponding to a first unit region, and stores the acquired image data as first image data into the storage unit 240 of FIG. 1 .
  • the CPU 220 generates first positional information corresponding to the first image data and indicative of a position of the first unit region, and stores the generated first positional information into the storage unit 240 of FIG. 1 .
  • the CPU 220 displays an image r 1 of the first unit region in the display part 260 of FIG. 1 based on the acquired first image data and first positional information.
  • the positional information may, for example, be generated based on a response signal from the stage driving part 22 ( FIG. 1 ) to the movement command signal and the angle detection signal from the rotational angle sensor 30 ( FIG. 1 ).
  • the positional information may be a coordinate of each unit region in the X-direction and the Y-direction. Further, in a case where imaged positions of the plurality of unit regions are previously set in imaging order, the positional information may be the order of imaging.
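As a concrete illustration of storing image data along with either kind of positional information (an X/Y coordinate or the imaging order), the following minimal sketch keys each unit region's image data by its index so that re-imaged data can later replace it. The record fields are assumptions for illustration, not the patent's actual data layout.

```python
from dataclasses import dataclass
import numpy as np

@dataclass
class TileRecord:
    """One unit region's image data plus its positional information."""
    order: int          # imaging order (usable when positions are preset)
    x_mm: float         # X-direction coordinate of the unit region
    y_mm: float         # Y-direction coordinate of the unit region
    pixels: np.ndarray  # RGB image data from the A/D converter

# In-memory stand-in for the storage unit 240: records keyed by unit-region
# index, so storage[6] can later be replaced by re-imaged data.
storage: dict[int, TileRecord] = {}
storage[1] = TileRecord(order=1, x_mm=0.0, y_mm=0.0,
                        pixels=np.zeros((480, 640, 3), dtype=np.uint8))
```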
  • the CPU 220 controls the stage driving part 22 of FIG. 1 to move the stage 21 of FIG. 1 in the x-direction and the y-direction such that regions around the first unit region are imaged.
  • a unit region adjacent to the first unit region is imaged as a second unit region.
  • the CPU 220 acquires a digital signal provided from the A/D converter 15 of FIG. 1 as image data corresponding to the second unit region, and stores the acquired image data as second image data into the storage unit 240 of FIG. 1 .
  • the CPU 220 generates second positional information corresponding to the second image data and indicative of a position of the second unit region, and stores the generated second positional information into the storage unit 240 of FIG. 1 .
  • the CPU 220 connects the second image data to the first image data stored in the storage unit 240 , and displays an image r 2 of the second unit region in the display part 260 of FIG. 1 based on the acquired second image data and second positional information.
  • respective parts of the adjacent unit regions are preferably set so as to overlap each other.
  • pattern matching can be performed on these overlapping parts, to thereby sequentially connect the acquired image data to the image data corresponding to the adjacent unit region, as in the sketch below.
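The patent does not specify the pattern matching algorithm. One common choice is normalized cross-correlation over the overlapping strips of adjacent unit regions; here is a brute-force sketch under that assumption, with hypothetical parameter names.

```python
import numpy as np

def match_offset(prev_edge: np.ndarray, next_edge: np.ndarray,
                 max_shift: int = 20) -> int:
    """Estimate the row shift that best aligns two same-sized overlapping
    grayscale strips, by exhaustive normalized cross-correlation."""
    best_shift, best_score = 0, -np.inf
    for s in range(-max_shift, max_shift + 1):
        # Crop both strips to their common rows for the candidate shift s.
        a = prev_edge[max(0, s):prev_edge.shape[0] + min(0, s)].astype(float)
        b = next_edge[max(0, -s):next_edge.shape[0] + min(0, -s)].astype(float)
        a -= a.mean()
        b -= b.mean()
        denom = np.sqrt((a * a).sum() * (b * b).sum())
        score = (a * b).sum() / denom if denom else 0.0
        if score > best_score:
            best_shift, best_score = s, score
    return best_shift
```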
  • imaging proceeds by a series of operations including movement of the stage 21 , imaging of an n-th unit region (n is a natural number not less than 2), storage of image data and positional information, connection of image data, and display of an image. Hereinafter, this series of operations is referred to as a connected image generating operation.
  • the connected image generating operations are repeated, to sequentially display images r 1 to r 6 of first to sixth unit regions in a spiral form in the image display region 410 , for example as shown in FIG. 7 .
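The spiral imaging order around the first unit region can be generated with simple bookkeeping over stage offsets. A sketch follows; the exact traversal used by the device is an assumption inferred from FIG. 7.

```python
def spiral_offsets(n: int):
    """Yield (col, row) offsets for the first n unit regions, starting at
    the center and spiraling outward."""
    x = y = 0
    dx, dy = 1, 0            # first step in the +x direction
    leg, taken, turns = 1, 0, 0
    for _ in range(n):
        yield x, y
        x, y = x + dx, y + dy
        taken += 1
        if taken == leg:     # finished the current leg: turn
            taken = 0
            dx, dy = -dy, dx
            turns += 1
            if turns % 2 == 0:   # every second turn, the leg grows by one
                leg += 1

# list(spiral_offsets(6)) -> [(0, 0), (1, 0), (1, 1), (0, 1), (-1, 1), (-1, 0)]
```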
  • a re-imaging button 424 and a connection end button 425 are displayed in the condition setting region 420 .
  • the image display region 410 is provided with dotted lines for partitioning the plurality of images corresponding to the plurality of unit regions.
  • the re-imaging button 424 of FIG. 7 can be operated, to thereby selectively re-image any unit region among the plurality of imaged unit regions. An operation at the time of re-imaging will be described.
  • FIG. 8 shows a display example of the display part 260 in the case of operation of the re-imaging button 424 of FIG. 7 .
  • the CPU 220 suspends repetition of the connected image generating operation described above. Further, as shown in FIG. 8 , the CPU 220 displays in the display part 260 the images r 1 to r 6 as a region presentation image based on the plurality of pieces of image data, which were connected until the suspension. At this time, the connection end button 425 is displayed in the condition setting region 420 .
  • the user can select any of the plurality of images r 1 to r 6 displayed in the image display region 410 by use of the input unit 250 of FIG. 1 , to thereby select a unit region that needs re-imaging among the plurality of unit regions corresponding to the plurality of images r 1 to r 6 .
  • when the pointer p is superimposed on any of the plurality of images r 1 to r 6 , a selection frame HF is displayed so as to surround the image superimposed with the pointer p (the image r 6 in FIG. 8 ).
  • the user can select with ease a unit region that needs re-imaging, while viewing the selection frame HF.
  • when any of the plurality of images r 1 to r 6 is selected, the CPU 220 is provided with a signal indicative of the unit region of the selected image as a selection instruction.
  • in the example of FIG. 8 , a star-shaped mark m included in the image r 6 among the plurality of images r 1 to r 6 is unclearly displayed.
  • the user selects the image r 6 by use of the input unit 250 of FIG. 1 .
  • the CPU 220 accepts a signal indicative of the unit region of the image r 6 (sixth unit region), and controls the stage driving part 22 of FIG. 1 to move the stage 21 of FIG. 1 based on the accepted selection instruction and the positional information (sixth positional information in the present example) stored in the storage unit 240 , so as to allow imaging of the sixth unit region.
  • the CPU 220 acquires a digital signal provided from the A/D converter 15 as image data corresponding to the sixth unit region, and displays in the display part 260 of FIG. 1 the image r 6 of the sixth unit region based on the acquired image data.
  • FIG. 9 shows a display example of the display part 260 in the case where the image r 6 of FIG. 8 is selected.
  • when the image r 6 of FIG. 8 is selected, the image r 6 is displayed in an entire region of the image display region 410 .
  • a re-imaging decision button 426 , a gain adjustment button 427 a , an exposure time button 427 b , a white balance button 427 c , and a focus button 427 d are displayed in the condition setting region 420 .
  • the user can operate the gain adjustment button 427 a , the exposure time button 427 b , the white balance button 427 c , and the focus button 427 d , to thereby adjust imaging conditions at the time of re-imaging.
  • when any of the gain adjustment button 427 a , the exposure time button 427 b , the white balance button 427 c , and the focus button 427 d is operated, the CPU 220 is provided with an adjustment instruction for adjusting the imaging condition.
  • the user operates the gain adjustment button 427 a .
  • the CPU 220 accepts a gain adjustment instruction for the color CCD 11 of FIG. 1 as the adjustment instruction for the imaging condition for the unit region.
  • the CPU 220 increases or decreases the gain of the color CCD 11 based on the provided gain adjustment instruction.
  • the user operates the exposure time button 427 b .
  • the CPU 220 accepts an adjustment instruction for exposure time as the adjustment instruction for the imaging condition for the unit region.
  • the CPU 220 makes a shutter speed long or short based on the provided adjustment instruction for the exposure time.
  • the user operates the white balance button 427 c .
  • the CPU 220 accepts an adjustment instruction for white balance as the adjustment instruction for the imaging condition for the unit region. Based on the provided adjustment instruction for the white balance, the CPU 220 corrects a value of image data so as to change a color temperature of an image displayed based on the image data.
  • the user operates the focus button 427 d .
  • the CPU 220 accepts an adjustment instruction for a position in the Z-direction of the object lens 13 (hereinafter, referred to as Z position) as the adjustment instruction for the imaging condition for the unit region.
  • the CPU 220 controls the lens driving part 17 based on the provided adjustment instruction for the Z position, to move the object lens 13 in the Z-direction so as to move a focal position of the object lens 13 in the Z-direction.
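The four adjustment buttons 427a to 427d map naturally onto a small table of imaging conditions. The sketch below shows one way such adjustment instructions might be dispatched; all names, units, and default values are illustrative assumptions.

```python
# Hypothetical imaging-condition state mirroring buttons 427a-427d.
conditions = {
    "gain_db": 0.0,           # gain of the color CCD 11
    "exposure_s": 1 / 60,     # shutter speed (exposure time)
    "white_balance_k": 5500,  # color temperature used to correct image data
    "z_position_um": 0.0,     # Z position of the object lens 13 (focus)
}

def apply_adjustment(instruction: str, delta: float) -> None:
    """Apply one adjustment instruction to the stored imaging conditions,
    as the CPU 220 is described to do when a button is operated."""
    if instruction not in conditions:
        raise ValueError(f"unknown imaging condition: {instruction}")
    conditions[instruction] += delta

apply_adjustment("exposure_s", 1 / 120)  # e.g. lengthen the exposure time
```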
  • the CPU 220 displays in the display part 260 of FIG. 1 the image r 6 of the sixth unit region of the observation object S based on image data acquired by imaging on an imaging condition after the adjustment.
  • the user can adjust a variety of imaging conditions with ease while viewing the image r 6 displayed in the image display region 410 .
  • the user can adjust the imaging condition with ease so as to make the star-shaped mark m, which is unclearly displayed in the example of FIG. 8 , clearly displayed while viewing the image r 6 shown in the image display region 410 .
  • hereinafter, image data acquired by the re-imaging is referred to as re-imaged data, and an image displayed in the display part 260 based on the re-imaged data is referred to as a re-imaged image.
  • after the adjustment of the imaging condition for the unit region, the user operates the re-imaging decision button 426 of FIG. 9 by use of the input unit 250 of FIG. 1 . Thereby, the CPU 220 re-images the sixth unit region on the adjusted imaging condition.
  • the CPU 220 stores the image data, acquired by the re-imaging as re-imaged data corresponding to the sixth unit region, into the storage unit 240 of FIG. 1 .
  • the CPU 220 generates sixth positional information corresponding to the re-imaged data and indicative of a position of the sixth unit region, and stores the generated sixth positional information into the storage unit 240 of FIG. 1 .
  • the CPU 220 replaces the sixth image data, stored before the re-imaging among the plurality of pieces of image data stored in the storage unit 240 of FIG. 1 , with the newly stored sixth re-imaged data. Moreover, the CPU 220 connects the sixth re-imaged data to the first to fifth image data.
  • the CPU 220 displays, in the display part 260 of FIG. 1 , the plurality of images r 1 to r 5 of the first to fifth unit regions and a re-imaged image r 6 x of the sixth unit region.
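Replacing the stored image data with re-imaged data and then rebuilding the connected image can be pictured as follows. This is a sketch with assumed field names and grid-aligned tiles; the actual device also overlaps adjacent unit regions and aligns them by pattern matching.

```python
import numpy as np

def replace_tile(storage: dict, region: int, reimaged: np.ndarray,
                 position: tuple) -> None:
    """Replace one unit region's stored image data with re-imaged data
    and its positional information (col, row)."""
    storage[region] = {"pixels": reimaged, "position": position}

def rebuild_connected(storage: dict, tile_h: int, tile_w: int) -> np.ndarray:
    """Re-connect all stored tiles into one image by placing each tile at
    its grid position; overlap blending is omitted for brevity."""
    cols = [t["position"][0] for t in storage.values()]
    rows = [t["position"][1] for t in storage.values()]
    out = np.zeros(((max(rows) - min(rows) + 1) * tile_h,
                    (max(cols) - min(cols) + 1) * tile_w, 3), dtype=np.uint8)
    for t in storage.values():
        c = t["position"][0] - min(cols)
        r = t["position"][1] - min(rows)
        out[r * tile_h:(r + 1) * tile_h,
            c * tile_w:(c + 1) * tile_w] = t["pixels"]
    return out
```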
  • the CPU 220 then resumes the suspended connected image generating operation when the next unit region is not an imaged unit region.
  • the CPU 220 sequentially re-images the unit region and subsequent unit regions on the imaging condition after the adjustment, to sequentially acquire re-imaged data respectively corresponding to the unit region and the subsequent unit regions, while storing positional information corresponding to the respective re-imaged data into the storage unit 240 of FIG. 1 .
  • the CPU 220 replaces the image data stored in the storage unit 240 of FIG. 1 with the re-imaged data, and displays the re-imaged image in the image display region 410 .
  • FIG. 10 shows a display example of the display part 260 after operation of the re-imaging decision button 426 of FIG. 9 .
  • the above connected image generating operation is resumed.
  • subsequent unit regions are imaged, and images of the imaged unit regions are sequentially displayed in the image display region 410 .
  • a seventh image r 7 is displayed.
  • in the connected image generating operation after the resumption, the subsequent unit regions are imaged on the imaging condition adjusted at the time of the re-imaging.
  • the re-imaging can lead to imaging of the subsequent unit regions on an appropriate condition.
  • alternatively, each unit region may be imaged on the imaging condition set at the time of start of the magnification observation process.
  • FIG. 11 shows a display example of the display part 260 when the connection end button 425 is operated at the time of imaging of a 25 th unit region.
  • the CPU 220 completes the connected image generating operation.
  • the CPU 220 displays, in the display part 260 , the images r 1 to r 5 and r 7 to r 25 and the re-imaged image r 6 x as connected images based on the connected plurality of pieces of image data and re-imaged data.
  • the CPU 220 stores into the storage unit 240 the connected image data generated by mutual connection of the plurality of pieces of image data corresponding to the plurality of unit regions and re-imaged data, to complete the magnification observation process. At this time, the CPU 220 also stores into the storage unit 240 positional information respectively corresponding to the plurality of pieces of image data and the re-imaged data which constitute the connected image data.
  • a plurality of unit regions on the surface of the observation object S are imaged, and based on a plurality of pieces of image data obtained by the imaging, images of the plurality of unit regions are displayed in the display part 260 .
  • the user can select an inappropriate image to thereby select a unit region that needs re-imaging, while viewing the images r 1 to r 6 of the plurality of imaged unit regions.
  • the selected unit region is re-imaged, and the image data stored in the storage unit 240 is replaced with the re-imaged data.
  • connected image data is generated based on the plurality of pieces of image data and the re-imaged data stored in the storage unit 240 .
  • image data corresponding to the unit region to be re-imaged among the plurality of pieces of image data stored in the storage unit 240 of FIG. 1 is replaced with the re-imaged data at the time of re-imaging.
  • connected image data including the re-imaged data can be obtained with ease.
  • alternatively, the acquired re-imaged data may be stored into the storage unit 240 in a state replaceable with the corresponding image data.
  • in this case, the user may operate the input unit 250 of FIG. 1 to replace the image data with the re-imaged data.
  • the plurality of unit regions are sequentially imaged in the spiral form such that regions around the first unit region are imaged by the connected image generating operation, and the connection end button 425 is operated, to complete the magnification observation process.
  • the user can decide with ease the observation object range of the observation object S that needs imaging without previously setting an imaging range, while viewing the plurality of images displayed in the display part 260 .
  • the unit regions subsequent to the selected unit region are also sequentially re-imaged.
  • in this case, the plurality of subsequent unit regions are re-imaged by only a one-time selection instruction from the user. This simplifies the operation for the selection instruction for the unit region, performed by the user.
  • the CPU 220 may re-image only the selected unit region instead of sequentially re-imaging the selected unit region and the subsequent unit regions.
  • the imaging condition may be appropriately adjusted with respect to each unit region.
  • FIGS. 12 and 13 are flowcharts of the magnification observation process according to the first embodiment.
  • the CPU 220 of FIG. 1 executes the magnification observation program stored in the storage unit 240 , to perform the magnification observation process according to the present embodiment. The following describes an example of re-imaging only a unit region corresponding to an image selected at the time of re-imaging.
  • in an initial state, previously fixed imaging conditions (gain of the color CCD 11 , exposure time, white balance, focal position of the object lens 13 , and the like) are set in the magnification observation device 300 by the user's operation.
  • the CPU 220 images a first unit region of the observation object S on the previously set imaging conditions, to acquire image data corresponding to the first unit region and generate first positional information indicative of a position of the first unit region, and stores the acquired image data into the storage unit 240 of FIG. 1 along with the first positional information (step S 1 ). Further, the CPU 220 displays an image of the first unit region in the display part 260 of FIG. 1 based on the acquired image data and the first positional information (step S 2 ).
  • the CPU 220 determines whether or not a command for re-imaging has been issued (step S 3 ). For example, the CPU 220 determines that the command for re-imaging has been issued when the re-imaging button 424 of FIG. 7 is operated, and determines that the command for re-imaging has not been issued when the re-imaging button 424 is not operated.
  • the CPU 220 determines whether or not a command for completing the magnification observation process has been issued (step S 4 ). For example, the CPU 220 determines that the command for completing the magnification observation process has been issued when the connection end button 425 of FIG. 7 , 8 , or 10 is operated, and determines that the command for completing the magnification observation process has not been issued when the connection end button 425 is not operated.
  • when the command for completing the magnification observation process has been issued, the CPU 220 stores into the storage unit 240 the connected image data generated from the plurality of pieces of image data and the re-imaged data, and displays connected images in the display part 260 based on the connected image data (step S 5 ), to complete the magnification observation process.
  • when the command for completing the magnification observation process has not been issued, the CPU 220 controls the stage driving part 22 of FIG. 1 to move the stage 21 of FIG. 1 so as to image the next unit region (step S 6 ).
  • the CPU 220 images the next unit region to acquire image data corresponding to that unit region, and generates positional information indicative of a position of the unit region, and stores the acquired image data into the storage unit 240 of FIG. 1 along with the corresponding positional information while connecting the acquired image data to the image data corresponding to the previously stored another unit region (step S 7 ). Further, the CPU 220 displays in the display part 260 of FIG. 1 an image of the unit region based on the acquired image data and positional information (step S 8 ), and returns to the process of step S 4 .
  • when the command for re-imaging has been issued, the CPU 220 displays in the display part 260 images for allowing the user to select a unit region that needs re-imaging as the region presentation image (step S 9 ).
  • as the region presentation image at this time, there may be used a plurality of images based on all of the image data acquired after the start of the magnification observation process and mutually connected to each other, or a plurality of images based on part of the image data acquired after the start of the magnification observation process and mutually connected to each other. Further, as the region presentation image, there may be used, for example, images in which a plurality of non-connected images based on the plurality of pieces of image data are arrayed at fixed intervals.
  • the CPU 220 determines whether or not any imaged unit region has been selected as a re-imaging object (step S 10 ).
  • when any image in the region presentation image is selected, the CPU 220 determines that a unit region that needs re-imaging has been selected.
  • the CPU 220 controls the stage driving part 22 to move the stage 21 , so as to image the selected unit region based on the positional information stored in the storage unit 240 of FIG. 1 (step S 11 ).
  • the CPU 220 images the selected unit region to acquire image data, and displays an image based on the acquired image data in the display part 260 (step S 12 ).
  • the CPU 220 receives the above adjustment instruction for the imaging condition, to adjust an imaging condition based on the provided adjustment instruction (step S 13 ).
  • the CPU 220 adjusts a variety of imaging conditions based on the provided adjustment instruction. It is to be noted that, when no adjustment instruction is provided within a previously set period of time, the CPU 220 does not adjust the imaging condition.
  • the CPU 220 re-images the selected unit region on the imaging condition after the adjustment, to acquire re-imaged data corresponding to the selected unit region, and stores positional information corresponding to the re-imaged data into the storage unit 240 (step S 14 ).
  • the CPU 220 replaces image data of the selected unit region with the acquired re-imaged data in the storage unit 240 , and connects the acquired re-imaged data to another image data (step S 15 ).
  • the CPU 220 displays in the display part 260 an image of the unit region selected based on the acquired re-imaged data and positional information (step S 16 ), and returns to the process of step S 4 .
  • a series of processing operations including steps S 4 , S 6 , S 7 , and S 8 corresponds to the connected image generating operation.
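Condensing the flowcharts of FIGS. 12 and 13, the control flow of steps S1 through S16 might look like the following self-contained sketch. Imaging is simulated with strings, the user's inputs are scripted in a queue (None meaning "no command during this cycle"), and every interface is a stand-in rather than the patent's implementation.

```python
from collections import deque

def magnification_observation(commands: deque, regions: list) -> dict:
    """Toy walk-through of steps S1-S16: image unit regions in order,
    branch to re-imaging on command, and return the stored data on end."""
    def image(region, condition="preset"):     # stands in for S1/S7/S14
        return f"data({region},{condition})"
    storage = {regions[0]: image(regions[0])}  # S1-S2: first unit region
    pending = deque(regions[1:])
    while True:
        cmd = commands.popleft() if commands else None
        if cmd == "reimage":                   # S3 yes -> S9-S16
            selected = commands.popleft()      # S10: selected unit region
            condition = commands.popleft()     # S13: adjusted condition
            storage[selected] = image(selected, condition)  # S14-S15
        elif cmd == "end" or not pending:      # S4 yes -> S5: finish
            return storage                     # connected image data
        else:                                  # S4 no -> S6-S8: next region
            region = pending.popleft()
            storage[region] = image(region)

script = deque([None] * 6 + ["reimage", 6, "longer exposure", "end"])
# magnification_observation(script, list(range(1, 26))) images regions 1-7,
# re-images region 6 on the adjusted condition, then completes.
```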
  • the magnification observation device according to the second embodiment has the same configuration as that of the magnification observation device 300 of FIG. 1 , but a different magnification observation program is stored in the storage unit 240 . For this reason, the magnification observation process according to the second embodiment is different from the magnification observation process according to the first embodiment.
  • a detail of the magnification observation process according to the second embodiment will be described.
  • magnification observation process performed with the magnification of the object lens 13 being set to the first magnification, namely, the magnification observation process in the case of generating images corresponding to a plurality of unit regions at the first magnification.
  • First, an observation object range of the observation object S that needs imaging is set.
  • The user inputs information on the observation object range by use of the input unit 250 of FIG. 1.
  • Based on this input, the CPU 220 of FIG. 1 sets the observation object range of the observation object S. Further, when the observation object range is larger than the unit region, the CPU 220 sets the positions of a plurality of unit regions that need imaging within the set observation object range, for example as in the sketch below. Thereafter, the CPU 220 controls each constitutional element of the microscope 100 so as to sequentially image the plurality of set unit regions.
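  • As a concrete illustration, when the observation object range is rectangular, the unit region positions can be laid out as a grid stepped by one field of view, optionally less so that adjacent regions overlap for connection. The following Python sketch rests on these assumptions; the parameter names, values, and the overlap margin are illustrative and not taken from the patent.

    import math

    def unit_region_positions(range_w, range_h, fov_w, fov_h, overlap=0.0):
        """Stage (x, y) positions whose fields of view cover the range."""
        step_x = fov_w * (1.0 - overlap)      # step between adjacent unit regions
        step_y = fov_h * (1.0 - overlap)
        cols = max(1, math.ceil((range_w - fov_w) / step_x) + 1)
        rows = max(1, math.ceil((range_h - fov_h) / step_y) + 1)
        return [(c * step_x, r * step_y) for r in range(rows) for c in range(cols)]

    # Example: parameters chosen so that 5 x 5 = 25 regions result, as in FIG. 14.
    print(len(unit_region_positions(50.0, 50.0, 10.9, 10.9, overlap=0.1)))  # 25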
  • Next, the observation object S is imaged at the second magnification, which is lower than the first magnification.
  • The CPU 220 acquires image data corresponding to the unit region with the second magnification set, generates positional information indicative of the position of that unit region, and stores the acquired image data into the storage unit 240 of FIG. 1 along with the positional information.
  • The second magnification of the object lens 13 may be, for example, automatically set by the CPU 220 at the start of setting the observation object range, or set by the user operating the zoom adjusting part 13 a of FIG. 1 before the start of setting the observation object range.
  • The CPU 220 automatically sets the imaging condition at the start of imaging each unit region during the magnification observation process. This automatic setting will now be described specifically. In the following example, the CPU 220 automatically sets the Z position of the object lens 13 as the imaging condition.
  • With the unit region opposed to the object lens 13, the CPU 220 controls the lens driving part 17 of FIG. 1 to move the object lens 13 in the Z-direction, and images the unit region while moving the focal position of the object lens 13 in the Z-direction.
  • Based on the image data provided from the A/D converter 15, the CPU 220 detects the Z position (hereinafter referred to as the focal position) of the object lens 13 at which the focus of the object lens 13 agrees with the surface of the unit region. Subsequently, the CPU 220 moves the object lens 13 to the detected focal position. In this manner, the Z position of the object lens 13 at the time of imaging is set to the focal position.
  • The unit region is thus imaged with the Z position of the object lens 13 automatically set to the focal position. Since the unit region is imaged with the object lens 13 in an in-focus state, the user can easily obtain an in-focus image of each unit region.
  • The focal position, however, cannot always be detected.
  • In that case, the CPU 220 sets the Z position of the object lens 13 to a previously set position (e.g., the position set for the previously imaged unit region), and stores into the storage unit 240 of FIG. 1 information (hereinafter referred to as abnormal information) indicating that the imaging condition (the Z position of the object lens 13 in the present example) has not been normally set for the unit region. A sketch of this automatic setting with fallback follows.
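  • A sketch of this behavior, assuming a contrast-style focus measure evaluated at discrete Z positions, is shown below. The focus measure, the candidate positions, and the peak threshold min_peak are illustrative assumptions; the patent does not prescribe how the focal position is detected.

    def set_z_position(z_candidates, focus_measure, prev_z, abnormal_log,
                       region_id, min_peak=0.1):
        """Pick the in-focus Z position; fall back to prev_z on failure."""
        scores = {z: focus_measure(z) for z in z_candidates}  # image at each Z position
        best_z = max(scores, key=scores.get)
        if scores[best_z] < min_peak:       # no clear peak: focal position not detected
            abnormal_log.add(region_id)     # record abnormal information for the region
            return prev_z                   # reuse the previously set Z position
        return best_z

    # Illustrative use: a synthetic focus curve peaking at Z = 2.0.
    log = set()
    z = set_z_position([0.0, 1.0, 2.0, 3.0],
                       lambda z: 1.0 - abs(z - 2.0) / 3.0,
                       prev_z=0.0, abnormal_log=log, region_id="t6")
    print(z, sorted(log))                   # 2.0 []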
  • In this way, a plurality of pieces of image data respectively corresponding to all the unit regions are stored in the storage unit 240 along with a plurality of pieces of respectively corresponding positional information.
  • When the imaging condition has not been normally set for part or all of the unit regions, abnormal information on those unit regions is also stored in the storage unit 240.
  • The CPU 220 then displays in the display part 260 of FIG. 1, as a region presentation image, an image (hereinafter referred to as a low-magnification image) which was acquired by imaging the observation object S at the second magnification at the time of setting the observation object range and which is based on the image data stored in the storage unit 240.
  • The second magnification is preferably set such that the low-magnification image displayed in the display part 260 includes the entire image of the observation object range.
  • FIG. 14 is a view showing a display example of the display part 260 during the magnification observation process according to the second embodiment. As shown in FIG. 14, in the magnification observation process according to the second embodiment, all the unit regions within the observation object range are imaged, and a low-magnification image of the observation object S is displayed as the region presentation image in the image display region 410.
  • The CPU 220 of FIG. 1 displays, within the image display region 410, a plurality of region frames f surrounding the portions corresponding to the plurality of unit regions imaged at the first magnification.
  • The low-magnification image displayed within the image display region 410 is partitioned by the plurality of region frames f into twenty-five low-magnification images t 1 to t 25.
  • Based on the stored abnormal information, the CPU 220 determines whether or not the imaging condition has been normally set for each unit region, and highlights, based on the positional information stored in the storage unit 240, the low-magnification images (low-magnification images t 2 and t 6 in the example of FIG. 14) corresponding to the unit regions for which the imaging condition was not normally set.
  • FIG. 14 represents these highlights by dark hatching; a sketch of such per-tile highlighting is given below.
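  • The sketch below shows such highlighting for a low-magnification image held as a NumPy array: the image is divided along the region frames and the tiles whose unit regions carry abnormal information are darkened. The array representation, the grid shape, and the darkening factor are illustrative assumptions.

    import numpy as np

    def highlight_abnormal(low_mag, rows, cols, abnormal, darken=0.5):
        """Darken the tiles of `low_mag` whose (row, col) indices are in `abnormal`."""
        out = low_mag.astype(np.float32)                 # working copy
        th, tw = low_mag.shape[0] // rows, low_mag.shape[1] // cols
        for r, c in abnormal:
            out[r * th:(r + 1) * th, c * tw:(c + 1) * tw] *= darken
        return out.astype(low_mag.dtype)

    # Tiles t 2 and t 6 of a 5 x 5 grid correspond to indices (0, 1) and (1, 0).
    img = np.full((500, 500), 200, dtype=np.uint8)
    marked = highlight_abnormal(img, 5, 5, {(0, 1), (1, 0)})
    print(marked[0, 150], marked[0, 0])                  # 100 200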
  • In the second embodiment, the region presentation image is used for showing the unit regions for which the imaging condition has not been normally set. For this reason, differently from the first embodiment, it is not necessary to display in the display part 260 a plurality of images based on the image data acquired by imaging at the first magnification.
  • The user can operate the re-imaging button 424 of FIG. 14 to selectively re-image any of the plurality of imaged unit regions at the first magnification.
  • The CPU 220 then comes into a standby state until receiving a selection instruction.
  • The user selects any of the low-magnification images t 1 to t 25 displayed in the image display region 410 by use of the pointer p.
  • Thereby, the CPU 220 is provided with a selection instruction indicative of the unit region that needs re-imaging among the plurality of unit regions corresponding to the low-magnification images t 1 to t 25.
  • In the low-magnification images t 2 and t 6, the portions corresponding to the unit regions for which the imaging condition was not normally set are highlighted. Therefore, the user can view the highlighted low-magnification images t 2 and t 6 and preferentially select such a unit region as the unit region that needs re-imaging.
  • Suppose the user selects the low-magnification image t 6, thereby providing the CPU 220 with a signal indicative of the unit region corresponding to the low-magnification image t 6 as the selection instruction.
  • The CPU 220 images the selected unit region at the first magnification based on the provided selection instruction and the positional information stored in the storage unit 240.
  • The CPU 220 displays the low-magnification image t 6 in the entire region of the image display region 410, and displays the re-imaging decision button 426, the gain adjustment button 427 a, the exposure time button 427 b, the white balance button 427 c, and the focus button 427 d in the condition setting region 420.
  • The user can operate the gain adjustment button 427 a, the exposure time button 427 b, the white balance button 427 c, and the focus button 427 d to adjust the imaging conditions for the unit region.
  • The CPU 220 then re-images the selected unit region on the imaging condition after the adjustment to acquire re-imaged data, and stores the acquired re-imaged data into the storage unit 240 of FIG. 1. Further, the CPU 220 generates positional information corresponding to the re-imaged data, and stores the generated positional information into the storage unit 240. At this time, the CPU 220 replaces the previously stored image data of the unit region with the re-imaged data.
  • Thereafter, the CPU 220 returns the display state of the display part 260 to that of FIG. 14.
  • The user can operate the re-imaging button 424 again while selecting the low-magnification image t 2, to re-image at the first magnification the unit region of the observation object S which corresponds to the low-magnification image t 2.
  • Subsequently, the CPU 220 connects the plurality of pieces of image data and the re-imaged data stored in the storage unit 240, to generate connected image data corresponding to the observation object range.
  • The CPU 220 displays in the display part 260 a connected image of the observation object range based on the connected image data, while storing the generated connected image data into the storage unit 240, and completes the magnification observation process.
  • At this time, the CPU 220 stores into the storage unit 240 the positional information of each unit region along with the plurality of pieces of image data and the re-imaged data that constitute the connected image data. A sketch of such position-based connection is given below.
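  • The connection can be sketched as pasting each stored piece of image data onto a canvas at its stored position, assuming the positional information has been converted to pixel offsets and each piece is a NumPy array. Overlap blending and fine alignment, which a real implementation would need, are omitted here.

    import numpy as np

    def connect_image_data(tiles):
        """tiles: list of ((x, y) pixel offset, 2-D image array) pairs."""
        h = max(y + t.shape[0] for (x, y), t in tiles)
        w = max(x + t.shape[1] for (x, y), t in tiles)
        canvas = np.zeros((h, w), dtype=tiles[0][1].dtype)
        for (x, y), t in tiles:
            canvas[y:y + t.shape[0], x:x + t.shape[1]] = t  # later tiles overwrite overlap
        return canvas

    a = np.ones((4, 4), dtype=np.uint8)
    print(connect_image_data([((0, 0), a), ((3, 0), a * 2)]).shape)  # (4, 7)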
  • As described above, in the second embodiment the low-magnification image is used as the region presentation image for the user to select a unit region that needs re-imaging.
  • The low-magnification image can be displayed by means of a small amount of image data. This eliminates the need to spend a long period of time generating image data for displaying the low-magnification image. Further, it is possible to prevent the amount of image data for displaying the low-magnification image from exceeding the usable capacity of the operation memory 270 of FIG. 1, as the rough figures below illustrate.
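  • As a rough, purely illustrative calculation: twenty-five unit regions captured at 1600 x 1200 pixels in 24-bit color amount to about 25 x 1600 x 1200 x 3 bytes, roughly 144 MB of connected image data, whereas a single 800 x 600 low-magnification image of the same range occupies 800 x 600 x 3 bytes, roughly 1.4 MB. The resolutions are assumptions chosen only for this example.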
  • Moreover, the Z position of the object lens 13 is automatically set to the focal position as the imaging condition.
  • When the focal position cannot be detected, abnormal information indicating that the imaging condition has not been normally set for the unit region is stored into the storage unit 240. Based on the stored abnormal information, it is determined whether or not the imaging condition has been normally set for each unit region, and the portion of the low-magnification image corresponding to a unit region for which the imaging condition was not normally set is highlighted. The user can thus view the low-magnification image including the highlight and easily recognize such a unit region. Therefore, the user can easily re-image the unit region for which the imaging condition was not normally set.
  • FIGS. 15 and 16 are flowcharts of the magnification observation process according to the second embodiment.
  • The CPU 220 of FIG. 1 executes the magnification observation process in accordance with the magnification observation program stored in the storage unit 240 of FIG. 1.
  • The magnification of the object lens 13 is set to the second magnification in an initial state. Further, in the initial state, previously fixed imaging conditions (gain of the color CCD 11, exposure time, white balance, and the like) are set in the magnification observation device 300 by the user's operation. It is to be noted that the Z position of the object lens 13 of FIG. 1 has not been set in the initial state.
  • First, the CPU 220 sets the observation object range, and images the observation object S at the second magnification, to store the acquired image data into the storage unit 240 of FIG. 1 along with positional information corresponding thereto (step S 21).
  • Specifically, the CPU 220 sets the observation object range of the observation object S based on, for example, information inputted from the input unit 250 of FIG. 1. Further, when the observation object range is larger than the unit region, the CPU 220 sets the positions of a plurality of unit regions that need imaging within the set observation object range.
  • After step S 21, the magnification of the object lens 13 is changed from the second magnification to the first magnification by the user operating the input unit 250 of FIG. 1 or the zoom adjusting part 13 a of FIG. 1.
  • Next, the CPU 220 automatically sets the imaging condition with respect to a first unit region of the observation object S (step S 22).
  • Specifically, the CPU 220 automatically sets the Z position of the object lens 13 as the imaging condition.
  • The CPU 220 stores the above abnormal information into the storage unit 240 when it is unable to detect the focal position of the object lens 13 at the time of the automatic setting.
  • The CPU 220 images the first unit region with the Z position of the object lens 13 in the automatically set state to acquire image data corresponding to the first unit region, generates first positional information indicative of the position of the first unit region, and stores the acquired image data into the storage unit 240 along with the first positional information (step S 23).
  • Next, the CPU 220 determines whether or not all the unit regions within the observation object range set in step S 21 have been imaged (step S 24). Specifically, the CPU 220 determines whether or not a plurality of pieces of image data corresponding to all the unit regions within the observation object range have been stored into the storage unit 240.
  • When an unimaged unit region remains, the CPU 220 controls the stage driving part 22 of FIG. 1 to move the stage 21 of FIG. 1 so as to image the next unit region of the observation object S (step S 25).
  • Next, the CPU 220 automatically sets the imaging condition with respect to the next unit region to be imaged (step S 26). Further, the CPU 220 stores the above abnormal information into the storage unit 240 when it is unable to detect the focal position of the object lens 13 at the time of the automatic setting.
  • The CPU 220 images the next unit region with the Z position of the object lens 13 in the automatically set state to acquire image data corresponding to that unit region, and stores the acquired image data into the storage unit 240 along with positional information corresponding to the acquired image data (step S 27). Thereafter, the CPU 220 returns to the process of step S 24.
  • In step S 24, when all the unit regions within the observation object range have been imaged, the CPU 220 displays in the display part 260, as the region presentation image, a low-magnification image based on the image data stored into the storage unit 240 in step S 21 (step S 31). It is to be noted that the CPU 220 may perform the following process instead of storing the image data into the storage unit 240 in step S 21.
  • That is, the CPU 220 may change the magnification of the object lens 13 to the second magnification, which is lower than the first magnification, and image the observation object S at the second magnification.
  • The CPU 220 can then display in the display part 260 a low-magnification image based on the acquired image data as the region presentation image.
  • Next, the CPU 220 determines whether or not the imaging condition has been normally set for each unit region based on the stored abnormal information, and highlights the portion of the low-magnification image which corresponds to a unit region for which the imaging condition was not normally set (step S 32).
  • Instead of the highlight, the CPU 220 may display a letter, a symbol, a frame, or the like in the portion of the low-magnification image which corresponds to such a unit region.
  • In this case, the user can view the letter, the symbol, the frame, or the like, and thereby easily recognize the unit region that was not imaged on an appropriate condition.
  • Next, the CPU 220 determines whether or not a command for re-imaging has been issued (step S 33). For example, the CPU 220 determines that the command for re-imaging has been issued when the re-imaging button 424 of FIG. 14 is operated, and determines that it has not been issued when the re-imaging button 424 is not operated. When the command for re-imaging has not been issued, the CPU 220 proceeds to the process of step S 40, described later.
  • When the command for re-imaging has been issued, the CPU 220 determines whether or not any imaged unit region has been selected as a re-imaging object, while displaying the low-magnification image in the display part 260 (step S 34).
  • When a unit region has been selected, the CPU 220 controls the stage driving part 22 to move the stage 21 so as to image the selected unit region, based on the positional information stored in the storage unit 240 of FIG. 1 (step S 35).
  • The CPU 220 images the selected unit region to acquire image data, and displays an image based on the acquired image data in the display part 260 (step S 36).
  • The CPU 220 then receives the adjustment instruction for the imaging condition, and adjusts the imaging condition based on the provided adjustment instruction (step S 37).
  • Specifically, the CPU 220 adjusts a variety of imaging conditions based on the provided adjustment instruction.
  • Next, the CPU 220 images the selected unit region of the observation object S on the imaging condition after the adjustment to acquire re-imaged data corresponding to the selected unit region, and stores positional information corresponding to the re-imaged data into the storage unit 240 (step S 38). Further, the CPU 220 replaces the image data of the selected unit region with the acquired re-imaged data in the storage unit 240 (step S 39).
  • Subsequently, the CPU 220 determines whether or not a command for completing the magnification observation process has been issued (step S 40). For example, the CPU 220 determines that the command has been issued when the connection end button 425 of FIG. 14 is operated, and that it has not been issued when the connection end button 425 is not operated.
  • When the command for completion has been issued, the CPU 220 connects the plurality of pieces of image data and the re-imaged data stored in the storage unit 240 to generate connected image data, and displays a connected image based on the connected image data in the display part 260 while storing the generated connected image data into the storage unit 240 (step S 41). Thereby, the magnification observation process is completed.
  • When the command for completion has not been issued, the CPU 220 returns to the process of step S 33.
  • In the above description, the magnification observation process is performed in the magnification observation device 300 provided with the microscope 100 for observing the surface of the observation object S by use of white light.
  • However, the magnification observation process can also be applied to other magnification observation devices that enlarge the surface of the observation object S for observation.
  • Examples of such magnification observation devices include a magnification observation device provided with a microscope using optical interferometry, a magnification observation device provided with a confocal microscope, a magnification observation device provided with a scanning electron microscope, a magnification observation device provided with a scanning probe microscope, a magnification observation device provided with a fluorescence microscope, and the like.
  • As the adjustment instruction for the imaging condition for the unit region, there have been described the adjustment instruction for the gain of the color CCD 11 of FIG. 1, the adjustment instruction for the exposure time, the adjustment instruction for the white balance, and the adjustment instruction for the Z position of the object lens 13.
  • In addition, the CPU 220 of FIG. 1 may accept the following adjustment instructions as the adjustment instruction for the imaging condition for the unit region of the observation object S.
  • For example, the CPU 220 may accept an adjustment instruction for the relative positions of the observation object S and the object lens 13 in the X-direction and the Y-direction as the adjustment instruction for the imaging condition of the unit region.
  • In this case, the CPU 220 controls the stage driving part 22 of FIG. 1 based on the adjustment instruction, to move the stage 21 in the X-direction and the Y-direction.
  • Further, the CPU 220 may accept an adjustment instruction for the light amount of the illumination light source 16 of FIG. 1, or an adjustment instruction with regard to the opening of a diaphragm (not shown) or the transmittance of a filter (not shown). In this case, the CPU 220 adjusts the light amount of the illumination light source 16, the opening of the diaphragm (not shown), or the transmittance of the filter (not shown) based on the adjustment instruction.
  • The gain of the color CCD 11 can be set high, to make the level of the electric signal of the color CCD 11 high, thereby accurately identifying the image data even when the amount of reflected light is small.
  • In the following, a region with a large amount of reflected light is referred to as a high-reflection region, and a region with a small amount of reflected light as a low-reflection region.
  • When a high-reflection region and a low-reflection region are mixed in one unit region, image data acquired with a gain appropriate to the high-reflection region and image data acquired with a gain appropriate to the low-reflection region can be synthesized (this method is referred to as a wide dynamic range), and it is thereby possible to accurately observe the state of the surface of the observation object S in that region. A sketch of such synthesis is given below.
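  • A minimal sketch of such synthesis follows: per pixel, the better-exposed of a low-gain and a high-gain capture of the same unit region is kept after rescaling to a common scale. The gain ratio, the saturation threshold, and the NumPy representation are illustrative assumptions; a practical wide dynamic range synthesis would weight and blend more carefully.

    import numpy as np

    def wide_dynamic_range(img_low_gain, img_high_gain, gain_ratio, sat=250):
        """Merge two captures of one unit region taken at different gains."""
        low = img_low_gain.astype(np.float32) * gain_ratio   # rescale to high-gain units
        high = img_high_gain.astype(np.float32)
        # Where the high-gain capture saturates (high-reflection regions), trust
        # the low-gain capture; elsewhere keep the cleaner high-gain data.
        return np.where(img_high_gain >= sat, low, high)

    low = np.array([[40, 10]], dtype=np.uint8)     # low gain: nothing saturates
    high = np.array([[255, 80]], dtype=np.uint8)   # high gain: first pixel clipped
    print(wide_dynamic_range(low, high, gain_ratio=8.0))  # [[320.  80.]]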
  • The CPU 220 may accept an adjustment instruction for the number of imagings per unit region and the gain of the color CCD 11 at the time of each imaging, as the adjustment instruction for the imaging condition for the unit region.
  • In this case, the CPU 220 sets the number of imagings per unit region and the gain of the color CCD 11 at the time of each imaging. This enables accurate observation of the surface of the observation object S regardless of the state of the surface.
  • Further, the CPU 220 may accept an adjustment instruction for the movement range (upper limit position and lower limit position) in the height direction of the object lens with respect to the observation object S as the adjustment instruction for the imaging condition for the unit region.
  • In this case, the CPU 220 sets the movement range in the height direction of the object lens based on the adjustment instruction. This allows appropriate setting of the movement range in the height direction of the object lens.
  • Further, the ultra-depth image data or the height image data of the plurality of unit regions may be connected, to display mutually connected ultra-depth images or height images of the plurality of unit regions in the display part 260.
  • In the above description, the automatic setting of the Z position of the object lens 13 has been described as the automatic setting of the imaging condition performed at the start of imaging each unit region.
  • In addition, the CPU 220 may automatically set, as the automatic setting of the imaging condition, the gain of the color CCD 11 of FIG. 1, the exposure time, the white balance, the number of imagings per unit region and the gain of the color CCD 11 at the time of each imaging, the movement range in the height direction of the object lens with respect to the observation object S, and the like.
  • When an appropriate gain cannot be detected at the time of automatically setting the gain of the color CCD 11, the CPU 220 may store the above abnormal information into the storage unit 240 of FIG. 1. Further, when an appropriate shutter speed is not detected at the time of automatically setting the exposure time, the CPU 220 may store the above abnormal information into the storage unit 240 of FIG. 1. Moreover, when an appropriate correction amount of an image data value cannot be decided at the time of automatically setting the white balance, the CPU 220 may store the above abnormal information into the storage unit 240 of FIG. 1.
  • Likewise, when the number of imagings per unit region and the gains at the time of each imaging, or the movement range in the height direction of the object lens, cannot be appropriately set automatically, the CPU 220 may store the above abnormal information into the storage unit 240 of FIG. 1.
  • In any of these cases, the user can view the low-magnification image (region presentation image) including the highlight, to easily recognize whether or not each unit region has been imaged with the variety of imaging conditions in an appropriately set state.
  • In the second embodiment, the imaging condition is automatically set at the start of imaging each unit region, and a unit region for which the imaging condition was not normally set is highlighted in the region presentation image.
  • Alternatively, the CPU 220 may image a plurality of unit regions on a previously set imaging condition at the start of the magnification observation process, to thereby acquire image data corresponding to each unit region, and may determine whether or not the image of each unit region displayed in the display part 260 satisfies a previously set condition based on the acquired plurality of pieces of image data.
  • For example, the CPU 220 may determine whether or not the total brightness (the sum of brightness values) of the plurality of pixels in the image of each unit region is within a fixed range, and may highlight in the region presentation image the unit region whose image has a total brightness outside the fixed range.
  • Similarly, the CPU 220 may determine whether or not the contrast ratio in the image of each unit region is within a fixed range, and may highlight in the region presentation image the unit region whose image has a contrast ratio outside the fixed range. A sketch of both checks is given below.
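  • The sketch below illustrates both checks on tiles held as NumPy arrays: the brightness total and a simple contrast ratio are compared against fixed ranges, and the unit regions falling outside are collected for highlighting. The statistics and the limits are illustrative assumptions, not values from the patent.

    import numpy as np

    def flag_unit_regions(tiles, brightness_range, contrast_range):
        """Return the ids of tiles whose brightness total or contrast ratio
        falls outside the fixed ranges (candidates for re-imaging)."""
        flagged = []
        for tile_id, img in tiles.items():
            total = float(img.sum())
            contrast = (float(img.max()) + 1.0) / (float(img.min()) + 1.0)  # avoid /0
            if not (brightness_range[0] <= total <= brightness_range[1]) or \
               not (contrast_range[0] <= contrast <= contrast_range[1]):
                flagged.append(tile_id)
        return flagged

    tiles = {"t1": np.full((2, 2), 120), "t2": np.full((2, 2), 3)}  # t2 far too dark
    print(flag_unit_regions(tiles, (100.0, 2000.0), (1.0, 50.0)))   # ['t2']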
  • In this case, the user can view the region presentation image to easily recognize whether the image of each unit region satisfies the previously set condition. Therefore, the user can select any unit region shown in the region presentation image based on the highlight, and thereby easily re-image a unit region for which an image satisfying the previously set condition has not been obtained.
  • In this case as well, the imaging condition may be automatically set at the start of imaging each unit region.
  • Then, an indicator (highlighted image, letter, symbol, frame, or the like) indicative of a unit region that has not been imaged on an appropriate imaging condition may be displayed in the region presentation image.
  • In the above embodiments, the object lens 13 is moved in the Z-direction to change the relative position in the Z-direction of the observation object S with respect to the object lens 13, but this is not restrictive.
  • Alternatively, the stage 21 may be moved in the Z-direction to change the relative position in the Z-direction of the observation object S with respect to the object lens 13.
  • In the above embodiments, the observation object S is an example of the object
  • the magnification observation device 300 is an example of the magnification observation device
  • the imaging unit 10 is an example of the imaging part
  • the storage unit 240 is an example of the storage part
  • the CPU 220 is an example of the positional information generating part, the connecting part, the control part, the determination part, and the processing apparatus
  • the display part 260 is an example of the display part
  • the CPU 220 and the input unit 250 are examples of the accepting part.
  • The present invention is effectively applicable to magnification observation devices using a variety of microscopes.
US13/558,430 2011-08-31 2012-07-26 Magnification observation device, magnification observation method, and magnification observation program Active 2033-10-01 US9007452B2 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2011188704A JP5732353B2 (ja) 2011-08-31 2011-08-31 Magnification observation device, magnification observation method, and magnification observation program
JP2011-188704 2011-08-31

Publications (2)

Publication Number Publication Date
US20130050464A1 US20130050464A1 (en) 2013-02-28
US9007452B2 true US9007452B2 (en) 2015-04-14

Family ID=47665494

Family Applications (1)

Application Number Title Priority Date Filing Date
US13/558,430 Active 2033-10-01 US9007452B2 (en) 2011-08-31 2012-07-26 Magnification observation device, magnification observation method, and magnification observation program

Country Status (4)

Country Link
US (1) US9007452B2 (ja)
JP (1) JP5732353B2 (ja)
CN (1) CN102967929B (ja)
DE (1) DE102012215307A1 (ja)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9817223B2 (en) 2013-11-04 2017-11-14 Carl Zeiss Microscopy Gmbh Digital microscope comprising pivoting stand, method for calibration and method for automatic focus and image center tracking for such a digital microscope
US11009344B2 (en) 2018-04-20 2021-05-18 Keyence Corporation Image observing device, image observing method, image observing program, and computer-readable recording medium

Families Citing this family (29)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP5841398B2 (ja) * 2011-10-07 2016-01-13 Keyence Corp Magnifying observation device
JP2015127779A (ja) 2013-12-27 2015-07-09 Keyence Corp Microscope and magnification observation method using the same
JP6325816B2 (ja) 2013-12-27 2018-05-16 Keyence Corp Magnifying observation device, magnified image observation method, magnified image observation program, and computer-readable recording medium
JP6325815B2 (ja) 2013-12-27 2018-05-16 Keyence Corp Magnifying observation device, magnified image observation method, magnified image observation program, and computer-readable recording medium
JP6300606B2 (ja) * 2014-04-02 2018-03-28 Olympus Corp Microscope system
DE102014106233A1 (de) * 2014-05-05 2015-11-05 Carl Zeiss Microscopy Gmbh Digital microscope and method for recognizing an image template
JP6560490B2 (ja) * 2014-12-10 2019-08-14 Canon Inc Microscope system
KR101740815B1 (ko) * 2015-10-14 2017-05-26 Samsung Electro-Mechanics Co., Ltd. Imaging optical system
JP6680560B2 (ja) * 2016-02-17 2020-04-15 Olympus Corp Confocal microscope device, stitched image construction method, and program
CN106327545A (zh) * 2016-08-19 2017-01-11 Jiangsu Zhongwei Technology Software System Co., Ltd. Method for magnified drawing of vector lines
JP6797659B2 (ja) * 2016-12-14 2020-12-09 Olympus Corp Microscope device, program, and observation method
JP2018101091A (ja) 2016-12-21 2018-06-28 Olympus Corp Microscope device, program, and observation method
US10096133B1 (en) 2017-03-31 2018-10-09 Electronic Arts Inc. Blendshape compression system
DE102017114562A1 (de) * 2017-06-29 2019-01-03 Carl Zeiss Microscopy Gmbh Microscope and method for microscopic examination of a sample under a variable mechanical parameter
US10878540B1 (en) * 2017-08-15 2020-12-29 Electronic Arts Inc. Contrast ratio detection and rendering system
US10535174B1 (en) 2017-09-14 2020-01-14 Electronic Arts Inc. Particle-based inverse kinematic rendering system
JP7023667B2 (ja) 2017-10-17 2022-02-22 Keyence Corp Magnifying observation device
US10860838B1 (en) 2018-01-16 2020-12-08 Electronic Arts Inc. Universal facial expression translation and character rendering system
JP7137345B2 (ja) * 2018-04-20 2022-09-14 Keyence Corp Shape measuring device, shape measuring method, shape measuring program, computer-readable recording medium, and recorded apparatus
US10902618B2 (en) 2019-06-14 2021-01-26 Electronic Arts Inc. Universal body movement translation and character rendering system
US11972353B2 (en) 2020-01-22 2024-04-30 Electronic Arts Inc. Character controllers using motion variational autoencoders (MVAEs)
US11504625B2 (en) 2020-02-14 2022-11-22 Electronic Arts Inc. Color blindness diagnostic system
US11232621B2 (en) 2020-04-06 2022-01-25 Electronic Arts Inc. Enhanced animation generation based on conditional modeling
US11648480B2 (en) 2020-04-06 2023-05-16 Electronic Arts Inc. Enhanced pose generation based on generative modeling
CN111694476B (zh) * 2020-05-15 2022-07-08 Ping An Technology (Shenzhen) Co., Ltd. Panning browsing method and device, computer system, and readable storage medium
US11830121B1 (en) 2021-01-26 2023-11-28 Electronic Arts Inc. Neural animation layering for synthesizing martial arts movements
US11887232B2 (en) 2021-06-10 2024-01-30 Electronic Arts Inc. Enhanced system for generation of facial models and animation
US11670030B2 (en) 2021-07-01 2023-06-06 Electronic Arts Inc. Enhanced animation generation based on video with local phase
US11562523B1 (en) 2021-08-02 2023-01-24 Electronic Arts Inc. Enhanced animation generation based on motion matching using local bone phases


Family Cites Families (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH11134500A (ja) * 1997-10-24 1999-05-21 Sharp Corp Image processing device
JP3730450B2 (ja) * 1999-08-18 2006-01-05 Ricoh Co Ltd Image input device and image input method
JP2002202463A (ja) 2000-12-27 2002-07-19 Nikon Corp Image display device, microscope system provided with the image display device, and recording medium
JP2003207720A (ja) 2002-01-10 2003-07-25 Shimadzu Corp Analytical sample observation device
US7792338B2 (en) * 2004-08-16 2010-09-07 Olympus America Inc. Method and apparatus of mechanical stage positioning in virtual microscopy image capture
JP4917329B2 (ja) * 2006-03-01 2012-04-18 Hamamatsu Photonics KK Image acquisition device, image acquisition method, and image acquisition program
JP4936867B2 (ja) 2006-12-05 2012-05-23 Keyence Corp Magnified image observation device and magnified image observation method
JP4954800B2 (ja) * 2007-06-06 2012-06-20 Olympus Corp Microscope imaging system
JP5014966B2 (ja) * 2007-11-30 2012-08-29 Panasonic Electric Works SUNX Co Ltd Magnifying observation device
JP5562653B2 (ja) * 2010-01-06 2014-07-30 Olympus Corp Virtual slide creation device and virtual slide creation method
JP5555014B2 (ja) * 2010-03-10 2014-07-23 Olympus Corp Virtual slide creation device

Patent Citations (24)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5715081A (en) * 1994-09-14 1998-02-03 International Business Machines Corporation Oblique viewing microscope system
JPH09281405A (ja) 1996-04-17 1997-10-31 Olympus Optical Co Ltd Microscope system
US6714205B1 (en) * 1998-08-21 2004-03-30 Canon Kabushiki Kaisha Image data processing method and apparatus, and image processing system
US6661571B1 (en) * 1999-09-21 2003-12-09 Olympus Optical Co., Ltd. Surgical microscopic system
US7324274B2 (en) * 2003-12-24 2008-01-29 Nikon Corporation Microscope and immersion objective lens
US20090159814A1 (en) * 2006-06-08 2009-06-25 Nikon Corporation Observing apparatus
US20090237502A1 (en) * 2006-11-30 2009-09-24 Nikon Corporation Microscope apparatus
US20080297596A1 (en) 2007-06-01 2008-12-04 Keyence Corporation Magnification Observation Apparatus and Method For Creating High Tone Image File
US20080297597A1 (en) * 2007-06-01 2008-12-04 Keyence Corporation Magnification Observation Apparatus and Method For Photographing Magnified Image
JP2009175334A (ja) 2008-01-23 2009-08-06 Olympus Corp Microscope system, image generation method, and program
US20090189994A1 (en) * 2008-01-24 2009-07-30 Keyence Corporation Image Processing Apparatus
JP2010130408A (ja) 2008-11-28 2010-06-10 Keyence Corp Imaging device
US20100149362A1 (en) 2008-12-12 2010-06-17 Keyence Corporation Imaging Device
US20100149364A1 (en) 2008-12-12 2010-06-17 Keyence Corporation Imaging Device
US20100149363A1 (en) 2008-12-12 2010-06-17 Keyence Corporation Imaging Device
JP2010141699A (ja) 2008-12-12 2010-06-24 Keyence Corp Imaging device
JP2010141697A (ja) 2008-12-12 2010-06-24 Keyence Corp Imaging device
JP2010139890A (ja) 2008-12-12 2010-06-24 Keyence Corp Imaging device
JP2010141698A (ja) 2008-12-12 2010-06-24 Keyence Corp Imaging device
JP2010141700A (ja) 2008-12-12 2010-06-24 Keyence Corp Imaging device
JP2011118107A (ja) 2009-12-02 2011-06-16 Olympus Corp Microscope system
US20110134517A1 (en) * 2009-12-04 2011-06-09 Olympus Corporation Microscope controller and microscope system comprising microscope controller
US20120001070A1 (en) 2010-07-02 2012-01-05 Keyence Corporation Magnifying Observation Apparatus
US20120001069A1 (en) 2010-07-02 2012-01-05 Keyence Corporation Magnifying Observation Apparatus


Also Published As

Publication number Publication date
CN102967929A (zh) 2013-03-13
DE102012215307A1 (de) 2013-02-28
JP2013050594A (ja) 2013-03-14
CN102967929B (zh) 2017-04-12
US20130050464A1 (en) 2013-02-28
JP5732353B2 (ja) 2015-06-10


Legal Events

Date Code Title Description
AS Assignment

Owner name: KEYENCE CORPORATION, JAPAN

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:KANG, WOOBUM;REEL/FRAME:028641/0883

Effective date: 20120712

STCF Information on status: patent grant

Free format text: PATENTED CASE

MAFP Maintenance fee payment

Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1551); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

Year of fee payment: 4

MAFP Maintenance fee payment

Free format text: PAYMENT OF MAINTENANCE FEE, 8TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1552); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

Year of fee payment: 8