CN102023167A - Method and apparatus for image generation - Google Patents

Method and apparatus for image generation Download PDF

Info

Publication number
CN102023167A
CN102023167A CN2010101764974A CN201010176497A
Authority
CN
China
Prior art keywords
image
repetition rate
unique point
images
objective table
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN2010101764974A
Other languages
Chinese (zh)
Inventor
内木裕
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Olympus Corp
Original Assignee
Olympus Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Olympus Corp filed Critical Olympus Corp
Publication of CN102023167A
Pending legal-status Critical Current

Links

Images

Classifications

    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B21/00Microscopes
    • G02B21/36Microscopes arranged for photographic purposes or projection purposes or digital imaging or video purposes including associated control and data processing arrangements
    • G02B21/365Control or image processing arrangements for digital or video microscopes
    • G02B21/367Control or image processing arrangements for digital or video microscopes providing an output produced by processing a plurality of individual source images, e.g. image tiling, montage, composite images, depth sectioning, image comparison
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T11/002D [Two Dimensional] image generation
    • G06T11/003Reconstruction from projections, e.g. tomography

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Chemical & Material Sciences (AREA)
  • Analytical Chemistry (AREA)
  • Optics & Photonics (AREA)
  • Theoretical Computer Science (AREA)
  • Image Processing (AREA)
  • Microscopes, Condenser (AREA)

Abstract

The invention provides a method and an apparatus for image generation capable of obtaining a desired composite image without requiring an overall image or user adjustment of the repetition rate. In the image generation method, a first partial image of an object observed through a microscope (50) is captured at a predetermined resolution. Based on feature points extracted from the first partial image, the repetition rate between the first partial image and a second partial image to be combined with it is determined. An intermediate image is obtained by combining the first partial image with the second partial image, which is captured after moving the stage (4) of the microscope (50) in accordance with the determined repetition rate.

Description

Image generation method and image generation apparatus
Technical field
The present invention relates to an image generation method in which a subject is divided into a plurality of regions that are imaged separately, and the resulting partial images are pasted together to generate an overall image of the subject.
Background art
Microscopes and inspection apparatuses used in industrial applications, for example inspecting and repairing FPD (flat-panel display) substrates, PDP (plasma display panel) substrates, semiconductor wafers and the like, generally use image-based methods to inspect for and identify defects that would impair the function of the substrate.
In such defect inspection/identification, the pattern under inspection must be compared against a normal reference pattern in order to detect, with high accuracy, fine defects such as flaws in the pattern formation of the substrate. Consequently, it is increasingly necessary not only to cover the whole subject with a low-magnification image, but also to cover it with a high-resolution (high-magnification) image.
Depending on the size of the subject, however, a high-resolution image of the whole subject sometimes cannot be captured in a single shot. A known technique for obtaining such an image is therefore the following: divide the desired high-resolution image into a plurality of regions, image each region at high resolution, and paste the resulting partial images together to obtain the desired high-resolution image. In this technique, high-magnification partial images are first captured based on a low-magnification overall image, and the captured partial images are then pasted together. The technique is not limited to industrial applications and is used in a wide variety of fields.
As a technique for pasting partial images together, the following method has been proposed: search an overall image captured at low magnification for positions suitable for pasting, and, according to the search results, paste together partial images captured at high magnification to generate a high-resolution image (for example, Patent Document 1).
[Patent Document 1] Japanese Unexamined Patent Application Publication No. 2006-006525
When partial images are pasted using the method described in Patent Document 1, for example, the low-magnification overall image used to search for pasting positions must capture the entire subject. Depending on the size of the subject, pasting may be judged impossible when an overall image covering the whole subject cannot be captured even after switching to a lower magnification or reducing the magnification.
Moreover, because the appearance of the subject differs between low and high magnification, a position found by searching the low-magnification overall image is not necessarily suitable for pasting the high-magnification partial images. Depending on the circumstances, pasting attempted at the found position may fail, with poor results: time is lost because the search of the overall image must be redone from the beginning, or the pasting never succeeds and the desired image cannot be obtained.
Furthermore, when performing such pasting, the setting of the overlap between the partial images, that is, their "repetition rate", is extremely important. The larger the repetition rate is set, the more partial images must be captured, so the processing time becomes longer. Conversely, the smaller the repetition rate is set, the smaller the region of each partial image available for pasting becomes, and the harder the pasting becomes. Because the optimal repetition rate differs according to the object under inspection (for substrate inspection, according to the pattern), there is the problem that adjusting it visually requires a skilled user.
Summary of the invention
The present invention has been made in view of the above problems, and its object is to provide a technique that can obtain a desired composite image without requiring an overall image or user adjustment of the repetition rate.
To solve the above problems, the image generation method according to the present invention is configured as follows: a first partial image of an object observed through a microscope is captured at a predetermined resolution; based on feature points extracted from the first partial image, the repetition rate between the first partial image and a second partial image to be combined with it is determined; and a composite image is obtained by combining the first partial image with the second partial image, which is captured after moving the stage of the microscope in accordance with the determined repetition rate.
According to the present invention, a desired composite image can be obtained without requiring an overall image or user adjustment of the repetition rate.
Description of drawings
Fig. 1 is a structural diagram of the microscope system according to the embodiment.
Fig. 2 is a functional block diagram of the image processing unit according to the embodiment.
Fig. 3 is a diagram showing an example of an object to be imaged.
Fig. 4 is a diagram (part 1) explaining the method of pasting images using feature points in the images.
Fig. 5 is a diagram (part 2) explaining the method of pasting images using feature points in the images.
Fig. 6 is a diagram (part 1) explaining the method of determining the repetition rate for pasting.
Fig. 7A is a diagram (part 2) explaining the method of determining the repetition rate for pasting.
Fig. 7B is a diagram (part 3) explaining the method of determining the repetition rate for pasting.
Fig. 8 is a diagram explaining the method of moving the stage.
Fig. 9 is a diagram explaining the method of stacking images.
Fig. 10 is a flowchart showing the high-resolution image generation processing of the microscope system according to the embodiment.
Description of reference numerals
1: lens barrel; 2: objective lens; 3: camera; 4: stage; 5: image processing unit; 6: stage control unit; 7: system control unit; 8: microscope Z-axis control unit; 9: imaging control unit; 10: shading/distortion/brightness correction processing unit (correction processing unit); 11: image data storage buffer unit; 12: feature point extraction processing unit; 13: image generation processing unit; 14: movement amount determination unit; 50: microscope; 100: microscope system.
Embodiment
Embodiments of the present invention are described in detail below with reference to the drawings.
Fig. 1 is a structural diagram of the microscope system according to the present embodiment. The microscope system 100 shown in Fig. 1 comprises a microscope 50, a stage control unit 6, a microscope Z-axis control unit 8, an image processing unit 5, and a system control unit 7. The microscope 50 comprises a stage 4, a lens barrel 1, an objective lens 2, and a camera 3, and observes an object such as a substrate at a predetermined magnification.
The stage 4 has two movement mechanisms driven along the X-axis and Y-axis directions (the left-right and depth directions in Fig. 1, respectively) and moves relative to the object. The lens barrel 1 is located above the object placed on the stage 4, with the objective lens 2 mounted on its stage-4 side. The lens barrel 1 is driven up and down along the Z-axis direction (the vertical direction in Fig. 1) by a drive mechanism not shown.
The camera 3 is provided at the end of the lens barrel 1 opposite the objective lens 2. The camera 3 is a CCD (Charge Coupled Device) camera; it images the object placed on the stage 4 at a predetermined resolution and sends the resulting video or image signal to the image processing unit 5. The CCD camera outputs, for example, the gray-scale (brightness) data of each pixel for R, G, and B as image information.
The system control unit 7 controls the stage control unit 6, the microscope Z-axis control unit 8, and the image processing unit 5 as required. It also receives inputs such as the image size, the required image range (lower limit), and the pixel resolution, and performs processing such as judging whether a plurality of images need to be combined.
The stage control unit 6 controls movement of the stage 4 along its two axes, X and Y, according to commands from the system control unit 7, thereby adjusting the relative position between the objective lens 2 and the object.
The microscope Z-axis control unit 8 controls the above-mentioned vertical drive mechanism according to commands from the system control unit 7, moving the lens barrel 1 up and down (along the Z-axis) to adjust the focus with respect to the object placed on the stage 4.
The image processing unit 5 performs various kinds of image processing on the video or image signal sent from the camera 3.
The structure of the image processing unit 5 in the microscope system 100 shown in Fig. 1 is described in detail with reference to Fig. 2.
Fig. 2 is a functional block diagram of the image processing unit 5 according to the present embodiment. The image processing unit 5 shown in Fig. 2 comprises an imaging control unit 9, a shading/distortion/brightness correction processing unit (hereinafter simply "correction processing unit") 10, an image data storage buffer unit 11, a feature point extraction processing unit 12, an image generation processing unit 13, and a movement amount determination unit 14.
The imaging control unit 9 performs focus adjustment via the microscope Z-axis control unit 8 under the control of the system control unit 7, and outputs the image information captured by the camera 3 through the objective lens 2 to the correction processing unit 10.
The correction processing unit 10 corrects the image information input from the imaging control unit 9 for shading, distortion, brightness deviations of the camera system, and the like, and stores the result in the image data storage buffer unit 11.
The image data storage buffer unit 11 stores the captured images and the intermediate images generated in the course of producing the final image. An intermediate image is an image synthesized at an intermediate stage of the image generation processing.
The feature point extraction processing unit 12 reads the intermediate image, and the image to be combined with it, from the image data storage buffer unit 11 and extracts feature points from the read images. The image to be combined with the intermediate image constitutes part of the image that is finally to be obtained, and is hereinafter called a "partial image". As the feature point extraction algorithm, a known technique such as the Harris operator or the SOJKA operator is used.
The movement amount determination unit 14 reads the partial image from the image data storage buffer unit 11 and determines the movement amount of the stage 4 according to the information on the feature points obtained by the feature point extraction processing unit 12. It then outputs the determined movement amount of the stage 4 to the system control unit 7. Using the movement amount input from the movement amount determination unit 14, the system control unit 7 instructs the stage control unit 6 to control the movement of the stage 4 in the X-axis and Y-axis directions so that the stage 4 moves to the position where the next partial image is to be captured.
Based on the images to be processed (the partial image, and the region of the intermediate image that overlaps it) and the feature point information obtained by the feature point extraction processing unit 12, the image generation processing unit 13 detects the best corresponding points between the images and generates a transformation matrix. Using the generated transformation matrix, it then combines the images with each other. As the algorithm for generating the transformation matrix, a known technique such as the RANSAC (RANdom SAmple Consensus) algorithm is used, for example.
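To make the role of RANSAC here concrete, the following is a minimal sketch in Python (not code from the patent): it robustly estimates only a pure translation between matched feature point pairs, whereas the actual system would estimate a full transformation matrix. All names, tolerances, and iteration counts are illustrative assumptions.

```python
import random

def ransac_translation(matches, iters=50, tol=2.0, seed=0):
    """Toy RANSAC: hypothesize a translation from one randomly chosen
    match, count how many other matches agree within `tol` pixels,
    and keep the hypothesis with the most inliers.

    matches: list of ((xa, ya), (xb, yb)) corresponding point pairs,
             where the translation maps image-B points onto image-A points.
    """
    rng = random.Random(seed)
    best, best_inliers = None, -1
    for _ in range(iters):
        (ax, ay), (bx, by) = rng.choice(matches)   # minimal sample: 1 pair
        dx, dy = ax - bx, ay - by                  # translation hypothesis
        inliers = sum(
            1 for (pax, pay), (pbx, pby) in matches
            if abs(pbx + dx - pax) <= tol and abs(pby + dy - pay) <= tol)
        if inliers > best_inliers:
            best, best_inliers = (dx, dy), inliers
    return best
```

A single grossly wrong correspondence (an outlier) does not corrupt the estimate, which is the point of using RANSAC rather than a least-squares fit over all matches.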
When a plurality of partial images must be combined to output the required image, the image processing unit 5 shown in Fig. 2 combines the partial images using the feature points extracted by the feature point extraction processing unit 12. When capturing a partial image to be combined with the intermediate image, the unit determines, from the feature point extraction results, how much this partial image and the intermediate image should overlap (the repetition rate), moves the stage 4 in accordance with the determined repetition rate, and then captures the partial image. The final image is obtained by combining the captured partial image with the intermediate image according to the determined repetition rate.
In the following description, the processing that combines images using feature points is called "pasting" processing.
Fig. 3 is a diagram showing an example of an object. In the following, the method of generating a high-resolution image is described taking the FPD substrate with the circuit pattern shown in Fig. 3 as the object.
When generating a high-resolution image of the object shown in Fig. 3, the microscope system 100 shown in Fig. 1 judges whether pasting processing is necessary, based on the field-of-view size of the camera 3 and the range of the high-resolution image.
Specifically, with the field-of-view size denoted V and the range of the high-resolution image denoted P, pasting processing is judged unnecessary when the condition P ≤ V/2 is satisfied. Conversely, when P > V/2, pasting processing is judged necessary, and the high-resolution image generation method according to the present embodiment is carried out.
The field-of-view size V and the high-resolution image range P each consist of an X component and a Y component, and the above condition is evaluated for each component to judge whether pasting processing is needed. The field-of-view size and the image range may also be set with a margin relative to their actual values, in consideration of positioning errors and the like.
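The per-component judgment above can be sketched as a small helper; this is an illustrative assumption, not code from the patent:

```python
def needs_pasting(image_range, field_of_view):
    """Decide whether tile pasting is required.

    Pasting is needed when, on either axis, the requested
    high-resolution image range P exceeds half the camera
    field of view V (P > V/2).
    image_range, field_of_view: (x, y) tuples in the same units.
    """
    px, py = image_range
    vx, vy = field_of_view
    return px > vx / 2 or py > vy / 2
```

For example, a 600 × 400 requested range with a 1000 × 1000 field of view needs pasting (600 > 500 on X), while 400 × 400 does not.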
The details of the high-resolution image generation processing performed when pasting is judged necessary are described below.
Fig. 4 and Fig. 5 are diagrams explaining the method of pasting images using feature points in the images. The two regions RA and RB of the substrate shown in Fig. 4 each represent the field of view of the microscope, and the images captured for regions RA and RB are the partial images A and B shown in Fig. 5. The following description assumes that partial image A has already been processed and forms part of the intermediate image, and that partial image B is to be pasted onto partial image A. The repetition rate between partial image A and partial image B is assumed here to be 50%.
As shown in Fig. 5, when the repetition rate is 50%, the overlapping region of the two partial images A and B consists of the upper half of partial image A and the lower half of partial image B. Four feature points P1a–P4a are extracted from partial image A, and four feature points P1b–P4b from partial image B. The image generation processing unit 13 searches for correspondences among the feature points extracted by the feature point extraction processing unit 12 and obtains a transformation matrix satisfying the relations P1a=P1b, P2a=P2b, P3a=P3b, and P4a=P4b. Using the obtained transformation matrix, partial image B is pasted onto partial image A (the intermediate image containing partial image A), yielding the intermediate image M.
Figs. 4 and 5 illustrate the case where the repetition rate is 50%, but the microscope system 100 according to the present embodiment determines the repetition rate used when pasting partial images A and B according to the feature point extraction results of the feature point extraction processing unit 12. The method of determining the repetition rate is described next.
Fig. 6, Fig. 7A and Fig. 7B are diagrams explaining the method of determining the repetition rate for pasting. The following example is used: a repetition rate is determined for partial image A in Fig. 6, the field of view of the camera 3 is moved in the direction of the arrow in Fig. 6 according to the determined rate, and the partial image captured at the new position is pasted onto partial image A.
As shown in Fig. 7A, when the repetition rate is set to 20%, feature points exist only in the region RO excluded from feature point extraction (the region from 20% to 100% of the distance from the top of the image, hatched in Fig. 7A), so no feature points are detected in the region within 20% of the top. In this case the repetition rate is judged not to have been set appropriately, so it is changed to a larger value and feature point extraction is performed again.
As shown in Fig. 7B, when the repetition rate is set to 50%, four feature points P can be detected within the feature extraction target region. In this case, a repetition rate of 50% is judged sufficient for pasting. The number of feature points regarded as sufficient for the pasting processing is preferably configurable by the user.
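The adjustment described for Figs. 7A and 7B — enlarge the repetition rate until enough feature points fall inside the extraction region — might be sketched as the following loop. The initial rate, step size, and point threshold are assumptions chosen for illustration:

```python
def determine_repetition_rate(feature_ys, image_height,
                              initial_rate=0.2, min_points=4, step=0.1):
    """Raise the repetition rate until at least `min_points` feature
    points fall inside the extraction strip at the top of the image.

    feature_ys: y-coordinates of extracted feature points (0 = top edge).
    Returns the chosen rate, or None if even full overlap fails.
    """
    rate = initial_rate
    while rate <= 1.0:
        strip = rate * image_height          # extraction region height
        hits = sum(1 for y in feature_ys if y < strip)
        if hits >= min_points:
            return rate
        rate = round(rate + step, 2)         # enlarge and re-extract
    return None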
As described above, the repetition rate is determined from the results of extracting feature points from the partial image, and the movement amount of the stage 4 is determined from the repetition rate. The stage 4 is moved by the determined amount, moving the field of view of the camera 3; a partial image is then captured at the new position and pasted onto the intermediate image. Fig. 8 is a diagram explaining how the stage 4 is moved, illustrating the case where a high-resolution image is required over the range R of the substrate pattern of the object.
In the embodiment, the stage 4 is moved in the order shown in Fig. 8, and partial images are captured in the order P1, P2, ..., P6. With the horizontal direction in the figure taken as the X-axis (row) direction and the vertical direction as the Y-axis (column) direction, moving the stage 4 alternately along the row and column directions as shown in Fig. 8 simplifies the generation of the transformation matrices used to paste each newly captured partial image onto the intermediate image, and thus keeps the series of pasting operations from becoming complicated.
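A serpentine (boustrophedon) visiting order like the P1–P6 sequence of Fig. 8 can be generated as follows; this is a hypothetical sketch, since the patent describes the order only pictorially:

```python
def serpentine_order(rows, cols):
    """Tile visiting order: traverse one row left-to-right, the next
    right-to-left, and so on, so that each move changes only one of
    the row/column coordinates at a time (as in Fig. 8)."""
    order = []
    for r in range(rows):
        cs = range(cols) if r % 2 == 0 else range(cols - 1, -1, -1)
        order.extend((r, c) for c in cs)
    return order
```

For a 2 × 3 grid this yields the P1…P6 pattern: (0,0), (0,1), (0,2), (1,2), (1,1), (1,0).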
In the method described above, feature points are extracted from each of the two images and the images are pasted together based on those feature points. Depending on the substrate pattern, however, it may happen that the number of feature points required for pasting cannot be extracted from a partial image. To allow high-resolution image generation to proceed even in such cases, the following pattern matching may be employed.
Fig. 9 is a diagram explaining the method of stacking images. As described above, the method of obtaining a composite image using feature points is defined as image "pasting"; in contrast, the method of obtaining a composite image using pattern matching, explained below, is defined as image "stacking".
For the stacking processing, the image generation processing unit 13 uses a pattern matching algorithm such as shape search. For example, when feature points cannot be extracted, as with partial images A and B shown in Fig. 9, the pattern matching algorithm searches partial images A and B for edges. As in the pasting processing, the search region is determined from the repetition rate used in the preceding image synthesis, or from its initial value.
In the example shown in Fig. 9, searching in the horizontal direction of the images extracts edges E1a and E2a from partial image A and edges E1b and E2b from partial image B. In the horizontal direction, the images are stacked using these extracted edges.
In the vertical direction of the images, on the other hand, no edges are extracted. Even in this case, however, partial image B is an image obtained by moving the stage 4 vertically from partial image A by the amount corresponding to the repetition rate. Exploiting this, the movement amount of the stage 4 is taken as the vertical movement amount of the image, and partial images A and B are stacked in the vertical direction. The intermediate image M shown in Fig. 9 is thus obtained. The portion where edges E1a and E1b are stacked corresponds to E1, and the portion where edges E2a and E2b are stacked corresponds to E2.
Fig. 9 shows the case where horizontal edges can be extracted, but other cases are also conceivable: only vertical edges are extracted, or no edges can be extracted in either direction. In such cases, for any direction in which the expected edges — that is, the positions at which a match should be obtained — cannot be found, stacking is still possible by regarding the movement amount of the stage 4 as the movement amount of the image, which determines where the images should overlap.
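Regarding a known stage movement as an image displacement, as described above, amounts to a unit conversion through the calibrated pixel size. A sketch under the assumption of a fixed micrometers-per-pixel calibration (the function name and units are illustrative):

```python
def stage_to_pixel_offset(stage_dx_um, stage_dy_um, um_per_pixel):
    """Convert a stage movement (micrometers on X and Y) into the
    corresponding image displacement in pixels, used when no edges or
    feature points can be matched on an axis."""
    return (round(stage_dx_um / um_per_pixel),
            round(stage_dy_um / um_per_pixel))
```

For instance, a 100 µm × 50 µm stage move at 0.5 µm/pixel corresponds to a (200, 100) pixel offset between the two partial images.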
By performing image "stacking" using the method shown in Fig. 9, execution of the high-resolution image generation processing according to the present embodiment can continue even when feature points suitable for pasting cannot be extracted from the images.
Figure 10 is a flowchart showing the high-resolution image generation processing of the microscope system 100 according to the embodiment. The processing shown in Fig. 10 starts when a command to start high-resolution image generation is input via an input device not shown in Fig. 1.
First, in step S1, the system control unit 7 sets the range of the required high-resolution image according to the parameter values input by the user via the above-mentioned input device. Fig. 10 shows the subsequent processing for the case where pasting processing is judged necessary based on the field-of-view size of the camera 3 and the high-resolution image range set in step S1.
In step S2, on command from the system control unit 7, the imaging control unit 9 of the image processing unit 5 switches the objective lens 2 to the specified magnification, the microscope Z-axis control unit 8 moves the lens barrel 1 up or down to focus, and the imaging control unit 9 captures a partial image with the camera 3. Fig. 10 assumes that the high-resolution image generation starts with the stage 4 already moved to the imaging start position; however, the processing may also be configured so that the stage 4 is at an arbitrary position when generation starts. In that case, in step S2, the stage control unit 6 moves the stage 4 to the imaging start position determined from the range of the high-resolution image.
The correction processing unit 10 corrects the input image for shading, distortion, brightness, and the like, and stores the corrected image data in the image data storage buffer unit 11 as one partial image. When the captured partial image has been stored, the system control unit 7 issues a command, with the pasting direction as a parameter, to start pasting the stored partial image.
In step S3, upon receiving the command from the system control unit 7, the feature point extraction processing unit 12 obtains the partial image stored in the image data storage buffer unit 11 in step S2 and extracts feature points from a region of a predetermined range in the partial image. The initial value of this extraction range is determined from a preset reference repetition rate.
In step S4, it is judged whether the number of extracted feature points is equal to or greater than a predetermined number. If it is, the processing proceeds to step S5 and the image "pasting" processing starts.
In step S5, the repetition rate for pasting is determined; the determination method is as described above with reference to Fig. 6, Fig. 7A and Fig. 7B. In step S6, the stage control unit 6 moves the stage 4 to the predetermined position according to the repetition rate determined in step S5 and the pasting direction input to the movement amount determination unit 14 in step S2.
In step S7, the imaging control unit 9 captures a partial image at the new position of the stage 4 and stores it in the image data storage buffer unit 11. In step S8, the image generation processing unit 13 reads the partial image captured in step S7 and the intermediate image from the image data storage buffer unit 11, pastes them together, and proceeds to step S9. The pasting method of step S8 is as described with reference to Fig. 4 and Fig. 5. In the initial iteration, the image pasted with the partial image captured in step S7 is not an intermediate image but the partial image captured in step S2.
On the other hand, if the number of feature points is judged in step S4 to be less than the predetermined number, the processing proceeds to step S10 and the image "stacking" processing starts.
In step S10, the stage control unit 6 moves the stage 4 by a predetermined amount, moving the field of view by an amount calculated using a repetition rate prepared in advance. In step S11, the imaging control unit 9 captures a partial image at the new position of the stage 4 and stores it in the image data storage buffer unit 11. In step S12, the image generation processing unit 13 reads the partial image captured in step S11 and the intermediate image from the image data storage buffer unit 11, stacks them, and then proceeds to step S9. The stacking method is as described with reference to Fig. 9. In the initial iteration, the image stacked with the partial image captured in step S11 is not an intermediate image but the partial image captured in step S2.
In step S9, it is judged whether the range of the intermediate image generated in step S8 or step S12 satisfies the high-resolution image range set in step S1. If it does not, the processing returns to step S3 and the same processing is repeated. When the range of the generated intermediate image is judged in step S9 to satisfy the range set in step S1, the intermediate image is output as the final high-resolution image, and the processing ends.
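The overall control flow of Fig. 10 — capture a tile, extract feature points, then branch between "pasting" and "stacking" — can be summarized as a simplified loop. All callbacks are hypothetical stand-ins for the system components (camera, feature extractor, and the two synthesis routines), and a fixed tile count stands in for the step-S9 range check:

```python
def build_mosaic(tiles_needed, extract_features, capture, paste, overlay,
                 min_points=4):
    """Simplified loop of Fig. 10: capture partial images one by one
    and merge each into the growing intermediate image, choosing
    feature-based 'pasting' when enough feature points are found
    (steps S5-S8), otherwise pattern-match 'stacking' (steps S10-S12)."""
    mosaic = capture()                       # first partial image (step S2)
    for _ in range(tiles_needed - 1):        # until range is covered (S9)
        pts = extract_features(mosaic)       # step S3/S4
        tile = capture()                     # steps S6-S7 or S10-S11
        if len(pts) >= min_points:
            mosaic = paste(mosaic, tile)     # 'pasting' branch
        else:
            mosaic = overlay(mosaic, tile)   # 'stacking' branch
    return mosaic
```

With string-valued stub callbacks it is easy to see which branch each tile took, e.g. "A+B+C" when every merge used the pasting path.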
As described above, the high-resolution image generation method according to the present embodiment combines partial images using their feature points, and therefore does not require a low-magnification overall image. At the same time, the repetition rate is determined using the number of extracted feature points, so no user adjustment of the repetition rate is needed.
The above description has taken as its example the case of obtaining a high-resolution image of an object, but the invention is not limited to this. Any method that generates one image by combining a plurality of images using feature points is included in the present invention.

Claims (9)

1. An image generation method, characterized by comprising the following processing:
capturing a first partial image of an object observed through a microscope at a predetermined resolution;
determining, according to feature points extracted from said first partial image, the repetition rate between the first partial image and a second partial image to be combined with it; and
obtaining a composite image formed by combining said first partial image with said second partial image, the second partial image being captured after moving the stage of said microscope in accordance with the determined repetition rate.
2. The image generating method according to claim 1, characterized in that
the feature points are extracted in accordance with a preset reference overlap rate.
3. The image generating method according to claim 2, characterized in that,
when no feature points are extracted in accordance with the reference overlap rate, the reference overlap rate is changed and the feature point extraction is performed again.
4. The image generating method according to claim 3, characterized in that,
when the re-extraction is performed, the reference overlap rate is changed to a larger value.
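Outside the claim language, the retry behaviour of claims 2 through 4 can be sketched as follows. This is an illustrative sketch only: `extract_features`, the step size, and the rate bounds are hypothetical stand-ins, not values from the patent.

```python
# Illustrative sketch of claims 2-4 (hypothetical parameters): extract
# feature points under a reference overlap rate; if none are found,
# enlarge the rate (claim 4) and try the extraction again.

def extract_with_retry(extract_features, image, rate=0.2, step=0.1, max_rate=0.8):
    while rate <= max_rate:
        features = extract_features(image, rate)
        if features:
            return features, rate
        rate += step   # claim 4: change the reference rate to a larger value
    return [], rate    # no features found at any tried rate
```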
5. The image generating method according to claim 1, characterized in that
it is judged whether the number of feature points extracted from the first partial image is equal to or greater than a predetermined number, and when the number is judged to be less than the predetermined number, a prepared overlap rate is determined as the overlap rate between the first partial image and the second partial image.
6. The image generating method according to claim 5, characterized in that
edges of the first partial image and the second partial image are searched for in accordance with the prepared overlap rate, and the composite image is obtained using the extracted edges.
7. The image generating method according to claim 1, characterized in that
it is judged whether the number of feature points extracted from the first partial image is equal to or greater than a predetermined number, and when the number is judged to be less than the predetermined number, the movement amount of the stage of the microscope is regarded as the relative movement amount between the first partial image and the second partial image in obtaining the composite image.
8. The image generating method according to claim 1, characterized in that
feature points are extracted for the second partial image, and it is judged whether the second partial image contains a predetermined number or more of the feature points; depending on the number of feature points contained in the second partial image, either an overlap rate determined from the feature points of the second partial image or a prepared overlap rate is used, and the composite image obtained by combining the first partial image and the second partial image is combined with a third partial image obtained by moving the stage of the microscope in accordance with that overlap rate, thereby regenerating a composite image, and
the above processing is repeated until an overall image of the imaging object is obtained.
9. A microscope system having: a microscope that observes an imaging object at a predetermined magnification; and a stage that moves relative to the imaging object, the microscope system being characterized by having:
an imaging unit that acquires a first partial image obtained by capturing an observation image of the microscope at a predetermined resolution;
a movement amount determination unit that determines, from feature points extracted from the first partial image, an overlap rate between the first partial image and a second partial image to be combined with it, and determines a movement amount of the stage in accordance with the overlap rate; and
an image generation unit that obtains a composite image in which the first partial image is combined with the second partial image, the second partial image being obtained by moving the stage in accordance with the determined movement amount.
CN2010101764974A 2009-09-16 2010-05-10 Method and appratus for image generation Pending CN102023167A (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2009214813A JP2011065371A (en) 2009-09-16 2009-09-16 Image generation method and image generating device
JP2009-214813 2009-09-16

Publications (1)

Publication Number Publication Date
CN102023167A true CN102023167A (en) 2011-04-20

Family

ID=43864723

Family Applications (1)

Application Number Title Priority Date Filing Date
CN2010101764974A Pending CN102023167A (en) 2009-09-16 2010-05-10 Method and appratus for image generation

Country Status (4)

Country Link
JP (1) JP2011065371A (en)
KR (1) KR20110030275A (en)
CN (1) CN102023167A (en)
TW (1) TW201123029A (en)

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105472232A (en) * 2014-09-05 2016-04-06 宏达国际电子股份有限公司 Image acquisition method and electronic device
US9986155B2 (en) 2014-09-05 2018-05-29 Htc Corporation Image capturing method, panorama image generating method and electronic apparatus
CN109752835A (en) * 2019-03-25 2019-05-14 南京泰立瑞信息科技有限公司 A kind of X of microscope local field of view, Y-axis position control method and system
CN109983531A (en) * 2016-09-23 2019-07-05 堺显示器制品株式会社 Generation system, generation method and computer program
CN113344835A (en) * 2021-06-11 2021-09-03 北京房江湖科技有限公司 Image splicing method and device, computer readable storage medium and electronic equipment

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10955655B2 (en) 2012-07-04 2021-03-23 Sony Corporation Stitching images based on presence of foreign matter
TWI599793B (en) 2015-11-23 2017-09-21 財團法人金屬工業研究發展中心 Image scanning system for tissue slides
JP6120037B1 (en) * 2016-11-30 2017-04-26 国際航業株式会社 Inspection device and inspection method


Also Published As

Publication number Publication date
TW201123029A (en) 2011-07-01
JP2011065371A (en) 2011-03-31
KR20110030275A (en) 2011-03-23

Similar Documents

Publication Publication Date Title
CN102023167A (en) Method and appratus for image generation
US7801352B2 (en) Image acquiring apparatus, image acquiring method, and image acquiring program
CN109541791A (en) High-resolution light field micro imaging system and method based on sub-pix translation
EP2402811A2 (en) Information processing apparatus, stage-undulation correcting method, program therefor
JP5555014B2 (en) Virtual slide creation device
JP4970869B2 (en) Observation apparatus and observation method
US8126250B2 (en) Image acquiring apparatus, image acquiring method, and image acquiring program
TWI285045B (en) Method and apparatus for flat patterned media inspection
US7630628B2 (en) Microscope system and microscope observation method
EP3035104B1 (en) Microscope system and setting value calculation method
JP2011081211A (en) Microscope system
US20130176617A1 (en) Microscope system and autofocus method
JP5562653B2 (en) Virtual slide creation apparatus and virtual slide creation method
KR20080003718A (en) Semiconductor substrate defects correction device
JP4477863B2 (en) Cell measurement support system and cell observation apparatus
WO2005026802A1 (en) Autofocus control method, autofocus controller, and image processor
JP2006284965A (en) Microscope device and enlarged image generating method
JP5730696B2 (en) Image processing apparatus and image display system
US8275217B2 (en) Apparatus and method for combining several sub-images for any imaging surface areas
CN105651699B (en) It is a kind of based on the dynamic of area array cameras with burnt method
JP2009223164A (en) Microscopic image photographing apparatus
CN103297673A (en) Display control apparatus, display control method, and recording medium
JP4765340B2 (en) Microscope system
JP2012083621A (en) Scan laser microscope
JP5610579B2 (en) 3D dimension measuring device

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C02 Deemed withdrawal of patent application after publication (patent law 2001)
WD01 Invention patent application deemed withdrawn after publication

Application publication date: 20110420