WO2018174011A1 - Position detection device and position detection method - Google Patents

Position detection device and position detection method

Info

Publication number
WO2018174011A1
WO2018174011A1 (PCT/JP2018/010826)
Authority
WO
WIPO (PCT)
Application number
PCT/JP2018/010826
Other languages
French (fr)
Japanese (ja)
Inventor
裕司 岡本
Original Assignee
住友重機械工業株式会社
Application filed by 住友重機械工業株式会社
Priority to JP2019507664A (patent JP6862068B2)
Priority to CN201880005782.4A (patent CN110446905A)
Priority to KR1020197019474A (patent KR20190131475A)
Publication of WO2018174011A1

Classifications

    • H01L 22/30: Structural arrangements specially adapted for testing or measuring during manufacture or treatment, or specially adapted for reliability measurements
    • H01L 21/67259: Position monitoring, e.g. misposition detection or presence detection
    • H01L 21/682: Mask-wafer alignment
    • H01L 22/26: Acting in response to an ongoing measurement without interruption of processing, e.g. endpoint detection, in-situ thickness measurement

Definitions

  • the present invention relates to a position detection device and a position detection method.
  • Patent Document 1 discloses a screen printing technique for positioning a substrate based on image data obtained by imaging the vicinity of a corner of the substrate.
  • An object of the present invention is to provide a position detection device capable of performing more stable position detection based on image data of corners of a substrate.
  • According to one aspect, a position detection device is provided comprising: a support part for supporting a substrate having a corner part; an imaging device for imaging a portion of the substrate within its angle of view; and a processing unit. The processing unit moves the substrate or the imaging device so that the length of each of the two sides sandwiching the corner portion of the substrate included in the angle of view of the imaging device becomes equal to or longer than a specified value, and detects the position of the corner portion of the substrate based on image data acquired by imaging the angle of view of the imaging device after the movement.
  • According to another aspect, a position detection device is provided comprising: a support part for supporting a substrate having a corner part; an imaging device for imaging a portion of the substrate; a processing unit; and a display unit for displaying an image. The processing unit displays an image of the corner of the substrate within the angle of view of the imaging device on the display unit, moves the substrate or the imaging device so that the length of each of the two sides sandwiching the corner portion included in the angle of view becomes equal to or longer than a specified value, displays the image of the corner within the angle of view after the movement on the display unit, obtains the coordinates of the position of the corner of the substrate image within the angle of view, and displays information specifying the obtained coordinates on the display unit.
  • According to still another aspect, a position detection method is provided that includes a step of moving the substrate or the region to be imaged so that the length of the sides sandwiching the first corner included in the region to be imaged becomes equal to or greater than the specified value and imaging the substrate to obtain second image data, and a step of determining the position of the first corner of the substrate based on the second image data.
  • Stable position detection can be performed by detecting the position of the corner in a state where the length of each of the two sides sandwiching the corner of the substrate included in the angle of view of the imaging device is equal to or greater than a specified value.
  • FIG. 1 is a schematic perspective view of a position detection apparatus according to an embodiment.
  • FIG. 2 is a flowchart of the position detection process executed by the processing unit of the position detection apparatus according to the embodiment.
  • FIG. 3A is a plan view of a substrate that is an object of position detection.
  • FIG. 3B is a diagram showing, as images, templates used for the pattern matching executed during the position detection process.
  • FIG. 3C is a diagram showing, as an image, the image data handled by the processing unit during the position detection process.
  • FIGS. 4A and 4B are diagrams showing, as images, the image data handled by the processing unit during the position detection process.
  • FIG. 5A is a diagram showing, as an image, the image data of the corner when the shape and size of the cut-off portion of the corner of the substrate conform to the standard.
  • FIG. 5B is a diagram showing, as an image, the image data of the corner when the size of the cut-off portion of the corner of the substrate is smaller than the standard.
  • FIG. 5C is a diagram showing, as an image, the image data of the corner when the shape of the cut-off portion of the corner of the substrate deviates from the standard.
  • FIG. 5D is a diagram showing, as an image, the image data of the corner when the posture of the substrate in the in-plane rotation direction is slightly inclined from the target posture.
  • FIG. 6 is a flowchart of the position detection process executed by the processing unit of the position detection apparatus according to another embodiment.
  • FIG. 7A is a plan view of a substrate whose position is to be detected in still another embodiment
  • FIG. 7B is a diagram showing, as an image, a template used in the pattern matching executed in steps S03 and S07.
  • FIG. 8 is a flowchart of a position detection process executed by a position detection apparatus according to another embodiment.
  • FIG. 9A is a plan view showing the positional relationship between the angle of view and the substrate when step S11 of FIG. 8 is executed.
  • FIG. 9B is a plan view showing the positional relationship between the angle of view and the substrate after step S13 is executed.
  • FIG. 10 is a flowchart of position detection processing executed by a position detection apparatus according to another embodiment.
  • FIG. 11A is a plan view of a substrate whose position is to be detected using a position detection apparatus according to another embodiment
  • FIG. 11B is a diagram showing, as an image, a template used for pattern matching at the time of position detection.
  • FIGS. 12A and 12B are schematic perspective views of a position detection device according to still another embodiment.
  • FIG. 13 is a schematic front view of a film forming apparatus according to still another embodiment.
  • FIG. 1 is a schematic perspective view of a position detection apparatus according to an embodiment.
  • the position detection apparatus according to the present embodiment includes a support unit 10, imaging devices 11A and 11B, a processing unit 12, a storage unit 13, and a display unit 14.
  • the support unit 10 supports the substrate 50 to be processed on its upper surface (support surface).
  • As the support unit 10, a movable stage that can move the substrate 50 in two in-plane directions and rotate it within the plane is used.
  • The substrate 50 has a planar shape with a plurality of corners, such as a square or a rectangle.
  • the imaging device 11A captures an area in the angle of view (field of view) 15A on the support surface of the support unit 10 and acquires image data.
  • the imaging device 11B captures an area in the angle of view (field of view) 15B on the support surface of the support unit 10 and acquires image data.
  • the processing unit 12 executes the processing program stored in the storage unit 13, thereby realizing a position detection function.
  • The storage unit 13 stores various data referred to when the position detection process is executed, such as image data acquired by the imaging devices 11A and 11B and coordinate data that is a processing result of the processing unit 12.
  • the processing unit 12 outputs the processing result to the display unit 14.
  • As the processing unit 12, for example, a central processing unit (CPU) is used.
  • As the storage unit 13, for example, a RAM, a ROM, an external storage device, or the like is used.
  • As the display unit 14, for example, a liquid crystal display, an organic EL display, or the like is used.
  • FIG. 2 is a flowchart of the position detection process executed by the processing unit 12 (FIG. 1).
  • In the position detection process described below, the position of one corner 51 of the substrate 50 is detected.
  • the positions of the other corners 51 can also be detected by the same process.
  • FIG. 3A is a plan view of a substrate 50 that is a target for position detection.
  • the substrate 50 has a substantially rectangular or square planar shape, and four corners 51 are cut off obliquely.
  • the planar shape of the part cut off obliquely is a right isosceles triangle having a base angle of 45 degrees.
  • FIG. 3B is a diagram showing, as images, the templates 20A, 20B, and 20C used for the pattern matching executed during the position detection process.
  • The template 20A corresponds to an image of the oblique side of the corner 51.
  • The template 20B corresponds to an image of the intersection between the oblique side of the corner 51 and the side extending in the horizontal direction.
  • The template 20C corresponds to an image of the intersection between the oblique side of the corner 51 and the side extending in the vertical direction.
  • the templates 20A, 20B, and 20C are created based on the external shape standard of the substrate 50.
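For illustration, a standard-shape template such as 20B or 20C can be rendered as a binary image of a 45-degree chamfered corner. This is a minimal sketch; the orientation convention (substrate filling the lower right of the template) is an assumption for illustration, not taken from the patent:

```python
import numpy as np

def make_corner_template(size: int, chamfer: int) -> np.ndarray:
    """Binary image of a corner chamfered at 45 degrees: substrate pixels
    are 1, background pixels are 0. The substrate is assumed (purely for
    illustration) to fill the lower-right part of the template."""
    t = np.ones((size, size), dtype=np.uint8)
    for r in range(size):
        for c in range(size):
            # Pixels above the 45-degree chamfer line belong to the background.
            if r + c < chamfer:
                t[r, c] = 0
    return t
```

Under the same convention, rotations or flips of this array would give templates for the other corner features.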
  • FIGS. 3C, 4A, and 4B are diagrams showing, as images, the image data handled by the processing unit 12 during the position detection process.
  • First, the substrate 50 is supported on the support surface of the support unit 10 (step S01). This process is performed, for example, by the processing unit 12 controlling a transfer device such as a robot arm.
  • At this time, the substrate 50 has been roughly positioned so that two adjacent corner portions 51 (FIG. 1) of the substrate 50, or their vicinities, fall within the two angles of view 15A and 15B (FIG. 1), respectively.
  • the processing unit 12 draws the corner 51 of the substrate 50 into the angle of view 15A (step S02).
  • The process of drawing the corner 51 of the substrate 50 into the angle of view 15A in step S02 will now be described.
  • First, the estimated position of the corner 51 is imaged to obtain the image data 22 (FIG. 3C).
  • the processing unit 12 displays an image corresponding to the acquired image data 22 on the display unit 14.
  • the processing unit 12 determines whether or not the image data 22 includes part of two sides sandwiching the corner portion 51 of the substrate 50.
  • Hereinafter, the state in which the image data includes a part of the two sides sandwiching the corner 51 is simply referred to as "the corner 51 is included".
  • If the corner 51 is not included, the processing unit 12 moves the support unit 10 so that the corner 51 falls within the angle of view 15A.
  • the processing unit 12 displays an image corresponding to the image data acquired during the movement of the substrate 50 on the display unit 14.
  • Next, the processing unit 12 performs pattern matching between the image data 22 and the template 20B (FIG. 3B).
  • Similarly, pattern matching between the image data 22 and the template 20C (FIG. 3B) is performed.
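Pattern matching of this kind is commonly implemented with normalized cross-correlation. The following is a minimal pure-NumPy sketch; the function name and the threshold value are illustrative, and a production system would use an optimized library routine:

```python
import numpy as np

def match_template(image: np.ndarray, template: np.ndarray, threshold: float = 0.8):
    """Return the (row, col) of the best normalized-cross-correlation match,
    or None if no location scores at or above `threshold`."""
    ih, iw = image.shape
    th, tw = template.shape
    t = template - template.mean()
    t_norm = np.sqrt((t * t).sum())
    best_score, best_pos = -1.0, None
    for r in range(ih - th + 1):
        for c in range(iw - tw + 1):
            w = image[r:r + th, c:c + tw]
            w = w - w.mean()
            w_norm = np.sqrt((w * w).sum())
            if w_norm == 0 or t_norm == 0:
                continue  # flat window or flat template: no meaningful score
            score = (w * t).sum() / (w_norm * t_norm)
            if score > best_score:
                best_score, best_pos = score, (r, c)
    return best_pos if best_score >= threshold else None
```

Lowering `threshold`, as the text below describes, allows a partially visible pattern to still be detected.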
  • When the image data 22 includes both a portion matching the template 20B and a portion matching the template 20C, the corner 51 is within the angle of view 15A.
  • When the image data 22 does not include a portion matching the template 20B but includes a portion matching the template 20C, only part of the corner 51 is within the angle of view 15A.
  • When the corner 51 does not fall within the angle of view 15A, the processing unit 12 performs pattern matching between the image data 22 and the template 20A.
  • If a matching portion is found, the processing unit 12 estimates the direction and distance by which the substrate 50 should be moved from the position of the matching portion. For example, when a portion matching the template 20A is found in the upper-right region of the angle of view 15A, it can be seen that the substrate 50 should be moved obliquely toward the lower left.
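The estimation of the moving direction and distance can be sketched as a pixel offset scaled to stage units. This is a minimal sketch; the sign convention (image rows increasing downward) and the pixel size are illustrative assumptions, not values from the patent:

```python
def estimate_move(match_pos, target_pos, pixel_size_mm=0.01):
    """Estimate the (dx, dy) stage move, in mm, that brings the feature
    found at `match_pos` (pixel row, col) to `target_pos`.
    Assumes moving the substrate by (dx, dy) shifts its image by the same
    amount; the sign convention is an assumption, not from the patent."""
    dr = target_pos[0] - match_pos[0]  # row offset (downward positive)
    dc = target_pos[1] - match_pos[1]  # column offset (rightward positive)
    return (dc * pixel_size_mm, dr * pixel_size_mm)
```

For a match found in the upper right of a 480 x 640 view, the computed move is negative in x and positive in row direction, i.e. obliquely toward the lower left in image coordinates.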
  • During pattern matching, the processing unit 12 displays, on the display unit 14, information indicating the portions of the displayed image matched with the templates 20A, 20B, and 20C. For example, a frame surrounding each matched portion is displayed over the image.
  • After the substrate 50 is moved, the image data 22 is acquired again.
  • When the corner 51 is included in the image data 22, the processing unit 12 completes the process of drawing the corner 51 into the angle of view 15A.
  • Next, the processing unit 12 compares the length of the portion of each of the two sides sandwiching the corner 51 within the angle of view 15A (the length of the side in the angle of view) with a specified value (step S03).
  • Here, the two sides sandwiching the corner portion 51 mean the sides extending from the two ends of the oblique side, formed by cutting off the corner portion 51, toward the adjacent corner portions.
  • This specified value is stored in the storage unit 13 in advance.
  • the length of the side in the angle of view 15A can be obtained from the position of the portion that matches the templates 20B and 20C, for example.
  • the processing unit 12 displays information indicating a portion matching the templates 20B and 20C on the display unit 14.
  • FIG. 4A is a diagram showing, as an image, the image data 22 when the length of a side in the angle of view 15A is shorter than the specified value.
  • The image data 22 in FIG. 4A includes the entire pattern corresponding to the template 20C but only a part of the pattern corresponding to the template 20B. Even in such a case, a portion matching the template 20B can be detected by lowering the matching determination threshold.
  • The length Lc of the side from the vertex of the portion matched with the template 20C to the adjacent corner is equal to or greater than the specified value, but the length Lb of the side from the vertex of the portion matched with the template 20B to the next corner is less than the specified value.
  • If it is determined in step S03 that the length of at least one side in the angle of view 15A is shorter than the specified value, the substrate 50 is moved so that the lengths of the sides in the angle of view 15A become equal to or longer than the specified value (step S04). For example, in the case shown in FIG. 4A, it can be seen that the substrate 50 should be moved to the left.
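The comparison of step S03 and the move of step S04 can be sketched as follows, assuming the two visible sides run from the matched vertices toward the view edges. This geometry (horizontal side toward the right edge, vertical side toward the bottom edge) is an illustrative assumption, not specified by the patent:

```python
def visible_side_lengths(vertex_b, vertex_c, view_shape):
    """Pixel lengths of the two sides inside the angle of view.
    `vertex_b` / `vertex_c` are the (row, col) positions of the vertices
    found with templates 20B and 20C. Assumes (illustratively) that the
    horizontal side extends from vertex_b to the right edge of the view
    and the vertical side from vertex_c to the bottom edge."""
    rows, cols = view_shape
    lb = cols - vertex_b[1]  # visible length of the horizontal side
    lc = rows - vertex_c[0]  # visible length of the vertical side
    return lb, lc

def required_shift(lb, lc, spec):
    """Per-axis move (in pixels) needed so both visible side lengths reach
    `spec`; a zero component means no move is needed along that axis."""
    return (max(0, spec - lb), max(0, spec - lc))
```

Under this convention the example of FIG. 4A corresponds to a nonzero shift along the axis whose visible side is still shorter than the specified value.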
  • the processing unit 12 displays an image corresponding to the acquired image data on the display unit 14.
  • When it is determined in step S03 that the lengths of the two sides in the angle of view 15A are equal to or longer than the specified value, or after step S04, the processing unit 12 detects the position of the corner 51 based on the image data 22 acquired in a state where the angle of view 15A includes sides whose lengths are equal to or longer than the specified value (step S05).
  • The process of detecting the position of the corner 51 in step S05 will be described with reference to FIG. 4B.
  • FIG. 4B is a diagram showing, as an image, the image data 22 acquired in a state where sides with lengths equal to or longer than the specified value are included in the angle of view 15A.
  • a side extending linearly in the horizontal direction and a side extending linearly in the vertical direction are detected in the image data 22.
  • pattern matching using a template including one straight side can be used.
  • The processing unit 12 adopts the position of the intersection (reference point) 28 of the extension lines of the two detected sides as the position of the corner 51.
  • the processing unit 12 displays information indicating the detected extension lines of the two sides and the reference point 28 on the display unit 14. For example, the extension line of two sides is displayed as a solid line or a broken line, and the intersection of both is displayed as a plus sign. Further, information 29 indicating the coordinates of the reference point is displayed on the display unit 14 as a number.
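The reference point 28 is the intersection of the two fitted side lines. The following is a minimal NumPy sketch, assuming the edge pixels of each side have already been extracted as (x, y) points; the function names are illustrative:

```python
import numpy as np

def fit_line(points):
    """Least-squares line through `points` ((x, y) pairs), returned as
    (point on the line, unit direction vector)."""
    pts = np.asarray(points, float)
    centroid = pts.mean(axis=0)
    # Principal direction of the centered points is the line direction.
    _, _, vt = np.linalg.svd(pts - centroid)
    return centroid, vt[0]

def intersect(p1, d1, p2, d2):
    """Intersection of the lines p1 + t*d1 and p2 + s*d2."""
    # Solve [d1 | -d2] [t, s]^T = p2 - p1 for t, then evaluate the first line.
    a = np.array([[d1[0], -d2[0]], [d1[1], -d2[1]]])
    t, _ = np.linalg.solve(a, np.asarray(p2, float) - np.asarray(p1, float))
    return np.asarray(p1, float) + t * np.asarray(d1, float)
```

Because the point is computed from the long visible portions of both sides, its accuracy does not depend on the shape of the cut-off vertex itself.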
  • In this embodiment, the position of the substrate 50 can be detected even when no alignment mark is formed on the substrate 50. Further, the position of the substrate 50 can be detected even if the front and back of the substrate 50 are reversed.
  • FIG. 5A is a diagram showing the image data of the corner 51 when the shape and dimensions of the cut-off portion of the corner 51 of the substrate 50 are as specified.
  • the position of the reference point 28 of the corner 51 can be detected by the method according to the embodiment.
  • FIG. 5B is a diagram showing, as an image, the image data of the corner 51 when the size of the cut-off portion of the corner 51 of the substrate 50 is smaller than the standard.
  • FIG. 5C is a diagram showing, as an image, the image data of the corner 51 when the shape of the cut-off portion of the corner 51 of the substrate 50 deviates from the standard.
  • In FIG. 5C, the angles formed between the oblique side and the two sides sandwiching the corner 51 deviate from 45 degrees. Since the vertices at the two ends of the oblique side are detected separately by pattern matching using the different templates 20B and 20C, they are easy to detect even when the direction of the oblique side deviates from 45 degrees. Since the reference point 28 of the corner 51 is determined by extending the two sides sandwiching the corner 51, the position detection accuracy of the reference point 28 does not decrease even if the direction of the oblique side deviates from 45 degrees.
  • FIG. 5D is a diagram showing the image data of the corner 51 when the posture in the in-plane rotation direction of the substrate 50 is slightly tilted from the target posture as an image.
  • In FIG. 5D, the two sides sandwiching the corner 51 are inclined with respect to the vertical and horizontal directions of the angle of view 15A (FIG. 1).
  • Even in this case, the vertices at both ends of the oblique side can be detected by inclining the templates 20B and 20C and performing pattern matching. Since the reference point 28 of the corner 51 is determined by extending the two sides sandwiching the corner 51, the position detection accuracy of the reference point 28 does not decrease even if the two sides are inclined with respect to the vertical and horizontal directions.
  • the position detection result is not affected by the shape of the apex of the corner 51.
  • the position of the corner 51 can be detected with high accuracy.
  • the image data 22 (FIG. 4B) including a side having a length equal to or longer than the specified value in the angle of view is used as the basis for calculating the position of the corner 51.
  • the extension line of these two sides can be determined with high accuracy.
  • the calculation accuracy of the position of the reference point 28 (FIG. 4B) calculated based on this extension line can be increased.
  • When the position detection device operates, the display unit 14 displays the image of the substrate 50 captured by the imaging devices 11A and 11B (FIG. 1) and, during pattern matching, the template images. The operator can therefore know the operation status of the position detection device from the information displayed on the display unit 14, and can confirm whether the device is operating normally.
  • FIG. 6 is a flowchart of the position detection process executed by the processing unit 12 of the position detection apparatus according to this embodiment. Steps S01 and S02 are the same as steps S01 and S02 in the embodiment shown in FIG.
  • In step S03a, the lengths of the two sides in the angle of view 15A are compared with the specified value to determine whether they are equal to the specified value.
  • If the difference between the length of each of the two sides in the angle of view 15A and the specified value is within the range of the maximum error caused by the measurement accuracy of the imaging devices 11A and 11B when measuring lengths based on image data, the processing unit 12 determines that the lengths of the two sides in the angle of view 15A are equal to the specified value.
  • Otherwise, the processing unit 12 moves the substrate 50 so that the lengths of the two sides in the angle of view 15A become equal to the specified value (step S04a).
  • When it is determined in step S03a that the lengths of the two sides in the angle of view 15A are equal to the specified value, or after step S04a, the processing unit 12 detects the position of the corner 51 based on the image data 22 acquired in a state where the angle of view 15A includes two sides whose lengths are equal to the specified value.
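The equality decision of step S03a and the correcting move of step S04a can be sketched as follows; `max_error` stands for the measurement-error bound described above, and the names are illustrative:

```python
def equal_within_tolerance(length, spec, max_error):
    """Step S03a decision: the visible side length is treated as equal to
    the specified value when the difference is within the maximum error
    of the imaging system's length measurement."""
    return abs(length - spec) <= max_error

def shift_to_spec(length, spec):
    """Signed move (in the same units as `length`) that makes the visible
    side length exactly equal to the specified value: positive enlarges
    the visible portion, negative shrinks it."""
    return spec - length
```

Pinning the lengths to the same value each time places the reference point at nearly the same position in the angle of view on every detection.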
  • the process of step S05 is the same as the process of step S05 of the embodiment shown in FIG.
  • In this embodiment, the variation in the position of the reference point 28 (FIG. 4B) of the corner 51 within the angle of view 15A is reduced. Since the influence of the aberration of the lens of the imaging device 11A can be made uniform for each position detection, variations in position detection accuracy can be reduced.
  • Next, a position detection device according to still another embodiment will be described with reference to FIGS. 7A and 7B.
  • the description of the configuration common to the embodiment shown in FIGS. 1 to 5D will be omitted.
  • FIG. 7A is a plan view of the substrate 50 that is a position detection target in this embodiment.
  • The substrate 50 that is the position detection target in the embodiment shown in FIGS. 1 to 5C has a shape in which the corner portions 51 are cut off linearly.
  • In contrast, the substrate 50 that is the position detection target in this embodiment has a shape in which the corner portions 51 are rounded.
  • FIG. 7B is a diagram illustrating templates 20A, 20B, and 20C used in the pattern matching executed in step S02 of FIG. 2 as images.
  • the templates 20A, 20B, and 20C in FIG. 7B correspond to the templates 20A, 20B, and 20C in FIG. 3B, respectively.
  • the patterns of the templates 20A, 20B, and 20C are also rounded according to the round shape of the corner 51 of the substrate 50.
  • Next, a position detection apparatus according to another embodiment will be described with reference to FIGS. 8 to 9B.
  • the description of the configuration common to the embodiment shown in FIGS. 1 to 5D will be omitted.
  • FIG. 8 is a flowchart of the position detection process executed by the position detection apparatus according to this embodiment.
  • the process of supporting the substrate 50 on the support unit 10 (step S11) and the process of drawing the first corner part into the angle of view (step S12) are the same as the processes of steps S01 and S02 shown in FIG.
  • Next, the processing unit 12 draws the second corner, adjacent to the first corner via one side, into the angle of view (step S13). For example, when the first corner is drawn into the angle of view 15A (FIG. 1), the second corner is drawn into the angle of view 15B.
  • the same technique as the process of step S02 of FIG. 2 can be applied to the process of drawing the second corner into the angle of view 15B.
  • FIG. 9A is a plan view showing the positional relationship between the angle of view 15A and 15B and the substrate 50 when step S13 is executed.
  • a direction from the origin of the angle of view 15B to the origin of the angle of view 15A is defined as a reference direction (x direction). At this time, each side of the substrate 50 is inclined with respect to the reference direction.
  • The processing unit 12 calculates the relative positional relationship between the two corners from the state in which the first corner is drawn into the angle of view and the state in which the second corner is drawn into the angle of view, and detects the posture of the substrate 50 in the in-plane rotation direction (step S14).
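Assuming both corner positions are known in a common stage coordinate system, the posture detection of step S14 reduces to the angle between the side joining the two corners and the reference direction (x direction). A minimal sketch, with illustrative names:

```python
import math

def in_plane_rotation(corner1, corner2):
    """Tilt of the substrate in the in-plane rotation direction: the angle,
    in degrees, between the side joining the two detected corners and the
    reference direction (x axis). Positions are (x, y) in a common
    coordinate system (an assumption for illustration)."""
    dx = corner1[0] - corner2[0]
    dy = corner1[1] - corner2[1]
    return math.degrees(math.atan2(dy, dx))
```

Rotating the support unit by the negative of this angle, as in step S15, makes the pair of sides parallel to the reference direction.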
  • the processing unit 12 performs alignment with respect to the in-plane rotation direction of the substrate 50 by rotating the support unit 10 based on the detection result of the posture of the substrate 50 in the in-plane rotation direction (step S15).
  • FIG. 9B is a plan view showing the positional relationship between the angles of view 15A and 15B and the substrate 50 after the execution of step S15. A pair of sides of the substrate 50 is parallel to the reference direction.
  • After the alignment in the in-plane rotation direction of the substrate 50, the processing unit 12 detects the position of the first corner again (step S16).
  • the position of the first corner can be detected by the same procedure as the procedure from steps S02 to S05 in FIG. Similarly, the processing unit 12 detects the position of the second corner again (step S17).
  • In this embodiment, alignment in the in-plane rotation direction of the substrate 50 can be performed, and the position of the substrate 50 in the two in-plane directions can be detected.
  • FIG. 10 is a flowchart of the position detection process executed by the position detection apparatus according to this embodiment.
  • the substrate 50 is supported on the support surface of the support portion 10 (FIG. 1) with the first surface facing upward (step S21). This process is the same as the process of step S01 shown in FIG.
  • the processing unit 12 detects the position of the substrate 50 (step S22). This position detection process is the same as the process from steps S02 to S05 shown in FIG. 2 or steps S12 to S16 shown in FIG.
  • the processing unit 12 creates a template from the image data of the corner 51 when the lengths of the two sides in the angle of view are equal to or greater than the specified value (step S23).
  • The template creation process will now be described. Pattern matching is performed between the image data 22 (FIG. 4B) used for detecting the position of the corner 51 in step S05 (FIG. 2) and the templates 20B and 20C (FIG. 3B), and the images of the portions corresponding to the templates 20B and 20C are cut out. Each cut-out image is mirror-image converted into a template and stored in the storage unit 13.
  • a template created based on an image cut out from the image data 22 is referred to as a dynamic template in distinction from the templates 20A, 20B, and 20C (FIG. 3B) created based on the standard shape of the substrate 50.
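The dynamic-template creation of step S23 can be sketched as a crop followed by a mirror-image conversion. Whether a left-right flip is the correct mirror axis depends on how the substrate is turned over, so the `np.fliplr` below is an illustrative assumption:

```python
import numpy as np

def make_dynamic_template(image_data, top_left, size):
    """Cut the matched corner region out of the first-surface image and
    mirror it left-right, so it matches the same corner as seen after the
    substrate is flipped front-to-back (the flip axis is an assumption)."""
    r, c = top_left
    h, w = size
    patch = image_data[r:r + h, c:c + w]
    return np.fliplr(patch).copy()
```

Because the dynamic template is cut from the actual substrate image rather than drawn from the standard, it reflects the true shape of that substrate's corner.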
  • the processing unit 12 processes the first surface of the substrate 50 (step S24). For example, ink is ejected from the inkjet head toward the first surface to form an insulating resin film having a desired planar shape.
  • Next, the front and back of the substrate 50 are reversed (step S25), so that the second surface, opposite to the first surface, faces upward.
  • The reversal is performed, for example, by the processing unit 12 controlling a robot arm. The front and back may also be reversed manually.
  • the processing unit 12 detects the position of the substrate 50 with the second surface facing upward (step S26).
  • the dynamic template created in step S23 is used instead of the templates 20A, 20B, and 20C (FIG. 3B) when executing the processes from steps S02 to S05 shown in FIG. More specifically, when searching for a specific part from the image data of the substrate 50, a dynamic template cut out from a corresponding part of the image data obtained by imaging the first surface is used.
  • After detecting the position of the substrate 50, the processing unit 12 processes the second surface of the substrate 50 (step S27).
  • In this embodiment, the position of the substrate 50 is detected using the dynamic templates in step S26. Therefore, even when the shape of the corner 51 of the actual substrate 50 deviates from the standard, the position of the corner 51 can be detected stably.
  • Next, a position detection device according to still another embodiment will be described with reference to FIGS. 11A and 11B.
  • the description of the configuration common to the embodiment shown in FIGS. 1 to 5D will be omitted.
  • FIG. 11A is a plan view of a substrate 50 that is a target whose position is detected by using the position detection device according to the present embodiment.
  • In this embodiment, the corner portion 51 of the substrate 50 is not cut off, and a substantially right-angled vertex appears.
  • FIG. 11B is a diagram showing the template 20 used for pattern matching at the time of position detection as an image.
  • In the embodiment shown in FIGS. 1 to 5D, the two templates 20B and 20C are used in the pattern matching performed when detecting the corner 51 (steps S02 and S05).
  • In this embodiment, the single template 20 is used in the pattern matching performed when detecting the corner 51.
  • Also in this embodiment, since the lengths of the two sides in the angle of view are equal to or greater than the specified value in the image data on which the detection of the position of the corner 51 is based (step S03), the detection accuracy of the position of the reference point 28 (FIG. 4B) of the corner 51 can be increased.
  • A position detection apparatus according to still another embodiment will be described with reference to FIGS. 12A and 12B.
  • The description of the configuration common to the embodiments shown in FIGS. 1 to 5D and FIGS. 8 to 9B is omitted.
  • FIGS. 12A and 12B are schematic perspective views of the position detection apparatus according to this embodiment.
  • In the embodiment shown in FIGS. 8 to 9B, the two imaging devices 11A and 11B are arranged corresponding to the two corners 51 of the substrate 50.
  • In the present embodiment, only one imaging device 11A is arranged.
  • In the embodiment shown in FIGS. 8 to 9B, the first corner is drawn into the angle of view 15A of the imaging device 11A in step S12, and the second corner is drawn into the angle of view 15B of the imaging device 11B in step S13.
  • In the present embodiment, in step S12, the first corner 51 is drawn into the angle of view 15A of the imaging device 11A as in the embodiment of FIG. 8 (FIG. 12A).
  • In step S13, the substrate 50 is translated so that the second corner 51 is drawn into the angle of view 15A of the same imaging device 11A (FIG. 12B).
  • In step S16, the first corner 51 is drawn into the angle of view 15A of the imaging device 11A, and the process of detecting the position of the first corner 51 is executed.
  • In step S17, the second corner 51 is drawn into the angle of view 15A of the imaging device 11A, and the process of detecting the position of the second corner 51 is executed.
  • The moving direction of the substrate 50 from the state of FIG. 12A to the state of FIG. 12B may be defined as the reference direction (the x direction) shown in FIGS. 9A and 9B.
  • FIG. 13 is a schematic front view of the film forming apparatus according to the present embodiment.
  • The support unit 10 is supported on the base 30 via a moving mechanism 31.
  • The support unit 10 corresponds to the support unit 10 of the embodiment shown in FIG. 1.
  • The substrate 50 is supported on the upper surface (support surface) of the support unit 10.
  • The moving mechanism 31 can move the support unit 10 two-dimensionally parallel to the support surface and rotate it within the plane of the support surface. The support surface of the support unit 10 is normally kept horizontal.
  • A plurality of inkjet heads 33 and the imaging devices 11A and 11B are arranged above the substrate 50 supported by the support unit 10.
  • The inkjet heads 33 and the imaging devices 11A and 11B are supported on the base 30 by a portal frame 32.
  • The imaging devices 11A and 11B can be raised and lowered by lifting mechanisms 17A and 17B, respectively.
  • Each of the inkjet heads 33 is provided with a plurality of nozzle holes. Droplets of the film material are discharged from the nozzle holes toward the substrate 50.
  • The imaging devices 11A and 11B image a part of the substrate 50 supported by the support unit 10 and transmit the acquired two-dimensional image data to the processing unit 12.
  • The imaging devices 11A and 11B and the processing unit 12 correspond to the imaging devices 11A and 11B and the processing unit 12 of the embodiment shown in FIG. 1. Furthermore, the processing unit 12 can adjust the focus positions of the imaging devices 11A and 11B by controlling the lifting mechanisms 17A and 17B to raise and lower them.
  • The processing unit 12 controls the moving mechanism 31 and the inkjet heads 33 based on image data defining the shape of the film to be formed.
  • A film can be formed by curing the liquid film material adhering to the substrate. In this way, a film having a desired shape can be formed on the substrate 50.
  • As the film material, a photocurable resin, a thermosetting resin, or the like can be used.
  • A light source or a heat source for curing the film material adhering to the substrate 50 is disposed beside the inkjet heads 33.
  • Various commands and data are input to the processing unit 12 from the input unit 16.
  • As the input unit 16, a keyboard, a pointing device, a USB port, a communication device, or the like is used.
  • Various information regarding the operation of the film forming apparatus is output to the display unit 14.
  • As the display unit 14, a liquid crystal display or the like is used.
  • A communication device that transmits image data to be displayed on an external display device may also be used as the display unit 14.
  • In this film forming apparatus, the position of the substrate 50 is detected by the method according to any one of the embodiments described above. The position can thereby be detected with high accuracy.
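The dynamic-template idea described in the bullets above (cut the template out of the image of the first surface, then reuse it, mirrored, after the substrate is flipped) can be sketched in a few lines of pure Python. This is an illustrative sketch only, not the patent's implementation: the function names (`cut_out`, `mirror_lr`, `sad`, `match`) and the use of a sum-of-absolute-differences score are assumptions.

```python
def cut_out(image, top, left, h, w):
    """Extract an h x w window from a 2D grayscale image (list of lists).
    This plays the role of creating a dynamic template from the imaged corner."""
    return [row[left:left + w] for row in image[top:top + h]]

def mirror_lr(patch):
    """Mirror a patch left-right: flipping the substrate over reverses x."""
    return [list(reversed(row)) for row in patch]

def sad(a, b):
    """Sum of absolute differences between two equally sized patches (assumed score)."""
    return sum(abs(pa - pb) for ra, rb in zip(a, b) for pa, pb in zip(ra, rb))

def match(image, template):
    """Exhaustive search: return (top, left) of the best-matching window."""
    th, tw = len(template), len(template[0])
    best = None
    for top in range(len(image) - th + 1):
        for left in range(len(image[0]) - tw + 1):
            score = sad(cut_out(image, top, left, th, tw), template)
            if best is None or score < best[0]:
                best = (score, top, left)
    return best[1], best[2]
```

A real implementation would use a normalized correlation score and a library matcher, but the structure (cut out, mirror, search) is the same.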

Landscapes

  • Engineering & Computer Science (AREA)
  • Manufacturing & Machinery (AREA)
  • Computer Hardware Design (AREA)
  • Microelectronics & Electronic Packaging (AREA)
  • Power Engineering (AREA)
  • Physics & Mathematics (AREA)
  • Condensed Matter Physics & Semiconductors (AREA)
  • General Physics & Mathematics (AREA)
  • Length Measuring Devices By Optical Means (AREA)
  • Image Analysis (AREA)
  • Container, Conveyance, Adherence, Positioning, Of Wafer (AREA)
  • Image Processing (AREA)

Abstract

In the present invention, a substrate having a corner part is supported by a support unit. An imaging device captures an image of a portion of the substrate within its angle of view. A processing unit moves the substrate or the imaging device so that the length of each of two sides on either side of a corner part of the substrate included in the angle of view of the imaging device is equal to or greater than a specified value. The position of the corner part of the substrate is detected on the basis of image data acquired by the imaging device imaging within the angle of view after the movement. With this configuration, position detection can be performed more stably on the basis of image data of the corner part of the substrate.

Description

Position detection apparatus and position detection method
The present invention relates to a position detection apparatus and a position detection method.
A technique is known in which, when drawing on or processing a substrate, the substrate is positioned using alignment marks formed on the substrate. Patent Document 1 below discloses a screen printing technique that positions a substrate based on image data obtained by imaging the vicinity of a corner of the substrate.
JP 2014-205286 A
An object of the present invention is to provide a position detection apparatus capable of performing more stable position detection based on image data of a corner of a substrate.
According to one aspect of the present invention, there is provided a position detection apparatus comprising:
a support unit that supports a substrate having a corner;
an imaging device that images a portion of the substrate within its angle of view; and
a processing unit,
wherein the processing unit
moves the substrate or the imaging device so that the length of each of two sides flanking the corner of the substrate included in the angle of view of the imaging device is equal to or greater than a specified value, and
detects the position of the corner of the substrate based on image data acquired by imaging within the angle of view of the imaging device after the movement.
According to another aspect of the present invention, there is provided a position detection apparatus comprising:
a support unit that supports a substrate having a corner;
an imaging device that images a portion of the substrate;
a processing unit; and
a display unit that displays images,
wherein the processing unit
displays an image of the corner of the substrate within the angle of view of the imaging device on the display unit,
moves the substrate or the imaging device so that the length of each of two sides flanking the corner of the substrate included in the angle of view of the imaging device is equal to or greater than a specified value, and displays an image of the corner within the angle of view of the imaging device after the movement on the display unit, and
obtains the coordinates of the position of the corner of the image of the substrate within the angle of view of the imaging device, and displays information specifying the obtained coordinates on the display unit.
According to yet another aspect of the present invention, there is provided a position detection method comprising:
imaging a first corner of a substrate having corners to acquire first image data;
comparing, for each of two sides flanking the first corner, the length of the portion included in the first image data with a specified value, and, when the length of a portion included in the first image data is not equal to or greater than the specified value, moving the imaged region so that the length becomes equal to or greater than the specified value and imaging the substrate to acquire second image data; and
determining the position of the first corner of the substrate based on the second image data.
By detecting the position of the corner in a state in which the length of each of the two sides flanking the corner of the substrate included in the angle of view of the imaging device is equal to or greater than a specified value, stable position detection can be performed.
FIG. 1 is a schematic perspective view of a position detection apparatus according to an embodiment.
FIG. 2 is a flowchart of the position detection process executed by the processing unit of the position detection apparatus according to the embodiment.
FIG. 3A is a plan view of a substrate whose position is to be detected, FIG. 3B shows, as images, the templates used in the pattern matching executed during the position detection process, and FIG. 3C shows, as images, image data handled by the processing unit during the position detection process.
FIGS. 4A and 4B show, as images, image data handled by the processing unit during the position detection process.
FIG. 5A shows, as an image, the image data of a corner when the shape and dimensions of the cut-off portion of the corner of the substrate conform to the standard, FIG. 5B shows the image data of a corner when the dimensions of the cut-off portion are smaller than the standard, FIG. 5C shows the image data of a corner when the shape of the cut-off portion deviates from the standard, and FIG. 5D shows the image data of a corner when the in-plane rotational posture of the substrate is slightly tilted from the target posture.
FIG. 6 is a flowchart of the position detection process executed by the processing unit of a position detection apparatus according to another embodiment.
FIG. 7A is a plan view of a substrate whose position is to be detected in still another embodiment, and FIG. 7B shows, as images, the templates used in the pattern matching executed in steps S03 and S07 of FIG. 2.
FIG. 8 is a flowchart of the position detection process executed by a position detection apparatus according to still another embodiment.
FIG. 9A is a plan view showing the positional relationship between the angle of view and the substrate at the time step S11 of FIG. 8 is executed, and FIG. 9B is a plan view showing that positional relationship after step S13 is executed.
FIG. 10 is a flowchart of the position detection process executed by a position detection apparatus according to still another embodiment.
FIG. 11A is a plan view of a substrate whose position is to be detected using a position detection apparatus according to still another embodiment, and FIG. 11B shows, as an image, the template used for pattern matching at the time of position detection.
FIGS. 12A and 12B are schematic perspective views of a position detection apparatus according to still another embodiment.
FIG. 13 is a schematic front view of a film forming apparatus according to still another embodiment.
A position detection apparatus according to an embodiment will be described with reference to FIGS. 1 to 5D.
FIG. 1 is a schematic perspective view of the position detection apparatus according to the embodiment. The position detection apparatus according to the present embodiment includes a support unit 10, imaging devices 11A and 11B, a processing unit 12, a storage unit 13, and a display unit 14. The support unit 10 supports a substrate 50 to be processed on its upper surface (support surface). As the support unit 10, for example, a movable stage is used that can move the substrate 50 two-dimensionally within the plane and rotate it within the plane. The substrate 50 has a planar shape with a plurality of corners, for example a square or a rectangle.
The imaging device 11A images a region within an angle of view (field of view) 15A on the support surface of the support unit 10 and acquires image data. The imaging device 11B images a region within an angle of view (field of view) 15B on the support surface of the support unit 10 and acquires image data. The positions of the two imaging devices 11A and 11B are adjusted so that, when one corner 51 of the substrate 50 is located within the angle of view 15A, another corner 51 adjacent to it across one side is located within the angle of view 15B.
The position detection function is realized by the processing unit 12 executing a processing program stored in the storage unit 13. In addition to the processing program, the storage unit 13 stores various data referred to when the position detection process is executed, image data acquired by the imaging devices 11A and 11B, coordinate data resulting from processing by the processing unit 12, and the like. The processing unit 12 outputs processing results to the display unit 14. As the processing unit 12, for example, a central processing unit (CPU) is used. As the storage unit 13, for example, a RAM, a ROM, an external storage device, or the like is used. As the display unit 14, for example, a liquid crystal display, an organic EL display, or the like is used.
FIG. 2 is a flowchart of the position detection process executed by the processing unit 12 (FIG. 1). The position detection process described below detects the position of one corner 51 of the substrate 50. The positions of the other corners 51 can be detected by the same process.
FIG. 3A is a plan view of the substrate 50 whose position is to be detected. The substrate 50 has a substantially rectangular or square planar shape, and its four corners 51 are cut off obliquely. For example, the planar shape of each obliquely cut-off portion is a right isosceles triangle with base angles of 45 degrees.
FIG. 3B shows, as images, the templates 20A, 20B, and 20C used in the pattern matching executed during the position detection process. The template 20A corresponds to an image of the oblique side of the corner 51. The template 20B corresponds to an image of the intersection of the oblique side of the corner 51 with the side extending in the horizontal direction. The template 20C corresponds to an image of the intersection of the oblique side of the corner 51 with the side extending in the vertical direction. The templates 20A, 20B, and 20C are created based on the standard for the external shape of the substrate 50.
FIGS. 3C, 4A, and 4B show, as images, image data handled by the processing unit 12 during the position detection process. In FIGS. 3C, 4A, and 4B, the region inside the substrate 50 is hatched.
As shown in FIG. 2, before the position of the substrate 50 is detected, the substrate 50 is first supported on the support surface of the support unit 10 (step S01). This is done, for example, by the processing unit 12 controlling a transfer device such as a robot arm. When the substrate 50 is placed on the support unit 10, it has been coarsely positioned so that two adjacent corners 51 (FIG. 1) of the substrate 50, or their vicinities, fall within the two angles of view 15A and 15B (FIG. 1), respectively.
The processing unit 12 draws the corner 51 of the substrate 50 into the angle of view 15A (step S02). The process of drawing the corner 51 of the substrate 50 into the angle of view 15A is described below. First, with the substrate 50 coarsely positioned, the estimated position of the corner 51 (within the angle of view 15A) is imaged to acquire image data 22 (FIG. 3C). The processing unit 12 displays an image corresponding to the acquired image data 22 on the display unit 14.
The processing unit 12 then determines whether the image data 22 includes parts of the two sides flanking the corner 51 of the substrate 50. Hereinafter, "including parts of the two sides flanking the corner 51" is referred to simply as "including the corner 51". If the image data 22 does not include the corner 51, the processing unit 12 moves the support unit 10 so that the corner 51 falls within the angle of view 15A. The processing unit 12 displays images corresponding to the image data acquired while the substrate 50 is moving on the display unit 14.
The process of determining whether the corner 51 is included in the image data 22 is described below with reference to FIG. 3C. First, the processing unit 12 performs pattern matching between the image data 22 and the template 20B (FIG. 3B). Next, it performs pattern matching between the image data 22 and the template 20C (FIG. 3B).
In the example shown at the left end of FIG. 3C, the image data 22 contains both a portion matching the template 20B and a portion matching the template 20C. In this case, it is determined that the corner 51 is included in the image data 22.
In the second example from the left in FIG. 3C, the image data 22 contains a portion matching the template 20B but no portion matching the template 20C. In this case, the corner 51 can be brought within the angle of view 15A by moving the substrate 50 downward in FIG. 3C.
In the third example from the left in FIG. 3C, the image data 22 contains no portion matching the template 20B but does contain a portion matching the template 20C. In this case, the corner 51 can be brought within the angle of view 15A by moving the substrate 50 to the left in FIG. 3C.
In the example shown at the right end of FIG. 3C, the image data 22 contains neither a portion matching the template 20B nor a portion matching the template 20C. In this case, the processing unit 12 performs pattern matching between the image data 22 and the template 20A. If the image data 22 contains a portion matching the template 20A, the processing unit 12 estimates, from the position of the matching portion, the direction and distance by which the substrate 50 should be moved. For example, if a portion matching the template 20A is found in the upper-right region of the angle of view 15A, the substrate 50 should be moved diagonally toward the lower left.
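The four cases above (both vertex templates matched, only 20B, only 20C, or only the oblique-edge template 20A) can be summarized as a small decision function. This is an illustrative sketch, not the patent's implementation: the function name, the coordinate convention, and the unit move directions are assumptions chosen to match the orientation of FIG. 3C.

```python
def decide_move(found_b, found_c, oblique_pos=None, view_center=(50, 50)):
    """Return a coarse (dx, dy) move for the substrate, or None.

    found_b / found_c: whether templates 20B / 20C matched in the image data.
    oblique_pos: (x, y) position of a match of template 20A, if any.
    """
    if found_b and found_c:
        return (0, 0)            # corner already inside the angle of view
    if found_b:
        return (0, -1)           # only 20B matched: move the substrate down
    if found_c:
        return (-1, 0)           # only 20C matched: move the substrate left
    if oblique_pos is not None:  # only the oblique edge (20A) was found:
        dx = -1 if oblique_pos[0] > view_center[0] else 1
        dy = -1 if oblique_pos[1] > view_center[1] else 1
        return (dx, dy)          # move toward the edge, e.g. lower-left
    return None                  # nothing matched: corner far outside the view
```

For example, a 20A match in the upper-right region of the view yields a lower-left move, matching the text above.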
While performing pattern matching, the processing unit 12 displays on the display unit 14 information indicating the portions of the displayed image that matched the templates 20A, 20B, and 20C. For example, a frame surrounding each matched portion is displayed superimposed on the image.
After the substrate 50 has been moved by the determined direction and distance, the image data 22 is reacquired. If the reacquired image data 22 includes the corner 51, the process of drawing the corner 51 into the angle of view 15A is complete.
When the corner 51 has been drawn into the angle of view 15A, the processing unit 12 compares, for each of the two sides flanking the corner 51, the length of the portion within the angle of view 15A (the in-view side length) with a specified value (step S03). Here, the two sides flanking the corner 51 mean the sides that run from the two ends of the oblique side, formed by cutting off the corner 51, toward the adjacent corners. The specified value is stored in advance in the storage unit 13. The in-view side lengths can be obtained, for example, from the positions of the portions matching the templates 20B and 20C. The processing unit 12 displays information indicating the portions matching the templates 20B and 20C on the display unit 14.
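The comparison of step S03 can be sketched as follows, assuming vertex positions returned by the template matching and the simple view geometry of FIG. 4A (horizontal side running from vertex B to the right view edge, vertical side from vertex C to the bottom edge). The names and the geometry convention are hypothetical.

```python
def in_view_side_lengths(vertex_b, vertex_c, view_w, view_h):
    """In-view lengths of the two sides flanking the corner.

    vertex_b / vertex_c are (x, y) pixel positions of the parts matching
    templates 20B and 20C; view_w x view_h is the angle of view in pixels.
    """
    lb = view_w - vertex_b[0]   # horizontal side: vertex B to right edge
    lc = view_h - vertex_c[1]   # vertical side: vertex C to bottom edge
    return lb, lc

def sides_long_enough(vertex_b, vertex_c, view_w, view_h, specified):
    """Step S03: both in-view side lengths must reach the specified value."""
    lb, lc = in_view_side_lengths(vertex_b, vertex_c, view_w, view_h)
    return lb >= specified and lc >= specified
```

When this check fails, the substrate is shifted (step S04) until both lengths reach the specified value.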
FIG. 4A shows, as an image, the image data 22 when an in-view side length is shorter than the specified value. The image data 22 in FIG. 4A completely contains the entire pattern corresponding to the template 20C, but only part of the pattern corresponding to the template 20B. Even in such a case, a portion matching the template 20B can be detected by lowering the matching decision threshold. The length Lc of the side running from the vertex of the portion matching the template 20C toward the adjacent corner is equal to or greater than the specified value, but the length Lb of the side running from the vertex of the portion matching the template 20B toward the adjacent corner is less than the specified value.
If it is determined in step S03 that the length of at least one side within the angle of view 15A is shorter than the specified value, the substrate 50 is moved so that the in-view side lengths become equal to or greater than the specified value (step S04). In the example shown in FIG. 4A, for instance, the substrate 50 should be moved to the left. While the substrate 50 is moving, the processing unit 12 displays images corresponding to the acquired image data on the display unit 14.
If it is determined in step S03 that the lengths of the two sides within the angle of view 15A are equal to or greater than the specified value, or after step S04, the processing unit 12 detects the position of the corner 51 based on the image data 22 acquired in a state in which the angle of view 15A contains sides whose lengths are equal to or greater than the specified value (step S05). The process of detecting the position of the corner 51 is described below with reference to FIG. 4B.
FIG. 4B shows, as an image, the image data 22 acquired in a state in which the angle of view 15A contains sides whose lengths are equal to or greater than the specified value. First, a side extending linearly in the horizontal direction and a side extending linearly in the vertical direction are detected in the image data 22. These sides can be detected by pattern matching using a template containing a single straight side. Alternatively, a side extending in the horizontal direction can be detected from a column of pixels in which the brightness changes sharply in the vertical direction, and a side extending in the vertical direction can be detected from a column of pixels in which the brightness changes sharply in the horizontal direction.
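The brightness-change approach to locating a vertically running side can be sketched as follows. This is illustrative only; the patent does not specify this code, and the function name is an assumption.

```python
def edge_column_per_row(image):
    """For each row of a 2D grayscale image, return the column index where
    the horizontal brightness change is largest.  A side of the substrate
    running vertically shows up as (nearly) the same column in every row;
    the analogous scan over columns finds a horizontally running side."""
    cols = []
    for row in image:
        diffs = [abs(row[i + 1] - row[i]) for i in range(len(row) - 1)]
        cols.append(diffs.index(max(diffs)))
    return cols
```

Fitting a line through the per-row edge positions then gives the detected side.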
The processing unit 12 adopts the position of the intersection (reference point) 28 of the extension lines of the two detected sides as the position of the corner 51. The processing unit 12 displays the extension lines of the two detected sides and information indicating the reference point 28 on the display unit 14. For example, the extension lines of the two sides are displayed as solid or broken lines, and their intersection is displayed as a plus sign. In addition, information 29 indicating the coordinates of the reference point is displayed numerically on the display unit 14.
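Taking the reference point 28 as the intersection of the two extended sides reduces to a standard 2D line-intersection computation. A minimal sketch, with each detected side represented by a point on it and a direction vector (names hypothetical):

```python
def line_intersection(p1, d1, p2, d2):
    """Intersection of the lines p1 + t*d1 and p2 + s*d2, where p1, p2 are
    (x, y) points and d1, d2 are direction vectors.  Returns None when the
    lines are parallel (no unique reference point)."""
    (x1, y1), (dx1, dy1) = p1, d1
    (x2, y2), (dx2, dy2) = p2, d2
    det = dx1 * dy2 - dy1 * dx2
    if det == 0:
        return None
    t = ((x2 - x1) * dy2 - (y2 - y1) * dx2) / det
    return (x1 + t * dx1, y1 + t * dy1)
```

Because the point is taken on the extension lines, it is well defined even though the physical vertex itself is cut off.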
 次に、図5A~図5Dを参照して、上記実施例の優れた効果について説明する。上記実施例では、アライメントマークが形成されていない基板50の位置を検出することができる。さらに、基板50の表裏を反転させても、基板50の位置検出を行うことができる。 Next, with reference to FIGS. 5A to 5D, the excellent effects of the above embodiment will be described. In the above embodiment, the position of the substrate 50 where the alignment mark is not formed can be detected. Further, the position of the substrate 50 can be detected even if the front and back sides of the substrate 50 are reversed.
 図5Aは、基板50の角部51の切り落とし部分の形状及び寸法が規格通りであるときの角部51の画像データを画像として示す図である。このように、基板50の形状及び寸法が規格通りであるとき、実施例による方法で、角部51の基準点28の位置を検出することができる。 FIG. 5A is a diagram showing the image data of the corner 51 when the shape and dimensions of the cut-off portion of the corner 51 of the substrate 50 are as specified. Thus, when the shape and dimensions of the substrate 50 are in accordance with the standard, the position of the reference point 28 of the corner 51 can be detected by the method according to the embodiment.
 図5Bは、基板50の角部51の切り落とし部分の寸法が規格よりも小さい場合の角部51の画像データを画像として示す図である。斜めの辺、及びその両側の2つの辺を含む1つのテンプレートを用いた場合には、パターンマッチングによって角部51の位置を正確に検出することが困難である。本実施例においては、斜めの辺の両端の頂点を、別々にパターンマッチングすることにより検出するため、斜めの辺の長さがばらついても、角部51を挟む2つの辺を精度よく検出することができる。その結果、基準点28の位置を精度よく検出することができる。 FIG. 5B is a diagram showing the image data of the corner 51 when the size of the cut-off portion of the corner 51 of the substrate 50 is smaller than the standard as an image. When one template including an oblique side and two sides on both sides thereof is used, it is difficult to accurately detect the position of the corner 51 by pattern matching. In this embodiment, since the vertices at both ends of the oblique side are detected by pattern matching separately, even if the lengths of the oblique sides vary, the two sides sandwiching the corner 51 are detected with high accuracy. be able to. As a result, the position of the reference point 28 can be detected with high accuracy.
 FIG. 5C shows, as an image, the image data of the corner 51 when the shape of the cut-off portion of the corner 51 of the substrate 50 deviates from the standard. In the example shown in FIG. 5C, the angle between the oblique side and the two sides sandwiching the corner 51 deviates from 45 degrees. Because the vertices at both ends of the oblique side are detected by separate pattern matching using different templates 20B and 20C, they can easily be detected even when the oblique side deviates from 45 degrees. Since the reference point 28 of the corner 51 is determined by extending the two sides sandwiching the corner 51, the detection accuracy of the position of the reference point 28 does not decrease even if the direction of the oblique side deviates from 45 degrees.
 FIG. 5D shows, as an image, the image data of the corner 51 when the in-plane rotational posture of the substrate 50 is slightly tilted from the target posture. For example, the two sides sandwiching the corner 51 are inclined with respect to the vertical and horizontal directions of the angle of view 15A (FIG. 1). Even in this case, the vertices at both ends of the oblique side can be detected by tilting the templates 20B and 20C and performing pattern matching. Since the reference point 28 of the corner 51 is determined by extending the two sides sandwiching the corner 51, the detection accuracy of the position of the reference point 28 does not decrease even if the two sides are inclined with respect to the vertical and horizontal directions.
 Furthermore, in the embodiment, the position of the corner 51 is detected by extending the two sides sandwiching the corner 51, so the detection result is not affected by the shape of the apex of the corner 51. For example, even when the apex has an irregular shape due to manufacturing variations, the position of the corner 51 can be detected with high accuracy.
 In the above embodiment, image data 22 (FIG. 4B) containing, within the angle of view, sides whose lengths are equal to or greater than the specified value is used as the basis for calculating the position of the corner 51. The extension lines of these two sides can therefore be determined with high accuracy. As a result, the accuracy of the position of the reference point 28 (FIG. 4B) calculated from these extension lines can be increased.
 In the above embodiment, the image of the substrate 50 captured by the imaging devices 11A and 11B (FIG. 1) and the template image used during pattern matching are displayed on the display unit 14 in accordance with the operation of the position detection device. The operator can therefore learn the operating status of the position detection device from the information displayed on the display unit 14, and can confirm from this information that the position detection device is operating normally.
 Next, a position detection device according to another embodiment will be described with reference to FIG. 6. Description of the configuration common to the embodiment shown in FIGS. 1 to 5D is omitted below.
 FIG. 6 is a flowchart of the position detection processing executed by the processing unit 12 of the position detection device according to this embodiment. Steps S01 and S02 are the same as steps S01 and S02 of the embodiment shown in FIG. 2.
 In this embodiment, in step S03a, the lengths of the two sides within the angle of view 15A are compared with the specified value to determine whether they are equal to the specified value. When the difference between the lengths of the two sides within the angle of view 15A and the specified value is within the maximum error caused by the measurement accuracy of length measurement based on the image data from the imaging devices 11A and 11B, the processing unit 12 determines that the lengths of the two sides within the angle of view 15A are equal to the specified value.
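The tolerance-based comparison of step S03a can be sketched as follows; the function name, argument names, and the millimeter values in the example are illustrative assumptions, not values from the embodiment:

```python
def sides_equal_to_spec(length_a, length_b, spec, max_measure_error):
    """Step S03a: the two measured side lengths count as "equal" to
    the specified value when each differs from it by no more than the
    maximum error of image-based length measurement."""
    return (abs(length_a - spec) <= max_measure_error
            and abs(length_b - spec) <= max_measure_error)

# e.g. a 10.0 mm specified length measured with up to 0.05 mm of error
within_spec = sides_equal_to_spec(10.02, 9.97, 10.0, 0.05)   # True
out_of_spec = sides_equal_to_spec(10.20, 10.00, 10.0, 0.05)  # False
```

Treating lengths inside the measurement-error band as equal prevents the device from repeatedly nudging the substrate in step S04a to chase differences it cannot actually resolve.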
 When the lengths of the two sides within the angle of view 15A are not equal to the specified value, the processing unit 12 moves the substrate 50 so that the lengths of the two sides within the angle of view 15A become equal to the specified value (step S04a).
 When it is determined in step S03a that the lengths of the two sides within the angle of view 15A are equal to the specified value, or after step S04a, the processing unit 12 detects the position of the corner 51 based on the image data 22 acquired with two sides of lengths equal to the specified value contained in the angle of view 15A. The processing of step S05 is the same as that of step S05 of the embodiment shown in FIG. 2.
 In this embodiment, the position of the reference point 28 (FIG. 4B) of the corner 51 varies less within the angle of view 15A. Because the influence of the lens aberration of the imaging device 11A is equalized for every position detection, variations in position detection accuracy can be reduced.
 Next, a position detection device according to still another embodiment will be described with reference to FIGS. 7A and 7B. Description of the configuration common to the embodiment shown in FIGS. 1 to 5D is omitted below.
 FIG. 7A is a plan view of the substrate 50 whose position is to be detected in this embodiment. The substrate 50 whose position is detected in the embodiment shown in FIGS. 1 to 5C has corners 51 that are cut off in a straight line, whereas in this embodiment the substrate 50 whose position is to be detected has corners 51 that are cut off in a rounded shape.
 FIG. 7B shows, as images, the templates 20A, 20B, and 20C used in the pattern matching executed in step S02 of FIG. 2. The templates 20A, 20B, and 20C of FIG. 7B correspond to the templates 20A, 20B, and 20C of FIG. 3B, respectively. In this embodiment, the patterns of the templates 20A, 20B, and 20C are rounded to match the round shape of the corner 51 of the substrate 50.
 By using the templates 20A, 20B, and 20C shown in FIG. 7B, the position of a substrate 50 whose corners 51 are rounded can be detected.
 Next, a position detection device according to still another embodiment will be described with reference to FIGS. 8 to 9B. Description of the configuration common to the embodiment shown in FIGS. 1 to 5D is omitted below.
 FIG. 8 is a flowchart of the position detection processing executed by the position detection device according to this embodiment. The process of supporting the substrate 50 on the support unit 10 (step S11) and the process of drawing the first corner into the angle of view (step S12) are the same as the processes of steps S01 and S02 shown in FIG. 2, respectively.
 In this embodiment, after drawing the first corner into the angle of view, the processing unit 12 draws the second corner, which is adjacent to the first corner across one side, into the angle of view (step S13). For example, when the first corner is drawn into the angle of view 15A (FIG. 1), the second corner is drawn into the angle of view 15B. The same technique as the process of step S02 of FIG. 2 can be applied to the process of drawing the second corner into the angle of view 15B.
 FIG. 9A is a plan view showing the positional relationship between the angles of view 15A and 15B and the substrate 50 at the time step S13 is executed. The direction from the origin of the angle of view 15B toward the origin of the angle of view 15A is defined as the reference direction (x direction). At this point, each side of the substrate 50 is inclined with respect to the reference direction.
 The processing unit 12 detects the in-plane rotational posture of the substrate 50 by calculating the relative positional relationship between the two corners from the state in which the first corner is drawn into the angle of view and the state in which the second corner is drawn into the angle of view (step S14). The processing unit 12 aligns the substrate 50 with respect to the in-plane rotation direction by rotating the support unit 10 based on the detection result of the in-plane rotational posture of the substrate 50 (step S15).
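The in-plane tilt in step S14 follows from the two detected corner positions. A minimal sketch, assuming both corners are expressed in a common stage coordinate frame (the function name and the example coordinates are illustrative):

```python
import math

def in_plane_rotation_deg(corner1, corner2):
    """Tilt, in degrees, of the substrate edge through the two
    detected corners relative to the reference x direction."""
    dx = corner1[0] - corner2[0]
    dy = corner1[1] - corner2[1]
    # atan2 keeps the correct sign of the tilt in every quadrant
    return math.degrees(math.atan2(dy, dx))

# second corner near the origin of view 15B, first corner near view 15A
theta = in_plane_rotation_deg((200.0, 3.5), (0.0, 0.0))
# rotating the support unit by -theta aligns that edge with the x axis
```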
 FIG. 9B is a plan view showing the positional relationship between the angles of view 15A and 15B and the substrate 50 after step S15 has been executed. One pair of sides of the substrate 50 is parallel to the reference direction.
 After the alignment of the substrate 50 in the in-plane rotation direction, the processing unit 12 detects the position of the first corner again (step S16). The position of the first corner can be detected by the same procedure as steps S02 to S05 of FIG. 2. Similarly, the processing unit 12 detects the position of the second corner again (step S17).
 In the embodiment shown in FIGS. 8 to 9B, as shown in FIG. 9B, the substrate 50 can be aligned with respect to the in-plane rotation direction, and the position of the substrate 50 in two in-plane directions can be detected.
 Next, a position detection device according to still another embodiment will be described with reference to FIG. 10. Description of the configuration common to the embodiments shown in FIGS. 1 to 8 is omitted below.
 FIG. 10 is a flowchart of the position detection processing executed by the position detection device according to this embodiment. First, the substrate 50 is supported on the support surface of the support unit 10 (FIG. 1) with its first surface facing upward (step S21). This process is the same as that of step S01 shown in FIG. 2.
 The processing unit 12 detects the position of the substrate 50 (step S22). This position detection process is the same as the processing of steps S02 to S05 shown in FIG. 2 or steps S12 to S16 shown in FIG. 8.
 Next, the processing unit 12 creates a template from the image data of the corner 51 obtained when the lengths of the two sides within the angle of view are equal to or greater than the specified value (step S23). The template creation process is as follows. Pattern matching is performed between the image data 22 (FIG. 4B) used for detecting the position of the corner 51 in step S05 (FIG. 2) and the templates 20B and 20C (FIG. 3B), and the portions of the actual image data 22 corresponding to the templates 20B and 20C are cut out. The cut-out images are mirror-inverted to form templates, which are stored in the storage unit 13. A template created from an image cut out of the image data 22 is called a dynamic template, to distinguish it from the templates 20A, 20B, and 20C (FIG. 3B) created from the standard shape of the substrate 50.
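The cut-out-and-mirror operation of step S23 can be sketched in a few lines; the function name and the coordinate arguments are illustrative assumptions, and a real corner image would replace the toy array:

```python
import numpy as np

def make_dynamic_template(corner_image, top_left, size):
    """Cut out the region of the real corner image that matched a
    standard template and mirror it left-right, producing a
    "dynamic template" usable after the substrate is flipped over."""
    r, c = top_left
    h, w = size
    patch = corner_image[r:r + h, c:c + w]
    return patch[:, ::-1].copy()   # left-right mirror for the reversed face

# toy 4x4 "corner image" with distinct pixel values
img = np.arange(16, dtype=float).reshape(4, 4)
tpl = make_dynamic_template(img, (0, 0), (2, 2))
```

Mirroring is what makes the template valid for the flipped substrate: a corner seen from the second surface appears as the mirror image of the same corner seen from the first surface.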
 After creating the dynamic templates, the processing unit 12 processes the first surface of the substrate 50 (step S24). For example, ink is ejected from an inkjet head toward the first surface to form an insulating resin film having a desired planar shape.
 When the processing of the first surface is completed, the substrate 50 is turned over (step S25) so that the second surface, opposite the first surface, faces upward. The substrate is turned over, for example, by the processing unit 12 controlling a robot arm; it may also be turned over manually.
 The processing unit 12 detects the position of the substrate 50 with its second surface facing upward (step S26). In this position detection process, the dynamic templates created in step S23 are used in place of the templates 20A, 20B, and 20C (FIG. 3B) when executing the processing of steps S02 to S05 shown in FIG. 2. More specifically, when searching for a specific portion in the image data of the substrate 50, a dynamic template cut out from the corresponding portion of the image data obtained by imaging the first surface is used.
 After detecting the position of the substrate 50, the processing unit 12 processes the second surface of the substrate 50 (step S27).
 In the embodiment shown in FIG. 10, the position of the substrate 50 is detected in step S26 using the dynamic templates. Therefore, even when the shape of the corner 51 deviates from the standard shape, a decrease in pattern matching accuracy can be suppressed. As a result, the relative alignment accuracy between the first surface and the second surface can be increased.
 Next, a position detection device according to still another embodiment will be described with reference to FIGS. 11A and 11B. Description of the configuration common to the embodiment shown in FIGS. 1 to 5D is omitted below.
 FIG. 11A is a plan view of a substrate 50 whose position is to be detected by the position detection device according to this embodiment. The corners 51 of the substrate 50 are not cut off, and substantially right-angled vertices appear.
 FIG. 11B shows, as an image, the template 20 used for pattern matching during position detection. In the embodiment shown in FIGS. 1 to 5D, two templates 20B and 20C (FIG. 4B) are used for pattern matching when detecting the corner 51 (steps S02 and S05). In this embodiment, in contrast, a single template 20 (FIG. 11B) is used for pattern matching when detecting the corner 51.
 Also in this embodiment, the lengths of the two sides within the angle of view are equal to or greater than the specified value in the image data used as the basis for detecting the position of the corner 51 (step S03), so the detection accuracy of the position of the reference point 28 (FIG. 4B) of the corner 51 can be increased.
 Next, a position detection device according to still another embodiment will be described with reference to FIGS. 12A and 12B. Description of the configuration common to the embodiments shown in FIGS. 1 to 5D and FIGS. 8 to 9B is omitted below.
 FIGS. 12A and 12B are schematic perspective views of the position detection device according to this embodiment. In the embodiment shown in FIG. 1, two imaging devices 11A and 11B are arranged so as to correspond to two corners 51 of the substrate 50, whereas in this embodiment only one imaging device 11A is arranged. In the embodiment shown in FIG. 8, the first corner is drawn into the angle of view 15A of the imaging device 11A in step S12, and the second corner is drawn into the angle of view 15B of the imaging device 11B in step S13.
 In this embodiment, in step S12, the first corner 51 is drawn into the angle of view 15A of the imaging device 11A, as in the embodiment of FIG. 8 (FIG. 12A). In step S13, the substrate 50 is translated so that the second corner 51 is drawn into the angle of view 15A of the imaging device 11A (FIG. 12B).
 Thereafter, in step S16, the first corner 51 is drawn into the angle of view 15A of the imaging device 11A, and the process of detecting the position of the first corner 51 is executed. In step S17, the second corner 51 is drawn into the angle of view 15A of the imaging device 11A, and the process of detecting the position of the second corner 51 is executed.
 As shown in FIGS. 12A and 12B, the positions of two corners of the substrate 50 can be detected using only one imaging device 11A. In this embodiment, the direction in which the substrate 50 moves from the state of FIG. 12A to the state of FIG. 12B may be defined as the reference direction (x direction) shown in FIGS. 9A and 9B.
 Next, a film forming apparatus equipped with the position detection device according to any of the above embodiments will be described with reference to FIG. 13.
 FIG. 13 is a schematic front view of the film forming apparatus according to this embodiment. The support unit 10 is supported on a base 30 via a moving mechanism 31. The support unit 10 corresponds to the support unit 10 of the embodiment shown in FIG. 1. A substrate 50 is supported on the upper surface (support surface) of the support unit 10. The moving mechanism 31 can move the support unit 10 in two dimensions parallel to the support surface and rotate it in an in-plane direction parallel to the support surface. The support surface of the support unit 10 is normally kept horizontal.
 A plurality of inkjet heads 33 and a plurality of imaging devices 11A and 11B are arranged above the substrate 50 supported by the support unit 10. The inkjet heads 33 and the imaging devices 11A and 11B are supported on the base 30 by a gate-shaped frame 32. The imaging devices 11A and 11B can be raised and lowered by lifting mechanisms 17A and 17B, respectively. Each of the inkjet heads 33 is provided with a plurality of nozzle holes, from which droplets of film material are ejected toward the substrate 50.
 The imaging devices 11A and 11B image a portion of the substrate 50 supported by the support unit 10 and transmit the image data of the acquired two-dimensional images to the processing unit 12. The imaging devices 11A and 11B and the processing unit 12 correspond to the imaging devices 11A and 11B and the processing unit 12 of the embodiment shown in FIG. 1, respectively. Furthermore, the processing unit 12 can adjust the focus positions of the imaging devices 11A and 11B by controlling the lifting mechanisms 17A and 17B to raise and lower the imaging devices 11A and 11B.
 The processing unit 12 controls the moving mechanism 31 and the inkjet heads 33 based on image data defining the shape of the film to be formed. A film is formed by curing the deposited liquid film material, so that a film of a desired shape can be formed on the substrate 50. A photocurable resin, a thermosetting resin, or the like can be used as the film material. A light source or heat source for curing the film material deposited on the substrate 50 is arranged beside the inkjet heads 33.
 Various commands and data are input from the input unit 16 to the processing unit 12. For example, a keyboard, a pointing device, a USB port, or a communication device is used as the input unit 16. Various information on the operation of the film forming apparatus is output to the display unit 14, for which a liquid crystal display or the like is used. A communication device that transmits image data to be displayed on an external display device may also be used as the display unit 14.
 In this embodiment, the position of the substrate 50 is detected by the method according to any of the embodiments shown in FIGS. 1 to 12B, so the position can be detected with high accuracy.
 Each embodiment is an example, and it goes without saying that partial replacement or combination of configurations shown in different embodiments is possible. Similar effects obtained by similar configurations of a plurality of embodiments are not mentioned again for each embodiment. Furthermore, the present invention is not limited to the above embodiments. It will be apparent to those skilled in the art that various modifications, improvements, combinations, and the like are possible.
DESCRIPTION OF SYMBOLS
10 support unit
11A, 11B imaging device
12 processing unit
13 storage unit
14 display unit
15A, 15B angle of view
16 input unit
17A, 17B lifting mechanism
20, 20A, 20B, 20C template
22 image data
28 reference point of corner
29 information indicating coordinates of reference point
30 base
31 moving mechanism
32 gate-shaped frame
33 inkjet head
50 substrate
51 corner

Claims (9)

  1.  A position detection device comprising:
     a support unit that supports a substrate having a corner;
     an imaging device that images a portion of the substrate within an angle of view; and
     a processing unit,
     wherein the processing unit
     moves the substrate or the imaging device so that the lengths of two sides sandwiching the corner of the substrate contained within the angle of view of the imaging device become equal to or longer than a specified value, and
     detects the position of the corner of the substrate based on image data acquired by imaging the angle of view of the imaging device after the movement.
  2.  A position detection device comprising:
     a support unit that supports a substrate having a corner;
     an imaging device that images a portion of the substrate;
     a processing unit; and
     a display unit that displays images,
     wherein the processing unit
     displays an image of the corner of the substrate within the angle of view of the imaging device on the display unit,
     moves the substrate or the imaging device so that the lengths of two sides sandwiching the corner of the substrate contained within the angle of view of the imaging device become equal to or longer than a specified value, and displays the image of the corner within the angle of view of the imaging device after the movement on the display unit, and
     obtains the coordinates of the position of the corner of the image of the substrate within the angle of view of the imaging device and displays information specifying the obtained coordinates on the display unit.
  3.  A position detection method comprising:
     a step of imaging a first corner of a substrate having corners to acquire first image data;
     a step of comparing, with a specified value, the lengths of the portions of the two sides sandwiching the first corner that are contained in the first image data, and, when those lengths are not equal to or greater than the specified value, moving the imaged region so that they become equal to or greater than the specified value and imaging the substrate to acquire second image data; and
     a step of obtaining the position of the first corner of the substrate based on the second image data.
  4.  The position detection method according to claim 3, wherein pattern matching between the first image data and a template is performed to determine whether the lengths of the portions of the two sides sandwiching the first corner that are contained in the first image data are equal to or greater than the specified value.
  5.  The position detection method according to claim 4, wherein the template is created based on a standard for the outer shape of the substrate.
  6.  The position detection method according to claim 4, wherein the template is created based on an image of the first corner of the substrate captured from the opposite surface.
  7.  The position detection method according to any one of claims 3 to 6, further comprising a step of imaging an estimated position of the first corner of the substrate to acquire initial image data before acquiring the first image data,
     wherein, when the initial image data does not contain a part of each of the two sides sandwiching the first corner, the first corner is placed within the angle of view and the first image data is then acquired.
  8.  The position detection method according to any one of claims 3 to 7, further comprising:
     a step of, after detecting the position of the first corner, imaging a second corner adjacent to the first corner to acquire third image data, and obtaining the in-plane rotational posture of the substrate based on the first image data and the third image data; and
     a step of rotating the substrate in an in-plane direction according to the in-plane rotational posture of the substrate to perform alignment in the rotation direction,
     wherein the position of the first corner is detected again after the alignment in the in-plane rotation direction.
  9.  The position detection method according to any one of claims 3 to 8, wherein, in the step of acquiring the second image data, when the length of the portion of the two sides sandwiching the first corner that is included in the first image data is not equal to the specified value, the imaged region is moved so that the length becomes equal to the specified value, the substrate is imaged, and the second image data is thereby acquired.
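Claims 4 and 9 above hinge on measuring how much of each of the two sides sandwiching the first corner is visible in the captured image. The following is a minimal illustrative sketch only, not the patented implementation: it assumes a binary edge map in which the substrate has already been rotationally aligned so the two sides run along image rows and columns, and `edge_img` and the corner coordinate are hypothetical inputs.

```python
import numpy as np

def visible_side_lengths(edge_img, corner_rc):
    """Lengths (in pixels) of the two substrate sides visible in the
    field of view, measured as consecutive edge-pixel runs starting
    at the detected corner. Assumes the sides extend rightward and
    downward from the corner along image axes."""
    r0, c0 = corner_rc
    # Horizontal side: run of True pixels along the corner's row.
    row = edge_img[r0, c0:]
    horiz = row.size if row.all() else int(np.argmin(row))
    # Vertical side: run of True pixels down the corner's column.
    col = edge_img[r0:, c0]
    vert = col.size if col.all() else int(np.argmin(col))
    return horiz, vert

def sides_long_enough(edge_img, corner_rc, specified_len):
    """Claim-4-style check: both visible side portions must be at
    least the specified value for the corner position to be trusted."""
    h, v = visible_side_lengths(edge_img, corner_rc)
    return h >= specified_len and v >= specified_len
```

If the check fails, claim 9's remedy is to shift the imaged region until the visible portions equal the specified value and capture again; the helper above would simply be re-run on the new image.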
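Claim 8 determines the substrate's in-plane rotational attitude from two adjacent corners and then rotates the substrate to correct it. A hedged sketch of that geometry, under the assumption of a y-up coordinate convention with the substrate edge nominally parallel to the x-axis (the corner coordinates below are hypothetical):

```python
import math

def in_plane_rotation(corner1, corner2):
    """Angle (radians) of the edge joining two adjacent corners,
    measured from the x-axis."""
    dx = corner2[0] - corner1[0]
    dy = corner2[1] - corner1[1]
    return math.atan2(dy, dx)

def rotation_correction(corner1, corner2, nominal_angle=0.0):
    """Rotation to apply to the substrate so the measured edge
    direction matches its nominal direction."""
    return nominal_angle - in_plane_rotation(corner1, corner2)

# Example: the second corner is detected slightly above the first,
# so the substrate must be rotated back by atan2 of the offset.
theta = rotation_correction((0.0, 0.0), (100.0, 2.0))
```

After applying the correction, the claim detects the first corner's position again, since rotating the substrate moves the corner within the field of view.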
PCT/JP2018/010826 2017-03-22 2018-03-19 Position detection device and position detection method WO2018174011A1 (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
JP2019507664A JP6862068B2 (en) 2017-03-22 2018-03-19 Position detection device and position detection method
CN201880005782.4A CN110446905A (en) 2017-03-22 2018-03-19 Position detecting device and method for detecting position
KR1020197019474A KR20190131475A (en) 2017-03-22 2018-03-19 Position detection device and position detection method

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2017-055685 2017-03-22
JP2017055685 2017-03-22

Publications (1)

Publication Number Publication Date
WO2018174011A1 true WO2018174011A1 (en) 2018-09-27

Family

ID=63585488

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2018/010826 WO2018174011A1 (en) 2017-03-22 2018-03-19 Position detection device and position detection method

Country Status (5)

Country Link
JP (1) JP6862068B2 (en)
KR (1) KR20190131475A (en)
CN (1) CN110446905A (en)
TW (1) TWI654500B (en)
WO (1) WO2018174011A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2019132709A (en) * 2018-01-31 2019-08-08 株式会社豊田自動織機 Measuring device and measurement method

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112629444B (en) * 2021-03-08 2021-06-22 南京航空航天大学 Automatic correction method for radiation library cover plate dropping errors based on machine vision
KR102616399B1 (en) 2021-08-20 2023-12-20 지영배 A Nano bubble dishwasher
KR102616395B1 (en) 2021-08-20 2023-12-20 지영배 A Nano bubble device for dishwasher
KR102632696B1 (en) 2021-09-16 2024-02-01 지영배 A Nano bubble device
KR102632695B1 (en) 2021-09-16 2024-02-01 지영배 A Nano bubble dishwasher

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH04238206A (en) * 1991-01-22 1992-08-26 Matsushita Electric Ind Co Ltd Multi-visual field recognizing method for pattern
JPH05149716A (en) * 1991-11-29 1993-06-15 Toshiba Corp Corner detecting method
JP2000321024A (en) * 1999-05-11 2000-11-24 Matsushita Electric Ind Co Ltd Position detecting method utilizing image recognition
JP2002288634A (en) * 2001-03-28 2002-10-04 Juki Corp Part position detecting method and device
JP2007256053A (en) * 2006-03-23 2007-10-04 Nikon Corp Position measuring method and device manufacturing method

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TW200736813A (en) * 2005-12-16 2007-10-01 Asahi Kasei Denshi Kk Position detector
CN101561248A (en) * 2008-04-17 2009-10-21 鸿富锦精密工业(深圳)有限公司 Position measurement device and measuring method
JP5907110B2 (en) 2013-04-12 2016-04-20 信越化学工業株式会社 Screen printing method and screen printing apparatus


Also Published As

Publication number Publication date
CN110446905A (en) 2019-11-12
TWI654500B (en) 2019-03-21
KR20190131475A (en) 2019-11-26
TW201835692A (en) 2018-10-01
JPWO2018174011A1 (en) 2020-01-23
JP6862068B2 (en) 2021-04-21

Similar Documents

Publication Publication Date Title
WO2018174011A1 (en) Position detection device and position detection method
US11544874B2 (en) System and method for calibration of machine vision cameras along at least three discrete planes
JP4446609B2 (en) Image processing method and apparatus
JP2011180084A (en) Picked-up image processor of component mounting machine
JP5052302B2 (en) Component mounting method and apparatus
JP5545737B2 (en) Component mounter and image processing method
WO2015045649A1 (en) Component mounting device
JP6177255B2 (en) Work machine and position shift data acquisition method
JP6941306B2 (en) Imaging device, bump inspection device, and imaging method
JP2008171873A (en) Positioning apparatus, positioning method, processing apparatus and processing method
KR102320369B1 (en) Position Detecting Apparatus and the Method thereof
JP2015115528A (en) Substrate processing device and substrate processing method
JP4061265B2 (en) Method and apparatus for measuring height of protrusion
JP2011040474A (en) Method for specifying glass substrate position and apparatus used therefor
JP2008135423A (en) Contour detector, positioning apparatus, pattern-drawing device, and contour detection method
CN115808356B (en) Chip testing method and chip testing equipment
JP3858633B2 (en) Bonding state inspection method
JP4829701B2 (en) Camera scaling acquisition method for component mounters
JP2009170586A (en) Method and apparatus for recognizing electronic component
WO2023095635A1 (en) Distortion aberration rate calculation method, and position detecting device
TWI834949B (en) Laser processing system that can quickly position the robot arm to the three-dimensional coordinate system
JP7287877B2 (en) 3D part data creation method and device
US10861199B2 (en) System for creating component shape data for image processing, and method for creating component shape data for image processing
JP5113657B2 (en) Surface mounting method and apparatus
JP2008229404A (en) Head position-correcting method

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18771210

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2019507664

Country of ref document: JP

Kind code of ref document: A

ENP Entry into the national phase

Ref document number: 20197019474

Country of ref document: KR

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 18771210

Country of ref document: EP

Kind code of ref document: A1