WO2018174011A1 - Position detection device and position detection method - Google Patents

Position detection device and position detection method

Info

Publication number
WO2018174011A1
Authority
WO
WIPO (PCT)
Prior art keywords
substrate
corner
image data
view
angle
Prior art date
Application number
PCT/JP2018/010826
Other languages
English (en)
Japanese (ja)
Inventor
Yuji Okamoto (岡本 裕司)
Original Assignee
Sumitomo Heavy Industries, Ltd.
Priority date
Filing date
Publication date
Application filed by Sumitomo Heavy Industries, Ltd.
Priority to JP2019507664A (patent JP6862068B2)
Priority to KR1020197019474A (patent KR20190131475A)
Priority to CN201880005782.4A (patent CN110446905A)
Publication of WO2018174011A1

Classifications

    • H ELECTRICITY
    • H01 ELECTRIC ELEMENTS
    • H01L SEMICONDUCTOR DEVICES NOT COVERED BY CLASS H10
    • H01L 22/00 Testing or measuring during manufacture or treatment; Reliability measurements, i.e. testing of parts without further processing to modify the parts as such; Structural arrangements therefor
    • H01L 22/30 Structural arrangements specially adapted for testing or measuring during manufacture or treatment, or specially adapted for reliability measurements
    • H ELECTRICITY
    • H01 ELECTRIC ELEMENTS
    • H01L SEMICONDUCTOR DEVICES NOT COVERED BY CLASS H10
    • H01L 21/00 Processes or apparatus adapted for the manufacture or treatment of semiconductor or solid state devices or of parts thereof
    • H01L 21/67 Apparatus specially adapted for handling semiconductor or electric solid state devices during manufacture or treatment thereof; Apparatus specially adapted for handling wafers during manufacture or treatment of semiconductor or electric solid state devices or components; Apparatus not specifically provided for elsewhere
    • H01L 21/67005 Apparatus not specifically provided for elsewhere
    • H01L 21/67242 Apparatus for monitoring, sorting or marking
    • H01L 21/67259 Position monitoring, e.g. misposition detection or presence detection
    • H ELECTRICITY
    • H01 ELECTRIC ELEMENTS
    • H01L SEMICONDUCTOR DEVICES NOT COVERED BY CLASS H10
    • H01L 21/00 Processes or apparatus adapted for the manufacture or treatment of semiconductor or solid state devices or of parts thereof
    • H01L 21/67 Apparatus specially adapted for handling semiconductor or electric solid state devices during manufacture or treatment thereof; Apparatus specially adapted for handling wafers during manufacture or treatment of semiconductor or electric solid state devices or components; Apparatus not specifically provided for elsewhere
    • H01L 21/68 Apparatus specially adapted for handling semiconductor or electric solid state devices during manufacture or treatment thereof, for positioning, orientation or alignment
    • H01L 21/682 Mask-wafer alignment
    • H ELECTRICITY
    • H01 ELECTRIC ELEMENTS
    • H01L SEMICONDUCTOR DEVICES NOT COVERED BY CLASS H10
    • H01L 22/00 Testing or measuring during manufacture or treatment; Reliability measurements, i.e. testing of parts without further processing to modify the parts as such; Structural arrangements therefor
    • H01L 22/20 Sequence of activities consisting of a plurality of measurements, corrections, marking or sorting steps
    • H01L 22/26 Acting in response to an ongoing measurement without interruption of processing, e.g. endpoint detection, in-situ thickness measurement

Definitions

  • the present invention relates to a position detection device and a position detection method.
  • Patent Document 1 discloses a screen printing technique for positioning a substrate based on image data obtained by imaging the vicinity of a corner of the substrate.
  • An object of the present invention is to provide a position detection device capable of performing more stable position detection based on image data of corners of a substrate.
  • According to one aspect of the present invention, a position detection device is provided that includes a support part for supporting a substrate having a corner, an imaging device for imaging a portion of the substrate within its angle of view, and a processing unit.
  • The processing unit moves the substrate or the imaging device so that the length of each of the two sides flanking the corner of the substrate included in the angle of view of the imaging device becomes equal to or longer than a specified value, and detects the position of the corner of the substrate based on image data acquired by imaging the angle of view of the imaging device after the movement.
  • According to another aspect, a position detection device is provided that includes a support part for supporting a substrate having a corner, an imaging device for imaging a portion of the substrate, a processing unit, and a display unit for displaying an image.
  • The processing unit displays an image of the corner of the substrate within the angle of view of the imaging device on the display unit, moves the substrate or the imaging device so that the length of each of the two sides flanking the corner of the substrate included in the angle of view becomes equal to or longer than a specified value, and displays the image of the corner within the angle of view after the movement on the display unit.
  • The processing unit further obtains the coordinates of the position of the corner of the image of the substrate within the angle of view of the imaging device and displays information specifying the obtained coordinates on the display unit.
  • According to still another aspect, a position detection method is provided that includes a step of imaging a substrate having a first corner to obtain first image data, a step of moving the region to be imaged so that the length of each of the two sides flanking the first corner becomes equal to or greater than a specified value and imaging the substrate to obtain second image data, and a step of determining the position of the first corner of the substrate based on the second image data.
  • Stable position detection can be performed by detecting the position of the corner in a state where the length of each of the two sides flanking the corner of the substrate included in the angle of view of the imaging device is equal to or greater than the specified value.
  • FIG. 1 is a schematic perspective view of a position detection apparatus according to an embodiment.
  • FIG. 2 is a flowchart of the position detection process executed by the processing unit of the position detection apparatus according to the embodiment.
  • FIG. 3A is a plan view of a substrate that is the object of position detection.
  • FIG. 3B is a diagram showing, as images, templates used for pattern matching executed during the position detection process.
  • FIG. 3C is a diagram showing, as an image, image data handled by the processing unit during the position detection process.
  • FIGS. 4A and 4B are diagrams showing, as images, image data handled by the processing unit during the position detection process.
  • FIG. 5A is a diagram showing, as an image, the image data of the corner when the shape and size of the cut-off portion of the corner of the substrate conform to the standard.
  • FIG. 5B is a diagram showing, as an image, the image data of the corner when the size of the cut-off portion of the corner of the substrate is smaller than the standard.
  • FIG. 5C is a diagram showing, as an image, the image data of the corner when the shape of the cut-off portion of the corner of the substrate deviates from the standard.
  • FIG. 5D is a diagram showing, as an image, the image data of the corner when the posture of the substrate in the in-plane rotation direction is slightly inclined from the target posture.
  • FIG. 6 is a flowchart of the position detection process executed by the processing unit of the position detection apparatus according to another embodiment.
  • FIG. 7A is a plan view of a substrate whose position is to be detected in still another embodiment
  • FIG. 7B is a diagram showing, as images, templates used in the pattern matching executed in steps S03 and S07.
  • FIG. 8 is a flowchart of a position detection process executed by a position detection apparatus according to another embodiment.
  • FIG. 9A is a plan view showing the positional relationship between the angle of view and the substrate when step S11 of FIG. 8 is executed.
  • FIG. 9B is a plan view showing the positional relationship between the angle of view and the substrate after step S13 is executed.
  • FIG. 10 is a flowchart of position detection processing executed by a position detection apparatus according to another embodiment.
  • FIG. 11A is a plan view of a substrate whose position is to be detected using a position detection apparatus according to another embodiment
  • FIG. 11B is a diagram showing a template used for pattern matching at the time of position detection as an image.
  • FIGS. 12A and 12B are schematic perspective views of a position detection device according to still another embodiment.
  • FIG. 13 is a schematic front view of a film forming apparatus according to still another embodiment.
  • FIG. 1 is a schematic perspective view of a position detection apparatus according to an embodiment.
  • the position detection apparatus according to the present embodiment includes a support unit 10, imaging devices 11A and 11B, a processing unit 12, a storage unit 13, and a display unit 14.
  • the support unit 10 supports the substrate 50 to be processed on its upper surface (support surface).
  • a movable stage that can move the substrate 50 in a two-dimensional direction in the plane and rotate it in the plane is used.
  • the substrate 50 has a planar shape having a plurality of corners such as a square and a rectangle.
  • the imaging device 11A captures an area in the angle of view (field of view) 15A on the support surface of the support unit 10 and acquires image data.
  • the imaging device 11B captures an area in the angle of view (field of view) 15B on the support surface of the support unit 10 and acquires image data.
  • the processing unit 12 executes the processing program stored in the storage unit 13, thereby realizing a position detection function.
  • The storage unit 13 stores various data referred to when the position detection process is executed, image data acquired by the imaging devices 11A and 11B, coordinate data that is a processing result of the processing unit 12, and the like.
  • the processing unit 12 outputs the processing result to the display unit 14.
  • As the processing unit 12, for example, a central processing unit (CPU) is used.
  • As the storage unit 13, for example, a RAM, a ROM, an external storage device, or the like is used.
  • As the display unit 14, for example, a liquid crystal display, an organic EL display, or the like is used.
  • FIG. 2 is a flowchart of the position detection process executed by the processing unit 12 (FIG. 1).
  • In the position detection process described below, the position of one corner 51 of the substrate 50 is detected.
  • the positions of the other corners 51 can also be detected by the same process.
  • FIG. 3A is a plan view of a substrate 50 that is a target for position detection.
  • the substrate 50 has a substantially rectangular or square planar shape, and four corners 51 are cut off obliquely.
  • the planar shape of the part cut off obliquely is a right isosceles triangle having a base angle of 45 degrees.
  • FIG. 3B is a diagram showing, as images, templates 20A, 20B, and 20C used for pattern matching executed during the position detection process.
  • The template 20A corresponds to an image of the oblique side of the corner 51.
  • The template 20B corresponds to an image of the intersection between the oblique side of the corner 51 and the side extending in the horizontal direction.
  • The template 20C corresponds to an image of the intersection between the oblique side of the corner 51 and the side extending in the vertical direction.
  • the templates 20A, 20B, and 20C are created based on the external shape standard of the substrate 50.
  • FIGS. 3C, 4A, and 4B are diagrams showing, as images, image data handled by the processing unit 12 during the position detection process. In FIGS. 3C, 4A, and 4B, the area within the angle of view 15A is shown.
  • First, the substrate 50 is supported on the support surface of the support part 10 (step S01). This process is performed, for example, by the processing unit 12 controlling a transfer device such as a robot arm.
  • At this time, the substrate 50 is roughly positioned so that two adjacent corners 51 (FIG. 1) of the substrate 50, or their vicinities, fall within the two angles of view 15A and 15B (FIG. 1), respectively.
  • the processing unit 12 draws the corner 51 of the substrate 50 into the angle of view 15A (step S02).
  • Here, the process of drawing the corner 51 of the substrate 50 into the angle of view 15A (step S02) will be described.
  • the estimated position of the corner 51 is imaged to obtain image data 22 (FIG. 3C).
  • the processing unit 12 displays an image corresponding to the acquired image data 22 on the display unit 14.
  • the processing unit 12 determines whether or not the image data 22 includes part of two sides sandwiching the corner portion 51 of the substrate 50.
  • Hereinafter, the state in which the image data includes part of the two sides flanking the corner 51 is simply referred to as a state in which "the corner 51 is included".
  • When the corner 51 is not included in the image data 22, the processing unit 12 moves the support part 10 so that the corner 51 falls within the angle of view 15A.
  • the processing unit 12 displays an image corresponding to the image data acquired during the movement of the substrate 50 on the display unit 14.
  • the processing unit 12 performs pattern matching between the image data 22 and the template 20B (FIG. 3B).
  • pattern matching between the image data 22 and the template 20C (FIG. 3B) is performed.
  • In some cases, both a portion matching the template 20B and a portion matching the template 20C are included in the image data 22.
  • In other cases, a portion matching the template 20B is not included in the image data 22, but a portion matching the template 20C is included.
  • In either case, the corner 51 is within the angle of view 15A.
  • When the corner 51 is not within the angle of view 15A, the processing unit 12 performs pattern matching between the image data 22 and the template 20A.
  • When a matching portion is found, the processing unit 12 estimates, from the position of the matching portion, the direction and distance in which the substrate 50 should be moved. For example, when a portion matching the template 20A is found in the upper right region of the angle of view 15A, the substrate 50 may be moved obliquely toward the lower left.
  • During pattern matching, the processing unit 12 displays, on the display unit 14, information indicating the portions of the displayed image that match the templates 20A, 20B, and 20C. For example, a frame surrounding each matched portion is displayed over the image.
  • After the substrate 50 is moved, the image data 22 is acquired again.
  • When the corner 51 is included in the image data 22, the processing unit 12 completes the process of drawing the corner 51 into the angle of view 15A.
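The pull-in of step S02 (match the corner templates, then move the stage until the corner enters the angle of view) can be sketched in Python. This sketch is not part of the disclosure: the brute-force sum-of-squared-differences matcher and the convention that the stage move equals the offset of the match from the view center are illustrative assumptions.

```python
import numpy as np

def match_template(image, template):
    """Exhaustive template match; returns ((row, col), score) of the best
    match, where score is the sum of squared differences (lower is better)."""
    ih, iw = image.shape
    th, tw = template.shape
    best, best_pos = np.inf, (0, 0)
    for r in range(ih - th + 1):
        for c in range(iw - tw + 1):
            ssd = np.sum((image[r:r + th, c:c + tw] - template) ** 2)
            if ssd < best:
                best, best_pos = ssd, (r, c)
    return best_pos, best

def pull_in_move(match_pos, view_shape):
    """Estimate the (dy, dx) stage move that brings the matched corner
    toward the center of the angle of view (sign convention assumed)."""
    cy, cx = view_shape[0] / 2, view_shape[1] / 2
    return (match_pos[0] - cy, match_pos[1] - cx)
```

A production system would use a normalized correlation measure and the actual stage coordinate convention of the device rather than this toy matcher.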
  • Next, the processing unit 12 compares, with a specified value, the length of the portion of each of the two sides flanking the corner 51 that lies within the angle of view 15A (the length of the side within the angle of view) (step S03).
  • Here, the two sides flanking the corner 51 mean the sides extending from the two ends of the oblique side, formed by cutting off the corner 51, toward the adjacent corners.
  • This specified value is stored in the storage unit 13 in advance.
  • the length of the side in the angle of view 15A can be obtained from the position of the portion that matches the templates 20B and 20C, for example.
  • the processing unit 12 displays information indicating a portion matching the templates 20B and 20C on the display unit 14.
  • FIG. 4A is a diagram showing the image data 22 as an image when the length of the side in the angle of view 15A is shorter than a specified value.
  • the image data 22 in FIG. 4A completely includes the entire pattern corresponding to the template 20C, but includes only a part of the pattern corresponding to the template 20B. Even in such a case, it is possible to detect a portion matching the template 20B by lowering the matching determination threshold.
  • In this example, the length Lc of the side from the vertex of the portion matching the template 20C to the adjacent corner is equal to or greater than the specified value, but the length Lb of the side from the vertex of the portion matching the template 20B to the next corner is less than the specified value.
  • If it is determined in step S03 that the length of at least one side within the angle of view 15A is shorter than the specified value, the substrate 50 is moved so that the lengths of the sides within the angle of view 15A become equal to or longer than the specified value (step S04). For example, in the case shown in FIG. 4A, the substrate 50 should be moved to the left.
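The side-length check of steps S03 and S04 can be illustrated with a small sketch. The coordinate layout assumed here (horizontal side measured from the left view edge to the vertex matched by template 20B, vertical side from the vertex matched by template 20C down to the bottom view edge) is hypothetical and chosen only to make the example concrete; it is not taken from the patent.

```python
def side_lengths_in_view(vb, vc, view_h):
    """In-view lengths Lb and Lc of the two sides flanking the corner.
    vb, vc: (x, y) vertices of the portions matched to templates 20B and
    20C; the assumed layout puts the horizontal side left of vb and the
    vertical side below vc."""
    lb = vb[0]           # horizontal side: left view edge to vb
    lc = view_h - vc[1]  # vertical side: vc to bottom view edge
    return lb, lc

def correction_move(lb, lc, specified):
    """Stage move (dx, dy) that lengthens any in-view side shorter than
    the specified value (cf. FIG. 4A, where a move to the left
    lengthens the short horizontal side)."""
    dx = -(specified - lb) if lb < specified else 0.0
    dy = (specified - lc) if lc < specified else 0.0
    return dx, dy
```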
  • the processing unit 12 displays an image corresponding to the acquired image data on the display unit 14.
  • When it is determined in step S03 that the lengths of the two sides within the angle of view 15A are equal to or longer than the specified value, or after step S04, the processing unit 12 detects the position of the corner 51 based on the image data 22 acquired in a state where the angle of view 15A includes sides whose lengths are equal to or longer than the specified value (step S05).
  • Next, the process of detecting the position of the corner 51 (step S05) will be described with reference to FIG. 4B.
  • FIG. 4B is a diagram showing, as an image, the image data 22 acquired in a state where sides with lengths equal to or longer than the specified value are included in the angle of view 15A.
  • a side extending linearly in the horizontal direction and a side extending linearly in the vertical direction are detected in the image data 22.
  • pattern matching using a template including one straight side can be used.
  • The processing unit 12 adopts the position of the detected intersection (reference point) 28 of the extensions of the two sides as the position of the corner 51.
  • the processing unit 12 displays information indicating the detected extension lines of the two sides and the reference point 28 on the display unit 14. For example, the extension line of two sides is displayed as a solid line or a broken line, and the intersection of both is displayed as a plus sign. Further, information 29 indicating the coordinates of the reference point is displayed on the display unit 14 as a number.
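The reference-point computation of step S05 (detect the two straight sides, extend them, and take their intersection) can be written compactly with a least-squares line fit. This is a generic sketch of the stated geometry, not the patent's implementation.

```python
import numpy as np

def fit_line(points):
    """Least-squares fit of a line a*x + b*y = c through edge points,
    with (a, b) a unit normal found from the principal axis of the
    centred points."""
    pts = np.asarray(points, dtype=float)
    centroid = pts.mean(axis=0)
    _, _, vt = np.linalg.svd(pts - centroid)
    direction = vt[0]                      # dominant direction of the edge
    normal = np.array([-direction[1], direction[0]])
    return normal[0], normal[1], normal @ centroid

def intersect(l1, l2):
    """Reference point 28: intersection of the two extended sides."""
    a = np.array([[l1[0], l1[1]], [l2[0], l2[1]]])
    c = np.array([l1[2], l2[2]])
    return np.linalg.solve(a, c)
```

Because the two sides are extended analytically, the intersection can lie outside the pixels actually imaged, which matches the idea that the cut-off vertex itself need not be visible.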
  • By this method, the position of a substrate 50 on which no alignment mark is formed can be detected. Furthermore, the position of the substrate 50 can be detected even if the substrate 50 is turned over.
  • FIG. 5A is a diagram showing the image data of the corner 51 when the shape and dimensions of the cut-off portion of the corner 51 of the substrate 50 are as specified.
  • the position of the reference point 28 of the corner 51 can be detected by the method according to the embodiment.
  • FIG. 5B is a diagram showing the image data of the corner 51 when the size of the cut-off portion of the corner 51 of the substrate 50 is smaller than the standard as an image.
  • FIG. 5C is a diagram illustrating the image data of the corner 51 when the shape of the cut-off portion of the corner 51 of the substrate 50 deviates from the standard.
  • In FIG. 5C, the angle between the oblique side and each of the two sides flanking the corner 51 deviates from 45 degrees. Since the vertices at the two ends of the oblique side are detected separately by pattern matching using the different templates 20B and 20C, they are easy to detect by pattern matching even if the direction of the oblique side deviates from 45 degrees. Since the reference point 28 of the corner 51 is determined by extending the two sides flanking the corner 51, the position detection accuracy of the reference point 28 does not decrease even if the direction of the oblique side deviates from 45 degrees.
  • FIG. 5D is a diagram showing the image data of the corner 51 when the posture in the in-plane rotation direction of the substrate 50 is slightly tilted from the target posture as an image.
  • two sides sandwiching the corner 51 are inclined with respect to the vertical direction and the horizontal direction in the angle of view 15A (FIG. 1).
  • Even in this case, the vertices at the two ends of the oblique side can be detected by inclining the templates 20B and 20C and performing pattern matching. Since the reference point 28 of the corner 51 is determined by extending the two sides flanking the corner 51, the position detection accuracy of the reference point 28 does not decrease even if the two sides are inclined with respect to the vertical and horizontal directions.
  • the position detection result is not affected by the shape of the apex of the corner 51.
  • the position of the corner 51 can be detected with high accuracy.
  • the image data 22 (FIG. 4B) including a side having a length equal to or longer than the specified value in the angle of view is used as the basis for calculating the position of the corner 51.
  • the extension line of these two sides can be determined with high accuracy.
  • the calculation accuracy of the position of the reference point 28 (FIG. 4B) calculated based on this extension line can be increased.
  • While the position detection device operates, the display unit 14 displays the image of the substrate 50 captured by the imaging devices 11A and 11B (FIG. 1) and the template images used for pattern matching. The operator can therefore know the operation status of the position detection device from the information displayed on the display unit 14, and can confirm whether the device is operating normally.
  • Next, a position detection device according to another embodiment will be described with reference to FIG. 6. FIG. 6 is a flowchart of the position detection process executed by the processing unit 12 of the position detection device according to this embodiment. Steps S01 and S02 are the same as steps S01 and S02 in the embodiment shown in FIG. 2.
  • In step S03a, the lengths of the two sides within the angle of view 15A are compared with the specified value to determine whether they are equal to the specified value.
  • When the difference between the length of each of the two sides within the angle of view 15A and the specified value is within the range of the maximum error caused by the accuracy of length measurement based on the image data of the imaging devices 11A and 11B, the processing unit 12 determines that the lengths of the two sides within the angle of view 15A are equal to the specified value.
  • Otherwise, the processing unit 12 moves the substrate 50 so that the lengths of the two sides within the angle of view 15A become equal to the specified value (step S04a).
  • When it is determined in step S03a that the lengths of the two sides within the angle of view 15A are equal to the specified value, or after step S04a, the processing unit 12 detects the position of the corner 51 based on the image data 22 acquired in a state where the angle of view 15A includes two sides whose lengths are equal to the specified value.
  • the process of step S05 is the same as the process of step S05 of the embodiment shown in FIG.
  • In this embodiment, the variation in the position of the reference point 28 (FIG. 4B) of the corner 51 within the angle of view 15A is reduced. Since the influence of the aberration of the lens of the imaging device 11A is thereby equalized across position detections, variations in position detection accuracy can be reduced.
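The equality test of step S03a, which treats a measured in-view length as equal to the specified value when the difference lies within the maximum measurement error, amounts to a simple tolerance comparison. A minimal sketch (function and parameter names are illustrative):

```python
def lengths_equal_to_spec(lb, lc, specified, max_error):
    """Step S03a: the in-view lengths lb and lc count as equal to the
    specified value when each difference is within the maximum error
    of length measurement from the image data."""
    return (abs(lb - specified) <= max_error
            and abs(lc - specified) <= max_error)
```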
  • Next, a position detection device according to still another embodiment will be described with reference to FIGS. 7A and 7B.
  • the description of the configuration common to the embodiment shown in FIGS. 1 to 5D will be omitted.
  • FIG. 7A is a plan view of the substrate 50 that is a position detection target in this embodiment.
  • While the substrate 50 that is the position detection target in the embodiment shown in FIGS. 1 to 5C has a shape in which the corner 51 is cut off linearly, the substrate 50 that is the position detection target in the present embodiment has a shape in which the corner 51 is rounded.
  • FIG. 7B is a diagram illustrating templates 20A, 20B, and 20C used in the pattern matching executed in step S02 of FIG. 2 as images.
  • the templates 20A, 20B, and 20C in FIG. 7B correspond to the templates 20A, 20B, and 20C in FIG. 3B, respectively.
  • the patterns of the templates 20A, 20B, and 20C are also rounded according to the round shape of the corner 51 of the substrate 50.
  • Next, a position detection device according to another embodiment will be described with reference to FIGS. 8 to 9B.
  • the description of the configuration common to the embodiment shown in FIGS. 1 to 5D will be omitted.
  • FIG. 8 is a flowchart of the position detection process executed by the position detection apparatus according to this embodiment.
  • the process of supporting the substrate 50 on the support unit 10 (step S11) and the process of drawing the first corner part into the angle of view (step S12) are the same as the processes of steps S01 and S02 shown in FIG.
  • Next, the processing unit 12 draws the second corner, which is adjacent to the first corner across one side, into the angle of view (step S13). For example, when the first corner is drawn into the angle of view 15A (FIG. 1), the second corner is drawn into the angle of view 15B.
  • the same technique as the process of step S02 of FIG. 2 can be applied to the process of drawing the second corner into the angle of view 15B.
  • FIG. 9A is a plan view showing the positional relationship between the angles of view 15A and 15B and the substrate 50 when step S13 is executed.
  • a direction from the origin of the angle of view 15B to the origin of the angle of view 15A is defined as a reference direction (x direction). At this time, each side of the substrate 50 is inclined with respect to the reference direction.
  • From the state in which the first corner is drawn into the angle of view and the state in which the second corner is drawn into the angle of view, the processing unit 12 calculates the relative positional relationship between the two corners and detects the posture of the substrate 50 in the in-plane rotation direction (step S14).
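The posture detection of step S14 reduces to the angle of the line joining the two detected corners relative to the reference x direction. A minimal sketch, assuming both corner positions are expressed in a common stage coordinate system (an assumption, since the patent does not spell out the coordinate handling):

```python
import math

def in_plane_tilt(corner1, corner2):
    """Tilt (radians) of the substrate edge joining the two detected
    corners, measured from the reference x direction.
    corner1, corner2: (x, y); corner1 is assumed to lie in view 15A and
    corner2 in view 15B, so corner1 - corner2 points along +x when the
    substrate is aligned."""
    dx = corner1[0] - corner2[0]
    dy = corner1[1] - corner2[1]
    return math.atan2(dy, dx)
```

Rotating the support part 10 by the negative of this angle (step S15) would bring the pair of sides parallel to the reference direction, as shown in FIG. 9B.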
  • the processing unit 12 performs alignment with respect to the in-plane rotation direction of the substrate 50 by rotating the support unit 10 based on the detection result of the posture of the substrate 50 in the in-plane rotation direction (step S15).
  • FIG. 9B is a plan view showing the positional relationship between the angles of view 15A and 15B and the substrate 50 after step S15 is executed. A pair of sides of the substrate 50 is parallel to the reference direction.
  • After the alignment in the in-plane rotation direction of the substrate 50, the processing unit 12 detects the position of the first corner again (step S16).
  • The position of the first corner can be detected by the same procedure as steps S02 to S05 in FIG. 2. Similarly, the processing unit 12 detects the position of the second corner again (step S17).
  • In this way, alignment in the in-plane rotation direction of the substrate 50 can be performed, and the position of the substrate 50 in two in-plane directions can be detected.
  • Next, a position detection device according to another embodiment will be described with reference to FIG. 10. FIG. 10 is a flowchart of the position detection process executed by the position detection device according to this embodiment.
  • the substrate 50 is supported on the support surface of the support portion 10 (FIG. 1) with the first surface facing upward (step S21). This process is the same as the process of step S01 shown in FIG.
  • the processing unit 12 detects the position of the substrate 50 (step S22). This position detection process is the same as the process from steps S02 to S05 shown in FIG. 2 or steps S12 to S16 shown in FIG.
  • Next, the processing unit 12 creates a template from the image data of the corner 51 acquired in a state where the lengths of the two sides within the angle of view are equal to or greater than the specified value (step S23).
  • Here, the template creation process will be described. Pattern matching is performed between the image data 22 (FIG. 4B) used for detecting the position of the corner 51 in step S05 (FIG. 2) and the templates 20B and 20C (FIG. 3B), and the images of the portions corresponding to the templates 20B and 20C are cut out. The cut-out images are mirror-converted into templates and stored in the storage unit 13.
  • Hereinafter, a template created from an image cut out of the image data 22 is referred to as a dynamic template, in distinction from the templates 20A, 20B, and 20C (FIG. 3B) created based on the standard shape of the substrate 50.
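The dynamic-template creation of step S23 cuts the matched region out of the corner image and mirror-converts it so that it matches the same corner seen after the substrate is turned over. A minimal sketch with numpy; the choice of a horizontal flip as the mirror conversion is an assumption made for illustration:

```python
import numpy as np

def make_dynamic_template(image, top_left, size):
    """Cut the region matched to a standard template out of the corner
    image and mirror it horizontally, so it can be matched against the
    same corner imaged from the opposite surface (flip axis assumed)."""
    r, c = top_left
    h, w = size
    patch = image[r:r + h, c:c + w]
    return np.fliplr(patch)
```

Because the dynamic template is cut from the actual substrate, it reflects any deviation of the real corner shape from the standard, which is why matching with it in step S26 stays stable.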
  • the processing unit 12 processes the first surface of the substrate 50 (step S24). For example, ink is ejected from the inkjet head toward the first surface to form an insulating resin film having a desired planar shape.
  • Thereafter, the substrate 50 is turned over (step S25) so that the second surface, opposite to the first surface, faces upward.
  • The substrate is turned over, for example, by the processing unit 12 controlling a robot arm; it may also be turned over manually.
  • the processing unit 12 detects the position of the substrate 50 with the second surface facing upward (step S26).
  • In this position detection, the dynamic templates created in step S23 are used instead of the templates 20A, 20B, and 20C (FIG. 3B) when executing the processes of steps S02 to S05 shown in FIG. 2. More specifically, when searching for a specific portion in the image data of the substrate 50, a dynamic template cut out from the corresponding portion of the image data obtained by imaging the first surface is used.
  • After detecting the position of the substrate 50, the processing unit 12 processes the second surface of the substrate 50 (step S27).
  • In the present embodiment, the position of the substrate 50 is detected using the dynamic templates in step S26. For this reason, the position of the corner 51 can be detected stably even if the shape of the corner 51 of the actual substrate 50 deviates from the standard.
  • Next, a position detection device according to still another embodiment will be described with reference to FIGS. 11A and 11B.
  • the description of the configuration common to the embodiment shown in FIGS. 1 to 5D will be omitted.
  • FIG. 11A is a plan view of a substrate 50 that is a target whose position is detected by using the position detection device according to the present embodiment.
  • In this embodiment, the corner 51 of the substrate 50 is not cut off, and an approximately right-angled vertex appears.
  • FIG. 11B is a diagram showing the template 20 used for pattern matching at the time of position detection as an image.
  • In the embodiment shown in FIGS. 1 to 5D, the two templates 20B and 20C are used for pattern matching when detecting the corner 51 (steps S02 and S05).
  • In the present embodiment, a single template 20 is used for pattern matching when detecting the corner 51.
  • Also in this embodiment, since the lengths of the two sides within the angle of view are made equal to or greater than the specified value in the image data on which the detection of the position of the corner 51 is based (step S03), the detection accuracy of the position of the reference point 28 (FIG. 4B) of the corner 51 can be increased.
  • Next, a position detection apparatus according to still another embodiment will be described with reference to FIGS. 12A and 12B.
  • A description of the configuration common to the embodiments shown in FIGS. 1 to 5D and FIGS. 8 to 9B is omitted.
  • FIGS. 12A and 12B are schematic perspective views of the position detection apparatus according to this embodiment.
  • In the embodiment shown in FIGS. 8 to 9B, the two imaging devices 11A and 11B are arranged so as to correspond to the two corners 51 of the substrate 50.
  • In the present embodiment, only one imaging device 11A is arranged.
  • In the embodiment shown in FIGS. 8 to 9B, the first corner is drawn into the angle of view 15A of the imaging device 11A in step S12,
  • and the second corner is drawn into the angle of view 15B of the imaging device 11B in step S13.
  • In the present embodiment, in step S12, the first corner 51 is drawn into the angle of view 15A of the imaging device 11A, as in the embodiment of FIG. 8 (FIG. 12A).
  • In step S13, the substrate 50 is translated and the second corner 51 is drawn into the angle of view 15A of the imaging device 11A (FIG. 12B).
  • In step S16, the first corner 51 is drawn into the angle of view 15A of the imaging device 11A, and processing for detecting the position of the first corner 51 is executed.
  • In step S17, the second corner 51 is drawn into the angle of view 15A of the imaging device 11A, and processing for detecting the position of the second corner 51 is executed.
  • In this case, the moving direction of the substrate 50 from the state of FIG. 12A to the state of FIG. 12B may be defined as the reference direction (x direction) shown in FIGS. 9A and 9B.
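The single-camera variant can be summarized numerically: because the stage translation between the two shots is known, the second corner's measured position can be mapped back into the first shot's frame, and the substrate's in-plane rotation follows from the two corner positions. A minimal sketch (the function name and coordinate conventions are assumptions for illustration):

```python
import math

def substrate_angle(corner1_cam, corner2_cam, stage_shift):
    """Estimate the in-plane rotation of the substrate from two corner
    detections made with a single fixed camera.

    corner1_cam, corner2_cam: (x, y) measured positions in camera coordinates.
    stage_shift: (dx, dy) translation applied to the substrate between shots.
    Returns the angle (radians) of the corner-to-corner direction relative
    to the x axis (the reference direction)."""
    # Undo the stage translation so both corners are expressed in the
    # coordinate frame of the first detection.
    x2 = corner2_cam[0] - stage_shift[0]
    y2 = corner2_cam[1] - stage_shift[1]
    return math.atan2(y2 - corner1_cam[1], x2 - corner1_cam[0])

# Example: corners nominally 200 units apart along x, substrate slightly tilted;
# the stage moved the substrate by (-200, 0) between the two shots.
angle = substrate_angle((0.0, 0.0), (0.0, 2.0), (-200.0, 0.0))
```

Defining the stage's moving direction as the reference direction, as the text suggests, makes `stage_shift` purely along x and keeps the subtraction trivial.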
  • FIG. 13 is a schematic front view of the film forming apparatus according to the present embodiment.
  • The support unit 10 is supported on the base 30 via the moving mechanism 31.
  • The support unit 10 corresponds to the support unit 10 of the embodiment shown in FIG.
  • The substrate 50 is supported on the upper surface (support surface) of the support unit 10.
  • The moving mechanism 31 can translate the support unit 10 in two dimensions parallel to the support surface and rotate it within a plane parallel to the support surface. Usually, the support surface of the support unit 10 is kept horizontal.
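Combining the detected pose with the moving mechanism's two translational axes and one rotational axis amounts to a small rigid-transform calculation. The sketch below assumes the stage rotates about its coordinate origin; the function name and pose convention are illustrative, not the apparatus's actual control interface:

```python
import math

def stage_correction(detected, target):
    """Given a detected substrate pose and a target pose, each as
    (x, y, theta) with theta in radians, return the (dx, dy, dtheta)
    the stage must apply to bring the substrate to the target pose,
    assuming rotation about the stage-coordinate origin."""
    dtheta = target[2] - detected[2]
    # Rotating the stage moves the detected point too, so rotate it
    # first, then translate by whatever offset remains.
    c, s = math.cos(dtheta), math.sin(dtheta)
    xr = c * detected[0] - s * detected[1]
    yr = s * detected[0] + c * detected[1]
    return (target[0] - xr, target[1] - yr, dtheta)

# Example: a substrate detected at (10, 0) with no tilt, to be placed
# at (0, 10) rotated by 90 degrees: a pure rotation suffices.
correction = stage_correction((10.0, 0.0, 0.0), (0.0, 10.0, math.pi / 2))
```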
  • A plurality of inkjet heads 33 and a plurality of imaging devices 11A and 11B are arranged above the substrate 50 supported by the support unit 10.
  • The inkjet heads 33 and the imaging devices 11A and 11B are supported on the base 30 by a portal frame 32.
  • The imaging devices 11A and 11B can be moved up and down by the lifting mechanisms 17A and 17B, respectively.
  • Each of the inkjet heads 33 is provided with a plurality of nozzle holes, from which droplets of the film material are discharged toward the substrate 50.
  • The imaging devices 11A and 11B each capture an image of a part of the substrate 50 supported by the support unit 10 and transmit the acquired two-dimensional image data to the processing unit 12.
  • The imaging devices 11A and 11B and the processing unit 12 correspond to the imaging devices 11A and 11B and the processing unit 12 of the embodiment shown in FIG. Furthermore, the processing unit 12 can adjust the focus positions of the imaging devices 11A and 11B by controlling the lifting mechanisms 17A and 17B to raise and lower them.
  • The processing unit 12 controls the moving mechanism 31 and the inkjet heads 33 based on image data defining the shape of the film to be formed.
  • A film is formed by curing the liquid film material adhering to the substrate 50; in this way, a film having a desired shape can be formed on the substrate 50.
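One way to picture how image data defining the film shape could drive the discharge, purely as an illustrative sketch (the function, the one-droplet-per-pixel mapping, and the `pitch` parameter are assumptions, not the apparatus's actual control scheme):

```python
import numpy as np

def droplet_positions(mask: np.ndarray, pitch: float):
    """Convert a binary image defining the film shape into a list of
    (x, y) droplet landing positions on the substrate: one droplet per
    'on' pixel, with `pitch` being the substrate distance per pixel."""
    rows, cols = np.nonzero(mask)
    return [(c * pitch, r * pitch) for r, c in zip(rows, cols)]

# A 3x4 mask describing an L-shaped film region, at 0.5 units per pixel.
mask = np.array([[1, 0, 0, 0],
                 [1, 0, 0, 0],
                 [1, 1, 1, 1]])
pts = droplet_positions(mask, pitch=0.5)
```

The accuracy of such a mapping depends directly on how precisely the substrate pose is known, which is why the position detection described above matters here.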
  • A photocurable resin, a thermosetting resin, or the like can be used as the film material.
  • A light source or a heat source for curing the film material attached to the substrate 50 is disposed beside the inkjet heads 33.
  • Various commands and data are input to the processing unit 12 from the input unit 16.
  • A keyboard, a pointing device, a USB port, a communication device, or the like is used as the input unit 16.
  • Various information regarding the operation of the film forming apparatus is output to the display unit 14.
  • A liquid crystal display or the like is used as the display unit 14.
  • A communication device that transmits image data to be displayed on an external display device may also be used as the display unit 14.
  • The position of the substrate 50 is detected by the method according to any one of the embodiments shown in FIGS. In this way, the position can be detected with high accuracy.

Abstract

In the present invention, a substrate having a corner is supported by a support unit. An imaging device captures an image of a part of the substrate within an angle of view. A processing unit moves the substrate or the imaging device so that the lengths of the two sides on either side of a corner of the substrate included in the angle of view of the imaging device are equal to or greater than a specified value. The position of the corner of the substrate is detected on the basis of image data acquired by the imaging device capturing an image within the angle of view after the movement. With this configuration, more accurate position detection can be performed more stably on the basis of image data of the corner of the substrate.
PCT/JP2018/010826 2017-03-22 2018-03-19 Position detection device and position detection method WO2018174011A1 (fr)

Priority Applications (3)

Application Number Priority Date Filing Date Title
JP2019507664A JP6862068B2 (ja) 2017-03-22 2018-03-19 Position detection device and position detection method
KR1020197019474A KR20190131475A (ko) 2017-03-22 2018-03-19 Position detection device and position detection method
CN201880005782.4A CN110446905A (zh) 2017-03-22 2018-03-19 Position detection device and position detection method

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2017-055685 2017-03-22
JP2017055685 2017-03-22

Publications (1)

Publication Number Publication Date
WO2018174011A1 true WO2018174011A1 (fr) 2018-09-27

Family

ID=63585488

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2018/010826 WO2018174011A1 (fr) 2017-03-22 2018-03-19 Position detection device and position detection method

Country Status (5)

Country Link
JP (1) JP6862068B2 (fr)
KR (1) KR20190131475A (fr)
CN (1) CN110446905A (fr)
TW (1) TWI654500B (fr)
WO (1) WO2018174011A1 (fr)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2019132709A (ja) * 2018-01-31 2019-08-08 株式会社豊田自動織機 Measuring device and measuring method

Families Citing this family (5)

Publication number Priority date Publication date Assignee Title
CN112629444B (zh) * 2021-03-08 2021-06-22 南京航空航天大学 Machine-vision-based automatic correction method for placement errors of radiation vault cover plates
KR102616395B1 (ko) 2021-08-20 2023-12-20 지영배 Nano-bubble device for dishwashers
KR102616399B1 (ko) 2021-08-20 2023-12-20 지영배 Nano-bubble dishwasher
KR102632696B1 (ko) 2021-09-16 2024-02-01 지영배 Nano-bubble device
KR102632695B1 (ko) 2021-09-16 2024-02-01 지영배 Dishwasher with nano-bubbles

Citations (5)

Publication number Priority date Publication date Assignee Title
JPH04238206A (ja) * 1991-01-22 1992-08-26 Matsushita Electric Ind Co Ltd Multi-visual-field recognition method for polygonal parts
JPH05149716A (ja) * 1991-11-29 1993-06-15 Toshiba Corp Corner detection method
JP2000321024A (ja) * 1999-05-11 2000-11-24 Matsushita Electric Ind Co Ltd Position detection method using image recognition
JP2002288634A (ja) * 2001-03-28 2002-10-04 Juki Corp Component position detection method and device
JP2007256053A (ja) * 2006-03-23 2007-10-04 Nikon Corp Position measurement method and device manufacturing method

Family Cites Families (3)

Publication number Priority date Publication date Assignee Title
WO2007069680A1 (fr) * 2005-12-16 2007-06-21 Asahi Kasei Emd Corporation Position detector
CN101561248A (zh) * 2008-04-17 2009-10-21 鸿富锦精密工业(深圳)有限公司 Position measuring device and measuring method
JP5907110B2 (ja) 2013-04-12 2016-04-20 信越化学工業株式会社 Screen printing method and screen printing apparatus

Also Published As

Publication number Publication date
JP6862068B2 (ja) 2021-04-21
CN110446905A (zh) 2019-11-12
KR20190131475A (ko) 2019-11-26
TW201835692A (zh) 2018-10-01
JPWO2018174011A1 (ja) 2020-01-23
TWI654500B (zh) 2019-03-21

Similar Documents

Publication Publication Date Title
WO2018174011A1 (fr) Position detection device and position detection method
US11544874B2 (en) System and method for calibration of machine vision cameras along at least three discrete planes
JP6576664B2 (ja) Edge detection bias correction value calculation method, edge detection bias correction method, and program
JP2011180084A (ja) Captured-image processing device for a component mounter
WO2015045649A1 (fr) Component mounting device
JP5052302B2 (ja) Component mounting method and device
JP5545737B2 (ja) Component mounter and image processing method
JP2007085912A (ja) Position measurement method, position measurement device, and position measurement system
JP6177255B2 (ja) Work machine and positional deviation data acquisition method
JP6941306B2 (ja) Imaging device, bump inspection device, and imaging method
JP6723648B2 (ja) Position detection device and position detection method
JP2008171873A (ja) Positioning device, positioning method, processing device, and processing method
JP2015115528A (ja) Substrate processing apparatus and substrate processing method
JP4061265B2 (ja) Projection height measuring method and measuring device
JP2011040474A (ja) Glass substrate position specifying method and device used therefor
JP2008135423A (ja) Contour detection device, positioning device, pattern drawing device, and contour detection method
CN115808356B (zh) Chip testing method and chip testing equipment
JP3858633B2 (ja) Bonding state inspection method
JP4829701B2 (ja) Camera scaling acquisition method for a component mounter
JP2009170586A (ja) Electronic component recognition method and device
WO2023095635A1 (fr) Distortion aberration rate calculation method and position detection device
TWI834949B (zh) Laser processing system capable of quickly positioning a robot arm in a three-dimensional coordinate system
US10861199B2 (en) System for creating component shape data for image processing, and method for creating component shape data for image processing
JP5113657B2 (ja) Surface mounting method and device
JP2008229404A (ja) Head position correction method

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18771210

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2019507664

Country of ref document: JP

Kind code of ref document: A

ENP Entry into the national phase

Ref document number: 20197019474

Country of ref document: KR

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 18771210

Country of ref document: EP

Kind code of ref document: A1