WO2016178300A1 - Robot teaching method and robot - Google Patents
- Publication number
- WO2016178300A1 (PCT/JP2016/001936)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- hand
- robot
- target
- axis
- sensor
- Prior art date
Classifications
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1612—Programme controls characterised by the hand, wrist, grip control
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1628—Programme controls characterised by the control loop
- B25J9/163—Programme controls characterised by the control loop learning, adaptive, model based, rule based expert control
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1694—Programme controls characterised by use of sensors other than normal servo-feedback from position, speed or acceleration sensors, perception control, multi-sensor controlled systems, sensor fusion
- B25J9/1697—Vision controlled systems
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B2219/00—Program-control systems
- G05B2219/30—Nc systems
- G05B2219/37—Measurements
- G05B2219/37415—By cutting light beam
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B2219/00—Program-control systems
- G05B2219/30—Nc systems
- G05B2219/39—Robotics, robotics to robotics hand
- G05B2219/39536—Planning of hand motion, grasping
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B2219/00—Program-control systems
- G05B2219/30—Nc systems
- G05B2219/45—Nc applications
- G05B2219/45057—Storage handling for disks or material
-
- H—ELECTRICITY
- H01—ELECTRIC ELEMENTS
- H01L—SEMICONDUCTOR DEVICES NOT COVERED BY CLASS H10
- H01L21/00—Processes or apparatus adapted for the manufacture or treatment of semiconductor or solid state devices or of parts thereof
- H01L21/67—Apparatus specially adapted for handling semiconductor or electric solid state devices during manufacture or treatment thereof; Apparatus specially adapted for handling wafers during manufacture or treatment of semiconductor or electric solid state devices or components ; Apparatus not specifically provided for elsewhere
- H01L21/67005—Apparatus not specifically provided for elsewhere
- H01L21/67242—Apparatus for monitoring, sorting or marking
- H01L21/67259—Position monitoring, e.g. misposition detection or presence detection
-
- H—ELECTRICITY
- H01—ELECTRIC ELEMENTS
- H01L—SEMICONDUCTOR DEVICES NOT COVERED BY CLASS H10
- H01L21/00—Processes or apparatus adapted for the manufacture or treatment of semiconductor or solid state devices or of parts thereof
- H01L21/67—Apparatus specially adapted for handling semiconductor or electric solid state devices during manufacture or treatment thereof; Apparatus specially adapted for handling wafers during manufacture or treatment of semiconductor or electric solid state devices or components ; Apparatus not specifically provided for elsewhere
- H01L21/68—Apparatus specially adapted for handling semiconductor or electric solid state devices during manufacture or treatment thereof; Apparatus specially adapted for handling wafers during manufacture or treatment of semiconductor or electric solid state devices or components ; Apparatus not specifically provided for elsewhere for positioning, orientation or alignment
- H01L21/681—Apparatus specially adapted for handling semiconductor or electric solid state devices during manufacture or treatment thereof; Apparatus specially adapted for handling wafers during manufacture or treatment of semiconductor or electric solid state devices or components ; Apparatus not specifically provided for elsewhere for positioning, orientation or alignment using optical controlling means
-
- Y—GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y10—TECHNICAL SUBJECTS COVERED BY FORMER USPC
- Y10S—TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y10S901/00—Robots
- Y10S901/02—Arm motion controller
- Y10S901/03—Teaching system
-
- Y—GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y10—TECHNICAL SUBJECTS COVERED BY FORMER USPC
- Y10S—TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y10S901/00—Robots
- Y10S901/46—Sensing device
- Y10S901/47—Optical
Definitions
- the present invention relates to a robot teaching method and a robot.
- In a semiconductor processing facility, a mapping sensor may be provided at the tip of the hand of the transfer robot. The mapping sensor detects the presence or absence of substrates stored in the cassette.
- The mapping sensor is configured such that the sensor beam travels straight through the space between the first and second tip portions of the bifurcated hand, for example.
- The position where the object blocks the sensor beam is the detection position. For this reason, the position of the object along the beam direction cannot be specified.
- US Pat. No. 7,522,267 (Patent Document 1) discloses a technique for calculating the XY-plane position of an object from the detection positions of a mapping sensor in two different postures with respect to the object.
- However, the calculated position includes an error.
- Accumulated robot zeroing errors, mechanical dimensional errors, and the like cause errors not only in angle but also in XY-plane position. As a result, the position of the object cannot be taught accurately.
- Accordingly, an object of the present invention is to accurately specify the position of the object and thereby improve the teaching accuracy of the robot.
- A robot teaching method according to the present invention is for a robot comprising: a robot arm having degrees of freedom in at least the two directions of the X axis and the Y axis; a hand attached to the tip of the arm and having a bifurcated first tip portion and second tip portion; a mapping sensor configured so that a sensor beam travels straight through the space between the first and second tip portions and to detect whether or not a target blocks the sensor beam; and a control device that controls the operation of the robot arm.
- The method comprises: an arrangement step of arranging a target at a teaching position; a first specifying step of moving the hand straight from a predetermined position and specifying the front-rear position of the target as seen from the robot when the target blocks the sensor beam;
- a swinging step of swinging the hand around a predetermined turning axis on an axis orthogonal to the optical axis of the sensor beam so that the target is scanned in the horizontal direction by the sensor beam; a determination step of determining, based on the detection signal of the mapping sensor changed by swinging the hand, whether or not the target coincides with a position along the central axis in the longitudinal direction of the hand;
- and, if it is determined that the target does not coincide, a shifting step of calculating an offset amount of the hand based on the detection signal of the mapping sensor changed by swinging the hand, and shifting the hand in either the left or right direction along the optical axis of the sensor beam according to the calculated offset amount.
- The mapping sensor is configured only to detect whether or not the target blocks the sensor beam passing through the space at the tip of the hand. For this reason, by itself it can specify the position of the target in only one direction as seen from the robot.
- the hand is swung around the predetermined turning axis, and the target is scanned in the horizontal direction by the sensor beam.
- the waveform of the detection signal of the mapping sensor changes.
- An offset amount is calculated based on the detection signal that changes.
- the hand is shifted based on the calculated offset amount.
- the hand may be swung by the same left and right angles around a predetermined turning axis on an axis orthogonal to the optical axis of the sensor beam.
- the offset amount can be suitably calculated from the detection signal of the mapping sensor by swinging the robot arm by the same left and right angle with respect to the central axis in the longitudinal direction of the hand.
- It may be determined whether or not the target coincides with a position along the central axis in the longitudinal direction of the hand.
- the position of the target can be accurately specified.
- The hand shift direction and the axes of the robot's reference coordinate system are not always parallel. Therefore, the hand can be shifted accurately by calculating in advance the inclination of the optical axis of the sensor beam with respect to an axis of the robot's reference coordinate system.
- In the arrangement step, two targets may be arranged at two teaching positions, and the first specifying step, the swinging step, the determination step, the shifting step, the second specifying step, and the teaching step may be performed for each of the two targets. The method may further include an adjustment step of adjusting, based on the relative positions of the specified targets and the design distance between the targets, the deviation that occurs when the hand moves straight a predetermined distance from the taught hand position.
- A robot according to the present invention includes: a robot arm having degrees of freedom in at least the two directions of the X axis and the Y axis; a hand attached to the tip of the arm and having a bifurcated first tip portion and second tip portion; a mapping sensor configured so that a sensor beam travels straight through the space between the first and second tip portions and to detect whether or not a target blocks the sensor beam; and a control device that controls the operation of the robot.
- The control device moves the hand straight from a predetermined position and specifies the front-rear position of the target, as seen from the robot, from the moment the target arranged at the teaching position blocks the sensor beam.
- It then swings the hand around a predetermined turning axis on an axis orthogonal to the optical axis of the sensor beam so that the target is scanned in the horizontal direction by the sensor beam, and determines, based on the detection signal of the mapping sensor changed by swinging the hand, whether or not the target coincides with a position along the central axis in the longitudinal direction of the hand.
- When it determines that the target does not coincide, it calculates an offset amount based on the detection signal of the mapping sensor changed by swinging the hand, and shifts the hand in the left or right direction along the optical axis of the sensor beam according to the calculated offset amount.
- When it determines that the target coincides, it specifies the left-right position of the target as seen from the robot, and teaches the robot the position of the hand corresponding to the teaching position based on the specified front-rear and left-right positions of the target.
- FIG. 1 is a schematic diagram illustrating the configuration of the robot according to the first embodiment.
- FIG. 2 is a plan view of the hand of FIG.
- FIG. 3 is a block diagram showing a control system of the robot of FIG.
- FIG. 4 is a flowchart showing an example of the teaching method of the robot of FIG.
- FIG. 5 is a diagram for explaining the operation of the robot that specifies the position of the target in the XY plane.
- FIG. 6 is a schematic plan view when the hand of FIG. 5 is swung.
- FIG. 7 shows the waveform of the detection signal of the mapping sensor when the center of the hand matches the target.
- FIG. 8 shows the waveform of the detection signal of the mapping sensor when the center of the hand does not match the target.
- FIG. 9 is a schematic plan view when the hand is shifted.
- FIG. 10 is a schematic plan view of a hand and a target in the second embodiment.
- FIG. 11 is a schematic plan view when the coordinate axis is inclined in the robot reference coordinate system with respect to the hand shift direction.
- FIG. 1 is a schematic diagram illustrating a configuration of the robot 1 according to the first embodiment.
- the robot 1 is used to transport a substrate W such as a wafer, which is a material of a semiconductor element, in a semiconductor processing facility for manufacturing a semiconductor element, for example.
- Wafers include semiconductor wafers and glass wafers.
- Semiconductor wafers include, for example, silicon wafers, other single semiconductor wafers, and compound semiconductor wafers.
- the glass wafer includes, for example, a glass substrate for FPD, a glass substrate for MEMS, and a sapphire (single crystal alumina) wafer.
- a plurality of processing apparatuses are provided for performing processing such as heat treatment, impurity introduction processing, thin film formation processing, lithography processing, cleaning processing, and planarization processing on a wafer.
- the robot 1 transports the substrate W to an area (processing chamber) where these processing apparatuses are arranged.
- the substrate W is stored in a shelf inside the cassette 6 installed on the cassette table 7.
- the robot 1 includes, for example, an arm 2, a lifting shaft 3, a base 4, a control device 5, and a hand 10.
- The robot 1 is a so-called horizontal articulated four-axis robot, and the substrate W is placed on its hand 10.
- a wrist having a horizontal degree of freedom is provided at the tip of an arm 2 having degrees of freedom in the three directions of the X axis, the Y axis, and the Z axis, and a hand 10 is provided on this wrist.
- the robot 1 has a base 4 fixed to an appropriate place (for example, a floor) of a semiconductor processing facility, and the base 4 is provided with a lifting shaft 3.
- the axis of the elevating shaft 3 is directed vertically, for example.
- the base 4 incorporates an actuator (not shown) made of an air cylinder, for example. By this operation of the actuator, the lifting shaft 3 moves up and down on the upper surface side of the base 4.
- the arm 2 includes a first arm 2a and a second arm 2b.
- the first arm 2 a is provided at the upper end of the lifting shaft 3.
- the first arm 2 a extends horizontally from the upper end of the lifting shaft 3.
- One end of the first arm 2a is connected to the lifting shaft 3 so as to be swingable about the vertical axis L1, and the lifting shaft 3 incorporates an actuator (not shown) made of, for example, an electric motor. By the operation of this actuator, the first arm 2 a swings in the horizontal plane with respect to the lifting shaft 3.
- the second arm 2b is provided on the upper surface side of the other end of the first arm 2a.
- the second arm 2b extends horizontally from the other end of the first arm 2a.
- One end of the second arm 2b is connected to the first arm 2a so as to be swingable around the vertical axis L2.
- the other end of the first arm 2a incorporates an actuator (not shown) made of, for example, an electric motor. By the operation of the actuator, the second arm 2b swings in the horizontal plane with respect to the other end of the first arm 2a.
- On the upper surface side of the other end of the second arm 2b, there is provided a hand 10 on which the substrate W is placed and held.
- the hand 10 is connected to the other end of the second arm 2b so as to be swingable around the vertical axis L3.
- An actuator (not shown) made of, for example, an electric motor is built in the other end of the second arm 2b. By the operation of the actuator, the hand 10 swings in the horizontal plane with respect to the other end of the second arm 2b.
- FIG. 2 is a plan view of the hand 10 of FIG. 1 as viewed from above.
- the hand 10 is made of a plate material formed in a U shape in plan view. In the present embodiment, the plate material is symmetrical with respect to the U-shaped central axis C.
- the U-shaped main body has a single base end portion 10a and a pair of first tip end portion 10b and second tip end portion 10c extending in a forked manner from the base end portion. A space is formed between the first tip portion 10b and the second tip portion 10c.
- the base end portion 10 a of the hand is fixed to one end of the mounting plate 20, and the main body of the hand 10 extends horizontally from the mounting plate 20.
- the other end of the mounting plate 20 is connected to the other end of the second arm 2b so as to be swingable around the vertical axis L3.
- the hand 10 is configured to place and hold a disk-shaped substrate W.
- the hand 10 includes a pressing surface 11a as the substrate holding unit 11 and two edge grips 11b and 11c.
- the pressing surface 11 a is provided on the upper surface of the base end portion 10 a of the hand 10.
- the two edge grips 11b and 11c are provided on the upper surfaces of the first tip portion 10b and the second tip portion 10c of the hand 10. The edge of the substrate W is pushed toward the edge grips 11b and 11c by the pressing surface 11a, and the substrate W is held together with the edge grips 11b and 11c.
- The mapping sensor 12 is formed over the first tip portion 10b, the second tip portion 10c, and the region spanning the space of the hand 10, and detects the presence of the substrate W when facing it.
- the mapping sensor 12 is configured such that the sensor beam B goes straight through the space between the first tip portion 10b and the second tip portion 10c.
- the mapping sensor 12 detects whether or not the substrate W blocks the sensor beam B.
- the center axis C in the longitudinal direction of the hand 10 and the center of the sensor beam B coincide.
- the light emitting unit 13 is built in the mounting plate 20 of the hand 10.
- the light emitting unit 13 converts the electrical input from the control device 5 and generates detection light.
- One end of an optical fiber 15a is connected to the light emitting portion 13, and the optical fiber 15a is laid from the back side of the proximal end portion 10a of the hand to the back side of the distal end portion 10b.
- the optical fiber 15a guides the detection light emitted from the light emitting unit 13 to the back side of the tip 10b of the hand.
- the light receiving unit 14 is built in the mounting plate 20 of the hand 10. The light receiving unit 14 receives the detection light and converts the detection light into an electrical output to the control device 5.
- FIG. 3 is a block diagram showing the control system of the robot 1.
- The control device 5 is a robot controller connected via control lines to the light emitting unit 13, the light receiving unit 14, the substrate holding unit 11 of the hand 10, and the driving device 30 of the robot 1.
- the control device 5 is not limited to a single device, and may be composed of a plurality of devices.
- the control device 5 includes a calculation unit 51, a storage unit 52, and a servo control unit 53.
- the storage unit 52 stores information such as a basic program of the control device 5 and a robot operation program.
- the arithmetic unit 51 performs arithmetic processing for robot control and generates a control command for the robot 1.
- the servo control unit 53 is configured to control the drive device 30 of the robot 1 based on the control command generated by the calculation unit 51.
- the light emitting unit 13 includes a light emitting element 16 and a drive circuit 17.
- the light emitting element 16 generates and emits detection light.
- the drive circuit 17 applies a voltage to the light emitting element 16 to drive the light emitting element.
- the drive circuit 17 generates a voltage according to a control signal (electrical input) from the control device 5 and drives the light emitting element 16.
- the light receiving unit 14 includes a light receiving element 18 and an output circuit 19.
- the light receiving element 18 converts the optical signal into an electrical signal by generating a voltage according to the amount of received detection light.
- a photodiode is used as the light receiving element 18.
- the output circuit 19 amplifies the electric signal and outputs it as a detection signal (electrical output) of the mapping sensor 12.
- the light emitting element 16 or the light receiving element 18 and the optical fibers 15a and 15b are connected by a connector (not shown).
- the light emitting unit 13 and the light receiving unit 14 include the light emitting element 16 and the light receiving element 18, and the light emitting element 16 and the light receiving element 18 constitute a transmissive optical sensor.
- the pressure of the pressing surface 11 a in contact with the substrate W is controlled according to the control command of the control device 5.
- the edge of the substrate W is pushed toward the edge grips 11b and 11c by the pressing surface 11a, and the substrate W is held together with the edge grips 11b and 11c.
- the drive device 30 is configured by an actuator that drives the lifting shaft 3, the first arm 2a, and the second arm 2b shown in FIG.
- the drive device 30 operates the actuator that drives the lifting shaft 3, the first arm 2 a, and the second arm 2 b according to the control command of the control device 5, and moves the hand 10 up and down and horizontally.
- [Mapping operation] Next, the mapping operation by the hand 10 will be described with reference to FIGS.
- The robot 1 controls the operation of the arm 2 so that the tip of the hand 10 sequentially faces the substrates W stored in the shelves, for example from the bottom shelf to the top shelf of the cassette 6, and scans them in this way (see FIG. 1).
- the sensor beam B goes straight in the space between the first tip portion 10b and the second tip portion 10c (see FIG. 2).
- the detection light is received by the end of the optical fiber 15 b on the back side of the tip 10 c of the hand 10.
- the light receiving unit 14 outputs a high level detection signal (ON signal) to the control device 5. That is, the detection signal when the substrate W is not stored on the shelf is at a high level.
- When a substrate W is stored in a shelf, the sensor beam B traveling through the space between the tip portions 10b and 10c of the hand is blocked by the outer peripheral portion of the substrate W.
- The light receiving unit 14 then outputs a low-level detection signal (OFF signal) to the control device 5. That is, when a substrate W is stored in a shelf, the detection signal of the sensor is at a low level. In this way, the control device 5 can sequentially determine whether or not a substrate is stored in each shelf of the cassette 6.
- [Target position specification] As described above, the mapping sensor 12 only detects whether or not the target blocks the sensor beam B.
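The shelf-by-shelf mapping determination described above can be sketched as follows. This is a minimal illustration with hypothetical helper names: `move_hand_to_shelf` and `read_beam_blocked` stand in for the actual arm motion command and the sensor read, which the patent does not specify as an API.

```python
def map_cassette(num_shelves, move_hand_to_shelf, read_beam_blocked):
    """Scan the cassette shelves from bottom to top.

    A low-level (OFF) detection signal means the beam is blocked, i.e. a
    substrate is stored in the shelf; a high-level (ON) signal means empty.
    """
    presence = []
    for shelf in range(num_shelves):
        move_hand_to_shelf(shelf)             # face the hand tip to this shelf
        presence.append(read_beam_blocked())  # True -> substrate present
    return presence
```

The result is one boolean per shelf, which corresponds to the control device's shelf-by-shelf determination.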
- Therefore, the control device 5 can specify the position of the target, as viewed from the robot 1, in only one direction (for example, the straight-ahead direction of the hand 10).
- the target position viewed from the robot 1 is, for example, the coordinate position of the target in the reference coordinate system of the robot 1.
- Therefore, the robot 1 uses the mapping sensor 12 to specify the planar position of the target and teaches itself that position.
- FIG. 4 is a flowchart showing an example of the teaching method of the robot 1.
- FIG. 5 is a diagram for explaining the operation of the robot for specifying the position of the target in the XY plane.
- The reference coordinate system of the robot 1 is set in the control device 5. For example, the intersection of the installation surface of the base 4 with the rotation axis of the turning axis L1 of the first arm 2a is the origin, the rotation axis of the turning axis L1 is the Z axis, an arbitrary axis orthogonal to the Z axis is the X axis, and the axis orthogonal to both the Z axis and the X axis is the Y axis.
- the operator places the target 40 at the teaching position (step S1 in FIG. 4).
- the shape of the target 40 is arbitrary. As shown in FIG. 5, the target 40 extends in the Z-axis direction and is formed in a cylindrical shape. The Z-axis direction is the vertical direction.
- the target 40 is disposed at a teaching position predetermined in the cassette 6 or the like.
- the robot 1 is moved to a predetermined start position. The robot 1 may be moved to the start position according to a preset program, or the operator may operate the robot 1 and move it manually to the start position.
- the control device 5 specifies the position of the target 40 in the front-rear direction of the hand 10 (step S2 in FIG. 4). Specifically, the control device 5 operates the arm 2 to move the hand 10 straight from a predetermined position. Then, the control device 5 specifies the position of the target 40 in the front-rear direction viewed from the robot 1 when the target 40 blocks the sensor beam B.
- the front-rear direction of the target 40 is a direction parallel to the Y-axis of the reference coordinate system of the robot.
- When the detection signal changes from high level to low level, the control device 5 calculates the position of the hand 10 in the reference coordinate system based on the dimensions of the links constituting the first arm 2a and the second arm 2b and the angles of the joint axes, and records the calculated position in the storage unit 52. The hand 10 of the robot 1 is thereby set so that the sensor beam B of the mapping sensor 12 lies in front of the target 40.
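The first specifying step (step S2) can be sketched as a simple stepping search. The callbacks are hypothetical stand-ins for the actual controller operations: `advance` moves the hand straight by one step, `read_beam_blocked` reports whether the detection signal is low, and `get_hand_y` returns the hand's front-rear coordinate computed from the link dimensions and joint angles.

```python
def find_front_rear_position(advance, read_beam_blocked, get_hand_y,
                             step=0.5, max_steps=1000):
    """Move the hand straight ahead in small steps and return the hand's
    front-rear (Y) position, in the robot reference frame, at the moment
    the target first blocks the sensor beam."""
    for _ in range(max_steps):
        if read_beam_blocked():   # detection signal fell to low level
            return get_hand_y()
        advance(step)             # move straight by one step
    raise RuntimeError("target never blocked the beam")
```

In the real controller the falling edge of the detection signal would trigger the position capture; the polling loop here is only for illustration.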
- FIG. 6 is a schematic plan view when the hand 10 of FIG. 5 is swung.
- the hand 10 shown in the drawings is simplified for convenience.
- the hand 10 is swung around a turning axis L3 on an axis orthogonal to the optical axis of the sensor beam B so that the target 40 is scanned in the horizontal direction by the sensor beam B.
- Specifically, starting from the position giving the low-level detection signal, the control device 5 swings the hand 10 by the same angle to the left and right (for example, 10 degrees each way) around the turning axis L3 on the longitudinal central axis C of the hand 10.
- The pivot axis of the hand 10 only needs to be on the central axis C in the longitudinal direction of the hand; for example, the hand may be swung around another axis on the central axis C.
- the control device 5 determines whether or not the target 40 coincides with the position along the central axis C in the longitudinal direction of the hand 10 based on the detection signal of the mapping sensor 12 changed by swinging the hand 10.
- FIGS. 7 and 8 show the waveforms of the detection signal of the mapping sensor 12 when the central axis C of the hand 10 coincides with the target 40 and when it does not, respectively. In both figures, the vertical axis represents the detection signal and the horizontal axis represents the swing angle. In the coincident case, the value of the detection signal is symmetric within a predetermined swing angle range centered on 0 degrees.
- That is, the control device 5 determines whether or not the target 40 coincides with a position along the central axis C in the longitudinal direction of the hand 10 according to whether the value of the detection signal of the mapping sensor 12 is symmetric within a predetermined swing angle range centered on 0 degrees.
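The symmetry test can be sketched as follows. This is a hypothetical criterion (the patent does not specify a tolerance or sampling scheme): the signal is sampled at mirrored swing angles, and values at +a and -a are compared.

```python
def signal_is_symmetric(angles_deg, values, tol=0.05):
    """Return True if the detection-signal waveform is symmetric about a
    swing angle of 0 degrees, within tolerance `tol`.
    `angles_deg` must contain mirrored samples, e.g. -10 ... +10 degrees."""
    by_abs_angle = {}
    for angle, value in zip(angles_deg, values):
        by_abs_angle.setdefault(abs(angle), []).append(value)
    # every mirrored pair (+a, -a) must agree to within the tolerance
    return all(max(vs) - min(vs) <= tol for vs in by_abs_angle.values())
```

A symmetric waveform (FIG. 7) passes this test; a skewed one (FIG. 8) fails it.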
- When the control device 5 determines that the target does not coincide, it calculates the offset amount of the hand 10 based on the detection signal of the mapping sensor 12 changed by swinging the hand 10 (step S5).
- In the present embodiment, the control device 5 calculates the offset amount by comparing the integrated value of the detection signal (absolute value) obtained when the hand 10 is swung in the plus (right) direction with the integrated value obtained when the hand 10 is swung in the minus (left) direction.
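A sketch of this offset calculation follows. The `gain` parameter is a hypothetical calibration factor (not given in the patent) that converts the signal imbalance into a shift distance; the sign of the result indicates on which side of the central axis C the target lies.

```python
def calc_offset(signal_plus, signal_minus, gain=1.0):
    """Offset estimate from the swing test (step S5).

    Compares the integrated absolute detection signal of the plus (right)
    swing with that of the minus (left) swing; a zero result means the
    two halves are balanced, i.e. the target is centered."""
    integral_plus = sum(abs(v) for v in signal_plus)
    integral_minus = sum(abs(v) for v in signal_minus)
    return gain * (integral_plus - integral_minus)
```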
- FIG. 9 shows a case where the hand 10 is shifted.
- the control device 5 shifts the hand 10 in the right direction along the optical axis of the sensor beam B according to the calculated offset amount.
- the hand 10 is shifted in the X-axis direction of the robot reference coordinate system.
- The shift amount of the hand 10 is determined so that the integrated value of the detection signal (absolute value) when the hand 10 is swung in the plus (right) direction becomes equal to the integrated value of the detection signal (absolute value) when the hand 10 is swung in the minus (left) direction.
- The control device 5 then returns to step S3 and repeats the above steps until the central axis C of the hand 10 coincides with the target 40. That is, the control device 5 repeats the above steps until the value of the detection signal becomes symmetric within a predetermined swing angle range centered on 0 degrees, as shown in FIG. 7.
- In this way, the target 40 is aligned with a position along the central axis C in the longitudinal direction of the hand 10, that is, with the center of the line of the sensor beam B.
- the plane position of the target 40 can be specified.
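Steps S3 to S6 above form an iterative centering loop, which might be sketched like this. The callbacks are hypothetical: `swing_and_record` performs the swing and returns the recorded signal, `is_centered` implements the symmetry test, `estimate_offset` the offset calculation, and `shift_hand` the shift along the beam's optical axis.

```python
def center_hand_on_target(swing_and_record, is_centered, estimate_offset,
                          shift_hand, max_iterations=20):
    """Repeat swing -> symmetry test -> shift until the target lies on the
    hand's longitudinal central axis C. Returns True on convergence."""
    for _ in range(max_iterations):
        signal = swing_and_record()          # step S3: swing and record signal
        if is_centered(signal):              # step S4: symmetry test
            return True
        shift_hand(estimate_offset(signal))  # steps S5-S6: shift along beam axis
    return False
```

An iteration cap replaces the unbounded repetition of the flowchart, so a mis-calibrated offset gain cannot loop forever.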
- When the control device 5 determines that they coincide (YES in step S4), it specifies the position of the target in the left-right direction as viewed from the robot (step S7).
- the horizontal direction of the target 40 is a direction parallel to the X axis of the reference coordinate system of the robot 1.
- the control device 5 calculates the position of the hand 10 in the reference coordinate system based on the dimensions of the links constituting the first arm 2a and the second arm 2b and the angle of the joint axis, and the calculated position is stored in the storage unit 52. To record.
- control device 5 teaches the robot 1 the position of the hand 10 corresponding to the teaching position based on the position of the specified target 40 in the front-rear direction and the left-right direction (step S8).
- the position of the target 40 can be accurately specified, so that the teaching accuracy of the robot 1 can be improved.
- Otherwise, the position (posture) of the hand 10 recognized by the robot 1 may differ from its actual position (posture).
- In some cases, the target 40 deviates from the central axis C of the hand 10. That is, even if the central axis C of the hand 10 points straight at the target 40, the hand 10 may be shifted in the horizontal direction while traveling the distance between the position where the sensor beam B is blocked and the position where the hand 10 holds the substrate W at its center.
- FIG. 10 is a plan view schematically showing the hand 10 and the target 40 in the second embodiment.
- the present embodiment differs from the first embodiment in that two targets 40a and 40b are arranged at two teaching positions, respectively. Steps S2 to S8 are then performed for each of the two targets. Based on the relative position of the specified positions of the targets 40a and 40b and on the design distance between the targets 40a and 40b, the deviation that arises when the hand 10 moves straight by a predetermined distance from the taught position of the hand 10 is adjusted. For example, when the robot 1 holds the substrate W, the deviation that arises when the hand 10 moves straight from the edge of the substrate in the cassette to the center of the substrate can be suitably adjusted.
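The two-target adjustment can be illustrated as below. This is only a sketch: the function name, the (x, y) tuples, and the returned scale/heading pair are assumptions, not the patent's stated algorithm.

```python
import math

def travel_correction(pos_a, pos_b, design_distance):
    """Correction for straight travel derived from two taught targets.

    pos_a, pos_b: specified (x, y) positions of targets 40a and 40b
    design_distance: nominal (design) distance between the two targets
    Returns a distance scale factor and the actual travel heading (rad),
    which a controller could use to correct a commanded straight move.
    """
    dx, dy = pos_b[0] - pos_a[0], pos_b[1] - pos_a[1]
    measured = math.hypot(dx, dy)          # distance actually observed
    return design_distance / measured, math.atan2(dy, dx)
```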
- in the first embodiment, the X axis of the reference coordinate system of the robot 1 coincided with the shift direction of the hand 10, so that shifting the hand along the optical axis of the sensor beam B moved the hand 10 along the X-axis direction of the robot coordinate system (see FIG. 9). However, the optical-axis direction of the sensor beam B is not always parallel to the axes of the reference coordinate system of the robot.
- FIG. 11 is a schematic plan view when the coordinate axis of the robot reference coordinate system is inclined with respect to the hand shift direction.
- the shift direction (optical-axis direction) of the hand 10 and the X axis of the robot reference coordinate system are inclined relative to each other by an angle θ.
- the inclination θ of the optical axis of the sensor beam B with respect to the X axis of the base coordinate system of the robot (the inclination of the central axis C of the hand 10 with respect to the Y axis of the reference coordinate system) is calculated in advance. For example, two positions where the sensor reacts to the target 40 are obtained, and the inclination θ is obtained from the movement amount b along the X axis and the movement amount a along the Y axis.
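One plausible reading of that calculation, assuming the two sensor-reaction positions lie along the beam's optical axis so that the tilt from the X axis follows directly from the two movement amounts (the function name is hypothetical):

```python
import math

def beam_inclination(a, b):
    """Inclination theta (rad) of the sensor-beam optical axis relative to
    the robot X axis, from the movement amount a along the Y axis and the
    movement amount b along the X axis between the two positions where the
    sensor reacted to the target. Assumes both positions lie on the axis.
    """
    return math.atan2(a, b)  # 0 when the axis is parallel to X (a == 0)
```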
- the control device 5 shifts the hand 10 along the optical axis of the sensor beam B so as to maintain the calculated inclination θ. The hand can thereby be shifted accurately.
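Shifting along the tilted optical axis then amounts to decomposing the shift into robot-frame components; a minimal sketch in which the sign conventions are assumptions:

```python
import math

def shift_components(d, theta):
    """X/Y components, in the robot reference frame, of a shift of length d
    along an optical axis inclined by theta (rad) from the robot X axis."""
    return d * math.cos(theta), d * math.sin(theta)
```

With θ = 0 this degenerates to the first embodiment's pure X-axis shift.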
- although the mapping sensor 12 of the present embodiment is a transmissive type (see FIG. 2), it may be a reflective type as long as the sensor beam B travels straight through the space between the first tip portion 10b and the second tip portion 10c. That is, the mapping sensor 12 may have both the light emitting part 13 and the light receiving part 14 on one of the first tip portion 10b and the second tip portion 10c, and a reflecting member on the other. For example, the sensor beam B emitted from the light emitting portion 13 of the first tip portion 10b is reflected by the reflecting member of the second tip portion 10c and received by the light receiving portion 14 of the first tip portion 10b.
- the robot 1 is a horizontal articulated transfer robot.
- the robot 1 is not limited to this; any general robot in which the mapping sensor is provided at the tip of the hand may be used.
- the present invention is useful for all robots having a mapping sensor at the tip.
Abstract
Description
Hereinafter, a first embodiment according to the present invention will be described with reference to the drawings. In the following, the same or corresponding elements are denoted by the same reference numerals throughout all the drawings, and redundant description is omitted.
[Robot]
The robot 1 includes, for example, an arm 2, an elevating shaft 3, a base 4, a control device 5, and a hand 10. In the present embodiment, a substrate W is placed on the hand 10 of a so-called horizontal articulated four-axis robot. In the robot 1, a wrist having a horizontal degree of freedom is provided at the tip of the arm 2, which has degrees of freedom in the three axial directions of the X, Y, and Z axes, and the hand 10 is provided on this wrist.
[Hand]
FIG. 2 is a plan view of the hand 10 of FIG. 1 as seen from above. As shown in FIG. 2, the hand 10 is formed of a plate material that is U-shaped in plan view. In the present embodiment, the plate is left-right symmetric with respect to the central axis C of the U shape. The U-shaped body has a single base end portion 10a and a pair of first and second tip portions 10b and 10c that branch off from the base end and extend in a bifurcated manner. A space is formed between the first tip portion 10b and the second tip portion 10c. The base end portion 10a of the hand is fixed to one end of a mounting plate 20, and the body of the hand 10 extends horizontally from the mounting plate 20. The other end of the mounting plate 20 is coupled to the other end portion of the second arm 2b so as to be swingable about a vertical axis L3.
[Control system]
FIG. 3 is a block diagram showing the control system of the robot 1. As shown in FIG. 3, the control device 5 is connected via control lines to the light emitting part 13, the light receiving part 14, and the substrate holding part 11 of the hand 10 and to the drive device 30 of the robot 1, and is a robot controller including a computer such as a microcontroller. The control device 5 is not limited to a single device and may be composed of a plurality of devices. The control device 5 includes a calculation unit 51, a storage unit 52, and a servo control unit 53.
[Mapping operation]
Next, the mapping operation by the hand 10 will be described with reference to FIGS. 1 and 2. In the mapping detection operation, the robot 1 controls the operation of the arm 2 so that the tip of the hand 10 is scanned sequentially, for example from the lowest shelf to the highest shelf of the cassette 6, while facing the substrates W stored on the respective shelves (see FIG. 1).
[Target position specification]
As described above, the mapping sensor 12 performs detection according to whether or not the target blocks the sensor beam B. Therefore, the control device 5 can only specify the position of the target in one direction as viewed from the robot 1 (for example, the straight-travel direction of the hand 10). Here, the position of the target as viewed from the robot 1 is, for example, the coordinate position of the target in the reference coordinate system of the robot 1. In the present embodiment, the robot 1 uses the mapping sensor 12 to specify the planar position of the target and teaches it to itself.
150 mm × sin(0.3°) = 0.79 mm
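This figure can be checked directly, assuming the 0.3 is in degrees (a 0.3 rad angle would give roughly 44 mm instead):

```python
import math

# Lateral offset produced by a 0.3-degree angular error over a 150 mm span
offset_mm = 150 * math.sin(math.radians(0.3))
print(round(offset_mm, 2))  # ≈ 0.79
```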
(Second embodiment)
Next, a second embodiment will be described with reference to FIG. 10. In the following, description of the configurations shared with the first embodiment is omitted, and only the differing configurations are described.
The control device 5 shifts the hand 10 along the optical axis of the sensor beam B so as to maintain the calculated inclination θ. The hand can thereby be shifted accurately.
2 arm
3 elevating shaft
4 base
5 control device
6 cassette
7 cassette table
10 hand
11 substrate holding part
12 mapping sensor
13 light emitting part
14 light receiving part
15a, 15b optical fibers
16 light emitting element
17 drive circuit
18 light receiving element
19 output circuit
20 mounting plate
30 drive device
40 target
51 calculation unit
52 storage unit
53 servo control unit
Claims (6)
- A method of teaching a robot, the robot comprising: a robot arm having degrees of freedom in at least two axial directions, an X axis and a Y axis; a hand attached to a tip of the arm and having a first tip portion and a second tip portion branching in a bifurcated shape; a mapping sensor configured such that a sensor beam travels straight through a space between the first and second tip portions, the mapping sensor detecting whether or not a target has blocked the sensor beam; and a control device that controls operation of the robot arm, the method comprising:
a placement step of placing a target at a teaching position;
a first specifying step of advancing the hand straight from a predetermined position and specifying a front-rear position of the target as viewed from the robot when the target blocks the sensor beam;
a swinging step of swinging the hand about a predetermined pivot axis located on an axis orthogonal to an optical axis of the sensor beam so that the sensor beam scans the target in a horizontal direction;
a determination step of determining, based on a detection signal of the mapping sensor that changes as the hand is swung, whether or not the target coincides with a position along a central axis of the hand in a longitudinal direction thereof;
a shifting step of, when it is determined that they do not coincide, calculating an offset amount of the hand based on the detection signal of the mapping sensor that changes as the hand is swung, and shifting the hand leftward or rightward along the optical axis of the sensor beam in accordance with the calculated offset amount;
a second specifying step of, when it is determined that they coincide, specifying a left-right position of the target as viewed from the robot; and
a teaching step of teaching the robot a position of the hand corresponding to the teaching position based on the specified front-rear and left-right positions of the target.
- The method of teaching a robot according to claim 1, wherein in the swinging step the hand is swung left and right by the same angle about the predetermined pivot axis located on the axis orthogonal to the optical axis of the sensor beam.
- The method of teaching a robot according to claim 1 or 2, wherein in the determination step whether or not the target coincides with the position along the central axis of the hand in the longitudinal direction thereof is determined by whether or not the detection signal value of the mapping sensor has symmetry within a predetermined swing-angle range centered on 0 degrees.
- The method of teaching a robot according to any one of claims 1 to 3, further comprising a step of calculating an inclination of the optical axis of the sensor beam with respect to an axis of a reference coordinate system of the robot,
wherein in the shifting step the hand is shifted along the optical axis of the sensor beam so as to maintain the calculated inclination.
- The method of teaching a robot according to any one of claims 1 to 4, wherein in the placement step two targets are placed at two teaching positions, respectively, and the first specifying step, the swinging step, the determination step, the shifting step, the second specifying step, and the teaching step are performed for each of the two targets,
the method further comprising an adjustment step of adjusting, based on a relative position of the specified positions of the targets and a design distance between the targets, a deviation arising when the hand is advanced straight by a predetermined distance from the taught position of the hand.
- A robot comprising: a robot arm having degrees of freedom in at least two axial directions, an X axis and a Y axis; a hand attached to a tip of the arm and having a first tip portion and a second tip portion branching in a bifurcated shape; a mapping sensor configured such that a sensor beam travels straight through a space between the first and second tip portions, the mapping sensor detecting whether or not a target has blocked the sensor beam; and a control device that controls operation of the robot arm,
wherein the control device advances the hand straight from a predetermined position and specifies a front-rear position, as viewed from the robot, of the target placed at a teaching position when the target blocks the sensor beam;
swings the hand about a predetermined pivot axis located on an axis orthogonal to the optical axis of the sensor beam so that the sensor beam scans the target in a horizontal direction;
determines, based on a detection signal of the mapping sensor that changes as the hand is swung, whether or not the target coincides with a position along a central axis of the hand in a longitudinal direction thereof;
when determining that they do not coincide, calculates an offset amount of the hand based on the detection signal of the mapping sensor that changes as the hand is swung, and shifts the hand leftward or rightward along the optical axis of the sensor beam in accordance with the calculated offset amount; and when determining that they coincide, specifies a left-right position of the target as viewed from the robot and teaches the robot a position of the hand corresponding to the teaching position based on the specified front-rear and left-right positions of the target.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020177034231A KR101923089B1 (ko) | 2015-05-01 | 2016-04-06 | 로봇의 교시방법 및 로봇 |
JP2017516547A JP6637494B2 (ja) | 2015-05-01 | 2016-04-06 | ロボットの教示方法及びロボット |
CN201680024079.9A CN107530877B (zh) | 2015-05-01 | 2016-04-06 | 机器人的示教方法及机器人 |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US14/702,333 | 2015-05-01 | ||
US14/702,333 US9796086B2 (en) | 2015-05-01 | 2015-05-01 | Method of teaching robot and robot |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2016178300A1 true WO2016178300A1 (ja) | 2016-11-10 |
Family
ID=57204491
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2016/001936 WO2016178300A1 (ja) | 2015-05-01 | 2016-04-06 | ロボットの教示方法及びロボット |
Country Status (6)
Country | Link |
---|---|
US (1) | US9796086B2 (ja) |
JP (1) | JP6637494B2 (ja) |
KR (1) | KR101923089B1 (ja) |
CN (1) | CN107530877B (ja) |
TW (1) | TWI593526B (ja) |
WO (1) | WO2016178300A1 (ja) |
Cited By (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2019067910A (ja) * | 2017-09-29 | 2019-04-25 | 川崎重工業株式会社 | 基板搬送装置及び基板搬送ロボットと基板載置部との位置関係を求める方法 |
WO2019216401A1 (ja) * | 2018-05-11 | 2019-11-14 | 川崎重工業株式会社 | 基板搬送ロボット及び基板保持ハンドの光軸ずれ検出方法 |
WO2020137799A1 (ja) * | 2018-12-27 | 2020-07-02 | 川崎重工業株式会社 | ロボットの位置補正方法およびロボット |
WO2020137800A1 (ja) * | 2018-12-27 | 2020-07-02 | 川崎重工業株式会社 | ロボットの位置補正方法およびロボット |
KR20220024899A (ko) | 2019-06-27 | 2022-03-03 | 카와사키 주코교 카부시키 카이샤 | 기판 매핑 장치, 그 매핑 방법 및 매핑 교시 방법 |
JP2022520052A (ja) * | 2019-02-08 | 2022-03-28 | ヤスカワ アメリカ インコーポレイティッド | スルービーム自動ティーチング |
JP2022531326A (ja) * | 2019-06-28 | 2022-07-06 | 川崎重工業株式会社 | 基板搬送装置 |
WO2024075839A1 (ja) * | 2022-10-07 | 2024-04-11 | 川崎重工業株式会社 | 基板搬送用ロボットシステム、および、基板搬送用ロボットによる基板の置き位置および取り位置の少なくともいずれかの補正方法 |
Families Citing this family (17)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR101802993B1 (ko) | 2015-02-12 | 2017-12-28 | 남한석 | 비제한적 구동형 마킹 시스템 및 그 마킹 방법 |
CN204937899U (zh) * | 2015-09-10 | 2016-01-06 | 合肥京东方光电科技有限公司 | 一种基板卡匣 |
US10500726B2 (en) * | 2016-04-25 | 2019-12-10 | Kindred Systems Inc. | Facilitating device control |
WO2018119450A1 (en) | 2016-12-23 | 2018-06-28 | Gecko Robotics, Inc. | Inspection robot |
US11673272B2 (en) | 2016-12-23 | 2023-06-13 | Gecko Robotics, Inc. | Inspection robot with stability assist device |
US11307063B2 (en) | 2016-12-23 | 2022-04-19 | Gtc Law Group Pc & Affiliates | Inspection robot for horizontal tube inspection having vertically positionable sensor carriage |
CN106956290B (zh) * | 2017-04-17 | 2019-09-10 | 京东方科技集团股份有限公司 | 机械臂及其操作方法、机械臂装置及显示面板生产设备 |
KR101853127B1 (ko) * | 2017-05-19 | 2018-04-27 | 주식회사 랜도르아키텍쳐 | 구동형 마킹 시스템, 구동형 마킹 장치의 제어방법 및 컴퓨터 판독 가능한 기록매체 |
US20190013215A1 (en) * | 2017-07-05 | 2019-01-10 | Kawasaki Jukogyo Kabushiki Kaisha | Substrate holding hand and substrate conveying apparatus including the same |
JP6966913B2 (ja) * | 2017-09-29 | 2021-11-17 | 川崎重工業株式会社 | 基板搬送装置及び基板載置部の回転軸の探索方法 |
CN108189025A (zh) * | 2017-11-23 | 2018-06-22 | 上海楷沃机器人科技有限公司 | 一种仿真人形机器人控制胸部震动幅度的方法 |
JP7029983B2 (ja) * | 2018-03-09 | 2022-03-04 | 東京エレクトロン株式会社 | 測定器及び測定器のずれ量を求める方法 |
US10867821B2 (en) * | 2018-09-11 | 2020-12-15 | Kawasaki Jukogyo Kabushiki Kaisha | Substrate transfer robot and method of teaching edge position of target body |
US10953539B2 (en) * | 2018-12-27 | 2021-03-23 | Kawasaki Jukogyo Kabushiki Kaisha | Substrate transfer robot and automatic teaching method |
CA3173116A1 (en) | 2021-04-20 | 2022-10-20 | Edward A. Bryner | Flexible inspection robot |
US11971389B2 (en) | 2021-04-22 | 2024-04-30 | Gecko Robotics, Inc. | Systems, methods, and apparatus for ultra-sonic inspection of a surface |
CN113625658B (zh) * | 2021-08-17 | 2022-12-06 | 杭州飞钛航空智能装备有限公司 | 偏移信息处理方法、装置、电子设备和制孔机构 |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2003022534A1 (fr) * | 2001-09-07 | 2003-03-20 | Kabushiki Kaisha Yaskawa Denki | Apprentissage de la position d'une plaquette et montage d'apprentissage |
JP2005310858A (ja) * | 2004-04-19 | 2005-11-04 | Yaskawa Electric Corp | ウェハ位置教示方法および教示治具装置 |
Family Cites Families (16)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5194743A (en) * | 1990-04-06 | 1993-03-16 | Nikon Corporation | Device for positioning circular semiconductor wafers |
JP3215086B2 (ja) * | 1998-07-09 | 2001-10-02 | ファナック株式会社 | ロボット制御装置 |
JP4260423B2 (ja) * | 2002-05-30 | 2009-04-30 | ローツェ株式会社 | 円盤状物の基準位置教示方法、位置決め方法および搬送方法並びに、それらの方法を使用する円盤状物の基準位置教示装置、位置決め装置、搬送装置および半導体製造設備 |
US6760976B1 (en) * | 2003-01-15 | 2004-07-13 | Novellus Systems, Inc. | Method for active wafer centering using a single sensor |
JP4524132B2 (ja) * | 2004-03-30 | 2010-08-11 | 東京エレクトロン株式会社 | 真空処理装置 |
KR20060088817A (ko) * | 2005-01-28 | 2006-08-07 | 가부시키가이샤 이빔 | 기판처리장치 및 기판처리방법 |
JP4566798B2 (ja) * | 2005-03-30 | 2010-10-20 | 東京エレクトロン株式会社 | 基板位置決め装置,基板位置決め方法,プログラム |
TWI447061B (zh) * | 2005-07-11 | 2014-08-01 | Brooks Automation Inc | 備有自動化對準功能的基板移送裝置 |
DE102005048136B4 (de) * | 2005-10-06 | 2010-01-21 | Kuka Roboter Gmbh | Verfahren zum Bestimmen eines virtuellen Tool-Center-Points |
JP4522360B2 (ja) * | 2005-12-02 | 2010-08-11 | 日東電工株式会社 | 半導体ウエハの位置決定方法およびこれを用いた装置 |
JP5235376B2 (ja) * | 2007-10-05 | 2013-07-10 | 川崎重工業株式会社 | ロボットのターゲット位置検出装置 |
KR101621814B1 (ko) * | 2008-08-01 | 2016-05-17 | 가부시키가이샤 알박 | 반송 로봇의 티칭 방법 |
JP2010153769A (ja) * | 2008-11-19 | 2010-07-08 | Tokyo Electron Ltd | 基板位置検出装置、基板位置検出方法、成膜装置、成膜方法、プログラム及びコンピュータ可読記憶媒体 |
US9275886B2 (en) * | 2012-10-29 | 2016-03-01 | Rorze Corporation | Device and method for detecting position of semiconductor substrate |
DE202013101050U1 (de) * | 2013-03-11 | 2014-08-05 | Deutsches Zentrum für Luft- und Raumfahrt e.V. | Führungssystem für eine Roboteranordnung |
CN103419199A (zh) * | 2013-07-09 | 2013-12-04 | 天津大学 | 一种基于开放式焊接机器人的示教系统 |
2015
- 2015-05-01 US US14/702,333 patent/US9796086B2/en active Active
2016
- 2016-04-06 CN CN201680024079.9A patent/CN107530877B/zh active Active
- 2016-04-06 WO PCT/JP2016/001936 patent/WO2016178300A1/ja active Application Filing
- 2016-04-06 JP JP2017516547A patent/JP6637494B2/ja active Active
- 2016-04-06 KR KR1020177034231A patent/KR101923089B1/ko active IP Right Grant
- 2016-04-29 TW TW105113421A patent/TWI593526B/zh active
Cited By (18)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2019067910A (ja) * | 2017-09-29 | 2019-04-25 | 川崎重工業株式会社 | 基板搬送装置及び基板搬送ロボットと基板載置部との位置関係を求める方法 |
KR102374083B1 (ko) | 2018-05-11 | 2022-03-14 | 카와사키 주코교 카부시키 카이샤 | 기판 반송 로봇 및 기판 유지 핸드의 광축 어긋남 검출 방법 |
WO2019216401A1 (ja) * | 2018-05-11 | 2019-11-14 | 川崎重工業株式会社 | 基板搬送ロボット及び基板保持ハンドの光軸ずれ検出方法 |
JP2019195890A (ja) * | 2018-05-11 | 2019-11-14 | 川崎重工業株式会社 | 基板搬送ロボット及び基板保持ハンドの光軸ずれ検出方法 |
JP7049909B2 (ja) | 2018-05-11 | 2022-04-07 | 川崎重工業株式会社 | 基板搬送ロボット及び基板保持ハンドの光軸ずれ検出方法 |
KR20210003904A (ko) * | 2018-05-11 | 2021-01-12 | 카와사키 주코교 카부시키 카이샤 | 기판 반송 로봇 및 기판 유지 핸드의 광축 어긋남 검출 방법 |
TWI719474B (zh) * | 2018-05-11 | 2021-02-21 | 日商川崎重工業股份有限公司 | 基板搬送機器人及基板保持手的光軸偏差檢測方法 |
WO2020137800A1 (ja) * | 2018-12-27 | 2020-07-02 | 川崎重工業株式会社 | ロボットの位置補正方法およびロボット |
JPWO2020137800A1 (ja) * | 2018-12-27 | 2021-10-07 | 川崎重工業株式会社 | ロボットの位置補正方法およびロボット |
JPWO2020137799A1 (ja) * | 2018-12-27 | 2021-10-07 | 川崎重工業株式会社 | ロボットの位置補正方法およびロボット |
WO2020137799A1 (ja) * | 2018-12-27 | 2020-07-02 | 川崎重工業株式会社 | ロボットの位置補正方法およびロボット |
JP7064624B2 (ja) | 2018-12-27 | 2022-05-10 | 川崎重工業株式会社 | ロボットの位置補正方法およびロボット |
JP7064623B2 (ja) | 2018-12-27 | 2022-05-10 | 川崎重工業株式会社 | ロボットの位置補正方法およびロボット |
JP2022520052A (ja) * | 2019-02-08 | 2022-03-28 | ヤスカワ アメリカ インコーポレイティッド | スルービーム自動ティーチング |
KR20220024899A (ko) | 2019-06-27 | 2022-03-03 | 카와사키 주코교 카부시키 카이샤 | 기판 매핑 장치, 그 매핑 방법 및 매핑 교시 방법 |
JP2022531326A (ja) * | 2019-06-28 | 2022-07-06 | 川崎重工業株式会社 | 基板搬送装置 |
JP7266714B2 (ja) | 2019-06-28 | 2023-04-28 | 川崎重工業株式会社 | 基板搬送装置 |
WO2024075839A1 (ja) * | 2022-10-07 | 2024-04-11 | 川崎重工業株式会社 | 基板搬送用ロボットシステム、および、基板搬送用ロボットによる基板の置き位置および取り位置の少なくともいずれかの補正方法 |
Also Published As
Publication number | Publication date |
---|---|
KR101923089B1 (ko) | 2019-02-27 |
US20160318182A1 (en) | 2016-11-03 |
JP6637494B2 (ja) | 2020-01-29 |
JPWO2016178300A1 (ja) | 2018-04-26 |
TW201706096A (zh) | 2017-02-16 |
CN107530877B (zh) | 2020-07-31 |
CN107530877A (zh) | 2018-01-02 |
KR20170140362A (ko) | 2017-12-20 |
TWI593526B (zh) | 2017-08-01 |
US9796086B2 (en) | 2017-10-24 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2016178300A1 (ja) | ロボットの教示方法及びロボット | |
JP6979012B2 (ja) | 基板搬送装置及び基板搬送ロボットの教示方法 | |
TWI488723B (zh) | Training methods of handling robot | |
KR102560895B1 (ko) | 기판 반송 로봇 및 자동 교시 방법 | |
JP3955499B2 (ja) | ハンドの位置合わせ方法およびその装置 | |
KR100860246B1 (ko) | 캐리어 형상 측정기 | |
JP4064361B2 (ja) | 搬送装置の搬送位置の位置情報取得方法 | |
JP6111065B2 (ja) | 自動教示システム及び教示方法 | |
JP7266714B2 (ja) | 基板搬送装置 | |
JP2010162611A (ja) | 相対ティーチング方法 | |
KR102560896B1 (ko) | 로봇의 위치 보정 방법 및 로봇 | |
WO2018168962A1 (ja) | 基板搬送装置 | |
JP7238126B2 (ja) | 基板マッピング装置、そのマッピング方法及びマッピング教示方法 | |
US20230322504A1 (en) | Robot and hand orientation adjustment method | |
KR20220044674A (ko) | 산업용 로봇 | |
JP4439993B2 (ja) | 半導体製造装置 | |
JP7443142B2 (ja) | 産業用ロボットおよび産業用ロボットの制御方法 | |
JP4098598B2 (ja) | ウェハ搬送装置の教示用装置 | |
JP2000106467A (ja) | 半導体レーザ素子の配置方法及びその装置 | |
JP2007129063A (ja) | ウェハ搬送装置の教示装置 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 16789439; Country of ref document: EP; Kind code of ref document: A1 |
ENP | Entry into the national phase | Ref document number: 2017516547; Country of ref document: JP; Kind code of ref document: A |
NENP | Non-entry into the national phase | Ref country code: DE |
ENP | Entry into the national phase | Ref document number: 20177034231; Country of ref document: KR; Kind code of ref document: A |
122 | Ep: pct application non-entry in european phase | Ref document number: 16789439; Country of ref document: EP; Kind code of ref document: A1 |