WO2009098803A1 - Image information generation device and image information generation method - Google Patents

Image information generation device and image information generation method

Info

Publication number
WO2009098803A1
Authority
WO
WIPO (PCT)
Prior art keywords
image information
light source
dimensional
imaging
information generation
Prior art date
Application number
PCT/JP2008/069345
Other languages
French (fr)
Japanese (ja)
Inventor
Shigeo Murakami
Original Assignee
Dainippon Screen Mfg.Co., Ltd.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Dainippon Screen Mfg. Co., Ltd.
Publication of WO2009098803A1

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J19/00Accessories fitted to manipulators, e.g. for monitoring, for viewing; Safety devices combined with or specially adapted for use in connection with manipulators
    • B25J19/02Sensing devices
    • B25J19/021Optical sensing devices
    • B25J19/023Optical sensing devices including video camera means

Definitions

  • the present invention relates to an image information generating apparatus and an image information generating method for generating both 2D image information and 3D information.
  • A visual system that generates not only two-dimensional image information but also three-dimensional information generally determines parallax from two pieces of two-dimensional image information generated by two cameras installed at an interval, and derives the three-dimensional information from that parallax.
  • However, such a visual system is expensive because it requires two cameras, so its introduction into robots has not progressed.
  • Patent Document 1 is an example of such a visual system: one camera is installed tilted downward toward the front of an automobile, and the system generates information on the distance to an object by exploiting the fact that the position at which the object appears in the image moves vertically according to the distance to the object.
  • Patent Document 2 is also an example of such a visual system: one camera capable of zooming is installed, and the system generates information on the distance to an object by exploiting the fact that the amount by which the object's position in the image moves during zooming varies with the distance to the object.
  • The visual system disclosed in Patent Document 1 is useful in limited situations, such as detecting the distance between automobiles traveling on a flat road, but it is not very useful in complicated situations, such as generating information on the distance to an object during the complex work expected of an intelligent robot. In particular, having to fix the elevation angle of the camera limits the situations in which it can be used.
  • The visual system disclosed in Patent Document 2 takes time to drive the optical system for zooming, so it is difficult for it to generate information on the distance to an object at high speed.
  • An object of the present invention is to provide an image information generation device, such as a visual system, that can be configured inexpensively, has few restrictions on the situations in which it can be used, and can generate both two-dimensional image information and three-dimensional information at high speed.
  • A first aspect of the image information generation device is a device that generates both two-dimensional image information and three-dimensional information, comprising: an imaging unit that performs imaging and generates two-dimensional image information; a projection light source that irradiates parallel light to project a geometric pattern; a three-dimensional information generation unit that generates three-dimensional information from the portion of the two-dimensional image information generated by the imaging unit that relates to the geometric pattern projected by the projection light source; a control unit that controls the lighting of the projection light source and the imaging timing of the imaging unit; and a process selection unit that selects the processing applied to the two-dimensional image information generated by the imaging unit. The control unit causes the imaging unit to perform first imaging while the projection light source is not lit, generating first two-dimensional image information, and causes the imaging unit to perform second imaging while the projection light source is lit, generating second two-dimensional image information; the process selection unit causes the three-dimensional information generation unit to process the second two-dimensional image information.
  • In a second aspect, the image information generation device further includes an illumination light source that emits illumination light illuminating the imaging range of the imaging unit; the control unit also controls the lighting of the illumination light source and turns it on when causing the imaging unit to perform the first imaging.
  • In a third aspect, the projection light source includes a light-emitting diode and an optical system that converts the diffused light emitted by the light-emitting diode into parallel light.
  • In a fourth aspect, the three-dimensional information generation unit generates information on the distance from the imaging unit to the object onto which the geometric pattern is projected, based on the length between reference locations of the geometric pattern shown in the two-dimensional image related to the second two-dimensional image information.
  • In a fifth aspect, the three-dimensional information generation unit generates information on the posture of the object onto which the geometric pattern is projected, based on the distortion of the shape of the geometric pattern shown in the two-dimensional image related to the second two-dimensional image information.
  • In a sixth aspect, the device further includes a projection availability determination unit that identifies the portion of the first two-dimensional image information in which an object having a specific shape is reflected, and determines whether the projection light source can project the geometric pattern onto the object reflected in the identified portion. The process selection unit causes the projection availability determination unit to process the first two-dimensional image information, and causes the three-dimensional information generation unit to process the second two-dimensional image information only when the projection availability determination unit determines that the geometric pattern can be projected onto the object having the specific shape.
  • A first aspect of the image information generation method is a method for generating both two-dimensional image information and three-dimensional information, comprising the steps of: (a) performing first imaging to generate first two-dimensional image information; (b) irradiating parallel light to project a geometric pattern while imaging is not being performed in step (a); (c) performing second imaging while the geometric pattern is being projected in step (b), to generate second two-dimensional image information; and (d) generating three-dimensional information from the portion of the second two-dimensional image information generated in step (c) in which the geometric pattern projected in step (b) is reflected.
  • According to these aspects, both kinds of information can be generated with a single imaging unit, so an image information generation device that outputs both two-dimensional image information and three-dimensional information can be configured inexpensively. Since both can be generated wherever the geometric pattern can be projected, situational constraints are reduced. Furthermore, since it is not necessary to drive an optical system, both the two-dimensional image information and the three-dimensional information can be generated at high speed.
  • According to the second aspect, illumination light is irradiated during the first imaging, so the first two-dimensional image information is hardly affected by environmental light.
  • According to the third aspect, the fast response of the light-emitting diode allows the projection light source to be switched on and off at high speed, so the generation of two-dimensional image information and the generation of three-dimensional information can be alternated at high speed.
  • Also, since the light-emitting diode is close to an ideal point light source, its diffused light can be converted into parallel light with a simple optical system.
  • According to the fourth aspect, the distance from the imaging unit to the projection object can be obtained by a simple calculation, so it can be derived at high speed.
  • According to the fifth aspect, the posture of the projection object can be obtained by a simple calculation, so it can be derived at high speed.
  • According to the sixth aspect, when the geometric pattern cannot be projected onto the object, the process of generating three-dimensional information is not performed, so the generation of unnecessary three-dimensional information can be prevented.
  • FIG. 11 is a schematic diagram explaining the pan angle θ and the tilt angle φ that represent the posture of a semiconductor wafer. FIG. 12 shows two-dimensional images in which the semiconductor wafer onto which the stripe is projected is reflected. FIG. 13 shows two-dimensional images before and after edge extraction. FIG. 14 is a flowchart explaining the unit measurement operation of the industrial robot 1. FIG. 15 is a schematic diagram showing the positional relationship between the projection light source 204 and camera 224 and the semiconductor wafer W. FIG. 16 shows two-dimensional images related to image data captured by the image processing apparatus 232. FIG. 17 is a flowchart explaining the moving body prediction operation of the industrial robot 1.
  • FIG. 1 is a schematic diagram of an industrial robot 1 according to a preferred embodiment of the present invention.
  • the industrial robot 1 receives the semiconductor wafer W from the previous process and loads it onto the carrier, and unloads the semiconductor wafer W from the carrier and delivers it to the subsequent process.
  • However, this does not mean that the introduction destination of the image information generation apparatus and the image information generation method according to the present invention is limited to the industrial robot 1; they may be introduced into other types of industrial robots, into robots other than industrial robots, and even into apparatuses not included in the category of robots.
  • the industrial robot 1 includes a robot hand 102 that holds the semiconductor wafer W to be processed, a robot arm 116 that changes the position and posture of the robot hand 102, and a robot control device 132 that controls the robot hand 102 and the robot arm 116.
  • the robot hand 102 includes a finger 104 that rotates relative to the arm 122 about the joint 106, a finger 110 that rotates relative to the arm 122 about the joint 108, and a motor 112 that rotates the fingers 104 and 110.
  • the robot hand 102 opens and closes the fingers 104 and 110 by causing the motor 112 to rotate them in accordance with a control signal given from the robot control device 132, thereby grasping or releasing the semiconductor wafer W between the fingers 104 and 110.
  • the robot hand 102 may be of any type as long as it can hold or release the semiconductor wafer W according to the control signal.
  • the robot arm 116 includes an arm 118 that rotates relative to the base 130 about the joint 120, an arm 122 that rotates relative to the arm 118 about the joint 124, a motor 126 that rotates the arm 118, and a motor 128 that rotates the arm 122.
  • the robot arm 116 changes the position and posture of the robot hand 102 by causing the motor 126 to rotate the arm 118 and causing the motor 128 to rotate the arm 122 in accordance with a control signal given from the robot controller 132.
  • the robot arm 116 may be of any type as long as the position and posture of the robot hand 102 can be changed according to the control signal.
  • FIG. 2 is a block diagram illustrating a hardware configuration of the robot control device 132.
  • the robot control device 132 is a computer including a CPU 134, a memory 136, and a hard disk drive 138.
  • the robot controller 132 outputs a control signal to the motors 112, 126, and 128 according to the robot control program 146 loaded from the hard disk drive 138 to the memory 136, and controls the robot hand 102 and the robot arm 116.
  • the industrial robot 1 grips and transports the semiconductor wafer W in the work space WS.
  • the robot control device 132 calculates, from the two-dimensional image information and the three-dimensional information transferred from the image processing device 232, the rotation amounts of the fingers 104 and 110 and the arms 118 and 122 necessary for gripping the semiconductor wafer W, and outputs to the motors 112, 126, and 128 the control signals needed to obtain those rotation amounts.
  • Control signals are output from the robot control device 132 to the motors 112, 126, and 128 via the input/output interface 140, and the two-dimensional image information and three-dimensional information are transferred from the image processing device 232 to the robot control device 132 via the communication interface 142.
  • the CPU 134, the memory 136, the hard disk drive 138, the input / output interface 140, and the communication interface 142 are connected to the internal bus 144 and can exchange data with each other.
  • the vision system 202 captures the front of the robot hand 102 and generates two-dimensional image information.
  • the vision system 202 projects the geometric pattern GP onto the semiconductor wafer W in front of the robot hand 102, and generates three-dimensional information on the position and posture of the semiconductor wafer W from the imaging result of the geometric pattern GP.
  • the visual system 202 will be described in detail.
  • the visual system 202 includes a projection light source 204 that projects the geometric pattern GP, an illumination light source 214 that illuminates the work space WS, and an LED driver 220 that supplies light-emission power to the LEDs (light-emitting diodes) 206 and 216 of the projection light source 204 and the illumination light source 214, both described later.
  • the visual system 202 includes a camera 224 that captures an image of the front of the robot hand 102 and generates an image signal including two-dimensional image information, and an image processing device 232 that processes the image signal generated by the camera 224.
  • FIG. 3 is a schematic cross-sectional view of the projection light source 204.
  • the projection light source 204 includes the LED 206, an optical system 208 that converts the diffused light emitted by the LED 206 into parallel light, and a mask 210 whose portions passing the light rays that illuminate the bright parts of the projected geometric pattern GP are transmissive and whose remaining portions are shielding. Because it irradiates parallel light, the projection light source 204 projects a geometric pattern GP of constant size onto the semiconductor wafer W regardless of the distance to the projection object. Note that it is not essential to use the LED 206 as the light source of the projection light source 204; an incandescent bulb, a fluorescent lamp, or the like can also be used.
  • However, when the LED 206 is used as the light source, its fast response allows the projection light source 204 to be switched on and off at high speed, so the generation of two-dimensional image information and the generation of three-dimensional information can be alternated at high speed. Also, since the LED 206 is close to an ideal point light source, its diffused light can be converted into parallel light with a simple optical system.
  • the geometric pattern GP needs to include at least two reference points for specifying a place whose length is to be measured in the two-dimensional image in which the geometric pattern GP is reflected.
  • "At least two reference locations" include, for example, a point and a point, a point and a line, a line and a line, and the like; here "point" includes the intersection of lines, and "line" includes the boundary line between a dark part and a bright part.
  • It is desirable that the geometric pattern GP be easy for the image processing device 232 to recognize.
  • Examples of such a geometric pattern GP include, as shown in FIG. 4(a), a stripe (striped pattern) ST in which strip-shaped bright portions BR1 and strip-shaped dark portions DA1 of the same shape are alternately arranged in one direction, and, as shown in FIG. 4(b), a check (checkered pattern) CH in which square bright portions BR2 and square dark portions DA2 are alternately arranged in two mutually orthogonal directions.
  • In the following, the geometric pattern GP is the stripe ST, and the boundary lines BDL3 between the bright parts BR1 and the dark parts DA1 are used as the "reference locations".
  • Projection light source 204 is fixed to the tip of arm 122 and emits parallel light toward the front of robot hand 102. Thereby, the projection light source 204 irradiates parallel light toward the center of the imaging range of the camera 224, and projects the stripe ST onto the semiconductor wafer W present at the center of the imaging range of the camera 224.
  • the parallel light beam emitted from the projection light source 204 need only be thin, so the projection light source 204 can be made small enough to be built into the tip of the arm 122.
  • the illumination light source 214 includes an LED 216 that emits diffused light. Note that it is not essential to use the LED 216 as the light source of the illumination light source 214, and an incandescent bulb, a fluorescent lamp, or the like can also be used. However, when the LED 216 is used as the light source, the response of the LED 216 is high speed, so that the illumination light source 214 can be switched on and off at high speed, and the generation of 2D image information and the generation of 3D information can be performed. There is an advantage that generation can be switched at high speed.
  • the illumination light source 214 is fixed separately from the robot hand 102 and the robot arm 116, and emits diffused light toward the work space WS. As a result, the illumination light source 214 emits substantially uniform illumination light toward the imaging range of the camera 224.
  • It is sufficient for the "substantially uniform illumination light" to be uniform enough that a two-dimensional image in which the entire imaging range of the camera 224 is clearly reflected can be obtained.
  • the two-dimensional image information generated by the visual system 202 is less affected by ambient light.
  • the illumination light source 214 can be omitted if a two-dimensional image in which the entire imaging range of the camera 224 is clearly reflected can be obtained only with ambient light.
  • FIG. 5 is a time chart for explaining the lighting timing of the projection light source 204 and the illumination light source 214 and the imaging timing of the camera 224.
  • the LED driver 220 alternately supplies light emission power to the LEDs 206 and 216 in synchronization with the synchronization signal input from the camera 224, and alternately turns on the projection light source 204 and the illumination light source 214.
  • Specifically, as shown in FIG. 5(a), the LED driver 220 turns on the projection light source 204 in the even-numbered frames FL2, FL4, FL6, ... of the frames FL1, FL2, FL3, FL4, FL5, FL6, ..., and turns it off in the odd-numbered frames FL1, FL3, FL5, ....
  • Conversely, as shown in FIG. 5(b), the LED driver 220 turns on the illumination light source 214 in the odd-numbered frames FL1, FL3, FL5, ... and turns it off in the even-numbered frames FL2, FL4, FL6, ....
  • The length of the frames FL1, FL2, FL3, FL4, FL5, FL6, ... is typically about 1/30 second, but if the fast-responding LEDs 206 and 216 are used as the light sources, it can be shortened to 1/60 second or 1/200 second.
  • In FIG. 5, the lighting time of the projection light source 204 is the same as the lighting time of the illumination light source 214, but the two lighting times do not necessarily have to be equal.
  • For example, as shown in FIGS. 6(a) and 6(b), the projection light source 204 may be turned on in the 3n-th frames FL3, FL6, ... (where n is a natural number), and the illumination light source 214 may be turned on in the (3n-2)-th and (3n-1)-th frames FL1, FL2, FL4, FL5, ....
  • Alternatively, if the light emitted from the projection light source 204 is made stronger than the light emitted from the illumination light source 214, so that the geometric pattern GP is projected clearly onto the semiconductor wafer W even while the illumination light is being irradiated, the illumination light source 214 may be kept lit at all times as shown in FIGS. 7(a) and 7(b), while the projection light source 204 is turned on in the even-numbered frames FL2, FL4, FL6, ... and turned off in the odd-numbered frames FL1, FL3, FL5, ....
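The three timing schemes above reduce to simple per-frame rules. The sketch below (illustrative only; the patent describes hardware behavior, not code) expresses which light sources the LED driver 220 keeps on in frame n under each of FIGS. 5, 6, and 7, with frames numbered from 1:

```python
# Which light sources are lit in frame n (counted from 1) under the three
# timing schemes of FIGS. 5-7. A sketch for illustration, not patent text.
def lights_for_frame(n, scheme="fig5"):
    if scheme == "fig5":   # alternate: odd -> illumination, even -> projection
        return {"illumination"} if n % 2 == 1 else {"projection"}
    if scheme == "fig6":   # projection only in every third (3n-th) frame
        return {"projection"} if n % 3 == 0 else {"illumination"}
    if scheme == "fig7":   # illumination always on; projection added on even frames
        return {"illumination", "projection"} if n % 2 == 0 else {"illumination"}
    raise ValueError(f"unknown scheme: {scheme}")
```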
  • FIG. 8 is a block diagram of the camera 224.
  • the camera 224 includes a synchronization signal generator 226 that generates a synchronization signal for controlling the imaging timing, and an imaging sensor 228 that performs imaging in synchronization with the synchronization signal input from the synchronization signal generator 226 and outputs an image signal.
  • As shown in FIGS. 5(c), 6(c), and 7(c), the camera 224 performs imaging in each of the frames FL1, FL2, FL3, FL4, FL5, FL6, ..., and, as shown in FIGS. 5(d), 6(d), and 7(d), outputs the image signal obtained in each frame during the following frame FL2, FL3, FL4, FL5, FL6, ....
  • FIG. 9 is a block diagram illustrating a hardware configuration of the image processing apparatus 232.
  • the image processing apparatus 232 is a computer that includes a CPU 234, a memory 236, and a hard disk drive 238.
  • the image processing device 232 processes the image signal input from the camera 224 according to the image processing program 258 loaded from the hard disk drive 238 to the memory 236.
  • Here the robot control device 132 and the image processing device 232 are separate computers, but one computer may provide the functions of both. Further, all or some of the functions of the image processing apparatus 232 may be realized by dedicated hardware.
  • the image processing apparatus 232 also includes an A / D converter 240 that converts an analog image signal input from the camera 224 into digital image data, and a frame memory 242 that stores the image data.
  • the image processing apparatus 232 includes a keyboard 246 and a mouse 248 that accept an operation of the operator, and a monitor 250 that displays information as a user interface.
  • a keyboard 246 and a mouse 248 are connected to the CPU 234.
  • the CPU 234 performs processing according to operations performed on the keyboard 246 and the mouse 248.
  • the monitor 250 is connected to the video board 252 and displays 2D image information, 3D information, and other information in accordance with a drawing command given from the CPU 234. Transfer of the two-dimensional image information and the three-dimensional information from the image processing device 232 to the robot control device 132 is performed via the communication interface 254.
  • the CPU 234, the memory 236, the hard disk drive 238, the A / D converter 240, the frame memory 242, the video board 252 and the communication interface 254 are connected to the internal bus 256 and can exchange data with each other.
  • FIG. 10 is a block diagram illustrating functions realized by the image processing apparatus 232 executing the image processing program 258.
  • the image processing apparatus 232 functionally includes a process selection unit 260 that selects the processing applied to image data including two-dimensional image information, a three-dimensional information generation unit 262 that generates three-dimensional information about the semiconductor wafer W that is the object, and a projection availability determination unit 270 that determines whether the projection light source 204 can project the stripe ST onto the semiconductor wafer W.
  • The image data captured by the image processing apparatus 232 is of two types: first image data D1, D3, D5, ..., obtained by digitizing the image signals generated by the camera 224 performing the first imaging in the frames FL1, FL3, FL5, ... in which the projection light source 204 is off and the illumination light source 214 is on, and second image data D2, D4, D6, ..., obtained by digitizing the image signals generated by the camera 224 performing the second imaging in the frames FL2, FL4, FL6, ... in which the illumination light source 214 is off and the projection light source 204 is on.
  • In the following, image processing in the image processing apparatus 232 is described taking the timing control shown in FIG. 5 as an example; under the timing control shown in FIG. 6 or FIG. 7, the first image data and the second image data can be processed in the same way.
  • the processing selection unit 260 causes the projection availability determination unit 270 to process the first image data D1, D3, D5, ... of the image data D1, D2, D3, D4, D5, D6, ..., and causes the three-dimensional information generation unit 262 to process the second image data D2, D4, D6, .... Accordingly, the visual system 202 can generate both two-dimensional image information and three-dimensional information with a single camera 224 that generates two-dimensional image information, which contributes to constructing the visual system 202 inexpensively.
  • More precisely, the processing selection unit 260 causes the three-dimensional information generation unit 262 to process the second image data D2, D4, D6, ... only when information indicating that the projection light source 204 can project the stripe ST onto the semiconductor wafer W (hereinafter "projectable information") is input from the projection availability determination unit 270. Thereby, the visual system 202 does not perform the process of generating three-dimensional information when the stripe ST cannot be projected onto the semiconductor wafer W, preventing the generation of unnecessary three-dimensional information; a minimal routing sketch follows below.
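As a concrete illustration of this selection logic, the sketch below (assumed names and helper callables, not the patent's code) routes each digitized frame under the FIG. 5 timing: odd-frame data goes to the projection availability check, and even-frame data goes to 3D generation only while the projectable flag holds:

```python
# Frame routing performed by the process selection unit 260 (FIG. 5 timing).
# on_first / on_second stand in for the projection availability determination
# unit 270 and the 3D information generation unit 262, respectively.
def route_frame(frame_index, image_data, projectable, on_first, on_second):
    if frame_index % 2 == 1:       # odd frames FL1, FL3, ...: first image data
        return on_first(image_data)
    if projectable:                # even frames FL2, FL4, ...: second image data
        return on_second(image_data)
    return None                    # stripe not on the wafer: skip 3D processing
```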
  • the three-dimensional information generation unit 262 includes an edge extraction unit 264 that extracts edges from the two-dimensional images related to the second image data D2, D4, D6, ..., a boundary line detection unit 266 that detects, from the edges extracted by the edge extraction unit 264, the boundary lines BDL3 between the bright parts BR1 and the dark parts DA1 of the stripe ST, and a three-dimensional information calculation unit 268 that calculates three-dimensional information about the semiconductor wafer W from the geometric relationship of the plurality of boundary lines BDL3.
  • the edge extraction unit 264 extracts edges by a method using a differential filter or the like.
  • the boundary line detection unit 266 detects the boundary line by a method using template matching or Hough transform.
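A minimal sketch of these two steps using OpenCV as a stand-in; the patent names only "a differential filter" and "template matching or Hough transform", so the specific calls and thresholds below are illustrative assumptions:

```python
import cv2
import numpy as np

def detect_stripe_boundaries(gray):
    # Differential-filter edge extraction: Sobel gradients, magnitude threshold.
    gx = cv2.Sobel(gray.astype(np.float32), cv2.CV_32F, 1, 0)
    gy = cv2.Sobel(gray.astype(np.float32), cv2.CV_32F, 0, 1)
    edges = (cv2.magnitude(gx, gy) > 80).astype(np.uint8) * 255
    # Probabilistic Hough transform to detect the boundary lines BDL3.
    lines = cv2.HoughLinesP(edges, rho=1, theta=np.pi / 180, threshold=60,
                            minLineLength=40, maxLineGap=5)
    return [] if lines is None else [tuple(l[0]) for l in lines]
```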
  • the three-dimensional information calculation unit 268 derives a distance D representing the position of the semiconductor wafer W and a pan angle ⁇ representing the attitude of the semiconductor wafer W as three-dimensional information about the semiconductor wafer W.
  • the distance D represents the distance from the camera 224 to the semiconductor wafer W.
  • If the distance D is known, the distance from the robot hand 102 to the semiconductor wafer W can also be calculated.
  • The pan angle θ represents the inclination of the semiconductor wafer W from its normal state in the arrangement direction STA of the stripes ST.
  • FIG. 12 is a diagram showing two-dimensional images IMG1 and IMG2 in which the semiconductor wafer W onto which the stripe ST is projected is shown.
  • the stripe widths Xa and Xb in the two-dimensional image IMG1 are the lengths, expressed in pixels, between one boundary line BDL11 of the stripe ST shown in the two-dimensional image IMG1 and the adjacent boundary line BDL12; the stripe width Xa is acquired at the time of calibration, and the stripe width Xb at the time of measurement.
  • the distance L is the distance, acquired at the time of calibration, from the camera 224 to the projection target of the stripe ST. It is not essential that the stripe widths Xa and Xb be the interval between two adjacent boundary lines; an interval spanning three or more boundary lines may also be used.
  • Because the stripe ST is projected by parallel light, the distance D can be calculated by a simple calculation based on the stripe width Xb in the two-dimensional image IMG1. This exploits the fact that the size of the projected stripe ST does not change, so the stripe width Xb in the two-dimensional image IMG1 decreases as the distance D increases. Needless to say, calculating the distance D by such a simple calculation contributes to deriving it at high speed.
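The equation itself is not reproduced in this text. Under the stated geometry (constant physical stripe width because of parallel light, apparent width inversely proportional to distance under a pinhole camera), a plausible reconstruction is

$$D = L \cdot \frac{X_a}{X_b},$$

where Xa is the stripe width in pixels measured at the calibration distance L and Xb is the width measured at the unknown distance D.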
  • When the boundary lines of the stripe ST are parallel in the two-dimensional image IMG2 but the interval between adjacent boundary lines is not constant, the three-dimensional information calculation unit 268 determines that the semiconductor wafer W is inclined with respect to the optical axis OA, that is, that θ ≠ 0, calculates the distance D based on the stripe widths Xb1 and Xb2 according to equation (2), and calculates the pan angle θ based on the distortion of the shape of the stripe ST according to equation (3).
  • "Distortion of shape" here means a deviation from a shape similar to the stripe ST as projected on a projection surface perpendicular to the optical axis of the projection light source 204.
  • the stripe widths Xb1 and Xb2 in the two-dimensional image IMG2 are, as shown in FIG. 12(b), the lengths in pixels between the boundary line BDL21 of the stripe ST shown in the two-dimensional image IMG2 and the adjacent boundary line BDL22, and between the boundary line BDL22 and the adjacent boundary line BDL23, respectively; both are acquired at the time of measurement.
  • the stripe width STW is the distance between the boundary line BDL31 of the stripe ST projected on a projection surface perpendicular to the optical axis of the projection light source 204 and the adjacent boundary line BDL32.
  • the stripe width STW is determined by the width of the transmission part of the mask 210.
  • In the above, only the pan angle θ is calculated as the posture of the semiconductor wafer W, but if the check CH is used as the geometric pattern GP, or if the direction of the stripe ST is temporarily rotated by 90°, the tilt angle φ can be calculated by the same method. The tilt angle φ here represents the inclination of the semiconductor wafer W with respect to the optical axis OA in the direction perpendicular to the arrangement direction STA of the stripes ST.
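The following sketch puts the two measurements together. Equations (2) and (3) are not reproduced in the source, so the relations below are reconstructed from the geometry described above: each stripe period has a constant physical width STW, the apparent width gives a local distance via D = L·Xa/Xb, and a depth change of STW·tan θ per period in the arrangement direction produces the unequal widths Xb1 and Xb2. All names and numbers are illustrative:

```python
import math

def distance_from_width(L, Xa, Xb):
    # Parallel light: the stripe spans Xa px at calibration distance L,
    # so a measured width of Xb px implies a distance of L * Xa / Xb.
    return L * Xa / Xb

def distance_and_pan(L, Xa, Xb1, Xb2, STW):
    d1 = distance_from_width(L, Xa, Xb1)   # local distance at one period
    d2 = distance_from_width(L, Xa, Xb2)   # local distance at the next period
    theta = math.atan2(d2 - d1, STW)       # depth change per period -> tilt
    return (d1 + d2) / 2.0, theta          # distance D and pan angle theta

# Example with assumed numbers: calibrated at L = 500 mm with Xa = 40 px,
# stripe period STW = 10 mm, measured widths 38 px and 36 px.
D, theta = distance_and_pan(500.0, 40.0, 38.0, 36.0, 10.0)
```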
  • the projection availability determination unit 270 identifies the portion of the two-dimensional image information in which the semiconductor wafer W is reflected, and determines whether the projection light source 204 can project the stripe ST onto the semiconductor wafer W reflected in the identified portion. When it determines that the projection light source 204 can project the stripe ST onto the semiconductor wafer W, the projection availability determination unit 270 outputs the projectable information indicating that fact to the process selection unit 260.
  • Although the method for specifying the portion in which the semiconductor wafer W is reflected is not limited, a method using template matching, which exploits the fact that the semiconductor wafer W has a specific shape, is described below.
  • the projection availability determination unit 270 includes an edge extraction unit 272 that extracts edges from the two-dimensional images related to the first image data D1, D3, D5, ..., a semiconductor wafer detection unit 274 that detects the contour of the semiconductor wafer W from the edges extracted by the edge extraction unit 272, and a position determination unit 276 that determines whether the position of the semiconductor wafer W detected by the semiconductor wafer detection unit 274 is within a reference range.
  • the edge extraction unit 272 extracts edges by a method using a differential filter or the like.
  • For example, when edges are extracted from a two-dimensional image IMG3 in which a disk-shaped object and a rectangular-plate-shaped object OB3 are reflected, as shown in FIG. 13, a two-dimensional image IMG4 from which circular and rectangular edges ED4 have been extracted is obtained.
  • the semiconductor wafer detection unit 274 detects an edge having a shape similar to a template registered in advance from the edges extracted by the edge extraction unit 272. Therefore, if the outline shape of the semiconductor wafer W is registered as a template, the semiconductor wafer detection unit 274 can detect the semiconductor wafer W and specify the position where the semiconductor wafer W is reflected in the two-dimensional image. For example, if the semiconductor wafer W is a disc, a circle may be registered as a template.
  • To cope with variations in the apparent size and orientation of the semiconductor wafer W, well-known image processing techniques such as affine transformation or deformation of the template may be employed.
  • the position determination unit 276 determines whether the projection light source 204 can project the stripe ST onto the semiconductor wafer W based on the position at which the semiconductor wafer W is reflected in the two-dimensional image, as specified by the semiconductor wafer detection unit 274. More specifically, the position determination unit 276 determines that the projection light source 204 can project the stripe ST onto the semiconductor wafer W if the semiconductor wafer W is reflected at the center of the two-dimensional image. Of course, a criterion based on the center of the two-dimensional image can be adopted only when the deviation between the optical axis OA of the camera 224 and the optical axis of the projection light source 204 is negligible; if it is not, the criterion is set in consideration of the deviation.
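A sketch of the detection-and-position check using OpenCV contour matching as a stand-in for the template matching described above; the library choice, thresholds, and center tolerance are assumptions:

```python
import cv2

def wafer_projectable(gray, template_contour, center_tol=0.15):
    edges = cv2.Canny(gray, 50, 150)       # edge extraction (unit 272)
    contours, _ = cv2.findContours(edges, cv2.RETR_LIST,
                                   cv2.CHAIN_APPROX_SIMPLE)
    h, w = gray.shape
    for c in contours:
        # Shape similarity to the registered template, e.g. a circle (unit 274).
        if cv2.matchShapes(c, template_contour, cv2.CONTOURS_MATCH_I1, 0) > 0.1:
            continue
        m = cv2.moments(c)
        if m["m00"] == 0:
            continue
        cx, cy = m["m10"] / m["m00"], m["m01"] / m["m00"]
        # Projectable only if the wafer appears near the image center (unit 276).
        if abs(cx - w / 2) < center_tol * w and abs(cy - h / 2) < center_tol * h:
            return True
    return False
```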
  • According to such a visual system 202, both the two-dimensional image information and the three-dimensional information can be generated as long as the stripe ST can be projected, so restrictions on the situations in which it can be used are reduced.
  • For example, the visual system 202 can generate both kinds of information even when a plurality of semiconductor wafers W having different sizes coexist.
  • Moreover, since it is not necessary to drive an optical system, both the two-dimensional image information and the three-dimensional information can be generated at high speed.
  • At calibration, a projection object for the stripe ST is placed in front of the robot hand 102 so that the optical axis OA of the camera 224 is perpendicular to the projection surface, and the distance L from the camera 224 to the projection surface is measured.
  • The camera 224 then performs imaging, the two-dimensional image related to the image data captured from the camera 224 is displayed on the monitor 250, and the stripe width Xa in the two-dimensional image is counted.
  • When the distance L and the stripe width Xa obtained in this way are input from the keyboard 246 into the image processing apparatus 232 and stored in the memory 236, the visual system 202 becomes able to generate highly accurate three-dimensional information.
  • FIG. 14 is a flowchart explaining the unit measurement operation of the industrial robot 1, in which the visual system 202 generates two-dimensional image information and three-dimensional information. FIG. 17 is a flowchart explaining the moving body prediction operation by which the industrial robot 1 grips the semiconductor wafer W; it is realized by repeating the unit measurement operation. FIG. 15 is a schematic diagram showing the positional relationship between the projection light source 204 and camera 224 and the semiconductor wafer W, and FIG. 16 shows two-dimensional images IMG5 and IMG6 related to the image data captured by the image processing apparatus 232.
  • When the industrial robot 1 starts the unit measurement operation, first, the illumination light source 214 is turned on in an odd-numbered frame FLp (step S101), and the camera 224 performs the first imaging, generating an image signal including the first two-dimensional image information in which the stripe ST is not reflected (step S102).
  • Next, the process selection unit 260 causes the projection availability determination unit 270 to process the first image data Dp captured by the image processing apparatus 232 as a result of the first imaging in step S102, and the projection availability determination unit 270 determines whether the stripe ST can be projected onto the semiconductor wafer W (step S103).
  • If, as shown in FIG. 15(a), the semiconductor wafer W is not directly in front of the camera 224 and is therefore reflected in the peripheral portion of the two-dimensional image IMG5 as shown in FIG. 16(a), the projection availability determination unit 270 determines that the stripe ST cannot be projected onto the semiconductor wafer W ("NO" in step S103), and the robot control device 132 operates the robot arm 116 to shift the projection destination of the projection light source 204 and the imaging range of the camera 224 (step S104).
  • the visual system 202 executes Step S101 again after Step S104 is completed.
  • In this way, the robot arm 116 operates until the stripe ST can be projected onto the semiconductor wafer W, so the stripe ST can be projected onto the semiconductor wafer W regardless of the initial positional relationship among the projection light source 204, the camera 224, and the semiconductor wafer W.
  • On the other hand, if the semiconductor wafer W is reflected at the center of the two-dimensional image, the projection availability determination unit 270 determines that the stripe ST can be projected onto the semiconductor wafer W ("YES" in step S103) and provides the projectable information to the process selection unit 260 (step S105).
  • Subsequently, the projection light source 204 is turned on in an even-numbered frame FLq (step S106), and the camera 224 performs the second imaging, generating an image signal including the second two-dimensional image information in which the stripe ST is reflected (step S107).
  • The frame FLq in which steps S106 to S107 are executed naturally comes after the frame FLp in which steps S101 to S102 are executed, but it need not be the frame immediately following the frame FLp. Accordingly, empty frames that are not processed may be inserted between the frames FLp and FLq, and in these empty frames the lighting of the projection light source 204 and the illumination light source 214 may be temporarily stopped to reduce the power consumption of the visual system 202.
  • Next, the process selection unit 260, having been given the projectable information, causes the three-dimensional information generation unit 262 to process the second image data Dq captured by the image processing device 232 as a result of the second imaging in step S107, and the three-dimensional information generation unit 262 generates three-dimensional information about the semiconductor wafer W (step S108).
  • The two-dimensional image information generated in step S102 and the three-dimensional information generated in step S108 by the visual system 202 are used for controlling the industrial robot 1. Note that this two-dimensional image information and three-dimensional information may also be displayed on the monitor 250.
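The flow of steps S101 to S108 can be summarized as the loop below. This is a sketch only: the robot, lights, camera, and vision objects and their methods are hypothetical placeholders for the hardware interactions the flowchart describes:

```python
def unit_measurement(robot, lights, camera, vision):
    while True:
        lights.illumination_on()            # S101: odd frame FLp
        first = camera.capture()            # S102: image without the stripe
        if vision.projectable(first):       # S103: wafer at the image center?
            break
        robot.shift_view()                  # S104: move the arm, then retry
    lights.projection_on()                  # S106: even frame FLq
    second = camera.capture()               # S107: image with the stripe
    return first, vision.generate_3d(second)  # S108: 2D info and 3D info
```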
  • <Moving body prediction operation of the industrial robot 1> As shown in FIG. 17, when performing the moving body prediction operation, the industrial robot 1 first executes a first unit measurement operation (step S111) and a second unit measurement operation (step S112) in sequence. The second unit measurement operation may be started after the first unit measurement operation has finished, or before it has finished.
  • When the first and second unit measurement operations are completed, the robot control device 132 predicts the position and posture of the semiconductor wafer W from the three-dimensional information obtained by the two operations (step S113). That is, the robot control device 132 calculates the distance D(t) at time t according to equation (4), and calculates the pan angle θ(t) at time t according to equation (5).
  • Here, the distance D1 and the pan angle θ1 are the distance and pan angle obtained by the first unit measurement operation, and the time t1 is the time at which the camera 224 performed imaging in the first unit measurement operation; likewise, the distance D2 and the pan angle θ2 are the distance and pan angle obtained by the second unit measurement operation, and the time t2 is the time at which the camera 224 performed imaging in the second unit measurement operation.
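Equations (4) and (5) are not reproduced in this text. Given the two timed measurements and the extrapolation described here, the natural reconstruction is a linear extrapolation:

$$D(t) = D_2 + \frac{D_2 - D_1}{t_2 - t_1}\,(t - t_2), \qquad \theta(t) = \theta_2 + \frac{\theta_2 - \theta_1}{t_2 - t_1}\,(t - t_2).$$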
  • Note that the distance D(t) and the pan angle θ(t) at time t may instead be calculated by extrapolating from three or more measured distances and pan angles.
  • When the prediction of the distance D(t) and the pan angle θ(t) at time t is completed, the robot control device 132 outputs control signals to the motors 112, 126, and 128 based on the prediction result, and the semiconductor wafer W is gripped (step S114).
  • In this way, the industrial robot 1 predicts the movement of the semiconductor wafer W and grips it.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Length Measuring Devices By Optical Means (AREA)
  • Measurement Of Optical Distance (AREA)
  • Image Processing (AREA)
  • Image Analysis (AREA)

Abstract

A process selection unit (260) causes a projection enabled/disabled judgment unit (270) to process first image data (D1, D3, D5, ...) obtained by digitizing an image signal generated by a camera (224) which has performed first imaging on frames (FL1, FL3, FL5, ...) in which a projection light source (204) is OFF and an illumination light source (214) is ON; and causes a 3D information generation unit (262) to process second image data (D2, D4, D6, ...) obtained by digitizing an image signal generated by the camera (224) which has performed second imaging on frames (FL2, FL4, FL6, ...) in which the illumination light source (214) is OFF and the projection light source (204) is ON. The 3D information generation unit (262) generates 3D information associated with a semiconductor wafer W from the image data (D2, D4, D6,...).

Description

Image information generating apparatus and image information generating method

The present invention relates to an image information generating apparatus and an image information generating method for generating both 2D image information and 3D information.

In order to make industrial robots and other robots intelligent, it is important to introduce visual systems that generate visual information into the robots. In particular, if a visual system capable of generating not only planar two-dimensional image information but also three-dimensional information, such as the distance to an object, is introduced into a robot, the autonomy of the robot can be greatly improved. This is clear from the fact that even the basic operation of gripping an object requires three-dimensional information such as the distance to the object.

A visual system that generates not only two-dimensional image information but also three-dimensional information generally determines parallax from two pieces of two-dimensional image information generated by two cameras installed at an interval, and derives the three-dimensional information from that parallax. However, such a visual system is expensive because it requires two cameras, and its introduction into robots has not progressed.

Therefore, a visual system capable of generating both 2D image information and 3D information with a single camera is desired.

Patent Document 1 is an example of such a visual system: one camera is installed tilted downward toward the front of an automobile, and the system generates information on the distance to an object by exploiting the fact that the position at which the object appears in the image moves vertically according to the distance to the object.

Patent Document 2 is also an example of such a visual system: one camera capable of zooming is installed, and the system generates information on the distance to an object by exploiting the fact that the amount by which the object's position in the image moves during zooming varies with the distance to the object.

Patent Document 1: JP 2006-31313 A. Patent Document 2: JP 2004-239791 A.

However, the visual system disclosed in Patent Document 1 is useful in limited situations, such as detecting the inter-vehicle distance between automobiles traveling on a flat road, but it is not very useful in complicated situations, such as generating information on the distance to an object during the complex work expected of an intelligent robot. In particular, having to fix the elevation angle of the camera limits the situations in which it can be used.

Further, since the visual system disclosed in Patent Document 2 takes time to drive the optical system for zooming, it is difficult for it to generate information on the distance to an object at high speed.

The present invention has been made to solve these problems, and an object thereof is to provide an image information generation device, such as a visual system, that can be configured at low cost, has few restrictions on the situations in which it can be used, and can generate both 2D image information and 3D information at high speed.
A first aspect of the image information generation device according to the present invention is an image information generation device that generates both two-dimensional image information and three-dimensional information, comprising: an imaging unit that performs imaging and generates two-dimensional image information; a projection light source that irradiates parallel light to project a geometric pattern; a three-dimensional information generation unit that generates three-dimensional information from the portion of the two-dimensional image information generated by the imaging unit that relates to the geometric pattern projected by the projection light source; a control unit that controls the lighting of the projection light source and the imaging timing of the imaging unit; and a process selection unit that selects the processing applied to the two-dimensional image information generated by the imaging unit. The control unit causes the imaging unit to perform first imaging while the projection light source is not lit, generating first two-dimensional image information, and causes the imaging unit to perform second imaging while the projection light source is lit, generating second two-dimensional image information. The process selection unit causes the three-dimensional information generation unit to process the second two-dimensional image information.

A second aspect of the image information generation device according to the present invention is the first aspect further comprising an illumination light source that emits illumination light illuminating the imaging range of the imaging unit, wherein the control unit also controls the lighting of the illumination light source and turns it on when causing the imaging unit to perform the first imaging.

A third aspect of the image information generation device according to the present invention is the first or second aspect wherein the projection light source includes a light-emitting diode and an optical system that converts the diffused light emitted by the light-emitting diode into parallel light.

A fourth aspect of the image information generation device according to the present invention is any one of the first to third aspects wherein the three-dimensional information generation unit generates information on the distance from the imaging unit to the object onto which the geometric pattern is projected, based on the length between reference locations of the geometric pattern shown in the two-dimensional image related to the second two-dimensional image information.

A fifth aspect of the image information generation device according to the present invention is any one of the first to third aspects wherein the three-dimensional information generation unit generates information on the posture of the object onto which the geometric pattern is projected, based on the distortion of the shape of the geometric pattern shown in the two-dimensional image related to the second two-dimensional image information.

A sixth aspect of the image information generation device according to the present invention is any one of the first to fifth aspects further comprising a projection availability determination unit that identifies the portion of the first two-dimensional image information in which an object having a specific shape is reflected and determines whether the projection light source can project the geometric pattern onto the object reflected in the identified portion, wherein the process selection unit causes the projection availability determination unit to process the first two-dimensional image information and causes the three-dimensional information generation unit to process the second two-dimensional image information only when the projection availability determination unit determines that the geometric pattern can be projected onto the object having the specific shape.

A first aspect of the image information generation method according to the present invention is an image information generation method for generating both two-dimensional image information and three-dimensional information, comprising the steps of: (a) performing first imaging to generate first two-dimensional image information; (b) irradiating parallel light to project a geometric pattern while imaging is not being performed in step (a); (c) performing second imaging while the geometric pattern is being projected in step (b), to generate second two-dimensional image information; and (d) generating three-dimensional information from the portion of the second two-dimensional image information generated in step (c) in which the geometric pattern projected in step (b) is reflected.
 この発明に係る画像情報生成装置の第1の態様ないし第6の態様及びこの発明に係る画像情報生成方法の第1の態様によれば、2次元画像情報を生成する撮像部が1台あれば2次元画像情報及び3次元情報の両方を生成することができるので、2次元画像情報及び3次元情報の両方を出力する画像情報生成装置を安価に構成することができる。また、この発明に係る画像情報生成装置の第1の態様によれば、幾何パターンを投影することができれば2次元画像情報及び3次元情報の両方を生成することができるので、使用することができる状況の制約を少なくすることができる。さらに、この発明に係る画像情報生成装置の第1の態様によれば、光学系の駆動が不要であるので、2次元画像情報及び3次元情報の両方を高速に生成することができる。 According to the first to sixth aspects of the image information generating device according to the present invention and the first aspect of the image information generating method according to the present invention, if there is one imaging unit that generates two-dimensional image information Since both two-dimensional image information and three-dimensional information can be generated, an image information generating apparatus that outputs both two-dimensional image information and three-dimensional information can be configured at low cost. Further, according to the first aspect of the image information generating apparatus of the present invention, both the two-dimensional image information and the three-dimensional information can be generated as long as the geometric pattern can be projected. Situation constraints can be reduced. Furthermore, according to the first aspect of the image information generating apparatus of the present invention, since it is not necessary to drive the optical system, both the two-dimensional image information and the three-dimensional information can be generated at high speed.
 この発明に係る画像情報生成装置の第2の態様によれば、第1の撮像を行うときに照明光が照射されるので、第1の2次元画像情報が環境光の影響を受けにくくなる。 According to the second aspect of the image information generating apparatus according to the present invention, since the illumination light is irradiated when performing the first imaging, the first two-dimensional image information is hardly affected by the environmental light.
 この発明に係る画像情報生成装置の第3の態様によれば、発光ダイオードの応答が高速であるので、投影用光源の点灯と消灯とを高速で切り替えることができ、2次元画像情報の生成と3次元情報の生成とを高速に切り替えることができる。また、発光ダイオードは理想的な点光源に近いので、簡単な光学系で拡散光を平行光に変換することができる。 According to the third aspect of the image information generating apparatus of the present invention, since the response of the light emitting diode is high speed, the projection light source can be switched on and off at high speed, and two-dimensional image information can be generated. The generation of three-dimensional information can be switched at high speed. Further, since the light emitting diode is close to an ideal point light source, the diffused light can be converted into parallel light with a simple optical system.
According to the fourth aspect of the image information generation device according to the present invention, the distance from the imaging unit to the projection target can be calculated by a simple computation, so the distance from the imaging unit to the projection target can be derived at high speed.
According to the fifth aspect of the image information generation device according to the present invention, the posture of the projection target can be calculated by a simple computation, so the posture of the projection target can be derived at high speed.
According to the sixth aspect of the image information generation device according to the present invention, the processing that generates three-dimensional information is not performed when the geometric pattern cannot be projected onto the object, so generation of unnecessary three-dimensional information can be prevented.
The objects, features, aspects and advantages of the present invention will become more apparent from the following detailed description and the accompanying drawings.
FIG. 1 is a schematic diagram of an industrial robot according to a preferred embodiment.
FIG. 2 is a block diagram showing the hardware configuration of a robot control device.
FIG. 3 is a schematic diagram of a projection light source.
FIG. 4 shows geometric patterns.
FIG. 5 is a time chart explaining the imaging timing of a camera and the lighting timing of the projection light source and the illumination light source.
FIG. 6 is a time chart explaining the imaging timing of a camera and the lighting timing of the projection light source and the illumination light source.
FIG. 7 is a time chart explaining the imaging timing of a camera and the lighting timing of the projection light source and the illumination light source.
FIG. 8 is a block diagram of a camera.
FIG. 9 is a block diagram showing the hardware configuration of an image processing device.
FIG. 10 is a block diagram showing the functions of an image processing device.
FIG. 11 is a schematic diagram explaining the pan angle θ and the tilt angle φ that represent the posture of a semiconductor wafer.
FIG. 12 shows two-dimensional images of a semiconductor wafer onto which stripes are projected.
FIG. 13 shows two-dimensional images before and after edge extraction.
FIG. 14 is a flowchart explaining the unit measurement operation of the industrial robot.
FIG. 15 is a schematic diagram showing the positional relationship between the projection light source and camera and a semiconductor wafer.
FIG. 16 shows two-dimensional images of image data captured by an image processing device 232.
FIG. 17 is a flowchart explaining the moving-object prediction operation of the industrial robot.
<1 Configuration of industrial robot 1>
FIG. 1 is a schematic diagram of an industrial robot 1 according to a preferred embodiment of the present invention. The industrial robot 1 receives a semiconductor wafer W from the preceding process and loads it into a carrier, and unloads the semiconductor wafer W from the carrier and delivers it to the subsequent process. However, this does not mean that the applications of the image information generation device and image information generation method according to the present invention are limited to the industrial robot 1. The image information generation device and image information generation method according to the present invention may be introduced into other types of industrial robots, into robots other than industrial robots, or into apparatuses that do not fall within the category of robots.
As shown in FIG. 1, the industrial robot 1 includes a robot hand 102 that grips a semiconductor wafer W to be processed, a robot arm 116 that changes the position and posture of the robot hand 102, a robot control device 132 that controls the robot hand 102 and the robot arm 116, and a vision system 202 that serves as an image information generation device generating two-dimensional image information and three-dimensional information on the area in front of the robot hand 102.
{Robot hand 102}
The robot hand 102 includes a finger 104 that rotates relative to an arm 122 about a joint 106, a finger 110 that rotates relative to the arm 122 about a joint 108, and a motor 112 that rotates the fingers 104 and 110. The robot hand 102 opens and closes the fingers 104 and 110 by having the motor 112 rotate them in accordance with a control signal given from the robot control device 132, thereby gripping or releasing the semiconductor wafer W between the fingers 104 and 110. Of course, the robot hand 102 may be of any type as long as it can grip or release the semiconductor wafer W according to the control signal.
{Robot arm 116}
The robot arm 116 includes an arm 118 that rotates relative to a base 130 about a joint 120, the arm 122 that rotates relative to the arm 118 about a joint 124, a motor 126 that rotates the arm 118, and a motor 128 that rotates the arm 122. The robot arm 116 changes the position and posture of the robot hand 102 by having the motor 126 rotate the arm 118 and the motor 128 rotate the arm 122 in accordance with control signals given from the robot control device 132. Of course, the robot arm 116 may be of any type as long as it can change the position and posture of the robot hand 102 according to the control signals.
{Robot control device 132}
FIG. 2 is a block diagram showing the hardware configuration of the robot control device 132.
As shown in FIG. 2, the robot control device 132 is a computer including a CPU 134, a memory 136, and a hard disk drive 138. The robot control device 132 outputs control signals to the motors 112, 126, and 128 in accordance with a robot control program 146 loaded from the hard disk drive 138 into the memory 136, thereby controlling the robot hand 102 and the robot arm 116. The industrial robot 1 thereby grips and transports the semiconductor wafer W in a work space WS. The robot control device 132 also calculates, from the two-dimensional image information and three-dimensional information transferred from an image processing device 232, the rotation amounts of the fingers 104 and 110 and the arms 118 and 122 necessary to grip the semiconductor wafer W, and outputs to the motors 112, 126, and 128 the control signals necessary to obtain those rotation amounts. The control signals from the robot control device 132 to the motors 112, 126, and 128 are output via an input/output interface 140, and the two-dimensional image information and three-dimensional information are transferred from the image processing device 232 to the robot control device 132 via a communication interface 142. The CPU 134, the memory 136, the hard disk drive 138, the input/output interface 140, and the communication interface 142 are connected to an internal bus 144 and can exchange data with one another.
{Vision system 202}
The vision system 202 images the area in front of the robot hand 102 and generates two-dimensional image information. In addition, the vision system 202 projects a geometric pattern GP onto the semiconductor wafer W in front of the robot hand 102 and generates three-dimensional information on the position and posture of the semiconductor wafer W from the imaging result of the geometric pattern GP. The vision system 202 is described in detail below.
<2 Configuration of vision system 202>
As shown in FIG. 1, the vision system 202 includes a projection light source 204 that projects the geometric pattern GP, an illumination light source 214 that illuminates the work space WS, and an LED driver 220 that supplies power for light emission to LEDs (light emitting diodes) 206 and 216, described later, provided in the projection light source 204 and the illumination light source 214, respectively. The vision system 202 also includes a camera 224 that images the area in front of the robot hand 102 and generates an image signal containing two-dimensional image information, and an image processing device 232 that processes the image signal generated by the camera 224.
{Projection light source 204}
FIG. 3 is a schematic diagram of the projection light source 204, shown in cross section.
The projection light source 204 includes the LED 206, an optical system 208 that converts the diffused light emitted by the LED 206 into parallel light, and a mask 210 in which the portions through which the light flux illuminating the bright areas of the geometric pattern GP to be projected passes are transmissive and the remaining portions are opaque. The projection light source 204 thereby irradiates parallel light and projects onto the semiconductor wafer W a geometric pattern GP whose size is constant regardless of the distance to the projection target. Using the LED 206 as the light emitting source of the projection light source 204 is not essential; an incandescent lamp, a fluorescent lamp, or the like can also be used. However, using the LED 206 as the light emitting source has the advantage that, because the LED 206 responds quickly, the projection light source 204 can be switched on and off at high speed, so generation of two-dimensional image information and generation of three-dimensional information, described later, can be switched at high speed. Moreover, since the LED 206 approximates an ideal point light source, its diffused light can be converted into parallel light with a simple optical system.
The "parallel light" needs only enough parallelism that the change in the size of the geometric pattern GP with distance from the projection light source 204 has no substantial effect on the accuracy of the three-dimensional information generated by the vision system 202. The significance of this is clear when one considers that it is pointless to demand of the three-dimensional information an accuracy exceeding that of the positioning and posture control of the robot hand 102.
The geometric pattern GP must include at least two reference features that specify where lengths are to be measured in a two-dimensional image in which the geometric pattern GP appears. The "at least two reference features" may be, for example, a point and a point, a point and a line, or a line and a line. A "point" includes the intersection of two lines, and a "line" should be understood to include the boundary line between a dark area and a bright area.
Furthermore, it is desirable that the geometric pattern GP be easy for the image processing device 232 to recognize. Examples of such a geometric pattern GP include stripes ST in which strip-shaped bright areas BR1 and dark areas DA1 of the same strip shape are arranged alternately in one direction, as shown in FIG. 4(a), and a check (checkered pattern) CH in which square bright areas BR2 and dark areas DA2 of the same square shape are arranged alternately in two orthogonal directions, as shown in FIG. 4(b). In the following description, the geometric pattern GP is assumed to be the stripes ST, and a boundary line BDL3 between a bright area BR1 and a dark area DA1 is used as the "reference feature".
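By way of illustration (not part of the original disclosure), the two patterns of FIG. 4 can be generated as binary arrays as in the following minimal sketch; the image size and pitch values are hypothetical.

```python
import numpy as np

def stripe_pattern(height, width, pitch):
    """Binary stripe mask: bright (1) and dark (0) bands of equal width
    `pitch` alternating in one direction, as in FIG. 4(a)."""
    cols = (np.arange(width) // pitch) % 2  # 0/1 alternation per band
    return np.tile(cols, (height, 1))

def check_pattern(height, width, pitch):
    """Binary checkered mask: bright and dark squares alternating in two
    orthogonal directions, as in FIG. 4(b)."""
    rows = (np.arange(height) // pitch) % 2
    cols = (np.arange(width) // pitch) % 2
    return (rows[:, None] + cols[None, :]) % 2

st = stripe_pattern(480, 640, pitch=16)  # stripes ST
ch = check_pattern(480, 640, pitch=16)   # check CH
```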
The projection light source 204 is fixed to the tip of the arm 122 and irradiates parallel light toward the area in front of the robot hand 102. The projection light source 204 thereby irradiates parallel light toward the central part of the imaging range of the camera 224 and projects the stripes ST onto a semiconductor wafer W present in the central part of the imaging range of the camera 224.
When the robot arm 116 grips the semiconductor wafer W, the semiconductor wafer W is in front of the arm 122, so a narrow flux of parallel light from the projection light source 204 suffices. The projection light source 204 can therefore be miniaturized to the extent that it can be built into the tip of the arm 122.
{Illumination light source 214}
The illumination light source 214 includes an LED 216 that emits diffused light. Using the LED 216 as the light emitting source of the illumination light source 214 is not essential; an incandescent lamp, a fluorescent lamp, or the like can also be used. However, using the LED 216 as the light emitting source has the advantage that, because the LED 216 responds quickly, the illumination light source 214 can be switched on and off at high speed, so generation of two-dimensional image information and generation of three-dimensional information can be switched at high speed.
The illumination light source 214 is fixed separately from the robot hand 102 and the robot arm 116 and emits diffused light toward the work space WS. The illumination light source 214 thereby irradiates substantially uniform illumination light toward the imaging range of the camera 224.
The "substantially uniform illumination light" needs only enough uniformity that a two-dimensional image in which the entire imaging range of the camera 224 is clearly visible can be obtained. Because illumination light is thus irradiated over the imaging range of the camera 224 when the first imaging, described later, is performed, the two-dimensional image information generated by the vision system 202 is less affected by ambient light. However, the illumination light source 214 can be omitted if a two-dimensional image in which the entire imaging range of the camera 224 is clearly visible can be obtained with ambient light alone.
{LED driver 220}
FIG. 5 is a time chart explaining the lighting timing of the projection light source 204 and the illumination light source 214 and the imaging timing of the camera 224.
The LED driver 220 alternately supplies power for light emission to the LEDs 206 and 216 in synchronization with a synchronization signal input from the camera 224, thereby lighting the projection light source 204 and the illumination light source 214 alternately. As shown in FIG. 5(a), the LED driver 220 lights the projection light source 204 in the even-numbered frames FL2, FL4, FL6, ... of the frames FL1, FL2, FL3, FL4, FL5, FL6, ... repeated at a fixed period, and extinguishes it in the odd-numbered frames FL1, FL3, FL5, .... Conversely, as shown in FIG. 5(b), the LED driver 220 lights the illumination light source 214 in the odd-numbered frames FL1, FL3, FL5, ... and extinguishes it in the even-numbered frames FL2, FL4, FL6, .... The length of the frames FL1, FL2, FL3, FL4, FL5, FL6, ... is typically about 1/30 second, but if the fast-responding LEDs 206 and 216 are used as the light emitting sources, it can be shortened to 1/60 second or 1/200 second.
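The frame-by-frame alternation of FIG. 5 can be sketched as follows; this is an illustrative model only, and the function name and frame-period constant are hypothetical, not part of the disclosure.

```python
FRAME_PERIOD_S = 1 / 30  # typical frame length; 1/60 or 1/200 with fast LEDs

def light_sources_for_frame(frame_index: int) -> dict:
    """Timing of FIG. 5: odd frames (FL1, FL3, ...) light the illumination
    source 214; even frames (FL2, FL4, ...) light the projection source 204."""
    is_even = frame_index % 2 == 0
    return {"projection_204": is_even, "illumination_214": not is_even}

for fl in range(1, 7):
    print(fl, light_sources_for_frame(fl))
# FL1..FL6 alternate: illumination, projection, illumination, ...
```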
Making the lighting time of the projection light source 204 the same as that of the illumination light source 214 as shown in FIGS. 5(a) and 5(b) is not essential; the two lighting times may differ. For example, as shown in FIGS. 6(a) and 6(b), with n a natural number, the projection light source 204 may be lit in the 3n-th frames FL3, FL6, ... and the illumination light source 214 may be lit in the (3n-2)-th and (3n-1)-th frames FL1, FL2, FL4, FL5, ....
Alternatively, if the light irradiated by the projection light source 204 is made stronger than that of the illumination light source 214, so that the camera 224 can image the geometric pattern GP clearly even when the projection light source 204 projects the geometric pattern GP onto the semiconductor wafer W while the illumination light source 214 is irradiating it with illumination light, then, as shown in FIGS. 7(a) and 7(b), the illumination light source 214 may be kept lit at all times while the projection light source 204 is lit in the even-numbered frames FL2, FL4, FL6, ... and extinguished in the odd-numbered frames FL1, FL3, FL5, ....
{Camera 224}
FIG. 8 is a block diagram of the camera 224. As shown in FIG. 8, the camera 224 includes a synchronization signal generator 226 that generates a synchronization signal controlling the imaging timing, and an imaging sensor 228 that performs imaging in synchronization with the synchronization signal input from the synchronization signal generator 226 and outputs an image signal. As shown in FIGS. 5(c), 6(c), and 7(c), the camera 224 thereby performs imaging in the frames FL1, FL2, FL3, FL4, FL5, FL6, ..., and, as shown in FIGS. 5(d), 6(d), and 7(d), outputs the image signal obtained by each imaging in the frames FL2, FL3, FL4, FL5, FL6, ..., one frame after the imaging.
{Image processing device 232}
FIG. 9 is a block diagram showing the hardware configuration of the image processing device 232. As shown in FIG. 9, the image processing device 232 is a computer including a CPU 234, a memory 236, and a hard disk drive 238. The image processing device 232 processes the image signal input from the camera 224 in accordance with an image processing program 258 loaded from the hard disk drive 238 into the memory 236. Making the robot control device 132 and the image processing device 232 separate computers is not essential; a single computer may have the functions of both. Furthermore, all or part of the functions of the image processing device 232 may be realized by dedicated hardware.
The image processing device 232 also includes an A/D converter 240 that converts the analog image signal input from the camera 224 into digital image data, and a frame memory 242 that stores the image data.
Furthermore, as a user interface, the image processing device 232 includes a keyboard 246 and a mouse 248 that accept an operator's operations, and a monitor 250 that displays information. The keyboard 246 and the mouse 248 are connected to the CPU 234, which performs processing according to the operations performed on them. The monitor 250 is connected to a video board 252 and displays two-dimensional image information, three-dimensional information, and other information in accordance with drawing commands given from the CPU 234. The two-dimensional image information and three-dimensional information are transferred from the image processing device 232 to the robot control device 132 via a communication interface 254. The CPU 234, the memory 236, the hard disk drive 238, the A/D converter 240, the frame memory 242, the video board 252, and the communication interface 254 are connected to an internal bus 256 and can exchange data with one another.
{Image processing in the image processing device 232}
FIG. 10 is a block diagram showing the functions realized when the image processing device 232 executes the image processing program 258.
As shown in FIG. 10, the image processing device 232 functionally includes a processing selection unit 260 that selects the processing applied to image data containing two-dimensional image information, a three-dimensional information generation unit 262 that generates three-dimensional information on the semiconductor wafer W, which is the object, and a projection feasibility determination unit 270 that determines whether the projection light source 204 can project the stripes ST onto the semiconductor wafer W.
{Processing selection unit 260}
When the LED driver 220 and the synchronization signal generator 226 perform the timing control shown in FIG. 5, the image data captured from the camera 224 via the A/D converter 240 fall into two classes: first image data D1, D3, D5, ..., obtained by digitizing the image signals the camera 224 generates by performing the first imaging in the frames FL1, FL3, FL5, ... in which the projection light source 204 is off and the illumination light source 214 is on; and second image data D2, D4, D6, ..., obtained by digitizing the image signals the camera 224 generates by performing the second imaging in the frames FL2, FL4, FL6, ... in which the illumination light source 214 is off and the projection light source 204 is on. Likewise, when the timing control shown in FIG. 6 is performed, the image data fall into first image data D1, D2, D4, D5, ..., generated by the first imaging in the frames FL1, FL2, FL4, FL5, ... in which the projection light source 204 is off and the illumination light source 214 is on, and second image data D3, D6, ..., generated by the second imaging in the frames FL3, FL6, ... in which the illumination light source 214 is off and the projection light source 204 is on. When the timing control shown in FIG. 7 is performed, the image data fall into first image data D1, D3, D5, ..., generated by the first imaging in the frames FL1, FL3, FL5, ... in which the projection light source 204 is off, and second image data D2, D4, D6, ..., generated by the second imaging in the frames FL2, FL4, FL6, ... in which the projection light source 204 is on.
In the following, image processing in the image processing device 232 is described taking the timing control of FIG. 5 as an example; when the timing control of FIG. 6 or FIG. 7 is performed, the first image data and the second image data can be processed in the same way.
The processing selection unit 260 has the projection feasibility determination unit 270 process the first image data D1, D3, D5, ... of the image data D1, D2, D3, D4, D5, D6, ..., and has the three-dimensional information generation unit 262 process the second image data D2, D4, D6, .... The vision system 202 can thereby generate both two-dimensional image information and three-dimensional information with only the single camera 224 that generates two-dimensional image information. This contributes to constructing at low cost a vision system 202 that generates both two-dimensional image information and three-dimensional information.
However, the processing selection unit 260 has the three-dimensional information generation unit 262 process the second image data D2, D4, D6, ... only when information indicating that the projection light source 204 has been determined able to project the stripes ST onto the semiconductor wafer W (hereinafter, "projectable information") has been input from the projection feasibility determination unit 270. Because the vision system 202 thus performs no three-dimensional information generation processing when the stripes ST cannot be projected onto the semiconductor wafer W, the generation of unnecessary three-dimensional information can be prevented.
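The routing performed by the processing selection unit 260 can be summarized in the following sketch. The class and method names are hypothetical stand-ins for the functional blocks of FIG. 10, under the FIG. 5 timing where odd frames yield first image data and even frames yield second image data.

```python
class ProcessingSelection:
    """Sketch of processing selection unit 260 under the FIG. 5 timing."""

    def __init__(self, feasibility_unit, info_3d_unit):
        self.feasibility_unit = feasibility_unit  # unit 270
        self.info_3d_unit = info_3d_unit          # unit 262
        self.projectable = False

    def on_frame(self, frame_index, image_data):
        if frame_index % 2 == 1:
            # First image data (illumination on): check projectability.
            self.projectable = self.feasibility_unit.can_project(image_data)
        elif self.projectable:
            # Second image data (stripes on): generate 3D information only
            # when projectable information was received, per the text above.
            return self.info_3d_unit.generate(image_data)
        return None
```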
{Three-dimensional information generation unit 262}
The three-dimensional information generation unit 262 includes an edge extraction unit 264 that extracts edges from the two-dimensional images of the second image data D2, D4, D6, ..., a boundary line detection unit 266 that detects, from the edges extracted by the edge extraction unit 264, the boundary lines BDL3 between the bright areas BR1 and dark areas DA1 of the stripes ST, and a three-dimensional information computation unit 268 that computes three-dimensional information on the semiconductor wafer W from the geometric relationship of the plural boundary lines BDL3.
The edge extraction unit 264 extracts edges by a method using a differential filter or the like.
The boundary line detection unit 266 detects the boundary lines by a method using template matching, a Hough transform, or the like.
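As a concrete illustration of units 264 and 266, the following is a minimal OpenCV sketch (one possible realization, not the patent's own implementation) that extracts edges with a differential (Sobel) filter and detects stripe boundary lines with a probabilistic Hough transform; the threshold values are hypothetical.

```python
import cv2
import numpy as np

def detect_stripe_boundaries(gray):
    """Edge extraction (unit 264) followed by Hough line detection (unit 266)."""
    # Horizontal intensity gradient responds to vertical stripe boundaries.
    grad = cv2.Sobel(gray, cv2.CV_16S, 1, 0, ksize=3)
    edges = cv2.convertScaleAbs(grad)
    _, edges = cv2.threshold(edges, 50, 255, cv2.THRESH_BINARY)
    lines = cv2.HoughLinesP(edges, 1, np.pi / 180, threshold=80,
                            minLineLength=40, maxLineGap=5)
    return [] if lines is None else [l[0] for l in lines]  # (x1, y1, x2, y2)
```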
The three-dimensional information computation unit 268 derives, as the three-dimensional information on the semiconductor wafer W, a distance D representing the position of the semiconductor wafer W and a pan angle θ representing its posture. The distance D represents the distance from the camera 224 to the semiconductor wafer W; in the industrial robot 1, since both the robot hand 102 and the camera 224 are fixed to the arm 122, the distance from the robot hand 102 to the semiconductor wafer W can also be calculated once the distance D is known. The pan angle θ, as shown in FIG. 11(a), represents the inclination of the semiconductor wafer W in the arrangement direction STA of the stripes ST with respect to the optical axis OA, that is, the inclination of the semiconductor wafer W in the arrangement direction STA from the state in which the optical axis OA and the principal surface of the semiconductor wafer W are perpendicular.
In the following examples of calculating the distance D and the pan angle θ, the deviation between the optical axis of the camera 224 and the optical axis of the projection light source 204 is assumed to be negligibly small.
FIG. 12 shows two-dimensional images IMG1 and IMG2 of the semiconductor wafer W onto which the stripes ST are projected. As shown in FIG. 12(a), when the boundary lines BDL1 are parallel in the two-dimensional image IMG1 and the interval between adjacent boundary lines BDL1 is constant, the three-dimensional information computation unit 268 judges that the semiconductor wafer W is not inclined with respect to the optical axis OA, that is, θ = 0, and calculates the distance D from the stripe width Xb according to equation (1).
D = L × (Xa / Xb)    (1)
Here, the stripe widths Xa and Xb in the two-dimensional image IMG1 are, as shown in FIG. 12(a), the length, expressed in number of pixels, between one boundary line BDL11 of the stripes ST shown in the two-dimensional image IMG1 and another boundary line BDL12 two boundaries away; the stripe width Xa is acquired at calibration time and the stripe width Xb is acquired at measurement time. The distance L is the distance, acquired at calibration time, from the camera 224 to the surface onto which the stripes ST are projected. Taking the interval to the boundary line two boundaries away as the stripe widths Xa and Xb is not essential; the interval to an adjacent boundary line, or to a boundary line three or more boundaries away, may also be used.
The distance D can be calculated by such a simple computation from the stripe width Xb in the two-dimensional image IMG1 because the stripes ST are projected with parallel light: the size of the stripes ST projected onto the semiconductor wafer W does not change with the distance D, while the stripe width Xb in the two-dimensional image IMG1 becomes smaller as the distance D increases. Needless to say, calculating the distance D by such a simple computation contributes to deriving the distance D at high speed.
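The inverse-proportional relation of equation (1) reduces to a one-line computation. The sketch below assumes equation (1) as reconstructed above; the function name and sample values are illustrative only.

```python
def distance_from_stripe_width(L_cal, Xa_cal, Xb_meas):
    """Equation (1): with parallel-light projection the physical stripe size
    is constant, so the imaged width scales as 1/distance and D = L * Xa / Xb."""
    return L_cal * Xa_cal / Xb_meas

# Calibration at L = 500 mm gave Xa = 40 px; measurement now reads Xb = 25 px.
print(distance_from_stripe_width(500.0, 40.0, 25.0))  # -> 800.0 (mm)
```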
On the other hand, as shown in FIG. 12(b), when the boundary lines BDL2 are parallel in the two-dimensional image IMG2 but the interval between adjacent boundary lines BDL2 is not constant, the three-dimensional information computation unit 268 judges that the semiconductor wafer W is inclined with respect to the optical axis OA, that is, θ ≠ 0, calculates the distance D from the stripe widths Xb1 and Xb2 according to equation (2), and calculates the pan angle θ from the distortion of the shape of the stripes ST according to equation (3). Here, "distortion of the shape" means the deviation from a shape similar to the stripes ST as projected onto a surface perpendicular to the optical axis of the projection light source 204.
[Equations (2) and (3), which give the distance D from the stripe widths Xb1 and Xb2 and the pan angle θ from the distortion of the stripe shape, appear only as images in the source and are not reproduced here.]
Here, the stripe widths Xb1 and Xb2 in the two-dimensional image IMG2 are, as shown in FIG. 12(b), the lengths, expressed in numbers of pixels, between the boundary line BDL21 of the stripes ST shown in the two-dimensional image IMG2 and the other boundary lines BDL22 and BDL23, each two boundaries away; they are acquired at measurement time. The stripe width STW is, as shown in FIG. 4(a), the interval between the boundary line BDL31 of the stripes ST as projected onto a surface perpendicular to the optical axis of the projection light source 204 and another boundary line BDL32 two boundaries away. This stripe width STW is determined by the width of the transmissive portions of the mask 210.
Although only the pan angle θ is calculated as the posture of the semiconductor wafer W in the example here, a tilt angle φ can also be calculated by the same method if the check CH is used as the geometric pattern GP or the orientation of the stripes ST is temporarily rotated by 90°. The tilt angle φ here, as shown in FIG. 11(b), represents the inclination of the semiconductor wafer W, with respect to the optical axis OA, in the direction perpendicular to the arrangement direction STA of the stripes ST.
{Projection feasibility determination unit 270}
The projection feasibility determination unit 270 identifies the portion of the two-dimensional image information in which the semiconductor wafer W appears and determines whether the projection light source 204 can project the stripes ST onto the semiconductor wafer W appearing in the identified portion. Furthermore, when the projection feasibility determination unit 270 determines that the projection light source 204 can project the stripes ST onto the semiconductor wafer W, it outputs projectable information indicating this to the processing selection unit 260.
The method of identifying the portion in which the semiconductor wafer W appears is not restricted, but a method using template matching, which exploits the fact that the semiconductor wafer W has a specific shape, is described below.
As shown in FIG. 10, the projection feasibility determination unit 270 includes an edge extraction unit 272 that extracts edges from the two-dimensional images of the first image data D1, D3, D5, ..., a semiconductor wafer detection unit 274 that detects the contour of the semiconductor wafer W from the edges extracted by the edge extraction unit 272, and a position determination unit 276 that determines whether the position of the semiconductor wafer W detected by the semiconductor wafer detection unit 274 falls within a reference range.
The edge extraction unit 272 extracts edges by a method using a differential filter or the like. For example, extracting edges from a two-dimensional image IMG3 showing a disk-shaped object and a rectangular plate-shaped object OB3, as shown in FIG. 13(a), yields a two-dimensional image IMG4 in which circular and rectangular edges ED4 have been extracted, as shown in FIG. 13(b).
The semiconductor wafer detection unit 274 detects, from the edges extracted by the edge extraction unit 272, an edge having a shape similar to a template registered in advance. Therefore, if the contour shape of the semiconductor wafer W is registered as the template, the semiconductor wafer detection unit 274 can detect the semiconductor wafer W and specify the position at which it appears in the two-dimensional image. For example, if the semiconductor wafer W is a disk, a circle may be registered as the template. Of course, well-known image processing techniques, such as deforming the template by affine transformation, may be employed to improve the detection capability.
The position determination unit 276 determines, based on the position at which the semiconductor wafer W appears in the two-dimensional image as specified by the semiconductor wafer detection unit 274, whether the projection light source 204 can project the stripes ST onto the semiconductor wafer W. More specifically, the position determination unit 276 determines that the projection light source 204 can project the stripes ST onto the semiconductor wafer W if the semiconductor wafer W appears in the central part of the two-dimensional image. Of course, a criterion based on the central part of the two-dimensional image can be adopted only when the deviation between the optical axis OA of the camera 224 and the optical axis of the projection light source 204 can be ignored; when the deviation cannot be ignored, the criterion is set with the deviation taken into account.
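Putting units 272 to 276 together, the following is one plausible OpenCV realization, a sketch under the assumption of a disk-shaped wafer; the Hough-circle parameters and the center margin are hypothetical, not from the patent.

```python
import cv2

def can_project_stripes(gray, center_margin=0.2):
    """Sketch of units 272/274/276: detect a disk-shaped wafer and judge
    whether it sits centrally enough for the stripes to land on it.
    HoughCircles runs an internal gradient/edge stage, which here plays
    the role of the edge extraction unit 272."""
    circles = cv2.HoughCircles(gray, cv2.HOUGH_GRADIENT,
                               dp=1, minDist=100,
                               param1=150, param2=40,
                               minRadius=30, maxRadius=0)
    if circles is None:
        return False                      # no wafer contour detected (274)
    cx, cy, _ = circles[0][0]
    h, w = gray.shape
    # Position determination (276): the wafer must appear in the central part.
    return (abs(cx - w / 2) < center_margin * w and
            abs(cy - h / 2) < center_margin * h)
```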
{Advantages of the vision system 202}
With such a vision system 202, both two-dimensional image information and three-dimensional information can be generated wherever the stripes ST can be projected, so the constraints on the situations in which it can be used are few. In particular, the vision system 202 can generate both two-dimensional image information and three-dimensional information at high speed even when semiconductor wafers W of different sizes are mixed. Moreover, with such a vision system 202, no driving of an optical system is required, so both two-dimensional image information and three-dimensional information can be generated at high speed.
<3 Calibration of vision system 202>
For the vision system 202 to generate highly accurate three-dimensional information and for the industrial robot 1 to operate accurately, the vision system 202 must be calibrated in advance and the distance L and the stripe width Xa registered in the vision system 202.
To calibrate the vision system 202, first, a target surface for the stripes ST is placed in front of the robot hand 102 so that the optical axis OA of the camera 224 is perpendicular to the target surface, and the distance L from the camera 224 to the target surface is measured. Next, the camera 224 is made to perform imaging, the two-dimensional image of the image data captured from the camera 224 is displayed on the monitor 250, and the stripe width Xa (in pixels) in the two-dimensional image is counted.
The distance L and the stripe width Xa obtained in this way are then input to the image processing device 232 from the keyboard 246 and stored in the memory 236, whereupon the vision system 202 becomes able to generate highly accurate three-dimensional information.
<4 Operation of industrial robot 1>
FIG. 14 is a flowchart explaining the unit measurement operation of the industrial robot 1 when the vision system 202 generates two-dimensional information and three-dimensional information, and FIG. 17 is a flowchart explaining the moving-object prediction operation of the industrial robot 1 for gripping the semiconductor wafer W, which is realized by repeating the unit measurement operation. FIG. 15 is a schematic diagram showing the positional relationship between the projection light source 204 and camera 224 and the semiconductor wafer W, and FIG. 16 shows the two-dimensional images IMG5 and IMG6 of image data captured by the image processing device 232.
{Unit measurement operation of industrial robot 1}
As shown in FIG. 14, when the industrial robot 1 starts the unit measurement operation, first, the illumination light source 214 is lit in an odd-numbered frame FLp (step S101), and the camera 224 performs the first imaging and generates an image signal containing first two-dimensional image information in which the stripes ST do not appear (step S102).
Next, the processing selection unit 260 has the projection feasibility determination unit 270 process the first image data Dp captured into the image processing device 232 by the first imaging in step S102, and the projection feasibility determination unit 270 determines whether the stripes ST can be projected onto the semiconductor wafer W (step S103). At this point, if the semiconductor wafer W is not directly in front of the camera 224 as shown in FIG. 15(a) and appears in the peripheral part of the two-dimensional image IMG5 as shown in FIG. 16(a), the projection feasibility determination unit 270 determines that the stripes ST cannot be projected onto the semiconductor wafer W ("NO" in step S103), and the robot control device 132 operates the robot arm 116 to shift the projection destination of the stripes from the projection light source 204 and the imaging range of the camera 224 (step S104). After step S104 ends, the vision system 202 executes step S101 again. With this loop of steps S101 to S104, the robot arm 116 keeps moving until the stripes ST can be projected onto the semiconductor wafer W, so the stripes ST can eventually be projected onto the semiconductor wafer W regardless of the initial positional relationship between the projection light source 204 and camera 224 and the semiconductor wafer W.
On the other hand, if the semiconductor wafer W is directly in front of the camera 224 as shown in FIG. 15(b) and appears in the central part of the two-dimensional image IMG6 as shown in FIG. 16(b), the projection feasibility determination unit 270 determines that the stripes ST can be projected onto the semiconductor wafer W ("YES" in step S103) and gives projectable information to the processing selection unit 260 (step S105).
Subsequently, the projection light source 204 is lit in an even-numbered frame FLq (step S106), and the camera 224 performs the second imaging and generates an image signal containing second two-dimensional image information in which the stripes ST appear (step S107). The frame FLq in which steps S106 and S107 are executed naturally comes after the frame FLp in which steps S101 and S102 were executed, but it need not be the frame immediately following the frame FLp. An empty frame in which no processing is performed may therefore be inserted between the frames FLp and FLq. The lighting of the projection light source 204 and the illumination light source 214 may be suspended temporarily in such an empty frame to reduce the power consumption of the vision system 202.
Next, the processing selection unit 260, having been given the projectable information, has the three-dimensional information generation unit 262 process the second image data Dq captured into the image processing device 232 by the second imaging in step S107, and the three-dimensional information generation unit 262 generates three-dimensional information on the semiconductor wafer W (step S108).
The two-dimensional image information generated by the vision system 202 in step S102 and the three-dimensional information generated in step S108 are used to control the industrial robot 1. This two-dimensional image information and three-dimensional information may also be displayed on the monitor 250.
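The unit measurement operation of FIG. 14 can be condensed into the following control-loop sketch; the camera, source, selector, and arm interfaces are hypothetical wrappers around the units described above, and the frame bookkeeping is simplified.

```python
def unit_measurement(camera, sources, selector, arm):
    """Sketch of steps S101-S108 of FIG. 14 (hypothetical interfaces)."""
    while True:
        sources.light_illumination()            # S101
        first_image = camera.capture()          # S102: stripes absent
        if selector.feasibility_unit.can_project(first_image):  # S103
            break                               # S105: projectable info
        arm.shift_view()                        # S104: move arm and retry
    sources.light_projection()                  # S106
    second_image = camera.capture()             # S107: stripes visible
    return selector.info_3d_unit.generate(second_image)  # S108
```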
{Moving-object prediction operation of industrial robot 1}
As shown in FIG. 17, when performing the moving-object prediction operation, the industrial robot 1 first executes a first unit measurement operation (step S111) and a second unit measurement operation (step S112) in sequence. The second unit measurement operation may be started after the first unit measurement operation has finished, but it may also be started before the first unit measurement operation finishes.
After the first and second unit measurement operations have finished, the robot control device 132 predicts the position and posture of the semiconductor wafer W from the three-dimensional information obtained by the two unit measurement operations (step S113). That is, the robot control device 132 calculates the distance D(t) at time t according to equation (4) and the pan angle θ(t) at time t according to equation (5).
D(t) = D1 + (D2 − D1) × (t − t1) / (t2 − t1)    (4)
θ(t) = θ1 + (θ2 − θ1) × (t − t1) / (t2 − t1)    (5)
Here, the distance D1 and the pan angle θ1 are the distance and pan angle obtained by the first unit measurement operation, and the time t1 is the time at which the camera 224 performed imaging in the first unit measurement operation. Similarly, the distance D2 and the pan angle θ2 are the distance and pan angle obtained by the second unit measurement operation, and the time t2 is the time at which the camera 224 performed imaging in the second unit measurement operation.
Extrapolation of second or higher order may be performed instead of the first-order extrapolation of equations (4) and (5). The distance D(t) and pan angle θ(t) at time t may also be calculated by extrapolating three or more distances and pan angles.
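The first-order extrapolation of equations (4) and (5) amounts to extending the line through two measurements. The sketch below assumes the equations as reconstructed above; the sample values are illustrative.

```python
def extrapolate_linear(v1, v2, t1, t2, t):
    """Equations (4)/(5): first-order extrapolation of a measured quantity
    (distance D or pan angle theta) from samples at times t1, t2 to time t."""
    return v1 + (v2 - v1) * (t - t1) / (t2 - t1)

# Wafer receding: D = 800 mm at t = 0.0 s and 820 mm at t = 0.1 s.
print(extrapolate_linear(800.0, 820.0, 0.0, 0.1, t=0.3))  # -> 860.0 (mm)
```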
When the prediction of the distance D(t) and pan angle θ(t) at time t is complete, the robot control device 132 outputs control signals to the motors 112, 126, and 128 based on the result of the prediction and grips the semiconductor wafer W (step S114).
The industrial robot 1 thereby predicts the movement of the semiconductor wafer W and grips it.
Although the present invention has been described in detail, the above description is illustrative in all aspects, and the present invention is not limited thereto. It is understood that countless variations not illustrated here can be envisaged without departing from the scope of the present invention.

Claims (7)

  1.  An image information generation device (202) that generates both two-dimensional image information and three-dimensional information, comprising:
      an imaging unit (224) that performs imaging and generates two-dimensional image information;
      a projection light source (204) that irradiates parallel light to project a geometric pattern;
      a three-dimensional information generation unit (262) that generates three-dimensional information from the portion of the two-dimensional image information generated by the imaging unit that relates to the geometric pattern (GP) projected by the projection light source;
      a control unit (220) that controls the lighting of the projection light source and the imaging timing of the imaging unit; and
      a processing selection unit (260) that selects the processing applied to the two-dimensional image information generated by the imaging unit,
      wherein the control unit causes the imaging unit to perform first imaging to generate first two-dimensional image information when the projection light source is not lit, and causes the imaging unit to perform second imaging to generate second two-dimensional image information when the projection light source is lit, and
      the processing selection unit causes the three-dimensional information generation unit to process the second two-dimensional image information.
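For orientation, the capture-and-route behavior recited in claim 1 might be sketched in Python as below. The class and method names (MeasurementCycle, capture, on, off, and the two processing callbacks) are hypothetical stand-ins for the claimed units, not interfaces from the disclosure.

    class MeasurementCycle:
        """One measurement cycle: capture with the projection light source
        off, then with it on, and hand only the patterned second image to
        3-D processing."""

        def __init__(self, camera, projection_light, process_2d, process_3d):
            self.camera = camera                      # imaging unit (224)
            self.projection_light = projection_light  # projection light source (204)
            self.process_2d = process_2d              # ordinary 2-D image processing
            self.process_3d = process_3d              # 3-D information generation unit (262)

        def run(self):
            # First imaging: light source off, so an ordinary 2-D view results.
            self.projection_light.off()
            first_image = self.camera.capture()

            # Second imaging: light source on, so the geometric pattern appears.
            self.projection_light.on()
            second_image = self.camera.capture()
            self.projection_light.off()

            # Process selection: only the patterned image feeds 3-D generation.
            self.process_2d(first_image)
            return self.process_3d(second_image)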
  2.  The image information generation device according to claim 1, further comprising:
      an illumination light source (214) that irradiates illumination light for illuminating the imaging range of the imaging unit,
      wherein the control unit controls the lighting of the illumination light source and turns on the illumination light source when causing the imaging unit to perform the first imaging.
  3.  The image information generation device according to claim 1, wherein the projection light source comprises:
      a light emitting diode (206); and
      an optical system (208) that converts the diffused light emitted by the light emitting diode into parallel light.
  4.  The image information generation device according to claim 1, wherein the three-dimensional information generation unit generates information on the distance from the imaging unit to the object onto which the geometric pattern is projected, based on the length between reference points of the geometric pattern appearing in the two-dimensional image represented by the second two-dimensional image information.
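A minimal sketch of the distance computation implied by claim 4, under a pinhole-camera assumption: because the geometric pattern is cast by parallel light, the physical spacing between its reference points on the object is the same at any range, so the apparent spacing in the image shrinks in proportion to 1/distance. The focal length and spacing values below are illustrative assumptions.

    def distance_from_spacing(pixel_spacing, real_spacing_mm=10.0,
                              focal_length_px=1200.0):
        """Pinhole estimate D = f * L / l, where L is the (range-independent)
        physical spacing between two reference points of the parallel-light
        pattern and l is their apparent spacing in pixels."""
        return focal_length_px * real_spacing_mm / pixel_spacing

    # Example: reference points 60 px apart -> 1200 * 10 / 60 = 200 mm away.
    print(distance_from_spacing(60.0))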
  5.  The image information generation device according to claim 1, wherein the three-dimensional information generation unit generates information on the posture of the object onto which the geometric pattern is projected, based on the distortion of the shape of the geometric pattern appearing in the two-dimensional image represented by the second two-dimensional image information.
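One way to turn the distortion of claim 5 into a posture estimate, sketched under assumptions that go beyond the claim: if the projected pattern is a circle, a surface tilted by angle α foreshortens it into an ellipse whose minor-to-major axis ratio is cos α. The circular pattern and the single-axis tilt model are illustrative choices, not the disclosed method.

    import math

    def tilt_from_aspect(major_axis_px, minor_axis_px):
        """Recover the tilt of the projection surface from the foreshortening
        of a projected circle: minor/major axis ratio equals cos(tilt)."""
        ratio = min(minor_axis_px / major_axis_px, 1.0)  # guard against noisy ratios > 1
        return math.degrees(math.acos(ratio))

    # Example: a circle imaged as a 100 x 87 px ellipse -> tilt of about 30 degrees.
    print(tilt_from_aspect(100.0, 87.0))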
  6.  The image information generation device according to any one of claims 1 to 5, further comprising:
      a projection availability determination unit (270) that identifies a portion of the first two-dimensional image information in which an object having a specific shape appears, and determines whether the projection light source can project the geometric pattern onto the object appearing in the identified portion,
      wherein the processing selection unit causes the projection availability determination unit to process the first two-dimensional image information, and causes the three-dimensional information generation unit to process the second two-dimensional image information when the projection availability determination unit determines that the geometric pattern can be projected onto the object having the specific shape.
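A minimal sketch of the gating recited in claim 6, with the detection routine and the projection-area test left as hypothetical callbacks: the plain first image is searched for the specifically shaped object, and the patterned second image is handed to 3-D processing only when the pattern can actually reach that object.

    def unit_measurement_with_gate(first_image, second_image,
                                   detect_target, inside_projection_area,
                                   generate_3d):
        """Claim 6 control flow: detect on the plain image, then generate
        3-D information from the patterned image only if the target lies
        where the geometric pattern can be projected."""
        region = detect_target(first_image)  # object with the specific shape, or None
        if region is None or not inside_projection_area(region):
            return None                      # pattern cannot reach the target
        return generate_3d(second_image, region)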
  7.  An image information generation method for generating both two-dimensional image information and three-dimensional information, comprising:
      (a) performing first imaging to generate first two-dimensional image information (S102);
      (b) irradiating parallel light to project a geometric pattern (GP) while imaging is not being performed in step (a) (S106);
      (c) performing second imaging to generate second two-dimensional image information while the geometric pattern is being projected in step (b) (S107); and
      (d) generating three-dimensional information from the portion of the second two-dimensional image information generated in step (c) in which the geometric pattern projected in step (b) appears (S108).
PCT/JP2008/069345 2008-02-08 2008-10-24 Image information generation device and image information generation method WO2009098803A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2008-028689 2008-02-08
JP2008028689A JP2009186404A (en) 2008-02-08 2008-02-08 Image information creating device and image information creating method

Publications (1)

Publication Number Publication Date
WO2009098803A1 true WO2009098803A1 (en) 2009-08-13

Family

ID=40951888

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2008/069345 WO2009098803A1 (en) 2008-02-08 2008-10-24 Image information generation device and image information generation method

Country Status (2)

Country Link
JP (1) JP2009186404A (en)
WO (1) WO2009098803A1 (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104931039A (en) * 2014-03-17 2015-09-23 纬创资通股份有限公司 Free space positioning method and system
WO2021199744A1 (en) * 2020-04-03 2021-10-07 株式会社Xtia Measurement device, measurement method, and program

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP5587137B2 (en) * 2010-10-29 2014-09-10 キヤノン株式会社 Measuring apparatus and measuring method
KR101218566B1 (en) * 2011-03-16 2013-01-15 성균관대학교산학협력단 Apparatus and method for measuring 3-dimension distance

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH01203907A (en) * 1988-02-10 1989-08-16 Aisin Seiki Co Ltd Three-dimensional position and attitude measuring method
JPH03289505A (en) * 1990-04-06 1991-12-19 Nippondenso Co Ltd Three-dimensional shape measuring apparatus
JP2000287120A (en) * 1999-03-30 2000-10-13 Minolta Co Ltd Three-dimensional information input camera
JP2003270719A (en) * 2002-03-13 2003-09-25 Osaka Industrial Promotion Organization Projection method, projector, and method and system for supporting work

Also Published As

Publication number Publication date
JP2009186404A (en) 2009-08-20

Similar Documents

Publication Publication Date Title
US11511421B2 (en) Object recognition processing apparatus and method, and object picking apparatus and method
US7526121B2 (en) Three-dimensional visual sensor
JP6061616B2 (en) Measuring apparatus, control method therefor, and program
JP5570126B2 (en) Method and apparatus for determining the posture of an object
JP6338421B2 (en) Information processing apparatus, information processing apparatus control method, gripping system, and program
JP6703812B2 (en) 3D object inspection device
JP2007206797A (en) Image processing method and image processor
WO2013008804A1 (en) Measurement device and information processing device
JP2005072888A (en) Image projection method and image projection device
JP2004090183A (en) Article position and orientation detecting device and article taking-out device
CN110340883A (en) Information processing unit, information processing method and computer readable storage medium
US11590657B2 (en) Image processing device, control method thereof, and program storage medium
WO2009098803A1 (en) Image information generation device and image information generation method
US10607337B2 (en) Object inspection system and object inspection method
US10656097B2 (en) Apparatus and method for generating operation program of inspection system
JP2007093412A (en) Three-dimensional shape measuring device
JP2015010845A (en) Information processing unit, instrumentation system, control system, light volume determination method, program and storage medium
KR100698535B1 (en) Position recognition device and method of mobile robot with tilt correction function
US11282187B2 (en) Inspection system, inspection apparatus, and method using multiple angle illumination
JP2014029268A (en) Semiconductor integrated circuit and object distance measuring instrument
JP6596530B2 (en) Method and apparatus
JP7438734B2 (en) Tip member orientation recognition method, tip member orientation method, tip member insertion method, tip member orientation recognition device, and tip member orientation system
JP7502641B2 (en) Recognition System
WO2022168617A1 (en) Workpiece detection device, workpiece detection method, workpiece detection system, and workpiece detection program
JP2018194544A (en) Information processor, method for processing information, and computer program

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 08872251

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 08872251

Country of ref document: EP

Kind code of ref document: A1