WO1990007690A1 - Sensor and system for setting three-dimensional position - Google Patents
Sensor and system for setting three-dimensional position
- Publication number
- WO1990007690A1 (PCT/JP1986/000033)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- head
- image
- light
- dimensional position
- teaching
- Prior art date
Links
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/70—Determining position or orientation of objects or cameras
Definitions
- the present invention measures, without contact, the relative position of the sensor main body to the workpiece and the inclination angle of the sensor main body to the workpiece.
- the present invention relates to a three-dimensional position sensor that can perform such measurement, and also relates to a three-dimensional position setting system for teaching a working posture and an assembly posture to a robot or the like by using this three-dimensional position sensor.
- teaching is generally performed manually by an operator.
- during teaching, the operator operates the teaching pendant while watching the machining head and the object.
- the machining head is positioned so that its three-dimensional position has a prescribed relative relationship to the above-mentioned teaching point.
- the above-mentioned three-dimensional position information is then stored.
- a magnetic sensor placed near the machining head is used.
- the magnitude of the eddy current generated in the work is detected, and the distance between the machining head and the work surface is calculated from the detected value. Based on this calculated value, the facing distance of the machining head to the workpiece is set.
- the purpose of the present invention is to provide a three-dimensional position sensor that can quickly and accurately detect the position of the sensor body with respect to the workpiece and the inclination angle of the sensor body with respect to the workpiece. Another object of the present invention is to provide a three-dimensional position setting system that uses the above-mentioned three-dimensional position sensor to teach a robot or the like accurately and in a short time without requiring skill.
- a three-dimensional position sensor according to the invention detects a three-dimensional position and orientation with respect to a position set point displayed on an object, and comprises a sensor body and projectors that obliquely project light onto the object to form bright spots,
- an image pickup means for picking up an image including the bright spots thus formed, the position set point displayed on the object, and the surface of the object, and
- an image processing means that electrically reads the bright spots and the position set point picked up by the image pickup means and detects the three-dimensional position and orientation of the sensor body with respect to the position set point.
- the three-dimensional position setting system detects the three-dimensional position and orientation of the head with respect to a position set point displayed on the object, and comprises an arbitrarily positionable head, projectors that are placed on the head, obliquely project light onto the object, and form on the surface of the object at least a plurality of bright spots constituting the vertices of a triangle,
- image processing means for electrically reading the imaged bright spots and position set point and detecting the three-dimensional position and orientation of the head with respect to the position set point,
- storage means for storing information on the three-dimensional position and orientation of the head with respect to the position set point detected by this image processing means, and
- driving means for reading the information stored in the storage means and causing the head to face the position set point in a predetermined three-dimensional position and posture based on that information.
- Fig. 1 is a perspective view showing the head portion of a robot in which the three-dimensional position sensor and the three-dimensional position setting system according to the present invention are adopted, and Fig. 2 is a perspective view showing the structure of the three-dimensional position sensor.
- Figs. 3 and 4 show the structure of the teaching head: Fig. 3 is a sectional view taken along the line III-III in Fig. 2, and Fig. 4 is a sectional view taken along the line IV-IV in Fig. 3.
- Fig. 5 is a block diagram showing the structure of the image processing circuit, and
- Figs. 6A, 6B, 7A, 7B, 8 and 9A to 9C are diagrams showing the forms of image signals, respectively.
- Fig. 10 is a front view showing an example of the arrangement of conventional reference position marks and bright spots, used to explain the effect of the second embodiment.
- Figs. 11A and 11B illustrate the gap between the head and the work position, respectively.
- Fig. 12 is a cross-sectional view showing a second embodiment of the present invention in a cross-section similar to Fig. 4, together with a front view showing a display image of the image display.
- Fig. 13 is a front view showing the three-dimensional position sensor of the third embodiment.
- Fig. 14 is a vertical sectional view of the teaching head.
- Fig. 15 is a sectional view taken along the line XV-XV in Fig. 14.
- Figs. 16A and 16B are a sectional view and a front view, respectively, for explaining the operation of the third embodiment.
- Fig. 17 is a view of the fourth embodiment.
- Fig. 18 is a cross-sectional view taken along the line XVIII-XVIII in Fig. 17, and
- Figs. 19 and 20 are a sectional view and a front view, respectively, for explaining the operation of the fourth embodiment.
- Fig. 21 shows a three-dimensional position setting system according to the invention.
- Figs. 22A, 22B and 23 are diagrams for explaining the operating principle of the three-dimensional position setting system.
- Fig. 24 is a front view showing the display screen of the image display 7.
- Fig. 25 is a flow chart showing the teaching processing operation.
- Fig. 26 is a flow chart showing the teaching processing operation of the three-dimensional position setting system of another embodiment.
- Fig. 27 is a flow chart showing the subroutine of step Q17.
- Fig. 28 is a flow chart showing the subroutine of step Q18.
- Fig. 29 is a flow chart showing the teaching processing operation in an embodiment that does not use an image display.
- Fig. 1 shows an example of the structure of the head portion of a cutting robot equipped with the three-dimensional position sensor and the three-dimensional position setting system according to the present invention.
- reference symbol 1 indicates the head body.
- the laser processing head 2 is attached to one mounting surface of this head body 1, and the teaching head 3 is attached to the other mounting surface.
- the head body 1 is rotatably connected to the robot wrist 4 through the turning mechanism 6.
- by rotating the head body 1 via the turning mechanism 6,
- the teaching head 3 can be made to face the surface of the work 9 during teaching, and the laser processing head 2 can be made to face the surface of the work 9 during cutting.
- Fig. 2 shows a schematic example of the overall configuration of the three-dimensional position sensor and the three-dimensional position setting system of this embodiment.
- the three-dimensional position setting system comprises the imager 35, which is provided in the teaching head 3 and will be described later,
- an image display 7 that displays the image taken by the imager 35, a reference position pattern
- generation circuit 8 that displays a reference position pattern on the image display 7, and an image processing circuit 8a that separates the image signal of the bright spots, used for setting the facing distance and facing posture of the teaching head 3 with respect to the work, from the image signal of background light such as illumination, and measures their positions.
- the teaching head 3 comprises a head housing 30, four projectors 31 to 34 and an imager 35 provided in this housing 30, and an illuminator 36 that illuminates the surface of the work.
- the structure is as shown in Figs. 3 and 4.
- each of the projectors 31 to 34 combines a semiconductor laser device 31a to 34a with a reflecting mirror 31b to 34b.
- the semiconductor laser devices 31a to 34a are mounted and fixed on the inner wall surface of the head housing 30 at equal intervals, and the reflecting mirrors 31b to 34b are
- fixed by holders 31c to 34c to the tip of the head housing 30 so that the laser light from the semiconductor laser devices 31a to 34a is reflected toward the work.
- the projector 31 forms a bright spot S1 for setting the facing distance as shown in Fig. 2, and the other projectors 32 to 34 form bright spots S2 to S4 for setting the posture, respectively.
- the imager 35 is a solid-state camera using the solid-state image pickup device 35a, and is mounted at the center of the head housing 30 by a camera holder.
- this imager 35 images, on the surface of the work 9,
- the bright spots S1 to S4 formed by the above-mentioned projectors 31 to 34 and the teaching point P on the marking line K previously drawn on the surface of the work.
- a plurality of illuminators 36 that illuminate the surface of the work 9 are fixed to the tip of the head housing 30 by a holder 36a.
- 37 is a protective optical filter.
- 3 is the teaching head.
- the reference position pattern generating circuit 8 generates a first reference position mark M1, used to set the facing distance of the head with respect to the work 9, and three
- second reference position marks M2 to M4, used to set the head posture with respect to the work.
- this reference position pattern generation circuit 8 displays each of these reference position marks M1 to M4 at a predetermined position on the display screen of the image display 7, as shown in Fig. 2, by superimposing them on the captured image signal from the imager 35.
- the image processing circuit 8a detects the positions of the bright spots S1 to S4 on the image display 7 based on the captured image signal from the imager 35; its details are shown in Fig. 5.
- Fig. 5 shows an example of the block configuration of the image processing circuit 8a.
- reference numeral 40 is a clock signal generator that generates a clock signal serving as the reference for determining the horizontal position on the screen when extracting an image from the imager 35.
- reference numeral 41 is a horizontal sync signal generator that counts the number of scanning lines on the display screen and generates a horizontal sync signal serving as the reference for determining the vertical position on the screen.
- reference numeral 42 indicates a vertical sync signal generator that counts the number of screens and
- generates a vertical sync signal.
- the clock signal, horizontal sync signal and vertical sync signal output from these generators are input to the imager 35.
- reference numeral 48 indicates an image memory counter to which the clock signal, the horizontal sync signal and the vertical sync signal are input.
- this image memory counter 48 determines the memory address at which the digitized image signal is stored.
- reference numeral 49 indicates an image memory down counter to which the above clock signal, horizontal sync signal and vertical sync signal are input.
- this image memory down counter 49 counts the address for reading out data from the image memory 51 described later.
- reference numeral 50 denotes an analog-digital converter to which the image signal from the imager 35 is input, the address for the image signal being supplied from the image memory counter 48.
- reference numeral 51 denotes an image memory in which the digital signal output from the analog-digital converter 50 is stored, based on a write signal 52b from a computer 52 described later, at the address specified by the image memory counter 48, and from which the data corresponding to the address specified by the image memory down counter 49 is read out by a readout signal 52a from the computer 52.
- reference numeral 53 denotes an image signal differentiator that extracts the difference between the output signal of the analog-digital converter 50 and the data read from the image memory 51.
- reference numeral 55 indicates an image signal level comparator that performs a level comparison between the output signal of the image signal differentiator 53 and the reference signal output from the reference signal generator 54.
- reference numeral 57 denotes a horizontal position counter that counts the clock signal output from the clock signal generator 40 as the horizontal position and is cleared for each scanning line by the horizontal sync signal output from the horizontal sync signal generator 41.
- reference numeral 60 denotes a vertical position counter that counts the horizontal sync signal from the horizontal sync signal generator 41 as the vertical position and is cleared for each screen by the vertical sync signal output from the vertical sync signal generator 42.
- reference numeral 59 denotes a horizontal position latch that latches the horizontal position signal output from the horizontal position counter 57 according to the output signal of the image signal level comparator 55 and is cleared by the latch clear signal 52c from the computer 52; reference numeral 62 indicates a vertical position latch that
- latches the vertical position signal output from the vertical position counter 60 according to the output signal of the image signal level comparator 55 and is likewise cleared by the latch clear signal 52c from the computer 52.
- reference numeral 52 denotes a computer that takes in the bright spot data corresponding to the horizontal and vertical positions latched by the horizontal position latch 59 and the vertical position latch 62, together with the vertical sync signal output from the vertical sync signal generator 42. This computer 52 calculates the facing distance and the facing posture of the work 9 from the bright spot data by the method described later, and sends the calculation result to the image display 7 or the robot.
- a three-dimensional position sensor is constituted by the projectors 31 to 34, the imager 35, and the image processing circuit 8a.
- in Fig. 2 it is assumed that the teaching head 3 is fixed in the illustrated state and that the facing position and the facing posture of the teaching head 3 with respect to the work 9 can be adjusted.
- the work 9 is irradiated.
- a state in which the facing distance of the work 9 is a predetermined value and the facing posture is correct is defined by the work 9 shown by a dashed line in Fig. 6A and the teaching point P0.
- the bright spots S1 to S4 are formed at the intersections of the oblique beams and the work 9 located at the solid line in the figure. Therefore, a change of the facing distance or the facing posture of the work 9 appears as a deviation of the bright spots S1 to S4 on the image display 7. For example, if the facing distance is longer than at the reference position, the bright spots are located outside the reference marks M1 to M4 as shown in Fig. 6B. Further, when the work 9 is tilted in the vertical direction of the image display 7, as shown in Figs. 7A and 7B, the bright spot S2 is arranged inside the reference mark M2,
- and S3 and S4 are arranged offset in the negative direction with respect to the reference marks M3 and M4.
- since each bright spot occurs at the intersection of an oblique beam and the work 9, measuring the deviation of the bright spot on the image display 7 makes it possible to obtain, by proportional calculation from the triangle similarity rule applied to the reference triangle, the coordinates of each bright spot S1 to S4 in three-dimensional space. Moreover, the facing distance of the work can be calculated from the three-dimensional position of the bright spot S1, and the facing posture of the work can be calculated from the bright spots S1 to S4.
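As a rough numerical sketch of this proportional calculation: an oblique beam inclined at a fixed angle to the camera axis shifts its bright spot laterally in proportion to the change in facing distance. The beam angle and screen scale below are illustrative assumptions, not values from the patent.

```python
import math

def spot_height_change(dy_screen, beam_angle_deg, screen_per_work_mm=1.0):
    """Similar-triangles rule: a surface displacement dz along the camera
    axis shifts the spot of a beam at `beam_angle_deg` by dz * tan(angle)
    on the work, which appears scaled on the display."""
    dy_work = dy_screen / screen_per_work_mm   # deviation mapped onto the work
    return dy_work / math.tan(math.radians(beam_angle_deg))

# a spot 2 mm outside its reference mark, 45-degree beam, 1:1 optics:
dz = spot_height_change(2.0, 45.0)             # facing distance is 2 mm too long
```

With a shallower beam the same height change produces a larger on-screen deviation, which is why oblique projection gives good distance sensitivity.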
- an image in which no bright spots are captured, such as the image shown in Fig. 9A, and
- an image in which each bright spot is captured, for example
- the image shown in Fig. 9B, are obtained in sequence.
- the image signal differentiator 53 calculates the difference between the image in which the bright spots are not captured and the image in which they are captured. The image signal differentiator 53 thus obtains an image of only the bright spots, from which the background light component has been removed, for example the image shown in Fig. 9C.
- the image signal level comparator 55 then compares the reference signal output from the reference signal generator 54 with the bright spot image signal. If an image signal including background light, for example the image signal of the image shown in Fig. 6B, were input directly to the image signal level comparator 55, background light that is not at a true bright spot position might be mistakenly detected as a bright spot position. With the signal processing described above, however, the correct bright spot positions are detected.
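The differencing-and-comparison scheme that the image signal differentiator 53 and level comparator 55 perform in hardware can be mimicked in a few lines of array code; the frame sizes, brightness values and threshold below are arbitrary toy values.

```python
import numpy as np

def extract_spots(background, lit, threshold):
    """Subtract the frame taken with the projectors off from the frame
    taken with them on (differentiator 53), then level-compare against a
    reference threshold (comparator 55) and report spot coordinates."""
    diff = lit.astype(np.int16) - background.astype(np.int16)
    mask = diff > threshold                     # level comparison
    ys, xs = np.nonzero(mask)                   # positions that would be latched
    return list(zip(xs.tolist(), ys.tolist()))

# toy 5x5 frames: uniform background light of 40, one spot of 200 at x=3, y=1
bg = np.full((5, 5), 40, dtype=np.uint8)
on = bg.copy()
on[1, 3] = 200
spots = extract_spots(bg, on, threshold=50)     # -> [(3, 1)]
```

Because the background level is identical in both frames, it cancels exactly in the difference, so even bright ambient regions cannot be mistaken for projected spots.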
- when a bright spot is detected, the horizontal position of the bright spot is latched by the horizontal position latch 59, and
- the vertical position of the bright spot is latched by the vertical position latch 62, so that the correct position of the bright spot is held.
- the bright spot position data stored in the horizontal position latch 59 and the vertical position latch 62 are read by the computer 52 as appropriate, and the
- horizontal and vertical position data of the bright spots S1 to S4 are substituted into predetermined formulas based on the above-mentioned triangle similarity rule to obtain the facing distance and
- the facing posture of the work 9. The calculation result is then displayed on the image display 7 through an appropriate signal line.
- the operator sees the data of the facing distance and the facing posture of the work 9 displayed on the image display 7, and sees the deviation between the reference marks M1 to M4 and the bright spots S1 to S4.
- the position of the robot can then be corrected so that the robot has the correct facing distance and facing posture at the teaching point P0 of the work.
- alternatively, the data of the facing distance and facing posture of the work 9 calculated by the image processing circuit 8a in the three-dimensional position sensor can be transmitted to the robot controller, and the robot can be made to correct the facing distance and facing posture automatically.
- the operator reads the facing distance and facing posture data of the work 9 while visually observing the display image of the image display 7, and
- adjusts the three-dimensional position of the head body 3 so that each bright spot S1 to S4 and the teaching point P0 are aligned with the reference marks M1 to M4. It is therefore possible to perform the teaching accurately and in a short time, without relying on the operator's intuition as in the conventional way, and the teaching efficiency is improved.
- if the facing distance and facing posture data of the work 9 obtained by the image processing circuit 8a are sent to the robot controller and the three-dimensional position of the head body 3 is adjusted automatically, the efficiency of the teaching can be improved still further.
- in addition, since the influence of background light is removed by the image signal differentiator 53, the bright spot positions can be measured accurately; this prevents erroneous teaching by the operator due to mismeasurement, and also prevents runaway due to mismeasurement when the measurement data are sent to the robot controller and the three-dimensional position is adjusted automatically.
- the present invention is not limited to the configuration of the above-mentioned embodiment.
- in the first embodiment the bright spots S1 to S4 are arranged around the teaching point P0, and the facing distance and the facing posture are measured from the average value of the bright spot positions.
- the second embodiment provides the bright spot S1 and the first reference position mark M1 dedicated to the setting of the facing distance, which has the following effect.
- the teaching point P may be set on a small convex portion of the work.
- if the facing distance and facing posture were determined from the four bright spots and the reference position marks as shown in Fig. 10,
- the distance from the machining head to the teaching point would differ from the specified value when laser cutting is performed; as a result, the laser would be out of focus and the prescribed cutting could not be performed.
- moreover, the tip of the machining head might touch the convex portion of the work, and if the head were damaged, this could cause malfunctions.
- in the second embodiment, therefore, the position of the teaching point P and the reference position mark M1 are made to coincide, and the bright spot S1 is made to coincide with this reference position mark M1, whereby
- the facing distance is set. Even if the teaching point P is on a small convex portion as shown in Fig. 11A, it is therefore possible to set the facing distance to the teaching point P on this convex portion. This prevents teaching at the wrong position, and as a result accurate teaching can always be performed regardless of the condition of the work surface.
- a semiconductor laser having the same wavelength may be used for the projectors 31 to 34 and the illuminator 36, and an optical filter 39 may be
- disposed in front of the solid-state imaging device 35a. This optical filter 39 guides only the wavelength component of the output light of the projectors 31 to 34 to the image sensor 35a. Therefore, even if ambient light such as sunlight is strong, the extraneous light is blocked by the optical filter.
- the teaching head 3 may also be constructed by using optical fibers.
- in the configuration of the third embodiment, the same components as those of the first embodiment are designated by the same reference numerals, and their description is omitted.
- the three-dimensional position sensor of the third embodiment comprises the teaching head 3, a
- sensor drive unit 104, a sensor controller 105 that incorporates the image processing circuit 8a and the reference position pattern generation circuit 8 and controls the sensor and the sensor drive unit 104, and
- the image display 7, which is a CRT display for displaying the captured image.
- Fig. 14 shows
- a longitudinal section of the teaching head of Fig. 13, and Fig. 15 shows a section taken along the line XV-XV in Fig. 14.
- the inside of the teaching head 3 contains four projection optical fiber systems 131 to 134 and an imaging optical fiber system 135. It should be noted that in Fig. 15 the projection optical fiber system 132 is hidden directly under 131.
- the projection optical fiber systems 131 to 134 are provided with reflecting mirrors fixed in grooves and holes formed inside the teaching head 3.
- each projection optical fiber system 131 to 134 is optically coupled to the semiconductor laser devices 31a to 34a of the sensor drive unit 104.
- the reflectors 131b, 132b, 133b, 133b-1, 134b and 134b-1 are fixed by holders.
- the projection optical fiber systems 131 to 134 guide the laser beams emitted from the semiconductor laser devices 31a to 34a through lenses 131d to 134d.
- the projection optical fiber system 131 forms the bright spot S1 for setting the facing distance, and the other projection optical fiber systems 132 to 134 form the bright spots S2 to S4 for setting the posture, respectively.
- the imaging optical fiber system 135 has an imaging lens 135-1 and a reflecting mirror 135-2, as shown in Fig. 14.
- the sensor controller 105 generates the first reference position mark M1, used to set the facing distance of the head with respect to the work, and
- three second reference position marks M2 to M4, used to set the head posture.
- each of these reference position marks M1 to M4 is
- displayed at a predetermined position on the display screen of the image display 7 by being superimposed on the captured image signal from the imager 135-3.
- the imaging optical fiber system 135 provided inside the teaching head images each bright spot and position set point formed above by means of the imager 135-3. The head itself is therefore small, there is less interference with the work compared with a laser processing head, and the teaching operation can be performed easily.
- the same operation as described above is possible even if part of the configuration is modified as follows: in order to illuminate the work, an illumination optical fiber system may be provided between the teaching head 3 and the sensor drive unit 104, in parallel with the imaging optical fiber system 135, and the work 9 may be illuminated by reflecting the illumination light with the reflecting mirror 135-2.
- by providing the illumination optical fiber system in this way, the marking line K and the teaching point P on the surface of the work can be displayed brightly on the image display 7.
- if semiconductor laser light is used as the illumination light for the illumination optical system to illuminate the teaching point and its surroundings, a clear image is obtained without being affected by ambient light.
- the three-dimensional position sensor may also be provided with an autofocus mechanism and a mechanism for switching the irradiation direction of the measurement beam.
- Fig. 21 shows the structure of this embodiment: a joystick 318 for indicating an arbitrary position on the image display 7, and
- a control unit 317 that outputs teaching data by combining the position specified by the joystick 318 with the facing distance and facing posture calculated by the image processing circuit 8a, are provided.
- the operation of this embodiment is as follows. In Fig. 21, the line connecting the projectors 33 and 34 is taken as the X axis, and the line connecting the projectors 31 and 32 as the Y axis; the plane containing the X and Y axes is orthogonal to the optical axis of the imager 35.
- this optical axis is taken as the Z axis. The spot beams cross the Z axis at the focal point F, and the intersection point P0Y (reference position) of the Z axis with the surface is considered. Let
- θ be the angle at which the spot rays from each projector 31 to 34 intersect the Z axis; cutaway views in the plane containing the Z axis and the Y axis are shown in Figs. 22A and 22B.
- the general relationship between the X axis, Y axis and Z axis is shown in Fig. 23.
- let the Y coordinates of the spot lights S2 and S4 be Y2 and Y4, respectively, and
- let the inclination angle about the X axis at the intersection point P0Y on the surface of the work 9 be φ (a similar consideration in the plane containing the Z axis and the X axis gives the inclination angle ψ about the Y axis).
- the actual value of Y4 is a negative (−) value. Since the irradiation angle θ is constant, the
- Z coordinate value Z0 of P0Y follows from a simple geometrical consideration using the similarity of triangles, and is calculated by equation (1):
- Z0 = −2·Y2·Y4 / ((Y2 − Y4)·tan θ) ............ (1)
- the sign of the Z axis is positive in the direction toward the imager 35.
- the midpoint Y0 = (Y2 + Y4)/2 deviates from zero when the surface is tilted, which yields the inclination angle φ.
- the Z coordinate value ZP of the teaching point P (XP, YP, ZP) displayed on the image display 7 is calculated as follows.
- the teaching point P is indicated by operating the joystick 318 connected to the control unit 317 while viewing the image display 7. If the teaching point P is near the intersection point P0 and the above-mentioned tilt angles φ, ψ do not change significantly between P and P0, the Z coordinate value ZP of the teaching point P can be obtained from the Z coordinate value Z0 of the intersection point P0:
- ZP = Z0 + XP·tan ψ + YP·tan φ ............ (6)
- movement information for moving the machining head 2 of the robot from the position above the intersection point P0, where it is currently stopped, to the position above the teaching point P is thus obtained from the coordinate relationship between the intersection point P0 and the teaching point P.
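The distance and tilt geometry of equation (1) can be exercised numerically. The symbol names and the exact form of the tilt relation are reconstructions from the garbled source, so treat this as a sketch of the underlying plane geometry rather than the patent's literal formulas.

```python
import math

def facing_geometry(y2, y4, theta_deg):
    """Beams in the Y-Z plane pass through the focal point F at angle theta
    to the Z axis, so the spots satisfy Y2 = Z2*tan(theta) and
    Y4 = -Z4*tan(theta) (Y4 is negative, as the text notes).  Solving for
    the surface plane gives Z0, the Z coordinate of the on-axis point P0Y
    (equation (1)), and phi, the tilt about the X axis."""
    t = math.tan(math.radians(theta_deg))
    z0 = -2.0 * y2 * y4 / ((y2 - y4) * t)        # equation (1)
    phi = math.degrees(math.atan((y2 + y4) / ((y2 - y4) * t)))
    return z0, phi

# untilted surface one unit beyond F, 45-degree beams: symmetric spots
z0, phi = facing_geometry(1.0, -1.0, 45.0)       # z0 = 1.0, phi = 0.0
```

For a flat, untilted surface the two spots are symmetric (Y2 = −Y4), the midpoint term vanishes, and Z0 reduces to Y2/tan θ, which is the plain similar-triangles distance.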
- the control unit 317 executes the teaching process for the robot according to the flowchart of Fig. 25.
- movement information consisting of the direction and the distance with respect to the teaching head 3 is calculated.
- the teaching head 3 is moved to a position above the teaching point P.
- the distance between the focus F and the teaching point P is
- position-controlled so as to match the distance between the focus F and the intersection point P0.
- the posture of the teaching head 3 is controlled to a predetermined value according to the inclination angles φ, ψ in the next step.
- when the posture control ends, in step Q9 the three-dimensional coordinates and the posture angle data of the teaching head 3 corresponding to this teaching point P are stored in the storage unit. This ends the teaching process for the teaching point.
- in this way, the robot operator turns on the power of the teaching device, turns on each projector 31, 32, 33 and 34, moves the teaching head 3 to a position near the teaching point P formed on the marking line K, and checks that the teaching point P is displayed on the image display 7. It is then only necessary to designate the teaching point P with the joystick 318. When the teaching point P is designated, the teaching head 3 automatically moves to a position above the teaching point P, keeps a predetermined distance from the teaching point P, and stops at a predetermined posture angle. The coordinates and posture data of the teaching head 3 for this teaching point P are then stored in the storage unit.
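The per-point cycle described above (measure the designated point, move over it, restore the standoff distance, level the posture, store the result) might be organized as in the sketch below; all callback names and the standoff value are hypothetical, not from the patent.

```python
from dataclasses import dataclass

@dataclass
class TeachRecord:
    """One stored teaching entry: head position and posture angles
    (field names are illustrative)."""
    x: float
    y: float
    z: float
    pitch: float
    roll: float

def teach_point(move_head, measure, store, standoff):
    """Sketch of one teaching cycle: read the sensed point, command the
    head to a squared-up pose at the standoff distance above it, and
    store the resulting coordinates.  move_head/measure/store stand in
    for robot-controller and storage-unit interfaces."""
    x, y, z, pitch, roll = measure()             # sensor reading at the point
    move_head(x, y, z + standoff, 0.0, 0.0)      # face the point squarely
    rec = TeachRecord(x, y, z + standoff, 0.0, 0.0)
    store(rec)                                   # step Q9: store in the storage unit
    return rec

stored = []
rec = teach_point(lambda *a: None,
                  lambda: (10.0, 5.0, 0.0, 3.0, -2.0),
                  stored.append, standoff=50.0)
```

The stored records are exactly what the playback phase needs: a pose per teaching point, replayed by the driving means during actual cutting.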
- in the above embodiment the joystick 318 is used as the means for designating the teaching point P, but a light pen, a digitizer, a trackball or the like may be used instead; any device that can electrically designate an arbitrary position (coordinates) on the image display 7 is acceptable.
- next, another example will be explained in which the teaching point position imaged on the image display is electrically read by the same configuration as that shown in Fig. 21, and a method of automatically tracking a continuous teaching point mark is adopted by combining this with the facing distance and facing posture data from the camera.
- From the two-dimensional positional relationship of the spot lights S1 to S4, the three-dimensional coordinates (XP1, YP1, ZP1) of the first teaching point P1 are obtained, and movement information for moving the teaching head 3 of the robot to the position above this point is obtained.
- The control unit 317 is configured to execute the teaching process for the robot according to the flowchart of Fig. 26.
- First, the first teaching point P1 is designated by operating the joystick 31S.
- In step Q15, the coordinate values XP1 and YP1 of the teaching point P1 designated as the first teaching point are read via the image discrimination circuit 8a described above. In step Q16, the Z-coordinate value ZP1 of the first teaching point P1 is obtained from these coordinates using equation (6). The coordinates of the first teaching point P1 are thus determined.
- In step Q17, the movement information, consisting of the direction and distance for moving the teaching head 3 of the robot, is calculated from the coordinate relationship between the intersection point P0 and the first teaching point P1. According to this movement information, the teaching head 3 is moved to the position above the first teaching point P1. In this case, the position is controlled so that the distance between the focus F and the first teaching point P1 matches the distance between the focus F and the intersection point P0.
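A minimal numerical sketch of this step Q17 computation follows. The simplification of applying the preserved standoff along the vertical axis, and the function name, are assumptions for illustration; the patent controls the distance along the head axis via its own geometry.

```python
import math

def movement_info(focus, p0, p1):
    """Direction and distance for moving the teaching head so that the
    focus F sits above P1 at the same standoff it had over P0.
    Simplifying assumption (not from the patent): the head axis is
    vertical, so the standoff is applied along +z."""
    standoff = math.dist(focus, p0)            # |F - P0|, to be preserved
    target = (p1[0], p1[1], p1[2] + standoff)  # new focus position over P1
    delta = tuple(t - f for t, f in zip(target, focus))
    distance = math.hypot(*delta)
    direction = tuple(d / distance for d in delta) if distance else (0.0, 0.0, 0.0)
    return direction, distance, target

# Focus currently 40 mm above P0; move so it is 40 mm above P1.
direction, distance, target = movement_info((0, 0, 40), (0, 0, 0), (30, 0, 0))
```

Here the head translates 30 mm horizontally, and the new focus position is again 40 mm from the teaching point, i.e. |F - P1| equals the original |F - P0|.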
- In step Q18, the posture of the teaching head 3 is controlled to a predetermined value according to the inclination angles φ and θ.
- Next, the three-dimensional coordinate and posture angle data of the machining head corresponding to the first teaching point P1 are stored in the storage unit. This completes the teaching process for the first teaching point P1.
- As shown in step Q25, once the bright/dark level of the first teaching point P1 has been identified by the image identification circuit 8a, this first teaching point P1 becomes the new intersection point P0 at the time the teaching head 3 moves to this teaching point P1.
- The image identification circuit 8a then identifies, among the adjacent pixels, the pixel whose light/dark level is closest as the adjacent teaching point PN.
- The teaching point PN is automatically designated as the new teaching point P1 after the teaching head 3 finishes moving as described above.
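The neighbour-tracking rule just described (pick, among the adjacent pixels, the one whose grey level is closest to the marking-line level) can be modelled in a few lines. The list-of-lists image and the 8-neighbourhood are illustrative assumptions, not the circuit's actual implementation.

```python
def next_teaching_point(img, current, level):
    """Among the 8 neighbours of the current pixel, pick the one whose
    grey level is closest to the marking-line level: a minimal model of
    how the image identification circuit selects the adjacent teaching
    point PN."""
    y, x = current
    h, w = len(img), len(img[0])
    best = None
    for dy in (-1, 0, 1):
        for dx in (-1, 0, 1):
            if dy == dx == 0:
                continue                      # skip the current pixel itself
            ny, nx = y + dy, x + dx
            if 0 <= ny < h and 0 <= nx < w:
                score = abs(img[ny][nx] - level)
                if best is None or score < best[0]:
                    best = (score, (ny, nx))
    return best[1]

# A dark marking line (value 0) running to the right on a bright background.
img = [
    [255, 255, 255],
    [255,   0,   0],
    [255, 255, 255],
]
pn = next_teaching_point(img, (1, 1), 0)
```

Repeating this selection after each head move is what lets the system walk along a continuous marking line without the operator designating every point.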
- To operate the robot, the operator turns on the power of the teaching device, lights the projectors 31, 32, 33 and 34, moves the teaching head 3 to, for example, a position near the first teaching point P1 on the marking line K, confirms that this first teaching point P1 is displayed on the image display 7, and then designates the first teaching point P1 with the joystick 31S. When this first teaching point P1 is designated, the teaching head 3 automatically moves to a position above the first teaching point, maintains a predetermined distance from the teaching point P1, and stops at the predetermined posture angle.
- Step Q17 and step Q18 are shown as subroutines in Figures 27 and 28, respectively.
- As a result, the teaching accuracy can be improved, and the machining accuracy on the work 9 can be improved.
- In the embodiment described above, the spot coordinates were automatically measured from the spots displayed on the cathode-ray tube.
- However, the same effect can be obtained even if the spot imaged by the imager is directly image-processed and its coordinates are automatically measured.
- As described above, according to the present invention, the facing distance and facing posture between the head and the object can be measured electrically by measuring each measurement bright spot, and the teaching position can be confirmed by matching the position set point on the object with the mark fixedly displayed on the image display. Since an image pickup device is used for these measurements, the surface condition of the object can also be observed on the image display, and position set points such as marking lines can be detected electrically by image processing. A number of such effects can thus be achieved.
Landscapes
- Engineering & Computer Science (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Length Measuring Devices By Optical Means (AREA)
Abstract
In three-dimensionally machining and assembling an object (9) which is a work being treated by a machining robot or the like, a three-dimensional position sensor measures a relative position and a tilt angle of a head (1) with respect to the object in a contactless manner. The three-dimensional position sensor is equipped with a plurality of projectors (31 to 34) that are provided in a teaching head (3) to slantly project light onto the object (9) in order to form a plurality of bright points (S1 to S4) that constitute apexes of a triangle on the object (9); an image pickup device (35) which takes an image that contains the bright points (S1 to S4), position setpoints (P) indicated on the object (9), and the surface of the object (9); and an image processing circuit (8a) which electrically reads the bright points (S1 to S4) imaged by the image pickup device (35) and the position setpoints (P), and detects a three-dimensional position and attitude of the head (1) relative to the position setpoint (P). According to a three-dimensional position setting system which employs the above three-dimensional position sensor, a laser machining head (2) is positioned on the object maintaining a predetermined three-dimensional position and attitude based upon the data detected by the three-dimensional position sensor in a contactless manner.
Description
Specification
Three-Dimensional Position Sensor and Three-Dimensional Position Setting System

Technical Field
The present invention relates to a three-dimensional position sensor capable of measuring, without contact, the relative position of the sensor body with respect to a workpiece and the inclination angle of the sensor body with respect to the workpiece, for example in the three-dimensional machining and assembly of the workpiece by a machining robot or the like, and also relates to a three-dimensional position setting system for teaching machining and assembly postures to a robot or the like by using this three-dimensional position sensor.
Background Art
For example, an industrial robot installed on a manufacturing line for a product assembly process, a material cutting process, a machining process or the like must, before it is actually operated, be taught the movement path and operation procedure of its working head in accordance with the three-dimensional shape of the workpiece. The task of teaching this three-dimensional movement of the working head is called teaching. Conventionally, this teaching has generally been performed manually by an operator. For example, when teaching a CO2 laser cutting robot, the operator operates a teaching pendant to bring the machining head close to a teaching point on a marking line drawn on the surface of the workpiece, which is the object. Thereafter, the three-dimensional position of the machining head is set so as to have a predetermined relative positional relationship with respect to the teaching point. Teaching is performed by storing the three-dimensional position information obtained at this time.
Recently, in order to improve the efficiency of the teaching work, a magnetic sensor arranged, for example, near the machining head has been used. With this magnetic sensor, the magnitude of the eddy current generated in the workpiece is detected, and the distance between the machining head and the workpiece surface is calculated from the detected value. Based on this calculation result, the facing distance of the machining head with respect to the workpiece is set.
However, such conventional teaching work is performed while the operator directly and visually checks the three-dimensional position of the machining head with respect to the teaching point. This method therefore has the drawbacks that the teaching requires labor and time, and that skill is required for accurate position setting.
Further, when a position setting device equipped with a conventional sensor such as a magnetic sensor is used, the facing distance of the head with respect to the workpiece can certainly be set easily. However, since the magnetic sensor has a large detection area, the accuracy of measuring the facing distance of the head to a minute region of the workpiece is low, and the facing posture of the machining head with respect to the workpiece still has to rely on visual setting by the operator, so that no significant improvement in teaching efficiency and accuracy can be expected.
The present invention has been made in view of the above circumstances. An object of the present invention is to provide a three-dimensional position sensor capable of detecting, without contact and with high accuracy, the position of the sensor body with respect to a workpiece and the inclination angle of the sensor body with respect to the workpiece. Another object of the present invention is to provide a three-dimensional position setting system that, by using the above three-dimensional position sensor, can accurately teach a robot or the like in a short time without requiring skill.
Disclosure of the Invention
In order to achieve the above objects, a three-dimensional position sensor according to the present invention detects a three-dimensional position and posture with respect to a position set point displayed on an object, and comprises: a sensor body; light projecting means for obliquely projecting light onto the object to form, on the surface of the object, a plurality of bright spots constituting at least the vertices of a triangle; image pickup means for picking up an image including the bright spots formed on the object by the light projecting means, the position set point displayed on the object, and the surface of the object; and image processing means for electrically reading the bright spots and the position set point picked up by the image pickup means and detecting the three-dimensional position and posture of the sensor body with respect to the position set point.
Further, a three-dimensional position setting system according to the present invention detects the three-dimensional position and posture of a head with respect to a position set point displayed on an object, and can cause the head to face an arbitrary position set point.
It comprises: light projecting means provided on the head for obliquely projecting light onto the object to form, on the surface of the object, a plurality of bright spots constituting at least the vertices of a triangle;
image pickup means provided on the head for picking up an image including the bright spots formed on the object by the light projecting means, the position set point displayed on the object, and the surface of the object;
image processing means for electrically reading the bright spots and the position set point picked up by the image pickup means and detecting the three-dimensional position and posture of the head with respect to the position set point;
storage means for storing information on the three-dimensional position and posture of the head with respect to the position set point detected by the image processing means; and
drive means for reading out the information stored in the storage means and, on the basis of this information, causing the head to face the position set point at a predetermined three-dimensional position and posture.
Brief Description of the Drawings
Fig. 1 is a perspective view showing the head portion of a robot in which the three-dimensional position sensor and the three-dimensional position setting system according to the present invention are employed; Fig. 2 is a perspective view showing the configuration of the three-dimensional position sensor; Figs. 3 and 4 show the configuration of the teaching head, Fig. 3 being a sectional view taken along the line III-III in Fig. 2 and Fig. 4 being a sectional view taken along a section line in Fig. 3; Fig. 5 is a block diagram showing the configuration of the image processing circuit; Figs. 6A and 6B, Figs. 7A and 7B, and Figs. 8A and 8B are used to explain the operation, Figs. 6A, 7A and 8A being sectional views showing the positional relationship between the head and the work, and Figs. 6B, 7B and 8B being front views showing the images displayed on the image display; Figs. 9A to 9C are diagrams showing the forms of image signals; Fig. 10 is a front view showing an example of the arrangement of conventional reference position marks and bright spots, used to explain the effect; Figs. 11A and 11B are, respectively, a sectional view showing the positional relationship between the head and the work and a front view showing the image displayed on the image display; Fig. 12 is a sectional view showing a second embodiment of the present invention in a section similar to Fig. 4; Fig. 13 is a front view showing the three-dimensional position sensor of a third embodiment; Fig. 14 is a longitudinal sectional view of the teaching head; Fig. 15 is a sectional view taken along the line XV-XV in Fig. 14; Figs. 16A and 16B are a sectional view and a front view, respectively, for explaining the operation of the third embodiment; Fig. 17 is a sectional view, in a section similar to Fig. 4, showing the three-dimensional position sensor of a fourth embodiment; Fig. 18 is a sectional view taken along the line XVIII-XVIII in Fig. 17; Figs. 19 and 20 are a sectional view and a front view, respectively, for explaining the operation of the fourth embodiment; Fig. 21 is a perspective view showing the overall configuration of an embodiment of the three-dimensional position setting system according to the present invention; Figs. 22A, 22B and 23 are diagrams for explaining the operating principle of the three-dimensional position setting system; Fig. 24 is a front view showing the display screen of the image display 7; Fig. 25 is a flowchart showing the teaching processing operation; Fig. 26 is a flowchart showing the teaching processing operation of a three-dimensional position setting system of another embodiment; Fig. 27 is a flowchart showing the subroutine of step Q17; Fig. 28 is a flowchart showing the subroutine of step Q18; and Fig. 29 is a flowchart showing the teaching processing operation in an embodiment that does not use an image display.
Best Mode for Carrying Out the Invention
An embodiment of the present invention will be described below with reference to the drawings. Fig. 1 shows an example of the configuration of the head portion of a cutting robot equipped with the three-dimensional position sensor and the three-dimensional position setting system according to the present invention.
In Fig. 1, reference numeral 1 denotes the head body. A laser machining head 2 is attached to one mounting surface of this head body 1, and a teaching head 3 is attached to the other mounting surface. The head body 1 is rotatably connected to a robot arm 4 via a rotating mechanism 6. By rotating the head body 1 via the rotating mechanism 6, the teaching head 3 can be made to face the surface of the work 9, which is the object, during teaching, and the laser machining head 2 can be made to face the surface of the work 9 during cutting.
Fig. 2 schematically shows an example of the overall configuration of the three-dimensional position sensor and the three-dimensional position setting system of this embodiment.
That is, as shown in Fig. 2, the three-dimensional position setting system comprises: an imager 35, described in detail later, provided in the teaching head 3; an image display 7 that displays the image picked up by this imager 35; a reference position pattern generating circuit 8 that displays reference position marks on this image display 7; and an image processing circuit 8a that separates the image signal of the bright spots, used for setting the facing distance and facing posture of the teaching head 3 with respect to the work 9, from an image signal in which it is mixed with the image signal of background light such as illumination, and measures their positions.
The teaching head 3 comprises a head housing 30; four projectors 31 to 34 provided in the housing 30; the imager 35; and an illuminator 36 that illuminates the surface of the work. Its configuration is as shown in Figs. 3 and 4.
That is, each of the projectors 31 to 34 comprises a semiconductor laser device 31a to 34a and a reflecting mirror 31b to 34b. The semiconductor laser devices 31a to 34a are arranged and fixed at equal intervals on the inner wall surface of the head housing 30, and the reflecting mirrors 31b to 34b are fixed by holders 31c to 34c at the tip portion of the head housing 30 so that the laser light from the semiconductor laser devices 31a to 34a can be reflected onto the work. The projectors 31 to 34 reflect the laser light generated by the semiconductor laser devices 31a to 34a with the reflecting mirrors 31b to 34b, respectively, and project it onto the surface of the work, thereby forming bright spots on the surface of the work 9. As shown in Fig. 2, the projector 31 forms a bright spot S1 for distance setting, and the other projectors 32 to 34 form bright spots S2 to S4 for posture setting.
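The oblique-projection geometry that makes these bright spots usable as a distance gauge can be sketched numerically: a beam inclined to the head axis strikes the surface at a point that slides sideways as the standoff changes. The beam angle and function names below are illustrative assumptions, not values from the patent.

```python
import math

def spot_shift(delta_z, beam_angle_deg):
    """Lateral shift of a bright spot on the work surface when the
    standoff distance changes by delta_z, for a beam inclined at
    beam_angle_deg to the head axis (simple oblique-projection
    geometry; the angle is an assumed example value)."""
    return delta_z * math.tan(math.radians(beam_angle_deg))

def distance_change(shift, beam_angle_deg):
    """Inverse relation: recover the standoff change from the measured
    lateral spot shift."""
    return shift / math.tan(math.radians(beam_angle_deg))

shift = spot_shift(10.0, 45.0)   # surface 10 mm off nominal, 45-degree beam
dz = distance_change(shift, 45.0)
```

This is why a change in facing distance appears directly as a displacement of the spots in the camera image: the steeper the beam, the larger the shift per unit of distance error.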
On the other hand, the imager 35 is a solid-state image sensor camera using a solid-state image sensor 35a. The imager 35 is fixed by a camera holder 35b at the central portion of the head housing 30. This imager 35 picks up images of the bright spots S1 to S4 formed on the surface of the work 9 by the projectors 31 to 34, and of the teaching point P on the marking line K previously drawn on the surface of the work. A plurality of illuminators 36 that illuminate the surface of the work 9 are fixed by a holder 36a to the tip portion of the head housing 30.
Reference numeral 37 denotes an optical filter for dust protection, and reference numeral 38 denotes a mounting member for attaching the teaching head 3 to the head body 1.
The reference position pattern generating circuit 8 generates a first reference position mark M1, which is used for setting the facing distance of the head with respect to the work 9, and three second reference position marks M2 to M4, which are used for setting the posture of the head with respect to the work. By means of this reference position pattern generating circuit 8, each of the reference position marks M1 to M4 is displayed at a predetermined position on the display screen of the image display 7, superimposed on the picked-up image signal from the imager 35, as shown in Fig. 2.
The image processing circuit 8a detects the positions of the bright spots S1 to S4 on the image display 7 from the picked-up image signal from the imager 35. Its details will be described with reference to Fig. 5.
Fig. 5 shows an example of the block configuration of the image processing circuit 8a. In Fig. 5, reference numeral 40 denotes a clock signal generator that generates a clock signal serving as a reference for determining the horizontal position on the screen for extracting an image from the imager 35; reference numeral 41 denotes a horizontal synchronizing signal generator that counts the number of scanning lines on the display screen and generates a horizontal synchronizing signal serving as a reference for determining the vertical position on the screen; and reference numeral 42 denotes a vertical synchronizing signal generator that generates a vertical synchronizing signal for counting the number of frames. The clock signal, horizontal synchronizing signal and vertical synchronizing signal output from these generators are input to the imager 35.
Reference numeral 48 denotes an image memory up-counter to which the clock signal, horizontal synchronizing signal and vertical synchronizing signal are input. This image memory counter 48 determines the memory address at which the image signal is to be stored when it is digitized. Reference numeral 49 denotes an image memory down-counter to which the clock signal, horizontal synchronizing signal and vertical synchronizing signal are also input. This image memory down-counter 49 counts the address from which data is read out of an image memory 51, described later.
Further, reference numeral 50 denotes an analog-to-digital converter to which the image signal from the imager 35 is input together with the address of that image signal from the image memory counter 48, and reference numeral 51 denotes an image memory in which the digital signal output from the analog-to-digital converter 50 is stored at the address designated by the image memory counter 48 in response to a write signal 52b from a computer 52, described later, and from which the data corresponding to the address counted by the image memory down-counter 49 is read out in response to a read signal 52a from the computer 52. Reference numeral 53 denotes an image signal differencer that extracts the difference between the output signal of the analog-to-digital converter 50 and the data read out of the image memory 51.
Reference numeral 55 denotes an image signal level comparator that compares the level of the output signal of the image signal differencer 53 with the reference signal output from a reference signal generator 54.
Further, reference numeral 57 denotes a horizontal position counter that counts the clock signal output from the clock signal generator 40 as the horizontal position, the horizontal position count being cleared for each scanning line by the horizontal synchronizing signal from the horizontal synchronizing signal generator 41. Reference numeral 60 denotes a vertical position counter that counts the vertical position from the horizontal synchronizing signal, the vertical position count being cleared for each frame by the vertical synchronizing signal output from the vertical synchronizing signal generator 42. A horizontal position latch latches the output signal of the image signal level comparator 55 in accordance with the horizontal position signal output from the horizontal position counter 57, and is cleared by a latch clear signal 52c from the computer 52; likewise, a vertical position latch latches the output signal of the image signal level comparator 55 in accordance with the vertical position signal output from the vertical position counter, and is cleared by the latch clear signal 52c from the computer 52.
Reference numeral 52 denotes a computer that takes in the bright-spot data corresponding to the horizontal and vertical positions latched by the horizontal position latch and the vertical position latch, together with the vertical synchronizing signal output from the vertical synchronizing signal generator 42, which serves as a frame counter. This computer 52 calculates the facing distance and facing posture of the work 9 from the bright-spot data by the method described later, and transmits the calculation results to the image display 7 or to the robot.
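A software analogue of this measuring chain (the differencer 53 subtracting the stored background frame, the level comparator 55 thresholding the result, and the position latches capturing the pixel coordinates) can be sketched as follows. The pure-Python arrays, the threshold value, and the function name are illustrative stand-ins for the hardware, not part of the patent.

```python
def find_spots(frame, background, threshold):
    """Model of the measuring chain in Fig. 5: subtract the stored
    background frame (differencer), compare the difference against a
    reference level (comparator), and record the (row, column) of each
    pixel that exceeds it (position latches)."""
    spots = []
    for y, (row, bg_row) in enumerate(zip(frame, background)):
        for x, (v, b) in enumerate(zip(row, bg_row)):
            if v - b > threshold:      # comparator output goes high
                spots.append((y, x))   # latch the vertical/horizontal counts
    return spots

# Background illumination is uniform; two laser spots stand out in the frame.
background = [[10, 10, 10], [10, 10, 10]]
frame      = [[10, 200, 10], [10, 10, 180]]
spots = find_spots(frame, background, 100)
```

Differencing against the stored background is what lets the bright-spot signal be separated from ambient illumination before thresholding, just as the circuit separates the spot image signal from background light.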
The projectors 31 to 34, the imager 35 and the image processing circuit 8a constitute the three-dimensional position sensor.
Next, the operation of the three-dimensional position sensor configured as described above will be described.
Assume now that in Fig. 2 the teaching head 3 is fixed in the illustrated state, and that the facing position and facing attitude of the workpiece 9 with respect to the teaching head 3 can be adjusted. In this state, when laser light is emitted from the semiconductor laser devices 31a to 34a, it is reflected by the reflecting mirrors 31b to 34b and irradiated onto the workpiece 9 as oblique beams. Here, the state in which the facing distance of the workpiece 9 is the predetermined value and the facing attitude is correct is represented by the workpiece shown by the dashed line in Fig. 6A, with the teaching point P0. On the workpiece 9 in the solid-line position of the same figure, bright points S1 to S4 are formed at the intersections of the oblique beams with the workpiece 9. A change in the facing distance or the facing attitude of the workpiece 9 therefore appears as a deviation of the bright points S1 to S4 on the image display 7.
For example, when the workpiece 9 is at a greater facing distance than the workpiece in the reference position, each bright point is positioned outside the reference marks M1 to M4, as shown in Fig. 6B. When the workpiece 9 is inclined, relative to the reference position, in the vertical direction of the image display 7, the bright point S2 is positioned inside the reference mark M2, as shown in Figs. 7A and 7B. Further, when the workpiece 9 is inclined, relative to the reference position, in the horizontal direction of the image display 7, the bright points S3 and S4 are offset to one side with respect to the reference marks M3 and M4, as shown in Figs. 8A and 8B.
In other words, since a bright point occurs at the intersection of an oblique beam with the workpiece 9, once the deviation of a bright point on the image display 7 has been measured, the coordinates of each of the bright points S1 to S4 in three-dimensional space can be obtained by proportional calculation based on the similarity of triangles, provided that a reference triangle has been determined beforehand. The facing distance of the workpiece can then be calculated from the three-dimensional position of the bright point S1, and the facing attitude of the workpiece from the bright points S1 to S4.
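As a rough numerical illustration of this proportional calculation (a sketch under the simplifying assumption that a spot's image deviation grows linearly with the depth at which its beam meets the surface; the function name and figures are invented, not taken from the patent), a single calibrated reference pair fixes the scale:

```python
def spot_to_depth(deviation, ref_deviation, ref_depth):
    """Similar-triangles proportion: under the assumed geometry the image
    deviation of a bright point scales linearly with the depth at which
    the oblique beam meets the surface, so one calibrated
    (deviation, depth) pair determines the scale factor."""
    return ref_depth * (deviation / ref_deviation)

# If a 4.0-unit deviation corresponds to a calibrated depth of 100.0,
# a measured 6.0-unit deviation corresponds to depth 150.0:
print(spot_to_depth(6.0, 4.0, 100.0))  # 150.0
```

The exact mapping used by the sensor follows from its reference triangle; equations (1) to (5) later in the document give the full closed form.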
Here, the operation of the image processing circuit 8a shown in Fig. 5 will be described with reference to Figs. 9A to 9C.
First, an image in which no bright point is captured, for example an image such as that shown in Fig. 9A, is stored in the image memory 51; next, images in which the bright points are captured in turn, for example an image such as that shown in Fig. 9B, are read in. The image signal subtractor 53 then obtains the difference between the image without bright points and an image with a bright point. The image signal subtractor 53 therefore yields an image containing only the bright points, from which the background light component has been removed, for example an image such as that shown in Fig. 9C.
Accordingly, the image signal level comparator 55 compares the reference signal output from the reference signal generator 54 with the bright-point image signal. If an image signal containing background light, for example the image signal of the image shown in Fig. 6B, were input directly to the image signal level comparator 55, background light at a position other than a true bright point could be detected erroneously as a bright-point position. With the signal processing described above, however, the correct bright-point positions can be detected.
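The differencing performed by the image signal subtractor 53 followed by the level comparison in comparator 55 is, in modern terms, background subtraction with a threshold. A minimal sketch, with invented pixel values:

```python
def difference_and_threshold(lit_frame, dark_frame, threshold):
    """Subtract the beam-off frame from the beam-on frame so that steady
    background light cancels, then keep only pixels whose residual
    exceeds the comparator's reference level."""
    points = []
    for v, (lit_row, dark_row) in enumerate(zip(lit_frame, dark_frame)):
        for h, (lit, dark) in enumerate(zip(lit_row, dark_row)):
            if lit - dark > threshold:
                points.append((h, v))
    return points

background = [[4, 4, 8], [4, 9, 4]]   # bright window at (2,0), glare at (1,1)
with_beam  = [[4, 4, 8], [4, 9, 30]]  # laser spot added at (2,1)
print(difference_and_threshold(with_beam, background, threshold=5))  # [(2, 1)]
```

Thresholding the beam-on frame directly at the same level would also flag the window pixel at (2,0) and the glare pixel at (1,1); the differencing step removes exactly those false detections.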
When a bright point is detected by the image signal level comparator 55 in this way, the detection signal is sent to the horizontal position latch 59, which is tracking the counted horizontal position of the bright point, and to the vertical position latch 62, which is tracking its counted vertical position, so that the correct bright-point position is held. The bright-point position data held in the horizontal position latch 59 and the vertical position latch 62 are read by the computer 52 as appropriate, and the horizontal and vertical position data of the bright points S1 to S4 collected there are substituted into predetermined formulas, based on the similarity of triangles described above, to obtain the facing distance and facing attitude of the workpiece 9. The calculation results are then displayed on the image display 7 through an appropriate signal line.
Accordingly, the operator can correct the position of the robot so that it assumes the correct facing distance and facing attitude at the teaching point P0 on the workpiece 9, either by reading the facing distance and facing attitude data of the workpiece 9 displayed on the image display 7, or by observing the deviation between the reference marks M1 to M4 and the bright points S1 to S4.
The above description concerns the case where the operator corrects the position while viewing the image displayed on the image display 7. Alternatively, the facing distance and facing attitude data of the workpiece 9 obtained by calculation in the image processing circuit 8a of the three-dimensional position sensor may be transmitted to the robot controller so that the robot corrects the facing distance and facing attitude automatically.
As described above, in this embodiment the operator reads the facing distance and facing attitude data of the workpiece 9 while viewing the display image on the image display 7, and adjusts the three-dimensional position of the head body 3 so that the bright points S1 to S4 and the teaching point P0 coincide with the reference marks M1 to M4. Teaching can therefore be performed accurately and in a short time, without relying on the operator's intuition as in the prior art, and both efficiency and accuracy can be greatly improved.
Alternatively, the facing distance and facing attitude data of the workpiece 9 obtained by the image processing circuit 8a may be transmitted to the robot controller so that the three-dimensional position of the head body 3 is adjusted automatically; this too greatly improves the efficiency of teaching.
Furthermore, since the image signal subtractor 53 of the image processing circuit 8a removes the influence of background light, the bright-point positions can be measured accurately. This prevents teaching mistakes by the operator due to erroneous measurement, and also prevents runaway of the robot due to erroneous measurement when the measurement data are transmitted to the robot controller to adjust the three-dimensional position automatically.
The present invention is not limited to the configuration of the embodiment described above. For example, in the operation of the image processing circuit shown in Fig. 5, erroneous measurement due to the illumination light can also be prevented by momentarily extinguishing the illuminator 36 for an interval so short (30 ms or less) that the human eye does not perceive the interruption, while correspondingly speeding up the clock signal, the horizontal synchronizing signal, and the vertical synchronizing signal that are synchronized with the image signal. Furthermore, rather than arranging the bright points S1 to S4 around the teaching point P0 as shown in Fig. 10 and measuring the facing distance and facing attitude from the average of the bright points, the present embodiment provides a bright point S1 and a first reference position mark M1 dedicated to setting the facing distance, which yields the following effects.
That is, small projections and depressions may exist depending on the surface shape of the workpiece 9, and the teaching point P may be set on such a projection, as shown in Fig. 11A. In such a case, if no bright point and reference position mark dedicated to setting the facing distance were provided, and the facing distance and facing attitude were instead both set from the four bright points and reference position marks shown in Fig. 10, the bright points could coincide with the reference position marks even though the facing distance between the teaching point P on the projection and the head 3 had not reached the predetermined distance.
If teaching were performed in this state, the distance from the machining head to the teaching point would differ from the specified value when laser cutting or similar machining is subsequently carried out. The laser would then be out of focus, making the intended cut impossible, and in severe cases the tip of the machining head could contact the projection on the workpiece and be damaged. With the configuration of this embodiment, by contrast, the facing distance is set by aligning the teaching point P with the reference position mark M1 and aligning the bright point S1 with this reference position mark. Therefore, even when the teaching point P lies on a small projection as shown in Fig. 11A, the facing distance is always set with respect to the teaching point P on that projection. This prevents teaching in an incorrect positional state, so that accurate teaching can always be performed without being affected by the condition of the workpiece surface.
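The failure mode described above can be illustrated numerically. In this sketch the numbers are invented: four attitude spots fall on the flat base around a small projection that carries the teaching point, so a distance averaged over those spots reports the base and misses the projection, while a dedicated center spot measures the projection itself:

```python
base_depth = 100.0   # stand-off from head to the flat workpiece surface
bump_height = 3.0    # small projection carrying the teaching point

surround_spots = [base_depth] * 4        # attitude spots land on the flat base
averaged = sum(surround_spots) / len(surround_spots)
center_spot = base_depth - bump_height   # dedicated spot hits the projection

print(averaged)      # 100.0 -> misses the projection entirely
print(center_spot)   # 97.0  -> true stand-off at the teaching point
```

The 3.0-unit discrepancy is exactly the focus error (or head-collision margin) that the dedicated distance-setting bright point S1 eliminates.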
As a second embodiment of the present invention, as shown in Fig. 12, semiconductor lasers of the same wavelength may be used for the projectors 31 to 34 and the illuminator 36, and an optical filter 39 may be disposed in front of the solid-state image sensor 35a. This optical filter 39 guides only the wavelength component of the output light of the projectors 31 to 34 to the image sensor 35a. Even when disturbance light such as sunlight is strong, this extraneous light is attenuated by the optical filter 39 to an extremely low level, so that the bright points produced by the projectors 31 to 34 can be displayed clearly on the image display 7 without being affected by the disturbance light. The same applies to the illumination light of the illuminator 36; as a result, each bright point can be displayed clearly without suppressing the illuminance of the illumination light. In this way, the position of the teaching head 3 can be adjusted while illumination of sufficient illuminance is always provided, so that teaching can be carried out smoothly and with good operability.
As a third embodiment, the teaching head 3 may be constructed using optical fibers, as shown in Figs. 13 to 16. In the following description, components of this third embodiment that are the same as those of the first embodiment described above are given the same reference numerals, and their description is omitted.
As shown in Figs. 13 to 15, the three-dimensional position sensor of this third embodiment comprises: the teaching head 3; an imaging optical-fiber system and projection optical-fiber systems provided inside the teaching head 3, described in detail later; a sensor drive unit 104 optically coupled to these optical systems; a sensor controller 105 incorporating the image processing circuit 8a and a reference position pattern generating circuit for controlling the sensor drive unit 104; and an image display 7, consisting of a CRT display, which displays the reference position pattern of the measurement light produced by the sensor controller 105 together with the image captured by the teaching head 3.
Next, the internal configuration of the teaching head 3 will be described in detail with reference to Figs. 14 and 15. Fig. 14 is a longitudinal sectional view of the teaching head 3 of Fig. 13, and Fig. 15 is a sectional view taken along line II-II of Fig. 14. As shown in Figs. 14 and 15, four projection optical-fiber systems 131 to 134 and an imaging optical-fiber system 135 are housed inside the teaching head 3. In Fig. 15, the projection optical-fiber system 132 is hidden directly beneath 131.
Each of the projection optical-fiber systems 131 to 134 comprises reflecting mirrors 131b, 132b, 133b, 133b-1, 134b, and 134b-1, fixed in grooves and holes formed inside the teaching head 3, and condenser lenses 131d, 132d, 133d, and 134d. As shown in Fig. 13, each of the projection optical-fiber systems 131 to 134 is optically coupled to the semiconductor laser devices 31a to 34a of the sensor drive unit 104. The reflecting mirrors 131b, 132b, 133b, 133b-1, 134b, and 134b-1 are fixed by holders 131c, 132c, 133c, 133c-1, 134c, and 134c-1, respectively.
These projection optical-fiber systems 131 to 134 form the laser light generated by the semiconductor laser devices 31a to 34a into narrow beams with the condenser lenses 131d to 134d, reflect the beams with the reflecting mirrors 131b, 132b, 133b, 133b-1, 134b, and 134b-1, and project them onto the surface of the workpiece 9 to form the respective bright points. As shown in Fig. 16, the projection optical-fiber system 131 forms the bright point S1 for setting the facing distance, and the other projection optical-fiber systems 132 to 134 form the bright points S2 to S4 for setting the attitude.
Meanwhile, as shown in Fig. 14, the imaging optical-fiber system 135 guides the image of the workpiece 9 to the imager 135-3 shown in Fig. 13 via an imaging lens 135-1 and a reflecting mirror 135-2. As shown in Fig. 16, this imager 135-3 captures the bright points S1 to S4 formed on the surface of the workpiece 9 by the projection optical-fiber systems 131 to 134, and also the teaching point P on the marking line K drawn in advance on the workpiece surface.
Returning to Fig. 13, the sensor controller 105 generates a first reference position mark M1, used to set the facing distance of the head with respect to the workpiece, and three second reference position marks M2 to M4, used to set the attitude of the head with respect to the workpiece. As shown in Fig. 13, these reference position marks M1 to M4 are each displayed at a predetermined position on the display screen of the image display 7, superimposed on the captured image signal from the imager 135-3. According to this third embodiment, the four projection optical-fiber systems 131 to 134, which irradiate the surface of the workpiece 9 with the light generated by the semiconductor laser devices 131a to 134a, and the imaging optical-fiber system 135, which allows the imager 135-3 to capture the bright points and the position setting point formed on the workpiece, are provided inside the teaching head. The head itself is therefore compact, interferes with the workpiece less than a laser machining head or the like, and allows the teaching operation to be performed easily.
In the third embodiment described above, a part of the configuration may be modified, for example as follows, while achieving the same operation as described above. In order to illuminate the workpiece, an illumination optical-fiber system may be provided between the teaching head 3 and the sensor drive unit 104, in parallel with the imaging optical-fiber system 135, and the illumination light may be reflected by the reflecting mirror 135-2 so as to illuminate the workpiece 9.
By providing an illumination optical-fiber system in this way, the marking line K and the teaching point P on the surface of the workpiece 9 can be displayed brightly on the image display 7, and the position of the teaching point P can be judged more accurately. Furthermore, if semiconductor laser light is used as the illumination light of the illumination optical-fiber system to illuminate the teaching point and its surroundings, a clear image can be obtained without being affected by disturbance light.
Further, as a fourth embodiment, the three-dimensional position sensor may have an autofocus mechanism and a mechanism for switching the irradiation direction of the measurement beams, as shown in Figs. 17 to 20. In Figs. 17 and 18, the teaching head 3 comprises: a threaded portion 209 for sliding the imager 35 back and forth; a lead screw 210 attached to the threaded portion 209; a bearing portion 211 which is fitted to the end of the lead screw 210 and part of which is fixed to the head housing 30; a shaft coupling 212 attached to one end of the lead screw 210; a motor 213, one end of which is attached to the shaft coupling 212 and part of which is fixed to the head housing 30; a positioning pin 223-1 attached to the head housing 30 to limit the sliding distance of the imager 35; a seesaw mechanism 219 which holds each of the aforementioned mirrors 32b so as to reflect light onto the point P of Fig. 20 on the surface of the workpiece 9 at a far distance, or onto the point Q of Fig. 20 on the surface of the workpiece 9 at a near distance, on the center axis of the solid-state image sensor 35a; an aperture 219-1 for the reflected light, attached to the tip of the seesaw mechanism 219; a shaft fitted at the center of rotation of the seesaw mechanism 219; a solenoid 220 built into the head housing 30 so as to push one end of the seesaw mechanism 219; a spring portion 223 built into the head housing 30 on the side opposite the solenoid 220 so as to push the other end of the seesaw mechanism; and stops 224 and 225 which limit the rotation of the seesaw mechanism 219. The operation of this fourth embodiment will be described with reference to Figs. 17 and 19, for the case where the distance of the teaching head from the workpiece 9 is measured at two settings, far and near, as shown in Fig. 19. First, the motor 213 is driven to move and adjust the imager 35 along the axial direction, whereby focusing is carried out automatically. The far or near setting is then selected, and the seesaw mechanism 219 is rotated by the extension and retraction of the solenoid 220, so that the reflection direction of the laser light generated by the semiconductor laser devices 31a to 34a can be switched between a larger and a smaller angle. As an effect of this embodiment, by positioning the teaching head 3 at the far position in Fig. 19, making the reflection angle of the laser beams (L-1), (L-2), ... small, and focusing the imager 35 at long range, the entire workpiece 9 can be imaged as shown in Fig. 20, so that coarse teaching can be performed. Subsequently, the head 3 is positioned close to the workpiece 9, the reflection angle of the laser beams (L-1), (L-2), ... is made large, and the imager 35 is focused at short range, whereby precise teaching similar to that of the embodiment described above can be performed.
Next, an embodiment of the three-dimensional position setting system according to the present invention will be described with reference to Figs. 21 to 24. In this three-dimensional position setting system, the teaching point position captured on the image display is read with a joystick or the like, and teaching is performed automatically by combining it with the facing distance and facing attitude data obtained from the measurement beams.
Fig. 21 shows the configuration of this embodiment, which comprises a joystick 316 for pointing to an arbitrary position on the image display 7, and a control unit 317 that combines the position designated by the joystick 316 with the facing distance and facing attitude calculated by the image processing circuit 8a and outputs teaching data.
The operation of this embodiment is as follows. In Fig. 21, let the line connecting the projectors 31 and 33 be the X axis and the line connecting the projectors 32 and 34 be the Y axis; the plane containing the X and Y axes is orthogonal to the optical axis of the camera 35. This optical axis is taken as the Z axis; F denotes the focal point on the Z axis, and Z0 denotes the coordinate value of the intersection P0 (the reference position) of the Z axis with the workpiece surface. If α is the angle at which the spot beams from the projectors 31 to 34 intersect the Z axis, then sectional views in the plane containing the Y axis and in the plane containing the X axis are shown in Figs. 22A and 22B, respectively. The overall relationship of the X, Y, and Z axes is shown in Fig. 23.
In Fig. 22A, let Y2 and Y4 be the Y coordinates of the spot lights S2 and S4, respectively, and let θ be the inclination angle, about the X axis, of the surface of the workpiece 9 at the intersection P0, as illustrated. In the figure the actual value of Y4 is negative. Since the irradiation angle α is a constant value, the Z coordinate value Z0Y of the intersection P0 is obtained by a simple geometrical consideration using the similarity of triangles, as in equation (1):

    Z0Y = 2 Y2 Y4 / ((Y2 - Y4) tan α)    ... (1)

The sign of the Z axis is taken as positive in the direction toward the imager 35.
Similarly, in Fig. 22B, let X1 and X3 (the actual value of X3 being negative) be the X coordinates of the spot lights S1 and S3, and let θ be the tilt angle, about the Y axis, of the workpiece 9 surface at the intersection point P0; the Z coordinate Z0X of the intersection point P0 is then expressed by equation (2):

Z0X = 2·X1·X3 / {(X1 − X3)·tan α}    ...(2)

In theory the two Z coordinate values Z0Y and Z0X should coincide, but since they are obtained from measured values they may not. The Z coordinate value Z0 of the intersection point P0 is therefore taken as the average of equations (1) and (2), as shown in equation (3):
Z0 = (Z0Y + Z0X) / 2
   = {Y2·Y4 / (Y2 − Y4) + X1·X3 / (X1 − X3)} / tan α    ...(3)

The distance ZS between the imager 35 and the focal point F is a value determined in advance by the mechanical construction.
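As a check on equations (1) to (3), the triangulated Z coordinate can be sketched in a few lines of Python. This is an illustrative aid, not part of the patent; the function and parameter names (`z_from_spots`, `y2`, `y4`, `x1`, `x3`, `alpha`) are assumptions.

```python
import math

def z_from_spots(y2, y4, x1, x3, alpha):
    """Z coordinate Z0 of the optical-axis/workpiece intersection P0.

    y2, y4: Y coordinates of spot lights S2 and S4 (Y4 typically negative)
    x1, x3: X coordinates of spot lights S1 and S3 (X3 typically negative)
    alpha:  irradiation angle of the beams against the Z axis, in radians
    """
    z0y = 2.0 * y2 * y4 / ((y2 - y4) * math.tan(alpha))  # equation (1)
    z0x = 2.0 * x1 * x3 / ((x1 - x3) * math.tan(alpha))  # equation (2)
    return (z0y + z0x) / 2.0                             # equation (3)
```

When the Y-pair and X-pair of spots give slightly different depths (for example Y2 = 3, Y4 = −3 but X1 = 2, X3 = −2 with α = 45°), the average is returned, mirroring the text's remark that Z0Y and Z0X agree only in theory.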
The tilt angles φ and θ of the workpiece surface at the intersection point P0 in Figs. 22A and 22B, about the X axis and about the Y axis respectively, are likewise obtained by simple geometrical considerations, as equations (4) and (5):

φ = −tan⁻¹ {(Y2 + Y4) / ((Y2 − Y4)·tan α)}    ...(4)

θ = −tan⁻¹ {(X1 + X3) / ((X1 − X3)·tan α)}    ...(5)

The signs of φ and θ are taken to be negative for tilts in the directions shown in Figs. 22A and 22B.
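The tilt angles of equations (4) and (5) can be sketched in the same illustrative Python style (names assumed; the sign convention follows Figs. 22A and 22B as reconstructed in the text above, which is itself an interpretation of the garbled original):

```python
import math

def tilt_angles(y2, y4, x1, x3, alpha):
    """Tilt of the workpiece surface at P0, in radians.

    Returns (phi, theta): phi about the X axis from equation (4),
    theta about the Y axis from equation (5). Tilts in the directions
    drawn in Figs. 22A/22B come out negative.
    """
    t = math.tan(alpha)
    phi = -math.atan((y2 + y4) / ((y2 - y4) * t))    # equation (4)
    theta = -math.atan((x1 + x3) / ((x1 - x3) * t))  # equation (5)
    return phi, theta
```

Spots placed symmetrically about the optical axis (Y2 = −Y4, X1 = −X3) give zero tilt, as expected for a surface perpendicular to the Z axis.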
Thus, by obtaining the coordinate values X1, Y2, X3, Y4 of the spot lights S1, S2, S3, S4 displayed on the cathode-ray tube serving as the image display 7, the coordinate value Z0 of the intersection point P0 of the optical axis 319 (the Z axis) with the workpiece 9 surface, and the tilt angles φ and θ of the workpiece 9 at the intersection point P0, can be calculated.
Next, as shown in Fig. 24, the Z coordinate value ZP of a teaching point P (XP, YP, ZP) on the scribed line K displayed on the image display 7 is obtained. The teaching point P is designated on the image display 7 by operating the joystick 318 connected to the control unit 317. Accordingly, if the teaching point P lies in the vicinity of the intersection point P0, so that the tilt angles φ and θ do not change greatly between P and P0, the Z coordinate value ZP of the teaching point P equals the Z coordinate value Z0 of the intersection point P0 plus the changes due to the tilt, −YP·tan φ and −XP·tan θ. Since the tilt angles φ and θ are given by equations (4) and (5), the Z coordinate value ZP of the teaching point P finally becomes equation (6):

ZP = (YP·Y2 + Y2·Y4 + Y4·YP) / {(Y2 − Y4)·tan α}
   + (XP·X1 + X1·X3 + X3·XP) / {(X1 − X3)·tan α}    ...(6)
Accordingly, the three-dimensional coordinates (XP, YP, ZP) of the teaching point P are obtained from the two-dimensional positional relationship (X1, Y2, X3, Y4) of the spot lights S1 to S4 and the irradiation angle α of the beams. From the coordinate relationship between the intersection point P0 and the teaching point P, therefore, the movement information for moving the machining head 2 of the robot from its present stopped position above the intersection point P0 to the position above the teaching point P is obtained.
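Equation (6) combines the reference-position depth with the tilt correction; a hypothetical Python rendering follows (names assumed, valid only while the tilt stays roughly constant between P0 and P, as the text requires):

```python
import math

def z_of_teaching_point(xp, yp, x1, x3, y2, y4, alpha):
    """Z coordinate ZP of a designated teaching point P(XP, YP), equation (6)."""
    t = math.tan(alpha)
    zy = (yp * y2 + y2 * y4 + y4 * yp) / ((y2 - y4) * t)
    zx = (xp * x1 + x1 * x3 + x3 * xp) / ((x1 - x3) * t)
    return zy + zx
```

At XP = YP = 0 this reduces to Z0 of equation (3), which is a convenient sanity check.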
On the basis of this operating principle, the control unit 317 is configured to execute the teaching process for the robot in accordance with the flowchart of Fig. 25.
When the power is turned on and the various initial processes are completed, the values of the coordinate positions X1, Y2, X3, Y4 of the four spot lights S1, S2, S3, S4 shown in Fig. 24 and displayed on the image display 7 are read in step Q1 via the image identification circuit 8a. Next, in step Q2, the tilt angles φ and θ at the intersection point P0 of the optical axis 319 (Z axis) with the workpiece 9 surface are calculated from the read coordinate values using equations (4) and (5). When the calculation of the tilt angles is completed, the Z coordinate value Z0 of the intersection point P0 is calculated in step Q3 using equation (3).

When the above processing is completed, the system waits in step Q4 for a teaching point P to be designated by operating the joystick 318. When the designation of the teaching point P is input, the coordinate values XP and YP of the designated teaching point P are read in step Q5 via the aforementioned image identification circuit 8a. Once the coordinate values XP and YP are obtained, the Z coordinate value ZP of the teaching point P is calculated in step Q6 using equation (6). When the coordinates (XP, YP, ZP) of the teaching point P have been obtained, movement information consisting of a direction and a distance for the teaching head 3 of the robot is calculated in step Q7 from the coordinate relationship between the intersection point P0 and the teaching point P. In accordance with this movement information, the teaching head 3 is moved to the position above the teaching point P; its position is controlled so that the distance between the focal point F and the teaching point P coincides with the initial distance between the focal point F and the intersection point P0. Next, in step Q8, the attitude of the teaching head 3 is controlled to values predetermined according to the tilt angles φ and θ. When the attitude control is completed, the three-dimensional coordinates and attitude-angle data of the teaching head 3 corresponding to this teaching point P are stored in the storage unit in step Q9. This completes the teaching process for the teaching point P.
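The flow of steps Q1 to Q9 can be outlined as follows. This is a hypothetical Python sketch, not the control unit 317's actual implementation: the callbacks stand in for the image identification circuit 8a, the joystick 318, and the robot drive, and the formulas inline equations (3) to (6).

```python
import math

def teach_point(read_spots, read_pointer, move_head, set_attitude, store, alpha):
    """One teaching cycle (steps Q1-Q9 of Fig. 25, sketched)."""
    t = math.tan(alpha)
    x1, y2, x3, y4 = read_spots()                              # Q1: spot coordinates
    phi = -math.atan((y2 + y4) / ((y2 - y4) * t))              # Q2: equation (4)
    theta = -math.atan((x1 + x3) / ((x1 - x3) * t))            # Q2: equation (5)
    z0 = (y2 * y4 / (y2 - y4) + x1 * x3 / (x1 - x3)) / t       # Q3: equation (3)
    xp, yp = read_pointer()                                    # Q4-Q5: designated point
    zp = ((yp * y2 + y2 * y4 + y4 * yp) / ((y2 - y4) * t)
          + (xp * x1 + x1 * x3 + x3 * xp) / ((x1 - x3) * t))   # Q6: equation (6)
    move_head(xp, yp, zp)                                      # Q7: move above P
    set_attitude(phi, theta)                                   # Q8: attitude control
    store((xp, yp, zp), (phi, theta))                          # Q9: store taught data
    return (xp, yp, zp), (phi, theta), z0
```

The cycle is driven once per designated point; the standoff rule of step Q7 (keep the original F-to-P0 distance) would live inside the `move_head` callback.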
With the three-dimensional teaching device for a robot constructed in this way, the robot operator turns on the power of the teaching device, lights the projectors 31, 32, 33, and 34, moves the teaching head 3 to a position near a teaching point P formed, for example, on the scribed line K, and confirms that the teaching point P is displayed on the image display 7. It is then only necessary to designate that teaching point P with the joystick 318. When the teaching point P is designated, the teaching head 3 automatically moves to the position above the teaching point P, keeps a predetermined distance from the teaching point P, and stops at a predetermined attitude angle. The coordinates and attitude data of the teaching head 3 for this teaching point P are then stored in the storage unit.
Since teaching is thus executed automatically with the operator merely designating the teaching point P, the efficiency of the teaching work can be improved considerably, and as a result the operating rate of the robot can be improved considerably. Moreover, since the operator need not set the position and attitude of the machining head 2 with respect to each teaching point P by eye, the teaching accuracy can be improved, and with it the machining accuracy on the workpiece 9.
In this embodiment the joystick 318 is used as the means for designating the teaching point P, but any means capable of electrically designating an arbitrary position (coordinates) on the image display 7, such as a light pen, a digitizer, or a trackball, may be used.
Another embodiment of the three-dimensional position setting system of the present invention will now be described. It has the same configuration as that shown in Fig. 21, but adopts a method of electrically reading the teaching point position imaged on the image display, combining it with the facing-distance and facing-attitude data obtained from the measuring beams, and automatically tracking a succession of teaching point marks.
The three-dimensional coordinates (XP1, YP1, ZP1) of the first teaching point P1 are obtained from the two-dimensional positional relationship (X1, Y2, X3, Y4) of the spot lights S1 to S4 and the irradiation angle α of the beams. From the coordinate relationship between the intersection point P0 and the first teaching point P1, therefore, the movement information for moving the teaching head 3 of the robot from its present stopped position above the intersection point P0 to the position above the first teaching point P1 is obtained.
On the basis of this operating principle, the control unit 317 is configured to execute the teaching process for the robot in accordance with the flowchart of Fig. 26.
When the power is turned on and the various initial processes are completed, the values of the coordinate positions X1, Y2, X3, Y4 of the four spot lights S1, S2, S3, S4 displayed on the image display 7 are read in step Q11 via the image identification circuit 8a. Next, in step Q12, the tilt angles φ and θ at the intersection point P0 of the optical axis 319 (Z axis) with the workpiece surface are calculated from the read coordinate values using equations (4) and (5). When the calculation of the tilt angles is completed, the Z coordinate value Z0 of the intersection point P0 is calculated in step Q13 using equation (3).

When the above processing is completed, the system waits in step Q14 for the first teaching point P1 to be designated by operating the joystick 318. When the designation of the first teaching point P1 is input, the coordinate values XP1 and YP1 of the designated first teaching point P1 are read in step Q15 via the aforementioned image identification circuit 8a. Once the coordinate values XP1 and YP1 are obtained, the Z coordinate value ZP1 of the first teaching point P1 is calculated in step Q16 using equation (6). When the coordinates (XP1, YP1, ZP1) of the first teaching point P1 have been obtained, movement information consisting of a direction and a distance for the teaching head 3 of the robot is calculated in step Q17 from the coordinate relationship between the intersection point P0 and the first teaching point P1. In accordance with this movement information, the teaching head 3 is moved to the position above the first teaching point P1; its position is controlled so that the distance between the focal point F and the first teaching point P1 coincides with the initial distance between the focal point F and the intersection point P0.

Next, in step Q18, the attitude of the teaching head 3 is controlled to values predetermined according to the tilt angles φ and θ. When the attitude control is completed, the three-dimensional coordinates and attitude-angle data of the machining head corresponding to this first teaching point P1 are stored in the storage unit in step Q19. This completes the teaching process for the first teaching point P1.
When the teaching process for the first teaching point P1 is completed, another teaching point P near the first teaching point P1 displayed on the image display 7 is searched for in step Q20. In this search operation, as shown in steps Q21 to Q25, the brightness level of the first teaching point P1 is identified by the image identification circuit 8a, and when the teaching head 3 has moved to this teaching point P1, this first teaching point P1 becomes the new intersection point P0. Next, the image identification circuit 8a identifies, among the adjacent pixels, the pixel whose brightness level is closest to that of the first teaching point as the adjacent teaching point PN. After the teaching head 3 has completed the above movement, the teaching point PN is automatically designated as the new teaching point P1.

The teaching process described above is then executed for the automatically designated teaching point PN as the new first teaching point P1. When the teaching point PN reaches the final teaching point PE in step Q26, all the teaching processing for this robot is completed.
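The neighbour search of steps Q21 to Q25 can be sketched as a brightness-nearest pick among the eight pixels around the current point. This is an illustrative assumption of one way to realize the described identification, not the patent's circuit; boundary handling is omitted, and `image` is a hypothetical 2-D list of grey levels indexed as image[row][column].

```python
def next_teaching_point(image, px, py, level):
    """Among the 8 neighbours of (px, py), return the pixel whose grey
    level is closest to `level`, i.e. the adjacent teaching point PN."""
    neighbours = [(px + dx, py + dy)
                  for dx in (-1, 0, 1) for dy in (-1, 0, 1)
                  if (dx, dy) != (0, 0)]
    return min(neighbours, key=lambda p: abs(image[p[1]][p[0]] - level))
```

Each move makes the found pixel the new first teaching point, so repeated calls trace the scribed line until the final teaching point PE is reached.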
With the three-dimensional teaching device for a robot constructed in this way, the robot operator turns on the power of the teaching device, lights the projectors 31, 32, 33, and 34, moves the teaching head 3 to a position near, for example, the first teaching point P1 on the scribed line K, confirms that this first teaching point P1 is displayed on the image display 7, and then merely designates that first teaching point P1 with the joystick 318. When this first teaching point P1 is designated, the teaching head 3 automatically moves to the position above the first teaching point P1, keeps a predetermined distance from the teaching point P1, and stops at a predetermined attitude angle. The coordinates and attitude data of the teaching head 3 for this teaching point P1 are then stored in the storage unit. When the storage is completed, the next bright teaching point P is automatically designated as the first teaching point P1, and the teaching head 3 moves to this new first teaching point. In this way the teaching head 3 moves on to new teaching points PN one after another.
The contents of step Q17 and step Q18 are shown in detail as subroutines in Fig. 27 and Fig. 28, respectively.
Since teaching is thus executed automatically with the operator merely designating the first teaching point P1, the efficiency of the teaching work can be improved considerably, and as a result the operating rate of the robot can be improved considerably. Moreover, since the operator need not set the position and attitude of the teaching head 3 with respect to each teaching point P by eye, the teaching accuracy can be improved, and with it the machining accuracy on the workpiece 9.
In the operation shown in Fig. 26, the spot coordinates are measured automatically from the spots displayed on the cathode-ray tube; however, the same effect is obtained if, as shown as steps Q31 to Q46 in Fig. 29, the spots imaged by the imager are image-processed directly and their coordinates are measured automatically.
Industrial Applicability

According to the present invention, numerous effects can be obtained: the facing distance and facing attitude between the head and the object can be measured by electrically measuring each measuring bright spot, or by the degree of coincidence with marks fixedly displayed on the image display; the teaching position can be confirmed by bringing a position set point on the object into coincidence with a mark fixedly displayed on the image display; by using the imager simultaneously with these measurements, the surface condition of the object can be observed on the image display; and furthermore, position set points such as scribed lines can be detected electrically by image processing.
Claims
(1) A three-dimensional position sensor for detecting a three-dimensional position and attitude with respect to a position set point displayed on an object, comprising:
a sensor body;
light projecting means for projecting light obliquely onto the object so as to form, on the surface of the object, a plurality of bright spots constituting at least the vertices of a triangle;
imaging means for capturing an image including the bright spots formed on the object by the light projecting means, the position set point displayed on the object, and the surface of the object; and
image processing means for electrically reading the bright spots and the position set point imaged by the imaging means, and for detecting the three-dimensional position and attitude of the sensor body with respect to the position set point.
(2) The three-dimensional position sensor according to claim 1, wherein the light projecting means comprises a first projector for projecting onto the object a dedicated light beam for detecting the distance between the sensor body and the position set point, and
the image processing means detects the distance between the sensor body and the position set point via a first bright spot formed on the object by the beam projected from this projector.
(3) The three-dimensional position sensor according to claim 2, wherein the first projector projects the light beam obliquely onto the object.
(4) The three-dimensional position sensor according to claim 3, wherein the light projecting means comprises a plurality of second projectors provided on the sensor body for projecting a plurality of light beams onto the object so as to form a plurality of second bright spots which, together with the first bright spot, constitute at least a triangle, and
the image processing means detects the attitude of the sensor body with respect to the object from the first and second bright spots.
(5) The three-dimensional position sensor according to claim 1, further comprising image display means connected to the imaging means and having a screen for displaying the image captured by the imaging means.
(6) The three-dimensional position sensor according to claim 5, wherein the image processing means is connected to the image display means and electrically reads the bright spots and the position set point imaged by the imaging means from the screen of the image display means.
(7) The three-dimensional position sensor according to claim 6, wherein the image display means comprises a plurality of reference position marks fixedly displayed at predetermined positions on the screen, which serve as alignment targets for the position set point displayed on the screen and which coincide with the bright spots when the three-dimensional position of the sensor body with respect to the position set point on the object reaches a predetermined relative position.
(8) The three-dimensional position sensor according to claim 4, wherein each of the projectors has support means for supporting it so that its projecting direction can be changed, and drive means for changing the projecting direction.
(9) The three-dimensional position sensor according to claim 4, wherein the light projecting means comprises illumination means for illuminating the surface of the object at a brightness between the brightness of the first and second bright spots and the brightness of disturbance light.
(10) The three-dimensional position sensor according to claim 4, wherein the image processing means subtracts an image signal representing the brightness of the disturbance light and of the illumination light from the illumination means from the image signal representing the brightness of the object and of each bright spot captured by the imaging means, extracts only the image signal of each bright spot, and electrically detects the position of each bright spot.
(11) The three-dimensional position sensor according to claim 1, wherein the imaging means comprises:
an imaging body which has a photographing lens and captures an image through the photographing lens; and
drive means for focusing the imaging body by moving the photographing lens along the optical-axis direction in accordance with the distance to the position set point on the object.
(12) The three-dimensional position sensor according to claim 1, wherein the imaging means comprises:
an imaging body having a light receiving surface for receiving light from the image; and
an optical filter attached to the imaging body so as to be positioned in front of the light receiving surface, for blocking the passage of light having components other than the wavelength component of the light beams output from the light projecting means.
(13) The three-dimensional position sensor according to claim 4, wherein each of the projectors comprises:
a light emitting source provided outside the sensor body;
a projecting source provided inside the sensor body; and
an optical fiber system connecting the light emitting source and the projecting source so that light can be transmitted between them.
(14) The three-dimensional position sensor according to claim 1, wherein the imaging means comprises:
an imaging element provided outside the sensor body;
a photographing lens provided inside the sensor body; and
an optical fiber system connecting the imaging element and the photographing lens so that light can be transmitted between them.
(15) The three-dimensional position sensor according to claim 1, wherein the imaging means comprises:
a light source provided outside the sensor body, the luminous intensity of the light emitted by the light source being adjustable;
an illumination lens provided inside the sensor body; and
an optical fiber system connecting the light source and the illumination lens so that light can be transmitted between them.
(16) A three-dimensional position setting system which detects the three-dimensional position and attitude of a head with respect to a position set point displayed on an object and which can bring the head to face an arbitrary position set point, comprising:
light projecting means provided on the head for projecting light obliquely onto the object so as to form, on the surface of the object, a plurality of bright spots constituting at least the vertices of a triangle;
imaging means provided on the head for capturing an image including the bright spots formed on the object by the light projecting means, the position set point displayed on the object, and the surface of the object;
image processing means for electrically reading the bright spots and the position set point imaged by the imaging means, and for detecting the three-dimensional position and attitude of the head with respect to the position set point;
storage means for storing information on the three-dimensional position and attitude of the head with respect to the position set point detected by the image processing means; and
drive means for reading out the information stored in the storage means and, on the basis of this information, causing the head to face the position set point at a predetermined three-dimensional position and attitude.
(17) The three-dimensional position setting system according to claim 16, further comprising image display means connected to the imaging means and having a screen for displaying the image captured by the imaging means.
(18) The three-dimensional position setting system according to claim 17, wherein the image processing means comprises:
reference position calculating means for calculating, from the two-dimensional positional relationship of the bright spots displayed on the screen of the image display means and the irradiation angle of the light projecting means, the distance from the head to the position set point and the tilt angle of the head with respect to the object, which are determined by the positional relationship among the bright spots on the object;
designating means for designating a teaching point on the screen of the image display means;
automatic measuring means for automatically measuring the two-dimensional positional relationship between the teaching point designated by the designating means and the position set point; and
movement information calculating means for calculating movement information of the head to the teaching point using the positional relationship measured by the automatic measuring means, the distance between the position set point and the head, and the tilt angle of the head.
19) The drive means comprises: moving means for moving the head to the teaching point on the basis of the movement information calculated by the movement information calculating means;
and posture control means for controlling, on the basis of the inclination angle, the posture of the head moved by the moving means; the three-dimensional position setting system according to claim 18, characterized by comprising the above means.
The image processing means comprises:
reference position calculating means for calculating, from the two-dimensional positional relationship of the bright spots displayed on the screen of the image display means and the irradiation angles, the distance from the head to the reference position determined by the positional relationship between the bright spots on the object, and the inclination angle of the head with respect to the object surface;
image identifying means for identifying the brightness of each teaching point displayed on the screen in a plurality of levels;
designating means for designating, among the teaching points displayed on the screen, the first teaching point;
automatic measuring means for automatically measuring the two-dimensional positional relationship between the first teaching point designated by the designating means and the reference position;
and movement information calculating means for calculating movement information for moving the head to the first teaching point, using the positional relationship measured by the automatic measuring means, the distance between the reference position and the head, and the inclination angle; the three-dimensional position setting system according to claim 17, characterized by comprising the above means.
21) The drive means comprises:
moving means for moving the head to the first teaching point on the basis of the movement information calculated by the movement information calculating means;
first posture control means for controlling, on the basis of the inclination angle, the posture of the head moved by the moving means;
automatic designating means for searching, by the image identifying means, the pixels around the first teaching point for the pixel whose brightness is closest to that of the teaching point, thereby identifying the adjacent teaching point, and sequentially and automatically designating each adjacent teaching point so as to replace the first teaching point;
and second posture control means for sequentially moving the head to the teaching points sequentially designated by the automatic designating means and sequentially controlling the posture of the head; the three-dimensional position setting system according to claim 20, characterized by comprising the above means.
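The automatic designating means of claims 20 and 21 can be read as a greedy, brightness-matched walk over the image: starting from the designated first teaching point, repeatedly pick the unvisited neighbouring pixel whose brightness is closest to the current point, and make it the new current point. A hypothetical sketch (the function names and the 8-neighbour choice are assumptions for illustration):

```python
def next_teaching_point(image, current, visited):
    """Among the 8 neighbours of the current teaching point, return the
    unvisited pixel whose brightness is closest to the current one."""
    height, width = len(image), len(image[0])
    cy, cx = current
    target = image[cy][cx]
    best, best_diff = None, None
    for dy in (-1, 0, 1):
        for dx in (-1, 0, 1):
            if dy == 0 and dx == 0:
                continue
            y, x = cy + dy, cx + dx
            if 0 <= y < height and 0 <= x < width and (y, x) not in visited:
                diff = abs(image[y][x] - target)
                if best_diff is None or diff < best_diff:
                    best, best_diff = (y, x), diff
    return best

def trace_teaching_points(image, start, max_steps):
    """Chain teaching points by repeatedly replacing the current point with
    its closest-brightness neighbour, as the automatic designation does."""
    path, visited = [start], {start}
    for _ in range(max_steps):
        nxt = next_teaching_point(image, path[-1], visited)
        if nxt is None:
            break
        path.append(nxt)
        visited.add(nxt)
    return path
```

On an image where a bright seam runs through a dark background, the walk follows the seam pixel by pixel, which is the behaviour the claim describes for tracing successive teaching points.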
22) The three-dimensional position setting system according to claim 16, characterized in that the head is provided with a machining head.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US07/023,553 US4879664A (en) | 1985-05-23 | 1986-01-28 | Three-dimensional position sensor and three-dimensional position setting system |
Applications Claiming Priority (6)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP11090985A JPS61270093A (en) | 1985-05-23 | 1985-05-23 | Three-dimensional position setting auxiliary equipment |
JP60/110909 | 1985-05-23 | ||
JP17551785A JPS6235909A (en) | 1985-08-09 | 1985-08-09 | Auxiliary device for setting three-dimensional position |
JP60/175517 | 1985-08-09 | ||
JP60/235678 | 1985-10-22 | ||
JP23567885A JPS6295606A (en) | 1985-10-22 | 1985-10-22 | Setting device for 3-dimensional position |
Publications (1)
Publication Number | Publication Date |
---|---|
WO1990007690A1 true WO1990007690A1 (en) | 1990-07-12 |
Family
ID=27311838
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP1986/000033 WO1990007690A1 (en) | 1985-05-23 | 1986-01-28 | Sensor and system for setting three-dimensional position |
Country Status (1)
Country | Link |
---|---|
WO (1) | WO1990007690A1 (en) |
Citations (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPS4831552B1 (en) * | 1968-04-09 | 1973-09-29 | ||
JPS5255658A (en) * | 1975-10-31 | 1977-05-07 | Yashica Co Ltd | Distance detector |
JPS5476179A (en) * | 1977-11-29 | 1979-06-18 | Kawasaki Heavy Ind Ltd | Method of optically detecting angle |
JPS54121164A (en) * | 1978-03-10 | 1979-09-20 | Minolta Camera Co Ltd | Photometric circuit of distance detector |
JPS5618707A (en) * | 1979-07-25 | 1981-02-21 | Hitachi Ltd | Method and device for optically measuring shape |
JPS57156504A (en) * | 1981-03-23 | 1982-09-27 | Hitachi Ltd | Position recognition device |
JPS5972012A (en) * | 1982-10-19 | 1984-04-23 | Nec Corp | Method and device for detecting gap and angle |
JPS59104007U (en) * | 1982-12-28 | 1984-07-13 | 新明和工業株式会社 | distance detection device |
- 1986-01-28: PCT/JP1986/000033 filed (published as WO1990007690A1), status unknown
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US4879664A (en) | Three-dimensional position sensor and three-dimensional position setting system | |
US11262194B2 (en) | Triangulation scanner with blue-light projector | |
US10837756B2 (en) | Multi-dimensional measuring system with measuring instrument having 360° angular working range | |
JP5123932B2 (en) | Camera-equipped 6-degree-of-freedom target measurement device and target tracking device with a rotating mirror | |
KR20170130320A (en) | System and method for calibrating a vision system with respect to a touch probe | |
JP6663808B2 (en) | Image measuring device | |
US20140192187A1 (en) | Non-contact measurement device | |
JP2002090113A (en) | Position and attiude recognizing device | |
JP2002172575A (en) | Teaching device | |
JPH0428518B2 (en) | ||
JP6663807B2 (en) | Image measuring device | |
US10186398B2 (en) | Sample positioning method and charged particle beam apparatus | |
CN114509005B (en) | Coordinate measuring device with automatic target recognition function and recognition method thereof | |
CN113273176A (en) | Automated motion picture production using image-based object tracking | |
US10655946B2 (en) | Automated rotation mechanism for spherically mounted retroreflector | |
JP2007010354A (en) | Device for observing/measuring surface shape of object | |
JPH0755439A (en) | Three-dimensional shape measuring equipment | |
WO1990007690A1 (en) | Sensor and system for setting three-dimensional position | |
JPH0545117A (en) | Optical method for measuring three-dimensional position | |
JP6701460B1 (en) | Image sensor alignment system in multiple directions | |
JPH10128689A (en) | Visual correcting device of unmanned movable body | |
TWI776694B (en) | Automatic robot arm system and method of coordinating robot arm and computer vision thereof | |
JPH0719813A (en) | Three-dimensional visual recognition system | |
JPH06285619A (en) | Brazing robot | |
JPS61173877A (en) | Three-dimensional position setting auxiliary device |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| AK | Designated states | Kind code of ref document: A1. Designated state(s): US |