WO1997024206A1 - Système robotique composite de détection (Composite sensor robot system) - Google Patents
- Publication number
- WO1997024206A1 (application PCT/JP1996/003765)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- sensor
- robot
- laser
- measurement
- image
- Prior art date
Classifications
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1694—Programme controls characterised by use of sensors other than normal servo-feedback from position, speed or acceleration sensors, perception control, multi-sensor controlled systems, sensor fusion
- B25J9/1697—Vision controlled systems
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B2219/00—Program-control systems
- G05B2219/30—Nc systems
- G05B2219/36—Nc in input of data, input key till input tape
- G05B2219/36427—Jog feed to a command position, if close enough robot takes over positioning
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B2219/00—Program-control systems
- G05B2219/30—Nc systems
- G05B2219/39—Robotics, robotics to robotics hand
- G05B2219/39391—Visual servoing, track end effector with camera image feedback
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B2219/00—Program-control systems
- G05B2219/30—Nc systems
- G05B2219/40—Robotics, robotics mapping to robotics vision
- G05B2219/40607—Fixed camera to observe workspace, object, workpiece, global
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B2219/00—Program-control systems
- G05B2219/30—Nc systems
- G05B2219/40—Robotics, robotics mapping to robotics vision
- G05B2219/40613—Camera, laser scanner on end effector, hand eye manipulator, local
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B2219/00—Program-control systems
- G05B2219/30—Nc systems
- G05B2219/45—Nc applications
- G05B2219/45123—Electrogoniometer, neuronavigator, medical robot used by surgeon to operate
Definitions
- The present invention relates to a composite sensor robot system applicable in various factory settings and, more particularly, to a composite sensor robot system in which a robot is combined with a composite sensor comprising a sensor that acquires two-dimensional images and a sensor that performs three-dimensional measurement.
- A sensor that can acquire a two-dimensional image (for example, a visual sensor using a CCD camera) covers a relatively wide field, whereas laser sensors are effective for measuring local positions and shapes.
- A laser sensor is easy to mount on the robot's hand, and its output can be incorporated into the robot controller: a laser beam is projected onto the target object from a nearby position to measure the object's position, shape, and so on precisely and in real time, and the result can be used for position correction of the robot. Robot systems equipped with laser sensors are therefore widely used in arc welding robots, sealing robots, and measurement robots.
- Some laser sensors scan a spot light beam; others project slit light. Whichever type is used, the spot beam or slit light must actually be projected onto the object surface. If the target object is supplied to a known position with a certain degree of accuracy, an approach position near that fixed position can be taught to the robot in advance, and a state in which the spot beam or slit light falls on the object surface is established quickly.
- When, however, the position of the target object is not determined, or its positional accuracy is not reliable, the spot beam or slit light of the laser sensor must be emitted while the robot moves along a suitably taught path, using the beam as a probe to search for the object; measurement proper by the laser sensor (for example, shape measurement) cannot start until the object is found.
- One object of the present invention is to provide a composite sensor robot system in which, even when the position and orientation of an object are unknown, the robot quickly approaches the object and measurement of the object by a sensor with three-dimensional position measurement capability, such as a laser sensor, can be started.
- Another object of the present invention is to improve the reliability of various operations using a robot and, through this, to shorten the cycle time.
- According to the present invention, a composite sensor robot system is provided in which a robot is combined with a composite sensor comprising first sensor means capable of acquiring a two-dimensional image over a relatively wide area and second sensor means capable of three-dimensional position measurement using light projected onto a relatively narrow area.
- The composite sensor robot system includes means for outputting an operation command for the first sensor means, means for outputting an operation command for the second sensor means, means for processing the image signal obtained by the first sensor means, means for processing the measurement output signal from the second sensor means, and robot control means for controlling the robot.
- Preferably, the means for processing the image signal acquired by the first sensor means and the means for processing the measurement output signal of the second sensor means are realized by an image processing apparatus shared by both sensor means.
- The processing of the image signal obtained by the first sensor means includes processing for specifying the position of the measurement target within the relatively wide area. The robot control means causes the robot to perform an approach operation toward the measurement target based on the specified position, and the processing of the measurement output signal includes processing of the signal that the second sensor means outputs after having been moved to the approach position by the approach operation.
- Typically, the first sensor means is a visual sensor using a camera, and the second sensor means is a laser sensor of the spot-light or slit-light projection type.
- The composite sensor may contain elements shared by the first sensor means and the second sensor means; when the first sensor means is a visual sensor using a camera and the second sensor means is a laser sensor projecting laser spot light or laser slit light, the camera of the visual sensor can be shared as the light-receiving part of the laser sensor.
- Preferably, the image-signal processing includes a process of obtaining the deviation of the measurement target from a reference position, and the approach position is determined by correcting the approach position taught in advance according to the obtained deviation.
- The composite sensor robot system may also be provided with image display means for displaying the two-dimensional image acquired by the first sensor means, and with means for designating an approach position on the displayed two-dimensional image.
- In short, a second sensor of a type that lacks the ability to search for a measurement target over a relatively wide range and detect its position and attitude (for example, a laser sensor) is combined with the robot as part of a composite sensor in which this drawback is compensated for by the two-dimensional image acquisition capability of a first sensor (for example, a two-dimensional visual sensor using a CCD camera). A schematic sketch of the resulting detect, approach, and measure cycle is given below.
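- The following Python sketch illustrates that cycle only schematically; every object and method name in it is a hypothetical stand-in for the system components described above, not an API defined by the patent.

```python
# Schematic sketch of the detect / approach / measure cycle described
# above. All names (camera, laser, robot, ...) are hypothetical
# stand-ins for the components of the composite sensor robot system.
def measurement_cycle(camera, laser, robot, taught_approach_pose):
    image = camera.shoot()                    # wide-area 2-D image
    deviation = camera.locate_work(image)     # work position/posture
    robot.move_to(deviation.apply(taught_approach_pose))  # approach
    laser.start()                             # narrow-area 3-D sensing
    profile = laser.measure_during(robot.sweep_to_end())  # e.g. step d
    laser.stop()
    return profile
```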
- FIG. 1 is a diagram illustrating an outline of the overall arrangement of the composite sensor robot system according to Embodiment I of the present invention and a measurement example using the same;
- FIG. 2 is a diagram showing a schematic configuration of the laser sensor used in Embodiment I;
- FIG. 3 is a diagram explaining how the three-dimensional position of the spot beam reflection point is measured using the laser sensor shown in FIG. 2;
- FIG. 4 is a main block diagram showing the internal configuration of the image processing apparatus used in Embodiments I and II and its connection to related elements;
- FIG. 5 is a flowchart outlining the processing executed inside the robot controller when the step distribution measurement is performed in the arrangement shown in FIG. 1;
- FIG. 6 is a flowchart outlining the processing executed inside the image processing apparatus when the step distribution measurement is performed in the arrangement shown in FIG. 1;
- FIG. 7 is a diagram illustrating an outline of the overall arrangement of the composite sensor robot system according to Embodiment II of the present invention and a measurement example using the same;
- FIG. 8 is a diagram illustrating the structure and measurement principle of a slit-light projection type laser sensor;
- FIG. 9 is a flowchart outlining the processing executed inside the robot controller when the step distribution measurement is performed in the arrangement shown in FIG. 7;
- FIG. 10 is a flowchart outlining the processing executed inside the image processing device when the step distribution measurement is performed in the arrangement shown in FIG. 7;
- FIG. 11 is a diagram explaining an outline of the overall arrangement of the composite sensor robot system according to Embodiment III of the present invention and a measurement example using the same;
- FIG. 12 is a flowchart outlining the approach-related processing executed inside the robot controller during the step distribution measurement performed in the arrangement shown in FIG. 11.
- FIG. 1 shows an outline of the overall arrangement of the composite sensor robot system according to the first embodiment (Embodiment I) of the present invention, together with a measurement example using the same. Here the system is applied to measuring a step on a work: the wide-area sensor included in the composite sensor detects a work supplied at an uncertain position and in an uncertain orientation within a relatively wide range, and, after an approach operation by the robot, the narrow-area sensor included in the same composite sensor measures the step distribution.
- The entire system is composed of a robot controller 1, an image processing device 2, a laser sensor (main unit; hereinafter the same) 10, a laser sensor control unit 20, a CCD camera 30, and a robot 40.
- The laser sensor 10 is mounted on the tip of the robot 40 (near the tool center point TCP), while the CCD camera 30 is installed externally.
- The CCD camera 30, which together with the image processing device 2 forms the wide-area sensor (first sensor means), is arranged so that its field of view 31 covers the supply area 50 of the work W with a margin.
- The laser sensor 10 adopted as the narrow-area sensor (second sensor means) may be of the known spot-light projection type or of the slit-light projection type; here the former case is described.
- Scanning laser beams LB1, LB2, LB3, and so on are projected according to the robot position, and the reflected light is received by the detection unit 14.
- The field of view 31 of the camera 30 is set sufficiently large, while the scanning range (measurement range) of the laser beams LB1, LB2, LB3 by the laser sensor 10 is much narrower than the area captured by the camera 30 in the field of view 31.
- The laser sensor 10 is connected to the image processing device 2 via the laser sensor control unit 20, and the CCD camera 30 is also connected to the image processing device 2; the image processing device 2 is thus shared by the laser sensor (main unit 10 and control unit 20) and the CCD camera 30.
- The CCD camera 30 takes a picture when the work W is supplied, and the two-dimensional image obtained is sent to the image processing device 2. The image processing device 2 analyzes the image and detects the position and orientation of the work W; based on the detection result, the robot controller 1 makes the robot 40 approach the work W.
- Position 1 is the approach start position and position 2 is the approach position. The approach position 2 is a position suitable for starting the step measurement by projecting the laser beam LB2 onto the work W, and is determined according to the supply position of the work W.
- The supply positions of the individual works W can vary within the supply region 50. The robot is taught in advance the optimal approach position (the taught approach position) for the case where the work W is located at an appropriate reference position. For each work actually supplied, the deviation from the reference position (including posture) is detected, and the robot controller 1 calculates a shift (position/posture correction) that compensates for the deviation; applying this shift to the taught approach position yields the approach position 2.
- The robot 40, having completed the approach, continuously measures the step d while moving from position 2 to position 3 (the measurement end position). Position 3, the end point of the path section starting at position 2, is likewise taught to the robot in advance for the case where the work W is at the reference position; during actual work, position 3 is calculated by applying to the taught measurement end position the shift (position/posture correction) that compensates for the detected deviation of the work W from the reference position.
- As shown in FIG. 2, the projection unit 13 of the laser sensor (main unit) 10 is equipped with a laser oscillator 11 and an oscillating mirror (galvanometer) 12 for laser beam scanning, while the detection unit 14 includes a light receiving element 14a and an optical system 14b for imaging.
- The laser sensor control section 20 comprises a mirror drive section 21 for driving the oscillating mirror 12, a laser drive section 22 for driving the laser oscillator 11 to generate the laser beam, and a signal detection section 23 that detects the reflection position from the output of the light receiving element 14a. The mirror drive section 21, the laser drive section 22, and the signal detection section 23 are connected via a line 24 to a laser sensor interface of the image processing apparatus 2 (described later).
- Upon receiving a laser sensor start command, the laser drive section 22 turns on the laser oscillator 11 to generate the laser beam LB. In parallel, the mirror drive section 21 starts the oscillation of the oscillating mirror 12; as a result, the laser beam generated by the laser oscillator 11 is scanned across the object surface.
- The laser beam diffusely reflected at the reflection point S on the object surface forms an image on the light receiving element 14a, via the optical system 14b, at a position corresponding to the reflection point S.
- The light receiving element 14a may be a CCD, a PSD (Position Sensitive Detector), or the like; in this embodiment a one-dimensional CCD array is used as the light receiving element 14a.
- The light (the image of the reflected light) striking the light receiving surface of the light receiving element 14a is converted into photoelectrons and stored in the corresponding cells. The charge accumulated in the cells is output from the first cell onward at predetermined intervals in accordance with the CCD scanning signal from the signal detection section 23, and is sent to the image processing device 2 via the signal detection section 23 and the line 24.
- The scanning cycle of the CCD is set sufficiently shorter than the oscillation cycle of the oscillating mirror 12 (for example, a few hundredths of it), so that the oscillation angle of the mirror 12 and the output state of the CCD elements can be tracked at any time.
- The output state of the CCD elements is characterized by the cell position (cell number) at which the output is maximum, i.e., the cell struck by the reflected light. From this cell position, the position of the reflection point S of the laser beam LB is calculated; the software processing for this position detection is performed in the image processing device 2. A minimal sketch of the peak-cell detection follows.
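- In this sketch the CCD frame is assumed to arrive as a NumPy vector; the cell pitch and array length are illustrative values, not figures from the patent.

```python
import numpy as np

CELL_PITCH_MM = 0.014   # assumed spacing between adjacent CCD cells
NUM_CELLS = 1024        # assumed number of cells in the 1-D array

def detect_spot_position(ccd_frame: np.ndarray) -> float:
    """Return the detected position xa on the light receiving element,
    measured from the array centre, for one CCD scan."""
    peak_cell = int(np.argmax(ccd_frame))   # cell where output is maximum
    return (peak_cell - NUM_CELLS / 2) * CELL_PITCH_MM
```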
- FIG. 3 is a diagram for explaining how the coordinate position (X, Y) of the reflection point S in the sensor coordinate system is obtained from the position xa detected by the light receiving element 14a.
- The sensor origin (0, 0) lies on the line joining the center of the optical system 14b and the center point of the light receiving element 14a; this line is taken as the Y axis, and the axis orthogonal to it as the X axis. Let L1 be the distance from the origin to the center of the optical system, L2 the distance from the center of the optical system to the center of the light receiving element 14a, D the distance along the X axis from the sensor origin to the oscillation center of the mirror 12, L0 the distance along the Y axis from the sensor origin to the oscillation center of the mirror 12, and θ the deflection angle of the mirror. The coordinates (X, Y) of the point S at which the laser beam LB is reflected by the object are then given by:
- X = xa · [(L1 − L0) · tanθ + D] / (xa + L2 · tanθ)
- Y = [L1 · xa + L2 · (L0 · tanθ − D)] / (xa + L2 · tanθ)
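- A direct transcription of these triangulation relations, assuming consistent length units and the sign conventions of FIG. 3:

```python
import math

def reflection_point(xa, theta, L0, L1, L2, D):
    """Sensor-frame coordinates (X, Y) of the reflection point S from
    the detected element position xa and mirror deflection angle theta
    (radians), using the relations given above."""
    t = math.tan(theta)
    X = xa * ((L1 - L0) * t + D) / (xa + L2 * t)
    Y = (L1 * xa + L2 * (L0 * t - D)) / (xa + L2 * t)
    return X, Y
```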
- As shown in FIG. 4, the image processing apparatus 2 includes a CPU (central processing unit) 201, to which a laser sensor interface 202, a camera interface 203, an image memory 204, a program memory 205, an image processing processor 206, a data memory 207, a monitor interface 208, a console interface 209, and a communication interface 210 are connected via a bus BS.
- The laser sensor interface 202 serves as an input/output device for the laser sensor control unit 20 connected to the laser sensor body 10; it controls the operation of the laser sensor 10 via the mirror drive section 21 and the laser drive section 22 in the manner described above.
- The detection signal sent via the signal detection section 23 is A/D-converted and stored in the data memory 207; the stored data is used when the step calculation program stored in the program memory 205 is started.
- The slit light projecting unit 131, whose connection is indicated by a broken line, is used when a slit-light projection type laser sensor is employed; this is described in detail in Embodiment II.
- The camera interface 203 functions as an input/output device for the CCD camera 30. Upon receiving a shooting command from the CPU 201, it sends a shooting signal to the CCD camera 30, and the image obtained by shooting is converted to grayscale and stored in the image memory 204. The stored image is read out when the position detection program stored in the program memory 205 is started and is used for image analysis by the image processing processor 206.
- In the present embodiment, the positions of the two feature points P and Q of the work W are detected by image analysis using the image processing processor 206.
- The monitor interface 208 is connected to a TV monitor. When a monitor image display command is received from the CPU 201, the current image obtained by the CCD camera 30 or an image stored in the image memory 204 is displayed on the TV monitor as appropriate.
- The console 211, connected via the console interface 209, is used for manual input of various operation commands for the image processing apparatus 2 and for registering, editing, and starting various programs and setting parameters.
- The communication interface 210 is used for transmitting and receiving commands and data to and from the robot controller 1.
- As the robot controller 1, an ordinary one equipped with a CPU, ROM, RAM, an axis controller, a servo circuit, a communication interface, and so on is used; a description of its internal configuration and functions is therefore omitted here.
- The procedure of the step distribution measurement of the work W executed in the arrangement shown in FIG. 1 will now be described with reference to the flowcharts of FIG. 5 (processing inside the robot controller 1) and FIG. 6 (processing inside the image processing apparatus 2).
- It is assumed that calibration of the laser sensor 10 and the CCD camera 30 has been completed, that image data of the work W at the reference position described above has been acquired, and that the approach start position (position 1), the taught approach position, and the taught measurement end position have already been taught to the robot controller 1.
- First, the robot controller 1 reads one block of the operation program describing the move command to position 1 (the approach start position), performs the usual processing (step R1), and moves the robot 40 to position 1 (step R2).
- Upon completion of the movement, a work position detection command is sent to the image processing device 2 (step R3), and the controller waits for the detection result of the position of the work W (the positions of P and Q) from the image processing device 2 (step R4).
- Upon receiving the detection result, data representing the deviation of the work W from the reference position in the robot coordinate system is calculated, and the necessary robot correction amount is computed (step R5). The robot correction amount can be calculated, for example, as a 4 × 4 homogeneous transformation matrix representing the deviation of the vector PQ and the point P from the reference position; a sketch of such a calculation follows.
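- The sketch below assumes the deviation of the work is a translation plus a rotation about the Z axis of the robot coordinate system, so that the two feature points determine it completely; the point names are illustrative.

```python
import numpy as np

def correction_matrix(p_ref, q_ref, p_det, q_det) -> np.ndarray:
    """4x4 homogeneous transform mapping the reference feature points
    P, Q onto the detected ones (planar translation plus rotation)."""
    p_ref = np.asarray(p_ref, float)
    q_ref = np.asarray(q_ref, float)
    p_det = np.asarray(p_det, float)
    q_det = np.asarray(q_det, float)
    ref = q_ref - p_ref
    det = q_det - p_det
    dth = np.arctan2(det[1], det[0]) - np.arctan2(ref[1], ref[0])
    c, s = np.cos(dth), np.sin(dth)
    T = np.eye(4)
    T[:2, :2] = [[c, -s], [s, c]]
    T[:2, 3] = p_det[:2] - T[:2, :2] @ p_ref[:2]   # P maps exactly to P'
    return T

# The corrected approach pose is then T @ taught_approach_pose.
```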
- Next, one block describing a move instruction to position 2 (the approach position) is read out (step R6), and the robot 40 is moved to the approach position 2 (step R7).
- The approach position 2 is the taught approach position shifted by the robot position correction amount obtained for the work W. The motion type during the approach operation is each-axis movement or linear movement.
- Upon completion of the approach, a laser sensor start command is sent to the image processing device 2 (step R8). Then one block describing a move instruction to position 3 (the measurement end position) is read out (step R9), and the robot 40 is moved to position 3 (step R10).
- Position 3 is the taught measurement end position shifted by the above-described robot position correction amount. The motion during measurement is linear movement along the direction in which the step extends.
- On the image processing device 2 side, the device first waits to receive the work position detection command from the robot controller 1 (step S1).
- Upon receiving the command, a shooting command is output to the CCD camera 30 (step S2), and an image capturing the work W within the field of view 31 is taken into the image processing device 2 and stored in the image memory 204 (step S3).
- Next, the position detection program stored in the program memory 205 is started; the stored image data is read out and analyzed by the image processing processor 206, and the positions of the two feature points P and Q of the work W are detected (step S4).
- The detection result is immediately sent to the robot controller 1 (step S5), and the device waits to receive the laser sensor start command from the robot controller 1 (step S6).
- Upon receiving the start command, a start command for the laser sensor 10 is sent to the laser sensor control unit 20, measurement of the step begins, and accumulation of the measurement results starts (step S7). Accumulating the step measurement results means, for example, writing to the data memory 207 either all measurement data or data sampled at an appropriately set sampling period, as sketched below.
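- A small sketch of the sampled accumulation: read_step() is a hypothetical stand-in for one step measurement by the laser sensor, and the Python list stands in for the data memory 207.

```python
import time

def accumulate_steps(read_step, period_s=0.01, duration_s=1.0):
    """Collect step measurements at a fixed sampling period."""
    samples = []
    t_end = time.monotonic() + duration_s
    while time.monotonic() < t_end:
        samples.append(read_step())   # one step-measurement result
        time.sleep(period_s)          # appropriately set sampling period
    return samples
```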
- The measurement by the laser sensor 10 continues until the laser sensor stop command is received, which the robot controller 1 sends immediately after the robot 40 reaches the measurement end position 3 (step S8). Upon receipt, the stop command is forwarded to the laser sensor control unit 20, the step measurement ends, and the accumulation of the measurement results ends (step S9). This completes the processing performed inside the image processing device 2.
- FIG. 7 shows an outline of the overall arrangement of the composite sensor robot system according to the second embodiment (Embodiment II) of the present invention and a measurement example using the same. The arrangement shown in FIG. 7 differs from the arrangement of FIG. 1 (Embodiment I) in the following two points:
- (1) the camera is mounted on the robot hand together with the laser sensor;
- (2) the camera is used not only as the wide-area sensor but also as the light-receiving part of the laser sensor (narrow-area sensor).
- As in Embodiment I, the system is applied to measuring the step of a work: the wide-area sensor detects a work supplied at an uncertain position and in an uncertain orientation within a relatively wide range, and, after the robot's approach, the narrow-area sensor included in the same composite sensor measures the step distribution.
- The entire system consists of a robot controller 1, an image processing device 2, a slit light projecting unit 131, a laser sensor control unit 20, a CCD camera 30, and a robot 40.
- The slit light projecting unit 131 and the camera 30 are mounted on the tip of the robot 40 (near the tool center point TCP); the slit light projecting unit 131 forms part of the slit-light projection type laser sensor 101, and the camera 30 serves not only as the wide-area sensor but also as the light-receiving part of that laser sensor (narrow-area sensor) 101.
- When used as the wide-area sensor (first sensor means) together with the image processing device 2, the camera 30 is positioned by the robot 40, automatically or manually, at an appropriate position where its field of view 31 covers the supply area 50 of the work W.
- Laser slit light beams SB1, SB2, SB3, and so on are projected from the slit light projecting unit 131, and the bright-line image formed on the work W is shot with the camera 30.
- The field of view 31 of the camera 30 is set sufficiently large, while the projection range (measurement range) of the slit light SB1, SB2, SB3 is much narrower than the area covered by the field of view 31 when the camera is used as the wide-area sensor.
- The slit light projecting unit 131 of the laser sensor 101 is connected to the image processing device 2 via the laser sensor control section 20, and the CCD camera 30 is also connected to the image processing device 2.
- When the work W is supplied, the CCD camera 30 is first used as the wide-area sensor to shoot the work W from a relatively distant location (a position before the approach). The two-dimensional image acquired by the shooting is sent to the image processing device 2, which analyzes the image and detects the position and orientation of the work W; based on the detection result, the robot controller 1 causes the robot 40 to approach the work W.
- Position 1 is the approach start position and position 2 is the approach position. The approach position 2 is a position suitable for starting the step measurement by projecting the slit light SB2 onto the work W, and is determined according to the supply position of the work W.
- As in Embodiment I, the supply positions of the individual works W can vary within the supply region 50. The optimal approach position for a work W located at an appropriate reference position (the taught approach position) is taught to the robot in advance. For each work supplied during actual operation, the deviation from the reference position (including posture) is detected by the CCD camera 30 and the image processing device 2, and the approach position 2 is obtained by a calculation, performed within the robot controller 1, that applies to the taught approach position a shift (position/posture correction) compensating for the deviation.
- The robot 40, having completed the approach, measures the step d at one or more locations on the work W, moving from position 2 to another position 3 (not shown) as necessary. When measurement is performed at position 3, that position is taught to the robot in advance for the case where the work W is at the reference position, and the actual position 3 is calculated in the robot controller 1 by applying to the taught position 3 the shift (position/posture correction) that compensates for the detected deviation of the work W from the reference position.
- As shown in FIG. 8, the slit light projecting unit 131 is equipped with one or more (here, two) laser oscillators LS1 and LS2 with built-in cylindrical lenses, and a deflection mirror MR attached to a galvanometer driven stepwise by a stepping motor SM. When the slit light projecting unit 131 receives a projection command from the image processing device 2, two bundles of laser light flattened by the cylindrical lenses (slit light) are emitted from the laser oscillators LS1 and LS2.
- The projection command includes a position command for the stepping motor SM, which controls the angle of the deflection mirror MR. The two bundles of laser light are deflected in a predetermined direction according to the position command value for the stepping motor SM and are projected through a light emission window (not shown) onto the work (object) W.
- The two bundles of slit light SB1 and SB2 projected on the work (object) W form two bright lines b1-b5 and b6-b10.
- When the CCD camera 30 receives a shooting command from the image processing device 2, it acquires an image including the bright lines b1-b5 and b6-b10 and sends it to the image processing device 2.
- The image processing device 2 analyzes the image including the bright lines b1-b5 and b6-b10 using its image processing function, finds the three-dimensional positions of the end points and bending points b1, b2, ..., b10 included in the bright lines, and calculates the step d from these three-dimensional position data; a minimal sketch of this final step computation follows.
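- The sketch assumes the three-dimensional positions of the bend points have already been obtained and takes the step d as the height difference between points on the upper and lower surfaces; the grouping of points is an illustrative assumption, not fixed by the patent.

```python
import numpy as np

def step_height(upper_points, lower_points) -> float:
    """Step d as the mean Z difference between bend/end points measured
    on the upper surface and those measured on the lower surface."""
    upper = np.asarray(upper_points, float)
    lower = np.asarray(lower_points, float)
    return float(upper[:, 2].mean() - lower[:, 2].mean())

# e.g. step_height([b2, b3], [b4, b5]) with each point as (x, y, z)
```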
- As in Embodiment I, the laser sensor interface 202 functions as an input/output device for communication with the laser sensor control section 20, which here is connected to the slit light projecting unit 131.
- The CPU 201 controls the slit light projecting unit 131 by issuing operation commands (ON/OFF of the laser oscillators LS1 and LS2, rotation commands for the stepping motor SM, and so on) through the laser sensor interface 202.
- The camera interface 203 functions as an input/output device for the CCD camera 30; upon receiving a shooting command from the CPU 201, it sends a shooting signal to the CCD camera 30, and the image acquired by shooting is converted to grayscale and stored in the image memory 204.
- The stored image is read out when the position detection program stored in the program memory 205 is started and is used for image analysis by the image processing processor 206. In the present embodiment, two types of image analysis are performed. When the camera 30 is used as the wide-area sensor, the slit light is not projected, a normal image of the work W is acquired, and, as in Embodiment I, the positions of the two feature points P and Q are detected. When the camera 30 is used as the narrow-area sensor, the slit light is projected, the image of the bright lines b1-b5 and b6-b10 formed on the work W is acquired, and the three-dimensional positions of their end points and bending points are detected (see FIG. 8).
- As in Embodiment I, an ordinary robot controller 1 including a CPU, ROM, RAM, an axis controller, a servo circuit, a communication interface, and so on is used, so a detailed description is omitted.
- Note that a slit-light projection type laser sensor can project slit light over a certain angular range by swinging the deflection mirror MR.
- First, the robot controller 1 reads one block of the operation program describing the move instruction to position 1 (the approach start position), performs the usual processing (step RR1), and moves the robot 40 to position 1 (step RR2). The motion format is each-axis movement or linear movement.
- Upon completion of the movement, a work position detection command is sent to the image processing device 2 (step RR3), and the controller waits for the detection result of the position of the work W (the positions of P and Q) (step RR4).
- Upon receiving the detection result, data expressing the deviation of the work W from the reference position in the robot coordinate system is calculated, and the necessary robot correction amount is computed (step RR5). The robot correction amount can be calculated, for example, as a 4 × 4 homogeneous transformation matrix representing the deviation of the vector PQ and the point P from the reference position.
- Next, one block describing a move instruction to position 2 (the approach position) is read out (step RR6), and the robot 40 is moved to the approach position 2 (step RR7). The approach position 2 is the taught approach position shifted by the robot position correction amount determined for the work W.
- The motion type during the approach operation is each-axis movement or linear movement.
- After the approach is completed, a laser sensor start command is sent to the image processing device 2 (step RR8), and the controller waits for notification that the step measurement has been completed (step RR9). After receiving the notification, a laser sensor stop command is sent to the image processing device 2 (step RR10), and the processing terminates.
- On the image processing device 2 side, the device first waits to receive the work position detection command from the robot controller 1 (step SS1).
- Upon receiving the command, a shooting command is output to the CCD camera 30 (step SS2), and the acquired image is taken into the image processing device 2 and stored in the image memory 204 (step SS3).
- Next, the position detection program stored in the program memory 205 is started; the stored image data is read out and analyzed by the image processing processor 206 to detect the positions of the two feature points P and Q of the work W (step SS4).
- The detection result is immediately sent to the robot controller 1 (step SS5), and the device waits to receive the laser sensor start command from the robot controller 1 (step SS6).
- Upon receiving the start command, a start command for the slit light projecting unit 131 is sent to the laser sensor control section 20, and the slit light beams SB1 and SB2 are projected (step SS7).
- A shooting command is then output to the CCD camera 30 (step SS8), and the image of the bright lines b1-b5 and b6-b10 formed on the work W is taken into the image processing device 2 and stored in the image memory 204 (step SS9).
- Next, the step detection program stored in the program memory 205 is started; the stored image data is read out and analyzed by the image processing processor 206, the three-dimensional positions of the end points and bending points b1, b2, ..., b10 included in the bright lines are detected, the step d is calculated, and completion of the step measurement is transmitted to the robot controller 1 together with the measurement result as required (step SS10).
- Depending on the shape of the work, it may be preferable to obtain the step d after sending a movement command to the stepping motor SM and repeating steps SS7 to SS9 with the projection direction of the slit light changed; a small loop sketching this repetition follows.
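- In the sketch below, sensor, camera, and analyze are hypothetical stand-ins for the devices and programs described above.

```python
def scan_directions(sensor, camera, analyze, angles_deg):
    """Repeat steps SS7-SS9 for several slit-light projection
    directions and collect the analysis results."""
    results = []
    for ang in angles_deg:
        sensor.set_mirror_angle(ang)    # movement command to motor SM
        sensor.project_slits()          # SS7: project slit light
        image = camera.shoot()          # SS8-SS9: acquire bright lines
        results.append(analyze(image))  # bend-point positions / step d
    return results
```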
- Thereafter, the device waits for the laser sensor stop command from the robot controller 1 (step SS11), deactivates the laser sensor (the slit light projecting unit 131) upon receipt (step SS12), and completes the processing.
- In Embodiments I and II, the image from the camera 30 serving as the wide-area sensor is analyzed by the image processing device, the robot 40 performs the approach automatically, and the approach position is determined by correcting the taught approach position, taught in advance, on the basis of the deviation data obtained from the image analysis.
- Embodiment III, described below, instead displays the image of the camera 30 serving as the wide-area sensor on a display and lets the operator jog the robot on the displayed screen. FIG. 11 shows an outline of the overall arrangement of the system that incorporates this style of robot approach.
- The application is, for example, step measurement on a work, as in Embodiments I and II.
- The entire system consists of a robot controller 1, a laser sensor (of the spot-light projection type or the slit-light projection type) 10, a laser sensor control unit 20, a CCD camera 30, a robot 40, and a teaching pendant equipped with a display DP.
- The laser sensor 10 is mounted on the tip of the robot 40 (near the tool center point TCP), while the CCD camera 30 is installed externally.
- The CCD camera 30 used as the wide-area sensor is arranged above the table TB, which corresponds to the supply area of the work W, and above the robot 40, so that its field of view 31 covers with a margin the table TB and the surrounding area used for the robot's approach. The optical axis of the camera 30 is directed vertically downward (the Z-axis direction of the robot coordinate system).
- The laser sensor (spot-light projection type or slit-light projection type) used as the narrow-area sensor (second sensor means) has the configuration and functions already described in Embodiments I and II, so the description is omitted here.
- The robot controller 1 is a well-known device with a built-in image processing unit; it issues shooting commands to the camera 30 and captures the images. The processing contents have already been explained and are therefore omitted.
- The teaching pendant is equipped with operation key (or button) groups K1, K2, K3, and so on; in addition to the functions of an ordinary teaching pendant, it has a function for designating a movement target. Groups K1 and K2 are used for ordinary jogging, such as movement in the XYZ directions, rotation about the XYZ axes, and rotation of individual robot axes, while group K3 provides cursor keys.
- On the display DP, the robot 40, the laser sensor 10 (seen from behind), and the work W are displayed as the images 40', 10', and W', as shown.
- The operator operates the cursor keys of group K3 while watching the screen and points the cursor (not shown) at a position suitable for the approach position 2. When a confirming operation is performed (for example, one press of a key in K3), the cursor position is transmitted to the robot controller 1, and the XY position (in the robot coordinate system) represented by the cursor position is obtained. When an approach start operation is then performed (for example, a continuous press of a key in K3), the robot moves to the obtained XY position.
- Movement to the Z position may be performed after the XY movement ends, but it is preferable that a Z-direction jog can be commanded by an appropriate operation (for example, pressing a key in group K1) while the XY jog is in progress. A sketch of the cursor-to-robot-coordinate conversion follows.
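- Because the camera looks vertically down along the robot Z axis, a simple scale-and-offset mapping suffices; the calibration constants below are illustrative assumptions that would in practice come from camera calibration.

```python
MM_PER_PIXEL = 0.8     # assumed image scale from camera calibration
U0, V0 = 320, 240      # assumed pixel of the robot-frame XY origin

def cursor_to_robot_xy(u: int, v: int) -> tuple[float, float]:
    """Map a cursor pixel (u, v) on the display to robot-frame (X, Y)."""
    x = (u - U0) * MM_PER_PIXEL
    y = (V0 - v) * MM_PER_PIXEL   # image v axis points downward
    return x, y
```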
- An outline of the processing performed in the robot controller 1 is shown in the flowchart of FIG. 12.
- The feature of the present embodiment lies in the approach operation; the measurement by the laser sensor (more generally, a three-dimensional position sensor) 10 after the approach is completed is the same as in Embodiment I or II. Only the processing up to completion of the approach is therefore described here.
- First, a shooting command is sent to the CCD camera 30 to capture an image including the robot hand and the work (step T1).
- The images 40', 10', and W' are displayed on the monitor (the display DP provided on the teaching pendant) (step T2).
- When the confirming operation for the cursor position is performed (step T3), the position indicated by the cursor is converted into a position (X, Y) in the robot coordinate system (step T4), and the TCP of the robot 40 is moved to the obtained position 2 (X, Y) (step T5).
- The operator adjusts the position while observing the vertical separation between the laser sensor 10 and the work W: when a jog command in the Z-axis direction is received during the XY jog, a command including movement in the Z-axis direction is created inside the robot controller 1 and passed to the servo system, and the TCP of the robot 40 is moved in the Z-axis direction (step T6).
- Note that the approach position 2 can also be specified in a form that includes the posture about the Z axis.
Landscapes
- Engineering & Computer Science (AREA)
- Robotics (AREA)
- Mechanical Engineering (AREA)
- Length Measuring Devices By Optical Means (AREA)
- Manipulator (AREA)
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP52417597A JP3904605B2 (ja) | 1995-12-27 | 1996-12-24 | 複合センサーロボットシステム |
US08/894,688 US5987591A (en) | 1995-12-27 | 1996-12-24 | Multiple-sensor robot system for obtaining two-dimensional image and three-dimensional position information |
DE69637413T DE69637413T2 (de) | 1995-12-27 | 1996-12-24 | Kompositdetektionssystem für roboter |
EP96942631A EP0812662B1 (en) | 1995-12-27 | 1996-12-24 | Composite sensor robot system |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP7/351180 | 1995-12-27 | ||
JP35118095 | 1995-12-27 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO1997024206A1 true WO1997024206A1 (fr) | 1997-07-10 |
Family
ID=18415599
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP1996/003765 WO1997024206A1 (fr) | 1995-12-27 | 1996-12-24 | Systeme robotique composite de detection |
Country Status (5)
Country | Link |
---|---|
US (1) | US5987591A (ja) |
EP (1) | EP0812662B1 (ja) |
JP (1) | JP3904605B2 (ja) |
DE (1) | DE69637413T2 (ja) |
WO (1) | WO1997024206A1 (ja) |
Families Citing this family (70)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH0970780A (ja) * | 1995-09-06 | 1997-03-18 | Fanuc Ltd | ロボットのツール形状補正方式 |
US7065462B2 (en) * | 1998-07-24 | 2006-06-20 | Merilab, Inc. | Vehicle wheel alignment by rotating vision sensor |
US6195620B1 (en) * | 1998-08-05 | 2001-02-27 | Ford Global Technologies, Inc. | Method for determining a geometry of a raw cylinder head casting and device for carrying out same |
JP2000135689A (ja) | 1998-10-30 | 2000-05-16 | Fanuc Ltd | ロボット用画像処理装置 |
US8944070B2 (en) | 1999-04-07 | 2015-02-03 | Intuitive Surgical Operations, Inc. | Non-force reflecting method for providing tool force information to a user of a telesurgical system |
JP3421608B2 (ja) * | 1999-04-08 | 2003-06-30 | ファナック株式会社 | 教示モデル生成装置 |
US6415051B1 (en) * | 1999-06-24 | 2002-07-02 | Geometrix, Inc. | Generating 3-D models using a manually operated structured light source |
JP3556589B2 (ja) | 2000-09-20 | 2004-08-18 | ファナック株式会社 | 位置姿勢認識装置 |
DE20019096U1 (de) * | 2000-11-10 | 2001-01-18 | Dual M Tech Ag | Vorrichtung zum zielgerichteten Bewegen elektronischer Bauteile mit einem Schwenkspiegel |
JP3703411B2 (ja) | 2001-07-19 | 2005-10-05 | ファナック株式会社 | ワーク取り出し装置 |
US6730926B2 (en) * | 2001-09-05 | 2004-05-04 | Servo-Robot Inc. | Sensing head and apparatus for determining the position and orientation of a target object |
WO2003033219A2 (de) * | 2001-10-15 | 2003-04-24 | Hermann Tropf | Korrektur der relativbewegung zwischen greif- oder bearbeitungswerkzeugen und werkstücken |
GB0125079D0 (en) * | 2001-10-18 | 2001-12-12 | Cimac Automation Ltd | Auto motion:robot guidance for manufacturing |
US7099796B2 (en) * | 2001-10-22 | 2006-08-29 | Honeywell International Inc. | Multi-sensor information fusion technique |
EP1314510B1 (en) * | 2001-11-26 | 2008-03-19 | Mitsubishi Heavy Industries, Ltd. | Method of welding three-dimensional structure and apparatus for use in such method |
JP3806342B2 (ja) | 2001-11-26 | 2006-08-09 | 三菱重工業株式会社 | 3次元形状物溶接方法及びその装置 |
JP3859571B2 (ja) * | 2002-10-17 | 2006-12-20 | ファナック株式会社 | 3次元視覚センサ |
DE20218352U1 (de) * | 2002-11-26 | 2003-01-23 | Reishauer Ag | Einzentriervorrichtung zum Ausrichten von vorverzahnten Werkstücken auf Verzahnungsfeinbearbeitungsmaschinen |
JP3950805B2 (ja) * | 2003-02-27 | 2007-08-01 | ファナック株式会社 | 教示位置修正装置 |
JP2005108144A (ja) * | 2003-10-02 | 2005-04-21 | Fanuc Ltd | ロボットの補正データ確認装置 |
DE102004024378B4 (de) * | 2004-05-17 | 2009-05-20 | Kuka Roboter Gmbh | Verfahren zur robotergestützten Vermessung von Objekten |
DE102004026185A1 (de) * | 2004-05-28 | 2005-12-22 | Kuka Roboter Gmbh | Verfahren und Vorrichtung zum Betreiben einer Maschine, wie eines Mehrachs- Industrieroboters |
NZ533533A (en) * | 2004-06-11 | 2006-11-30 | Avalon Engineering Ltd | Deployment method and apparatus for moving stacked articles in varying orientations |
JP4087841B2 (ja) * | 2004-12-21 | 2008-05-21 | ファナック株式会社 | ロボット制御装置 |
US7830561B2 (en) * | 2005-03-16 | 2010-11-09 | The Trustees Of Columbia University In The City Of New York | Lensless imaging with controllable apertures |
US9789608B2 (en) | 2006-06-29 | 2017-10-17 | Intuitive Surgical Operations, Inc. | Synthetic representation of a surgical robot |
JP2007101197A (ja) * | 2005-09-30 | 2007-04-19 | Nachi Fujikoshi Corp | 物体探索装置,物体探索装置を備えるロボットシステム及び物体探索方法 |
WO2007072837A1 (en) * | 2005-12-20 | 2007-06-28 | Semiconductor Energy Laboratory Co., Ltd. | Laser irradiation apparatus and method for manufacturing semiconductor device |
JP4199264B2 (ja) * | 2006-05-29 | 2008-12-17 | ファナック株式会社 | ワーク取り出し装置及び方法 |
US20080064931A1 (en) | 2006-06-13 | 2008-03-13 | Intuitive Surgical, Inc. | Minimally invasive surgical illumination |
US10008017B2 (en) | 2006-06-29 | 2018-06-26 | Intuitive Surgical Operations, Inc. | Rendering tool information as graphic overlays on displayed images of tools |
US7590680B2 (en) * | 2006-06-29 | 2009-09-15 | Microsoft Corporation | Extensible robotic framework and robot modeling |
US9718190B2 (en) | 2006-06-29 | 2017-08-01 | Intuitive Surgical Operations, Inc. | Tool position and identification indicator displayed in a boundary area of a computer display screen |
US20090192523A1 (en) | 2006-06-29 | 2009-07-30 | Intuitive Surgical, Inc. | Synthetic representation of a surgical instrument |
US10258425B2 (en) | 2008-06-27 | 2019-04-16 | Intuitive Surgical Operations, Inc. | Medical robotic system providing an auxiliary view of articulatable instruments extending out of a distal end of an entry guide |
US8903546B2 (en) | 2009-08-15 | 2014-12-02 | Intuitive Surgical Operations, Inc. | Smooth control of an articulated instrument across areas with different work space conditions |
US9138129B2 (en) | 2007-06-13 | 2015-09-22 | Intuitive Surgical Operations, Inc. | Method and system for moving a plurality of articulated instruments in tandem back towards an entry guide |
US9089256B2 (en) | 2008-06-27 | 2015-07-28 | Intuitive Surgical Operations, Inc. | Medical robotic system providing an auxiliary view including range of motion limitations for articulatable instruments extending out of a distal end of an entry guide |
US9084623B2 (en) * | 2009-08-15 | 2015-07-21 | Intuitive Surgical Operations, Inc. | Controller assisted reconfiguration of an articulated instrument during movement into and out of an entry guide |
US9469034B2 (en) | 2007-06-13 | 2016-10-18 | Intuitive Surgical Operations, Inc. | Method and system for switching modes of a robotic system |
US8620473B2 (en) | 2007-06-13 | 2013-12-31 | Intuitive Surgical Operations, Inc. | Medical robotic system with coupled control modes |
DE102008023264B4 (de) * | 2008-05-13 | 2017-01-05 | Smart Optics Sensortechnik Gmbh | Verfahren zur flexiblen Erfassung der geometrischen Form von Objekten mit optischer 3D-Messtechnik |
US8864652B2 (en) | 2008-06-27 | 2014-10-21 | Intuitive Surgical Operations, Inc. | Medical robotic system providing computer generated auxiliary views of a camera instrument for controlling the positioning and orienting of its tip |
JP5333344B2 (ja) * | 2009-06-19 | 2013-11-06 | 株式会社安川電機 | 形状検出装置及びロボットシステム |
US8918211B2 (en) | 2010-02-12 | 2014-12-23 | Intuitive Surgical Operations, Inc. | Medical robotic system providing sensory feedback indicating a difference between a commanded state and a preferred pose of an articulated instrument |
US9492927B2 (en) | 2009-08-15 | 2016-11-15 | Intuitive Surgical Operations, Inc. | Application of force feedback on an input device to urge its operator to command an articulated instrument to a preferred pose |
JP4837116B2 (ja) | 2010-03-05 | 2011-12-14 | ファナック株式会社 | 視覚センサを備えたロボットシステム |
JP4938115B2 (ja) * | 2010-07-27 | 2012-05-23 | ファナック株式会社 | ワーク取出し装置およびワーク取出し方法 |
CN102909728B (zh) * | 2011-08-05 | 2015-11-25 | 鸿富锦精密工业(深圳)有限公司 | 机器人工具中心点的视觉校正方法 |
US10507066B2 (en) | 2013-02-15 | 2019-12-17 | Intuitive Surgical Operations, Inc. | Providing information of tools by filtering image areas adjacent to or on displayed images of the tools |
CN103213136B (zh) * | 2013-03-22 | 2016-06-15 | 南通航运职业技术学院 | 一种用于工业机器人牵引示教的方法和系统 |
US9874628B2 (en) * | 2013-11-12 | 2018-01-23 | The Boeing Company | Dual hidden point bars |
EP3137954B1 (en) * | 2014-04-30 | 2019-08-28 | ABB Schweiz AG | Method for calibrating tool centre point for industrial robot system |
DE102015204867A1 (de) * | 2015-03-18 | 2016-09-22 | Kuka Roboter Gmbh | Robotersystem und Verfahren zum Betrieb eines teleoperativen Prozesses |
US9757859B1 (en) * | 2016-01-21 | 2017-09-12 | X Development Llc | Tooltip stabilization |
US9815204B2 (en) * | 2016-01-22 | 2017-11-14 | The Boeing Company | Apparatus and method to optically locate workpiece for robotic operations |
US9744665B1 (en) | 2016-01-27 | 2017-08-29 | X Development Llc | Optimization of observer robot locations |
US10059003B1 (en) | 2016-01-28 | 2018-08-28 | X Development Llc | Multi-resolution localization system |
CN108313778B (zh) * | 2017-01-16 | 2020-01-14 | 泰科电子(上海)有限公司 | 膜片供应系统和方法 |
US10526799B2 (en) | 2017-03-31 | 2020-01-07 | Canvas Construction, Inc. | Automated drywall cutting and hanging system and method |
CN110831718A (zh) * | 2017-05-24 | 2020-02-21 | 因洛泰科有限公司 | 用于包括具有直立型材图案的基板的工件的自动缝焊的设备和方法 |
US11117262B2 (en) | 2017-08-01 | 2021-09-14 | eBots Inc. | Intelligent robots |
IL272882B (en) | 2017-09-25 | 2022-07-01 | Canvas Construction Inc | Automatic system for planning plaster walls and method |
JP2019058993A (ja) * | 2017-09-27 | 2019-04-18 | セイコーエプソン株式会社 | ロボットシステム |
US10967507B2 (en) | 2018-05-02 | 2021-04-06 | X Development Llc | Positioning a robot sensor for object classification |
US11724404B2 (en) * | 2019-02-21 | 2023-08-15 | Canvas Construction, Inc. | Surface finish quality evaluation system and method |
CN109895098B (zh) * | 2019-03-25 | 2020-09-29 | 华中科技大学 | 一种机器人结构参数和手眼关系的统一标定模型 |
US11312581B2 (en) * | 2019-04-16 | 2022-04-26 | Abb Schweiz Ag | Object grasp system and method |
CN114152206B (zh) * | 2021-12-15 | 2024-02-27 | 武汉理工大学重庆研究院 | 一种工件尺寸视觉检测装置 |
DE102022213715A1 (de) * | 2022-12-15 | 2024-06-20 | Peri Se | Verfahren zur positionierung eines ersten bauteils relativ zu einem zweiten bauteil durch ein roboterarmsystem |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPS58114892A (ja) * | 1981-12-26 | 1983-07-08 | 日産自動車株式会社 | ロボツトの視覚装置 |
JPH04261789A (ja) * | 1991-02-18 | 1992-09-17 | Matsushita Electric Ind Co Ltd | 部品移載装置 |
JPH0724585U (ja) * | 1993-09-29 | 1995-05-09 | 株式会社明電舎 | ワークハンドリング装置 |
Family Cites Families (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
DE3635076A1 (de) * | 1986-10-15 | 1988-04-28 | Messerschmitt Boelkow Blohm | Roboteranlage mit beweglichen manipulatoren |
JP2716052B2 (ja) * | 1987-01-13 | 1998-02-18 | 株式会社日立製作所 | 加工方法及び装置並びに加工品質管理方法 |
FR2630957A1 (fr) * | 1988-05-05 | 1989-11-10 | Peugeot | Dispositif d'asservissement pour robot manipulateur |
JPH041505A (ja) * | 1990-04-18 | 1992-01-07 | Matsushita Electric Ind Co Ltd | ワークの3次元位置計測方法とワークの捕捉方法 |
US5321353A (en) * | 1992-05-13 | 1994-06-14 | Storage Technolgy Corporation | System and method for precisely positioning a robotic tool |
JPH0724585A (ja) * | 1993-07-09 | 1995-01-27 | Tomoe Corp | 構造部材の溶接方法 |
JPH07266272A (ja) * | 1994-03-29 | 1995-10-17 | Nippon Telegr & Teleph Corp <Ntt> | マニピュレータ用追従方法及び装置 |
JP3418456B2 (ja) * | 1994-06-23 | 2003-06-23 | ファナック株式会社 | ロボット位置教示具及びロボット位置教示方法 |
-
1996
- 1996-12-24 JP JP52417597A patent/JP3904605B2/ja not_active Expired - Fee Related
- 1996-12-24 WO PCT/JP1996/003765 patent/WO1997024206A1/ja active IP Right Grant
- 1996-12-24 EP EP96942631A patent/EP0812662B1/en not_active Expired - Lifetime
- 1996-12-24 US US08/894,688 patent/US5987591A/en not_active Expired - Lifetime
- 1996-12-24 DE DE69637413T patent/DE69637413T2/de not_active Expired - Lifetime
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPS58114892A (ja) * | 1981-12-26 | 1983-07-08 | 日産自動車株式会社 | ロボツトの視覚装置 |
JPH04261789A (ja) * | 1991-02-18 | 1992-09-17 | Matsushita Electric Ind Co Ltd | 部品移載装置 |
JPH0724585U (ja) * | 1993-09-29 | 1995-05-09 | 株式会社明電舎 | ワークハンドリング装置 |
Non-Patent Citations (1)
Title |
---|
See also references of EP0812662A4 * |
Cited By (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6898486B2 (en) * | 2001-11-08 | 2005-05-24 | Fanuc Ltd | Position detecting device and takeout apparatus with position detecting device |
JP2009000782A (ja) * | 2007-06-21 | 2009-01-08 | Idec Corp | ロボット制御システムおよびロボットハンド |
JP2009218933A (ja) * | 2008-03-11 | 2009-09-24 | Seiko Epson Corp | スマートカメラ及びロボットビジョンシステム |
WO2009119510A1 (ja) | 2008-03-28 | 2009-10-01 | 本田技研工業株式会社 | ワーク測定方法、サスペンションアッセンブリ取り付け方法、およびサスペンションアッセンブリ取り付け装置 |
US8712678B2 (en) | 2008-03-28 | 2014-04-29 | Honda Motor Co., Ltd. | Method of measuring a displacement amount for an automobile suspension assembly |
JP2011033497A (ja) * | 2009-08-03 | 2011-02-17 | Honda Motor Co Ltd | 環境認識システム、環境認識方法およびロボット |
JP2014109466A (ja) * | 2012-11-30 | 2014-06-12 | Yokohama National Univ | 対象物移動装置、方法、プログラム、及び記録媒体 |
WO2015101311A1 (zh) * | 2014-01-03 | 2015-07-09 | 科沃斯机器人有限公司 | 光点指示机器人及其光点指示方法 |
US10639795B2 (en) | 2014-01-03 | 2020-05-05 | Ecovacs Robotics Co., Ltd. | Light spot indication robot and light spot indication method thereof |
JP2018138318A (ja) * | 2017-02-24 | 2018-09-06 | パナソニックIpマネジメント株式会社 | 電子機器製造装置および電子機器製造方法 |
WO2020175425A1 (ja) * | 2019-02-25 | 2020-09-03 | 国立大学法人 東京大学 | ロボットシステム、ロボットの制御装置、およびロボットの制御プログラム |
Also Published As
Publication number | Publication date |
---|---|
EP0812662B1 (en) | 2008-01-23 |
EP0812662A4 (en) | 2000-11-15 |
DE69637413T2 (de) | 2009-01-22 |
US5987591A (en) | 1999-11-16 |
JP3904605B2 (ja) | 2007-04-11 |
DE69637413D1 (de) | 2008-03-13 |
EP0812662A1 (en) | 1997-12-17 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO1997024206A1 (fr) | Systeme robotique composite de detection | |
JP3556589B2 (ja) | 位置姿勢認識装置 | |
JP4167954B2 (ja) | ロボット及びロボット移動方法 | |
JP6464213B2 (ja) | レーザ加工ヘッドおよび撮影装置を備えるレーザ加工システム | |
JP3263724B2 (ja) | 2次元レーザパターンによる形状特徴抽出装置 | |
WO1993013383A1 (en) | Method and apparatus for measuring three-dimensional position and posture of object | |
US10571254B2 (en) | Three-dimensional shape data and texture information generating system, imaging control program, and three-dimensional shape data and texture information generating method | |
EP2959681A1 (en) | Projection system | |
JPH08166813A (ja) | ウィービング動作を伴うロボットのトラッキング制御方法 | |
JP2024009106A (ja) | ツールの作業位置のずれ量を取得する装置、及び方法 | |
JP2004138462A (ja) | 3次元視覚センサ | |
JPH1063317A (ja) | ロボット−視覚センサシステムにおける座標系結合方法 | |
JP5198078B2 (ja) | 計測装置および計測方法 | |
JP6503278B2 (ja) | 形状測定装置および形状測定方法 | |
JPH09183087A (ja) | 作業ロボット装置 | |
JP2020165658A (ja) | 三次元計測方法、三次元計測装置およびロボットシステム | |
JPH1133962A (ja) | ロボットの三次元位置センサのキャリブレーション 方法とその装置 | |
JP6387251B2 (ja) | 位置計測装置 | |
JPH11248432A (ja) | 三次元形状測定装置 | |
JPH08328624A (ja) | センサとロボットとの結合方法及びロボットシステム | |
JP7183372B1 (ja) | マーカ検出装置及びロボット教示システム | |
JP4340138B2 (ja) | 非接触式3次元形状測定装置 | |
WO2022190240A1 (ja) | 作業情報投影システム及び相対情報較正方法 | |
US11131543B2 (en) | Three-dimensional measuring apparatus and robot system | |
JPH05337785A (ja) | 研削ロボットの研削経路修正装置 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| AK | Designated states | Kind code of ref document: A1. Designated state(s): JP US |
| AL | Designated countries for regional patents | Kind code of ref document: A1. Designated state(s): AT BE CH DE DK ES FI FR GB GR IE IT LU MC NL PT SE |
| WWE | Wipo information: entry into national phase | Ref document number: 08894688. Country of ref document: US |
| WWE | Wipo information: entry into national phase | Ref document number: 1996942631. Country of ref document: EP |
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application | |
| WWP | Wipo information: published in national office | Ref document number: 1996942631. Country of ref document: EP |
| WWG | Wipo information: grant in national office | Ref document number: 1996942631. Country of ref document: EP |