US20060212171A1 - Off-line teaching device - Google Patents

Off-line teaching device

Info

Publication number
US20060212171A1
US20060212171A1 (application US11/375,440)
Authority
US
United States
Prior art keywords
vision sensor
teaching device
robot
measurement
line teaching
Prior art date
Legal status
Abandoned
Application number
US11/375,440
Inventor
Kazunori Ban
Taro Arimatsu
Takashi Jumonji
Current Assignee
Fanuc Corp
Original Assignee
Fanuc Corp
Priority date
Filing date
Publication date
Application filed by Fanuc Corp filed Critical Fanuc Corp
Assigned to FANUC LTD. (assignment of assignors interest; assignors: BAN, KAZUNORI; ARIMATSU, TARO; JUMONJI, TAKASHI)
Publication of US20060212171A1

Classifications

    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05B CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B 19/00 Programme-control systems
    • G05B 19/02 Programme-control systems, electric
    • G05B 19/42 Recording and playback systems, i.e. in which the programme is recorded from a cycle of operations, e.g. the cycle of operations being manually controlled, after which this record is played back on the same machine
    • G05B 19/4202 Recording and playback systems, i.e. in which the programme is recorded from a cycle of operations, e.g. the cycle of operations being manually controlled, after which this record is played back on the same machine; preparation of the programme medium using a drawing, a model
    • G05B 19/4207 Recording and playback systems, i.e. in which the programme is recorded from a cycle of operations, e.g. the cycle of operations being manually controlled, after which this record is played back on the same machine; preparation of the programme medium using a drawing, a model, in which a model is traced or scanned and corresponding data recorded
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J 9/00 Programme-controlled manipulators
    • B25J 9/16 Programme controls
    • B25J 9/1694 Programme controls characterised by use of sensors other than normal servo-feedback from position, speed or acceleration sensors, perception control, multi-sensor controlled systems, sensor fusion
    • B25J 9/1697 Vision controlled systems



Abstract

An off-line teaching device adjusts a vision sensor and carries out, off-line, set-up work that used to be done in the field, so as to reduce the operation time in the field. The off-line teaching device has a storing device for storing data including the shapes and the dimensions of a workpiece, a robot and a vision sensor, and a display for indicating images of the workpiece, the robot and the vision sensor. The teaching device also has a simulation program for generating the images of the workpiece, the robot and the vision sensor on the display and for calculating measurement data of the vision sensor based on the arrangement of the images on the display, and a sensor program for carrying out measurement on the display by the vision sensor based on the measurement data.

Description

    BACKGROUND OF THE INVENTION
  • 1. Field of the Invention
  • The present invention relates to an off-line teaching device used in a measuring operation using a vision sensor.
  • 2. Description of the Related Art
  • In the prior art, a measuring operation is carried out by actually providing an object to be measured, a vision sensor and a camera. For example, Japanese Unexamined Patent Publication No. 8-167800 discloses a mounting device in which a component may be identified by detecting reflected or transmitted light, of an irradiating light, on a mark of the component.
  • The measuring operation in the prior art takes much time because the measurement is carried out only after an object and equipment such as a camera are actually prepared and arranged at predetermined positions. When the motion of a control unit of an industrial robot or the like is compensated by using a measuring result of a vision sensor, an actual control unit must be prepared for checking the measuring result by means of the actual motion of the robot. Further, considerable adjusting work is required in the field in a system using a vision sensor; for example, parameters for detecting the object to be measured must be adjusted through trial and error in the field.
  • SUMMARY OF THE INVENTION
  • Accordingly, it is an object of the present invention to adjust a vision sensor and carry out set-up work off-line, which used to be done in the field, so as to reduce the operation time in the field.
  • To this end, according to the present invention, there is provided an off-line teaching device to allow off-line arranging and adjusting of a vision sensor for measuring an object to be measured, the off-line teaching device comprising: a storing part for storing a measurement condition of the object; a displaying part for indicating an image of the vision sensor and an image of the object; a simulating part for generating the images of the vision sensor and the object on the displaying part and for calculating measurement data for measuring the object by means of the vision sensor and based on the measurement condition.
  • The off-line teaching device may further comprise a choice assisting part for assisting the choice of an optical condition of a camera of the vision sensor, based on data in relation to the shape and the dimensions of the object.
  • The off-line teaching device may further comprise a measurement executing part for carrying out measurement, by means of the vision sensor, on the displaying part based on the measurement data and a programming part for preparing a robot program defining the motion of a robot handling the object.
  • The vision sensor may be attached to a movable part of the robot, otherwise, the vision sensor may be arranged at a fixed place.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • The above and other objects, features and advantages of the present invention will be made more apparent by the following description, of the preferred embodiments thereof, with reference to the accompanying drawings, wherein:
  • FIG. 1 is a schematic diagram of a basic constitution of an off-line teaching device according to the invention;
  • FIG. 2 is a flowchart showing a teaching method for a vision sensor using the off-line teaching device;
  • FIG. 3 is an exemplary view of a display of the off-line teaching device, on which images of a workpiece and a robot are displayed;
  • FIG. 4 is a view indicating the display of FIG. 3 further including an image of a camera; and
  • FIG. 5 is an exemplary view of a display on the off-line teaching device during a model teaching.
  • DETAILED DESCRIPTION
  • The present invention will be described below with reference to the drawings. FIG. 1 shows a typical constitution of an off-line teaching device 10 for a vision sensor according to the invention. The off-line teaching device 10 may be a processing device such as a personal computer. The teaching device includes a storing device 12, such as a ROM or hard disk, for storing data of the shapes and the dimensions of an object or a workpiece to be worked, a robot for carrying out various operations such as conveying, assembling, welding, deburring and sealing the workpiece, and a vision sensor for measuring the workpiece. The teaching device 10 also includes a display 14 for indicating images of the workpiece, the robot and the vision sensor. Further, the teaching device 10 includes a simulation program 16 for generating the images of the workpiece, the robot and the vision sensor on the display 14 and for calculating measurement data of the vision sensor based on the arrangement of the images on the display, and a sensor program 18 for carrying out measurement on the display by the vision sensor based on the measurement data. These programs may be stored in the storing device 12. The teaching device 10 further includes a processing device or a CPU 19 for executing each process assigned to the teaching device. The teaching device 10 may include a keyboard (not shown) for inputting data by an operator.
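The constitution described above can be sketched in code. This is an illustrative model only; the class and member names (`TeachingDevice`, `SceneObject`, `place`, `measurement_data`) are hypothetical and not from the patent. It shows how the storing device 12, the simulated scene on the display 14, and the hand-off of image data to the sensor program 18 fit together.

```python
from dataclasses import dataclass, field
from typing import Dict, List, Tuple

@dataclass
class SceneObject:
    """An image placed in the simulated window: name, position (mm), orientation (deg)."""
    name: str
    position: Tuple[float, float, float]
    orientation: Tuple[float, float, float]

@dataclass
class TeachingDevice:
    """Off-line teaching device: storage (device 12) plus the simulated scene (display 14)."""
    storage: Dict[str, object] = field(default_factory=dict)  # CAD data, conditions
    scene: List[SceneObject] = field(default_factory=list)    # images on the display

    def place(self, obj: SceneObject) -> SceneObject:
        """Simulation program 16: put an object's image into the window."""
        self.scene.append(obj)
        return obj

    def measurement_data(self):
        """Feed positions/orientations of the placed images to the sensor program 18."""
        return {o.name: (o.position, o.orientation) for o in self.scene}

dev = TeachingDevice()
dev.place(SceneObject("workpiece20", (400.0, 0.0, 50.0), (0.0, 0.0, 0.0)))
dev.place(SceneObject("robot30", (0.0, 0.0, 0.0), (0.0, 0.0, 0.0)))
print(dev.measurement_data()["workpiece20"][0])  # (400.0, 0.0, 50.0)
```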
  • Next, a procedure for teaching the vision sensor using the off-line teaching device is described with reference to a flowchart of FIG. 2 and display examples as shown in FIGS. 3 to 5.
  • First, images of a workpiece to be measured and a robot are located on the display 14 of the teaching device 10 (step S1). FIG. 3 shows a typical example of the display. A window 14a displayed on the display 14 may be generated by executing the above simulation program 16. As shown in FIG. 3, the window 14a includes the images of the workpiece 20 positioned on a worktable 22 and the robot 30. These images may be indicated by using previously stored CAD data. The location of the images may be determined by the operator using the keyboard; otherwise, the location may be previously set as a suitable initial location. The image data, including the positions and the orientations of the workpiece 20 and the robot 30, may be fed to the sensor program 18.
  • Next, a measurement condition (for example, a site of the workpiece to be measured, the style of the measurement (two-dimension or three-dimension), and/or the measurement accuracy) is inputted (step S2). This step may be performed by the operator using the keyboard of the teaching device 10.
  • Based on the above measurement condition and previously stored CAD data including the shape and the dimensions of the workpiece, optical conditions, such as the types of a camera and a lens thereof to be used as the vision sensor, a field of the camera, a standoff, etc., are determined (step S3). The optical conditions may be inputted by the operator; alternatively, an assist program for selecting the camera, capable of indicating a window on the display, may be previously installed in the teaching device 10, by which the operator may interactively input or check the type of the camera, the measurement accuracy, etc. At this point, the camera may be arranged at a fixed place or, alternatively, attached to a movable part, such as an arm, of the robot 30, depending on the above measurement condition. In general, by measuring the position and the orientation of the workpiece using the vision sensor, it is possible to identify and inspect the workpiece and, further, to make a motion compensating program for the robot on a base coordinate system or a tool coordinate system of the robot.
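The optical conditions named in step S3 (field of the camera, standoff) are linked by simple geometry. A thin-lens approximation is sketched below; the function name and the sensor/lens figures are illustrative assumptions, not values from the patent.

```python
def camera_field(sensor_w_mm, sensor_h_mm, focal_length_mm, standoff_mm):
    """Width and height of the imaged area at the given standoff.

    Thin-lens approximation: the field scales with standoff / focal length.
    """
    scale = standoff_mm / focal_length_mm
    return sensor_w_mm * scale, sensor_h_mm * scale

# A 1/3-inch sensor (4.8 x 3.6 mm) with a 12 mm lens at a 500 mm standoff:
w, h = camera_field(4.8, 3.6, 12.0, 500.0)
print(round(w), round(h))  # 200 150
```

An assist program of the kind described could run this arithmetic in reverse, suggesting a lens and standoff that cover the site to be measured.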
  • After that, the measurement accuracy expected from the selected camera is estimated and the result is compared to the desired accuracy included in the measurement condition inputted in step S2 (step S4). When the estimated accuracy is higher than the desired accuracy, the procedure progresses to step S5; otherwise, it returns to step S2 to reexamine the measurement condition.
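The estimate-and-compare loop of step S4 reduces to simple arithmetic. The sketch below assumes one common way to estimate accuracy (field width divided by pixel count); the function names and the 640-pixel figure are illustrative, not from the patent.

```python
def estimated_accuracy(field_width_mm, pixels_across):
    """Rough spatial resolution of the measurement; a smaller value is better."""
    return field_width_mm / pixels_across

def accuracy_ok(field_width_mm, pixels_across, desired_mm):
    """Step S4: proceed to step S5 if the estimate meets the desired accuracy,
    otherwise return to step S2 and reexamine the measurement condition."""
    return estimated_accuracy(field_width_mm, pixels_across) <= desired_mm

print(accuracy_ok(200.0, 640, 0.5))  # True  (200/640 = 0.3125 mm per pixel)
print(accuracy_ok(200.0, 640, 0.2))  # False: reexamine the condition
```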
  • Next, as shown in FIG. 4, an image of the selected camera, denoted by a numeral 40, is located at a suitable position (e.g., where the camera may roughly image the workpiece 20) in the window 14a (step S5). At this point, by executing the sensor program 18, an image of the workpiece 20, which is expected to be actually obtained by the camera 40, may be indicated on a window 14b. Therefore, the operator may visually check whether the selected camera 40 is suitable for the measurement.
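The expected image in window 14b amounts to rendering the stored CAD model through a camera model. A minimal pinhole projection is sketched below, assuming points given in camera coordinates in millimetres; the function name and numbers are illustrative, not from the patent.

```python
def project(point_cam, focal_length_mm):
    """Pinhole projection of a CAD point onto the image plane, the kind of
    synthetic view the sensor program could draw in window 14b."""
    x, y, z = point_cam
    if z <= 0:
        return None  # behind the camera: not imaged
    return (focal_length_mm * x / z, focal_length_mm * y / z)

# A workpiece corner 500 mm in front of a 12 mm lens, 100 mm off-axis:
print(project((100.0, 0.0, 500.0), 12.0))  # (2.4, 0.0) on the image plane, in mm
```

Projecting every visible CAD vertex this way gives the operator the rough preview needed to judge whether the chosen camera placement images the workpiece at all.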
  • Teaching device 10 then carries out the model teaching of each site to be measured through steps S6 to S9. First, as shown in FIG. 5, in the window 14a, the camera 40 is moved and oriented so as to image a site or a model 24 to be measured in a field 42 of the camera 40 (step S6). In the window 14b, on the other hand, an image of the site imaged by the camera is generated (step S7). The site in the window 14b is indicated as a featured model defined by a closing line 26 or pointed by a cross-shape marker 28. A floodlight device 44, such as a laser floodlight, may be arranged on or near the camera 40, whereby three-dimensional images of the workpiece 20 and the model 24 may be obtained. Further, shadow processing may be performed on the model 24, with reference to data including the location of a lighting apparatus (not shown). When the image thus generated on the window 14b is considered to be valid, the model teaching is carried out (step S8). When a plurality of sites are to be measured, a set of steps S6 to S8 is repeated a certain number of times, the certain number being equal to the number of the plurality of sites (step S9). Therefore, a taught model corresponding to all of the sites may be obtained and a measurement program for obtaining the taught model may be prepared (step S10).
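The repetition over sites in steps S6 to S9 is a plain loop. In the sketch below the rendering and teaching routines are injected as callables; all names and the lambda stand-ins are hypothetical, since the patent does not specify how a model is represented.

```python
def teach_models(sites, image_site, teach):
    """Steps S6-S9: for each site, aim the camera and render its image
    (S6-S7) and, if the image is judged valid, record a taught model (S8);
    repeat once per site (S9)."""
    taught = {}
    for site in sites:
        img = image_site(site)           # S6-S7: synthetic camera image of the site
        if img is not None:              # stand-in for the operator's validity check
            taught[site] = teach(img)    # S8: model teaching
    return taught                        # S10: basis of the measurement program

# Illustrative stand-ins for the rendering and teaching routines:
models = teach_models(
    ["hole_A", "edge_B"],
    image_site=lambda s: f"image:{s}",
    teach=lambda img: img.upper(),
)
print(sorted(models))  # ['edge_B', 'hole_A']
```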
  • Teaching device 10 executes the measurement program prepared in steps S6 to S10 (step S11) and prepares a robot motion program including a motion compensating program for simulating the compensational motion of the robot 30 (step S12). In other words, the position and the orientation of the robot 30, while the robot handles the workpiece 20, may be compensated by a measurement result of the vision sensor. Also, the teaching device 10 may purposely change the position and/or the orientation of the model 24 and/or the camera 40 in the window 14a and, then, execute the motion compensating program to check whether the motion of the robot is suitably compensated.
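The idea behind the motion compensating program of step S12 can be sketched as shifting a taught pose by the workpiece displacement the vision sensor measures. This translation-only sketch is an assumption for illustration; a real system would compose full three-dimensional transforms on the robot's base or tool coordinate system, and the function name is hypothetical.

```python
def compensate(taught_pose, nominal_wp, measured_wp):
    """Shift the taught robot position by the measured workpiece displacement.
    All tuples are (x, y, z) in mm; orientation compensation is omitted."""
    dx = measured_wp[0] - nominal_wp[0]
    dy = measured_wp[1] - nominal_wp[1]
    dz = measured_wp[2] - nominal_wp[2]
    x, y, z = taught_pose
    return (x + dx, y + dy, z + dz)

# Workpiece found 5 mm off in x: the taught grip point shifts by the same amount.
print(compensate((400.0, 0.0, 50.0), (400.0, 0.0, 0.0), (405.0, 0.0, 0.0)))
# (405.0, 0.0, 50.0)
```

Deliberately perturbing `measured_wp`, as the teaching device does with the model 24 or camera 40, exercises exactly this path and shows whether the compensated pose lands where it should.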
  • The measurement program and the motion compensating program thus prepared may be downloaded to the actual system as a program or data for an operation in the field. By using the program and data obtained by the off-line teaching device, the workload of trial and error regarding the vision sensor in the field may be reduced or eliminated, whereby the total time and workload in the field may be greatly reduced. Further, the compensating motion of the robot may be simulated or checked off-line, which also contributes to a reduction of the workload in the field.
  • In the above embodiment, the simulation program 16 and the sensor program 18 are executed in the same device and results thereof are indicated on the same display. However, the programs may be separately executed in different devices and the results may be indicated on different displays.
  • According to the off-line teaching device of the present invention, teaching and adjusting of the vision sensor may be executed off-line, by locating images of the object to be measured and the vision sensor with a computer, whereby the workload and time for trial and error, such as program coordination in the field, may be greatly reduced. Further, the compensating motion of the robot for handling the object may be checked off-line.
  • While the invention has been described with reference to specific embodiments chosen for the purpose of illustration, it should be apparent that numerous modifications could be made thereto, by one skilled in the art, without departing from the basic concept and scope of the invention.

Claims (5)

1. An off-line teaching device to allow off-line arranging and adjusting of a vision sensor for measuring an object to be measured, the off-line teaching device comprising:
a storing part for storing a measurement condition of the object;
a displaying part for indicating an image of the vision sensor and an image of the object;
a simulating part for generating the images of the vision sensor and the object on the displaying part and for calculating measurement data for measuring the object by means of the vision sensor and based on the measurement condition.
2. The off-line teaching device as set forth in claim 1, further comprising a choice assisting part for assisting the choice of an optical condition of a camera of the vision sensor, based on data in relation to the shape and the dimensions of the object.
3. The off-line teaching device as set forth in claim 1, further comprising a measurement executing part, for carrying out measurement by means of the vision sensor, on the displaying part based on the measurement data and a programming part for preparing a robot program defining the motion of a robot handling the object.
4. The off-line teaching device as set forth in claim 3, wherein the vision sensor is attached to a movable part of the robot.
5. The off-line teaching device as set forth in claim 1, wherein the vision sensor is arranged at a fixed place.
US11/375,440 2005-03-17 2006-03-15 Off-line teaching device Abandoned US20060212171A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2005-77847 2005-03-17
JP2005077847A JP4266946B2 (en) 2005-03-17 2005-03-17 Offline teaching device

Publications (1)

Publication Number Publication Date
US20060212171A1 (en) 2006-09-21

Family

ID=36648636

Family Applications (1)

Application Number Title Priority Date Filing Date
US11/375,440 Abandoned US20060212171A1 (en) 2005-03-17 2006-03-15 Off-line teaching device

Country Status (4)

Country Link
US (1) US20060212171A1 (en)
EP (1) EP1703349A2 (en)
JP (1) JP4266946B2 (en)
CN (1) CN1834835A (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090187276A1 (en) * 2008-01-23 2009-07-23 Fanuc Ltd Generating device of processing robot program
DE102018116245B4 (en) 2017-07-11 2021-08-26 Fanuc Corporation Programming device that creates an operating program, as well as a method for creating the program
EP4104980A3 (en) * 2021-06-18 2023-03-01 Doosan Robotics Inc Apparatus and method for capturing image using robot

Families Citing this family (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101275831B (en) * 2007-03-26 2011-06-22 鸿富锦精密工业(深圳)有限公司 Image off-line processing system and method
JP5582427B2 (en) * 2012-12-18 2014-09-03 株式会社安川電機 Teaching data creation apparatus, robot system, and teaching data creation method
JP5549749B1 (en) * 2013-01-16 2014-07-16 株式会社安川電機 Robot teaching system, robot teaching program generation method and teaching tool
JP5729404B2 (en) * 2013-02-21 2015-06-03 株式会社安川電機 Teaching system and teaching method
JP5815761B2 (en) 2014-01-23 2015-11-17 ファナック株式会社 Visual sensor data creation system and detection simulation system
JP5850958B2 (en) * 2014-01-24 2016-02-03 ファナック株式会社 Robot programming device for creating a robot program for imaging a workpiece
JP5850962B2 (en) 2014-02-13 2016-02-03 ファナック株式会社 Robot system using visual feedback
JP6883392B2 (en) * 2016-07-29 2021-06-09 川崎重工業株式会社 Robot system
JP6626057B2 (en) * 2017-09-27 2019-12-25 ファナック株式会社 Inspection device and inspection system
JP7260405B2 (en) * 2019-06-07 2023-04-18 ファナック株式会社 Offline programming devices, robot controllers and augmented reality systems
JP2020203348A (en) * 2019-06-18 2020-12-24 株式会社ダイヘン Robot control device, and robot control system
CN115397629A (en) 2020-03-30 2022-11-25 发那科株式会社 Off-line simulation system

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6278906B1 (en) * 1999-01-29 2001-08-21 Georgia Tech Research Corporation Uncalibrated dynamic mechanical system controller
US20040172168A1 (en) * 2003-02-27 2004-09-02 Fanuc Ltd. Taught position modification device
US20040199288A1 (en) * 2003-02-28 2004-10-07 Fanuc Ltd Robot teaching device
US20050004709A1 (en) * 2003-07-03 2005-01-06 Fanuc Ltd Robot off-line simulation apparatus
US20060025890A1 (en) * 2004-08-02 2006-02-02 Fanuc Ltd Processing program generating device
US20060184279A1 (en) * 2003-06-02 2006-08-17 Matsushita Electric Industrial Co., Ltd. Article handling system and method and article management system and method

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH01209505A (en) * 1988-02-17 1989-08-23 Toshiba Corp Teaching device for remote control robot
JPH07122823B2 (en) * 1988-06-27 1995-12-25 株式会社日立製作所 Teaching and control method for robot / automatic machine with hand vision
JPH0421105A (en) * 1990-05-16 1992-01-24 Hitachi Ltd Stereoscopic teaching device for manipulator
JPH08167800A (en) * 1994-12-15 1996-06-25 Toshiba Corp Part mounting equipment
JPH09297611A (en) * 1996-05-02 1997-11-18 Nippon Telegr & Teleph Corp <Ntt> Method and device for teaching robot
JP3415427B2 (en) * 1998-02-25 2003-06-09 富士通株式会社 Calibration device in robot simulation
JP4227863B2 (en) * 2003-08-04 2009-02-18 株式会社デンソー Teaching apparatus and teaching method for visual inspection apparatus



Also Published As

Publication number Publication date
CN1834835A (en) 2006-09-20
EP1703349A2 (en) 2006-09-20
JP2006260271A (en) 2006-09-28
JP4266946B2 (en) 2009-05-27


Legal Events

Date Code Title Description
AS Assignment

Owner name: FANUC LTD., JAPAN

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:BAN, KAZUNORI;ARIMATSU, TARO;JUMONJI, TAKASHI;REEL/FRAME:017688/0123

Effective date: 20060302

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION