WO2017085811A1 - Teaching device and control information generation method - Google Patents

Teaching device and control information generation method

Info

Publication number
WO2017085811A1
Authority
WO
WIPO (PCT)
Prior art keywords
unit
work
marker
information
display
Prior art date
Application number
PCT/JP2015/082405
Other languages
English (en)
Japanese (ja)
Inventor
亘 前橋
政利 藤田
Original Assignee
富士機械製造株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 富士機械製造株式会社
Priority to PCT/JP2015/082405 (WO2017085811A1)
Priority to JP2017551442A (JP6660962B2)
Publication of WO2017085811A1

Classifications

    • G: PHYSICS
    • G05: CONTROLLING; REGULATING
    • G05B: CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B19/00: Programme-control systems
    • G05B19/02: Programme-control systems electric
    • G05B19/42: Recording and playback systems, i.e. in which the programme is recorded from a cycle of operations, e.g. the cycle of operations being manually controlled, after which this record is played back on the same machine

Definitions

  • the present invention relates to a teaching device for teaching an operation to a robot and a method for generating control information for controlling the robot.
  • Some industrial robots, for example robot arms, include a so-called parallel link mechanism in which a plurality of arm units arranged in parallel support one end effector, or a serial link mechanism, such as an articulated robot, in which a plurality of arm units are connected in one direction.
  • There are teaching devices in which a person simulates the work performed by these robot arms, the motion of the person is acquired by motion capture, and the result is taught to the robot arm (for example, Patent Document 1).
  • In the teaching device of Patent Literature 1, when a user wearing a measuring device (called a "motion capture" in the literature) on the hand operates it, the position of the measuring device is sampled at a predetermined sampling interval based on the three-dimensional coordinate data transmitted from the measuring device. The teaching device calculates the movement position, movement speed, and the like from the sampled data, and controls the arm unit so as to move to the calculated movement position at the calculated movement speed.
  • Industrial robots such as the robot arms described above perform not only movement operations but also work on workpieces at various positions during movement and at the movement destination.
  • Therefore, the user who handles the teaching device needs to teach the robot arm not only the movement operation but also the work content to be executed during movement or at the movement destination.
  • The present invention has been made in view of the above-described problems, and an object of the present invention is to provide a teaching device and a control information generation method capable of teaching a robot work at a work position in addition to a moving operation.
  • In order to solve the above problems, the technique disclosed in the present application is a teaching device that generates control information for controlling the operation of a robot, and includes a position marker unit that indicates a position to the robot, a detection unit that detects the state of the position marker unit, a processing unit that receives detection data obtained by the detection unit detecting the position marker unit, a display unit that displays a processing result of the processing unit, and an input unit that receives input from outside.
  • Based on the detection data, the processing unit executes a position information generation process for generating position information of the position marker unit in three-dimensional coordinates, a position display process for displaying the position information on the display unit, a work position selection process for selecting, from the displayed position information and according to input to the input unit, a work position at which the robot performs work, a work information setting process for setting, according to input to the input unit, work information corresponding to the content of the work performed at the work position, and a control information generation process for generating, as control information, data that associates the work position with the work information.
  • The processing unit may be configured to execute a first display process for displaying on the display unit a list box indicating a list of work information according to input to the input unit.
  • The processing unit may also execute an input process for setting a first number of position marker units based on input to the input unit, a determination process for determining whether the first number set in the input process matches a second number of position marker units detected by the position information generation process, and a second display process for displaying the result of the determination process on the display unit.
  • The processing unit may execute a reading process for reading output information that has already been generated as control information, and may display the position information included in the read output information on the display unit together with the position information detected by the position information generation process.
  • The processing unit may execute a third display process for displaying on the display unit a mode selection button for selecting a real-time display mode in which the position information generation process and the position display process are performed in parallel.
  • The teaching device according to any one of claims 1 to 5 may further comprise a jig having the position marker portion, the jig having an end effector, a drive unit that drives the end effector, and an end effector marker unit that indicates the position of the end effector. In that case, the detection unit detects the state of the end effector marker unit, and the processing unit generates, as the position information generation process, position information of the end effector marker unit that moves as the end effector operates based on the drive of the drive unit.
  • Further, the content of the technique disclosed in the present application is a control information generation method for generating control information for controlling the operation of a robot, using a teaching device that includes a position marker portion indicating a position to the robot, a detection unit that detects the state of the position marker unit, a processing unit that receives detection data obtained by the detection unit detecting the position marker unit, a display unit that displays a processing result of the processing unit, and an input unit that receives input from outside.
  • The method includes a position information generation step for generating position information of the position marker unit in three-dimensional coordinates based on the detection data, a position display step for displaying the position information on the display unit, a work position selection step for selecting, from the displayed position information and according to input to the input unit, a work position at which the robot performs work, a work information setting step for setting, according to input to the input unit, work information corresponding to the content of the work performed at the work position, and a control information generation step for generating, as control information, data that associates the work position with the work information.
  • The teaching device disclosed in the present application generates control information for controlling the robot.
  • The control information is information that sets, for example, the movement position, movement direction, movement time, and work of the robot.
  • the detection unit outputs detection data obtained by detecting the state of the position marker unit to the processing unit.
  • As a method for detecting the position marker portion, for example, optical or magnetic motion capture can be used.
  • The processing unit generates position information of the position marker unit based on the detection data of the detection unit, and displays the generated position information on the display unit. Thereby, the user can confirm the position of the motion-captured position marker portion by looking at the display unit.
  • the user can select a work position where the robot performs work from among the displayed position information, and can set information on the work performed at the selected work position.
  • the processing unit generates control information that associates the work position with the work information.
  • The work performed by the robot at the work position is, for example, clamping a workpiece, grasping a workpiece, irradiating a laser, capturing an image, or picking up a workpiece by suction. Therefore, by operating the input unit while checking the display on the display unit, the user can generate control information that sets the work the robot is to perform at a desired position. Further, the user can teach the robot a desired action using the control information.
  • the processing unit displays a list box indicating a list of work information in the work information setting process.
  • the user can check the list of the list box and associate the desired work content with the work position easily and reliably.
  • the display contents of such a list box can be determined by referring to a text file (definition data) according to a format such as an ini file format. Therefore, the user can appropriately select and set work information desired to be displayed as a list box by editing the text file.
  • the processing unit can display only work information that can be performed by the robot in the list box, and can prevent erroneous input.
  • the user inputs the actual number of position marker units for performing motion capture as the first number through the input unit.
  • the processing unit determines whether or not the input first number matches the second number detected by the position information generation process, that is, the number detected by the motion capture.
  • The processing unit displays the determination result on the display unit. Thereby, by checking the displayed determination result, the user can confirm whether all the position marker portions used in the motion capture are actually being captured. The user can then take an appropriate action, such as redoing the motion capture, according to the confirmation result.
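  • As a minimal illustrative sketch of this determination process (the publication gives no code; the function name and message strings below are hypothetical):

```python
# Sketch of the marker-count determination: compare the user-entered
# count (first number) with the number of markers actually captured
# (second number). Names are illustrative, not from the publication.

def check_marker_count(expected: int, detected_markers: list) -> str:
    detected = len(detected_markers)
    if detected == expected:
        return f"OK: all {expected} markers captured"
    # A mismatch suggests markers are occluded or outside the capture region,
    # so the user may need to redo the motion capture.
    return f"MISMATCH: expected {expected}, captured {detected}"
```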
  • the output information is already generated control information, for example, control information on which motion capture has been performed in advance.
  • The processing unit reads the output information and displays the position information included in the output information on the display unit together with the position information detected by the position information generation process. Thereby, for example, by viewing the display unit, the user can confirm the state of the work, the work positions, and the presence or absence of interference when a robot whose motion was captured in advance cooperates with another robot whose motion is currently being captured.
  • the user can set the processing unit to the real-time display mode by operating the input unit and selecting a mode selection button of the display unit, for example.
  • the processing unit executes real-time processing for displaying the generated position information on the display unit while generating the position information.
  • the user can confirm the position information of the captured position marker part with a real-time moving image while performing motion capture. The user can quickly determine whether to continue or redo the motion capture according to the confirmation result.
  • the teaching device further includes a jig having a position marker portion.
  • Conventionally, the hand of a person wearing a measuring apparatus or the like has had to enter the capture area where the motion capture is performed.
  • With this teaching device, for example, only a jig held in the hand needs to enter the capture region as the detection target. The user can prevent problems such as the hand or arm interfering with other members by keeping the hand and arm out of the capture area.
  • By moving the jig, the user can teach the robot.
  • the jig operates the end effector having the end effector marker portion by driving the drive portion.
  • the user drives the drive unit at the work position to operate the end effector.
  • the processing unit performs motion capture of the operating end effector marker unit to generate position information.
  • the user can easily determine the work position on the captured data by confirming the position where the end effector marker unit operates on the display of the display unit.
  • the processing unit can detect the position where the end effector marker unit is operated, and can set the detected position as a work position in advance.
  • the invention according to the present application is not limited to the teaching device, and can be implemented as an invention of a method for generating control information by the teaching device.
  • FIG. 1 schematically shows the configuration of the teaching device of the present embodiment. FIG. 2 is a perspective view of the frame portion to which the plurality of cameras are attached. FIG. 3 is a schematic diagram showing the configuration of the industrial robot that is the object controlled using the control information. FIG. 4 is a schematic diagram showing a state in which motion capture is performed using the teaching device of the present embodiment. FIG. 5 shows the display screen of the display unit. FIG. 6 is an enlarged view of the display of the capture display unit. FIG. 7 shows the contents of the definition data. FIGS. 8 and 9 show display states of the sub screen. FIG. 10 is an enlarged view of the display of the work information registration box.
  • FIG. 1 schematically shows a configuration of a main part of the teaching device 10 of the present embodiment.
  • the teaching device 10 is a device that performs optical motion capture, and includes a plurality of cameras 13 (only one is shown in FIG. 1), a plurality of jigs 15 (only one is shown in FIG. 1), and control information. And a generation device 17.
  • The teaching device 10 captures the movement of one or a plurality of jigs 15 with the plurality of cameras 13, and the control information generation device 17 generates control information D5 for controlling the robot arms 101 and 103 shown in FIG. 3.
  • each of the plurality of cameras 13 is attached to a frame portion 23 in which a plurality of (12 in this embodiment) pipes 21 are assembled in a rectangular parallelepiped shape.
  • Each of the plurality of pipes 21 is formed with the same length.
  • any three pipes 21 are connected to each other by a connecting member 25 at a corner portion of the rectangular parallelepiped frame portion 23.
  • Each connecting member 25 inserts and holds the end portions 21A of the three pipes 21, and fixes the three pipes 21 so as to be orthogonal to each other.
  • In the following description, the direction orthogonal to the placement surface of the base 19 on which the frame portion 23 is arranged is referred to as the vertical direction, the direction orthogonal to the vertical direction and running from front to back in FIG. 2 is referred to as the front-rear direction, and the direction orthogonal to both the vertical and front-rear directions is referred to as the left-right direction.
  • a total of six cameras 13 are attached to the frame portion 23.
  • the six cameras will be collectively referred to as “camera 13”.
  • Four cameras 13A, 13B, 13C, and 13D are attached, by fixing members 27, to the four pipes 21 on the upper side of the frame portion 23.
  • Each of the four cameras 13A to 13D is attached to a position close to each of the upper four connecting members 25.
  • the fixing member 27 fixes each of the cameras 13A to 13D so that the imaging direction faces the central portion of the frame portion 23.
  • The remaining two cameras 13E and 13F are attached by fixing members 27 to a pair of pipes 21 that face each other diagonally among the four pipes 21 provided along the vertical direction.
  • The cameras 13E and 13F are attached to the lower end portions of those pipes 21 on the base 19 side, and are fixed by the fixing members 27 so that their imaging directions face the central portion of the frame portion 23.
  • For these six cameras 13, a capture region R1, that is, a region for tracking the movement of the jig 15 and the marker portion 43 described later, is set in the cubic region surrounded by the frame portion 23 so that the marker portion 43 of the jig 15 can be photographed.
  • the six cameras 13 are set so that the imaging ranges overlap each other in order to capture the marker unit 43, and can capture the capture region R1 three-dimensionally.
  • the shape of the frame part 23 shown in FIG. 2, the number of cameras 13, the attachment position of the camera 13, etc. are examples, and can be changed suitably.
  • each of the cameras 13 includes an image sensor 31 and illumination devices 33 and 34.
  • the image sensor 31 is, for example, a CCD image sensor or a CMOS image sensor.
  • The illumination devices 33 and 34 are, for example, LED illuminations, and emit light of two different wavelengths corresponding to the marker portions 43A and 43B provided on the two jigs 15A and 15B (see FIG. 4) described later.
  • While light is emitted from the illumination devices 33 and 34, the camera 13 receives, with the image sensor 31, the reflected light returned from the marker portions 43A and 43B.
  • the camera 13 outputs the imaged data as image data D1 to the teaching device 10 via the video cable 35 (see FIG. 2).
  • The camera 13 may be configured with optical filters, matched to the wavelengths of light emitted from the illumination devices 33 and 34, attached to the light entrance of the image sensor 31 so that the reflected light of the marker portions 43A and 43B can be detected more easily.
  • The jig 15 shown in FIG. 1 is a detection target that simulates the robot arms 101 and 103 of the industrial robot 100 shown in FIG. 3.
  • FIG. 3 schematically shows the configuration of the industrial robot 100.
  • the robot arm 101 is an articulated robot having a serial link mechanism that connects two arm portions 105 in one direction and supports a hand portion 109 that is an end effector at a tip portion.
  • the robot arm 103 connects the two arm portions 107 in one direction and supports the hand portion 111 at the tip portion.
  • the industrial robot 100 drives the robot arms 101 and 103 to attach the workpieces W1 and W2 sandwiched between the hand units 109 and 111 to the substrate B.
  • the workpieces W1 and W2 are, for example, electronic parts, screws, and the like.
  • the teaching device 10 generates control information D5 for operating the two robot arms 101 and 103 in cooperation.
  • In the following description, when it is necessary to distinguish between the two jigs 15 or between the parts (marker portion 43 and the like) provided on them, an alphabetic suffix is added after the reference numeral, as shown in FIG. 4. When there is no need to distinguish them, the two jigs are collectively referred to as “jigs 15”.
  • the marker part 43 is fixed to the outer peripheral part of the main body part 41.
  • the marker unit 43 forms a sphere and reflects light emitted from the illumination devices 33 and 34 of each camera 13.
  • the marker portion 43A provided in the jig 15A shown in FIG. 4 is made of a material having a reflection characteristic that reflects light of a specific wavelength irradiated from the illumination device 33.
  • the marker portion 43B provided on the other jig 15B is made of a material having a reflection characteristic that reflects light of a specific wavelength irradiated from the illumination device 34.
  • the control information generation device 17 can acquire the position information D2 that indicates the position of the marker unit 43 in three-dimensional coordinates.
  • The end effector 45 has a shape simulating the hand portions 109 and 111 of the robot arms 101 and 103 (see FIG. 3) that clamp the workpieces W1 and W2, and is composed of a pair of rod-shaped members whose end portions are bent in directions approaching each other. The pair of members of the end effector 45 is provided at positions sandwiching the marker portion 43, and the members are each attached to the main body portion 41.
  • the end effector marker portion 46 has a shape different from that of the marker portion 43, for example, a rectangular shape.
  • the end effector marker portions 46A and 46B are configured with mutually different reflection characteristics like the marker portions 43A and 43B, and reflect the light emitted from each of the illumination devices 33 and 34 of the camera 13.
  • the control information generation device 17 can also acquire the position information D2 of the end effector 45 as in the case of the jig 15.
  • the camera 13 may include a dedicated illumination device that irradiates the end effector 45 with light separately from the illumination devices 33 and 34 used for the jig 15.
  • the main body 41 has an actuator 49 for opening and closing the end effector 45 built therein.
  • the main body 41 is attached with a tip end portion of a rod-shaped gripping portion 47 at a portion opposite to the marker portion 43 and the end effector 45.
  • The gripping portion 47 has a length such that, with the jig 15 placed in the capture region R1 of the frame portion 23 (see FIG. 2), the user can hold the proximal end portion of the gripping portion 47 protruding out of the frame portion 23. Thereby, the user can operate the jig 15 without putting any part of the body into the capture region R1.
  • a driving switch 51 for driving or stopping the actuator 49 is provided at the base end portion of the gripping portion 47 opposite to the main body portion 41.
  • the drive switch 51 is connected to the actuator 49 by a connection line 53 disposed in the grip portion 47 and the main body portion 41.
  • The user holds the proximal end portion of the gripping portion 47 and moves the jig 15 provided at its distal end from the start position within the capture region R1 of the frame portion 23 to the work position where the robot arms 101 and 103 are to clamp the workpieces W1 and W2 with the hand units 109 and 111.
  • the user turns on the drive switch 51 in a state where the jig 15 is stopped after being moved, thereby closing the tip portion of the end effector 45.
  • the user turns off the drive switch 51 to open the tip end portion of the end effector 45.
  • the control information generation device 17 captures the end effector marker unit 46.
  • the control information generation device 17 is, for example, a personal computer mainly composed of a CPU (Central Processing Unit) 61, and includes a conversion unit 63, a storage unit 65, an input unit 67, a display unit 69, and the like.
  • the control information generation device 17 inputs the imaging data D1 output from the camera 13 to the conversion unit 63 via the video cable 35 (see FIG. 2).
  • the conversion unit 63 arranges the imaging data D1 captured by the plurality of cameras 13 in time series, adds identification information of the camera 13, time information, and the like, and outputs them to the CPU 61.
  • the CPU 61 stores the imaging data D1 input from the conversion unit 63 in the storage unit 65.
  • the storage unit 65 includes a memory, a hard disk, and the like, and stores definition data D3, a control program D7, and the like in addition to the imaging data D1.
  • the definition data D3 stores setting data necessary for display on the display unit 69.
  • the control program D7 is a program executed on the CPU 61.
  • the CPU 61 implements various processing modules of the position information generation unit 71, the position information display unit 73, and the control information generation unit 75 by reading and executing the control program D7 stored in the storage unit 65.
  • the position information generation unit 71 and the like are configured as software realized by the CPU 61 executing the control program D7, but may be configured as dedicated hardware.
  • the input unit 67 is an input device such as a keyboard or a mouse that receives input from the user.
  • the display unit 69 is, for example, a liquid crystal monitor, and displays position information D2 and the like captured from the marker unit 43.
  • The teaching device 10 captures the jig 15 as it is moved from the start position to the arrival position, and generates, for example, the position information D2 of the marker portion 43 and the end effector marker portion 46.
  • the user operates the input unit 67 while confirming the position information D2 displayed on the display unit 69, and inputs relay points, continuous relay points, work information, and the like.
  • the teaching device 10 generates control information D5 in which the relay point and the like and the work information performed at the relay point and the like are associated with the position information D2.
  • FIG. 4 schematically shows a state in which motion capture is performed.
  • supply devices 81 and 82 for supplying the workpieces W1 and W2 are arranged in the capture region R1 of the frame portion 23.
  • the supply devices 81 and 82 are, for example, tape feeder type supply devices that send taped electronic components (workpieces) one by one to a supply position.
  • a supply position marker portion 84 is provided at the supply position of the workpiece W2 of the supply device 81.
  • a supply position marker portion 85 is provided at the supply position of the workpiece W1 of the supply device 82.
  • a substrate 86 is disposed between the supply devices 81 and 82 in the front-rear direction.
  • the substrate 86 is formed in a rectangular shape, and is disposed horizontally such that the plane is along the front-rear direction and the left-right direction.
  • mounting position marker portions 88 are provided on the four corners of the substrate 86.
  • the mounting position marker portion 88 on which the jig 15A performs the mounting operation is referred to as a mounting position marker portion 88A in order to distinguish it from the other mounting position marker portions 88.
  • the mounting position marker portion 88 on which the jig 15B performs the mounting operation is referred to as a mounting position marker portion 88B in order to distinguish it from other mounting position marker portions 88.
  • the supply devices 81 and 82 and the substrate 86 described above may be actual devices or substrates, or may be members that simulate shapes. Further, three reference marker portions 91 are provided adjacent to each other at the center of the substrate 86.
  • the reference marker unit 91 is a position serving as a reference for the operation of the robot arms 101 and 103 (see FIG. 3).
  • The jig 15A is used to teach the work of picking up the workpiece W1 from the supply position of the supply device 82 (see FIG. 4) with the hand unit 109 of the robot arm 101 shown in FIG. 3 and mounting it on the substrate B.
  • The user operates the jig 15A by holding the gripping portion 47A, and moves the jig 15A from the start position to the supply position marker portion 85 shown in FIG. 4.
  • the user turns on the drive switch 51A to close the end effector 45A.
  • the user moves the jig 15A from the supply position marker portion 85 to the position (relay position) of the mounting position marker portion 88A.
  • the user turns off the drive switch 51A to open the end effector 45A.
  • the user moves the jig 15A to a position (arrival position) for performing the next operation.
  • the jig 15B teaches the work of picking up the workpiece W2 from the supply position of the supply device 81 (see FIG. 4) by the hand unit 111 of the robot arm 103 and mounting it on the substrate B. This operation is performed simultaneously with the operation of the robot arm 101 described above.
  • While holding the gripping portion 47B, the user operates the jig 15B and moves it from the start position shown in FIG. 4 to the supply position marker portion 84, as indicated by the broken-line arrow 95 in FIG. 4.
  • the user turns on the drive switch 51B to close the end effector 45B at the position of the supply position marker portion 84 (relay position).
  • the user moves the jig 15B from the supply position marker portion 84 to the position (relay position) of the mounting position marker portion 88B.
  • the user turns off the drive switch 51B to open the end effector 45B.
  • the user moves the jig 15B to a position (arrival position) for performing the next operation.
  • The CPU 61 executes the control program D7 and the like to generate position information D2 containing, for the marker portions 43A and 43B and the end effector marker portions 46A and 46B, the positions in three-dimensional coordinates (X, Y, and Z coordinates) and the inclinations with respect to the X, Y, and Z axes.
  • the CPU 61 captures the imaging data D1 from the conversion unit 63 and stores it in the storage unit 65.
  • Based on the identification information and time information of the cameras 13 added to the imaging data D1 stored in the storage unit 65, the position information generation unit 71 executed by the CPU 61 calculates, for each shooting time, the three-dimensional coordinate positions of the marker portions 43A and 43B and the end effector marker portions 46A and 46B attached to the jigs 15A and 15B.
  • the position information generation unit 71 stores the calculated position information D2 in the storage unit 65.
  • The position information generation unit 71 performs labeling on the binarized imaging data D1, performs processing using an algorithm such as epipolar matching, and calculates the coordinate positions of the marker portion 43 and the like in three-dimensional space. Further, the position information generation unit 71 calculates these coordinates relative to the reference marker portions 91; for example, it calculates the coordinate position of the marker portion 43 and the like with reference to the barycentric position of the three reference marker portions 91.
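  • The relative-coordinate calculation can be pictured as subtracting the barycenter of the three reference marker portions 91 from each captured position. A minimal sketch, assuming the barycenter is a simple mean (the publication does not give the formula; the function name is hypothetical):

```python
import numpy as np

def to_reference_frame(marker_xyz, reference_markers):
    # Origin is the barycenter (centroid) of the three reference markers.
    origin = np.mean(np.asarray(reference_markers, dtype=float), axis=0)
    return np.asarray(marker_xyz, dtype=float) - origin

# Example: three reference markers near the center of the substrate 86.
refs = [(0.0, 0.0, 0.0), (10.0, 0.0, 0.0), (5.0, 8.0, 0.0)]
print(to_reference_frame((105.0, 52.0, 30.0), refs))  # position relative to the centroid
```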
  • the marker unit 43 and the like have a structure having different reflection characteristics according to the wavelength of light emitted from the illumination devices 33 and 34. For this reason, for example, the position information generation unit 71 identifies reflected light from the marker units 43A and 43B with respect to the imaging data D1 based on a difference in luminance and the like, and calculates a coordinate position for each of the marker units 43A and 43B.
  • the processing method by which the position information generation unit 71 calculates the position information D2 is not particularly limited.
  • the position information D2 may be calculated by the principle of triangulation.
  • the position information generation unit 71 samples the positions of the marker units 43A and 43B at predetermined time intervals, and generates data as a set of points indicating coordinate positions as the position information D2.
  • the position information generation unit 71 calculates inclinations with respect to the X axis, the Y axis, and the Z axis from the relationship between adjacent coordinate positions among the sampled coordinate positions.
  • the position information generation unit 71 stores the calculated inclination value together with the coordinate position in the storage unit 65 as position information D2.
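  • One plausible reading of this inclination calculation is the angle between the displacement vector of adjacent sampled points and each coordinate axis; the exact formula is not given in the publication, so the sketch below is an assumption:

```python
import numpy as np

def inclinations(points):
    # For each pair of adjacent sampled positions, compute the angles (deg)
    # of the displacement vector with respect to the X, Y, and Z axes.
    pts = np.asarray(points, dtype=float)
    out = []
    for p0, p1 in zip(pts[:-1], pts[1:]):
        v = p1 - p0
        n = np.linalg.norm(v)
        if n == 0.0:
            out.append((0.0, 0.0, 0.0))  # no movement between samples
            continue
        out.append(tuple(np.degrees(np.arccos(v / n))))
    return out

print(inclinations([(0, 0, 0), (1, 1, 0), (1, 1, 2)]))  # -> [(45, 45, 90), (90, 90, 0)]
```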
  • The position information generation unit 71 also captures the operation of the end effector marker portion 46 provided on the end effector 45. For example, when the position information generation unit 71 detects a position where only the end effector marker portion 46 is moving while the position of the marker portion 43 remains stopped within a certain range, it sets the detected position as a relay position. In this way, the position information generation unit 71 sets, from the set of points indicating the coordinate positions, the positions where the user operated the end effector 45 as relay positions, and stores the position information D2 with the relay points set in the storage unit 65.
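  • A minimal sketch of this relay-position detection, under stated assumptions (a fixed stillness radius and a per-sample displacement test, both hypothetical; the publication only describes the idea):

```python
import numpy as np

def detect_relay_points(marker_pos, effector_pos, still_radius=2.0):
    # Flag sample indices where the jig's position marker stays within a
    # small radius (the jig is stopped) while the end effector marker moves,
    # i.e. where the user is presumed to have operated the drive switch 51.
    marker = np.asarray(marker_pos, dtype=float)
    effector = np.asarray(effector_pos, dtype=float)
    relay = []
    for i in range(1, len(marker)):
        jig_still = np.linalg.norm(marker[i] - marker[i - 1]) < still_radius
        effector_moving = np.linalg.norm(effector[i] - effector[i - 1]) >= still_radius
        if jig_still and effector_moving:
            relay.append(i)
    return relay
```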
  • the position information display unit 73 displays the position information D2 generated by the position information generation unit 71 and stored in the storage unit 65 on the display unit 69.
  • FIG. 5 shows an example of the display screen 121 displayed on the display unit 69 by the position information display unit 73. Note that, unlike FIG. 4, FIG. 5 shows simpler trajectories of the marker portions 43 in order to keep the drawing and its explanation simple. FIG. 5 also shows three trajectories: those of the marker portions 43A and 43B of the two jigs 15A and 15B, plus a trajectory obtained by reading and displaying previously generated control information D5, described later.
  • a capture display unit 123 is displayed on the left side in the display screen 121.
  • The position information display unit 73 displays, on the capture display unit 123, the coordinate positions of the marker portions 43A and 43B expressed in three-dimensional coordinates based on the position information D2 generated by the position information generation unit 71 described above.
  • the X direction displayed on the capture display unit 123 corresponds to, for example, the front-rear direction shown in FIG.
  • the Y direction corresponds to the up and down direction.
  • the Z direction corresponds to the left-right direction.
  • the numbers attached to the vertical axis in the Y direction indicate examples of coordinate position values.
  • As the coordinate position, for example, a position relative to the reference marker portions 91 (FIG. 4) is set.
  • the position information display unit 73 may schematically show the supply device 81 and the substrate 86 shown in FIG. 4 on the display screen 121.
  • FIG. 6 shows an enlarged view of the capture display unit 123.
  • a locus 125 indicated by a solid line in the capture display unit 123 indicates, for example, a locus obtained by capturing the marker unit 43A (displayed as “marker 1” in the drawing).
  • a locus 126 indicated by a broken line indicates, for example, a locus obtained by capturing the marker portion 43B (indicated as “marker 2” in the drawing).
  • A locus 127 indicated by a bold line is obtained by reading control information D5 generated by motion capture performed in advance and displaying the position information D2 included in that control information D5 (displayed as “marker 3” in the drawing).
  • The position information display unit 73 automatically sets the correspondence between the captured marker portions 43A and 43B and the displayed markers 1 to 4. For example, the position information display unit 73 assigns the marker portion that reflects the light of the illumination device 33 to “marker 1” and the marker portion that reflects the light of the illumination device 34 to “marker 2”. Alternatively, the position information display unit 73 may assign the markers 1 to 4 in the order in which the marker portions are first captured. The position information display unit 73 may also allow the correspondence between the captured marker portions 43A and 43B and the displayed markers 1 to 4 to be changed manually. As will be described later, when the generated control information D5 is read, the marker (1 to 4) with which the position information display unit 73 associates the read position information D2 is changed according to the setting of the marker display field 193.
  • The position information display unit 73 determines the number of trajectories 125, 126, and 127 to be displayed and their line types based on the setting data of the definition data D3.
  • FIG. 7 shows an example of the definition data D3.
  • the definition data D3 is, for example, data described in an ini file format, and defines the line type and work information of each marker.
  • the file format of the definition data D3 is not limited to the ini file format, but may be other formats such as an xml file format.
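  • For illustration, the definition data D3 might look like the following; the section and key names are hypothetical, since the publication only states that the file defines each marker's line type and the work information:

```python
import configparser

# Hypothetical ini-format contents of definition data D3.
DEFINITION_DATA = """\
[marker1]
line_type = solid
[marker2]
line_type = dashed
[work_information]
0 = (not set)
1 = hand open
2 = hand close
3 = capture image
"""

config = configparser.ConfigParser()
config.read_string(DEFINITION_DATA)
print(config["marker1"]["line_type"])    # -> solid
print(dict(config["work_information"]))  # entries shown in the list box 181
```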
  • a display selection field 133 for selecting a locus to be displayed is displayed below the capture display unit 123.
  • the position information display unit 73 displays four display selection fields 133 on the capture display unit 123 according to the setting of the definition data D3.
  • the display selection field 133 is provided corresponding to each locus 125, 126, 127.
  • the user can select the display selection field 133 by operating the mouse of the input unit 67, moving the pointer 135 to the position of the marker display selection field 133 to be displayed on the capture display unit 123, and clicking.
  • the position information display unit 73 displays a checked mark in the display selection field 133 selected by the user, and displays a trajectory corresponding to the selected marker on the capture display unit 123.
  • the state shown in FIG. 6 is a state in which the markers 1 to 3 are selected and the trajectories 125, 126, and 127 corresponding to the markers 1 to 3 are displayed.
  • the user is not limited to operating the pointer 135, and may select the display selection field 133 by operating a keyboard or the like.
  • a relay point 129 indicated by a black circle in the capture display unit 123 corresponds to a position where the jig 15 is stopped and the drive switch 51 is operated in the above-described capture.
  • the relay point 129 is set as a passing point of the hand units 109 and 111.
  • A continuous relay point 131 indicated by a white circle in the capture display unit 123 is a position set as a point through which the hand units 109 and 111 pass continuously. Unlike the relay points 129, continuous relay points 131 are set as passing points even at immediately adjacent coordinate positions.
  • the continuous relay point 131 may be automatically set by the position information generation unit 71 in the above-described capture. For example, the position information generation unit 71 may set points adjacent to each other at a predetermined interval or less as the continuous relay points 131 among the sampled points. Further, as described later, the user can set the continuous relay point 131 on the capture display unit 123.
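  • A sketch of such automatic setting of continuous relay points, assuming a simple distance threshold between adjacent samples (the predetermined interval is not specified in the publication):

```python
import numpy as np

def continuous_relay_points(points, max_gap=1.5):
    # Mark as continuous relay points the sampled points whose distance to
    # the previous point is at or below the predetermined interval.
    pts = np.asarray(points, dtype=float)
    gaps = np.linalg.norm(np.diff(pts, axis=0), axis=1)
    return [i + 1 for i, g in enumerate(gaps) if g <= max_gap]

print(continuous_relay_points([(0, 0, 0), (1, 0, 0), (5, 0, 0), (5.5, 0, 0)]))  # -> [1, 3]
```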
  • the start position 137 indicated by a black triangle in the capture display unit 123 is a position where the capture of the jig 15 is started in the above-described capture.
  • the starting position 137 is set as a position where movement of the hand units 109 and 111 is started.
  • An arrival position 138 indicated by a white square is a position where the capture of the jig 15 is completed in the above-described capture.
  • the arrival position 138 is set as the position to which the hand units 109 and 111 are moved.
  • the position information display unit 73 displays the position information D2 generated by the position information generation unit 71.
  • the display mode of the position information display unit 73 differs depending on each mode (trajectory display mode or the like) described later.
  • the position information display unit 73 displays the relay point 129 set in the position information D2 on the capture display unit 123. Further, the user can set the relay point 129 even after capture. As shown in FIG. 6, the user places the pointer 135 at an arbitrary coordinate position in the capture display unit 123 and performs a single click operation.
  • the position information display unit 73 displays a list 141 at the selected position.
  • the list 141 displays items such as relay points, continuous relay points, and deletions.
  • the user can set the relay point 129 or the continuous relay point 131 to an arbitrary position by selecting one of these items.
  • the position information display unit 73 changes the trajectory 127 and the like so as to pass through the set relay point 129.
  • the user can delete the relay point 129 and the like once set.
  • the user moves the pointer 135 to a position such as the relay point 129 and performs a click operation.
  • the user can delete the set relay point 129 by selecting a deletion item from the items displayed in the list 141.
  • the user can change the position by performing an operation (drag operation) for moving the departure position 137, the relay point 129, and the like with the pointer 135 selected.
  • the position information display unit 73 changes the display and data such as the departure position 137 according to the drag operation.
  • the user can intuitively set / change / delete the relay point 129 and the continuous relay point 131 while confirming the display on the capture display unit 123.
  • Alternatively, instead of setting the relay points 129 in advance by operating the drive switch 51, the user may capture only the trajectory along which the marker portion 43 is moved, display it on the capture display unit 123, and set all the relay points 129 afterwards.
  • When a trajectory is selected, the position information display unit 73 displays a sub screen on which the data of the selected trajectory can be changed directly. FIGS. 8 and 9 show an example of the sub screen 121A when the locus 125 is selected.
  • a control information display unit 143, a change save button 145, and a cancel button 147 are displayed on the sub screen 121A.
  • In the above-described capture, the position information generation unit 71 captures the closing and opening movements of the end effector marker portion 46 of the end effector 45, and thereby automatically sets the work information (hand open, hand close, and the like) at the relay points 129.
  • In the control information display unit 143 shown in FIGS. 8 and 9, the work information detected by the position information generation unit 71 through the capture of the end effector marker portion 46 is therefore already set.
  • the control information display unit 143 is a table in which data is set in a row direction and a column direction. This data can be output as control information D5 for controlling the industrial robot 100 as will be described later.
  • In the header of the control information display unit 143, “time”, “marker number”, “position type”, “work information”, “coordinate value”, “angle”, “position name”, and “work information name” are displayed in order from the left.
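  • One row of the control information display unit 143 can be modeled as a record with these columns. A sketch in Python, with field names paraphrased from the columns above and the numeric codes taken from the description that follows (the publication does not define an actual data format):

```python
from dataclasses import dataclass

@dataclass
class ControlRecord:
    time: float            # elapsed time since capture started at the start position 137
    marker_number: int     # 1-4: which marker the row belongs to
    position_type: int     # 0 unset, 1 departure, 2 relay, 3 continuous relay, 4 arrival
    work_information: int  # 0 none, 1 hand open, 2 hand close, 3 capture image
    coordinate: tuple      # (x, y, z) relative to the reference marker portions 91
    angle: tuple           # inclinations with respect to the X, Y, Z axes
    position_name: str = ""
    work_information_name: str = ""
```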
  • In the time column, the elapsed time since the capture was started at the start position 137 is displayed.
  • In the marker number column, the number of the marker portion 43 is displayed; for example, for the locus 125, the number “1” indicating the marker 1 is displayed.
  • the position information display unit 73 displays the position name of the control information display unit 143 based on the setting data of the position name (see FIG. 7) of the definition data D3. For example, the position type “0” indicates an unset coordinate position where the relay point 129 or the like is not set.
  • the position type “1” indicates the coordinate position of the departure position 137.
  • “2” indicates the coordinate position of the relay point 129.
  • “3” indicates the coordinate position of the continuous relay point 131.
  • “4” indicates the coordinate position of the arrival position 138.
  • The user can change the position type of a desired coordinate position (for example, make it a relay point 129) by selecting its position type field with the pointer 135 or the like and changing the set number.
  • the position information display unit 73 displays the work information name of the control information display unit 143 based on the setting data of the work information (see FIG. 7) of the definition data D3. For example, “0” in the work information indicates that no work is set. “1” indicates that the hand opening for opening the hand units 109 and 111 (see FIG. 3) is executed. “2” indicates that hand closing for closing the hand units 109 and 111 is executed. “3” indicates that imaging is performed by a camera (not shown) attached to the robot arm 101 or the robot arm 103.
  • the type of work information is not limited to the above, and can be changed as appropriate.
  • the work information may be a work of sucking the workpiece W1 by the suction nozzle.
  • the work information may be a work of making a hole by irradiating the workpiece W1 with a laser from a laser light source attached to the arm unit 105.
  • the user can set appropriate work information by changing the file of the definition data D3 in accordance with the contents that the robot arm 101 and the robot arm 103 can execute.
  • the position information display unit 73 reads and displays the coordinate value and angle of the position information D2 corresponding to the trajectories 125, 126, 127 selected by the user, for example.
  • The position information display unit 73 highlights the coordinate position selected by the user on the sub screen 121A. For example, suppose the user selects the relay point 129A on the arrival position 138 side of the two relay points 129 on the locus 125 shown in FIG. 6. In this case, as shown in FIG. 8, the position information display unit 73 displays the row of “relay point 2” corresponding to the relay point 129A surrounded by the emphasis mark 149. Thereby, the user can easily confirm which coordinate position selected in the capture display unit 123 is being edited, and is prevented from erroneously changing data at other coordinate positions.
  • the user can select the data of each item in the control information display unit 143 with the pointer 135 and input a number or the like, thereby changing the data to a desired position, angle, work content, and the like.
  • When the user selects the change save button 145, the position information display unit 73 reflects the changed contents in the position information D2 and ends the display of the sub screen 121A.
  • When the user selects the cancel button 147, the position information display unit 73 ends the display of the sub screen 121A without reflecting the changes in the position information D2.
  • a mode display unit 151 is displayed on the display screen 121 at the upper part on the right side of the capture display unit 123.
  • the mode display unit 151 displays the current mode of the position information generation unit 71 and the position information display unit 73. This mode is changed by selecting any one of a locus display button 153, a playback button 155, a real-time display button 157, and a recording start button 159 provided below the mode display unit 151.
  • When the locus display button 153 is selected, the position information display unit 73 displays “trajectory being displayed” on the mode display unit 151, enters the locus display mode, and reads control information D5 for which motion capture has already been completed and which has been saved in the storage unit 65.
  • The control information D5 is data (an example of output information) output in a file format from the control information display unit 143 shown in FIGS. 8 and 9, as will be described later, and what is read is changed according to the setting of the marker display field 193.
  • the position information display unit 73 displays both the point indicating the coordinate position of the read control information D5 and the locus connecting the points on the capture display unit 123.
  • When the playback button 155 is selected, the position information display unit 73 displays “Now playing” on the mode display unit 151, enters the playback mode, and reads control information D5 that has already been saved in the storage unit 65.
  • In the playback mode, the position information display unit 73 displays only points indicating the coordinate positions of the read control information D5 on the capture display unit 123, and does not display a locus.
  • a playback speed setting field 158 is displayed on the left of the playback button 155.
  • the position information display unit 73 changes the display speed of the points and tracks in the playback mode and the track display mode according to the numbers in the playback speed setting field 158.
  • the position information display unit 73 can set, for example, a speed from 0.1 times to 5 times as a playback speed. In the position information display unit 73, for example, 1 is set as the initial value of the reproduction speed.
  • When the real-time display button 157 is selected, the position information display unit 73 displays “real time display” on the mode display unit 151, enters the real-time display mode, and displays on the capture display unit 123 the position information D2 of the marker portions 43 being captured within the capture region R1 (see FIG. 2).
  • the position information display unit 73 receives the position information D2 from the position information generation unit 71, processes it in real time, and displays it as a moving image. That is, the CPU 61 executes in parallel two processes, a process for generating the position information D2 by the position information generating unit 71 and a process for displaying the locus 125 by the position information display unit 73. Further, the position information generation unit 71 is set not to store the position information D2 in the storage unit 65 in the real-time display mode.
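  • The parallel generation and display can be pictured as a producer-consumer pair; a minimal sketch (illustrative only, not the actual control program D7):

```python
import queue
import threading
import time

positions: queue.Queue = queue.Queue()

def generate():
    # Stands in for the position information generation unit 71:
    # pushes each newly captured coordinate as it is computed.
    for i in range(5):
        positions.put((float(i), 0.0, 0.0))
        time.sleep(0.05)
    positions.put(None)  # capture finished

def display():
    # Stands in for the position information display unit 73:
    # consumes and plots points while the capture is still running.
    while (p := positions.get()) is not None:
        print("plot point", p)  # would update the capture display unit 123

threading.Thread(target=generate).start()
display()
```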
  • When the recording start button 159 is selected, the position information display unit 73 displays “recording” on the mode display unit 151, enters the recording mode, and displays in real time the position information D2 of the marker portions 43 being captured.
  • In the recording mode, unlike the real-time display mode, the position information generation unit 71 saves the generated position information D2 in the storage unit 65.
  • Under the mode display unit 151, an effective marker number column 161, a difference real-time display 163, and a difference detection holding display 165 are displayed side by side in the left-right direction.
  • a recording marker number column 167 is displayed below the effective marker number column 161.
  • the position information display unit 73 displays the number of marker units 43 captured in the capture region R1 by the position information generation unit 71 in the effective marker number column 161.
  • In the recording marker number column 167, the user sets the number of marker portions 43 that are actually moved in the capture region R1, that is, the number of marker portions 43 that the position information generation unit 71 should capture.
  • the user selects a recording marker number field 167 and inputs a number to set a desired number in the field.
  • The user can also change the number in the column by operating the selection buttons 169 provided to the left of the recording marker number column 167. When the upward triangle button of the selection buttons 169 is selected, the position information display unit 73 increases the number in the recording marker number column 167; when the downward triangle button is selected, it decreases the number.
  • The position information display unit 73 turns on the difference real-time display 163 (for example, lights it in red) when the number in the effective marker number column 161 and the number in the recording marker number column 167 differ. Accordingly, the difference real-time display 163 lights up whenever the number of marker portions 43 set by the user does not match the number of marker portions 43 actually being captured.
  • In the real-time display mode and the recording mode, when a difference between the number in the effective marker number column 161 and the number in the recording marker number column 167 is detected at any time between the start and the end of the capture of the marker portions 43, the position information display unit 73 lights the difference detection holding display 165 (for example, in red). Once the difference detection holding display 165 is lit, the position information display unit 73 holds that state until the next capture is started, and turns the display off at the start of the next capture.
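  • The two indicators behave like a momentary lamp plus a latch; a small sketch of that behaviour (class and method names are hypothetical):

```python
class DifferenceIndicators:
    def __init__(self):
        self.realtime = False  # difference real-time display 163
        self.held = False      # difference detection holding display 165

    def start_capture(self):
        self.held = False      # cleared when the next capture starts

    def update(self, expected: int, detected: int):
        self.realtime = expected != detected
        if self.realtime:
            self.held = True   # once lit, held until the next capture

ind = DifferenceIndicators()
ind.start_capture()
ind.update(expected=2, detected=1)  # both indicators light
ind.update(expected=2, detected=2)  # real-time goes off, holding stays lit
print(ind.realtime, ind.held)       # -> False True
```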
  • the start and end of the capture described above can be detected by, for example, the processing state of the position information generation unit 71 or the start of the real-time display mode or recording mode.
  • FIG. 10 shows an enlarged view of the work information registration box 171.
  • In the work information registration box 171, the marker name, the names of the relay points 129, and the work contents are displayed in order from the left.
  • the user can also change the work information set for each relay point 129 by operating the work information registration box 171.
  • Each line of the work information registration box 171 corresponds to each of the markers 1 to 4.
  • In each row, a marker name, a selection button 173 to the right of the name, and a relay point number column 175 to the right of the selection button 173 are displayed.
  • In the work information registration box 171, three relay points 129 are displayed for each marker at a time. When more relay points 129 are set than can be displayed at once in the work information registration box 171, the display can be scrolled. For example, according to the operation of the selection button 173, the position information display unit 73 scrolls the display portion of the relay points 129 of only the corresponding marker (for example, only the marker 1) to the right or left.
  • When a number is entered in the relay point number column 175, the position information display unit 73 displays the relay point 129 of the entered number on the leftmost side. For example, when the number “2” is entered in the relay point number column 175 of the marker 1, the position information display unit 73 displays the relay point 2 on the leftmost side of the marker 1 row, followed by the relay point 3 and the relay point 4 to its right; in this case, the relay point 1 of the marker 1 is not displayed. Further, as the scroll bar 177 provided at the bottom of the work information registration box 171 is operated in the left-right direction, the position information display unit 73 scrolls the display portions of the relay points 129 of all the markers together.
  • a selection button 179 and a list box 181 are displayed beside the name of each relay point 129 on the right side of the relay point number column 175 of the work information registration box 171.
  • The position information display unit 73 highlights, in the capture display unit 123, the relay point 129 selected in the work information registration box 171. For example, as shown in FIG. 10, the user places the pointer 135 on the selection button 179 of the relay point 2 of the marker 2 and clicks it. The position information display unit 73 then displays the selected relay point 2 of the marker 2 surrounded by the highlight mark 183.
  • Thereby, the user can visually confirm on the capture display unit 123 whether the selected relay point 129 is the intended one. As a result, erroneous input in which work information is entered for the wrong relay point 129 can be suppressed.
  • the position information display unit 73 displays the work information (see FIG. 7) set in the definition data D3 as a list when an arbitrary list box 181 is selected by the user.
  • FIG. 10 shows a state where the list box 181 of the relay point 1 of the marker 3 is selected.
  • In the list box 181, a check is displayed for the unset item, and the items set in the work information of the definition data D3 are displayed as a list below the unset item. Thereby, the user can appropriately set the work information by selecting the list box 181 of the desired relay point 129 and choosing the work information from the list.
  • The area 185 indicated by dot hatching represents an invalid area that does not accept operations.
  • two relay points 129 are set on the locus 125 corresponding to the marker 1.
  • two relay points 129 are set on the trajectory 126 corresponding to the marker 2.
  • one relay point 129 is set in the trajectory 127 corresponding to the marker 3. In other words, except for the relay point 129 described above, no other relay points 129 are set.
  • In the display of the work information registration box 171, the position information display unit 73 sets the fields corresponding to relay points other than the set relay points 129 described above as the area 185.
  • the position information display unit 73 performs a process of not accepting an operation even if an item in the area 185 is selected with the pointer 135 or the like. Thereby, it is possible to prevent erroneous input of work information to a coordinate position where no relay point 129 is set.
  • a sheet selection button 187 is displayed at the bottom of the work information registration box 171.
  • When the sheet selection button 187 is operated, the position information display unit 73 switches the display between the work information registration box 171 for inputting the work information of the relay points 129 and a work information registration box (not shown) for inputting the work information of the continuous relay points 131.
  • A work information registration box for inputting work information for the departure position 137 and the arrival position 138 may also be provided.
  • a selection button 191, a marker display field 193, a write button 195, a read button 197, and an end button 199 are displayed below the work information registration box 171.
  • Each time the selection button 191 is operated, the position information display unit 73 changes the display of the marker display field 193, either in ascending order of the markers 1, 2, 3, 4, 1, ... or in descending order of the markers 1, 4, 3, 2, 1, ....
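This cycling is simple modular arithmetic; a sketch, assuming four markers as in the embodiment:

```python
def next_marker(current: int, total: int = 4, ascending: bool = True) -> int:
    """Advance the marker display field 193 cyclically: 1,2,3,4,1,... or 1,4,3,2,1,..."""
    step = 1 if ascending else -1
    return (current - 1 + step) % total + 1
```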
  • When the write button 195 or the read button 197 is selected, the position information display unit 73 causes the control information generation unit 75, which is another processing module, to execute a process of writing or reading the control information D5.
  • When the write button 195 is selected, the control information generation unit 75 outputs the data of the control information display unit 143 shown in FIGS. 8 and 9 as the control information D5.
  • Depending on the setting of the marker display field 193, the control information generation unit 75 outputs either the control information D5 corresponding to the marker number set in the marker display field 193 or the control information D5 corresponding to all the markers.
  • As an output file format, for example, the CSV (Comma-Separated Values) format can be adopted. In this case, the output file contains, for example, the time, the marker number, and the other items shown in FIGS. 8 and 9, separated by commas.
  • When the read button 197 is selected, the control information generation unit 75 displays a file selection screen (not shown) and reads the previously output file (CSV-format control information D5) selected by the user on the file selection screen.
  • The control information generation unit 75 treats the read data as data corresponding to the marker number displayed in the marker display field 193. For example, when the marker 1 is set in the marker display field 193, the position information display unit 73 displays the read data in the marker 1 row of the work information registration box 171. Further, the position information display unit 73 displays the read data on the capture display unit 123.
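The round trip through the CSV file might look as follows. This is a sketch only: the exact column layout of the control information D5 is not spelled out here, so the field names (time, marker, coordinates, work content) are assumptions modeled on the items named for FIGS. 8 and 9.

```python
import csv

FIELDS = ["time", "marker", "x", "y", "z", "work"]  # assumed column layout

def write_control_info(path: str, rows: list) -> None:
    """Output control information D5 as comma-separated values (write button 195)."""
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(FIELDS)
        writer.writerows(rows)  # each row is an iterable matching FIELDS

def read_control_info(path: str) -> list:
    """Read back a previously output CSV file (read button 197)."""
    with open(path, newline="") as f:
        return list(csv.DictReader(f))  # one dict per row, keyed by the header
```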
  • Based on the data read by the control information generation unit 75, the position information display unit 73 displays a locus 125 on the capture display unit 123.
  • the position information display unit 73 ends the display of the display screen 121 when the end button 199 is selected.
  • As described above, the teaching device 10 can generate the control information D5 in which the relay points 129 and other positions through which the robot arms 101 and 103 are to move are associated with the work information.
  • the user can generate a control program for controlling the industrial robot 100 using the control information D5.
  • For example, the user can set, from the control information D5 (combinations of coordinate positions and work information, etc.), the parameters of movement commands (movement direction, etc.) for moving the industrial robot 100 and the parameters (execution position, timing, etc.) of control commands for opening the hand.
  • the user can easily generate a control program that causes the hand units 109 and 111 (see FIG. 3) to perform a desired operation using the output control information D5.
  • the camera 13 is an example of a detection unit.
  • the marker unit 43 is an example of a position marker unit.
  • the actuator 49 is an example of a drive unit.
  • the CPU 61 is an example of a processing unit.
  • the robot arms 101 and 103 are examples of robots.
  • the relay point 129, the continuous relay point 131, the departure position 137, and the arrival position 138 are examples of work positions.
  • the real time display button 157 is an example of a selection button.
  • the imaging data D1 is an example of detection data.
  • the camera 13 outputs imaging data D1 obtained by imaging the marker unit 43 to the control information generating device 17.
  • The position information generation unit 71 executed by the CPU 61 calculates the three-dimensional coordinate position of the marker unit 43 as the position information D2, based on the identification information of the camera 13 and the time information added to the imaging data D1 (an example of position information generation processing).
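The reconstruction algorithm itself is not disclosed here; one conventional way to obtain a 3D marker position from two synchronized, calibrated cameras is linear (DLT) triangulation, sketched below with NumPy. The projection matrices and matched pixel coordinates are assumed inputs.

```python
import numpy as np

def triangulate(P1, P2, uv1, uv2):
    """Estimate a marker's 3D position from two camera views (DLT sketch).

    P1, P2: 3x4 projection matrices of two calibrated cameras.
    uv1, uv2: (u, v) pixel coordinates of the same marker in each image,
    matched using the camera identification and time information.
    """
    u1, v1 = uv1
    u2, v2 = uv2
    # Each observation contributes two linear constraints on the homogeneous point X.
    A = np.stack([
        u1 * P1[2] - P1[0],
        v1 * P1[2] - P1[1],
        u2 * P2[2] - P2[0],
        v2 * P2[2] - P2[1],
    ])
    _, _, vt = np.linalg.svd(A)
    X = vt[-1]               # null-space solution of A X = 0
    return X[:3] / X[3]      # dehomogenize to (x, y, z)
```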
  • the position information display unit 73 displays the position information D2 on the display unit 69 (an example of position display processing). Thereby, the user can confirm the position of the marker unit 43 that has been motion-captured by looking at the display unit 69.
  • The position information display unit 73 sets a position selected by the user as a relay point 129 or a continuous relay point 131 at which the robot arms 101 and 103 perform work (a work position) (an example of work position selection processing).
  • The position information display unit 73 sets the information of the work to be performed at each relay point 129 or the like when the user selects a desired work from the list box 181 of the work information registration box 171 (an example of work information setting processing).
  • the position information display unit 73 updates data related to the control information D5 (see FIGS. 8 and 9) in accordance with the user's setting operation.
  • The control information generation unit 75 outputs the data of the control information display unit 143 shown in FIGS. 8 and 9 as the control information D5 (an example of control information generation processing). Therefore, by operating the input unit 67 in accordance with the display on the display unit 69, the user can easily register, while visually checking them, both the positions through which the hand units 109 and 111 of the robot arms 101 and 103 should pass and the work content to be executed at those positions.
  • When a list box 181 is selected, the position information display unit 73 displays a list of the work information (see FIG. 7) set in the definition data D3 (an example of first display processing).
  • the user can check the list in the list box 181 and associate the desired work content with the relay point 129 or the like easily and reliably.
  • the number in the recorded marker number column 167 is set by the user as the number of marker portions 43 that are actually moved in the capture region R1 (an example of input processing).
  • the position information display unit 73 displays the number of marker units 43 captured in the capture region R1 by the position information generation unit 71 in the effective marker number column 161.
  • When the two numbers differ, the position information display unit 73 turns on the difference real-time display 163 (an example of determination processing and second display processing). Thereby, by checking the lighting state of the difference real-time display 163, the user can confirm whether all of the marker portions 43 undergoing motion capture have been captured by the teaching device 10.
  • When the read button 197 is selected, the control information generation unit 75 displays a file selection screen (not shown) and reads the previously output control information D5 selected by the user from the file selection screen.
  • The position information display unit 73 displays the read control information D5 on the capture display unit 123 together with the position information D2 during or after capture (see the markers 1 to 3 in FIG. 6). Thereby, on the display unit 69, the user can check how an industrial robot 100 for which motion capture was performed previously will cooperate with another industrial robot 100 currently undergoing motion capture. Furthermore, the user can also obtain new control information D5 by comparing previously acquired control information D5 with control information D5 acquired later, or by modifying and combining them.
  • The position information display unit 73 displays a real-time display button 157 on the display screen 121 (an example of third display processing). When the real-time display button 157 is selected, the position information display unit 73 displays “real-time display” on the mode display unit 151, enters the real-time display mode, and displays the position information D2 of the marker units 43 being captured in the capture region R1 on the capture display unit 123 as a real-time moving image. In this case, the CPU 61 executes two processes in parallel: the generation of the position information D2 by the position information generation unit 71 and the display of the locus 125 by the position information display unit 73. Thereby, the user can confirm the position of the captured marker unit 43 in real time while performing motion capture.
  • the jig 15 drives the actuator 49 when the drive switch 51 is operated by the user.
  • the end effector 45 opens and closes according to the driving of the actuator 49.
  • the control information generation device 17 captures the end effector marker section 46 of the operating end effector 45 in motion capture.
  • The position information generation unit 71 sets, as a relay position, a position at which only the end effector marker unit 46 moves while the marker unit 43 remains stopped within a certain range. Accordingly, by confirming on the display unit 69 the position at which the end effector marker unit 46 operates as the relay position, the user can easily determine the work position.
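A hedged sketch of that rule: scan the captured time series for frames where the jig's marker unit 43 stays within a small radius while the end effector marker unit 46 is displaced. The window length and thresholds are assumed tuning parameters; in practice, consecutive hits would be merged into a single relay position.

```python
import numpy as np

def find_relay_positions(jig_pos, ee_pos, stop_radius=2.0, move_thresh=5.0, window=10):
    """jig_pos, ee_pos: (N, 3) arrays of positions sampled at a fixed rate."""
    hits = []
    for t in range(window, len(jig_pos)):
        jig_win = jig_pos[t - window:t]
        ee_win = ee_pos[t - window:t]
        # marker unit 43 stopped within a certain range
        jig_still = np.linalg.norm(jig_win - jig_win.mean(axis=0), axis=1).max() < stop_radius
        # only the end effector marker unit 46 is moving
        ee_moving = np.linalg.norm(ee_win[-1] - ee_win[0]) > move_thresh
        if jig_still and ee_moving:
            hits.append(jig_win.mean(axis=0))  # candidate relay position
    return hits
```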
  • the jig 15A provided with the marker portion 43A and the jig 15B provided with the marker portion 43B are used as the detection target.
  • Thereby, it becomes possible to prevent a plurality of the users' arms from being inserted into the capture region R1 in which the motion capture is performed and from interfering with each other.
  • The position information generation unit 71 calculates the relative coordinate positions of the marker units 43A and 43B with reference to the barycentric position of the three reference marker units 91.
  • By matching the barycentric position of the reference marker units 91 with a reference in the actual work area, for example, the center position of the substrate B, the industrial robot 100 can be accurately controlled.
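Expressing captured positions relative to the barycenter of the reference markers is a simple translation; a sketch, assuming only the translation described above (any rotational alignment of the work area would be an additional step):

```python
import numpy as np

def to_reference_coords(marker_pos, ref_markers):
    """marker_pos: (3,) captured position; ref_markers: (3, 3) positions of
    the three reference marker units 91. Returns coordinates relative to
    their barycenter."""
    origin = ref_markers.mean(axis=0)  # barycentric position of the 3 markers
    return marker_pos - origin
```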
  • The teaching device 10 may generate the control information D5 by executing motion capture for only one jig 15, or may execute motion capture for three or more jigs 15. In the latter case, the teaching device 10 may generate control information D5 corresponding to each of the plurality of marker units 43, or may generate control information D5 in which the data corresponding to all the marker units 43 are combined into one.
  • the teaching device 10 may include, for example, illumination devices 33 and 34 that emit light of three or more different wavelengths in order to distinguish the marker portions 43 of the three or more jigs 15 from each other.
  • the teaching device 10 may read and combine a plurality of control information D5 acquired by operating one jig 15 a plurality of times.
  • The setting of the work information is not limited to the list box 181; it may instead be configured by selection with buttons or by direct input of characters or numbers.
  • the position information display unit 73 may not perform display related to the number of marker units 43 (effective marker number column 161, difference real-time display 163, difference detection hold display 165, etc.). Further, the position information display unit 73 may be configured not to perform the reading process of the control information D5 using the reading button 197.
  • the position information display unit 73 may be configured without the real time display mode or the real time display button 157.
  • In the above embodiment, the marker unit 43 is attached to the jig 15; however, the user may instead hold the marker unit 43 by hand and move it within the capture region R1.
  • the jig 15 may be configured not to include the end effector 45, the actuator 49, the drive switch 51, and the like.
  • The position information D2 may include, in addition to the position and angle information of the marker unit 43, the moving speed and acceleration of the marker unit 43. Further, the position information generation unit 71 may correct the generated position information D2 to compensate for shake caused by the user's manual operation.
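A sketch of both extensions, assuming positions sampled at a fixed interval: a moving-average filter to suppress hand shake, then finite differences for speed and acceleration. The window size is an assumed tuning parameter.

```python
import numpy as np

def enrich_position_info(positions, dt, window=5):
    """positions: (N, 3) marker coordinates sampled every dt seconds."""
    # moving-average smoothing to correct shake from manual operation
    kernel = np.ones(window) / window
    smoothed = np.column_stack(
        [np.convolve(positions[:, i], kernel, mode="same") for i in range(3)]
    )
    velocity = np.gradient(smoothed, dt, axis=0)      # first time derivative
    acceleration = np.gradient(velocity, dt, axis=0)  # second time derivative
    return smoothed, velocity, acceleration
```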
  • The above embodiment described an example in which the robot arms 101 and 103 are applied as the robot in the present application; however, the robot is not limited to this.
  • the robot in the present application may be a robot that performs operations such as suction of electronic components, laser beam irradiation, and screw tightening.
  • The robot is not limited to one having a serial link mechanism, and may be a robot that operates orthogonally in the XY-axis directions or a robot that has a parallel link mechanism.
  • As a detection method for the marker units 43A and 43B, image processing for detecting the shapes of the marker units 43A and 43B may be executed on the imaging data D1, and the positions and the like may be detected from the execution result.
  • a magnetic system that detects the operation of the magnetic sensor may be used.
  • a magnetic sensor that transmits position data may be attached to the jig 15, and a receiving device that receives position data may be attached instead of the camera 13.
  • the magnetic sensor transmits position data to which identification information is added while moving.
  • the receiving device can calculate the position information of the magnetic sensor and the like from the received position data.
  • the magnetic sensor corresponds to a position marker portion indicating the position of the robot in the present application.
  • the receiving device corresponds to the detection unit.

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Automation & Control Theory (AREA)
  • Numerical Control (AREA)
  • Manipulator (AREA)

Abstract

The present invention relates to a teaching device and a control information generation method with which a robot can be taught movement actions as well as details of work at a work position. The teaching device displays the traces of motion-captured markers on a capture display unit (123) of a display screen (121). A user can designate relay points at desired positions on a trace on the capture display unit (123) by pointing with a pointer (135) and clicking. When the user selects an intended work item from a list box of a work information registration box (171), the teaching device associates the selected work information with the corresponding relay point. When the write button (195) of the display screen (121) is selected, the teaching device outputs the associated data as control information.
PCT/JP2015/082405 2015-11-18 2015-11-18 Dispositif d'enseignement et procédé de génération d'informations de commande WO2017085811A1 (fr)

Priority Applications (2)

Application Number Priority Date Filing Date Title
PCT/JP2015/082405 WO2017085811A1 (fr) 2015-11-18 2015-11-18 Dispositif d'enseignement et procédé de génération d'informations de commande
JP2017551442A JP6660962B2 (ja) 2015-11-18 2015-11-18 ティーチング装置及び制御情報の生成方法

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2015/082405 WO2017085811A1 (fr) 2015-11-18 2015-11-18 Dispositif d'enseignement et procédé de génération d'informations de commande

Publications (1)

Publication Number Publication Date
WO2017085811A1 true WO2017085811A1 (fr) 2017-05-26

Family

ID=58718672

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2015/082405 WO2017085811A1 (fr) 2015-11-18 2015-11-18 Dispositif d'enseignement et procédé de génération d'informations de commande

Country Status (2)

Country Link
JP (1) JP6660962B2 (fr)
WO (1) WO2017085811A1 (fr)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110561431A (zh) * 2019-08-30 2019-12-13 哈尔滨工业大学(深圳) 用于离线示例学习的机器人装配演示轨迹提取方法及装置
WO2020218352A1 (fr) * 2019-04-23 2020-10-29 Arithmer株式会社 Dispositif de génération de données de désignation, procédé de génération de données de désignation et programme de génération de données de désignation
WO2021090868A1 (fr) * 2019-11-07 2021-05-14 川崎重工業株式会社 Système de chirurgie et procédé de commande

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2010089218A (ja) * 2008-10-09 2010-04-22 Seiko Epson Corp 産業用ロボットの位置教示装置、動作プログラム作成装置、産業用ロボットの位置教示方法およびプログラム
JP2010117886A (ja) * 2008-11-13 2010-05-27 Kanto Auto Works Ltd 仮想試作システム及び仮想試作における動作情報の処理方法並びに該処理方法を記録した記録媒体
JP2011156641A (ja) * 2010-02-03 2011-08-18 Kanto Auto Works Ltd 作業補助システム及び作業補助方法並びに該作業補助方法を記録した記録媒体
JP2012076181A (ja) * 2010-10-01 2012-04-19 Yaskawa Electric Corp ロボット制御装置、ロボットおよびロボット制御装置の教示方法

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH05305590A (ja) * 1992-04-30 1993-11-19 Mitsubishi Heavy Ind Ltd ロボットティーチング装置
US5511147A (en) * 1994-01-12 1996-04-23 Uti Corporation Graphical interface for robot
US6681151B1 (en) * 2000-12-15 2004-01-20 Cognex Technology And Investment Corporation System and method for servoing robots based upon workpieces with fiducial marks using machine vision
SE531104C2 (sv) * 2002-12-30 2008-12-16 Abb Research Ltd Metod och system för programmering av en industrirobot
JP5573275B2 (ja) * 2010-03-25 2014-08-20 富士ゼロックス株式会社 特徴点抽出装置及びこれを用いた動作教示装置、動作処理装置
JP2011200997A (ja) * 2010-03-26 2011-10-13 Kanto Auto Works Ltd ロボットのティーチング装置及びティーチング方法
JP2015058488A (ja) * 2013-09-17 2015-03-30 セイコーエプソン株式会社 ロボット制御システム、ロボット、ロボット制御方法及びプログラム
JP6424432B2 (ja) * 2014-01-23 2018-11-21 セイコーエプソン株式会社 制御装置、ロボットシステム、ロボット及びロボット制御方法

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2010089218A (ja) * 2008-10-09 2010-04-22 Seiko Epson Corp 産業用ロボットの位置教示装置、動作プログラム作成装置、産業用ロボットの位置教示方法およびプログラム
JP2010117886A (ja) * 2008-11-13 2010-05-27 Kanto Auto Works Ltd 仮想試作システム及び仮想試作における動作情報の処理方法並びに該処理方法を記録した記録媒体
JP2011156641A (ja) * 2010-02-03 2011-08-18 Kanto Auto Works Ltd 作業補助システム及び作業補助方法並びに該作業補助方法を記録した記録媒体
JP2012076181A (ja) * 2010-10-01 2012-04-19 Yaskawa Electric Corp ロボット制御装置、ロボットおよびロボット制御装置の教示方法

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2020218352A1 (fr) * 2019-04-23 2020-10-29 Arithmer株式会社 Dispositif de génération de données de désignation, procédé de génération de données de désignation et programme de génération de données de désignation
US11420321B2 (en) 2019-04-23 2022-08-23 Arithmer Inc. Specifying data generating apparatus, specifying data generating method, and computer-readable medium having recorded thereon specifying data generating program
CN110561431A (zh) * 2019-08-30 2019-12-13 哈尔滨工业大学(深圳) 用于离线示例学习的机器人装配演示轨迹提取方法及装置
CN110561431B (zh) * 2019-08-30 2021-08-31 哈尔滨工业大学(深圳) 用于离线示例学习的机器人装配演示轨迹提取方法及装置
WO2021090868A1 (fr) * 2019-11-07 2021-05-14 川崎重工業株式会社 Système de chirurgie et procédé de commande
JP2021074242A (ja) * 2019-11-07 2021-05-20 川崎重工業株式会社 手術システム及び制御方法
CN114585321A (zh) * 2019-11-07 2022-06-03 川崎重工业株式会社 手术系统以及控制方法
CN114585321B (zh) * 2019-11-07 2023-12-12 川崎重工业株式会社 手术系统以及控制方法
JP7432340B2 (ja) 2019-11-07 2024-02-16 川崎重工業株式会社 手術システム及び制御方法

Also Published As

Publication number Publication date
JPWO2017085811A1 (ja) 2018-09-06
JP6660962B2 (ja) 2020-03-11

Similar Documents

Publication Publication Date Title
JP6499273B2 (ja) ティーチング装置及び制御情報の生成方法
JP7490349B2 (ja) 入力装置、入力装置の制御方法、ロボットシステム、ロボットシステムを用いた物品の製造方法、制御プログラム及び記録媒体
JP6458713B2 (ja) シミュレーション装置、シミュレーション方法、およびシミュレーションプログラム
JP6880982B2 (ja) 制御装置およびロボットシステム
EP1435280B1 (fr) Procédé et système de programmation d'un robot industriel
JP2020055075A (ja) 拡張現実と複合現実を用いたロボット制御装置及び表示装置
JP6361213B2 (ja) ロボット制御装置、ロボット、ロボットシステム、教示方法、及びプログラム
US20050149231A1 (en) Method and a system for programming an industrial robot
WO2003099526A1 (fr) Procede et systeme de programmation d'un robot industriel
JP2017094407A (ja) シミュレーション装置、シミュレーション方法、およびシミュレーションプログラム
TW201703948A (zh) 機器人及其控制方法
JP2002172575A (ja) 教示装置
WO2017085811A1 (fr) Dispositif d'enseignement et procédé de génération d'informations de commande
TWI659279B (zh) 基於擴充實境的加工規劃設備
JP6499272B2 (ja) ティーチング装置及び制御情報の生成方法
CN115338855A (zh) 双臂机器人组装系统
JP7035555B2 (ja) 教示装置、及びシステム
JP2022163836A (ja) ロボット画像の表示方法、コンピュータープログラム、及び、ロボット画像の表示システム
JP6343930B2 (ja) ロボットシステム、ロボット制御装置、及びロボット制御方法
JP2015058488A (ja) ロボット制御システム、ロボット、ロボット制御方法及びプログラム
WO2023203747A1 (fr) Procédé et dispositif d'apprentissage pour robot
JP7068416B2 (ja) 拡張現実と複合現実を用いたロボット制御装置、ロボットの位置姿勢規定用コンピュータプログラム及びロボットの位置姿勢規定方法、相対位置姿勢取得用コンピュータプログラム及び相対位置姿勢取得方法
JP7383350B2 (ja) 画像測定装置のティーチングプログラム
WO2023171687A1 (fr) Dispositif de commande de robot et procédé de commande de robot
JP2519444B2 (ja) 工作線追従装置

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 15908751

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2017551442

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 15908751

Country of ref document: EP

Kind code of ref document: A1