WO2019235314A1 - Control system, control method for control system, and control system program - Google Patents


Info

Publication number
WO2019235314A1
Authority
WO
WIPO (PCT)
Prior art keywords
position information
coordinate system
controller
time
world coordinate
Prior art date
Application number
PCT/JP2019/021220
Other languages
English (en)
Japanese (ja)
Inventor
哲司 若年
征彦 仲野
純児 島村
Original Assignee
OMRON Corporation (オムロン株式会社)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by OMRON Corporation (オムロン株式会社)
Publication of WO2019235314A1 publication Critical patent/WO2019235314A1/fr

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/10Programme-controlled manipulators characterised by positioning means for manipulator elements
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05BCONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B19/00Programme-control systems
    • G05B19/02Programme-control systems electric
    • G05B19/04Programme control other than numerical control, i.e. in sequence controllers or logic controllers
    • G05B19/05Programmable logic controllers, e.g. simulating logic interconnections of signals according to ladder diagrams or function charts
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05BCONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B19/00Programme-control systems
    • G05B19/02Programme-control systems electric
    • G05B19/18Numerical control [NC], i.e. automatically operating machines, in particular machine tools, e.g. in a manufacturing environment, so as to execute positioning, movement or co-ordinated operations by means of programme data in numerical form
    • G05B19/414Structure of the control system, e.g. common controller or multiprocessor systems, interface to servo, programmable interface controller

Definitions

  • the present invention relates to a control system for controlling a plurality of control applications including a robot by a control device, a control method for the control system, and a program for the control system.
  • In a prior-art configuration, the position of a set point is obtained in the robot coordinate system, with the robot coordinate system used as the world coordinate system, and the position of the same set point is also measured in the machine tool coordinate system; it is disclosed that the relative relationship between the robot coordinate system and the machine tool coordinate system is derived from these measurements.
  • the image processing device, the robot, and the machine tool update the position information at each cycle.
  • For example, the position in the camera coordinate system of the image processing apparatus is updated every several hundred ms, while the position in the robot coordinate system is updated every several ms. Therefore, when the position of a workpiece flowing on the conveyor is measured by the camera, the workpiece has already advanced to a position different from that at the time of measurement by the time the robot controller receives the position information.
  • Each of the image processing device, the robot, the machine tool, and the controller has its own clock, and these clocks continually drift relative to one another. As described above, each update cycle is different, and the clocks that generate those cycles also drift. It is therefore difficult to eliminate the positional deviation described above.
  • Accordingly, an object of the present invention is to provide a control system, a control method for the control system, and a program for the control system that eliminate positional deviation and perform highly accurate position control when a controller controls devices that each have their own clock and update position information at different periods.
  • The control system of this disclosure is a control system comprising a controller and a processing device that is connected to the controller and performs a predetermined process related to a workpiece.
  • The processing device includes: a timekeeping unit that keeps time;
  • a position information update unit that updates, at a predetermined period, the position information output from the processing device in the coordinate system of the processing device; and
  • a transmission unit that transmits the position information updated by the position information update unit, together with the time information at which the update was performed, to the controller.
  • The controller includes: a receiving unit that receives the position information and the time information from the processing device;
  • a coordinate conversion unit that converts the position information into world coordinate system position information, using a coordinate conversion formula given in advance that represents the relative relationship between the world coordinate system and the coordinate system of the processing device that output the position information;
  • an estimation unit that estimates position information between items of world coordinate system position information, based on the world coordinate system position information and the time information; and
  • a world coordinate system management unit that manages the world coordinate system position information and the position information estimated by the estimation unit on a time axis.
  • the processing device updates the position information of the workpiece or the like at a predetermined cycle by the position information update unit, and transmits the updated position information and the updated time information to the controller by the transmission unit.
  • the updated time information is based on the time measured by the time measuring unit.
  • the controller receives position information and time information from the processing device by the receiving unit, and converts the position information to world coordinate system position information of the world coordinate system by the coordinate conversion unit.
  • the conversion by the coordinate conversion unit is performed using a coordinate conversion formula given in advance representing the relative relationship between the coordinate system in the processing apparatus that outputs the position information and the world coordinate system.
  • the estimation unit estimates position information between the world coordinate system position information based on the world coordinate system position information and the time information.
  • the world coordinate system management unit manages the world coordinate system position information and the position information estimated by the estimation unit on the time axis.
  • According to this configuration, the position information output from the processing device is transmitted to the controller together with the time information at which the position information was updated. Therefore, by converting the position information in the coordinate system of the processing device into world coordinate system position information in the controller, the position information of each processing device can be managed on a time axis in the unified world coordinate system. Moreover, even when the update period of the position information in the processing device is long, or when the updates are not punctual, position information between items of world coordinate system position information is estimated based on the time information. Therefore, the position information of the processing devices and the estimated position information can be managed on the time axis in the unified world coordinate system, positional deviation between the processing devices is eliminated, and highly accurate position control can be performed.
  • the timekeeping unit has a time correction function.
  • According to this configuration, the timekeeping unit in each processing device has a time correction function, so that delays due to communication are corrected and position information with the time information described above is exchanged on a common time axis between the controller and each processing device.
  • the control system may include a display unit that displays the world coordinate system position information.
  • the position information in the coordinate system of the processing device is converted into the world coordinate system position information and displayed on the display unit. Therefore, the user can easily set the relative relationship between the controller and the processing device.
  • A control method of the control system of this disclosure is a control method in a control system comprising a controller and a processing device that is connected to the controller and performs a predetermined process related to a workpiece. The method includes, in the processing device: a step of measuring time; a step of updating, at a predetermined period, the position information output from the processing device in the coordinate system of the processing device; and a step of transmitting the updated position information and the time information of the update to the controller. The method further includes, in the controller: a step of receiving the position information and the time information from the processing device; a step of converting the position information into world coordinate system position information, using a coordinate conversion formula given in advance that represents the relative relationship between the world coordinate system and the coordinate system of the processing device that output the position information; a step of estimating position information between items of world coordinate system position information, based on the world coordinate system position information and the time information; and a step of managing the world coordinate system position information and the estimated position information on a time axis.
  • According to this method, even when the coordinate systems in the processing devices differ and the position information is updated at different periods, the position information output from each processing device is transmitted to the controller together with the time information at which it was updated. Therefore, by converting the position information in the coordinate system of each processing device into world coordinate system position information in the controller, the position information of each processing device can be managed on a time axis in the unified world coordinate system. Moreover, even when the update period of the position information in a processing device is long, or when the updates are not punctual, position information between items of world coordinate system position information is estimated based on the time information. Therefore, the position information of the processing devices and the estimated position information can be managed on the time axis in the unified world coordinate system, positional deviation between the processing devices is eliminated, and highly accurate position control can be performed.
  • the program of the control system of this disclosure is a program for causing a computer to execute the control method of the control system.
  • the control method of the control system can be implemented by causing a computer to execute the disclosed program.
  • FIG. 1 is a diagram showing the schematic configuration of the control system in the first embodiment. FIG. 2 is a functional block diagram of the control system. FIG. 3 is a diagram showing an example of the timing of communication between the first and second control applications and the controller. FIG. 4 is a flowchart showing the processing procedure of the control system. FIG. 5 is a diagram showing an example of a GUI in the integrated development environment. FIGS. 6 and 7 are diagrams for explaining the relationship between a reference coordinate system and an object coordinate system.
  • FIG. 1 is a diagram illustrating a schematic configuration of a control system 1 according to the first embodiment.
  • As shown in FIG. 1, the control system 1 includes a controller 100, an integrated development environment 200, an image processing device 510, a robot 520, a machine tool 530, a first conveyor 546, and a second conveyor 544.
  • the controller 100 is connected to the host computer 300 and the touch panel 400 via the host network 3.
  • the controller 100 is connected to the image processing apparatus 510, the robot controller 522, the machine tool 530, and the servo amplifier 540 via the industrial network 2. Further, the controller 100 is connected to the integrated development environment 200.
  • The industrial network 2 is, for example, EtherCAT (registered trademark).
  • The controller 100 is, for example, a PLC (Programmable Logic Controller). It executes a robot control program for controlling the operation of the arm type robot 520 and a sequence control program for controlling the operation of the end effector attached to the robot 520 and the operation of the machine tool 530, and outputs control signals.
  • the integrated development environment 200 is a computer such as a personal computer, for example, and is connected to the controller 100 so as to be communicable.
  • The integrated development environment 200 has a function of downloading the robot control program and the sequence control program executed by the controller 100 to the controller 100, a function of debugging these programs, a function of simulating these programs, and the like.
  • the robot 520 is a 6-axis vertical articulated robot, for example, and is connected to the robot controller 522 so as to be communicable.
  • the robot 520 includes a power source such as a servomotor, and drives the servomotor via the robot controller 522 and operates each joint axis by a control signal output from the controller 100 based on the robot control program.
  • An end effector is attached to the tip of the robot 520, and the end effector drives a servo motor in the end effector via the robot controller 522 in response to a control signal output from the controller 100 based on the sequence control program.
  • the end effector is attached to the tip of the robot 520, and includes, for example, a mechanism for gripping a component.
  • the machine tool 530 processes the workpiece W on the mounting part such as a table or a tool post with a tool.
  • the processed workpiece W is gripped by an end effector attached to the robot 520 and placed on the first conveyor 546.
  • the second conveyor 544 includes a servo motor 542, and the servo motor 542 is connected to the servo amplifier 540.
  • the servo amplifier 540 includes a counter and an encoder, and the counter and the encoder are electrically connected.
  • the encoder is electrically connected to a servo motor 542 for driving the second conveyor 544.
  • the counter measures the amount of movement of the second conveyor 544 based on the pulse wave generated from the encoder. More specifically, the encoder generates a pulse signal according to the amount of movement of the second conveyor 544. The counter receives the pulse signal from the encoder and counts the number of pulses included in the pulse signal, thereby measuring the movement amount of the second conveyor 544. The counter transmits the count value of the pulse wave to the controller 100 via the servo amplifier 540 at regular communication cycles.
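The count-to-movement conversion performed by the counter can be sketched as follows. This is a minimal sketch: the resolution constant and function name are hypothetical, since the patent does not specify the encoder resolution of the servo motor 542.

```python
# Sketch of the encoder-count-to-movement conversion described above.
# PULSES_PER_MM is a hypothetical resolution; the real value depends on the
# encoder attached to the servo motor 542 and the conveyor mechanics.
PULSES_PER_MM = 100

def movement_amount_mm(count_start: int, count_end: int) -> float:
    """Movement amount of the second conveyor between two counter readings."""
    return (count_end - count_start) / PULSES_PER_MM
```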
  • a camera 512 is connected to the image processing apparatus 510.
  • the camera 512 images the workpiece W moving on the second conveyor 544.
  • the image processing apparatus 510 transmits the captured image to the controller 100 at regular communication cycles.
  • the workpiece W is a product or a semi-finished product, and may be an electronic component such as a connector, for example.
  • FIG. 2 is a functional block diagram of the control system 1 in the present embodiment.
  • the controller 100 includes an upper network interface 101, a logic processing unit 102, a world coordinate system management unit 104, a position data estimation processing unit 105, a field network interface 106, and a clock 108.
  • the logic processing unit 102 includes a motion processing unit 103.
  • the controller 100 is connected to the host computer 300, the touch panel 400, and the integrated development environment 200 via the host network interface 101.
  • controller 100 is connected to the first control application engine 700, the second control application engine 710, the image processing device 510, and the servo amplifier 540 as processing devices via the field network interface 106.
  • first control application engine 700 and the second control application engine 710 are connected to a control application 600 such as a robot 520 and a machine tool 530.
  • The first control application engine 700 controls the end effector attached to the robot 520 and the machine tool 530. For the sake of simplicity, only one first control application engine 700 is shown in FIG. 2.
  • the second control application engine 710 controls the robot 520.
  • the robot controller 522 shown in FIG. 1 has functions of a first control application engine 700 and a second control application engine 710.
  • the controller 100, the first control application engine 700, the second control application engine 710, the image processing device 510, and the servo amplifier 540 are provided with clocks 108, 701, 711, 511, and 541 as timekeeping units, respectively.
  • an actuator 600 is connected to the first control application engine 700 and the second control application engine 710 via an I / O interface.
  • a camera 512 is connected to the image processing apparatus 510 via an I / O interface.
  • the world coordinate system management unit 104 of the controller 100 manages the world coordinate system.
  • the image processing apparatus 510, the first control application engine 700, the second control application engine 710, the machine tool 530, and the servo amplifier 540 manage their local coordinate systems.
  • The first control application engine 700, the second control application engine 710, the image processing device 510, and the servo amplifier 540 each have a function as a position information update unit, and update the position in their respective local coordinate systems every cycle.
  • Here, the position means the position of the end effector, and the position of the workpiece W or of the tool on the mounting portion, such as a table or tool post, of the machine tool 530.
  • For the robot 520, coordinate values represented by its six axes are used.
  • For the image processing apparatus 510, the position is obtained from the image captured by the camera 512.
  • For the servo amplifier 540, the position refers to the encoder value.
  • FIG. 3 is a diagram illustrating an example of the timing of communication between the first control application engine 700 and the second control application engine 710 and the controller 100. As shown in FIG. 3, the first control application engine 700 and the second control application engine 710 transmit and receive data at a period different from the control period of the controller 100.
  • When the first control application engine 700, the second control application engine 710, the image processing device 510, or the servo amplifier 540 updates its position information, the time at which the update was performed is attached to the position information.
  • The first control application engine 700, the second control application engine 710, the image processing device 510, and the servo amplifier 540 transmit the position information and the time information to the controller 100 via the industrial network 2. That is, they have a function as a transmission unit that transmits the updated position information and time information to the controller 100.
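The timestamped update each transmission unit sends might look like the following sketch. The patent specifies only the content (position information plus update time), so the field names, types, and example values here are assumptions, not the actual wire format.

```python
from dataclasses import dataclass

# Hypothetical shape of the timestamped update each processing device sends
# to the controller 100 over the industrial network 2.
@dataclass
class PositionUpdate:
    device_id: str       # e.g. "image_processing_510" (assumed identifier)
    position: tuple      # coordinates in the device's local coordinate system
    timestamp: float     # update time from the device's own timekeeping unit

msg = PositionUpdate("servo_amplifier_540", (1234,), 0.125)
```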
  • the world coordinate system management unit 104 of the controller 100 receives position information and time information via the field network interface 106. Therefore, the world coordinate system management unit 104 and the field network interface 106 have a function as a reception unit.
  • The world coordinate system management unit 104, serving as the coordinate conversion unit of the controller 100, converts the received position information into the world coordinate system using coordinate conversion formulas representing the relative relationship between the world coordinate system and each local coordinate system.
  • the position data estimation processing unit 105 as an estimation unit of the controller 100 performs position data estimation processing until new data is obtained from the first control application engine 700 or the like.
  • the world coordinate system management unit 104 of the controller 100 manages position data obtained from the first control application engine 700 or the like or estimated position data in the world coordinate system as position data at a predetermined time.
  • the user program operating on the controller 100 can perform position control for each control application using the position of the world coordinate system expression, and can manage the command position of the world coordinate system expression.
  • the integrated development environment 200 can display position information in the world coordinate system managed by the controller 100.
  • the user can input position information in the world coordinate system from the integrated development environment 200. Therefore, the integrated development environment 200 has a function as a display unit of the controller 100.
  • FIG. 4 is a flowchart showing a processing procedure of the control system 1.
  • the user executes CAE software such as 3D CAD / CAM on the host computer 300 to set the positional relationship between the world coordinate system, the control application, and the camera (S10).
  • the user captures the CAE data from the host computer 300 into the integrated development environment 200 (S20).
  • As a method for importing the CAE data into the integrated development environment 200, for example, the CAE data may be converted into a DXF file and the converted file imported.
  • The user sets the relative relationship between the world coordinate system and each local coordinate system of each control application 600 on the setting screen of the integrated development environment 200 (S30).
  • FIG. 5 is a diagram illustrating an example of a GUI (Graphical User Interface) 210 in the integrated development environment 200. As shown in FIG. 5, the user inputs the relative relationship between the world coordinate system and each local coordinate system of each control application in this GUI 210. A measurement tool is provided on the GUI 210, and data of the measurement tool is displayed on the GUI 210 as a reference value.
  • After setting the coordinate systems as described above, the user creates a user program in the integrated development environment 200 (S40). After creating the user program, the user downloads it from the integrated development environment 200 to the controller 100.
  • the controller 100 starts controlling the operation of the control application 600, the image processing in the image processing apparatus 510, and the operation of the servo amplifier 540 according to the user program (S60).
  • the clocks 701, 711, 511, and 541 of the first control application engine 700, the second control application engine 710, the image processing device 510, and the servo amplifier 540 are corrected (S70).
  • the first control application engine 700, the second control application engine 710, the image processing device 510, and the servo amplifier 540 transmit position data at a predetermined time to the controller 100 in each update cycle (S80).
  • The world coordinate system management unit 104 of the controller 100 converts the position data updated at each time into a world coordinate system representation, as position data at that time (S90).
  • the position data estimation processing unit 105 of the controller 100 estimates position data based on the encoder value obtained from the servo amplifier 540 when new position data cannot be obtained from the image processing device 510 (S100).
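A minimal sketch of this estimation step (S100): advance the last camera-measured position by the encoder delta times the per-count displacement obtained in calibration (corresponding to dX and dY below). All names are hypothetical.

```python
def estimate_position(last_camera_pos, last_count, current_count,
                      dx_per_count, dy_per_count):
    """Estimate the workpiece position when no new camera data is available:
    advance the last measured position by the encoder count delta times the
    per-count displacement of the conveyor."""
    delta = current_count - last_count
    x, y = last_camera_pos
    return (x + dx_per_count * delta, y + dy_per_count * delta)
```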
  • the world coordinate system management unit 104 of the controller 100 passes the position of the world coordinate system expression to the user program or the integrated development environment 200 (S110).
  • the user program passes the command position of the world coordinate system expression to the controller 100 (S120).
  • the controller 100 converts the command position of the world coordinate system expression back to the local coordinate system of each control application 600 and passes it to the first control application engine 700 and the second control application engine 710 (S130).
  • steps S70 to S130 are repeated until the end of the control.
  • FIGS. 6 and 7 are diagrams for explaining the relationship between the reference coordinate system and the object coordinate system.
  • The position and posture of an object are expressed by the origin position and the direction of each coordinate axis of a Cartesian coordinate system fixed to the object, as viewed from another Cartesian coordinate system.
  • The latter is called a reference coordinate system, and the coordinate system fixed to the object is called an object coordinate system.
  • Let the reference coordinate system be Σ_A, with origin O_A and three orthogonal axes X_A, Y_A, Z_A, and let the object coordinate system be Σ_B, with origin O_B and three orthogonal axes X_B, Y_B, Z_B.
  • The vector from the origin O_A toward the origin O_B (the position vector of O_B), expressed in the reference coordinate system Σ_A, is written ^A P_B, and the unit vectors pointing in the directions of the three axes X_B, Y_B, Z_B, expressed in Σ_A, are written ^A X_B, ^A Y_B, ^A Z_B.
  • The position of the object viewed in the reference coordinate system Σ_A can then be represented by ^A P_B, and its posture by (^A X_B, ^A Y_B, ^A Z_B). The superscript A at the upper left indicates that the vector is expressed in the reference coordinate system Σ_A.
  • The posture of the object, represented by the three vectors (^A X_B, ^A Y_B, ^A Z_B), is expressed by the rotation matrix ^A R_B = [^A X_B, ^A Y_B, ^A Z_B].
  • Suppose the reference coordinate system Σ_A and the object coordinate system Σ_B are given, with the position of Σ_B relative to Σ_A given by ^A P_B and its posture by the rotation matrix ^A R_B. A point ^B P expressed in Σ_B is then expressed in Σ_A by ^A P = ^A R_B · ^B P + ^A P_B.
  • This equation is called a homogeneous transformation equation, and the matrix ^A T_B = [[^A R_B, ^A P_B], [0 0 0, 1]], which expresses it on homogeneous coordinates, is called a homogeneous transformation matrix.
  • the world coordinate system management unit 104 of the controller 100 uses such a homogeneous conversion formula to convert each local coordinate system to the world coordinate system, and from the world coordinate system to each local coordinate system. Inverse conversion to
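The conversion and inverse conversion described above can be sketched as follows, using the homogeneous-transformation relationship ^A P = ^A R_B · ^B P + ^A P_B; the function names are hypothetical.

```python
import numpy as np

def to_reference(R_ab, p_ab, p_b):
    """Convert a point expressed in the object coordinate system Sigma_B
    into the reference coordinate system Sigma_A: p_a = R_ab @ p_b + p_ab."""
    return R_ab @ p_b + p_ab

def to_object(R_ab, p_ab, p_a):
    """Inverse conversion (reference to object); since R_ab is a rotation
    matrix, its inverse is its transpose."""
    return R_ab.T @ (p_a - p_ab)
```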
  • Using a local coordinate system A, a position with time information is expressed as follows: (X_A, Y_A, Z_A, α_A, β_A, γ_A, t_A)
  • Here, (α_n, β_n, γ_n) are the roll, pitch, and yaw angles, representing the rotation angles.
  • t_n represents the time information.
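The roll/pitch/yaw angles of this position expression can be turned into a rotation matrix as follows. The Z-Y-X (yaw-pitch-roll) convention used here is an assumption, since the document does not state which convention it uses.

```python
import numpy as np

def rpy_to_matrix(roll, pitch, yaw):
    """Rotation matrix from roll/pitch/yaw angles, composed as
    Rz(yaw) @ Ry(pitch) @ Rx(roll) (assumed Z-Y-X convention)."""
    cr, sr = np.cos(roll), np.sin(roll)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cy, sy = np.cos(yaw), np.sin(yaw)
    Rz = np.array([[cy, -sy, 0.], [sy, cy, 0.], [0., 0., 1.]])
    Ry = np.array([[cp, 0., sp], [0., 1., 0.], [-sp, 0., cp]])
    Rx = np.array([[1., 0., 0.], [0., cr, -sr], [0., sr, cr]])
    return Rz @ Ry @ Rx
```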
  • FIG. 8 is a diagram illustrating an example of the calibration sheet S on which the target pattern is drawn according to the present embodiment.
  • The target pattern shown on the calibration sheet S includes five circles (marks) whose interiors are divided at intervals of about 90°.
  • The calibration is basically performed using four of the marks; the one additionally arranged mark is used to align the calibration sheet S in a predetermined direction.
  • the user places the calibration sheet S on which the target pattern is drawn in the field of view of the camera 512. Then, the user gives an imaging instruction from the controller 100 to the image processing apparatus 510. Then, the image processing apparatus 510 transmits an image (an image including a target pattern as a subject) obtained by imaging to the controller 100.
  • the controller 100 performs a measurement process on the received image, and determines the coordinate value of each center point for the four marks arranged at the four corners included in the target pattern. As a result, the coordinate values [pixel] of the image coordinate system for the four marks included in the target pattern are acquired.
  • the four coordinate values acquired are (xi1, yi1), (xi2, yi2), (xi3, yi3), and (xi4, yi4).
  • the user moves the second conveyor 544 to place the calibration sheet S on which the target pattern is drawn within the tracking range (operating range) of the robot 520, and operates the robot 520, The positional relationship between the four marks included in the target pattern and the robot 520 is associated.
  • That is, the user moves the second conveyor 544 to place the calibration sheet S within the tracking range (operating range) of the robot 520. It is assumed that the count value of the encoder before the movement of the second conveyor 544 (at the start of calibration) has been acquired in advance. This count value is set to E1.
  • the user positions the hand tip of the robot 520 so as to correspond to one mark on the calibration sheet S by operating a teaching pendant attached to the robot controller 522 or the like.
  • In this positioned state, the position information of the robot 520 (the coordinate value in the robot coordinate system indicating the position of the hand tip of the robot 520) grasped by the robot controller 522 is transmitted from the image processing apparatus 510 to the controller 100.
  • the process of positioning the tip of the hand of the robot 520 and transmitting the position information of the robot 520 in the positioned state to the controller 100 is repeatedly executed for all four marks included in the target pattern.
  • the position information of the robot 520 corresponding to the four marks included in the target pattern is acquired.
  • the position information of the robot 520 corresponding to the four marks acquired is, for example, (X1, Y1), (X2, Y2), (X3, Y3), (X4, Y4).
  • This state is maintained until the position information of the robot 520 corresponding to all four marks has been transmitted from the image processing apparatus 510 to the controller 100.
  • the controller 100 also stores the count value when the second conveyor 544 is moved to the robot operating range (upstream). This count value is set to E2.
  • The user further moves the second conveyor 544 to place the calibration sheet S at the most downstream position in the tracking range (operating range) of the robot 520, and operates the robot 520 so that the positional relationship between one mark included in the target pattern and the robot 520 is associated.
  • the user moves the second conveyor 544 to place the calibration sheet S at the downstream end of the tracking range (operating range) of the robot 520.
  • The user operates the teaching pendant or the like to position the tip of the hand of the robot 520 so that it corresponds to the first mark on the calibration sheet S (the mark for which the coordinate values (X1, Y1) were acquired in the second stage).
  • the position information (the coordinate value in the robot coordinate system indicating the position of the hand tip of the robot 520) grasped by the robot controller 522 is sent to the controller 100.
  • the position information of the robot 520 corresponding to the first mark included in the target pattern is acquired.
  • the position information of the robot 520 corresponding to the acquired first mark is (X5, Y5).
  • the controller 100 stores the count value in the state where the second conveyor 544 has been moved to the downstream side of the operating range of the robot 520. This count value is set to E3.
  • dX = (X5 - X1) / (E3 - E2)
  • dY = (Y5 - Y1) / (E3 - E2)
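The movement amount per count computed above can be illustrated with a minimal Python sketch. Only the two formulas come from the text; the function name and the numeric values below are made-up assumptions.

```python
# Hypothetical sketch: robot-coordinate displacement of the conveyor per
# encoder count, from the two calibration poses (X1, Y1) and (X5, Y5) of
# the same mark recorded at count values E2 and E3.
def per_count_displacement(x1, y1, x5, y5, e2, e3):
    """Return (dX, dY), the movement per count in robot coordinates."""
    counts = e3 - e2
    return (x5 - x1) / counts, (y5 - y1) / counts

# Example with made-up values: the mark moved 300 mm in X over 3000 counts.
dx, dy = per_count_displacement(100.0, 50.0, 400.0, 50.0, 1000, 4000)
```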
  • From these values, the six parameters A to F of the conversion formula relating to the conversion of the coordinate system are determined. That is, parameters A to F that satisfy the following expression (or that minimize the error) are determined using a known method.
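One such known method is a least-squares fit over the four mark correspondences. The sketch below is a hedged illustration, not the patent's actual procedure: the affine model X = A·x + B·y + C, Y = D·x + E·y + F and all point values are assumptions.

```python
# Hypothetical sketch: fit the six parameters A..F of an affine map from
# camera coordinates (x, y) to robot coordinates (X, Y) by least squares.

def solve3(m, b):
    """Solve a 3x3 linear system by Gauss-Jordan elimination with pivoting."""
    a = [row[:] + [bi] for row, bi in zip(m, b)]
    for col in range(3):
        piv = max(range(col, 3), key=lambda r: abs(a[r][col]))
        a[col], a[piv] = a[piv], a[col]
        for r in range(3):
            if r != col:
                f = a[r][col] / a[col][col]
                a[r] = [rv - f * cv for rv, cv in zip(a[r], a[col])]
    return [a[i][3] / a[i][i] for i in range(3)]

def fit_affine(cam_pts, rob_pts):
    """Least-squares affine fit; returns (A, B, C, D, E, F)."""
    # Normal equations M^T M p = M^T t, solved once per target coordinate.
    mtm = [[0.0] * 3 for _ in range(3)]
    mtx = [0.0] * 3
    mty = [0.0] * 3
    for (x, y), (X, Y) in zip(cam_pts, rob_pts):
        row = (x, y, 1.0)
        for i in range(3):
            for j in range(3):
                mtm[i][j] += row[i] * row[j]
            mtx[i] += row[i] * X
            mty[i] += row[i] * Y
    return tuple(solve3(mtm, mtx) + solve3(mtm, mty))
```

With four (or more) correspondences the normal equations are overdetermined, which matches the "minimize the error" wording above.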
  • the measurement cycle of the image processing apparatus 510 is longer than the communication cycle of the controller 100, and the measurement timing is not punctual. Therefore, the controller 100 estimates the position of the workpiece W in cycles in which position data cannot be obtained from the image processing apparatus 510.
  • the image processing apparatus 510 transmits two images captured at time N-1 and time N, together with the time data at which the two images were captured, to the controller 100 via the industrial network 2.
  • the position data estimation processing unit 105 of the controller 100 measures the number of pixels by which feature points, such as edges of the workpiece W, move between the two images, and the time difference between them. From these, the position data estimation processing unit 105 determines the moving speed and moving direction of the workpiece W. Then, based on this vector information, it estimates the position of the workpiece W in each control cycle of the controller 100.
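The estimation step can be sketched as linear extrapolation along the measured velocity vector. This is a minimal illustration; the function name and the two-sample linear model are assumptions, not the patent's exact algorithm.

```python
# Hypothetical sketch: extrapolate the workpiece position at an arbitrary
# controller-cycle time from two timestamped position measurements.
def estimate_position(p_prev, t_prev, p_now, t_now, t_query):
    """Linear extrapolation along the velocity implied by two samples."""
    dt = t_now - t_prev
    vx = (p_now[0] - p_prev[0]) / dt   # moving speed, x component
    vy = (p_now[1] - p_prev[1]) / dt   # moving speed, y component
    lag = t_query - t_now              # time since the latest measurement
    return (p_now[0] + vx * lag, p_now[1] + vy * lag)
```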
  • Next, a clock correction method will be described.
  • a conventional method can be used, and for example, the method disclosed in Japanese Patent No. 5794449 can be used.
  • the controller 100 has a function of transmitting a time synchronization frame for time adjustment to the first control application engine 700, the second control application engine 710, the image processing device 510, and the servo amplifier 540.
  • the first control application engine 700, the second control application engine 710, the image processing apparatus 510, and the servo amplifier 540 are referred to as slave units.
  • When each slave unit receives the time synchronization frame from the controller 100, it immediately transmits the frame to the next slave unit without adding any data to it.
  • the time required from the reception of the time synchronization frame by one slave unit to the transmission to the next slave unit is substantially constant.
  • the time for the time synchronization frame to travel over the line inside a slave unit and over the system bus between slave units is uniquely determined by the line length and is constant. Therefore, it can be said that the time required to transfer the time synchronization frame to the next slave unit is almost equal for every slave unit.
  • the time required from when a time synchronization frame is transmitted from a slave unit to when the time synchronization frame is received by the next slave unit and output from the next slave unit is the same for each slave unit.
  • this reference time is set to t0; that is, the propagation delay time required from when the time synchronization frame is transmitted from the controller 100 to when it is output from the first slave unit is t0.
  • the propagation delay time required from when the time synchronization frame is transmitted from the first slave unit until it is received by the second slave unit and output to the next slave unit is likewise t0. Accordingly, the propagation delay time required from when the time synchronization frame is transmitted from the controller 100 to when it is transmitted from the second slave unit is 2 × t0. Similarly, the propagation delay time required from when the time synchronization frame is transmitted from the controller 100 to when it is transmitted from the transmission control unit of the Nth slave unit is N × t0.
  • t0 may be a time required for internal processing of the slave unit.
  • N ⁇ t0 is stored in the memory or register as the correction time based on the propagation delay time.
  • t0 is stored in the register of the first slave unit, the image processing apparatus 510.
  • 2 × t0 is stored in the register of the second slave unit, the first control application engine 700.
  • the register of the third slave unit, the second control application engine 710, stores 3 × t0.
  • the correction time stored in the register of each slave unit is obtained by adding t0 in order starting from the slave unit closest to the controller 100 (the Nth slave unit holds N × t0). When the user stores the correction times in the registers manually, the user can identify each slave unit to be set from the controller 100 side and set the corresponding correction time individually using a setting tool device. Alternatively, each slave unit can be provided with a switch for setting the correction time, and the user sets the time for each unit by operating the switch. This switch specifies, for example, the ordinal number N of the unit. Each slave unit then multiplies the correction reference time t0, stored and held in advance, by the N specified by the switch to obtain the correction time, and stores it in the register.
  • the controller 100, as the master, may automatically set the correction time in the register of each slave unit. That is, as an initial process at power-on or system startup, the controller 100 acquires the connection position information of each slave unit, calculates the correction time of each slave unit for which a correction time is to be set, and notifies each slave unit of the calculated correction time. The slave unit that has received the notification stores the sent correction time in its register. Specifically, the controller 100 transmits a profile request message to each slave unit prior to communication, and each slave unit returns its own profile (model information, etc.). Thereby, from the received profile information of the slave units configuring each node, the controller 100 recognizes the configuration information indicating which slave unit is installed at each node. Therefore, based on the collected configuration information, the controller 100 can determine at what position each slave unit is connected and obtain its correction time by multiplying the reference time t0 accordingly.
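The automatic assignment above can be sketched as follows. This is a minimal illustration: the slave identifiers and t0 value are made up, and only the rule "the Nth slave unit stores N × t0" comes from the text.

```python
# Hypothetical sketch: the master derives each slave's correction time
# from its position in the collected configuration information.
def correction_times(slaves_in_order, t0):
    """Assign N * t0 to the Nth slave unit (N counted from 1)."""
    return {sid: n * t0 for n, sid in enumerate(slaves_in_order, start=1)}

corrections = correction_times(
    ["image_processing_510", "app_engine_700", "app_engine_710"], t0=5)
```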
  • the controller 100 broadcasts time information of the clock 108. That is, when the time alignment condition is satisfied, the controller 100 starts preparation for transmission of transmission data (time information) for the time synchronization frame.
  • the transmission condition is expiration of the transmission timer. Therefore, the time synchronization frame transmission processing is performed at regular intervals set by the transmission timer. This time synchronization frame is transmitted with the highest priority.
  • the controller 100 latches the time information of the clock 108 serving as the master time, and stores it in the data portion of the time synchronization frame.
  • the controller 100 encodes the time synchronization frame generated in this way and transmits it to the first slave unit. Once the time information has been latched, the controller 100 transmits the time synchronization frame with the highest priority and without delay.
  • the time synchronization frame (serial) is received by the first slave unit (in this embodiment, the image processing apparatus 510).
  • the first slave unit can acquire the master time stored in the time synchronization frame.
  • the first slave unit latches the master time sent in the received time synchronization frame, and obtains the current time by adding the correction time stored in the register to the master time.
  • Based on the obtained time, the slave unit corrects its own clock (in this embodiment, the clock 511 of the image processing apparatus 510).
  • the correction of the clock based on the obtained current time can be realized, for example, by performing a process of updating the time of the clock by overwriting it with the obtained current time.
  • in this case, the value indicated by the clock may jump discontinuously before and after the correction.
  • alternatively, the first slave unit may use a predetermined correction algorithm or the like to correct the time gradually, adjusting toward the master time little by little by increasing or decreasing the clock speed.
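The two correction styles just described, overwriting the clock at once versus slewing it gradually, can be sketched as below. This is a hedged illustration; the bounded-step rule stands in for the unspecified "predetermined correction algorithm".

```python
# Hypothetical sketch of the two clock-correction styles.
def step_correct(local_time, master_time, correction):
    """Overwrite the clock with master time plus the stored correction."""
    return master_time + correction  # local_time is simply discarded

def slew_correct(local_time, master_time, correction, max_adjust):
    """Move at most max_adjust per sync frame toward the target time."""
    target = master_time + correction
    delta = target - local_time
    delta = max(-max_adjust, min(max_adjust, delta))
    return local_time + delta
```

With slewing, a clock far from the master time converges over several sync frames instead of jumping discontinuously.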
  • the first slave unit transmits a time synchronization frame to the second slave unit.
  • the time synchronization frame transmitted in this way by the daisy chain method is received by the slave unit adjacent at the next stage (here, the second slave unit, the first control application engine 700).
  • the second slave unit obtains the master time stored in the received time synchronization frame and adds its correction time (2 × t0) to obtain the current time.
  • Based on the obtained time, the clock 701 of the first control application engine 700 is corrected.
  • the first control application engine 700 transmits the received time synchronization frame to the third slave unit.
  • each slave unit sequentially transfers the time synchronization frame sent from the controller 100 to the adjacent slave unit by the daisy chain method. Further, the time of the Nth slave unit is corrected by adding to the master time the station delay ((N − 1) × t0) of the N − 1 slave units existing between it and the controller 100.
  • the clock correction method described above is merely an example, and other methods may also be used, for example, the method disclosed in Japanese Patent No. 5141972.
  • FIG. 9 is a conceptual diagram schematically showing the function block 180.
  • FIG. 10 is a conceptual diagram schematically showing the function block 190.
  • the function block 180 is stored in, for example, a main memory or a storage device of the controller 100. As shown in FIG. 9, the function block 180 includes a receiving unit 181 that receives a structure array variable World in which the position and time of the world coordinate system are stored, an enable unit 182 that receives TRUE or FALSE, and a receiving unit 183 that receives a structure variable Appli[N] in which the position and time of the local coordinate system of the Nth control application are stored.
  • the function block 180 also includes an output unit 184 that outputs a structure array variable World in which the position and time of the world coordinate system are stored, an output unit 185 that outputs an output variable Status indicating whether the function block 180 is operating normally, an output unit 186 that outputs an output variable Error when an abnormality occurs, and an output unit 187 that outputs an output variable ErrorID in which an error code is stored.
  • the function block 190 is stored in, for example, a main memory or a storage device of the controller 100. As shown in FIG. 10, the function block 190 includes a receiving unit 191 that receives a structure array variable World in which the position and time of the world coordinate system are stored.
  • the function block 190 also includes an output unit 193 that outputs a structure array variable World in which the position and time of the world coordinate system are stored, an output unit 194 that outputs a structure variable Appli[N] storing the position and time of the local coordinate system of the Nth control application, an output unit 195 that outputs an output variable Status indicating whether the function block 190 is operating normally, an output unit 196 that outputs an output variable Error when an abnormality occurs, and an output unit 197 that outputs an output variable ErrorID in which an error code is stored.
  • FIG. 11 is a diagram showing a program example in a ladder diagram defined by IEC 61131-3.
  • the structure variable Appli[1] storing the position and time of the coordinate system of the first control application is converted, and the converted position and time are stored in the structure array variable World of the world coordinate system and output from the output unit 184.
  • when the function block 180 is operating normally, the output variable Status output from the output unit 185 is TRUE.
  • if an abnormality occurs, the output variable Error output from the output unit 186 becomes TRUE, and an error code is stored in the output variable ErrorID and output from the output unit 187.
  • likewise, when the function block 190 is operating normally, the output variable Status output from the output unit 195 is TRUE. If an abnormality occurs, the output variable Error output from the output unit 196 becomes TRUE, and an error code is stored in the output variable ErrorID and output from the output unit 197.
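The Enable/Status/Error/ErrorID convention of these function blocks can be sketched in Python. This is only an illustration of the interface described above, not IEC 61131-3 code; the class name, the conversion callable, and the error code value are assumptions.

```python
# Hypothetical sketch of a local-to-world function block interface.
class LocalToWorldFB:
    def __init__(self, to_world):
        self.to_world = to_world   # coordinate conversion callable
        self.status = False        # TRUE while operating normally
        self.error = False         # TRUE when an abnormality occurs
        self.error_id = 0          # error code when error is TRUE

    def call(self, enable, appli_n):
        """appli_n: (position, time) in the Nth local coordinate system."""
        if not enable:
            self.status = False
            return None
        try:
            world = (self.to_world(appli_n[0]), appli_n[1])
            self.status, self.error, self.error_id = True, False, 0
            return world
        except Exception:
            self.status, self.error, self.error_id = False, True, 1
            return None
```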
  • in addition to outputting information via the function blocks as described above, the position information expressed in the world coordinate system may be displayed on the GUI of the integrated development environment 200 shown in FIG. 5.
  • as described above, the clocks 108, 701, 711, 511, and 541 of the controller 100, the first control application engine 700, the second control application engine 710, the image processing device 510, and the servo amplifier 540 are synchronized. The controller 100 receives, from the first control application engine 700, the second control application engine 710, the image processing device 510, and the servo amplifier 540, the position of each local coordinate system updated in each cycle, together with the time information of the update. Accordingly, it is possible to manage pieces of position information that are represented in different coordinate systems and updated at different periods as position information of a unified world coordinate system at a common time.
  • the user program operating on the controller 100 can issue a command or monitor the position using the position expressed in the world coordinate system.
  • even when the update cycle is long or when position data is output from a control application engine or the image processing device 510 that lacks punctuality, the position can be estimated at the control cycle of the controller 100.
  • the command position can be output to the control application engine in the control cycle of the controller 100.
  • not only can the position in the world coordinate system be managed, but speed and acceleration data can also be calculated and output to 400 and the integrated development environment 200. Further, in the integrated development environment 200, the output position, velocity, and acceleration data of the world coordinate system can be displayed on the GUI.
  • information such as speed, acceleration, torque, and current value may be transmitted from the first control application engine 700 and the second control application engine 710 to the controller 100.
  • the first control application engine 700 and the second control application engine 710 are merely examples, and the number of control application engines can be increased or decreased as necessary.
  • FIG. 12 is a diagram showing functional blocks of the control system 1 in the present embodiment. As shown in FIG. 12, in the present embodiment, a first control application engine 700 and a second control application engine 710 are provided inside the controller 100.
  • in this embodiment, the first control application engine 700 and the second control application engine 710 are not provided with clocks; instead, the control application 600, such as the robot 520 or the machine tool 530, is provided with a clock 601.
  • the control application 600 is also provided with a position information update unit that updates the position information, and a transmission unit that transmits the updated position information and the time at which it was updated to the controller 100.
  • the controller 100 corrects the clocks 601, 511, and 541 in the control application 600, the image processing device 510, and the servo amplifier 540, and the controller 100 performs the control application 600, the image processing device 510, and The position of each local coordinate system updated in each cycle and the updated time information are received from the servo amplifier 540. Accordingly, it is possible to manage each piece of position information represented in different coordinate systems and updated at different periods as position information of a unified world coordinate system at a common time.
  • the present invention is not limited to such an embodiment.
  • the counter value provided in the servo amplifier 540 may be transmitted from the servo amplifier 540 to the controller 100, and the position of the workpiece W may be estimated based on this counter value.
  • the mode in which the position of the workpiece W is estimated by transmitting an image from the image processing apparatus 510 and analyzing the image in the controller 100 has been described.
  • alternatively, the image processing device 510 may measure the number of pixels by which feature points, such as the edges of the workpiece W, move between the two images taken at time N-1 and time N, together with the time difference, and thereby determine the moving speed and moving direction of the workpiece W. Information on the moving speed and moving direction of the workpiece W may then be transmitted from the image processing apparatus 510 to the controller 100 so that the controller 100 estimates the position of the workpiece W.
  • Reference signs: 1 Control system; 100 Controller; 101 Clock; 510 Image processing apparatus; 511 Clock; 600 Control application; 700 First control application engine; 701 Clock; 710 Second control application engine; 711 Clock


Abstract

The present invention uses a control device to control apparatuses each of which has a clock and each of which updates position information at a different cycle, the objective being to eliminate discrepancies between the respective pieces of position information and to perform position control with a high degree of precision. In a control system (1) provided with a control device (100) and a processing device (510), for example, the processing device (510) and the like are each provided with a timer unit (511) or the like, a position information update unit for updating position information at a prescribed cycle, and a transmission unit for transmitting, to the control device (100), the updated position information and the time information at that moment. The control device (100) is provided with: a receiving unit (106) for receiving the position information and the time information; a coordinate conversion unit (104) for converting the position information into world coordinate system position information by means of a coordinate conversion formula; an estimation unit (105) for estimating position information between items of world coordinate system position information on the basis of the world coordinate system position information and the time information; and a world coordinate system management unit (104) for managing the world coordinate system position information and the estimated position information on a time axis.
PCT/JP2019/021220 2018-06-06 2019-05-29 Système de commande, procédé de commande de système de commande, et programme de système de commande WO2019235314A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2018108922A JP7095417B2 (ja) 2018-06-06 2018-06-06 制御システム、制御システムの制御方法、および制御システムのプログラム
JP2018-108922 2018-06-06

Publications (1)

Publication Number Publication Date
WO2019235314A1 true WO2019235314A1 (fr) 2019-12-12

Family

ID=68770448

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2019/021220 WO2019235314A1 (fr) 2018-06-06 2019-05-29 Système de commande, procédé de commande de système de commande, et programme de système de commande

Country Status (2)

Country Link
JP (1) JP7095417B2 (fr)
WO (1) WO2019235314A1 (fr)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH0866881A (ja) * 1994-08-29 1996-03-12 Toyoda Mach Works Ltd ロボット制御装置
JP2005262369A (ja) * 2004-03-18 2005-09-29 Yaskawa Electric Corp ロボットシステム
JP2009279608A (ja) * 2008-05-21 2009-12-03 Fanuc Ltd ロボットおよびプレス機械を含むシステム、複数のロボットを含むシステム、ならびにそのようなシステムにおいて使用されるロボットの制御装置
JP2011083841A (ja) * 2009-10-13 2011-04-28 Seiko Epson Corp ロボット制御装置、ロボット制御システム及びロボット制御方法
JP2017094406A (ja) * 2015-11-18 2017-06-01 オムロン株式会社 シミュレーション装置、シミュレーション方法、およびシミュレーションプログラム
JP2018024044A (ja) * 2016-08-09 2018-02-15 オムロン株式会社 情報処理システム、情報処理装置、ワークの位置特定方法、およびワークの位置特定プログラム


Also Published As

Publication number Publication date
JP7095417B2 (ja) 2022-07-05
JP2019212123A (ja) 2019-12-12


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19816071

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19816071

Country of ref document: EP

Kind code of ref document: A1