CN110666820A - High-performance industrial robot controller - Google Patents
- Publication number
- CN110666820A (application CN201910968392.3A)
- Authority
- CN
- China
- Prior art keywords
- robot
- interface
- target
- processor
- unit
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J13/00—Controls for manipulators
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1602—Programme controls characterised by the control system, structure, architecture
- B25J9/161—Hardware, e.g. neural networks, fuzzy logic, interfaces, processor
Landscapes
- Engineering & Computer Science (AREA)
- Robotics (AREA)
- Mechanical Engineering (AREA)
- Automation & Control Theory (AREA)
- Physics & Mathematics (AREA)
- Artificial Intelligence (AREA)
- Evolutionary Computation (AREA)
- Fuzzy Systems (AREA)
- Mathematical Physics (AREA)
- Software Systems (AREA)
- Numerical Control (AREA)
Abstract
The invention discloses a high-performance industrial robot controller, which comprises a processor board, a machine vision unit, a teach pendant, an external IO unit, a gripper unit, external sensors, and motor drivers. The processor board comprises an ARM processor unit, an FPGA unit, a power supply module, an Ethernet interface A, an Ethernet interface B, an Ethernet interface C, an Ethernet interface D, an IO interface, and a CAN interface. The robot controller of the invention is miniaturized, which reduces cost while preserving the controller's computing capability, and it offers strong expandability and high safety. The controller is also easy to extend and includes three-dimensional machine vision, so the (X, Y, Z) information of a target can be acquired and the target grasped more accurately, making the controller suitable for many complex application scenarios. Furthermore, the robot motion-control algorithm is distributed evenly between the ARM and the FPGA, which effectively increases the controller's computing capability.
Description
Technical Field
The invention belongs to the field of industrial robot control, relates to robot control technology, and in particular to a high-performance industrial robot controller.
Background
The controller, as one of the core components of an industrial robot, determines the quality of the robot's control performance. A high-performance motion controller for industrial applications implements motion-control functions such as position, velocity, and acceleration control of the robot; provides trajectory planning and control for continuous-path motion; detects and perceives the external environment, including the operating conditions; offers simple and fast human-machine interaction, teaching, and programming functions; and, via industrial Ethernet, provides rich and flexible communication interfaces for controlling external devices, satisfying the industrial robot's requirements for high dynamic response and high precision.
Most existing industrial robot control systems adopt controllers based on the X86 platform, which offer strong computing capability and stable performance but suffer from poor system expandability, difficulty in adapting to the differentiated application scenarios of industrial robots, and a relatively high price. Embedded platform controllers based on ARM also exist; although they can achieve high precision, their processors' computing capability is limited, and in some application scenarios it is difficult to meet the robot's real-time requirements.
In order to remedy the above drawbacks, the following solution is provided.
Disclosure of Invention
The invention aims to provide a high-performance industrial robot controller.
The object of the invention is achieved by the following technical solution:
a high-performance industrial robot controller comprises a processor board, a machine vision unit, a teach pendant, an external IO unit, a gripper unit, external sensors, and motor drivers;
the processor board comprises an ARM processor unit, an FPGA unit, a power supply module, an Ethernet interface A, an Ethernet interface B, an Ethernet interface C, an Ethernet interface D, an IO interface, and a CAN interface;
the FPGA unit is connected to the ARM processor unit through a high-speed link; two Ethernet interfaces are brought out on the ARM side and two on the FPGA side; the CAN interface and the IO interface are each connected to the ARM processor unit, and the power module supplies power to the processor board;
Ethernet interface B of the processor board is connected to the teach pendant, Ethernet interface C to the machine vision unit, and Ethernet interface A to the motor drivers, which are interconnected via an EtherCAT bus and control the motors; Ethernet interface D is connected to the local Ethernet, the IO interface is connected to the external IO unit, the CAN interface is connected to the gripper unit, and the external sensors are safety-protection sensors connected outside the processor board.
Further, the teach pendant is used to teach the robot and to acquire its key position information; the robot is programmed on site through the teach pendant, and its motion sequence is set;
the key position information comprises the robot's working origin and material placement points.
Further, the machine vision unit, which is a three-dimensional camera, is used to acquire the coordinate information of the target; the machine vision unit acquires the depth information of the target together with its plane information, obtains the height Z of the target relative to the robot from the depth information, then performs image segmentation on the plane information to obtain the (X, Y) coordinates of the target, and thereby obtains the full coordinate information (X, Y, Z) of the target.
Because the acquired coordinates (X, Y, Z) are expressed in the three-dimensional camera's coordinate system, the machine vision unit transmits them to the processor board through Ethernet interface C, and inside the processor board the target's (X, Y, Z) information is converted from the camera coordinate system into the robot coordinate system to obtain the robot coordinate information;
after the robot controller acquires the robot coordinate information of the target, the forward and inverse solutions of each robot axis angle and the interpolation planning of the motion trajectory are computed in the FPGA; that is, the inverse-kinematics computation for the manipulator is completed and the interpolation data are obtained. The specific steps are as follows:
Step 1: establish the D-H parameters, and from them construct the homogeneous transformation matrix of coordinate system i expressed in coordinate system i-1;
Step 2: substitute the D-H parameters into the transformation matrices of all adjacent coordinate systems;
Step 3: multiply the resulting adjacent-coordinate-system transformation matrices in sequence; the product is the spatial description equation of the gripper in the base coordinate system;
Step 4: from the target position coordinates acquired by the machine vision unit, compute the motion variables of all robot joints; the resulting motion data of each joint axis are recorded as the interpolation data;
the FPGA transmits the interpolation data of all robot joints to the ARM processor through the high-speed interface between the FPGA and the ARM; in other words, the FPGA sends the ARM the joint-axis motion variables obtained by solving the gripper's spatial description equation in the base coordinate system for the coordinate position acquired by the three-dimensional camera;
the ARM processor uses the received interpolation data to command the motor drivers and thereby drive the motors; the ARM processor is interconnected with the motor drivers through Ethernet interface A;
when the motors have driven the robot arm to the target position given by the machine vision unit, the robot controller commands the external gripper through the CAN interface to grasp the target.
Further, the machine vision unit is also used to detect abnormal robot operation; the specific detection method is as follows:
Step 1: first acquire the state of the goods on the robot's pallet; when no goods are detected at the target, the process is judged abnormal and an abnormal signal is generated; whether goods are absent from the target is judged from the height between the robot's starting position and the goods;
Step 2: when the machine vision unit cannot find a suitable grasping position on the goods, an abnormal signal is likewise generated; the criterion for failing to find a suitable grasping position is:
S1: when the machine vision unit cannot correctly segment the goods target, it judges that no suitable grasping position can be found;
Step 3: when an abnormal signal is generated, the robot is stopped;
the machine vision unit also transmits the abnormal signal to the processor board, which receives it and issues alarm information through the IO interface.
Furthermore, the safety sensors are photoelectric sensors arranged in a designated working area around the robot and are used to generate an intrusion signal upon detecting that a person or object has entered the safety area in which the robot operates; the detection method is as follows:
photoelectric sensors are arranged in the working area around the robot; when a person or object interrupts a photoelectric sensor, it is judged that the safety area has been breached and that the person or object has entered the robot's working area, and an intrusion signal is generated;
the safety sensor sends the intrusion signal to the processor board through the external IO interface, and the robot is immediately stopped.
Further, the processor board indicates the robot's operating state through the external IO interface;
the processor board is connected to the local Ethernet through network port D, so the robot's operating state can be monitored remotely over the local Ethernet and multiple robots can be monitored in real time; the robot software can also be upgraded online.
The invention has the following beneficial effects:
the robot controller of the invention is miniaturized, which reduces cost while preserving the controller's computing capability, and it offers strong expandability and high safety;
the controller is also easy to extend and includes three-dimensional machine vision, so the (X, Y, Z) information of a target can be acquired and the target grasped more accurately, making the controller suitable for many complex application scenarios;
in addition, the robot motion-control algorithm is distributed evenly between the ARM and the FPGA, which effectively increases the controller's computing capability; the robot gripper is attached through the CAN bus, which makes extension of the robot arm more flexible; the invention is simple, effective, and easy to use.
Drawings
To facilitate understanding by those skilled in the art, the invention is further described below with reference to the accompanying drawings.
FIG. 1 is a block diagram of the system of the present invention;
fig. 2 is a diagram of the internal structure of the single board of the processor according to the present invention.
Detailed Description
As shown in fig. 1, a high-performance industrial robot controller includes a processor board, a machine vision unit, a teach pendant, an external IO unit, a gripper unit, external sensors, and motor drivers;
as shown in fig. 2, the processor board includes an ARM processor unit, an FPGA unit, a power module, an Ethernet interface A, an Ethernet interface B, an Ethernet interface C, an Ethernet interface D, an IO interface, and a CAN interface;
the FPGA unit is connected to the ARM processor unit through a high-speed link; two Ethernet interfaces are brought out on the ARM side and two on the FPGA side; the CAN interface and the IO interface are each connected to the ARM processor unit, and the power module supplies power to the processor board;
Ethernet interface B of the processor board is connected to the teach pendant, Ethernet interface C to the machine vision unit, and Ethernet interface A to the motor drivers, which are interconnected via an EtherCAT bus and control the motors; Ethernet interface D is connected to the local Ethernet, the IO interface is connected to the external IO unit, the CAN interface is connected to the gripper unit, and the external sensors are safety-protection sensors connected outside the processor board;
the teaching device is used for teaching the robot, acquiring key position information of the robot, programming the robot on site through the teaching device and setting a motion process of the robot; the key position information is a working origin and a material placing point of the robot.
The machine vision unit is used for acquiring coordinate information of a target and is a three-dimensional camera; the machine vision unit acquires the plane information of the target and the depth information of the target, the height Z of the target from the robot can be acquired through the depth information, then the plane information is subjected to image segmentation, the (X, Y) coordinate of the target is acquired, and the coordinate information (X, Y, Z) of the target is acquired
For the coordinate information (X, Y, Z) of the obtained target is a coordinate system based on a three-dimensional camera, the machine vision unit transmits the obtained coordinate information (X, Y, Z) to the processor single board through the Ethernet interface B, and the (X, Y, Z) information of the target is converted into a robot coordinate system from the camera coordinate system inside the processor single board to obtain machine coordinate information;
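The camera-to-robot conversion described above can be sketched as a homogeneous-coordinate transform. This is a minimal illustration, not the patent's implementation: the function name `camera_to_robot` and the calibration matrix values are assumptions, and a real system would obtain `T_robot_cam` from hand-eye calibration.

```python
import numpy as np

def camera_to_robot(p_cam, T_robot_cam):
    """Convert a target point from the camera frame to the robot frame.

    p_cam       -- (X, Y, Z) of the target in the camera coordinate system
    T_robot_cam -- 4x4 homogeneous transform of the camera frame expressed
                   in the robot frame (from hand-eye calibration; the value
                   used below is purely illustrative)
    """
    p_h = np.append(np.asarray(p_cam, dtype=float), 1.0)  # homogeneous point
    return (T_robot_cam @ p_h)[:3]

# Illustrative calibration: camera 0.5 m above the robot origin, axes aligned.
T = np.eye(4)
T[2, 3] = 0.5
print(camera_to_robot((0.1, 0.2, 0.3), T))  # -> [0.1 0.2 0.8]
```

The same 4x4-matrix machinery reappears in the D-H kinematics below, which is one reason homogeneous coordinates are the standard representation in robot controllers.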
After the robot controller acquires the robot coordinate information of the target, the forward and inverse solutions of each robot axis angle and the interpolation planning of the motion trajectory are computed in the FPGA; that is, the inverse-kinematics computation for the manipulator is completed and the interpolation data are obtained. The specific steps are as follows:
Step 1: establish the D-H parameters, and from them construct the homogeneous transformation matrix of coordinate system i expressed in coordinate system i-1;
Step 2: substitute the D-H parameters into the transformation matrices of all adjacent coordinate systems;
Step 3: multiply the resulting adjacent-coordinate-system transformation matrices in sequence; the product is the spatial description equation of the gripper in the base coordinate system;
Step 4: from the target position coordinates acquired by the machine vision unit, compute the motion variables of all robot joints, and record the resulting motion data of each joint axis as the interpolation data. This is a common method of robot motion control and is therefore not described in detail;
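Steps 1 through 3 above can be sketched as follows. This is a generic forward-kinematics sketch under the standard D-H convention, not the FPGA implementation from the patent; the function names and the two-link example parameters are illustrative assumptions.

```python
import numpy as np

def dh_matrix(theta, d, a, alpha):
    """Homogeneous transform of frame i expressed in frame i-1 (standard D-H)."""
    ct, st = np.cos(theta), np.sin(theta)
    ca, sa = np.cos(alpha), np.sin(alpha)
    return np.array([
        [ct, -st * ca,  st * sa, a * ct],
        [st,  ct * ca, -ct * sa, a * st],
        [0.0,      sa,       ca,      d],
        [0.0,     0.0,      0.0,    1.0],
    ])

def forward_kinematics(dh_params):
    """Multiply the adjacent-frame transforms in sequence (steps 1-3)."""
    T = np.eye(4)
    for theta, d, a, alpha in dh_params:
        T = T @ dh_matrix(theta, d, a, alpha)
    return T  # pose of the gripper in the base coordinate system

# Illustrative two-link planar arm: joint 1 at 90 degrees, joint 2 at 0.
params = [(np.pi / 2, 0.0, 0.3, 0.0), (0.0, 0.0, 0.2, 0.0)]
tip = forward_kinematics(params)[:3, 3]
print(tip)  # gripper position in the base frame, approximately [0, 0.5, 0]
```

Inverting this chain (step 4) is the inverse-kinematics problem the patent assigns to the FPGA; for a 6-axis arm it is typically solved analytically or numerically per control cycle.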
The FPGA transmits the computed interpolation data of all robot joints to the ARM processor through the high-speed interface between the FPGA and the ARM; in other words, the FPGA sends the ARM the joint-axis motion variables obtained by solving the gripper's spatial description equation in the base coordinate system for the coordinate position acquired by the three-dimensional camera.
The ARM processor uses the received interpolation data to command the motor drivers and thereby drive the motors; the ARM processor is interconnected with the motor drivers through Ethernet interface A;
when the motors have driven the robot arm to the target position given by the machine vision unit, the robot controller commands the external gripper through the CAN interface to grasp the target.
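The interpolation data handed from the FPGA to the ARM can be pictured as a stream of per-cycle joint setpoints. The sketch below assumes simple linear joint-space interpolation at a fixed control cycle; the patent does not specify the interpolation law, and a production controller would use trapezoidal or S-curve velocity profiles. The function name and timing values are illustrative.

```python
import numpy as np

def interpolate_joints(q_start, q_goal, cycle_time=0.001, move_time=1.0):
    """Yield one joint-vector setpoint per control cycle.

    Linear joint-space interpolation between two joint vectors; each
    yielded vector would be sent to the motor drivers over EtherCAT.
    """
    steps = int(round(move_time / cycle_time))
    q_start = np.asarray(q_start, dtype=float)
    q_goal = np.asarray(q_goal, dtype=float)
    for k in range(1, steps + 1):
        yield q_start + (q_goal - q_start) * (k / steps)

# 1 s move at a 1 ms cycle: 1000 setpoints, ending exactly at the goal.
setpoints = list(interpolate_joints([0.0, 0.0], [1.0, -0.5]))
print(len(setpoints), setpoints[-1])
```

Streaming setpoints at the bus cycle rate is what makes the high-speed FPGA-to-ARM link and the EtherCAT interconnect necessary in the architecture described above.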
The machine vision unit is also used to detect abnormal robot operation; the specific detection method is as follows:
Step 1: first acquire the state of the goods on the robot's pallet; when no goods are detected at the target, the process is judged abnormal and an abnormal signal is generated; whether goods are absent from the target is judged from the height between the robot's starting position and the goods;
Step 2: when the machine vision unit cannot find a suitable grasping position on the goods, an abnormal signal is likewise generated; the criterion for failing to find a suitable grasping position is:
S1: when the machine vision unit cannot correctly segment the goods target, it judges that no suitable grasping position can be found;
Step 3: when an abnormal signal is generated, the robot is stopped;
the machine vision unit also transmits the abnormal signal to the processor board, which receives it and issues alarm information through the IO interface;
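The two abnormality checks above reduce to a small decision function. This is only a schematic of the described logic: the function name and the boolean inputs standing in for the height test and the segmentation result are assumptions, not the patent's interface.

```python
def check_pick(goods_detected, segmentation_ok):
    """Return (abnormal, reason) per the two vision checks described above.

    goods_detected  -- height test found goods at the target (step 1)
    segmentation_ok -- vision unit segmented the goods and located a
                       suitable grasping position (step 2 / S1)
    """
    if not goods_detected:
        return True, "no goods on target"
    if not segmentation_ok:
        return True, "no suitable grasping position"
    return False, "ok"

# On an abnormal result the controller stops the robot and raises an alarm
# through the IO interface, as the text describes (step 3).
print(check_pick(True, False))
```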
The processor board indicates the robot's operating state through the external IO interface;
the safety sensor is a photoelectric sensor arranged in a designated working area around the robot, and is used for generating an intrusion signal when detecting that a person or an object intrudes into a safety area in which the robot operates; the method for detecting that a person or an object breaks into a safety area in which the robot operates specifically comprises the following steps:
photoelectric sensors are arranged in a working area around the robot, and when a person or an object breaks through the photoelectric sensors, the fact that the person or the object breaks through a safety area is judged, and the person or the object breaks into the working area of the robot; generating an intrusion signal;
the safety sensor sends the intrusion information to the processor single board through an external IO interface to control the robot to stop moving immediately; and the safety of the intruder is protected.
The processor board is connected to the local Ethernet through network port D, so the robot's operating state can be monitored remotely over the local Ethernet and multiple robots can be monitored at any time. The robot software can also be upgraded online.
The robot controller of the invention is miniaturized, which reduces cost while preserving the controller's computing capability, and it offers strong expandability and high safety;
the controller is also easy to extend and includes three-dimensional machine vision, so the (X, Y, Z) information of a target can be acquired and the target grasped more accurately, making the controller suitable for many complex application scenarios;
in addition, the robot motion-control algorithm is distributed evenly between the ARM and the FPGA, which effectively increases the controller's computing capability; the robot gripper is attached through the CAN bus, which makes extension of the robot arm more flexible; the invention is simple, effective, and easy to use.
The foregoing is merely exemplary and illustrative of the present invention, and various modifications, additions, and substitutions may be made by those skilled in the art to the specific embodiments described without departing from the scope of the invention as defined in the following claims.
Claims (6)
1. A high-performance industrial robot controller, characterized by comprising a processor board, a machine vision unit, a teach pendant, an external IO unit, a gripper unit, external sensors, and motor drivers;
the processor board comprises an ARM processor unit, an FPGA unit, a power supply module, an Ethernet interface A, an Ethernet interface B, an Ethernet interface C, an Ethernet interface D, an IO interface, and a CAN interface;
the FPGA unit is connected to the ARM processor unit through a high-speed link; two Ethernet interfaces are brought out on the ARM side and two on the FPGA side; the CAN interface and the IO interface are each connected to the ARM processor unit, and the power module supplies power to the processor board;
Ethernet interface B of the processor board is connected to the teach pendant, Ethernet interface C to the machine vision unit, and Ethernet interface A to the motor drivers, which are interconnected via an EtherCAT bus and control the motors; Ethernet interface D is connected to the local Ethernet, the IO interface is connected to the external IO unit, the CAN interface is connected to the gripper unit, and the external sensors are safety-protection sensors connected outside the processor board.
2. The high-performance industrial robot controller according to claim 1, wherein the teach pendant is used to teach the robot and to acquire its key position information; the robot is programmed on site through the teach pendant, and its motion sequence is set;
the key position information comprises the robot's working origin and material placement points.
3. The high-performance industrial robot controller according to claim 1, wherein the machine vision unit, which is a three-dimensional camera, is used to acquire the coordinate information of the target; the machine vision unit acquires the depth information of the target together with its plane information, obtains the height Z of the target relative to the robot from the depth information, then performs image segmentation on the plane information to obtain the (X, Y) coordinates of the target, and thereby obtains the full coordinate information (X, Y, Z) of the target.
Because the acquired coordinates (X, Y, Z) are expressed in the three-dimensional camera's coordinate system, the machine vision unit transmits them to the processor board through Ethernet interface C, and inside the processor board the target's (X, Y, Z) information is converted from the camera coordinate system into the robot coordinate system to obtain the robot coordinate information;
after the robot controller acquires the robot coordinate information of the target, the forward and inverse solutions of each robot axis angle and the interpolation planning of the motion trajectory are computed in the FPGA; that is, the inverse-kinematics computation for the manipulator is completed and the interpolation data are obtained. The specific steps are as follows:
Step 1: establish the D-H parameters, and from them construct the homogeneous transformation matrix of coordinate system i expressed in coordinate system i-1;
Step 2: substitute the D-H parameters into the transformation matrices of all adjacent coordinate systems;
Step 3: multiply the resulting adjacent-coordinate-system transformation matrices in sequence; the product is the spatial description equation of the gripper in the base coordinate system;
Step 4: from the target position coordinates acquired by the machine vision unit, compute the motion variables of all robot joints; the resulting motion data of each joint axis are recorded as the interpolation data;
the FPGA transmits the interpolation data of all robot joints to the ARM processor through the high-speed interface between the FPGA and the ARM; in other words, the FPGA sends the ARM the joint-axis motion variables obtained by solving the gripper's spatial description equation in the base coordinate system for the coordinate position acquired by the three-dimensional camera;
the ARM processor uses the received interpolation data to command the motor drivers and thereby drive the motors; the ARM processor is interconnected with the motor drivers through Ethernet interface A;
when the motors have driven the robot arm to the target position given by the machine vision unit, the robot controller commands the external gripper through the CAN interface to grasp the target.
4. The high-performance industrial robot controller according to claim 1, wherein the machine vision unit is further used to detect abnormal robot operation, as follows:
Step 1: first acquire the state of the goods on the robot's pallet; when no goods are detected at the target, the process is judged abnormal and an abnormal signal is generated; whether goods are absent from the target is judged from the height between the robot's starting position and the goods;
Step 2: when the machine vision unit cannot find a suitable grasping position on the goods, an abnormal signal is likewise generated; the criterion for failing to find a suitable grasping position is:
S1: when the machine vision unit cannot correctly segment the goods target, it judges that no suitable grasping position can be found;
Step 3: when an abnormal signal is generated, the robot is stopped;
the machine vision unit also transmits the abnormal signal to the processor board, which receives it and issues alarm information through the IO interface.
5. The controller of claim 1, wherein the safety sensors are photoelectric sensors arranged in a designated working area around the robot and are used to generate an intrusion signal upon detecting that a person or object has entered the safety area in which the robot operates; the detection method is as follows:
photoelectric sensors are arranged in the working area around the robot; when a person or object interrupts a photoelectric sensor, it is judged that the safety area has been breached and that the person or object has entered the robot's working area, and an intrusion signal is generated;
the safety sensor sends the intrusion signal to the processor board through the external IO interface, and the robot is immediately stopped.
6. The controller of claim 1, wherein the processor board indicates the robot's operating state through the external IO interface;
the processor board is connected to the local Ethernet through network port D, so the robot's operating state can be monitored remotely over the local Ethernet and multiple robots can be monitored in real time; the robot software can also be upgraded online.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201910968392.3A CN110666820A (en) | 2019-10-12 | 2019-10-12 | High-performance industrial robot controller |
Publications (1)
Publication Number | Publication Date |
---|---|
CN110666820A true CN110666820A (en) | 2020-01-10 |
Family
ID=69081894
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201910968392.3A Pending CN110666820A (en) | 2019-10-12 | 2019-10-12 | High-performance industrial robot controller |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN110666820A (en) |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN112548996A (en) * | 2020-12-08 | 2021-03-26 | 广东工业大学 | Open industrial robot control system and open industrial robot |
CN113524192A (en) * | 2021-08-02 | 2021-10-22 | 广州市斯睿特智能科技有限公司 | AI vision robot motion control system |
CN116713992A (en) * | 2023-06-12 | 2023-09-08 | 之江实验室 | Electrical control system, method and device for humanoid robot |
Citations (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101045297A (en) * | 2007-04-12 | 2007-10-03 | 武汉科技大学 | Distribution multiple freedom robot controlling system |
CN102848388A (en) * | 2012-04-05 | 2013-01-02 | 上海大学 | Service robot locating and grabbing method based on multiple sensors |
CN102862161A (en) * | 2012-09-10 | 2013-01-09 | 王伟栋 | Field bus-based PAC (Programmable Automation Controller) industrial robot control system |
CN204143223U (en) * | 2014-11-07 | 2015-02-04 | 南京科远自动化集团股份有限公司 | A kind of kinetic control system |
CN104552311A (en) * | 2014-12-05 | 2015-04-29 | 杭州新松机器人自动化有限公司 | EtherCAT-based intelligent industrial robot bus module and operating method thereof |
CN104977912A (en) * | 2015-07-02 | 2015-10-14 | 深圳市蜂鸟智航科技有限公司 | Ethernet-exchange-bus-based unmanned plane flight control system and method |
CN105643624A (en) * | 2016-03-04 | 2016-06-08 | 南京科远自动化集团股份有限公司 | Machine vision control method, robot controller and robot control system |
WO2017198301A1 (en) * | 2016-05-19 | 2017-11-23 | Abb Schweiz Ag | An industrial robot system and a method for communication between an industrial robot and an external network |
CN208681602U (en) * | 2018-07-04 | 2019-04-02 | 苏州东控自动化科技有限公司 | An Internet-of-Things robot controller |
2019-10-12 | CN | Application CN201910968392.3A filed; published as CN110666820A/en | Status: active, Pending |
Non-Patent Citations (1)
Title |
---|
WEN XIULAN ET AL.: "Design of an industrial robot positioning system based on three-dimensional machine vision", Modular Machine Tool & Automatic Manufacturing Technique (《组合机床与自动化加工技术》) * |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN110666820A (en) | High-performance industrial robot controller | |
JP7067816B1 (en) | Robot teaching system and method based on image segmentation and surface EMG | |
US9132551B2 (en) | Teleoperated industrial robots | |
CN111633644A (en) | Industrial robot digital twin system combined with intelligent vision and operation method thereof | |
CN208614800U (en) | A robot control system | |
CN111906778B (en) | Robot safety control method and device based on multiple perceptions | |
CN111421528A (en) | Industrial robot's automated control system | |
CN110216674A (en) | A visual-servo obstacle avoidance system for a redundant-degree-of-freedom manipulator | |
CN111515951A (en) | Teleoperation system and teleoperation control method for robot | |
CN104842356B (en) | A teaching method for multiple palletizing robots based on distributed computing and machine vision | |
CN111015649A (en) | Driving and controlling integrated control system | |
TW202200332A (en) | Determination of safety zones around an automatically operating machine | |
Magrini et al. | Human-robot coexistence and contact handling with redundant robots | |
CN109202852A (en) | An intelligent inspection robot | |
CN111168660B (en) | Active safety system for a redundant-degree-of-freedom hydraulic heavy-load robot arm | |
CN207014366U (en) | A six-axis industrial welding robot anti-collision control system | |
CN114419154A (en) | Mechanical arm dual-mode control method and system based on vision and man-machine cooperation | |
CN105479431A (en) | Inertial navigation type robot demonstration equipment | |
CN102528811B (en) | Mechanical arm positioning and obstacle avoiding system in Tokamak cavity | |
CN111633653A (en) | Mechanical arm control system and method based on visual positioning | |
CN107030700A (en) | A six-axis industrial welding robot anti-collision control system | |
RU124622U1 (en) | MOBILE ROBOT CONTROL SYSTEM | |
CN113618731A (en) | Robot control system | |
JPH01271185A (en) | Remote robot manipulating system | |
JPH01209505A (en) | Teaching device for remote control robot |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
RJ01 | Rejection of invention patent application after publication | Application publication date: 20200110 ||