CN113510720A - Real-time distributed cooperative robot control system - Google Patents


Info

Publication number
CN113510720A
CN113510720A (application number CN202110665852.2A)
Authority
CN
China
Prior art keywords
real-time
unit
control
robot
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202110665852.2A
Other languages
Chinese (zh)
Other versions
CN113510720B
Inventor
刘辛军
叶彦雷
谢福贵
李鹏
李子豪
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Tsinghua University
Original Assignee
Tsinghua University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tsinghua University filed Critical Tsinghua University
Priority to CN202110665852.2A
Publication of CN113510720A
Application granted
Publication of CN113510720B
Legal status: Active
Anticipated expiration


Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J11/00 Manipulators not otherwise provided for
    • B25J11/008 Manipulators for service tasks
    • B25J15/00 Gripping heads and other end effectors
    • B25J15/08 Gripping heads and other end effectors having finger members
    • B25J9/00 Programme-controlled manipulators
    • B25J9/02 Programme-controlled manipulators characterised by movement of the arms, e.g. cartesian coordinate type
    • B25J9/023 Cartesian coordinate type
    • B25J9/16 Programme controls
    • B25J9/1602 Programme controls characterised by the control system, structure, architecture
    • B25J9/1656 Programme controls characterised by programming, planning systems for manipulators
    • B25J9/1664 Programme controls characterised by motion, path, trajectory planning
    • B25J9/1679 Programme controls characterised by the tasks executed
    • B25J9/1694 Programme controls characterised by use of sensors other than normal servo-feedback from position, speed or acceleration sensors, perception control, multi-sensor controlled systems, sensor fusion
    • B25J9/1697 Vision controlled systems

Landscapes

  • Engineering & Computer Science (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Automation & Control Theory (AREA)
  • Manipulator (AREA)

Abstract

The application provides a real-time distributed cooperative robot control system. The control system is connected with a robot body and comprises a Linux operating system unit, a real-time kernel unit, a data distribution service unit, a robot control unit and an EtherCAT field bus unit. The Linux operating system unit is used for building an Ubuntu environment that establishes the operating environment of the robot; the real-time kernel unit is used for executing the real-time control of the control system; the data distribution service unit is used for publishing and subscribing data; the robot control unit is connected with the data distribution service unit and comprises a real-time control module and a non-real-time control module; the real-time control module and the non-real-time control module compute control instructions and transmit them to the EtherCAT field bus unit; and the EtherCAT field bus unit is used for transmitting the received control signals to a driver of the robot body through the EtherCAT bus for corresponding control. Distributed real-time control of the robot is thereby realized.

Description

Real-time distributed cooperative robot control system
Technical Field
The application relates to the technical field of robot control and artificial intelligence, in particular to a real-time distributed cooperative robot control system.
Background
With the development of the robot industry, the academic and industrial communities urgently need high-quality, highly stable, distributed and easy-to-operate control systems. The design of current robot control systems is constrained by the following key factors: system real-time performance, the field bus, the robot framework, and the human-machine interaction module. The real-time performance of the control system is crucial to the application of robots in various fields; delayed or premature execution destabilizes system motion and causes losses that are difficult to estimate in the relevant application fields.
Disclosure of Invention
The application aims to provide a real-time distributed cooperative robot control system.
According to an aspect of the present application, a real-time distributed cooperative robot control system is provided, where the control system is connected to a robot body, and the control system includes:
the Linux operating system unit is used for building an Ubuntu environment that establishes the operating environment of the robot;
the real-time kernel unit is used for executing the real-time control of the control system; the real-time kernel unit is constructed by modifying and compiling a native Linux kernel based on the Linux operating system unit;
the data distribution service unit is used for publishing and subscribing data; wherein the data distribution service unit is built based on the ROS 2 environment;
the robot control unit is connected with the data distribution service unit and comprises a real-time control module and a non-real-time control module; the real-time control module is used for obtaining a first control instruction by calculation according to a real-time task based on the real-time kernel unit and inputting the first control instruction to the EtherCAT field bus unit; the non-real-time control module is used for calculating according to a non-real-time task to obtain a second control instruction and inputting the second control instruction to the EtherCAT field bus unit;
and the EtherCAT field bus unit is used for transmitting the received control signals sent by the real-time control module and the non-real-time control module to a driver of the robot body through an EtherCAT bus to perform corresponding control.
In some embodiments of the present application, the real-time kernel unit is constructed by:
configuring a real-time preemption patch for the native Linux kernel and recompiling; the preemption mode is set to fully preemptible kernel, the timing system is set to high-resolution timers, timer tick handling is set to tickless operation, the default CPU frequency scaling governor is set to performance, and the power state of the processor is set to the highest level.
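As an illustrative sketch only (the patent names the settings but not the exact option names, so these are assumptions drawn from mainline PREEMPT_RT kernels), the described configuration corresponds to a kernel configuration fragment such as:

```
# .config fragment (illustrative; option names from the PREEMPT_RT patch set)
CONFIG_PREEMPT_RT=y        # fully preemptible (real-time) kernel
CONFIG_HIGH_RES_TIMERS=y   # high-resolution timer subsystem
CONFIG_NO_HZ_FULL=y        # tickless operation ("no interrupt" tick handling)
CONFIG_HZ_1000=y           # 1000 Hz base timer frequency
# Runtime policies, typically applied from boot scripts:
#   cpufreq governor set to "performance" (frequency pinned at maximum)
#   /dev/cpu_dma_latency held open at 0 (processor kept at highest power state)
```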
In some embodiments of the present application, a quality-of-service (QoS) policy is configured in the data distribution service unit.
Further, the robot control unit is specifically configured to:
for a given data task, establishing a model according to forward kinematics, inverse kinematics and dynamics;
planning in the Cartesian layer and the joint space according to the model;
according to the joint-space plan, the real-time control module computes, based on the real-time kernel unit, a first control instruction for joint-space force control and position control, and inputs the first control instruction to the EtherCAT field bus unit;
and according to the Cartesian-layer plan, the non-real-time control module computes a second control instruction for Cartesian-layer force control and position control, and inputs the second control instruction to the EtherCAT field bus unit.
In some embodiments of the present application, the real-time control module is specifically configured for:
setting the scheduling mode of the operating system to a first-in first-out mode;
setting a scheduling priority;
allocating and locking memory in advance;
preventing the processor from entering a low-power mode;
and timing with the high-resolution timer of the real-time kernel unit, computing according to the real-time task with a set timing period.
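The set-up steps listed above can be sketched in Python as follows; this is a minimal illustration, not the patent's implementation, and the raw `mlockall` constants and the fallback behaviour when privileges are missing are our own assumptions:

```python
import ctypes
import os

MCL_CURRENT, MCL_FUTURE = 1, 2  # Linux mlockall() flags

def configure_realtime(priority=81):
    """Try to apply FIFO real-time scheduling and lock memory; report results."""
    result = {"policy": "SCHED_OTHER", "priority": 0, "memory_locked": False}
    try:
        # First-in first-out scheduling at a real-time priority (1..99).
        os.sched_setscheduler(0, os.SCHED_FIFO, os.sched_param(priority))
        result["policy"], result["priority"] = "SCHED_FIFO", priority
    except (AttributeError, OSError):
        pass  # needs CAP_SYS_NICE/root (or a non-Linux host); keep defaults
    try:
        # Pre-allocate and pin all current and future pages to avoid page faults.
        libc = ctypes.CDLL("libc.so.6", use_errno=True)
        if libc.mlockall(MCL_CURRENT | MCL_FUTURE) == 0:
            result["memory_locked"] = True
    except OSError:
        pass
    return result

cfg = configure_realtime()
print(cfg)
```

Preventing low-power states is done separately by holding `/dev/cpu_dma_latency` open with a value of 0, which only a privileged process can do.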
Wherein, the calculation formulas of the real-time control module for force control comprise:

T_i = K_θ·(θ_md − θ_m) − D_θ·θ̇_m + G(θ_Ld)    (1)

T_m = (I_m/I_n)·T_i + (1 − I_m/I_n)·T − (D − (I_m/I_n)·D_n)·θ̇_m    (2)

wherein K_θ and D_θ are respectively the position and velocity proportional gain coefficients, G(θ_Ld) denotes constant gravity compensation based on the desired link-end position, θ_md is the desired motor position, θ_m is the actual motor position, θ̇_m is the actual motor speed, T_i denotes the position-control-loop torque output, I_m is the motor inertia, I_n is the nominal motor inertia, T is the actual motor torque, D is the motor damping, D_n is the nominal motor damping, T_m is the motor control torque, and K is the stiffness matrix.
In addition, the real-time distributed cooperative robot control system provided by the present application further includes:
the visual semantic perception unit is used for acquiring images, performing visual perception processing on the acquired images to obtain visual semantic perception data, and sending the visual semantic perception data to the robot control unit through the data distribution service unit;
and the robot control unit is also used for generating a corresponding control instruction according to the received visual semantic perception data and sending the corresponding control instruction to the EtherCAT field bus unit.
In some embodiments of the present application, the robot control unit and the visual semantic perception unit are connected by:
the visual semantic perception unit and the robot control unit run on one X86 hardware platform, and a GPU graphics card is configured on the X86 hardware platform; alternatively,
the visual semantic perception unit and the robot control unit run on two different X86 hardware platforms, wherein the visual semantic perception unit is responsible for accelerated calculation of visual images.
According to the technical scheme provided by the application, real-time performance of the operating system is achieved through the construction of the real-time kernel unit, so that the real-time performance of the control system can be effectively improved. Meanwhile, the data distribution service unit realizes publishing and subscribing among distributed nodes, improving the real-time performance of a single node and of data interaction between nodes. In addition, the real-time control module in the robot control unit computes corresponding control instructions according to the real-time tasks and transmits them to the robot body through the field bus, so that distributed real-time control of the robot is realized, and the control system has flexibility, openness, modularity, extensibility and user-friendliness.
Additional aspects and advantages of the present application will be set forth in part in the description which follows and, in part, will be obvious from the description, or may be learned by practice of the present application.
Drawings
The foregoing and/or additional aspects and advantages of the present application will become apparent and readily appreciated from the following description of the embodiments, taken in conjunction with the accompanying drawings of which:
fig. 1 is a schematic connection diagram of a robot body and a control system according to an embodiment of the present disclosure;
fig. 2 is a schematic structural diagram of a real-time distributed cooperative robot control system according to an embodiment of the present disclosure;
fig. 3 is a schematic diagram illustrating a working principle of a robot control unit in a real-time distributed cooperative robot control system according to an embodiment of the present disclosure;
fig. 4 is a schematic diagram illustrating a working principle of a real-time control module in a real-time distributed cooperative robot control system according to an embodiment of the present disclosure;
fig. 5 is a schematic structural diagram of another real-time distributed cooperative robot control system provided in an embodiment of the present application;
fig. 6 is a schematic diagram of a construction method of a visual semantic perception unit in another real-time distributed cooperative robot control system according to an embodiment of the present application.
Detailed Description
Reference will now be made in detail to embodiments of the present application, examples of which are illustrated in the accompanying drawings, wherein like or similar reference numerals refer to the same or similar elements or elements having the same or similar function throughout. The embodiments described below with reference to the drawings are exemplary and intended to be used for explaining the present application and should not be construed as limiting the present application.
It should be noted that the real-time design of the control system is crucial for industrial applications such as control of nuclear plants, medical systems, rail switching systems, flight control systems, space missions, autonomous driving and robots, all of which require precise real-time control. Real-time here means that events occurring in the environment must be reacted to within a precise time window. Thus, a real-time system must not only be logically correct, but must also complete its data interaction at a given time or deadline; triggering late or early results in an immeasurable penalty. Furthermore, real-time does not mean as fast as possible, but rather keeping timing jitter small over the window in which tasks are received, computed and transmitted.
In the prior art, ROS 1 (Robot Operating System 1, the first-generation robot operating system) requires a master node to realize publishing and subscribing among distributed nodes, but instability of the master node can bring the whole system down. Meanwhile, ROS 1 was not designed for real time from the outset, so real-time control of robots and automation devices cannot be realized with it. To solve these problems of ROS 1, ROS 2 (Robot Operating System 2, the second-generation robot operating system) was proposed; ROS 2 uses DDS (Data Distribution Service) as middleware to realize distributed publishing and subscribing of data. DDS provides various QoS (Quality of Service) policies that ensure real-time, efficient and flexible data distribution. However, this alone still does not resolve the critical real-time issues of system delay and jitter.
Based on the above problem, the present application provides a real-time distributed cooperative robot control system.
It should be noted that, in the embodiment of the present application, the real-time distributed cooperative robot control system operates on a hardware platform. As an example, the hardware platform can be built on an X86 architecture and can include 3 Ethernet network cards, an Intel processor (model I7-7700), memory, a solid-state disk, and 3 relays for emergency stop and power-off. The memory can be DDR4 (Double Data Rate 4, fourth-generation double-data-rate synchronous dynamic random access memory) with a capacity of 16 GB, and the solid-state disk capacity can be 256 GB. As shown in fig. 1, the robot body may be connected directly to the real-time distributed cooperative robot control system through an Ethernet cable, where the robot may be a single-arm or dual-arm robot with an end gripper and visual perception functions. In addition, the real-time distributed cooperative robot control system provides 48 V and 24 V direct-current power supplies to the robot body: 48 V mainly powers the robot motors, while 24 V powers execution units such as the drivers and the gripper jaws. The robot body is externally connected to the control system through Ethernet cable, and the connection topology can be linear, tree, star, etc.
Fig. 2 is a schematic structural diagram of a real-time distributed cooperative robot control system according to an embodiment of the present application. The control system includes: the system comprises a Linux operating system unit 210, a real-time kernel unit 220, a data distribution service unit 230, a robot control unit 240 connected with the data distribution service unit 230, and an EtherCAT field bus unit 250.
The Linux operating system unit 210 is configured to build an Ubuntu environment for establishing the operating environment of the robot. It should be noted that the environment is a native Ubuntu environment with a GNOME desktop. As an example, the Ubuntu version may be 20.04 and the Linux kernel version may be 5.10.27, or other versions, which is not limited in this application.
And a real-time kernel unit 220 for performing real-time control of the control system. The real-time kernel unit is constructed by modifying and compiling the native Linux kernel based on the Linux operating system unit. As an example, the specific construction method may be as follows: configure a real-time preemption patch for the native Linux kernel and recompile; the preemption mode is set to fully preemptible kernel, the timing system is set to high-resolution timers, timer tick handling is set to tickless operation, the default CPU frequency scaling governor is set to performance, and the power state of the processor is set to the highest level; the timer frequency can be set to 1000 Hz, or to other frequency values according to actual needs.
The data distribution service unit 230 is configured to publish and subscribe data. The data distribution service unit 230 is constructed based on the ROS 2 environment, and the specific construction method may be as follows: first, build the robot environment, i.e. the ROS 2 robot environment; then, to realize data publishing and subscribing among the distributed nodes, a quality-of-service (QoS) policy can be configured in the process, so as to meet the quality-of-service requirements of data interaction between systems.
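To make the effect of a QoS policy concrete, the following toy model illustrates the DDS "request vs. offered" matching rule: a subscriber only matches a publisher whose offered policy is at least as strong as the one requested. The class and function names here are illustrative, not from the patent or from any DDS API:

```python
from dataclasses import dataclass

BEST_EFFORT, RELIABLE = 0, 1      # reliability kinds, ordered by strength
VOLATILE, TRANSIENT_LOCAL = 0, 1  # durability kinds, ordered by strength

@dataclass
class QoSProfile:
    reliability: int = RELIABLE
    durability: int = VOLATILE
    history_depth: int = 10       # how many samples are queued per topic

def compatible(offered: QoSProfile, requested: QoSProfile) -> bool:
    """Request-vs-offered check: the writer must offer at least what the
    reader requests, for each policy independently."""
    return (offered.reliability >= requested.reliability
            and offered.durability >= requested.durability)

sensor_pub = QoSProfile(reliability=BEST_EFFORT)  # lossy high-rate sensor data
control_sub = QoSProfile(reliability=RELIABLE)    # control wants every sample
print(compatible(sensor_pub, control_sub))        # no match: offer too weak
print(compatible(QoSProfile(), QoSProfile()))     # defaults match each other
```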
The robot control unit 240 includes a real-time control module 241 and a non-real-time control module 242. The real-time control module 241 is configured to calculate a first control instruction according to a real-time task based on the real-time kernel unit 220, and input the first control instruction to the EtherCAT fieldbus unit 250. And the non-real-time control module 242 is configured to obtain a second control instruction according to the non-real-time task calculation, and input the second control instruction to the EtherCAT fieldbus unit 250.
In some embodiments of the present application, as shown in fig. 3, the robot control unit is specifically configured to:
for a given data task, establishing a model according to forward kinematics, inverse kinematics and dynamics;
planning in the Cartesian layer and the joint space according to the model;
according to the joint-space plan, the real-time control module computes, based on the real-time kernel unit, a first control instruction for joint-space force control and position control, and inputs the first control instruction to the EtherCAT field bus unit;
and according to the Cartesian-layer plan, the non-real-time control module computes a second control instruction for Cartesian-layer force control and position control, and inputs the second control instruction to the EtherCAT field bus unit.
In this embodiment of the present application, the real-time control module 241 in fig. 2 is used for implementing data subscription, processing and publishing for single and multiple nodes; by making each node real-time it reduces jitter and delay, improves response speed, and provides a guarantee for high-quality motion control. Further, the real-time control module 241 is specifically configured to: set the scheduling mode of the operating system to a first-in first-out mode; set a scheduling priority; allocate and lock memory in advance; prevent the processor from entering a low-power mode; and time with the high-resolution timer of the real-time kernel unit, computing according to the real-time task with a set timing period.
As an example, fig. 4 is a schematic diagram of the operating principle of the real-time control module. As shown in fig. 4, the specific operating principle is as follows. First, the scheduling mode of the operating system is set to first-in first-out, and a high scheduling priority such as 81 is set (1 to 99 are all real-time priorities). Memory is allocated and locked in advance; it can be understood that dynamic memory allocation sometimes causes page faults followed by reallocation, and such page faults may increase the jitter of the system. The processor is prevented from entering a low-power mode by setting the parameter cpu_dma_latency to 0, so that the power state of the processor stays at the highest level, avoiding the system delay caused by waking from a lower power state to a higher one. A high-precision timer with nanosecond resolution is adopted for timing; the current precise time is obtained through the time function clock_gettime(CLOCK_MONOTONIC, &ts_now), which improves timing accuracy and introduces no IO operations. The timing period is set through the high-resolution absolute sleep function clock_nanosleep(CLOCK_MONOTONIC, TIMER_ABSTIME, &ts_next, NULL), realizing the timed delay. Then, according to a preset condition, if the task is not completed the corresponding task is executed in a loop, and if it is completed the current task ends and the next task proceeds. Further, the difference between the time values obtained by the time function in this cycle and in the previous cycle can be analyzed as timing jitter, in order to set different timing frequencies.
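The cyclic loop described above (absolute deadlines from a monotonic clock, with the wake-up error recorded as jitter) can be sketched as follows. Python has no `clock_nanosleep` with `TIMER_ABSTIME`, so the absolute-deadline sleep is emulated with a relative sleep; the 2 ms period and the iteration count are arbitrary demo values, not the patent's 1 ms control cycle:

```python
import time

PERIOD_NS = 2_000_000  # demo period: 2 ms

def run_cycles(n_cycles):
    """Run a fixed-period loop and record per-cycle wake-up error (jitter)."""
    jitter_ns = []
    next_deadline = time.monotonic_ns() + PERIOD_NS
    for _ in range(n_cycles):
        remaining = next_deadline - time.monotonic_ns()
        if remaining > 0:
            time.sleep(remaining / 1e9)  # emulated absolute-deadline sleep
        now = time.monotonic_ns()
        jitter_ns.append(abs(now - next_deadline))  # wake-up error this cycle
        # ... the real-time task (subscribe, compute, publish) would run here ...
        next_deadline += PERIOD_NS  # advance the absolute, drift-free schedule
    return jitter_ns

jitter = run_cycles(20)
print(len(jitter), max(jitter))
```

Advancing `next_deadline` by a constant rather than re-reading the clock keeps the schedule drift-free even when an individual cycle overruns.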
In some embodiments of the present application, the force control and position control of the joint space run in the real-time control module for real-time control. As one implementation, the real-time control module may run in different processes; the real-time processes handle real-time tasks promptly and adopt different scheduling for each real-time task, in particular a high-priority first-in first-out scheduling mode for the joint layer. In addition, for force control, a joint impedance controller can be configured in the real-time control module, the control period can be set to 1 ms, and the driver is correspondingly configured in cyclic synchronous torque mode, realizing real-time computation of the dynamics and the torque loop. The calculation formulas of the real-time control module for force control are shown as formula (1) and formula (2):
T_i = K_θ·(θ_md − θ_m) − D_θ·θ̇_m + G(θ_Ld)    (1)

T_m = (I_m/I_n)·T_i + (1 − I_m/I_n)·T − (D − (I_m/I_n)·D_n)·θ̇_m    (2)

wherein K_θ and D_θ are respectively the position and velocity proportional gain coefficients, G(θ_Ld) denotes constant gravity compensation based on the desired link-end position, θ_md is the desired motor position, θ_m is the actual motor position, θ̇_m is the actual motor speed, T_i denotes the position-control-loop torque output, I_m is the motor inertia, I_n is the nominal motor inertia, T is the actual motor torque, D is the motor damping, D_n is the nominal motor damping, T_m is the motor control torque, and K is the stiffness matrix. All variable values are values reflected through the reduction ratio.
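As a numeric sketch of the joint impedance law (a PD loop on the motor position with constant gravity compensation, followed by motor-inertia shaping through torque feedback): the exact closed form is not recoverable from the source document, so this single-joint combination of terms is an assumption modelled on standard flexible-joint impedance controllers, and all parameter values are made up:

```python
def position_loop_torque(k_theta, d_theta, theta_md, theta_m, theta_m_dot, g_comp):
    """T_i: PD on motor position plus gravity compensation G(theta_Ld)."""
    return k_theta * (theta_md - theta_m) - d_theta * theta_m_dot + g_comp

def motor_control_torque(i_m, i_n, t_i, t_actual, d, d_n, theta_m_dot):
    """T_m: blend commanded and measured torque so the apparent motor inertia
    approaches the nominal one, with a damping correction term."""
    ratio = i_m / i_n
    return (ratio * t_i + (1.0 - ratio) * t_actual
            - (d - ratio * d_n) * theta_m_dot)

t_i = position_loop_torque(k_theta=100.0, d_theta=5.0,
                           theta_md=1.0, theta_m=0.9,
                           theta_m_dot=0.2, g_comp=3.0)
t_m = motor_control_torque(i_m=0.5, i_n=1.0, t_i=t_i, t_actual=8.0,
                           d=0.4, d_n=0.3, theta_m_dot=0.2)
print(round(t_i, 3), round(t_m, 3))
```

In a real 1 ms torque loop both functions would run every cycle with freshly sampled motor position, velocity and torque.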
In the embodiment of the present application, as shown in fig. 2, the EtherCAT field bus unit 250 is configured to transmit the received control signals sent by the real-time control module and the non-real-time control module to a driver of the robot body through the EtherCAT bus for corresponding control.
It should be noted that EtherCAT communication follows a one-master, multiple-slave model: the master station sends data packets to the slave stations and receives the data packets they return; a slave station receives the data packet sent by the master station, realizes application-layer control according to the instructions in the packet, and after the operation is completed writes the corresponding information into the packet, which is finally fed back to the master station. The data packets exchanged between master and slaves are standard Ethernet data frames. After the master station sends a data packet, the slave stations receive it, complete the corresponding application-layer operations, rewrite the packet, and feed it back to the master station, forming one complete EtherCAT communication cycle.
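The master/slave exchange described above can be made concrete at the byte level. The field layout below (command, index, 32-bit address, length, IRQ, payload, working counter) follows the published EtherCAT datagram format; the command code shown is LRW (logical read/write), and the logical address and payload are made-up demo values:

```python
import struct

LRW = 0x0C  # logical read-write command code

def build_datagram(logical_addr, data):
    """Pack one EtherCAT datagram: 10-byte header + payload + working counter."""
    header = struct.pack("<BBIHH",
                         LRW,           # command
                         0,             # index (master-side bookkeeping)
                         logical_addr,  # logical address for LRW
                         len(data),     # 11-bit length; round-trip flags clear
                         0)             # IRQ field
    wkc = struct.pack("<H", 0)          # working counter, incremented by slaves
    return header + data + wkc

def read_wkc(datagram):
    """The working counter occupies the last two bytes on the return path."""
    return struct.unpack("<H", datagram[-2:])[0]

dg = build_datagram(0x00010000, b"\x01\x02\x03\x04")
# Emulate slaves that processed the command: they increment the counter.
processed = dg[:-2] + struct.pack("<H", read_wkc(dg) + 3)
print(len(dg), read_wkc(processed))
```

The master compares the returned working counter with its expected value to detect slaves that failed to process the datagram.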
In the embodiment of the application, the EtherCAT field bus unit realizes control of the robot by constructing an EtherCAT master station and using the EtherCAT bus. As an example, the EtherCAT master station may be constructed in software, such as the commercial Acontis master station or the open-source IGH EtherLab and SOEM master stations, with the Ethernet controller rebound to the EtherCAT stack through a network-card driver change.
For example, to implement the connection between the slave station and the master station, the following may be implemented:
(1) Configure the PDOs (Process Data Objects) and SDOs (Service Data Objects) to generate the corresponding description files. The input PDO comprises the status word, the mode-of-operation display, the actual position value, the actual velocity value, the actual torque value, the digital inputs and the analog inputs. The output PDO comprises the control word, the operating mode, the target torque, the target velocity and the target position. These parameters need to be configured for each axis; for example, a 7-degree-of-freedom cooperative robot needs 7 axes configured, plus a gripper axis.
(2) The PDO and SDO are configured in synchronous mode and bus cycles are set.
(3) Map the configured description file to the corresponding description file of the master station, which amounts to a name mapping only.
(4) Unload the network-card driver shipped with Ubuntu, load the EtherCAT driver, and start the EtherCAT master station, thereby realizing the connection between the EtherCAT master station and the slave stations.
The slave-station drives are all EtherCAT drives, daisy-chained through Ethernet cable. For example, for a single arm (7 degrees of freedom), the chaining can be done directly; for dual arms (14 degrees of freedom, without end effectors) it is achieved through an EtherCAT junction/branching module (e.g. Beckhoff EK1100 and EK1110 modules combined, or a CU1124 module alone).
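The per-axis PDO configuration from step (1) can be sketched as plain data: the same input/output objects are declared for the seven arm joints plus the gripper axis. The entry names mirror the text above; the numeric CoE object indices are omitted because the source does not give them:

```python
INPUT_PDO = ["status_word", "mode_of_operation_display", "actual_position",
             "actual_velocity", "actual_torque",
             "digital_inputs", "analog_inputs"]          # slave -> master
OUTPUT_PDO = ["control_word", "operating_mode", "target_torque",
              "target_velocity", "target_position"]       # master -> slave

def build_axis_config(n_arm_axes=7, gripper=True):
    """One rx/tx PDO set per axis: n arm joints plus an optional gripper axis."""
    axes = [f"joint_{i + 1}" for i in range(n_arm_axes)]
    if gripper:
        axes.append("gripper")
    return {axis: {"rx": list(OUTPUT_PDO), "tx": list(INPUT_PDO)}
            for axis in axes}

config = build_axis_config()
print(len(config), sorted(config)[:2])
```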
According to the real-time distributed cooperative robot control system provided by the embodiment of the application, real-time performance of the operating system is realized through the construction of the real-time kernel unit, so that the real-time performance of the control system can be effectively improved. Meanwhile, the data distribution service unit realizes publishing and subscribing among distributed nodes, improving the real-time performance of a single node and of data interaction between nodes. In addition, the real-time control module in the robot control unit computes corresponding control instructions according to the real-time tasks and transmits them to the robot body through the field bus, so that distributed real-time control of the robot is realized, and the control system has flexibility, openness, modularity, extensibility and user-friendliness.
In order to accurately control the robot by acquiring the perception of the environment around the robot, the application proposes another embodiment.
Fig. 5 is a schematic structural diagram of another real-time distributed cooperative robot control system according to an embodiment of the present application. As shown in fig. 5, on the basis of the above embodiment, the real-time distributed cooperative robot control system further includes:
the visual semantic perception unit 560 is used for acquiring images, performing visual perception processing on the acquired images to obtain visual semantic perception data, and sending the semantic perception data to the robot control unit through the data distribution service unit;
the robot control unit 540 is further configured to generate a corresponding control instruction according to the received visual semantic perception data, and send the corresponding control instruction to the EtherCAT fieldbus unit 550.
It can be understood that visual perception requires a large amount of computation, and a GPU (Graphics Processing Unit) is usually used for parallel acceleration, so the visual semantic perception unit and the robot control unit can be connected in two ways. In the first, the visual semantic perception unit and the robot control unit run on one X86 hardware platform, on which a GPU graphics card is configured; in the second, they run on two different X86 hardware platforms, where the platform of the visual semantic perception unit is responsible for the accelerated computation of visual images. In the embodiment of the present application, fig. 5 illustrates the second connection manner as an example.
For example, for the second connection mode, the two platforms may be interconnected through an Ethernet cable or Wi-Fi, with both X86 hardware platforms placed in the same IP segment and configured with the same ROS domain. In addition, the ROS2 running environment is built on each X86 hardware platform, so that the two units communicate with each other through the subscribe-and-publish mode.
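A minimal sketch of this subscribe-and-publish link follows; the node name, topic name, domain number, and message payload are illustrative assumptions, not taken from the patent, and the rclpy import is guarded so the domain helper remains usable where ROS2 is not installed:

```python
# Sketch of the publish/subscribe link between the two X86 platforms.
# Node/topic names and the domain number are illustrative assumptions.
import os

def domain_env(domain_id: int) -> dict:
    """Environment both hosts must share so their DDS participants discover
    each other: same ROS domain (and, on the network side, same IP segment)."""
    if not 0 <= domain_id <= 101:
        raise ValueError("ROS_DOMAIN_ID is conventionally kept in 0..101")
    return {"ROS_DOMAIN_ID": str(domain_id)}

try:
    import rclpy
    from rclpy.node import Node
    from std_msgs.msg import String

    class PerceptionPublisher(Node):
        """Runs on the visual semantic perception platform."""
        def __init__(self) -> None:
            super().__init__("visual_semantic_perception")
            self.pub = self.create_publisher(String, "perception/semantics", 10)

        def publish(self, payload: str) -> None:
            msg = String()
            msg.data = payload      # e.g. serialized class/size/position data
            self.pub.publish(msg)

    def main() -> None:
        os.environ.update(domain_env(42))   # 42 is an arbitrary example domain
        rclpy.init()
        node = PerceptionPublisher()
        node.publish('{"class": "cup", "xyz": [0.4, 0.1, 0.2]}')
        rclpy.shutdown()
except ImportError:
    pass  # ROS2 (rclpy) not installed; the node sketch is illustrative only
```

The robot control unit would run a matching subscriber on the same topic in the same domain.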
Since the visual semantic perception unit 560 is used to recognize the object to be operated in the environment, determine the three-dimensional spatial position of the object, and so on, a corresponding visual recognition algorithm needs to be constructed in the visual semantic perception unit 560. As an example, as shown in fig. 6, constructing the visual recognition algorithm in the visual semantic perception unit and implementing data interaction with the robot control unit may include the following steps:
(1) configuring the visual semantic perception environment by installing the NVIDIA driver in the Ubuntu system;
(2) installing Anaconda, and creating a virtual environment through conda in Anaconda, so that installing the various software packages does not corrupt the system environment;
(3) installing the PyTorch neural network framework in the virtual environment, along with OpenCV and the CUDA platform;
(4) building an image recognition framework with Detectron2 in the installed environment;
(5) for a specific application, acquiring pictures of the operated object and of the surrounding environment, and labeling the categories in the pictures with image labeling software;
(6) training and testing on the training and test picture sets using a Mask R-CNN network within the Detectron2-based framework. By running inference on images and perceiving the surrounding environment, the image processing tool extracts information such as the size, position coordinates and semantic information of the object, thereby enabling execution of a specific task (for example, dynamically tracking and adaptively grasping an object through position-based visual servoing);
(7) transmitting the data processed by the visual semantic perception unit to the robot control unit through the message or service transmission mode of ROS2, thereby realizing data interaction between the visual semantic perception unit and the robot control unit.
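As a hedged sketch of the hand-off in steps (6) and (7) — Detectron2 itself is not invoked here, and the detection record, camera intrinsics, and message layout are illustrative assumptions — a detected bounding box plus its depth can be back-projected through the pinhole camera model into the three-dimensional position that the perception unit publishes:

```python
# Sketch of the post-processing step: turning a detected box into the 3D
# position published to the robot control unit. The camera intrinsics and
# the detection record below are illustrative assumptions, not patent values.
from dataclasses import dataclass

@dataclass
class Detection:
    label: str      # semantic class from the Mask R-CNN head
    box: tuple      # (x1, y1, x2, y2) in pixels
    depth_m: float  # depth of the object at the box center, metres

def box_center(box):
    x1, y1, x2, y2 = box
    return ((x1 + x2) / 2.0, (y1 + y2) / 2.0)

def back_project(u, v, z, fx, fy, cx, cy):
    """Pinhole camera model: pixel (u, v) at depth z -> camera-frame (X, Y, Z)."""
    return ((u - cx) * z / fx, (v - cy) * z / fy, z)

def to_message(det: Detection, fx, fy, cx, cy) -> dict:
    """Payload the perception unit would publish over ROS2 to the control unit."""
    u, v = box_center(det.box)
    x, y, z = back_project(u, v, det.depth_m, fx, fy, cx, cy)
    return {"class": det.label,
            "size_px": (det.box[2] - det.box[0], det.box[3] - det.box[1]),
            "position_m": (round(x, 4), round(y, 4), round(z, 4))}

det = Detection(label="cup", box=(300, 220, 340, 280), depth_m=0.5)
msg = to_message(det, fx=600.0, fy=600.0, cx=320.0, cy=240.0)
```

In the full pipeline, `det` would come from the Mask R-CNN inference output and `msg` would be serialized into a ROS2 message or service response.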
It should be noted that, in the embodiment of the present application, 510 to 550 in fig. 5 have the same structure and function as 210 to 250 in fig. 2, and are not described again here.
According to the real-time distributed cooperative robot control system provided by the embodiments of the application, a visual semantic perception unit is introduced to perform visual perception processing on the acquired images, and the processed data is sent to the robot control unit through the data distribution service unit. This data communication with the robot control unit makes the control instructions output by the robot control unit more accurate, further improving the applicability and user-friendliness of the real-time distributed cooperative robot control system.
In the description herein, reference to the description of the term "one embodiment," "some embodiments," "an example," "a specific example," or "some examples," etc., means that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the application. In this specification, the schematic representations of the terms used above are not necessarily intended to refer to the same embodiment or example. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples. Furthermore, various embodiments or examples and features of different embodiments or examples described in this specification can be combined by one skilled in the art without contradiction.
Furthermore, the terms "first" and "second" are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defined as "first" or "second" may explicitly or implicitly include at least one such feature. In the description of the present application, "plurality" means at least two, e.g., two, three, etc., unless specifically limited otherwise.
The above-described embodiments should not be construed as limiting the scope of the present application. It should be understood by those skilled in the art that various modifications, combinations, sub-combinations and substitutions may be made in accordance with design requirements and other factors. Any modification, equivalent replacement, and improvement made within the spirit and principle of the present application shall be included in the protection scope of the present application.

Claims (8)

1. A real-time distributed cooperative robot control system, wherein the control system is connected with a robot body, and the control system comprises:
the Linux operating system unit is used for building an Ubuntu environment as the operating environment of the robot;
the real-time kernel unit is used for executing the real-time control of the control system; the real-time kernel unit is constructed by modifying and compiling a native Linux kernel based on the Linux operating system unit;
the data distribution service unit is used for distributing and subscribing data; wherein the data distribution service unit is built based on the environment of the ROS 2;
the robot control unit is connected with the data distribution service unit and comprises a real-time control module and a non-real-time control module; the real-time control module is used for obtaining a first control instruction by calculation according to a real-time task based on the real-time kernel unit and inputting the first control instruction to the EtherCAT field bus unit; the non-real-time control module is used for calculating according to a non-real-time task to obtain a second control instruction and inputting the second control instruction to the EtherCAT field bus unit;
and the EtherCAT field bus unit is used for transmitting the received control signals sent by the real-time control module and the non-real-time control module to a driver of the robot body through an EtherCAT bus to perform corresponding control.
2. The system of claim 1, wherein the real-time kernel unit is constructed by:
configuring a real-time preemption patch for the native Linux kernel and recompiling; wherein the preemption mode is set to fully preemptible kernel, the timing system is set to high-resolution timers, the timer interrupt handling is set to tickless, the default CPU frequency governor is set to performance, and the power state of the processor is set to the highest level.
3. The system of claim 1, wherein the data distribution service unit is configured with a quality-of-service (QoS) policy.
4. The system of claim 1, wherein the robot control unit is specifically configured to:
for a given task, establishing a model according to forward kinematics, inverse kinematics and dynamics;
planning a Cartesian layer and a joint space according to the model;
according to the planning of the joint space, the real-time control module calculates, based on the real-time kernel unit, a first control instruction for force control and position control in joint space, and inputs the first control instruction to the EtherCAT field bus unit;
and according to the planning of the Cartesian layer, the non-real-time control module calculates a second control instruction for force control and position control at the Cartesian layer, and inputs the second control instruction to the EtherCAT field bus unit.
5. The system of claim 1, wherein the real-time control module is specifically configured to:
setting the scheduling mode of the operating system to a first-in first-out mode;
setting the scheduling priority;
allocating and locking the memory in advance;
preventing the processor from entering a low-power mode;
and timing by the high-resolution timer of the real-time kernel unit, setting a timing period, and performing calculation according to the real-time task.
6. The system of claim 4, wherein the calculation formula of the real-time control module for the force control comprises:
T_i = K_θ·(θ_md − θ_m) − D_θ·θ̇_m + G(θ_Ld)

T_m = (I_m/I_n)·T_i + (1 − I_m/I_n)·T + (D − (I_m/I_n)·D_n)·θ̇_m

wherein K_θ and D_θ are the position and velocity proportional gain coefficients; G(θ_Ld) denotes the constant gravity compensation based on the desired position θ_Ld of the link end; θ_md = θ_Ld + K⁻¹·G(θ_Ld) is the desired motor position; θ_m is the actual motor position; θ̇_m is the actual motor speed; T_i denotes the position control loop torque output; I_m is the motor inertia; I_n is the nominal motor inertia; T is the actual motor torque; D is the motor damping; D_n is the nominal motor damping; T_m is the motor control torque; and K is the stiffness matrix.
7. The system of claim 1, further comprising:
the visual semantic perception unit is used for acquiring images, performing visual perception processing on the acquired images to obtain visual semantic perception data, and sending the visual semantic perception data to the robot control unit through the data distribution service unit;
and the robot control unit is also used for generating a corresponding control instruction according to the received visual semantic perception data and sending the corresponding control instruction to the EtherCAT field bus unit.
8. The system according to claim 7, wherein the robot control unit and the visual semantic perception unit are connected by:
the visual semantic perception unit and the robot control unit run on one X86 hardware platform, and a GPU graphics card is configured on the X86 hardware platform; or,
the visual semantic perception unit and the robot control unit run on two different X86 hardware platforms, wherein the visual semantic perception unit is responsible for accelerated calculation of visual images.
CN202110665852.2A 2021-06-16 2021-06-16 Real-time distributed cooperative robot control system Active CN113510720B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110665852.2A CN113510720B (en) 2021-06-16 2021-06-16 Real-time distributed cooperative robot control system

Publications (2)

Publication Number Publication Date
CN113510720A true CN113510720A (en) 2021-10-19
CN113510720B CN113510720B (en) 2022-03-25

Family

ID=78065700

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110665852.2A Active CN113510720B (en) 2021-06-16 2021-06-16 Real-time distributed cooperative robot control system

Country Status (1)

Country Link
CN (1) CN113510720B (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20060063573A (en) * 2004-12-07 2006-06-12 한국전자통신연구원 Control apparatus of network robot having a distributing structure
CN101293350A (en) * 2008-06-12 2008-10-29 上海交通大学 Apery robot distributed dual-bus motion control system
CN101592951A (en) * 2009-07-02 2009-12-02 上海交通大学 Common distributed control system for humanoid robot
CN110427039A (en) * 2019-08-29 2019-11-08 山东大学 A kind of distributed cloud navigation system and air navigation aid based on ROS2
CN112091978A (en) * 2020-09-24 2020-12-18 哈尔滨工业大学 Real-time control system for mechanical arm

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114102588A (en) * 2021-11-23 2022-03-01 上海景吾智能科技有限公司 EtherCAT-based single controller-double mechanical arm device and control method
CN114531318A (en) * 2022-02-25 2022-05-24 深圳市优必选科技股份有限公司 EtherCAT switch and sports equipment
CN114531318B (en) * 2022-02-25 2024-04-16 深圳市优必选科技股份有限公司 EtherCAT switch and sports equipment
CN117226847A (en) * 2023-11-02 2023-12-15 广东电网有限责任公司广州供电局 Control method and system of teleoperation equipment

Also Published As

Publication number Publication date
CN113510720B (en) 2022-03-25

Similar Documents

Publication Publication Date Title
CN113510720B (en) Real-time distributed cooperative robot control system
JP2019517929A (en) Trajectory planning method of point-to-point movement in robot joint space
CN111360830A (en) Vibration control method of flexible mechanical arm based on cooperative tracking
CN105227488A (en) A kind of network flow group scheduling method for distributed computer platforms
CN111034128A (en) Control system and control device
CN110412921B (en) Robot single-leg high-real-time control system based on EtherCAT
CN110340894B (en) Teleoperation system self-adaptive multilateral control method based on fuzzy logic
CN108279630B (en) Bus-based distributed motion control system and method
CN109352661A (en) A kind of control method of joint of robot, servo drive and can storage medium
US20240100697A1 (en) Multi-Thread Controller for Parallel Robot
CN116197919B (en) Robot control system and control method
US11269313B2 (en) Controller and control system that stores data in current control cycle for output in next control cycle
Mineo et al. Interfacing toolbox for robotic arms with real-time adaptive behavior capabilities
CN114690767A (en) Robot trajectory planning method and system and robot
Zhang et al. A real-time robot control framework using ROS Control for 7-DoF light-weight robot
JP7231073B2 (en) Controllers and control systems
Song et al. CONCORD: A control framework for distributed real-time systems
CN114115352B (en) Unmanned aerial vehicle flight control method and system and unmanned aerial vehicle
CN117040678B (en) Time delay control method based on hardware time synchronization
CN116985149B (en) Robot edge control system capable of supporting TSN network technology
Hu et al. Multi-UGV experimental platform based on cloud and edge control: design and implementation
CN116578039A (en) Real-time motion control system and method for numerical control machine tool
JP2002297204A (en) Control system
Pérez et al. An intelligent sensor architecture for mobile robots
CN114281079A (en) ROS-based multi-agent distributed formation control system and method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant