WO2017052061A1 - GPOS-linked real-time robot control system and real-time device control system using the same - Google Patents
- Publication number
- WO2017052061A1 (PCT/KR2016/008040)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- control system
- real
- time
- device control
- gpos
- Prior art date
Classifications
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B19/00—Programme-control systems
- G05B19/02—Programme-control systems electric
Definitions
- the present invention relates to a real-time device control system and a real-time robot system. More specifically, the present invention relates to a GPOS-linked real-time robot control system having a hierarchical architecture that enables accurate real-time processing, easy development and debugging, and robustness to hardware changes, and to a real-time device control system using the same.
- Robots can be largely divided into hardware and software, and they are integrated to form a system.
- Components of the robot hardware include a driver and a controller for moving the robot joints, a battery and a power controller, a communication module, a sensor, an exoskeleton of the robot, and an electronic circuit. These different types of elements are combined according to the characteristics of each desired robot to form a robot hardware platform.
- the present invention is intended to solve the above problems: in a robot control system that requires real-time operation, several independent processes for controlling and processing the same hardware should be able to coexist while the operation of the robot is controlled stably.
- the purpose of the present invention is to provide a real-time device control system having a GPOS-interlocking hierarchical architecture that provides robustness and scalability, and a real-time robot control system using the same.
- a system for solving the above problems, as a real-time device control system, includes: a GPOS (General Purpose Operating System); an RTOS (Real Time Operating System) operating on the GPOS and driving a device control system; and one or more devices connected to the RTOS and controlled in hard real time. The device control system provides a user interface with the GPOS, performs real-time device control processing according to interface input or time synchronization, and communicates with the one or more devices according to the control processing.
- GPOS General Purpose Operation System
- RTOS Real Time Operation System
- the system for solving the above problems, as a real-time robot control system, includes: at least one agent interworking with the GPOS through an interface provided to the GPOS and having mutually independent processes; a shared memory that updates reference data for controlling the robot device according to the operation of the at least one agent; and a device control module, synchronized with the agent, that outputs a control signal for the robot device based on the reference data obtained from the shared memory.
- the method and system for solving the above problems can be implemented as a program for executing the method on a computer and a recording medium on which the program is recorded.
- a plurality of agents having mutually independent processes and a shared memory storing the references generated according to the operations of the plurality of agents are provided, and the hardware device is controlled using those references.
- FIG. 1 is a conceptual diagram schematically showing an entire system according to an embodiment of the present invention.
- FIG. 2 is a flowchart illustrating a control method of a robot control system according to an exemplary embodiment of the present invention.
- FIGS. 3 to 4 are diagrams for describing a relationship between a shared memory and a system according to an exemplary embodiment of the present invention.
- FIG. 5 is a diagram for explaining data exchange between a device control module and an agent according to an embodiment of the present invention.
- FIG. 6 is a block diagram illustrating a device control module according to an embodiment of the present invention.
- FIG. 7 is a flowchart illustrating a control operation of the robot control system according to another exemplary embodiment.
- FIGS. 8 to 9 are diagrams for describing a hierarchical structure and an operating environment according to an embodiment of the present invention.
- FIGS. 9 to 10 are diagrams for describing a system and a hierarchical structure capable of driving a robot control system according to an embodiment of the present invention on a general-purpose operating system.
- components expressed as means for performing the functions described in the detailed description are intended to include all methods of performing those functions, including all forms of software (for example, combinations of circuit elements, or firmware/microcode) combined with appropriate circuitry for executing that software.
- the invention, as defined by the claims, should be understood such that the functionality provided by the various enumerated means is combined in the manner required by the claims, and any means capable of providing that functionality is equivalent to what is understood from this specification.
- FIG. 1 is a conceptual diagram schematically showing an entire system according to an embodiment of the present invention.
- an entire system may include one or more devices 100, a device control module 200, a shared memory 300, one or more agents 400, and a user system 500.
- the device 100 may include one or more driving devices that finally perform the operation of the robot control system.
- the drive device may comprise a hardware device or a software device.
- the drive device may include, for example, at least one of a joint device that controls the drive of an articulated motor, a sensor device including a sensor board, or a simulator device.
- the device 100 may be controlled according to a control signal received from the device control module 200, and may output various data such as sensor data to the device control module 200.
- the term device 100 is not limited to hardware, but may be used as a concept including a software driver for driving an actual hardware device. Accordingly, each device 100 may be connected to the device control module 200 both physically and in software.
- Each device 100 may form a communication network with the device control module 200.
- the communication network may form a system network using a controller area network (CAN) protocol for system stability.
- CAN controller area network
- each device 100 may be connected to the device control module 200 through one or more CAN communication channels, and may receive a message composed of CAN frame data according to a preset control period through the CAN communication channel, or output such a message to the device control module 200.
- the message may include a motor control reference, an encoder value, a controller state value, a pulse width modulation (PWM) command, a sensor value, or various other setting or output values.
- PWM pulse width modulation
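As a rough illustration of the kind of CAN frame payload described above, the sketch below packs a joint-motor reference into the 8-byte CAN 2.0 data field. The field layout (uint16 joint id followed by a float32 position reference, little-endian) is a hypothetical example, not the encoding specified by the patent.

```python
import struct

def pack_joint_reference(joint_id, position_ref):
    """Pack a hypothetical CAN 2.0 data field: joint id as uint16 and a
    position reference in radians as float32, little-endian, padded to
    the 8-byte maximum. The layout is illustrative only."""
    payload = struct.pack("<Hf", joint_id, position_ref)
    return payload.ljust(8, b"\x00")  # CAN 2.0 data field is at most 8 bytes

def unpack_joint_reference(payload):
    """Recover the joint id and position reference from the payload."""
    return struct.unpack("<Hf", payload[:6])
```

A real controller would also carry message identifiers and checksums at the CAN protocol level; those are handled by the CAN controller hardware, not by the payload shown here.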
- the device control module 200 obtains hardware control data for controlling the one or more devices 100 from the references generated by the plurality of agents 400 and stored in the shared memory, and transmits a control signal according to the hardware control data to the one or more devices 100 selected by the references.
- the device control module 200 may always reside on an operating system for controlling the robot control system and may be executed in the background.
- the device control module 200 may be the only module that communicates directly with the devices 100, referring to the shared memory 300, and may transmit control signals or receive sensor signals through the communication channel.
- the device control module 200 may transfer a reference for controlling the joint device 100 to the joint device 100 or receive necessary sensor information from the sensor device 100.
- the device control module 200 may include a real-time thread created on the operating system. The thread is synchronized with the motion generation operation cycle of the system to enable real time processing. In addition, the device control module 200 may further include a non-real time thread for processing data reading and conversion.
- each agent 400 may be implemented as independent software modules having independent processes.
- the agents 400 may each process different motions and perform a process for outputting a reference corresponding thereto.
- each agent 400 may include a motion agent, a controller agent, a communication agent or a walking agent, a damping agent, and various other agents.
- the agents 400 can create and operate their respective threads without sharing heap, data, or static memory, and can provide the data each needs to share to the shared memory 300. This allows organic processing without mutual collision and facilitates software development and debugging.
- each agent 400 may refer to the hardware abstraction data and user-defined data of the shared memory 300 according to its defined process, and store the reference data generated on that basis back in the shared memory 300.
- the user defined data may include shared data for sharing information between the agents 400 and various data for driving other user-definable systems.
- the hardware abstraction data may include abstracted reference, sensor data, motion owner variable, and command data to control the device 100.
- the device control module 200 may generate a control signal for each device 100 by using the hardware abstraction data and hardware information previously stored in the hardware database 250.
- the device control module 200 identifies the control target devices 100 using the hardware abstraction data extracted from the shared memory 300, generates control signals for them, and outputs the control signal according to the reference to each control target device 100.
- the processing cycle of each agent 400 needs to be shorter than the operation cycle in which the system processes motion information. Accordingly, the time for the agent 400 to generate a reference from the sensor data, for the device control module 200 to generate and output a control signal from that reference via the shared memory 300, and for the sensor data to be updated may all be included in the first operation period of the system, so that the series of operations is processed within that period.
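The ordering of one operation period described above could be sketched as follows. The function and dictionary names are assumptions for illustration; a real system would run these steps in synchronized real-time threads, not a plain function call.

```python
def run_one_period(agents, shared_memory, device_control, read_sensors):
    """One real-time operation period, in the order described above:
    (1) each agent derives its reference from the current sensor data,
    (2) the device control module converts the references to control signals,
    (3) fresh sensor data is written back to the shared memory."""
    for name, compute in agents.items():
        shared_memory["reference"][name] = compute(shared_memory["sensor"])
    signals = device_control(shared_memory["reference"])
    shared_memory["sensor"] = read_sensors()
    return signals
```

Because all three steps complete inside the period, the control signal emitted in a given period is always derived from the sensor data of the previous period.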
- the user system 500 may provide a user interface for controlling and monitoring the agent 400 and the device control module 200.
- the user system 500 may include middleware for controlling the agent 400, and may provide various interfaces that may be connected to other external systems.
- FIG. 2 is a flowchart illustrating a control method of a robot control system according to an exemplary embodiment of the present invention.
- 3 to 4 are diagrams for describing a relationship between a shared memory and a system according to an exemplary embodiment of the present invention.
- FIG. 5 is a diagram for explaining data exchange between a device control module and an agent according to an embodiment of the present invention.
- the device control module 200 obtains hardware abstraction data from a reference stored in the shared memory 300.
- the device control module 200 generates a control signal for hardware control from the hardware abstraction data (S107), and transmits the generated control signal to one or more devices 100 (S109).
- the device control module 200 receives sensor data from the devices 100 corresponding to the sensor (S111), and updates the received sensor data in the shared memory 300 (S113).
- the series of operation steps may all be processed within the first period corresponding to the real-time operation period of the robot control system, thereby ensuring real-time operation.
- each of the agents 400 and the device control module 200 may perform data exchange and transfer processing using the shared memory 300.
- the reference corresponding to each device 100 may be stored in the shared memory 300, and the device control module 200 may obtain the reference and use it to output a control signal.
- Such a plurality of agents 400 and the device control module 200 may configure a multi-agent system around the shared memory 300.
- each part performing independent work may be developed separately by several developers, a structure that is advantageous in a collaborative robot-control-system development environment.
- developers can use the shared memory 300 to exchange data with the computational output of other agents 400 while keeping a development space independent of other processes, following a concurrent development model.
- the hardware abstraction data may include sensor data, reference data, motion owner, and command data, and the device control module 200 may access only the hardware abstraction data area of the shared memory 300.
- the device control module 200 accesses the hardware abstraction data area of the shared memory 300 to update the sensor data received from the devices 100, or obtains updated reference data to generate control signals for the devices 100.
- the hardware abstraction data may have a data format produced by abstracting the detailed data of robot device control, and the device control module 200 may convert it into actual hardware control signals and deliver them to the appropriate devices 100.
- the agent 400 developer or user can therefore control the devices without a deep understanding of the hardware.
- the developer or the user may transfer the abstracted hardware input information as a reference through the shared memory 300, and the device control module 200 may generate a low level control signal for controlling the device 100 from the hardware abstraction data.
- the device control module 200 may manage hardware information required to generate the control signal using the hardware database 250 described above.
- the hardware information may include, for example, a list of devices 100, joint motor information (deceleration ratio, encoder pulse, driver channel number, etc.), a communication protocol, and the like.
- the device control module 200 may load the hardware database 250 to determine the hardware information of the driving target device 100, and thereby generate an optimal control signal for controlling it. In addition, even when the hardware changes or hardware of a new configuration is used, only the hardware database 250 needs to be modified, so the system is robust to hardware changes and provides hardware scalability.
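One concrete use of such a hardware database is converting an abstracted joint reference into the low-level unit a motor driver expects. The sketch below uses a gear ratio and encoder pulses-per-revolution entry; the numeric values and key names are made-up illustrations, not data from the patent.

```python
import math

# Illustrative hardware database entries (values are hypothetical).
HARDWARE_DB = {
    "J1": {"gear_ratio": 100.0, "encoder_ppr": 2048},
}

def reference_to_encoder_counts(joint, angle_rad):
    """Convert an abstract joint-angle reference (radians) into encoder
    counts using the joint's gear ratio and encoder pulses per motor
    revolution, as a low-level driver command might require."""
    hw = HARDWARE_DB[joint]
    motor_revolutions = (angle_rad / (2.0 * math.pi)) * hw["gear_ratio"]
    return round(motor_revolutions * hw["encoder_ppr"])
```

Swapping in a different motor or gearbox only changes the `HARDWARE_DB` entry, which mirrors the robustness-to-hardware-change argument above.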
- the hardware abstraction data may include reference data, sensor data, motion owner, and command data.
- the reference data may be updated according to the calculation result in each agent 400, and may include a target value in the current step for the device control module 200 to control each device 100.
- the reference data may include a joint motion reference and a joint controller reference.
- the sensor data may include measurement data that the device control module 200 receives from each device 100.
- the measurement data may include, for example, state information at a current step including an encoder value of the joint device and sensing data.
- the command data may include command information for controlling the device control module 200 and the agent 400 at a higher system level, and may include command target process information and parameter information.
- the shared memory 300 may include motion owner information.
- the hardware abstraction data area of the shared memory 300 may include a memory area 350 for each agent 400 that can update reference data for each agent 400.
- each agent 400 may update its calculated reference in its memory space area.
- each agent 400 may calculate and update reference data corresponding to each device 100. For example, when a total of 31 joint devices 100 exist, from J1 to J31, the memory space area of each agent 400 may include a reference data area corresponding to each of the joint devices 100.
- the shared memory 300 may include a motion owner variable for each of the joint devices 100. Therefore, each motion owner variable space may include the same number of motion owner variables as the number of the joint devices 100.
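The per-agent reference areas and per-joint motion owner variables described above could be laid out as in the sketch below. The agent names are hypothetical; only the shape (one reference slot per agent per joint, one owner variable per joint) follows the description.

```python
NUM_JOINTS = 31  # J1..J31 in the example above
AGENT_NAMES = ["walking", "damping", "task"]  # hypothetical agents

# Per-agent reference areas: each agent gets one reference slot per
# joint device, mirroring the memory area 350 described above.
reference_area = {name: [0.0] * NUM_JOINTS for name in AGENT_NAMES}

# One motion owner variable per joint device, naming the agent that
# currently holds the control right for that joint.
motion_owner = ["walking"] * NUM_JOINTS
```

Each agent writes only into its own row of `reference_area`, so agents never collide; the device control module reads across rows using `motion_owner` to decide which row to honor for each joint.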
- Each motion owner variable may indicate which one of a plurality of preset agents 400 has authority over the corresponding joint device 100. Accordingly, the device control module 200 may determine which agent 400 holds the control right for each joint device 100.
- control right for each joint device 100 may be transferred to another agent 400 or the device control module 200 according to the change of the motion owner variable.
- the device control module 200 may first identify the agent 400 having the control right of the specific joint device 100 from the motion owner variable.
- the device control module 200 may collect the reference data of the identified agents 400 and combine them to generate overall reference data for all of the joint devices 100.
- the device control module 200 may generate a control signal for each device 100 by using the entire reference data, and may appropriately transmit the signal.
- each joint of the robot can be controlled without collision in different agents 400.
- for example, when one agent 400 controls the lower-body joints through an algorithm for stabilizing the lower-body posture and another agent 400 generates a specific task motion of the upper body, the results of the two agents 400 are combined so that a whole-body task of the robot can be performed. This enables efficient control according to the characteristics of the robot's multi-agent system.
- FIG. 6 is a block diagram illustrating a device control module according to an embodiment of the present invention.
- the device control module 200 includes a motion selector 210, a controller signal accumulator 220, a signal combiner 230, and an information handler 240.
- the reference data for the joint may include two or more reference signals for joint motion control and detailed control. Accordingly, the agent 400 corresponding to each joint device 100 may generate the two or more reference signals as reference data and store the same in the shared memory 300.
- the reference signal may be referred to as a motion reference and a controller reference.
- the motion reference may include reference data that provides a dominant value for each joint, and the controller reference may include detailed reference data that is added to or subtracted from the motion reference.
- the references are not limited to these names.
- the motion reference output data M1 to Mm and the controller reference output data may be input to the device control module 200 from the shared memory 300.
- one motion reference may be selected for each joint device 100, but all the controller references may be accumulated and added.
- the motion selector 210 may select the motion reference data corresponding to each joint device 100 from among the motion reference data, based on the motion owner variable information, and output it to the signal combiner 230. Therefore, one motion reference datum may be selected for each joint device 100.
- the controller signal accumulator 220 may accumulate each controller reference datum, regardless of the motion owner variable, and output the resulting value to the signal combiner 230.
- the signal combiner 230 may generate the final reference data for each joint device 100 by combining the selected motion reference data with the accumulated controller reference value, and output them to the appropriate target joint devices 100.
- the signal combiner 230 may identify the type of the reference and classify the processing space according to the reference type.
- the signal combiner 230 may include a type identifier and a spatial processor.
- the reference data may have types other than joint motion, such as task processing; the type identifier may identify whether a reference is of task type or joint type, and the spatial processor may provide processing in a separate data space according to the type.
- separating the motion reference from the controller reference enables functional separation in the process of generating robot motion. For example, when generating a bipedal walking motion, one agent 400 can generate the basic walking pattern as a motion reference, another agent 400 can be designed as a damping controller, and yet another agent 400 can be designed as a controller that suppresses vibration and outputs to the controller reference, which makes design and development much easier.
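The selector/accumulator/combiner path described above reduces, per joint, to "pick one motion reference by ownership, add every controller reference". The sketch below assumes a data layout of dicts keyed by agent then joint, which is an illustration rather than the patent's representation.

```python
def final_reference(joint, motion_owner, motion_refs, controller_refs):
    """Sketch of the combiner described above: exactly one motion
    reference is selected per joint according to the motion owner,
    while every controller reference is accumulated regardless of
    ownership, and the two are summed into the final joint reference."""
    selected = motion_refs[motion_owner[joint]][joint]
    accumulated = sum(refs.get(joint, 0.0) for refs in controller_refs.values())
    return selected + accumulated
```

This is why a damping or anti-vibration agent can act on a joint it does not own: its output enters through the accumulated controller term, not through the owner-selected motion term.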
- the information handler 240 may perform a function of synthesizing sensor data collected from the sensor device 100 or other measurement target devices and outputting them to the shared memory 300.
- FIG. 7 is a flowchart illustrating a control operation of the robot control system according to another exemplary embodiment.
- In general, when a problem occurs in a real experiment using a robot, the robot must be driven again from the beginning. On a mobile platform the initialization process is simple, but in an articulated system such as a humanoid, where initialization on the ground is difficult and the robot must be initialized in the air using a crane or the like, the entire initialization process is very cumbersome and time-consuming.
- the device control module 200 allows the robot to be debugged and tested again without this initialization process.
- system initialization is first performed (S201), and a plurality of agents 400 having respective mutually independent processes operate (S202).
- when the user tests a motion algorithm through an agent 400 and a problem arises, the user can simply pass the motion owner to another agent 400 or to the device control module 200, and then modify the code of the suspended agent 400.
- the motion owner variable may be switched back to the original agent 400 (S209).
- the developer can continue the experiment after taking back the motion owner. This accelerates development; from the user's point of view, a separate agent can also continuously observe the robot's joint references to detect a collision and take over the motion owner when a collision occurs, allowing the robot to be experimented with safely.
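The ownership hand-off used in this debugging flow can be sketched as a plain update of the motion owner variables; the returned snapshot lets ownership be switched back once the agent's code has been fixed. Names and the dict layout are illustrative assumptions.

```python
def transfer_motion_owner(motion_owner, joints, new_owner):
    """Hand the control right for the given joints to another agent
    (for example a safety agent, or the device control module itself)
    so the suspended agent's code can be edited and the experiment
    resumed without re-initializing the robot."""
    previous = {j: motion_owner[j] for j in joints}
    for j in joints:
        motion_owner[j] = new_owner
    return previous  # kept so ownership can be restored afterwards
```

Because only the owner variables change, the robot keeps being driven by the new owner throughout, which is what removes the need for re-initialization.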
- FIG. 8 is a view for explaining the operating environment of the robot system according to an embodiment of the present invention.
- to use robot software in general, it must be able to operate various types of robots rather than only one robot platform, so it must be extensible and adaptable to changes in robot hardware, and not only the actual robot platform but also a robot simulator must be controllable by the same software.
- the robot control system 1000 can build a system capable of utilizing the functions of other useful robot middleware, such as the robot operation system (ROS) from the United States or the open platform for robotics services (OPRoS) from Korea. Accordingly, it can provide an environment in which vision solutions provided by software on a cloud robot system, or functions for managing the tasks of a robot, can easily be applied to the system of the present invention.
- ROS robot operation system
- OPRoS open platform for robotics services
- the device control module 200 controlling the robot devices 100 operates accordingly to provide real-time control of the entire system.
- other robot softwares of higher level may provide connection or determination criteria between motions, or may operate several robots simultaneously.
- FIG. 9 is a diagram showing a hierarchical architecture design of a robot system according to an embodiment of the present invention.
- the robot control system 1000 may include a layered structure of data-processing modules in order to provide an environment in which a plurality of agents, or an arbitrary agent, can be created and operated independently in accordance with an embodiment of the present invention.
- each layer may be connected to the robot control system 1000, and may be implemented in software or hardware on the real-time operating system (RTOS) on which the robot control system 1000 is mounted or installed.
- the real-time operating system may provide global timer interrupts to the fourth layer and the second layer to synchronize operation cycles between layers.
- each agent of the fourth layer may be implemented as a process of the real-time operating system, and may access shared memory, obtain sensor data, and store reference data according to thread operations included in the process.
- the device control module of the second layer, synchronized to these interrupts, stores the sensor data of the devices in the shared memory and, according to its thread operation on the real-time operating system, generates device control signals from the reference data and motion owner of the shared memory and outputs them to the devices.
- the robot control system 1000, which can be implemented on a real-time operating system, includes: a first layer including one or more controlled devices (joints or sensors) included in a robot platform or a simulator; a second layer, above the first layer, including a device control module that directly controls the devices; a third layer, above the second layer, including a shared memory connected to the device control module; a fourth layer, above the third layer, including one or more agents performing independent processes using the shared memory; and a fifth layer, above the fourth layer, that controls the one or more agents according to user commands.
- each communication protocol may be preset so that the first to fifth layers can communicate only with adjacent layers.
- each layer can access another layer only through its immediately upper or lower layer, and through this controlled structure the system can remain stable and systematic.
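The adjacency rule above is simple enough to express directly; the sketch below names the five layers with shorthand labels (an assumption for readability) and checks whether two layers are permitted to communicate.

```python
# Shorthand labels for the first through fifth layers described above.
LAYERS = ["devices", "device_control", "shared_memory", "agents", "user_interface"]

def may_communicate(a, b):
    """Adjacent-layer rule: a layer may talk only to its immediate
    upper or lower neighbor in the hierarchy."""
    return abs(LAYERS.index(a) - LAYERS.index(b)) == 1
```

Under this rule an agent can never touch a device directly; every command funnels through the shared memory and the device control module, which is what makes the second layer the sole point of hardware access.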
- each device may be included in the first layer.
- the devices may include low-level robot devices that are the subjects of actual control, for example motor drivers and controllers, sensor boards, or robot simulator devices.
- the device control module 200 may always reside in the background and execute to control the robot.
- the second layer may be the only layer that can directly control the devices of the robotic system.
- the device control module of the second layer may transfer the reference of the joint generated from the shared memory to the robot device, and conversely obtain the value of the sensor from the device.
- the second layer may be operated by a real time thread generated from a real time operating system (RTOS), and the thread of the second layer may have a period synchronized with a control period of motion generation. If the device is linked with the simulator, the thread of the second layer may operate in synchronization with the simulator's time.
- the second layer may also have a non-real-time thread that can read and interpret instructions; the non-real-time thread can receive and process other instructions in the remaining time of the real-time thread.
- the device control module may have a hierarchical architecture residing in the background of the system and transferring control signals for controlling the device from the reference obtained from the shared memory to the first layer.
- the third layer may be a shared memory layer, and may include a hardware abstraction data unit and a user-defined data unit.
- the hardware abstraction data unit may include the aforementioned hardware abstraction data.
- the type of hardware abstraction data may include sensor data, reference data, a motion owner, and command information.
- the device control module of the second layer may be connected only to the shared memory of the third layer.
- the user defined data unit may temporarily or permanently store agent shared data shared among a plurality of agent processes existing in the fourth layer and robot driving data according to user definition.
- the fourth layer is the layer in which each agent process runs, allowing users of external processes to create their own robot motions; since the agent processes are executed independently of one another within this layer, like berries on a bunch of grapes, it may be referred to as the agent layer (AL).
- Each agent independently reads sensor data from the shared memory of the third layer, generates a motion, and updates the joint reference of the generated motion in the shared memory.
- the agent processes of the fourth layer may set, in the motion owner, which agent has ownership of each joint's reference.
- each agent can create a very short-period, fast real-time thread from the real-time operating system (RTOS); this thread is used to synchronize the motion generation thread of each agent with the real-time thread of the second-layer device control module described above.
- a motion-generating thread can be synchronized in real time with the device control module by the fast thread: its operation resumes at the moment of synchronization and is suspended after one reference operation loop. This operation is repeated to control the robot control system 1000.
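The resume/run-one-loop/suspend cycle above can be modeled with a Python generator, where each `next()` stands in for one synchronization tick from the fast thread. This is only a sketch of the control flow; a real implementation would use RTOS threads and timer interrupts, not a generator.

```python
def motion_loop(compute_reference):
    """Generator sketch of the resume/suspend cycle described above:
    each next() from the synchronizing fast thread resumes the loop
    for exactly one reference computation, after which it suspends
    again until the next synchronization."""
    while True:
        yield compute_reference()  # one reference operation loop per tick
```

Driving the generator for three ticks, for example, performs exactly three reference computations, one per control period, which is the property the synchronization scheme guarantees.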
- not all agents directly generate the motion of the robot; there may be an agent that detects a collision and takes the motion owner from another agent to keep the robot safe, or that performs auxiliary processing to help other agents.
- there may also be an agent (agent N in FIG. 9) that is implemented as a communication module (comm. module) to exchange information with the fifth layer and to control other agents.
- the fifth layer may include a user interface module that provides a control function corresponding to the agents and a monitoring function for the robot control system 1000.
- the fifth layer may include various processes to provide convenience for controlling the robot.
- the fifth layer may include a GUI (Graphic User Interface) for easily issuing commands and monitoring, or a logging program for storing data.
- GUI Graphic User Interface
- the fifth layer is the area accessible to external processes, and existing middleware such as ROS and OPRoS may provide one or more interface functions for controlling the agents.
- the robot control system 1000 may have an infinitely extensible structure, providing the structural possibility of controlling a hyper multi-agent system.
- FIGS. 9 and 10 are diagrams for describing a system and a hierarchical structure capable of driving a robot control system according to an exemplary embodiment of the present invention on a general-purpose operating system.
- conventionally, a separate operating system is developed and used apart from the general-purpose operating system in order to support real time.
- a dual-kernel real-time operating system is illustrated.
- the robot system is controlled through a real-time communication module residing in the separate real-time operating system, while in the general-purpose operating system the user can create his or her own algorithm and deliver it to the real-time operating system.
- This method is used by many robot middlewares.
- since even the contents of firmware development are supported by the real-time operating system, the user's development difficulty and the real-time computing load may both increase.
- such a dual kernel has the disadvantages that precise hard real-time control at the motion level is not easy to achieve due to operational delays, and that constructing the system is not easy.
- to solve the aforementioned disadvantages, the robot control system 1000 may provide a control system framework of a real-time OS (RTOS) that delivers hard real-time control of the robot device 100 while being implemented on a general-purpose OS (GPOS).
- the RTOS including the robot control system 1000 may be implemented as a virtual OS in conjunction with the GPOS.
- the robot (or device) control system 1000 includes a General Purpose Operating System (GPOS) and a Real Time Operating System (RTOS) that operates on the GPOS and drives the device control system.
- the robot control system 1000 provides a user interface with the GPOS, may perform real-time device control processing according to the interface input or time synchronization, and may process communication with the one or more devices according to the control processing.
- the robot control system 1000 may also be implemented on the virtual OS, and all important operations may be performed in the core region 1200 inside the real-time operating system (RTOS).
- the robot control system 1000 may process, in the core region 1200, the core processes for controlling the robot devices 100 according to the above-described second-layer to fourth-layer operations, in conjunction with the real-time communication unit 1300 of the real-time operating system (RTOS) running on the GPOS.
- the user area 1100 may process the user interface of the fifth layer for controlling the aforementioned fourth layer.
- other robot frameworks may easily access the user area 1100 through the user interface.
- the motion-creation parts that the user writes may also reside in the same real-time operating system, so the user can easily generate robot motion that is controlled in real time; without knowing how the low-level robot devices are controlled, the user can simply drive the robot through the hardware abstraction interface provided by the core process that handles this on the user's behalf.
- robot framework middlewares such as ROS or OPRoS described above can be linked to the user area 1100 of the robot control system 1000 even while running on a general-purpose operating system, enabling the user to easily implement hard real-time motion. In addition, real-time computing power can be increased.
- the GPOS may be exemplified by OSX, Linux, or Windows, but it will be apparent that other GPOS may be applied to the embodiment of the present invention according to changes in the OS environment.
- FIG. 10 illustrates a hierarchical structure of the robot control system 1000 framework for providing a hard real-time OS on the GPOS (General Purpose OS) as described above.
- the robot control system 1000 may include a real-time kernel driven in a virtual OS manner on the GPOS.
- the GPOS may access the agent 400 layer of the robot control system 1000 driven in the virtual OS manner, and the robot control system 1000 may include agents 400, a shared memory 300, the real-time kernel 210, a device control module 200, and a communication module 220.
- the GPOS may include an external process interworking with the agent 400 on any one of OSX, Linux, and Windows.
- the GPOS may access one or more agents 400.
- Each of the agents 400 may be included in the above-described fourth layer, and the robot control system 1000 may provide a user interface of the fifth layer to allow the GPOS to access the agents 400.
- when data input and output to the shared memory 300 are performed according to the operation of each agent 400, the device control module 200, following the time synchronization of the real-time kernel 210 operating in the RTOS, may collect sensor data from the robot devices 100 through the communication module 220, update it in the shared memory 300, and output control signals to the robot devices 100.
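One cycle of the described collect-update-output sequence might look like the following simplified sketch, with a toy proportional command standing in for real motion control; the classes, the fake joint, and the 0.5 plant gain are illustrative assumptions.

```python
class SharedMemory:
    def __init__(self):
        self.sensor = {}      # written by the device control module
        self.reference = {}   # written by agent processes

class DeviceControlModule:
    """One synchronized cycle of a (hypothetical) device control module:
    read sensors -> update shared memory -> read references -> send commands."""

    def __init__(self, devices, shm):
        self.devices = devices
        self.shm = shm

    def cycle(self):
        for name, dev in self.devices.items():
            self.shm.sensor[name] = dev.read_sensor()        # collect
        commands = {}
        for name in self.devices:
            ref = self.shm.reference.get(name, 0.0)          # from agents
            commands[name] = ref - self.shm.sensor[name]     # toy P control
            self.devices[name].write_command(commands[name]) # output
        return commands

class FakeJoint:
    """Stand-in for a real robot device on the fieldbus."""
    def __init__(self, pos): self.pos = pos
    def read_sensor(self): return self.pos
    def write_command(self, u): self.pos += 0.5 * u          # toy plant

shm = SharedMemory()
joints = {"j1": FakeJoint(0.0)}
dcm = DeviceControlModule(joints, shm)
shm.reference["j1"] = 1.0      # an agent requests the joint to move to 1.0
for _ in range(5):             # five control periods
    dcm.cycle()
print(joints["j1"].pos)        # converges toward the reference
```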
- the device control module 200 may be directly connected to the robot framework on the GPOS.
- the hardware database 250 that stores hardware information about the robot framework described above with reference to FIG. 1 may be located in a memory area on the GPOS. In this case, the device control module 200 may access the memory area on the GPOS to acquire the necessary hardware data.
- the hardware information may include a list of robot devices, joint information (reduction ratio, encoder pulses, driver channel number, etc.), communication protocols, and the like.
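As a sketch of how such a hardware-database entry might be used, the hypothetical record below converts encoder counts to a joint angle using the reduction ratio and encoder pulses per revolution; the field names and example values are assumptions for illustration only.

```python
from dataclasses import dataclass

@dataclass
class JointInfo:
    """Illustrative hardware-database record holding the kind of joint
    information the text lists: reduction ratio, encoder pulses, and
    driver channel number."""
    reduction_ratio: float    # gearbox reduction, e.g. 100:1
    encoder_ppr: int          # encoder pulses per motor revolution
    driver_channel: int       # channel number on the motor driver

    def counts_to_joint_deg(self, counts):
        """Convert raw encoder counts to the joint angle in degrees."""
        motor_revolutions = counts / self.encoder_ppr
        return 360.0 * motor_revolutions / self.reduction_ratio

# Hypothetical database: device name -> joint parameters.
hw_db = {"elbow": JointInfo(reduction_ratio=100.0,
                            encoder_ppr=2048,
                            driver_channel=3)}

# 204800 counts = 100 motor revolutions = 1 joint revolution at 100:1.
deg = hw_db["elbow"].counts_to_joint_deg(204800)
print(deg)  # 360.0
```

Keeping these parameters in a database rather than in code is what makes the system robust to hardware changes: swapping a gearbox only changes a record, not the control module.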
- the robot control system 1000 according to the embodiment of the present invention may be robust to robot hardware change, and provide a characteristic that the hardware is expandable.
- the GPOS may further include another robot middleware framework for accessing each agent process of the device control system.
- the middleware framework may be, as exemplified above, ROS, OPRoS, or the like.
- the communication module 220 may be connected to each robot device 100 to process data transmission and reception with the robot control system 1000.
- the communication module 220 may support EtherCAT, CAN, RS485, and various other communication methods according to device characteristics.
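Supporting several fieldbuses behind one interface can be sketched as below; the transport classes and the tuples they return are purely illustrative stand-ins for real EtherCAT/CAN drivers, not an actual driver API.

```python
class Transport:
    """Hypothetical common interface so the device control logic stays the
    same whichever fieldbus (EtherCAT, CAN, RS485, ...) a device uses."""
    def send(self, device_id, payload):
        raise NotImplementedError

class EtherCATTransport(Transport):
    def send(self, device_id, payload):
        return ("ethercat", device_id, payload)   # stand-in for a real frame

class CANTransport(Transport):
    def send(self, device_id, payload):
        return ("can", device_id, payload)        # stand-in for a real frame

class CommunicationModule:
    """Routes each device's traffic to the transport it was attached with."""
    def __init__(self):
        self.routes = {}                          # device_id -> transport

    def attach(self, device_id, transport):
        self.routes[device_id] = transport

    def transmit(self, device_id, payload):
        return self.routes[device_id].send(device_id, payload)

comm = CommunicationModule()
comm.attach("servo1", EtherCATTransport())        # per-device bus selection
comm.attach("gripper", CANTransport())
frames = [comm.transmit("servo1", b"\x01"),
          comm.transmit("gripper", b"\x02")]
print(frames)
```

The design choice here mirrors the text: device characteristics pick the bus, while the rest of the control system only ever sees the `transmit` call.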
- the above-described systems and methods according to the present invention may be produced as a program for execution on a computer and stored in a computer-readable recording medium.
- examples of the computer-readable recording medium include ROM, RAM, CD-ROMs, magnetic tapes, floppy disks, and optical data storage devices, and also include media implemented in the form of carrier waves (e.g., transmission over the Internet).
- the computer readable recording medium can be distributed over network coupled computer systems so that the computer readable code is stored and executed in a distributed fashion.
- functional programs, codes, and code segments for implementing the method can be easily inferred by programmers in the art to which the present invention belongs.
Claims (13)
- A real-time device control system comprising: a GPOS (General Purpose Operating System); an RTOS (Real Time Operating System) operating on the GPOS and driving a device control system; and one or more devices connected to the RTOS and controlled in hard real-time, wherein the device control system provides a user interface with the GPOS, performs real-time device control processing according to the interface input or time synchronization, and processes communication with the one or more devices according to the control processing.
- The real-time device control system of claim 1, wherein the device control system comprises a real-time kernel driven in a virtual OS manner on the GPOS.
- The real-time device control system of claim 2, wherein the device control system further comprises a device control module that, according to time synchronization with the kernel, receives sensor information from a device and updates it in a shared memory.
- The real-time device control system of claim 3, wherein the device control module, according to time synchronization with the kernel, acquires reference information from the shared memory and delivers a control signal generated based on the reference information to the device.
- The real-time device control system of claim 1, wherein the device control system comprises one or more agent processes accessible from the GPOS.
- The real-time device control system of claim 5, wherein the GPOS comprises an external process that interworks with the device control system on any one of OSX, Linux, and Windows.
- The real-time device control system of claim 5, wherein the GPOS further comprises another robot middleware framework that accesses each agent process of the device control system.
- A GPOS-linked real-time robot control system comprising: one or more agents that interwork with a GPOS through an interface provided to the GPOS and have mutually independent processes; a shared memory that updates reference data for robot device control according to the operation of the one or more agents; and a device control module that is synchronized with the agents and outputs a control signal of the robot device based on the reference data acquired from the shared memory.
- The GPOS-linked real-time robot control system of claim 8, wherein the device control module is synchronized with a real-time kernel driven in a virtual OS manner on the GPOS.
- The GPOS-linked real-time robot control system of claim 9, wherein the device control module, according to time synchronization with the kernel, receives sensor information from the device and updates it in the shared memory.
- The GPOS-linked real-time robot control system of claim 8, wherein the GPOS comprises an external process that interworks with the agents on any one of OSX, Linux, and Windows.
- The GPOS-linked real-time robot control system of claim 11, wherein the GPOS further comprises another robot middleware framework that accesses each agent process of the device control system.
- The GPOS-linked real-time robot control system of claim 8, further comprising a communication module for delivering the control signal, wherein the communication module converts the control signal into at least one method selected from EtherCAT, CAN, and RS485 and transmits it to the robot device.
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201680054786.2A CN108136577B (zh) | 2015-09-21 | 2016-07-22 | 通用操作系统联动式实时机器人控制系统及利用其的实时设备控制系统 |
EP16848787.4A EP3354417A4 (en) | 2015-09-21 | 2016-07-22 | GPOS-LINKED REAL-TIME ROBOT CONTROL SYSTEM AND REAL-TIME DEVICE CONTROL SYSTEM USING THEREOF |
US15/762,065 US10864635B2 (en) | 2015-09-21 | 2016-07-22 | GPOS-connected real-time robot control system and real-time device control system using same |
JP2018514281A JP6771027B2 (ja) | 2015-09-21 | 2016-07-22 | Gpos連動型リアルタイムロボット制御システム及びこれを用いたリアルタイムデバイス制御システム |
Applications Claiming Priority (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US201562221215P | 2015-09-21 | 2015-09-21 | |
US62/221,215 | 2015-09-21 | ||
KR10-2016-0020779 | 2016-02-22 | ||
KR1020160020779A KR102235947B1 (ko) | 2015-09-21 | 2016-02-22 | Gpos 연동형 실시간 로봇 제어 시스템 및 이를 이용한 실시간 디바이스 제어 시스템 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2017052061A1 true WO2017052061A1 (ko) | 2017-03-30 |
Family
ID=58386189
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/KR2016/008040 WO2017052061A1 (ko) | 2015-09-21 | 2016-07-22 | Gpos 연동형 실시간 로봇 제어 시스템 및 이를 이용한 실시간 디바이스 제어 시스템 |
Country Status (1)
Country | Link |
---|---|
WO (1) | WO2017052061A1 (ko) |
Cited By (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2020004428A (ja) * | 2019-01-10 | 2020-01-09 | ソフトサーボシステムズ株式会社 | モーション制御プログラム、モーション制御方法及びモーション制御装置 |
JP2020077430A (ja) * | 2019-01-23 | 2020-05-21 | ソフトサーボシステムズ株式会社 | モーション制御プログラム、モーション制御方法及びモーション制御装置 |
JP2020098608A (ja) * | 2019-01-10 | 2020-06-25 | ソフトサーボシステムズ株式会社 | モーション制御プログラム、モーション制御方法及びモーション制御装置 |
CN111726201A (zh) * | 2020-06-15 | 2020-09-29 | 哈工大机器人(合肥)国际创新研究院 | 一种airt-ros虚拟网卡丢包解决方法 |
CN112424777A (zh) * | 2018-08-17 | 2021-02-26 | 欧姆龙株式会社 | 用于操作工业个人计算机装置的方法及工业个人计算机装置 |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH10309685A (ja) * | 1997-05-12 | 1998-11-24 | Kawasaki Heavy Ind Ltd | ロボット制御装置 |
KR20030081370A (ko) * | 2000-12-28 | 2003-10-17 | 로보틱 워크스페이스 테크놀로지스, 인크. | 다기능 로봇 제어 시스템 |
KR100520779B1 (ko) * | 2003-01-09 | 2005-10-12 | 삼성중공업 주식회사 | Fpga를 이용한 다 축 위치 제어장치 |
KR20070083460A (ko) * | 2004-07-06 | 2007-08-24 | 엠베디오 인코포레이티드 | 다중 커널을 동시에 실행하는 방법 및 시스템 |
KR20130110289A (ko) * | 2012-03-29 | 2013-10-10 | 주식회사 엔티리서치 | 의료용 수술 로봇 장치 |
- 2016-07-22: WO PCT/KR2016/008040 patent/WO2017052061A1/ko, active Application Filing
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH10309685A (ja) * | 1997-05-12 | 1998-11-24 | Kawasaki Heavy Ind Ltd | ロボット制御装置 |
KR20030081370A (ko) * | 2000-12-28 | 2003-10-17 | 로보틱 워크스페이스 테크놀로지스, 인크. | 다기능 로봇 제어 시스템 |
KR100520779B1 (ko) * | 2003-01-09 | 2005-10-12 | 삼성중공업 주식회사 | Fpga를 이용한 다 축 위치 제어장치 |
KR20070083460A (ko) * | 2004-07-06 | 2007-08-24 | 엠베디오 인코포레이티드 | 다중 커널을 동시에 실행하는 방법 및 시스템 |
KR20130110289A (ko) * | 2012-03-29 | 2013-10-10 | 주식회사 엔티리서치 | 의료용 수술 로봇 장치 |
Non-Patent Citations (1)
Title |
---|
See also references of EP3354417A4 * |
Cited By (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN112424777A (zh) * | 2018-08-17 | 2021-02-26 | 欧姆龙株式会社 | 用于操作工业个人计算机装置的方法及工业个人计算机装置 |
CN112424777B (zh) * | 2018-08-17 | 2023-09-08 | 欧姆龙株式会社 | 工业个人计算机装置及其操作方法 |
JP2020004428A (ja) * | 2019-01-10 | 2020-01-09 | ソフトサーボシステムズ株式会社 | モーション制御プログラム、モーション制御方法及びモーション制御装置 |
JP2020098608A (ja) * | 2019-01-10 | 2020-06-25 | ソフトサーボシステムズ株式会社 | モーション制御プログラム、モーション制御方法及びモーション制御装置 |
JP7303132B2 (ja) | 2019-01-10 | 2023-07-04 | モベンシス株式会社 | モーション制御プログラム、モーション制御方法及びモーション制御装置 |
JP7303061B2 (ja) | 2019-01-10 | 2023-07-04 | モベンシス株式会社 | モーション制御プログラム、モーション制御方法及びモーション制御装置 |
JP2020077430A (ja) * | 2019-01-23 | 2020-05-21 | ソフトサーボシステムズ株式会社 | モーション制御プログラム、モーション制御方法及びモーション制御装置 |
JP7303131B2 (ja) | 2019-01-23 | 2023-07-04 | モベンシス株式会社 | モーション制御プログラム、モーション制御方法及びモーション制御装置 |
CN111726201A (zh) * | 2020-06-15 | 2020-09-29 | 哈工大机器人(合肥)国际创新研究院 | 一种airt-ros虚拟网卡丢包解决方法 |
CN111726201B (zh) * | 2020-06-15 | 2023-09-12 | 合肥哈工轩辕智能科技有限公司 | 一种airt-ros虚拟网卡丢包解决方法 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 16848787 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 2018514281 Country of ref document: JP Kind code of ref document: A |
|
WWE | Wipo information: entry into national phase |
Ref document number: 15762065 Country of ref document: US |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2016848787 Country of ref document: EP |