WO2021085727A1 - Cutting robot system and simulation method therefor - Google Patents
- Publication number
- WO2021085727A1 (PCT/KR2019/016436)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- robot
- cutting robot
- cutting
- virtual
- information
- Prior art date
Classifications
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J11/00—Manipulators not otherwise provided for
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
-
- G—PHYSICS
- G21—NUCLEAR PHYSICS; NUCLEAR ENGINEERING
- G21D—NUCLEAR POWER PLANT
- G21D1/00—Details of nuclear power plant
-
- Y—GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y02—TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
- Y02E—REDUCTION OF GREENHOUSE GAS [GHG] EMISSIONS, RELATED TO ENERGY GENERATION, TRANSMISSION OR DISTRIBUTION
- Y02E30/00—Energy generation of nuclear origin
Definitions
- The following description relates to a cutting robot system and a simulation method thereof.
- A nuclear reactor is a device that continuously maintains and controls nuclear fission, using the heat generated during fission for power generation, or producing basic particles of matter such as neutrons and radiation for scientific research and technology development. Nuclear reactors are mostly used to generate electrical energy and are also used to power ships.
- A nuclear power plant radiation structure refers to a structure, among the discarded structures generated when an operating nuclear reactor is dismantled, that has become radioactive by absorbing neutrons.
- To protect workers from radiation damage, remotely controlled cutting robots have been introduced to perform decommissioning and maintenance work.
- Such a cutting robot cuts the nuclear power plant radiation structure and transports it for disposal.
- The load applied to each joint may vary with the rotation of the joints of the cutting robot, and the load applied to the cutting robot may also vary with the load of the cut piece of the nuclear power plant radiation structure. Accordingly, even for the same cut piece, the load on the cutting robot varies depending on the piece's length and the gripping position, straining the rotation of the robot's joints and potentially leading to damage.
- The purpose of the embodiment is to provide a cutting robot system and a simulation method thereof that implement the robot and the robot's working environment in virtual reality, provide the virtual reality image, and support convenient, seamless simulation with intuitive direct operation through a robot model that interacts with the user's motion while the user checks the robot's working motion in real time in virtual reality.
- A further purpose is to provide a cutting robot system and a simulation method thereof in which robot model information calculating the optimal operating posture of the robot for each operation is stored to ensure stable and accurate operation, and in which the physical information and robot model information are provided when the cutting robot is actually operated so that the cutting robot operates in the optimal posture.
- A cutting robot system according to an embodiment, which provides physical information for cutting a nuclear power plant radiation structure of a nuclear reactor in connection with a virtual reality interface device, will be described.
- The cutting robot system includes: a master device that generates a control signal for controlling the cutting robot and a virtual reality-based virtual cutting robot; a simulator that generates an image of a virtual reality and the virtual cutting robot based on environment model information preset from design information of the reactor and robot model information preset for the cutting robot, generates a motion event of the virtual cutting robot in the virtual environment according to the control signal, and calculates changes in the physical information of a structure object and the joint load information of the virtual cutting robot caused by the interaction between the structure object in virtual reality and the virtual cutting robot; and a database that stores the physical information and joint load information of the simulator.
- The motion event of the virtual cutting robot may include at least one of movement of the virtual cutting robot, movement of the robot arm of the virtual cutting robot, operation of the gripper of the cutting robot, and operation of the plasma unit of the cutting robot.
- The simulator may calculate the rotation of each joint corresponding to the position and posture of the end of the robot arm by inverse kinematics and reflect it in the robot model information.
- the simulator may continuously update the robot model information by calculating a load applied to each joint.
- the simulator may calculate the rotation of each joint so that the load applied to each joint does not exceed a preset range and reflect it in the robot model information.
- A cutting robot that is put into an actual nuclear reactor to measure and cut the radiation structure may further be included, and when the cutting robot is actually operated, its detailed operations may be controlled based on the physical information.
- the environmental model information may be updated in real time with information measured by the cutting robot.
- The simulation method of the virtual reality-based cutting robot system may include: generating an image of a virtual reality and a virtual cutting robot based on environment model information preset from the reactor design information and robot model information preset for the cutting robot; determining whether a control signal for manipulating the cutting robot is input; when the control signal is input, generating a motion event of the virtual cutting robot in virtual reality according to the control signal; determining whether a cutting event between the virtual cutting robot and a cutting object in the virtual reality occurs; generating physical information of the cutting object in virtual reality; and storing the physical information in a database.
- it may further include calculating a load applied to each joint of the virtual cutting robot.
- it may further include calculating the rotation angle of each joint so that the load applied to each joint does not exceed a preset range and reflecting it in the robot model information.
- The embodiment implements the robot and the robot's working environment in virtual reality and provides the virtual reality image; while the user checks the robot's working motion in real time in virtual reality, a robot model interacting with the user's motion supports convenient, seamless simulation and enables intuitive direct operation.
- Robot model information that calculates the optimal operating posture of the robot for each operation is stored to ensure stable and accurate operation, and when the cutting robot is actually operated, the physical information and robot model information are provided so that the cutting robot can operate in the optimal posture.
- FIG. 1 is a block diagram of a virtual reality-based cutting robot system according to an embodiment.
- FIG. 2 is a perspective view of a cutting robot according to an embodiment.
- FIG. 3 is a flowchart illustrating a simulation method of a virtual reality-based cutting robot system according to an embodiment.
- Terms such as first, second, A, B, (a), and (b) may be used. These terms serve only to distinguish one constituent element from another; the nature, sequence, or order of the constituent element is not limited by the term.
- the method according to the embodiment may be implemented in the form of program instructions that can be executed through various computer means and recorded in a computer-readable medium.
- the computer-readable medium may include program instructions, data files, data structures, etc. alone or in combination.
- the program instructions recorded in the medium may be specially designed and configured for the embodiment, or may be known to and usable by a person skilled in computer software.
- Examples of computer-readable recording media include magnetic media such as hard disks, floppy disks, and magnetic tape; optical media such as CD-ROMs and DVDs; magneto-optical media such as floptical disks; and hardware devices specially configured to store and execute program instructions, such as ROM, RAM, and flash memory.
- Examples of program instructions include not only machine language codes such as those produced by a compiler, but also high-level language codes that can be executed by a computer using an interpreter or the like.
- the hardware device described above may be configured to operate as one or more software modules to perform the operation of the embodiment, and vice versa.
- FIG. 1 is a block diagram of a virtual reality-based cutting robot system according to an embodiment.
- FIG. 2 is a perspective view of a cutting robot according to the embodiment.
- the cutting robot system 1 includes a cutting robot 10, a virtual reality interface device 20, a master device 30, a simulator 40, and a database 50.
- The cutting robot 10 is put into the nuclear reactor and operated by remote control in place of a worker, for the worker's safety, to perform maintenance work such as dismantling, replacement, and assembly of the radioactive structure.
- the radioactive structure means at least a part of the structure of a nuclear reactor.
- The cutting robot 10 includes: a moving module 110 having a plurality of wheels 111 and a motor for driving the wheels 111 to move the cutting robot 10; a fixing module 120 having telescopic legs 121 that extend and retract to fix the cutting robot 10 in place; a robot arm 130 having a plurality of joints 133, with a plasma unit 131 for cutting or welding the radioactive structure or a gripper 132 for gripping the radioactive structure provided at the end of the arm; an observation unit 140 provided with a three-dimensional sensor that scans the working place to generate 3D scan information on the working environment; and a communication module 150 that communicates with the master device or the database. Controlled through remote operation, the cutting robot can perform maintenance work on structures in the nuclear power plant.
- the virtual reality interface device 20 may include a head mounted display (HMD) device worn by a user.
- the virtual reality interface device 20 may be composed of a plurality of different devices that provide an image to a user.
- the virtual reality interface device 20 may include a separate communication module (not shown) connected to the simulator.
- the master device 30 may include a controller or a teach pendant that generates a control signal for manipulating the cutting robot 10 and the virtual reality-based virtual cutting robot by a user's manipulation.
- the master device 30 may be capable of generating a control signal by recognizing a user's motion.
- The master device 30 generates a control signal according to the user's manipulation, or generates a control signal by automatically recognizing the user's motion, and then transmits it to the simulator 40 through a separate communication module (not shown).
- the simulator 40 generates an image of a virtual reality and a virtual cutting robot based on environment model information preset as the reactor design information and robot model information preset for the cutting robot.
- the simulator 40 may generate a motion event according to a control signal of the virtual cutting robot to operate the virtual cutting robot in virtual reality.
- the simulator 40 includes a communication unit 410, an environment modeling unit 420, a robot modeling unit 430, and a control unit 440.
- The communication unit 410 is directly connected to and communicates with the virtual reality interface device 20, the master device 30, and the database 50 by means such as Bluetooth, cable, Wi-Fi Direct, and WMB.
- the present invention is not limited thereto, and the communication unit 410 may be connected to the Internet to communicate with the virtual reality interface device 20, the master device 30, and the database 50.
- the environment modeling unit 420 generates environment model information based on the design information of the reactor previously stored in the database 50 and generates a virtual reality image based on this.
- The reactor design information may include at least one of the design drawing of the reactor and the material, location, shape, density, melting point, and weight of the radioactive structure.
- The design information of the reactor is based on the design drawings but may change due to design changes during maintenance and construction.
- the environmental modeling unit 420 may update and provide environmental model information in real time through information observed from a cutting robot that is actually put into a nuclear reactor.
- the environment modeling unit 420 includes a collection unit 421 and an environment information generation unit 422.
- The collection unit 421 collects the design information of the reactor previously stored in the database 50 and the observation information observed by the cutting robot 10. For example, the collection unit 421 collects design information by being connected to the database 50 either directly through the communication unit 410 (using Bluetooth or Wi-Fi Direct) or through an Internet network. Likewise, the collection unit 421 may be connected to the cutting robot 10 directly or through an Internet network to collect observation information.
- The environment information generation unit 422 generates a virtual reality image by implementing the real reactor as a 3D reactor image in virtual reality from the design information. For example, the environment information generation unit 422 receives 2D design drawings drawn on a plane and 3D design drawings from the collection unit, generates 3D point-cloud information (or point cloud data) for the virtual reality, generates a depth image based on the depth information included in the point-cloud information, and applies a preset image coordinate system to the depth image.
- The environment information generation unit 422 may then cluster the point-cloud information into blocks based on the image coordinates of the depth image, generating a plurality of different clusters, and model the entire structure as a 3D image by implementing the point-cloud information belonging to each cluster as a single polygon or hexahedron through a preset algorithm.
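As an illustration of the block-wise clustering just described, the following is a minimal sketch (the function name, grid-based clustering, and `block_size` parameter are assumptions for illustration, not taken from the patent) that groups point-cloud samples into grid blocks and replaces each block's points with a single bounding hexahedron:

```python
from collections import defaultdict

def cluster_to_hexahedra(points, block_size=1.0):
    """Group 3D points by grid cell, then return one axis-aligned
    bounding box (hexahedron) per non-empty cell, as a pair
    ((min_x, min_y, min_z), (max_x, max_y, max_z))."""
    cells = defaultdict(list)
    for x, y, z in points:
        key = (int(x // block_size), int(y // block_size), int(z // block_size))
        cells[key].append((x, y, z))
    boxes = []
    for pts in cells.values():
        xs, ys, zs = zip(*pts)
        boxes.append(((min(xs), min(ys), min(zs)), (max(xs), max(ys), max(zs))))
    return boxes
```

Collapsing each cluster to one hexahedron is what keeps the environment model's data volume and modeling time small, as the passage below notes.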
- the environmental information generation unit 422 may receive 3D scan information from the 3D sensor of the observation unit 140 provided in the cutting robot 10 that is actually put into the nuclear reactor, and update a previously implemented virtual reality 3D image.
- The three-dimensional scan information may consist of 3D point-cloud information (or point-cloud data) on the work environment, with depth information included in the point-cloud information. It is implemented into a virtual reality 3D image in the same way as the design information, but for unchanged information the previously implemented 3D image is kept, and the 3D image is updated only for the changed parts.
- The design information further includes at least one item of physical information on the radioactive structure object: material, location, shape, volume, density, melting point, weight, strength, and hardness.
- the environment information generation unit 422 may generate environment model information composed of a 3D image including physical information in a structure object.
- Because the environment information generation unit 422 compresses the point-cloud information belonging to each cluster of the work environment into a polygon or hexahedron for modeling, the environment model information generated from the point-cloud information keeps its data volume small while greatly reducing the computation time required for modeling, supporting rapid modeling of the work environment.
- the environmental information generation unit 422 may generate environmental model information identical to the actual nuclear reactor by implementing modeling based on the design information and 3D scan information measured by the actual cutting robot.
- the robot modeling unit 430 includes a robot model generation unit 431, a joint load calculation unit 432, and a joint rotation unit 433.
- the robot model generation unit 431 generates an image of a virtual cutting robot based on preset robot model information. For example, the robot model generation unit 431 generates a virtual cutting robot image composed of 3D graphic information based on the design information of the robot previously stored in the database 50. In addition, the robot model generation unit 431 generates an image of the virtual cutting robot including part information such as material, weight, size, volume, density, and rotation information of the joint for each part of the virtual cutting robot.
- When the robot model generation unit 431 receives the control signal of the master device through the communication unit, it extracts a motion event corresponding to the control signal from the database. For example, when the control signal is received, the robot model generation unit 431 extracts from the database 50 a previously stored motion event including at least one of movement of the virtual cutting robot, movement of the robot arm of the virtual cutting robot, operation of the gripper, and operation of the plasma unit. The robot model generation unit 431 then generates an image of the virtual cutting robot corresponding to the extracted motion event.
- The robot model generation unit 431 calculates the rotation of each joint corresponding to the position and posture of the end of the robot arm by inverse kinematics and reflects it in the robot model information to generate the image of the virtual cutting robot.
- That is, the robot model generation unit 431 first expresses a trajectory corresponding to the position and posture of the end of the robot arm of the virtual cutting robot, then converts it into joint trajectories of the robot, creating the motor-driven movement of the virtual cutting robot and generating its image.
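The patent does not specify the arm's kinematic structure, so the inverse-kinematics step can only be illustrated on an assumed model. The following sketch solves the classic analytic case of a planar two-joint arm (function names and link lengths are assumptions), computing joint angles from a desired end position and checking them with forward kinematics:

```python
import math

def two_link_ik(x, y, l1=1.0, l2=1.0):
    """Analytic inverse kinematics for a planar 2-joint arm: given the
    desired end-effector position (x, y) and link lengths l1, l2,
    return joint angles (t1, t2) in radians (elbow-down solution)."""
    d2 = x * x + y * y
    c2 = (d2 - l1 * l1 - l2 * l2) / (2 * l1 * l2)  # law of cosines
    if not -1.0 <= c2 <= 1.0:
        raise ValueError("target out of reach")
    t2 = math.acos(c2)
    t1 = math.atan2(y, x) - math.atan2(l2 * math.sin(t2), l1 + l2 * math.cos(t2))
    return t1, t2

def forward(t1, t2, l1=1.0, l2=1.0):
    """Forward kinematics, used here to verify the IK solution."""
    return (l1 * math.cos(t1) + l2 * math.cos(t1 + t2),
            l1 * math.sin(t1) + l2 * math.sin(t1 + t2))
```

A 6-joint arm such as the one implied by FIG. 2 would need a numerical or vendor-supplied solver, but the principle (end position and posture in, joint rotations out) is the same.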
- the robot model generation unit may generate an image of a virtual cutting robot by superimposing an image of a virtual cutting robot on a virtual reality image, and may visually provide the superimposed image to a user by transmitting the superimposed image to a virtual reality interface device.
- The joint load calculation unit 432 calculates the load on each joint of the generated virtual cutting robot. For example, the joint load calculation unit 432 continuously calculates the load on each joint for the movement of the robot arm, starting from the initial loading of the virtual cutting robot.
- The joint load calculation unit 432 can calculate the load applied to each joint by substituting at least one of the three-dimensional coordinate position, velocity, and acceleration of each joint of the robot arm and the weight and size of a structure object into preset static and dynamic equations.
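The static part of such a calculation can be sketched as follows. This is a simplified illustration under assumptions not stated in the patent (a horizontally outstretched serial arm, uniform links whose weight acts at the link midpoint, payload at the arm's end); the function name is hypothetical:

```python
def joint_torques(link_lengths, link_masses, payload_mass, g=9.81):
    """Static torque (N*m) at each joint of a horizontally outstretched
    serial arm: each joint carries the weight of every link beyond it
    (acting at that link's midpoint) plus the payload at the arm's end."""
    n = len(link_lengths)
    torques = []
    for j in range(n):
        torque = 0.0
        reach = 0.0  # distance from joint j to the start of link i
        for i in range(j, n):
            torque += link_masses[i] * g * (reach + link_lengths[i] / 2.0)
            reach += link_lengths[i]
        torque += payload_mass * g * reach  # gripped structure object
        torques.append(torque)
    return torques
```

The outstretched pose is the worst case; the dynamic terms (velocity and acceleration) mentioned above would add to these static values.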
- The joint rotation unit 433 compares the joint load calculated by the joint load calculation unit with a preset value (for example, the allowable joint load that does not damage the joint) and calculates the rotation value of each joint so that the calculated load does not exceed the preset value, reflecting it in the robot model information.
- In other words, the robot model generation unit 431 controls the rotation of the joints so that the load does not exceed the preset value: while maintaining the position of the end of the robot arm of the virtual cutting robot, the rotation of each joint is changed so that the load on each joint in the resulting posture stays within the preset value.
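A minimal sketch of the posture selection implied here (the function name and the candidate-set representation are assumptions): among alternative joint configurations that reach the same end-effector position, discard those exceeding the allowable joint load and keep the one with the smallest peak load:

```python
def select_posture(candidates, load_limit):
    """Each candidate is (joint_angles, per_joint_loads), all reaching
    the same end position. Drop candidates whose largest joint load
    exceeds load_limit; return the angles of the candidate whose peak
    joint load is smallest."""
    feasible = [(angles, loads) for angles, loads in candidates
                if max(loads) <= load_limit]
    if not feasible:
        raise ValueError("no posture keeps every joint within the load limit")
    return min(feasible, key=lambda c: max(c[1]))[0]
```

In practice the candidates would come from the redundant solutions of the inverse kinematics; the sketch only shows the comparison against the preset limit.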
- The joint rotation unit 433 transmits the load applied to each joint to the robot model generation unit 431 for continuous updating, and generates an optimized rotation value so that the load at each joint rotation angle is minimized.
- the joint rotation unit 433 reflects the optimized rotation value to the robot model information and stores it in a database.
- the robot model generation unit 431 may regenerate the image according to the updated robot model information.
- The control unit 440 integrates the images of the environment modeling unit 420 and the robot modeling unit 430 into a single image, receives the control signal, operates the virtual cutting robot in the virtual reality environment, and extracts interaction events with structure objects. When a cutting event on a structure object occurs, the control unit divides the structure object into cut pieces, calculates physical information such as material, volume, mass, and center of gravity, and provides it to the environment modeling unit 420 and the database 50 to regenerate the structure object.
- the control unit 440 includes an event extraction unit 441, an object regeneration unit 442, and a physical information operation unit 443.
- the event extraction unit 441 extracts an interaction event between the structure object and the virtual cutting robot from the virtual cutting robot image and the virtual reality image. For example, the event extraction unit 441 extracts an interaction event by determining the overlap of the virtual cutting robot and the structure object.
- the interaction event may include a collision event, a cutting event, and a grip event.
- the collision event is an event that occurs when the structure object and the virtual cutting robot overlap.
- The robot modeling unit may limit the motion of the virtual cutting robot so that it does not overlap the structure object when the two come into contact.
- the cutting event is an event in which the plasma unit of the virtual cutting robot cuts through a structure object.
- the robot modeling unit 430 moves the plasma unit generating plasma of a predetermined intensity to pass through the structure object.
- the joint load calculation unit 432 calculates by adding a reaction force in the direction opposite to the direction in which the cutting event occurs, and the joint rotation unit 433 calculates the rotation angle of each joint according to the calculated load, and responds thereto.
- the robot modeling unit 430 rotates each joint.
- the cutting load is a value previously stored in the database 50 and may be a repulsive force against cutting that actually occurs according to the material and melting point of the structure object.
- The environment information generation unit 422 divides the points of the structure object along the plane through which the plasma unit 131 has passed, using the equation of the plane, fills the empty space of the cut surface using Delaunay triangulation, regenerates physical information such as the shape, density, volume, weight, and center of gravity of the structure object, and stores it in the database.
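The plane-splitting step can be sketched as follows (function names are assumptions; the Delaunay-based resurfacing of the cut face and the exact density/weight recomputation are not shown): points of the structure object are partitioned by the signed distance to the cutting plane a·x + b·y + c·z + d = 0, and each piece's center of gravity is approximated from its points:

```python
def split_by_plane(points, a, b, c, d):
    """Partition sample points of a structure object by the cutting
    plane a*x + b*y + c*z + d = 0: points on or above the plane go
    into one piece, the rest into the other."""
    above, below = [], []
    for x, y, z in points:
        (above if a * x + b * y + c * z + d >= 0 else below).append((x, y, z))
    return above, below

def centroid(points):
    """Approximate a piece's center of gravity as the mean of its points."""
    n = len(points)
    return tuple(sum(p[i] for p in points) / n for i in range(3))
```

From such pieces, volume and weight could be estimated (for example per occupied voxel), with the Delaunay triangles closing the open cut surface.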
- the grip event is an event that grips a structure object using a gripper of a virtual cutting robot.
- The joint load calculation unit calculates the torsion and gravity loads according to the shape, volume, weight, and center of gravity of the structure object and the gripping position of the gripper, calculates the load on the joints of the robot arm holding the gripper of the virtual cutting robot, and provides the calculated value to the joint rotation unit so that the optimal rotation angle of each joint can be calculated.
- The physical information and joint load information generated when each event occurs are stored in the database 50 and may be provided when the actual cutting robot 10 is controlled. For example, the actual cutting robot 10 can be controlled with the master device 30 by providing only the position of the end of the robot arm as a control signal, while the joint rotation angles stored in the database as values optimized in virtual reality are provided to the actual cutting robot 10.
- Since the physical information and the joint rotation angle values are provided to the actual cutting robot 10, the weight according to the shape and size of the structure is known in advance and the operation is optimized.
- FIG. 3 is a flowchart illustrating a simulation method of a virtual reality-based cutting robot system according to an embodiment.
- The simulation method of the virtual reality-based cutting robot system 1 includes: generating an image (61); calculating the load (62); calculating the rotation angle of each joint (63); determining input of a control signal for manipulating the virtual reality-based virtual cutting robot (64); generating a motion event of the virtual cutting robot (65); determining whether a cutting event has occurred (66); generating physical information of the cutting object (67); and storing the physical information in the database 50 (68).
- In the step of generating the image (61), an image of the virtual reality and the virtual cutting robot is generated based on the environment model information preset from the reactor design information and the robot model information preset for the cutting robot 10.
- In the step of calculating the load (62), the load applied to each joint of the virtual cutting robot is calculated from the generated image.
- In the step of calculating the rotation angle of each joint (63), the load applied to each joint of the virtual cutting robot is compared with a preset value, and the rotation angle of each joint is calculated so that the load applied to each joint does not exceed the preset range.
- The calculated joint rotation angle is reflected in the robot model information and stored in the database. The robot model generation unit then updates the image of the virtual cutting robot according to the robot model information.
- In the step of determining input of a control signal (64), the robot model generation unit determines whether a control signal has been input through the communication unit.
- When there is an input control signal, the robot model generation unit 431 determines the type of the control signal.
- The type of the control signal may be one of a plurality of signals respectively corresponding to motion events, including movement of the virtual cutting robot, movement of the robot arm of the virtual cutting robot, operation of the gripper, and operation of the plasma unit.
- In the step of generating a motion event of the virtual cutting robot (65), when a control signal is input, a motion event is generated according to the control signal to operate the virtual cutting robot in virtual reality and update the image.
- In the step of determining whether a cutting event has occurred (66), it is determined whether a cutting event between the virtual cutting robot and the cutting object in virtual reality has occurred.
- For example, the event extraction unit 441 detects collision, cutting, and grip events while plasma operation and robot arm movement events occur, and determines whether a cutting event has occurred.
- In the step of generating physical information (67), physical information of the cut structure object is generated according to the cutting event in virtual reality.
- For example, the environment information generation unit 422 divides the points of the structure object along the plane through which the plasma unit 131 has passed, using the equation of the plane, fills the empty space of the cut surface using Delaunay triangulation, and regenerates physical information such as the shape, volume, weight, and center of gravity of the structure object.
- In the step of storing the physical information (68), the environment information generation unit 422 stores the generated physical information in the database.
- In this way, the optimal joint rotation angles for the motions of the cutting robot and the shapes of structure objects that are easy to transport after cutting are derived, updated, and stored in the database, so that the actual cutting robot can be manipulated accordingly.
- Operation with the same control signal is thus simple, yet the optimized cutting shape of the structure object can be found from the stored cutting-shape information, and by optimizing the joint rotation angles against the load applied to each joint during operation, the load on each joint can be minimized.
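The overall method can be summarized in a hypothetical event loop (all names, the signal-to-event mapping, and the dict standing in for the database 50 are illustrative assumptions): each control signal produces a motion event, and a cutting event additionally records a placeholder physical-information entry for the newly cut piece:

```python
def simulate(signals, database):
    """Toy version of the simulation loop: map control signals to motion
    events, log them, and record physical information on cutting events."""
    event_map = {"move": "move_robot", "arm": "move_arm",
                 "grip": "grip", "plasma": "cut"}
    for sig in signals:
        event = event_map.get(sig)
        if event is None:
            continue  # unrecognized control signal: no motion event
        database.setdefault("events", []).append(event)
        if event == "cut":
            # placeholder physical information for the cut structure object
            database.setdefault("physical_info", []).append(
                {"volume": None, "mass": None, "center_of_gravity": None})
    return database
```

The real system would fill the placeholder fields using the plane-splitting and joint-load calculations described above.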
Abstract
Description
Claims (10)
- 가상현실 인터페이스 장치와 연동하여, 원자로의 원전 방사 구조물의 절단을 위한 물리정보를 제공하는 절단로봇 시스템에 있어서, In a cutting robot system that provides physical information for cutting a nuclear power plant radiation structure of a nuclear reactor in connection with a virtual reality interface device,상기 절단로봇 및 가상현실 기반 가상 절단로봇을 조종하기 위한 제어신호를 생성하는 마스터 장치;A master device for generating a control signal for controlling the cutting robot and the virtual reality-based virtual cutting robot;상기 원자로의 설계정보로 미리 설정된 환경 모델 정보 및 상기 절단로봇에 대한 미리 설정된 로봇 모델 정보를 기초로 가상 현실 및 가상 절단로봇의 영상을 생성하고, 상기 가상 절단로봇을 제어신호에 따라 가상환경 상에서 동작이벤트를 생성하고, 가상 현실상의 구조물 오브젝트와 상기 가상 절단로봇의 상호작용에 의한 구조물 오브젝트의 물리정보 및 상기 가상 절단로봇의 관절하중정보 변화를 연산하는 시뮬레이터; 및An image of a virtual reality and a virtual cutting robot is generated based on the environmental model information set in advance as the reactor design information and the robot model information set in advance for the cutting robot, and the virtual cutting robot is operated in a virtual environment according to a control signal. A simulator for generating an event and calculating a change in physical information of a structure object and joint load information of the virtual cutting robot by an interaction between the structure object in virtual reality and the virtual cutting robot; And상기 시뮬레이터의 물리정보 및 관절하중 정보를 저장하는 데이터 베이스;A database for storing physical information and joint load information of the simulator;를 포함하는 절단로봇 시스템.Cutting robot system comprising a.
- 제1항에 있어서,The method of claim 1,상기 가상 절단로봇의 동작이벤트는, The motion event of the virtual cutting robot,상기 가상 절단로봇의 이동, 가상 절단로봇의 로봇암의 이동, 상기 절단로봇의 그리퍼의 동작 및 상기 절단로봇의 플라즈마의 작동 중 적어도 하나 이상을 포함하는 절단로봇 시스템.A cutting robot system comprising at least one of a movement of the virtual cutting robot, a movement of a robot arm of the virtual cutting robot, an operation of a gripper of the cutting robot, and an operation of plasma of the cutting robot.
- The system of claim 2, wherein, when a motion event corresponding to movement of the robot arm occurs, the simulator calculates the rotation of each joint corresponding to the position and posture of the end of the robot arm by inverse kinematics and reflects it in the robot model information.
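The inverse-kinematics step of claim 3 (solving each joint's rotation from a target position and posture of the robot arm's end) can be illustrated with a minimal two-link planar arm. The link lengths, function name, and elbow-down branch are illustrative assumptions, not the patent's implementation:

```python
import math

def two_link_ik(x, y, l1=1.0, l2=0.8):
    """Closed-form inverse kinematics for a planar 2-link arm.

    Given a target end position (x, y) and link lengths l1, l2,
    return the two joint angles in radians (elbow-down branch).
    """
    r2 = x * x + y * y
    # Law of cosines gives the elbow angle.
    c2 = (r2 - l1 * l1 - l2 * l2) / (2 * l1 * l2)
    if not -1.0 <= c2 <= 1.0:
        raise ValueError("target out of reach")
    q2 = math.acos(c2)
    # Shoulder angle: target direction minus the elbow's offset.
    q1 = math.atan2(y, x) - math.atan2(l2 * math.sin(q2),
                                       l1 + l2 * math.cos(q2))
    return q1, q2
```

A real manipulator with more joints would use a numerical solver, but the principle — recovering joint rotations from the end-effector pose and writing them back into the robot model — is the same.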
- The system of claim 3, wherein the simulator calculates the load applied to each joint and continuously updates the robot model information with it.
- The system of claim 4, wherein the simulator calculates the rotation of each joint so that the load applied to each joint does not exceed a preset range, and reflects it in the robot model information.
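Claims 4 and 5 describe computing the load on each joint and keeping it within a preset range. A minimal static sketch, assuming a planar two-link arm under gravity with made-up link masses and torque limits (none of these values come from the patent):

```python
import math

# Illustrative link data and per-joint torque limits (assumptions).
LINKS = [(1.0, 5.0), (0.8, 3.0)]  # (length in m, mass in kg) per link
PAYLOAD = 10.0                    # kg carried at the end effector
TORQUE_LIMIT = [120.0, 60.0]      # preset range per joint, N*m
G = 9.81

def joint_torques(q):
    """Static gravity torque at each joint for a planar arm posture q."""
    pts = [(0.0, 0.0)]
    ang = 0.0
    for (l, _), qi in zip(LINKS, q):
        ang += qi
        x, y = pts[-1]
        pts.append((x + l * math.cos(ang), y + l * math.sin(ang)))
    torques = []
    for i in range(len(LINKS)):
        tau = 0.0
        # Each distal link's weight acts at its midpoint.
        for j in range(i, len(LINKS)):
            mid_x = (pts[j][0] + pts[j + 1][0]) / 2
            tau += LINKS[j][1] * G * (mid_x - pts[i][0])
        # The payload acts at the tip.
        tau += PAYLOAD * G * (pts[-1][0] - pts[i][0])
        torques.append(tau)
    return torques

def within_limits(q):
    """True if the posture keeps every joint load inside its preset range."""
    return all(abs(t) <= lim
               for t, lim in zip(joint_torques(q), TORQUE_LIMIT))
```

A motion planner in the spirit of claim 5 would reject or adjust candidate joint rotations for which `within_limits` is false before writing them into the robot model; a fully horizontal posture overloads the shoulder here, while a vertical one does not.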
- The system of claim 1, further comprising a cutting robot that is deployed into an actual nuclear reactor to measure and cut the radiation structure, wherein, during actual operation of the cutting robot, the detailed motion of the cutting robot is controlled based on the physical information.
- The system of claim 6, wherein the environment model information is updated in real time with the information measured by the cutting robot.
- A simulation method for a virtual-reality-based cutting robot system, comprising: generating images of a virtual reality and a virtual cutting robot based on environment model information preset from the design information of a reactor and robot model information preset for the cutting robot; determining whether a control signal for operating the virtual cutting robot has been input; when the control signal is input, generating a motion event of the virtual cutting robot in the virtual reality according to the control signal; determining whether a cutting event between the virtual cutting robot and a cutting object in the virtual reality has occurred; generating physical information of the cutting object in the virtual reality; and storing the physical information in a database.
- The method of claim 8, further comprising calculating the load applied to each joint of the virtual cutting robot.
- The method of claim 8, further comprising calculating the rotation angle of each joint so that the load applied to each joint does not exceed a preset range, and reflecting it in the robot model information.
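The method steps of claims 8 through 10 can be sketched as one simulation tick. All class, method, and field names below are hypothetical stand-ins for the claimed components (simulator, motion event, cutting event, database), not the patent's API:

```python
class CuttingSim:
    """Minimal sketch of the claim-8 loop: input check, motion event,
    cut detection, physical-information generation, persistence."""

    def __init__(self, env_model, robot_model):
        self.env_model = env_model      # preset from reactor design info
        self.robot_model = robot_model  # preset for the cutting robot
        self.database = []              # stands in for the database

    def step(self, control_signal):
        # 1) Determine whether a control signal was input.
        if control_signal is None:
            return None
        # 2) Generate a motion event in the virtual environment
        #    (here the signal itself plays the role of the event).
        # 3) Determine whether a cutting event occurred.
        if control_signal.get("plasma_on") and control_signal.get("touching"):
            # 4) Generate physical information of the cut object and
            # 5) store it in the database.
            physical = {"cut_at": control_signal["position"]}
            self.database.append(physical)
            return physical
        return None
```

Claims 9 and 10 would extend `step` with a joint-load computation and a limit check on the candidate joint rotations before the motion event is committed to the robot model.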
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR20190134344 | 2019-10-28 | |
KR10-2019-0134344 | 2019-10-28 | |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2021085727A1 (en) | 2021-05-06 |
Family
ID=75716392
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/KR2019/016436 (WO2021085727A1) | Cutting robot system and simulation method therefor | 2019-10-28 | 2019-11-27 |
Country Status (1)
Country | Link |
---|---|
WO (1) | WO2021085727A1 (en) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN115981178A (en) * | 2022-12-19 | 2023-04-18 | 广东若铂智能机器人有限公司 | Simulation system and method for fish and aquatic product slaughtering |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH0752068A (en) * | 1993-08-17 | 1995-02-28 | Fujitsu Ltd | Remote control system |
KR20110041950A (en) * | 2009-10-16 | 2011-04-22 | 삼성전자주식회사 | Teaching and playback method using redundancy resolution control for manipulator |
KR20140104917A (en) * | 2013-02-21 | 2014-08-29 | 가부시키가이샤 야스카와덴키 | Robot simulator, robot teaching apparatus and robot teaching method |
US20160114418A1 (en) * | 2014-10-22 | 2016-04-28 | Illinois Tool Works Inc. | Virtual reality controlled mobile robot |
KR20190048589A (en) * | 2017-10-31 | 2019-05-09 | 충남대학교산학협력단 | Apparatus and method for dual-arm robot teaching based on virtual reality |
- 2019-11-27: PCT application PCT/KR2019/016436 filed (WO2021085727A1, status: active, Application Filing)
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN106393049B (en) | A kind of robot for high-risk operations | |
CN102778886B (en) | Planar simulation and verification platform for four-degree-of-freedom robot arm control system | |
CN205075054U (en) | A robot for high -risk operation | |
CN109972674B (en) | Unmanned excavation system and method under complex construction environment based on natural interaction | |
CN109760860A (en) | The ground system test of non-cooperation rolling target is arrested in a kind of both arms collaboration | |
CN111598273B (en) | VR (virtual reality) technology-based maintenance detection method and device for environment-friendly life protection system | |
WO2021085727A1 (en) | Cutting robot system and simulation method therefor | |
CN111338287A (en) | Robot motion control method, device and system, robot and storage medium | |
CN108582031A (en) | A kind of hot line robot branch based on force feedback master & slave control connects gage lap method | |
CN114488848A (en) | Unmanned aerial vehicle autonomous flight system and simulation experiment platform for indoor building space | |
Berns et al. | Chapter unmanned ground robots for rescue tasks | |
Santamaria et al. | Teleoperated robots for live power lines maintenance (ROBTET) | |
Sirouspour et al. | Suppressing operator-induced oscillations in manual control systems with movable bases | |
Batsomboon et al. | A survey of telesensation and teleoperation technology with virtual reality and force reflection capabilities | |
Bejczy | Toward advanced teleoperation in space | |
Rossmann et al. | A virtual testbed for human-robot interaction | |
CN115809508A (en) | Aircraft landing gear mechanical system modeling method, equipment and storage medium | |
Bostelman et al. | RCS-based RoboCrane integration | |
Kleer et al. | Driving simulations for commercial vehicles-A technical overview of a robot based approach | |
Kim et al. | Robotic virtual manipulations of a nuclear hot‐cell digital mock‐up system | |
Machida et al. | Development of a graphic simulator augmented teleoperation system for space applications | |
Constantinescu et al. | Haptic feedback using local models of interaction | |
Cichon et al. | Towards a 3D simulation-based operator interface for teleoperated robots in disaster scenarios | |
Popescu et al. | Dextrous Haptic Interface for JACK™ | |
CN108544215A (en) | A kind of hot line robot arrester replacing method based on force feedback master & slave control |
Legal Events
Date | Code | Title | Description
---|---|---|---
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 19950694; Country of ref document: EP; Kind code of ref document: A1 |
| NENP | Non-entry into the national phase | Ref country code: DE |
| 122 | Ep: pct application non-entry in european phase | Ref document number: 19950694; Country of ref document: EP; Kind code of ref document: A1 |
| 32PN | Ep: public notification in the ep bulletin as address of the addressee cannot be established | Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205 DATED 14.12.2022). |