WO2021085727A1 - Cutting robot system and simulation method thereof - Google Patents

Cutting robot system and simulation method thereof

Info

Publication number
WO2021085727A1
WO2021085727A1 PCT/KR2019/016436 KR2019016436W WO2021085727A1 WO 2021085727 A1 WO2021085727 A1 WO 2021085727A1 KR 2019016436 W KR2019016436 W KR 2019016436W WO 2021085727 A1 WO2021085727 A1 WO 2021085727A1
Authority
WO
WIPO (PCT)
Prior art keywords
robot
cutting robot
cutting
virtual
information
Prior art date
Application number
PCT/KR2019/016436
Other languages
English (en)
Korean (ko)
Inventor
채장범
김성신
김재한
정진욱
이종화
Original Assignee
주식회사 엠앤디
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 주식회사 엠앤디
Publication of WO2021085727A1

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J11/00Manipulators not otherwise provided for
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls
    • GPHYSICS
    • G21NUCLEAR PHYSICS; NUCLEAR ENGINEERING
    • G21DNUCLEAR POWER PLANT
    • G21D1/00Details of nuclear power plant
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02EREDUCTION OF GREENHOUSE GAS [GHG] EMISSIONS, RELATED TO ENERGY GENERATION, TRANSMISSION OR DISTRIBUTION
    • Y02E30/00Energy generation of nuclear origin

Definitions

  • The following description relates to a cutting robot system and a simulation method thereof.
  • A nuclear reactor is a device that continuously maintains and controls nuclear fission, using the heat generated during fission for power generation or producing particles such as neutrons and radiation for scientific research and technology development. Nuclear reactors are mostly used to generate electrical energy and are also used to power ships.
  • A nuclear power plant radioactive structure refers to a structure that has become radioactive by absorbing neutrons, that is, a structure exhibiting radioactivity among the discarded structures generated when an operating nuclear reactor is dismantled.
  • A remotely controllable cutting robot has been introduced to protect workers from radiation exposure while decommissioning or maintenance work is performed.
  • Such a cutting robot cuts the radioactive structure of the nuclear power plant and transports the pieces so that they can be disposed of.
  • The load applied to each joint varies with the rotation of the joints of the cutting robot, and also with the weight of the cut piece of the radioactive structure being handled. Even for the same cut piece, the load applied to the cutting robot differs according to its length and the gripping position, so the joints may be overloaded during rotation, which may lead to damage.
  • An object of the embodiment is to implement the robot and its working environment in virtual reality and provide the resulting virtual reality image, so that the user can check the robot's working motion in real time through a robot model that interacts with the user's motion, and to provide a cutting robot system and a simulation method thereof that support convenient simulation and intuitive direct operation.
  • Another object is to store robot model information that calculates the optimal motion of the robot for each operation so that stable and accurate operation is ensured, and to provide a cutting robot system and a simulation method thereof in which the stored physical information and robot model information are supplied to the cutting robot when it is actually operated, so that the cutting robot moves in its optimal form.
  • A cutting robot system according to an embodiment, which works in connection with a virtual reality interface device to provide physical information for cutting a radioactive structure of a nuclear reactor, is described below.
  • The cutting robot system includes a master device that generates a control signal for controlling the cutting robot and a virtual reality-based virtual cutting robot; a simulator that generates a virtual reality image and a virtual cutting robot based on environment model information preset from the design information of the reactor and robot model information preset for the cutting robot, generates a motion event of the virtual cutting robot in the virtual environment according to the control signal, and calculates changes in the physical information of a structure object in the virtual reality and in the joint load information of the virtual cutting robot resulting from the interaction between the structure object and the virtual cutting robot; and a database that stores the physical information and joint load information from the simulator.
  • The motion event of the virtual cutting robot may include at least one of movement of the virtual cutting robot, movement of the robot arm of the virtual cutting robot, operation of the gripper of the cutting robot, and operation of the plasma unit of the cutting robot.
  • The simulator may calculate the rotation of each joint corresponding to the position and posture of the end of the robot arm by inverse kinematics and reflect it in the robot model information.
  • the simulator may continuously update the robot model information by calculating a load applied to each joint.
  • the simulator may calculate the rotation of each joint so that the load applied to each joint does not exceed a preset range and reflect it in the robot model information.
  • A cutting robot that is put into the actual nuclear reactor to measure and cut the radioactive structure may further be included, and when the cutting robot is actually operated, its detailed motions may be controlled based on the stored physical information.
  • the environmental model information may be updated in real time with information measured by the cutting robot.
  • The simulation method of the virtual reality-based cutting robot system may include: generating a virtual reality image and a virtual cutting robot based on environment model information preset from the reactor design information and robot model information preset for the cutting robot; determining whether a control signal for manipulating the cutting robot has been input; when the control signal is input, generating a motion event of the virtual cutting robot in the virtual reality according to the control signal; determining whether a cutting event between the virtual cutting robot and a cutting object in the virtual reality occurs; generating physical information of the cutting object in the virtual reality; and storing the physical information in a database.
  • it may further include calculating a load applied to each joint of the virtual cutting robot.
  • it may further include calculating the rotation angle of each joint so that the load applied to each joint does not exceed a preset range and reflecting it in the robot model information.
  • According to the embodiment, the robot and its working environment are implemented in virtual reality and provided as a virtual reality image, so that the user can check the robot's working motion in real time through a robot model that interacts with the user's motion, enabling convenient simulation and intuitive direct operation.
  • Robot model information that calculates the optimal motion of the robot for each operation is stored so that stable and accurate operation is ensured; when the cutting robot is actually operated, the stored physical information and robot model information are provided so that the cutting robot moves in its optimal form.
  • FIG. 1 is a block diagram of a virtual reality-based cutting robot system according to an embodiment.
  • FIG. 2 is a perspective view of a cutting robot according to an embodiment.
  • FIG. 3 is a flowchart illustrating a simulation method of a virtual reality-based cutting robot system according to an embodiment.
  • Terms such as first, second, A, B, (a), and (b) may be used in describing the constituent elements. These terms are only for distinguishing one constituent element from another, and the nature, sequence, or order of the constituent elements is not limited by the terms.
  • the method according to the embodiment may be implemented in the form of program instructions that can be executed through various computer means and recorded in a computer-readable medium.
  • the computer-readable medium may include program instructions, data files, data structures, etc. alone or in combination.
  • the program instructions recorded in the medium may be specially designed and configured for the embodiment, or may be known to and usable by a person skilled in computer software.
  • Examples of computer-readable recording media include magnetic media such as hard disks, floppy disks, and magnetic tape; optical media such as CD-ROMs and DVDs; magneto-optical media such as floptical disks; and hardware devices specially configured to store and execute program instructions, such as ROM, RAM, and flash memory.
  • Examples of program instructions include not only machine language codes such as those produced by a compiler, but also high-level language codes that can be executed by a computer using an interpreter or the like.
  • the hardware device described above may be configured to operate as one or more software modules to perform the operation of the embodiment, and vice versa.
  • FIG. 1 is a block diagram of a virtual reality-based cutting robot system according to an embodiment
  • FIG. 2 is a perspective view of a cutting robot according to the embodiment.
  • the cutting robot system 1 includes a cutting robot 10, a virtual reality interface device 20, a master device 30, a simulator 40, and a database 50.
  • The cutting robot 10 is put into the nuclear reactor by remote control in place of an operator, for the operator's safety, to perform maintenance work such as dismantling, replacing, and assembling the radioactive structure.
  • the radioactive structure means at least a part of the structure of a nuclear reactor.
  • The cutting robot 10 includes a moving module 110 having a plurality of wheels 111 and a motor for driving the wheels 111 to move the cutting robot 10, and a fixing module 120 having extendable legs 121 whose ends fix the cutting robot 10 in place so that it does not move.
  • The cutting robot 10 further includes a robot arm 130 having a plurality of joints 133, with a plasma unit 131 for cutting or welding the radioactive structure or a gripper 132 for gripping the radioactive structure provided at its end; an observation unit 140 having a three-dimensional sensor that scans the working place to generate 3D scan information on the working environment; and a communication module 150 communicating with the master device or the database. Controlled by remote control, the cutting robot can perform maintenance work on structures in the nuclear power plant.
  • the virtual reality interface device 20 may include a head mounted display (HMD) device worn by a user.
  • the virtual reality interface device 20 may be composed of a plurality of different devices that provide an image to a user.
  • the virtual reality interface device 20 may include a separate communication module (not shown) connected to the simulator.
  • the master device 30 may include a controller or a teach pendant that generates a control signal for manipulating the cutting robot 10 and the virtual reality-based virtual cutting robot by a user's manipulation.
  • the master device 30 may be capable of generating a control signal by recognizing a user's motion.
  • The master device 30 generates a control signal according to the user's manipulation, or automatically recognizes the user's motion to generate a control signal, and then transmits it to the simulator 40 through a separate communication module (not shown).
  • The simulator 40 generates the virtual reality image and the virtual cutting robot based on environment model information preset from the reactor design information and robot model information preset for the cutting robot.
  • the simulator 40 may generate a motion event according to a control signal of the virtual cutting robot to operate the virtual cutting robot in virtual reality.
  • the simulator 40 includes a communication unit 410, an environment modeling unit 420, a robot modeling unit 430, and a control unit 440.
  • the communication unit 410 is directly connected to and communicates with the virtual reality interface device 20, the master device 30, and the database 50 in a manner such as Bluetooth, cable, Wifi direct, and WMB.
  • the present invention is not limited thereto, and the communication unit 410 may be connected to the Internet to communicate with the virtual reality interface device 20, the master device 30, and the database 50.
  • the environment modeling unit 420 generates environment model information based on the design information of the reactor previously stored in the database 50 and generates a virtual reality image based on this.
  • the reactor design information may be information including at least one or more of a design diagram of a reactor, a material, a location, a shape, a density, a melting point, and a weight of the radioactive structure.
  • the design information of the reactor is based on the design drawing, but may be changed due to design changes in maintenance and construction.
  • the environmental modeling unit 420 may update and provide environmental model information in real time through information observed from a cutting robot that is actually put into a nuclear reactor.
  • the environment modeling unit 420 includes a collection unit 421 and an environment information generation unit 422.
  • The collection unit 421 collects the design information of the reactor previously stored in the database 50 and the observation information observed by the cutting robot 10. For example, the collection unit 421 collects the design information by being directly connected to the database 50 through the communication unit 410 using Bluetooth or Wi-Fi Direct, or by being connected through an Internet network. Likewise, the collection unit 421 may be directly connected to the cutting robot 10 or connected through an Internet network to collect the observation information.
  • The environment information generation unit 422 generates a virtual reality image by implementing the real reactor as a 3D reactor image in virtual reality from the design information. For example, the environment information generation unit 422 receives a 2D design drawing and a 3D design drawing from the collection unit, converts them into 3D point cloud information (point cloud data) for the virtual reality, generates a depth image based on the depth information included in the point cloud, and applies a preset image coordinate system to the depth image.
  • The environment information generation unit 422 may then cluster the point cloud in block form based on the image coordinates of the depth image to generate a plurality of different clusters, and model the entire structure as a 3D image by implementing the points belonging to each cluster as a single polygon or hexahedron through a preset algorithm (a sketch of this clustering step follows below).
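  • The block-wise clustering described above can be sketched as follows. This is a minimal illustration assuming a regular voxel grid and axis-aligned hexahedra; the grid resolution and the helper name are illustrative and not part of the disclosure.

```python
import numpy as np

def cluster_points_to_boxes(points, block_size=0.1):
    """Cluster a 3D point cloud into block-form clusters and approximate
    each cluster by an axis-aligned hexahedron (bounding box).

    points: (N, 3) array of x, y, z coordinates.
    block_size: edge length of the clustering blocks (assumed parameter).
    Returns a list of (box_min, box_max) pairs, one per occupied block.
    """
    # Assign every point to a block index on a regular 3D grid.
    block_ids = np.floor(points / block_size).astype(np.int64)
    boxes = []
    for bid in np.unique(block_ids, axis=0):
        in_block = np.all(block_ids == bid, axis=1)
        cluster = points[in_block]
        # Represent the whole cluster by its bounding hexahedron, which
        # greatly reduces the data volume of the environment model.
        boxes.append((cluster.min(axis=0), cluster.max(axis=0)))
    return boxes

# Example: a synthetic cloud of 10,000 points inside a 1 m cube.
cloud = np.random.rand(10000, 3)
boxes = cluster_points_to_boxes(cloud, block_size=0.2)
print(len(boxes), "hexahedra model the scene")
```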
  • the environmental information generation unit 422 may receive 3D scan information from the 3D sensor of the observation unit 140 provided in the cutting robot 10 that is actually put into the nuclear reactor, and update a previously implemented virtual reality 3D image.
  • The three-dimensional scan information may consist of 3D point cloud information (point cloud data) on the work environment, with depth information included in the point cloud. It is turned into a virtual reality 3D image in the same way as the design information; for unchanged information the previously implemented 3D image is maintained, and the 3D image is updated only for the changed parts.
  • the design information further includes at least one or more of physical information of material, location, shape, volume, density, melting point, weight, strength, and hardness of the radioactive structure object.
  • the environment information generation unit 422 may generate environment model information composed of a 3D image including physical information in a structure object.
  • Because the environment information generation unit 422 compresses the point cloud belonging to each cluster into a polygon or a hexahedron, the environment model information generated from the point cloud of the work environment requires less data, and the computation time needed to model the work environment is greatly reduced, supporting rapid modeling.
  • the environmental information generation unit 422 may generate environmental model information identical to the actual nuclear reactor by implementing modeling based on the design information and 3D scan information measured by the actual cutting robot.
  • the robot modeling unit 430 includes a robot model generation unit 431, a joint load calculation unit 432, and a joint rotation unit 433.
  • the robot model generation unit 431 generates an image of a virtual cutting robot based on preset robot model information. For example, the robot model generation unit 431 generates a virtual cutting robot image composed of 3D graphic information based on the design information of the robot previously stored in the database 50. In addition, the robot model generation unit 431 generates an image of the virtual cutting robot including part information such as material, weight, size, volume, density, and rotation information of the joint for each part of the virtual cutting robot.
  • When the robot model generation unit 431 receives a control signal from the master device through the communication unit, it extracts a motion event corresponding to the control signal from the database. For example, when the control signal is received, the robot model generation unit 431 extracts from the database 50 a previously stored motion event including at least one of movement of the virtual cutting robot, movement of the robot arm of the virtual cutting robot, operation of the gripper, and operation of the plasma unit. The robot model generation unit 431 then generates an image of the virtual cutting robot corresponding to the extracted motion event.
  • The robot model generation unit 431 calculates the rotation of each joint corresponding to the position and posture of the end of the robot arm by inverse kinematics, reflects it in the robot model information, and generates the image of the virtual cutting robot.
  • In other words, the robot model generation unit 431 first expresses a trajectory for the position and posture of the end of the robot arm of the virtual cutting robot, converts it into a joint trajectory of the robot, and generates the motion of the virtual cutting robot by driving its motors accordingly, producing the image of the virtual cutting robot (a sketch of such an inverse kinematics step follows below).
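  • A minimal sketch of converting an end-effector target into joint rotations, shown here for a hypothetical planar two-joint arm solved in closed form; the link lengths and the planar simplification are illustrative assumptions and not the robot's actual kinematics.

```python
import numpy as np

def planar_two_link_ik(x, y, l1=0.6, l2=0.4):
    """Inverse kinematics for a planar 2-link arm (assumed geometry).

    Given a target end position (x, y), return the joint angles
    (theta1, theta2) in radians, elbow-down solution.
    """
    r2 = x * x + y * y
    # Law of cosines for the elbow joint.
    cos_t2 = (r2 - l1 * l1 - l2 * l2) / (2.0 * l1 * l2)
    cos_t2 = np.clip(cos_t2, -1.0, 1.0)   # guard against round-off
    theta2 = np.arccos(cos_t2)
    # Shoulder angle = angle to target minus angle of the second link.
    theta1 = np.arctan2(y, x) - np.arctan2(l2 * np.sin(theta2),
                                           l1 + l2 * np.cos(theta2))
    return theta1, theta2

# Example: follow a straight-line end-effector trajectory and convert it
# into a joint trajectory, as described for the robot model generation unit.
for x in np.linspace(0.5, 0.9, 5):
    t1, t2 = planar_two_link_ik(x, 0.2)
    print(f"x={x:.2f}  theta1={np.degrees(t1):6.1f} deg  theta2={np.degrees(t2):6.1f} deg")
```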
  • the robot model generation unit may generate an image of a virtual cutting robot by superimposing an image of a virtual cutting robot on a virtual reality image, and may visually provide the superimposed image to a user by transmitting the superimposed image to a virtual reality interface device.
  • the joint load calculation unit 432 calculates the load of each joint of the generated virtual cutting robot 10. For example, the joint load calculation unit 432 continuously calculates the load of each joint for the movement of the robot arm from the initial loading of the virtual cutting robot.
  • The joint load calculation unit 432 calculates the load applied to each joint by substituting at least one of the three-dimensional coordinate position, velocity, and acceleration of each joint of the robot arm and the weight and size of the structure object into preset static and dynamic equations (a simplified static example follows below).
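  • A minimal sketch of the static part of such a calculation: the gravity torque each joint must support for a serial arm carrying a payload. The link masses, lengths, and the planar gravity-only model are illustrative assumptions.

```python
import numpy as np

def static_joint_torques(joint_xy, link_masses, payload_mass, g=9.81):
    """Static gravity load (torque) at each joint of a planar serial arm.

    joint_xy:     (N+1, 2) positions of the joints, last row = arm tip.
    link_masses:  (N,) mass of each link, assumed concentrated at its midpoint.
    payload_mass: mass of the gripped structure object at the arm tip.
    Returns an array of N joint torques about the horizontal axis (N*m).
    """
    joint_xy = np.asarray(joint_xy, dtype=float)
    n = len(link_masses)
    torques = np.zeros(n)
    for j in range(n):
        # Every link at or beyond joint j contributes m * g * horizontal arm.
        for k in range(j, n):
            com_x = 0.5 * (joint_xy[k, 0] + joint_xy[k + 1, 0])
            torques[j] += link_masses[k] * g * (com_x - joint_xy[j, 0])
        # The gripped piece acts at the arm tip.
        torques[j] += payload_mass * g * (joint_xy[-1, 0] - joint_xy[j, 0])
    return torques

# Example: a 3-joint arm holding a 15 kg cut piece.
joints = [(0.0, 0.0), (0.5, 0.0), (0.9, 0.0), (1.2, 0.0)]
print(static_joint_torques(joints, link_masses=[8.0, 5.0, 3.0], payload_mass=15.0))
```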
  • The joint rotation unit 433 compares the joint load calculated by the joint load calculation unit with a preset value, for example a value set as the allowable joint load that does not damage the joint, and calculates a rotation value for each joint so that the calculated joint load does not exceed the preset value, reflecting the result in the robot model information.
  • In other words, the robot model generation unit 431 controls the rotation of the joints so that the load does not exceed the preset value: while maintaining the position of the end of the robot arm of the virtual cutting robot, the rotation of each joint is changed so that the joint loads for the resulting posture of the virtual cutting robot remain within the preset value.
  • The joint rotation unit 433 transmits the load applied to each joint to the robot model generation unit 431 so that it is continuously updated, and generates an optimized rotation value that minimizes the load at each joint's rotation angle. The joint rotation unit 433 reflects the optimized rotation value in the robot model information and stores it in the database (a sketch of such a load-limited posture search follows below).
  • the robot model generation unit 431 may regenerate the image according to the updated robot model information.
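  • A minimal sketch of how a posture could be selected so that no joint exceeds its allowable load; the brute-force sampling, the toy load model, and the function names are illustrative assumptions, not the patented optimization. In practice, the candidate postures would be generated so that the end of the robot arm keeps the same position, for example from the redundancy of the inverse kinematics.

```python
import numpy as np

def choose_posture(candidate_postures, load_fn, max_load):
    """Among candidate postures (assumed to reach the same tip position),
    pick the one whose largest joint load is smallest, rejecting any
    posture whose load exceeds the allowable value.

    candidate_postures: iterable of joint-angle vectors.
    load_fn: function mapping a joint-angle vector to per-joint loads.
    max_load: preset allowable joint load (e.g. read from the database).
    """
    best, best_peak = None, np.inf
    for q in candidate_postures:
        loads = np.abs(load_fn(q))
        peak = loads.max()
        if peak <= max_load and peak < best_peak:
            best, best_peak = q, peak
    return best, best_peak   # best is None if no posture is admissible

# Toy load model for demonstration only: load_i ~ |sin| of accumulated angles.
def toy_load(q):
    return np.array([np.abs(np.sin(np.sum(q[i:]))) * 100.0 for i in range(len(q))])

candidates = [np.radians([a, 90 - a, a / 2]) for a in range(0, 91, 5)]
posture, peak = choose_posture(candidates, toy_load, max_load=80.0)
print("peak joint load of selected posture:", round(peak, 1))
```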
  • The control unit 440 generates a single image by integrating the images of the environment modeling unit 420 and the robot modeling unit 430, receives the control signal to operate the virtual cutting robot in the virtual reality environment, and extracts interaction events with the structure object. When a cutting event occurs, the control unit divides the structure object into the cut pieces, calculates their physical information such as material, volume, mass, and center of gravity, and passes the result to the environment modeling unit 420 and the database 50 so that the structure object is regenerated.
  • the control unit 440 includes an event extraction unit 441, an object regeneration unit 442, and a physical information operation unit 443.
  • the event extraction unit 441 extracts an interaction event between the structure object and the virtual cutting robot from the virtual cutting robot image and the virtual reality image. For example, the event extraction unit 441 extracts an interaction event by determining the overlap of the virtual cutting robot and the structure object.
  • the interaction event may include a collision event, a cutting event, and a grip event.
  • the collision event is an event that occurs when the structure object and the virtual cutting robot overlap.
  • When the structure object and the virtual cutting robot come into contact, the robot modeling unit may limit the motion of the virtual cutting robot so that the two do not overlap (a sketch of such an overlap test follows below).
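  • A minimal sketch of one common way to detect such overlap, using axis-aligned bounding boxes for a robot part and the structure object; using bounding boxes is an illustrative assumption, since the overlap test itself is not specified above.

```python
import numpy as np

def aabb_overlap(box_a, box_b):
    """Return True if two axis-aligned bounding boxes overlap.

    Each box is (min_xyz, max_xyz), e.g. as produced by the hexahedral
    environment model sketched earlier.
    """
    a_min, a_max = np.asarray(box_a[0]), np.asarray(box_a[1])
    b_min, b_max = np.asarray(box_b[0]), np.asarray(box_b[1])
    # Boxes overlap only if their intervals intersect on every axis.
    return bool(np.all(a_min <= b_max) and np.all(b_min <= a_max))

# Example: gripper box vs. structure-object box -> collision event.
gripper = ((0.9, 0.0, 0.4), (1.1, 0.2, 0.6))
structure = ((1.0, 0.1, 0.0), (2.0, 1.0, 0.5))
if aabb_overlap(gripper, structure):
    print("collision event: limit the virtual cutting robot's motion")
```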
  • the cutting event is an event in which the plasma unit of the virtual cutting robot cuts through a structure object.
  • the robot modeling unit 430 moves the plasma unit generating plasma of a predetermined intensity to pass through the structure object.
  • The joint load calculation unit 432 calculates the load by adding a reaction force in the direction opposite to the cutting direction, the joint rotation unit 433 calculates the rotation angle of each joint according to the calculated load, and the robot modeling unit 430 rotates each joint accordingly.
  • the cutting load is a value previously stored in the database 50 and may be a repulsive force against cutting that actually occurs according to the material and melting point of the structure object.
  • The environment information generation unit 422 divides the points of the structure object along the plane through which the plasma unit 131 has passed, using the equation of that plane, fills the empty space of the cut surface using Delaunay triangulation, regenerates the physical information of the structure object such as shape, density, volume, weight, and center of gravity, and stores it in the database (a sketch of this split-and-fill step follows below).
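  • A minimal sketch of splitting an object's points with a cutting-plane equation and triangulating the resulting cut face with Delaunay triangulation; the random cube, the distance threshold, and the projection onto the cut face are illustrative assumptions.

```python
import numpy as np
from scipy.spatial import Delaunay

def split_and_fill(points, plane_point, plane_normal):
    """Split a point cloud by the plane n.(x - p0) = 0 and triangulate
    the cut face so the two cut pieces become closed shapes.

    Returns (points_above, points_below, triangles), where triangles are
    index triples into the points lying (nearly) on the cutting plane.
    """
    n = np.asarray(plane_normal, float)
    n /= np.linalg.norm(n)
    signed = (points - np.asarray(plane_point, float)) @ n  # signed distance
    above, below = points[signed > 0], points[signed <= 0]

    # Points close to the plane outline the cut face; project them into a
    # 2D basis of the plane and fill the face with Delaunay triangles.
    on_face = points[np.abs(signed) < 0.05]
    u = np.cross(n, [1.0, 0.0, 0.0])
    if np.linalg.norm(u) < 1e-6:          # normal parallel to the x-axis
        u = np.cross(n, [0.0, 1.0, 0.0])
    u /= np.linalg.norm(u)
    v = np.cross(n, u)
    offs = on_face - np.asarray(plane_point, float)
    face_2d = np.c_[offs @ u, offs @ v]
    triangles = Delaunay(face_2d).simplices if len(face_2d) >= 3 else np.empty((0, 3), int)
    return above, below, triangles

# Example: cut a random 1 m cube of points with a vertical plane at x = 0.5.
cube = np.random.rand(5000, 3)
upper, lower, tris = split_and_fill(cube, plane_point=[0.5, 0, 0], plane_normal=[1, 0, 0])
print(len(upper), "points above,", len(lower), "below,", len(tris), "cut-face triangles")
```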
  • the grip event is an event that grips a structure object using a gripper of a virtual cutting robot.
  • The joint load calculation unit calculates the torsion and gravity loads according to the shape, volume, weight, and center of gravity of the structure object and the gripping position of the gripper, computes the load on the joints of the robot arm carrying the gripper of the virtual cutting robot, and provides the calculated values to the joint rotation unit so that the optimal rotation angle of each joint can be calculated (a small sketch of the grip moment follows below).
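  • A small sketch of the moment the gripper must resist when the grip point is offset from the object's center of gravity; the point-mass model is an illustrative assumption.

```python
import numpy as np

def grip_wrist_moment(grip_point, center_of_gravity, mass, g=9.81):
    """Moment (N*m) about the gripper caused by gripping a piece whose
    center of gravity is offset from the grip point (point-mass model)."""
    r = np.asarray(center_of_gravity, float) - np.asarray(grip_point, float)
    weight = np.array([0.0, 0.0, -mass * g])     # gravity along -z
    return np.cross(r, weight)                   # torque vector at the grip

# Example: gripping a 20 kg cut piece 0.3 m away from its center of gravity.
print(grip_wrist_moment(grip_point=[0, 0, 0],
                        center_of_gravity=[0.3, 0.0, 0.0], mass=20.0))
```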
  • The physical information and joint load information generated when each event occurs are stored in the database 50 and may be provided when the actual cutting robot 10 is controlled. For example, the actual cutting robot 10 can be controlled with the master device 30: only the position and posture of the end of the robot arm need to be provided as the control signal, while the joint rotation angles that were optimized in virtual reality and stored in the database are supplied to the actual cutting robot 10.
  • Because the physical information and the joint rotation angle values are provided to the actual cutting robot 10, the weight corresponding to the shape and size of the structure is known in advance and the robot can be operated in an optimized manner.
  • FIG. 3 is a flowchart illustrating a simulation method of a virtual reality-based cutting robot system according to an embodiment.
  • The simulation method of the virtual reality-based cutting robot system 1 includes generating an image (61), calculating the load (62), calculating the rotation angle of each joint (63), determining the input of a control signal for manipulating the virtual reality-based virtual cutting robot (64), generating a motion event of the virtual cutting robot (65), determining whether a cutting event has occurred (66), generating physical information of the cutting object (67), and storing the physical information in the database 50 (68). A compact sketch of this loop is given at the end of this description.
  • The step (61) of generating the image includes generating the virtual reality image and the virtual cutting robot based on the environment model information preset from the reactor design information and the robot model information preset for the cutting robot 10.
  • In the step (62) of calculating the load, the load applied to each joint of the virtual cutting robot is calculated from the generated image.
  • In the step (63) of calculating the joint rotation angle, the load applied to each joint of the virtual cutting robot is compared with a preset value, and the rotation angle of each joint is calculated so that the load applied to each joint does not exceed the preset range.
  • The calculated joint rotation angle is reflected in the robot model information and stored in the database. The robot model generation unit then updates the image of the virtual cutting robot according to the robot model information.
  • In the step (64) of determining the input of a control signal, the robot model generation unit determines whether a control signal has been input through the communication unit.
  • the robot model generator 431 determines the type of the control signal when there is an input control signal.
  • The control signal may be one of a plurality of signals respectively corresponding to motion events, including movement of the virtual cutting robot, movement of the robot arm of the virtual cutting robot, operation of the gripper, and operation of the plasma unit.
  • In the step (65) of generating a motion event of the virtual cutting robot, when a control signal is input, a motion event is generated according to the control signal to operate the virtual cutting robot in the virtual reality and update the image.
  • In the step (66) of determining whether a cutting event has occurred, it is determined whether a cutting event between the virtual cutting robot and a cutting object in the virtual reality has occurred.
  • the event extraction unit 441 detects a collision event, a cutting event, and a grip event in a state in which plasma operation and robot arm movement events occur, and determines whether a cutting event has occurred.
  • the physical information of the cut structure object is generated according to the cutting event in virtual reality.
  • The environment information generation unit 422 divides the points of the structure object along the plane through which the plasma unit 131 has passed using the equation of that plane, fills the empty space of the cut surface using Delaunay triangulation, and regenerates the physical information of the structure object such as shape, volume, weight, and center of gravity.
  • The environment information generation unit 422 stores the generated physical information in the database.
  • In this way, the optimal joint rotation angles for the motions of the cutting robot and the cut shapes of the structure object that are easy to transport are derived, updated, and stored in the database, so that the actual cutting robot can be manipulated on that basis.
  • Operation then requires only the same simple control signals, while the optimized cutting shape of the structure object can be found from the stored cutting-shape information, and by optimizing the joint rotation angles against the load applied to each joint during operation of the cutting robot, the load on each joint can be minimized.
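  • As referenced above, a compact sketch of the simulation loop of FIG. 3 (steps 61 to 68); every function name and the simulator interface below are assumptions standing in for the components described above, not the actual implementation.

```python
# Hypothetical driver for the simulation loop of FIG. 3; the simulator,
# master, and database objects and all of their methods are assumed.
def run_simulation(simulator, master, database):
    simulator.generate_images()                      # step 61: VR + virtual robot
    while simulator.running:
        loads = simulator.calculate_joint_loads()    # step 62
        simulator.limit_joint_rotation(loads)        # step 63: keep loads in range
        signal = master.poll_control_signal()        # step 64
        if signal is None:
            continue
        simulator.generate_motion_event(signal)      # step 65
        if simulator.cutting_event_occurred():       # step 66
            info = simulator.generate_physical_info()  # step 67: cut-piece data
            database.store(info)                     # step 68
```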

Landscapes

  • Engineering & Computer Science (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Physics & Mathematics (AREA)
  • Plasma & Fusion (AREA)
  • General Engineering & Computer Science (AREA)
  • High Energy & Nuclear Physics (AREA)
  • Manipulator (AREA)

Abstract

Disclosed are a cutting robot system that works in conjunction with a virtual reality interface device to provide physical information for cutting a nuclear radioactive structure in a nuclear reactor, and a simulation method thereof. The system comprises: a cutting robot; a master device that generates a control signal for controlling a virtual reality-based virtual cutting robot; a simulator that generates a virtual reality image and an image of the virtual cutting robot on the basis of environment model information preset from design information of the nuclear reactor and robot model information preset for the cutting robot, generates a motion event for the virtual cutting robot in a virtual environment according to the control signal, and calculates a change in physical information of a virtual reality structure object and in joint load information of the virtual cutting robot due to the interaction between the structure object and the virtual cutting robot; and a database that stores the physical information and the joint load information from the simulator.
PCT/KR2019/016436 2019-10-28 2019-11-27 Cutting robot system and simulation method thereof WO2021085727A1

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR20190134344 2019-10-28
KR10-2019-0134344 2019-10-28

Publications (1)

Publication Number Publication Date
WO2021085727A1 2021-05-06

Family

ID=75716392

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2019/016436 WO2021085727A1 2019-10-28 2019-11-27 Cutting robot system and simulation method thereof

Country Status (1)

Country Link
WO (1) WO2021085727A1 (fr)

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH0752068A (ja) * 1993-08-17 1995-02-28 Fujitsu Ltd Remote operation control system
KR20110041950A (ko) * 2009-10-16 2011-04-22 Samsung Electronics Co., Ltd. Teaching and playback method of a robot using redundant degree-of-freedom control
KR20140104917A (ko) * 2013-02-21 2014-08-29 Kabushiki Kaisha Yaskawa Denki Robot simulator, robot teaching apparatus, and robot teaching method
US20160114418A1 (en) * 2014-10-22 2016-04-28 Illinois Tool Works Inc. Virtual reality controlled mobile robot
KR20190048589A (ko) * 2017-10-31 2019-05-09 Industry-Academic Cooperation Foundation, Chungnam National University Virtual reality-based dual-arm robot teaching apparatus and method

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115981178A (zh) * 2022-12-19 2023-04-18 广东若铂智能机器人有限公司 Simulation system and method for fish slaughtering
CN115981178B (zh) * 2022-12-19 2024-05-24 广东若铂智能机器人有限公司 Simulation system for fish slaughtering

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19950694

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19950694

Country of ref document: EP

Kind code of ref document: A1

32PN Ep: public notification in the ep bulletin as address of the addressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205 DATED 14.12.2022).
