CN113618731A - Robot control system - Google Patents
- Publication number
- CN113618731A (application CN202110831069.9A)
- Authority
- CN
- China
- Prior art keywords
- robot
- control
- mechanical arm
- environment
- sending
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1602—Programme controls characterised by the control system, structure, architecture
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J13/00—Controls for manipulators
- B25J13/08—Controls for manipulators by means of sensing devices, e.g. viewing or touching devices
- B25J13/087—Controls for manipulators by means of sensing devices, e.g. viewing or touching devices for sensing other physical parameters, e.g. electrical or chemical properties
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1656—Programme controls characterised by programming, planning systems for manipulators
- B25J9/1661—Programme controls characterised by programming, planning systems for manipulators characterised by task planning, object-oriented languages
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1656—Programme controls characterised by programming, planning systems for manipulators
- B25J9/1664—Programme controls characterised by programming, planning systems for manipulators characterised by motion, path, trajectory planning
- B25J9/1666—Avoiding collision or forbidden zones
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1679—Programme controls characterised by the tasks executed
- B25J9/1689—Teleoperation
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1694—Programme controls characterised by use of sensors other than normal servo-feedback from position, speed or acceleration sensors, perception control, multi-sensor controlled systems, sensor fusion
- B25J9/1697—Vision controlled systems
Abstract
The application relates to a robot control system and belongs to the technical field of robots. The robot control system includes a robot and a remote control device; the robot comprises a mechanical arm, an encoder and a wrist force sensor, and the remote control equipment comprises a virtual reality assembly and a force feedback handle; the encoder is used for acquiring joint state information of the mechanical arm; the virtual reality component is used for generating and displaying a virtual reality attitude image of the robot according to the received joint state information and a preset three-dimensional model of the robot; the wrist force sensor is used for acquiring the acting force of the mechanical arm; the robot is used for sending the acting force to the remote control equipment; and the remote control equipment is used for setting the control resistance of the force feedback handle according to the acting force and sending an operation instruction received based on the force feedback handle to the robot, so that the robot performs the corresponding action. The robot control system provided by the application can improve the accuracy of robot control.
Description
Technical Field
The present application relates to the field of robotics, and in particular, to a robot control system.
Background
With the rapid development of science and technology, more and more production enterprises use industrial robots in place of humans to complete repetitive, heavy and dangerous physical labor. In particular, personnel cannot work in a nuclear radiation environment, so heavy-load robots are considered as substitutes for human workers in such settings.
The existing robot control system usually relies solely on video of the on-site environment sent by the robot to the remote end, where control instructions are generated from that video to make the robot execute the corresponding task.
However, because such a control system generates control commands from the returned video alone, its control accuracy is limited.
Disclosure of Invention
In view of the above, it is necessary to provide a robot control system capable of improving the accuracy of robot control.
A robot control system includes a robot, a remote control device; the robot comprises a mechanical arm, an encoder arranged in the mechanical arm and a wrist force sensor arranged on the mechanical arm, and the remote control equipment comprises a virtual reality assembly and a force feedback handle;
the encoder is used for acquiring the joint state information of the mechanical arm, and the robot is used for sending the joint state information to the remote control equipment;
the virtual reality component is used for generating and displaying a virtual reality attitude image of the robot according to the received joint state information and a preset three-dimensional model of the robot;
the wrist force sensor is used for acquiring the acting force of the mechanical arm;
the robot is used for sending the acting force to the remote control equipment;
and the remote control equipment is used for setting the control resistance of the force feedback handle according to the acting force and sending an operation instruction received based on the force feedback handle to the robot so as to enable the robot to perform corresponding actions.
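By way of illustration only (not part of the claimed system), the mapping from the measured wrist force to the handle's control resistance could be a simple saturated proportional law; the force range and normalization below are assumptions, as the patent does not specify the mapping:

```python
def handle_resistance(wrist_force_n, max_force_n=200.0):
    """Map a wrist-force reading (newtons) to a normalized handle
    resistance in [0, 1]; forces at or above max_force_n saturate."""
    if wrist_force_n <= 0.0:
        return 0.0
    return min(wrist_force_n / max_force_n, 1.0)
```

With such a mapping, a free-moving arm (zero measured force) leaves the handle unresisted, while contact forces are felt proportionally up to saturation.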
In one embodiment, the robot further comprises: the system comprises an environment sensing component, a first processing component and a power control component;
the environment sensing component is used for detecting and acquiring environment information of the robot and sending the environment information to the first processing component;
the first processing assembly is used for planning the action path of the robot according to the environment information and generating a first control instruction according to the action path of the robot;
and the power control assembly is used for controlling the robot to execute corresponding movement according to the first control instruction.
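The division of labor among the three components can be sketched as one sense-plan-act cycle; the dictionary-shaped control instruction below is an illustrative assumption:

```python
def control_cycle(sense, plan, act):
    """One autonomous cycle: the environment sensing component senses,
    the first processing component plans a path and wraps it in a
    control instruction, and the power control component acts on it."""
    environment = sense()                  # environment sensing component
    path = plan(environment)               # first processing component
    instruction = {"kind": "first_control_instruction", "path": path}
    act(instruction)                       # power control component
    return instruction
```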
In one embodiment, the environment information includes: object images, obstacle data, and three-dimensional environment data; the environment sensing component includes: a pan-tilt camera, a three-dimensional laser radar and a solid-state laser radar;
the holder camera is used for shooting an object image and sending the object image to the first processing component;
the three-dimensional laser radar is used for acquiring three-dimensional environment data and sending the three-dimensional environment data to the first processing assembly;
the solid-state laser radar is used for acquiring obstacle data and sending the obstacle data to the first processing component;
the first processing component is specifically used for identifying the target object from the object image; generating an environment map according to the three-dimensional environment data, and determining the position information of the target object according to the environment map; and planning the action path of the robot according to the environment map, the position information of the target object and the obstacle data, and generating a first control instruction according to the action path of the robot.
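A minimal sketch of such path planning over an occupancy grid built from the environment map and obstacle data (A* search with a Manhattan heuristic; the grid representation and algorithm choice are assumptions, since the patent does not name a planning method):

```python
from heapq import heappush, heappop

def plan_path(grid, start, goal):
    """A* search over a 2D occupancy grid (0 = free, 1 = obstacle).

    Returns the cell sequence from start to goal, or None if unreachable.
    """
    rows, cols = len(grid), len(grid[0])

    def h(cell):  # Manhattan-distance heuristic to the goal
        return abs(cell[0] - goal[0]) + abs(cell[1] - goal[1])

    open_set = [(h(start), 0, start)]
    came_from = {start: None}
    best_g = {start: 0}
    while open_set:
        _, cost, cur = heappop(open_set)
        if cur == goal:
            path = []
            while cur is not None:
                path.append(cur)
                cur = came_from[cur]
            return path[::-1]
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nxt = (cur[0] + dr, cur[1] + dc)
            if 0 <= nxt[0] < rows and 0 <= nxt[1] < cols and grid[nxt[0]][nxt[1]] == 0:
                new_cost = cost + 1
                if new_cost < best_g.get(nxt, float("inf")):
                    best_g[nxt] = new_cost
                    came_from[nxt] = cur
                    heappush(open_set, (new_cost + h(nxt), new_cost, nxt))
    return None
```

The returned cell sequence would then be converted into the first control instruction for the power control component.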
In one embodiment, the environment perception component further comprises a gamma camera; the gamma camera is used for acquiring radiation dose information; the first processing assembly is further used for generating an environment map according to the three-dimensional environment data and the radiation dose information, planning a motion path of the robot according to the environment map, the position information of the target object and the obstacle data, and generating a first control instruction according to the motion path of the robot.
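One way to fold the radiation dose information into the environment map is to inflate each map cell's traversal cost by its dose rate and forbid cells above a limit; the weighting and the dose limit below are illustrative assumptions:

```python
def traversal_cost(base_cost, dose_rate, dose_weight=10.0, dose_limit=5.0):
    """Cost of entering a map cell: the geometric cost plus a penalty
    proportional to the radiation dose rate; cells at or above the
    dose limit are treated as impassable."""
    if dose_rate >= dose_limit:
        return float("inf")
    return base_cost + dose_weight * dose_rate
```

A planner minimizing this cost naturally routes the robot around high-dose areas, matching the stated goal of reducing the robot's radiation exposure.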
In one embodiment, the robot further comprises: an inertial sensor; the inertial sensor is used for detecting the motion characteristic of the robot, acquiring the speed parameter of the robot and sending the speed parameter to the first processing component;
the first processing component is specifically used for determining the position information of the robot according to the environment map and the speed parameter, planning the action path of the robot according to the environment map, the position information of the target object, the obstacle data and the position information of the robot, and generating a first control instruction according to the action path of the robot.
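Fusing the inertial sensor's speed parameters into a position estimate can be sketched as planar dead reckoning (the patent does not specify the fusion method; this Euler integration is an assumption):

```python
import math

def dead_reckon(x, y, heading, speed, yaw_rate, dt):
    """Advance a planar pose (x, y, heading) by integrating the speed
    and yaw rate reported by the inertial sensor over a time step dt."""
    heading += yaw_rate * dt
    x += speed * math.cos(heading) * dt
    y += speed * math.sin(heading) * dt
    return x, y, heading
```

In practice this estimate would be corrected against the environment map to bound drift, as the text suggests when it combines the map with the speed parameter.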
In one embodiment, the robot further comprises: the hand-eye camera is used for shooting an object image around the mechanical arm and sending the object image around the mechanical arm to the first processing component;
the first processing assembly plans the action path of each mechanical arm according to the object images around the mechanical arms and the environment information, and generates a corresponding second control instruction and third control instruction;
and the power control assembly is used for respectively controlling each mechanical arm to execute corresponding movement according to the second control command and the third control command.
In one embodiment, the robot further comprises a hydraulic control assembly;
the power control assembly is also used for sending the first control command, the second control command and the third control command to the hydraulic control assembly;
and the hydraulic control assembly is used for controlling the robot to execute corresponding movement according to the first control instruction and controlling each mechanical arm to execute corresponding movement according to the second control instruction and the third control instruction.
In one embodiment, the robot arm further comprises a clamping jaw tool head, and the robot further comprises a clamping force sensor, wherein the clamping force sensor is arranged on the clamping jaw tool head;
the clamping force sensor is used for acquiring the clamping force of the clamping jaw tool head and sending the clamping force to the first processing assembly;
the first handling assembly is further configured to receive the clamping force and adjust the clamping force based on a weight of an object acted upon by the work tool head.
In one embodiment, the system further comprises: a communication system, the communication system comprising: the robot comprises a router arranged at a remote control end, a wireless network bridge arranged in the robot and at least one wireless network repeater arranged on a traveling path of the robot;
the first processing component detects the network signal of the communication system in real time;
and the robot deploys a wireless network repeater at any position where the network signal falls below the preset threshold.
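The deployment rule can be sketched as a threshold test on the measured signal strength; the RSSI threshold value is an assumption, as the patent leaves the threshold unspecified:

```python
def should_drop_repeater(rssi_dbm, threshold_dbm=-75.0, repeaters_left=1):
    """Deploy a wireless network repeater at the current position when
    the link to the remote end weakens below the preset threshold and
    the robot still carries at least one repeater."""
    return repeaters_left > 0 and rssi_dbm < threshold_dbm
```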
In one embodiment, the robot further comprises a second processing component;
the second processing assembly is used for monitoring the running state of the first processing assembly and controlling the robot to return to the initial position when the first processing assembly fails.
The beneficial effects brought by the technical scheme provided by the embodiment of the application at least comprise:
the robot control system that this application embodiment provided includes: the robot comprises a mechanical arm, an encoder arranged in the mechanical arm and a wrist force sensor arranged on the mechanical arm, the remote control equipment comprises a virtual reality component and a force feedback handle, joint state information of the mechanical arm is acquired through the encoder, the robot sends the joint state information to the remote control equipment, and then the virtual reality component in the remote control equipment generates and displays a virtual display attitude image of the robot according to a received joint state diagram and a preset three-dimensional model of the robot. Meanwhile, the wrist force sensor is also used for acquiring acting force of the mechanical arm, the robot sends the acting force to the remote control equipment, then the remote control equipment sets the operation resistance of the force feedback handle according to the acting force, and sends an operation instruction received based on the force feedback handle to the robot, so that the robot performs corresponding action according to the operation instruction. The robot control system provided by the embodiment of the application can establish the virtual reality attitude image of the robot in real time, can provide rich visual perception for a remote operator, can provide real force perception for the operator through the feedback of acting force, and can improve the accuracy of robot control.
Further, in the robot control system provided by the embodiment of the application, the environment sensing component detects and acquires environment information of the robot and sends it to the first processing component; the first processing component then plans the action path of the robot according to the environment information and generates the first control instruction from that path; finally, the power control component controls the robot to execute the corresponding movement according to the first control instruction. The robot can thus autonomously detect environment information and plan its action path accordingly, achieving autonomous navigation and movement without a remote control device and improving the intelligence of the robot.
Drawings
Fig. 1 is a block diagram of a robot control system according to an embodiment of the present disclosure;
fig. 2 is a block diagram of a remote control device according to an embodiment of the present disclosure;
FIG. 3 is a block diagram of a context awareness component provided in an embodiment of the present application;
fig. 4 is a block diagram of a communication system according to an embodiment of the present application;
FIG. 5 is a block diagram of a hydraulic control assembly according to an embodiment of the present disclosure;
fig. 6 is a block diagram of a second processing component according to an embodiment of the present disclosure;
fig. 7 is a block diagram of a power system according to an embodiment of the present disclosure.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, the present application is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the present application and are not intended to limit the present application.
The robot control system provided by the embodiment of the application is suitable for the technical field of robots, can establish the virtual reality attitude image of the robot in real time, can give rich visual perception to remote operators, can provide real force perception for the operators through the feedback of acting force, and can improve the accuracy of robot control.
With the rapid development of science and technology, more and more production enterprises use industrial robots in place of humans to complete repetitive, heavy and dangerous physical labor. In particular, personnel cannot work in a nuclear radiation environment. Taking the Fukushima nuclear accident site as an example, field equipment was seriously damaged, and heavy-load rescue devices were needed to break and dismantle collapsed concrete, grab and convey broken stones, cut and convey collapsed pipelines, cut and convey broken metal objects, grab and convey thin materials such as grid plates, and transport waste liquid barrels without leakage, so that other rescue equipment or personnel could reach the operation area as soon as possible. Therefore, a rescue robot capable of complex nuclear-accident handling and effective heavy-load transportation is needed. The existing robot control system usually relies solely on video of the on-site environment sent by the robot to the remote end, where control instructions are generated from that video to make the robot execute the corresponding task. However, because such a system generates control commands from the returned video alone, its control accuracy is limited.
The robot control system provided by the embodiment of the application includes a robot and a remote control device. The robot comprises a mechanical arm, an encoder arranged in the mechanical arm and a wrist force sensor arranged on the mechanical arm, and the remote control equipment comprises a virtual reality component and a force feedback handle. Joint state information of the mechanical arm is acquired by the encoder and sent by the robot to the remote control equipment; the virtual reality component in the remote control equipment then generates and displays a virtual reality attitude image of the robot according to the received joint state information and a preset three-dimensional model of the robot. Meanwhile, the wrist force sensor acquires the acting force of the mechanical arm, which the robot sends to the remote control equipment; the remote control equipment then sets the operation resistance of the force feedback handle according to the acting force and sends an operation instruction received based on the force feedback handle to the robot, so that the robot performs the corresponding action. The robot control system provided by the embodiment of the application can establish the virtual reality attitude image of the robot in real time, giving a remote operator rich visual perception; through the feedback of the acting force, it can also provide the operator with real force perception, thereby improving the accuracy of robot control.
The following describes the technical solutions of the present application and how to solve the above technical problems with specific examples. The following several specific embodiments may be combined with each other, and details of the same or similar concepts or processes may not be repeated in some embodiments. Embodiments of the present application will be described below with reference to the accompanying drawings.
Fig. 1 is a block diagram of a robot control system according to an embodiment of the present disclosure. As shown in fig. 1, the control system includes: a remote manipulation device 10 and a robot 20.
As shown in fig. 2 and 3, the robot 20 includes a robot arm, an encoder disposed in the robot arm, and a wrist force sensor 214 disposed on the robot arm, and the remote manipulation apparatus 10 includes a virtual reality component 11 and a force feedback handle; the encoder is used for acquiring joint state information of the robot arm, and the robot 20 is used for transmitting the joint state information to the remote manipulation apparatus 10; the virtual reality component 11 is used for generating and displaying a virtual reality attitude image of the robot 20 according to the received joint state information and a preset three-dimensional model of the robot 20; the wrist force sensor 214 is used for acquiring the acting force of the mechanical arm; the robot 20 is configured to send the acting force to the remote manipulation apparatus 10; the remote manipulation apparatus 10 is configured to set a manipulation resistance of the force feedback handle according to the acting force, and to transmit an operation instruction received based on the force feedback handle to the robot 20, so that the robot 20 performs a corresponding action.
Wherein the wrist force sensor 214 is located in the environment sensing component 21 of the robot 20. The remote manipulation apparatus 10 includes: the virtual reality assembly 11, the map display module 13, the video display module 12 and the human-computer interaction module 14, wherein the human-computer interaction module 14 comprises a control panel and a force feedback handle.
The robot 20 may be structurally divided into a mobile carrier and two robot arms.
The virtual reality component 11 can also receive a control instruction of an operator and perform predictive simulation on the posture and the like of the robot 20 according to the control instruction, so that the problems of collision and the like caused by operation errors are prevented.
Meanwhile, the control panel can realize the functions of starting and stopping each module of the robot 20, switching the operation mode, adjusting the camera view and the like. The remote manipulation apparatus 10 is also used to display a live video image taken by the robot 20.
The control system of the robot 20 provided by the embodiment of the present application includes: the robot 20 comprises a mechanical arm, an encoder arranged in the mechanical arm and a wrist force sensor 214 arranged on the mechanical arm, and the remote control device 10 comprises a virtual reality component 11 and a force feedback handle. Joint state information of the mechanical arm is obtained through the encoder and sent by the robot 20 to the remote control device 10; the virtual reality component 11 in the remote control device 10 then generates and displays a virtual reality posture image of the robot 20 according to the received joint state information and a preset three-dimensional model of the robot 20. Meanwhile, the wrist force sensor 214 acquires the acting force of the mechanical arm, which the robot 20 sends to the remote control device 10; the remote control device 10 then sets the operation resistance of the force feedback handle according to the acting force and sends an operation instruction received based on the force feedback handle to the robot 20, so that the robot 20 performs the corresponding action. The control system provided by the embodiment of the application can establish the virtual reality attitude image of the robot 20 in real time, giving a remote operator rich visual perception; through the feedback of the acting force, it can also provide the operator with real force perception, thereby improving the accuracy of the control of the robot 20.
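Generating the virtual reality posture image from joint state information amounts to forward kinematics over the preset three-dimensional model. A planar sketch follows (a real arm would use full 3-D transforms; the link lengths and the two-link structure are placeholders, not from the patent):

```python
import math

def forward_kinematics(link_lengths, joint_angles):
    """Planar forward kinematics: from joint angles (radians) and link
    lengths, compute the (x, y) position of each joint so the 3-D
    model can be posed to mirror the real arm."""
    points = [(0.0, 0.0)]
    x = y = theta = 0.0
    for length, angle in zip(link_lengths, joint_angles):
        theta += angle  # each joint angle is relative to the previous link
        x += length * math.cos(theta)
        y += length * math.sin(theta)
        points.append((x, y))
    return points
```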
As shown in fig. 3, the environment sensing component 21 is configured to detect and acquire environment information of the robot 20, and send the environment data to the first processing component 22; a first processing component 22, configured to plan a motion path of the robot 20 according to the environment information, and generate a first control instruction according to the motion path of the robot 20; and a power control component for controlling the robot 20 to execute corresponding movement according to the first control instruction.
In practical application, the environment sensing component 21 detects and acquires environment information of the robot 20, and sends environment data to the first processing component 22, then the first processing component 22 plans a movement path of the robot 20 according to the environment information, and generates a first control instruction according to the movement path of the robot 20, and finally the power control component controls the robot 20 to execute corresponding movement according to the first control instruction, so that the robot 20 realizes autonomous detection of the environment information, and plans the movement path autonomously according to the environment information, and finally the robot 20 can realize autonomous navigation and movement without the remote control device 10, and the intelligence of the robot 20 can be improved.
As shown in fig. 3, the environment data includes: object images, obstacle data, and three-dimensional environment data; the context awareness component 21 includes: pan-tilt camera 213, three-dimensional lidar 211, and solid-state lidar 215; the pan-tilt camera 213 is used for shooting an object image and sending the object image to the first processing component 22; the three-dimensional laser radar 211 is configured to acquire three-dimensional environment data and send the three-dimensional environment data to the first processing component 22; solid state lidar 215 is configured to acquire obstacle data and transmit the obstacle data to first processing component 22.
The first processing component 22 is specifically configured to obtain a target object according to the object image identification; generating an environment map according to the three-dimensional environment data, and determining the position information of the target object according to the environment map; the movement path of the robot 20 is planned based on the environment map, the position information of the target object, and the obstacle data, and a first control instruction is generated based on the movement path of the robot 20.
Wherein, the target object may be an object to be carried, processed or detected by the robot 20; optionally, the pan-tilt camera 213 and the three-dimensional lidar 211 are mounted on the top of the mobile carrier, and the solid-state lidar 215 is mounted around the mobile carrier.
In practical application, the object image shot by the pan-tilt camera 213, the three-dimensional environment data acquired by the three-dimensional laser radar 211, and the obstacle data acquired by the solid-state laser radar 215 are all sent to the first processing component 22. The first processing component 22 identifies the target object from the object image, generates an environment map from the three-dimensional environment data, and determines the position information of the target object from the environment map; it then plans a movement path for the robot 20 according to the environment map, the position information of the target object, and the obstacle data, and generates a first control instruction from that path. Finally the power control component controls the robot 20 to execute the corresponding movement according to the first control instruction. The robot 20 can thus identify the target object autonomously, plan its movement path from the environment map, and perform active obstacle avoidance using the obstacle information; the control system provided by the embodiment of the application therefore improves the degree of intelligence of the robot 20.
Further, the robot 20 sends an environment map to the remote control device 10, and the remote control device 10 can display the environment of the robot 20 according to the environment map and a preset field environment module, so as to facilitate observation and control by an operator.
In one embodiment, as shown in fig. 3, the environmental perception component 21 further includes a gamma camera 216, the gamma camera 216 for acquiring radiation dose information; the first processing component 22 is further configured to generate an environment map according to the three-dimensional environment data and the radiation dose information, plan a movement path of the robot 20 according to the environment map, the position information of the target object, and the obstacle data, and generate a first control instruction according to the movement path of the robot 20.
Optionally, a gamma camera 216 may be disposed on top of the mobile carrier for acquiring radiation dose information on the robot 20 site.
The acquired radiation dose information is sent to the remote control device 10, which controls the display screen to show it, so that the distribution of nuclear radiation at the robot 20 site can be observed. Meanwhile, detection of the field radiation dose information allows high-dose areas to be avoided during path planning of the robot 20, reducing the robot's radiation exposure.
As shown in fig. 3, the robot 20 further includes: an inertial sensor 212; an inertial sensor 212 for detecting the motion characteristics of the robot 20 and acquiring the velocity parameters of the robot 20 and sending the velocity parameters to the first processing component 22; the first processing component 22 is specifically configured to determine the position information of the robot 20 according to the environment map and the speed parameter, plan a movement path of the robot 20 according to the environment map, the position information of the target object, the obstacle data, and the position information of the robot 20, and generate a first control instruction according to the movement path of the robot 20.
Optionally, an inertial sensor 212 is mounted in the center of the chassis of the mobile carrier for acquiring the acceleration or angular velocity of the robot 20.
The first processing component 22 locates the robot 20 according to the environment map and the speed parameter of the robot 20, plans the movement path of the robot 20 according to the environment map, the position information of the target object, the obstacle data, and the position information of the robot 20, and generates a first control instruction according to that path; the power control component then controls the robot 20 to execute the corresponding movement according to the first control instruction. By acquiring the motion characteristics of the robot 20 through the inertial sensor 212 and using them in positioning, positioning errors caused by the travel speed of the robot 20 can be avoided.
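The localisation step above — integrating the inertial velocity estimate, then correcting it with a map-based fix — can be sketched as a minimal complementary filter. The per-axis blend, the `gain` value, and both function names are assumptions for illustration, not the patent's algorithm.

```python
def dead_reckon(pose, velocity, dt):
    """Integrate the inertial velocity estimate over one control period.
    `pose` and `velocity` are (x, y) tuples in map coordinates."""
    return tuple(p + v * dt for p, v in zip(pose, velocity))

def fuse_position(predicted, map_fix, gain=0.3):
    """Pull the dead-reckoned pose toward a position fix obtained by
    matching lidar data against the environment map. A gain of 0 trusts
    the inertial estimate fully; a gain of 1 trusts the map fix fully."""
    return tuple(p + gain * (m - p) for p, m in zip(predicted, map_fix))
```

In practice each control cycle would call `dead_reckon` at the inertial sensor rate and `fuse_position` whenever a fresh map match is available.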
Further, the robot 20 is further configured to transmit the position information of the robot 20 and the obstacle data to the remote control device 10, and the remote control device 10 may display the position of the robot 20 in the environment map and the obstacle situation around the robot 20.
As shown in fig. 3, the hand-eye camera 217 is further configured to capture an object image around the mechanical arm and send it to the first processing component 22. The first processing component 22 plans the action path of each mechanical arm according to the object image around the mechanical arm and the environment information, generates a corresponding second control instruction and a corresponding third control instruction, and controls the power control component according to the two instructions so that each mechanical arm executes the corresponding motion.
The action paths of the two mechanical arms can be the same or different according to the surrounding environment of the mechanical arms.
In practical applications, the first processing component 22 plans the action path of each mechanical arm according to the image of the object around the mechanical arm and the environmental information to generate a corresponding second control instruction and a corresponding third control instruction, and then the power control component enables each mechanical arm to execute a corresponding motion according to the second control instruction and the third control instruction, so as to realize a collision-free motion from the current position to the target object position of the two mechanical arms.
Furthermore, the first processing assembly 22 is also configured to receive a target position sent by the operator via the remote control device 10, and to plan a collision-free motion path of the mechanical arm to that target position.
Furthermore, according to the task to be performed, the degrees of freedom of the mechanical arm may be allocated in joint space or in Cartesian space: some degrees of freedom are controlled by the operator through the remote control device 10 to execute the motion the operator desires, while the remaining degrees of freedom are controlled by the first processing assembly 22. Through this shared control, the two arms can cooperate to carry out coordinated operations and complete complex tasks.
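The degree-of-freedom allocation just described can be sketched as a per-joint command selector: joints assigned to the operator follow the teleoperation command, the rest follow the on-board planner. This joint-space version is an illustrative assumption; the patent also allows allocation in Cartesian space, which this sketch does not cover.

```python
def blend_joint_command(operator_cmd, autonomy_cmd, operator_dofs):
    """Build the commanded joint vector for one arm.

    operator_cmd  -- joint targets from the remote control device
    autonomy_cmd  -- joint targets from the first processing assembly
    operator_dofs -- set of joint indices the operator controls
    Joints in `operator_dofs` take the operator's value; all others take
    the autonomous planner's value.
    """
    return [operator_cmd[i] if i in operator_dofs else autonomy_cmd[i]
            for i in range(len(autonomy_cmd))]
```

Running the same selector on each arm, with different `operator_dofs` sets, lets the operator guide one aspect of the motion while the controller coordinates the rest.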
The mechanical arm further comprises a gripper tool head, and the robot 20 further comprises a clamping force sensor 218 arranged on the gripper tool head. The clamping force sensor 218 is used to acquire the clamping force of the gripper tool head and send it to the first processing assembly 22; the first processing assembly 22 is also configured to receive the clamping force and adjust it according to the weight of the object acted on by the gripper tool head.
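One simple way to derive a target clamping force from object weight is a two-finger friction-grasp model, F ≥ W / (2μ), with a safety margin, followed by a proportional correction toward the target. The friction coefficient, safety factor, gain, and function names below are all assumed values for illustration; the patent does not specify this model.

```python
def required_grip_force(object_weight_n, friction_coeff=0.5, safety_factor=1.5):
    """Minimum normal force (N) for a two-finger friction grasp to hold an
    object of the given weight (N), scaled by a safety factor: the two
    contact patches together must supply friction >= the weight."""
    return safety_factor * object_weight_n / (2.0 * friction_coeff)

def adjust_grip(measured_force, target_force, gain=0.2):
    """One proportional step moving the measured clamping force toward the
    target, as the processing assembly might do each control cycle."""
    return measured_force + gain * (target_force - measured_force)
```

Repeated calls to `adjust_grip` converge on the target without the step change that could crush or drop the object.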
In addition, other tool heads may be mounted on the mechanical arm, and the first processing assembly 22 can automatically replace the tool head on the mechanical arm according to a task instruction, giving a higher degree of intelligence.
As shown in fig. 4, the robot 20 control system further includes: a communication system 30, the communication system 30 comprising: a router 31 provided at a remote control end, a wireless bridge 33 provided in the robot 20, and at least one wireless network repeater 32 provided on a traveling path of the robot 20; the first processing component 22 is also used for detecting network signals of the communication system 30 in real time; robot 20 is also configured to place a wireless network repeater 32 in a location where the network signal is less than a predetermined threshold.
The communication system 30 operates in an ad hoc network manner, and realizes wireless communication between the remote control device 10 and the robot 20.
In one embodiment, the robot 20 is equipped with at least one wireless network repeater 32, the first processing component 22 detects the network signal of the communication system 30 in real time, and when the network signal at a certain position is smaller than a preset threshold value, the robot 20 places one wireless network repeater 32.
In another embodiment, the operator controls the robot 20 to advance, retreat, and turn through the remote control device 10. When the robot 20 encounters a heavy shielding body while traveling, the remote control device 10 sends an instruction to the robot 20, and the robot 20 places a wireless network repeater 32.
Alternatively, the heavy shielding body may be a thick concrete wall or another object that wireless signals can hardly penetrate.
The wireless network repeater 32 can increase the network coverage, and avoid the situation that data cannot be transmitted due to poor network signals.
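The autonomous placement rule — drop a repeater where the signal falls below a preset threshold, within the supply carried — can be sketched as a scan over the signal readings along the travel path, dropping once per contiguous weak stretch. The RSSI threshold and function name are assumed example values.

```python
def plan_repeater_drops(rssi_along_path, threshold_dbm=-75, repeaters=2):
    """Return the indices along the path where a repeater should be placed.

    rssi_along_path -- measured signal strengths (dBm) at successive points
    threshold_dbm   -- below this, the link is considered too weak
    repeaters       -- how many repeaters the robot carries
    One repeater is dropped at the start of each contiguous weak stretch,
    on the assumption that it restores coverage for the rest of that stretch.
    """
    drops = []
    in_weak_zone = False
    for i, rssi in enumerate(rssi_along_path):
        weak = rssi < threshold_dbm
        if weak and not in_weak_zone and len(drops) < repeaters:
            drops.append(i)
        in_weak_zone = weak
    return drops
```

On board, the same test would run on live readings from the first processing component's signal monitor rather than a recorded list.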
As shown in fig. 5, the robot 20 further includes a hydraulic control assembly 25. The power control assembly is further configured to send the first control instruction, the second control instruction, and the third control instruction to the hydraulic control assembly 25; the hydraulic control assembly 25 is used for controlling the robot 20 to execute the corresponding movement according to the first control instruction, and for controlling each mechanical arm to execute the corresponding movement according to the second and third control instructions.
The hydraulic control assembly 25 includes a high-precision shaft controller 251, an amplifier 252, a switching module 253, a high-precision proportional valve 254, and the like.
In practical application, the hydraulic control assembly 25 controls the robot 20 and the mechanical arm according to the first control command, the second control command and the third control command, so that the control accuracy can be improved, and further, the operation accuracy of the robot 20 can be improved.
As shown in fig. 6, the robot 20 further comprises a second processing component 23; the second processing component 23 is used to monitor the operating status of the first processing component 22 and to control the robot 20 to return to the initial position when the first processing component 22 fails.
Wherein the second processing assembly 23 comprises: a monitoring module 231, a vehicle body motion control module 232, a mechanical arm motion control module 233, and the like. The monitoring module 231 is used to monitor the operating status of the first processing assembly 22, and the vehicle body motion control module 232 and the robot arm motion control module 233 are redundant emergency control modules.
In the normal operating mode, the mobile carrier and mechanical arm movements are controlled by the first processing assembly 22. When a fault of the main control system of the robot 20 is detected, the second processing component 23 takes over control of the mobile carrier and the mechanical arm, so that the robot 20 can be evacuated from the site.
In view of the robot's application environment, the electronic components in the second processing assembly 23 are made of radiation-resistant materials so that radiation effects are avoided; therefore, in a strong-radiation environment, the second processing assembly 23 can still operate normally even if the first processing assembly 22 fails due to radiation.
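The redundancy scheme above amounts to a heartbeat watchdog: the primary controller signals liveness periodically, and the backup takes over when the heartbeat stops. The class below is a minimal sketch with an assumed one-second timeout; the injectable clock exists only to make the logic testable.

```python
import time

class Watchdog:
    """Heartbeat monitor for the primary (first) processing assembly.

    The primary calls beat() on every control cycle; if no beat arrives
    within `timeout` seconds, primary_failed() becomes True and the
    backup (second) assembly would assume control and retreat the robot.
    """
    def __init__(self, timeout=1.0, clock=time.monotonic):
        self.timeout = timeout
        self.clock = clock          # injectable for testing
        self.last_beat = clock()

    def beat(self):
        """Record a heartbeat from the primary controller."""
        self.last_beat = self.clock()

    def primary_failed(self):
        """True once the heartbeat has been silent longer than `timeout`."""
        return self.clock() - self.last_beat > self.timeout
```

A recovery beat after a takeover could likewise hand control back to the primary, though the patent only describes the retreat behaviour.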
As shown in fig. 7, the robot 20 further includes a power system 24. The power system 24 is composed of a diesel power generation system 241, a UPS (Uninterruptible Power Supply) system 242, a power conversion module 243, and the like, and supplies power to the robot 20.
The robot control system provided by the embodiment of the application includes the robot 20 and the remote control device 10. The robot 20 comprises a mechanical arm, an encoder arranged in the mechanical arm, and a wrist force sensor 214 arranged on the mechanical arm; the remote control device 10 comprises a virtual reality component 11 and a force feedback handle. The encoder acquires the joint state information of the mechanical arm, the robot 20 sends this information to the remote control device 10, and the virtual reality component 11 generates and displays a virtual reality posture image of the robot 20 according to the received joint state information and a preset three-dimensional model of the robot 20. Meanwhile, the wrist force sensor 214 acquires the acting force of the mechanical arm, the robot 20 sends the acting force to the remote control device 10, and the remote control device 10 sets the operating resistance of the force feedback handle according to the acting force and sends the operation instruction received through the force feedback handle to the robot 20, so that the robot 20 performs the corresponding action. The control system can thus build a virtual reality posture image of the robot 20 in real time, providing the remote operator with rich visual perception and, through the force feedback, realistic force perception, improving the accuracy of robot 20 control.
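The force-feedback loop summarised above needs a mapping from the measured arm force to the handle's operating resistance. A simple linear, clamped mapping is sketched below; the scaling constants and the normalised-resistance convention are assumptions for illustration, not values from the patent.

```python
def handle_resistance(wrist_force_n, max_force_n=200.0, max_resistance=1.0):
    """Map the wrist force sensor reading (N) to a normalised resistance
    command in [0, max_resistance] for the force feedback handle, so the
    operator feels proportionally more resistance as the arm pushes or
    lifts harder. Readings beyond max_force_n saturate."""
    scaled = max_resistance * wrist_force_n / max_force_n
    return max(0.0, min(max_resistance, scaled))
```

Each telemetry update from the robot would recompute the resistance before the next operator command is sampled from the handle.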
The technical features of the above embodiments can be combined arbitrarily. For brevity, not every possible combination is described; nevertheless, any combination of these technical features that contains no contradiction should be considered within the scope of this specification.
The above embodiments express only several implementations of the present application, and although their description is specific and detailed, they should not be construed as limiting the scope of the invention. It should be noted that a person skilled in the art can make several variations and improvements without departing from the concept of the present application, and these fall within the protection scope of the present application. Therefore, the protection scope of this patent shall be subject to the appended claims.
Claims (10)
1. A robot control system is characterized in that the system comprises a robot and a remote control device; the robot comprises a mechanical arm, an encoder arranged in the mechanical arm and a wrist force sensor arranged on the mechanical arm, and the remote control equipment comprises a virtual reality assembly and a force feedback handle;
the encoder is used for acquiring joint state information of the mechanical arm, and the robot is used for sending the joint state information to the remote control equipment;
the virtual reality component is used for generating and displaying a virtual reality attitude image of the robot according to the received joint state information and a preset three-dimensional model of the robot;
the wrist force sensor is used for acquiring the acting force of the mechanical arm;
the robot is used for sending the acting force to the remote control equipment;
the remote control equipment is used for setting the control resistance of the force feedback handle according to the acting force and sending an operation instruction received based on the force feedback handle to the robot so as to enable the robot to perform corresponding actions.
2. The robot control system of claim 1, wherein the robot further comprises: the system comprises an environment sensing component, a first processing component and a power control component;
the environment sensing component is used for detecting and acquiring environment information of the robot and sending the environment data to the first processing component;
the first processing component is used for planning the action path of the robot according to the environment information and generating a first control instruction according to the action path of the robot;
and the power control assembly is used for controlling the robot to execute corresponding movement according to the first control instruction.
3. The robot control system of claim 2, wherein the environmental data comprises: object images, obstacle data, and three-dimensional environment data; the context awareness component includes: a pan-tilt camera, a three-dimensional laser radar and a solid-state laser radar;
the holder camera is used for shooting the object image and sending the object image to the first processing component;
the three-dimensional laser radar is used for acquiring the three-dimensional environment data and sending the three-dimensional environment data to the first processing component;
the solid-state laser radar is used for acquiring the obstacle data and sending the obstacle data to the first processing component;
the first processing component is specifically used for identifying and obtaining a target object according to the object image; generating an environment map according to the three-dimensional environment data, and determining the position information of the target object according to the environment map; planning a motion path of the robot according to the environment map, the position information of the target object and the obstacle data, and generating the first control instruction according to the motion path of the robot.
4. The robotic control system of claim 3, wherein the environmental perception component further comprises a gamma camera;
the gamma camera is used for acquiring radiation dose information;
the first processing component is further configured to generate the environment map according to the three-dimensional environment data and the radiation dose information, plan a movement path of the robot according to the environment map, the position information of the target object, and the obstacle data, and generate the first control instruction according to the movement path of the robot.
5. The robot control system of claim 4, wherein the robot further comprises: an inertial sensor;
the inertial sensor is used for detecting the motion characteristic of the robot, acquiring the speed parameter of the robot and sending the speed parameter to the first processing component;
the first processing component is specifically configured to determine location information of the robot according to the environment map and the speed parameter, plan a movement path of the robot according to the environment map, the location information of the target object, the obstacle data, and the location information of the robot, and generate the first control instruction according to the movement path of the robot.
6. The robot control system of claim 2, wherein the robot further comprises: a hand-eye camera;
the hand-eye camera is used for shooting an object image around the mechanical arm and sending the object image around the mechanical arm to the first processing component;
the first processing assembly plans the action path of each mechanical arm according to the object image around the mechanical arm and the environment information to generate a corresponding second control instruction and a corresponding third control instruction;
and the power control assembly respectively controls each mechanical arm to execute corresponding movement according to the second control instruction and the third control instruction.
7. The robot control system of claim 6, wherein the robot further comprises a hydraulic control assembly;
the power control assembly is further used for sending the first control command, the second control command and the third control command to the hydraulic control assembly;
and the hydraulic control assembly is used for controlling the robot to execute corresponding movement according to the first control instruction and controlling each mechanical arm to execute corresponding movement according to the second control instruction and the third control instruction.
8. The robot control system of claim 2, wherein the mechanical arm further comprises a gripper tool head, and the robot further comprises a clamping force sensor disposed on the gripper tool head;
the clamping force sensor is used for acquiring the clamping force of the clamping jaw tool head and sending the clamping force to the first processing assembly;
the first processing assembly is further configured to receive the clamping force and adjust the clamping force according to the weight of the object acted upon by the gripper tool head.
9. The robot control system of claim 1, further comprising: a communication system, the communication system comprising: the router is arranged at the remote control end, the wireless network bridge is arranged in the robot, and the at least one wireless network repeater is arranged on a traveling path of the robot;
the first processing component is further used for detecting network signals of the communication system in real time;
the robot is also used for placing a wireless network repeater at the position where the network signal is smaller than the preset threshold value.
10. The robot control system of claim 2, wherein the robot further comprises a second processing component;
the second processing assembly is used for monitoring the running state of the first processing assembly and controlling the robot to return to an initial position when the first processing assembly fails.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202110831069.9A CN113618731A (en) | 2021-07-22 | 2021-07-22 | Robot control system |
Publications (1)
Publication Number | Publication Date |
---|---|
CN113618731A true CN113618731A (en) | 2021-11-09 |
Family
ID=78380587
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202110831069.9A Pending CN113618731A (en) | 2021-07-22 | 2021-07-22 | Robot control system |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN113618731A (en) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN114200927A (en) * | 2021-11-12 | 2022-03-18 | 北京时代富臣智能科技有限公司 | Logistics robot system |
CN114872050A (en) * | 2022-06-01 | 2022-08-09 | 清华大学深圳国际研究生院 | Control method and control system for double-arm crawler-type mobile operation robot |
Citations (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN103722554A (en) * | 2014-01-15 | 2014-04-16 | 西北工业大学 | Space teleoperation method |
CN106504605A (en) * | 2016-10-21 | 2017-03-15 | 东南大学 | The simulation control subsystem of space-oriented station mechanical arm force feedback remote operating training |
CN108177149A (en) * | 2018-02-08 | 2018-06-19 | 苏州融萃特种机器人有限公司 | Movable mechanical arm control system and method based on MR and motion planning technology |
CN108422435A (en) * | 2018-03-21 | 2018-08-21 | 青岛理工大学 | Remote monitoring and control system based on augmented reality |
CN108519814A (en) * | 2018-03-21 | 2018-09-11 | 北京科技大学 | A kind of man-machine interactive operation system |
CN109164829A (en) * | 2018-10-23 | 2019-01-08 | 哈尔滨工业大学(深圳) | A kind of flight mechanical arm system and control method based on device for force feedback and VR perception |
CN109434870A (en) * | 2018-09-18 | 2019-03-08 | 国网江苏省电力有限公司盐城供电分公司 | A kind of virtual reality operation system for robot livewire work |
CN109933069A (en) * | 2019-03-21 | 2019-06-25 | 东南大学 | The conducting wire flaw detection robot tele-control system and control method of view-based access control model and force feedback |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN108789416B (en) | System and method for controlling flexible live working of double mechanical arms | |
CN113618731A (en) | Robot control system | |
CN103324197A (en) | Voice-control multi-functional intelligent service robot | |
JPH11149315A (en) | Robot control system | |
CN212683967U (en) | Autonomous mobile robot control system and robot | |
CN110696012B (en) | Intelligent robot system for distribution room | |
Szczurek et al. | Mixed reality human–robot interface with adaptive communications congestion control for the teleoperation of mobile redundant manipulators in hazardous environments | |
CN111618854A (en) | Task segmentation and collaboration method for security robot | |
CN110666820A (en) | High-performance industrial robot controller | |
RU124622U1 (en) | MOBILE ROBOT CONTROL SYSTEM | |
Sato et al. | Development of tele-operated underfloor mobile manipulator | |
Hentout et al. | A telerobotic human/robot interface for mobile manipulators: A study of human operator performance | |
Yan et al. | Adaptive vision-based control of redundant robots with null-space interaction for human-robot collaboration | |
CN211163971U (en) | Intelligent robot system of power distribution room | |
Lin et al. | A Advanced Telerobotic Control System for a Mobile Robot with Multisensor Feedback | |
Houshangi et al. | Omnibot mobile base for hazardous environment | |
Shin et al. | Design of a remote control system for maintaining and repairing tasks in npp | |
CN107363831A (en) | The teleoperation robot control system and method for view-based access control model | |
Edlinger et al. | Solving disaster-response tasks at a nuclear power plant under realistic conditions | |
JPH10230484A (en) | Machining work robot | |
KR101321984B1 (en) | Path error compensation method for valve operating based on cooperation between robots and system thereof | |
Kurdi et al. | Design and development of efficient guidance system using multifunctional robot with quadcopter | |
KR101864758B1 (en) | Egocentric Tele-operation Control With Minimum Collision Risk | |
Salmanipour et al. | Teleoperation of a mobile robot with model-predictive obstacle avoidance control | |
JPH09216184A (en) | Remote operation type robot monitoring system |
Legal Events
Date | Code | Title | Description
---|---|---|---
| PB01 | Publication | |
| SE01 | Entry into force of request for substantive examination | |
| RJ01 | Rejection of invention patent application after publication | Application publication date: 20211109 |