CN118061188A - Mechanical arm teleoperation system and method based on mixed reality - Google Patents

Mechanical arm teleoperation system and method based on mixed reality

Info

Publication number
CN118061188A
CN118061188A (application CN202410402050.6A)
Authority
CN
China
Prior art keywords
mechanical arm
mixed reality
physical
objects
teleoperation
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202410402050.6A
Other languages
Chinese (zh)
Inventor
吴海彬
蔡孝锦
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Fuzhou University
Original Assignee
Fuzhou University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Fuzhou University filed Critical Fuzhou University
Priority to CN202410402050.6A priority Critical patent/CN118061188A/en
Publication of CN118061188A publication Critical patent/CN118061188A/en
Pending legal-status Critical Current

Landscapes

  • Processing Or Creating Images (AREA)

Abstract

The invention relates to a mixed reality-based teleoperation system and method for a mechanical arm. The system comprises: a physical mechanical arm; a mixed reality device for wireless interaction between the operator and the physical mechanical arm, in which a digital twin of the physical mechanical arm and its surrounding environment is built; and a video acquisition device for capturing images of the physical mechanical arm and its surroundings and presenting them in the mixed reality device. The method comprises the following steps: operating a virtual control panel of the mechanical arm in the mixed reality device with gesture actions, acquiring the corresponding control signals and mapping them to position and posture data of the mechanical arm end; solving the joint motion parameters of the mechanical arm and sending them to the physical mechanical arm so that it moves to the corresponding pose, while updating the state of the digital twin according to the state data fed back by the physical mechanical arm; performing collision detection while the mechanical arm moves; and allowing the operator to observe the pose of the mechanical arm from different angles. The system and method allow the motion of the mechanical arm to be controlled conveniently and accurately and its state to be perceived.

Description

Mechanical arm teleoperation system and method based on mixed reality
Technical Field
The invention relates to the technical field of robots, in particular to a teleoperation system and a teleoperation method for a mechanical arm based on mixed reality.
Background
As robotics matures, a wide variety of robots are being applied in many fields, replacing human labor in diverse environments. However, current artificial intelligence cannot yet replace human decision making in industry, and work in some hazardous and complex environments still cannot be separated from human judgment. The main purpose of teleoperation is to let humans make decisions in a safe environment while a device carries out their intentions in the actual working environment, so that the device replaces human labor in dangerous environments that are unsuitable for human activity. The core of a teleoperation system is therefore to capture the operator's intent, have the device execute it, and sense and feed back the remote environment.
A master-slave teleoperation system is a typical teleoperation system. In terms of hardware it can be divided into a master-end control module and a slave-end action module: the operator receives information fed back from the slave end through various display devices at the master end and inputs signals through various acquisition devices; these signals are converted into device control signals and transmitted to the slave end, so that the slave end acts according to the operator's intention. Traditional robot teleoperation mainly uses a teach pendant, a joystick and similar devices at the master end to input control signals. Such equipment often has to be specially designed and manufactured, has poor generality, and in practice places high demands on operator experience; an operator can master the relevant skills only after a certain period of training. Meanwhile, traditional teleoperation has weak perception and feedback of the remote environment, which is mainly fed back as two-dimensional planar images, so there are many visual blind areas and the response to emergencies is slow. Especially in complex operating environments, the operator must not only perform remote control but also capture changes in the remote environment in time and adjust accordingly.
More specifically, a teleoperation system generally comprises a control-instruction input part and a state feedback part. The control-instruction input part converts the operator's intention into control signals for robot motion, so that the robot moves as the operator intends. Traditional teleoperation interaction is mechanical: operators need to adapt to the characteristics of each interaction device and learn the mapping from the device to the robot, which increases the learning and usage burden. With the development of machine learning, computer vision and related technologies, more and more teleoperation systems are shifting to human-centered natural human-machine interaction. Natural human-machine interaction includes voice interaction, gesture interaction, eye-movement control, brain-computer interfaces and other forms; among them, gestures remain the most common means, and the related motion-capture equipment has matured, so gesture interaction is widely used in teleoperation, replacing conventional input devices such as teach pendants and joysticks.
The state feedback part feeds back slave-end environment information or robot state information, such as pose, velocity and acceleration, to the operator when the master-slave distance is large, so that the operator can perceive the state of the slave-end device at the master end. For teleoperation systems with a short master-slave distance, the robot's state can be observed directly with the naked eye. As the distance between master and slave grows until the slave-end state can no longer be perceived directly, state feedback from the slave end becomes more important. In traditional teleoperation systems, multi-source visual feedback from slave-end cameras is most commonly used, but two-dimensional images lack depth information and cannot intuitively reflect the slave-end state; operators often have to pay close attention to multiple video streams during operation, which increases their burden and reduces working efficiency.
Mixed Reality (MR) combines the advantages of Virtual Reality (VR) and Augmented Reality (AR): it introduces a virtual environment into a real scene and creates an information loop between the real world, the virtual world and the operator, thereby enhancing the operator's immersion and interaction. Existing mixed reality devices provide relatively mature gesture control; conventional gestures such as clicking and dragging can be used conveniently, enabling control modes that conform to human operating habits. Mixed reality can present the virtual and real environments simultaneously. When the master-slave distance is short, the virtual device can be superimposed on the physical device and used for motion planning, trajectory simulation and the like, ensuring the safety of the physical device during motion. When the master-slave distance is large, a virtual environment identical to the slave-end environment can be constructed at the master end, and the motion of the slave end is fed back in this virtual environment. With its natural three-dimensional visual presentation and virtual interfaces, mixed reality compensates well for the inconvenient operation, delayed feedback and lack of intuitiveness of traditional teleoperation systems.
Although mixed reality can construct a complete digital twin and the operator can observe it from different angles by moving the viewpoint, visual blind areas still exist in the actual scene. A teleoperation system therefore still needs a collision detection module to guarantee safe operation. A physical robot is equipped with various torque sensors, and collisions with the surroundings can be detected by analyzing the torque of each joint; however, by that point the robot has usually already struck the surrounding object, so this approach alone cannot guarantee the safety of the equipment.
Among existing approaches, collision detection algorithms based on image space are the most widely used. In recent years, bounding-box-based collision detection algorithms have become common in research, particularly in the simulation of complex virtual environments and in 3D game engines. The basic idea of a bounding box is to enclose a complex geometric shape in a slightly larger volume with simple geometric characteristics. When two objects are checked for collision, their bounding boxes are first tested for intersection; if they do not intersect, the objects cannot collide, otherwise a more precise test is required. This quickly eliminates disjoint objects and improves the efficiency of collision detection.
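As an illustration of this quick-reject idea (a minimal sketch, not part of the patent itself; the class and function names are hypothetical), an axis-aligned bounding box (AABB) overlap test can be written as:

```python
from dataclasses import dataclass

@dataclass
class AABB:
    """Axis-aligned bounding box defined by its minimum and maximum corners."""
    min_pt: tuple  # (x1, y1, z1)
    max_pt: tuple  # (x2, y2, z2)

def aabb_overlap(a: AABB, b: AABB) -> bool:
    """Return True if the two boxes intersect on every axis.

    If this returns False, the enclosed objects cannot collide, so the
    expensive precise test is skipped (the quick-reject step described above).
    """
    return all(a.min_pt[i] <= b.max_pt[i] and b.min_pt[i] <= a.max_pt[i]
               for i in range(3))

# Example: two unit cubes that partially overlap
box_a = AABB((0.0, 0.0, 0.0), (1.0, 1.0, 1.0))
box_b = AABB((0.5, 0.5, 0.5), (1.5, 1.5, 1.5))
print(aabb_overlap(box_a, box_b))  # True
```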
Disclosure of Invention
The invention aims to provide a teleoperation system and a teleoperation method for a mechanical arm based on mixed reality, which can conveniently and accurately control the movement of the mechanical arm and sense the state of the mechanical arm.
In order to achieve the above purpose, the technical scheme adopted by the invention is as follows: a mixed reality-based robotic teleoperation system, comprising:
A physical robotic arm;
The mixed reality device is used for realizing interaction between the human body and the physical mechanical arm; a digital twin of the physical mechanical arm and its surrounding environment is built into the mixed reality device, which can communicate wirelessly with the physical mechanical arm;
And the video acquisition equipment is used for acquiring images of the physical mechanical arm and the surrounding environment and presenting the images in the mixed reality equipment in real time.
Further, the physical mechanical arm is provided with a mechanical arm control cabinet, through which it communicates wirelessly with the mixed reality device; the digital twin in the mixed reality device updates its own state according to the state data of the physical mechanical arm fed back by the control cabinet over the wireless link.
The invention also provides a teleoperation method of the mechanical arm based on mixed reality, which is realized by adopting the system and comprises the following steps:
S1, using the mixed reality device together with the corresponding gesture actions, operating the virtual control panel of the mechanical arm and obtaining the corresponding control signals;
S2, mapping the control signals acquired in step S1 to position and posture data of the mechanical arm end inside the mixed reality device;
S3, performing forward and inverse kinematic computation on the position and posture data obtained in step S2 and solving the joint motion parameters of the mechanical arm;
S4, sending the joint motion parameters obtained in step S3 to the physical mechanical arm so that it moves to the corresponding pose, while updating the state of the digital twin according to the state data fed back by the physical mechanical arm;
S5, while the mechanical arm moves, performing collision detection synchronously at a certain frequency; if a collision is about to occur, the mechanical arm stops moving and the system gives an alarm; if no collision occurs, the system keeps running until it is shut down;
S6, the operator wearing the mixed reality device can observe the pose of the mechanical arm from different angles by moving; meanwhile the video acquisition device shows the actual state of the mechanical arm, reducing visual blind areas and providing a reference for the next movement.
Further, in step S1, the virtual control panel is a dedicated virtual control panel developed according to the structural features of the physical mechanical arm, and the corresponding gesture actions include finger clicking, grabbing and dragging.
Further, the digital twin can be placed at any position in space according to the needs of the actual situation, including superimposing the digital twin on the actual environment.
Further, the operator can observe the pose of the digital twin from different angles by moving and switching viewpoints, so that visual blind areas are reduced as much as possible.
Further, in step S5, collision detection is performed with a bounding-box-based collision detection algorithm, comprising the following steps:
S501, constructing a digital twin of the physical mechanical arm and the surrounding environment;
S502, dividing each whole object and its parts into different layers according to the structural characteristics of the equipment and environment and the partitioning principle of an N-ary tree, and wrapping each relevant object completely and snugly with a suitable bounding box;
S503, judging whether two objects collide, i.e. detecting whether their bounding boxes interfere: first, interference between the bounding boxes of the two objects' root nodes is computed; if there is no interference, the two objects do not collide; otherwise the two objects may collide, and the interference calculation continues with the bounding boxes of the next layer of the two objects, traversing all parts of both objects at the same level, until all layers of both objects have been traversed;
S504, repeating step S503 until all objects in the system have been traversed.
Compared with the prior art, the invention has the following beneficial effects:
1. The mixed reality-based teleoperation system for a mechanical arm has good generality: digital twins of a 6-joint mechanical arm, an end effector and the working environment can be built, the visual presentation of mixed reality largely overcomes the weak perception of the remote environment in traditional teleoperation systems, and the system can be applied to a wide range of scenarios that require teleoperation.
2. The interaction is based on a virtual interface developed on mixed reality; only the layout and functions of the control panel need to be designed according to the characteristics of the equipment, the interface can be modified and extended conveniently and flexibly, and the complicated design and manufacturing of a physical controller in a traditional teleoperation system is eliminated.
3. Control instructions are input through gesture operations that follow human operating habits, such as clicking and dragging, in an intuitive form. Compared with operating physical devices such as joysticks and teach pendants in traditional teleoperation systems, this input mode has better usability and extensibility and can be applied widely to teleoperation scenarios.
4. The collision detection system based on the bounding-box algorithm can detect in real time, while the mechanical arm is running, whether any of its parts interferes with the surrounding environment, so that collisions are avoided. In particular, collision detection covers the operator's visual blind areas, which improves the safety of teleoperation.
Drawings
FIG. 1 is a schematic diagram of the composition and structure of a teleoperation system according to an embodiment of the present invention;
FIG. 2 is a virtual control interface in an embodiment of the invention;
FIG. 3 is a schematic view of the hierarchical division and interference calculation of the 6-joint mechanical arm in an embodiment of the present invention;
FIG. 4 is a schematic diagram of the interference calculation for the cylindrical bounding box of a joint link of the 6-joint mechanical arm in an embodiment of the present invention;
FIG. 5 is a graph of D-H modeling parameters for a UR5 robot in an embodiment of the present invention.
In fig. 1: 1. mixed reality device; 2. 6-joint mechanical arm; 3. other devices in the working environment; 4. mechanical arm control cabinet; 5. digital twin corresponding to the 6-joint mechanical arm; 6. digital twins corresponding to the other devices; 7. wireless communication; 8. router; 9. display; 10. video acquisition device.
The meaning of the parameters in fig. 5 is as follows:
# denotes the transformation from coordinate system n to coordinate system n+1 (n = 0, 1, 2, 3, 4, 5, hereinafter);
θ represents a rotation of θ_{n+1} about the z_n axis so that x_n and x_{n+1} become parallel, i.e. Rot(z, θ_{n+1});
d represents a translation of distance d_{n+1} along the z_n axis so that x_n and x_{n+1} become collinear, i.e. Trans(0, 0, d_{n+1});
a denotes a translation of a_{n+1} along the (already rotated) x_n axis so that the origins of the coordinate systems of x_n and x_{n+1} coincide, i.e. Trans(a_{n+1}, 0, 0);
α denotes a rotation of the z_n axis about the x_{n+1} axis by α_{n+1} so that z_n and z_{n+1} become collinear, i.e. Rot(x, α_{n+1}).
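Composing these four elementary operations gives the standard D-H homogeneous transformation between adjacent coordinate systems (restated here for readability; this is the textbook form implied by the description above, not reproduced from Fig. 5):

```latex
T_{n}^{\,n+1}
  = \mathrm{Rot}(z,\theta_{n+1})\,\mathrm{Trans}(0,0,d_{n+1})\,
    \mathrm{Trans}(a_{n+1},0,0)\,\mathrm{Rot}(x,\alpha_{n+1})
  = \begin{bmatrix}
      \cos\theta_{n+1} & -\sin\theta_{n+1}\cos\alpha_{n+1} &  \sin\theta_{n+1}\sin\alpha_{n+1} & a_{n+1}\cos\theta_{n+1}\\
      \sin\theta_{n+1} &  \cos\theta_{n+1}\cos\alpha_{n+1} & -\cos\theta_{n+1}\sin\alpha_{n+1} & a_{n+1}\sin\theta_{n+1}\\
      0                &  \sin\alpha_{n+1}                 &  \cos\alpha_{n+1}                 & d_{n+1}\\
      0                &  0                                &  0                                & 1
    \end{bmatrix}
```

The end-effector pose is the chained product T_0^6 = T_0^1 T_1^2 ... T_5^6 evaluated at the current joint angles; forward kinematics evaluates this product, while inverse kinematics solves it for the joint angles given a desired end pose.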
Detailed Description
The invention will be further described with reference to the accompanying drawings and examples.
It should be noted that the following detailed description is exemplary and is intended to provide further explanation of the application. Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this application belongs.
It is noted that the terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of exemplary embodiments according to the present application. As used herein, the singular is also intended to include the plural unless the context clearly indicates otherwise, and furthermore, it is to be understood that the terms "comprises" and/or "comprising" when used in this specification are taken to specify the presence of stated features, steps, operations, devices, components, and/or combinations thereof.
As shown in fig. 1, this embodiment provides a mixed reality-based mechanical arm teleoperation system comprising a physical mechanical arm 2, a mixed reality device 1, a video acquisition device 10 and a wireless network for wireless communication.
The mixed reality device 1 is used for realizing interaction between the human body and the physical mechanical arm 2; a digital twin of the physical mechanical arm and its surrounding environment is built into it, and it can communicate wirelessly with the physical mechanical arm.
The video acquisition device 10 is used for acquiring images of the physical mechanical arm 2 and the surrounding environment and presenting the images in the mixed reality device in real time.
After the operator at the master end puts on the mixed reality device, the device collects the operator's gesture actions to control the virtual control panel. The collected gesture signals are processed and converted, through inverse kinematics and related algorithms, into control signals such as the pose of the mechanical arm; these are sent over the wireless network to the slave-end mechanical arm, which performs the corresponding motion. Meanwhile, the digital twin of the mechanical arm updates its own state according to the information fed back from the slave end, so that the two move synchronously. The image acquisition system at the slave end also feeds the relevant environment images back to the mixed reality device in real time.
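The following minimal skeleton illustrates the shape of this master-slave loop. It is a sketch under assumed interfaces, not the patent's implementation: `capture_gesture`, `inverse_kinematics`, `send_joint_targets`, `receive_arm_state` and `update_digital_twin` are hypothetical stand-ins for the HoloLens gesture input, the kinematics solver and the wireless link; the 125 Hz rate is the feedback frequency mentioned later in this embodiment.

```python
import time

def capture_gesture():
    """Placeholder for reading the operator's gesture on the virtual panel."""
    return {"target_pose": (0.4, 0.1, 0.3, 0.0, 3.14, 0.0)}  # x, y, z, rx, ry, rz

def inverse_kinematics(pose):
    """Placeholder IK: map an end-effector pose to six joint angles."""
    return [0.0] * 6

def send_joint_targets(joints):
    """Placeholder for sending joint targets to the slave arm over the wireless link."""
    pass

def receive_arm_state():
    """Placeholder for the state data fed back by the arm control cabinet."""
    return {"joints": [0.0] * 6}

def update_digital_twin(state):
    """Placeholder for refreshing the twin's pose inside the MR scene."""
    pass

def teleoperation_loop(period_s=1.0 / 125):  # 125 Hz feedback rate from the description
    while True:
        gesture = capture_gesture()                 # master end: operator intent
        joints = inverse_kinematics(gesture["target_pose"])
        send_joint_targets(joints)                  # slave end executes
        update_digital_twin(receive_arm_state())    # twin mirrors the physical arm
        time.sleep(period_s)
```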
In this embodiment, the physical mechanical arm 2 is configured with a mechanical arm control cabinet 4, the physical mechanical arm performs wireless communication with the mixed reality device through the mechanical arm control cabinet, and the digital twin body in the mixed reality device updates its own state according to each item of state data of the physical mechanical arm fed back by the mechanical arm control cabinet through wireless communication.
In this embodiment, the wireless communication connection uses one or more wireless communication modes such as a local network, Wi-Fi or Bluetooth.
The embodiment also provides a teleoperation method of the mechanical arm based on mixed reality, which is realized by adopting the system and comprises the following steps:
S1, using the mixed reality device together with the corresponding gesture actions, operating the virtual control panel of the mechanical arm and obtaining the corresponding control signals;
S2, mapping the control signals acquired in step S1 to position and posture data of the mechanical arm end inside the mixed reality device;
S3, performing forward and inverse kinematic computation on the position and posture data obtained in step S2 and solving the joint motion parameters of the mechanical arm;
S4, sending the joint motion parameters obtained in step S3 to the physical mechanical arm so that it moves to the corresponding pose, while updating the state of the digital twin according to the state data fed back by the physical mechanical arm;
S5, while the mechanical arm moves, performing collision detection synchronously at a certain frequency; if a collision is about to occur, the mechanical arm stops moving and the system gives an alarm; if no collision occurs, the system keeps running until it is shut down;
S6, the operator wearing the mixed reality device can observe the pose of the mechanical arm from different angles by moving; meanwhile the video acquisition device shows the actual state of the mechanical arm, reducing visual blind areas and providing a reference for the next movement.
In step S1, the virtual control panel is designed and developed according to the structural features of the physical mechanical arm, so that it conforms to intuitive human operating habits. The corresponding gesture actions include, but are not limited to, finger clicks, grabs and drags.
According to the actual situation, the mixed reality scene can be superimposed on the real scene, placed in a space away from the actual working scene, or placed at any position the operator desires.
The operator can observe the pose of the digital twin from different angles by moving and switching viewpoints, so that visual blind areas are reduced as much as possible.
Because the human eye still has blind spots when perceiving the virtual environment while the equipment is running, and in order to ensure the safety of the teleoperation system, this embodiment also provides a bounding-box-based collision detection algorithm to detect in real time whether the mechanical arm collides with the surrounding environment during operation. It specifically comprises the following steps:
S501, constructing a digital twin of the slave-end environment, i.e. of the physical mechanical arm and its surroundings, including the mechanical arm, workpieces, the workbench, surrounding obstacles and the like.
S502, dividing each whole object and its parts into different layers according to the structural characteristics of the equipment and environment and the partitioning principle of an N-ary tree, and wrapping each relevant object completely and tightly with a suitable bounding box.
Specifically, according to the different structural characteristics of each part of the digital twin, all objects in the virtual space are hierarchically divided according to the rule of an N-ary tree. For example, the mechanical arm can be divided into two layers: the root node is the whole mechanical arm, and the leaf nodes comprise the base, each transmission joint, the end effector and so on. For scenes requiring higher precision, the depth of the tree can be increased and the nodes subdivided further, so that the model is closer to reality.
Then, three-dimensional shapes (bounding boxes) such as spheres, cylinders and cuboids are used to wrap each of the nodes divided above as closely and completely as possible; these regular, simple shapes replace the actual objects with complex edges in the interference calculation, reducing the amount of computation as much as possible and improving the efficiency of collision detection. A code sketch of such a hierarchy and its traversal is given after these steps.
S503, when the collision detection flow starts, every pair of objects in the scene must be checked for collision. Judging whether two objects collide amounts to detecting whether their bounding boxes interfere. First, interference between the bounding boxes of the two objects' root nodes is computed. If there is no interference between the root-node bounding boxes, the two objects do not collide; otherwise the objects may collide, and the interference calculation continues with the bounding boxes of the next layer of the two objects, traversing all parts of the two objects at the same level, until all layers of both objects have been traversed.
S504, repeating step S503 until all objects in the system have been traversed.
If no two objects in the scene interfere, no collision is occurring and the mechanical arm continues to operate; if interference is detected, the mechanical arm should pause its motion and the system gives an alarm.
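The hierarchical wrapping and layer-by-layer traversal of steps S501-S504 can be sketched as follows. This is a minimal illustration under assumed data structures, not the patent's implementation; AABBs stand in here for whichever box, sphere or cylinder volumes are actually chosen, and all names and numeric extents are hypothetical.

```python
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class AABB:
    """Axis-aligned box used as a stand-in for any simple bounding volume."""
    min_pt: Tuple[float, float, float]
    max_pt: Tuple[float, float, float]

    def overlaps(self, other: "AABB") -> bool:
        return all(self.min_pt[i] <= other.max_pt[i] and
                   other.min_pt[i] <= self.max_pt[i] for i in range(3))

@dataclass
class BVHNode:
    """One node of the N-ary bounding-volume tree: the root wraps a whole
    object, leaf nodes wrap its parts (base, joints, end effector, ...)."""
    name: str
    box: AABB
    children: List["BVHNode"] = field(default_factory=list)

def trees_collide(a: BVHNode, b: BVHNode) -> bool:
    """S503: if the boxes at this level do not interfere, stop; otherwise
    descend one level on each side and test every pair of parts."""
    if not a.box.overlaps(b.box):
        return False
    if not a.children and not b.children:
        return True                      # deepest layer still interferes
    left = a.children if a.children else [a]
    right = b.children if b.children else [b]
    return any(trees_collide(x, y) for x in left for y in right)

def scene_collision(objects: List[BVHNode]) -> bool:
    """S504: repeat the pairwise test over every pair of objects in the scene."""
    return any(trees_collide(objects[i], objects[j])
               for i in range(len(objects)) for j in range(i + 1, len(objects)))

# Two-layer hierarchy for the arm, as in the example above, plus one obstacle
arm = BVHNode("robot_arm", AABB((0, 0, 0), (1.0, 1.0, 1.5)), [
    BVHNode("base",         AABB((0.3, 0.3, 0.0), (0.7, 0.7, 0.2))),
    BVHNode("joint_link_1", AABB((0.4, 0.4, 0.2), (0.6, 0.6, 0.8))),
    BVHNode("end_effector", AABB((0.4, 0.4, 0.8), (0.6, 0.6, 1.0))),
])
obstacle = BVHNode("workpiece", AABB((0.55, 0.55, 0.5), (0.9, 0.9, 0.9)))
print(scene_collision([arm, obstacle]))  # True: some arm parts overlap the workpiece
```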
Further details of one embodiment are provided below.
As shown in fig. 1, the mixed reality-based mechanical arm teleoperation system provided in this embodiment comprises: a mixed reality device (HoloLens glasses) 1 for interaction between the human body and the mechanical arm, a physical 6-joint UR5 mechanical arm 2 and its digital twin 5, other devices 3 in the working environment and their digital twins 6, the wireless communication 7 and router 8, a display 9, and a video acquisition system 10 for capturing the environment in which the physical mechanical arm is located. A digital twin of the physical mechanical arm and the surrounding environment is built into the mixed reality device, which can communicate wirelessly with the physical mechanical arm; the video acquisition system can capture images of the physical mechanical arm and its surroundings and display them on the display in real time.
As shown in fig. 2, this embodiment further provides, on top of the above system, a set of gesture control schemes that conform to human operating habits, including but not limited to the following:
(1) If the desired pose data of the mechanical arm are known, they can be entered directly into the virtual panel as a pose relative to the base; after confirmation, the mechanical arm moves to the corresponding pose.
(2) If the specific pose data are unknown but the mechanical arm is expected to move into a certain area and reach a given attitude, the end-effector part of the digital twin can be dragged to the corresponding position in three-dimensional space and the attitude of the effector then adjusted (these two steps can be interchanged); after confirmation, the mechanical arm moves to the corresponding pose.
(3) Because human gestures cannot perform very fine movements, and the recognition accuracy of the mixed reality device cannot meet the corresponding requirement, a click-based control mode is needed when the mechanical arm must move very precisely. Clicking the +/- buttons at the two ends of a joint-angle value finely controls that joint's motion, and the sensitivity can be adjusted according to the required motion precision, as illustrated in the sketch after this list.
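A possible form of this click-based fine jog is sketched below; the function name and the sensitivity and angle values are illustrative assumptions, not taken from the patent.

```python
def jog_joint(joint_angles, joint_index, direction, sensitivity_deg=0.5):
    """Increment one joint by a small step when a +/- button is clicked.

    direction is +1 for the '+' button and -1 for the '-' button; the step
    size (sensitivity) can be changed to match the required precision.
    """
    joint_angles = list(joint_angles)
    joint_angles[joint_index] += direction * sensitivity_deg
    return joint_angles

# Example: nudge the third joint forward by 0.5 degrees
angles = [0.0, -90.0, 90.0, 0.0, 90.0, 0.0]
print(jog_joint(angles, joint_index=2, direction=+1))
```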
In teleoperation, to prevent the slave-end mechanical arm from colliding with surrounding objects during motion, this embodiment further provides a collision detection and early-warning method based on the bounding-box algorithm, comprising the following steps:
1) Construct digital twins of the UR5 mechanical arm and the surrounding working environment at 1:1 scale and import them into the mixed reality device for development.
2) As shown in fig. 3, divide the digital twin hierarchically according to the rule of the N-ary tree and select suitable bounding boxes to wrap the relevant objects. A cuboid that just encloses the whole mechanical arm is used as the root-node bounding box: if an obstacle lies outside this box, the mechanical arm cannot collide with it. The bounding boxes of the individual joint parts of the arm serve as leaf nodes; since the joint links of a UR-configuration 6-joint arm are nearly cylindrical, cylinders that completely enclose each joint are used as bounding boxes in place of the complex actual shapes. Similarly, simple solids such as spheres and cuboids are used as bounding boxes for the workpieces in the working environment. For more complex environments, different levels of bounding boxes can be used, as for the mechanical arm, until every part is wrapped.
3) After the collision detection function of the teleoperation system is enabled, the system detects in real time whether the mechanical arm interferes with the surrounding environment. The flow is as follows: first, check whether the root-node bounding boxes interfere, i.e. perform collision detection on the root-node objects by judging whether their bounding boxes interfere. If the root-node bounding boxes of two objects do not interfere, the objects do not collide. Otherwise a collision may occur, and the interference calculation continues at the next level of the two objects, traversing every pair of parts at that level until all objects have been traversed. For example, if the root-node bounding box of the mechanical arm interferes with the bounding box of object 1, each part of the arm is checked in turn against object 1 to determine whether its bounding box interferes with that of object 1; if not, the arm continues to operate normally, and if so, the system raises an alarm and the arm is paused.
4) After the arm is steered out of the area where a collision could occur, it resumes normal operation. Repeating the above process enables continuous operation of the mechanical arm.
Because each joint link of the mechanical arm is replaced by a cylindrical bounding box, a method for judging whether a cylindrical bounding box interferes with the other bounding boxes is introduced. As shown in fig. 4: the corner point P of the AABB bounding box has coordinates P(x_1, y_1, z_1) and the corner point Q has coordinates Q(x_2, y_2, z_2). After each joint of the mechanical arm is wrapped in its cylindrical bounding box, the centre line AB of the cylinder and the cylinder radius r are extracted. The radius r is added to the bounding box of the surrounding object so that it expands: P becomes P2(x_1 - r, y_1 - r, z_1 - r) and Q becomes Q2(x_2 + r, y_2 + r, z_2 + r). The centre line is then discretized into a series of points A1(x_{a1}, y_{a1}, z_{a1}), A2(x_{a2}, y_{a2}, z_{a2}), ... at a preset interval (different intervals can be chosen for different scenes). If a discretized point such as A1 satisfies the following condition, i.e. lies inside the expanded box on every axis:
x_1 - r ≤ x_{a1} ≤ x_2 + r,  y_1 - r ≤ y_{a1} ≤ y_2 + r,  z_1 - r ≤ z_{a1} ≤ z_2 + r,
then the point is considered to fall inside the expanded bounding box. The discretized points are checked one by one; if any of them lies inside the expanded bounding box, interference has occurred, otherwise no interference has occurred. If interference between two bounding boxes is detected, the nodes they represent are considered to collide, i.e. the mechanical arm collides with a surrounding object, and the arm should stop running temporarily.
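A compact sketch of this cylinder-versus-AABB test follows. It is illustrative only: the function name and the sample scenario are assumptions, the AABB corners and cylinder parameters are taken as already known, and the sampling step is a free parameter, as the text notes.

```python
import numpy as np

def cylinder_hits_aabb(p, q, a, b, radius, step=0.01):
    """Approximate interference test between a cylinder and an AABB.

    p, q   : min/max corners of the AABB (3-vectors)
    a, b   : endpoints of the cylinder centre line AB
    radius : cylinder radius r
    step   : discretization interval along AB (scene-dependent)

    The AABB is expanded by r on every axis, the centre line is sampled at
    the given interval, and each sample is tested for containment in the
    expanded box -- the procedure described around Fig. 4.
    """
    p, q, a, b = map(np.asarray, (p, q, a, b))
    p2, q2 = p - radius, q + radius             # expanded corners P2 and Q2
    length = np.linalg.norm(b - a)
    n_samples = max(int(length / step), 1) + 1
    for t in np.linspace(0.0, 1.0, n_samples):  # points A1, A2, ... along AB
        point = a + t * (b - a)
        if np.all(point >= p2) and np.all(point <= q2):
            return True                         # a sample lies inside: interference
    return False

# Example: a vertical link of radius 0.3 passing close to a unit box
print(cylinder_hits_aabb(p=(0, 0, 0), q=(1, 1, 1),
                         a=(1.2, 0.5, 0.0), b=(1.2, 0.5, 1.0), radius=0.3))  # True
```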
More specifically, the operation flow of the mechanical arm teleoperation system based on mixed reality provided by the embodiment is as follows:
S1, start the HoloLens glasses, load the mixed reality environment, and synchronize the slave-end UR5 mechanical arm with its digital twin to ensure that both are in the same initial state.
S2, operate the control panel in the modes described above according to the specific use scenario; the corresponding control signals are acquired through the IMU sensors of the HoloLens glasses.
S3, the collected control signals must be converted into motion signals for the mechanical arm: the desired pose of the arm's end is resolved into the motion angle of each joint using the forward and inverse kinematics equations derived from the D-H modeling parameters of the UR5 arm shown in fig. 5 (a generic sketch of this computation is given after step S6).
S4, while the mechanical arm moves to the pose desired by the operator, its state data are fed back to the HoloLens glasses at a frequency of 125 Hz, and the HoloLens updates the state of the digital twin in real time from these data, so that the physical arm and the twin run synchronously.
S5, every time the state of the mechanical arm is updated, the collision detection system runs once and traverses the scene to detect whether a collision has occurred.
S6, if no collision occurs, steps S2-S5 are repeated so that the teleoperation system runs continuously until it is exited; if a collision occurs, the system pauses the mechanical arm, switches to manual low-speed operation, and moves the arm out of the collision area.
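For illustration, a generic D-H forward-kinematics routine of the kind referred to in step S3 can be sketched as follows. The routine takes a D-H table as input; the placeholder table at the bottom is purely illustrative and is NOT the Fig. 5 UR5 data, which should be taken from the figure or the manufacturer's documentation.

```python
import numpy as np

def dh_transform(theta, d, a, alpha):
    """Homogeneous transform Rot(z,theta)*Trans(0,0,d)*Trans(a,0,0)*Rot(x,alpha)."""
    ct, st = np.cos(theta), np.sin(theta)
    ca, sa = np.cos(alpha), np.sin(alpha)
    return np.array([
        [ct, -st * ca,  st * sa, a * ct],
        [st,  ct * ca, -ct * sa, a * st],
        [0.0,      sa,       ca,      d],
        [0.0,     0.0,      0.0,    1.0],
    ])

def forward_kinematics(joint_angles, dh_table):
    """Chain the per-joint transforms to get the end-effector pose in the base frame.

    dh_table is a list of (d, a, alpha) per joint; joint_angles supplies theta.
    """
    T = np.eye(4)
    for theta, (d, a, alpha) in zip(joint_angles, dh_table):
        T = T @ dh_transform(theta, d, a, alpha)
    return T

# Placeholder 6-row table (illustrative values only, not the Fig. 5 parameters)
dh_table = [(0.1, 0.0, np.pi / 2)] + [(0.0, 0.3, 0.0)] * 4 + [(0.05, 0.0, 0.0)]
print(forward_kinematics([0.0] * 6, dh_table)[:3, 3])  # end-effector position
```

Inverse kinematics then solves the same chain for the joint angles that realize a desired end pose, which is the direction actually needed in step S3.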
It will be appreciated by those skilled in the art that embodiments of the present application may be provided as a method, system, or computer program product. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The present application is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to embodiments of the application. It will be understood that each flow and/or block of the flowchart illustrations and/or block diagrams, and combinations of flows and/or blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
The above description is only a preferred embodiment of the present invention and is not intended to limit the invention in any way; any person skilled in the art may modify or alter the disclosed technical content into equivalent embodiments. However, any simple modification, equivalent change or variation of the above embodiments made according to the technical substance of the present invention still falls within the protection scope of the technical solution of the present invention.

Claims (7)

1. Mechanical arm teleoperation system based on mixed reality, which is characterized by comprising:
A physical robotic arm;
The mixed reality device is used for realizing interaction between a human body and the physical mechanical arm; a digital twin of the physical mechanical arm and its surrounding environment is built into the mixed reality device, which can communicate wirelessly with the physical mechanical arm;
And the video acquisition equipment is used for acquiring images of the physical mechanical arm and the surrounding environment and presenting the images in the mixed reality equipment in real time.
2. The mixed reality-based mechanical arm teleoperation system according to claim 1, wherein the physical mechanical arm is provided with a mechanical arm control cabinet through which it communicates wirelessly with the mixed reality device, and the digital twin in the mixed reality device updates its own state according to the state data of the physical mechanical arm fed back by the control cabinet over the wireless link.
3. A method for teleoperation of a mechanical arm based on mixed reality, implemented by using the system according to any one of claims 1-2, comprising the steps of:
S1, using the mixed reality device together with the corresponding gesture actions, operating the virtual control panel of the mechanical arm and obtaining the corresponding control signals;
S2, mapping the control signals acquired in step S1 to position and posture data of the mechanical arm end inside the mixed reality device;
S3, performing forward and inverse kinematic computation on the position and posture data obtained in step S2 and solving the joint motion parameters of the mechanical arm;
S4, sending the joint motion parameters obtained in step S3 to the physical mechanical arm so that it moves to the corresponding pose, while updating the state of the digital twin according to the state data fed back by the physical mechanical arm;
S5, while the mechanical arm moves, performing collision detection synchronously at a certain frequency; if a collision is about to occur, the mechanical arm stops moving and the system gives an alarm; if no collision occurs, the system keeps running until it is shut down;
S6, the operator wearing the mixed reality device can observe the pose of the mechanical arm from different angles by moving; meanwhile the video acquisition device shows the actual state of the mechanical arm, reducing visual blind areas and providing a reference for the next movement.
4. The mixed reality-based mechanical arm teleoperation method according to claim 3, wherein in step S1 the virtual control panel is a dedicated virtual control panel developed according to the structural features of the physical mechanical arm, and the corresponding gesture actions include finger clicking, grabbing and dragging.
5. The mixed reality-based mechanical arm teleoperation method according to claim 3, wherein the digital twin can be placed at any position in space according to the needs of the actual situation, including superimposing the digital twin on the actual environment.
6. The mixed reality-based mechanical arm teleoperation method according to claim 3, wherein the operator can observe the pose of the digital twin from different angles by moving and switching viewpoints, thereby minimizing visual blind areas.
7. The mixed reality-based mechanical arm teleoperation method according to claim 3, wherein in step S5 collision detection is performed with a bounding-box-based collision detection algorithm, comprising the following steps:
S501, constructing a digital twin of the physical mechanical arm and the surrounding environment;
S502, dividing each whole object and its parts into different layers according to the structural characteristics of the equipment and environment and the partitioning principle of an N-ary tree, and wrapping each relevant object completely and snugly with a suitable bounding box;
S503, judging whether two objects collide, i.e. detecting whether their bounding boxes interfere: first, interference between the bounding boxes of the two objects' root nodes is computed; if there is no interference, the two objects do not collide; otherwise the two objects may collide, and the interference calculation continues with the bounding boxes of the next layer of the two objects, traversing all parts of both objects at the same level, until all layers of both objects have been traversed;
S504, repeating step S503 until all objects in the system have been traversed.
CN202410402050.6A 2024-04-03 2024-04-03 Mechanical arm teleoperation system and method based on mixed reality Pending CN118061188A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202410402050.6A CN118061188A (en) 2024-04-03 2024-04-03 Mechanical arm teleoperation system and method based on mixed reality

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202410402050.6A CN118061188A (en) 2024-04-03 2024-04-03 Mechanical arm teleoperation system and method based on mixed reality

Publications (1)

Publication Number Publication Date
CN118061188A true CN118061188A (en) 2024-05-24

Family

ID=91105865

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202410402050.6A Pending CN118061188A (en) 2024-04-03 2024-04-03 Mechanical arm teleoperation system and method based on mixed reality

Country Status (1)

Country Link
CN (1) CN118061188A (en)

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination