CN110815189B - Robot rapid teaching system and method based on mixed reality - Google Patents
- Publication number
- CN110815189B CN110815189B CN201911144045.5A CN201911144045A CN110815189B CN 110815189 B CN110815189 B CN 110815189B CN 201911144045 A CN201911144045 A CN 201911144045A CN 110815189 B CN110815189 B CN 110815189B
- Authority
- CN
- China
- Prior art keywords
- robot
- mixed reality
- hand
- virtual robot
- virtual
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/0081—Programme-controlled manipulators with master teach-in means
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/02—Programme-controlled manipulators characterised by movement of the arms, e.g. cartesian coordinate type
- B25J9/023—Cartesian coordinate type
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1656—Programme controls characterised by programming, planning systems for manipulators
- B25J9/1664—Programme controls characterised by programming, planning systems for manipulators characterised by motion, path, trajectory planning
Landscapes
- Engineering & Computer Science (AREA)
- Robotics (AREA)
- Mechanical Engineering (AREA)
- Numerical Control (AREA)
- Manipulator (AREA)
Abstract
The invention relates to a mixed-reality-based rapid robot teaching system and method. The system comprises a physical industrial robot and mixed reality glasses; the physical industrial robot comprises a robot body and a robot controller, a 3D virtual robot is built into the mixed reality glasses, and the glasses are in communication connection with the robot controller. The method comprises the following steps: continuously capturing the positions of one or both hands with the mixed reality glasses and mapping them to the position and attitude of the virtual robot's end; feeding the end position and attitude data into the robot's inverse kinematics algorithm to solve for the virtual robot's joint motion parameters; synchronously updating and rendering the virtual robot in the glasses using those joint motion parameters; and transmitting the joint motion parameters to the robot controller in real time so that the robot body reproduces the same motion, completing the teaching. The system and method help increase robot teaching speed and are flexible and convenient to use.
Description
Technical Field
The invention relates to the field of industrial robot teaching, in particular to a robot rapid teaching system and method based on mixed reality.
Background
Industrial robots are key automation equipment in modern manufacturing, integrating disciplines such as mechanics, electronics, control, computing and sensing. They are widely used in Flexible Manufacturing Systems (FMS), Factory Automation (FA) and similar settings, and are an important marker of the degree of industrial modernization and automation. As the application fields of industrial robots keep expanding, their working environments grow more diverse, their tasks more complex, and users' demands on product quality and efficiency ever higher. Under these conditions, the robot's programming method, efficiency and quality become increasingly important. Reducing programming difficulty and workload, improving programming efficiency and achieving programming adaptability, so as to raise production efficiency, is the ultimate goal of robot programming technology. Traditional teaching methods can hardly meet the demands of modern production, so a new teaching method is urgently needed.
Mixed Reality (MR) is a technology that superimposes virtual data onto the real environment, so that the user perceives the real world and the virtual world simultaneously and can interact with the current scene in real time. It merges information from the real world with information from a virtual world, and its roots trace back to the early days of the modern computer. As early as 1968, Ivan Sutherland, then an associate professor of electrical engineering at Harvard, built a device named The Sword of Damocles, regarded as the first AR system. The term Augmented Reality (AR) was first used by Tom Caudell and David Mizell in a paper describing the technique of presenting computer-generated virtual elements in the real world. In 1994 the technology first entered the arts: the artist Julie Martin staged Dancing in Cyberspace, in which real dancers interacted with virtual content. Mixed reality combines virtual reality and augmented reality, and corresponds to the Mediated Reality proposed by Steve Mann of the University of Toronto, known as the father of wearable computing. Mixed reality has three essential characteristics: fusion of the virtual and the real; real-time interaction; and registration of virtual objects. In recent years mixed reality has appeared in science and technology, the military, education, gaming and other fields, and the advance of 5G in particular has driven its rapid growth.
With the development of computer simulation, human-computer interaction, virtual reality and mixed reality technologies, offline teaching, virtual teaching and mixed reality teaching modes for robots have emerged.
Mixed reality teaching is an offline teaching mode in which a virtual model is taught and programmed within the real teaching scene using mixed reality technology. Robot teaching based on mixed reality retains the advantages of offline programming and model simulation while operating in the real environment: downtime is reduced, since the next task can be programmed while the current one is still executing; the operator stays away from the production environment, avoiding both the danger the environment poses to the operator and the damage the operator might cause to the environment and the robot; it is easy to combine with CAD/CAM and similar systems; robot programs are easy to debug and modify; and complex tasks can be programmed. At the same time it keeps the chief advantage of online teaching: a simple teaching process.
The patent with publication number CN108161904A proposes an augmented-reality-based online robot teaching system comprising a teaching handle, an orientation tracking sensor, a virtual robot model positioner, an augmented reality display and a computer; online teaching is achieved by operating the teaching handle to control and observe the motion of the virtual robot. However, that system requires many devices, and during operation the hands cannot be freed from the teaching tool.
The patent with publication number CN108161882A proposes an augmented-reality-based robot teaching and playback method that captures hand poses with a motion-sensing device and reproduces the taught trajectory through the augmented reality device, achieving offline teaching. However, it relies on voice input to correct errors during precise teaching, which noticeably increases the user's cognitive load, and its teaching speed remains an obvious bottleneck.
Disclosure of Invention
The invention aims to provide a mixed-reality-based rapid robot teaching system and method that help improve the teaching speed of a robot and are flexible and convenient to use.
To achieve this aim, the invention adopts the following technical scheme: a mixed-reality-based rapid robot teaching system comprising a physical industrial robot and a pair of mixed reality glasses, wherein the physical industrial robot comprises a robot body and a robot controller for controlling its motion, the robot body comprises at least one joint, a 3D virtual robot identical to the robot body is built into the mixed reality glasses, and the mixed reality glasses are in communication connection with the robot controller.
The present invention also provides a method for rapid robot teaching using the above system, comprising the following steps:
s1, continuously capturing the positions of one or both hands using the mixed reality glasses;
s2, in the mixed reality glasses, mapping the hand positions obtained in step S1 to the position and attitude of the end of the 3D virtual robot;
s3, feeding the end position and attitude data obtained in step S2 into the robot inverse kinematics algorithm and solving for the joint motion parameters of the 3D virtual robot;
s4, synchronously updating and rendering the 3D virtual robot in the mixed reality glasses according to the joint motion parameters obtained in step S3;
and s5, transmitting the joint motion parameters obtained in step S3 to the robot controller in real time, so that the robot body performs the same motion, completing the teaching.
Further, in step S1, after the user puts on the mixed reality glasses, the glasses fuse and present the 3D virtual robot into the mixed reality environment through spatial localization, and three-dimensional position data of one or both hands is acquired through optical motion capture.
Further, in step S2, the position of one hand is mapped to the position of the end of the 3D virtual robot, while the positions of both hands are mapped to the full pose of the end of the 3D virtual robot.
Further, the positions of one or both hands are mapped to the position and attitude of the end of the 3D virtual robot through relative pose transformations; the mapping algorithm is as follows:
in one-hand teaching, the relative transformation of the hand position in three-dimensional space corresponds to the relative transformation of the origin of the coordinate system at the end of the 3D virtual robot, and the differential translation transformation matrix between adjacent moments, T_F, is:
where dx, dy and dz denote small displacements of the hand along the x, y and z axes of the reference coordinate system, and S(d) is defined as:
where the parameter d is the distance between the user and the 3D virtual robot and the parameter k is the maximum magnification factor of the teaching precision. From equation (1), the new pose matrix of the end of the 3D virtual robot after the differential translation transformation, T_T_new, is:
where T_T_old denotes the pose matrix of the end of the 3D virtual robot before the differential translation transformation;
in two-hand teaching, one hand plays the same role as in one-hand teaching: the relative transformation of its position corresponds to the relative transformation of the origin of the coordinate system at the end of the 3D virtual robot. The other hand corresponds to a point distinct from that origin along the Z-axis direction of the virtual robot's end, where the Z axis is the working direction of the tool held at the end of the 3D virtual robot. Its differential rotation corresponds to the attitude transformation matrix:
where δx, δy and δz denote the differential rotations of the two hands about the x, y and z axes of the reference coordinate system. From equation (4), the new pose matrix of the end of the 3D virtual robot after the differential rotation transformation, T_R_new, is:
where T_R_old denotes the pose matrix of the end of the 3D virtual robot before the differential rotation transformation. Combining the differential translation transformation matrix of the one-hand case in equation (1) with the attitude transformation matrix in equation (4) yields the complete pose matrix of the end of the 3D virtual robot under the combined action of the differential translation and differential rotation transformations in two-hand teaching:
where the remaining matrix denotes the pose of the end of the 3D virtual robot before the combined action of the differential translation and differential rotation transformations.
Further, in step S3, after each end pose of the 3D virtual robot is obtained, the rotation angles of the 3D virtual robot's joints are solved through the robot inverse kinematics algorithm; the speed and acceleration of the 3D virtual robot are not limited by those of the robot body's joints, so it can respond in real time to fast hand movements of the user.
Further, in step S4, based on the computed joint motion parameters of the 3D virtual robot, the mixed reality glasses use the robot forward kinematics algorithm to synchronously update and render the 3D virtual robot together with the visualized motion trajectory traced by its end.
Compared with the prior art, the invention has the following beneficial effects: the system and method provide a more natural form of offline programming, teaching by one- or two-hand gestures in a mixed reality environment, which makes full use of human intuition in manipulation, increases robot teaching speed, and has strong practicability and broad application prospects.
Drawings
Fig. 1 is a schematic diagram of a system structure according to an embodiment of the present invention.
FIG. 2 is a flow chart of the teaching of an embodiment of the present invention.
FIG. 3 is a schematic illustration of the conversion of single/double-hand teaching to joint motion in an embodiment of the present invention.
Detailed Description
The invention is described in further detail below with reference to the figures and the embodiments.
The invention provides a mixed-reality-based rapid robot teaching system comprising a physical industrial robot and mixed reality glasses 1. The physical industrial robot comprises a robot body 2 and a robot controller 3 for controlling its motion; the robot body 2 comprises at least one joint; a 3D virtual robot 4 identical to the robot body is built into the mixed reality glasses 1; and the mixed reality glasses 1 are in communication connection with the robot controller 3.
As shown in fig. 2, the invention further provides a mixed reality-based robot rapid teaching method based on the system, which includes the following steps:
and S1, continuously capturing hand positions of one hand or two hands by using the mixed reality glasses. Wherein, the user is wearing mixed reality glasses 1 back, and mixed reality glasses 1 fuses 3D virtual robot 4 and presents in mixed reality environment through the space positioning technique, obtains the position data of one hand or both hands hand three-dimensional space through optical formula motion capture.
S2, in the mixed reality glasses, map the hand positions acquired in step S1 to the position and attitude of the end of the 3D virtual robot. The position of one hand is mapped to the position of the end of the 3D virtual robot, while the positions of both hands are mapped to the full pose of the end. In the present embodiment, one hand corresponds to the right hand in fig. 1, and two hands correspond to the left and right hands in fig. 1.
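The one-hand/two-hand dispatch described above can be sketched as follows. The function, its argument names, and the composition order are illustrative assumptions, not the patent's implementation; the small-angle rotation form is a standard first-order approximation:

```python
import numpy as np

def map_hands_to_pose(T_old, right_delta, left_rot=None, s=1.0):
    """One hand updates only the end position; a second hand also
    updates the orientation. T_old is a 4x4 homogeneous end pose,
    right_delta a small (dx, dy, dz) hand displacement, left_rot an
    optional small (rx, ry, rz) rotation, s the precision scale."""
    T = np.array(T_old, dtype=float)
    T[:3, 3] += s * np.asarray(right_delta)      # translation from one hand
    if left_rot is not None:                     # second hand: small rotation
        dx, dy, dz = left_rot
        R = np.array([[1.0, -dz,  dy],
                      [ dz, 1.0, -dx],
                      [-dy,  dx, 1.0]])          # first-order rotation
        T[:3, :3] = R @ T[:3, :3]
    return T
```

With only `right_delta` given, the orientation block of the pose is left untouched, matching the rule that a single hand controls position only.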
As shown in fig. 3, the positions of one or both hands are mapped to the position and attitude of the end of the 3D virtual robot through relative pose transformations; the mapping algorithm is as follows:
In one-hand teaching, the relative transformation of the hand position in three-dimensional space corresponds to the relative transformation of the origin of the coordinate system at the end of the 3D virtual robot; that is, the hand positions at two adjacent moments, together with the corresponding positions of the origin of the end coordinate system, determine the differential motion.
The differential translation transformation matrix between the two moments, T_F, is:
where dx, dy and dz denote small displacements of the hand along the x, y and z axes of the reference coordinate system, and S(d) is defined as:
where the parameter d is the distance between the user and the 3D virtual robot and the parameter k is the maximum magnification factor of the teaching precision. From equation (1), the new pose matrix of the end of the 3D virtual robot after the differential translation transformation, T_T_new, is:
where T_T_old denotes the pose matrix of the end of the 3D virtual robot before the differential translation transformation;
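The translation update can be sketched numerically. Since the patent's equation images are not reproduced in this text, the form of T_F (a homogeneous transform whose translation column is the scaled hand displacement) and the treatment of S(d) as an externally supplied scalar are assumptions:

```python
import numpy as np

def differential_translation(T_old, d_hand, s):
    """Apply a scaled differential translation to the end pose.

    T_old  : 4x4 homogeneous pose matrix of the virtual robot end.
    d_hand : (dx, dy, dz), small hand displacement in the reference frame.
    s      : scalar S(d); its exact dependence on the user-robot
             distance d and precision factor k is not reproduced here.
    """
    T_F = np.eye(4)
    T_F[:3, 3] = s * np.asarray(d_hand)   # differential translation matrix
    return T_F @ T_old                    # new pose T_T_new

# Example: a 10 mm hand move along x, halved by the precision scale
T_new = differential_translation(np.eye(4), (0.01, 0.0, 0.0), 0.5)
```

Left-multiplying by T_F expresses the displacement in the reference frame, which matches the text's statement that dx, dy, dz are measured relative to the reference coordinate system.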
In two-hand teaching, one hand plays the same role as in one-hand teaching: the relative transformation of its position corresponds to the relative transformation of the origin of the coordinate system at the end of the 3D virtual robot. The other hand corresponds to a point distinct from that origin along the Z-axis direction of the virtual robot's end (the Z axis being defined as the working direction of the tool held at the end of the 3D virtual robot). The relationship between the direction vector and the pose matrix is then obtained as follows:
where the matrix is the attitude transformation produced by a pure rotation about the axes of the reference coordinate system, and the direction vector (a unit vector) represents the attitude of the end of the 3D virtual robot. Given the hand attitudes at two adjacent moments, the following is obtained:
where the two attitudes are defined analogously. Relative to the reference coordinate system, the differential rotation then corresponds to the attitude transformation matrix:
where δx, δy and δz denote the differential rotations of the direction vector about the x, y and z axes of the reference coordinate system. From equation (4), the new pose matrix of the end of the 3D virtual robot after the differential rotation transformation, T_R_new, is:
where T_R_old denotes the pose matrix of the end of the 3D virtual robot before the differential rotation transformation. Combining the differential translation transformation matrix of the one-hand case in equation (1) with the attitude transformation matrix in equation (4) yields the complete pose matrix of the end of the 3D virtual robot under the combined action of the differential translation and differential rotation transformations in two-hand teaching:
where the remaining matrix denotes the pose of the end of the 3D virtual robot before the combined action of the differential translation and differential rotation transformations.
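The combined two-hand update can be sketched as follows. The first-order (small-angle) form of the differential rotation matrix and the composition order (translation, then rotation, applied to the old pose) are assumptions, since the patent's equation images are not reproduced here:

```python
import numpy as np

def differential_rotation(delta):
    """First-order rotation matrix for small angles (rx, ry, rz)
    about the reference-frame axes, embedded in a 4x4 transform."""
    rx, ry, rz = delta
    R = np.array([[1.0, -rz,  ry],
                  [ rz, 1.0, -rx],
                  [-ry,  rx, 1.0]])
    T = np.eye(4)
    T[:3, :3] = R
    return T

def combined_update(T_old, d_trans, delta_rot, s):
    """Combined differential translation + rotation applied to the
    previous end pose T_old (composition order is an assumption)."""
    T_F = np.eye(4)
    T_F[:3, 3] = s * np.asarray(d_trans)  # translation from the first hand
    return T_F @ differential_rotation(delta_rot) @ T_old
```

With zero hand motion the update is the identity, so the end pose is held still between frames.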
S3, feed the end position and attitude data of the 3D virtual robot obtained in step S2 into the robot inverse kinematics algorithm and solve for the joint motion parameters of the 3D virtual robot.
Specifically, after each end pose of the 3D virtual robot is obtained, the rotation angles of the 3D virtual robot's joints are solved through the robot inverse kinematics algorithm; the speed and acceleration of the 3D virtual robot are not limited by those of the robot body's joints, so it responds immediately to fast hand movements of the user.
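The patent does not specify which inverse kinematics algorithm the robot uses. As an illustrative stand-in, a closed-form IK for a planar two-link arm shows how joint angles are recovered from an end position:

```python
import math

def ik_2link(x, y, l1, l2):
    """Closed-form inverse kinematics for a planar 2-link arm
    (elbow-down branch) - an illustrative stand-in, not the
    patent's robot-specific IK."""
    r2 = x * x + y * y
    c2 = (r2 - l1 * l1 - l2 * l2) / (2 * l1 * l2)
    if not -1.0 <= c2 <= 1.0:
        raise ValueError("target out of reach")
    q2 = math.acos(c2)                       # elbow joint angle
    q1 = math.atan2(y, x) - math.atan2(l2 * math.sin(q2),
                                       l1 + l2 * math.cos(q2))
    return q1, q2
```

A quick sanity check is to run the result back through forward kinematics and confirm the end point is recovered.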
S4, the mixed reality glasses synchronously update and render the 3D virtual robot according to the joint motion parameters of the 3D virtual robot obtained in step S3.
Specifically, based on the obtained joint motion parameters of the 3D virtual robot, the mixed reality glasses use the robot forward kinematics algorithm to synchronously update and render the 3D virtual robot together with the visualized motion trajectory traced by its end.
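The forward kinematics step, shown here for the same illustrative planar two-link arm as above (the patent's robot-specific FK is not given), produces the end point used to accumulate the visualized trajectory:

```python
import math

def fk_2link(q1, q2, l1=1.0, l2=1.0):
    """Planar 2-link forward kinematics: end point from joint angles.
    An illustrative stand-in for the patent's FK algorithm."""
    x = l1 * math.cos(q1) + l2 * math.cos(q1 + q2)
    y = l1 * math.sin(q1) + l2 * math.sin(q1 + q2)
    return x, y

# Accumulate the visual trajectory as joint parameters stream in
trajectory = [fk_2link(0.1 * i, 0.2 * i) for i in range(50)]
```

Each rendered frame appends one end point, so the glasses can draw the path the end has traced so far.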
S5, transmit the joint motion parameters obtained in step S3 to the robot controller in real time, so that the robot body performs the same motion, completing the teaching.
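Steps S1 through S5 can be sketched as one iteration of a control loop. All object interfaces and method names below are illustrative assumptions, since the patent does not define a software API:

```python
def teaching_loop(glasses, virtual_robot, robot_controller):
    """One iteration of the mixed-reality teaching pipeline (S1-S5).

    `glasses`, `virtual_robot`, and `robot_controller` are assumed
    interfaces standing in for the MR headset, the 3D virtual robot
    model, and the physical robot's controller."""
    # S1: capture one- or two-hand positions via optical motion capture
    hands = glasses.capture_hands()
    # S2: map hand motion to the virtual robot's end pose
    end_pose = virtual_robot.map_hands_to_end_pose(hands)
    # S3: inverse kinematics -> joint motion parameters
    joint_params = virtual_robot.inverse_kinematics(end_pose)
    # S4: redraw the virtual robot (and its end trajectory) in the glasses
    glasses.render(virtual_robot, joint_params)
    # S5: stream the same joint motion to the physical robot in real time
    robot_controller.send(joint_params)
    return joint_params
```

Running this loop continuously is what lets the physical robot reproduce the user's hand motion frame by frame.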
The above are preferred embodiments of the present invention; all changes that achieve equivalent effects in accordance with the technical scheme of the present invention, without going beyond its scope, fall within the protection scope of the present invention.
Claims (4)
1. A mixed-reality-based rapid robot teaching method, characterized in that a rapid robot teaching system is provided, comprising a physical industrial robot and mixed reality glasses, wherein the physical industrial robot comprises a robot body and a robot controller for controlling its motion, the robot body comprises at least one joint, a 3D virtual robot identical to the robot body is built into the mixed reality glasses, and the mixed reality glasses are in communication connection with the robot controller;
the rapid robot teaching method using the system comprises the following steps:
s1, continuously capturing the positions of one or both hands using the mixed reality glasses;
s2, in the mixed reality glasses, mapping the hand positions obtained in step S1 to the position and attitude of the end of the 3D virtual robot;
s3, feeding the end position and attitude data obtained in step S2 into the robot inverse kinematics algorithm and solving for the joint motion parameters of the 3D virtual robot;
s4, synchronously updating and rendering the 3D virtual robot in the mixed reality glasses according to the joint motion parameters obtained in step S3;
s5, transmitting the joint motion parameters obtained in step S3 to the robot controller, so that the robot body performs the same motion, completing the teaching;
in step S2, the position of one hand is mapped to the position of the end of the 3D virtual robot, while the positions of both hands are mapped to the full pose of the end of the 3D virtual robot;
the positions of one or both hands are mapped to the position and attitude of the end of the 3D virtual robot through relative pose transformations; the mapping algorithm is as follows:
in one-hand teaching, the relative transformation of the hand position in three-dimensional space corresponds to the relative transformation of the origin of the coordinate system at the end of the 3D virtual robot, and the differential translation transformation matrix between adjacent moments, T_F, is:
where dx, dy and dz denote small displacements of the hand along the x, y and z axes of the reference coordinate system, and S(d) is defined as:
where the parameter d is the distance between the user and the 3D virtual robot and the parameter k is the maximum magnification factor of the teaching precision. From equation (1), the new pose matrix of the end of the 3D virtual robot after the differential translation transformation, T_T_new, is:
where T_T_old denotes the pose matrix of the end of the 3D virtual robot before the differential translation transformation;
in two-hand teaching, one hand plays the same role as in one-hand teaching: the relative transformation of its position corresponds to the relative transformation of the origin of the coordinate system at the end of the 3D virtual robot. The other hand corresponds to a point distinct from that origin along the Z-axis direction of the virtual robot's end, where the Z axis is the working direction of the tool held at the end of the 3D virtual robot. Its differential rotation corresponds to the attitude transformation matrix:
where δx, δy and δz denote the differential rotations of the two hands about the x, y and z axes of the reference coordinate system. From equation (4), the new pose matrix of the end of the 3D virtual robot after the differential rotation transformation, T_R_new, is:
where T_R_old denotes the pose matrix of the end of the 3D virtual robot before the differential rotation transformation. Combining the differential translation transformation matrix of the one-hand case in equation (1) with the attitude transformation matrix in equation (4) yields the complete pose matrix of the end of the 3D virtual robot after the combined action of the differential translation and differential rotation transformations in two-hand teaching:
2. The mixed-reality-based rapid robot teaching method according to claim 1, wherein in step S1, after the user puts on the mixed reality glasses, the glasses fuse and present the 3D virtual robot into the mixed reality environment through spatial localization, and three-dimensional position data of one or both hands is obtained through optical motion capture.
3. The mixed-reality-based rapid robot teaching method according to claim 1, wherein in step S3, after each end pose of the 3D virtual robot is obtained, the rotation angles of the 3D virtual robot's joints are solved through the robot inverse kinematics algorithm, and the speed and acceleration of the 3D virtual robot are not limited by those of the robot body's joints, so as to respond in real time to fast hand movements of the user.
4. The mixed-reality-based rapid robot teaching method according to claim 1, wherein in step S4, based on the computed joint motion parameters of the 3D virtual robot, the mixed reality glasses use the robot forward kinematics algorithm to synchronously update and render the 3D virtual robot together with the visualized motion trajectory traced by its end.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201911144045.5A CN110815189B (en) | 2019-11-20 | 2019-11-20 | Robot rapid teaching system and method based on mixed reality |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201911144045.5A CN110815189B (en) | 2019-11-20 | 2019-11-20 | Robot rapid teaching system and method based on mixed reality |
Publications (2)
Publication Number | Publication Date |
---|---|
CN110815189A CN110815189A (en) | 2020-02-21 |
CN110815189B true CN110815189B (en) | 2022-07-05 |
Family
ID=69557705
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201911144045.5A Active CN110815189B (en) | 2019-11-20 | 2019-11-20 | Robot rapid teaching system and method based on mixed reality |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN110815189B (en) |
Families Citing this family (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11969904B2 (en) | 2020-03-24 | 2024-04-30 | Qingdao university of technology | Registration system and method for robot-oriented augmented reality teaching system |
CN112917457A (en) * | 2021-01-27 | 2021-06-08 | 南京航空航天大学 | Industrial robot rapid and accurate teaching system and method based on augmented reality technology |
CN112894820A (en) * | 2021-01-29 | 2021-06-04 | 清华大学深圳国际研究生院 | Flexible mechanical arm remote operation man-machine interaction device and system |
CN113126568B (en) * | 2021-03-10 | 2022-08-09 | 上海乾庾智能科技有限公司 | Industrial robot operation and demonstration system based on augmented reality technology |
CN113290560A (en) * | 2021-05-27 | 2021-08-24 | 乐聚(深圳)机器人技术有限公司 | Robot motion control method, device, electronic equipment and storage medium |
CN113858181A (en) * | 2021-11-19 | 2021-12-31 | 国家电网有限公司 | Power transmission line operation aerial robot based on human-computer interaction mixed reality |
CN114310977B (en) * | 2021-12-31 | 2024-05-10 | 天津中屹铭科技有限公司 | Demonstrator for polishing robot and manual control method thereof |
CN117655601B (en) * | 2023-12-12 | 2024-09-03 | 中船舰客教育科技(北京)有限公司 | MR-based intelligent welding method apparatus, computer device, and medium |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN105045398A (en) * | 2015-09-07 | 2015-11-11 | 哈尔滨市一舍科技有限公司 | Virtual reality interaction device based on gesture recognition |
CN106863295A (en) * | 2015-12-10 | 2017-06-20 | 发那科株式会社 | Robot system |
CN108161882A (en) * | 2017-12-08 | 2018-06-15 | 华南理工大学 | A kind of robot teaching reproducting method and device based on augmented reality |
CN110394779A (en) * | 2018-04-25 | 2019-11-01 | 发那科株式会社 | The simulator of robot |
Family Cites Families (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9643314B2 (en) * | 2015-03-04 | 2017-05-09 | The Johns Hopkins University | Robot control, training and collaboration in an immersive virtual reality environment |
- 2019-11-20: Application CN201911144045.5A filed in China (CN); granted as patent CN110815189B, status Active
Non-Patent Citations (1)
Title |
---|
"基于增强现实及自然人机交互的机器人示教再现技术研究";陈偕权;《中国优秀硕士学位论文全文数据库 信息科技辑》;20190115(第1期);第6-61页 * |
Also Published As
Publication number | Publication date |
---|---|
CN110815189A (en) | 2020-02-21 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN110815189B (en) | Robot rapid teaching system and method based on mixed reality | |
CN108638069B (en) | Method for controlling accurate motion of tail end of mechanical arm | |
CN107943283B (en) | Mechanical arm pose control system based on gesture recognition | |
Ostanin et al. | Interactive robot programing using mixed reality | |
CN108241339B (en) | Motion solving and configuration control method of humanoid mechanical arm | |
CN105291138B (en) | A visual feedback platform for enhancing the sense of immersion in virtual reality | |
CN102848389B (en) | Realization method for mechanical arm calibrating and tracking system based on visual motion capture | |
CN106041928B (en) | A kind of robot manipulating task task generation method based on part model | |
CN115469576B (en) | Teleoperation system based on human-mechanical arm heterogeneous motion space hybrid mapping | |
CN107856014A (en) | Mechanical arm pose control method based on gesture recognition | |
CN107577159A (en) | Augmented reality analogue system | |
CN110421561A (en) | A method of spraying clothes using a collaborative robot | |
CN113829343A (en) | Real-time multi-task multi-person man-machine interaction system based on environment perception | |
Lambrecht et al. | Markerless gesture-based motion control and programming of industrial robots | |
Lim et al. | Online telemanipulation framework on humanoid for both manipulation and imitation | |
Du et al. | An intelligent interaction framework for teleoperation based on human-machine cooperation | |
CN110142769A (en) | The online mechanical arm teaching system of ROS platform based on human body attitude identification | |
CN207937787U (en) | Augmented reality analogue system | |
CN111185906A (en) | Leap Motion-based dexterous hand master-slave control method | |
CN110053045A (en) | Workpiece surface contour line acquisition methods, interference detection method and relevant apparatus | |
CN107738256A (en) | A teach-by-doing humanoid teaching robot programming system | |
Du et al. | An offline-merge-online robot teaching method based on natural human-robot interaction and visual-aid algorithm | |
CN107443369A (en) | A calibration-free servo control method for a robotic arm based on inverse identification of a vision measurement model | |
CN109773773A (en) | A master-slave control device, system and method for a novel six-degree-of-freedom parallel platform | |
Xu et al. | Trajectory Planning of 7-Degree-of-Freedom Manipulator Based on ROS |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||