CN110815189A - Robot rapid teaching system and method based on mixed reality - Google Patents

Info

Publication number
CN110815189A
CN110815189A (application CN201911144045.5A; granted publication CN110815189B)
Authority
CN
China
Prior art keywords
robot
mixed reality
virtual robot
hand
virtual
Prior art date
Legal status
Granted
Application number
CN201911144045.5A
Other languages
Chinese (zh)
Other versions
CN110815189B (en)
Inventor
吴海彬
卓建华
许金山
Current Assignee
Fuzhou University
Original Assignee
Fuzhou University
Priority date
Filing date
Publication date
Application filed by Fuzhou University filed Critical Fuzhou University
Priority to CN201911144045.5A priority Critical patent/CN110815189B/en
Publication of CN110815189A publication Critical patent/CN110815189A/en
Application granted granted Critical
Publication of CN110815189B publication Critical patent/CN110815189B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B25: HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J: MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J 9/00: Programme-controlled manipulators
    • B25J 9/0081: Programme-controlled manipulators with master teach-in means
    • B25J 9/02: Programme-controlled manipulators characterised by movement of the arms, e.g. cartesian coordinate type
    • B25J 9/023: Cartesian coordinate type
    • B25J 9/16: Programme controls
    • B25J 9/1656: Programme controls characterised by programming, planning systems for manipulators
    • B25J 9/1664: Programme controls characterised by motion, path, trajectory planning

Abstract

The invention relates to a mixed-reality-based rapid robot teaching system and method. The system comprises a physical industrial robot and mixed reality glasses; the physical industrial robot comprises a robot body and a robot controller, a 3D virtual robot is built into the mixed reality glasses, and the glasses are communicatively connected to the robot controller. The method comprises the following steps: continuously capturing the position of one or both hands with the mixed reality glasses and mapping it to the position and posture of the virtual robot's end-effector; feeding the end-effector pose data into a robot inverse kinematics algorithm to solve for the virtual robot's joint motion parameters; synchronously updating and redrawing the virtual robot in the glasses from those joint parameters; and transmitting the joint parameters to the robot controller in real time so that the robot body performs the same motion, completing the teaching. The system and method help increase the teaching speed of the robot and are flexible and convenient to use.

Description

Robot rapid teaching system and method based on mixed reality
Technical Field
The invention relates to the field of industrial robot teaching, in particular to a robot rapid teaching system and method based on mixed reality.
Background
Industrial robots, which integrate disciplines such as mechanics, electronics, control, computing, and sensing, are key automation equipment in modern manufacturing. They are widely used in Flexible Manufacturing Systems (FMS), Factory Automation (FA), and related fields, and are an important indicator of the degree of industrial modernization and automation. As the application domains of industrial robots keep expanding, their working environments become more diverse, their tasks more complex, and users' demands on product quality and efficiency ever higher. Against this background, the robot's programming mode, programming efficiency, and programming quality grow increasingly important. Reducing programming difficulty and workload, raising programming efficiency, and achieving programming adaptability, thereby improving production efficiency, is the enduring goal of robot programming technology. Traditional teaching modes struggle to meet the requirements of modern production, so a new teaching mode is urgently needed.
Mixed Reality (MR) is a technology that superimposes virtual data onto the real environment, so that the user perceives the real world and the virtual world simultaneously and can interact with the current scene in real time; it fuses information from the real world with information from a virtual world. Its roots trace back to the early days of modern computing. As early as 1968, Ivan Sutherland, then an associate professor of electrical engineering at Harvard, built a head-mounted display nicknamed "The Sword of Damocles", generally regarded as the first AR system. The term Augmented Reality (AR) was first used in a paper by Tom Caudell and David Mizell to describe presenting computer-generated virtual elements in the real world. In 1994, the technology made its artistic debut when the artist Julie Martin staged "Dancing in Cyberspace", in which real dancers interacted with virtual content. Mixed reality combines virtual reality and augmented reality, and is closely related to the "mediated reality" concept proposed by Steve Mann of the University of Toronto, often called the father of wearable computing. Mixed reality has three defining features: fusion of the virtual and the real; real-time interaction; and registration of virtual objects. In recent years mixed reality has appeared in science and technology, the military, education, games, and other fields, and the development of 5G technology in particular has driven its rapid growth.
With the development of computer simulation, human-computer interaction, virtual reality, and mixed reality technologies, off-line, virtual, and mixed-reality robot teaching modes have emerged.
Mixed reality teaching is an off-line teaching mode in which a virtual model is taught and programmed within a real teaching scene using mixed reality technology. Robot teaching based on mixed reality brings the advantages of off-line programming and model simulation into the real environment: downtime is reduced, since the next task can be programmed while the current task continues to execute; the operator stays away from the production environment, avoiding both potential danger to the operator and damage by the operator to the work environment and the robot; integration with CAD/CAM and similar systems is convenient; robot programs are easy to debug and modify; and complex tasks can be programmed. At the same time, it retains the chief advantage of on-line programming: a simple teaching process.
The patent published as CN108161904A proposes an augmented-reality-based robot on-line teaching system comprising a teaching manipulator, an orientation tracking sensor, a virtual robot model positioner, an augmented reality display, and a computer; on-line teaching is achieved by operating the teaching manipulator to control, and observe, the movement of a virtual robot. However, that system requires many devices, and during operation the hands cannot be freed from the teaching tool.
The patent published as CN108161882A proposes an augmented-reality-based robot teaching reproduction method in which a motion sensing device acquires hand poses and an augmented reality device reproduces the taught trajectory, achieving off-line teaching. However, that method relies on voice input to correct errors during precise teaching, which markedly increases the user's cognitive burden, and its teaching speed faces an obvious bottleneck.
Disclosure of Invention
The object of the invention is to provide a mixed-reality-based rapid robot teaching system and method that help increase the teaching speed of the robot and are flexible and convenient to use.
To achieve this object, the invention adopts the following technical solution: a mixed-reality-based rapid robot teaching system comprises a physical industrial robot and mixed reality glasses; the physical industrial robot comprises a robot body and a robot controller for controlling its motion, the robot body comprises at least one joint, a 3D virtual robot identical to the robot body is built into the mixed reality glasses, and the mixed reality glasses are communicatively connected to the robot controller.
The present invention also provides a method for rapid robot teaching using the above system, comprising the following steps:
S1, continuously capturing the position of one or both hands using the mixed reality glasses;
S2, in the mixed reality glasses, mapping the hand positions obtained in step S1 to the position and posture of the 3D virtual robot's end-effector;
S3, feeding the end-effector position and posture data obtained in step S2 into a robot inverse kinematics algorithm and solving for the joint motion parameters of the 3D virtual robot;
S4, synchronously updating and redrawing the 3D virtual robot in the mixed reality glasses according to the joint motion parameters obtained in step S3;
S5, transmitting the joint motion parameters obtained in step S3 to the robot controller in real time, causing the robot body to perform the same motion and thereby completing the teaching.
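Steps S1 to S5 form a per-frame loop. The following is a minimal sketch of one iteration of that loop; all function names and the six-joint parameterization are hypothetical placeholders, since the patent does not prescribe a concrete implementation:

```python
import numpy as np

def teaching_step(hand_transform, T_old, solve_ik, draw, send_to_controller):
    """One iteration of the S1-S5 teaching loop (illustrative sketch).

    hand_transform     -- 4x4 relative transform derived from hand tracking (S1-S2)
    T_old              -- previous end-effector pose of the 3D virtual robot
    solve_ik           -- inverse-kinematics solver: pose -> joint parameters (S3)
    draw               -- redraws the virtual robot in the MR glasses (S4)
    send_to_controller -- streams joint parameters to the robot controller (S5)
    """
    T_new = hand_transform @ T_old     # S2: map hand motion onto the end pose
    joints = solve_ik(T_new)           # S3: joint motion parameters
    draw(joints)                       # S4: update the virtual robot display
    send_to_controller(joints)         # S5: physical robot mirrors the motion
    return T_new
```

In practice `solve_ik`, `draw`, and `send_to_controller` would be supplied by the robot model, the glasses' rendering engine, and the controller link, respectively.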
Further, in step S1, after the user puts on the mixed reality glasses, the glasses fuse the 3D virtual robot into the mixed reality environment using spatial localization technology, and three-dimensional position data of one or both hands is acquired through optical motion capture.
Further, in step S2, the position of one hand is mapped to the position of the 3D virtual robot's end-effector, while the positions of both hands are mapped to the full pose (position and posture) of the end-effector.
Further, the positions of one or both hands are mapped to the position and posture of the 3D virtual robot's end-effector through relative pose transformations, with the following mapping algorithm:
In one-hand teaching, the relative displacement of the hand in three-dimensional space corresponds to a relative transformation of the origin of the end-effector coordinate system, and the differential translation matrix between adjacent instants, T_F, is:

    T_F = | 1  0  0  S(d)·dx |
          | 0  1  0  S(d)·dy |
          | 0  0  1  S(d)·dz |    (1)
          | 0  0  0  1       |

where dx, dy and dz represent small displacements of the hand along the x, y and z axes of the reference coordinate system, and S(d) is a scaling factor defined by equation (2) [reproduced only as an image in the source], in which the parameter d is the distance between the user and the 3D virtual robot and k is the maximum magnification of teaching precision. From equation (1), the new end-effector pose matrix after the differential translation, T_T_new, is:

    T_T_new = T_F · T_T_old    (3)

where T_T_old is the end-effector pose matrix before the differential translation;
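Equations (1) and (3) can be sketched as follows. Since the closed form of the scaling factor S(d) in equation (2) is not reproduced in the text, it is passed in here as a precomputed value `s`:

```python
import numpy as np

def diff_translation(dx, dy, dz, s):
    """Differential translation matrix T_F of equation (1).
    dx, dy, dz -- small hand displacements along the reference axes
    s          -- the distance-dependent scaling factor S(d)"""
    T_F = np.eye(4)
    T_F[:3, 3] = s * np.array([dx, dy, dz])
    return T_F

def apply_translation(T_F, T_T_old):
    """Equation (3): new end-effector pose after the differential translation."""
    return T_F @ T_T_old
```

Premultiplying by `T_F` expresses the translation in the reference coordinate system, matching the text's statement that the hand displacements are measured relative to that frame.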
In two-hand teaching, one hand plays the same role as in one-hand teaching: the relative transformation of its position corresponds to the relative transformation of the origin of the 3D virtual robot's end-effector coordinate system. The other hand corresponds to a point on the end-effector's Z axis distinct from the origin, the Z axis being the effective direction of the tool held by the end-effector. The attitude transformation matrix corresponding to the differential rotation is:

    T_R = |  1    -δz    δy   0 |
          |  δz    1    -δx   0 |
          | -δy    δx    1    0 |    (4)
          |  0     0     0    1 |

where δx, δy and δz represent the differential rotations of the two-hand direction about the x, y and z axes of the reference coordinate system. From equation (4), the new end-effector pose matrix after the differential rotation, T_R_new, is:

    T_R_new = T_R · T_R_old    (5)

where T_R_old is the end-effector pose matrix before the differential rotation. Combining the differential translation matrix of equation (1) from one-hand operation with the attitude transformation matrix of equation (4) yields the complete end-effector pose matrix T_new after the combined action of the differential translation and differential rotation in two-hand teaching:

    T_new = T_F · T_R · T_old    (6)

where T_old is the end-effector pose matrix before the combined action of the differential translation and differential rotation.
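Equations (4) and (6) can likewise be sketched in a few lines; the small-angle differential rotation matrix here is the standard first-order form used throughout robot differential kinematics:

```python
import numpy as np

def diff_rotation(delta_x, delta_y, delta_z):
    """Small-angle differential rotation matrix of equation (4),
    taken relative to the reference coordinate system."""
    T_R = np.eye(4)
    T_R[:3, :3] = np.array([
        [1.0,      -delta_z,  delta_y],
        [delta_z,   1.0,     -delta_x],
        [-delta_y,  delta_x,  1.0],
    ])
    return T_R

def combined_update(T_F, T_R, T_old):
    """Equation (6): pose after the combined differential
    translation (T_F) and differential rotation (T_R)."""
    return T_F @ T_R @ T_old
```

With zero hand displacement `T_F` reduces to the identity and the update is a pure differential rotation, which is the two-hand orientation-only case.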
Further, in step S3, once each end-effector pose of the 3D virtual robot is obtained, the rotation angles of the 3D virtual robot's joints are solved through the robot inverse kinematics algorithm; the speed and acceleration of the 3D virtual robot are not limited by those of the robot body's joints, so it can respond to the user's rapid hand movements in real time.
Further, in step S4, based on the solved joint motion parameters of the 3D virtual robot, the mixed reality glasses use the robot forward kinematics algorithm to synchronously update and redraw the 3D virtual robot together with the visual motion trajectory traced by its end-effector.
Compared with the prior art, the invention has the following beneficial effects: in a mixed reality environment, the system and method provide more natural off-line programming, such as one-hand or two-hand gesture teaching, that fully exploits people's intuition for manipulation; they increase the robot teaching speed and have strong practicability and broad application prospects.
Drawings
Fig. 1 is a schematic system structure according to an embodiment of the present invention.
FIG. 2 is a flow chart of the teaching of an embodiment of the present invention.
FIG. 3 is a schematic illustration of the conversion of single/double-hand teaching to joint motion in an embodiment of the present invention.
Detailed Description
The invention is described in further detail below with reference to the figures and the embodiments.
The invention provides a mixed-reality-based rapid robot teaching system comprising a physical industrial robot and mixed reality glasses 1; the physical industrial robot comprises a robot body 2 and a robot controller 3 for controlling its motion, the robot body 2 comprises at least one joint, a 3D virtual robot 4 identical to the robot body is built into the mixed reality glasses 1, and the mixed reality glasses 1 are communicatively connected to the robot controller 3.
As shown in fig. 2, the invention further provides a mixed-reality-based rapid robot teaching method based on the above system, comprising the following steps:
and S1, continuously capturing hand positions of one hand or two hands by using the mixed reality glasses. Wherein, the user is wearing mixed reality glasses 1 back, and mixed reality glasses 1 fuses 3D virtual robot 4 and presents in mixed reality environment through the space positioning technique, obtains the position data of one hand or both hands hand three-dimensional space through optical formula motion capture.
S2, mapping the hand positions acquired in step S1 to the position and posture of the 3D virtual robot's end-effector in the mixed reality glasses. The position of one hand is mapped to the position of the end-effector, while the positions of both hands are mapped to the full pose of the end-effector. In the present embodiment, one-hand operation corresponds to the right hand in fig. 1, and two-hand operation corresponds to the left and right hands in fig. 1.
As shown in fig. 3, the positions of one or both hands are mapped to the position and posture of the 3D virtual robot's end-effector through relative pose transformations, with the following mapping algorithm:
In one-hand teaching, the relative displacement of the hand in three-dimensional space corresponds to a relative transformation of the origin of the end-effector coordinate system. Denoting the hand positions at two adjacent instants as P_prev and P_curr, with the origin of the end-effector coordinate system tracking the hand, the differential displacements (dx, dy, dz) = P_curr − P_prev give the differential translation matrix at the next instant, T_F:

    T_F = | 1  0  0  S(d)·dx |
          | 0  1  0  S(d)·dy |
          | 0  0  1  S(d)·dz |    (1)
          | 0  0  0  1       |

where dx, dy and dz represent small displacements of the hand along the x, y and z axes of the reference coordinate system, and S(d) is a scaling factor defined by equation (2) [reproduced only as an image in the source], in which the parameter d is the distance between the user and the 3D virtual robot and k is the maximum magnification of teaching precision. From equation (1), the new end-effector pose matrix after the differential translation, T_T_new, is:

    T_T_new = T_F · T_T_old    (3)

where T_T_old is the end-effector pose matrix before the differential translation;
In two-hand teaching, one hand plays the same role as in one-hand teaching: the relative transformation of its position corresponds to the relative transformation of the origin of the end-effector coordinate system. The other hand corresponds to a point Q on the end-effector's Z axis distinct from the origin (the Z axis is defined as the effective direction of the tool held by the end-effector). Denoting by n the unit direction vector from the end-effector origin toward Q, the relationship between this direction vector and the pose matrix is:

    n = R · n_0

where R is the attitude transformation matrix obtained by pure rotation about the axes of the reference coordinate system and n_0 is the initial direction vector. The unit vector n therefore represents the posture of the 3D virtual robot's end-effector. If the direction vectors defined by the two hands at two adjacent instants are n_prev and n_curr, the differential rotations δx, δy and δz of n about the x, y and z axes of the reference coordinate system can be obtained, and the attitude transformation matrix corresponding to this differential rotation is:

    T_R = |  1    -δz    δy   0 |
          |  δz    1    -δx   0 |
          | -δy    δx    1    0 |    (4)
          |  0     0     0    1 |

where δx, δy and δz represent the differential rotations of the direction vector n about the x, y and z axes of the reference coordinate system. From equation (4), the new end-effector pose matrix after the differential rotation, T_R_new, is:

    T_R_new = T_R · T_R_old    (5)

where T_R_old is the end-effector pose matrix before the differential rotation. Combining the differential translation matrix of equation (1) from one-hand operation with the attitude transformation matrix of equation (4) yields the complete end-effector pose matrix T_new after the combined action of the differential translation and differential rotation in two-hand teaching:

    T_new = T_F · T_R · T_old    (6)

where T_old is the end-effector pose matrix before the combined action of the differential translation and differential rotation.
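The differential rotations can be estimated from the tool-direction vectors at two adjacent instants. A minimal sketch, under the small-rotation assumption: the rotation vector taking one unit direction vector to the next is approximately their cross product (spin about the tool axis itself is unobservable from a single direction vector, consistent with the text constraining only the Z-axis direction):

```python
import numpy as np

def diff_rotations_from_directions(n_prev, n_curr):
    """Estimate the differential rotations (dx, dy, dz) about the
    reference axes from the tool-direction unit vectors at two
    adjacent instants, valid for small rotations."""
    n_prev = np.asarray(n_prev, float) / np.linalg.norm(n_prev)
    n_curr = np.asarray(n_curr, float) / np.linalg.norm(n_curr)
    # For a small rotation w, n_curr ~ n_prev + w x n_prev, hence
    # n_prev x n_curr recovers the component of w orthogonal to n_prev.
    return np.cross(n_prev, n_curr)
```

The returned triple plugs directly into the differential rotation matrix of equation (4).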
S3, feeding the end-effector position and posture data obtained in step S2 into the robot inverse kinematics algorithm and solving for the joint motion parameters of the 3D virtual robot.
Specifically, once each end-effector pose of the 3D virtual robot is obtained, the rotation angles of its joints are solved through the robot inverse kinematics algorithm; the speed and acceleration of the 3D virtual robot are not limited by those of the robot body's joints, so it can respond instantly to the user's rapid hand movements.
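The patent does not fix a particular inverse kinematics algorithm. As an illustration of the pose-to-joint-angle step, a closed-form solution for a hypothetical two-link planar arm (not the patent's robot) looks like this:

```python
import math

def ik_two_link(x, y, l1, l2):
    """Closed-form inverse kinematics for a planar 2-link arm,
    an illustrative stand-in for the patent's unspecified IK solver.
    Returns (theta1, theta2) in radians, elbow-up solution."""
    c2 = (x * x + y * y - l1 * l1 - l2 * l2) / (2.0 * l1 * l2)
    if abs(c2) > 1.0:
        raise ValueError("target out of reach")
    theta2 = math.acos(c2)                       # elbow angle
    k1 = l1 + l2 * math.cos(theta2)
    k2 = l2 * math.sin(theta2)
    theta1 = math.atan2(y, x) - math.atan2(k2, k1)  # shoulder angle
    return theta1, theta2
```

A 6-DOF industrial robot would use the analogous closed-form or numerical solution for its own geometry; the structure of the step (pose in, joint angles out) is the same.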
S4, synchronously updating and redrawing the 3D virtual robot in the mixed reality glasses according to the joint motion parameters obtained in step S3.
Specifically, based on the solved joint motion parameters, the mixed reality glasses use the robot forward kinematics algorithm to synchronously update and redraw the 3D virtual robot together with the visual motion trajectory traced by its end-effector.
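The forward-kinematics redraw step can be sketched with the same illustrative two-link arm: joint angles go in, the end-effector point comes out and is appended to the drawn trajectory (the rendering itself is left to the glasses):

```python
import math

def fk_two_link(theta1, theta2, l1=1.0, l2=1.0):
    """Forward kinematics of the illustrative planar 2-link arm:
    joint angles -> end-effector position."""
    x = l1 * math.cos(theta1) + l2 * math.cos(theta1 + theta2)
    y = l1 * math.sin(theta1) + l2 * math.sin(theta1 + theta2)
    return x, y

def update_trajectory(trajectory, joints):
    """Append the current end-effector point to the visual motion
    trajectory that the glasses redraw each frame."""
    trajectory.append(fk_two_link(*joints))
    return trajectory
```

Each teaching frame calls `update_trajectory` once, so the drawn path grows in lockstep with the virtual robot's motion.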
S5, transmitting the joint motion parameters obtained in step S3 to the robot controller in real time, causing the robot body to perform the same motion and thereby completing the teaching.
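The patent only requires that the joint parameters reach the controller in real time over a communication link; the transport and message format below (a newline-delimited JSON message over a socket) are assumptions for illustration:

```python
import json
import socket

def send_joint_parameters(sock, joints):
    """Serialize the joint motion parameters and stream them to the
    robot controller. Transport and wire format are assumptions;
    a real controller would dictate its own protocol."""
    msg = json.dumps({"joints": list(joints)}).encode("utf-8") + b"\n"
    sock.sendall(msg)
```

The same call can be issued once per teaching frame, matching the real-time transmission requirement of step S5.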
The above are preferred embodiments of the present invention; all equivalent changes and modifications made according to the technical solution of the present invention, insofar as they do not exceed its scope, fall within the protection scope of the present invention.

Claims (7)

1. A mixed-reality-based rapid robot teaching system, characterized in that it comprises a physical industrial robot and mixed reality glasses; the physical industrial robot comprises a robot body and a robot controller for controlling its motion, the robot body comprises at least one joint, a 3D virtual robot identical to the robot body is built into the mixed reality glasses, and the mixed reality glasses are communicatively connected to the robot controller.
2. A method for rapid teaching of robots using the system of claim 1, comprising the steps of:
S1, continuously capturing the position of one or both hands using the mixed reality glasses;
S2, in the mixed reality glasses, mapping the hand positions obtained in step S1 to the position and posture of the 3D virtual robot's end-effector;
S3, feeding the end-effector position and posture data obtained in step S2 into a robot inverse kinematics algorithm and solving for the joint motion parameters of the 3D virtual robot;
S4, synchronously updating and redrawing the 3D virtual robot in the mixed reality glasses according to the joint motion parameters obtained in step S3;
S5, transmitting the joint motion parameters obtained in step S3 to the robot controller, causing the robot body to perform the same motion and thereby completing the teaching.
3. The method according to claim 2, characterized in that in step S1, after the user puts on the mixed reality glasses, the glasses fuse the 3D virtual robot into the mixed reality environment using spatial localization technology, and three-dimensional position data of one or both hands is acquired through optical motion capture.
4. The method according to claim 2, characterized in that in step S2, the position of one hand is mapped to the position of the 3D virtual robot's end-effector, while the positions of both hands are mapped to the full pose of the end-effector.
5. The method according to claim 4, characterized in that the positions of one or both hands are mapped to the position and posture of the 3D virtual robot's end-effector through relative pose transformations, with the following mapping algorithm:
in one-hand teaching, the relative displacement of the hand in three-dimensional space corresponds to a relative transformation of the origin of the end-effector coordinate system, and the differential translation matrix between adjacent instants, T_F, is:

    T_F = | 1  0  0  S(d)·dx |
          | 0  1  0  S(d)·dy |
          | 0  0  1  S(d)·dz |    (1)
          | 0  0  0  1       |

where dx, dy and dz represent small displacements of the hand along the x, y and z axes of the reference coordinate system, and S(d) is a scaling factor defined by equation (2) [reproduced only as an image in the source], in which the parameter d is the distance between the user and the 3D virtual robot and k is the maximum magnification of teaching precision; from equation (1), the new end-effector pose matrix after the differential translation, T_T_new, is:

    T_T_new = T_F · T_T_old    (3)

where T_T_old is the end-effector pose matrix before the differential translation;
in two-hand teaching, one hand plays the same role as in one-hand teaching: the relative transformation of its position corresponds to the relative transformation of the origin of the end-effector coordinate system, while the other hand corresponds to a point on the end-effector's Z axis distinct from the origin, the Z axis being the effective direction of the tool held by the end-effector; the attitude transformation matrix corresponding to the differential rotation is:

    T_R = |  1    -δz    δy   0 |
          |  δz    1    -δx   0 |
          | -δy    δx    1    0 |    (4)
          |  0     0     0    1 |

where δx, δy and δz represent the differential rotations of the two-hand direction about the x, y and z axes of the reference coordinate system; from equation (4), the new end-effector pose matrix after the differential rotation, T_R_new, is:

    T_R_new = T_R · T_R_old    (5)

where T_R_old is the end-effector pose matrix before the differential rotation; combining the differential translation matrix of equation (1) from one-hand operation with the attitude transformation matrix of equation (4) yields the complete end-effector pose matrix T_new after the combined action of the differential translation and differential rotation in two-hand teaching:

    T_new = T_F · T_R · T_old    (6)

where T_old is the end-effector pose matrix before the combined action of the differential translation and differential rotation.
6. The method according to claim 2, characterized in that in step S3, once each end-effector pose of the 3D virtual robot is obtained, the rotation angles of its joints are solved through the robot inverse kinematics algorithm; the speed and acceleration of the 3D virtual robot are not limited by those of the robot body's joints, so it can respond to the user's rapid hand movements in real time.
7. The method according to claim 2, characterized in that in step S4, based on the solved joint motion parameters of the 3D virtual robot, the mixed reality glasses use the robot forward kinematics algorithm to synchronously update and redraw the 3D virtual robot together with the visual motion trajectory traced by its end-effector.
CN201911144045.5A 2019-11-20 2019-11-20 Robot rapid teaching system and method based on mixed reality Active CN110815189B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911144045.5A CN110815189B (en) 2019-11-20 2019-11-20 Robot rapid teaching system and method based on mixed reality

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201911144045.5A CN110815189B (en) 2019-11-20 2019-11-20 Robot rapid teaching system and method based on mixed reality

Publications (2)

Publication Number Publication Date
CN110815189A true CN110815189A (en) 2020-02-21
CN110815189B CN110815189B (en) 2022-07-05

Family

ID=69557705

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911144045.5A Active CN110815189B (en) 2019-11-20 2019-11-20 Robot rapid teaching system and method based on mixed reality

Country Status (1)

Country Link
CN (1) CN110815189B (en)

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20160257000A1 (en) * 2015-03-04 2016-09-08 The Johns Hopkins University Robot control, training and collaboration in an immersive virtual reality environment
CN105045398A (en) * 2015-09-07 2015-11-11 哈尔滨市一舍科技有限公司 Virtual reality interaction device based on gesture recognition
CN106863295A (en) * 2015-12-10 2017-06-20 发那科株式会社 Robot system
CN108161882A (en) * 2017-12-08 2018-06-15 华南理工大学 A kind of robot teaching reproducting method and device based on augmented reality
CN110394779A (en) * 2018-04-25 2019-11-01 发那科株式会社 The simulator of robot

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
陈偕权 (Chen Xiequan): "Research on Robot Teaching and Playback Technology Based on Augmented Reality and Natural Human-Computer Interaction", China Master's Theses Full-text Database, Information Science and Technology Series *

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2021189224A1 (en) * 2020-03-24 2021-09-30 青岛理工大学 Registration system and method for robot augmented reality teaching
US11969904B2 (en) 2020-03-24 2024-04-30 Qingdao university of technology Registration system and method for robot-oriented augmented reality teaching system
CN112917457A (en) * 2021-01-27 2021-06-08 南京航空航天大学 Industrial robot rapid and accurate teaching system and method based on augmented reality technology
CN112894820A (en) * 2021-01-29 2021-06-04 清华大学深圳国际研究生院 Flexible mechanical arm remote operation man-machine interaction device and system
CN113126568A (en) * 2021-03-10 2021-07-16 上海乾庾智能科技有限公司 Industrial robot operation and demonstration system based on augmented reality technology
CN113290560A (en) * 2021-05-27 2021-08-24 乐聚(深圳)机器人技术有限公司 Robot motion control method, device, electronic equipment and storage medium
CN113858181A (en) * 2021-11-19 2021-12-31 国家电网有限公司 Power transmission line operation aerial robot based on human-computer interaction mixed reality
CN114310977A (en) * 2021-12-31 2022-04-12 天津中屹铭科技有限公司 Demonstrator for polishing robot and manual control method thereof
CN114310977B (en) * 2021-12-31 2024-05-10 天津中屹铭科技有限公司 Demonstrator for polishing robot and manual control method thereof

Also Published As

Publication number Publication date
CN110815189B (en) 2022-07-05

Similar Documents

Publication Publication Date Title
CN110815189B (en) Robot rapid teaching system and method based on mixed reality
CN107943283B (en) Mechanical arm pose control system based on gesture recognition
Ostanin et al. Interactive robot programing using mixed reality
US20210023694A1 (en) System and method for robot teaching based on rgb-d images and teach pendant
CN108241339B (en) Motion solving and configuration control method of humanoid mechanical arm
CN105291138B A visual feedback platform for enhancing the sense of immersion in virtual reality
Nee et al. Virtual and augmented reality applications in manufacturing
Horaud et al. Visually guided object grasping
Pan et al. Augmented reality-based robot teleoperation system using RGB-D imaging and attitude teaching device
CN107856014B (en) Mechanical arm pose control method based on gesture recognition
CN109079794B (en) Robot control and teaching method based on human body posture following
CN102350700A Vision-based robot control method
CN113829343B (en) Real-time multitasking and multi-man-machine interaction system based on environment perception
CN110421561A Method for spraying garments using a collaborative robot
CN107577159A Augmented reality simulation system
CN102830798A Markerless hand tracking method for a single-arm robot based on Kinect
Lambrecht et al. Markerless gesture-based motion control and programming of industrial robots
CN110142769A (en) The online mechanical arm teaching system of ROS platform based on human body attitude identification
Xiong et al. Predictive display and interaction of telerobots based on augmented reality
CN111185906A (en) Leap Motion-based dexterous hand master-slave control method
CN207630048U Master-slave control device for a novel six-degree-of-freedom parallel platform
Gallala et al. Human-robot interaction using mixed reality
CN107738256A Teach-by-doing humanoid teaching robot programming system
CN107443369A Calibration-free visual servo control method for a robotic arm based on inverse identification of a visual measurement model
Guan et al. A novel robot teaching system based on augmented reality

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant