CN110405775A - Robot teaching system and method based on augmented reality - Google Patents

Robot teaching system and method based on augmented reality

Info

Publication number
CN110405775A
CN110405775A (application CN201910374808.9A)
Authority
CN
China
Prior art keywords
teaching
machine
module
augmented reality
dimensional pose
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201910374808.9A
Other languages
Chinese (zh)
Inventor
徐迟
刘翊
关泽彪
洪鑫
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
China University of Geosciences
Original Assignee
China University of Geosciences
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by China University of Geosciences filed Critical China University of Geosciences
Priority to CN201910374808.9A priority Critical patent/CN110405775A/en
Publication of CN110405775A publication Critical patent/CN110405775A/en
Pending legal-status Critical Current

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/0081Programme-controlled manipulators with master teach-in means
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls

Landscapes

  • Engineering & Computer Science (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Numerical Control (AREA)
  • Manipulator (AREA)

Abstract

The invention discloses a robot teaching system and method based on augmented reality. A motion capture module captures the three-dimensional pose of a teaching device and passes it to a model processing module, where three-dimensional rendering and display software further processes the pose data to obtain the three-dimensional pose of the end of the teaching operation model. The end-pose data are then passed to an augmented reality display module, which assigns them to a virtual robot arm responsible for tracing the motion path, displays the path, and shows the operator the trajectory planned so far. The beneficial effects of practicing the invention are that the virtual robot arm simulates the motion trajectory that the real robot arm would follow in actual operation, presenting the planned robot path to the operator and simplifying the robot teaching process; in addition, the three-dimensional pose of the teaching device is captured from multiple angles, improving capture accuracy.

Description

Robot teaching system and method based on augmented reality
Technical field
The present invention relates to the field of robot teaching, and more specifically to a robot teaching system and method based on augmented reality.
Background art
Augmented reality refers to the technology of accurately registering virtual models into a real scene and fusing virtual objects with the real environment, so that the operator perceives the real world and the virtual world at the same time. Robot teaching refers to the process of programming a robot's job tasks; its main function is to plan the robot's mode of operation and workflow, and it is an important research topic in the field of robotics. As the working environments of industrial robots become more diverse and their tasks more complex, new teaching methods must be designed to meet modern production requirements. The present invention therefore proposes a method of robot teaching based on augmented reality. The teaching process of augmented-reality teaching is carried out in the real environment, allowing teaching under dynamic conditions, and has the advantages of a small modeling workload, strong immersion, and a natural, realistic teaching process.
Summary of the invention
The technical problem to be solved by the present invention is to provide, in view of the drawbacks of the prior art, a robot teaching system based on augmented reality.
The technical solution adopted by the present invention to solve this problem is to construct a robot teaching system based on augmented reality, including a real robot arm, a teaching device, and a motion capture module for capturing and transmitting the motion trajectory data of the teaching device; the robot teaching system further includes a model processing module and an augmented reality display module connected in sequence; wherein:
The teaching device is equipped with several markers and a teaching operation model mounted at its end; the markers identify the teaching device so that it can be captured by the motion capture module while it moves;
The model processing module is connected to the motion capture module and receives the motion trajectory data of the teaching device, from which it further calculates the three-dimensional pose of the teaching device; the processed data are then transferred to the augmented reality display module;
The augmented reality display module creates a virtual robot arm and, according to the data received, fuses the virtual robot arm with the real robot arm; within the augmented reality display module, the virtual robot arm displays the motion trajectory of the teaching device.
Further, the markers together constitute a marker rigid body; during operation, the motion capture module captures the three-dimensional coordinates of the centroid of the marker rigid body, and from the captured data the three-dimensional pose of the teaching device is further calculated.
Further, three-dimensional rendering and display software is used in the model processing module to process the data transmitted by the motion capture module, and the processed data are transferred to the augmented reality display module; from the three-dimensional coordinates of the marker rigid body centroid transmitted by the motion capture module, the three-dimensional pose of the end of the teaching operation model is calculated according to the following formula:
where t_x, t_y and t_z are the three-dimensional coordinates of the marker rigid body centroid; m_x, m_y and m_z are the three-dimensional coordinates of the end of the teaching operation model; k is the distance between the end of the teaching operation model and the marker rigid body centroid; and x, y and z are the included angles between the marker rigid body centroid and the coordinate axes, respectively.
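The formula image does not survive in this text; a plausible reconstruction from these variable definitions, assuming x, y and z are the direction angles of the offset from the rigid-body centroid to the model end, is:

    m_x = t_x + k·cos(x)
    m_y = t_y + k·cos(y)
    m_z = t_z + k·cos(z)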
Further, the augmented reality display module includes a display terminal with a camera; the display terminal highlights the motion trajectory of the teaching device.
Further, in the display terminal, the virtual robot arm is created with OpenGL, and the SIFT algorithm is used to recognize and locate the image of the real robot arm so that the virtual robot arm is positioned beside the real robot arm.
Further, in the display terminal, the virtual robot arm is driven to follow the teaching operation model at the end of the teaching device according to the received three-dimensional pose data, so that the operator can see more intuitively, from the display terminal, how the real robot arm would actually work.
Further, before the motion trajectory of the teaching device is highlighted, the coordinate position of the display terminal's camera must be calibrated, and the calibrated camera coordinates must be unified with the three-dimensional pose coordinates of the teaching device.
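As a hedged illustration of this unification step (the patent gives no code; the frame names and transform sources below are assumptions), a pose reported in the motion-capture frame can be re-expressed in the calibrated camera frame by composing homogeneous transforms:

    import numpy as np

    def to_homogeneous(R, t):
        # Build a 4x4 homogeneous transform from a 3x3 rotation R and a translation t.
        T = np.eye(4)
        T[:3, :3] = R
        T[:3, 3] = t
        return T

    def unify_point(p_mocap, T_cam_from_world, T_world_from_mocap):
        # p_mocap            -- (3,) point from the motion capture module
        # T_cam_from_world   -- 4x4 extrinsic transform obtained from camera calibration
        # T_world_from_mocap -- 4x4 transform aligning the mocap frame with the world frame
        p = np.append(p_mocap, 1.0)                  # homogeneous coordinates
        return (T_cam_from_world @ T_world_from_mocap @ p)[:3]

Once the taught trajectory and the camera share one coordinate frame, the virtual robot arm and the path can be drawn at positions consistent with the live camera image.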
The robot teaching method based on augmented reality provided by this embodiment comprises the following steps:
S1: the teaching device simulates the teaching operation, and the motion capture module begins capturing the motion trajectory of the teaching device;
S2: the motion capture module captures the markers on the teaching device and identifies the motion trajectory of the marker rigid body;
S3: the motion capture module passes the recognized three-dimensional pose coordinates of the marker rigid body centroid to the model processing module; the three-dimensional rendering and display software in the model processing module calculates the three-dimensional pose coordinates of the end of the teaching operation model and passes them to the augmented reality display module;
S4: coordinate calibration is performed on the camera of the augmented reality display module, and the calibrated tablet computer is unified with the three-dimensional pose coordinates of the teaching operation model;
S5: the image of the real robot arm is recognized and located using the SIFT algorithm;
S6: in the augmented reality display module, a virtual robot arm is created with OpenGL and, according to the location information obtained in step S5, positioned beside the real robot arm; the virtual robot arm is driven to follow the teaching operation model at the end of the teaching device, and the operator can observe from the display terminal how the real robot arm would actually work.
In the robot teaching system based on augmented reality of the present invention, the motion capture module obtains the motion trajectory of the teaching device, the three-dimensional pose along the trajectory is further calculated, and the three-dimensional rendering and display software generates a highlighted holographic route that is shown on the display terminal, prompting the operator with the planned robot motion trajectory.
Implementing the robot teaching system and method based on augmented reality of the present invention has the following beneficial effects:
1. The SIFT algorithm recognizes and locates the real robot arm to anchor the virtual robot arm, and OpenGL then superimposes the virtual robot arm and the motion trajectory on the camera image, realizing the augmented reality display; this presents the planned robot path to the operator and simplifies the robot teaching process;
2. The motion capture module captures the three-dimensional pose of the teaching device from multiple angles, improving capture accuracy.
Brief description of the drawings
The present invention will be further explained below with reference to the accompanying drawings and embodiments, in which:
Fig. 1 is a schematic structural diagram of the robot teaching system;
Fig. 2 is a flowchart of the robot teaching method;
Fig. 3 is a structural diagram of the operation object;
Fig. 4 is a structural diagram of the teaching device;
Fig. 5 is a structural diagram of the motion capture module.
Detailed description of the embodiments
For a clearer understanding of the technical features, objects and effects of the present invention, specific embodiments of the invention are described in detail below with reference to the accompanying drawings.
In the present embodiment, the robot teaching system disclosed by the invention is described in detail, taking laser welding as an example.
Referring to Fig. 1, a schematic structural diagram of the robot teaching system, the robot teaching system based on augmented reality disclosed by the invention includes a teaching device L1, a motion capture module L2, a model processing module L3, an augmented reality display module L4 and a real robot arm L5, wherein:
In the present embodiment, the teaching device is equipped with eight markers, which together constitute a marker rigid body;
A teaching operation device is mounted at the end of the teaching device; since a laser pen is clamped at the end of the real robot arm during laser welding, the teaching operation device in this embodiment uses a laser pen model that stands in for the robot arm's laser pen, thereby simulating the real working environment;
To avoid the teaching device being occluded by the object to be welded during teaching and thus not captured by the motion capture module, a marker extension rod is fixed between each marker and the teaching device in this embodiment, spatially extending the markers;
In the present embodiment, each marker is a reflective sphere used to identify the teaching device, so that the teaching device can be captured by the motion capture module while it moves.
The motion capture module L2 captures and transmits the motion trajectory data of the teaching device; in the present embodiment, six motion capture cameras are used to capture the three-dimensional coordinates of the marker rigid body centroid in real time and without blind spots; each camera has a resolution of 1280 × 1024 and a maximum frame rate of 240 FPS.
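As a small illustrative sketch (not taken from the patent; the marker ordering and data layout are assumptions), the centroid that the model processing module consumes can be approximated per frame as the mean of the triangulated sphere positions:

    import numpy as np

    def rigid_body_centroid(marker_positions):
        # marker_positions: (N, 3) triangulated 3D positions of the N reflective
        # spheres in one capture frame; returns the (3,) rigid-body centroid.
        return np.asarray(marker_positions, dtype=float).mean(axis=0)

    centroid = rigid_body_centroid(np.random.rand(8, 3))  # placeholder frame with eight spheres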
The model processing module L3 receives the motion trajectory data of the teaching device and further calculates its three-dimensional pose from the received data; the processed data are then transferred to the augmented reality display module L4. In the present embodiment, three-dimensional rendering and display software running on a computer processes the data transmitted by the motion capture module: from the three-dimensional coordinates t_x, t_y and t_z of the marker rigid body centroid, the three-dimensional coordinates m_x, m_y and m_z of the tip of the laser pen model are calculated using the formula given above, where k is the distance between the tip of the laser pen model and the marker rigid body centroid, and x, y and z are the included angles between the marker rigid body centroid and the coordinate axes, respectively.
In the present embodiment, the three-dimensional coordinates m_x, m_y and m_z of the tip of the laser pen model recorded by the model processing module, together with the included angles x, y and z between the marker rigid body centroid and the coordinate axes, are used offline to generate executable code for operating a physical industrial robot; this executable code is transferred over a wireless or wired network to the controller of the physical industrial robot to control the motion of the real robot arm. The content recorded by the model processing module is stored in a register.
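A minimal sketch of how this recording step might look, assuming the direction-angle relation reconstructed above and a purely illustrative waypoint file format (the patent does not specify the controller's program syntax):

    import math
    import json

    def laser_tip_position(centroid, k, angles):
        # centroid -- (t_x, t_y, t_z) of the marker rigid body centroid
        # k        -- distance from the centroid to the laser pen model tip
        # angles   -- (x, y, z) direction angles, in radians
        # Uses the reconstructed relation m_i = t_i + k*cos(angle_i).
        return tuple(t + k * math.cos(a) for t, a in zip(centroid, angles))

    class TeachRecorder:
        # Buffers taught tip poses, then dumps them for offline program generation.
        def __init__(self):
            self.waypoints = []

        def record(self, centroid, k, angles):
            self.waypoints.append(laser_tip_position(centroid, k, angles))

        def export(self, path):
            # Hypothetical waypoint file; a real controller needs its own program format.
            with open(path, "w") as f:
                json.dump({"waypoints": self.waypoints}, f, indent=2)

    rec = TeachRecorder()
    rec.record(centroid=(0.5, 0.2, 1.0), k=0.15, angles=(0.3, 1.2, 1.1))
    rec.export("taught_path.json")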
The augmented reality display module L4 displays the motion trajectory of the teaching device. In the present embodiment, the trajectory is highlighted on a tablet computer; on the tablet computer, OpenGL is used to create the virtual robot arm, and according to the received three-dimensional pose data of the teaching device, the virtual robot arm is fused with the real robot arm, so that the operator can see more intuitively, on the tablet computer, how the real robot arm would actually work;
Before the motion trajectory of the teaching device is highlighted, the coordinate position of the display terminal's camera must be calibrated, and the calibrated camera coordinates must be unified with the three-dimensional pose coordinates of the teaching device.
Referring to Fig. 2, the flowchart of the robot teaching method, the method specifically includes the following steps:
S1: the teaching device simulates the operation of the laser pen, and the six cameras of the motion capture module begin capturing the motion trajectory of the teaching device;
S2: the motion capture module captures the reflective spheres on the teaching device and identifies the motion trajectory of the marker rigid body;
S3: the motion capture module passes the recognized three-dimensional pose coordinates of the marker rigid body centroid to the computer, where the three-dimensional rendering and display software calculates the three-dimensional pose coordinates of the tip of the laser pen model and passes them to the tablet computer;
S4: coordinate calibration is performed on the tablet computer's camera to obtain the camera intrinsic parameters A and distortion coefficients B; from A and B the camera's rotation and translation matrices are calculated, and OpenGL then renders the three-dimensional objects. The calibrated tablet computer is unified with the three-dimensional pose coordinates of the laser pen model; the effect of this coordinate unification is that the virtual robot arm and the motion trajectory of the teaching device are superimposed on the camera image, realizing the augmented reality display;
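A hedged sketch of this calibration and overlay step using OpenCV (the patent names no library for it; the chessboard target, file names and reference points below are assumptions):

    import cv2
    import numpy as np

    # Intrinsic parameters A and distortion coefficients B from chessboard views.
    pattern = (9, 6)                                   # inner corners of an assumed chessboard target
    objp = np.zeros((pattern[0] * pattern[1], 3), np.float32)
    objp[:, :2] = np.mgrid[0:pattern[0], 0:pattern[1]].T.reshape(-1, 2)

    obj_points, img_points, img_size = [], [], None
    for fname in ["calib_01.png", "calib_02.png", "calib_03.png"]:   # placeholder images
        gray = cv2.imread(fname, cv2.IMREAD_GRAYSCALE)
        if gray is None:
            continue
        img_size = gray.shape[::-1]
        found, corners = cv2.findChessboardCorners(gray, pattern)
        if found:
            obj_points.append(objp)
            img_points.append(corners)

    _, A, B, _, _ = cv2.calibrateCamera(obj_points, img_points, img_size, None, None)

    # Camera rotation and translation from known reference points, then trajectory overlay.
    ref_world = np.float32([[0, 0, 0], [0.2, 0, 0], [0.2, 0.2, 0], [0, 0.2, 0]])   # assumed coplanar points
    ref_image = np.float32([[320, 240], [420, 240], [420, 150], [320, 150]])       # assumed pixel positions
    _, rvec, tvec = cv2.solvePnP(ref_world, ref_image, A, B)

    trajectory_3d = np.float32([[0.1, 0.1, 0.3], [0.15, 0.1, 0.3]])                # taught path points
    pts_2d, _ = cv2.projectPoints(trajectory_3d, rvec, tvec, A, B)
    frame = cv2.imread("camera_frame.png")
    if frame is not None:
        for p in pts_2d.reshape(-1, 2).astype(int):
            cv2.circle(frame, (int(p[0]), int(p[1])), 4, (0, 255, 0), -1)          # highlight the taught trajectory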
S5: the image of the real robot arm is recognized and located using the SIFT algorithm; the matching relationship between the recognized reference target and the current frame image yields a transformation matrix used to place the three-dimensional object, which is then drawn with OpenGL;
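A minimal sketch of such SIFT-based localization with OpenCV (the reference image, matcher settings and thresholds are assumptions, not taken from the patent):

    import cv2
    import numpy as np

    sift = cv2.SIFT_create()
    reference = cv2.imread("robot_arm_reference.png", cv2.IMREAD_GRAYSCALE)  # stored view of the real arm
    frame = cv2.imread("camera_frame.png", cv2.IMREAD_GRAYSCALE)             # live camera image

    kp_ref, des_ref = sift.detectAndCompute(reference, None)
    kp_frm, des_frm = sift.detectAndCompute(frame, None)

    # Ratio-test matching, then a homography mapping the reference view onto the frame.
    matches = cv2.BFMatcher(cv2.NORM_L2).knnMatch(des_ref, des_frm, k=2)
    good = [m for m, n in matches if m.distance < 0.75 * n.distance]

    if len(good) >= 4:
        src = np.float32([kp_ref[m.queryIdx].pt for m in good]).reshape(-1, 1, 2)
        dst = np.float32([kp_frm[m.trainIdx].pt for m in good]).reshape(-1, 1, 2)
        H, mask = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)
        h, w = reference.shape
        corners = cv2.perspectiveTransform(
            np.float32([[0, 0], [w, 0], [w, h], [0, h]]).reshape(-1, 1, 2), H)
        # 'corners' outlines where the real arm sits in the frame; the virtual arm is anchored beside it.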
S6: on the tablet computer, a virtual robot arm is created with OpenGL and, according to the location information obtained in step S5, positioned beside the real robot arm; the virtual robot arm is driven to follow the laser pen model at the end of the teaching device, and the operator can observe from the display terminal how the real robot arm would actually work.
Referring to Fig. 3, a structural diagram of the operation object: in a specific implementation, the operation object refers to the various real devices to be welded and is placed on a workstation; in the present embodiment, the workstation used is 200 cm long, 100 cm wide and 100 cm high, and laser welding is then taught with the teaching device. The workstation is a real welding platform; before teaching, the actual dimensions of the welding platform must be read, and according to these dimensions the workspace of the virtual welding laser pen is limited in the model processing module, preventing the virtual laser pen from going out of bounds. In a specific implementation, any reflective parts of the device to be welded must be painted black or covered, so that they do not interfere with the motion capture module's acquisition of the trajectory coordinates of the laser pen model.
When the present invention is laid out for an off-the-shelf scene, the operation object and the device to be welded are a T-shaped workpiece to be processed.
Referring to Fig. 4, a structural diagram of the teaching device: the teaching device consists of five parts, where 4.1 is the main switch; 4.2 is the teaching operation switch; 4.3 is the marker extension rod; 4.4 is a marker; and 4.5 is the laser pen model. To ensure that the teaching device can be captured from all angles at any time by the motion capture module, markers are added to the middle of the teaching device and extended outward, so that they can still be seen by the motion capture cameras in a complex industrial environment.
Referring to Fig. 5, a structural diagram of the motion capture module: the motion capture module is set up in a 5.5 × 3 m space, with a capture space of 5 × 3 m, and can capture five rigid bodies simultaneously.
The embodiments of the present invention have been described above with reference to the accompanying drawings, but the invention is not limited to the specific embodiments described, which are only illustrative and not restrictive. Those of ordinary skill in the art, inspired by the present invention and without departing from the scope protected by the purpose of the invention and the claims, may also derive many other forms, all of which fall within the protection of the present invention.

Claims (7)

1. A robot teaching system based on augmented reality, comprising a real robot arm, a teaching device, and a motion capture module for capturing and transmitting motion trajectory data of the teaching device; characterized in that the robot teaching system further comprises a model processing module and an augmented reality display module connected in sequence; wherein:
the teaching device is equipped with several markers and a teaching operation model mounted at the end of the teaching device; wherein the markers identify the teaching device so that it can be captured by the motion capture module while it moves;
the model processing module is connected to the motion capture module and receives the motion trajectory data of the teaching device, from which it further calculates the three-dimensional pose of the teaching device; the processed data are then transferred to the augmented reality display module;
the augmented reality display module creates a virtual robot arm and, according to the data received, fuses the virtual robot arm with the real robot arm; the motion trajectory of the teaching device is displayed through the virtual robot arm.
2. The robot teaching system according to claim 1, characterized in that the markers constitute a marker rigid body; during operation, the motion capture module captures the three-dimensional coordinates of the centroid of the marker rigid body, and from the captured data the three-dimensional pose coordinates of the teaching device are further calculated.
3. The robot teaching system according to claim 2, characterized in that three-dimensional rendering and display software is used in the model processing module to process the data transmitted by the motion capture module, and the processed data are transferred to the augmented reality display module; wherein, from the three-dimensional pose coordinates of the marker rigid body centroid transmitted by the motion capture module, the three-dimensional pose coordinates of the teaching device are calculated according to the following formula:
where t_x, t_y and t_z are the three-dimensional coordinates of the marker rigid body centroid; m_x, m_y and m_z are the three-dimensional coordinates of the end of the teaching operation model; k is the distance between the end of the teaching operation model and the marker rigid body centroid; and x, y and z are the included angles between the marker rigid body centroid and the coordinate axes, respectively.
4. The robot teaching system according to claim 1, characterized in that the augmented reality display module comprises a display terminal with a camera; in the display terminal, on the one hand the virtual robot arm is created with OpenGL; on the other hand, the image of the real robot arm is recognized and located with the SIFT algorithm, and the virtual robot arm is positioned beside the real robot arm.
5. The robot teaching system according to claim 4, characterized in that, in the display terminal, according to the received three-dimensional pose coordinates of the teaching device, the virtual robot arm is driven to follow the teaching operation model at the end of the teaching device, so that the operator can see more intuitively, from the display terminal, how the real robot arm would actually work.
6. The robot teaching system according to claim 4, characterized in that, before the motion trajectory of the teaching device is displayed, the coordinate position of the camera of the display terminal is calibrated, and the calibrated camera coordinates are unified with the three-dimensional pose coordinates of the teaching device.
7. A robot teaching method based on augmented reality, characterized by comprising the following steps:
S1: the teaching device simulates the operation of the laser pen, and the motion capture module begins capturing the motion trajectory of the teaching device;
S2: the motion capture module captures the markers on the teaching device and identifies the motion trajectory of the marker rigid body;
S3: the motion capture module passes the recognized three-dimensional pose coordinates of the marker rigid body centroid to the model processing module; the three-dimensional rendering and display software in the model processing module calculates the three-dimensional pose coordinates of the end of the teaching operation model and passes them to the augmented reality display module;
S4: coordinate calibration is performed on the camera of the display terminal, and the calibrated display terminal is unified with the three-dimensional pose coordinates of the teaching operation model;
S5: the image of the real robot arm is recognized and located using the SIFT algorithm;
S6: in the augmented reality display module, a virtual robot arm is created with OpenGL and, according to the location information obtained in step S5, positioned beside the real robot arm; the virtual robot arm is driven to follow the teaching operation model at the end of the teaching device, and the operator can observe from the display terminal how the real robot arm would actually work.
CN201910374808.9A 2019-05-07 2019-05-07 Robot teaching system and method based on augmented reality Pending CN110405775A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910374808.9A CN110405775A (en) 2019-05-07 2019-05-07 A kind of robot teaching system and method based on augmented reality

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910374808.9A CN110405775A (en) 2019-05-07 2019-05-07 A kind of robot teaching system and method based on augmented reality

Publications (1)

Publication Number Publication Date
CN110405775A true CN110405775A (en) 2019-11-05

Family

ID=68357765

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910374808.9A Pending CN110405775A (en) 2019-05-07 2019-05-07 A kind of robot teaching system and method based on augmented reality

Country Status (1)

Country Link
CN (1) CN110405775A (en)

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9919427B1 (en) * 2015-07-25 2018-03-20 X Development Llc Visualizing robot trajectory points in augmented reality
CN108732756A (en) * 2017-04-21 2018-11-02 发那科株式会社 The maintenance auxiliary device of shop equipment and safeguard auxiliary system

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
张石磊 et al.: "Research on teaching of a 6-DOF industrial robot based on augmented reality" (基于增强现实的6自由度工业机器人示教研究), vol. 36, no. 1, pp. 77-82 *
新浪电竞 (Sina Esports): "More black tech than VR: a detailed look at the hardware of the Microsoft HoloLens glasses" (比VR更黑科技 微软Hololens眼镜硬件构造详解), vol. 1, p. 288 *
薛鸿民 (ed.): "Fundamentals of Computer Applications for Liberal Arts Undergraduates (Applied)" (《大学文科计算机应用基础(应用型)》), 28 February 2019, Shanghai Jiao Tong University Press, p. 276 *

Cited By (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2021189223A1 (en) * 2020-03-24 2021-09-30 青岛理工大学 Registration system and method for robot augmented reality teaching based on identification card movement
CN111531551A (en) * 2020-04-22 2020-08-14 实时侠智能控制技术有限公司 Safety demonstrator using universal tablet computer and demonstration method
CN112454363A (en) * 2020-11-25 2021-03-09 马鞍山学院 Control method of AR auxiliary robot for welding operation
CN113034668A (en) * 2021-03-01 2021-06-25 中科数据(青岛)科技信息有限公司 AR-assisted mechanical simulation operation method and system
CN114067658A (en) * 2021-11-30 2022-02-18 深圳市越疆科技有限公司 Coffee flower teaching system
CN114067658B (en) * 2021-11-30 2023-08-04 深圳市越疆科技有限公司 Coffee draws colored teaching system
CN114161479A (en) * 2021-12-24 2022-03-11 上海机器人产业技术研究院有限公司 Robot dragging demonstration performance test system and test method
CN114161479B (en) * 2021-12-24 2023-10-20 上海机器人产业技术研究院有限公司 Robot dragging teaching performance test system and test method
CN115530620A (en) * 2022-10-25 2022-12-30 深圳市越疆科技有限公司 Coffee garland track generation method, coffee making method, related equipment and system
CN115530620B (en) * 2022-10-25 2023-08-18 深圳市越疆科技股份有限公司 Coffee-drawing track generation method, coffee-making method, related equipment and system

Similar Documents

Publication Publication Date Title
CN110405775A (en) Robot teaching system and method based on augmented reality
US11813749B2 (en) Robot teaching by human demonstration
CN111906784B (en) Pharyngeal swab double-arm sampling robot based on machine vision guidance and sampling method
CN110142770A (en) A kind of robot teaching system and method based on head-wearing display device
US10857673B2 (en) Device, method, program and recording medium, for simulation of article arraying operation performed by robot
CN110170995B (en) Robot rapid teaching method based on stereoscopic vision
CN108994832B (en) Robot eye system based on RGB-D camera and self-calibration method thereof
CN110751691B (en) Automatic pipe fitting grabbing method based on binocular vision
CN106041937A (en) Control method of manipulator grabbing control system based on binocular stereoscopic vision
CN104057453A (en) Robot device and method for manufacturing processing object
CN110433467B (en) Operation method and device of table tennis ball picking robot based on binocular vision and ant colony algorithm
TWI607814B (en) Flying Laser Marking System with Real-time 3D Modeling and Method Thereof
CN107103624B (en) Stereoscopic vision conveying system and conveying method thereof
CN110170996B (en) Robot rapid teaching system based on stereoscopic vision
EP3921801B1 (en) Creating training data variability in machine learning for object labelling from images
Shahzad et al. A vision-based path planning and object tracking framework for 6-DOF robotic manipulator
WO2022000713A1 (en) Augmented reality self-positioning method based on aviation assembly
CN110281231A (en) The mobile robot 3D vision grasping means of unmanned FDM increasing material manufacturing
CN107577159A (en) Augmented reality analogue system
CN114407015A (en) Teleoperation robot online teaching system and method based on digital twins
CN112288815A (en) Target mode position measuring method, system, storage medium and equipment
Van Tran et al. BiLuNetICP: A deep neural network for object semantic segmentation and 6D pose recognition
CN113597362B (en) Method and control device for determining the relationship between a robot coordinate system and a mobile device coordinate system
CN110766752A (en) Virtual reality interactive glasses with reflective mark points and space positioning method
CN114840079B (en) High-speed rail driving action simulation virtual-real interaction method based on gesture recognition

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination