CN109483517A - Collaborative robot teaching method based on hand pose tracking - Google Patents


Info

Publication number
CN109483517A
CN109483517A (application CN201811232885.2A)
Authority
CN
China
Prior art keywords
robot
data
speed
hand
computer
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201811232885.2A
Other languages
Chinese (zh)
Inventor
匡加伦
王云鹏
刘宏业
段文斌
张志涛
洪鹰
肖聚亮
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Tianjin Yangtian Technology Co Ltd
Original Assignee
Tianjin Yangtian Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tianjin Yangtian Technology Co Ltd filed Critical Tianjin Yangtian Technology Co Ltd
Priority to CN201811232885.2A priority Critical patent/CN109483517A/en
Publication of CN109483517A publication Critical patent/CN109483517A/en
Pending legal-status Critical Current

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00 Programme-controlled manipulators
    • B25J9/0081 Programme-controlled manipulators with master teach-in means
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00 Programme-controlled manipulators
    • B25J9/16 Programme controls
    • B25J9/1628 Programme controls characterised by the control loop
    • B25J9/1651 Programme controls characterised by the control loop acceleration, rate control

Landscapes

  • Engineering & Computer Science (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Numerical Control (AREA)

Abstract

The invention discloses a collaborative robot teaching method based on hand pose tracking, comprising the following steps. In the first step, the speed limit of the robot motion is set, and the sensor, the computer program, and the robot controller are started. In the second step, the hand pose data are received, processed, and transmitted: the data are first acquired, then high-pass filtered and averaged, and finally communicated. In the third step, a mapping between the hand pose and the robot end-effector is established, converting the real-time translational velocity and rotational angular velocity of the palm into the velocity and angular velocity of the end-effector motion. In the fourth step, the robot controller receives the processed gesture data and drives the robot, realizing synchronized, coordinated motion between the robot and the operator's hand. In the fifth step, the robot motion is reproduced. The invention enables the robot to translate along the X, Y, and Z axes and to rotate about the X, Y, and Z axes.

Description

Collaborative robot teaching method based on hand pose tracking
Technical field
The present invention relates to the field of robot control, and in particular to a robot teaching technique for human-robot collaboration based on gesture recognition with a hand pose tracking sensor.
Background technique
With the development of robot technology, human-robot collaboration and human-computer interaction have become new topics in the field of robot control.
In the industrial robot field, robot teaching mostly relies on manual programmed teaching: the operator uses a teach pendant to manually drive the robot to predetermined positions, which are recorded and transmitted to the robot controller; the robot then repeats the task automatically according to the stored instructions. This teaching method suits tasks with low dexterity requirements, and its teaching efficiency is low. For work that places high demands on manipulator flexibility, such as garment spraying, manual programmed teaching clearly cannot meet the requirements. In the garment spraying process, the end-effector precision requirement is modest but the flexibility requirement is very high, and many experiments are needed before the operator can select the most suitable spraying motion; manual programmed teaching therefore falls short in efficiency.
Summary of the invention
The purpose of the present invention is to overcome the disadvantages of the prior art by proposing a collaborative robot teaching method based on a hand pose tracking sensor that achieves better human-robot collaboration and interaction.
The present invention is realized through the following technical steps:
A collaborative robot teaching method based on hand pose tracking, comprising the following steps:
Step 1: teaching preparation:
Place the hand pose tracking sensor horizontally on the test bench so that the X1, Y1, and Z1 axes of the sensor coordinate system are parallel to the Y0, Z0, and X0 axes of the robot base coordinate system, respectively;
Set the speed limit of the robot motion, and start the hand pose tracking sensor 1, the computer, and the robot controller; the hand pose tracking sensor and the robot controller are each connected to the computer by a data cable;
Connect the teach pendant to the robot controller and switch the robot to manual control mode on the pendant. Control signals are passed through the pendant to the robot controller and then to the robot servo drivers to move the robot arm to the initial position of the motion. Record the robot's initial pose and enter on the pendant a program that returns the robot to the initial pose; set the reference frame of the robot motion on the pendant to the world coordinate system;
Step 2: acquire the translational velocity and rotational angular velocity of the operator's hand during motion, specifically:
(a) Acquire the position and attitude data of the operator's hand with the hand pose tracking sensor, transfer the acquired hand data to the computer, and differentiate the hand position and attitude to obtain the three-degree-of-freedom translational velocity and the three-degree-of-freedom rotational angular velocity;
(b) Filter the differentiated three-degree-of-freedom translational velocity and rotational angular velocity by a high-pass filtering method: when the motion velocity is below the threshold, the computer transmits a velocity value of 0 to the robot and the robot remains stationary, thereby forming six filtered data sequences;
(c) data processing:
In the first step, check whether the data of the current frame in each filtered data sequence changes by more than 20% relative to the previous frame; if not, the current frame is usable, and if so, discard the current data and reuse the previous frame's data as the usable data for the current frame;
In the second step, average the usable translational velocity and rotational angular velocity of the current frame with the translational velocity and rotational angular velocity of the four preceding frames, and output the result as the final usable palm data;
(d) Coordinate transformation: using the mapping between the hand pose and the robot end-effector, transform the usable palm data from the palm coordinate system by coordinate transformation to obtain the velocity and angular velocity of the end-effector motion in the robot base coordinate system;
Step 3: data transmission:
(a) Determine a proportionality coefficient that reasonably maps the palm velocity and angular velocity one-to-one onto the velocity and angular velocity of the end-effector motion, amplify the palm velocity and angular velocity, and store the amplified velocity and angular velocity data in the computer as the final application data;
(b) Data communication: transmit the amplified velocity and angular velocity data from the computer to the robot controller over TCP/IP, realizing communication between the computer and the robot controller;
Step 4: robot control:
(a) The robot controller receives the amplified gesture motion data and forwards it to the robot servo controller to drive the robot, realizing synchronized human-robot motion;
(b) Robot reset: after the motion ends, run the robot controller's reset program so that the robot quickly returns to the preset initial position;
Step 5: motion reproduction: retrieve the stored motion velocity values of step 3 from the computer, transmit them to the robot controller again, and repeat steps 3 and 4 to realize the repeated motion of the robot.
The beneficial effects of the present invention are as follows. The gesture-recognition-based collaborative robot teaching method captures the operator's hand motion with a motion-sensing sensor and guides the robot to move in the world coordinate system at the corresponding speed and in the corresponding direction, and motion reproduction is possible. The teaching method provided by the invention is simple to operate, efficient, and demands little specialized knowledge from the operator. It is applicable to most articulated robots, and the motion-sensing sensor it uses is inexpensive, giving it broad applicability. Moreover, the operator can control the robot remotely, which is useful when the robot's working environment is relatively harsh.
Detailed description of the invention
Fig. 1 is a structural schematic diagram of an existing hand pose tracking sensor;
Fig. 2 is the coordinate system transformation diagram used by the method of the present invention;
Fig. 3 is a signal and control transmission schematic of the method of the present invention;
Fig. 4 is the control flow chart of the method of the present invention;
Fig. 5 is the data processing flow chart of the invention.
Specific embodiment
The present invention will be described in detail with embodiments and with reference to the accompanying drawings.
As shown in the drawings, the collaborative robot teaching method based on hand pose tracking of the present invention comprises the following steps:
Step 1: teaching preparation:
The hand pose tracking sensor is placed horizontally on the test bench (the present invention uses an existing serial robot platform, such as a UR robot) so that the X1, Y1, and Z1 axes of the sensor coordinate system are parallel to the Y0, Z0, and X0 axes of the robot base coordinate system, respectively.
The speed limit of the robot motion is set, and the hand pose tracking sensor 1, the computer, and the robot controller are started; the hand pose tracking sensor and the robot controller are each connected to the computer by a data cable.
The teach pendant is connected to the robot controller, and the robot is switched to manual control mode on the pendant. Control signals are passed through the pendant to the robot controller and then to the robot servo drivers to move the robot arm to the initial position of the motion. The robot's initial pose is recorded, and a program that returns the robot to the initial pose is entered on the pendant (in the pendant's native programming language). To synchronize the hand and robot motion, the reference frame of the robot motion is set on the pendant to the world coordinate system.
Step 2: since the robot's motion and work involve translation of the robot arm end-effector and rotation about its axes, the translational velocity and rotational angular velocity of the operator's hand during motion are acquired, as follows:
(a) The position and attitude data of the operator's hand are acquired by the hand pose tracking sensor, and the acquired hand data are transferred to the computer. The hand position and attitude are differentiated to obtain the three-degree-of-freedom translational velocity and the three-degree-of-freedom rotational angular velocity.
(b) The differentiated three-degree-of-freedom translational velocity and rotational angular velocity are filtered by a high-pass filtering method: when the motion velocity is below the threshold, the computer transmits a velocity value of 0 to the robot and the robot remains stationary, forming six filtered data sequences. Because the operator's hand inevitably exhibits slight jitter, the raw data are clearly unusable for the robot. The high-pass filtering removes motion data below the set value, eliminating the influence of the slight jitter of the operator's hand and yielding smooth robot motion.
Through repeated tests, the three-degree-of-freedom translational velocity threshold is preferably set to 2 mm/s and the three-degree-of-freedom rotational angular velocity threshold to 2 rad/s.
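The thresholding in step 2(b) can be sketched in code. The snippet below is an illustrative reconstruction, not part of the patent: it applies the 2 mm/s and 2 rad/s thresholds component-wise to a six-element velocity vector, and the function and parameter names are assumptions.

```python
def deadband_filter(velocities, linear_threshold=2.0, angular_threshold=2.0):
    """Zero out velocity components below the jitter thresholds.

    `velocities` is a 6-vector [vx, vy, vz, wx, wy, wz], with the linear
    components in mm/s and the angular components in rad/s.  Components
    whose magnitude falls below the threshold are treated as hand tremor
    and replaced by 0, so the robot holds still.  (The per-component
    policy is an assumption; the patent states only the 2 mm/s and
    2 rad/s thresholds.)
    """
    filtered = []
    for i, v in enumerate(velocities):
        threshold = linear_threshold if i < 3 else angular_threshold
        filtered.append(v if abs(v) >= threshold else 0.0)
    return filtered
```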
(c) data processing:
In the first step, it is first checked whether the data of the current frame in each filtered data sequence changes by more than 20% relative to the previous frame. If not, the current frame is usable; if so, the current data are discarded and the previous frame's data are reused as the usable data for the current frame.
In the second step, the usable translational velocity and rotational angular velocity of the current frame are averaged with those of the four preceding frames, and the result is output as the final usable palm data. This data processing eliminates some of the measurement errors of the hand pose tracker that arise when the operator's hand moves too fast.
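The two-stage data processing of step 2(c) (a 20% jump-rejection check followed by a five-frame average) can be sketched as follows for a single data channel. The class name and the handling of a zero previous value are assumptions not specified in the patent.

```python
from collections import deque

class ChannelSmoother:
    """Per-channel smoothing as in step 2(c): reject a frame whose value
    jumps more than 20 % from the previous accepted frame, then average
    the current frame with up to four previous accepted frames.
    Illustrative sketch; names are not from the patent."""

    def __init__(self, max_change=0.20, window=5):
        self.max_change = max_change
        self.history = deque(maxlen=window)  # last accepted values

    def update(self, value):
        if self.history:
            prev = self.history[-1]
            # Relative change w.r.t. the previous frame; guard div-by-zero.
            if prev != 0 and abs(value - prev) / abs(prev) > self.max_change:
                value = prev  # discard the jump, reuse the previous frame
        self.history.append(value)
        return sum(self.history) / len(self.history)
```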
(d) Coordinate transformation: using the mapping between the hand pose and the robot end-effector, the usable palm data in the palm coordinate system are transformed by coordinate transformation to obtain the velocity and angular velocity of the end-effector motion in the robot base coordinate system.
As shown in Fig. 2, the palm coordinate system is X2Y2Z2, the hand pose tracking sensor coordinate system is X1Y1Z1, and the robot base coordinate system is X0Y0Z0.
The transformation matrix mapping the palm to the reference frame of the end-effector motion is composed of the following quantities:
Here r_XX denotes the cosine of the angle between the palm X2 axis and the sensor X1 axis; r_XY the cosine of the angle between the palm X2 axis and the sensor Y1 axis; r_XZ the cosine of the angle between the palm X2 axis and the sensor Z1 axis; r_YX, r_YY, and r_YZ the cosines of the angles between the palm Y2 axis and the sensor X1, Y1, and Z1 axes, respectively; and r_ZX, r_ZY, and r_ZZ the cosines of the angles between the palm Z2 axis and the sensor X1, Y1, and Z1 axes, respectively. p_X1, p_Y1, p_Z1 denote the coordinates of the palm center in the sensor coordinate system, and p_X0, p_Y0, p_Z0 the coordinates of the sensor coordinate system origin in the reference frame of the end-effector motion.
The derivation of the above matrix proceeds in three steps:
In the first step, the transformation matrix mapping the palm coordinate system to the sensor coordinate system is formed;
In the second step, the transformation matrix mapping the sensor coordinate system to the reference frame of the end-effector motion is formed;
In the third step, their product gives the transformation matrix mapping the palm to the reference frame of the end-effector motion.
In this way the palm pose is mapped into the robot's motion reference frame, so that the robot end-effector moves entirely according to the palm action of the operator.
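The matrix formulas in this derivation appeared as images in the original document and are not reproduced here. Under the stated definitions of the r and p quantities, and the axis alignment from step 1 (sensor X1, Y1, Z1 parallel to robot Y0, Z0, X0), a plausible reconstruction, offered as an assumption rather than the patent's exact formulas, is:

```latex
% Step 1: palm frame -> sensor frame (indices as defined in the text):
T^{1}_{2} =
\begin{bmatrix}
r_{XX} & r_{XY} & r_{XZ} & p_{X1} \\
r_{YX} & r_{YY} & r_{YZ} & p_{Y1} \\
r_{ZX} & r_{ZY} & r_{ZZ} & p_{Z1} \\
0 & 0 & 0 & 1
\end{bmatrix}
% Step 2: sensor frame -> end-effector motion reference frame, using the
% axis permutation X_1 \parallel Y_0,\ Y_1 \parallel Z_0,\ Z_1 \parallel X_0:
T^{0}_{1} =
\begin{bmatrix}
0 & 0 & 1 & p_{X0} \\
1 & 0 & 0 & p_{Y0} \\
0 & 1 & 0 & p_{Z0} \\
0 & 0 & 0 & 1
\end{bmatrix}
% Step 3: composite palm -> robot mapping:
T^{0}_{2} = T^{0}_{1}\, T^{1}_{2}
```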
Step 3: data transmission:
(a) Since the range of motion of the human hand is much smaller than the working range of the robot, a proportionality coefficient must be determined that reasonably maps the palm velocity and angular velocity one-to-one onto the velocity and angular velocity of the end-effector motion. In one embodiment of the present invention the coefficient is set to k = 5, i.e., a 10 mm hand motion produces 50 mm of robot motion. The palm velocity and angular velocity are amplified, and the amplified velocity and angular velocity data are stored in the computer as the final application data.
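The proportional mapping of step 3(a) is a simple element-wise scaling. A minimal sketch follows, with k = 5 as in the embodiment; the function name is an assumption.

```python
def scale_hand_to_robot(hand_velocity, k=5.0):
    """Amplify palm velocities into robot end-effector velocities by the
    proportionality coefficient k (k = 5 in the patent's embodiment:
    a 10 mm hand motion maps to 50 mm of robot motion)."""
    return [k * v for v in hand_velocity]
```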
(b) Data communication: the amplified velocity and angular velocity data in the computer are transmitted to the robot controller using TCP/IP, realizing communication between the computer and the robot controller. The six hand motion values are sent to the robot controller.
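The TCP/IP transmission of step 3(b) might look like the following sketch. The wire format (six big-endian float64 values) and the controller address are assumptions; the patent specifies only that TCP/IP is used between the computer and the robot controller.

```python
import socket
import struct

def send_velocity_frame(sock, velocities):
    """Pack the six velocity / angular-velocity values as big-endian
    doubles and send them over an established TCP connection.  The wire
    format is an assumption made for illustration."""
    if len(velocities) != 6:
        raise ValueError("expected six values: vx, vy, vz, wx, wy, wz")
    payload = struct.pack(">6d", *velocities)
    sock.sendall(payload)

# Connecting to the robot controller (host and port are hypothetical):
# sock = socket.create_connection(("192.168.0.10", 30002))
```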
Step 4: robot control:
(a) The robot controller receives the amplified gesture motion data and forwards it to the robot servo controller to drive the robot, realizing synchronized human-robot motion.
(b) Robot reset: after the motion ends, the robot controller's reset program is run so that the robot quickly returns to the preset initial position.
Step 5: motion reproduction: the stored motion velocity values of step 3 are retrieved from the computer and transmitted to the robot controller again, and steps 3 and 4 are repeated to realize the repeated motion of the robot.
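The reproduction of step 5 amounts to re-sending the stored velocity frames to the controller. A minimal sketch, under the assumption that frames are stored as six-element lists and that `send` is any transmit callable; pacing at the original sampling period is omitted.

```python
def replay_trajectory(stored_frames, send):
    """Replay the velocity frames stored during teaching by re-sending
    each one to the robot controller through the supplied `send`
    callable.  Names are illustrative, not from the patent; a real
    system would also pace the frames at the original sampling rate."""
    sent = 0
    for frame in stored_frames:
        send(frame)
        sent += 1
    return sent
```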
The above procedure completes a full robot teaching process based on hand pose tracking.
The end-effector motion realized by the present invention can be decomposed into six motions: translation along the X0, Y0, and Z0 axes of the robot motion reference frame, and rotation about the X0, Y0, and Z0 axes. These six motions essentially cover the robot's main modes of motion, and their combinations can reach any position in the working range (excluding singularities).
Although a preferred embodiment of the present invention has been described above with reference to the accompanying drawings, the invention is not limited to the specific embodiment described, which is merely illustrative and not restrictive. Under the inspiration of the present invention, those of ordinary skill in the art may devise many further forms without departing from the scope protected by the purpose of the invention and the claims, and all of these fall within the protection of the present invention.

Claims (3)

1. A collaborative robot teaching method based on hand pose tracking, characterized by comprising the following steps:
Step 1: teaching preparation:
placing the hand pose tracking sensor horizontally on the test bench so that the X1, Y1, and Z1 axes of the sensor coordinate system are parallel to the Y0, Z0, and X0 axes of the robot base coordinate system, respectively;
setting the speed limit of the robot motion and starting the hand pose tracking sensor 1, the computer, and the robot controller, the hand pose tracking sensor and the robot controller each being connected to the computer by a data cable;
connecting the teach pendant to the robot controller, switching the robot to manual control mode on the pendant, passing control signals through the pendant to the robot controller and then to the robot servo drivers so as to move the robot arm to the initial position of the motion, recording the robot's initial pose and entering on the pendant a program that returns the robot to the initial pose, and setting the reference frame of the robot motion on the pendant to the world coordinate system;
Step 2: acquiring the translational velocity and rotational angular velocity of the operator's hand during motion, specifically:
(a) acquiring the position and attitude data of the operator's hand with the hand pose tracking sensor, transferring the acquired hand data to the computer, and differentiating the hand position and attitude to obtain the three-degree-of-freedom translational velocity and the three-degree-of-freedom rotational angular velocity;
(b) filtering the differentiated three-degree-of-freedom translational velocity and rotational angular velocity by a high-pass filtering method, wherein when the motion velocity is below the threshold the computer transmits a velocity value of 0 to the robot and the robot remains stationary, thereby forming six filtered data sequences;
(c) data processing:
in the first step, checking whether the data of the current frame in each filtered data sequence changes by more than 20% relative to the previous frame, the current frame being usable if not, and the current data being discarded and the previous frame's data reused as the usable data for the current frame if so;
in the second step, averaging the usable translational velocity and rotational angular velocity of the current frame with the translational velocity and rotational angular velocity of the four preceding frames, the result being output as the final usable palm data;
(d) coordinate transformation: using the mapping between the hand pose and the robot end-effector, transforming the usable palm data from the palm coordinate system by coordinate transformation to obtain the velocity and angular velocity of the end-effector motion in the robot base coordinate system;
Step 3: data transmission:
(a) determining a proportionality coefficient that reasonably maps the palm velocity and angular velocity one-to-one onto the velocity and angular velocity of the end-effector motion, amplifying the palm velocity and angular velocity, and storing the amplified velocity and angular velocity data in the computer as the final application data;
(b) data communication: transmitting the amplified velocity and angular velocity data from the computer to the robot controller over TCP/IP, realizing communication between the computer and the robot controller;
Step 4: robot control:
(a) the robot controller receiving the amplified gesture motion data and forwarding it to the robot servo controller to drive the robot, realizing synchronized human-robot motion;
(b) robot reset: after the motion ends, running the robot controller's reset program so that the robot quickly returns to the preset initial position;
Step 5: motion reproduction: retrieving the stored motion velocity values of step 3 from the computer, transmitting them to the robot controller again, and repeating steps 3 and 4 to realize the repeated motion of the robot.
2. The collaborative robot teaching method based on hand pose tracking according to claim 1, characterized in that the three-degree-of-freedom translational velocity threshold is set to 2 mm/s and the three-degree-of-freedom rotational angular velocity threshold is set to 2 rad/s.
3. The collaborative robot teaching method based on hand pose tracking according to claim 1 or 2, characterized in that said coefficient is 5.
CN201811232885.2A 2018-10-22 2018-10-22 Collaborative robot teaching method based on hand pose tracking Pending CN109483517A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811232885.2A CN109483517A (en) 2018-10-22 2018-10-22 Collaborative robot teaching method based on hand pose tracking


Publications (1)

Publication Number Publication Date
CN109483517A true CN109483517A (en) 2019-03-19

Family

ID=65692327

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811232885.2A Pending CN109483517A (en) Collaborative robot teaching method based on hand pose tracking

Country Status (1)

Country Link
CN (1) CN109483517A (en)


Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105058396A (en) * 2015-07-31 2015-11-18 深圳先进技术研究院 Robot teaching system and control method thereof
CN105500370A (en) * 2015-12-21 2016-04-20 华中科技大学 Robot offline teaching programming system and method based on somatosensory technology
US9592608B1 (en) * 2014-12-15 2017-03-14 X Development Llc Methods and systems for providing feedback during teach mode
CN206326605U (en) * 2016-12-19 2017-07-14 广州大学 A kind of intelligent teaching system based on machine vision
CN107160364A (en) * 2017-06-07 2017-09-15 华南理工大学 A kind of industrial robot teaching system and method based on machine vision
CN107914273A (en) * 2017-11-08 2018-04-17 浙江工业大学 Mechanical arm teaching system based on gesture control
CN108274448A (en) * 2018-01-31 2018-07-13 佛山智能装备技术研究院 A kind of the robot teaching method and teaching system of human body interaction
CN108453707A (en) * 2018-04-12 2018-08-28 珞石(山东)智能科技有限公司 Robot drags teaching orbit generation method


Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111515149A (en) * 2020-04-26 2020-08-11 广东弓叶科技有限公司 Man-machine cooperation sorting system and robot grabbing position obtaining method thereof
CN111515149B (en) * 2020-04-26 2020-12-29 广东弓叶科技有限公司 Man-machine cooperation sorting system and robot grabbing position obtaining method thereof

Similar Documents

Publication Publication Date Title
CN107943283B (en) Mechanical arm pose control system based on gesture recognition
CN107160364B (en) Industrial robot teaching system and method based on machine vision
CN108638069B (en) Method for controlling accurate motion of tail end of mechanical arm
CN104440864B (en) A kind of master-slave mode remote operating industrial robot system and its control method
WO2023056670A1 (en) Mechanical arm autonomous mobile grabbing method under complex illumination conditions based on visual-tactile fusion
Kofman et al. Teleoperation of a robot manipulator using a vision-based human-robot interface
CN110170995B (en) Robot rapid teaching method based on stereoscopic vision
CN107662195A (en) A kind of mechanical hand principal and subordinate isomery remote operating control system and control method with telepresenc
CN106826838A (en) A kind of interactive biomimetic manipulator control method based on Kinect space or depth perception sensors
CN110815189B (en) Robot rapid teaching system and method based on mixed reality
CN111055281A (en) ROS-based autonomous mobile grabbing system and method
CN109079794B (en) Robot control and teaching method based on human body posture following
CN106003036A (en) Object grabbing and placing system based on binocular vision guidance
Liang et al. An augmented discrete-time approach for human-robot collaboration
JP7067816B1 (en) Robot teaching system and method based on image segmentation and surface EMG
JP2019188477A (en) Robot motion teaching device, robot system, and robot control device
Dwivedi et al. Combining electromyography and fiducial marker based tracking for intuitive telemanipulation with a robot arm hand system
CN108582031A (en) A kind of hot line robot branch based on force feedback master & slave control connects gage lap method
CN113103230A (en) Human-computer interaction system and method based on remote operation of treatment robot
CN110170996A (en) A kind of quick teaching system of robot based on stereoscopic vision
Liang et al. Robot teleoperation system based on mixed reality
CN109483517A (en) A kind of cooperation robot teaching method based on the tracking of hand appearance
CN110142769A (en) The online mechanical arm teaching system of ROS platform based on human body attitude identification
Grasshoff et al. 7dof hand and arm tracking for teleoperation of anthropomorphic robots
CN116175582A (en) Intelligent mechanical arm control system and control method based on machine vision

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20190319