CN107696036B - Dragging demonstrator for humanoid mechanical arm - Google Patents

Dragging demonstrator for humanoid mechanical arm Download PDF

Info

Publication number
CN107696036B
CN107696036B CN201710719972.XA CN201710719972A CN107696036B CN 107696036 B CN107696036 B CN 107696036B CN 201710719972 A CN201710719972 A CN 201710719972A CN 107696036 B CN107696036 B CN 107696036B
Authority
CN
China
Prior art keywords
dragging
mechanical arm
robot
end effector
humanoid
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201710719972.XA
Other languages
Chinese (zh)
Other versions
CN107696036A (en
Inventor
李通通
杨涛
王燕波
张科
刘嘉宇
邹河彬
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Research Institute of Precise Mechatronic Controls
Original Assignee
Beijing Research Institute of Precise Mechatronic Controls
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Research Institute of Precise Mechatronic Controls filed Critical Beijing Research Institute of Precise Mechatronic Controls
Priority to CN201710719972.XA priority Critical patent/CN107696036B/en
Publication of CN107696036A publication Critical patent/CN107696036A/en
Application granted granted Critical
Publication of CN107696036B publication Critical patent/CN107696036B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Images

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/0081Programme-controlled manipulators with master teach-in means

Abstract

A dragging demonstrator for a humanoid mechanical arm comprises a six-dimensional force sensor, a power supply module and a demonstration controller, wherein the demonstration controller comprises physical keys, an IO (input/output) module, a DSP (digital signal processor) chip and a CAN (controller area network) bus controller; the six-dimensional force sensor is used for measuring spatial six-dimensional force applied to the humanoid mechanical arm end effector; the physical key is used for sending a dragging signal to the IO module; the DSP chip receives the measurement data of the six-dimensional force sensor, simultaneously acquires a dragging signal of the IO module, calculates the position and posture variation and then sends the position and posture variation to the CAN bus controller; and the CAN bus controller sends the position and posture variation to a control system of the robot. The invention realizes the teaching of the position dragging and the posture dragging of the humanoid mechanical arm of the robot by using a dragging teaching method.

Description

Dragging demonstrator for humanoid mechanical arm
Technical Field
The invention belongs to the field of robots, and particularly relates to a dragging demonstrator for a humanoid mechanical arm.
Background
The light-weight seven-degree-of-freedom humanoid mechanical arm of the robot has the advantages of high precision, good safety, good man-machine interaction and the like, and is mainly used for man-machine cooperative work occasions. Generally, a mechanical arm needs to be manually dragged to a working position, and the position is recorded, so that dragging teaching is realized. However, in the prior art, dragging control of the position and the posture of the tail end is not distinguished, and for the use condition that the position and the posture of the tail end need to be distinguished, the situation that the robot arm drags to teach an expected adjusting position and the posture of a humanoid robot arm is adjusted, or the robot arm drags to teach the expected adjusting posture and the position of the humanoid robot arm is adjusted frequently occurs, so that the use process is limited, and the requirement of specified operation under a complex condition is difficult to meet.
Disclosure of Invention
The technical problem to be solved by the invention is as follows: the device overcomes the defects of the prior art, provides a dragging demonstrator for the humanoid mechanical arm, and realizes demonstration by dragging a demonstration method, wherein the position dragging signal and the posture dragging signal are controlled, the position and posture variation of the humanoid mechanical arm end effector is calculated by combining the six-dimensional force sensor measurement data, and the position and posture variation is sent to a robot control system in real time.
The purpose of the invention is realized by the following technical scheme:
a dragging demonstrator for a humanoid mechanical arm comprises a six-dimensional force sensor, a power supply module and a demonstration controller, wherein the demonstration controller comprises physical keys, an IO (input/output) module, a DSP (digital signal processor) chip and a CAN (controller area network) bus controller; the dragging demonstrator is connected with a humanoid mechanical arm end effector of the robot;
the six-dimensional force sensor is used for measuring a spatial six-dimensional force applied to the humanoid mechanical arm end effector and then sending measurement data to the DSP chip;
the power supply module is powered by a control system of the robot and is used for supplying power to the six-dimensional force sensor, the IO module, the DSP chip and the CAN bus controller;
the physical key comprises a plurality of buttons and is used for sending dragging signals to the IO module, wherein the dragging signals comprise position dragging signals and posture dragging signals;
the IO module receives a dragging signal sent by a physical key;
the DSP chip receives the measurement data of the six-dimensional force sensor, simultaneously collects the dragging signal of the IO module, calculates the position and posture variation, and then sends the position and posture variation to the CAN bus controller;
and the CAN bus controller receives the position and posture variation sent by the DSP chip and sends the position and posture variation to a control system of the robot.
In the dragging demonstrator for the humanoid mechanical arm, the number of the physical keys is 2, wherein one key sends a position dragging signal, and the other key sends an attitude dragging signal; the dragging signal is an IO signal.
In the dragging demonstrator for the humanoid mechanical arm, the specific method for the DSP chip to calculate the position and posture variation is as follows:
the initial spatial position and attitude P of the humanoid mechanical arm end effector of the robot are as follows:
P=(x,y,z,α,β,γ)
wherein x, y and z are respectively the space position coordinates of the humanoid manipulator end effector under the robot tool coordinate system, and α, β and gamma are respectively the space attitude coordinates of the humanoid manipulator end effector under the robot tool coordinate system;
step two, the six-dimensional force sensor measures the spatial six-dimensional force F applied to the humanoid mechanical arm end effector as follows:
F=(Fx,Fy,Fz,Tx,Ty,Tz)
wherein, Fx、Fy、FzRespectively the forces applied to the x, y and z directions of the humanoid mechanical arm end effector under the robot tool coordinate system, Tx、Ty、TzRespectively applying the moments to the x, y and z directions of the end effector of the humanoid mechanical arm under a robot tool coordinate system;
step three, establishing a position and attitude change quantity transformation matrix T as follows:
when the dragging signal of the IO module is a position dragging signal, the value of a is 1, otherwise, the value of a is 0; when the dragging signal of the IO module is an attitude dragging signal, the value of b is 1, otherwise, the value of b is 0;
step four, after the humanoid mechanical arm end effector of the robot is dragged, the position and posture variation delta P of the humanoid mechanical arm end effector is as follows:
Figure BDA0001384740090000032
Figure BDA0001384740090000033
wherein k is an elastic coefficient, e is a natural index, t is the acting time of the space six-dimensional force F, tau is a response coefficient, and epsilon is a viscosity coefficient.
The dragging demonstrator for the humanoid mechanical arm is used for realizing demonstration by adopting a dragging demonstration method, and the specific demonstration method comprises the following steps:
firstly, dragging a position or dragging an attitude corresponding to a certain button of a physical key, pressing the button, and sending a position dragging signal or an attitude dragging signal to an IO module by the physical key;
step two, manually dragging the humanoid mechanical arm end effector of the robot, wherein a six-dimensional force sensor can measure a spatial six-dimensional force applied to the humanoid mechanical arm end effector and then sends measurement data to a DSP chip;
thirdly, the DSP chip receives the measurement data of the six-dimensional force sensor in the second step, simultaneously collects the position dragging signal or the attitude dragging signal of the IO module in the first step, then calculates the position and attitude variation, and sends the position and attitude variation to the CAN bus controller in real time;
after receiving the position and posture variation of the step three, the CAN bus controller sends the position and posture variation to a control system of the robot in real time;
step five, the control system of the robot receives the position and attitude variation in the step four, and calculates the displacement or rotation angle of each joint of the robot according to inverse kinematics;
and step six, the control system of the robot sends the displacement or rotation angle of each joint to a joint servo controller of the robot to execute, and the dragging cycle is completed.
In the above dragging demonstrator for the humanoid robot arm, in the third step of the teaching method, the specific calculation method of the position and posture variation amount is as follows:
the initial spatial position and attitude P of the humanoid mechanical arm end effector of the robot are as follows:
P=(x,y,z,α,β,γ)
wherein x, y and z are respectively the space position coordinates of the humanoid manipulator end effector under the robot tool coordinate system, and α, β and gamma are respectively the space attitude coordinates of the humanoid manipulator end effector under the robot tool coordinate system;
step two, the six-dimensional force sensor measures the spatial six-dimensional force F applied to the humanoid mechanical arm end effector as follows:
F=(Fx,Fy,Fz,Tx,Ty,Tz)
wherein, Fx、Fy、FzRespectively the forces applied to the x, y and z directions of the humanoid mechanical arm end effector under the robot tool coordinate system, Tx、Ty、TzRespectively applying the moments to the x, y and z directions of the end effector of the humanoid mechanical arm under a robot tool coordinate system;
step three, establishing a position and attitude change quantity transformation matrix T as follows:
when the dragging signal of the IO module is position dragging, the value of a is 1, otherwise the value of a is 0; when the dragging signal of the IO module is an attitude dragging signal, the value of b is 1, otherwise, the value of b is 0;
step four, after the humanoid mechanical arm end effector of the robot is dragged, the position and posture variation delta P of the humanoid mechanical arm end effector is as follows:
Figure BDA0001384740090000051
Figure BDA0001384740090000052
wherein k is an elastic coefficient, e is a natural index, t is the acting time of the space six-dimensional force F, tau is a response coefficient, and epsilon is a viscosity coefficient.
Compared with the prior art, the invention has the following beneficial effects:
(1) the invention adopts position dragging signal and gesture dragging signal to control, thus realizing the teaching of the position and gesture of the tail end of the humanoid mechanical arm, and leading the tail end of the humanoid mechanical arm to realize independent position dragging and independent gesture dragging;
(2) the teaching controller completes the processing of signals of the six-dimensional force sensor and the IO module, and reduces the hardware dependence on a robot control system;
(3) the invention realizes the teaching of the tail end of the humanoid mechanical arm by adopting a dragging teaching method, the teaching method is simple to operate, and the teaching process is easy;
(4) according to the invention, the viscosity coefficient and the elastic coefficient in the position and posture variation model are adjusted, so that the response effect during dragging can be adjusted; when the response coefficient is small, the dragging is very easy, and the device is suitable for dragging the mechanical arm in a large range; when the response coefficient is large, the device is suitable for fine dragging of the mechanical arm.
Drawings
FIG. 1 is a schematic three-dimensional structure of a drag demonstrator according to the present invention;
FIG. 2 is a cross-sectional view of a drag teach pendant of the present invention;
FIG. 3 is a schematic view of the present invention in connection with a humanoid robotic arm end effector of a robot;
FIG. 4 is a schematic diagram of a power supply path and a signal path of the dragging demonstrator of the present invention;
FIG. 5 is a flow chart of a teaching method.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, embodiments of the present invention will be described in detail with reference to the accompanying drawings.
Fig. 1 to fig. 3 are schematic views of a dragging demonstrator assembly and a local connection between the dragging demonstrator and a robot arm end effector of a robot, wherein the dragging demonstrator comprises a six-dimensional force sensor 1, a power module 2 and a teaching controller 3, and the teaching controller 3 comprises a physical button 4, an IO module, a DSP chip and a CAN bus controller.
FIG. 4 is a schematic diagram of a power supply path and a signal path of a dragging demonstrator according to the invention, wherein a six-dimensional force sensor 1 is used for measuring a spatial six-dimensional force applied to an end effector of a humanoid mechanical arm of a robot and then sending measurement data to a DSP chip; the power module 2 is used for providing a 24V direct current power supply by a control system of the robot and supplying power to the six-dimensional force sensor 1, the IO module, the DSP chip and the CAN bus controller; in this embodiment, the physical key includes 2 buttons, which are a button a and a button B, and both output IO signals, and are used to send a dragging signal to an IO module, and when the button a is pressed, output a position dragging signal; when the button B is pressed, outputting an attitude dragging signal; the IO module receives a dragging signal sent by a physical key; the DSP chip receives the measurement data of the six-dimensional force sensor 2, simultaneously collects the dragging signal of the IO module, then calculates the position and posture variation of the end effector of the humanoid mechanical arm of the robot, and sends the position and posture variation to the CAN bus controller; and the CAN bus controller receives the position and posture variation sent by the DSP chip and then sends the position and posture variation to a control system of the robot.
Fig. 5 shows a flow chart of a teaching method, the dragging demonstrator adopts a dragging teaching method to realize teaching, and the specific teaching method is as follows:
the method comprises the following steps that firstly, a dragging demonstrator is installed on a humanoid mechanical arm end effector of a robot, then a tool coordinate system is set in a robot control system, and the physical size of the dragging demonstrator is contained in the tool coordinate system;
step two, dragging the button A of the physical key in the embodiment at a position corresponding to the button A, dragging the button B at a posture corresponding to the button B, pressing the button A, sending a position dragging signal to the IO module by the physical key, pressing the button B, and sending a posture dragging signal to the IO module by the physical key;
step three, manually dragging the end effector of the humanoid mechanical arm, wherein the six-dimensional force sensor 1 can measure the spatial six-dimensional force applied to the end effector of the humanoid mechanical arm, and then sends the measured data to the DSP chip;
step four, the DSP chip receives the measurement data of the six-dimensional force sensor 1 in step three, and simultaneously acquires the position dragging signal or the attitude dragging signal of the IO module in step two, then calculates the position and attitude variation, and sends the position and attitude variation to the CAN bus controller in real time, and the specific calculation method of the position and attitude variation is as follows:
(4a) the initial spatial position and the initial attitude of the humanoid mechanical arm end effector of the robot are P, and P is represented by the position coordinates and the attitude coordinates:
P=(x,y,z,α,β,γ)
wherein x, y and z are respectively the space position coordinates of the humanoid manipulator end effector under the robot tool coordinate system, and α, β and gamma are respectively the space attitude coordinates of the humanoid manipulator end effector under the robot tool coordinate system;
(4b) the six-dimensional force sensor measures a spatial six-dimensional force applied to the humanoid-robot-arm end effector as F, which is collectively represented by a position force and a spatial attitude force:
F=(Fx,Fy,Fz,Tx,Ty,Tz)
wherein, Fx、Fy、FzRespectively the forces applied to the x, y and z directions of the humanoid mechanical arm end effector under the robot tool coordinate system, Tx、Ty、TzRespectively applying the moments to the x, y and z directions of the end effector of the humanoid mechanical arm under a robot tool coordinate system;
(4c) establishing a position and attitude change quantity transformation matrix T as follows:
Figure BDA0001384740090000071
wherein, when the button A is pressed, the value of a is 1, and when the button A is not pressed, the value of a is 0; when the button B is pressed, the value of B is 1, and when the button B is not pressed, the value of B is 0; the button A and the button B are independent, and the two buttons can be pressed simultaneously or not pressed simultaneously, or one button can be pressed independently.
(4d) After the humanoid mechanical arm end effector of the robot is dragged, the position and posture variation delta P of the humanoid mechanical arm end effector is as follows:
Figure BDA0001384740090000072
Figure BDA0001384740090000081
in the embodiment, the value range of k is 200-1000, and the value range of epsilon is 1000-20000, wherein the value of k and epsilon needs to ensure that the value of tau is not less than 5, Δ x, Δ y and Δ z are respectively the variation of the position coordinate of the humanoid-mechanical-arm end effector relative to the position coordinates x, y and z of P after the humanoid-mechanical-arm end effector is dragged under a robot tool coordinate system, and Δ α, Δ β and Δ γ are respectively the variation of the attitude coordinate of the humanoid-mechanical-arm end effector relative to the attitude coordinates α, β and γ after the humanoid-mechanical-arm end effector is dragged under the robot tool coordinate system.
By adjusting the viscosity coefficient and the elastic coefficient in the position and posture variation model, the response effect during dragging can be adjusted; when the response coefficient is small, the dragging is very easy, and the device is suitable for dragging the mechanical arm in a large range; when the response coefficient is large, the device is suitable for fine dragging of the mechanical arm.
After receiving the position and posture variation of the humanoid mechanical arm end effector in the fourth step, the CAN bus controller sends the position and posture variation to a control system of the robot in real time;
step six, the control system of the robot receives the position and posture variation in the step five, and the displacement or rotation angle of each joint of the robot is calculated according to inverse kinematics;
step seven, the control system of the robot sends the displacement or rotation angle of each joint to a joint servo controller of the robot to execute, and the dragging cycle is completed;
and step eight, taking down the dragging demonstrator, resetting a tool coordinate system in the robot control system, and normally working without the physical size of the dragging demonstrator.
Those skilled in the art will appreciate that those matters not described in detail in the present specification are well known in the art.

Claims (4)

1. The utility model provides a drive demonstrator that imitative people arm was used which characterized in that: the teaching device comprises a six-dimensional force sensor, a power supply module and a teaching controller, wherein the teaching controller comprises a physical key, an IO module, a DSP chip and a CAN bus controller; the dragging demonstrator is connected with a humanoid mechanical arm end effector of the robot;
the six-dimensional force sensor is used for measuring a spatial six-dimensional force applied to the humanoid mechanical arm end effector and then sending measurement data to the DSP chip;
the power supply module is powered by a control system of the robot and is used for supplying power to the six-dimensional force sensor, the IO module, the DSP chip and the CAN bus controller;
the physical key comprises a plurality of buttons and is used for sending dragging signals to the IO module, wherein the dragging signals comprise position dragging signals and posture dragging signals;
the IO module receives a dragging signal sent by a physical key;
the DSP chip receives the measurement data of the six-dimensional force sensor, simultaneously collects the dragging signal of the IO module, calculates the position and posture variation, and then sends the position and posture variation to the CAN bus controller;
the CAN bus controller receives the position and posture variation sent by the DSP chip and sends the position and posture variation to a control system of the robot;
the specific method for the DSP chip to calculate the position and posture variation comprises the following steps:
the initial spatial position and attitude P of the humanoid mechanical arm end effector of the robot are as follows:
P=(x,y,z,α,β,γ)
wherein x, y and z are respectively the space position coordinates of the humanoid manipulator end effector under the robot tool coordinate system, and α, β and gamma are respectively the space attitude coordinates of the humanoid manipulator end effector under the robot tool coordinate system;
step two, the six-dimensional force sensor measures the spatial six-dimensional force F applied to the humanoid mechanical arm end effector as follows:
F=(Fx,Fy,Fz,Tx,Ty,Tz)
wherein, Fx、Fy、FzRespectively the forces applied to the x, y and z directions of the humanoid mechanical arm end effector under the robot tool coordinate system, Tx、Ty、TzRespectively applying the moments to the x, y and z directions of the end effector of the humanoid mechanical arm under a robot tool coordinate system;
step three, establishing a position and attitude change quantity transformation matrix T as follows:
Figure FDA0002223886820000021
when the dragging signal of the IO module is a position dragging signal, the value of a is 1, otherwise, the value of a is 0; when the dragging signal of the IO module is an attitude dragging signal, the value of b is 1, otherwise, the value of b is 0;
step four, after the humanoid mechanical arm end effector of the robot is dragged, the position and posture variation delta P of the humanoid mechanical arm end effector is as follows:
Figure FDA0002223886820000023
wherein k is an elastic coefficient, e is a natural index, t is the acting time of the space six-dimensional force F, tau is a response coefficient, and epsilon is a viscosity coefficient.
2. The dragging demonstrator for the humanoid mechanical arm as claimed in claim 1, wherein: the number of the physical keys is 2, wherein one key sends a position dragging signal, and the other key sends an attitude dragging signal; the dragging signals are IO signals.
3. The dragging demonstrator for the humanoid mechanical arm as claimed in claim 1, wherein: the dragging demonstrator adopts a dragging demonstration method to realize demonstration, and the specific demonstration method is as follows:
firstly, dragging a position or dragging an attitude corresponding to a certain button of a physical key, pressing the button, and sending a position dragging signal or an attitude dragging signal to an IO module by the physical key;
step two, manually dragging the humanoid mechanical arm end effector of the robot, wherein a six-dimensional force sensor can measure a spatial six-dimensional force applied to the humanoid mechanical arm end effector and then sends measurement data to a DSP chip;
thirdly, the DSP chip receives the measurement data of the six-dimensional force sensor in the second step, simultaneously collects the position dragging signal or the attitude dragging signal of the IO module in the first step, then calculates the position and attitude variation, and sends the position and attitude variation to the CAN bus controller in real time;
after receiving the position and posture variation of the step three, the CAN bus controller sends the position and posture variation to a control system of the robot in real time;
step five, the control system of the robot receives the position and attitude variation in the step four, and calculates the displacement or rotation angle of each joint of the robot according to inverse kinematics;
and step six, the control system of the robot sends the displacement or the rotation angle of each joint to a joint servo controller of the robot to execute, and the dragging cycle is completed.
4. The dragging demonstrator for the humanoid mechanical arm as claimed in claim 3, wherein: in the third step of the teaching method, a specific calculation method of the position and posture variation is as follows:
the initial spatial position and attitude P of the humanoid mechanical arm end effector of the robot are as follows:
P=(x,y,z,α,β,γ)
wherein x, y and z are respectively the space position coordinates of the humanoid manipulator end effector under the robot tool coordinate system, and α, β and gamma are respectively the space attitude coordinates of the humanoid manipulator end effector under the robot tool coordinate system;
step two, the six-dimensional force sensor measures the spatial six-dimensional force F applied to the humanoid mechanical arm end effector as follows:
F=(Fx,Fy,Fz,Tx,Ty,Tz)
wherein, Fx、Fy、FzRespectively applied to the tail end of the humanoid mechanical arm under the coordinate system of the robot toolForce in x, y, z direction of the vehicle, Tx、Ty、TzRespectively applying the moments to the x, y and z directions of the end effector of the humanoid mechanical arm under a robot tool coordinate system;
step three, establishing a position and attitude change quantity transformation matrix T as follows:
Figure FDA0002223886820000041
when the dragging signal of the IO module is a position dragging signal, the value of a is 1, otherwise, the value of a is 0; when the dragging signal of the IO module is an attitude dragging signal, the value of b is 1, otherwise, the value of b is 0;
step four, after the humanoid mechanical arm end effector of the robot is dragged, the position and posture variation delta P of the humanoid mechanical arm end effector is as follows:
wherein k is an elastic coefficient, e is a natural index, t is the acting time of the space six-dimensional force F, tau is a response coefficient, and epsilon is a viscosity coefficient.
CN201710719972.XA 2017-08-21 2017-08-21 Dragging demonstrator for humanoid mechanical arm Active CN107696036B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710719972.XA CN107696036B (en) 2017-08-21 2017-08-21 Dragging demonstrator for humanoid mechanical arm

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201710719972.XA CN107696036B (en) 2017-08-21 2017-08-21 Dragging demonstrator for humanoid mechanical arm

Publications (2)

Publication Number Publication Date
CN107696036A CN107696036A (en) 2018-02-16
CN107696036B true CN107696036B (en) 2020-02-14

Family

ID=61171079

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710719972.XA Active CN107696036B (en) 2017-08-21 2017-08-21 Dragging demonstrator for humanoid mechanical arm

Country Status (1)

Country Link
CN (1) CN107696036B (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP7263724B2 (en) * 2018-09-27 2023-04-25 株式会社デンソーウェーブ Robot control method
CN109822565A (en) * 2019-01-15 2019-05-31 北京镁伽机器人科技有限公司 Robot control method, system and storage medium
CN112847366B (en) * 2021-01-07 2023-07-25 溱者(上海)智能科技有限公司 Force-position hybrid teaching robot system and teaching method
CN114161479B (en) * 2021-12-24 2023-10-20 上海机器人产业技术研究院有限公司 Robot dragging teaching performance test system and test method

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106826877A (en) * 2017-02-23 2017-06-13 上海大学 A kind of easy Manipulation of the machine people teaching machine
CN106826769A (en) * 2017-03-15 2017-06-13 福州大学 A kind of quick teaching apparatus of industrial robot and its implementation

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106826877A (en) * 2017-02-23 2017-06-13 上海大学 A kind of easy Manipulation of the machine people teaching machine
CN106826769A (en) * 2017-03-15 2017-06-13 福州大学 A kind of quick teaching apparatus of industrial robot and its implementation

Also Published As

Publication number Publication date
CN107696036A (en) 2018-02-16

Similar Documents

Publication Publication Date Title
CN107696036B (en) Dragging demonstrator for humanoid mechanical arm
Li et al. Reinforcement learning of manipulation and grasping using dynamical movement primitives for a humanoidlike mobile manipulator
CN108656112A (en) A kind of mechanical arm zero-force control experimental system towards direct teaching
CN106647529B (en) A kind of intelligent teaching system towards the accurate tracing control in six-shaft industrial robot track
CN105328700A (en) Data glove for teaching programming of robot dexterous hand
CN107414834A (en) A kind of multirobot cooperative system Static stiffness real-time performance evaluation method
CN107272447A (en) A kind of emulation mode, simulator and robot emulation system
Liang et al. An augmented discrete-time approach for human-robot collaboration
Pelliccia et al. Implementation of tactile sensors on a 3-Fingers Robotiq® adaptive gripper and visualization in VR using Arduino controller
CN111515928B (en) Mechanical arm motion control system
CN203306138U (en) Quasi-man biped robot based on hydraulic system
Heisnam et al. 20 DOF robotic hand for tele-operation:—Design, simulation, control and accuracy test with leap motion
CN203380892U (en) Manipulator control system based on FPGA platform
CN204976610U (en) Tracer controller of robot
Li et al. A real-time explicit mapping and teleoperation control method for humanoid robots with posture constraints
Wang et al. Traching control of a redundant manipulator with the assistance of tactile sensing
Han et al. Reconfigurable wireless control system for a dual-arm cooperative robotic system
Karam et al. Smartly Control, Interface and Tracking for Pick and Place Robot Based on Multi Sensors and Vision Detection
Zheng et al. A mapping method of grasping posture applying to the metamorphic multi-fingered hand
Lee et al. Simulation and control of a robotic arm using MATLAB, simulink and TwinCAT
Tarao et al. Motion simulation using a high-speed parallel link mechanism
CN212794984U (en) Robot remote teaching control device
CN204471381U (en) A kind of solution magic square device carrying out three-dimensional artificial
CN212601833U (en) Multi-connection and interconnection control system of industrial-grade mechanical arm
CN210757841U (en) Six-axis industrial robot and coordinate forming system thereof

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant