CN110355750B - Interaction control method for hand-eye coordination of teleoperation - Google Patents
- Publication number
- CN110355750B (application CN201811270020.5A / CN201811270020A)
- Authority
- CN
- China
- Prior art keywords
- coordinate system
- camera
- motion
- delta
- hand
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J3/00—Manipulators of master-slave type, i.e. both controlling unit and controlled unit perform corresponding spatial movements
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1602—Programme controls characterised by the control system, structure, architecture
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1694—Programme controls characterised by use of sensors other than normal servo-feedback from position, speed or acceleration sensors, perception control, multi-sensor controlled systems, sensor fusion
- B25J9/1697—Vision controlled systems
Landscapes
- Engineering & Computer Science (AREA)
- Robotics (AREA)
- Mechanical Engineering (AREA)
- Automation & Control Theory (AREA)
- Manipulator (AREA)
Abstract
The invention discloses an interaction control method for teleoperation hand-eye coordination, which solves the technical problem that existing human-machine interaction control methods have poor practicability. The technical scheme is based on coordinate-system conversion: when an operator controls the motion of a scene object through an interaction device, the pose of the scene object is first converted from the world coordinate system into the camera coordinate system; the motion increment of the interaction device is then added in the camera coordinate system according to the desired motion to obtain new pose coordinates in the camera coordinate system; finally, the new pose coordinates are converted from the camera coordinate system back into the world coordinate system to control the actual motion of the scene object. The invention achieves hand-eye coordination in teleoperation: the motion of the scene object during interactive operation is unaffected by changes in the operator's viewing angle and remains consistent with the motion of the interaction device, which reduces the difficulty of teleoperation and gives the method good practicability.
Description
Technical Field
The invention relates to a human-machine interaction control method, and in particular to an interaction control method for teleoperation hand-eye coordination.
Background
The document "Master-slave control strategy of a teleoperation robot system" (Journal of Jiangsu University of Science and Technology, Natural Science Edition, 2013, Vol. 27(8), pp. 643-647) discloses a control method for a master-slave teleoperation robot system. The method adopts incremental position control, driving the slave hand by increments of the master hand, which effectively avoids the complexity of an initial return to the origin. Because the master-slave position correspondence differs depending on whether the operator observes the operated environment directly or through an imaging device, the method matches the master-hand coordinates to the actual environment or to the imaging device by adjusting a proportional control-gain matrix, and establishes a master-slave workspace mapping with good visual telepresence. However, the method only distinguishes between direct observation and observation through an imaging device, and establishes the workspace mapping merely through a proportional gain coefficient. It provides no way to establish the mapping between the master hand and the actual environment or imaging device when the operator's viewing angle on the operated environment changes, so the master-slave workspace mapping is not kept consistent under viewing-angle changes; its range of application is therefore narrow and its operation difficulty is high.
Disclosure of Invention
To overcome the poor practicability of existing human-machine interaction control methods, the invention provides an interaction control method for teleoperation hand-eye coordination. The method is based on coordinate-system conversion: when an operator controls the motion of a scene object through an interaction device, the pose of the scene object is first converted from the world coordinate system into the camera coordinate system; the motion increment of the interaction device is then added in the camera coordinate system according to the desired motion to obtain new pose coordinates in the camera coordinate system; finally, the new pose coordinates are converted from the camera coordinate system back into the world coordinate system to control the actual motion of the scene object. The invention achieves hand-eye coordination in teleoperation: the motion of the scene object during interactive operation is unaffected by changes in the operator's viewing angle and remains consistent with the motion of the interaction device, reducing the difficulty of teleoperation and giving the method good practicability.
The technical scheme adopted by the invention to solve this technical problem is an interaction control method for teleoperation hand-eye coordination, comprising the following steps.
Step one: interaction-device data acquisition. During the motion of the hand controller, acquire its real-time position information P_c^n at equal time intervals; subtract the previous-instant position information P_c^{n-1} from the acquired real-time position P_c^n to obtain the advance amount ΔP of the hand controller; and map ΔP to the initial motion command ΔP_0 for the robot-arm end through the operation scale factor k:
ΔP_0 = k · ΔP (1)
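Step one can be sketched as follows (a minimal NumPy illustration; the sampling loop and the hand-controller API are omitted, and the function name is our own):

```python
import numpy as np

def initial_motion_command(p_curr, p_prev, k=1000.0):
    """Difference two consecutively sampled hand-controller positions and
    scale by the operation scale factor k (Eq. 1): ΔP_0 = k * ΔP."""
    dp = np.asarray(p_curr, dtype=float) - np.asarray(p_prev, dtype=float)
    return k * dp

# Controller end moved up by 0.1 m between samples; with k = 1000
# (metres to millimetres) this yields the command [0, 0, 100].
dp0 = initial_motion_command([0.0, 0.0, 0.1], [0.0, 0.0, 0.0])
```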
Step two: using the transformation matrix R_x from the interactive-operation coordinate system to the camera coordinate system, convert the initial motion command ΔP_0 to obtain the motion command ΔP_1 in the camera coordinate system:
ΔP_1 = ΔP_0 · R_x (2)
Step three: using the pose matrix C of the camera coordinate system in the world coordinate system, convert the motion command ΔP_1 in the camera coordinate system to obtain the motion command ΔP_2 in the world coordinate system.
During the interaction, the view observed by the operator is determined by the position and attitude of the camera in the world coordinate system. The camera pose matrix depends on three elements: the camera's line-of-sight direction, the position of the camera center, and the camera's forward direction; these three elements determine the position and attitude of the camera in the world coordinate system. Changing the viewing angle is, in essence, changing the position and attitude of the camera in the world coordinate system, i.e. changing the orientation of the camera coordinate system within the world coordinate system.
To convert the motion command obtained in step two from the camera coordinate system to the world coordinate system, the inverse C^-1 of the camera pose matrix is required. In addition, since the hand-controller advance-amount matrix is 1×3, a 1×4 matrix D must be constructed:
D = [ΔP_1 0] (4)
ΔP_2 = D · C^-1 (5)
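A minimal sketch of steps two and three combined, using the row-vector convention of Eqs. (2), (4) and (5); the function name is our own:

```python
import numpy as np

def to_world_command(dp0, Rx, C):
    """Rotate the initial command into the camera frame (Eq. 2), pad it to
    1x4 (Eq. 4), and map it into the world frame with the inverse of the
    camera pose matrix C (Eq. 5)."""
    dp1 = dp0 @ Rx                # ΔP_1 = ΔP_0 * R_x, camera-frame command
    D = np.append(dp1, 0.0)       # D = [ΔP_1 0]; w = 0 since this is a direction
    return D @ np.linalg.inv(C)   # ΔP_2 = D * C^-1, world-frame command

# With identity Rx and C the command passes through unchanged:
dp2 = to_world_command(np.array([0.0, 0.0, 100.0]), np.eye(3), np.eye(4))
```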
Step four: map the motion command ΔP_2 in the world coordinate system to the final motion amount ΔP_3 of the robot-arm end through motion mapping, and generate the teleoperation command for the whole interaction process.
Expanding the world-coordinate-system motion command ΔP_2 obtained in step three, ΔP_2 is a 1×4 matrix whose first three columns are the final motion amount of the robot-arm end; the motion mapping takes the first three columns of ΔP_2 to generate ΔP_3, simplified as:
ΔP_3 = [Δx Δy Δz] (6)
Here Δx, Δy and Δz are the motion amounts of the robot-arm end along the three coordinate axes of the world coordinate system, realizing hand-eye-coordinated teleoperation command generation.
Step five: drive the robot arm to move with the teleoperation command generated in step four, realizing hand-eye coordination.
Denote the real-time position of the robot-arm end as P_j^n and its position at the next instant as P_j^{n+1}; then P_j^{n+1} = P_j^n + ΔP_3.
Driving the scene object to move through its real-time end position realizes hand-eye coordination in the teleoperation process.
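Step five's position update can be sketched as follows (the starting position is hypothetical; the update rule P_j^{n+1} = P_j^n + ΔP_3 is as implied by the text):

```python
import numpy as np

def next_end_position(p_curr, dp3):
    """Integrate the mapped world-frame command into the arm-end position:
    P_j^{n+1} = P_j^n + ΔP_3."""
    return np.asarray(p_curr, dtype=float) + np.asarray(dp3, dtype=float)

# Hypothetical current end position (mm) plus the worked command [0, 0, 100]:
p_next = next_end_position([300.0, 0.0, 400.0], [0.0, 0.0, 100.0])
```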
The beneficial effects of the invention are as follows. The method is based on coordinate-system conversion: when an operator controls the motion of a scene object through an interaction device, the pose of the scene object is first converted from the world coordinate system into the camera coordinate system; the motion increment of the interaction device is then added in the camera coordinate system according to the desired motion to obtain new pose coordinates in the camera coordinate system; finally, the new pose coordinates are converted from the camera coordinate system back into the world coordinate system to control the actual motion of the scene object. The invention achieves hand-eye coordination in teleoperation: the motion of the scene object during interactive operation is unaffected by changes in the operator's viewing angle and remains consistent with the motion of the interaction device, reducing the difficulty of teleoperation and giving the method good practicability.
The present invention will be described in detail with reference to the following embodiments.
Detailed Description
For a detailed description of the present invention, the definitions of three commonly used coordinate systems are given first:
(1) World coordinate system O_world: the reference coordinate system of the whole system, used to describe the actual motion of object models in the scene. It is equivalent to a base coordinate system; the motion of scene objects is described relative to it.
(2) Camera coordinate system O_camera: used to describe the teleoperation visual scene. When the operator's viewing angle changes, the pose of the camera coordinate system in the world coordinate system changes, but its orientation relative to the computer screen does not.
(3) Interactive-operation coordinate system (i.e. hand-controller coordinate system) O_interaction: the coordinate system of the interaction device, used to describe the motion of the interaction device during teleoperation. When the viewing angle changes, the pose of the interaction device in the world coordinate system changes, but its orientation relative to the computer screen does not.
To verify the effectiveness of the proposed technique for teleoperation hand-eye coordination, the invention combines the three-dimensional graphics development environment OSG (OpenSceneGraph) with a Novint Falcon hand controller as the interaction tool, and performs simulation demonstration and verification based on controlling the motion of an IRB120 robot arm in a virtual view. The specific implementation is as follows.
Step one: interaction-device data acquisition, taking upward motion of the hand controller as an example. During the motion of the hand controller, acquire its real-time position and attitude information P_c^n at equal time intervals; subtract the previous-instant information P_c^{n-1} to obtain the advance amount ΔP; and map ΔP to the initial motion command ΔP_0 for the robot-arm end through the operation scale factor k. The hand-controller end motion is measured in metres while the robot-arm end motion in the virtual scene is measured in millimetres, so k = 1000. When the hand-controller end moves upward by 0.1 m, the initial motion command is
ΔP_0 = k · ΔP = [0 0 100] (1)
Step two: using the transformation matrix R_x from the interactive-operation coordinate system to the camera coordinate system, convert the initial motion command ΔP_0 to obtain the motion command ΔP_1 in the camera coordinate system.
Analysis of the positional relationship between the interaction-device coordinate system and the camera coordinate system shows that when the viewing angle changes, the relative orientation of the two coordinate systems in the world coordinate system does not change; rotating the interaction-device coordinate system by 90° about its X axis brings its orientation into coincidence with that of the camera coordinate system.
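Under the patent's row-vector convention (ΔP_1 = ΔP_0 · R_x), the 90° rotation about X described above can be written as below; the sign convention is our assumption, chosen so that the worked values of Eq. (4), ΔP_0 = [0 0 100] → ΔP_1 = [0 100 0], hold:

```python
import numpy as np

theta = np.deg2rad(90.0)
# Standard rotation about the X axis; applied to row vectors as v @ Rx.
Rx = np.array([
    [1.0, 0.0,            0.0],
    [0.0, np.cos(theta), -np.sin(theta)],
    [0.0, np.sin(theta),  np.cos(theta)],
])

# Maps the upward controller command into the camera frame:
dp1 = np.array([0.0, 0.0, 100.0]) @ Rx
```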
Step three: using the pose matrix C of the camera coordinate system in the world coordinate system, convert the motion command ΔP_1 in the camera coordinate system to obtain the motion command ΔP_2 in the world coordinate system.
When the scene is viewed from the front, the pose matrix of the camera in the world coordinate system is C.
Since the hand-controller advance-amount matrix is 1×3, a 1×4 matrix D must be constructed:
D = [ΔP_1 0] = [0 100 0 0] (4)
ΔP_2 = D · C^-1 = [0 0 100 0] (5)
Step four: map the world-coordinate-system motion command ΔP_2 to the final motion amount ΔP_3 of the robot-arm end through motion mapping, and generate the teleoperation command for the whole interaction process.
ΔP_3 = [Δx Δy Δz] = [0 0 100] (6)
Here Δx, Δy and Δz are the motion amounts of the robot-arm end along the three coordinate axes of the world coordinate system, realizing hand-eye-coordinated teleoperation command generation.
Step five: drive the robot arm to move with the teleoperation command generated in step four, realizing hand-eye coordination. Denote the real-time position of the robot-arm end as P_j^n and its position at the next instant as P_j^{n+1}; then P_j^{n+1} = P_j^n + ΔP_3.
In this embodiment, based on interactive control of the IRB120 robot arm, step five is completed as follows: a three-dimensional model of the IRB120 arm is built with tools such as 3D MAX and the virtual view is established; a kinematic model of the arm is then established on the basis of the D-H coordinate system.
First, forward kinematic analysis is performed on these coordinate systems. Every two adjacent coordinate systems can be converted into each other through four homogeneous transformations, which combine into a single adjacent-frame transformation matrix T_i.
Here a_i, α_i, d_i and θ_i are the link length, link twist angle, link offset and joint angle, respectively.
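The adjacent-frame transform itself is not reproduced in this text; as a sketch, the standard D-H homogeneous transform built from these four parameters is shown below (assuming the classic Rot(z, θ)·Trans(z, d)·Trans(x, a)·Rot(x, α) convention, which the patent does not spell out):

```python
import numpy as np

def dh_transform(a, alpha, d, theta):
    """Homogeneous transform between adjacent D-H frames:
    Rot(z, theta) @ Trans(z, d) @ Trans(x, a) @ Rot(x, alpha)."""
    ct, st = np.cos(theta), np.sin(theta)
    ca, sa = np.cos(alpha), np.sin(alpha)
    return np.array([
        [ct, -st * ca,  st * sa, a * ct],
        [st,  ct * ca, -ct * sa, a * st],
        [0.0,      sa,       ca,      d],
        [0.0,     0.0,      0.0,    1.0],
    ])

def forward_kinematics(dh_params):
    """End pose T as the product of the adjacent-frame transforms."""
    T = np.eye(4)
    for a, alpha, d, theta in dh_params:
        T = T @ dh_transform(a, alpha, d, theta)
    return T
```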
For the six-degree-of-freedom robot arm, once every joint angle is determined, the end pose of the arm is determined; denoting the end pose by T, it is the product of the six adjacent-frame transformation matrices, T = T_1 T_2 T_3 T_4 T_5 T_6.
Second, inverse kinematic analysis is performed by an analytical method, obtaining each joint angle from the end pose.
From the above formula θ_1 can be obtained, and the remaining joint angles are computed in turn by the analytical method.
Motion mapping through the above hand-eye-coordination interaction control method shows that when the hand controller moves upward, the robot arm moves along the Z axis if the virtual view is a front view, and along the Y axis if the virtual view is a top view. Therefore, when the operator's viewing angle changes, the hand-eye-coordination implementation presented here keeps the motion of the scene object consistent with the motion of the interaction device, which verifies the effectiveness of the proposed method.
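The front-view/top-view behaviour described above can be checked numerically with the pipeline of Eqs. (1)-(6). The two camera poses below are hypothetical rotation-only poses (given directly as C^-1), constructed so that the camera's screen-up direction is world +Z in the front view and world +Y in the top view:

```python
import numpy as np

Rx = np.array([[1.0, 0.0,  0.0],
               [0.0, 0.0, -1.0],
               [0.0, 1.0,  0.0]])           # 90 deg about X (row-vector form)
dp0 = np.array([0.0, 0.0, 100.0])           # upward controller motion, k = 1000
D = np.append(dp0 @ Rx, 0.0)                # D = [ΔP_1 0] = [0, 100, 0, 0]

# Hypothetical inverse camera poses (C^-1), chosen for illustration.
C_front_inv = np.eye(4)
C_front_inv[:3, :3] = [[1.0,  0.0, 0.0],    # camera X  -> world X
                       [0.0,  0.0, 1.0],    # camera up -> world Z (front view)
                       [0.0, -1.0, 0.0]]
C_top_inv = np.eye(4)                       # camera up -> world Y (top view)

front = (D @ C_front_inv)[:3]  # arm moves along world Z
top = (D @ C_top_inv)[:3]      # arm moves along world Y
```

The same upward hand motion thus yields a world-Z command in the front view and a world-Y command in the top view, matching the verification claim.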
Claims (1)
1. An interaction control method for teleoperation hand-eye coordination, characterized by comprising the following steps:
step one, interaction-device data acquisition: during the motion of the hand controller, acquiring its real-time position information P_c^n at equal time intervals; subtracting the previous-instant position information P_c^{n-1} from the acquired real-time position P_c^n to obtain the advance amount ΔP of the hand controller; and mapping ΔP to the initial motion command ΔP_0 for the robot-arm end through an operation scale factor k;
ΔP_0 = k · ΔP (1)
step two, using the transformation matrix R_x from the interactive-operation coordinate system to the camera coordinate system, converting the initial motion command ΔP_0 to obtain the motion command ΔP_1 in the camera coordinate system;
ΔP_1 = ΔP_0 · R_x (2)
step three, using the pose matrix C of the camera coordinate system in the world coordinate system, converting the motion command ΔP_1 in the camera coordinate system to obtain the motion command ΔP_2 in the world coordinate system;
during the interaction, the view observed by the operator is determined by the position and attitude of the camera in the world coordinate system, and the camera pose matrix depends on three elements, namely the camera's line-of-sight direction, the position of the camera center and the camera's forward direction, which together determine the position and attitude of the camera in the world coordinate system; changing the viewing angle is, in essence, changing the position and attitude of the camera in the world coordinate system, i.e. changing the orientation of the camera coordinate system in the world coordinate system;
converting the motion command obtained in step two from the camera coordinate system to the world coordinate system requires the inverse C^-1 of the camera pose matrix; in addition, since the hand-controller advance-amount matrix is 1×3, a 1×4 matrix D is constructed;
D = [ΔP_1 0] (4)
ΔP_2 = D · C^-1 (5)
step four, mapping the motion command ΔP_2 in the world coordinate system to the final motion amount ΔP_3 of the robot-arm end through motion mapping, and generating the teleoperation command of the whole interaction process;
expanding the world-coordinate-system motion command ΔP_2 obtained in step three, ΔP_2 is a 1×4 matrix whose first three columns are the final motion amount of the robot-arm end; the motion mapping takes the first three columns of ΔP_2 to generate ΔP_3, simplified as:
ΔP_3 = [Δx Δy Δz] (6)
the system comprises a robot arm, a camera module and a display module, wherein delta x, delta y and delta z respectively represent the motion amount of the tail end of the robot arm in three coordinate axis directions in a world coordinate system so as to realize teleoperation instruction generation of hand-eye coordination;
step five, driving the robot arm to move with the teleoperation command generated in step four, realizing hand-eye coordination;
the real-time position of the robot-arm end is denoted P_j^n and its position at the next instant P_j^{n+1}; then P_j^{n+1} = P_j^n + ΔP_3;
and the scene object is driven to move through its real-time end position, realizing hand-eye coordination in the teleoperation process.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201811270020.5A CN110355750B (en) | 2018-10-29 | 2018-10-29 | Interaction control method for hand-eye coordination of teleoperation |
Publications (2)
Publication Number | Publication Date |
---|---|
CN110355750A CN110355750A (en) | 2019-10-22 |
CN110355750B (en) | 2022-05-10
Family
ID=68214781
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201811270020.5A Active CN110355750B (en) | 2018-10-29 | 2018-10-29 | Interaction control method for hand-eye coordination of teleoperation |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN110355750B (en) |
Families Citing this family (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN111640189B (en) * | 2020-05-15 | 2022-10-14 | 西北工业大学 | Teleoperation enhanced display method based on artificial mark points |
CN111590537B (en) * | 2020-05-23 | 2023-01-24 | 西北工业大学 | Teleoperation interactive operation method based on force position feedback |
WO2022002159A1 (en) * | 2020-07-01 | 2022-01-06 | 北京术锐技术有限公司 | Master-slave motion control method, robot system, device, and storage medium |
CN115570558B (en) * | 2022-10-28 | 2023-07-11 | 武汉恒新动力科技有限公司 | Somatosensory collaborative teleoperation system and method for controlled object cluster |
CN115639910B (en) * | 2022-10-28 | 2023-08-15 | 武汉恒新动力科技有限公司 | Omnidirectional somatosensory interaction method and equipment for operation space of controlled object |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN103991077A (en) * | 2014-02-19 | 2014-08-20 | 吉林大学 | Robot hand controller shared control method based on force fusion |
CN104950885A (en) * | 2015-06-10 | 2015-09-30 | 东南大学 | UAV (unmanned aerial vehicle) fleet bilateral remote control system and method thereof based on vision and force sense feedback |
CN106444861A (en) * | 2016-11-21 | 2017-02-22 | 清华大学深圳研究生院 | Space robot teleoperation system based on three-dimensional gestures |
CN107662195A (en) * | 2017-09-22 | 2018-02-06 | 中国东方电气集团有限公司 | A kind of mechanical hand principal and subordinate isomery remote operating control system and control method with telepresenc |
CN107748496A (en) * | 2017-09-25 | 2018-03-02 | 北京邮电大学 | Impedance controller algorithm based on parameter adaptive regulation |
Family Cites Families (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5631973A (en) * | 1994-05-05 | 1997-05-20 | Sri International | Method for telemanipulation with telepresence |
- 2018-10-29: CN application CN201811270020.5A filed; patent CN110355750B (en), status Active
Non-Patent Citations (1)
Title |
---|
Design of a teleoperation control system based on a KUKA industrial robot and research on heterogeneous master-slave control methods; Tang Qing, et al.; Journal of Sichuan University (Engineering Science Edition); 2016-01-15; Vol. 48, No. 1; pp. 180-185 *
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN110355750B (en) | Interaction control method for hand-eye coordination of teleoperation | |
CN103302668B (en) | Based on control system and the method thereof of the Space teleoperation robot of Kinect | |
CN108214445B (en) | ROS-based master-slave heterogeneous teleoperation control system | |
US10751877B2 (en) | Industrial robot training using mixed reality | |
Ostanin et al. | Interactive robot programing using mixed reality | |
US9984178B2 (en) | Robot simulator, robot teaching apparatus and robot teaching method | |
CN103870665B (en) | Space manipulator aids in docking operation three dimension dynamic simulation method | |
CN104460670A (en) | SCARA robot motion simulation and remote control system and method | |
CN110385694A (en) | Action teaching device, robot system and the robot controller of robot | |
Kim | Virtual reality calibration and preview/predictive displays for telerobotics | |
CN111590567B (en) | Space manipulator teleoperation planning method based on Omega handle | |
EP3578321A1 (en) | Method for use with a machine for generating an augmented reality display environment | |
CN112958974A (en) | Interactive automatic welding system based on three-dimensional vision | |
Frank et al. | Towards teleoperation-based interactive learning of robot kinematics using a mobile augmented reality interface on a tablet | |
CN111113414A (en) | Robot three-dimensional space scale prompting method and system based on screen identification | |
Rastogi et al. | Telerobotic control with stereoscopic augmented reality | |
CN113211430A (en) | Man-machine cooperative mechanical arm planning method and system | |
JPH11338532A (en) | Teaching device | |
Su et al. | Effective manipulation for industrial robot manipulators based on tablet PC | |
Huang et al. | Telepresence augmentation for visual and haptic guided immersive teleoperation of industrial manipulator | |
Schwandt et al. | Robot manipulator programming interface based on augmened reality | |
Chen et al. | Augmented reality tracking registration and process visualization method for large spacecraft cable assembly | |
Yang et al. | A web-based 3d virtual robot remote control system | |
CN113836745B (en) | Simulation system and method for intelligent inspection device | |
CN116968016B (en) | Construction method of hydraulic arm tail end speed feasible space and visual interaction system |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||