CN112571416A - Coordinate system calibration method suitable for robot system and motion capture system - Google Patents

Coordinate system calibration method suitable for robot system and motion capture system

Info

Publication number
CN112571416A
Authority
CN
China
Prior art keywords
coordinate system
rigid body
robot
motion capture
virtual
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202011457099.XA
Other languages
Chinese (zh)
Other versions
CN112571416B (en)
Inventor
贺京杰
邓双城
徐枫
刘强
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Institute of Petrochemical Technology
Original Assignee
Beijing Institute of Petrochemical Technology
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Institute of Petrochemical Technology filed Critical Beijing Institute of Petrochemical Technology
Priority to CN202011457099.XA priority Critical patent/CN112571416B/en
Publication of CN112571416A publication Critical patent/CN112571416A/en
Application granted granted Critical
Publication of CN112571416B publication Critical patent/CN112571416B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls
    • B25J9/1679Programme controls characterised by the tasks executed
    • B25J9/1692Calibration of manipulator

Landscapes

  • Engineering & Computer Science (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Manipulator (AREA)

Abstract

The invention discloses a coordinate system calibration method suitable for a robot system and a motion capture system. The method first initializes, in the motion capture system, the pose of any rigid body in the robot system that carries marker balls, with no restriction on that pose; obtains the expressions of the positive direction vectors of the X, Y and Z axes of the virtual rigid body coordinate system in the world coordinate system; obtains the expression Q1 of the attitude of the robot base coordinate system in the world coordinate system of the motion capture system; directly reads the attitude Q2 of the virtual controller rigid body through the motion capture system; then sets the attitude Q3 of the robot end effector coordinate system; and keeps the robot end effector coordinate system consistent with the initial attitude of the virtual controller rigid body, or sets a determined pose transformation relation according to the chain rule. The method accurately maps the attitudes of the robot end effector and of the virtual controller rigid body in the motion capture system, so that the two calibrated coordinate systems contain no attitude error and calibration accuracy is improved.

Description

Coordinate system calibration method suitable for robot system and motion capture system
Technical Field
The invention relates to the technical field of robot control, in particular to a coordinate system calibration method suitable for a robot system and a motion capture system.
Background
Automatic motion control of cameras is one of the key technologies for future film production, and several commercial camera motion control systems, such as BOLT, are already available. In recent years motion capture has developed into an emerging technology that uses optical, inertial and other sensors to capture characteristic signals of a moving object and track the target's position and attitude in real time. In a motion capture system the tracked objects are marker points, each of which has a trackable position coordinate. If the tracked object is a rigid body, it is represented by 4 marker points that do not lie in the same plane and is called a virtual rigid body; when this rigid body stands in for a camera in the virtual space, it is called a virtual controller rigid body. By industry convention the lens direction of the virtual camera is the Z-axis direction, the top surface of the virtual camera faces the Y-axis direction, and the attitude of the robot end effector coordinate system should be consistent with the attitude of the virtual camera. The motion capture system uses the world coordinate system defined by its calibration tool as the reference coordinate system, while the robot system places its base coordinate system near the base. When a virtual rigid body in the motion capture system is used to control the motion of the robot arm end effector, the attitude and trajectory of the robot end effector must remain consistent with the virtual rigid body no matter how the robot is placed, so the coordinate systems of the two systems need to be calibrated.
The prior-art calibration scheme sets the X, Y and Z directions of the motion capture system world coordinate system and of the robot base coordinate system to be consistent, attaches a marker ball recognizable by the optical capture system to the robot table top, measures the relative pose between the marker ball and the robot base, and imports that relationship into the ROS system. The mathematical expression of this calibration operation is the formula
T_world_UR = T_world_desk · T_desk_UR
That is, the pose of the robot table top relative to the world coordinate system of the optical motion capture system is measured, then the pose of the robot base relative to the robot table top is measured, and the pose of the robot base (UR) relative to the world coordinate system of the optical motion capture system is obtained through the chain (transfer) rule. In this scheme the robot base coordinate system is usually not located at the center of the robot base, so the pose of the robot relative to the table-top marker point cannot be determined accurately; moreover, the robot body is heavy, making fine adjustment of its posture difficult in practice, and every installation of the system introduces random errors. During the installation stage the world coordinate system of the optical capture system, the coordinate system of the robot table-top marker ball and the base coordinate system of the robot arm must all be kept consistent in attitude, which makes the robot inconvenient to install and use.
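For illustration, a minimal sketch of this prior-art chain rule in Python, assuming both poses have been measured as 4x4 homogeneous transformation matrices (the variable names are placeholders, not part of the patent):

```python
import numpy as np

def base_pose_in_mocap_world(T_world_desk: np.ndarray, T_desk_UR: np.ndarray) -> np.ndarray:
    """Compose the measured poses: T_world_UR = T_world_desk · T_desk_UR."""
    return T_world_desk @ T_desk_UR
```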
Disclosure of Invention
The invention aims to provide a coordinate system calibration method suitable for a robot system and a motion capture system, which accurately maps the attitudes of the robot end effector and of the virtual controller rigid body in the motion capture system, so that the two calibrated coordinate systems contain no attitude error and calibration accuracy is improved.
The purpose of the invention is realized by the following technical scheme:
a coordinate system calibration method suitable for use in robotic systems and motion capture systems, the method comprising:
step 1, firstly, for any rigid body in the robot system that carries marker balls, initializing its pose in the motion capture system, with no restriction on the pose;
step 2, fixedly mounting the calibrated virtual rigid body on the robot end effector in any posture and at any position;
step 3, obtaining the origin position of the virtual rigid body coordinate system with the motion capture system, and obtaining the expressions of the positive direction vectors of the X, Y and Z axes of the virtual rigid body coordinate system in the world coordinate system of the motion capture system;
step 4, obtaining, from the expressions obtained in step 3, the expression Q1 of the attitude of the robot base coordinate system in the world coordinate system of the motion capture system;
step 5, taking the virtual rigid body down for use as the virtual controller rigid body, and directly reading the attitude Q2 of the virtual controller rigid body through the motion capture system;
step 6, setting the attitude Q3 of the robot end effector coordinate system according to the actual application, i.e. the pose of the robot tool coordinate system relative to the robot base coordinate system;
step 7, keeping the robot end effector coordinate system consistent with the initial attitude of the virtual controller rigid body, or setting a determined pose transformation relation according to the chain rule, to complete the coordinate system calibration.
According to the technical scheme provided by the invention, the rigid body postures of the robot end effector and the virtual controller in the motion capture system can be accurately mapped, so that the two calibrated coordinate systems do not contain posture errors, and the calibration accuracy is improved.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present invention, the drawings needed to be used in the description of the embodiments are briefly introduced below, and it is obvious that the drawings in the following description are only some embodiments of the present invention, and it is obvious for those skilled in the art to obtain other drawings based on the drawings without creative efforts.
Fig. 1 is a schematic flowchart of a coordinate system calibration method suitable for a robot system and a motion capture system according to an embodiment of the present invention;
fig. 2 is a schematic diagram of coordinate system calibration according to an embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention are clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments of the present invention without making any creative effort, shall fall within the protection scope of the present invention.
The embodiments of the present invention will be described in further detail with reference to the accompanying drawings, and fig. 1 is a schematic flow chart of a coordinate system calibration method suitable for a robot system and a motion capture system according to the embodiments of the present invention, where the method includes:
step 1, firstly, for any rigid body in the robot system that carries marker balls, initializing its pose in the motion capture system, with no restriction on the pose;
step 2, fixedly mounting the calibrated virtual rigid body on the robot end effector in any posture and at any position;
step 3, obtaining the origin position of the virtual rigid body coordinate system with the motion capture system, and obtaining the expressions of the positive direction vectors of the X, Y and Z axes of the virtual rigid body coordinate system in the world coordinate system of the motion capture system;
In this step, the process of obtaining the expressions of the positive direction vectors of the X, Y and Z axes of the virtual rigid body coordinate system in the world coordinate system of the motion capture system is specifically as follows:
moving the robot an arbitrary distance (more than 20 cm is recommended) along the X axis of the base coordinate system, and recording the origin coordinates of the virtual rigid body coordinate system in the initial state and the final state as P1 and P2 respectively;
then subtracting the initial-state origin coordinate from the final-state origin coordinate to obtain the expression of the positive X-axis direction vector of the virtual rigid body coordinate system in the world coordinate system of the motion capture system:
Vx = P2 - P1
The expressions of the Y-axis and Z-axis direction vectors of the virtual rigid body coordinate system in the world coordinate system of the motion capture system are obtained in the same way and denoted Vy and Vz.
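A minimal sketch of this measurement in Python, assuming the motion capture system can report the virtual rigid body's origin as a 3-vector in its world frame; read_rigid_body_origin and move_robot_along_base_axis are hypothetical placeholders for the mocap and robot interfaces, not actual APIs:

```python
import numpy as np

def base_axis_in_mocap_world(read_rigid_body_origin, move_robot_along_base_axis,
                             axis: str, distance_m: float = 0.3) -> np.ndarray:
    """Measure one base-axis direction in the mocap world frame, e.g. Vx = P2 - P1."""
    p1 = np.asarray(read_rigid_body_origin())      # origin of the virtual rigid body, initial state
    move_robot_along_base_axis(axis, distance_m)   # translate along +X, +Y or +Z of the base frame
    p2 = np.asarray(read_rigid_body_origin())      # origin of the virtual rigid body, final state
    return p2 - p1                                 # direction vector (normalized later, inside f)

# Vx = base_axis_in_mocap_world(read_origin, move_robot, "x")
# Vy = base_axis_in_mocap_world(read_origin, move_robot, "y")
# Vz = base_axis_in_mocap_world(read_origin, move_robot, "z")
```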
Step 4, obtaining, from the expressions obtained in step 3, the expression Q1 of the attitude of the robot base coordinate system in the world coordinate system of the motion capture system;
In this step, Q1 is expressed as:
Q1 = f(Vx, Vy, Vz)
where the function f solves for the attitude quaternion from the unit vectors of the coordinate axes.
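A minimal sketch of one possible f, assuming Vx, Vy, Vz are the measured base-axis directions in the mocap world frame: stacking their unit vectors as columns gives the rotation matrix of the robot base frame in that world frame, from which the quaternion is extracted. The SVD re-orthonormalization is an added assumption to absorb measurement noise, not something stated in the patent.

```python
import numpy as np
from scipy.spatial.transform import Rotation

def f(vx: np.ndarray, vy: np.ndarray, vz: np.ndarray) -> Rotation:
    """Attitude of the robot base frame in the mocap world frame, from axis directions."""
    axes = [v / np.linalg.norm(v) for v in (vx, vy, vz)]   # unit vectors of the base axes
    r = np.column_stack(axes)                              # columns = base axes in world coordinates
    u, _, vt = np.linalg.svd(r)                            # project to the nearest rotation matrix
    return Rotation.from_matrix(u @ vt)                    # Q1; .as_quat() gives (x, y, z, w)

# Q1 = f(Vx, Vy, Vz)
```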
Step 5, taking the virtual rigid body down for use as the virtual controller rigid body, and directly reading the attitude Q2 of the virtual controller rigid body through the motion capture system;
Step 6, setting the attitude Q3 of the robot end effector coordinate system according to the actual application, i.e. the pose of the robot tool coordinate system relative to the robot base coordinate system;
In a specific implementation, for example, the virtual controller rigid body requires that the positive Z-axis direction be the lens shooting direction and that the normal of the camera top surface be the Y axis (the industry convention), and the attitude of the end effector is set accordingly, as sketched below.
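As an illustrative sketch only (the axis values below are placeholders and would come from the actual camera mounting, not from the patent), Q3 can be built from the desired tool-axis directions expressed in the robot base frame:

```python
import numpy as np
from scipy.spatial.transform import Rotation

z_tool = np.array([0.0, 0.0, 1.0])   # desired lens (shooting) direction in base coordinates -- assumed
y_tool = np.array([0.0, 1.0, 0.0])   # desired camera top-surface normal in base coordinates -- assumed
x_tool = np.cross(y_tool, z_tool)    # complete a right-handed frame

Q3 = Rotation.from_matrix(np.column_stack([x_tool, y_tool, z_tool]))
```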
Step 7, keeping the robot end effector coordinate system consistent with the initial attitude of the virtual controller rigid body, or setting a determined pose transformation relation according to the chain rule, to complete the coordinate system calibration.
In this step, as shown in fig. 2, a schematic diagram of coordinate system calibration according to the embodiment of the present invention, the chain rule gives the following relationship:
Q2·Q1=Q4·Q3
where Q4 represents the transformation from the robot tool coordinate system to the virtual controller rigid body coordinate system;
thereby obtaining Q4, expressed as
Q4 = Q2 · Q1 · Q3⁻¹
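A minimal sketch of solving the chain relation for Q4, using scipy Rotation objects so that "·" is rotation composition and the inverse of Q3 is available directly; this simply mirrors the formula above:

```python
from scipy.spatial.transform import Rotation

def solve_q4(q1: Rotation, q2: Rotation, q3: Rotation) -> Rotation:
    """From Q2·Q1 = Q4·Q3, obtain Q4 = Q2·Q1·Q3⁻¹."""
    return q2 * q1 * q3.inv()
```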
In the specific implementation, after the coordinate system calibration is completed in step 7, the robot end effector coordinate system can correctly track the virtual controller rigid body coordinate system during motion, or track it through an arbitrary functional relationship; specifically:
when the virtual controller rigid body moves and rotates in space, the motion capture system transmits the attitude of the virtual controller rigid body to the host computer system, where it is converted into its expression in the robot base coordinate system, denoted Qm;
the robot end effector coordinate system then only needs to execute Q = Q4 · Qm · Q3 to realize dynamic tracking.
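A minimal sketch of the per-frame tracking update, assuming Qm is already the virtual controller rigid body attitude expressed in the robot base frame; send_tool_orientation stands in for whatever interface commands the robot's tool attitude (a hypothetical placeholder):

```python
from scipy.spatial.transform import Rotation

def track_once(qm: Rotation, q3: Rotation, q4: Rotation, send_tool_orientation) -> None:
    """Command the end-effector attitude Q = Q4 · Qm · Q3 for the current mocap frame."""
    q = q4 * qm * q3
    send_tool_orientation(q.as_quat())   # quaternion as (x, y, z, w)
```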
In addition, the robot end effector coordinate system can be aligned with the coordinate system of a mobile platform by means of mating cylindrical surfaces, positioning pins and similar locating features; when the robot base coordinate system is calibrated in the motion capture system, the mobile platform coordinate system is thereby calibrated as well.
With this calibration method, no matter how the robot is placed, once the virtual controller rigid body has been calibrated the virtual lens direction set on the external handle of the virtual controller corresponds to the matching direction of the motion capture system world coordinate system. For example, when the virtual lens is assigned to the Z axis, the virtual control direction of the external grip of the virtual rigid body is also the Z-axis direction.
It is noted that matters not described in detail in the embodiments of the present invention belong to the prior art well known to those skilled in the art.
The above description is only for the preferred embodiment of the present invention, but the scope of the present invention is not limited thereto, and any changes or substitutions that can be easily conceived by those skilled in the art within the technical scope of the present invention are included in the scope of the present invention. Therefore, the protection scope of the present invention shall be subject to the protection scope of the claims.

Claims (5)

1. A coordinate system calibration method suitable for use in a robotic system and a motion capture system, the method comprising:
step 1, firstly, for any rigid body in the robot system that carries marker balls, initializing its pose in the motion capture system, with no restriction on the pose;
step 2, fixedly mounting the calibrated virtual rigid body on a robot end effector in any posture and at any position;
step 3, obtaining the origin position of the virtual rigid body coordinate system with the motion capture system, and obtaining the expressions of the positive direction vectors of the X, Y and Z axes of the virtual rigid body coordinate system in the world coordinate system of the motion capture system;
step 4, obtaining, from the expressions obtained in step 3, the expression Q1 of the attitude of the robot base coordinate system in the world coordinate system of the motion capture system;
step 5, taking the virtual rigid body down for use as the virtual controller rigid body, and directly reading the attitude Q2 of the virtual controller rigid body through the motion capture system;
step 6, setting the attitude Q3 of the robot end effector coordinate system according to the actual application, i.e. the pose of the robot tool coordinate system relative to the robot base coordinate system;
step 7, keeping the robot end effector coordinate system consistent with the initial attitude of the virtual controller rigid body, or setting a determined pose transformation relation according to the chain rule, to complete the coordinate system calibration.
2. The coordinate system calibration method for a robot system and a motion capture system according to claim 1, wherein in step 3 the expressions of the positive axis direction vectors of the X, Y and Z axes of the virtual rigid body coordinate system in the world coordinate system of the motion capture system are obtained as follows:
moving the robot an arbitrary distance along the X axis of the base coordinate system, and recording the origin coordinates of the virtual rigid body coordinate system in the initial state and the final state as P1 and P2 respectively;
then subtracting the initial-state origin coordinate from the final-state origin coordinate to obtain the expression of the positive X-axis direction vector of the virtual rigid body coordinate system in the world coordinate system of the motion capture system:
Vx = P2 - P1
and obtaining the expressions of the Y-axis and Z-axis direction vectors of the virtual rigid body coordinate system in the world coordinate system of the motion capture system in the same way, denoted Vy and Vz.
3. The coordinate system calibration method for a robot system and a motion capture system according to claim 1, wherein in step 4, Q1 is expressed as:
Q1 = f(Vx, Vy, Vz)
where the function f solves for the attitude quaternion from the unit vectors of the coordinate axes.
4. The coordinate system calibration method for a robot system and a motion capture system according to claim 1, wherein in step 7, according to the chain rule, the following relation exists:
Q2·Q1=Q4·Q3
wherein Q4 represents the transformation from the robot tool coordinate system to the virtual controller rigid body coordinate system;
thereby obtaining Q4, expressed as Q4 = Q2 · Q1 · Q3⁻¹.
5. The coordinate system calibration method for a robot system and a motion capture system according to claim 1, wherein after the coordinate system calibration of step 7 is completed, the robot end effector coordinate system can correctly track the virtual controller rigid body coordinate system during motion, or track it through an arbitrary functional relationship; specifically:
when the virtual controller rigid body moves and rotates in space, the motion capture system transmits the attitude of the virtual controller rigid body to the host computer system, where it is converted into its expression in the robot base coordinate system, denoted Qm;
the robot end effector coordinate system then only needs to execute Q = Q4 · Qm · Q3 to realize dynamic tracking.
CN202011457099.XA 2020-12-10 2020-12-10 Coordinate system calibration method suitable for robot system and motion capture system Active CN112571416B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011457099.XA CN112571416B (en) 2020-12-10 2020-12-10 Coordinate system calibration method suitable for robot system and motion capture system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011457099.XA CN112571416B (en) 2020-12-10 2020-12-10 Coordinate system calibration method suitable for robot system and motion capture system

Publications (2)

Publication Number Publication Date
CN112571416A true CN112571416A (en) 2021-03-30
CN112571416B CN112571416B (en) 2022-03-22

Family

ID=75131721

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011457099.XA Active CN112571416B (en) 2020-12-10 2020-12-10 Coordinate system calibration method suitable for robot system and motion capture system

Country Status (1)

Country Link
CN (1) CN112571416B (en)

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106355617A (en) * 2016-08-12 2017-01-25 上海盟云移软网络科技股份有限公司 Dynamic positioning algorithm in the motion capture system
US20180075609A1 (en) * 2016-09-12 2018-03-15 DunAn Precision, Inc. Method of Estimating Relative Motion Using a Visual-Inertial Sensor
CN106600627A (en) * 2016-12-07 2017-04-26 成都通甲优博科技有限责任公司 Rigid body motion capturing method and system based on mark point
CN108762495A (en) * 2018-05-18 2018-11-06 深圳大学 The virtual reality driving method and virtual reality system captured based on arm action
CN111127568A (en) * 2019-12-31 2020-05-08 南京埃克里得视觉技术有限公司 Camera pose calibration method based on space point location information
CN111710002A (en) * 2020-05-27 2020-09-25 华中科技大学 Camera external parameter calibration method based on Optitrack system

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
JINGJIE HE ET AL.: "Research into A Control Schema for Camera Robot Capable of Artistic Creation", 2020 International Conference on Culture-oriented Science & Technology (ICCST) *
TAN JINGHUA: "Accuracy Analysis of the Optitrack 3D Motion Capture System" (in Chinese), China Master's Theses Full-text Database, Information Science and Technology *

Also Published As

Publication number Publication date
CN112571416B (en) 2022-03-22

Similar Documents

Publication Publication Date Title
US11498220B2 (en) Control system and control method
CN107428009B (en) Method for commissioning an industrial robot, industrial robot system and control system using the method
JP5850962B2 (en) Robot system using visual feedback
CN109550649B (en) Dispensing positioning method and device based on machine vision
TWI512548B (en) Moving trajectory generation method
US20160184996A1 (en) Robot, robot system, control apparatus, and control method
US20140018957A1 (en) Robot system, robot, robot control device, robot control method, and robot control program
EP2939402B1 (en) Method and device for sensing orientation of an object in space in a fixed frame of reference
CN111801198A (en) Hand-eye calibration method, system and computer storage medium
US20190376860A1 (en) Visuo-haptic sensor
CN113858217B (en) Multi-robot interaction three-dimensional visual pose perception method and system
WO2018043524A1 (en) Robot system, robot system control device, and robot system control method
JP2019098409A (en) Robot system and calibration method
Xu et al. Vision-based simultaneous measurement of manipulator configuration and target pose for an intelligent cable-driven robot
CN106643601B (en) The sextuple measurement method of parameters of industrial robot dynamic
TWI476733B (en) Three-dimensional space motion reconstruction method and apparatus constructed thereby
CN113781558A (en) Robot vision locating method with decoupled posture and position
CN112381881B (en) Automatic butt joint method for large rigid body members based on monocular vision
CN112571416B (en) Coordinate system calibration method suitable for robot system and motion capture system
JP2013173191A (en) Robot apparatus, robot control apparatus, robot control method, and robot control program
KR101340555B1 (en) Apparatus and method for generating base view image
CN111971529A (en) Method and apparatus for managing robot system
JP2018017610A (en) Three-dimensional measuring device, robot, robot controlling device, and robot system
WO2022172471A1 (en) Assistance system, image processing device, assistance method and program
WO2024041202A1 (en) Image processing method, calibration system, and related equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant