CN109760107B - Monocular vision-based robot positioning accuracy evaluation method - Google Patents
- Publication number
- CN109760107B (application CN201910056723.6A)
- Authority
- CN
- China
- Prior art keywords
- robot
- positioning
- pose
- error
- coordinate system
- Prior art date
- Legal status (assumed; not a legal conclusion)
- Active
Abstract
The invention discloses a monocular vision-based robot positioning accuracy evaluation method. A calibration plate is placed at an arbitrary position in the robot's movement space, and the robot observes the same plate multiple times from different positions as it moves through the working environment, so that the observation procedure yields multiple samples. From all observations collected during the movement, together with their statistical characteristics, the positioning accuracy of different positioning algorithms in the same working environment can be compared quantitatively. The method obtains the robot's true pose without additional precision instruments and without large numbers of repeated experimental tests, which reduces cost and improves efficiency. Because the calibration plate's pose in the environment is unknown, the evaluation procedure is simpler, and the robot's movement environment and space are not restricted. Finally, the accuracy of the robot positioning algorithm is described by an expression positively correlated with the positioning error, enabling accuracy comparison between different algorithms.
Description
Technical Field
The invention relates to the technical field of robot vision, in particular to a monocular vision-based robot positioning accuracy evaluation method.
Background
Visual positioning of mobile robots is used throughout visual navigation, and positioning accuracy directly affects a mobile robot's navigation capability. Evaluating the positioning accuracy of a mobile robot is therefore important. Most current evaluation methods rely on comparative analysis using a motion-capture device and a set of calibration plates with known relative positions installed in the movement environment.
However, arranging a motion-capture device and a calibration plate set is expensive, requires substantial modification of the environment, and increases complexity of use, so such evaluation methods are difficult to apply widely in practice. The invention provides a monocular vision-based positioning accuracy evaluation method that quantitatively compares the accuracy of different positioning algorithms by collecting the mobile robot's sensor feedback during movement and observing, with a monocular camera mounted on the robot body, a calibration plate placed at an arbitrary position in the motion space.
Disclosure of Invention
The invention aims to overcome the defects of the prior art and provides a monocular vision-based robot positioning accuracy evaluation method. The method places calibration plates at arbitrary positions in the robot's motion space, without calibrating the pose (position and orientation) between the plates. During movement, the robot observes the same calibration plate multiple times from different positions in its working environment, obtaining multiple samples through this observation procedure; from all observations collected during a single movement and their statistical characteristics, the positioning accuracy of different positioning algorithms in the same working environment is compared quantitatively.
In order to achieve the purpose, the technical scheme provided by the invention is as follows:
a robot positioning accuracy evaluation method based on monocular vision comprises the following steps:
s1: arranging calibration plates within the camera's visual range along the robot's path;
s2: the robot moves toward a calibration plate in the motion space;
s3: when the robot approaches the calibration plate, recording the robot's real-time pose as estimated by the current positioning algorithm, together with the relative pose between the robot and the calibration plate;
s4: and fitting to obtain an error distribution curve, and evaluating the accuracy of the positioning algorithm.
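The four steps above amount to a data-collection loop followed by a fit. A minimal sketch is given below; the helper names (`estimate_pose`, `pose_relative_to_board`, `near_board`) are hypothetical stand-ins for the robot's real interfaces, not functions from the patent:

```python
import numpy as np

# Hypothetical stand-ins for the robot's interfaces; a real system would
# query the positioning algorithm under test and a board-detection routine.
def estimate_pose(step):
    return np.eye(4)            # S3: pose estimated by the algorithm (4x4)

def pose_relative_to_board(step):
    return np.eye(4)            # S3: pose w.r.t. the plate from the camera (4x4)

def near_board(step):
    return step % 10 == 0       # S2: is the plate visible on this step?

log = []
for step in range(100):         # S1/S2: robot moves through the workspace
    if near_board(step):
        # S3: record the paired observation used later by the S4 error fit
        log.append((estimate_pose(step), pose_relative_to_board(step)))

print(len(log), "paired observations collected")
```

Each logged pair feeds the S4 fitting step, which characterises the algorithm's error distribution from the sample set.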
Further, in step S3, the real-time pose information of the robot is obtained through a positioning algorithm.
Further, in step S3, the relative pose information between the robot and the calibration board is obtained by a multi-view geometric method, and the specific calculation steps are as follows:
converting the image plane coordinate system into the image pixel coordinate system:
u = x/dx + u0, v = y/dy + v0
where uO0v is the image pixel coordinate system, in pixels, and xO1y is the image plane coordinate system, in millimeters; dx and dy are the physical sizes of a pixel in the u-axis and v-axis directions; dx, dy, u0 and v0 are all assumed parameters;
converting the camera coordinate system to the world coordinate system:
[XC, YC, ZC, 1]^T = [R t; 0 1]·[X, Y, Z, 1]^T = L·[X, Y, Z, 1]^T
where (XC, YC, ZC) is an image point in the camera coordinate system and (X, Y, Z) is the same point in the world coordinate system; R is a 3×3 rotation matrix, t is a 3×1 translation vector, and L denotes the combined 4×4 transformation matrix;
relating the world coordinate system to the pixel coordinate system:
ZC·[u, v, 1]^T = [f/dx 0 u0; 0 f/dy v0; 0 0 1]·[R t]·[X, Y, Z, 1]^T
where f is the camera focal length.
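The projection chain above can be illustrated with a small numpy sketch. The intrinsic values (f, dx, dy, u0, v0) and the extrinsics below are assumed placeholders for illustration, not values from the patent:

```python
import numpy as np

# Assumed intrinsics: focal length (m), pixel sizes (m), principal point (px).
f, dx, dy, u0, v0 = 0.004, 1e-5, 1e-5, 320.0, 240.0

# Intrinsic matrix built from the parameters above: f/dx, f/dy, u0, v0.
K = np.array([[f / dx, 0.0,    u0],
              [0.0,    f / dy, v0],
              [0.0,    0.0,    1.0]])

# Assumed extrinsics: identity rotation, zero translation.
R = np.eye(3)
t = np.zeros((3, 1))

def project(world_point):
    """Project a world-frame point to pixels via Z_C [u v 1]^T = K [R|t] X."""
    X = np.append(world_point, 1.0)    # homogeneous world point
    cam = np.hstack([R, t]) @ X        # camera-frame point (X_C, Y_C, Z_C)
    uv = K @ cam
    return uv[:2] / uv[2]              # divide out the depth Z_C

print(project(np.array([0.1, 0.05, 2.0])))  # -> [340. 250.]
```

With identity extrinsics the depth is Z_C = 2.0 m, so the point lands at u = 400·0.1/2 + 320 = 340 and v = 400·0.05/2 + 240 = 250.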
Further, the step S4 fits to obtain an error distribution curve, and the specific steps are as follows:
Pi is the true value of the robot pose obtained in step S3, and Pi′ is the error-containing pose estimated by the positioning algorithm; the two are related by:
Pi′ = Δi·Pi;
assuming that the error of the positioning algorithm follows a Gaussian distribution, and using a calibration plate whose pose in the working environment is unknown, the robot's error-containing pose Pi′ and its pose Ti relative to the calibration plate are recorded simultaneously at different positions. With T denoting the unknown pose of the calibration plate in the world coordinate system (so that Pi = T·Ti), the robot's pose error is Δi = Pi′·Ti⁻¹·T⁻¹, which yields the expression of the robot's error distribution curve under the current positioning algorithm:
Ω = Δ1·Δ2⁻¹ = P1′·T1⁻¹·T2·P2′⁻¹
in which the unknown plate pose T cancels;
The error expression obtained above is positively correlated with the robot's positioning error, so the positioning error distribution of the evaluated positioning algorithm can be plotted and its accuracy assessed.
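The fact that the unknown plate pose drops out of Ω can be checked numerically. The sketch below uses arbitrary assumed poses (planar rotations for simplicity) and verifies that Ω computed from the observable quantities alone equals Δ1·Δ2⁻¹:

```python
import numpy as np

def se3(theta, tx, ty):
    """Build a 4x4 homogeneous pose: rotation about z by theta, translation (tx, ty)."""
    c, s = np.cos(theta), np.sin(theta)
    M = np.eye(4)
    M[:2, :2] = [[c, -s], [s, c]]
    M[0, 3], M[1, 3] = tx, ty
    return M

# Assumed values: unknown plate pose T, two robot-to-plate observations T1, T2,
# and two pose errors Delta1, Delta2 introduced by the positioning algorithm.
T  = se3(0.3, 1.0, 2.0)
T1 = se3(-0.1, 0.5, 0.2)
T2 = se3(0.2, 0.8, -0.1)
D1 = se3(0.01, 0.02, -0.01)
D2 = se3(-0.02, 0.01, 0.03)

# True poses are P_i = T @ T_i; the algorithm reports P_i' = Delta_i @ P_i.
P1e = D1 @ T @ T1
P2e = D2 @ T @ T2

# Omega uses only the recorded quantities P_i' and T_i -- never T itself.
Omega = P1e @ np.linalg.inv(T1) @ T2 @ np.linalg.inv(P2e)
print(np.allclose(Omega, D1 @ np.linalg.inv(D2)))  # True
```

Expanding P1′·T1⁻¹·T2·P2′⁻¹ gives Δ1·T·(T·Δ2⁻¹... more precisely Δ1·T·T⁻¹·Δ2⁻¹ = Δ1·Δ2⁻¹, which is why the plate never needs to be surveyed.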
Compared with the prior art, the principle and the advantages of the scheme are as follows:
1. The robot's true pose is obtained without additional precision instruments and without large numbers of repeated experimental tests, saving cost and improving efficiency.
2. Calibration plates with unknown environment poses are used, so pose information between multiple plates need not be acquired in advance; the evaluation method is simpler, and the robot's movement environment and space are not restricted.
3. The accuracy of the robot positioning algorithm is described by an expression positively correlated with the positioning error, replacing the original ground-truth error distribution expression and enabling accuracy comparison between different algorithms.
Drawings
FIG. 1 is a flow chart of a robot positioning accuracy evaluation method based on monocular vision according to the present invention;
FIG. 2 is a schematic diagram of error depiction in the monocular vision-based robot positioning accuracy evaluation method of the present invention;
FIG. 3 is a schematic diagram of an image plane coordinate system to an image pixel coordinate system;
fig. 4 is a schematic diagram of a world coordinate system transformed from a camera coordinate system.
Detailed Description
The invention will be further illustrated with reference to specific examples:
referring to fig. 1 and 2, the method for evaluating the positioning accuracy of the robot based on monocular vision according to the present embodiment includes the following steps:
s1: arranging calibration plates within the camera's visual range along the robot's path;
s2: the robot moves toward a calibration plate in the motion space;
s3: when the robot approaches the calibration plate, recording the robot's real-time pose as estimated by the current positioning algorithm, together with the relative pose between the robot and the calibration plate;
in step S3, the real-time pose information of the robot is obtained by a positioning algorithm.
The relative pose information of the robot and the calibration plate is obtained by a camera calibration method, and the specific calculation steps are as follows:
the image plane coordinate system is converted into the image pixel coordinate system, as shown in fig. 3:
u = x/dx + u0, v = y/dy + v0
where uO0v is the image pixel coordinate system, in pixels, and xO1y is the image plane coordinate system, in millimeters; dx and dy are the physical sizes of a pixel in the u-axis and v-axis directions; dx, dy, u0 and v0 are all assumed parameters;
the camera coordinate system is transformed into the world coordinate system, as shown in fig. 4:
[XC, YC, ZC, 1]^T = [R t; 0 1]·[X, Y, Z, 1]^T = L·[X, Y, Z, 1]^T
where (XC, YC, ZC) is an image point in the camera coordinate system and (X, Y, Z) is the same point in the world coordinate system; R is a 3×3 rotation matrix, t is a 3×1 translation vector, and L denotes the combined 4×4 transformation matrix;
the world coordinate system is related to the pixel coordinate system:
ZC·[u, v, 1]^T = [f/dx 0 u0; 0 f/dy v0; 0 0 1]·[R t]·[X, Y, Z, 1]^T
where f is the camera focal length.
S4: fitting to obtain an error distribution curve and evaluating the accuracy of the positioning algorithm. The specific steps are as follows: Pi is the true value of the robot pose obtained in step S3, and Pi′ is the error-containing pose estimated by the positioning algorithm; the two are related by:
Pi′ = Δi·Pi;
assuming that the error of the positioning algorithm follows a Gaussian distribution, and using a calibration plate whose pose in the working environment is unknown, the robot's error-containing pose Pi′ and its pose Ti relative to the calibration plate are recorded simultaneously at different positions. With T denoting the unknown pose of the calibration plate in the world coordinate system (so that Pi = T·Ti), the robot's pose error is Δi = Pi′·Ti⁻¹·T⁻¹, which yields the expression of the robot's error distribution curve under the current positioning algorithm:
Ω = Δ1·Δ2⁻¹ = P1′·T1⁻¹·T2·P2′⁻¹
in which the unknown plate pose T cancels;
The error expression obtained above is positively correlated with the robot's positioning error, so the positioning error distribution of the evaluated positioning algorithm can be plotted and its accuracy assessed.
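The S4 fitting step can be sketched as follows. The error samples here are synthetic (simulated stand-ins for the translational part of the Ω samples), purely for illustration of how two algorithms would be compared:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic per-observation positioning errors for two hypothetical algorithms.
errors_a = rng.normal(0.0, 0.02, 500)   # algorithm A: tighter spread
errors_b = rng.normal(0.0, 0.06, 500)   # algorithm B: looser spread

# Fit a Gaussian to each sample set (the patent assumes Gaussian errors),
# then compare the fitted spreads as the accuracy measure.
def fit_gaussian(errors):
    return errors.mean(), errors.std()

mu_a, sigma_a = fit_gaussian(errors_a)
mu_b, sigma_b = fit_gaussian(errors_b)

print(f"A: sigma={sigma_a:.3f}  B: sigma={sigma_b:.3f}")
print("A is more accurate" if sigma_a < sigma_b else "B is more accurate")
```

A smaller fitted standard deviation means a tighter error distribution and therefore a more accurate positioning algorithm under this evaluation.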
This embodiment obtains the robot's true pose without additional precision instruments and without large numbers of repeated experimental tests, saving cost and improving efficiency. In addition, because calibration plates with unknown environment poses are used, pose information between multiple plates need not be acquired in advance; the evaluation method is simpler, and the robot's movement environment and space are not restricted. Finally, the accuracy of the robot positioning algorithm is described by an expression positively correlated with the positioning error, replacing the original ground-truth error distribution expression and enabling accuracy comparison between different algorithms.
The above-mentioned embodiments are merely preferred embodiments of the present invention, and the scope of the present invention is not limited thereto, so that variations based on the shape and principle of the present invention should be covered within the scope of the present invention.
Claims (2)
1. A robot positioning accuracy evaluation method based on monocular vision is characterized by comprising the following steps:
s1: arranging calibration plates within the camera's visual range along the robot's path;
s2: the robot moves toward a calibration plate in the motion space;
s3: when the robot approaches the calibration plate, recording the robot's real-time pose as estimated by the current positioning algorithm, together with the relative pose between the robot and the calibration plate;
s4: fitting to obtain an error distribution curve, and evaluating the accuracy of the positioning algorithm;
in step S3, the relative pose information between the robot and the calibration board is obtained by a camera calibration method, and the specific calculation steps are as follows:
converting the image plane coordinate system into the image pixel coordinate system:
u = x/dx + u0, v = y/dy + v0
where u and v are image pixel coordinates, in pixels, and x and y are image plane coordinates, in millimeters; dx and dy are the physical sizes of a pixel in the u-axis and v-axis directions; dx, dy, u0 and v0 are all assumed parameters;
converting the camera coordinate system to the world coordinate system:
[XC, YC, ZC, 1]^T = [R t; 0 1]·[X, Y, Z, 1]^T = L·[X, Y, Z, 1]^T
where (XC, YC, ZC) is an image point in the camera coordinate system and (X, Y, Z) is the same point in the world coordinate system; R is a 3×3 rotation matrix, t is a 3×1 translation vector, and L denotes the combined 4×4 transformation matrix;
relating the world coordinate system to the pixel coordinate system:
ZC·[u, v, 1]^T = [f/dx 0 u0; 0 f/dy v0; 0 0 1]·[R t]·[X, Y, Z, 1]^T
where f is the focal length of the camera;
the step S4 is to obtain an error distribution curve by fitting, and the specific steps are as follows:
Pi is the true value of the robot pose obtained in step S3, and Pi′ is the error-containing pose estimated by the positioning algorithm; the two are related by:
Pi′ = Δi·Pi;
assuming that the error of the positioning algorithm follows a Gaussian distribution, and using a calibration plate whose pose in the working environment is unknown, the robot's error-containing pose Pi′ and its pose Ti relative to the calibration plate are recorded simultaneously at different positions. With T denoting the unknown pose of the calibration plate in the world coordinate system (so that Pi = T·Ti), the robot's pose error is Δi = Pi′·Ti⁻¹·T⁻¹, which yields the expression of the robot's error distribution curve under the current positioning algorithm:
Ω = Δ1·Δ2⁻¹ = P1′·T1⁻¹·T2·P2′⁻¹,
whereby the error expression obtained above is positively correlated with the robot's positioning error, so the positioning error distribution of the evaluated positioning algorithm can be plotted and its accuracy assessed.
2. The monocular vision based robot positioning accuracy evaluation method of claim 1, wherein in step S3, the real-time pose information of the robot is obtained through a positioning algorithm.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201910056723.6A CN109760107B (en) | 2019-01-22 | 2019-01-22 | Monocular vision-based robot positioning accuracy evaluation method |
Publications (2)
Publication Number | Publication Date |
---|---|
CN109760107A CN109760107A (en) | 2019-05-17 |
CN109760107B true CN109760107B (en) | 2022-04-12 |
Family
ID=66454911
Legal Events
Date | Code | Title | Description
---|---|---|---
| PB01 | Publication | |
| SE01 | Entry into force of request for substantive examination | |
2021-09-01 | TA01 | Transfer of patent application right | Address after: room 3, 803, floor 8, block 3, Tian'an center, No. 31, Jihua East Road, Guicheng Street, Nanhai District, Foshan City, Guangdong Province, 528253. Applicant after: Jiutian innovation (Guangdong) Intelligent Technology Co.,Ltd. Address before: No. 100, Waihuan West Road, University Town, Guangzhou, Guangdong 510062. Applicant before: GUANGDONG University OF TECHNOLOGY |
| GR01 | Patent grant | |