CN112465898B - Object 3D pose tag acquisition method based on checkerboard calibration plate - Google Patents

Object 3D pose tag acquisition method based on checkerboard calibration plate

Info

Publication number
CN112465898B
Authority
CN
China
Prior art keywords
checkerboard
coordinate system
pose
camera
transformation matrix
Prior art date
Legal status
Active
Application number
CN202011309922.2A
Other languages
Chinese (zh)
Other versions
CN112465898A (en)
Inventor
Zhuang Chungang (庄春刚)
Xiong Zhenhua (熊振华)
Zhu Xiangyang (朱向阳)
Lei Haibo (雷海波)
Wang Hesheng (王合胜)
Wang Haoyu (王浩宇)
Current Assignee
Shanghai Jiaotong University
Original Assignee
Shanghai Jiaotong University
Priority date
Filing date
Publication date
Application filed by Shanghai Jiaotong University filed Critical Shanghai Jiaotong University
Priority to CN202011309922.2A
Publication of CN112465898A
Application granted
Publication of CN112465898B

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/70 Determining position or orientation of objects or cameras
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/80 Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The invention discloses an object 3D pose tag acquisition method based on a checkerboard calibration plate, which relates to the technical field of object pose acquisition and comprises the following steps: establishing pose templates of the object and the checkerboard calibration plate from the CAD model of the object; calibrating the camera intrinsic parameters, extrinsic parameters and radial distortion parameters; acquiring the pose of the object in the camera coordinate system; and verifying the correctness of the acquired pose. By constraining the pose relationship between the object and the checkerboard, the invention provides a low-cost, high-precision method for acquiring the real pose of an object using checkerboard calibration, and provides a real verification label for object pose estimation methods.

Description

Object 3D pose tag acquisition method based on checkerboard calibration plate
Technical Field
The invention relates to the technical field of object pose acquisition, in particular to an object pose label acquisition method based on a checkerboard calibration plate.
Background
With the development of deep learning, more and more networks address the estimation of object spatial pose. A spatial pose estimation network needs a large amount of training data with pose tags, and pose tags of objects in real scenes are also needed as the basis for computing the network's regression accuracy. The acquisition of the real pose of an object is therefore of great significance for data set construction and for evaluating pose estimation accuracy.
Existing object pose information is mainly acquired in two ways: one obtains a simulation data set and its labels through a physics simulation engine, and the other obtains the pose information of a real scene indirectly. Kilian Kleeberger et al. use an iterative closest point algorithm in the article "Large-scale 6D Object Pose Estimation Dataset for Industrial Bin-Picking" to obtain object labels for a real scene. Romain Brégier et al. use markers attached to the object in the article "Symmetry Aware Evaluation of 3D Object Detection and Pose Estimation in Scenes of Many Parts in Bulk" to obtain the pose information of the object. For verifying pose estimation accuracy, Chien-Ming Lin et al., in the article "Visual Object Recognition and Pose Estimation Based on a Deep Semantic Segmentation Network", use a two-degree-of-freedom turntable to provide labels of the relative pose of an object in a real scene.
Acquiring the pose through such indirect algorithms depends on complete, reliable three-dimensional scene information and a highly robust template matching algorithm; in complex scenes, interference information must be removed manually. Using data generated by a physics engine causes an inevitable loss of precision when generalizing to real scenes. The estimation accuracy of the absolute pose of an object in the camera coordinate system is a direct factor in the success rate of object grasping, so evaluating only the relative pose cannot reflect the performance of pose estimation in a grasping task.
Therefore, those skilled in the art are dedicated to developing a checkerboard-based object pose tag acquisition method that does not depend on three-dimensional point cloud information and does not require a highly robust matching algorithm.
Disclosure of Invention
In view of the above defects in the prior art, the technical problem to be solved by the present invention is to provide a low-cost, high-precision method for acquiring the pose of an object in a real scene, and to provide a real verification tag for object pose estimation methods.
In order to achieve this purpose, the invention provides an object 3D pose tag acquisition method based on a checkerboard calibration plate, characterized by comprising the following steps:
step 1, establishing a CAD model of the object;
step 2, projecting the CAD model at equal scale to obtain an object contour template;
step 3, aligning the object contour template with the origin of the checkerboard calibration plate;
step 4, calculating the transformation matrix from the object coordinate system to the checkerboard coordinate system to obtain a first transformation matrix;
step 5, placing the object and the checkerboard on the template, and calibrating the camera extrinsic parameters, intrinsic parameters and radial distortion parameters;
step 6, moving the camera so that the object and the checkerboard are simultaneously within the camera's field of view, and acquiring a single picture;
step 7, correcting the single picture with the radial distortion parameters, and calculating the transformation matrix from the checkerboard coordinate system to the camera coordinate system to obtain a second transformation matrix;
step 8, calculating the transformation matrix from the object coordinate system to the camera coordinate system from the first and second transformation matrices to obtain a third transformation matrix;
step 9, sampling the CAD model to obtain the coordinate values of the sampling points in the object coordinate system, referred to as the sampled object coordinate values, and computing the coordinate values of the sampling points in the camera coordinate system with the third transformation matrix to obtain the sampled camera coordinate values;
step 10, obtaining a pose estimate of the object in the camera coordinate system with an object pose estimation method, computing the coordinate values of the sampling points in the camera coordinate system under that estimate to obtain the sampled estimated coordinate values, and comparing the sampled estimated coordinate values with the sampled camera coordinate values to obtain a pose estimation accuracy index.
Further, step 3 comprises: selecting template feature points on the object contour template, aligning the template feature points with the origin of the checkerboard calibration plate, and calculating the offsets of the template feature points.
Further, the first transformation matrix is:
[formula image: the 4×4 homogeneous transformation matrix T_board2obj]
where x_f, y_f, z_f are the coordinate values of the template feature point in the object coordinate system, Δx and Δy are the offsets from the template feature point to the origin of the checkerboard calibration plate, and Δh is the thickness of the high-precision checkerboard calibration plate, taken as 0 when a high-precision calibration plate is not used.
Further, the camera extrinsic parameters are obtained by fitting the detected checkerboard corners to their actual coordinates, and the camera intrinsic parameters are calculated using Zhang's calibration method.
Further, the camera intrinsic parameters and the radial distortion parameters are, respectively:
M = [ f/dx, 0, u_0 ; 0, f/dy, v_0 ; 0, 0, 1 ], k = (k_1, k_2)
where f is the focal length of the industrial camera, dx is the pixel lateral ratio, dy is the pixel longitudinal ratio, and u_0, v_0 are the principal point coordinates of the image.
Further, the second transformation matrix is:
T_board2camera = [ R, t ; 0, 1 ]
where R is the rotation transformation from the checkerboard coordinate system to the camera coordinate system and t is the translation vector from the checkerboard coordinate system to the camera coordinate system.
Further, the third transformation matrix is:
T_obj2camera = T_board2camera · T_board2obj^(-1)
further, the object coordinate values of the sampling points are:
Figure BDA0002789481110000032
in the formula
Figure BDA0002789481110000033
Is the coordinate value of the ith sampling point in the object coordinate system.
Further, the sampled camera coordinate values are:
P_cam^i = T_obj2camera · P_obj^i
where P_cam^i is the coordinate value of the i-th sampling point in the camera coordinate system, obtained using the third transformation matrix.
Further, the sampled estimated coordinate values are:
P̂_cam^i = T̂_obj2camera · P_obj^i
where P̂_cam^i is the coordinate value of the i-th sampling point in the camera coordinate system, calculated using the transformation matrix T̂_obj2camera obtained by the pose estimation method.
Further, the pose estimation accuracy index is calculated as:
e = (1/N) Σ_{i=1}^{N} ‖P̂_cam^i − P_cam^i‖
where P̂_cam^i and P_cam^i are, respectively, the sampled estimated coordinate value and the sampled camera coordinate value of the i-th sampling point.
Compared with the prior art, the invention has at least the following beneficial technical effects:
1. the pose information of the object in the camera coordinate system can be acquired at low cost;
2. the coordinate transformation from the checkerboard to the camera is used directly in computing the object pose, avoiding the robot motion errors and other errors present in existing methods;
3. the method provides labels in real scenes for object pose estimation, providing a basis for building real data sets for object pose estimation and for evaluating pose estimation.
The conception, the specific structure and the technical effects of the present invention will be further described with reference to the accompanying drawings to fully understand the objects, the features and the effects of the present invention.
Drawings
FIG. 1 is a flow chart of a preferred embodiment of the present invention;
FIG. 2 is a schematic representation of a CAD model of a preferred embodiment of the present invention;
FIG. 3 is a schematic diagram of an object outline template in accordance with a preferred embodiment of the present invention;
FIG. 4 is a schematic view of the alignment of an object outline template and a checkerboard calibration plate in accordance with a preferred embodiment of the present invention;
FIG. 5 is a diagram illustrating the relationship of an object coordinate system to a checkerboard coordinate system according to a preferred embodiment of the present invention;
FIG. 6 is a schematic diagram of an eye-in-hand calibration model according to a preferred embodiment of the present invention;
FIG. 7 is a schematic representation of a CAD model sampling of a preferred embodiment of the present invention;
FIG. 8 is a graph of the error distribution of the sampling points according to a preferred embodiment of the present invention.
In the figures: 1 - connecting rod workpiece CAD model; 2 - checkerboard calibration plate; 3 - industrial camera; 4 - industrial robot.
Detailed Description
The technical contents of the preferred embodiments of the present invention will be more clearly and easily understood by referring to the drawings attached to the specification. The present invention may be embodied in many different forms of embodiments and the scope of the invention is not limited to the embodiments set forth herein.
In the drawings, structurally identical elements are represented by like reference numerals, and structurally or functionally similar elements are represented by like reference numerals throughout the several views. The size and thickness of each component shown in the drawings are arbitrarily illustrated, and the present invention is not limited to the size and thickness of each component. The thickness of the components may be exaggerated where appropriate in the figures to improve clarity.
The object of the present embodiment is a common connecting rod workpiece in an industrial scene; FIG. 1 is a flowchart of the embodiment, which includes the following steps.
Step 1, establishing the connecting rod workpiece CAD model 1. The model is obtained at the product design stage, or can be obtained with a reconstruction system or by surveying and mapping; the CAD model 1 is shown in FIG. 2.
Step 2, projecting the connecting rod workpiece CAD model 1 at equal scale to obtain a contour template of the connecting rod workpiece, as shown in FIG. 3.
Step 3, aligning the contour template obtained in step 2 with the origin of the checkerboard calibration plate 2. In this embodiment, the center of the small shaft hole is selected as the template feature point of the contour template, and the offsets from the feature point to the checkerboard origin are fixed at Δx = 0 mm and Δy = -90 mm; the alignment of the contour template and the checkerboard calibration plate 2 is shown in FIG. 4.
Step 4, calculating the transformation matrix from the object coordinate system to the checkerboard coordinate system from the coordinate values of the template feature point in the object coordinate system and its offsets to the checkerboard origin.
The relationship between the object coordinate system of the connecting rod workpiece and the checkerboard coordinate system is shown in FIG. 5. The object coordinate system {O_1} of the connecting rod workpiece is built at the dimensional center of the workpiece, i.e., the coordinate origin lies at the dimensional center. The X axis points from the dimensional center to the center of the small shaft hole, the Y axis points from the dimensional center out of the front face of the workpiece, and the coordinate system follows the right-hand rule. Following the definition in Zhang's calibration method, the origin of the checkerboard coordinate system {O_2} is located at the lower-right corner point shown in FIG. 5, the X axis points from that corner toward the direction with more squares, the Y axis points from that corner toward the direction with fewer squares, and the coordinate system follows the right-hand rule.
From the connecting rod workpiece CAD model 1, the coordinates of the projected small-shaft center in the object coordinate system are x_f = 50.5 mm, y_f = -12.5 mm, z_f = 0 mm, so the transformation matrix from the object coordinate system to the checkerboard coordinate system is:
[formula image: T_board2obj for Δh = 0]
where Δx is positive in the direction of the X axis of the object coordinate system, and Δy is positive in the direction of the Z axis of the object coordinate system;
and aligning the outer contour of the connecting rod workpiece with the contour template thereof, and aligning the high-precision checkerboard calibration plate 2 with the checkerboard template if the high-precision checkerboard calibration plate 2 is used. If the thickness of the high-precision checkerboard calibration plate 2 is Δ h =3mm, the transformation matrix from the object coordinate system to the checkerboard coordinate system is:
Figure BDA0002789481110000052
wherein, the delta h is positive when the Y axis of the object coordinate system is in the same direction.
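For illustration, the assembly of this first transformation matrix can be sketched in a few lines of NumPy. The patent gives the matrix only as an image, so the rotation used below and the way Δx, Δy and Δh enter the translation are assumptions of this sketch inferred from the sign conventions stated above; the variable names are ours.

```python
import numpy as np

# Sketch of step 4: build T_board2obj from the feature-point coordinates
# and the fixed offsets. Values are the embodiment's (mm).
x_f, y_f, z_f = 50.5, -12.5, 0.0     # feature point in the object frame
dx, dy, dh = 0.0, -90.0, 3.0         # offsets and plate thickness

# Assumed rotation between the frames (board X -> object X,
# board Y -> object -Z, board Z -> object Y); the true rotation
# follows the frame definitions of FIG. 5.
R_board2obj = np.array([[1.0, 0.0, 0.0],
                        [0.0, 0.0, 1.0],
                        [0.0, -1.0, 0.0]])

# Assumed translation: dx adds along object X, dy along object Z and
# dh along object Y, per the sign conventions in the text.
t_board2obj = np.array([x_f + dx, y_f + dh, z_f + dy])

T_board2obj = np.eye(4)
T_board2obj[:3, :3] = R_board2obj
T_board2obj[:3, 3] = t_board2obj
```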
Step 5, the connecting rod workpiece and the calibration plate are placed under the constructed eye-in-hand calibration system shown in FIG. 6. Following Zhang's camera calibration procedure, the checkerboard calibration plate 2 is kept still while the pose of the industrial robot 4 is changed to acquire 10-20 checkerboard images.
According to Zhang's calibration method, the intrinsic parameters of the industrial camera 3 are calculated as:
M = [ f/dx, 0, u_0 ; 0, f/dy, v_0 ; 0, 0, 1 ]
where f is the focal length of the industrial camera 3, dx is the pixel lateral ratio, dy is the pixel longitudinal ratio, and u_0, v_0 are the principal point coordinates of the image;
and the radial distortion parameters of the industrial camera 3 are:
k = (k_1, k_2)
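As a minimal sketch of how step 5 could be carried out with OpenCV (the patent does not prescribe a library): corners are detected in the 10-20 images and Zhang's method is applied via cv2.calibrateCamera. The board dimensions, square size and file names below are placeholders, not values from the patent.

```python
import glob
import cv2
import numpy as np

cols, rows, square = 9, 6, 10.0  # assumed inner-corner grid and 10 mm squares

# 3D corner coordinates in the checkerboard frame (Z = 0 plane).
objp = np.zeros((rows * cols, 3), np.float32)
objp[:, :2] = np.mgrid[0:cols, 0:rows].T.reshape(-1, 2) * square

obj_pts, img_pts = [], []
for path in glob.glob("calib_*.png"):      # the 10-20 images of step 5
    gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    found, corners = cv2.findChessboardCorners(gray, (cols, rows))
    if not found:
        continue
    corners = cv2.cornerSubPix(
        gray, corners, (11, 11), (-1, -1),
        (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 30, 1e-3))
    obj_pts.append(objp)
    img_pts.append(corners)

# M is the intrinsic matrix; dist holds (k1, k2, p1, p2, k3). The
# patent's model keeps only the radial terms k1, k2.
rms, M, dist, rvecs, tvecs = cv2.calibrateCamera(
    obj_pts, img_pts, gray.shape[::-1], None, None)
k1, k2 = dist.ravel()[:2]
```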
and 6, after the calibration of the internal parameters and distortion parameters of the industrial camera 3 is completed, moving the industrial robot 4 to enable the connecting rod and the calibration plate to be in the visual field of the camera at the same time, and collecting the image of the industrial camera 3 at the moment. In order to acquire images and labels of the links at different distances and attitudes in the camera coordinate system, the workpiece and the checkerboard calibration plate 2 are kept stationary. The 4-position shape of the industrial robot is changed, the workpiece and the checkerboard calibration plate 2 are ensured to be in the camera visual field, and the posture of the workpiece under a camera coordinate system is changed.
Step 7, the image acquired in step 6 is corrected using the radial distortion parameters k obtained in step 5. The relationship between the pixel coordinates of the corrected image and those of the original image is:
û = u + (u - u_0)(k_1·r² + k_2·r⁴)
v̂ = v + (v - v_0)(k_1·r² + k_2·r⁴)
where (û, v̂) are the image pixel coordinates after radial distortion of the industrial camera 3, u, v are the image pixel coordinates the industrial camera 3 would obtain in the ideal case, u_0, v_0 are the coordinate values of the image principal point, k_1, k_2 are the radial distortion parameters, and r is the actual distance between the pixel point and the principal point;
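Continuing the sketch above, the correction of step 7 corresponds to OpenCV's undistortion, which inverts this radial model numerically (the file name is a placeholder):

```python
import cv2

# Undistort the single image of step 6 with the calibrated M and dist.
img = cv2.imread("scene.png")
undistorted = cv2.undistort(img, M, dist)
```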
Using Zhang's calibration method on the image after distortion removal, the transformation matrix from the checkerboard coordinate system to the camera coordinate system is obtained:
[formula image: the computed numeric value of T_board2camera]
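One way to compute this second transformation matrix, sketched here as a continuation of the snippets above (objp, cols, rows, M and undistorted carry over), is to re-detect the board corners in the undistorted image and solve a PnP problem:

```python
import cv2
import numpy as np

gray = cv2.cvtColor(undistorted, cv2.COLOR_BGR2GRAY)
found, corners = cv2.findChessboardCorners(gray, (cols, rows))
assert found, "checkerboard not visible in the scene image"

# The image is already undistorted, so pass zero distortion coefficients.
_, rvec, tvec = cv2.solvePnP(objp, corners, M, np.zeros(5))

# Assemble T_board2camera = [R t; 0 1].
R, _ = cv2.Rodrigues(rvec)
T_board2camera = np.eye(4)
T_board2camera[:3, :3] = R
T_board2camera[:3, 3] = tvec.ravel()
```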
and 8, acquiring the pose of the workpiece in the image under the image coordinate system according to the calculation in the step 4 and the step 7, namely acquiring a homogeneous transformation matrix from the object coordinate system of the connecting rod to the camera coordinate system as follows:
T obj2camera =T board2camera ·T board2obj -1
and calculating to obtain:
Figure BDA0002789481110000066
step 9, randomly sampling the connecting rod workpiece CAD model 1 to obtain a coordinate value P of a sampling point under an object coordinate system obj The sampling point is shown in FIG. 7, and the coordinate value P of the sampling point in the camera coordinate system is calculated cam
Step 10, in order to evaluate the accuracy of the existing object pose estimation network, the coordinate values of the sampling points under the camera coordinates in the step 9 are calculated by using the object pose result estimated by the network
Figure BDA0002789481110000067
To be provided with
Figure BDA0002789481110000068
And P cam The average distance of the network pose is used as the accuracy index of the network pose estimation. The average distance in this example is 4.337mm, and the error of each sampling point is shown in fig. 8. The average distance describes the error between the object pose result obtained by the network and the pose label provided by the network, and the larger the average distance is, the larger the error of the object pose predicted by the network is, and the accuracy is low.
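Continuing the earlier snippets (T_board2obj and T_board2camera carry over), steps 8-10 reduce to a matrix product, a batch point transform and an average-distance computation. In this sketch P_obj is a placeholder array standing in for points actually sampled from the CAD model 1, and T_est stands for the 4×4 pose returned by a pose estimation network (an assumed input, not defined by the patent):

```python
import numpy as np

# Step 8: third transformation matrix, the object 3D pose label.
T_obj2camera = T_board2camera @ np.linalg.inv(T_board2obj)

# Step 9: map sampled CAD points into the camera frame.
P_obj = np.random.default_rng(0).uniform(-50.0, 50.0, size=(500, 3))  # placeholder
P_obj_h = np.hstack([P_obj, np.ones((len(P_obj), 1))])
P_cam = (T_obj2camera @ P_obj_h.T).T[:, :3]

# Step 10: average distance between the label and a network estimate T_est.
def average_distance(T_est):
    P_cam_est = (T_est @ P_obj_h.T).T[:, :3]
    return float(np.linalg.norm(P_cam - P_cam_est, axis=1).mean())

# e.g. average_distance(T_est); in the embodiment this evaluates to 4.337 mm.
```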
The object pose label acquisition method based on the checkerboard calibration plate 2 provides a low-cost, high-precision way to acquire labels of objects in real scenes. The average distance computed with the label can serve as a loss definition for network training and, at the same time, as an accuracy index for the network's results.
The foregoing detailed description of the preferred embodiments of the invention has been presented. It should be understood that numerous modifications and variations could be devised by those skilled in the art in light of the present teachings without departing from the inventive concept. Therefore, the technical solutions available to those skilled in the art through logic analysis, reasoning and limited experiments based on the prior art according to the concept of the present invention should be within the scope of protection defined by the claims.

Claims (10)

1. An object 3D pose tag obtaining method based on a checkerboard calibration plate is characterized by comprising the following steps:
step 1, establishing a CAD model of an object;
step 2, projecting the CAD model at equal scale to obtain an object contour template;
step 3, aligning the origins of the object contour template and the checkerboard calibration plate template;
step 4, aligning the object with the object contour template, aligning the checkerboard calibration plate with the checkerboard calibration plate template, and calculating the transformation matrix from the object coordinate system to the checkerboard coordinate system to obtain a first transformation matrix;
step 5, calibrating the camera extrinsic parameters, intrinsic parameters and radial distortion parameters;
step 6, moving the camera so that the object and the checkerboard are simultaneously within the camera's field of view, and acquiring a single picture;
step 7, correcting the single picture with the radial distortion parameters, and calculating the transformation matrix from the checkerboard coordinate system to the camera coordinate system to obtain a second transformation matrix;
step 8, calculating the transformation matrix from the object coordinate system to the camera coordinate system from the first transformation matrix and the second transformation matrix to obtain a third transformation matrix, i.e., the object 3D pose label based on the checkerboard calibration plate under the current camera view angle;
step 9, sampling the CAD model to obtain the coordinate values of the sampling points in the object coordinate system, referred to as the sampled object coordinate values, and computing the coordinate values of the sampling points in the camera coordinate system with the third transformation matrix to obtain the sampled camera coordinate values;
step 10, obtaining a pose estimate of the object in the camera coordinate system with an object pose estimation method, computing the coordinate values of the sampling points in the camera coordinate system under that estimate to obtain the sampled estimated coordinate values, and comparing the sampled estimated coordinate values with the sampled camera coordinate values to obtain a pose estimation accuracy index.
2. The checkerboard-based object 3D pose tag acquisition method as claimed in claim 1, wherein said step 3 comprises: selecting template feature points on the object contour template, aligning the template feature points with the origin of the checkerboard calibration plate, and calculating the offsets of the template feature points.
3. The checkerboard-calibration-plate-based object 3D pose tag acquisition method of claim 2, wherein the first transformation matrix is:
[formula image: the 4×4 homogeneous transformation matrix T_board2obj]
where x_f, y_f, z_f are the coordinate values of the template feature point in the object coordinate system, Δx and Δy are the offsets from the template feature point to the origin of the checkerboard calibration plate, and Δh is the thickness of the high-precision checkerboard calibration plate, taken as 0 when a high-precision calibration plate is not used.
4. The checkerboard calibration plate-based object 3D pose tag acquisition method according to claim 3, wherein the camera internal parameters and the radial distortion parameters are respectively:
M = [ f/dx, 0, u_0 ; 0, f/dy, v_0 ; 0, 0, 1 ], k = (k_1, k_2)
where f is the focal length of the industrial camera, dx is the pixel lateral ratio, dy is the pixel longitudinal ratio, u_0, v_0 are the principal point coordinates of the image, and k_1, k_2 are the radial distortion parameters.
5. The checkerboard-calibration-plate-based object 3D pose tag acquisition method of claim 4, wherein the second transformation matrix is:
T_board2camera = [ R, t ; 0, 1 ]
where R is the rotation transformation from the checkerboard coordinate system to the camera coordinate system and t is the translation vector from the checkerboard coordinate system to the camera coordinate system.
6. The checkerboard-calibration-plate-based object 3D pose tag acquisition method as claimed in claim 5, wherein said third transformation matrix is:
T_obj2camera = T_board2camera · T_board2obj^(-1)
7. The checkerboard calibration plate-based object 3D pose tag acquisition method as claimed in claim 6, wherein the sampled object coordinate values are:
P_obj = { P_obj^i }, i = 1, 2, …, N
where P_obj^i is the coordinate value of the i-th sampling point in the object coordinate system.
8. The checkerboard calibration plate-based object 3D pose tag acquisition method as claimed in claim 7, wherein the sampled camera coordinate values are:
P_cam^i = T_obj2camera · P_obj^i
where P_cam^i is the coordinate value of the i-th sampling point in the camera coordinate system, obtained using the third transformation matrix.
9. The checkerboard calibration plate-based object 3D pose tag acquisition method as claimed in claim 8, wherein the sampled estimated coordinate values are:
P̂_cam^i = T̂_obj2camera · P_obj^i
where P̂_cam^i is the coordinate value of the i-th sampling point in the camera coordinate system, calculated using the transformation matrix obtained by the pose estimation method.
10. The checkerboard calibration plate-based object 3D pose tag acquisition method as claimed in claim 9, wherein the pose estimation accuracy index is calculated as:
e = (1/N) Σ_{i=1}^{N} ‖P̂_cam^i − P_cam^i‖
where P̂_cam^i and P_cam^i are, respectively, the sampled estimated coordinate value and the sampled camera coordinate value of the i-th sampling point.
CN202011309922.2A 2020-11-20 2020-11-20 Object 3D pose tag acquisition method based on checkerboard calibration plate Active CN112465898B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011309922.2A CN112465898B (en) 2020-11-20 2020-11-20 Object 3D pose tag acquisition method based on checkerboard calibration plate

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011309922.2A CN112465898B (en) 2020-11-20 2020-11-20 Object 3D pose tag acquisition method based on checkerboard calibration plate

Publications (2)

Publication Number Publication Date
CN112465898A CN112465898A (en) 2021-03-09
CN112465898B (en) 2023-01-03

Family

ID=74798137

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011309922.2A Active CN112465898B (en) 2020-11-20 2020-11-20 Object 3D pose tag acquisition method based on checkerboard calibration plate

Country Status (1)

Country Link
CN (1) CN112465898B (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110555889A (en) * 2019-08-27 2019-12-10 西安交通大学 CALTag and point cloud information-based depth camera hand-eye calibration method
CN110653820A (en) * 2019-09-29 2020-01-07 东北大学 Robot grabbing pose estimation method combined with geometric constraint
CN110689579A (en) * 2019-10-18 2020-01-14 华中科技大学 Rapid monocular vision pose measurement method and measurement system based on cooperative target

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB2458927B (en) * 2008-04-02 2012-11-14 Eykona Technologies Ltd 3D Imaging system
CN108555908B (en) * 2018-04-12 2020-07-28 同济大学 Stacked workpiece posture recognition and pickup method based on RGBD camera
CN109344882B (en) * 2018-09-12 2021-05-25 浙江科技学院 Convolutional neural network-based robot control target pose identification method
CN110375648A (en) * 2019-08-05 2019-10-25 华南农业大学 The spatial point three-dimensional coordinate measurement method that the single camera of gridiron pattern target auxiliary is realized
CN110706285A (en) * 2019-10-08 2020-01-17 中国人民解放军陆军工程大学 Object pose prediction method based on CAD model
CN111220126A (en) * 2019-11-19 2020-06-02 中国科学院光电技术研究所 Space object pose measurement method based on point features and monocular camera
CN111768447B (en) * 2020-07-01 2024-03-01 合肥哈工慧拣智能科技有限公司 Monocular camera object pose estimation method and system based on template matching

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110555889A (en) * 2019-08-27 2019-12-10 西安交通大学 CALTag and point cloud information-based depth camera hand-eye calibration method
CN110653820A (en) * 2019-09-29 2020-01-07 东北大学 Robot grabbing pose estimation method combined with geometric constraint
CN110689579A (en) * 2019-10-18 2020-01-14 华中科技大学 Rapid monocular vision pose measurement method and measurement system based on cooperative target

Also Published As

Publication number Publication date
CN112465898A (en) 2021-03-09

Similar Documents

Publication Publication Date Title
CN111775152B (en) Method and system for guiding mechanical arm to grab scattered stacked workpieces based on three-dimensional measurement
JP4785880B2 (en) System and method for 3D object recognition
JP3735344B2 (en) Calibration apparatus, calibration method, and calibration program
CN109816704A (en) The 3 D information obtaining method and device of object
US20160214255A1 (en) Method for calibrating an articulated end effector employing a remote digital camera
CN107588721A (en) The measuring method and system of a kind of more sizes of part based on binocular vision
JPH10253322A (en) Method and apparatus for designating position of object in space
US20090274371A1 (en) Efficient model-based recognition of objects using a calibrated image system
CN115345822A (en) Automatic three-dimensional detection method for surface structure light of aviation complex part
Malassiotis et al. Stereo vision system for precision dimensional inspection of 3D holes
Yan et al. Joint camera intrinsic and lidar-camera extrinsic calibration
CN113269723A (en) Unordered grasping system for three-dimensional visual positioning and mechanical arm cooperative work parts
CN115187612A (en) Plane area measuring method, device and system based on machine vision
CN105787464A (en) A viewpoint calibration method of a large number of pictures in a three-dimensional scene
CN111462246A (en) Equipment calibration method of structured light measurement system
CN110853103B (en) Data set manufacturing method for deep learning attitude estimation
CN112465898B (en) Object 3D pose tag acquisition method based on checkerboard calibration plate
CN110458951B (en) Modeling data acquisition method and related device for power grid pole tower
CN113221953A (en) Target attitude identification system and method based on example segmentation and binocular depth estimation
Bauer et al. Free-form surface analysis and linking strategies for high registration accuracy in quality assurance applications
CN115719377A (en) Automatic acquisition system for pose estimation data set with six degrees of freedom
CN115457130A (en) Electric vehicle charging port detection and positioning method based on depth key point regression
Liang et al. An integrated camera parameters calibration approach for robotic monocular vision guidance
CN114972451A (en) Rotation-invariant SuperGlue matching-based remote sensing image registration method
Lynch et al. Backpropagation neural network for stereoscopic vision calibration

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant