WO2017107565A1 - Method and system for determining the mapping relationship between camera coordinates and robot arm gripper coordinates - Google Patents

Method and system for determining the mapping relationship between camera coordinates and robot arm gripper coordinates

Info

Publication number
WO2017107565A1
WO2017107565A1 (PCT/CN2016/098233)
Authority
WO
WIPO (PCT)
Prior art keywords
coordinates
camera
robot arm
coordinate
reference object
Prior art date
Application number
PCT/CN2016/098233
Other languages
English (en)
French (fr)
Inventor
杨铭
Original Assignee
广州视源电子科技股份有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 广州视源电子科技股份有限公司 filed Critical 广州视源电子科技股份有限公司
Publication of WO2017107565A1 publication Critical patent/WO2017107565A1/zh

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/0002 Inspection of images, e.g. flaw detection
    • G06T7/0004 Industrial image inspection
    • G06T7/001 Industrial image inspection using an image reference approach
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10004 Still image; Photographic image
    • G06T2207/30 Subject of image; Context of image processing
    • G06T2207/30108 Industrial image inspection
    • G06T2207/30141 Printed circuit board [PCB]

Definitions

  • The invention relates to the technical field of robot arms, and in particular to a method and system for determining the mapping relationship between camera coordinates and robot arm gripper coordinates.
  • In a vision-based robot arm grasping system, in order to control the robot arm gripper according to visual information, a mapping relationship between camera coordinates and robot arm gripper coordinates must first be established.
  • In the conventional solution, the mapping between camera coordinates and world coordinates is determined by camera calibration, the mapping between world coordinates and robot arm gripper coordinates is then determined, and the mapping between camera coordinates and gripper coordinates is finally derived from the two.
  • This conventional solution has two shortcomings: (1) world coordinates are difficult to measure accurately; (2) the estimates of the camera-to-world mapping and the world-to-gripper mapping each contain errors and are measured independently, so the derived mapping between camera coordinates and gripper coordinates accumulates error.
  • The present invention provides a method and system for determining the mapping relationship between camera coordinates and robot arm gripper coordinates, which can improve the accuracy of that mapping.
  • The method of the present invention comprises the following steps:
  • the same reference object is grasped and placed by the robot arm at a designated position within the camera's imaging range, and the robot arm gripper coordinates corresponding to the designated position are read;
  • an image of the reference object at the designated position is captured by the camera, the feature points of the image are computed, and the feature points are matched with those of a preset template image of the reference object to obtain the center coordinates of the reference object in the image as the camera coordinates corresponding to the designated position;
  • the mapping relationship between camera coordinates and gripper coordinates is determined from the camera coordinates and gripper coordinates corresponding to several different designated positions.
  • By determining the gripper coordinates and the camera coordinates corresponding to each designated position, the method determines the mapping directly from the coordinate pairs of several different designated positions, with no intermediate world-coordinate step.
  • The invention also provides a system for determining the mapping relationship between camera coordinates and robot arm gripper coordinates, comprising:
  • a robot arm gripper coordinate acquisition module, configured to grasp and place the same reference object through the robot arm at a designated position within the camera's imaging range, and to read the gripper coordinates corresponding to the designated position;
  • a camera coordinate acquisition module, configured to capture an image of the reference object at the designated position through the camera, to compute the feature points of the image, and to match the feature points of the image with those of a preset template image of the reference object to obtain the center coordinates of the reference object in the image as the camera coordinates corresponding to the designated position;
  • a mapping relationship determination module, configured to determine the mapping relationship between camera coordinates and gripper coordinates based on the camera coordinates and gripper coordinates corresponding to several different designated positions.
  • In this system, the gripper coordinate acquisition module obtains the gripper coordinates corresponding to a designated position, and the camera coordinate acquisition module obtains the corresponding camera coordinates.
  • The mapping relationship determination module then determines the mapping directly from the coordinate pairs of several different designated positions. This overcomes the shortcomings of the conventional solution, namely that world coordinates are difficult to measure accurately and that the derived mapping accumulates error, thereby improving the accuracy of the mapping between camera coordinates and gripper coordinates.
  • FIG. 1 is a schematic flow chart of a method for determining the mapping relationship between camera coordinates and robot arm gripper coordinates according to an embodiment;
  • FIG. 2 is a schematic flow chart of the method according to a preferred embodiment;
  • FIG. 3 is a schematic diagram of row-by-row placement from the upper-left corner to the lower-right corner of the image captured by the camera;
  • FIG. 4 is a schematic structural diagram of a system for determining the mapping relationship between camera coordinates and robot arm gripper coordinates according to an embodiment.
  • An object with distinct texture features is selected as the reference object.
  • The same reference object is used throughout and is grasped and placed by the robot arm at each designated position within the camera's imaging range; this reduces the error of the center coordinates of the reference object in the subsequent images, further improving the accuracy of the camera coordinates determined later.
  • Before placement begins, a photograph of the reference object is taken as the template image, and the feature points of the template image are computed.
  • This provides a reference for matching subsequent image feature points, so that the camera coordinates of the reference object at each designated position can be obtained accurately.
  • Preferably, the SIFT or SURF local operator is used to compute the feature points of the template image, to improve their accuracy.
  • Preferably, the photograph of the reference object is taken against a solid-color background, to reduce interference from the background when computing the template feature points.
  • When the robot arm grasps and places the same reference object within the camera's imaging range, it may place the object at several different designated positions in the following order: row by row, from the upper-left corner to the lower-right corner of the image captured by the camera. This yields gripper coordinates for a regular grid of designated positions and reduces the computation required later for the mapping between camera coordinates and gripper coordinates.
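The row-by-row placement order described above can be sketched as a small helper. The grid origin and spacing used here are illustrative assumptions, not values from the patent:

```python
def placement_order(rows, cols, x0=0.0, y0=0.0, dx=50.0, dy=50.0):
    """Return designated positions row by row, from the upper-left corner
    to the lower-right corner of the camera image.
    x0, y0, dx, dy are hypothetical workspace offsets/spacings (e.g. mm)."""
    positions = []
    for r in range(rows):          # top row first
        for c in range(cols):      # left to right within each row
            positions.append((x0 + c * dx, y0 + r * dy))
    return positions

grid = placement_order(3, 4)       # 3 rows x 4 columns of designated positions
```

Feeding the gripper to each position in this fixed order keeps the recorded gripper coordinates in a regular pattern, which is what the text credits with reducing the later mapping computation.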
  • In step S101, specifically, the gripper coordinates can be read through the robot arm system interface as the gripper coordinates corresponding to the designated position.
  • Because an image has far fewer feature points than pixels, matching based on feature points greatly reduces the computation of the matching process compared with matching based on image grayscale information. At the same time, the matching metric of feature points is more sensitive to changes in position, which can greatly improve matching accuracy; moreover, the feature-point extraction process reduces the influence of noise and adapts well to grayscale changes, image deformation and occlusion. Therefore, matching the feature points of the image against those of the preset template image of the reference object yields the center coordinates of the reference object in the image with high accuracy.
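Descriptor-based feature matching of the kind described above is usually done by nearest-neighbor search over descriptor vectors. A minimal sketch follows; the patent only specifies that feature points are matched, so the ratio test used here to discard ambiguous matches is an assumption borrowed from common SIFT practice:

```python
import numpy as np

def match_descriptors(template_desc, image_desc, ratio=0.75):
    """Match each template descriptor to its nearest image descriptor,
    keeping a match only when the nearest neighbor is clearly better than
    the second nearest (Lowe's ratio test, an illustrative choice).
    Assumes image_desc has at least two rows.
    Returns a list of (template_index, image_index) pairs."""
    matches = []
    for i, d in enumerate(template_desc):
        dists = np.linalg.norm(image_desc - d, axis=1)  # distance to every image descriptor
        j1, j2 = np.argsort(dists)[:2]                  # nearest and second nearest
        if dists[j1] < ratio * dists[j2]:
            matches.append((i, j1))
    return matches
```

In a real pipeline the descriptors would come from SIFT or SURF as the text suggests; here they are just vectors, so the function works on any synthetic data.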
  • In step S103, specifically, the feature points of the image are matched with the feature points of the preset template image of the reference object, matches whose confidence is below a preset value are filtered out, and the center coordinates of the reference object in the image are obtained, thereby improving the estimation accuracy of the camera coordinates corresponding to the designated position.
  • Preferably, low-confidence matches are filtered with the RANSAC algorithm, so that the center coordinates of the reference object in the image are estimated accurately as the camera coordinates of the reference object at the designated position.
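The RANSAC filtering step can be sketched with a deliberately simple motion model. For illustration the template-to-image motion is assumed to be a pure 2D translation; real SIFT/SURF pipelines typically fit an affine transform or homography instead, but the inlier-voting logic is the same:

```python
import numpy as np

def ransac_translation(src, dst, trials=100, tol=2.0, seed=0):
    """Minimal RANSAC sketch over matched 2D points.
    Model: dst ~ src + t (pure translation, an assumption for illustration).
    src, dst: (N, 2) arrays of matched points.
    Returns (estimated translation, boolean inlier mask)."""
    rng = np.random.default_rng(seed)
    best_mask = np.zeros(len(src), dtype=bool)
    for _ in range(trials):
        i = rng.integers(len(src))                     # one pair determines a translation
        t = dst[i] - src[i]
        mask = np.linalg.norm(dst - (src + t), axis=1) < tol
        if mask.sum() > best_mask.sum():
            best_mask = mask
    # refit on the inliers for a more accurate final estimate
    best_t = (dst[best_mask] - src[best_mask]).mean(axis=0)
    return best_t, best_mask
```

Matches rejected by the mask play the role of the "low-confidence matches" the text says are filtered out before the object center is estimated.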
  • S104: Determine the mapping relationship between camera coordinates and robot arm gripper coordinates based on the camera coordinates and gripper coordinates corresponding to several different designated positions.
  • The mapping relationship can be determined by establishing the matrix relationship between the camera coordinates and the gripper coordinates corresponding to each designated position, where
  • the camera coordinates are (u, v),
  • the robot arm gripper coordinates are (X, Y),
  • and the matrix is solved by least squares from multiple sets of (u, v) and their corresponding (X, Y);
  • the resulting matrix is used as the mapping matrix between camera coordinates and robot arm gripper coordinates.
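The least-squares fit of the mapping matrix can be sketched with numpy. The patent's matrix equation is only reproduced as an image, so a 2x3 affine model mapping (u, v) to (X, Y) is assumed here; the actual matrix in the patent may have a different shape:

```python
import numpy as np

def fit_mapping_matrix(uv, xy):
    """Fit a mapping from camera coordinates (u, v) to gripper coordinates
    (X, Y) by least squares, assuming an affine model
    [X, Y]^T = M @ [u, v, 1]^T (the patent's exact form is not shown in text).
    uv, xy: (N, 2) arrays of corresponding coordinates. Returns M with shape (2, 3)."""
    A = np.column_stack([uv, np.ones(len(uv))])   # (N, 3) homogeneous camera coords
    M, *_ = np.linalg.lstsq(A, xy, rcond=None)    # solves A @ M ~ xy in least squares
    return M.T                                    # (2, 3) mapping matrix

def apply_mapping(M, uv):
    """Map camera coordinates to gripper coordinates with the fitted matrix."""
    uv = np.atleast_2d(uv)
    return (M @ np.column_stack([uv, np.ones(len(uv))]).T).T
```

With at least three non-collinear designated positions the system is determined; using more positions, as the text recommends, averages out measurement noise through the least-squares solve.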
  • The method determines the gripper coordinates and the camera coordinates corresponding to each designated position and, based on the coordinate pairs of several different designated positions, directly determines the mapping relationship between camera coordinates and gripper coordinates. This overcomes the shortcomings of the conventional solution, namely that world coordinates are difficult to measure accurately and that the mapping accumulates error, thereby improving the accuracy of the mapping between camera coordinates and robot arm gripper coordinates.
  • In a PCB pick-and-place system, the PCB is positioned visually, so the mapping between camera coordinates and gripper coordinates must first be determined. The method of the present invention can be applied directly to this scenario, determining the mapping accurately in a low-cost, fully automatic manner, so that the robot arm can pick and place PCBs precisely in subsequent tasks.
  • The present embodiment comprises three parts: reference object preparation, reference object coordinate acquisition, and mapping matrix computation.
  • The reference object to be grasped is selected, a photograph of it is taken as the template image, and the feature points of the template image are computed.
  • A reference object with distinct texture features is selected.
  • The feature points of the template image can be computed with local operators such as SIFT/SURF.
  • After the robot arm places the reference object at a designated position, the gripper coordinates are read through the robot arm system interface and used as the gripper coordinates of the reference point.
  • An image containing the reference object is captured by the camera, and the feature points of the image are computed. The feature points are then matched with the precomputed template feature points, and low-confidence matches are filtered with the RANSAC algorithm, so that the center coordinates of the reference object in the image are estimated accurately as the camera coordinates of the reference object.
  • The mapping relationship between camera coordinates and gripper coordinates is determined based on the camera coordinates and gripper coordinates corresponding to several different designated positions.
  • The resulting matrix is used as the mapping matrix between camera coordinates and gripper coordinates.
  • The mapping relationship between camera coordinates and gripper coordinates is thus determined directly from the camera coordinates and gripper coordinates corresponding to the different designated positions.
  • The invention also provides a system for determining the mapping relationship between camera coordinates and robot arm gripper coordinates, as shown in FIG. 4, comprising:
  • a robot arm gripper coordinate acquisition module 401, configured to grasp and place the same reference object through the robot arm at a designated position within the camera's imaging range, and to read the gripper coordinates corresponding to the designated position;
  • a camera coordinate acquisition module 402, configured to capture an image of the reference object at the designated position through the camera, to compute the feature points of the image, and to match the feature points of the image with those of the preset template image of the reference object to obtain the center coordinates of the reference object in the image as the camera coordinates corresponding to the designated position;
  • a mapping relationship determination module 403, configured to determine the mapping relationship between camera coordinates and gripper coordinates based on the camera coordinates and gripper coordinates corresponding to several different designated positions.
  • In one embodiment, the gripper coordinate acquisition module is further configured to take a photograph of the reference object as the template image and compute its feature points before the same reference object is grasped and placed by the robot arm at a designated position within the camera's imaging range.
  • This provides a reference for matching subsequent image feature points, so that the camera coordinates of the reference object at each designated position can be obtained accurately.
  • Preferably, the SIFT or SURF local operator is used to compute the feature points of the template image, to improve their accuracy.
  • Preferably, the photograph of the reference object is taken against a solid-color background, to reduce interference from the background when computing the template feature points.
  • In one embodiment, the gripper coordinate acquisition module includes a designated-position placement module, used to grasp and place the same reference object through the robot arm at several different designated positions within the camera's imaging range. The order of placement is row by row, from the upper-left corner to the lower-right corner of the camera image, yielding gripper coordinates for a regular grid of designated positions and reducing the computation required later for the mapping between camera coordinates and gripper coordinates.
  • In one embodiment, the gripper coordinate acquisition module further includes a robot arm system interface module, used to read the gripper coordinates through the robot arm system interface as the gripper coordinates corresponding to the designated position.
  • In one embodiment, the camera coordinate acquisition module includes a matching module, configured to match the feature points of the image with the feature points of the preset template image of the reference object, to filter out matches whose confidence is below a preset value, and to obtain the center coordinates of the reference object in the image, thereby improving the estimation accuracy of the camera coordinates corresponding to the designated position.
  • Preferably, low-confidence matches are filtered with the RANSAC algorithm, so that the center coordinates of the reference object in the image are estimated accurately as the camera coordinates of the reference object at the designated position.
  • In one embodiment, the camera coordinate acquisition module further includes:
  • a matrix calculation module, configured to establish the matrix relationship between the camera coordinates and the gripper coordinates corresponding to each designated position, where
  • the camera coordinates are (u, v),
  • the robot arm gripper coordinates are (X, Y),
  • and the matrix is solved by least squares from multiple sets of (u, v) and their corresponding (X, Y);
  • the resulting matrix is used as the mapping matrix between camera coordinates and gripper coordinates.
  • In this system, the gripper coordinate acquisition module 401 obtains the gripper coordinates corresponding to a designated position, the camera coordinate acquisition module 402 obtains the corresponding camera coordinates, and the mapping relationship determination module 403 directly determines the mapping relationship between camera coordinates and gripper coordinates from the coordinate pairs of several different designated positions. This overcomes the shortcomings of the conventional solution, namely that world coordinates are difficult to measure accurately and that the mapping accumulates error, thereby improving the accuracy of the mapping between camera coordinates and robot arm gripper coordinates.

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Quality & Reliability (AREA)
  • Manipulator (AREA)
  • Image Analysis (AREA)

Abstract

A method and system for determining the mapping relationship between camera coordinates and robot arm gripper coordinates. By determining the robot arm gripper coordinates corresponding to a designated position and the corresponding camera coordinates, the mapping relationship between camera coordinates and robot arm gripper coordinates is determined directly from the camera coordinates and gripper coordinates corresponding to several different designated positions. This overcomes the shortcomings of the conventional solution, namely that world coordinates are difficult to measure accurately and that the mapping between camera coordinates and gripper coordinates accumulates error, thereby improving the accuracy of the mapping relationship between camera coordinates and robot arm gripper coordinates.

Description

Method and system for determining the mapping relationship between camera coordinates and robot arm gripper coordinates
Technical field
The present invention relates to the technical field of robot arms, and in particular to a method and system for determining the mapping relationship between camera coordinates and robot arm gripper coordinates.
Background
In a vision-based robot arm grasping system, in order to control the robot arm gripper according to visual information, a mapping relationship between camera coordinates and robot arm gripper coordinates must first be established. In the conventional solution, this mapping is found by first determining the mapping between camera coordinates and world coordinates through camera calibration, then determining the mapping between world coordinates and robot arm gripper coordinates, and finally deriving the mapping between camera coordinates and gripper coordinates. This solution has two shortcomings: (1) world coordinates are difficult to measure accurately; (2) the estimates of the camera-to-world mapping and the world-to-gripper mapping each contain errors and are measured independently, so the derived mapping between camera coordinates and gripper coordinates accumulates error.
Summary of the invention
In view of the above problems in the prior art, the present invention provides a method and system for determining the mapping relationship between camera coordinates and robot arm gripper coordinates, which can improve the accuracy of that mapping relationship.
The method of the present invention for determining the mapping relationship between camera coordinates and robot arm gripper coordinates comprises:
grasping and placing the same reference object with a robot arm at a designated position within the camera's imaging range, and reading the robot arm gripper coordinates corresponding to the designated position;
capturing an image of the reference object at the designated position with the camera, and computing feature points of the image; matching the feature points of the image with feature points of a preset template image of the reference object to obtain the center coordinates of the reference object in the image as the camera coordinates corresponding to the designated position;
determining the mapping relationship between camera coordinates and robot arm gripper coordinates based on the camera coordinates and gripper coordinates corresponding to several different designated positions.
By determining the robot arm gripper coordinates and the camera coordinates corresponding to each designated position, the method determines the mapping relationship directly from the camera coordinates and gripper coordinates of several different designated positions, overcoming the above shortcomings of the conventional solution and thereby improving the accuracy of the mapping relationship.
The present invention also provides a system for determining the mapping relationship between camera coordinates and robot arm gripper coordinates, comprising:
a robot arm gripper coordinate acquisition module, configured to grasp and place the same reference object through the robot arm at a designated position within the camera's imaging range, and to read the robot arm gripper coordinates corresponding to the designated position;
a camera coordinate acquisition module, configured to capture an image of the reference object at the designated position through the camera, to compute feature points of the image, and to match the feature points of the image with feature points of a preset template image of the reference object to obtain the center coordinates of the reference object in the image as the camera coordinates corresponding to the designated position;
a mapping relationship determination module, configured to determine the mapping relationship between camera coordinates and robot arm gripper coordinates based on the camera coordinates and gripper coordinates corresponding to several different designated positions.
In the system of the present invention, the robot arm gripper coordinate acquisition module obtains the gripper coordinates corresponding to a designated position, the camera coordinate acquisition module obtains the camera coordinates corresponding to that position, and the mapping relationship determination module directly determines the mapping relationship between camera coordinates and gripper coordinates from the coordinate pairs of several different designated positions. This overcomes the shortcomings of the conventional solution, namely that world coordinates are difficult to measure accurately and that the mapping between camera coordinates and gripper coordinates accumulates error, thereby improving the accuracy of the mapping relationship.
Brief description of the drawings
FIG. 1 is a schematic flow chart of a method for determining the mapping relationship between camera coordinates and robot arm gripper coordinates according to an embodiment;
FIG. 2 is a schematic flow chart of the method according to a preferred embodiment;
FIG. 3 is a schematic diagram of row-by-row placement from the upper-left corner to the lower-right corner of the image captured by the camera;
FIG. 4 is a schematic structural diagram of a system for determining the mapping relationship between camera coordinates and robot arm gripper coordinates according to an embodiment.
Detailed description
To make the objectives, technical solution and advantages of the present invention clearer, the invention is described in further detail below with reference to the accompanying drawings.
Referring to FIG. 1, the method for determining the mapping relationship between camera coordinates and robot arm gripper coordinates according to an embodiment comprises steps S101 to S104:
S101: grasp and place the same reference object with the robot arm at a designated position within the camera's imaging range, and read the robot arm gripper coordinates corresponding to the designated position.
To improve the accuracy of the camera coordinates of the reference object at the designated position, an object with distinct texture features is selected as the reference object. Using the same reference object, grasped and placed by the robot arm at each designated position within the camera's imaging range, reduces the error of the center coordinates of the reference object in the subsequent images, further improving the accuracy of the camera coordinates determined later.
Before step S101, a photograph of the reference object is taken as the template image, and the feature points of the template image are computed. This provides a reference for matching subsequent image feature points, so that the camera coordinates corresponding to the reference object at each designated position can be obtained accurately. Preferably, the SIFT or SURF local operator is used to compute the feature points of the template image, to improve their accuracy. Preferably, the photograph of the reference object is taken against a solid-color background, to reduce interference from the background when computing the template feature points.
Further, grasping and placing the same reference object at a designated position within the camera's imaging range may consist of placing the same reference object at several different designated positions within the camera's imaging range, in the following order: row by row, from the upper-left corner to the lower-right corner of the image captured by the camera. This yields robot arm gripper coordinates for a regular grid of designated positions, reducing the computation required later for the mapping between camera coordinates and gripper coordinates.
In step S101, specifically, the robot arm gripper coordinates can be read through the robot arm system interface as the gripper coordinates corresponding to the designated position.
S102: capture an image of the reference object at the designated position with the camera, and compute the feature points of the image.
S103: match the feature points of the image with the feature points of the preset template image of the reference object, and obtain the center coordinates of the reference object in the image as the camera coordinates corresponding to the designated position.
Because an image has far fewer feature points than pixels, matching based on feature points greatly reduces the computation of the matching process compared with matching based on image grayscale information. At the same time, the matching metric of feature points is more sensitive to changes in position, which can greatly improve matching accuracy; moreover, the feature-point extraction process reduces the influence of noise and adapts well to grayscale changes, image deformation and occlusion. Therefore, this step, which matches the feature points of the image with those of the preset template image, obtains the center coordinates of the reference object in the image with high accuracy.
In step S103, specifically, the feature points of the image are matched with the feature points of the preset template image of the reference object, matches whose confidence is below a preset value are filtered out, and the center coordinates of the reference object in the image are obtained, thereby improving the estimation accuracy of the camera coordinates corresponding to the designated position.
Preferably, low-confidence matches are filtered with the RANSAC algorithm, so that the center coordinates of the reference object in the image are estimated accurately as the camera coordinates of the reference object at the designated position.
S104: determine the mapping relationship between camera coordinates and robot arm gripper coordinates based on the camera coordinates and gripper coordinates corresponding to several different designated positions.
Specifically, the mapping relationship can be determined as follows:
establish the matrix relationship between the camera coordinates and the robot arm gripper coordinates corresponding to each designated position:
Figure PCTCN2016098233-appb-000001
where the camera coordinates are (u, v) and the robot arm gripper coordinates are (X, Y); from multiple sets of (u, v) and their corresponding (X, Y), solve for the matrix by least squares:
Figure PCTCN2016098233-appb-000002
The resulting matrix is used as the mapping matrix between camera coordinates and robot arm gripper coordinates.
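The two equations above survive only as image placeholders (appb-000001/000002). As a hedged reconstruction, if the camera-to-gripper mapping is assumed to be a 2D affine transform, the relationship and its least-squares solution would take the following form; the patent's actual matrix may differ:

```latex
% Sketch under the assumption of an affine camera-to-gripper mapping;
% the patent's matrix equations are rendered only as images.
\begin{aligned}
\begin{pmatrix} X \\ Y \end{pmatrix}
  &= M \begin{pmatrix} u \\ v \\ 1 \end{pmatrix},
  \qquad M \in \mathbb{R}^{2\times 3},\\[4pt]
M &= \arg\min_{M} \sum_{i=1}^{n}
  \left\| \begin{pmatrix} X_i \\ Y_i \end{pmatrix}
  - M \begin{pmatrix} u_i \\ v_i \\ 1 \end{pmatrix} \right\|^2
  = B A^{\top} \left( A A^{\top} \right)^{-1},
\end{aligned}
```

where the columns of A are the homogeneous camera coordinates (u_i, v_i, 1)^T and the columns of B are the gripper coordinates (X_i, Y_i)^T of the n designated positions.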
By determining the robot arm gripper coordinates and camera coordinates corresponding to each designated position, the method of the present invention directly determines the mapping relationship between camera coordinates and gripper coordinates from the coordinate pairs of several different designated positions, overcoming the shortcomings of the conventional solution, namely that world coordinates are difficult to measure accurately and that the mapping accumulates error, thereby improving the accuracy of the mapping relationship between camera coordinates and robot arm gripper coordinates.
An application scenario of the method of the present invention follows:
In a PCB pick-and-place system, the PCB is positioned visually, so the mapping relationship between camera coordinates and robot arm gripper coordinates must be determined first. The method of the present invention can be applied directly to this scenario, determining the mapping accurately in a low-cost, fully automatic manner, so that the robot arm can pick and place PCBs precisely in subsequent tasks.
A preferred embodiment of the method for determining the mapping relationship between camera coordinates and robot arm gripper coordinates is described below, as shown in FIG. 2:
This embodiment comprises three parts: reference object preparation, reference object coordinate acquisition, and mapping matrix computation.
1. Reference object preparation
Select the reference object to be grasped and placed, take a photograph of it as the template image, and compute the feature points of the template image. To improve subsequent positioning accuracy, select a reference object with distinct texture features. When photographing the template image, use a clean background to reduce interference from the background with the subsequent positioning algorithm; the feature points of the template image can be computed with local operators such as SIFT/SURF.
2. Reference object coordinate acquisition
This part comprises three steps:
(1) Grasp and place the same reference object with the robot arm at a designated position within the camera's imaging range.
The same reference object (e.g., a PCB) is grasped and placed at different positions within the camera's imaging range. The robot arm places the object row by row, from the upper-left corner to the lower-right corner of the image captured by the camera, as shown in FIG. 3.
(2) Read the robot arm gripper coordinates corresponding to the designated position.
After the robot arm places the reference object at a designated position, the gripper coordinates are read through the robot arm system interface as the gripper coordinates of the reference point.
(3) Locate the reference object in the image to obtain the camera coordinates.
Capture an image of the reference object at the designated position with the camera and compute the feature points of the image; match the feature points of the image with the feature points of the preset template image of the reference object to obtain the center coordinates of the reference object in the image as the camera coordinates corresponding to the designated position.
After the robot arm places the reference object at a designated position, the camera captures an image containing the reference object and the feature points of the image are computed. The feature points of the image are then matched with the precomputed template feature points, and low-confidence matches are filtered with the RANSAC algorithm, so that the center coordinates of the reference object in the image are estimated accurately as the camera coordinates corresponding to the designated position.
3. Mapping matrix computation
Determine the mapping relationship between camera coordinates and robot arm gripper coordinates based on the camera coordinates and gripper coordinates corresponding to several different designated positions.
Establish the mapping between the camera coordinates (u, v) and the robot arm gripper coordinates (X, Y):
Figure PCTCN2016098233-appb-000003
From multiple sets of (u, v) and their corresponding (X, Y), solve for the matrix by least squares:
Figure PCTCN2016098233-appb-000004
Use the resulting matrix as the mapping matrix between camera coordinates and robot arm gripper coordinates.
By determining the robot arm gripper coordinates and camera coordinates corresponding to each designated position, this embodiment directly determines the mapping relationship between camera coordinates and gripper coordinates from the coordinate pairs of several different designated positions, overcoming the shortcomings of the conventional solution, namely that world coordinates are difficult to measure accurately and that the mapping accumulates error, thereby improving the accuracy of the mapping relationship between camera coordinates and robot arm gripper coordinates.
The present invention also provides a system for determining the mapping relationship between camera coordinates and robot arm gripper coordinates, as shown in FIG. 4, comprising:
a robot arm gripper coordinate acquisition module 401, configured to grasp and place the same reference object through the robot arm at a designated position within the camera's imaging range, and to read the robot arm gripper coordinates corresponding to the designated position;
a camera coordinate acquisition module 402, configured to capture an image of the reference object at the designated position through the camera, to compute feature points of the image, and to match the feature points of the image with feature points of the preset template image of the reference object to obtain the center coordinates of the reference object in the image as the camera coordinates corresponding to the designated position;
a mapping relationship determination module 403, configured to determine the mapping relationship between camera coordinates and robot arm gripper coordinates based on the camera coordinates and gripper coordinates corresponding to several different designated positions.
In one embodiment, the robot arm gripper coordinate acquisition module is further configured to take a photograph of the reference object as the template image and compute its feature points before the same reference object is grasped and placed by the robot arm at a designated position within the camera's imaging range. This provides a reference for matching subsequent image feature points, so that the camera coordinates corresponding to the reference object at each designated position can be obtained accurately. Preferably, the SIFT or SURF local operator is used to compute the feature points of the template image, to improve their accuracy. Preferably, the photograph of the reference object is taken against a solid-color background, to reduce interference from the background when computing the template feature points.
In one embodiment, the robot arm gripper coordinate acquisition module includes a designated-position placement module, used to grasp and place the same reference object through the robot arm at several different designated positions within the camera's imaging range, in the following order: row by row, from the upper-left corner to the lower-right corner of the image captured by the camera. This yields gripper coordinates for a regular grid of designated positions, reducing the computation required later for the mapping between camera coordinates and gripper coordinates.
In one embodiment, the robot arm gripper coordinate acquisition module further includes a robot arm system interface module, used to read the gripper coordinates through the robot arm system interface as the gripper coordinates corresponding to the designated position.
In one embodiment, the camera coordinate acquisition module includes a matching module, configured to match the feature points of the image with the feature points of the preset template image of the reference object, to filter out matches whose confidence is below a preset value, and to obtain the center coordinates of the reference object in the image, thereby improving the estimation accuracy of the camera coordinates corresponding to the designated position.
Preferably, low-confidence matches are filtered with the RANSAC algorithm, so that the center coordinates of the reference object in the image are estimated accurately as the camera coordinates of the reference object at the designated position.
In one embodiment, the camera coordinate acquisition module further includes:
a matrix calculation module, configured to establish the matrix relationship between the camera coordinates and the robot arm gripper coordinates corresponding to each designated position:
Figure PCTCN2016098233-appb-000005
where the camera coordinates are (u, v) and the robot arm gripper coordinates are (X, Y); from multiple sets of (u, v) and their corresponding (X, Y), solve for the matrix by least squares:
Figure PCTCN2016098233-appb-000006
The resulting matrix is used as the mapping matrix between camera coordinates and robot arm gripper coordinates.
In the system of the present invention, the robot arm gripper coordinate acquisition module 401 obtains the gripper coordinates corresponding to a designated position, the camera coordinate acquisition module 402 obtains the corresponding camera coordinates, and the mapping relationship determination module 403 directly determines the mapping relationship between camera coordinates and gripper coordinates from the coordinate pairs of several different designated positions. This overcomes the shortcomings of the conventional solution, namely that world coordinates are difficult to measure accurately and that the mapping between camera coordinates and gripper coordinates accumulates error, thereby improving the accuracy of the mapping relationship.
The technical features of the above embodiments can be combined arbitrarily. For brevity, not all possible combinations of the technical features of the above embodiments have been described; nevertheless, as long as a combination of these technical features involves no contradiction, it should be considered within the scope of this specification.
The above embodiments express only several implementations of the present invention, and their description is relatively specific and detailed, but they should not therefore be construed as limiting the scope of the patent. It should be noted that a person of ordinary skill in the art can make several variations and improvements without departing from the concept of the present invention, all of which fall within the scope of protection of the invention. Therefore, the scope of protection of this patent shall be defined by the appended claims.

Claims (10)

  1. A method for determining a mapping relationship between camera coordinates and robot arm gripper coordinates, characterized by comprising:
    grasping and placing the same reference object with a robot arm at a designated position within a camera's imaging range, and reading the robot arm gripper coordinates corresponding to the designated position;
    capturing an image of the reference object at the designated position with the camera, and computing feature points of the image; matching the feature points of the image with feature points of a preset template image of the reference object, and obtaining center coordinates of the reference object in the image as the camera coordinates corresponding to the designated position;
    determining the mapping relationship between camera coordinates and robot arm gripper coordinates based on the camera coordinates and robot arm gripper coordinates corresponding to several different designated positions.
  2. The method for determining a mapping relationship between camera coordinates and robot arm gripper coordinates according to claim 1, characterized in that, before the same reference object is grasped and placed by the robot arm at a designated position within the camera's imaging range, the method comprises:
    taking a photograph of the reference object as the template image, and computing the feature points of the template image.
  3. The method according to claim 2, characterized in that taking a photograph of the reference object as the template image comprises:
    taking a photograph of the reference object against a solid-color background as the template image.
  4. The method according to claim 1, characterized in that grasping and placing the same reference object with the robot arm at a designated position within the camera's imaging range comprises:
    grasping and placing the same reference object with the robot arm at several different designated positions within the camera's imaging range, in the following order: row by row, from the upper-left corner to the lower-right corner of the image captured by the camera.
  5. The method according to claim 1, characterized in that reading the robot arm gripper coordinates corresponding to the designated position comprises:
    reading the robot arm gripper coordinates through the robot arm system interface as the gripper coordinates corresponding to the designated position.
  6. The method according to claim 1, characterized in that matching the feature points of the image with the feature points of the preset template image of the reference object and obtaining the center coordinates of the reference object in the image comprises:
    matching the feature points of the image with the feature points of the preset template image of the reference object, filtering out matches whose confidence is below a preset value, and obtaining the center coordinates of the reference object in the image.
  7. The method according to claim 1, characterized in that determining the mapping relationship between camera coordinates and robot arm gripper coordinates comprises:
    establishing the matrix relationship between the camera coordinates and the robot arm gripper coordinates corresponding to each designated position:
    Figure PCTCN2016098233-appb-100001
    where the camera coordinates are (u, v) and the robot arm gripper coordinates are (X, Y); solving for the matrix by least squares from multiple sets of (u, v) and their corresponding (X, Y):
    Figure PCTCN2016098233-appb-100002
    and using the matrix as the mapping matrix between camera coordinates and robot arm gripper coordinates.
  8. A system for determining a mapping relationship between camera coordinates and robot arm gripper coordinates, characterized by comprising:
    a robot arm gripper coordinate acquisition module, configured to grasp and place the same reference object through a robot arm at a designated position within a camera's imaging range, and to read the robot arm gripper coordinates corresponding to the designated position;
    a camera coordinate acquisition module, configured to capture an image of the reference object at the designated position through the camera, to compute feature points of the image, and to match the feature points of the image with feature points of a preset template image of the reference object to obtain center coordinates of the reference object in the image as the camera coordinates corresponding to the designated position;
    a mapping relationship determination module, configured to determine the mapping relationship between camera coordinates and robot arm gripper coordinates based on the camera coordinates and robot arm gripper coordinates corresponding to several different designated positions.
  9. The system according to claim 8, characterized in that the camera coordinate acquisition module comprises:
    a matching module, configured to match the feature points of the image with the feature points of the preset template image of the reference object, to filter out matches whose confidence is below a preset value, and to obtain the center coordinates of the reference object in the image.
  10. The system according to claim 8, characterized in that the camera coordinate acquisition module further comprises:
    a matrix calculation module, configured to establish the matrix relationship between the camera coordinates and the robot arm gripper coordinates corresponding to each designated position:
    Figure PCTCN2016098233-appb-100003
    where the camera coordinates are (u, v) and the robot arm gripper coordinates are (X, Y); solving for the matrix by least squares from multiple sets of (u, v) and their corresponding (X, Y):
    Figure PCTCN2016098233-appb-100004
    and using the matrix as the mapping matrix between camera coordinates and robot arm gripper coordinates.
PCT/CN2016/098233 2015-12-25 2016-09-06 Method and system for determining the mapping relationship between camera coordinates and robot arm gripper coordinates WO2017107565A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201511004609.7A CN105631875A (zh) 2015-12-25 2015-12-25 Method and system for determining the mapping relationship between camera coordinates and robot arm gripper coordinates
CN201511004609.7 2015-12-25

Publications (1)

Publication Number Publication Date
WO2017107565A1 true WO2017107565A1 (zh) 2017-06-29

Family

ID=56046761

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2016/098233 WO2017107565A1 (zh) 2015-12-25 2016-09-06 Method and system for determining the mapping relationship between camera coordinates and robot arm gripper coordinates

Country Status (2)

Country Link
CN (1) CN105631875A (zh)
WO (1) WO2017107565A1 (zh)

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109005506A (zh) * 2018-09-18 2018-12-14 华志微创医疗科技(北京)有限公司 一种高精度Mark点的注册方法
CN112561999A (zh) * 2020-12-21 2021-03-26 惠州市德赛西威汽车电子股份有限公司 一种贴框设备精准贴合方法及贴框设备
CN113269723A (zh) * 2021-04-25 2021-08-17 浙江省机电设计研究院有限公司 三维视觉定位与机械手协同工作的零部件无序抓取系统
CN114670194A (zh) * 2022-03-22 2022-06-28 荣耀终端有限公司 机械手系统定位方法及装置
CN114725753A (zh) * 2022-02-28 2022-07-08 福建星云电子股份有限公司 一种基于视觉引导的自动对插方法及系统

Families Citing this family (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105631875A (zh) * 2015-12-25 2016-06-01 广州视源电子科技股份有限公司 Method and system for determining the mapping relationship between camera coordinates and robot arm gripper coordinates
CN106384115B (zh) * 2016-10-26 2019-10-22 武汉工程大学 Robot arm joint angle detection method
CN107450376B (zh) * 2017-09-09 2019-06-21 北京工业大学 Method for computing the grasp attitude angle of a service robot arm based on an intelligent mobile platform
CN109648568B (zh) * 2019-01-30 2022-01-04 深圳镁伽科技有限公司 Robot control method, system and storage medium
CN110271001A (zh) * 2019-06-19 2019-09-24 北京微链道爱科技有限公司 Robot recognition method, control method, device, storage medium and master control device
CN110660108B (zh) * 2019-09-11 2022-12-27 北京控制工程研究所 Joint calibration method for a rendezvous-and-docking measurement camera and a docking capture mechanism
CN111736331B (zh) * 2020-06-18 2022-04-01 湖南索莱智能科技有限公司 Method for determining the horizontal and vertical orientation of a slide, and device using the method
CN111923042B (zh) * 2020-07-21 2022-05-24 北京全路通信信号研究设计院集团有限公司 Blurring method and system for cabinet grids, and inspection robot
CN118130140B (zh) * 2024-02-28 2024-09-10 浙江大学 Vision and force feedback based wheat seed slicing and sampling method and system

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101839692A (zh) * 2010-05-27 2010-09-22 西安交通大学 单相机测量物体三维位置与姿态的方法
CN102161198A (zh) * 2011-03-18 2011-08-24 浙江大学 用于三维空间中移动机械臂路径规划的主从式协进化方法
CN104463108A (zh) * 2014-11-21 2015-03-25 山东大学 一种单目实时目标识别及位姿测量方法
CN105631875A (zh) * 2015-12-25 2016-06-01 广州视源电子科技股份有限公司 确定相机坐标与机械臂手爪坐标映射关系的方法及其系统

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN100493207C (zh) * 2007-03-14 2009-05-27 北京理工大学 Ccd摄像系统的畸变测量校正方法和综合测试靶
CN102013099B (zh) * 2010-11-26 2012-07-04 中国人民解放军国防科学技术大学 车载摄像机外参数交互式标定方法
US9201312B2 (en) * 2013-04-16 2015-12-01 Kla-Tencor Corporation Method for correcting position measurements for optical errors and method for determining mask writer errors
CN104180753A (zh) * 2014-07-31 2014-12-03 东莞市奥普特自动化科技有限公司 机器人视觉系统的快速标定方法
CN104700414B (zh) * 2015-03-23 2017-10-03 华中科技大学 一种基于车载双目相机的前方道路行人快速测距方法


Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109005506A (zh) * 2018-09-18 2018-12-14 华志微创医疗科技(北京)有限公司 一种高精度Mark点的注册方法
CN109005506B (zh) * 2018-09-18 2021-04-06 华志微创医疗科技(北京)有限公司 一种高精度Mark点的注册方法
CN112561999A (zh) * 2020-12-21 2021-03-26 惠州市德赛西威汽车电子股份有限公司 一种贴框设备精准贴合方法及贴框设备
CN113269723A (zh) * 2021-04-25 2021-08-17 浙江省机电设计研究院有限公司 三维视觉定位与机械手协同工作的零部件无序抓取系统
CN113269723B (zh) * 2021-04-25 2024-08-27 浙江省机电设计研究院有限公司 三维视觉定位与机械手协同工作的零部件无序抓取系统
CN114725753A (zh) * 2022-02-28 2022-07-08 福建星云电子股份有限公司 一种基于视觉引导的自动对插方法及系统
CN114670194A (zh) * 2022-03-22 2022-06-28 荣耀终端有限公司 机械手系统定位方法及装置

Also Published As

Publication number Publication date
CN105631875A (zh) 2016-06-01


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16877376

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

32PN Ep: public notification in the ep bulletin as address of the addressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 21/11/2018)

122 Ep: pct application non-entry in european phase

Ref document number: 16877376

Country of ref document: EP

Kind code of ref document: A1