WO2021042332A1 - Calibration inspection assembly, robot system, inspection method and calibration method - Google Patents

Calibration inspection assembly, robot system, inspection method and calibration method (标定检验组件、机器人系统、检验方法和标定方法)

Info

Publication number
WO2021042332A1
Authority
WO
WIPO (PCT)
Prior art keywords
calibration
touch screen
inspection
robot
image acquisition
Prior art date
Application number
PCT/CN2019/104559
Other languages
English (en)
French (fr)
Inventor
费涛
王海峰
贾绍图
Original Assignee
西门子(中国)有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 西门子(中国)有限公司 filed Critical 西门子(中国)有限公司
Priority to CN201980096363.0A priority Critical patent/CN113840695B/zh
Priority to PCT/CN2019/104559 priority patent/WO2021042332A1/zh
Publication of WO2021042332A1 publication Critical patent/WO2021042332A1/zh

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00 Programme-controlled manipulators
    • B25J9/16 Programme controls
    • B25J13/00 Controls for manipulators

Definitions

  • the invention relates to a calibration inspection assembly, a robot system, an inspection method and a calibration method.
  • a solution for achieving this is to add an image acquisition device, such as a camera, to guide the motion of the articulated robot.
  • the calculation and control module inside the robot can usually determine the robot's next action target from the images acquired by the camera device.
  • in order for the robot to accurately complete its intended work, the image acquisition device needs to be calibrated first.
  • existing robots cannot verify or evaluate the calibration result of the image acquisition device, and usually have to enter trial operation directly after the calibration of the image acquisition device is completed.
  • the trial run takes a long time and cannot provide a statistical evaluation of the calibration result.
  • the trial run may also cause danger in certain circumstances, for example if the calibration accuracy does not meet requirements.
  • the present invention aims to provide a calibration inspection assembly, a robot system, an inspection method and a calibration method to effectively verify the calibration result of the image acquisition device on the robot, and to feed the inspection result back to the user intuitively and/or statistically.
  • one aspect of the present invention provides a calibration inspection assembly for inspecting the calibration result of an image acquisition device, the image acquisition device being installed on a robot and the robot including a movable end effector. The calibration inspection assembly includes:
  • a calibration inspection platform, on which a display touch screen is installed, the calibration inspection platform being positioned so that the image acquisition device can capture images of the display touch screen;
  • a touch screen device, which is installed on the end effector so that it can move when driven by the end effector and which is configured to touch and activate the display touch screen; and
  • a calculation control module, which is communicatively connected with the image acquisition device, the end effector and the calibration inspection platform, and which is configured to perform calculations based on the calibration result and to compare the calculation result with the actual situation to obtain the inspection result.
  • the robot system includes a robot and the above-mentioned calibration inspection assembly.
  • An image acquisition device is installed on the robot, and the robot includes a movable end effector.
  • the calculation control module of the calibration inspection component is integrated inside the robot.
  • the image acquisition device is installed on the end effector.
  • Another aspect of the present invention provides a method for inspecting the calibration result of an image acquisition device installed on a robot, the method being implemented by the above-mentioned calibration inspection assembly, the robot including a movable end effector, and the method including the following steps:
  • the image acquisition device captures an image containing the characteristic figure, and sends a signal containing the image information to a calculation control module;
  • the calculation control module processes and calculates the received signal, and based on the calculation result, controls the end effector to drive the touch screen device to touch the display touch screen with the characteristic point as a target;
  • the inspection result is obtained based on the distance between the touch screen point of the touch screen device on the display touch screen and the actual position of the characteristic point.
  • the step of processing and calculating the received signal by the calculation control module includes: converting the position coordinates of the characteristic point in different coordinate systems.
  • the step of processing and calculating the received signal by the calculation control module includes:
  • the calculation control module controls the movement of the end effector based on the world coordinates.
  • the step of converting the pixel coordinates into the world coordinates includes:
  • the camera depth coordinates are converted into the world coordinates based on the calibrated external parameters of the image acquisition device and the pose parameters of the end effector.
  • the step in which the image acquisition device captures images and the step in which the calculation control module calculates based on the image information and controls the end effector according to the calculation result are repeated multiple times, so that the touch screen device touches the display touch screen multiple times.
  • the step of obtaining the inspection result has two implementation modes: a preset mode and a custom mode.
  • the step of obtaining a test result includes:
  • the display touch screen displays a plurality of closed figures centered on the characteristic point, the areas of the closed figures are unequal, and the closed figures are arranged concentrically from the center outward in order of increasing area.
  • each closed figure corresponds to an accuracy threshold
  • the step of obtaining a detection result includes:
  • the user sets the required accuracy threshold
  • the display touch screen displays a closed figure centered on the characteristic point and corresponding to the set accuracy threshold
  • the inspection result is obtained based on the statistical result.
  • the characteristic figure and the closed figure are both circles, and the characteristic point is the center of the circle.
  • the step of obtaining the inspection result further includes: calculating the distance value between each touch screen point and the characteristic point, and displaying all the distance values in a chart form through the display touch screen.
  • the step of obtaining the inspection result further includes: calculating an average value of all the distance values, and reflecting the average value in the graph.
  • the step of obtaining the inspection result further includes: calculating the standard deviation of all the distance values, and reflecting the standard deviation in the table.
  • if the standard deviation is higher than a predetermined value, the inspection result is deemed invalid, and the method returns to the first step.
  • Another aspect of the present invention provides a method for calibrating an image acquisition device installed on a robot, characterized in that the method includes the following steps:
  • the inspection step is implemented based on the inspection method described in any one of the above schemes,
  • if the inspection result of the inspection step does not meet expectations, the calibration step is repeated.
  • the calibration step is also implemented by a calibration inspection component, and the calibration step includes:
  • the image acquisition device captures an image containing the calibration graphic and sends a signal containing the image information to a calculation control module;
  • the calculation control module processes and calculates the received signal, thereby completing the calibration.
  • the calibration result can be inspected after the image acquisition device of the robot is calibrated, and the inspection result can also be intuitively reflected to the user in the form of graphics, data and charts, allowing the user to quickly understand the inspection result.
  • the robot can enter the trial run after the inspection result is qualified, which can improve the completion quality of the trial run, and can avoid danger during the trial run.
  • the inspection method is more efficient, accurate, and can be realized automatically, is user-friendly, and can help the user to quickly complete the initial setting of the robot before use.
  • Fig. 1 is a schematic diagram of a partial structure of a calibration inspection assembly according to a preferred embodiment of the present invention
  • FIG. 2 is a flowchart of the inspection method in this embodiment
  • FIG. 3 and FIG. 4 are display views in the preset mode in this embodiment.
  • FIG. 11 is a calibration view displayed on the display touch screen in this embodiment.
  • FIG. 12 is a flowchart of the calibration method in this embodiment.
  • Figures 1 to 11 show an inspection component, a robot system, a method for inspecting calibration results, and an overall calibration method according to a preferred embodiment of the present invention.
  • the calibration inspection assembly of this embodiment is used to inspect the calibration result of the image acquisition device 3 (for example, a camera) installed on the robot, where the robot includes the movable end effector 5.
  • a specific example of a robot is an intelligent robot.
  • FIG. 1 shows a part of the structure of the calibration inspection assembly and the end effector 5 of the robot.
  • the robot is an intelligent robot
  • the end effector 5 is a mechanical arm installed on the base 6 of the intelligent robot.
  • the calibration inspection component includes a calibration inspection platform 1, a touch screen device 4, and a calculation control module not shown.
  • the calibration inspection platform 1 is equipped with a display touch screen 2, and the calibration inspection platform 1 is positioned so that the image acquisition device 3 can capture images of the display touch screen 2; this positioning can be based, for example, on the relative positional relationship between the calibration inspection platform 1 and the center point of the robot.
  • the touch screen device 4 is installed on the end effector 5 so as to be able to move under the drive of the end effector 5, and the touch screen device 4 can touch and activate the display touch screen 2.
  • a preferred example of the touch screen device 4 is a flexible touch screen pin whose flexible structure can ensure that the touch screen pin and the display touch screen 2 can be in flexible contact, so that the display touch screen 2 will not be damaged.
  • the calculation control module is communicatively connected with the image acquisition device 3, the end effector 5 and the calibration inspection platform 1 for calculating based on the calibration result and comparing the calculation result with the actual result to obtain the inspection result.
  • the robot system provided in this embodiment includes the inspection component and the robot described above.
  • the calculation control module can be integrated inside the robot, or integrated in the calibration inspection platform 1, or divided into two parts, one part integrated inside the robot to control the image acquisition device 3 and the end effector 5, and the other part integrated in the calibration inspection platform 1 to calculate the inspection result.
  • Some parameters can be preset in the calculation control module in advance, for example, the size parameters and position parameters of the touch screen device 4 and the end effector 5 are preset in advance.
  • the image acquisition device 3 is installed on the end effector 5, that is, the inspection assembly is applied to the image acquisition device 3 installed on the end effector 5.
  • the image acquisition device 3 can also be installed in positions other than the end effector 5.
  • the inspection assembly, inspection method and calibration method provided by the present invention can also be applied to an image acquisition device 3 installed on other parts of the robot.
  • the inspection method provided by this embodiment is implemented by the inspection component described above.
  • the inspection method mainly includes the following steps (see Figure 2):
  • Step S301: start the inspection;
  • Step S302: the display touch screen 2 displays a characteristic figure, which has a characteristic point;
  • Step S303: the image acquisition device 3 captures an image of the display touch screen 2, converts the image into an electronic signal and sends it to the calculation control module;
  • Step S304: the calculation control module processes and calculates the received signal, and based on the calculation result controls the end effector 5 to drive the touch screen device 4 to touch the display touch screen 2 with the characteristic point as the target;
  • the calculation control module then obtains the inspection result based on the distance between the touch screen point of the touch screen device 4 on the display touch screen 2 and the actual position of the characteristic point; the step of obtaining the inspection result can also have two implementation modes, which are described in detail later.
  • the inspection result can then be judged: if it is judged not to meet expectations, calibration is performed again and the inspection step is repeated after recalibration; if the judgment result meets expectations, the inspection can be ended.
  • each step can have multiple implementation modes or calculation methods, and some specific examples in this embodiment are given below.
  • the characteristic graph displayed on the display touch screen 2 may be a circle, and the characteristic point is the center of the circle.
  • Step S304 includes: the calculation control module uses a method such as the Hough Transform to extract the characteristic figure in the captured image, for example to capture the circle in the image; it then finds the characteristic point of the characteristic figure in the image and converts the position coordinates of the characteristic point between different coordinate systems, more specifically converting the pixel coordinates of the characteristic point into world coordinates.
  • the step of converting the pixel coordinates P1 into the world coordinates P4 further includes the following sub-steps: converting the pixel coordinates P1 of the characteristic point in the image into camera coordinates P2 based on the internal parameters of the image acquisition device 3; converting the camera coordinates P2 into camera depth coordinates P3, which reflect the distance between the camera and the principal point of the image; and converting the camera depth coordinates P3 into world coordinates P4 based on the calibrated external parameters of the image acquisition device 3 and the pose parameters of the end effector 5.
  • the conversion can be completed by evaluating the matrix expression D⁻¹ · C⁻¹ · B⁻¹ · P1 = P4, where B⁻¹ is the internal parameter (intrinsic) matrix of the image acquisition device 3; multiplying P1 by B⁻¹ gives the camera coordinates P2;
  • C⁻¹ is the depth vector modulus between the image acquisition device 3 and the principal point of the image at the time of calibration (that is, the intersection of the optical axis of the image acquisition device 3 with the image plane); multiplying it by the camera coordinates P2 gives the camera depth coordinates P3, which reflect the distance between the image acquisition device 3 and the principal point of the image;
  • D⁻¹ is the product of the calibrated external parameter (extrinsic) matrix of the image acquisition device 3 and the pose matrix of the end effector 5;
  • multiplying D⁻¹ by the camera depth coordinates P3 gives the world coordinates P4 of the characteristic point.
  • the parameters r, p and y are the rotation parameters of the end effector 5, representing its pitch, roll and yaw;
  • the parameters xr, yr and zr are the translation parameters of the end effector 5, representing its change of position;
  • the parameters e, f and g are the rotation parameters of the calibrated image acquisition device 3 relative to the end effector 5, representing its pitch, roll and yaw, and the parameters xc, yc and zc are the translation parameters of the calibrated image acquisition device 3 relative to the end effector 5, representing its change of position. Since the matrix formed from e, f, g and xc, yc, zc is the calibrated external parameter matrix of the camera, this matrix can in fact be regarded as the "calibration result" referred to in this text.
  • because calibration can never be entirely free of error, the calculated world coordinates P4 do not coincide with the actual position of the characteristic point, so there is a gap between the touch screen point left on the display touch screen 2 by the touch screen device 4, driven by the end effector 5 under the control of the calculation control module, and the characteristic point. As described above, the calculation control module obtains the inspection result by calculating the distance between the touch screen point and the characteristic point.
  • step S302, step S303, and step S304 can be repeated multiple times, so as to leave multiple touch screen points on the display touch screen 2.
  • the calculation control module then performs a comprehensive analysis of the distances between these multiple touch screen points and the characteristic point.
  • after step S304, the method proceeds to the step of obtaining the inspection result.
  • the step of obtaining the inspection result has two selectable modes: the preset mode 31 and the custom mode 32. That is, after the step S304 is completed, the selection step S305 is first entered, and the preset mode 31 or the custom mode 32 is selected to be used.
  • in step S306 the display touch screen 2 displays multiple closed figures centered on the characteristic point; the areas of the closed figures are unequal, the closed figures are arranged concentrically from the center outward in order of increasing area, and each closed figure corresponds to an accuracy threshold reflecting the calibration accuracy.
  • the multiple closed figures can be circles with mutually unequal radii, arranged one ring inside another.
  • the circle with radius r1 corresponds to the highest accuracy value; r1 can be set to 0.05 cm, for example, and the accuracy thresholds corresponding to the circles with radii r2, r3 and r4 decrease in turn.
  • step S309 further includes presenting the inspection result in chart form; the generated charts may be, for example, as shown in FIG. 5 and FIG. 6.
  • the calculation control module can further process the data and generate a histogram as shown in FIG. 6.
  • the abscissa is the number of each touch screen point
  • the ordinate is the distance (cm) between each touch screen point and the characteristic point.
  • by evaluating this chart, the calculation control module can also obtain the average value and standard deviation of all distance values; for example, calculating the histogram in FIG. 6 gives an average distance of 1.755 cm and a standard deviation of 0.399. Both the average value and the standard deviation can be used as data characterizing whether the calibration result is qualified. For example, an expected average value range and an expected standard deviation range can be preset in the system.
  • in the determination step S401 it is determined whether the actual average value and standard deviation are within the expected ranges: if so, the calibration result is deemed qualified; otherwise it is deemed unqualified. In addition, because the standard deviation reflects the stability of the data, a large standard deviation may merely indicate a large error in the inspection process rather than an unqualified calibration result, so if the standard deviation is not within the expected range, the user can choose whether to recalibrate or to re-inspect.
  • in step S307 the user sets the required accuracy threshold. The method then proceeds to step S308: the display touch screen 2 displays the closed figure corresponding to the selected accuracy threshold, and the number of touch screen points falling within that closed figure is counted.
  • the calculation control module can also further process the data to generate the charts in Figure 9 and Figure 10.
  • the histogram in FIG. 9 intuitively reflects the number of touch screen points falling inside the circle and the number outside the circle; the histogram in FIG. 10 reflects the distance between each touch screen point and the characteristic point, together with the average value and standard deviation of all distance values. Specifically, for the histogram in FIG. 10, the average of all distance values is 1.664 cm and the standard deviation is 0.370. For the processing and determination of these data, refer to the description of the preset mode 31.
  • this embodiment also provides a calibration method.
  • the calibration method includes a calibration step and an inspection step implemented based on the above inspection method. If the inspection result of the inspection step does not meet expectations, the calibration step is repeated.
  • the calibration method of this embodiment is shown in FIG. 12. Among them, the calibration step is also completed by the above-mentioned calibration inspection component.
  • the system is initially set (S11), then the calibration inspection component is bound to the robot and the calibration inspection component and the robot are started (S12), and then the calibration is started (S21) and the calibration step S22 is started.
  • a known method such as April-Tag can be used for calibration, which includes taking a picture of the calibration plate and calculating and completing the calibration based on the image information and the distance between the calibration plate and the image acquisition device 3.
  • the calibration inspection platform 1 is used as a calibration board in this step, and a calibration view as shown in FIG. 11 is displayed on the display touch screen 2.
  • the inspection step S3 has been described in detail above and will not be repeated here.
  • the device and method provided by the present invention can check the calibration result after the image acquisition device 3 of the robot is calibrated, and the check result can also be intuitively reflected to the user in the form of graphics, data and icons, allowing the user to quickly To understand the test results.
  • the robot can enter the trial run after the inspection result is qualified, which can improve the completion quality of the trial run, and can avoid danger during the trial run.
  • the inspection method is more efficient, accurate, and can be realized automatically, is user-friendly, and can help the user to quickly complete the initial setting of the robot before use.

Abstract

The present invention provides a calibration inspection assembly, a robot system, an inspection method and a calibration method. The inspection method includes the following steps: a display touch screen displays a characteristic figure, which has a characteristic point (S302); an image acquisition device captures an image containing the characteristic figure and sends a signal containing the image information to a calculation control module (S303); the calculation control module processes and calculates the received signal and, based on the calculation result, controls an end effector to drive a touch screen device to touch the display touch screen with the characteristic point as the target (S304); an inspection result is obtained based on the distance between the touch point of the touch screen device on the display touch screen and the actual position of the characteristic point. According to the present invention, the calibration result can be inspected after the image acquisition device of the robot has been calibrated, and the inspection result can be fed back to the user intuitively in the form of graphics, data, charts and the like. The inspection method is efficient, accurate and can be carried out automatically.

Description

Calibration inspection assembly, robot system, inspection method and calibration method
Technical Field
The present invention relates to a calibration inspection assembly, a robot system, an inspection method and a calibration method.
Background Art
In modern industry, many manufacturing processes such as welding, painting and assembly are carried out by robots, in particular by articulated robots. Usually, to make an articulated robot reach a predetermined position, an explicit instruction containing the predetermined position signal must first be input to it; such robots are inefficient and the work they can complete is rather limited. Flexible manufacturing has now become a trend. To improve the flexible or autonomous operating capability of robots, flexible operating systems and robots have emerged, so that an articulated robot can move to a target position and complete the work by itself.
A solution for achieving this is to add an image acquisition device, such as a camera, to guide the motion of the articulated robot. The calculation control module inside the robot can usually determine the robot's next action target from the images acquired by the camera device. For the robot to complete its intended work accurately, the image acquisition device must first be calibrated.
However, existing robots cannot verify or evaluate the calibration result of the image acquisition device and usually have to enter trial operation directly after the calibration of the image acquisition device is completed. Trial operation is time-consuming and provides no quantitative evaluation of the calibration result; moreover, if the calibration accuracy does not meet requirements, trial operation may even be dangerous in certain circumstances.
Therefore, a calibration inspection assembly, a robot system, an inspection method and a calibration method are needed to at least partially solve the above problems.
Summary of the Invention
The present invention aims to provide a calibration inspection assembly, a robot system, an inspection method and a calibration method that effectively verify the calibration result of the image acquisition device on a robot and feed the inspection result back to the user intuitively and/or statistically.
To achieve the above object, one aspect of the present invention provides a calibration inspection assembly for inspecting the calibration result of an image acquisition device, the image acquisition device being installed on a robot and the robot including a movable end effector, the calibration inspection assembly including:
a calibration inspection platform, on which a display touch screen is installed, the calibration inspection platform being positioned so that the image acquisition device can capture images of the display touch screen;
a touch screen device, which is installed on the end effector so that it can move when driven by the end effector and which is configured to touch and activate the display touch screen; and
a calculation control module, which is communicatively connected with the image acquisition device, the end effector and the calibration inspection platform, and which is configured to perform calculations based on the calibration result and to compare the calculation result with the actual situation to obtain an inspection result.
Another aspect of the present invention provides a robot system, which includes a robot and the above calibration inspection assembly, an image acquisition device being installed on the robot and the robot including a movable end effector.
In one embodiment, the calculation control module of the calibration inspection assembly is integrated inside the robot.
In one embodiment, the image acquisition device is installed on the end effector.
A further aspect of the present invention provides a method for inspecting the calibration result of an image acquisition device installed on a robot, the method being implemented by the above calibration inspection assembly and the robot including a movable end effector, the method including the following steps:
a display touch screen displays a characteristic figure, the characteristic figure having a characteristic point;
the image acquisition device captures an image containing the characteristic figure and sends a signal containing the image information to a calculation control module;
the calculation control module processes and calculates the received signal and, based on the calculation result, controls the end effector to drive the touch screen device to touch the display touch screen with the characteristic point as the target;
an inspection result is obtained based on the distance between the touch point of the touch screen device on the display touch screen and the actual position of the characteristic point.
In one embodiment, the step in which the calculation control module processes and calculates the received signal includes: converting the position coordinates of the characteristic point between different coordinate systems.
In one embodiment, the step in which the calculation control module processes and calculates the received signal includes:
extracting the characteristic figure in the image;
finding the characteristic point of the characteristic figure in the image and obtaining the pixel coordinates of the characteristic point;
converting the pixel coordinates into world coordinates,
wherein the calculation control module controls the movement of the end effector based on the world coordinates.
In one embodiment, the step of converting the pixel coordinates into the world coordinates includes:
converting the pixel coordinates into camera coordinates based on the internal parameters of the image acquisition device;
converting the camera coordinates into camera depth coordinates, which reflect the distance between the camera and the principal point of the image;
converting the camera depth coordinates into the world coordinates based on the calibrated external parameters of the image acquisition device and the pose parameters of the end effector.
In one embodiment, the step in which the image acquisition device captures an image and the step in which the calculation control module calculates based on the image information and controls the end effector according to the calculation result are repeated several times, so that the touch screen device touches the touch screen several times.
In one embodiment, the step of obtaining the inspection result has two implementation modes: a preset mode and a custom mode.
In one embodiment, in the preset mode, the step of obtaining the inspection result includes:
the display touch screen displays a plurality of closed figures centered on the characteristic point, the areas of the closed figures are unequal, the closed figures are arranged concentrically from the center outward in order of increasing area, and each closed figure corresponds to an accuracy threshold;
counting the number of touch points falling into the closed figure corresponding to the required accuracy threshold;
obtaining the detection result from the statistical result.
In one embodiment, in the custom mode, the step of obtaining the detection result includes:
the user sets the required accuracy threshold;
the display touch screen displays a closed figure centered on the characteristic point and corresponding to the set accuracy threshold;
counting the number of touch points falling into the closed figure corresponding to the required accuracy threshold;
obtaining the detection result from the statistical result.
In one embodiment, the characteristic figure and the closed figures are all circles, and the characteristic point is the center of the circle.
In one embodiment, the step of obtaining the inspection result further includes: calculating the distance between each touch point and the characteristic point, and presenting all the distance values in chart form on the display touch screen.
In one embodiment, the step of obtaining the inspection result further includes: calculating the average of all the distance values and reflecting the average in the chart.
In one embodiment, the step of obtaining the inspection result further includes: calculating the standard deviation of all the distance values and reflecting the standard deviation in the table.
In one embodiment, if the standard deviation is higher than a predetermined value, the inspection result is deemed invalid and the method returns to its first step.
A further aspect of the present invention provides a method for calibrating an image acquisition device installed on a robot, characterized in that the method includes the following steps:
a calibration step;
an inspection step, implemented on the basis of the inspection method of any one of the above schemes,
wherein, if the inspection result of the inspection step does not meet expectations, the calibration step is repeated.
In one embodiment, the calibration step is also implemented by the calibration inspection assembly, and the calibration step includes:
the display touch screen displays a calibration figure;
the image acquisition device captures an image containing the calibration figure and sends a signal containing the image information to the calculation control module;
the calculation control module processes and calculates the received signal, thereby completing the calibration.
With the inspection assembly, robot system, inspection method and calibration method provided by the present invention, the calibration result can be inspected after the image acquisition device of the robot has been calibrated, and the inspection result can be fed back to the user intuitively in the form of graphics, data, charts and the like, allowing the user to understand the inspection result quickly. The robot can enter trial operation only after the inspection result is qualified, which improves the quality of the trial operation and avoids danger during it. The inspection method is efficient, accurate, can be carried out automatically, is user-friendly and helps the user quickly complete the initial setup of the robot before use.
Brief Description of the Drawings
The following drawings are only intended to illustrate and explain the present invention schematically and do not limit its scope. In the drawings:
FIG. 1 is a schematic view of part of the structure of a calibration inspection assembly according to a preferred embodiment of the present invention;
FIG. 2 is a flowchart of the inspection method in this embodiment;
FIG. 3 and FIG. 4 are display views in the preset mode in this embodiment;
FIG. 5 and FIG. 6 are data analysis results in the preset mode in this embodiment;
FIG. 7 and FIG. 8 are display views in the custom mode in this embodiment;
FIG. 9 and FIG. 10 are data analysis results in the custom mode in this embodiment;
FIG. 11 is the calibration view shown on the display touch screen in this embodiment;
FIG. 12 is a flowchart of the calibration method in this embodiment.
Reference numerals of components:
calibration inspection platform 1
display touch screen 2
image acquisition device 3
touch screen device 4
end effector 5
base 6
Detailed Description of the Embodiments
To give a clearer understanding of the technical features, objects and effects of the present invention, specific embodiments of the present invention are described below with reference to the accompanying drawings.
FIG. 1 to FIG. 11 show an inspection assembly, a robot system, a method for inspecting a calibration result, and an overall calibration method according to a preferred embodiment of the present invention.
The calibration inspection assembly of this embodiment is used to inspect the calibration result of an image acquisition device 3 (for example a camera) installed on a robot, the robot including a movable end effector 5. One specific example of the robot is an intelligent robot. Specifically, FIG. 1 shows part of the structure of the calibration inspection assembly and the end effector 5 of the robot. In this embodiment, the robot is an intelligent robot and the end effector 5 is a robotic arm installed on the base 6 of the intelligent robot.
The calibration inspection assembly includes a calibration inspection platform 1, a touch screen device 4 and a calculation control module (not shown). A display touch screen 2 is installed on the calibration inspection platform 1, and the calibration inspection platform 1 is positioned so that the image acquisition device 3 can capture images of the display touch screen 2; this positioning can be carried out, for example, based on the relative positional relationship between the calibration inspection platform 1 and the center point of the robot. The touch screen device 4 is installed on the end effector 5 so that it can move when driven by the end effector 5, and the touch screen device 4 can touch and activate the display touch screen 2. A preferred example of the touch screen device 4 is a flexible touch pin, whose flexible construction ensures soft contact between the pin and the display touch screen 2 so that the display touch screen 2 is not damaged.
The calculation control module is communicatively connected with the image acquisition device 3, the end effector 5 and the calibration inspection platform 1, and is used to perform calculations based on the calibration result and to compare the calculation result with the actual situation to obtain the inspection result.
The robot system provided in this embodiment includes the inspection assembly described above and the robot. In this robot system, the calculation control module may be integrated inside the robot or inside the calibration inspection platform 1; alternatively, the calculation control module may be divided into two parts, one integrated inside the robot to control the image acquisition device 3 and the end effector 5, and the other integrated in the calibration inspection platform 1 to calculate the inspection result. Some parameters can be preset in the calculation control module in advance, for example the size and position parameters of the touch screen device 4 and the end effector 5.
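As a purely illustrative sketch of how such preset values might be grouped inside the calculation control module, the data structure below uses hypothetical field names and numbers; the patent only states that size and position parameters of the touch screen device 4 and the end effector 5 can be preset.

```python
from dataclasses import dataclass

@dataclass
class PresetParameters:
    """Illustrative container for values preset in the calculation control module.

    All names and numbers are placeholders; the patent only says such size and
    position parameters of the touch pin and the end effector can be preset.
    """
    pin_length_mm: float = 80.0                           # length of the flexible touch pin (hypothetical)
    pin_tip_offset_mm: tuple = (0.0, 0.0, 80.0)           # pin tip offset in the end-effector frame (hypothetical)
    effector_mount_offset_mm: tuple = (0.0, 0.0, 0.0)     # mounting offset on base 6 (hypothetical)

params = PresetParameters()
```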
In the robot system of this embodiment, the image acquisition device 3 is installed on the end effector 5, that is, the inspection assembly is applied to an image acquisition device 3 installed on the end effector 5. In other embodiments (not shown), however, the image acquisition device 3 may be installed at positions other than the end effector 5, and the inspection assembly, inspection method and calibration method provided by the present invention can also be applied to an image acquisition device 3 installed on other parts of the robot.
The inspection method provided by this embodiment is implemented by the inspection assembly described above. The inspection method mainly includes the following steps (see FIG. 2):
Step S301: start the inspection;
Step S302: the display touch screen 2 displays a characteristic figure, which has a characteristic point;
Step S303: the image acquisition device 3 captures an image of the display touch screen 2, converts the image into an electronic signal and sends it to the calculation control module;
Step S304: the calculation control module processes and calculates the received signal and, based on the calculation result, controls the end effector 5 to drive the touch screen device 4 to touch the display touch screen 2 with the characteristic point as the target;
after that, the calculation control module obtains the inspection result based on the distance between the touch point of the touch screen device 4 on the display touch screen 2 and the actual position of the characteristic point; this step of obtaining the inspection result can have two implementation modes, which are described in detail later.
Preferably, after the above steps are completed, the method can proceed to a judgment step S4, in which the inspection result is judged: if it is judged not to meet expectations, calibration is performed again and the inspection step is repeated after recalibration; if the judgment result meets expectations, the inspection can be ended.
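A minimal sketch of the flow of steps S301 to S304 plus the judgment step, assuming hypothetical `screen`, `camera`, `robot`, `compute_world_target` and `analyze` interfaces standing in for the display touch screen 2, the image acquisition device 3, the end effector 5, the image-processing chain (sketched further below) and the result statistics:

```python
def run_inspection(screen, camera, robot, compute_world_target, analyze, repeats=10):
    """Sketch of steps S301-S304 and the judgment step S4; all interfaces are assumed."""
    touch_points = []
    feature_point = screen.show_feature_circle()        # S302: display the circle, return its true center
    for _ in range(repeats):                            # repeating S302-S304 leaves several touch points
        image = camera.capture()                        # S303: photograph the display touch screen 2
        target_world = compute_world_target(image)      # S304: extract the circle and convert coordinates
        robot.touch(target_world)                       # S304: the end effector drives the touch pin
        touch_points.append(screen.read_touch_point())  # touched position reported by the screen
    return analyze(touch_points, feature_point)         # preset or custom mode; the caller judges the result
```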
Each of these steps can have various implementations or calculation methods; some specific examples used in this embodiment are given below.
In step S302, the characteristic figure displayed on the display touch screen 2 can be a circle, with the characteristic point being the center of the circle.
Step S304 includes: the calculation control module extracts the characteristic figure from the captured image using a method such as the Hough Transform, for example capturing the circle in the image; it finds the characteristic point of the characteristic figure in the image and converts the position coordinates of the characteristic point between different coordinate systems, more specifically converting the pixel coordinates of the characteristic point into world coordinates. In this step, for example, the center of the circle can be found and its pixel coordinates P1(u, v, z) obtained; the pixel coordinates P1 are then converted into world coordinates P4(X, Y, Z), where the origin of the world coordinate system can be, for example, the center point of the robot; the calculation control module then controls the end effector 5 to drive the touch screen device 4 based on the world coordinates P4.
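A minimal sketch of the circle-extraction part of step S304 using OpenCV's Hough transform; the parameter values are illustrative guesses rather than values taken from the patent.

```python
import cv2

def find_feature_pixel(image_bgr):
    """Locate the circular characteristic figure and return the pixel coordinates (u, v) of its center."""
    gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY)
    gray = cv2.medianBlur(gray, 5)                     # suppress screen noise before the transform
    circles = cv2.HoughCircles(gray, cv2.HOUGH_GRADIENT, dp=1.2, minDist=100,
                               param1=100, param2=40, minRadius=20, maxRadius=300)
    if circles is None:
        raise RuntimeError("no circular characteristic figure detected")
    u, v, _radius = circles[0][0]                      # strongest circle; (u, v) is its center in pixels
    return float(u), float(v)
```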
Further, the step of converting the pixel coordinates P1 into the world coordinates P4 includes the following sub-steps: converting the pixel coordinates P1 of the characteristic point in the image into camera coordinates P2 based on the internal parameters of the image acquisition device 3; converting the camera coordinates P2 into camera depth coordinates P3, which reflect the distance between the camera and the principal point of the image; and converting the camera depth coordinates P3 into world coordinates P4 based on the calibrated external parameters of the image acquisition device 3 and the pose parameters of the end effector 5.
The above sub-steps can be completed by evaluating the following matrix expression:
D⁻¹ · C⁻¹ · B⁻¹ · P1 = P4
where B⁻¹ is the internal parameter (intrinsic) matrix of the image acquisition device 3; multiplying P1 by B⁻¹ gives the camera coordinates P2;
C⁻¹ is the depth vector modulus between the image acquisition device 3 and the principal point of the image at the time of calibration (that is, the intersection of the optical axis of the image acquisition device 3 with the image plane); multiplying it by the camera coordinates P2 gives the camera depth coordinates P3, which reflect the distance between the image acquisition device 3 and the principal point of the image;
D⁻¹ is the product of the calibrated external parameter (extrinsic) matrix of the image acquisition device 3 and the pose matrix of the end effector 5; multiplying D⁻¹ by the camera depth coordinates P3 gives the world coordinates P4 of the characteristic point.
In the pose matrix of the end effector 5, the parameters r, p and y are its rotation parameters, representing pitch, roll and yaw, and the parameters xr, yr and zr are its translation parameters, representing its change of position. In the extrinsic matrix, the parameters e, f and g are the rotation parameters of the calibrated image acquisition device 3 relative to the end effector 5, representing pitch, roll and yaw, and the parameters xc, yc and zc are the translation parameters of the calibrated image acquisition device 3 relative to the end effector 5, representing its change of position. Since the matrix formed from e, f, g and xc, yc, zc is the calibrated external parameter matrix of the camera, this matrix can in fact be regarded as the "calibration result" referred to herein.
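The explicit matrix entries are embedded as formula images in the published application, so the sketch below assembles the chain P4 = D⁻¹ · C⁻¹ · B⁻¹ · P1 from standard pinhole-camera and roll-pitch-yaw conventions; the axis convention, the composition order of the extrinsic and pose matrices, and all numerical values are assumptions rather than the patent's exact formulas.

```python
import numpy as np

def rpy_to_matrix(r, p, y):
    """Rotation built from roll r, pitch p, yaw y (radians); the Z-Y-X convention here is an assumption."""
    cr, sr = np.cos(r), np.sin(r)
    cp, sp = np.cos(p), np.sin(p)
    cy, sy = np.cos(y), np.sin(y)
    Rz = np.array([[cy, -sy, 0.0], [sy, cy, 0.0], [0.0, 0.0, 1.0]])
    Ry = np.array([[cp, 0.0, sp], [0.0, 1.0, 0.0], [-sp, 0.0, cp]])
    Rx = np.array([[1.0, 0.0, 0.0], [0.0, cr, -sr], [0.0, sr, cr]])
    return Rz @ Ry @ Rx

def homogeneous(R, t):
    """4x4 homogeneous transform from a 3x3 rotation and a translation."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

def pixel_to_world(u, v, depth, K, cam_to_effector, effector_pose):
    """P1 (pixel) -> P2 (camera ray) -> P3 (camera depth) -> P4 (world), as described above."""
    p1 = np.array([u, v, 1.0])
    p2 = np.linalg.inv(K) @ p1               # B^-1 * P1: normalized camera coordinates
    p3 = depth * p2                          # C^-1 * P2: scale by the calibrated depth modulus
    D = effector_pose @ cam_to_effector      # D^-1: extrinsic matrix chained with the effector pose (order assumed)
    p4 = D @ np.append(p3, 1.0)              # world coordinates of the characteristic point
    return p4[:3]

# Illustrative usage with made-up numbers (not taken from the patent):
K = np.array([[900.0, 0.0, 640.0], [0.0, 900.0, 360.0], [0.0, 0.0, 1.0]])
cam_to_effector = homogeneous(rpy_to_matrix(0.0, 0.0, 0.0), [0.0, 0.0, 0.05])   # e, f, g, xc, yc, zc
effector_pose = homogeneous(rpy_to_matrix(0.1, 0.0, 1.5), [0.40, 0.10, 0.30])   # r, p, y, xr, yr, zr
print(pixel_to_world(700.0, 380.0, 0.35, K, cam_to_effector, effector_pose))
```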
Since calibration can never be entirely free of error, the calculated world coordinates P4 of the characteristic point do not coincide with the actual position of the characteristic point, so there is a gap between the touch point left on the display touch screen 2 by the touch screen device 4, driven by the end effector 5 under the control of the calculation control module, and the characteristic point. As described above, the calculation control module obtains the inspection result by calculating the distance between the touch point and the characteristic point.
To make the test result more accurate, steps S302, S303 and S304 can be repeated several times, leaving multiple touch points on the display touch screen 2; the calculation control module then performs a comprehensive analysis of the distances between these touch points and the characteristic point.
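A minimal sketch of the distance bookkeeping over the repeated touches, assuming the touch points and the characteristic point are given as (x, y) positions in centimetres in the plane of the display touch screen 2:

```python
import numpy as np

def touch_distances(touch_points, feature_point):
    """Distance of each touch point from the characteristic point (cm), plus their mean and standard deviation."""
    pts = np.asarray(touch_points, dtype=float)
    center = np.asarray(feature_point, dtype=float)
    d = np.linalg.norm(pts - center, axis=1)
    return d, float(d.mean()), float(d.std())

# Example with made-up touch points around a characteristic point at (10.0, 10.0):
d, mean_d, std_d = touch_distances([(11.2, 10.5), (9.1, 11.8), (10.4, 8.3)], (10.0, 10.0))
```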
Continuing with FIG. 2, after step S304 the method proceeds to the step of obtaining the inspection result, which has two selectable modes: a preset mode 31 and a custom mode 32. That is, after step S304 is completed, a selection step S305 is entered first, in which either the preset mode 31 or the custom mode 32 is chosen.
In the preset mode 31, in step S306 the display touch screen 2 displays a plurality of closed figures centered on the characteristic point; the closed figures have unequal areas and are arranged concentrically from the center outward in order of increasing area, each closed figure corresponding to an accuracy threshold reflecting the calibration accuracy. For example, in this embodiment, with reference to FIG. 3 and FIG. 4, the closed figures can be circles of mutually unequal radii arranged one ring inside another, the circle of radius r1 corresponding to the highest accuracy value; r1 can be set to 0.05 cm, for example, and the accuracy thresholds corresponding to the circles of radii r2, r3 and r4 decrease in turn.
Using the circles as a scale, the user can intuitively see the approximate distribution of the touch points and thus the inspection result (step S309). Preferably, step S309 further includes presenting the inspection result in chart form; the generated charts may be, for example, as shown in FIG. 5 and FIG. 6.
In the bar chart of FIG. 5, the abscissa represents the individual closed figures and the ordinate is the number of touch points falling into each of them. From FIG. 5 it can be seen that 1 touch point falls within the circle of radius r3 and 10 touch points fall within the circle of radius r4. In practice, if the user's requirement for calibration accuracy is low, the number of touch points falling within the circle of radius r4 can be counted; if the requirement is higher, the number falling within the circle of radius r3, r2 or r1 can be counted.
More preferably, the calculation control module can process the data further and generate a bar chart as shown in FIG. 6, in which the abscissa is the number of each touch point and the ordinate is the distance (cm) between that touch point and the characteristic point. By evaluating this chart the calculation control module can also obtain the average and standard deviation of all the distance values; for example, for the bar chart in FIG. 6 the average of the distance values is 1.755 cm and the standard deviation is 0.399. Both the average and the standard deviation can serve as data characterizing whether the calibration result is qualified. For example, an expected average range and an expected standard deviation range can be preset in the system; in judgment step S401 it is determined whether the actual average and standard deviation lie within the expected ranges, and if so the calibration result is deemed qualified, otherwise it is deemed unqualified. In addition, because the standard deviation reflects the stability of the data, a large standard deviation may merely indicate a large error in the inspection process rather than an unqualified calibration result; therefore, if the standard deviation is not within the expected range, the user can choose whether to recalibrate or to re-inspect.
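A minimal sketch of the preset-mode statistics: counting how many touch points fall within each concentric accuracy circle (as in FIG. 5) and checking the mean and standard deviation against expected ranges (judgment step S401). Apart from the 0.05 cm example for r1, the radii and the expected ranges are illustrative assumptions.

```python
def preset_mode_report(distances, radii=(0.05, 0.5, 1.0, 2.0),
                       expected_mean=(0.0, 2.0), expected_std=(0.0, 0.5)):
    """Cumulative counts per accuracy circle plus a qualified/unqualified judgment on mean and std."""
    counts = {r: sum(dist <= r for dist in distances) for r in radii}
    mean_d = sum(distances) / len(distances)
    std_d = (sum((dist - mean_d) ** 2 for dist in distances) / len(distances)) ** 0.5
    qualified = (expected_mean[0] <= mean_d <= expected_mean[1]
                 and expected_std[0] <= std_d <= expected_std[1])
    return counts, mean_d, std_d, qualified
```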
If the user selects the custom mode 32, the method first enters step S307, in which the user sets the required accuracy threshold, and then proceeds to step S308, in which the display touch screen 2 displays the closed figure corresponding to the selected accuracy threshold and the number of touch points falling within it is counted. In this mode, with reference to FIG. 7 and FIG. 8, only one closed figure (a circle in this embodiment) appears around the characteristic point, and its area corresponds to the accuracy threshold selected by the user. In this case the calculation control module can likewise process the data further and generate the charts in FIG. 9 and FIG. 10: the bar chart in FIG. 9 intuitively shows the number of touch points inside and outside the circle, while the bar chart in FIG. 10 shows the distance between each touch point and the characteristic point, together with the average and standard deviation of all the distance values. Specifically, for the bar chart in FIG. 10 the average of all the distance values is 1.664 cm and the standard deviation is 0.370. For the processing and judgment of these data, reference can be made to the description of the preset mode 31.
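For the custom mode, the corresponding sketch reduces to a single user-selected threshold; the inside/outside counts mirror the FIG. 9 histogram, and the same mean and standard deviation treatment as in the preset mode applies afterwards.

```python
def custom_mode_report(distances, user_threshold_cm):
    """Split touch points into inside/outside the single user-selected accuracy circle (as in FIG. 9)."""
    inside = sum(dist <= user_threshold_cm for dist in distances)
    return {"inside": inside, "outside": len(distances) - inside}
```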
In another aspect, this embodiment also provides a calibration method, which includes a calibration step and an inspection step implemented on the basis of the above inspection method; if the inspection result of the inspection step does not meet expectations, the calibration step is repeated. The calibration method of this embodiment is shown in FIG. 12, in which the calibration step is also carried out by the calibration inspection assembly described above.
As shown in FIG. 12, the system is first initialized (S11), then the calibration inspection assembly is bound to the robot and both are started (S12), after which calibration is started (S21) and the method enters calibration step S22. In calibration step S22, a known method such as April-Tag can be used for calibration; this includes photographing a calibration board and performing the calculation and completing the calibration based on the image information and the distance between the calibration board and the image acquisition device 3. Preferably, the calibration inspection platform 1 serves as the calibration board in this step, and the display touch screen 2 shows a calibration view as in FIG. 11. After calibration step S22 is completed, the method proceeds to inspection step S3, which has been described in detail above and is not repeated here.
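The patent names "April-Tag" only as one example of a known calibration method. As a hedged illustration, the sketch below assumes the tag corners shown in the calibration view of FIG. 11 have already been detected by some AprilTag library (the detection call is not shown) and uses OpenCV's solvePnP to recover the camera pose relative to the calibration board; the corner ordering and tag geometry are placeholders.

```python
import cv2
import numpy as np

def estimate_board_pose(tag_corners_px, tag_size_m, K, dist_coeffs=None):
    """Pose of a single detected tag in the camera frame; an input to the extrinsic calibration."""
    half = tag_size_m / 2.0
    object_pts = np.array([[-half,  half, 0.0],        # tag corners in the tag's own frame (assumed order)
                           [ half,  half, 0.0],
                           [ half, -half, 0.0],
                           [-half, -half, 0.0]], dtype=np.float64)
    image_pts = np.asarray(tag_corners_px, dtype=np.float64)
    if dist_coeffs is None:
        dist_coeffs = np.zeros(5)
    ok, rvec, tvec = cv2.solvePnP(object_pts, image_pts, K, dist_coeffs)
    if not ok:
        raise RuntimeError("pose estimation failed")
    R, _ = cv2.Rodrigues(rvec)                         # rotation of the tag frame expressed in the camera frame
    return R, tvec
```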
With the apparatus and method provided by the present invention, the calibration result can be inspected after the image acquisition device 3 of the robot has been calibrated, and the inspection result can be fed back to the user intuitively in the form of graphics, data, charts and the like, allowing the user to understand the inspection result quickly. The robot can enter trial operation only after the inspection result is qualified, which improves the quality of the trial operation and avoids danger during it. The inspection method is efficient, accurate, can be carried out automatically, is user-friendly and helps the user quickly complete the initial setup of the robot before use.
It should be understood that, although this specification is described in terms of various embodiments, not every embodiment contains only one independent technical solution; this manner of presentation is adopted merely for clarity. Those skilled in the art should treat the specification as a whole, and the technical solutions in the various embodiments may be suitably combined to form other embodiments that can be understood by those skilled in the art.
The above are merely schematic specific embodiments of the present invention and are not intended to limit its scope. Any equivalent changes, modifications and combinations made by those skilled in the art without departing from the concept and principles of the present invention shall fall within the scope of protection of the present invention.

Claims (19)

  1. A calibration inspection assembly for inspecting a calibration result of an image acquisition device (3) installed on a robot, the robot including a movable end effector (5), characterized in that the calibration inspection assembly includes:
    a calibration inspection platform (1), on which a display touch screen (2) is installed, the calibration inspection platform being positioned so that the image acquisition device can capture images of the display touch screen;
    a touch screen device (4), which is installed on the end effector so that it can move when driven by the end effector and which is configured to touch and activate the display touch screen; and
    a calculation control module, which is communicatively connected with the image acquisition device, the end effector and the calibration inspection platform, and which is configured to perform calculations based on the calibration result and to compare the calculation result with the actual situation to obtain an inspection result.
  2. A robot system, characterized in that the robot system includes a robot and the calibration inspection assembly according to claim 1, an image acquisition device being installed on the robot and the robot including a movable end effector.
  3. The robot system according to claim 2, characterized in that the calculation control module of the calibration inspection assembly is integrated inside the robot.
  4. The robot system according to claim 2, characterized in that the image acquisition device is installed on the end effector.
  5. A method for inspecting a calibration result of an image acquisition device installed on a robot, the method being implemented by the calibration inspection assembly according to claim 1 and the robot including a movable end effector, characterized in that the method includes the following steps:
    a display touch screen displays a characteristic figure, the characteristic figure having a characteristic point (S302);
    the image acquisition device captures an image containing the characteristic figure and sends a signal containing the image information to a calculation control module (S303);
    the calculation control module processes and calculates the received signal and, based on the calculation result, controls the end effector to drive the touch screen device to touch the display touch screen with the characteristic point as the target (S304);
    an inspection result is obtained based on the distance between the touch point of the touch screen device on the display touch screen and the actual position of the characteristic point.
  6. The method according to claim 5, characterized in that the step in which the calculation control module processes and calculates the received signal includes: converting the position coordinates of the characteristic point between different coordinate systems.
  7. The method according to claim 6, characterized in that the step in which the calculation control module processes and calculates the received signal includes:
    extracting the characteristic figure in the image;
    finding the characteristic point of the characteristic figure in the image and obtaining the pixel coordinates of the characteristic point;
    converting the pixel coordinates into world coordinates,
    wherein the calculation control module controls the movement of the end effector based on the world coordinates.
  8. The method according to claim 7, characterized in that the step of converting the pixel coordinates into the world coordinates includes:
    converting the pixel coordinates into camera coordinates based on internal parameters of the image acquisition device;
    converting the camera coordinates into camera depth coordinates, which reflect the distance between the camera and the principal point of the image;
    converting the camera depth coordinates into the world coordinates based on the calibrated external parameters of the image acquisition device and the pose parameters of the end effector.
  9. The method according to claim 5, characterized in that the step in which the image acquisition device captures an image and the step in which the calculation control module calculates based on the image information and controls the end effector according to the calculation result are repeated several times, so that the touch screen device touches the touch screen several times.
  10. The method according to claim 9, characterized in that the step of obtaining the inspection result has two implementation modes: a preset mode (S31) and a custom mode (S32).
  11. The method according to claim 10, characterized in that, in the preset mode, the step of obtaining the inspection result includes:
    the display touch screen displays a plurality of closed figures centered on the characteristic point, the areas of the closed figures are unequal, the closed figures are arranged concentrically from the center outward in order of increasing area, and each closed figure corresponds to an accuracy threshold;
    counting the number of touch points falling into the closed figure corresponding to the required accuracy threshold;
    obtaining the detection result from the statistical result.
  12. The method according to claim 10, characterized in that, in the custom mode, the step of obtaining the detection result includes:
    the user sets the required accuracy threshold;
    the display touch screen displays a closed figure centered on the characteristic point and corresponding to the set accuracy threshold;
    counting the number of touch points falling into the closed figure corresponding to the required accuracy threshold;
    obtaining the detection result from the statistical result.
  13. The method according to claim 11 or 12, characterized in that the characteristic figure and the closed figures are all circles, and the characteristic point is the center of the circle.
  14. The method according to claim 9, characterized in that the step of obtaining the inspection result further includes: calculating the distance between each touch point and the characteristic point, and presenting all the distance values in chart form on the display touch screen.
  15. The method according to claim 14, characterized in that the step of obtaining the inspection result further includes: calculating the average of all the distance values and reflecting the average in the chart.
  16. The method according to claim 14, characterized in that the step of obtaining the inspection result further includes: calculating the standard deviation of all the distance values and reflecting the standard deviation in the table.
  17. The method according to claim 16, characterized in that, if the standard deviation is higher than a predetermined value, the inspection result is deemed invalid and the method returns to its first step.
  18. A method for calibrating an image acquisition device installed on a robot, characterized in that the method includes the following steps:
    a calibration step;
    an inspection step, implemented on the basis of the method according to any one of claims 5-17,
    wherein, if the inspection result of the inspection step does not meet expectations, the calibration step is repeated.
  19. The method according to claim 18, characterized in that the calibration step is also implemented by a calibration inspection assembly, the calibration step including:
    the display touch screen displays a calibration figure;
    the image acquisition device captures an image containing the calibration figure and sends a signal containing the image information to a calculation control module;
    the calculation control module processes and calculates the received signal, thereby completing the calibration.
PCT/CN2019/104559 2019-09-05 2019-09-05 标定检验组件、机器人系统、检验方法和标定方法 WO2021042332A1 (zh)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN201980096363.0A CN113840695B (zh) 2019-09-05 2019-09-05 标定检验组件、机器人系统、检验方法和标定方法
PCT/CN2019/104559 WO2021042332A1 (zh) 2019-09-05 2019-09-05 标定检验组件、机器人系统、检验方法和标定方法

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2019/104559 WO2021042332A1 (zh) 2019-09-05 2019-09-05 标定检验组件、机器人系统、检验方法和标定方法

Publications (1)

Publication Number Publication Date
WO2021042332A1 (zh)

Family

ID=74852165

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2019/104559 WO2021042332A1 (zh) 2019-09-05 2019-09-05 标定检验组件、机器人系统、检验方法和标定方法

Country Status (2)

Country Link
CN (1) CN113840695B (zh)
WO (1) WO2021042332A1 (zh)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114052915A (zh) * 2021-11-02 2022-02-18 武汉联影智融医疗科技有限公司 手术机器人定位精度的测试方法、系统和模体
CN115235527A (zh) * 2022-07-20 2022-10-25 上海木蚁机器人科技有限公司 传感器外参标定方法、装置以及电子设备
IT202100032777A1 (it) * 2021-12-28 2023-06-28 Comau Spa "Apparecchiatura e procedimento per testare in modo automatico un sistema di infotainment di un autoveicolo"

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102294695A (zh) * 2010-06-25 2011-12-28 鸿富锦精密工业(深圳)有限公司 机器人标定方法及标定系统
CN105094405A (zh) * 2014-05-23 2015-11-25 中兴通讯股份有限公司 自动调整有效触点的方法及装置
CN109807885A (zh) * 2018-12-29 2019-05-28 深圳市越疆科技有限公司 一种机械手的视觉标定方法、装置及智能终端

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2008014940A (ja) * 2006-06-08 2008-01-24 Fast:Kk 平面状被撮像物のカメラ計測のためのカメラキャリブレーション方法、および応用計測装置
KR101526424B1 (ko) * 2013-12-18 2015-06-05 현대자동차 주식회사 차량용 헤드 업 디스플레이 검사장치 및 그 방법
CN105844670B (zh) * 2016-03-30 2018-12-18 广东速美达自动化股份有限公司 水平机器人移动相机多点移动标定方法
CN107053177B (zh) * 2017-04-13 2020-07-17 北京邮电大学 改进的基于筛选和最小二乘法的手眼标定算法
CN109887041B (zh) * 2019-03-05 2020-11-20 中测国检(北京)测绘仪器检测中心 一种机械臂控制数字相机摄影中心位置和姿态的方法
US10399227B1 (en) * 2019-03-29 2019-09-03 Mujin, Inc. Method and control system for verifying and updating camera calibration for robot control
CN110148174A (zh) * 2019-05-23 2019-08-20 北京阿丘机器人科技有限公司 标定板、标定板识别方法及装置

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102294695A (zh) * 2010-06-25 2011-12-28 鸿富锦精密工业(深圳)有限公司 机器人标定方法及标定系统
CN105094405A (zh) * 2014-05-23 2015-11-25 中兴通讯股份有限公司 自动调整有效触点的方法及装置
CN109807885A (zh) * 2018-12-29 2019-05-28 深圳市越疆科技有限公司 一种机械手的视觉标定方法、装置及智能终端

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114052915A (zh) * 2021-11-02 2022-02-18 武汉联影智融医疗科技有限公司 手术机器人定位精度的测试方法、系统和模体
CN114052915B (zh) * 2021-11-02 2023-11-21 武汉联影智融医疗科技有限公司 手术机器人定位精度的测试方法、系统和模体
IT202100032777A1 (it) * 2021-12-28 2023-06-28 Comau Spa "Apparecchiatura e procedimento per testare in modo automatico un sistema di infotainment di un autoveicolo"
WO2023126795A1 (en) * 2021-12-28 2023-07-06 Comau S.P.A. An apparatus and method for automatically testing an infotainment system of a motor-vehicle
CN115235527A (zh) * 2022-07-20 2022-10-25 上海木蚁机器人科技有限公司 传感器外参标定方法、装置以及电子设备
WO2024016892A1 (zh) * 2022-07-20 2024-01-25 上海木蚁机器人科技有限公司 传感器外参标定方法、装置以及电子设备

Also Published As

Publication number Publication date
CN113840695A (zh) 2021-12-24
CN113840695B (zh) 2024-03-08

Similar Documents

Publication Publication Date Title
WO2021042332A1 (zh) 标定检验组件、机器人系统、检验方法和标定方法
US11254008B2 (en) Method and device of controlling robot system
US11059169B2 (en) Method of controlling robot, method of teaching robot, and robot system
JP4191080B2 (ja) 計測装置
JP6280525B2 (ja) カメラのミスキャリブレーションの実行時決定のためのシステムと方法
US9604363B2 (en) Object pickup device and method for picking up object
US20070071310A1 (en) Robot simulation device
US20180178388A1 (en) Control apparatus, robot and robot system
US20120027307A1 (en) Image Measurement Device, Method For Image Measurement, And Computer Readable Medium Storing A Program For Image Measurement
JP2020501221A (ja) 組み込みシステムの汎用自動試験
US20180178389A1 (en) Control apparatus, robot and robot system
US11094082B2 (en) Information processing apparatus, information processing method, robot system, and non-transitory computer-readable storage medium
TWI699264B (zh) 視覺導引機器手臂校正方法
CN111152223A (zh) 一种全自动机器人手眼标定方法
JP6585693B2 (ja) 物体検査システム及び物体検査方法
JP6488571B2 (ja) 教示装置、及びロボットシステム
JP2959043B2 (ja) 画像処理装置
TWI747079B (zh) 機械手臂的定位精度量測系統與方法
US11478936B2 (en) Image processing apparatus that processes image picked up by image pickup apparatus attached to robot, control method therefor, and storage medium storing control program therefor
Scaria et al. Cost Effective Real Time Vision Interface for Off Line Simulation of Fanuc Robots
WO2022060689A1 (en) Feature inspection system
CN111562413A (zh) 检测方法及检测系统
CN115984388B (zh) 一种空间定位精度评估方法、系统、存储介质及计算机
US20230321836A1 (en) Correction device, correction method, and robot system
WO2022172471A1 (ja) 支援システム、画像処理装置、支援方法およびプログラム

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19943893

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19943893

Country of ref document: EP

Kind code of ref document: A1