CN111445521A - Target workpiece position determining method, device, terminal and readable storage medium - Google Patents


Info

Publication number
CN111445521A
Authority
CN
China
Prior art keywords
target workpiece, coordinate, coordinate system, coordinates, obtaining
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202010114299.9A
Other languages
Chinese (zh)
Inventor
殷兴国
吴兵
冯永
Current Assignee
Ji Hua Laboratory
Original Assignee
Ji Hua Laboratory
Priority date
Filing date
Publication date
Application filed by Ji Hua Laboratory
Priority to CN202010114299.9A
Publication of CN111445521A
Legal status: Pending

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00: Image analysis
    • G06T7/70: Determining position or orientation of objects or cameras
    • G06T7/73: Determining position or orientation of objects or cameras using feature-based methods
    • G06T7/0002: Inspection of images, e.g. flaw detection
    • G06T7/0004: Industrial image inspection

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Quality & Reliability (AREA)
  • Manipulator (AREA)

Abstract

The invention discloses a method for determining the position of a target workpiece, comprising the following steps: acquiring a first coordinate set of a target workpiece in a first coordinate system and a second coordinate set of the target workpiece in a second coordinate system, wherein the first coordinate system corresponds to a robot and the second coordinate system corresponds to a camera; obtaining a transformation relation between the coordinates of the target workpiece in the first coordinate system and those in the second coordinate system based on the two coordinate sets; and obtaining the actual position coordinates of the target workpiece in the first coordinate system based on the transformation relation. The invention also discloses a device, a terminal and a readable storage medium. The method improves the speed and the precision of the process of determining the actual position coordinates of the target workpiece.

Description

Target workpiece position determining method, device, terminal and readable storage medium
Technical Field
The present invention relates to the field of positioning technologies, and in particular, to a method, an apparatus, a terminal, and a readable storage medium for determining a position of a target workpiece.
Background
With the spread of Industry 3.0, industrial robots are increasingly automated, and the combination of machine vision and industrial robots is increasingly common. When a robot vision system operates autonomously, a camera is fixed on the end effector of the robot arm to form a hand-eye system, and the transformation relation between the robot coordinate system and the image coordinate system of the camera must be determined before the hand-eye system can be used. In the prior art, determining this transformation relation requires multiple manual point-location teaching operations, which makes the whole robot hand-eye calibration process cumbersome and greatly reduces the speed and precision of the calibration.
The above is only intended to assist understanding of the technical solution of the present invention, and does not constitute an admission that the above is prior art.
Disclosure of Invention
The invention mainly aims to provide a method, a device, a terminal and a readable storage medium for determining the position of a target workpiece, and aims to solve the technical problem of how to improve the speed and the precision of the process of determining the actual position coordinates of the target workpiece.
In order to achieve the above object, the present invention provides a target workpiece position determining method including:
acquiring a first coordinate set of a target workpiece in a first coordinate system and a second coordinate set of the target workpiece in a second coordinate system, wherein the first coordinate system is a coordinate system corresponding to a robot, and the second coordinate system is a coordinate system corresponding to a camera;
obtaining a transformation relation of the target workpiece between corresponding coordinates of the first coordinate system and the second coordinate system based on the first coordinate set and the second coordinate set;
and obtaining the actual position coordinates of the target workpiece in the first coordinate system based on the transformation relation.
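The three steps above can be sketched end-to-end as follows. This is an illustration, not code from the disclosure: every name is a hypothetical placeholder, and the "transformation relation" is simplified here to a pure translation, whereas the invention also accounts for rotation.

```python
def fit_translation(robot_pts, camera_pts):
    # Simplest possible stand-in for the transformation relation:
    # the mean offset between paired robot-frame and camera-frame points.
    n = len(robot_pts)
    dx = sum(r[0] - c[0] for r, c in zip(robot_pts, camera_pts)) / n
    dy = sum(r[1] - c[1] for r, c in zip(robot_pts, camera_pts)) / n
    return dx, dy

def to_robot_frame(camera_pt, relation):
    # Step 3: map a camera-frame coordinate into the robot frame.
    dx, dy = relation
    return (camera_pt[0] + dx, camera_pt[1] + dy)
```

A usage sketch: fit the relation once from the paired coordinate sets, then reuse it for every subsequently imaged workpiece.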
Optionally, the camera is fixed to a robot arm end of the robot, and the step of acquiring a first coordinate set of the target workpiece in a first coordinate system includes:
acquiring a first central coordinate of the target workpiece in the first coordinate system, wherein the first central coordinate is a coordinate of the target workpiece when the target workpiece is located at the central position of the field of view of the camera;
and controlling the robot to move on a preset track based on the first center coordinate and a preset step length, so as to obtain eight moving coordinates of the target workpiece in the first coordinate system, and obtain a first coordinate set consisting of the first center coordinate and the eight moving coordinates.
Optionally, after the step of acquiring the first center coordinate of the target workpiece in the first coordinate system, the method further includes:
controlling the camera to acquire nine target workpiece images corresponding to the target workpiece and the first coordinate set when the robot is controlled to move on a preset track based on the first center coordinate and the preset step length;
the step of obtaining a second set of coordinates of the target workpiece in a second coordinate system comprises:
and based on a preset image positioning algorithm, positioning the target workpiece in the nine target workpiece images, and correspondingly obtaining nine coordinates of the target workpiece in the second coordinate system to obtain a second coordinate set.
Optionally, the step of obtaining the actual position coordinates of the target workpiece in the first coordinate system based on the transformation relation includes:
acquiring a second central coordinate of the target workpiece in the first coordinate system, wherein the second central coordinate is a position coordinate of the robot tool when the center of the robot tool is aligned with the center of the target workpiece;
calculating a position deviation between the second center coordinate and the first center coordinate;
and obtaining the actual position coordinates of the target workpiece in the first coordinate system based on the transformation relation and the position deviation.
Optionally, before the step of obtaining the actual position coordinates of the target workpiece in the first coordinate system based on the transformation relation and the position deviation, the method further includes:
controlling the camera to acquire a current image corresponding to the target workpiece, wherein the current image is an image of the target workpiece when the target workpiece is located at the center of the field of view of the camera;
based on the current image, positioning the target workpiece in the current image through a preset image positioning algorithm to obtain a pixel coordinate of the target workpiece in the second coordinate system;
the step of obtaining the actual position coordinates of the target workpiece in the first coordinate system based on the transformation relationship and the position deviation includes:
and obtaining the actual position coordinate of the target workpiece in the first coordinate system based on the pixel coordinate and the transformation relation and by combining the position deviation.
Optionally, the step of obtaining the actual position coordinates of the target workpiece in the first coordinate system based on the pixel coordinates and the transformation relation and by combining the position deviation includes:
obtaining the current coordinate of the target workpiece in the first coordinate system based on the pixel coordinate and the transformation relation;
and obtaining the actual position coordinate of the target workpiece in the first coordinate system based on the current coordinate and by combining the position deviation.
Optionally, the step of obtaining the actual position coordinates of the target workpiece in the first coordinate system based on the current coordinates and the position deviation includes:
obtaining actual application process coordinates of the target workpiece in the first coordinate system based on the current coordinates;
and obtaining the actual position coordinate of the target workpiece in the first coordinate system based on the actual application process coordinate and by combining the position deviation.
The present invention also provides a target workpiece position determining apparatus including:
the system comprises an acquisition module, a processing module and a control module, wherein the acquisition module is used for acquiring a first coordinate set of a target workpiece in a first coordinate system and a second coordinate set of the target workpiece in a second coordinate system, the first coordinate system is a coordinate system corresponding to a robot, and the second coordinate system is a coordinate system corresponding to a camera;
the processing module is used for obtaining a transformation relation between the coordinates of the target workpiece in the first coordinate system and the coordinates of the target workpiece in the second coordinate system based on the first coordinate set and the second coordinate set; and obtaining the actual position coordinates of the target workpiece in the first coordinate system based on the transformation relation.
In addition, to achieve the above object, the present invention also provides a terminal, including: a memory, a processor and a target workpiece position determining program stored on the memory and executable on the processor, the target workpiece position determining program when executed by the processor implementing the steps of the target workpiece position determining method described above.
In addition, to achieve the above object, the present invention also provides a computer storage medium having a target workpiece position determining program stored thereon, the target workpiece position determining program implementing the steps of the target workpiece position determining method described above when executed by a processor.
According to the method, device, terminal and readable storage medium for determining the position of the target workpiece, a first coordinate set of the target workpiece in a first coordinate system and a second coordinate set in a second coordinate system are acquired, where the first coordinate system corresponds to the robot and the second coordinate system corresponds to the camera; a transformation relation between the corresponding coordinates of the target workpiece in the two coordinate systems is obtained from the two coordinate sets; and the actual position coordinates of the target workpiece in the first coordinate system are obtained from the transformation relation. As a result, the robot can determine the actual position coordinates of the target workpiece after a single manual teaching. Specifically, the robot only needs to be moved manually once, to place the target workpiece at the center of the camera's field of view; at that moment the coordinates of the target workpiece in the first and second coordinate systems are recorded, and the robot then automatically acquires the remaining coordinates of the first and second coordinate sets under its control program. The transformation relation between corresponding coordinates in the two coordinate systems is obtained from the two sets, and any coordinate of the target workpiece obtained in the second coordinate system can then be transformed into the first coordinate system, from which the actual position coordinates are derived. The robot therefore does not have to be moved many times to align its tool center with the target workpiece reference (including the center of the target workpiece). By contrast, the conventional teaching process of aligning the tool center with the target workpiece reference is performed manually, and judging the alignment by eye introduces large errors; improving the accuracy of the position determination in that way requires many time-consuming visual checks. With the present application, the actual position coordinates of the target workpiece can be determined after a single manual teaching, improving both the speed and the precision of the process.
Drawings
FIG. 1 is a schematic flow chart diagram illustrating a first embodiment of a target workpiece position determining method according to the present invention;
FIG. 2 is a schematic diagram of a predetermined trajectory of a target workpiece position determination method according to the present invention;
FIG. 3 is a functional block diagram of a preferred embodiment of the target workpiece position determining apparatus of the present invention;
fig. 4 is a schematic terminal structure diagram of a hardware operating environment according to an embodiment of the present invention.
The implementation, functional features and advantages of the objects of the present invention will be further explained with reference to the accompanying drawings.
Detailed Description
It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit the invention.
The present invention provides a method for determining a target workpiece position, referring to fig. 1, fig. 1 is a flowchart illustrating a method for determining a target workpiece position according to a first embodiment of the present invention.
While a logical order is shown in the flowchart, in some cases the steps shown or described may be performed in a different order. The target workpiece position determining method can be applied to equipment and/or terminals, including but not limited to personal computers, mobile phones and the like. For convenience of description, the subject performing each step of the method is omitted below. The target workpiece position determining method comprises the following steps:
step S10, acquiring a first coordinate set of the target workpiece in a first coordinate system and a second coordinate set of the target workpiece in a second coordinate system, wherein the first coordinate system is a coordinate system corresponding to the robot, and the second coordinate system is a coordinate system corresponding to the camera;
the vision system of the robot comprises a hand-eye vision system which is formed by a camera and the tail end of a robot arm. In determining the actual position coordinates of the target workpiece for the robot, it is necessary to find a suitable reference, which is the camera optical center for the camera, and the tool center for the end of the robot arm (for the robot to perform a specified production task, it is usually necessary to fix a tool, such as a welding gun of a welding robot, a glue gun of a glue coating robot, a clamp of a transfer robot, etc., at the end of the robot, and the tool center is the center point of the tool).
When the robot is applied to an actual production site, an image of a semi-finished product is generally obtained through the camera. At this time the position of the semi-finished product is fixed and does not change during robot processing. After the camera captures the image, the semi-finished product has a determined position in the second coordinate system through calculation; however, because the coordinate systems differ, the robot cannot determine the position of the semi-finished product in the first coordinate system from the second coordinate system alone. In effect, the semi-finished product has one independent coordinate in the second coordinate system and another in the first coordinate system.
However, these two coordinates are correlated: they simultaneously represent the coordinates of the same semi-finished product at a fixed position. The aim of the invention is to obtain this association, so that during subsequent production the coordinates of the semi-finished product in the first coordinate system can be derived from its coordinates in the second coordinate system. Furthermore, the association is generally obtained before production by setting up a target workpiece, so that during production the robot can use it directly to transform coordinates from the second coordinate system into the first. The first coordinate system is the coordinate system corresponding to the robot, and the second coordinate system is the coordinate system corresponding to the camera. In addition, hand-eye vision systems are classified, according to whether the camera moves with the robot, into Eye-in-Hand systems and Eye-to-Hand systems. In an Eye-in-Hand system, the camera is mounted at the end of the robot arm and moves with the robot during operation; in an Eye-to-Hand system, the camera is mounted at a fixed position outside the robot and does not move with it during operation.
Obtaining the above association, that is, the hand-eye position relationship, involves three unknowns: abscissa rotation, ordinate rotation and displacement. At least 3 equations are therefore needed to solve for them, i.e. 3 different coordinates of the target workpiece in the first coordinate system together with the 3 corresponding coordinates in the second coordinate system. To ensure the accuracy of the transformation between the two coordinate systems in practical applications, 9 corresponding coordinate pairs are generally collected: 9 coordinates of the target workpiece in the first coordinate system are obtained as the first coordinate set, and the 9 corresponding coordinates in the second coordinate system are acquired as the second coordinate set.
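One way to make the unknown count concrete (this is a reading of the passage above, not notation from the patent itself) is to model the hand-eye relation as a planar rotation plus displacement:

```latex
x' = x\cos\theta - y\sin\theta + t_x, \qquad
y' = x\sin\theta + y\cos\theta + t_y
```

with three unknowns $\theta$, $t_x$, $t_y$. Each coordinate pair contributes two equations, so a few pairs already determine the unknowns; collecting nine well-spread pairs and solving in a least-squares sense, as the nine-point scheme above does, improves robustness to image-positioning noise.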
Further, the camera is fixed to a robot arm end of the robot, and the step of acquiring a first coordinate set of the target workpiece in a first coordinate system includes:
step a, acquiring a first central coordinate of the target workpiece in the first coordinate system, wherein the first central coordinate is a coordinate of the target workpiece when the target workpiece is located at the central position of the field of view of the camera.
And b, controlling the robot to move on a preset track based on the first center coordinate and a preset step length so as to obtain eight moving coordinates of the target workpiece in the first coordinate system, and obtaining a first coordinate group consisting of the first center coordinate and the eight moving coordinates.
Specifically, when determining the actual position coordinates of the target workpiece with respect to the robot, the operation is performed in a taught tool coordinate system and target workpiece coordinate system (taught automatically by a teach pendant), and the photographed target object is recorded as the target workpiece. In actual production, eye-in-hand systems are the more widely used, so eye-in-hand is taken as the example hereinafter (the actual position coordinates of a target workpiece can equally be determined with eye-to-hand); the camera and the clamping tool center are fixed at the end of the robot.
After this preparation, the posture of the robot is kept unchanged and the robot is moved to a certain height in the target workpiece coordinate system, recorded as z0. Thereafter the robot is moved only in the abscissa and ordinate directions and not in the vertical direction; that is, the robot subsequently moves within a two-dimensional plane.
Referring to fig. 2, Point_center is the position of the first center coordinate, the numbers 1 to 9 in the circles are the numbers of the nine points, and the arrows between the numbers indicate the moving direction of the robot, i.e. the planned moving track (the robot need not move in exactly this order, provided image capture at all nine points in fig. 2 is completed); L0 is the preset step length. When the robot moves the target workpiece to the center position of the camera's field of view, the position coordinate of the target workpiece in the first coordinate system is obtained and taken as the first center coordinate, recorded as Point_center(x, y). The robot is then controlled to move along the planned track, by the preset step L0 each time, to the positions of the subsequent 8 points, and the position coordinate of the target workpiece in the first coordinate system is recorded at each point. The position coordinates of the target workpiece at these nine points in the first coordinate system are recorded as the robot nine-point coordinates Rob_point[9] and taken as the first coordinate set, where:
Rob_point[0].x=Point_center.x;
Rob_point[0].y=Point_center.y;
Rob_point[1].x=Point_center.x-L0;
Rob_point[1].y=Point_center.y;
Rob_point[2].x=Point_center.x-L0;
Rob_point[2].y=Point_center.y+L0;
Rob_point[3].x=Point_center.x;
Rob_point[3].y=Point_center.y+L0;
Rob_point[4].x=Point_center.x+L0;
Rob_point[4].y=Point_center.y+L0;
Rob_point[5].x=Point_center.x+L0;
Rob_point[5].y=Point_center.y;
Rob_point[6].x=Point_center.x+L0;
Rob_point[6].y=Point_center.y-L0;
Rob_point[7].x=Point_center.x;
Rob_point[7].y=Point_center.y-L0;
Rob_point[8].x=Point_center.x-L0;
Rob_point[8].y=Point_center.y-L0;
wherein the coordinates Rob_point[0]-Rob_point[8] correspond in sequence to the points numbered 1-9 in fig. 2.
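The nine robot coordinates listed above can be generated from the first center coordinate and the preset step. The helper below is a hypothetical sketch (not from the patent), reproducing the offsets of Rob_point[0] through Rob_point[8] in the same order:

```python
def nine_point_grid(center_x, center_y, step):
    """Nine grid points around a center, ordered as Rob_point[0]..Rob_point[8]."""
    # Offsets in step units: center first, then the ring in the listed order.
    offsets = [(0, 0), (-1, 0), (-1, 1), (0, 1), (1, 1),
               (1, 0), (1, -1), (0, -1), (-1, -1)]
    return [(center_x + ox * step, center_y + oy * step)
            for ox, oy in offsets]
```

With Point_center = (100, 200) and L0 = 5, this yields (100, 200), (95, 200), (95, 205), and so on, matching the listing above term by term.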
Further, after the step of controlling the robot to locate the target workpiece at the center of the field of view of the camera and acquiring the first center coordinate of the target workpiece in the first coordinate system, the method further includes:
c, controlling the camera to acquire nine target workpiece images corresponding to the target workpiece and the first coordinate set when the robot is controlled to move on a preset track based on the first central coordinate and the preset step length;
specifically, when a first central coordinate is acquired, a camera is controlled to shoot a target workpiece image at a position corresponding to the first central coordinate; accordingly, when the remaining 8 coordinates in the first coordinate set are acquired, the camera is controlled to capture a target workpiece image at a position corresponding to each coordinate. So as to obtain nine target workpiece images corresponding to nine points including the first central coordinate position.
The step of obtaining a second set of coordinates of the target workpiece in a second coordinate system comprises:
and d, positioning the target workpiece in the nine target workpiece images based on a preset image positioning algorithm, and correspondingly obtaining nine coordinates of the target workpiece in the second coordinate system to obtain a second coordinate set.
Specifically, the nine target workpiece images captured above are recorded as the nine-point images Pic[9]. According to a preset image positioning algorithm, the target workpiece in each of the nine images is located in turn, giving nine pixel coordinates of the target workpiece in the second coordinate system, recorded as the image nine-point coordinates Pic_point[9] and taken as the second coordinate set. It is understood that the second coordinate set includes Pic_point[0]-Pic_point[8], which correspond one-to-one by number to the robot nine-point coordinates Rob_point[0]-Rob_point[8]. As a concrete example of the preset algorithm, the image may first be pixelated, and then the position in the whole image of the pixel representing the center of the target workpiece is determined; if that pixel lies in the 5th row and 8th column, the coordinate of the target workpiece in the second coordinate system is (5, 8).
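The patent leaves the image positioning algorithm unspecified. As one minimal stand-in (a sketch, not the disclosed algorithm), the center pixel of the workpiece can be estimated as the centroid of a binary foreground mask:

```python
def locate_center(mask):
    """Return the (row, col) centroid of the truthy pixels in a binary mask.

    A minimal placeholder for the patent's unspecified image
    positioning algorithm; real systems would use template matching
    or blob detection instead.
    """
    rows = cols = count = 0
    for r, row in enumerate(mask):
        for c, v in enumerate(row):
            if v:
                rows += r
                cols += c
                count += 1
    return (rows / count, cols / count)
```

Applied to each of the nine images, this would yield the nine pixel coordinates Pic_point[0] through Pic_point[8].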
Step S20, obtaining a transformation relationship between the coordinates of the target workpiece in the first coordinate system and the coordinates of the target workpiece in the second coordinate system based on the first coordinate set and the second coordinate set;
and obtaining nine pairs of relations according to the one-to-one correspondence relation between the coordinates Pic _ point [0] -Pic _ point [8] of the nine point positions of the image and the coordinates Rob _ point [0] -Rob _ point [8] of the nine point positions of the robot. Specifically, it is known that the robot performs position transformation of the end of the robot arm by rotation and displacement, and assuming that the coordinates of the target workpiece in the second coordinate system are (x, y) and the coordinates of the target workpiece in the first coordinate system are (x ', y'), the coordinates (x ', y') can be obtained by rotating and displacing the coordinates (x, y), and the transformation relation between the coordinates of the target workpiece in the second coordinate system composed of rotation and displacement and the coordinates in the first coordinate system is the transformation relation h, and each of the coordinates in the second coordinate system and the first coordinate system corresponds to one transformation relation.
The nine pairs of relationships are represented as a transformation relationship matrix H between the robot position coordinates and the image position coordinates, the matrix H being:
        | h11 h12 h13 |
    H = | h21 h22 h23 |
        | h31 h32 h33 |
wherein h11, h12, ..., h33 jointly encode the one-to-one correspondence between the image nine-point coordinates Pic_point[0]-Pic_point[8] and the robot nine-point coordinates Rob_point[0]-Rob_point[8]. Each of h11, h12, ..., h33 can enter the corresponding transformation relation between any coordinate in the first coordinate set and its corresponding coordinate in the second coordinate set; for example, h11 can represent the correspondence between Pic_point[0] and Rob_point[0], or equally that between Pic_point[8] and Rob_point[8].
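The matrix H above is a full 3x3 relation. As a hedged sketch of how such a relation can be recovered from point correspondences (all names illustrative, not from the patent), the affine special case (last row fixed to 0, 0, 1) can be fitted from three non-collinear pairs by solving two 3x3 linear systems with Cramer's rule:

```python
def solve3(A, b):
    """Solve a 3x3 linear system A x = b by Cramer's rule."""
    def det(m):
        return (m[0][0] * (m[1][1] * m[2][2] - m[1][2] * m[2][1])
                - m[0][1] * (m[1][0] * m[2][2] - m[1][2] * m[2][0])
                + m[0][2] * (m[1][0] * m[2][1] - m[1][1] * m[2][0]))
    d = det(A)
    xs = []
    for i in range(3):
        Ai = [row[:] for row in A]      # replace column i with b
        for r in range(3):
            Ai[r][i] = b[r]
        xs.append(det(Ai) / d)
    return xs

def fit_affine(camera_pts, robot_pts):
    """Fit x' = a*x + b*y + c and y' = d*x + e*y + f from three
    non-collinear (pixel -> robot) correspondences."""
    A = [[x, y, 1] for x, y in camera_pts[:3]]
    row1 = solve3(A, [p[0] for p in robot_pts[:3]])
    row2 = solve3(A, [p[1] for p in robot_pts[:3]])
    return row1, row2

def apply_affine(H, pt):
    """Map a second-coordinate-system point into the first coordinate system."""
    (a, b, c), (d, e, f) = H
    x, y = pt
    return (a * x + b * y + c, d * x + e * y + f)
```

With nine pairs, as the patent collects, the same parameters would instead be fitted in a least-squares sense for robustness; three exact pairs are used here only to keep the sketch short.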
Step S30, obtaining the actual position coordinates of the target workpiece in the first coordinate system based on the transformation relation.
In this embodiment, a first coordinate set of the target workpiece in a first coordinate system and a second coordinate set of the target workpiece in a second coordinate system are obtained, wherein the first coordinate system is the coordinate system corresponding to the robot and the second coordinate system is the coordinate system corresponding to the camera; a transformation relation between the coordinates of the target workpiece in the first coordinate system and its coordinates in the second coordinate system is obtained based on the first coordinate set and the second coordinate set; and the actual position coordinates of the target workpiece in the first coordinate system are obtained based on the transformation relation, so that the robot can determine the actual position coordinates of the target workpiece after a single manual teaching. Specifically, in the process of determining the actual position coordinates of the target workpiece, the robot only needs to be moved manually once so that the target workpiece is located at the center of the camera's field of view; at this moment the coordinates of the target workpiece in the first coordinate system and in the second coordinate system are recorded. The robot then automatically completes the acquisition of the remaining coordinates in the first coordinate set and the second coordinate set according to a control program. The transformation relation between the corresponding coordinates of the target workpiece in the first and second coordinate systems is obtained from the first and second coordinate sets, and any subsequently obtained coordinates of the target workpiece in the second coordinate system are transformed into coordinates in the first coordinate system according to this relation, from which the actual position coordinates of the target workpiece in the first coordinate system can be obtained.
The transformation from the coordinates of the target workpiece in the second coordinate system to its coordinates in the first coordinate system is thus realized without moving the robot many times to align the tool center of the robot with the target workpiece reference (including the center of the target workpiece). In addition, the teaching process of aligning the tool center of the robot with the target workpiece reference is performed manually, and whether the tool center is aligned with the reference is judged by the human eye, so the error is very large; to improve the accuracy of the determination of the actual position coordinates of the target workpiece, much time would have to be spent repeatedly judging the alignment by eye. With the method and device of the present application, the actual position coordinates of the target workpiece can be determined after the robot is manually taught only once, improving both the speed and the precision of the process of determining the actual position coordinates of the target workpiece.
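As an illustration of how such a transformation relation can be estimated from the nine coordinate pairs, the following sketch fits an affine model by least squares. This is not the patented implementation; the affine assumption and the function name `fit_transform` are illustrative only.

```python
import numpy as np

def fit_transform(pixel_pts, robot_pts):
    """Estimate a 3x3 homogeneous matrix H mapping pixel coordinates to robot
    coordinates from N >= 3 correspondences (the embodiment uses nine grid
    points), assuming an affine model, via least squares."""
    pixel_pts = np.asarray(pixel_pts, dtype=float)
    robot_pts = np.asarray(robot_pts, dtype=float)
    # Design matrix with homogeneous rows [x, y, 1].
    A = np.hstack([pixel_pts, np.ones((len(pixel_pts), 1))])
    # Solve A @ sol ~= robot_pts in the least-squares sense; sol is (3, 2).
    sol, *_ = np.linalg.lstsq(A, robot_pts, rcond=None)
    # Return the full 3x3 homogeneous matrix.
    return np.vstack([sol.T, [0.0, 0.0, 1.0]])
```

With H in hand, a pixel coordinate (x0, y0) maps to a robot coordinate as the first two components of H @ [x0, y0, 1].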
Further, in a second embodiment of the method for determining a position of a target workpiece according to the present invention, based on the first embodiment, the step of obtaining an actual position coordinate of the target workpiece in the first coordinate system based on the transformation relation includes:
Step S31, acquiring a second center coordinate of the target workpiece in the first coordinate system, wherein the second center coordinate is the position coordinate when the robot tool center is aligned with the center of the target workpiece;
A second center coordinate of the target workpiece in the first coordinate system is acquired, wherein the second center coordinate is the position coordinate when the center of the robot tool is aligned with the center of the target workpiece. Specifically, with the position of the target workpiece kept unchanged and the vertical coordinate direction of the robot kept unchanged, the robot arm is guided manually until the tool center is aligned with the center of the target workpiece; the coordinates in the first coordinate system at that moment are used as the second center coordinate and recorded as Teach_center(x, y).
Step S32, calculating a positional deviation between the second center coordinates and the first center coordinates;
The positional deviation between the second center coordinate and the first center coordinate is calculated. Specifically, there is a distance between the center of the tool and the center of the camera (the optical axis of the camera). In the foregoing process of solving the transformation relation matrix, the robot position coordinate used was the first center coordinate, not the second center coordinate; that is, the relation matrix obtained from the first center coordinate cannot be used directly for the transformation from the second coordinate system to the first coordinate system. There is a positional deviation between the first center coordinate and the second center coordinate, namely the distance between the tool center and the camera optical axis, and this positional deviation is denoted (off_x, off_y), wherein:
off_x = Teach_center.x - Point_center.x;
off_y = Teach_center.y - Point_center.y.
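As a minimal illustration of steps S31 and S32, the positional deviation is a componentwise difference of the two center coordinates (the helper name below is hypothetical):

```python
def position_deviation(teach_center, point_center):
    """off = Teach_center - Point_center: the offset between the robot tool
    center and the camera optical axis, expressed in the first coordinate
    system."""
    off_x = teach_center[0] - point_center[0]
    off_y = teach_center[1] - point_center[1]
    return off_x, off_y
```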
Step S33, obtaining the actual position coordinates of the target workpiece in the first coordinate system based on the transformation relation and the positional deviation.
Further, before the step of obtaining the actual position coordinates of the target workpiece in the first coordinate system based on the transformation relation and the position deviation, the method further includes:
Step S34, controlling the camera to acquire a current image corresponding to the target workpiece, wherein the current image is an image of the target workpiece located at the center of the field of view of the camera;
Step S35, based on the current image, positioning the target workpiece in the current image through a preset image positioning algorithm to obtain the pixel coordinates of the target workpiece in the second coordinate system;
the step of obtaining the actual position coordinates of the target workpiece in the first coordinate system based on the transformation relationship and the position deviation includes:
Step S331, obtaining the actual position coordinates of the target workpiece in the first coordinate system based on the pixel coordinates and the transformation relation, in combination with the positional deviation.
Specifically, the robot is moved to the first center coordinate position (i.e., the position at which the target workpiece is located at the center of the camera's field of view), and image acquisition is carried out to obtain the current image corresponding to the target workpiece, the current image being an image of the target workpiece located at the center of the camera's field of view. A preset image positioning algorithm is used to position the target workpiece in the current image shot by the camera (i.e., the current image is pixelated and the coordinates of the pixel point representing the center of the target workpiece are obtained), yielding the pixel coordinates Pic_Point(x0, y0) of the target workpiece in the second coordinate system. The pixel coordinates Pic_Point(x0, y0) are then transformed, in combination with the positional deviation, into the actual position coordinates of the target workpiece in the first coordinate system.
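The patent leaves the "preset image positioning algorithm" unspecified. As a stand-in for illustration only, the sketch below locates the workpiece center as the centroid of dark pixels, assuming a dark workpiece on a bright background; the function name and threshold are hypothetical.

```python
import numpy as np

def locate_workpiece_center(gray_img, thresh=128):
    """Hypothetical stand-in for the 'preset image positioning algorithm':
    return the pixel coordinates (x0, y0) of the workpiece center, here the
    centroid of pixels darker than `thresh`."""
    ys, xs = np.nonzero(gray_img < thresh)  # row/column indices of dark pixels
    if len(xs) == 0:
        raise ValueError("no workpiece pixels found")
    return float(xs.mean()), float(ys.mean())
```

In practice a more robust detector (e.g., template matching or blob analysis) would be substituted here.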
Further, the step S331 includes:
Step S3311, obtaining the current coordinates of the target workpiece in the first coordinate system based on the pixel coordinates and the transformation relation;
Step S3312, obtaining the actual position coordinates of the target workpiece in the first coordinate system based on the current coordinates, in combination with the positional deviation.
Specifically, based on the pixel coordinates Pic_Point(x0, y0) and the matrix H corresponding to the transformation relation, the current coordinates of the target workpiece in the first coordinate system are obtained, and the actual position coordinates of the target workpiece in the first coordinate system are obtained based on the current coordinates in combination with the positional deviation.
In particular, from the pixel coordinates Pic_Point(x0, y0) and the transformation relation matrix H, the current coordinates Rob_Point(wld_x0, wld_y0) of the robot can be obtained, as shown in the following formula:
(wld_x0, wld_y0, 1)^T = H · (x0, y0, 1)^T
Then, in combination with the positional deviation, the current coordinates Rob_Point(wld_x0, wld_y0) are transformed into the actual position coordinates.
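The formula above can be sketched as follows, assuming H is a 3×3 homogeneous matrix mapping pixel coordinates to robot coordinates (the function name is illustrative):

```python
import numpy as np

def pixel_to_robot(H, pic_point):
    """Map Pic_Point(x0, y0) to Rob_Point(wld_x0, wld_y0) via the 3x3
    homogeneous transformation matrix H."""
    x0, y0 = pic_point
    wld = H @ np.array([x0, y0, 1.0])
    # Normalize by the homogeneous component (it is 1 for an affine H).
    return wld[0] / wld[2], wld[1] / wld[2]
```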
Further, the step S3312 includes:
Step S33121, obtaining the actual application process coordinates of the target workpiece in the first coordinate system based on the current coordinates;
Step S33122, obtaining the actual position coordinates of the target workpiece in the first coordinate system based on the actual application process coordinates, in combination with the positional deviation.
Based on the current coordinates Rob_Point(wld_x0, wld_y0), the actual application process coordinates of the target workpiece in the first coordinate system are obtained, and the actual position coordinates of the target workpiece in the first coordinate system are obtained based on the actual application process coordinates in combination with the positional deviation.
Specifically, during the determination of the transformation relation matrix H, the position of the target workpiece is kept unchanged and the camera takes a shot at a different position each time; during the determination of the actual position coordinates of the target workpiece, by contrast, the camera takes a shot at a fixed position while the target workpiece is at a different position each time. Referring to FIG. 2, the target workpiece needs to be moved along the arrows from point No. 1 of the nine points in the figure to point No. 9, in accordance with the step length L0. Thus, during the determination of the actual position coordinates, the robot coordinate position is opposite to the first center coordinate position. For example, suppose the camera takes the image at the position of point No. 2 during the determination of the actual position coordinates; the coordinates of the target workpiece in the second coordinate system are calculated by the preset image positioning algorithm, and these coordinates are then transformed by the transformation relation matrix H into the coordinates of the target workpiece in the first coordinate system at the position of point No. 2, namely (Point_center.x - L0, Point_center.y). However, the actual position coordinate determination process differs from the process of obtaining the transformation relation matrix H: when H is obtained, the target workpiece position is unchanged and the camera position changes, whereas during the actual position coordinate determination the target workpiece position changes and the camera position is unchanged.
That is, the position coordinates of point No. 2 obtained in the actual position coordinate determination process and the position coordinates of point No. 2 obtained when solving the transformation relation matrix H are symmetric with respect to the first center coordinate. In other words, the position coordinates of point No. 2 obtained in the actual position coordinate determination process are the position coordinates of point No. 6 obtained when solving the transformation relation matrix H, namely (Point_center.x + L0, Point_center.y).
Specifically, the current coordinates Rob_Point(wld_x0, wld_y0) of the target workpiece in the first coordinate system are therefore transformed, the transformed coordinates being (wld_trans_x0, wld_trans_y0), wherein:
wld_trans_x0 = 2*Point_center.x - wld_x0;
wld_trans_y0 = 2*Point_center.y - wld_y0.
Furthermore, because there is still the positional deviation (off_x, off_y) between the actual application process coordinates (wld_trans_x0, wld_trans_y0) of the target workpiece in the first coordinate system and the actual position of the target workpiece, the actual position Target_point(x, y) of the target workpiece is:
Target_point.x = wld_trans_x0 + off_x;
Target_point.y = wld_trans_y0 + off_y.
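Steps S33121 and S33122 (mirroring about the first center coordinate, then adding the deviation) can be sketched together as follows; the coordinate values used in the test are hypothetical.

```python
def actual_target_position(rob_point, point_center, off):
    """Mirror the current coordinates Rob_Point about the first center
    coordinate Point_center (point symmetry), then add the tool-center to
    optical-axis deviation (off_x, off_y) to get Target_point."""
    wld_x0, wld_y0 = rob_point
    # Actual application process coordinates.
    wld_trans_x0 = 2 * point_center[0] - wld_x0
    wld_trans_y0 = 2 * point_center[1] - wld_y0
    # Final actual position of the target workpiece.
    return wld_trans_x0 + off[0], wld_trans_y0 + off[1]
```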
In this embodiment, after the camera captures the target workpiece image at the first center coordinate position, the coordinates of the target workpiece in the second coordinate system are obtained from that image; the coordinates of the target workpiece in the first coordinate system are then obtained through the previously acquired transformation relation matrix H, the coordinates corresponding to the actual position coordinate determination process are obtained from them, and the positional deviation is added to obtain the final actual position coordinates of the target workpiece in the first coordinate system, without manually teaching the robot again, or multiple times, to obtain those coordinates. The purpose of improving the speed and the precision of the process of determining the actual position coordinates of the target workpiece is thereby achieved.
The present invention also provides a target workpiece position determining apparatus, as shown in fig. 3, including:
the system comprises an acquisition module 10, a processing module and a control module, wherein the acquisition module is used for acquiring a first coordinate set of a target workpiece in a first coordinate system and a second coordinate set of the target workpiece in a second coordinate system, the first coordinate system is a coordinate system corresponding to a robot, and the second coordinate system is a coordinate system corresponding to a camera;
a processing module 20, configured to obtain a transformation relationship between coordinates of the target workpiece in the first coordinate system and coordinates of the target workpiece in the second coordinate system based on the first coordinate set and the second coordinate set; and obtaining the actual position coordinates of the target workpiece in the first coordinate system based on the transformation relation.
Further, the obtaining module 10 includes:
the first acquisition unit is used for acquiring a first central coordinate of the target workpiece in a first coordinate system;
the first control unit is used for controlling the robot to move on a preset track based on the first center coordinate and a preset step length;
the first acquisition unit is further used for acquiring eight moving coordinates of the target workpiece in the first coordinate system to obtain a first coordinate set consisting of the first center coordinate and the eight moving coordinates.
Further, the first control unit is further configured to control the camera to acquire nine target workpiece images of the target workpiece corresponding to the first coordinate set when the robot is controlled to move on the preset track based on the first center coordinate and the preset step length.
The first acquisition unit further includes:
and the processing subunit is used for positioning the target workpiece in the nine target workpiece images based on a preset image positioning algorithm, and correspondingly obtaining nine coordinates of the target workpiece in a second coordinate system to obtain a second coordinate set.
Further, the processing module 20 includes:
the second acquisition unit is used for acquiring a second central coordinate of the target workpiece in the first coordinate system;
a calculation unit for calculating a positional deviation between the second center coordinates and the first center coordinates;
and the second acquisition unit is also used for acquiring the actual position coordinates of the target workpiece in the first coordinate system based on the transformation relation and the position deviation.
Further, the processing module 20 further includes:
the second control unit is used for controlling the robot to enable the target workpiece to be located at the center of the view field of the camera and controlling the camera to acquire a current image corresponding to the target workpiece;
the processing unit is used for positioning a target workpiece in the current image through a preset image positioning algorithm based on the current image to obtain a pixel coordinate of the target workpiece in a second coordinate system;
further, the second acquisition unit includes:
and the second acquisition subunit is used for acquiring the actual position coordinate of the target workpiece in the first coordinate system based on the pixel coordinate and the transformation relation in combination with the position deviation.
Further, the second acquiring subunit includes:
the second acquisition subunit is used for acquiring the current coordinate of the target workpiece in the first coordinate system based on the pixel coordinate and the transformation relation; and obtaining the actual position coordinates of the target workpiece in the first coordinate system based on the current coordinates and in combination with the position deviation.
Further, the second acquiring subunit includes:
the second acquisition subunit is used for acquiring actual application process coordinates of the target workpiece in the first coordinate system based on the current coordinates; and obtaining the actual position coordinates of the target workpiece in the first coordinate system based on the actual application process coordinates in combination with the position deviation.
The specific implementation of the target workpiece position determining apparatus of the present invention is substantially the same as that of the above embodiments of the target workpiece position determining method, and is not described herein again.
In addition, the invention also provides a terminal for determining the position of the target workpiece. As shown in fig. 4, fig. 4 is a schematic structural diagram of a hardware operating environment according to an embodiment of the present invention.
It should be noted that fig. 4 is a schematic structural diagram of a hardware operating environment of the target workpiece position determination terminal.
The terminal of the embodiment of the invention can be a PC, and can also be a terminal device with a data processing function, such as a smart phone, a tablet computer, a portable computer and the like.
As shown in fig. 4, the target workpiece position determination terminal may include: a processor 1001 (e.g., a CPU), a network interface 1004, a user interface 1003, a memory 1005, and a communication bus 1002, wherein the communication bus 1002 is used to enable connection and communication between these components. The user interface 1003 may include a display screen (Display) and an input unit such as a keyboard (Keyboard); optionally, the user interface 1003 may also include a standard wired interface and a wireless interface. The network interface 1004 may optionally include a standard wired interface and a wireless interface (e.g., a WI-FI interface). The memory 1005 may be a high-speed RAM memory or a non-volatile memory (e.g., a magnetic disk memory). The memory 1005 may alternatively be a storage device separate from the processor 1001.
Optionally, the terminal may further include a camera, a radio frequency (RF) circuit, sensors, an audio circuit, a WiFi module, and the like, the sensors including, for example, light sensors and motion sensors. Specifically, the light sensors may include an ambient light sensor, which may adjust the brightness of the display screen according to the brightness of the ambient light, and a proximity sensor, which may turn off the display screen and/or the backlight when the mobile terminal is moved to the ear. As one kind of motion sensor, a gravity acceleration sensor can detect the magnitude of acceleration in each direction (generally three axes) and detect the magnitude and direction of gravity when the mobile terminal is stationary; it can be used in applications for recognizing the attitude of the mobile terminal (such as switching between horizontal and vertical screens, related games, and magnetometer attitude calibration), in vibration-recognition-related functions (such as a pedometer and tapping), and the like. Of course, the mobile terminal may also be configured with other sensors such as a gyroscope, a barometer, a hygrometer, a thermometer, and an infrared sensor, which are not described herein again.
Those skilled in the art will appreciate that the terminal configuration shown in fig. 4 is not intended to be limiting and may include more or fewer components than those shown, or some components may be combined, or a different arrangement of components.
As shown in fig. 4, a memory 1005, which is a kind of computer storage medium, may include therein an operating system, a network communication module, a user interface module, and a target workpiece position determination program. The operating system is a program for managing and controlling hardware and software resources of the target workpiece position determination terminal, and supports the operation of the target workpiece position determination program and other software or programs.
The specific implementation of the target workpiece position determining terminal of the present invention is substantially the same as that of each embodiment of the target workpiece position determining method described above, and is not described herein again.
Furthermore, an embodiment of the present invention further provides a computer-readable storage medium, in which a target workpiece position determining program is stored, and the target workpiece position determining program, when executed by a processor, implements the steps of the embodiments of the target workpiece position determining method described above.
The specific implementation of the computer-readable storage medium of the present invention is substantially the same as the embodiments of the target workpiece position determining method described above, and is not described herein again.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or system that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or system. Without further limitation, an element defined by the phrase "comprising an … …" does not exclude the presence of other like elements in a process, method, article, or system that comprises the element.
The above-mentioned serial numbers of the embodiments of the present invention are only for description and do not represent the merits of the embodiments.
Through the above description of the embodiments, those skilled in the art will clearly understand that the method of the above embodiments can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware, but in many cases, the former is a better implementation manner. Based on such understanding, the technical solution of the present invention may be embodied in the form of a software product, which is stored in a storage medium (e.g., ROM/RAM, magnetic disk, optical disk) as described above and includes instructions for enabling a terminal device (e.g., a mobile phone, a computer, a server, an air conditioner, or a network device) to execute the method according to the embodiments of the present invention.
The above description is only a preferred embodiment of the present invention, and not intended to limit the scope of the present invention, and all modifications of equivalent structures and equivalent processes, which are made by using the contents of the present specification and the accompanying drawings, or directly or indirectly applied to other related technical fields, are included in the scope of the present invention.

Claims (10)

1. A target workpiece position determining method, characterized by comprising the steps of:
acquiring a first coordinate set of a target workpiece in a first coordinate system and a second coordinate set of the target workpiece in a second coordinate system, wherein the first coordinate system is a coordinate system corresponding to a robot, and the second coordinate system is a coordinate system corresponding to a camera;
obtaining a transformation relation of the target workpiece between corresponding coordinates of the first coordinate system and the second coordinate system based on the first coordinate set and the second coordinate set;
and obtaining the actual position coordinates of the target workpiece in the first coordinate system based on the transformation relation.
2. The method of claim 1, wherein the camera is affixed to a robot arm end of the robot, and wherein the step of acquiring a first set of coordinates of the target workpiece in a first coordinate system comprises:
acquiring a first central coordinate of the target workpiece in the first coordinate system, wherein the first central coordinate is a coordinate of the target workpiece when the target workpiece is located at the central position of the field of view of the camera;
and controlling the robot to move on a preset track based on the first center coordinate and a preset step length, so as to obtain eight moving coordinates of the target workpiece in the first coordinate system and obtain a first coordinate set consisting of the first center coordinate and the eight moving coordinates.
3. The method of claim 2, wherein said step of obtaining a first center coordinate of said target workpiece in said first coordinate system further comprises:
controlling the camera to acquire nine target workpiece images corresponding to the target workpiece and the first coordinate set when the robot is controlled to move on a preset track based on the first center coordinate and the preset step length;
the step of obtaining a second set of coordinates of the target workpiece in a second coordinate system comprises:
and based on a preset image positioning algorithm, positioning the target workpiece in the nine target workpiece images, and correspondingly obtaining nine coordinates of the target workpiece in the second coordinate system to obtain a second coordinate set.
4. The method of claim 3, wherein said step of deriving actual position coordinates of said target workpiece in said first coordinate system based on said transformation relationship comprises:
acquiring a second central coordinate of the target workpiece in the first coordinate system, wherein the second central coordinate is a position coordinate of the robot tool when the center of the robot tool is aligned with the center of the target workpiece;
calculating a position deviation between the second center coordinate and the first center coordinate;
and obtaining the actual position coordinates of the target workpiece in the first coordinate system based on the transformation relation and the position deviation.
5. The method of claim 4, wherein said step of deriving actual position coordinates of said target workpiece in said first coordinate system based on said transformation relationship and said positional deviation further comprises:
controlling the camera to acquire a current image corresponding to the target workpiece, wherein the current image is an image of the target workpiece when the target workpiece is located at the center of the field of view of the camera;
based on the current image, positioning the target workpiece in the current image through a preset image positioning algorithm to obtain a pixel coordinate of the target workpiece in the second coordinate system;
the step of obtaining the actual position coordinates of the target workpiece in the first coordinate system based on the transformation relationship and the position deviation includes:
and obtaining the actual position coordinate of the target workpiece in the first coordinate system based on the pixel coordinate and the transformation relation and by combining the position deviation.
6. The method of claim 5, wherein said step of obtaining actual position coordinates of said target workpiece in said first coordinate system based on said pixel coordinates and said transformation relationship in combination with said positional deviation comprises:
obtaining the current coordinate of the target workpiece in the first coordinate system based on the pixel coordinate and the transformation relation;
and obtaining the actual position coordinate of the target workpiece in the first coordinate system based on the current coordinate and by combining the position deviation.
7. The method of claim 6, wherein said step of obtaining actual position coordinates of said target workpiece in said first coordinate system based on said current coordinates in combination with said position deviations comprises:
obtaining actual application process coordinates of the target workpiece in the first coordinate system based on the current coordinates;
and obtaining the actual position coordinate of the target workpiece in the first coordinate system based on the actual application process coordinate and by combining the position deviation.
8. A target workpiece position determining apparatus, characterized in that the apparatus comprises:
an acquisition module, configured to acquire a first coordinate set of a target workpiece in a first coordinate system and a second coordinate set of the target workpiece in a second coordinate system, wherein the first coordinate system is the coordinate system corresponding to the robot, and the second coordinate system is the coordinate system corresponding to the camera;
the processing module is used for obtaining a transformation relation between the coordinates of the target workpiece in the first coordinate system and the coordinates of the target workpiece in the second coordinate system based on the first coordinate set and the second coordinate set; and obtaining the actual position coordinates of the target workpiece in the first coordinate system based on the transformation relation.
9. A terminal, characterized in that the terminal comprises: a memory, a processor and a target workpiece position determination program stored on the memory and executable on the processor, the target workpiece position determination program when executed by the processor implementing the steps of the target workpiece position determination method as claimed in any one of claims 1 to 7.
10. A readable storage medium, characterized in that the readable storage medium has stored thereon a computer program which, when being executed by a processor, carries out the steps of the target workpiece position determination method according to any one of claims 1 to 7.
CN202010114299.9A 2020-02-24 2020-02-24 Target workpiece position determining method, device, terminal and readable storage medium Pending CN111445521A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010114299.9A CN111445521A (en) 2020-02-24 2020-02-24 Target workpiece position determining method, device, terminal and readable storage medium


Publications (1)

Publication Number Publication Date
CN111445521A true CN111445521A (en) 2020-07-24

Family

ID=71652723

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010114299.9A Pending CN111445521A (en) 2020-02-24 2020-02-24 Target workpiece position determining method, device, terminal and readable storage medium

Country Status (1)

Country Link
CN (1) CN111445521A (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107797560A (en) * 2017-11-28 2018-03-13 广州中国科学院先进技术研究所 A kind of visual identifying system and method for robotic tracking
CN110148187A (en) * 2019-06-04 2019-08-20 郑州大学 A kind of the high-precision hand and eye calibrating method and system of SCARA manipulator Eye-in-Hand
CN110497386A (en) * 2019-08-26 2019-11-26 中科新松有限公司 A kind of cooperation Robot Hand-eye relationship automatic calibration device and method
CN110640745A (en) * 2019-11-01 2020-01-03 苏州大学 Vision-based robot automatic calibration method, equipment and storage medium
CN110717943A (en) * 2019-09-05 2020-01-21 中北大学 Method and system for calibrating eyes of on-hand manipulator for two-dimensional plane

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112907673A (en) * 2021-03-19 2021-06-04 深圳创维-Rgb电子有限公司 Positioning method, positioning device, terminal equipment and storage medium
CN112907673B (en) * 2021-03-19 2021-10-22 深圳创维-Rgb电子有限公司 Positioning method, positioning device, terminal equipment and storage medium
CN113393534A (en) * 2021-06-23 2021-09-14 广东利元亨智能装备股份有限公司 Product laminating method, device, equipment and system
CN113393534B (en) * 2021-06-23 2022-06-17 广东利元亨智能装备股份有限公司 Product laminating method, device, equipment and system
CN114139857A (en) * 2021-10-26 2022-03-04 成都飞机工业(集团)有限责任公司 Workpiece finishing process correcting method, system, storage medium and device
CN114139857B (en) * 2021-10-26 2024-05-14 成都飞机工业(集团)有限责任公司 Workpiece finishing working procedure correction method, system, storage medium and device
CN114794667A (en) * 2022-03-31 2022-07-29 深圳市如本科技有限公司 Tool calibration method, system, device, electronic equipment and readable storage medium

Similar Documents

Publication Publication Date Title
CN111445521A (en) Target workpiece position determining method, device, terminal and readable storage medium
CN111791227B (en) Robot hand-eye calibration method and device and robot
CN107073719B (en) Robot and robot system
US8355816B2 (en) Action teaching system and action teaching method
CN114012731B (en) Hand-eye calibration method and device, computer equipment and storage medium
CN111195897B (en) Calibration method and device for mechanical arm system
CN112873204B (en) Robot calibration method, device, equipment and computer readable storage medium
JP6565175B2 (en) Robot and robot system
CN113379849A (en) Robot autonomous recognition intelligent grabbing method and system based on depth camera
US20190030722A1 (en) Control device, robot system, and control method
CN114833832B (en) Robot hand-eye calibration method, device, equipment and readable storage medium
CN113997295A (en) Hand-eye calibration method and device for mechanical arm, electronic equipment and storage medium
CN112809668A (en) Method, system and terminal for automatic hand-eye calibration of mechanical arm
KR20180017074A (en) Detection of the robot axial angles and selection of a robot by means of a camera
CN113302027B (en) Job coordinate generating device
CN112743548A (en) Method, system and terminal for unifying hand-eye calibration of two mechanical arms
CN114677429B (en) Positioning method and device of manipulator, computer equipment and storage medium
CN113858214B (en) Positioning method and control system for robot operation
CN114683267B (en) Calibration method, calibration device, electronic equipment and storage medium
WO2021261411A1 (en) Robot teaching method and robot working method
Cheng Robot manipulation of 3D cylindrical objects with a robot-mounted 2D vision camera
CN112184819A (en) Robot guiding method and device, computer equipment and storage medium
CN109664273B (en) Industrial robot cursor dragging teaching method and system
TWI656421B (en) Control method of self-propelled equipment
CN107340889B (en) Positioning initialization method and device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination