CN111791226A - Method and device for realizing assembly through robot and robot


Info

Publication number: CN111791226A (application); CN111791226B (granted)
Authority: CN (China)
Application number: CN201911414360.5A
Original language: Chinese (zh)
Prior art keywords: end effector, robot, center, coordinate system, shooting device
Inventors: 殷波, 王仁忠, 王丽
Assignee: Shenzhen Horn Audio Co Ltd
Application filed by Shenzhen Horn Audio Co Ltd
Legal status: Granted; Active

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00 Programme-controlled manipulators
    • B25J9/16 Programme controls
    • B25J9/1679 Programme controls characterised by the tasks executed
    • B25J9/1687 Assembly, peg and hole, palletising, straight line, weaving pattern movement
    • B25J9/1694 Programme controls characterised by use of sensors other than normal servo-feedback from position, speed or acceleration sensors, perception control, multi-sensor controlled systems, sensor fusion
    • B25J9/1697 Vision controlled systems

Landscapes

  • Engineering & Computer Science (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Manipulator (AREA)

Abstract

The application belongs to the technical field of robot part assembly and provides a method, a device and a robot for realizing assembly through a robot. The method comprises: acquiring a point location mapping relation between the coordinate system of the robot and the image coordinate system of a shooting device; with the flange plane of the robot parallel to the base plane, controlling the shooting device to shoot the part center of the assembly-position part grabbed by the end effector, obtaining a first offset between the tool center and the part center of the assembly position, and establishing a dynamic tool coordinate system; and controlling the end effector to move to a pick-up position to grab a part, controlling the shooting device to shoot the part center of the pick-up-position part grabbed by the end effector, obtaining a second offset between the tool center and the part center of the pick-up position, re-teaching the dynamic tool coordinate system, and controlling the end effector to move the grabbed part to the assembly position coordinate to assemble the part. Embodiments of the application solve the problems of long time consumption and complex calculation when a robot assembles parts.

Description

Method and device for realizing assembly through robot and robot
Technical Field
The invention relates to the technical field of robot part assembly, and in particular to a method and a device for realizing assembly through a robot, and to the robot itself.
Background
As industrial automation production moves toward small batches and many varieties, traditional robot applications require large numbers of high-precision dedicated jigs to fix the product position: the cost is high, the workload of swapping clamp jigs is huge, and a large amount of time is also spent teaching the robot's operation track. A vision system, in which a camera cooperates with the robot, can instead obtain the position information of the product and guide the robot to grab and assemble it. Among the several camera mounting modes, the fixed upward-looking (bottom-view) mounting has a simple calibration method that needs no human-eye alignment, has higher calibration precision, and eliminates the robot's material-taking errors, giving the highest assembly precision; a bottom-view camera is therefore one of the most common choices in precision assembly. This mounting mode has a problem, however: after the camera photographs the position and angle of the part and converts them into robot coordinates, the center of the robot flange does not in general coincide with the center of the part, and if the flange center coordinates are used directly to rotate the part by the photographed angle, the position of the part changes after the rotation and the part is not assembled in place. The usual remedy is to photograph twice: after the first photograph, the angle of the part is obtained and the robot's coordinates are rotated by that angle so that the part's angle matches the reference image; a second photograph is then taken, the part image now having the same angle as the reference image, and the deviation in the X and Y directions is calculated and subtracted from the assembly position during assembly. This method needs two photographs, the X, Y deviation calculation is heavy and subject to calculation error, the part may leave the camera's field of view after the first rotation so that the photographing position must be corrected, the calculation method is complex, and the robot needs time to move, all of which lengthens the whole assembly process.
Disclosure of Invention
In view of this, embodiments of the present invention provide a method and an apparatus for implementing assembly by a robot, and a robot, so as to solve the problems of long time consumption and complex calculation when a robot performs part assembly.
A first aspect of an embodiment of the present invention provides a method for performing assembly by a robot, including:
acquiring a point location mapping relation between a coordinate system of the robot and an image coordinate system of the shooting device; the shooting device is vertically fixed right below a flange plane of the robot in an upward view mode;
controlling the shooting device to shoot the part center of the assembly position grabbed by the end effector of the robot at the pose where the flange plane of the robot is parallel to the base plane, and obtaining a first offset between the tool center of the end effector and the part center of the assembly position according to the point-to-point mapping relation;
correcting the coordinate system of the robot according to the first offset to establish a dynamic tool coordinate system and obtain the assembly position coordinates of the assembly position in the dynamic tool coordinate system;
maintaining the pose, controlling the end effector to move to a part taking position to grab a part, controlling the shooting device to shoot the part center of the part taking position grabbed by the end effector, and obtaining a second offset of the tool center and the part center of the part taking position according to the point position mapping relation;
and re-teaching the dynamic tool coordinate system according to the second offset, and controlling the end effector to move the grabbed part to the assembly position coordinate for part assembly according to the re-taught dynamic tool coordinate system.
In one implementation example, the acquiring a point location mapping relationship between a coordinate system of the robot and an image coordinate system of the camera includes:
maintaining the pose, controlling the end effector to carry a calibration plate and perform shooting calibration with the shooting device, and obtaining the offset between the tool center of the end effector and the flange center on the plane;
correcting a coordinate system of the robot according to the offset of the tool center of the end effector and the flange center on a plane to establish a first tool coordinate system;
and maintaining the pose, controlling the end effector to carry the calibration plate and perform coordinate calibration with the shooting device, and obtaining a point location mapping relation between the first tool coordinate system and the image coordinate system of the shooting device.
In one implementation example, the maintaining the pose, and controlling the end effector to carry a calibration plate to perform shooting calibration with the shooting device, so as to obtain an offset of a tool center and a flange center of the end effector on a plane, includes:
maintaining the pose, controlling the end effector to carry a calibration plate to align with the center of the visual field of the shooting device, and recording first position information and a first angle of the end effector and calibration coordinates of the center of a circle of the calibration plate in an image shot by the shooting device during alignment;
maintaining the pose, rotating the end effector from the first angle to a second angle, controlling the end effector to carry the calibration plate to align with the calibration coordinates in an image shot by the shooting device under the condition that the rotation angle is the second angle, and recording second position information of the end effector during alignment; the second angle is an angle formed by rotating the first angle by 180 degrees;
calculating the offset radius and the rotation angle of the end effector according to the first position information, the first angle, the second position information and the second angle;
and obtaining the offset of the tool center and the flange center of the end effector on the plane according to the offset radius and the rotation angle.
In one implementation example, the maintaining the pose, and controlling the end effector to carry the calibration plate and perform coordinate calibration with the shooting device to obtain a point location mapping relation between the first tool coordinate system and the image coordinate system of the shooting device, includes:
maintaining the pose, controlling the end effector to carry the calibration plate to move in the visual field range of the shooting device, and simultaneously sampling the coordinate position of the end effector in a first tool coordinate system and the pixel coordinate of the circle center of the calibration plate in an image shot by the shooting device for N times according to a preset rule; n is more than or equal to 4;
and obtaining a point location mapping relation between the first tool coordinate system and the image coordinate system of the shooting device according to the coordinate position and the pixel coordinate of each sampling.
In an implementation example, the preset rule is that sampling is performed when the calibration plate is located at the center of the image shot by the shooting device, at the midpoints of its four edges, and at its four corners; wherein N is 9.
In one implementation example, controlling the shooting device to shoot the part center of an assembly position grabbed by an end effector of the robot in a pose in which a flange plane of the robot is parallel to a base plane, and obtaining a first offset between a tool center of the end effector and the part center of the assembly position according to the point-to-point mapping relation includes:
controlling the alignment of an end effector of the robot and a part assembled in an assembly position in advance according to the pose in which a flange plane of the robot is parallel to a base plane, and recording third position information of the end effector during the alignment; the third position information is the assembly position;
maintaining the pose, and controlling an end effector of the robot to grab the part at the assembling position and translate the part to the view range of the shooting device so as to shoot the grabbed part;
obtaining the pixel position of the center of the grabbed part in the image shot by the shooting device;
converting the pixel position according to the point location mapping relation to obtain a target position of the end effector;
and acquiring the current actual coordinate and the current third rotation angle of the end effector, and acquiring a first offset of the center of the tool and the center of the part at the assembling position according to the current actual coordinate and the current third rotation angle of the end effector and the target position.
In one implementation example, the maintaining the pose, controlling the end effector to move to a pickup position to pick up a part, controlling the shooting device to shoot a part center of the pickup position picked by the end effector, and obtaining a second offset between the tool center and the part center of the pickup position according to the point-to-point mapping relationship includes:
maintaining the pose, controlling the end effector to move to a pickup position to grab the part and translate the part to the visual field range of the shooting device so as to shoot the grabbed part;
obtaining the pixel position of the center of the grabbed part in the image shot by the shooting device;
converting the pixel position according to the point location mapping relation to obtain a target position of the end effector;
and acquiring the current actual coordinate and the current fourth rotation angle of the end effector, and obtaining a second offset between the center of the tool and the center of the part at the pickup position according to the current actual coordinate, the current fourth rotation angle and the target position.
In one embodiment, the re-teaching the dynamic tool coordinate system according to the second offset, and controlling the end effector to move the grasped part to the assembly position coordinate according to the re-taught dynamic tool coordinate system for part assembly includes:
re-teaching the dynamic tool coordinate system according to the second offset;
and controlling the end effector to move the grabbed part to the assembly position coordinate and rotate the fourth rotation angle for part assembly according to the re-taught dynamic tool coordinate system.
A second aspect of an embodiment of the present invention provides a method for performing assembly by a robot, including:
acquiring a point location mapping relation between a coordinate system of the robot and an image coordinate system of the shooting device; the shooting device is vertically fixed right below a flange plane of the robot in an upward view mode;
controlling an end effector of the robot to move to a pickup position to pick up a part according to the pose in which a flange plane of the robot is parallel to a base plane, controlling a shooting device to shoot the part center of the pickup position picked by the end effector, and obtaining the offset between the tool center of the end effector and the part center of the pickup position according to the point location mapping relation;
correcting a coordinate system of the robot according to the offset to establish a dynamic tool coordinate system, controlling the end effector to move the grabbed parts to an assembly position for assembly, and recording assembly position coordinates of the assembly position in the dynamic tool coordinate system;
and controlling the end effector to move the grabbed parts to the assembly position coordinates according to the dynamic tool coordinate system to assemble the parts.
A third aspect of an embodiment of the present invention provides an apparatus for performing assembly by a robot, including:
the point location mapping relation acquisition module is used for acquiring the point location mapping relation between the coordinate system of the robot and the image coordinate system of the shooting device; the shooting device is vertically fixed right below a flange plane of the robot in an upward view mode;
the first offset calculation module is used for controlling the shooting device to shoot the part center of the assembly position grabbed by the end effector of the robot according to the pose in which the flange plane of the robot is parallel to the base plane, and obtaining a first offset of the tool center of the end effector and the part center of the assembly position according to the point position mapping relation;
the dynamic tool coordinate system establishing module is used for correcting the coordinate system of the robot according to the first offset to establish a dynamic tool coordinate system and obtaining the assembly position coordinates of the assembly position in the dynamic tool coordinate system;
the second offset calculation module is used for maintaining the pose, controlling the end effector to move to a part taking position to grab a part, controlling the shooting device to shoot the part center of the part taking position grabbed by the end effector, and obtaining a second offset of the tool center and the part center of the part taking position according to the point position mapping relation;
and the part assembling module is used for re-teaching the dynamic tool coordinate system according to the second offset and controlling the end effector to move the grabbed part to the assembling position coordinate to assemble the part according to the re-taught dynamic tool coordinate system.
A fourth aspect of an embodiment of the present invention provides a robot including: a memory, a processor and a computer program stored in the memory and executable on the processor, the processor implementing the method of robot-implemented assembly of the first aspect when executing the computer program.
According to the method, the device and the robot for realizing assembly through a robot provided by the embodiments of the invention, the point location mapping relation between the coordinate system of the robot and the image coordinate system of the shooting device, fixed vertically below the flange plane of the robot and looking upward, is acquired; the shooting device is controlled to shoot the part center of the assembly position grabbed by the end effector of the robot in the pose in which the flange plane of the robot is parallel to the base plane, and a first offset between the tool center and the part center of the assembly position is obtained according to the point location mapping relation; the coordinate system of the robot is corrected according to the first offset to establish a dynamic tool coordinate system, and the assembly position coordinates of the assembly position in the dynamic tool coordinate system are obtained; maintaining the pose, the end effector is controlled to move to a part taking position to grab a part, the shooting device is controlled to shoot the part center of the part taking position grabbed by the end effector, and a second offset of the tool center and the part center of the part taking position is obtained according to the point location mapping relation; the dynamic tool coordinate system is re-taught according to the second offset, and the end effector is controlled to move the grabbed part to the assembly position coordinate for part assembly according to the re-taught dynamic tool coordinate system. After the end effector grabs the part from the part taking position, the shooting device is controlled to shoot the part, obtaining the position, angle and center of the current part; a dynamic tool coordinate system is established from the current part center, and under this coordinate system the robot moves the grabbed part to the assembly position for precise assembly. The position and angle of the product are corrected with a single photograph and without manual eye-alignment teaching, which shortens the time consumed by part assembly, reduces the amount of calculation, and improves the assembly precision of the part.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present invention, the drawings needed for the embodiments or the prior art descriptions will be briefly described below, and it is obvious that the drawings in the following description are only some embodiments of the present invention, and it is obvious for those skilled in the art to obtain other drawings without creative efforts.
Fig. 1 is a schematic flow chart of a method for realizing assembly by a robot according to an embodiment of the present invention;
fig. 2 is a schematic structural diagram of a robot according to an embodiment of the present invention;
FIG. 3 is a schematic structural diagram of a calibration board according to an embodiment of the present invention;
FIG. 4 is a schematic flowchart of a method for assembling by a robot according to a second embodiment of the present invention;
FIG. 5 is a schematic flowchart of a method for assembling by a robot according to a third embodiment of the present invention;
FIG. 6 is a schematic structural diagram of an apparatus for assembly by a robot according to a fourth embodiment of the present invention;
fig. 7 is a schematic structural diagram of a robot according to a fifth embodiment of the present invention.
Detailed Description
In order to make the technical solutions of the present invention better understood by those skilled in the art, the technical solutions in the embodiments of the present invention will be clearly described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are some embodiments of the present invention, but not all embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
The terms "comprises" and "comprising," and any variations thereof, in the description and claims of this invention and the above-described drawings are intended to cover non-exclusive inclusions. For example, a process, method, or system, article, or apparatus that comprises a list of steps or elements is not limited to only those steps or elements listed, but may alternatively include other steps or elements not listed, or inherent to such process, method, article, or apparatus. Furthermore, the terms "first," "second," and "third," etc. are used to distinguish between different objects and are not used to describe a particular order.
Example one
Fig. 1 is a schematic flow chart of a method for realizing assembly by a robot according to an embodiment of the present invention. This embodiment provides the process of obtaining the point location mapping relation between the coordinate system of the robot and the image coordinate system of the shooting device, i.e. the hand-eye calibration of the robot, which improves assembly precision. The method specifically comprises the following steps:
s110, controlling the end effector to carry a calibration plate and the shooting device to carry out shooting calibration according to the parallel pose of the flange plane and the base plane of the robot, and obtaining the offset of the tool center of the end effector and the flange center on the plane;
when an industrial robot is used for assembling parts or workpieces, the centers of the parts and the flange of the robot do not coincide after the end effector of the robot grabs the parts. In the prior art, a mode of taking pictures twice is usually adopted, and the angle between a part and a reference image is corrected after the picture is taken for the first time; and shooting for the second time, correcting the horizontal position deviation of the part image angle and the reference image, and subtracting the deviation from the assembly position when the robot is assembled to finish the assembly. However, this method takes a long time to take two photographs, and the calculation amount of the deviation is large, resulting in calculation errors. To solve this problem, embodiments of the present invention implement parts assembly using a robot having a camera vertically fixed directly below a flange plane of the robot in a bottom view. Alternatively, the robot may be a six-axis robot. The coordinate system of the 6-axis robot comprises X, Y, Z three translation amounts and R (X), R (Y) and R (Z) rotation amounts which rotate around the three axes respectively; the camera may be an upward-looking industrial camera.
Before the robot is controlled to carry out part assembly, hand-eye calibration of the robot is required. Fig. 2 is a schematic structural diagram of the robot. The whole hand-eye calibration process is carried out in the pose in which the flange plane 6 of the robot is parallel to the base plane 1. When the flange plane 6 of the robot is parallel to the robot base plane 1, R(X) = 180°, R(Y) = 0°, and the Z axis is perpendicular to the XY plane, that is, the base plane; the X, Y coordinates of the robot can then be adjusted without the R(Z) angle changing. The end effector 2 of the robot is controlled to rotate in this pose, that is, the rotation axis angle, namely the rotation angle R(Z) about the Z axis, is changed. The shooting device 5, looking upward, shoots the calibration process of the end effector 2 of the robot, from which the offset of the tool center of the end effector from the flange center on the plane is obtained. Optionally, the robot also has a light source 4 that provides illumination for the shooting device.
In an implementation example, the specific process of maintaining the pose and controlling the end effector to carry a calibration plate and perform shooting calibration with the shooting device to obtain the offset of the tool center and the flange center of the end effector on the plane may be:
step 11, maintaining the pose, controlling the end effector to carry a calibration plate to align with the center of the visual field of the shooting device, and recording first position information and a first angle of the end effector and calibration coordinates of the center of a circle of the calibration plate in an image shot by the shooting device during alignment; .
In one implementation example, the end effector of the robot includes a clamping jaw or a suction nozzle. During calibration, the calibration plate 3 is attached to the end effector 2 of the robot, such as a clamping jaw or a suction nozzle; optionally, the calibration plate 3 may be a black circular calibration plate, as shown in fig. 3, which is a schematic structural diagram of the calibration plate. With the flange plane of the robot parallel to the base plane, the end effector is controlled to carry the calibration plate into alignment with the center of the visual field of the shooting device, and the first position information and first angle of the end effector at alignment, together with the calibration coordinates of the circle center of the calibration plate in the image shot by the shooting device, are recorded. Before alignment, the lens of the shooting device is adjusted so that the shot image is as sharp as possible, and the adjusted lens and aperture are then locked. The current robot height is also recorded, so that the shooting device shoots the robot at the same height during part assembly work.
Specifically, with the flange plane of the robot parallel to the base plane, the end effector of the robot is controlled to move with the calibration plate until the plate is aligned with the center of the visual field of the shooting device, and the first position information (x0, y0) of the end effector at alignment, the first angle u0 of the rotation axis R(Z), and the calibration coordinates (xp, yp) of the center of the calibration plate in the image captured by the shooting device are recorded.
Step 12, maintaining the pose, rotating the end effector from the first angle to a second angle, controlling the end effector to carry the calibration plate to align with the calibration coordinates in the image shot by the shooting device when the rotation angle is the second angle, and recording second position information of the end effector during alignment; the second angle is an angle formed by rotating the first angle by 180 degrees;
The current pose of the robot is saved, and the rotation axis R(Z) of the end effector is rotated from the first angle u0 to the second angle u0 + 180°. With the rotation angle at the second angle, the end effector is controlled to carry the calibration plate and move until the coordinate position of the plate in the image shot by the shooting device equals the calibration coordinates (xp, yp), and the second position information (x1, y1) of the end effector at alignment is recorded.
Step 13, calculating an offset radius and a rotation angle of the end effector according to the first position information, the first angle, the second position information and the second angle;
Specifically, in the pose in which the flange plane of the robot is parallel to the base plane, the end effector of the robot is rotated by the preset angle and aligned twice via the circle center of the calibration plate, giving the first position information (x0, y0) with the first angle u0 and the second position information (x1, y1) with the second angle u0 + 180°. The following equations are used:

Rx = (x1 - x0)/2, Ry = (y1 - y0)/2;

R = √(Rx² + Ry²);

when Rx > 0, θ = arctan(Ry/Rx) - u0;

when Rx < 0, θ = arctan(Ry/Rx) + 180° - u0.

From the first position information (x0, y0), the first angle u0, the second position information (x1, y1) and the second angle u0 + 180°, the offset radius Rx of the end effector of the robot in the X-axis direction and the offset radius Ry in the Y-axis direction at the current R(Z) angle are calculated, and from them the offset radius R of the end effector and the rotation angle θ are obtained.
Step 14, obtaining the offset of the tool center and the flange center of the end effector on the plane according to the offset radius and the rotation angle.
The offsets xtool and ytool of the tool center of the end effector from the flange center on the plane are calculated from the offset radius R and the rotation angle θ through the trigonometric relations:

xtool = R*cos(θ);

ytool = R*sin(θ).
when the end effector of the robot is a gripper or a suction nozzle, the offset of the tool center of the gripper or the suction nozzle from the flange center on the plane can be calculated by performing the above steps 11 to 14.
S120, correcting a coordinate system of the robot according to the offset of the tool center of the end effector and the flange center on a plane to establish a first tool coordinate system;
Specifically, the offsets xtool and ytool of the tool center of the end effector from the flange center on the plane are input into a teach pendant connected with the robot to correct the coordinate system of the robot and establish a first tool coordinate system (tool1): on the robot teach pendant, the X-axis offset of the tool coordinate system of the tool center point is set to xtool, the Y-axis offset to ytool, and the other axis offsets to 0. The position coordinates and rotation angle of the end effector read from the teach pendant are then the position coordinates and rotation angle after the deviation between the tool center and the flange center has been corrected.
S130, maintaining the pose, controlling the end effector to carry the calibration plate and perform coordinate calibration with the shooting device, and obtaining a point location mapping relation between the first tool coordinate system and the image coordinate system of the shooting device.
With the flange plane of the robot parallel to the base plane, coordinate calibration is performed between the end effector of the robot, in the first tool coordinate system, and the shooting device, and the point location mapping relation between the first tool coordinate system and the image coordinate system of the shooting device is obtained by a multi-point calibration method. The point location mapping relation obtained between the coordinate system of the robot and the image coordinate system of the shooting device is therefore the relation between the image coordinate system and the first tool coordinate system, in which the offset between the tool center and the flange center has been corrected.
In an implementation example, the specific process of maintaining the pose and controlling the end effector to carry the calibration plate and perform coordinate calibration with the shooting device to obtain the point location mapping relation between the first tool coordinate system and the image coordinate system of the shooting device may be:
step 21, maintaining the pose, controlling the end effector to carry the calibration plate to move in the visual field range of the shooting device, and simultaneously sampling the coordinate position of the end effector in a first tool coordinate system and the pixel coordinates of the circle center of the calibration plate in the image shot by the shooting device for N times according to a preset rule; n is more than or equal to 4;
specifically, the end effector is controlled to carry the calibration plate to move in the visual field range of the shooting device in a pose where the flange plane of the robot is parallel to the base plane, so that the shooting device shoots the circle center of the moving calibration plate, and the calibration plate is controlled to be always located in the visual field range of the shooting device when the end effector carries the calibration plate to move. Controlling an end effector to carry a calibration plate to move in a visual field range of a shooting device so that the shooting device shoots the center of a circle of the moving calibration plate, and simultaneously sampling the coordinate position of the end effector in a first tool coordinate system (tool1) and the pixel coordinate of the center of the calibration plate in an image shot by the shooting device for N times according to a preset rule; n is more than or equal to 4. The coordinate position of the end effector in the first tool coordinate system obtained by each sampling corresponds to the pixel coordinate of the image shot by the shooting device at the center of the calibration plate one by one.
In one implementation example, the preset rule is that sampling is performed when the circle center of the calibration plate is located at the center of the image shot by the shooting device, at the midpoints of its four edges, and at its four corners; wherein N is 9.
Specifically, while the end effector carries the calibration plate within the field of view so that the shooting device shoots the circle center of the moving plate, a sample is taken whenever the plate center is located at the image center, at the midpoint of one of the four edges, or at one of the four corners; at each sample the coordinate position of the end effector in the first tool coordinate system and the pixel coordinate of the plate center in the shot image are recorded simultaneously. Optionally, the end effector carries the plate along a trajectory resembling the Chinese character '田' (a 3×3 grid) within the camera's field of view, so that the circle center of the plate appears in turn at nine points, namely the upper-left, upper-middle, upper-right, middle-left, center, middle-right, lower-left, lower-middle and lower-right of the camera's visual field; this 9-point sampling improves calibration precision.
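For illustration only, the nine sampling targets of this preset rule can be generated as below; the helper name and the half-extent parameters are assumptions of the sketch, not part of the patent.

```python
def nine_point_targets(cx, cy, half_w, half_h):
    """Nine sampling targets in the '田' (3 x 3 grid) pattern: image
    center, the midpoints of the four edges, and the four corners.

    cx, cy:  end-effector coordinates (tool1) that place the plate
             center at the center of the camera's field of view.
    half_w, half_h: half extents chosen so the calibration plate stays
             fully inside the field of view at every target.
    """
    return [(cx + i * half_w, cy + j * half_h)
            for j in (1, 0, -1) for i in (-1, 0, 1)]  # row by row, top first
```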
Step 22, obtaining a point location mapping relation between the first tool coordinate system and the image coordinate system of the shooting device according to the coordinate position and the pixel coordinate of each sample.
Because the end effector carries the calibration plate within the visual field of the shooting device in the pose in which the flange plane of the robot is parallel to the base plane, the rotation angle of the end effector stays unchanged while the shooting device shoots the circle center of the moving plate, and only the plane coordinates X, Y change. Nine-point sampling calibration is performed with the circle center of the plate at the image center, at the midpoints of the four edges and at the four corners of the image shot by the shooting device, recording at each sample the coordinate position (x, y) of the end effector in the first tool coordinate system together with the pixel coordinate (u, v) of the circle center of the plate in the shot image; this yields nine pairs of corresponding points between coordinate positions in the first tool coordinate system and pixel coordinates in the shot image. From the coordinate position and pixel coordinate of each sample, the point location mapping relation between the first tool coordinate system of the robot and the image coordinate system of the shooting device is obtained through affine transformation. Optionally, the point location mapping relation obtained through affine transformation is a two-dimensional homogeneous transformation matrix, with the specific formula:
[x]   [m11  m12  m13]   [u]
[y] = [m21  m22  m23] * [v]
[1]   [ 0    0    1 ]   [1]

wherein m11 represents the scaling factor in the X direction between the robot coordinate system and the image coordinate system, m12 the rotation component in the Y direction between the two coordinate systems, m13 the translation vector in the X direction, m21 the rotation component in the X direction, m22 the scaling factor in the Y-axis direction, and m23 the translation vector in the Y direction.
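A least-squares fit of this two-dimensional homogeneous matrix from the sampled correspondences might look like the following NumPy sketch; the function names are assumptions, and using N > 3 samples (nine in the text) over-determines the six unknowns, averaging out measurement noise.

```python
import numpy as np

def fit_affine_map(pixels, robot_xy):
    """Fit M = [[m11, m12, m13], [m21, m22, m23]] such that
    [x, y]^T ≈ M @ [u, v, 1]^T in the least-squares sense.

    pixels:   N x 2 array of (u, v) pixel coordinates of the plate center.
    robot_xy: N x 2 array of matching (x, y) positions in tool1, N >= 4.
    """
    A = np.hstack([np.asarray(pixels, float), np.ones((len(pixels), 1))])
    B = np.asarray(robot_xy, float)
    M, *_ = np.linalg.lstsq(A, B, rcond=None)   # solves A @ M = B
    return M.T                                  # shape (2, 3)

def pixel_to_robot(M, u, v):
    """Map one pixel coordinate into the first tool coordinate system."""
    x, y = M @ np.array([u, v, 1.0])
    return x, y
```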
Example two
Fig. 4 is a schematic flowchart of a method for assembling by a robot according to the second embodiment of the present invention. Building on the first embodiment, this embodiment is applicable to application scenarios in which an industrial robot assembles a part or a workpiece. The method can be executed by a device for realizing assembly through a robot, which may be a processor, an intelligent terminal, a tablet or a PC; in the embodiment of the present application the method is described with such a device as the execution subject, and specifically comprises the following steps:
s210, acquiring a point location mapping relation between a coordinate system of the robot and an image coordinate system of the shooting device; the shooting device is vertically fixed right below a flange plane of the robot in an upward view mode;
when an industrial robot is used for assembling parts or workpieces, the centers of the parts and the flange of the robot do not coincide after the end effector of the robot grabs the parts. In the prior art, a mode of taking pictures twice is usually adopted, and the angle between a part and a reference image is corrected after the picture is taken for the first time; and shooting for the second time, correcting the horizontal position deviation of the part image angle and the reference image, and subtracting the deviation from the assembly position when the robot is assembled to finish the assembly. However, this method takes a long time to take two photographs, and the calculation amount of the deviation is large, resulting in calculation errors. To solve this problem, embodiments of the present invention implement parts assembly using a robot having a camera vertically fixed directly below a flange plane of the robot in a bottom view. Alternatively, the robot may be a six-axis robot. The coordinate system of the 6-axis robot comprises X, Y, Z three translation amounts and R (X), R (Y) and R (Z) rotation amounts which rotate around the three axes respectively; the camera may be an upward-looking industrial camera.
When the flange plane of the robot is parallel to the robot base plane, R(X) = 180°, R(Y) = 0°, and the Z axis is perpendicular to the XY plane, that is, the base plane; the X, Y coordinates of the robot can then be adjusted without the R(Z) angle changing. The end effector of the robot is controlled to rotate in this pose, that is, the rotation axis angle, namely the rotation angle R(Z) about the Z axis, is changed. The offset of the tool center of the end effector from the flange center on the plane is obtained through the calibration process in which the shooting device, looking upward, shoots the end effector of the robot.
The offset of the tool center of the end effector from the flange center on the plane may be input into the teach pendant to correct the coordinate system of the robot and establish a first tool coordinate system (tool1). Coordinate calibration is then performed between the end effector of the robot, in the first tool coordinate system, and the shooting device, and the point location mapping relation between the first tool coordinate system and the image coordinate system of the shooting device is obtained by a multi-point calibration method. The point location mapping relation obtained between the coordinate system of the robot and the image coordinate system of the shooting device is therefore the relation between the image coordinate system and the first tool coordinate system, in which the offset between the tool center and the flange center has been corrected.
S220, controlling the shooting device to shoot the part center of the assembly position grabbed by the end effector of the robot according to the pose with the flange plane of the robot parallel to the base plane, and obtaining a first offset between the tool center of the end effector and the part center of the assembly position according to the point-to-point mapping relation;
after the part position and the part angle at the captured assembly position are captured and converted into the coordinates of the robot by the shooting device, the center of the flange of the robot is usually not coincident with the center of the part. In order to calculate the offset between the center of the part at the assembly position grabbed by the robot and the center of the flange of the robot and establish a dynamic coordinate system by using the center of the current part, the shooting device is controlled to shoot the center of the part at the assembly position grabbed by the end effector of the robot at the pose where the flange plane of the robot is parallel to the base plane.
Specifically, when the flange plane of the robot is parallel to the base plane, the robot keeps the same posture while moving and while grabbing parts; only the x and y coordinates in the horizontal plane change as the end effector moves, which ensures a one-to-one correspondence between the robot's X, Y coordinates and the pixel coordinates x, y of the image shot by the shooting device. The adjusting method is to switch the robot motion mode to joint mode and adjust the 5th joint angle to -90 degrees, after which the flange plane of the robot is parallel to the robot base plane.
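A small illustrative check of this pose condition is sketched below; the pose attribute names are invented for the sketch, not taken from any real controller API.

```python
def flange_parallel_to_base(pose, tol_deg=0.1):
    """True when R(X) is 180° and R(Y) is 0° within tolerance, i.e. the
    flange plane is parallel to the base plane and the Z axis points
    straight down at the upward-looking camera; R(Z) remains free.
    `pose.rx_deg` / `pose.ry_deg` are hypothetical pose attributes."""
    return (abs(abs(pose.rx_deg) - 180.0) <= tol_deg
            and abs(pose.ry_deg) <= tol_deg)
```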
When the shooting device shoots the part center of the assembly position grabbed by the end effector of the robot, the pixel coordinates of the part center of the assembly position grabbed by the end effector, and the current position information and the current rotation angle of the end effector can be obtained. And converting the pixel coordinates of the current part center according to the point mapping relation between the first tool coordinate system and the image coordinate system of the shooting device to obtain the coordinates of the current part center in the first tool coordinate system. A first offset between the tool center of the end effector and the part center of the assembly position can be calculated from the coordinates of the current part center in the first tool coordinate system, the current position information of the end effector, and the rotation angle.
In one implementation example, the specific process of controlling the shooting device to shoot the part center of the assembly position grabbed by the end effector of the robot in the pose in which the flange plane of the robot is parallel to the base plane, and obtaining the first offset between the tool center of the end effector and the part center of the assembly position according to the point-to-point mapping relation includes:
step 31, controlling the alignment of the end effector of the robot and a part assembled in an assembly position in advance according to the parallel pose of the flange plane and the base plane of the robot, and recording third position information of the end effector during alignment; the third position information is the assembly position;
specifically, before the photographing device is controlled to photograph the part center of the assembly position grasped by the end effector of the robot, the part to be assembled is mounted on the target device position in advance. Then, the end effector of the robot is controlled to be aligned with the parts previously assembled in the assembly position in an attitude in which the flange plane of the robot is parallel to the base plane, and third position information P3 of the end effector of the robot at the time of alignment is recorded under a robot flange center coordinate system (tool 0). The third position information is coordinates corresponding to the assembly position. Alternatively, the end effector of the robot may be a gripper or a suction nozzle.
Step 32, maintaining the pose, controlling an end effector of the robot to grab the part at the assembling position and translate the part to the view field of the shooting device so as to shoot the grabbed part;
specifically, the robot is controlled to grasp the part mounted at the assembly position with the end effector in a posture in which the flange plane of the robot is parallel to the base plane. And after the end effector grabs the part, the part is translated to a preset photographing position in the visual field range of the photographing device, so that the photographing device, namely the upward camera, photographs the part currently grabbed by the end effector.
Step 33, obtaining the pixel position of the center of the grabbed part in the image shot by the shooting device;
and shooting the part currently grabbed by the end effector by a shooting device, namely an upward camera, and obtaining the pixel position (u, v) of the center of the part grabbed by the end effector in the image shot by the shooting device.
Step 34, converting the pixel position according to the point location mapping relation to obtain a target position of the end effector;
according to the obtained first tool coordinate system and beatThe pixel position (u, v) is converted from the point-to-point mapping relationship in the image coordinate system of the imaging device, and the target position P4 (x) of the end effector in the first tool coordinate system is obtained4,y4)。
Step 35, acquiring the current actual coordinate and the current third rotation angle of the end effector, and obtaining a first offset between the center of the tool and the center of the part at the assembly position according to the current actual coordinate, the current third rotation angle and the target position.
Specifically, the current actual coordinates P5 (x5, y5) of the end effector and the current third rotation angle u3 can be obtained from the teach pendant under the robot flange center coordinate system (tool0). The first offset xtool and ytool between the center of the tool and the center of the part at the assembly position is calculated from the target position P4 (x4, y4), the actual coordinates P5 (x5, y5) and the third rotation angle u3 through trigonometric relations; the calculation formula is specifically:

Rx = x4 - x5, Ry = y4 - y5;

Rtool = √(Rx² + Ry²);

when Rx > 0, θ = arctan(Ry/Rx) - u3; when Rx < 0, θ = arctan(Ry/Rx) + 180° - u3;

xtool = Rtool*cos(θ);

ytool = Rtool*sin(θ).
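The same trigonometric relations can be sketched in code, again folding the two Rx sign branches into atan2; the function name and degree-based angle are assumptions. The identical computation serves for the second offset of step 44 below, with P6, P7 and u4 in place of P4, P5 and u3.

```python
import math

def dynamic_tool_offset(target_xy, actual_xy, u_deg):
    """Offset (xtool, ytool) between the tool center and the part center.

    target_xy: part center mapped from pixels into tool1 (e.g. P4 or P6).
    actual_xy: current end-effector coordinates from the teach pendant
               under tool0 (e.g. P5 or P7).
    u_deg:     current rotation angle R(Z) in degrees (e.g. u3 or u4).
    """
    rx = target_xy[0] - actual_xy[0]        # Rx = x4 - x5
    ry = target_xy[1] - actual_xy[1]        # Ry = y4 - y5
    r_tool = math.hypot(rx, ry)             # Rtool
    theta = math.atan2(ry, rx) - math.radians(u_deg)
    return r_tool * math.cos(theta), r_tool * math.sin(theta)
```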
s230, correcting a coordinate system of the robot according to the first offset to establish a dynamic tool coordinate system, and obtaining an assembly position coordinate of the assembly position in the dynamic tool coordinate system;
the first offset x of the tool center of the end effector from the part center of the assembly positiontoolAnd ytoolInputting the coordinate system of the teaching machine correction robot to establish a dynamic tool coordinate system (tool2), namely on machineThe robot demonstrator sets the offset of the X axis of the dynamic tool coordinate system as XtoolOffset of the Y axis being YtoolThe other axis offset amounts are set to 0. And re-teaching the assembly position under the dynamic tool coordinate system, namely moving the end effector of the robot to third position information, and obtaining the assembly position coordinate P0 of the assembly position in the dynamic tool coordinate system through the dynamic tool coordinate system (tool2) in the teaching machine.
S240, maintaining the pose, controlling the end effector to move to a pickup position to pick up a part, controlling the shooting device to shoot the part center of the pickup position picked by the end effector, and obtaining a second offset between the tool center and the part center of the pickup position according to the point-to-point mapping relation;
the part is placed through a low-precision jig or a plastic sucking disc, namely, the part taking position is obtained, when the end effector of the robot is formally controlled to reach the position above the jig or the plastic sucking disc to grab the part, the shooting device shoots the part position and angle of the grabbed part taking position and converts the part position and angle into the coordinate of the robot, and the center of a flange of the robot is usually not overlapped with the center of the part. In order to calculate the offset between the center of the part at the pick-up position grabbed by the robot and the center of the flange of the robot and re-teach the dynamic coordinate system by using the center of the current part, the shooting device is controlled to shoot the center of the part at the pick-up position grabbed by the end effector of the robot according to the parallel pose of the flange plane of the robot and the base plane.
When the shooting device shoots the part center of the pick-up position grabbed by the end effector of the robot, the pixel coordinates of the part center of the pick-up position grabbed by the end effector, the current position information of the end effector and the current rotation angle can be obtained. And converting the pixel coordinates of the current part center according to the point mapping relation between the first tool coordinate system and the image coordinate system of the shooting device to obtain the coordinates of the current part center in the first tool coordinate system. And calculating to obtain a second offset of the tool center of the end effector and the part center of the part taking position according to the coordinates of the current part center in the first tool coordinate system, the current position information of the end effector and the rotation angle.
In an implementation example, the specific process of maintaining the pose, controlling the end effector to move to a pickup position to pick up a part, and controlling the shooting device to shoot a part center of the pickup position picked by the end effector, and obtaining a second offset between the tool center and the part center of the pickup position according to the point-to-point mapping relationship may be:
step 41, maintaining the pose, controlling the end effector to move to a pickup position to grab the part and translate the part to the view range of the shooting device so as to shoot the grabbed part;
and controlling the end effector of the robot to grab the part placed at the pick-up position in a posture that the flange plane of the robot is parallel to the base plane. And after the end effector grabs the part, the part is translated to a preset photographing position in the visual field range of the photographing device, so that the photographing device, namely the upward camera, photographs the part currently grabbed by the end effector.
Step 42, obtaining the pixel position of the center of the grabbed part in the image shot by the shooting device;
and shooting the part currently grabbed by the end effector by a shooting device, namely an upward camera, and obtaining the pixel position (u, v) of the center of the part grabbed by the end effector in the image shot by the shooting device.
Step 43, converting the pixel position according to the point location mapping relation to obtain a target position of the end effector;
the pixel position (u, v) is converted from the point-to-point mapping relationship between the acquired first tool coordinate system and the image coordinate system of the imaging device, and the target position P6 (x) of the end effector in the first tool coordinate system is obtained6,y6)。
Step 44, acquiring the current actual coordinate and the current fourth rotation angle of the end effector, and obtaining a second offset between the center of the tool and the center of the part at the pick-up position according to the current actual coordinate, the current fourth rotation angle and the target position.
Specifically, the actual coordinates P7 (x) of the current end effector can be obtained by the teach pendant under the robot flange center coordinate system (tool0)7,y7) And a fourth angle of rotation u4. According to the target position P6 (x) through trigonometric function relation6,y6) Actual coordinates P7 (x)7,y7) And a fourth angle of rotation u4Calculating a second offset x between the center of the tool and the center of the part at the assembly positiontoolAnd ytoolThe calculation formula is specifically as follows:
Rx = x6 - x7, Ry = y6 - y7;
Rtool = sqrt(Rx² + Ry²);
θ = atan2(Ry, Rx) - u4;
xtool = Rtool · cos(θ);
ytool = Rtool · sin(θ).
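To make the trigonometric relation concrete, the following minimal sketch computes the second offset from P6, P7, and u4. It is an illustration only, not code from the patent: the function and variable names are invented here, the fourth rotation angle is assumed to be reported in degrees by the teach pendant, and the formula for θ follows the reconstruction of the image-rendered formulas given above.

```python
import math

def second_offset(p6, p7, u4_deg):
    """Offset of the tool center from the part center per steps 41-44 (sketch).

    p6: target position (x6, y6) of the part center in the first tool coordinate system
    p7: current actual flange position (x7, y7) read from the teach pendant under tool0
    u4_deg: current fourth rotation angle R(z), assumed to be in degrees
    """
    rx = p6[0] - p7[0]
    ry = p6[1] - p7[1]
    r_tool = math.hypot(rx, ry)                        # Rtool = sqrt(Rx^2 + Ry^2)
    theta = math.atan2(ry, rx) - math.radians(u4_deg)  # rotate back into the flange frame
    return r_tool * math.cos(theta), r_tool * math.sin(theta)

# Example: part center 3 mm right and 4 mm up from the flange, R(z) currently at 30 degrees
x_tool, y_tool = second_offset((103.0, 54.0), (100.0, 50.0), 30.0)
```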
S250, re-teaching the dynamic tool coordinate system according to the second offset, and controlling the end effector to move the grabbed part to the assembly position coordinates for part assembly according to the re-taught dynamic tool coordinate system.
The second offset of the tool center of the end effector from the part center of the pick-up position may be entered into the program run by the robot to correct the robot's coordinate system by modifying the tool coordinate function, thereby re-teaching the dynamic tool coordinate system (tool2). Under the re-taught dynamic tool coordinate system, the end effector is controlled to move the grabbed part to the assembly position coordinate P0 for part assembly.
In one embodiment, the re-teaching the dynamic tool coordinate system according to the second offset, and controlling the end effector to move the grasped part to the assembly position coordinate according to the re-taught dynamic tool coordinate system to assemble the part includes: re-teaching the dynamic tool coordinate system according to the second offset; and controlling the end effector to move the grabbed part to the assembly position coordinate and rotate the fourth rotation angle for part assembly according to the re-taught dynamic tool coordinate system.
The second offset of the tool center from the part center of the pick-up position may be input into the teach pendant to calibrate the robot's coordinate system and thereby re-teach the dynamic tool coordinate system (tool2). Under the re-taught dynamic tool coordinate system, with the flange plane of the robot kept parallel to the base plane, the end effector is controlled to move the grabbed part to the assembly position coordinate P0, and the R(z) rotation axis of the end effector is rotated by the fourth rotation angle u4 to assemble the part. Optionally, the fourth rotation angle is the part angle of the part currently held by the end effector, which may be obtained by photographing the current part with the shooting device.
According to the method for realizing assembly through a robot provided by this embodiment of the invention, the point location mapping relation between the coordinate system of the robot and the image coordinate system of the shooting device, which is vertically fixed right below the flange plane of the robot in an upward view mode, is acquired; with the flange plane of the robot parallel to the base plane, the shooting device is controlled to photograph the part center of the assembly position grabbed by the end effector of the robot, and a first offset between the tool center and the part center of the assembly position is obtained according to the point location mapping relation; the coordinate system of the robot is corrected according to the first offset to establish a dynamic tool coordinate system, and the assembly position coordinates of the assembly position in the dynamic tool coordinate system are obtained; maintaining the pose, the end effector is controlled to move to the pick-up position to grab a part, the shooting device is controlled to photograph the part center of the pick-up position grabbed by the end effector, and a second offset between the tool center and the part center of the pick-up position is obtained according to the point location mapping relation; the dynamic tool coordinate system is then re-taught according to the second offset, and the end effector is controlled, under the re-taught dynamic tool coordinate system, to move the grabbed part to the assembly position coordinates for part assembly. After the end effector grabs the part from the pick-up position, the shooting device photographs it to obtain the position, angle, and center of the current part; a dynamic tool coordinate system is established from the current part center, and under this coordinate system the robot moves the grabbed part to the assembly position for precise assembly. The correction of the part's position and angle is completed with a single photograph and without human-eye teaching, which shortens the time consumed by part assembly, reduces the amount of calculation, and improves assembly precision.
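Putting the steps of this embodiment together, one assembly cycle can be sketched as follows, reusing the second_offset helper from the sketch above. Every interface appearing here (robot, camera, mapping) is a hypothetical stand-in for the teach pendant, the upward-looking camera, and the point location mapping relation; the patent itself prescribes no programming interface.

```python
def assemble_one_part(robot, camera, mapping, assembly_pos_p0, u4_deg):
    """One pick-photograph-re-teach-assemble cycle (illustrative only)."""
    # The flange plane is kept parallel to the base plane throughout.
    robot.move_to_pick_position()
    robot.grab_part()
    robot.translate_to(camera.photo_position)      # bring the part into the camera's view

    u, v = camera.locate_part_center()             # pixel coordinates of the part center
    x6, y6 = mapping.pixel_to_tool((u, v))         # image frame -> first tool coordinate system
    x7, y7 = robot.current_flange_position()       # read under tool0
    x_tool, y_tool = second_offset((x6, y6), (x7, y7), u4_deg)

    robot.set_dynamic_tool_frame("tool2", x_tool, y_tool)   # re-teach tool2
    robot.move_to(assembly_pos_p0, rotate_z=u4_deg)         # single corrected move
    robot.release_part()
```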
EXAMPLE III
Fig. 5 is a schematic flow chart of a method for realizing assembly through a robot according to a third embodiment of the present invention. On the basis of the first and second embodiments, this embodiment further provides a simple method for teaching an industrial robot the assembly position of a part or workpiece. The method can be performed by a device for realizing assembly through a robot, where the device may be a processor, a smart terminal, a tablet, or a PC; in this embodiment of the application, the device for realizing assembly through a robot is taken as the execution subject. The method specifically includes the following steps:
S310, acquiring the point location mapping relation between the coordinate system of the robot and the image coordinate system of the shooting device; the shooting device is vertically fixed right below the flange plane of the robot in an upward view mode;
When the flange surface of the robot is parallel to the plane of the robot base, R(x) is 180°, R(y) is 0°, and the Z axis is perpendicular to the XY plane, i.e. the base plane; the robot's X and Y coordinates can therefore be adjusted without the R(z) angle changing. In this pose, the end effector of the robot is controlled to rotate, that is, the angle of the rotation axis, the rotation angle R(z) about the Z axis, is changed. The offset between the tool center of the end effector and the flange center in the plane is then obtained through a calibration process in which the shooting device photographs the end effector from below.
The offset between the tool center of the end effector and the flange center in the plane may be input into the teach pendant to calibrate the robot's coordinate system and establish a first tool coordinate system (tool1). The coordinates of the robot's end effector and of the shooting device are then calibrated in the first tool coordinate system, and the point location mapping relation between the first tool coordinate system and the image coordinate system of the shooting device is obtained by a multi-point calibration method. The point location mapping relation between the robot's coordinate system and the shooting device's image coordinate system acquired here is therefore the mapping between the first tool coordinate system, in which the offset between the tool center of the end effector and the flange center has already been corrected, and the image coordinate system of the shooting device.
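The multi-point calibration method itself is not spelled out in the patent; a common way to realize it, sketched here under that assumption, is a least-squares affine fit between the sampled pixel coordinates and the corresponding positions in the first tool coordinate system:

```python
import numpy as np

def fit_point_mapping(pixels, tool_points):
    """Least-squares affine map from image pixels (u, v) to first-tool-frame (x, y).

    pixels and tool_points hold N >= 4 corresponding samples collected while the
    end effector carries the calibration plate through the camera's field of view.
    """
    A = np.array([[u, v, 1.0] for u, v in pixels])   # N x 3 design matrix
    B = np.asarray(tool_points, dtype=float)         # N x 2 targets
    M, *_ = np.linalg.lstsq(A, B, rcond=None)        # 3 x 2 affine parameters
    return M

def pixel_to_tool(M, uv):
    """Apply the fitted mapping to one pixel coordinate."""
    return tuple(np.array([uv[0], uv[1], 1.0]) @ M)

# Nine samples (image center, four edge midpoints, four corners) give a
# well-conditioned fit, matching the N = 9 preset rule used later in the claims.
```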
S320, controlling an end effector of the robot to move to a pickup position to pick up a part according to the pose in which a flange plane of the robot is parallel to a base plane, controlling a shooting device to shoot the part center of the pickup position picked by the end effector, and obtaining the offset between the tool center of the end effector and the part center of the pickup position according to the point position mapping relation;
The part is placed in a low-precision jig or blister tray, which defines the pick-up position. When the end effector of the robot is controlled to move above the jig or tray and grab the part, the shooting device captures the position and angle of the grabbed part and converts them into robot coordinates; the center of the robot's flange usually does not coincide with the center of the part. To calculate the offset between the part center at the grab position and the robot's flange center, and to teach a dynamic coordinate system using the current part center, the shooting device is controlled, with the flange plane of the robot parallel to the base plane, to photograph the part center of the pick-up position grabbed by the end effector of the robot.
When the shooting device photographs the part center of the pick-up position grabbed by the end effector of the robot, the pixel coordinates of that part center, the current position information of the end effector, and the current rotation angle can be obtained. The pixel coordinates of the current part center are converted, according to the point location mapping relation between the first tool coordinate system and the image coordinate system of the shooting device, into coordinates of the current part center in the first tool coordinate system. The offset between the tool center of the end effector and the part center of the pick-up position is then calculated from the coordinates of the current part center in the first tool coordinate system, the current position information of the end effector, and the rotation angle. Specifically, the offsets xtool and ytool between the tool center of the end effector and the part center of the pick-up position can be obtained by performing steps 41 to 44 of the second embodiment.
S330, correcting the coordinate system of the robot according to the offset to establish a dynamic tool coordinate system, controlling the end effector to move the grabbed parts to an assembly position for assembly, and recording the assembly position coordinates of the assembly position in the dynamic tool coordinate system;
Specifically, the offsets xtool and ytool between the tool center of the end effector and the part center of the pick-up position can be input into the teach pendant to correct the robot's coordinate system and establish a dynamic tool coordinate system (tool2): in the robot's teach pendant, the X-axis offset of the dynamic tool coordinate system is set to xtool, the Y-axis offset to ytool, and all other axis offsets to 0.
After the dynamic tool coordinate system is established, the end effector of the robot is controlled, with the flange plane of the robot parallel to the base plane, to move the part grabbed from the pick-up position to the assembly position for assembly. The robot may be jogged repeatedly by small distances until the part held by the end effector fits exactly into the position to be assembled. At that point, the assembly position coordinate P0 of the assembly position in the dynamic tool coordinate system can be read from the dynamic tool coordinate system (tool2) in the teach pendant.
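As a sketch of this step, establishing tool2 and recording P0 might look as follows on a controller that exposes a programmatic interface; the pendant API used here is hypothetical, since the patent describes the operation only in terms of entries made on the teach pendant.

```python
def establish_dynamic_tool_frame(pendant, x_tool, y_tool):
    """Establish tool2 from the measured offsets (hypothetical pendant API).

    Only the X and Y offsets come from the measured tool-center-to-part-center
    offset; every other axis of the dynamic tool frame is set to 0.
    """
    pendant.set_tool_frame("tool2", x=x_tool, y=y_tool, z=0.0, rx=0.0, ry=0.0, rz=0.0)

def record_assembly_position(pendant):
    """Read P0 once under tool2 after jogging the part into an exact fit."""
    return pendant.read_position(frame="tool2")
```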
S340, controlling the end effector to move the grabbed part to the assembly position coordinates according to the dynamic tool coordinate system to assemble the part.
Under the dynamic tool coordinate system, the end effector is controlled to move the part grabbed from the pick-up position to the assembly position coordinate P0, and the R(z) rotation axis of the end effector is rotated by the part angle for part assembly. Optionally, the part angle is the angle of the part currently held by the end effector, which may be obtained by photographing the current part with the shooting device.
EXAMPLE IV
Fig. 6 shows an apparatus for realizing assembly by a robot according to a fourth embodiment of the present invention. On the basis of the first or second embodiment, the embodiment of the present invention further provides an apparatus 6 for realizing assembly by a robot, the apparatus comprising:
a point location mapping relation obtaining module 601, configured to obtain a point location mapping relation between a coordinate system of the robot and an image coordinate system of the shooting device; the shooting device is vertically fixed right below a flange plane of the robot in an upward view mode;
in an embodiment, the point location mapping relationship obtaining module 601 includes:
the offset calculation unit is used for maintaining the pose and controlling the end effector, carrying a calibration plate, to perform shooting calibration with the shooting device, so as to obtain the offset between the tool center of the end effector and the flange center in the plane;
the first tool coordinate system establishing unit is used for correcting a coordinate system of the robot according to the offset of the tool center of the end effector and the flange center on the plane to establish a first tool coordinate system;
and the point location mapping relation calculation unit is used for maintaining the pose and controlling the end effector, carrying the calibration plate, to calibrate coordinates with the shooting device, so as to obtain the point location mapping relation between the first tool coordinate system and the image coordinate system of the shooting device.
In one implementation example, the offset calculation unit includes the following subunits, whose combined computation is sketched after the list:
the first alignment subunit is used for maintaining the pose, controlling the end effector to carry a calibration plate to align with the center of the visual field of the shooting device, and recording first position information, a first angle and calibration coordinates of the center of the calibration plate in an image shot by the shooting device during alignment;
the second alignment subunit is used for maintaining the pose, rotating the end effector from the first angle to a second angle, controlling the end effector to carry the calibration plate to align with the calibration coordinates in an image shot by the shooting device under the condition that the rotation angle is the second angle, and recording second position information of the end effector during alignment; the second angle is an angle formed by rotating the first angle by 180 degrees;
the offset radius and rotation angle calculation subunit is used for calculating the offset radius and the rotation angle of the end effector according to the first position information, the first angle, the second position information and the second angle;
and the offset amount operator unit is used for obtaining the offset of the tool center and the flange center of the end effector on the plane according to the offset radius and the rotation angle.
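These subunits can be read as the following computation. Because the calibration-plate center is re-aligned to the same camera coordinate after the half-turn, the two recorded flange positions are symmetric about the fixed tool point; the sketch below rests on that geometric reading of the text, and its names are illustrative rather than taken from the patent.

```python
import math

def flange_tool_offset(p1, angle1_deg, p2):
    """Planar offset of the tool center from the flange center (reconstruction).

    p1, p2: flange positions recorded when the calibration-plate center is
    aligned to the same calibration coordinate at the first angle and at the
    first angle + 180 degrees. The tool point stays fixed during the half-turn,
    so it sits at the midpoint of p1 and p2.
    """
    tx = (p1[0] + p2[0]) / 2.0                               # fixed tool point
    ty = (p1[1] + p2[1]) / 2.0
    radius = math.hypot(p2[0] - p1[0], p2[1] - p1[1]) / 2.0  # offset radius
    theta = math.atan2(ty - p1[1], tx - p1[0]) - math.radians(angle1_deg)
    return radius * math.cos(theta), radius * math.sin(theta)
```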
In one embodiment, the point location mapping relation calculating unit includes:
the sampling subunit is used for maintaining the pose, controlling the end effector to carry the calibration plate to move in the visual field range of the shooting device, and simultaneously sampling the coordinate position of the end effector in a first tool coordinate system and the pixel coordinate of the circle center of the calibration plate in the image shot by the shooting device for N times according to a preset rule; n is more than or equal to 4;
and the point location mapping relation calculating subunit is configured to obtain a point location mapping relation between the first tool coordinate system and the image coordinate system of the shooting device according to the coordinate position and the pixel coordinate of each sampling.
A first offset calculation module 602, configured to control the shooting device to shoot a part center of an assembly position grabbed by an end effector of the robot at a pose where a flange plane of the robot is parallel to a base plane, and obtain a first offset between a tool center of the end effector and the part center of the assembly position according to the point-to-point mapping relationship;
in one implementation example, the first offset calculation module 602 includes:
the alignment unit is used for controlling the alignment of the end effector of the robot and a part assembled in an assembly position in advance in a posture that a flange plane of the robot is parallel to a base plane, and recording third position information of the end effector during alignment; the third position information is the assembly position;
the part shooting unit is used for maintaining the pose, controlling an end effector of the robot to grab the part at the assembling position and horizontally move to the visual field range of the shooting device so as to shoot the grabbed part;
the pixel position acquisition unit is used for acquiring the pixel position of the image shot by the shooting device at the center of the grabbed part;
the target position calculation unit is used for converting the pixel position according to the point location mapping relation to obtain a target position of the end effector;
and the offset calculation unit is used for acquiring the current actual coordinate and the current third rotation angle of the end effector, and obtaining a first offset between the center of the tool and the center of the part at the assembly position according to the current actual coordinate and the current third rotation angle of the end effector and the target position.
A dynamic tool coordinate system establishing module 603, configured to correct the coordinate system of the robot according to the first offset to establish a dynamic tool coordinate system, and obtain an assembly position coordinate of the assembly position in the dynamic tool coordinate system;
a second offset calculation module 604, configured to maintain the pose, control the end effector to move to a pickup position to pick up a part, control the shooting device to shoot a part center of the pickup position picked by the end effector, and obtain a second offset between the tool center and the part center of the pickup position according to the point location mapping relationship;
in one implementation example, the second offset calculation module 604 includes:
the part shooting unit is used for maintaining the pose, controlling the end effector to move to a part taking position to grab a part and horizontally move to the visual field range of the shooting device so as to shoot the grabbed part;
the pixel position acquisition unit is used for acquiring the pixel position of the image shot by the shooting device at the center of the grabbed part;
the target position calculation unit is used for converting the pixel position according to the point location mapping relation to obtain a target position of the end effector;
and the offset calculation unit is used for acquiring the current actual coordinate and the current fourth rotation angle of the end effector and obtaining a second offset between the tool center and the part center of the pickup position according to the current actual coordinate and rotation angle of the end effector and the target position.
And the part assembling module 605 is configured to re-teach the dynamic tool coordinate system according to the second offset, and control the end effector to move the grasped part to the assembling position coordinate for part assembling according to the re-taught dynamic tool coordinate system.
In one example implementation, the parts assembly module 605 includes:
the teaching unit is used for re-teaching the dynamic tool coordinate system according to the second offset;
and the part assembling unit is used for controlling the end effector to move the grabbed part to the assembling position coordinate and rotate the fourth rotation angle to assemble the part according to the re-taught dynamic tool coordinate system.
According to the device for realizing assembly through a robot provided by this embodiment of the invention, the point location mapping relation between the coordinate system of the robot and the image coordinate system of the shooting device, which is vertically fixed right below the flange plane of the robot in an upward view mode, is acquired; with the flange plane of the robot parallel to the base plane, the shooting device is controlled to photograph the part center of the assembly position grabbed by the end effector of the robot, and a first offset between the tool center and the part center of the assembly position is obtained according to the point location mapping relation; the coordinate system of the robot is corrected according to the first offset to establish a dynamic tool coordinate system, and the assembly position coordinates of the assembly position in the dynamic tool coordinate system are obtained; maintaining the pose, the end effector is controlled to move to the pick-up position to grab a part, the shooting device is controlled to photograph the part center of the pick-up position grabbed by the end effector, and a second offset between the tool center and the part center of the pick-up position is obtained according to the point location mapping relation; the dynamic tool coordinate system is then re-taught according to the second offset, and the end effector is controlled, under the re-taught dynamic tool coordinate system, to move the grabbed part to the assembly position coordinates for part assembly. After the end effector grabs the part from the pick-up position, the shooting device photographs it to obtain the position, angle, and center of the current part; a dynamic tool coordinate system is established from the current part center, and under this coordinate system the robot moves the grabbed part to the assembly position for precise assembly. The correction of the part's position and angle is completed with a single photograph and without human-eye teaching, which shortens the time consumed by part assembly, reduces the amount of calculation, and improves assembly precision.
EXAMPLE V
Fig. 7 is a schematic structural diagram of a robot according to a fifth embodiment of the present invention. The robot includes: a processor 71, a memory 72, and a computer program 73 stored in the memory 72 and executable on the processor 71, such as a program implementing the method for realizing assembly through a robot. When executing the computer program 73, the processor 71 implements the steps in the above method embodiments of realizing assembly through a robot, such as steps S210 to S250 shown in fig. 2.
Illustratively, the computer program 73 may be partitioned into one or more modules that are stored in the memory 72 and executed by the processor 71 to accomplish the present application. The one or more modules may be a series of computer program instruction segments capable of performing specific functions, which are used to describe the execution of the computer program 73 in the robot. For example, the computer program 73 may be divided into a point location mapping relationship obtaining module, a first offset calculating module, a dynamic tool coordinate system establishing module, a second offset calculating module, and a part assembling module, and each module has the following specific functions:
the point location mapping relation acquisition module is used for acquiring the point location mapping relation between the coordinate system of the robot and the image coordinate system of the shooting device; the shooting device is vertically fixed right below a flange plane of the robot in an upward view mode;
the first offset calculation module is used for controlling the shooting device to shoot the part center of the assembly position grabbed by the end effector of the robot according to the pose in which the flange plane of the robot is parallel to the base plane, and obtaining a first offset of the tool center of the end effector and the part center of the assembly position according to the point position mapping relation;
the dynamic tool coordinate system establishing module is used for correcting the coordinate system of the robot according to the first offset to establish a dynamic tool coordinate system and obtaining the assembly position coordinates of the assembly position in the dynamic tool coordinate system;
the second offset calculation module is used for maintaining the pose, controlling the end effector to move to a pick-up position to grab a part, controlling the shooting device to photograph the part center of the pick-up position grabbed by the end effector, and obtaining a second offset between the tool center and the part center of the pick-up position according to the point location mapping relation;
and the part assembling module is used for re-teaching the dynamic tool coordinate system according to the second offset and controlling the end effector to move the grabbed part to the assembling position coordinate to assemble the part according to the re-taught dynamic tool coordinate system.
The robot may include, but is not limited to, a processor 71, a memory 72, and a computer program 73 stored in the memory 72. Those skilled in the art will appreciate that fig. 7 is merely an example of a robot and is not intended to be limiting; the robot may include more or fewer components than shown, combine certain components, or use different components. For example, the robot may also include input and output devices, network access devices, buses, and the like.
The processor 71 may be a Central Processing Unit (CPU), another general-purpose processor, a Digital Signal Processor (DSP), an Application-Specific Integrated Circuit (ASIC), a Field-Programmable Gate Array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, or the like. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor.
The memory 72 may be an internal storage unit of the robot, such as a hard disk or memory of the robot. The memory 72 may also be an external storage device, such as a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) card, or a flash card (Flash Card) provided on the robot. Further, the memory 72 may include both an internal storage unit and an external storage device of the robot. The memory 72 is used to store the computer program and the other programs and data required by the method for realizing assembly through a robot, and may also be used to temporarily store data that has been output or is to be output.
It will be apparent to those skilled in the art that, for convenience and brevity of description, only the above-mentioned division of the functional units and modules is illustrated, and in practical applications, the above-mentioned function distribution may be performed by different functional units and modules according to needs, that is, the internal structure of the apparatus is divided into different functional units or modules to perform all or part of the above-mentioned functions. Each functional unit and module in the embodiments may be integrated in one processing unit, or each unit may exist alone physically, or two or more units are integrated in one unit, and the integrated unit may be implemented in a form of hardware, or in a form of software functional unit. In addition, specific names of the functional units and modules are only for convenience of distinguishing from each other, and are not used for limiting the protection scope of the present application. The specific working processes of the units and modules in the system may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
In the above embodiments, the descriptions of the respective embodiments have respective emphasis, and reference may be made to the related descriptions of other embodiments for parts that are not described or illustrated in a certain embodiment.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present invention.
In the embodiments provided in the present invention, it should be understood that the disclosed apparatus/terminal device and method may be implemented in other ways. For example, the above-described embodiments of the apparatus/terminal device are merely illustrative, and for example, the division of the modules or units is only one logical division, and there may be other divisions when actually implemented, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment. In addition, functional units in the embodiments of the present invention may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated modules/units, if implemented in the form of software functional units and sold or used as separate products, may be stored in a computer readable storage medium. Based on such understanding, all or part of the flow of the method according to the embodiments of the present invention may also be implemented by a computer program, which may be stored in a computer-readable storage medium, and when the computer program is executed by a processor, the steps of the method embodiments may be implemented. Wherein the computer program comprises computer program code, which may be in the form of source code, object code, an executable file or some intermediate form, etc. The computer-readable medium may include: any entity or device capable of carrying the computer program code, recording medium, usb disk, removable hard disk, magnetic disk, optical disk, computer Memory, Read-Only Memory (ROM), Random Access Memory (RAM), electrical carrier wave signals, telecommunications signals, software distribution medium, and the like. It should be noted that the computer readable medium may contain content that is subject to appropriate increase or decrease as required by legislation and patent practice in jurisdictions, for example, in some jurisdictions, computer readable media does not include electrical carrier signals and telecommunications signals as is required by legislation and patent practice.
The above-mentioned embodiments are only used for illustrating the technical solutions of the present invention, and not for limiting the same; although the present invention has been described in detail with reference to the foregoing embodiments, it will be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; such modifications and substitutions do not substantially depart from the spirit and scope of the embodiments of the present invention, and are intended to be included within the scope of the present invention.

Claims (10)

1. A method of robotic assembly, comprising:
acquiring a point location mapping relation between a coordinate system of the robot and an image coordinate system of the shooting device; the shooting device is vertically fixed right below a flange plane of the robot in an upward view mode;
controlling the shooting device to shoot the part center of the assembly position grabbed by the end effector of the robot at the pose where the flange plane of the robot is parallel to the base plane, and obtaining a first offset between the tool center of the end effector and the part center of the assembly position according to the point-to-point mapping relation;
correcting the coordinate system of the robot according to the first offset to establish a dynamic tool coordinate system and obtain the assembly position coordinates of the assembly position in the dynamic tool coordinate system;
maintaining the pose, controlling the end effector to move to a part taking position to grab a part, controlling the shooting device to shoot the part center of the part taking position grabbed by the end effector, and obtaining a second offset of the tool center and the part center of the part taking position according to the point position mapping relation;
and re-teaching the dynamic tool coordinate system according to the second offset, and controlling the end effector to move the grabbed part to the assembly position coordinate for part assembly according to the re-taught dynamic tool coordinate system.
2. The method for realizing assembly through a robot according to claim 1, wherein the obtaining of the point-to-point mapping relationship between the coordinate system of the robot and the image coordinate system of the shooting device comprises:
maintaining the pose, and controlling the end effector to carry a calibration plate and the shooting device to carry out shooting calibration to obtain the offset of the tool center and the flange center of the end effector on the plane;
correcting a coordinate system of the robot according to the offset of the tool center of the end effector and the flange center on a plane to establish a first tool coordinate system;
and maintaining the pose, and controlling the end effector to carry the calibration plate and the calibration coordinates of the shooting device to obtain a point location mapping relation between the first tool coordinate system and the image coordinate system of the shooting device.
3. The method for realizing assembly through a robot according to claim 2, wherein the maintaining of the pose, the controlling of the end effector carrying a calibration plate and the shooting device for shooting calibration, and the obtaining of the offset of the tool center and the flange center of the end effector on the plane comprise:
maintaining the pose, controlling the end effector to carry a calibration plate to align with the center of the visual field of the shooting device, and recording first position information and a first angle of the end effector and calibration coordinates of the center of a circle of the calibration plate in an image shot by the shooting device during alignment;
maintaining the pose, rotating the end effector from the first angle to a second angle, controlling the end effector to carry the calibration plate to align with the calibration coordinates in an image shot by the shooting device under the condition that the rotation angle is the second angle, and recording second position information of the end effector during alignment; the second angle is an angle formed by rotating the first angle by 180 degrees;
calculating the offset radius and the rotation angle of the end effector according to the first position information, the first angle, the second position information and the second angle;
and obtaining the offset of the tool center and the flange center of the end effector on the plane according to the offset radius and the rotation angle.
4. The method for realizing assembly through a robot according to claim 3, wherein the maintaining the pose and controlling the end effector to carry the calibration plate and the calibration coordinates of the shooting device to obtain the point mapping relation between the first tool coordinate system and the image coordinate system of the shooting device comprises:
maintaining the pose, controlling the end effector to carry the calibration plate to move in the visual field range of the shooting device, and simultaneously sampling the coordinate position of the end effector in a first tool coordinate system and the pixel coordinate of the circle center of the calibration plate in an image shot by the shooting device for N times according to a preset rule; n is more than or equal to 4;
and obtaining a point location mapping relation between the first tool coordinate system and the image coordinate system of the shooting device according to the coordinate position and the pixel coordinate of each sampling.
5. The method for realizing assembly through a robot according to claim 4, wherein the preset rule is that the calibration plate is sampled when it is positioned at the center, the midpoints of the four edges, and the four corners of the image shot by the shooting device; wherein N is 9.
6. The method for realizing assembly by a robot according to any one of claims 1 to 5, wherein controlling the photographing device to photograph a part center of an assembly position grasped by an end effector of the robot in a posture in which a flange plane of the robot is parallel to a base plane, and obtaining a first offset amount of a tool center of the end effector and the part center of the assembly position from the point-to-point mapping relation comprises:
controlling the alignment of an end effector of the robot and a part assembled in an assembly position in advance according to the pose in which a flange plane of the robot is parallel to a base plane, and recording third position information of the end effector during the alignment; the third position information is the assembly position;
maintaining the pose, and controlling an end effector of the robot to grab the part at the assembling position and translate the part to the view range of the shooting device so as to shoot the grabbed part;
obtaining the pixel position of the center of the grabbed part in the image shot by the shooting device;
converting the pixel position according to the point location mapping relation to obtain a target position of the end effector;
and acquiring the current actual coordinate and the current third rotation angle of the end effector, and acquiring a first offset of the center of the tool and the center of the part at the assembling position according to the current actual coordinate and the current third rotation angle of the end effector and the target position.
7. The method for robot assembly according to claim 6, wherein the maintaining of the pose, the controlling of the end effector to move to a pickup position for picking up the part, and the controlling of the photographing device to photograph a part center of the pickup position picked by the end effector, and the obtaining of the second offset amount between the tool center and the part center of the pickup position according to the point-to-point mapping relationship comprise:
maintaining the pose, controlling the end effector to move to a pickup position to grab the part and translate the part to the visual field range of the shooting device so as to shoot the grabbed part;
obtaining the pixel position of the center of the grabbed part in the image shot by the shooting device;
converting the pixel position according to the point location mapping relation to obtain a target position of the end effector;
and acquiring the current actual coordinate and the current fourth rotation angle of the end effector, and obtaining a second offset of the center of the tool and the center of the part at the pick-up position according to the current actual coordinate and rotation angle of the end effector and the target position.
8. The method of robotic assembly according to claim 7, wherein re-teaching the dynamic tool coordinate system according to the second offset and controlling the end effector to move the grasped part to the assembly position coordinates for part assembly according to the re-taught dynamic tool coordinate system comprises:
re-teaching the dynamic tool coordinate system according to the second offset;
and controlling the end effector to move the grabbed part to the assembly position coordinate and rotate the fourth rotation angle for part assembly according to the re-taught dynamic tool coordinate system.
9. A method of robotic assembly, comprising:
acquiring a point location mapping relation between a coordinate system of the robot and an image coordinate system of the shooting device; the shooting device is vertically fixed right below a flange plane of the robot in an upward view mode;
controlling an end effector of the robot to move to a pickup position to pick up a part according to the pose in which a flange plane of the robot is parallel to a base plane, controlling a shooting device to shoot the part center of the pickup position picked by the end effector, and obtaining the offset between the tool center of the end effector and the part center of the pickup position according to the point location mapping relation;
correcting a coordinate system of the robot according to the offset to establish a dynamic tool coordinate system, controlling the end effector to move the grabbed parts to an assembly position for assembly, and recording assembly position coordinates of the assembly position in the dynamic tool coordinate system;
and controlling the end effector to move the grabbed parts to the assembly position coordinates according to the dynamic tool coordinate system to assemble the parts.
10. A robot comprising a memory, a processor and a computer program stored in the memory and executable on the processor, wherein the steps of the robot-implemented assembly method according to any of claims 1 to 9 are implemented when the computer program is executed by the processor.
CN201911414360.5A 2019-12-31 2019-12-31 Method and device for realizing assembly through robot and robot Active CN111791226B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911414360.5A CN111791226B (en) 2019-12-31 2019-12-31 Method and device for realizing assembly through robot and robot

Publications (2)

Publication Number Publication Date
CN111791226A true CN111791226A (en) 2020-10-20
CN111791226B CN111791226B (en) 2021-12-03

Family

ID=72805064

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911414360.5A Active CN111791226B (en) 2019-12-31 2019-12-31 Method and device for realizing assembly through robot and robot

Country Status (1)

Country Link
CN (1) CN111791226B (en)

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113643384A (en) * 2021-10-12 2021-11-12 深圳荣耀智能机器有限公司 Coordinate system calibration method, automatic assembly method and device
CN114179090A (en) * 2021-12-28 2022-03-15 苏州优速软件研发有限公司 Rotation assembly control method, system, equipment and storage medium for manipulator
CN114589689A (en) * 2020-12-07 2022-06-07 山东新松工业软件研究院股份有限公司 Visual positioning method and device based on two-dimensional code and computer readable storage medium
CN117140540A (en) * 2023-08-04 2023-12-01 上海智元新创技术有限公司 Fastening system and pick-up fastening method of fastening system
CN114248293B (en) * 2021-12-23 2024-04-09 成都大学 2D laser profiler and 2D camera-based perforated part grabbing method and system

Patent Citations (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20140334668A1 (en) * 2013-05-10 2014-11-13 Palo Alto Research Center Incorporated System and method for visual motion based object segmentation and tracking
CN104482924A (en) * 2014-12-11 2015-04-01 中国航天空气动力技术研究院 Revolution body object pose vision measurement method
CN106256508A (en) * 2015-06-22 2016-12-28 罗普伺达有限公司 Robot control system and the method for part are installed
CN107924175A (en) * 2015-07-23 2018-04-17 X开发有限责任公司 System and method for determining a work offset
CN106426161A (en) * 2015-08-06 2017-02-22 康耐视公司 System and method for interlinking machine vision coordinate spaces together in a guide assembly environment
US20180126547A1 (en) * 2016-09-16 2018-05-10 Carbon Robotics, Inc. System and calibration, registration, and training methods
CN106426172A (en) * 2016-10-27 2017-02-22 深圳元启智能技术有限公司 Calibration method and system for industrial robot tool coordinate system
CN108122257A (en) * 2016-11-28 2018-06-05 沈阳新松机器人自动化股份有限公司 A kind of Robotic Hand-Eye Calibration method and device
EP3441200A1 (en) * 2017-08-07 2019-02-13 Elwema Automotive GmbH Referencing method and device for industrial robots
CN108364320A (en) * 2018-03-29 2018-08-03 深圳市自行科技有限公司 camera calibration method, terminal device and computer readable storage medium
CN110349218A (en) * 2018-04-03 2019-10-18 泰科电子(上海)有限公司 The scaling method and calibration system of video camera
CN110340881A (en) * 2018-04-03 2019-10-18 泰科电子(上海)有限公司 The scaling method and calibration system of robot tool
CN109318234A (en) * 2018-11-09 2019-02-12 哈尔滨工业大学 A kind of scaling method suitable for visual servo plug operation
US10399227B1 (en) * 2019-03-29 2019-09-03 Mujin, Inc. Method and control system for verifying and updating camera calibration for robot control
CN110276799A (en) * 2019-05-29 2019-09-24 季华实验室 A kind of coordinate scaling method, calibration system and mechanical arm
CN110405773A (en) * 2019-08-19 2019-11-05 广东博智林机器人有限公司 A kind of floor mounting method and robot

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
WENCHANG ZHANG; RONGHAO WANG; LIJUAN JI; RUNNING CAO; LEI CHEN: "High Precise Position Method Based on 2 CCD Cameras in Alternate-Angle Image Acquisition Mode", Mechatronics, Robotics and Automation *
YU QIANGQIANG: "Research on Precise Pose Recognition of a Nuclear Fuel Rod Assembly Robot Based on Visual Servoing", China Masters' Theses Full-text Database, Engineering Science and Technology II *

Also Published As

Publication number Publication date
CN111791226B (en) 2021-12-03

Similar Documents

Publication Publication Date Title
CN111791227B (en) Robot hand-eye calibration method and device and robot
CN111791226B (en) Method and device for realizing assembly through robot and robot
TWI672206B (en) Method and apparatus of non-contact tool center point calibration for a mechanical arm, and a mechanical arm system with said calibration function
CN109483531B (en) Machine vision system and method for picking and placing FPC board by manipulator at fixed point
JP2020116734A (en) System and method for automatic hand-eye calibration of vision system for robot motion
CN110666798B (en) Robot vision calibration method based on perspective transformation model
CN108827154B (en) Robot non-teaching grabbing method and device and computer readable storage medium
CN111801198B (en) Hand-eye calibration method, system and computer storage medium
CN109671122A (en) Trick camera calibration method and device
CN110276799B (en) Coordinate calibration method, calibration system and mechanical arm
CN110276806A (en) Online hand-eye calibration and crawl pose calculation method for four-freedom-degree parallel-connection robot stereoscopic vision hand-eye system
CN107808400B (en) Camera calibration system and calibration method thereof
JP2009269110A (en) Assembly equipment
US20150202776A1 (en) Data generation device for vision sensor and detection simulation system
CN110465946B (en) Method for calibrating relation between pixel coordinate and robot coordinate
CN113379849A (en) Robot autonomous recognition intelligent grabbing method and system based on depth camera
CN110136068B (en) Sound membrane dome assembly system based on position calibration between bilateral telecentric lens cameras
CN113500593B (en) Method for grabbing designated part of shaft workpiece for feeding
CN114833832B (en) Robot hand-eye calibration method, device, equipment and readable storage medium
CN111591474B (en) Alignment type hand-eye calibration method for spacecraft on-orbit operating system
JP2017006990A (en) Robot, control device, and control method
CN107756391A (en) The bearing calibration of mechanical arm correction system
CN117173254A (en) Camera calibration method, system, device and electronic equipment
CN116652970A (en) Four-axis mechanical arm 2D hand-eye calibration method and system and storage medium
CN109177138B (en) Method and device for aligning glass and membrane

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant