CN115366105A - Workpiece grabbing method and device, electronic equipment and storage medium - Google Patents
Workpiece grabbing method and device, electronic equipment and storage medium
- Publication number
- CN115366105A (application CN202211052358.XA)
- Authority
- CN
- China
- Prior art keywords
- workpiece
- image
- calibration
- translation
- coordinate system
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1694—Programme controls characterised by use of sensors other than normal servo-feedback from position, speed or acceleration sensors, perception control, multi-sensor controlled systems, sensor fusion
- B25J9/1697—Vision controlled systems
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J13/00—Controls for manipulators
- B25J13/08—Controls for manipulators by means of sensing devices, e.g. viewing or touching devices
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1602—Programme controls characterised by the control system, structure, architecture
- B25J9/161—Hardware, e.g. neural networks, fuzzy logic, interfaces, processor
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1612—Programme controls characterised by the hand, wrist, grip control
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/70—Determining position or orientation of objects or cameras
- G06T7/73—Determining position or orientation of objects or cameras using feature-based methods
Landscapes
- Engineering & Computer Science (AREA)
- Robotics (AREA)
- Mechanical Engineering (AREA)
- Automation & Control Theory (AREA)
- Physics & Mathematics (AREA)
- Evolutionary Computation (AREA)
- Mathematical Physics (AREA)
- Software Systems (AREA)
- Fuzzy Systems (AREA)
- Artificial Intelligence (AREA)
- Computer Vision & Pattern Recognition (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Human Computer Interaction (AREA)
- Health & Medical Sciences (AREA)
- General Health & Medical Sciences (AREA)
- Orthopedic Medicine & Surgery (AREA)
- Manipulator (AREA)
Abstract
The embodiment of the application provides a workpiece grabbing method, a workpiece grabbing device, electronic equipment and a storage medium, wherein the workpiece grabbing method comprises the following steps: acquiring a first workpiece image acquired by a camera; determining at least one first feature point from the first workpiece image; determining a first spatial coordinate of the first workpiece in a spatial coordinate system of the mechanical arm according to the pixel coordinates of the at least one first feature point and calibration information; and generating control information according to the first spatial coordinate, and sending the control information to the mechanical arm so that the mechanical arm grabs the first workpiece according to the control information. According to this scheme, the position of the workpiece is identified through the mapping relation between the pixel coordinate system and the spatial coordinate system, and the mechanical arm is controlled to grab the workpiece automatically. This solves the problem that manually placing the target object at a specific angle and position before grabbing consumes a long time and lowers production efficiency.
Description
Technical Field
The embodiment of the application relates to the technical field of computer vision, in particular to a workpiece grabbing method and device, electronic equipment and a storage medium.
Background
In the workpiece production process, in order to reduce labor cost and improve safety, a mechanical arm is needed to replace manual work in grabbing workpieces so as to transport, turn over, weld and otherwise handle them. The workpiece is grabbed by simulating the way human eyes see an object and driving the mechanical arm accordingly. However, the grabbing effect of the mechanical arm still differs greatly from that of manual grabbing.
At present, in order to deal with the above difference, a common method is to place the workpiece right below the mechanical arm and manually adjust the workpiece to a specific angle and position, so as to ensure the grabbing effect of the mechanical arm.
However, manually placing the workpiece at a specific angle and position takes a long time, resulting in low production efficiency.
Disclosure of Invention
In order to solve the above technical problem, embodiments of the present application provide a workpiece grabbing method and device, an electronic device, and a storage medium, so as to at least solve or alleviate the above problem.
According to a first aspect of embodiments of the present application, there is provided a workpiece gripping method including: acquiring a first workpiece image acquired by a camera, wherein the first workpiece image comprises an image of a first workpiece on a stage; determining at least one first feature point from the first workpiece image, wherein the first feature point is used for indicating the position of the image of the first workpiece in the first workpiece image; determining a first spatial coordinate of the first workpiece in a spatial coordinate system of the mechanical arm according to the pixel coordinate of the at least one first characteristic point and calibration information, wherein the calibration information is used for indicating a mapping relation between the pixel coordinate system of the image acquired by the camera and the spatial coordinate system; and generating control information according to the first space coordinate, and sending the control information to the mechanical arm so that the mechanical arm can grab the first workpiece according to the control information.
According to a second aspect of embodiments of the present application, there is provided an object grasping apparatus including: an acquisition module, configured to acquire a first workpiece image acquired by a camera, wherein the first workpiece image comprises an image of a first workpiece on an object stage; an extraction module, configured to determine at least one first feature point from the first workpiece image, wherein the first feature point is used to indicate the position of the image of the first workpiece in the first workpiece image; a determining module, configured to determine a first spatial coordinate of the first workpiece in a spatial coordinate system of the mechanical arm according to the pixel coordinates of the at least one first feature point and calibration information, wherein the calibration information is used to indicate a mapping relation between the pixel coordinate system of the image acquired by the camera and the spatial coordinate system; and a grabbing module, configured to generate control information according to the first spatial coordinate and send the control information to the mechanical arm so that the mechanical arm grabs the first workpiece according to the control information.
According to a third aspect of embodiments of the present application, there is provided an electronic device comprising a processor, a memory, a communication interface and a communication bus, wherein the processor, the memory and the communication interface communicate with one another through the communication bus; the memory is used for storing at least one executable instruction, and the executable instruction causes the processor to perform the operations corresponding to any one of the above method embodiments.
According to a fourth aspect of embodiments of the present application, there is provided a computer storage medium having stored thereon a computer program which, when executed by a processor, implements any one of the method embodiments as described above.
According to a fifth aspect of embodiments of the present application, there is provided a computer program product comprising computer instructions for instructing a computing device to perform operations corresponding to any one of the above-described method embodiments.
According to the technical scheme, at least one first characteristic point determined from the first workpiece image can indicate the position of the image of the first workpiece in the first workpiece image, the calibration information can indicate the mapping relation between the pixel coordinate system of the image acquired by the camera and the space coordinate system of the mechanical arm, the position of the first workpiece in the space coordinate system of the mechanical arm can be determined through the calibration information and the pixel coordinate of the first characteristic point, further, control information can be generated according to the position of the first workpiece in the space coordinate system, and after the control information is sent to the mechanical arm, the mechanical arm captures the first workpiece according to the control information. Therefore, the position of the workpiece in the space coordinate system of the mechanical arm can be determined, so that the mechanical arm can automatically grab the workpiece randomly placed on the objective table, the workpiece does not need to be placed on the objective table according to a specific angle and a specific position, time consumed for placing the workpiece on the objective table can be saved, and the grabbing efficiency of the workpiece is improved.
Drawings
In order to more clearly illustrate the embodiments of the present application or the technical solutions in the prior art, the drawings needed in the description of the embodiments or the prior art are briefly described below. It is obvious that the drawings in the following description are only some embodiments described in the embodiments of the present application, and other drawings can be obtained by those skilled in the art from these drawings.
FIG. 1 is a flow chart of a workpiece grabbing method according to one embodiment of the present application;
FIG. 2 is a flow chart of a second spatial coordinate acquisition method according to an embodiment of the present application;
FIG. 3 is a flow diagram of an offset determination method according to an embodiment of the present application;
FIG. 4 is a flow chart of a calibration information determination method according to an embodiment of the present application;
FIG. 5 is a schematic view of a robotic arm movement sequence according to one embodiment of the present application;
FIG. 6 is a schematic illustration of the point location coordinates of the robotic arm movement of one embodiment of the present application;
FIG. 7 is a schematic view of an object grasping apparatus according to one embodiment of the present application;
FIG. 8 is a schematic view of an electronic device of an embodiment of the application.
List of reference numerals:
100: workpiece grasping method 200: control information generation method 300: offset determination method
400: calibration information determination method 700: the object grasping apparatus 701: acquisition module
702: the extraction module 703: the determination module 704: grabbing module
800: the electronic device 801: the processor 802: communication interface
803: the storage 804: communication bus 805: procedure for measuring the movement of a moving object
101: acquiring a first workpiece image acquired by a camera
102: determining at least one first feature point from a first workpiece image
103: determining a first spatial coordinate of a first workpiece in a spatial coordinate system of a robot arm
104: generating control information according to the first space coordinate, and sending the control information to the mechanical arm
201: acquiring a second workpiece image previously acquired by the camera
202: determining at least one second feature point from a second workpiece image
203: determining second spatial coordinates of the first workpiece in a spatial coordinate system
301: Determining an X-axis displacement offset of a position of a first workpiece in a spatial coordinate system relative to a standard position
302: Determining a Y-axis displacement offset of the position of the first workpiece in the spatial coordinate system relative to the standard position
303: Determining a Z-axis rotation offset of the position of the first workpiece in the spatial coordinate system relative to the standard position
401: determining a calibration reference point located on a second workpiece
402: controlling the mechanical arm to perform N times of translation along the X-axis and/or Y-axis directions in a space coordinate system
403: obtaining a translation calibration image acquired by a camera after each translation of a mechanical arm
404: controlling the mechanical arm to rotate for M times around the Z axis in a space coordinate system
405: obtaining a rotation calibration image acquired by a camera after each rotation of a mechanical arm
406: determining calibration information
Detailed Description
In order to make those skilled in the art better understand the technical solutions in the embodiments of the present application, the technical solutions in the embodiments of the present application will be described clearly and completely below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only a part of the embodiments of the present application, but not all embodiments. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments in the present application shall fall within the scope of the protection of the embodiments in the present application.
Workpiece grabbing method
Fig. 1 illustrates a workpiece capture method 100 according to an embodiment of the present application, where the workpiece capture method 100 includes the following steps:
Step 101: a first workpiece image acquired by a camera is obtained.
The first workpiece image includes an image of a first workpiece, which may be a circuit board or the like. When the first workpiece image is collected, the first workpiece is placed on the objective table; after the mechanical arm moves to a position above the objective table, the camera, which is installed on the mechanical arm and moves synchronously with it, collects an image of the first workpiece to obtain the first workpiece image.
Step 102: at least one first feature point is determined from the first workpiece image.
After the image of the first workpiece to be grabbed is acquired by the camera, the position of the image of the first workpiece within the first workpiece image needs to be determined. At least one first feature point is determined from the first workpiece image, where a first feature point is a point at which the gray value changes sharply or a point of large curvature on the edge of the image of the first workpiece. The first feature points can indicate the position of the image of the first workpiece to be grabbed in the first workpiece image.
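The description does not prescribe a specific extraction algorithm. As an illustration only, the following OpenCV sketch takes points on the workpiece outline as candidate feature points; the Otsu-threshold segmentation and the minimum-area filter are assumptions that rely on the workpiece contrasting clearly with the stage.

```python
import cv2
import numpy as np

def extract_feature_points(workpiece_image_bgr, min_area=1000.0):
    """Return candidate feature points (pixel coordinates) on the workpiece contour.

    A minimal sketch: it assumes the workpiece contrasts strongly with the stage,
    so a global Otsu threshold separates the two; real setups may need different
    segmentation.
    """
    gray = cv2.cvtColor(workpiece_image_bgr, cv2.COLOR_BGR2GRAY)
    _, mask = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    # OpenCV 4.x: findContours returns (contours, hierarchy).
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    contours = [c for c in contours if cv2.contourArea(c) > min_area]
    if not contours:
        return np.empty((0, 2), dtype=np.float32)
    outline = max(contours, key=cv2.contourArea)       # largest blob = workpiece outline
    return outline.reshape(-1, 2).astype(np.float32)   # contour vertices as feature points
```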
Step 103: a first spatial coordinate of the first workpiece in the spatial coordinate system of the mechanical arm is determined.
The robot arm has a corresponding spatial coordinate system based on which the robot arm can be moved from one position to another. In order to realize that the mechanical arm grasps the first workpiece, a first spatial coordinate of the first workpiece in a spatial coordinate system of the mechanical arm needs to be determined, so that the mechanical arm determines a position of the first workpiece in the spatial coordinate system according to the first spatial coordinate of the first workpiece, and further grasps the first workpiece.
The calibration information may indicate a mapping between the pixel coordinate system of the image captured by the camera and the spatial coordinate system of the mechanical arm; since the mechanical arm and the camera move synchronously, the calibration information is fixed. Because the pixel coordinates of each first feature point can indicate the position of the image of the first workpiece in the first workpiece image, the pixel coordinates of each first feature point can be transformed into spatial coordinates in the spatial coordinate system according to those pixel coordinates and the calibration information, thereby obtaining a first spatial coordinate that indicates the position of the first workpiece in the spatial coordinate system.
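As a sketch of this conversion, the calibration information can be modelled as a 2x3 affine matrix mapping pixel coordinates into the arm's XY plane; both this representation and the use of the principal axis of the mapped contour points to obtain the rotation angle are illustrative assumptions, not the only form the mapping could take.

```python
import numpy as np

def pixels_to_robot_xy(pixel_points, affine_2x3):
    """Map Nx2 pixel coordinates into the mechanical arm's XY plane.

    `affine_2x3` stands in for the calibration information; modelling it as a
    single affine matrix is an assumption made for this sketch.
    """
    pts = np.asarray(pixel_points, dtype=np.float64)
    homogeneous = np.hstack([pts, np.ones((pts.shape[0], 1))])
    return homogeneous @ np.asarray(affine_2x3, dtype=np.float64).T   # N x 2 robot XY

def first_spatial_coordinate(pixel_points, affine_2x3):
    """Summarise the workpiece pose as (x, y, theta_deg) in the arm's frame."""
    xy = pixels_to_robot_xy(pixel_points, affine_2x3)
    centre = xy.mean(axis=0)
    centred = xy - centre
    # Orientation taken from the principal axis of the mapped contour points.
    _, vecs = np.linalg.eigh(np.cov(centred.T))
    major = vecs[:, -1]                        # eigenvector of the largest eigenvalue
    theta = float(np.degrees(np.arctan2(major[1], major[0])))
    return float(centre[0]), float(centre[1]), theta
```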
Step 104: control information is generated according to the first spatial coordinate and sent to the mechanical arm.
The first space coordinate indicates the position of the first workpiece in the space coordinate system, corresponding control information can be generated according to the first space coordinate, and after the control information is sent to the mechanical arm, the mechanical arm can move to a proper position according to the control information to grab the first workpiece. The control information includes, but is not limited to, instructions for grabbing, translating, rotating, telescoping, etc.
In the embodiment of the application, at least one first feature point determined from the first workpiece image can indicate the position of the image of the first workpiece in the first workpiece image, the calibration information can indicate the mapping relation between the pixel coordinate system of the image acquired by the camera and the space coordinate system of the robot arm, the position of the first workpiece in the space coordinate system of the robot arm can be determined through the calibration information and the pixel coordinate of the first feature point, further, control information can be generated according to the position of the first workpiece in the space coordinate system, and after the control information is sent to the robot arm, the robot arm grasps the first workpiece according to the control information. Therefore, the position of the workpiece in the space coordinate system of the mechanical arm can be determined, so that the mechanical arm can automatically grab the workpiece randomly placed on the objective table, the workpiece does not need to be placed on the objective table according to a specific angle and a specific position, time consumed for placing the workpiece on the objective table can be saved, and the grabbing efficiency of the workpiece is improved.
In a possible implementation manner, when the control information is generated according to the first spatial coordinate, a predetermined second spatial coordinate may be obtained, where the second spatial coordinate is used to indicate a coordinate of the first workpiece in the spatial coordinate system when the first workpiece is located at a standard position on the stage, and then an offset of the position of the first workpiece in the spatial coordinate system with respect to the standard position is determined according to the first spatial coordinate and the second spatial coordinate, and the offset is further determined as the control information.
It will be appreciated that the second spatial coordinates are pre-obtained indicating the coordinates of the first workpiece in the spatial coordinate system if the first workpiece is located at a standard position on the stage. The workpiece of the same type as the first workpiece to be grasped can be placed at the standard position on the stage in advance to determine the second spatial coordinate, and the first workpiece does not need to be placed at the standard position on the stage every time of grasping in the process of actually controlling the mechanical arm to grasp the first workpiece.
Since the second spatial coordinate indicates the position of the first workpiece at the standard position in the spatial coordinate system, and the first spatial coordinate indicates the actual position of the first workpiece in the spatial coordinate system, the offset amount of the actual position of the first workpiece in the spatial coordinate system with respect to the standard position can be determined based on the first spatial coordinate and the second spatial coordinate.
The motion process of the mechanical arm is preset according to the standard position, and when the mechanical arm moves to the position above the objective table according to the motion process, the first workpiece placed in the standard position can be grabbed. In the process of actually grabbing the workpiece, the offset of the actual position of the first workpiece in the space coordinate system relative to the standard position is used as control information to be sent to the mechanical arm, and the mechanical arm can perform corresponding translation or rotation on a preset motion flow according to the control information so as to grab the first workpiece placed at any position on the objective table.
In the embodiment of the application, the first spatial coordinate indicates the actual position of the first workpiece in the spatial coordinate system, and the second spatial coordinate indicates the position of the first workpiece in the spatial coordinate system when it is located at the standard position. The offset of the actual position of the first workpiece relative to the standard position can therefore be determined according to the first spatial coordinate and the second spatial coordinate and sent to the mechanical arm as control information. The mechanical arm can then translate or rotate correspondingly relative to its preset motion flow, which corresponds to the standard position, so as to grab a workpiece located at a non-standard position on the objective table, which simplifies the motion control of the mechanical arm.
Fig. 2 is a flowchart of a second spatial coordinate acquisition method according to an embodiment of the present application. As shown in fig. 2, the second spatial coordinate obtaining method 200 includes the following steps:
Step 201: a second workpiece image previously acquired by the camera is obtained.
When the camera collects the second workpiece image, the first workpiece is placed at the standard position on the objective table, the mechanical arm moves to a position above the first workpiece, and the camera moving synchronously with the mechanical arm collects an image of the first workpiece to obtain the second workpiece image.
Step 202: at least one second feature point is determined from the second workpiece image.
After the second workpiece image acquired by the camera is obtained, the position of the image of the first workpiece within the second workpiece image needs to be determined. At least one second feature point is determined from the second workpiece image, where a second feature point is a point at which the gray value changes sharply or a point of large curvature on the edge of the image of the first workpiece. The second feature points can indicate the position of the image of the first workpiece in the second workpiece image.
Step 203: a second spatial coordinate of the first workpiece in the spatial coordinate system is determined.
Each second feature point indicates a position of the image of the first workpiece at the standard position in the second workpiece image, and the calibration information indicates a mapping relationship between a pixel coordinate system and a spatial coordinate system of the image captured by the camera, so that a second spatial coordinate indicating the first workpiece at the standard position in the spatial coordinate system of the robot arm can be determined based on each second feature point and the calibration information.
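Using the illustrative helpers sketched above, the second spatial coordinate is obtained along exactly the same path, only once and with the image of the first workpiece at the standard position:

```python
def second_spatial_coordinate(second_workpiece_image, calibration_affine):
    """Standard-position pose (x, y, theta_deg), computed once before production grabs.

    Relies on the illustrative extract_feature_points / first_spatial_coordinate
    helpers sketched earlier; the names are placeholders, not the patent's API.
    """
    second_feature_points = extract_feature_points(second_workpiece_image)
    return first_spatial_coordinate(second_feature_points, calibration_affine)
```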
In the embodiment of the application, the first workpiece is placed at the standard position on the object stage in advance, a second workpiece image is acquired by the camera, one or more second feature points are determined from the second workpiece image, and the second spatial coordinate is determined according to the second feature points and the calibration information. The determined second spatial coordinate can accurately indicate the position of the first workpiece in the spatial coordinate system of the mechanical arm when the first workpiece is located at the standard position, which in turn ensures the accuracy of the determined control information, so that the mechanical arm can accurately grab the first workpiece according to the received control information.
In one possible implementation, the first feature point is a point on the contour line of the first workpiece in the first workpiece image, and the second feature point is a point on the contour line of the first workpiece in the second workpiece image.
In general, there is a large color difference between the workpiece and the stage, so the contour line of the workpiece can be accurately identified in the workpiece image acquired by the camera, and points on the contour line can then be determined as feature points. In this case, all candidate first feature points lie on the contour of the first workpiece in the first workpiece image, and all candidate second feature points lie on the contour of the first workpiece in the second workpiece image.
In the embodiment of the application, the first feature points are located on the contour line of the first workpiece in the first workpiece image and the second feature points are located on the contour line of the first workpiece in the second workpiece image, so that the first workpiece can be accurately positioned according to the first feature points and the accuracy of the determined second spatial coordinate can be ensured according to the second feature points, allowing the mechanical arm to grab the workpiece smoothly.
In one possible implementation, the spatial coordinate system of the mechanical arm may be a three-dimensional Cartesian coordinate system in which the plane of the X-axis and the Y-axis is parallel to the stage, and the first spatial coordinate and the second spatial coordinate each include an abscissa along the X-axis direction of the three-dimensional Cartesian coordinate system, an ordinate along the Y-axis direction, and a rotation angle about the Z-axis. On this basis, Fig. 3 provides an offset determination method; as shown in Fig. 3, the offset determination method 300 includes the following steps:
Step 301: the X-axis displacement offset of the position of the first workpiece in the spatial coordinate system relative to the standard position is determined.
Since the second spatial coordinate indicates the position of the standard position in the spatial coordinate system, a difference between an abscissa included in the first spatial coordinate and an abscissa included in the second spatial coordinate may be calculated, and the difference may be determined as an X-axis displacement offset of the position of the first workpiece with respect to the standard position.
Step 302: the Y-axis displacement offset of the position of the first workpiece in the spatial coordinate system relative to the standard position is determined.
Since the second spatial coordinate indicates the standard position in the spatial coordinate system, the difference between the ordinate included in the first spatial coordinate and the ordinate included in the second spatial coordinate may be calculated and determined as the Y-axis displacement offset of the position of the first workpiece relative to the standard position.
Step 303: the Z-axis rotation offset of the position of the first workpiece in the spatial coordinate system relative to the standard position is determined.
Since the second spatial coordinate indicates the standard position in the spatial coordinate system, the difference between the rotation angle included in the first spatial coordinate and the rotation angle included in the second spatial coordinate may be calculated and determined as the Z-axis rotation offset of the position of the first workpiece relative to the standard position. Because the standard position is used as the reference, the rotation angle included in the second spatial coordinate may be equal to zero.
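Taken together, steps 301 to 303 amount to a simple pose difference. A compact sketch, with poses written as (x, y, theta_deg) tuples as in the earlier illustrative helpers, is:

```python
def pose_offset(first_spatial, second_spatial):
    """Offset of the actual workpiece pose relative to the standard pose.

    Both arguments are (x, y, theta_deg) in the arm's spatial coordinate system;
    the returned triple is what is sent to the arm as control information.
    """
    x1, y1, t1 = first_spatial
    x2, y2, t2 = second_spatial                     # standard position; t2 is usually 0
    dx = x1 - x2                                    # step 301: X-axis displacement offset
    dy = y1 - y2                                    # step 302: Y-axis displacement offset
    dtheta = (t1 - t2 + 180.0) % 360.0 - 180.0      # step 303: Z-axis rotation offset, wrapped to [-180, 180)
    return dx, dy, dtheta
```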
In the embodiment of the application, the X-axis displacement offset, the Y-axis displacement offset and the Z-axis rotation offset of the first workpiece relative to the standard position in the spatial coordinate system are calculated to serve as control information of the mechanical arm, and the mechanical arm can translate or rotate relative to the standard position according to the control information, so that the mechanical arm can grab the workpiece at any position on the object stage.
Fig. 4 is a flowchart of a calibration information determining method according to an embodiment of the present application, and as shown in fig. 4, the calibration information determining method 400 includes the following steps:
Step 401: a calibration reference point located on a second workpiece is determined.
The spatial coordinate system of the mechanical arm is a three-dimensional Cartesian coordinate system in which the X-axis, the Y-axis and the Z-axis are perpendicular to one another, and the plane of the X-axis and the Y-axis is parallel to the objective table. When the first workpiece and the second workpiece are respectively placed on the stage, they have the same height in the Z-axis direction. The second workpiece may be the same type of workpiece as the first workpiece or a different type; for example, the second workpiece and the first workpiece are circuit boards of the same type.
When the camera is calibrated through the second workpiece, because the camera acquires a planar image, a different distance between the workpiece and the camera would change where the image of the workpiece appears in the acquired image and thus affect the calibration result; therefore, the second workpiece is required to have the same height as the first workpiece in the Z-axis direction.
In order to calibrate the camera by means of an image of the second workpiece, a calibration reference point is determined on the upper surface of the second workpiece, the determined calibration reference point having a corresponding image in the image captured by the camera.
Step 402: the mechanical arm is controlled to perform N translations along the X-axis and/or Y-axis direction in the spatial coordinate system.
In the camera calibration process, in order to determine a conversion relationship between a pixel coordinate system of an image acquired by the camera and a spatial coordinate system of the mechanical arm, the mechanical arm is controlled to drive the camera fixed on the mechanical arm to perform N times of translation in the spatial coordinate system along the X-axis direction and/or the Y-axis direction according to a preset X-axis translation amount and a preset Y-axis translation amount, so that the camera can acquire images of a second workpiece placed on the object stage at different positions. In order to solve the transformation relation between the pixel coordinate system and the space coordinate system, at least three sets of pixel coordinates and space coordinates are needed, so that the translation times N of the mechanical arm in the space coordinate system are positive integers which are greater than or equal to 3. The camera acquires an image of the second workpiece after each translation of the mechanical arm as a translation calibration image.
When the robot arm translates along the X-axis and/or Y-axis direction in the spatial coordinate system, the position of the robot arm in the Z-axis direction remains unchanged, and the distance between the camera and the stage in the Z-axis direction remains unchanged.
Step 403: the translation calibration image acquired by the camera after each translation of the mechanical arm is obtained.
The camera acquires at least one translation calibration image after each translation of the mechanical arm, so at least N translation calibration images can be obtained.
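A sketch of this data-collection loop is shown below. The `robot` and `camera` objects and their `translate` / `capture` methods are hypothetical placeholders for whatever controller and camera interface the cell actually uses; the rotation calibration images of step 404 would be gathered in the same way with a rotate command.

```python
def collect_translation_calibration(robot, camera, deltas_xy, locate_reference_point):
    """Record the calibration reference point's pixel position for each arm pose.

    `deltas_xy` is the preset sequence of (dx, dy) translation amounts (N >= 3) and
    `locate_reference_point(image)` returns the pixel (u, v) of the calibration
    reference point in a translation calibration image. Returns a list of
    ((robot_x, robot_y), (u, v)) samples, starting with the pose before any move.
    """
    samples = []
    image = camera.capture()                       # image before the first translation
    samples.append(((0.0, 0.0), locate_reference_point(image)))
    x, y = 0.0, 0.0
    for dx, dy in deltas_xy:
        robot.translate(dx, dy)                    # Z is kept fixed during calibration
        x, y = x + dx, y + dy
        image = camera.capture()                   # translation calibration image
        samples.append(((x, y), locate_reference_point(image)))
    return samples
```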
Step 404: the mechanical arm is controlled to rotate about the Z-axis M times in the spatial coordinate system.
In order to improve the calibration accuracy of the camera, the camera is not limited to moving along the X-axis and the Y-axis; the mechanical arm can also be controlled to drive the camera to rotate about the Z-axis M times in the spatial coordinate system according to a preset Z-axis rotation amount. In order to determine the conversion relationship between the pixel coordinates and the spatial coordinates when the mechanical arm rotates, at least three groups of pixel coordinates and spatial coordinates of the calibration reference point after rotation are needed, and therefore M is a positive integer greater than or equal to 3.
Step 405: the rotation calibration image acquired by the camera after each rotation of the mechanical arm is obtained.
The camera acquires at least one rotation calibration image after each rotation of the mechanical arm, so at least M rotation calibration images can be obtained.
Step 406: the calibration information is determined.
After the translation calibration images and the rotation calibration images are obtained, the calibration information is determined according to the position offsets of the image of the calibration reference point in different translation calibration images, the position offsets of the image of the calibration reference point in different rotation calibration images, the X-axis translation amount and the Y-axis translation amount of the mechanical arm when each translation calibration image is collected, and the Z-axis rotation amount of the mechanical arm when each rotation calibration image is collected.
Because the second workpiece remains still on the object stage, the position of the image of the calibration reference point on the second workpiece in the image acquired by the camera shifts after the mechanical arm drives the camera to translate or rotate, and this shift is positively correlated with the translation amount of the mechanical arm along the X-axis and/or the Y-axis and with its rotation amount about the Z-axis. Therefore, the calibration information used to indicate the mapping relationship between the pixel coordinate system of the image acquired by the camera and the spatial coordinate system can be determined from the translation calibration images and the rotation calibration images.
In the embodiment of the application, the mechanical arm drives the camera to translate and rotate, and the camera acquires an image of the second workpiece after each translation and/or rotation. The mapping relation between the pixel coordinate system of the image acquired by the camera and the spatial coordinate system of the mechanical arm is determined according to the translation amount and/or rotation amount of the camera and the position offset of the image of the calibration reference point in the images acquired by the camera, so that calibration information reflecting the mapping relation is obtained. After the camera acquires an image of a workpiece placed on the object stage, the pixel coordinates of the workpiece in that image can be converted into spatial coordinates in the spatial coordinate system through the calibration information, and the mechanical arm can grab the workpiece according to its spatial coordinates, so that the mechanical arm can grab a workpiece placed at an arbitrary position on the object stage.
In addition, the camera is calibrated through a plurality of translation calibration images and a plurality of rotation calibration images, the generated calibration information can accurately reflect the mapping relation between the pixel coordinate system of the image acquired by the camera and the space coordinate system of the mechanical arm, the space coordinate of the workpiece in the space coordinate system can be accurately determined through the calibration information, and the success rate of the mechanical arm for grabbing the workpiece is ensured.
In a possible implementation manner, when the calibration information is determined, for the i-th of the N translations performed by the mechanical arm, the i-th pixel translation offset is calculated according to the position of the image of the calibration reference point in the (i-1)-th translation calibration image, acquired by the camera before the mechanical arm performs the i-th translation, and its position in the i-th translation calibration image, acquired by the camera after the mechanical arm performs the i-th translation. Similarly, for the j-th of the M rotations performed by the mechanical arm, the j-th pixel rotation offset is calculated according to the position of the image of the calibration reference point in the (j-1)-th rotation calibration image, acquired before the j-th rotation, and its position in the j-th rotation calibration image, acquired after the j-th rotation. The calibration information is then determined according to the i-th pixel translation offset, the X-axis translation amount and the Y-axis translation amount corresponding to the i-th translation of the mechanical arm, and the j-th pixel rotation offset and the Z-axis rotation amount corresponding to the j-th rotation of the mechanical arm.
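One common way to turn these correspondences into calibration information is a least-squares affine fit between the reference point's pixel positions and the arm's planar positions, for instance over the samples gathered by the collection sketch above. The function below is only a sketch under that assumption; the sign convention for a camera that moves with the arm, and how the rotation samples are folded in, depend on the concrete setup.

```python
import numpy as np

def estimate_calibration_affine(samples):
    """Fit a 2x3 affine matrix relating pixel (u, v) to arm (x, y) positions.

    `samples` are ((robot_x, robot_y), (u, v)) pairs; at least three non-collinear
    samples are required, matching the N >= 3 (and M >= 3) requirement above.
    """
    robot_xy = np.array([s[0] for s in samples], dtype=np.float64)
    pixel_uv = np.array([s[1] for s in samples], dtype=np.float64)
    design = np.hstack([pixel_uv, np.ones((len(samples), 1))])   # rows of [u, v, 1]
    # Least-squares solve of design @ affine.T ~= robot_xy.
    affine_t, _, _, _ = np.linalg.lstsq(design, robot_xy, rcond=None)
    return affine_t.T                                            # 2x3 calibration matrix
```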
For example, the number of translations N is 8 and the number of rotations M is 4.
During the 8 translations of the mechanical arm, the pixel translation offset of the image of the calibration reference point between the translation calibration images before and after each translation is calculated. For example, after the 1st translation, the 1st pixel translation offset of the image of the calibration reference point between the translation calibration images before and after the 1st translation is calculated; the subsequent i-th pixel translation offsets are calculated in the same way and are not repeated here.
During the 4 rotations of the mechanical arm, the pixel rotation offset of the image of the calibration reference point between the rotation calibration images before and after each rotation is calculated. For example, after the 1st rotation, the 1st pixel rotation offset of the image of the calibration reference point between the rotation calibration images before and after the 1st rotation is calculated; the subsequent j-th pixel rotation offsets are calculated in the same way and are not repeated here.
And determining calibration information according to the 1 st to 8 th pixel translation offset, the X-axis translation amount and the Y-axis translation amount corresponding to each translation, the 1 st to 4 th pixel rotation offset and the Z-axis rotation amount corresponding to each rotation.
In the embodiment of the application, an offset of the calibration reference point is obtained for each movement by comparing the pixel translation offset of the image of the calibration reference point in the translation calibration images before and after each translation, and the pixel rotation offset of the image of the calibration reference point in the rotation calibration images before and after each rotation. Pairing each movement amount in the spatial coordinate system with the corresponding offset of the calibration reference point further improves the precision of the camera calibration.
Fig. 5 is a schematic diagram of a motion sequence of the mechanical arm according to an embodiment of the present application. As shown in Fig. 5, the number N of translations of the mechanical arm is 9, and the translation amounts along the X-axis and Y-axis directions for the 9 translations are, in order, (X, 0), (X, 0), (0, Y), (-X, 0), (-X, 0), (0, Y), (X, 0), (X, 0) and (-X, -Y), where X and Y are not equal to 0.
In the embodiment of the application, planning the translation path of the mechanical arm prevents the mechanical arm from moving beyond the range of the objective table during translation, which would cause the camera to acquire invalid images containing no workpiece, and thus improves the camera calibration efficiency.
In one possible implementation, the preset X-axis translation amount and the Y-axis translation amount are equal, i.e., X is equal to Y.
Fig. 6 is a schematic diagram of the point coordinates of the mechanical arm motion according to an embodiment of the present application; as shown in Fig. 6, the X-axis translation amount and the Y-axis translation amount are both equal to 5. The combinations of translation amounts along the X-axis and Y-axis directions for the 9 translations of the mechanical arm are, in order, (5, 0), (5, 0), (0, 5), (-5, 0), (-5, 0), (0, 5), (5, 0), (5, 0) and (-5, -5).
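A quick check of this sequence (a sketch, with the deltas taken directly from the Fig. 6 example) shows that the nine translations visit all nine points of a 3 x 3 grid spaced 5 units apart and end back at the grid centre, so the camera stays within a small region above the stage:

```python
# Preset translation amounts from the Fig. 6 example (X = Y = 5).
deltas = [(5, 0), (5, 0), (0, 5), (-5, 0), (-5, 0), (0, 5), (5, 0), (5, 0), (-5, -5)]

x, y = 0, 0
points = [(x, y)]
for dx, dy in deltas:
    x, y = x + dx, y + dy
    points.append((x, y))

print(points)
# [(0, 0), (5, 0), (10, 0), (10, 5), (5, 5), (0, 5), (0, 10), (5, 10), (10, 10), (5, 5)]
```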
In the embodiment of the application, the X-axis translation amount and the Y-axis translation amount are set to be equal offset, so that the calculation complexity is reduced, and the camera calibration efficiency is improved.
In one possible implementation, the robotic arm is configured to perform both translational and rotational operations during at least one of the movements.
In the embodiment of the application, completing translation and rotation synchronously reduces the number of images the camera needs to acquire, which lowers the amount of calculation while maintaining calculation accuracy and improves the camera calibration efficiency.
Object gripping device
Fig. 7 is a schematic view of an object grasping apparatus according to an embodiment of the present application. As shown in fig. 7, the object grasping apparatus 700 includes an acquisition module 701, an extraction module 702, a determination module 703, and a grasping module 704.
After the acquisition module 701 acquires the first workpiece image acquired by the camera, the extraction module 702 determines at least one first feature point from the first workpiece image acquired by the camera.
The determining module 703 determines a first spatial coordinate of the first workpiece in the spatial coordinate system of the robot arm according to the first feature point determined by the extracting module 702.
The grasping module 704 generates control information according to the first spatial coordinates determined by the determining module 703.
In the embodiment of the present application, the at least one first feature point determined by the extraction module 702 can indicate the position of the first workpiece in the first workpiece image. The determining module 703 can determine the position of the first workpiece in the spatial coordinate system of the mechanical arm by using the calibration information and the pixel coordinates of the first feature point. The grabbing module 704 controls the mechanical arm to grab the workpiece according to this position information. The workpiece does not need to be placed at a specific angle and position; the mechanical arm can automatically grab a workpiece at any position on the objective table, which saves labor cost.
Electronic device
Fig. 8 is a schematic diagram of an electronic device according to an embodiment of the present application, and the specific embodiment of the present application does not limit a specific implementation of the electronic device. As shown in fig. 8, the electronic device 800 may include: a processor (processor) 801, a communication Interface 802, a memory 803, and a communication bus 804. Wherein:
the processor 801, the communication interface 802, and the memory 803 communicate with each other via a communication bus 804.
A communication interface 802 for communicating with other electronic devices or servers.
The processor 801 is configured to execute the program 805, and may specifically execute relevant steps in any one of the foregoing method embodiments.
In particular, program 805 may include program code that includes computer operating instructions.
The processor 801 may be a CPU, an application-specific integrated circuit (ASIC), or one or more integrated circuits configured to implement embodiments of the present application. The electronic device may comprise one or more processors, which may be of the same type, such as one or more CPUs, or of different types, such as one or more CPUs and one or more ASICs.
A memory 803 for storing a program 805. The memory 803 may include high-speed RAM memory, and may also include non-volatile memory (non-volatile memory), such as at least one disk memory.
The program 805 may specifically be adapted to cause the processor 801 to perform the method in any of the method embodiments described above.
For specific implementation of each step in the program 805, reference may be made to corresponding steps and corresponding descriptions in units in the foregoing embodiment of the workpiece capture method, which are not described herein again. It can be clearly understood by those skilled in the art that, for convenience and brevity of description, the specific working processes of the above-described devices and modules may refer to the corresponding process descriptions in the foregoing method embodiments, and are not described herein again.
Through the electronic equipment provided by the embodiment of the application, at least one first characteristic point determined from the first workpiece image can indicate the position of the image of the first workpiece in the first workpiece image, the calibration information can indicate the mapping relation between the pixel coordinate system of the image collected by the camera and the space coordinate system of the mechanical arm, the position of the first workpiece in the space coordinate system of the mechanical arm can be determined through the calibration information and the pixel coordinate of the first characteristic point, further, control information can be generated according to the position of the first workpiece in the space coordinate system, and after the control information is sent to the mechanical arm, the mechanical arm captures the first workpiece according to the control information. Therefore, the position of the workpiece in the space coordinate system of the mechanical arm can be determined, so that the mechanical arm can automatically grab the workpiece randomly placed on the objective table, the workpiece does not need to be placed on the objective table according to a specific angle and a specific position, time consumed for placing the workpiece on the objective table can be saved, and the grabbing efficiency of the workpiece is improved.
Computer storage medium
The present application further provides a computer readable storage medium storing instructions for causing a machine to perform any of the method embodiments as described herein. Specifically, a system or an apparatus equipped with a storage medium on which software program codes that realize the functions of any of the above-described embodiments are stored may be provided, and a computer (or a CPU or MPU) of the system or the apparatus is caused to read out and execute the program codes stored in the storage medium.
In this case, the program code itself read from the storage medium can realize the functions of any of the above-described embodiments, and thus the program code and the storage medium storing the program code constitute a part of the present application.
Examples of the storage medium for supplying the program code include a floppy disk, a hard disk, a magneto-optical disk, an optical disk (e.g., CD-ROM, CD-R, CD-RW, DVD-ROM, DVD-RAM, DVD-RW, DVD + RW), a magnetic tape, a nonvolatile memory card, and a ROM. Alternatively, the program code may be downloaded from a server computer via a communications network.
Computer program product
Embodiments of the present application further provide a computer program product, which includes computer instructions for instructing a computing device to perform operations corresponding to any of the above method embodiments.
It should be noted that, according to the implementation requirement, each component/step described in the embodiment of the present application may be divided into more components/steps, and two or more components/steps or partial operations of the components/steps may also be combined into a new component/step to achieve the purpose of the embodiment of the present application.
The above-described methods according to embodiments of the present application may be implemented in hardware, firmware, or as software or computer code storable in a recording medium such as a CD ROM, a RAM, a floppy disk, a hard disk, or a magneto-optical disk, or as computer code originally stored in a remote recording medium or a non-transitory machine-readable medium downloaded through a network and to be stored in a local recording medium, so that the methods described herein may be stored in such software processes on a recording medium using a general-purpose computer, a dedicated processor, or programmable or dedicated hardware such as an ASIC or FPGA. It will be appreciated that a computer, processor, microprocessor controller, or programmable hardware includes memory components (e.g., RAM, ROM, flash memory, etc.) that can store or receive software or computer code that, when accessed and executed by a computer, processor, or hardware, implements the methods described herein. Furthermore, when a general-purpose computer accesses code for implementing the methods illustrated herein, execution of the code transforms the general-purpose computer into a special-purpose computer for performing the methods illustrated herein.
It should be noted that not all steps and modules in the above flows and system structure diagrams are necessary, and some steps or modules may be omitted according to actual needs. The execution sequence of the steps is not fixed and can be adjusted according to the needs. The system structure described in the above embodiments may be a physical structure or a logical structure, that is, some modules may be implemented by the same physical entity, or some modules may be implemented by a plurality of physical entities, or some components in a plurality of independent devices may be implemented together.
In the above embodiments, the hardware module may be implemented mechanically or electrically. For example, a hardware module may comprise permanently dedicated circuitry or logic (such as a dedicated processor, FPGA or ASIC) to perform the corresponding operations. A hardware module may also include programmable logic or circuitry (e.g., a general-purpose processor or other programmable processor) that may be temporarily configured by software to perform the corresponding operations. The specific implementation (mechanical, or dedicated permanent, or temporarily set) may be determined based on cost and time considerations.
While the invention has been particularly shown and described with reference to the preferred embodiments and drawings, it is not intended to be limited to the specific embodiments disclosed. It will be understood by those skilled in the art that various other combinations and modifications of the means and embodiments described above may be made, and such other embodiments are within the scope of the present invention.
Claims (14)
1. A workpiece grasping method (100), comprising:
acquiring a first workpiece image acquired by a camera, wherein the first workpiece image comprises an image of a first workpiece on a stage;
determining at least one first feature point from the first workpiece image, wherein the first feature point is used for indicating the position of the image of the first workpiece in the first workpiece image;
determining a first spatial coordinate of the first workpiece in a spatial coordinate system of the mechanical arm according to the pixel coordinate of the at least one first feature point and calibration information, wherein the calibration information is used for indicating a mapping relation between the pixel coordinate system of the image acquired by the camera and the spatial coordinate system;
and generating control information according to the first space coordinate, and sending the control information to the mechanical arm so that the mechanical arm can grab the first workpiece according to the control information.
2. The method of claim 1, wherein the generating control information from the first spatial coordinate comprises:
acquiring a second space coordinate, wherein the second space coordinate is used for indicating the coordinate of the first workpiece in the space coordinate system when the first workpiece is positioned at a standard position on the objective table;
determining the offset of the position of the first workpiece in the space coordinate system relative to the standard position according to the first space coordinate and the second space coordinate;
determining the offset as control information.
3. The method of claim 2, further comprising:
acquiring a second workpiece image previously acquired by the camera, wherein the second workpiece image comprises an image of the first workpiece at a standard position on an object stage;
determining at least one second feature point from the second workpiece image, wherein the second feature point is used for indicating the position of the image of the first workpiece in the second workpiece image;
and determining a second space coordinate of the first workpiece in the space coordinate system according to the pixel coordinate of the at least one second characteristic point and the calibration information.
4. The method of claim 3, wherein the first feature point is a point on a contour line of the first workpiece in the first workpiece image, and the second feature point is a point on a contour line of the first workpiece in the second workpiece image.
5. The method of claim 2, wherein the spatial coordinate system is a three-dimensional cartesian coordinate system in which a plane of an X-axis and a Y-axis is parallel to the stage, the first spatial coordinate and the second spatial coordinate each include an abscissa along an X-axis direction of the three-dimensional cartesian coordinate system, an ordinate along a Y-axis direction of the three-dimensional cartesian coordinate system, and a rotation angle about a Z-axis direction of the three-dimensional cartesian coordinate system;
the determining an offset of the position of the first workpiece in the spatial coordinate system relative to the standard position according to the first spatial coordinate and the second spatial coordinate comprises:
determining a difference between an abscissa comprised by the first spatial coordinate and an abscissa comprised by the second spatial coordinate as an X-axis displacement offset of the position of the first workpiece in the spatial coordinate system relative to the standard position;
determining a difference value of a vertical coordinate included in the first space coordinate and a vertical coordinate included in the second space coordinate as a Y-axis displacement offset of the position of the first workpiece in the space coordinate system relative to the standard position;
and determining the difference value of the rotation angle included by the first space coordinate and the rotation angle included by the second space coordinate as the Z-axis rotation offset of the position of the first workpiece in the space coordinate system relative to the standard position.
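(Illustrative note: the offset of claim 5 is component-wise subtraction of the two space coordinates. A minimal sketch, assuming each coordinate is an (x_mm, y_mm, theta_deg) tuple:)

```python
def position_offset(first, second):
    """first, second: (x_mm, y_mm, theta_deg) of the workpiece in robot space.

    Returns the displacement and rotation the arm must compensate, i.e. the
    'control information' of claim 2 in its simplest form."""
    dx = first[0] - second[0]       # X-axis displacement offset
    dy = first[1] - second[1]       # Y-axis displacement offset
    dtheta = first[2] - second[2]   # rotation offset about the Z axis
    return dx, dy, dtheta

# Current pose vs. standard pose on the objective table
print(position_offset((141.2, 93.0, 4.5), (140.0, 92.5, 0.0)))  # ~ (1.2, 0.5, 4.5)
```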
6. The method according to any one of claims 1-5, further comprising:
determining a calibration reference point on a second workpiece, wherein the height of the second workpiece in the Z-axis direction of the space coordinate system is the same as that of the first workpiece, and the plane of the X axis and the Y axis in the space coordinate system is parallel to the objective table;
controlling the mechanical arm to perform N translations in the space coordinate system along the X-axis direction and/or the Y-axis direction according to a preset X-axis translation amount and a preset Y-axis translation amount, and acquiring a translation calibration image acquired by the camera after each translation of the mechanical arm, wherein each translation calibration image comprises an image of the second workpiece on the objective table, and N is a positive integer greater than or equal to 3;
controlling the mechanical arm to perform M rotations about the Z axis in the space coordinate system according to a preset Z-axis rotation amount, and acquiring a rotation calibration image acquired by the camera after each rotation of the mechanical arm, wherein each rotation calibration image comprises an image of the second workpiece on the objective table, and M is a positive integer greater than or equal to 3;
and determining the calibration information according to the position offset of the image of the calibration reference point in different translation calibration images, the position offset of the image of the calibration reference point in different rotation calibration images, the X-axis translation amount and the Y-axis translation amount corresponding to each translation calibration image, and the Z-axis rotation amount corresponding to each rotation calibration image.
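(Illustrative note: the calibration data collection of claim 6 can be pictured as a simple move-then-capture loop. The `arm` and `camera` objects below are hypothetical stand-ins for whichever robot and camera SDKs are actually used; nothing in the claims fixes these interfaces.)

```python
def collect_calibration_images(arm, camera, translations, rotations_deg):
    """Drive the arm through N translations and M rotations, capturing one
    calibration image after every move (plus one reference image before).

    `arm.translate`, `arm.rotate_z` and `camera.capture` are hypothetical
    stand-ins for whatever robot/camera SDK is actually used."""
    translation_images = [camera.capture()]       # image before the 1st translation
    for dx, dy in translations:                   # N moves in the X/Y plane
        arm.translate(dx, dy)
        translation_images.append(camera.capture())
    rotation_images = [camera.capture()]          # image before the 1st rotation
    for dtheta in rotations_deg:                  # M rotations about the Z axis
        arm.rotate_z(dtheta)
        rotation_images.append(camera.capture())
    return translation_images, rotation_images
```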
7. The method of claim 6, wherein the determining the calibration information according to the position offset of the image of the calibration reference point in different translation calibration images, the position offset of the image of the calibration reference point in different rotation calibration images, the X-axis translation amount and the Y-axis translation amount corresponding to each translation calibration image, and the Z-axis rotation amount corresponding to each rotation calibration image comprises:
for the ith translation of the N translations performed by the mechanical arm, calculating an ith pixel translation offset of the image of the calibration reference point between an (i-1)th translation calibration image and an ith translation calibration image, wherein the (i-1)th translation calibration image is the translation calibration image acquired by the camera before the mechanical arm performs the ith translation, the ith translation calibration image is the translation calibration image acquired by the camera after the mechanical arm performs the ith translation, and i is a positive integer less than N;
for the jth rotation of the M rotations performed by the mechanical arm, calculating a jth pixel rotation offset of the image of the calibration reference point between a (j-1)th rotation calibration image and a jth rotation calibration image, wherein the (j-1)th rotation calibration image is the rotation calibration image acquired by the camera before the mechanical arm performs the jth rotation, the jth rotation calibration image is the rotation calibration image acquired by the camera after the mechanical arm performs the jth rotation, and j is a positive integer less than M;
and determining the calibration information according to the ith pixel translation offset, the X-axis translation amount and the Y-axis translation amount corresponding to the ith translation of the mechanical arm, and the jth pixel rotation offset and the Z-axis rotation amount corresponding to the jth rotation of the mechanical arm.
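(Illustrative note: for the translation part of claim 7, one way to turn the per-move pixel offsets and the known physical translation amounts into calibration information is an ordinary least-squares fit of a linear pixel-to-space map. This is a sketch under that linearity assumption, not necessarily the solver contemplated by the patent.)

```python
import numpy as np

def fit_translation_calibration(pixel_offsets, space_offsets):
    """pixel_offsets: (N, 2) per-move offsets (du, dv) of the calibration
    reference point in the image; space_offsets: (N, 2) commanded arm
    translations (dX, dY) in mm.

    Solves space_offset ~= A @ pixel_offset for the 2x2 matrix A by least
    squares; A then captures the scale and rotation between the pixel and
    space coordinate systems."""
    P = np.asarray(pixel_offsets, dtype=float)
    S = np.asarray(space_offsets, dtype=float)
    A_transposed, *_ = np.linalg.lstsq(P, S, rcond=None)  # P @ A.T ~= S
    return A_transposed.T
```

With at least three moves that span both axes, as claim 6 requires, the system is overdetermined and the fit remains stable against pixel noise.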
8. The method of claim 7, wherein N equals 9 and M equals 3;
when the mechanical arm performs the N translations, the combinations of translation amounts along the X-axis direction and the Y-axis direction are, in sequence, (x, 0), (0, y), (-x, 0), (0, y), (x, 0) and (-x, -y), wherein x represents the translation amount of the mechanical arm along the X-axis direction in the space coordinate system, y represents the translation amount of the mechanical arm along the Y-axis direction in the space coordinate system, and neither x nor y is equal to 0.
9. The method of claim 8, wherein x is equal to y.
10. The method according to any one of claims 7-9, wherein the mechanical arm is rotated about the Z axis while performing at least one of the translations along the X-axis direction and/or the Y-axis direction.
11. A workpiece grabbing apparatus (700), comprising:
an acquisition module (701) for acquiring a first workpiece image acquired by a camera, wherein the first workpiece image comprises an image of a first workpiece on an objective table;
an extraction module (702) for determining at least one first feature point from the first workpiece image, wherein the first feature point is indicative of a position of an image of the first workpiece in the first workpiece image;
a determining module (703) for determining a first space coordinate of the first workpiece in a space coordinate system of the mechanical arm according to the pixel coordinate of the at least one first feature point and calibration information, wherein the calibration information is used to indicate a mapping relationship between the pixel coordinate system of the image acquired by the camera and the space coordinate system;
and a grabbing module (704) for generating control information according to the first space coordinate and sending the control information to the mechanical arm, so that the mechanical arm grabs the first workpiece according to the control information.
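(Illustrative note: the apparatus of claim 11 decomposes into four cooperating modules. A minimal structural sketch in Python, with all wiring and helper names assumed rather than taken from the claims:)

```python
class WorkpieceGrabbingApparatus:
    """Structural mirror of claim 11: acquisition (701), extraction (702),
    determination (703) and grabbing (704) modules, here collapsed into one
    class with injected helpers."""

    def __init__(self, camera, arm, calibration_info,
                 extract_feature_points, pixel_to_space, compute_control):
        self.camera = camera
        self.arm = arm
        self.calibration_info = calibration_info
        self.extract_feature_points = extract_feature_points
        self.pixel_to_space = pixel_to_space
        self.compute_control = compute_control

    def grab_once(self):
        image = self.camera.capture()                                      # 701: acquisition
        points = self.extract_feature_points(image)                        # 702: extraction
        space_coord = self.pixel_to_space(points, self.calibration_info)   # 703: determination
        control = self.compute_control(space_coord)                        # 704: build control info
        self.arm.execute(control)                                          # ...and send it to the arm
        return control
```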
12. An electronic device (800), comprising: a processor (801), a communication interface (802), a memory (803) and a communication bus (804), wherein the processor (801), the memory (803) and the communication interface (802) communicate with each other via the communication bus (804);
the memory (803) is configured to store at least one executable instruction that causes the processor (801) to perform operations corresponding to the workpiece grabbing method according to any one of claims 1-10.
13. A computer storage medium having stored thereon a computer program which, when executed by a processor, implements the workpiece grabbing method according to any one of claims 1-10.
14. A computer program product comprising computer instructions that instruct a computing device to perform operations corresponding to the workpiece grabbing method according to any one of claims 1-10.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202211052358.XA CN115366105A (en) | 2022-08-31 | 2022-08-31 | Workpiece grabbing method and device, electronic equipment and storage medium |
Publications (1)
Publication Number | Publication Date |
---|---|
CN115366105A true CN115366105A (en) | 2022-11-22 |
Family
ID=84069577
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202211052358.XA (CN115366105A, Pending) | Workpiece grabbing method and device, electronic equipment and storage medium | 2022-08-31 | 2022-08-31
Country Status (1)
Country | Link |
---|---
CN (1) | CN115366105A (en) |
Similar Documents
Publication | Title
---|---
CN108827154B (en) | Robot non-teaching grabbing method and device and computer readable storage medium
JP6429473B2 (en) | Robot system, robot system calibration method, program, and computer-readable recording medium
CN111801198B (en) | Hand-eye calibration method, system and computer storage medium
JP4021413B2 (en) | Measuring device
CN109159114A (en) | The accuracy method of SCARA manipulator fixed camera vision system hand and eye calibrating
WO2018043525A1 (en) | Robot system, robot system control device, and robot system control method
EP4013578A1 (en) | Robot-mounted moving device, system, and machine tool
US7957834B2 (en) | Method for calculating rotation center point and axis of rotation, method for generating program, method for moving manipulator and positioning device, and robotic system
US11577400B2 (en) | Method and apparatus for managing robot system
CN115383256B (en) | Automatic welding method, device and system
CN113319859A (en) | Robot teaching method, system and device and electronic equipment
JP6912529B2 (en) | How to correct the visual guidance robot arm
CN112529856A (en) | Method for determining the position of an operating object, robot and automation system
JP2024096756A (en) | Robot mounting mobile device and control method therefor
CN115366105A (en) | Workpiece grabbing method and device, electronic equipment and storage medium
CN215701709U (en) | Configurable hand-eye calibration device
US20220134577A1 (en) | Image processing method, image processing apparatus, robot-mounted transfer device, and system
CN110977950B (en) | Robot grabbing and positioning method
CN113858214A (en) | Positioning method and control system for robot operation
JP2022055779A (en) | Method of setting threshold value used for quality determination of object recognition result, and object recognition apparatus
CN114571199A (en) | Screw locking machine and screw positioning method
CN112184819A (en) | Robot guiding method and device, computer equipment and storage medium
US20230123629A1 (en) | 3D computer-vision system with variable spatial resolution
CN114619233B (en) | Lock positioning method, screw locking method, lock positioning device and screw machine
WO2024135220A1 (en) | Robot control system
Legal Events
Code | Title
---|---
PB01 | Publication
SE01 | Entry into force of request for substantive examination