CN113510697B - Manipulator positioning method, device, system, electronic device and storage medium - Google Patents

Manipulator positioning method, device, system, electronic device and storage medium Download PDF

Info

Publication number
CN113510697B
CN113510697B (application CN202110441828.0A)
Authority
CN
China
Prior art keywords
manipulator
information
pose
relation
conversion relation
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202110441828.0A
Other languages
Chinese (zh)
Other versions
CN113510697A (en)
Inventor
汝杰
邱进忠
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Zhishou Technology Hangzhou Co ltd
Original Assignee
Zhishou Technology Hangzhou Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Zhishou Technology Hangzhou Co ltd filed Critical Zhishou Technology Hangzhou Co ltd
Priority to CN202110441828.0A priority Critical patent/CN113510697B/en
Publication of CN113510697A publication Critical patent/CN113510697A/en
Application granted granted Critical
Publication of CN113510697B publication Critical patent/CN113510697B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Images

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls
    • B25J9/1656Programme controls characterised by programming, planning systems for manipulators
    • B25J9/1664Programme controls characterised by programming, planning systems for manipulators characterised by motion, path, trajectory planning
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls
    • B25J9/1679Programme controls characterised by the tasks executed
    • B25J9/1687Assembly, peg and hole, palletising, straight line, weaving pattern movement

Abstract

The application relates to a manipulator positioning method, device, system, electronic device and storage medium, wherein the method comprises the following steps: acquiring pose information of preset mark points of a calibration plate and end position information of the current manipulator; performing conversion relation processing on the pose information and the end position information to obtain a first conversion relation; when grabbing an article, establishing a first grabbing pose of the manipulator and corresponding rotation center information, and processing the first grabbing pose, the rotation center information and the first conversion relation to obtain a relative conversion relation; and performing grabbing and positioning with the current manipulator according to the relative conversion relation. The application solves the problem that low manipulator positioning accuracy leads to low grabbing accuracy and restricts manipulators to operations such as carrying and stacking of regular articles; it realizes high-accuracy positioning of the manipulator, thereby improving grabbing accuracy and meeting the accuracy requirements of grabbing irregular articles.

Description

Manipulator positioning method, device, system, electronic device and storage medium
Technical Field
The present disclosure relates to the field of manipulator technologies, and in particular, to a manipulator positioning method, device, system, electronic device, and storage medium.
Background
Industrial robots are widely applied in the industrial field; in machining they replace manual labor in many heavy and repetitive tasks. With continuous development, industrial robots have advanced greatly in operability and intelligence, and with future population aging and industrial upgrading they are expected to replace manual labor in still more fields. Generally, an industrial robot is a multi-joint, multi-degree-of-freedom mechanical arm driven by several rotating motors, which achieve controllable positioning of the robot end. Because the robot itself carries no vision sensor, a camera is mounted on or beside the robot and used to obtain target coordinates, so that the robot can operate on a target according to the images acquired by the camera; this is robot vision. It requires calibrating the relation between the camera coordinate system and the robot coordinate system, a process also called hand-eye calibration.
Traditional hand-eye calibration usually adopts point-to-point operation, with the robot end and the actual grabbing position in a coaxial relation; for non-coaxial cases, the studied objects are mostly regular articles, applied mainly in the traditional logistics industry. The positioning and grabbing accuracy of this approach is low, so it can only perform operations such as carrying and stacking. When the robot end is not coaxial with the actual grabbing position, the offset is usually fixed in the mechanical design; but because of workpiece errors and assembly errors, the actual position deviates considerably from the designed one, and the error is corrected by compensation, so positioning accuracy is low and grabbing accuracy is difficult to guarantee.
At present, no effective solution has been proposed for the problem in the related art that low manipulator positioning accuracy leads to low grabbing accuracy and restricts manipulators to operations such as carrying and stacking of regular articles.
Disclosure of Invention
The embodiments of the application provide a manipulator positioning method, device, system, electronic device and storage medium, to at least solve the problem in the related art that low manipulator positioning accuracy leads to low grabbing accuracy and restricts manipulators to operations such as carrying and stacking of regular articles.
In a first aspect, an embodiment of the present application provides a manipulator positioning method, including:
acquiring pose information of preset mark points of a calibration plate and end position information of the current manipulator;
performing conversion relation processing on the pose information and the end position information to obtain a first conversion relation;
when grabbing an article, establishing a first grabbing pose of the manipulator and corresponding rotation center information, and processing the first grabbing pose, the rotation center information and the first conversion relation to obtain a relative conversion relation; and performing grabbing and positioning with the current manipulator according to the relative conversion relation.
In some of these embodiments, further comprising:
and after the first conversion relation is obtained, performing grabbing and positioning with the current manipulator according to the first conversion relation.
In some embodiments, performing conversion relation processing on the pose information and the end position information to obtain the first conversion relation includes:
performing conversion relation processing on the pose information and the end position information according to a first conversion formula to obtain the first conversion relation.
In some of these embodiments, the first conversion formula is:
[x; y] = R [x'; y'] + M
where x is the abscissa and y is the ordinate of the point in world coordinates; x' is the abscissa and y' is the ordinate of the pixel point in image coordinates; R is the rotation matrix; M is the translation matrix.
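As a hedged illustration (a sketch, not the patent's implementation), the first conversion formula can be applied as below; the numeric values of R and M are assumed purely to show the direction of the mapping, from an image pixel (x', y') to a world point (x, y):

```python
import numpy as np

# Hypothetical calibration results: a 2-D rotation matrix R and a translation
# matrix M, as named in the first conversion formula. The angle and offsets
# are illustrative values, not taken from the patent.
theta = np.deg2rad(30.0)
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
M = np.array([100.0, 50.0])

def pixel_to_world(p_img):
    """Map an image pixel (x', y') to world coordinates: (x, y) = R (x', y') + M."""
    return R @ np.asarray(p_img, dtype=float) + M

world = pixel_to_world((10.0, 20.0))
```

In practice R and M would come from the calibration procedure described in the detailed description; here they are fixed arbitrarily so the formula itself can be exercised.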
In some embodiments, processing according to the first grabbing pose, the rotation center information and the first conversion relation to obtain the relative conversion relation includes:
processing the first grabbing pose, the rotation center information and the first conversion relation by using a second conversion formula to obtain a relative position relation;
processing the first grabbing pose and the rotation center information by using an angle offset formula to obtain a relative angle relation;
and generating a relative conversion relation according to the relative position relation and the relative angle relation.
In some of these embodiments, the second conversion formula is:
[x_rel; y_rel] = (R [x''0; y''0] + M) - (R [x'0; y'0] + M)
where (x_rel, y_rel) is the relative position relation; x'0 is the abscissa and y'0 is the ordinate of the pixel point in the image coordinates recorded at the TCP position point based on the first pose of the manipulator; x''0 is the abscissa and y''0 is the ordinate of the pixel point in the image coordinates of the rotation center of the target object when the object to be grabbed is placed at random in the camera field of view; R is the rotation matrix; M is the translation matrix.
In a second aspect, an embodiment of the present application provides a manipulator positioning device, which includes an obtaining module, a first processing module, and a second processing module;
the acquisition module is used for acquiring pose information of preset mark points of the calibration plate and end position information of the current manipulator;
the first processing module is used for performing conversion relation processing on the pose information and the end position information to obtain a first conversion relation;
the second processing module is used for establishing a first grabbing pose of the manipulator and corresponding rotation center information when grabbing an article, and for processing the first grabbing pose, the rotation center information and the first conversion relation to obtain a relative conversion relation; grabbing and positioning are performed with the current manipulator according to the relative conversion relation.
In a third aspect, an embodiment of the present application provides a manipulator positioning system, including: a terminal device, a transmission device and a server device; the terminal equipment is connected with the server equipment through the transmission equipment;
the terminal equipment is used for acquiring the pose information and the end position information;
the transmission equipment is used for transmitting the pose information and the end position information;
the server equipment is used for performing the manipulator positioning method of the first aspect.
In a fourth aspect, an embodiment of the present application provides an electronic device, which includes a memory, a processor, and a computer program stored in the memory and executable on the processor, wherein the processor, when executing the computer program, implements the manipulator positioning method of the first aspect.
In a fifth aspect, an embodiment of the present application provides a storage medium having a computer program stored thereon which, when executed by a processor, implements the manipulator positioning method of the first aspect.
Compared with the related art, the manipulator positioning method, device, system, electronic device and storage medium provided by the embodiments of the application acquire pose information of preset mark points of a calibration plate and end position information of the current manipulator; perform conversion relation processing on the pose information and the end position information to obtain a first conversion relation; when grabbing an article, establish a first grabbing pose of the manipulator and corresponding rotation center information, and process the first grabbing pose, the rotation center information and the first conversion relation to obtain a relative conversion relation; and perform grabbing and positioning with the current manipulator according to the relative conversion relation. The application solves the problem that low manipulator positioning accuracy leads to low grabbing accuracy and restricts manipulators to operations such as carrying and stacking of regular articles; it realizes high-accuracy positioning of the manipulator, improving grabbing accuracy and meeting the accuracy requirements of grabbing irregular articles.
The details of one or more embodiments of the application are set forth in the accompanying drawings and the description below to provide a more thorough understanding of the application.
Drawings
The accompanying drawings, which are included to provide a further understanding of the application and are incorporated in and constitute a part of this application, illustrate embodiment(s) of the application and together with the description serve to explain the application and not to limit the application. In the drawings:
fig. 1 is a block diagram of a hardware structure of a terminal device of a robot positioning method according to an embodiment of the present application;
fig. 2 is a flowchart of a robot positioning method according to an embodiment of the present disclosure;
fig. 3 is a block diagram of a robot positioning device according to an embodiment of the present disclosure.
In the figures: 100, acquisition module; 200, first processing module; 300, second processing module.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, the present application is described and illustrated below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the application and are not intended to limit it. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments provided in this application without inventive effort fall within the scope of protection of this application. Moreover, it should be appreciated that although such a development effort might be complex and tedious, it would nevertheless be a routine undertaking for those of ordinary skill in the art having the benefit of this disclosure.
Reference in the specification to "an embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment can be included in at least one embodiment of the specification. The appearances of the phrase in various places in the specification are not necessarily all referring to the same embodiment, nor are separate or alternative embodiments mutually exclusive of other embodiments. Those of ordinary skill in the art will explicitly and implicitly appreciate that the embodiments described herein may be combined with other embodiments without conflict.
Unless defined otherwise, technical or scientific terms used herein have the ordinary meaning understood by those of ordinary skill in the art to which this application belongs. References to "a", "an", "the" and similar words in this application do not denote a limitation of quantity and may refer to the singular or the plural. The terms "including", "comprising", "having" and any variations thereof in this application are intended to cover non-exclusive inclusion; for example, a process, method, system, product or device that comprises a list of steps or modules (units) is not limited to the listed steps or units, but may include other steps or units not expressly listed or inherent to such process, method, product or device. Words such as "connected" and "coupled" in this application are not limited to physical or mechanical connections and may include electrical connections, whether direct or indirect. "A plurality" herein means two or more. "And/or" describes an association relation between associated objects and indicates three possible relations; for example, "A and/or B" may indicate: A alone, both A and B, or B alone. Terms such as "first", "second" and "third" herein merely distinguish similar objects and do not denote a particular ordering.
The method embodiments provided herein can be executed on a terminal, a computer or a similar computing device. Taking execution on a terminal as an example, fig. 1 is a hardware structure block diagram of the terminal of the manipulator positioning method according to an embodiment of the present application. As shown in fig. 1, the terminal 10 may include one or more processors 102 (only one is shown in fig. 1; the processor 102 may include, but is not limited to, a processing device such as a microprocessor (MCU) or a programmable logic device (FPGA)) and a memory 104 for storing data, and optionally a transmission device 106 for communication functions and an input/output device 108. It will be understood by those skilled in the art that the structure shown in fig. 1 is only illustrative and does not limit the structure of the terminal. For example, the terminal 10 may include more or fewer components than shown in fig. 1, or have a different configuration.
The memory 104 can be used to store computer programs, for example software programs and modules of application software, such as the computer program corresponding to the manipulator positioning method in the embodiments of the present application. By running the computer programs stored in the memory 104, the processor 102 executes various functional applications and data processing, thereby implementing the above-mentioned method. The memory 104 may include high-speed random access memory and may also include non-volatile memory, such as one or more magnetic storage devices, flash memory or other non-volatile solid-state memory. In some examples, the memory 104 may further include memory located remotely from the processor 102, which may be connected to the terminal 10 via a network. Examples of such networks include, but are not limited to, the internet, intranets, local area networks, mobile communication networks and combinations thereof.
The transmission device 106 is used to receive or transmit data via a network. Specific examples of the network described above may include a wireless network provided by a communication provider of the terminal 10. In one example, the transmission device 106 includes a Network adapter (NIC) that can be connected to other Network devices through a base station to communicate with the internet. In one example, the transmission device 106 may be a Radio Frequency (RF) module, which is used to communicate with the internet in a wireless manner.
The present embodiment provides a manipulator positioning method. Fig. 2 is a flowchart of the manipulator positioning method according to an embodiment of the present application; as shown in fig. 2, the flow includes the following steps:
step S210, acquiring pose information of preset mark points of a calibration plate and end position information of the current manipulator;
step S220, performing conversion relation processing on the pose information and the end position information to obtain a first conversion relation;
step S230, when grabbing an article, establishing a first grabbing pose of the manipulator and corresponding rotation center information, and processing the first grabbing pose, the rotation center information and the first conversion relation to obtain a relative conversion relation; and performing grabbing and positioning with the current manipulator according to the relative conversion relation.
It should be noted that, in the usual case, the grabbing position of the manipulator and the manipulator end are in a coaxial relation. This allows only point-to-point operation: not only is the form single, but the grabbed objects are generally small and light, which limits the applicable scenes. To adapt to more objects, suction devices are added, increasing their number from the original 1 to 2, 3, 4 or more. With more suction devices working cooperatively, objects of large volume and heavy weight can be grabbed; but the increase in suction devices makes the grabbing position and the robot end non-coaxial, so that on the one hand the robot can only adapt to large, regularly shaped objects, and on the other hand the accuracy of the suction position can only be achieved by compensation.
In actual control, after the camera detects the pixel position of the target in the image, the camera pixel coordinates are converted into the spatial coordinate system of the manipulator through the calibrated coordinate conversion formula, and the motion of each motor is then calculated in the manipulator coordinate system, so that the manipulator is controlled to reach the specified position; hand-eye calibration refers to this conversion process. In this application, pose information of preset mark points of a calibration plate and end position information of the current manipulator are obtained first. Specifically, a checkerboard calibration plate is placed at the processing position, the camera collects images of the calibration plate, and the pixel data of the checkerboard corner points are recorded according to a preset rule: the upper-left corner is the coordinate origin (0, 0), and corner point data are recorded row by row, including the pixel row and pixel column of each point. The pose information is thus obtained. The manipulator establishes a coordinate system in the machining area, a probe is installed at the flange position, a TCP (tool center point) is established, and the TCP records the actual coordinates of the checkerboard corner points, including the abscissa x and the ordinate y of each point in world coordinates; this yields the end position information. The order in which the TCP records the actual corner coordinates corresponds one-to-one with the order of the points recorded in the calibration plate image.
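The recording rule described above (pixel corners recorded row by row from the upper-left origin, paired one-to-one with TCP-probed world points) can be sketched as follows; the checkerboard dimensions and the corner spacings in pixels and in millimetres are assumed values for illustration, not taken from the patent:

```python
# Minimal sketch of the correspondence-recording rule: an assumed 3x4 grid of
# inner corners, traversed row by row. The pixel list and the world list are
# built in the same order, so zipping them pairs each image corner with the
# TCP-recorded world coordinate of the same physical corner.
rows, cols = 3, 4
square_px, square_mm = 25.0, 10.0   # assumed corner spacing in pixels / mm

pixel_points = [(c * square_px, r * square_px)   # (x' column, y' row), origin (0, 0)
                for r in range(rows) for c in range(cols)]
world_points = [(c * square_mm, r * square_mm)   # TCP-probed (x, y) in the same order
                for r in range(rows) for c in range(cols)]

correspondences = list(zip(pixel_points, world_points))
```

In a real setup the pixel corners would come from a corner detector on the camera image and the world points from the TCP probe; the essential point illustrated here is only the matching one-to-one recording order.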
Then, conversion relation processing is performed on the pose information and the end position information to obtain a first conversion relation. If a regular article is to be grabbed, grabbing and positioning can be performed with the current manipulator directly according to the first conversion relation. If an irregular article is to be grabbed, a first grabbing pose of the manipulator and corresponding rotation center information are established when grabbing the article, and processing the first grabbing pose, the rotation center information and the first conversion relation yields a relative conversion relation; grabbing and positioning are then performed with the current manipulator according to the relative conversion relation.
Through the above steps, the problem that low manipulator positioning accuracy leads to low grabbing accuracy and restricts manipulators to operations such as carrying and stacking of regular articles is solved; high-accuracy positioning of the manipulator is realized, improving grabbing accuracy and meeting the accuracy requirements of grabbing irregular articles. Furthermore, the suction devices of the manipulator can be arranged arbitrarily according to the volume, shape and weight of the object, without being limited by the installation positions of the object and the suction devices.
In some embodiments, performing conversion relation processing on the pose information and the end position information to obtain the first conversion relation includes:
performing conversion relation processing on the pose information and the end position information according to a first conversion formula to obtain the first conversion relation.
Specifically, the first conversion formula is:
[x; y] = R [x'; y'] + M
where x is the abscissa and y is the ordinate of the point in world coordinates; x' is the abscissa and y' is the ordinate of the pixel point in image coordinates; R is the rotation matrix; M is the translation matrix.
The calculation process is as follows. For example, pixel point 1 in image coordinates is (x'1, y'1) and pixel point 2 is (x'2, y'2); point 1 in world coordinates is (x1, y1) and point 2 is (x2, y2).
Substituting into the first conversion formula gives
[x1; y1] = R [x'1; y'1] + M
[x2; y2] = R [x'2; y'2] + M
Subtracting one equation from the other eliminates M, and the rotation matrix is obtained from
[x1 - x2; y1 - y2] = R [x'1 - x'2; y'1 - y'2]
Substituting the obtained rotation matrix R back into either formula gives the translation matrix
M = [x1; y1] - R [x'1; y'1]
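The two-point derivation above can be sketched numerically as follows, under the assumption that R is a pure 2-D rotation (no scale) and with illustrative point values not taken from the patent: differencing the two substituted equations cancels M, the rotation angle is the angle between the two difference vectors, and M follows by back-substitution.

```python
import numpy as np

# Illustrative correspondences: two image points and their world coordinates,
# related here (by construction) by a 90-degree rotation plus a translation.
p1_img, p2_img = np.array([0.0, 0.0]), np.array([100.0, 0.0])
p1_wld, p2_wld = np.array([10.0, 10.0]), np.array([10.0, 110.0])

# Subtracting the two equations eliminates M; for a pure rotation, the angle
# of R is the angle between the world-side and image-side difference vectors.
d_img = p1_img - p2_img
d_wld = p1_wld - p2_wld
theta = np.arctan2(d_wld[1], d_wld[0]) - np.arctan2(d_img[1], d_img[0])

R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
M = p1_wld - R @ p1_img   # back-substitute R into either equation
```

With real calibration data one would use all checkerboard corners in a least-squares fit rather than just two points; the two-point form mirrors the derivation in the text.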
In some embodiments, processing according to the first grabbing pose, the rotation center information and the first conversion relation to obtain the relative conversion relation includes:
processing the first grabbing pose, the rotation center information and the first conversion relation by using a second conversion formula to obtain a relative position relation;
processing the first grabbing pose and the rotation center information by using an angle offset formula to obtain a relative angle relation;
and generating a relative conversion relation according to the relative position relation and the relative angle relation.
Specifically, the camera photographs the irregular target object to be grabbed, and the manipulator establishes a first grabbing pose. The first grabbing pose includes a first pose position (x0, y0) and a first pose angle A0. The rotation center information includes the rotation center coordinate point o'(x'0, y'0) and angle A'0 recorded at the TCP position point. The target object to be grabbed is then placed at random in the camera field of view, and the pixel coordinate point o''(x''0, y''0) and angle A''0 of the current rotation center of the target object are obtained. In practice, the relative conversion relation includes a relative position relation and a relative angle relation.
Also, the second conversion formula is:
[x_rel; y_rel] = (R [x''0; y''0] + M) - (R [x'0; y'0] + M)
where (x_rel, y_rel) is the relative position relation; x'0 is the abscissa and y'0 is the ordinate of the pixel point in the image coordinates recorded at the TCP position point based on the first pose of the manipulator; x''0 is the abscissa and y''0 is the ordinate of the pixel point in the image coordinates of the rotation center of the target object placed at random in the camera field of view; R is the rotation matrix; M is the translation matrix. The first grabbing pose, the rotation center information and the first conversion relation are processed with the second conversion formula to obtain the relative position relation.
The angle offset formula is A_rel = A''0 - A'0; the first grabbing pose and the rotation center information are processed with the angle offset formula to obtain the relative angle relation.
The actual position of the manipulator is then (x0 + x_rel, y0 + y_rel, A0 + A_rel), so the irregular target object can be grabbed with high accuracy.
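A minimal numeric sketch of these two steps, assuming an already-calibrated (R, M) pair and illustrative pixel and angle values (none of the numbers come from the patent): the recorded rotation-center pixel o' and the newly observed center pixel o'' are each mapped into world coordinates and differenced, the angles are differenced directly, and the resulting offsets are added to the first grabbing pose (x0, y0, A0).

```python
import numpy as np

# Assumed calibration result (illustrative 90-degree rotation + translation).
theta = np.deg2rad(90.0)
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
M = np.array([10.0, 10.0])

o_recorded = np.array([50.0, 50.0]); a_recorded = 15.0   # o'(x'0, y'0), A'0
o_observed = np.array([80.0, 50.0]); a_observed = 40.0   # o''(x''0, y''0), A''0

# Relative position: map both centers to world coordinates and difference
# (the translation M cancels in the subtraction).
rel_pos = (R @ o_observed + M) - (R @ o_recorded + M)
rel_ang = a_observed - a_recorded                        # A_rel = A''0 - A'0

# Final grasp pose: first pose plus the relative offsets.
first_pose = np.array([200.0, 300.0]); first_angle = 5.0  # (x0, y0), A0
grasp_pose = (first_pose + rel_pos, first_angle + rel_ang)
```

The helper values (o_recorded, o_observed, the poses) are hypothetical; the structure of the computation follows the relative position and angle offset formulas in the text.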
It should be noted that the steps illustrated in the above-described flow diagrams or in the flow diagrams of the figures may be performed in a computer system, such as a set of computer-executable instructions, and that, although a logical order is illustrated in the flow diagrams, in some cases, the steps illustrated or described may be performed in an order different than here.
The present embodiment further provides a manipulator positioning device, which is used to implement the above embodiments and preferred implementations; what has already been described is not repeated here. As used below, the terms "module", "unit", "sub-unit" and the like may be a combination of software and/or hardware that implements a predetermined function. Although the devices described in the following embodiments are preferably implemented in software, implementation in hardware, or in a combination of software and hardware, is also possible and contemplated.
Fig. 3 is a block diagram of a manipulator positioning device according to an embodiment of the present application. As shown in fig. 3, the device includes an acquisition module 100, a first processing module 200 and a second processing module 300. The acquisition module 100 is configured to acquire pose information of preset mark points of the calibration plate and end position information of the current manipulator; the first processing module 200 is configured to perform conversion relation processing on the pose information and the end position information to obtain a first conversion relation; the second processing module 300 is configured to establish a first grabbing pose of the manipulator and corresponding rotation center information when grabbing an article, and to process the first grabbing pose, the rotation center information and the first conversion relation to obtain a relative conversion relation; grabbing and positioning are performed with the current manipulator according to the relative conversion relation.
The present application achieves high-accuracy positioning of the manipulator, thereby improving grabbing precision and meeting the accuracy required for grabbing irregular articles.
The above modules may be functional modules or program modules, and may be implemented by software or hardware. For a module implemented by hardware, the modules may be located in the same processor; or the modules can be respectively positioned in different processors in any combination.
In some embodiments, the apparatus further comprises a grabbing module configured to, after the first conversion relation is obtained, position the current manipulator for grabbing according to the first conversion relation.
In some embodiments, the first processing module 200 is further configured to perform a conversion relationship processing on the pose information and the end position information according to a first conversion formula, so as to obtain a first conversion relationship.
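To make the first conversion relation concrete: given matched checkerboard corners (pixel coordinates from the camera image, world coordinates recorded by the TCP), a rotation matrix R and translation M can be recovered by a least-squares rigid fit. The sketch below is illustrative only — the function name and the Kabsch-style solver are our assumptions, not part of the application, and a real camera calibration would typically also estimate a pixel-to-millimetre scale factor.

```python
import numpy as np

def estimate_first_conversion(pixel_pts, world_pts):
    """Least-squares fit of [x, y]^T = R [x', y']^T + M from matched points.

    pixel_pts: (N, 2) checkerboard corner pixels in image coordinates.
    world_pts: (N, 2) the same corners as recorded by the robot TCP in
    world coordinates. (Function name and solver choice are assumptions;
    a real setup would typically also fit a pixel-to-mm scale.)
    """
    p = np.asarray(pixel_pts, float)
    w = np.asarray(world_pts, float)
    pc, wc = p.mean(axis=0), w.mean(axis=0)
    # Kabsch: SVD of the cross-covariance of the centred point sets.
    H = (p - pc).T @ (w - wc)
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))    # guard against reflections
    R = Vt.T @ np.diag([1.0, d]) @ U.T        # 2x2 rotation matrix
    M = wc - R @ pc                           # translation vector
    return R, M
```

With at least two non-coincident corner pairs this recovers R and M exactly when the data truly follow a rigid motion, and gives the least-squares best fit otherwise.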
In some embodiments, the second processing module 300 is further configured to process the first capture pose, the rotation center information, and the first conversion relationship by using a second conversion formula, so as to obtain a relative position relationship;
processing the first grabbing pose and the rotation center information by using an angle offset formula to obtain a relative angle relation;
and generating a relative conversion relation according to the relative position relation and the relative angle relation.
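As a hedged illustration of how the relative position relation and relative angle relation might be combined, assuming a first conversion relation (R, M) has already been obtained; the function name and argument layout are invented for this sketch:

```python
import numpy as np

def relative_conversion(R, M, ref_px, cur_px, ref_angle_deg, cur_angle_deg):
    """Combine a relative position relation and a relative angle relation.

    ref_px: pixel of the TCP position point at the first (taught) grasp pose.
    cur_px: pixel of the rotation center of the currently detected object.
    Angles are the taught and detected object orientations in degrees.
    (Illustrative sketch; names are not from the application.)
    """
    ref_w = R @ np.asarray(ref_px, float) + M   # reference point in world coords
    cur_w = R @ np.asarray(cur_px, float) + M   # current rotation center in world coords
    dxy = cur_w - ref_w                          # relative position relation
    # Relative angle relation, wrapped into (-180, 180].
    dtheta = (cur_angle_deg - ref_angle_deg + 180.0) % 360.0 - 180.0
    return dxy, dtheta
```

The manipulator would then offset its taught grabbing pose by dxy and rotate its gripper by dtheta to grab the article at its current placement.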
The present embodiment also provides an electronic device comprising a memory having a computer program stored therein and a processor configured to execute the computer program to perform the steps of any of the above method embodiments.
Optionally, the electronic apparatus may further include a transmission device and an input/output device, wherein the transmission device is connected to the processor, and the input/output device is connected to the processor.
Optionally, in this embodiment, the processor may be configured to perform the following steps by means of a computer program:
s2, acquiring pose information of preset mark points of the calibration plate and the tail end position information of the current manipulator;
s2, carrying out conversion relation processing on the attitude information and the tail end position information to obtain a first conversion relation;
s3, when an article is grabbed, establishing a first grabbing pose of the manipulator and corresponding rotation center information, and processing according to the first grabbing pose, the rotation center information and a first conversion relation to obtain a relative conversion relation; and the current manipulator is grabbed and positioned according to the relative conversion relation.
It should be noted that, for specific examples in this embodiment, reference may be made to examples described in the foregoing embodiments and optional implementations, and details of this embodiment are not described herein again.
In addition, in combination with the manipulator positioning method in the foregoing embodiments, an embodiment of the present application may be implemented by providing a storage medium. The storage medium has a computer program stored thereon; when executed by a processor, the computer program implements any of the manipulator positioning methods in the above embodiments.
Those skilled in the art will appreciate that the technical features of the above embodiments may be combined arbitrarily. For brevity, not all possible combinations are described; however, any combination of these technical features that involves no contradiction should be considered within the scope of this description.
The above embodiments express only several implementations of the present application, and their description, while specific and detailed, should not be construed as limiting the scope of the invention. A person skilled in the art may make several variations and modifications without departing from the concept of the present application, and these fall within its scope of protection. Therefore, the protection scope of this patent shall be subject to the appended claims.

Claims (7)

1. A method for positioning a manipulator, comprising:
acquiring pose information of preset mark points of a calibration plate and tail end position information of a current manipulator, wherein the calibration plate with the checkerboard is placed at a processing position, a camera is used for acquiring images of the calibration plate, and pixel data of the corner points of the checkerboard are recorded once according to a preset rule, the preset rule for recording being: the upper left corner is the coordinate origin (0,0), and the data of the corner points are recorded row by row, the data comprising the pixel rows and columns of the corner points, whereby the pose information is obtained; the manipulator establishes a coordinate system in the processing area, a probe is installed at the flange plate, and a TCP is established, the TCP recording the actual coordinates of the checkerboard corner points, the actual coordinates comprising an abscissa x and an ordinate y of the point in world coordinates, whereby the tail end position information is obtained; the point sequence in which the TCP records the actual coordinates of the checkerboard corner points corresponds one to one with the point sequence recorded in the calibration plate image;
performing conversion relation processing on the pose information and the tail end position information to obtain a first conversion relation;
when an irregular article is grabbed, establishing a first grabbing pose of the manipulator and corresponding rotation center information, and performing processing according to the first grabbing pose, the rotation center information and the first conversion relation to obtain a relative conversion relation, the current manipulator being positioned for grabbing according to the relative conversion relation, wherein obtaining the relative conversion relation by processing according to the first grabbing pose, the rotation center information and the first conversion relation comprises: processing the first grabbing pose, the rotation center information and the first conversion relation by using a second conversion formula to obtain a relative position relation; and processing the first grabbing pose and the rotation center information by using an angle offset formula to obtain a relative angle relation, wherein the first conversion formula is:
$$\begin{bmatrix} x \\ y \end{bmatrix} = R \begin{bmatrix} x' \\ y' \end{bmatrix} + M$$
wherein x is the abscissa of the point in world coordinates; y is the ordinate of the point in world coordinates; x' is the abscissa of the pixel point in image coordinates; y' is the ordinate of the pixel point in image coordinates; R is a rotation matrix; M is a translation matrix; and the second conversion formula is:
$$\begin{bmatrix} x_{\text{relative}} \\ y_{\text{relative}} \end{bmatrix} = \left( R \begin{bmatrix} x'_1 \\ y'_1 \end{bmatrix} + M \right) - \left( R \begin{bmatrix} x'_0 \\ y'_0 \end{bmatrix} + M \right)$$
wherein (x_relative, y_relative) is the relative position relation; x'_0 and y'_0 are the abscissa and ordinate, in image coordinates, of the pixel point recorded at the TCP position point based on the first pose of the manipulator; x'_1 and y'_1 are the abscissa and ordinate, in image coordinates, of the pixel point of the rotation center of the target object after the object to be grabbed is placed arbitrarily within the camera's field of view; R is a rotation matrix; M is a translation matrix;
and generating a relative conversion relation according to the relative position relation and the relative angle relation.
2. The robot positioning method according to claim 1, further comprising:
and after the first conversion relation is obtained, the current manipulator is grabbed and positioned according to the first conversion relation.
3. The manipulator positioning method according to claim 1, wherein performing conversion relation processing on the pose information and the tail end position information to obtain the first conversion relation comprises:
and carrying out conversion relation processing on the pose information and the tail end position information according to a first conversion formula to obtain a first conversion relation.
4. A manipulator positioning device, characterized by comprising an acquisition module, a first processing module and a second processing module;
the acquisition module is used for acquiring pose information of preset mark points of a calibration plate and tail end position information of a current manipulator, wherein the calibration plate with the checkerboard is placed at a processing position, a camera is used for acquiring images of the calibration plate, and pixel data of the corner points of the checkerboard are recorded once according to a preset rule, the preset rule for recording being: the upper left corner is the coordinate origin (0,0), and the data of the corner points are recorded row by row, the data comprising the pixel rows and columns of the corner points, whereby the pose information is obtained; the manipulator establishes a coordinate system in the processing area, a probe is installed at the flange plate, and a TCP is established, the TCP recording the actual coordinates of the checkerboard corner points, the actual coordinates comprising an abscissa x and an ordinate y of the point in world coordinates, whereby the tail end position information is obtained; the point sequence in which the TCP records the actual coordinates of the checkerboard corner points corresponds one to one with the point sequence recorded in the calibration plate image;
the first processing module is used for carrying out conversion relation processing on the pose information and the tail end position information to obtain a first conversion relation;
the second processing module is used for establishing a first grabbing pose of the manipulator and corresponding rotation center information when grabbing an article, and for performing processing according to the first grabbing pose, the rotation center information and the first conversion relation to obtain a relative conversion relation, the current manipulator being positioned for grabbing according to the relative conversion relation, wherein obtaining the relative conversion relation by processing according to the first grabbing pose, the rotation center information and the first conversion relation comprises: processing the first grabbing pose, the rotation center information and the first conversion relation by using a second conversion formula to obtain a relative position relation; and processing the first grabbing pose and the rotation center information by using an angle offset formula to obtain a relative angle relation, wherein the first conversion formula is:
$$\begin{bmatrix} x \\ y \end{bmatrix} = R \begin{bmatrix} x' \\ y' \end{bmatrix} + M$$
wherein x is the abscissa of the point in world coordinates; y is the ordinate of the point in world coordinates; x' is the abscissa of the pixel point in image coordinates; y' is the ordinate of the pixel point in image coordinates; R is a rotation matrix; M is a translation matrix; and the second conversion formula is:
$$\begin{bmatrix} x_{\text{relative}} \\ y_{\text{relative}} \end{bmatrix} = \left( R \begin{bmatrix} x'_1 \\ y'_1 \end{bmatrix} + M \right) - \left( R \begin{bmatrix} x'_0 \\ y'_0 \end{bmatrix} + M \right)$$
wherein (x_relative, y_relative) is the relative position relation; x'_0 and y'_0 are the abscissa and ordinate, in image coordinates, of the pixel point recorded at the TCP position point based on the first pose of the manipulator; x'_1 and y'_1 are the abscissa and ordinate, in image coordinates, of the pixel point of the rotation center of the target object after the object to be grabbed is placed arbitrarily within the camera's field of view; R is a rotation matrix; M is a translation matrix;
and generating a relative conversion relation according to the relative position relation and the relative angle relation.
5. A manipulator positioning system, comprising: a terminal device, a transmission device and a server device; the terminal equipment is connected with the server equipment through the transmission equipment;
the terminal equipment is used for acquiring the pose information and the tail end position information;
the transmission equipment is used for transmitting the pose information and the tail end position information;
the server device is configured to perform the manipulator positioning method according to any one of claims 1 to 3.
6. An electronic device comprising a memory and a processor, wherein the memory has stored therein a computer program, and the processor is configured to execute the computer program to perform the robot positioning method of any of claims 1 to 3.
7. A storage medium, in which a computer program is stored, wherein the computer program is arranged to perform the manipulator positioning method of any of claims 1 to 3 when executed.
CN202110441828.0A 2021-04-23 2021-04-23 Manipulator positioning method, device, system, electronic device and storage medium Active CN113510697B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110441828.0A CN113510697B (en) 2021-04-23 2021-04-23 Manipulator positioning method, device, system, electronic device and storage medium

Publications (2)

Publication Number Publication Date
CN113510697A CN113510697A (en) 2021-10-19
CN113510697B true CN113510697B (en) 2023-02-14

Family

ID=78062703

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110441828.0A Active CN113510697B (en) 2021-04-23 2021-04-23 Manipulator positioning method, device, system, electronic device and storage medium

Country Status (1)

Country Link
CN (1) CN113510697B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114354986B (en) * 2022-01-18 2022-11-11 苏州格拉尼视觉科技有限公司 Flying probe tester and test shaft polarity distribution method thereof

Citations (15)

Publication number Priority date Publication date Assignee Title
CN102448679A (en) * 2009-05-27 2012-05-09 莱卡地球系统公开股份有限公司 Method and system for extremely precise positioning of at least one object in the end position in space
EP2543483A1 (en) * 2011-07-08 2013-01-09 Canon Kabushiki Kaisha Information processing apparatus and information processing method
CN104626206A (en) * 2014-12-17 2015-05-20 西南科技大学 Robot operation pose information measuring method under non-structural environment
CN104786226A (en) * 2015-03-26 2015-07-22 华南理工大学 Posture and moving track positioning system and method of robot grabbing online workpiece
CN105234943A (en) * 2015-09-09 2016-01-13 大族激光科技产业集团股份有限公司 Industrial robot demonstration device and method based on visual recognition
CN105759720A (en) * 2016-04-29 2016-07-13 中南大学 Mechanical arm tracking and positioning on-line identification and correction method based on computer vision
CN108766894A (en) * 2018-06-07 2018-11-06 湖南大学 A kind of chip attachment method and system of robot vision guiding
CN109421050A (en) * 2018-09-06 2019-03-05 北京猎户星空科技有限公司 A kind of control method and device of robot
CN109829953A (en) * 2019-02-27 2019-05-31 广东拓斯达科技股份有限公司 Image collecting device scaling method, device, computer equipment and storage medium
JP2019120967A (en) * 2017-12-28 2019-07-22 國家中山科學研究院 Error compensation device and method
CN110238845A (en) * 2019-05-22 2019-09-17 湖南视比特机器人有限公司 Optimal Calibration point chooses and the automatic hand and eye calibrating method and device of error measurement
CN111070199A (en) * 2018-10-18 2020-04-28 杭州海康威视数字技术股份有限公司 Hand-eye calibration assessment method and robot
CN111369625A (en) * 2020-03-02 2020-07-03 广东利元亨智能装备股份有限公司 Positioning method, positioning device and storage medium
CN112258589A (en) * 2020-11-16 2021-01-22 北京如影智能科技有限公司 Hand-eye calibration method and device
CN112489133A (en) * 2020-11-17 2021-03-12 北京京东乾石科技有限公司 Calibration method, device and equipment of hand-eye system

Family Cites Families (7)

Publication number Priority date Publication date Assignee Title
US5591945A (en) * 1995-04-19 1997-01-07 Elo Touchsystems, Inc. Acoustic touch position sensor using higher order horizontally polarized shear wave propagation
US7153454B2 (en) * 2003-01-21 2006-12-26 University Of Southern California Multi-nozzle assembly for extrusion of wall
CN104132613B (en) * 2014-07-16 2017-01-11 佛山科学技术学院 Noncontact optical volume measurement method for complex-surface and irregular objects
CN105518702B (en) * 2014-11-12 2018-06-26 深圳市大疆创新科技有限公司 A kind of detection method, detection device and robot to target object
US10317872B2 (en) * 2015-08-07 2019-06-11 Spm Automation (Canada) Inc. Method of self-adjusting a machine to compensate for part-to-part variations
DE112019000125B4 (en) * 2018-10-30 2021-07-01 Mujin, Inc. SYSTEMS, DEVICES AND METHODS FOR AUTOMATED PACKAGING REGISTRATION
CN111136656B (en) * 2019-12-24 2020-12-08 上海智殷自动化科技有限公司 Method for automatically identifying and grabbing three-dimensional irregular object of robot

Non-Patent Citations (2)

Title
Qinjun Du, "The Location Error Analysis and Compensation for Needle Driven Robot", 2009 3rd International Conference on Bioinformatics and Biomedical Engineering, 2009-07-14, full text *
Feng Zhigang, "Research on Flexible Positioning Technology for Structural Parts Based on Industrial Robots", China Masters' Theses Full-text Database, Engineering Science and Technology II, No. 02, 2020-02-15, full text *



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant