CN113510697A - Manipulator positioning method, device, system, electronic device and storage medium - Google Patents

Manipulator positioning method, device, system, electronic device and storage medium Download PDF

Info

Publication number
CN113510697A
CN113510697A CN202110441828.0A
Authority
CN
China
Prior art keywords
manipulator
information
pose
conversion relation
rotation center
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202110441828.0A
Other languages
Chinese (zh)
Other versions
CN113510697B (en)
Inventor
汝杰
邱进忠
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Zhishou Technology Hangzhou Co ltd
Original Assignee
Zhishou Technology Hangzhou Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Zhishou Technology Hangzhou Co ltd filed Critical Zhishou Technology Hangzhou Co ltd
Priority to CN202110441828.0A priority Critical patent/CN113510697B/en
Publication of CN113510697A publication Critical patent/CN113510697A/en
Application granted granted Critical
Publication of CN113510697B publication Critical patent/CN113510697B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Images

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls
    • B25J9/1656Programme controls characterised by programming, planning systems for manipulators
    • B25J9/1664Programme controls characterised by programming, planning systems for manipulators characterised by motion, path, trajectory planning
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls
    • B25J9/1679Programme controls characterised by the tasks executed
    • B25J9/1687Assembly, peg and hole, palletising, straight line, weaving pattern movement

Landscapes

  • Engineering & Computer Science (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Manipulator (AREA)

Abstract

The application relates to a manipulator positioning method, device, system, electronic device and storage medium, wherein the method comprises the following steps: acquiring pose information of preset mark points of a calibration plate and end position information of the current manipulator; performing conversion relation processing on the pose information and the end position information to obtain a first conversion relation; when an article is grabbed, establishing a first grabbing pose of the manipulator and corresponding rotation center information, and processing the first grabbing pose, the rotation center information and the first conversion relation to obtain a relative conversion relation; and grabbing and positioning with the current manipulator according to the relative conversion relation. The application solves the problem that low manipulator positioning accuracy leads to low grabbing accuracy and restricts manipulators to operations such as carrying and stacking of regular articles; it realizes high-precision positioning of the manipulator, thereby improving grabbing accuracy and meeting the accuracy requirements for grabbing irregular articles.

Description

Manipulator positioning method, device, system, electronic device and storage medium
Technical Field
The present disclosure relates to the field of manipulator technologies, and in particular, to a manipulator positioning method, device, system, electronic device, and storage medium.
Background
Industrial robots are widely used in the industrial field, replacing manual labor in machining and taking over many heavy, repetitive tasks. As industrial robots continue to develop, operational intelligence has advanced greatly, and with the aging of the population and industrial upgrading, robots are expected to replace manual labor in ever more fields. Typically, an industrial robot is a multi-joint mechanical arm driven by several rotating motors, which provides controllable positioning of the robot's end. The robot itself carries no sensor; instead, a camera is mounted on or beside the robot and used to obtain target coordinates, so that the robot operates on a target according to the image acquired by the camera, a mode known as robot vision. This requires calibrating the robot and camera coordinate systems to establish the relation between them, a process called hand-eye calibration.
Traditional hand-eye calibration usually adopts point-to-point operation, with the robot end and the actual grabbing position in a coaxial relation; in the non-coaxial case, the objects studied are mostly regular articles, applied mainly in the traditional logistics industry. The positioning and grabbing accuracy of this approach is not high, so it can only perform operations such as carrying and stacking. When the robot end is not coaxial with the actual grabbing position, the offset is usually fixed at mechanical design time; however, because of workpiece errors and assembly errors, the actual position deviates considerably from the designed one, and compensating for these errors still leaves the positioning accuracy low, making grabbing accuracy hard to guarantee.
At present, no effective solution has been proposed for the problem in the related art that low manipulator positioning accuracy leads to low grabbing accuracy and restricts manipulators to operations such as carrying and stacking of regular articles.
Disclosure of Invention
The embodiments of the present application provide a manipulator positioning method, device, system, electronic device and storage medium, to at least solve the problem in the related art that low manipulator positioning accuracy leads to low grabbing accuracy and restricts manipulators to operations such as carrying and stacking of regular articles.
In a first aspect, an embodiment of the present application provides a manipulator positioning method, including:
acquiring pose information of preset mark points of a calibration plate and the tail end position information of the current manipulator;
carrying out conversion relation processing on the pose information and the tail end position information to obtain a first conversion relation;
when an article is grabbed, establishing a first grabbing pose of a manipulator and corresponding rotation center information, and processing according to the first grabbing pose, the rotation center information and a first conversion relation to obtain a relative conversion relation; and the current manipulator is grabbed and positioned according to the relative conversion relation.
In some of these embodiments, further comprising:
and after the first conversion relation is obtained, the current manipulator is grabbed and positioned according to the first conversion relation.
In some embodiments, performing a transformation relationship process on the pose information and the end position information to obtain a first transformation relationship includes:
and carrying out conversion relation processing on the pose information and the tail end position information according to a first conversion formula to obtain a first conversion relation.
In some of these embodiments, the first conversion formula is:
$$\begin{bmatrix} x \\ y \end{bmatrix} = M \begin{bmatrix} x' \\ y' \end{bmatrix} + R$$
wherein x is the abscissa of a point in world coordinates; y is the ordinate of the point in world coordinates; x' is the abscissa of the corresponding pixel point in image coordinates; y' is the ordinate of the pixel point in image coordinates; R is the translation matrix; M is the rotation matrix.
In some embodiments, processing according to the first capture pose, the rotation center information, and the first transformation relationship to obtain a relative transformation relationship includes:
processing the first grabbing pose, the rotation center information and the first conversion relation by using a second conversion formula to obtain a relative position relation;
processing the first grabbing pose and the rotation center information by using an angle offset formula to obtain a relative angle relation;
and generating a relative conversion relation according to the relative position relation and the relative angle relation.
In some of these embodiments, the second conversion formula is:
$$\begin{bmatrix} x_{\text{rel}} \\ y_{\text{rel}} \end{bmatrix} = \left( M \begin{bmatrix} x''_0 \\ y''_0 \end{bmatrix} + R \right) - \left( M \begin{bmatrix} x'_0 \\ y'_0 \end{bmatrix} + R \right)$$
where $(x_{\text{rel}}, y_{\text{rel}})$ is the relative position relation; $x'_0$ is the abscissa, in image coordinates, of the rotation center pixel point of the TCP position point based on the robot's first pose; $y'_0$ is the ordinate of that pixel point; $x''_0$ is the abscissa, in image coordinates, of the rotation center of the target object after the object to be grabbed is placed arbitrarily in the camera's field of view; $y''_0$ is the ordinate of that pixel point; R is the translation matrix; M is the rotation matrix.
In a second aspect, an embodiment of the present application provides a manipulator positioning device, which includes an obtaining module, a first processing module, and a second processing module;
the acquisition module is used for acquiring the pose information of the preset mark points of the calibration plate and the terminal position information of the current manipulator;
the first processing module is used for carrying out conversion relation processing on the pose information and the tail end position information to obtain a first conversion relation;
the second processing module is used for establishing a first grabbing pose of the manipulator and corresponding rotation center information when grabbing an article, and processing according to the first grabbing pose, the rotation center information and the first conversion relation to obtain a relative conversion relation; and the current manipulator is grabbed and positioned according to the relative conversion relation.
In a third aspect, an embodiment of the present application provides a manipulator positioning system, including: a terminal device, a transmission device and a server device; the terminal equipment is connected with the server equipment through the transmission equipment;
the terminal equipment is used for acquiring pose information and terminal position information;
the transmission equipment is used for transmitting the pose information and the tail end position information;
the server apparatus is configured to perform the robot positioning method according to the first aspect.
In a fourth aspect, embodiments of the present application provide an electronic device, which includes a memory, a processor, and a computer program stored on the memory and executable on the processor, and the processor, when executing the computer program, implements the robot positioning method according to the first aspect.
In a fifth aspect, the present application provides a storage medium, on which a computer program is stored, where the program is executed by a processor to implement the robot positioning method according to the first aspect.
Compared with the related art, the manipulator positioning method, device, system, electronic device and storage medium provided by the embodiments of the present application acquire pose information of preset mark points of a calibration plate and end position information of the current manipulator; perform conversion relation processing on the pose information and the end position information to obtain a first conversion relation; when an article is grabbed, establish a first grabbing pose of the manipulator and corresponding rotation center information, and process the first grabbing pose, the rotation center information and the first conversion relation to obtain a relative conversion relation; and grab and position with the current manipulator according to the relative conversion relation. The application solves the problem that low manipulator positioning accuracy leads to low grabbing accuracy and restricts manipulators to operations such as carrying and stacking of regular articles; it realizes high-precision positioning of the manipulator, thereby improving grabbing accuracy and meeting the accuracy requirements for grabbing irregular articles.
The details of one or more embodiments of the application are set forth in the accompanying drawings and the description below to provide a more thorough understanding of the application.
Drawings
The accompanying drawings, which are included to provide a further understanding of the application and are incorporated in and constitute a part of this application, illustrate embodiment(s) of the application and together with the description serve to explain the application and not to limit the application. In the drawings:
fig. 1 is a block diagram of a hardware structure of a terminal device of a robot positioning method according to an embodiment of the present application;
fig. 2 is a flowchart of a robot positioning method according to an embodiment of the present disclosure;
fig. 3 is a block diagram of a robot positioning device according to an embodiment of the present disclosure.
In the figure: 100. acquisition module; 200. first processing module; 300. second processing module.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, the present application will be described and illustrated below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the present application and are not intended to limit the present application. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments provided in the present application without any inventive step are within the scope of protection of the present application. Moreover, it should be appreciated that in the development of any such actual implementation, as in any engineering or design project, numerous implementation-specific decisions must be made to achieve the developers' specific goals, such as compliance with system-related and business-related constraints, which may vary from one implementation to another.
Reference in the specification to "an embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment can be included in at least one embodiment of the specification. The appearances of the phrase in various places in the specification are not necessarily all referring to the same embodiment, nor are separate or alternative embodiments mutually exclusive of other embodiments. Those of ordinary skill in the art will explicitly and implicitly appreciate that the embodiments described herein may be combined with other embodiments without conflict.
Unless defined otherwise, technical or scientific terms referred to herein shall have the ordinary meaning as understood by those of ordinary skill in the art to which this application belongs. Reference to "a," "an," "the," and similar words throughout this application is not limiting in number and may refer to the singular or the plural. In this application, the terms "including," "comprising," "having," and any variations thereof are intended to cover non-exclusive inclusion; for example, a process, method, system, article, or apparatus that comprises a list of steps or modules (elements) is not limited to the listed steps or elements, but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus. Reference to "connected," "coupled," and the like in this application is not limited to physical or mechanical connections, but may include electrical connections, whether direct or indirect. Reference herein to "a plurality" means greater than or equal to two. "And/or" describes an association relationship between associated objects, meaning that three relationships may exist; for example, "A and/or B" may mean: A exists alone, A and B exist simultaneously, or B exists alone. The terms "first," "second," "third," and the like herein merely distinguish similar objects and do not denote a particular ordering.
The method provided by the embodiment can be executed in a terminal, a computer or a similar operation device. Taking the operation on the terminal as an example, fig. 1 is a hardware structure block diagram of the terminal of the manipulator positioning method according to the embodiment of the present invention. As shown in fig. 1, the terminal 10 may include one or more (only one shown in fig. 1) processors 102 (the processor 102 may include, but is not limited to, a processing device such as a microprocessor MCU or a programmable logic device FPGA) and a memory 104 for storing data, and optionally may also include a transmission device 106 for communication functions and an input-output device 108. It will be understood by those skilled in the art that the structure shown in fig. 1 is only an illustration and is not intended to limit the structure of the terminal. For example, the terminal 10 may also include more or fewer components than shown in FIG. 1, or have a different configuration than shown in FIG. 1.
The memory 104 may be used to store computer programs, for example, software programs and modules of application software, such as a computer program corresponding to the robot positioning method in the embodiment of the present invention, and the processor 102 executes various functional applications and data processing by running the computer programs stored in the memory 104, so as to implement the method described above. The memory 104 may include high speed random access memory, and may also include non-volatile memory, such as one or more magnetic storage devices, flash memory, or other non-volatile solid-state memory. In some examples, the memory 104 may further include memory located remotely from the processor 102, which may be connected to the terminal 10 via a network. Examples of such networks include, but are not limited to, the internet, intranets, local area networks, mobile communication networks, and combinations thereof.
The transmission device 106 is used to receive or transmit data via a network. Specific examples of the network described above may include a wireless network provided by a communication provider of the terminal 10. In one example, the transmission device 106 includes a Network adapter (NIC) that can be connected to other Network devices through a base station to communicate with the internet. In one example, the transmission device 106 may be a Radio Frequency (RF) module, which is used to communicate with the internet in a wireless manner.
The present embodiment provides a robot positioning method, and fig. 2 is a flowchart of the robot positioning method according to the embodiment of the present application, and as shown in fig. 2, the flowchart includes the following steps:
step S210, acquiring pose information of preset mark points of a calibration plate and the tail end position information of the current manipulator;
step S220, performing conversion relation processing on the pose information and the end position information to obtain a first conversion relation;
step S230, when an article is grabbed, establishing a first grabbing pose of the manipulator and corresponding rotation center information, and processing according to the first grabbing pose, the rotation center information and a first conversion relation to obtain a relative conversion relation; and the current manipulator is grabbed and positioned according to the relative conversion relation.
It should be noted that, in the usual case, the grabbing position of the manipulator and the manipulator end are in a coaxial relation. This arrangement supports only point-to-point operation; not only is the form limited, but the grabbed objects are generally small and light, so the adaptable scenes are restricted. To adapt to more objects, suction devices are added, increasing their number from the original 1 to 2, 3, 4 or more. With several suction devices working cooperatively, large and heavy objects can be grabbed, but the increase in suction devices makes the grabbing position and the robot end non-coaxial: on one hand, the robot can then only handle large, regularly shaped objects, and on the other hand, the accuracy of the suction position can only be achieved through compensation.
In actual control, after the camera detects the pixel position of the target in the image, the camera's pixel coordinates are converted into the manipulator's spatial coordinate system through the calibrated coordinate conversion formula, and the motion of each motor is then computed in the manipulator coordinate system, so that the manipulator is controlled to reach the specified position. Hand-eye calibration refers to this conversion process. Owing to the problems of existing calibration methods, this method first acquires pose information of preset mark points of a calibration plate and end position information of the current manipulator. A checkerboard calibration plate is placed at the processing position, the camera collects images of it, and the pixel data of the checkerboard corner points are recorded according to a preset rule: the top-left corner is the coordinate origin (0, 0), and corner data are recorded row by row, each record containing the row pixel and the column pixel of the point. This yields the pose information. The manipulator establishes a coordinate system in the processing area, a probe is installed at the flange and a TCP (tool center point) is established, and the TCP is used to record the actual coordinates of the checkerboard corner points, namely the abscissa x and the ordinate y of each point in world coordinates; this yields the end position information. The TCP point sequence of actual corner coordinates is recorded in one-to-one correspondence with the corner points in the calibration plate image.
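The fitting of the first conversion relation from these recorded correspondences can be sketched as follows. This is a minimal illustration, not code from the patent: the function and variable names are invented, and a least-squares fit over all recorded corners is used here as a common generalization of the two-point derivation given in this description.

```python
import numpy as np

def fit_first_conversion(pixels, world):
    """Least-squares fit of the first conversion relation
    world = M @ pixel + R over all recorded checkerboard corners.

    pixels, world: (N, 2) arrays recorded under the same preset rule
    (top-left corner first, row by row), so row i of one corresponds
    to row i of the other.  Names here are illustrative only.
    """
    P = np.asarray(pixels, dtype=float)
    W = np.asarray(world, dtype=float)
    A = np.hstack([P, np.ones((len(P), 1))])      # rows [x', y', 1]
    coef, *_ = np.linalg.lstsq(A, W, rcond=None)  # (3, 2) coefficients
    M = coef[:2].T   # rotation/scale part of the mapping
    R = coef[2]      # translation part
    return M, R
```

Two non-coincident corners already determine a similarity mapping; using every recorded corner simply averages out corner-detection noise.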
Then, conversion relation processing is performed on the pose information and the end position information to obtain a first conversion relation. If a regular article is grabbed, the current manipulator can grab and position directly according to the first conversion relation. If an irregular article is grabbed, a first grabbing pose of the manipulator and corresponding rotation center information are established when the article is grabbed, and processing is performed according to the first grabbing pose, the rotation center information and the first conversion relation to obtain a relative conversion relation; the current manipulator then grabs and positions according to the relative conversion relation.
Through the above steps, the problem that low manipulator positioning accuracy leads to low grabbing accuracy and restricts manipulators to operations such as carrying and stacking of regular articles is solved; high-precision positioning of the manipulator is realized, thereby improving grabbing accuracy and meeting the accuracy requirements for grabbing irregular articles. Furthermore, the suction devices of the manipulator can be arranged arbitrarily according to the volume, shape and weight of the object, without being limited by the installation positions of the object and the suction devices.
In some embodiments, the performing a conversion relationship process on the pose information and the end position information to obtain a first conversion relationship includes:
The pose information and the end position information are processed according to a first conversion formula to obtain the first conversion relation.
Specifically, the first conversion formula is:
$$\begin{bmatrix} x \\ y \end{bmatrix} = M \begin{bmatrix} x' \\ y' \end{bmatrix} + R$$
wherein x is the abscissa of a point in world coordinates; y is the ordinate of the point in world coordinates; x' is the abscissa of the corresponding pixel point in image coordinates; y' is the ordinate of the pixel point in image coordinates; R is the translation matrix; M is the rotation matrix.
The calculation process is as follows: for example, pixel point 1 in image coordinates is $(x'_1, y'_1)$ and pixel point 2 is $(x'_2, y'_2)$; the corresponding points in world coordinates are point 1 $(x_1, y_1)$ and point 2 $(x_2, y_2)$.
Substituting both pairs into the first conversion formula gives

$$\begin{bmatrix} x_1 \\ y_1 \end{bmatrix} = M \begin{bmatrix} x'_1 \\ y'_1 \end{bmatrix} + R, \qquad \begin{bmatrix} x_2 \\ y_2 \end{bmatrix} = M \begin{bmatrix} x'_2 \\ y'_2 \end{bmatrix} + R$$

Taking the difference of the two equations eliminates the translation matrix R and determines the rotation matrix M:

$$\begin{bmatrix} x_1 - x_2 \\ y_1 - y_2 \end{bmatrix} = M \begin{bmatrix} x'_1 - x'_2 \\ y'_1 - y'_2 \end{bmatrix}$$

Substituting the obtained rotation matrix M back into either equation then yields the translation matrix R:

$$R = \begin{bmatrix} x_1 \\ y_1 \end{bmatrix} - M \begin{bmatrix} x'_1 \\ y'_1 \end{bmatrix}$$
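The difference-then-substitute derivation can be sketched as below, assuming M has the similarity form [[a, -b], [b, a]] (rotation plus uniform scale), which two point pairs fully determine; the function name is illustrative, not from the patent.

```python
import numpy as np

def solve_m_and_r(p1, p2, w1, w2):
    """Recover M and R from two pixel/world point pairs.

    Differencing the two instances of the first conversion formula
    cancels R and determines M (assumed to be a rotation-plus-uniform-
    scale matrix [[a, -b], [b, a]]); substituting M back into one
    instance then gives R.
    """
    p1, p2, w1, w2 = (np.asarray(v, dtype=float) for v in (p1, p2, w1, w2))
    dp = p1 - p2                     # pixel difference (R cancels)
    dw = w1 - w2                     # world difference
    n = dp @ dp                      # squared length of dp
    a = (dw[0] * dp[0] + dw[1] * dp[1]) / n
    b = (dw[1] * dp[0] - dw[0] * dp[1]) / n
    M = np.array([[a, -b], [b, a]])
    R = w1 - M @ p1                  # substitute back into w1 = M p1 + R
    return M, R
```

The closed form for a and b comes from projecting the world difference onto the pixel difference and onto its perpendicular, which is why the two pixel points must not coincide.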
In some embodiments, the processing according to the first capture pose, the rotation center information, and the first transformation relationship to obtain a relative transformation relationship includes:
processing the first grabbing pose, the rotation center information and the first conversion relation by using a second conversion formula to obtain a relative position relation;
processing the first grabbing pose and the rotation center information by using an angle offset formula to obtain a relative angle relation;
and generating a relative conversion relation according to the relative position relation and the relative angle relation.
Specifically, the camera photographs the irregular target object to be grabbed, and the manipulator establishes a first grabbing pose. The first grabbing pose comprises a first pose position $(x_0, y_0)$ and a first pose angle $A_0$. The rotation center information comprises the rotation center pixel point $o'(x'_0, y'_0)$ and angle $A'_0$ recorded for the TCP position point; after the target object to be grabbed is placed arbitrarily in the camera's field of view, the pixel coordinate point $o''(x''_0, y''_0)$ and angle $A''_0$ of its current rotation center are obtained. The relative conversion relation in fact comprises a relative position relation and a relative angle relation.
Also, the second conversion formula is:
$$\begin{bmatrix} x_{\text{rel}} \\ y_{\text{rel}} \end{bmatrix} = \left( M \begin{bmatrix} x''_0 \\ y''_0 \end{bmatrix} + R \right) - \left( M \begin{bmatrix} x'_0 \\ y'_0 \end{bmatrix} + R \right)$$
where $(x_{\text{rel}}, y_{\text{rel}})$ is the relative position relation; $x'_0$ is the abscissa, in image coordinates, of the rotation center pixel point of the TCP position point based on the robot's first pose; $y'_0$ is the ordinate of that pixel point; $x''_0$ is the abscissa, in image coordinates, of the rotation center of the target object after the object to be grabbed is placed arbitrarily in the camera's field of view; $y''_0$ is the ordinate of that pixel point; R is the translation matrix; M is the rotation matrix. The first grabbing pose, the rotation center information and the first conversion relation are processed with the second conversion formula to obtain the relative position relation.
The angular offset formula is $A_{\text{rel}} = A''_0 - A'_0$; the first grabbing pose and the rotation center information are processed with the angular offset formula to obtain the relative angle relation.
The actual position of the manipulator is then $(x_0 + x_{\text{rel}},\ y_0 + y_{\text{rel}},\ A_0 + A_{\text{rel}})$, so the irregular target object can be grabbed with high precision.
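Putting the relative position and relative angle together, the commanded grasp pose can be sketched as follows; this is an illustrative composition of the formulas above, with invented names.

```python
import numpy as np

def grasp_pose(first_pose, o_ref, a_ref, o_cur, a_cur, M, R):
    """Commanded grasp pose (x, y, A) for an arbitrarily placed object.

    first_pose:   (x0, y0, A0) taught first grabbing pose
    o_ref, a_ref: rotation-center pixel and angle recorded at teaching
    o_cur, a_cur: rotation-center pixel and angle of the object now
    M, R:         rotation and translation matrices of the first
                  conversion relation
    """
    x0, y0, A0 = first_pose
    ref_w = M @ np.asarray(o_ref, dtype=float) + R  # recorded center, world
    cur_w = M @ np.asarray(o_cur, dtype=float) + R  # current center, world
    x_rel, y_rel = cur_w - ref_w   # second conversion formula (R cancels)
    a_rel = a_cur - a_ref          # angular offset formula
    return (x0 + x_rel, y0 + y_rel, A0 + a_rel)
```

Because R appears in both mapped points, only the rotation matrix M actually affects the relative offset, which is what makes the scheme insensitive to where the world origin was placed.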
It should be noted that the steps illustrated in the above-described flow diagrams or in the flow diagrams of the figures may be performed in a computer system, such as a set of computer-executable instructions, and that, although a logical order is illustrated in the flow diagrams, in some cases, the steps illustrated or described may be performed in an order different than here.
The present embodiment further provides a manipulator positioning device, which is used to implement the foregoing embodiments and preferred embodiments, and the description of the device is omitted here. As used hereinafter, the terms "module," "unit," "subunit," and the like may implement a combination of software and/or hardware for a predetermined function. Although the means described in the embodiments below are preferably implemented in software, an implementation in hardware, or a combination of software and hardware is also possible and contemplated.
Fig. 3 is a block diagram of a robot positioning apparatus according to an embodiment of the present application, and as shown in fig. 3, the apparatus includes an acquisition module 100, a first processing module 200, and a second processing module 300; the acquisition module 100 is configured to acquire pose information of preset mark points of the calibration plate and end position information of a current manipulator; the first processing module 200 is configured to perform conversion relationship processing on the pose information and the end position information to obtain a first conversion relationship; the second processing module 300 is configured to establish a first grabbing pose and corresponding rotation center information of the manipulator when grabbing an article, and perform processing according to the first grabbing pose, the rotation center information, and the first conversion relationship to obtain a relative conversion relationship; and the current manipulator is grabbed and positioned according to the relative conversion relation.
The present application achieves high-accuracy positioning of the manipulator and thereby improves the grabbing result, meeting the precision required for grabbing irregular articles.
The above modules may be functional modules or program modules, and may be implemented by software or by hardware. For modules implemented by hardware, the modules may be located in the same processor, or may be distributed, in any combination, across different processors.
In some embodiments, the apparatus further comprises a grabbing module, and the grabbing module is configured to, after the first conversion relationship is obtained, position the current manipulator for grabbing according to the first conversion relationship.
In some embodiments, the first processing module 200 is further configured to perform conversion relationship processing on the pose information and the end position information according to a first conversion formula, so as to obtain the first conversion relationship.
In some embodiments, the second processing module 300 is further configured to process the first grabbing pose, the rotation center information, and the first conversion relationship by using a second conversion formula, so as to obtain a relative position relationship;
to process the first grabbing pose and the rotation center information by using an angle offset formula, so as to obtain a relative angle relationship;
and to generate the relative conversion relationship from the relative position relationship and the relative angle relationship.
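The processing chain described above (fit a pixel-to-world conversion relation, then derive a relative position and a relative angle) can be sketched as follows. This is a minimal illustration assuming a 2D rigid model with rotation matrix M and translation R; all function and variable names are illustrative rather than taken from the patent:

```python
import numpy as np

def fit_rigid_2d(pixel_pts, world_pts):
    """Estimate rotation matrix M and translation R so that
    world ~= M @ pixel + R (the role of the 'first conversion relation').
    pixel_pts, world_pts: (n, 2) arrays of corresponding points."""
    p_mean = pixel_pts.mean(axis=0)
    w_mean = world_pts.mean(axis=0)
    # Kabsch-style estimation from centered point sets
    H = (pixel_pts - p_mean).T @ (world_pts - w_mean)
    U, _, Vt = np.linalg.svd(H)
    M = Vt.T @ U.T
    if np.linalg.det(M) < 0:   # guard against a reflection solution
        Vt[-1] *= -1
        M = Vt.T @ U.T
    R = w_mean - M @ p_mean
    return M, R

def relative_conversion(tcp_px, center_px, M, R, tcp_angle, center_angle):
    """Relative position: difference of the two rotation-center points
    after mapping each to world coordinates; relative angle: plain offset."""
    rel_pos = (M @ center_px + R) - (M @ tcp_px + R)
    rel_angle = center_angle - tcp_angle
    return rel_pos, rel_angle
```

In a real system the point pairs fed to `fit_rigid_2d` would come from the calibration plate's mark points (in image coordinates) and the manipulator's recorded end positions, and `relative_conversion` would combine its outputs into the relative conversion relation used for grab positioning.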
The present embodiment also provides an electronic device comprising a memory having a computer program stored therein and a processor configured to execute the computer program to perform the steps of any of the above method embodiments.
Optionally, the electronic apparatus may further include a transmission device and an input/output device, wherein the transmission device is connected to the processor, and the input/output device is connected to the processor.
Optionally, in this embodiment, the processor may be configured to execute, by means of a computer program, the following steps:
S1, acquiring pose information of preset mark points of the calibration plate and end position information of the current manipulator;
S2, performing conversion relationship processing on the pose information and the end position information to obtain a first conversion relationship;
S3, when an article is grabbed, establishing a first grabbing pose of the manipulator and corresponding rotation center information, and performing processing according to the first grabbing pose, the rotation center information, and the first conversion relationship to obtain a relative conversion relationship; and positioning the current manipulator for grabbing according to the relative conversion relationship.
It should be noted that, for specific examples in this embodiment, reference may be made to examples described in the foregoing embodiments and optional implementations, and details of this embodiment are not described herein again.
In addition, in combination with the manipulator positioning method in the above embodiments, the embodiments of the present application may be implemented by providing a storage medium on which a computer program is stored; when executed by a processor, the computer program implements any of the manipulator positioning methods in the above embodiments.
It should be understood by those skilled in the art that various features of the above-described embodiments can be combined in any combination, and for the sake of brevity, all possible combinations of features in the above-described embodiments are not described in detail, but rather, all combinations of features which are not inconsistent with each other should be construed as being within the scope of the present disclosure.
The above-mentioned embodiments express only several implementations of the present application, and their description is specific and detailed, but they should not therefore be construed as limiting the scope of the invention. It should be noted that a person skilled in the art can make several variations and improvements without departing from the concept of the present application, and these fall within the protection scope of the present application. Therefore, the protection scope of this patent shall be subject to the appended claims.

Claims (10)

1. A method for positioning a manipulator, comprising:
acquiring pose information of preset mark points of a calibration plate and end position information of a current manipulator;
performing conversion relation processing on the pose information and the end position information to obtain a first conversion relation;
when an article is grabbed, establishing a first grabbing pose of the manipulator and corresponding rotation center information, and performing processing according to the first grabbing pose, the rotation center information, and the first conversion relation to obtain a relative conversion relation; and positioning the current manipulator for grabbing according to the relative conversion relation.
2. The manipulator positioning method according to claim 1, further comprising:
after the first conversion relation is obtained, positioning the current manipulator for grabbing according to the first conversion relation.
3. The manipulator positioning method according to claim 1, wherein performing a transformation relationship process on the pose information and the end position information to obtain a first transformation relationship comprises:
and carrying out conversion relation processing on the pose information and the tail end position information according to a first conversion formula to obtain a first conversion relation.
4. The manipulator positioning method of claim 3, wherein the first conversion formula is:
Figure FDA0003035382330000011
wherein x is the abscissa of the point in world coordinates; y is the ordinate of the point in world coordinates; x' is the abscissa of the pixel point in image coordinates; y' is the ordinate of the pixel point in image coordinates; R is a translation matrix; and M is a rotation matrix.
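The formula itself survives only as an image reference (FDA0003035382330000011). A plausible reconstruction, consistent with the variable definitions above, is the standard mapping from image coordinates to world coordinates; this is an inference from context, not the patent's verified formula:

```latex
\begin{bmatrix} x \\ y \end{bmatrix}
  = M \begin{bmatrix} x' \\ y' \end{bmatrix} + R
```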
5. The manipulator positioning method according to claim 1, wherein processing according to the first grasp pose, the rotation center information, and the first conversion relationship to obtain a relative conversion relationship comprises:
processing the first grabbing pose, the rotation center information and the first conversion relation by using a second conversion formula to obtain a relative position relation;
processing the first grabbing pose and the rotation center information by using an angle offset formula to obtain a relative angle relation;
and generating a relative conversion relation according to the relative position relation and the relative angle relation.
6. The manipulator positioning method of claim 5, wherein the second conversion formula is:
Figure FDA0003035382330000012
wherein (x_relative, y_relative) is the relative position relation; x'_0 is the abscissa, in image coordinates, of the rotation center, i.e., the TCP position point based on the first pose of the robot; y'_0 is the ordinate, in image coordinates, of the TCP position point based on the first pose of the robot, i.e., the rotation center; x''_0 is the abscissa, in image coordinates, of the rotation center of the target object when the object to be grabbed is placed arbitrarily in the field of view of the camera; y''_0 is the ordinate, in image coordinates, of the rotation center of the target object when the object to be grabbed is placed arbitrarily in the field of view of the camera; R is a translation matrix; and M is a rotation matrix.
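Like the first formula, the second conversion formula is present only as an image reference (FDA0003035382330000012). One reading consistent with the variable definitions above is that the relative position is the difference between the two rotation-center points after each is mapped to world coordinates through the first conversion relation; this is an inference, not the verified formula:

```latex
\begin{bmatrix} x_{\text{relative}} \\ y_{\text{relative}} \end{bmatrix}
  = \left( M \begin{bmatrix} x''_0 \\ y''_0 \end{bmatrix} + R \right)
  - \left( M \begin{bmatrix} x'_0 \\ y'_0 \end{bmatrix} + R \right)
```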
7. A manipulator positioning device, characterized by comprising an acquisition module, a first processing module, and a second processing module;
the acquisition module is used for acquiring pose information of preset mark points of the calibration plate and end position information of the current manipulator;
the first processing module is used for performing conversion relation processing on the pose information and the end position information to obtain a first conversion relation;
the second processing module is used for establishing, when an article is grabbed, a first grabbing pose of the manipulator and corresponding rotation center information, and for performing processing according to the first grabbing pose, the rotation center information, and the first conversion relation to obtain a relative conversion relation; and the current manipulator is positioned for grabbing according to the relative conversion relation.
8. A manipulator positioning system, comprising: a terminal device, a transmission device and a server device; the terminal equipment is connected with the server equipment through the transmission equipment;
the terminal device is used for acquiring the pose information and the end position information;
the transmission device is used for transmitting the pose information and the end position information;
the server device is configured to perform the manipulator positioning method according to any one of claims 1 to 6.
9. An electronic device comprising a memory and a processor, wherein the memory has stored therein a computer program, and the processor is configured to execute the computer program to perform the manipulator positioning method of any of claims 1 to 6.
10. A storage medium, in which a computer program is stored, wherein the computer program is arranged to perform the manipulator positioning method of any of claims 1 to 6 when executed.
CN202110441828.0A 2021-04-23 2021-04-23 Manipulator positioning method, device, system, electronic device and storage medium Active CN113510697B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110441828.0A CN113510697B (en) 2021-04-23 2021-04-23 Manipulator positioning method, device, system, electronic device and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110441828.0A CN113510697B (en) 2021-04-23 2021-04-23 Manipulator positioning method, device, system, electronic device and storage medium

Publications (2)

Publication Number Publication Date
CN113510697A true CN113510697A (en) 2021-10-19
CN113510697B CN113510697B (en) 2023-02-14

Family

ID=78062703

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110441828.0A Active CN113510697B (en) 2021-04-23 2021-04-23 Manipulator positioning method, device, system, electronic device and storage medium

Country Status (1)

Country Link
CN (1) CN113510697B (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114354986A (en) * 2022-01-18 2022-04-15 苏州格拉尼视觉科技有限公司 Flying probe tester and test shaft polarity distribution method thereof
CN115922404A (en) * 2023-01-28 2023-04-07 中冶赛迪技术研究中心有限公司 Disassembling method, disassembling system, electronic equipment and storage medium

Citations (22)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20030164820A1 (en) * 1995-04-19 2003-09-04 Joel Kent Acoustic condition sensor employing a plurality of mutually non-orthogonal waves
US20050196484A1 (en) * 2003-01-21 2005-09-08 University Of Southern California Robotic systems for automated construction
CN102448679A (en) * 2009-05-27 2012-05-09 莱卡地球系统公开股份有限公司 Method and system for extremely precise positioning of at least one object in the end position in space
EP2543483A1 (en) * 2011-07-08 2013-01-09 Canon Kabushiki Kaisha Information processing apparatus and information processing method
CN104132613A (en) * 2014-07-16 2014-11-05 佛山科学技术学院 Noncontact optical volume measurement method for complex-surface and irregular objects
CN104626206A (en) * 2014-12-17 2015-05-20 西南科技大学 Robot operation pose information measuring method under non-structural environment
CN104786226A (en) * 2015-03-26 2015-07-22 华南理工大学 Posture and moving track positioning system and method of robot grabbing online workpiece
CN105234943A (en) * 2015-09-09 2016-01-13 大族激光科技产业集团股份有限公司 Industrial robot demonstration device and method based on visual recognition
CN105518702A (en) * 2014-11-12 2016-04-20 深圳市大疆创新科技有限公司 Method, device and robot for detecting target object
CN105759720A (en) * 2016-04-29 2016-07-13 中南大学 Mechanical arm tracking and positioning on-line identification and correction method based on computer vision
US20170038756A1 (en) * 2015-08-07 2017-02-09 Spm Automation (Canada) Inc. Method of self-adjusting a machine to compensate for part-to-part variations
CN108766894A (en) * 2018-06-07 2018-11-06 湖南大学 A kind of chip attachment method and system of robot vision guiding
CN109421050A (en) * 2018-09-06 2019-03-05 北京猎户星空科技有限公司 A kind of control method and device of robot
CN109829953A (en) * 2019-02-27 2019-05-31 广东拓斯达科技股份有限公司 Image collecting device scaling method, device, computer equipment and storage medium
JP2019120967A (en) * 2017-12-28 2019-07-22 國家中山科學研究院 Error compensation device and method
CN110238845A (en) * 2019-05-22 2019-09-17 湖南视比特机器人有限公司 Optimal Calibration point chooses and the automatic hand and eye calibrating method and device of error measurement
CN111070199A (en) * 2018-10-18 2020-04-28 杭州海康威视数字技术股份有限公司 Hand-eye calibration assessment method and robot
WO2020091846A1 (en) * 2018-10-30 2020-05-07 Mujin, Inc. Automated package registration systems, devices, and methods
CN111136656A (en) * 2019-12-24 2020-05-12 上海智殷自动化科技有限公司 Method for automatically identifying and grabbing three-dimensional irregular object of robot
CN111369625A (en) * 2020-03-02 2020-07-03 广东利元亨智能装备股份有限公司 Positioning method, positioning device and storage medium
CN112258589A (en) * 2020-11-16 2021-01-22 北京如影智能科技有限公司 Hand-eye calibration method and device
CN112489133A (en) * 2020-11-17 2021-03-12 北京京东乾石科技有限公司 Calibration method, device and equipment of hand-eye system

Patent Citations (22)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20030164820A1 (en) * 1995-04-19 2003-09-04 Joel Kent Acoustic condition sensor employing a plurality of mutually non-orthogonal waves
US20050196484A1 (en) * 2003-01-21 2005-09-08 University Of Southern California Robotic systems for automated construction
CN102448679A (en) * 2009-05-27 2012-05-09 莱卡地球系统公开股份有限公司 Method and system for extremely precise positioning of at least one object in the end position in space
EP2543483A1 (en) * 2011-07-08 2013-01-09 Canon Kabushiki Kaisha Information processing apparatus and information processing method
CN104132613A (en) * 2014-07-16 2014-11-05 佛山科学技术学院 Noncontact optical volume measurement method for complex-surface and irregular objects
CN105518702A (en) * 2014-11-12 2016-04-20 深圳市大疆创新科技有限公司 Method, device and robot for detecting target object
CN104626206A (en) * 2014-12-17 2015-05-20 西南科技大学 Robot operation pose information measuring method under non-structural environment
CN104786226A (en) * 2015-03-26 2015-07-22 华南理工大学 Posture and moving track positioning system and method of robot grabbing online workpiece
US20170038756A1 (en) * 2015-08-07 2017-02-09 Spm Automation (Canada) Inc. Method of self-adjusting a machine to compensate for part-to-part variations
CN105234943A (en) * 2015-09-09 2016-01-13 大族激光科技产业集团股份有限公司 Industrial robot demonstration device and method based on visual recognition
CN105759720A (en) * 2016-04-29 2016-07-13 中南大学 Mechanical arm tracking and positioning on-line identification and correction method based on computer vision
JP2019120967A (en) * 2017-12-28 2019-07-22 國家中山科學研究院 Error compensation device and method
CN108766894A (en) * 2018-06-07 2018-11-06 湖南大学 A kind of chip attachment method and system of robot vision guiding
CN109421050A (en) * 2018-09-06 2019-03-05 北京猎户星空科技有限公司 A kind of control method and device of robot
CN111070199A (en) * 2018-10-18 2020-04-28 杭州海康威视数字技术股份有限公司 Hand-eye calibration assessment method and robot
WO2020091846A1 (en) * 2018-10-30 2020-05-07 Mujin, Inc. Automated package registration systems, devices, and methods
CN109829953A (en) * 2019-02-27 2019-05-31 广东拓斯达科技股份有限公司 Image collecting device scaling method, device, computer equipment and storage medium
CN110238845A (en) * 2019-05-22 2019-09-17 湖南视比特机器人有限公司 Optimal Calibration point chooses and the automatic hand and eye calibrating method and device of error measurement
CN111136656A (en) * 2019-12-24 2020-05-12 上海智殷自动化科技有限公司 Method for automatically identifying and grabbing three-dimensional irregular object of robot
CN111369625A (en) * 2020-03-02 2020-07-03 广东利元亨智能装备股份有限公司 Positioning method, positioning device and storage medium
CN112258589A (en) * 2020-11-16 2021-01-22 北京如影智能科技有限公司 Hand-eye calibration method and device
CN112489133A (en) * 2020-11-17 2021-03-12 北京京东乾石科技有限公司 Calibration method, device and equipment of hand-eye system

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
QINJUN DU: "The Location Error Analysis and Compensation for Needle Driven Robot", 2009 3rd International Conference on Bioinformatics and Biomedical Engineering *
FENG Zhigang: "Research on Flexible Positioning Technology for Structural Parts Based on Industrial Robots", China Master's Theses Full-text Database, Engineering Science and Technology II *

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114354986A (en) * 2022-01-18 2022-04-15 苏州格拉尼视觉科技有限公司 Flying probe tester and test shaft polarity distribution method thereof
CN115922404A (en) * 2023-01-28 2023-04-07 中冶赛迪技术研究中心有限公司 Disassembling method, disassembling system, electronic equipment and storage medium
CN115922404B (en) * 2023-01-28 2024-04-12 中冶赛迪技术研究中心有限公司 Disassembling method, disassembling system, electronic equipment and storage medium

Also Published As

Publication number Publication date
CN113510697B (en) 2023-02-14

Similar Documents

Publication Publication Date Title
DE102019130046B4 (en) Robot system with improved scanning mechanism
CN113510697B (en) Manipulator positioning method, device, system, electronic device and storage medium
US11220007B2 (en) Method of stacking goods by robot, system of controlling robot to stack goods, and robot
CN111083376B (en) Method, system and device for determining installation position of target object and electronic equipment
CN109366472B (en) Method and device for placing articles by robot, computer equipment and storage medium
US11839986B2 (en) Systems and methods for active perception and coordination between robotic vision systems and manipulators
CN113715016B (en) Robot grabbing method, system, device and medium based on 3D vision
CN111360821A Picking control method, device and equipment and computer-readable storage medium
CN106945037A (en) A kind of target grasping means and system applied to small scale robot
CN109636783A (en) Determination method, apparatus, computer equipment and the storage medium of robot brachium
CN113734981B (en) Method and device for setting material transportation path of intelligent tower crane
CN110167721B (en) Robot system, automatic calibration method and storage device
CN111225554B (en) Bulk object grabbing and assembling method, device, controller and system
CN113246145B (en) Pose compensation method and system for nuclear industry grabbing equipment and electronic device
CN114098980B (en) Camera pose adjustment method, space registration method, system and storage medium
CN112748737A (en) Laser charging method for estimating trinocular visual pose of patrol robot
CN113510696A (en) Method, device and system for constructing manipulator workpiece coordinate system and storage medium
CN113414764A (en) Part warehousing method and device, terminal and readable storage medium
CN114037756A (en) Automatic adding method and system of grinding body and terminal equipment
CN117428788B (en) Equipment control method and device, electronic equipment and storage medium
CN112605990A (en) Robot vision control method and system
Sun et al. A medical garbage bin recycling system based on AGV
CN111410045A (en) Container handling method and device
CN117014824A (en) Robot communication method, system, processing equipment and storage medium suitable for mine tunnel
CN113997282B (en) Mechanical arm control method, mechanical arm control device, electronic device and storage medium

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant