CN111390910A - Manipulator target grabbing and positioning method, computer readable storage medium and manipulator - Google Patents

Info

Publication number
CN111390910A
CN111390910A
Authority
CN
China
Prior art keywords
manipulator
target
auxiliary positioning
camera
positioning object
Prior art date
Legal status
Pending
Application number
CN202010246874.0A
Other languages
Chinese (zh)
Inventor
傅峰峰
江志强
刘嘉荣
李航
Current Assignee
Guangzhou Fugang Wanjia Intelligent Technology Co Ltd
Original Assignee
Guangzhou Fugang Wanjia Intelligent Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Guangzhou Fugang Wanjia Intelligent Technology Co Ltd filed Critical Guangzhou Fugang Wanjia Intelligent Technology Co Ltd
Priority to CN202010246874.0A priority Critical patent/CN111390910A/en
Publication of CN111390910A publication Critical patent/CN111390910A/en
Pending legal-status Critical Current

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00 Programme-controlled manipulators
    • B25J9/16 Programme controls
    • B25J9/1612 Programme controls characterised by the hand, wrist, grip control
    • B25J9/1656 Programme controls characterised by programming, planning systems for manipulators
    • B25J9/1669 Programme controls characterised by programming, planning systems for manipulators characterised by special application, e.g. multi-arm co-operation, assembly, grasping
    • B25J9/1679 Programme controls characterised by the tasks executed
    • B25J9/1694 Programme controls characterised by use of sensors other than normal servo-feedback from position, speed or acceleration sensors, perception control, multi-sensor controlled systems, sensor fusion
    • B25J9/1697 Vision controlled systems
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/50 Information retrieval of still image data
    • G06F16/58 Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually
    • G06F16/583 Retrieval using metadata automatically derived from the content
    • G06F16/5854 Retrieval using metadata automatically derived from the content, using shape and object relationship
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/10 Terrestrial scenes

Abstract

The invention relates to the technical field of manipulator control, and in particular to a manipulator target grabbing and positioning method, a computer-readable storage medium, and a manipulator. During a grabbing operation, an auxiliary positioning object is used to help the manipulator locate the target: the positional relation between the auxiliary positioning object and the target is obtained from the captured image; the positional relation between the camera and the auxiliary positioning object is derived by comparing the geometric features the auxiliary positioning object presents in the image with its pre-stored geometric features; and the positional relation between the camera and the target is then derived from these two relations. Because the geometric features of additional auxiliary positioning objects can be added to the database at any time, the method makes it convenient for the manipulator to grab targets placed at different positions.

Description

Manipulator target grabbing and positioning method, computer readable storage medium and manipulator
Technical Field
The invention relates to the technical field of manipulator control, and in particular to a manipulator target grabbing and positioning method, a computer-readable storage medium, and a manipulator.
Background
With the rapid development of machine vision, more and more industrial manipulators, and devices equipped with manipulators, use a vision system to guide manipulator operation. The eye-in-hand configuration is widely applied in industrial robots: the camera is fixed at the end of the manipulator and moves with it, and the manipulator's processor uses this eye-in-hand vision system to provide navigation guidance. Before the manipulator grabs a target, hand-eye calibration is performed on the manipulator and the camera to obtain a hand-eye relation matrix describing the positional relation between them. The camera then acquires the target's coordinates in the pixel coordinate system (conventionally with the origin at the upper-left corner of the camera's image sensor and the XY axes parallel to the image edges), and the hand-eye relation matrix converts them into coordinates in the base coordinate system (conventionally a right-handed coordinate system with its origin at the center of the manipulator's base), from which the manipulator's controller plans the motion path for moving and clamping.
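The pixel-to-base conversion described above can be sketched in a few lines. This is a minimal illustration, not the patent's implementation: the 3x3 planar mapping `H` and its values are hypothetical stand-ins for a real calibration result.

```python
# Illustrative sketch (not from the patent): applying a planar pixel-to-base
# mapping H, here a 3x3 homography, to a detected pixel coordinate (u, v).

def apply_homography(H, u, v):
    """Map pixel (u, v) to base-frame (x, y) via homogeneous coordinates."""
    x = H[0][0] * u + H[0][1] * v + H[0][2]
    y = H[1][0] * u + H[1][1] * v + H[1][2]
    w = H[2][0] * u + H[2][1] * v + H[2][2]
    return x / w, y / w

# Hypothetical calibration result: 0.5 mm per pixel, base origin offset
# from the image origin by (100, 50) mm.
H = [[0.5, 0.0, 100.0],
     [0.0, 0.5, 50.0],
     [0.0, 0.0, 1.0]]
print(apply_homography(H, 200, 100))  # → (200.0, 100.0)
```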
A typical hand-eye calibration method works as follows. A high-precision planar calibration plate is placed horizontally in the area to be calibrated, carrying a number of calibration points whose mutual distances are known. The manipulator, carrying the camera fixed to it, is moved in turn to the position of each calibration point; at each position the calibration plate is photographed to obtain a planar image containing the calibration point, the coordinates of that point in the pixel coordinate system are extracted, and the manipulator's coordinates in the base coordinate system at the moment of capture are recorded. Solving the correspondence between the calibration-point coordinates in the pixel coordinate system and the manipulator coordinates in the base coordinate system yields the conversion between the camera's pixel coordinate system and the manipulator's base coordinate system. With a coordinate conversion obtained this way, the manipulator can grab accurately only inside the calibrated area; if a target lies outside it, the calibrated area must be enlarged before the grab can be performed.
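As a rough sketch of the correspondence-solving step: with three non-collinear calibration points whose pixel and base coordinates are both recorded, an affine pixel-to-base mapping can be solved exactly by Cramer's rule. The point values below are hypothetical, and a real calibration would typically fit many points in a least-squares sense.

```python
def solve_affine(pix, base):
    """Solve x = a*u + b*v + c (and likewise y) from 3 point pairs."""
    (u1, v1), (u2, v2), (u3, v3) = pix
    det = u1 * (v2 - v3) - v1 * (u2 - u3) + (u2 * v3 - u3 * v2)
    rows = []
    for i in (0, 1):  # i=0 solves the x-row, i=1 the y-row
        r1, r2, r3 = (p[i] for p in base)
        a = (r1 * (v2 - v3) - v1 * (r2 - r3) + (r2 * v3 - r3 * v2)) / det
        b = (u1 * (r2 - r3) - r1 * (u2 - u3) + (u2 * r3 - u3 * r2)) / det
        c = (u1 * (v2 * r3 - v3 * r2) - v1 * (u2 * r3 - u3 * r2)
             + r1 * (u2 * v3 - u3 * v2)) / det
        rows.append((a, b, c))
    return rows

# Hypothetical data: pixel coordinates of three calibration points, and the
# manipulator's base coordinates recorded when each was photographed.
pix = [(0, 0), (100, 0), (0, 100)]
base = [(10.0, 20.0), (60.0, 20.0), (10.0, 70.0)]
rows = solve_affine(pix, base)
print(rows)  # → [(0.5, 0.0, 10.0), (0.0, 0.5, 20.0)]
```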
Disclosure of Invention
The purpose of the invention is to provide a method by which a manipulator can quickly and conveniently position a target for grabbing, a computer-readable storage medium storing a program that implements the method when executed by a processor, and a manipulator using that medium.
The manipulator target grabbing and positioning method comprises the following steps:
photographing the scene where the target is located with a camera that moves with the manipulator, identifying the target and an auxiliary positioning object in the scene, obtaining the geometric features the auxiliary positioning object presents in the photographed scene, and obtaining the positional relation between the target and the auxiliary positioning object in the photographed scene;
deriving the positional relation between the camera and the auxiliary positioning object from the geometric features the auxiliary positioning object presents in the photographed scene and the pre-stored geometric features of the auxiliary positioning object; and
deriving the positional relation between the camera and the target from the positional relation between the camera and the auxiliary positioning object and the positional relation between the auxiliary positioning object and the target in the photographed scene.
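The last step chains two positional relations. Expressed as homogeneous transforms, camera→target is the matrix product of camera→auxiliary object and auxiliary object→target; a minimal 2D sketch with hypothetical translation values (the patent itself does not prescribe a representation):

```python
def compose(T_ab, T_bc):
    """Matrix product of two 3x3 homogeneous transforms: T_ac = T_ab * T_bc."""
    return [[sum(T_ab[i][k] * T_bc[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def translation(dx, dy):
    """Pure-translation homogeneous transform."""
    return [[1.0, 0.0, dx], [0.0, 1.0, dy], [0.0, 0.0, 1.0]]

T_cam_aux = translation(300.0, 120.0)    # camera → auxiliary object (hypothetical)
T_aux_target = translation(-50.0, 40.0)  # auxiliary object → target (hypothetical)
T_cam_target = compose(T_cam_aux, T_aux_target)
print(T_cam_target[0][2], T_cam_target[1][2])  # → 250.0 160.0
```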
The computer-readable storage medium stores a computer program which, when executed by a processor, implements the above manipulator target grabbing and positioning method.
The manipulator comprises a camera for photographing the area where the target is located, a processor, and the above computer-readable storage medium; the camera moves with the manipulator, and the processor controls the camera to shoot.
Advantageous effects: when the manipulator performs a grabbing operation, an auxiliary positioning object is used to help the manipulator position the target. First, the camera photographs an image of the scene where the target is located. The image is then processed in two ways: on one hand, the positional relation between the target and the auxiliary positioning object in the photographed scene is obtained; on the other hand, the positional relation between the camera and the auxiliary positioning object is derived from the geometric features the auxiliary positioning object presents in the image and its pre-stored geometric features. Finally, the positional relation between the camera and the target is derived from these two relations. Because the geometric features of additional auxiliary positioning objects can be added by updating the database at any time, the method makes it convenient for the manipulator to grab targets placed at different positions.
Drawings
Fig. 1 is a schematic structural view of a cooking table equipped with a manipulator and a water tank.
Detailed Description
The cooking area of a smart kitchen is equipped with a manipulator. As shown in Fig. 1, the base of the manipulator 2 is mounted on the cooking table 1; a camera 3 that moves with the manipulator 2 is mounted on the manipulator, and a clamping jaw for grabbing targets is installed at the end of the manipulator 2. When the manipulator is installed, structural data describing its working scene are retrieved from the smart kitchen's database; geometric features convenient for image correction, such as squares and right triangles, are identified in the retrieved data, and objects with simple, stable geometric features are recorded as auxiliary positioning objects. In the cooking table 1 of Fig. 1, the water tank 4 in the table top has a rectangular top opening whose length and width are recorded in the tank's structural data; the water tank 4 therefore has four right angles with known side lengths and matches the geometric profile of an auxiliary positioning object. Accordingly, the water tank 4 is stored in the feature library as an auxiliary positioning object together with its own geometric features: the four vertices P1, P2, P3 and P4 and the distance between each pair of vertices. Edge-feature images of the tank at several viewing angles are also generated from the tank's structural data and stored in the feature library.
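A feature-library record of this kind can be sketched as a small mapping; the field names and the 400 x 300 mm opening below are illustrative assumptions, not dimensions given in the patent.

```python
# Hypothetical shape of a feature-library record for the water tank.
feature_library = {
    "water_tank": {
        "vertices": ("P1", "P2", "P3", "P4"),
        # Distance between each pair of vertices, in millimetres
        # (sides of the rectangular opening, plus the two diagonals).
        "pair_distances_mm": {
            ("P1", "P2"): 400.0, ("P2", "P3"): 300.0,
            ("P3", "P4"): 400.0, ("P4", "P1"): 300.0,
            ("P1", "P3"): 500.0, ("P2", "P4"): 500.0,
        },
        "right_angles": 4,
    },
}
tank = feature_library["water_tank"]
print(len(tank["vertices"]), tank["pair_distances_mm"][("P1", "P3")])  # → 4 500.0
```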
The positioning process performed by the manipulator 2 to grab the target 5 is described in detail below.
The camera 3 photographs the scene where the target 5 is located; the area covered by the captured image is shown as scene 31 in Fig. 1, and the water tank 4 is captured in its entirety. The manipulator control system processes scene 31 with a conventional edge-detection algorithm, identifies the target 5, extracts the remaining edge features in scene 31, and compares them against the edge-feature images stored in the feature library. After the comparison, the control system identifies the water tank 4 in scene 31 and uses it as the auxiliary positioning object to help the manipulator 2 position the target 5 to be grabbed.
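The comparison against stored features can be approximated by any rotation- and translation-invariant shape signature. One simple choice, sketched here with invented coordinates and a tolerance that is purely illustrative, is the sorted multiset of pairwise vertex distances:

```python
from itertools import combinations
from math import dist

def pair_signature(points):
    """Sorted pairwise distances: invariant to rotation and translation."""
    return sorted(dist(a, b) for a, b in combinations(points, 2))

def matches(extracted, stored, tol=1.0):
    """Compare two vertex sets by their pairwise-distance signatures."""
    sig_e, sig_s = pair_signature(extracted), pair_signature(stored)
    return len(sig_e) == len(sig_s) and all(
        abs(a - b) <= tol for a, b in zip(sig_e, sig_s))

# Stored tank-top rectangle vs. a translated detection of the same rectangle:
stored = [(0, 0), (400, 0), (400, 300), (0, 300)]
extracted = [(50, 20), (450, 20), (450, 320), (50, 320)]
print(matches(extracted, stored))  # → True
```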
The manipulator control system acquires the geometric features that the identified water tank 4 presents in the captured scene 31: the positions of the four vertices P1, P2, P3 and P4, and the angles of the triangles formed by any three of them. In scene 31, taking the target 5 and one edge P1P2 of the water tank 4 as an example: since the length of P1P2 is known and the bottom B of the target 5 is coplanar with edge P1P2, the positional relation between the bottom B of the target 5 and the vertex P2 of the tank can be derived by solid geometry, and the height h of the target 5 from its top T to its bottom B can be derived as well. This gives the positional relation between the target 5 and the water tank 4 in the photographed scene.
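The coplanar reasoning above can be sketched numerically: the known real length of P1P2 fixes the metric scale of the tank-top plane, so an image offset measured in that plane converts directly to millimetres. All coordinates below are invented for illustration, and the sketch assumes a roughly fronto-parallel view.

```python
from math import dist

def plane_scale(p1_px, p2_px, real_len_mm):
    """Millimetres per pixel along the plane containing edge P1P2."""
    return real_len_mm / dist(p1_px, p2_px)

p1, p2 = (100, 400), (500, 400)      # imaged tank corners (hypothetical)
scale = plane_scale(p1, p2, 400.0)   # edge known to be 400 mm → 1.0 mm/px
target_offset_px = (150, -80)        # target bottom B relative to vertex P2
offset_mm = tuple(scale * c for c in target_offset_px)
print(offset_mm)  # → (150.0, -80.0)
```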
Because the geometric features the water tank 4 presents in the photographed scene differ in angle and distance from the tank's pre-stored geometric features, the transformation needed to restore the presented features to the pre-stored ones can be recovered by image-correction techniques such as affine-transformation or perspective-transformation correction, and the positional relation between the camera 3 and the water tank 4 can then be derived from the changes in angle and distance encoded in that transformation.
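One concrete way a distance falls out of such a correction, sketched here under a simple pinhole-camera assumption (an illustration, not the patent's stated method, and all numbers are hypothetical): apparent size shrinks inversely with range, so a known edge length plus its observed pixel length yields the camera-to-object distance.

```python
def range_from_apparent_size(focal_px, real_mm, observed_px):
    """Pinhole estimate: range Z = f * W / w, with f and w in pixels, W in mm."""
    return focal_px * real_mm / observed_px

# A tank edge known to be 400 mm long appears 200 px long with f = 800 px:
z = range_from_apparent_size(800.0, 400.0, 200.0)
print(z)  # → 1600.0 (mm)
```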
The positional relation between the camera 3 and the target 5 is then derived from the positional relation between the camera 3 and the water tank 4 and the positional relation between the water tank 4 and the target 5 in the photographed scene. With this manipulator target grabbing and positioning method, an auxiliary positioning object helps the manipulator position the target during a grabbing operation; the geometric features of additional auxiliary positioning objects can be added by updating the database at any time, making it convenient for the manipulator to grab targets placed at different positions.
If no auxiliary positioning object is identified in the current photographed scene, for example because the water tank 4 was not captured in the image, the camera 3 is adjusted, for example its shooting angle, shooting distance, or image-capture parameters, and the scene where the target is located is photographed again until an auxiliary positioning object is identified.
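This fallback is an acquire-and-retry loop. A sketch with stand-in capture/detect/adjust callables, all of them hypothetical placeholders for the real camera driver and edge-feature matcher:

```python
def find_auxiliary(capture, detect, adjust, max_tries=4):
    """Photograph, look for an auxiliary positioning object, and adjust the
    camera between attempts until one is found or attempts run out."""
    for attempt in range(max_tries):
        frame = capture()
        found = detect(frame)
        if found is not None:
            return found
        adjust(attempt)  # e.g. change angle, distance, or capture parameters
    return None

# Toy stand-ins: detection only succeeds after two camera adjustments.
state = {"adjusted": 0}
capture = lambda: state["adjusted"]
detect = lambda frame: "water_tank" if frame >= 2 else None
def adjust(attempt):
    state["adjusted"] += 1

result = find_auxiliary(capture, detect, adjust)
print(result)  # → water_tank
```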

Claims (10)

1. A manipulator target grabbing and positioning method, characterized by comprising the following steps:
photographing the scene where the target is located with a camera that moves with the manipulator, identifying the target and an auxiliary positioning object in the scene, obtaining the geometric features the auxiliary positioning object presents in the photographed scene, and obtaining the positional relation between the target and the auxiliary positioning object in the photographed scene;
deriving the positional relation between the camera and the auxiliary positioning object from the geometric features the auxiliary positioning object presents in the photographed scene and the pre-stored geometric features of the auxiliary positioning object; and
deriving the positional relation between the camera and the target from the positional relation between the camera and the auxiliary positioning object and the positional relation between the auxiliary positioning object and the target in the photographed scene.
2. The manipulator target grabbing and positioning method according to claim 1, wherein identifying the target and the auxiliary positioning object in the scene specifically comprises: extracting edge features from the photographed scene, and identifying the target and the auxiliary positioning object from the extracted edge features.
3. The manipulator target grabbing and positioning method according to claim 1, wherein, if no auxiliary positioning object is identified in the currently photographed scene, the camera is adjusted and the scene where the target is located is photographed again.
4. The manipulator target grabbing and positioning method according to claim 3, wherein adjusting the camera comprises one or more of adjusting the shooting angle, adjusting the shooting distance, and adjusting the image-capture parameters.
5. The manipulator target grabbing and positioning method according to claim 1, wherein the pre-stored geometric features of the auxiliary positioning object include at least three intersection points located on the auxiliary positioning object and the distances between at least two pairs of those intersection points.
6. The manipulator target grabbing and positioning method according to claim 5, wherein the pre-stored geometric features of the auxiliary positioning object include a right angle and the lengths of its two right-angle sides.
7. The manipulator target grabbing and positioning method according to claim 1, wherein deriving the positional relation between the camera and the auxiliary positioning object from the geometric features the auxiliary positioning object presents in the photographed scene and the pre-stored geometric features of the auxiliary positioning object specifically comprises: obtaining the transformation needed to restore the geometric features the auxiliary positioning object presents in the photographed scene to the pre-stored geometric features of the auxiliary positioning object, and deriving the positional relation between the camera and the auxiliary positioning object from the changes in angle and distance in that transformation.
8. The manipulator target grabbing and positioning method according to claim 7, wherein the required transformation is obtained by affine-transformation correction or perspective-transformation correction.
9. A computer-readable storage medium storing a computer program, wherein the computer program, when executed by a processor, implements the manipulator target grabbing and positioning method according to any one of claims 1 to 8.
10. A manipulator, comprising a camera for photographing the area where the target is located, a processor, and the computer-readable storage medium according to claim 9, wherein the camera moves with the manipulator, the processor controls the camera to shoot, and the computer program on the storage medium is executable by the processor.
CN202010246874.0A 2020-03-31 2020-03-31 Manipulator target grabbing and positioning method, computer readable storage medium and manipulator Pending CN111390910A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010246874.0A CN111390910A (en) 2020-03-31 2020-03-31 Manipulator target grabbing and positioning method, computer readable storage medium and manipulator

Publications (1)

Publication Number Publication Date
CN111390910A 2020-07-10

Family

ID=71425949

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010246874.0A Pending CN111390910A (en) 2020-03-31 2020-03-31 Manipulator target grabbing and positioning method, computer readable storage medium and manipulator

Country Status (1)

Country Link
CN (1) CN111390910A (en)

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH08322121A (en) * 1995-05-23 1996-12-03 Mitsui Eng & Shipbuild Co Ltd Positioning method for remote operation manipulator and positioning support system
CN104400703A (en) * 2014-09-28 2015-03-11 沈阳拓荆科技有限公司 Auxiliary positioning tool and three-axis positioning method
CN106530276A (en) * 2016-10-13 2017-03-22 中科金睛视觉科技(北京)有限公司 Manipulator positioning method and system for grabbing of non-standard component
CN110788863A (en) * 2019-11-22 2020-02-14 上海原能细胞生物低温设备有限公司 Machine vision calibration method and mechanical arm positioning and grabbing method

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111844048A (en) * 2020-08-04 2020-10-30 合肥盛恩智能装备科技有限公司 Robot clamp grabbing algorithm
CN111844048B (en) * 2020-08-04 2021-09-28 合肥盛恩智能装备科技有限公司 Robot clamp grabbing algorithm
CN112459734A (en) * 2020-11-26 2021-03-09 湖南三一石油科技有限公司 Manipulator positioning method and device, manipulator and storage medium
CN112809672A (en) * 2020-12-31 2021-05-18 安徽飞凯电子技术有限公司 Target positioning system for mechanical arm

Similar Documents

Publication Publication Date Title
CN111452040B (en) System and method for associating machine vision coordinate space in a pilot assembly environment
CN109483554B (en) Robot dynamic grabbing method and system based on global and local visual semantics
JP2022028672A (en) System and method for automatic hand-eye calibration of vision system for robot motion
CN107618030B (en) Robot dynamic tracking grabbing method and system based on vision
US9957120B2 (en) Stowage pattern calculation device and stowage device for stowing plurality types of objects
CN111390910A (en) Manipulator target grabbing and positioning method, computer readable storage medium and manipulator
CN108827154B (en) Robot non-teaching grabbing method and device and computer readable storage medium
CN107471218B (en) Binocular vision-based hand-eye coordination method for double-arm robot
JP2018169403A5 (en)
JP6815309B2 (en) Operating system and program
JP2017505240A (en) Automatic calibration method for robotic systems using visual sensors
CN113379849B (en) Robot autonomous recognition intelligent grabbing method and system based on depth camera
WO2018043525A1 (en) Robot system, robot system control device, and robot system control method
CN110980276B (en) Method for implementing automatic casting blanking by three-dimensional vision in cooperation with robot
JP2012030320A (en) Work system, working robot controller, and work program
CN114074331A (en) Disordered grabbing method based on vision and robot
US20210197391A1 (en) Robot control device, robot control method, and robot control non-transitory computer readable medium
CN108711174B (en) Approximate parallel vision positioning system for mechanical arm
US20230123629A1 (en) 3d computer-vision system with variable spatial resolution
CN110533717A (en) A kind of target grasping means and device based on binocular vision
Fan et al. An automatic robot unstacking system based on binocular stereo vision
CN113858214B (en) Positioning method and control system for robot operation
KR101820241B1 (en) Apparatus for motion estimation of object using gripper and method thereof
CN111015670B (en) Mechanical arm and method for positioning, identifying and processing parts by using same
CN113538459A (en) Multi-mode grabbing obstacle avoidance detection optimization method based on drop point area detection

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination