CN111360821A - Picking control method, device and equipment and computer readable storage medium - Google Patents


Info

Publication number
CN111360821A
CN111360821A (application CN202010108488.5A)
Authority
CN
China
Prior art keywords
mechanical arm
coordinate
coordinate system
target
target object
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202010108488.5A
Other languages
Chinese (zh)
Inventor
李京兵
史晨阳
孙林
黄梦醒
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hainan University
Original Assignee
Hainan University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hainan University filed Critical Hainan University
Priority to CN202010108488.5A priority Critical patent/CN111360821A/en
Publication of CN111360821A publication Critical patent/CN111360821A/en
Pending legal-status Critical Current

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls
    • B25J9/1656Programme controls characterised by programming, planning systems for manipulators
    • B25J9/1664Programme controls characterised by programming, planning systems for manipulators characterised by motion, path, trajectory planning
    • AHUMAN NECESSITIES
    • A01AGRICULTURE; FORESTRY; ANIMAL HUSBANDRY; HUNTING; TRAPPING; FISHING
    • A01DHARVESTING; MOWING
    • A01D45/00Harvesting of standing crops
    • A01D45/006Harvesting of standing crops of tomatoes
    • AHUMAN NECESSITIES
    • A01AGRICULTURE; FORESTRY; ANIMAL HUSBANDRY; HUNTING; TRAPPING; FISHING
    • A01DHARVESTING; MOWING
    • A01D46/00Picking of fruits, vegetables, hops, or the like; Devices for shaking trees or shrubs
    • A01D46/30Robotic devices for individually picking crops
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls
    • B25J9/1602Programme controls characterised by the control system, structure, architecture
    • B25J9/161Hardware, e.g. neural networks, fuzzy logic, interfaces, processor
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls
    • B25J9/1656Programme controls characterised by programming, planning systems for manipulators
    • B25J9/1669Programme controls characterised by programming, planning systems for manipulators characterised by special application, e.g. multi-arm co-operation, assembly, grasping
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls
    • B25J9/1694Programme controls characterised by use of sensors other than normal servo-feedback from position, speed or acceleration sensors, perception control, multi-sensor controlled systems, sensor fusion
    • B25J9/1697Vision controlled systems

Landscapes

  • Engineering & Computer Science (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Environmental Sciences (AREA)
  • Automation & Control Theory (AREA)
  • Artificial Intelligence (AREA)
  • Physics & Mathematics (AREA)
  • Evolutionary Computation (AREA)
  • Fuzzy Systems (AREA)
  • Mathematical Physics (AREA)
  • Software Systems (AREA)
  • Manipulator (AREA)

Abstract

The invention discloses a picking control method, which comprises the following steps: when a picking control request is received, acquiring an initial three-dimensional coordinate of a target object in a visual sensor coordinate system; acquiring the position relation between a visual sensor and a mechanical arm; calculating a target three-dimensional coordinate of the target object in a mechanical arm coordinate system by combining the position relation and the initial three-dimensional coordinate; respectively calculating target rotation angles corresponding to all steering engines of the mechanical arm according to the target three-dimensional coordinates; and controlling each steering engine to rotate by a corresponding target rotation angle, and sending a grabbing instruction to the claw steering engine of the mechanical arm so that the claw steering engine controls the claw of the mechanical arm to pick a target object. By applying the technical scheme provided by the embodiment of the invention, the accuracy of controlling the mechanical arm is improved, the picking accuracy of the target object is improved, and the picking efficiency is greatly improved. The invention also discloses a picking control device, equipment and a storage medium, and has corresponding technical effects.

Description

Picking control method, device and equipment and computer readable storage medium
Technical Field
The invention relates to the technical field of automatic control, in particular to a picking control method, a picking control device, picking control equipment and a computer readable storage medium.
Background
With the continuous progress and development of science and technology, the picking robot becomes the key for replacing the manual work and improving the agricultural output value. The production of different kinds of picking robots, such as tomato picking robots, apple picking robots, grape picking robots, etc., is of great significance to the development of agricultural science and technology. The control of the mechanical arm is very important in the process of executing picking tasks, wherein the establishment and solution of a mechanical arm motion model and the target recognition and positioning are key problems of the control of the mechanical arm of the picking robot. Meanwhile, the picking robot usually faces some unstructured and complex external environments, in order to enable the picking robot arm to have the capability of acquiring information from unknown environments, an external visual sensor needs to be equipped for the picking robot arm, the visual sensor is used as the 'eyes' of the picking robot, can be used for sensing the environment and realizing the recognition of agricultural target fruits and acquiring the three-dimensional coordinate information of target objects, and brings convenience to the control of the picking robot arm.
In the existing picking approach, a target object is identified by a visual sensor and positioned in the visual sensor coordinate system, and a mechanical arm is then controlled to pick it. To prevent the mechanical arm from blocking the visual sensor's view of the target object, a certain distance exists between the visual sensor and the mechanical arm in space; controlling the mechanical arm directly from positioning information expressed in the sensor frame is therefore inaccurate, the picking accuracy of the target object is low, and picking efficiency is directly affected.
In summary, how to effectively solve the problems of low accuracy in controlling the mechanical arm, low picking accuracy of the target object, and the resulting impact on picking efficiency is a problem that urgently needs to be solved by those skilled in the art.
Disclosure of Invention
The invention aims to provide a picking control method, which greatly improves the accuracy of controlling a mechanical arm, improves the picking accuracy of a target object and greatly improves the picking efficiency; it is another object of the present invention to provide a picking control device, apparatus and computer readable storage medium.
In order to solve the technical problems, the invention provides the following technical scheme:
a picking control method comprising:
when a picking control request is received, acquiring an initial three-dimensional coordinate of a target object in a visual sensor coordinate system;
acquiring the position relation between a visual sensor and a mechanical arm;
calculating a target three-dimensional coordinate of the target object in a mechanical arm coordinate system by combining the position relation and the initial three-dimensional coordinate;
respectively calculating target rotation angles corresponding to all steering engines of the mechanical arm according to the target three-dimensional coordinates;
and controlling the steering engines to rotate corresponding target rotation angles, and sending a grabbing instruction to the claw steering engines of the mechanical arm so that the claw steering engines control the claws of the mechanical arm to pick the target object.
In one embodiment of the present invention, acquiring initial three-dimensional coordinates of a target object in a vision sensor coordinate system when a picking control request is received comprises:
when a picking control request is received, identifying a target object by using a color-based image segmentation method of the visual sensor, and acquiring a two-dimensional coordinate of the target object;
and solving the initial three-dimensional coordinate corresponding to the two-dimensional coordinate by using a least square method according to the conversion relation between the image plane coordinate system and the camera coordinate system.
In an embodiment of the present invention, the vision sensor is a binocular vision camera, and before the identifying the target object by using the vision sensor based on the image segmentation method of color, the method further includes:
and carrying out internal reference correction operation on the binocular vision camera.
In a specific embodiment of the present invention, the calculating a target three-dimensional coordinate of the target object in a robot arm coordinate system by using the position relationship and the initial three-dimensional coordinate includes:
acquiring an origin Cartesian coordinate of a visual sensor coordinate system origin corresponding to the visual sensor coordinate system in the mechanical arm coordinate system;
acquiring a rotation angle of the mechanical arm coordinate system relative to the visual sensor coordinate system on a z-axis;
and calculating the target three-dimensional coordinate of the target object in the mechanical arm coordinate system according to the original point Cartesian coordinate, the rotation angle and the initial three-dimensional coordinate.
A picking control device comprising:
the initial coordinate acquisition module is used for acquiring an initial three-dimensional coordinate of a target object in the visual sensor coordinate system when a picking control request is received;
the position relation acquisition module is used for acquiring the position relation between the visual sensor and the mechanical arm;
the target coordinate calculation module is used for calculating a target three-dimensional coordinate of the target object in a mechanical arm coordinate system by combining the position relation and the initial three-dimensional coordinate;
the rotation angle calculation module is used for calculating target rotation angles corresponding to the steering engines of the mechanical arm according to the target three-dimensional coordinates;
and the picking control module is used for controlling each steering engine to rotate corresponding target rotation angles and sending a grabbing instruction to the claw steering engine of the mechanical arm so that the claw steering engine controls the claw of the mechanical arm to pick the target object.
In one embodiment of the present invention, the initial coordinate acquiring module includes:
the two-dimensional coordinate acquisition sub-module is used for identifying a target object by using a color-based image segmentation method of the visual sensor and acquiring a two-dimensional coordinate of the target object when a picking control request is received;
and the initial coordinate calculation submodule is used for solving the initial three-dimensional coordinate corresponding to the two-dimensional coordinate by using a least square method according to the conversion relation between the image plane coordinate system and the camera coordinate system.
In a specific embodiment of the present invention, the vision sensor is a binocular vision camera, further comprising:
and the internal reference correction module is used for carrying out internal reference correction operation on the binocular vision camera before the target object is identified by using the vision sensor based on the color image segmentation method.
In a specific embodiment of the present invention, the binocular vision camera is horizontally disposed, and the binocular vision camera and the base of the robot arm are located at the same horizontal plane, and the target coordinate calculation module includes:
the origin coordinate acquisition submodule is used for acquiring origin Cartesian coordinates of a visual sensor coordinate system origin corresponding to the visual sensor coordinate system in the mechanical arm coordinate system;
a rotation angle acquisition submodule for acquiring a rotation angle of the robot arm coordinate system with respect to the vision sensor coordinate system on a z-axis;
and the target coordinate calculation sub-module is used for calculating the target three-dimensional coordinate of the target object in the mechanical arm coordinate system according to the original point Cartesian coordinate, the rotation angle and the initial three-dimensional coordinate.
A picking control apparatus comprising:
a memory for storing a computer program;
a processor for implementing the steps of the picking control method as described above when executing the computer program.
A computer-readable storage medium, having stored thereon a computer program which, when executed by a processor, carries out the steps of a picking control method as set out above.
By applying the method provided by the embodiment of the invention, when a picking control request is received, the initial three-dimensional coordinates of the target object in the coordinate system of the visual sensor are obtained; acquiring the position relation between a visual sensor and a mechanical arm; calculating a target three-dimensional coordinate of the target object in a mechanical arm coordinate system by combining the position relation and the initial three-dimensional coordinate; respectively calculating target rotation angles corresponding to all steering engines of the mechanical arm according to the target three-dimensional coordinates; and controlling each steering engine to rotate by a corresponding target rotation angle, and sending a grabbing instruction to the claw steering engine of the mechanical arm so that the claw steering engine controls the claw of the mechanical arm to pick a target object. After the initial three-dimensional coordinate of the target object in the visual sensor coordinate system is obtained, the target three-dimensional coordinate of the target object in the mechanical arm coordinate system is calculated according to the initial three-dimensional coordinate, and the rotation angle and picking action of each steering engine of the mechanical arm are controlled based on the target three-dimensional coordinate in the mechanical arm coordinate system.
Correspondingly, the embodiment of the invention also provides a picking control device, equipment and a computer readable storage medium corresponding to the picking control method, which have the technical effects and are not described herein again.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art will be briefly described below, it is obvious that the drawings in the following description are only some embodiments of the present invention, and for those skilled in the art, other drawings can be obtained according to the drawings without creative efforts.
FIG. 1 is a flow chart of an embodiment of a picking control method according to the present invention;
FIG. 2 is a two-dimensional schematic diagram of the rotation angle a of the bottom steering engine in the XY plane;
FIG. 3 is a schematic diagram of spatial modeling when the distance H from the arm steering engine to the bottom surface of the mechanical arm is greater than the absolute value of the z coordinate of the target object;
FIG. 4 is a schematic diagram of spatial modeling when the distance H from the arm steering engine to the bottom surface of the mechanical arm is smaller than the absolute value of the z coordinate of the target object;
FIG. 5 is a flow chart of another embodiment of a picking control method according to the present invention;
fig. 6 is a block diagram of a picking control device according to an embodiment of the present invention;
fig. 7 is a block diagram of a picking control device according to an embodiment of the present invention.
Detailed Description
In order that those skilled in the art will better understand the disclosure, the invention will be described in further detail with reference to the accompanying drawings and specific embodiments. It is to be understood that the described embodiments are merely exemplary of the invention, and not restrictive of the full scope of the invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
The first embodiment is as follows:
referring to fig. 1, fig. 1 is a flowchart of an implementation of a picking control method according to an embodiment of the present invention, where the method may include the following steps:
S101: When a picking control request is received, the initial three-dimensional coordinates of the target object in the vision sensor coordinate system are acquired.
When a picking operation is needed, a picking control request can be sent to the picking control center; when the picking control center receives the picking control request, it acquires the initial three-dimensional coordinates of the target object in the visual sensor coordinate system. The vision sensor may be a binocular vision camera.
The target object may be any one of objects to be picked corresponding to the picking control request.
S102: and acquiring the position relation between the visual sensor and the mechanical arm.
When the visual sensor and the mechanical arm are installed, the working-radius range of the mechanical arm is recorded as the identification area, and to ensure that the identification area is not blocked, the distance between the visual sensor and the identification area is generally set to 0.8 to 20.0 m. The position relationship between the visual sensor and the mechanical arm can then be acquired; it may comprise, for example, the distance between the center point of the visual sensor and the center point of the mechanical arm, and the angles between the line connecting the visual sensor and the mechanical arm and each axis of the world coordinate system.
S103: and calculating the target three-dimensional coordinate of the target object in the mechanical arm coordinate system by combining the position relation and the initial three-dimensional coordinate.
After the position relationship between the visual sensor and the mechanical arm and the initial three-dimensional coordinate of the target object in the visual sensor coordinate system are obtained, the target three-dimensional coordinate of the target object in the mechanical arm coordinate system can be calculated by combining the position relationship and the initial three-dimensional coordinate, so that the three-dimensional coordinate of the target object is converted into the mechanical arm coordinate system.
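The conversion described in S103 amounts to a rigid-body transform between the two frames. The following is a minimal sketch, not the patent's exact formula; the z-axis rotation convention, the sign choices and the function name `camera_to_arm` are illustrative assumptions:

```python
import math

def camera_to_arm(p_cam, origin_in_arm, theta):
    """Convert a target point from the vision-sensor frame to the
    mechanical-arm frame.

    p_cam:         (X, Y, Z) of the target in the camera coordinate system
    origin_in_arm: (x1, y1, z1), the camera origin expressed in the arm frame
    theta:         rotation (radians) of the arm frame relative to the
                   camera frame about the z-axis
    """
    X, Y, Z = p_cam
    x1, y1, z1 = origin_in_arm
    c, s = math.cos(theta), math.sin(theta)
    # Rotate about z, then translate by the camera origin's position.
    return (x1 + c * X - s * Y,
            y1 + s * X + c * Y,
            z1 + Z)
```

With theta = 0 the frames differ only by a translation, so a target at (3, 4, 5) seen by a camera whose origin sits at (1, 2, 0) in the arm frame maps to (4, 6, 5).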
S104: and respectively calculating the target rotation angles corresponding to the steering engines of the mechanical arm according to the target three-dimensional coordinates.
After the target three-dimensional coordinates of the target object in the mechanical arm coordinate system are obtained through calculation, the target rotation angles corresponding to the steering engines of the mechanical arm can be calculated according to the target three-dimensional coordinates.
A four-degree-of-freedom mechanical arm may specifically be used. The four target rotation angles to be calculated are the bottom steering engine rotation angle a, the arm steering engine rotation angle c, the elbow steering engine rotation angle e, and the opening and closing angle f of the claw of the mechanical arm. The calculation of the target rotation angle corresponding to each steering engine may proceed as follows:
referring to fig. 2, fig. 2 is a two-dimensional schematic diagram of a rotation angle a of the steering engine at the bottom of the plane XY plane. The bottom of the mechanical arm is used as a coordinate system origin O, the mechanical arm is initially positioned on a Y axis, the rotation angle of a steering engine at the bottom is a, and the range of a is (0,180) degrees. Since the coordinates of the target object are known as (x, y), a is known as arctan (y/x) by geometric calculation.
Referring to fig. 3, fig. 3 is a schematic diagram of the spatial modeling when the distance H from the arm steering engine of the mechanical arm to the bottom surface is greater than the absolute value of the z coordinate of the target object. First, known parameters are defined: L1 is the projection on the XY plane of the distance from the arm steering engine to the target object; L2 is the distance from the arm steering engine to the elbow steering engine; L3 is the distance from the elbow steering engine of the mechanical arm to the target object (the distance from the claw steering engine to the target object is very small, the two almost coincide in the figure, so it is ignored). When the distance H from the arm steering engine of the mechanical arm to the bottom surface is greater than the absolute value of the z coordinate of the target object, the values of the variables b and e are respectively calculated through the following two cosine-theorem formulas:
[The two cosine-theorem formulas for b and e are rendered as equation images in the original publication.]
and the rotation angle c of the arm steering engine and the angle e of the elbow steering engine to be rotated can be calculated by calculating the values of b and e. The rotation angle of the arm steering engine is as follows:
[The formula for the arm steering engine rotation angle c is rendered as an equation image in the original publication.]
the elbow steering engine has a rotation angle e.
Referring to fig. 4, fig. 4 is a schematic diagram of the spatial modeling when the distance H from the arm steering engine of the mechanical arm to the bottom surface is smaller than the absolute value of the z coordinate of the target object. Similarly, the values of the variables b and e are first calculated through the following two cosine-theorem formulas:
[The two cosine-theorem formulas for b and e are rendered as equation images in the original publication.]
and the rotation angle c of the arm steering engine and the angle e of the elbow steering engine to be rotated can be calculated by calculating the values of b and e.
Rotation angle of the arm steering engine:
[The formula for the arm steering engine rotation angle c is rendered as an equation image in the original publication.]
the elbow steering engine has a rotation angle e.
S105: and controlling each steering engine to rotate by a corresponding target rotation angle, and sending a grabbing instruction to the claw steering engine of the mechanical arm so that the claw steering engine controls the claw of the mechanical arm to pick a target object.
After the target rotation angles corresponding to the steering engines of the mechanical arm are obtained through calculation, the steering engines can be controlled to rotate corresponding target rotation angles, and a grabbing instruction is sent to the claw steering engine of the mechanical arm, so that the claw steering engine controls claws of the mechanical arm to pick a target object.
Let f denote the opening and closing angle of the claw of the mechanical arm. The initial angle is set to 0 degrees, indicating that the claw is open; when grabbing is performed, the claw closes and the angle changes to 180 degrees.
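The open/closed mapping of the claw can be stated as a one-line helper; the 0 and 180 degree values come from the text, while the function itself is illustrative:

```python
def gripper_command(close):
    """Claw-servo angle f: 0 degrees = claw open, 180 degrees = claw
    closed, per the convention described above."""
    return 180 if close else 0
```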
By applying the method provided by the embodiment of the invention, when a picking control request is received, the initial three-dimensional coordinates of the target object in the coordinate system of the visual sensor are obtained; acquiring the position relation between a visual sensor and a mechanical arm; calculating a target three-dimensional coordinate of the target object in a mechanical arm coordinate system by combining the position relation and the initial three-dimensional coordinate; respectively calculating target rotation angles corresponding to all steering engines of the mechanical arm according to the target three-dimensional coordinates; and controlling each steering engine to rotate by a corresponding target rotation angle, and sending a grabbing instruction to the claw steering engine of the mechanical arm so that the claw steering engine controls the claw of the mechanical arm to pick a target object. After the initial three-dimensional coordinate of the target object in the visual sensor coordinate system is obtained, the target three-dimensional coordinate of the target object in the mechanical arm coordinate system is calculated according to the initial three-dimensional coordinate, and the rotation angle and picking action of each steering engine of the mechanical arm are controlled based on the target three-dimensional coordinate in the mechanical arm coordinate system.
It should be noted that, based on the first embodiment, the embodiment of the present invention further provides a corresponding improvement scheme. In the following embodiments, steps that are the same as or correspond to those in the first embodiment may be referred to each other, and corresponding advantageous effects may also be referred to each other, which are not described in detail in the following modified embodiments.
Example two:
referring to fig. 5, fig. 5 is a flowchart of another implementation of the picking control method in the embodiment of the present invention, and the method may include the following steps:
S501: When a picking control request is received, an internal reference correction operation is performed on the binocular vision camera.
The vision sensor may specifically employ a binocular vision camera, and when a picking control request is received, an internal reference correction operation may be performed on it. The two cameras of the binocular vision camera are calibrated; the internal parameters are parameters related to the camera's own characteristics, from which the focal length and pixel size of the binocular vision camera can be determined. The camera matrix is composed of an internal reference matrix and an external reference matrix, which can be obtained by orthogonal-triangular (QR) decomposition of the camera matrix. The internal parameters include the focal length, principal point, skew coefficient and distortion coefficients, and the left-camera internal reference matrix can be expressed as:
| f_x1    0    c_x1 |
|   0   f_y1   c_y1 |
|   0     0      1  |
the right internal reference can be expressed as:
| f_x2    0    c_x2 |
|   0   f_y2   c_y2 |
|   0     0      1  |
wherein f_x1 represents the focal length, in pixels, along the horizontal axis of the left camera image; f_y1 the focal length, in pixels, along the vertical axis of the left camera image; c_x1 the offset, in pixels, between the left camera optical axis and the image center along the horizontal axis; c_y1 the offset, in pixels, between the left camera optical axis and the image center along the vertical axis; f_x2 the focal length, in pixels, along the horizontal axis of the right camera image; f_y2 the focal length, in pixels, along the vertical axis of the right camera image; c_x2 the offset, in pixels, between the right camera optical axis and the image center along the horizontal axis; and c_y2 the offset, in pixels, between the right camera optical axis and the image center along the vertical axis.
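The internal parameters above define the standard pinhole camera matrix; a minimal NumPy sketch (skew and distortion omitted, function names are illustrative assumptions):

```python
import numpy as np

def intrinsic_matrix(fx, fy, cx, cy):
    """Pinhole intrinsic matrix built from the focal lengths and
    principal point described in the text (skew assumed zero)."""
    return np.array([[fx, 0.0, cx],
                     [0.0, fy, cy],
                     [0.0, 0.0, 1.0]])

def project(K, p_cam):
    """Project a 3-D camera-frame point to pixel coordinates."""
    X, Y, Z = p_cam
    u = K[0, 0] * X / Z + K[0, 2]
    v = K[1, 1] * Y / Z + K[1, 2]
    return u, v
```

A point on the optical axis projects exactly to the principal point (c_x, c_y).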
By performing the internal reference correction operation on the binocular vision camera, the captured image is corrected according to the distortion coefficients of the binocular vision camera, which improves the accuracy of controlling the mechanical arm and the picking accuracy of the target object.
S502: Identifying the target object using a color-based image segmentation method with the binocular vision camera, and acquiring the two-dimensional coordinates of the target object.
The target object can be identified by the binocular vision camera using a color-based image segmentation method, and the two-dimensional coordinates of the target object acquired. The specific process may include first converting each color pixel in the image into xy chromaticity coordinates, then finding sets of points in the chromaticity plane using the K-means algorithm, where each set of points corresponds to a cluster of pixels of a distinguishable color. The K-means algorithm must be given the number of point sets to find; assuming the scene contains n elements of different colors, the pixels are clustered into n chromaticity classes. A pixel in the image has a value c ∈ {0, 1, 2, 3, ..., n}, which indicates the class to which the corresponding input pixel is assigned. Pixels belonging to class 2, for example, form a partial image; displaying all class-2 pixels as white shows the objects in the original image whose color corresponds to class 2. The target object may contain holes due to specular reflection; the image processing result is improved through morphological operations, and the image pixels are divided into target and non-target. After the target area is identified, the two-dimensional coordinates of the center of the target area in the left and right images of the binocular vision camera are calculated using a Visual Studio + OpenCV program.
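The chromaticity conversion and clustering described above can be sketched as follows. This is a plain illustrative implementation (a real system would likely use OpenCV's built-in k-means), and all names are assumptions:

```python
import numpy as np

def chromaticity(rgb):
    """Convert RGB pixels (N x 3 float array) to xy chromaticity
    coordinates: x = R/(R+G+B), y = G/(R+G+B)."""
    s = rgb.sum(axis=1, keepdims=True)
    s[s == 0] = 1.0               # guard against black pixels
    return rgb[:, :2] / s

def kmeans(points, k, iters=20, seed=0):
    """Plain k-means on 2-D points; returns (centers, labels)."""
    rng = np.random.default_rng(seed)
    centers = points[rng.choice(len(points), k, replace=False)]
    for _ in range(iters):
        # Distance of every point to every center, then nearest-center label.
        d = np.linalg.norm(points[:, None, :] - centers[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        for j in range(k):
            if (labels == j).any():
                centers[j] = points[labels == j].mean(axis=0)
    return centers, labels
```

On well-separated chromaticity clusters the labels recover the color groups regardless of the random initialization.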
S503: and solving an initial three-dimensional coordinate corresponding to the two-dimensional coordinate by using a least square method according to the conversion relation between the image plane coordinate system and the camera coordinate system.
After the two-dimensional coordinates of the target object are acquired, according to the transformation relation between the image plane coordinate system and the camera coordinate system, the initial three-dimensional coordinates corresponding to the two-dimensional coordinates are solved by using a least square method, and can be recorded as (X, Y, Z).
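The least-squares solve in S503 can be illustrated with the standard linear (DLT) triangulation. This is a generic sketch rather than the patent's exact derivation; P1 and P2 denote assumed 3x4 projection matrices of the left and right cameras:

```python
import numpy as np

def triangulate(P1, P2, uv1, uv2):
    """Linear least-squares (DLT) triangulation of one matched point.

    P1, P2:   3x4 projection matrices of the left/right cameras
    uv1, uv2: matched pixel coordinates of the target center
    Returns the 3-D point as a length-3 array.
    """
    u1, v1 = uv1
    u2, v2 = uv2
    # Each pixel measurement yields two linear constraints on X = (x,y,z,1).
    A = np.array([u1 * P1[2] - P1[0],
                  v1 * P1[2] - P1[1],
                  u2 * P2[2] - P2[0],
                  v2 * P2[2] - P2[1]])
    # Least-squares solution of A @ X ~ 0 is the smallest right singular vector.
    _, _, vt = np.linalg.svd(A)
    X = vt[-1]
    return X[:3] / X[3]
```

Each matched pixel contributes two rows to A; the SVD returns the homogeneous point minimizing the algebraic residual in the least-squares sense.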
S504: when the binocular vision camera is horizontally placed and the binocular vision camera and the base of the mechanical arm are located on the same horizontal plane, the original point Cartesian coordinates of the visual sensor coordinate system corresponding to the visual sensor coordinate system in the mechanical arm coordinate system are obtained.
When mounting the binocular vision camera and the mechanical arm, the camera can be placed horizontally and arranged so that it and the base of the mechanical arm lie in the same horizontal plane. In this case, the X axes of the mechanical arm base coordinate system and the binocular camera coordinate system are parallel but opposite in direction, the Y axes are parallel but opposite in direction, and the Z axes are parallel and identical in direction. The Cartesian coordinates of the origin of the vision sensor coordinate system in the mechanical arm coordinate system are then acquired, and may be recorded as (x1, y1, z1).
S505: the rotation angle of the robot arm coordinate system with respect to the vision sensor coordinate system on the z-axis is acquired.
The rotation angle of the robot arm coordinate system with respect to the vision sensor coordinate system on the z-axis is obtained and may be denoted as θ.
S506: and calculating the target three-dimensional coordinates of the target object in the mechanical arm coordinate system according to the origin Cartesian coordinates, the rotation angle and the initial three-dimensional coordinates.
After the initial three-dimensional coordinates (X, Y, Z) of the target object in the vision sensor coordinate system, the origin Cartesian coordinates (x1, y1, z1) of the origin of the vision sensor coordinate system in the mechanical arm coordinate system, and the rotation angle θ of the mechanical arm coordinate system relative to the vision sensor coordinate system about the z-axis have all been acquired, the target three-dimensional coordinates of the target object in the mechanical arm coordinate system can be calculated from the origin Cartesian coordinates, the rotation angle and the initial three-dimensional coordinates, using the following formula:
x2 = X·cosθ − Y·sinθ + x1
y2 = X·sinθ + Y·cosθ + y1
z2 = Z + z1
wherein (x2, y2, z2) denotes the target three-dimensional coordinates of the target object in the mechanical arm coordinate system.
When the binocular vision camera is horizontally placed and the binocular vision camera and the base of the mechanical arm are located on the same horizontal plane, the rotation angle θ of the mechanical arm coordinate system relative to the vision sensor coordinate system about the z-axis is π. Arranging the positions of the binocular vision camera and the mechanical arm in this way reduces the complexity of calculating the target three-dimensional coordinates of the target object in the mechanical arm coordinate system.
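The S506 transform can be sketched as follows, assuming the standard rigid-body form (a rotation by θ about the z-axis followed by a translation by the camera origin's coordinates in the arm frame); the function name and argument conventions are choices made here:

```python
import numpy as np

def camera_to_arm(point_cam, origin, theta):
    """Map the initial coordinates (X, Y, Z) in the vision-sensor frame into
    the mechanical-arm frame: rotate by theta about the z-axis, then translate
    by the camera origin's Cartesian coordinates (x1, y1, z1) in the arm frame."""
    c, s = np.cos(theta), np.sin(theta)
    rot_z = np.array([[c, -s, 0.0],
                      [s,  c, 0.0],
                      [0.0, 0.0, 1.0]])
    return rot_z @ np.asarray(point_cam, dtype=float) + np.asarray(origin, dtype=float)
```

With θ = π this reduces to x2 = x1 − X, y2 = y1 − Y, z2 = z1 + Z, matching the simplified camera placement described above.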
S507: and respectively calculating the target rotation angles corresponding to the steering engines of the mechanical arm according to the target three-dimensional coordinates.
S508: and controlling each steering engine to rotate by a corresponding target rotation angle, and sending a grabbing instruction to the claw steering engine of the mechanical arm so that the claw steering engine controls the claw of the mechanical arm to pick a target object.
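Step S507 depends on the arm's specific geometry, which the method leaves open. As one hedged illustration only: for a hypothetical arm with a base yaw steering engine and a two-link shoulder/elbow pair (the link lengths l1 and l2 are assumed parameters, not values from the patent), the target rotation angles can be computed with a standard law-of-cosines inverse-kinematics solution:

```python
import numpy as np

def servo_angles(target, l1, l2):
    """Target rotation angles for a hypothetical 3-DOF arm: a base yaw
    servo plus a two-link planar shoulder/elbow pair with lengths l1, l2.
    Returns (base, shoulder, elbow) in radians."""
    x, y, z = target
    base = np.arctan2(y, x)                 # rotate the base toward the target
    r = np.hypot(x, y)                      # horizontal reach in the arm's plane
    d2 = r * r + z * z                      # squared distance shoulder -> target
    # law of cosines gives the elbow angle
    cos_elbow = (d2 - l1 * l1 - l2 * l2) / (2.0 * l1 * l2)
    if abs(cos_elbow) > 1.0:
        raise ValueError("target out of reach")
    elbow = np.arccos(cos_elbow)            # one of the two IK branches
    shoulder = np.arctan2(z, r) - np.arctan2(l2 * np.sin(elbow),
                                             l1 + l2 * np.cos(elbow))
    return base, shoulder, elbow
```

A real picking arm would add joint-limit checks and choose between the two elbow branches; here the positive-arccos branch is taken for simplicity.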
Corresponding to the above method embodiment, the embodiment of the invention also provides a picking control device, and the picking control device described below and the picking control method described above can be correspondingly referred to each other.
Referring to fig. 6, fig. 6 is a block diagram of a picking control device according to an embodiment of the present invention, where the picking control device may include:
an initial coordinate obtaining module 61, configured to obtain an initial three-dimensional coordinate of the target object in a visual sensor coordinate system when the picking control request is received;
a position relation obtaining module 62, configured to obtain a position relation between the visual sensor and the mechanical arm;
a target coordinate calculation module 63, configured to calculate a target three-dimensional coordinate of the target object in the mechanical arm coordinate system by combining the position relationship and the initial three-dimensional coordinate;
the rotation angle calculation module 64 is used for calculating target rotation angles corresponding to the steering engines of the mechanical arm according to the target three-dimensional coordinates;
and the picking control module 65 is used for controlling each steering engine to rotate by a corresponding target rotation angle and sending a grabbing instruction to the claw steering engine of the mechanical arm so that the claw steering engine controls the claw of the mechanical arm to pick a target object.
By applying the device provided by the embodiment of the invention, when a picking control request is received, the initial three-dimensional coordinates of the target object in the coordinate system of the visual sensor are obtained; acquiring the position relation between a visual sensor and a mechanical arm; calculating a target three-dimensional coordinate of the target object in a mechanical arm coordinate system by combining the position relation and the initial three-dimensional coordinate; respectively calculating target rotation angles corresponding to all steering engines of the mechanical arm according to the target three-dimensional coordinates; and controlling each steering engine to rotate by a corresponding target rotation angle, and sending a grabbing instruction to the claw steering engine of the mechanical arm so that the claw steering engine controls the claw of the mechanical arm to pick a target object. After the initial three-dimensional coordinate of the target object in the visual sensor coordinate system is obtained, the target three-dimensional coordinate of the target object in the mechanical arm coordinate system is calculated according to the initial three-dimensional coordinate, and the rotation angle and picking action of each steering engine of the mechanical arm are controlled based on the target three-dimensional coordinate in the mechanical arm coordinate system.
In one embodiment of the present invention, the initial coordinate obtaining module 61 includes:
the two-dimensional coordinate acquisition sub-module is used for identifying a target object by using a visual sensor based on a color image segmentation method and acquiring a two-dimensional coordinate of the target object when a picking control request is received;
and the initial coordinate calculation submodule is used for solving an initial three-dimensional coordinate corresponding to the two-dimensional coordinate by using a least square method according to the conversion relation between the image plane coordinate system and the camera coordinate system.
In a specific embodiment of the present invention, the vision sensor is a binocular vision camera, further comprising:
and the internal reference correction module is used for carrying out internal reference correction operation on the binocular vision camera before the target object is identified by using the vision sensor based on the color image segmentation method.
In one embodiment of the present invention, the binocular vision camera is horizontally disposed, and the binocular vision camera and the base of the robot arm are located at the same horizontal plane, and the target coordinate calculating module 63 includes:
the origin coordinate acquisition sub-module is used for acquiring the origin Cartesian coordinates of the origin of the visual sensor coordinate system in the mechanical arm coordinate system;
the rotation angle acquisition submodule is used for acquiring the rotation angle of the mechanical arm coordinate system relative to the visual sensor coordinate system on the z axis;
and the target coordinate calculation submodule is used for calculating the target three-dimensional coordinates of the target object in the mechanical arm coordinate system according to the origin Cartesian coordinates, the rotation angle and the initial three-dimensional coordinates.
In correspondence with the above method embodiment, referring to fig. 7, fig. 7 is a schematic diagram of a picking control apparatus provided by the present invention, which may include:
a memory 71 for storing a computer program;
the processor 72, when executing the computer program stored in the memory 71, may implement the following steps:
when a picking control request is received, acquiring an initial three-dimensional coordinate of a target object in a visual sensor coordinate system; acquiring the position relation between a visual sensor and a mechanical arm; calculating a target three-dimensional coordinate of the target object in a mechanical arm coordinate system by combining the position relation and the initial three-dimensional coordinate; respectively calculating target rotation angles corresponding to all steering engines of the mechanical arm according to the target three-dimensional coordinates; and controlling each steering engine to rotate by a corresponding target rotation angle, and sending a grabbing instruction to the claw steering engine of the mechanical arm so that the claw steering engine controls the claw of the mechanical arm to pick a target object.
For the introduction of the device provided by the present invention, please refer to the above method embodiment, which is not described herein again.
Corresponding to the above method embodiment, the present invention further provides a computer-readable storage medium having a computer program stored thereon, the computer program, when executed by a processor, implementing the steps of:
when a picking control request is received, acquiring an initial three-dimensional coordinate of a target object in a visual sensor coordinate system; acquiring the position relation between a visual sensor and a mechanical arm; calculating a target three-dimensional coordinate of the target object in a mechanical arm coordinate system by combining the position relation and the initial three-dimensional coordinate; respectively calculating target rotation angles corresponding to all steering engines of the mechanical arm according to the target three-dimensional coordinates; and controlling each steering engine to rotate by a corresponding target rotation angle, and sending a grabbing instruction to the claw steering engine of the mechanical arm so that the claw steering engine controls the claw of the mechanical arm to pick a target object.
The computer-readable storage medium may include: various media capable of storing program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disk.
For the introduction of the computer-readable storage medium provided by the present invention, please refer to the above method embodiments, which are not described herein again.
The embodiments are described in a progressive manner, each embodiment focuses on differences from other embodiments, and the same or similar parts among the embodiments are referred to each other. The device, the apparatus and the computer-readable storage medium disclosed in the embodiments correspond to the method disclosed in the embodiments, so that the description is simple, and the relevant points can be referred to the description of the method.
The principle and the implementation of the present invention are explained in the present application by using specific examples, and the above description of the embodiments is only used to help understanding the technical solution and the core idea of the present invention. It should be noted that, for those skilled in the art, it is possible to make various improvements and modifications to the present invention without departing from the principle of the present invention, and those improvements and modifications also fall within the scope of the claims of the present invention.

Claims (10)

1. A picking control method, comprising:
when a picking control request is received, acquiring an initial three-dimensional coordinate of a target object in a visual sensor coordinate system;
acquiring the position relation between a visual sensor and a mechanical arm;
calculating a target three-dimensional coordinate of the target object in a mechanical arm coordinate system by combining the position relation and the initial three-dimensional coordinate;
respectively calculating target rotation angles corresponding to all steering engines of the mechanical arm according to the target three-dimensional coordinates;
and controlling the steering engines to rotate corresponding target rotation angles, and sending a grabbing instruction to the claw steering engines of the mechanical arm so that the claw steering engines control the claws of the mechanical arm to pick the target object.
2. The picking control method of claim 1, where acquiring initial three-dimensional coordinates of a target object in a vision sensor coordinate system upon receiving a picking control request comprises:
when a picking control request is received, identifying a target object with the visual sensor based on a color image segmentation method, and acquiring a two-dimensional coordinate of the target object;
and solving the initial three-dimensional coordinate corresponding to the two-dimensional coordinate by using a least square method according to the conversion relation between the image plane coordinate system and the camera coordinate system.
3. The picking control method of claim 2, wherein the vision sensor is a binocular vision camera, further comprising, prior to identifying the target object with the vision sensor based on the color image segmentation method:
and carrying out internal reference correction operation on the binocular vision camera.
4. The picking control method of claim 3, wherein the binocular vision camera is horizontally placed and is located at the same horizontal plane as a base of the robotic arm, and calculating target three-dimensional coordinates of the target object in a robotic arm coordinate system in combination with the positional relationship and the initial three-dimensional coordinates comprises:
acquiring origin Cartesian coordinates of the origin of the visual sensor coordinate system in the mechanical arm coordinate system;
acquiring a rotation angle of the mechanical arm coordinate system relative to the visual sensor coordinate system on a z-axis;
and calculating the target three-dimensional coordinates of the target object in the mechanical arm coordinate system according to the origin Cartesian coordinates, the rotation angle and the initial three-dimensional coordinates.
5. A picking control device, comprising:
the system comprises an initial coordinate acquisition module, a visual sensor coordinate system and a picking control module, wherein the initial coordinate acquisition module is used for acquiring an initial three-dimensional coordinate of a target object in the visual sensor coordinate system when a picking control request is received;
the position relation acquisition module is used for acquiring the position relation between the visual sensor and the mechanical arm;
the target coordinate calculation module is used for calculating a target three-dimensional coordinate of the target object in a mechanical arm coordinate system by combining the position relation and the initial three-dimensional coordinate;
the rotation angle calculation module is used for calculating target rotation angles corresponding to the steering engines of the mechanical arm according to the target three-dimensional coordinates;
and the picking control module is used for controlling each steering engine to rotate corresponding target rotation angles and sending a grabbing instruction to the claw steering engine of the mechanical arm so that the claw steering engine controls the claw of the mechanical arm to pick the target object.
6. The picking control device of claim 5, where the initial coordinate acquisition module comprises:
the two-dimensional coordinate acquisition sub-module is used for identifying a target object with the visual sensor based on a color image segmentation method and acquiring a two-dimensional coordinate of the target object when a picking control request is received;
and the initial coordinate calculation submodule is used for solving the initial three-dimensional coordinate corresponding to the two-dimensional coordinate by using a least square method according to the conversion relation between the image plane coordinate system and the camera coordinate system.
7. The pick control device of claim 6, wherein the vision sensor is a binocular vision camera, further comprising:
and the internal reference correction module is used for carrying out internal reference correction operation on the binocular vision camera before the target object is identified by using the vision sensor based on the color image segmentation method.
8. The pick control device of claim 7, wherein the binocular vision camera is horizontally disposed and is located at the same horizontal plane as a base of the robotic arm, the target coordinate calculation module comprising:
the origin coordinate acquisition submodule is used for acquiring origin Cartesian coordinates of the origin of the visual sensor coordinate system in the mechanical arm coordinate system;
a rotation angle acquisition submodule for acquiring a rotation angle of the robot arm coordinate system with respect to the vision sensor coordinate system on a z-axis;
and the target coordinate calculation sub-module is used for calculating the target three-dimensional coordinates of the target object in the mechanical arm coordinate system according to the origin Cartesian coordinates, the rotation angle and the initial three-dimensional coordinates.
9. A picking control apparatus, comprising:
a memory for storing a computer program;
a processor for implementing the steps of the picking control method according to any of claims 1 to 4 when executing the computer program.
10. A computer-readable storage medium, characterized in that a computer program is stored on the computer-readable storage medium, which computer program, when being executed by a processor, carries out the steps of the picking control method according to any of the claims 1 to 4.
CN202010108488.5A 2020-02-21 2020-02-21 Picking control method, device and equipment and computer-readable storage medium Pending CN111360821A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010108488.5A CN111360821A (en) Picking control method, device and equipment and computer-readable storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010108488.5A CN111360821A (en) Picking control method, device and equipment and computer-readable storage medium

Publications (1)

Publication Number Publication Date
CN111360821A true CN111360821A (en) 2020-07-03

Family

ID=71203823

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010108488.5A Pending CN111360821A (en) 2020-02-21 2020-02-21 Picking control method, device and equipment and computer scale storage medium

Country Status (1)

Country Link
CN (1) CN111360821A (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101726251A (en) * 2009-11-13 2010-06-09 江苏大学 Automatic fruit identification method of apple picking robot on basis of support vector machine
CN101807247A (en) * 2010-03-22 2010-08-18 中国农业大学 Fine-adjustment positioning method of fruit and vegetable picking point
CN102567989A (en) * 2011-11-30 2012-07-11 重庆大学 Space positioning method based on binocular stereo vision
CN102902271A (en) * 2012-10-23 2013-01-30 上海大学 Binocular vision-based robot target identifying and gripping system and method
KR20150124305A (en) * 2014-04-28 2015-11-05 이철희 Agricultural robot
CN107767423A (en) * 2017-10-10 2018-03-06 大连理工大学 A kind of mechanical arm target positioning grasping means based on binocular vision

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Chinese Association of Automation: "Proceedings of the First National Symposium on Robotics", 30 June 1987 *
Du Yilin: "New Developments and Applications of Intelligent Security", 31 May 2018, Huazhong University of Science and Technology Press *

Cited By (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111972123A (en) * 2020-07-17 2020-11-24 武汉爱农云联科技有限公司 Intelligent fruit and vegetable picking recommendation method and device based on intelligent planter
CN113057020A (en) * 2021-04-02 2021-07-02 中国铁建重工集团股份有限公司 Method, system, device and medium for controlling rotation speed of collecting head
CN114029997A (en) * 2021-12-16 2022-02-11 广州城市理工学院 Working method of mechanical arm
CN114474094A (en) * 2022-02-16 2022-05-13 苏州书农科技有限公司 Control method, system and device for realizing picking based on multi-joint mechanical arm
CN114932554A (en) * 2022-06-06 2022-08-23 北京钢铁侠科技有限公司 Autonomous moving method and device of grabbing robot, storage medium and equipment
CN114932554B (en) * 2022-06-06 2023-12-01 北京钢铁侠科技有限公司 Autonomous movement method, device, storage medium and equipment of grabbing robot
CN115031580A (en) * 2022-06-20 2022-09-09 无锡市星迪仪器有限公司 High-precision artillery correction method, processing device and high-precision artillery correction system
CN115031580B (en) * 2022-06-20 2023-10-24 无锡市星迪仪器有限公司 High-precision gun correction method, processing device and high-precision gun correction system
CN115250744A (en) * 2022-07-29 2022-11-01 四川启睿克科技有限公司 Multi-angle strawberry picking system and method
CN115250744B (en) * 2022-07-29 2023-09-15 四川启睿克科技有限公司 Multi-angle strawberry picking system and method

Similar Documents

Publication Publication Date Title
CN111360821A (en) Picking control method, device and equipment and computer-readable storage medium
CN107767423B (en) mechanical arm target positioning and grabbing method based on binocular vision
CN110480637B (en) Mechanical arm part image recognition and grabbing method based on Kinect sensor
JP5835926B2 (en) Information processing apparatus, information processing apparatus control method, and program
CN109658460A (en) A kind of mechanical arm tail end camera hand and eye calibrating method and system
CN113379849B (en) Robot autonomous recognition intelligent grabbing method and system based on depth camera
CN108908334A (en) A kind of intelligent grabbing system and method based on deep learning
WO2022012337A1 (en) Moving arm system and control method
CN110695996B (en) Automatic hand-eye calibration method for industrial robot
CN112894815B (en) Method for detecting optimal position and posture for article grabbing by visual servo mechanical arm
CN109940626B (en) Control method of eyebrow drawing robot system based on robot vision
CN111801198A (en) Hand-eye calibration method, system and computer storage medium
CN113715016B (en) Robot grabbing method, system, device and medium based on 3D vision
CN113103235B (en) Method for vertically operating cabinet surface equipment based on RGB-D image
CN113751981B (en) Space high-precision assembling method and system based on binocular vision servo
CN111476841A (en) Point cloud and image-based identification and positioning method and system
CN116157837A (en) Calibration method and device for robot
CN113284179A (en) Robot multi-object sorting method based on deep learning
CN115213896A (en) Object grabbing method, system and equipment based on mechanical arm and storage medium
CN114022551A (en) Method for accurately identifying and estimating pose of fuel filling cover of fuel vehicle
Chen et al. Random bin picking with multi-view image acquisition and cad-based pose estimation
CN114187312A (en) Target object grabbing method, device, system, storage medium and equipment
Ranjan et al. Identification and control of NAO humanoid robot to grasp an object using monocular vision
CN111699445B (en) Robot kinematics model optimization method and system and storage device
CN114494426A (en) Apparatus and method for controlling a robot to pick up an object in different orientations

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication
RJ01 Rejection of invention patent application after publication

Application publication date: 20200703