CN109559341B - Method and device for generating mechanical arm grabbing scheme - Google Patents

Method and device for generating mechanical arm grabbing scheme

Info

Publication number
CN109559341B
CN109559341B (application CN201710891044.1A)
Authority
CN
China
Prior art keywords
grabbing
deflection angle
point
scheme
mechanical arm
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201710891044.1A
Other languages
Chinese (zh)
Other versions
CN109559341A (en)
Inventor
王旭
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Orion Star Technology Co Ltd
Original Assignee
Beijing Orion Star Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Orion Star Technology Co Ltd filed Critical Beijing Orion Star Technology Co Ltd
Priority to CN201710891044.1A priority Critical patent/CN109559341B/en
Publication of CN109559341A publication Critical patent/CN109559341A/en
Application granted granted Critical
Publication of CN109559341B publication Critical patent/CN109559341B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/70 Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/82 Arrangements for image or video recognition or understanding using pattern recognition or machine learning using neural networks
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/60 Analysis of geometric attributes
    • G06T7/66 Analysis of geometric attributes of image moments or centre of gravity
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00 Programme-controlled manipulators
    • B25J9/16 Programme controls
    • B25J9/1656 Programme controls characterised by programming, planning systems for manipulators
    • B25J9/1669 Programme controls characterised by programming, planning systems for manipulators characterised by special application, e.g. multi-arm co-operation, assembly, grasping
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10028 Range image; Depth image; 3D point clouds
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V2201/00 Indexing scheme relating to image or video recognition or understanding
    • G06V2201/06 Recognition of objects for industrial automation

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Evolutionary Computation (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Artificial Intelligence (AREA)
  • Mechanical Engineering (AREA)
  • Geometry (AREA)
  • Health & Medical Sciences (AREA)
  • Robotics (AREA)
  • Computing Systems (AREA)
  • Databases & Information Systems (AREA)
  • General Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Software Systems (AREA)
  • Multimedia (AREA)
  • Manipulator (AREA)
  • Image Analysis (AREA)

Abstract

The embodiment of the invention provides a method and a device for generating a mechanical arm grabbing scheme. The method comprises the following steps: acquiring point cloud data of an object to be grabbed and calculating the centroid position of the object to be grabbed; determining a grabbing point range according to the centroid position and a preset rule; calculating, according to a preset grabbing point generation algorithm, a grabbing scheme corresponding to each grabbing point in the grabbing point range; extracting the position coordinates corresponding to each grabbing point, determining the grabbing points within a first preset distance of the current position of the tail end of the mechanical arm in a preset coordinate direction, and taking the corresponding grabbing schemes as alternative grabbing schemes; extracting the deflection angle corresponding to each grabbing point and calculating the axial deflection angle corresponding to each alternative grabbing scheme; and determining the alternative grabbing scheme corresponding to the minimum axial deflection angle as the target grabbing scheme. The method can grab objects in any posture and at any position, greatly improving grabbing efficiency and flexibility.

Description

Method and device for generating mechanical arm grabbing scheme
Technical Field
The invention relates to the field of mechanical arm data processing, in particular to a method and a device for generating a mechanical arm grabbing scheme.
Background
The application scenarios of grabbing objects with a mechanical arm are very wide, such as automatically sorting goods, automatically assembling parts, and article retrieval by robots. Using a mechanical arm to grab objects saves labor and time and improves production efficiency.
Before the mechanical arm grabs an object, a grabbing scheme is first determined. The grabbing scheme comprises the position coordinates of the object to be grabbed and the rotation angle required to move the tail end of the mechanical arm from its current position to the position of the object to be grabbed. It can be understood that the position coordinates and the rotation angle are three-dimensional spatial data and can be represented by a 4 x 4 matrix, namely a rotation transformation matrix. Once the rotation transformation matrix is determined, the mechanical arm can move and rotate according to the matrix and thereby grab the object. At present, grabbing generally requires the object to be manually placed on a horizontal desktop, directly facing the tail end of the mechanical arm, because under these conditions the grabbing scheme is comparatively simple: the arm only needs to move along one direction and the angle of rotation is small, so the success rate is high and the robustness is good.
Obviously, however, such a grabbing process cannot realize the grabbing of objects in arbitrary postures and arbitrary positions in three-dimensional space; the objects need to be placed manually, and the grabbing efficiency and flexibility are not high.
Disclosure of Invention
The embodiment of the invention aims to provide a method and a device for generating a mechanical arm grabbing scheme, so as to grab objects in any posture and any position and improve grabbing efficiency and flexibility. The specific technical scheme is as follows:
in a first aspect, an embodiment of the present invention provides a method for generating a robot grabbing scheme, where the method includes:
acquiring point cloud data of an object to be grabbed;
calculating the centroid position of the object to be grabbed according to the point cloud data;
determining a grabbing point range including the centroid position according to the centroid position and a preset rule;
according to a preset grabbing point generating algorithm, carrying out coordinate transformation on grabbing points in the grabbing point range, and calculating a grabbing scheme corresponding to the grabbing points in the grabbing point range according to a coordinate transformation relation;
extracting the position coordinates corresponding to each grabbing point from the grabbing scheme, determining the grabbing points which are within a first preset distance from the current position of the tail end of the mechanical arm in the preset coordinate direction according to the position coordinates, and taking the grabbing scheme corresponding to the determined grabbing points as an alternative grabbing scheme;
extracting a deflection angle corresponding to each grabbing point from the alternative grabbing schemes, and calculating an axial deflection angle corresponding to each alternative grabbing scheme according to the deflection angle, the centroid position and the current position of the tail end of the mechanical arm;
and determining the alternative grabbing scheme corresponding to the minimum axial deflection angle as the target grabbing scheme.
Optionally, the step of obtaining point cloud data of an object to be grabbed includes:
acquiring a color image and a depth image of the object to be grabbed;
inputting the color image into a preset convolutional neural network for target detection to obtain a target position of the object to be grabbed in the color image;
and converting pixel points of the part corresponding to the target position in the depth image into point cloud data.
Optionally, the step of determining, according to the centroid position and according to a preset rule, a capture point range including the centroid position includes:
and determining the range of the distance between the three-dimensional space and the centroid position within a second preset distance as a grabbing point range.
Optionally, the step of performing coordinate transformation on the grabbing points in the grabbing point range according to a preset grabbing point generating algorithm, and calculating a grabbing scheme corresponding to the grabbing points in the grabbing point range according to a coordinate transformation relation includes:
performing rotation transformation on a local coordinate system of the grabbing points in the grabbing point range according to the Generate Grasp Position Candidates algorithm;
and calculating a grabbing scheme corresponding to each grabbing point in the grabbing point range according to the rotation transformation relation of the local coordinate system and the projection relation of the mechanical arm coordinate system and the point cloud data coordinate system.
Optionally, the step of extracting a deflection angle corresponding to each grabbing point from the alternative grabbing schemes, and calculating an axial deflection angle corresponding to each alternative grabbing scheme according to the deflection angle, the centroid position, and the current position of the end of the mechanical arm includes:
extracting a deflection angle corresponding to each grabbing point from the alternative grabbing schemes;
calculating the deflection angle of the current position of the mechanical arm relative to the object to be grabbed according to the centroid position and the current position of the tail end of the mechanical arm;
and determining the difference value of the deflection angle corresponding to the mechanical arm and the deflection angle corresponding to each grabbing point as the axial deflection angle corresponding to each grabbing point.
Optionally, the centroid position is represented as (x0, y0) and the current position of the tail end of the mechanical arm is represented as (x*, y*);
The step of calculating the deflection angle of the current position of the mechanical arm relative to the object to be grabbed according to the centroid position and the current position of the tail end of the mechanical arm comprises the following steps:
according to the formula
θ = arctan((y* - y0) / (x* - x0))
And calculating the deflection angle theta of the current position of the mechanical arm relative to the object to be grabbed.
Optionally, the method further includes:
and controlling the mechanical arm to grab the object to be grabbed according to the target grabbing scheme.
In a second aspect, an embodiment of the present invention provides an apparatus for generating a robot gripping scheme, where the apparatus includes:
the point cloud data acquisition module is used for acquiring point cloud data of an object to be grabbed;
the mass center position determining module is used for calculating the mass center position of the object to be grabbed according to the point cloud data;
the grasping point range determining module is used for determining a grasping point range including the centroid position according to the centroid position and a preset rule;
the grabbing scheme determining module is used for carrying out coordinate transformation on the grabbing points in the grabbing point range according to a preset grabbing point generating algorithm and calculating a grabbing scheme corresponding to the grabbing points in the grabbing point range according to a coordinate transformation relation;
the alternative grabbing scheme determining module is used for extracting position coordinates corresponding to each grabbing point from the grabbing scheme, determining grabbing points which are within a first preset distance from the current position of the tail end of the mechanical arm in a preset coordinate direction according to the position coordinates, and taking the grabbing scheme corresponding to the determined grabbing points as an alternative grabbing scheme;
the axial deflection angle determining module is used for extracting a deflection angle corresponding to each grabbing point from the alternative grabbing schemes, and calculating the axial deflection angle corresponding to each alternative grabbing scheme according to the deflection angle, the centroid position and the current position of the tail end of the mechanical arm;
and the target grabbing scheme generating module is used for determining the alternative grabbing scheme corresponding to the minimum axial deflection angle as the target grabbing scheme.
Optionally, the point cloud data obtaining module includes:
the image acquisition unit is used for acquiring a color image and a depth image of the object to be grabbed;
the target position determining unit is used for inputting the color image into a preset convolutional neural network for target detection to obtain a target position of the object to be grabbed in the color image;
and the point cloud data acquisition unit is used for converting the pixel points of the part corresponding to the target position in the depth image into point cloud data.
Optionally, the grasping point range determining module includes:
and the grasp point range determining unit is used for determining the range of the distance between the three-dimensional space and the centroid position within a second preset distance as a grasp point range.
Optionally, the grasping scheme determining module includes:
the rotation transformation unit is used for performing rotation transformation on a local coordinate system of the grabbing points in the grabbing point range according to the Generate Grasp Position Candidates algorithm;
and the grabbing scheme determining unit is used for calculating the grabbing scheme corresponding to each grabbing point in the grabbing point range according to the rotation transformation relation of the local coordinate system and the projection relation of the mechanical arm coordinate system and the point cloud data coordinate system.
Optionally, the axial slip angle determining module includes:
a first deflection angle determination unit, configured to extract a deflection angle corresponding to each grabbing point from the alternative grabbing schemes;
the second deflection angle determining unit is used for calculating the deflection angle of the current position of the mechanical arm relative to the object to be grabbed according to the centroid position and the current position of the tail end of the mechanical arm;
and the axial deflection angle determining unit is used for determining the difference value of the deflection angle corresponding to the mechanical arm and the deflection angle corresponding to each grabbing point as the axial deflection angle corresponding to each grabbing point.
Optionally, the centroid position is represented as (x0, y0) and the current position of the tail end of the mechanical arm is represented as (x*, y*);
The second deflection angle determination unit includes:
a deflection angle determining subunit for determining a deflection angle according to a formula
θ = arctan((y* - y0) / (x* - x0))
And calculating the deflection angle theta of the current position of the mechanical arm relative to the object to be grabbed.
Optionally, the apparatus further comprises:
and the grabbing control module is used for controlling the mechanical arm to grab the object to be grabbed according to the target grabbing scheme.
In a third aspect, an embodiment of the present invention provides an electronic device, including a processor, a communication interface, a memory, and a communication bus, where the processor and the communication interface complete communication between the memory and the processor through the communication bus;
a memory for storing a computer program;
and the processor is used for realizing the steps of the method for generating the mechanical arm grabbing scheme when executing the program stored in the memory.
In a fourth aspect, an embodiment of the present invention provides a computer-readable storage medium, where a computer program is stored in the computer-readable storage medium, and when the computer program is executed by a processor, the steps of the method for generating the mechanical arm grabbing scheme are implemented.
In the scheme provided by the embodiment of the invention, point cloud data of an object to be grabbed is first obtained, and the centroid position of the object is calculated from the point cloud data. A grabbing point range including the centroid position is determined according to the centroid position and a preset rule. Coordinate transformation is then performed on the grabbing points in the grabbing point range according to a preset grabbing point generation algorithm, and the grabbing scheme corresponding to each grabbing point is calculated according to the coordinate transformation relation. The position coordinates corresponding to each grabbing point are extracted from the grabbing schemes; the grabbing points within a first preset distance of the current position of the tail end of the mechanical arm in the preset coordinate direction are determined according to the position coordinates, and the grabbing schemes corresponding to these grabbing points are taken as alternative grabbing schemes. Finally, the deflection angle corresponding to each grabbing point is extracted from the alternative grabbing schemes, the axial deflection angle corresponding to each alternative grabbing scheme is calculated according to the deflection angle, the centroid position and the current position of the tail end of the mechanical arm, and the alternative grabbing scheme corresponding to the minimum axial deflection angle is determined as the target grabbing scheme. Because the target grabbing scheme is determined by coordinate transformation of the grabbing points in three-dimensional space followed by distance screening and axial deflection angle screening, objects in any posture and at any position can be grabbed, and grabbing efficiency and flexibility are greatly improved.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art will be briefly described below, it is obvious that the drawings in the following description are only some embodiments of the present invention, and for those skilled in the art, other drawings can be obtained according to the drawings without creative efforts.
Fig. 1 is a flowchart of a method for generating a robot gripping scheme according to an embodiment of the present invention;
FIG. 2 is a schematic diagram of point cloud data;
FIG. 3 is a detailed flowchart of a manner of acquiring point cloud data of an object to be grabbed in the embodiment shown in FIG. 1;
FIG. 4 is a detailed flow chart of the manner in which the axial declination is determined in the embodiment of FIG. 1;
FIG. 5 is a schematic view of the deflection angle of a robot arm relative to an object to be grasped;
fig. 6 is a schematic structural diagram of a device for generating a robot gripping scheme according to an embodiment of the present invention;
fig. 7 is a schematic structural diagram of an electronic device according to an embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
In order to realize the grabbing of objects in any postures and at any positions and improve grabbing efficiency and flexibility, the embodiment of the invention provides a method and a device for generating a mechanical arm grabbing scheme, electronic equipment and a computer-readable storage medium.
First, a method for generating a robot arm grabbing scheme provided by an embodiment of the present invention is described below.
It should first be noted that the method for generating a mechanical arm grabbing scheme provided by the embodiment of the present invention may be applied to any electronic device (hereinafter referred to as the electronic device) that establishes a communication connection with the mechanical arm; it is understood that data and instructions may be exchanged between the electronic device and the mechanical arm. The electronic device may be, for example, a computer or a processor, and is not specifically limited herein. Generally, a camera is mounted on the mechanical arm and used for photographing the object to be grabbed and acquiring images.
As shown in fig. 1, a method for generating a robot gripping scheme includes:
s101, point cloud data of an object to be grabbed are obtained;
it is understood that the point cloud data is data composed of spatial points of an object that can be detected in space, and can represent the position of the object in space. Each spatial point in the point cloud data is represented by its coordinates in space. Thus, the point cloud data of the object to be grabbed is obtained, namely, the position of the object to be grabbed in the space is determined.
In one embodiment, the point cloud data may be obtained by a 3D scanning device, such as a lidar, stereo camera, or the like. Specifically, the 3D scanning device is used to scan the object to be captured, and point cloud data of the object to be captured can be obtained.
In another embodiment, since the color image of the object may represent the position of the object in the plane space, and the depth image may represent the position of the object in the direction perpendicular to the plane space represented by the color image, the point cloud data of the object to be grabbed may be determined according to the color image and the depth image of the object to be grabbed. This will be exemplified later for clarity of layout and clarity of the scheme.
Of course, other methods for acquiring point cloud data in related technologies may also be used, as long as the point cloud data of the object to be captured can be acquired, which is not specifically limited herein. The point cloud data shown in fig. 2 is a schematic diagram of point cloud data of a cup, and it can be seen that information such as a position, a shape, and an outline of an object can be shown in the point cloud data, so that a subsequent grabbing scheme can be generated conveniently.
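The color-and-depth route described above can be sketched with the standard pinhole back-projection. The function below is illustrative only: the intrinsics `fx`, `fy`, `cx`, `cy` and the `region` mask (e.g. derived from the detection bounding box) are assumptions, not parameters named in the patent.

```python
import numpy as np

def depth_to_point_cloud(depth, fx, fy, cx, cy, region=None):
    """Convert a depth image (in meters) into an N x 3 point cloud using
    the pinhole camera model. `region` is an optional boolean mask of the
    same shape as `depth`, selecting only the detected object's pixels."""
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))  # pixel column, row grids
    z = depth
    x = (u - cx) * z / fx   # back-project pixel column to camera X
    y = (v - cy) * z / fy   # back-project pixel row to camera Y
    points = np.stack([x, y, z], axis=-1).reshape(-1, 3)
    if region is not None:
        points = points[region.reshape(-1)]
    return points[points[:, 2] > 0]  # drop pixels with no valid depth
```

Libraries such as Open3D provide equivalent RGB-D-to-cloud conversion; the sketch only shows the underlying geometry.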
S102, calculating the centroid position of the object to be grabbed according to the point cloud data;
since the centroid position of the object is the center position of the mass distribution of the object, it can be understood that, for a general object, the closer the grasping position is to the centroid position when grasping is performed, the more stable the grasping will be. Then, after the electronic device acquires the point cloud data of the object to be grabbed, the centroid position of the object to be grabbed can be calculated according to the point cloud data.
Specifically, suppose the point cloud data of the object to be grasped includes n spatial points whose coordinates are (x1, y1, z1), (x2, y2, z2), (x3, y3, z3) … (xn, yn, zn). The centroid position of the object to be grasped is
((x1 + x2 + … + xn)/n, (y1 + y2 + … + yn)/n, (z1 + z2 + … + zn)/n)
For example, if the point cloud data of the object to be grabbed includes 5 spatial points, the coordinates thereof are (5, 8, 7), (10, 2, 4), (5, 7, 10), (12, 6, 34), (21, 5, 9), respectively. The centroid position of the object to be grasped is
((5 + 10 + 5 + 12 + 21)/5, (8 + 2 + 7 + 6 + 5)/5, (7 + 4 + 10 + 34 + 9)/5) = (10.6, 5.6, 12.8)
It should be noted that, in the above example, the point cloud data of the object to be grabbed includes 5 spatial points, and only for clearly and concisely describing the calculation manner of the centroid position, in an actual situation, the number of spatial points included in the point cloud data of the object to be grabbed is very large, as can be seen from the schematic diagram shown in fig. 2, but no matter how many spatial points are included in the point cloud data of the object to be grabbed, the manner of calculating the centroid position is not changed.
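The centroid computation of step S102 is simply the per-axis mean over the cloud (treating every sampled point as having equal mass). A minimal sketch, reproducing the five-point worked example:

```python
import numpy as np

def centroid(points):
    """Centroid of an N x 3 point cloud: the per-axis mean of all spatial
    points, as in step S102 (uniform mass assumed for each point)."""
    return points.mean(axis=0)

# The five-point example from the description.
pts = np.array([(5, 8, 7), (10, 2, 4), (5, 7, 10), (12, 6, 34), (21, 5, 9)],
               dtype=float)
c = centroid(pts)  # c is (10.6, 5.6, 12.8), matching the worked example
```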
S103, determining a capture point range including the centroid position according to the centroid position and a preset rule;
it can be understood that when grabbing the object to be grabbed, the grabbing position of the mechanical arm is not necessarily the centroid position of the object to be grabbed, and the mechanical arm can successfully grab the object to be grabbed as long as the mechanical arm is within a certain range from the centroid position, and the grabbing difficulty of the mechanical arm can be reduced. In addition, for some irregularly shaped objects, such as some mechanical parts, ornaments and the like, the centroid position may not be on the object, so that after the electronic equipment determines the centroid position, a reasonable capture point range can be determined according to the centroid position. That is, the electronic device may determine the range of grasp points that includes the centroid location according to a preset rule.
In one embodiment, the electronic device may determine, in the three-dimensional space, a range of a distance from a centroid position of the object to be grabbed within a second preset distance as a grabbing point range. It can be understood that, the range of the grabbing point determined by the electronic device is a spherical area on the three-dimensional space, where the centroid position is the center of the sphere and the second preset distance is the radius of the sphere, so that the range which may become the grabbing point can be determined, and a grabbing scheme can be generated subsequently.
It should be noted that the second preset distance may be determined according to the actual size of the object to be grabbed, and is not specifically limited herein. If the volume of the object to be grabbed is small, the second preset distance may be small, for example, 2 cm, 3 cm, 5 cm, or the like, so as to avoid that the range of the grabbing point is too large, which may result in an inaccurate grabbing scheme determined subsequently. If the volume of the object to be gripped is large, the second preset distance may be large, for example 7 cm, 8 cm, 10 cm, etc., so that the determined gripping point range includes as many possible gripping points as possible.
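The spherical grabbing point range of step S103 reduces to a Euclidean distance filter around the centroid; the function and parameter names below are illustrative, not the patent's.

```python
import numpy as np

def grasp_point_range(points, centroid, radius):
    """Step S103 sketch: keep only the cloud points whose distance to the
    centroid is within `radius` (the second preset distance), i.e. the
    spherical region centred on the centroid."""
    dist = np.linalg.norm(points - centroid, axis=1)
    return points[dist <= radius]
```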
S104, performing coordinate transformation on the grabbing points in the grabbing point range according to a preset grabbing point generation algorithm, and calculating a grabbing scheme corresponding to the grabbing points in the grabbing point range according to a coordinate transformation relation;
after the electronic device determines the range of the grabbing points, the coordinate transformation of the grabbing points in the range of the grabbing points can be performed according to a preset grabbing point generation algorithm, and then the grabbing scheme corresponding to the grabbing points in the range of the grabbing points is calculated according to the coordinate transformation relation. The preset capture point generating algorithm may adopt a capture point generating algorithm such as a generated grass position candidates algorithm.
In order to simulate any posture of the object to be grabbed in the three-dimensional space, the electronic equipment can perform coordinate transformation, such as coordinate system rotation and the like, on the grabbing points in the grabbing point range according to a preset grabbing point generation algorithm to simulate different postures of the object to be grabbed in the three-dimensional space, so that information of different postures and different grabbing point positions can be obtained.
Furthermore, the electronic device can calculate the grabbing scheme corresponding to each grabbing point according to the coordinate transformation relation corresponding to each grabbing point. For clarity of the scheme and clarity of layout, a specific manner for calculating the grasping scheme corresponding to each grasping point will be described later.
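The candidate-generation idea can be illustrated with a simplified stand-in: sample rotations of a grabbing point's local frame and pack each pose into a 4 x 4 rotation transformation matrix, matching the matrix representation mentioned in the background. This sketch rotates about one axis only and omits the projection between the mechanical arm and point cloud coordinate systems, so it is an assumption-laden simplification rather than the patent's actual algorithm.

```python
import numpy as np

def rot_z(theta):
    """3 x 3 rotation matrix about the z-axis."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

def candidate_grasps(point, n_angles=8):
    """Emit one 4 x 4 homogeneous transform per sampled rotation of the
    grasp point's local frame, simulating different object postures."""
    grasps = []
    for theta in np.linspace(0.0, 2.0 * np.pi, n_angles, endpoint=False):
        T = np.eye(4)
        T[:3, :3] = rot_z(theta)  # sampled orientation of the local frame
        T[:3, 3] = point          # grasp position in the cloud frame
        grasps.append(T)
    return grasps
```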
S105, extracting position coordinates corresponding to each grabbing point from the grabbing scheme, determining the grabbing points within a first preset distance from the current position of the tail end of the mechanical arm in a preset coordinate direction according to the position coordinates, and taking the grabbing scheme corresponding to the determined grabbing points as an alternative grabbing scheme;
Because the above coordinate transformation involves randomness, the grabbing scheme corresponding to a grabbing point after coordinate transformation may be difficult to execute stably in an actual grab. Therefore, to ensure the accuracy and stability of grabbing, after obtaining the grabbing schemes the electronic device can screen them and filter out schemes that would be comparatively difficult for the mechanical arm to execute.
That is to say, the electronic device may determine a grabbing point within a first preset distance from the current position of the end of the mechanical arm in the preset coordinate direction, and use a grabbing scheme corresponding to the determined grabbing point as an alternative grabbing scheme. Generally, the preset coordinate direction may be a vertical direction of a coordinate system of the robot arm, which is expressed by a z-axis, so that it is ensured that a moving distance of the robot arm in the z-axis direction is not too large. Of course, the preset coordinate direction may also be other directions, and may be determined specifically according to the use scene of the mechanical arm, the structure of the mechanical arm itself, and other factors, and is not specifically limited herein.
The first preset distance may be set according to factors such as an actual gripping environment and a size of an object to be gripped, and may be, for example, 2 cm, 3 cm, 5 cm, and the like, which is not specifically limited herein.
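The distance screening in step S105 amounts to a one-axis threshold test. The following Python sketch is illustrative only: the scheme representation, the names `filter_candidates` and `max_dist`, and the 3 cm threshold are assumptions, not part of the embodiment.

```python
def filter_candidates(schemes, end_pos, axis=2, max_dist=0.03):
    """Keep only the grasp schemes whose grasp point lies within
    max_dist of the arm end along the preset coordinate axis
    (axis=2 means the z axis of the arm frame)."""
    kept = []
    for scheme in schemes:
        if abs(scheme["position"][axis] - end_pos[axis]) <= max_dist:
            kept.append(scheme)
    return kept

schemes = [
    {"position": (0.10, 0.20, 0.31), "angle": 0.4},
    {"position": (0.12, 0.18, 0.45), "angle": 0.1},  # 15 cm away along z
]
candidates = filter_candidates(schemes, end_pos=(0.0, 0.0, 0.30))
```

Only the first scheme survives the screening, since its grasp point lies within 3 cm of the arm end along z.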
S106, extracting a deflection angle corresponding to each grabbing point from the alternative grabbing schemes, and calculating an axial deflection angle corresponding to each alternative grabbing scheme according to the deflection angle, the centroid position and the current position of the tail end of the mechanical arm;
Through the processing of step S105, some alternative grabbing schemes can be obtained, and the electronic device can determine an optimal grabbing scheme from them. Because the rotation angle of the mechanical arm directly influences the accuracy and stability of grabbing, the electronic device can select the optimal grabbing scheme according to the axial deflection angle corresponding to each grabbing point in the alternative grabbing schemes.
The electronic equipment can extract the deflection angle corresponding to each grabbing point from the alternative grabbing schemes, and then the axial deflection angle corresponding to each alternative grabbing scheme is calculated according to the deflection angle, the centroid position and the current position of the tail end of the mechanical arm.
And S107, determining the alternative grabbing scheme corresponding to the minimum axial deflection angle as the target grabbing scheme.
After the axial deflection angle corresponding to each alternative grabbing scheme is determined, the electronic device can determine the minimum axial deflection angle among them. The smaller the axial deflection angle, the smaller the angle through which the mechanical arm must rotate when grabbing the object to be grabbed, so the grabbing difficulty is lower and the success rate and accuracy are higher. The electronic device therefore determines the alternative grabbing scheme corresponding to the minimum axial deflection angle as the target grabbing scheme.
It should be noted that the target grabbing scheme refers to the grabbing scheme that can be adopted when the mechanical arm grabs the object to be grabbed, which is determined by the method, and has no other limiting meanings.
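The selection in steps S106 and S107 can be sketched as follows, assuming each alternative scheme already carries its extracted deflection angle; the dictionary layout, the name `select_target_scheme`, and the use of the absolute difference as the axial deflection angle are illustrative assumptions.

```python
def select_target_scheme(candidates, arm_deflection):
    """Pick the candidate whose axial deflection angle (difference
    between the arm's current deflection and the grasp point's
    deflection) is the smallest."""
    best, best_axial = None, float("inf")
    for scheme in candidates:
        axial = abs(arm_deflection - scheme["angle"])
        if axial < best_axial:
            best, best_axial = scheme, axial
    return best

candidates = [{"angle": 0.9, "id": "a"}, {"angle": 0.35, "id": "b"}]
target = select_target_scheme(candidates, arm_deflection=0.3)
```

With the arm's current deflection at 0.3 rad, scheme "b" wins because its axial deflection (0.05 rad) is smaller than that of "a" (0.6 rad).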
It can be seen that, in the scheme provided by the embodiment of the present invention, the electronic device first obtains point cloud data of the object to be grabbed, calculates the centroid position of the object to be grabbed from the point cloud data, and determines a grabbing point range including the centroid position according to the centroid position and a preset rule. It then performs coordinate transformation on the grabbing points in the grabbing point range according to a preset grabbing point generation algorithm and calculates the grabbing scheme corresponding to each grabbing point in the range. Next, it extracts the position coordinate corresponding to each grabbing point from the grabbing schemes, determines the grabbing points within a first preset distance of the current position of the end of the mechanical arm in a preset coordinate direction, and takes the grabbing schemes corresponding to those grabbing points as alternative grabbing schemes. Finally, it extracts the deflection angle corresponding to each grabbing point from the alternative grabbing schemes, calculates the axial deflection angle corresponding to each alternative grabbing scheme from the deflection angle, the centroid position and the current position of the end of the mechanical arm, and determines the alternative grabbing scheme corresponding to the smallest axial deflection angle as the target grabbing scheme. Because the target grabbing scheme is determined by distance screening and axial deflection angle screening after coordinate transformation of the grabbing points in three-dimensional space, objects in any posture and at any position can be grabbed, which greatly improves grabbing efficiency and flexibility.
As an implementation manner of the embodiment of the present invention, as shown in fig. 3, the step of acquiring point cloud data of an object to be grabbed may include:
S301, acquiring a color image and a depth image of the object to be grabbed;
Under general conditions, the electronic device can photograph the object to be grabbed with a color camera and a depth camera simultaneously, thereby obtaining the color image and the depth image of the object to be grabbed.
It will be appreciated that the color camera and the depth camera are typically mounted at the end of the mechanical arm, so the depth image represents the distance of the object to be grabbed from the end of the mechanical arm.
S302, inputting the color image into a preset convolutional neural network for target detection to obtain a target position of the object to be grabbed in the color image;
After the electronic device obtains the color image of the object to be grabbed, it can input the color image into a preset convolutional neural network for target detection, thereby obtaining the target position of the object to be grabbed in the color image.
The preset convolutional neural network can be any convolutional neural network capable of carrying out target detection. It is to be understood that the preset convolutional neural network is obtained by performing parameter training on a preset convolutional neural network model, and the parameter training on the convolutional neural network model may use a parameter training mode in the related art, such as a back propagation method, and is not specifically limited and described herein.
For example, a convolutional neural network model may be established based on the Caffe deep learning framework. Images of the object to be grabbed, acquired by the color camera at various positions and in various poses, are then taken as image samples, and the position of the object to be grabbed in each sample is calibrated, generally represented by a rectangular frame. The image samples and the corresponding object positions are input into the preset convolutional neural network model and the parameters are adjusted until the number of iterations reaches a preset count or the objective function converges. Parameter training may then be stopped, yielding a preset convolutional neural network that has learned the characteristics of the object to be grabbed.
Furthermore, when the electronic device inputs the color image into the preset convolutional neural network for target detection, the preset convolutional neural network can obtain and output the target position of the object to be captured in the color image according to the characteristics of the object to be captured learned in the training process, and the electronic device can obtain the target position of the object to be captured in the color image.
And S303, converting pixel points of the part corresponding to the target position in the depth image into point cloud data.
The target position is generally a rectangular frame, so the electronic device can determine the corresponding position in the depth image after acquiring the target position. It can be understood that the color image and the depth image are generally of the same size and contain the same number of pixel points, so the portion of the depth image corresponding to the target position is the portion within the rectangular frame. For example, if the target position is (xa, ya, xb, yb), where (xa, ya) and (xb, yb) are the coordinates of two diagonal vertices of the rectangle, then the portion of the depth image corresponding to the target position is the rectangle in the depth image whose diagonal vertices have coordinates (xa, ya) and (xb, yb).
Then, the electronic device can convert the pixel points of the portion of the depth image corresponding to the target position into point cloud data. The coordinates of a pixel point in the depth image represent its position in the camera coordinate system, and its gray value represents the corresponding depth, that is, the distance between the pixel point and the end of the mechanical arm, which is the remaining coordinate value in the three-dimensional camera coordinate system. The triple (x, y, z) composed of the pixel coordinates and the gray value is then a point of the point cloud data in the camera coordinate system. In this way, the point cloud data of the object to be grabbed can be obtained.
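The conversion of the detected region into point cloud data can be sketched in Python. The nested-list depth image, the box layout (xa, ya, xb, yb), and the function name are assumptions for illustration; the sketch follows the document's convention that the pixel coordinate supplies (x, y) and the gray/depth value supplies z.

```python
def depth_roi_to_points(depth, box):
    """Convert the depth pixels inside the detected rectangle
    (xa, ya, xb, yb) into (x, y, z) points: (x, y) is the pixel
    coordinate, z is the depth/gray value at that pixel."""
    xa, ya, xb, yb = box
    points = []
    for y in range(ya, yb + 1):
        for x in range(xa, xb + 1):
            points.append((x, y, depth[y][x]))
    return points

depth = [[0, 0, 0, 0],
         [0, 5, 6, 0],
         [0, 7, 8, 0],
         [0, 0, 0, 0]]
cloud = depth_roi_to_points(depth, (1, 1, 2, 2))
```

The 2 x 2 region inside the box yields four points; pixels outside the rectangle are ignored.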
In this way, the point cloud data of the object to be grabbed is obtained without professional equipment; only an ordinary color camera and depth camera are needed, which reduces cost. Moreover, the operation is simple and the amount of calculation small.
As an implementation manner of the embodiment of the present invention, the step of performing coordinate transformation on the grabbing points in the grabbing point range according to a preset grabbing point generating algorithm, and calculating a grabbing scheme corresponding to the grabbing points in the grabbing point range according to a coordinate transformation relation may include:
performing rotation transformation on the local coordinate system of the grabbing points in the grabbing point range according to a Generate Grasp Pose Candidates algorithm; and calculating a grabbing scheme corresponding to each grabbing point in the grabbing point range according to the rotation transformation relation of the local coordinate system and the projection relation between the mechanical arm coordinate system and the point cloud data coordinate system.
The electronic device may use the Generate Grasp Pose Candidates algorithm to perform rotation transformation on the local coordinate system of the grabbing points in the grabbing point range. Specifically, a number of grabbing points may be randomly selected from the grabbing point range, the normal and a local coordinate system computed at each grabbing point, and each local coordinate system then rotated randomly about an axis to simulate the various postures of the object to be grabbed.
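A minimal sketch of the random local-frame rotation described above, assuming rotation about the local z axis only (the actual algorithm may rotate about other axes as well; all names and values here are illustrative):

```python
import math, random

def rotate_about_z(frame, angle):
    """Rotate a local frame (3 basis vectors stored as rows of a
    3x3 matrix) about its z axis by the given angle."""
    c, s = math.cos(angle), math.sin(angle)
    rz = [[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]]
    return [[sum(rz[i][k] * frame[k][j] for k in range(3))
             for j in range(3)] for i in range(3)]

random.seed(0)
grasp_points = [(0.10, 0.20, 0.30), (0.11, 0.19, 0.31)]
identity = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]]
# one randomly rotated local frame per sampled grasp point
frames = [rotate_about_z(identity, random.uniform(0.0, 2.0 * math.pi))
          for _ in grasp_points]
```

Each resulting frame is still orthonormal, since a pure rotation preserves the basis-vector lengths.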
Then the electronic device can calculate the grabbing scheme corresponding to each grabbing point in the grabbing point range according to the rotation transformation relation of the local coordinate system and the projection relation between the mechanical arm coordinate system and the point cloud data coordinate system. During the coordinate transformation, a projection relation matrix of the coordinate transformation may be recorded. It is a 4 × 4 matrix of the form

    | r11  r12  r13  t1 |
    | r21  r22  r23  t2 |
    | r31  r32  r33  t3 |
    |  0    0    0   1  |

The 3 × 3 block of elements r11 through r33 in the upper left corner of the matrix represents the coordinate rotation relation, and the elements t1, t2, t3 represent the coordinate translation relation. It can be understood that the matrix represents the position and angle relationship of the grabbing point before and after the coordinate transformation; if the mechanical arm needs to move from the grabbing point before the coordinate transformation to the grabbing point after the coordinate transformation, it only needs to move according to this matrix. The matrix is therefore the grabbing scheme for reaching the grabbing point after coordinate transformation from the grabbing point before coordinate transformation.
The coordinate system of the point cloud data of the object to be grabbed is generally the coordinate system of the device that acquired the point cloud data; for example, if the point cloud data is obtained from a color image and a depth image, the point cloud data is expressed in the camera coordinate system. Therefore, when calculating the grabbing scheme corresponding to each grabbing point, the grabbing scheme corresponding to the grabbing point after coordinate transformation needs to be converted into the mechanical arm coordinate system, so that the mechanical arm can move according to the grabbing scheme.
Specifically, still taking the example in which the point cloud data is obtained from a color image and a depth image, the projection relation between the camera coordinate system and the mechanical arm coordinate system can be obtained by the camera calibration technique. It is also a 4 × 4 matrix; denote it P, and denote the projection relation matrix of the above coordinate transformation M. Multiplying the two, that is, computing

    G = P · M

converts the grabbing scheme corresponding to the grabbing point after coordinate transformation into the mechanical arm coordinate system, and the resulting 4 × 4 matrix G is the grabbing scheme corresponding to that grabbing point.
All the grabbing points in the grabbing point range are transformed according to the method, so that grabbing schemes corresponding to all the grabbing points in the grabbing point range can be obtained, and the grabbing schemes are grabbing schemes under a mechanical arm coordinate system.
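The conversion into the mechanical arm coordinate system is a product of homogeneous transforms. A sketch with a hypothetical pure-translation calibration matrix (all numeric values are illustrative, not calibration results):

```python
def matmul4(a, b):
    """Multiply two 4x4 homogeneous transforms (lists of lists)."""
    return [[sum(a[i][k] * b[k][j] for k in range(4))
             for j in range(4)] for i in range(4)]

# Hypothetical camera-to-arm calibration: translation by (0.5, 0, 0.2).
cam_to_arm = [[1, 0, 0, 0.5],
              [0, 1, 0, 0.0],
              [0, 0, 1, 0.2],
              [0, 0, 0, 1]]
# Grasp transform in the camera frame: translation by (0.1, 0.2, 0.3).
grasp_cam = [[1, 0, 0, 0.1],
             [0, 1, 0, 0.2],
             [0, 0, 1, 0.3],
             [0, 0, 0, 1]]
# Grasp scheme expressed in the mechanical arm coordinate system.
grasp_arm = matmul4(cam_to_arm, grasp_cam)
```

With identity rotations, the translations simply add, so the grasp point lands at (0.6, 0.2, 0.5) in the arm frame.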
Therefore, the information of different postures of the object to be grabbed at different grabbing point positions can be obtained through the random rotation transformation of the local coordinate system of the randomly selected grabbing points, and the grabbing scheme obtained through calculation can be suitable for the objects to be grabbed at various postures and positions.
As an implementation manner of the embodiment of the present invention, as shown in fig. 4, the step of extracting a deflection angle corresponding to each grabbing point from the alternative grabbing solutions, and calculating an axial deflection angle corresponding to each alternative grabbing solution according to the deflection angle, the centroid position, and the current position of the end of the mechanical arm may include:
S401, extracting a deflection angle corresponding to each grabbing point from the alternative grabbing schemes;
Since the grabbing scheme is a 4 × 4 matrix whose upper-left 3 × 3 elements represent the rotation relationship, as described above, the deflection angle corresponding to the grabbing point can be calculated from those 3 × 3 elements. For example, the deflection angle corresponding to each grabbing point can be extracted from the alternative grabbing schemes according to the conversion relationship between Euler angles and the rotation matrix.
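Extracting the deflection angle from the rotation block can be sketched as follows. The ZYX Euler convention, in which the rotation about z (yaw) is atan2(r21, r11), and all names are assumptions; the embodiment does not fix a particular Euler convention.

```python
import math

def yaw_from_scheme(scheme):
    """Extract the rotation angle about z (yaw) from the upper-left
    3x3 rotation block of a 4x4 grasp-scheme matrix, using the ZYX
    Euler convention: yaw = atan2(r21, r11)."""
    return math.atan2(scheme[1][0], scheme[0][0])

# A scheme whose rotation block is a 30-degree rotation about z.
angle = math.radians(30)
c, s = math.cos(angle), math.sin(angle)
scheme = [[c, -s, 0, 0.1],
          [s,  c, 0, 0.2],
          [0,  0, 1, 0.3],
          [0,  0, 0, 1]]
yaw = yaw_from_scheme(scheme)
```

For this pure z rotation the extracted yaw recovers the original 30 degrees exactly.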
It should be noted that the deflection angle corresponding to each grabbing point represents the angle through which the mechanical arm must rotate, starting from a deflection angle of 0 degrees in the mechanical arm coordinate system, to reach the deflection angle corresponding to that grabbing point.
S402, calculating a deflection angle of the current position of the mechanical arm relative to the object to be grabbed according to the centroid position and the current position of the tail end of the mechanical arm;
When the mechanical arm is actually used, the deflection angle of its current position relative to the object to be grabbed is usually not 0 degrees. Therefore, after the deflection angle corresponding to each grabbing point is determined, the deflection angle of the current position of the mechanical arm relative to the object to be grabbed must also be determined before the axial deflection angle corresponding to each grabbing point can be computed.
Then, the electronic device may calculate a deflection angle of the current position of the mechanical arm relative to the object to be grasped according to the centroid position and the current position of the end of the mechanical arm. It can be understood that the rotation angle of the mechanical arm is generally three-dimensional data, and in order to conveniently and rapidly determine the axial deflection angle, the deflection angle of the mechanical arm relative to the object to be grabbed in a certain set direction can be used for calculation. In order to minimize the gripping difficulty of the mechanical arm, the direction in which the actual rotation difficulty of the mechanical arm is highest may be used as the setting direction. The above-mentioned extraction of the deflection angle corresponding to each grasping point from the alternative grasping scheme is also the deflection angle in the set direction.
As an embodiment, the plane defined by the coordinate axes x and y represents the above-mentioned set direction. If the centroid position of the object to be grabbed is denoted (x0, y0) and the current position of the end of the mechanical arm is denoted (x*, y*), the deflection angle θ of the current position of the mechanical arm relative to the object to be grabbed can be calculated according to the formula

    θ = arcsin( (x0 - x*) / sqrt((x0 - x*)² + (y0 - y*)²) )

As shown in fig. 5, in the geometric relationship between the mechanical arm 501 and the object 502 to be grabbed, the sine of the deflection angle θ in the mechanical arm coordinate system is

    sin θ = (x0 - x*) / sqrt((x0 - x*)² + (y0 - y*)²)

and the value of the deflection angle is therefore

    θ = arcsin( (x0 - x*) / sqrt((x0 - x*)² + (y0 - y*)²) )
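The deflection angle θ of step S402 can be sketched directly in Python; the arcsin form and the sign of the x difference are assumptions reconstructed from the figure description, and the function name is illustrative.

```python
import math

def arm_deflection(centroid, end):
    """Deflection of the arm end relative to the object's centroid
    in the x-y plane: theta = arcsin(dx / |d|), where d is the
    planar vector from the arm end to the centroid."""
    dx = centroid[0] - end[0]
    dy = centroid[1] - end[1]
    return math.asin(dx / math.hypot(dx, dy))

theta = arm_deflection(centroid=(1.0, 1.0), end=(0.0, 0.0))
```

With the centroid at 45 degrees from the arm end, the computed deflection is pi/4.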
And S403, determining the difference value of the deflection angle corresponding to the mechanical arm and the deflection angle corresponding to each grabbing point as the axial deflection angle corresponding to each grabbing point.
It can be understood that the difference between the deflection angle corresponding to the mechanical arm and the deflection angle corresponding to each grabbing point is the angle through which the mechanical arm actually needs to rotate to reach that grabbing point, and is called the axial deflection angle. Once the deflection angle corresponding to each grabbing point and the deflection angle corresponding to the mechanical arm have been determined, the electronic device can take their difference as the axial deflection angle corresponding to each grabbing point.
As an implementation manner of the embodiment of the present invention, the method may further include:
and controlling the mechanical arm to grab the object to be grabbed according to the target grabbing scheme.
After the electronic equipment determines the target grabbing scheme, the mechanical arm can be controlled to grab the object to be grabbed according to the target grabbing scheme. Specifically, the target grabbing scheme is a 4 × 4 matrix, and the electronic device may control the mechanical arm to rotate according to the element value representing the rotation relationship in the 4 × 4 matrix, translate according to the element value representing the translation relationship in the 4 × 4 matrix, and further move to the object to be grabbed to perform grabbing.
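Executing the target scheme begins by splitting the 4 × 4 matrix into the rotation block and the translation vector that the controller acts on, as described above. A minimal sketch (the matrix values and the function name are illustrative):

```python
def decompose(scheme):
    """Split a 4x4 target grasp scheme into its 3x3 rotation block
    (the element values representing rotation) and its translation
    vector (the element values representing translation)."""
    rotation = [row[:3] for row in scheme[:3]]
    translation = [scheme[i][3] for i in range(3)]
    return rotation, translation

# Example target scheme: 90-degree rotation about z plus a translation.
scheme = [[0, -1, 0, 0.4],
          [1,  0, 0, 0.1],
          [0,  0, 1, 0.2],
          [0,  0, 0, 1]]
rot, trans = decompose(scheme)
```

The controller would then rotate the arm according to `rot` and translate it by `trans` to reach the object.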
Corresponding to the method embodiment, the embodiment of the invention also provides a device for generating the mechanical arm grabbing scheme.
The following describes a device for generating a robot arm grabbing scheme provided by an embodiment of the present invention.
As shown in fig. 6, an apparatus for generating a robot grasping plan includes:
a point cloud data acquisition module 610, configured to acquire point cloud data of an object to be captured;
a centroid position determining module 620, configured to calculate a centroid position of the object to be grabbed according to the point cloud data;
a grasp point range determining module 630, configured to determine, according to the centroid position and according to a preset rule, a grasp point range including the centroid position;
the grasping scheme determining module 640 is configured to perform coordinate transformation on the grasping points in the grasping point range according to a preset grasping point generating algorithm, and calculate a grasping scheme corresponding to the grasping points in the grasping point range according to a coordinate transformation relation;
the alternative grabbing scheme determining module 650 is configured to extract a position coordinate corresponding to each grabbing point from the grabbing scheme, determine, according to the position coordinate, a grabbing point within a first preset distance from a current position of the end of the mechanical arm in a preset coordinate direction, and use the grabbing scheme corresponding to the determined grabbing point as an alternative grabbing scheme;
an axial deflection angle determining module 660, configured to extract a deflection angle corresponding to each grabbing point from the alternative grabbing schemes, and calculate an axial deflection angle corresponding to each alternative grabbing scheme according to the deflection angle, the centroid position, and the current position of the end of the mechanical arm;
and an object grabbing scheme generating module 670, configured to determine the candidate grabbing scheme corresponding to the smallest axial declination as the object grabbing scheme.
It can be seen that in the solution provided by the embodiment of the present invention, point cloud data of the object to be grabbed is obtained, the centroid position of the object to be grabbed is calculated from the point cloud data, and a grabbing point range including the centroid position is determined according to the centroid position and a preset rule. Coordinate transformation is then performed on the grabbing points in the grabbing point range according to a preset grabbing point generating algorithm, and the grabbing scheme corresponding to each grabbing point in the range is calculated. Next, the position coordinate corresponding to each grabbing point is extracted from the grabbing schemes, the grabbing points within a first preset distance of the current position of the end of the mechanical arm in a preset coordinate direction are determined from the position coordinates, and the grabbing schemes corresponding to those grabbing points are taken as alternative grabbing schemes. Finally, the deflection angle corresponding to each grabbing point is extracted from the alternative grabbing schemes, the axial deflection angle corresponding to each alternative grabbing scheme is calculated from the deflection angle, the centroid position and the current position of the end of the mechanical arm, and the alternative grabbing scheme corresponding to the smallest axial deflection angle is determined as the target grabbing scheme. Because the target grabbing scheme is determined by distance screening and axial deflection angle screening after coordinate transformation of the grabbing points in three-dimensional space, objects in any posture and at any position can be grabbed, which greatly improves grabbing efficiency and flexibility.
As an implementation manner of the embodiment of the present invention, the point cloud data obtaining module 610 may include:
an image acquisition unit (not shown in fig. 6) for acquiring a color image and a depth image of the object to be grasped;
a target position determining unit (not shown in fig. 6) configured to input the color image into a preset convolutional neural network for target detection, so as to obtain a target position of the object to be grabbed in the color image;
and a point cloud data acquisition unit (not shown in fig. 6) for converting the pixel points of the portion corresponding to the target position in the depth image into point cloud data.
As an implementation manner of the embodiment of the present invention, the capture point range determining module 630 may include:
and a grasp point range determination unit (not shown in fig. 6) for determining, as a grasp point range, a range having a distance from the centroid position within a second preset distance in the three-dimensional space.
As an implementation manner of the embodiment of the present invention, the capture scheme determining module 640 may include:
a rotation transformation unit (not shown in fig. 6), configured to perform rotation transformation on the local coordinate system of the grabbing points in the grabbing point range according to a Generate Grasp Pose Candidates algorithm;
and a grasping scheme determining unit (not shown in fig. 6) configured to calculate a grasping scheme corresponding to each grasping point in the grasping point range according to a rotation transformation relationship of the local coordinate system and a projection relationship of the robot arm coordinate system and the point cloud data coordinate system.
As an implementation manner of the embodiment of the present invention, the axial deviation angle determining module 660 may include:
a first deflection angle determination unit (not shown in fig. 6) for extracting a deflection angle corresponding to each grasping point from the alternative grasping solutions;
a second deflection angle determining unit (not shown in fig. 6) configured to calculate a deflection angle of the current position of the robot arm with respect to the object to be grasped, according to the centroid position and the current position of the end of the robot arm;
and an axial deflection angle determination unit (not shown in fig. 6) configured to determine a difference between a deflection angle corresponding to the robot arm and a deflection angle corresponding to each grasping point as an axial deflection angle corresponding to each grasping point.
As an implementation manner of the embodiment of the present invention, the centroid position may be represented as (x0, y0), and the current position of the end of the mechanical arm may be represented as (x*, y*);
The second deflection angle determination unit may include:
a deflection angle determining subunit (not shown in fig. 6), configured to calculate the deflection angle θ of the current position of the mechanical arm relative to the object to be grabbed according to the formula

    θ = arcsin( (x0 - x*) / sqrt((x0 - x*)² + (y0 - y*)²) )
As an implementation manner of the embodiment of the present invention, the apparatus may further include:
and the grabbing control module (not shown in fig. 6) is used for controlling the mechanical arm to grab the object to be grabbed according to the target grabbing scheme.
An embodiment of the present invention further provides an electronic device, as shown in fig. 7, including a processor 701, a communication interface 702, a memory 703 and a communication bus 704, where the processor 701, the communication interface 702, and the memory 703 complete mutual communication through the communication bus 704,
a memory 703 for storing a computer program;
the processor 701 is configured to implement the following steps when executing the program stored in the memory 703:
acquiring point cloud data of an object to be grabbed;
calculating the centroid position of the object to be grabbed according to the point cloud data;
determining a grabbing point range including the centroid position according to the centroid position and a preset rule;
according to a preset grabbing point generating algorithm, carrying out coordinate transformation on grabbing points in the grabbing point range, and calculating a grabbing scheme corresponding to the grabbing points in the grabbing point range according to a coordinate transformation relation;
extracting the position coordinates corresponding to each grabbing point from the grabbing scheme, determining the grabbing points which are within a first preset distance from the current position of the tail end of the mechanical arm in the preset coordinate direction according to the position coordinates, and taking the grabbing scheme corresponding to the determined grabbing points as an alternative grabbing scheme;
extracting a deflection angle corresponding to each grabbing point from the alternative grabbing schemes, and calculating an axial deflection angle corresponding to each alternative grabbing scheme according to the deflection angle, the centroid position and the current position of the tail end of the mechanical arm;
and determining the alternative grabbing scheme corresponding to the minimum axial deflection angle as the target grabbing scheme.
It can be seen that in the solution provided by the embodiment of the present invention, the electronic device obtains point cloud data of the object to be grabbed, calculates the centroid position of the object to be grabbed from the point cloud data, and determines a grabbing point range including the centroid position according to the centroid position and a preset rule. It then performs coordinate transformation on the grabbing points in the grabbing point range according to a preset grabbing point generation algorithm and calculates the grabbing scheme corresponding to each grabbing point in the range. Next, it extracts the position coordinate corresponding to each grabbing point from the grabbing schemes, determines the grabbing points within a first preset distance of the current position of the end of the mechanical arm in a preset coordinate direction, and takes the grabbing schemes corresponding to those grabbing points as alternative grabbing schemes. Finally, it extracts the deflection angle corresponding to each grabbing point from the alternative grabbing schemes, calculates the axial deflection angle corresponding to each alternative grabbing scheme from the deflection angle, the centroid position and the current position of the end of the mechanical arm, and determines the alternative grabbing scheme corresponding to the smallest axial deflection angle as the target grabbing scheme. Because the target grabbing scheme is determined by distance screening and axial deflection angle screening after coordinate transformation of the grabbing points in three-dimensional space, objects in any posture and at any position can be grabbed, which greatly improves grabbing efficiency and flexibility.
The communication bus mentioned in the electronic device may be a Peripheral Component Interconnect (PCI) bus, an Extended Industry Standard Architecture (EISA) bus, or the like. The communication bus may be divided into an address bus, a data bus, a control bus, etc. For ease of illustration, only one thick line is shown, but this does not mean that there is only one bus or one type of bus.
The communication interface is used for communication between the electronic equipment and other equipment.
The Memory may include a Random Access Memory (RAM) or a Non-Volatile Memory (NVM), such as at least one disk Memory. Optionally, the memory may also be at least one memory device located remotely from the processor.
The Processor may be a general-purpose Processor, including a Central Processing Unit (CPU), a Network Processor (NP), and the like; it may also be a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, or a discrete hardware component.
The step of obtaining point cloud data of an object to be grabbed may include:
acquiring a color image and a depth image of the object to be grabbed;
inputting the color image into a preset convolutional neural network for target detection to obtain a target position of the object to be grabbed in the color image;
and converting pixel points of the part corresponding to the target position in the depth image into point cloud data.
The step of determining the grabbing point range including the centroid position according to the centroid position and a preset rule may include:
and determining, as the grabbing point range, the region of three-dimensional space whose distance from the centroid position is within a second preset distance.
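A minimal sketch of this step, computing the centroid of the point cloud and keeping the points within the second preset distance of it. The function name and the 5 cm default radius are illustrative assumptions:

```python
def grasp_point_range(points, radius=0.05):
    """Compute the centroid of a point cloud and keep the points lying
    within `radius` (the 'second preset distance', assumed 5 cm here)
    of the centroid as the grabbing point range."""
    n = len(points)
    centroid = tuple(sum(p[i] for p in points) / n for i in range(3))

    def dist(p):
        return sum((p[i] - centroid[i]) ** 2 for i in range(3)) ** 0.5

    return centroid, [p for p in points if dist(p) <= radius]
```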
The step of performing coordinate transformation on the grabbing points in the grabbing point range according to a preset grabbing point generation algorithm, and calculating the grabbing scheme corresponding to the grabbing points in the grabbing point range according to a coordinate transformation relation may include:
performing rotation transformation on a local coordinate system of the grabbing points in the grabbing point range according to a Generating Grasp Pose Candidates algorithm;
and calculating a grabbing scheme corresponding to each grabbing point in the grabbing point range according to the rotation transformation relation of the local coordinate system and the projection relation of the mechanical arm coordinate system and the point cloud data coordinate system.
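The elementary operation behind this step, rotating a grasp point's local coordinate frame to sample different gripper orientations, can be sketched as follows. This is only the rotation itself; real grasp-candidate generators also align the local frame with the surface normal and curvature axes of the point cloud, which is not shown here:

```python
import math

def rotate_frame_z(points, angle):
    """Rotate each point of a local coordinate frame about its z axis,
    the basic transform applied when enumerating candidate gripper
    orientations at a grasp point."""
    c, s = math.cos(angle), math.sin(angle)
    return [(c * x - s * y, s * x + c * y, z) for (x, y, z) in points]
```

Sampling, say, eight orientations then amounts to applying `rotate_frame_z` at angles 2πk/8 for k = 0..7 and evaluating each resulting pose.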
The step of extracting a deflection angle corresponding to each grabbing point from the alternative grabbing schemes, and calculating an axial deflection angle corresponding to each alternative grabbing scheme according to the deflection angle, the centroid position and the current position of the tail end of the mechanical arm may include:
extracting a deflection angle corresponding to each grabbing point from the alternative grabbing schemes;
calculating the deflection angle of the current position of the mechanical arm relative to the object to be grabbed according to the centroid position and the current position of the tail end of the mechanical arm;
and determining the difference value of the deflection angle corresponding to the mechanical arm and the deflection angle corresponding to each grabbing point as the axial deflection angle corresponding to each grabbing point.
Wherein the centroid position can be expressed as (x₀, y₀), and the current position of the end of the mechanical arm can be expressed as (x*, y*);
The step of calculating the deflection angle of the current position of the mechanical arm relative to the object to be grabbed according to the centroid position and the current position of the tail end of the mechanical arm may include:
according to the formula
θ = arctan((y* − y₀) / (x* − x₀))
And calculating the deflection angle theta of the current position of the mechanical arm relative to the object to be grabbed.
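A minimal sketch of this calculation; atan2 is used in place of a bare arctangent so the vertical case (x₀ = x*) needs no special handling, while giving the same slope otherwise. The function name is an illustrative assumption:

```python
import math

def arm_deflection_angle(centroid, end_pos):
    """Deflection angle theta of the arm end relative to the object,
    from the centroid (x0, y0) and the arm end position (x*, y*)."""
    x0, y0 = centroid
    xs, ys = end_pos
    # Quadrant-aware arctangent of the vector from centroid to arm end.
    return math.atan2(ys - y0, xs - x0)
```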
Wherein, the method can also comprise:
and controlling the mechanical arm to grab the object to be grabbed according to the target grabbing scheme.
An embodiment of the present invention further provides a computer-readable storage medium, in which a computer program is stored, and when executed by a processor, the computer program implements the following steps:
acquiring point cloud data of an object to be grabbed;
calculating the centroid position of the object to be grabbed according to the point cloud data;
determining a grabbing point range including the centroid position according to the centroid position and a preset rule;
according to a preset grabbing point generating algorithm, carrying out coordinate transformation on grabbing points in the grabbing point range, and calculating a grabbing scheme corresponding to the grabbing points in the grabbing point range according to a coordinate transformation relation;
extracting the position coordinates corresponding to each grabbing point from the grabbing scheme, determining the grabbing points which are within a first preset distance from the current position of the tail end of the mechanical arm in the preset coordinate direction according to the position coordinates, and taking the grabbing scheme corresponding to the determined grabbing points as an alternative grabbing scheme;
extracting a deflection angle corresponding to each grabbing point from the alternative grabbing schemes, and calculating an axial deflection angle corresponding to each alternative grabbing scheme according to the deflection angle, the centroid position and the current position of the tail end of the mechanical arm;
and determining the alternative grabbing scheme corresponding to the minimum axial deflection angle as the target grabbing scheme.
It can be seen that, in the solution provided by this embodiment of the present invention, when the computer program is executed by the processor, point cloud data of an object to be grabbed is obtained and the centroid position of the object is calculated from the point cloud data. A grabbing point range including the centroid position is then determined according to a preset rule, coordinate transformation is performed on the grabbing points in that range according to a preset grabbing point generation algorithm, and a grabbing scheme is calculated for each grabbing point in the range. From each grabbing scheme the position coordinates of the corresponding grabbing point are extracted, the grabbing points lying within a first preset distance of the current position of the end of the mechanical arm in a preset coordinate direction are determined, and the grabbing schemes of those points are taken as alternative grabbing schemes. Finally, the deflection angle of each grabbing point is extracted from the alternative grabbing schemes, the axial deflection angle of each alternative scheme is calculated from that deflection angle, the centroid position, and the current position of the end of the mechanical arm, and the alternative scheme with the minimum axial deflection angle is determined as the target grabbing scheme. Because the target grabbing scheme is obtained by coordinate transformation of the grabbing points in three-dimensional space followed by distance screening and axial deflection angle screening, objects in any posture and at any position can be grabbed, greatly improving grabbing efficiency and flexibility.
The step of obtaining point cloud data of an object to be grabbed may include:
acquiring a color image and a depth image of the object to be grabbed;
inputting the color image into a preset convolutional neural network for target detection to obtain a target position of the object to be grabbed in the color image;
and converting pixel points of the part corresponding to the target position in the depth image into point cloud data.
The step of determining the grabbing point range including the centroid position according to the centroid position and a preset rule may include:
and determining, as the grabbing point range, the region of three-dimensional space whose distance from the centroid position is within a second preset distance.
The step of performing coordinate transformation on the grabbing points in the grabbing point range according to a preset grabbing point generation algorithm, and calculating the grabbing scheme corresponding to the grabbing points in the grabbing point range according to a coordinate transformation relation may include:
performing rotation transformation on a local coordinate system of the grabbing points in the grabbing point range according to a Generating Grasp Pose Candidates algorithm;
and calculating a grabbing scheme corresponding to each grabbing point in the grabbing point range according to the rotation transformation relation of the local coordinate system and the projection relation of the mechanical arm coordinate system and the point cloud data coordinate system.
The step of extracting a deflection angle corresponding to each grabbing point from the alternative grabbing schemes, and calculating an axial deflection angle corresponding to each alternative grabbing scheme according to the deflection angle, the centroid position and the current position of the tail end of the mechanical arm may include:
extracting a deflection angle corresponding to each grabbing point from the alternative grabbing schemes;
calculating the deflection angle of the current position of the mechanical arm relative to the object to be grabbed according to the centroid position and the current position of the tail end of the mechanical arm;
and determining the difference value of the deflection angle corresponding to the mechanical arm and the deflection angle corresponding to each grabbing point as the axial deflection angle corresponding to each grabbing point.
Wherein the centroid position can be expressed as (x₀, y₀), and the current position of the end of the mechanical arm can be expressed as (x*, y*);
The step of calculating the deflection angle of the current position of the mechanical arm relative to the object to be grabbed according to the centroid position and the current position of the tail end of the mechanical arm may include:
according to the formula
θ = arctan((y* − y₀) / (x* − x₀))
And calculating the deflection angle theta of the current position of the mechanical arm relative to the object to be grabbed.
Wherein, the method can also comprise:
and controlling the mechanical arm to grab the object to be grabbed according to the target grabbing scheme.
It should be noted that, for the above-mentioned apparatus, electronic device and computer-readable storage medium embodiments, since they are basically similar to the method embodiments, the description is relatively simple, and for the relevant points, reference may be made to the partial description of the method embodiments.
It is further noted that, herein, relational terms such as first and second may be used solely to distinguish one entity or action from another, without necessarily requiring or implying any actual such relationship or order between those entities or actions. Also, the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed, or elements inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a ..." does not exclude the presence of other identical elements in the process, method, article, or apparatus that comprises that element.
All the embodiments in the present specification are described in a related manner, and the same and similar parts among the embodiments may be referred to each other, and each embodiment focuses on the differences from the other embodiments. In particular, for the system embodiment, since it is substantially similar to the method embodiment, the description is simple, and for the relevant points, reference may be made to the partial description of the method embodiment.
The above description is only for the preferred embodiment of the present invention, and is not intended to limit the scope of the present invention. Any modification, equivalent replacement, or improvement made within the spirit and principle of the present invention shall fall within the protection scope of the present invention.

Claims (16)

1. A method for generating a mechanical arm grabbing scheme is characterized by comprising the following steps:
acquiring point cloud data of an object to be grabbed;
calculating the centroid position of the object to be grabbed according to the point cloud data;
determining a grabbing point range including the centroid position according to the centroid position and a preset rule;
according to a preset grabbing point generating algorithm, performing rotation transformation on a local coordinate system of the grabbing points in the grabbing point range to obtain information of different postures of the object to be grabbed at different grabbing point positions, and calculating a grabbing scheme corresponding to the grabbing points in the grabbing point range according to a rotation transformation relation;
extracting the position coordinates corresponding to each grabbing point from the grabbing scheme, determining the grabbing points which are within a first preset distance from the current position of the tail end of the mechanical arm in the preset coordinate direction according to the position coordinates, and taking the grabbing scheme corresponding to the determined grabbing points as an alternative grabbing scheme;
extracting a deflection angle corresponding to each grabbing point from the alternative grabbing schemes, and calculating an axial deflection angle corresponding to each alternative grabbing scheme according to the deflection angle, the centroid position and the current position of the tail end of the mechanical arm;
and determining the alternative grabbing scheme corresponding to the minimum axial deflection angle as the target grabbing scheme.
2. The method of claim 1, wherein the step of obtaining point cloud data of the object to be grabbed comprises:
acquiring a color image and a depth image of the object to be grabbed;
inputting the color image into a preset convolutional neural network for target detection to obtain a target position of the object to be grabbed in the color image;
and converting pixel points of the part corresponding to the target position in the depth image into point cloud data.
3. The method of claim 1, wherein the step of determining a range of grasp points including the centroid position according to a preset rule based on the centroid position comprises:
and determining, as the grabbing point range, the region of three-dimensional space whose distance from the centroid position is within a second preset distance.
4. The method according to claim 1, wherein the step of performing rotation transformation on the grabbing points in the grabbing point range according to a preset grabbing point generation algorithm to obtain information of different postures of the object to be grabbed at different grabbing point positions, and calculating a grabbing scheme corresponding to the grabbing points in the grabbing point range according to a rotation transformation relation comprises:
performing rotation transformation on a local coordinate system of the grabbing points in the grabbing point range according to a Generating Grasp Pose Candidates algorithm;
and calculating a grabbing scheme corresponding to each grabbing point in the grabbing point range according to the rotation transformation relation of the local coordinate system and the projection relation of the mechanical arm coordinate system and the point cloud data coordinate system.
5. The method of claim 1, wherein the step of extracting a deflection angle corresponding to each grasping point from the alternative grasping solutions and calculating an axial deflection angle corresponding to each alternative grasping solution according to the deflection angle, the centroid position and the current position of the end of the mechanical arm comprises:
extracting a deflection angle corresponding to each grabbing point from the alternative grabbing schemes;
calculating the deflection angle of the current position of the mechanical arm relative to the object to be grabbed according to the centroid position and the current position of the tail end of the mechanical arm;
and determining the difference value of the deflection angle corresponding to the mechanical arm and the deflection angle corresponding to each grabbing point as the axial deflection angle corresponding to each grabbing point.
6. The method of claim 5, wherein the centroid position is represented as (x₀, y₀), and the current position of the end of the mechanical arm is represented as (x*, y*);
The step of calculating the deflection angle of the current position of the mechanical arm relative to the object to be grabbed according to the centroid position and the current position of the tail end of the mechanical arm comprises the following steps:
according to the formula
θ = arctan((y* − y₀) / (x* − x₀))
And calculating the deflection angle theta of the current position of the mechanical arm relative to the object to be grabbed.
7. The method of any one of claims 1-6, further comprising:
and controlling the mechanical arm to grab the object to be grabbed according to the target grabbing scheme.
8. An apparatus for generating a robot gripping plan, the apparatus comprising:
the point cloud data acquisition module is used for acquiring point cloud data of an object to be grabbed;
the mass center position determining module is used for calculating the mass center position of the object to be grabbed according to the point cloud data;
the grasping point range determining module is used for determining a grasping point range including the centroid position according to the centroid position and a preset rule;
the grabbing scheme determining module is used for performing rotation transformation on a local coordinate system of the grabbing points in the grabbing point range according to a preset grabbing point generating algorithm to obtain information of different postures of the object to be grabbed at different grabbing point positions, and calculating a grabbing scheme corresponding to the grabbing points in the grabbing point range according to a rotation transformation relation;
the alternative grabbing scheme determining module is used for extracting position coordinates corresponding to each grabbing point from the grabbing scheme, determining grabbing points which are within a first preset distance from the current position of the tail end of the mechanical arm in a preset coordinate direction according to the position coordinates, and taking the grabbing scheme corresponding to the determined grabbing points as an alternative grabbing scheme;
the axial deflection angle determining module is used for extracting a deflection angle corresponding to each grabbing point from the alternative grabbing schemes, and calculating the axial deflection angle corresponding to each alternative grabbing scheme according to the deflection angle, the centroid position and the current position of the tail end of the mechanical arm;
and the target grabbing scheme generating module is used for determining the alternative grabbing scheme corresponding to the minimum axial deflection angle as the target grabbing scheme.
9. The apparatus of claim 8, wherein the point cloud data acquisition module comprises:
the image acquisition unit is used for acquiring a color image and a depth image of the object to be grabbed;
the target position determining unit is used for inputting the color image into a preset convolutional neural network for target detection to obtain a target position of the object to be grabbed in the color image;
and the point cloud data acquisition unit is used for converting the pixel points of the part corresponding to the target position in the depth image into point cloud data.
10. The apparatus of claim 8, wherein the grasp point range determination module comprises:
and the grasping point range determining unit is used for determining, as the grabbing point range, the region of three-dimensional space whose distance from the centroid position is within a second preset distance.
11. The apparatus of claim 8, wherein the grasping scheme determining module comprises:
the rotation transformation unit is used for performing rotation transformation on a local coordinate system of the grabbing points in the grabbing point range according to a Generating Grasp Pose Candidates algorithm;
and the grabbing scheme determining unit is used for calculating the grabbing scheme corresponding to each grabbing point in the grabbing point range according to the rotation transformation relation of the local coordinate system and the projection relation of the mechanical arm coordinate system and the point cloud data coordinate system.
12. The apparatus of claim 8, wherein the axial declination determination module comprises:
a first deflection angle determination unit, configured to extract a deflection angle corresponding to each grabbing point from the alternative grabbing schemes;
the second deflection angle determining unit is used for calculating the deflection angle of the current position of the mechanical arm relative to the object to be grabbed according to the centroid position and the current position of the tail end of the mechanical arm;
and the axial deflection angle determining unit is used for determining the difference value of the deflection angle corresponding to the mechanical arm and the deflection angle corresponding to each grabbing point as the axial deflection angle corresponding to each grabbing point.
13. The apparatus of claim 12, wherein the centroid position is represented as (x₀, y₀), and the current position of the end of the mechanical arm is represented as (x*, y*);
The second deflection angle determination unit includes:
a deflection angle determining subunit for determining a deflection angle according to a formula
θ = arctan((y* − y₀) / (x* − x₀))
And calculating the deflection angle theta of the current position of the mechanical arm relative to the object to be grabbed.
14. The apparatus of any one of claims 8-13, further comprising:
and the grabbing control module is used for controlling the mechanical arm to grab the object to be grabbed according to the target grabbing scheme.
15. An electronic device, characterized by comprising a processor, a communication interface, a memory and a communication bus, wherein the processor, the communication interface and the memory communicate with each other through the communication bus;
a memory for storing a computer program;
a processor for implementing the method steps of any of claims 1 to 7 when executing a program stored in the memory.
16. A computer-readable storage medium, characterized in that a computer program is stored in the computer-readable storage medium, which computer program, when being executed by a processor, carries out the method steps of any one of claims 1 to 7.
CN201710891044.1A 2017-09-27 2017-09-27 Method and device for generating mechanical arm grabbing scheme Active CN109559341B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710891044.1A CN109559341B (en) 2017-09-27 2017-09-27 Method and device for generating mechanical arm grabbing scheme

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201710891044.1A CN109559341B (en) 2017-09-27 2017-09-27 Method and device for generating mechanical arm grabbing scheme

Publications (2)

Publication Number Publication Date
CN109559341A CN109559341A (en) 2019-04-02
CN109559341B true CN109559341B (en) 2021-03-26

Family

ID=65863718

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710891044.1A Active CN109559341B (en) 2017-09-27 2017-09-27 Method and device for generating mechanical arm grabbing scheme

Country Status (1)

Country Link
CN (1) CN109559341B (en)

Families Citing this family (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110181500B (en) * 2019-06-06 2021-07-13 广东海洋大学 Control system and control method of bionic manipulator
CN110744544B (en) * 2019-10-31 2021-03-02 昆山市工研院智能制造技术有限公司 Service robot vision grabbing method and service robot
JP2021133470A (en) * 2020-02-28 2021-09-13 セイコーエプソン株式会社 Control method of robot and robot system
WO2022086157A1 (en) * 2020-10-20 2022-04-28 삼성전자주식회사 Electronic apparatus and control method thereof
CN113674348B (en) * 2021-05-28 2024-03-15 中国科学院自动化研究所 Object grabbing method, device and system
CN113731860B (en) * 2021-09-03 2023-10-24 西安建筑科技大学 Automatic sorting system and method for piled articles in container
CN115213721B (en) * 2022-09-21 2022-12-30 江苏友邦精工实业有限公司 A upset location manipulator for automobile frame processing
CN115995013A (en) * 2023-03-21 2023-04-21 江苏金恒信息科技股份有限公司 Covering agent adding method, covering agent adding device, computer equipment and storage medium

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106530297A (en) * 2016-11-11 2017-03-22 北京睿思奥图智能科技有限公司 Object grabbing region positioning method based on point cloud registering

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4059614B2 (en) * 2000-05-22 2008-03-12 三菱電機株式会社 Control device for 3D laser processing machine
CN104217441B (en) * 2013-08-28 2017-05-10 北京嘉恒中自图像技术有限公司 Mechanical arm positioning fetching method based on machine vision
CN106874914B (en) * 2017-01-12 2019-05-14 华南理工大学 A kind of industrial machinery arm visual spatial attention method based on depth convolutional neural networks
CN106932780A (en) * 2017-03-14 2017-07-07 北京京东尚科信息技术有限公司 Object positioning method, device and system

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106530297A (en) * 2016-11-11 2017-03-22 北京睿思奥图智能科技有限公司 Object grabbing region positioning method based on point cloud registering

Also Published As

Publication number Publication date
CN109559341A (en) 2019-04-02

Similar Documents

Publication Publication Date Title
CN109559341B (en) Method and device for generating mechanical arm grabbing scheme
CN107813310B (en) Multi-gesture robot control method based on binocular vision
CN106767393B (en) Hand-eye calibration device and method for robot
CN111627072B (en) Method, device and storage medium for calibrating multiple sensors
CN113409384B (en) Pose estimation method and system of target object and robot
JP2019056966A (en) Information processing device, image recognition method and image recognition program
CN113246140B (en) Multi-model workpiece disordered grabbing method and device based on camera measurement
CN112950667A (en) Video annotation method, device, equipment and computer readable storage medium
Nguyen et al. 3D scanning system for automatic high-resolution plant phenotyping
US20220414910A1 (en) Scene contour recognition method and apparatus, computer-readable medium, and electronic device
CN112171666B (en) Pose calibration method and device for visual robot, visual robot and medium
CN111178317A (en) Detection positioning method, system, device, electronic equipment and storage medium
CN108748149B (en) Non-calibration mechanical arm grabbing method based on deep learning in complex environment
CN113610921A (en) Hybrid workpiece grabbing method, device and computer-readable storage medium
CN110751691A (en) Automatic pipe fitting grabbing method based on binocular vision
WO2022021156A1 (en) Method and apparatus for robot to grab three-dimensional object
CN115816460B (en) Mechanical arm grabbing method based on deep learning target detection and image segmentation
JP2014029664A (en) Image comparison range generation method, positional orientation detection method, image comparison range generation device, positional orientation detection device, robot, robot system, image comparison range generation program and positional orientation detection program
CN115205286B (en) Method for identifying and positioning bolts of mechanical arm of tower-climbing robot, storage medium and terminal
CN112686950A (en) Pose estimation method and device, terminal equipment and computer readable storage medium
CN110181504B (en) Method and device for controlling mechanical arm to move and control equipment
CN116985141B (en) Industrial robot intelligent control method and system based on deep learning
WO2016123813A1 (en) Attitude relationship calculation method for intelligent device, and intelligent device
CN116844124A (en) Three-dimensional object detection frame labeling method, three-dimensional object detection frame labeling device, electronic equipment and storage medium
JP7161857B2 (en) Information processing device, information processing method, and program

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant