CN111844019A - Method and device for determining grabbing position of machine, electronic device and storage medium - Google Patents


Info

Publication number
CN111844019A
CN111844019A (application CN202010525389.7A; granted as CN111844019B)
Authority
CN
China
Prior art keywords
grabbing
information
determining
sample
point
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202010525389.7A
Other languages
Chinese (zh)
Other versions
CN111844019B (en)
Inventor
Wang Shuaishuai (王帅帅)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Anhui Hongcheng Opto Electronics Co Ltd
Original Assignee
Anhui Hongcheng Opto Electronics Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Anhui Hongcheng Opto Electronics Co Ltd filed Critical Anhui Hongcheng Opto Electronics Co Ltd
Priority to CN202010525389.7A priority Critical patent/CN111844019B/en
Publication of CN111844019A publication Critical patent/CN111844019A/en
Application granted granted Critical
Publication of CN111844019B publication Critical patent/CN111844019B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • B — PERFORMING OPERATIONS; TRANSPORTING
    • B25 — HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J — MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J 9/00 — Programme-controlled manipulators
    • B25J 9/16 — Programme controls
    • B25J 9/1602 — Programme controls characterised by the control system, structure, architecture
    • B25J 9/161 — Hardware, e.g. neural networks, fuzzy logic, interfaces, processor
    • B25J 9/1612 — Programme controls characterised by the hand, wrist, grip control
    • B25J 9/1656 — Programme controls characterised by programming, planning systems for manipulators
    • B25J 9/1661 — Programme controls characterised by task planning, object-oriented languages
    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06T — IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 — Image analysis
    • G06T 7/10 — Segmentation; Edge detection
    • G06T 7/13 — Edge detection
    • G06T 7/70 — Determining position or orientation of objects or cameras
    • G06T 7/73 — Determining position or orientation of objects or cameras using feature-based methods
    • G06T 2207/00 — Indexing scheme for image analysis or image enhancement
    • G06T 2207/20 — Special algorithmic details
    • G06T 2207/20081 — Training; Learning
    • G06T 2207/30 — Subject of image; Context of image processing
    • G06T 2207/30108 — Industrial image inspection
    • G06T 2207/30164 — Workpiece; Machine component


Abstract

One or more embodiments of the present specification provide a method, device, electronic device, and storage medium for determining a machine grabbing position, including: acquiring image information of placed objects and generating an image coordinate system; detecting the image information to distinguish object-to-be-grabbed samples from overlapping-area samples, and performing image processing on them to generate intermediate contours; establishing a first circumscribed rectangle for each intermediate contour, determining preselected grabbing points from the short sides of the first circumscribed rectangle, and determining the grabbing point from the position information of the preselected grabbing points; determining the coordinate information and angle information of the grabbing point; and transmitting the coordinate information and the angle information to the grabbing arm. With one or more embodiments of this specification, when the machine grabs overlapping objects it selects a grabbing position away from the overlap and adjusts to the corresponding angle; it can therefore correctly distinguish overlapping objects and grab each object at the correct position quickly and accurately, effectively protecting both the object itself and the whole tooling process.

Description

Method and device for determining grabbing position of machine, electronic device and storage medium
Technical Field
One or more embodiments of the present disclosure relate to the field of automation control technologies, and in particular, to a method, an apparatus, an electronic apparatus, and a storage medium for determining a gripping position of a machine.
Background
With the development of artificial intelligence technology, robots play an increasingly important role in industrial production and family life. Robot grabbing is an important means of human-machine interaction: a robot can grab a target with the gripper mounted at the end of its mechanical arm, and this can be applied in fields such as assembly-line sorting and household service, so grabbing parts has become an important link in robot applications.
At present, a mechanical arm based on machine vision can automatically recognize and grab workpieces, but only single workpieces. If two workpieces overlap, the arm cannot recognize and grab them accurately, and mistaken grabs often damage the workpiece or the assembly, seriously affecting working efficiency and working cost.
Disclosure of Invention
In view of the above, an object of one or more embodiments of the present specification is to provide a machine-grasp-position determining method, apparatus, electronic apparatus, and storage medium.
In view of the above object, one or more embodiments of the present specification provide a method for determining a gripping position of a machine, including:
acquiring image information of a placed object, and generating an image coordinate system;
detecting the image information, distinguishing a sample of the object to be grabbed and a sample of an overlapping area, and carrying out image processing on the sample of the object to be grabbed and the sample of the overlapping area to generate a middle contour;
establishing a first external rectangle of the middle outline, determining a preselected grabbing point according to the short side of the first external rectangle, and determining the grabbing point according to the position information of the preselected grabbing point;
determining grabbing coordinate information and grabbing angle information of the grabbing points according to the image coordinate system;
and transmitting the grabbing coordinate information and the grabbing angle information to a grabbing arm, so that the grabbing arm performs the grab.
In some embodiments, the establishing a first circumscribed rectangle of the intermediate profile, determining a preselected grasping point from a short side of the first circumscribed rectangle, includes:
when the number of the middle outlines is larger than 1, determining the middle point of each short side in the first circumscribed rectangle as the preselected grabbing point;
setting the intermediate contour with the smallest area as a secondary contour, the first circumscribed rectangle corresponding to the secondary contour being a secondary rectangle, and removing the midpoints of the short sides of the secondary rectangle from the preselected grabbing points;
And when the number of the middle outlines is 1, determining the middle point of each short side in the first circumscribed rectangle as the preselected grabbing point.
In some embodiments, said determining a grasp point from said location information of said preselected grasp point comprises:
when the number of the middle profiles is more than 1, determining a first central position of the secondary profile, measuring the distance from the preselected grabbing point to the first central position to generate the position information, and setting the preselected grabbing point with the largest position information as the grabbing point;
when the number of the middle profiles is 1, determining a second center position of the overlapping area sample, measuring the distance from the preselected grabbing point to the second center position to generate the position information, and setting the preselected grabbing point with the largest position information as the grabbing point.
In some embodiments, the detecting the image information to distinguish the sample of the object to be grabbed and the sample of the overlapping area includes:
obtaining at least two sample images with mutually overlapped objects, and training a target detection training model through the sample images;
and inputting the image information into the trained target detection training model, and distinguishing the object sample to be grabbed and the overlapping area sample.
In some embodiments, the image processing the sample of the object to be grabbed and the sample of the overlapping area to generate an intermediate contour includes:
and removing the area where the overlapped area sample is located in the object sample to be grabbed, carrying out binarization processing on the residual area, and carrying out contour extraction on the binarized residual area through OpenCV to generate the middle contour.
In some embodiments, the determining grabbing coordinate information of the grabbing point according to the image coordinate system includes:
generating a two-dimensional camera coordinate system through a monocular camera, and determining two-dimensional coordinate information of the grabbing point in the two-dimensional camera coordinate system;
and taking a platform coordinate system of the grabbing platform as the image coordinate system, and converting the two-dimensional coordinate information into the grabbing coordinate information in the image coordinate system.
In some embodiments, the determining grabbing angle information of the grabbing point according to the image coordinate system includes:
determining a target object in the object to be grabbed according to the grabbing point, and establishing a second external rectangle of the target object;
determining a first vertex coordinate, a second vertex coordinate and a third vertex coordinate of the second external rectangle in the image coordinate system, wherein the first vertex coordinate, the second vertex coordinate and the third vertex coordinate are sequentially distributed on the second external rectangle;
Comparing a first distance of the first vertex coordinate to the second vertex coordinate to a second distance of the second vertex coordinate to the third vertex coordinate;
when the first distance is larger than or equal to the second distance, determining the grabbing angle information according to the specific coordinate values of the first vertex coordinate and the second vertex coordinate;
and when the first distance is smaller than the second distance, determining the grabbing angle information according to the specific coordinate values of the second vertex coordinate and the third vertex coordinate.
In some embodiments, the transmitting the grabbing coordinate information and the grabbing angle information to a grabbing arm includes:
creating a socket protocol, binding an output port consistent with a receiving port of the grabbing arm, and creating communication connection through the receiving port and the output port;
and transmitting the grabbing coordinate information and the grabbing angle information to the grabbing arm through the communication connection.
Based on the same concept, one or more embodiments of the present specification further provide a machine-grasping-position determining apparatus including:
the acquisition module acquires image information of a placed object and generates an image coordinate system;
The generating module is used for detecting the image information, distinguishing a sample of the object to be grabbed and a sample of an overlapping area, and carrying out image processing on the sample of the object to be grabbed and the sample of the overlapping area to generate a middle contour;
the determining module is used for establishing a first external rectangle of the middle outline, determining a preselected grabbing point according to the short side of the first external rectangle, and determining the grabbing point according to the position information of the preselected grabbing point;
the calculation module is used for determining grabbing coordinate information and grabbing angle information of the grabbing points according to the image coordinate system;
and the transmission module transmits the grabbing coordinate information and the grabbing angle information to the grabbing arm so that the grabbing arm can grab.
Based on the same concept, one or more embodiments of the present specification further provide an electronic device, which includes a memory, a processor, and a computer program stored on the memory and executable on the processor, and the processor implements the method as described in any one of the above when executing the program.
Based on the same concept, one or more embodiments of the present specification also provide a non-transitory computer-readable storage medium storing computer instructions for causing the computer to perform the method as any one of the above.
As can be seen from the above description, one or more embodiments of the present specification provide a method, device, electronic device, and storage medium for determining a machine grabbing position, including: acquiring image information of placed objects and generating an image coordinate system; detecting the image information to distinguish object-to-be-grabbed samples from overlapping-area samples, and performing image processing on them to generate intermediate contours; establishing a first circumscribed rectangle for each intermediate contour, determining preselected grabbing points from the short sides of the first circumscribed rectangle, and determining the grabbing point from the position information of the preselected grabbing points; determining the grabbing coordinate information and grabbing angle information of the grabbing point according to the image coordinate system; and transmitting the grabbing coordinate information and the grabbing angle information to a grabbing arm, so that the grabbing arm performs the grab. With one or more embodiments of this specification, when the machine grabs overlapping objects it selects a grabbing position away from the overlap and adjusts to the corresponding angle; it can therefore correctly distinguish overlapping objects and grab each object at the correct position quickly and accurately, effectively protecting both the object itself and the whole tooling process.
Drawings
In order to more clearly illustrate one or more embodiments or prior art solutions of the present specification, the drawings that are needed in the description of the embodiments or prior art will be briefly described below, and it is obvious that the drawings in the following description are only one or more embodiments of the present specification, and that other drawings may be obtained by those skilled in the art without inventive effort from these drawings.
Fig. 1 is a schematic flowchart of a method for determining a gripping position of a machine according to one or more embodiments of the present disclosure;
FIG. 2 is a schematic view of a landing object according to one or more embodiments of the present disclosure;
FIG. 3 is a schematic view of a plurality of intermediate profile bridging objects set forth in one or more embodiments of the present disclosure;
FIG. 4 is a schematic view of a mid-profile attachment object according to one or more embodiments of the present disclosure;
fig. 5 is a schematic structural diagram of a machine grasping position determining apparatus according to one or more embodiments of the present disclosure;
fig. 6 is a schematic structural diagram of an electronic device according to one or more embodiments of the present disclosure.
Detailed Description
To make the objects, technical solutions and advantages of the present specification more apparent, the present specification is further described in detail below with reference to the accompanying drawings in combination with specific embodiments.
It should be noted that, unless otherwise defined, technical or scientific terms used in the embodiments of the present specification have the ordinary meaning understood by those skilled in the art to which the present disclosure belongs. The words "first", "second", and similar terms used in this disclosure do not indicate any order, quantity, or importance, but are only used to distinguish one element from another. Words such as "comprising" or "comprises" mean that the element or article preceding the word encompasses the elements, articles, or method steps listed after the word and their equivalents, without excluding other elements, articles, or method steps. Words such as "connected" or "coupled" are not limited to physical or mechanical connections and may include electrical connections, whether direct or indirect. "Upper", "lower", "left", "right", and the like only indicate relative positional relationships, which may change accordingly when the absolute position of the described object changes.
As described in the background section, the conventional robot arm directly grips the center position of the workpiece, but in the case where two workpieces are overlapped, the grip center position may not be suitable, and therefore, the grip position of the workpiece needs to be redefined.
Combining the above practical situation, one or more embodiments of the present specification provide a scheme for determining a machine grabbing position: when grabbing overlapping objects, a position away from the overlap is selected for grabbing and the corresponding angle is adjusted at the same time, so that overlapping objects can be correctly identified, the correct grabbing position can be determined quickly and accurately, and both the object itself and the whole tooling process are effectively protected.
The technical solutions of one or more embodiments of the present specification are described in detail below with reference to specific embodiments.
Referring to fig. 1, a method for determining a gripping position of a machine according to an embodiment of the present disclosure includes the following steps:
step 101, obtaining image information of a placed object, and generating an image coordinate system.
This step aims to obtain an image of the placed objects and generate a corresponding coordinate system. In this embodiment, the placed objects generally rest on a level platform and are mainly stacked or overlapping objects, for example two overlapping bolts. The image information is information that reflects the real appearance of the objects; it may be a picture taken by a camera, a picture scanned by an infrared device, and so on. Shooting with a camera can be further divided into shooting with a monocular camera and shooting with a binocular camera: a monocular camera yields a two-dimensional image of the object, while a binocular camera yields three-dimensional information about the object but is more complicated to install and compute with than a monocular camera.
After that, an image coordinate system is generated. The image coordinate system may be a camera coordinate system with a projection position of a camera center as an origin; or a workpiece coordinate system with the table top where the object is located as a whole and the end point as the origin; but also a tool coordinate system with the gripping point of the robot gripping arm as the origin, etc. The image coordinate system may identify points on the placed object.
And 102, detecting the image information, distinguishing a sample of the object to be grabbed and a sample of an overlapping area, and carrying out image processing on the sample of the object to be grabbed and the sample of the overlapping area to generate a middle contour.
This step distinguishes the object-to-be-grabbed samples and the overlapping-area samples in the image information, and then performs image processing to generate intermediate contours. The placed objects are several single objects stacked or overlapped on one another; an object-to-be-grabbed sample is the image region where one single object lies (the region may also contain other objects, but it necessarily contains one single object), and an overlapping-area sample is the image region where the objects stack or overlap. In a specific application scenario, as shown in fig. 2, the boxes denoted 201 and 202 are the first and second object-to-be-grabbed samples, and the box denoted 203 is the first overlapping-area sample.
To distinguish the object-to-be-grabbed samples from the overlapping-area samples, the image may be detected with a neural-network target detection algorithm, for example SPP-Net, Faster R-CNN, YOLOv1–v3, and the like.
Then, the samples are subjected to image processing to generate an intermediate contour. The middle contour is the contour of the rest object in the object sample to be grabbed after the overlapped area sample is removed from the object sample to be grabbed. In a specific application scenario, as shown in fig. 3, the object contours in reference numerals 204, 205 and 206 are the first intermediate contour, the second intermediate contour and the third intermediate contour. The image processing process can be in various forms, such as: performing image analysis on the sample of the object to be grabbed and the sample of the overlapped area, deleting the overlapped area in the sample of the object to be grabbed, then performing binarization processing on the residual area to distinguish an object side line, and extracting the side line to generate a middle contour; or edge processing is directly carried out on the sample of the object to be grabbed to generate an edge of the object, then all the edges are analyzed, the overlapped part of the edges is removed, and the rest is the middle contour.
Step 103, establishing a first external rectangle of the middle outline, determining a preselected grabbing point according to the short side of the first external rectangle, and determining the grabbing point according to the position information of the preselected grabbing point.
The method comprises the steps of determining a middle outline, externally connecting a rectangle to the middle outline, determining preselected grabbing points through short edges of the externally connected rectangle, and then selecting final grabbing points from the preselected grabbing points. Wherein the circumscribed rectangle is the smallest rectangle that completely contains the middle outline.
Next, the preselected grabbing points are determined from the short sides of the circumscribed rectangles. When the number of intermediate contours is 1, the midpoint of each short side of the circumscribed rectangle is determined as a preselected grabbing point; that is, 1 intermediate contour corresponds to 2 preselected grabbing points, the midpoints of the two short sides of its circumscribed rectangle. In a specific application scenario, as shown in fig. 4, reference numeral 301 is the third object-to-be-grabbed sample, reference numeral 302 is the second overlapping-area sample, and the object contour in reference numeral 303 is the fourth intermediate contour. Only one intermediate contour, the fourth, remains after the second overlapping-area sample is removed from the third object-to-be-grabbed sample, so its circumscribed rectangle is determined directly and the midpoints of its two short sides become the preselected grabbing points. When the number of intermediate contours is greater than 1, the contour with the smallest area is selected as the secondary contour, and its circumscribed rectangle as the secondary rectangle; the midpoint of each short side of the circumscribed rectangles other than the secondary rectangle is determined as a preselected grabbing point. That is, the circumscribed rectangle corresponding to the smallest intermediate contour is excluded, and the midpoints of the two short sides of the other circumscribed rectangles are the preselected grabbing points. For example, when there are 2 intermediate contours there are 2 preselected grabbing points; when there are 3 intermediate contours there are 4; and so on. In a specific application scenario, as shown in fig. 3, the second intermediate contour in reference numeral 205 is the secondary contour; the short sides of the circumscribed rectangles corresponding to the first and third intermediate contours in 204 and 206 are determined, and the midpoints of those short sides are selected as the preselected grabbing points.
The final grabbing point is then selected according to the position information of the preselected grabbing points. The position information is the distance from each preselected grabbing point to a common reference point, for example: the distance from a preselected grabbing point to the centre of the overlapping area, to one end point of the overlapping area, or to a point in another contour (such as the secondary contour). The preselected grabbing point with the largest or smallest position information can then be selected as the final grabbing point.
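The two geometric steps above (midpoints of the short sides as preselected points, then the point farthest from a reference position) can be sketched in plain Python. The ordered corner format and function names are assumptions for illustration.

```python
import math

def short_side_midpoints(corners):
    """Given the four corners of a circumscribed rectangle in consecutive
    order, return the midpoints of its two short sides as the preselected
    grabbing points. Corner format [(x, y)] * 4 is assumed."""
    e01 = math.dist(corners[0], corners[1])
    e12 = math.dist(corners[1], corners[2])
    # The two opposite-edge pairs are (0-1, 2-3) and (1-2, 3-0);
    # pick the pair with the shorter edge length.
    if e01 <= e12:
        pairs = [(corners[0], corners[1]), (corners[2], corners[3])]
    else:
        pairs = [(corners[1], corners[2]), (corners[3], corners[0])]
    return [((a[0] + b[0]) / 2, (a[1] + b[1]) / 2) for a, b in pairs]

def pick_grasp_point(candidates, reference):
    """Choose the preselected point with the largest position information,
    i.e. farthest from the reference (centre of the secondary contour,
    or of the overlapping-area sample)."""
    return max(candidates, key=lambda p: math.dist(p, reference))
```

For a 2 x 6 rectangle with corners (0,0), (2,0), (2,6), (0,6), the preselected points are (1, 0) and (1, 6); with the overlap centre near (1, 1), the point (1, 6), farthest from the overlap, is chosen.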
And 104, determining the grabbing coordinate information and the grabbing angle information of the grabbing point according to the image coordinate system.
This step aims to determine the coordinate information and grabbing angle of the grabbing point. During machine grabbing there may be a camera coordinate system, a workpiece coordinate system, a tool coordinate system, and so on. The coordinate information of the grabbing point may be expressed in any of these three coordinate systems, or the grabbing point may be placed, through conversion between them, into whichever coordinate system is most convenient for the machine grab. Meanwhile, the coordinate system may be a two-dimensional coordinate system formed by a monocular camera, a three-dimensional coordinate system formed by a binocular camera, and so on, so the final form of the grabbing point's coordinate information is not always the same.
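For the monocular, two-dimensional case described above, converting a pixel coordinate into the platform coordinate system can be as simple as a calibrated scale-and-offset mapping. This is a hedged sketch: the plain linear model, the mm-per-pixel scale, and the parameter names are assumptions (a full calibration would typically use a homography).

```python
def pixel_to_platform(u, v, scale_x, scale_y, origin_x, origin_y):
    """Map a pixel coordinate (u, v) from the monocular camera image into
    the gripping-platform coordinate system, assuming a known scale
    (units per pixel) and the pixel position of the platform origin."""
    x = (u - origin_x) * scale_x
    y = (v - origin_y) * scale_y
    return x, y
```

With the platform origin imaged at pixel (100, 200) and 0.5 mm per pixel, pixel (110, 220) maps to platform point (5.0, 10.0) mm.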
Then the grabbing angle is determined; the calculation can be set in any of the above coordinate systems. Since the placed object itself has a definite shape, the angle it forms with a plane or in space is also generally definite; that is, the grabbing angle may be an angle in a two-dimensional coordinate system, an angle in a three-dimensional coordinate system, and so on. The grabbing angle during grabbing can be determined by a corresponding calculation, for example from a line segment and its projection in the two-dimensional coordinate system, or from the positions of the vertices.
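The vertex-based rule from the earlier embodiment (compare the two edge lengths of the second circumscribed rectangle and take the angle of the longer edge) can be sketched as follows; the degree convention and folding into [0, 180) are assumptions for illustration.

```python
import math

def grasp_angle(p1, p2, p3):
    """Compute the grabbing angle from three consecutive vertices of the
    second circumscribed rectangle: if edge p1-p2 is at least as long as
    edge p2-p3, the angle follows p1-p2, otherwise p2-p3.
    Returns degrees in [0, 180). Vertex format (x, y) is assumed."""
    d1 = math.dist(p1, p2)  # first distance
    d2 = math.dist(p2, p3)  # second distance
    a, b = (p1, p2) if d1 >= d2 else (p2, p3)
    angle = math.degrees(math.atan2(b[1] - a[1], b[0] - a[0]))
    return angle % 180.0
```

For vertices (0, 0), (4, 0), (4, 2) the first distance wins and the angle is 0 degrees; for (0, 0), (1, 0), (1, 3) the second distance wins and the angle is 90 degrees.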
And 105, transmitting the grabbing coordinate information and the grabbing angle information to a grabbing arm, so that the grabbing arm grabs.
This step aims to transmit the grabbing coordinate information and grabbing angle information to the grabbing arm, so that the arm moves according to this information and completes the final grab. Communication may be established in many ways, for example: controlling the grabbing arm directly through a circuit board, in which the above steps are also computed; setting corresponding ports and writing a communication program between the grabbing arm and a terminal connected by network cable or wirelessly, then establishing the communication connection and transmitting the data; or having the terminal simply send the information to the grabbing arm, with or without requiring manual confirmation by a user before the arm executes.
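The socket route above can be sketched with Python's standard library. The patent only says a socket connection carries the coordinate and angle information; the ASCII "x,y,angle" wire format and the function names here are assumptions.

```python
import socket

def encode_grasp(x, y, angle):
    """Pack grabbing coordinates and angle into a simple ASCII message.
    The comma-separated format is an assumed illustration, not the
    patent's actual protocol."""
    return f"{x:.2f},{y:.2f},{angle:.2f}".encode("ascii")

def send_to_arm(host, port, x, y, angle, timeout=2.0):
    """Open a TCP connection to the grabbing arm's receiving port and
    send the encoded grasp command."""
    with socket.create_connection((host, port), timeout=timeout) as sock:
        sock.sendall(encode_grasp(x, y, angle))
```

A grasp at platform point (12.5, 3.0) with a 90-degree angle would be sent as the bytes `12.50,3.00,90.00`.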
By applying one or more embodiments of the present specification, a method for determining a machine grabbing position is provided, including: acquiring image information of placed objects and generating an image coordinate system; detecting the image information to distinguish object-to-be-grabbed samples from overlapping-area samples, and performing image processing on them to generate intermediate contours; establishing a first circumscribed rectangle for each intermediate contour, determining preselected grabbing points from the short sides of the first circumscribed rectangle, and determining the grabbing point from the position information of the preselected grabbing points; determining the grabbing coordinate information and grabbing angle information of the grabbing point according to the image coordinate system; and transmitting the grabbing coordinate information and the grabbing angle information to a grabbing arm, so that the grabbing arm performs the grab. With one or more embodiments of this specification, when the machine grabs overlapping objects it selects a grabbing position away from the overlap and adjusts to the corresponding angle; it can therefore correctly distinguish overlapping objects and grab each object at the correct position quickly and accurately, effectively protecting both the object itself and the whole tooling process.
In an alternative embodiment of the present specification, the preselected grabbing points are accurately determined according to the number of middle contours. The establishing a first circumscribed rectangle of the middle contour, and determining a preselected grabbing point according to a short side of the first circumscribed rectangle, includes:
When the number of the middle outlines is larger than 1, determining the middle point of each short side in the first circumscribed rectangle as the preselected grabbing point;
setting the minimum area in the middle outline as a secondary outline, wherein the first external rectangle corresponding to the secondary outline is a secondary rectangle, and removing the middle point of each short side of the secondary rectangle in the preselected grabbing point;
and when the number of the middle outlines is 1, determining the middle point of each short side in the first circumscribed rectangle as the preselected grabbing point.
It can be understood that, as described in the previous embodiment, in a specific application scenario the number of middle contours need not be counted: no matter how many first circumscribed rectangles the middle contours form, the midpoint of the short side of each rectangle may be taken directly as a preselected grabbing point. Alternatively, a specific subset of the first circumscribed rectangles may be selected (for example, those whose aspect ratio reaches a certain threshold), and the midpoints of the short sides of those rectangles taken as preselected grabbing points, and so on.
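The rule above — take the midpoints of the two short sides of each circumscribed rectangle as preselected grabbing points — can be sketched in plain Python. This is an illustrative helper, not code from the specification; the function name and the assumption that the rectangle is given as four corner points in order (as a min-area-rectangle routine typically returns them) are hypothetical.

```python
import math

def short_side_midpoints(corners):
    """Given the four corner points of a (possibly rotated) circumscribed
    rectangle, listed in order around the rectangle, return the midpoints
    of its two short sides as preselected grabbing points."""
    def dist(p, q):
        return math.hypot(p[0] - q[0], p[1] - q[1])

    def midpoint(p, q):
        return ((p[0] + q[0]) / 2.0, (p[1] + q[1]) / 2.0)

    # Adjacent sides; opposite sides of a rectangle have equal length.
    side01 = dist(corners[0], corners[1])
    side12 = dist(corners[1], corners[2])
    if side01 <= side12:
        # sides 0-1 and 2-3 are the short sides
        return [midpoint(corners[0], corners[1]),
                midpoint(corners[2], corners[3])]
    # sides 1-2 and 3-0 are the short sides
    return [midpoint(corners[1], corners[2]),
            midpoint(corners[3], corners[0])]
```

In an OpenCV-based implementation the corner points would come from something like `cv2.boxPoints(cv2.minAreaRect(contour))`, but the geometric rule itself is the same.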
In an alternative embodiment of the present specification, the final grabbing point is selected according to the number of middle contours. The determining the grabbing point according to the position information of the preselected grabbing point includes:
When the number of the middle profiles is larger than 1, determining a first central position of the secondary profile, measuring the distance from the preselected grabbing point to the first central position to generate the position information, and setting the preselected grabbing point with the largest position information as the grabbing point.
When the number of the middle profiles is 1, determining a second center position of the overlapping area sample, measuring the distance from the preselected grabbing point to the second center position to generate the position information, and setting the preselected grabbing point with the largest position information as the grabbing point.
It is to be understood that when the number of middle contours is greater than 1, the reference point for determining the final grabbing point need not be the center position of the secondary contour; it may be any point on the secondary contour, or even a point in the overlapping area, and so on. When the number of middle contours is 1, the reference point for determining the final grabbing point need not be the center position of the overlapping area sample; it may be any point on the overlapping area sample, and so on.
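Whichever reference point is chosen, the selection rule is the same: keep the preselected grabbing point whose distance to the reference point (the "position information") is largest. A minimal sketch, with a hypothetical function name:

```python
import math

def select_grab_point(preselected, reference):
    """Pick the preselected grabbing point farthest from the reference
    point (e.g. the center of the secondary contour, or of the
    overlapping area sample), i.e. the point with the largest
    position information."""
    return max(preselected,
               key=lambda p: math.hypot(p[0] - reference[0],
                                        p[1] - reference[1]))
```

Maximizing the distance to the overlap keeps the gripper away from the region where two workpieces touch, which is the stated goal of the method.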
In an alternative embodiment of the present specification, the sample of the object to be grabbed and the sample of the overlapping area are accurately separated. The detecting the image information to distinguish the sample of the object to be grabbed and the sample of the overlapping area includes:
Obtaining at least two sample images with mutually overlapped objects, and training a target detection training model through the sample images;
and inputting the image information into the trained target detection training model, and distinguishing the object sample to be grabbed and the overlapping area sample.
In a specific application scenario, a large number of pictures of two mutually overlapping workpieces are collected as sample images, a deep learning YOLOv3 model is trained on the sample images, and the trained YOLOv3 model is then used to process the image information. The training model here may be any of YOLOv1–YOLOv4, and other models such as SPP-Net, Faster R-CNN, and the like may also be used.
In an alternative embodiment of the present specification, in order to generate the middle contour more quickly and accurately, the accuracy of contour extraction is ensured while the efficiency is improved. The carrying out image processing on the sample of the object to be grabbed and the sample of the overlapping area to generate a middle contour includes:
and removing the area where the overlapped area sample is located in the object sample to be grabbed, carrying out binarization processing on the residual area, and carrying out contour extraction on the binarized residual area through OpenCV to generate the middle contour.
The binarization processing sets the gray value of each pixel in the image to 0 or 255, that is, it gives the whole image an obvious black-and-white effect.
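The thresholding described above can be sketched in a few lines of plain Python; in practice it would be done with an OpenCV call such as `cv2.threshold`, and the threshold value of 128 here is purely illustrative.

```python
def binarize(gray, threshold=128):
    """Set each pixel to 0 or 255 by comparing with a threshold,
    producing the black-and-white image described above.
    `gray` is a 2-D list of gray values in [0, 255]."""
    return [[255 if px >= threshold else 0 for px in row] for row in gray]
```

After binarization, contour extraction on the remaining region (e.g. with OpenCV's `findContours`) yields the middle contour.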
In an alternative embodiment of the present specification, in order to determine the grasping coordinate information of the grasping point, the characteristics of the machine itself are used at the same time, and the processing efficiency is improved. The determining the grabbing coordinate information of the grabbing point according to the image coordinate system comprises the following steps:
generating a two-dimensional camera coordinate system through a monocular camera, and determining two-dimensional coordinate information of the grabbing point in the two-dimensional camera coordinate system;
and taking a platform coordinate system of the grabbing platform as the image coordinate system, and converting the two-dimensional coordinate information into the grabbing coordinate information in the image coordinate system.
In a specific application scenario, a workpiece is grabbed on a platform, and the vertical distance between the gripper of the grabbing arm and the platform can be fixed, so a monocular camera is sufficient to obtain a two-dimensional coordinate. A binocular camera could obtain three-dimensional information about the object, but its installation and calculation are more complicated, so the monocular camera is chosen; it is installed in the Eye-to-hand manner, directly above the platform. The coordinates of the object on the image then need to be converted into the real coordinates of the object on the platform, which requires calibrating the camera: a camera coordinate system is defined on the platform and the camera is calibrated using Zhang's calibration method (Zhang Zhengyou calibration). After the coordinates in the camera coordinate system are obtained, since the grabbing arm moves in the workpiece coordinate system, the camera coordinate system and the workpiece coordinate system also need to be converted into each other.
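Because the camera height above the platform is fixed and the workpieces lie in one plane, the pixel-to-platform mapping reduces to a planar transform. The sketch below uses a simple 2×3 affine transform as a stand-in for the full calibrated mapping (intrinsics from Zhang's method composed with the camera-to-platform extrinsics); the function name and the example coefficients are assumptions for illustration only.

```python
def pixel_to_platform(u, v, affine):
    """Map an image pixel (u, v) to platform coordinates (X, Y) with a
    2x3 affine transform. Each row of `affine` is (a, b, c) so that
    X = a1*u + b1*v + c1 and Y = a2*u + b2*v + c2 -- a simplified
    planar model, valid because the camera-to-platform height is fixed."""
    (a1, b1, c1), (a2, b2, c2) = affine
    return (a1 * u + b1 * v + c1, a2 * u + b2 * v + c2)
```

In an OpenCV workflow the affine coefficients would be estimated once from calibration points (e.g. with `cv2.getAffineTransform` or `cv2.estimateAffine2D`) and then applied to every grabbing point.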
In an alternative embodiment of the present description, in order to determine the grabbing angle information of the grabbing point, the characteristics of the machine are utilized at the same time, and the processing efficiency is improved. The determining the grabbing angle information of the grabbing point according to the image coordinate system comprises the following steps:
determining a target object in the object to be grabbed according to the grabbing point, and establishing a second external rectangle of the target object;
determining a first vertex coordinate, a second vertex coordinate and a third vertex coordinate of the second external rectangle in the image coordinate system, wherein the first vertex coordinate, the second vertex coordinate and the third vertex coordinate are sequentially distributed on the second external rectangle;
comparing a first distance of the first vertex coordinate to the second vertex coordinate to a second distance of the second vertex coordinate to the third vertex coordinate;
when the first distance is larger than or equal to the second distance, determining the grabbing angle information according to the specific coordinate values of the first vertex coordinate and the second vertex coordinate;
and when the first distance is smaller than the second distance, determining the grabbing angle information according to the specific coordinate values of the second vertex coordinate and the third vertex coordinate.
In a specific application scenario, as can be seen from the previous embodiment, the target point can be accurately located using the two-dimensional coordinate system, and the angle of the target object can likewise be determined in it. Three vertices of the second circumscribed rectangle are determined in order in the two-dimensional coordinate system and recorded as A(x_a, y_a), B(x_b, y_b), C(x_c, y_c). The distance from A(x_a, y_a) to B(x_b, y_b) is D1, and the distance from B(x_b, y_b) to C(x_c, y_c) is D2. If D1 is greater than or equal to D2, compare x_a and x_b: if the two values are equal, the angle is 90 degrees; otherwise the angle equals the arctangent of the ratio of the distance between A and B on the y-axis to their distance on the x-axis. If D1 is less than D2, compare x_b and x_c: if the two values are equal, the angle is 90 degrees; otherwise the angle equals the arctangent of the ratio of the distance between B and C on the y-axis to their distance on the x-axis. The final grabbing angle is obtained by subtracting the obtained angle from 90 degrees.
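The angle rule above can be sketched directly; the function name is hypothetical, and the vertex tuples correspond to A, B, C of the second circumscribed rectangle.

```python
import math

def grab_angle(a, b, c):
    """Compute the grabbing angle from three consecutive vertices
    A, B, C of the second circumscribed rectangle, following the
    rule described above."""
    d1 = math.hypot(a[0] - b[0], a[1] - b[1])  # distance A-B
    d2 = math.hypot(b[0] - c[0], b[1] - c[1])  # distance B-C
    # Measure orientation along the longer of the two adjacent sides.
    p, q = (a, b) if d1 >= d2 else (b, c)
    if p[0] == q[0]:
        angle = 90.0  # equal x-coordinates: the side is vertical
    else:
        angle = math.degrees(math.atan(abs(p[1] - q[1]) / abs(p[0] - q[0])))
    # Final grabbing angle = 90 degrees minus the side angle.
    return 90.0 - angle
```

For a rectangle whose long side lies along the x-axis the side angle is 0, giving a grabbing angle of 90 degrees; for a long side along the y-axis the grabbing angle is 0.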
In an alternative embodiment of the present specification, a connection is established with the grabbing arm so that the grabbing arm can react quickly. The transmitting the grabbing coordinate information and the grabbing angle information to the grabbing arm includes:
creating a socket protocol, binding an output port consistent with a receiving port of the grabbing arm, and creating communication connection through the receiving port and the output port;
And transmitting the grabbing coordinate information and the grabbing angle information to the grabbing arm through the communication connection.
In a specific application scenario, a Socket is created and bound to a port consistent with the port of the grabbing arm; for example, if the port of the grabbing arm is defined as 8080, the port of the server side is also 8080. The server then listens for messages from the grabbing arm. When a connection request is received, a connection is established with the grabbing arm. Data transmission with the grabbing arm is then carried out: the obtained grabbing coordinate information and grabbing angle information are transmitted to the grabbing arm, and the grabbing arm moves to a suitable posture to grab. After the communication is finished, the listening is closed.
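The create-bind-listen-accept-send-close sequence above can be sketched with the standard socket library. The wire format (three little-endian 32-bit floats for x, y, angle) is an assumption for illustration; the real arm firmware defines its own protocol, and both function names are hypothetical.

```python
import socket
import struct

def pack_grab_command(x, y, angle):
    """Serialize grabbing coordinates and angle as three little-endian
    32-bit floats. The wire format is an illustrative assumption."""
    return struct.pack("<fff", x, y, angle)

def serve_once(payload, host="0.0.0.0", port=8080):
    """Minimal server sketch: bind the port agreed with the grabbing arm
    (8080 in the example above), accept one connection, send the
    command, then close the listener."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
        srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
        srv.bind((host, port))
        srv.listen(1)
        conn, _addr = srv.accept()  # blocks until the arm connects
        with conn:
            conn.sendall(payload)
    # leaving the `with` blocks closes the connection and the listener
```

A caller would invoke `serve_once(pack_grab_command(123.5, 67.0, 45.0))` after the grabbing point has been computed; production code would add error handling, timeouts, and whatever acknowledgment the arm protocol requires.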
Based on the same concept, one or more embodiments of the present specification also provide a machine-grabbing-position determining apparatus. Referring to fig. 5, the apparatus includes:
an obtaining module 501, configured to obtain image information of a placed object, and generate an image coordinate system;
a generating module 502, configured to detect the image information, distinguish a sample of the object to be captured and a sample of the overlapping area, perform image processing on the sample of the object to be captured and the sample of the overlapping area, and generate a middle contour;
the determining module 503 is configured to establish a first external rectangle of the middle contour, determine a preselected grabbing point according to a short side of the first external rectangle, and determine a grabbing point according to the position information of the preselected grabbing point;
The calculation module 504 is used for determining grabbing coordinate information and grabbing angle information of the grabbing points according to the image coordinate system;
and the transmission module 505 transmits the grabbing coordinate information and the grabbing angle information to the grabbing arm, so that the grabbing arm grabs.
As an alternative embodiment, the determining module 503 establishes a first circumscribed rectangle of the middle outline, and determines the preselected grabbing point according to the short side of the first circumscribed rectangle, including:
when the number of the middle outlines is larger than 1, determining the middle point of each short side in the first circumscribed rectangle as the preselected grabbing point;
setting the minimum area in the middle outline as a secondary outline, wherein the first external rectangle corresponding to the secondary outline is a secondary rectangle, and removing the middle point of each short side of the secondary rectangle in the preselected grabbing point;
and when the number of the middle outlines is 1, determining the middle point of each short side in the first circumscribed rectangle as the preselected grabbing point.
As an alternative embodiment, the determining module 503 determines the grabbing point according to the position information of the preselected grabbing point, including:
when the number of the middle profiles is more than 1, determining a first central position of the secondary profile, measuring the distance from the preselected grabbing point to the first central position to generate the position information, and setting the preselected grabbing point with the largest position information as the grabbing point;
When the number of the middle profiles is 1, determining a second center position of the overlapping area sample, measuring the distance from the preselected grabbing point to the second center position to generate the position information, and setting the preselected grabbing point with the largest position information as the grabbing point.
As an optional embodiment, the generating module 502 detects the image information, and distinguishes the sample of the object to be grabbed and the sample of the overlapping area, including:
obtaining at least two sample images with mutually overlapped objects, and training a target detection training model through the sample images;
and inputting the image information into the trained target detection training model, and distinguishing the object sample to be grabbed and the overlapping area sample.
As an optional embodiment, the generating module 502 performs image processing on the sample of the object to be grabbed and the sample of the overlapping area to generate an intermediate contour, including:
and removing the area where the overlapped area sample is located in the object sample to be grabbed, carrying out binarization processing on the residual area, and carrying out contour extraction on the binarized residual area through OpenCV to generate the middle contour.
As an alternative embodiment, the calculating module 504 determines the grabbing coordinate information of the grabbing point according to the image coordinate system, including:
generating a two-dimensional camera coordinate system through a monocular camera, and determining two-dimensional coordinate information of the grabbing point in the two-dimensional camera coordinate system;
and taking a platform coordinate system of the grabbing platform as the image coordinate system, and converting the two-dimensional coordinate information into the grabbing coordinate information in the image coordinate system.
As an alternative embodiment, the calculating module 504 determines the grabbing angle information of the grabbing point according to the image coordinate system, including:
determining a target object in the object to be grabbed according to the grabbing point, and establishing a second external rectangle of the target object;
determining a first vertex coordinate, a second vertex coordinate and a third vertex coordinate of the second external rectangle in the image coordinate system, wherein the first vertex coordinate, the second vertex coordinate and the third vertex coordinate are sequentially distributed on the second external rectangle;
comparing a first distance of the first vertex coordinate to the second vertex coordinate to a second distance of the second vertex coordinate to the third vertex coordinate;
When the first distance is larger than or equal to the second distance, determining the grabbing angle information according to the specific coordinate values of the first vertex coordinate and the second vertex coordinate;
and when the first distance is smaller than the second distance, determining the grabbing angle information according to the specific coordinate values of the second vertex coordinate and the third vertex coordinate.
As an alternative embodiment, the transmitting module 505 transmits the grabbing coordinate information and the grabbing angle information to the grabbing arm, which includes:
creating a socket protocol, binding an output port consistent with a receiving port of the grabbing arm, and creating communication connection through the receiving port and the output port;
and transmitting the grabbing coordinate information and the grabbing angle information to the grabbing arm through the communication connection.
The device of the foregoing embodiment is used to implement the corresponding method in the foregoing embodiment, and has the beneficial effects of the corresponding method embodiment, which are not described herein again.
Based on the same concept, one or more embodiments of the present specification further provide an electronic device. The electronic device comprises a memory, a processor and a computer program stored on the memory and capable of running on the processor, wherein the processor executes the program to implement the method for determining the grabbing position of the machine according to any one of the above embodiments.
Fig. 6 is a schematic diagram illustrating a more specific hardware structure of an electronic device according to this embodiment, where the electronic device may include: a processor 610, a memory 620, an input/output interface 630, a communication interface 640, and a bus 650. Wherein the processor 610, memory 620, input/output interface 630, and communication interface 640 are communicatively coupled to each other within the device via a bus 650.
The processor 610 may be implemented by a general-purpose CPU (Central Processing Unit), a microprocessor, an Application Specific Integrated Circuit (ASIC), or one or more Integrated circuits, and is configured to execute related programs to implement the technical solutions provided in the embodiments of the present specification.
The Memory 620 may be implemented in the form of a ROM (Read-Only Memory), a RAM (Random Access Memory), a static storage device, a dynamic storage device, or the like. The memory 620 may store an operating system and other application programs, and when the technical solution provided by the embodiments of the present specification is implemented by software or firmware, the relevant program codes are stored in the memory 620 and called by the processor 610 to be executed.
The input/output interface 630 is used for connecting an input/output module to realize information input and output. The i/o module may be configured as a component in a device (not shown) or may be external to the device to provide a corresponding function. The input devices may include a keyboard, a mouse, a touch screen, a microphone, various sensors, etc., and the output devices may include a display, a speaker, a vibrator, an indicator light, etc.
The communication interface 640 is used for connecting a communication module (not shown in the figure) to realize communication interaction between the device and other devices. The communication module can realize communication in a wired mode (such as USB, network cable and the like) and also can realize communication in a wireless mode (such as mobile network, WIFI, Bluetooth and the like).
Bus 650 includes a pathway to transfer information between various components of the device, such as processor 610, memory 620, input/output interface 630, and communication interface 640.
It should be noted that although the above-mentioned devices only show the processor 610, the memory 620, the input/output interface 630, the communication interface 640 and the bus 650, in a specific implementation, the devices may also include other components necessary for normal operation. In addition, those skilled in the art will appreciate that the above-described apparatus may also include only those components necessary to implement the embodiments of the present description, and not necessarily all of the components shown in the figures.
Based on the same concept, one or more embodiments of the present specification also provide a non-transitory computer-readable storage medium storing computer instructions for causing a computer to execute the machine grasp location determination method according to any of the embodiments described above.
Computer-readable media of the present embodiments, including both permanent and non-permanent, removable and non-removable media, may implement information storage by any method or technology. The information may be computer readable instructions, data structures, modules of a program, or other data. Examples of computer storage media include, but are not limited to, phase change memory (PRAM), static random access memory (SRAM), dynamic random access memory (DRAM), other types of random access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), flash memory or other memory technology, compact disc read-only memory (CD-ROM), digital versatile discs (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other non-transmission medium that can be used to store information that can be accessed by a computing device.
Those of ordinary skill in the art will understand that the discussion of any embodiment above is merely exemplary, and is not intended to imply that the scope of the disclosure, including the claims, is limited to these examples; within the spirit of the present disclosure, features of the above embodiments or of different embodiments may also be combined, steps may be implemented in any order, and there are many other variations of the different aspects of one or more embodiments of the present description as described above, which are not provided in detail for the sake of brevity.
In addition, well-known power/ground connections to Integrated Circuit (IC) chips and other components may or may not be shown in the provided figures, for simplicity of illustration and discussion, and so as not to obscure one or more embodiments of the disclosure. Further, devices may be shown in block diagram form in order to avoid obscuring the understanding of one or more embodiments of the present description, and this also takes into account the fact that specifics with respect to implementation of such block diagram devices are highly dependent upon the platform within which the one or more embodiments of the present description are to be implemented (i.e., such specifics should be well within purview of one skilled in the art). Where specific details (e.g., circuits) are set forth in order to describe example embodiments of the disclosure, it should be apparent to one skilled in the art that one or more embodiments of the disclosure can be practiced without, or with variation of, these specific details. Accordingly, the description is to be regarded as illustrative instead of restrictive.
While the present disclosure has been described in conjunction with specific embodiments thereof, many alternatives, modifications, and variations of these embodiments will be apparent to those of ordinary skill in the art in light of the foregoing description. For example, other memory architectures (e.g., dynamic RAM (DRAM)) may use the discussed embodiments.
It is intended that the one or more embodiments of the present specification embrace all such alternatives, modifications and variations as fall within the broad scope of the appended claims. Therefore, any omissions, modifications, substitutions, improvements, and the like that may be made without departing from the spirit and principles of one or more embodiments of the present disclosure are intended to be included within the scope of the present disclosure.

Claims (11)

1. A method for determining a gripping position of a machine, comprising:
acquiring image information of a placed object, and generating an image coordinate system;
detecting the image information, distinguishing a sample of the object to be grabbed and a sample of an overlapping area, and carrying out image processing on the sample of the object to be grabbed and the sample of the overlapping area to generate a middle contour;
establishing a first external rectangle of the middle outline, determining a preselected grabbing point according to the short side of the first external rectangle, and determining the grabbing point according to the position information of the preselected grabbing point;
Determining grabbing coordinate information and grabbing angle information of the grabbing points according to the image coordinate system;
and transmitting the grabbing coordinate information and the grabbing angle information to a grabbing arm, so that the grabbing arm grabs.
2. The method of claim 1, wherein establishing a first circumscribing rectangle of the intermediate profile, determining a preselected grasping point from a short side of the first circumscribing rectangle, comprises:
when the number of the middle outlines is larger than 1, determining the middle point of each short side in the first circumscribed rectangle as the preselected grabbing point;
setting the minimum area in the middle outline as a secondary outline, wherein the first external rectangle corresponding to the secondary outline is a secondary rectangle, and removing the middle point of each short side of the secondary rectangle in the preselected grabbing point;
and when the number of the middle outlines is 1, determining the middle point of each short side in the first circumscribed rectangle as the preselected grabbing point.
3. The method of claim 2, wherein determining a grab point from the location information of the preselected grab point comprises:
when the number of the middle profiles is more than 1, determining a first central position of the secondary profile, measuring the distance from the preselected grabbing point to the first central position to generate the position information, and setting the preselected grabbing point with the largest position information as the grabbing point;
When the number of the middle profiles is 1, determining a second center position of the overlapping area sample, measuring the distance from the preselected grabbing point to the second center position to generate the position information, and setting the preselected grabbing point with the largest position information as the grabbing point.
4. The method according to claim 1, wherein the detecting the image information to distinguish the sample of the object to be grabbed and the sample of the overlapping area comprises:
obtaining at least two sample images with mutually overlapped objects, and training a target detection training model through the sample images;
and inputting the image information into the trained target detection training model, and distinguishing the object sample to be grabbed and the overlapping area sample.
5. The method according to claim 1, wherein the image processing the sample of the object to be grabbed and the sample of the overlapping area to generate an intermediate contour comprises:
and removing the area where the overlapped area sample is located in the object sample to be grabbed, carrying out binarization processing on the residual area, and carrying out contour extraction on the binarized residual area through OpenCV to generate the middle contour.
6. The method of claim 1, wherein determining grabbing coordinate information of the grabbing point according to the image coordinate system comprises:
generating a two-dimensional camera coordinate system through a monocular camera, and determining two-dimensional coordinate information of the grabbing point in the two-dimensional camera coordinate system;
and taking a platform coordinate system of the grabbing platform as the image coordinate system, and converting the two-dimensional coordinate information into the grabbing coordinate information in the image coordinate system.
7. The method according to claim 1, wherein the determining grabbing angle information of the grabbing point according to the image coordinate system comprises:
determining a target object in the object to be grabbed according to the grabbing point, and establishing a second external rectangle of the target object;
determining a first vertex coordinate, a second vertex coordinate and a third vertex coordinate of the second external rectangle in the image coordinate system, wherein the first vertex coordinate, the second vertex coordinate and the third vertex coordinate are sequentially distributed on the second external rectangle;
comparing a first distance of the first vertex coordinate to the second vertex coordinate to a second distance of the second vertex coordinate to the third vertex coordinate;
When the first distance is larger than or equal to the second distance, determining the grabbing angle information according to the specific coordinate values of the first vertex coordinate and the second vertex coordinate;
and when the first distance is smaller than the second distance, determining the grabbing angle information according to the specific coordinate values of the second vertex coordinate and the third vertex coordinate.
8. The method of claim 1, wherein transmitting the grabbing coordinate information and the grabbing angle information to a grabbing arm comprises:
creating a socket protocol, binding an output port consistent with a receiving port of the grabbing arm, and creating communication connection through the receiving port and the output port;
and transmitting the grabbing coordinate information and the grabbing angle information to the grabbing arm through the communication connection.
9. A machine gripping position determining apparatus, characterized by comprising:
the acquisition module acquires image information of a placed object and generates an image coordinate system;
the generating module is used for detecting the image information, distinguishing a sample of the object to be grabbed and a sample of an overlapping area, and carrying out image processing on the sample of the object to be grabbed and the sample of the overlapping area to generate a middle contour;
The determining module is used for establishing a first external rectangle of the middle outline, determining a preselected grabbing point according to the short side of the first external rectangle, and determining the grabbing point according to the position information of the preselected grabbing point;
the calculation module is used for determining grabbing coordinate information and grabbing angle information of the grabbing points according to the image coordinate system;
and the transmission module transmits the grabbing coordinate information and the grabbing angle information to the grabbing arm so that the grabbing arm can grab.
10. An electronic device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, characterized in that the processor implements the method according to any of claims 1 to 8 when executing the program.
11. A non-transitory computer-readable storage medium storing computer instructions for causing a computer to perform the method of any one of claims 1 to 8.
CN202010525389.7A 2020-06-10 2020-06-10 Method and device for determining grabbing position of machine, electronic device and storage medium Active CN111844019B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010525389.7A CN111844019B (en) 2020-06-10 2020-06-10 Method and device for determining grabbing position of machine, electronic device and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010525389.7A CN111844019B (en) 2020-06-10 2020-06-10 Method and device for determining grabbing position of machine, electronic device and storage medium

Publications (2)

Publication Number Publication Date
CN111844019A true CN111844019A (en) 2020-10-30
CN111844019B CN111844019B (en) 2021-11-16

Family

ID=72987818

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010525389.7A Active CN111844019B (en) 2020-06-10 2020-06-10 Method and device for determining grabbing position of machine, electronic device and storage medium

Country Status (1)

Country Link
CN (1) CN111844019B (en)

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106530276A (en) * 2016-10-13 2017-03-22 中科金睛视觉科技(北京)有限公司 Manipulator positioning method and system for grabbing of non-standard component
DE102018214063A1 (en) * 2017-08-28 2019-02-28 Fanuc Corporation Machine learning device, machine learning system and machine learning method
CN108648233A (en) * 2018-03-24 2018-10-12 北京工业大学 A kind of target identification based on deep learning and crawl localization method
JP2019214100A (en) * 2018-06-13 2019-12-19 オムロン株式会社 Robot control device, robot control method, and robot control program
CN108898628A (en) * 2018-06-21 2018-11-27 北京纵目安驰智能科技有限公司 Three-dimensional vehicle object's pose estimation method, system, terminal and storage medium based on monocular
CN110599544A (en) * 2019-08-08 2019-12-20 佛山科学技术学院 Workpiece positioning method and device based on machine vision

Cited By (18)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112802107A (en) * 2021-02-05 2021-05-14 梅卡曼德(北京)机器人科技有限公司 Robot-based control method and device for clamp group
CN113034526A (en) * 2021-03-29 2021-06-25 深圳市优必选科技股份有限公司 Grabbing method, grabbing device and robot
CN113034526B (en) * 2021-03-29 2024-01-16 深圳市优必选科技股份有限公司 Grabbing method, grabbing device and robot
CN113506314A (en) * 2021-06-25 2021-10-15 北京精密机电控制设备研究所 Automatic grabbing method and device for symmetrical quadrilateral workpiece under complex background
CN113506314B (en) * 2021-06-25 2024-04-09 北京精密机电控制设备研究所 Automatic grabbing method and device for symmetrical quadrilateral workpieces under complex background
CN113538582A (en) * 2021-07-20 2021-10-22 熵智科技(深圳)有限公司 Method and device for determining workpiece grabbing sequence, computer equipment and medium
CN113538582B (en) * 2021-07-20 2024-06-07 熵智科技(深圳)有限公司 Method, device, computer equipment and medium for determining workpiece grabbing sequence
CN114187349A (en) * 2021-11-03 2022-03-15 深圳市正运动技术有限公司 Product processing method and device, terminal device and storage medium
CN116188559A (en) * 2021-11-28 2023-05-30 梅卡曼德(北京)机器人科技有限公司 Image data processing method, device, electronic equipment and storage medium
CN116197885A (en) * 2021-11-28 2023-06-02 梅卡曼德(北京)机器人科技有限公司 Image data processing method, device, electronic equipment and storage medium
CN116197885B (en) * 2021-11-28 2023-11-24 梅卡曼德(北京)机器人科技有限公司 Image data filtering method, device, equipment and medium based on press-fit detection
CN114782367B (en) * 2022-04-24 2022-12-20 广东天太机器人有限公司 Control system and method for mechanical arm
CN114782367A (en) * 2022-04-24 2022-07-22 广东天太机器人有限公司 Control system and method for mechanical arm
CN114620479A (en) * 2022-04-24 2022-06-14 广东天太机器人有限公司 Mechanical arm control system and method for improving stacking efficiency of rectangular packaging boxes
CN116524010A (en) * 2023-04-25 2023-08-01 北京云中未来科技有限公司 Unmanned crown block positioning method, system and storage medium for bulk material storage
CN116524010B (en) * 2023-04-25 2024-02-02 北京云中未来科技有限公司 Unmanned crown block positioning method, system and storage medium for bulk material storage
CN116449041A (en) * 2023-06-14 2023-07-18 江苏金恒信息科技股份有限公司 Continuous casting blank sampling system and method
CN116449041B (en) * 2023-06-14 2023-09-05 江苏金恒信息科技股份有限公司 Continuous casting blank sampling system and method

Also Published As

Publication number Publication date
CN111844019B (en) 2021-11-16

Similar Documents

Publication Publication Date Title
CN111844019B (en) Method and device for determining grabbing position of machine, electronic device and storage medium
JP7352260B2 (en) Robot system with automatic object detection mechanism and its operating method
JP7284953B2 (en) Robotic system with advanced scanning mechanism
JP6529302B2 (en) INFORMATION PROCESSING APPARATUS, INFORMATION PROCESSING METHOD, AND PROGRAM
JP6782046B1 (en) Object detection system and method based on image data
KR101919463B1 (en) Gripper robot control system for picking of atypical form package
JP2018122376A (en) Image processing device, robot control device, and robot
WO2018135326A1 (en) Image processing device, image processing system, image processing program, and image processing method
CN111080701A (en) Intelligent cabinet object detection method and device, server and storage medium
CN116175542B (en) Method, device, electronic equipment and storage medium for determining clamp grabbing sequence
CN115578457A (en) Dynamic box body detection method applied to unordered grabbing
CN113284129B (en) 3D bounding box-based press box detection method and device
US11717970B2 (en) Controller, control method using controller, and control system
CN111742349B (en) Information processing apparatus, information processing method, and information processing storage medium
US11436754B2 (en) Position posture identification device, position posture identification method and position posture identification program
CN116197887A (en) Image data processing method, device, electronic equipment and storage medium
CN116188559A (en) Image data processing method, device, electronic equipment and storage medium
CN116197886A (en) Image data processing method, device, electronic equipment and storage medium
CN118097592A (en) Mechanical arm grabbing point positioning method and device, replenishment robot, equipment and medium
CN114043531A (en) Table top inclination angle determination method, table top inclination angle use method, table top inclination angle determination device, robot and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant