CN112605986A - Method, device and equipment for automatically picking up goods and computer readable storage medium


Info

Publication number
CN112605986A
CN112605986A
Authority
CN
China
Prior art keywords
target object
area
mechanical arm
target
image
Prior art date
Legal status
Granted
Application number
CN202011240724.5A
Other languages
Chinese (zh)
Other versions
CN112605986B (en)
Inventor
欧勇盛
王志扬
江国来
徐升
赛高乐
熊荣
李纪庆
郭嘉欣
吴新宇
冯伟
Current Assignee
Shenzhen Institute of Advanced Technology of CAS
Original Assignee
Shenzhen Institute of Advanced Technology of CAS
Priority date
Filing date
Publication date
Application filed by Shenzhen Institute of Advanced Technology of CAS
Priority to CN202011240724.5A
Publication of CN112605986A
Application granted
Publication of CN112605986B
Legal status: Active
Anticipated expiration


Classifications

    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B25: HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J: MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00: Programme-controlled manipulators
    • B25J9/16: Programme controls
    • B25J9/1694: Programme controls characterised by use of sensors other than normal servo-feedback from position, speed or acceleration sensors, perception control, multi-sensor controlled systems, sensor fusion
    • B25J9/1697: Vision controlled systems
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00: Pattern recognition
    • G06F18/20: Analysing
    • G06F18/23: Clustering techniques
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00: Image analysis
    • G06T7/60: Analysis of geometric attributes
    • G06T7/62: Analysis of geometric attributes of area, perimeter, diameter or volume
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00: Arrangements for image or video recognition or understanding
    • G06V10/20: Image preprocessing
    • G06V10/26: Segmentation of patterns in the image field; Cutting or merging of image elements to establish the pattern region, e.g. clustering-based techniques; Detection of occlusion
    • G06V10/267: Segmentation of patterns in the image field; Cutting or merging of image elements to establish the pattern region by performing operations on regions, e.g. growing, shrinking or watersheds
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00: Scenes; Scene-specific elements
    • G06V20/10: Terrestrial scenes

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • Multimedia (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Evolutionary Computation (AREA)
  • Evolutionary Biology (AREA)
  • General Engineering & Computer Science (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Artificial Intelligence (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Geometry (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Manipulator (AREA)

Abstract

The present application discloses a method, an apparatus, a device, and a computer-readable storage medium for automatically picking up goods. The method for automatically picking up goods includes the following steps: acquiring a pickup instruction, where the pickup instruction includes information on the area where the target object is located; calculating, based on that area information, a path trajectory for a mechanical arm to reach the area where the target object is located; controlling the mechanical arm to move along the path trajectory to the area where the target object is located, and acquiring a target image including the target object, so as to determine the specific position of the target object from the target image; and, based on the specific position of the target object in the target image, controlling the mechanical arm to adsorb the target object and move it to a designated area. This scheme can automatically control the mechanical arm, according to the pickup instruction, to adsorb the target object and move it to the designated area, thereby realizing automatic goods pickup.

Description

Method, device and equipment for automatically picking up goods and computer readable storage medium
Technical Field
The present application relates to the field of robotics, and in particular, to a method, an apparatus, a device, and a computer-readable storage medium for automatically picking up goods.
Background
With the growing popularity of online shopping, the express logistics industry has developed rapidly, and the pickup and storage of parcels has become a common problem in daily life. The sharply increasing daily parcel volume strains the temporary storage of parcels. In the prior art, a parcel station cannot easily accommodate the growing parcel volume, so station staff manually sort parcels that share some common marking into stacks; when a consumer comes to pick up goods, the staff must then search for the required parcel among the stacked parcels. This results in low pickup efficiency and high labor cost for both sorting and pickup.
Disclosure of Invention
The present application provides at least a method, an apparatus, a device, and a computer-readable storage medium for automatically picking up goods, which can realize automatic goods pickup.
A first aspect of the present application provides a method for automatically picking up goods, the method comprising:
acquiring a goods taking instruction, wherein the goods taking instruction comprises information of an area where a target object is located;
calculating the path track of the mechanical arm reaching the area where the target object is located based on the information of the area where the target object is located;
controlling the mechanical arm to move along the path trajectory to the area where the target object is located, and acquiring a target image including the target object, so as to determine the specific position of the target object according to the target image;
and controlling the mechanical arm to adsorb the target object and moving the target object to a specified area based on the specific position of the target object in the target image.
In some embodiments, the specific position includes the center point position of any one contour surface of the target object, and the step of acquiring a target image including the target object to determine the specific position of the target object according to the target image includes:
processing the target image to obtain at least one contour surface of the target object in the target image; and selecting one of the at least one contour surface of the target object in the target image, and calculating the position of the center point of that contour surface;
and, based on the center point position of the contour surface, controlling the mechanical arm to adsorb the contour surface at its center point position, and moving the target object to a designated area.
In some embodiments, the target image includes a depth image, and the step of processing the target image to obtain at least one contour surface of a target object in the target image includes:
acquiring point cloud data of the depth image;
segmenting the point cloud data to obtain a plurality of plane cluster sets;
projecting the points in each plane cluster set onto the corresponding fitting plane to obtain, for each of the plane cluster sets, the projection point data on its fitting plane;
calculating a normal vector corresponding to each fitting plane by using the projection point data on each fitting plane;
and calculating the edge point cloud of the target object in the depth image according to the normal vector of the fitting plane so as to determine the contour surface of the target object according to the edge point cloud of the target object in the depth image.
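The cluster-to-plane steps above (fit a plane to each cluster, project the cluster's points onto it, and take the plane's normal vector) can be sketched as follows. This is an illustrative sketch, not the patent's implementation: it assumes each cluster is close enough to planar that a normal can be taken from three non-collinear points, whereas a real system would fit the plane to the whole cluster (e.g. by least squares).

```python
import math

def plane_normal(p0, p1, p2):
    """Unit normal of the plane through three non-collinear cluster points."""
    u = [p1[i] - p0[i] for i in range(3)]
    v = [p2[i] - p0[i] for i in range(3)]
    # Cross product u x v gives a vector perpendicular to the plane.
    n = [u[1] * v[2] - u[2] * v[1],
         u[2] * v[0] - u[0] * v[2],
         u[0] * v[1] - u[1] * v[0]]
    mag = math.sqrt(sum(c * c for c in n))
    return [c / mag for c in n]

def project_to_plane(point, plane_point, normal):
    """Orthogonal projection of a 3-D point onto the fitted plane."""
    # Signed distance from the point to the plane along the normal.
    d = sum((point[i] - plane_point[i]) * normal[i] for i in range(3))
    return [point[i] - d * normal[i] for i in range(3)]
```

For example, projecting a point hovering above the shelf face onto the fitted plane of that face yields the in-plane point used in the subsequent normal-vector and edge-point-cloud computations.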
In some embodiments, the step of calculating an edge point cloud of a target object in the depth image according to a normal vector of the fitting plane includes:
determining whether the included angle between the normal vector direction of the fitting plane and a preset normal vector direction is smaller than a preset angle;
and if so, calculating the edge point cloud of the target object in the depth image according to the normal vector of the fitting plane whose included angle is smaller than the preset angle.
In some embodiments, the step of selecting one of the at least one contour surface based on the at least one contour surface of the target object in the target image includes:
calculating the area of each contour surface of a target object in the target image;
and comparing the areas of the plurality of contour surfaces to obtain the contour surface of the target object with the largest area in the target image.
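The area comparison above can be sketched as follows, assuming (purely for illustration) that each contour surface is available as an ordered list of 2-D vertices, so the shoelace formula gives its area:

```python
def polygon_area(vertices):
    """Shoelace formula: area of a planar contour given ordered 2-D vertices."""
    n = len(vertices)
    s = 0.0
    for i in range(n):
        x1, y1 = vertices[i]
        x2, y2 = vertices[(i + 1) % n]  # wrap around to close the contour
        s += x1 * y2 - x2 * y1
    return abs(s) / 2.0

def largest_contour(contours):
    """Return the contour surface with the largest area."""
    return max(contours, key=polygon_area)
```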
In some embodiments, the pickup instruction further includes information on the designated area where the target object is to be placed, and the step of acquiring the pickup instruction, where the pickup instruction includes information on the area where the target object is located, includes:
acquiring the pickup instruction, controlling the mechanical arm to adsorb the target object according to the pickup instruction, and moving the mechanical arm and the target object to the designated area specified in the pickup instruction.
In some embodiments, the step of calculating a path trajectory of the robot arm to the area where the target object is located based on the information of the area where the target object is located includes:
calculating the path trajectory of the mechanical arm to the area where the target object is located by using a rapidly-exploring random tree (RRT) algorithm.
A second aspect of the present application provides an automatic goods-picking apparatus, which includes:
an acquisition unit, configured to acquire a pickup instruction, where the pickup instruction includes position information of the target object and designated-area information of the target object;
a calculation unit, configured to calculate the path trajectory of the mechanical arm to the area where the target object is located, based on the position information of the target object;
a control unit, configured to control the mechanical arm to move along the path trajectory to the area where the target object is located, acquire a target image, and determine a plurality of contour surfaces of the target object according to the target image;
and an execution unit, configured to control the suction cup at the end of the mechanical arm to adsorb onto one contour surface of the target object, based on the plurality of contour surfaces of the target object in the target image, and to move the mechanical arm to a designated area.
A third aspect of the present application provides an automatic goods-picking system, which includes the above automatic goods-picking apparatus and a data processing apparatus, where the automatic goods-picking apparatus includes a mechanical arm and a suction cup. The data processing apparatus is configured to calculate the path trajectory of the mechanical arm to the area where the target object is located and to determine the specific position of the target object according to the target image; the suction cup is configured to adsorb the target object; and the mechanical arm is configured to move the target object to a designated area.
A fourth aspect of the present application provides an electronic device, which includes a memory and a processor coupled to each other, wherein the processor is configured to execute program instructions stored in the memory to implement the method for automatically picking up goods in the first aspect.
A fifth aspect of the present application provides a computer readable storage medium having stored thereon program instructions that, when executed by a processor, implement the method of automatically picking up goods of the first aspect.
According to the above scheme, a pickup instruction is acquired, where the pickup instruction includes information on the area where the target object is located; the path trajectory of the mechanical arm to the area where the target object is located is calculated based on that area information; the mechanical arm is controlled to move along the path trajectory to the area where the target object is located, and a target image including the target object is acquired, so as to determine the specific position of the target object from the target image; and, based on the specific position of the target object in the target image, the mechanical arm is controlled to adsorb the target object and move it to the designated area. The present application controls the mechanical arm according to the pickup instruction to adsorb the target object and move it to the designated area, thereby realizing automatic goods pickup and reducing labor cost.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the application.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present application and, together with the description, serve to explain the principles of the application.
FIG. 1 is a schematic flow chart diagram of a first embodiment of an automatic pickup method provided by the present application;
FIG. 2 is a schematic diagram of an overall control framework in the automatic goods-picking method provided by the present application;
FIG. 3 is a schematic flow chart diagram of a second embodiment of an automatic pickup method provided by the present application;
FIG. 4 is a schematic flow chart illustrating image processing in the method for automatically picking up goods provided by the present application;
FIG. 5 is a schematic frame diagram of an embodiment of the automatic goods-picking apparatus provided herein;
FIG. 6 is a schematic view of an embodiment of an automated picking system provided herein;
FIG. 7 is a block diagram of an embodiment of an electronic device provided herein;
FIG. 8 is a block diagram of an embodiment of a computer-readable storage medium provided herein.
Detailed Description
The following describes in detail the embodiments of the present application with reference to the drawings attached hereto.
In the following description, for purposes of explanation and not limitation, specific details are set forth such as particular system structures, interfaces, techniques, etc. in order to provide a thorough understanding of the present application.
The term "and/or" herein is merely an association describing an associated object, meaning that three relationships may exist, e.g., a and/or B, may mean: a exists alone, A and B exist simultaneously, and B exists alone. In addition, the character "/" herein generally indicates that the former and latter related objects are in an "or" relationship. Further, the term "plurality" herein means two or more than two. In addition, the term "at least one" herein means any one of a plurality or any combination of at least two of a plurality, for example, including at least one of A, B, C, and may mean including any one or more elements selected from the group consisting of A, B and C.
The present application provides a method for automatically picking up goods, which can be applied to the temporary storage or sorting of parcels. Specifically, the method controls a mechanical arm to adsorb a target object and move it to a designated area according to an acquired pickup instruction, so as to achieve automatic goods pickup and reduce labor cost. Please refer to FIG. 1, which is a schematic flow diagram of a first embodiment of the automatic goods-picking method provided by the present application. The method of this embodiment can be applied to an automatic goods-picking apparatus, such as a goods-picking robot, and can also be applied to a server with data processing capability.
Specifically, the automatic goods taking method of the embodiment includes the following steps:
S101: Acquiring a pickup instruction.
In the embodiments of the present disclosure, different parcels are placed at different positions, so the automatic picking apparatus can distinguish parcels according to the position information of the parcels placed on the shelf, and send each user a pickup instruction that identifies the parcel, such as a pickup code or a two-dimensional pickup code. The user can then pick up the goods at the parcel storage point using the received pickup instruction.
On one hand, a goods taking instruction input module can be arranged on the automatic goods taking device, when a user inputs a received goods taking instruction in the goods taking instruction input module, the automatic goods taking device receives the goods taking instruction containing the information of the area where the target object is located, and controls the mechanical arm to reach the area where the target object is located so as to recognize and grab the target object, and the target object is moved to the designated area.
On the other hand, if the pickup instruction is entered directly on the automatic picking apparatus, the user must go to the apparatus to enter the instruction every time goods are picked up. To improve pickup efficiency, the automatic picking apparatus of the present application may establish a data connection with an instruction input device by wireless communication. The user enters the pickup instruction on the instruction input device; the automatic picking apparatus receives the instruction, controls the mechanical arm to automatically identify and grasp the target object accordingly, and moves the mechanical arm and the target object to the designated area. The instruction input device may be a mobile terminal, a wearable device, or the like.
Considering that, in actual parcel station applications, parcels are stored and retrieved by a staff member entering a pickup instruction, the input device of this embodiment may be a mobile device, such as a mobile phone or a wearable device, in order to reduce labor cost. Specifically, a staff member of the parcel station may enter the pickup instruction provided by a user at any time and place using his or her own mobile phone, thereby reducing the staff's workload.
Specifically, referring to FIG. 2, FIG. 2 is a schematic structural diagram of the overall control framework in the automatic goods-picking method provided by the present application. The user enters the pickup instruction through an app on the input device 1; the input device 1 receives the pickup instruction and sends it over Wi-Fi to the communication processing module of the host computer 2, so that the host computer 2 controls the mechanical arm 3 to perform operations such as autonomous obstacle avoidance, decision control, target image acquisition, and target object adsorption according to the received pickup instruction. Forward and inverse kinematics are solved to obtain the path trajectory of the mechanical arm 3 to the area where the target object is located, and the path trajectory along which the mechanical arm 3 and the target object move to the designated area.
S102: and calculating the path track of the mechanical arm reaching the area where the target object is located based on the information of the area where the target object is located.
The pickup instruction includes information on the area where the target object is located. The automatic picking apparatus therefore only needs to calculate, from this position information, the path trajectory of the mechanical arm to the area where the target object is located, and to control the mechanical arm to move along that trajectory, so that the mechanical arm can conveniently adsorb the target object. Specifically, the target object can be adsorbed by a suction cup arranged at the end of the mechanical arm.
It should be noted that, in practical applications, on the one hand, when the arm is long enough to grasp every target object on the shelf, the mechanical arm may be placed directly in front of the shelf; in this case, the path trajectory to the area where the target object is located is simply the trajectory of the arm itself reaching the target object. On the other hand, considering cost and ease of operation, the arm length of the mechanical arm is generally not very long; to be able to grasp every target object on the shelf, the mechanical arm in the present application can move along the front of the shelf. In this case, the path trajectory consists of the trajectory along which the mechanical arm moves to the target area of the target object, plus the trajectory of the arm itself reaching the target object. The target area of the target object refers to the area that the automatic picking apparatus must reach so that the mechanical arm can grasp the target object; at that point the arm has not yet moved directly in front of the target object.
It should be noted that, after the mechanical arm reaches the target area from which the target object can be grasped, the path trajectory of the mechanical arm to the area where the target object is located may be calculated by a rapidly-exploring random tree (RRT) algorithm. Specifically, the path trajectory that places the mechanical arm directly in front of the target object can be calculated.
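As a hedged illustration of such a sampling-based planner (not the patent's implementation), a minimal 2-D rapidly-exploring random tree can be sketched as follows; the state space, step size, goal bias, and collision callback are assumptions made for the example:

```python
import math
import random

def rrt_plan(start, goal, is_free, bounds, step=0.5, goal_tol=0.5,
             max_iters=5000, seed=0):
    """Minimal 2-D RRT sketch: grow a tree from start until a node lands
    within goal_tol of goal, then walk the parent links back to start."""
    rng = random.Random(seed)
    nodes = [start]          # tree vertices
    parent = {0: None}       # child index -> parent index
    for _ in range(max_iters):
        # Goal-biased sampling: 10% of samples are the goal itself.
        if rng.random() < 0.1:
            sample = goal
        else:
            sample = (rng.uniform(bounds[0], bounds[1]),
                      rng.uniform(bounds[2], bounds[3]))
        i_near = min(range(len(nodes)),
                     key=lambda i: math.dist(nodes[i], sample))
        near = nodes[i_near]
        d = math.dist(near, sample)
        if d == 0.0:
            continue
        t = min(1.0, step / d)   # steer at most one step toward the sample
        new = (near[0] + (sample[0] - near[0]) * t,
               near[1] + (sample[1] - near[1]) * t)
        if not is_free(new):     # collision check supplied by the caller
            continue
        parent[len(nodes)] = i_near
        nodes.append(new)
        if math.dist(new, goal) < goal_tol:
            path, i = [], len(nodes) - 1
            while i is not None:
                path.append(nodes[i])
                i = parent[i]
            return path[::-1]    # ordered start -> goal
    return None                  # no path found within the budget
```

A real arm planner would run in joint space or 3-D task space with obstacle models of the shelf, but the grow-steer-connect loop is the same.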
S103: and controlling the mechanical arm to move to the area where the target object is located along the path track, and acquiring a target image comprising the target object so as to determine the specific position of the target object according to the target image.
Consider that, after the automatic picking apparatus controls the mechanical arm to move along the path trajectory to the area where the target object is located in S102, directly commanding the arm to adsorb may fail: the arm may miss the target object entirely, or may adsorb only its edge, so that the object falls off while the arm moves it. Therefore, after controlling the mechanical arm to move along the path trajectory to the area where the target object is located, the automatic picking apparatus of this embodiment acquires a target image including the target object through a camera mounted on the mechanical arm.
The target image is acquired by a camera arranged on the mechanical arm. The camera may be triggered to capture the target image after the mechanical arm has moved along the path trajectory to the area where the target object is located, or capture may begin when the automatic picking apparatus detects that the distance between the camera and the target object is smaller than a preset distance. The camera may be arranged at the end of the mechanical arm.
Further, the automatic picking apparatus determines the specific position of the target object from the target image. On the one hand, an RGB image including the target object may be obtained by processing the target image, and the RGB image processed to determine the specific position of the target object in the target image; on the other hand, a depth image including the target object may be acquired, and the point cloud data in the depth image processed to determine the specific position of the target object in the target image.
S104: and controlling the mechanical arm to adsorb the target object and move the target object to the designated area based on the specific position of the target object in the target image.
Based on the specific position of the target object determined in S103, the automatic pickup device controls the robot arm to adsorb the target object, and moves the robot arm and the target object to the designated area.
It should be noted that the path trajectory used when the automatic picking apparatus controls the mechanical arm to adsorb the target object and move it to the designated area must be recalculated; that is, the trajectory along which the arm moves to the area where the target object is located differs from the trajectory along which the arm moves the target object to the designated area.
In this scheme, a pickup instruction is acquired, where the pickup instruction includes information on the area where the target object is located; the path trajectory of the mechanical arm to the area where the target object is located is calculated based on that area information; the mechanical arm is controlled to move along the path trajectory to the area where the target object is located, and a target image including the target object is acquired, so as to determine the specific position of the target object from the target image; and, based on the specific position of the target object in the target image, the mechanical arm is controlled to adsorb the target object and move it to the designated area. The present application automatically controls the mechanical arm according to the pickup instruction to grasp the target object and move it to the designated area, which improves the efficiency of automatic goods pickup and reduces labor cost.
Referring to FIG. 3, FIG. 3 is a schematic flowchart of a second embodiment of the automatic goods-picking method provided by the present application. In this embodiment, the position at which the mechanical arm adsorbs the target object is determined from the specific position that the automatic picking apparatus derives from the target image, which improves automatic pickup efficiency. On the basis of the above embodiment, the method of this embodiment further includes the following steps:
S201: Acquiring a pickup instruction.
The automatic picking apparatus controls the mechanical arm to move according to the pickup instruction, so that the suction cup at the end of the mechanical arm adsorbs the target object, and then moves the mechanical arm and the adsorbed target object to the designated area. Therefore, the designated area in this embodiment may be specified in the pickup instruction as the area where the target object is to be placed, or may be designated by the user; it is not limited herein.
S202: calculating the path track of the mechanical arm reaching the area where the target object is located based on the information of the area where the target object is located;
s203: and controlling the mechanical arm to move to the area of the target object along the path track, and acquiring a target image comprising the target object.
For the detailed description of S201 to S203 in this embodiment, reference may be made to S101 to S103 in the above embodiments, which are not repeated herein.
S204: Processing the target image to obtain at least one contour surface of the target object in the target image.
To facilitate adsorption of the target object by the suction cup at the end of the mechanical arm, the specific position in the above embodiment may be any contour surface of the target object in the target image, or the center point of any contour surface.
Further, the contour surface of the target object in the target image may be acquired as follows: on the one hand, an RGB image including the target object can be obtained by processing the target image, the contour features of the target object in the RGB image extracted to obtain a contour image, and the contour surface of the target object determined from that contour image; on the other hand, a depth image including the target object can be acquired, the point cloud data in the depth image processed, and the edge point cloud of the target object obtained, so as to determine the contour surface of the target object in the target image from the edge point cloud.
S205: Selecting one of the at least one contour surface of the target object in the target image, and calculating the position of the center point of that contour surface.
Based on any contour surface of the target object in the target image acquired in S204, the automatic picking apparatus of the present embodiment may control the suction cup at the end of the robot arm to suck any contour surface of the target object, so as to move the target object to the designated area.
In consideration of the stability of the suction cup at the end of the mechanical arm to the target object, the automatic picking device of the embodiment controls the suction cup at the end of the mechanical arm to suck the central point of any contour surface of the target object. Of course, in other embodiments, the automatic picking device may also control the suction cup at the end of the robot arm to be attached to any position of any contour surface of the target object.
It should be noted that, to ensure stable suction of the target object, the automatic picking device may control the suction cup at the end of the mechanical arm to attach to the contour surface, or the center point of the contour surface, that directly faces the mechanical arm.
Further, since the contour surfaces of the target object differ in area, if the automatic picking device attaches the suction cup to an arbitrary contour surface or the center point of an arbitrary contour surface, the suction cup may land on a surface that is too small, causing unstable attachment.
Specifically, the automatic picking device processes the target image to obtain the contour surfaces of the target object, then calculates the area of each contour surface, and compares the areas of the contour surfaces to obtain the contour surface of the target object with the largest area in the target image.
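The area comparison above amounts to computing the area of each candidate contour surface and keeping the largest, whose centroid then serves as the suction point of S205. A minimal sketch, with contour surfaces represented as 2-D polygons (`polygon_area` and `pick_largest_surface` are hypothetical helper names, not from the patent):

```python
import numpy as np

def polygon_area(pts):
    """Shoelace area of a 2-D polygon given as an (N, 2) vertex array."""
    x, y = pts[:, 0], pts[:, 1]
    return 0.5 * abs(np.dot(x, np.roll(y, 1)) - np.dot(y, np.roll(x, 1)))

def pick_largest_surface(surfaces):
    """Return (index, centroid) of the contour surface with the largest area.

    `surfaces` is a list of (N, 2) vertex arrays; the centroid of the
    chosen surface is the suggested suction point.
    """
    areas = [polygon_area(s) for s in surfaces]
    best = int(np.argmax(areas))
    return best, surfaces[best].mean(axis=0)
```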
S206: based on the position of the center point of the contour surface, control the mechanical arm to suck at that center point and move the target object to the designated area.
According to the above scheme, a goods taking instruction is acquired; the path trajectory for the mechanical arm to reach the area where the target object is located is calculated based on the information of that area; the mechanical arm is controlled to move along the path trajectory to the area where the target object is located; a target image including the target object is acquired and processed to obtain at least one contour surface of the target object; one of the at least one contour surface is selected and the position of its center point is calculated; and, based on the position of the center point of the contour surface, the mechanical arm is controlled to suck at that center point and move the target object to the designated area. By processing the target image, this embodiment accurately locates the target object in the target image, so that the mechanical arm can grab the target object safely and stably and move it to the designated area.
In practical applications, the target object of the present application is an express parcel, and a parcel is usually covered with interference information such as the waybill, adhesive tape, and commodity trademarks. To locate the target object accurately in spite of this, the target image acquired in this embodiment is a depth image: the contour surface of the target object is obtained by processing the point cloud data in the depth image, and the mechanical arm is then controlled to attach to that contour surface and move the target object to the designated area. Specifically, referring to fig. 4, the depth image is processed in the following steps:
S301: acquire point cloud data of the depth image.
To preserve the original shape of the target object's point cloud edge in the depth image and to improve the accuracy of edge extraction, this embodiment acquires all point cloud data in the depth image, which benefits the subsequent clustering of the points in the depth image.
S302: segment the point cloud data to obtain a plurality of plane cluster sets.
To prevent the uneven point cloud distribution of the depth image from affecting extraction of the target object's boundary point cloud, this embodiment segments the point cloud data of the depth image into a plurality of plane cluster sets. The points within a plane cluster set share similar properties, for example similar curvature.
In a specific embodiment, a region growing algorithm may be used to segment the point cloud data; threshold-based segmentation, edge-based segmentation, or segmentation methods based on other models may also be used.
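A toy version of the region-growing segmentation mentioned above can be sketched as follows. It assumes per-point normals are already available and uses fixed distance and angle thresholds, which are illustrative parameters only; real implementations (e.g. PCL's region growing) additionally order seeds by curvature:

```python
import numpy as np
from collections import deque

def region_grow(points, normals, dist_thresh=0.2, angle_thresh_deg=10.0):
    """Toy region-growing segmentation of a point cloud into planar clusters.

    Starting from an unvisited seed, a point joins the cluster when it lies
    within `dist_thresh` of a cluster member and its normal deviates from
    that member's normal by less than `angle_thresh_deg`.
    """
    n = len(points)
    labels = -np.ones(n, dtype=int)          # -1 means unassigned
    cos_thresh = np.cos(np.radians(angle_thresh_deg))
    current = 0
    for seed in range(n):
        if labels[seed] != -1:
            continue
        labels[seed] = current
        queue = deque([seed])
        while queue:
            i = queue.popleft()
            d = np.linalg.norm(points - points[i], axis=1)
            near = np.where((d < dist_thresh) & (labels == -1))[0]
            for j in near:
                # Accept either orientation of the normal via abs().
                if abs(np.dot(normals[i], normals[j])) > cos_thresh:
                    labels[j] = current
                    queue.append(j)
        current += 1
    return labels
```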
S303: project the points in each plane cluster set onto the corresponding fitting plane to obtain the projection point data of each point of the plane cluster sets on the corresponding fitting plane.
Because the points in a plane cluster set do not all lie on the same plane, the points of each plane cluster set are projected onto the corresponding fitting plane; the projection point data of each point can then be read from that fitting plane, allowing the edge point cloud of the depth image to be extracted quickly from the projection point data. The fitting plane is the plane fitted to the points of a plane cluster set.
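The plane fitting and projection of S303 can be sketched with a least-squares fit; the singular vector for the smallest singular value of the centered cluster doubles as the plane normal needed in S304. This is a generic formulation, not the patent's exact implementation:

```python
import numpy as np

def fit_plane(points):
    """Least-squares plane through a cluster: returns (centroid, unit normal).

    The right singular vector for the smallest singular value of the
    centered point matrix is the direction of least variance, i.e. the
    plane normal (sign is arbitrary).
    """
    centroid = points.mean(axis=0)
    _, _, vt = np.linalg.svd(points - centroid)
    return centroid, vt[-1]

def project_to_plane(points, centroid, normal):
    """Project points onto the fitted plane (S303): subtract each point's
    signed offset along the normal."""
    offsets = (points - centroid) @ normal
    return points - np.outer(offsets, normal)
```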
S304: calculate the normal vector of each fitting plane using the projection point data on that fitting plane.
Based on the projection point data of the plurality of fitting planes acquired in S303, a normal vector of each fitting plane is calculated.
S305: judge whether the angle between the normal vector of the fitting plane and the preset normal vector is smaller than a preset angle.
To prevent the shelf side plates from affecting the extraction of the target object's edge point cloud, this embodiment judges whether the angle between the normal vector of each fitting plane and the preset normal vector is smaller than the preset angle. If so, S306 is executed; if not, the fitting plane is the plane containing the point cloud data of a shelf side plate and is filtered out. The preset normal vector is perpendicular to the plane of the target object.
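The test in S305 reduces to an angle check between two vectors. The sketch below accepts either orientation of the fitted normal, since the sign returned by a plane fit is arbitrary; the 30-degree default is an assumed value, not one stated in the patent:

```python
import numpy as np

def faces_target(normal, preset_normal, max_angle_deg=30.0):
    """S305 filter: keep a fitting plane only if its normal lies within
    `max_angle_deg` of the preset normal (which is perpendicular to the
    target object's face). abs() accepts both normal orientations."""
    cos_angle = abs(np.dot(normal, preset_normal)) / (
        np.linalg.norm(normal) * np.linalg.norm(preset_normal))
    angle = np.degrees(np.arccos(np.clip(cos_angle, -1.0, 1.0)))
    return angle <= max_angle_deg
```

Planes rejected by this check (for example, shelf side plates whose normals are roughly perpendicular to the preset normal) are discarded before edge extraction.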
S306: calculate the edge point cloud of the target object in the depth image from the normal vectors of the fitting planes whose angle is smaller than the preset angle, and determine the contour surface of the target object from that edge point cloud.
To obtain the edge point cloud of the target object accurately, the automatic picking device of this embodiment calculates the edge point cloud of the target object in the depth image from the fitting planes that satisfy the condition in S305, and then determines the contour surface of the target object from that edge point cloud. In a specific embodiment, the fitting plane may be converted into a binary image to obtain the contour surface of the target object in the target image.
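One way to realize the binary-image step mentioned above is to rasterize the projected points of a kept fitting plane into an occupancy grid and keep the cells that have at least one empty 4-neighbour; those cells stand in for the edge point cloud. The grid resolution `cell` is an assumed parameter:

```python
import numpy as np

def edge_cells(points_2d, cell=0.05):
    """Rasterize projected 2-D points into a binary occupancy grid and
    return the boundary cells -- a stand-in for the edge point cloud of
    S306. A cell is on the edge if any of its 4-neighbours is empty."""
    mins = points_2d.min(axis=0)
    idx = np.floor((points_2d - mins) / cell).astype(int)
    h, w = idx.max(axis=0) + 1
    grid = np.zeros((h + 2, w + 2), dtype=bool)   # 1-cell empty border
    grid[idx[:, 0] + 1, idx[:, 1] + 1] = True
    # Interior cells have all four neighbours occupied.
    interior = (grid[:-2, 1:-1] & grid[2:, 1:-1] &
                grid[1:-1, :-2] & grid[1:-1, 2:])
    edge = grid[1:-1, 1:-1] & ~interior
    return np.argwhere(edge)
```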
In the above embodiment, the point cloud data of the depth image is acquired and segmented into a plurality of plane cluster sets; the points in each plane cluster set are projected onto the corresponding fitting plane to obtain the projection point data of each point; the normal vector of each fitting plane is calculated from its projection point data; whether the angle between the normal vector of the fitting plane and the preset normal vector is smaller than the preset angle is judged; the edge point cloud of the target object in the depth image is calculated from the normal vectors of the fitting planes whose angle is smaller than the preset angle; and the contour surface of the target object is determined from that edge point cloud. By processing the point cloud data in the depth image, the automatic picking device of this embodiment extracts the edge point cloud of the target object, determines the contour surface from it, and thereby locates the target object accurately.
It will be understood by those skilled in the art that, in the method of the present application, the order in which the steps are written does not imply a strict order of execution or any limitation on the implementation; the specific order of execution of the steps should be determined by their function and possible inherent logic.
Referring to fig. 5, fig. 5 is a schematic frame diagram of an embodiment of an automatic picking device according to the present application. The automatic picking device 50 includes:
an acquisition unit 51, configured to acquire a pickup instruction, where the pickup instruction includes position information of the target object and information of the designated area where the target object is to be placed;
a calculating unit 52, configured to calculate, based on the position information of the target object, the path trajectory for the mechanical arm to reach the area where the target object is located;
a control unit 53, configured to control the mechanical arm to move along the path trajectory to the area where the target object is located and to acquire a target image, so as to determine a plurality of contour surfaces of the target object from the target image;
and an execution unit 54, configured to control the suction cup at the end of the mechanical arm to attach to one contour surface of the target object, based on the plurality of contour surfaces of the target object in the target image, and to move the target object to the designated area.
Referring to fig. 6, fig. 6 is a schematic diagram of an embodiment of an automatic picking system according to the present application.
The automatic goods taking system 600 includes an automatic goods taking device 61 and a data processing device (not shown). The automatic goods taking device 61 includes a mechanical arm 611 and a suction cup 612. The data processing device is used to calculate the path trajectory for the mechanical arm 611 to reach the area where the target object 62 is located and to determine the specific position of the target object from the target image; the suction cup 612 is used to suck the target object 62; and the mechanical arm 611 is used to move the target object 62, sucked by the suction cup 612 at its end, to the designated area.
Specifically, in order to allow the mechanical arm 611 to reach a target area from which the target object 62 can be grabbed, and to enable grabbing from different areas of the shelf 63, in this embodiment a sliding rail 64 may be disposed directly in front of the shelf 63, with the mechanical arm 611 mounted on the sliding rail 64 so that it moves along the sliding direction under the drive of the rail. In addition, the sliding direction of the sliding rail 64 should be parallel to the length direction of the shelf 63, which simplifies planning of the path trajectory of the mechanical arm 611 and avoids the extra operations of adjusting the orientation of the end of the mechanical arm 611 before obstacle-avoidance planning.
Referring to fig. 7, fig. 7 is a schematic diagram of a frame of an embodiment of an electronic device provided in the present application. The electronic device 70 includes a memory 71 and a processor 72 coupled to each other, and the processor 72 is configured to execute program instructions stored in the memory 71 to implement the steps of any of the above-described embodiments of the method for automatically picking an item. In one particular implementation scenario, the electronic device 70 may include, but is not limited to: a microcomputer, a server, and the electronic device 70 may also include a mobile device such as a notebook computer, a tablet computer, and the like, which is not limited herein.
In particular, the processor 72 is configured to control itself and the memory 71 to implement the steps of any of the above embodiments of the method for automatically picking up goods. The processor 72 may also be referred to as a CPU (Central Processing Unit). The processor 72 may be an integrated circuit chip having signal processing capability. The processor 72 may also be a general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, discrete gate or transistor logic, or discrete hardware components. A general-purpose processor may be a microprocessor or any conventional processor. In addition, the processor 72 may be implemented jointly by multiple integrated circuit chips.
Referring to fig. 8, fig. 8 is a block diagram illustrating an embodiment of a computer-readable storage medium according to the present application. The computer readable storage medium 80 stores program instructions 801 that can be executed by the processor, the program instructions 801 being for implementing the steps of any of the above-described method embodiments for automatically picking items.
In some embodiments, functions of or modules included in the apparatus provided in the embodiments of the present disclosure may be used to execute the method described in the above method embodiments, and specific implementation thereof may refer to the description of the above method embodiments, and for brevity, will not be described again here.
The foregoing description of the various embodiments is intended to highlight various differences between the embodiments, and the same or similar parts may be referred to each other, and for brevity, will not be described again herein.
In the several embodiments provided in the present application, it should be understood that the disclosed method and apparatus may be implemented in other ways. For example, the above-described apparatus embodiments are merely illustrative, and for example, a division of a module or a unit is merely one type of logical division, and an actual implementation may have another division, for example, a unit or a component may be combined or integrated with another system, or some features may be omitted, or not implemented. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection of devices or units through some interfaces, and may be in an electrical, mechanical or other form.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated unit, if implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a computer readable storage medium. Based on such understanding, the technical solution of the present application may be substantially implemented or contributed to by the prior art, or all or part of the technical solution may be embodied in a software product, which is stored in a storage medium and includes instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) or a processor (processor) to execute all or part of the steps of the method according to the embodiments of the present application. And the aforementioned storage medium includes: a U-disk, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk, and other various media capable of storing program codes.

Claims (11)

1. A method of automatically picking up goods, the method comprising:
acquiring a goods taking instruction, wherein the goods taking instruction comprises information of an area where a target object is located;
calculating the path track of the mechanical arm reaching the area where the target object is located based on the information of the area where the target object is located;
controlling the mechanical arm to move along the path trajectory to the area where the target object is located, and acquiring a target image comprising the target object, so as to determine the specific position of the target object according to the target image;
and controlling the mechanical arm to adsorb the target object and moving the target object to a specified area based on the specific position of the target object in the target image.
2. The method of automatic picking of claim 1, wherein the specific position comprises the position of the center point of any contour surface of the target object, and the step of acquiring a target image comprising the target object to determine the specific position of the target object according to the target image comprises:
processing the target image to obtain at least one contour surface of a target object in the target image;
selecting one of the at least one contour surface based on at least one contour surface of a target object in the target image, and calculating the position of the central point of the contour surface;
and controlling the mechanical arm to adsorb the central point position of the contour surface based on the central point position of the contour surface, and moving the target object to a specified area.
3. The method of claim 2, wherein the target image comprises a depth image, and wherein the step of processing the target image to obtain at least one contour surface of a target object in the target image comprises:
acquiring point cloud data of the depth image;
dividing the point cloud data to obtain a plurality of plane cluster sets;
projecting the points in each plane clustering set onto a corresponding fitting plane to obtain projection point data of each point in a plurality of plane clustering sets on the corresponding fitting plane;
calculating a normal vector corresponding to each fitting plane by using the projection point data on each fitting plane;
and calculating the edge point cloud of the target object in the depth image according to the normal vector of the fitting plane so as to determine the contour surface of the target object according to the edge point cloud of the target object in the depth image.
4. The method of claim 3, wherein the step of computing a point cloud of edges of the target object in the depth image from a normal vector of the fitting plane comprises:
judging whether an included angle between the normal vector direction of the fitting plane and a preset normal vector direction is smaller than a preset angle or not;
if yes, calculating the edge point cloud of the target object in the depth image according to the normal vector of the fitting plane corresponding to the included angle smaller than the preset angle.
5. The method of claim 2, wherein the step of selecting one of the at least one contour surface based on the at least one contour surface of the target object in the target image comprises:
calculating the area of each contour surface of a target object in the target image;
and comparing the areas of the plurality of contour surfaces to obtain the contour surface of the target object with the largest area in the target image.
6. The method according to claim 1, wherein the pickup instruction further comprises information of a designated area where the target object is to be placed, and after the step of acquiring the pickup instruction, the method comprises:
and acquiring a goods taking instruction, controlling the mechanical arm to adsorb the target object according to the goods taking instruction, and moving the mechanical arm and the target object to a specified area in the goods taking instruction.
7. The method according to claim 1, wherein the step of calculating a path trajectory of the robot arm to the area where the target object is located based on the information of the area where the target object is located comprises:
and calculating the path trajectory of the mechanical arm to the area where the target object is located by using a rapidly-exploring random tree (RRT) algorithm.
8. An automatic picking device, comprising:
an acquisition unit, configured to acquire a goods taking instruction, wherein the goods taking instruction comprises position information of a target object and information of a designated area where the target object is to be placed;
the calculation unit is used for calculating the path track of the mechanical arm reaching the area where the target object is located based on the position information of the target object;
the control unit is used for controlling the mechanical arm to move to the area where the target object is located along the path track, acquiring a target image and determining a plurality of contour surfaces of the target object according to the target image;
and an execution unit, configured to control the suction cup at the end of the mechanical arm to attach to one contour surface of the target object based on a plurality of contour surfaces of the target object in the target image, and to move the target object to a designated area.
9. An automatic goods taking system, characterized in that the automatic goods taking system comprises the automatic picking device of claim 8 and a data processing device, wherein the automatic picking device comprises a mechanical arm and a suction cup; the data processing device is configured to calculate the path trajectory for the mechanical arm to reach the area where the target object is located and to determine the specific position of the target object according to the target image; the suction cup is configured to suck the target object; and the mechanical arm is configured to move the target object to a designated area.
10. An electronic device comprising a memory and a processor coupled to each other, the processor being configured to execute program instructions stored in the memory to implement the method of automatically picking up goods as claimed in any one of claims 1 to 7.
11. A computer readable storage medium having stored thereon program instructions which, when executed by a processor, implement the method of automatically picking items of any of claims 1 to 7.
CN202011240724.5A 2020-11-09 2020-11-09 Method, device and equipment for automatically picking up goods and computer readable storage medium Active CN112605986B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011240724.5A CN112605986B (en) 2020-11-09 2020-11-09 Method, device and equipment for automatically picking up goods and computer readable storage medium

Publications (2)

Publication Number Publication Date
CN112605986A true CN112605986A (en) 2021-04-06
CN112605986B CN112605986B (en) 2022-04-19

Family

ID=75224589

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011240724.5A Active CN112605986B (en) 2020-11-09 2020-11-09 Method, device and equipment for automatically picking up goods and computer readable storage medium

Country Status (1)

Country Link
CN (1) CN112605986B (en)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107748890A (en) * 2017-09-11 2018-03-02 汕头大学 A kind of visual grasping method, apparatus and its readable storage medium storing program for executing based on depth image
CN108247635A (en) * 2018-01-15 2018-07-06 北京化工大学 A kind of method of the robot crawl object of deep vision
CN109086736A (en) * 2018-08-17 2018-12-25 深圳蓝胖子机器人有限公司 Target Acquisition method, equipment and computer readable storage medium
CN110148257A (en) * 2019-04-01 2019-08-20 厦门鲜喵网络科技有限公司 A kind of picking method and system
CN110395515A (en) * 2019-07-29 2019-11-01 深圳蓝胖子机器人有限公司 A kind of cargo identification grasping means, equipment and storage medium
CN111168686A (en) * 2020-02-25 2020-05-19 深圳市商汤科技有限公司 Object grabbing method, device, equipment and storage medium
CN111439594A (en) * 2020-03-09 2020-07-24 兰剑智能科技股份有限公司 Unstacking method and system based on 3D visual guidance

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114347040A (en) * 2022-02-18 2022-04-15 创新奇智(合肥)科技有限公司 Method and device for picking up target object, robot and storage medium
CN114347040B (en) * 2022-02-18 2024-06-11 创新奇智(合肥)科技有限公司 Target object pickup method, device, robot and storage medium

Also Published As

Publication number Publication date
CN112605986B (en) 2022-04-19

Similar Documents

Publication Publication Date Title
US10053305B2 (en) Article handling apparatus and method of operating the same
CN110884715B (en) Robot system and control method thereof
CN107597600B (en) Sorting system and method for sorting
JP7411932B2 (en) Automated package registration systems, devices, and methods
US9649767B2 (en) Methods and systems for distributing remote assistance to facilitate robotic object manipulation
US10679379B1 (en) Robotic system with dynamic packing mechanism
US9707682B1 (en) Methods and systems for recognizing machine-readable information on three-dimensional objects
US11883966B2 (en) Method and computing system for performing object detection or robot interaction planning based on image information generated by a camera
TWI816235B (en) Method and apparatus for storing material, robot, warehousing system and storage medium
WO2022105764A1 (en) Goods storage method and apparatus, and robot, warehousing system and storage medium
JP2019509559A (en) Box location, separation, and picking using a sensor-guided robot
US20230286751A1 (en) Method and device for taking out and placing goods, warehousing robot and warehousing system
US9802317B1 (en) Methods and systems for remote perception assistance to facilitate robotic object manipulation
WO2022237221A1 (en) Adjustment method, apparatus, and device for goods retrieval and placement apparatus, robot, and warehouse system
US20220203547A1 (en) System and method for improving automated robotic picking via pick planning and interventional assistance
CN112025701A (en) Method, device, computing equipment and storage medium for grabbing object
JP2017100214A (en) Manipulator system, imaging system, object delivery method, and manipulator control program
US10958895B1 (en) High speed automated capture of 3D models of packaged items
JP2014024142A (en) Apparatus and method for taking out bulk articles by robot
CN113351522A (en) Article sorting method, device and system
CN111191650B (en) Article positioning method and system based on RGB-D image visual saliency
US11981518B2 (en) Robotic tools and methods for operating the same
CN112605986B (en) Method, device and equipment for automatically picking up goods and computer readable storage medium
JP7175487B1 (en) Robotic system with image-based sizing mechanism and method for operating the robotic system
CN114170442A (en) Method and device for determining space grabbing points of robot

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant