CN117208567A - Material taking method, device, equipment and storage medium for unordered picking of material basket - Google Patents

Material taking method, device, equipment and storage medium for unordered picking of material basket

Info

Publication number
CN117208567A
CN117208567A (application CN202210620397.9A)
Authority
CN
China
Prior art keywords
target object
picking
target
state
basket
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210620397.9A
Other languages
Chinese (zh)
Inventor
赵炜
俞冠廷
周佳骥
董政霖
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Star Ape Philosophy Technology Shanghai Co ltd
Original Assignee
Star Ape Philosophy Technology Shanghai Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Star Ape Philosophy Technology Shanghai Co ltd filed Critical Star Ape Philosophy Technology Shanghai Co ltd
Priority to CN202210620397.9A
Publication of CN117208567A
Legal status: Pending

Landscapes

  • Manipulator (AREA)

Abstract

The application provides a material taking method, device, equipment and storage medium for unordered picking of a material basket, comprising the following steps: acquiring a target depth image of a material basket and a plurality of target objects stored in the material basket, performing target detection on the target depth image to determine a plurality of target object areas, and determining the storage positions of the target objects in the material basket according to the depth information of the target object areas; determining the pose of each target object according to its target object area, and judging the placement state of the target object according to the pose and the storage position, the placement state at least comprising a pickable state and a non-pickable state; for a target object in the pickable state, controlling the end effector to pick the target object and move it to a storage position, and for a target object in the non-pickable state, controlling the end effector to pick up the target object, move it along a preset trajectory and then release it in the material basket. The application can improve the basket clearing rate of unordered picking from the material basket.

Description

Material taking method, device, equipment and storage medium for unordered picking of material basket
Technical Field
The application relates to sorting robots, and in particular to a material taking method, device, equipment and storage medium for unordered picking of material baskets.
Background
A sorting robot is a robot equipped with sensors, an objective lens and an electro-optical system, and can quickly sort goods.
In the picking procedure of unordered picking from a material basket, for workpieces placed at different angles, the picking posture of the end effector is usually adjusted through pitching and rotating actions of the wrist of the robotic arm.
However, when an object to be picked lies at a steep angle in the material basket or is in a limit position (for example, when its inclination angle is large or it is close to the basket wall), the pitching action of the robotic arm can cause the arm or its end effector, which has a certain axial length, to interfere with the basket wall, so that the picking conditions are not met and the basket clearing rate is low.
Disclosure of Invention
In view of the defects in the prior art, the purpose of the application is to provide a material taking method, device, equipment and storage medium for unordered picking of a material basket.
The material taking method for unordered picking of the material basket provided by the application comprises the following steps:
acquiring a target depth image comprising a material basket and a plurality of target objects stored in the material basket, performing target detection on the target depth image to determine a plurality of target object areas, and determining the storage position of the target objects in the material basket according to the target object areas;
determining the pose of the target object according to the depth information of the target object area, and judging the placement state of the target object according to the pose and the storage position, wherein the placement state at least comprises a pickable state and/or a non-pickable state;
for a target object in the pickable state, controlling the end effector to pick the target object and move it to a storage position, and for a target object in the non-pickable state, controlling the end effector to pick up the target object, move it along a preset trajectory and then release it in the material basket.
Preferably, determining the placement position of the target object in the basket comprises:
acquiring target depth images of a plurality of target objects stored in a material basket;
performing target detection on the target depth image to determine a plurality of target object areas;
and determining the placement position of the target object in the basket according to the depth information of the target object area and the position relation between the target object area and the target depth image.
Preferably, when judging the placement state of the target object, the method includes:
acquiring each target object area, and determining the pose of the target object according to the depth information of the target object area;
determining whether the target object is in a contactable position of the end effector according to the storage position and pose of the target object and the coordinate information of the material basket, triggering a picking area judging step when the target object is in a contactable position of the end effector, and determining that the placement state of the target object is the non-pickable state when the target object is in a non-contactable position of the end effector;
acquiring a preset picking area, judging whether the picking area exists on the target object according to the depth information of the target object area, determining that the placement state of the target object is the pickable state when the picking area exists on the target object, and determining that the placement state of the target object is the non-pickable state when the picking area does not exist on the target object.
Preferably, the picking and moving of the target object includes:
controlling the end effector to pick a target object in the pickable state through the picking area on the target object and then place it in a storage position;
determining, for a target object in the non-pickable state, a contactable area between the end effector and the target object according to the depth information of the target object area;
controlling the end effector to contact the contactable area on the target object and, after moving along a preset trajectory, release the target object within the basket.
Preferably, the method further comprises:
and repeating the steps from determining the placement position of the target object in the material basket to picking and moving the target object until all the target objects in the material basket are moved to the storage position.
Preferably, the picking area is an area through which the end effector can pick up the target object and hold it for a picking time greater than a preset time threshold;
the picking area is determined at least according to the weight and shape of the target object;
the contactable area is any area on the target object.
Preferably, the end effector employs a magnet suction cup or gripper;
the target object is an iron rod;
the area, in which the end face or the side face of the iron rod is continuous and is larger than the preset area threshold value, is set as a picking area.
The material taking device for unordered picking of the material basket provided by the application comprises the following modules:
the image acquisition module is used for acquiring target depth images of a material basket and a plurality of target objects stored in the material basket, carrying out target detection on the target depth images to determine a plurality of target object areas, and determining the storage positions of the target objects in the material basket according to the target object areas;
the placement state determining module is used for determining the pose of the target object according to the target object area, and judging the placement state of the target object according to the pose and the storage position, wherein the placement state at least comprises a pickable state and/or a non-pickable state;
and the picking control module is used for controlling the end effector to pick a target object in the pickable state and move it to a storage position, and for controlling the end effector to pick up a target object in the non-pickable state, move it along a preset trajectory and then release it in the material basket.
The material taking equipment for unordered picking of the material basket provided by the application comprises:
a processor;
a memory module having stored therein executable instructions of the processor;
wherein the processor is configured to perform the steps of the above material taking method for unordered picking of material baskets via execution of the executable instructions.
According to the application, a computer readable storage medium is provided for storing a program which, when executed, implements the steps of the material taking method for unordered picking of material baskets.
Compared with the prior art, the application has the following beneficial effects:
according to the application, a depth camera is used for collecting target depth images of a material basket and a plurality of target objects stored in the material basket, a plurality of target object areas are detected in the depth images, the pose of the target objects and the positions of the target objects in the material basket are determined, the placement state of the target objects is judged according to the pose and the storage positions, the placement state at least comprises a pickable state and/or an unobservable state, the pickable state is picked and moved to a storage position, after all the target objects in the pickable state are picked to the storage position, an end effector is controlled to contact any target object in the unobservable state, the target object is moved and then placed in the material basket, so that the state of the target object is changed into a pickable state, the collection of the target depth images is carried out through the depth camera again, and then the pickup of the plurality of target objects is carried out according to the placement state of the target objects, so that the pickup of the limit position of the object to be picked can be realized, interference of a basket to be picked up is reduced, and the unordered pickup rate of the material basket is lifted.
Drawings
In order to more clearly illustrate the embodiments of the present application or the technical solutions in the prior art, the drawings required in the description of the embodiments or the prior art are briefly described below. It is obvious that the drawings in the following description are only embodiments of the present application, and that other drawings can be obtained from them by a person skilled in the art without inventive effort. Other features, objects and advantages of the present application will become more apparent upon reading the detailed description of non-limiting embodiments, given with reference to the accompanying drawings, in which:
FIG. 1 is a flow chart of the steps of a method for unordered picking of material baskets according to an embodiment of the present application;
FIG. 2 is a flow chart of steps of a method for unordered picking of material baskets in a variation of the present application;
FIG. 3 is a flowchart illustrating steps for determining a placement position of the target object in the basket according to an embodiment of the present application;
FIG. 4 is a flowchart illustrating steps for determining a placement state of a target object according to an embodiment of the present application;
FIG. 5 is a flowchart illustrating steps for picking and moving a target object according to an embodiment of the present application;
FIG. 6 is a schematic diagram of a robot for unordered picking of material baskets according to an embodiment of the present application;
FIG. 7 is a schematic block diagram of a pick-up device for unordered picking of material baskets in accordance with an embodiment of the present application;
FIG. 8 is a schematic diagram of a material picking apparatus for unordered picking of material baskets according to an embodiment of the present application; and
fig. 9 is a schematic diagram of a computer-readable storage medium according to an embodiment of the present application.
Detailed Description
The present application will be described in detail with reference to specific examples. The following examples will assist those skilled in the art in further understanding the present application, but are not intended to limit the application in any way. It should be noted that variations and modifications could be made by those skilled in the art without departing from the inventive concept. These are all within the scope of the present application.
The terms "first," "second," "third," "fourth" and the like in the description and in the claims and in the above drawings, if any, are used for distinguishing between similar objects and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used may be interchanged where appropriate such that the embodiments of the application described herein may be implemented, for example, in sequences other than those illustrated or otherwise described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
The technical solution of the present application, and how it solves the above technical problems, is described in detail below with specific embodiments. The following embodiments may be combined with each other, and the same or similar concepts or processes may not be described again in some embodiments. Embodiments of the present application will be described below with reference to the accompanying drawings.
Fig. 1 is a flow chart of steps of a material taking method for unordered picking of a material basket according to an embodiment of the present application, as shown in fig. 1, the material taking method for unordered picking of a material basket provided by the present application includes the following steps:
step S1: acquiring a target depth image comprising a material basket and a plurality of target objects stored in the material basket, performing target detection on the target depth image to determine a plurality of target object areas, and determining the storage position of the target objects in the material basket according to the target object areas;
more specifically, fig. 3 shows a flowchart of the steps for determining the placement position of the target object in the basket according to the embodiment of the present application, as shown in fig. 3, the step S1 includes the following steps:
step S101: acquiring target depth images of a plurality of target objects stored in a material basket;
step S102: performing target detection on the target depth image to determine a plurality of target object areas;
step S103: and determining the placement position of the target object in the basket according to the depth information of the target object area and the position relation between the target object area and the target depth image.
In the embodiment of the application, the depth camera is arranged directly above the material basket, looking down on the whole basket, and acquires a target depth image comprising the material basket and the plurality of target objects stored in it.
The position of the basket may be predetermined, so that when a target object area is determined by target detection in the target depth image, the storage position of the target object relative to the basket can be determined.
Alternatively, the position of the basket can be determined by detection in the target depth image, from which the storage position of the target object relative to the basket can likewise be determined.
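As a minimal sketch of step S1, the following Python function back-projects a detected target object area into a storage position relative to the basket, assuming a pinhole depth camera looking straight down, known intrinsics fx, fy, cx, cy, and a pre-calibrated or detected basket origin; all of these names are assumptions for illustration and are not taken from the original disclosure.

```python
import numpy as np

def region_to_storage_position(mask, depth, fx, fy, cx, cy, basket_origin):
    """Convert a detected target object area into a storage position in basket coordinates.

    mask          : boolean array marking the target object area in the depth image
    depth         : depth image in metres, same shape as mask
    basket_origin : (x, y, z) of the basket reference point in camera coordinates (assumed known)
    """
    vs, us = np.nonzero(mask)                 # pixel coordinates of the object region
    zs = depth[vs, us]
    valid = zs > 0                            # drop pixels with no depth return
    us, vs, zs = us[valid], vs[valid], zs[valid]

    # Back-project the region centroid with the pinhole model (assumed intrinsics).
    z = float(np.median(zs))
    u, v = float(np.mean(us)), float(np.mean(vs))
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy

    # Express the point relative to the basket, whose pose is assumed pre-calibrated
    # or detected in the same depth image.
    return np.array([x, y, z]) - np.asarray(basket_origin, dtype=float)
```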
Step S2: determining the pose of the target object according to the depth information of the target object area, and judging the placement state of the target object according to the pose and the storage position, wherein the placement state at least comprises a pickable state and/or a non-pickable state;
more specifically, fig. 4 shows a flowchart of the steps for determining the placement state of the target object in the embodiment of the present application, and as shown in fig. 4, the step S2 includes the following steps:
step S201: acquiring each target object area, and determining the pose of the target object according to the depth information of the target object area;
step S202: determining whether the target object is in a pickable position of the end effector according to the storage positions and the postures of the target objects and the coordinate information of the material basket, triggering step S203 when the target object is in a contactable position of the end effector, and determining that the placement state of the target object is an unclassible state when the target object is in an uncontrollable position of the end effector;
step S203: acquiring a preset picking area, judging whether the picking area exists in the target object according to the depth information of the target object area, determining that the placement state of the target object is a pickable state when the picking area exists on the target object, and determining that the placement state of the target object is an uncleanable state when the picking area does not exist on the target object.
In an embodiment of the present application, the depth information of the target object area comprises a plurality of points with three-dimensional coordinates X, Y, Z, from which the contour and pose of the target object can be determined. For a cylindrical iron rod, if the depth information of the target object area fits a circular outline, it can be determined that one end of the cylinder faces upwards; if it fits an arc-shaped outline extending over a certain length, it can be determined that the cylinder is lying on its side.
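The contour-fitting idea and the two-stage judgement of steps S202 and S203 might be sketched as follows; the elongation threshold, the helper callables reachable and find_picking_area, and the returned state strings are illustrative assumptions rather than the patented implementation.

```python
import numpy as np

def estimate_pose(mask):
    """Classify a cylindrical rod as end-up or lying down from its 2-D footprint."""
    vs, us = np.nonzero(mask)
    pts = np.stack([us, vs], axis=1).astype(float)
    cov = np.cov(pts.T)
    eigvals = np.sort(np.linalg.eigvalsh(cov))[::-1]
    elongation = np.sqrt(eigvals[0] / max(eigvals[1], 1e-9))
    if elongation < 1.5:              # near-circular footprint -> one end faces upwards
        return "end_up", None
    # Elongated footprint -> rod lies on its side; principal eigenvector is its axis.
    axis = np.linalg.eigh(cov)[1][:, -1]
    return "lying", axis


def judge_placement_state(obj, basket, reachable, find_picking_area):
    """Two-stage decision of steps S202/S203 (hypothetical helpers injected as callables)."""
    if not reachable(obj.storage_position, obj.pose, basket):
        return "non_pickable"          # end effector cannot even contact the object
    if find_picking_area(obj) is None:
        return "non_pickable"          # contactable, but no valid picking area exposed
    return "pickable"
```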
Step S3: for a target object in the pickable state, controlling the end effector to pick the target object and move it to a storage position, and for a target object in the non-pickable state, controlling the end effector to pick up the target object, move it along a preset trajectory and then release it in the material basket.
More specifically, fig. 5 shows a flowchart of the steps of picking and moving the target object in the embodiment of the present application, and as shown in fig. 5, the step S3 includes the following steps:
step S301: controlling an end effector to pick the target object in a pickable state through a picking area on the target object and then placing the target object in a storage position;
step S302: determining a contactable area of the end effector and the target object according to the depth information of the target object area for the target object in a non-pickable state;
step S303: the end effector is controlled to contact the accessible region on the target object and release the target object within the basket after moving in a predetermined trajectory.
In the embodiment of the application, the picking area is an area through which the end effector can pick up the target object and hold it for a picking time greater than a preset time threshold;
the picking area is determined at least according to the weight and shape of the target object;
the time threshold may be set, for example, to be greater than 5 seconds.
When the target object is a cylindrical iron rod, the picking area may be set as an area extending along the side of the cylinder, and the extension length of this area may be set to about half the length of the cylinder;
the contactable area is any area on the target object.
The end effector employs a magnet suction cup or a gripper, preferably a magnet suction cup with a profile matching the iron rod.
The preset trajectory can be set to move to a position above the center of the material basket.
The magnet suction cup is attached to the picking area on the target object to pick up the iron rod and place it in a storage position. By contacting any contactable area on the target object, the magnet suction cup can temporarily suck up the target object and move it a certain distance, or move it into a pose in which its picking area is exposed, which facilitates subsequent picking by the end effector.
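The pick-or-nudge behaviour described above can be sketched as the following control flow, assuming a hypothetical effector object exposing move_to, attach, release and follow_trajectory commands (for example, a magnet suction cup driver); these calls do not correspond to any specific real robot API.

```python
def handle_object(effector, obj, storage_position, basket_center_above):
    """Pick a pickable object to storage, or nudge a non-pickable one to expose a picking area."""
    if obj.state == "pickable":
        effector.move_to(obj.picking_pose)                # approach the picking area
        effector.attach()                                 # suction cup engages the rod
        effector.move_to(storage_position)
        effector.release()                                # object is now stored
    else:
        effector.move_to(obj.contact_pose)                # touch any contactable area
        effector.attach()
        effector.follow_trajectory(basket_center_above)   # preset trajectory: above the basket center
        effector.release()                                # drop back into the basket so the pose changes
```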
In a modification of the present application, a continuous area on the end face or side face of the iron rod that is larger than a preset area threshold is set as the picking area.
Fig. 2 is a flow chart of steps of a material taking method for unordered picking of a material basket according to a modification of the present application, as shown in fig. 2, the material taking method for unordered picking of a material basket according to the present application further includes the following steps:
and repeating the steps S1 to S3 until all the target objects in the material basket are moved to the storage position.
In a modification of the present application, after the end effector picks up a target object, moves it along the preset trajectory and releases it in the basket, the depth camera is controlled again to collect a target depth image of the basket and the target objects stored in it, the poses and storage positions of the target objects are determined, the placement states are judged again, and the operation is repeated until the basket is cleared.
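Putting steps S1 to S3 together, the clearing loop of this variation might look like the sketch below, where camera.capture_depth, detect_objects and basket.center_above are hypothetical helpers and handle_object is the sketch given earlier; none of these names are defined in the original disclosure.

```python
def clear_basket(camera, effector, basket, max_rounds=100):
    """Repeat acquisition, state judgement and picking until the basket is empty."""
    for _ in range(max_rounds):
        depth = camera.capture_depth()                    # step S1: new top-down depth image
        objects = detect_objects(depth, basket)           # detection + storage positions + states
        if not objects:
            return True                                   # basket cleared

        pickable = [o for o in objects if o.state == "pickable"]
        if pickable:
            for o in pickable:                            # step S3: move pickable objects out
                handle_object(effector, o, o.storage_position, basket.center_above)
        else:
            # Only non-pickable objects remain: nudge one and re-image on the next round.
            handle_object(effector, objects[0], None, basket.center_above)
    return False                                          # safety cap reached without clearing
```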
Fig. 6 is a schematic structural diagram of a robot for unordered picking of a basket according to an embodiment of the present application, as shown in fig. 6, where the robot for unordered picking of a basket provided by the present application further includes:
the first unit and the second unit are used for storing or/and transporting materials;
a depth camera 300, whose visual scanning area covers at least a first unit for storing or transporting the material, for performing visual scanning on the material, collecting a depth image of the material, and generating pose information and storage position of the material according to the depth image;
the robot unit 100 is communicatively connected to the depth camera 300, and is configured to receive the pose information and the storage position, determine a placement state of the target object according to the pose and the storage position, and pick up or pull the target object according to the placement state.
In an embodiment of the present application, the first unit may be provided as the storage unit 200;
the storage unit 200 is used for storing unordered materials, wherein the materials are the target objects, namely the iron bars;
the robot unit 100 is communicatively connected to the depth camera 300, and is configured to receive the position and posture information and the storage position, determine a placement state of the target object according to the position and the posture, pick up the target object according to the placement state, and then transfer the picked target object to a second unit or pull the picked target object to be placed in the storage unit 200 again.
The second unit 400 may be configured to transport or store the picked material, for example a support frame configured to facilitate the orderly arrangement of the items.
The second unit 400 may further include a transport unit, so that the robot unit 100 can move the target objects on the support frame to the transport unit.
The depth camera 300 is mounted on the camera mount 500 and is not visible in the figure because it is shielded by a beam of the camera mount 500.
The robot unit 100 includes a processor configured to execute, via executable instructions, the steps of the material taking method for unordered picking of the material basket: a target depth image comprising the basket and a plurality of target objects stored in it is collected by the depth camera; a plurality of target object areas are detected in the depth image, and the pose of each target object and its position in the basket are determined; the placement state of each target object is judged according to the pose and the storage position, the placement state at least comprising a pickable state and/or a non-pickable state; target objects in the pickable state are picked and moved to a storage position; after all target objects in the pickable state have been moved to the storage position, the end effector is controlled to contact any target object in the non-pickable state, move it and place it back in the basket so that its state changes to the pickable state; the depth camera then collects a new target depth image, and the remaining target objects are picked according to their placement states. In this way, objects in limit positions can be picked, interference between the end effector and the basket wall is reduced, and the basket clearing rate of unordered picking is improved.
Fig. 7 is a schematic block diagram of a material taking device for unordered picking of a material basket according to an embodiment of the present application, as shown in fig. 7, the material taking device for unordered picking of a material basket provided by the present application includes the following blocks:
the image acquisition module is used for acquiring target depth images of a material basket and a plurality of target objects stored in the material basket, carrying out target detection on the target depth images to determine a plurality of target object areas, and determining the storage positions of the target objects according to the target object areas;
the placement state determining module is used for determining the pose of the target object according to the target object area, and judging the placement state of the target object according to the pose and the storage position, wherein the placement state at least comprises a pickable state and/or a non-pickable state;
and the picking control module is used for controlling the end effector to pick a target object in the pickable state and move it to a storage position, and for controlling the end effector to pick up a target object in the non-pickable state, move it along a preset trajectory and then release it in the material basket.
The embodiment of the application also provides material taking equipment for unordered picking of a material basket, which comprises a processor and a memory. The memory stores executable instructions of the processor, and the processor is configured to perform the steps of the material taking method for unordered picking of material baskets via execution of the executable instructions.
As described above, in this embodiment the depth camera is controlled to collect a target depth image comprising the material basket and the plurality of target objects stored in it; a plurality of target object areas are detected in the depth image, and the pose of each target object and its position in the material basket are determined. The placement state of each target object is judged according to the pose and the storage position, the placement state at least comprising a pickable state and/or a non-pickable state. Target objects in the pickable state are picked and moved to a storage position. After all target objects in the pickable state have been moved to the storage position, the end effector is controlled to contact any target object in the non-pickable state, move it and then place it back in the material basket so that its state changes to the pickable state; the depth camera then collects a new target depth image, and the remaining target objects are picked according to their placement states. In this way, objects to be picked in limit positions can be picked, interference between the end effector and the basket wall is reduced, and the basket clearing rate of unordered picking from the material basket is improved.
Those skilled in the art will appreciate that the various aspects of the application may be implemented as a system, method, or program product. Accordingly, aspects of the application may be embodied in the following forms: an entirely hardware embodiment, an entirely software embodiment (including firmware, micro-code, etc.), or an embodiment combining hardware and software aspects, which may be referred to herein as a "circuit", "module" or "platform".
Fig. 8 is a schematic structural view of the material taking equipment for unordered picking of a material basket in an embodiment of the present application. An electronic device 600 according to this embodiment of the application is described below with reference to fig. 8. The electronic device 600 shown in fig. 8 is merely an example, and should not be construed as limiting the functionality and scope of use of embodiments of the present application.
As shown in fig. 8, the electronic device 600 is in the form of a general purpose computing device. Components of electronic device 600 may include, but are not limited to: at least one processing unit 610, at least one memory unit 620, a bus 630 connecting the different platform components (including memory unit 620 and processing unit 610), a display unit 640, etc.
Wherein the memory unit stores program code that is executable by the processing unit 610 such that the processing unit 610 performs the steps according to various exemplary embodiments of the present application described in the above-described pick-out method section for out-of-order picking of material baskets. For example, the processing unit 610 may perform the steps as shown in fig. 1.
The storage unit 620 may include readable media in the form of volatile storage units, such as Random Access Memory (RAM) 6201 and/or cache memory unit 6202, and may further include Read Only Memory (ROM) 6203.
The storage unit 620 may also include a program/utility 6204 having a set (at least one) of program modules 6205, such program modules 6205 including, but not limited to: an operating system, one or more application programs, other program modules, and program data, each or some combination of which may include an implementation of a network environment.
Bus 630 may represent one or more of several types of bus structures, including a memory unit bus or memory unit controller, a peripheral bus, an accelerated graphics port, a processor, or a local bus using any of a variety of bus architectures.
The electronic device 600 may also communicate with one or more external devices 700 (e.g., keyboard, pointing device, bluetooth device, camera, depth camera, etc.), one or more devices that enable a user to interact with the electronic device 600, and/or any device (e.g., router, modem, etc.) that enables the electronic device 600 to communicate with one or more other computing devices. Such communication may occur through an input/output (I/O) interface 650. Also, electronic device 600 may communicate with one or more networks such as a Local Area Network (LAN), a Wide Area Network (WAN), and/or a public network, such as the Internet, through network adapter 660. The network adapter 660 may communicate with other modules of the electronic device 600 over the bus 630. It should be appreciated that although not shown in fig. 8, other hardware and/or software modules may be used in connection with electronic device 600, including, but not limited to: microcode, device drivers, redundant processing units, external disk drive arrays, RAID systems, tape drives, data backup storage platforms, and the like.
The embodiment of the application also provides a computer readable storage medium for storing a program, and the steps of the material taking method for unordered picking of the material basket are realized when the program is executed. In some possible embodiments, the various aspects of the application may also be implemented in the form of a program product comprising program code for causing a terminal device to carry out the steps according to various exemplary embodiments of the application as described in the above-mentioned material taking method section for out-of-order picking of material baskets, when the program product is run on the terminal device.
As described above, when the program of the computer readable storage medium of this embodiment is executed, the depth camera is controlled to collect a target depth image comprising the basket and the plurality of target objects stored in it; a plurality of target object areas are detected in the depth image, and the pose of each target object and its position in the basket are determined. The placement state of each target object is judged according to the pose and the storage position, the placement state at least comprising a pickable state and/or a non-pickable state. Target objects in the pickable state are picked and moved to a storage position. After all target objects in the pickable state have been moved to the storage position, the end effector is controlled to contact any target object in the non-pickable state, move it and then place it back in the basket so that its state changes to the pickable state; the depth camera then collects a new target depth image, and the remaining target objects are picked according to their placement states. In this way, objects in limit positions can be picked, interference between the end effector and the basket wall is reduced, and the basket clearing rate of unordered picking is improved.
Fig. 9 is a schematic structural view of a computer-readable storage medium in an embodiment of the present application. Referring to fig. 9, a program product 800 for implementing the above-described method according to an embodiment of the present application is described, which may employ a portable compact disc read only memory (CD-ROM) and include program code, and may be run on a terminal device, such as a personal computer. However, the program product of the present application is not limited thereto, and in this document, a readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
The program product may employ any combination of one or more readable media. The readable medium may be a readable signal medium or a readable storage medium. The readable storage medium can be, for example, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or a combination of any of the foregoing. More specific examples (a non-exhaustive list) of the readable storage medium would include the following: an electrical connection having one or more wires, a portable disk, a hard disk, random Access Memory (RAM), read-only memory (ROM), erasable programmable read-only memory (EPROM or flash memory), optical fiber, portable compact disk read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
A computer readable signal medium may include a data signal propagated in baseband or as part of a carrier wave, with readable program code embodied therein. Such a propagated data signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination of the foregoing. A readable signal medium may also be any readable medium, other than a readable storage medium, that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.
Program code for carrying out operations of the present application may be written in any combination of one or more programming languages, including an object oriented programming language such as Java, C++ or the like and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computing device, partly on the user's device, as a stand-alone software package, partly on the user's computing device, partly on a remote computing device, or entirely on the remote computing device or server. In the case of remote computing devices, the remote computing device may be connected to the user computing device through any kind of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or may be connected to an external computing device (e.g., connected via the Internet using an Internet service provider).
In the present specification, each embodiment is described in a progressive manner; each embodiment focuses on its differences from the other embodiments, and for identical or similar parts between the embodiments, reference may be made to one another. The previous description of the disclosed embodiments is provided to enable any person skilled in the art to make or use the present application. Various modifications to these embodiments will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other embodiments without departing from the spirit or scope of the application. Thus, the present application is not intended to be limited to the embodiments shown herein but is to be accorded the widest scope consistent with the principles and novel features disclosed herein.
The foregoing describes specific embodiments of the present application. It is to be understood that the application is not limited to the particular embodiments described above, and that various changes and modifications may be made by one skilled in the art within the scope of the claims without affecting the spirit of the application.

Claims (10)

1. A material taking method for unordered picking of material baskets, comprising:
acquiring a target depth image comprising a material basket and a plurality of target objects stored in the material basket, performing target detection on the target depth image to determine a plurality of target object areas, and determining the storage position of the target objects in the material basket according to the target object areas;
determining the pose of the target object according to the depth information of the target object area, and judging the placement state of the target object according to the pose and the storage position, wherein the placement state at least comprises a pickable state and/or a non-pickable state;
for the target object in the pickable state, controlling the end effector to pick the target object and move it to a storage position, and for the target object in the non-pickable state, controlling the end effector to pick up the target object, move it along a preset trajectory and then release it in the material basket.
2. The method of claim 1, wherein determining the placement of the target object in the basket comprises:
acquiring target depth images of a plurality of target objects stored in a material basket;
performing target detection on the target depth image to determine a plurality of target object areas;
and determining the placement position of the target object in the basket according to the depth information of the target object area and the position relation between the target object area and the target depth image.
3. The material taking method for unordered picking of material baskets according to claim 1, wherein determining the placement state of the target object comprises:
acquiring each target object area, and determining the pose of the target object according to the depth information of the target object area;
determining whether the target object is in a contactable position of the end effector according to the storage position and pose of the target object and the coordinate information of the material basket, triggering a picking area judging step when the target object is in a contactable position of the end effector, and determining that the placement state of the target object is the non-pickable state when the target object is in a non-contactable position of the end effector;
acquiring a preset picking area, judging whether the picking area exists on the target object according to the depth information of the target object area, determining that the placement state of the target object is the pickable state when the picking area exists on the target object, and determining that the placement state of the target object is the non-pickable state when the picking area does not exist on the target object.
4. The material taking method for unordered picking of material baskets according to claim 1, wherein picking and moving the target object comprises:
controlling an end effector to pick the target object in a pickable state through a picking area on the target object and then placing the target object in a storage position;
determining, for a target object in the non-pickable state, a contactable area between the end effector and the target object according to the depth information of the target object area;
controlling the end effector to contact the contactable area on the target object and, after moving along a preset trajectory, release the target object within the basket.
5. The method of claim 1, wherein, when the placement states include a pickable state, the end effector is controlled to pick a target object in the pickable state and move it to a storage position, and when the placement states include only the non-pickable state, the end effector is controlled to contact a target object in the non-pickable state, move it along a preset trajectory and then release it again in the basket so that the state of the target object changes to the pickable state;
and repeating the steps from determining the placement position of the target object in the material basket to picking and moving the target object until all the target objects in the material basket are moved to the storage position.
6. The material taking method for unordered picking of material baskets according to claim 3 or 4, wherein the picking area is an area through which the end effector can pick up the target object and hold it for a picking time greater than a preset time threshold;
the picking area is determined at least according to the weight and shape of the target object;
the contactable area is any area on the target object.
7. The method of claim 6, wherein the end effector is a magnet suction cup or gripper;
the target object is an iron rod;
a continuous area on the end face or side face of the iron rod that is larger than a preset area threshold is set as a picking area.
8. A material taking device for unordered picking of a material basket, characterized by comprising the following modules:
the image acquisition module is used for acquiring target depth images of a material basket and a plurality of target objects stored in the material basket, carrying out target detection on the target depth images to determine a plurality of target object areas, and determining the storage positions of the target objects in the material basket according to the target object areas;
the placement state determining module is used for determining the pose of the target object according to the target object area, and judging the placement state of the target object according to the pose and the storage position, wherein the placement state at least comprises a pickable state and/or a non-pickable state;
and the picking control module is used for controlling the end effector to pick a target object in the pickable state and move it to a storage position, and for controlling the end effector to pick up a target object in the non-pickable state, move it along a preset trajectory and then release it in the material basket.
9. A material taking apparatus for unordered picking of a basket of material, comprising:
a processor;
a memory module having stored therein executable instructions of the processor;
wherein the processor is configured to perform the steps of the material taking method for unordered picking of material baskets of any one of claims 1 to 7 via execution of the executable instructions.
10. A computer readable storage medium storing a program, wherein the program, when executed, implements the steps of the material taking method for unordered picking of material baskets of any one of claims 1 to 7.
CN202210620397.9A 2022-06-02 2022-06-02 Material taking method, device, equipment and storage medium for unordered picking of material basket Pending CN117208567A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210620397.9A CN117208567A (en) 2022-06-02 2022-06-02 Material taking method, device, equipment and storage medium for unordered picking of material basket

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210620397.9A CN117208567A (en) 2022-06-02 2022-06-02 Material taking method, device, equipment and storage medium for unordered picking of material basket

Publications (1)

Publication Number Publication Date
CN117208567A (en) 2023-12-12

Family

ID=89041248

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210620397.9A Pending CN117208567A (en) 2022-06-02 2022-06-02 Material taking method, device, equipment and storage medium for unordered picking of material basket

Country Status (1)

Country Link
CN (1) CN117208567A (en)

Similar Documents

Publication Publication Date Title
EP3405910B1 (en) Deep machine learning methods and apparatus for robotic grasping
JP4938115B2 (en) Work take-out device and work take-out method
CN110660104A (en) Industrial robot visual identification positioning grabbing method, computer device and computer readable storage medium
JP6088563B2 (en) Work picking robot system having position and orientation conversion operation function, and work picking method
US20070007924A1 (en) Handling system, work system, and program
JP2017185578A (en) Object gripping device and gripping control program
RU2008110693A (en) METHOD AND DEVICE FOR DETERMINING THE LOCATION AND EXTRACTION OF OBJECTS FROM THE TRANSPORTING DEVICE
CN113561179B (en) Robot control method, robot control device, robot, storage medium, and program product
CN114453258A (en) Parcel sorting system, parcel sorting method, industrial control equipment and storage medium
CN109863365B (en) Method, electronic device and system for picking up objects from container
CN114202526A (en) Quality detection method, system, apparatus, electronic device, and medium
CN117208567A (en) Material taking method, device, equipment and storage medium for unordered picking of material basket
JP2024015358A (en) Systems and methods for robotic system with object handling
CN112847348A (en) Manipulator control method, manipulator control device, pickup apparatus, and computer-readable storage medium
CN115533895B (en) Two-finger manipulator workpiece grabbing method and system based on vision
CN117206215A (en) Material taking device for unordered picking of material basket
CN113284129B (en) 3D bounding box-based press box detection method and device
CN115556094A (en) Material taking method and device based on three-axis manipulator and computer readable storage medium
JP2015157343A (en) Robot, robot system, control device, and control method
JP2014174628A (en) Image recognition method
CN116704003A (en) Picking multi-grab detection method, system, equipment and storage medium
CN115139325B (en) Object grasping system
CN117809232A (en) Object grabbing gesture recognition method, device, equipment and storage medium
CN115471834A (en) Object grabbing posture recognition method, device and equipment and storage medium
CN117798957A (en) Object gripping system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination