WO2022090627A1 - Waste sorting robot with throw sensor for determining position of waste object - Google Patents

Waste sorting robot with throw sensor for determining position of waste object

Info

Publication number
WO2022090627A1
Authority
WO
WIPO (PCT)
Prior art keywords
waste
throw
gripper
target position
waste object
Prior art date
Application number
PCT/FI2021/050724
Other languages
French (fr)
Inventor
Harri HOLOPAINEN
Tuomas Lukka
Original Assignee
Zenrobotics Oy
Application filed by Zenrobotics Oy filed Critical Zenrobotics Oy
Publication of WO2022090627A1 publication Critical patent/WO2022090627A1/en

Classifications

    • B: PERFORMING OPERATIONS; TRANSPORTING
      • B07: SEPARATING SOLIDS FROM SOLIDS; SORTING
        • B07C: POSTAL SORTING; SORTING INDIVIDUAL ARTICLES, OR BULK MATERIAL FIT TO BE SORTED PIECE-MEAL, e.g. BY PICKING
          • B07C 5/00: Sorting according to a characteristic or feature of the articles or material being sorted, e.g. by control effected by devices which detect or measure such characteristic or feature; Sorting by manually actuated devices, e.g. switches
            • B07C 5/04: Sorting according to size
              • B07C 5/06: Sorting according to size measured mechanically
                • B07C 5/065: Sorting according to size measured mechanically with multiple measuring appliances adjusted according to different standards, for example length or thickness, which detect the shape of an object so that if it conforms to the standard set by the measuring appliance, it is removed from the conveyor, e.g. by means of a number of differently calibrated openings
            • B07C 5/36: Sorting apparatus characterised by the means used for distribution
          • B07C 2501/00: Sorting according to a characteristic or feature of the articles or material to be sorted
            • B07C 2501/0054: Sorting of waste or refuse
            • B07C 2501/0063: Using robots
      • B25: HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
        • B25J: MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
          • B25J 15/00: Gripping heads and other end effectors
            • B25J 15/06: Gripping heads and other end effectors with vacuum or magnetic holding means
              • B25J 15/0616: Gripping heads and other end effectors with vacuum holding means
          • B25J 9/00: Programme-controlled manipulators
            • B25J 9/0093: Programme-controlled manipulators co-operating with conveyor means
            • B25J 9/02: Programme-controlled manipulators characterised by movement of the arms, e.g. cartesian coordinate type
              • B25J 9/023: Cartesian coordinate type
                • B25J 9/026: Gantry-type
      • B65: CONVEYING; PACKING; STORING; HANDLING THIN OR FILAMENTARY MATERIAL
        • B65G: TRANSPORT OR STORAGE DEVICES, e.g. CONVEYORS FOR LOADING OR TIPPING, SHOP CONVEYOR SYSTEMS OR PNEUMATIC TUBE CONVEYORS
          • B65G 47/00: Article or material-handling devices associated with conveyors; Methods employing such devices
            • B65G 47/74: Feeding, transfer, or discharging devices of particular kinds or types
              • B65G 47/90: Devices for picking-up and depositing articles or materials
                • B65G 47/91: Devices for picking-up and depositing articles or materials incorporating pneumatic, e.g. suction, grippers
                  • B65G 47/917: Control arrangements
    • G: PHYSICS
      • G05: CONTROLLING; REGULATING
        • G05B: CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
          • G05B 19/00: Programme-control systems
            • G05B 19/02: Programme-control systems electric
              • G05B 19/418: Total factory control, i.e. centrally controlling a plurality of machines, e.g. direct or distributed numerical control [DNC], flexible manufacturing systems [FMS], integrated manufacturing systems [IMS], computer integrated manufacturing [CIM]
                • G05B 19/41815: Total factory control characterised by the cooperation between machine tools, manipulators and conveyor or other workpiece supply system, workcell
                  • G05B 19/4182: Total factory control characterised by the cooperation between manipulators and conveyor only
          • G05B 2219/00: Program-control systems
            • G05B 2219/30: Nc systems
              • G05B 2219/40: Robotics, robotics mapping to robotics vision
                • G05B 2219/40019: Placing and assembly, throw object correctly on table
      • G06: COMPUTING; CALCULATING OR COUNTING
        • G06N: COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
          • G06N 20/00: Machine learning

Definitions

  • the present disclosure relates to a waste sorting robot for sorting waste objects.
  • In the waste management industry, industrial and domestic waste is increasingly being sorted in order to recover and recycle useful components.
  • Each type of waste, or “fraction” of waste can have a different use and value. If waste is not sorted, then it often ends up in landfill or incineration which has an undesirable environmental and economic impact.
  • the waste sorting robot picks waste objects from a conveyor with a gripper and moves the object to a sorting location depending on the type of waste object.
  • a previous problem is the limited speed by which waste sorting robots can be operated.
  • the speed of operation limits the flow of waste objects to be sorted, and ultimately the throughput and value of this type of automated recycling.
  • Increasing the speed of operation typically results in reduced accuracy of the waste sorting robot.
  • Adding further waste sorting robots along the conveyor increases the cost of the waste sorting system, as well as the footprint and complexity of the system.
  • a waste sorting robot comprising a manipulator movable within a working area, a gripper connected to the manipulator, wherein the gripper is arranged to selectively grip a waste object in the working area at a picking position and throw the waste object to a target position.
  • the waste sorting robot comprises a throw sensor configured to determine the position of the waste object after being thrown to the target position.
  • the waste sorting robot comprises a controller in communication with the throw sensor and being configured to receive said position as throw data.
  • the controller is configured to determine deviations in the position of the thrown waste object by comparing the throw data with the target position, and determine control instructions to the gripper and/or manipulator for subsequently gripped waste objects based on the deviations to throw the subsequently gripped waste objects towards the target position.
  • the controller is configured to associate the throw data of the thrown waste object and the determined control instructions to a waste object model to be applied to the subsequently gripped waste objects.
  • the controller is configured to input the throw data to a machine learning-based model to determine the control instructions for the subsequently gripped waste objects.
  • the throw sensor is arranged to detect the waste object landing at the target position, such as into a chute, after being thrown.
  • the controller is configured to determine an estimated throw trajectory of the gripped waste object towards the target position, send control instructions to the gripper and/or manipulator so that the gripper and/or manipulator accelerates the gripped waste object and releases the waste object at a throw position with a throw velocity and throw angle towards the target position to throw the waste object along the estimated throw trajectory associated with the waste object, from the throw position to the target position.
  • the controller is configured to determine the estimated throw trajectory based on the throw data.
  • the controller is configured to determine the deviations by comparing the throw data with the estimated throw trajectory.
  • a parameter sensor is configured to detect the object parameters of the waste objects, the object parameters comprising the orientation and/or physical characteristics of the respective waste objects, and wherein the controller is configured to determine the estimated throw trajectory based on the object parameters.
  • the controller is configured to associate the throw data in the waste object model with the respective object parameters of the waste object.
  • the throw data comprises the timing of the position of the waste object after being thrown to the target position, wherein the controller is configured to compare the timing of the position of the waste object with a time at which the gripper releases the waste object when being thrown.
  • the controller is configured to determine an expected target time for the waste object to reach the target position based on the time at which the gripper releases the waste object when being thrown, and compare the timing of the position of the waste object with the expected target time.
  • the controller is configured to determine the expected target time based on the detected object parameters of the waste object.
  • the throw sensor is configured to provide the throw data of the waste object along a path between a throw position at which the gripper releases the waste object when being thrown and the target position.
  • a method of controlling a waste robot comprising moving a manipulator within a working area, controlling a gripper connected to the manipulator to selectively grip a waste object in the working area at a picking position and throw the waste object to a target position, determining the position of the waste object after being thrown to the target position as throw data, determining deviations in the position of the thrown waste object by comparing the throw data with the target position, and determining control instructions to the gripper and/or manipulator for subsequently gripped waste objects based on the deviations to throw the subsequently gripped waste objects towards the target position.
  • a computer program product comprising instructions which, when the program is executed by a computer, cause the computer to carry out the steps of the method according to the second aspect.
  • Figure 1 shows a perspective view of a waste sorting robot
  • Figure 2 shows a further perspective view of a waste sorting robot
  • Figure 3a shows a schematic front view of a waste sorting robot, where a gripper has engaged with a waste object and lifted the waste object from the conveyor;
  • Figure 3b shows a position of the gripper, subsequent to the position shown in Fig. 3a, where the gripper is positioned at a throw position where the waste object is thrown to a target position
  • Figure 3c shows a position of the gripper, subsequent to the position shown in Fig. 3b, where the gripper moves in a direction towards a next waste object after having thrown the previous waste object towards the target position;
  • Figure 3d shows a schematic side view of a waste sorting robot in Fig. 3c, where a detected position of the waste object is compared to the position of the target position;
  • Figure 3e shows another schematic side view of a waste sorting robot, where the position of another thrown waste object is compared to the position of the target position;
  • Figure 4 shows a schematic front view of a waste sorting robot, where a gripper moves towards another waste object to be thrown to the target position; and
  • Figure 5 shows a flowchart of a method of controlling a waste sorting robot.
  • Figure 1 shows a perspective view of a waste sorting robot 100.
  • the waste sorting robot 100 can be a waste sorting gantry robot 100.
  • the examples described below can also be used with other types of robot such as robot arms or delta robots.
  • the waste sorting robot 100 is a Selective Compliance Assembly Robot Arm (SCARA).
  • the different types of robot are collectively referred to as waste sorting robot 100 below for brevity.
  • the waste sorting robot 100 comprises a manipulator 101 which is movable within a working area 102.
  • the waste sorting robot 100 comprises a gripper 103 which is connected to the manipulator 101.
  • the gripper 103 is arranged to selectively grip a waste object 104 which moves into the working area 102 on a conveyor belt 113.
  • the gripper 103 may comprise a pneumatic suction gripper holding and releasing waste objects 104 by a varying air- or gas pressure.
  • the gripper 103 may comprise movable jaws to pinch the waste object 104 with a releasable grip.
  • the conveyor belt 113 may be a continuous belt, or a conveyor belt formed from overlapping portions.
  • the conveyor belt 113 may be a single belt or alternatively a plurality of adjacent moving belts (not shown). In other examples, the waste object 104 can be conveyed into the working area 102 via other conveying means.
  • the conveyor belt 113 can be any suitable means for moving the waste object 104 into the working area 102. For example, the waste object 104 may be fed under gravity via a slide (not shown) to the working area 102.
  • the working area 102 is an area within which the manipulator 101 and gripper 103 are able to reach and interact with the waste object 104.
  • the working area 102 as shown in Fig. 1 is a cross hatched area beneath the gripper 103.
  • Fig. 1 shows only one waste object 104 for clarity, but it should be understood that any number of waste objects 104 may move into the working area 102.
  • the schematic view of a waste sorting robot 100 in Fig. 2 shows a plurality of waste objects 104a, 104b, 104c, collectively referred to as waste object 104 for brevity unless otherwise indicated.
  • the gripper 103 is arranged to grip the waste object 104 in the working area 102, at a momentaneous position referred to as a picking position 105 below, and throw the waste object 104 to a target position 106a.
  • Fig. 2 is a schematic illustration of a waste object 104b having a momentaneous picking position 105, and a target position 106a where a previous waste object 104a has been thrown by the gripper 103.
  • the working area 102 may extend over the target position 106a as indicated by the cross hatched area beneath the gripper 103 in Fig. 2, to allow the gripper 103 to throw a waste object 104 onto the target position 106a with a vertical throw in some examples.
  • the manipulator 101, and the gripper 103 connected thereto, are configured to move within a working volume defined by the height above the working area 102 where the waste sorting robot 100 can manipulate the waste object 104.
  • the manipulator 101 is moveable along a plurality of axes.
  • the manipulator 101 is moveable along three axes which are substantially at right angles to each other. In this way, the manipulator 101 is movable in an X-axis which is parallel with the longitudinal axis of the conveyor belt 113 (“beltwise”). Additionally, the manipulator 101 is movable across the conveyor belt 113 in a Y-axis which is perpendicular to the longitudinal axis of the conveyor belt 113 (“widthwise”).
  • the manipulator 101 is movable in a Z-axis which is in a direction normal to the working area 102 and the conveyor belt 113 (“heightwise”).
  • the manipulator 101 and/or gripper 103 can rotate about one or more axes (W), as schematically indicated in Fig. 2.
  • the waste sorting robot 100 may comprise one or more servos, pneumatic actuators or any other type of mechanical actuator for moving the manipulator 101 and gripper 103 in one or more axes.
  • the servos, pneumatic actuators or mechanical actuators are not shown in the Figures.
  • the waste sorting robot 100 is arranged to sort the waste object 104 into fractions according to one or more parameters of the waste object 104.
  • the waste objects 104 can be any type of industrial waste, commercial waste, domestic waste or any other waste which requires sorting and processing.
  • Unsorted waste material comprises a plurality of fractions of different types of waste.
  • Industrial waste can comprise fractions, for example, of metal, wood, plastic, hardcore and one or more other types of waste.
  • the waste can comprise any number of different fractions of waste formed from any type or parameter of waste.
  • the fractions can be further subdivided into more refined categories. For example, metal can be separated into steel, iron, aluminium etc.
  • domestic waste also comprises different fractions of waste such as plastic, paper, cardboard, metal, glass and/or organic waste.
  • a fraction is a category of waste that the waste can be sorted into by the waste sorting robot 100.
  • a fraction can be a standard or homogenous composition of material, such as aluminium, but alternatively a fraction can be a category of waste defined by a customer or user.
  • the waste sorting robot 100 may comprise a parameter sensor 107 configured to detect object parameters of the waste objects 104, and a controller 108 in communication with the parameter sensor 107 which may be configured to receive the detected object parameters.
  • the controller 108 may thus be configured to send movement instructions to the manipulator 101 and gripper 103 for interacting with the waste objects 104 to be sorted, based on the detected object parameters.
  • the gripper 103 may selectively grip the waste objects 104 and sort them to different target positions, such as different chutes arranged along the working area 102.
  • the controller 108 may thus be configured to send instructions to the X-axis, Y-axis and Z-axis drive mechanisms of the manipulator 101 and gripper 103 to control and interact with the waste objects 104 on the conveyor belt 113.
  • Various information processing techniques may be employed by the controller 108 for controlling the manipulator 101 and gripper 103.
  • Such information processing techniques are described in WO2012/089928, WO2012/052615, WO2011/161304 and WO2008/102052, which are incorporated herein by reference.
  • the control of the waste sorting robot 100 is discussed in further detail in reference to Figs. 3 - 4 below.
  • Fig. 3a shows a schematic front view of a waste sorting robot 100 where the gripper 103 has engaged with a waste object 104a and lifted the waste object 104a from the conveyor belt 113.
  • Fig. 3b shows a position of the gripper 103, subsequent to the position shown in Fig. 3a, where the gripper 103 is positioned at a throw position 110 where the waste object 104a is thrown towards a target position 106a with a throw velocity (vo) and throw angle (φo).
  • Fig. 3c shows a position of the gripper 103, subsequent to the position shown in Fig. 3b, where the gripper 103 moves in a direction towards a next waste object after having thrown the first waste object 104a.
  • Fig. 3c shows a momentaneous position of the first waste object 104a after having been thrown by the gripper 103 and/or manipulator 101 .
  • the waste sorting robot 100 comprises a throw sensor 112, 112a, 112b, configured to determine the position of the waste object 104 after being thrown to the target position 106a.
  • Figs. 3c and 3d show a throw sensor 112 arranged to detect the position of the waste object 104a.
  • Fig. 3d is a schematic illustration of a waste sorting robot 100, with a side view along the X-direction.
  • the conveyor belt 113 moves in the direction of the arrows indicated in Fig. 3d.
  • the controller 108 is configured to receive the position of the waste object 104a being thrown as throw data 109’.
  • the dotted path 109’ in Figs. 3c and 3d indicates the actual position of the waste object 104 when being thrown from the throw position 110.
  • the actual position of the waste object 104 is referred to as the throw data 109’.
  • the intended throw trajectory is indicated as the dotted path 109.
  • the first waste object 104a is instead thrown to a position indicated as 106b, offset from the intended target position indicated as 106a.
  • the controller 108 is configured to determine deviations in the position of the thrown waste object 104 by comparing the throw data 109’ received from the throw sensor 112 with the target position 106a.
  • Fig. 3d illustrates one example of such deviating positions, in the X-direction, but it should be understood that deviations may occur also in the Y-direction.
  • the controller 108 is configured to determine control instructions to the gripper 103 and/or manipulator 101 for subsequently gripped waste objects 104 based on the aforementioned deviations to throw the subsequently gripped waste objects 104 towards the target position 106a.
  • Figs. 3 - 4 show one target position 106a for the purpose of clarity but it should be understood that different waste objects 104a, 104b, 104c, may have a plurality of different target positions 106a.
  • Fig. 3e shows an example of a subsequently gripped waste object 104b being thrown towards the target position 106a.
  • Fig. 3d may in this example indicate that the throw angle (φo) and/or throw velocity (vo) for a waste object of this type need to be adjusted in order to reach the target position 106a.
  • the controller 108 may thus send control instructions to the gripper 103 and/or manipulator 101 to change the throw position 110, throw angle (φo) and/or throw velocity (vo) for the subsequently gripped waste object 104b.
  • Fig. 3e shows an example where the throw position 110, throw angle (φo) and/or throw velocity (vo) for the second waste object 104b has been compensated based on the deviations determined for the first waste object 104a, so that the second waste object 104b is thrown to the target position 106a.
  • the velocity component of the throw velocity (vo) in the X-direction could in this example be increased so that the waste object 104b does not fall short of the target position 106a.
  • An example of an intended throw trajectory 109 is also shown in Fig. 3e.
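  • By way of a non-limiting illustration, the following Python sketch shows one way such a compensation could be computed; the function and variable names are hypothetical and not taken from the disclosure, and a simple proportional correction of the horizontal throw velocity is assumed:

```python
import numpy as np

def compensate_throw(planned_v0, landing_xy, target_xy, flight_time, gain=0.8):
    """Return an adjusted horizontal throw velocity for the next waste object
    of the same category, based on the deviation of the previous throw.

    planned_v0   : np.array([vx, vy]) horizontal velocity used for the last throw (m/s)
    landing_xy   : np.array([x, y]) position detected by the throw sensor (throw data)
    target_xy    : np.array([x, y]) intended target position, e.g. a chute opening
    flight_time  : measured or estimated time of flight of the last throw (s)
    gain         : fraction of the deviation corrected per iteration (0..1)
    """
    deviation = landing_xy - target_xy          # e.g. negative x means the throw fell short
    # A deviation of d metres over a flight of t seconds corresponds roughly to
    # d / t of missing (or excess) horizontal velocity.
    correction = -gain * deviation / flight_time
    return planned_v0 + correction

# Example: the first waste object fell 0.15 m short of the chute in the X-direction.
v_next = compensate_throw(np.array([2.0, 0.0]),
                          landing_xy=np.array([1.85, 0.02]),
                          target_xy=np.array([2.0, 0.0]),
                          flight_time=0.4)
print(v_next)   # x-component increased to lengthen the next throw
```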
  • Determining control instructions to the gripper 103 and/or manipulator 101 for the subsequent waste objects 104 based on detected deviations in the position between thrown waste objects 104 and the respective target positions 106a provides for an effective refinement of the throwing capabilities of the waste sorting robot 100.
  • the accuracy of the waste sorting robot 100 is improved and the useful time interval the manipulator 101 and gripper 103 interacts with each waste object 104 is optimized by refining the throw trajectories for the different types of waste objects 104.
  • a more accurate and effective waste sorting robot 100 is provided. This means that the sorting speed may ultimately be increased.
  • the speed of the conveyor belt 113 and/or the amount of waste objects 104 on the conveyor belt 113 may be increased.
  • With an increased conveyor speed, the objects to be sorted on the conveyor belt are more singularized and less likely to overlap. This means that manipulation and object recognition are easier.
  • the controller 108 may be configured to associate the throw data 109’ of the thrown waste object 104 and the determined control instructions to a waste object model to be applied to the subsequently gripped waste objects.
  • a model may be created where different categories of waste objects 104 may be associated with the actual throw data 109’ of such different categories of waste objects 104 and the corresponding control instructions.
  • the new waste object 104 may be compared to the categories of waste objects 104 in the waste object model.
  • the closest matching throw data 109’ may be identified, and the corresponding control instructions may be retrieved to be applied to the new waste object 104, having been updated to compensate for the previous deviation from the intended trajectory 109.
  • the next waste object 104 may thus be thrown to the correct target position 106a, e.g. by increasing the length of the throw as shown in the example of Fig. 3e.
  • the control instructions to the gripper 103 and/or the manipulator 101 may thus be refined and may be optimized for varying types and categories of waste objects 104. Different categories of waste objects 104 may be identified by unique waste object parameters, which may also be associated with the respective throw data 109’ and control instructions, as described further below.
  • the controller 108 may be configured to approximate, e.g. by interpolating the available data in the model, an estimated throw velocity (vo) and a throw angle (φo) for the particular waste object 104 towards the target position 106a.
  • the resulting throw data 109’ may then be recorded by the throw sensor 112 and the control instructions may be iteratively updated for the particular waste object type by continuously comparing the throw data 109’ with the intended throw trajectory 109.
  • the waste object model may thus be continuously built and refined to be applicable to a growing number of different types of waste objects.
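  • A minimal sketch of such a waste object model is given below, assuming the categories are represented by numerical object-parameter vectors and that the model simply stores the latest compensated control instructions per category (a nearest-neighbour lookup; all names and values are illustrative, not prescribed by the disclosure):

```python
import numpy as np

class WasteObjectModel:
    """Associates object-parameter vectors with the latest control instructions
    (e.g. throw position, velocity, angle) determined for that category."""

    def __init__(self):
        self.features = []       # list of parameter vectors (e.g. size, flatness, mass)
        self.instructions = []   # corresponding control instructions (dicts)

    def update(self, params, instructions):
        self.features.append(np.asarray(params, dtype=float))
        self.instructions.append(instructions)

    def lookup(self, params):
        """Return the control instructions of the closest matching category."""
        if not self.features:
            return None
        params = np.asarray(params, dtype=float)
        distances = [np.linalg.norm(f - params) for f in self.features]
        return self.instructions[int(np.argmin(distances))]

model = WasteObjectModel()
model.update(params=[0.12, 0.30, 1.1],                       # size, flatness, mass
             instructions={"v0": 2.3, "phi0": 35.0, "throw_pos_x": 0.6})
print(model.lookup([0.11, 0.28, 1.0]))                        # closest category reused
```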
  • the controller 108 may be configured to input the throw data 109’ to a machine learning-based model to determine the control instructions for subsequently gripped waste objects 104. This provides for an effective adaptation and optimization of the control instructions in the waste object model.
  • the controller 108 may be configured to input the throw data 109’ together with detected object parameters of the respective waste objects 104 to a machine learning-based model to determine the control instructions.
  • the waste sorting robot 100 comprises a parameter sensor 107, as schematically indicated in Fig. 4 and described further below.
  • the parameter sensor 107 may comprise an imaging sensor, such as a camera.
  • Object parameters for the different waste objects 104 may be determined from the image data received from the parameter sensor 107. Different image features, such as shapes, colours, geometrical relationships etc of the detected waste objects 104 may be assigned as the characterising object parameters in the waste object model, to create e.g. different categories of waste objects.
  • the waste object model may be continuously populated with the throw data 109’ for the respective categories of waste objects, and deviations from the intended trajectory 109 may be determined to continuously adapt the associated control instructions.
  • the categorization of the waste objects 104 may be continuously refined by analysing and comparing the image data of waste objects 104 having similar throw data 109’ for a similar set of control instructions. The same principle may be applied to data received from different types of parameter sensors 107 to characterize the waste objects 104, as exemplified below.
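  • As one possible, non-prescribed realisation of the machine learning-based model, the sketch below uses a k-nearest-neighbour regressor from scikit-learn (here assumed to be available) to map detected object parameters to throw parameters that previously reached the target position; the feature choices and values are illustrative only:

```python
import numpy as np
from sklearn.neighbors import KNeighborsRegressor

# Accumulated training data: object parameters -> throw parameters that actually
# hit the target position (derived from throw data and applied compensations).
X = np.array([[0.12, 0.30, 1.1],     # size, flatness, mass (illustrative features)
              [0.25, 0.05, 0.4],
              [0.10, 0.40, 2.0]])
y = np.array([[2.3, 35.0],           # throw velocity v0 (m/s), throw angle phi0 (deg)
              [3.1, 25.0],
              [2.0, 40.0]])

model = KNeighborsRegressor(n_neighbors=2)
model.fit(X, y)

# Predict control parameters for a newly detected waste object.
v0, phi0 = model.predict([[0.14, 0.28, 1.2]])[0]
print(v0, phi0)
```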
  • the throw sensor 112 may be arranged to detect if the waste object 104 lands at the target position 106a after being thrown.
  • Fig. 4 shows an example where a throw sensor 112b is arranged adjacent the target position 106a, which may comprise a chute into which the waste object 104 should be thrown. A deviation may be detected if the throw sensor 112b is unable to detect the position of the waste object 104 at the target position 106a or chute after the waste object 104 is thrown.
  • the controller 108 may thus be configured to update the control instructions to the gripper 103 and/or manipulator 101 to adjust any of the throw position 110, throw velocity (vo) and throw angle (φo) towards the target position 106a.
  • the throw sensor 112b may in some examples comprise a trigger sensor 112b configured to detect if a waste object 104 flies past the trigger sensor 112b and into the target position 106a.
  • Such trigger sensor 112b may comprise an optical sensor with trigger beams across the target position 106a.
  • the trigger sensor 112b comprises a grid or fence of optical detection beams across the target position 106a, such as a laser- or IR-grid.
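  • A sketch of how the controller 108 might evaluate such a trigger sensor 112b is shown below; the sensor interface is hypothetical, and a throw is flagged as deviating if no trigger event falls within an assumed maximum flight time after the release time:

```python
def throw_landed(trigger_events, release_time, max_flight_time=1.0):
    """Return True if the trigger sensor across the chute registered the waste
    object within the expected window after the gripper released it.

    trigger_events  : iterable of time stamps from the optical grid (seconds)
    release_time    : time at which the gripper released the object (seconds)
    max_flight_time : upper bound on a plausible flight time (seconds)
    """
    return any(release_time < t <= release_time + max_flight_time
               for t in trigger_events)

# Example: release at t = 10.00 s, trigger grid fired at t = 10.37 s.
assert throw_landed([10.37], release_time=10.00)
assert not throw_landed([], release_time=10.00)   # missed throw -> adjust instructions
```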
  • the throw data 109’ may comprise the timing (tP) of the position of the waste object 104 after being thrown to the target position 106a.
  • Fig. 3e shows a schematic illustration where the positions of the waste object 104, i.e. the throw data 109’, have an associated timing (tP), further including a target time (tTP) when the waste object 104 is detected at the target position 106a.
  • the timing of the position of the waste object 104 may thus comprise a target time (tTP) at which the waste object 104 reaches the target position 106a.
  • the throw sensor 112 may be arranged to detect the target time (tTP).
  • a trigger sensor 112b may record a time stamp, i.e. the target time (tTP), when the waste object 104 flies past the trigger sensor 112b into the target position 106a.
  • the controller 108 may be configured to compare the timing of the position of the waste object 104, such as the target time (tTP) in this example, with a time (tR) at which the gripper 103 releases the waste object 104 when being thrown.
  • The time (tR) at which the gripper 103 releases the waste object 104 is referred to as the release time (tR). This allows for synchronizing the position of the waste object 104 with the release time (tR).
  • the target time (tTP) may be compared with the release time (tR) to determine if the times are in agreement, to confirm a correct throw.
  • the controller 108 may be configured to determine an estimated or expected target time (tET) for the waste object 104 to reach the target position 106a based on the time (tR) at which the gripper 103 releases the waste object 104 when being thrown.
  • Fig. 3e shows a schematic illustration of a throw trajectory 109, which may be the intended throw trajectory 109, having an estimated target time (tET).
  • the controller 108 may be configured to compare the timing of the position of the waste object 104, e.g. the target time (tTP), with the expected target time (tET) to further facilitate the detection of deviations in order to optimize the control instructions of the gripper 103 and/or manipulator 101.
  • the controller 108 may be configured to determine the expected target time (tET) based on detected object parameters of the waste object 104. As described further below with reference to the parameter sensor 107, object parameters such as the size and shape of the waste object 104 may affect its trajectory towards the target position 106a, and the controller 108 may be configured to take these parameters into account when determining the expected target time (tET) for a more accurate estimation thereof.
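  • The timing comparison could, for example, be implemented as in the sketch below, which assumes a simple drag-free ballistic estimate of the flight time; the symbols tR, tTP and tET follow the notation used above, and all numerical values are illustrative:

```python
import math

def expected_target_time(t_release, v0, phi0_deg, horizontal_distance):
    """Estimated target time tET: release time tR plus the ballistic flight time
    to cover the horizontal distance to the target position (drag neglected)."""
    vx = v0 * math.cos(math.radians(phi0_deg))
    return t_release + horizontal_distance / vx

def timing_deviation(t_target_detected, t_release, v0, phi0_deg, horizontal_distance):
    """Difference between the detected target time tTP and the expected tET.
    A large positive value may indicate a slower or shorter throw than intended."""
    t_et = expected_target_time(t_release, v0, phi0_deg, horizontal_distance)
    return t_target_detected - t_et

# Example: released at tR = 10.00 s, v0 = 2.5 m/s at 30 degrees, chute 0.9 m away,
# trigger sensor detected the object at tTP = 10.48 s.
print(timing_deviation(10.48, 10.00, 2.5, 30.0, 0.9))   # approx. +0.06 s late
```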
  • a plurality of different throw sensors 112a, 112b may be arranged to detect the position of the waste object 104.
  • a trigger sensor 112b may be combined with a throw sensor 112a configured to detect image data of the waste object 104, as schematically indicated in Fig. 4.
  • the throw sensor 112a may thus comprise an image detector, such as a camera.
  • the image data may be utilized in conjunction with the information from a trigger sensor 112b to determine any deviations in the trajectory of the thrown waste objects 104 and when the control instructions to the gripper 103 and/or manipulator 101 need to be adjusted.
  • image data from image detector 112a may be utilized to determine any deviations in the position of the waste object 104 if an expected triggering of the trigger sensor 112b does not occur after a throw.
  • the throw sensor 112 comprises an image sensor 112a configured to detect the position of the waste object 104 in two dimensions, e.g. in the X- and Y-directions, and/or in three dimensions, i.e. in the X-, Y-, and Z-directions. This provides for accurately determining the position of the waste object 104 after being thrown, as well as any associated deviations in its position with respect to the target position 106a.
  • an image sensor 112a is arranged to detect image data of the target position 106a.
  • the throw data 109’ may thus comprise image data of the position of the waste object 104 and the target position 106a, to allow determining any deviations of the waste object 104 from the intended trajectory 109.
  • a series of images may be taken over the target position 106a after the gripper 103 releases the waste object 104 at the throw position 110.
  • a selection of image data may be made amongst the series of images to identify and extract the positions of the waste object 104 which may be indicative of deviations from the target position 106a.
  • Different image processing techniques may be utilized to effectively extract the throw data 109’ and positions of the waste object 104 relative to the target position 106a from the received image data.
  • the controller 108 may then be configured to modify any of the throw position 110, throw velocity (vo) and throw angle (φo) for the subsequent waste objects 104 to reduce or eliminate deviations in the position with respect to the intended trajectory 109 towards the target position 106a.
  • the two or three-dimensional throw data 109’ may comprise the timing (tP) of the position of the waste object 104 as discussed above.
  • the extracted image data of the position of the waste object 104 mentioned above may be compared with the timing of the gripper 103 and/or manipulator 101. For example, it may be determined that a first waste object 104a should land in the target area 106a after the gripper 103 releases the waste object 104a at the throw position 110, but before the next waste object 104b is picked or thrown from the gripper 103.
  • the timing (tP) may comprise any time stamps of the positions of the waste object 104 along any part of its path between the gripper 103 (or even picking position 105) and target position 106a, as schematically indicated in Fig. 3e. This provides for further facilitating determining deviations from an intended trajectory 109.
  • the throw sensor 112 may thus be configured to provide the throw data 109’ of the waste object along a path between the throw position 110 at which the gripper 103 releases the waste object 104 when being thrown and the target position 106a. This provides for monitoring of the trajectory or flight path of the waste object 104. In some examples, such monitoring may be emphasized along different segments of the total trajectory, e.g. with emphasis on a segment closer to the target position 106a, to optimize the detection, e.g. to facilitate the image processing of the sensor data.
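  • The sketch below illustrates one simple way of extracting the position of the thrown waste object 104 from a series of images taken over the target position 106a, using frame differencing and a centroid computation (numpy only; a production system would likely use more robust image processing, and all names are illustrative):

```python
import numpy as np

def object_centroid(frame, background, threshold=30):
    """Return the (row, col) centroid of pixels that differ from the background
    by more than `threshold`, or None if no object is visible in the frame."""
    diff = np.abs(frame.astype(np.int16) - background.astype(np.int16))
    mask = diff > threshold
    if not mask.any():
        return None
    rows, cols = np.nonzero(mask)
    return rows.mean(), cols.mean()

def landing_deviation(frames, background, target_px, px_per_m):
    """Compare the last detected object position with the target position (both in
    pixel coordinates) and return the deviation in metres, or None if not seen."""
    positions = [object_centroid(f, background) for f in frames]
    positions = [p for p in positions if p is not None]
    if not positions:
        return None
    last = np.array(positions[-1])
    return (last - np.array(target_px)) / px_per_m

# Toy example: a bright object appears near the chute in an otherwise dark scene.
bg = np.zeros((64, 64), dtype=np.uint8)
frame = bg.copy()
frame[40:44, 50:54] = 200
print(landing_deviation([frame], bg, target_px=(42, 52), px_per_m=100))
```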
  • the controller 108 may be configured to determine an estimated throw position 110, an estimated throw velocity (vo) and/or an estimated throw angle (φo) for throwing the waste object 104 along the intended throw trajectory 109. I.e., for the respective waste objects 104 being selectively gripped by the gripper 103, the controller 108 may be configured to determine an estimation of the intended throw trajectory 109 towards the target position 106a.
  • the estimation of the intended throw trajectory 109 is referred to as the estimated trajectory 109 below for brevity.
  • the controller 108 may thus be configured to send control instructions to the gripper 103 and/or manipulator 101 so that the gripper 103 and/or manipulator 101 accelerates the gripped waste object 104 and releases the waste object 104 at a throw position 110 with a throw velocity (vo) and throw angle (φo) towards the target position 106a.
  • the controller 108 may be configured to determine the estimated throw trajectory 109 based on the throw data 109’.
  • the throw data 109’ accumulated for any number of waste objects 104 may be analysed and compared with the respective target positions 106a to determine any deviations and update the control instructions as described above.
  • any of the throw position 110, throw velocity (vo), and throw angle (φo) may be estimated based on the throw data 109’.
  • the controller 108 may be configured to determine the aforementioned deviations in the position of the waste object 104 from the target position 106a by comparing the throw data 109’ with the estimated throw trajectory 109.
  • any parameter describing the motion of the thrown waste object 104 may be detected by the throw sensor 112 as throw data 109’ and compared to the estimated trajectory 109.
  • the controller 108 may thus be configured to iteratively refine the waste object model which may be applied to the subsequently gripped waste objects 104 by comparing the throw data 109’ resulting from the various waste objects 104 with the respectively estimated trajectories 109.
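  • As an illustration of determining an estimated throw trajectory 109, the sketch below solves the drag-free projectile equations for the throw velocity needed to reach the target position 106a at a chosen throw angle, and evaluates points along the estimated trajectory so that throw data 109’ can be compared against it (symbols and values are illustrative, not taken from the disclosure):

```python
import math

G = 9.81  # m/s^2

def required_throw_velocity(d, h, phi0_deg):
    """Drag-free throw velocity v0 needed to cover horizontal distance d (m) with a
    height change h (m, negative if the target is below the throw position) when
    released at throw angle phi0. Raises ValueError if that angle cannot reach it."""
    phi = math.radians(phi0_deg)
    denom = 2.0 * math.cos(phi) ** 2 * (d * math.tan(phi) - h)
    if denom <= 0:
        raise ValueError("target not reachable at this throw angle")
    return math.sqrt(G * d ** 2 / denom)

def trajectory_point(v0, phi0_deg, t):
    """Position (x, z) along the estimated trajectory t seconds after release."""
    phi = math.radians(phi0_deg)
    return v0 * math.cos(phi) * t, v0 * math.sin(phi) * t - 0.5 * G * t ** 2

# Chute opening 0.9 m away and 0.4 m below the throw position, thrown at 20 degrees.
v0 = required_throw_velocity(d=0.9, h=-0.4, phi0_deg=20.0)
print(round(v0, 2))
# Deviation of an observed position (throw data) from the estimated trajectory:
observed = (0.45, -0.06)                     # from the throw sensor, 0.2 s after release
expected = trajectory_point(v0, 20.0, 0.2)
print(tuple(round(o - e, 3) for o, e in zip(observed, expected)))
```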
  • the waste objects 104 may have different characteristics, described as object parameters in the following.
  • the controller 108 may be configured to associate the throw data 109’ with detected object parameters of the respective waste objects 104 in a waste object model to continuously optimize sets of control instructions for a range of different waste objects 104.
  • the waste sorting robot 100 may thus comprise a parameter sensor 107 configured to detect object parameters of the waste objects 104.
  • the controller 108 may be configured to receive the detected object parameters from the parameter sensor 107.
  • the object parameters may comprise the orientation and/or physical characteristics of the respective waste objects 104.
  • the orientation of a waste object 104 should be construed as the orientation in the working volume in the X-, Y- and Z-directions. For example, two waste objects 104 of identical size and shape may have different orientations when being transported on the conveyor belt 113, since the waste objects 104 may lie on different sides on the conveyor belt 113.
  • the orientation of such waste objects 104 may thus also be different when being held in place in the gripper 103, since the gripper 103 typically grips the waste objects 104 from a top-down approach, regardless of the orientation of the waste objects 104 on the conveyor belt 113.
  • the physical characteristics may comprise geometrical characteristics of the respective waste objects 104, such as the shape, size, and/or volume. Alternatively, or in addition, the physical characteristics may comprise material characteristics, such as from what material the waste object 104 is made, density, and/or surface properties of the waste object 104.
  • the controller 108 may be configured to determine an estimated throw trajectory 109 of the respectively gripped waste object 104 towards the target position 106a based on the detected object parameters of said gripped waste object 104.
  • the parameter sensor 107 may be configured to detect object parameters of a first waste object 104a prior to engaging the waste object 104a with the gripper 103, e.g. in case the parameter sensor 107 comprises an image sensor or any other sensor configured to detect object parameters remotely.
  • the object parameters may in one example comprise information about the size of the engaged waste object 104a.
  • the controller 108 may thus be configured to estimate a throw trajectory 109 of the waste object 104a based on the size thereof.
  • the controller 108 is configured to send control instructions to the gripper 103 and/or the manipulator 101 so that the gripper 103 and/or the manipulator 101 accelerates the gripped waste object 104a and releases the waste object 104a at a throw position 110.
  • the waste object 104a is released with a throw velocity (vo) and throw angle (φo) towards the target position 106a for throwing the waste object 104a along the estimated throw trajectory 109, associated with the waste object 104a, from the throw position 110 to the target position 106a.
  • Fig. 3b is a schematic illustration of the momentaneous position of the gripper 103 at the throw position 110, where the waste object 104a is released from the gripper 103.
  • the waste object 104a thus has a throw velocity (vo) and throw angle (φo) when being released.
  • the controller 108 may be configured to instruct the manipulator 101 and gripper 103 to move to a next identified waste object 104c to be picked from the conveyor belt 113 (see e.g. Fig. 4), as soon as the first waste object 104a has been released at the throw position 110.
  • Fig. 4 is a schematic illustration where a second waste object 104c is to be picked from a picking position 105 and moved to a subsequent throw position 110.
  • the controller 108 may be configured to receive object parameters of the second waste object 104c from the parameter sensor 107.
  • the object parameters of the second waste object 104c may comprise information that the second waste object 104c has a different size and/or shape, and/or different material composition, and/or different orientation on the conveyor belt 113, compared to the first waste object 104a.
  • the controller 108 may be configured to determine a throw trajectory 109 of the second waste object 104c towards target position 106a based on the object parameters associated with the second waste object 104c.
  • the object parameters of the second waste object 104c may comprise information that the second waste object 104c has a more flattened shape compared to the first waste object 104a.
  • the flat waste object 104c may be a sheet of material, such as metal, paper, or plastic, while the first waste object 104a may be of the same material but crumpled into a rounder shape. Determining the throw trajectory 109 of the second waste object 104c may thus take into account that the shape of the second waste object 104c results in a different motion through the air after being released by the gripper 103.
  • the waste objects 104 may have essentially the same object parameters with respect to the geometrical characteristics, i.e. same size and shape, but the material characteristics may be different.
  • the densities of the waste objects 104 may vary, and accordingly the weight. Determining the throw trajectory 109 may thus take into account the different weights of the waste objects 104.
  • a heavier object needs to be accelerated for a longer duration by the gripper 103 and/or manipulator 101 to reach a desired throw velocity (vo), compared to a lighter object, due to the increased inertia of the heavier object.
  • Other material characteristics may include structural parameters such as the flexibility of the waste objects 104.
  • the waste objects 104 may have essentially the same object parameters with respect to the geometrical characteristics and the material characteristics, but the orientation of the waste objects 104 on the conveyor belt 113 may vary.
  • the orientation of the waste objects 104 when held in place by the gripper 103 may thus also vary, if the waste objects 104 are gripped from the same top-down approach.
  • a rectangular waste object 104 which has one side significantly shorter than the remaining two, e.g. shaped like a textbook, may have different trajectories through the air depending on which side is facing the throw direction.
  • the waste object 104 may experience less drag if the shortest side is facing the throw direction, thus cutting through the air with less air resistance.
  • the detected object parameters may comprise information of the orientation of the waste objects 104 to estimate the related throw trajectories 109.
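  • The sketch below indicates how object parameters such as mass and the frontal area facing the throw direction (which depends on the orientation) could be folded into the trajectory estimate, using a simple Euler integration with quadratic air drag; all parameter values and names are purely illustrative:

```python
import numpy as np

RHO_AIR = 1.2   # kg/m^3
G = 9.81        # m/s^2

def simulate_throw(v0, phi0_deg, mass, frontal_area, drag_coeff=1.0, dt=0.001, t_max=2.0):
    """Integrate the flight of a thrown waste object with quadratic drag.
    Returns arrays of times and (x, z) positions up to t_max after release.
    frontal_area depends on which side of the object faces the throw direction."""
    phi = np.radians(phi0_deg)
    pos = np.array([0.0, 0.0])
    vel = v0 * np.array([np.cos(phi), np.sin(phi)])
    k = 0.5 * RHO_AIR * drag_coeff * frontal_area / mass
    times, positions = [0.0], [pos.copy()]
    t = 0.0
    while t < t_max:
        speed = np.linalg.norm(vel)
        acc = np.array([0.0, -G]) - k * speed * vel   # gravity plus drag opposing motion
        vel = vel + acc * dt
        pos = pos + vel * dt
        t += dt
        times.append(t)
        positions.append(pos.copy())
    return np.array(times), np.array(positions)

# Same sheet of metal thrown edge-on (small frontal area) vs. face-on (large area):
_, p_edge = simulate_throw(2.5, 20.0, mass=0.3, frontal_area=0.002)
_, p_face = simulate_throw(2.5, 20.0, mass=0.3, frontal_area=0.06)
print(p_edge[-1][0] - p_face[-1][0])   # the face-on throw falls shorter in X
```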
  • Determining an estimated throw trajectory 109 of the respectively gripped waste objects 104, based on the detected object parameters provides for optimizing the useful time interval the manipulator 101 and gripper 103 interacts with each waste object 104.
  • estimating the required throw velocity (vo) of a first waste object 104a to be thrown to the target position 106a allows for minimizing the amount of time the gripper 103 needs to carry the first waste object 104a before being thrown.
  • the first waste object 104a may be thrown quickly at a throw position 110 just after being accelerated to the throw velocity (vo), and the gripper 103 may immediately target the next identified waste object 104c.
  • a subsequently gripped waste object 104 may have associated object parameters which dictate a different throw trajectory 109 and the gripper 103 and/or manipulator 101 may be controlled to throw the waste object 104 accordingly.
  • the optimized throwing of the waste objects 104 to the target positions 106a as described in the examples above provides for a more effective waste sorting robot 100.
  • the controller 108 may be configured to associate the throw data 109’ in the waste object model with the respective object parameters of the waste object 104.
  • a waste object model may be created where different categories of waste objects 104, based on their respective object parameters, are associated with the throw data 109’ of such different categories of waste objects 104.
  • the object parameters obtained from a parameter sensor 107 may be compared to the categories of waste objects 104 in the waste object model so that the closest matching throw data 109’ may be identified.
  • If a next waste object 104 is identified as having similar object parameters as a previous waste object 104 having associated throw data 109’, the respective control instructions in the model, which may have been updated to compensate for any previous deviation from the intended throw trajectory 109, may be retrieved and applied.
  • the next waste object 104 may thus be thrown to the correct target position 106a, e.g. by increasing the length of the throw in the example of Fig. 3e.
  • the controller 108 may thus be configured to associate the throw data 109’ and the detected object parameters of the thrown waste object 104 to a waste object model.
  • a parameter sensor 107 may be positioned upstream of the working area 102 so that detected parameters of the waste objects 104 may be sent to the controller 108 before the waste objects 104 enter the working area 102. Alternatively, or in addition, the sensor 107 may be positioned in the working area 102.
  • the parameter sensor 107 may comprise a plurality of different types of parameter sensors 107.
  • the parameter sensor 107 or plurality of parameter sensors 107 may be arranged at different positions outside or inside the working area 102 or working volume. In some examples the parameter sensor 107 or plurality of parameter sensors 107 may be arranged on, or in communication with, the manipulator 101 and/or gripper 103.
  • the parameter sensor 107 may be arranged in the working area 102 or working volume, but arranged to detect object parameters while the respective waste objects 104 have not yet been transported into the working area 102, e.g. by being aligned towards an upstream direction of the conveyor 113 movement with respect to the manipulator 101 and/or gripper 103.
  • the parameter sensor 107 may in that case be arranged adjacent, or on, any parts of the waste sorting robot 100. This may provide for reducing the footprint of the waste sorting robot 100, while still facilitating early detection of the object parameters.
  • the parameter sensor 107 may be arranged adjacent, or on, any parts of the waste sorting robot 100 while being aligned to detect object parameters when the respective waste objects 104 have been transported into the working area 102. This may be advantageous in applications where the waste sorting robot 100 is optimized for compactness, if the conveyor 113 moves at a slower speed, or if continuous analysis of the object parameters is desired for optimizing the manipulation and sorting of the waste objects 104 along their conveyor path. This may also be advantageous in case the waste objects 104 are subject to movements while travelling through the working area 102.
  • the parameter sensor 107 may be positioned upstream of the manipulator 101 and/or gripper 103, e.g. outside the working area 102.
  • the parameter sensor 107 may in such example be arranged on a separate frame (not shown) positioned upstream of the manipulator 101 and/or gripper 103. Alternatively, or in addition, the parameter sensor 107 may be positioned upstream of the manipulator 101 and/or gripper 103, e.g. outside the working area 102, while being arranged to detect object parameters when the respective waste objects 104 have been transported into the working area 102. This may provide for retrieving object parameters, such as image data of the waste objects 104 from several imaging angles, which may provide for improved characterization of the waste objects 104.
  • the parameter sensor 107 may comprise any sensor suitable to detect a parameter of the waste object 104, e.g. one or more of an image sensor, a force sensor, a gyroscopic sensor, a motion sensor, an electric current sensor, a hall sensor, a metal detector, a temperature sensor, a chemical sensor, a visual and/or infrared spectroscopic detector, a radioactivity sensor, and/or a laser, e.g. LIDAR.
  • An image sensor may comprise one or more of an RGB camera, an infrared camera, a 3D imaging sensor, and/or a terahertz imaging system.
  • the object parameters of the waste objects 104 may be detected by any of the mentioned sensors.
  • the geometrical dimensions and orientation of a waste object 104 may be determined from image data of the waste object 104 received from an image sensor 107.
  • the image data may be used to determine any one of a size, shape, and volume of the waste object 104.
  • the image data may be utilized in a machine learning-based model to build up an object recognition capability of the waste sorting robot 100.
  • the recorded image data may be utilized to distinguish physical characteristics such as from what material the waste object 104 is made, and the associated material characteristics, in addition to the geometrical dimensions and orientation of a waste object 104.
  • the image data may be combined with sensor data from any one of the aforementioned sensors.
  • the gripper 103 comprises a gyroscopic sensor, such as an electrical MEMS gyroscope used as a velocity sensor.
  • the controller 108 may thus determine the acceleration and velocity of the gripper 103 during operation.
  • the velocity of the gripper 103 may thus be monitored and controlled at the throw position 110 so that the velocity of the gripper 103 translates to the desired throw velocity (vo) of the waste object 104 when being released from the gripper 103.
  • the throw angle (φo) may be controlled by the X-, Y-, Z-movement of the gripper 103 and manipulator 101. For example, an upwards acceleration of the waste object 104 as illustrated in Fig. 3b may be achieved by an acceleration of the gripper 103 in the Z-direction, upwards from the conveyor belt 113.
  • the waste object 104a continues its trajectory with a velocity component in the Z-direction after being released from the gripper 103.
  • the gripper 103 may continue with an upward movement in the Z-direction after release, to not interfere with the trajectory of the thrown waste object 104a.
  • the velocity component of the gripper 103 in the upwards Z-direction at the throw position 110 may be increased or decreased to vary the throw angle (φo).
  • Accelerating the gripped waste object 104 to the throw velocity (vo) may comprise applying a force (F) to the gripped waste object 104 during a time (T) by a movement of the gripper 103 and/or the manipulator 101 .
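  • With a constant force F (assumed here for illustration), this corresponds to an acceleration time T = m*v0/F and an acceleration distance s = v0*T/2, so a heavier waste object occupies the gripper 103 longer, as noted above. A small sketch:

```python
def acceleration_phase(mass, v0, force):
    """Time T and travel distance s needed to accelerate a gripped waste object of
    the given mass (kg) to the throw velocity v0 (m/s) with a constant force F (N),
    ignoring the mass of the gripper itself."""
    T = mass * v0 / force
    s = 0.5 * v0 * T
    return T, s

print(acceleration_phase(mass=0.2, v0=2.5, force=20.0))   # light object: ~0.025 s
print(acceleration_phase(mass=1.5, v0=2.5, force=20.0))   # heavy object: ~0.19 s
```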
  • the waste object 104 may be accelerated to the throw velocity (vo) by applying an airflow to the gripped waste object 104, where the airflow is ejected from the gripper 103.
  • a pressure from an airflow of a flow of a gas, ejected from the gripper 103 onto the waste object 104 applies a force onto the waste object 104 to accelerate the waste object 104.
  • the gripper 103 and/or manipulator 101 may accelerate the waste object 104 by a movement in the X-, Y-, Z- directions in combination with pushing the waste object 104 away from the gripper 103 by an airflow.
  • the gripper 103 may in some examples comprise a suction gripper 103 comprising a suction cup configured to physically engage with a surface of the waste object 104.
  • a negative pressure may be created in the suction cup so that the waste object 104 is held in place by the gripper due to the force created by the negative pressure.
  • the suction gripper 103 may be in fluid communication with a pneumatic system (not shown) connecting the suction gripper 103 to a compressed air or gas supply.
  • the air or gas supply to the suction gripper 103 may be reversed so that the negative pressure is released and a positive pressure may be exerted onto the waste object 104 to throw the waste object 104 as described above.
  • the gripper 103 comprises movable jaws to grip the waste objects 104, and a gas- or airflow connection to push and throw the waste objects 104 away from the gripper 103 when the jaws release their grip.
  • Fig. 5 is a flowchart of a method 200 of controlling a waste sorting robot 100.
  • the method 200 comprises moving 201 a manipulator 101 within a working area 102, and controlling 202 a gripper 103 connected to the manipulator to selectively grip a waste object 104, 104a, 104b, 104c in the working area 102 at a picking position 105 and throw the waste object 104 to a target position 106a.
  • the method 200 comprises determining 203 the position of the waste object 104 after being thrown to the target position 106a as throw data 109’.
  • the method 200 comprises determining 204 deviations in the position of the thrown waste object 104 by comparing the throw data 109’ with the target position 106a.
  • the method 200 comprises determining 205 control instructions to the gripper 103 and/or manipulator 101 for subsequently gripped waste objects 104 based on the deviations to throw the subsequently gripped waste objects 104 towards the target position 106a.
  • the method 200 thus provides for the advantageous benefits as described above with reference to the waste sorting robot 100 and Figs. 1 - 4.
  • the method 200 provides for a more accurate and effective waste sorting robot 100.
  • a computer program product comprising instructions which, when the program is executed by a computer, cause the computer to carry out the steps of the method 200.
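  • Purely as an illustration of how the steps 201 - 205 of the method 200 could interact in a closed loop, the following self-contained Python sketch simulates repeated throws with an unknown systematic error and applies the deviation-based compensation; none of the numerical values or interfaces are taken from the disclosure:

```python
import random

def simulated_landing(v0, flight_time=0.4, bias=-0.35):
    """Stand-in for a real throw: the object lands where the ballistic estimate says,
    except for an unknown systematic bias (e.g. drag) plus a little sensor noise."""
    return v0 * flight_time + bias * flight_time + random.gauss(0.0, 0.01)

def control_loop(target_x=0.9, throws=6, gain=0.8, flight_time=0.4):
    v0 = target_x / flight_time          # initial drag-free estimate of the throw velocity
    for n in range(throws):
        landing_x = simulated_landing(v0, flight_time)      # steps 201-203: throw and sense
        deviation = landing_x - target_x                     # step 204: compare with target
        v0 -= gain * deviation / flight_time                 # step 205: update instructions
        print(f"throw {n}: landed at {landing_x:.3f} m, deviation {deviation:+.3f} m")

random.seed(0)
control_loop()   # the deviation shrinks over successive throws
```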

Abstract

A waste sorting robot (100) comprises a manipulator (101) moveable within a working area (102). A gripper (103) is connected to the manipulator (101) and arranged to selectively grip a waste object (104, 104a, 104b, 104c) in the working area (102). A throw sensor (112, 112a, 112b) is configured to determine the position of the waste object (104, 104a, 104b, 104c) after being thrown to a target position (106a). A controller (108) is in communication with the throw sensor (112, 112a, 112b) and is configured to receive said position as throw data (109') and to determine deviations in the position of the thrown waste object (104, 104a, 104b, 104c) by comparing the throw data with the target position (106a). The controller (108) is configured to determine control instructions to the gripper (103) and/or manipulator (101) for subsequently gripped waste objects (104, 104a, 104b, 104c) based on the deviations to throw the subsequently gripped waste objects (104, 104a, 104b, 104c) towards the target position (106a). A related method of controlling a waste robot is also disclosed.

Description

The present disclosure relates to a waste sorting robot for sorting waste objects.
In the waste management industry, industrial and domestic waste is increasingly being sorted in order to recover and recycle useful components. Each type of waste, or “fraction” of waste can have a different use and value. If waste is not sorted, then it often ends up in landfill or incineration which has an undesirable environmental and economic impact.
It is known to sort industrial and domestic waste using a waste sorting robot. The waste sorting robot picks waste objects from a conveyor with a gripper and moves the object to a sorting location depending on the type of waste object.
A problem with previous systems is the limited speed at which waste sorting robots can be operated. The speed of operation limits the flow of waste objects to be sorted, and ultimately the throughput and value of this type of automated recycling. Increasing the speed of operation typically results in reduced accuracy of the waste sorting robot. Adding further waste sorting robots along the conveyor increases the cost of the waste sorting system, as well as the footprint and complexity of the system.
Examples described hereinafter aim to address the aforementioned problems.
In a first aspect of the disclosure, there is provided a waste sorting robot comprising a manipulator movable within a working area, a gripper connected to the manipulator, wherein the gripper is arranged to selectively grip a waste object in the working area at a picking position and throw the waste object to a target position. The waste sorting robot comprises a throw sensor configured to determine the position of the waste object after being thrown to the target position. The waste sorting robot comprises a controller in communication with the throw sensor and being configured to receive said position as throw data. The controller is configured to determine deviations in the position of the thrown waste object by comparing the throw data with the target position, and determine control instructions to the gripper and/or manipulator for subsequently gripped waste objects based on the deviations to throw the subsequently gripped waste objects towards the target position.
Optionally, the controller is configured to associate the throw data of the thrown waste object and the determined control instructions to a waste object model to be applied to the subsequently gripped waste objects. Optionally, the controller is configured to input the throw data to a machine learning-based model to determine the control instructions for the subsequently gripped waste objects.
Optionally, the throw sensor is arranged to detect the waste object landing at the target position, such as into a chute, after being thrown.
Optionally, for the respective waste objects being selectively gripped by the gripper, the controller is configured to determine an estimated throw trajectory of the gripped waste object towards the target position, send control instructions to the gripper and/or manipulator so that the gripper and/or manipulator accelerates the gripped waste object and releases the waste object at a throw position with a throw velocity and throw angle towards the target position to throw the waste object along the estimated throw trajectory associated with the waste object, from the throw position to the target position.
Optionally, the controller is configured to determine the estimated throw trajectory based on the throw data.
Optionally, the controller is configured to determine the deviations by comparing the throw data with the estimated throw trajectory.
Optionally, a parameter sensor is configured to detect the object parameters of the waste objects, the object parameters comprising the orientation and/or physical characteristics of the respective waste objects, and wherein the controller is configured to determine the estimated throw trajectory based on the object parameters.
Optionally, the controller is configured to associate the throw data in the waste object model with the respective object parameters of the waste object.
Optionally, the throw data comprises the timing of the position of the waste object after being thrown to the target position, wherein the controller is configured to compare the timing of the position of the waste object with a time at which the gripper releases the waste object when being thrown.
Optionally, the controller is configured to determine an expected target time for the waste object to reach the target position based on the time at which the gripper releases the waste object when being thrown, and compare the timing of the position of the waste object with the expected target time.
Optionally, the controller is configured to determine the expected target time based on the detected object parameters of the waste object.
Optionally, the throw sensor is configured to provide the throw data of the waste object along a path between a throw position at which the gripper releases the waste object when being thrown and the target position.
In a second aspect of the disclosure, there is provided a method of controlling a waste robot comprising moving a manipulator within a working area, controlling a gripper connected to the manipulator to selectively grip a waste object in the working area at a picking position and throw the waste object to a target position, determining the position of the waste object after being thrown to the target position as throw data, determining deviations in the position of the thrown waste object by comparing the throw data with the target position, and determining control instructions to the gripper and/or manipulator for subsequently gripped waste objects based on the deviations to throw the subsequently gripped waste objects towards the target position.
In a third aspect of the disclosure, a computer program product comprising instructions which, when the program is executed by a computer, cause the computer to carry out the steps of the method according to the second aspect.
Various other aspects and further examples are also described in the following detailed description and in the attached claims with reference to the accompanying drawings, in which:
Figure 1 shows a perspective view of a waste sorting robot;
Figure 2 shows a further perspective view of a waste sorting robot;
Figure 3a shows a schematic front view of a waste sorting robot, where a gripper has engaged with a waste object and lifted the waste object from the conveyor;
Figure 3b shows a position of the gripper, subsequent to the position shown in Fig. 3a, where the gripper is positioned at a throw position where the waste object is thrown to a target position;
Figure 3c shows a position of the gripper, subsequent to the position shown in Fig. 3b, where the gripper moves in a direction towards a next waste object after having thrown the previous waste object towards the target position;
Figure 3d shows a schematic side view of a waste sorting robot in Fig. 3c, where a detected position of the waste object is compared to the position of the target position;
Figure 3e shows another schematic side view of a waste sorting robot, where the position of another thrown waste object is compared to the position of the target position;
Figure 4 shows a schematic front view of a waste sorting robot, where a gripper moves towards another waste object to be thrown to the target position; and
Figure 5 shows a flowchart of a method of controlling a waste sorting robot.
Figure 1 shows a perspective view of a waste sorting robot 100. In some examples, the waste sorting robot 100 can be a waste sorting gantry robot 100. The examples described below can also be used with other types of robot such as robot arms or delta robots. In some other examples, the waste sorting robot 100 is a Selective Compliance Assembly Robot Arm (SCARA). The different types of robot are collectively referred to as waste sorting robot 100 below for brevity.
The waste sorting robot 100 comprises a manipulator 101 which is movable within a working area 102. The waste sorting robot 100 comprises a gripper 103 which is connected to the manipulator 101. The gripper 103 is arranged to selectively grip a waste object 104 which moves into the working area 102 on a conveyor belt 113. The gripper 103 may comprise a pneumatic suction gripper holding and releasing waste objects 104 by a varying air- or gas pressure. Alternatively, or in addition, the gripper 103 may comprise movable jaws to pinch the waste object 104 with a releasable grip. The conveyor belt 113 may be a continuous belt, or a conveyor belt formed from overlapping portions. The conveyor belt 113 may be a single belt or alternatively a plurality of adjacent moving belts (not shown). In other examples, the waste object 104 can be conveyed into the working area 102 via other conveying means. The conveyor belt 113 can be any suitable means for moving the waste object 104 into the working area 102. For example, the waste object 104 may be fed under gravity via a slide (not shown) to the working area 102.
The working area 102 is an area within which the manipulator 101 and gripper 103 are able to reach and interact with the waste object 104. The working area 102 as shown in Fig. 1 is a cross hatched area beneath the gripper 103. Fig. 1 shows only one waste object 104 for clarity, but it should be understood that any number of waste objects 104 may move into the working area 102. The schematic view of a waste sorting robot 100 in Fig. 2 shows a plurality of waste objects 104a, 104b, 104c, collectively referred to as waste object 104 for brevity unless otherwise indicated.
The gripper 103 is arranged to grip the waste object 104 in the working area 102, at a momentaneous position referred to as a picking position 105 below, and throw the waste object 104 to a target position 106a. Fig. 2 is a schematic illustration of a waste object 104b having a momentaneous picking position 105, and a target position 106a where a previous waste object 104a has been thrown by the gripper 103. The working area 102 may extend over the target position 106a as indicated by the cross hatched area beneath the gripper 103 in Fig. 2, to allow the gripper 103 to throw a waste object 104 onto the target position 106a with a vertical throw in some examples.
The manipulator 101, and the gripper 103 connected thereto, are configured to move within a working volume defined by the height above the working area 102 where the waste sorting robot 100 can manipulate the waste object 104. In some examples, the manipulator 101 is moveable along a plurality of axes. In some examples, the manipulator 101 is moveable along three axes which are substantially at right angles to each other. In this way, the manipulator 101 is movable in an X-axis which is parallel with the longitudinal axis of the conveyor belt 113 (“beltwise”). Additionally, the manipulator 101 is movable across the conveyor belt 113 in a Y-axis which is perpendicular to the longitudinal axis of the conveyor belt 113 (“widthwise”). The manipulator 101 is movable in a Z-axis which is in a direction normal to the working area 102 and the conveyor belt 113 (“heightwise”). Optionally, the manipulator 101 and/or gripper 103 can rotate about one or more axes (W), as schematically indicated in Fig. 2. The waste sorting robot 100 may comprise one or more servos, pneumatic actuators or any other type of mechanical actuator for moving the manipulator 101 and gripper 103 in one or more axes. For the purposes of clarity, the servos, pneumatic actuators or mechanical actuators are not shown in the Figures.
The waste sorting robot 100 is arranged to sort the waste object 104 into fractions according to one or more parameters of the waste object 104. The waste objects 104 can be any type of industrial waste, commercial waste, domestic waste or any other waste which requires sorting and processing. Unsorted waste material comprises a plurality of fractions of different types of waste. Industrial waste can comprise fractions, for example, of metal, wood, plastic, hardcore and one or more other types of waste. In other examples, the waste can comprise any number of different fractions of waste formed from any type or parameter of waste. The fractions can be further subdivided into more refined categories. For example, metal can be separated into steel, iron, aluminium etc. Domestic waste also comprises different fractions of waste such as plastic, paper, cardboard, metal, glass and/or organic waste. A fraction is a category of waste that the waste can be sorted into by the waste sorting robot 100. A fraction can be a standard or homogenous composition of material, such as aluminium, but alternatively a fraction can be a category of waste defined by a customer or user.
The waste sorting robot 100 may comprise a parameter sensor 107 configured to detect object parameters of the waste objects 104, and a controller 108 in communication with the parameter sensor 107 which may be configured to receive the detected object parameters. The controller 108 may thus be configured to send movement instructions to the manipulator 101 and gripper 103 for interacting with the waste objects 104 to be sorted, based on the detected object parameters. I.e. the gripper 103 may selectively grip the waste objects 104 to be sorted to different target positions, such as different chutes arranged along the working area. The controller 108 may thus be configured to send instructions to the X-axis, Y-axis and Z-axis drive mechanisms of the manipulator 101 and gripper 103 to control and interact with the waste objects 104 on the conveyor belt 113. Various information processing techniques can be adopted by the controller 108 for controlling the manipulator 101 and gripper 103. Such information processing techniques are described in WO2012/089928, WO2012/052615, WO2011/161304, WO2008/102052, which are incorporated herein by reference. The control of the waste sorting robot 100 is discussed in further detail in reference to Figs. 3 - 4 below.
Fig. 3a shows a schematic front view of a waste sorting robot 100 where the gripper 103 has engaged with a waste object 104a and lifted the waste object 104a from the conveyor belt 113. Fig. 3b shows a position of the gripper 103, subsequent to the position shown in Fig. 3a, where the gripper 103 is positioned at a throw position 110 where the waste object 104a is thrown towards a target position 106a with a throw velocity (vo) and throw angle (φo). Fig. 3c shows a position of the gripper 103, subsequent to the position shown in Fig. 3b, where the gripper 103 moves in a direction towards a next waste object after having thrown the first waste object 104a. Fig. 3c shows a momentaneous position of the first waste object 104a after having been thrown by the gripper 103 and/or manipulator 101.
The waste sorting robot 100 comprises a throw sensor 112, 112a, 112b, configured to determine the position of the waste object 104 after being thrown to the target position 106a. Figs. 3c and 3d show a throw sensor 112 arranged to detect the position of the waste object 104a. Fig. 3d is a schematic illustration of a waste sorting robot 100, with a side view along the X-direction. The conveyor belt 113 moves in the direction of the arrows indicated in Fig. 3d. The controller 108 is configured to receive the position of the waste object 104a being thrown as throw data 109’. The dotted path 109’ in Figs. 3c and 3d indicates the actual position of the waste object 104 when being thrown from the throw position 110. Thus, the actual position of the waste object 104 is referred to as the throw data 109’. From the side view in Fig. 3d it can be seen that the position of the waste object 104a in the X-direction falls short of the target position 106a. The intended throw trajectory is indicated as the dotted path 109. Thus, in this example, the first waste object 104a is instead thrown to a position indicated as 106b, off-set from the intended target position indicated as 106a. The controller 108 is configured to determine deviations in the position of the thrown waste object 104 by comparing the throw data 109’ received from the throw sensor 112 with the target position 106a. Fig. 3d illustrates one example of such deviating positions, in the X-direction, but it should be understood that deviations may occur also in the Y-direction.
The controller 108 is configured to determine control instructions to the gripper 103 and/or manipulator 101 for subsequently gripped waste objects 104 based on the aforementioned deviations to throw the subsequently gripped waste objects 104 towards the target position 106a. Figs. 3 - 4 show one target position 106a for the purpose of clarity but it should be understood that different waste objects 104a, 104b, 104c, may have a plurality of different target positions 106a. Fig. 3e shows an example of a subsequently gripped waste object 104b being thrown towards the target position 106a. The throw data 109’ for the previous waste object 104a, in Fig. 3d, may in this example indicate that the throw angle (φo) and/or throw velocity (vo) for a waste object of this type needs to be adjusted, in order to reach the target position 106a. The controller 108 may thus send control instructions to the gripper 103 and/or manipulator 101 to change the throw position 110, throw angle (φo) and/or throw velocity (vo) for the subsequently gripped waste object 104b. Fig. 3e shows an example where the throw position 110, throw angle (φo) and/or throw velocity (vo) for the second waste object 104b has been compensated based on the deviations determined for the first waste object 104a, so that the second waste object 104b is thrown to the target position 106a. E.g. the velocity component of the throw velocity (vo) in the X-direction could in this example be increased to not fall short of the target position 106a. An example of an intended throw trajectory 109 is also shown in Fig. 3e.
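As a purely illustrative sketch (in Python; not part of the disclosure, all names and numbers are assumptions), a controller could turn a measured beltwise landing deviation into a corrected release speed for the next, similar throw by inverting a simple drag-free range model:

    import math

    G = 9.81  # gravitational acceleration, m/s^2

    def flat_throw_range(v0, phi0, h):
        # Horizontal distance travelled by a point mass released at height h
        # above the landing level with speed v0 and elevation angle phi0 (rad),
        # neglecting air resistance.
        vz = v0 * math.sin(phi0)
        vx = v0 * math.cos(phi0)
        t_flight = (vz + math.sqrt(vz ** 2 + 2.0 * G * h)) / G
        return vx * t_flight

    def corrected_speed(v0, phi0, h, deviation_x, tol=1e-4):
        # deviation_x is the observed landing error along the throw direction
        # (negative = fell short). Aim the model at the previous model range
        # minus the deviation and solve for the new speed by bisection.
        target_range = flat_throw_range(v0, phi0, h) - deviation_x
        lo, hi = 0.1, 10.0 * v0
        while hi - lo > tol:
            mid = 0.5 * (lo + hi)
            if flat_throw_range(mid, phi0, h) < target_range:
                lo = mid
            else:
                hi = mid
        return 0.5 * (lo + hi)

    # Previous throw: 3 m/s at 45 degrees, released 0.4 m above the chute,
    # landed 0.25 m short -> throw the next, similar object slightly harder.
    v_next = corrected_speed(3.0, math.radians(45.0), 0.4, -0.25)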
Determining control instructions to the gripper 103 and/or manipulator 101 for the subsequent waste objects 104 based on detected deviations in the position between thrown waste objects 104 and the respective target positions 106a provides for an effective refinement of the throwing capabilities of the waste sorting robot 100. The accuracy of the waste sorting robot 100 is improved and the useful time interval the manipulator 101 and gripper 103 interact with each waste object 104 is optimized by refining the throw trajectories for the different types of waste objects 104. A more accurate and effective waste sorting robot 100 is provided. This means that the sorting speed may ultimately be increased. The speed of the conveyor belt 113 and/or the amount of waste objects 104 on the conveyor belt 113 may be increased. In one example, by increasing the speed of the conveyor belt, the objects to be sorted on the conveyor belt are more singularized and less likely to be overlapping. This means that the manipulation and object recognition are easier. This increases the processing rate, e.g. tons/hour, because the number of objects per hour which is fed to the robot increases.
The controller 108 may be configured to associate the throw data 109’ of the thrown waste object 104 and the determined control instructions to a waste object model to be applied to the subsequently gripped waste objects. I.e. a model may be created where different categories of waste objects 104 may be associated with the actual throw data 109’ of such different categories of waste objects 104 and the corresponding control instructions. As a new waste object 104 is to be sorted to a target position 106a, the new waste object 104 may be compared to the categories of waste objects 104 in the waste object model. The closest matching throw data 109’ may be identified, and the corresponding control instructions may be retrieved to be applied for the new waste object 104, which have been updated to compensate for the previous deviation from the intended trajectory 109. The next waste object 104 may thus be thrown to the correct target position 106a, e.g. by increasing the length of the throw as shown in the example of Fig. 3e. The control instructions to the gripper 103 and/or the manipulator 101 may thus be refined and may be optimized for varying types and categories of waste objects 104. Different categories of waste objects 104 may be identified by unique waste object parameters, which may also be associated with the respective throw data 109’ and control instructions, as described further below.
In some examples, it may not be possible to identify a particular waste object 104 in the waste object model, e.g. in an initial start-up phase of the waste sorting robot 100 when the amount of determined waste object parameters and/or throw data 109’ is limited. The controller 108 may be configured to approximate, e.g. by interpolation techniques of the available data in the model, an estimated throw velocity (vo) and a throw angle (φo) of the particular waste object 104 towards the target position 106a. The resulting throw data 109’ may then be recorded by the throw sensor 112 and the control instructions may be iteratively updated for the particular waste object type by continuously comparing the throw data 109’ with the intended throw trajectory 109. The waste object model may thus be continuously built and refined to be applicable to a growing number of different types of waste objects 104.
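A minimal, purely illustrative data structure for such a waste object model (Python; the category names, feature choices and gain factor are assumptions, not part of the disclosure) could accumulate observed deviations per category and fall back to the most similar known category for unseen objects:

    from dataclasses import dataclass, field

    @dataclass
    class ThrowCorrection:
        dv: float = 0.0    # additive correction to the throw velocity, m/s
        dphi: float = 0.0  # additive correction to the throw angle, rad

    @dataclass
    class ModelEntry:
        features: tuple                     # e.g. (size_m, mass_kg, flatness)
        deviations: list = field(default_factory=list)
        correction: ThrowCorrection = field(default_factory=ThrowCorrection)

    class WasteObjectModel:
        def __init__(self):
            self.entries = {}  # category name -> ModelEntry

        def update(self, category, features, deviation_x, gain=0.5):
            # Record the observed beltwise deviation and nudge the stored speed
            # correction against it (gain crudely converts metres to m/s).
            entry = self.entries.setdefault(category, ModelEntry(features))
            entry.deviations.append(deviation_x)
            entry.correction.dv += -gain * deviation_x
            return entry.correction

        def lookup(self, category, features):
            # Exact category match if available, otherwise the entry with the
            # most similar feature vector; uncorrected throw if the model is empty.
            if category in self.entries:
                return self.entries[category].correction
            if not self.entries:
                return ThrowCorrection()
            def squared_distance(entry):
                return sum((a - b) ** 2 for a, b in zip(entry.features, features))
            return min(self.entries.values(), key=squared_distance).correction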
Building of the waste object model as described in the example above may be part of a machine learning-based capability of the controller 108 and the waste sorting robot 100. Thus, the controller 108 may be configured to input the throw data 109’ to a machine learning-based model to determine the control instructions for subsequently gripped waste objects 104. This provides for an effective adaptation and optimization of the control instructions in the waste object model.
In a further example, the controller 108 may be configured to input the throw data 109’ together with detected object parameters of the respective waste objects 104 to a machine learning-based model to determine the control instructions. Thus, in some examples, the waste sorting robot 100 comprises a parameter sensor 107, as schematically indicated in Fig. 4 and described further below. The parameter sensor 107 may comprise an imaging sensor, such as a camera. Object parameters for the different waste objects 104 may be determined from the image data received from the parameter sensor 107. Different image features, such as shapes, colours, geometrical relationships etc. of the detected waste objects 104 may be assigned as the characterising object parameters in the waste object model, to create e.g. different categories of waste objects. The waste object model may be continuously populated with the throw data 109’ for the respective categories of waste objects, and deviations from the intended trajectory 109 may be determined to continuously adapt the associated control instructions. The categorization of the waste objects 104 may be continuously refined by analysing and comparing the image data of waste objects 104 having similar throw data 109’ for a similar set of control instructions. The same principle may be applied to data received from different types of parameter sensors 107 to characterize the waste objects 104, as exemplified below.
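One hedged illustration of such a machine learning-based mapping (a sketch assuming scikit-learn and NumPy are available; the feature columns and all numbers are placeholders) is a nearest-neighbour regressor that maps detected object parameters to throw-parameter corrections:

    import numpy as np
    from sklearn.neighbors import KNeighborsRegressor

    # Feature rows per thrown object, e.g. projected area (m^2), estimated
    # mass (kg), aspect ratio and orientation (rad) from the parameter sensor.
    X = np.array([
        [0.010, 0.05, 1.2, 0.1],
        [0.030, 0.20, 3.5, 1.4],
        [0.008, 0.30, 1.1, 0.3],
    ])
    # Corrections that brought similar objects onto the chute:
    # [delta throw velocity (m/s), delta throw angle (rad)].
    y = np.array([
        [0.10, 0.00],
        [0.45, 0.05],
        [0.25, -0.02],
    ])

    model = KNeighborsRegressor(n_neighbors=1).fit(X, y)
    correction = model.predict([[0.012, 0.08, 1.3, 0.2]])[0]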
The throw sensor 112 may be arranged to detect if the waste object 104 lands at the target position 106a after being thrown. Fig. 4 shows an example where a throw sensor 112b is arranged adjacent the target position 106a, which may comprise a chute into which the waste object 104 should be thrown. A deviation may be detected if the throw sensor 112b is unable to detect the position of the waste object 104 at the target position 106a or chute after the waste object 104 is thrown. The controller 108 may thus be configured to update the control instructions to the gripper 103 and/or manipulator 101 to adjust any of the throw position 110, throw velocity (vo) and throw angle (φo) towards the target position 106a. The throw sensor 112b may in some examples comprise a trigger sensor 112b configured to detect if a waste object 104 flies past the trigger sensor 112b and into the target position 106a. Such trigger sensor 112b may comprise an optical sensor with trigger beams across the target position 106a. In one example the trigger sensor 112b comprises a grid or fence of optical detection beams across the target position 106a, such as a laser- or IR-grid.
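The trigger-sensor logic could, for example, be reduced to a timeout check along the lines of the following sketch (Python; trigger_fired and the margin are hypothetical helpers, not the disclosed implementation):

    import time

    def confirm_landing(trigger_fired, t_release, t_expected, margin=0.15):
        # trigger_fired() returns the time stamp of the most recent beam
        # interruption (monotonic clock) or None. Returns True when the object
        # is seen entering the chute after release, False when no interruption
        # is registered within the expected flight time plus a safety margin.
        deadline = t_release + t_expected + margin
        while time.monotonic() < deadline:
            t_hit = trigger_fired()
            if t_hit is not None and t_hit >= t_release:
                return True
            time.sleep(0.005)
        return False  # no triggering: treat the throw as a deviation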
The throw data 109’ may comprise the timing (tp) of the position of the waste object 104 after being thrown to the target position 106a. Fig. 3e shows a schematic illustration where the positions of the waste object 104, i.e. the throw data 109’, have an associated timing (tp), further including a target time (tTP) when the waste object 104 is detected at the target position 106a. The timing of the position of the waste object 104 may thus comprise a target time (tTP) at which the waste object 104 reaches the target position 106a. I.e. the throw sensor 112 may be arranged to detect a target time (tTP). For example, a trigger sensor 112b may record a time stamp, i.e. the target time (tTP), when being triggered by a waste object 104. The controller 108 may be configured to compare the timing of the position of the waste object 104, such as the target time (tTP) in this example, with a time (tR) at which the gripper 103 releases the waste object 104 when being thrown. Time (tR) is referred to as the release time (tR). This allows for synchronizing the position of the waste object 104 with the release time (tR). E.g. if the trigger sensor 112b is triggered, the target time (tTP) may be compared with the release time (tR) to determine whether the times are in agreement to confirm a correct throw.
In some examples, the controller 108 may be configured to determine an estimated or expected target time (tET) for the waste object 104 to reach the target position 106a based on the time (tR) at which the gripper 103 releases the waste object 104 when being thrown. Fig. 3e shows a schematic illustration of a throw trajectory 109, which may be the intended throw trajectory 109, having an estimated target time (tET). The controller 108 may be configured to compare the timing of the position of the waste object 104, e.g. the target time (tTP), with the expected target time (tET) to further facilitate the detection of deviations to optimize the control instructions of the gripper 103 and/or manipulator 101.
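Under a drag-free assumption the expected target time (tET) follows directly from the release time (tR), the horizontal distance to the target and the horizontal velocity component; a small illustrative calculation (all values made up) in Python:

    import math

    def expected_target_time(t_release, horizontal_distance, v0, phi0):
        # tET = tR + (horizontal distance to target) / (horizontal speed),
        # ignoring drag so the horizontal velocity component stays constant.
        return t_release + horizontal_distance / (v0 * math.cos(phi0))

    # Released at tR = 12.00 s, chute centre 0.8 m away beltwise, thrown at
    # 3 m/s and 40 degrees -> tET of roughly 12.35 s.
    t_et = expected_target_time(12.00, 0.8, 3.0, math.radians(40.0))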
The controller 108 may be configured to determine the expected target time (tET) based on detected object parameters of the waste object 104. As described further below with reference to parameter sensor 107, the object parameters such as the size and shape of the waste object 104 may affect its trajectory towards the target position 106a, and the controller 108 may be configured to take into account these parameters to determine the expected target time (tET) for a more accurate estimation thereof. A plurality of different throw sensors 112a, 112b, may be arranged to detect the position of the waste object 104. In some examples, a trigger sensor 112b may be combined with a throw sensor 112a configured to detect image data of the waste object 104, as schematically indicated in Fig. 4. The throw sensor 112a may thus comprise an image detector, such as a camera. The image data may be utilized in conjunction with the information from a trigger sensor 112b to determine any deviations in the trajectory of the thrown waste objects 104 and when the control instructions to the gripper 103 and/or manipulator 101 need to be adjusted. E.g. image data from image detector 112a may be utilized to determine any deviations in the position of the waste object 104 if an expected triggering of the trigger sensor 112b does not occur after a throw. The example in Fig. 4 shows a combination of an image sensor 112a and trigger sensor 112b, but it should be understood that in some examples the throw data 109’ and any associated deviations are sufficiently determined by the image data of an image sensor 112a alone, without a trigger sensor 112b. In some examples, the throw sensor 112 comprises an image sensor 112a configured to detect the position of the waste object 104 in two dimensions, e.g. in the X- and Y-directions, and/or in three dimensions along the X-, Y-, and Z-directions. This provides for accurately determining the position of the waste object 104 after being thrown, as well as any associated deviations in its position with respect to the target position 106a.
In some examples, an image sensor 112a is arranged to detect image data of the target position 106a. The throw data 109’ may thus comprise image data of the position of the waste object 104 and the target position 106a, to allow determining any deviations of the waste object 104 from the intended trajectory 109. A series of images may be taken over the target position 106a after the gripper 103 releases the waste object 104 at the throw position 110. A selection of image data may be made amongst the series of images to identify and extract the positions of the waste object 104 which may be indicative of deviations from the target position 106a. Different image processing techniques may be utilized to effectively extract the throw data 109’ and positions of the waste object 104 relative to the target position 106a from the received image data. The controller 108 may then be configured to modify any of the throw position 110, throw velocity (vo) and throw angle (φo) for the subsequent waste objects 104 to reduce or eliminate deviations in the position with respect to the intended trajectory 109 towards the target position 106a.
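A simplified sketch of extracting a landing offset from such a series of images (assuming NumPy, a fixed overhead camera, an empty-chute reference frame and a known pixel-to-metre scale; a real system would need proper calibration and segmentation):

    import numpy as np

    def landing_offset(frames, empty_frame, target_px, m_per_px, thresh=30):
        # frames: greyscale images (H x W, uint8) taken over the target area
        # after release; empty_frame: reference image of the empty chute.
        # Returns the (dx, dy) offset in metres of the last detected object
        # centroid from the target pixel, or None if nothing was detected.
        last_hit = None
        for frame in frames:
            diff = np.abs(frame.astype(np.int16) - empty_frame.astype(np.int16))
            mask = diff > thresh
            if mask.sum() > 50:               # crude "object present" test
                ys, xs = np.nonzero(mask)
                last_hit = (xs.mean(), ys.mean())
        if last_hit is None:
            return None
        dx = (last_hit[0] - target_px[0]) * m_per_px
        dy = (last_hit[1] - target_px[1]) * m_per_px
        return dx, dy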
The two or three-dimensional throw data 109’ may comprise the timing (tp) of the position of the waste object 104 as discussed above. For example, the extracted image data of the position of the waste object 104 mentioned above may be compared with the timing of the gripper 103 and/or manipulator 101. For example, it may be determined that a first waste object 104a should land in the target area 106a after the gripper 103 releases the waste object 104a at the throw position 110, but before the next waste object 104b is picked or thrown from the gripper 103.
The timing (tp) may comprise any time stamps of the positions of the waste object 104 along any part of its path between the gripper 103 (or even picking position 105) and target position 106a, as schematically indicated in Fig. 3e. This provides for further facilitating determining deviations from an intended trajectory 109. The throw sensor 112 may thus be configured to provide the throw data 109’ of the waste object along a path between the throw position 110 at which the gripper 103 releases the waste object 104 when being thrown and the target position 106a. This provides for monitoring of the trajectory or flight path of the waste object 104. In some examples, such monitoring may be emphasized along different segments of the total trajectory, e.g. with emphasis on a segment closer to the target position 106a, to optimize the detection, e.g. to facilitate the image processing of the sensor data.
The controller 108 may be configured to determine an estimated throw position 110, an estimated throw velocity (vo) and/or an estimated throw angle (φo) for throwing the waste object 104 along the intended throw trajectory 109. I.e. for the respective waste objects 104 being selectively gripped by the gripper 103, the controller 108 may be configured to determine an estimation of the intended throw trajectory 109 towards the target position 106a. The estimation of the intended throw trajectory 109 is referred to as the estimated trajectory 109 below for brevity. The controller 108 may thus be configured to send control instructions to the gripper 103 and/or manipulator 101 so that the gripper 103 and/or manipulator 101 accelerates the gripped waste object 104 and releases the waste object 104 at a throw position 110 with a throw velocity (vo) and throw angle (φo) towards the target position 106a.
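For a fixed throw angle, the release speed needed to reach the target can be estimated from the drag-free ballistic relation; the following sketch (Python, illustrative geometry only, not the disclosed estimation method) shows one way such an estimate could be computed:

    import math

    G = 9.81  # m/s^2

    def required_speed(throw_pos, target_pos, phi0):
        # Release speed so that a drag-free projectile released at throw_pos
        # (x, z) with elevation angle phi0 (rad) passes through target_pos (x, z).
        dx = target_pos[0] - throw_pos[0]
        dz = target_pos[1] - throw_pos[1]
        denom = 2.0 * math.cos(phi0) ** 2 * (dx * math.tan(phi0) - dz)
        if denom <= 0.0:
            raise ValueError("target not reachable at this throw angle")
        return math.sqrt(G * dx ** 2 / denom)

    # Release point 0.9 m above the belt, chute opening 1.5 m away and 0.4 m
    # below the release point, thrown at 30 degrees -> roughly 3.4 m/s.
    v0 = required_speed((0.0, 0.9), (1.5, 0.5), math.radians(30.0))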
The controller 108 may be configured to determine the estimated throw trajectory 109 based on the throw data 109’. E.g. the throw data 109’ accumulated for any number of waste objects 104 may be analysed and compared with the respective target positions 106a to determine any deviations and update the control instructions as described above. I.e. any of the throw position 110, throw velocity (vo), and throw angle (φo) may be estimated based on the throw data 109’. The controller 108 may be configured to determine the aforementioned deviations in the position of the waste object 104 from the target position 106a by comparing the throw data 109’ with the estimated throw trajectory 109. Thus, any parameter describing the motion of the thrown waste object 104, such as position, velocity and acceleration, may be detected by the throw sensor 112 as throw data 109’ and compared to the estimated trajectory 109. The controller 108 may thus be configured to iteratively refine the waste object model which may be applied to the subsequently gripped waste objects 104 by comparing the throw data 109’ resulting from the various waste objects 104 with the respectively estimated trajectories 109.
The waste objects 104 may have different characteristics, described as object parameters in the following. As mentioned, the controller 108 may be configured to associate the throw data 109’ with detected object parameters of the respective waste objects 104 in a waste object model to continuously optimize sets of control instructions for a range of different waste objects 104.
The waste sorting robot 100 may thus comprise a parameter sensor 107 configured to detect object parameters of the waste objects 104. The controller 108 may be configured to receive the detected object parameters from the parameter sensor 107. The object parameters may comprise the orientation and/or physical characteristics of the respective waste objects 104. The orientation of a waste object 104 should be construed as the orientation in the working volume in the X, Y, Z-directions. For example, two waste objects 104 of identical size and shape may have different orientations when being transported on the conveyor belt 113, since the waste object 104 may lie on different sides on the conveyor belt 113. The orientation of such waste objects 104 may thus also be different when being held in place in the gripper 103, since the gripper 103 typically grips the waste objects 104 from a top-down approach, regardless of the orientation of the waste objects 104 on the conveyor belt 113. The physical characteristics may comprise geometrical characteristics of the respective waste objects 104, such as the shape, size, and/or volume. Alternatively, or in addition, the physical characteristics may comprise material characteristics, such as from what material the waste object 104 is made, density, and/or surface properties of the waste object 104.
The controller 108 may be configured to determine an estimated throw trajectory 109 of the respectively gripped waste object 104 towards the target position 106a based on the detected object parameters of said gripped waste object 104. The parameter sensor 107 may be configured to detect object parameters of a first waste object 104a prior to engaging the waste object 104a with the gripper 103, e.g. in case the parameter sensor 107 comprises an image sensor or any other sensor configured to detect object parameters remotely. The object parameters may in one example comprise information about the size of the engaged waste object 104a. The controller 108 may thus be configured to estimate a throw trajectory 109 of the waste object 104a based on the size thereof. The controller 108 is configured to send control instructions to the gripper 103 and/or the manipulator 101 so that the gripper 103 and/or the manipulator 101 accelerates the gripped waste object 104a and releases the waste object 104a at a throw position 110. The waste object 104a is released with a throw velocity (vo) and throw angle (φo) towards the target position 106a for throwing the waste object 104a along the estimated throw trajectory 109, associated with the waste object 104a, from the throw position 110 to the target position 106a. Fig. 3b is a schematic illustration of the momentaneous position of the gripper 103 at the throw position 110, where the waste object 104a is released from the gripper 103. The waste object 104a has a throw velocity (vo) and throw angle (φo) when being released. The controller 108 may be configured to instruct the manipulator 101 and gripper 103 to move to a next identified waste object 104c to be picked from the conveyor belt 113 (see e.g. Fig. 4), as soon as the first waste object 104a has been released at the throw position 110.
Fig. 4 is a schematic illustration where a second waste object 104c is to be picked from a picking position 105 and moved to a subsequent throw position 110. As described with respect to the first waste object 104a, the controller 108 may be configured to receive object parameters of the second waste object 104c from the parameter sensor 107. The object parameters of the second waste object 104c may comprise information that the second waste object 104c has a different size and/or shape, and/or different material composition, and/or different orientation on the conveyor belt 113, compared to the first waste object 104a. The controller 108 may be configured to determine a throw trajectory 109 of the second waste object 104c towards target position 106a based on the object parameters associated with the second waste object 104c. In the example in Fig. 4, the object parameters of the second waste object 104c may comprise information that the second waste object 104c has a more flattened shape compared to the first waste object 104a. In one example, the flat waste object 104c may be a sheet of material, such as metal, paper, or plastic, while the first waste object 104a may be from the same material, but crumpled into a rounder shape. Determining the throw trajectory 109 of the second waste object 104c may thus take into account that the shape of the second waste object 104c results in a different motion through the air after being released by the gripper 103.
In one example the waste objects 104 may have essentially the same object parameters with respect to the geometrical characteristics, i.e. same size and shape, but the material characteristics may be different. The densities of the waste objects 104 may vary, and accordingly the weight. Determining the throw trajectory 109 may thus take into account the different weights of the waste objects 104. E.g. a heavier object needs to be accelerated for a longer duration by the gripper 103 and/or manipulator 101 to reach a desired throw velocity (vo), compared to a lighter object, due to the increased inertia of the heavier object. Other material characteristics may include structural parameters such as the flexibility of the waste objects 104.
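The longer acceleration needed for a heavier object follows from the impulse relation F·T = m·vo; a small illustrative calculation with assumed numbers (not taken from the disclosure):

    def acceleration_time(mass_kg, v0, force_n):
        # Time the gripper must apply force_n to reach release speed v0,
        # assuming a constant force and negligible losses (F * T = m * v0).
        return mass_kg * v0 / force_n

    # A 0.1 kg plastic bottle versus a 1.5 kg metal part, both to be thrown at
    # 3 m/s with 30 N of available accelerating force:
    t_light = acceleration_time(0.1, 3.0, 30.0)  # 0.01 s
    t_heavy = acceleration_time(1.5, 3.0, 30.0)  # 0.15 s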
In a further example the waste objects 104 may have essentially the same object parameters with respect to the geometrical characteristics and the material characteristics, but the orientation of the waste objects 104 on the conveyor belt 113 may vary. The orientation of the waste objects 104 when held in place by the gripper 103 may thus also vary, if the waste objects 104 are gripped from the same top-down approach. For example, a rectangular waste object 104 which has one side significantly shorter than the remaining two, e.g. shaped like a text book, may have different trajectories through the air depending on which side is facing the throw direction. The waste object 104 may experience less drag if the shortest side is facing the throw direction, thus cutting through the air with less air resistance. Hence, the detected object parameters may comprise information of the orientation of the waste objects 104 to estimate the related throw trajectories 109.
Determining an estimated throw trajectory 109 of the respectively gripped waste objects 104, based on the detected object parameters provides for optimizing the useful time interval the manipulator 101 and gripper 103 interacts with each waste object 104. E.g. estimating the required throw velocity (vo) of a first waste object 104a to be thrown to the target position 106a allows for minimizing the amount of time the gripper 103 needs to carry the first waste object 104a before being thrown. The first waste object 104a may be thrown quickly at a throw position 110 just after being accelerated to the throw velocity (vo), and the gripper 103 may immediately target the next identified waste object 104c. A subsequently gripped waste object 104 may have associated object parameters which dictate a different throw trajectory 109 and the gripper 103 and/or manipulator 101 may be controlled to throw the waste object 104 accordingly. The optimized throwing of the waste objects 104 to the target positions 106a as described in the examples above provides for a more effective waste sorting robot 100.
The controller 108 may be configured to associate the throw data 109’ in the waste object model with the respective object parameters of the waste object 104. I.e. a waste object model may be created where different categories of waste objects 104, based on their respective object parameters, are associated with the throw data 109’ of such different categories of waste objects 104. As a new waste object 104 is to be sorted to a target position 106a, the object parameters obtained from a parameter sensor 107 may be compared to the categories of waste objects 104 in the waste object model so that the closest matching throw data 109’ may be identified. As the next waste object 104 is identified as having similar object parameters as a previous waste object 104 having associated throw data 109’, the respective control instructions in the model may have been updated to compensate for any previous deviation from the intended throw trajectory 109. The next waste object 104 may thus be thrown to the correct target position 106a, e.g. by increasing the length of the throw in the example of Fig. 3e. The controller 108 may thus be configured to associate the throw data 109’ and the detected object parameters of the thrown waste object 104 to a waste object model.
Different types of parameter sensors 107 will be described in the following. A parameter sensor 107 may be positioned upstream of the working area 102 so that detected parameters of the waste objects 104 may be sent to the controller 108 before the waste objects 104 enter the working area 102. Alternatively, or in addition, the sensor 107 may be positioned in the working area 102. The parameter sensor 107 may comprise a plurality of different types of parameter sensors 107. The parameter sensor 107 or plurality of parameter sensors 107 may be arranged at different positions outside or inside the working area 102 or working volume. In some examples the parameter sensor 107 or plurality of parameter sensors 107 may be arranged on, or in communication with, the manipulator 101 and/or gripper 103. In examples where the parameter sensor 107 is configured to detect the object parameters remotely, e.g. when the parameter sensor 107 comprises an image sensor, the parameter sensor 107 may be arranged in the working area 102 or working volume, but arranged to detect object parameters while the respective waste objects 104 have not yet been transported into the working area 102, e.g. by being aligned towards an upstream direction of the conveyor 113 movement with respect to the manipulator 101 and/or gripper 103. The parameter sensor 107 may in that case be arranged adjacent, or on, any parts of the waste sorting robot 100. This may provide for reducing the footprint of the waste sorting robot 100, while still facilitating early detection of the object parameters. Alternatively, or in addition, the parameter sensor 107 may be arranged adjacent, or on, any parts of the waste sorting robot 100 while being aligned to detect object parameters when the respective waste objects 104 have been transported into the working area 102. This may be advantageous in applications where the waste sorting robot 100 is optimized for compactness, if the conveyor 113 moves at a slower speed, or if continuous analysis of the object parameters is desired for optimizing the manipulation and sorting of the waste objects 104 along their conveyor path. This may be advantageous in case the waste objects 104 would be subject to movements while moving through the working area 102. The parameter sensor 107 may be positioned upstream of the manipulator 101 and/or gripper 103, e.g. outside the working area 102, while being arranged to detect object parameters while the respective waste objects 104 have not yet been transported into the working area 102. The parameter sensor 107 may in such an example be arranged on a separate frame (not shown) positioned upstream of the manipulator 101 and/or gripper 103. Alternatively, or in addition, the parameter sensor 107 may be positioned upstream of the manipulator 101 and/or gripper 103, e.g. outside the working area 102, while being arranged to detect object parameters when the respective waste objects 104 have been transported into the working area 102. This may provide for retrieving object parameters, such as image data of the waste objects 104 from several imaging angles, which may provide for improving the detection of the object parameters of the waste objects 104.
The parameter sensor 107 may comprise any sensor suitable to detect a parameter of the waste object 104, e.g. one or more of an image sensor, a force sensor, a gyroscopic sensor, a motion sensor, an electric current sensor, a hall sensor, a metal detector, a temperature sensor, a chemical sensor, a visual and/or infrared spectroscopic detector, radioactivity sensor and/or a laser, e.g. LIDAR. An image sensor may comprise one or more of an RGB camera, an infrared camera, a 3D imaging sensor, a terahertz imaging system.
The object parameters of the waste objects 104 may be detected by any of the mentioned sensors. For example, the geometrical dimensions and orientation of a waste object 104 may be determined from image data of the waste object 104 received from an image sensor 107. The image data may be used to determine any one of a size, shape, and volume of the waste object 104. Further, the image data may be utilized in a machine learning-based model to build up an object recognition capability of the waste sorting robot 100. Thus, the recorded image data may be utilized to distinguish physical characteristics such as from what material the waste object 104 is made, and the associated material characteristics, besides from the geometrical dimensions and orientation of a waste object 104. The image data may be combined with sensor data from any one of the aforementioned sensors.
In some examples, the gripper 103 comprises a gyroscopic sensor, such as an electrical MEMS gyroscope used as a velocity sensor. The controller 108 may thus determine the acceleration and velocity of the gripper 103 during operation. The velocity of the gripper 103 may thus be monitored and controlled at the throw position 110 so that the velocity of the gripper 103 translates to the desired throw velocity (vo) of the waste object 104 when being released from the gripper 103. The throw angle (φo) may be controlled by the X-, Y-, Z- movement of the gripper 103 and manipulator 101. For example, an upwards acceleration of the waste object 104 as illustrated in Fig. 3b may be achieved by an acceleration of the gripper 103 in the Z-direction, upwards from the conveyor belt 113. The waste object 104a continues its trajectory with a velocity component in the Z-direction after being released from the gripper 103. The gripper 103 may continue with an upward movement in the Z-direction after release, to not interfere with the trajectory of the thrown waste object 104a. The velocity component of the gripper 103 in the upwards Z-direction at the throw position 110 may be increased or decreased to vary the throw angle (φo).
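Because the released object inherits the gripper velocity, the throw speed and angle can be read off the measured velocity components; an illustrative conversion (Python; the axis convention follows the description above, the velocity values are assumptions):

    import math

    def throw_parameters(vx, vy, vz):
        # Release speed and elevation angle from the gripper velocity at the
        # throw position (X and Y in the belt plane, Z pointing upwards).
        v_horizontal = math.hypot(vx, vy)
        v0 = math.sqrt(v_horizontal ** 2 + vz ** 2)
        phi0 = math.atan2(vz, v_horizontal)
        return v0, phi0

    # Gripper moving at 2.2 m/s beltwise and 1.6 m/s upwards at release
    # -> roughly 2.7 m/s at about 36 degrees.
    v0, phi0 = throw_parameters(2.2, 0.0, 1.6)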
Accelerating the gripped waste object 104 to the throw velocity (vo) may comprise applying a force (F) to the gripped waste object 104 during a time (T) by a movement of the gripper 103 and/or the manipulator 101. Alternatively, or in addition, the waste object 104 may be accelerated to the throw velocity (vo) by applying an airflow to the gripped waste object 104, where the airflow is ejected from the gripper 103. Hence, a pressure from an airflow or a flow of gas, ejected from the gripper 103 onto the waste object 104, applies a force onto the waste object 104 to accelerate the waste object 104. In some examples, the gripper 103 and/or manipulator 101 may accelerate the waste object 104 by a movement in the X-, Y-, Z- directions in combination with pushing the waste object 104 away from the gripper 103 by an airflow. The gripper 103 may in some examples comprise a suction gripper 103 comprising a suction cup configured to physically engage with a surface of the waste object 104. A negative pressure may be created in the suction cup so that the waste object 104 is held in place by the gripper due to the force created by the negative pressure. The suction gripper 103 may be in fluid communication with a pneumatic system (not shown) connecting the suction gripper 103 to a compressed air or gas supply. The air or gas supply to the suction gripper 103 may be reversed so that the negative pressure is released and a positive pressure may be exerted onto the waste object 104 to throw the waste object 104 as described above. In a further example, the gripper 103 comprises movable jaws to grip the waste objects 104, and a gas- or airflow connection to push and throw the waste objects 104 away from the gripper 103 when the jaws release their grip.
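The speed gain available from a reversed-pressure air blast can be roughly estimated with the same impulse argument (over-pressure times cup area times blast duration, divided by the object mass); the numbers below are assumptions, and the estimate ignores leakage and the loss of contact once the object separates:

    def airblast_speed_gain(overpressure_pa, cup_area_m2, blast_s, mass_kg):
        # Rough upper bound on the extra release speed from a short positive-
        # pressure blast: impulse = overpressure * cup area * blast duration.
        return overpressure_pa * cup_area_m2 * blast_s / mass_kg

    # 50 kPa over-pressure on a 3 cm diameter suction cup for 20 ms acting on
    # a 0.2 kg object -> on the order of 3.5 m/s (optimistic; leakage ignored).
    cup_area = 3.1416 * 0.015 ** 2
    dv = airblast_speed_gain(50e3, cup_area, 0.020, 0.2)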
Fig. 5 is a flowchart of a method 200 of controlling a waste sorting robot 100. The method 200 comprises moving 201 a manipulator 101 within a working area 102, and controlling 202 a gripper 103 connected to the manipulator to selectively grip a waste object 104, 104a, 104b, 104c in the working area 102 at a picking position 105 and throw the waste object 104 to a target position 106a. The method 200 comprises determining 203 the position of the waste object 104 after being thrown to the target position 106a as throw data 109’. The method 200 comprises determining 204 deviations in the position of the thrown waste object 104 by comparing the throw data 109’ with the target position 106a. The method 200 comprises determining 205 control instructions to the gripper 103 and/or manipulator 101 for subsequently gripped waste objects 104 based on the deviations to throw the subsequently gripped waste objects 104 towards the target position 106a. The method 200 thus provides for the advantageous benefits as described above with reference to the waste sorting robot 100 and Figs. 1 - 4. The method 200 provides for a more accurate and effective waste sorting robot 100.
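Purely as an illustration of how the steps 201 - 205 could be tied together in software (every interface name below is hypothetical and not part of the disclosure), a simplified single-target control loop might look like:

    def sorting_loop(robot, parameter_sensor, throw_sensor, model, target):
        # Simplified version of method 200: pick (201/202), throw, record the
        # throw data (203), compare it with the target position (204) and
        # update the corrections used for subsequently gripped objects (205).
        while True:
            obj = parameter_sensor.next_object()
            robot.move_to(obj.picking_position)
            robot.grip(obj)

            correction = model.lookup(obj.category, obj.features)
            v0, phi0 = robot.plan_throw(target, correction)
            t_release = robot.throw(v0, phi0)

            landing = throw_sensor.landing_position(t_release)
            if landing is not None:
                deviation_x = landing[0] - target.position[0]  # beltwise error
                model.update(obj.category, obj.features, deviation_x)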
A computer program product is provided comprising instructions which, when the program is executed by a computer, cause the computer to carry out the steps of the method 200.
In another example two or more examples are combined. Features of one example can be combined with features of other examples.
Examples of the present disclosure have been discussed with particular reference to the examples illustrated. However, it will be appreciated that variations and modifications may be made to the examples described within the scope of the disclosure.

Claims

1. A waste sorting robot (100) comprising a manipulator (101) movable within a working area (102), a gripper (103) connected to the manipulator, wherein the gripper is arranged to selectively grip a waste object (104, 104a, 104b, 104c) in the working area at a picking position (105) and throw the waste object to a target position (106a), a throw sensor (112, 112a, 112b) configured to determine the position of the waste object after being thrown to the target position, a controller (108) in communication with the throw sensor and being configured to receive said position as throw data (109’), the controller is configured to determine deviations in the position of the thrown waste object by comparing the throw data with the target position, and determine control instructions to the gripper and/or manipulator for subsequently gripped waste objects based on the deviations to throw the subsequently gripped waste objects towards the target position.
2. Waste sorting robot according to claim 1, wherein the controller is configured to associate the throw data of the thrown waste object and the determined control instructions to a waste object model to be applied to the subsequently gripped waste objects.
3. Waste sorting robot according to claim 2, wherein the controller is configured to input the throw data to a machine learning-based model to determine the control instructions for the subsequently gripped waste objects.
4. Waste sorting robot according to any of claims 1 - 3, wherein the throw sensor is arranged to detect the waste object landing at the target position, such as into a chute, after being thrown.
5. Waste sorting robot according to any of claims 1 - 4, wherein, for the respective waste objects being selectively gripped by the gripper, the controller is configured to determine an estimated throw trajectory (109) of the gripped waste object towards the target position, send control instructions to the gripper and/or manipulator so that the gripper and/or manipulator accelerates the gripped waste object and releases the waste object at a throw position (110) with a throw velocity (vo) and throw angle (φo) towards the target position to throw the waste object along the estimated throw trajectory associated with the waste object, from the throw position to the target position.
6. Waste sorting robot according to claim 5, wherein the controller is configured to determine the estimated throw trajectory based on the throw data.
7. Waste sorting robot according to claim 5 or 6, wherein the controller is configured to determine the deviations by comparing the throw data with the estimated throw trajectory.
8. Waste sorting robot according to any of claims 5 - 7, comprising a parameter sensor (107) configured to detect the object parameters of the waste objects, the object parameters comprising the orientation and/or physical characteristics of the respective waste objects, and wherein the controller is configured to determine the estimated throw trajectory based on the object parameters.
9. Waste sorting robot according to claims 2 and 8, wherein the controller is configured to associate the throw data in the waste object model with the respective object parameters of the waste object.
10. Waste sorting robot according to any of claims 1 - 9, wherein the throw data comprises the timing (tp) of the position of the waste object after being thrown to the target position, wherein the controller is configured to compare the timing of the position of the waste object with a time (tR) at which the gripper releases the waste object when being thrown.
11. Waste sorting robot according to claim 10, wherein the controller is configured to determine an expected target time (tET) for the waste object to reach the target position based on the time (tR) at which the gripper releases the waste object when being thrown, and compare the timing of the position of the waste object with the expected target time.
12. Waste sorting robot according to claims 8 and 11, wherein the controller is configured to determine the expected target time (tET) based on the detected object parameters of the waste object.
13. Waste sorting robot according to any of claims 1-12, wherein the throw sensor is configured to provide the throw data of the waste object along a path between a throw position (110) at which the gripper releases the waste object when being thrown and the target position.
14. A method (200) of controlling a waste sorting robot, comprising moving (201) a manipulator (101) within a working area (102), controlling (202) a gripper (103) connected to the manipulator to selectively grip a waste object (104, 104a, 104b, 104c) in the working area at a picking position (105) and throw the waste object to a target position (106a), determining (203) the position of the waste object after being thrown to the target position as throw data, determining (204) deviations in the position of the thrown waste object by comparing the throw data with the target position, and determining (205) control instructions to the gripper and/or manipulator for subsequently gripped waste objects based on the deviations to throw the subsequently gripped waste objects towards the target position.
15. A computer program product comprising instructions which, when the program is executed by a computer, cause the computer to carry out the steps of the method according to claim 14.
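
Claims 1 and 14 describe a closed feedback loop: the throw sensor reports where a thrown object actually landed, the controller compares that landing position with the target position, and the resulting deviation shapes the control instructions for subsequently gripped objects. The following Python sketch is a minimal, hypothetical illustration of such a loop; the class names, the proportional gain and the example values are assumptions made here for illustration, not taken from the patent.

```python
from dataclasses import dataclass

@dataclass
class ThrowData:
    landing_x: float  # measured landing position along the throw axis (m)
    landing_y: float  # measured landing position across the throw axis (m)

@dataclass
class Target:
    x: float
    y: float

class ThrowCorrector:
    """Accumulates deviations between measured landings and the target
    and turns them into a correction applied to subsequent throws."""

    def __init__(self, gain: float = 0.5):
        self.gain = gain      # proportional gain (assumed tuning value)
        self.offset_x = 0.0   # running correction along the throw axis
        self.offset_y = 0.0   # running correction across the throw axis

    def update(self, throw_data: ThrowData, target: Target) -> None:
        # Deviation = where the object landed minus where it should have landed.
        dev_x = throw_data.landing_x - target.x
        dev_y = throw_data.landing_y - target.y
        # Shift the aim point in the opposite direction of the deviation.
        self.offset_x -= self.gain * dev_x
        self.offset_y -= self.gain * dev_y

    def corrected_aim(self, target: Target) -> Target:
        # Aim point used when computing the next throw trajectory.
        return Target(target.x + self.offset_x, target.y + self.offset_y)

# Example: the first throw overshoots by 0.2 m, so the next throw aims short.
corrector = ThrowCorrector(gain=0.5)
chute = Target(x=2.0, y=0.0)
corrector.update(ThrowData(landing_x=2.2, landing_y=0.05), chute)
print(corrector.corrected_aim(chute))  # roughly Target(x=1.9, y=-0.025)
```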
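
Claim 5 has the gripper release the object at a throw position with a throw velocity (v0) and throw angle (φ0), and claims 10-12 compare the measured timing with an expected target time (tET). One way to obtain these quantities, shown here as a worked example under the assumption of simple drag-free projectile motion rather than as the patent's own derivation, is:

```latex
% Waste object released at the throw position (x_0, z_0) with speed v_0 and
% elevation angle \varphi_0 at release time t_0 (air drag neglected).
\begin{align}
  x(t) &= x_0 + v_0 \cos\varphi_0 \,(t - t_0), \\
  z(t) &= z_0 + v_0 \sin\varphi_0 \,(t - t_0) - \tfrac{1}{2} g (t - t_0)^2.
\end{align}
% Setting z(t) equal to the height z_T of the target position (e.g. the chute
% opening) and solving for t gives the expected target time
\begin{equation}
  t_{ET} = t_0 + \frac{v_0 \sin\varphi_0
           + \sqrt{v_0^2 \sin^2\varphi_0 - 2 g\,(z_T - z_0)}}{g}.
\end{equation}
```

Under this model the predicted landing point x(tET) can be compared with the throw data as in claim 7, and the measured landing time can be compared with tET as in claim 11.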
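
Claims 2, 3, 8 and 9 tie the throw data and the determined control instructions to a waste object model keyed on detected object parameters, optionally through a machine learning-based model. The sketch below assumes a deliberately simple per-class running average in place of any particular learning algorithm; the class-key scheme, method names and example figures are illustrative only.

```python
from collections import defaultdict
from statistics import mean

class WasteObjectModel:
    """Stores per-class throw corrections learned from observed throws.

    The class key is derived from detected object parameters (material,
    mass estimate); the stored value is the average correction that brought
    earlier objects of the same class onto the target.
    """

    def __init__(self):
        self._corrections = defaultdict(list)  # class key -> observed corrections

    @staticmethod
    def class_key(material: str, mass_kg: float) -> tuple:
        # Bucket the mass so similar objects share a model entry (assumed scheme).
        return (material, round(mass_kg, 1))

    def record_throw(self, key: tuple, deviation_m: float) -> None:
        # A positive deviation means the object landed past the target,
        # so the stored correction shortens the next throw of this class.
        self._corrections[key].append(-deviation_m)

    def correction_for(self, key: tuple) -> float:
        observed = self._corrections.get(key)
        return mean(observed) if observed else 0.0

# Example: light PET bottles tend to fall short, so their stored correction is positive.
model = WasteObjectModel()
key = WasteObjectModel.class_key("PET", 0.05)
model.record_throw(key, deviation_m=-0.15)  # landed 0.15 m short of the chute
model.record_throw(key, deviation_m=-0.10)
print(model.correction_for(key))            # 0.125 -> throw the next PET bottle farther
```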
PCT/FI2021/050724 2020-10-28 2021-10-26 Waste sorting robot with throw sensor for determining position of waste object WO2022090627A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
SE2030329A SE544102C2 (en) 2020-10-28 2020-10-28 Waste sorting robot with throw sensor for determining position of waste object
SE2030329-3 2020-10-28

Publications (1)

Publication Number Publication Date
WO2022090627A1 true WO2022090627A1 (en) 2022-05-05

Family

ID=81077457

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/FI2021/050724 WO2022090627A1 (en) 2020-10-28 2021-10-26 Waste sorting robot with throw sensor for determining position of waste object

Country Status (2)

Country Link
SE (1) SE544102C2 (en)
WO (1) WO2022090627A1 (en)

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9789517B2 (en) * 2015-02-10 2017-10-17 Veolia Environnement—Ve Selective sorting method
US20180127219A1 (en) * 2016-11-08 2018-05-10 Berkshire Grey Inc. Systems and methods for processing objects
WO2019207202A1 (en) * 2018-04-22 2019-10-31 Zenrobotics Oy Force control coupling for a robotic end effector for a waste sorting robot
WO2019215384A1 (en) * 2018-05-11 2019-11-14 Zenrobotics Oy Waste sorting robot
JP2020022930A (en) * 2018-08-07 2020-02-13 宇部興産株式会社 Waste sorting device and sorting method, as well as waste processing system and processing method

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2022167468A1 (en) * 2021-02-03 2022-08-11 Karl Schulz Robot for detecting and picking up at least one predetermined object
AT526401A1 (en) * 2022-08-11 2024-02-15 Brantner Env Group Gmbh Method for sorting material to be sorted

Also Published As

Publication number Publication date
SE2030329A1 (en) 2021-12-21
SE544102C2 (en) 2021-12-21

Similar Documents

Publication Publication Date Title
WO2022090627A1 (en) Waste sorting robot with throw sensor for determining position of waste object
FI127100B (en) A method and apparatus for separating at least one object from the multiplicity of objects
CN108273761A (en) A kind of device and method of sorting building waste
DK3056288T3 (en) SELECTIVE SORTING METHOD AND DEVICE
KR20220165262A (en) Pick and Place Robot System
JP5806301B2 (en) Method for physical object selection in robotic systems
US20180056335A1 (en) Workpiece sorting system and method
CN208390465U (en) A kind of device sorting building waste
Zhang et al. Gilbreth: A conveyor-belt based pick-and-sort industrial robotics application
JP2001252886A (en) Object handling system
Raptopoulos et al. Robotic pick-and-toss facilitates urban waste sorting
US20230144252A1 (en) Waste sorting robot
CN110090818A (en) A kind of intelligence color sorting equipment fine air-flow self-adaptation control method
JP2010264559A (en) Method of controlling robot
US11465858B2 (en) Actuated air conveyor device for material sorting and other applications
CN114405866B (en) Visual guide steel plate sorting method, visual guide steel plate sorting device and system
CN110302981A (en) A kind of solid waste sorts online grasping means and system
Nguyen et al. Development of a robotic system for automated decaking of 3D-printed parts
US20230405639A1 (en) Waste sorting robot with gripper that releases waste object at a throw position
TW202322990A (en) Velocity control-based robotic system, method to control a robotic system and computer program product embodied in a non-transitory computer readable medium
WO2022072217A1 (en) Controllable array sorting device
CN111687057B (en) Article sorting method, sorting system, sorting equipment and readable storage medium
JP5606424B2 (en) Component extraction method and component extraction system
Michalos et al. A novel pneumatic gripper for in-hand manipulation and feeding of lightweight complex parts—a consumer goods case study
Zhang et al. A CNN-based fast picking method for WEEE recycling

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 21885432; Country of ref document: EP; Kind code of ref document: A1)
NENP Non-entry into the national phase (Ref country code: DE)
122 Ep: pct application non-entry in european phase (Ref document number: 21885432; Country of ref document: EP; Kind code of ref document: A1)