EP3468727A1 - Dispositif de triage et procédé de triage correspondant - Google Patents

Dispositif de triage et procédé de triage correspondant

Info

Publication number
EP3468727A1
Authority
EP
European Patent Office
Prior art keywords
evaluation
sorting
objects
unit
parameter
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
EP17729466.7A
Other languages
German (de)
English (en)
Other versions
EP3468727B1 (fr)
Inventor
Thomas Längle
Wolfgang Karl
Georg Maier
Michael Bromberger
Mario Kicherer
Thomas Becker
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung e.V.
Karlsruher Institut für Technologie (KIT)
Original Assignee
Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung e.V.
Karlsruher Institut für Technologie (KIT)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung e.V. and Karlsruher Institut für Technologie (KIT)
Publication of EP3468727A1
Application granted
Publication of EP3468727B1
Legal status: Active
Anticipated expiration

Classifications

    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B07: SEPARATING SOLIDS FROM SOLIDS; SORTING
    • B07C: POSTAL SORTING; SORTING INDIVIDUAL ARTICLES, OR BULK MATERIAL FIT TO BE SORTED PIECE-MEAL, e.g. BY PICKING
    • B07C5/00: Sorting according to a characteristic or feature of the articles or material being sorted, e.g. by control effected by devices which detect or measure such characteristic or feature; Sorting by manually actuated devices, e.g. switches
    • B07C5/34: Sorting according to other particular properties
    • B07C5/342: Sorting according to other particular properties according to optical properties, e.g. colour
    • B07C5/3425: Sorting according to optical properties of granular material, e.g. ore particles, grain
    • B07C5/36: Sorting apparatus characterised by the means used for distribution
    • B07C5/361: Processing or control devices therefor, e.g. escort memory

Definitions

  • the present invention relates to a sorting apparatus (hereinafter also referred to as an optical sorting system) according to the preamble of claim 1 (and a corresponding method).
  • from the point of view of image processing, object clusters (object agglomerates) often additionally arise which can only be separated algorithmically (in particular: segmented) with difficulty.
  • Sorting systems, in particular for sorting bulk goods, are known from the prior art from the publications cited below.
  • the object of the present invention is to improve the performance of optical sorting systems for sorting objects in material flows, in particular to improve the real-time capability of such systems, i.e., to increase the probability that a sorting decision is actually available for each object in the material stream at the time of discharge.
  • the present invention is intended to provide a more efficient processing of sensor data in order to enable a real-time capability even at short distances between the image recording unit (sensor system) and the sorting unit (unit for discharging objects) (and consequently to increase the sorting performance).
  • the present invention is intended to provide new approaches to sensor data processing.
  • the present invention is based on the following basic considerations: in conventional systems, there is the problem that for some objects no sorting decision is available when they reach the separation mechanism (sorting unit), since the required calculations of the evaluation system (evaluation unit) have not yet been completed. A subset of the systems known from the prior art does not handle this condition at all, resulting in an unavoidable loss of sorting performance and/or sorting quality. To solve these problems, the following approaches are available:
  • a basic idea of the present invention is to respond suitably to the required calculation times, which fluctuate depending on the sensor data, i.e. on the specific conditions in the material flow and/or of its objects.
  • the system according to the invention can be realized as a belt sorting system, but slide sorting systems are also conceivable.
  • the optical detection of the material flow here usually means the recording of a plurality of individual images of the material flow, or of sections thereof, per unit of time. For example, while the sorting system is working, video images (fast frame sequences) of a neutral background over which the material stream is transported can be recorded and evaluated by the evaluation unit in real time (with regard to the objects recorded in the material flow in front of said background).
  • a color line-scan or area-scan camera system, for example on a CCD or CMOS basis, can be used as the imaging sensor system (image acquisition unit).
  • the individual images or video images of the image recording unit can be evaluated by the evaluation unit using image processing algorithms in order to determine the six-dimensional pose (i.e., the three-dimensional position and the three-dimensional orientation), or at least the three-dimensional position, of each individual object of the material flow at defined times.
  • the setting of the evaluation parameter(s) is preferably carried out by or in the evaluation unit, in particular by a microcontroller or a computing unit thereof.
  • the sorting of "classified" objects does not preclude non-classified objects from also being sorted (for example, for safety reasons, they are always discharged or rejected as "bad objects").
  • the sorting may, in particular, be a mechanical separation of the classified objects.
  • the objects are classified into two classes, namely "good objects" and "bad objects".
  • the objects of the two classes can be collected during sorting in separate containers. For this purpose (as is generally known to the person skilled in the art, cf. also the aforementioned prior art), rapid air valves can be used to blow out the bad objects.
  • a sorting decision can be made with the said minimum probability for any object in the material flow by the evaluation unit, so that the sorting unit can react to this object in accordance with the sorting decision made. If the (minimum) probability equals 100%, the evaluation unit makes an identification and classification decision, i.e. a sorting decision, with absolute certainty for all objects in the material flow. In this case (especially at very high belt speeds of the conveyor belt of a belt sorting system and/or at very high occupation densities of the objects in the material flow), the sorting decision may, however, also be incorrect.
  • the accuracy GK of one or more process steps (for example, the segmentation step or the step of applying a decision-tree-based classifier) can be set.
  • the process step(s) is/are then carried out with the (respectively) set GK.
  • setting a low GK can mean, for example, that the image data, or the data derived from the image data, are processed only in coarse-screened form as input data of the process step, in order to minimize the number of computer-assisted calculations to be carried out (see the sketch below).
  • the process step performed with this accuracy GK is ended in any case.
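  • By way of illustration, the following Python sketch shows how an accuracy level GK could control how coarsely a segmentation step screens its input data. All names (GK_TO_STRIDE, segment_with_gk) and the threshold-based segmentation are illustrative assumptions, not taken from the patent.

        import numpy as np

        # Assumed mapping from accuracy level GK to a subsampling stride.
        GK_TO_STRIDE = {"high": 1, "medium": 2, "low": 4}

        def segment_with_gk(image: np.ndarray, gk: str, threshold: float = 0.5) -> np.ndarray:
            """Threshold-segment the image; at low GK only every n-th pixel
            is processed (coarse-screened input data, fewer calculations)."""
            stride = GK_TO_STRIDE[gk]
            coarse = image[::stride, ::stride] > threshold
            # Expand the coarse result back to full resolution (nearest neighbour).
            full = np.repeat(np.repeat(coarse, stride, axis=0), stride, axis=1)
            return full[: image.shape[0], : image.shape[1]]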
  • this can also mean that the calculations of the process step are terminated after a defined (for example, low) number of repetitions of the iteration loop (for iterative calculations) or after reaching a defined (e.g., low) recursion depth (for recursive calculations), i.e., after a defined termination criterion is fulfilled.
  • stepwise refining algorithms (in particular: recursively, iteratively and/or incrementally refining algorithms) can be used for this purpose.
  • the algorithms can also be non-repetitive and/or analytical.
  • the calculation time BZ of one or more process steps can be at least approximately predetermined or set.
  • the process step(s) can then be carried out with the (respectively) set BZ, i.e. until the BZ has expired (termination criterion), as sketched below.
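  • A minimal sketch of such BZ-controlled execution, assuming the process step is exposed as a generator that yields successively refined results (an illustrative contract, not the patent's actual interface):

        import time

        def run_with_time_budget(step, data, budget_s: float):
            """Run an incrementally refining process step until its calculation
            time BZ expires, then return the best result produced so far."""
            deadline = time.monotonic() + budget_s
            result = None
            for result in step(data):              # each pass refines the result
                if time.monotonic() >= deadline:   # termination criterion: BZ expired
                    break
            return result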
  • step-by-step accuracy-refining process steps can be implemented or realized.
  • Any two or all three of the mentioned evaluation parameter types (GK, BZ and/or WH) can be used together by the evaluation unit for identifying and classifying the objects of the material flow.
  • in one process step (for example, the segmentation), the GK can be set, while in another process step (for example, the application of a decision-tree-based classifier) the BZ is set; a possible per-step assignment is sketched below.
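  • A hypothetical per-step configuration in this spirit; the step names follow the process steps 7a-7g of FIG. 3, while the concrete types and values are assumptions for illustration only:

        # One evaluation-parameter type per process step, mixing GK, BZ and WH.
        EVALUATION_PARAMS = {
            "preprocessing_7a":    {"type": "GK", "value": "medium"},  # accuracy level
            "step_7b":             {"type": "BZ", "value": 0.002},     # time budget [s]
            "segmentation_7c":     {"type": "GK", "value": "high"},
            "context_analysis_7d": {"type": "GK", "value": "low"},
            "feature_calc_7e":     {"type": "WH", "value": 5},         # recursion depth 3..7
            "classification_7f":   {"type": "GK", "value": "medium"},
            "sorting_decision_7g": {"type": "GK", "value": "high"},
        }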
  • the implementation of the algorithms used is carried out such that, for all evaluation parameters or types of evaluation parameters used, for each process step, a best-possible intermediate result is available at any time during the execution of such a process step (or at the time of its interruption).
  • the identification and classification or the evaluation can therefore be carried out in several process steps (hereinafter also referred to as components) of the evaluation unit.
  • For each process step (each component), one type, preferably exactly one type, of evaluation parameter according to claims 3 to 5 can be defined in each case. Further advantageously realizable features can be found in claim 7.
  • evaluation parameters can thus be set which are used directly in the evaluation unit in order to directly control the identification and classification (or the process steps thereof) by the evaluation unit.
  • evaluation parameters can also be set which control the image acquisition unit 1 and which thus indirectly, i.e. as a consequence of this control, influence the identification and classification performed by the evaluation unit (or the process steps thereof).
  • An example of such an evaluation parameter is the image resolution (i.e., the number of pixels per unit area) of the image acquisition unit. This can be decreased, e.g., by combining multiple pixels (see the binning sketch below).
  • the image data then have a lower image resolution; the occupancy parameters thus become less accurate/coarser and/or fewer in number (and can thus be determined more quickly).
  • the latter then also indirectly simplifies or accelerates the identification and classification by or in the evaluation unit.
  • the latter can also be effected by adapting the setting of one or more evaluation parameters of the evaluation unit on the basis of the evaluation parameter(s) of the image acquisition unit.
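  • A common way to realize such a resolution reduction is pixel binning. The following sketch (illustrative only, assuming a single-channel image) combines factor x factor pixel blocks into one:

        import numpy as np

        def bin_pixels(image: np.ndarray, factor: int = 2) -> np.ndarray:
            """Lower the image resolution by averaging factor x factor blocks."""
            h, w = image.shape
            h2, w2 = h - h % factor, w - w % factor   # crop to a multiple of factor
            blocks = image[:h2, :w2].reshape(h2 // factor, factor, w2 // factor, factor)
            return blocks.mean(axis=(1, 3))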
  • in addition, a sorting parameter can preferably be set at the sorting unit.
  • One possible sorting parameter is the combination of several adjacent blow-out nozzles of the sorting unit into a nozzle cluster. For example, with such a nozzle cluster, large objects can be sorted out more reliably, or (at a low occupation density) larger blow-out areas around objects to be sorted out can be effected. This makes it possible to further increase the probability that items to be sorted out are actually sorted out by the sorting unit.
  • the occupation density can be defined in particular as an average number of objects per unit area of the material flow (for example, as an average number of objects with which a unit area of the conveyor belt of a belt-type sorting system is occupied).
  • the occupancy density may also be the mean number of objects per unit of fall path or per unit of fall area.
  • the occupancy distribution can capture or describe whether the objects in the material flow are all singled out, or with what probability any given object is isolated on, for example, the conveyor belt (or whether objects still overlap, or with what probability cluster formations are present).
  • the context analysis as part of the identification may be a "Connected Component Analysis" as described, for example, in "Topological Algorithms for Digital Image Processing" by T. Y. Kong and A. Rosenfeld, North Holland, Amsterdam, NL, 1996.
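  • As an illustration of how an occupancy parameter could be derived via such a connected component analysis, the following sketch uses scipy.ndimage.label on a binary foreground mask; the calibration constant area_per_pixel and the function name are assumptions:

        import numpy as np
        from scipy import ndimage

        def occupation_density(mask: np.ndarray, area_per_pixel: float) -> float:
            """Occupancy parameter: connected object regions per unit belt area."""
            _, num_objects = ndimage.label(mask)   # connected component analysis
            imaged_area = mask.size * area_per_pixel
            return num_objects / imaged_area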
  • the segmentation as part of the identification may be performed, for example, as described in "Digital Image Processing and Image Acquisition" by B. Jähne, Springer, Heidelberg, Germany, 2012.
  • the identifying and classifying, in particular the identifying, may comprise further process steps such as an image preprocessing step (before the segmentation and before the context analysis) and/or a feature calculation step (after the segmentation and after the context analysis). See, for example, "Digital Image Processing and Image Acquisition" by B. Jähne, Springer, Heidelberg, Germany, 2012.
  • the implementation of individual process steps can be done in software, in hardware, or in a combination of both.
  • a central processing unit (server system) can be used, to which all units of the sorting system are connected via bidirectional data lines.
  • this central processing unit can execute, with computer assistance, all necessary data processing measures, calculations and/or process steps.
  • the present invention thus describes a procedure in which a best possible sorting decision can fundamentally be made within the available computing time, thereby making it possible in principle to comply with real-time constraints.
  • the improved utilization of the evaluation system can thus directly support qualitative advances in optical bulk material sorting, since time limits can be tightened. This is realized according to the invention by the use of algorithms which incrementally refine a sorting decision or, alternatively, align their calculations with an allocated time budget.
  • a number of implementations which differ in accuracy GK and calculation effort BZ can be used for subtasks of the evaluation system.
  • a runtime system can select concrete implementations in order to meet real-time constraints and to achieve the best possible result within the available time (see the selection sketch below).
  • the available time budget can be propagated to evaluation algorithms.
  • the algorithms can be enhanced with intelligence so that they make the best possible use of the available time.
  • interruptible sub-processes can be realized.
  • the evaluation algorithm can be interrupted by a control system at any time, and the best process-step result available up to that time can be queried.
  • information about an object to be sorted can thus always be evaluated for a sorting decision. It is therefore not the case that an object is not considered at all for classification or sorting when the time limit is exceeded. This ultimately results in an increase in sorting performance and sorting quality.
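  • A minimal sketch of such a runtime selection, assuming each implementation variant is annotated with an estimated calculation time and an expected accuracy (e.g. profiled offline; all names are illustrative):

        def select_implementation(variants, remaining_budget_s: float):
            """Choose the most accurate variant that still fits the remaining
            time budget; fall back to the fastest one if none fits."""
            feasible = [v for v in variants if v["estimated_time_s"] <= remaining_budget_s]
            if not feasible:
                return min(variants, key=lambda v: v["estimated_time_s"])
            return max(feasible, key=lambda v: v["accuracy"])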
  • it can thus be ensured that a sorting decision is present for each object contained in the material flow when it reaches the separation mechanism (sorting unit).
  • This decision is usually based on information collected about the object by the sensor (image acquisition unit); the quality of the evaluation differs depending on the time available. This means that a better decision can be made than is the case with conventional prior-art systems.
  • a lower latency between the sensory detection and the separation is made possible, whereby the spatial distance between them is minimized, an increase in the sorting quality is achieved, and a more compact sorting system is realized.
  • smaller blow-out windows can be used compared to the prior art.
  • the invention can be used for the sorting of complex bulk materials which are classified on the basis of many complex features.
  • Optical sorting systems as in the present invention can be used if the materials or the objects differ on the basis of optical features, preferably in the visible or also in the near infrared range, and thus can be classified for sorting.
  • the invention can be used in the raw-material processing industry, where, in addition to cost-effective production, a consistently high quality must always be ensured. This includes, for example, the sorting of industrial minerals, e.g. to reduce the iron content in a raw material.
  • the invention can also be used in the food or beverage sector, where impurities in products (for example: dried peppers, dried grapes ...) must be eliminated. Another important area of application is the recycling of products (for example, scrap glass sorting).
  • the present invention can make a significant contribution to improved economy in optical sorting systems.
  • a material occupancy (for example, the occupation density) and a distribution of the objects in the material flow can be determined by a monitoring unit. This can be achieved through learned models as well as through certain metrics. Based on this knowledge, a central control component can adjust elements (components or process steps) of the processing chain, for example with regard to the accuracy. Thus, a shorter calculation time of these elements can be achieved.
  • the accuracy of at least one element or process step can be set to "coarser" or "lower", so that the real-time conditions are met in any case (even if this reduces the accuracy of the classification or sorting decisions, i.e. even if the error rate, the probability of a wrong sorting or classification decision for a currently viewed object, increases). In accordance with the invention, this greatly increases the probability of compliance with all real-time conditions in the sorting system. This results in better sorting decisions overall, since said increase in compliance means a classification can be made at all for more objects in the material flow. (In the extreme case, a classification is made in any case, i.e. for every single object in the material flow.)
  • the control component can set the evaluation parameters, in particular the accuracy, the calculation time and/or the repetition frequency, on the basis of relationships determined with known material flows (with known object occupancy, known object types, object sizes, object weights, etc.). These relationships can describe dependencies between the material occupancy or the occupancy parameters on the one hand and the evaluation parameters to be set on the other hand.
  • FIGS. 1 to 5 show an embodiment of an inventive optical sorting system (and a corresponding sorting method) as follows.
  • FIG. 1 shows an optical belt sorter according to the invention.
  • FIG. 2 shows the processing sequence in the sorting system according to FIG. 1 from the perspective of the digital data (in particular: the image data) and the material flow or the path of its objects.
  • FIG. 3 shows the sequence of the sorting process according to FIGS. 1 and 2, in particular the process steps of the evaluation (identification and classification) in the evaluation unit of the system.
  • FIG. 4 shows an example of an occupancy parameter determined in accordance with FIGS. 1 to 3 (here: the material occupation density) and an evaluation parameter (here: the accuracy GK) set using it, which controls the identification and classification of the objects, that is to say the evaluation, by the evaluation unit.
  • FIG. 5 shows an example of the process step of the classification in the evaluation according to FIG. 3.
  • FIG. 1 shows an optical belt sorter which fundamentally follows the structure known from the prior art, the features according to the invention residing in the evaluation unit 2 and in the evaluation of the image data 4, respectively.
  • a bulk material flow or material flow M is transported by means of a conveyor belt 11 in a manner known per se to a person skilled in the art past an image recording unit 1 to a sorting unit 3 arranged at a defined distance from the image recording unit 1.
  • the bulk material flow M comprises a multiplicity of individual objects O, which here are to be classified or sorted into only two classes, namely good objects (to be sorted into the collecting container 13b of the sorting unit 3) and bad objects (to be sorted into the further collecting container 13a of the sorting unit 3).
  • the objects O of the material flow M must first be detected by the image acquisition unit 1, then classified into good objects and bad objects by evaluation of the image data 4 recorded by this unit 1 in the evaluation unit 2, and finally sorted.
  • the sorting into the collecting containers 13a, 13b is carried out by the compressed-air valves of the sorting unit 3, which, based on the evaluation results 10 of the evaluation unit 2, remove bad objects from the material flow M by blowing them out.
  • the image recording unit 1 comprises an imaging sensor system, here a CCD color line-scan camera 1a, which detects the material flow M or its objects O in the discharge area of the conveyor belt 11 against a background 12, i.e. records a rapid image sequence of the material flow M against the background 12. An illumination 1b of the unit 1 illuminates the material flow M against the background 12 in order to ensure optimum image recording conditions for the camera 1a.
  • the recorded image data or video data 4 are transmitted to the evaluation unit 2 via a data line connection between the camera 1a and the evaluation unit 2.
  • the evaluation unit 2 then carries out the process steps 7a-7g described below in detail for identifying and classifying the objects O of the material flow M in the image data 4 and transmits the evaluation results 10 of the performed process steps via a data link to the sorting unit 3.
  • the latter finally performs, spaced from the discharge area of the conveyor belt 11 and from the recording unit 1, the removal of bad objects, whereby the separation into good objects (container 13b) and bad objects (container 13a) takes place.
  • FIG. 2 shows the material flow (solid arrows) and the data flow (dashed arrows), in particular the data flow in the image data acquisition and evaluation, in the system according to FIG. 1.
  • the material feed onto the conveyor belt 11 is first followed by a singling and settling of the objects O on the conveyor belt, before the objects O reach the transport state shown on the left of FIG. 1, in the material flow M on the conveyor belt 11.
  • image acquisition or sensory detection of the objects O in the material flow M takes place by means of the image recording unit 1 or the camera 1a thereof.
  • the image data 4 are transmitted for evaluation to the evaluation unit 2, which performs the classification at the conclusion of the evaluation, i.e. makes the sorting decision for the individual objects O.
  • the material flow M downstream of the discharge region of the conveyor belt 11 falls along the discharge parabola.
  • the period of free fall of the objects O along the discharge parabola is defined by the distance between the end of the conveyor belt on the one hand and the impact area of the compressed-air valves of the sorting unit 3 on the other hand, and corresponds to the latency, i.e. the time available for making a sorting decision for an object O.
  • the evaluation, that is to say the identification and classification of the objects O by the evaluation unit 2, comprises a total of seven individual process steps 7.
  • the input data of the evaluation are the image data 4.
  • on the basis of the determined occupancy parameter 5 (see FIG. 4: the material occupation density), the evaluation parameter 6 controlling each process step in the evaluation unit is set. The evaluation of the occupation density necessary for calculating the evaluation parameters 6a-6g of the individual process steps 7a-7g of the identification and classification by the evaluation unit 2 is shown in FIG. 4.
  • the evaluation unit 2 sets, for the first process step 7a of the preprocessing of the image data 4, the accuracy GK 6a as the evaluation parameter controlling this preprocessing.
  • the accuracy level can be, e.g., "low", "medium" or "high" (see also FIG. 4).
  • for the second process step 7b, a fixed calculation time BZ is set by the evaluation unit 2 as the evaluation parameter 6b.
  • the control of the step 7b of the evaluation is thus based on a BZ.
  • the subsequent third process step 7c of the segmentation is again performed or controlled with an accuracy level GK set by means of the unit 2 as the evaluation parameter 6c.
  • for the fourth process step 7d of the context analysis, one of the three accuracy-level values "low", "medium" or "high" is likewise assigned or set as the evaluation parameter 6d.
  • the temporally fifth process step of the feature calculation 7e is performed or controlled on the basis of a set repetition frequency WH as the evaluation parameter 6e.
  • the feature calculation here comprises a calculation sequence to be performed recursively, the repetition frequency WH 6e taking, e.g., a value between 3 and 7 (i.e., the recursion depth of the calculations may be chosen between 3 and 7, where 3 requires little computational effort, thus allowing a fast execution of step 7e, and 7 requires high computational effort, so that the execution of step 7e takes a long time but can be performed with high accuracy; the latter is thus useful or possible only at a low occupancy density 5 if sorting decisions are to be made for all objects O). A sketch of such a depth-controlled recursive calculation follows below.
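  • The following sketch is one illustrative example of a depth-controlled recursive calculation (not the patent's actual feature calculation): estimating an object's area by quadtree subdivision of its bounding-box mask, where the recursion depth WH trades accuracy against calculation time.

        import numpy as np

        def quadtree_area(mask: np.ndarray, depth: int) -> float:
            """Area estimate that gets more accurate with larger depth (WH)."""
            if mask.all():
                return float(mask.size)            # cell entirely inside the object
            if not mask.any():
                return 0.0                         # cell entirely outside
            if depth == 0 or 1 in mask.shape:
                return mask.size / 2.0             # unresolved mixed cell: guess half
            h2, w2 = mask.shape[0] // 2, mask.shape[1] // 2
            return (quadtree_area(mask[:h2, :w2], depth - 1)
                    + quadtree_area(mask[:h2, w2:], depth - 1)
                    + quadtree_area(mask[h2:, :w2], depth - 1)
                    + quadtree_area(mask[h2:, w2:], depth - 1))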
  • the classification takes place as the sixth step 7f, an accuracy GK in turn being set as the evaluation parameter 6f for this process step 7f.
  • the final sorting decision 7g also takes place on the basis of setting an accuracy value GK 6g.
  • FIG. 4 shows how, in the system of FIGS. 1 to 3, evaluation parameters 6 can be set based on occupancy parameters 5 previously determined in the image data 4. This is shown on the basis of a single occupancy parameter 5, in this case the material occupation density, which is determined by the evaluation unit 2 from the image data recorded with the image recording unit 1 or its camera 1a (individual images 4 shown here).
  • This accuracy 6 is an evaluation parameter of the evaluation unit 2 which is set at the evaluation unit 2 in order to control the identification and classification of the objects O. As shown in FIG. 4, at a low material occupation density 5, a high accuracy 6 is set as the evaluation parameter for identifying and classifying, i.e. for all individual process steps thereof, so that the evaluation unit 2 performs all identification and classification process steps 7 with the accuracy "high" without causing a violation of the real-time conditions in the evaluation (cf. FIG. 4, middle image line).
  • FIG. 4 shows that evaluation parameters 8 of the image recording unit 1 can also be set on the basis of the material occupation density 5.
  • the camera 1a can then record image data 4 with a high image resolution 8, whereas at a significantly increased material occupation density 5, as shown in the lower line of FIG. 4, the image resolution 8 can be significantly reduced when recording the image data 4.
  • the evaluation parameter 8 of the image acquisition unit 1 (image resolution) that can be set in this way can then indirectly also influence the determination of the occupancy parameter(s) 5 and thus also the setting of the evaluation parameter(s) 6 of the evaluation unit 2.
  • FIG. 5 shows an example of a possible implementation of one of the process steps or subtasks from FIG. 3, namely the classification, using a classifier for making the best possible classification decision (and the subsequent sorting decision).
  • the basis for the classifier is a decision tree.
  • a feature of a found object is compared against a threshold value and, according to the comparison result, the path is followed to the right or to the left, as shown in FIG. 5.
  • the leaves of the decision tree store the resulting classes. According to FIGS. 1 to 4, there are only two classes here, namely the class of the good objects and the class of the bad objects.
  • these methods can be extended such that not only is a class membership stored in the leaves of the decision tree, but a probability for the different classes is also stored in each inner decision-tree node (based on the training set). For example, if two classes are distinguished, namely a class A for good objects and a class B for bad objects, with equal numbers of examples in the training set for both classes, then the probabilities of class membership for A and B in the top node f8 are 50% each.
  • f denotes the feature vector, and the eighth feature it contains is accordingly denoted f8. It should be noted that a single feature can be used as a test criterion in several nodes of the decision tree.
  • An object O to be classified falls, after the comparison with f8 in the first (top) node, into the left or the right subtree.
  • in the next node, for example on the right, the comparison is now made with f4.
  • there, the probabilities can be redistributed. For example, it is conceivable that in this node the probabilities are 30% for A and 70% for B; a sketch of such an interruptible, probability-annotated tree follows below.
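  • The following sketch (illustrative; node values chosen only to mirror the example above) shows a decision tree with class probabilities in the inner nodes, so that an interrupted descent can still report a best-so-far decision:

        from dataclasses import dataclass
        from typing import Optional

        @dataclass
        class Node:
            p_good: float                  # probability of class A ("good object")
            feature: Optional[int] = None  # index into feature vector f (e.g. 8 for f8)
            threshold: float = 0.0
            left: Optional["Node"] = None
            right: Optional["Node"] = None

        def classify(node: Node, f, time_left) -> float:
            """Descend while time remains; on interruption, return the class
            probability stored in the current (inner) node."""
            while node.feature is not None and time_left():
                node = node.left if f[node.feature] <= node.threshold else node.right
            return node.p_good

        # Top node tests f[8] with 50%/50%; its right child shifts to 30%/70%.
        tree = Node(p_good=0.5, feature=8, threshold=0.4,
                    left=Node(p_good=1.0),
                    right=Node(p_good=0.3, feature=4, threshold=0.7,
                               left=Node(p_good=0.0), right=Node(p_good=1.0)))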
  • the process step of the context analysis 7d can be realized, for example, by a "Connected Component Analysis"; see, for example, M. B. Dillencourt et al., "A General Approach to Connected Component Labeling for Arbitrary Image Representations", Journal of the ACM 39(2), 1992.
  • interruption points can be provided by subsampling the image, e.g., with a shifted raster. For example, in a first pass, only every fourth pixel is considered; for all pixels not considered, the same value as for the last considered pixel can then be assumed. If enough computation time is available, the raster can be shifted: in this way, the information about the image is refined with each pass or repetition. It is an iterative process that can be interrupted after each pass (see the sketch below).
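  • A sketch of such a pass-wise refinement over a shifted raster (illustrative only; written as a generator so that a control system can stop consuming it at any interruption point):

        import numpy as np

        def progressive_mask(image: np.ndarray, threshold: float, stride: int = 4):
            """Threshold-segment the image in stride*stride passes: the first
            pass visits one pixel per raster cell and copies its value into the
            whole cell; later passes shift the grid and refine the result."""
            h, w = image.shape
            mask = np.zeros((h, w), dtype=bool)
            for n, (dy, dx) in enumerate((y, x) for y in range(stride) for x in range(stride)):
                values = image[dy::stride, dx::stride] > threshold
                if n == 0:   # coarse initialisation of all pixels
                    mask = np.repeat(np.repeat(values, stride, 0), stride, 1)[:h, :w]
                else:        # refine only the newly visited pixels
                    mask[dy::stride, dx::stride] = values
                yield mask   # interruption point after each pass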
  • Interruptible sub-processes or process steps can also be realized, for example, as follows.
  • the area of an object is determined in pixels.
  • the pixels are simply counted in an algorithm.
  • Such a process can be interrupted at any time, and the pixels counted so far can be taken as the area.
  • the area can be estimated from other data, if already available. For example, if an axis-aligned bounding box is known, the area of this box can be used as an estimate instead of the true area (this can provide advantages through fewer memory accesses).
  • iterative methods or iteratively performed process steps can be interrupted during each loop pass. In the event of an interruption during a loop, the result of the most recently completed loop pass can be used (see the sketch below).
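  • A minimal sketch of such an interruptible area determination (illustrative; time_left is assumed to be a callable provided by the control system):

        import numpy as np

        def interruptible_area(mask: np.ndarray, time_left, bbox_area=None):
            """Count object pixels row by row; on interruption, the pixels
            counted so far are taken as the area, or a known axis-aligned
            bounding-box area is used as the estimate instead."""
            count = 0
            for row in mask:
                if not time_left():   # interruption point on each loop pass
                    return float(bbox_area) if bbox_area is not None else float(count)
                count += int(row.sum())
            return float(count)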

Landscapes

  • Sorting Of Articles (AREA)

Abstract

The present invention relates to a sorting device for sorting objects (O) of a material flow (M), comprising an image recording unit (1) by means of which the material flow (M) is optically detected and image data (4) thereof are generated, an evaluation unit (2) by means of which the objects (O) of the material flow (M) are identified and classified, and a sorting unit (3) by means of which the classified objects (O) of the material flow (M) are sorted, characterized in that one or more occupancy parameters (5), characterizing the material flow (M) with respect to its occupancy by the objects (O), is/are determined by the evaluation unit (2) from the generated image data (4), and in that, on the basis of the determined occupancy parameter(s) (5), one or more evaluation parameters (6, 8) controlling the identification and classification of the objects (O) is/are set by the evaluation unit (2).
EP17729466.7A 2016-06-14 2017-06-13 Dispositif de triage et procédé de triage correspondant Active EP3468727B1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
DE102016210482.9A DE102016210482A1 (de) 2016-06-14 2016-06-14 Optisches Sortiersystem sowie entsprechendes Sortierverfahren
PCT/EP2017/064329 WO2017216124A1 (fr) 2016-06-14 2017-06-13 Dispositif de triage et procédé de triage correspondant

Publications (2)

Publication Number Publication Date
EP3468727A1 true EP3468727A1 (fr) 2019-04-17
EP3468727B1 EP3468727B1 (fr) 2020-10-14

Family

ID=59054125

Family Applications (1)

Application Number Title Priority Date Filing Date
EP17729466.7A Active EP3468727B1 (fr) 2016-06-14 2017-06-13 Dispositif de triage et procédé de triage correspondant

Country Status (3)

Country Link
EP (1) EP3468727B1 (fr)
DE (1) DE102016210482A1 (fr)
WO (1) WO2017216124A1 (fr)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109533511A (zh) * 2018-12-25 2019-03-29 哈尔滨联科包装机械有限公司 双向拣选机及拣选方法
CN110989904B (zh) * 2019-12-13 2021-05-28 威海新北洋正棋机器人股份有限公司 交叉带分拣设备的配置方法、装置及交叉带分拣系统
DE102020110976B4 (de) 2020-04-22 2023-12-21 Separation AG Optische Sortieranlage für die Sortierung von Granulatpartikeln
CN113592824A (zh) * 2021-08-02 2021-11-02 合肥名德光电科技股份有限公司 基于深度学习的色选机分选方法

Family Cites Families (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5085325A (en) * 1988-03-08 1992-02-04 Simco/Ramic Corporation Color sorting system and method
GB2219079B (en) * 1988-05-06 1992-09-09 Gersan Ets A method of identifying individual objects or zones
GB2273154B (en) * 1992-12-02 1996-12-11 Buehler Ag Method for cleaning and sorting bulk material
US5526437A (en) * 1994-03-15 1996-06-11 Key Technology, Inc. Integrated food sorting and analysis apparatus
US6545240B2 (en) * 1996-02-16 2003-04-08 Huron Valley Steel Corporation Metal scrap sorting system
US6266390B1 (en) * 1998-09-21 2001-07-24 Spectramet, Llc High speed materials sorting using x-ray fluorescence
DE102009007481A1 (de) 2009-01-30 2010-09-02 Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung e.V. Fördersystem zum Transport von Materialien, insbesondere von Schüttgut
DE102010046438A1 (de) 2010-09-24 2012-03-29 Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung e.V. Vorrichtung und Verfahren zur optischen Charakterisierung von Materialien
DE102011103253B4 (de) 2011-05-31 2012-12-27 Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung e.V. Anordnung zur optischen Bestimmung einer Probe und entsprechendes Verfahren
DE102012001868B4 (de) 2012-01-24 2018-03-29 Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung e.V. Verfahren zum Einrichtung einer dem optischen Identifizieren von Objekten dienender Anlage, Laborbildaufnahmesystem zum Durchführen eines solchen Verfahrens und Anordnung umfassend das Laborbildaufnahmesystem sowie die Anlage
DE102014207157A1 (de) * 2014-02-28 2015-09-03 Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung e.V. Fördersystem, Anlage zur Schüttgutsortierung mit einem solchen Fördersystem und Transportverfahren

Also Published As

Publication number Publication date
DE102016210482A1 (de) 2017-12-14
EP3468727B1 (fr) 2020-10-14
WO2017216124A1 (fr) 2017-12-21

Similar Documents

Publication Publication Date Title
EP3468727B1 (fr) Dispositif de triage et procédé de triage correspondant
DE69329380T2 (de) Verfahren zum Segmentieren von Bildern und Klassifizieren von Bildelementen zur Dokumentverarbeitung
DE60224852T2 (de) Verfahren und Vorrichtung zur Verarbeitung von Fahrzeugbildern
DE102018128531A1 (de) System und Verfahren zum Analysieren einer durch eine Punktwolke dargestellten dreidimensionalen Umgebung durch tiefes Lernen
DE102018133188A1 (de) Abstandbestimmung einer probenebene in einem mikroskopsystem
DE102004063769A1 (de) Verfahren und Einrichtung zur automatischen und quantitativen Erfassung des Anteils von Saatgütern oder Körnerfrüchten bestimmter Qualität
EP1797533B1 (fr) Procede et dispositif pour segmenter une representation numerique de cellules
DE102018208126A1 (de) Verfahren zum Hantieren eines Werkstücks mit Hilfe eines Entnahmewerkzeugs und Maschine zur Durchführung des Verfahrens
EP2849151A1 (fr) Procédé d'analyse de files d'attente libres
DE102020215227B4 (de) Vorrichtung und Verfahren zum Erstellen einer Referenzaufnahme des unbeladenen Zustands eines Werkstückträgers
DE102017006566B3 (de) Vorrichtung und Verfahren zur optischen Überwachung von Oberflächen eines Körpers
DD152870A1 (de) Verfahren und vorrichtung zum klassieren in bewegung befindlichen stueckgutes
DE102018127844A1 (de) Verfahren zur Regelung des Betriebs einer Maschine zum Ernten von Hackfrüchten
DE102016100134B4 (de) Verfahren und Vorrichtung zum Untersuchen eines Objekts unter Verwendung von maschinellem Sehen
WO2008034599A2 (fr) Procédé et dispositif de traitement d'image
DE102019215255A1 (de) Vorrichtung und Verfahren zum Verarbeiten von Daten eines neuronalen Netzes
WO2023110301A1 (fr) Procédé de realisation d'une opération de réglage d'un dispositif d'inspection de récipients et dispositif d'inspection de récipients
EP3923193B1 (fr) Mesure de la sensibilité de classificateurs d'image par rapport aux changements de l'image d'entrée
EP2808843B1 (fr) Procédé de paramétrage d'un système de traitement d'image pour la surveillance d'une machine-outil
EP2642749B1 (fr) Dispositif et procédé d'optimisation de la détermination de zones de capture
DE102021133164B3 (de) Verfahren zum Durchführen eines Einstellbetriebs einer Behältnisinspektionsvorrichtung und Behältnisinspektionsvorrichtung
AT526401A1 (de) Verfahren zum Sortieren von Sortiergut
EP4405908A1 (fr) Procédé pour déterminer si un article à transporter prédéfini est disposé dans une zone de surveillance
AT511399B1 (de) Verfahren zur automatisierten klassifikation von einschlüssen
DE202022001616U1 (de) System zur Korrektur der Ergebnisse von Paketvereinzelung

Legal Events

Date Code Title Description
STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: UNKNOWN

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE INTERNATIONAL PUBLICATION HAS BEEN MADE

PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE

17P Request for examination filed

Effective date: 20181113

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

AX Request for extension of the european patent

Extension state: BA ME

RIN1 Information on inventor provided before grant (corrected)

Inventor name: MAIER, GEORG

Inventor name: BROMBERGER, MICHAEL

Inventor name: LAENGLE, THOMAS

Inventor name: BECKER, THOMAS

Inventor name: KICHERER, MARIO

Inventor name: KARL, WOLFGANG

DAV Request for validation of the european patent (deleted)
DAX Request for extension of the european patent (deleted)
STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: EXAMINATION IS IN PROGRESS

17Q First examination report despatched

Effective date: 20191213

GRAP Despatch of communication of intention to grant a patent

Free format text: ORIGINAL CODE: EPIDOSNIGR1

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: GRANT OF PATENT IS INTENDED

INTG Intention to grant announced

Effective date: 20200515

GRAS Grant fee paid

Free format text: ORIGINAL CODE: EPIDOSNIGR3

GRAA (expected) grant

Free format text: ORIGINAL CODE: 0009210

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE PATENT HAS BEEN GRANTED

AK Designated contracting states

Kind code of ref document: B1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

REG Reference to a national code

Ref country code: GB

Ref legal event code: FG4D

Free format text: NOT ENGLISH

REG Reference to a national code

Ref country code: AT

Ref legal event code: REF

Ref document number: 1323068

Country of ref document: AT

Kind code of ref document: T

Effective date: 20201015

Ref country code: CH

Ref legal event code: EP

REG Reference to a national code

Ref country code: DE

Ref legal event code: R096

Ref document number: 502017007759

Country of ref document: DE

REG Reference to a national code

Ref country code: IE

Ref legal event code: FG4D

Free format text: LANGUAGE OF EP DOCUMENT: GERMAN

REG Reference to a national code

Ref country code: NL

Ref legal event code: MP

Effective date: 20201014

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: FI

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20201014

Ref country code: GR

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20210115

Ref country code: NO

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20210114

Ref country code: RS

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20201014

Ref country code: PT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20210215

REG Reference to a national code

Ref country code: LT

Ref legal event code: MG4D

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: IS

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20210214

Ref country code: LV

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20201014

Ref country code: PL

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20201014

Ref country code: SE

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20201014

Ref country code: BG

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20210114

Ref country code: ES

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20201014

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: NL

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20201014

Ref country code: HR

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20201014

REG Reference to a national code

Ref country code: DE

Ref legal event code: R097

Ref document number: 502017007759

Country of ref document: DE

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: RO

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20201014

Ref country code: SK

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20201014

Ref country code: EE

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20201014

Ref country code: CZ

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20201014

Ref country code: LT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20201014

Ref country code: SM

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20201014

PLBE No opposition filed within time limit

Free format text: ORIGINAL CODE: 0009261

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: NO OPPOSITION FILED WITHIN TIME LIMIT

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: DK

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20201014

26N No opposition filed

Effective date: 20210715

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: AL

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20201014

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: SI

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20201014

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: MC

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20201014

GBPC Gb: european patent ceased through non-payment of renewal fee

Effective date: 20210613

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: LU

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20210613

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: IE

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20210613

Ref country code: GB

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20210613

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: IS

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20210214

P01 Opt-out of the competence of the unified patent court (upc) registered

Effective date: 20230524

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: CY

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20201014

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: HU

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT; INVALID AB INITIO

Effective date: 20170613

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: CH

Payment date: 20230702

Year of fee payment: 7

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: MK

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20201014

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: DE

Payment date: 20240617

Year of fee payment: 8

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: AT

Payment date: 20240617

Year of fee payment: 8

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: FR

Payment date: 20240621

Year of fee payment: 8

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: BE

Payment date: 20240618

Year of fee payment: 8

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: MT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20201014

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: IT

Payment date: 20240628

Year of fee payment: 8

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: CH

Payment date: 20240701

Year of fee payment: 8