EP3850545A1 - Method and apparatus for identifying an article - Google Patents
Method and apparatus for identifying an article
Info
- Publication number
- EP3850545A1 (application EP19773741.4A)
- Authority
- EP
- European Patent Office
- Prior art keywords
- recording
- data
- features
- extraction algorithm
- algorithm
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/70—Arrangements for image or video recognition or understanding using pattern recognition or machine learning
- G06V10/74—Image or video pattern matching; Proximity measures in feature spaces
- G06V10/75—Organisation of the matching processes, e.g. simultaneous or sequential comparisons of image or video features; Coarse-fine approaches, e.g. multi-scale approaches; using context analysis; Selection of dictionaries
- G06V10/758—Involving statistics of pixels or of feature values, e.g. histogram matching
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/40—Software arrangements specially adapted for pattern recognition, e.g. user interfaces or toolboxes therefor
- G06F18/41—Interactive pattern learning with a human teacher
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/0002—Inspection of images, e.g. flaw detection
- G06T7/0004—Industrial image inspection
- G06T7/001—Industrial image inspection using an image reference approach
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/70—Arrangements for image or video recognition or understanding using pattern recognition or machine learning
- G06V10/77—Processing image or video features in feature spaces; using data integration or data reduction, e.g. principal component analysis [PCA] or independent component analysis [ICA] or self-organising maps [SOM]; Blind source separation
- G06V10/7715—Feature extraction, e.g. by transforming the feature space, e.g. multi-dimensional scaling [MDS]; Mappings, e.g. subspace methods
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30108—Industrial image inspection
- G06T2207/30136—Metal
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V2201/00—Indexing scheme relating to image or video recognition or understanding
- G06V2201/06—Recognition of objects for industrial automation
Definitions
- the invention relates to a method and a device for identifying an object.
- the invention further relates to the use of such a method or the use of such a device in industrial production.
- the object is, in particular, a manufactured sheet metal part.
- Disposing of the item is a waste of resources and also carries the risk that the item will be missing from a particular job.
- the completely manual identification of the object is complex and can often only be carried out successfully by experienced employees, who are not always present or who have to be pulled away from other tasks in order to identify the object.
- a fully automated identification of an object often fails due to inadequate recordings of the object or inadequate identification programs.
- the invention thus relates to a method for identifying an object.
- the object can be embodied in the form of an essentially two-dimensional object, for example in the form of a flat sheet metal part, or in the form of a three-dimensional object, for example in the form of a sheet metal part with deformations.
- Identification is done by assigning a recording of the object to the stored object data.
- This object data can be stored in the form of CAD data.
- the method has the following method steps:
- according to the invention, it is therefore provided to output at least one item of information relating to the assignment of the object to be identified to stored object data.
- the information is preferably given in the form of probability information, so that it can be seen with what probability the object corresponds to the stored object data.
- the output can take place, for example, on a screen, on data glasses, or as a data transfer.
- a plurality of assignment information items can be output, each with an indication of its probability.
- features are extracted from the image as well as from the stored object data. This enables a particularly reliable assignment of the recording to the stored object data and thus a particularly high probability of identifying the object.
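The two-sided extraction and comparison described above can be sketched as follows. This is a minimal illustration under stated assumptions, not the patent's actual algorithms: the histogram features, the rendering of object data to pixel values, and all function names are assumptions.

```python
# Sketch: extract comparable features from a recording and from stored
# object data, then compare them. Histogram features are an assumption.

def extract_features(pixels, bins=4, max_value=256):
    """Reduce raw data (recording pixels, or pixel values rendered from
    stored object data) to a normalized intensity histogram."""
    counts = [0] * bins
    for value in pixels:
        counts[min(value * bins // max_value, bins - 1)] += 1
    total = len(pixels)
    return [c / total for c in counts]

def compare(recording_features, object_features):
    """Similarity as 1 minus half the L1 distance (1.0 = identical)."""
    diff = sum(abs(a - b) for a, b in zip(recording_features, object_features))
    return 1.0 - diff / 2.0

recording = extract_features([10, 20, 200, 210, 220, 230])
stored = extract_features([15, 25, 205, 215, 225, 235])
print(compare(recording, stored))  # identical histograms -> 1.0
```

Because the same kind of feature vector is produced on both sides, the comparison step reduces to a distance in one feature space, which is what makes the assignment reliable.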
- the method preferably has the following method step:
- the method then provides for an evaluation by a user with regard to this assignment information.
- the evaluation by the user can take place, for example, by data transmission or via an input device, for example a button, a touch screen, voice recording, speech recognition or a similar input device. If, for example, the user is shown two or more identification matches sorted by probability, the user can select the one they actually agree with, thereby giving a user rating. This makes it possible to improve future assignment information.
- a recording extraction algorithm can read in data of the recording, process it depending on predefined recording parameters, and output recording features in the form of the processed data.
- the predefined recording parameters can have so-called weighted variables. Their function and determination are explained below.
- the method can further preferably have the following method step:
- the parameters can have so-called weighted variables. Their determination and function are explained below.
- the method step is preferably carried out as follows:
- G) storing the user rating of the assignment information in a user rating result memory.
- saving the user rating enables many user ratings to be collected, as a result of which the recording extraction algorithm, the object extraction algorithm and/or the comparison algorithm can be significantly improved.
- the user rating result memory can be cloud-based.
- cloud-based means a storage device, in particular a locally remote, preferably anonymized, storage device in which the user ratings of more than one, advantageously several, or even several thousand, different users are stored.
- different users can thus contribute to the optimization of the method regardless of their manufacturing location. It was recognized that the described method only becomes a resounding success, that is to say only yields assignment information with the correct assignment at the highest probability, after tens of thousands, in particular hundreds of thousands, of user ratings have been read out. Such an amount of data is often not available at a single manufacturing facility within a year; accordingly, without pooling the ratings, the method would probably have remained uninteresting.
- the recording can be made or recorded in the wavelength range visible to the human eye.
- the recording can be made in the wavelength range that is not visible to the human eye, for example in the IR range, in the UV range and / or in the ultrasound range.
- in method step A, several recordings of the object can be made with the image capture device.
- the recording features can be extracted from several recordings.
- assignment information can be output from several recordings of the stored objects. It has been shown that the creation and processing of a plurality of recordings considerably improves the quality of the assignment information for a user, since in particular the influence of artifacts caused by the recording position is reduced.
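Combining several recordings to damp recording-position artifacts could look like the sketch below. Simple averaging of the feature vectors is an assumption; the patent does not prescribe how the features of multiple recordings are combined.

```python
def fuse_recording_features(feature_vectors):
    """Average the feature vectors extracted from several recordings of the
    same object, damping artifacts caused by any single recording position."""
    n = len(feature_vectors)
    length = len(feature_vectors[0])
    return [sum(v[i] for v in feature_vectors) / n for i in range(length)]

# Three recordings of one object; the second has a reflection artifact
# that distorts its first feature component.
fused = fuse_recording_features([
    [0.2, 0.8, 0.1],
    [0.9, 0.8, 0.1],  # outlier in the first component
    [0.2, 0.8, 0.1],
])
print(fused)
```

The outlier's influence shrinks with every additional recording, which matches the observation above that a plurality of recordings considerably improves the assignment information.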
- the recording extraction algorithm, the object extraction algorithm and/or the comparison algorithm preferably comprise an algorithm with a plurality of data aggregation routines.
- a data aggregation routine can be designed to aggregate several "determined data" into a new data packet.
- the new data packet can have one or more numbers or vectors.
- the new data packet can be made available in whole or in part to other data aggregation routines as "determined data".
- "determined data" can be, for example, recording data, object data, or data packets made available by one of the data aggregation routines.
- particularly preferably, the recording extraction algorithm, the object extraction algorithm and/or the comparison algorithm is/are in the form of an algorithm with several connected data aggregation routines.
- the recording extraction algorithm, the object extraction algorithm and/or the comparison algorithm can have a function with weighted variables.
- one, in particular several, particularly preferably all of the data aggregation routines can be designed to combine, in particular multiply, a plurality of "determined data" with a weighted variable, thus converting the "determined data" into combined data, and then to aggregate, in particular add, the combined data into a new data packet.
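A data aggregation routine of this kind, multiplying each item of "determined data" by a weighted variable and then adding the combined data into a new data packet, can be sketched as follows. The flat-list representation and the layer layout are assumptions made for illustration.

```python
def data_aggregation_routine(determined_data, weighted_variables, bias=0.0):
    """Combine (multiply) each 'determined data' value with its weighted
    variable, then aggregate (add) the combined data into one new value."""
    assert len(determined_data) == len(weighted_variables)
    combined = [d * w for d, w in zip(determined_data, weighted_variables)]
    return sum(combined) + bias

def connected_routines(determined_data, layers):
    """Several connected routines: the data packets output by one layer of
    routines become the 'determined data' of the next layer."""
    data = determined_data
    for layer in layers:  # each layer is a list of (weights, bias) pairs
        data = [data_aggregation_routine(data, w, b) for w, b in layer]
    return data

packet = connected_routines(
    [1.0, 2.0],
    [[([0.5, 0.5], 0.0), ([1.0, -1.0], 0.0)],  # two routines in parallel
     [([1.0, 1.0], 0.0)]],                     # one routine aggregating both
)
print(packet)  # -> [0.5]
```

Structured this way, the connected routines amount to a small weighted network, which is consistent with the later statement that the weighted variables are changed on the basis of the user rating.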
- the change in the weighting, in particular the change in the weighted variables, is particularly preferably carried out on the basis of the user rating.
- the recording extraction algorithm, the object extraction algorithm and/or the comparison algorithm can be run through with data, in particular recording data and/or object data, whose relationship is known in each case.
- the weighted variables can be determined, preferably in a first phase, separately for the recording extraction algorithm, the object extraction algorithm and the comparison algorithm.
- the recording features and object features can in this case themselves be data packets, in particular a plurality of structured data, in particular data vectors or data arrays, which in turn themselves constitute "determined data", e.g. for the comparison algorithm, in particular for the data aggregation routines of the comparison algorithm.
- the precise structure of these recording features and object features can change, in particular improve, preferably optimize itself, through the machine evaluation of the user rating.
- if the changed, in particular improved, preferably optimized, weighted variables are managed cloud-based, other users can use them in their algorithms and thus benefit from the method.
- the algorithms mentioned, or a further subordinate or higher-level algorithm, can be designed to monitor and recognize when one or all of the algorithms output, with a predetermined frequency, assignment information that the user assesses as bad, and to then output a negative message.
- the output can be visual, for example on a screen or in another suitable form, for example as data output.
- the monitoring algorithm can also be designed to respond to the output of such a negative message with an improvement routine that changes further properties of, or the interaction between, one or more of the algorithms mentioned.
- the object features are in the form of construction data, material data, surface texture data and / or thermal conductivity data.
- the comparison algorithm can include forming a dot product and/or forming a difference between the recording features and the object features. The aforementioned measures have proven to be particularly effective in assigning the recording features to the object features.
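A comparison based on a dot product and/or a difference between the two feature vectors, as stated above, might look like this sketch. The normalization of the dot product to a cosine similarity is an added assumption.

```python
import math

def dot_product_similarity(recording_features, object_features):
    """Normalized dot product (cosine similarity): close to 1.0 when the
    recording features and object features point in the same direction."""
    dot = sum(r * o for r, o in zip(recording_features, object_features))
    norm_r = math.sqrt(sum(r * r for r in recording_features))
    norm_o = math.sqrt(sum(o * o for o in object_features))
    return dot / (norm_r * norm_o)

def difference_distance(recording_features, object_features):
    """Euclidean length of the difference vector: 0.0 for identical features."""
    return math.sqrt(sum((r - o) ** 2
                         for r, o in zip(recording_features, object_features)))

rec = [0.2, 0.8, 0.1]
obj = [0.2, 0.8, 0.1]
print(dot_product_similarity(rec, obj))  # approximately 1.0
print(difference_distance(rec, obj))     # 0.0
```

Either measure turns the pair of feature packets into a single score per stored object, from which the assignment probabilities can be derived.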
- the output of the assignment information of the recording further preferably comprises the output of several assignment probabilities to different stored objects.
- a user can choose from various assignment options.
- the identification of the object can thus be carried out with a very high probability of success.
- the object according to the invention is further achieved by a device for identifying an object by assigning the object to stored object data, the device having the following:
- an image capture device for taking a picture of the object
- the recording extraction algorithm, the object extraction algorithm and / or the comparison algorithm is designed to be optimizable on the basis of the user rating
- the device according to the invention is preferably designed to carry out a method described here.
- the image capture device can be in the form of a camera, in particular for visible light.
- the device can also have the following:
- the device can have the following: g) a user rating result memory for storing the user rating of the assignment information.
- the user rating result memory can be cloud-based.
- the recording extraction algorithm, the object extraction algorithm and / or the comparison algorithm can be designed in the form of an algorithm with a plurality of connected data aggregation routines.
- the method and / or the device can be used particularly advantageously in industrial production with a computer-based production control for processing reflective objects, in particular sheet metal parts.
- reflective objects are understood to mean objects with a smooth surface that reflect light in such a way that, when the recording is made, undesired light reflections of other objects can occur in addition to the contours.
- such reflective objects are metals, glass, plastics with a smooth surface, and coated materials, such as coated plates made of plastic, wood, metal, glass, etc.
- it is particularly advantageous if the production control is at least partially cloud-based. Then parameters, in particular the weighted variables, for changing, in particular improving, preferably optimizing, the algorithms from a first production site can also be used at other production sites, and vice versa. This means that a much larger database is available and the identification for each individual manufacturing company can be significantly improved.
- further advantages of the invention result from the description and the drawing. Likewise, according to the invention, the features mentioned above and those elaborated further below can be used individually or together in any combination. The embodiment shown and described is not to be understood as exhaustive, but rather is of an exemplary character for the description of the invention.
- FIG. 1 shows a schematic representation of an embodiment of the method according to the invention or the device according to the invention.
- FIG. 1 shows a device 10 for identifying an object that has been found without an assignment.
- the object can be in the form of a sheet metal part.
- Item data 12 are stored for the item.
- At least one image 14 is created.
- the object data 12 are preferably stored in the form of CAD data, in particular in an object database 16 in the form of a CAD database.
- the image 14 is created with an image capture device 18.
- the image capturing device 18 can be designed in the form of a camera.
- the recording 14 shown in FIG. 1 indicates that the assignment of such a recording 14 can be very difficult. This is particularly the case if, as shown, the recording 14 only shows a part of the object, the recording 14 is made against a busy background, and/or the surface properties of the object make the recording 14 difficult to create.
- a recording extraction algorithm 20 is applied to the recording 14 in order to extract recording features 22. This is indicated schematically in FIG. 1 by the area framed in dash-dotted lines in the recording 14.
- the recording extraction algorithm 20 is stored in a recording extraction unit 24.
- the recording extraction algorithm 20 can have data aggregation routines which are connected to one another, in particular weighted to one another.
- an object extraction algorithm 26 is applied to the object data 12 to extract object features 28. This is indicated schematically in FIG. 1 by the area framed in dash-dotted lines in the object data 12.
- the object extraction algorithm 26 is stored in an object extraction unit 30.
- the object extraction algorithm 26 can have interconnected, in particular mutually weighted, data aggregation routines.
- the recording features 22 and the object features 28 are fed to a comparison algorithm 32.
- the comparison algorithm 32 is stored in a comparison unit 34.
- the comparison algorithm 32 can have interconnected, in particular mutually weighted, data aggregation routines.
- the comparison algorithm 32 is preferably designed to form a dot product or a difference between the recording features 22 and the object features 28.
- Assignment information 36 is output as a result of the comparison algorithm 32.
- the assignment information is output in an output unit 38.
- the assignment information 36 preferably comprises a plurality of object data sets (here three; not provided with a reference symbol in FIG. 1), together with the respective assignment probability for each (here 60%, 35% and 5%). This considerably simplifies the assignment of the found object to object data for a user.
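Ranking several candidate object data sets by assignment probability, as in the 60% / 35% / 5% example above, can be sketched like this. The softmax conversion of raw comparison scores into probabilities and the part names are assumptions; the patent does not specify how the probabilities are computed.

```python
import math

def assignment_information(scores, temperature=1.0):
    """Convert raw comparison scores per stored object into assignment
    probabilities (softmax) and return them sorted, best match first."""
    exps = {name: math.exp(s / temperature) for name, s in scores.items()}
    total = sum(exps.values())
    ranked = sorted(((name, e / total) for name, e in exps.items()),
                    key=lambda item: item[1], reverse=True)
    return ranked

# Hypothetical raw scores from the comparison algorithm:
ranked = assignment_information({"part_A": 3.1, "part_B": 2.6, "part_C": 0.6})
for name, probability in ranked:
    print(f"{name}: {probability:.0%}")
```

Presenting the full ranked list, rather than only the top match, is what lets the user pick the correct assignment and thereby produce a user rating.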
- the device 10 has an input unit 40.
- the input unit 40 is designed to read out a user rating 42.
- the user rating 42 is then used to optimize the recording extraction algorithm 20, the object extraction algorithm 26 and/or the comparison algorithm 32, or their parameters.
- the user rating 42 can be stored in a user rating result memory 44, so that the method or the device 10 can be optimized with a large number of user ratings 42. It is particularly preferred that the user rating result memory 44 is cloud-based. In this way, user ratings 42 can be incorporated into the optimization of the method or the device 10 across devices.
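Using the stored user ratings 42 to adjust the weighted variables can be sketched as a simple perceptron-style update. This update rule is an assumption made for illustration; the patent does not specify how the ratings change the weights.

```python
def update_weights(weights, features, user_agreed, learning_rate=0.1):
    """Nudge the weighted variables of a scoring function towards (or away
    from) a feature vector, depending on whether the user confirmed the
    proposed assignment."""
    direction = 1.0 if user_agreed else -1.0
    return [w + learning_rate * direction * f
            for w, f in zip(weights, features)]

def score(weights, features):
    """Weighted sum used to score a candidate assignment."""
    return sum(w * f for w, f in zip(weights, features))

weights = [0.0, 0.0, 0.0]
features = [1.0, 0.5, -0.2]
# Replay ratings collected in the user rating result memory:
for agreed in [True, True, False, True]:
    weights = update_weights(weights, features, agreed)

print(score(weights, features))  # grows with net-positive ratings
```

Replaying many stored ratings this way is why a large, possibly cloud-based, pool of user ratings improves the algorithms across devices.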
- the invention relates to a method and a device 10 for identifying an object.
- at least one recording 14, in particular in the form of a photograph, is made of the object.
- recording features 22 are determined from the recording 14 by means of a recording extraction algorithm 20.
- from the stored object data 12, object features 28 are determined and compared with the recording features 22 in order to output assignment information 36.
- it is provided in particular to use a user rating 42 both for improving the recording extraction algorithm 20 and for improving the object extraction algorithm 26.
- it is provided according to the invention in particular to design both the recording extraction algorithm 20 and the object extraction algorithm 26 on the basis of interconnected, preferably mutually weighted, data aggregation routines.
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Computer Vision & Pattern Recognition (AREA)
- General Physics & Mathematics (AREA)
- Physics & Mathematics (AREA)
- Artificial Intelligence (AREA)
- Evolutionary Computation (AREA)
- Databases & Information Systems (AREA)
- Software Systems (AREA)
- General Health & Medical Sciences (AREA)
- Computing Systems (AREA)
- Health & Medical Sciences (AREA)
- Multimedia (AREA)
- Medical Informatics (AREA)
- Quality & Reliability (AREA)
- Human Computer Interaction (AREA)
- Bioinformatics & Cheminformatics (AREA)
- Bioinformatics & Computational Biology (AREA)
- Data Mining & Analysis (AREA)
- Evolutionary Biology (AREA)
- General Engineering & Computer Science (AREA)
- Life Sciences & Earth Sciences (AREA)
- Image Analysis (AREA)
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
DE102018215538.0A DE102018215538A1 (en) | 2018-09-12 | 2018-09-12 | Method and device for identifying an object |
PCT/EP2019/073383 WO2020053023A1 (en) | 2018-09-12 | 2019-09-03 | Method and apparatus for identifying an article |
Publications (1)
Publication Number | Publication Date |
---|---|
EP3850545A1 true EP3850545A1 (en) | 2021-07-21 |
Family
ID=68062887
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
EP19773741.4A Pending EP3850545A1 (en) | 2018-09-12 | 2019-09-03 | Method and apparatus for identifying an article |
Country Status (6)
Country | Link |
---|---|
US (1) | US11972600B2 (en) |
EP (1) | EP3850545A1 (en) |
JP (1) | JP2022500757A (en) |
CN (1) | CN112703508A (en) |
DE (1) | DE102018215538A1 (en) |
WO (1) | WO2020053023A1 (en) |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
DE102022115997A1 (en) | 2022-06-28 | 2023-12-28 | TRUMPF Werkzeugmaschinen SE + Co. KG | Method and system to support the differentiation of sheet metal workpieces |
Family Cites Families (17)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP3053512B2 (en) * | 1993-09-22 | 2000-06-19 | 三菱電機株式会社 | Image processing device |
US6975764B1 (en) * | 1997-11-26 | 2005-12-13 | Cognex Technology And Investment Corporation | Fast high-accuracy multi-dimensional pattern inspection |
US6856698B1 (en) * | 1997-11-26 | 2005-02-15 | Cognex Corporation | Fast high-accuracy multi-dimensional pattern localization |
JPH11320143A (en) | 1998-05-12 | 1999-11-24 | Amada Co Ltd | Device and method for additional processing |
JP2002230546A (en) * | 2001-01-30 | 2002-08-16 | Fujitsu Ltd | Image processing program, computer readable storage medium storing it, and image processing method and device |
CN1741046A (en) * | 2004-08-27 | 2006-03-01 | 鸿富锦精密工业(深圳)有限公司 | Sheet metal component Engineering Design Management System and method |
US8032469B2 (en) * | 2008-05-06 | 2011-10-04 | Microsoft Corporation | Recommending similar content identified with a neural network |
DE102009012543A1 (en) | 2009-03-10 | 2010-09-23 | Esab Cutting Systems Gmbh | cutter |
US9082086B2 (en) * | 2011-05-20 | 2015-07-14 | Microsoft Corporation | Adaptively learning a similarity model |
JP5929238B2 (en) * | 2012-01-27 | 2016-06-01 | オムロン株式会社 | Image inspection method and image inspection apparatus |
DE102014213518A1 (en) * | 2014-07-11 | 2016-01-14 | Trumpf Werkzeugmaschinen Gmbh + Co. Kg | Method, processing machine and computer program product for image-based placement of workpiece machining operations |
US9440351B2 (en) * | 2014-10-30 | 2016-09-13 | International Business Machines Corporation | Controlling the operations of a robotic device |
CA3195997A1 (en) * | 2014-11-21 | 2016-05-26 | Christopher M. Mutti | Imaging system for object recognition and assessment |
US9483707B2 (en) * | 2015-02-04 | 2016-11-01 | GM Global Technology Operations LLC | Method and device for recognizing a known object in a field of view of a three-dimensional machine vision system |
JP6753105B2 (en) * | 2016-03-29 | 2020-09-09 | 日本電気株式会社 | Identification device |
US10249033B1 (en) * | 2016-12-20 | 2019-04-02 | Palantir Technologies Inc. | User interface for managing defects |
US10497257B2 (en) * | 2017-08-31 | 2019-12-03 | Nec Corporation | Parking lot surveillance with viewpoint invariant object recognition by synthesization and domain adaptation |
2018
- 2018-09-12 DE DE102018215538.0A patent/DE102018215538A1/en active Pending
2019
- 2019-09-03 JP JP2021513836A patent/JP2022500757A/en active Pending
- 2019-09-03 CN CN201980059832.1A patent/CN112703508A/en active Pending
- 2019-09-03 EP EP19773741.4A patent/EP3850545A1/en active Pending
- 2019-09-03 US US17/197,070 patent/US11972600B2/en active Active
- 2019-09-03 WO PCT/EP2019/073383 patent/WO2020053023A1/en unknown
Also Published As
Publication number | Publication date |
---|---|
US11972600B2 (en) | 2024-04-30 |
US20230118305A1 (en) | 2023-04-20 |
WO2020053023A1 (en) | 2020-03-19 |
DE102018215538A1 (en) | 2020-03-12 |
CN112703508A (en) | 2021-04-23 |
JP2022500757A (en) | 2022-01-04 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
EP3797337B1 (en) | Method for handling a workpiece with the aid of a removal tool and machine for carrying out the method | |
DE102004063769A1 (en) | Method for automatically and quantitatively determining the amount of seed or grain of required quality comprises recording the seed and grain using an imaging device and further processing | |
WO2018219551A1 (en) | Method and device for automatic gesture recognition | |
DE102017213583A1 (en) | Process for production planning | |
EP3850545A1 (en) | Method and apparatus for identifying an article | |
WO2004029738A1 (en) | Method for the computer-supported generation of prognoses for operative systems and system for the generation of prognoses for operative systems | |
EP4221930A1 (en) | Method, computer program product, and device comprising such a product for displaying the influence of cutting parameters on a cutting edge | |
DE102020207613A1 (en) | Method for evaluating a cutting edge of a body | |
DE102011121877A1 (en) | Method and device for determining classification parameters for the classification of banknotes | |
EP1392455B1 (en) | Method for automatic selection of sheet metal, especially sheet metal for forming components | |
DE102020208765A1 (en) | Image classifier with variable receptive fields in convolutional layers | |
DE102019208922A1 (en) | Method and device for controlling a production process | |
DE102022103844B3 (en) | Method for optimizing a production process based on visual information and device for carrying out the method | |
DE102021123761A1 (en) | Component classification device, method for classifying components and method for training a component classification device | |
EP4338135A1 (en) | Component-classifying apparatus, method for classifying components, and method for training a component-classifying apparatus | |
WO2021058399A1 (en) | Method for assigning a marker of a positioning system to a production task by identifying a workpiece, and device for carrying out the method | |
DE102021131082A1 (en) | Method and AI system for providing a crack formation probability of a deep-drawn component | |
DE102021211828A1 (en) | Computer-implemented method to support planning and organization of a construction project, device for data processing as part of such a method, and computer program for implementing such a method | |
DE102014016676A1 (en) | Method for the computer-aided selection of applicants from a large number of applicants for a given requirement profile | |
WO2023072775A1 (en) | Verifying people in portrait paintings | |
DE102022121545A1 (en) | Microscopy system and method for generating a machine learned model for processing microscope data | |
DE102021202564A1 (en) | Device and in particular computer-implemented method for classifying data sets | |
DE102023201929A1 (en) | Method and device for providing configuration suggestions for a configuration of a modular production system | |
WO2023198700A1 (en) | Computer-implemented method and systems for repositioning defective products | |
EP4252706A1 (en) | Automatic detection of dental indications |
Legal Events
Code | Title | Description |
---|---|---|
STAA | Information on the status of an ep patent application or granted ep patent | STATUS: UNKNOWN |
STAA | Information on the status of an ep patent application or granted ep patent | STATUS: THE INTERNATIONAL PUBLICATION HAS BEEN MADE |
STAA | Information on the status of an ep patent application or granted ep patent | STATUS: THE INTERNATIONAL PUBLICATION HAS BEEN MADE |
PUAI | Public reference made under article 153(3) epc to a published international application that has entered the european phase | ORIGINAL CODE: 0009012 |
STAA | Information on the status of an ep patent application or granted ep patent | STATUS: REQUEST FOR EXAMINATION WAS MADE |
17P | Request for examination filed | Effective date: 20210112 |
AK | Designated contracting states | Kind code of ref document: A1; Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR |
DAV | Request for validation of the european patent (deleted) | |
DAX | Request for extension of the european patent (deleted) | |
STAA | Information on the status of an ep patent application or granted ep patent | STATUS: EXAMINATION IS IN PROGRESS |
RAP3 | Party data changed (applicant data changed or rights of an application transferred) | Owner name: TRUMPF WERKZEUGMASCHINEN SE + CO. KG |
17Q | First examination report despatched | Effective date: 20230315 |