WO2021004710A1 - Method and device for inspecting containers in a container mass flow - Google Patents

Method and device for inspecting containers in a container mass flow

Info

Publication number
WO2021004710A1
WO2021004710A1
Authority
WO
WIPO (PCT)
Prior art keywords
neural network
training
deep neural
images
container
Prior art date
Application number
PCT/EP2020/065695
Other languages
German (de)
English (en)
Inventor
Alexander Hewicker
Stefan Piana
Anton Niedermeier
Stefan Schober
Original Assignee
Krones Ag
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Krones Ag filed Critical Krones Ag
Publication of WO2021004710A1

Links

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01NINVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N21/00Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N21/84Systems specially adapted for particular applications
    • G01N21/88Investigating the presence of flaws or contamination
    • G01N21/90Investigating the presence of flaws or contamination in a container or its contents
    • G01N21/9018Dirt detection in containers
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01NINVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N21/00Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N21/84Systems specially adapted for particular applications
    • G01N21/88Investigating the presence of flaws or contamination
    • G01N21/90Investigating the presence of flaws or contamination in a container or its contents
    • G01N21/9036Investigating the presence of flaws or contamination in a container or its contents using arrays of emitters or receivers
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/0002Inspection of images, e.g. flaw detection
    • G06T7/0004Industrial image inspection
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01NINVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N21/00Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N21/84Systems specially adapted for particular applications
    • G01N21/88Investigating the presence of flaws or contamination
    • G01N21/8851Scan or image signal processing specially adapted therefor, e.g. for scan signal adjustment, for detecting different kinds of defects, for compensating for structures, markings, edges
    • G01N2021/8883Scan or image signal processing specially adapted therefor, e.g. for scan signal adjustment, for detecting different kinds of defects, for compensating for structures, markings, edges involving the calculation of gauges, generating models
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01NINVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N21/00Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N21/84Systems specially adapted for particular applications
    • G01N21/88Investigating the presence of flaws or contamination
    • G01N21/8851Scan or image signal processing specially adapted therefor, e.g. for scan signal adjustment, for detecting different kinds of defects, for compensating for structures, markings, edges
    • G01N2021/8887Scan or image signal processing specially adapted therefor, e.g. for scan signal adjustment, for detecting different kinds of defects, for compensating for structures, markings, edges based on image processing techniques
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/20Special algorithmic details
    • G06T2207/20084Artificial neural networks [ANN]
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30108Industrial image inspection

Definitions

  • The invention relates to a method and a device for inspecting containers in a container mass flow with the features of the preambles of claims 1 and 13, respectively.
  • The containers of the container mass flow are usually transported with a conveyor to an inspection device and recorded as an image data stream with at least one camera.
  • The image data stream is then evaluated by an image processing unit for defects and/or foreign bodies on the containers.
  • WO 2018/150415 A1 discloses a method and a system for recognizing the integrity of a package.
  • The packaging is recorded with a camera in a wavelength range from 0.76 µm to 14 µm, and defects in a sealing region are identified on the basis of at least one recording.
  • The disadvantage of the known methods and devices is that the image processing unit has to be programmed in a complex manner for such inspection applications.
  • A detection model implemented by a developer must distinguish sufficiently well between faulty and faultless containers in order to guarantee the highest possible product quality. Consequently, the detection model and/or its parameters often have to be adapted to different container types or sorts, so that the known methods and devices are time-consuming and therefore cost-intensive.
  • The object of the present invention is therefore to provide a method and a device for inspecting containers in a container mass flow that are less time-consuming to set up and can be used even more reliably.
  • The invention provides a method for inspecting containers in a container mass flow with the features of claim 1.
  • Advantageous embodiments of the invention are mentioned in the subclaims.
  • The deep neural network can be trained for the different container, defect and/or foreign body types.
  • The deep neural network can independently learn representations of these from a training data set, so that the image processing does not have to be set up by a developer or on site by a fitter. Accordingly, neither a detection model nor its parameters need to be set, so that the method can be deployed particularly quickly and inexpensively.
  • The method for inspecting containers can be used in a beverage processing plant.
  • The method can be upstream or downstream of a container manufacturing method, cleaning method, filling method, closing method and/or packaging method.
  • The method can be used during transport from a first container treatment method to a subsequent, second container treatment method.
  • The containers can be provided to hold beverages, food, hygiene articles, pastes, chemical, biological and/or pharmaceutical products.
  • The containers can be designed as bottles, in particular as plastic bottles or as glass bottles.
  • Plastic bottles can in particular be PET, PEN, HD-PE or PP bottles. They can also be biodegradable containers or bottles, the main components of which are made from renewable raw materials such as sugar cane, wheat or corn. It is conceivable that the containers are provided with a closure.
  • The conveyor can comprise a linear conveyor and/or a carousel. It is conceivable, for example, that the conveyor comprises a conveyor belt on which the containers are transported standing into a detection area of the camera.
  • The at least one camera can comprise a lens and an image sensor in order to record the container mass flow optoelectronically.
  • The image sensor can comprise a CMOS or a CCD sensor. It is conceivable that the image sensor is designed as a line sensor or as an area sensor.
  • The at least one camera can be connected to the image processing unit via a data interface in order to transmit the image data stream.
  • The data interface can comprise an analog or a digital data interface.
  • The image processing unit can process the image data stream with a signal processor, a CPU, a GPU, an FPGA and/or a vector processor. It is also conceivable that the image processing unit comprises for this purpose a storage unit, one or more data interfaces, for example a network interface, a display unit and/or an input unit. The image processing unit can split the image data stream into individual images, each of which is evaluated individually with the deep neural network; one possible frame-by-frame evaluation is sketched below. It is also conceivable that the image processing unit evaluates the image data stream with image processing algorithms, in particular with one or more filters and the like. It is likewise conceivable that the image processing unit and the camera are designed as an integrated system; for example, the camera can evaluate the image data stream with the integrated image processing unit.
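The following is a minimal sketch of such a frame-by-frame evaluation, assuming PyTorch and camera frames delivered as NumPy arrays; all names are placeholders, not taken from the disclosure.

```python
import torch

def evaluate_stream(frames, model):
    """Evaluate each camera frame of the image data stream individually.

    frames: iterable of HxWxC uint8 NumPy arrays from the camera interface
    model:  a trained torch.nn.Module returning output information per frame
    """
    model.eval()
    results = []
    with torch.no_grad():
        for frame in frames:
            # HWC uint8 -> 1xCxHxW float tensor in [0, 1]
            x = torch.from_numpy(frame).permute(2, 0, 1).float().div(255.0)
            results.append(model(x.unsqueeze(0)))
    return results
```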
  • The deep neural network can comprise an input layer, an output layer and at least two hidden layers in between, the image data stream or data derived from the image data stream being processed into output information first via the input layer, then via the at least two hidden layers and then via the output layer, in order to detect and/or localize defects and/or foreign bodies.
  • The output layer can be connected to the input layer via the at least two hidden layers.
  • The image data stream, in particular individual images of the image data stream, can be fed to the input layer.
  • Output information can be output with the output layer in order to indicate, for each container, a probability as to whether a defect and/or a foreign body is present on it.
  • The output layer can be used to output further output information about the point at which a defect and/or a foreign body is located on a container.
  • The input layer, the at least two hidden layers and/or the output layer can each comprise neural nodes and/or be connected to one another via neural connections.
  • The input layer, the output layer and/or the at least two hidden layers can each comprise at least one data matrix in order to represent the data contained therein.
  • The output layer can also comprise at least one vector to represent the output information; a minimal sketch of such a layer structure follows below.
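A minimal sketch of such a layer structure, assuming PyTorch and a single-channel 128 × 128 input image; all layer sizes are assumptions chosen for illustration, not taken from the disclosure.

```python
import torch.nn as nn

# Input layer, two hidden layers and an output layer yielding a
# probability per container image (sizes are assumptions).
net = nn.Sequential(
    nn.Flatten(),                # input layer: image -> data vector
    nn.Linear(128 * 128, 256),   # first hidden layer (128x128 input assumed)
    nn.ReLU(),
    nn.Linear(256, 64),          # second hidden layer
    nn.ReLU(),
    nn.Linear(64, 2),            # output layer: [defect-free, defective]
    nn.Softmax(dim=1),           # output information as probabilities
)
```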
  • The deep neural network can comprise a deep convolutional neural network with at least one convolutional layer, in particular wherein the at least one convolutional layer is designed as a hidden layer, and wherein the image data stream, data derived from the image data stream and/or data already partially processed with the deep neural network are processed into convolved data by one or more convolution operations of the at least one convolutional layer.
  • The convolutional layer can comprise, as shared weights, at least one filter matrix with, for example, 3 × 3, 5 × 5, 7 × 7 or even more weights in order to carry out the one or more convolution operations.
  • The filter matrix can be "pushed" over the data in order to convolve it section by section with the filter matrix.
  • In this way, neighboring pixels are computed against one another in the evaluation, for example to enhance edges.
  • The shared weights of the convolutional layer, in particular the filter matrix, can also be trained during training; an illustrative convolution with a fixed edge filter is sketched below.
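As an illustration of a filter matrix being "pushed" over the data, the following sketch convolves an image with a 3 × 3 vertical-edge kernel (a Sobel kernel chosen purely for illustration; in the network these shared weights would be learned).

```python
import numpy as np
from scipy.signal import convolve2d

# A 3x3 filter matrix responding to vertical edges.
filter_matrix = np.array([[-1.0, 0.0, 1.0],
                          [-2.0, 0.0, 2.0],
                          [-1.0, 0.0, 1.0]])

image = np.random.rand(64, 64)  # stand-in for one camera image
# "Pushing" the filter over the image: section-by-section convolution.
convolved = convolve2d(image, filter_matrix, mode="valid")
print(convolved.shape)  # (62, 62)
```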
  • The deep convolutional neural network can comprise a pooling layer with which the data relevant to the detection and/or localization of the defects and/or foreign bodies are selected from the convolved data. In this way, superfluous information can be neglected and the evaluation speed increased.
  • The pooling layer can comprise a pooling matrix, in particular with which a selection is made from 2 × 2, 3 × 3, 4 × 4 or even more data points of the convolved data, as sketched below.
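A minimal sketch of such a selection from 2 × 2 blocks of the convolved data, in plain NumPy and purely for illustration.

```python
import numpy as np

def max_pool_2x2(data: np.ndarray) -> np.ndarray:
    """Select the most relevant (here: maximum) value from each
    2x2 block of the convolved data."""
    h, w = data.shape
    trimmed = data[:h - h % 2, :w - w % 2]           # drop odd edge rows/cols
    return trimmed.reshape(h // 2, 2, w // 2, 2).max(axis=(1, 3))
```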
  • The deep neural network can be trained with a training data set comprising images of containers with training defects and/or with training foreign bodies, so that the deep neural network uses the training data set to develop a model in order to recognize and/or localize the defects and/or foreign bodies on the containers.
  • The deep neural network can be trained for a particularly large number of container types, defect types and/or foreign body types, so that these can then be reliably detected during the actual inspection.
  • It is conceivable that the training defects and/or the training foreign bodies are automatically recognized by the image processing unit, by a further image processing unit and/or manually by an operator and stored as error markers in the training data set.
  • In this way, specifications can be made particularly easily for the deep neural network as to which defect types or foreign body types are to be recognized.
  • Conventional image processing algorithms can thereby be used to automatically create a training data set with the largest possible number of container, defect and/or foreign body types.
  • The error markers can include an assignment to one or more error classes and/or location information for localization in the images of the training data set; one conceivable record layout is sketched below.
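One conceivable record layout for such an error marker; the field names and values are hypothetical, not taken from the disclosure.

```python
from dataclasses import dataclass

@dataclass
class ErrorMarker:
    """Hypothetical error marker: class assignment plus location information."""
    error_class: str   # e.g. "crack", "label_defect", "foreign_body"
    x: int             # bounding box of the defect in the image (pixels)
    y: int
    width: int
    height: int

marker = ErrorMarker(error_class="foreign_body", x=412, y=230, width=35, height=48)
```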
  • At least some of the images for the training data set can be recorded by a further image processing unit of a container inspection device.
  • The container inspection device can be designed conventionally with 2D image analysis methods, in particular without a deep neural network. Synthetic defect and/or foreign body patterns can also be generated, at least some of the images of the training data set being generated with the synthetic patterns.
  • In this way, the deep neural network can also be trained for unusual defect and/or foreign body types.
  • The training data set can also include images of defect-free and/or foreign-body-free containers. This avoids false-positive detection and rejection of containers free of defects or foreign bodies.
  • The images of the training data set can be duplicated automatically in order to create further images for the training data set with variants of the training defects and/or the training foreign bodies.
  • In this way, the deep neural network learns general characteristics of the training defects and/or foreign bodies, so that the recognition accuracy is further improved. It is conceivable that the training defects and/or the training foreign bodies are rotated, shifted, scaled and/or mirrored. It is also conceivable that changes in contrast and/or image crop are made on at least some of the duplicated images; one possible augmentation pipeline is sketched below.
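One possible augmentation pipeline for such duplication, sketched here with torchvision; the specific operations and parameter ranges are assumptions, not the patented recipe.

```python
import torchvision.transforms as T

# Rotate, shift, scale, mirror and vary contrast/crop, applied to each
# duplicated training image (all parameters assumed).
augment = T.Compose([
    T.RandomRotation(degrees=10),
    T.RandomAffine(degrees=0, translate=(0.1, 0.1), scale=(0.9, 1.1)),
    T.RandomHorizontalFlip(p=0.5),
    T.ColorJitter(contrast=0.3),
    T.RandomResizedCrop(size=128, scale=(0.8, 1.0)),
])
# augmented = augment(image)  # image: PIL.Image of one training container
```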
  • The deep neural network can be trained with a first part of the images of the training data set, the deep neural network being verified with a second part of the images of the training data set.
  • The quality of the recognition accuracy can be assessed particularly well in this way, since the verification takes place with different images than the training.
  • The first part can therefore comprise different images from the second part, as sketched below.
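A minimal sketch of such a split, using scikit-learn; the file names, labels and the 80/20 ratio are placeholders.

```python
from sklearn.model_selection import train_test_split

# Placeholder data: file names and binary labels (0 = good, 1 = defective).
images = [f"container_{i:04d}.png" for i in range(1000)]
labels = [i % 2 for i in range(1000)]

# First part for training, second (disjoint) part for verification.
train_images, verify_images, train_labels, verify_labels = train_test_split(
    images, labels, test_size=0.2, shuffle=True, random_state=42)
```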
  • The invention also provides a device for inspecting containers in a container mass flow with the features of claim 13.
  • An advantageous embodiment of the invention is mentioned in the dependent claim.
  • The image processing unit comprises the deep neural network for evaluating the image data stream.
  • The deep neural network can be trained for the different container, defect and/or foreign body types.
  • The deep neural network can independently learn representations of these from a training data set, so that it does not have to be set up by a developer or on site by a fitter. Accordingly, neither a detection model nor its parameters need to be set, so that the device can be deployed particularly quickly and inexpensively.
  • The device for inspecting containers in a container mass flow can be arranged in a beverage processing plant. It is conceivable that at least one container treatment machine is arranged upstream and/or downstream of the conveyor. In other words, the conveyor can connect two container treatment machines to one another.
  • The device can comprise the features described above with reference to the method, in particular according to one of claims 1-12, individually or in any combination.
  • The image processing unit can comprise a storage medium with machine instructions which, when executed with the image processing unit, evaluate the image data stream with the deep neural network.
  • The image processing unit can comprise the deep neural network.
  • The storage medium comprises machine instructions with which the method described above can be carried out at least partially.
  • The machine instructions can execute those parts of the method which are carried out with the image processing unit and/or with the deep neural network.
  • The image processing unit can comprise a processor for executing the machine instructions.
  • The image processing unit can comprise a signal processor, a CPU, a GPU, an FPGA and/or a vector processor for performing method steps of the deep neural network.
  • The method steps can comprise the machine instructions.
  • FIG. 1 shows an exemplary embodiment of a device according to the invention for inspecting containers in a container mass flow as a perspective view
  • FIG. 2 shows an exemplary embodiment of a method according to the invention for inspecting containers in a container mass flow as a flow chart
  • FIG. 3 shows the method steps for training the deep neural network from FIG. 2 in detail as a flow chart
  • FIG. 4 shows an exemplary embodiment of the deep neural network as a schematic diagram representation.
  • FIG. 1 shows an exemplary embodiment of a device 1 according to the invention for inspecting containers 2 in a container mass flow in a top view.
  • Shown is the conveyor 3, which is designed here by way of example as a conveyor belt and on which the containers 2 of the container mass flow are transported in the direction R into a detection area between the lighting unit 4 and the camera 5.
  • The transmitted-light inspection is shown here only as an example; any other type of inspection, for example a reflected-light inspection or the like, in which the camera 5 or other cameras are used, is also conceivable.
  • The containers 2 are subjected, for example, to a side wall inspection in order to identify and/or localize the defect 21 and/or the foreign body 22.
  • The camera 5, which captures the containers 2 from the side, is arranged on the conveyor 3.
  • The arrangement of the camera 5 is shown here only as an example. It is also conceivable that several cameras and/or a mirror cabinet are present in order to capture the containers 2 from several viewing directions. An arrangement directly from above, perpendicular to a transport surface of the conveyor 3, is also conceivable.
  • The camera 5 records the container mass flow as an image data stream and transmits it to the image processing unit 7 via the data interface 6.
  • The image data stream is evaluated with the deep neural network 10 shown in FIG. 4 in order to recognize and/or localize the defects 21 and/or the foreign bodies 22 particularly reliably, independently of the container type.
  • The image processing unit 7 comprises a storage medium with machine instructions which, when executed with the image processing unit 7, evaluate the image data stream with the deep neural network 10.
  • The image processing unit 7 comprises a GPU 8 on which the machine instructions of the deep neural network 10 can be executed particularly quickly.
  • FIG. 2 shows an exemplary embodiment of a method 100 according to the invention for inspecting containers 2 in a container mass flow as a flow chart.
  • The device from FIG. 1 is designed to carry out the method steps 101-107 of the method 100 from FIG. 2. It is also conceivable that the device is further designed to also at least partially carry out the training 200 of the deep neural network 10 according to FIG. 3.
  • The deep neural network 10 of the image processing unit 7 is initially trained with a training data set (step 200).
  • The associated training steps 201-207 are explained in more detail below with reference to FIG. 3.
  • The containers 2 are first transported in step 101 with the conveyor 3 in the direction R and then captured with the at least one camera 5 in step 102.
  • The light is refracted to different degrees locally at the defect 21, so that high-contrast structures are recorded in the camera image.
  • The foreign body 22 appears, for example, as a darkened area in the camera image.
  • The image data stream from the camera 5 is transmitted via the data interface 6 to the image processing unit 7 and evaluated there in step 103, in order to identify and localize the defects 21 and/or foreign bodies 22 with the deep neural network 10 (step 104).
  • If a defect 21 and/or a foreign body 22 is detected, the affected container 2 is sorted out in accordance with decision 105, for example with a switch (step 106). If, on the other hand, the container 2 is free of defects and foreign bodies, it is fed to a subsequent treatment step in step 107. This can be, for example, the filling of the container 2 with a product. A minimal sketch of this decision logic follows below.
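The following is a minimal, purely illustrative sketch of the decision logic in steps 105-107; the probability threshold and the `rejector` object with its `eject()` method are assumptions, not part of the disclosure.

```python
DEFECT_THRESHOLD = 0.5  # assumed probability threshold for decision 105

def handle_container(defect_probability: float, rejector) -> None:
    """Route one container based on the deep neural network's output."""
    if defect_probability > DEFECT_THRESHOLD:
        rejector.eject()   # step 106: sort out, e.g. via a switch
    else:
        pass               # step 107: pass on, e.g. to filling
```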
  • The deep neural network 10 can be trained for the different container, defect and/or foreign body types.
  • The deep neural network 10 can independently learn representations of these from a training data set (step 200), so that it does not have to be set up by a developer or fitter. Accordingly, neither a recognition model nor its parameters need to be set, so that the method can be used particularly quickly and cost-effectively.
  • In FIG. 3, the method steps 201-207 for training the deep neural network from FIG. 2 are shown in detail as a flow chart.
  • The method steps 201-203 can be carried out both individually and in any combination, so that the desired container, defect and/or foreign body types are trained into the model of the deep neural network 10.
  • In step 201, at least some of the images for the training data set are captured by a further image processing unit of a container inspection device.
  • In this way, a large number of images for the training data set can be generated automatically with a conventional container inspection device, for example in order to train defect and/or foreign body types that are already sufficiently well known with conventional image processing technology. It is also conceivable that images from an archive of the container inspection device are used for this purpose, in which it was subsequently established that in rare cases defects and/or foreign bodies were not reliably detected.
  • It is also conceivable that at least some of the images for the training data set are generated by a manufacturer and/or by an operator of a container processing system (step 202) and/or that an error catalog is created (step 203). This means that container, defect and/or foreign body types typical for the system can be taken into account particularly easily.
  • The training defects and/or training foreign bodies are recognized automatically and/or manually and stored as error markers in the training data set (step 204).
  • In this way, specifications can be made particularly easily for the deep neural network as to which defect types or foreign body types are to be recognized. It is also conceivable that this takes place at least partially in step 201, with conventional image processing algorithms being used to automatically create the training data set with the greatest possible number of container, defect and/or foreign body types.
  • The error markers contain, for example, an assignment to one or more error classes and location information for localization in the images of the training data set.
  • The images of the training data set are then automatically duplicated in step 205.
  • In this way, the training defects and/or the training foreign bodies are rotated, shifted, scaled and/or mirrored and, based on this, additional images of the training data set are generated. It is also conceivable that changes to the contrast and/or the image crop are made in the duplicated images in order to train the deep neural network 10 as independently as possible of the exact defect position and/or lighting conditions.
  • The deep neural network 10 is then trained with a first part of the images of the training data set (step 206), so that it detects the training defects and/or training foreign bodies as independently of the container types as possible.
  • The trained deep neural network 10 is then verified in step 207 with a second part of the images of the training data set.
  • The first part of the training data set comprises different images from the second part; a possible training and verification loop is sketched below.
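A minimal, purely illustrative training/verification loop for steps 206 and 207, in PyTorch; `model`, `train_loader` and `verify_loader` are assumed to exist, and the model is assumed to return class logits per image.

```python
import torch
import torch.nn as nn

def train_and_verify(model, train_loader, verify_loader, epochs=10):
    criterion = nn.CrossEntropyLoss()
    optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
    for epoch in range(epochs):
        model.train()
        for images, labels in train_loader:      # step 206: first part
            optimizer.zero_grad()
            loss = criterion(model(images), labels)
            loss.backward()
            optimizer.step()
        model.eval()                             # step 207: second part
        correct = total = 0
        with torch.no_grad():
            for images, labels in verify_loader:
                correct += (model(images).argmax(dim=1) == labels).sum().item()
                total += labels.numel()
        print(f"epoch {epoch}: verification accuracy {correct / total:.3f}")
```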
  • The defects 21 and/or foreign bodies 22 can then be detected particularly reliably in accordance with step 104 of the method 100 described above with reference to FIG. 2, without time-consuming setup being necessary.
  • In FIG. 4, an exemplary embodiment of the deep neural network 10 is shown as a schematic diagram. It can be seen that the deep neural network 10 is designed as a deep convolutional neural network with the convolutional layers 12 and 14.
  • The deep neural network 10 comprises the input layer 11, the output layer 17 and at least two hidden layers 12-16 in between.
  • The images I of the image data stream captured by the camera 5 are first transferred to the input layer 11.
  • An image I of the container 2 with the defect 21 and with the foreign body 22, which was captured in accordance with method step 102, is shown here merely by way of example.
  • From this, a data matrix of the first convolutional layer 12 is determined via a first convolution operation C1.
  • Here, the convolution operation C1 corresponds to an edge filter for edges aligned vertically in the image I.
  • It is conceivable that the convolutional layer comprises further data matrices that are formed with other convolution operations, for example a further convolution operation acting as an edge filter for horizontal edges.
  • Shown below that is the first pooling layer 13, which is formed via a first selection operation P1, the relevant data being selected from the convolved data matrices of the first convolutional layer 12. This reduces the computing power required for further processing of the data.
  • Shown next are the second convolutional layer 14 and the second pooling layer 15, which, on the basis of one or more second convolution operations C2 and one or more second selection operations P2, first convolve the data of the first pooling layer 13 and then select from it further.
  • The deep neural network 10 is completed with the fully connected layer 16. It corresponds to the architecture of a multilayer perceptron and outputs the output information 17.1 and 17.2 by means of the output layer 17.
  • The output information 17.1 can include an error class that indicates whether and, if so, which defect or which foreign body is present. It is also conceivable that the output information 17.2 outputs the localization of the corresponding defect or foreign body. A sketch of this topology follows below.
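The following sketch reconstructs the FIG. 4 topology in PyTorch purely for illustration; all channel counts and sizes are assumptions, and only the structure (C1/P1/C2/P2, fully connected layer 16, outputs 17.1 and 17.2) follows the description above.

```python
import torch
import torch.nn as nn

class InspectionNet(nn.Module):
    """Assumed reconstruction of FIG. 4 for a 1x128x128 input image."""

    def __init__(self, num_classes=5):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 8, kernel_size=3, padding=1),   # C1 -> layer 12
            nn.ReLU(),
            nn.MaxPool2d(2),                             # P1 -> layer 13
            nn.Conv2d(8, 16, kernel_size=3, padding=1),  # C2 -> layer 14
            nn.ReLU(),
            nn.MaxPool2d(2),                             # P2 -> layer 15
        )
        self.fc = nn.Linear(16 * 32 * 32, 128)           # fully connected layer 16
        self.class_head = nn.Linear(128, num_classes)    # output 17.1: error class
        self.loc_head = nn.Linear(128, 4)                # output 17.2: x, y, w, h

    def forward(self, x):
        h = self.features(x).flatten(1)
        h = torch.relu(self.fc(h))
        return self.class_head(h), self.loc_head(h)
```

The two output heads mirror the two pieces of output information 17.1 and 17.2 described above.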
  • The deep neural network 10 shown in FIG. 4 can be trained by means of the training steps shown in FIG. 3. In particular, the weights of the individual convolution operations C1, C2, of the selection operations P1, P2 and of the neural links of the deep neural network 10 are determined.
  • Because the deep neural network 10 shown in FIG. 4 is trained for a wide variety of container, defect and/or foreign body types, the containers 2 can be inspected without a recognition model and/or its parameters having to be adapted. As a result, the method 100 and the device 1 for inspecting the containers 2 are less time-consuming to set up and can be used even more reliably.

Landscapes

  • Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Chemical & Material Sciences (AREA)
  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Analytical Chemistry (AREA)
  • Biochemistry (AREA)
  • General Health & Medical Sciences (AREA)
  • Immunology (AREA)
  • Pathology (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Theoretical Computer Science (AREA)
  • Quality & Reliability (AREA)
  • Image Analysis (AREA)
  • Investigating Materials By The Use Of Optical Means Adapted For Particular Applications (AREA)

Abstract

The invention relates to a method (100) for inspecting containers (2) in a container mass flow, the containers (2) of the container mass flow being transported with a conveyor (3), the container mass flow being captured (102) by at least one camera (5) as an image data stream, and the image data stream being evaluated by an image processing unit (7) for defects (21) and/or foreign bodies (22) on the containers (2), wherein the image data stream is evaluated (103) by the image processing unit (7) with a deep neural network (10) for defects (21) and/or foreign bodies (22) on the containers (2), so as to identify and/or localize the defects (21) and/or the foreign bodies (22).
PCT/EP2020/065695 2019-07-08 2020-06-05 Method and device for inspecting containers in a container mass flow WO2021004710A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
DE102019209976.9A DE102019209976A1 (de) 2019-07-08 2019-07-08 Verfahren und Vorrichtung zur Inspektion von Behältern in einem Behältermassenstrom
DE102019209976.9 2019-07-08

Publications (1)

Publication Number Publication Date
WO2021004710A1 (fr) 2021-01-14

Family

ID=71016554

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2020/065695 WO2021004710A1 (fr) 2019-07-08 2020-06-05 Method and device for inspecting containers in a container mass flow

Country Status (2)

Country Link
DE (1) DE102019209976A1 (fr)
WO (1) WO2021004710A1 (fr)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20180107928A1 (en) * 2016-10-14 2018-04-19 Kla-Tencor Corporation Diagnostic systems and methods for deep learning models configured for semiconductor applications
WO2018150415A1 (fr) 2017-02-20 2018-08-23 Yoran Imaging Ltd. Procédé et système de détermination de l'intégrité d'un emballage
DE102017206971A1 (de) * 2017-04-26 2018-10-31 Krones Aktiengesellschaft Inspektionsverfahren und -vorrichtung zur bildverarbeitenden Inspektion von Behältern
DE102017213247A1 (de) * 2017-06-30 2019-01-03 Conti Temic Microelectronic Gmbh Wissenstransfer zwischen verschiedenen Deep-Learning Architekturen
US20190041318A1 (en) * 2016-01-28 2019-02-07 Siemens Healthcare Diagnostics Inc. Methods and apparatus for imaging a specimen container and/or specimen using multiple exposures
DE102018103244A1 (de) * 2017-08-04 2019-02-07 Omron Corporation Bildverarbeitugssystem

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7684034B2 (en) * 2007-05-24 2010-03-23 Applied Vision Company, Llc Apparatus and methods for container inspection

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20190041318A1 (en) * 2016-01-28 2019-02-07 Siemens Healthcare Diagnostics Inc. Methods and apparatus for imaging a specimen container and/or specimen using multiple exposures
US20180107928A1 (en) * 2016-10-14 2018-04-19 Kla-Tencor Corporation Diagnostic systems and methods for deep learning models configured for semiconductor applications
WO2018150415A1 (fr) 2017-02-20 2018-08-23 Yoran Imaging Ltd. Procédé et système de détermination de l'intégrité d'un emballage
DE102017206971A1 (de) * 2017-04-26 2018-10-31 Krones Aktiengesellschaft Inspektionsverfahren und -vorrichtung zur bildverarbeitenden Inspektion von Behältern
DE102017213247A1 (de) * 2017-06-30 2019-01-03 Conti Temic Microelectronic Gmbh Wissenstransfer zwischen verschiedenen Deep-Learning Architekturen
DE102018103244A1 (de) * 2017-08-04 2019-02-07 Omron Corporation Bildverarbeitugssystem

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
ALEX KRIZHEVSKY ET AL: "ImageNet classification with deep convolutional neural networks", COMMUNICATIONS OF THE ACM, ASSOCIATION FOR COMPUTING MACHINERY, INC, UNITED STATES, vol. 60, no. 6, 24 May 2017 (2017-05-24), pages 84 - 90, XP058339266, ISSN: 0001-0782, DOI: 10.1145/3065386 *

Also Published As

Publication number Publication date
DE102019209976A1 (de) 2021-01-14

Similar Documents

Publication Publication Date Title
EP3311146B1 (fr) Procédé et dispositif d'inspection pour contrôler la fermeture de récipients
DE102013109915B4 (de) Verfahren und Vorrichtung zur Überprüfung eines Inspektionssystems zur Erkennung von Oberflächendefekten
DE102018129425B4 (de) System zur Erkennung eines Bearbeitungsfehlers für ein Laserbearbeitungssystem zur Bearbeitung eines Werkstücks, Laserbearbeitungssystem zur Bearbeitung eines Werkstücks mittels eines Laserstrahls umfassend dasselbe und Verfahren zur Erkennung eines Bearbeitungsfehlers eines Laserbearbeitungssystems zur Bearbeitung eines Werkstücks
EP3428834B1 (fr) Lecteur de code optoélectronique et procédé de lecture de code optique
DE60315138T3 (de) Vorrichtung und Verfahren zur Qualitätsüberprüfung von Vorformlingen aus Kunststoff
DE60028756T2 (de) Methode und vorrichtung zur handhabung von ausgeworfenen sprtizgussteilen
EP2295157B1 (fr) Dispositif et procédé destinés au contrôle de fermetures de récipients
DE102016014381A1 (de) Spritzgießsystem
WO2007082575A1 (fr) Procede et dispositif de surveillance d'une ligne de production
EP3111200B1 (fr) Procédé de détection de fissures dans les parois d'articles en verre creux
EP2605212A2 (fr) Procédé et dispositif de vérification optique d'objets à vérifier lors de la fabrication et/ou de l'emballage de cigarettes
EP2850416A1 (fr) Procédé et dispositif de contrôle de bouteilles vides
WO2021213779A1 (fr) Procédé et dispositif d'inspection optique de récipients dans un système de traitement de boissons
DE102016124400A1 (de) Verfahren und Vorrichtung zum Erfassen von Störungen beim Objekttransport
WO2008104273A1 (fr) Procédé et dispositif d'inspection permettant de contrôler des récipients
WO2021004710A1 (fr) Procédé et dispositif pour inspecter des récipients dans un flux massique de récipients
WO2018197297A1 (fr) Procédé et dispositif d'inspection destinés à inspecter des récipients par traitement d'images
WO2022012938A1 (fr) Appareil et procédé pour inspecter des contenants
EP3812748A1 (fr) Procédé et dispositif d'inspection optique de récipients
DE19959623A1 (de) Verfahren und Anordnung zum Lokalisieren von zylinderförmigen Objekten
DE102019132830A1 (de) Verfahren und Vorrichtung zur Erkennung von umgefallenen und/oder beschädigten Behältern in einem Behältermassenstrom
EP3023774A1 (fr) Dispositif d'inspection destine a surveiller des processus de production
WO2015024617A1 (fr) Dispositif de contrôle d'articles en verre creux
EP2567220A1 (fr) Procédé et dispositif de contrôle d'objets pour applications pharmaceutiques
EP3355271A1 (fr) Procédé de configuration d'un système d'inspection assistée par ordinateur

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20731072

Country of ref document: EP

Kind code of ref document: A1

122 Ep: pct application non-entry in european phase

Ref document number: 20731072

Country of ref document: EP

Kind code of ref document: A1