EP3771564A1 - Dispositif d'inspection d'impression et procédé d'inspection optique d'une image imprimée d'un objet imprimé - Google Patents

Dispositif d'inspection d'impression et procédé d'inspection optique d'une image imprimée d'un objet imprimé

Info

Publication number
EP3771564A1
Authority
EP
European Patent Office
Prior art keywords
image
print
print image
delta
pixel
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
EP20188815.3A
Other languages
German (de)
English (en)
Inventor
Dr. Lucas FRANEK
Dr. Ralf GRIESER
Christoph Schultheiss
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Bundesdruckerei GmbH
Original Assignee
Bundesdruckerei GmbH
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Bundesdruckerei GmbH filed Critical Bundesdruckerei GmbH
Publication of EP3771564A1

Links

Images

Classifications

    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B41 - PRINTING; LINING MACHINES; TYPEWRITERS; STAMPS
    • B41F - PRINTING MACHINES OR PRESSES
    • B41F33/00Indicating, counting, warning, control or safety devices
    • B41F33/0036Devices for scanning or checking the printed matter for quality control

Definitions

  • the present invention relates to the field of optical inspection of a print image of a print object.
  • The publication DE 10 2017 116 882 A1 relates to a print inspection device for the optical inspection of a print image of a print object, a target print image being assigned to the print object.
  • the print inspection device comprises an image camera for optically recording the print object and a processor for further processing the recorded print object.
  • the invention solves this problem by recognizing anomalies in the print image, measuring error sizes and classifying the errors.
  • The invention enables customer-specific error detection in the print image by means of machine learning.
  • the object is achieved in particular by an architecture with several layers, which enables customer-specific detection of printing errors.
  • the multiple layers include color calibration, determination of the defect image from a calibrated Delta E image, customer-specific classification using machine learning and detection of different degrees of defects.
  • The invention relates to a print inspection device for optically inspecting a print image of a print object, a target print image being assigned to the print object, comprising: a processor which is designed to determine a respective plurality of raster cell images for the print image and the target print image based on a subdivision of the print image and the target print image into raster cells; to determine, for each raster cell, a Delta-E raster cell image based on a color distance between a raster cell image of the print image and a raster cell image of the target print image; to determine, for each pixel of the Delta-E raster cell image, based on a pixel-specific threshold value function, whether a pixel defect is present, the pixel-specific threshold value function being based on a previous training of a database of Delta-E raster cell images with associated manually specified ground truth raster cell images; to assemble the previously determined pixel defects in the respective Delta-E raster cell images into a defect image which gives an overview of the pixel defects of the print image; and to output an inspection result based on a user-specific classification of the defect image.
  • Such a print inspection device offers the user an efficient optical inspection of a print image of a print object. Due to the previous training, the optical inspection is less dependent on manual parameterization than is the case with previous devices, and there are fewer misrecognized printing errors; in particular, pseudo printing errors are recognized as such.
  • the print inspection device detects anomalies in the print image and provides the user with a means of measuring defect sizes and classifying the defects.
  • the print inspection device enables customer-specific error detection in the print image by means of machine learning.
  • the print image forms an actual print image of a print object to be inspected, which can be printed on a substrate.
  • the target print image can be present, for example, as a vector font or as a vector graphic or as a digital image, and can be known from a prepress stage, for example.
  • the grid can be an equidistant grid, but it can also be composed of any polygons. Individual cells in the grid can also be deactivated. This has the advantage that regions in which an examination is not desired can be adapted very flexibly.
  • the grid cells of the grid overlap and a local alignment can be carried out for each cell. This has the advantage that in the case of not optimally tensioned film, in the case of lamination distortion or in similar situations, local alignment allows the actual print image and the target print image to be better brought into local coincidence.
  • the processor is designed to determine the plurality of raster cell images based on color-calibrated and mutually aligned print images and target print images.
  • the processor is designed to determine the plurality of raster cell images for a respective color channel of a plurality of color channels of a color space.
  • the processor is designed to determine the delta-E raster cell image based on a Euclidean distance in the color space between the raster cell image of the print image and the corresponding raster cell image of the target print image.
  • the previous training of the database of Delta-E cell images with associated ground truth cell images takes place based on neural networks (NN), support vector machine (SVM) and / or multi-layer perceptron (MLP) methods.
  • NN: neural network
  • SVM: support vector machine
  • MLP: multi-layer perceptron
  • The pixel-specific threshold value function classifies a fragment in the Delta-E raster cell image that is due to an irregularity in the alignment of the print image with respect to the target print image as not being a pixel defect.
  • The print inspection device is fault-tolerant to incorrect positioning of the print image relative to the target print image, which is caused, for example, by the print image not being precisely positioned, e.g. due to incorrect positioning of the recording camera or a skewed feed when reading in or scanning the print image.
  • the pixel-specific threshold value function specifies a contiguous area of pixels in the Delta-E raster cell image that exceeds a predetermined size as a pixel error.
  • both the training of the database of Delta-E cell images with associated ground truth cell images and the user-specific classification of the defect image are based on the same training process.
  • The training process is based on a classification of a database of print images against associated target print images based on a subjective error detection by a user.
  • the pixel-specific threshold value function assigns a binary value 0 or 1 to each gray value of the delta-E raster cell image.
  • The threshold value function can efficiently distinguish areas with a high level of confidence from areas with a low level of confidence and can easily represent this in the form of a threshold.
  • the pixel-specific threshold value function comprises a logistic function, e.g. a sigmoid function or a step function with regard to the gray values of the Delta-E raster cell image.
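  • For illustration only, a minimal Python sketch of what such a pixel-specific threshold value function could look like; the parameter values (a transition around a Delta-E gray value of about 27) and the function names are assumptions for this sketch and are not taken from the patent.

```python
import numpy as np

def threshold_value_function(delta_e, mu=27.0, sigma=2.0):
    """Pixel-specific threshold value function over Delta-E gray values.

    Orientation as described later for the training: values near 1 for small
    Delta-E, falling monotonically toward 0 for large Delta-E, with a sigmoid
    transition around ``mu``.
    """
    return 1.0 / (1.0 + np.exp((delta_e - mu) / sigma))

def pixel_defect_mask(delta_e, mu=27.0, sigma=2.0):
    """Hard step version: binary value 0 or 1 per pixel (1 = assumed pixel defect)."""
    return (threshold_value_function(delta_e, mu, sigma) < 0.5).astype(np.uint8)

# Example: a small Delta-E raster cell image with one conspicuous pixel.
cell = np.array([[2.0, 3.0, 1.0],
                 [4.0, 35.0, 2.0],
                 [1.0, 2.0, 3.0]])
print(threshold_value_function(cell).round(3))
print(pixel_defect_mask(cell))
```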
  • The defect classification is based on the following features: an area of connected components of the Delta-E raster cell image, an inner radius of the connected components of the Delta-E raster cell image, a correlation between the actual print image and the target print image, and a contrast between background and foreground in relation to the Delta-E raster cell image.
  • The number of defect pixels per row is also taken into account as a feature; columns are considered analogously for vertical streakiness.
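  • To make this feature list concrete, the following hedged Python sketch shows one way such features could be extracted from a single raster cell; it assumes numpy and scipy, and the function and feature names are illustrative rather than part of the patent.

```python
import numpy as np
from scipy import ndimage

def defect_features(defect_mask, delta_e, print_cell, target_cell):
    """Illustrative feature vector for the defect classification of one cell.

    defect_mask: binary pixel-defect image of the raster cell
    delta_e:     Delta-E raster cell image
    print_cell, target_cell: gray-value raster cell images of print and target
    """
    labels, n = ndimage.label(defect_mask)            # connected components
    areas = ndimage.sum(defect_mask, labels, index=range(1, n + 1)) if n else []
    # simple proxy for an "inner radius": largest distance to the background
    inner_radius = float(ndimage.distance_transform_edt(defect_mask).max()) if n else 0.0
    # correlation between the actual print image and the target print image
    corr = float(np.corrcoef(print_cell.ravel(), target_cell.ravel())[0, 1])
    # contrast between foreground (defect) and background in the Delta-E image
    fg = float(delta_e[defect_mask > 0].mean()) if defect_mask.any() else 0.0
    bg = float(delta_e[defect_mask == 0].mean())
    # defect pixels per row / per column (horizontal / vertical streakiness)
    per_row, per_col = defect_mask.sum(axis=1), defect_mask.sum(axis=0)
    return {
        "component_areas": [float(a) for a in areas],
        "inner_radius": inner_radius,
        "correlation": corr,
        "contrast": fg - bg,
        "max_defects_per_row": int(per_row.max()),
        "max_defects_per_col": int(per_col.max()),
    }
```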
  • the print inspection device comprises a camera which is designed to optically record the print object in order to obtain the print image.
  • the camera can easily provide the print image.
  • the print image can be provided via a scanner or a reading device.
  • the processor is designed to output a printing error when printing the print image with a color printer as the inspection result, the printing error occurring in the form of a color point or color stripe based on a malfunction of a color channel of the color printer.
  • The invention further relates to a method for the optical inspection of a print image of a print object, wherein the print object is assigned a target print image, with the following steps: determining a respective plurality of raster cell images for the print image and the target print image based on a subdivision of the print image and the target print image into raster cells; determining a Delta-E raster cell image for each raster cell based on a color distance between a raster cell image of the print image and a raster cell image of the target print image; determining, for each pixel of the Delta-E raster cell image, based on a pixel-specific threshold value function, whether a pixel defect is present, the pixel-specific threshold value function being based on a previous training of a database of Delta-E raster cell images with associated manually specified ground truth raster cell images; combining the previously determined pixel defects in the respective Delta-E raster cell images to form a defect image which gives an overview of the pixel defects of the print image; and outputting an inspection result based on a user-specific classification of the defect image.
  • Such a method offers the user an efficient optical inspection of a print image of a print object. Due to the previous training, the optical inspection is less dependent on manual parameterization than was previously the case, and there are fewer misrecognized printing errors; in particular, pseudo printing errors are recognized as such.
  • The invention also relates to a computer program with a program code for executing the method according to the second aspect of the invention. This has the advantage that the method can be carried out in an automated manner.
  • The print inspection device can be set up in terms of programming to execute the program code or parts of the program code.
  • the invention can be implemented in hardware and software.
  • Fig. 1 shows a schematic diagram of a print inspection device 100 for the optical inspection of a print image 120 of a print object according to one embodiment.
  • A target print image 121, which represents the ideal image of the print object, is assigned to the print object.
  • the print inspection device 100 comprises a processor 110 which is designed to determine a plurality of raster cell images 111 for the print image 120 and the target print image 121 based on a subdivision of the print image 120 and the target print image 121 into raster cells 130.
  • the processor 110 is further designed to determine a delta-E raster cell image 112 for each raster cell 130 based on a color distance between a raster cell image of the print image 120 and a raster cell image of the target print image 121.
  • the processor 110 is also designed to determine for each pixel of the Delta-E raster cell image 112 based on a pixel-specific threshold value function 140 whether a pixel defect 113 is present.
  • the pixel-specific threshold value function 140 is based on a previous training 150 of a database 160 of Delta-E raster cell images with associated manually specified ground truth cell images.
  • The processor 110 is also designed to combine the previously determined pixel defects 113 in the respective Delta-E raster cell images 112 to form a defect image 114, which gives an overview of the pixel defects 113 of the print image 120, and to output an inspection result 116 based on a user-specific classification 115 of the defect image 114.
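  • A minimal sketch of this assembly step, assuming an equidistant, non-overlapping grid and numpy arrays; the function and parameter names are illustrative only.

```python
import numpy as np

def assemble_defect_image(cell_masks, image_shape, cell_size):
    """Assemble per-raster-cell pixel-defect masks into one overall defect image.

    cell_masks:  dict mapping (row, col) grid indices to binary cell masks
    image_shape: (height, width) of the full print image
    cell_size:   (cell_height, cell_width) of the equidistant grid
    """
    defect_image = np.zeros(image_shape, dtype=np.uint8)
    ch, cw = cell_size
    for (r, c), mask in cell_masks.items():
        y, x = r * ch, c * cw
        defect_image[y:y + mask.shape[0], x:x + mask.shape[1]] |= mask.astype(np.uint8)
    return defect_image
```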
  • a processor is a programmable arithmetic unit, that is to say a machine or an electronic circuit that controls other machines or electrical circuits in accordance with transferred commands and thereby executes an algorithm or process, which usually includes data processing.
  • the processor can, for example, be a microcontroller or a digital signal processor or a CPU, for example in an embedded system.
  • The term processor as used here covers both the component, i.e. the semiconductor chip, and the data-processing logic unit.
  • the term processor also includes one or a plurality of processor cores which are nowadays contained in many processor chips, each core representing a (largely) independent logic unit.
  • Ground truth is a term used in statistics and machine learning. It means that the results of machine learning are checked for correctness against the real world. The term comes from meteorology, where "ground truth" refers to information obtained on site. The term implies a kind of reality check for machine learning algorithms.
  • Delta E is a measure of the perceived color difference which, as far as possible, is "equally spaced" for all colors that occur.
  • The delta stands for the difference. This allows work dealing with colors to be quantified.
  • The L*a*b* color space (also: CIELAB) describes all perceptible colors. It uses a three-dimensional color space in which the brightness value L* is perpendicular to the color plane (a*, b*).
  • The most important properties of the L*a*b* color model include device independence and perception-relatedness, i.e. colors are defined as they are perceived by a normal observer under a standard lighting condition, regardless of how they are generated or how they are reproduced.
  • The color model is standardized in EN ISO 11664-4 "Colorimetry - Part 4: CIE 1976 L*a*b* Color space".
  • the color difference is usually given as Delta E.
  • The term color distance is preferred to the term color difference, since it stands for the quantified form.
  • Every color that actually occurs, including every color emitted or measured by a device, can be assigned a color location in a three-dimensional space.
  • The value of Delta E between the color coordinates (L*, a*, b*)p and (L*, a*, b*)q is calculated as the Euclidean distance according to EN ISO 11664-4.
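  • For illustration, this Euclidean distance (the CIE76 form of Delta E) can be computed per pixel as in the following Python sketch; the conversion of the calibrated images into L*a*b* coordinates is assumed to have happened beforehand.

```python
import numpy as np

def delta_e_cie76(lab_print, lab_target):
    """Delta E as the Euclidean distance between (L*, a*, b*) coordinates.

    lab_print, lab_target: arrays of shape (H, W, 3) holding the L*, a*, b*
    values of a raster cell image of the print image and of the target print
    image. Returns an (H, W) Delta-E raster cell image.
    """
    diff = lab_print.astype(np.float64) - lab_target.astype(np.float64)
    return np.sqrt(np.sum(diff ** 2, axis=-1))
```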
  • The grid 130 can be a rectangular or square grid, i.e. a grid with rectangular or square grid elements. Alternatively, any other grid shape can be used, e.g. hexagonal, triangular, etc.
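  • A simple sketch of such a subdivision for the equidistant, rectangular case, including the option of deactivating individual cells; the cell sizes and names are assumptions made for the illustration.

```python
import numpy as np

def raster_cells(image, cell_height, cell_width, deactivated=frozenset()):
    """Split an image into raster cell images on an equidistant rectangular grid.

    Cells listed in ``deactivated`` (as (row, col) grid indices) are skipped,
    so that regions which should not be inspected can be excluded.
    """
    cells = {}
    h, w = image.shape[:2]
    for r in range(0, h, cell_height):
        for c in range(0, w, cell_width):
            idx = (r // cell_height, c // cell_width)
            if idx not in deactivated:
                cells[idx] = image[r:r + cell_height, c:c + cell_width]
    return cells

# Example: 4 x 4 cells of size 64 x 64 over a 256 x 256 image, one cell deactivated.
img = np.zeros((256, 256, 3), dtype=np.uint8)
print(len(raster_cells(img, 64, 64, deactivated={(0, 0)})))  # 15
```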
  • The processor 110 is designed to determine the plurality of raster cell images 111 based on color-calibrated (255) and mutually aligned (256) print images 120 and target print images 121.
  • the goal of color calibration 255 is to measure and / or adjust the color behavior of a device (input or output) in a known state.
  • Calibration refers to establishing a known relationship with a standard color space.
  • Input data can come from device sources such as digital cameras, image scanners, or other measuring devices. These inputs can be specified in either monochrome or multidimensional color - most commonly in the three-channel RGB (red / green / blue) model.
  • the processor 110 can determine an alignment of the print image relative to the target print image and rotate the print image on the basis of the alignment. This has the advantage that an alignment of the print image with respect to the target print image or the masking template can be corrected.
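  • The patent does not prescribe a particular alignment method; as one possible illustration, the following sketch estimates a global translation between print image and target print image by phase correlation (rotation would require an additional step, e.g. within the local position correction per grid cell).

```python
import numpy as np

def estimate_shift(print_gray, target_gray):
    """Estimate the (dy, dx) translation of the print image relative to the
    target print image by phase correlation; both are 2-D gray-value arrays
    of identical shape."""
    f1, f2 = np.fft.fft2(print_gray), np.fft.fft2(target_gray)
    cross_power = f1 * np.conj(f2)
    cross_power /= np.abs(cross_power) + 1e-12
    corr = np.fft.ifft2(cross_power).real
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    # convert wrap-around peak positions to signed shifts
    if dy > corr.shape[0] // 2:
        dy -= corr.shape[0]
    if dx > corr.shape[1] // 2:
        dx -= corr.shape[1]
    return int(dy), int(dx)

def align_to_target(print_gray, dy, dx):
    """Shift the print image back so that it lies on top of the target print image."""
    return np.roll(np.roll(print_gray, -dy, axis=0), -dx, axis=1)
```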
  • The processor 110 can be designed to determine the plurality of raster cell images 111 for a respective color channel of a plurality of color channels 246 of a color space 247, as shown, for example, in Figure 2.
  • processor 110 can be designed to determine delta-E raster cell image 112 based on a Euclidean distance in the color space between raster cell image 111 of print image 120 and the corresponding raster cell image 111 of target print image 121.
  • Every color that actually occurs can be assigned a color location in a three-dimensional space.
  • the preceding training 150 of the database 160 of Delta-E cell images with associated ground truth cell images takes place based on neural networks (NN), support vector machine (SVM) and / or multi-layer perceptron (MLP) methods.
  • NN: neural network
  • SVM: support vector machine
  • MLP: multi-layer perceptron
  • NNs: artificial neural networks
  • The topology of a network, i.e. the assignment of connections to nodes, must be well thought out depending on its task.
  • The training phase follows, in which the network "learns". A neural network learns through the following methods: developing new connections, deleting existing connections, changing the weighting (the weights from neuron to neuron), adjusting the threshold values of the neurons (if these have threshold values), adding or deleting neurons, and modifying the activation, propagation or output function.
  • the learning behavior changes when the activation function of the neurons or the learning rate of the network changes.
  • a network "learns” mainly by modifying the weights of the neurons.
  • NNs are able to learn complicated non-linear functions using a "learning" algorithm that tries to determine all parameters of the function from existing input and desired output values using an iterative or recursive procedure.
  • a support vector machine serves as a classifier and regressor.
  • a support vector machine divides a number of objects into classes in such a way that the broadest possible area around the class boundaries remains free of objects.
  • the starting point for building a support vector machine is a set of training objects, for which it is known which class they belong to. Each object is represented by a vector in a vector space.
  • The task of the support vector machine is to fit a hyperplane into this space which acts as a separating surface and divides the training objects into two classes. The distance between the hyperplane and those vectors that are closest to it is maximized. This wide, empty margin should later ensure that objects that do not exactly correspond to the training objects are classified as reliably as possible.
  • the perceptron is a simplified artificial neural network.
  • In its simplest form, a perceptron consists of a single artificial neuron with adjustable weights and a threshold value.
  • Today, this term covers various combinations of the original model; a distinction is made between single-layer and multi-layer perceptrons (MLP, multi-layer perceptron).
  • Perceptron networks convert an input vector into an output vector and thus represent a simple associative memory.
  • The pixel-specific threshold value function 140 classifies a fragment 224, 611 (as shown, for example, in Figure 2 or Figure 6) in the Delta-E raster cell image 112 that is due to an irregularity in the alignment of the print image 120 with respect to the target print image 121 as not being a pixel defect 113. This prevents misalignments of the print image with respect to the target print image 121 from being determined as a printing error or defect. Such fragments mostly occur in the form of lines or hatching, in particular at points at which the print image 120 is slightly shifted relative to the target print image 121.
  • The pixel-specific threshold value function 140 classifies a contiguous area 214, 610 of pixels in the Delta-E raster cell image 112 that exceeds a predetermined size as a pixel defect (e.g. as shown in Figure 2 or Figure 6). This means that misprints, which usually take up a certain area, are recognized as printing errors or defects.
  • both the training 150 of the database 160 of Delta-E cell images with associated ground truth cell images and the user-specific classification 115 of the defect image 114 are based on the same training process 150.
  • In a training process, for example, a user is provided with different print images and associated target print images. The user recognizes errors in the print images based on his subjective impression and marks them (labeling process). The print images marked as containing errors by the user, together with the associated (actual) print images, are learned in the training phase by means of a machine learning method such as NN, MLP or SVM. This enables the machine learning method to also classify new images for printing errors based on the learned pattern.
  • The training process 150 is based on a classification of a database of print images (see, for example, 401, 411, 421 in Figure 4) against the corresponding target print images (see, for example, 402, 412, 422 in Figure 4) based on a subjective error detection by a user, as described above.
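  • As a hedged illustration of this training process with one of the methods named above (here an SVM via scikit-learn), using synthetic placeholder features and labels in place of the real labeled database of print images and target print images.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

# X: one feature vector per cell (e.g. component area, inner radius, correlation,
#    contrast, defect pixels per row/column); y: labels from the user's labeling
#    process (1 = real printing error, 0 = no error / pseudo error).
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 6))                     # placeholder features
y = (X[:, 0] + 0.5 * X[:, 3] > 0).astype(int)     # placeholder labels

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)
clf = SVC(kernel="rbf", C=1.0)                    # an MLPClassifier would work analogously
clf.fit(X_train, y_train)
print("held-out accuracy:", clf.score(X_test, y_test))
```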
  • The pixel-specific threshold value function 140, 505 can assign a threshold value in the range from 0 to 1 to each color value of the Delta-E raster cell image 112.
  • the pixel-specific threshold value function 140, 505 can include, for example, a logistic function, for example a sigmoid function or a step function with respect to the gray values of the delta-E raster cell image 503.
  • The classification 115 can be based on the following features: an area of connected components of the Delta-E raster cell image 503 (see Figure 5), an inner radius of the connected components of the Delta-E raster cell image 503, a correlation between the actual print image and the target print image, and a contrast of the Delta-E raster cell image 503, in particular a color difference between background and foreground.
  • the print inspection device 100 can furthermore include a camera which is designed to optically record the print object in order to obtain the print image.
  • the print inspection device 100 can comprise a scanner or a reading device in order to scan or read in the print image.
  • The processor 110 is designed to output a printing error when printing the print image 120 with a color printer as the inspection result 116, the printing error occurring in the form of a color point or color stripe which is due to a malfunction of a color channel 246 (see Fig. 2) of the color printer.
  • Fig. 2 shows an architectural diagram of a print inspection device 200 for optical inspection of a print image 250 of a print object according to an embodiment.
  • The print inspection device 200 is a specific embodiment of the print inspection device 100 described above with reference to Figure 1.
  • An input image 250 corresponds to the print image 120 from Figure 1 .
  • the input image 250 is assigned to a target print image 241 of a model 240 which corresponds to the ideal image of the print object, ie the image without printing errors.
  • In a color calibration stage 255, the input image 250 and the target print image 241 are color-calibrated.
  • the aim of the color calibration 255 is to adapt the color behavior of the input image 250 to the color behavior of the target print image 241.
  • Calibration refers to establishing a known relationship with a standard color space.
  • Inputs are the input image 250 and the target print image 241. These inputs can be specified either monochrome or in multidimensional color - most often in the three-channel RGB (red / green / blue) model.
  • A color-calibrated input image 251 is generated, for example with white balance, as shown in Fig. 2.
  • the target print image 241 can also be displayed as a color-calibrated target print image.
  • the target print image 241 or the color-calibrated target print image and color-calibrated (actual) print image 251 are aligned 256 with one another.
  • The images are shifted or rotated with respect to one another so that both images are finally aligned and lie on top of one another. That is, the processor 110 converts the position data of both images accordingly, so that the position data are aligned with one another.
  • A target print image 242 and an actual print image 252 are then available which are aligned with one another.
  • Both images are then split into respective color channels 246 in the present color space 247, e.g. RGB, so that an actual print image 253 and a corresponding target print image are generated for each color channel 246.
  • a grid 244 or raster is placed over the print image 253 and the associated target print image 242 in order to divide the print image 253 and the associated target print image 242 into a plurality of raster cell images 254, each of which represents partial sections of the respective print image.
  • the grid size is preset or freely selectable.
  • a local position correction 245 can be carried out via the grid layer.
  • Each cell or raster cell of the grid 244 is treated in the print inspection device 200 as an independent layer 210, 220, 230; in Figure 2, the three layers 210, 220, 230 are shown by way of example.
  • a Delta E raster cell image 201 is determined on the basis of a comparison between model and image.
  • the Delta-E raster cell image 215 of a first layer 210 can be determined based on a Euclidean distance in the color space between the raster cell image 214 of the print image 253 and the corresponding raster cell image 213 of the target print image 242.
  • Delta E is calculated as the Euclidean distance between the color locations (L*, a*, b*)1 and (L*, a*, b*)2.
  • the delta-E raster cell image 225 of a second layer 220 can be determined e.g. based on the Euclidean distance in the color space between the raster cell image 224 of the print image 253 and the corresponding raster cell image 223 of the target print image 242.
  • the delta-E grid cell image 235 of an n-th layer 230 can be determined, e.g. based on the Euclidean distance in the color space between the grid cell image 234 of the print image 253 and the corresponding grid cell image 233 of the target print image 242.
  • a database is trained using an activation function 202 per pixel.
  • The activation function 202 can be based, for example, on a relationship in which pl denotes a probability variable, μ its mean value and σ its variance. The probability distribution is therefore characterized by a sigmoid function.
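  • The relationship itself is not reproduced in this text; one common logistic parameterization consistent with the quantities named above (an assumption, not necessarily the exact formula of the patent) is pl(x) = 1 / (1 + exp((x - μ) / σ)), or its mirror image depending on the sign convention, where x denotes the Delta-E gray value of the pixel.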
  • A classification takes place in a classification layer 203 of a training database, for example the training database 160 with Delta-E and ground truth images as described for Figure 1, in order to decide between an error and no error.
  • Connected components are thus detected in the Delta-E raster cell image 201 and classified as defective or non-defective. For example, a connected component 216 of the Delta-E raster cell image 215 of the first layer 210 is classified as not defective and a connected component 217 of the Delta-E raster cell image 215 of the first layer 210 is classified as defective.
  • A connected component 226 of the Delta-E raster cell image 225 is classified as not defective and a connected component 227 of the Delta-E raster cell image 225 is classified as not defective.
  • A connected component 236 of the Delta-E raster cell image 235 is classified as not defective and a connected component 237 of the Delta-E raster cell image 235 is classified as defective.
  • A pixel-specific threshold value function can classify a fragment 224 in the Delta-E raster cell image 201 that is due to an irregularity in the alignment of the print image 250 with respect to the target print image 242 as not being a pixel defect. It can thus be avoided that misalignments of the print image 250 with respect to the target print image 242 are determined as a printing error or defect. Such fragments usually appear in the form of lines or hatching, in particular in places where the print image 120 is slightly shifted with respect to the target print image 121. Furthermore, the pixel-specific threshold value function can classify a contiguous area 214 of pixels in the Delta-E raster cell image 215 which exceeds a predetermined size or extent as a pixel defect. This means that misprints, which usually take up a certain area, are recognized as printing errors or defects.
  • a user is presented with various print images with associated target print images.
  • the user recognizes errors in the print images based on his subjective impression and marks them (labeling process).
  • The print images marked as containing errors by the user, together with the associated (actual) print images, are learned in the training phase by means of a machine learning method such as NN, MLP or SVM. This enables the machine learning method to also classify new images for printing errors based on the learned pattern.
  • The activation function 202, also referred to as a pixel-specific threshold value function, can be based on the following features: an area of connected components of the Delta-E raster cell image 215, 225, 235, an inner radius of the connected components of the Delta-E raster cell image 215, 225, 235, a correlation between color channels 246 of the Delta-E raster cell image 215, 225, 235, and a contrast of the Delta-E raster cell image 215, 225, 235, in particular a color difference between background and foreground.
  • The number of defect pixels per row is also taken into account as a feature; columns are considered analogously for vertical streakiness.
  • an error image 204 is generated which gives the viewer an overview of printing errors in the print image.
  • Both the training of the classification layer 203 with activation function on the basis of a database and the customer-specific classification 205 of the defect image 204 are based on the same training process 150, for example as described above for Figure 1.
  • In the training process, for example, a user is presented with various print images with associated target print images. The user recognizes errors in the print images based on his subjective impression and marks them (labeling process). The print images marked as containing errors by the user, together with the associated (actual) print images, are learned in the training phase by means of a machine learning method such as NN, MLP or SVM. This enables the machine learning method to also classify new images for printing errors on the basis of the learned pattern, so that the print inspection device 200 has a higher error detection rate.
  • the print inspection device 200 was tested with various data sets of portrait images.
  • a first data set had 72 portrait images, with 64 images having color points within the portrait and 8 images being free from printing errors.
  • a second data set had 600 portrait images, 480 images having color points within the portrait, 117 images having stripes and only 3 images being free from printing errors. Each image had only one misprint, namely in a color channel. In the test data set, the printing errors were varied across the individual color channels.
  • The color points had areas of 0.4 mm², 0.5 mm² and 0.6 mm². Additional errors, i.e. printing errors of area 0.4 mm², were detected with a detection rate of 85.53%, and critical errors, i.e. printing errors of areas 0.5 mm² and 0.6 mm², with a detection rate of 98.39%.
  • Fig. 3 shows two print images 301, 302 of a female person, each of which has a printing error caused by a printer.
  • A printing error at point 311, i.e. in the area of the right cheek, has been recognized by manual inspection.
  • Print defects at points 312 and 313, that is to say light horizontal stripes in the face area, have been detected by manual inspection.
  • the printing errors are in each case printing errors of a second class, which arise due to incorrect behavior of the printer and are noticeable as colored dots or stripes. In this case, either a color channel is missing completely or a color channel is printing although it should not be printing.
  • This class of defects can be specified more precisely.
  • Printing errors of the second class differ from printing errors of a first class, which are anomalies such as scratches, dirt, etc. Printing errors of this first class cannot be specified further.
  • Printing errors of the first class can be recognized by means of statistical anomaly detection, while printing errors of the second class can be used to train a classifier, as described above for Figures 1 and 2, which can then detect these defects.
  • Fig. 4 shows three exemplary series of raster cell images, each with input 401, 411, 421, model or target value 402, 412, 422, and manually specified ground truth 403, 413, 423.
  • The printing error in the actual raster cell image 401, that is to say the point-like or circular area which occupies the entire center of the image 401, is very pronounced and can therefore easily be recognized manually by a user.
  • the user can thus identify or label the ground truth grid cell image 403.
  • White areas in the ground truth grid cell image 403 represent areas with a high level of confidence, while black areas represent areas with a low level of confidence.
  • the confidence level in the ground truth grid cell image 403 is scaled from 0 to 240, where 0 represents the black area with a low confidence level and 240 the white area with a high confidence level. Even if no gray areas can be seen in the ground truth grid cell image 403, the user can also label confidence levels between 0 and 240 (i.e. gray levels from dark gray to light gray).
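  • A tiny illustrative helper (an assumption about how this scale might be used in practice, not taken from the patent) that maps the 0..240 ground truth gray levels to confidences in the range [0, 1].

```python
import numpy as np

def ground_truth_confidence(gt_cell, max_level=240):
    """Map a manually labeled ground truth raster cell image with gray levels
    0..240 (0 = low confidence, 240 = high confidence) to confidences in [0, 1]."""
    return np.clip(gt_cell.astype(np.float64) / max_level, 0.0, 1.0)
```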
  • The printing error in the actual raster cell image 411, that is to say the point-like or circular area which occupies the entire center of the image 411, is not very pronounced and can therefore only be recognized manually by the user with difficulty, after intensive examination. After carefully inspecting the image, the user can identify or label the ground truth raster cell image 413. Due to this careful examination by the user, the ground truth raster cell image 413 corresponds approximately to the ground truth raster cell image 403 from the first series, i.e. despite the poorer input data 411, the result is the same.
  • The printing error in the actual raster cell image 421, i.e. the point-like or circular area which occupies the entire center of the image 421, is also only slightly pronounced and can therefore only be recognized manually by the user with difficulty, after intensive examination.
  • the second series has a light gray error in a light gray image
  • the third series has a dark gray error in a dark gray image.
  • the user must carefully inspect the image in order to then be able to label the ground truth raster cell image 423.
  • the ground-truth grid cell image 423 corresponds roughly to the ground-truth grid cell image 403 from the first series due to the careful examination by the user, i.e. despite the poorer input data 421, the result is the same.
  • Fig. 5 shows a schematic diagram of a pixel-related training 500 of a Delta-E raster cell image according to an embodiment.
  • The Delta-E raster cell image 503 is determined as described above for Figures 1 and 2.
  • The ground truth raster cell image 504, which is available from a previous labeling process as described above for Figure 4, is used together with the Delta-E raster cell image 503 as an input variable for the training 507 in order to train a threshold value function 505, on the basis of which an optimal threshold 506 can be determined.
  • Brightness values from 0 to around 25 are weighted with threshold value 1, which corresponds to a high level of confidence.
  • Brightness values of around 30 to 50 are weighted with the threshold value 0, which corresponds to a low confidence level.
  • In between, the threshold value function falls monotonically from 1 to 0.
  • Such values correspond to an average confidence level which is established in the border area between an area which is recognized as a printing error and an area which is not recognized as a printing error.
  • the optimal threshold value 506 is thus in the range between 25 and 30, e.g. 27 or 28.
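  • A sketch of how the optimal threshold 506 could be read off a trained, monotonically falling threshold value function that is tabulated per brightness value; placing the cut at 0.5 is an assumption made for the illustration.

```python
import numpy as np

def optimal_threshold(brightness, weights, cut=0.5):
    """Given a trained threshold value function tabulated as ``weights`` over
    ``brightness`` values (1 for low brightness, 0 for high brightness, falling
    monotonically in between), return the first brightness whose weight drops
    below ``cut``."""
    weights = np.asarray(weights)
    below = np.nonzero(weights < cut)[0]
    return brightness[below[0]] if below.size else brightness[-1]

# Example matching the description: 1 up to ~25, 0 from ~30, falling in between.
b = np.arange(0, 51)
w = np.clip((30 - b) / 5.0, 0.0, 1.0)
print(optimal_threshold(b, w))  # 28 here, i.e. within the 25..30 range described above
```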
  • Fig. 6 shows an exemplary image sequence of a target print image 601, a real print image 602 and an associated defect image 603.
  • the target print image 601 represents a portrait of a female person.
  • a colored spot or point can be seen on the right cheek of the person, which represents a printing error.
  • In the defect image 603, which corresponds to the defect image 114 described for Figure 1 or also to the error image 204 described for Figure 2, two different types of printing errors can be recognized.
  • the above-mentioned colored spot or point 610 is recognized as a printing error;
  • lines or hatching 611 in the area of the hairline are recognized as printing errors, which may have arisen due to an inexact alignment of the two images 601, 602 with one another.
  • the lines or hatching 611 can be excluded as printing errors, so that only the colored spot or point 610 is recognized as a printing error.
  • Fig. 7 shows a schematic diagram of a training 700 of the classification layer for generating a defect classifier according to an embodiment.
  • Real defects 701 are shown on the left in the figure, while pseudo defects 702 are shown on the right.
  • the real defects 701 are colored spots or points, while the pseudo-defects 702 are lines or hatching, which usually arise due to a misalignment of the actual image to the desired image.
  • The defect classifier 704 is determined, which can differentiate between the two types of defects 701 and 702. With this defect classifier 704, it is then possible to classify only the spots 610 in the defect image 603 of Figure 6 as printing errors, but not the stripes 611 or hatching.
  • FIG. 8 shows a schematic diagram of a method 800 for the optical inspection of a print image of a print object according to an embodiment.
  • a target print image 121 is assigned to the print object.
  • The method comprises the steps of the processor 110 which have been described in more detail above with reference to Figure 1 and Figure 2.
  • The method 800 includes determining 801 a respective plurality of raster cell images 111 for the print image 120 and the target print image 121 based on a subdivision of the print image and the target print image into raster cells 130, as described above, for example, for Figures 1 and 2.
  • The method 800 comprises determining 802 a Delta-E raster cell image 112 for each raster cell 130 based on a color distance between a raster cell image 111 of the print image 120 and a raster cell image 111 of the target print image 121, as described above, for example, for Figures 1 and 2.
  • The method 800 comprises determining 803, for each pixel of the Delta-E raster cell image 112, based on a pixel-specific threshold value function 140, whether a pixel defect 113 is present, the pixel-specific threshold value function 140 being based on a previous training 150 of a database 160 of Delta-E raster cell images with associated manually specified ground truth raster cell images, as described above, for example, for Figures 1 and 2.
  • The method 800 comprises assembling 804 the previously determined pixel defects 113 in the respective Delta-E raster cell images 112 to form a defect image 114, which gives an overview of the pixel defects 113 of the print image 120, as described above, for example, for Figures 1 and 2.
  • The method 800 comprises outputting 805 an inspection result 116 based on a user-specific classification 115 of the defect image 114, as described above, for example, for Figures 1 and 2.

Landscapes

  • Engineering & Computer Science (AREA)
  • Quality & Reliability (AREA)
  • Image Analysis (AREA)
  • Image Processing (AREA)
EP20188815.3A 2019-08-02 2020-07-31 Dispositif d'inspection d'impression et procédé d'inspection optique d'une image imprimée d'un objet imprimé Pending EP3771564A1 (fr)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
DE102019120938.2A DE102019120938B4 (de) 2019-08-02 2019-08-02 Druckinspektionsvorrichtung und Verfahren zur optischen Inspektion eines Druckbildes eines Druckobjekts

Publications (1)

Publication Number Publication Date
EP3771564A1 true EP3771564A1 (fr) 2021-02-03

Family

ID=71899531

Family Applications (1)

Application Number Title Priority Date Filing Date
EP20188815.3A Pending EP3771564A1 (fr) 2019-08-02 2020-07-31 Dispositif d'inspection d'impression et procédé d'inspection optique d'une image imprimée d'un objet imprimé

Country Status (2)

Country Link
EP (1) EP3771564A1 (fr)
DE (1) DE102019120938B4 (fr)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1384580A1 (fr) * 2002-07-27 2004-01-28 serv-o-tec Druck- und Papierverarbeitungsmaschinen GmbH Procédé et dispositif pour réglage des registres d'une machine à imprimer
WO2004056570A1 (fr) * 2002-12-20 2004-07-08 Océ Document Technologies GmbH Procede et dispositif de controle en temps reel d'images imprimees
DE102017116882A1 (de) 2017-07-26 2019-01-31 Bundesdruckerei Gmbh Druckinspektionsvorrichtung zur optischen Inspektion eines Druckbildes eines Druckobjekts

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6024018A (en) 1997-04-03 2000-02-15 Intex Israel Technologies Corp., Ltd On press color control system
DE10314071B3 (de) 2003-03-28 2004-09-30 Koenig & Bauer Ag Verfahren zur qualitativen Beurteilung eines Materials mit mindestens einem Erkennungsmerkmal
JP6487866B2 (ja) 2016-03-09 2019-03-20 富士フイルム株式会社 印刷結果検査装置、方法およびプログラム

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1384580A1 (fr) * 2002-07-27 2004-01-28 serv-o-tec Druck- und Papierverarbeitungsmaschinen GmbH Procédé et dispositif pour réglage des registres d'une machine à imprimer
WO2004056570A1 (fr) * 2002-12-20 2004-07-08 Océ Document Technologies GmbH Procede et dispositif de controle en temps reel d'images imprimees
DE102017116882A1 (de) 2017-07-26 2019-01-31 Bundesdruckerei Gmbh Druckinspektionsvorrichtung zur optischen Inspektion eines Druckbildes eines Druckobjekts

Also Published As

Publication number Publication date
DE102019120938A1 (de) 2021-02-04
DE102019120938B4 (de) 2023-12-21

Similar Documents

Publication Publication Date Title
DE10314071B3 (de) Verfahren zur qualitativen Beurteilung eines Materials mit mindestens einem Erkennungsmerkmal
DE3937950C2 (fr)
EP0012724B1 (fr) Procédé mécanisé d'estimation de la qualité d'impression d'un produit imprimé ainsi que dispositf pour sa mise en oeuvre
DE69910631T2 (de) Bildanpassung um die Empfindlichkeit auf Falschregistrierung zu vermindern
DE102019204139A1 (de) Training für künstliche neuronale Netzwerke mit besserer Ausnutzung der Lern-Datensätze
DE10221318A1 (de) Verfahren zur Farbvariationskorrektur und Defekterkennung für Wafer
EP2787485B1 (fr) Procédé et dispositif de détection automatique d'erreurs dans des corps flexibles
DE102021201124A1 (de) Trainieren von bildklassifizierernetzen
DE102016109030A1 (de) Verfahren zum Bewerten eines Scheinwerfers
EP1578609B1 (fr) Procede et dispositif de controle en temps reel d'images imprimees
WO2020200620A1 (fr) Masquage d'objets contenus dans une image
DE102019120938B4 (de) Druckinspektionsvorrichtung und Verfahren zur optischen Inspektion eines Druckbildes eines Druckobjekts
EP3435056B1 (fr) Dispositif d'inspection de pression destiné à l'inspection optique d'une image d'impression d'un objet d'impression
EP3482348A1 (fr) Procédé et dispositif pour catégoriser une surface de rupture d'un élément
EP3316216B1 (fr) Procédé de vérification d'un objet
EP1741060A2 (fr) Procede pour comparer une image avec au moins une image reference
EP0978097A1 (fr) Procede pour modifier la dimension d'elements a traits
DE102008033171A1 (de) Verfahren und Vorrichtung zur Inline-Qualitätssicherung an Druckmaschinen
DE102009058605A1 (de) Verfahren und Vorrichtung zum Erhöhen des Kontrastes eines Grauwertebildes
DE102021130505A1 (de) Verfahren zur Bestimmung einer Beleuchtungsfarbe, Bildverarbeitungsvorrichtung und Aufnahmesystem
DE102021204343A1 (de) Steuergerät zum Erzeugen von Trainingsdaten zum Trainieren eines Algorithmus des maschinellen Lernens
DE202021102338U1 (de) Steuergerät zum Erzeugen von Trainingsdaten zum Trainieren eines Algorithmus des maschinellen Lernens
EP4152206A1 (fr) Dispositif et procédé d'usinage de câbles
DE102022209113A1 (de) Training von instanzsegmentierungsalgorithmen mit partiell annotierten bildern
DE102022212818A1 (de) Verfahren zum Erzeugen von zusätzlichen Trainingsdaten zum Trainieren eines Algorithmus des maschinellen Lernens zum Erkennen von Anomalien in Bilddaten

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION HAS BEEN PUBLISHED

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

AX Request for extension of the european patent

Extension state: BA ME

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE

17P Request for examination filed

Effective date: 20210730

RBV Designated contracting states (corrected)

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: EXAMINATION IS IN PROGRESS

17Q First examination report despatched

Effective date: 20220202

P01 Opt-out of the competence of the unified patent court (upc) registered

Effective date: 20230526