CN110667254A - Nozzle health detection by means of neural networks

Nozzle health detection by means of neural networks

Info

Publication number
CN110667254A
Authority
CN
China
Prior art keywords
printing
image data
neural network
computer
data
Prior art date
Legal status
Granted
Application number
CN201910395823.1A
Other languages
Chinese (zh)
Other versions
CN110667254B (en)
Inventor
S·内布
N·R·诺瑞克
A·亨
J·福歇
Current Assignee
Heidelberger Druckmaschinen AG
Original Assignee
Heidelberger Druckmaschinen AG
Priority date
Filing date
Publication date
Application filed by Heidelberger Druckmaschinen AG
Publication of CN110667254A
Application granted
Publication of CN110667254B
Active

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B41PRINTING; LINING MACHINES; TYPEWRITERS; STAMPS
    • B41JTYPEWRITERS; SELECTIVE PRINTING MECHANISMS, i.e. MECHANISMS PRINTING OTHERWISE THAN FROM A FORME; CORRECTION OF TYPOGRAPHICAL ERRORS
    • B41J2/00Typewriters or selective printing mechanisms characterised by the printing or marking process for which they are designed
    • B41J2/005Typewriters or selective printing mechanisms characterised by the printing or marking process for which they are designed characterised by bringing liquid or particles selectively into contact with a printing material
    • B41J2/01Ink jet
    • B41J2/135Nozzles
    • B41J2/165Preventing or detecting of nozzle clogging, e.g. cleaning, capping or moistening for nozzles
    • B41J2/16579Detection means therefor, e.g. for nozzle clogging
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B41PRINTING; LINING MACHINES; TYPEWRITERS; STAMPS
    • B41JTYPEWRITERS; SELECTIVE PRINTING MECHANISMS, i.e. MECHANISMS PRINTING OTHERWISE THAN FROM A FORME; CORRECTION OF TYPOGRAPHICAL ERRORS
    • B41J2/00Typewriters or selective printing mechanisms characterised by the printing or marking process for which they are designed
    • B41J2/005Typewriters or selective printing mechanisms characterised by the printing or marking process for which they are designed characterised by bringing liquid or particles selectively into contact with a printing material
    • B41J2/01Ink jet
    • B41J2/135Nozzles
    • B41J2/165Preventing or detecting of nozzle clogging, e.g. cleaning, capping or moistening for nozzles
    • B41J2/16585Preventing or detecting of nozzle clogging, e.g. cleaning, capping or moistening for nozzles for paper-width or non-reciprocating print heads
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B41PRINTING; LINING MACHINES; TYPEWRITERS; STAMPS
    • B41JTYPEWRITERS; SELECTIVE PRINTING MECHANISMS, i.e. MECHANISMS PRINTING OTHERWISE THAN FROM A FORME; CORRECTION OF TYPOGRAPHICAL ERRORS
    • B41J2/00Typewriters or selective printing mechanisms characterised by the printing or marking process for which they are designed
    • B41J2/005Typewriters or selective printing mechanisms characterised by the printing or marking process for which they are designed characterised by bringing liquid or particles selectively into contact with a printing material
    • B41J2/01Ink jet
    • B41J2/21Ink jet for multi-colour printing
    • B41J2/2132Print quality control characterised by dot disposition, e.g. for reducing white stripes or banding
    • B41J2/2142Detection of malfunctioning nozzles
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B41PRINTING; LINING MACHINES; TYPEWRITERS; STAMPS
    • B41JTYPEWRITERS; SELECTIVE PRINTING MECHANISMS, i.e. MECHANISMS PRINTING OTHERWISE THAN FROM A FORME; CORRECTION OF TYPOGRAPHICAL ERRORS
    • B41J2/00Typewriters or selective printing mechanisms characterised by the printing or marking process for which they are designed
    • B41J2/005Typewriters or selective printing mechanisms characterised by the printing or marking process for which they are designed characterised by bringing liquid or particles selectively into contact with a printing material
    • B41J2/01Ink jet
    • B41J2/21Ink jet for multi-colour printing
    • B41J2/2132Print quality control characterised by dot disposition, e.g. for reducing white stripes or banding
    • B41J2/2146Print quality control characterised by dot disposition, e.g. for reducing white stripes or banding for line print heads
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/04Architecture, e.g. interconnection topology
    • G06N3/045Combinations of networks
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/08Learning methods
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B41PRINTING; LINING MACHINES; TYPEWRITERS; STAMPS
    • B41JTYPEWRITERS; SELECTIVE PRINTING MECHANISMS, i.e. MECHANISMS PRINTING OTHERWISE THAN FROM A FORME; CORRECTION OF TYPOGRAPHICAL ERRORS
    • B41J25/00Actions or mechanisms not otherwise provided for
    • B41J2025/008Actions or mechanisms not otherwise provided for comprising a plurality of print heads placed around a drum

Abstract

The invention relates to a method for detecting and compensating for defective printing nozzles in an inkjet printer (7) by means of a computer (6), wherein a test pattern (10) is printed by the inkjet printer (7), the test pattern (10) is detected by means of at least one sensor, digitized and transmitted as digital image data (11) to the computer (6), and the computer then determines characteristic values for the individual printing nozzles of a print head (5) from the digital image data (11), detects defective printing nozzles from the characteristic values and compensates for the printing nozzles concerned. The method is characterized in that the computer supplies the digital image data (11) to a neural network (14) which has been trained by means of training data (12, 12a, 12b) in such a way that the neural network determines the corresponding characteristic values of the printing nozzles of the print head (5) from the supplied digital image data (11).

Description

Nozzle health detection by means of neural networks
Technical Field
The invention relates to a method for detecting defective printing nozzles in an inkjet printer by means of a neural network.
Background
In inkjet printers, defective printing nozzles that go undetected (for example failed nozzles or nozzles that jet at an angle) lead to waste and thus to commercially worthless prints. The goal is therefore production with no waste, or at least with minimal waste. The printing quality of the inkjet printer can be measured by means of different characteristic values, which are obtained by suitable image processing of recordings of suitable test patterns.
These include:
- the quality of the individual printing nozzles, described by specific printing nozzle characteristic values (e.g. intensity, skew, gray-scale value) and obtained from a so-called printing nozzle test pattern;
- the uniformity transverse to the printing direction, described by the profile of the density variation transverse to the printing direction;
- the head positions, which can be determined by means of so-called y-/x-stitching patterns;
- and so on.
In continuous printing operation, these characteristic values are determined inline at predetermined intervals. Most inkjet printers have an inline inspection system (i.e. a camera system) which records a digital image of a suitable test pattern. These image data are analyzed programmatically. A whole series of methods is used here, for example image processing by means of sub-pixel methods, filtering, Fourier transformation, etc. These methods are constantly being developed further and work robustly to a certain extent. For many of these partial steps, one or more parameters are required which influence the result to some degree. In order to be able to select the parameters appropriately, empirical knowledge is required, and the overall process has to be optimized continuously. Once the values determined in this way are available, they can be used, for example, to classify the nozzles or to adjust the print head.
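To illustrate the kind of parameter-dependent image processing referred to here (this example is not taken from the patent), a conventional white-line check on a digitized area-coverage patch might look as follows; the window size and threshold are exactly the sort of hand-tuned parameters the text criticizes, and all names and values are illustrative assumptions.

```python
import numpy as np

def find_white_line_candidates(patch, window=5, threshold=0.08):
    """Classical, parameter-dependent white-line check on a digitized
    area-coverage patch (2-D grayscale array, 0 = black, 1 = white).

    `window` and `threshold` are hand-tuned parameters of the kind the
    conventional approach depends on.
    """
    profile = patch.mean(axis=0)                           # column-wise mean intensity
    kernel = np.ones(window) / window
    baseline = np.convolve(profile, kernel, mode="same")   # local smoothing
    deviation = profile - baseline                         # white lines appear brighter
    return np.flatnonzero(deviation > threshold)           # candidate column indices

# Example: a synthetic 50 % gray patch with one unprinted column ("white line").
patch = np.full((64, 256), 0.5)
patch[:, 100] = 1.0
print(find_white_line_candidates(patch))                   # -> [100]
```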
For a method that is robust to a certain extent, a very large number of parameters must be selected and adapted. How these parameters interact with one another is hardly known explicitly. In the prior art, different methods exist for different partial aspects of quality. To quantify the different effects, a plurality of different test patterns has to be generated and processed, for example when determining the printing nozzle characteristic values (printing nozzle test patterns) or the head positions (y-/x-stitching patterns).
All this makes automated quality assessment by means of digital image processing very complex, expensive and not flexible enough. Therefore, there is a need to find better alternatives for this field of application.
In this context, it is generally known from the prior art to use neural networks, i.e. self-learning algorithms implemented as programs on a computer, for image recognition and digital image processing. For instance, US application US 2004/0101181 A1 discloses a method for processing medical image data in which an artificial neural network is used to identify certain types of errors in the current image data. The neural network is trained specifically to recognize these specific errors. However, the neural network proposed in that application is specifically adapted to medical image data. In the embodiments and applications disclosed there, it is therefore not suitable for the quality control of printing nozzles in inkjet printers.
Disclosure of Invention
The object of the present invention is therefore to provide a method for detecting and compensating for defective printing nozzles in an inkjet printer which is simpler and less expensive to carry out than the methods known from the prior art, while at least maintaining the same efficiency.
This object is achieved by a method for detecting and compensating for defective printing nozzles in an inkjet printer by means of a computer, wherein a test pattern is printed by the inkjet printer, the test pattern is detected by means of at least one image sensor, digitized and transmitted as digital image data to the computer, and the computer then determines characteristic values of the individual printing nozzles of the print head from the digital image data, detects defective printing nozzles on the basis of the characteristic values and compensates for the printing nozzles concerned. The method is characterized in that the computer supplies the digital image data to a neural network which has been trained (einlernen) with the aid of training data in such a way that the neural network determines the respective characteristic values of the printing nozzles of the print head from the supplied digital image data. The use of a neural network is necessary because the resolution of the image sensor is significantly lower than the printing resolution of the inkjet printer. This means that the printed test pattern is available after digitization, i.e. after detection by the image sensor, at a significantly lower resolution than in printed form. As a result of this lower resolution, information about the printed test pattern is lost which would be needed to evaluate the state of the printing nozzles under examination by means of conventional image processing algorithms as used in the prior art. In the prior art, attempts have been made to compensate for the lower resolution of the digital image data of the detected test pattern by various digital image processing tools. However, this is very laborious and very inflexible. In the method according to the invention, the detected digital image data are therefore instead supplied to a neural network in the form of a self-learning algorithm, which then evaluates the image data with respect to the characteristic values of the printing nozzles to be determined. For the self-learning algorithm to be able to do this, it must, in addition to being equipped with basic digital image processing tools, first be trained by means of digitally available training data. An untrained neural network cannot be used meaningfully. The neural network must therefore be trained on the characteristic values of the printing nozzles using digital training data, for example test patterns with certain known properties. This can be achieved by providing the neural network with a digitally available test pattern which contains well-defined values with respect to the characteristic values of the printing nozzles, usually in the form of defined artificial image errors. The neural network is then trained with the aid of these training data until it can determine the correct characteristic values.
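The patent leaves the network architecture and framework open. What follows is a minimal sketch, assuming a small convolutional regression network in PyTorch that maps a camera-resolution strip of the detected test pattern to one vector of characteristic values per nozzle; the class name, layer sizes and the number of characteristic values are assumptions for illustration, not taken from the patent.

```python
import torch
import torch.nn as nn

class NozzleNet(nn.Module):
    """Illustrative regression network: camera-resolution test-pattern strip
    in, per-nozzle characteristic values out (architecture is an assumption;
    the patent does not specify one)."""

    def __init__(self, n_nozzles: int, n_values: int = 2):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d((4, 64)),          # fixed-size feature map
        )
        self.head = nn.Linear(32 * 4 * 64, n_nozzles * n_values)
        self.n_nozzles, self.n_values = n_nozzles, n_values

    def forward(self, x):                           # x: (batch, 1, H, W)
        f = self.features(x).flatten(1)
        return self.head(f).view(-1, self.n_nozzles, self.n_values)

# A batch with one grayscale strip of 128 x 2048 camera pixels:
model = NozzleNet(n_nozzles=1024)
strip = torch.rand(1, 1, 128, 2048)
values = model(strip)                               # -> (1, 1024, 2) characteristic values
```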
Only then is the neural network used to evaluate the actual, detected and digitized test patterns and to subsequently determine the actual characteristic values of the printing nozzles.
Advantageous and therefore preferred embodiments of the method result from the description of the invention and the drawing.
A preferred embodiment of the method according to the invention provides that a printing nozzle test pattern and/or an area coverage element is used as the test pattern. These two types of test pattern are the ones most commonly used for detecting defective printing nozzles. Printing nozzle test patterns are required in particular for determining the characteristic values of the printing nozzles. They usually consist of one or more horizontal rows of printed objects, each individual printed object being produced by one printing nozzle of the print head. The characteristic values of the individual printing nozzles of the print head can then be determined from the position and appearance of the individual printed objects. In addition, area coverage elements or area wedges are frequently used; these typically consist of gray areas or solid areas of a single process color or of several process colors. Such area coverage elements or area wedges are particularly well suited for detecting white lines and other printing errors caused by defective printing nozzles.
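A minimal sketch of how such test patterns could be laid out digitally, assuming one printed line segment per nozzle staggered over several rows plus a uniform area-coverage element; the dimensions and the staggering scheme are illustrative assumptions, not taken from the patent.

```python
import numpy as np

def nozzle_test_pattern(n_nozzles, rows=4, seg_len=24, gap=8):
    """One short vertical line segment per nozzle, staggered over `rows` rows
    so that neighbouring segments do not touch (1 = ink, 0 = paper).
    All dimensions are illustrative."""
    height = rows * (seg_len + gap)
    pattern = np.zeros((height, n_nozzles), dtype=np.uint8)
    for nozzle in range(n_nozzles):
        row = nozzle % rows                       # stagger across rows
        y0 = row * (seg_len + gap)
        pattern[y0:y0 + seg_len, nozzle] = 1      # column index = nozzle position
    return pattern

def area_coverage_element(n_nozzles, height=64, coverage=0.5):
    """Uniform tone area (e.g. 50 % gray) in which missing nozzles show up
    as white lines; rendered here simply as a constant-value block."""
    return np.full((height, n_nozzles), coverage, dtype=np.float32)

pattern = nozzle_test_pattern(n_nozzles=1024)
gray = area_coverage_element(n_nozzles=1024)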
A further preferred embodiment of the method according to the invention provides that the digital image data are cleaned by the computer of interference effects such as static lens errors and lens distortions of the at least one image sensor. For the neural network to be able to evaluate the available digital image data correctly, these interference effects have to be removed from the digital image data. Otherwise, the neural network may detect spurious errors or may fail to correctly resolve actual errors on which these interference effects are superimposed, which would make the correct determination of the characteristic values of the printing nozzles considerably more difficult. The neural network could, of course, also be trained to ignore these interference effects. However, this would lengthen and complicate the actual training process, which is why it is preferable to have the computer remove the interference effects from the digital image data.
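One common way to perform such a cleanup (an assumption; the patent does not prescribe an implementation) is a calibrated undistortion, sketched here with OpenCV plus an optional flat-field division. The calibration values shown are placeholders.

```python
import cv2
import numpy as np

def clean_camera_image(raw, camera_matrix, dist_coeffs, flat_field=None):
    """Remove static sensor/optics effects before the image is handed to the
    neural network. The OpenCV-based correction is an assumed implementation,
    not one prescribed by the patent."""
    img = cv2.undistort(raw, camera_matrix, dist_coeffs)   # static lens distortion
    if flat_field is not None:                             # white-reference image in the
        img = img.astype(np.float32) / flat_field          # same value range as `raw`
        img = np.clip(img, 0.0, 1.0)
    return img

# Calibration data would normally come from a one-off camera calibration:
K = np.array([[1800.0, 0.0, 1024.0],
              [0.0, 1800.0, 64.0],
              [0.0, 0.0, 1.0]])
dist = np.array([-0.12, 0.03, 0.0, 0.0, 0.0])              # k1, k2, p1, p2, k3
raw = (np.random.rand(128, 2048) * 255).astype(np.uint8)
corrected = clean_camera_image(raw, K, dist)
```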
A further preferred embodiment of the method according to the invention provides that the training data for training the neural network are generated artificially by the computer, in that test data sets in the form of digitized print image data with known errors, in particular in the form of printing nozzle test patterns and/or area coverage elements with image errors caused by defective printing nozzles, are generated and used as input variables for the neural network. This is a particular advantage of using a neural network. Instead of laboriously programming a computer and a computer-controlled image inspection system with various image processing operations so as to find errors in the test image and evaluate them with respect to the characteristic values of the printing nozzles, as in the prior art, a neural network can be trained to generate the printing nozzle characteristic values using artificially generated training data in the form of digitally available test patterns with artificially introduced errors. Since this entire process is carried out automatically by the computer, a large amount of computer-generated, artificial training data can be provided to the neural network in a relatively short time, and the network can thus be trained quickly. The training data can be generated artificially by the computer, for example, in such a way that they contain test patterns with defined characteristic values which are realized in the digital image as defined image errors, each caused by a defined defect of a printing nozzle. The computer can be programmed so that the errors contained in the artificially generated training data can be varied in virtually any combination. A large set of different training data can therefore be generated in a very short time and used to train the neural network.
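A sketch of how the known errors for one artificial training sample could be drawn; the error types follow the description, while the distributions, ranges and field names are illustrative assumptions.

```python
import numpy as np

def draw_known_errors(n_nozzles, seed=0):
    """Randomly draw the 'known errors' that label one artificial training
    sample. The error types follow the text; the distributions, ranges and
    field names are illustrative assumptions."""
    rng = np.random.default_rng(seed)
    return {
        "skew_px":    rng.normal(0.0, 0.6, n_nozzles),   # oblique jetting, per nozzle
        "drop_gain":  rng.normal(1.0, 0.1, n_nozzles),   # deviating drop size ("weak dot")
        "missing":    rng.random(n_nozzles) < 0.01,      # completely failed nozzles
        "head_shift": rng.normal(0.0, 1.5),              # displacement of the whole print head
        "head_rot":   rng.normal(0.0, 0.002),            # rotation of the whole print head (rad)
        "cam_wobble": rng.normal(0.0, 0.8),              # camera-track fluctuation transverse to the printing direction
    }

# The returned record doubles as the ground-truth label for the rendered image.
errors = draw_known_errors(n_nozzles=1024)
```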
A further preferred embodiment of the method according to the invention provides that pre-raster image processor data (pre-RIP data), in which each printing nozzle is assigned, for each pixel step (Pixelschritt) in the printing direction, the drop size to be printed, are used as the digitized print image data with known errors. Pre-RIP data are particularly suitable for use as digitized print image data: they consist of a data set which assigns to each printing nozzle, for each pixel step in the printing direction, the drop size to be printed.
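In terms of a data structure, such pre-RIP data can be pictured as a two-dimensional array of drop sizes; the array dimensions and the drop-size coding below are assumptions for illustration.

```python
import numpy as np

# Pre-RIP data as described: for every printing nozzle (column) and every
# pixel step in the printing direction (row), the drop size to be printed.
# A drop-size coding of 0..3 (0 = no drop) is an assumption for illustration.
N_NOZZLES, N_STEPS = 1024, 4096
pre_rip = np.zeros((N_STEPS, N_NOZZLES), dtype=np.uint8)

# Example: a test-pattern row in which every nozzle fires a medium drop
# for 24 consecutive pixel steps.
pre_rip[0:24, :] = 2

drop_sizes_of_nozzle_17 = pre_rip[:, 17]     # the print instruction for one nozzle
```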
A further preferred embodiment of the method according to the invention provides that, in order to train the neural network, the digitized print image data with known errors are first converted by the computer to a higher image resolution by means of upsampling, the known errors are then introduced into the digitized print image data, which are supplied to the neural network, and the image resolution is subsequently reduced step by step by means of downsampling until the image resolution of the at least one image sensor is reached. So that the neural network can reliably learn the digitized print image data with known errors, it is proposed to introduce the known errors into the print image data at the higher image resolution, to train the neural network first at this higher image resolution, and then to reduce the image resolution gradually until the actual image resolution of the image sensor is reached. This makes it easier for the neural network to learn and to find the known errors that were introduced.
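A sketch of this upsample/inject/downsample pipeline under the assumptions stated in the comments; the simple column-wise error injection stands in for a full drop-placement model and reuses the hypothetical error record from the sketch above.

```python
import numpy as np

def upsample(img, factor):
    """Nearest-neighbour upsampling to a working resolution that is roughly
    ten to one hundred times finer, as described in the text."""
    return np.repeat(np.repeat(img, factor, axis=0), factor, axis=1)

def downsample(img, factor):
    """Block-mean downsampling towards the (much lower) camera resolution."""
    h, w = img.shape
    h, w = h - h % factor, w - w % factor
    return img[:h, :w].reshape(h // factor, factor, w // factor, factor).mean(axis=(1, 3))

def make_training_image(pattern, errors, up=10, down=127):
    """Render one artificial training image: upsample the ideal pre-RIP
    pattern (print resolution, e.g. 2540 dpi) by `up`, inject known per-nozzle
    errors at this high resolution, then downsample by `down` so the result
    roughly matches a 200 dpi camera. The column-wise error injection is a
    crude stand-in for a real drop model; head displacement, rotation and
    camera-track wobble are omitted for brevity."""
    hi = upsample(pattern.astype(np.float32), up)
    for nozzle in range(pattern.shape[1]):
        col = slice(nozzle * up, (nozzle + 1) * up)
        if errors["missing"][nozzle]:
            hi[:, col] = 0.0                                  # failed nozzle -> white line
        else:
            hi[:, col] *= errors["drop_gain"][nozzle]         # weak dot (deviating drop size)
            shift = int(round(errors["skew_px"][nozzle] * up))
            hi[:, col] = np.roll(hi[:, col], shift, axis=1)   # crude lateral skew within the block
    return downsample(hi, down)
```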
A further preferred embodiment of the method according to the invention provides that the known errors in the digitized print image data include skewed jetting of the printing nozzles, deviating drop sizes, displacement and/or rotation of the print head, and trajectory fluctuations of the at least one image sensor transverse to the printing direction. These are the most common and most frequently occurring errors that negatively affect the way the printing nozzles of an inkjet printer operate. They can be detected by determining the effect these printing nozzle errors have on the printed image. This is preferably done by the evaluation of the digitized test pattern, by means of which the characteristic values are determined and checked. It goes without saying that in the method according to the invention all further conceivable printing nozzle errors, or other variables influencing the characteristic values to be determined, that are not explicitly mentioned here can also be taken into account.
A further preferred embodiment of the method according to the invention provides that the characteristic values of the printing nozzles reflect the known errors in the digitized print image data. As already explained, these characteristic values describe the influence of the most common printing nozzle errors on the performance of the individual printing nozzles and of the inkjet printer as a whole. They thus reflect the state of the printing nozzles, the so-called "nozzle health". By printing and measuring the test pattern, the characteristic values can then be quantified and used to assess the state of the inkjet printer.
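As a sketch of how such characteristic values could be turned into a "nozzle health" verdict, assuming per-nozzle amplitude and skew values as inputs; the thresholds and categories are assumptions, not taken from the patent.

```python
def classify_nozzle(weak_dot, skew_jet, weak_limit=0.35, skew_limit=1.5):
    """Turn the characteristic values determined by the network into a simple
    'nozzle health' verdict. Threshold values are illustrative assumptions."""
    if weak_dot < weak_limit:
        return "failed"            # amplitude too low -> treat as missing nozzle
    if abs(skew_jet) > skew_limit:
        return "deviating"         # phase offset too large -> candidate for compensation
    return "ok"

print(classify_nozzle(weak_dot=0.1, skew_jet=0.2))   # -> failed
```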
A further preferred embodiment of the method according to the invention provides that the camera system of an inspection system which is installed inline in the printing press, downstream of the printing unit, is used as the at least one image sensor. It makes sense to use, for monitoring the operation of the individual printing nozzles, the inspection system that is present in most inkjet printers anyway, which is usually installed inline in the printer downstream of the printing unit and is intended to check the print quality produced. The camera of this inspection system serves as the image sensor for detecting the printed test pattern and digitizing it. Of course, external image sensors or additional internal image sensors which are not part of the image inspection system can also be used for the method according to the invention. This is of interest in particular for inkjet printers that do not have an inline image inspection system. In principle, however, using the existing image inspection system and its camera is the most obvious and, for reasons of efficiency, the most sensible approach.
Drawings
The invention and its structurally and/or functionally advantageous embodiments are further described below with reference to the attached drawings, in accordance with at least one preferred embodiment. In the drawings, mutually corresponding elements are provided with the same reference numerals, respectively.
The figures show:
fig. 1 shows an example of the structure of a sheet inkjet printer;
FIG. 2 shows a schematic example of a "white line" due to a "missing nozzle";
fig. 3 shows a test pattern in the original print resolution and the same test pattern in the detected camera resolution;
fig. 4 shows a schematic process of training of a neural network for image analysis processing.
Detailed Description
The field of application of the preferred embodiment variant is an inkjet printer 7. Fig. 1 shows an example of the basic structure of such a machine 7, consisting of a feeder 1 which feeds the printing substrate 2 into a printing unit 4, where it is printed by the print heads 5, up to the material collector 3. The machine shown is a sheet-fed inkjet printer 7 controlled by a control computer 6. During operation of such a printing press 7, individual printing nozzles in the print heads 5 of the printing unit 4 may fail, as described above. The result is then a "white line" 9 or, in multicolor printing, distorted color values. Fig. 2 shows an example of such a "white line" 9 in the printed image 8.
Here, the starting point of the method according to the invention is a digital camera image 13 of the printed nozzle test pattern 11, detected by means of the inline inspection system of the inkjet printer 7 and corrected, where necessary, for static lens errors, lens distortions and the like. The test pattern 10 may be, for example, a common test pattern 10 in which the individual printing nozzles participate. In addition, grid areas, so-called "big dot test staircases" (Bigdottesttreppe), may also be included. In principle, other test patterns may also be used; it is only important that the test pattern 10 has elements that can be assigned to the individual printing nozzles. The procedure of the following preferred embodiment variant is described below; in this embodiment variant, only the test pattern 11 is present. In the production case, the quality-describing characteristic values of the printing nozzles are determined from the camera image 11 that is then available by passing the digital image data 13 through a trained neural network 14. The training of the neural network 14 is further illustrated in fig. 3 and fig. 4 and proceeds as follows:
- The test pattern 10 is available at printing nozzle resolution as pre-RIP data, i.e. as a data set which assigns to each printing nozzle, for each pixel step in the printing direction, the drop size to be printed;
- the resolution of the image is artificially increased by a factor of ten to one hundred (so-called upsampling);
- at this high resolution, random but physically meaningful errors are predefined and introduced into the image, for example:
  - continuous and non-continuous properties of the printing nozzles with regard to skewed jetting and drop size (so-called weak dots);
  - displacement and rotation of the entire print head 5;
  - track fluctuations in the transverse direction, i.e. the relative position of track and camera;
  - image resolution, noise, exposure, ink/paper interaction, etc.;
- the resolution of the high-resolution image is artificially reduced to the camera resolution by so-called downsampling.
Fig. 3 shows, on the left, an example of a test pattern at the high original image resolution 10 (2540 dpi) and, on the right, the same test pattern at the low camera resolution 11 (200 dpi).
In this way, artificial test data sets with image data 12 containing known errors are generated as input variables. A large number of these test data sets 12 is required to train the neural network 14. The training process is shown schematically in fig. 4, where it can be seen how the neural network 14 is trained in several stages. The data sets 12 are divided, for example in the ratio 60/40, into a training data set 12a and a test data set 12b. The network 14 is "trained" on the artificially generated training data set 12a, while the trained network 14 is checked against the test data set 12b. First, the network 14 is trained with the aid of the training data 12a. Once a sufficient level has been reached, verification is carried out with the aid of the test data 12b. The actual image data 13 are then evaluated with the verified network 14, so that finally checked image data with position information 13a (x/y coordinates) are available. Using a large number of different test data sets 12 poses no problem, since the training data 12a are generated artificially in the computer 6; the more test data sets 12 there are, the better the neural network 14 is trained. A further major advantage of generating the training data on the computer 6 is that the errors introduced are known. The results for this simple test pattern are the following characteristic values: the amplitude of the individual printing nozzles (the so-called weak dot value), the phase of the individual printing nozzles (the so-called skew-jet value), and the orientation of the print head 5 in terms of position and rotation.
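A sketch of this staged procedure under the same assumptions as the earlier PyTorch sketch (it reuses the hypothetical NozzleNet class): the artificial samples are split 60/40 into training and test data, the network is fitted on the training part and then checked against the held-out part.

```python
import torch
from torch.utils.data import TensorDataset, DataLoader, random_split

# images: (N, 1, H, W) artificial camera-resolution samples; labels: (N, n_nozzles, 2)
# ground-truth characteristic values known from the injected errors.
def train_on_artificial_data(images, labels, n_nozzles, epochs=20):
    dataset = TensorDataset(images, labels)
    n_train = int(0.6 * len(dataset))                       # 60/40 split as in fig. 4
    train_set, test_set = random_split(dataset, [n_train, len(dataset) - n_train])

    model = NozzleNet(n_nozzles)                            # illustrative model sketched earlier
    optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
    loss_fn = torch.nn.MSELoss()

    for _ in range(epochs):                                 # "train" on 12a
        for x, y in DataLoader(train_set, batch_size=8, shuffle=True):
            optimizer.zero_grad()
            loss_fn(model(x), y).backward()
            optimizer.step()

    model.eval()
    with torch.no_grad():                                   # "verify" on 12b
        test_loss = sum(loss_fn(model(x), y).item()
                        for x, y in DataLoader(test_set, batch_size=8))
    return model, test_loss
```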
In another implementation, the same starting data may also be obtained using a modified test pattern.
Additional alternative or further developed embodiments are described below:
A) The considered, i.e. complete, test pattern is extended by a grid area.
However, there is a problem with this: at present there is no method for generating a synthetic print image of a grid area. That is, the test data 12 cannot be generated in the computer 6 along the chain PDF -> raster image processor -> print (including paper effects and ink effects such as diffusion, coalescence, etc.). The actual print data 13 must therefore be used, which reduces the flexibility and the speed of training of the neural network 14.
The method starts as the standard method. Additional training data are then generated and an additional training step is performed on the basis of the existing network 14; the network is "further developed":
the test pattern contains the existing pattern 10 and is supplemented with a grid area;
the digital camera images or the printed sheets 2 that have been recorded in the printing run are evaluated by a subjective evaluator;
if the evaluator identifies a so-called white line 9, this information is stored together with the position of the white line 9.
In a network 14 trained in this way, the information "white line present or not" is added, for each printing nozzle, to the printing nozzle information (i.e. the characteristic values) determined by the standard method.
B) Extending the scope of observation up to a PDF comparison
The standard method is modified in such a way that the test pattern 10 for the respective printing color has elements which can be assigned to the individual printing nozzles and which are so small that they can be placed above or below the actual print content on the individual sheets 2.
If the standard method has been performed, the training data 12 can be quasi-synthetically generated. The process is as follows:
the sheet 2 contains a test pattern 10 according to standard methods for the respective printing color. The neural network 14 thus trained now reliably recognizes the white line 9 and its assignment to a particular printing nozzle;
providing one or more subsequent sheets 2 with a miniaturized test pattern and an arbitrary and varying printed material (e.g. customer material);
this process is repeated until the neural network 14 can reliably identify the white line 9 from any client material;
in a further embodiment, additional input data in the form of raw image data 13 of the print content can be supplied, for example a data set which assigns to each printing nozzle, for each pixel step in the printing direction, the drop size to be printed.
If this is to be achieved without the standard method, the individual sheets 2 would have to be evaluated by one or more users, which is an unrealistic expense, or a camera would be required that has a higher resolution than the one used with the standard method.
The method according to the invention using the neural network 14 has a number of advantages over the prior art with its fixed image processing algorithms. For instance, a multitude of parameters no longer has to be selected and used. The success of the method and its robustness do not depend on a number of arbitrarily chosen parameters, but result from the quantity and quality of the training data 12, which can even be generated artificially in part and are therefore available almost without restriction. Almost any number of training data 12a and test data 12b can be generated. Moreover, the errors and their probabilities are known to any desired degree of accuracy, since they are generated artificially. In the final stage, the method can be developed to the point where white lines are recognized directly in the printed image 13 by the trained neural network 14, so that the test pattern 10 hardly needs to be printed and evaluated any more, or not at all.
List of reference numerals
1 feeder
2 Current printing substrate/Current printing sheet
3 material collector
4 ink-jet printing mechanism
5 ink jet print head
6 computer
7 ink jet printer
8 printed image on current printed sheet
9 white line
10 test pattern at high original image resolution
11 test pattern at low camera resolution
12 artificially generated image data with position information
12a artificially generated training data
12b artificially generated test data
13 actual image data
13a actual image data with position information
14 neural network

Claims (9)

1. Method for detecting and compensating for defective printing nozzles in an inkjet printer (7) by means of a computer (6), wherein a test pattern (10) is printed by the inkjet printer (7), the test pattern (10) is detected by means of at least one sensor, digitized and transmitted as digital image data (11) to the computer (6), and the computer (6) then determines characteristic values of the individual printing nozzles of a print head (5) from the digital image data (11), detects defective printing nozzles from the characteristic values and compensates for the printing nozzles concerned, characterized in that the computer (6) supplies the digital image data (11) to a neural network (14) which has been trained by means of training data (12, 12a, 12b) in such a way that the neural network determines the respective characteristic values of the printing nozzles of the print head (5) from the supplied digital image data (11).
2. Method according to claim 1, characterized in that a printing nozzle test pattern and/or an area coverage element is used as the test pattern (10).
3. Method according to any of the preceding claims, characterized in that the digital image data (11) are cleaned by the computer (6) of interference effects such as static lens errors and lens distortions of the at least one image sensor.
4. Method according to any of the preceding claims, characterized in that the training data (12, 12a, 12b) for training the neural network (14) are generated artificially by the computer (6), in that a test data set (12) in the form of digitized print image data with known errors, in particular in the form of printing nozzle test patterns and/or area coverage elements with image errors caused by defective printing nozzles, is generated and used as input variables for the neural network (14).
5. Method according to claim 4, characterized in that pre-raster image processor data, in which each printing nozzle is assigned, for each pixel step in the printing direction, the drop size to be printed, are used as the digitized print image data with known errors.
6. Method according to claim 5, characterized in that, for training the neural network (14), firstly digitized printed image data with known errors are converted by the computer (6) into a higher image resolution by means of upsampling, then known errors are introduced into the digitized printed image data, which are sent to the neural network (14), and then the image resolution is continuously reduced by means of downsampling until the image resolution of the at least one image sensor is reached.
7. The method according to any of claims 4 to 6, characterized in that the known errors in the digitized print image data include: skewed jetting of the printing nozzles, deviating drop sizes, displacement and/or rotation of the print head (5), and trajectory fluctuations of the at least one image sensor transverse to the printing direction.
8. The method of claim 7, wherein the characteristic values of the printing nozzles reflect known errors in the digitized printed image data.
9. Method according to any of the preceding claims, characterized in that a camera system of an inspection system is used as the at least one image sensor, which inspection system is mounted inline in the printing press (7) behind the printing mechanism (4).
CN201910395823.1A 2018-07-03 2019-05-13 Nozzle health detection method by means of neural network Active CN110667254B (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
DE102018210895 2018-07-03
DE102018210895.1 2018-07-03

Publications (2)

Publication Number Publication Date
CN110667254A true CN110667254A (en) 2020-01-10
CN110667254B CN110667254B (en) 2022-08-02

Family

ID=68943901

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910395823.1A Active CN110667254B (en) 2018-07-03 2019-05-13 Nozzle health detection method by means of neural network

Country Status (2)

Country Link
CN (1) CN110667254B (en)
DE (1) DE102019208149A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116533661A (en) * 2022-02-03 2023-08-04 海德堡印刷机械股份公司 Method for printing a print with a fault-free and compensated fault-free printing nozzle

Families Citing this family (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102019132794A1 (en) * 2019-12-03 2021-06-10 Heidelberger Druckmaschinen Ag Missing nozzle treatment taking geometric structures into account
EP3871892B1 (en) 2020-02-28 2022-02-09 Heidelberger Druckmaschinen AG Detektion method to minimize maculature
WO2021204480A1 (en) * 2020-04-10 2021-10-14 Memjet Technology Limited Method of evaluating printhead condition
WO2021234509A1 (en) * 2020-05-17 2021-11-25 Landa Corporation Ltd. Detecting a defective nozzle in a digital printing system
EP3954540A1 (en) * 2020-08-11 2022-02-16 Schott AG Method for printing an image on a substrate and corresponding system
JP2022049858A (en) * 2020-09-17 2022-03-30 セイコーエプソン株式会社 Machine learning method, machine learning program, and liquid discharge system
DE102021134448A1 (en) 2021-12-23 2023-06-29 Canon Production Printing Holding B.V. Device for printing a recording medium

Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5200816A (en) * 1991-06-25 1993-04-06 Scitex Corporation Ltd. Method and apparatus for color processing with neural networks
AU1010100A (en) * 1996-01-26 2000-03-02 Stephen L. Thaler Neural network based data examining system and method
JP2000071437A (en) * 1998-09-02 2000-03-07 Ricoh Co Ltd Ink jet recorder, recording medium and control table generating method
JP2004130799A (en) * 2002-09-30 2004-04-30 Hewlett-Packard Development Co Lp Inkjet printing method and its system providing improved image durability
CN102555507A (en) * 2010-11-08 2012-07-11 施乐公司 Method and system for reflex printing to compensate for registration errors in a continuous web inkjet printer
CN106064529A (en) * 2015-04-24 2016-11-02 海德堡印刷机械股份公司 For the method detecting the print nozzles of inefficacy in ink-jet printing system
CN106573467A (en) * 2014-06-30 2017-04-19 科迪华公司 Techniques for arrayed printing of permanent layer with improved speed and accuracy
CN107160865A (en) * 2016-03-08 2017-09-15 松下知识产权经营株式会社 Image processing apparatus, printing equipment and image processing method
CN107538917A (en) * 2016-06-28 2018-01-05 海德堡印刷机械股份公司 Print nozzles compensation is carried out by the print nozzles of deviation
CN108162600A (en) * 2016-12-07 2018-06-15 海德堡印刷机械股份公司 The method and test pattern of failure print nozzles in detection and compensation ink-jet printer
CN108215508A (en) * 2016-12-14 2018-06-29 海德堡印刷机械股份公司 The method and test pattern of failure print nozzles in detection and compensation ink-jet printer

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7418123B2 (en) 2002-07-12 2008-08-26 University Of Chicago Automated method and system for computerized image analysis for prognosis

Also Published As

Publication number Publication date
DE102019208149A1 (en) 2020-01-09
CN110667254B (en) 2022-08-02

Similar Documents

Publication Publication Date Title
CN110667254B (en) Nozzle health detection method by means of neural network
US10214017B2 (en) Method for detecting and compensating for failed printing nozzles in an inkjet printing machine
US9539803B2 (en) Method for detecting failed printing nozzles in inkjet printing systems and inkjet printing machine
CN110171203B (en) Method for compensating for malfunctioning printing nozzles in an inkjet printer
CN109649006B (en) Method for detecting defective printing nozzles in an ink jet printing press by means of a computer
JP5848978B2 (en) A test pattern that is difficult to perceive by human observation and an image data analysis method that corresponds to the test pattern of an inkjet printer
CN102529445A (en) System and method for detecting missing inkjets in an inkjet printer using image data of printed documents without a priori knowledge of the documents
CN110126469B (en) Method for detecting defective printing nozzles in an inkjet printer
KR100636236B1 (en) Method and apparatus for detecting missing nozzle
JP2011209105A (en) Image inspection apparatus and printing equipment, and method of inspecting image
CN110667258B (en) Method for analyzing printing quality by means of neural network
CN110077113B (en) Method for detecting malfunctioning printing nozzles in an inkjet printer
US20130057886A1 (en) Image inspection apparatus, image recording apparatus, and image inspection method
CN111439035A (en) Improved printed nozzle test pattern
CN110717889A (en) Defect detection method and device based on digital printing, terminal and readable medium
CN113320286B (en) Method for detecting and compensating defective nozzle of ink-jet printer by means of computer
US20230264483A1 (en) Detecting a defective nozzle in a digital printing system
JP2008257394A (en) Unit, method and program for image processing
JP7191655B2 (en) Variable print nozzle test pattern
JP2009530137A (en) Image processing system for printing press
JP5129539B2 (en) RECORDING FAILURE DETECTING DEVICE, IMAGE RECORDING DEVICE, AND RECORDING FAILURE DETECTING METHOD
CN106371967A (en) Method for evaluating the significance of failed print nozzles in inkjet printing systems
JP4631384B2 (en) Printing state inspection method, character inspection method, and inspection apparatus using these methods
CN110682684B (en) Two-dimensional printing of nozzle test patterns
DE102020120962B4 (en) Recognition method and recognition system for clearly recognizing an object

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant