WO2005087460A1 - Apparatus, method and system for detecting the width and position of adhesives applied to a substrate


Info

Publication number: WO2005087460A1
Application number: PCT/SE2004/000386
Authority: WO (WIPO, PCT)
Other languages: French (fr)
Inventors: John Erik Larsson, Kent Thavelin, Stefan RÖNNBÄCK, Tomas LAGERBÄCK, Tommy Sandin
Original Assignees: Kappa Packaging B.V., Optima Ab
Application filed by Kappa Packaging B.V. and Optima Ab

Classifications

    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B27: WORKING OR PRESERVING WOOD OR SIMILAR MATERIAL; NAILING OR STAPLING MACHINES IN GENERAL
    • B27G: ACCESSORY MACHINES OR APPARATUS FOR WORKING WOOD OR SIMILAR MATERIALS; TOOLS FOR WORKING WOOD OR SIMILAR MATERIALS; SAFETY DEVICES FOR WOOD WORKING MACHINES OR TOOLS
    • B27G 11/00: Applying adhesives or glue to surfaces of wood to be joined
    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01B: MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B 11/00: Measuring arrangements characterised by the use of optical techniques
    • G01B 11/02: Measuring arrangements characterised by the use of optical techniques for measuring length, width or thickness
    • G01B 11/028: Measuring arrangements characterised by the use of optical techniques for measuring length, width or thickness by measuring lateral position of a boundary of the object


Abstract

This invention is an apparatus for detecting and measuring the width and position of adhesives (D3, 9.1) applied in at least one line to a substrate (C, 9). The apparatus comprises a device (1) for registering pictures of the substrate (C, 9) and of the adhesive (D3, 9.1) on the substrate (C, 9); the picture registration device (1) cooperates with devices (10, 12) that transform/process the pictures into manageable information and make measurements from that information. The invention is also a method for detecting and measuring the width and position of adhesives (D3, 9.1) applied in at least one line to a substrate (C, 9), and a system for producing board.

Description

Apparatus, method and system
Field of invention
This invention relates to an apparatus, a method and a system for detecting and measuring the width and position of adhesives applied to a substrate, and to a use of the apparatus and/or method and/or system.
Corrugated board is produced by bonding layers of substrates to each other. A flat paper, the liner, is bonded to one or more layers of corrugated paper, wave-formed paper called fluting, having ridges/tips and valleys. The fluting is formed by heating and conditioning the paper, usually by steam, and passing the paper through a pair of corrugating rolls. Adhesive, with starch and water as the major components, is applied in lines to the ridges of one side of the fluting. The fluting is then joined with one layer of liner to form a single faced corrugated board, and the adhesive is used for the joining/bonding. Single wall corrugated board is formed by bonding one more layer of liner to the opposite side of the fluting. With two or three layers of fluting, with liner in between and on the outside, double and triple wall corrugated board is formed.
The adhesive can be of any suitable kind, for example a starch based cold emulsion adhesive. The adhesive will from now on in the text be referred to as "glue", but it is to be understood as adhesive.
The amount, distribution and position of glue on the fluting ridges, in both the machine direction (MD) and the cross machine direction (CMD), are of great importance for the quality and properties of corrugated board. MD means the direction in which the corrugated fibreboard moves in the machine; CMD means the direction perpendicular to the machine direction.
The amount, distribution and position of glue on the fluting ridges influence the degree of warp of the board, the amount of washboard and the delamination of layers, which, when not within the stated quality limits, lead to costly waste. Applying much glue, perhaps too much glue, just to be on the safe side leads to a costly over-consumption of glue. It also leads to increased energy consumption when the excess moisture is removed in the drying section of the corrugator machine. Much/too much application and/or uneven distribution of glue may also lead to warp and/or washboard, while too little glue may lead to delamination. The width of the glue on the fluting ridges, see Fig. 9, is measured in a length unit, e.g. mm. The glue width is strongly correlated with the amount of glue, i.e. it is reasonable to assume that the amount of glue is proportional to the width and can be calculated with a scale factor.
Normally the amount of glue is measured as the mass of glue per area unit, e.g. g/m2. The common practice is to calculate the amount of glue as the measured glue consumption per time unit, kg/hour, divided by the area production per time unit, m2/hour. The latter is calculated by multiplying the machine speed, m/min, by the machine width, m. This measure of the amount of glue is a slowly varying value that does not capture the fast variations in the amount of applied glue.
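As a minimal sketch of this conventional, slowly varying calculation (the function and variable names below are illustrative, not taken from the patent), the amount of glue in g/m2 can be computed as follows:

```python
def glue_amount_g_per_m2(consumption_kg_per_h, machine_speed_m_per_min, machine_width_m):
    """Conventional (slowly varying) glue amount: mass flow divided by area production."""
    area_m2_per_h = machine_speed_m_per_min * 60.0 * machine_width_m  # m2/hour
    return consumption_kg_per_h * 1000.0 / area_m2_per_h              # g/m2

# Example: 120 kg/h of glue at 250 m/min over a 2.5 m wide web gives 3.2 g/m2
print(glue_amount_g_per_m2(120.0, 250.0, 2.5))
```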
Glue position is the centre position of the glue string in relation to the fluting ridge, see fig. 9.
The common practice, up to this date, for detecting, checking and measuring the amount, distribution and position of applied glue is to manually separate the liner from the fluting when the manufacture of the corrugated fibre board is finished and then make a visual inspection. An indicator liquid, usually an iodine solution, is brushed onto the fluting; it colours the glue residues but not the fluting. The glue residues then appear in a more visible manner and the amount, distribution and position of applied glue can be checked with the naked eye. The result is a manual visual inspection of the applied glue which is time consuming; the result of the inspection does not affect the manufacturing process until quite a long time after the gluing moment, as it is not done on-line, and since some time has passed the glue has been absorbed by the paper and does not give a fully true picture of the glue amount at the gluing moment. The visual inspection is therefore difficult to use for control and feed-back.
The common practice, up to this date, to control the amount, distribution and position of glue is by manually adjusting the machine specific actuators for glue application. This is indeed a difficult task since the result is not measured directly. Only indirect measurements are available, such as the method described above and/or in terms of visual inspection of e.g. washboard, warp and delamination at the end of the corrugator. At such a late stage it cannot easily be distinguished to what extent a certain problem originates from the amount of glue, and/or its distribution and/or its position, and/or whether other variables are also of significant importance. Furthermore, the distance from the point of glue application to the end of the corrugator, the point of indirect measurement, corresponds to a time delay, which seriously limits achievable performance from a control point of view. Attempts have been made to detect, check, measure and control the amount of glue on the fluting ridges on-line, close to the gluing moment in the manufacturing line.
On-line detection and measurement of glue line characteristics of corrugated board is known from U.S. Pat. No. 5,663,565 (the '565 patent), entitled "System and method for the on-line measurement of key glue-line characteristics on corrugated board", issued September 2, 1997, and U.S. Pat. No. 6,281,500 (the '500 patent), entitled "Detection and measurement of cold emulsion adhesives applied to a substrate", issued August 28, 2001.
Both the '565 patent and the '500 patent present systems and methods for detecting and measuring glue and are based on information (temperature, mass of moisture and/or mass of starch) in the IR spectrum, or near IR spectrum, of light. The detection and measurement is made in one spot. The resolution of both methods depends on how small that one spot can be, on how accurately the velocity of the substrate can be measured and on high resolution timers. To get a proper picture of the glue presence along a/each ridge of the fluting the sensor has to be moved to and fro, see the bottom of column 2 in the '565 patent.
According to the disclosure of the '565 patent the sensor signal is typically the glue temperature measured by an IR pyrometer or the mass of moisture and/or starch measured by an IR or near IR absorption sensor. The chosen sensor measures the glue characteristics after the glue is applied to the ridges of the fluting and before the fluting is joined with the liner. To increase the signal to noise ratio of the measurement the fluting oscillation frequency is calculated from the machine speed and the flute length; the oscillation frequency is then used to isolate and amplify the glue related information in the sensor signal. If an IR or near IR absorption sensor is used the mass of the starch and/or water can be calculated based on the measurement signal and empirical calibration constants.
The '500 patent describes an apparatus and method to detect and measure the amount of glue by comparing the signals from two IR (near IR) absorption sensors with different wavelength sensitivity, one being sensitive in the absorption spectrum of water and/or starch and one being sensitive outside the spectrum of water and/or starch, both focusing on the same spot. When no glue is present the difference between the sensor signals is adjusted to be close to zero; when glue is present the reflected intensity in the absorption spectrum of water and/or starch is lower than in the spectrum outside the absorption spectrum of water and/or starch, thus creating a difference in the sensor signals. The difference in signals is amplified and/or thresholded to create a signal that is proportional to the amount of glue and/or that, in a discrete sense, indicates presence or absence of glue. Neither the '565 patent nor the '500 patent presents a solution regarding how to detect, check, measure or control the distribution and position of the glue.
One purpose of this invention is to provide an apparatus and a method for detecting and measuring the width, amount, distribution and position of adhesives applied to a substrate that do not have these problems/disadvantages.
This is made possible by an apparatus and a method having the characterising technical features specified in Claim 1 and Claim 19.
Description of invention
The invention will be described below with reference to the figures.
Fig. 1 shows an apparatus to be used when manufacturing single wall corrugated board having a device according to the invention.
Fig. 2 shows an apparatus to be used when manufacturing double wall corrugated board having a device according to the invention. Fig. 3 shows an embodiment of a device according to the invention.
Fig. 4 shows another embodiment of a device according to the invention.
Fig. 5a-5c shows a light arrangement for enhancement of contrasts between adhesive and paper. Fig. 5a shows the whole arrangement, Fig. 5b shows the light reflection on paper and
Fig. 5c shows the light reflection on the adhesive. Fig. 6a-6d shows an arrangement for reference measurement and fluting positioning by a projected light pattern. Fig. 6a shows the projected light appearance on the fluting, Fig. 6b shows the angles between the fluting and a camera and the fluting and a pattern projector, seen in a machine direction view, in a lengthwise direction. Fig. 6c shows the fluting, a camera and a pattern projector seen in a cross machine direction view, a side view. Fig. 6d shows an image viewing the valleys and the ridges on the fluting in lighter and darker ribbons with the projection of a light pattern being a laser line, sheet of light, in the machine direction.
The camera, pattern projector and fluting are arranged according to Fig. 6a.
Fig. 7 shows images of adhesive applied to the fluting. Fig. 7a shows three images, 1-3, taken by the camera, viewing adhesive on two ridges, at random positions, in each image. Fig. 7b shows the parts of each image that will be displayed for a person.
Fig. 8 shows a flowchart of an image processing operation according to the invention.
Fig. 9 shows an adhesive string on a ridge of the fluting. Fig. 10 shows a pixel image of variations in brightness and texture in areas with and without adhesive.
Fig. 11 shows a coordinate system with definitions used in algorithm descriptions. Fig. 12 shows a co-ordination of calculations of fast and slow variations in adhesive consumption.
Fig. 13 shows a diagram of feature variations, e.g. inter row difference, and a measurement model fitted to the data.
To reduce the mentioned problems the present invention makes it possible to detect, check, measure and control the amount, distribution and position of glue on-line, in a more direct and exact way and in a visual manner, based on the picture/image of the glue in relation to the substrate. Other advantages are also to be found within the invention.
Visualization
Visualization of the glue on the fluting ridges has to be made in a manner meaningful to the observer. An image of a part of the substrate has to be stable and non-confusing. The fluting ridges should be placed in the same position in every viewed image. If this is not the case the observer will not see, or will have difficulty seeing, the glue and changes in the glue due to the changed reference point of the image; also, the vision of the observer will not have time to adjust to the new reference point because of the high image rate.
The present invention uses software synchronization of the images by viewing only a part of the images and always positioning the fluting ridges in the same position in a viewing window/area. This method according to this invention works both on the corrugating roll and in the double backer where the fluting is free running. The method does not need special machine specific hardware with corresponding interface electronics. This way to handle the images will in this description of the invention from now on be called windowed viewing.
To a trained observer, for example a machine tender, a machine operator, the visualization gives important and useful information about the state of the manufacturing process and makes it possible to detect, check, measure and control the amount, distribution and position of glue. Examples of such information are the occurrence of glue drops or poorly dissolved starch in the glue.
Measurement
The present invention can detect, check and measure the glue width and/or the glue position on the fluting ridges in the machine direction. The invention can also detect, check and measure the glue distribution in the cross machine direction if multiple measurement modules are used or if one or more measurement units are mounted on a traversing unit. Measurements relative to the flute length can be done without any a priori information or synchronization with the corrugating machine. If a scaled measurement is desired it is enough to know the flute length of the fluting to be measured.
The invented method is based on image processing for detection and quantification of the glue in images captured by a camera, an arrangement of the lighting such that the difference in characteristics of glue versus fluting is enhanced, and the projection of a structured light pattern onto the fluting for positioning and scaling. The proposed solution has high resolution, is accurate, is dependable, measures in real time on-line, and is easy to use and to integrate with other systems.
The amount of glue, which is the desired measurement, is strongly correlated, proportional, to the width of the glue, thus making the width meaningful to use.
Detailed description of the invention
Fig. 1 shows a part of a production line to be used when manufacturing corrugated board A. The board A is produced by bonding layers of substrates to each other. A layer of flat paper B, called liner, is bonded to one or more layers of corrugated paper C, wave-formed paper called fluting, by the use of adhesive at an application, gluing, position D.
Single wall corrugated board A is formed by bonding one more layer of liner B to the opposite side of the fluting C. With two or three layers of fluting with liner X1 and X2 in between and on the outside, double and triple wall corrugated board is formed, see Fig. 2.
The following descriptions of the method, apparatus and production line describe how a system/machine capable of producing single face, single wall and double wall corrugated fibre board works and what it has to comprise to work satisfactorily. An extension to manufacturing triple wall corrugated board is obvious, as a third single face machine can easily be inserted in the production line.
Before entering the glue position D the paper layers B and C are preconditioned by heat, and possibly steam, for moisture control, giving improved conditions for the corrugating of the fluting and for the gluing process. The fluting C is then forced between two toothed rolls E1 and E2, the corrugating rolls, forming the fluting into the wave form. While still on the corrugating roll E2 the fluting C is "dipped" into the glue D1 on the glue roll D2, which has a constant coating of wet glue D1 applied to it in the glue box D3. A starch based cold emulsion adhesive is used as glue and for the bonding. The glue D1 has starch and water as its major components.
The amount, and width, of glue D1 applied to the fluting C, the fluting ridges C1, the top area of each wave in the corrugated fluting, depends greatly on the distance between the corrugating roll E2 and the glue roll D2. The amount of glue D1 also depends on the difference in surface velocity between the rolls E2 and D2 and on chemical factors like the blending of the glue D1 and the temperature and viscosity of the glue D1. Other factors not mentioned can also affect the amount of applied glue.
The position of the glue D1 on the fluting ridges C1 depends mainly, but not excluding other factors, on the difference in surface velocity between the corrugating roll E2 and the glue roll D2. If the surface velocity differs between the rolls E2 and D2 the glue D1 will be scraped off to one or the other side of the fluting ridges C1, depending on the direction of the velocity difference. Variations in the distribution of glue in the CMD depend mainly, but not only, on misalignment, in CMD, between the glue roll D2 and the corrugating roll E2, or misalignment of other machine components.
The distance between the corrugating roll E2 and the glue roll D2 is a common means for control of the amount of applied glue D1 in the single face machine, i.e. the width of the glue, and the difference in surface velocity between the corrugating roll E2 and the glue roll D2 is a common means for control of the position of the glue on the fluting ridges C1 in the single face machine. The CMD distribution is controlled by adjusting the distance between the glue roll D2 and corrugating roll E2 independently on each side of the machine.
After the glue D1 is applied to the fluting ridges C1 the fluting C and the liner B are joined by pressing the liner B, by a press roll F or a press felt onto the fluting C while still on the corrugating roll E2. The single faced corrugated fibreboard A is then fed to a bridge for intermediate storage before further processing in the following process sections.
Fig. 2 shows how the production from two single face units can be combined in a double backer to produce single wall or double wall corrugated board. The double backer has two holding-back rolls F, doctor rolls, instead of corrugating rolls. A corrugated board machine capable of producing triple wall corrugated fibreboard, although not illustrated, is equipped with three single face units.
In Fig. 3a an apparatus according to the invention is shown. Below is a list of the components comprised in the apparatus and shown in Fig. 3a:
1. A device for registration of images
2. An optical means
3. A device for projecting a defined reference pattern onto fluting and glue
4. A light source
5. A first mirror arrangement
6. A second mirror arrangement
7. A housing
8. An opening
9. A substrate
9a. Substrate ridges
10. A camera interface
11. An electronic part
12. A system processing control device
13. A display unit
14. A measurement module
15. An electronic and control module
16. A first connecting means
17. A second connecting means
18. A third connecting means
To the system/apparatus/machine shown in Fig. 1 and Fig. 2, a housing 7 comprising the measurement module 14 according to the invention is attached between the gluing position and the fluting-liner joining position.
The present apparatus according to the invention, in the current implementation shown in the figures, comprises a measurement module 14 and a processing electronic and control module 15, each module comprising sub-modules, parts for different topics, parts having different functions. The modules 14 and 15 may be divided into the corresponding separate sub-modules. In another solution the processing electronic and control module 15, or parts of it, can be embedded in the measurement module 14, or vice versa. The measurement module 14 is placed at a position where the substrate 9, the corrugated paper, the wave-formed paper, the fluting, and the glue/adhesive 9.1, usually a starch based cold emulsion adhesive, are visible. The module 14 is placed after the production step where the glue 9.1 is applied and before the joining step of the liner B to the fluting C, see Fig. 1 and 2.
In each measurement position, one measurement module 14 can be placed at any fixed or manually, remotely or automatically adjustable position in the cross machine direction. Two or more measurement modules 14 can be used for measurement in several positions and can be placed at any fixed or manually, remotely or automatically adjustable positions in the cross machine direction. One or more measurement modules 14 can be placed on a traversing unit, not shown in the pictures, moving in the cross machine direction for measurement across the whole cross machine direction, or a part thereof, the traversing unit being manually, remotely or automatically controlled.
The measurement module 14 is a module/unit for registration of glue on the fluting and takes pictures, real time pictures, of the fluting. The system processing control device 12 makes measurements out of the information from the camera 1 in cooperation with the camera interface 10.
The components of the measurement module 14 are encapsulated in an environmentally protected housing 7, withstanding heat, moisture, dirt, dust, chemicals and vibration according to NEMA 4 or IP65 or a higher/other protection classification. The housing 7 has an opening, a transparent part of the housing, a window 8, made of optically transparent material, for distortion free observation and lighting of the glue and fluting. To keep the window 8 clean from dirt, moisture, glue residues etc., it is equipped with cleaning devices consisting of one or more of the following devices, not excluding others: clean air knives for blowing the dirt away, pressurized spraying of water for removing dirt and dissolving the water based glue, and wipers for mechanical cleaning in combination with water spraying. As these constructions for cleaning purposes are previously known they are not shown in the figures.
The components in the shown implementation of the measurement module 14 are:
• The device 1 for registration of images generates analog or digital signals representing the image of the fluting with glue and a possible light pattern. The camera 1 is currently an analog black and white CCD camera with a progressive scan shutter (for clear images of the moving fluting) and with wavelength sensitivity in the visible range, not excluding other wavelength sensitivity ranges. A camera of this kind, or a camera with similar properties, gives clear images of the moving fluting.
• The optical means 2, the optics, the lenses, are used for focusing, aperture and field of view adaptation. The optics 2 may be equipped with filter devices such as a polarizing filter, a band pass filter or others. The current optics 2 is without filtering devices.
• The device 3 for projecting a defined reference pattern onto fluting and glue, for example a pattern projector being a diode laser with line generating optics. The pattern projector 3 is used for projecting a defined reference pattern onto the fluting 9 and glue 9.1, the pattern being any pattern such that the position of the fluting ridges 9a in an image can be estimated, see Fig. 3b. The pattern is a line projected in the machine direction across the fluting ridges 9a, see Fig. 6a, the pattern projector being of any type generating a pattern of light of any wavelength, preferably within the wavelength sensitivity range of the camera 1. The current line projector is a continuous wave red diode laser with a line generating optical head.
• The light source 4 for illumination of the field of view and for enhancement of the contrast between glue 9.1 and fluting 9 can be of any type. Some possible light sources are named hereafter, but the invention is not limited to those listed: continuous or strobed, single or multiple sources, direct halogen, indirect halogen possibly directed by optical fibres, xenon flash lights, light emitting diodes, or a strobed LED array, the light source having any wavelength range within the wavelength sensitivity range of the camera. The currently implemented light source is an array of multiple light emitting diodes with a wavelength or wavelength range between 600 nm and 800 nm.
• The first mirror arrangement 5 is used for the camera 1. The first mirror 5 is used for an angled mounting of the camera 1. Fig. 4a shows a possible non-angled, non-mirrored mounting of the camera 1. For low distortion and high reflectivity the mirror 5 can be of a surface reflection type.
• The second mirror arrangement 6 is used for the pattern projector 3. The second mirror 6 is used for an angled mounting of the pattern projector 3, different from the projection angle y, see Fig. 6b. In Fig. 4a an alternative possible mounting of the pattern projector without a mirror is shown. For low distortion and high reflectivity the mirror is preferably of a surface reflection type. The measurement module 14 is connected to the processing electronic and control module 15 by connecting means 16-18. The connecting means 16-18 are electrical wires, and/or optical fibres where suitable, bundled in one or more cables. The solution, shown in Fig. 3, consists of three cables:
• The first connecting means 16 makes it possible to transport the picture information and/or power and/or signals for control and synchronization of the camera 1. The first connecting means 16 is connecting the camera 1 to the camera interface 10.
• The second connecting means 17 is for supplying power and control signals to the pattern projector 3.
• The third connecting means 18 is for supplying power and control signals to the light source 4.
The processing electronic and control module 15 is a module/unit comprising the camera interface 10, the electronic part 11 and the system processing control device 12:
• The camera interface 10 is used to transform the picture into manageable information. The camera interface 10 converts the image signal, digital or analog, into a digital representation suitable for the system processing control device 12, generates trigger and synchronization signals for the camera 1 and for the power supply and support electronics module 11, and communicates configuration signals and/or data to the camera 1; the communication of configuration data can also be done directly from a computer. The camera interface 10 also supplies the camera 1 with operational power, trigger and synchronisation signals, and configuration communication.
• The electronic part 11, the power supply and support electronics unit, comprises electronic control devices for the line projector 3 and the light source 4, and a power supply, not shown in the picture per se. The power supply and support electronics unit 11 supplies operational power to the camera 1, the pattern projector 3 and the light source 4 and controls the pattern projector 3 and/or light source 4 by generating trigger and/or other synchronisation signals and/or by controlling the supply power, the control actions and behaviour being generated by trigger and/or other synchronization signals from the camera interface 10 and/or by trigger and/or other synchronization signals and configuration communication from the computer 12 and/or by manual or remote actions. The solution is to control the pattern projector with a trigger signal and the light source by controlling the supply power, the action being triggered from the camera interface and the behaviour being controlled by manual setting. The power supply for the camera 1 can also be a stand alone unit.
• The system processing control device 12 comprises means for communication and image processing, for example a computer, not shown in the picture per se. The computer can be used for execution of the algorithms for glue detection, glue measurement, fluting ridge positioning and image enhancement for visualisation. The computer comprised in the system control device 12 controls the camera interface 10, the power supply and support electronics unit 11, and the control of and output to the display 13. The computer is used for communication and integration with the human system operator, machine actuators, control systems, production systems, quality systems and other glue measurement systems, not excluding other systems not mentioned. The computer is also used for calculation and execution of control actions based on the measurement of glue width and/or position and the desired value of the glue width and/or position. The computer comprised in the system control device 12 can be a computer of any type, stand alone, embedded in any of the measurement system modules or embedded in the camera. The computer is capable of supporting one or more measurement modules 14. The computer can have devices for storage of information, both measurement data and images, and devices for user interaction such as a mouse, keyboard and display.
• One solution is a stand alone computer with the camera interface 10 as a plug-in module and with an external power supply and support electronics module 11 and an external display unit 13, a preferred solution being one physical unit with a computer 12, a camera interface 10, a power supply and support electronics unit 11 and an integrated display unit 13.
The display unit 13 is used for viewing images of the fluting and the glue on the fluting and for viewing measurements. The display unit 13 is for viewing of raw images, images enhanced by image processing and/or information from the glue detection algorithm, measurement data such as glue width and/or position, algorithm evaluation data, and a man machine interface. The module 15 comprises the power supply, support electronics and computer, which may be incorporated in one single unit or divided into separate units corresponding to the necessary functions. The parts of module 15 can also be combined with parts of the measurement module 14.
Principles and algorithms for gluing:
The principle of gluing is the same for two and three wall corrugated board.
The invention is based on three main principles and algorithms: A) Glue detection by image processing
B) Reference measurement by light pattern
C) Windowed viewing of images.
These give improved performance when combined, but an implementation of the invention does not necessarily need all three principles to produce useful output.
The word "image" used below is defined as being the whole image from the camera 1 or parts of the image from the camera 1 , the image/part of the image being the region of interest, the ROI. The image may be colour or grey scale, the principles are the same, but the descriptions and algorithms assumes the images to be grey scale or colour images converted to grey scale. The principles and/or algorithms are exemplified only with examples that are easy to describe but can be used with more complicated methods.
A note on the algorithms: as the measurements are supposed to be carried out in real time, the execution speed is of importance. There are some powerful methods described in the literature, some not mentioned here, but they are less useful due to their computational complexity making them too slow, at least today.
The three main principles and algorithms for gluing will be described on the following pages: A) Glue detection by image processing
A.1. principle
Referring to Fig. 5a-5c, the glue is detected and measured by arranging a camera 1 with optics 2 and possibly, but not necessarily, one or more light sources 4 such that the differences visible to the camera 1 between the substrate 9 (fluting/paper) and the glue 9.1 are enhanced. The resulting image is then processed by a computer for extraction of one or more image features giving a discrimination between glue and substrate, followed by a classification and segmentation to discern the glue from the substrate.
In Fig. 5b diffuse light reflection on paper is shown. The light reaches the camera and the paper appears bright. In Fig. 5c specular reflection on glue is shown. Under ideal conditions no light will reach the camera.
Two differences between the glue 9.1 and substrate/fluting 9 are the brightness/intensity and the texture difference between areas with glue and areas with fluting:
• Brightness/intensity. Three factors affecting the brightness/intensity difference are:
Reflectivity. Normally, the water based glue has a higher reflectivity than the comparably rough surface of the substrate/fluting. Thus, if the light source 4 and camera 1 are arranged as in Fig. 5c with α < β ≪ 180°−α (≪ meaning much smaller than), the glue 9.1 will appear dark and the substrate will appear bright due to specular reflection in the glue and diffuse reflection in the substrate. If, on the other hand, β ≈ 180°−α (close to equal), the glue will appear bright because of direct reflection of light into the camera.
Absorption. The IR light absorption wavelength band of water, centred at ca. 1940 nm, is different from the IR light absorption wavelength band of the substrate/fluting, centred at ca. 2100 nm for cellulose. With a light source producing light mainly within the IR light absorption wavelength band of water the glue will appear dark, as it is water based, and the substrate/fluting will appear bright, under the condition that the camera is sensitive in the said wavelength band.
Fluorescence. If the light source produces light in the UV band the starch in the glue can cause a fluorescent effect, and as the substrate/fluting does not produce a fluorescent effect the glue will appear bright compared to the fluting, under the condition that the camera is not sensitive in the UV light band but sensitive in the light band of the generated fluorescent light.
• Texture. When the substrate is fluting of paper, the difference in visible surface texture is seen as a smooth texture with small variations in brightness/intensity in areas with glue, and a rough fibrous texture with greater variation in brightness/intensity in areas without glue, see Figure 10. Expressed in a quantitative way, areas without glue have a higher energy content in the upper part of the image frequency spectrum than areas with glue. The difference in surface texture can be enhanced by arranging a light source such that the light is directed essentially in line with the fluting, α small and β much larger than α, see Figure 5c. This light arrangement results in a high contrast lighting of the fluting.
The preferred arrangement is to use a small α and an observation angle β = 90 degrees, see Fig. 6a-6d, to enhance the texture differences between glue and fluting and to use the reflection phenomena so that the glue appears darker than the fluting.
A.2. algorithm
Referring to Fig. 8, showing a glue detection and measurement thread (flowchart), one possible sequence of image processing operations for detection and measurement of glue width and/or distribution/position is described.
• Pre-process image
Assuming the image is captured and stored in the computer's digital memory, RAM, HD, FD, CD or similar, the image is pre-processed, for correction and/or enhancement of image features such as intensity, intensity distribution, contrast, frequency content and others not mentioned, by applying one or more of the following image processing operations: adding of a singular value, multiplication by a singular value, inversion, histogram equalisation, linear filtering such as low pass, high pass and band pass, stochastic filtering such as Wiener filtering, non-linear filtering such as averaging and median, morphological operators such as opening, closing, top hat and bottom hat, and other operations not mentioned.
• Feature extraction
The next step is to extract image/texture features; these features characterize the image and/or parts of the image down to pixels. The chosen feature/features reflect the difference between areas with glue and those without. When making the classification, one feature can be used or several can be combined by addition and/or multiplication and/or other mathematical operations. In the following text the image is composed as in fig 5-7 with the fluting ridges 9a, and thus glue strings, parallel with the rows in the image, not excluding other image compositions.
To exemplify, a number of basic features are mentioned below, not excluding other well documented or specially designed texture features.
o Intensity. The intensity in a pixel, or the average or median intensity in a pixel and its neighbourhood, reflects the intensity difference between the glue and the substrate; see the above section concerning brightness/intensity.
o Variance. The variance of the intensity in a pixel and its neighbourhood reflects the difference in visible surface texture between areas with glue and areas without glue; see the above section concerning visible surface texture. A low variance indicates glue.
o Fourier transform. The energy spectrum of a Fourier transform, 1D or 2D depending on the nature of the pixel neighbourhood, of a pixel and its neighbourhood reflects the difference in visible surface texture between areas with glue and areas without glue; see the above section concerning visible surface texture. A low energy content in the upper frequency spectrum indicates glue.
o Row difference, ad hoc. The sum of the squared column wise difference in intensity between two row segments covering the same interval of columns, see Fig. 11 for definitions, reflects the difference in visible surface texture between areas with glue and areas without glue; see the above section concerning texture. A low value indicates glue. A small code sketch of this feature is given after this list.
The feature can be written as

A_j = Σ_{i=1}^{N} (a_{i,j} − a_{i,j+d})²

where
a_{i,j}: intensity value of the pixel in column i and row j
i: column index
j: row index
d: the distance to the row used for the differentiation
N: number of columns in the interval of columns
This feature is ad hoc and specially developed for this invention.
o Grey level co-occurrence matrix. The grey level co-occurrence matrix of an image is one probabilistic description of the image. The grey level co-occurrence matrix is not in itself used as a feature to discern the glue from the fluting; instead, features derived from the matrix are used. Several features can be derived from the matrix, and the most useful in this application is the inverse difference moment. This is not a basic and easy to describe feature and is mentioned as it is one of the preferred methods.
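As a minimal illustration (in Python with NumPy; the function name and the synthetic test image are not from the patent), the ad hoc inter row difference feature can be computed as follows, with low values indicating glue:

```python
import numpy as np

def inter_row_difference(image, d):
    """Ad hoc row-difference feature: for each row j, the sum over a column interval
    of the squared intensity difference between row j and row j + d.
    A low value indicates a smooth (glued) area."""
    img = image.astype(float)
    diff = img[:-d, :] - img[d:, :]        # column-wise differences between rows j and j + d
    return np.sum(diff ** 2, axis=1)       # one feature value A_j per row j

# Synthetic example: rough "paper" texture with a smooth "glue" band in rows 40-49
rng = np.random.default_rng(0)
img = rng.normal(128.0, 20.0, size=(100, 64))
img[40:50, :] = 60.0
A = inter_row_difference(img, d=2)
print(int(A.argmin()))                     # a row index inside the smooth band
```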
• Classification
The feature extraction is followed by a classification to discern the areas with glue from those without, and to find the borders of the areas with glue. If an estimate of the fluting ridge position is available it can be used by any method to increase the accuracy and/or robustness of the classification. The position can be used as an initial value for the search of glue or to eliminate false detections. The classification can be done by one method or as a combination of several methods to increase the accuracy and/or robustness of the classification. Classification methods are well described in the literature and the ones described below are for exemplification.
One method to discern the glue from the substrate is to threshold the feature values, with a value above or below the threshold value, depending on the feature (below for variance and row difference), indicating glue, and the opposite indicating substrate. The threshold value can be a manually set fixed value, a value based on image statistics such as the average value of the feature, or a value calculated by statistical classification for maximal separation between feature values indicating glue and those indicating substrate. If an estimate of the fluting ridge position is available it can be used to limit the observation area and to check the likelihood of the detected area. The threshold method can be complemented by any other image processing operations for a possibly increased accuracy and/or robustness. Another method to discern the glue from the substrate is to detect the edges of the glue, in the feature values, by a gradient method or other suitable method. The sign of the gradient and the search direction indicate whether the edge is a transition from substrate to glue or from glue to substrate, making it possible to isolate and classify areas of glue. If an estimate of the fluting ridge position is available it can be used as a starting point for the search and to limit the search length. The described method can be complemented by any other image processing operations for a possibly increased accuracy and/or robustness.
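As a sketch of the threshold-based classification just described (using the mean feature value as an automatic threshold is an assumption here; the patent also allows a manually set or statistically optimised threshold):

```python
import numpy as np

def classify_glue_rows(feature, threshold=None):
    """Classify each row as glue (True) or substrate (False) by thresholding a
    feature such as variance or the inter row difference, where low values indicate
    glue. If no threshold is given, the mean feature value is used."""
    if threshold is None:
        threshold = feature.mean()
    return feature < threshold

def glue_areas(is_glue):
    """Return (first_row, last_row) index pairs of contiguous glue areas."""
    padded = np.concatenate(([False], is_glue, [False]))
    changes = np.flatnonzero(np.diff(padded.astype(int)))
    return list(zip(changes[::2], changes[1::2] - 1))
```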
Yet another method to discern the glue from the substrate is to fit, in a best fit sense, a mathematical model of glue and substrate to the feature values, the model capturing the characteristic differences in features between glue and substrate.
One possible model, not excluding others, is a pulse model reflecting the pulse like feature changes where the glue is placed, see fig. 13. The feature in this case, for example, is the inter row difference, see features above. One representation of the pulse is:
f_{NF,PF,H1,H2,L}(k) = H1 for 1 ≤ k ≤ NF; L for NF < k < PF; H2 for PF ≤ k ≤ N
where k = 1...N are the N data points, NF is the position of the negative flank, PF the position of the positive flank, H1 the function value before the negative flank, H2 the function value after the positive flank, and L the function value during the pulse.
The model parameters to be estimated in this model are NF, PF, H1, H2 and L, where NF and PF contain the information of where the glue string borders are located.
For any model and any feature the model parameters are estimated by fitting the model, in a best fit sense, to the feature data by minimizing, with regard to the model parameters, the discrepancies between the model output and the feature data. One method to estimate the model parameters, of any model, is to minimize the loss function, Q, below with a least squares method, linear or non-linear. A non-linear least squares method is suitable for the above described model as it contains discontinuities.
Q = Σ_{k=1}^{N} (f_θ(k) − A_k)²
given N data points k = 1...N, θ being the set of model parameters to be estimated, [NF, PF, H1, H2, L] in this case, f_θ being the function f_{NF,PF,H1,H2,L}, and A_k being the inter row difference for row k.
The data points make a vector of column wise features. All parameters can be estimated, but to speed up the calculations one or more parameters can be fixed and calculated before the parameter estimation. One approach is to estimate only NF and PF and calculate H1, H2 and L based on data statistics such as mean, standard deviation and min/max. Setting H1 = H2 is also a way to reduce the computational needs. If the fluting ridge position measurement is available it can be used as an initial value for NF and PF; it can also be used to limit NF and PF to reasonable values.
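A minimal sketch of such a fit, assuming the simplification mentioned above (only NF and PF are searched exhaustively while H1, L and H2 are taken from segment statistics; the patent itself suggests a non-linear least squares method, so this is an illustrative alternative):

```python
import numpy as np

def fit_pulse(A):
    """Fit the pulse model to the inter row difference feature A (one value per row)
    by exhaustive search over the flank positions NF < PF. H1, L and H2 are taken as
    the mean of the feature before, during and after the pulse."""
    N = len(A)
    best_Q, best_params = np.inf, None
    for NF in range(1, N - 1):
        for PF in range(NF + 1, N):
            H1, L, H2 = A[:NF].mean(), A[NF:PF].mean(), A[PF:].mean()
            model = np.concatenate((np.full(NF, H1), np.full(PF - NF, L), np.full(N - PF, H2)))
            Q = np.sum((model - A) ** 2)            # loss function Q
            if Q < best_Q:
                best_Q, best_params = Q, (NF, PF, H1, H2, L)
    return best_params                               # NF and PF give the glue string borders
```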
• Quantifying
For clarity, the classification and quantification are separated in this algorithm description; they are, however, closely coupled and possibly integrated in an implementation. In the quantification step the border/outline of the glue areas is identified, impossible values and/or outliers are corrected, the widths and centres, see Fig. 9, of the glue areas are calculated, and the relevant measurements are translated from pixels to a useful unit, for example millimetres.
See Figures 6, 7 and 9 for illustrations, and assume the glue string has a horizontal orientation. The glue string width within each column, or column interval, is calculated by subtracting the upper glue border value from the lower, where both borders are associated with the same glue area. The glue string centre within each column, or interval of columns, is the middle point between the upper and lower glue borders, where both borders are associated with the same glue area, see Fig. 9.
The glue width values are averaged and translated to a useful unit, such as millimetres or a measure relative to the flute length, by a scaling factor. The scaling factor can be fixed and manually set. The scaling factor can also be dynamic and calculated by finding the fluting ridge positions in the picture, in pixels, and knowing the distance between them, in millimetres, thus making it possible to calculate a scaling factor. This demands a reference measurement by light pattern, see below. If a measure relative to the flute length is preferred, the glue width, in pixels, is divided by the distance between the fluting ridges 9a, in pixels. The glue width measurement can operate without the reference measurement by light pattern if the scaling factor is manually set.
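A small sketch of this quantification step (the function and argument names are illustrative; the borders are assumed to be given per column in pixels, with the image row index increasing downwards):

```python
import numpy as np

def quantify_glue(upper_border_px, lower_border_px, ridge_spacing_px, flute_length_mm):
    """Per-column glue width and centre in pixels, averaged and scaled to millimetres
    with a dynamic scaling factor derived from the measured ridge spacing (pixels)
    and the known flute length (millimetres)."""
    width_px = lower_border_px - upper_border_px             # glue width per column
    centre_px = (upper_border_px + lower_border_px) / 2.0    # glue centre per column
    mm_per_px = flute_length_mm / ridge_spacing_px           # dynamic scaling factor
    return width_px.mean() * mm_per_px, centre_px.mean() * mm_per_px
```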
The common measure, and calculation, of the amount of glue in g/m2 cannot characterize fast variations, as it is based on slowly varying measurements of the glue consumption in kg/hour, the machine speed in m/min and the machine width in m. A measure of the amount of glue in g/m2 catching both fast and slow variations can be calculated according to the scheme presented in Fig. 12. The slow variations are calculated according to common practice, see above and Fig. 12, and the fast variations in g/m2 are added by scaling the fast variations in glue width by a dynamic, slowly varying, scaling factor. The scaling factor is calculated based on the slow variations in both glue width and glue consumption. Both measurements, glue consumption and glue width, are filtered with averaging or low pass filters with the same time constants; it is important that the time constants are the same in order to use the values in the same calculation. The time constant of the filters is longer than the longest time constant of the actual on-line measurements, usually the time constant of the glue consumption in kg/hour. The filtered values of the glue width are subtracted from the original measurement to isolate the fast variations. The fast variations in glue width are scaled to g/m2 by the scaling factor and added to the slow, filtered, variation of glue calculated according to common practice, thus resulting in a signal containing both fast and slow variations in the amount of applied glue in g/m2. Other methods to calculate the amount of glue (glue consumption) are possible; this is an easy to implement method.
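A sketch of this combination scheme (the simple exponential low-pass filter, its smoothing parameter and the way the scale factor is formed from the two filtered signals are assumptions made for illustration; both input signals are assumed to be sampled at the same rate):

```python
import numpy as np

def combine_fast_and_slow(glue_amount_slow, glue_width, alpha=0.01):
    """Combine the slowly varying g/m2 measure with the fast variations in measured
    glue width (cf. Fig. 12). Both signals are low-pass filtered with the same time
    constant; the ratio of the filtered signals gives a slowly varying scale factor,
    and the residual (fast) width variations are scaled and added back."""
    def lowpass(x, a):
        y, acc = np.empty(len(x)), float(x[0])
        for i, v in enumerate(x):
            acc += a * (v - acc)
            y[i] = acc
        return y
    slow_amount = lowpass(np.asarray(glue_amount_slow, float), alpha)
    slow_width = lowpass(np.asarray(glue_width, float), alpha)
    scale = slow_amount / slow_width                 # g/m2 per unit of glue width
    fast_width = np.asarray(glue_width, float) - slow_width
    return slow_amount + scale * fast_width          # g/m2 with both fast and slow content
```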
The positioning of the glue on the ridges 9a needs information from the reference measurement by light pattern to be able to operate. The position of the glue string is calculated as the difference between the glue centre position and the fluting ridge position, see fig 9. The position value is translated to a useful unit in the same way as the glue width value.
• Communication
When all calculations are done the measurements can be transferred to other systems, such as the corrugator machine control system, other glue measurement systems, the plant production system, the plant quality control system, etc. The measurements can also be presented on a display, with or without visualization of the glue.
B) Fluting ridge positioning and scaling
B.1. principle
To find and position the fluting ridges 9a in the image, for glue positioning and/or windowed viewing, a well defined light pattern, a line, multiple lines, any curved line, dots, etc., is directed onto the fluting in a certain direction and observed from another direction, such that the height variations in the fluting will cause distortions in the light projection. The distortion is periodic with a period equal to the flute length, and the size of the distortion depends on the flute height, the type of light pattern and the pattern projector, camera and fluting setup parameters, the geometrical relations between them, see Figure 6. The light pattern is a light pattern giving enough and adequate information, at least fulfilling the Nyquist sampling theorem. The fluting ridges 9a are then positioned by finding the corresponding distortion of the projected light in the image.
With information on the flute length, flute height, type of light pattern and the pattern projector, camera and fluting setup parameters, the geometrical relations between them, an analytic parametric mathematical model, or an empirical, possibly mathematical and/or parametric, model of the observed light pattern can be constructed. The fluting ridges 9a in the image can then be found by identifying maximum or minimum values, depending on the setup, in the distortions. Another method to find the ridges 9a is to extract the projected light pattern from the image and fit, in a best fit sense, the parametric model to the extracted data, thus identifying the ridge positions in the image from the model parameters. The latter method is less sensitive to disturbances.
When the fluting ridges 9a are positioned in the image, in pixels, a scaling factor for the image can be calculated (see "Quantifying"). This method does not demand a complex mathematical model of the setup nor a rigorous calibration of the setup. Another, much more complicated, method is to calculate the scaling factor based on triangulation. Triangulation with structured light and a camera is a method demanding a rigorous calibration and a complex mathematical model of the setup.
The camera 1, pattern projector 3 and fluting 9 can be set up according to Figure 6a-c, with the projected pattern being a line perpendicular to the fluting pipes.
B.2. algorithm
Fig. 8 shows a glue detection and measurement thread, flowchart, one possible sequence of image processing operations and other operations for positioning of the fluting ridges in the image.
• Pre-process image
Assuming the image is captured and stored in the computer's digital memory, RAM, HD, FD, CD, the image is pre-processed, for correction and/or enhancement of the projected light pattern in the image, by applying one or more of the following image processing operations: adding of a singular value, multiplication by a singular value, inversion, histogram equalisation, linear filtering such as low pass, high pass and band pass, stochastic filtering such as Wiener filtering, non-linear filtering such as averaging and median, morphological operators such as opening, closing, top hat and bottom hat, and other operations not mentioned.
• Extract reference pattern
The projected light pattern is assumed to be brighter than the rest of the image. The projected light pattern can be extracted by thresholding by an intensity value, manually set or otherwise automatically selected, where image intensity values above the threshold value are assumed to be part of the projected light pattern. Another method to extract the projected light pattern is to find the edges, with a gradient method or other method, of the light pattern and isolate it from the rest of the image. The methods for projected light pattern extraction can be complemented by any other suitable image processing operations for modification of the extracted data; such operations can be morphological filling and/or thinning.
• Estimate ridge positions
If the light pattern is chosen such that the waveform of the fluting is uniquely distinguishable in the extracted data, and such that each ridge of the fluting represents the only maximum (or minimum) in the extracted data within a neighbourhood equal to the flute length, the fluting ridge positions can be found by searching for all local maxima (or minima) in the extracted data. Figure 6d illustrates such a light pattern in the form of a line projected onto the fluting, resulting in a sinusoidal projected light pattern. With higher values to the left in the figure, it is seen that the maxima in the figure represent fluting ridges. The search can be done by a sliding window comparison search, by a gradient method, or by any other method not mentioned. To reduce the sensitivity to disturbances the extracted data can be smoothed by a suitable filter before the search for maxima (minima). The valleys of the fluting can be found by the same methodology. This method is based on a non-parametric empirical model, not demanding any complex mathematical model or rigorous calibration. For positioning of the fluting ridges 9a in the image no further information is needed; if a scaling factor is to be calculated the flute length in the preferred unit is needed.
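As a sketch of this non-parametric approach (the brightest-pixel-per-row line extraction, the smoothing kernel and the SciPy peak search are implementation assumptions; whether ridges correspond to maxima or minima of the line position depends on the setup):

```python
import numpy as np
from scipy.signal import find_peaks

def ridge_positions(image, intensity_threshold, min_period_px):
    """Extract the projected laser line as the brightest pixel per image row (above a
    threshold) and locate fluting ridges as local maxima of its column position,
    separated by at least roughly one flute length in pixels."""
    rows = np.arange(image.shape[0])
    cols = np.argmax(image, axis=1)                               # line column for each row
    line = cols.astype(float)
    line[image[rows, cols] < intensity_threshold] = np.nan        # drop rows without the line
    line = np.nan_to_num(line, nan=np.nanmean(line))
    smoothed = np.convolve(line, np.ones(5) / 5.0, mode="same")   # light smoothing
    peaks, _ = find_peaks(smoothed, distance=min_period_px)       # ridge rows (maxima)
    return peaks
```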
An analytic parametric mathematical model of the projected light in the image can be constructed if the flute length, flute height, flute form, type of light pattern and the pattern projector, camera and fluting setup parameters, the geometrical relations between them, are well defined, such a definition demanding accurate measurements and/or information on the setup and fluting characteristics and/or a rigorous calibration.
A more practical approach is to use an empirical parametric mathematical model catching the base characteristics of the projected light in the image, not demanding complete information on the fluting characteristics and setup. If the light pattern is a line aligned perpendicular to the fluting ridges, with a setup as shown in Figure 6, the resulting projected light in the image will have a sinusoidal characteristic, see Fig. 6d, with an offset in the image (sideways position) and possibly a tilt in the image. The sideways position varies with a varying distance between the fluting and the light projector and camera setup, and the tilt angle depends mainly on the alignment of the light projector, camera and fluting. With a setup and image orientation as in Figure 6, others being possible, and with c denoting column index and r denoting row index such that the extracted data is a set of (c, r) pairs, one possible empirical parametric mathematical model is:
c = f_{k,m,a,p,Δr}(r) = a · sin(2π(r + Δr)/p) + k·r + m
The model can be described as a sine wave having amplitude a, period p and a phase shift equal to Δr, biased with the straight line equation where k is the slope of the line and m the constant bias. The biasing with the equation for a straight line is suitable when the fluting is in a flat plane, as it is in the double backer; if the fluting is on the corrugating roll, or another roll, when measured, another biasing function might be, but is not necessarily, more suitable. Such a biasing function can be an elliptical curve or another curve matching the curvature of the roll in the projected light pattern in the image.
The model parameters to be estimated in this model are k, m, a, p and Δr, where the period and phase shift contain the information about where the fluting ridges are positioned in the image: the phase shift gives the position of the ridges in the image and the period gives the distance between the ridges.
For any model and any light pattern, the model parameters are estimated by fitting the model, in a best fit sense, to the extracted data by minimizing, with regard to the model parameters, the discrepancies between the model output and the extracted data.
One method to estimate the model parameters, for any model, is to minimize the loss function Q below with a least squares method, linear or non-linear. A non-linear least squares method is suitable for the model described above, as it contains a sinusoidal function.
Q = ∑ (f_θ(r_i) − c_i)²

given N data points (c_1, r_1) ... (c_N, r_N), θ being the set of model parameters, [k, m, a, p, Δr] in this case, to be estimated, and f_θ being the function to be fitted to the data, in this case f_{k,m,a,p,Δr}.
The data points are the extracted projected light pattern in the image. All parameters can be estimated, but if a parameter does not vary it can be fixed (held constant) while the rest of the parameters are estimated. This reduces the computational load and the time needed for estimation. Slowly varying parameters do not need to be estimated in every image.
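The following sketch shows one way such a fit could be set up, assuming the sine-plus-line model above and using a Levenberg-Marquardt type solver from SciPy; the parametrization and the handling of the initial guess are illustrative assumptions, not the patented implementation.

```python
import numpy as np
from scipy.optimize import least_squares

def model(theta, r):
    """Empirical parametric model: sine wave biased with a straight line."""
    k, m, a, p, dr = theta
    return a * np.sin(2 * np.pi * (r - dr) / p) + k * r + m

def fit_model(r, c, theta0):
    """Fit the model to the extracted (c, r) data in a least squares sense.

    r, c   : row and column indices of the extracted light pattern.
    theta0 : initial guess for [k, m, a, p, dr].
    method='lm' selects a Levenberg-Marquardt type algorithm; parameters
    that do not vary can instead be held constant outside the fit.
    """
    r = np.asarray(r, dtype=float)
    c = np.asarray(c, dtype=float)

    def residuals(theta):
        return model(theta, r) - c   # discrepancies between model and data

    result = least_squares(residuals, theta0, method='lm')
    return result.x   # estimated [k, m, a, p, dr]
```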
One possible implementation:
Pre-Process
Column- and row-wise pole filters with user-selectable parameters and/or an (m x n) averaging filter with user-selectable parameters.
Extract reference pattern
Threshold, with the level being manually set.
Estimate ridge positions
Accomplished with an empirical parametric mathematical model, the one described earlier, fitted to the extracted projected light pattern with a non-linear least squares estimation method, a modified Levenberg-Marquardt algorithm.

C) Windowed viewing
C.1. principle
Referring to figure 7, a camera that is not synchronized to the fluting ridge positions (when the fluting is on the corrugating roll the camera can be synchronized to the roll) will produce images in which the fluting ridge positions are in different, undetermined positions, making observation of the glue difficult (or impossible). The suggested solution to this problem is to view only a part of the image in a viewing window (area) which is offset in the image such that the position(s) of the fluting ridge(s) is always the same in the viewing window, see fig 7b. To enhance the visibility/appearance of the glue the image is pre-processed by any suitable image processing operations, for example linear filtering, contrast enhancement, inversion and histogram equalization, etc.
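Purely as an illustration, cutting the viewing window out of each image at a calculated offset can be sketched as below; the function and argument names are hypothetical, and the offset itself is obtained by one of the two methods described next.

```python
def view_window(image, offset, window_height, window_width=None):
    """Return the part of the image shown in the virtual viewing window.

    offset        : vertical offset (in rows) chosen so that the fluting
                    ridges land at the same position in every window.
    window_height : vertical size of the window in rows.
    """
    width = window_width if window_width is not None else image.shape[1]
    return image[offset:offset + window_height, :width]
```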
One way to obtain this offset is to align all subsequent images to a reference image by finding the maximal correlation, in the vertical direction, of image features in a column-wise manner, either column by column or for intervals of columns. If available, the glue detection and measurement information can be added to the image.
Another method to accomplish windowed viewing is to measure the position of the fluting ridges in the image, preferably by one of the methods described in this patent, and to calculate an offset such that the fluting ridges are always in the same position in the viewing window.
One possible implementation:
Both ways of accomplishing windowed viewing are preferred, the one using the measured fluting position being the most preferred.
C.2. algorithm
Referring to fig 8, the flowchart with the glue detection and measurement thread and the glue visualization thread, one possible sequence of image processing operations for visual enhancement of the glue and for windowed viewing is described.
• Pre-process image
Assuming the image is captured and stored in the computer's digital memory (RAM, HD, FD, CD), the image is pre-processed, for enhancement of glue visibility/appearance, by applying one or more of the following image processing operations: addition of a single value, multiplication by a single value, inversion, histogram equalisation, linear filtering such as low pass, high pass and band pass, stochastic filtering such as Wiener filtering, non-linear filtering such as averaging and median, morphological operators such as opening, closing, top hat and bottom hat, and other operations not mentioned.
• Enhance image with quantified information
If the glue is detected and quantified, see Fig. 8, the glue detection and measurement thread, this information can be used to further enhance the visibility of the glue in the images. One method is to mark the borders of the glue areas in the image for viewing. Another method is to digitally darken or brighten the areas of glue in the image.
• Display part of image in view window
To view one or more fluting ridges, depending on the field of view of the camera, such that their positions appear synchronized with the camera shutter and/or light source, only part of the image is viewed in a virtual viewing window, see fig 7b. The image is vertically offset with regard to the viewing window such that the fluting ridge positions, for all images, will be the same in the viewing window. The maximum practical vertical size of the viewing window, chosen so that the whole viewing window always fits in the images independent of the fluting positions, is such that it can show one complete flute length less than the number of complete flute lengths in the image. For example, if the image shows (the field of view is) 3.5 times the flute length, the viewing window can always show 2 complete flute lengths. The width of the viewing window is the same as the image or smaller.
If the fluting ridge positions are known, see "fluting ridge positioning and scaling", and assuming that the viewing window is a multiple of the flute length in vertical size and that the fluting ridges are positioned in the centre of each flute length, the offset can be calculated as the position of the first fluting ridge in the image which is at least 1/2 flute length below the top of the image, minus the uppermost desired fluting ridge position in the viewing window; Offset = P_i − P_w in figure 7b.
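A minimal sketch of this offset calculation, assuming the ridge row positions have already been measured and that the desired uppermost ridge row in the window (P_w) is given; the function and variable names are hypothetical.

```python
def viewing_window_offset(ridge_rows, flute_length, desired_ridge_row):
    """Offset = P_i - P_w: first ridge at least half a flute length below the
    top of the image, minus the desired ridge position in the viewing window.

    ridge_rows        : measured ridge row positions in the current image.
    flute_length      : flute length expressed in image rows.
    desired_ridge_row : uppermost desired ridge row in the window (P_w).
    """
    p_i = min(r for r in ridge_rows if r >= flute_length / 2)
    return p_i - desired_ridge_row
```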
If the fluting ridge positions are not known, the windowed viewing can be accomplished by correlating the images, in the vertical direction, to a reference image, where the maximum correlation gives the offset of the current image compared to the reference image. The desired fluting position in the viewing window is set by offsetting the viewing window in the reference image.
One way to do the correlation is by 1D column-wise correlation of the image pixel intensities, either by correlating the corresponding columns in the reference image and in the current image one by one, or by correlating the row averages of an interval of columns. The maximum correlation for the whole image is then found by combining the correlations from each column or interval of columns, for example by averaging the offset found for each column or interval of columns.
Another way is to correlate, in 1D, the corresponding row pixel intensity variance, row difference (see feature extraction above) or other row-wise feature of an interval of columns. The maximum correlation for the whole image is then found by combining the correlations from each interval of columns, for example by averaging the offset found for each interval of columns.
The maximum correlation can also be found by 2D correlation, allowing only a vertical offset.
The offset for the viewing window in the current image is calculated by subtracting the offset of the current image relative to the reference from the viewing window offset in the reference image.
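The correlation-based variant could look roughly like the sketch below, which correlates row-average intensity profiles over a limited range of vertical shifts; the profile choice, the shift range and the normalisation are illustrative assumptions (row variance or row difference profiles could be used the same way).

```python
import numpy as np

def vertical_offset(current, reference, max_shift):
    """Vertical offset of the current image relative to the reference image,
    found as the shift giving the maximum 1D correlation of row-wise mean
    intensities (one of several possible row features)."""
    cur = current.mean(axis=1)
    ref = reference.mean(axis=1)
    cur = cur - cur.mean()
    ref = ref - ref.mean()
    best_shift, best_score = 0, -np.inf
    for s in range(-max_shift, max_shift + 1):
        if s >= 0:
            a, b = cur[s:], ref[:len(ref) - s]
        else:
            a, b = cur[:s], ref[-s:]
        score = float(np.dot(a, b)) / len(a)   # normalise by overlap length
        if score > best_score:
            best_shift, best_score = s, score
    return best_shift
```

The returned shift is then combined with the viewing window offset in the reference image, as described above.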
One possible implementation:
Pre-Process
Column and row wise pole filters with user selectable parameters, and/or a (m x n) averaging filter with user selectable parameters, and/or user selectable inversion and/or histogram equalization.
Enhance image
Marking of the glue borders.
Display part of image in view window
By correlation against a reference image, according to the description, with the correlated feature being the pixel intensity and other parameters being user selectable, or by using the measurement of the fluting ridge position.

Use of the invention
The invention is to be used for presentation of measured data as figures, numerals or trends, as clean information or as a basis for handling or controlling work. It is possible to transfer measured information to other systems such as machine systems, production systems and quality systems, etc.
The amount of adhesive/glue placed on the substrate/fluting can be manually or automatically controlled. This can be done by controlling the distance between the glue roll C1 and the corrugating roll E2 for a single-faced board, and the distance between the glue roll C1 and the holding-back roll (the doctor roll, the rider roll) for a double backer.
The adhesive/glue placement can be manually or automatically controlled by controlling the speed difference between the glue roll and the corrugating roll for a single-faced board, and the speed difference between the glue roll and the fluting for a double backer.
The adhesive/glue distribution can be manually or automatically controlled by angular adjustment of the glue roll C1, by adjusting the distance between the glue roll C1 and the corrugating roll E2 independently at each side, based on measurements of the glue width at several points in the CD obtained by a traversing measurement or by fixed/adjustable measurements in several positions.
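As a purely illustrative sketch of how the measurements could close an automatic control loop of the kind described above, a proportional adjustment of the glue roll gap might look as follows; the function name, the gain, the gap limits and the sign convention are all hypothetical and depend on the actual machine.

```python
def adjust_glue_roll_gap(measured_width, target_width, current_gap,
                         gain=0.01, gap_limits=(0.05, 0.50)):
    """Proportional correction of the glue roll gap from the measured glue width.

    measured_width, target_width : glue line width (e.g. in mm) from the apparatus.
    current_gap                  : present roll gap (hypothetical actuator state).
    The sign of the gain depends on the machine geometry, i.e. on whether a
    larger gap gives more or less glue on the substrate.
    """
    error = target_width - measured_width
    new_gap = current_gap + gain * error
    low, high = gap_limits
    return min(max(new_gap, low), high)   # stay within the mechanical limits
```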
Automatic or manual control/optimization of the machine speed, based on the measurements and an optimization criterion or the limitations of the machine capacity, is possible.
In the description above a corrugated substrate is mentioned. As the invention is based on taking pictures of the substrate and the glue on the substrate, the apparatus and method work just as well if the substrate is not corrugated but has a planar structure. It is possible to take pictures of glue on any kind of substrate.

Claims
1. An apparatus for detecting and measuring the width and position of adhesives (D3, 9.1) applied to a substrate (C, 9) in at least one line c h a r a c t e r i s e d i n a device (1) for registration of pictures of the substrate (C, 9) and the adhesive (D3, 9.1) on the substrate (C, 9) and that the picture registration device (1) is cooperating with devices (10, 12) for transforming/processing the pictures into manageable information by making measurements out of the information from the pictures.
2. An apparatus according to claim 1 where the picture registration device (1) is for registration of real time pictures.
3. An apparatus according to claim 1 or 2 wherein the picture registration device (1) generates analog or digital signals representing the image of the adhesives (D3, 9.1) applied to the substrate (C, 9).
4. An apparatus according to any one of claims 1-3 comprising a first mirror arrangement (5), used for the picture registration device (1), for allowing an angled mounting of the registration device (1) in relation to the substrate (C, 9).
5. An apparatus according to any one of claims 1-4 comprising optical means (2), used for the registration device (1), for focusing, aperture and field of view adaptation.
6. An apparatus according to any one of claims 1-5 comprising a device (3) for projecting a defined reference light pattern onto the adhesives (D3, 9.1) and the substrate (C, 9).
7. An apparatus according to claim 6 wherein the device (3) comprises a diode laser with line generating optics.
8. An apparatus according to claim 6 or 7 comprising a second mirror arrangement (6) used for the device (3), for allowing an angled mounting of the device (3) in relation to the substrate (C, 9).
9. An apparatus according to any one of claims 1-8 comprising a light source (4) for illumination of the field of view and for enhancement of the contrasts between the adhesives (D3, 9.1) and the substrate (C, 9).
10. An apparatus according to any one of claims 1-9 comprising an environmentally protected housing (7) which encapsulates at least the picture registration device (1).
11. An apparatus according to claim 10 wherein the housing (7) has an opening (8) covered by a transparent material for distortion free observation of the adhesives (D3, 9.1) applied to the substrate (C, 9).
12. An apparatus according to any one of claims 1-11 comprising an electronic part (11) comprising a power supply and a support electronics unit with control devices.
13. An apparatus according to any one of claims 1-12 comprising a system control device (12).
14. An apparatus according to claim 13 wherein the system control device (12) comprises a computer.
15. An apparatus according to any one of claims 1-14 comprising a display unit (13) for viewing images of the substrate (C, 9), raw images, images enhanced by image processing and/or information from the adhesives (D3, 9.1) detection and the measurements.
16. A method for detecting and measuring the width and position of adhesives (D3, 9.1) applied to a substrate (C, 9) in at least one line c h a r a c t e r i s e d i n registration of pictures of the substrate (C, 9) and the adhesive on the substrate (C, 9) by using a picture registration device (1) and then transforming/processing the pictures into manageable information by using devices (10,12) for making measurements out of the information from the pictures.
17. A method according to claim 16 where a picture is processed in real time.
18. A method according to claim 16 or 17 comprising detecting one, or more, visible, difference/differences between the adhesives (D3, 9.1) and the substrate (C, 9) and by processing the pictures for extraction of one or more image features giving a discrimination between the adhesives (D3, 9.1) and the substrate (C, 9) followed by segmentation and classification to discern the adhesives (D3, 9.1) from the substrate (C, 9).
19. A method according to claim 18 where the difference is enhanced.
20. A method according to any one of claims 16-19 generating analog or digital signals representing the image of the adhesives (D3, 9.1) applied to the substrate (C, 9) by using the registration device (1).
21. A method according to any one of claims 18-20 where the discriminating difference between the adhesives (D3, 9.1) and the substrate (C, 9) is the brightness/intensity difference between areas with adhesives (D3, 9.1) and areas with substrate (C, 9).
22. A method according to any one of claims 18-20 where the discriminating difference between the adhesives (D3, 9.1) and the substrate (C, 9) is the texture difference between areas with adhesives (D3, 9.1) and areas with substrate (C, 9).
23. A method according to any one of claims 18-22 comprising measurements of the adhesives (D3, 9.1) width by using the information on the difference/differences between the adhesives (D3, 9.1) and the substrate (C, 9).
24. A method according to claim 23 comprising estimation of the amount of the adhesives (D3, 9.1) using the detected adhesive width.
25. A method according to any one of claims 16-24 where the substrate (C, 9) is corrugated, comprising finding and positioning the ridges (C1, 9a) of the substrate (C, 9) in each picture by directing a light onto the substrate (C, 9) in a certain direction, by the use of a device (3) for projecting a defined reference light pattern onto the adhesives (D3, 9.1) and the substrate (C, 9), and observing the light, the light pattern, from another direction such that the height variations in the substrate (C, 9) will cause distortions in the light projection.
26. A method according to claim 25 comprising placing the ridges (C1 , 9a) of the substrate (C, 9) in the same position when showing the pictures at a display unit (13) by finding the corresponding distortion in the projected light in the images.
27. A system for producing board comprising at least one apparatus according to claim 1 fixedly positioned after an adhesive application position (D) and/or at least one apparatus according to claim 1 moveable over the substrate (C, 9) after an adhesive application position (D).
28. An apparatus according to claim 1 and/or a method according to claim 16 for detecting and measuring the width and position of adhesives (D3, 9.1) applied to a substrate (C, 9) and/or a system for producing board according to claim 27 where the substrate (C, 9) is corrugated.
PCT/SE2004/000386 2004-03-16 2004-03-16 Apparatus, method and system for detecting the width and position of adhesives applied to a substrate WO2005087460A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/SE2004/000386 WO2005087460A1 (en) 2004-03-16 2004-03-16 Apparatus, method and system for detecting the width and position of adhesives applied to a substrate

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/SE2004/000386 WO2005087460A1 (en) 2004-03-16 2004-03-16 Apparatus, method and system for detecting the width and position of adhesives applied to a substrate

Publications (1)

Publication Number Publication Date
WO2005087460A1 true WO2005087460A1 (en) 2005-09-22

Family

ID=34975414

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/SE2004/000386 WO2005087460A1 (en) 2004-03-16 2004-03-16 Apparatus, method and system for detecting the width and position of adhesives applied to a substrate

Country Status (1)

Country Link
WO (1) WO2005087460A1 (en)

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE4305991C1 (en) * 1993-02-26 1994-06-01 Grecon Greten Gmbh & Co Kg Monitoring glue coating in prodn. of glued V-teeth joining of wood pieces - using glue coating on entire surface of teeth on one of two wooden pieces to be joined together
GB2297616A (en) * 1994-06-03 1996-08-07 Nireco Corp Apparatus for monitoring glue application pattern
NL1016532C1 (en) * 2000-11-02 2002-05-07 Arnold Wilhelm Heinrich Toonen Adhesive inspection device for gluing and folding machine, comprises intelligent camera system operated using program structure design software package
FR2817618A1 (en) * 2000-12-04 2002-06-07 Renault Method for depositing a string of glue on the surface of a component using camera and computer controlled elements and a computer learning process to ensure glue is optimally deposited in terms of dimension and position
WO2003021534A1 (en) * 2001-09-04 2003-03-13 John Wain Monitoring and rejection system and apparatus

Cited By (22)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2007036329A1 (en) * 2005-09-29 2007-04-05 Windmöller & Hölscher Kg Apparatus for measuring and/or monitoring three-dimensional expansion features of glue traces on workpieces
DE102005046660B4 (en) * 2005-09-29 2016-02-11 Windmöller & Hölscher Kg Manufacturing device and method of sacks or sack semi-finished products comprising a measurement of glue jobs
US9171454B2 (en) 2007-11-14 2015-10-27 Microsoft Technology Licensing, Llc Magic wand
US8952894B2 (en) 2008-05-12 2015-02-10 Microsoft Technology Licensing, Llc Computer vision-based multi-touch sensing using infrared lasers
US8847739B2 (en) 2008-08-04 2014-09-30 Microsoft Corporation Fusing RFID and vision for surface object tracking
CN107923734A (en) * 2015-08-31 2018-04-17 弗斯伯股份公司 The apparatus and method for being used to manufacture corrugated board with glued defect detector
US10525653B2 (en) 2015-08-31 2020-01-07 Fosber S.P.A. Plant and method for producing corrugated cardboard with gluing defect detector
WO2017036685A1 (en) 2015-08-31 2017-03-09 Fosber S.P.A. Plant and method for producing corrugated cardboard with gluing defect detector
ITUB20153309A1 (en) * 2015-08-31 2017-03-03 Fosber Spa PLANT AND METHOD FOR THE PRODUCTION OF CORRUGATED CARDBOARD WITH BONDING DEFECTS DETECTOR
JP2017133998A (en) * 2016-01-29 2017-08-03 三菱重工印刷紙工機械株式会社 Defect detection device of cardboard sheet and defect removal device of cardboard sheet, and manufacturing method of cardboard sheet
WO2017130631A1 (en) * 2016-01-29 2017-08-03 三菱重工印刷紙工機械株式会社 Corrugated board sheet defect detecting device, corrugated board sheet defect removing device and corrugated board sheet manufacturing device
JP2018192633A (en) * 2017-05-12 2018-12-06 三菱重工機械システム株式会社 Failure detection device of corrugated cardboard sheet, failure removal device of corrugated cardboard sheet, and manufacturing device of corrugated cardboard sheet
JP6273594B1 (en) * 2017-05-12 2018-02-07 三菱重工機械システム株式会社 Cardboard sheet defect detection apparatus, cardboard sheet defect removal apparatus, and corrugated sheet manufacturing apparatus
DE102017219064A1 (en) * 2017-10-25 2019-04-25 Texmag Gmbh Vertriebsgesellschaft MEASURING SYSTEM FOR WAVE PAPER MACHINE
ES2801223A1 (en) * 2019-06-27 2021-01-08 Ind Bolcar S L ARTIFICIAL VISION EQUIPMENT FOR DEFECTS CONTROL IN LINE FOR BAG FORMATION (Machine-translation by Google Translate, not legally binding)
CN110503638B (en) * 2019-08-15 2023-06-02 上海理工大学 Spiral adhesive quality online detection method
CN110503638A (en) * 2019-08-15 2019-11-26 上海理工大学 Spiral colloid amount online test method
CN111458318A (en) * 2020-05-12 2020-07-28 西安交通大学 Super-resolution imaging method and system utilizing square lattice structure light illumination
CN111458318B (en) * 2020-05-12 2021-06-22 西安交通大学 Super-resolution imaging method and system utilizing square lattice structure light illumination
CN113567433A (en) * 2021-06-09 2021-10-29 中车青岛四方机车车辆股份有限公司 Method and device for detecting adhesive joint
CN116777888A (en) * 2023-06-30 2023-09-19 广州高迪机电工程有限公司 Self-adaptive compensation correction method for adhesive width by visual detection system during angular adhesive coating
CN116777888B (en) * 2023-06-30 2024-02-06 广州高迪机电工程有限公司 Self-adaptive compensation correction method for adhesive width by visual detection system during angular adhesive coating

Similar Documents

Publication Publication Date Title
WO2005087460A1 (en) Apparatus, method and system for detecting the width and position of adhesives applied to a substrate
US7471383B2 (en) Method of automated quantitative analysis of distortion in shaped vehicle glass by reflected optical imaging
JP6620477B2 (en) Method and program for detecting cracks in concrete
US10607333B2 (en) Real-time, full web image processing method and system for web manufacturing supervision
US9841383B2 (en) Multiscale uniformity analysis of a material
US20040233421A1 (en) Method and device for examining an object in a contactless manner, especially for examining the surface form of the same
TWI629665B (en) Defect inspection method and defect inspection system
CN102601131B (en) A kind of billet surface quality on-line detecting device
US7619740B2 (en) Microgloss measurement of paper and board
CN110057841A (en) A kind of defect inspection method based on transmittance structure light
JP7116771B2 (en) Multi-step method and investigative apparatus for determining surface properties
CN106918597A (en) Film quality detection method and film quality detecting system
CN109791088A (en) Check device, inspection method and program
Hansson et al. Topography and reflectance analysis of paper surfaces using a photometric stereo method
FI121033B (en) Method and devices for optical evaluation of paper surface
TW201940861A (en) Infrared light transmission inspection for continuous moving web
CN111351805A (en) Light source module, online float glass defect detection device and detection method thereof
EP2863169B1 (en) Apparatus and method for measuring caliper of creped tissue paper
EP0809800B1 (en) Surface topography enhancement
WO2003095739A1 (en) Method and apparatus for monitoring of the dry line in a fou drinier paper machine and for control based thereupon
KR100966814B1 (en) A Surface Defect Detection and Surface Shape Recognition Equipment
Sari-Sarraf et al. On-line characterization of slurry for monitoring headbox performance
JPH0650906A (en) On-line formation meter
Berndtson et al. Automatic observation of the dry line in paper machine
Goddard Jr Four-Dimensional Characterization of Paper Web at the Wet End

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A1

Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BW BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE EG ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NA NI NO NZ OM PG PH PL PT RO RU SC SD SE SG SK SL SY TJ TM TN TR TT TZ UA UG US UZ VC VN YU ZA ZM ZW

AL Designated countries for regional patents

Kind code of ref document: A1

Designated state(s): BW GH GM KE LS MW MZ SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IT LU MC NL PL PT RO SE SI SK TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG

121 Ep: the epo has been informed by wipo that ep was designated in this application
DPEN Request for preliminary examination filed prior to expiration of 19th month from priority date (pct application filed from 20040101)
NENP Non-entry into the national phase

Ref country code: DE

WWW Wipo information: withdrawn in national office

Country of ref document: DE

122 Ep: pct application non-entry in european phase