WO2019219955A1 - Tube inspection system - Google Patents


Info

Publication number
WO2019219955A1
WO2019219955A1 · PCT/EP2019/062892
Authority
WO
WIPO (PCT)
Prior art keywords
data
defect
processing
utility
tube
Prior art date
Application number
PCT/EP2019/062892
Other languages
French (fr)
Inventor
Somasundaram S
Omkar BHOITE
Original Assignee
Ab Sandvik Materials Technology
Priority date
Filing date
Publication date
Application filed by Ab Sandvik Materials Technology
Publication of WO2019219955A1


Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01N INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N21/00 Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N21/84 Systems specially adapted for particular applications
    • G01N21/88 Investigating the presence of flaws or contamination
    • G01N21/95 Investigating the presence of flaws or contamination characterised by the material or shape of the object to be examined
    • G01N21/954 Inspecting the inner surface of hollow bodies, e.g. bores
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/0002 Inspection of images, e.g. flaw detection
    • G06T7/0004 Industrial image inspection
    • G06T7/0008 Industrial image inspection checking presence/absence
    • G01N2021/9542 Inspecting the inner surface of hollow bodies, e.g. bores using a probe
    • G01N2021/9544 Inspecting the inner surface of hollow bodies, e.g. bores using a probe with emitter and receiver on the probe
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/20 Special algorithmic details
    • G06T2207/20084 Artificial neural networks [ANN]

Definitions

  • the present invention relates to an inspection system, apparatus and method for inspecting a surface of a tube or pipe and in particular, although not exclusively, to a system configured to generate data and/or images relating to the surface via an automated inspection process.
  • Inspection systems have been used for the assessment, maintenance and repair of tubes and pipes within heat exchangers, fluid transfer tubing and utility transport networks i.e., water, oil and gas. Such systems typically involve introducing a probe into the pipe, advancing the probe axially and inspecting the internal surface via video and image output.
  • GB 2468301 discloses a water mains inspection and servicing system that comprises a feed cable, an ultrasound probe having an ultrasound sensor, a processor for analysing the detected ultrasound signals and a display for outputting the processed data.
  • JP 2001-7819 discloses an in-pipe inspection camera system having a lighting array positioned at the camera for direct visual inspection. A reflector and a diffusor plate are described and utilised to improve image capture and inspection.
  • Endoscopic inspection is the preferred technique for investigating the status of the internal surface of long pipes.
  • Manual insertion of the endoscopic camera along the length of the pipe is difficult, time consuming and tedious.
  • Such existing methods and apparatus are challenging to operate efficiently and reliably and can be ineffective for assessing the full length of the surface of a long pipe due to the poor resolution of the images obtained and the difficulty of fully advancing the camera. Identification of defects at pipe surfaces is therefore typically difficult, inefficient and unreliable. Accordingly, what is required is a tube and pipe inspection system that solves the above problems.
  • the present arrangements provide a manual and/or an automated tube inspection system, apparatus and method to obtain an image and/or inspection data of an external or internal surface of a tube. It is a further aspect to provide a system configured for the identification of defects at the tube surface to facilitate monitoring, servicing and repair of tubes and pipes. It is a further specific aspect to provide a system suitable for the automated inspection of tube networks that may form part of a fluid transport system such as a heat exchanger, water, oil and gas supply.
  • an inspection system having a mobile camera unit capable of axial transport within the internal bore of a tube with the camera unit being controlled by a control unit and a feeder mechanism, the control unit provided with a data handling utility for data acquisition, processing, feedback signal generation and data and/or image output.
  • the present system is particularly adapted for inspection of metal tubes and pipes.
  • the present system is configured for inspection of non-metallic tubes and pipes such as polymer, ceramic, glass, clay or pot-based tubes and pipes.
  • a method of inspecting a surface of a tube comprising: pre-processing inspection image data of a region of a surface of a tube obtained from a camera unit to generate pre-processed image data comprising a plurality of patches of data representing sub-locations within the region of the surface; inputting the pre-processed image data into a data processing utility and processing the data using feature extraction layers having filters to identify and categorise any defects within the sub-locations; post-processing the data of the patches output by the data processing utility to assign at least one characteristic to the data and/or the defect; and outputting data and/or at least one notification based on the identified and categorised defect.
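The claimed method above names four stages: pre-processing the image into patches, identification/categorisation by the data processing utility, post-processing, and output. A minimal sketch of that flow follows; the patch size, all function names, and the placeholder dark-pixel test standing in for the CNN are illustrative assumptions, not details from the specification:

```python
import numpy as np

PATCH = 32  # hypothetical patch size (one sub-location) in pixels

def preprocess(image):
    """Split a greyscale surface image into square patches (sub-locations)."""
    h, w = image.shape
    patches = []
    for r in range(0, h - PATCH + 1, PATCH):
        for c in range(0, w - PATCH + 1, PATCH):
            patches.append(((r, c), image[r:r + PATCH, c:c + PATCH]))
    return patches

def classify(patch):
    """Stand-in for the data processing utility: defects appear as dark
    regions in an otherwise highly illuminated surface."""
    return "defect" if patch.min() < 50 else "ok"

def postprocess(results):
    """Assign characteristics (label, location) to each flagged patch."""
    return [{"label": "defect", "location": loc}
            for loc, verdict in results if verdict == "defect"]

def inspect(image):
    results = [(loc, classify(p)) for loc, p in preprocess(image)]
    defects = postprocess(results)
    return {"n_defects": len(defects), "defects": defects}
```

A bright 64×64 test image with one dark pixel yields a single flagged sub-location at the patch containing it.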
  • the step of identifying and categorising the defects may comprise categorising a defect as: a positive class feature that represents at least one feature within the image data of the surface associated directly or indirectly with the camera unit and/or a non-hazardous non-textural feature at the surface; or a negative class feature being a defect that represents at least one feature within the image data of the surface that is hazardous.
  • a positive class feature encompasses a feature that is non-hazardous and effectively may be ignored by the data processing utility.
  • Reference to a negative class feature is to be interpreted as a ‘defect’, being an undesirable feature present at the tube surface.
  • Such a defect may be formed within the tube wall (at the region of the surface for example as a pit, scratch, crack etc) or present on the surface as a foreign body or liquid encompassing such features as solid contaminants, lubricants etc., that may be residual from an initial pipe manufacturing process.
  • the negative class feature may comprise any one or a combination of: a pit; a scratch; a scar; a tear; a split; a groove; a hole; an indentation or depression; a region of surface roughening, relative to non-surface roughened regions of the surface; or surface imperfection, solid contaminants, lubricants or other contaminants or manufacturing residue present on the surface.
  • the present system is therefore useful for quality control of tube/pipe manufacture by scanning the tube/surface prior to onward supply or subsequent downstream processing.
  • the non-hazardous non-textural defect comprises any one or a combination of: a water mark; a stain; a discontinuation in a colouration of the surface.
  • the at least one feature within the image data of the surface associated directly or indirectly with the camera unit comprises a part of the image data representing: a light reflector of the camera unit; a shadow created by a part of the camera unit; a shaft mounting a light reflector of the camera unit; a defect-free portion of the surface.
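Pulling the three lists above together, the positive/negative categorisation can be illustrated with a simple lookup. The feature names below are paraphrases of the listed examples, not identifiers from the specification:

```python
# Illustrative class mapping based on the feature categories listed above.
NEGATIVE_CLASS = {"pit", "scratch", "scar", "tear", "split", "groove",
                  "hole", "indentation", "surface_roughening",
                  "contaminant", "lubricant_residue"}
POSITIVE_CLASS = {"water_mark", "stain", "colour_discontinuity",
                  "reflector", "reflector_shadow", "mounting_shaft",
                  "defect_free_surface"}

def categorise(feature):
    """Negative-class features are hazardous defects requiring action;
    positive-class features are non-hazardous and may be ignored."""
    if feature in NEGATIVE_CLASS:
        return "negative"
    if feature in POSITIVE_CLASS:
        return "positive"
    return "unknown"
```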
  • the step of processing the data using the feature extraction layers comprises applying a series of feature extraction layers to the pre-processed image data, each of the layers in order of application having an increasing number of filters.
  • the system comprises 2 to 60; 2 to 40; 2 to 20; 2 to 10; 2 to 8; or 2 to 6 feature extraction layers.
  • At least one of the feature extraction layers comprises 10 to 20 filters, at least a second feature extraction layer comprises 20 to 40 filters, at least a third feature extraction layer comprises 40 to 80 filters and at least a fourth feature extraction layer comprises 80 to 180 filters.
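The layer structure described above, each successive layer applying more filters, can be sketched with a toy numpy convolution. The concrete counts 16, 32, 64 and 128 are assumptions chosen from within the stated ranges, and the random weights merely stand in for the trained filter weightings:

```python
import numpy as np

def conv_layer(x, n_filters, k=3, seed=0):
    """Toy 'valid' 2D convolution followed by ReLU. x has shape
    (channels, H, W); returns (n_filters, H-k+1, W-k+1)."""
    rng = np.random.default_rng(seed)
    c, h, w = x.shape
    weights = rng.standard_normal((n_filters, c, k, k)) * 0.1
    out = np.zeros((n_filters, h - k + 1, w - k + 1))
    for f in range(n_filters):
        for i in range(h - k + 1):
            for j in range(w - k + 1):
                out[f, i, j] = np.sum(weights[f] * x[:, i:i + k, j:j + k])
    return np.maximum(out, 0.0)  # ReLU activation

# Successive layers apply an increasing number of filters.
FILTERS = [16, 32, 64, 128]

def extract_features(patch):
    """Run a single-channel patch through four feature extraction layers."""
    x = patch[np.newaxis, :, :].astype(float)
    for n in FILTERS:
        x = conv_layer(x, n)
    return x
```

A 12×12 patch shrinks by 2 pixels per 3×3 'valid' layer, giving a 128-channel 4×4 feature map.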
  • the processing of the data using the data processing utility comprises utilising weightings assigned to the filters of the feature extraction layers.
  • the step of processing the data using the data processing utility comprises initiating a defect identification utility to control the collection of additional inspection data of the surface at and/or proximate to a location of said defect.
  • the at least one characteristic comprises any one or a combination of: a label; an area of a defect; a type of defect; a location of a defect within the surface as location coordinates at the tube.
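One way to hold the assigned characteristics is a small record type; the field names and the axial/angular coordinate convention below are assumptions for illustration, not from the specification:

```python
from dataclasses import dataclass

@dataclass
class DefectRecord:
    """Characteristics assigned during post-processing (names illustrative)."""
    label: str            # e.g. "pit", "scratch"
    area_mm2: float       # surface area of the defect
    defect_type: str      # "negative" (hazardous) or "positive" (ignorable)
    axial_mm: float       # axial position along the tube
    angular_deg: float    # angular position around the bore
```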
  • the pre-processing of the inspection image data comprises applying at least one cloak or cover to at least a part of the inspection image data to prevent the processing of the data covered by the cloak by the data processing utility.
  • a reflector identification utility is operative as part of the data pre-processing.
  • Such a utility functions to identify within the image the disc-shaped reflector and effectively cloak or conceal this part of the image data from subsequent data processing by the data processing utility.
  • the reflector identification utility is advantageous to enhance the speed of data processing and data handling efficiency of the present system.
  • the reflector identification utility functions to create a perimeter line or border around the reflector (within the image) such that the subsequent data processing does not occur internally within the as-created perimeter border.
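The cloaking behaviour of the reflector identification utility, drawing a perimeter around the disc-shaped reflector and excluding everything inside it from further processing, can be sketched as a circular mask. In practice the centre and radius would come from detecting the reflector in the image; here they are passed in directly:

```python
import numpy as np

def cloak_reflector(image, centre, radius):
    """Return a copy of the image with the disc-shaped reflector region
    masked out, plus a boolean mask marking the cloaked pixels, so that
    downstream processing can skip everything inside the perimeter."""
    h, w = image.shape
    yy, xx = np.ogrid[:h, :w]
    inside = (yy - centre[0]) ** 2 + (xx - centre[1]) ** 2 <= radius ** 2
    cloaked = image.copy()
    cloaked[inside] = 0  # conceal the reflector from the data processing utility
    return cloaked, inside
```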
  • the step of pre-processing the inspection image data comprises any one or a combination of image cropping, scaling, augmentation, blurring, labelling and storing the pre-processed data.
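Minimal stand-ins for the listed pre-processing operations, assuming greyscale numpy images; the kernel size and scale factor are arbitrary examples:

```python
import numpy as np

def crop(image, top, left, h, w):
    """Crop a rectangular region from the inspection image."""
    return image[top:top + h, left:left + w]

def scale_nn(image, factor):
    """Nearest-neighbour scaling by an integer factor."""
    return np.repeat(np.repeat(image, factor, axis=0), factor, axis=1)

def box_blur(image, k=3):
    """k x k mean filter over the valid region as a simple blurring step."""
    h, w = image.shape
    out = np.zeros((h - k + 1, w - k + 1))
    for i in range(h - k + 1):
        for j in range(w - k + 1):
            out[i, j] = image[i:i + k, j:j + k].mean()
    return out

def augment(image):
    """Flips and a rotation multiply the training examples per patch."""
    return [image, np.fliplr(image), np.flipud(image), np.rot90(image)]
```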
  • the step of post-processing the data comprises applying at least one label to the data output by the data processing utility.
  • the step of applying the label comprises inserting a boundary around a defect identified within a sub-location.
  • the step of post-processing the data comprises generating a binarized black and white image of a defect.
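The two post-processing steps above, a boundary around an identified defect and a binarized black-and-white image, can be sketched as follows; the threshold value is an arbitrary assumption, relying on defects appearing dark against the illuminated surface:

```python
import numpy as np

def binarize(patch, threshold=50):
    """Black-and-white image of a defect: dark defect pixels become 1,
    background 0."""
    return (patch < threshold).astype(np.uint8)

def bounding_box(binary):
    """Boundary around the defect as (top, left, bottom, right),
    or None if no defect pixels are present."""
    rows = np.any(binary, axis=1)
    cols = np.any(binary, axis=0)
    if not rows.any():
        return None
    top, bottom = np.where(rows)[0][[0, -1]]
    left, right = np.where(cols)[0][[0, -1]]
    return int(top), int(left), int(bottom), int(right)
```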
  • the step of outputting data comprises outputting any one or a combination of: a number of defects within the surface of the tube; a label assigned to a defect; a grade assigned to tube and/or a defect; a chart or graph based on defects identified and categorised by the data processing utility; at least one image of a defect obtained by the data processing utility; a surface area value of a defect; a 2D or 3D map of the surface containing the identified and categorised defects.
  • a method of creating a system for the inspection of a surface of a tube comprising: pre-processing inspection image data of a region of a surface of a tube obtained from a camera unit to generate pre-processed image data comprising a plurality of patches of data representing sub-locations within the region of the surface; inputting the pre-processed image data into a data processing utility and processing the data using feature extraction layers and filters to identify and categorise any defects within the sub-locations; comparing data output by the data processing utility with a reference library to determine an accuracy of the defect identification and/or categorisation; and adjusting parameters associated with the layers and/or filters and re-processing the data until the accuracy of the defect identification and/or categorisation is at or above a pre-determined threshold.
  • a tube inspection system comprising: a data pre-processing utility to process inspection data of a region of a surface of a tube obtained from a camera unit to generate pre-processed image data comprising a plurality of patches of data representing sub-locations within the region of the surface; a data processing utility to process the pre-processed image data, the data processing utility utilising feature extraction layers having filters to identify and categorise any defects within the sub-locations; a post-processing utility for the post-processing of the data of the patches output by the data processing utility to assign at least one characteristic to the data and/or the defect; wherein the system is configured to output at least one notification or output data based on the identification and categorisation of defects.
  • a tube inspection system comprising: a data pre-processing utility to process inspection data of a region of a surface of a tube obtained from a camera unit to generate pre-processed image data comprising a plurality of patches of data representing sub-locations within the region of the surface; a data processing utility to process the pre-processed image data, the data processing utility utilising feature extraction layers having filters to identify and categorise any defects within the sub-locations; a reference library used by the data processing utility as a comparison with the data output by the data processing utility to determine an accuracy of the defect identification and/or categorisation; and a model compiling utility to adjust parameters associated with the layers and/or filters and to control and action any re-processing of data by the data processing utility until an accuracy of the defect identification and/or categorisation is at or above a pre-determined threshold.
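The model compiling loop described in the system above, comparing output with a reference library and adjusting parameters until the accuracy reaches the pre-determined threshold, reduces to the following skeleton. The single tunable `decision_threshold` parameter and the adjustment rule are hypothetical stand-ins for whatever the real compiling utility tunes:

```python
import numpy as np

def compile_model(classify, train, reference_labels, adjust,
                  threshold=0.95, max_rounds=100):
    """Adjust parameters and re-process until defect identification
    accuracy against the reference library meets the threshold."""
    params = {"decision_threshold": 0.0}   # hypothetical tunable parameter
    accuracy = 0.0
    for _ in range(max_rounds):
        predictions = [classify(x, params) for x in train]
        accuracy = float(np.mean([p == r for p, r
                                  in zip(predictions, reference_labels)]))
        if accuracy >= threshold:
            break
        params = adjust(params)            # e.g. re-weight the filters
    return params, accuracy
```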
  • a tube inspection system as claimed and described herein to inspect a surface of a tube for defect identification and categorisation.
  • an inspection system for inspecting an internal surface of a tube comprising: a camera unit comprising a camera, a light source and a shield, the shield mounted at a separation distance from the camera and configured to reflect at least some light generated from the light source in a direction towards the camera; a data handling unit connected in data transfer communication with the camera unit to receive data from the camera; a feeder mechanism to provide axial transport of the camera unit within the tube; a control unit to control at least one of the camera unit, the data handling unit and the feeder mechanism.
  • control unit is configured to control the camera unit, the data handling unit and the feeder mechanism collectively.
  • control unit further comprises a motor driver, a computer and a display.
  • the data handling unit comprises: a data storage utility; a data acquisition utility; a data processing utility.
  • the data handling unit is connected in wired or wireless communication with the camera unit.
  • the feeder mechanism comprises: a drivable motor; a mechanical connection member extending in contact with the camera unit; an actuator drivable by the motor to act on the mechanical connection member and provide axial advancement and withdrawal of the camera unit at the tube relative to the feeder mechanism.
  • the mechanical connection member comprises a rigid or semi-rigid elongate shaft, wire or cable extending axially from the camera unit.
  • the mechanical connection member may be generally straight and comprises a rigidity to prevent coiling of the mechanical connection member during axial transport of the camera unit within the tube.
  • a method of inspecting an internal surface of a tube comprising: positioning a camera unit within a tube, the unit comprising a camera, a light source and a shield mounted at a separation distance from the camera; advancing the camera unit axially within the tube using a feeder mechanism; illuminating a region of the internal surface of the tube with the light source and reflecting at least some of the light in a direction towards the camera using the shield; acquiring inspection data from the camera unit at a data handling unit connected in data transfer communication with the camera unit; and controlling at least one of the camera unit and the feeder mechanism using a control unit.
  • the method further comprises: processing said inspection data using a data processing utility; and outputting processed inspection data and/or an image of the internal surface of the tube at an output display.
  • the step of acquiring inspection data from the camera unit comprises: transferring inspection data from the camera to the data handling unit; and storing said inspection data at a data storage utility.
  • control unit comprises a programmable logic controller (PLC) having a processor and a power supply to control operation of at least one of the camera unit, the feeder mechanism and the data handling unit in response to the inspection data received from the camera unit.
  • the step of advancing the camera unit axially within the tube comprises driving an actuator via a motor, the actuator acting on a mechanical connection member extending axially from the camera unit to provide axial transport of the camera unit within the tube relative to the feeder mechanism.
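The relation between motor rotation and axial advancement of the camera unit is simple friction-drive geometry: one revolution of the actuator advances the member by roughly the circumference of the driving element. The diameter and step count below are illustrative assumptions, not values from the specification:

```python
import math

# Hypothetical drive geometry (illustrative values only).
BUSHING_DIAMETER_MM = 40.0     # friction bushing on the actuator
STEPS_PER_REV = 200            # typical stepper motor resolution

def steps_for_advance(advance_mm):
    """Motor steps needed to advance the camera unit a given axial
    distance: one revolution moves the member pi * d millimetres."""
    mm_per_rev = math.pi * BUSHING_DIAMETER_MM
    return round(advance_mm / mm_per_rev * STEPS_PER_REV)

def advance_per_step_mm():
    """Axial resolution of the feeder mechanism per motor step."""
    return math.pi * BUSHING_DIAMETER_MM / STEPS_PER_REV
```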
  • a method of inspecting an internal surface of a tube comprising: controlling a feeder mechanism to advance axially a camera unit within a tube; controlling the camera unit to generate inspection data of the surface of the tube; processing the inspection data to identify a defect at the surface; and outputting an image and/or data relating to the defect at an output device.
  • the method comprises initiating a defect identification utility to control the collection of additional inspection data of the surface at and/or proximate to a location of said defect.
  • the method further comprises storing the inspection data, processed data and/or an output image of the surface and/or defect.
  • the step of initiating the defect identification utility comprises pulsing the collection of additional inspection data of the surface.
  • the step of initiating the defect identification utility further comprises activating a defect labelling utility to apply at least one label to the additional inspection data and/or processed data based on said inspection data.
  • the method further comprises activating a method learning utility to improve a capability of the method to identify defects and/or process the inspection data.
  • the method further comprises initiating a post data processing utility to determine a size of the defect.
  • the size may comprise data relating to an area, length and/or width of the defect.
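Given a binarized defect image, the area, length and width follow directly from the defect pixels; the millimetres-per-pixel scale below is a hypothetical calibration value:

```python
import numpy as np

PIXEL_MM = 0.05  # hypothetical scale: millimetres per pixel

def defect_size(binary):
    """Area, length and width of a defect from its binarized image
    (1 = defect pixel, 0 = background)."""
    ys, xs = np.nonzero(binary)
    if ys.size == 0:
        return {"area_mm2": 0.0, "length_mm": 0.0, "width_mm": 0.0}
    return {
        "area_mm2": ys.size * PIXEL_MM ** 2,
        "length_mm": (ys.max() - ys.min() + 1) * PIXEL_MM,
        "width_mm": (xs.max() - xs.min() + 1) * PIXEL_MM,
    }
```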
  • the step of outputting the image and/or data comprises outputting any one or a combination of: an image of the surface and the defect; a size of the defect; location details of the defect at the surface of the tube.
  • the step of obtaining the inspection data comprises controlling the camera unit via a control unit to obtain additional inspection data in response to processing of previously captured inspection data.
  • the step of obtaining the inspection data comprises controlling the camera unit having a camera, light source and a shield, the shield mounted at a separation distance from the camera and configured to reflect at least some of the light generated from the light source in a direction towards the camera.
  • a tube inspection system comprising: a control utility to control a camera unit within a tube to obtain inspection data of an internal surface of a tube; a data handling utility comprising a data acquisition utility and a data processing utility, the data processing utility configured to identify a defect at the surface of the tube based on the inspection data.
  • the system further comprises a defect identification utility configured to control the camera utility to collect additional inspection data of the surface of the tube at or proximate to the location of the defect at the surface.
  • system may further comprise a defect labelling utility to apply labels to the additional inspection data and/or processed data based on said inspection data.
  • the system may further comprise a system learning utility configured to improve the capability of the system to identify defects and/or process the raw image data.
  • the system further comprises a camera unit wherein the camera unit comprises a camera, a light source and a shield, the shield mounted at a separation distance from the camera and configured to reflect at least some light generated from the light source in a direction towards the camera.
  • the present system is suitable for investigation and analysis of different types of tube, including tubes of different material, shape and size.
  • the present system is compatible for use with tubes of diameter in the range 5 mm to 300 mm or even larger.
  • the present system is particularly suitable for investigation and analysis of tubes of diameter 10 mm to 250 mm. Also, the system is suitable for tubes of diameter 10 to 80 mm or 20 to 80 mm, as examples only.
  • Figure 1 is a perspective view of a camera unit suitable for inspecting an internal surface of a tube according to a specific implementation of the present invention;
  • Figure 2 is a side view of the camera unit of figure 1;
  • Figure 3 is a schematic side view of the camera unit positioned and operable to illuminate a region within a tube;
  • Figure 4 is a schematic illustration of the camera unit of figure 3 positionally controlled by a feeder unit and a control unit according to a specific implementation of the present invention;
  • Figure 5 is a perspective view of a part of the feeder unit of figure 4.
  • Figure 6 is a schematic illustration of components of a tube inspection system according to a specific implementation of the present invention.
  • Figure 7 is a schematic illustration of selected components of the tube inspection system of figure 6;
  • Figure 8 is a schematic flow diagram of one mode of operation of the tube inspection system of figure 7;
  • Figure 9 is a further flow diagram of an automated data collection process utilising the present inspection system.
  • Figure 10A is a first part of a schematic flow diagram detailing the operation and control of the various components of the inspection system of figure 7;
  • Figure 10B is a second part of the schematic flow diagram of figure 10A;
  • Figures 11 A, B, C, D, E and F are images at an internal surface of a tube output from the tube inspection system according to one aspect of the present invention.
  • Figure 12 is a schematic flow diagram of an initial ‘learning’ load of the present system using a feature learning utility to calibrate and prepare the system for operational use according to a specific implementation;
  • Figure 13A is a schematic flow diagram of a first half of the stages associated with operational use of the present system for tube surface inspection;
  • Figure 13B is a schematic flow diagram of those stages within a second half of operational use of the present system for tube surface inspection and in particular defect identification and characterisation;
  • Figure 14 is an image of an internal surface of a tube with different defect types identified and categorised;
  • Figure 15 is an image of an internal surface of a tube illustrating ‘patches’ of data at sub-locations within a particular region at a surface of the tube as part of the data processing stages.
  • a tube inspection system for inspecting an internal surface of a tube/pipe comprises a camera unit 10 suitably dimensioned to be introduced and advanced axially within a tube 21.
  • Camera unit 10 comprises: a camera 11 having an axially forward lens optionally being a PAL (Phase Alternating Line) type camera; an array of LEDs forming a light source 12 positioned at an axially forward region 14 of the camera unit 10 around the forwardmost lens; and a tubular sleeve 13 housing the camera 11 and light source 12.
  • Unit 10 further comprises a disk-shaped shield 18 having a generally circular rearward facing surface 19 being positioned opposed to and separated a relatively short (e.g., 10 to 150 mm) distance from the lens of camera 11.
  • Surface 19 may be coated and comprise a reflective material.
  • surface 19 may be formed from a plastic such as a nylon or other suitable polymer capable of impeding or preventing light generated from light source 12 from propagating axially along the tube 21.
  • surface 19 is configured to (at least partially) reflect the light generated from the light source 12 in a direction axially towards the lens of camera 11.
  • the disk-shaped shield 18 is positionally secured to the camera 11 via a mounting shaft 20 secured to sleeve 13 such that the shield 18 and the camera 11 form an integrated unit for unified movement within the tube 21.
  • Unit 10 further comprises a plurality of cables 16 to provide data communication between the camera unit 10 and further electronic components of the present inspection system including in particular a feeder mechanism, a control unit and a data handling unit and utility detailed below.
  • a substantially rigid elongate member 17 extends axially rearward from a rearward end 15 of sleeve 13.
  • Member 17 may be formed as a rod, wire or cable having a thickness, rigidity and/or stiffness to avoid deflection and in particular coiling of the member 17 as the camera unit 10 is moved axially within tube 21 (via the member 17).
  • camera unit 10 in addition to sleeve 13 may comprise an additional spacer illustrated schematically by reference 23 to ensure appropriate alignment of the camera unit 10 within the tube 21 relative to an internal surface 21a of the tube 21. That is, spacer 23, sleeve 13 and shield 18 are dimensioned so as to comprise an outside diameter being slightly less than an internal diameter of tube surface 21a with a separation distance optionally being around 1 mm. Accordingly, light emitted (22a) from light source 12 is configured to fill the region between the camera 11 and shield surface 19. Light is reflected (22b) from surface 19 to return to camera 11. Shield 18 is beneficial to prevent propagation and loss of light axially along the tube bore 21b a distance beyond the focal length of camera 11.
  • the inventors have identified that the illumination of the tube surface 21a beyond the focal length of the camera 11 is effectively wasted, represents inefficient use of the light source 12, results in poor quality images and is not optimised for defect detection.
  • the present tube inspection system is specifically adapted to identify defects at internal surface 21a such as pits, projections, tears, water or other marks and stains, scoring, breaks, cracks and the like. By utilising the present system, such defects may be identified as dark or black regions within an otherwise highly illuminated internal surface 21a.
  • Shield 18, mounted at a distance at or slightly beyond the focal length of camera 11, traps the emitted light 22a and significantly enhances illumination of surface 21a within the focal length of the camera 11 and accordingly provides a system for enhanced sensitivity to defect detection.
  • the present inspection system further comprises a feeder mechanism illustrated schematically by reference 24 having an actuator 29 controlled by/provided at a control panel 28.
  • a single or an array of tubes 21 are mounted at a platform 25 with the tube inspection system including feeder mechanism 24 and control panel 28 also mounted on a suitable support structure 26.
  • Control panel 28 is adapted to control actuator 29.
  • Elongate member 17 extends axially rearward from camera unit 10 and is fed into the feeder mechanism 24 such that actuator 29, when activated is configured to act upon the member 17 and provide axial advancement and withdrawal of the camera unit 10 within tube 21.
  • Control panel 28 comprises: a power supply, electronic components and a suitable terminal input/user interface optionally including a keyboard, touch screen, control buttons, monitor, etc., as found in the art.
  • control panel 28 comprises wireless communication components and modules for wireless data exchange with the camera unit and a remote network.
  • Panel 28 via its components and functionality is configured to power and control the various components of the inspection system including the feeder mechanism, the camera unit and a data handling unit. Further details of the control panel 28 and its integrated control unit are described referring to figures 6 to 10B.
  • feeder unit 24 comprises a base plate 44 mounting at least one actuator 29 in the form of a pair of rotatable axles positioned opposed to one another by a separation distance to accommodate cables 16 and elongate member 17.
  • a compressible rubber bushing 42 is mounted at each axle to sit in frictional contact against elongate member 17 such that a rotation of actuator 29 (and bushings 42) provides a corresponding axial movement of elongate member 17 (and cables 16) to drive the axial movement of camera unit 10 within tube 21.
  • the feeder mechanism 24 further comprises at least one guidance or alignment mounting 41 to assist with appropriate positioning of the elongate member 17 relative to the actuator 29. Additionally, the mechanism 24 may be
  • the tube inspection system is adapted to inspect the internal surfaces 21a of an array of pipes 27 with each pipe 21 positioned side-by-side.
  • Camera unit 10 being positionally controlled via feeder mechanism 24, is capable of being axially advanced and withdrawn within tube 21 via the control panel 28 controlling actuator 29.
  • Control panel 28 specifically comprises: a CPU; a programmable logic controller (utilised for controlling feeder mechanism 24); and a data handling unit for handling of the data generated by the camera unit 10.
  • the inspection system 38 further comprises an output device 37 in the form of a digital display (such as a TV or monitor) to display moving and/or still images of the surface 21a generated by camera 11.
  • the system 38 (implemented via control panel 28) specifically comprises: a control unit 69 having a programmable logic controller 65; a data storage utility 66; software 67; and a power supply 68.
  • the control unit is coupled in data transfer communication with the feeder mechanism 78.
  • Feeder mechanism 78 comprises a motor driver 74, an actuator, a motor 75, a power supply 76 (that may be the same or additional to power supply 68) and movement control components and utilities 77.
  • Control unit 69 is also coupled for data transfer with the camera unit 10 that in turn comprises optionally an on-board processor 70, cables 71, lighting utilities 72 and camera control utilities 73.
  • Control unit 69 includes or is provided in communication with a data handling utility 80 configured to acquire and process raw image data (e.g., video stream) generated by camera unit 10.
  • Data handling utility 80 comprises a data acquisition utility 81; a data processing utility 82; a system learning utility 83; a data output utility 84; and a data storage utility 85.
  • Such utilities 81 to 85 may be implemented as software and comprise associated components for data output and storage.
  • the present system 38 is configured for at least two modes of operation including a manual mode 50 and an intelligent (or automatic) mode 51. Both modes 50, 51 utilise control panel 28 in which signals (including in particular feedback signals 54) are generated and returned via the programmable logic controller 65.
  • Controller 65 provides motor driver control 55 and stepped motor control 56 with the instruction data being transmitted to feeder mechanism 78 and in particular actuator 29.
  • Camera unit is advanced via the elongate member 17 (that supports cables 16) with data transfer between camera unit 10 and data handling utility 80 providing a means of video and still/static image acquisition 60.
  • a data storage unit 61 is adapted for storing raw and/or processed image data.
  • data storage 61 may be divided or partitioned into storage areas dependent upon the type of video footage or image obtained via the data acquisition and processing.
  • Data processing and system learning utilities 82, 83 are implemented as a convolutional neural network (CNN), referred to herein as a machine learning graph.
  • the CNN and associated data processing utilities are operable according to an initial system learning stage to prepare, calibrate and configure the system for automated inspection of internal or external surfaces of a tube to specifically identify and categorise defects based on defect type.
  • identification and categorisation involves differentiation of defects that may be regarded as ‘positive class features’ and ‘negative class features’ (i.e., ‘defects’) as described referring to figure 14.
  • the CNN once configured via the initial learning stage is operable in a second and subsequent ‘use’ mode to inspect the external or internal surface of a tube for the automated differentiation, identification and categorisation of defects.
  • the first stage system learning is described referring to figure 12 and the second operational stage or ‘use’ stage is described referring to figures 13A and 13B.
  • output from the data handling utility and in particular stage 62 provides a feedback signal 63 that is utilised by the system 38 for continued and targeted data acquisition via the intelligent mode of operation 51.
  • Figure 9 illustrates a specific implementation of the intelligent mode of operation 51 (within the second stage of operation) in which control unit 69 (implemented on control panel 28) provides control of the various components of the system 38 including the camera unit 10, the feeder mechanism 78 and the data handling utility 80.
  • an initial stage comprises system activation 86
  • the feeder mechanism is then operated at stage 87 to advance the camera unit at stage 88.
  • Raw video/image data is generated at stage 89 with such data being transferred to the data handling utility to obtain output processed image data at stage 90.
  • aspects of the data handling utility 80 are adapted to identify and categorise defects at stage 91. If no defect is identified the data is stored at stage 92 (within a storage unit 1) and the camera unit advanced at stage 96. If a defect is detected at stage 91, the image data is stored at a storage unit 2 at stage 93. The camera unit 10 is then maintained stationary at stage 94 and further image data of the defect is collected at stage 95 for subsequent processing, storage and analysis. The camera unit is then advanced at stage 96. If the end of the tube is identified by the camera unit 10 at stage 97, the camera unit stops at stage 98, is extracted from the tube at stage 99 and returned to the start of the tube at stage 100 via axially withdrawing the camera unit 10 via elongate member 17 and feeder mechanism 78. The system may then be stopped and the camera unit 10 extracted from the tube at stage 101. As indicated, control unit 69 is configured to control at least one or all of the camera unit 10, feeder mechanism 78 and data handling utility.
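The stage 86-101 control flow above can be sketched as a simple loop. This is an illustrative sketch only: the classes, method names and frame representation are assumptions, standing in for camera unit 10, feeder mechanism 78 and the data handling utility of the patent.

```python
# Minimal sketch of the intelligent-mode control loop (stages 86-101).
# All class and function names are illustrative, not from the patent.

class MockCamera:
    """Stands in for camera unit 10: yields frames until the tube end."""
    def __init__(self, frames):
        self.frames = list(frames)
        self.pos = 0
    def end_of_tube(self):                          # stage 97: end-of-tube check
        return self.pos >= len(self.frames)
    def grab_frame(self):                           # stage 89: raw image data
        return self.frames[self.pos]
    def grab_extra_frames(self):                    # stage 95: targeted acquisition
        return [("extra", self.frames[self.pos])]

class MockFeeder:
    """Stands in for feeder mechanism 78 driving elongate member 17."""
    def advance(self, camera):
        camera.pos += 1                             # stage 96: axial advance

def inspect_tube(camera, feeder, is_defect):
    """Run the loop: advance, classify each frame, store accordingly."""
    clean, defective = [], []
    while not camera.end_of_tube():
        frame = camera.grab_frame()
        if is_defect(frame):                        # stage 91: identification
            defective.append(frame)                 # stage 93: storage unit 2
            # stages 94-95: hold the camera and collect further defect images
            defective.extend(camera.grab_extra_frames())
        else:
            clean.append(frame)                     # stage 92: storage unit 1
        feeder.advance(camera)
    return clean, defective                         # stages 98-101: stop and extract
```

A run such as `inspect_tube(MockCamera(["ok", "pit", "ok"]), MockFeeder(), lambda f: f == "pit")` returns the defect-free frames in one store and the defect frame plus its extra images in the other, mirroring the two-partition storage of stage 92/93.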
  • the apparatus and system are initially configured for defect differentiation based on defect type (positive class feature and negative class feature as discussed referring to figure 14). If a defect is identified that is unclassified, the system is adapted to issue a notification to an operator and/or system administrator that a system update and/or additional learning may be necessary. Such an event may then require a manual intervention or a separate automated system update in which the system is subject to further calibration and learning via the initial first stage set-up process in which the CNN is operated specifically for system learning described referring to figure 12.
  • the data processing utility (CNN) and associated libraries are updated and the system may continue in the second stage or use/operational mode to acquire image inspection data via the camera unit according to the automated intelligent mode of operation.
  • FIGS. 10A and 10B illustrate further specific features, components and functionality of the inspection system 38.
  • camera unit 10 comprises a ‘frame grabber’ 103 to obtain static images (of the tube internal surface 21a) from the live video stream generated by camera unit 10.
  • the data handling utility 80 when operative comprises a plurality of stages for data acquisition, processing and output according to collective components 117. Initially, a start command is received 104. Image capture and data acquisition 81 are then initiated having sequential stages involving a frame capturing subroutine 105, frame display and recording 106, single frame analysis 107 and image processing 108. If the end of the pipe is identified at stage 109, a pulse generation subroutine is initiated 110. Camera unit is then reversed or stopped at stage 102. A subroutine to convert video to image data is then initiated 111.
  • a data input facility is provided 112 having data input fields including heat number 113, lot number 114, pipe grade 115 and pipe ID/OD 116.
  • the present pipe inspection system may be configured for two different types of system learning.
  • supervised learning mode (described referring to figure 12)
  • the software and system learning utility 83 are configured and constructed such that the software is provided with all types of internal surface defects (i.e., pits, projections, score marks, tears etc.) identifiable as part of the inspection process.
  • the software and system learning utility 83 are suitable for the inspection process to identify and label defects.
  • in the unsupervised learning mode the software and system learning utility 83 are initially configured with no or little initial data input regarding the type of defects as mentioned. Accordingly, in this mode, the system via the software and system learning utility 83 is configured to learn entirely from the collection of inspection data acquired during use.
  • the primary implementation described herein is focussed on the first mode of operation described referring to figure 12 involving complete system learning as an initial configuration set-up.
  • the data processing utility 82 involves processing initialisation/import packages at stage 118 and generating and potentially outputting system learning data via system learning utility 83 at stage 119. If a defect is detected at stage 120, the pulse generation subroutine is activated at stage 121 followed by activation of a labelling defect subroutine 122 to assign labels to the processed data associated with the presence of a defect within a particular image or set of images. A visualisation subroutine is then initiated at stage 123 for potential output and storage of the defect containing data.
  • a report generation utility 84 comprises generating a single image of the entire length of a pipe 132; defect characterisation 133; the location of defect 134; the generation of a spreadsheet 135 and the generation and output of schematic illustrations such as graphs, pie charts etc., 136 of the status and characteristics of the pipe surface 21a.
  • the data handling functionality 117 further comprises a post data processing utility 125 configured to assign a characteristic to a defect that may include an area value, a length, a width, a location, a defect type etc. at stage 126.
  • the identification of a hazardous (negative class feature) defect 127 is followed by initiation of pulse generation subroutine 128. If a defect is categorised as significant/hazardous (according to predefined criteria) a notification to the operator is generated 129 with the subsequent storage of the defect image at stage 130.
  • Figure 11 illustrates processed static images of tube surface 21a as generated by the inspection system 38 in which axial sections of the surface 21a are appropriately illuminated by the light source 12 and footage captured by camera 11.
  • Figures 11A to E illustrate regions of tube surface 21a that include defects whilst image 11F represents a section of the tube surface 21a that is defect free.
  • the present system is advantageous to enhance image quality and thereby greatly facilitate data processing.
  • figure 11A illustrates pits and/or projections as discrete black dots 150 within an otherwise smooth internal surface 21a.
  • Figures 11B and 11C illustrate tears 151 and 152.
  • Figure 11D illustrates the presence of a watermark 153 and figure 11E illustrates the presence of axial scoring 154 at surface 21a.
  • the present tube inspection system 38, via the components and functionality as described, is configured to capture video data of a tube surface 21a, to process the raw data and to identify and classify defects.
  • the present system involves generating feedback signals to dynamically control the movement and operation of camera unit 10 being sensitive to the presence of a defect at the surface.
  • camera 11 is formed as an endoscopic camera.
  • the present system 38 may find application for defect detection within internal combustion engines, turbo chargers, turbine pipe networks, automotive valves, heat exchanger pipes and pipes for transporting utilities such as water, oil and gas.
  • the present system is capable of displaying live video and/or images of the tube surface 21a as the camera unit 10 is advanced within the tube 21 in parallel with raw image data processing according to manual, semi-automated or fully automated implementation.
  • the video footage is capable of being dynamically processed via the control unit 69 and data handling utility 80 to store and output captured data.
  • the present system via the control panel 28 may be pre-programmed so as to be capable of automated/dynamic or manual control for a variety of different modes of operation.
  • an automated or semi-automated tube inspection system is provided for the inspection of internal surfaces of an array of pipes in which the camera unit 10 may be automatically advanced within the pipes during data handling (acquisition and processing).
  • the present system is advantageous by requiring minimal installation set-up and operation space and to significantly reduce the time and effort required by operating personnel to inspect tube internal surfaces 21a.
  • the present tube inspection system is adapted specifically to categorise and differentiate between anomalies at the surface of the tube, with such anomalies including defects considered as ‘hazardous’ such as pits, scoring, cracks etc., and those anomalies that may be regarded as ‘non-hazardous’ or non-textural, including features identified within an image of the tube surface that show parts of the camera unit or apparatus positioned at the tube, as well as other features within the image such as watermarks, shadows, discontinuities in coloration of the surface etc.
  • such features/defects are categorised and classified according to two fundamental classes being a positive class and a negative class.
  • Figure 14 is an image of an internal surface of a tube captured using the apparatus of figures 1 to 6.
  • a ‘positive class feature/defect’ corresponds to background features that include reflector A used to reflect the light back to the camera unit 10, a shadow C formed behind the reflector A, a shaft B used to mount the reflector A at the camera unit 10 and the general defect-free regions D of the tube surface.
  • a diameter of reflector A may be dependent upon the internal diameter of the tubes being inspected. In certain aspects, the diameter of reflector A is very close to the internal diameter of the tube, generating negligible or zero shadow formation behind the reflector. However, where the diameter of reflector A is less than the internal diameter of the tube, significant shadow formation is encountered.
  • the present system is adapted to compensate for such features. Additionally, the surface texture of the pipe resulting from changes in coloration etc., along the pipe length may be encountered by the inspection system. In particular, variation of surface coloration or surface texturing may be dependent on the grade, dimensions etc., of the tube. Such positive class features would accordingly not give rise for concern and accordingly can be ignored for the purposes of output defect identification and categorisation as a result of surface inspection.
  • the present system is configured to differentiate between positive and negative class features and to process preferentially negative class defects as part of the automated inspection system process.
  • A ‘negative class feature/defect’, class E, is a defect categorisation that requires identification by the present system.
  • the present system is adapted for identifying multiple types of negative class defects including tens and up to hundreds or even thousands of very specific defect types based on the nature of the defect being for example a tear, a pit, surface roughening, scoring, a longitudinal or transverse (including perpendicular) orientated defect such as a crack or slit.
  • the present system comprises a data processing utility preferably implemented as a convolutional neural network (CNN).
  • the CNN is adapted to receive and process inspection image data with such inspection image data including positive class and negative class features.
  • libraries are loaded into the system at stage 200.
  • Such libraries may utilise open source software adapted for pre-processing inspection image data in which such data may be cropped, scaled, augmented, blurred etc., at stage 201.
  • patches of 64 x 64 pixels are cropped from different regions of an image representing positive and negative classes.
  • patches are labelled at stage 202 and then stored in a desired folder.
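The patch-preparation steps at stages 201-202 can be sketched as below. The function name, image sizes and label strings are illustrative assumptions; only the 64 x 64 patch size and the positive/negative labelling come from the description.

```python
# Sketch of pre-processing stages 201-202: cropping labelled 64 x 64
# patches from regions of an inspection image. Names are illustrative.
import numpy as np

PATCH = 64  # patch size stated in the description

def crop_patch(image, top, left):
    """Return one 64 x 64 patch cut from the given top-left corner."""
    patch = image[top:top + PATCH, left:left + PATCH]
    if patch.shape != (PATCH, PATCH):
        raise ValueError("patch extends beyond the image boundary")
    return patch

# Label each patch by the class of the region it was cut from:
# 'positive' (reflector, shadow, defect-free surface) or 'negative' (defect).
image = np.zeros((256, 256), dtype=np.uint8)   # stand-in inspection frame
dataset = [
    (crop_patch(image, 0, 0), "positive"),
    (crop_patch(image, 128, 64), "negative"),
]
```

In the patent's workflow the labelled patches are then stored in a desired folder for the feature learning stage; here they are simply collected in a list.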
  • the present system comprises feature learning utility 205 that is operative with a machine learning graph at stage 204 utilising the CNN.
  • the CNN is a feature learning utility that via an iterative data re-processing loop is adapted to learn and differentiate positive and negative class features and to further accurately predict negative class defect classification for a given unknown image or patch.
  • the present CNN is designed specifically to be ‘light’ with regard to mathematical computation so as to exhibit enhanced processing speed. Accordingly, the present system is adapted specifically for online, real-time tube inspection and analysis without the need for highly sophisticated processing engines.
  • This enhanced processing speed is achieved, in part, by the configuration of the CNN with a minimum number of feature extraction layers.
  • the system comprises four feature extraction layers.
  • an initial layer comprises 16 filters.
  • Each subsequent layer may then comprise a progressively increasing number of filters being 32, 64 and 128 respectively over the four layers.
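A rough shape calculation illustrates why this four-layer extractor stays ‘light’. The assumption here, not stated in the description, is that each layer uses ‘same’-padded convolutions followed by 2 x 2 max-pooling, so each layer halves the spatial dimensions of a 64 x 64 input patch while the filter count grows 16 → 32 → 64 → 128:

```python
# Rough feature-map shape arithmetic for the four-layer extractor with
# 16, 32, 64 and 128 filters. The 2 x 2 pooling after each layer is an
# assumption; the patent specifies only the filter counts.

def feature_map_shapes(input_size=64, filters=(16, 32, 64, 128)):
    shapes, size = [], input_size
    for f in filters:
        size //= 2                       # 2 x 2 pooling halves each spatial dim
        shapes.append((size, size, f))   # (rows, cols, filters) after the layer
    return shapes

# 64 x 64 patch -> (32,32,16) -> (16,16,32) -> (8,8,64) -> (4,4,128)
print(feature_map_shapes())
```

Under these assumptions the final feature map is only 4 x 4 x 128, which is consistent with the stated aim of real-time processing without sophisticated processing engines.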
  • a model compiling utility 207 is provided to optimise data processing with such optimisation comprising an ‘optimizer’, typically referred to as a gradient descent algorithm, to enhance the speed of learning.
  • the system is configurable, as will be appreciated, to comprise any number of feature extraction layers for example two to a hundred layers with each layer comprising different numbers of filters.
  • the number of filters within each layer may be the same or may increase uniformly or non-uniformly from the initial layer to the final processing layer.
  • the filters may be the same or may be different at each layer.
  • the pre-processed inspection image data generated by the data pre-processing utility is fed to the feature learning utility 205 to be processed by the CNN at stage 204.
  • the labelled data fed to the CNN is then analysed by applying the layers and filters to extract features according to each positive and negative class.
  • the accuracy of the CNN is checked for every class at stage 208.
  • a predetermined accuracy threshold of 99% is input into the system. Any accuracy less than 99% involves a looped re-processing of data by the CNN at stage 204. Such re-processing comprises changing the model filter weights at stage 210. Accordingly, the present system is self-training and self-learning to achieve the desired accuracy.
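The self-training loop of stages 204, 208 and 210 can be sketched as follows. The `evaluate` and `update_weights` callables are illustrative stand-ins for the real per-class CNN evaluation and filter-weight adjustment, which the patent does not detail.

```python
# Sketch of the accuracy-check loop (stages 204 -> 208 -> 210): re-process
# data and adjust filter weights until every class reaches 99% accuracy.
# evaluate() and update_weights() are illustrative stand-ins.

THRESHOLD = 0.99

def train_until_accurate(evaluate, update_weights, max_rounds=100):
    """Loop until the per-class accuracy meets the 99% threshold."""
    for round_no in range(max_rounds):
        per_class = evaluate()                       # stage 208: accuracy check
        if all(a >= THRESHOLD for a in per_class.values()):
            return round_no                          # stage 209: save model weights
        update_weights()                             # stage 210: change filter weights
    raise RuntimeError("accuracy threshold not reached")

# Mock training run: accuracy improves by 5 percentage points per update.
acc = {"positive": 0.90, "negative": 0.95}
rounds = train_until_accurate(
    lambda: dict(acc),
    lambda: acc.update({k: min(1.0, v + 0.05) for k, v in acc.items()}),
)
```

The mock run reaches the threshold after two weight updates, at which point the determined filter weights would be saved (stage 209) for the operational stage.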
  • a reference data library is utilised.
  • Such reference data library includes pre-categorised inspection image data in which positive class features and negative class defects within the patches of the data are assigned identifications/labels.
  • the reference library may be initially constructed and/or may be enhanced and built during the operation of stages 200 to 208.
  • a final stage 209 comprises saving the determined model filter weights once defect accuracy identification of greater than 99% is obtained.
  • live inspection image data is obtained from a camera accessing utility 216.
  • the frames from the camera accessing utility 216 are read by the system at stage 212.
  • Real-time defects identification and scanning is integral to the present system.
  • the real-time/live scanning and image processing is achieved using a ‘sliding window approach’.
  • a window of size 64 x 64 slides over an image, crops patches, and feeds the inspection image data as pre-processed data to the CNN model.
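The sliding-window acquisition can be sketched with plain array slicing. The stride (here a full non-overlapping 64 pixels) is an assumption; the description does not state the step size of the window.

```python
# Sketch of the sliding-window approach: a 64 x 64 window moves over the
# frame and yields each patch with its coordinates. Stride is assumed.
import numpy as np

def sliding_windows(image, size=64, stride=64):
    """Yield (top, left, patch) for every full window in the image."""
    rows, cols = image.shape[:2]
    for top in range(0, rows - size + 1, stride):
        for left in range(0, cols - size + 1, stride):
            yield top, left, image[top:top + size, left:left + size]

frame = np.zeros((256, 320), dtype=np.uint8)   # stand-in video frame
patches = list(sliding_windows(frame))
# 4 window rows (256/64) x 5 window columns (320/64) -> 20 patches
```

Each yielded patch, with its coordinates, corresponds to the patch data fed to the CNN model and checked against the cloak circle in the stages that follow.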
  • a reflector identification utility 217 is operative as part of the data pre-processing.
  • Such a utility is advantageous to enhance the speed of data processing and data handling efficiency of the present system.
  • at stage 213 a circle (cloak) is fitted over reflector A using a canny edge detection and Hough circle transformation technique.
  • the coordinates of each sliding window ‘patch’ are extracted at stage 215.
  • a check is then undertaken at stage 216 to identify if any of the four coordinates are located within the cloak circle (of stage 213).
  • if so, the system is re-looped to re-sample at stage 214. If not, the processed data is then input into the machine learning graph (comprising the CNN) at stage 217.
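Once the cloak circle has been fitted (via edge detection and a Hough circle transform at stage 213), the coordinate check of stages 215-216 is plain geometry. A sketch, assuming the check triggers when any of the four patch corners falls inside the circle:

```python
# Sketch of the cloak check (stages 215-216): skip a 64 x 64 patch whose
# corner coordinates fall inside the reflector circle (centre cx,cy,
# radius r) fitted at stage 213. Triggering on "any corner" is assumed.

def inside_circle(x, y, cx, cy, r):
    """True if point (x, y) lies within the circle of radius r."""
    return (x - cx) ** 2 + (y - cy) ** 2 <= r ** 2

def patch_overlaps_cloak(top, left, size, cx, cy, r):
    """True if any corner of the patch falls inside the cloak circle."""
    corners = [(left, top), (left + size, top),
               (left, top + size), (left + size, top + size)]
    return any(inside_circle(x, y, cx, cy, r) for x, y in corners)
```

A patch flagged by `patch_overlaps_cloak` would be re-sampled at stage 214; otherwise its data passes on to the machine learning graph at stage 217.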
  • the model filter weights are loaded into the CNN at stage 218 with such filter weights being obtained from the initial system learning function and configuration of figure 12 as described.
  • the present system is optimised for processing speed, with defect identification and categorisation focused only on the region of the inspection image within the peripheral boundary of the data, specifically excluding the region covered by the reflector.
  • essential libraries are loaded into the system at stage 211 as an initial stage.
  • Figure 15 illustrates inspection image data output from the CNN as predicted defect data at stage 219.
  • the data processing involves the division of image data into patches of 64 x 64 pixels that are cropped from different regions of the overall image. Such patches are scaled, augmented and labelled, as described, with such data being obtained and processed in real-time using the sliding window approach.
  • a patch 250 containing a positive class feature such as a watermark may be ignored by the CNN for the purposes of outputting a predicted defect at stage 219.
  • 64 x 64 patch 253 contains a negative class defect such as a spot or pit 252
  • a predicted defect is output from stage 219.
  • the pulse at stage 220 functions to collect additional inspection data at the region proximal to the negative class defect identified. This additional inspection data provides further data surrounding a defect and increases the reliability and accuracy of the present system for defect identification and categorisation.
  • a boundary box is created around the defect at stage 221, with the name of the defect added/tagged to the patch data at stage 222 followed by defect extraction at stage 223.
  • Such steps are controlled by a labelling utility to label defect 252 within a patch 253 at a specific sub-location (within a region) at the surface of the tube.
  • a post-processing utility 226 at stage 225 identifies the surface area and location of the defect within the tube. Such a location may be represented by coordinates to accurately identify the precise position of a defect at an internal surface. If a defect is regarded as a negative class type and therefore hazardous at stage 227, a notification is issued to an operator at stage 228 using a notification utility 229.
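The area and location assignment of stages 225-226 can be sketched from a binary defect mask. The centroid-based location and the frame-coordinate offsets are illustrative assumptions; the patent says only that the location may be represented by coordinates at the tube.

```python
# Sketch of post-processing stages 225-226: assign an area and a
# location to a defect found in a binary patch mask. Names and the
# centroid convention are illustrative.
import numpy as np

def characterise_defect(mask, patch_top, patch_left):
    """Return pixel area and centroid coordinates in the full frame."""
    ys, xs = np.nonzero(mask)                   # defect pixels in the patch
    area = int(len(xs))
    centroid = (patch_top + float(ys.mean()),   # row in the full frame
                patch_left + float(xs.mean()))  # column in the full frame
    return {"area": area, "location": centroid}

mask = np.zeros((64, 64), dtype=np.uint8)
mask[10:14, 20:24] = 1                          # a synthetic 4 x 4 pit
info = characterise_defect(mask, patch_top=128, patch_left=256)
```

The returned record is the kind of characteristic (area value, location) that the post-processing utility attaches to a defect before the hazard check at stage 227.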
  • the detected defect image is stored at stage 230 and the inspection ended at stage 231.
  • a report generation utility 235 is operative to output predetermined categorisations of data output by the CNN and the post-processing utility.
  • Such output data may include a number of defects, a label of a defect type, a grade of a tube/pipe, a bar chart/pie chart of defects (eg defect types, size, position, orientation etc.).
  • stage 234 involves reading of the next frame from the camera unit 10, with the system repeating the steps described within stages 221 to 234 to provide an intelligent, autonomous and automated tube inspection system.
  • threshold techniques are used to convert the data within a patch 253 to a binary image.
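The description says only "threshold techniques"; a minimal sketch using a simple global threshold (the fixed threshold value is an assumption) would be:

```python
# Sketch of the binarisation step: a global threshold converting a
# greyscale patch to a binary image. The threshold value is assumed.
import numpy as np

def binarize(patch, threshold=128):
    """Map pixels below the threshold to 0 and the rest to 255."""
    return np.where(patch < threshold, 0, 255).astype(np.uint8)

patch = np.array([[10, 200], [130, 90]], dtype=np.uint8)
binary = binarize(patch)
```

The resulting binary image of the defect is the form referenced for post-processing, e.g. the binarized black-and-white defect image mentioned in the summary.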
  • the present system is capable of surface analysis of tubes, pipes and other cylindrical-type objects.
  • the present system is adapted for inspecting internal tube and external tube surfaces as desired.


Abstract

A method and system for inspecting a surface of a tube. The system comprises a pre-processing utility to process inspection image data of a surface of the tube, a main data processing utility that uses feature extraction layers and filters to identify and categorise defects within sub-locations of the region at the surface of the tube, and a post-processing utility to assign a characteristic to the data and/or the defect prior to the output of data and/or a notification based on an identified and categorised defect.

Description

Tube Inspection System
Field of invention
The present invention relates to an inspection system, apparatus and method for inspecting a surface of a tube or pipe and in particular, although not exclusively, to a system configured to generate data and/or images relating to the surface via an automated inspection process.
Background art
Inspection systems have been used for the assessment, maintenance and repair of tubes and pipes within heat exchangers, fluid transfer tubing and utility transport networks i.e., water, oil and gas. Such systems typically involve introducing a probe into the pipe, advancing the probe axially and inspecting the internal surface via video and image output.
GB 2468301 discloses a water mains inspection and servicing system that comprises a feed cable, an ultrasound probe having an ultrasound sensor, a processor for analysing the detected ultrasound signals and a display for outputting the processed data.
JP 2001-7819 discloses an in-pipe inspection camera system having a lighting array positioned at the camera for direct visual inspection. A reflector and a diffusor plate are described and utilised to improve image capture and inspection. Endoscopic inspection is the preferred technique for investigating the status of the internal surface of long pipes. However, manual insertion of the endoscopic camera along the length of the pipe is difficult, time consuming and tedious. In situations of batch inspection, such existing methods and apparatus are challenging to operate efficiently and reliably and can be ineffective to assess the full length of the surface of a long pipe due to poor resolution of the images obtained and the difficulty of fully advancing the camera. Accordingly, identification of defects at pipe surfaces is typically difficult, inefficient and unreliable. What is required, therefore, is a tube and pipe inspection system that solves the above problems.
Summary of the Invention
It is one aspect of the present arrangements to provide a manual and/or an automated tube inspection system, apparatus and method to obtain an image and/or inspection data of an external or internal surface of a tube. It is a further aspect to provide a system configured for the identification of defects at the tube surface to facilitate monitoring, servicing and repair of tubes and pipes. It is a further specific aspect to provide a system suitable for the automated inspection of tube networks that may form part of a fluid transport system such as a heat exchanger, water, oil and gas supply.
It is a further specific aspect to provide a system for the characterisation of defects with regard to type, size, position, magnitude. It is a further specific aspect to provide a system configured for intelligent and dynamic control being responsive to the status of the surface and in particular the presence of a defect. In particular, it is one aspect to provide a system that is self-controlling to be fully automated and capable of defect identification and targeted inspection in direct response to the nature of the captured inspection data.
The aspects are achieved via an inspection system having a mobile camera unit capable of axial transport within the internal bore of a tube with the camera unit being controlled by a control unit and a feeder mechanism, the control unit provided with a data handling utility for data acquisition, processing, feedback signal generation and data and/or image output. The present system is particularly adapted for inspection of metal tubes and pipes.
Additionally, the present system is configured for inspection of non-metallic tubes and pipes such as polymer, ceramic, glass, clay or pot-based tubes and pipes.
According to one aspect there is provided a method of inspecting a surface of a tube comprising: pre-processing inspection image data of a region of a surface of a tube obtained from a camera unit to generate pre-processed image data comprising a plurality of patches of data representing sub-locations within the region of the surface; inputting the pre-processed image data into a data processing utility and processing the data using feature extraction layers having filters to identify and categorise any defects within the sub-locations; post-processing the data of the patches output by the data processing utility to assign at least one characteristic to the data and/or the defect; and outputting data and/or at least one notification based on the identified and categorised defect.
Optionally, the step of identifying and categorising the defects may comprise categorising a defect as: a positive class feature that represents at least one feature within the image data of the surface associated directly or indirectly with the camera unit and/or a non-hazardous non-textural feature at the surface; or a negative class feature being a defect that represents at least one feature within the image data of the surface that is hazardous. Within this specification, reference to a positive class feature encompasses a feature that is non-hazardous and effectively may be ignored by the data processing utility. Reference to a negative class feature is to be interpreted as a ‘defect’ being an undesirable feature present at the tube surface. Such a defect may be formed within the tube wall (at the region of the surface, for example as a pit, scratch, crack etc.) or present on the surface as a foreign body or liquid encompassing such features as solid contaminants, lubricants etc., that may be residual from an initial pipe manufacturing process.
Optionally, the negative class feature may comprise any one or a combination of: a pit; a scratch; a scar; a tear; a split; a groove; a hole; an indentation or depression; a region of surface roughening, relative to non-surface roughened regions of the surface; or surface imperfection, solid contaminants, lubricants or other contaminants or manufacturing residue present on the surface. The present system is therefore useful for quality control of tube/pipe manufacture by scanning the tube/surface prior to onward supply or subsequent downstream processing.
Optionally, the non-hazardous non-textural defect comprises any one or a combination of: a water mark; a stain; a discontinuation in a colouration of the surface.
Optionally, the at least one feature within the image data of the surface associated directly or indirectly with the camera unit comprises a part of the image data representing: a light reflector of the camera unit; a shadow created by a part of the camera unit; a shaft mounting a light reflector of the camera unit; a defect-free portion of the surface.
Optionally, the step of processing the data using the feature extraction layers comprises applying a series of feature extraction layers to the pre-processed image data, each of the layers in order of application having an increasing number of filters. Optionally, the system comprises 2 to 60; 2 to 40; 2 to 20; 2 to 10; 2 to 8; or 2 to 6 feature extraction layers.
Optionally, at least one of the feature extraction layers comprises 10 to 20 filters, at least a second feature extraction layer comprises 20 to 40 filters, at least a third feature extraction layer comprises 40 to 80 filters and at least a fourth feature extraction layer comprises 80 to 180 filters.
Optionally, the processing of the data using the data processing utility comprises utilising weightings assigned to the filters of the feature extraction layers.
Optionally, the step of processing the data using the data processing utility comprises initiating a defect identification utility to control the collection of additional inspection data of the surface at and/or proximate to a location of said defect. Optionally, the at least one characteristic comprises any one or a combination of: a label; an area of a defect; a type of defect; a location of a defect within the surface as location coordinates at the tube.
Optionally, the pre-processing of the inspection image data comprises applying at least one cloak or cover to at least a part of the inspection image data to prevent the processing of the data covered by the cloak by the data processing utility. Optionally, as part of the data pre-processing, a reflector identification utility is operative. Such a utility functions to identify within the image the disc-shaped reflector and effectively cloak or conceal this part of the image data from subsequent data processing by the data processing utility. The reflector identification utility is advantageous to enhance the speed of data processing and data handling efficiency of the present system. The reflector identification utility functions to create a perimeter line or border around the reflector (within the image) such that the subsequent data processing does not occur internally within the as-created perimeter border.
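The cloaking of the disc-shaped reflector can be illustrated as overwriting a circular region of the image so that later processing ignores it. The function name, the list-of-rows image representation and the fill value are assumptions for the sketch; a production system would more likely operate on array data.

```python
def cloak_reflector(image, center, radius, fill=0):
    """Return a copy of `image` (a list of pixel rows) with a disc-shaped
    region overwritten by `fill`, excluding the reflector from subsequent
    data processing. Illustrative sketch, not the patented implementation."""
    cy, cx = center
    out = [row[:] for row in image]  # copy so the input is untouched
    for y, row in enumerate(out):
        for x in range(len(row)):
            # inside the disc if within `radius` of the centre
            if (y - cy) ** 2 + (x - cx) ** 2 <= radius ** 2:
                row[x] = fill
    return out
```

Anything inside the created perimeter is set to a constant, so the data processing utility never evaluates those pixels.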
Optionally, the step of pre-processing the inspection image data comprises any one or a combination of image cropping, scaling, augmentation, blurring, labelling and storing the pre-processed data.
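The splitting of a region into patches representing sub-locations can be sketched as follows; the non-overlapping tiling and the function name are assumptions, since the patent does not fix a patch geometry.

```python
def split_into_patches(image, patch_h, patch_w):
    """Split a 2-D image (list of rows) into non-overlapping patches of
    size patch_h x patch_w, each patch representing one sub-location."""
    patches = []
    h, w = len(image), len(image[0])
    for top in range(0, h - patch_h + 1, patch_h):
        for left in range(0, w - patch_w + 1, patch_w):
            patches.append([row[left:left + patch_w]
                            for row in image[top:top + patch_h]])
    return patches
```

Each patch is then fed to the data processing utility independently, so a defect can be localised to its sub-location.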
Optionally, the step of post-processing the data comprises applying at least one label to the data output by the data processing utility. Optionally, the step of applying the label comprises inserting a boundary around a defect identified within a sub-location.
Optionally, the step of post-processing the data comprises generating a binarized black and white image of a defect.
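Binarization of a defect image reduces to thresholding each grayscale pixel to pure black or white. The threshold value here is an assumption; a real pipeline might derive it adaptively (e.g. Otsu's method).

```python
def binarize(gray, threshold=128):
    """Map a grayscale patch (0-255 values) to black (0) / white (255).
    Dark pixels, which the system treats as candidate defect regions
    within the illuminated surface, become black."""
    return [[0 if px < threshold else 255 for px in row] for row in gray]
```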
Optionally, the step of outputting data comprises outputting any one or a combination of: a number of defects within the surface of the tube; a label assigned to a defect; a grade assigned to tube and/or a defect; a chart or graph based on defects identified and categorised by the data processing utility; at least one image of a defect obtained by the data processing utility; a surface area value of a defect; a 2D or 3D map of the surface containing the identified and categorised defects. According to one aspect there is provided a method of creating a system for the inspection of a surface of a tube comprising: pre-processing inspection image data of a region of a surface of a tube obtained from a camera unit to generate pre-processed image data comprising a plurality of patches of data representing sub-locations within the region of the surface; inputting the pre-processed image data into a data processing utility and processing the data using feature extraction layers and filters to identify and categorise any defects within the sub-locations; comparing data output by the data processing utility with a reference library to determine an accuracy of the defect identification and/or
categorisation; and adjusting parameters associated with the layers and/or filters and re-processing the data by the data processing utility until an accuracy of the defect identification and/or categorisation is at or above a pre-determined threshold.
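The iterate-until-accurate loop of this aspect can be sketched with the training and evaluation steps injected as callables. All interfaces here (`train_fn`, `evaluate_fn`, the model object) are hypothetical placeholders for whatever the data processing utility provides.

```python
def train_until_accurate(model, train_fn, evaluate_fn, reference_library,
                         threshold=0.95, max_rounds=50):
    """Repeatedly adjust layer/filter parameters and re-evaluate against
    the reference library until accuracy meets the threshold.
    Sketch only: train_fn/evaluate_fn are assumed interfaces."""
    for round_no in range(1, max_rounds + 1):
        train_fn(model)  # adjust parameters of layers and filters
        accuracy = evaluate_fn(model, reference_library)
        if accuracy >= threshold:
            return round_no, accuracy
    raise RuntimeError("accuracy threshold not reached within max_rounds")
```

The `max_rounds` guard is an addition for safety; the patent text only specifies looping until the threshold is met.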
According to one aspect there is provided a tube inspection system comprising: a data pre-processing utility to process inspection data of a region of a surface of a tube obtained from a camera unit to generate pre-processed image data comprising a plurality of patches of data representing sub-locations within the region of the surface; a data processing utility to process the pre-processed image data, the data processing utility utilising feature extraction layers having filters to identify and categorise any defects within the sub-locations; a post-processing utility for the post-processing of the data of the patches output by the data processing utility to assign at least one characteristic to the data and/or the defect; wherein the system is configured to output at least one notification or output data based on the identification and categorisation of defects.
According to one aspect there is provided a tube inspection system comprising: a data pre-processing utility to process inspection data of a region of a surface of a tube obtained from a camera unit to generate pre-processed image data comprising a plurality of patches of data representing sub-locations within the region of the surface; a data processing utility to process the pre-processed image data, the data processing utility utilising feature extraction layers having filters to identify and categorise any defects within the sub-locations; a reference library used by the data processing utility as a comparison with the data output by the data processing utility to determine an accuracy of the defect identification and/or categorisation; and a model compiling utility to adjust parameters associated with the layers and/or filters and to control and action any re-processing of data by the data processing utility until an accuracy of the defect identification and/or categorisation is at or above a pre-determined threshold.
According to one aspect there is provided a use of a tube inspection system as claimed and described herein to inspect a surface of a tube for defect identification and categorisation.
According to one aspect there is provided an inspection system for inspecting an internal surface of a tube comprising: a camera unit comprising a camera, a light source and a shield, the shield mounted at a separation distance from the camera and configured to reflect at least some light generated from the light source in a direction towards the camera; a data handling unit connected in data transfer communication with the camera unit to receive data from the camera; a feeder mechanism to provide axial transport of the camera unit within the tube; a control unit to control at least one of the camera unit, the data handling unit and the feeder mechanism.
Optionally, the control unit is configured to control the camera unit, the data handling unit and the feeder mechanism collectively. Optionally, the control unit further comprises a motor driver, a computer and a display. According to one embodiment, the data handling unit comprises: a data storage utility; a data acquisition utility; a data processing utility.
Optionally, the data handling unit is connected in wired or wireless communication with the camera unit.
According to another embodiment, the feeder mechanism comprises: a drivable motor; a mechanical connection member extending in contact with the camera unit; an actuator drivable by the motor to act on the mechanical connection member and provide axial advancement and withdrawal of the camera unit at the tube relative to the feeder mechanism. According to yet another embodiment, the mechanical connection member comprises a rigid or semi-rigid elongate shaft, wire or cable extending axially from the camera unit. The mechanical connection member may be generally straight and comprises a rigidity to prevent coiling of the mechanical connection member during axial
advancement of the camera unit within the tube when actuated by the feeder mechanism.
According to one aspect there is provided a method of inspecting an internal surface of a tube comprising: positioning a camera unit within a tube, the unit comprising a camera, a light source and a shield mounted at a separation distance from the camera; advancing the camera unit axially within the tube using a feeder mechanism; illuminating a region of the internal surface of the tube with the light source and reflecting at least some of the light in a direction towards the camera using the shield; acquiring inspection data from the camera unit at a data handling unit connected in data transfer communication with the camera unit; and controlling at least one of the camera unit and the feeder mechanism using a control unit.
Optionally, the method further comprises: processing said inspection data using a data processing utility; and outputting processed inspection data and/or an image of the internal surface of the tube at an output display. Preferably, the step of acquiring inspection data from the camera unit comprises: transferring inspection data from the camera to the data handling unit; and storing said inspection data at a data storage utility.
According to one embodiment, the control unit comprises a programmable logic controller (PLC) having a processor and a power supply to control operation of at least one of the camera unit, the feeder mechanism and the data handling unit in response to the inspection data received from the camera unit.
Optionally, the step of advancing the camera unit axially within the tube comprises driving an actuator via a motor, the actuator acting on a mechanical connection member extending axially from the camera unit to provide axial transport of the camera unit within the tube relative to the feeder mechanism.
According to one aspect there is provided a method of inspecting an internal surface of a tube comprising: controlling a feeder mechanism to advance axially a camera unit within a tube; controlling the camera unit to generate inspection data of the surface of the tube; processing the inspection data to identify a defect at the surface; and outputting an image and/or data relating to the defect at an output device.
Optionally, following the identification of the defect, the method comprises initiating a defect identification utility to control the collection of additional inspection data of the surface at and/or proximate to a location of said defect. Optionally, the method further comprises storing the inspection data, processed data and/or an output image of the surface and/or defect. Optionally, the step of initiating the defect identification utility comprises pulsing the collection of additional inspection data of the surface. Optionally, the step of initiating the defect identification utility further comprises activating a defect labelling utility to apply at least one label to the additional inspection data and/or processed data based on said inspection data. Optionally, the method further comprises activating a method learning utility to improve a capability of the method to identify defects and/or process the inspection data.
Optionally, the method further comprises initiating a post data processing utility to determine a size of the defect. Optionally, the size may comprise data relating to an area, length and/or width of the defect. Optionally, the step of outputting the image and/or data comprises outputting any one or a combination of: an image of the surface and the defect; a size of the defect; location details of the defect at the surface of the tube. Optionally, the step of obtaining the inspection data comprises controlling the camera unit via a control unit to obtain additional inspection data in response to processing of previously captured inspection data. Optionally, the step of obtaining the inspection data comprises controlling the camera unit having a camera, light source and a shield, the shield mounted at a separation distance from the camera and configured to reflect at least some of the light generated from the light source in a direction towards the camera.
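The size determination described above (area, length and width of a defect) can be sketched from a binary defect mask. The function name and the bounding-box interpretation of length/width are assumptions made for illustration.

```python
def defect_size(mask):
    """Compute area (pixel count) and axis-aligned bounding-box
    length/width of defect pixels (value 1) in a binary mask.
    Illustrative post-processing sketch; units would be converted
    to mm using the camera calibration in a real system."""
    coords = [(r, c) for r, row in enumerate(mask)
                     for c, v in enumerate(row) if v]
    if not coords:
        return {"area": 0, "length": 0, "width": 0}
    rows = [r for r, _ in coords]
    cols = [c for _, c in coords]
    return {"area": len(coords),
            "length": max(rows) - min(rows) + 1,
            "width": max(cols) - min(cols) + 1}
```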
According to one aspect there is provided a tube inspection system comprising: a control utility to control a camera unit within a tube to obtain inspection data of an internal surface of a tube; a data handling utility comprising a data acquisition utility and a data processing utility, the data processing utility configured to identify a defect at the surface of the tube based on the inspection data. According to one embodiment, the system further comprises a defect identification utility configured to control the camera unit to collect additional inspection data of the surface of the tube at or proximate to the location of the defect at the surface.
Optionally, the system may further comprise a defect labelling utility to apply labels to the additional inspection data and/or processed data based on said inspection data.
Optionally, the system may further comprise a system learning utility configured to improve the capability of the system to identify defects and/or process the raw image data. According to one embodiment, the system further comprises a camera unit wherein the camera unit comprises a camera, a light source and a shield, the shield mounted at a separation distance from the camera and configured to reflect at least some light generated from the light source in a direction towards the camera.
The present system is suitable for investigation and analysis of different types of tube, including tubes of different material, shape and size. Optionally, the present system is compatible for use with tubes of diameter in the range 5 mm to 300 mm or even larger.
The present system is particularly suitable for investigation and analysis of tubes of diameter 10 mm to 250 mm. Also, the system is suitable for tubes of diameter 10 to 80 mm or 20 to 80 mm, as examples only.
Brief description of drawings
A specific implementation of the present invention will now be described, by way of example only, and with reference to the accompanying drawings in which:
Figure 1 is a perspective view of a camera unit suitable for inspecting an internal surface of a tube according to a specific implementation of the present invention;
Figure 2 is a side view of the camera unit of figure 1;
Figure 3 is a schematic side view of the camera unit positioned and operable to illuminate a region within a tube; Figure 4 is a schematic illustration of the camera unit of figure 3 positionally controlled by a feeder unit and a control unit according to a specific implementation of the present invention;
Figure 5 is a perspective view of a part of the feeder unit of figure 4;
Figure 6 is a schematic illustration of components of a tube inspection system according to a specific implementation of the present invention;
Figure 7 is a schematic illustration of selected components of the tube inspection system of figure 6;
Figure 8 is a schematic flow diagram of one mode of operation of the tube inspection system of figure 7;
Figure 9 is a further flow diagram of an automated data collection process utilising the present inspection system;
Figure 10A is a first part of a schematic flow diagram detailing the operation and control of the various components of the inspection system of figure 7;
Figure 10B is a second part of the schematic flow diagram of figure 10A;
Figures 11 A, B, C, D, E and F are images at an internal surface of a tube output from the tube inspection system according to one aspect of the present invention;
Figure 12 is a schematic flow diagram of an initial ‘learning' stage of the present system using a feature learning utility to calibrate and prepare the system for operational use according to a specific implementation; Figure 13A is a schematic flow diagram of a first half of the stages associated with operational use of the present system for tube surface inspection;
Figure 13B is a schematic flow diagram of those stages within a second half of operational use of the present system for tube surface inspection and in particular defect identification and characterisation;
Figure 14 is an image of an internal surface of a tube with different defect types identified and categorised;
Figure 15 is an image of an internal surface of a tube illustrating ‘patches' of data at sub-locations within a particular region at a surface of the tube as part of the data processing stages.
Detailed description
Referring to figures 1, 2 and 3 a tube inspection system for inspecting an internal surface of a tube/pipe comprises a camera unit 10 suitably dimensioned to be introduced and advanced axially within a tube 21. Camera unit 10 comprises: a camera 11 having an axially forward lens optionally being a PAL (Phase Alternating Line) type camera; an array of LEDs forming a light source 12 positioned at an axially forward region 14 of the camera unit 10 around the forwardmost lens; and a tubular sleeve 13 housing the camera 11 and light source 12. Unit 10 further comprises a disk-shaped shield 18 having a generally circular rearward facing surface 19 being positioned opposed to and separated a relatively short (e.g., 10 to 150 mm) distance from the lens of camera 11. Surface 19 may be coated and comprise a reflective material. Optionally, surface 19 may be formed from a plastic such as a nylon or other suitable polymer capable of impeding or preventing light generated from light source 12 from propagating axially along the tube 21. Preferably, surface 19 is configured to (at least partially) reflect the light generated from the light source 12 in a direction axially towards the lens of camera 11. The disk-shaped shield 18 is positionally secured to the camera 11 via a mounting shaft 20 secured to sleeve 13 such that the shield 18 and the camera 11 form an integrated unit for unified movement within the tube 21. Unit 10 further comprises a plurality of cables 16 to provide data communication between the camera unit 10 and further electronic components of the present inspection system including in particular a feeder mechanism, a control unit and a data handling unit and utility detailed below. To facilitate axial advancement and withdrawal of the unit 10 within tube 21, a substantially rigid elongate member 17 extends axially rearward from a rearward end 15 of sleeve 13.
Member 17 may be formed as a rod, wire or cable having a thickness, rigidity and/or stiffness to avoid deflection and in particular coiling of the member 17 as the camera unit 10 is moved axially within tube 21 (via the member 17).
Referring to figure 3, camera unit 10 in addition to sleeve 13 may comprise an additional spacer illustrated schematically by reference 23 to ensure appropriate alignment of the camera unit 10 within the tube 21 relative to an internal surface 21a of the tube 21. That is, spacer 23, sleeve 13 and shield 18 are dimensioned so as to comprise an outside diameter being slightly less than an internal diameter of tube surface 21a with a separation distance optionally being around 1 mm. Accordingly, light emitted (22a) from light source 12 is configured to fill the region between the camera 11 and shield surface 19. Light is reflected (22b) from surface 19 to return to camera 11. Shield 18 is beneficial to prevent propagation and loss of light axially along the tube bore 21b a distance beyond the focal length of camera 11. The inventors have identified that the illumination of the tube surface 21a beyond the focal length of the camera 11 is effectively wasted, represents inefficient use of the light source 12, results in poor quality images and is not optimised for defect detection. The present tube inspection system is specifically adapted to identify defects at internal surface 21a such as pits, projections, tears, water or other marks and stains, scoring, breaks, cracks and the like. By utilising the present system, such defects may be identified as dark or black regions within an otherwise highly illuminated internal surface 21a. Shield 18, mounted at a distance at or slightly beyond the focal length of camera 11, traps the emitted light 22a and significantly enhances illumination of surface 21a within the focal length of the camera 11 and accordingly provides a system for enhanced sensitivity to defect detection.
Referring to figure 4 the present inspection system further comprises a feeder mechanism illustrated schematically by reference 24 having an actuator 29 controlled by/provided at a control panel 28. Typically, a single or an array of tubes 21 are mounted at a platform 25 with the tube inspection system including feeder mechanism 24 and control panel 28 also mounted on a suitable support structure 26. Control panel 28 is adapted to control actuator 29. Elongate member 17 extends axially rearward from camera unit 10 and is fed into the feeder mechanism 24 such that actuator 29, when activated, is configured to act upon the member 17 and provide axial advancement and withdrawal of the camera unit 10 within tube 21. Control panel 28 comprises: a power supply, electronic components and a suitable terminal input/user interface optionally including a keyboard, touch screen, control buttons, monitor etc., as found in the art. Optionally, control panel 28 comprises wireless communication components and modules for wireless data exchange with the camera unit and a remote network. Panel 28 via its components and functionality is configured to power and control the various components of the inspection system including the feeder mechanism, the camera unit and a data handling unit. Further details of the control panel 28 and its integrated control unit are described referring to figures 6 to 10B.
Referring to figure 5, feeder unit 24 comprises a base plate 44 mounting at least one actuator 29 in the form of a pair of rotatable axles positioned opposed to one another by a separation distance to accommodate cables 16 and elongate member 17. A compressible rubber bushing 42 is mounted at each axle to sit in frictional contact against elongate member 17 such that a rotation of actuator 29 (and bushings 42) provides a corresponding axial movement of elongate member 17 (and cables 16) to drive the axial movement of camera unit 10 within tube 21. The feeder mechanism 24 further comprises at least one guidance or alignment mounting 41 to assist with appropriate positioning of the elongate member 17 relative to the actuator 29. Additionally, the mechanism 24 may be positionally or statically mounted in position via supports 40.
Referring to figure 6, the tube inspection system is adapted to inspect the internal surfaces 21a of an array of pipes 27 with each pipe 21 positioned side-by-side. Camera unit 10, being positionally controlled via feeder mechanism 24, is capable of being axially advanced and withdrawn within tube 21 via the control panel 28 controlling actuator 29. Control panel 28 specifically comprises: a CPU; a programmable logic controller (utilised for controlling feeder mechanism 24); and a data handling unit for handling of the data generated by the camera unit 10. The inspection system 38 further comprises an output device 37 in the form of a digital display (such as a TV or monitor) to display moving and/or still images of the surface 21a generated by camera 11.
Referring to figure 7, the system 38 (implemented via control panel 28) specifically comprises: a control unit 69 having a programmable logic controller 65; a data storage utility 66; software 67; and a power supply 68. The control unit is coupled in data transfer communication with the feeder mechanism 78. Feeder mechanism 78 comprises a motor driver 74, an actuator, a motor 75, a power supply 76 (that may be the same or additional to power supply 68) and movement control components and utilities 77. Control unit 69 is also coupled for data transfer with the camera unit 10 that in turn optionally comprises an on-board processor 70, cables 71, lighting utilities 72 and camera control utilities 73. Whilst such components and utilities 70 to 73 are illustrated forming a part of the camera unit 10, utilities 72 and 73 may be located at the control unit 69 and in particular at control panel 28. Control unit 69 includes or is provided in communication with a data handling utility 80 configured to acquire and process raw image data (e.g., video stream) generated by camera unit 10. Data handling utility 80 comprises a data acquisition utility 81; a data processing utility 82; a system learning utility 83; a data output utility 84; and a data storage utility 85. Such utilities 81 to 85 may be implemented as software and comprise associated components for data output and storage.
Referring to figure 8, the present system 38 is configured for at least two modes of operation including a manual mode 50 and an intelligent (or automatic) mode 51. Both modes 50, 51 utilise control panel 28 in which signals (including in particular feedback signals 54) are generated and returned via the programmable logic controller 65.
Controller 65 provides motor driver control 55 and stepper motor control 56 with the instruction data being transmitted to feeder mechanism 78 and in particular actuator 29. The camera unit is advanced via the elongate member 17 (that supports cables 16) with data transfer between camera unit 10 and data handling utility 80 providing a means of video and still/static image acquisition 60. A data storage unit 61 is adapted for storing raw and/or processed image data. Optionally, as detailed below, data storage 61 may be divided or partitioned into storage areas dependent upon the type of video footage or image obtained via the data acquisition and processing.
Data processing and system learning utilities 82, 83 are implemented as a convolutional neural network (CNN), referred to herein as a machine learning graph. As described referring to figures 12, 13A and 13B, the CNN and associated data processing utilities, are operable according to an initial system learning stage to prepare, calibrate and configure the system for automated inspection of internal or external surfaces of a tube to specifically identify and categorise defects based on defect type. Such identification and categorisation involves differentiation of defects that may be regarded as ‘positive class features' and ‘negative class features' (i.e., ‘defects') as described referring to figure 14. The CNN once configured via the initial learning stage is operable in a second and subsequent ‘use' mode to inspect the external or internal surface of a tube for the automated differentiation, identification and categorisation of defects. The first stage system learning is described referring to figure 12 and the second operational stage or ‘use' stage is described referring to figures 13A and 13B.
Referring again to figure 8, output from the data handling utility and in particular stage 62 provides a feedback signal 63 that is utilised by the system 38 for continued and targeted data acquisition via the intelligent mode of operation 51. Figure 9 illustrates a specific implementation of the intelligent mode of operation 51 (within the second stage of operation) in which control unit 69 (implemented on control panel 28) provides control of the various components of the system 38 including the camera unit 10, the feeder mechanism 78 and the data handling utility 80. According to the intelligent mode of operation 51, an initial stage comprises system activation 86; the feeder mechanism is then operated at stage 87 to advance the camera unit at stage 88. Raw video/image data is generated at stage 89 with such data being transferred to the data handling utility to obtain output processed image data at stage 90. Aspects of the data handling utility 80 are adapted to identify and categorise defects at stage 91. If no defect is identified the data is stored at stage 92 (within a storage unit 1) and the camera unit advanced at stage 96. If a defect is detected at stage 91, the image data is stored at a storage unit 2 at stage 93. The camera unit 10 is then maintained stationary at stage 94 and further image data of the defect is collected at stage 95 for subsequent processing, storage and analysis. The camera unit is then advanced at stage 96. If the end of the tube is identified by the camera unit 10 at stage 97, the camera unit stops 98, is extracted from the tube 99 and is returned to the start of the tube 100 via axially withdrawing the camera unit 10 via elongate member 17 and feeder mechanism 78. The system may then be stopped and the camera unit 10 extracted from the tube at stage 101. As indicated, control unit 69 is configured to control at least one of, or all of, the camera unit 10, feeder mechanism 78 and data handling utility.
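The intelligent-mode loop (stages 86 to 101) can be sketched with the hardware actions injected as callables. Every interface here is a hypothetical placeholder for the corresponding system component (feeder mechanism, camera unit, storage units 1 and 2); this is a control-flow sketch, not the patented implementation.

```python
def intelligent_scan(advance, capture, detect, store_clean, store_defect,
                     collect_extra, at_end):
    """One pass of the intelligent mode: advance, capture, branch on
    defect detection, storing clean frames and defect frames separately
    and collecting extra data while the camera is held at a defect."""
    while not at_end():                  # stage 97: end of tube reached?
        frame = capture()                # stages 89-90: acquire/process
        if detect(frame):                # stage 91: defect identified?
            store_defect(frame)          # stage 93: storage unit 2
            collect_extra()              # stages 94-95: camera stationary,
                                         # pulsed additional inspection data
        else:
            store_clean(frame)           # stage 92: storage unit 1
        advance()                        # stage 96: advance camera unit
```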
As indicated, the apparatus and system is initially configured for defect differentiation based on defect type (positive class feature and negative class feature as discussed referring to figure 14). If a defect is identified that is unclassified, the system is adapted to issue a notification to an operator and/or system administrator that a system update and/or additional learning may be necessary. Such an event may then require a manual intervention or a separate automated system update in which the system is subject to further calibration and learning via the initial first stage set up process in which the CNN is operated specifically for system learning described referring to figure 12. Once the system is enabled for identification and categorisation of the new, previously unclassified defect, the data processing utility (CNN) and associated libraries are updated and the system may continue in the second stage or use/operational mode to acquire image inspection data via the camera unit according to the automated intelligent mode of operation.
Figures 10A and 10B illustrate further specific features, components and functionality of the inspection system 38. In particular, camera unit 10 comprises a ‘frame grabber' 103 to obtain static images (of the tube internal surface 21a) from the live video stream generated by camera unit 10. The data handling utility 80 when operative comprises a plurality of stages for data acquisition, processing and output according to collective components 117. Initially, a start command is received 104. Image capture and data acquisition 81 are then initiated having sequential stages involving a frame capturing subroutine 105, frame display and recording 106, single frame analysis 107 and image processing 108. If the end of the pipe is identified at stage 109, a pulse generation subroutine is initiated 110. The camera unit is then reversed or stopped at stage 102. A subroutine to convert video to image data is then initiated 111. As part of the data handling utility 80, a data input facility is provided 112 having data input fields including heat number 113, lot number 114, pipe grade 115 and pipe ID/OD 116.
The present pipe inspection system may be configured for two different types of system learning. In supervised learning mode (described referring to figure 12), the software and system learning utility 83 are configured and constructed such that the software is provided with all types of internal surface defects (i.e. pits, projections, score marks, tears, etc.) identifiable as part of the inspection process. Once initially configured, the software and system learning utility 83 are suitable for the inspection process to identify and label defects. Optionally, in an alternative unsupervised learning mode, the software and system learning utility 83 are initially configured with no or little initial data input regarding the type of defects as mentioned. Accordingly, in this mode, the system via the software and system learning utility 83 is configured to learn entirely from the collection of inspection data acquired during use. The primary implementation described herein is focussed on the first mode of operation described referring to figure 12 involving complete system learning as an initial configuration set-up.
The data processing utility 82 involves processing initialisation/import packages at stage 118 and generating and potentially outputting system learning data via system learning utility 83 at stage 119. If a defect is detected at stage 120, the pulse generation subroutine is activated at stage 121 followed by activation of a defect labelling subroutine 122 to assign labels to the processed data associated with the presence of a defect within a particular image or set of images. A visualisation subroutine is then initiated at stage 123 for potential output and storage of the defect containing data. As part of the data handling functionality 117, a report generation utility 84 comprises generating a single image of the entire length of a pipe 132; defect characterisation 133; the location of a defect 134; the generation of a spreadsheet 135; and the generation and output of schematic illustrations such as graphs, pie charts, etc. 136 of the status and characteristics of the pipe surface 21a. The data handling functionality 117 further comprises a post data processing utility 125 configured to assign a characteristic to a defect that may include an area value, a length, a width, a location, a defect type etc. at stage 126. The identification of a hazardous (negative class feature) defect 127 is followed by initiation of pulse generation subroutine 128. If a defect is categorised as significant/hazardous (according to predefined criteria), a notification to the operator is generated 129 with the subsequent storage of the defect image at stage 130.
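The spreadsheet generation step of the report generation utility can be illustrated as writing one row of characteristics per defect. The field names and units are assumptions chosen for the sketch, not taken from the patent.

```python
import csv
import io

def write_defect_report(defects):
    """Write defect characteristics to CSV text, one row per defect,
    mirroring the spreadsheet-generation step 135 (illustrative sketch;
    field names are hypothetical)."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=["type", "location_mm", "area_mm2"])
    writer.writeheader()
    for defect in defects:
        writer.writerow(defect)
    return buf.getvalue()
```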
Figure 11 illustrates processed static images of tube surface 21a as generated by the inspection system 38 in which axial sections of the surface 21a are appropriately illuminated by the light source 12 and footage captured by camera 11. Figures 11A to 11E illustrate regions of tube surface 21a that include defects whilst image 11F represents a section of the tube surface 21a that is defect free. The present system is advantageous to enhance image quality to greatly facilitate data processing and accordingly the identification of defects of a variety of different types, shapes, sizes and magnitudes. In particular, figure 11A illustrates pits and/or projections as discrete black dots 150 within an otherwise smooth internal surface 21a. Figures 11B and 11C illustrate tears 151 and 152. Figure 11D illustrates the presence of a watermark 153 and figure 11E illustrates the presence of axial scoring 154 at surface 21a.
The present tube inspection system 38, via the components and functionality as described, is configured to capture video data of a tube surface 21a, to process the raw data and to identify and classify defects. The present system involves generating feedback signals to dynamically control the movement and operation of the camera unit 10, being sensitive to the presence of a defect at the surface. Preferably, camera 11 is formed as an endoscopic camera. The present system 38 may find application for defect detection within internal combustion engines, turbochargers, turbine pipe networks, automotive valves, heat exchanger pipes and pipes for transporting utilities such as water, oil and gas. The present system is capable of displaying live video and/or images of the tube surface 21a as the camera unit 10 is advanced within the tube 21, in parallel with raw image data processing according to manual, semi-automated or fully automated implementation. The video footage, as described, is capable of being dynamically processed via the control unit 69 and data handling utility 80 to store and output captured data. The present system via the control panel 28 may be pre-programmed so as to be capable of automated/dynamic or manual control for a variety of different modes of operation. In particular, an automated or semi-automated tube inspection system is provided for the inspection of internal surfaces of an array of pipes in which the camera unit 10 may be automatically advanced within the pipes during data handling (acquisition and processing). The present system is advantageous in requiring minimal installation set-up and operation space and in significantly reducing the time and effort required by operating personnel to inspect tube internal surfaces 21a.
The present tube inspection system is adapted specifically to categorise and differentiate between anomalies at the surface of the tube, such anomalies including defects considered as 'hazardous' such as pits, scoring, cracks etc., and those anomalies that may be regarded as 'non-hazardous' or non-textural, which include features identified within an image of the tube surface corresponding to parts of the camera unit or apparatus positioned at the tube, as well as other features within the image such as watermarks, shadows, discontinuities in coloration of the surface etc. Within the present system, such features/defects are categorised and classified according to two fundamental classes, being a positive class and a negative class. Figure 14 is an image of an internal surface of a tube captured using the apparatus of figures 1 to 6. An image of the type of figure 14 is subsequently processed using the image processing method and system as described herein. According to the present system, a 'positive class feature/defect' corresponds to background features that include reflector A used to reflect the light back to the camera unit 10, a shadow C formed behind the reflector A, a shaft B used to mount the reflector A at the camera unit 10 and the general defect-free regions D of the tube surface. As will be appreciated, a diameter of reflector A may be dependent upon the internal diameter of the tubes being inspected. In certain aspects, the diameter of reflector A is very close to the internal diameter of the tube, generating negligible or zero shadow formation behind the reflector. However, where the reflector A diameter is less than the internal diameter of the tube, significant shadow formation is encountered. The present system is adapted to compensate for such features. Additionally, the surface texture of the pipe resulting from changes in coloration etc., along the pipe length may be encountered by the inspection system.
In particular, variation of surface coloration or surface texturing may be dependent on the grade, dimensions etc., of the tube. Such positive class features would accordingly not give rise to concern and can be ignored for the purposes of output defect identification and categorisation as a result of surface inspection. The present system is configured to differentiate between positive and negative class features and to preferentially process negative class defects as part of the automated inspection system process.
A 'negative class feature/defect', class E, is a defect categorisation that requires identification by the present system. The present system is adapted for identifying multiple types of negative class defects, including tens and up to hundreds or even thousands of very specific defect types, based on the nature of the defect being, for example, a tear, a pit, surface roughening, scoring, or a longitudinally or transversely (including perpendicularly) orientated defect such as a crack or slit.
As indicated, the present system comprises a data processing utility preferably implemented as a convolutional neural network (CNN). The CNN is adapted to receive and process inspection image data, with such inspection image data including positive class and negative class features. Referring to figures 12, 13A and 13B, and utilising the apparatus of figures 1 to 6 including processing inspection image data of the type of figures 11A to 11D and figure 14, an inspection system is provided using the general components and function as described in figures 7 to 10B. According to a specific implementation, a pre-configuration, calibration or system learning phase is illustrated and described referring to figure 12. General operation ('use') and function of the system in use is then described and illustrated referring to figures 13A and 13B.
Referring to figure 12, utilising a data pre-processing utility 203, libraries are loaded into the system at stage 200. Such libraries may utilise open source software adapted for pre-processing inspection image data in which such data may be cropped, scaled, augmented, blurred etc., at stage 201. In particular, according to the specific implementation, patches of 64 x 64 pixels are cropped from different regions of an image representing positive and negative classes. Such patches are labelled at stage 202 and then stored in a desired folder. The present system comprises a feature learning utility 205 that is operative with a machine learning graph at stage 204 utilising the CNN. The CNN is a feature learning utility that, via an iterative data re-processing loop, is adapted to learn and differentiate positive and negative class features and to further accurately predict negative class defect classification for a given unknown image or patch.
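The patch extraction of stages 201 to 202 may be sketched as follows. This is a purely illustrative NumPy sketch, not the code of the present system; the non-overlapping 64-pixel stride and grayscale input are assumptions.

```python
import numpy as np

def crop_patches(image, patch=64, stride=64):
    """Crop 64 x 64 patches from a grayscale frame (illustrative only).

    Returns the stacked patches and their top-left (x, y) coordinates,
    mirroring the cropping and labelling described at stages 201-202.
    """
    h, w = image.shape[:2]
    patches, coords = [], []
    for y in range(0, h - patch + 1, stride):
        for x in range(0, w - patch + 1, stride):
            patches.append(image[y:y + patch, x:x + patch])
            coords.append((x, y))
    return np.stack(patches), coords

# A synthetic 128 x 256 frame yields 2 rows x 4 columns = 8 patches.
frame = np.zeros((128, 256), dtype=np.uint8)
patches, coords = crop_patches(frame)
print(patches.shape)  # (8, 64, 64)
```

Each patch would then be labelled as positive or negative class and stored, as described at stage 202.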
The present CNN is designed specifically to be 'light' with regard to mathematical computation so as to exhibit enhanced processing speed. Accordingly, the present system is adapted specifically for online, real-time tube inspection and analysis without the need for highly sophisticated processing engines. This enhanced processing speed is achieved, in part, by the configuration of the CNN with a minimum number of feature extraction layers. According to a preferred embodiment, the system comprises four feature extraction layers. Preferably an initial layer comprises 16 filters. Each subsequent layer may then comprise a progressively increasing number of filters, being 32, 64 and 128 respectively over the four layers. A model compiling utility 207 is provided to optimise data processing, with such optimisation comprising an 'optimizer', typically referred to as a gradient descent algorithm, to enhance the speed of learning. Whilst the present CNN is described comprising four feature extraction layers, the system is configurable, as will be appreciated, to comprise any number of feature extraction layers, for example two to a hundred layers, with each layer comprising different numbers of filters. The number of filters within each layer may be the same or may increase uniformly or non-uniformly from the initial layer to the final processing layer. As will be appreciated, the filters may be the same or may be different at each layer.
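The 'lightness' of the four-layer design can be illustrated by estimating its parameter count. The 3 x 3 kernels and 2 x 2 pooling used below are assumptions made for the sketch; the description specifies only the layer and filter counts (16, 32, 64, 128).

```python
# Estimate weights in a hypothetical four-layer stack of 3x3 convolutions
# with filter counts 16, 32, 64, 128 and an assumed 2x2 pooling per layer.

def conv_params(in_ch, out_ch, k=3):
    # kernel weights plus one bias per output filter
    return out_ch * (in_ch * k * k + 1)

filters = [16, 32, 64, 128]
in_ch, size, total = 1, 64, 0        # 64 x 64 single-channel patch input
for out_ch in filters:
    total += conv_params(in_ch, out_ch)
    size //= 2                       # assumed 2x2 max-pooling per layer
    in_ch = out_ch

print(total, size)  # 97152 4 -> under 100k weights, 4 x 4 final maps
```

Under these assumptions the feature extractor holds well under 100,000 weights, consistent with the stated aim of real-time inspection without sophisticated processing engines.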
In operation, the pre-processed inspection image data generated by the data pre-processing utility is fed to the feature learning utility 205 to be processed by the CNN at stage 204.
The labelled data fed to the CNN is then analysed by applying the layers and filters to extract features according to each positive and negative class. The accuracy of the CNN is checked for every class at stage 208. A predetermined accuracy threshold of 99% is input into the system. Any accuracy of less than 99% involves a looped re-processing of data by the CNN at stage 204. Such re-processing comprises changing the model filter weights at stage 210. Accordingly, the present system is self-training and self-learning to achieve the desired accuracy. As part of the system learning function described in stages 200 to 208, a reference data library is utilised. Such a reference data library includes pre-categorised inspection image data in which positive class features and negative class defects within the patches of the data are assigned identifications/labels. The reference library may be initially constructed and/or may be enhanced and built up during the operation of stages 200 to 208. A final stage 209 comprises saving the determined model filter weights once a defect identification accuracy of greater than 99% is obtained.
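The looped re-processing of stages 204 to 210 can be sketched as follows, with a stub in place of the actual CNN training pass; `train_until_accurate`, `fake_pass` and its improving accuracies are hypothetical stand-ins, not the implementation of the present system.

```python
# Minimal sketch of the stage 204-210 loop: re-process (re-train) and
# adjust model filter weights until every class reaches the 99% accuracy
# threshold of stage 208, then save the weights at stage 209.

THRESHOLD = 0.99

def train_until_accurate(train_one_pass, max_epochs=100):
    for epoch in range(1, max_epochs + 1):
        per_class = train_one_pass()                 # stages 204/210
        if min(per_class.values()) >= THRESHOLD:     # stage 208 check
            return epoch                             # stage 209: save weights
    raise RuntimeError("accuracy threshold not reached")

# Stub training pass whose accuracy improves by 1% per epoch from 95%.
state = {"acc": 95}
def fake_pass():
    state["acc"] = min(100, state["acc"] + 1)
    return {"positive": state["acc"] / 100, "negative": state["acc"] / 100}

epochs = train_until_accurate(fake_pass)
print(epochs)  # 4
```

The per-class minimum mirrors the requirement that the accuracy be checked "for every class" before the weights are accepted.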
Referring to figures 13A and 13B, live inspection image data is obtained from a camera accessing utility 216. The frames from the camera accessing utility 216 are read by the system at stage 212. Real-time defect identification and scanning is integral to the present system. The real-time/live scanning and image processing is achieved using a 'sliding window approach'. In such an approach, a window of size 64 x 64 slides over an image, crops patches and feeds the inspection image data as pre-processed data to the CNN model. In particular, as part of the data pre-processing, a reflector identification utility 217 is operative. Such a utility is advantageous to enhance the speed of data processing and the data handling efficiency of the present system. Since sliding patches of data at the region of the reflector A are of no use for defect identification, it is advantageous to avoid processing such patches. Accordingly, at stage 213, a circle (cloak) is fitted over reflector A using a Canny edge detection and Hough circle transformation technique. The coordinates of each sliding window 'patch' are extracted at stage 215. A check is then undertaken at stage 216 to identify whether any of the four coordinates are located within the cloak circle (of stage 213).
If they are, the system is re-looped to re-sample at stage 214. If not, the processed data is then input into the machine learning graph (comprising the CNN) at stage 217. The model filter weights are loaded into the CNN at stage 218 with such filter weights being obtained from the initial system learning function and configuration of figure 12 as described.
Accordingly, due to the application of the reflector cloak at stage 213 and the subsequent sampling, extraction and checking stages 214 to 216, the present system is optimised for processing speed, focusing defect identification and categorisation only on the region of the inspection image inside the peripheral boundary of the data and specifically excluding the region covered by the reflector. As described referring to figure 12, essential libraries are loaded into the system at stage 211 as an initial stage.
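The stage 216 corner check can be sketched as follows. In the system described, the cloak circle itself is obtained at stage 213 via Canny edge detection and a Hough circle transform (for example OpenCV's cv2.Canny and cv2.HoughCircles); here the circle parameters are simply assumed, and the function is a pure-Python illustration rather than the actual implementation.

```python
# Stage 216 sketch: skip any sliding-window patch whose corners fall
# inside the reflector 'cloak' circle fitted at stage 213.

def patch_in_cloak(x, y, size, cx, cy, r):
    """True if any corner of the size x size patch at (x, y) lies inside
    the cloak circle (cx, cy, r), i.e. the patch overlaps the reflector."""
    corners = [(x, y), (x + size, y), (x, y + size), (x + size, y + size)]
    return any((px - cx) ** 2 + (py - cy) ** 2 <= r ** 2
               for px, py in corners)

# Assumed reflector cloak at image centre (320, 240), radius 100 px.
print(patch_in_cloak(300, 220, 64, 320, 240, 100))  # True  -> re-sample
print(patch_in_cloak(0, 0, 64, 320, 240, 100))      # False -> process
```

A True result corresponds to re-looping to re-sample at stage 214; a False result lets the patch pass to the machine learning graph at stage 217.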
Figure 15 illustrates inspection image data output from the CNN as predicted defect data at stage 219. As indicated, such data involves the division of image data into patches of 64 x 64 pixels that are cropped from different regions of the overall image. Such patches are scaled, augmented and labelled, as described, with such data being obtained and processed in real-time using the sliding window approach. A patch 250 containing a positive class feature, such as a watermark, may be ignored by the CNN for the purposes of outputting a predicted defect at stage 219. However, if a 64 x 64 patch 253 contains a negative class defect such as a spot or pit 252, a predicted defect is output from stage 219. This generates a pulse at stage 220 to stop the camera unit 10. The pulse at stage 220 functions to trigger the collection of additional inspection data at the region proximal to the identified negative class defect. This additional inspection data is useful to provide further data surrounding a defect and to increase the reliability and accuracy of the present system for defect identification and categorisation.
Referring to figure 13B, subsequent to the processing by the CNN and during a post-processing stage, a boundary box is created around the defect at stage 221, with the name of the defect added/tagged to the patch data at stage 222, followed by defect extraction at stage 223. Such steps are controlled by a labelling utility to label defect 252 within a patch 253 at a specific sub-location (within a region) at the surface of the tube. A post-processing utility 226 at stage 225 identifies the surface area and location of the defect within the tube. Such a location may be represented by coordinates to accurately identify the precise position of a defect at an internal surface. If a defect is regarded as a negative class type and therefore hazardous at stage 227, a notification is issued to an operator at stage 228 using a notification utility 229. The detected defect image is stored at stage 230 and the inspection ended at stage 231. A report generation utility 235 is operative to output predetermined categorisations of data output by the CNN and the post-processing utility. Such output data may include a number of defects, a label of a defect type, a grade of a tube/pipe, and a bar chart/pie chart of defects (e.g. defect types, size, position, orientation etc.). The system continues at stage 234 involving reading of the next frame from the camera unit 10, with the system repeating the steps as described within stages 221 to 234 to provide an intelligent, autonomous and automated tube inspection system. As part of the defect categorisation and analysis, during a post-processing phase, threshold techniques are used to convert the data within a patch 253 to a binary image. From such a binarized image (black and white image) the area of the defect may be identified. As will be appreciated, the present system is capable of surface analysis of tubes, pipes and other cylindrical-type objects.
In particular, the present system is adapted for inspecting internal tube and external tube surfaces as desired.
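The thresholding step used to obtain a defect area from a binarized patch might be sketched as follows. The dark-defect assumption and the fixed threshold of 128 are illustrative only; a real implementation could equally use OpenCV primitives such as cv2.threshold and cv2.contourArea.

```python
import numpy as np

# Post-processing sketch: threshold a 64 x 64 patch to a binary (black and
# white) image and take the defect area as the count of defect pixels.

def defect_area(patch, thresh=128):
    binary = patch < thresh      # dark pixels assumed to be the defect
    return int(binary.sum())     # area in pixels

patch = np.full((64, 64), 200, dtype=np.uint8)   # bright, defect-free field
patch[10:14, 10:16] = 30                         # synthetic 4 x 6 dark pit
print(defect_area(patch))  # 24
```

The pixel count could then be converted to a physical surface area given the camera's known magnification at the tube wall.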

Claims
1. A method of inspecting a surface of a tube comprising:
pre-processing inspection image data of a region of a surface of a tube obtained from a camera unit to generate pre-processed image data comprising a plurality of patches of data representing sub-locations within the region of the surface;
inputting the pre-processed image data into a data processing utility and processing the data using feature extraction layers having filters to identify and categorise any defects within the sub-locations;
post-processing the data of the patches output by the data processing utility to assign at least one characteristic to the data and/or the defect; and
outputting data and/or at least one notification based on the identified and categorised defect.
2. The method as claimed in claim 1 wherein the step of identifying and categorising the defects comprises categorising a defect as:
• a positive class feature that represents at least one feature within the image data of the surface associated directly or indirectly with the camera unit and/or a non-hazardous non-textural defect at the surface; or
• a negative class feature that represents at least one feature within the image data of the surface that is hazardous.
3. The method as claimed in claim 2 wherein the negative class feature comprises any one or a combination of:
• a pit;
• a scratch;
• a scar;
• a tear;
• a split;
• a groove;
• a hole;
• an indentation or depression;
• a region of surface roughening, relative to non-surface roughened regions of the surface; or
• surface imperfections, solid contaminants or manufacturing residue present on the surface.
4. The method as claimed in claims 2 or 3 wherein the non-hazardous non-textural defect comprises any one or a combination of:
• a water mark;
• a stain;
• a discontinuation in a colouration of the surface.
5. The method as claimed in any one of claims 2 to 4 wherein the at least one feature within the image data of the surface associated directly or indirectly with the camera unit comprises a part of the image data representing:
• a light reflector of the camera unit;
• a shadow created by a part of the camera unit;
• a shaft mounting a light reflector of the camera unit;
• a defect-free portion of the surface.
6. The method as claimed in any preceding claim wherein the step of processing the data using the feature extraction layers comprises applying a series of feature extraction layers to the pre-processed image data, each of the layers in order of application having an increasing number of filters.
7. The method as claimed in claim 6 comprising two to six feature extraction layers.
8. The method as claimed in claim 7 wherein at least one of the feature extraction layers comprises 10 to 20 filters, at least a second feature extraction layer comprises 20 to 40 filters, at least a third feature extraction layer comprises 40 to 80 filters and at least a fourth feature extraction layer comprises 80 to 180 filters.
9. The method as claimed in any preceding claim wherein the processing of the data using the data processing utility comprises utilising weightings assigned to the filters of the feature extraction layers.
10. The method as claimed in any preceding claim wherein the step of processing the data using the data processing utility comprises initiating a defect identification utility to control the collection of additional inspection data of the surface at and/or proximate to a location of said defect.
11. The method as claimed in any preceding claim wherein the at least one characteristic comprises any one or a combination of:
• a label;
• an area of a defect;
• a type of defect;
• a location of a defect within the surface as location coordinates at the tube.
12. The method as claimed in any preceding claim wherein the pre-processing of the inspection image data comprises applying at least one cloak to at least a part of the inspection image data to prevent the processing of the data covered by the cloak by the data processing utility.
13. The method as claimed in any preceding claim wherein the step of pre-processing the inspection image data comprises any one or a combination of image cropping, scaling, augmentation, blurring, labelling and storing the pre-processed data.
14. The method as claimed in any preceding claim wherein the step of post processing the data comprises applying at least one label to the data output by the data processing utility.
15. The method as claimed in claim 14 wherein the step of applying the label comprises inserting a boundary around a defect identified within a sub-location.
16. The method as claimed in any preceding claim wherein the step of post processing the data comprises generating a binarized black and white image of a defect.
17. The method as claimed in any preceding claim wherein the step of outputting data comprises outputting any one or a combination of:
• a number of defects within the surface of the tube;
• a label assigned to a defect;
• a grade assigned to tube and/or a defect;
• a chart or graph based on defects identified and categorised by the data
processing utility;
• at least one image of a defect obtained by the data processing utility;
• a surface area value of a defect;
• a 2D or 3D map of the surface containing the identified and categorised
defects.
18. A method of creating a system for the inspection of a surface of a tube comprising:
pre-processing inspection image data of a region of a surface of a tube obtained from a camera unit to generate pre-processed image data comprising a plurality of patches of data representing sub-locations within the region of the surface;
inputting the pre-processed image data into a data processing utility and processing the data using feature extraction layers and filters to identify and categorise any defects within the sub-locations;
comparing data output by the data processing utility with a reference library to determine an accuracy of the defect identification and/or categorisation; and
adjusting parameters associated with the layers and/or filters and pre-processing the data by the data processing utility until an accuracy of the defect identification and/or categorisation is at or above a pre-determined threshold.
19. The method as claimed in claim 18 wherein the step of identifying and categorising the defects comprises categorising a defect as:
• a positive class feature that represents at least one feature within the image data of the surface associated directly or indirectly with the camera unit and/or a non-hazardous non-textural defect at the surface; or
• a negative class feature that represents at least one feature within the image data of the surface that is hazardous.
20. The method as claimed in claim 19 wherein the negative class feature comprises any one or a combination of:
• a pit;
• a scratch;
• a scar;
• a tear;
• a split;
• a groove;
• a hole;
• an indentation or depression;
• a region of surface roughening, relative to non-surface roughened regions of the surface.
21. The method as claimed in claims 19 or 20 wherein the non-hazardous non-textural defect comprises any one or a combination of:
• a water mark;
• a stain;
• a discontinuation in a colouration of the surface.
22. The method as claimed in any one of claims 19 to 21 wherein the at least one feature within the image data of the surface associated directly or indirectly with the camera unit comprises a part of the image data representing:
• a light reflector of the camera unit;
• a shadow created by a part of the camera unit;
• a shaft mounting a light reflector of the camera unit;
• a defect-free portion of the surface.
23. The method as claimed in any one of claims 19 to 22 wherein the step of processing the data using the feature extraction layers comprises applying a series of feature extraction layers to the pre-processed image data, each of the layers in order of application having an increasing number of filters.
24. The method as claimed in claim 23 comprising two to six feature extraction layers.
25. The method as claimed in claim 24 wherein at least one of the feature extraction layers comprises 10 to 20 filters, at least a second feature extraction layer comprises 20 to 40 filters, at least a third feature extraction layer comprises 40 to 80 filters and at least a fourth feature extraction layer comprises 80 to 180 filters.
26. The method as claimed in any one of claims 19 to 25 wherein the processing of the data using the data processing utility comprises utilising weightings assigned to the filters of the feature extraction layers.
27. The method as claimed in any one of claims 19 to 26 wherein the step of processing the data using the data processing utility comprises utilising a gradient descent utility to optimise the identification and categorisation of defects.
28. The method as claimed in any one of claims 19 to 27 wherein the at least one characteristic comprises any one or a combination of:
• a label;
• an area of a defect;
• a type of defect;
• a location of a defect within the surface as location coordinates at the tube.
29. The method as claimed in any one of claims 19 to 28 wherein the pre-processing of the inspection image data comprises applying at least one cloak to at least a part of the inspection image data to prevent the processing of the data covered by the cloak by the data processing utility.
30. The method as claimed in any one of claims 19 to 29 wherein the step of pre-processing the inspection image data comprises any one or a combination of image cropping, scaling, augmentation, blurring, labelling and storing the pre-processed data.
31. A tube inspection system comprising:
a data pre-processing utility to process inspection data of a region of a surface of a tube obtained from a camera unit to generate pre-processed image data comprising a plurality of patches of data representing sub-locations within the region of the surface;
a data processing utility to process the pre-processed image data, the data processing utility utilising feature extraction layers having filters to identify and categorise any defects within the sub-locations;
a post processing utility for the post-processing of the data of the patches output by the data processing utility to assign at least one characteristic to the data and/or the defect;
wherein the system is configured to output at least one notification or output data based on the identification and categorisation of defects.
32. A tube inspection system comprising:
a data pre-processing utility to process inspection data of a region of a surface of a tube obtained from a camera unit to generate pre-processed image data comprising a plurality of patches of data representing sub-locations within the region of the surface;
a data processing utility to process the pre-processed image data, the data processing utility utilising feature extraction layers having filters to identify and categorise any defects within the sub-locations;
a reference library used by the data processing utility as a comparison with the data output by the data processing utility to determine an accuracy of the defect identification and/or categorisation; and
a model compiling utility to adjust parameters associated with the layers and/or filters and to control and action any re-processing of data by the data processing utility until an accuracy of the defect identification and/or categorisation is at or above a pre-determined threshold.
33. Use of a tube inspection system of claims 31 or 32 to inspect a surface of a tube for defect identification and categorisation.
PCT/EP2019/062892 2018-05-18 2019-05-17 Tube inspection system WO2019219955A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
EP18173333 2018-05-18
EP18173333.8 2018-05-18

Publications (1)

Publication Number Publication Date
WO2019219955A1 true WO2019219955A1 (en) 2019-11-21

Family

ID=62235809

Family Applications (2)

Application Number Title Priority Date Filing Date
PCT/EP2019/062892 WO2019219955A1 (en) 2018-05-18 2019-05-17 Tube inspection system
PCT/EP2019/062893 WO2019219956A1 (en) 2018-05-18 2019-05-17 Tube inspection system

Family Applications After (1)

Application Number Title Priority Date Filing Date
PCT/EP2019/062893 WO2019219956A1 (en) 2018-05-18 2019-05-17 Tube inspection system

Country Status (1)

Country Link
WO (2) WO2019219955A1 (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2021182600A1 (en) * 2020-03-12 2021-09-16 Hoya株式会社 Information processing device, inspection system, program, and information processing method
DE102022103844B3 (en) 2022-02-17 2023-06-22 Synsor.ai GmbH Method for optimizing a production process based on visual information and device for carrying out the method

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116705642B (en) * 2023-08-02 2024-01-19 西安邮电大学 Method and system for detecting silver plating defect of semiconductor lead frame and electronic equipment

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2001007819A (en) 1999-06-25 2001-01-12 Sony Corp Data transmission method and switching node device
GB2468301A (en) 2009-03-03 2010-09-08 Jd7 Ltd Water mains inspection and servicing system
KR101772916B1 (en) * 2016-12-30 2017-08-31 한양대학교 에리카산학협력단 Device for measuring crack width of concretestructure
US20170323163A1 (en) * 2016-05-06 2017-11-09 City Of Long Beach Sewer pipe inspection and diagnostic system and method

Family Cites Families (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
NL7501009A (en) * 1975-01-29 1976-08-02 Skf Ind Trading & Dev DEVICE FOR AUTOMATIC DETECTION OF SURFACE ERRORS.
JP3310025B2 (en) * 1992-09-30 2002-07-29 バブコック日立株式会社 Pipe inner surface inspection device with CCD camera
JPH08136464A (en) * 1994-11-02 1996-05-31 Japan Steel Works Ltd:The Cylinder inner surface inspection method and cylinder inner surface inspection device
JP2007057305A (en) * 2005-08-23 2007-03-08 Mitsubishi Electric Engineering Co Ltd Internal inspection device of cylinder
CN102175694A (en) * 2006-05-23 2011-09-07 麒麟工程技术系统公司 Surface inspection device
JP2008241650A (en) * 2007-03-29 2008-10-09 Hitachi Industrial Equipment Systems Co Ltd Device for inspecting defect
JP2011141122A (en) * 2010-01-05 2011-07-21 Kyodo Printing Co Ltd Inside inspection device of cylindrical body and its inside inspection method
JP5742655B2 (en) * 2011-01-14 2015-07-01 新日鐵住金株式会社 Defect detection apparatus and defect detection method
KR101679650B1 (en) * 2014-11-07 2016-11-25 부산대학교 산학협력단 Method for detecting defect of hole inside
JP6922166B2 (en) * 2016-07-14 2021-08-18 日本製鉄株式会社 Cylindrical inner surface observation device, cylindrical inner surface observation method, cylindrical inner surface inspection device and cylindrical inner surface inspection method


Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
CHENG JACK C P ET AL: "Automated detection of sewer pipe defects in closed-circuit television images using deep learning techniques", AUTOMATION IN CONSTRUCTION, ELSEVIER, AMSTERDAM, NL, vol. 95, 27 August 2018 (2018-08-27), pages 155 - 171, XP085467903, ISSN: 0926-5805, DOI: 10.1016/J.AUTCON.2018.08.006 *
SRINATH S. KUMAR ET AL: "Automated defect classification in sewer closed circuit television inspections using deep convolutional neural networks", AUTOMATION IN CONSTRUCTION, vol. 91, 1 July 2018 (2018-07-01), AMSTERDAM, NL, pages 273 - 283, XP055607521, ISSN: 0926-5805, DOI: 10.1016/j.autcon.2018.03.028 *

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2021182600A1 (en) * 2020-03-12 2021-09-16 Hoya Corporation Information processing device, inspection system, program, and information processing method
JPWO2021182600A1 (en) * 2020-03-12 2021-09-16
JP7313542B2 (en) 2020-03-12 2023-07-24 Hoya Corporation Information processing device, inspection system, program and information processing method
DE102022103844B3 (en) 2022-02-17 2023-06-22 Synsor.ai GmbH Method for optimizing a production process based on visual information and device for carrying out the method

Also Published As

Publication number Publication date
WO2019219956A1 (en) 2019-11-21

Similar Documents

Publication Publication Date Title
WO2019219955A1 (en) Tube inspection system
CN112088387B (en) System and method for detecting defects in an imaged article
US9471057B2 (en) Method and system for position control based on automated defect detection feedback
CN111415329B (en) Workpiece surface defect detection method based on deep learning
EP3537136A1 (en) Image inspecting apparatus, image inspecting method and image inspecting program
CN202083644U (en) Steel tube inner wall test system based on panoramic imaging technology
EP4285337A1 (en) System and method for manufacturing quality control using automated visual inspection
JP2007292699A (en) Surface inspection method of member
US6382510B1 (en) Automatic inspection system using barcode localization and method thereof
CN116441190A (en) Longan detection system, method, equipment and storage medium
Wang et al. Design of machine vision applications in detection of defects in high-speed bar copper
CN112881403A (en) Hot rolling strip steel surface defect detection device
CN115984593A (en) Cigarette filter stick defect detection method, device, equipment and storage medium
US20220335254A1 (en) Computer vision inferencing for non-destructive testing
US20230138331A1 (en) Motion in images used in a visual inspection process
Chauhan et al. Effect of illumination techniques on machine vision inspection for automated assembly machines
CN116420068A (en) Detection based on automatic inspection plan
CN209247650U (en) Product quality on-line measuring device
JP2009236593A (en) Visual inspection supporting device
Yousef et al. Innovative inspection device for investment casting foundries
TWI793883B (en) Defect detection method and system for wire rod coating
JP2007033327A (en) Flaw detecting method and flaw detector
US20220270212A1 (en) Methods for improving optical inspection and metrology image quality using chip design data
JP2018048981A (en) Pipe material inspection device
WO2023218441A1 (en) Optimizing a reference group for visual inspection

Legal Events

Date Code Title Description

121 Ep: The EPO has been informed by WIPO that EP was designated in this application
    Ref document number: 19723831
    Country of ref document: EP
    Kind code of ref document: A1

NENP Non-entry into the national phase
    Ref country code: DE

122 Ep: PCT application non-entry in European phase
    Ref document number: 19723831
    Country of ref document: EP
    Kind code of ref document: A1