WO2014067685A1 - A method for simplifying defect analysis - Google Patents


Info

Publication number
WO2014067685A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
images
cylindrical body
pipeline
representation
Application number
PCT/EP2013/066949
Other languages
French (fr)
Inventor
Amin NASR
Erwann Houzay
Sébastien Guillon
William Gilmour
Original Assignee
Total Sa
Chevron U.S.A. Inc.
Application filed by Total Sa and Chevron U.S.A. Inc.
Publication of WO2014067685A1


Classifications

    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D 1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D 1/04 Control of altitude or depth
    • G05D 1/06 Rate of change of altitude or depth
    • G05D 1/0692 Rate of change of altitude or depth specially adapted for under-water vehicles
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 3/00 Geometric image transformations in the plane of the image
    • G06T 3/40 Scaling of whole images or parts thereof, e.g. expanding or contracting
    • G06T 3/4038 Image mosaicing, e.g. composing plane images from plane sub-images
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/30 Determination of transform parameters for the alignment of images, i.e. image registration
    • G06T 7/33 Determination of transform parameters for the alignment of images, i.e. image registration, using feature-based methods
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/10 Image acquisition modality
    • G06T 2207/10004 Still image; Photographic image

Definitions

  • Figure 1 is a representation of an AUV in survey mode along a pipeline according to a possible embodiment of the invention;
  • Figures 2a to 2e are illustrations of images taken by an AUV during a survey according to a possible embodiment of the invention;
  • Figure 3 is a flow chart describing a possible embodiment for controlling navigation of a subsea vehicle according to a possible embodiment of the invention;
  • Figure 4 is an illustration of detection of underwater features according to a possible embodiment of the invention;
  • Figure 5 is a flow chart describing a possible embodiment for improving localization of an underwater vehicle according to a possible embodiment of the invention;
  • Figure 6a is an illustration of a sample image taken by an AUV during a survey according to a possible embodiment of the invention;
  • Figure 6b is an illustration of a possible deformation of a sample image taken by an AUV during a survey according to a possible embodiment of the invention;
  • Figure 7 is an illustration of a possible combination of sample images taken by an AUV during a survey according to a possible embodiment of the invention;
  • Figure 8 is an illustration of possible defect detection in a panorama image according to a possible embodiment of the invention;
  • Figure 9 is a flow chart describing a possible embodiment for simplifying defect recognition according to a possible embodiment of the invention;
  • Figure 10 is a possible embodiment for a device that enables the present invention.
  • Figure 1 is a representation of an AUV in survey mode along a pipeline according to a possible embodiment of the invention.
  • An AUV ("Autonomous Underwater Vehicle") is a subsea vehicle that is not directly controlled from the surface.
  • the AUV 102 may be used to ensure that there is no problem on subsea pipelines such as the pipeline 101 in Figure 1.
  • the AUV 102 follows the path of the pipeline 101.
  • the navigation module of the AUV controls the AUV so that the AUV is translated along this direction.
  • the distance d between the AUV 102 and the pipeline 101 is greater than a predetermined safety distance to avoid any collisions.
  • the AUV 102 may comprise capture means 103 (such as a camera, a video camera, a sonar, etc.) in order to survey the pipeline and provide information and data to the engineers.
  • the capture means 103 may, for instance, be able to capture visual information close to the pipeline within a predetermined area 104.
  • Figures 2a to 2e are illustrations of images taken by an AUV during a survey according to a possible embodiment of the invention.
  • the camera may create images 200 (or set of data) representing the seabed and comprising the pipeline 204 that is expected to be surveyed.
  • the determination of the relative location of the AUV in space is even more accurate (i.e. the orientation of the AUV compared to the orientation of the pipeline) if two contour lines (210 and 211) are determined.
  • This determination may use image processing techniques such as contour detection. If the image is defined as a set of pixels with an amplitude or colour for each pixel, the detection may be done by searching the image for the two lines which maximize the variation of image amplitude orthogonally to the lines. An optimization process may be used to find the two best lines in the image verifying the above criterion.
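  • As a rough illustration of this criterion (a toy sketch, not the patent's actual optimization), the following restricts the candidate lines to image columns and scores each candidate by the total amplitude variation orthogonal to it; the function name and the synthetic image are assumptions for the example:

```python
import numpy as np

def detect_vertical_contours(img):
    """Toy contour-line search: candidate lines are image columns, and each
    candidate is scored by the summed amplitude variation orthogonal to it
    (the horizontal central difference). The two best-scoring, non-adjacent
    columns are returned."""
    img = img.astype(float)
    # score[k] measures the amplitude variation across column k+1
    score = np.abs(img[:, 2:] - img[:, :-2]).sum(axis=0)
    ranked = np.argsort(score, kind="stable")[::-1] + 1
    best = [int(ranked[0])]
    for c in ranked[1:]:
        if abs(int(c) - best[0]) > 1:  # skip the immediate neighbour of the first line
            best.append(int(c))
            break
    return sorted(best)

# synthetic image: a bright pipe between columns 10 and 20 on a dark seabed
img = np.zeros((50, 32))
img[:, 10:21] = 200.0
print(detect_vertical_contours(img))  # the two pipe edges, near columns 10 and 20
```

  A real implementation would search over oriented lines (e.g. a Hough-style (rho, theta) parameterization) rather than columns only.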
  • a "control pattern" may be defined in the AUV configuration settings.
  • if the representation of the pipeline validates this pattern, the AUV is well localized.
  • observed differences may be used to correct the location of the AUV. Knowing the mathematical model of the camera, the pipeline location, etc., it is possible to compute the displacement between the estimated and the real location: the true location of the AUV can then be estimated.
  • this pattern may consist of a zone of the captured image 200 where the pipeline (or its representation through the determined contour lines) should remain. There is a huge number of possible ways to define such a "control pattern".
  • this pattern may consist of a set of points defining a polygon (e.g. points 220, 221, 222, 224 and 223), and the representation of the pipeline should fit in this polygon. It is also possible to define segments at the edges of the captured image, and the representation of the pipeline should correspond to the latter segments at those edges.
  • the pattern is defined with three segments 201, 202 and 203. In order to "validate" this pattern with the representation of the pipeline 204 in the image 200, the following conditions are to be verified: - the contour line 210 is to go through the point 220 of segment 201 and through the point 223 of segment 203,
  • the contour line 211 is to go through the point 222 of segment 202 and through the point 224 of segment 203.
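  • A minimal way to encode this "validation" check; the point names follow the figure, but the coordinates, tolerance and function names below are illustrative assumptions:

```python
def line_through(p, q):
    """Coefficients (a, b, c) of the line a*x + b*y + c = 0 through p and q."""
    (x1, y1), (x2, y2) = p, q
    return (y2 - y1, x1 - x2, x2 * y1 - x1 * y2)

def passes_through(line, point, tol=1.5):
    """True when the point lies within tol pixels (perpendicular distance) of the line."""
    a, b, c = line
    x, y = point
    return abs(a * x + b * y + c) / (a * a + b * b) ** 0.5 <= tol

def pattern_validated(contour_210, contour_211, pts):
    """The two conditions above: line 210 must cross control points 220 and 223,
    and line 211 must cross control points 222 and 224."""
    return (passes_through(contour_210, pts["220"]) and
            passes_through(contour_210, pts["223"]) and
            passes_through(contour_211, pts["222"]) and
            passes_through(contour_211, pts["224"]))

# illustrative control points and perfectly aligned contour lines
pts = {"220": (0, 10), "223": (20, 30), "222": (5, 0), "224": (20, 15)}
ok = pattern_validated(line_through((0, 10), (20, 30)),
                       line_through((5, 0), (20, 15)), pts)
print(ok)  # True: the pattern is "validated"
```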
  • in that case, the AUV is assumed to be at a correct distance and to have a correct orientation with regard to the pipeline.
  • otherwise, the pattern may be "not validated".
  • the contour line 210 goes through the point 220r (which is above the point 220) and through the point 223r (which is to the right of the point 223), - the contour line 211 goes through the point 222r (which is to the left of the point 222) and through the point 224r (which is below and to the right of the point 224).
  • the representation of the pipeline (i.e. its detected contour lines) in the picture 200 is to be rotated in an anti-clockwise direction, with a rotation centered on the point 225, in order to "validate" the pattern.
  • the AUV may be rotated in a clockwise direction about the axis z (assuming that the pipeline is on the seabed defining the plane (x,y)) (e.g. the direction of the AUV is modified by the AUV navigation module to slightly turn right).
  • the contour line 210 goes through the point 220r (which is below the point 220) and through the point 223,
  • the contour line 211 goes through the point 222r (which is to the left of the point 222) and through the point 224.
  • the segment 203 is locally validated but the segments 201 and 202 are not validated.
  • the AUV may be moved in the direction y in order to bring the pipeline closer to the AUV (i.e. to zoom the representation of the pipeline in the bottom-left corner of the image). It may also be useful to slightly rotate the AUV in an anticlockwise direction about the axis y.
  • the contour line 210 goes through the point 223r (which is to the right of the point 223) and through the point 220,
  • the contour line 211 goes through the point 222r (which is to the left of the point 222) and through the point 224.
  • the segment 203 is not validated but the segments 201 and 202 are locally validated.
  • the contour line 210 goes through the point 223r (which is to the left of the point 223) and through the point 220r (which is above the point 220),
  • the contour line 211 goes through the point 224r (which is below and to the right of the point 224) and through the point 222r (which is to the right of the point 222).
  • navigation instructions are sent to the navigation module of the AUV to modify the navigation parameters of the AUV.
  • with these modified navigation parameters, it is possible to control the AUV to ensure that it follows a subsea pipeline for a survey and captures consistent images of the pipeline (i.e. where the pipeline is always at the same (or a similar) location in the captured images).
  • Figure 3 is a flow chart describing a possible embodiment for controlling navigation of a subsea vehicle according to a possible embodiment of the invention.
  • Part of this flow chart can represent steps of an example of a computer program which may be executed by a circuit, computer or a computing device.
  • upon reception of data 300 (e.g. a 2D-array of pixel values, an image, etc.), it is determined whether a given feature (e.g. the pipeline) is present in the data.
  • a plurality of methods is possible in order to determine whether a given feature is present in an image. For instance, this determination may use contour detection or pattern recognition algorithms in conjunction with a database 302 of stored pattern signatures.
  • if no pipeline is detected in a received image, the AUV is considered "temporarily lost" (output KO of test 310). If no pipeline is detected during a predetermined period of time (for instance 1 min) or after a predetermined number of received images (for instance 10 images), the AUV is considered "lost" (output OK of test 310). Thus, the AUV is configured to go back to a location where a pipeline has previously been detected (message 308) or to a predetermined fallback location.
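  • The "temporarily lost" versus "lost" logic can be sketched as a small counter over received images; the class name, the return convention and the locations are assumptions, and the threshold of 10 images follows the example above:

```python
class LostDetector:
    """Track consecutive images with no pipeline detection: after max_misses
    misses the AUV is declared "lost" and should go back to the last location
    where the pipeline was seen, or to a predetermined fallback location."""
    def __init__(self, max_misses=10, fallback=(0.0, 0.0)):
        self.max_misses = max_misses
        self.fallback = fallback
        self.misses = 0
        self.last_seen = None

    def update(self, pipeline_detected, location=None):
        """Feed one image result; return a go-back target when lost, else None."""
        if pipeline_detected:
            self.misses = 0
            self.last_seen = location
            return None                        # keep surveying
        self.misses += 1
        if self.misses >= self.max_misses:     # "lost" (output OK of test 310)
            return self.last_seen if self.last_seen is not None else self.fallback
        return None                            # only "temporarily lost" (output KO)

auv = LostDetector(max_misses=3)
auv.update(True, (48.85, -3.70))   # pipeline seen at this location
auv.update(False)
auv.update(False)
print(auv.update(False))  # (48.85, -3.7): go back to the last sighting
```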
  • the contour lines of the pipeline are detected (step 303) and the contour lines may be compared to a predetermined "control pattern" stored in a memory 305 of the AUV in order to determine whether the pipeline representation "validates" (see above) this pattern.
  • the memory 305 and the memory 302 may be the same memory.
  • if the contour lines do not "validate" this pattern (output KO of test 306), a modification of the navigation parameters may be computed (step 307) and a message 308 may be sent to the navigation module of the AUV to control the AUV survey path. If the contour lines do "validate" this pattern (output OK of test 306), the navigation parameters do not need to be updated and the AUV continues on its preprogrammed survey path.
  • Figure 4 is an illustration of detection of underwater features according to a possible embodiment of the invention.
  • the AUV 402 may be able to detect features attached to the pipeline with detection means 403 (such as a camera, a sonar, a multi-beam sonar, etc.).
  • the detection may use character recognition algorithms, pattern recognition algorithms or others.
  • - to add a reflective covering (e.g. paint with microspheres, etc.);
  • - to use a material that reflects/absorbs specific wavelengths (IR, UV, red light, etc.);
  • these numbers or letters may represent an encoded real location (for instance in signed degrees format, in a DMS + compass direction format, in a degrees minutes seconds format, etc.) or other;
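  • For instance, a "degrees minutes seconds + compass direction" reading could be converted to the signed-degrees format as follows (a hypothetical helper, not part of the patent; the sample coordinates are arbitrary):

```python
def dms_to_signed_degrees(degrees, minutes, seconds, compass):
    """Convert a DMS + compass reading (as could be encoded on a marker
    attached to the pipeline) into signed degrees; South and West map to
    negative values."""
    value = degrees + minutes / 60.0 + seconds / 3600.0
    return -value if compass in ("S", "W") else value

print(dms_to_signed_degrees(48, 51, 24, "N"))   # ≈ 48.8567
print(dms_to_signed_degrees(3, 42, 0, "W"))     # ≈ -3.7
```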
  • a flange 401 that is used to attach two parts of the pipeline together (detected for instance with a pattern recognition algorithm); - an anode 408 attached to the pipeline;
  • Figure 5 is a flow chart describing a possible embodiment for improving localization of an underwater vehicle according to a possible embodiment of the invention.
  • Part of this flow chart can represent steps of an example of a computer program which may be executed by a circuit, computer or a computing device.
  • data representing an underwater region (for instance, a picture taken by a camera or a video-recorder, rows of values taken by a sonar, etc.)
  • the identification of the underwater features may be performed on only part(s) of the received data: for instance, features may be searched in a bottom-left corner of the received image 500 or any other specific subset of the data.
  • a comparison may be performed to find a correspondence (e.g. a signature match) among a plurality of stored features in a database 503.
  • the stored features may have been stored in association with a real location.
  • alternatively, the detected feature in the data may directly describe (i.e. without the need of an external database) a real location (for instance, a sticker with real coordinates written on it).
  • If no correspondence is found in the database 503 (test 504, output KO), no action is performed (step 505).
  • If a single correspondence is found in the database 503 (test 504, output OK), the real location associated with the correspondence in the database 503 is used to update (step 506) the computed location 507 of the AUV.
  • If multiple correspondences are found, the selected correspondence may be the one for which the distance between its associated real location and the current computed location 507 of the AUV is minimal.
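  • This selection rule can be written directly; the match structure (a dict with a "real_location" key) and the 2-D Euclidean distance are assumed representations for the sketch:

```python
import math

def select_correspondence(matches, computed_location):
    """Among several database matches, keep the one whose stored real location
    is closest to the AUV's current computed location."""
    cx, cy = computed_location
    return min(matches, key=lambda m: math.hypot(m["real_location"][0] - cx,
                                                 m["real_location"][1] - cy))

matches = [{"feature": "marker_12", "real_location": (100.0, 40.0)},
           {"feature": "marker_57", "real_location": (12.0, 9.0)}]
print(select_correspondence(matches, (10.0, 10.0))["feature"])  # marker_57
```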
  • Figure 6a is an illustration of a sample image 600a taken by an AUV during a survey according to a possible embodiment of the invention.
  • the image 600a comprises the representation of a pipeline 601a with a flange 602a and two perpendicular pipeline valves 603a and 604a.
  • the representation of the pipeline 601a has a perspective effect: the two contour lines of the pipeline (which are normally parallel) cross at a vanishing point (outside image 600a).
  • Figure 6b is an illustration of a possible deformation of a sample image taken by an AUV during a survey according to a possible embodiment of the invention.
  • This deformation may comprise a perspective correction or perspective transformation (i.e. to set the contour lines parallel) and a rotation (i.e. to set the contour lines horizontal).
  • objects of the non-transformed image 600a (i.e. elements 601a, 602a, 603a, 604a) correspond to objects of the transformed image 600b (i.e. elements 601b, 602b, 603b, 604b).
  • Figure 7 is an illustration of a possible combination of sample images taken by an AUV during a survey according to a possible embodiment of the invention.
  • an AUV may capture a plurality of images along the pipeline.
  • the transformation may comprise a simple morphing modifying the two detected lines (edges of the pipe) into two parallel and horizontal lines.
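  • One simple form of such a morphing, assuming the two detected lines are given as functions top(x) and bottom(x) of the column index, is a per-column resampling (a sketch under those assumptions, not the patent's exact transformation):

```python
import numpy as np

def straighten_pipe(img, top, bottom, out_h=20):
    """Per-column morph: for every column x, resample the band between the two
    detected contour lines top(x) and bottom(x) onto out_h rows, so that both
    contours become parallel horizontal lines in the output."""
    h, w = img.shape
    out = np.zeros((out_h, w), dtype=img.dtype)
    for x in range(w):
        ys = np.clip(np.round(np.linspace(top(x), bottom(x), out_h)).astype(int), 0, h - 1)
        out[:, x] = img[ys, x]
    return out

# synthetic slanted pipe: a bright band between y = 0.2x + 5 and y = 0.2x + 15
img = np.zeros((40, 30))
for x in range(30):
    lo = int(np.round(0.2 * x + 5))
    hi = int(np.round(0.2 * x + 15))
    img[lo:hi + 1, x] = 255.0

flat = straighten_pipe(img, lambda x: 0.2 * x + 5, lambda x: 0.2 * x + 15)
print(bool((flat == 255.0).all()))  # True: the band is now a horizontal strip
```

  A full perspective correction would instead use a homography, but this nearest-neighbour morph already makes the two edges parallel and horizontal.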
  • the mosaic image may be created with the following process: a/ store the n corrected images in a memory buffer; b/ for the first two successive corrected images (e.g. 701 and 702), analyze these images by computing the inter-correlation between the two images; an overlapping zone (e.g. 704) is thus estimated, and the two images are then flattened into a single image; c/ store the flattened image in the buffer in order to replace the two successive corrected images at the first location in the buffer; d/ if the buffer comprises more than one image, steps b/ and c/ are reapplied to obtain the complete mosaic of the pipe 703.
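  • Steps a/ to d/ can be sketched as follows; for simplicity, the inter-correlation of step b/ is replaced here by a minimum mean-squared-difference search over candidate overlap widths (an assumption, not the patent's formulation):

```python
import numpy as np

def overlap_width(first, second, max_ov):
    """Estimate the overlap (in columns) between the right edge of `first` and
    the left edge of `second` by minimizing the mean squared difference."""
    best_w, best_err = 1, float("inf")
    limit = min(max_ov, first.shape[1], second.shape[1])
    for w in range(1, limit + 1):
        err = float(((first[:, -w:] - second[:, :w]) ** 2).mean())
        if err < best_err:
            best_w, best_err = w, err
    return best_w

def build_mosaic(images, max_ov=10):
    """a/ buffer the n corrected images; b/ estimate the overlap of the first
    two and flatten them into one image; c/ put the result back at the head of
    the buffer; d/ repeat until a single mosaic remains."""
    buf = list(images)
    while len(buf) > 1:
        first, second = buf[0], buf[1]
        w = overlap_width(first, second, max_ov)
        buf[:2] = [np.hstack([first[:, :-w], second])]  # drop overlap, join
    return buf[0]

# three overlapping views of the same strip
strip = np.arange(80, dtype=float).reshape(4, 20)
parts = [strip[:, 0:9], strip[:, 6:15], strip[:, 12:20]]
print(np.array_equal(build_mosaic(parts), strip))  # True: the strip is recovered
```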
  • Figure 8 is an illustration of a possible defect detection method in a panorama image according to a possible embodiment of the invention.
  • a possible method for detecting such defects is described in the application FR 2 965 616. Moreover, a possible method for detecting such defects may consist in:
  • the zone 801a of the extracted part of the panorama image 800a corresponds to the zone 801b in the graphic, where the contrast variation value (CVV) is below 190.
  • the zone 802a of the extracted part of the panorama image 800a corresponds to the zone 802b in the graphic, where contrast variation values are below 190. It may be possible to detect defects such as:
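  • The exact definition of the contrast variation value is not given here; a plausible stand-in is the peak-to-peak amplitude over a sliding window of columns, thresholded at 190 as in the figure (the window size and the synthetic texture are assumptions):

```python
import numpy as np

def contrast_variation(panorama, win=5):
    """Per-column contrast variation value (CVV), approximated as the
    peak-to-peak amplitude in a sliding window of columns."""
    h, w = panorama.shape
    cvv = np.empty(w)
    for x in range(w):
        window = panorama[:, max(0, x - win):x + win + 1]
        cvv[x] = window.max() - window.min()
    return cvv

def defect_columns(panorama, threshold=190):
    """Columns whose CVV falls below the threshold (candidate defect zones)."""
    return set(np.where(contrast_variation(panorama) < threshold)[0].tolist())

# striped (high-contrast) pipe texture with a flat, low-contrast zone in the middle
p = np.zeros((8, 60))
p[:, 1::2] = 255.0       # stripes: healthy, high-contrast coating
p[:, 20:40] = 100.0      # flat zone: a candidate defect
cols = defect_columns(p)
print(sorted(cols))  # columns inside/around the flat zone; column 10 is not flagged
```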
  • Figure 9 is a flow chart describing a possible embodiment for simplifying defect recognition according to a possible embodiment of the invention. Part of this flow chart can represent steps of an example of a computer program which may be executed by a circuit.
  • each image of the plurality may be modified to change the perspective and/or to rotate the latter image (step 901).
  • the panorama image may be cropped in order to keep only the relevant part of the image (i.e. the part of the image close to the representation of the pipeline in the panorama image, step 903).
  • it is possible to process the panorama image to detect anomalies/defects (step 904), for instance according to the method described in patent application FR 2 965 616.
  • the panorama image may be marked according to the previous detection (step 905) to ease a future identification and verification of defects on the pipeline (for instance, to ease the visual inspection by operators/engineers).
  • the marks may be, for instance, vertical red lines at the location of the detected defects in the panorama image.
  • the final marked panorama image (message 906) may be outputted to be displayed, for instance, to the operators/engineers.
  • Figure 10 is a possible embodiment for a device that enables the present invention.
  • the device 1000 comprises a computer, this computer comprising a memory 1005 to store program instructions loadable into a circuit and adapted to cause the circuit 1004 to carry out the steps of the present invention when the program instructions are run by the circuit 1004.
  • the memory 1005 may also store data and useful information for carrying out the steps of the present invention as described above.
  • the circuit 1004 may be, for instance:
  • - a processor or a processing unit adapted to interpret instructions in a computer language, the processor or processing unit comprising, being associated with, or being attached to a memory comprising the instructions; or
  • - an electronic card wherein the steps of the invention are described within silicon; or
  • - a programmable electronic chip such as an FPGA chip (for "Field-Programmable Gate Array").
  • This computer comprises an input interface 1003 for the reception of data used for the above method according to the invention, and an output interface 1006 for providing a panorama image, navigation control instructions, or an update of the AUV location as described above.
  • a screen 1001 and a keyboard 1002 may be provided and connected to the computer circuit 1004.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Automation & Control Theory (AREA)
  • Image Processing (AREA)

Abstract

The present invention relates to a method for simplifying defect analysis on an underwater cylindrical body. The method comprises receiving a plurality of images corresponding to an underwater region. Moreover, for each received image and for a predetermined direction in the received images, the method further comprises determining whether a cylindrical body has a representation in the received image and, upon positive determination, determining at least two contour lines associated with contours of the representation of the cylindrical body, determining a transformation of the latter received image to make the latter determined contour lines parallel to the predetermined direction, and transforming the latter image according to the latter determined transformation. In addition, the method comprises combining said transformed images to create a panorama image, and identifying possible defects on the panorama image.

Description

A METHOD FOR SIMPLIFYING DEFECT ANALYSIS
BACKGROUND OF THE INVENTION
The present invention relates to defect detection on a pipeline (or other cylindrical body) and more specifically to the defect detection during underwater survey.
The approaches described in this section could be pursued, but are not necessarily approaches that have been previously conceived or pursued. Therefore, unless otherwise indicated herein, the approaches described in this section are not prior art to the claims in this application and are not admitted to be prior art by inclusion in this section. Furthermore, all embodiments are not necessarily intended to solve all or even any of the problems brought forward in this section.
An AUV (for "Autonomous Underwater Vehicle") is a powerful tool to carry out subsea mapping, geotechnical and environmental surveys in deep water.
An AUV needs only minimal support from a surface vessel to carry out a survey. Therefore, an underwater pipeline may be surveyed much faster, and minimal human intervention is required during operations.
To survey an underwater pipeline, the conventional method is to visually inspect the pipeline with a ROV (for "Remotely Operated Vehicle"). A standard video format is used to transmit feedback information to the engineer/operator at the surface. Thus, the engineer/operator may detect/see subsea anomalies. Videos are stored in surface storage devices.
It is also possible to post-process the stored video by replaying the video and identifying pipeline anomalies/defects and any other features manually.
For AUV, no high bandwidth data link (i.e. direct communication link such as a wire) to surface is available.
Moreover, for safety reasons, an AUV cannot be very close to or in contact with subsea pipelines: it is required that the AUV flies over them. During a pipe inspection/survey, 10 to 100 km of pipeline may be surveyed. As many images may be captured during the survey (to be post-analyzed by engineers), the memory of the AUV available for storing these images may be too small (e.g. 10 MB per image, 10,000 images per 10 km).
Thus, there is a need to compress the relevant part of the captured images in order to avoid any memory saturation.
Moreover, the amount of data captured during such a survey may be too large for direct processing by engineers.
There is thus a need for simplifying defect analysis on an underwater pipeline.
SUMMARY OF THE INVENTION
The invention relates to a method for simplifying defect analysis on an underwater pipeline. The method comprises:
- receiving a plurality of images corresponding to an underwater region;
- for each received image and for a predetermined direction in the received images,
- determining whether a cylindrical body has a representation in the received image,
- upon positive determination:
- determining at least two contour lines associated with contours of the representation of the cylindrical body;
- determining a transformation of the latter received image to make the latter determined contour lines parallel to the predetermined direction;
- transforming the latter image according to the latter determined transformation;
- combining said transformed images to create a panorama image; and
- identifying possible defects on the panorama image.
The contour lines associated with contours of the representation of the cylindrical body may be expected to be quasi-parallel lines due to the perspective induced by the capture means.
For instance, the capture means may capture a plurality of pictures/images of a cylindrical body during a survey. Each image may represent 3 or 4 meters of the cylindrical body. The overlap between two successive images may be about 50-70%.
A cylindrical body may be, for instance, a pipeline, a cable (e.g. a mechanical cable, an electrical cable, a supply cable or a hydraulic cable), a line (e.g. a supply line) or a riser (e.g. a riser bundle, a single hybrid riser, etc.).
In a possible embodiment, the determined transformation may comprise a rotation or a perspective distortion transformation.
In addition, the panorama image may combine parts of the transformed images with minimum transformations. Indeed, when correcting the perspective effect of an image, parts of the image may be distorted non-uniformly: the parts corresponding to the closest points in space may not be distorted, while the parts corresponding to the furthest points in space may be highly distorted.
Thus, to create the panorama image, it is advantageous to keep the least distorted part of each image while combining the modified images together.
The received images being associated with an order, the transformed image may be associated with the same order.
In such embodiment, combining said transformed images to create a panorama image may comprise:
- for a first image and a second image, the second image being the successor of the first image according to the order: - computing an overlapping zone in the first image according to a computed inter-correlation between the first and second images,
- modifying the first image, said modification comprising removing said overlapping zone from the first image,
- joining the first modified image and the second image into a third image.
For instance, the order may correspond to the order of image capture: the first image captured by the capture means may have a first index; the second image captured by the capture means may have a second index, etc. The computed inter-correlation may for instance be a matrix C, where each element C(i,j) of the matrix corresponds to the correlation between the two images when, after the superposition of the two images, the second image is shifted by i pixels in the horizontal direction and j pixels in the vertical direction.
The overlapping zone may thus be determined by shifting the second image by "iv" pixels in the horizontal direction and "jv" pixels in the vertical direction, where the value C(iv,jv) is the maximal value in the matrix.
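A direct (brute-force) reading of this matrix C, restricted to non-negative shifts for brevity, might look as follows; the normalization by the overlap size and the search range are assumptions of the sketch:

```python
import numpy as np

def correlation_matrix(a, b, max_shift):
    """C[i, j] = mean product of the overlapping pixels of image `a` and image
    `b` after shifting `b` by i pixels horizontally and j pixels vertically."""
    h, w = a.shape
    C = np.zeros((max_shift, max_shift))
    for i in range(max_shift):
        for j in range(max_shift):
            ov_a = a[j:, i:]               # part of `a` covered by the shifted `b`
            ov_b = b[:h - j, :w - i]
            C[i, j] = (ov_a * ov_b).sum() / ov_a.size
    return C

def best_shift(a, b, max_shift=8):
    """The shift (iv, jv) maximizing C, i.e. the estimated overlap offset."""
    C = correlation_matrix(a, b, max_shift)
    iv, jv = np.unravel_index(np.argmax(C), C.shape)
    return int(iv), int(jv)

# `b` is a copy of `a` shifted by 3 pixels horizontally and 2 vertically
a = np.zeros((16, 16))
a[4:8, 4:8] = 1.0
b = np.roll(np.roll(a, -2, axis=0), -3, axis=1)
print(best_shift(a, b))  # (3, 2)
```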
In a possible embodiment, the contour lines may correspond to the main external contours of the representation of the cylindrical body in the latter received image.
Moreover, the method may further comprise at least one of the following steps:
- marking the panorama image at locations corresponding to the identified possible defects;
- programming a navigation route of an underwater vehicle according to locations corresponding to the identified possible defects.
The marking may consist of drawing vertical red lines at said locations corresponding to the identified possible defects. The programming of the navigation route enables a close and accurate check after the detection of a possible defect.
Identifying possible defects on the panorama image may also comprise at least one of the following computations:
- detection of pre-identified patterns in the panorama image;
- detection of variation of contrast/color in the panorama image;
- detection of variation of contrast/color on the representation of the cylindrical body in the panorama image.
A second aspect of the invention relates to an analyzer device for simplifying defect analysis on an underwater cylindrical body, wherein the analyzer device comprises:
- an interface for receiving a plurality of images corresponding to an underwater region;
- a circuit for determining whether a cylindrical body has a representation in the received image;
- a circuit for, upon positive determination, determining at least two contour lines associated with contours of the representation of the cylindrical body;
- a circuit for, upon positive determination, determining a transformation of the received images to parallelize the determined contour lines and a predetermined direction in the received images;
- a circuit for, upon positive determination, transforming the received images according to the latter determined transformation;
- a circuit for combining said transformed images to create a panorama image; and
- a circuit for identifying possible defects on the panorama image.
A third aspect relates to a computer program product comprising a computer readable medium having thereon a computer program comprising program instructions. The computer program is loadable into a data-processing unit and adapted to cause the data-processing unit to carry out the method described above when the computer program is run by the data-processing unit.
Other features and advantages of the method and apparatus disclosed herein will become apparent from the following description of non-limiting embodiments, with reference to the appended drawings.
BRIEF DESCRIPTION OF THE DRAWINGS
The present invention is illustrated by way of example, and not by way of limitation, in the figures of the accompanying drawings, in which like reference numerals refer to similar elements and in which:
- Figure 1 is a representation of an AUV in survey mode along a pipeline according to a possible embodiment of the invention;
- Figures 2a to 2e are illustrations of images taken by an AUV during a survey according to a possible embodiment of the invention;
- Figure 3 is a flow chart describing a possible embodiment for controlling navigation of a subsea vehicle according to a possible embodiment of the invention;
- Figure 4 is an illustration of detection of underwater features according to a possible embodiment of the invention;
- Figure 5 is a flow chart describing a possible embodiment for improving localization of an underwater vehicle according to a possible embodiment of the invention;
- Figure 6a is an illustration of a sample image taken by an AUV during a survey according to a possible embodiment of the invention;
- Figure 6b is an illustration of a possible deformation of a sample image taken by an AUV during a survey according to a possible embodiment of the invention;
- Figure 7 is an illustration of a possible combination of sample images taken by an AUV during a survey according to a possible embodiment of the invention;
- Figure 8 is an illustration of possible defect detection in a panorama image according to a possible embodiment of the invention;
- Figure 9 is a flow chart describing a possible embodiment for simplifying defect recognition according to a possible embodiment of the invention;
- Figure 10 is a possible embodiment for a device that enables the present invention.
DESCRIPTION OF PREFERRED EMBODIMENTS
Figure 1 is a representation of an AUV in survey mode along a pipeline according to a possible embodiment of the invention.
An AUV (for "Autonomous Underwater Vehicle") is a subsea vehicle that is not directly controlled from the surface.
The AUV 102 may be used to ensure that there is no problem on subsea pipelines such as the pipeline 101 in Figure 1.
To survey the pipeline, the AUV 102 follows the path of the pipeline 101. For instance, if the pipeline is parallel to the axis x of the Cartesian coordinate system (x,y,z) represented in Figure 1, the navigation module of the AUV controls the AUV so that it is translated along this direction. For safety reasons, the distance d between the AUV 102 and the pipeline 101 is kept greater than a predetermined safety distance to avoid any collision.
In addition, the AUV 102 may comprise capture means 103 (such as a camera, a video camera, a sonar, etc.) in order to survey the pipeline and provide information and data to the engineers. The capture means 103 may, for instance, be able to capture visual information close to the pipeline within a predetermined area 104.
Figures 2a to 2e are illustrations of images taken by an AUV during a survey according to a possible embodiment of the invention.
As described with reference to Figure 1, the camera may create images 200 (or sets of data) representing the seabed and comprising the pipeline 204 that is expected to be surveyed.
To control the navigation of the AUV, it is possible to use these images initially captured to survey the pipeline 204. Indeed, it is possible to determine a relative location of the AUV in space (distance of the AUV from the pipeline):
- knowing the real diameter d204 of the pipeline 204,
- knowing the orientation of the capture means (e.g. the camera axis).
The determination of the relative location of the AUV in space is even more accurate (i.e. the orientation of the AUV compared to the orientation of the pipeline) if two contour lines (210 and 211) are determined. This determination may use image processing techniques such as contour detection. If the image is defined as a set of pixels with an amplitude or colour for each pixel, the detection may be done by searching in the image for the two lines which maximize the variation of image amplitude orthogonally to the lines. An optimization process may be used to find the two best lines in the image verifying the above criterion. Once the relative location of the AUV from the pipeline is determined (distance and orientation), it is possible to modify the navigation path of the AUV to bring the AUV to a specific distance from the pipeline (e.g. 3 meters from the pipeline) and with a specific relative orientation (e.g. parallel to the pipeline).
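Restricted to horizontal candidate lines, the criterion "maximize the variation of image amplitude orthogonally to the lines" can be sketched as follows. This is a simplification of the optimization described above; a real implementation would search over oblique lines as well:

```python
import numpy as np

def detect_contour_rows(img):
    """Return the two row boundaries where the image amplitude varies
    most in the direction orthogonal to a horizontal line."""
    # per-row variation: |difference between consecutive rows|, summed over columns
    variation = np.abs(np.diff(img.astype(float), axis=0)).sum(axis=1)
    top_two = np.argsort(variation)[-2:]
    return sorted(int(r) for r in top_two)
```

On an image containing a bright horizontal band (a crude stand-in for the pipe), the two returned rows bracket the band, playing the role of contour lines 210 and 211.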
In order to ease this determination, a "control pattern" may be defined in the AUV configuration settings.
If no differences are observed between the control pattern and the representation of the pipeline, then the AUV is well localized. Otherwise, the observed differences may be used to correct the location of the AUV. Knowing the mathematical model of the camera, the pipeline location, etc., it is possible to compute the displacement between the estimated and the real location: the true location of the AUV can then be estimated.
Basically, this pattern may consist of a zone of the captured image 200 where the pipeline (or its representation through the determined contour lines) should remain. There are many possible ways to define such a "control pattern".
For instance, this pattern may consist of a set of points defining a polygon (e.g. points 220, 221, 222, 224 and 223), and the representation of the pipeline should fit in this polygon. It is also possible to define segments at the edges of the captured image, and the representation of the pipeline should correspond to the latter segments at those edges. In Figures 2a to 2e, the pattern is defined with three segments 201, 202 and 203. In order to "validate" this pattern with the representation of the pipeline 204 in the image 200, the following conditions are to be verified:
- the contour line 210 is to go through the point 220 of segment 201 and through the point 223 of segment 203,
- the contour line 211 is to go through the point 222 of segment 202 and through the point 224 of segment 203.
If this pattern is validated (as represented in Figure 2a), the AUV is assumed to be at a correct distance and to have a correct orientation in regard of the pipeline.
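One minimal way to encode such a validation test (the tolerance and point layout here are hypothetical, not specified in the application) is to check that each detected contour line passes close enough to its target pattern points:

```python
def pattern_validated(detected_points, target_points, tol=5):
    """Return True when every detected contour-line point lies within
    `tol` pixels of its corresponding pattern point (e.g. 220, 223)."""
    return all(
        abs(dx - tx) <= tol and abs(dy - ty) <= tol
        for (dx, dy), (tx, ty) in zip(detected_points, target_points)
    )
```

When the test fails, the direction and magnitude of the offsets indicate which corrective maneuver (rotation, translation) to send to the navigation module, as illustrated in Figures 2b to 2e.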
Nevertheless, the pattern may be "not validated".
A first illustration of this invalidity is provided in Figure 2b:
- the contour line 210 goes through the point 220r (which is above the point 220) and through the point 223r (which is at the right of the point 223),
- the contour line 211 goes through the point 222r (which is at the left of the point 222) and through the point 224r (which is below and at the right of the point 224).
It appears that the representation of the pipeline (i.e. its detected contour lines) in the picture 200 is to be rotated in an anti-clockwise direction with a rotation centered on the point 225, in order to "validate" the pattern.
In order to perform a rotation of the representation of the pipeline in the image 200 in an anti-clockwise direction, the AUV may be rotated in a clockwise direction about the axis z (assuming that the pipeline lies on the seabed defining the plane (x,y)) (e.g. the direction of the AUV is modified by the AUV navigation module to slightly turn right).
A second illustration of this invalidity is provided in Figure 2c:
- the contour line 210 goes through the point 220r (which is below the point 220) and through the point 223,
- the contour line 211 goes through the point 222r (which is at the left of the point 222) and through the point 224.
Therefore, the segment 203 is locally validated but the segments 201 and 202 are not validated.
In order to validate the pattern with the representation of the pipeline 204 in the image 200, the AUV may be moved in the direction y in order to bring the pipeline closer to the AUV (i.e. to zoom the representation of the pipeline in the bottom-left corner of the image). It may also be useful to slightly rotate the AUV in an anti-clockwise direction about the axis y.
A third illustration of this invalidity is provided in Figure 2d:
- the contour line 210 goes through the point 223r (which is at the right of the point 223) and through the point 220,
- the contour line 211 goes through the point 222r (which is at the left of the point 222) and through the point 224.
Therefore, the segment 203 is not validated but the segments 201 and 202 are locally validated.
In order to validate the pattern with the representation of the pipeline in the image 200, it may be useful to slightly rotate the AUV in a clockwise direction about the axis y.
A fourth illustration of this invalidity is provided in Figure 2e:
- the contour line 210 goes through the point 223r (which is at the left of the point 223) and through the point 220r (which is above the point 220),
- the contour line 211 goes through the point 224r (which is below and at the right of the point 224) and through the point 222r (which is at the right of the point 222).
In order to validate the representation of the pipeline in the image 200, it may be useful to move the AUV away from the pipeline (i.e. to move the AUV in the direction -y).
In order to rotate, translate, etc. the AUV as described above, navigation instructions are sent to the navigation module of the AUV to modify the navigation parameters of the AUV. With these modified navigation parameters, it is possible to control the AUV to ensure that the AUV follows a subsea pipeline for a survey and to capture consistent images of the pipeline (i.e. where the pipeline is always at the same (or similar) location in the captured images).
Figure 3 is a flow chart describing a possible embodiment for controlling navigation of a subsea vehicle according to a possible embodiment of the invention.
Part of this flow chart can represent steps of an example of a computer program which may be executed by a circuit, computer or a computing device.
Upon reception of data 300 (e.g. a 2D-array of pixel values, an image, etc.), it is possible to determine (step 301) whether a pipeline has a representation in the received data.
A plurality of methods is possible in order to determine whether a given feature is present in an image. For instance, this determination may use contour detection or pattern recognition algorithms in conjunction with a database 302 with stored pattern signatures.
If a pipeline is not detected (output KO of the test 309) in the image 300, the AUV is considered "temporarily lost" (output KO of test 310). If no pipeline is detected during a predetermined period of time (for instance 1 min) or after a predetermined number of received images (for instance 10 images), the AUV is considered "lost" (output OK of test 310). The AUV is then configured to go back to a location where a pipeline has previously been detected (message 308) or to a predetermined fallback location.
If a pipeline is detected (output OK of the test 309) in the image 300, the contour lines of the pipeline are detected (step 303) and the contour lines may be compared to a predetermined "control pattern" stored in a memory 305 of the AUV in order to determine whether the pipeline representation "validates" (see above) this pattern. The memory 305 and the memory 302 may be the same memory.
If the contour lines do not "validate" this pattern (output KO of test 306), a modification of the navigation parameters (rotations, translations, etc.) may be computed (step 307) and a message 308 may be sent to the navigation module of the AUV to control the AUV survey path. If the contour lines do "validate" this pattern (output OK of test 306), the navigation parameters do not need to be updated and the AUV continues on its preprogrammed survey path.
Figure 4 is an illustration of detection of underwater features according to a possible embodiment of the invention.
When surveying a pipeline 400 on the seabed, the AUV 402 may be able to detect features attached to the pipeline with detection means 403 (such as a camera, a sonar, a multi-beam sonar, etc.).
The detection may use character recognition algorithms, pattern recognition algorithms or others.
In order to enhance the detection of the underwater features, it is possible:
- to add a reflective covering (e.g. a painting with microspheres, etc.) on these features;
- to use materials that reflect/absorb specific wavelengths (IR, UV, red light, etc.);
- etc.
The above features may be for instance:
- a white sticker 407 with black numbers or letters written on it. These numbers or letters may represent an encoded real location (for instance in a signed degrees format, in a DMS + compass direction format, in a degrees minutes seconds format, etc.) or others;
- a flange 401 that is used to attach two parts of the pipeline together (detected for instance with a pattern recognition algorithm);
- an anode 408 attached to the pipeline;
- a geometrical form 405 painted on the pipeline 400;
- a pipeline sleeper 406 used to avoid any displacement of the pipeline with regard to the seabed.
Figure 5 is a flow chart describing a possible embodiment for improving localization of an underwater vehicle according to a possible embodiment of the invention.
Part of this flow chart can represent steps of an example of a computer program which may be executed by a circuit, computer or a computing device.
Upon receiving data (message 500) representing an underwater region (for instance, a picture taken by a camera or a video-recorder, rows of values taken by a sonar, etc.), it is possible to process the data to identify (step 501) underwater features as described above.
The identification of the underwater features may be performed on only part(s) of the received data: for instance, features may be searched in a bottom-left corner of the received image 500 or in any other specific subset of the data. When possible underwater features are identified in the received data 500, a comparison (step 502) may be performed to find a correspondence (e.g. a signature match) among a plurality of stored features in a database 503. The stored features may have been stored in association with real locations.
It is also possible that the detected feature in the data directly describes (i.e. without the need for an external database) a real location (for instance, a sticker with real coordinates written on it).
If no correspondence is found in the database 503 (test 504, output KO), no action is performed (step 505).
If a single correspondence is found in the database 503 (test 504, output OK), the real location associated with the correspondence in the database 503 is used to update (step 506) the computed location 507 of the AUV.
If a plurality of correspondences is found in the database 503 (test 504, output OK2), it is possible to select (step 508) one correspondence among the plurality. The selected correspondence may be the one for which the distance between its associated real location and the current computed location 507 of the AUV is minimal.
For instance, when a survey is performed on a pipeline, several flanges/anodes may have the same signature, and then several correspondences matching an underwater feature may be found in the database. This algorithm for selecting one correspondence assumes that the most probable detected feature is the closest matching feature (i.e. the one with the shortest distance).
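The closest-match selection above can be sketched in a few lines; the dictionary layout of a database match is hypothetical, only the minimum-distance rule comes from the text:

```python
import math

def select_correspondence(matches, computed_location):
    """Among several database matches, keep the one whose stored real
    location is closest to the AUV's current computed location."""
    return min(matches, key=lambda m: math.dist(m["location"], computed_location))
```

The winning match's real location is then used to update the computed location of the AUV (step 506).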
Figure 6a is an illustration of a sample image 600a taken by an AUV during a survey according to a possible embodiment of the invention. The image 600a comprises the representation of a pipeline 601a with a flange 602a and two perpendicular pipeline valves 603a and 604a.
It is noted that the representation of the pipeline 601a has a perspective effect: the two contour lines of the pipeline (which are normally parallel) cross at a vanishing point (outside the image 600a).
In order to compensate for this perspective effect, it is possible to deform the image 600a.
Figure 6b is an illustration of a possible deformation of a sample image taken by an AUV during a survey according to a possible embodiment of the invention. This deformation may comprise a perspective correction or perspective transformation (i.e. to set the contour lines parallel) and a rotation (i.e. to set the contour lines horizontal).
Thus, objects of the non-transformed image 600a (i.e. elements 601a, 602a, 603a, 604a) are modified into new objects in a transformed image 600b (i.e. elements 601b, 602b, 603b, 604b).
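A minimal sketch of such a deformation, assuming the two contour lines have already been detected as per-column row indices (`top_edge`, `bottom_edge`): each column is remapped so the two edges land on two fixed horizontal rows. This is a simple column-wise morphing, not a full projective correction:

```python
import numpy as np

def straighten_pipe(img, top_edge, bottom_edge, out_top=10, out_bottom=30):
    """Warp each column so the detected pipe edges become the horizontal
    rows out_top and out_bottom (contour lines made parallel and horizontal)."""
    h, w = img.shape
    out = np.zeros_like(img)
    for x in range(w):
        # linear map sending out_top -> top_edge[x] and out_bottom -> bottom_edge[x]
        scale = (bottom_edge[x] - top_edge[x]) / (out_bottom - out_top)
        for y in range(h):
            src = int(round(top_edge[x] + (y - out_top) * scale))
            if 0 <= src < h:
                out[y, x] = img[src, x]
    return out
```

After this transformation, the pipe representation has a constant width and horizontal edges in every image, which makes the later correlation between successive images meaningful.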
Figure 7 is an illustration of a possible combination of sample images taken by an AUV during a survey according to a possible embodiment of the invention.
During a survey of a pipeline, an AUV may capture a plurality of images along the pipeline.
Due to image acquisition and the perspective effect, the pipe location is not stable between pairs of images. In order to be able to correlate the images and to create the mosaic image, a correction (see above) is applied to each image so that the pipe becomes horizontal with a constant width in the image. The transformation may comprise a simple morphing turning the two detected lines (edges of the pipe) into two parallel and horizontal lines.
After the transformation of these images (701, 702, etc.), it is possible to combine these transformed images to create a panorama image (or mosaic image). The mosaic image may be created with the following process:
a/ store the n corrected images in a memory buffer;
b/ for the first two successive corrected images (e.g. 701 and 702), analyze these images by computing the inter-correlation between the two images; an overlapping zone (e.g. 704) is thus estimated. The two images are then flattened into a single image;
c/ store the flattened image in the buffer, replacing the two successive corrected images at the first location in the buffer;
d/ if the buffer comprises more than one image, steps b/ and c/ are reapplied to obtain the complete mosaic of the pipe 703.
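Assuming horizontal strips and already-estimated overlap widths in pixels (the estimation itself would come from the inter-correlation of step b/), the flattening steps reduce to trimming each image's overlapping columns and concatenating:

```python
import numpy as np

def mosaic(images, overlap_widths):
    """Flatten successive corrected images into one mosaic strip by
    removing each image's estimated overlap with its successor."""
    trimmed = [img[:, :img.shape[1] - ov]
               for img, ov in zip(images, overlap_widths)]
    trimmed.append(images[-1])          # the last image is kept whole
    return np.hstack(trimmed)
```

For n images there are n-1 overlap widths; the resulting strip width is the sum of the image widths minus the sum of the overlaps.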
Figure 8 is an illustration of a possible defect detection method in a panorama image according to a possible embodiment of the invention.
For instance, it may be considered that there is a defect if the pipeline is not in contact with (or close to) the seabed. Indeed, if the distance between the seabed and the pipeline is too big (a gap) over a given length along the pipeline, the gravitational forces exerted on the pipeline could be dangerous for the pipeline's integrity.
A possible method for detecting such defects is described in the application FR 2 965 616. Moreover, a possible method for detecting such defects may consist in:
- computing a panorama image according to the above method;
- extracting the part 800a of the panorama image corresponding to the region below the representation of the pipeline;
- for each vertical segment (810, 811, 812, 813, etc.) of the extracted part of the panorama image 800a, computing a "contrast variation value" or CVV (820, 821, 822, 823, etc.) related to the contrast of the pixels in the latter vertical segment;
- if the contrast value, or if the variation of the contrast value (within a zone, according to a direction of space, etc.), is below a predetermined threshold (in the present example 190, line 800b), it is considered that a defect is present.
For instance, the zone 801a of the extracted part of the panorama image 800a, where a gap between the pipeline and the seabed is present, corresponds to the zone 801b in the graphic, where the CVV is below 190.
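The exact definition of the CVV is not given here; one plausible per-column contrast measure and the thresholding step might be sketched as follows (the amplitude-range definition is an assumption):

```python
import numpy as np

def gap_columns(strip, threshold=190):
    """Flag the vertical segments (columns) of the extracted band 800a
    whose contrast value falls below the threshold -- candidate gap zones."""
    # hypothetical CVV: per-column amplitude range of the pixels
    cvv = strip.max(axis=0) - strip.min(axis=0)
    return np.where(cvv < threshold)[0]
```

Consecutive runs of flagged columns then delimit zones such as 801b and 802b.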
The zone 802a of the extracted part of the panorama image 800a, where a gap between the pipeline and the seabed is present, corresponds to the zone 802b in the graphic, where the contrast variation values are below 190.
It may be possible to detect defects such as:
- debris in contact with subsea pipelines, through applying a real-time shape/pattern comparison to pre-identified patterns in the software database;
- drag/scar marks on the seabed, which are considered evidence of "walking pipelines";
- etc.
Upon the detection of such defects, it is possible to:
- produce a preliminary report and/or compare this report with the last produced report to stress differences;
- identify the defects on the panorama image;
- re-program the AUV route to re-survey the area where defects have been detected;
- use/activate other detections means (such as acoustics sensor, sonar, etc.) to increase the accuracy of the defect detection;
- etc.
Figure 9 is a flow chart describing a possible embodiment for simplifying defect recognition according to a possible embodiment of the invention. Part of this flow chart can represent steps of an example of a computer program which may be executed by a circuit.
Upon the reception of a plurality of images (message 900), each image of the plurality may be modified to change the perspective and/or to rotate the latter image (step 901 ).
Once all images are modified, it is possible to combine the modified images to create a panorama (step 902).
The panorama image may be cropped in order to keep only the relevant part of the image (i.e. the part of the image close to the representation of the pipeline in the panorama image, step 903).
It is possible to process the panorama image to detect anomalies/defects (step 904) for instance, according to the method described in patent application FR 2 965 616.
The panorama image may be marked according to the previous detection (step 905) to ease a future identification and verification of defects on the pipeline (for instance, to ease the visual inspection by operators/engineers).
The marks may be, for instance, vertical red lines at the location of the detected defects in the panorama image.
Finally, the final marked panorama image (message 906) may be outputted to be displayed, for instance, to the operators/engineers.
Figure 10 is a possible embodiment for a device that enables the present invention.
In this embodiment, the device 1000 comprises a computer, this computer comprising a memory 1005 to store program instructions loadable into a circuit and adapted to cause a circuit 1004 to carry out the steps of the present invention when the program instructions are run by the circuit 1004.
The memory 1005 may also store data and useful information for carrying out the steps of the present invention as described above. The circuit 1004 may be, for instance:
- a processor or a processing unit adapted to interpret instructions in a computer language, the processor or the processing unit may comprise, may be associated with or be attached to a memory comprising the instructions, or
- the association of a processor/processing unit and a memory, the processor or the processing unit being adapted to interpret instructions in a computer language, the memory comprising said instructions, or
- an electronic card wherein the steps of the invention are described within silicon, or
- a programmable electronic chip such as an FPGA chip (for "Field-Programmable Gate Array").
This computer comprises an input interface 1003 for the reception of data used for the above method according to the invention and an output interface 1006 for providing a panorama image, control navigation instructions, or update of the AUV location as described above.
To ease the interaction with the computer, a screen 1001 and a keyboard 1002 may be provided and connected to the computer circuit 1004.
A person skilled in the art will readily appreciate that various parameters disclosed in the description may be modified and that various embodiments disclosed may be combined without departing from the scope of the invention.
For instance, the description proposes embodiments with pipelines examples. Any cylindrical body may replace these pipelines.

Claims

1. A method for simplifying defect analysis on an underwater cylindrical body, wherein the method comprises:
- receiving a plurality of images (900) corresponding to an underwater region;
- for each received image (600a) and for a predetermined direction in the received images,
- determining whether a cylindrical body has a representation (601a) in the received image,
- upon positive determination:
- determining at least two contour lines associated with contours of the representation (601a) of the cylindrical body;
- determining a transformation of the latter received image to parallelize the latter determined contour lines and the predetermined direction;
- transforming (901) the latter image according to the latter determined transformation;
- combining (902) said transformed images to create a panorama image (906); and
- identifying (904, 905) possible defects on the panorama image (906).
2. A method according to claim 1, wherein the determined transformation comprises a rotation.
3. A method according to one of the preceding claims, wherein the determined transformation comprises a perspective distortion transformation.
4. A method according to one of the preceding claims, wherein the panorama image combines parts of the transformed images with minimum transformations.
5. A method according to one of the preceding claims, wherein the received images are associated with an order, the transformed images being associated with the same order, and wherein combining said transformed images to create a panorama image comprises:
- for a first image and a second image, the second image being the successor of the first image according to the order:
- computing an overlapping zone (704) in the first image according to a computed inter-correlation between the first and second images,
- modifying the first image, said modification comprising removing said overlapping zone from the first image,
- joining the first modified image and the second image into a third image.
6. A method according to one of the preceding claims, wherein the contour lines correspond to the main external contours of the representation (601a) of the cylindrical body in the latter received image.
7. A method according to one of the preceding claims, wherein the method further comprises at least one of the following steps:
- marking the panorama image at locations corresponding to the identified possible defects;
- programming a navigation route of an underwater vehicle according to locations corresponding to the identified possible defects.
8. A method according to one of the preceding claims, wherein identifying possible defects on the panorama image comprises at least one of the following computations:
- detection of pre-identified patterns in the panorama image;
- detection of variation of contrast/color in the panorama image;
- detection of variation of contrast/color on the representation of the cylindrical body in the panorama image.
9. An analyzer device for simplifying defect analysis on an underwater cylindrical body, wherein the analyzer device comprises:
- an interface (1003) for receiving a plurality of images (900) corresponding to an underwater region;
- a circuit (1004) for determining whether a cylindrical body has a representation (601a) in the received image;
- a circuit (1004) for, upon positive determination, determining at least two contour lines associated with contours of the representation (601a) of the cylindrical body;
- a circuit (1004) for, upon positive determination, determining a transformation of the received images to parallelize the determined contour lines and a predetermined direction in the received images;
- a circuit (1004) for, upon positive determination, transforming (901 ) the received images according to the latter determined transformation;
- a circuit (1004) for combining (902) said transformed images to create a panorama image (906); and
- a circuit (1004) for identifying (904, 905) possible defects on the panorama image (906).
10. A non-transitory computer readable storage medium, having stored thereon a computer program comprising program instructions, the computer program being loadable into a data-processing unit and adapted to cause the data-processing unit to carry out the steps of any of claims 1 to 8 when the computer program is run by the data-processing device.
PCT/EP2013/066949 2012-10-30 2013-08-13 A method for simplifying defect analysis WO2014067685A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201261720237P 2012-10-30 2012-10-30
US61/720,237 2012-10-30

Publications (1)

Publication Number Publication Date
WO2014067685A1 true WO2014067685A1 (en) 2014-05-08

Family

ID=48998605

Family Applications (3)

Application Number Title Priority Date Filing Date
PCT/EP2013/066949 WO2014067685A1 (en) 2012-10-30 2013-08-13 A method for simplifying defect analysis
PCT/EP2013/066948 WO2014067684A1 (en) 2012-10-30 2013-08-13 Method to enhance underwater localization
PCT/EP2013/066947 WO2014067683A1 (en) 2012-10-30 2013-08-13 A method for controlling navigation of an underwater vehicle

Family Applications After (2)

Application Number Title Priority Date Filing Date
PCT/EP2013/066948 WO2014067684A1 (en) 2012-10-30 2013-08-13 Method to enhance underwater localization
PCT/EP2013/066947 WO2014067683A1 (en) 2012-10-30 2013-08-13 A method for controlling navigation of an underwater vehicle

Country Status (1)

Country Link
WO (3) WO2014067685A1 (en)


Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
FR3025346A1 (en) * 2014-08-26 2016-03-04 Centre Nat Rech Scient AUTOMATIC METHOD FOR IDENTIFYING A SHADOW CAST BY A REAL TARGET IN A TWO-DIMENSIONAL SONAR IMAGE
NO342795B1 (en) * 2016-07-28 2018-08-06 4Subsea As Method for detecting position and orientation of a subsea structure using an ROV
CN109976384B (en) * 2019-03-13 2022-02-08 厦门理工学院 Autonomous underwater robot and path following control method and device
CN116452513B (en) * 2023-03-20 2023-11-21 山东未来智能技术有限公司 Automatic identification method for corrugated aluminum sheath defects of submarine cable

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20060152589A1 (en) * 2002-09-25 2006-07-13 Steven Morrison Imaging and measurement system
FR2965616A1 (en) 2010-10-01 2012-04-06 Total Sa METHOD OF IMAGING A LONGITUDINAL DRIVE

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TW268099B (en) * 1994-05-02 1996-01-11 General Electric Co


Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
GRACIAS N R ET AL: "Trajectory reconstruction with uncertainty estimation using mosaic registration", ROBOTICS AND AUTONOMOUS SYSTEMS, ELSEVIER SCIENCE PUBLISHERS, AMSTERDAM, NL, vol. 35, no. 3-4, 30 June 2001 (2001-06-30), pages 163 - 177, XP004245253, ISSN: 0921-8890, DOI: 10.1016/S0921-8890(01)00120-8 *
ZINGARETTI P ET AL: "Robust real-time detection of an underwater pipeline", ENGINEERING APPLICATIONS OF ARTIFICIAL INTELLIGENCE, PINERIDGE PRESS, SWANSEA, GB, vol. 11, no. 2, 1 April 1998 (1998-04-01), pages 257 - 268, XP027087572, ISSN: 0952-1976, [retrieved on 19980401] *

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110533650A (en) * 2019-08-28 2019-12-03 哈尔滨工程大学 A vision-based AUV submarine pipeline detection and tracking method
CN110533650B (en) * 2019-08-28 2022-12-13 哈尔滨工程大学 AUV underwater pipeline detection tracking method based on vision
CN113269720A (en) * 2021-04-16 2021-08-17 张家港华程机车精密制管有限公司 Defect detection method and system for straight welded pipe and readable medium
CN113269720B (en) * 2021-04-16 2024-02-02 张家港华程机车精密制管有限公司 Defect detection method, system and readable medium for straight welded pipe
CN115932864A (en) * 2023-02-24 2023-04-07 深圳市博铭维技术股份有限公司 Pipeline defect detection method and pipeline defect detection device

Also Published As

Publication number Publication date
WO2014067683A1 (en) 2014-05-08
WO2014067684A1 (en) 2014-05-08

Similar Documents

Publication Publication Date Title
WO2014067685A1 (en) A method for simplifying defect analysis
KR102583989B1 (en) Automated image labeling for vehicles based on maps
CN111797650B (en) Obstacle identification method, obstacle identification device, computer equipment and storage medium
US10268201B2 (en) Vehicle automated parking system and method
US10496762B2 (en) Model generating device, position and orientation calculating device, and handling robot device
US11348263B2 (en) Training method for detecting vanishing point and method and apparatus for detecting vanishing point
US10726616B2 (en) System and method for processing captured images
JP2018060296A (en) Image processing apparatus, image processing system, and image processing method
CN109544629A (en) Camera pose determines method and apparatus and electronic equipment
US10976734B2 (en) Augmented reality (AR) display of pipe inspection data
EP4210002A1 (en) Pose estimation refinement for aerial refueling
CN106296646A (en) The tolerance correcting unit of AVM system and method thereof
US11645773B2 (en) Method for acquiring distance from moving body to at least one object located in any direction of moving body by performing near region sensing and image processing device using the same
Shah et al. Condition assessment of ship structure using robot assisted 3D-reconstruction
CN114359865A (en) Obstacle detection method and related device
KR102174035B1 (en) Object inspection method using an augmented-reality
EP3985609A1 (en) Positioning system and method for determining the three-dimensional position of a movable object
CN113701633A (en) Position and posture monitoring equipment of development machine
CN113487668B (en) Learnable cylindrical surface back projection method with unlimited radius
Bodenmann et al. Visual mapping of internal pipe walls using sparse features for application on board Autonomous Underwater Vehicles
WO2023007551A9 (en) Image processing device and computer-readable storage medium
KR102641506B1 (en) System and method for establishing artificial intelligence-based 3d digital damage model of narrow space using indoor inspection drone
CN112990003B (en) Image sequence repositioning judging method, device and computer equipment
CN112130550B (en) Road image processing method and device, electronic equipment and storage medium
US11348280B2 (en) Method and computer readable medium for pose estimation

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 13750314

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 13750314

Country of ref document: EP

Kind code of ref document: A1