WO2014067683A1 - A method for controlling navigation of an underwater vehicle - Google Patents
- Publication number
- WO2014067683A1 (PCT/EP2013/066947)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image data
- pipeline
- navigation
- zone
- auv
- Prior art date
Classifications
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/04—Control of altitude or depth
- G05D1/06—Rate of change of altitude or depth
- G05D1/0692—Rate of change of altitude or depth specially adapted for under-water vehicles
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T3/00—Geometric image transformations in the plane of the image
- G06T3/40—Scaling of whole images or parts thereof, e.g. expanding or contracting
- G06T3/4038—Image mosaicing, e.g. composing plane images from plane sub-images
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/30—Determination of transform parameters for the alignment of images, i.e. image registration
- G06T7/33—Determination of transform parameters for the alignment of images, i.e. image registration using feature-based methods
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10004—Still image; Photographic image
Definitions
- the present invention relates to navigation control and more specifically to navigation control of an underwater vehicle during a pipeline (or other cylindrical body) survey.
- An AUV ("Autonomous Underwater Vehicle") is a powerful tool to carry out subsea mapping and geotechnical and environmental surveys in deep water.
- An AUV needs only minimal support from a surface vessel to carry out a survey. Therefore, an underwater pipeline may be surveyed much faster, with minimal human intervention during operations.
- The conventional method is to visually inspect the pipeline with an ROV ("Remotely Operated Vehicle").
- A standard video format is used to transmit feedback information to the engineer/operator at the surface.
- The engineer/operator may thus detect subsea anomalies. Videos are stored in surface storage devices.
- For an AUV, no high-bandwidth data link (i.e. a direct communication link such as a wire) to the surface is available. Moreover, for safety reasons, an AUV cannot be very close to or in contact with subsea pipelines: the AUV is required to fly over them.
- the underwater vehicle is autonomous: its survey path is pre-configured in its memory.
- the exact position of the AUV may be inaccurate.
- the pipeline to survey may have moved and its position is uncertain.
- the survey path of the AUV is often configured to be distant from the expected position of the pipeline (safety margin).
- the invention relates to a method for controlling navigation of an underwater vehicle.
- the vehicle has navigation parameters.
- the method comprises:
- receiving image data may comprise acquiring data of an underwater region with a sonar and/or a camera.
- Sonar device or camera device are simple to use and to adapt on an AUV. Moreover, these devices are often already installed on the underwater vehicle.
- A cylindrical body may be, for instance, a pipeline, a cable (e.g. a mechanical, electrical, power or hydraulic cable), a line (e.g. a supply line) or a riser (e.g. a riser bundle, a single hybrid riser, etc.).
- modifying the navigation parameters may comprise:
- Modifying the navigation parameters may also comprise:
- the computation is optimized to determine the correct modification of the navigation parameters in view of the location of the representation of the cylindrical body in the image data.
- The control pattern may comprise a control zone.
- The computed deviation may then be a function of the intersection of the determined zone with the control zone.
- The modification of the navigation parameters may aim to superpose the control zone and the determined zone in the image data.
- The control pattern may comprise control points.
- The computed deviation may then be a function of the distances between the control points and a hull of the determined zone.
- The hull may be a convex hull.
- The control pattern may also comprise at least two segments.
- The computed deviation may then be a function of at least four distances, each distance being a distance between a hull of the determined zone and an end of one of the segments. For instance, these segments may be positioned at the border of the image data to set constraints at the edge of the available data.
- The control pattern may also comprise at least two control lines.
- The computed deviation may then be a function of at least two deviation angles, each deviation angle being an angle between one of the control lines and a hull of the determined zone.
- The two control lines may be expected to correspond to the contour lines of the representation of the cylindrical body.
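As an illustration of the deviation computations above, the following Python sketch (not part of the patent text; the geometry helpers and the summed-distance criterion are one possible choice) measures how far a set of control points lies from two detected contour lines:

```python
import math

def point_line_distance(p, a, b):
    """Perpendicular distance from point p to the infinite line through a and b."""
    (px, py), (ax, ay), (bx, by) = p, a, b
    num = abs((by - ay) * px - (bx - ax) * py + bx * ay - by * ax)
    return num / math.hypot(bx - ax, by - ay)

def deviation(control_points, contour_a, contour_b):
    """Sum of distances from each control point to the nearest detected
    contour line; zero when every control point lies on a contour line."""
    return sum(min(point_line_distance(p, *contour_a),
                   point_line_distance(p, *contour_b))
               for p in control_points)
```

A deviation of zero then corresponds to a "validated" pattern; modifying the navigation parameters amounts to driving this value down.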
- the method further may comprise a modification of the received image data based on lens characteristics.
- lens may induce distortions in the image data.
- the method may further comprise:
- The period may be a time period. It may also be a number of captured images: if image data is captured at regular intervals, this period can be expressed as a number of captured images.
- The period may represent a period during which no cylindrical body has been detected in the image data. It may indicate that the underwater vehicle is lost.
- this fallback position may be a surface position, the start point or any position where a cylindrical body is expected to be detected.
- the method may also further comprise:
- the positive location comprising the location of the vehicle when the positive determination occurs.
- the fallback location may be a recorded positive location.
- the determined period may be maximum.
- the navigation parameters may comprise a navigation direction.
- a second aspect of the invention relates to a controller for controlling navigation of an underwater vehicle, the vehicle having navigation parameters, wherein the controller comprises:
- a third aspect relates to a computer program product comprising a computer readable medium, having thereon a computer program comprising program instructions.
- the computer program is loadable into a data-processing unit and adapted to cause the data-processing unit to carry out the method described above when the computer program is run by the data-processing unit.
- Figure 1 is a representation of an AUV in survey mode along a pipeline according to a possible embodiment of the invention
- Figures 2a to 2e are illustrations of images taken by an AUV during a survey according to a possible embodiment of the invention
- Figure 3 is a flow chart describing a possible embodiment for controlling navigation of a subsea vehicle according to a possible embodiment of the invention
- Figure 4 is an illustration of detection of underwater features according to a possible embodiment of the invention.
- Figure 5 is a flow chart describing a possible embodiment for improving localization of an underwater vehicle according to a possible embodiment of the invention
- Figure 6a is an illustration of a sample image taken by an AUV during a survey according to a possible embodiment of the invention
- Figure 6b is an illustration of a possible deformation of a sample image taken by an AUV during a survey according to a possible embodiment of the invention
- Figure 7 is an illustration of a possible combination of sample images taken by an AUV during a survey according to a possible embodiment of the invention.
- FIG. 8 is an illustration of possible defect detection in a panorama image according to a possible embodiment of the invention
- FIG. 9 is a flow chart describing a possible embodiment for simplifying defect recognition according to a possible embodiment of the invention
- FIG. 10 is a possible embodiment for a device that enables the present invention.
- Figure 1 is a representation of an AUV in survey mode along a pipeline according to a possible embodiment of the invention.
- An AUV ("Autonomous Underwater Vehicle") is a subsea vehicle that is not directly controlled from the surface.
- The AUV 102 may be used to ensure that there is no problem on subsea pipelines, such as the pipeline 101 in Figure 1.
- The AUV 102 follows the path of the pipeline 101.
- The navigation module of the AUV controls the AUV so that it is translated along this direction.
- the distance d between the AUV 102 and the pipeline 101 is greater than a predetermined safety distance to avoid any collisions.
- the AUV 102 may comprise capture means 103 (such as a camera, a video camera, a sonar, etc.) in order to survey the pipeline and provide information and data to the engineers.
- the capture means 103 may, for instance, be able to capture visual information close to the pipeline within a predetermined area 104.
- Figures 2a to 2e are illustrations of images taken by an AUV during a survey according to a possible embodiment of the invention.
- the camera may create images 200 (or set of data) representing the seabed and comprising the pipeline 204 that is expected to be surveyed.
- The determination of the relative location of the AUV in space is even more accurate (i.e. the orientation of the AUV compared to the orientation of the pipeline) if two contour lines (210 and 211) are determined.
- This determination may use image-processing techniques such as contour detection. If the image is defined as a set of pixels with an amplitude or colour for each pixel, the detection may be done by searching the image for the two lines that maximize the variation of image amplitude orthogonally to the lines. An optimization process may be used to find the two best lines in the image verifying the above criterion.
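The line search described above can be sketched in Python for the simple case of a roughly horizontal pipe in a grayscale image (an illustrative brute-force version, not the patent's optimization process): each candidate row boundary is scored by the total amplitude variation across it, and the two best-scoring boundaries are taken as the contour lines.

```python
def row_edge_scores(img):
    """Score each horizontal boundary between rows r and r+1 by the total
    amplitude variation across it (img is a 2-D list of pixel amplitudes)."""
    width = len(img[0])
    return [sum(abs(img[r + 1][c] - img[r][c]) for c in range(width))
            for r in range(len(img) - 1)]

def detect_two_contours(img):
    """Return the two row boundaries with the highest scores, taken as the
    two contour lines of a roughly horizontal pipe."""
    scores = row_edge_scores(img)
    best_two = sorted(range(len(scores)), key=lambda r: scores[r])[-2:]
    return sorted(best_two)
```

A real implementation would search over arbitrary line orientations (e.g. a Hough-style parameter search) rather than horizontal boundaries only.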
- Once the relative location of the AUV from the pipeline is determined (distance and orientation), it is possible to modify the navigation path of the AUV to bring the AUV to a specific distance from the pipeline (e.g. 3 meters from the pipeline) and with a specific relative orientation (e.g. parallel to the pipeline).
- control pattern may be defined in the AUV configuration settings.
- the AUV is well localized.
- Observed differences may be used to correct the location of the AUV. Knowing the mathematical model of the camera, the pipeline location, etc., it is possible to compute the displacement between the estimate and the real location: the true location of the AUV can then be estimated.
- This pattern may consist of a zone of the captured image 200 where the pipeline (or its representation through the determined contour lines) should remain. There is a huge number of possible ways to define such a "control pattern".
- This pattern may consist of a set of points defining a polygon (e.g. points 220, 221, 222, 224 and 223), and the representation of the pipeline should fit in this polygon.
- the contour line 210 is to go through the point 220 of segment 201 and through the point 223 of segment 203,
- the contour line 211 is to go through the point 222 of segment 202 and through the point 224 of segment 203. If this pattern is validated (as represented in Figure 2a), the AUV is assumed to be at a correct distance and to have a correct orientation with regard to the pipeline.
- the pattern may be "not validated".
- the contour line 210 goes through the point 220r (which is above the point 220) and through the point 223r (which is to the right of the point 223),
- the contour line 211 goes through the point 222r (which is to the left of the point 222) and through the point 224r (which is below and to the right of the point 224).
- The representation of the pipeline (i.e. its detected contour lines) in the picture 200 is to be rotated in an anti-clockwise direction, with the rotation centered on the point 225, in order to "validate" the pattern.
- To do so, the AUV may be rotated in a clockwise direction about the axis z (assuming that the pipeline is on the seabed, which defines the plane (x, y)); e.g. the direction of the AUV is modified by the AUV navigation module to turn slightly right.
- the contour line 211 goes through the point 222r (which is to the left of the point 222) and through the point 224.
- the segment 203 is locally validated but the segments 201 and 202 are not validated.
- The AUV may be moved in the direction y in order to bring the pipeline closer to the AUV (i.e. to zoom the representation of the pipeline into the bottom-left corner of the image). It may also be useful to slightly rotate the AUV in an anti-clockwise direction about the axis y.
- the contour line 210 goes through the point 223r (which is to the right of the point 223) and through the point 220,
- the contour line 211 goes through the point 222r (which is to the left of the point 222) and through the point 224.
- the segment 203 is not validated but the segments 201 and 202 are locally validated.
- the contour line 210 goes through the point 223r (which is to the left of the point 223) and through the point 220r (which is above the point 220),
- the contour line 211 goes through the point 224r (which is below and to the right of the point 224) and through the point 222r (which is to the right of the point 222).
- It may be useful to move the AUV away from the pipeline (e.g. to move the AUV in the direction -y).
- navigation instructions are sent to the navigation module of the AUV to modify the navigation parameters of the AUV.
- Figure 3 is a flow chart describing a possible embodiment for controlling navigation of a subsea vehicle according to a possible embodiment of the invention.
- Part of this flow chart can represent steps of an example of a computer program which may be executed by a circuit, computer or a computing device.
- data 300 e.g. a 2D-array of pixel values, an image, etc.
- a plurality of methods is possible in order to determine whether a given feature is present in an image. For instance, this determination may use contour detection or pattern recognition algorithms in conjunction with a database 302 with stored pattern signatures.
- If no pipeline is detected in the current image, the AUV is considered "temporarily lost" (output KO of test 310). If no pipeline is detected during a predetermined period of time (for instance 1 min) or after a predetermined number of received images (for instance 10 images), the AUV is considered "lost" (output OK of test 310). Thus, the AUV is configured to go back to a location where a pipeline has previously been detected (message 308) or to a predetermined fallback location.
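A minimal sketch of this "temporarily lost / lost" logic (Python, illustrative only; the class, state names and the threshold of 10 images are assumptions built from the example values above):

```python
class LostTracker:
    """Track consecutive images without a pipeline detection and decide when
    the AUV should head to a fallback location (the threshold of 10 images
    is the example value from the description)."""

    def __init__(self, max_misses=10):
        self.max_misses = max_misses
        self.misses = 0
        self.last_positive = None  # vehicle location at the last detection

    def update(self, detected, location):
        if detected:
            self.misses = 0
            self.last_positive = location  # record the positive location
            return "FOLLOW"
        self.misses += 1
        if self.misses >= self.max_misses:
            # lost: go back to the last positive location or a preset fallback
            return "GO_TO_FALLBACK"
        return "TEMPORARILY_LOST"
```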
- The contour lines of the pipeline are detected (step 303), and the contour lines may be compared to a predetermined "control pattern" stored in a memory 305 of the AUV in order to determine whether the pipeline representation "validates" (see above) this pattern.
- the memory 305 and the memory 302 may be the same memory. If the contour lines do not "validate" this pattern (output KO of test 306), a modification of the navigation parameters (rotations, translations, etc.) may be computed (step 307) and a message 308 may be sent to the navigation module of the AUV to control the AUV survey path.
- Figure 4 is an illustration of detection of underwater features according to a possible embodiment of the invention.
- The AUV 402 may be able to detect features attached to the pipeline with detection means 403 (such as a camera, a sonar, a multi-beam sonar, etc.).
- The detection may use character recognition algorithms, pattern recognition algorithms or others. In order to enhance the detection of the underwater features, it is possible:
- These numbers or letters may represent an encoded real location (for instance in signed degrees format, in a DMS + compass direction format, in a degrees minutes seconds format, etc.) or other;
- a flange 401 that is used to attach two parts of the pipeline together (detected for instance with a pattern recognition algorithm);
- Figure 5 is a flow chart describing a possible embodiment for improving localization of an underwater vehicle according to a possible embodiment of the invention.
- Part of this flow chart can represent steps of an example of a computer program which may be executed by a circuit, computer or a computing device.
- Upon receiving data (message 500) representing an underwater region (for instance, a picture taken by a camera or a video recorder, rows of values taken by a sonar, etc.), it is possible to process the data to identify (step 501) underwater features as described above.
- the identification of the underwater features may be performed on only part(s) of the received data: for instance, features may be searched in a bottom-left corner of the received image 500 or any other specific subset of the data.
- a comparison may be performed to find a correspondence (e.g. a signature match) among a plurality of stored features in a database 503.
- The stored features may have been stored in association with a real location.
- Alternatively, the detected feature in the data may directly describe a real location, i.e. without the need of an external database (for instance, a sticker with real coordinates written on it). If no correspondence is found in the database 503 (test 504, output KO), no action is performed (step 505).
- The real location associated with the correspondence in the database 503 is used to update (step 506) the computed location 507 of the AUV.
- If a plurality of correspondences is found in the database 503 (test 504, output OK2), it is possible to select (step 508) one correspondence among the plurality of correspondences.
- The selected correspondence may be the one for which the distance between its associated real location and the current computed location 507 of the AUV is minimal. For instance, when a survey is performed on a pipeline, several flanges/anodes may have the same signature, and then several correspondences may be found in the database matching an underwater feature. This selection algorithm assumes that the most probable detected feature is the closest matching feature (i.e. the one with the shortest distance).
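This closest-match selection can be sketched as follows (Python, illustrative; the `(feature_id, location)` tuple structure is an assumption, not the patent's data model):

```python
import math

def select_correspondence(matches, auv_location):
    """matches: list of (feature_id, (x, y)) entries found in the database.
    Select the correspondence whose stored real location is closest to the
    AUV's current computed location."""
    return min(matches, key=lambda m: math.dist(m[1], auv_location))
```

For example, if two flanges share the same signature, the one nearest the AUV's computed location is kept and used to update that location.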
- Figure 6a is an illustration of a sample image 600a taken by an AUV during a survey according to a possible embodiment of the invention.
- The image 600a comprises the representation of a pipeline 601a with a flange 602a and two perpendicular pipeline valves 603a and 604a. It is noted that the representation of the pipeline 601a has a perspective effect: the two contour lines of the pipeline (which are normally parallel) cross at a vanishing point (outside image 600a).
- Figure 6b is an illustration of a possible deformation of a sample image taken by an AUV during a survey according to a possible embodiment of the invention.
- This deformation may comprise a perspective correction or perspective transformation (i.e. to set the contour lines parallel) and a rotation (i.e. to set the contour lines horizontal).
- Objects of the non-transformed image 600a (i.e. elements 601a, 602a, 603a, 604a) have counterparts in the transformed image 600b (i.e. elements 601b, 602b, 603b, 604b).
- Figure 7 is an illustration of a possible combination of sample images taken by an AUV during a survey according to a possible embodiment of the invention.
- an AUV may capture a plurality of images along the pipeline.
- The pipe location is not stable between pairs of images.
- a correction (see above) is applied on the image so that the pipe becomes horizontal with a constant width on the image.
- The transformation may comprise a simple morphing that maps the two detected lines (the edges of the pipe) into two parallel, horizontal lines. After the transformation of these images (701, 702, etc.), it is possible to combine the transformed images to create a panorama image (or mosaic image).
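The morphing step can be sketched as a per-column resampling (Python, illustrative; nearest-neighbour sampling and the fixed output height are assumed simplifications): each column of the image is stretched so that the two detected edges land on fixed horizontal rows.

```python
def straighten(img, top_edge, bot_edge, out_h=10):
    """img: 2-D list of pixel values; top_edge/bot_edge: per-column row
    positions of the two detected pipe contours. Resample each column so
    that the pipe spans rows 0..out_h-1, i.e. both contours become
    horizontal lines of constant width."""
    width = len(img[0])
    out = [[0] * width for _ in range(out_h)]
    for x in range(width):
        for r in range(out_h):
            # nearest-neighbour sample between the two edges at this column
            src = top_edge[x] + (bot_edge[x] - top_edge[x]) * r / (out_h - 1)
            out[r][x] = img[round(src)][x]
    return out
```

A production version would instead apply a full perspective transformation (homography) estimated from the contour lines.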
- The mosaic image may be created with the following process: a/ store the n corrected images in a memory buffer; b/ for the first two successive corrected images (e.g. 701 and 702), analyze these images by detecting the inter-correlation between the two images; an overlapping zone (e.g. 704) is thus estimated, and the two images are then flattened into a single image; c/ store the flattened image in the buffer, replacing the two successive corrected images at the first location in the buffer; d/ if the buffer comprises more than one image, steps b/ and c/ are reapplied to obtain the complete mosaic of the pipe 703.
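The buffer-based flattening loop of steps a/ to d/ can be sketched as follows (Python, illustrative; for brevity each "image" is reduced to a 1-D strip of values, and the inter-correlation is replaced by a least-mismatch overlap search):

```python
def best_overlap(a, b, min_ov=1):
    """Estimate the overlap between strip a's tail and strip b's head by
    testing every candidate overlap and keeping the least mismatched one
    (a stand-in for the inter-correlation of step b/)."""
    best, best_err = min_ov, float("inf")
    for ov in range(min_ov, min(len(a), len(b)) + 1):
        err = sum((x - y) ** 2 for x, y in zip(a[-ov:], b[:ov]))
        if err < best_err:
            best, best_err = ov, err
    return best

def stitch(a, b):
    """Flatten two overlapping strips into one (step b/)."""
    return a + b[best_overlap(a, b):]

def mosaic(strips):
    """Steps c/ and d/: repeatedly flatten the first two strips in the
    buffer until a single panorama remains."""
    buf = list(strips)
    while len(buf) > 1:
        buf[:2] = [stitch(buf[0], buf[1])]
    return buf[0]
```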
- Figure 8 is an illustration of possible defect detection method in a panorama image according to a possible embodiment of the invention.
- A possible method for detecting such defects may consist of:
- The zone 801a of the extracted part of the panorama image 800a corresponds to the zone 801b in the graphic, where the contrast variation value (CVV) is below 190.
- The zone 802a of the extracted part of the panorama image 800a corresponds to the zone 802b in the graphic, where contrast variation values are below 190. It may thus be possible to detect defects in these zones.
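The thresholding that maps low contrast variation values to candidate defect zones can be sketched as (Python, illustrative; the threshold 190 is the value quoted above, and the one-value-per-column representation is an assumption):

```python
def detect_defect_zones(cvv, threshold=190):
    """cvv: contrast variation value per column of the panorama strip.
    Return (start, end) column ranges where the CVV drops below the
    threshold, i.e. the candidate defect zones."""
    zones, start = [], None
    for x, value in enumerate(cvv):
        if value < threshold and start is None:
            start = x                      # entering a low-contrast zone
        elif value >= threshold and start is not None:
            zones.append((start, x))       # leaving the zone
            start = None
    if start is not None:
        zones.append((start, len(cvv)))
    return zones
```

The returned column ranges are the places where marks (e.g. vertical red lines) would be drawn on the panorama image.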
- Figure 9 is a flow chart describing a possible embodiment for simplifying defect recognition according to a possible embodiment of the invention.
- Part of this flow chart can represent steps of an example of a computer program which may be executed by a circuit.
- each image of the plurality may be modified to change the perspective and/or to rotate the latter image (step 901 ).
- The panorama image may be cropped in order to keep only the relevant part of the image (i.e. the part of the image close to the representation of the pipeline in the panorama image, step 903). It is possible to process the panorama image to detect anomalies/defects (step 904), for instance according to the method described in patent application FR 2 965 616.
- the panorama image may be marked according to the previous detection (step 905) to ease a future identification and verification of defects on the pipeline (for instance, to ease the visual inspection by operators/engineers).
- the marks may be, for instance, vertical red lines at the location of the detected defects in the panorama image.
- the final marked panorama image (message 906) may be outputted to be displayed, for instance, to the operators/engineers.
- Figure 10 is a possible embodiment for a device that enables the present invention.
- The device 1000 comprises a computer, this computer comprising a memory 1005 to store program instructions loadable into a circuit and adapted to cause the circuit 1004 to carry out the steps of the present invention when the program instructions are run by the circuit 1004.
- The memory 1005 may also store data and useful information for carrying out the steps of the present invention as described above.
- The circuit 1004 may be, for instance:
- a processor or a processing unit adapted to interpret instructions in a computer language, where the processor or the processing unit may comprise, be associated with or be attached to a memory comprising the instructions, or
- the association of a processor/processing unit and a memory, the processor or the processing unit being adapted to interpret instructions in a computer language, the memory comprising said instructions, or
- This computer comprises an input interface 1003 for the reception of data used for the above method according to the invention, and an output interface 1006 for providing a panorama image, navigation control instructions, or an update of the AUV location as described above.
- a screen 1001 and a keyboard 1002 may be provided and connected to the computer circuit 1004.
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Aviation & Aerospace Engineering (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- Automation & Control Theory (AREA)
- Image Processing (AREA)
Abstract
The present invention relates to a method for controlling navigation of an underwater vehicle, the vehicle having navigation parameters. The method comprises receiving image data representing an underwater region and determining whether a cylindrical body has a representation in the received image data. Moreover, upon positive determination, the method further comprises determining (303) a zone of the image data corresponding to the representation of the cylindrical body and modifying the navigation parameters, said modification being a function of said determined zone.
Description
A METHOD FOR CONTROLLING NAVIGATION OF AN UNDERWATER VEHICLE
BACKGROUND OF THE INVENTION
The present invention relates to navigation control and more specifically to navigation control of an underwater vehicle during a pipeline (or other cylindrical body) survey.
The approaches described in this section could be pursued, but are not necessarily approaches that have been previously conceived or pursued. Therefore, unless otherwise indicated herein, the approaches described in this section are not prior art to the claims in this application and are not admitted to be prior art by inclusion in this section. Furthermore, all embodiments are not necessarily intended to solve all or even any of the problems brought forward in this section.
An AUV ("Autonomous Underwater Vehicle") is a powerful tool to carry out subsea mapping and geotechnical and environmental surveys in deep water.
An AUV needs only minimal support from a surface vessel to carry out a survey. Therefore, an underwater pipeline may be surveyed much faster, with minimal human intervention during operations.
To survey an underwater pipeline, the conventional method is to visually inspect it with an ROV ("Remotely Operated Vehicle"). A standard video format is used to transmit feedback information to the engineer/operator at the surface. Thus, the engineer/operator may detect subsea anomalies. Videos are stored in surface storage devices.
It is also possible to post-process the stored video by replaying the video and identifying pipeline anomalies/defects and any other features manually.
For an AUV, no high-bandwidth data link (i.e. a direct communication link such as a wire) to the surface is available.
Moreover, for safety reasons, an AUV cannot be very close to or in contact with subsea pipelines: the AUV is required to fly over them.
During surveying operations, the underwater vehicle is autonomous: its survey path is pre-configured in its memory.
Nevertheless, the exact position of the AUV may be inaccurate. Moreover, the pipeline to survey may have moved and its position is uncertain.
Therefore, for all the above reasons and for safety reasons, the survey path of the AUV is often configured to be distant from the expected position of the pipeline (safety margin).
Nevertheless, if the AUV is distant from the pipeline, it may be difficult to detect small defects on the pipeline structure.
There is thus a need for improving the navigation control of the AUV during an underwater pipeline survey.
SUMMARY OF THE INVENTION
The invention relates to a method for controlling navigation of an underwater vehicle. The vehicle has navigation parameters.
The method comprises:
- receiving image data representing an underwater region;
- determining whether a cylindrical body has a representation in the received image data,
- upon positive determination:
- determining a zone of the image data corresponding to the representation of the cylindrical body;
- modifying the navigation parameters, said modification being a function of said determined zone.
Therefore, it is possible to control the navigation of the underwater vehicle in order to follow and survey the path of the cylindrical body.
In a possible embodiment, receiving image data may comprise acquiring data of an underwater region with a sonar and/or a camera.
Sonar or camera devices are simple to use and to adapt on an AUV. Moreover, these devices are often already installed on the underwater vehicle.
A cylindrical body may be, for instance, a pipeline, a cable (e.g. a mechanical, electrical, power or hydraulic cable), a line (e.g. a supply line) or a riser (e.g. a riser bundle, a single hybrid riser, etc.).
In addition, modifying the navigation parameters may comprise:
- determining a relative position of the underwater vehicle compared to the cylindrical body, said determination being based on cylindrical body characteristics;
- modifying the navigation parameters to modify the relative position of the underwater vehicle to a reference relative position.
Indeed, it is possible to determine, by analyzing the representation of the cylindrical body in the received image data:
- whether the underwater vehicle is "too close" to the cylindrical body,
- whether its course is "not parallel" with the cylindrical body path,
- etc.
For instance, if the diameter of the cylindrical body is known or determinable, and if the orientation of the capture means (from which the image data is received) relative to the underwater vehicle is known or determinable, basic geometrical computations may be sufficient to determine the orientation of the underwater vehicle relative to the cylindrical body.
It is then possible to modify the navigation parameters (i.e. to control the navigation) of the underwater vehicle to bring the vehicle to a desired orientation.
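As a purely illustrative sketch (not part of the claimed method), the "basic geometrical computation" for the range can be expressed with a pinhole-camera model; the function name and the focal-length parameter are assumptions for illustration:

```python
def estimate_distance(real_diameter_m, apparent_diameter_px, focal_length_px):
    """Pinhole-camera estimate of the range to a cylindrical body.

    Assumes the cylinder axis is roughly perpendicular to the optical
    axis, so its apparent width in pixels scales as 1/distance.
    """
    if apparent_diameter_px <= 0:
        raise ValueError("apparent diameter must be positive")
    return real_diameter_m * focal_length_px / apparent_diameter_px

# A 0.5 m pipeline spanning 100 px with an 800 px focal length
# is roughly 4 m away.
distance = estimate_distance(0.5, 100.0, 800.0)
```

A comparable relation, combined with the camera orientation, would give the relative bearing of the vehicle.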
Modifying the navigation parameters may also comprise:
- computing a deviation between the determined zone and a control pattern;
- modifying the navigation parameters of the underwater vehicle to reduce the deviation.
In such an embodiment, the computation is optimized to determine the correct modification of the navigation parameters in view of the location of the representation of the cylindrical body in the image data.
If the representation of the cylindrical body is located at the "expected" position in the image data, no modification of the navigation is needed and the underwater vehicle may continue on its current path.
If the representation of the cylindrical body is not located at the "expected" position in the image data, the navigation parameters need to be modified to relocate the representation at an "expected" position in the image data.
In a possible embodiment, the control pattern may comprise a control zone. The computed deviation may then be a function of the intersection of the determined zone with the control zone.
For instance, the modification of the navigation parameters may seek to superimpose the control zone and the determined zone in the image data.
In addition, the control pattern may comprise control points. The computed deviation may then be a function of distances between the control points and a hull of the determined zone.
The hull may be a convex hull.
The control pattern may also comprise at least two segments. The computed deviation may then be a function of at least four distances, each distance being a distance between a hull of the determined zone and an end of one of the segments.
For instance, these segments are positioned at the border of the image data to set constraints at the edge of available data.
The control pattern may also comprise at least two control lines. The computed deviation may then be a function of at least two deviation angles, each deviation angle being an angle between one of the control lines and a hull of the determined zone.
The two control lines may be expected to correspond to the contour lines of the representation of the cylindrical body.
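The control-point variant above can be sketched with standard geometry: the zone's convex hull is computed, and the deviation is the sum of the distances from each control point to the hull boundary. This is a hedged illustration, not the patent's reference implementation:

```python
import math

def convex_hull(points):
    """Andrew's monotone-chain convex hull; returns the hull vertices."""
    pts = sorted(set(points))
    if len(pts) <= 2:
        return pts
    def cross(o, a, b):
        return (a[0]-o[0])*(b[1]-o[1]) - (a[1]-o[1])*(b[0]-o[0])
    lower, upper = [], []
    for p in pts:
        while len(lower) >= 2 and cross(lower[-2], lower[-1], p) <= 0:
            lower.pop()
        lower.append(p)
    for p in reversed(pts):
        while len(upper) >= 2 and cross(upper[-2], upper[-1], p) <= 0:
            upper.pop()
        upper.append(p)
    return lower[:-1] + upper[:-1]

def point_segment_distance(p, a, b):
    """Euclidean distance from point p to the segment [a, b]."""
    ax, ay = a; bx, by = b; px, py = p
    dx, dy = bx - ax, by - ay
    if dx == 0 and dy == 0:
        return math.hypot(px - ax, py - ay)
    t = max(0.0, min(1.0, ((px-ax)*dx + (py-ay)*dy) / (dx*dx + dy*dy)))
    return math.hypot(px - (ax + t*dx), py - (ay + t*dy))

def deviation(control_points, zone_points):
    """Sum of distances from each control point to the hull boundary."""
    hull = convex_hull(zone_points)
    edges = [(hull[i], hull[(i + 1) % len(hull)]) for i in range(len(hull))]
    return sum(min(point_segment_distance(p, a, b) for a, b in edges)
               for p in control_points)
```

A zero deviation means every control point lies on the hull of the detected zone, i.e. the pattern is "validated".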
In a possible embodiment, the image data being acquired through a lens, the method may further comprise a modification of the received image data based on lens characteristics.
Indeed, a lens may induce distortions in the image data.
In addition, the method may further comprise:
- upon negative determination:
- determining a period starting on or after a last positive determination and ending on or before the negative determination,
- if the period is greater than a predetermined value, modifying the navigation parameters to navigate to a fallback location.
The period may be a time period. This period may also be a number of captured images: if image data is captured at regular intervals, it is possible to express this period as a "number of image data captured".
The period may represent a period during which no cylindrical body has been detected in the image data. It may indicate that the underwater vehicle is lost.
Therefore, it is advantageous to anticipate this case and determine a fallback position to go back to if such a situation occurs. For instance, this fallback position may be a surface position, the start point or any position where a cylindrical body is expected to be detected.
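The lost-vehicle fallback logic described above could be sketched as a simple watchdog counting consecutive negative determinations; the class and parameter names are hypothetical:

```python
class LostWatchdog:
    """Tracks consecutive frames without a cylindrical-body detection.

    After more than `max_misses` consecutive negative determinations,
    the AUV should navigate to the fallback location (e.g. the last
    position where the pipeline was seen, or a surface position).
    """
    def __init__(self, max_misses, fallback):
        self.max_misses = max_misses
        self.fallback = fallback
        self.misses = 0

    def update(self, detected, current_position=None):
        """Return a fallback position if the vehicle is 'lost', else None."""
        if detected:
            self.misses = 0
            if current_position is not None:
                # record the positive location as the new fallback
                self.fallback = current_position
            return None
        self.misses += 1
        if self.misses > self.max_misses:
            return self.fallback
        return None
```

Counting frames rather than seconds matches the "number of image data captured" variant of the period.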
The method may also further comprise:
- upon positive determination:
- recording a positive location, the positive location comprising the location of the vehicle when the positive determination occurs.
Thus, the fallback location may be a recorded positive location.
The determined period may be maximum, i.e. starting at the last positive determination and ending at the negative determination.
In a possible embodiment, the navigation parameters may comprise a navigation direction.
A second aspect of the invention relates to a controller for controlling navigation of an underwater vehicle, the vehicle having navigation parameters, wherein the controller comprises:
- an input to receive image data representing an underwater region;
- a circuit to determine whether a cylindrical body has a representation in the received image data;
- a circuit to determine a zone of the image data corresponding to the representation of the cylindrical body;
- a circuit to modify the navigation parameters, said modification being a function of said determined zone.
A third aspect relates to a computer program product comprising a computer readable medium having thereon a computer program comprising program instructions. The computer program is loadable into a data-processing unit and adapted to cause the data-processing unit to carry out the method described above when the computer program is run by the data-processing unit.
Other features and advantages of the method and apparatus disclosed herein will become apparent from the following description of non-limiting embodiments, with reference to the appended drawings.
BRIEF DESCRIPTION OF THE DRAWINGS
The present invention is illustrated by way of example, and not by way of limitation, in the figures of the accompanying drawings, in which like reference numerals refer to similar elements and in which:
Figure 1 is a representation of an AUV in survey mode along a pipeline according to a possible embodiment of the invention;
Figures 2a to 2e are illustrations of images taken by an AUV during a survey according to a possible embodiment of the invention;
Figure 3 is a flow chart describing a possible embodiment for controlling navigation of a subsea vehicle according to a possible embodiment of the invention;
Figure 4 is an illustration of detection of underwater features according to a possible embodiment of the invention;
Figure 5 is a flow chart describing a possible embodiment for improving localization of an underwater vehicle according to a possible embodiment of the invention;
Figure 6a is an illustration of a sample image taken by an AUV during a survey according to a possible embodiment of the invention;
Figure 6b is an illustration of a possible deformation of a sample image taken by an AUV during a survey according to a possible embodiment of the invention;
Figure 7 is an illustration of a possible combination of sample images taken by an AUV during a survey according to a possible embodiment of the invention;
- Figure 8 is an illustration of possible defect detection in a panorama image according to a possible embodiment of the invention;
- Figure 9 is a flow chart describing a possible embodiment for simplifying defect recognition according to a possible embodiment of the invention;
- Figure 10 is a possible embodiment for a device that enables the present invention.
DESCRIPTION OF PREFERRED EMBODIMENTS
Figure 1 is a representation of an AUV in survey mode along a pipeline according to a possible embodiment of the invention.
An AUV (for "Autonomous Underwater Vehicle") is a subsea vehicle that is not directly controlled from the surface.
The AUV 102 may be used to ensure that there is no problem on subsea pipelines such as the pipeline 101 in Figure 1.
To survey the pipeline, the AUV 102 follows the path of the pipeline 101. For instance, if the pipeline is parallel to the axis x of the Cartesian coordinate system (x,y,z) represented in Figure 1, the navigation module of the AUV controls the AUV so that the AUV is translated along this direction. For safety reasons, the distance d between the AUV 102 and the pipeline 101 is kept greater than a predetermined safety distance to avoid any collision.
In addition, the AUV 102 may comprise capture means 103 (such as a camera, a video camera, a sonar, etc.) in order to survey the pipeline and provide information and data to the engineers. The capture means 103 may, for instance, be able to capture visual information close to the pipeline within a predetermined area 104.
Figures 2a to 2e are illustrations of images taken by an AUV during a survey according to a possible embodiment of the invention.
As described with reference to Figure 1, the camera may create images 200 (or sets of data) representing the seabed and comprising the pipeline 204 that is expected to be surveyed.
To control the navigation of the AUV, it is possible to use these images initially captured to survey the pipeline 204. Indeed, it is possible to determine a relative location of the AUV in space (distance of the AUV from the pipeline):
- knowing the dimension of the real diameter d204 of the pipeline 204,
- knowing the orientation of the capture means (e.g. the camera axis).
The determination of the relative location of the AUV in space is even more accurate (i.e. orientation of the AUV compared to the orientation of the pipeline) if two contour lines (210 and 211) are determined.
This determination may use image processing techniques such as contour detection. If the image is defined as a set of pixels with an amplitude or colour for each pixel, the detection may be done by searching the image for the two lines which maximize the variation of image amplitude orthogonally to the lines. An optimization process may be used to find the two best lines in the image verifying the above criterion.
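For a roughly vertical pipeline, the criterion "maximize the variation of image amplitude orthogonally to the lines" can be illustrated by scoring column boundaries; this is a simplified sketch restricted to axis-aligned candidate lines, whereas a full implementation would optimize over arbitrary line orientations:

```python
def detect_contour_columns(image, min_separation=1):
    """Find the two column boundaries where image amplitude varies most.

    `image` is a list of rows of grey-level values.  Each boundary
    between columns c and c+1 is scored by the summed absolute
    horizontal amplitude variation; the two best, sufficiently
    separated boundaries approximate the pipe's two contour lines.
    """
    n_cols = len(image[0])
    scores = [sum(abs(row[c + 1] - row[c]) for row in image)
              for c in range(n_cols - 1)]
    best = max(range(len(scores)), key=scores.__getitem__)
    # second boundary, kept away from the first one
    candidates = [c for c in range(len(scores))
                  if abs(c - best) >= min_separation]
    second = max(candidates, key=scores.__getitem__)
    return tuple(sorted((best, second)))
```

The spacing between the two returned boundaries gives the apparent diameter used in the range estimate discussed earlier.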
Once the relative location of the AUV from the pipeline is determined (distance and orientation), it is possible to modify the navigation path of the AUV to bring the AUV to a specific distance from the pipeline (e.g. 3 meters from the pipeline) and with a specific relative orientation (e.g. parallel to the pipeline).
In order to ease this determination, a "control pattern" may be defined in the AUV configuration settings.
If no differences are observed between the control pattern and the representation of the pipeline, then the AUV is well localized. Conversely, observed differences may be used to correct the location of the AUV. Knowing the mathematical model of the camera, the pipeline location, etc., it is possible to compute the displacement between the estimated and the real location: the true location of the AUV can then be estimated.
Basically, this pattern may consist in a zone of the captured image 200 where the pipeline (or its representation through the determined contour lines) should remain. There is a huge number of possible ways to define such a "control pattern".
For instance, this pattern may consist in a set of points defining a polygon (e.g. points 220, 221, 222, 224 and 223) and the representation of the pipeline should fit in this polygon.
It is also possible to define segments at the edges of the captured image; the representation of the pipeline should then correspond to these segments at those edges. In Figures 2a to 2e, the pattern is defined with three segments 201, 202 and 203. In order to "validate" this pattern with the representation of the pipeline 204 in the image 200, the following conditions are to be verified:
- the contour line 210 is to go through the point 220 of segment 201 and through the point 223 of segment 203,
- the contour line 211 is to go through the point 222 of segment 202 and through the point 224 of segment 203.
If this pattern is validated (as represented in Figure 2a), the AUV is assumed to be at a correct distance and to have a correct orientation with regard to the pipeline.
Nevertheless, the pattern may be "not validated".
A first illustration of this invalidity is provided in Figure 2b:
- the contour line 210 goes through the point 220r (which is above the point 220) and through the point 223r (which is to the right of the point 223),
- the contour line 211 goes through the point 222r (which is to the left of the point 222) and through the point 224r (which is below and to the right of the point 224).
It appears that the representation of the pipeline (i.e. its detected contour lines) in the picture 200 is to be rotated in an anti-clockwise direction with a rotation centered on the point 225, in order to "validate" the pattern.
In order to perform a rotation of the representation of the pipeline in the image 200 in an anti-clockwise direction, the AUV may be rotated in a clockwise direction about the axis z (assuming that the pipeline is on the seabed defining the plane (x,y)) (e.g. the direction of the AUV is modified by the AUV navigation module to slightly turn right).
A second illustration of this invalidity is provided in Figure 2c:
- the contour line 210 goes through the point 220r (which is below the point 220) and through the point 223,
- the contour line 211 goes through the point 222r (which is to the left of the point 222) and through the point 224.
Therefore, the segment 203 is locally validated but the segments 201 and 202 are not validated.
In order to validate the pattern with the representation of the pipeline 204 in the image 200, the AUV may be moved in the direction y in order to bring the pipeline closer to the AUV (i.e. to zoom the representation of the pipeline in the bottom-left corner of the image). It may also be useful to slightly rotate the AUV in an anti-clockwise direction about the axis y.
A third illustration of this invalidity is provided in Figure 2d:
- the contour line 210 goes through the point 223r (which is to the right of the point 223) and through the point 220,
- the contour line 211 goes through the point 222r (which is to the left of the point 222) and through the point 224.
Therefore, the segment 203 is not validated but the segments 201 and 202 are locally validated.
In order to validate the pattern with the representation of the pipeline in the image 200, it may be useful to slightly rotate the AUV in a clockwise direction about the axis y.
A fourth illustration of this invalidity is provided in Figure 2e:
- the contour line 210 goes through the point 223r (which is to the left of the point 223) and through the point 220r (which is above the point 220),
- the contour line 211 goes through the point 224r (which is below and to the right of the point 224) and through the point 222r (which is to the right of the point 222).
In order to validate the representation of the pipeline in the image 200, it may be useful to move the AUV away from the pipeline (e.g. to move the AUV in the direction −y).
In order to rotate, translate, etc. the AUV as described above, navigation instructions are sent to the navigation module of the AUV to modify the navigation parameters of the AUV.
With these modified navigation parameters, it is possible to control the AUV to ensure that the AUV follows a subsea pipeline for a survey and to capture consistent images of the pipeline (i.e. where the pipeline is always at the same (or similar) location in the captured images).
Figure 3 is a flow chart describing a possible embodiment for controlling navigation of a subsea vehicle according to a possible embodiment of the invention.
Part of this flow chart can represent steps of an example of a computer program which may be executed by a circuit, computer or computing device. Upon reception of data 300 (e.g. a 2D-array of pixel values, an image, etc.), it is possible to determine (step 301) whether a pipeline has a representation in the received data.
A plurality of methods is possible in order to determine whether a given feature is present in an image. For instance, this determination may use contour detection or pattern recognition algorithms in conjunction with a database 302 with stored pattern signatures.
If a pipeline is not detected (output KO of the test 309) in the image 300, the AUV is considered as "temporarily lost" (output KO of test 310). If no pipeline is detected during a predetermined period of time (for instance 1 min) or after a predetermined number of received images (for instance 10 images), the AUV is considered as "lost" (output OK of test 310). Thus, the AUV is configured to go back to a location where a pipeline has previously been detected (message 308) or to a predetermined fallback location. If a pipeline is detected (output OK of the test 309) in the image 300, the contour lines of the pipeline are detected (step 303) and the contour lines may be compared to a predetermined "control pattern" stored in a memory 305 of the AUV in order to determine whether the pipeline representation "validates" (see above) this pattern.
The memory 305 and the memory 302 may be the same memory. If the contour lines do not "validate" this pattern (output KO of test 306), a modification of the navigation parameters (rotations, translations, etc.) may be computed (step 307) and a message 308 may be sent to the navigation module of the AUV to control the AUV survey path.
If the contour lines do "validate" this pattern (output OK of test 306), the navigation parameters do not need to be updated and the AUV continues on its preprogrammed survey path.
Figure 4 is an illustration of detection of underwater features according to a possible embodiment of the invention. When surveying a pipeline 400 on the seabed, the AUV 402 may be able to detect features attached to the pipeline with detection means 403 (such as a camera, a sonar, a multi-beam sonar, etc.).
The detection may use character recognition algorithms, pattern recognition algorithms or others. In order to enhance the detection of the underwater features, it is possible:
- to add reflective coverings (e.g. paint with microspheres, etc.) on these features;
- to use material that reflects/absorbs specific wavelengths (IR, UV, red light, etc.);
- etc.
The above features may be for instance:
- a white sticker 407 with black numbers or letters written on it. These numbers or letters may represent an encoded real location (for instance in signed degrees format, in a DMS + compass direction format, in a degrees minutes seconds format, etc.) or other;
- a flange 401 that is used to attach two parts of the pipeline together (detected for instance with a pattern recognition algorithm);
- an anode 408 attached to the pipeline;
- a geometrical form 405 painted on the pipeline 400;
- a pipeline sleeper 406 used to avoid any displacement of the pipeline with regard to the seabed.
Figure 5 is a flow chart describing a possible embodiment for improving localization of an underwater vehicle according to a possible embodiment of the invention.
Part of this flow chart can represent steps of an example of a computer program which may be executed by a circuit, computer or a computing device.
Upon receiving data (message 500) representing an underwater region (for instance, a picture taken by a camera or a video-recorder, rows of values taken by a sonar, etc.), it is possible to process the data to identify (step 501) underwater features as described above.
The identification of the underwater features may be performed on only part(s) of the received data: for instance, features may be searched in a bottom-left corner of the received image 500 or any other specific subset of the data.
When possible underwater features are identified in the received data 500, a comparison (step 502) may be performed to find a correspondence (e.g. a signature match) among a plurality of stored features in a database 503. The stored features may have been stored in association with a real location.
It is also possible that the detected feature in the data directly describes (i.e. without the need of an external database) a real location (for instance, a sticker with real coordinates written on it). If no correspondence is found in the database 503 (test 504, output KO), no action is performed (step 505).
If a single correspondence is found in the database 503 (test 504, output OK), the real location associated with the correspondence in the database 503 is used to update (step 506) the computed location 507 of the AUV. If a plurality of correspondences is found in the database 503 (test 504, output OK2), it is possible to select (step 508) one of the correspondences in the plurality of correspondences. The selected correspondence may be the correspondence for which the distance between the real location associated with it and the current computed location 507 of the AUV is minimum. For instance, when a survey is performed on a pipeline, several flanges/anodes may have the same signature, and then several correspondences may be found in the database matching an underwater feature. This algorithm for selecting one correspondence assumes that the most probable detected feature is the closest matching feature (i.e. with the shortest distance).
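The closest-correspondence selection can be sketched as follows; the dictionary layout of a database match is an assumption made for illustration:

```python
import math

def select_correspondence(matches, current_position):
    """Among several stored features matching a detected signature,
    keep the one whose recorded real location is closest to the AUV's
    current computed location (the most probable match).
    """
    def dist(match):
        x, y = match["location"]
        cx, cy = current_position
        return math.hypot(x - cx, y - cy)
    return min(matches, key=dist)
```

The selected match's real location would then be used to update the computed location of the AUV (step 506).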
Figure 6a is an illustration of a sample image 600a taken by an AUV during a survey according to a possible embodiment of the invention.
The image 600a comprises the representation of a pipeline 601a with a flange 602a and two perpendicular pipeline valves 603a and 604a. It is noted that the representation of the pipeline 601a has a perspective effect: the two contour lines of the pipeline (which are normally parallel) cross at a vanishing point (outside image 600a).
In order to compensate for this perspective effect, it is possible to deform image 600a.
Figure 6b is an illustration of a possible deformation of a sample image taken by an AUV during a survey according to a possible embodiment of the invention.
This deformation may comprise a perspective correction or perspective transformation (i.e. to set the contour lines parallel) and a rotation (i.e. to set the contour lines horizontal).
Thus, objects of the non-transformed image 600a (i.e. elements 601a, 602a, 603a, 604a) are modified into new objects in a transformed image 600b (i.e. elements 601b, 602b, 603b, 604b).
Figure 7 is an illustration of a possible combination of sample images taken by an AUV during a survey according to a possible embodiment of the invention.
During a survey of a pipeline, an AUV may capture a plurality of images along the pipeline.
Due to image acquisition and perspective effects, the pipe location is not stable between pairs of images. In order to be able to correlate images and to create the mosaic image, a correction (see above) is applied on each image so that the pipe becomes horizontal with a constant width in the image. The transformation may comprise a simple morphing modifying the two detected lines (edges of the pipe) into two parallel and horizontal lines. After the transformation of these images (701, 702, etc.), it is possible to combine these transformed images to create a panorama image (or mosaic image).
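The morphing step can be illustrated by resampling each column so that the band between the two detected edges becomes a horizontal strip of constant height. This is a nearest-neighbour sketch; the per-column edge arrays are assumed to come from the contour detection:

```python
def straighten_pipe(image, top_edge, bottom_edge, out_height):
    """Morph an image so the two detected pipe edges become horizontal
    and parallel with a constant width.

    `top_edge[c]` / `bottom_edge[c]` give, for each column c, the row
    of the upper / lower contour line.  Each column is resampled
    (nearest neighbour) so the band between the edges fills
    `out_height` rows.
    """
    n_cols = len(image[0])
    out = [[0] * n_cols for _ in range(out_height)]
    for c in range(n_cols):
        top, bottom = top_edge[c], bottom_edge[c]
        span = max(bottom - top, 1)
        for r in range(out_height):
            src = top + round(r * span / out_height)
            src = min(max(src, 0), len(image) - 1)  # clamp to the image
            out[r][c] = image[src][c]
    return out
```

A production implementation would interpolate rather than pick the nearest row, but the principle is the same.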
The mosaic image may be created with the following process:
a/ store the n corrected images in a memory buffer;
b/ for the first two successive corrected images (e.g. 701 and 702), analyse these images by detecting the inter-correlation between the two images; an overlapping zone (e.g. 704) is thus estimated. Then the two images are flattened into a single image;
c/ store the flattened image in the buffer in order to replace the two successive corrected images at the first location in the buffer;
d/ if the buffer comprises more than one image, steps b/ to c/ are reapplied to obtain the complete mosaic of the pipe 703.
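The pairwise folding of the buffer can be sketched as below, with a simple sum-of-absolute-differences search standing in for the inter-correlation analysis (an illustrative simplification):

```python
def estimate_overlap(left, right, max_overlap):
    """Estimate how many trailing columns of `left` overlap the leading
    columns of `right`, by minimizing the mean absolute difference."""
    best_k, best_err = 1, float("inf")
    rows = len(left)
    for k in range(1, max_overlap + 1):
        err = sum(abs(left[r][-k + i] - right[r][i])
                  for r in range(rows) for i in range(k))
        if err / k < best_err:
            best_err, best_k = err / k, k
    return best_k

def stitch(images, max_overlap):
    """Fold a list of corrected images into one mosaic, pairwise,
    dropping the estimated overlapping zone at each step."""
    mosaic = images[0]
    for nxt in images[1:]:
        k = estimate_overlap(mosaic, nxt, max_overlap)
        mosaic = [mrow[:-k] + nrow for mrow, nrow in zip(mosaic, nxt)]
    return mosaic
```

Blending the overlapping zone instead of dropping it would reduce visible seams, at the cost of extra computation.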
Figure 8 is an illustration of a possible defect detection method in a panorama image according to a possible embodiment of the invention.
For instance, it may be considered that there is a defect if the pipeline is not in contact with (or close to) the seabed. Indeed, if the distance between the seabed and the pipeline is too big (a gap) over a given length along the pipeline, the gravitational forces exerted on the pipeline could be dangerous for the pipeline integrity. A possible method for detecting such defects is described in the application FR 2 965 616.
Moreover, a possible method for detecting such defects may consist in:
- computing a panorama image according to the above method;
- extracting the part 800a of the panorama image corresponding to the area below the representation of the pipeline;
- for each vertical segment (810, 811, 812, 813, etc.) of the extracted part of the panorama image 800a, computing a "contrast variation value" or CVV (820, 821, 822, 823, etc.) related to the contrast of the pixels in that vertical segment;
- if the contrast value or if the variation of the contrast value (within a zone, according to a direction of space, etc.) is below a predetermined threshold (in the present example 190, line 800b), it is considered that a defect is present.
For instance, the zone 801a of the extracted part of the panorama image 800a, where a gap between the pipeline and the seabed is present, corresponds to the zone 801b in the graphic, where the CVV is below 190.
The zone 802a of the extracted part of the panorama image 800a, where a gap between the pipeline and the seabed is present, corresponds to the zone 802b in the graphic, where the contrast variation values are below 190.
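The per-column contrast variation test can be sketched as follows, using max-minus-min grey level as the CVV; this is one possible choice of contrast measure, which the text does not fix:

```python
def defect_zones(strip, threshold):
    """Flag free-span defects from a strip of the panorama image below
    the pipeline.

    `strip` is a list of rows of grey levels.  For each column a
    contrast variation value (max minus min grey level) is computed;
    runs of columns whose value falls below `threshold` are reported
    as (start, end) defect zones.
    """
    n_cols = len(strip[0])
    cvv = [max(row[c] for row in strip) - min(row[c] for row in strip)
           for c in range(n_cols)]
    zones, start = [], None
    for c, v in enumerate(cvv):
        if v < threshold and start is None:
            start = c          # defect zone opens
        elif v >= threshold and start is not None:
            zones.append((start, c - 1))
            start = None       # defect zone closes
    if start is not None:
        zones.append((start, n_cols - 1))
    return zones
```

The returned column ranges correspond to the zones (such as 801a and 802a) that would be marked on the panorama image.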
It may be possible to detect defects such as:
- debris in contact with subsea pipelines, by applying a real-time shape/pattern comparison to pre-identified patterns in the software database;
- drag/scar marks on the seabed, which are considered evidence of "walking pipelines";
- etc.
Upon the detection of such defects, it is possible to:
- produce a preliminary report and/or compare this report with the last produced report to stress differences;
- identify the defects on the panorama image;
- re-program the AUV route to re-survey the area where defects have been detected;
- use/activate other detection means (such as acoustic sensors, sonar, etc.) to increase the accuracy of the defect detection;
- etc.
Figure 9 is a flow chart describing a possible embodiment for simplifying defect recognition according to a possible embodiment of the invention.
Part of this flow chart can represent steps of an example of a computer program which may be executed by a circuit.
Upon the reception of a plurality of images (message 900), each image of the plurality may be modified to change the perspective and/or to rotate the image (step 901).
Once all images are modified, it is possible to combine the modified images to create a panorama (step 902).
The panorama image may be cropped in order to keep only the relevant part of the image (i.e. the part of the image close to the representation of the pipeline in the panorama image, step 903).
It is possible to process the panorama image to detect anomalies/defects (step 904), for instance according to the method described in patent application FR 2 965 616.
The panorama image may be marked according to the previous detection (step 905) to ease a future identification and verification of defects on the pipeline (for instance, to ease the visual inspection by operators/engineers).
The marks may be, for instance, vertical red lines at the location of the detected defects in the panorama image.
Finally, the final marked panorama image (message 906) may be outputted to be displayed, for instance, to the operators/engineers.
Figure 10 is a possible embodiment for a device that enables the present invention.
In this embodiment, the device 1000 comprises a computer, this computer comprising a memory 1005 to store program instructions loadable into a circuit and adapted to cause the circuit 1004 to carry out the steps of the present invention when the program instructions are run by the circuit 1004.
The memory 1005 may also store data and useful information for carrying out the steps of the present invention as described above. The circuit 1004 may be, for instance:
- a processor or a processing unit adapted to interpret instructions in a computer language, the processor or the processing unit possibly comprising, being associated with or being attached to a memory comprising the instructions, or
- the association of a processor/processing unit and a memory, the processor or the processing unit adapted to interpret instructions in a computer language, the memory comprising said instructions, or
- an electronic card wherein the steps of the invention are described within silicon, or
- a programmable electronic chip such as an FPGA chip (for "Field-Programmable Gate Array").
This computer comprises an input interface 1003 for the reception of data used for the above method according to the invention and an output interface 1006 for providing a panorama image, navigation control instructions, or an update of the AUV location as described above.
To ease the interaction with the computer, a screen 1001 and a keyboard 1002 may be provided and connected to the computer circuit 1004.
A person skilled in the art will readily appreciate that various parameters disclosed in the description may be modified and that various embodiments disclosed may be combined without departing from the scope of the invention.
For instance, the description proposes embodiments with pipelines examples. Any cylindrical body may replace these pipelines.
Claims
1. A method for controlling navigation of an underwater vehicle, the vehicle having navigation parameters, wherein the method comprises:
- receiving image data (300) representing an underwater region;
- determining (301) whether a cylindrical body has a representation (204) in the received image data (200),
- upon positive determination:
- determining (303) a zone of the image data corresponding to the representation of the cylindrical body;
- modifying the navigation parameters, said modification being a function of said determined zone.
2. A method according to claim 1, wherein receiving image data comprises:
- acquiring data of an underwater region with a sonar and/or a camera.
3. A method according to one of the preceding claims, wherein modifying the navigation parameters comprises:
- determining a relative position of the underwater vehicle compared to the cylindrical body, said determination being based on cylindrical body characteristics;
- modifying the navigation parameters to modify the relative position of the underwater vehicle to a reference relative position.
4. A method according to one of the preceding claims, wherein modifying the navigation parameters comprises:
- computing a deviation between the determined zone and a control pattern;
- modifying the navigation parameters of the underwater vehicle to reduce the deviation.
5. A method according to claim 4, wherein the control pattern comprises a control zone, and wherein the computed deviation is a function of the intersection of the determined zone with the control zone.
6. A method according to claim 4 or 5, wherein the control pattern comprises control points, and wherein the computed deviation is a function of distances between the control points and a hull of the determined zone.
7. A method according to any of claims 4 to 6, wherein the control pattern comprises at least two segments, and wherein the computed deviation is a function of at least four distances, each distance being a distance between a hull of the determined zone and an end of one of the segments.
8. A method according to any of claims 4 to 7, wherein the control pattern comprises at least two control lines, and wherein the computed deviation is a function of at least two deviation angles, each deviation angle being an angle between one of the control lines and a hull of the determined zone.
9. A method according to one of the preceding claims, wherein, the image data being acquired through a lens, the method further comprises a modification of the received image data based on lens characteristics.
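The lens-based modification of claim 9 typically amounts to undistorting the image before the zone is analysed. A first-order radial model with a single coefficient k1 is the simplest example; the patent does not specify a distortion model, so the function below is purely illustrative.

```python
def undistort_point(xd, yd, k1, cx=0.0, cy=0.0):
    """Approximate inverse of a first-order radial distortion
    x_d = x * (1 + k1 * r^2), applied to one normalized image point.
    (cx, cy) is the assumed distortion centre."""
    x, y = xd - cx, yd - cy
    r2 = x * x + y * y
    scale = 1.0 + k1 * r2   # one-step approximation of the true inverse
    return cx + x / scale, cy + y / scale
```

With k1 = 0 the point is returned unchanged; a positive k1 pulls off-centre points back toward the distortion centre.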
10. A method according to one of the preceding claims, wherein the method further comprises:
- upon negative determination:
- determining a period starting on or after a last positive determination and ending on or before the negative determination,
- if the period is greater than a predetermined value, modifying the navigation parameters to navigate to a fallback location.
11. A method according to claim 10, wherein the method further comprises:
- upon positive determination:
- recording a positive location, the positive location comprising the location of the vehicle when the positive determination occurs, and wherein the fallback location is a recorded positive location.
12. A method according to claim 10 or 11, wherein the determined period is maximum.
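The fallback behaviour of claims 10 to 12 can be sketched as a watchdog: record each location where the pipeline was detected (claim 11), and once no detection has occurred for longer than a threshold, return the last recorded location as the fallback target. Class and method names are assumptions, not the patent's terminology.

```python
class DetectionWatchdog:
    """Tracks detections over time; proposes a fallback location once the
    time since the last positive determination exceeds `timeout_s`."""

    def __init__(self, timeout_s):
        self.timeout_s = timeout_s
        self.last_positive_t = None
        self.positive_locations = []   # claim 11: recorded positive locations

    def update(self, detected, t, location):
        """Returns a fallback location, or None while navigation is nominal."""
        if detected:
            self.last_positive_t = t
            self.positive_locations.append(location)
            return None
        if self.last_positive_t is None:
            return None                # nothing detected yet, nowhere to fall back to
        if t - self.last_positive_t > self.timeout_s:
            return self.positive_locations[-1]
        return None
```

Here the vehicle keeps its current navigation parameters during short detection gaps and only diverts to the fallback location when the gap exceeds the predetermined value of claim 10.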
13. A method according to one of the preceding claims, wherein the navigation parameters comprise a navigation direction.
14. A controller (1000) for controlling navigation of an underwater vehicle, the vehicle having navigation parameters, wherein the controller comprises:
- an input (1003) to receive image data (300) representing an underwater
region;
- a circuit (1004) to determine (301) whether a cylindrical body has a representation (204) in the received image data (200);
- a circuit (1004) to determine (303) a zone of the image data corresponding to the representation of the cylindrical body;
- a circuit (1004, 1006) to modify the navigation parameters, said modification being a function of said determined zone.
15. A non-transitory computer readable storage medium, having stored thereon a computer program comprising program instructions, the computer program being loadable into a data-processing unit and adapted to cause the data-processing unit to carry out the steps of any of claims 1 to 13 when the computer program is run by the data-processing unit.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US201261720237P | 2012-10-30 | 2012-10-30 | |
US61/720,237 | 2012-10-30 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2014067683A1 (en) | 2014-05-08 |
Family
ID=48998605
Family Applications (3)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/EP2013/066948 WO2014067684A1 (en) | 2012-10-30 | 2013-08-13 | Method to enhance underwater localization |
PCT/EP2013/066947 WO2014067683A1 (en) | 2012-10-30 | 2013-08-13 | A method for controlling navigation of an underwater vehicle |
PCT/EP2013/066949 WO2014067685A1 (en) | 2012-10-30 | 2013-08-13 | A method for simplifying defect analysis |
Family Applications Before (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/EP2013/066948 WO2014067684A1 (en) | 2012-10-30 | 2013-08-13 | Method to enhance underwater localization |
Family Applications After (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/EP2013/066949 WO2014067685A1 (en) | 2012-10-30 | 2013-08-13 | A method for simplifying defect analysis |
Country Status (1)
Country | Link |
---|---|
WO (3) | WO2014067684A1 (en) |
Families Citing this family (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
FR3025346A1 (en) * | 2014-08-26 | 2016-03-04 | Centre Nat Rech Scient | AUTOMATIC METHOD OF IDENTIFYING A SHADOW GENERATED BY A REAL TARGET IN A TWO-DIMENSIONAL IMAGE OF A SONAR |
CN110533650B (en) * | 2019-08-28 | 2022-12-13 | 哈尔滨工程大学 | AUV underwater pipeline detection tracking method based on vision |
CN113269720B (en) * | 2021-04-16 | 2024-02-02 | 张家港华程机车精密制管有限公司 | Defect detection method, system and readable medium for straight welded pipe |
CN115932864B (en) * | 2023-02-24 | 2023-08-01 | 深圳市博铭维技术股份有限公司 | Pipeline defect detection method and pipeline defect detection device |
Citations (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
FR2965616A1 (en) | 2010-10-01 | 2012-04-06 | Total Sa | METHOD OF IMAGING A LONGITUDINAL DRIVE |
Family Cites Families (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
TW268099B (en) * | 1994-05-02 | 1996-01-11 | General Electric Co | |
GB0222211D0 (en) * | 2002-09-25 | 2002-10-30 | Fortkey Ltd | Imaging and measurement system |
2013
- 2013-08-13 WO PCT/EP2013/066948 patent/WO2014067684A1/en active Application Filing
- 2013-08-13 WO PCT/EP2013/066947 patent/WO2014067683A1/en active Application Filing
- 2013-08-13 WO PCT/EP2013/066949 patent/WO2014067685A1/en active Application Filing
Patent Citations (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
FR2965616A1 (en) | 2010-10-01 | 2012-04-06 | Total Sa | METHOD OF IMAGING A LONGITUDINAL DRIVE |
Non-Patent Citations (2)
Title |
---|
GIAN LUCA FORESTI: "Visual Inspection of Sea Bottom Structures by an Autonomous Underwater Vehicle", IEEE TRANSACTIONS ON SYSTEMS, MAN AND CYBERNETICS. PART B:CYBERNETICS, IEEE SERVICE CENTER, PISCATAWAY, NJ, US, vol. 31, no. 5, 1 October 2001 (2001-10-01), XP011057004, ISSN: 1083-4419 * |
ZINGARETTI P ET AL: "Robust real-time detection of an underwater pipeline", ENGINEERING APPLICATIONS OF ARTIFICIAL INTELLIGENCE, PINERIDGE PRESS, SWANSEA, GB, vol. 11, no. 2, 1 April 1998 (1998-04-01), pages 257 - 268, XP027087572, ISSN: 0952-1976, [retrieved on 19980401] * |
Cited By (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
NO20161239A1 (en) * | 2016-07-28 | 2018-01-29 | 4Subsea As | Method for detecting position and orientation of a subsea structure using an ROV |
NO342795B1 (en) * | 2016-07-28 | 2018-08-06 | 4Subsea As | Method for detecting position and orientation of a subsea structure using an ROV |
CN109976384A (en) * | 2019-03-13 | 2019-07-05 | 厦门理工学院 | A kind of autonomous underwater robot and path follow-up control method, device |
CN109976384B (en) * | 2019-03-13 | 2022-02-08 | 厦门理工学院 | Autonomous underwater robot and path following control method and device |
CN116452513A (en) * | 2023-03-20 | 2023-07-18 | 山东未来智能技术有限公司 | Automatic identification method for corrugated aluminum sheath defects of submarine cable |
CN116452513B (en) * | 2023-03-20 | 2023-11-21 | 山东未来智能技术有限公司 | Automatic identification method for corrugated aluminum sheath defects of submarine cable |
Also Published As
Publication number | Publication date |
---|---|
WO2014067684A1 (en) | 2014-05-08 |
WO2014067685A1 (en) | 2014-05-08 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2014067683A1 (en) | A method for controlling navigation of an underwater vehicle | |
KR102583989B1 (en) | Automated image labeling for vehicles based on maps | |
US10810734B2 (en) | Computer aided rebar measurement and inspection system | |
CN111797650B (en) | Obstacle identification method, obstacle identification device, computer equipment and storage medium | |
CN110312912B (en) | Automatic vehicle parking system and method | |
US10417781B1 (en) | Automated data capture | |
US10496762B2 (en) | Model generating device, position and orientation calculating device, and handling robot device | |
KR101381218B1 (en) | Apparartus and method for generating an around view of a remotely operated vehicle | |
WO2020253010A1 (en) | Method and apparatus for positioning parking entrance in parking positioning, and vehicle-mounted terminal | |
US10726616B2 (en) | System and method for processing captured images | |
CN111094895B (en) | System and method for robust self-repositioning in pre-constructed visual maps | |
CN111178295A (en) | Parking space detection and model training method and device, vehicle, equipment and storage medium | |
WO2017171649A1 (en) | Methods for providing task related information to a user, user assistance systems, and computer-readable media | |
Leite et al. | An hierarchical architecture for docking autonomous surface vehicles | |
US9317968B2 (en) | System and method for multiple hypotheses testing for surface orientation during 3D point cloud extraction from 2D imagery | |
JP7461399B2 (en) | Method and device for assisting the running operation of a motor vehicle, and motor vehicle | |
KR102645492B1 (en) | Device and method for monitoring ship and port | |
JP2010066595A (en) | Environment map generating device and environment map generating method | |
CN114359865A (en) | Obstacle detection method and related device | |
Hurtos et al. | Sonar-based chain following using an autonomous underwater vehicle | |
WO2022097426A1 (en) | Status determination device, status determination system, and status determination method | |
EP3985609A1 (en) | Positioning system and method for determining the three-dimensional position of a movable object | |
JP6770826B2 (en) | Automatic collimation method and automatic collimation device for measuring the placement position of structures | |
KR102077934B1 (en) | Method for generating alignment data for virtual retrofitting object using video and Terminal device for performing the same | |
Baligh Jahromi et al. | Layout slam with model based loop closure for 3d indoor corridor reconstruction |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 13750312 Country of ref document: EP Kind code of ref document: A1 |
NENP | Non-entry into the national phase |
Ref country code: DE |
122 | Ep: pct application non-entry in european phase |
Ref document number: 13750312 Country of ref document: EP Kind code of ref document: A1 |