NO20161239A1 - Method for detecting position and orientation of a subsea structure using an ROV - Google Patents
- Publication number
- NO20161239A1
- Authority
- NO
- Norway
- Prior art keywords
- rov
- orientation
- camera
- visual feature
- feature
- Prior art date
Links
- 238000000034 method Methods 0.000 title claims description 30
- 230000000007 visual effect Effects 0.000 claims description 28
- 238000012544 monitoring process Methods 0.000 claims description 15
- 238000004364 calculation method Methods 0.000 claims description 9
- 238000005259 measurement Methods 0.000 claims description 5
- 238000009434 installation Methods 0.000 claims description 4
- 238000004590 computer program Methods 0.000 claims description 3
- 239000003550 marker Substances 0.000 claims description 2
- 239000003973 paint Substances 0.000 claims description 2
- 241000251468 Actinopterygii Species 0.000 description 1
- 230000005540 biological transmission Effects 0.000 description 1
- 230000001419 dependent effect Effects 0.000 description 1
- 238000006073 displacement reaction Methods 0.000 description 1
- 238000007689 inspection Methods 0.000 description 1
- 230000003287 optical effect Effects 0.000 description 1
- 238000003909 pattern recognition Methods 0.000 description 1
- 230000035945 sensitivity Effects 0.000 description 1
- 230000001360 synchronised effect Effects 0.000 description 1
- XLYOFNOQVPJJNP-UHFFFAOYSA-N water Substances O XLYOFNOQVPJJNP-UHFFFAOYSA-N 0.000 description 1
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C11/00—Photogrammetry or videogrammetry, e.g. stereogrammetry; Photographic surveying
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C11/00—Photogrammetry or videogrammetry, e.g. stereogrammetry; Photographic surveying
- G01C11/04—Interpretation of pictures
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/02—Control of position or course in two dimensions
- G05D1/021—Control of position or course in two dimensions specially adapted to land vehicles
- G05D1/0231—Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
- G05D1/0246—Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/0002—Inspection of images, e.g. flaw detection
- G06T7/0004—Industrial image inspection
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B63—SHIPS OR OTHER WATERBORNE VESSELS; RELATED EQUIPMENT
- B63G—OFFENSIVE OR DEFENSIVE ARRANGEMENTS ON VESSELS; MINE-LAYING; MINE-SWEEPING; SUBMARINES; AIRCRAFT CARRIERS
- B63G8/00—Underwater vessels, e.g. submarines; Equipment specially adapted therefor
- B63G8/001—Underwater vessels adapted for special purposes, e.g. unmanned underwater vessels; Equipment specially adapted therefor, e.g. docking stations
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B63—SHIPS OR OTHER WATERBORNE VESSELS; RELATED EQUIPMENT
- B63G—OFFENSIVE OR DEFENSIVE ARRANGEMENTS ON VESSELS; MINE-LAYING; MINE-SWEEPING; SUBMARINES; AIRCRAFT CARRIERS
- B63G8/00—Underwater vessels, e.g. submarines; Equipment specially adapted therefor
- B63G8/38—Arrangement of visual or electronic watch equipment, e.g. of periscopes, of radar
-
- E—FIXED CONSTRUCTIONS
- E21—EARTH OR ROCK DRILLING; MINING
- E21B—EARTH OR ROCK DRILLING; OBTAINING OIL, GAS, WATER, SOLUBLE OR MELTABLE MATERIALS OR A SLURRY OF MINERALS FROM WELLS
- E21B47/00—Survey of boreholes or wells
- E21B47/002—Survey of boreholes or wells by visual inspection
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Multimedia (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Theoretical Computer Science (AREA)
- Quality & Reliability (AREA)
- Electromagnetism (AREA)
- Aviation & Aerospace Engineering (AREA)
- Automation & Control Theory (AREA)
- Transmission And Conversion Of Sensor Element Output (AREA)
- Control Of Position Or Direction (AREA)
- Length Measuring Devices By Optical Means (AREA)
Description
Introduction
The invention relates to monitoring of a subsea structure. More specifically, the invention is defined by a method for detecting the position and orientation of a subsea structure.
Background
When performing lifting or lowering operations of different types of subsea structures, it may be essential to know the exact position and orientation of the structure, e.g. whether it is level or out of level. This is specifically the case for large and heavy structures and for installation operations where one subsea structure is to be connected to another structure.
It may also be important to keep already installed subsea equipment and structures under surveillance after installation, to see if they are moving or offset from a set position.
A subsea structure can be any type of structure installed or operated subsea. Lifting or lowering operations of such structures are typically performed by a vessel with lifting arrangements, e.g. a winch, crane or hoist. The subsea structure may for instance be a structure that is to be installed on the seabed or removed from the seabed. Such structures may be large and heavy.
Large structures being handled under water are typically related to subsea oil and gas installations. Examples are BOPs, risers and pipelines, frame structures, anchors and suction piles. Large structures may, however, also be related to the fish farming industry, where large frames are positioned offshore, or to wind turbines with large substructures. When handling such large and heavy structures in a lifting or lowering operation, it may be vital to know the exact position and orientation during the operation.
WO 02086288 A1 describes a method and apparatus for monitoring the position of parts of a subsurface tool. Position is monitored by detecting acoustic emission generated in the tool. The method depends on acoustic emission and will not work if the subsea structure being operated does not generate any acoustic emission. The method is thus not suited for detecting the position and rotation of a stand-alone structure being lifted or lowered subsea.
There is a need for a simple and accurate method for monitoring a subsea structure while it is lifted or lowered subsea, as well as after it has been installed.
The present invention provides a cost-efficient solution where a standard remotely operated vehicle, ROV, equipped with at least one camera is used in a novel method for monitoring subsea operations.
Short description of the invention
The present invention comprises a method for monitoring a subsea structure for detecting position and orientation of the structure. The method is defined by:
- providing a ROV with at least a first camera generating a video stream of pictures of the structure to be monitored;
- controlling the ROV such that the camera is always directed at the structure;
- locking the focus of the camera for tracking a specific visual feature of the structure and video recording said feature over time;
- interpreting the video stream in a video processor by calculating position and rotation of said visual feature for detecting position and orientation of the structure relative to the ROV.
Further features of the method are defined in the dependent claims.
The invention is further defined by a computer program having instructions which, when executed by a computer device or system, cause the computing device or system to perform the method.
Detailed description of the invention
The invention will now be described in detail with reference to the drawings illustrating different embodiments:
Figure 1 is a sketch of a ROV with cameras and a structure to be monitored;
Figure 2 illustrates the concept where a tracking program is locking a camera to a specific visual feature of a structure;
Figure 3 shows a zoomed-in section of the visual feature that a program is instructed to track, and
Figure 4 shows a time series of the horizontal position of the tracked feature.
The invention is defined by a method for monitoring a subsea structure for detecting position and orientation of the structure. The method comprises several steps.
One step is providing a ROV with a first camera filming the structure to be monitored. The camera can be a camera which is already integrated in the ROV, or it may be retrofitted by attaching the camera to the body of the ROV. By video recording with one camera, in-plane (2D) measurements of the structure can be performed.
In one embodiment of the invention, the ROV is provided with a second camera. By video recording the specific visual feature of the structure with said first and second cameras, spatial measurements (3D) of the structure can be provided.
This is possible by configuring the first and second cameras to lock and focus on the same specific visual feature of the structure. In order to obtain 3D measurements, the optical axes of the cameras must be separated. The cameras may for instance be mounted on each side of the ROV.
Video recorded from two or more cameras will be synchronised for providing the 3D measurements.
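As an illustrative sketch, not part of the patent text, of how two synchronised views with separated optical axes yield a 3D measurement: the same visual feature's pixel coordinates in both images can be triangulated against the cameras' projection matrices. The focal length, baseline and feature position below are assumed values.

```python
import numpy as np

def triangulate(P1, P2, uv1, uv2):
    """Linear (DLT) triangulation of one visual feature seen by two cameras.

    P1, P2 : 3x4 projection matrices of the first and second camera.
    uv1, uv2 : (u, v) pixel coordinates of the same feature in each image.
    Returns the 3D point in the common (ROV) frame.
    """
    u1, v1 = uv1
    u2, v2 = uv2
    # Each view contributes two linear constraints on the homogeneous point.
    A = np.array([
        u1 * P1[2] - P1[0],
        v1 * P1[2] - P1[1],
        u2 * P2[2] - P2[0],
        v2 * P2[2] - P2[1],
    ])
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]

def project(P, X):
    """Project a 3D point through a 3x4 camera matrix to pixel coordinates."""
    x = P @ np.append(X, 1.0)
    return x[:2] / x[2]

# Assumed setup: two identical cameras, 800 px focal length,
# mounted 0.5 m apart on each side of the ROV.
K = np.array([[800.0, 0, 320], [0, 800.0, 240], [0, 0, 1]])
P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = K @ np.hstack([np.eye(3), np.array([[-0.5], [0], [0]])])

point = np.array([0.2, -0.1, 3.0])   # feature assumed 3 m in front of the ROV
recovered = triangulate(P1, P2, project(P1, point), project(P2, point))
```

With noise-free, synchronised pixel measurements the triangulated point matches the true feature position; in practice the synchronisation of the two video streams mentioned above is what makes the two pixel measurements refer to the same instant.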
Figure 1 is a sketch illustrating a set-up used when performing the method, with a ROV 20, located above the seabed 50, having two cameras 30, 35, and a structure 10 to be monitored.
The cameras are always directed at the structure 10 to be monitored by controlling the ROV 20. This may be performed automatically by letting the camera 30 detect the moving structure 10 to be monitored and direct its field of view towards the moving structure 10 automatically. One way of controlling this operation is to let the tracking software that tracks said specific visual feature 40 also control the positioning of the ROV 20. This is further described below.
When a camera 30 is directed at the structure 10 to be monitored, the next step is locking the focus of the camera 30 to a specific visual feature 40 of the structure 10 and video recording the feature 40 of the structure 10. The specific visual feature 40 may be a feature 40 that is already a part of the structure 10 to be monitored, or it may be a reference mark provided in the form of a marker, label, magnetic sticker or paint that is attached to the structure 10 for the purpose of monitoring it while performing lifting or lowering operations. These will all act as reference marks that can easily be detected and recognized in a video processor.
Figure 2 illustrates the concept where a tracking program in the video processor is locking a camera 30 to a specific visual feature 40 of a structure 10, while Figure 3 shows the feature to be tracked in Figure 2.
How the tracking software operates for detecting movements of a specific visual feature 40 is regarded as known prior art within pattern recognition and will not be described in detail.
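One common prior-art technique of this kind is template matching by normalised cross-correlation, where the reference mark is located in each frame at the position of maximum correlation with a stored template. The pure-NumPy sketch below, with a synthetic frame and feature, is an illustrative assumption, not the patent's implementation.

```python
import numpy as np

def match_template(image, template):
    """Locate a template (the visual feature) in a frame by normalised
    cross-correlation; returns the (row, col) of the best-matching window."""
    th, tw = template.shape
    t = template - template.mean()
    best, best_pos = -np.inf, (0, 0)
    for r in range(image.shape[0] - th + 1):
        for c in range(image.shape[1] - tw + 1):
            w = image[r:r + th, c:c + tw]
            wz = w - w.mean()
            denom = np.sqrt((wz ** 2).sum() * (t ** 2).sum())
            score = (wz * t).sum() / denom if denom > 0 else 0.0
            if score > best:
                best, best_pos = score, (r, c)
    return best_pos

# Synthetic 40x60 frame with a distinctive 5x5 feature placed at (12, 20).
rng = np.random.default_rng(0)
frame = rng.random((40, 60))
feature = rng.random((5, 5)) + 2.0   # brighter than the background
frame[12:17, 20:25] = feature
found = match_template(frame, feature)
```

Running the matcher per frame yields the feature's pixel track over time; production systems would use an optimised correlation (e.g. FFT-based) rather than this brute-force loop.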
Figure 4 shows a time series of the horizontal position of the tracked feature 40. The x-axis shows time in seconds, while the y-axis shows the horizontal displacement in mm of the tracked feature 40.
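To report displacement in metric units as in Figure 4, the tracked pixel offset must be scaled; under a pinhole camera model this requires the focal length in pixels and the distance from camera to feature. The conversion below is a sketch with assumed values, not figures from the patent.

```python
def pixel_to_mm(delta_px, distance_mm, focal_px):
    """Convert an in-plane pixel displacement of the tracked feature to a
    metric displacement using the pinhole model: dx_mm = delta_px * Z / f."""
    return delta_px * distance_mm / focal_px

# Assumed values: feature 3 m from the camera, focal length 800 px.
shift_mm = pixel_to_mm(delta_px=4.0, distance_mm=3000.0, focal_px=800.0)
```

This is also why keeping the ROV at a known, stable distance from the feature (see below) matters: the distance Z enters the scale factor directly.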
Focusing on only one visual feature 40 or reference mark on a subsea structure 10 is enough to enable the invention and make it possible to monitor and detect position and rotation. In one embodiment, two or three reference marks may be placed on the structure 10. These can then be spaced apart to a certain degree, as long as they are all in the focal view of the one or more cameras 30 used.
Using more than one reference mark on the structure 10 may improve the sensitivity when detecting small movements. Multiple markers are also used to measure relative distances and rotations between markers.
By placing reference marks on different sides of a structure 10, the structure 10 does not have to be in a specific position relative to the ROV 20 video filming it before locking the focus of a camera 30 and starting interpretation of the video stream.
A combination of using natural visual features 40 and attached reference marks for monitoring a subsea structure 10 is also feasible according to the invention.
The last step of the inventive method is interpreting the video stream in a video processor by calculating the position and orientation of said visual features 40 for detecting the position and orientation of the structure 10 relative to the ROV 20, including calculation of all six degrees of freedom, i.e. the translations x, y, z and the rotations 1, 2 and 3. In order to do this, the video processor comprises tracking software for monitoring and tracking the defined visual feature 40, and thereby the position and orientation of the structure 10.
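With three or more tracked reference marks whose layout on the structure is known, the full six-degree-of-freedom pose can be recovered by fitting a rigid transform. The sketch below uses the Kabsch algorithm, a standard prior-art technique and only one possible implementation; the marker layout and the simulated pose are assumed values.

```python
import numpy as np

def fit_pose(model_pts, observed_pts):
    """Fit rotation R and translation t so that observed ~= R @ model + t,
    i.e. the 6-DOF pose of the structure relative to the ROV (Kabsch)."""
    mc, oc = model_pts.mean(axis=0), observed_pts.mean(axis=0)
    H = (model_pts - mc).T @ (observed_pts - oc)
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))   # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = oc - R @ mc
    return R, t

# Three reference marks on the structure, in the structure's own frame (assumed).
model = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0], [0.0, 1.0, 0.0]])
# Simulated observation: structure yawed 10 degrees and displaced.
a = np.radians(10)
R_true = np.array([[np.cos(a), -np.sin(a), 0.0],
                   [np.sin(a),  np.cos(a), 0.0],
                   [0.0, 0.0, 1.0]])
t_true = np.array([0.5, -0.2, 3.0])
observed = model @ R_true.T + t_true
R_est, t_est = fit_pose(model, observed)
```

The three translations come out in t and the three rotations in R, which matches the "all six degrees of freedom" calculation described above.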
In one embodiment of the invention, the tracking software tracking the specific visual feature 40 is linked to software controlling the positioning of the ROV 20. In this way, a ROV 20 with a camera 30 locked to a defined visual feature 40 of a structure 10 can be kept still at a specific and optimal distance for performing accurate and continuous tracking of the position and orientation of said feature 40.
When performing a lifting or lowering operation of a subsea structure 10, the position and orientation of the structure 10 relative to a ROV 20 may be irrelevant.
The vital information for an operator of a lifting or lowering operation of a structure 10 may be the current position and orientation of the structure 10 relative to the sea floor. This information will be available if the ROV 20 is positioned still on the seabed 50. If, however, the ROV 20 is moving, further steps must be taken in order to find the current position and orientation of the structure 10 relative to the sea floor.
According to one embodiment of the invention, finding the position and orientation of the structure 10 relative to the sea floor is possible, even if the ROV 20 is moving around subsea, by using the ROV's 20 positioning sensors, such as a short baseline acoustic positioning system (SBL), depth sensor, accelerometer and gyroscope, for calculating the position and orientation of the ROV 20 relative to the seabed 50, and then combining these calculations with the calculation of the orientation of the visual feature 40 of the structure 10 for determining the position and orientation of the structure 10 relative to the seabed 50.
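Combining the two calculations amounts to composing rigid transforms: the structure's pose relative to the seabed is the ROV-to-seabed transform chained with the structure-to-ROV transform from the camera tracking. The sketch below uses 4x4 homogeneous matrices; all pose values are assumptions for illustration.

```python
import numpy as np

def make_T(R, t):
    """Build a 4x4 homogeneous transform from a rotation R and translation t."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

# Pose of the ROV in the seabed frame (from SBL, depth sensor, gyroscope --
# assumed values: 30-degree yaw, offset 10 m east, 5 m north, 2 m above bottom).
yaw = np.radians(30)
R_rov = np.array([[np.cos(yaw), -np.sin(yaw), 0.0],
                  [np.sin(yaw),  np.cos(yaw), 0.0],
                  [0.0, 0.0, 1.0]])
t_rov = np.array([10.0, 5.0, -2.0])
T_seabed_rov = make_T(R_rov, t_rov)

# Pose of the structure in the ROV frame (from visual-feature tracking; assumed).
t_struct = np.array([0.0, 3.0, 0.5])
T_rov_structure = make_T(np.eye(3), t_struct)

# Chaining the two gives the structure's pose relative to the seabed.
T_seabed_structure = T_seabed_rov @ T_rov_structure
position = T_seabed_structure[:3, 3]
```

The same composition applies frame by frame, so the operator sees the structure's seabed-relative pose even while the ROV itself moves.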
Irrespective of which type of specific visual feature 40 is used when performing the described method, the video processor must know the appearance of the feature 40 prior to calculating position and orientation.
There are several different ways of doing this. One way is letting an operator of the ROV 20 control the camera 30 for zooming in on a specific visual feature 40, prior to inputting an instruction telling the video processor that the zoomed-in feature 40 is the one to lock to and use in the calculations.
Another way is inputting, to the video processor, information defining the specific visual feature 40 to focus on prior to performing a monitoring operation. A ROV 20 system may then operate autonomously by first detecting a subsea structure 10 to be monitored, then focusing and locking on the specific visual feature 40 before performing position and orientation calculations.
The above described method for monitoring a subsea structure 10 can be controlled and executed by a computer program having instructions which, when executed by a computer device or system, will cause the computing device or system to perform the method.
The program can be executed in a computer device installed in the ROV 20. It can further be linked to a video processor and a controlling device for controlling movements of the ROV 20. Resulting monitoring information will then be sent/streamed from the ROV 20 to be displayed at a remote location.
The camera 30 used for performing the inventive method can be connected to a video processor for processing and transmission of monitoring results. Recorded video may also be transmitted elsewhere for processing in a remotely located video processor. When recorded video is being processed, the video processor recognises the defined visual features 40 in the video and calculates their orientation and position as a function of time. Signal processing is used to ensure stable performance and to remove vibrations from the ROV 20 holding the camera 30. Either way, real-time position and orientation data of the structure 10 are presented to an operator controlling lifting and lowering operations of the structure 10.
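A minimal sketch of the kind of signal processing mentioned is a moving-average low-pass filter that suppresses high-frequency camera vibration in the tracked position series. The window length and the synthetic jitter below are assumptions for illustration, not parameters from the patent.

```python
def smooth(series, window=5):
    """Moving-average filter: suppresses high-frequency jitter (e.g. ROV
    vibration) in the tracked feature's position time series."""
    half = window // 2
    out = []
    for i in range(len(series)):
        # Shrink the window at the ends of the series instead of padding.
        lo, hi = max(0, i - half), min(len(series), i + half + 1)
        out.append(sum(series[lo:hi]) / (hi - lo))
    return out

# Constant 100 mm position with an alternating +/-1 mm vibration on top.
raw = [100 + (1 if i % 2 else -1) for i in range(20)]
filtered = smooth(raw, window=5)
```

After filtering, the residual excursion around the true 100 mm position is well below the +/-1 mm vibration amplitude; a real system might instead use a proper low-pass or Kalman filter tuned to the ROV's vibration spectrum.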
The present invention provides a novel and efficient method for monitoring a subsea structure 10 for detecting position and orientation. It is well suited for lifting and lowering operations, but just as well suited for inspection and surveying purposes.
Claims (10)
1. A method for monitoring a subsea structure (10) for detecting position and orientation of the structure (10), c h a r a c t e r i z e d i n:
- providing a ROV (20) with at least a first camera (30) generating a video stream of pictures of the structure (10) to be monitored;
- controlling the ROV (20) such that the camera (30) is always directed at the structure (10);
- locking the focus of the camera (30) for tracking a specific visual feature (40) of the structure (10) and video recording said feature (40) over time, and
- interpreting the video stream in a video processor by calculating position and rotation of said visual feature (40) for detecting position and rotation of the structure (10) relative to the ROV (20).
2. The method according to claim 1, c h a r a c t e r i z e d i n that the ROV (20) is provided with a second camera (35) generating a video stream with pictures of the structure (10) to be monitored.
3. The method according to claim 2, c h a r a c t e r i z e d i n that the first and second cameras (30, 35) are configured to provide spatial 3D measurements of the structure (10) to be monitored.
4. The method according to any of the previous claims,
c h a r a c t e r i z e d i n linking software controlling said at least one camera (30), tracking the specific visual feature (40), to software controlling the ROV (20) such that the ROV (20) is kept still at a specific and optimal distance for performing accurate and continuous tracking of position and orientation of said feature (40).
5. The method according to any of the previous claims,
c h a r a c t e r i z e d i n that the visual feature (40) is provided as a reference mark by attaching a marker, label or paint to the structure (10) to be monitored.
6. The method according to any of the previous claims,
c h a r a c t e r i z e d i n that calculation of position and orientation includes calculation of all six degrees of freedom of the subsea structure (10).
7. The method according to any of the previous claims,
c h a r a c t e r i z e d i n using the ROV’s (20) positioning and motion sensors for calculating orientation of the ROV (20) relative to seabed (50).
8. The method according to claim 7, c h a r a c t e r i z e d i n combining the calculation of orientation of the ROV (20) relative to seabed (50) with the calculation of position and orientation of the visual feature (40) for determining the position and orientation of the structure (10) relative to seabed (50).
9. The method according to any of the previous claims,
c h a r a c t e r i z e d i n presenting position and orientation data of the structure (10) to an operator surveying the structure (10) during or after installation of the structure (10).
10. A computer program having instructions which when executed by a computer device or system cause the computing device or system to perform a method according to any of the claims 1 to 9.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
NO20161239A NO342795B1 (en) | 2016-07-28 | 2016-07-28 | Method for detecting position and orientation of a subsea structure using an ROV |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
NO20161239A NO342795B1 (en) | 2016-07-28 | 2016-07-28 | Method for detecting position and orientation of a subsea structure using an ROV |
Publications (2)
Publication Number | Publication Date |
---|---|
NO20161239A1 true NO20161239A1 (en) | 2018-01-29 |
NO342795B1 NO342795B1 (en) | 2018-08-06 |
Family
ID=62103935
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
NO20161239A NO342795B1 (en) | 2016-07-28 | 2016-07-28 | Method for detecting position and orientation of a subsea structure using an ROV |
Country Status (1)
Country | Link |
---|---|
NO (1) | NO342795B1 (en) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN111354036A (en) * | 2018-12-20 | 2020-06-30 | 核动力运行研究所 | Underwater optical positioning algorithm applied to pressure container |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4470680A (en) * | 1982-09-21 | 1984-09-11 | Minolta Camera Kabushiki Kaisha | Water-proof photographic camera with an automatic focusing device |
JP2000035312A (en) * | 1998-07-16 | 2000-02-02 | Penta Ocean Constr Co Ltd | Measuring apparatus for position of structure such as block installed in submerged zone |
WO2002086288A1 (en) * | 2001-04-24 | 2002-10-31 | Fmc Technologies, Inc. | Acoustic monitoring system for subsea wellhead tools and downhole equipment |
US20100226541A1 (en) * | 2009-03-03 | 2010-09-09 | Hitachi - Ge Nuclear Energy, Ltd. | System and method for detecting position of underwater vehicle |
WO2014067683A1 (en) * | 2012-10-30 | 2014-05-08 | Total Sa | A method for controlling navigation of an underwater vehicle |
WO2016068715A1 (en) * | 2014-10-31 | 2016-05-06 | Fugro N.V. | Underwater positioning system |
-
2016
- 2016-07-28 NO NO20161239A patent/NO342795B1/en unknown
Non-Patent Citations (2)
Title |
---|
ISHIDA, M ET AL, «Marker based camera pose estimation for underwater robots», published in 2012 IEEE/SICE International Symposium on System Integration (SII), Kyushu University, Fukuoka, Japan, 16-18 December 2012, pages 629-634, INSPEC Accession Number: 13286782, DOI: 10.1109/SII.2012.6427353 * |
SHKURTI, F ET AL, «Feature Tracking Evaluation for Pose Estimation in Underwater Environments», published in 2011 Canadian Conference on Computer and Robot Vision, IEEE Computer Society, 25-27 May 2011, INSPEC Accession Number: 12122249, DOI: 10.1109/CRV.2011.28 * |
Also Published As
Publication number | Publication date |
---|---|
NO342795B1 (en) | 2018-08-06 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US10871567B2 (en) | Underwater optical positioning systems and methods | |
US9909864B2 (en) | System, device and method for tracking position and orientation of vehicle, loading device and cargo in loading device operations | |
EP3213104B1 (en) | Underwater positioning system | |
US8655022B2 (en) | System and method for detecting position of underwater vehicle | |
US10323941B2 (en) | Offshore positioning system and method | |
KR101381218B1 (en) | Apparartus and method for generating an around view of a remotely operated vehicle | |
EP3382335B1 (en) | System, method and computer program product for determining a position and/or attitude of an offshore construction | |
US20170074664A1 (en) | Underwater Inspection System Using An Autonomous Underwater Vehicle ("AUV") In Combination With A Laser Micro Bathymetry Unit (Triangulation Laser) and High Definition Camera | |
CN106679662B (en) | A kind of underwater robot list beacon Combinated navigation method based on TMA technology | |
WO2005033629A2 (en) | Multi-camera inspection of underwater structures | |
Choi et al. | Development of a ROV for visual inspection of harbor structures | |
CN107850644A (en) | Localization, mapping and touch feedback for confined space inside inspection machine | |
KR102298643B1 (en) | 3D modeling method of underwater surfaces using infrared thermal imaging camera and drone | |
Menna et al. | Towards real-time underwater photogrammetry for subsea metrology applications | |
WO2014067684A1 (en) | Method to enhance underwater localization | |
NO20161239A1 (en) | Method for detecting position and orientation of a subsea structure using an ROV | |
US11461906B2 (en) | Systems and methods for monitoring offshore structures | |
KR101438514B1 (en) | Robot localization detecting system using a multi-view image and method thereof | |
KR101812027B1 (en) | Method and system for estimating location of a plurality of underwater robot connercted by cable | |
KR20140064292A (en) | Precise underwater positioning system for remotely-operated vehicle | |
Honda et al. | Borehole Imaging by Applying 3D Visual SLAM to Borehole Images Acquired in Forward Vision Camera System | |
LERTUTSAHAKUL et al. | Suspended Obstacle Detection for Plant Site Inspection Robots with Monoscopic Camera | |
Jang et al. | Development of unmanned underwater excavation equipment for port construction | |
Drews et al. | Real-time depth estimation for underwater inspection using dual laser and camera | |
CN117211768A (en) | State detection method and control unit for drilling machine |