CN113093773B - Underwater structure detection method, system, equipment and medium based on underwater robot - Google Patents


Info

Publication number
CN113093773B
CN113093773B (application CN202110645798.5A)
Authority
CN
China
Prior art keywords
detection
task
subtask
observation
underwater robot
Prior art date
Legal status
Active
Application number
CN202110645798.5A
Other languages
Chinese (zh)
Other versions
CN113093773A
Inventor
魏建仓
郭红梅
侯明波
徐俊博
闫昊
Current Assignee
Deepinfar Ocean Technology Inc
Original Assignee
Deepinfar Ocean Technology Inc
Priority date
Filing date
Publication date
Application filed by Deepinfar Ocean Technology Inc filed Critical Deepinfar Ocean Technology Inc
Priority to CN202110645798.5A
Publication of CN113093773A
Application granted
Publication of CN113093773B
Legal status: Active
Anticipated expiration

Classifications

    • G: PHYSICS
    • G05: CONTROLLING; REGULATING
    • G05D: SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00: Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D1/04: Control of altitude or depth
    • G05D1/06: Rate of change of altitude or depth
    • G05D1/0692: Rate of change of altitude or depth specially adapted for under-water vehicles
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F9/00: Arrangements for program control, e.g. control units
    • G06F9/06: Arrangements for program control using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F9/44: Arrangements for executing specific programs
    • G06F9/448: Execution paradigms, e.g. implementations of programming paradigms
    • G06F9/4482: Procedural

Abstract

The present disclosure provides an underwater structure detection method, system, device and medium based on an underwater robot, the method comprising: acquiring structural parameters of the underwater structure and generating a simulated operation surface of the underwater structure; in response to the user selecting a simulated operation surface and issuing a task instruction, planning and generating a corresponding detection task, wherein the detection task comprises at least one of the following: a floating observation task, an appearance detection task or a corrosion detection task; controlling the underwater robot to execute the corresponding detection operation based on the detection task; and outputting the detection result of the detection task. The method is suitable for detection operations on large-area, multi-orientation underwater structures, makes full use of the advantages of the underwater robot, and in particular improves the efficiency of underwater structure detection for semi-submersible platforms.

Description

Underwater structure detection method, system, equipment and medium based on underwater robot
Technical Field
The present disclosure relates to the technical field of underwater detection, and in particular to underwater structure detection based on an underwater robot.
Background
China's sea areas are vast. As offshore oil development intensifies, semi-submersible ocean platforms are gradually moving into deeper waters and into sea areas with harsh environments. These sea areas hold a large amount of valuable natural resources, but harsh natural conditions such as high temperature, high humidity, and high salinity and alkalinity put the platforms to a severe test. To reduce corrosion of marine engineering equipment and prolong its service life, marine steel structures are usually protected by a combination of anticorrosive coatings and cathodic protection. However, as service life increases, the protective capability of the cathodic protection system gradually declines, placing the entire marine engineering installation at risk of corrosion. Such corrosion risks are often hidden and difficult to detect directly, so periodic inspection of the underwater structure of a semi-submersible platform is essential for detecting them in time. At present, underwater-robot-assisted offshore platform operations focus mainly on subsea pipeline detection, ocean engineering installation, Christmas tree detection, and jacket platform detection; no underwater structure detection exists for semi-submersible offshore platforms, in particular deepwater semi-submersible platforms. Existing robot-assisted offshore platform operations therefore cannot be applied to underwater structures with large steel-structure areas and complex orientations, such as those of a semi-submersible platform, leading to inconvenient detection, low detection efficiency, inaccurate detection results, and similar problems.
Disclosure of Invention
In view of the above, an object of the present disclosure is to provide an underwater structure detection method, system, device and medium based on an underwater robot.
Based on the above purpose, the present disclosure provides an underwater structure detection method based on an underwater robot, including:
acquiring structural parameters of the underwater structure, and generating a simulated operation surface of the underwater structure;
in response to the user selecting a simulated operation surface and issuing a task instruction, planning and generating a corresponding detection task; wherein the detection task comprises at least one of the following: a floating observation task, an appearance detection task or a corrosion detection task;
controlling the underwater robot to execute corresponding detection operation based on the detection task;
outputting a detection result of the detection task;
wherein, for a floating observation task, the controlling the underwater robot to execute corresponding detection operations based on the detection task comprises:
generating at least one observation subtask and a floating observation route based on a preset observation radius;
sequentially executing the observation subtasks from a first initial position according to the floating observation route;
in response to the observation result of the current observation subtask not meeting the requirement, suspending the current observation subtask;
executing the next observation subtask;
for an apparent detection task, controlling the underwater robot to perform a corresponding detection operation based on the detection task comprises:
generating an apparent subtask of the apparent detection task based on a preset detection point;
in response to the user selecting the apparent subtask, controlling the underwater robot to move to the preset detection point corresponding to the apparent subtask;
in response to the underwater robot detecting a target appearance at the preset detection point, the underwater robot tracking the target appearance and drawing a shape of the target appearance;
for a corrosion detection task, controlling the underwater robot to perform corresponding detection operations based on the detection task comprises:
in response to the user selecting a to-be-detected area on the simulated operation surface, generating a corrosion detection subtask together with its corresponding inspection route and scanning length;
and executing the corrosion detection subtask according to the corresponding inspection route and the scanning length.
Optionally, for a floating observation task, the controlling the underwater robot to perform a corresponding detection operation based on the detection task further includes:
responding to a task resuming instruction of a user, and continuing to execute the suspended observation subtask;
judging whether the suspended observation subtask needs to be calibrated or not;
and responding to a calibration instruction of a user, and re-executing the suspended observation subtask after calibrating the underwater robot.
Optionally, the method further comprises:
in a detection task, in response to a marking instruction of a user, updating or querying a detection mark at a specified position in the simulated operation surface; wherein the specified position comprises the position in the simulated operation surface corresponding to the current position of the underwater robot.
Optionally, the updating or querying of the detection mark at the specified position in the simulated operation surface in response to the marking instruction of the user includes at least one of:
adding the detection mark at the specified position in response to a mark adding instruction of a user;
or, in response to the user selecting the detection marker and a marker alteration instruction, altering the user selected detection marker;
or, in response to the user selecting the detection marker and a marker deletion instruction, deleting the detection marker selected by the user;
or, in response to a mark query instruction of the user, querying the detection marks and browsing the query results.
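As a concrete illustration, the four marking operations above can be sketched as a minimal in-memory marker store (the class and method names here are illustrative assumptions, not part of the disclosure):

```python
from dataclasses import dataclass


@dataclass
class DetectionMarker:
    # A marker placed at a position on the simulated operation surface.
    marker_id: int
    position: tuple            # (x, y) on the simulated operation surface
    category: str = "defect"   # e.g. "corrosion", "damage"
    note: str = ""


class MarkerStore:
    """Hypothetical store backing the add / alter / delete / query instructions."""

    def __init__(self):
        self._markers = {}
        self._next_id = 1

    def add(self, position, category="defect", note=""):
        marker = DetectionMarker(self._next_id, position, category, note)
        self._markers[marker.marker_id] = marker
        self._next_id += 1
        return marker.marker_id

    def alter(self, marker_id, **changes):
        marker = self._markers[marker_id]
        for key, value in changes.items():
            setattr(marker, key, value)

    def delete(self, marker_id):
        del self._markers[marker_id]

    def query(self, category=None):
        # Browse all markers, optionally filtered by category.
        return [m for m in self._markers.values()
                if category is None or m.category == category]
```

A real implementation would persist the markers together with the operation data managed by the data management module; this sketch only shows the add/alter/delete/query life cycle.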
According to a second aspect of the present disclosure, there is provided an underwater structure detection system based on an underwater robot, comprising:
the operation planning module is used for generating a simulated operation surface of the underwater structure based on the structural parameters of the underwater structure, and for planning and generating a corresponding detection task in response to the user selecting a simulated operation surface and issuing a task instruction; wherein the detection task comprises at least one of the following: a floating observation task, an appearance detection task or a corrosion detection task;
the operation monitoring module is used for controlling the underwater robot to execute corresponding detection operation based on the detection task, monitoring the operation process of the underwater robot and recording operation data of the detection task;
the data management module is used for outputting the detection result of the detection task and managing the operation data;
wherein, for a floating observation task, the controlling the underwater robot to execute corresponding detection operations based on the detection task comprises:
generating at least one observation subtask and a floating observation route based on a preset observation radius;
sequentially executing the observation subtasks from a first initial position according to the floating observation route;
in response to the observation result of the current observation subtask not meeting the requirement, suspending the current observation subtask;
executing the next observation subtask;
for an apparent detection task, controlling the underwater robot to perform a corresponding detection operation based on the detection task comprises:
generating an apparent subtask of the apparent detection task based on a preset detection point;
in response to the user selecting the apparent subtask, controlling the underwater robot to move to the preset detection point corresponding to the apparent subtask;
in response to the underwater robot detecting a target appearance at the preset detection point, the underwater robot tracking the target appearance and drawing a shape of the target appearance;
for a corrosion detection task, controlling the underwater robot to perform corresponding detection operations based on the detection task comprises:
in response to the user selecting a to-be-detected area on the simulated operation surface, generating a corrosion detection subtask together with its corresponding inspection route and scanning length;
and executing the corrosion detection subtask according to the corresponding inspection route and the scanning length.
According to a third aspect of the present disclosure, there is provided an electronic device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, the processor implementing the method according to the first aspect when executing the program.
According to a fourth aspect of the present disclosure, there is provided a non-transitory computer readable storage medium storing computer instructions for causing a computer to perform the method of the first aspect.
As can be seen from the above, the underwater structure detection method, system, device and medium based on the underwater robot provided by the present disclosure plan and generate corresponding detection tasks according to the operational characteristics of each detection task, and control the underwater robot to perform the corresponding detection operations so as to complete the detection of the underwater structure and obtain a detection result. The approach is suitable for detection operations on large-area, multi-orientation underwater structures, makes full use of the advantages of the underwater robot, and in particular improves the efficiency of underwater structure detection for semi-submersible platforms.
Drawings
In order to more clearly illustrate the technical solutions in the present disclosure or related technologies, the drawings needed to be used in the description of the embodiments or related technologies are briefly introduced below, and it is obvious that the drawings in the following description are only embodiments of the present disclosure, and for those skilled in the art, other drawings can be obtained according to these drawings without creative efforts.
FIG. 1 is a schematic block diagram of an underwater robot-based underwater structure detection system in accordance with an embodiment of the present disclosure;
FIG. 2 is a schematic flow chart diagram of an underwater robot-based underwater structure detection method in accordance with an embodiment of the present disclosure;
FIG. 3 is a schematic flow diagram of a floating observation task according to an embodiment of the present disclosure;
FIG. 4 is a schematic flow diagram of an appearance detection task according to an embodiment of the present disclosure;
FIG. 5 is a schematic flow chart diagram of a corrosion detection task according to an embodiment of the present disclosure;
fig. 6 is a more specific hardware structure diagram of an electronic device according to an embodiment of the present disclosure.
Detailed Description
For the purpose of promoting a better understanding of the objects, aspects and advantages of the present disclosure, reference is made to the following detailed description taken in conjunction with the accompanying drawings.
It is to be noted that technical terms or scientific terms used in the embodiments of the present disclosure should have a general meaning as understood by those having ordinary skill in the art to which the present disclosure belongs, unless otherwise defined. The use of "first," "second," and similar terms in the embodiments of the disclosure is not intended to indicate any order, quantity, or importance, but rather to distinguish one element from another. The word "comprising" or "comprises", and the like, means that the element or item listed before the word covers the element or item listed after the word and its equivalents, but does not exclude other elements or items. The terms "connected" or "coupled" and the like are not restricted to physical or mechanical connections, but may include electrical connections, whether direct or indirect. "upper", "lower", "left", "right", and the like are used merely to indicate relative positional relationships, and when the absolute position of the object being described is changed, the relative positional relationships may also be changed accordingly.
The existing detection technology for the underwater structures of offshore platform jackets cannot be applied to the large-area, complex-orientation steel structures of a semi-submersible platform, which leads to inconvenient detection, low detection efficiency, inaccurate detection results, and similar problems.
Based on the above consideration, the embodiment of the present disclosure provides an underwater structure detection system based on an underwater robot. Referring to fig. 1, fig. 1 shows a schematic block diagram of an underwater robot-based underwater structure detection system according to an embodiment of the present disclosure. As shown in fig. 1, the underwater structure detecting system based on the underwater robot includes:
an operation planning module 110, configured to generate a simulated operation surface of the underwater structure based on the structural parameters of the underwater structure, and to plan and generate a corresponding detection task in response to the user selecting a simulated operation surface and issuing a task instruction; wherein the detection task comprises at least one of the following: a floating observation task, an appearance detection task or a corrosion detection task;
the operation monitoring module 120 is configured to monitor an operation process of the underwater robot and record operation data of the detection task when the underwater robot is controlled to perform a corresponding detection operation based on the detection task;
and the data management module 130 is configured to output a detection result of the detection task and manage the operation data of the underwater robot.
Optionally, the job planning module 110 may include a job plane module 111 and a mission planning module 112.
Further, in some embodiments, the work surface module 111 may also include a work surface input module 1111 and a work surface display module 1112. Still further, in some embodiments, the work surface input module 1111 may obtain structural parameters of the underwater structure from a data source or through user input, and construct one or more simulated work surfaces corresponding to the underwater structure based on those parameters. In some embodiments, the work surface display module 1112 may display the simulated work surfaces generated by the work surface input module 1111 to the user.
Specifically, for the underwater structure of a semi-submersible platform, the work surface input module 1111 may construct a simulated underwater three-dimensional structure of the platform based on the structural parameters (which may be provided in document form), and divide the simulated structure into images of a plurality of work surfaces. The work surface display module 1112 then displays the simulated work surfaces to the user in two or three dimensions.
Further, in some embodiments, the mission planning module 112 may also include at least one of a floating observation module 1121, an appearance detection module 1122, or a corrosion detection module 1123.
Still further, in some embodiments, the floating observation module 1121 may be used to plan a floating observation task, for example a floating observation route and multiple observation subtasks, which may be presented in tabular form.
Still further, in some embodiments, the appearance detection module 1122 may be used to plan the apparent subtasks, for example by planning different key positions as preset detection points for apparent detection. The apparent subtasks may also be presented in tabular form. In some embodiments, the appearance detection module 1122 may also draw the appearance and/or contour of the target appearance detected by the underwater robot on the simulated work surface. The target appearance is an appearance of interest: when the detected image is normal, the underwater robot need not process it; when an abnormality (i.e., a deviation from normal) is detected, the image is analyzed as the target appearance to determine whether damage has occurred.
Still further, in some embodiments, the corrosion detection module 1123 may be configured to plan the areas to be detected, the corrosion detection subtasks, and the inspection parameters of those subtasks. The corrosion detection subtasks may also be presented in tabular form. Because the detection environment differs between regions to be detected, some regions cannot be inspected at short range, so the scanning length for corrosion detection must be planned in advance according to the actual operating environment. Compared with conventional detection that uses a uniform scanning length, the task planning module 112 of the disclosed embodiments avoids the loss of detection efficiency and accuracy caused by manual position adjustment, and is better suited to different application scenarios with complex structures.
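The per-region planning described above can be sketched as follows, with each to-be-detected region carrying its own scanning length chosen from its actual environment (the boustrophedon route and the dictionary fields are illustrative assumptions, not the disclosed implementation):

```python
def plan_corrosion_subtasks(regions):
    """Generate one corrosion detection subtask per region to be detected.

    `regions` is a list of dicts with 'name', 'corners' ((x0, y0), (x1, y1))
    on the simulated operation surface, and a per-region 'scan_length_m'
    planned in advance from that region's operating environment.
    """
    subtasks = []
    for i, region in enumerate(regions, start=1):
        (x0, y0), (x1, y1) = region["corners"]
        step = region["scan_length_m"]
        # A simple boustrophedon (lawnmower) inspection route over the
        # region, with passes spaced by the region's own scanning length.
        route, y, left_to_right = [], y0, True
        while y <= y1:
            xs = (x0, x1) if left_to_right else (x1, x0)
            route += [(xs[0], y), (xs[1], y)]
            y += step
            left_to_right = not left_to_right
        subtasks.append({"id": i, "region": region["name"],
                         "scan_length_m": step, "route": route})
    return subtasks
```

Regions that must be scanned from farther away simply supply a larger `scan_length_m`, so no manual position adjustment is needed at execution time.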
Optionally, the job monitoring module 120 may include: a position monitoring module 121 and a trajectory monitoring module 122.
In some embodiments, the position monitoring module 121 may be used to monitor the position of the underwater robot and display it in the simulated work surface.
In some embodiments, the trajectory monitoring module 122 may be configured to monitor an actual trajectory of the underwater robot, a task completion area, a display of a detected object, and a record of operation data. For example, in the floating observation task, the trajectory monitoring module 122 may monitor the actual running trajectory of the underwater robot and the task completion area; in the apparent detection task, the trajectory monitoring module 122 may display the trend of the detection object; in the corrosion detection task, the trajectory monitoring module 122 may monitor an actual operation trajectory and a task completion area. The actual running track can be recorded in a document form.
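A minimal sketch of the trajectory-monitoring bookkeeping: each new position is appended to the actual running track, and the task completion area is tracked as grid cells on the work surface (the grid representation is an illustrative assumption; the disclosure only requires that the track and completed area be monitored and recorded):

```python
def record_track(track, position, completed_area, cell_size_m=1.0):
    """Append the robot's latest position to the actual running track and
    mark the corresponding grid cell of the work surface as completed."""
    track.append(position)
    x, y = position
    completed_area.add((int(x // cell_size_m), int(y // cell_size_m)))
```

The `track` list can then be written out in document form, and `completed_area` shaded on the simulated work surface.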
Optionally, the data management module 130 may include a task management module 131 and a track management module 132. Further, the data management module 130 may further include a mark management module 133.
In some embodiments, the task management module 131 may include a task loading module 1311 and a task state management module 1312. Further, the task loading module may be configured to load a task and its historical state; the task state management module may be used for start and pause control of a detection task, display of a detection task's execution progress, and cancellation of a detection task.
In some embodiments, the track management module 132 may include a historical track loading and playback module 1321 and a track display module 1322. The historical track loading and playback module loads the data of a task selected for playback and displays the previous operation process. The track display module clears, displays, and hides tracks during operation.
In some embodiments, the tag management module 133 may include a tag category editing module 1331, a tag add/delete module 1332, a tag query module 1333, and a tag browsing module 1334. The tag category editing module 1331 is used to add, modify, and delete the attributes of a tag (including but not limited to its category and display color). The tag add/delete module 1332 is used to add tags with descriptions and to delete obsolete tags on the simulated work surface. The tag query module 1333 is used to query the tags stored from past detections. The tag browsing module 1334 is used to browse the tag information selected by the user, including its category, adding time, and description of the problem.
For convenience of description, the above system is described with the functions divided into various modules, which are described separately. Of course, the functionality of the various modules may be implemented in the same one or more software and/or hardware implementations of the present disclosure.
According to the embodiments of the present disclosure, an underwater structure detection method based on the underwater robot is also provided, which can be implemented on the underwater structure detection system of the disclosed embodiments and has the beneficial effects of the corresponding system embodiment. It should be noted that the underwater robot in the embodiments of the present disclosure may be an ROV (Remotely Operated Vehicle), a type of unmanned underwater vehicle (UUV), and may include: a power propeller, a remote-control electronic communication device, a camera pitch gimbal, user peripheral sensor interfaces, a real-time online display unit, a navigation and positioning device, an autopilot navigation unit, auxiliary lighting, a tow cable, and other components. ROVs serve many functions, with different types used for different tasks, and are widely applied in fields such as the military, coast guard, maritime affairs, customs, nuclear power, hydropower, offshore oil, fisheries, marine rescue, pipeline detection, and marine scientific research.
Referring to fig. 2, fig. 2 shows a schematic flow chart of an underwater structure detection method based on an underwater robot according to an embodiment of the present disclosure. As shown in fig. 2, the underwater structure detection method based on the underwater robot includes:
step S210, obtaining structural parameters of the underwater structure, and generating a simulated operation surface of the underwater structure;
step S220, in response to the user selecting a simulated operation surface and issuing a task instruction, planning and generating a corresponding detection task; wherein the detection task comprises at least one of the following: a floating observation task, an appearance detection task or a corrosion detection task;
step S230, controlling the underwater robot to execute corresponding detection operation based on the detection task;
and step S240, outputting the detection result of the detection task.
The embodiments of the present disclosure construct simulated operation surfaces from the structural parameters of the underwater structure. After the user selects a simulated operation surface and the task to be performed, the corresponding detection task is generated automatically, and the underwater robot is controlled to execute the corresponding detection operation according to the content of the detection task (a detection operation being any operation performed to complete the detection task, including but not limited to moving, image capture, and the like), thereby completing the detection and obtaining the corresponding detection result. This is suitable for detection operations on large-area, multi-orientation underwater structures, makes full use of the advantages of the underwater robot, and improves the efficiency of underwater structure detection for semi-submersible platforms.
According to the embodiment of the present disclosure, step S210, obtaining the structural parameters of the underwater structure, and generating the simulated operation surface of the underwater structure, includes:
obtaining the structural parameters of the underwater structure from a data source;
and reading the structure parameters by using graph drawing software to generate the simulation operation surface.
With reference to fig. 1 and fig. 2, the operation surface module 111 may automatically obtain the relevant structural parameters of the underwater structure from a data source and read them with the graphics drawing software in the module to generate the corresponding simulated operation surfaces. Alternatively, the structural parameters may be entered manually into the operation surface module 111, whose graphics drawing software then generates the corresponding simulated operation surfaces. The operation surface module 111 displays one or more simulated operation surfaces in two-dimensional form, or displays the multiple simulated operation surfaces of the underwater structure in three-dimensional form, presenting them to the user more vividly and helping the user select the area to be detected more accurately.
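As an illustration of generating simulated operation surfaces from structural parameters, the sketch below reduces the underwater structure to a box-shaped hull and emits one planar surface per submerged face (the box model and parameter names are simplifying assumptions; a real structure would be read from the data-source document):

```python
def build_simulated_work_surfaces(length_m, width_m, draft_m):
    """Divide a simplified box-shaped underwater hull into planar work
    surfaces (bottom plus four sides), each with a name and dimensions.
    The box model is an illustrative simplification of the structural
    parameters obtained from a data source or user input."""
    if min(length_m, width_m, draft_m) <= 0:
        raise ValueError("structural parameters must be positive")
    return [
        {"name": "bottom",    "size": (length_m, width_m)},
        {"name": "port",      "size": (length_m, draft_m)},
        {"name": "starboard", "size": (length_m, draft_m)},
        {"name": "bow",       "size": (width_m,  draft_m)},
        {"name": "stern",     "size": (width_m,  draft_m)},
    ]
```

Each returned surface can then be rendered in two dimensions, or the set assembled into a three-dimensional view, as the text describes.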
According to the embodiment of the disclosure, in step S220, in response to the user selecting the simulated working plane and the task instruction, planning and generating a corresponding detection task; wherein the detection task comprises at least one of the following: a floating observation task, an appearance detection task or a corrosion detection task; and in step S230, controlling the underwater robot to execute corresponding detection operation based on the detection task.
Optionally, the detection task includes a floating observation task, and the controlling the underwater robot to perform a corresponding detection operation based on the detection task includes:
generating at least one observation subtask and a floating observation route based on a preset observation radius;
sequentially executing the observation subtasks from a first initial position according to the floating observation route;
in response to the observation result of the current observation subtask not meeting the requirement, suspending the current observation subtask;
the next observation subtask is executed.
The preset observation radius refers to the radius within which the underwater robot can observe; for example, if the observation range of the robot's camera is 1.5 m, the preset observation radius may be set to 1.5 m.
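Under that reading of the preset observation radius, the observation subtasks can be generated by spacing waypoints along the floating observation route so that adjacent coverage circles just touch; the 2 x radius spacing is an illustrative choice, not mandated by the disclosure:

```python
import math


def plan_floating_observation(route_length_m, observation_radius_m):
    """Split a floating observation route of the given length into
    equally spaced observation subtasks, spacing the waypoints at twice
    the preset observation radius so adjacent coverage circles touch."""
    if observation_radius_m <= 0:
        raise ValueError("observation radius must be positive")
    spacing = 2.0 * observation_radius_m
    count = max(1, math.ceil(route_length_m / spacing))
    return [{"id": i + 1,
             "along_route_m": min(i * spacing, route_length_m),
             "status": "pending"}
            for i in range(count)]
```

For a 9 m route and a 1.5 m camera range this yields three observation subtasks, the first starting at the first initial position (0 m along the route).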
In some embodiments, an observation subtask may include a subtask name (which may be a number or an ID) and a subtask status tag indicating the subtask's current job status. Still further, each observation subtask may be provided with a control area for removing or adding observation subtasks.
Specifically, according to the embodiments of the present disclosure, the user may select the simulated work surface and issue the task instruction for the floating observation task through an interactive device. The job planning module 110 then generates at least one observation subtask and a floating observation route based on the preset observation radius, and all observation subtasks can be completed by following that route. For the first observation subtask, after the position of the underwater robot is initialized, the robot moves to a first initial position, executes the first observation subtask from there, and then executes the subsequent subtasks in sequence along the floating observation route. While executing the floating observation task, the underwater robot transmits its observation results (such as sensor data or images) back to the operation monitoring module 120 for display to the user. If the observation result of the current subtask does not meet the requirement (for example, the observation object is not found, i.e., has been lost), the operation monitoring module 120 suspends the current observation subtask. Likewise, if the underwater robot detects a fault and sends fault indication information to the operation monitoring module 120, the current observation subtask can be suspended.
Further, if the cause of the unsatisfactory observation result is resolved or the fault of the underwater robot is eliminated, the current observation subtask can be resumed and completed; if the fault of the underwater robot persists, the entire floating observation task remains suspended or is stopped; and if the observation results returned by the underwater robot remain unsatisfactory, the current observation subtask remains suspended or is stopped, and the next observation subtask is executed, until all observation subtasks are completed.
In some embodiments, said controlling said underwater robot to perform a respective detection operation based on said detection task further comprises:
responding to a task resuming instruction of a user, and continuing to execute the suspended observation subtask;
judging whether the suspended observation subtask needs to be calibrated or not;
and responding to a calibration instruction of a user, and re-executing the suspended observation subtask after calibrating the underwater robot.
Specifically, after the cause of suspending the current observation subtask is removed, the user may issue a task resuming instruction to continue executing the suspended observation subtask. It is then judged whether the suspended observation subtask requires calibration of the underwater robot or of related parameters. If the user chooses to calibrate and sends a corresponding calibration instruction, the underwater robot (for example, its operating parameters) or the related parameters (for example, the initial parameters set for the job, such as the initial position of the underwater robot) are calibrated, and the suspended observation subtask is then re-executed.
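The resume-with-optional-calibration logic can be sketched as follows. This is a minimal illustration assuming the subtask and robot are plain dictionaries; the function name and fields (`status`, `progress`, `initial_position`) are hypothetical.

```python
def resume_subtask(subtask, robot, needs_calibration, calibration_params=None):
    """Resume a paused observation subtask; if the user issued a calibration
    instruction, calibrate the robot (e.g. its initial position or operating
    parameters) and re-execute the subtask from the beginning."""
    if subtask["status"] != "paused":
        raise ValueError("only a paused subtask can be resumed")
    if needs_calibration:
        robot.update(calibration_params or {})  # e.g. {"initial_position": (0, 0)}
        subtask["progress"] = 0                 # re-execute from the start
    subtask["status"] = "running"
    return subtask
```

Without calibration the subtask simply continues from where it was paused, matching the two branches of steps S390/S391 below.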
In some embodiments, the method further comprises: and displaying the corresponding actual running track of the underwater robot and/or the area which has completed the task on the simulated operation surface.
The actual running track of the underwater robot can be drawn and displayed to the user by the track monitoring module 122, so that the user can comprehensively grasp the detection operation process in an intuitive, visual manner, find and solve problems in a timely fashion, and thereby improve detection efficiency.
In some embodiments, referring to fig. 3, a schematic flow diagram of a floating observation task according to an embodiment of the present disclosure is shown. As shown in fig. 3, controlling the underwater robot to perform a corresponding detection operation of a floating observation task based on the detection task may include:
step S310, start; executing step S320;
step S320, selecting a simulation operation surface by a user; executing step S330;
step S330, the user selects a button corresponding to the floating observation task, for example, clicking a task planning button and selecting the floating observation task; step S340 is executed;
step S340, in response to the user selecting the simulated operation surface and the task instruction, the user operation planning module 110 may pop up a setting box for the user to set an observation radius, so as to obtain a preset observation radius; after the user confirms, the user operation planning module 110 may plan and generate at least one observation subtask and a floating observation route based on the task instruction of the simulated operation surface and the floating observation task selected by the user and the preset observation radius; executing the step S350;
step S350, initializing the position of the underwater robot; a user can manually click a position for initialization, and the initial position for initializing the underwater robot is also taken as a first initial position; executing step S360;
step S360, the operation monitoring module 120 controls the underwater robot to sequentially execute the observation subtasks according to the floating observation route; step S370 is performed;
step S370, the operation monitoring module 120 monitors the operation process of the underwater robot, and records the operation data of the detection task, including drawing an actual operation track and receiving real-time detection data; the user can observe and master the operation condition of the underwater robot in time according to the actual running track and the real-time detection data; during the period, if the observation result of the current observation subtask is not satisfactory or the underwater robot has a fault, executing step S380;
step S380, the user can click the task suspension button, the data management module 130 suspends the current observation subtask, the user can perform reason analysis at this time, solve the related problem in time, and execute step S381; the user clicks the task stop button, and the data management module 130 ends the current observation subtask or the entire floating observation task;
step S381, the operation monitoring module 120 stops drawing the actual running track, and may or may not continue to receive the real-time detection data at this time, which is not limited herein; if the reason causing the suspension of the current observation subtask is eliminated, the user can click the task continuation button to execute the step S390;
step S390, the operation monitoring module 120 controls the underwater robot to continue to execute the current observation subtask; executing step S391;
step S391, judging whether the underwater robot or related parameters need to be calibrated; if the user selects the requirement, returning to the step S350; if the user selection is not required, return to step S360.
Optionally, the detection task comprises an apparent detection task, and the controlling the underwater robot to perform the corresponding detection operation based on the detection task comprises:
generating an apparent subtask of the apparent detection task based on a preset detection point;
responding to the user to select the apparent subtask, and controlling the underwater robot to move to the preset detection point corresponding to the apparent subtask;
in response to the underwater robot detecting a target appearance at the preset detection point, the underwater robot tracks the target appearance and draws a shape of the target appearance.
The preset detection point can be set as required, and can be a key position needing apparent detection, such as a joint, a weld joint and the like, determined according to experience or historical records. Each of the appearance subtasks may correspond to a preset detection point.
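The one-to-one correspondence between preset detection points and apparent subtasks can be sketched as below. The function name and dictionary fields are illustrative assumptions only.

```python
def generate_apparent_subtasks(detection_points):
    """Generate one apparent subtask per preset detection point (key positions
    such as joints and weld joints, chosen from experience or history)."""
    return [
        {"name": f"app-{i + 1}", "status": "pending", "point": p}
        for i, p in enumerate(detection_points)
    ]
```

Selecting a subtask then gives the preset detection point the robot should move to, as in the response step described above.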
In some embodiments, the apparent subtasks include preset trajectories. Wherein the preset trajectory may refer to a pre-planned trajectory for executing the apparent subtask.
In some embodiments, controlling the underwater robot to perform the respective detection operation based on the detection task further comprises:
and in response to the underwater robot deviating from the preset range of the preset track, suspending the current apparent subtask.
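A minimal sketch of this deviation check follows, approximating the preset trajectory by its waypoints; the function name and the waypoint-distance approximation are assumptions for illustration.

```python
import math

def check_trajectory(current_pos, preset_trajectory, max_deviation):
    """Return 'running' while the robot stays within the preset range of the
    preset trajectory, 'paused' once the distance from the current position
    to the nearest trajectory waypoint exceeds the allowed deviation."""
    nearest = min(math.dist(current_pos, p) for p in preset_trajectory)
    return "running" if nearest <= max_deviation else "paused"
```

A production system would measure distance to the trajectory's line segments rather than to waypoints, but the pause condition is the same.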
In some embodiments, referring to fig. 4, a schematic flow diagram of an appearance detection task according to an embodiment of the present disclosure is shown in fig. 4. As shown in fig. 4, controlling the underwater robot to perform a corresponding detection operation of an apparent detection task based on the detection task may include:
step S410, start; step S420 is executed;
step S420, selecting a simulation operation surface by a user; executing the step S430;
step S430, selecting a button corresponding to the apparent detection task by a user, such as clicking a task planning button, and selecting the apparent detection task; step S440 is executed;
step S440, initializing the position of the underwater robot; the user can manually click the position for initialization, and the initial position for initializing the underwater robot is also taken as a second initial position; executing the step S450;
step S450, the user can select to manually control the underwater robot to move to a preset detection point, and after the underwater robot detects the target appearance of interest at the preset detection point, step S460 is executed;
in step S460, the user job planning module 110 generates an apparent subtask of the apparent detection task based on the preset detection point, wherein the apparent subtask may include a name (including but not limited to a number or an ID) of the apparent subtask, a subtask status label (for indicating a job status of the apparent subtask at present). Still further, the appearance subtask is further provided with a control area for removing or adding the appearance subtask; step S470 is performed;
step S470, the operation monitoring module 120 monitors the operation process of the underwater robot, including drawing an actual operation track and receiving real-time detection data; during this, if the current position of the underwater robot deviates from the preset range of the preset trajectory, performing step S480;
step S480, a user can click a task pause button, the data management module 130 pauses the current appearance subtask, and the user can analyze the actual running track and the preset track at the moment so as to solve the deviation problem as soon as possible; the operation monitoring module 120 stops drawing the actual running track, and may continue to receive the real-time detection data at this time, or may not continue to receive the real-time detection data, which is not limited herein; if the user clicks the task continuation button, executing step S490; the user clicks the task stop button, and the data management module 130 ends the current appearance subtask or the entire appearance detection task;
in step S490, the data management module 130 continues to execute the current appearance subtask, and returns to step S450.
Optionally, the detection task includes a corrosion detection task, and the controlling the underwater robot to perform a corresponding detection operation based on the detection task includes:
responding to a user to select a to-be-detected area on the simulated operation surface, and generating a corrosion detection subtask, and an inspection route and a scanning length corresponding to the corrosion detection subtask;
and executing the corrosion detection subtask according to the corresponding inspection route and the scanning length.
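The generation of an inspection route and scanning length from a selected area can be sketched as below, assuming a rectangular area and a lawn-mower sweep; the function name, the `(x0, y0, x1, y1)` area encoding, and the lane spacing parameter are illustrative assumptions.

```python
import math

def plan_corrosion_inspection(area, lane_spacing):
    """From a rectangular area to be detected (x0, y0, x1, y1) selected on the
    simulated work surface, generate a lawn-mower inspection route and the
    total scanning length the corrosion detection sensor will travel."""
    x0, y0, x1, y1 = area
    route = []
    y = y0
    left_to_right = True
    while y <= y1:
        if left_to_right:
            route += [(x0, y), (x1, y)]
        else:
            route += [(x1, y), (x0, y)]
        left_to_right = not left_to_right
        y += lane_spacing
    # scanning length = total path length along consecutive route points
    scan_length = sum(math.dist(route[i], route[i + 1])
                      for i in range(len(route) - 1))
    return route, scan_length
```

The route and scanning length would then be sent to the underwater robot, as in step S560 below.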
In some embodiments, referring to fig. 5, a schematic flow chart diagram of corrosion detection tasks according to an embodiment of the present disclosure is shown in fig. 5. As shown in fig. 5, controlling the underwater robot to perform a corresponding detection operation of a corrosion detection task based on the detection task may include:
step S510, start; executing step S520;
step S520, selecting a simulation operation surface by a user; executing step S530;
step S530, selecting a button corresponding to the corrosion detection task by a user, such as clicking a task planning button, and selecting the corrosion detection task; executing step S540;
step S540, initializing the position of the underwater robot; the user can manually click a position for initialization, and the initial position for initializing the underwater robot is also taken as a third initial position; step S550 is executed;
in step S550, the user selects an area to be detected on the simulated work surface, and the user work planning module 110 generates a corrosion detection subtask and a corresponding inspection route according to the area to be detected, where the corrosion detection subtask may include a name (including but not limited to a number or an ID) of the corrosion detection subtask and a subtask status label (used to indicate a work status of the current corrosion detection subtask). Furthermore, the corrosion detection subtask is further provided with a control area for removing or adding the corrosion detection subtask; executing step S560;
step S560, the user selects the inspection route corresponding to the corrosion detection subtask, and the user operation planning module 110 determines the scanning length corresponding to the current corrosion detection subtask and sends it to the underwater robot; step S561 is executed;
step S561, executing the current corrosion detection subtask; when the underwater robot reaches the area to be detected, the corrosion detection sensor in the underwater robot is automatically started, and the sensor performs detection close to the surface of the object to be detected along the inspection route; step S570 is executed;
step S570, the operation monitoring module 120 monitors the operation process of the underwater robot, including drawing an actual operation trajectory and receiving real-time detection data, such as a task completion area, corrosion detection data, and the like; if the current corrosion detection subtask is completed, go to step S580; if the observation result of the current corrosion detection subtask does not meet the requirement or the underwater robot fails, executing step S590;
step S580, the corrosion detection subtask is completed; step S581 is executed;
step S581, the data management module 130 determines whether there is still an unexecuted corrosion detection subtask, and if yes, returns to step S560; if not, ending the whole corrosion detection task;
step S590, the user may click the task suspension button, and the data management module 130 suspends the current corrosion detection subtask and stops drawing the actual running track; when the problem is removed, the user may click a task continuation button, and step S591 is performed;
step S591, the data management module 130 continues to execute the current corrosion detection subtask, and step S592 is executed;
step S592, determining whether calibration of the underwater robot or the related parameters is required; if the user selects the requirement, the underwater robot or the related parameters are calibrated and then the step S561 is returned; if the user selection is not required, return is made to step S561.
According to an embodiment of the present disclosure, the task instruction may further include an automatic detection instruction, and the method further includes:
responding to the automatic detection instruction, controlling the underwater robot to execute all detection tasks of all the simulated operation surfaces according to the following sequence:
executing the floating observation task on all the simulation operation surfaces;
and sequentially executing the apparent detection task and the corrosion detection task on each simulated working face until all the working faces are detected.
The automatic detection instruction enables fully automatic detection of the underwater structure and further improves detection efficiency. The large-area floating observation task is first executed on all operation surfaces of the underwater structure, and then the key-point apparent detection task and the small-area corrosion detection task are carried out in sequence on each simulated operation surface. The detection work is thus progressively refined from the large area down to the small areas and key points, which saves the time of repeatedly moving to the same position for detection, further improves detection efficiency, and at the same time achieves comprehensive detection of the underwater structure.
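The ordering imposed by the automatic detection instruction can be expressed as a simple plan generator; the function name and task labels are assumptions for illustration.

```python
def auto_detection_order(surfaces):
    """Automatic detection sequence of the disclosure: first the large-area
    floating observation task on every simulated work surface, then on each
    surface in turn the key-point apparent detection task followed by the
    small-area corrosion detection task."""
    plan = [("floating_observation", s) for s in surfaces]
    for s in surfaces:
        plan.append(("apparent_detection", s))
        plan.append(("corrosion_detection", s))
    return plan
```

Because both per-surface tasks are issued while the robot is already at a given surface, the robot never has to revisit a surface it has finished.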
It should be understood that the execution process of each detection task is the same as the previous embodiment of the present disclosure, and is not described herein again.
According to the above embodiments, during a detection task, if the underwater robot fails, the observation target is lost midway, or the detection result is unsatisfactory, the user can pause or restart the task through the data management module 130. After the task is restarted, the corresponding subtask planning content remains unchanged, and the actual running track, detection data, detection area, and the like of the underwater robot are recorded anew. In addition, calibration can be performed before the task is restarted, so as to further avoid errors and improve detection accuracy.
According to an embodiment of the present disclosure, the method may further include:
in the detection task, in response to a marking instruction of a user, updating a detection mark at a specified position in the simulated working plane; wherein the designated position comprises a position corresponding to the current position of the underwater robot in the simulated working surface.
In some embodiments, the user may customize the mark types of the detection marks, and the corresponding attributes of each mark type (including but not limited to color, name, number or ID, etc.), either in advance or during the detection task. For example, in the corrosion detection task, corresponding mark types may be provided for damage on the structure surface, attachment of marine organisms, detected corrosion points, and so on, with each mark type using a mark of a different color. Furthermore, the user can add, remove, or change the mark types as needed through the corresponding interface in the data management module.
Optionally, the updating the detection mark at the designated position in the simulated working surface in response to the mark instruction of the user includes at least one of:
adding the detection mark at the specified position in response to a mark adding instruction of a user;
or, in response to the user selecting the detection marker and a marker alteration instruction, altering the user selected detection marker;
or, in response to the user selecting the detection marker and a marker deletion instruction, deleting the detection marker selected by the user;
or, responding to the mark inquiry instruction of the user, inquiring the detection mark and browsing the inquiry result.
In some embodiments, the tagged query instruction may include at least one of: work time, work surface, keyword for detection mark, and mark type.
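The mark-type customization, mark add/alter/delete operations, and the query fields listed above can be sketched together as a small store; the class name, method names, and field names are illustrative assumptions only.

```python
class DetectionMarkStore:
    """Minimal sketch of detection-mark management: user-defined mark types
    with attributes (e.g. color), and marks that can be added and then
    queried by work surface, mark type, or keyword."""

    def __init__(self):
        self.mark_types = {}   # type name -> attributes, e.g. {"color": "red"}
        self.marks = []

    def define_type(self, type_name, **attrs):
        # user-customized mark type with attributes such as color or name
        self.mark_types[type_name] = attrs

    def add(self, surface, position, mark_type, note=""):
        if mark_type not in self.mark_types:
            raise KeyError(f"unknown mark type: {mark_type}")
        self.marks.append({"surface": surface, "position": position,
                           "type": mark_type, "note": note})

    def query(self, surface=None, mark_type=None, keyword=None):
        # each filter corresponds to one of the query fields in the text
        hits = self.marks
        if surface is not None:
            hits = [m for m in hits if m["surface"] == surface]
        if mark_type is not None:
            hits = [m for m in hits if m["type"] == mark_type]
        if keyword is not None:
            hits = [m for m in hits if keyword in m["note"]]
        return hits
```

A work-time filter would follow the same pattern with a timestamp field on each mark.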
In some embodiments, the method may further comprise:
and responding to the history detection task and the track playback instruction selected by the user, and displaying the actual running track corresponding to the history detection task.
In some embodiments, the method may further comprise:
responding to a historical track operation instruction selected by a user, and performing at least one of the following operations on an actual running track corresponding to the historical detection task: clear, show, or hide.
Specifically, based on the instructions of the user, the data management module 130 may support custom addition, editing, and removal of mark types; addition, editing, and removal of detection marks; and display, clearing, and hiding of historical tracks.
According to the embodiment of the present disclosure, in step S240, the detection result of the detection task is output.
The detection result may include at least part of the job data, and may be selectively output from the job data recorded by the data management module 130 as needed. Further, it may include the detection task, the detection subtasks within the task, the actual running track of the task or subtask, whether the task or subtask completed normally, abnormal job data within the task or subtask, and the like.
It should be noted that the method of the embodiments of the present disclosure may be executed by a single device, such as a computer or a server. The method of the embodiment can also be applied to a distributed scene and completed by the mutual cooperation of a plurality of devices. In such a distributed scenario, one of the devices may only perform one or more steps of the method of the embodiments of the present disclosure, and the devices may interact with each other to complete the method.
It should be noted that the above describes some embodiments of the disclosure. Other embodiments are within the scope of the following claims. In some cases, the actions or steps recited in the claims may be performed in a different order than in the embodiments described above and still achieve desirable results. In addition, the processes depicted in the accompanying figures do not necessarily require the particular order shown, or sequential order, to achieve desirable results. In some embodiments, multitasking and parallel processing may also be possible or may be advantageous.
Based on the same inventive concept, corresponding to the method of any embodiment, the present disclosure further provides an electronic device, which includes a memory, a processor, and a computer program stored in the memory and running on the processor, and when the processor executes the program, the underwater structure detection method based on the underwater robot according to any embodiment is implemented.
Fig. 6 shows a more specific hardware structure diagram of an electronic device according to an embodiment of the present disclosure, where the device may include: a processor 1010, a memory 1020, an input/output interface 1030, a communication interface 1040, and a bus 1050. Wherein the processor 1010, memory 1020, input/output interface 1030, and communication interface 1040 are communicatively coupled to each other within the device via bus 1050.
The processor 1010 may be implemented by a general-purpose CPU (Central Processing Unit), a microprocessor, an Application Specific Integrated Circuit (ASIC), or one or more integrated circuits, and is configured to execute related programs to implement the technical solutions provided in the embodiments of the present disclosure.
The Memory 1020 may be implemented in the form of a ROM (Read Only Memory), a RAM (Random Access Memory), a static storage device, a dynamic storage device, or the like. The memory 1020 may store an operating system and other application programs, and when the technical solution provided by the embodiments of the present specification is implemented by software or firmware, the relevant program codes are stored in the memory 1020 and called to be executed by the processor 1010.
The input/output interface 1030 is used for connecting an input/output module to input and output information. The i/o module may be configured as a component in a device (not shown) or may be external to the device to provide a corresponding function. The input devices may include a keyboard, a mouse, a touch screen, a microphone, various sensors, etc., and the output devices may include a display, a speaker, a vibrator, an indicator light, etc.
The communication interface 1040 is used for connecting a communication module (not shown in the drawings) to implement communication interaction between the present apparatus and other apparatuses. The communication module can realize communication in a wired mode (such as USB, network cable and the like) and also can realize communication in a wireless mode (such as mobile network, WIFI, Bluetooth and the like).
Bus 1050 includes a path that transfers information between various components of the device, such as processor 1010, memory 1020, input/output interface 1030, and communication interface 1040.
It should be noted that although the above-mentioned device only shows the processor 1010, the memory 1020, the input/output interface 1030, the communication interface 1040 and the bus 1050, in a specific implementation, the device may also include other components necessary for normal operation. In addition, those skilled in the art will appreciate that the above-described apparatus may also include only those components necessary to implement the embodiments of the present description, and not necessarily all of the components shown in the figures.
The electronic device of the above embodiment is used for implementing the underwater structure detection method based on the underwater robot in any of the foregoing embodiments, and has the beneficial effects of the corresponding method embodiments, which are not described herein again.
Based on the same inventive concept, corresponding to any of the above-mentioned embodiment methods, the present disclosure also provides a non-transitory computer-readable storage medium storing computer instructions for causing the computer to execute the underwater robot-based underwater structure detection method according to any of the above-mentioned embodiments.
Computer-readable media of the present embodiments, including both permanent and non-permanent, removable and non-removable media, may implement information storage by any method or technology. The information may be computer readable instructions, data structures, modules of a program, or other data. Examples of computer storage media include, but are not limited to, phase change memory (PRAM), Static Random Access Memory (SRAM), Dynamic Random Access Memory (DRAM), other types of Random Access Memory (RAM), Read Only Memory (ROM), Electrically Erasable Programmable Read Only Memory (EEPROM), flash memory or other memory technology, compact disc read only memory (CD-ROM), Digital Versatile Discs (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other non-transmission medium that can be used to store information that can be accessed by a computing device.
The computer instructions stored in the storage medium of the above embodiment are used to enable the computer to execute the underwater structure detection method based on the underwater robot according to any of the above embodiments, and have the beneficial effects of the corresponding method embodiments, which are not described herein again.
Those of ordinary skill in the art will understand that: the discussion of any embodiment above is meant to be exemplary only, and is not intended to imply that the scope of the disclosure, including the claims, is limited to these examples; within the idea of the present disclosure, technical features in the above embodiments or in different embodiments may also be combined, steps may be implemented in any order, and there are many other variations of the different aspects of the embodiments of the present disclosure as described above, which are not provided in detail for the sake of brevity.
In addition, well-known power/ground connections to Integrated Circuit (IC) chips and other components may or may not be shown in the provided figures for simplicity of illustration and discussion, and so as not to obscure the embodiments of the disclosure. Furthermore, devices may be shown in block diagram form in order to avoid obscuring embodiments of the present disclosure, and this also takes into account the fact that specifics with respect to implementation of such block diagram devices are highly dependent upon the platform within which the embodiments of the present disclosure are to be implemented (i.e., specifics should be well within purview of one skilled in the art). Where specific details (e.g., circuits) are set forth in order to describe example embodiments of the disclosure, it should be apparent to one skilled in the art that the embodiments of the disclosure can be practiced without, or with variation of, these specific details. Accordingly, the description is to be regarded as illustrative instead of restrictive.
While the present disclosure has been described in conjunction with specific embodiments thereof, many alternatives, modifications, and variations of these embodiments will be apparent to those of ordinary skill in the art in light of the foregoing description. For example, other memory architectures (e.g., dynamic RAM (DRAM)) may use the discussed embodiments.
The disclosed embodiments are intended to embrace all such alternatives, modifications and variances which fall within the broad scope of the appended claims. Therefore, any omissions, modifications, equivalents, improvements, and the like that may be made within the spirit and principles of the embodiments of the disclosure are intended to be included within the scope of the disclosure.

Claims (7)

1. An underwater structure detection method based on an underwater robot comprises the following steps:
acquiring structural parameters of the underwater structure, and generating a simulated operation surface of the underwater structure;
responding to the simulated operation surface and the task instruction selected by the user, planning and generating a corresponding detection task; wherein the detection task comprises at least one of the following: a floating observation task, an appearance detection task or a corrosion detection task;
controlling the underwater robot to execute corresponding detection operation based on the detection task;
outputting a detection result of the detection task;
wherein, for a floating observation task, the controlling the underwater robot to execute corresponding detection operations based on the detection task comprises:
generating at least one observation subtask and a floating observation route based on a preset observation radius;
sequentially executing the observation subtasks from a first initial position according to the floating observation route;
in response to the observation result of the current observation subtask not meeting the requirement, suspending the current observation subtask;
executing the next observation subtask;
for an apparent detection task, controlling the underwater robot to perform a corresponding detection operation based on the detection task comprises:
generating an apparent subtask of the apparent detection task based on a preset detection point;
responding to the user to select the apparent subtask, and controlling the underwater robot to move to the preset detection point corresponding to the apparent subtask;
in response to the underwater robot detecting a target appearance at the preset detection point, the underwater robot tracking the target appearance and drawing a shape of the target appearance;
for a corrosion detection task, controlling the underwater robot to perform corresponding detection operations based on the detection task comprises:
responding to a user to select a to-be-detected area on the simulated operation surface, and generating a corrosion detection subtask, and an inspection route and a scanning length corresponding to the corrosion detection subtask;
and executing the corrosion detection subtask according to the corresponding inspection route and the scanning length.
2. The method of claim 1, wherein for a floating observation mission, said controlling the underwater robot to perform a respective detection operation based on the detection mission further comprises:
responding to a task resuming instruction of a user, and continuing to execute the suspended observation subtask;
judging whether the suspended observation subtask needs to be calibrated or not;
and responding to a calibration instruction of a user, and re-executing the suspended observation subtask after calibrating the underwater robot.
3. The method of claim 1, further comprising:
in the detection task, in response to a marking instruction of a user, updating or inquiring a detection mark at a specified position in the simulated working face; wherein the designated position comprises a position corresponding to the current position of the underwater robot in the simulated working surface.
4. The method of claim 3, wherein updating or querying the detection mark at the specified position in the simulated work surface in response to the marking instruction of the user comprises at least one of:
adding the detection mark at the specified position in response to a mark-adding instruction from the user;
or, in response to the user selecting the detection mark and issuing a mark-modification instruction, modifying the selected detection mark;
or, in response to the user selecting the detection mark and issuing a mark-deletion instruction, deleting the selected detection mark;
or, in response to a mark-query instruction from the user, querying the detection marks and browsing the query results.
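The four marking operations of claim 4 (add, modify, delete, query) map naturally onto a position-keyed store. This `MarkerStore` class is a minimal sketch of one possible data model, not the patent's implementation:

```python
class MarkerStore:
    """Detection marks keyed by their position on the simulated work surface."""

    def __init__(self):
        self._marks = {}

    def add(self, position, note):
        """Mark-adding instruction: place a detection mark at the position."""
        self._marks[position] = note

    def modify(self, position, note):
        """Mark-modification instruction: change an existing mark's content."""
        if position in self._marks:
            self._marks[position] = note

    def delete(self, position):
        """Mark-deletion instruction: remove the selected mark, if present."""
        self._marks.pop(position, None)

    def query(self, position):
        """Mark-query instruction: return the mark at the position, or None."""
        return self._marks.get(position)
```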
5. An underwater robot-based underwater structure detection system, comprising:
an operation planning module configured to generate a simulated work surface of the underwater structure based on structural parameters of the underwater structure, and, in response to the user selecting the simulated work surface and a task instruction, to plan and generate a corresponding detection task; wherein the detection task comprises at least one of: a floating observation task, an appearance detection task, or a corrosion detection task;
an operation monitoring module configured to control the underwater robot to perform the corresponding detection operation based on the detection task, monitor the robot's operation, and record operation data of the detection task;
and a data management module configured to output the detection result of the detection task and manage the operation data;
wherein, for a floating observation task, controlling the underwater robot to perform the corresponding detection operation based on the detection task comprises:
generating at least one observation subtask and a floating observation route based on a preset observation radius;
executing the observation subtasks in sequence from a first initial position along the floating observation route;
in response to the observation result of the current observation subtask failing to meet the requirement, suspending the current observation subtask;
and executing the next observation subtask;
for an appearance detection task, controlling the underwater robot to perform the corresponding detection operation based on the detection task comprises:
generating appearance subtasks of the appearance detection task based on preset detection points;
in response to the user selecting the appearance subtask, controlling the underwater robot to move to the preset detection point corresponding to the appearance subtask;
in response to the underwater robot detecting a target appearance feature at the preset detection point, the underwater robot tracking the target appearance feature and drawing its shape;
for a corrosion detection task, controlling the underwater robot to perform the corresponding detection operation based on the detection task comprises:
in response to the user selecting a region to be detected on the simulated work surface, generating a corrosion detection subtask together with an inspection route and a scan length corresponding to the corrosion detection subtask;
and executing the corrosion detection subtask according to the corresponding inspection route and scan length.
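The floating-observation flow in claim 5 (waypoints from a preset observation radius, sequential execution, suspend-and-continue on an unsatisfactory result) can be sketched as follows. The circular waypoint layout and the `observe` quality-check callback are illustrative assumptions, not specified by the patent:

```python
import math

def plan_floating_route(center, radius, n_points=8):
    """Place observation-subtask waypoints evenly on a circle of the
    preset observation radius around the structure (hypothetical planner)."""
    cx, cy = center
    return [(cx + radius * math.cos(2 * math.pi * i / n_points),
             cy + radius * math.sin(2 * math.pi * i / n_points))
            for i in range(n_points)]

def run_observation_tasks(waypoints, observe):
    """Execute subtasks in route order. A subtask whose observation result
    fails the quality check is suspended (recorded for later resumption)
    and the next subtask starts immediately."""
    suspended = []
    for i, wp in enumerate(waypoints):
        if not observe(wp):       # observation result unsatisfactory
            suspended.append(i)   # suspend; the operator may resume it later
    return suspended
```

The list returned by `run_observation_tasks` is what the resume flow of claim 2 would work through once the operator issues task-resume (and, if needed, calibration) instructions.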
6. An electronic device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, the processor implementing the method of any one of claims 1 to 4 when executing the program.
7. A non-transitory computer readable storage medium storing computer instructions for causing a computer to perform the method of any one of claims 1 to 4.
CN202110645798.5A 2021-06-10 2021-06-10 Underwater structure detection method, system, equipment and medium based on underwater robot Active CN113093773B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110645798.5A CN113093773B (en) 2021-06-10 2021-06-10 Underwater structure detection method, system, equipment and medium based on underwater robot

Publications (2)

Publication Number Publication Date
CN113093773A CN113093773A (en) 2021-07-09
CN113093773B CN113093773B (en) 2021-09-03

Family

ID=76662683

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110645798.5A Active CN113093773B (en) 2021-06-10 2021-06-10 Underwater structure detection method, system, equipment and medium based on underwater robot

Country Status (1)

Country Link
CN (1) CN113093773B (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108873687A (en) * 2018-07-11 2018-11-23 哈尔滨工程大学 An intelligent underwater robot behavior architecture planning method based on deep Q-learning
CN109409493A (en) * 2018-09-12 2019-03-01 中国人民解放军国防科技大学 Method for establishing accompanying observation behaviors of autonomous robots
CN112234496A (en) * 2020-10-27 2021-01-15 国网山西省电力公司大同供电公司 Cable pipe gallery underwater intelligent robot based on energy internet

Family Cites Families (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20070276552A1 (en) * 2006-02-24 2007-11-29 Donald Rodocker Underwater crawler vehicle having search and identification capabilities and methods of use
GB2446670C (en) * 2007-07-11 2013-03-13 Flexlife Ltd Inspection method
CN101695956B (en) * 2009-10-27 2011-11-09 中国海洋石油总公司 Underwater robot platform of semisubmersible drilling platform
US8442709B2 (en) * 2009-12-11 2013-05-14 Lockheed Martin Corporation Underwater investigation system providing unmanned underwater vehicle (UUV) guidance based upon updated position state estimates and related methods
GB2520010B (en) * 2013-11-05 2016-06-01 Subsea 7 Ltd Tools and Sensors Deployed by Unmanned Underwater Vehicles
CN205941388U (en) * 2016-07-18 2017-02-08 孟丛丛 Underwater concrete structure thing detecting system
CN108045530A (en) * 2017-12-04 2018-05-18 国网山东省电力公司电力科学研究院 A submarine cable inspection underwater robot and its operating method
CN109616956A (en) * 2018-11-01 2019-04-12 国网浙江省电力有限公司 A tethered remotely operated underwater vehicle inspection system for submarine cables and its operating method
CN110133666B (en) * 2019-05-17 2021-05-18 中国科学院声学研究所 System and method for monitoring scouring state of offshore wind power pile
CN110441388A (en) * 2019-07-25 2019-11-12 天津大学青岛海洋技术研究院 A non-destructive inspection method for the underwater portion of jacket platforms
CN110333739B (en) * 2019-08-21 2020-07-31 哈尔滨工程大学 AUV (autonomous Underwater vehicle) behavior planning and action control method based on reinforcement learning
RU2724156C1 (en) * 2019-12-18 2020-06-22 Общество с ограниченной ответственностью Научно-производственное предприятие "Форт XXI" (ООО НПП "Форт XXI") Device for external flaw detection of underwater vertical hydraulic structures
CN112034042A (en) * 2020-08-25 2020-12-04 武汉理工大学 Mooring chain working state detection device and detection method
CN112215395B (en) * 2020-09-02 2023-04-18 中国船舶重工集团公司第七研究院 Underwater equipment adaptability information guarantee system based on ocean big data

Similar Documents

Publication Publication Date Title
US11398080B2 (en) Methods for augmented reality applications
Vilutiene et al. Building information modeling (BIM) for structural engineering: A bibliometric analysis of the literature
Meinig et al. Public–private partnerships to advance regional ocean-observing capabilities: a saildrone and NOAA-PMEL case study and future considerations to expand to global scale observing
US20210133149A1 (en) Intelligent and automated review of industrial asset integrity data
CN109341702A (en) Route planning method, device, equipment and storage medium in operating area
US11099716B2 (en) Context based content navigation for wearable display
Fonseca et al. A standards-based digital twin of an experiment with a scale model ship
CN114299390A (en) Method and device for determining maintenance component demonstration video and safety helmet
CN108827252B (en) Method, device, equipment and system for drawing underwater live-action map and storage medium
Chavez et al. Underwater navigation using visual markers in the context of intervention missions
Hareide et al. Developing a high-speed craft route monitor window
Mattioli et al. Unlocking Resident Underwater Inspection drones or AUV for subsea autonomous inspection: value creation between technical requirements and technological development
CN113093773B (en) Underwater structure detection method, system, equipment and medium based on underwater robot
JP6243308B2 (en) Identification device and identification method
US9817120B2 (en) Fish tracker
US20150369914A1 (en) Generating a Map Using Radar Data
WO2015179695A1 (en) Point cloud systems and methods
Primeau Innovations in inspection processes for marine pipelines
Nevoso et al. Riser Inspection via Subsea Drone: Addressing The Technological Challenges To Unlock Sustainable IMR Campaigns
CN115861557A (en) Urban rail transit monitoring method, system and medium based on BIM + three-dimensional scanning assistance
CN202600139U (en) ECDIS (electronic chart display and information system) radar signal receiving and processing device
CN114757253A (en) Three-dimensional point cloud tagging using distance field data
AU2019283811B2 (en) Event Triggering and Automatic Waypoint Generation
AU2018267650A1 (en) Plant guidance and equipment identification
Copros et al. Alister–Rapid Environment Assessment AUV (Autonomous Underwater Vehicle)

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant