CN115303451A - Underwater equipment and underwater operation system - Google Patents
- Publication number
- CN115303451A CN115303451A CN202210870497.7A CN202210870497A CN115303451A CN 115303451 A CN115303451 A CN 115303451A CN 202210870497 A CN202210870497 A CN 202210870497A CN 115303451 A CN115303451 A CN 115303451A
- Authority
- CN
- China
- Prior art keywords
- underwater
- information
- sub
- equipment
- target object
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B63—SHIPS OR OTHER WATERBORNE VESSELS; RELATED EQUIPMENT
- B63C—LAUNCHING, HAULING-OUT, OR DRY-DOCKING OF VESSELS; LIFE-SAVING IN WATER; EQUIPMENT FOR DWELLING OR WORKING UNDER WATER; MEANS FOR SALVAGING OR SEARCHING FOR UNDERWATER OBJECTS
- B63C11/00—Equipment for dwelling or working underwater; Means for searching for underwater objects
- B63C11/52—Tools specially adapted for working underwater, not otherwise provided for
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B63—SHIPS OR OTHER WATERBORNE VESSELS; RELATED EQUIPMENT
- B63G—OFFENSIVE OR DEFENSIVE ARRANGEMENTS ON VESSELS; MINE-LAYING; MINE-SWEEPING; SUBMARINES; AIRCRAFT CARRIERS
- B63G8/00—Underwater vessels, e.g. submarines; Equipment specially adapted therefor
- B63G8/001—Underwater vessels adapted for special purposes, e.g. unmanned underwater vessels; Equipment specially adapted therefor, e.g. docking stations
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T17/00—Three dimensional [3D] modelling, e.g. data description of 3D objects
- G06T17/05—Geographic models
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/40—Extraction of image or video features
- G06V10/42—Global feature extraction by analysis of the whole pattern, e.g. using frequency domain transformations or autocorrelation
- G06V10/422—Global feature extraction by analysis of the whole pattern, e.g. using frequency domain transformations or autocorrelation for representing the structure of the pattern or shape of an object therefor
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/05—Underwater scenes
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/40—Scenes; Scene-specific elements in video content
- G06V20/41—Higher-level, semantic clustering, classification or understanding of video scenes, e.g. detection, labelling or Markovian modelling of sport events or news items
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V2201/00—Indexing scheme relating to image or video recognition or understanding
- G06V2201/07—Target detection
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Multimedia (AREA)
- Software Systems (AREA)
- Mechanical Engineering (AREA)
- Geometry (AREA)
- Computer Graphics (AREA)
- Remote Sensing (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Computational Linguistics (AREA)
- Aviation & Aerospace Engineering (AREA)
- Ocean & Marine Engineering (AREA)
- Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
Abstract
The invention discloses underwater equipment and an underwater operation system, belonging to the technical field of underwater operation. The underwater equipment comprises an operation control module and a mapping module. The operation control module comprises a position locking unit and an execution unit: the position locking unit is used for acquiring an optical image and first sonar image data and determining the position information of an underwater target object according to the optical image and the first sonar image data, and the execution unit is used for controlling the underwater equipment to move according to the position information and executing an operation action on the target object. The mapping module is used for acquiring path planning information of the underwater equipment, controlling the underwater equipment to move on the water surface according to the path planning information, acquiring second sonar image data and motion state information of the underwater equipment, and constructing an underwater topographic map according to the second sonar image data and the motion state information. The invention improves the operation capability of the underwater equipment and enhances its versatility.
Description
Technical Field
The invention relates to the technical field of underwater operation, in particular to underwater equipment and an underwater operation system.
Background
With advances in underwater equipment technology, underwater equipment can be used for surveying, routine inspection, and similar tasks.
At present, functions such as map construction and motion control related to the operation of underwater equipment are generally integrated in an above-water terminal. However, this approach limits the underwater operation capability of the equipment, so that the functions of the underwater equipment remain limited.
The above is only for the purpose of assisting understanding of the technical aspects of the present invention, and does not represent an admission that the above is prior art.
Disclosure of Invention
The invention mainly aims to provide underwater equipment and an underwater operation system, and aims to improve the operation capability of the underwater equipment and enhance the versatility of the underwater equipment.
To achieve the above object, the present invention provides an underwater apparatus comprising:
the operation control module comprises a position locking unit and an execution unit, wherein the position locking unit is used for acquiring an optical image and first sonar image data and determining the position information of an underwater target object according to the optical image and the first sonar image data, and the execution unit is used for controlling the underwater equipment to move and executing operation action on the target object according to the position information;
and the mapping module is used for acquiring the path planning information of the underwater equipment, controlling the underwater equipment to move on the water surface according to the path planning information, acquiring second sonar image data and motion state information of the underwater equipment, and constructing an underwater topographic map according to the second sonar image data and the motion state information.
Optionally, the operation action comprises tracking, surrounding, approaching or grabbing the target object;
the execution unit is also used for moving and tracking the target object according to the position information; or
The execution unit is further used for moving around the target object according to the position information; or
The execution unit is further used for moving towards the target object according to the position information; or
The execution unit is also used for moving and grabbing the target object according to the position information.
Optionally, the execution unit is further configured to:
controlling the underwater equipment to move to the water surface according to the position information and collecting a target video, wherein the target video comprises an underwater marker and video data corresponding to the target object;
and sending the target video to the above-water equipment so that the above-water equipment measures the size parameter of the target object according to the underwater marker in the target video.
Optionally, the step of obtaining the path planning information of the underwater device includes:
and controlling the underwater equipment to move to the water surface and receiving path planning information sent by the above-water equipment.
Optionally, the motion state information includes positioning information, heading information, and attitude data of the underwater device, and the step of constructing an underwater topography map according to the second sonar image data and the motion state information includes:
and constructing the underwater topographic map according to the positioning information, the course information, the attitude data and the second sonar image data.
Optionally, the path planning information includes a plurality of paths, the positioning information includes a plurality of sub-positioning information, the heading information includes a plurality of sub-heading information, the attitude data includes a plurality of sub-attitude data, the second sonar image data includes a plurality of sub-sonar image data, and the step of constructing the underwater topographic map according to the positioning information, the heading information, the attitude data and the second sonar image data includes:
constructing a corresponding sub-map according to the sub-positioning information, the sub-course information, the sub-attitude data and the sub-sonar image data corresponding to each path to obtain a plurality of sub-maps;
and constructing the underwater topography map according to the plurality of sub-maps.
Optionally, the step of constructing a corresponding sub-map according to the sub-positioning information, the sub-heading information, the sub-attitude data, and the sub-sonar image data corresponding to each of the paths includes:
and constructing a corresponding two-dimensional map according to the sub-positioning information, the sub-course information, the sub-attitude data and the sub-sonar image data corresponding to each path, wherein the sub-map comprises the two-dimensional map.
Optionally, the operation control module is further configured to:
inputting the optical image into a preset image enhancement model to obtain an enhanced optical image, wherein the clarity of the enhanced optical image is greater than that of the optical image;
determining a first image feature of the optically enhanced image and determining a second image feature of the first sonar image data;
determining an image area where the target object is located according to a preset weight coefficient, the first image feature and the second image feature;
and determining the position information according to the image area.
In addition, in order to achieve the above object, the present invention also provides an underwater operation system comprising the underwater apparatus as described above and an above-water apparatus for measuring a dimensional parameter of the target object.
Optionally, the above-water equipment comprises:
the receiving module is used for receiving a target video sent by the underwater equipment;
and the processing module is used for detecting the underwater marker in the target video according to a visual recognition algorithm, obtaining a detection result and measuring the size parameter of the target object according to the detection result.
The application provides an underwater device, which comprises an operation control module and a mapping module, wherein the operation control module comprises a position locking unit and an execution unit, the position locking unit is used for acquiring an optical image and first sonar image data and determining the position information of an underwater target object according to the optical image and the first sonar image data, and the execution unit is used for controlling the underwater device to move according to the position information and executing an operation action on the target object; the mapping module is used for acquiring path planning information of the underwater device, controlling the underwater device to move on the water surface according to the path planning information, acquiring second sonar image data and motion state information of the underwater device, and constructing an underwater topographic map according to the second sonar image data and the motion state information. The underwater device can realize position locking and moving operation through the operation control module, and can also construct an underwater topographic map through the mapping module, so that the operation capability of the underwater device is improved and its versatility is enhanced.
Drawings
FIG. 1 is a schematic structural diagram of an underwater device in a hardware operating environment according to an embodiment of the present invention;
FIG. 2 is a schematic illustration of the operation of an embodiment of the underwater device of the present invention;
FIG. 3 is a schematic diagram of an embodiment of the underwater device of the present invention constructing an underwater map;
FIG. 4 is a functional block diagram of a first embodiment of the underwater device of the present invention;
FIG. 5 is a schematic flow chart of the operation of a first embodiment of the underwater device of the present invention;
FIG. 6 is a schematic flow chart of the operation of a second embodiment of the underwater device of the present invention.
The implementation, functional features and advantages of the present invention will be further described with reference to the accompanying drawings.
Detailed Description
It should be understood that the specific embodiments described herein are merely illustrative of the invention and do not limit the invention.
Referring to fig. 1, fig. 1 is a schematic structural diagram of an underwater device in a hardware operating environment according to an embodiment of the present invention.
As shown in fig. 1, the underwater device 100 may include a processor 1001, such as a central processing unit (CPU), a communication bus 1002, and a memory 1003. The communication bus 1002 is used to implement connection and communication among these components. The memory 1003 may be a random access memory (RAM) or a non-volatile memory (NVM), such as a disk memory. The memory 1003 may alternatively be a storage device separate from the processor 1001.
It will be appreciated by those skilled in the art that the configuration shown in figure 1 does not constitute a limitation of underwater equipment and may include more or fewer components than shown, or some components may be combined, or a different arrangement of components.
As shown in fig. 1, the memory 1003, which is a storage medium, may include therein an operating system, a data storage module, a network communication module, a user interface module, and an underwater device program.
The processor 1001 and the memory 1003 in the underwater device of the present invention may be provided in the underwater device, and the underwater device executes the steps related to the operation of the underwater device mentioned in the following embodiments by calling the underwater device program stored in the memory 1003 through the processor 1001.
An embodiment of the present invention provides an underwater device, and referring to fig. 4, fig. 4 is a schematic functional module diagram of a first embodiment of the present invention.
In this embodiment, the underwater equipment includes an operation control module 101 and a mapping module 102. The operation control module includes a position locking unit 1011 and an execution unit 1012.
The underwater equipment may be an ROV (Remotely Operated Vehicle), which is a type of UUV (Unmanned Underwater Vehicle).
It should be noted that the underwater device can dive and scan at depths of up to three hundred meters underwater.
Referring to fig. 5, the underwater equipment may perform the following process when operating:
step S10, the position locking unit is used for acquiring an optical image and first sonar image data, and determining the position information of an underwater target object according to the optical image and the first sonar image data, and the execution unit is used for controlling the underwater equipment to move according to the position information and executing operation actions on the target object;
as shown in fig. 2, the underwater device b is provided with an optical camera d and a sonar sensor e, wherein the optical camera d can capture an optical image of an underwater target object c, the sonar sensor e can capture a sonar image of the underwater target object c, the above-water device a can be a mobile phone, a tablet computer, a PC (personal computer), or the like, the above-water device a is located above a horizontal plane, the above-water device a is in communication connection with the underwater device b through a twisted pair, the above-water device a can capture the optical image and the sonar image captured by the underwater device b, and the optical image and the sonar image are transmitted to the above-water device a above the horizontal plane through the twisted pair in a form of coded video stream, wherein the optical image and the sonar image correspond one to one, and the one-to-one optical image and the one-sonar image are captured at the same time and/or at the same capturing angle.
The position locking unit 1011 is the system processing unit through which the underwater equipment locks its position and attitude onto the position of the target object.
The execution unit 1012 is the system processing unit that performs operation control of the underwater device according to received instructions.
When the underwater equipment is placed on the water surface or is operating underwater, the position locking unit 1011 of the operation control module 101 acquires optical images and sonar image data through the optical camera and the sonar sensor, and the position information of the underwater target object can be determined from target-related information in the optical images and sonar image data, such as light intensity and sonar echo intensity. The execution unit 1012 then controls the underwater equipment to move to the position of the target object according to this position information and performs operation actions such as tracking, surrounding and grabbing on the target object.
Step S20, the mapping module is used for acquiring path planning information of the underwater equipment, controlling the underwater equipment to move on the water surface according to the path planning information, acquiring second sonar image data and motion state information of the underwater equipment, and constructing an underwater topographic map according to the second sonar image data and the motion state information.
As shown in fig. 3, an ROV (Remotely Operated Vehicle) f is used for underwater observation, inspection, and construction. An Inertial Measurement Unit (IMU) g inside the ROV carrier performs internal inertial measurement and mainly comprises a gyroscope, an accelerometer, a magnetic sensor, a depth gauge, and other sensors, so that the ROV can acquire high-precision attitude data such as pitch, roll angle, and water depth; the attitude data are used to correct and compensate the errors caused by carrier swing and tilt while the sonar transducer h receives the reflected echoes. Because there is no satellite signal underwater, the ROV needs to float to the water surface during underwater scanning and map building to obtain satellite positioning information and course information; therefore, GPS antenna i and GPS antenna j are installed on the top of the ROV with a spacing of 0.6 m, so that the course angle can be solved accurately by a dual-antenna satellite positioning algorithm. A sonar transmitting and receiving transducer array is mounted vertically downward at the bottom of the ROV; the sonar transducers are arranged in a fan (sector) pattern and can emit sonar signals downward with an opening angle of 60-150 degrees. The receiving transducer array receives the underwater reflected echoes, which are time-delayed or phase-shifted and then summed to obtain the underwater imaging information of the reflected waves.
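The delay-and-sum step described above can be pictured with a minimal sketch. The following Python/NumPy code assumes a uniform linear receive array, integer-sample delays, and hypothetical parameter values (sampling rate, element spacing, steering angles); the patent does not disclose the actual array geometry or processing chain, so this is only an illustrative approximation of classical delay-and-sum beamforming, not the claimed implementation.

```python
import numpy as np

def delay_and_sum(echoes, fs, spacing, sound_speed, steer_angles):
    """Steer a uniform linear array toward each angle and sum the echoes.

    echoes:       (n_elements, n_samples) matrix of received echoes
    fs:           sampling rate in Hz
    spacing:      element spacing in metres
    sound_speed:  speed of sound in water (about 1500 m/s)
    steer_angles: iterable of steering angles in radians (0 = straight down)
    Returns an (n_angles, n_samples) array of beamformed scan lines.
    """
    n_elem, n_samp = echoes.shape
    out = np.zeros((len(steer_angles), n_samp))
    for a, theta in enumerate(steer_angles):
        beam = np.zeros(n_samp)
        for m in range(n_elem):
            # Propagation delay of element m relative to element 0 for a
            # plane wave arriving from direction theta.
            tau = m * spacing * np.sin(theta) / sound_speed
            shift = int(round(tau * fs))
            # Advance each channel to align the wavefronts before summing.
            # (np.roll wraps around the ends; acceptable for a sketch.)
            beam += np.roll(echoes[m], -shift)
        out[a] = beam / n_elem
    return out

# Illustrative use with synthetic data and assumed parameters.
fs = 100_000                          # 100 kHz sampling rate
echoes = np.random.randn(16, 2048)    # 16-element array, 2048 samples
angles = np.deg2rad(np.linspace(-45, 45, 91))
scan_lines = delay_and_sum(echoes, fs, spacing=0.0075,
                           sound_speed=1500.0, steer_angles=angles)
```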
The path planning information refers to the navigation path of the underwater equipment on the water surface. The motion state information includes position, heading, and attitude information of the underwater device. The underwater topographic map drawn by surveying can be used for underwater reconnaissance and detection and the like.
While moving on the water surface along the planned path, the underwater equipment scans the underwater terrain layer by layer to obtain two-dimensional images, acquires the corresponding sonar image data and its own motion state information, repeatedly scans the underwater terrain according to the same path planning information to form the corresponding two-dimensional maps, and then processes these two-dimensional maps to construct a three-dimensional underwater topographic map.
In this embodiment, the present application provides an underwater device, which includes an operation control module and a mapping module, wherein the operation control module includes a position locking unit and an execution unit, the position locking unit is configured to acquire an optical image and first sonar image data and determine the position information of an underwater target object according to the optical image and the first sonar image data, and the execution unit is configured to control the underwater device to move according to the position information and execute an operation action on the target object; the mapping module is configured to acquire path planning information of the underwater device, control the underwater device to move on the water surface according to the path planning information, acquire second sonar image data and motion state information of the underwater device, and construct an underwater topographic map according to the second sonar image data and the motion state information. The underwater device can realize position locking and moving operation through the operation control module, and can also construct an underwater topographic map through the mapping module, so that the operation capability of the underwater device is improved and its versatility is enhanced.
Further, referring to fig. 6, a second embodiment of the present invention provides an underwater device. Based on the foregoing first embodiment, the execution unit is further configured to:
s21, controlling the underwater equipment to move to the water surface according to the position information and collecting a target video, wherein the target video comprises an underwater marker and video data corresponding to a target object;
the underwater equipment is provided with the optical camera which can shoot the underwater marker and the target object, the optical camera is used for collecting a target video, the target video is uploaded to the overwater equipment through a twisted pair after image enhancement, and measurement software deployed on the overwater equipment is used for measuring the target object, wherein the measurable data comprises the length, the width, the height, the area, the angle and the like of the target object.
The underwater marker may be composed of a fixed figure of the same size, for example, three red circles of 3cm diameter arranged in an equilateral triangle, the background of the figure being white and each side of the equilateral triangle being 5cm in length.
And S22, sending the target video to the above-water equipment so that the above-water equipment can measure the size parameter of the target object according to the underwater marker in the target video.
A recognition algorithm deployed on the above-water device can identify the underwater marker and, using an associated pixel-distance calibration algorithm, compute the real-world size of the target object from it.
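As a hedged illustration of such a pixel-distance calibration, the sketch below detects the three red marker circles, derives a centimetres-per-pixel scale from the known 5 cm triangle side, and converts a pixel length into centimetres. The colour thresholds, the OpenCV-based detection approach, and the assumption that the marker and the measured feature lie at a similar distance from the camera are illustrative choices, not details taken from the patent.

```python
import itertools

import cv2
import numpy as np

MARKER_SIDE_CM = 5.0  # known side length of the equilateral marker triangle

def find_marker_centres(frame_bgr):
    """Return pixel centres of the three red circles (largest red blobs)."""
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    # Red wraps around the hue axis, so combine two ranges (assumed values).
    mask = cv2.inRange(hsv, (0, 120, 70), (10, 255, 255)) | \
           cv2.inRange(hsv, (170, 120, 70), (180, 255, 255))
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    contours = sorted(contours, key=cv2.contourArea, reverse=True)[:3]
    centres = []
    for contour in contours:
        (x, y), _radius = cv2.minEnclosingCircle(contour)
        centres.append((x, y))
    return centres

def cm_per_pixel(centres):
    """Map the mean pairwise centre distance onto the known 5 cm side."""
    dists = [np.hypot(a[0] - b[0], a[1] - b[1])
             for a, b in itertools.combinations(centres, 2)]
    return MARKER_SIDE_CM / float(np.mean(dists))

def measure_cm(length_px, scale_cm_per_px):
    """Convert a pixel length in the marker's plane into centimetres."""
    return length_px * scale_cm_per_px

# Illustrative use on one decoded video frame (hypothetical file name).
# frame = cv2.imread("frame_0001.png")
# scale = cm_per_pixel(find_marker_centres(frame))
# print(measure_cm(420.0, scale), "cm")
```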
In this embodiment, the underwater device collects video data of the target object and returns it to the above-water device for processing so that the target object can be measured. This makes it feasible to measure the target object at the water surface; measuring with the aid of the underwater marker ensures the accuracy of the surface measurement, which improves the operation capability of the underwater device and enhances its operation effect.
In other embodiments, the underwater device can also measure the target object without using an underwater marker, and the underwater device can also measure the target object by laser.
Further, in this embodiment, the operation action includes tracking, surrounding, approaching or grabbing the target object;
the execution unit is also used for moving and tracking the target object according to the position information; or
The execution unit is further used for moving around the target object according to the position information; or
The execution unit is further used for moving towards the target object according to the position information; or
The execution unit is also used for moving and grabbing the target object according to the position information.
The underwater equipment can operate on a target object at the water surface or underwater. It is provided with an image fusion algorithm, that is, the acquired optical image and sonar image data are fused to identify the target object, and an operation control algorithm is provided to act on the identified target object.
It should be noted that the relevant image processing and operation control algorithms are built into the underwater equipment; that is, the underwater equipment does not need the above-water equipment to process the acquired images or to perform operation control, so the underwater equipment can cruise autonomously underwater and carry out operation control on the target object, including tracking, surrounding, approaching and grabbing.
In the embodiment, the underwater equipment can carry out operation control on the target object on the water surface or underwater, so that the underwater equipment can independently operate in the water, the intelligence of the underwater equipment is improved, the automation degree of the underwater equipment is enhanced, the environmental adaptability of the underwater equipment is enhanced, the operation capacity of the underwater equipment is improved, and the versatility of the underwater equipment is enhanced.
Further, the operation control module is further configured to:
inputting the optical image into a preset image enhancement model to obtain an enhanced optical image, wherein the clarity of the enhanced optical image is greater than that of the optical image;
determining a first image feature of the optically enhanced image and determining a second image feature of the first sonar image data;
determining an image area where the target object is located according to a preset weight coefficient, the first image feature and the second image feature;
and determining the position information according to the image area.
The image enhancement model may be an image defogging (dehazing) enhancement model: an image training set is obtained according to the training method of the image enhancement model, a preset neural network model is trained on this training set to obtain the image enhancement model, and the optical image is then input into the image enhancement model to obtain an enhanced optical image with higher clarity.
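A minimal sketch of such training is given below, assuming a small residual convolutional network, an L1 loss against paired clear reference images, and a generic PyTorch data loader; the patent does not specify the network architecture, loss, or training data, so all of these are illustrative assumptions rather than the claimed model.

```python
import torch
import torch.nn as nn

class EnhanceNet(nn.Module):
    """Tiny residual CNN: predicts a correction added to the raw image."""
    def __init__(self):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(3, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 3, 3, padding=1),
        )

    def forward(self, x):
        # Keep the enhanced image in the valid [0, 1] range.
        return torch.clamp(x + self.body(x), 0.0, 1.0)

def train(model, loader, epochs=10, lr=1e-3, device="cpu"):
    """loader yields (degraded, reference) image pairs scaled to [0, 1]."""
    model.to(device)
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    loss_fn = nn.L1Loss()
    for _ in range(epochs):
        for degraded, reference in loader:
            degraded, reference = degraded.to(device), reference.to(device)
            opt.zero_grad()
            loss = loss_fn(model(degraded), reference)
            loss.backward()
            opt.step()
    return model
```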
The target object can be located in the image by a target object framing algorithm, where the framing algorithm can be obtained by training a preset neural network model through deep learning.
In the image processing, the image features correspond one to one with the weight coefficients, and feature fusion is performed on the image features of the plurality of images based on the weight coefficients of the image features to obtain the fused features of the plurality of images. The images of the target object are then fused based on the weight coefficient of the first image feature and the weight coefficient of the second image feature to obtain the image area where the target object is located.
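The weighted fusion step can be sketched as follows, assuming that the optical and sonar branches each produce a normalised feature (score) map of the same size and that the preset weights and threshold take the illustrative values shown; none of these specifics are given in the patent.

```python
import numpy as np

def fuse_and_locate(optical_feat, sonar_feat, w_optical=0.6, w_sonar=0.4,
                    threshold=0.5):
    """optical_feat and sonar_feat are HxW maps normalised to [0, 1].

    Returns the bounding box and centre (in pixel coordinates) of the
    region where the weighted fusion exceeds the threshold, or None.
    """
    fused = w_optical * optical_feat + w_sonar * sonar_feat
    mask = fused >= threshold
    if not mask.any():
        return None  # no target region found in this frame
    ys, xs = np.nonzero(mask)
    box = (int(xs.min()), int(ys.min()), int(xs.max()), int(ys.max()))
    centre = (float(xs.mean()), float(ys.mean()))
    return box, centre
```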
In this embodiment, the acquired optical image is enhanced, the enhanced optical image and the sonar image data are combined into a target image, and the target object is then framed (selected) within it.
In other embodiments, the underwater device can also determine the position information of the target object without enhancing the optical image, and the underwater device can also determine the position information of the target object by enhancing the sonar image data.
Further, a third embodiment of the present invention provides an underwater apparatus, where the step of obtaining the path planning information of the underwater apparatus includes:
and controlling the underwater equipment to move to the water surface and receiving path planning information sent by the above-water equipment.
Due to the fact that no satellite signal exists underwater, the underwater equipment needs to move to the water surface to receive path planning information.
In this embodiment, the underwater device moving to the water surface receives the path planning information, so that a navigation path can be determined for the underwater device moving on the water surface, conditions are provided for scanning underwater topography on the water surface, and the underwater topography map construction process is guaranteed to be performed orderly and planned, so that the accuracy of underwater topography map construction is improved.
In other embodiments, the planned path may also be set by an autonomous cruise algorithm built in the underwater device, so that the underwater device does not need to plan movement according to the path sent by the above-water device but moves according to the planned path set by the underwater device.
Further, the motion state information comprises positioning information, course information and attitude data of the underwater equipment, and the step of constructing the underwater topographic map according to the second sonar image data and the motion state information comprises the following step: constructing the underwater topographic map according to the positioning information, the course information, the attitude data and the second sonar image data.
The positioning information refers to the positioning information of the underwater equipment on the water surface. The heading information includes the direction of travel and path information of the underwater device on the water surface or underwater. The attitude data includes the state of the body of the underwater equipment and the manipulator.
And when the underwater equipment moves on the water surface according to the path planning information, acquiring positioning information, course information, attitude data and sonar image data of the underwater equipment in real time.
The underwater equipment scans the underwater terrain layer by layer to obtain a primary image in which the optical image and the sonar image of the terrain are fused, and then stitches the acquired positioning information, course information, attitude data and sonar image data with this primary image to construct the underwater topographic map. In particular, the sonar image data corresponding to the reflected echoes are offset (compensated) using the positioning information, course information and attitude data, so that a more accurate underwater topographic map is obtained.
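One way to picture this compensation is the geometric sketch below: each sonar echo, expressed as a slant range and an across-track beam angle, is rotated by the vehicle's roll, pitch and heading and placed relative to the GPS-derived position, so that wave-induced tilt does not distort the map. The body-frame convention, the flat local east/north/up frame, and the function interfaces are assumptions for illustration, not the patent's actual compensation algorithm.

```python
import numpy as np

def rotation_matrix(roll, pitch, yaw):
    """Body-to-local rotation, Z-Y-X (yaw-pitch-roll) convention, radians."""
    cr, sr = np.cos(roll), np.sin(roll)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cy, sy = np.cos(yaw), np.sin(yaw)
    rz = np.array([[cy, -sy, 0], [sy, cy, 0], [0, 0, 1]])
    ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])
    rx = np.array([[1, 0, 0], [0, cr, -sr], [0, sr, cr]])
    return rz @ ry @ rx

def georeference_beam(range_m, beam_angle, roll, pitch, yaw, position_enu):
    """Map one echo (slant range + across-track beam angle) to a 3-D point.

    position_enu: (east, north, up) of the vehicle from GPS and depth sensor.
    Returns the echo point in the same local east/north/up frame.
    """
    # Beam direction in the body frame: downward, tilted across-track.
    direction_body = np.array([0.0, np.sin(beam_angle), -np.cos(beam_angle)])
    direction_local = rotation_matrix(roll, pitch, yaw) @ direction_body
    return np.asarray(position_enu, dtype=float) + range_m * direction_local

# Illustrative use: a 25 m echo, 10-degree beam, small roll/pitch, heading 90°.
point = georeference_beam(25.0, np.deg2rad(10), np.deg2rad(2),
                          np.deg2rad(-1), np.deg2rad(90), (100.0, 50.0, 0.0))
```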
In the embodiment, the underwater equipment constructs the underwater topographic map according to the positioning information, the course information, the attitude data and the sonar image data which are obtained in real time, so that the accuracy of constructing the underwater topographic map is ensured, the error of the underwater topographic map is reduced, and the accuracy of the underwater topographic map is improved, thereby improving the map constructing capability of the underwater equipment, namely enhancing the operation capability of the underwater equipment.
In other embodiments, the underwater topography map can also be constructed by only collecting the positioning information and the heading information.
Further, in this embodiment, the path planning information includes a plurality of paths, the positioning information includes a plurality of sub-positioning information, the heading information includes a plurality of sub-heading information, the attitude data includes a plurality of sub-attitude data, the second sonar image data includes a plurality of sub-sonar image data, the step of constructing the underwater topography map according to the positioning information, the heading information, the attitude data and the second sonar image data includes:
constructing a corresponding sub-map according to the sub-positioning information, the sub-course information, the sub-attitude data and the sub-sonar image data corresponding to each path to obtain a plurality of sub-maps;
and constructing the underwater topography map according to the plurality of sub-maps.
The path planning information comprises a plurality of identical paths, and each path corresponds to a different single-layer underwater terrain. The underwater equipment scans the underwater terrain according to a single path in the path planning information to construct a sub-map; each layer's terrain map has corresponding sub-positioning information, sub-course information, sub-attitude data and sub-sonar image data, and the underwater equipment collects and stores the corresponding positioning information in real time according to the path planning information to construct the sub-maps.
The sub-maps correspond to a single-layer underwater topography and are two-dimensional maps, and a three-dimensional underwater topography map can be constructed by modeling a plurality of sub-maps.
The underwater equipment scans the underwater terrain layer by layer to obtain a primary image from the sonar image of each layer, and then stitches the positioning information, course information, attitude data and sonar image data of the sub-map corresponding to each layer of the underwater terrain with the primary image of that layer to construct the underwater topographic map.
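A minimal sketch of assembling the layered two-dimensional sub-maps into a three-dimensional terrain model is shown below; it assumes that each pass yields a regular H x W grid, that every layer has a known representative depth, and that a simple point cloud is an acceptable output. The grid spacing and data layout are illustrative assumptions, not details from the patent.

```python
import numpy as np

def stack_sub_maps(sub_maps, layer_depths, cell_size_m=0.5):
    """Stack per-layer 2-D grids into a 3-D point cloud.

    sub_maps:     list of HxW intensity grids, one per scanned layer
    layer_depths: representative depth (metres) associated with each layer
    Returns an (N, 4) array of (east, north, depth, intensity) points.
    """
    points = []
    for grid, depth in zip(sub_maps, layer_depths):
        h, w = grid.shape
        ys, xs = np.mgrid[0:h, 0:w]
        east = xs.ravel() * cell_size_m
        north = ys.ravel() * cell_size_m
        layer = np.column_stack([east, north,
                                 np.full(east.shape, depth),
                                 grid.ravel()])
        points.append(layer)
    return np.vstack(points)

# Illustrative use: three synthetic 100 x 100 layers at 5 m depth spacing.
maps = [np.random.rand(100, 100) for _ in range(3)]
cloud = stack_sub_maps(maps, layer_depths=[5.0, 10.0, 15.0])
```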
In the embodiment, the three-dimensional underwater topographic map is constructed according to the two-dimensional sub-map, so that the visibility of the underwater topographic map is improved, the usability of the underwater topographic map is improved, and visual guarantee is provided for underwater reconnaissance and detection.
In other embodiments, the underwater topography map can be constructed by dividing the area, and a three-dimensional underwater topography map can be directly constructed by a stereo camera.
Further, the step of constructing a corresponding sub-map according to the sub-positioning information, the sub-heading information, the sub-attitude data and the sub-sonar image data corresponding to each of the paths includes:
and constructing a corresponding two-dimensional map according to the sub-positioning information, the sub-course information, the sub-attitude data and the sub-sonar image data corresponding to each path, wherein the sub-map comprises the two-dimensional map.
The sub-map constructed by collecting the positioning information of the single-layer underwater topography is a two-dimensional map, so that the two-dimensional map corresponding to the single-layer underwater topography is constructed by each piece of sub-positioning information, sub-course information, sub-attitude data and sub-sonar image data.
In the embodiment, the underwater equipment can ensure the feasibility of constructing the underwater topographic map by scanning the underwater topography layer by layer, and the accuracy of constructing the underwater topographic map can be improved, so that the operation effect of constructing the underwater topographic map by the underwater equipment is improved.
In other embodiments, a three-dimensional sub-map may also be constructed by a stereo camera.
In addition, the embodiment of the invention also provides an underwater operation system, which comprises the underwater equipment and the above-water equipment, wherein the above-water equipment is used for measuring the dimensional parameters of the target object.
Further, in this embodiment, the above-water device includes:
the receiving module is used for receiving a target video sent by the underwater equipment;
and the processing module is used for detecting the underwater marker in the target video according to a visual recognition algorithm to obtain a detection result, and measuring the size parameter of the target object according to the detection result.
The visual recognition algorithm is an algorithm for recognizing a target object from an image.
As shown in fig. 2, the underwater device b is provided with an optical camera d and a sonar sensor e: the optical camera d can capture an optical image of an underwater target object c, and the sonar sensor e can capture a sonar image of the underwater target object c. The above-water device a may be a mobile phone, a tablet computer, a PC (personal computer), or the like; it is located above the water surface and is connected to the underwater device b through a twisted pair. The above-water device a can acquire the optical images and sonar images captured by the underwater device b, which are transmitted to it over the twisted pair in the form of an encoded video stream. The optical images and sonar images correspond one to one, and each corresponding pair is captured at the same time and/or at the same capture angle.
When the above-water equipment receives the target video, it inputs the target video into the visual recognition algorithm to detect the underwater marker; when the underwater marker is detected, the target object is measured according to the preset length of the underwater marker.
In this embodiment, the underwater marker is detected from the target video including the target object and the underwater marker through a visual recognition algorithm, and then the target object is measured according to the underwater marker, so that indirect measurement of the target object is realized, a fast, accurate and reliable measurement mode is provided for the target object, the calibration effect of the target object is improved, and the measurement accuracy of the target object is ensured.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or system that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or system. Without further limitation, an element defined by the phrase "comprising a … …" does not exclude the presence of another identical element in a process, method, article, or system that comprises the element.
The above-mentioned serial numbers of the embodiments of the present invention are merely for description and do not represent the merits of the embodiments.
Through the description of the foregoing embodiments, it is clear to those skilled in the art that the method of the foregoing embodiments may be implemented by software plus a necessary general hardware platform, and certainly may also be implemented by hardware, but in many cases, the former is a better implementation. Based on such understanding, the technical solution of the present invention may be embodied in the form of a software product, which is stored in a storage medium (e.g., ROM/RAM, magnetic disk, optical disk) as described above and includes instructions for enabling a terminal device (e.g., a mobile phone, a computer, a server, or a network device) to execute the method according to the embodiments of the present invention.
The above description is only a preferred embodiment of the present invention, and not intended to limit the scope of the present invention, and all modifications of equivalent structures and equivalent processes, which are made by using the contents of the present specification and the accompanying drawings, or directly or indirectly applied to other related technical fields, are included in the scope of the present invention.
Claims (10)
1. An underwater apparatus, characterized in that the underwater apparatus comprises:
the operation control module comprises a position locking unit and an execution unit, wherein the position locking unit is used for acquiring an optical image and first sonar image data and determining the position information of an underwater target object according to the optical image and the first sonar image data, and the execution unit is used for controlling the underwater equipment to move according to the position information and executing operation action on the target object;
and the mapping module is used for acquiring path planning information of the underwater equipment, controlling the underwater equipment to move on the water surface according to the path planning information, acquiring second sonar image data and motion state information of the underwater equipment, and constructing an underwater topography map according to the second sonar image data and the motion state information.
2. The underwater equipment of claim 1, wherein the operation action comprises tracking, surrounding, approaching, or grabbing the target object;
the execution unit is also used for moving and tracking the target object according to the position information; or
The execution unit is further used for moving around the target object according to the position information; or
The execution unit is further used for moving towards the target object according to the position information; or
The execution unit is also used for moving and grabbing the target object according to the position information.
3. The underwater equipment of claim 1, wherein the execution unit is further configured to:
controlling the underwater equipment to move to the water surface according to the position information and collecting a target video, wherein the target video comprises an underwater marker and video data corresponding to the target object;
and sending the target video to the above-water equipment so that the above-water equipment measures the size parameters of the target object according to the underwater marker in the target video.
4. The underwater equipment of claim 1, wherein the step of obtaining the path planning information of the underwater equipment comprises:
and controlling the underwater equipment to move to the water surface and receiving the path planning information sent by the above-water equipment.
5. The underwater device according to claim 1, wherein the motion state information includes positioning information, heading information, and attitude data of the underwater device, and the step of constructing an underwater topography map from the second sonar image data and the motion state information includes:
and constructing the underwater topography according to the positioning information, the course information, the attitude data and the second sonar image data.
6. The underwater device of claim 5, wherein the path planning information comprises a plurality of paths, the positioning information comprises a plurality of sub-positioning information, the heading information comprises a plurality of sub-heading information, the attitude data comprises a plurality of sub-attitude data, the second sonar image data comprises a plurality of sub-sonar image data, and the step of constructing the underwater topography map from the positioning information, the heading information, the attitude data and the second sonar image data comprises:
constructing a corresponding sub-map according to the sub-positioning information, the sub-course information, the sub-attitude data and the sub-sonar image data corresponding to each path to obtain a plurality of sub-maps;
and constructing the underwater topography map according to the plurality of sub-maps.
7. The underwater device according to claim 6, wherein the step of constructing a corresponding sub-map according to the sub-positioning information, the sub-heading information, the sub-attitude data, and the sub-sonar image data corresponding to each of the paths includes:
and constructing a corresponding two-dimensional map according to the sub-positioning information, the sub-heading information, the sub-attitude data and the sub-sonar image data corresponding to each path, wherein the sub-map comprises the two-dimensional map.
8. The underwater equipment of any one of claims 1 to 7, wherein the operation control module is further configured to:
inputting the optical image into a preset image enhancement model to obtain an enhanced optical image, wherein the clarity of the enhanced optical image is greater than that of the optical image;
determining a first image feature of the optically enhanced image and determining a second image feature of the first sonar image data;
determining an image area where the target object is located according to a preset weight coefficient, the first image feature and the second image feature;
and determining the position information according to the image area.
9. An underwater operation system comprising the underwater apparatus as claimed in any one of claims 1 to 8 and an above-water apparatus for measuring a dimensional parameter of the target object.
10. The underwater operation system of claim 9, wherein the above-water device comprises:
the receiving module is used for receiving a target video sent by the underwater equipment;
and the processing module is used for detecting the underwater marker in the target video according to a visual recognition algorithm to obtain a detection result, and measuring the size parameter of the target object according to the detection result.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202210870497.7A CN115303451B (en) | 2022-07-22 | 2022-07-22 | Underwater equipment and underwater operation system |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202210870497.7A CN115303451B (en) | 2022-07-22 | 2022-07-22 | Underwater equipment and underwater operation system |
Publications (2)
Publication Number | Publication Date |
---|---|
CN115303451A true CN115303451A (en) | 2022-11-08 |
CN115303451B CN115303451B (en) | 2024-08-09 |
Family
ID: 83859295
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202210870497.7A Active CN115303451B (en) | 2022-07-22 | 2022-07-22 | Underwater equipment and underwater operation system |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN115303451B (en) |
- 2022-07-22: Application CN202210870497.7A filed in China; granted as CN115303451B (status: Active)
Patent Citations (15)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN107526087A (en) * | 2016-06-21 | 2017-12-29 | 北京臻迪科技股份有限公司 | A kind of method and system for obtaining underwater 3D faultage images |
CN207281283U (en) * | 2017-01-04 | 2018-04-27 | 北京臻迪科技股份有限公司 | A kind of unmanned boat detection system |
JP2018177074A (en) * | 2017-04-18 | 2018-11-15 | 国立大学法人 東京大学 | Autonomous type underwater robot and control method for the same |
CN109460061A (en) * | 2018-12-12 | 2019-03-12 | 国家海洋局第二海洋研究所 | A kind of concurrent job method of autonomous underwater robot and geological sampling equipment |
CN110133667A (en) * | 2019-05-15 | 2019-08-16 | 上海大学 | Underwater 3 D detection system based on mobile Forward-Looking Sonar |
KR102263037B1 (en) * | 2019-11-08 | 2021-06-10 | 대양전기공업 주식회사 | A Method of Underwater Environment Mapping System using Underwater Vehicle and Underwater Acoustic Detection Equipment |
CN110750100A (en) * | 2019-11-08 | 2020-02-04 | 江苏科技大学 | Underwater search and rescue robot path planning method based on flow function |
CN111674530A (en) * | 2020-04-29 | 2020-09-18 | 大连海事大学 | Underwater small target positioning and grabbing device and method |
CN111582403A (en) * | 2020-05-18 | 2020-08-25 | 哈尔滨工程大学 | Zero-sample side-scan sonar image target classification method |
CN112734921A (en) * | 2021-01-11 | 2021-04-30 | 燕山大学 | Underwater three-dimensional map construction method based on sonar and visual image splicing |
CN113640808A (en) * | 2021-08-12 | 2021-11-12 | 深圳中海油服深水技术有限公司 | Shallow water submarine cable buried depth detection method and device |
CN114137546A (en) * | 2021-11-30 | 2022-03-04 | 青岛澎湃海洋探索技术有限公司 | AUV (autonomous underwater vehicle) submarine target identification and path planning method based on data driving |
CN114488164A (en) * | 2022-01-17 | 2022-05-13 | 清华大学深圳国际研究生院 | Underwater vehicle synchronous positioning and mapping method and underwater vehicle |
CN114663745A (en) * | 2022-03-04 | 2022-06-24 | 深圳鳍源科技有限公司 | Position locking method of underwater equipment, terminal equipment, system and medium |
CN114755667A (en) * | 2022-04-19 | 2022-07-15 | 山东海慧勘察测绘有限公司 | Three-dimensional visual processing system of marine surveying and mapping underwater target |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN115892378A (en) * | 2022-12-23 | 2023-04-04 | 广东深蓝水下特种设备科技有限公司 | Ship cleaning method, system and medium based on underwater sonar positioning |
Also Published As
Publication number | Publication date |
---|---|
CN115303451B (en) | 2024-08-09 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US10809376B2 (en) | Systems and methods for detecting objects in underwater environments | |
Whitcomb et al. | Advances in underwater robot vehicles for deep ocean exploration: Navigation, control, and survey operations | |
Mallios et al. | Scan matching SLAM in underwater environments | |
Williams et al. | Simultaneous localisation and mapping on the great barrier reef | |
JP2020527500A (en) | Methods and equipment for calibrating external parameters of onboard sensors | |
CN109239709B (en) | Autonomous construction method for local environment map of unmanned ship | |
JP5105596B2 (en) | Travel route determination map creation device and travel route determination map creation method for autonomous mobile body | |
Ribas et al. | Underwater SLAM in a marina environment | |
CN111308415B (en) | Online pose estimation method and equipment based on time delay | |
Aulinas et al. | Vision-based underwater SLAM for the SPARUS AUV | |
CN114488164A (en) | Underwater vehicle synchronous positioning and mapping method and underwater vehicle | |
Maurelli et al. | A particle filter approach for AUV localization | |
Jiang et al. | A survey of underwater acoustic SLAM system | |
CN115303451B (en) | Underwater equipment and underwater operation system | |
KR102263037B1 (en) | A Method of Underwater Environment Mapping System using Underwater Vehicle and Underwater Acoustic Detection Equipment | |
Williams et al. | A terrain-aided tracking algorithm for marine systems | |
Zalewski et al. | Computer Vision-Based Position Estimation for an Autonomous Underwater Vehicle | |
Mallios | Sonar scan matching for simultaneous localization and mapping in confined underwater environments | |
US12002193B2 (en) | Inspection device for inspecting a building or structure | |
Ribas Romagós | Underwater SLAM for structured environments using an imaging sonar | |
KR20200021433A (en) | Method and Apparatus for Interlocking Control Based on Sensor Fusion for Operation of Underwater Platform | |
Pulido et al. | Time and cost-efficient bathymetric mapping system using sparse point cloud generation and automatic object detection | |
Fallon et al. | Simultaneous localization and mapping in marine environments | |
JP2018130998A (en) | Underwater searching method and underwater searching system | |
Mugnai et al. | Developing affordable bathymetric analysis techniques using non-conventional payload for cultural heritage inspections |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||