CN116206070B - Hull underwater imaging method based on machine binocular vision and underwater robot - Google Patents


Info

Publication number
CN116206070B
Authority
CN
China
Prior art keywords
underwater
binocular vision
assembly
ship body
ranging point
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202310496786.XA
Other languages
Chinese (zh)
Other versions
CN116206070A (en)
Inventor
张岩
梅宁
张淑慧
袁瀚
李艳
赵健
刘溪源
Current Assignee
Ocean University of China
Original Assignee
Ocean University of China
Priority date
Filing date
Publication date
Application filed by Ocean University of China
Priority to CN202310496786.XA
Publication of CN116206070A
Application granted
Publication of CN116206070B
Legal status: Active (current)
Anticipated expiration: listed


Classifications

    • G PHYSICS
      • G06 COMPUTING; CALCULATING OR COUNTING
        • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
          • G06T17/00 Three dimensional [3D] modelling, e.g. data description of 3D objects
          • G06T1/00 General purpose image data processing
            • G06T1/0014 Image feed-back for automatic industrial control, e.g. robot with camera
          • G06T15/00 3D [Three Dimensional] image rendering
            • G06T15/005 General purpose rendering architectures
            • G06T15/04 Texture mapping
          • G06T19/00 Manipulating 3D models or images for computer graphics
            • G06T19/20 Editing of 3D images, e.g. changing shapes or colours, aligning objects or positioning parts
          • G06T7/00 Image analysis
            • G06T7/70 Determining position or orientation of objects or cameras
              • G06T7/73 Determining position or orientation of objects or cameras using feature-based methods
          • G06T2219/00 Indexing scheme for manipulating 3D models or images for computer graphics
            • G06T2219/20 Indexing scheme for editing of 3D models
              • G06T2219/2004 Aligning objects, relative positioning of parts
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
      • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
        • Y02A TECHNOLOGIES FOR ADAPTATION TO CLIMATE CHANGE
          • Y02A90/00 Technologies having an indirect contribution to adaptation to climate change
            • Y02A90/30 Assessment of water resources


Abstract

The invention relates to the field of machine binocular vision, in particular to a hull underwater imaging method based on machine binocular vision and an underwater robot. The method comprises the following steps: obtaining a hull model; acquiring outer surface images of the underwater portion of the hull using an underwater binocular vision assembly; and mapping the acquired outer surface images onto the hull model to obtain a hull rendering model. An underwater robot is used to implement the method. The method feeds back the condition of the ship bottom promptly and accurately, makes it easy to grasp the ship bottom's state of health directly, gives direct guidance for hull maintenance, avoids blind maintenance work, allows maintenance to be managed more rationally, and helps reduce the waste of resources.

Description

Hull underwater imaging method based on machine binocular vision and underwater robot
Technical Field
The invention relates to the field of machine binocular vision, in particular to a ship body underwater imaging method based on machine binocular vision and an underwater robot.
Background
Parasitic organisms on the bottom of a ship (e.g. barnacles) seriously shorten the service life of the hull and impose an additional burden on normal voyages, so removing them is an important step in ship maintenance.
Conventional practice is to clean and maintain the ship bottom at fixed intervals, but actual maintenance work often faces the following problems:
If the cleaning interval is long, a large amount of fouling has accumulated by the time of each cleaning; removal is difficult and time-consuming, and before the cleaning the fouling seriously reduces the ship's speed, increases energy consumption, and creates safety hazards.
If the cleaning interval is short, maintenance costs rise sharply, and cleanings sometimes find little or no biological attachment, so maintenance resources are greatly wasted.
However, the rate at which organisms attach and grow depends on the marine environment and is influenced by many factors, so it is difficult to evaluate or predict accurately, and the cleaning interval cannot be set to a reliable range.
These conditions leave ship-bottom maintenance partly blind: maintenance resources cannot be allocated accurately and rationally, which directly causes substantial waste.
In view of this, the present application is specifically proposed.
Disclosure of Invention
The invention provides a hull underwater imaging method based on machine binocular vision. The method feeds back the condition of the ship bottom promptly and accurately, makes it easy to grasp the ship bottom's state of health directly, gives direct guidance for hull maintenance, avoids blind maintenance work, allows maintenance to be managed more rationally, and helps reduce the waste of resources.
The second object of the invention is to provide an underwater robot that likewise feeds back the ship-bottom condition promptly and accurately, makes the ship bottom's health easy to grasp directly, gives direct guidance for hull maintenance, avoids blind maintenance work, allows maintenance to be managed more rationally, and helps reduce the waste of resources.
Embodiments of the present invention are implemented as follows:
a ship body underwater imaging method based on machine binocular vision comprises the following steps:
s1: obtaining a hull model;
s2: acquiring an outer surface image of an underwater portion of the hull using the underwater binocular vision assembly;
s3: and mapping on the hull model by using the obtained outer surface image to obtain a hull rendering model.
Further, step S2 includes the steps of:
s21: setting a first ranging point, a second ranging point and a third ranging point around the underwater binocular vision component, wherein the first ranging point ranges in a first preset direction, the second ranging point ranges in a second preset direction, and the third ranging point ranges in a third preset direction;
s22: controlling the underwater binocular vision component to be close to the ship body, so that the distance between the underwater binocular vision component and the ship body detected from the first ranging point is a first preset value, the distance between the underwater binocular vision component and the ship body detected from the second ranging point is a second preset value, and the distance between the underwater binocular vision component and the ship body detected from the third ranging point is a third preset value, and therefore the underwater binocular vision component is in a preset posture;
s23: controlling the underwater binocular vision assembly to move along the hull in the preset posture so as to acquire outer surface images of different positions of the underwater portion of the hull.
Further, step S23 includes the steps of:
s231: controlling the underwater binocular vision assembly, in the preset posture, to move transversely along the hull and travel one full circuit around it, so as to obtain the outer surface image of the hull at one depth;
s232: determining the field of view of the underwater binocular vision assembly on the hull surface in the preset posture, and adjusting the submergence depth of the assembly, where each adjustment of the submergence depth is smaller than or equal to the longitudinal length of the field of view;
S233: steps S231 and S232 are repeatedly performed to acquire an outer surface image of the entire underwater portion of the hull.
Further, in step S232, each adjustment of the submergence depth is smaller than the longitudinal length of the field of view, and the difference between the adjustment amount and the longitudinal length of the field of view is taken as the adjustment length;
when combining the outer surface images from different depths, the images from adjacent depths are overlapped by one adjustment length, and one copy of the overlapped portion is retained.
Further, a distance deviation range is set, and the posture of the underwater binocular vision assembly is regulated in real time while it moves along the hull in the preset posture;
if the distance to the hull detected from any of the first, second, or third ranging points deviates from its respective preset value by more than the distance deviation range, the underwater binocular vision assembly is readjusted to the preset posture.
Further, determining profile data of the hull according to the transverse movement rate of the underwater binocular vision assembly, the distance from the hull detected from the first ranging point, the distance from the hull detected from the second ranging point, and the distance from the hull detected from the third ranging point, and marking profile feature points matched with the profile data in the acquired outer surface image;
The step S3 comprises the following steps: and mapping the outer surface image on the hull model according to the outline characteristic points and the submerging depth of the underwater binocular vision component when the outer surface image is acquired, so as to obtain a hull rendering model.
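Purely as an illustration of the profile-marking idea above (the patent gives no formulas; the function and its parameters are hypothetical), the contour data could be accumulated by pairing the lateral position implied by the traversal rate with the measured ranging distances:

```python
def hull_profile(rate, interval, distances):
    """Illustrative sketch: reconstruct one contour slice of the hull.
    Lateral position is traversal rate x elapsed time; the second
    coordinate is the ranged distance to the hull at that moment."""
    return [(i * rate * interval, d) for i, d in enumerate(distances)]
```

Feature points for mapping could then be picked where this profile changes shape, as the text describes.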
Further, the first ranging point is arranged at the top of the underwater binocular vision assembly, and the second and third ranging points are arranged on its two sides; the first, second, and third preset directions all point toward the front of the underwater binocular vision assembly, the first preset direction also points upward, and the second and third preset directions also point downward; the lines along the first, second, and third preset directions pass through a common vertex;
when the underwater binocular vision component is in a preset posture, the projection of the first preset direction on the surface of the ship body is located above the visual field range, and the projection of the second preset direction and the third preset direction on the surface of the ship body is located below the visual field range.
An underwater robot for performing the above hull underwater imaging method based on machine binocular vision, comprising: a robot body and an underwater binocular vision assembly. The underwater binocular vision assembly is arranged on the robot body, and the first, second, and third ranging points are provided on the surface of the robot body.
Further, the underwater robot further includes: a stabilizing mechanism; the stabilizing mechanism includes: the counterweight ball, the positioning shell, the positioning assembly and the locking assembly;
the counterweight ball is hollow, and a partition inside it divides its interior into two hemispherical cavities, one of which is filled with counterweight material;
the positioning shell is spherical; each positioning assembly is arranged along a radius of the shell, passes through the shell wall, extends into the shell, and carries a ball at its inner end; the positioning assemblies are distributed in an array over the surface of the shell;
the counterweight ball is housed in the positioning shell with the balls at the inner ends of the positioning assemblies resting against it, so that the counterweight ball can rotate universally inside the shell on the positioning assemblies;
the locking assembly cooperates with the positioning assemblies and can lock the balls, thereby locking the counterweight ball.
Further, the positioning assembly further comprises: the device comprises a reference column, a first control ring, a transmission gear, a second control ring, a core body, a pressure sensor and a cushion block;
the reference column penetrates through the shell wall of the positioning shell and is arranged along the radial direction of the positioning shell, and an installation groove for installing the ball is formed in the end face of the inner end of the reference column; a mounting blind hole is formed in the bottom of the mounting groove and extends along the axial direction of the reference column; the core body is slidably matched with the mounting blind hole;
The pressure sensor is arranged at one end part of the core body, which is close to the mounting groove, the cushion block is arranged at one side of the pressure sensor, which is far away from the core body, and a matching part for matching with the ball is arranged at one side of the cushion block, which is far away from the pressure sensor; the core body is provided with external threads, the second control ring is provided with internal threads, and the second control ring is sleeved on the core body and is matched with the threads of the core body;
the side wall of the outer end of the reference column is provided with a matching notch which is communicated with the mounting blind hole; the transmission gear is arranged on the matching notch, and the first control ring is rotatably sleeved on the reference column; the first control ring is provided with an inner gear ring, the second control ring is provided with an outer gear ring, and the first control ring are both meshed with the transmission gear;
in the circumferential direction of the reference column, the core body is fixed relative to the mounting blind hole; in the axial direction of the reference column, the second control ring is fixed relative to the mounting blind hole;
the locking component is in transmission fit with the first control ring, and the locking component can enable the cushion block to tightly prop against and lock the ball through driving the first control ring, so that the counterweight ball is locked.
The technical scheme of the embodiment of the invention has the beneficial effects that:
According to the hull underwater imaging method based on machine binocular vision provided by the embodiment of the invention, after outer surface images of the underwater portion of the hull are acquired by the underwater binocular vision assembly, they are mapped onto the hull model. The actual underwater appearance of the hull is thus restored onto the model, and the resulting hull rendering model faithfully reflects the real appearance of the hull's underwater portion, in a simple and intuitive way.
In this way, the health degree of the underwater part of the ship body can be evaluated by observing the ship body rendering model, and the ship does not need to be started into a dock for inspection, so that the inspection efficiency is improved, the inspection cost is reduced, the dependence on the dock is reduced, and the conflict with overhaul work in other aspects of the ship is avoided.
The parasitic situation of parasites (such as barnacles) at the bottom of the ship can be judged by observing the ship body rendering model, so that the best opportunity for cleaning the bottom of the ship is selected, the rationality and pertinence of cleaning and maintenance work are greatly improved, and the waste of related resources is avoided.
In general, the ship body underwater imaging method based on the machine binocular vision provided by the embodiment of the invention can timely and accurately feed back the ship bottom condition, is convenient for directly and accurately grasping the ship bottom health condition, has a direct guiding effect on ship body maintenance work, avoids blindness of the maintenance work, is convenient for more reasonably managing the maintenance work, and has positive significance for reducing resource waste.
The underwater robot provided by the embodiment of the invention can timely and accurately feed back the ship bottom condition, is convenient for directly and accurately grasping the ship bottom health condition, has a direct guiding function on maintenance work of the ship body, avoids blindness of the maintenance work, is convenient for more reasonably managing the maintenance work, and has positive significance for reducing resource waste.
Drawings
To illustrate the technical solutions of the embodiments of the present invention more clearly, the drawings needed for the embodiments are briefly described below. The following drawings illustrate only some embodiments of the invention and should not be considered limiting; other related drawings can be obtained from them by a person skilled in the art without inventive effort.
FIG. 1 is a schematic view of an underwater binocular vision assembly when acquiring an image of the exterior surface of an underwater portion of a hull (view in the lengthwise direction of the hull);
FIG. 2 is a schematic view of the underwater binocular vision assembly when capturing an image of the exterior surface of the underwater portion of the hull (widthwise view of the hull);
FIG. 3 is a schematic view of an underwater binocular vision assembly in a preset attitude;
FIG. 4 is a schematic diagram of overlapping of two adjacent circles of images;
FIG. 5 is a schematic view of the field of view of the underwater binocular vision assembly;
FIG. 6 is a schematic view of an underwater binocular vision assembly performing image acquisition along a lateral motion;
FIG. 7 is a schematic view of the underwater binocular vision assembly ready for stern steering;
FIG. 8 is a schematic view of the overall structure of a stabilizing mechanism of the underwater robot;
FIG. 9 is a schematic view of the structure at the connecting shaft of the stabilizing mechanism;
FIG. 10 is a schematic diagram of the mating relationship of the positioning assembly.
Reference numerals illustrate: an underwater binocular vision assembly 100; a first ranging point 110; a second ranging point 120; a third ranging point 130; a stabilizing mechanism 1000; weight ball 200; a partition 210; a weight 220; positioning the shell 300; a connection shaft 310; a positioning assembly 400; a ball 410; a reference column 420; a mounting groove 421; mounting a blind hole 422; a mating notch 423; a first control loop 430; a transmission gear 440; a second control loop 450; a core 460; a pressure sensor 470; a pad 480; collar 510; a connecting rod 520; driving ring 530.
Detailed Description
For the purpose of making the objects, technical solutions and advantages of the embodiments of the present invention more apparent, the technical solutions of the embodiments of the present invention will be clearly and completely described below with reference to the accompanying drawings in the embodiments of the present invention, and it is apparent that the described embodiments are some embodiments of the present invention, but not all embodiments of the present invention. The components of the embodiments of the present invention generally described and illustrated in the figures herein may be arranged and designed in a wide variety of different configurations.
Thus, the following detailed description of the embodiments of the invention, as presented in the figures, is not intended to limit the scope of the invention, as claimed, but is merely representative of selected embodiments of the invention. All other embodiments, which can be made by those skilled in the art based on the embodiments of the invention without making any inventive effort, are intended to be within the scope of the invention.
It should be noted that: like reference numerals and letters denote like items in the following figures, and thus once an item is defined in one figure, no further definition or explanation thereof is necessary in the following figures.
The terms "first," "second," "third," and the like are used merely to distinguish between descriptions and are not to be construed as indicating or implying relative importance.
Furthermore, terms such as "parallel" and "perpendicular" do not require the components to be absolutely parallel or perpendicular; a slight inclination is permissible. For example, "parallel" merely means that two directions are closer to parallel than to perpendicular, not that the structures are perfectly parallel.
In the description of the present invention, it should also be noted that, unless explicitly specified and limited otherwise, the terms "disposed," "mounted," "connected," and "coupled" are to be construed broadly; for example, a connection may be fixed, detachable, or integral; it may be direct, indirect through an intermediate medium, or an internal communication between two elements. The specific meaning of these terms in the present invention will be understood case by case by those of ordinary skill in the art.
Example 1
Referring to fig. 1 and 2, the present embodiment provides a ship body underwater imaging method based on machine binocular vision, which includes the following steps:
s1: obtaining a hull model;
s2: acquiring an outer surface image of an underwater portion of the hull using the underwater binocular vision assembly 100;
s3: and mapping on the hull model by using the obtained outer surface image to obtain a hull rendering model.
The hull model may be obtained by three-dimensional software, but is not limited thereto. In the present embodiment, it is understood that the underwater binocular vision assembly 100 may include an underwater binocular lens and an image data processing module, and is not limited thereto.
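As a hedged sketch of step S3 only (the patent prescribes no data structure; `render_hull`, the patch identifiers, and the dict-based model are invented for illustration), the texture-mapping idea can be expressed as:

```python
def render_hull(hull_model, captures):
    """Illustrative sketch of S3: attach each captured exterior image to
    the hull-model patch it was taken from, yielding a 'hull rendering
    model'. Patches with no capture keep their bare (None) texture."""
    rendered = dict(hull_model)          # start from the bare model
    for patch_id, image in captures:     # images gathered in step S2
        rendered[patch_id] = image       # texture-map the patch
    return rendered
```

A real implementation would project stereo images onto 3D mesh faces, but the bookkeeping is the same: every imaged region is bound to its location on the model.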
After the external surface image of the underwater part of the ship body is acquired through the underwater binocular vision assembly 100, the external surface image is mapped on the ship body model, so that the actual appearance of the ship body under the water can be restored to the ship body model, and the obtained ship body rendering model can actually reflect the actual appearance of the underwater part of the ship body, so that the ship body rendering model is simple and visual.
In this way, the health degree of the underwater part of the ship body can be evaluated by observing the ship body rendering model, and the ship does not need to be started into a dock for inspection, so that the inspection efficiency is improved, the inspection cost is reduced, the dependence on the dock is reduced, and the conflict with overhaul work in other aspects of the ship is avoided.
Through the design, the parasitic situation of parasites (such as barnacles) on the bottom of the ship can be judged by observing the ship body rendering model, so that the best opportunity for cleaning the bottom of the ship is selected, the rationality and pertinence of cleaning and maintenance work are greatly improved, and the waste of related resources is avoided.
In general, the ship body underwater imaging method based on machine binocular vision can timely and accurately feed back the ship bottom condition, is convenient for directly and accurately grasping the ship bottom health condition, has a direct guiding function on ship body maintenance work, avoids blindness of the maintenance work, is convenient for more reasonably managing the maintenance work, and has positive significance for reducing resource waste.
In this embodiment, please refer to fig. 3, step S2 includes the following steps:
s21: a first ranging point 110, a second ranging point 120 and a third ranging point 130 are arranged around the underwater binocular vision assembly 100, the first ranging point 110 ranges in a first preset direction a, the second ranging point 120 ranges in a second preset direction b, and the third ranging point 130 ranges in a third preset direction c;
s22: controlling the underwater binocular vision assembly 100 to approach the hull such that the distance from the hull detected from the first ranging point 110 is a first preset value, the distance from the hull detected from the second ranging point 120 is a second preset value, and the distance from the hull detected from the third ranging point 130 is a third preset value, thereby causing the underwater binocular vision assembly 100 to be in a preset posture;
S23: the underwater binocular vision assembly 100 is controlled to move along the hull in a preset attitude to acquire the outer surface images of different positions of the underwater portion of the hull.
The first preset value, the second preset value and the third preset value can be flexibly set and adjusted according to actual conditions, and when any one of the first preset value, the second preset value and the third preset value is changed, the preset posture of the underwater binocular vision assembly 100 is also changed.
By setting the preset posture, the underwater binocular vision assembly 100 can acquire the outer surface image of the underwater portion of the hull in a relatively stable posture, and imaging quality is improved.
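The preset-posture condition above amounts to a three-way distance check. A minimal sketch, with hypothetical names and an assumed tolerance the patent does not specify:

```python
def in_preset_posture(measured, presets, tol=0.02):
    """Illustrative sketch: True when every ranging point reads its
    preset distance (within an assumed tolerance), i.e. the assembly
    has reached the preset posture described in step S22."""
    return all(abs(m - p) <= tol for m, p in zip(measured, presets))
```

The approach maneuver of S22 would simply drive the assembly until this predicate becomes true.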
Specifically, step S23 includes the steps of:
s231: the underwater binocular vision assembly 100, in the preset posture, is controlled to move transversely along the hull and travel one full circuit around it, so as to obtain the outer surface image of the hull at one depth;
s232: the field of view of the underwater binocular vision assembly 100 on the hull surface in the preset posture is determined, and the submergence depth of the assembly is adjusted, each adjustment being smaller than or equal to the longitudinal length of the field of view;
s233: steps S231 and S232 are repeatedly performed to acquire an outer surface image of the entire underwater portion of the hull.
Acquiring the images circuit by circuit with transverse movement effectively ensures their transverse continuity and reduces stitching on the one hand, and keeps image acquisition efficient on the other.
After the lateral movement has acquired the outer surface image of one turn, the submergence depth of the underwater binocular vision assembly 100 is changed to perform the acquisition work of the outer surface image of the next turn. The outer surface image of the underwater part of the ship body can be acquired in a circle-by-circle acquisition mode by gradually adjusting the submerging depth.
The adjustment amount of each submerging depth is smaller than or equal to the longitudinal length of the visual field range, so that continuity of the acquired images in the longitudinal direction can be ensured, and image faults are avoided.
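The constraint that each depth adjustment must not exceed the longitudinal length of the field of view can be illustrated as follows; the function and its parameters are hypothetical, not part of the patent:

```python
def depth_schedule(draft, fov_length, step):
    """Illustrative sketch: submergence depths (field-of-view centres)
    for successive imaging rings down to the ship's draft. A step larger
    than the field-of-view length would leave unimaged horizontal bands
    (an 'image fault'), so it is rejected."""
    if step > fov_length:
        raise ValueError("depth step exceeds field-of-view length")
    depths, centre = [], fov_length / 2
    while centre - fov_length / 2 < draft:   # ring's top edge above keel depth
        depths.append(centre)
        centre += step
    return depths
```

With step equal to the field-of-view length the rings abut exactly; any smaller step produces the deliberate overlap used later for alignment.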
Further, in step S232, each adjustment of the submergence depth is smaller than the longitudinal length of the field of view, and the difference between the longitudinal length of the field of view and the depth adjustment is taken as the adjustment length D, as shown in fig. 4.
When combining the outer surface images at different depth positions, the images at adjacent depth positions (i.e., two adjacent turns of images) are overlapped by one adjustment length, thereby restoring the overall image of the underwater portion of the hull.
Because each depth adjustment is smaller than the longitudinal length of the field of view, the edges of two adjacent turns of images contain repeated content, and the longitudinal length of that repeated content exactly matches the adjustment length. During overlapping, the repeated content is used to align the two turns, so that each turn of images is registered in the transverse direction and the accuracy of the restored overall image is ensured.
After overlapping, the overlapped portion of two adjacent turns consists of two image layers, one of which may be kept.
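The overlap-and-keep-one-layer step might be sketched as follows. This is an illustrative NumPy fragment (array layout and names are assumptions), treating each turn as an image whose rows run along the longitudinal direction.

```python
import numpy as np

def combine_turns(upper: np.ndarray, lower: np.ndarray, overlap_rows: int) -> np.ndarray:
    """Stack two adjacent turns of imagery that share `overlap_rows` rows
    of repeated content; keep the upper turn's copy of the repeated rows
    and drop the lower turn's copy."""
    if upper.shape[1:] != lower.shape[1:]:
        raise ValueError("adjacent turns must share width and channel count")
    return np.vstack([upper, lower[overlap_rows:]])
```

In a real pipeline the overlap rows would first be used to register the two turns transversely (e.g. by cross-correlation) before one copy is discarded.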
In this embodiment, the machine binocular vision-based underwater imaging method further defines a distance offset range, and the posture of the underwater binocular vision assembly 100 is regulated in real time while it moves along the hull in the preset posture:
if the distance to the hull detected from the first ranging point 110 deviates from the first preset value, the distance detected from the second ranging point 120 deviates from the second preset value, or the distance detected from the third ranging point 130 deviates from the third preset value, and any such deviation exceeds the distance offset range, the underwater binocular vision assembly 100 is readjusted to the preset posture.
It should be noted that the distance offset range may be flexibly set according to the actual situation.
In this way, the posture of the underwater binocular vision assembly 100 can be adjusted dynamically, so that it maintains a relatively stable posture during image data acquisition, which in turn ensures the continuity and uniformity of the image data.
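The real-time regulation rule above reduces to a small predicate over the three ranging distances. The function name and tolerance argument below are illustrative assumptions, not from the patent.

```python
def needs_reposition(measured, presets, offset_range):
    """True if any ranging distance (first/second/third ranging point)
    deviates from its preset value by more than the distance offset
    range, in which case the assembly is readjusted to the preset
    posture."""
    return any(abs(m - p) > offset_range for m, p in zip(measured, presets))
```

A control loop would call this on every sensor update and trigger a reposition maneuver when it returns True.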
It should be noted that, as an alternative: as shown in fig. 3, the first preset value, the second preset value, and the third preset value may be configured such that when the underwater binocular vision assembly 100 is used to capture an image of a plane X, the underwater binocular vision assembly 100 is in a preset posture, and the orientation y of the underwater binocular lens of the underwater binocular vision assembly 100 is perpendicular to the plane X.
The first ranging point 110 is disposed at the top of the underwater binocular vision assembly 100, and the second ranging point 120 and the third ranging point 130 are disposed at two sides of the underwater binocular vision assembly 100. The first preset direction a, the second preset direction b and the third preset direction c all point to the front of the underwater binocular vision assembly 100, and the first preset direction a is simultaneously biased to the upper side of the underwater binocular vision assembly 100, and the second preset direction b and the third preset direction c are simultaneously biased to the lower side of the underwater binocular vision assembly 100. The straight lines of the first preset direction a, the second preset direction b and the third preset direction c are in the same vertex.
When the underwater binocular vision assembly 100 is in the preset posture, the projection of the first preset direction a on the plane X is the point a, the projection of the second preset direction B on the plane X is the point B, and the projection of the third preset direction C on the plane X is the point C. Point A, B, C is the three vertices of an isosceles triangle and point B, C corresponds to the two vertices of the base of the isosceles triangle.
When the underwater binocular vision assembly 100 is in the preset posture, its field of view on the plane X is a rectangle, and as shown in fig. 5, the bottom side of the rectangle is parallel to the base of the isosceles triangle. A lens with a rectangular field of view may be used directly, or a rectangular area may be cropped from a non-rectangular field of view to serve as the image acquisition area; the invention is not limited thereto.
When the underwater binocular vision assembly 100 is in the preset attitude, on the plane X, the point a is located above the rectangular field of view and the point B, C is located below the rectangular field of view.
In this way, when the underwater binocular vision assembly 100 is used to collect the external surface image of the ship body, the projection of the first preset direction a on the surface of the ship body is located above the visual field, and the projection of the second preset direction b and the third preset direction c on the surface of the ship body is located below the visual field.
With this configuration of the first, second and third preset values, a better frontal image can be obtained while the clarity of the image is ensured.
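The projection geometry described above can be checked numerically. The direction vectors below are invented for illustration (the patent gives no numeric directions); the camera is placed at the origin and plane X is taken as z = 2.

```python
import math

def ray_plane_point(direction, plane_z):
    """Project a ray from the origin onto the plane z = plane_z."""
    dx, dy, dz = direction
    t = plane_z / dz
    return (dx * t, dy * t)

# Illustrative preset directions a, b, c sharing a vertex at the origin:
# a is biased upward; b and c are biased downward and to either side.
a = (0.0, 0.4, 1.0)
b = (-0.3, -0.4, 1.0)
c = (0.3, -0.4, 1.0)

A = ray_plane_point(a, 2.0)
B = ray_plane_point(b, 2.0)
C = ray_plane_point(c, 2.0)

# A, B, C form an isosceles triangle whose base B-C lies below apex A,
# as the preset posture requires.
assert math.isclose(math.dist(A, B), math.dist(A, C))
assert B[1] == C[1] < A[1]
```

Any symmetric choice of b and c about the vertical plane containing a yields the same isosceles relationship.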
Further, in performing the lateral motion capturing image, the profile data of the hull may be determined according to the lateral movement rate of the underwater binocular vision assembly 100, the actual distance from the hull detected from the first ranging point 110, the actual distance from the hull detected from the second ranging point 120, and the actual distance from the hull detected from the third ranging point 130.
Specifically, when the underwater binocular vision assembly 100 moves to the bow or stern, the curvature of the hull changes sharply (the hull turns). As shown in fig. 6 and 7, taking lateral movement along direction p as an example, when the underwater binocular vision assembly 100 is about to round the bow or stern, the distance to the hull detected from the third ranging point 130 changes significantly (suddenly) first, indicating that the stern has been reached; at this point the underwater binocular vision assembly 100 is controlled to move along the stern to complete the lateral turn.
It will be appreciated that, to keep image capture stable while turning, the underwater binocular vision assembly 100 may be controlled to follow the actual profile of the stern/bow of the hull, depending on the type of vessel: during the turn, the axial distance between the lens and the hull is kept constant (or within a set range) and image acquisition continues; after the turn is completed, i.e., once the stern/bow has been rounded, acquisition resumes in the preset posture. The invention is not limited thereto.
The underwater binocular vision assembly 100 may mark the image data acquired at this time when the actual profile of the stern/bow of the hull starts to turn, and may mark the image data acquired at the time of finishing the turning, and the two marks may indicate the position and the range of the stern/bow, that is, the profile feature points corresponding to the profile data of the hull.
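The "sudden change marks a contour feature point" logic might be sketched like this; the jump threshold and function name are assumptions for illustration, not the patent's implementation.

```python
def contour_marks(side_distances, jump_threshold):
    """Indices in a time series of third-ranging-point distances where the
    reading jumps abruptly, e.g. when the assembly reaches the stern/bow
    (start of the turn) and again when the turn is finished.  Pairs of
    such marks bracket the position and extent of the stern/bow."""
    return [i for i in range(1, len(side_distances))
            if abs(side_distances[i] - side_distances[i - 1]) > jump_threshold]
```

The returned indices can be attached to the image frames acquired at those instants, giving the contour feature points used in the mapping step.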
The step S3 comprises the following steps: the exterior surface image is mapped on the hull model according to the contour feature points and the submerging depth of the underwater binocular vision assembly 100 when the exterior surface image is acquired, so as to obtain a hull rendering model.
That is, according to the indication of the position and the range of the stern/bow (i.e. the indication of the outline feature points) and the submerging depth of the underwater binocular vision assembly 100 when the circle of the outer surface image is acquired, the outer surface image can be mapped on the hull model more accurately, the mapping accuracy is ensured, the mapping dislocation probability is reduced, and the actual appearance of each part of the hull can be reflected more accurately.
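One simple way to realize the depth-indexed part of this mapping is to assign each turn of imagery a vertical band on the hull model, with adjacent bands overlapping by the adjustment length. The sketch below is an assumed bookkeeping scheme for illustration, not the patent's implementation; transverse alignment would additionally use the contour feature points.

```python
def texture_band(turn_index, depth_step, fov_length):
    """Vertical (top, bottom) depth band that one turn's image covers on
    the hull model.  Adjacent bands overlap by fov_length - depth_step,
    i.e. the adjustment length."""
    top = turn_index * depth_step
    return (top, top + fov_length)
```

For example, with a field of view 2 units long and a depth step of 1, turn 0 covers depths (0, 2) and turn 1 covers (1, 3), overlapping by one unit.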
After the ship body rendering model is obtained through the ship body underwater imaging method based on the machine binocular vision, the actual condition of the outer surface of the ship body can be known through the ship body rendering model, so that the condition that the underwater part of the ship body is attached by parasitic organisms can be intuitively known, more reasonable cleaning and maintenance measures can be conveniently taken at more proper time, and the pertinence of ship body maintenance is greatly improved.
Example 2
The present embodiment provides an underwater robot for performing the ship body underwater imaging method based on machine binocular vision provided in embodiment 1.
The underwater robot includes: a robot body (not shown in the figures) and an underwater binocular vision assembly 100.
The underwater binocular vision assembly 100 is mounted on the robot body. The underwater robot achieves posture adjustment and motion control using the power system of the robot body; control commands may be stored in the robot body or issued remotely, so that the underwater binocular vision assembly 100 can perform image acquisition in the preset posture.
It will be appreciated that, in order to facilitate timely processing of the image data, a processing unit may be disposed in a cabin/control room of the ship, an image data transmitting unit may be disposed in the robot body, the image data collected by the underwater binocular vision assembly 100 may be transmitted to the processing unit by the image data transmitting unit, and the external surface image may be obtained after processing by the processing unit. Further, mapping the outer surface image on the hull model by combining a three-dimensional processing means to obtain a hull rendering model.
In the present embodiment, the first ranging point 110, the second ranging point 120, and the third ranging point 130 are disposed on the outer surface of the robot body. The first ranging point 110 is disposed at the top of the robot body, and the second ranging point 120 and the third ranging point 130 are disposed at two sides of the robot body and symmetrically.
The first ranging point 110, the second ranging point 120, and the third ranging point 130 may use underwater laser ranging to achieve distance measurement in the corresponding direction, but are not limited thereto, and a person skilled in the art may flexibly select other ranging modes and means according to practical situations.
In general, the underwater robot can feed back the condition of the ship bottom in a timely and accurate manner, making it easy to grasp the health of the ship bottom directly. This directly guides hull maintenance work, avoids blind maintenance, supports more reasonable management of maintenance, and helps reduce resource waste.
Referring to figs. 8-10, to improve the stability of the underwater robot during image acquisition and so ensure stable image data, the underwater robot is provided with a stabilizing mechanism 1000. The stabilizing mechanism 1000 includes: a weight ball 200, a positioning shell 300, positioning assemblies 400, and a locking assembly;
the weight ball 200 is a sphere and has a hollow structure, and the inner cavity of the weight ball 200 is also a sphere and is in concentric arrangement with the outer shell of the weight ball 200. Inside the weight ball 200 is provided a partition 210 dividing the inner space thereof equally into two hemispherical cavities, one of which is filled with weight 220.
The positioning shell 300 is a spherical shell. Each positioning assembly 400 is arranged along the radial direction of the positioning shell 300 and penetrates the shell wall, extending into the interior of the positioning shell 300, with a ball 410 provided in the positioning assembly 400; a plurality of positioning assemblies 400 are distributed in an array over the surface (i.e., the spherical surface) of the positioning shell 300. In this embodiment, the positioning assemblies 400 are distributed in a spherical array over the surface of the positioning shell 300 and are spaced apart from one another.
The weight ball 200 is accommodated in the positioning case 300, and the balls 410 at the inner end of the positioning assembly 400 are attached to the weight ball 200, and the weight ball 200 is universally rotatably fitted in the positioning case 300 through the positioning assembly 400.
The locking assembly cooperates with the positioning assembly 400, which is capable of locking the ball 410, thereby locking the weighted ball 200.
Specifically, the positioning assembly 400 further includes: a reference column 420, a first control ring 430, a drive gear 440, a second control ring 450, a core 460, a pressure sensor 470, and a spacer 480.
The reference column 420 is cylindrical, the reference column 420 penetrates through the shell wall of the positioning shell 300 and is arranged along the radial direction of the positioning shell 300, a mounting groove 421 for mounting the ball 410 is formed in the inner end face of the reference column 420, a mounting blind hole 422 is formed in the bottom of the mounting groove 421, and the mounting blind hole 422 extends along the axial direction of the reference column 420. The core 460 is slidably engaged with the mounting blind bore 422.
The pressure sensor 470 is arranged at one end part of the core body 460 close to the mounting groove 421, the cushion block 480 is arranged at one side of the pressure sensor 470 away from the core body 460, and a matching part for matching with the ball 410 is arranged at one side of the cushion block 480 away from the pressure sensor 470. The core 460 has external threads, the second control ring 450 has internal threads, and the second control ring 450 is sleeved on the core 460 and is in threaded engagement with the core 460.
The side wall of the outer end of the reference column 420 is provided with a matching notch 423, which communicates with the mounting blind hole 422. The transmission gear 440 is mounted in the matching notch 423, and the first control ring 430 is rotatably sleeved on the reference column 420. The first control ring 430 has an inner gear ring, the second control ring 450 has an outer gear ring, and both the first control ring 430 and the second control ring 450 mesh with the transmission gear 440.
Wherein, along the circumference of the reference column 420, the core 460 is fixedly matched with the mounting blind hole 422. Along the axial direction of the reference column 420, the second control ring 450 is fixedly engaged with the mounting blind hole 422, and the first control ring 430 is fixedly engaged with the reference column 420.
The locking assembly is in driving engagement with the first control ring 430; by driving the first control ring 430, and through it the second control ring 450, the locking assembly moves the core 460 toward the side where the ball 410 is located, so that the spacer 480 presses against and locks the ball 410, thereby locking the weight ball 200.
In this embodiment, the locking assembly includes: collar 510, connecting rod 520, drive ring 530, and driver (not shown).
A cavity for mounting the stabilizing mechanism is reserved in the robot body. Connecting shafts 310 are fixedly connected to two opposite sides of the positioning shell 300; the connecting shafts 310 on the two sides are coaxial, and their common axis passes through the center of the sphere corresponding to the positioning shell 300. The positioning shell 300 is connected to the inner wall of the cavity of the robot body through the connecting shafts 310 on both sides, and the connecting shafts 310 are fixedly connected to the inner wall of the cavity.
Each connecting shaft 310 is rotatably sleeved with a collar 510, the connecting rod 520 is arc-shaped, and the connecting rod 520 is fixedly connected between the two collars 510 and is positioned outside the positioning shell 300. The driving rings 530 are plural, and the driving rings 530 are disposed coaxially with the connecting shaft 310, and the driving rings 530 are disposed at intervals along the axial direction of the connecting shaft 310. The driving rings 530 are all disposed outside the positioning case 300, and the driving rings 530 are fixedly connected to the connecting rod 520.
Wherein, the first control rings 430 each have an outer gear ring, the driving rings 530 each have an inner gear ring, the reference columns 420 of the positioning assembly 400 are distributed along the driving rings 530, and the first control rings 430 are engaged with the driving rings 530.
The driver is in driving connection with the collar 510.
The driver can drive the collar 510 to rotate, so that the driving ring 530 rotates, and the driving ring 530 can drive the first control ring 430 to rotate, so that locking and unlocking of the balls 410 are realized.
With the above design, the locking and unlocking of the balls 410 can be controlled by controlling the forward rotation and the reverse rotation of the driver for a certain number of turns. The pressure sensor 470 is used to monitor the pressure between the core 460 and the pad 480 and feed back the pressure data to the processing unit.
In actual use, when the underwater robot begins to acquire outer surface images of the underwater portion of the hull, the ball 410 is first unlocked. While the posture of the underwater robot is adjusted to bring the underwater binocular vision assembly 100 into the preset posture, the weight ball 200, being free to rotate, keeps its center of gravity at the lowest position. Once the underwater binocular vision assembly 100 is in the preset posture, the balls 410 are locked, and the weight ball 200 is locked with them. Because of the weight ball 200, the underwater robot is held in the preset posture more stably.
Once the ball 410 is locked, the pressure of each pressure sensor 470 stabilizes, and the pressure sensor 470 may then be used to monitor pressure changes during image acquisition. To ensure that the pressure changes correspond to the orientations, each pressure sensor 470 may be numbered to distinguish from each other and the position of each numbered pressure sensor 470 on the positioning shell 300 is marked. In this embodiment, when the balls 410 are locked, the pushing force applied by the second control ring 450 of each positioning assembly 400 to the core 460 toward the side where the balls 410 are located is the same; the thrust exerted actively by the second control ring 450 of each positioning assembly 400 on the core 460 toward the side of the ball 410 is also the same when the ball 410 is unlocked.
After the underwater binocular vision assembly 100 is in the preset posture, the ball 410 and the counterweight ball 200 are locked, and the pressure sensor 470 with the largest bearing pressure is positioned at the lowest position (also bears the gravity of the counterweight ball 200), so that the actual posture of the underwater robot in water can be judged through the pressure of each pressure sensor 470, and the more accurate posture control of the underwater robot is ensured.
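Since the sensor bearing the greatest pressure sits lowest on the positioning shell (it also carries the gravity of the weight ball), the robot's actual attitude can be inferred from the numbered sensor readings. A minimal sketch, with an invented sensor numbering:

```python
def lowest_numbered_sensor(readings):
    """Return the number of the pressure sensor 470 with the largest
    reading; its marked position on the positioning shell 300 indicates
    which direction of the robot currently points down."""
    return max(readings, key=readings.get)
```

Looking up that sensor's marked position on the shell then gives the robot's orientation in water for finer posture control.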
During image acquisition, the underwater robot moves smoothly and the stress on each pressure sensor 470 is relatively steady. If the stress on a pressure sensor 470 changes unexpectedly and at least one of the first ranging point 110, the second ranging point 120 and the third ranging point 130 detects a distance deviating from its preset value, the underwater robot may have been accidentally impacted and deviated from the preset posture.
With reference to fig. 3, suppose the assembly is moving in direction p, the stress on each pressure sensor 470 is steady, and the underwater robot continuously maintains the preset posture. If the distance detected by the third ranging point 130 suddenly decreases, followed by decreases in the distances detected by the first ranging point 110 and the second ranging point 120, and the water flow is relatively calm, the thickness of fouling at this location has likely increased abruptly; this may be marked synchronously in the image.
Under the same conditions, if the distance detected by the third ranging point 130 suddenly becomes very large and the water flow is relatively calm, this indicates that the corner of the bow/stern has been reached.
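The two interpretations above (sudden decrease suggests thickened fouling; sudden large increase suggests a bow/stern corner) can be combined with the pressure-stability check into one heuristic. Thresholds, labels and the function name below are illustrative assumptions.

```python
def interpret_side_distance_change(pressure_stable, calm_water, delta, jump=0.5):
    """Heuristic reading of a change `delta` in the third ranging point's
    distance while moving laterally in the preset posture."""
    if not pressure_stable:
        return "possible impact, recheck posture"
    if not calm_water:
        return "inconclusive, water disturbance"
    if delta <= -jump:
        return "fouling thickened, mark image"
    if delta >= jump:
        return "bow/stern corner, begin steering"
    return "normal"
```

In practice the jump threshold would be tuned against the vessel's hull curvature and the expected fouling thickness.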
The above can be judged by combining the pressure change of the pressure sensor 470, the detection distances of the first ranging point 110, the second ranging point 120 and the third ranging point 130, the water flow/sea surface condition and the like, so that the underwater gesture of the underwater robot can be controlled more accurately, and the definition and the accuracy of the image can be further ensured.
In summary, the machine binocular vision-based hull underwater imaging method and the underwater robot provided by the embodiments of the invention can feed back the ship bottom condition in a timely and accurate manner, making it easy to grasp the health of the ship bottom directly. They directly guide hull maintenance work, avoid blind maintenance, support more reasonable management of maintenance, and help reduce resource waste.
The above description is only of the preferred embodiments of the present invention and is not intended to limit the present invention, but various modifications and variations can be made to the present invention by those skilled in the art. Any modification, equivalent replacement, improvement, etc. made within the spirit and principle of the present invention should be included in the protection scope of the present invention.

Claims (5)

1. The ship body underwater imaging method based on machine binocular vision is characterized by comprising the following steps of:
s1: obtaining a hull model;
s2: acquiring an outer surface image of an underwater portion of the hull using the underwater binocular vision assembly;
s3: mapping is carried out on the hull model by utilizing the obtained outer surface image, so as to obtain a hull rendering model;
the step S2 includes the steps of:
s21: a first ranging point, a second ranging point and a third ranging point are arranged around the underwater binocular vision component, the first ranging point ranges in a first preset direction, the second ranging point ranges in a second preset direction, and the third ranging point ranges in a third preset direction;
s22: controlling the underwater binocular vision assembly to be close to the ship body, so that the distance between the underwater binocular vision assembly and the ship body detected from the first ranging point is a first preset value, the distance between the underwater binocular vision assembly and the ship body detected from the second ranging point is a second preset value, and the distance between the underwater binocular vision assembly and the ship body detected from the third ranging point is a third preset value, and the underwater binocular vision assembly is in a preset posture;
S23: controlling the underwater binocular vision component to move along the ship body under the preset gesture so as to acquire external surface images of different positions of the underwater part of the ship body;
the step S23 includes the steps of:
s231: controlling the underwater binocular vision component to transversely move along the ship body and move around the ship body for a circle under the preset gesture, so as to acquire an outer surface image of the ship body at the same depth position;
s232: determining the visual field range of the underwater binocular vision assembly on the surface of the ship body under the preset gesture, and adjusting the submerging depth of the underwater binocular vision assembly, wherein the adjustment amount of the submerging depth each time is smaller than or equal to the longitudinal length of the visual field range;
s233: repeatedly executing the step S231 and the step S232 to acquire all outer surface images of the underwater part of the ship body;
in the step S232, each adjustment of the submergence depth is smaller than the longitudinal length of the field of view, and the difference between the longitudinal length of the field of view and the adjustment amount of the submergence depth is taken as the adjustment length;
when combining the outer surface images at different depth positions, the outer surface images at adjacent depth positions are overlapped by one adjustment length, and one layer of the overlapped portion is retained;
Determining contour data of the ship body according to the transverse moving speed of the underwater binocular vision component, the distance between the underwater binocular vision component and the ship body detected from the first ranging point, the distance between the underwater binocular vision component and the ship body detected from the second ranging point and the distance between the underwater binocular vision component and the ship body detected from the third ranging point, and marking contour feature points matched with the contour data in the acquired outer surface images;
the step S3 includes: and mapping the outer surface image on the hull model according to the outline characteristic points and the submerging depth of the underwater binocular vision component when the outer surface image is acquired, so as to obtain the hull rendering model.
2. The ship body underwater imaging method based on machine binocular vision according to claim 1, wherein a distance offset range is set, and the underwater binocular vision component regulates and controls the posture of the underwater binocular vision component in real time in the process of moving along the ship body under the preset posture;
and if the distance to the hull detected from the first ranging point deviates from the first preset value, the distance to the hull detected from the second ranging point deviates from the second preset value, or the distance to the hull detected from the third ranging point deviates from the third preset value, and the deviation exceeds the distance offset range, the underwater binocular vision component is readjusted to the preset posture.
3. The underwater imaging method of a ship based on machine binocular vision according to claim 2, wherein the first ranging point is arranged at the top of the underwater binocular vision assembly, and the second ranging point and the third ranging point are respectively arranged at two sides of the underwater binocular vision assembly; the first preset direction, the second preset direction and the third preset direction are all directed to the front of the underwater binocular vision assembly, the first preset direction is simultaneously directed to the upper side of the underwater binocular vision assembly, and the second preset direction and the third preset direction are simultaneously directed to the lower side of the underwater binocular vision assembly; the straight lines of the first preset direction, the second preset direction and the third preset direction are in the same vertex;
when the underwater binocular vision assembly is in the preset attitude, the projection of the first preset direction on the surface of the ship body is located above the visual field range, and the projection of the second preset direction and the third preset direction on the surface of the ship body is located below the visual field range.
4. An underwater robot for performing the machine binocular vision-based underwater imaging method of a ship body as claimed in any one of claims 2 to 3, comprising: a robot body and the underwater binocular vision assembly; the underwater binocular vision assembly is arranged on the robot body; the outer surface of the robot body is provided with the first ranging point, the second ranging point and the third ranging point;
The underwater robot further includes: a stabilizing mechanism; the stabilizing mechanism includes: the counterweight ball, the positioning shell, the positioning assembly and the locking assembly;
the counterweight ball is of a hollow structure, a partition plate which divides the internal space of the counterweight ball into two hemispherical cavities is arranged in the counterweight ball, and one hemispherical cavity is filled with counterweight materials;
the positioning shell is spherical, the positioning assembly is arranged along the radial direction of the positioning shell and penetrates through the shell wall of the positioning shell, the positioning assembly extends to the inside of the positioning shell, and balls are arranged in the positioning assembly; the positioning components are distributed in an array along the surface of the positioning shell;
the counterweight ball is accommodated in the positioning shell, the balls at the inner end of the positioning assembly are attached to the counterweight ball, and the counterweight ball is universally and rotatably matched in the positioning shell through the positioning assembly;
the locking assembly cooperates with the positioning assembly, and the locking assembly is capable of locking the ball, thereby locking the weighted ball.
5. The underwater robot of claim 4, wherein the positioning assembly further comprises: the device comprises a reference column, a first control ring, a transmission gear, a second control ring, a core body, a pressure sensor and a cushion block;
The reference column penetrates through the shell wall of the positioning shell and is arranged along the radial direction of the positioning shell, and an installation groove for installing the ball is formed in the end face of the inner end of the reference column; a mounting blind hole is formed in the bottom of the mounting groove, and extends along the axial direction of the reference column; the core body is slidably matched with the mounting blind hole;
the pressure sensor is arranged at one end part of the core body, which is close to the mounting groove, the cushion block is arranged at one side of the pressure sensor, which is far away from the core body, and a matching part for matching with the ball is arranged at one side of the cushion block, which is far away from the pressure sensor; the core body is provided with external threads, the second control ring is provided with internal threads, and the second control ring is sleeved on the core body and is matched with the core body in a threaded manner;
a matching notch is formed in the side wall of the outer end of the reference column, and the matching notch communicates with the mounting blind hole; the transmission gear is mounted in the matching notch, and the first control ring is rotatably sleeved on the reference column; the first control ring is provided with an inner gear ring, the second control ring is provided with an outer gear ring, and both the first control ring and the second control ring are meshed with the transmission gear;
The core body is fixedly matched with the mounting blind hole along the circumferential direction of the reference column; the second control ring is fixedly matched with the mounting blind hole along the axial direction of the reference column;
the locking component is in transmission fit with the first control ring, and the locking component can enable the cushion block to tightly prop against and lock the balls through driving the first control ring, so that the counterweight ball is locked.
CN202310496786.XA 2023-05-05 2023-05-05 Hull underwater imaging method based on machine binocular vision and underwater robot Active CN116206070B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310496786.XA CN116206070B (en) 2023-05-05 2023-05-05 Hull underwater imaging method based on machine binocular vision and underwater robot

Publications (2)

Publication Number Publication Date
CN116206070A (en) 2023-06-02
CN116206070B (en) 2023-07-21

Family

ID=86508066

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310496786.XA Active CN116206070B (en) 2023-05-05 2023-05-05 Hull underwater imaging method based on machine binocular vision and underwater robot

Country Status (1)

Country Link
CN (1) CN116206070B (en)

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115393806A (en) * 2022-09-20 2022-11-25 青岛华兴海洋工程技术有限公司 Ship body posture monitoring system and method based on visual technology

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5187871A (en) * 1992-01-27 1993-02-23 Mcdermott Damien Underwater navigation device
CN108177143B (en) * 2017-12-05 2021-08-10 上海工程技术大学 Robot positioning and grabbing method and system based on laser vision guidance
CN111090103B (en) * 2019-12-25 2021-03-02 河海大学 Three-dimensional imaging device and method for dynamically and finely detecting underwater small target
CN113034399A (en) * 2021-04-01 2021-06-25 江苏科技大学 Binocular vision based autonomous underwater robot recovery and guide pseudo light source removing method
CN113222961A (en) * 2021-05-27 2021-08-06 大连海事大学 Intelligent ship body detection system and method
CN114562941A (en) * 2022-03-18 2022-05-31 上汽通用五菱汽车股份有限公司 System and method for accurately measuring relative wide-area machine vision images
CN115187565A (en) * 2022-07-20 2022-10-14 东南大学 Underwater pier disease identification and positioning method and device, electronic equipment and storage medium


Also Published As

Publication number Publication date
CN116206070A (en) 2023-06-02

Similar Documents

Publication Publication Date Title
US10429845B2 (en) System and method for controlling a position of a marine vessel near an object
CN105667745A (en) Autonomous underwater vehicle and control method thereof
US10011337B2 (en) Water drone
JP6659071B2 (en) System and method for controlling the position of a ship near an object
CN107065898A (en) A kind of unmanned boat navigation control method and system under water
JP6763073B2 (en) Methods and systems for controlling low-speed propulsion of ships
CN107526087A (en) A kind of method and system for obtaining underwater 3D faultage images
CN116206070B (en) Hull underwater imaging method based on machine binocular vision and underwater robot
US20220017235A1 (en) Autonomous landing systems and methods for vertical landing aircraft
CN107241533A (en) A kind of battle array scanning laser imaging device and method under water
CN111038671A (en) Submarine three-dimensional terrain surveying and mapping unmanned underwater vehicle
EP3699080B1 (en) Trolling motor with local and remote control modes
CN210310793U (en) Carry on relay communication equipment's high accuracy and avoid striking unmanned ship
CN106477008B (en) A kind of streamlined AUTONOMOUS TASK underwater robot platform of three bodies
CN115019412A (en) Underwater AUV (autonomous underwater vehicle) submarine cable inspection system and method based on multiple sensors
Brown et al. An overview of autonomous underwater vehicle research and testbed at PeRL
CN111176328B (en) Multi-AUV distributed target trapping control method based on under-information
Choi et al. Autonomous towed vehicle for underwater inspection in a port area
KR101956472B1 (en) Structure inspection apparatus and system for inspecting ballast tank
KR102169998B1 (en) articulated marine robot ship with stereo photo sensor
Caccia Vision-based linear motion estimation for unmanned underwater vehicles
CN219121445U (en) Monitoring system suitable for shallow water area salvages boats and ships
CN114035591B (en) Motion switching control method of underwater variable-curvature wall surface motion robot
CN215794338U (en) Unmanned ship for wharf detection
CN115629392B (en) Underwater ranging device and ranging method for underwater robot

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant