CN117346792B - Positioning method for underwater robot in ocean engineering environment - Google Patents

Positioning method for underwater robot in ocean engineering environment Download PDF

Info

Publication number
CN117346792B
Authority
CN
China
Prior art keywords
target
underwater
cooperative target
initial
underwater robot
Prior art date
Legal status
Active
Application number
CN202311643883.3A
Other languages
Chinese (zh)
Other versions
CN117346792A (en)
Inventor
张德津
何莉
张伟
周宝定
董可
马正鑫
Current Assignee
Shenzhen University
Original Assignee
Shenzhen University
Priority date
Filing date
Publication date
Application filed by Shenzhen University filed Critical Shenzhen University
Priority to CN202311643883.3A priority Critical patent/CN117346792B/en
Publication of CN117346792A publication Critical patent/CN117346792A/en
Application granted granted Critical
Publication of CN117346792B publication Critical patent/CN117346792B/en

Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01C: MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00: Navigation; Navigational instruments not provided for in groups G01C1/00-G01C19/00
    • G01C21/20: Instruments for performing navigational calculations
    • G01C21/203: Specially adapted for sailing ships
    • G01C21/005: Navigation; Navigational instruments not provided for in groups G01C1/00-G01C19/00 with correlation of navigation data from several sources, e.g. map or contour matching
    • Y: GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02: TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02A: TECHNOLOGIES FOR ADAPTATION TO CLIMATE CHANGE
    • Y02A90/00: Technologies having an indirect contribution to adaptation to climate change
    • Y02A90/30: Assessment of water resources

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Automation & Control Theory (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

The application provides a positioning method for an underwater robot in an ocean engineering environment, comprising the following steps: acquiring an initial position of the underwater robot based on satellite and underwater acoustic positioning systems; determining, in a set of coded cooperative targets, an initial coded cooperative target corresponding to the initial position, wherein the set comprises coded cooperative targets arranged in advance on an underwater structure; determining an instantaneous accurate position and an instantaneous accurate posture of the underwater robot by underwater photogrammetry based on the initial coded cooperative target; determining, based on a working path starting from the initial coded cooperative target, the next coded cooperative target to which the underwater robot travels after leaving the initial coded cooperative target as the target coded cooperative target; and performing positioning and navigation in combination with an inertial navigation system according to the instantaneous accurate position and posture, thereby achieving long-endurance underwater positioning in the ocean engineering environment.

Description

Positioning method for underwater robot in ocean engineering environment
Technical Field
The application relates to the technical field of underwater robots, and in particular to a positioning method for an underwater robot in an ocean engineering environment.
Background
Typical ocean engineering projects include submarine immersed-tube tunnel construction, port and dock construction, offshore wind farm construction, and the like. Take immersed-tube tunnel construction, which has the most demanding positioning requirements, as an example: the tube elements are geometrically large, the construction environment is complex, and construction strongly disturbs the surrounding water. To meet the high-precision positioning requirement, divers must work underwater continuously, which carries a high safety risk. Replacing divers with underwater robots for underwater observation and operation is therefore a clear trend.
Accurate positioning is a precondition for normal operation of an underwater robot. Currently, underwater robots are often equipped with inertial navigation systems for positioning. However, the error of an inertial navigation system diverges with navigation time: the longer the navigation time, the larger the error, making it difficult for the underwater robot to maintain high-precision positioning.
Disclosure of Invention
The embodiments of the application aim to provide a positioning method for an underwater robot in an ocean engineering environment that extends the time over which an underwater robot carrying an inertial navigation system maintains its accuracy, enabling sustained high-precision positioning.
To solve the above technical problem, an embodiment of the application provides a positioning method for an underwater robot in an ocean engineering environment, comprising: acquiring an initial position of the underwater robot based on satellite and underwater acoustic positioning systems; determining, in a set of coded cooperative targets, an initial coded cooperative target corresponding to the initial position, wherein the set comprises coded cooperative targets arranged in advance on an underwater structure; determining an instantaneous accurate position and an instantaneous accurate posture of the underwater robot by underwater photogrammetry based on the initial coded cooperative target; determining, based on a working path starting from the initial coded cooperative target, the next coded cooperative target to which the underwater robot travels after leaving the initial coded cooperative target as the target coded cooperative target; and performing positioning and navigation in combination with an inertial navigation system according to the instantaneous accurate position, the instantaneous accurate posture, and the working path.
In the embodiments of the application, coded cooperative targets are arranged on the underwater structure and underwater photogrammetry is used to obtain the instantaneous accurate position and posture of the underwater robot from those targets. Combining this instantaneous accurate position and posture with the robot's working path yields a long-endurance, high-precision positioning method for underwater robots built on an inertial navigation system.
Drawings
To illustrate the embodiments of the application or the technical solutions in the prior art more clearly, the drawings required for describing the embodiments or the prior art are briefly introduced below. Obviously, the drawings in the following description show only some embodiments of the application; other drawings can be derived from them by a person skilled in the art without inventive effort.
Fig. 1 shows a schematic flow chart of a positioning method of an underwater robot in an ocean engineering environment according to an embodiment of the present application;
fig. 2 shows a schematic diagram of a positioning method of an underwater robot in a marine engineering environment according to an embodiment of the present application;
fig. 3 is another flow diagram of a positioning method of an underwater robot in an ocean engineering environment according to an embodiment of the present application;
fig. 4 is a schematic flow chart of another positioning method of an underwater robot in a marine engineering environment according to an embodiment of the present application;
FIG. 5 shows another schematic diagram of a method for positioning an underwater robot in an ocean engineering environment according to an embodiment of the present application;
fig. 6 shows a schematic structural diagram of an underwater robot positioning device in an ocean engineering environment according to an embodiment of the present application.
Detailed Description
As described above, underwater robots are generally equipped with an inertial navigation system of a certain performance level, aided by above-water satellite positioning, underwater acoustic positioning, vision, and other sensors; this multi-source combined positioning approach is widely applied in underwater observation and operation. However, underwater acoustic positioning is prone to multipath effects, which degrade its accuracy. Moreover, disturbance of the water body during engineering construction changes the local water quality, so underwater visibility is poor, imaging distance and quality are greatly reduced, and visual matching accuracy suffers. In such an engineering environment, multi-source combined positioning struggles to meet the high-precision positioning requirement of the underwater robot. Therefore, acquiring more reliable and more accurate positioning data at specific points, and providing corrective information to inertial navigation when needed, remains the most feasible way to position the underwater robot. On this basis, the application provides a positioning method for an underwater robot in an ocean engineering environment.
For a better understanding of the technical solutions in the application, the technical solutions in the embodiments are described clearly and completely below with reference to the drawings. Obviously, the described embodiments are only some, not all, of the embodiments of the application. All other embodiments obtained by a person of ordinary skill in the art based on these embodiments without inventive effort fall within the scope of protection of the application.
Fig. 1 shows a schematic flow chart of a positioning method of an underwater robot in an ocean engineering environment according to an embodiment of the present application. As shown, the method may include the following steps.
Step S110: based on the satellite and the underwater sound positioning system, the initial position of the underwater robot is obtained.
For example, the initial position of the underwater robot is obtained by multi-source combined positioning based on a global navigation satellite system (GNSS) and an underwater acoustic positioning system. The initial position serves as a coarse position of the underwater robot. The satellite system provides a high-precision position in the geodetic coordinate system; in particular, centimetre-level positioning can be achieved with differential techniques. Combining satellite and underwater acoustic measurements yields the position of the underwater robot. Underwater acoustic positioning includes long-baseline, short-baseline, and ultra-short-baseline systems, among others.
The inertial navigation system (built on inertial sensors) navigates from the initial position obtained by the satellite and underwater acoustic systems. Specifically, starting from this initial position, it measures acceleration and angular velocity and integrates each over the navigation time to propagate the navigation solution. The initial position may be expressed as the coordinates of the underwater robot in the geodetic coordinate system, e.g. G(x0, y0, z0). A position obtained this way is affected by navigation time: the longer the navigation time, the larger its error.
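For illustration, the dead-reckoning propagation described above can be sketched as follows. This is a simplified planar model with hypothetical function names, not the patent's implementation; a real INS performs full 3-D strapdown integration with gravity and Earth-rate compensation.

```python
import math

def dead_reckon(x0, y0, heading0, samples, dt):
    """Planar dead reckoning: integrate body-frame forward acceleration and
    yaw rate from an initial position and heading.

    samples: list of (forward_accel [m/s^2], yaw_rate [rad/s]) tuples.
    Returns (x, y, heading). Errors grow with every integration step, which
    is why the drift diverges with navigation time.
    """
    x, y, heading, v = x0, y0, heading0, 0.0
    for accel, yaw_rate in samples:
        v += accel * dt                  # acceleration -> speed
        heading += yaw_rate * dt         # angular rate -> heading
        x += v * math.cos(heading) * dt  # speed -> position
        y += v * math.sin(heading) * dt
    return x, y, heading
```

Any bias in the accelerometer or gyro samples is integrated along with the signal, which is the mechanism behind the time-dependent error the passage describes.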
Step S120: in the set of encoded cooperative targets, an initial encoded cooperative target corresponding to the initial position is determined.
The initial coded cooperative target satisfies a positional correspondence with the initial position. For example, the initial coded cooperative target lies within a preset range of the initial position, or its distance from the initial position satisfies a predetermined relationship or threshold.
The set of coded cooperative targets comprises coded cooperative targets (cooperative targets for short) arranged in advance on underwater structures. An underwater structure is a construction with a fixed underwater position, such as an already-laid tunnel immersed tube or a bridge pier. The cooperative targets are arranged at readily observable locations on the underwater structure, such as on the surface of an immersed tube, and their locations on the structure are known. Since the position of the underwater structure is fixed and the position of each cooperative target on it is known, the position of each cooperative target can be determined; for example, its coordinates in the geodetic coordinate system can be expressed as Ci(xi, yi, zi).
Through this step, the coded cooperative target used to acquire the accurate position of the underwater robot can be determined.
Step S130: based on the initial coded cooperative targets, determining the instantaneous accurate position and the instantaneous accurate posture of the underwater robot by utilizing an underwater photogrammetry technology.
Underwater photogrammetry photographically measures underwater objects to determine their shape, size, position, properties, and so on. The device for underwater photogrammetry, such as a vision sensor, may be located above or under water; this is not limited here.
With the position of the cooperative target known, the position and posture of the underwater robot, i.e. the instantaneous accurate position and the instantaneous accurate posture, can be computed by photographing the cooperative target.
Alternatively, depending on measurement requirements, only one of the instantaneous accurate position and the instantaneous accurate posture may be determined based on the initial coded cooperative target.
Step S140: and determining that the underwater robot leaves the initial coding cooperative target and goes to the next coding cooperative target as a target coding cooperative target based on a working path formed by the initial coding cooperative target.
The working path may include the order in which the underwater robot passes through the plurality of coded cooperative targets; from this order, the target coded cooperative target can be determined.
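The order relation can be captured in a few lines. `next_target` is a hypothetical helper shown only to illustrate how the target coded cooperative target follows from the path order:

```python
def next_target(path, current):
    """Return the coded cooperative target that follows `current` in the
    working path, or None when `current` is the last node of the path."""
    i = path.index(current)
    return path[i + 1] if i + 1 < len(path) else None
```

For example, on the path ["C1", "C2", "C3"], leaving "C1" makes "C2" the target coded cooperative target.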
Step S150: and according to the instantaneous accurate position, the instantaneous accurate gesture and the working path, positioning and navigation are carried out by combining an inertial navigation system.
The initial position can be corrected or replaced based on the instantaneous accurate position and posture, and the inertial navigation system then navigates along the working path using them. Even when the initial position is not accurate enough, positioning from the instantaneous accurate position and posture allows the underwater robot to maintain high positioning accuracy. In other words, the positioning accuracy of the underwater robot is no longer degraded by navigation time.
In the embodiments of the application, the initial position of the underwater robot is obtained by satellite and underwater acoustic positioning; an initial coded cooperative target corresponding to the initial position is determined in a set of coded cooperative targets arranged in advance on an underwater structure; the instantaneous accurate position and posture of the underwater robot are determined by underwater photogrammetry based on the initial coded cooperative target; the next coded cooperative target on the working path is determined as the target coded cooperative target; and positioning and navigation are performed according to the instantaneous accurate position, the instantaneous accurate posture, and the working path. The photogrammetrically determined position and posture thus assist the inertial navigation system, realising a long-endurance, high-precision positioning method for underwater robots built on inertial navigation.
In one possible implementation, step S130 includes: performing photogrammetry on the initial coded cooperative target through a vision sensor of the underwater robot to determine the pose relationship of the vision sensor relative to the initial coded cooperative target; and determining the instantaneous accurate position and posture of the underwater robot based on the position of the initial coded cooperative target, the pose of the initial coded cooperative target, and that pose relationship.
The vision sensor is mounted on the underwater robot and is used to photograph the cooperative targets. It is a camera or video camera suitable for underwater use; there may be one or more vision sensors, which is not limited here. The cooperative targets may be passive retro-reflective targets or active light-emitting targets, and provide position and posture information. In an ocean engineering construction environment the water is turbid and rich in plankton, so underwater optical imaging quality is poor: with a vision sensor alone, imaging is weak and the visual information it provides is of low accuracy. Pairing reflective or light-emitting cooperative targets with the vision sensor for joint measurement makes it possible to obtain accurate positioning information for the underwater robot, such as its position and posture, even in a construction environment.
Optionally, the cooperative targets may be deployed in consideration of the features of the underwater structure, the features of the structure's sensing field, and the control performance of the underwater robot. In addition, after the targets are laid on the underwater structure, the position and posture of each target in the geodetic coordinate system can be obtained by conventional engineering survey methods.
Determining the pose relationship of the vision sensor relative to the cooperative target by photogrammetric space resection may include, for example, determining it from the conversion relationships between the camera coordinate system, the target coordinate system, and the geodetic coordinate system, together with the position of the underwater robot in the target coordinate system.
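The coordinate-system chain just described (camera frame, target frame, geodetic frame) amounts to composing homogeneous transforms. A minimal sketch, using hypothetical translation-only poses for brevity (a real pose also carries a rotation block):

```python
def matmul4(a, b):
    """Compose two 4x4 homogeneous transforms given as row-major nested lists."""
    return [[sum(a[i][k] * b[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]

def translation(tx, ty, tz):
    """Pure-translation homogeneous transform (identity rotation)."""
    return [[1.0, 0.0, 0.0, tx],
            [0.0, 1.0, 0.0, ty],
            [0.0, 0.0, 1.0, tz],
            [0.0, 0.0, 0.0, 1.0]]

# T_geo_target: surveyed pose of the coded cooperative target in the geodetic
# frame. T_target_cam: camera pose in the target frame, as recovered by space
# resection on the target's reference points. Composing them gives the camera
# (and hence robot) pose in the geodetic frame. Both matrices are hypothetical.
T_geo_target = translation(100.0, 50.0, -20.0)
T_target_cam = translation(1.0, 0.0, 2.0)
T_geo_cam = matmul4(T_geo_target, T_target_cam)
```

The last column of `T_geo_cam` is then the instantaneous accurate position; with full rotation blocks, its upper-left 3x3 part would encode the instantaneous accurate posture.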
Optionally, the cooperative target may be an acoustic cooperative target, matched with an acoustic sensor for underwater measurement.
With reference to fig. 2, in one possible implementation, the coded cooperative target 200 includes at least three reference points 201 for the vision sensor. Step S130 further includes: determining a reference pose relationship of the vision sensor relative to each reference point 201; and determining the instantaneous accurate position and posture of the underwater robot based on the position of each reference point 201, the pose of each reference point 201, and the corresponding reference pose relationship.
A reference point 201 is an observable physical point arranged on the cooperative target; for example, it may be a retro-reflective point or a light-emitting point. The reference points 201 may serve as a pattern of coded information. The number and arrangement of reference points 201 on each coded cooperative target 200 may be designed according to measurement requirements. Optionally, different coded cooperative targets 200 differ in the number and/or arrangement of their reference points 201, so that the targets can be distinguished by their reference points.
Fig. 3 is a schematic flow chart of another positioning method of an underwater robot in an ocean engineering environment according to an embodiment of the present application. As shown, the method may include the following steps.
Step S310: based on the satellite and the underwater sound positioning system, the initial position of the underwater robot is obtained.
Step S321: and determining one of a plurality of coding cooperative targets as the initial coding cooperative target according to the distance between each cooperative target in the coding cooperative target set and the initial position.
Optionally, the cooperative target closest to the initial position is determined to be the initial coded cooperative target, restricted to cooperative targets the underwater robot has not yet passed.
For example, suppose coded cooperative target A is 5 metres from the initial position, target B is 3 metres away, and target C is 1 metre away. If the underwater robot has already passed target C but not targets A and B, target B is determined to be the initial coded cooperative target.
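This selection rule (nearest not-yet-passed target) can be expressed directly; the target names and coordinates below are hypothetical, mirroring the example above:

```python
import math

def pick_initial_target(initial_pos, targets, passed):
    """Choose the nearest coded cooperative target the robot has not passed.

    initial_pos: (x, y, z) coarse position from satellite/acoustic positioning.
    targets:     mapping name -> (x, y, z) in the geodetic frame (surveyed).
    passed:      set of target names the robot has already visited.
    """
    candidates = {n: p for n, p in targets.items() if n not in passed}
    # math.dist computes the Euclidean distance (Python 3.8+).
    return min(candidates, key=lambda n: math.dist(initial_pos, candidates[n]))
```

With C already passed and A, B at 5 m and 3 m respectively, the function returns B, matching the example in the text.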
Step S322: and determining the working path of the underwater robot according to the preset working path of the underwater robot and the initial coding cooperative target.
The preset working path is a working path planned for the underwater robot in advance. Take the determined initial coded cooperative target in the geodetic coordinate system as Ci(xi, yi, zi). The working path of the underwater robot with Ci(xi, yi, zi) as a key node, also called the actual working path, may be denoted P{Ci, Ci+1, …, Cn, …, C1}, i = 1, 2, …, n, where Ci is the initial coded cooperative target and serves as the first cooperative target of the actual working path. Optionally, in combination with actual job requirements, the order of the cooperative targets Ci(xi, yi, zi) is used to determine the actual working path.
Optionally, the preset working path can be used to determine the order in which the different cooperative targets are passed, e.g. Ci, Ci+1, …, Cn; the actual working path derived from it therefore determines this order as well. That is, from the initial cooperative target, the next cooperative target to which the underwater robot travels after leaving it can be determined as the target cooperative target. For example, from the determined initial cooperative target Ci, the target cooperative target Ci+1 can be determined.
Optionally, after the underwater robot reaches cooperative target Ci+1, the initial cooperative target Ci is marked as "passed". Repeating steps S321-S322 above, the next cooperative target to which the underwater robot travels after leaving the current one, e.g. cooperative target Ci+2, can be determined.
In this way, the underwater robot obtains an accurate position and posture each time it reaches a coded cooperative target. The inertial navigation time never exceeds the travel time from one cooperative target to the next, so the inertial navigation error stays bounded between two coded cooperative targets, ensuring long-endurance, high-precision navigation.
Step S340: and determining that the underwater robot leaves the initial coding cooperative target and goes to the next coding cooperative target as a target coding cooperative target.
Steps S310 and S340 and the subsequent steps may follow the descriptions of the corresponding steps in the previous embodiment and achieve the same or corresponding technical effects; repeated portions are not described again here.
In the embodiments of the application, the actual working path of the underwater robot is generated with the coded cooperative targets as key nodes. As the underwater robot works underwater along this path, its current accurate position can be obtained at each pre-arranged coded cooperative target, continuously supplying accurate positions to inertial navigation and thereby maintaining the positioning accuracy of the underwater robot over long navigation times.
Fig. 4 shows another flow diagram of a positioning method of an underwater robot in an ocean engineering environment according to an embodiment of the present application. As shown, the method may include the following steps.
Step S410: based on the satellite and the underwater sound positioning system, the initial position of the underwater robot is obtained.
Step S420: an initial coded cooperative target corresponding to the initial position is determined.
Steps S410 and S420 may follow the descriptions of the corresponding steps in the foregoing embodiments and achieve the same or corresponding technical effects; repeated portions are not described again here.
Step S431: and obtaining the structural information of the underwater structure.
Structural information of the underwater structure is acquired by the underwater robot while it travels to the initial coded cooperative target and/or the target coded cooperative target.
Optionally, satellite and underwater acoustic positioning are used to guide the underwater robot to the vicinity of the corresponding cooperative target. Because underwater photogrammetry is strongly affected by water conditions, bringing the robot closer to the cooperative target further improves the accuracy of the pose determined from it. For example, the underwater robot is controlled to travel from the initial position to the initial coded cooperative target, and then from the initial coded cooperative target to the target coded cooperative target.
The structural information of the underwater structure is acquired through the various sensors carried by the underwater robot. It characterises the appearance of the underwater structure, including, for example, its contour and shape. The standard structural information is, for example, a pre-stored standard structural model of the underwater structure. Because the underwater structure is man-made, it has a standard geometric model, and its absolute coordinates in the geodetic coordinate system can be obtained; the initial position lies in the same coordinate system as the position of the underwater structure. The standard structural information may also include pose information for each cooperative target on the underwater structure.
Step S432: and determining the real-time position of the underwater robot.
When the structural information is successfully matched with the pre-stored standard structural information of the underwater structure, for example when the contour represented by the structural information matches the contour represented by the standard structural information, the real-time position of the underwater robot is determined from the standard structural information.
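A toy version of such a structural match is sketched below. The mean point-to-point deviation criterion and the tolerance value are illustrative assumptions (the patent does not fix a matching metric), and the outlines are assumed to be sampled at corresponding points in the same frame:

```python
import math

def contour_matches(measured, standard, tol=0.05):
    """Crude structural match between a measured outline and the stored
    standard-model outline.

    measured/standard: equal-length lists of (x, y) outline samples in the
    same coordinate frame. Returns True when the mean point-to-point
    deviation is within `tol` (metres). Both the metric and `tol` are
    hypothetical choices for illustration only.
    """
    err = sum(math.dist(p, q) for p, q in zip(measured, standard)) / len(measured)
    return err <= tol
```

A match confirms the robot is observing the expected part of the structure, so the model's surveyed coordinates can be used as the real-time position; a failed match triggers the fallback behaviour described below.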
A real-time position determined in this way further improves positional accuracy on top of the instantaneous accurate position and posture, and thus further improves long-duration navigation accuracy.
In one possible implementation, if matching of the structural information against the standard structural information fails, the underwater robot is controlled to return to the last cooperative target it has passed, or to end the job.
Optionally, while the underwater robot travels from the initial coded cooperative target to the target coded cooperative target, if the appearance of the underwater structure represented by the structural information differs from that in the standard structural information, the underwater robot is controlled to return to the initial coded cooperative target.
Alternatively, while the underwater robot travels from the initial position to the initial coded cooperative target, if the appearance of the underwater structure represented by the structural information differs from that in the standard structural information, the underwater robot is instructed to end the underwater job, since there is no previously passed cooperative target on the working path to return to.
Step S440: and controlling the inertial navigation system to perform positioning navigation according to the instantaneous accurate position, the instantaneous accurate gesture, the working path and the real-time position.
On the basis of the instantaneous accurate position, the instantaneous accurate posture and the operation path, combining the real-time position acquired from the structural information can further improve the position accuracy, and thereby the long-duration navigation accuracy.
In the embodiment of the application, in the process that the underwater robot travels to the initial/target coding cooperative target, the real-time position is obtained by matching against the standard structural information of the underwater structure, so that the navigation and positioning accuracy of the underwater robot can be maintained over long durations.
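As a rough illustration of how such a matched real-time position could rein in accumulated inertial drift, the sketch below applies a plain reset-style correction. The patent does not prescribe a particular filter; the function and its weighting scheme are assumptions:

```python
def correct_drift(ins_position, matched_position, weight=1.0):
    """Blend the inertial (dead-reckoned) position with the position
    recovered from structural matching.

    weight=1.0 fully resets the inertial estimate to the matched
    position; smaller weights apply only a partial correction, as a
    simple filter might.
    """
    return tuple(p + weight * (m - p)
                 for p, m in zip(ins_position, matched_position))

# Dead reckoning drifted to (10.4, 5.2, -3.1); structural matching
# places the robot at (10.0, 5.0, -3.0): reset the estimate.
corrected = correct_drift((10.4, 5.2, -3.1), (10.0, 5.0, -3.0), weight=1.0)
```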
In one possible implementation, step S431 includes at least one of: acquiring visual image information for representing the structure of the underwater structure through a visual sensor of the underwater robot; and acquiring sonar image information for representing the structure of the underwater structure through an acoustic sensor of the underwater robot.
Sonar is suitable for acquiring three-dimensional structures such as underwater topography and structures at large scale, with positioning performed by feature matching; vision can acquire images or point clouds, with feature-matching positioning performed by extracting features of the measured object. The acquired visual and sonar images can be matched against the structural information, or matched by other image-matching methods.
Due to scattering and suspended matter in the water, underwater optical imaging is of poor quality and unclear, so common point matching is often difficult to apply; sonar images have low resolution and low precision due to multipath effects caused by the structure, and are likewise difficult to point-match. However, the features of the underwater structure and their positions are known, and the robot acquires images during motion; adopting a structural matching mode, such as contour matching, lowers the requirements on image quality, so that effective matching can still be performed.
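A minimal sketch of the contour-matching idea: a normalized centroid-distance signature is scale-invariant and tolerant of moderate point noise, which is one reason structural matching can cope with low-quality underwater images. The descriptor, tolerance and test shapes below are illustrative assumptions, not the patent's method:

```python
import math

def signature(contour, bins=16):
    """Normalized centroid-distance signature of a closed contour
    given as a list of (x, y) points. Dividing by the mean distance
    makes the signature scale-invariant."""
    cx = sum(x for x, _ in contour) / len(contour)
    cy = sum(y for _, y in contour) / len(contour)
    d = [math.hypot(x - cx, y - cy) for x, y in contour]
    mean = sum(d) / len(d)
    step = len(d) / bins
    return [d[int(i * step)] / mean for i in range(bins)]

def contours_match(measured, standard, tol=0.15):
    """Compare a measured contour against a pre-stored standard one."""
    a, b = signature(measured), signature(standard)
    err = sum(abs(u - v) for u, v in zip(a, b)) / len(a)
    return err < tol

# A square measured at twice the scale still matches the standard
# square, while an elongated rectangle does not.
square = [(0, 0), (1, 0), (2, 0), (2, 1), (2, 2), (1, 2), (0, 2), (0, 1)]
big_square = [(2 * x, 2 * y) for x, y in square]
rect = [(0, 0), (3, 0), (6, 0), (6, 1), (6, 2), (3, 2), (0, 2), (0, 1)]
```

Real systems would more likely use moment-based descriptors or model-based matching on the sonar and optical images.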
Fig. 5 shows a schematic diagram of a positioning method of an underwater robot in a marine engineering environment according to an embodiment of the present application. The method comprises the following steps:
(1) Target information of all preset coding cooperative targets on the underwater structure is obtained, wherein the target information comprises, but is not limited to, target positions, target postures, target numbers and the like.
(2) GNSS and underwater acoustic technology are combined to obtain the initial position coordinates g(x0, y0, z0) of the underwater robot in the geodetic coordinate system; the accuracy of the initial position is related to the accuracy of the underwater acoustic positioning.
(3) According to the initial position coordinates g(x0, y0, z0), the initial coding cooperative target Ci(xi, yi, zi) 501 is calculated.
(4) According to the preset operation path of the underwater robot, the actual operation path of the underwater robot with the initial coding cooperative target 501 as a key node is determined and denoted p{Ci, Ci+1, …, Cn, …, C1}, where i = 1, 2, …, n, and Ci is the initial coding cooperative target 501.
(5) In combination with the actual operation requirements, a path from g(x0, y0, z0) to Ci(xi, yi, zi) is planned, and the underwater robot is guided to Ci(xi, yi, zi) by the satellite and underwater acoustic positioning systems.
(6) The underwater robot performs joint measurement with the initial coding cooperative target 501 in a photogrammetric manner to obtain the instantaneous accurate position and the instantaneous accurate posture of the underwater robot, so as to provide more accurate pose information for an inertial navigation system.
(7) The underwater robot obtains an initial coding cooperative target 501 and a target coding cooperative target 502 from the path set p, calculates a motion path, a motion direction and the like, and starts navigation operation.
(8) Optionally, in the process of going from the initial coding cooperative target 501 to the target coding cooperative target 502, acquiring the real-time position of the underwater robot again through a multi-source positioning method, and extracting local structural information of the underwater structure near the real-time position from a standard structural model of the underwater structure; simultaneously, a visual image and a sonar image of the underwater structure at the real-time position are obtained through a visual sensor and an acoustic sensor carried by the underwater robot; and comparing the visual image and the sonar image with the local structure information, and if the matching is successful, determining the real-time position according to the standard model, and further correcting the inertial navigation error.
(9) After the underwater robot reaches the target coding cooperative target 502, the initial coding cooperative target 501 is marked as passed, and steps (6) to (9) are repeated.
(10) Repeating the steps (1) to (9) as required to realize long-endurance and high-precision positioning of the underwater robot in the ocean engineering environment.
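Steps (1) to (10) can be condensed into a high-level control loop. The sketch below is illustrative Python under assumed data structures (a dict of known target positions and an ordered path of target numbers); the photogrammetric and inertial sub-steps are reduced to comments:

```python
def navigate(initial_position, targets, path):
    """Walk the operation path target by target, re-anchoring the
    position estimate at each coded cooperative target.

    targets: dict target-number -> known (x, y, z) target position
    path:    ordered list of target numbers, i.e. p{Ci, ..., Cn}
    """
    visited = []
    position = initial_position          # from GNSS + underwater acoustics
    for number in path:
        # (6) photogrammetric joint measurement at the current target:
        # the robot's estimate is reset to the target's precise position.
        position = targets[number]
        # (7)-(8) inertial navigation toward the next target, corrected
        # by structural matching, would run between iterations here.
        visited.append(number)           # (9) mark the target as passed
    return position, visited

demo_targets = {1: (0.0, 0.0, -10.0), 2: (5.0, 0.0, -10.0), 3: (5.0, 5.0, -10.0)}
final_position, visited = navigate((0.3, -0.2, -9.8), demo_targets, [1, 2, 3])
```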
In the embodiment of the application, coding cooperative targets are arranged on the structure, and the accurate position and posture information they provide is used to assist inertial navigation; in cooperation with methods such as satellite positioning, underwater acoustic positioning, and acoustic and optical image-matching positioning, a high-precision positioning framework for underwater robots in the ocean engineering environment can be constructed. This makes full use of the advantages of the man-made environment in ocean engineering construction and forms a long-endurance, high-precision positioning method for underwater robots with inertial navigation at its core.
Fig. 6 shows a schematic structural diagram of an underwater robot positioning device in an ocean engineering environment according to an embodiment of the present application, where the device 600 includes: an acquisition module 610, a first determination module 620, a second determination module 630, a third determination module 640, and a positioning module 650.
The acquisition module 610 is configured to acquire an initial position of the underwater robot based on the satellite and underwater acoustic positioning systems; the first determining module 620 is configured to determine an initial encoding cooperative target corresponding to the initial position in an encoding cooperative target set, where the encoding cooperative target set includes encoding cooperative targets preset on an underwater structure; the second determining module 630 is configured to determine, based on the initial encoding cooperative target, an instantaneous accurate position and an instantaneous accurate posture of the underwater robot using an underwater photogrammetry technique; the third determining module 640 is configured to determine, based on a working path formed from the initial encoding cooperative target, the next encoding cooperative target to which the underwater robot goes after leaving the initial encoding cooperative target as a target encoding cooperative target; the positioning module 650 is configured to perform positioning and navigation according to the instantaneous accurate position, the instantaneous accurate posture and the working path in combination with an inertial navigation system.
In one possible implementation, the second determining module 630 is configured to determine, by using a vision sensor of the underwater robot, a pose relationship of the vision sensor with respect to the initial encoding cooperative target, by performing photogrammetry on the initial encoding cooperative target; determining the instantaneous precise position and the instantaneous precise pose of the underwater robot based on the target position of the initial coded cooperative target, the target pose of the initial coded cooperative target, and the pose relationship.
In one possible implementation, the encoded collaborative target includes at least 3 reference points for the vision sensor, and the second determination module 630 is configured to determine a reference pose relationship of the vision sensor with respect to each of the reference points; determining the instantaneous accurate position and the instantaneous accurate posture of the underwater robot based on the reference point position of each reference point, the reference point posture of each reference point and the corresponding reference pose relationship.
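As a simplified planar illustration of how the instantaneous accurate pose follows from the target's known pose and the measured pose relationship, the sketch below composes a 2-D rigid transform. The patent's measurement is full 3-D photogrammetry; the function, frames and numbers here are illustrative assumptions:

```python
import math

def compose_pose(target_pose, relative_pose):
    """target_pose:   (x, y, heading) of the coded cooperative target
                      in the geodetic frame.
    relative_pose: (dx, dy, dtheta) of the robot in the target's
                   frame, as measured by the vision sensor.
    Returns the robot's absolute (x, y, heading)."""
    tx, ty, th = target_pose
    dx, dy, dth = relative_pose
    # Rotate the relative offset into the geodetic frame, then translate.
    x = tx + dx * math.cos(th) - dy * math.sin(th)
    y = ty + dx * math.sin(th) + dy * math.cos(th)
    return (x, y, th + dth)

# Target at (10, 20) with heading 90 deg; the robot is measured 2 m
# along the target's forward axis, with no relative rotation.
pose = compose_pose((10.0, 20.0, math.pi / 2), (2.0, 0.0, 0.0))
```

In three dimensions the same composition would use rotation matrices or quaternions obtained from photogrammetric resection against the reference points.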
In one possible implementation, the number of the encoding cooperative targets is not less than 2, and the first determining module 620 is configured to determine, according to a distance between each of the cooperative targets in the set of encoding cooperative targets and the initial position, one of the plurality of encoding cooperative targets as the initial encoding cooperative target, where the initial encoding cooperative target is an encoding cooperative target that the underwater robot does not pass through.
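The selection rule above amounts to a nearest-unvisited-target query, which can be sketched as follows (the data layout and names are assumptions):

```python
import math

def pick_initial_target(initial_position, targets, passed):
    """Choose the nearest coded cooperative target the robot has not
    yet passed.

    targets: dict target-number -> (x, y, z) position
    passed:  set of target numbers already passed
    """
    candidates = {n: p for n, p in targets.items() if n not in passed}
    return min(candidates,
               key=lambda n: math.dist(initial_position, candidates[n]))

site_targets = {1: (0.0, 0.0, -10.0), 2: (50.0, 0.0, -10.0), 3: (5.0, 5.0, -12.0)}
```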
In one possible implementation, the third determining module 640 is configured to determine the working path of the underwater robot according to a preset working path of the underwater robot and the initial encoding cooperation target.
In a possible implementation manner, the apparatus 600 is further configured to acquire, through the underwater robot, structural information of the underwater structure during the travel of the underwater robot to the initial encoding cooperative target and/or the target encoding cooperative target; match the structural information with pre-stored standard structural information of the underwater structure; and determine the real-time position of the underwater robot according to the standard structural information in the case that the structural information is successfully matched with the standard structural information.
In one possible implementation, the apparatus 600 is configured to obtain, by a vision sensor of the underwater robot, a vision image for representing a structure of the underwater structure; and acquiring a sonar image for representing the structure of the underwater structure through an acoustic sensor of the underwater robot.
In one possible implementation, the apparatus 600 is configured to control the underwater robot to return to the last encoded cooperative target that has been passed, or to end the operation, in the case that the matching of the structural information with the standard structural information fails.
In one possible implementation, the positioning module 650 is configured to control the inertial navigation system to perform positioning navigation according to the instantaneous accurate position, the instantaneous accurate posture, the working path, and the real-time position.
The apparatus 600 provided in this embodiment of the present application may perform the methods described in the foregoing method embodiments, and implement the functions and beneficial effects of the methods described in the foregoing method embodiments, which are not described herein again.
The foregoing description is only of the preferred embodiments of the present application and is not intended to limit the scope of the present application. Any modification, equivalent replacement, improvement, etc. made within the spirit and principles of the present application should be included in the protection scope of the present application.
It should also be noted that the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising one … …" does not exclude the presence of other like elements in a process, method, article or apparatus that comprises the element.
In this specification, each embodiment is described in a progressive manner, and identical and similar parts of each embodiment are all referred to each other, and each embodiment mainly describes differences from other embodiments. In particular, for system embodiments, since they are substantially similar to method embodiments, the description is relatively simple, as relevant to see a section of the description of method embodiments.

Claims (6)

1. The method for positioning the underwater robot in the ocean engineering environment is characterized by comprising the following steps of:
acquiring an initial position of the underwater robot based on a satellite and an underwater sound positioning system;
determining an initial coding cooperative target corresponding to the initial position in a coding cooperative target set, wherein the coding cooperative target set comprises coding cooperative targets preset on an underwater structure;
determining an instantaneous accurate position and an instantaneous accurate posture of the underwater robot by utilizing an underwater photogrammetry technology based on the initial coding cooperative target;
determining that the underwater robot leaves the initial coding cooperative target and goes to the next coding cooperative target as a target coding cooperative target based on a working path formed by the initial coding cooperative target;
according to the instantaneous accurate position, the instantaneous accurate posture and the operation path, positioning navigation is performed in combination with an inertial navigation system;
after determining an initial coding cooperative target corresponding to the initial position or determining that the underwater robot leaves the initial coding cooperative target and goes to a next coding cooperative target as a target coding cooperative target, the method further comprises the following steps:
acquiring structural information of the underwater structure through the underwater robot in the process that the underwater robot goes to the initial coding cooperative target or the target coding cooperative target, wherein the structural information is used for representing the outline of the underwater structure and/or the shape of the underwater structure;
matching the structural information with pre-stored standard structural information of the underwater structure;
under the condition that the structural information is successfully matched with the standard structural information, determining the real-time position of the underwater robot according to the standard structural information;
under the condition that the matching of the structural information with the standard structural information fails, controlling the underwater robot to return to the last coded cooperative target that has been passed, or to end the operation;
the obtaining, by the underwater robot, structural information of the underwater structure includes at least one of:
acquiring visual image information for representing the structure of the underwater structure through a visual sensor of the underwater robot;
and acquiring sonar image information for representing the structure of the underwater structure through an acoustic sensor of the underwater robot.
2. The method of claim 1, wherein the determining the instantaneous precise position and instantaneous precise pose of the underwater robot using underwater photogrammetry techniques based on the initial coded cooperative targets comprises:
performing photogrammetry on the initial coding cooperative target through a visual sensor of the underwater robot, and determining the pose relation of the visual sensor relative to the initial coding cooperative target;
determining the instantaneous precise position and the instantaneous precise pose of the underwater robot based on the target position of the initial coded cooperative target, the target pose of the initial coded cooperative target, and the pose relationship.
3. The method of claim 2, wherein the encoded collaborative target comprises at least 3 reference points for the vision sensor,
the determining the pose relationship of the vision sensor relative to the initially encoded cooperative target comprises:
determining a reference pose relationship of the vision sensor relative to each reference point;
the determining the instantaneous precise position and the instantaneous precise pose of the underwater robot based on the target position of the initial encoding cooperative target, the target pose of the initial encoding cooperative target, and the pose relationship comprises:
determining the instantaneous accurate position and the instantaneous accurate posture of the underwater robot based on the reference point position of each reference point, the reference point posture of each reference point and the corresponding reference pose relationship.
4. The method of claim 1, wherein the number of encoded cooperative targets is not less than 2, wherein the determining an initial encoded cooperative target corresponding to the initial position in the set of encoded cooperative targets comprises:
and determining one of a plurality of coding cooperative targets as an initial coding cooperative target according to the distance between each coding cooperative target in the coding cooperative target set and the initial position, wherein the initial coding cooperative target is a coding cooperative target which is not passed by the underwater robot.
5. The method of claim 4, wherein the determining, based on the working path formed by the initial encoding cooperative target, that the underwater robot leaves the initial encoding cooperative target and goes to the next encoding cooperative target as a target encoding cooperative target further comprises:
and determining the working path of the underwater robot according to the preset working path of the underwater robot and the initial coding cooperative target.
6. The method of claim 1, wherein said locating navigation in conjunction with an inertial navigation system based on said instantaneous accurate position, said instantaneous accurate pose, and said job path comprises:
and controlling the inertial navigation system to perform positioning navigation according to the instantaneous accurate position, the instantaneous accurate posture, the working path and the real-time position.
CN202311643883.3A 2023-12-04 2023-12-04 Positioning method for underwater robot in ocean engineering environment Active CN117346792B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311643883.3A CN117346792B (en) 2023-12-04 2023-12-04 Positioning method for underwater robot in ocean engineering environment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202311643883.3A CN117346792B (en) 2023-12-04 2023-12-04 Positioning method for underwater robot in ocean engineering environment

Publications (2)

Publication Number Publication Date
CN117346792A CN117346792A (en) 2024-01-05
CN117346792B true CN117346792B (en) 2024-03-15

Family

ID=89363563

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202311643883.3A Active CN117346792B (en) 2023-12-04 2023-12-04 Positioning method for underwater robot in ocean engineering environment

Country Status (1)

Country Link
CN (1) CN117346792B (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
FR3080194A1 (en) * 2018-04-12 2019-10-18 Cgg Services Sas METHOD FOR GUIDING AN AUTONOMOUS SUBMARINE VEHICLE AND ASSOCIATED SYSTEM FOR ACQUIRING UNDERWATER ANALYSIS DATA
CN115574855A (en) * 2022-09-29 2023-01-06 深圳大学 Method for detecting underwater operation robot in butt joint state of immersed tube pipe joints
JP2023050230A (en) * 2021-09-30 2023-04-11 日本電気株式会社 Underwater Position Correction Device, Underwater Position Correction Method, and Underwater Position Correction Program

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
RU2599902C1 (en) * 2015-09-08 2016-10-20 Общество с ограниченной ответственностью "Лаборатория подводной связи и навигации" Method of navigating underwater objects and system for its implementation

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
FR3080194A1 (en) * 2018-04-12 2019-10-18 Cgg Services Sas METHOD FOR GUIDING AN AUTONOMOUS SUBMARINE VEHICLE AND ASSOCIATED SYSTEM FOR ACQUIRING UNDERWATER ANALYSIS DATA
JP2023050230A (en) * 2021-09-30 2023-04-11 日本電気株式会社 Underwater Position Correction Device, Underwater Position Correction Method, and Underwater Position Correction Program
CN115574855A (en) * 2022-09-29 2023-01-06 深圳大学 Method for detecting underwater operation robot in butt joint state of immersed tube pipe joints

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Dynamic precise engineering surveying technology and applications; Li Qingquan et al.; Acta Geodaetica et Cartographica Sinica; Vol. 50, No. 9; pp. 1147-1158 *

Also Published As

Publication number Publication date
CN117346792A (en) 2024-01-05

Similar Documents

Publication Publication Date Title
CN115077487B (en) Immersed tube butt joint measurement method and system for stay wire assisted photogrammetry
US7978128B2 (en) Land survey system
US8380375B2 (en) Device, computer storage medium, and computer implemented method for metrology with inertial navigation system and aiding
CN111829512B (en) AUV navigation positioning method and system based on multi-sensor data fusion
CN107063198A (en) A kind of boat-carrying Self-stabilization holder measuring system and application process
CN107966145B (en) AUV underwater navigation method based on sparse long baseline tight combination
CN111637888B (en) Tunneling machine positioning method and system based on inertial navigation and laser radar single-point distance measurement
KR101987655B1 (en) Unmanned boat for measuring underwater geographical feature
CN115574855B (en) Method for detecting underwater operation robot in immersed tube joint butt joint state
WO2024032663A1 (en) Underwater photogrammetry-based method for measurement during docking of immersed tube segments
CN107543497A (en) A kind of non-overlapped ken Binocular vision photogrammetry station coordinates correlating method
CN110133667A (en) Underwater 3 D detection system based on mobile Forward-Looking Sonar
CN110285834A (en) Double ionertial navigation system based on a dot position information quickly independently resets method
CN109632259A (en) The measuring device and method of water conservancy project physical experiments free sailing model ship vertical section deflection
CN109813510B (en) High-speed rail bridge vertical dynamic disturbance degree measuring method based on unmanned aerial vehicle
KR101763911B1 (en) Heading estimation apparatus of auv in severe magnetic disturbance environment and the method thereof
CN117008177B (en) Seabed control point three-dimensional coordinate calibration method based on integrated platform
CN117346792B (en) Positioning method for underwater robot in ocean engineering environment
AU2018226595B2 (en) Combined metrology method for computing distance, roll and pitch attitudes and relative orientations between two underwater points of interest
Zhang et al. An open-source, fiducial-based, underwater stereo visual-inertial localization method with refraction correction
CN114353802A (en) Robot three-dimensional space positioning method based on laser tracking
WO2020054500A1 (en) Submarine machine system and work method
CN111508005A (en) Unmanned ship overwater obstacle autonomous detection system based on binocular vision
CN115112154B (en) Calibration method of underwater autonomous navigation positioning system
CN116336981B (en) Underwater coarse positioning method and system for immersed tube joint

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant