CN116299374B - Sonar imaging underwater automatic calibration positioning method and system based on machine vision - Google Patents

Sonar imaging underwater automatic calibration positioning method and system based on machine vision Download PDF

Info

Publication number
CN116299374B
CN116299374B CN202310553779.9A
Authority
CN
China
Prior art keywords
pose
degree
image
robot
freedom
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202310553779.9A
Other languages
Chinese (zh)
Other versions
CN116299374A (en)
Inventor
袁亚飞 (Yuan Yafei)
金昂 (Jin Ang)
童立青 (Tong Liqing)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Suzhou Aise Technology Co., Ltd.
Original Assignee
Suzhou Aise Technology Co., Ltd.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Suzhou Aise Technology Co ltd filed Critical Suzhou Aise Technology Co ltd
Priority to CN202310553779.9A priority Critical patent/CN116299374B/en
Publication of CN116299374A publication Critical patent/CN116299374A/en
Application granted granted Critical
Publication of CN116299374B publication Critical patent/CN116299374B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/52Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S15/00
    • G01S7/52004Means for monitoring or calibrating
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02ATECHNOLOGIES FOR ADAPTATION TO CLIMATE CHANGE
    • Y02A90/00Technologies having an indirect contribution to adaptation to climate change
    • Y02A90/30Assessment of water resources

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Measurement Of Velocity Or Position Using Acoustic Or Ultrasonic Waves (AREA)

Abstract

The invention relates to the technical field of underwater positioning and discloses a machine-vision-based method and system for sonar imaging underwater automatic calibration positioning. The method comprises the following steps: building an underwater robot model, and acquiring an input acoustic image of the robot's multi-degree-of-freedom pose with an acoustic imaging sonar; simulating the underwater robot and its environment with a sonar simulation technology, repeatedly generating multi-degree-of-freedom pose simulation images of the robot and predicting multi-degree-of-freedom pose images of the robot from them; finding, among the pose images obtained through the repeated predictions, the one most similar to the input acoustic image of the multi-degree-of-freedom pose, and taking it as the initial multi-degree-of-freedom pose image; and measuring pose data of the robot with a sensor, and calibrating the initial multi-degree-of-freedom pose image according to the pose data to obtain the final pose of the robot. The invention improves the accuracy of underwater positioning and the working efficiency of underwater detection.

Description

Sonar imaging underwater automatic calibration positioning method and system based on machine vision
Technical Field
The invention relates to the technical field of underwater positioning, in particular to a sonar imaging underwater automatic calibration positioning method and system based on machine vision.
Background
Sonar imaging can serve as a key navigation sensing technology for underwater detection systems, continuously providing spatial information about the surrounding environment even when visibility is limited. However, acoustic imaging sonar measurements suffer from high spatial ambiguity and low resolution, so accurate navigation information cannot be obtained from them alone.
Underwater vehicles such as remotely operated vehicles are widely used in detection tasks such as subsea construction and inspection. However, operating an underwater detection system, and in particular calibrating its image detection, currently relies on repeated actual image acquisition and repeated manual calibration by operators. This is inconvenient and imprecise, limits the detection tasks, and introduces operational problems caused by human error.
Disclosure of Invention
Therefore, the invention aims to overcome the defects of the prior art by providing a machine-vision-based sonar imaging underwater automatic calibration positioning method and system, which improve both the accuracy of underwater calibration positioning and the working efficiency of underwater detection.
In order to solve the above technical problems, the invention provides a machine-vision-based sonar imaging underwater automatic calibration positioning method, which comprises the following steps:
building an underwater robot model, and acquiring an input acoustic image of the multi-degree-of-freedom pose of the robot by using an acoustic imaging sonar;
simulating the underwater robot and the environment in which it is located by using a sonar simulation technology, generating multi-degree-of-freedom pose simulation images of the robot multiple times, and predicting multi-degree-of-freedom pose images of the robot multiple times according to the pose simulation images;
finding, among the multi-degree-of-freedom pose images obtained through the multiple predictions, the image most similar to the input acoustic image of the multi-degree-of-freedom pose, and taking this most similar image as the initial multi-degree-of-freedom pose image;
and measuring pose data of the robot by using a sensor, and calibrating the initial multi-degree-of-freedom pose image according to the pose data to obtain the final pose of the robot.
In one embodiment of the invention, simulating the underwater robot and the environment in which it is located by using a sonar simulation technology and generating multi-degree-of-freedom pose simulation images of the robot multiple times specifically comprises:
simulating the underwater robot and the environment in which it is located by using a sonar simulation technology, sampling the multi-degree-of-freedom pose of the robot in the three-dimensional pose space multiple times within a bounded positioning area, and generating a simulated acoustic image for each sampled pose by using a sonar simulator, thereby obtaining multiple multi-degree-of-freedom pose simulation images.
In one embodiment of the invention, when the multi-degree-of-freedom pose of the robot in the three-dimensional pose space is sampled multiple times within the bounded positioning area, the sampled angles include the pitch direction and the detection view angle in the horizontal direction.
In one embodiment of the present invention, when predicting the multi-degree-of-freedom pose image of the robot multiple times according to the multi-degree-of-freedom pose simulation image, the prediction process of each time specifically includes: and taking the multi-degree-of-freedom pose simulation image as the input of a neural network, wherein the output of the neural network is the multi-degree-of-freedom pose image of the robot.
In one embodiment of the present invention, when the multi-degree-of-freedom pose image most similar to the input acoustic image of the multi-degree-of-freedom pose is found from the multi-degree-of-freedom pose images obtained by multiple predictions, a nearest neighbor search algorithm is used.
In one embodiment of the invention, when the multi-degree-of-freedom pose image most similar to the input acoustic image of the multi-degree-of-freedom pose is searched for among the multi-degree-of-freedom pose images obtained through the multiple predictions, the search combines an image similarity measure, MinHash, and locality-sensitive hashing.
In one embodiment of the invention, measuring pose data of the robot by using a sensor and calibrating the initial multi-degree-of-freedom pose image according to the pose data to obtain the final pose of the robot specifically comprises:
measuring real-time pose data of the robot by using an inertial measurement unit and a Doppler velocity log in an extended Kalman filter system, and calibrating the initial multi-degree-of-freedom pose image by using the real-time pose data to obtain the final pose.
In one embodiment of the present invention, when the initial multi-degree-of-freedom pose image is calibrated by using the real-time pose data to obtain the final pose, the criterion for ending the calibration is:
constructing a trigger calibration condition η_r = C(S_r(x), S_t(x)) / C(S_r(x), G(S_r(x))), wherein C(S_r(x), G(S_r(x))) is the correlation function calculated between the initial multi-degree-of-freedom pose image S_r(x) and the result G(S_r(x)) of smoothing S_r(x), and C(S_r(x), S_t(x)) is the correlation function calculated between the initial multi-degree-of-freedom pose image S_r(x) and the real-time pose data image S_t(x) acquired at time t;
when η_r is less than or equal to a preset threshold, stopping calibrating the initial multi-degree-of-freedom pose image by using the real-time pose data, thereby obtaining the final pose.
The invention also provides a sonar imaging underwater automatic calibration positioning system based on machine vision, which comprises an input acoustic image acquisition module, a sonar simulation module, a pose image pair generation module and a calibration module,
the input acoustic image acquisition module is used for building an underwater robot model and acquiring an input acoustic image of the multi-degree-of-freedom pose of the robot by using an acoustic imaging sonar;
the sonar simulation module simulates the underwater robot and the environment where the underwater robot is positioned by using a sonar simulation technology, generates multiple-degree-of-freedom pose simulation images of the robot for multiple times, and predicts multiple-degree-of-freedom pose images of the robot multiple times according to the multiple-degree-of-freedom pose simulation images;
the pose image pair generating module finds a multi-degree-of-freedom pose image which is most similar to the input acoustic image of the multi-degree-of-freedom pose from multi-degree-of-freedom pose images obtained through multiple predictions, and takes the most similar multi-degree-of-freedom pose image as an initial multi-degree-of-freedom pose image;
the calibration module measures pose data of the robot by using the sensor, and performs calibration on the initial multi-degree-of-freedom pose image according to the pose data to obtain the final pose of the robot.
Compared with the prior art, the technical scheme of the invention has the following advantages:
according to the invention, the simulation environment is built to simulate the actual acoustic image and predict the pose, the optimal predicted pose is found through the collected actual acoustic image on the basis, and the optimal predicted pose is calibrated by combining the real-time pose data collected by the sensor, so that the traditional manual calibration operation can be replaced, the accuracy of underwater positioning calibration and the working efficiency of underwater detection are improved, and the labor cost is reduced.
Drawings
In order that the invention may be more readily understood, a more particular description of the invention will be rendered by reference to specific embodiments thereof which are illustrated in the appended drawings, in which:
fig. 1 is a flow chart of the method of the present invention.
FIG. 2 is an exemplary diagram of a sonar simulation environment constructed in the present invention.
Fig. 3 is a schematic diagram of the pitch direction and the detection view angle in the horizontal direction considered in sonar imaging in the present invention.
Detailed Description
The present invention will be further described with reference to the accompanying drawings and specific examples, which are not intended to be limiting, so that those skilled in the art will better understand the invention and practice it.
Example 1
Referring to FIG. 1, the invention discloses a sonar imaging underwater automatic calibration positioning method based on machine vision, which comprises the following steps:
s1: constructing an underwater robot model shown in fig. 2, and acquiring an input acoustic image of the multi-degree-of-freedom pose of the robot by using an acoustic imaging sonar; and constructing an underwater model by using the visual sonar system, acquiring a pose image, and providing a large amount of basic data for a subsequent neural network. The multiple degrees of freedom in this embodiment is 6 degrees of freedom.
The underwater robot model is critical: its quality directly determines the accuracy of underwater positioning. The simulation environment should adequately describe the actual localization space; when the simulated image produced by the sonar simulator is exactly the same as the input acoustic image, the pose at which the simulated image was created can be assumed to be the location at which the actual acoustic image was obtained.
S2: simulating the underwater robot and the environment in which the underwater robot is positioned by using a sonar simulation technology, generating multiple-degree-of-freedom pose simulation images of the robot for multiple times, and predicting multiple-degree-of-freedom pose images of the robot multiple times according to the multiple-degree-of-freedom pose simulation images; and taking the multi-degree-of-freedom pose simulation image of the robot, which is generated by the sonar simulator for many times, as a database, fully extracting image characteristic information, and optimizing the model from a single image quality evaluation method to image quality evaluation of comparison of a plurality of images.
S2-1: simulating the underwater robot and the environment in which the underwater robot is positioned by using a sonar simulation technology, and generating multiple-degree-of-freedom pose simulation images of the robot for multiple times, wherein the multiple-degree-of-freedom pose simulation images specifically comprise: and simulating the underwater robot and the environment in which the underwater robot is positioned by using a sonar simulation technology, sampling the multi-degree-of-freedom pose of the robot on the three-dimensional pose space for multiple times in a bounded positioning area, and generating a simulated acoustic image for each pose obtained by sampling by using a sonar simulator to obtain multiple multi-degree-of-freedom pose simulated images. Each multi-degree-of-freedom pose simulation image comprises a simulated acoustic image and a pose label with 6 degrees of freedom.
When the multi-degree-of-freedom pose of the robot in the three-dimensional pose space is sampled multiple times within the bounded positioning area, the sampled angles include the pitch direction and the detection view angle in the horizontal direction; the specific detection view angle can be selected from the view angles shown in fig. 3.
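As an illustration of the sampling in S2-1, the sketch below draws uniform 6-degree-of-freedom poses from a bounded positioning area; the bounds, the sample count, and the `sonar_simulator.render` name in the final comment are assumptions for illustration, not values taken from the patent:

```python
import random

def sample_poses(bounds, n_samples, seed=0):
    """Uniformly sample 6-DOF poses inside a bounded positioning area.

    bounds: dict mapping each degree of freedom to a (low, high) interval.
    Returns a list of pose dicts, one per sample.
    """
    rng = random.Random(seed)
    return [
        {dof: rng.uniform(lo, hi) for dof, (lo, hi) in bounds.items()}
        for _ in range(n_samples)
    ]

# Illustrative bounds (all values assumed): a 10 m x 10 m x 5 m region,
# with pitch and yaw limited to the sonar's pitch direction and
# horizontal detection view angle.
BOUNDS = {
    "x": (0.0, 10.0), "y": (0.0, 10.0), "z": (-5.0, 0.0),
    "roll": (-0.1, 0.1), "pitch": (-0.35, 0.35), "yaw": (-3.14, 3.14),
}

poses = sample_poses(BOUNDS, n_samples=1000)
# Each sampled pose would then be fed to the sonar simulator to produce
# one (simulated acoustic image, 6-DOF pose label) pair, e.g.:
#   dataset = [(sonar_simulator.render(p), p) for p in poses]
```

Each resulting pose dict pairs with one simulated acoustic image, matching the "simulated acoustic image plus 6-degree-of-freedom pose label" structure described above.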
S2-2: when the multi-degree-of-freedom pose image of the robot is predicted for many times according to the multi-degree-of-freedom pose simulation image, the prediction process of each time is specifically as follows: and taking the multi-degree-of-freedom pose simulation image as the input of a neural network, wherein the output of the neural network is the multi-degree-of-freedom pose image of the robot. The neural network used may be a BP neural network. The multi-degree-of-freedom pose simulation image is a multi-degree-of-freedom pose image containing the organic robot, and correspondingly, the predicted multi-degree-of-freedom pose image of the robot is a pose image containing a multi-degree-of-freedom predicted value of the robot. Taking the multi-degree-of-freedom pose simulation image as the input of the BP neural network, outputting the multi-degree-of-freedom prediction probability of the robot through the BP neural network, and obtaining the multi-degree-of-freedom prediction value of the robot according to the multi-degree-of-freedom prediction probability; the predicted values of multiple degrees of freedom of the robot are combined to form a pose image containing the predicted values of multiple degrees of freedom, namely the predicted pose image of multiple degrees of freedom of the robot through the BP neural network.
S3: and finding a multi-degree-of-freedom pose image which is the most similar to the input acoustic image of the multi-degree-of-freedom pose from the multi-degree-of-freedom pose images obtained through multiple predictions, and taking the most similar multi-degree-of-freedom pose image as an initial multi-degree-of-freedom pose image.
When the multi-degree-of-freedom pose image most similar to the input acoustic image of the multi-degree-of-freedom pose is searched for among the multi-degree-of-freedom pose images obtained through the multiple predictions, a nearest neighbor search (NNS) algorithm is used.
In this embodiment, an image containing a 6-degree-of-freedom pose image is taken as a pose image pair. When searching the multi-degree-of-freedom pose images obtained through the multiple predictions for the one most similar to the input acoustic image of the multi-degree-of-freedom pose, a large number of image pose pairs exist, and without data compression the search would require a large amount of computation. Therefore, in this embodiment the search for the most similar pose image combines an image similarity measure, MinHash, and locality-sensitive hashing (LSH) to improve the search speed. Selecting the image pose pair is an online process: the multi-degree-of-freedom pose images obtained through the multiple predictions serve as a query library, the image pose pair most similar to the input acoustic image is found in this library, and its pose label is then returned.
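The retrieval described above can be sketched as follows, with MinHash signatures over sets of image feature tokens and banded locality-sensitive hashing to narrow the candidate set before an exact Jaccard comparison; the signature length and band layout are assumed parameters, and the feature-token extraction itself is left abstract:

```python
import random
from collections import defaultdict

NUM_HASHES, BANDS = 32, 8          # 8 bands of 4 rows each (assumed)
ROWS = NUM_HASHES // BANDS
_rng = random.Random(0)
_SALTS = [_rng.getrandbits(32) for _ in range(NUM_HASHES)]

def minhash(tokens):
    """MinHash signature of a set of image feature tokens: one minimum
    per salted hash function, approximating Jaccard similarity."""
    return tuple(min(hash((s, t)) for t in tokens) for s in _SALTS)

class LSHIndex:
    """Banded LSH over MinHash signatures: similar signatures collide
    in at least one band with high probability, so only a small
    candidate set needs the exact comparison."""

    def __init__(self):
        self.buckets = defaultdict(list)
        self.items = {}            # item_id -> (token set, pose label)

    def add(self, item_id, tokens, pose_label):
        self.items[item_id] = (set(tokens), pose_label)
        sig = minhash(tokens)
        for b in range(BANDS):
            self.buckets[(b, sig[b * ROWS:(b + 1) * ROWS])].append(item_id)

    def query(self, tokens):
        """Return the pose label of the most similar indexed image."""
        sig = minhash(tokens)
        candidates = set()
        for b in range(BANDS):
            candidates.update(self.buckets.get((b, sig[b * ROWS:(b + 1) * ROWS]), []))
        if not candidates:
            return None
        q = set(tokens)
        best = max(candidates,
                   key=lambda i: len(q & self.items[i][0]) / len(q | self.items[i][0]))
        return self.items[best][1]
```

The predicted pose images are indexed offline with `add`; the online step hashes the input acoustic image's tokens once and returns the pose label of the best candidate, avoiding an exhaustive scan of the whole query library.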
S4: and measuring pose data of the robot by using a sensor, and calibrating the initial multi-degree-of-freedom pose image according to the pose data to obtain the final pose of the robot.
In this embodiment, an inertial measurement unit (IMU) and a Doppler velocity log (DVL) in an extended Kalman filter system are used to measure real-time pose data of the robot, and the real-time pose data are used to calibrate the initial multi-degree-of-freedom pose image to obtain the final pose.
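As a rough illustration of this sensor-fusion step, the sketch below runs a linear Kalman filter over position and velocity with DVL velocity updates; it merely stands in for the patent's IMU-DVL extended Kalman filter, and the constant-velocity motion model and noise values are assumptions:

```python
import numpy as np

class SimpleEKF:
    """Minimal Kalman filter over state [px, py, pz, vx, vy, vz],
    predicting with a constant-velocity model and correcting with
    DVL velocity measurements."""

    def __init__(self):
        self.x = np.zeros(6)
        self.P = np.eye(6)
        self.Q = np.eye(6) * 1e-3                          # process noise (assumed)
        self.R = np.eye(3) * 1e-2                          # DVL noise (assumed)
        self.H = np.hstack([np.zeros((3, 3)), np.eye(3)])  # DVL observes velocity

    def predict(self, dt):
        F = np.eye(6)
        F[:3, 3:] = np.eye(3) * dt                 # position += velocity * dt
        self.x = F @ self.x
        self.P = F @ self.P @ F.T + self.Q

    def update_dvl(self, v_meas):
        y = v_meas - self.H @ self.x               # innovation
        S = self.H @ self.P @ self.H.T + self.R
        K = self.P @ self.H.T @ np.linalg.inv(S)   # Kalman gain
        self.x = self.x + K @ y
        self.P = (np.eye(6) - K @ self.H) @ self.P

# One fusion cycle: predict with the motion model, then correct with a DVL reading.
ekf = SimpleEKF()
ekf.predict(dt=0.1)
ekf.update_dvl(np.array([1.0, 0.0, 0.0]))
```

In the full method, the filtered state (together with IMU orientation updates, omitted here) supplies the real-time pose data used to calibrate the initial multi-degree-of-freedom pose image.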
When the initial multi-degree-of-freedom pose image is calibrated by using the real-time pose data to obtain the final pose, the criterion for ending the calibration is as follows:
constructing a trigger calibration condition η_r = C(S_r(x), S_t(x)) / C(S_r(x), G(S_r(x))), wherein C(S_r(x), G(S_r(x))) is the correlation function calculated between the initial multi-degree-of-freedom pose image S_r(x) and the result G(S_r(x)) of smoothing S_r(x), and C(S_r(x), S_t(x)) is the correlation function calculated between the initial multi-degree-of-freedom pose image S_r(x) and the real-time pose data image S_t(x) acquired at time t. The correlation function in this embodiment may be the cosine similarity, i.e. C(A, B) = (A · B) / (‖A‖‖B‖).
When η_r is less than or equal to a preset threshold, calibrating the initial multi-degree-of-freedom pose image by using the real-time pose data stops, and the final pose is obtained.
Calibration is the process that ensures accurate positioning and verification by the physical sensors. The pose prediction result is combined with the sensor measurements from the DVL and the IMU in the extended Kalman filter system, and the 6-degree-of-freedom (6-DOF) poses of the detection system and the underwater robot in the global frame are predicted under the trigger calibration condition. The denominator C(S_r(x), G(S_r(x))) represents the maximum correlation obtainable when the two images are aligned; ideally, when the two images are perfectly aligned, the ratio reaches 100%. The numerator C(S_r(x), S_t(x)) is always lower than the denominator. In practice, therefore, a threshold for η_r is simply set according to the precision required.
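The trigger condition above can be computed as sketched below, taking cosine similarity as the correlation function C and a simple box filter as an assumed stand-in for the smoothing G(·) (the patent does not specify the smoothing kernel):

```python
import numpy as np

def cosine_corr(a, b):
    """Cosine similarity of two images, flattened:
    C(A, B) = (A . B) / (|A| |B|)."""
    a, b = a.ravel().astype(float), b.ravel().astype(float)
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

def box_smooth(img, k=3):
    """Box-filter smoothing standing in for G(.) (kernel size assumed)."""
    pad = k // 2
    p = np.pad(img.astype(float), pad, mode="edge")
    out = np.empty_like(img, dtype=float)
    for i in range(img.shape[0]):
        for j in range(img.shape[1]):
            out[i, j] = p[i:i + k, j:j + k].mean()
    return out

def eta(S_r, S_t):
    """Trigger-calibration ratio eta_r: correlation of the initial pose
    image with the real-time image, normalised by its correlation with
    its own smoothed version (the best alignment achievable)."""
    return cosine_corr(S_r, S_t) / cosine_corr(S_r, box_smooth(S_r))
```

Comparing `eta(S_r, S_t)` against the preset threshold at each time t then decides when calibration with the real-time pose data stops.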
Example two
The invention also discloses a machine-vision-based sonar imaging underwater automatic calibration positioning system, which comprises an input acoustic image acquisition module, a sonar simulation module, a pose image pair generation module, and a calibration module.
The input acoustic image acquisition module is used for building an underwater robot model and acquiring an input acoustic image of the multi-degree-of-freedom pose of the robot by using an acoustic imaging sonar; the sonar simulation module generates multiple-degree-of-freedom pose simulation images of the robot by using a sonar simulator for multiple times, and predicts multiple-degree-of-freedom pose images of the robot multiple times according to the multiple-degree-of-freedom pose simulation images; the pose image pair generating module finds a multi-degree-of-freedom pose image which is most similar to the input acoustic image of the multi-degree-of-freedom pose from multi-degree-of-freedom pose images obtained through multiple predictions, and takes the most similar multi-degree-of-freedom pose image as an initial multi-degree-of-freedom pose image; the calibration module measures pose data of the robot by using the sensor, and performs calibration on the initial multi-degree-of-freedom pose image according to the pose data to obtain the final pose of the robot.
The invention provides a novel sensor fusion positioning method suitable for an integrated IMU-DVL system and an acoustic imaging sonar. A simulation environment is built to simulate the actual acoustic image and predict the pose; on this basis, the optimal predicted pose is found from the collected actual acoustic image and then calibrated with the real-time pose data collected by the sensors. This can replace traditional manual calibration, improving the accuracy of underwater positioning calibration and the working efficiency of underwater detection while reducing labor costs.
Compared with conventional acoustic positioning systems (e.g., LBL, SBL, and USBL), in exploration missions using an underwater vehicle the invention, by using imaging sonar, requires no infrastructure (such as transponder beacons) that is costly to install. The relative pose with respect to the underwater structure of the sensitive area is obtained directly from the pose prediction of the imaging sonar, which also makes it particularly suitable for localization applications of a detection system.
The invention is based on sonar depth image acquisition. The imaging sonar is an active sonar whose images contain multidimensional information, and background noise can be filtered out well relative to a two-dimensional sensor image, so positioning does not suffer from acoustic shadow zones, giving the invention advantages such as high efficiency and high accuracy.
It will be appreciated by those skilled in the art that embodiments of the present application may be provided as a method, system, or computer program product. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment, or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The present application is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to embodiments of the application. It will be understood that each flow and/or block of the flowchart illustrations and/or block diagrams, and combinations of flows and/or blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
It is apparent that the above examples are given by way of illustration only and are not limiting of the embodiments. Other variations and modifications of the present invention will be apparent to those of ordinary skill in the art in light of the foregoing description. It is neither necessary nor possible to exhaustively list all embodiments here; obvious variations or modifications derived therefrom are contemplated as falling within the scope of the present invention.

Claims (8)

1. The sonar imaging underwater automatic calibration positioning method based on machine vision is characterized by comprising the following steps of:
building an underwater robot model, and acquiring an input acoustic image of the multi-degree-of-freedom pose of the robot by using an acoustic imaging sonar;
simulating the underwater robot and the environment in which the underwater robot is positioned by using a sonar simulation technology, generating multiple-degree-of-freedom pose simulation images of the robot for multiple times, and predicting multiple-degree-of-freedom pose images of the robot multiple times according to the multiple-degree-of-freedom pose simulation images;
finding a multi-degree-of-freedom pose image which is most similar to an input acoustic image of the multi-degree-of-freedom pose from multi-degree-of-freedom pose images obtained through multiple predictions, and taking the most similar multi-degree-of-freedom pose image as an initial multi-degree-of-freedom pose image;
measuring pose data of the robot by using a sensor, and calibrating an initial multi-degree-of-freedom pose image according to the pose data to obtain a final pose of the robot, wherein the final pose comprises the following specific steps:
and measuring real-time pose data of the robot by using an inertial measurement unit and a Doppler log in an extended Kalman filtering system, and calibrating an initial multi-degree-of-freedom pose image by using the real-time pose data to obtain the final pose.
2. The machine vision-based sonar imaging underwater automatic calibration positioning method according to claim 1, wherein the method is characterized in that: the multi-degree-of-freedom pose simulation image of the robot is generated for many times by simulating the underwater robot and the environment by using a sonar simulation technology, and specifically comprises the following steps:
and simulating the underwater robot and the environment in which the underwater robot is positioned by using a sonar simulation technology, sampling the multi-degree-of-freedom pose of the robot on the three-dimensional pose space for multiple times in a bounded positioning area, and generating a simulated acoustic image for each pose obtained by sampling by using a sonar simulator to obtain multiple multi-degree-of-freedom pose simulated images.
3. The machine vision-based sonar imaging underwater automatic calibration positioning method according to claim 2, wherein the method is characterized in that: when the multi-degree-of-freedom pose of the robot in the three-dimensional pose space is sampled for a plurality of times in the bounded positioning area, the sampled angle comprises the pitching direction and the detection visual angle in the horizontal direction.
4. The machine vision-based sonar imaging underwater automatic calibration positioning method according to claim 1, wherein the method is characterized in that: when the multi-degree-of-freedom pose image of the robot is predicted for many times according to the multi-degree-of-freedom pose simulation image, the prediction process of each time specifically comprises the following steps: and taking the multi-degree-of-freedom pose simulation image as the input of a neural network, wherein the output of the neural network is the multi-degree-of-freedom pose image of the robot.
5. The machine vision-based sonar imaging underwater automatic calibration positioning method according to claim 1, wherein the method is characterized in that: and when the multi-degree-of-freedom pose image which is most similar to the input acoustic image of the multi-degree-of-freedom pose is found out from the multi-degree-of-freedom pose image obtained through multiple predictions, a nearest neighbor search algorithm is used.
6. The machine vision-based sonar imaging underwater automatic calibration positioning method according to claim 1, wherein: when the multi-degree-of-freedom pose image most similar to the input acoustic image of the multi-degree-of-freedom pose is searched for among the multi-degree-of-freedom pose images obtained through the multiple predictions, the search combines an image similarity measure, MinHash, and locality-sensitive hashing.
7. The machine vision-based sonar imaging underwater automatic calibration positioning method according to claim 1, characterized in that: when the initial multi-degree-of-freedom pose image is calibrated with the real-time pose data to obtain the final pose, the criterion for ending the calibration is as follows:
a trigger calibration condition η_r = |C(S_r(x), G(S_r(x))) − C(S_r(x), S_t(x))| is constructed, wherein C(S_r(x), G(S_r(x))) is the correlation function computed between the initial multi-degree-of-freedom pose image S_r(x) and the result G(S_r(x)) of smoothing S_r(x), and C(S_r(x), S_t(x)) is the correlation function computed between the initial multi-degree-of-freedom pose image S_r(x) and the real-time pose data image S_t(x) acquired at time t;
calibration of the initial multi-degree-of-freedom pose image with the real-time pose data stops, yielding the final pose, once η_r is less than or equal to a preset threshold.
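The trigger condition of claim 7 can be sketched numerically. The exact correlation function C and smoothing operator G are not specified in the claim; the sketch below assumes normalized cross-correlation for C and a box filter for G, and treats η_r as the gap between the two correlation values, so all of these choices are assumptions.

```python
import numpy as np

def ncc(a, b):
    """Normalized cross-correlation between two images (stand-in for C)."""
    a = a.ravel() - a.mean()
    b = b.ravel() - b.mean()
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))

def box_smooth(img, k=3):
    """Simple box-filter smoothing (stand-in for the operator G)."""
    pad = k // 2
    p = np.pad(img, pad, mode="edge")
    out = np.zeros_like(img, dtype=float)
    for i in range(img.shape[0]):
        for j in range(img.shape[1]):
            out[i, j] = p[i:i + k, j:j + k].mean()
    return out

def eta(S_r, S_t):
    """Trigger value: gap between the self-correlation of S_r with its
    smoothed version and its correlation with the real-time image S_t."""
    return abs(ncc(S_r, box_smooth(S_r)) - ncc(S_r, S_t))

rng = np.random.default_rng(2)
S_r = rng.random((16, 16))          # hypothetical initial pose image
val = eta(S_r, rng.random((16, 16)))  # compare against a real-time image
```

Calibration would iterate until `eta(...)` falls at or below the preset threshold.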
8. A machine vision-based sonar imaging underwater automatic calibration positioning system, characterized in that it comprises an input acoustic image acquisition module, a sonar simulation module, a pose image pair generation module, and a calibration module, wherein:
the input acoustic image acquisition module is used for building an underwater robot model and acquiring an input acoustic image of the multi-degree-of-freedom pose of the robot using an acoustic imaging sonar;
the sonar simulation module simulates the underwater robot and the environment in which it is located using a sonar simulation technique, generates multi-degree-of-freedom pose simulation images of the robot multiple times, and predicts the multi-degree-of-freedom pose image of the robot multiple times from the simulation images;
the pose image pair generation module finds, among the multi-degree-of-freedom pose images obtained from the multiple predictions, the one most similar to the input acoustic image of the multi-degree-of-freedom pose, and takes this most similar image as the initial multi-degree-of-freedom pose image;
the calibration module measures pose data of the robot with sensors and calibrates the initial multi-degree-of-freedom pose image according to the pose data to obtain the final pose of the robot, specifically:
real-time pose data of the robot are measured with an inertial measurement unit and a Doppler velocity log in an extended Kalman filtering system, and the initial multi-degree-of-freedom pose image is calibrated with the real-time pose data to obtain the final pose.
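The sensor fusion in the calibration module can be sketched with a single Kalman predict/update cycle: the IMU acceleration drives the prediction and the Doppler velocity log supplies the measurement update. This is a linear, one-axis illustration only; the patent's system is an extended (nonlinear, multi-axis) Kalman filter, and all noise parameters below are assumptions.

```python
import numpy as np

def kf_step(x, P, accel, vel_meas, dt=0.1, q=0.01, r=0.05):
    """One predict/update cycle fusing an IMU acceleration (prediction)
    with a DVL velocity measurement (update). State x = [position, velocity].
    q and r are illustrative process/measurement noise levels."""
    F = np.array([[1.0, dt], [0.0, 1.0]])
    B = np.array([0.5 * dt**2, dt])
    H = np.array([[0.0, 1.0]])          # the DVL observes velocity only
    Q = q * np.eye(2)
    R = np.array([[r]])
    # predict with the IMU acceleration
    x = F @ x + B * accel
    P = F @ P @ F.T + Q
    # update with the DVL velocity
    y = vel_meas - (H @ x)[0]
    S = H @ P @ H.T + R
    K = P @ H.T / S[0, 0]
    x = x + (K * y).ravel()
    P = (np.eye(2) - K @ H) @ P
    return x, P

x, P = np.zeros(2), np.eye(2)
for _ in range(50):                      # constant DVL reading of 1.0 m/s
    x, P = kf_step(x, P, accel=0.0, vel_meas=1.0)
```

After repeated updates the estimated velocity converges toward the DVL reading, and the fused pose stream is what calibrates the initial pose image.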
CN202310553779.9A 2023-05-17 2023-05-17 Sonar imaging underwater automatic calibration positioning method and system based on machine vision Active CN116299374B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310553779.9A CN116299374B (en) 2023-05-17 2023-05-17 Sonar imaging underwater automatic calibration positioning method and system based on machine vision


Publications (2)

Publication Number Publication Date
CN116299374A CN116299374A (en) 2023-06-23
CN116299374B true CN116299374B (en) 2023-08-04

Family

ID=86798096

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310553779.9A Active CN116299374B (en) 2023-05-17 2023-05-17 Sonar imaging underwater automatic calibration positioning method and system based on machine vision

Country Status (1)

Country Link
CN (1) CN116299374B (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
RU17810U1 (en) * 2000-11-30 2001-04-27 State Unitary Enterprise "Research and Design Institute of Geophysical Methods of Ocean Exploration" HYDROACOUSTIC NAVIGATION SYSTEM AND ITS CALIBRATION SYSTEM
CN111175761A (en) * 2019-11-19 2020-05-19 南京工程学院 Registration method of underwater robot positioning sonar data
CN111260649A (en) * 2020-05-07 2020-06-09 常州唯实智能物联创新中心有限公司 Close-range mechanical arm sensing and calibrating method
CN111660290A (en) * 2019-03-05 2020-09-15 波音公司 Automatic calibration for robotic optical sensors
CN114509767A (en) * 2022-02-15 2022-05-17 交通运输部天津水运工程科学研究所 Underwater imaging sonar measurement calibration device and method
CN115575931A (en) * 2022-09-29 2023-01-06 北京百度网讯科技有限公司 Calibration method, calibration device, electronic equipment and storage medium

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110506297B (en) * 2017-04-17 2023-08-11 康耐视公司 High accuracy calibration system and method


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Research on Sonar Image Registration Based on Convolutional Neural Networks; Han Pengju; China Master's Theses Full-Text Database, Information Science and Technology, No. 2; main text pp. 6-41 *



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant