CN111983620A - Target positioning method for underwater robot search and detection - Google Patents

Target positioning method for underwater robot search and detection

Info

Publication number
CN111983620A
CN111983620A (Application CN202011065672.2A)
Authority
CN
China
Prior art keywords
target
underwater
underwater robot
layer
elevation angle
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202011065672.2A
Other languages
Chinese (zh)
Other versions
CN111983620B (en)
Inventor
马杰
余逸飞
尉浩然
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Wuhan University of Technology WUT
Original Assignee
Wuhan University of Technology WUT
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Wuhan University of Technology WUT filed Critical Wuhan University of Technology WUT
Publication of CN111983620A
Application granted
Publication of CN111983620B
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S15/00Systems using the reflection or reradiation of acoustic waves, e.g. sonar systems
    • G01S15/02Systems using the reflection or reradiation of acoustic waves, e.g. sonar systems using reflection of acoustic waves
    • G01S15/06Systems determining the position data of a target
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S15/00Systems using the reflection or reradiation of acoustic waves, e.g. sonar systems
    • G01S15/88Sonar systems specially adapted for specific applications
    • G01S15/89Sonar systems specially adapted for specific applications for mapping or imaging
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/52Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S15/00
    • G01S7/539Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S15/00 using analysis of echo signal for target characterisation; Target signature; Target cross-section
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/24Classification techniques
    • G06F18/241Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/04Architecture, e.g. interconnection topology
    • G06N3/045Combinations of networks
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/08Learning methods
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/40Extraction of image or video features
    • G06V10/44Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; Connectivity analysis, e.g. of connected components
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/40Extraction of image or video features
    • G06V10/46Descriptors for shape, contour or point-related descriptors, e.g. scale invariant feature transform [SIFT] or bags of words [BoW]; Salient regional features
    • G06V10/462Salient features, e.g. scale invariant feature transforms [SIFT]
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/10Terrestrial scenes
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02ATECHNOLOGIES FOR ADAPTATION TO CLIMATE CHANGE
    • Y02A90/00Technologies having an indirect contribution to adaptation to climate change
    • Y02A90/30Assessment of water resources

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Data Mining & Analysis (AREA)
  • Evolutionary Computation (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Artificial Intelligence (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Multimedia (AREA)
  • General Engineering & Computer Science (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Software Systems (AREA)
  • Health & Medical Sciences (AREA)
  • Molecular Biology (AREA)
  • Computing Systems (AREA)
  • Computational Linguistics (AREA)
  • Mathematical Physics (AREA)
  • General Health & Medical Sciences (AREA)
  • Acoustics & Sound (AREA)
  • Biophysics (AREA)
  • Biomedical Technology (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Evolutionary Biology (AREA)
  • Measurement Of Velocity Or Position Using Acoustic Or Ultrasonic Waves (AREA)

Abstract

The invention discloses a target positioning method for underwater robot search and detection, which comprises the following steps: extracting and identifying A-KAZE feature points from sonar images of an underwater target; calculating the two-dimensional azimuth of the underwater target relative to the underwater robot; calculating the elevation angle of the target; calculating the three-dimensional azimuth of the underwater target relative to the underwater robot; and correcting the elevation angle of the target so as to further correct the three-dimensional azimuth of the underwater target point relative to the underwater robot. Based on forward-looking sonar data, the method automatically extracts and identifies feature points of the underwater target with a deep convolutional neural network and combines them with the attitude of the underwater robot, thereby achieving accurate positioning of the underwater target, enabling searchers to finely probe the position of the underwater target, and making underwater search-and-detection operations reliable, efficient, and intelligent.

Description

Target positioning method for underwater robot search and detection
Technical Field
The invention relates to the technical field of underwater target search and detection, and in particular to a target positioning method for underwater robot search and detection.
Background
In marine scientific research, underwater robots are the most important research tools for replacing human beings in long-duration underwater work or in harsh underwater environments. In a complex underwater environment, underwater acoustic detection is the most reliable and effective detection means, and it is also the detection means most widely applied on underwater robots. The present method comprehensively applies modern sonar detection technology to perform underwater search and detection of a sea area where an accident has occurred, obtains key feature points of the underwater search target, and combines the target feature points with the attitude information of the detection robot to achieve accurate positioning of the underwater target.
Existing underwater target search-and-detection methods do not locate the position of an underwater target with high refinement and accuracy, and research on underwater target search-and-detection positioning methods will remain a focus of scientific research now and for a long time to come.
Disclosure of Invention
The invention aims to solve at least one technical problem in the prior art, and provides a target positioning method for underwater robot search and detection which can achieve accurate positioning of an underwater target.
According to an embodiment of the invention, a target positioning method for underwater robot search and detection is provided, comprising the following steps:
S1. Extract A-KAZE feature points of an underwater target from sonar images collected by the forward-looking sonar of an underwater robot;
S2. Input the sonar images carrying A-KAZE features into a convolutional neural network to identify the A-KAZE feature points of the target in the sonar images;
S3. Calculate the two-dimensional azimuth of the underwater target relative to the underwater robot using the geometric relation between the target feature points and the forward-looking sonar;
S4. Combine the two-dimensional azimuth of the underwater target's feature points with the attitude of the underwater robot to calculate the elevation angle θ of the target, and calculate the three-dimensional azimuth of the underwater target relative to the underwater robot from the obtained elevation angle θ;
S5. Correct the elevation angle θ of the target, thereby correcting the three-dimensional azimuth of the underwater target point relative to the underwater robot.
According to the target positioning method for underwater robot search and detection of the embodiment of the invention, extracting the A-KAZE feature points of the underwater target in step S1 comprises the following steps:
S101. Define a set of evolution times to construct a nonlinear scale space;
S102. Convert the discrete set of scales in pixel units into time units;
S103. Given an input image and a contrast factor, apply the fast explicit diffusion method;
S104. Embed the fast explicit diffusion method into a coarse-to-fine pyramid scheme;
S105. Compute the Hessian determinant for each sonar image;
S106. Compute the second-order derivatives using cascaded Scharr filters.
According to the target positioning method for underwater robot search and detection of the embodiment of the invention, step S2 is implemented by the following sub-step:
S201. Train a convolutional neural network on a sonar image data set using the GoogLeNet architecture.
According to the target positioning method for underwater robot search and detection of the embodiment of the invention, the GoogLeNet architecture comprises five layers: the first and second layers are a convolutional layer and a max-pooling layer; the third layer is an inception layer; the fourth layer, the feature layer, is a fully connected layer that maps the previous output to a Dim × 1 vector; the fifth layer is a fully connected layer that maps the previous feature layer to a 3 × 1 vector, and the 3 × 1 vector is compared with the position label using a Euclidean loss.
According to the target positioning method for underwater robot search and detection of the embodiment of the invention, step S3 is implemented by the following sub-step:
S301. Convert between the local Cartesian sonar coordinate system and the spherical parametric coordinate system.
According to the target positioning method for underwater robot search and detection of the embodiment of the invention, step S4 is implemented by the following sub-steps:
S401. Formulate the underwater target feature points and the underwater robot poses as a nonlinear least-squares factor-graph optimization, in which each pose X_t comprises the 6 parameters (x_o, y_o, z_o, yaw, pitch, roll) and each feature point comprises the 3 parameters (x, y, z);
S402. Solve the factor graph as a nonlinear optimization;
S403. Convert the feature point l_j from world coordinates (x, y, z) into the sonar frame to obtain the azimuth and range of the local coordinates (x_s, y_s, z_s);
S404. Find an initial estimate of the feature point by back-projecting the sonar measurements, using the monotonicity of the logarithmic function;
S405. Set the unknown elevation angle θ to 0, then use the underwater robot pose X_t to convert the point from sonar rectangular coordinates (x_s, y_s, z_s) into world rectangular coordinates (x, y, z), used as the initial guess of the feature point's three-dimensional azimuth;
S406. Calculate the elevation angle θ of the target point from the azimuth-range measurement m_b corresponding to the base pose X_b.
According to the target positioning method for underwater robot search and detection of the embodiment of the invention, in step S5 the corrected target elevation angle θ is calculated iteratively using under-constrained or sufficiently constrained feature points, thereby correcting the three-dimensional azimuth of the underwater target point relative to the underwater robot.
According to the target positioning method for underwater robot search and detection of the embodiment of the invention, step S5 is implemented by the following sub-steps:
S501. Observe the elevation angle of the target feature point from different poses;
S502. Classify the observed feature points as under-constrained or sufficiently constrained;
S503. Use a three-degree-of-freedom spherical parameterization to determine whether a feature point is sufficiently constrained;
S504. Use the feature point l_0 as the linearization point and apply a Taylor series expansion of the measurement function;
S505. Simplify the optimization into a linear least-squares problem;
S506. Determine whether the optimization is measurement-constrained;
S507. Delete under-constrained feature points completely from the state vector;
S508. Alternatively, delete only the elevation angles of the under-constrained feature points from the state vector, and model the under-constrained feature points as two-dimensional azimuth-range points in the factor graph.
According to the target positioning method for underwater robot search and detection of the embodiment of the invention, in step S5 a Monte Carlo method is used to perform random sampling within the elevation-angle range of the target feature point, and the elevation angle θ of the target is corrected with the objective of minimizing the position error, thereby correcting the three-dimensional azimuth of the underwater target point relative to the underwater robot.
According to the target positioning method for underwater robot search and detection of the embodiment of the invention, step S5 is implemented by the following sub-steps:
S501. Obtain the elevation angle of the target feature point from different poses;
S502. Randomly sample within the elevation-angle range of the target feature point using the Monte Carlo method;
S503. Correct the elevation angle θ of the target by optimizing a loss function;
S504. Use the Monte Carlo algorithm to generate the spatial position information of the target feature point before and after the sonar roll, and constrain l_0 with the arcs;
S505. Simplify the optimization into a linear least-squares problem;
S506. Determine whether the optimization is measurement-constrained;
S507. Correct the feature points using the loss function.
Beneficial effects: the target positioning method for underwater robot search and detection is based on forward-looking sonar data; it automatically extracts and identifies feature points of an underwater target with a deep convolutional neural network and combines them with the underwater robot's attitude, thereby achieving accurate positioning of the underwater target, enabling searchers to finely probe the position of the underwater target, and making underwater search-and-detection operations reliable, efficient, and intelligent. The method is applied in the technical field of underwater target search and detection.
Drawings
The invention will be further described with reference to the accompanying drawings in which:
FIG. 1 is a block diagram of step S5 of an embodiment of the present invention when the corrected target elevation angle is calculated iteratively using under-constrained or sufficiently constrained feature points;
FIG. 2 is a block diagram of step S5 of an embodiment of the present invention when the target elevation angle is corrected by the Monte Carlo method;
FIG. 3 is a diagram of the geometric relation between target feature points and the forward-looking sonar according to an embodiment of the present invention;
FIG. 4 is a factor graph model of an embodiment of the invention;
FIG. 5 is a three-dimensional position diagram of an underwater target in accordance with an embodiment of the present invention;
FIG. 6 is a schematic view of the underwater robot rotating about the z-axis according to an embodiment of the present invention;
FIG. 7 is a factor graph modification model according to an embodiment of the present invention;
fig. 8 is a schematic view of an underwater robot rotating around an x-axis according to an embodiment of the present invention.
Detailed Description
Reference will now be made in detail to the present preferred embodiments of the present invention, examples of which are illustrated in the accompanying drawings, wherein like reference numerals refer to like elements throughout.
In the description of the present invention, it should be understood that the orientation or positional relationship referred to in the description of the orientation, such as the upper, lower, front, rear, left, right, etc., is based on the orientation or positional relationship shown in the drawings, and is only for convenience of description and simplification of description, and does not indicate or imply that the device or element referred to must have a specific orientation, be constructed and operated in a specific orientation, and thus, should not be construed as limiting the present invention.
In the description of the present invention, "several" means one or more, and "a plurality of" means two or more; "greater than", "less than", "exceeding", and the like are understood as excluding the stated number, while "above", "below", "within", and the like are understood as including it. If "first" and "second" are used, they are only for distinguishing technical features and are not to be understood as indicating or implying relative importance, the number of the indicated technical features, or the precedence of the indicated technical features.
In the description of the present invention, unless otherwise explicitly limited, terms such as arrangement, installation, connection and the like should be understood in a broad sense, and those skilled in the art can reasonably determine the specific meanings of the above terms in the present invention in combination with the specific contents of the technical solutions.
Referring to FIG. 1 and FIG. 2, an embodiment of the present invention provides a target positioning method for underwater robot search and detection, comprising the following steps:
S1. Extract A-KAZE feature points of an underwater target from the sonar images collected by the forward-looking sonar of the underwater robot. The extraction of the A-KAZE feature points comprises the following sub-steps:
S101. Define a set of evolution times to construct a nonlinear scale space,

σ_i(o, s) = 2^(o + s/S), o ∈ [0 … O−1], s ∈ [0 … S−1], i ∈ [0 … M],

where O is the number of octaves (groups of images blurred by different Gaussian kernels), S is the number of discretized sub-levels per octave, σ_i is the scale in pixel units, and M is the total number of filtered sonar images.
S102. Convert the discrete set of scales σ_i in pixel units into time units,

t_i = σ_i² / 2.
S103. Given an input image and a contrast factor, the fast explicit diffusion method is used: M−1 outer fast explicit diffusion cycles are run, and a minimum number of inner steps n (n ≪ M) is calculated for each cycle.
S104. To accelerate the computation of the nonlinear scale space, the fast explicit diffusion method is embedded into a coarse-to-fine pyramid scheme.
The fast explicit diffusion method is embedded in the coarse-to-fine pyramidal decomposition. To reach the steady state as quickly as possible, cascaded fast explicit diffusion handles the propagation from the coarse to the fine stages. The image is down-sampled by a factor of 2, and the down-sampled image is used as the starting image for the next fast diffusion cycle in the next octave.
S105. For each sonar image L_i, compute the Hessian determinant,

L_Hessian^i = σ_{i,norm}² (L_xx L_yy − L_xy L_xy),

where L_xx, L_yy and L_xy are second-order derivatives of L_i and σ_{i,norm} is the normalized scale.
S106. Compute the second-order derivatives using cascaded Scharr filters with step size σ_{i,norm}.
At each scale i, it is checked whether the response value is higher than a preset threshold. Then, for each potential maximum, a window of σ_i × σ_i pixels is examined to verify that the response is a maximum with respect to the other keypoints at scales i+1 and i−1. Finally, a quadratic function is fitted to the Hessian response values in a 3 × 3 pixel neighbourhood and its maximum is found, estimating the two-dimensional position of the keypoint with sub-pixel accuracy.
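As a minimal Python sketch of the scale bookkeeping in steps S101–S102 (the base scale sigma0 is an assumption; setting it to 1.0 matches the formula above literally):

    import numpy as np

    def nonlinear_scale_space_times(num_octaves=4, num_sublevels=4, sigma0=1.0):
        """Scales sigma_i(o, s) = sigma0 * 2**(o + s/S) and times t_i = sigma_i**2 / 2."""
        sigmas = np.array([sigma0 * 2.0 ** (o + s / num_sublevels)
                           for o in range(num_octaves)
                           for s in range(num_sublevels)])
        times = 0.5 * sigmas ** 2   # pixel units -> diffusion (time) units
        return sigmas, times

    sigmas, times = nonlinear_scale_space_times()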
S2. Input the sonar images carrying A-KAZE features into a convolutional neural network to identify the A-KAZE feature points of the target in the sonar images. This comprises the following sub-steps:
S201. Train a convolutional neural network (CNN) on a sonar image data set using the GoogLeNet architecture.
The original GoogLeNet architecture is divided into five layers: the first and second layers are a convolutional layer and a max-pooling layer, and the third, fourth and fifth layers are inception layers.
Two improvements were made to adapt the original network to the present invention:
(1) The second-to-last layer (i.e., the fourth layer) is replaced by a fully connected layer that maps the previous output to a Dim × 1 vector, referred to as the feature layer.
(2) The last layer (i.e., the fifth layer) is replaced by a fully connected layer that maps the previous feature layer to a 3 × 1 vector and compares it with the position label using a Euclidean loss.
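As a minimal sketch of these two modifications, assuming a PyTorch setting with an arbitrary backbone output width backbone_dim and feature width Dim = feature_dim (neither value is specified above), with the Euclidean loss realized as a mean-squared-error loss:

    import torch
    import torch.nn as nn

    class FeatureRegressionHead(nn.Module):
        """Fourth layer: feature layer (Dim x 1); fifth layer: 3 x 1 position output."""
        def __init__(self, backbone_dim=1024, feature_dim=256):
            super().__init__()
            self.feature_layer = nn.Linear(backbone_dim, feature_dim)   # improvement (1)
            self.position_layer = nn.Linear(feature_dim, 3)             # improvement (2)

        def forward(self, x):
            feat = self.feature_layer(x)          # Dim x 1 feature vector
            return self.position_layer(feat)      # 3 x 1 position estimate

    head = FeatureRegressionHead()
    criterion = nn.MSELoss()  # Euclidean-style loss against the 3 x 1 position label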
S3. Calculate the two-dimensional azimuth of the underwater target relative to the underwater robot using the geometric relation between the target feature points and the forward-looking sonar. This comprises the following sub-steps:
S301. A point C = [x y z]^T is parameterized in the local Cartesian sonar coordinate system. The same point can also be represented with a spherical parameterization Q = [φ r θ]^T, the transformation between the two being

x = r cos θ cos φ,  y = r cos θ sin φ,  z = r sin θ,

φ = arctan(y/x),  r = √(x² + y² + z²),  θ = arctan(z/√(x² + y²)),

where φ is the azimuth angle, r is the range, and θ is the elevation angle.
The range r is determined by the time of flight and the speed of sound in the water. The transceiver array allows the azimuth angle φ of a received reflection to be calculated to an accuracy within 1°. These measurements, however, provide no information about the elevation angle θ.
Detected sonar reflections originating from surface patches located on the same elevation arc are projected to the same pixel in the final sonar image, as shown in FIG. 3.
Compiling all measurement results in the sonar field of view yields a grayscale polar image in which the columns of the two-dimensional matrix correspond to the discretized azimuth space and the rows correspond to the discretized range space.
For a unit pixel σ, let the mapping σ ↦ (φ, r) represent the conversion from pixel space to azimuth-range space. The intensity of a pixel corresponds to the intensity of the sound reflected from the elevation arc within the specified azimuth and range.
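A short Python sketch of these conversions; the polar-image bin resolutions and azimuth origin below are illustrative assumptions (they depend on the particular sonar):

    import numpy as np

    def pixel_to_azimuth_range(row, col, range_res=0.02, az_res=np.deg2rad(0.5),
                               az_min=np.deg2rad(-35.0)):
        """Map a polar-image pixel (row = range bin, col = azimuth bin) to (phi, r)."""
        return az_min + col * az_res, row * range_res

    def spherical_to_cartesian(phi, r, theta):
        """Sonar spherical (azimuth phi, range r, elevation theta) -> Cartesian (x, y, z)."""
        return (r * np.cos(theta) * np.cos(phi),
                r * np.cos(theta) * np.sin(phi),
                r * np.sin(theta))

    def cartesian_to_spherical(x, y, z):
        """Cartesian (x, y, z) -> spherical (azimuth, range, elevation)."""
        return (np.arctan2(y, x),
                np.sqrt(x ** 2 + y ** 2 + z ** 2),
                np.arctan2(z, np.hypot(x, y)))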
S4. Combine the two-dimensional azimuth of the underwater target's feature points with the attitude of the underwater robot to calculate the elevation angle θ of the target, and calculate the three-dimensional azimuth of the underwater target relative to the underwater robot from the obtained elevation angle θ. This comprises the following sub-steps:
S401. Formulate the underwater target feature points and the underwater robot poses as a nonlinear least-squares factor-graph optimization: each pose X_t has the 6 parameters (x_o, y_o, z_o, yaw, pitch, roll), and each feature point has the 3 parameters (x, y, z).
The factor graph is a bipartite graph in which the variable nodes of the unknown variables to be optimized are connected to the factor nodes of the measured values, as shown in fig. 4.
At each time t, the pose X_t is added as a new node to the factor graph together with the odometry measurement u_{t−1}, the latter providing the constraint between X_{t−1} and X_t. An azimuth-range measurement m_k = (φ_k, r_k) of the j-th feature point is added to the graph, thereby connecting the feature point l_j to the pose from which it is observed. Using the "base pose X_b" (the first pose from which the feature point is observed), an elevation angle of 0° is first assumed to generate an initial estimate of the three-dimensional position of the feature point.
S402. Solve the factor graph as the nonlinear optimization

X* = argmin_X Σ_i ‖h_i(X) − m_i‖²_{Σ_i},

where the state vector X = [X_0, X_1, …, l_0, l_1, …]^T contains all unknown variables: poses and feature points.
The i-th factor specifies the prediction function h_i(X), the measured value m_i, and the measurement uncertainty Σ_i.
For a measurement m_i of the azimuth and range of feature point j taken from pose X_t, the prediction function is

h_i(X) = h_i(X_t, l_j) = (φ, r).
S403. h_i(X) first converts the feature point l_j from absolute coordinates (x, y, z) into the sonar frame according to the pose X_t = (x_o, y_o, z_o, yaw, pitch, roll), obtaining the local coordinates (x_s, y_s, z_s) and from them the azimuth φ and range r, as shown in FIG. 5, by the following formulas:

x_s = x + x_o[cos(yaw) − sin(yaw)],
y_s = y + y_o[sin(yaw) + cos(yaw)],
z_s = z + z_o,
φ = arctan(y_s / x_s),
r = √(x_s² + y_s² + z_s²).
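Transcribing the formulas above directly into Python gives the following sketch of the prediction function; note that the yaw terms follow the patent's stated simplified form, not a full rotation matrix:

    import numpy as np

    def predict_measurement(pose, feature):
        """h_i(X): predict the (azimuth, range) of feature point l_j seen from pose X_t.
        pose = (xo, yo, zo, yaw, pitch, roll); feature = (x, y, z) in world coordinates."""
        xo, yo, zo, yaw, _pitch, _roll = pose
        x, y, z = feature
        xs = x + xo * (np.cos(yaw) - np.sin(yaw))   # patent's stated frame conversion
        ys = y + yo * (np.sin(yaw) + np.cos(yaw))
        zs = z + zo
        phi = np.arctan2(ys, xs)                    # azimuth
        r = np.sqrt(xs ** 2 + ys ** 2 + zs ** 2)    # range
        return np.array([phi, r])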
S404. Find an initial estimate of the feature point by back-projecting the sonar measurement, using the monotonicity of the logarithmic function. Using the first observation of each feature, which contains the range r and azimuth φ, the measurement is back-projected as

(x_s, y_s, z_s) = (r cos φ, r sin φ, 0).
S405. Set the unknown elevation angle θ to 0, then use the underwater robot pose X_t to convert the point from sonar rectangular coordinates (x_s, y_s, z_s) into world rectangular coordinates (x, y, z), used as the initial guess of the feature point's three-dimensional position.
S406. From the measurement m_b of the corresponding azimuth φ_b and range r_b at the base pose X_b, calculate the elevation angle θ of the target point, as shown in FIG. 5.
S5. Correct the elevation angle θ of the target, thereby correcting the three-dimensional azimuth of the underwater target point relative to the underwater robot.
Specifically, to correct the elevation angle θ of the target, the elevation angle can be calculated iteratively using under-constrained and sufficiently constrained feature points, or it can be corrected with a Monte Carlo method that samples within the elevation-angle range of the feature points and computes the position errors.
The iterative calculation of the corrected target elevation angle θ using under-constrained or sufficiently constrained feature points comprises the following steps:
S501. Observe the elevation angle of the target feature point from different poses.
The underwater robot is controlled to rotate about the z-axis in pure yaw through an angle yaw, so that the azimuth angle of the reflection received by the forward-looking sonar becomes

φ′ = φ + yaw,

and the feature point parameters (x, y, z) are converted as

x′ = x cos(yaw) − y sin(yaw),
y′ = x sin(yaw) + y cos(yaw),
z′ = z.
as shown in fig. 6, the elevation arcs have minimal overlap when the attitudes are separated by pure yaw rotation.
The feature point is measured from a plurality of poses, and the elevation angle of the point needs to be corrected through the following steps.
S502. Classify the observed feature points as under-constrained or sufficiently constrained. Check whether the measurements are sufficient to constrain the feature point's elevation angle; if so, add it to the factor graph as a sufficiently constrained feature point using the standard parameterization;
S503. To determine whether a feature point is sufficiently constrained, a three-degree-of-freedom spherical parameterization is used in which the state consists only of the feature point,

l_j = [φ_j, θ_j, r_j]^T.
since the attitude of the sensors is not a state variable, they are treated as constants, and the function h is predictedi(lj) The latest estimate available from the overall factor graph state estimate is used.
S504. Use the feature point l_0 as the linearization point and apply a Taylor series expansion of the measurement function,

h_i(l_j) ≈ h_i(l_0) + H_i (l_j − l_0),

where H_i is the Jacobian of h_i evaluated at l_0.
S505. Simplify the optimization into the linear least-squares problem

Δ* = argmin_Δ Σ_i ‖A_i Δ − b_i‖²,

where

A_i = Σ_i^{−1/2} H_i,
b_i = Σ_i^{−1/2} (m_i − h_i(l_0)).
The linearization point l_0 is taken as the first azimuth-range measurement back-projected at zero elevation.
S506. Determine whether the optimization is measurement-constrained; examining A^T A is the key to this determination.
If the elevation angle is completely unconstrained, the 3 × 3 matrix A^T A will be rank-deficient. As the constraints on the elevation angle increase, the minimum eigenvalue λ_3 of A^T A increases relative to the first two eigenvalues λ_1 and λ_2. A feature point must therefore satisfy the criterion

λ_3 / λ_1 > ρ

to be considered sufficiently constrained, where ρ is a user-defined adjustable threshold. If the criterion is not met, the feature point is classified as under-constrained.
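A sketch of the eigenvalue test of step S506, with the stacked whitened Jacobian A as input; the smallest-to-largest eigenvalue ratio against the threshold ρ follows the criterion as reconstructed above:

    import numpy as np

    def is_sufficiently_constrained(A, rho=0.1):
        """A has shape (2 * num_measurements, 3); rho is the user-defined threshold."""
        eigvals = np.linalg.eigvalsh(A.T @ A)   # ascending: lambda_3 <= lambda_2 <= lambda_1
        lam3, lam1 = eigvals[0], eigvals[-1]
        return lam3 / lam1 > rho                # sufficiently constrained if ratio exceeds rho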
S507. Delete under-constrained feature points entirely from the state vector, so that their positions are not explicitly modeled in the optimization. As shown in FIG. 7, the measurements corresponding to feature point l_j are collected into a non-parametric factor f_j. This factor takes the first azimuth-range measurement m_b obtained from the feature point's base pose X_b and treats the two spherical coordinates of the feature point as constants;
At each iteration of the optimization, this factor searches over the range of feasible elevation angles by sampling the elevation angle in uniform increments, and selects the elevation angle with the lowest total reprojection error as the current predicted value:

θ* = argmin_{θ ∈ Θ} Σ_k ‖h_k(θ) − m_k‖²_{Σ_k}, where Θ = {θ_min, θ_min + Δθ, …, θ_max − Δθ, θ_max}.

Using the measurement uncertainty Σ_k, the reprojection error is computed as a function of the distance between the projection of the feature point into pose X_k and the measurement m_k.
The cost function of this factor is then the total reprojection error evaluated at the best elevation angle:

f_j(X) = Σ_k ‖h_k(θ*) − m_k‖²_{Σ_k}.
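A sketch of this uniform elevation search; the callable project is assumed to return the predicted azimuth-range vector of the feature point at elevation θ as seen from a given pose:

    import numpy as np

    def best_elevation(measurements, project, theta_min, theta_max, delta_theta):
        """Grid-search Theta = {theta_min, ..., theta_max} in uniform increments and return
        the elevation with the lowest total (Mahalanobis) reprojection error.
        measurements: iterable of (pose, m_k, Sigma_k)."""
        thetas = np.arange(theta_min, theta_max + delta_theta, delta_theta)
        def total_error(theta):
            err = 0.0
            for pose, m_k, Sigma_k in measurements:
                d = project(pose, theta) - m_k
                err += d @ np.linalg.solve(Sigma_k, d)
            return err
        errors = [total_error(t) for t in thetas]
        best = int(np.argmin(errors))
        return thetas[best], errors[best]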
s508, only deleting the elevation angles of the feature points with insufficient constraint from the state vectors, and then modeling the feature points with insufficient constraint as two-dimensional azimuth distance points in the factor graph.
As shown in FIG. 4, the feature point l of insufficient constraintjAll of the measurements of (a) are combined into a single joint measurement factor sj. The joint measurement factor and the non-parameter factor fjSimilarly, the difference is that it uses the orientation and distance estimates of the feature points in the computation of the reprojection error, rather than the measurement of the underlying pose.
Alternatively, a Monte Carlo method is used to perform random sampling within the elevation-angle range of the target feature point, and the elevation angle θ of the target is corrected with the objective of minimizing the position error. This comprises the following sub-steps:
S501. Observe the elevation angle of the target feature point from different poses.
The underwater robot is controlled to rotate about the x-axis in pure roll through an angle roll, so that the azimuth and range of the reflection received by the forward-looking sonar become

φ′ = φ + Δφ,
r′ = r + Δr,

and the feature point parameters (x, y, z) are converted as

x′ = x,
y′ = y cos(roll) − z sin(roll),
z′ = y sin(roll) + z cos(roll).
As shown in FIG. 8, the elevation arcs have minimal overlap when the poses are separated by pure roll rotation.
The feature point is measured from a plurality of poses, and the elevation angle of the point needs to be corrected through the following steps.
S502. Randomly sample within the elevation-angle range of the target feature point using the Monte Carlo method. Iteratively reduce the distance between the generated target point and the real target point using a loss function, and add the feature point with the minimum distance to the factor graph;
S503. The loss function uses a nonlinear loss, in which the state consists only of the feature point,

l_j = [φ_j, θ_j, r_j]^T.
since the attitude of the sensors is not a state variable, they are treated as constants, and the function h is predictedi(lj) The latest estimate available from the overall factor graph state estimate is used.
S504. Use the Monte Carlo algorithm to generate the spatial position information of the target feature point before and after the sonar roll, and constrain l_0 with the arcs: the arc formed by the sampled points l_1, …, l_n constrains l_0, where c is the analytic function of the arc.
S505. Simplify the prediction optimization into a linear loss-optimization problem, where α and β denote the optimization coefficients.
The point l_0 is likewise taken as the first azimuth-range measurement back-projected at zero elevation.
S506. The degree to which the elevation angle is constrained is determined by comparing the distance between the predicted value h_i(l_j) and the initial estimate l̂_j. If the difference between h_i(l_j) and l̂_j is large, the optimization is insufficient and iterative optimization must continue; a feature point must therefore satisfy the criterion

‖h_i(l_j) − l̂_j‖ < τ

to be considered sufficiently optimized and usable for prediction, where τ is a user-defined value.
S507. Optimize the distance between the predicted target point and the real target point through the loss function, and insert the predicted target point with the minimum distance into the factor graph, as shown in FIG. 7; the measurements corresponding to feature point l_j are collected into a non-parametric factor f. This factor takes the first azimuth-range measurement m_b obtained from the feature point's base pose X_b and treats the two spherical coordinates of the feature point as constants;
At each evaluation of the loss, the factor is computed by searching the range of feasible elevation angles in uniform increments and selecting the elevation angle with the lowest error as the current predicted value:

θ* = argmin_{θ ∈ Θ} Σ_k ‖h_k(θ) − m_k‖²_{Σ_k}.

Using the measurement uncertainty Σ_k, the error is computed as a function of the distance between the projection of the feature point into pose X_k and the measurement m_k.
This cost function is then the total error evaluated at the best elevation angle:

f(X) = Σ_k ‖h_k(θ*) − m_k‖²_{Σ_k}.
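A sketch of the Monte Carlo variant: candidate elevations are drawn at random from the feasible range and the sample with the lowest loss is kept, the loss being the same Mahalanobis-weighted reprojection error as above:

    import numpy as np

    def monte_carlo_elevation(measurements, project, theta_min, theta_max,
                              n_samples=200, seed=0):
        """Randomly sample candidate elevations and keep the one with minimum loss.
        measurements: iterable of (pose, m_k, Sigma_k);
        project(pose, theta) -> predicted azimuth-range vector."""
        rng = np.random.default_rng(seed)
        best_theta, best_loss = None, np.inf
        for theta in rng.uniform(theta_min, theta_max, size=n_samples):
            loss = 0.0
            for pose, m_k, Sigma_k in measurements:
                d = project(pose, theta) - m_k
                loss += d @ np.linalg.solve(Sigma_k, d)
            if loss < best_loss:
                best_theta, best_loss = theta, loss
        return best_theta, best_loss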
in the two methods, the elevation angle theta of the corrected target is iteratively calculated by adopting the feature points with insufficient constraint and sufficient constraint, the calculation process is complex due to the nonlinear conversion of sonar measurement projection, and the Monte Carlo method has the characteristic of simple calculation process and cannot increase the calculation complexity due to the increase of the feature point constraints, so the Monte Carlo method is preferably selected. Because the elevation angle range of the characteristic point is smaller, the calculation amount of the Monte Carlo method can be reduced, and the calculation speed is further improved.
It should be understood that parts of the specification not set forth in detail are prior art.
The embodiments of the present invention have been described in detail with reference to the accompanying drawings, but the present invention is not limited to the above embodiments, and various changes can be made within the knowledge of those skilled in the art without departing from the gist of the present invention.

Claims (10)

1. A target positioning method for underwater robot search and detection, characterized by comprising the following steps:
S1. Extract A-KAZE feature points of an underwater target from sonar images collected by the forward-looking sonar of an underwater robot;
S2. Input the sonar images carrying A-KAZE features into a convolutional neural network to identify the A-KAZE feature points of the target in the sonar images;
S3. Calculate the two-dimensional azimuth of the underwater target relative to the underwater robot using the geometric relation between the target feature points and the forward-looking sonar;
S4. Combine the two-dimensional azimuth of the underwater target's feature points with the attitude of the underwater robot to calculate the elevation angle θ of the target, and calculate the three-dimensional azimuth of the underwater target relative to the underwater robot from the obtained elevation angle θ;
S5. Correct the elevation angle θ of the target, thereby correcting the three-dimensional azimuth of the underwater target point relative to the underwater robot.
2. The target positioning method for underwater robot search and detection according to claim 1, wherein extracting the A-KAZE feature points of the underwater target in step S1 comprises the following steps:
S101. Define a set of evolution times to construct a nonlinear scale space;
S102. Convert the discrete set of scales in pixel units into time units;
S103. Given an input image and a contrast factor, apply the fast explicit diffusion method;
S104. Embed the fast explicit diffusion method into a coarse-to-fine pyramid scheme;
S105. Compute the Hessian determinant for each sonar image;
S106. Compute the second-order derivatives using cascaded Scharr filters.
3. The target positioning method for underwater robot search and detection according to claim 1, wherein step S2 is implemented by the following sub-step:
S201. Train a convolutional neural network on a sonar image data set using the GoogLeNet architecture.
4. The target positioning method for underwater robot search and detection according to claim 3, wherein: the GoogLeNet architecture comprises five layers; the first and second layers are a convolutional layer and a max-pooling layer; the third layer is an inception layer; the fourth layer, the feature layer, is a fully connected layer that maps the previous output to a Dim × 1 vector; the fifth layer is a fully connected layer that maps the previous feature layer to a 3 × 1 vector and compares the 3 × 1 vector with the position label using a Euclidean loss.
5. The target positioning method for underwater robot search and detection according to claim 1, wherein step S3 is implemented by the following sub-step:
S301. Convert between the local Cartesian sonar coordinate system and the spherical parametric coordinate system.
6. The target positioning method for underwater robot search and detection according to claim 1, wherein step S4 is implemented by the following sub-steps:
S401. Formulate the underwater target feature points and the underwater robot poses as a nonlinear least-squares factor-graph optimization, in which each pose X_t comprises the 6 parameters (x_o, y_o, z_o, yaw, pitch, roll) and each feature point comprises the 3 parameters (x, y, z);
S402. Solve the factor graph as a nonlinear optimization;
S403. Convert the feature point l_j from world coordinates (x, y, z) into the sonar frame to obtain the azimuth and range of the local coordinates (x_s, y_s, z_s);
S404. Find an initial estimate of the feature point by back-projecting the sonar measurements, using the monotonicity of the logarithmic function;
S405. Set the unknown elevation angle θ to 0, then use the underwater robot pose X_t to convert the point from sonar rectangular coordinates (x_s, y_s, z_s) into world rectangular coordinates (x, y, z), used as the initial guess of the feature point's three-dimensional azimuth;
S406. Calculate the elevation angle θ of the target point from the azimuth-range measurement m_b corresponding to the base pose X_b.
7. The target positioning method for underwater robot search and detection according to claim 1, wherein: in step S5, the corrected target elevation angle θ is calculated iteratively using under-constrained or sufficiently constrained feature points, thereby correcting the three-dimensional azimuth of the underwater target point relative to the underwater robot.
8. The target positioning method for underwater robot search and detection according to claim 7, wherein step S5 is implemented by the following sub-steps:
S501. Observe the elevation angle of the target feature point from different poses;
S502. Classify the observed feature points as under-constrained or sufficiently constrained;
S503. Use a three-degree-of-freedom spherical parameterization to determine whether a feature point is sufficiently constrained;
S504. Use the feature point l_0 as the linearization point and apply a Taylor series expansion of the measurement function;
S505. Simplify the optimization into a linear least-squares problem;
S506. Determine whether the optimization is measurement-constrained;
S507. Delete under-constrained feature points completely from the state vector;
S508. Alternatively, delete only the elevation angles of the under-constrained feature points from the state vector, and model the under-constrained feature points as two-dimensional azimuth-range points in the factor graph.
9. The target positioning method for underwater robot search and detection according to claim 1, wherein: in step S5, a Monte Carlo method is used to perform random sampling within the elevation-angle range of the target feature point, and the elevation angle θ of the target is corrected with the objective of minimizing the position error, thereby correcting the three-dimensional azimuth of the underwater target point relative to the underwater robot.
10. The target positioning method for underwater robot search and detection according to claim 9, wherein step S5 is implemented by the following sub-steps:
S501. Obtain the elevation angle of the target feature point from different poses;
S502. Randomly sample within the elevation-angle range of the target feature point using the Monte Carlo method;
S503. Correct the elevation angle θ of the target by optimizing a loss function;
S504. Use the Monte Carlo algorithm to generate the spatial position information of the target feature point before and after the sonar roll, and constrain l_0 with the arcs;
S505. Simplify the optimization into a linear least-squares problem;
S506. Determine whether the optimization is measurement-constrained;
S507. Correct the feature points using the loss function.
CN202011065672.2A 2020-03-04 2020-09-30 Target positioning method for underwater robot search and detection Active CN111983620B (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202010143341X 2020-03-04
CN202010143341.XA CN111413698A (en) 2020-03-04 2020-03-04 Target positioning method for underwater robot search and detection

Publications (2)

Publication Number Publication Date
CN111983620A true CN111983620A (en) 2020-11-24
CN111983620B CN111983620B (en) 2024-02-20

Family

ID=71489211

Family Applications (2)

Application Number Title Priority Date Filing Date
CN202010143341.XA Pending CN111413698A (en) 2020-03-04 2020-03-04 Target positioning method for underwater robot search and detection
CN202011065672.2A Active CN111983620B (en) 2020-03-04 2020-09-30 Target positioning method for underwater robot searching and exploring

Family Applications Before (1)

Application Number Title Priority Date Filing Date
CN202010143341.XA Pending CN111413698A (en) 2020-03-04 2020-03-04 Target positioning method for underwater robot search and detection

Country Status (1)

Country Link
CN (2) CN111413698A (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112529072A (en) * 2020-12-07 2021-03-19 中国船舶重工集团公司七五0试验场 Underwater buried object identification and positioning method based on sonar image processing
CN112859807A (en) * 2021-01-10 2021-05-28 西北工业大学 Underwater vehicle collaborative search efficiency evaluation method based on situation simulation and Monte Carlo
CN113379710A (en) * 2021-06-18 2021-09-10 上海大学 Underwater target sonar accurate measurement system and method

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112801191B (en) * 2021-02-02 2023-11-21 中国石油大学(北京) Intelligent recommendation method, device and equipment for handling pipeline accidents
CN114283327B (en) * 2021-12-24 2024-04-05 杭州电子科技大学 Target searching and approaching method based on underwater searching robot
CN116243720B (en) * 2023-04-25 2023-08-22 广东工业大学 AUV underwater object searching method and system based on 5G networking

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103869824A (en) * 2014-03-05 2014-06-18 河海大学常州校区 Biological antenna model-based multi-robot underwater target searching method and device
KR20160073462A (en) * 2014-12-16 2016-06-27 아진산업(주) A method for monitoring underwater exploration robot
RU2625349C1 (en) * 2016-06-28 2017-07-13 Акционерное общество "Научно-исследовательский институт "Вектор" Method for determination of spatial angular coordinates of radio signal in amplitude monopulse pelengage systems
US20180259339A1 (en) * 2015-11-13 2018-09-13 FLIR Belgium BVBA Video sensor fusion and model based virtual and augmented reality systems and methods
CN109676604A (en) * 2018-12-26 2019-04-26 清华大学 Robot non-plane motion localization method and its motion locating system
CN110246151A (en) * 2019-06-03 2019-09-17 南京工程学院 A kind of underwater robot method for tracking target based on deep learning and monocular vision
CN110275169A (en) * 2019-06-12 2019-09-24 上海大学 A kind of underwater robot near-field detection sensory perceptual system
KR20190121275A (en) * 2019-10-07 2019-10-25 엘지전자 주식회사 System, apparatus and method for indoor positioning
CN110383284A (en) * 2017-03-06 2019-10-25 微软技术许可有限责任公司 Gesture identification based on ultrasound
CN110568407A (en) * 2019-09-05 2019-12-13 武汉理工大学 Underwater navigation positioning method based on ultra-short baseline and dead reckoning

Patent Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103869824A (en) * 2014-03-05 2014-06-18 河海大学常州校区 Biological antenna model-based multi-robot underwater target searching method and device
KR20160073462A (en) * 2014-12-16 2016-06-27 아진산업(주) A method for monitoring underwater exploration robot
US20180259339A1 (en) * 2015-11-13 2018-09-13 FLIR Belgium BVBA Video sensor fusion and model based virtual and augmented reality systems and methods
RU2625349C1 (en) * 2016-06-28 2017-07-13 Акционерное общество "Научно-исследовательский институт "Вектор" Method for determination of spatial angular coordinates of radio signal in amplitude monopulse pelengage systems
CN110383284A (en) * 2017-03-06 2019-10-25 微软技术许可有限责任公司 Gesture identification based on ultrasound
CN109676604A (en) * 2018-12-26 2019-04-26 清华大学 Robot non-plane motion localization method and its motion locating system
CN110246151A (en) * 2019-06-03 2019-09-17 南京工程学院 A kind of underwater robot method for tracking target based on deep learning and monocular vision
CN110275169A (en) * 2019-06-12 2019-09-24 上海大学 A kind of underwater robot near-field detection sensory perceptual system
CN110568407A (en) * 2019-09-05 2019-12-13 武汉理工大学 Underwater navigation positioning method based on ultra-short baseline and dead reckoning
KR20190121275A (en) * 2019-10-07 2019-10-25 엘지전자 주식회사 System, apparatus and method for indoor positioning

Non-Patent Citations (7)

* Cited by examiner, † Cited by third party
Title
BAOQING GUO: "Novel registration and fusion algorithm for multimodal railway images with different fields of view", Journal of Advanced Transportation, pages 1-17 *
R. DEBORTOLI, F. LI AND G. A. HOLLINGER: "ElevateNet: A Convolutional Neural Network for Estimating the Missing Dimension in 2D Underwater Sonar Images", 2019 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), pages 8040-8047 *
彭向阳: "Automatic insulator positioning using convolutional neural networks", Geomatics and Information Science of Wuhan University, pages 563-569 *
江国来: "Research on key technologies of navigation and interaction for coexisting mobile service robots", China Doctoral Dissertations Full-text Database, Information Science and Technology, pages 1-117 *
王强军: "Research on image fusion algorithms based on deconvolutional neural networks", China Master's Theses Full-text Database, Information Science and Technology, pages 1-97 *
王玉杰: "Research on multi-camera polarization vision bionic navigation methods", China Doctoral Dissertations Full-text Database, Basic Sciences, pages 1-150 *
马杰;刘琪;张春玮;刘克中;张煜: "AIS-based spatio-temporal data analysis and ship encounter situation extraction method", China Safety Science Journal, pages 111-116 *

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112529072A (en) * 2020-12-07 2021-03-19 中国船舶重工集团公司七五0试验场 Underwater buried object identification and positioning method based on sonar image processing
CN112859807A (en) * 2021-01-10 2021-05-28 西北工业大学 Underwater vehicle collaborative search efficiency evaluation method based on situation simulation and Monte Carlo
CN112859807B (en) * 2021-01-10 2022-03-22 西北工业大学 Underwater vehicle collaborative search efficiency evaluation method based on situation simulation and Monte Carlo
CN113379710A (en) * 2021-06-18 2021-09-10 上海大学 Underwater target sonar accurate measurement system and method
CN113379710B (en) * 2021-06-18 2024-02-02 上海大学 Underwater target sonar accurate measurement system and method

Also Published As

Publication number Publication date
CN111413698A (en) 2020-07-14
CN111983620B (en) 2024-02-20

Similar Documents

Publication Publication Date Title
CN111983620B (en) Target positioning method for underwater robot search and detection
CN112183171B (en) Method and device for building beacon map based on visual beacon
Wu et al. Hand-eye calibration: 4-D procrustes analysis approach
CN110880189B (en) Combined calibration method and combined calibration device thereof and electronic equipment
CN105856230B (en) A kind of ORB key frames closed loop detection SLAM methods for improving robot pose uniformity
JP5987823B2 (en) Method and system for fusing data originating from image sensors and motion or position sensors
CN112184824B (en) Camera external parameter calibration method and device
Sweeney et al. Solving for relative pose with a partially known rotation is a quadratic eigenvalue problem
Heller et al. Structure-from-motion based hand-eye calibration using L∞ minimization
JP5627325B2 (en) Position / orientation measuring apparatus, position / orientation measuring method, and program
CN111623773B (en) Target positioning method and device based on fisheye vision and inertial measurement
JP2008014691A (en) Stereo image measuring method and instrument for executing the same
EP3745310A1 (en) Method for calibrating a multi-sensor system using an artificial neural network
CN112444246A (en) Laser fusion positioning method in high-precision digital twin scene
CN110490933A (en) Non-linear state space Central Difference Filter method based on single point R ANSAC
CN112629565B (en) Method, device and equipment for calibrating rotation relation between camera and inertial measurement unit
US20240134033A1 (en) Method for determining a movement state of a rigid body
CN112991445B (en) Model training method, gesture prediction method, device, equipment and storage medium
CN115294280A (en) Three-dimensional reconstruction method, apparatus, device, storage medium, and program product
CN116079727A (en) Humanoid robot motion simulation method and device based on 3D human body posture estimation
CN115311353A (en) Multi-sensor multi-handle controller graph optimization tight coupling tracking method and system
CN111366162B (en) Small celestial body detector pose estimation method based on solar panel projection and template matching
CN108827300A (en) A kind of the equipment posture position measurement method and system of view-based access control model
CN111504276B (en) Visual projection scale factor set-based joint target function multi-propeller attitude angle acquisition method
CN114018271B (en) Accurate fixed-point landing autonomous navigation method and system based on landmark image

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant