CN112419410A - Horizontal attitude determination method based on underwater Snell window edge identification - Google Patents
- Publication number: CN112419410A (application number CN202011307276.6A)
- Authority
- CN
- China
- Prior art keywords
- pixel
- radius
- snell
- coordinate system
- window
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/70—Determining position or orientation of objects or cameras
- G06T7/73—Determining position or orientation of objects or cameras using feature-based methods
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/20—Instruments for performing navigational calculations
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/80—Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Theoretical Computer Science (AREA)
- Automation & Control Theory (AREA)
- Image Analysis (AREA)
Abstract
The invention relates to a horizontal attitude determination method based on underwater Snell window edge identification. The Snell window is an optical phenomenon observed when viewing the sky from underwater and carries spatial information. The method extracts the center of the Snell window from an underwater image of the sky by image processing and thereby obtains the horizontal attitude of the carrier. First, the acquired image is noise-filtered and binarized; then the camera lens model is introduced into the conventional Hough circle transform so that the Snell window and its center can be extracted from the binary edge image; finally, because the observation vector of the window center coincides with the zenith vector of the navigation coordinate system, this relation is used to solve the horizontal attitude of the carrier. The method makes full use of underwater optical information, achieves fully autonomous horizontal attitude determination, and can be used for initial alignment of underwater vehicles and for correcting accumulated inertial navigation errors.
Description
Technical Field
The invention relates to a horizontal attitude determination method based on underwater Snell window edge identification. It uses underwater imaging of the sky and obtains the horizontal attitude of the carrier by extracting the center of the Snell window in the image with an improved Hough transform, and belongs to the field of underwater optical navigation.
Background
The oceans hold abundant biological, mineral and space resources and have great development potential, so every country treats them as an important part of its development strategy. Unmanned underwater vehicles play an irreplaceable role in ocean development. In civilian use they serve seabed topography survey, ocean engineering construction, ocean monitoring, and rescue and salvage; for national defense they can carry out military tasks such as underwater surveillance, reconnaissance, mine hunting, anti-submarine warning and communication relay. They are inexpensive to build, have a high survival rate, avoid casualties, and have great development prospects. Autonomous navigation is a bottleneck that constrains unmanned underwater vehicles. As the "eyes and ears" of the unmanned vehicle, the autonomous navigation system ensures that underwater tasks are executed smoothly and that the vehicle returns safely, and it is a key technology in underwater unmanned systems.
Among current underwater navigation methods, underwater acoustic positioning and satellite navigation are widely used: Chinese patents CN201410073287.0, CN201510944118.4 and CN201610351400.6 use underwater acoustic positioning systems, while Chinese patents CN201010559044.X, CN201210332022.9, CN201810046278.0 and CN201410032817.7 use satellite navigation. However, both are non-autonomous. Underwater acoustic positioning requires transponders to be deployed in advance and therefore cannot be applied in distant, unfamiliar environments; satellite navigation requires the vehicle to surface to receive satellite signals, which interrupts continuous operation and, especially in military applications, seriously compromises concealment. Chinese patents CN109443379A and CN109141475A combine inertial navigation with a Doppler velocity log to determine the initial attitude of the carrier; however, the Doppler velocity log places requirements on the distance between the vehicle and the seabed, and the quality of its velocity output is strongly affected by seawater temperature, salinity and similar factors, so its application scenarios and performance are limited.
Disclosure of Invention
The invention solves the following problem: it overcomes the shortcomings of the prior art and provides a horizontal attitude determination method based on underwater Snell window edge identification, achieving fully autonomous determination of the horizontal attitude angles, without accumulated error, in an underwater environment.
The invention autonomously provides the carrier with its pitch and roll angles by acquiring external optical information. The Snell window is a bright circular region formed at the water-air interface by refraction when the sky is imaged from underwater. In the imaging geometry, the observation vector of the circle center points vertically upward through the water surface, i.e., along the zenith direction of the navigation coordinate system. Because the light inside the Snell window is refracted skylight and is brighter than the light outside it, the Snell window has a distinct edge in sky images taken under a calm, flat water surface, so the window edge can readily be extracted by image processing to obtain the circle center. Combining these two points, the invention proposes a method that solves the horizontal attitude of the carrier by extracting the circle center of the Snell window in the sky image.
Circle detection in images is usually performed with the Hough circle transform. However, because of lens distortion in the imaging system, the Snell window in the image is not a regular circle when the underwater carrier is tilted, so the conventional Hough circle transform is no longer applicable. To address this, the method improves on the Hough circle transform by incorporating the camera lens model, so that the Snell window and its circle center can be extracted accurately under camera lens distortion.
The technical solution adopted by the invention is as follows: a horizontal attitude determination method based on underwater Snell window edge identification, comprising the following implementation steps:
Step (1): noise-filter and binarize the acquired image, obtain an edge image from the light intensity information in the image, and extract the coordinates [m, n] of the non-zero pixels of the edge image in the pixel coordinate system, where an arbitrary non-zero pixel is denoted p_i with coordinates [m_i, n_i], i = 1, 2, …, s, and s is the number of non-zero pixels;
Step (2): compute the set of candidate values of the angular radius r, where r is a possible value of the angular radius of the Snell window in the field of view, establish a three-dimensional Hough space H(m, n, r), and set all its elements to zero;
Step (3): using the binarized edge image from step (1), invert each non-zero pixel p_i through the camera lens model Θ to obtain its observation vector in the world coordinate system (the w frame), take this observation vector as the z axis to establish the pixel local coordinate system (the l_i frame), and solve the coordinate transformation matrix C_{l_i}^w between the l_i frame and the w frame for the non-zero pixel p_i;
Step (4): with the observation vector of the non-zero pixel p_i as the axis, determine a set of cones from the angular radius r, the redundancy radius Δα and the radius step δ computed in step (2), map all generatrix vectors g_iq of the cones back to the pixel coordinate system, and vote in the three-dimensional Hough space H(m, n, r) at the pixel coordinates corresponding to each generatrix vector g_iq, where q = 1, 2, …, 2π/τ and τ is the circumferential step, so as to obtain the three-dimensional Hough space with the non-zero pixel p_i as circle center;
Step (5): perform the operations of step (3) and step (4) for all non-zero pixels; the pixel with the highest vote count in the three-dimensional Hough space H(m, n, r) is the circle center of the Snell window; compute the observation vector of the circle-center pixel coordinates in the world coordinate system and invert the horizontal attitude.
Further, in step (2), the set of candidate values of the angular radius r (r being a possible value of the angular radius of the Snell window in the field of view) is computed, a three-dimensional Hough space H(m, n, r) is established, and all its elements are set to zero. This is implemented as follows:
According to Snell's law of refraction and the lens model, determine the angle between the edge of the window and its center, take this angle as the angular radius in the Hough circle transform, and determine the redundancy range and the step of the angular radius. Let the refractive indices of air and water be n_a and n_w respectively; then the angle α between the edge and the center of the Snell window satisfies:
sin α = n_a / n_w, i.e. α = arcsin(n_a / n_w)
Because of camera calibration errors and uncertainty in the refractive indices of the media, the imaged size of the Snell window does not strictly follow the theoretical model above in practice. A redundancy radius Δα is therefore set according to the actual water environment and the lens model, and the angular radius r takes values in [α − Δα, α + Δα]. With the angular radius step set to δ, the set of candidate values of r is:
r∈{α-Δα,α-Δα+δ,α-Δα+2δ,…,α-Δα+kδ}(α-Δα+kδ≤α+Δα,k∈N+)
and establishing a three-dimensional Hough space H (m, n, r), wherein (m, n) is pixel coordinates, and initial values of the three-dimensional Hough space are all set to be 0.
Further, step (3) uses the edge image binarized in step (1): each non-zero pixel p_i is inverted through the camera lens model Θ to obtain its observation vector in the world coordinate system (the w frame), this observation vector is taken as the z axis to establish the pixel local coordinate system (the l_i frame), and the coordinate transformation matrix C_{l_i}^w between the l_i frame and the w frame is solved for the non-zero pixel p_i. This is implemented as follows:
In the pixel coordinate system, an arbitrary non-zero pixel p_i has coordinates [m_i, n_i]. The camera lens model Θ(p) is obtained by camera lens calibration, and the propagation vector κ_i of the incident ray corresponding to pixel p_i, expressed in the w frame, is:
κ_i = Θ(p_i)   (16)
κ_i is a unit vector. The observation vector v_i, which points in the opposite direction, is expressed in the w frame as v_i = [a_i b_i c_i]^T; then:
[a_i b_i c_i]^T = −Θ(p_i)   (17)
Taking v_i as the z axis, establish the pixel local coordinate system l_i. In the transformation between the w frame and the l_i frame, define the rotation angles about the x axis and the y axis as ω_xi and ω_yi respectively, with the rotation angle about the z axis equal to 0. The coordinate transformation matrix from the l_i frame to the w frame is then:
C_{l_i}^w = \begin{bmatrix} \cos\omega_{yi} & 0 & \sin\omega_{yi} \\ \sin\omega_{yi}\sin\omega_{xi} & \cos\omega_{xi} & -\cos\omega_{yi}\sin\omega_{xi} \\ -\sin\omega_{yi}\cos\omega_{xi} & \sin\omega_{xi} & \cos\omega_{yi}\cos\omega_{xi} \end{bmatrix}
Combining with formula (17), the trigonometric functions of ω_xi and ω_yi can be expressed as:
sin ω_yi = a_i,  cos ω_yi = √(b_i² + c_i²),  sin ω_xi = −b_i / √(b_i² + c_i²),  cos ω_xi = c_i / √(b_i² + c_i²)
Thus, for any non-zero pixel p_i on the image, the pixel local coordinate system l_i, whose z axis is the corresponding observation vector v_i, has the coordinate transformation matrix C_{l_i}^w to the camera world coordinate system given above.
further, the step (4) is to use a non-zero pixel piDetermining a cone set by using the included angle radius r, the redundancy radius delta alpha and the radius stepping delta calculated in the step (2) and taking all generatrix vectors g of the cone set as an axisiReverting to the pixel coordinate system, at each generatrix vector giqVoting in a three-dimensional Hough space H (m, n, r) of corresponding pixel coordinates, whereinTo obtain non-zero pixel point piThe three-dimensional Hough space with the circle center is specifically realized as follows:
establishing random non-zero pixel point piCorresponding observation vector viTaking the included angle radius r as a cone angle half-angle cone as an axis, the circumferential step is tau, and one generatrix vector g of the coneiqIn liIs represented as follows:
whereinThen the generatrix vector giqCan be calculated from the following formula under the w system:
then, the model of the camera lens is used to calculateImaging of directionally incident light on an imagePixel coordinates are as follows:
[miq,niq]=round(Θ-1(-gi w)) (24)
where round () denotes rounding all the elements of () when point (m) is in the corresponding three-dimensional hough space H (m, n, r)iq,niqAnd r) voting once, namely accumulating the point value by 1, traversing q and r to obtain a non-zero pixel point piA three-dimensional Hough space as a circle center.
Further, in step (5), the operations of steps (3) and (4) are performed for all non-zero pixels; the pixel with the highest vote count in the three-dimensional Hough space H(m, n, r) is the circle center of the Snell window; the observation vector of the circle-center pixel coordinates in the world coordinate system is solved, and the horizontal attitude is inverted. This is implemented as follows:
Perform the operations of step (3) and step (4) for all non-zero pixels of the binary edge image, accumulating the votes in the same three-dimensional Hough space H(m, n, r), and select the point (m_0, n_0, r_0) corresponding to the maximum value of H(m, n, r), i.e. (m_0, n_0, r_0) = argmax H(m, n, r). Then [m_0, n_0] are the coordinates of the Snell window center in the pixel coordinate system, and r_0 is the angular radius of the Snell window, i.e., the actual angle between the edge of the Snell window and its center.
The observation vector ζ_w of the Snell window center pixel in the w frame is computed from the camera lens model as:
ζ_w = Θ([m_0, n_0])   (25)
The horizontal attitude of the carrier is determined by the roll angle γ and the pitch angle θ. Without considering the heading angle, the coordinate transformation matrix between the carrier coordinate system (the b frame) and the navigation coordinate system (the n frame) is:
C_b^n = \begin{bmatrix} \cos\gamma & 0 & \sin\gamma \\ \sin\gamma\sin\theta & \cos\theta & -\cos\gamma\sin\theta \\ -\sin\gamma\cos\theta & \sin\theta & \cos\gamma\cos\theta \end{bmatrix}
Because the camera world coordinate system (the w frame) and the carrier coordinate system (the b frame) coincide exactly, ζ_w = ζ_b. Since the Snell window center points vertically upward in the n frame, ζ_n = [0 0 1]^T, and from the coordinate transformation relation:
ζ_b = (C_b^n)^T ζ_n
Solving this gives ζ_w = [−cosθ·sinγ  sinθ  cosθ·cosγ]^T, from which the carrier pitch angle θ and roll angle γ are calculated as:
θ = arcsin(ζ_w(2)),  γ = −arctan(ζ_w(1) / ζ_w(3))
where ζ_w(·) denotes the ·-th element of the vector ζ_w. The horizontal attitude is thus determined.
Compared with the prior art, the invention has the following advantages: existing underwater attitude determination mostly relies on combining inertial navigation with other navigation modes, or on other navigation modes alone, and suffers from error accumulation or a lack of autonomy. The Snell-window-based horizontal attitude determination method presented here solves spatial information from optical information available in nature; it acquires absolute horizontal attitude information of the carrier in real time and fully autonomously, has no error accumulation, and offers a new approach to underwater autonomous navigation.
Drawings
FIG. 1 is a flow chart of a horizontal attitude determination method based on underwater Snell window edge identification according to the present invention;
FIG. 2 is a schematic view of a Snell window;
fig. 3 is a schematic diagram of the spatial transformation relationship of the coordinate systems according to the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be described clearly and completely with reference to the accompanying drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, rather than all embodiments, and all other embodiments obtained by a person skilled in the art based on the embodiments of the present invention belong to the protection scope of the present invention without creative efforts.
As shown in fig. 1, the horizontal attitude determination method based on underwater snell window edge identification of the present invention specifically includes the following steps:
step 1, carrying out noise filtering and binarization processing on the acquired image, obtaining an edge image according to light intensity information in the image, and extracting coordinates [ m, n ] of a non-zero pixel p in the edge image in a pixel coordinate system]. Where any non-zero pixel at any point is denoted as piWith the coordinate of [ mi,ni]And i is 1,2, …, s, s is the number of nonzero pixels.
Step 2: compute the set of candidate values of the angular radius r, where r is a possible value of the angular radius of the Snell window in the field of view, establish a three-dimensional Hough space H(m, n, r), and set all its elements to zero. The angular radius r is initialized as follows:
As shown in fig. 2, the field of view of the whole sky, which in air spans approximately the celestial hemisphere, is compressed by refraction at the water surface into an inverted cone with the underwater observer at its apex, the base of the cone being a circular region on the water surface. In a sky image taken by an underwater observer, this circular region is brighter than the region outside the circle. According to Snell's law of refraction and the lens model, determine the angle between the edge of the window and its center, take this angle as the angular radius in the Hough circle transform, and determine the redundancy range and the step of the angular radius. Let the refractive indices of air and water be n_a and n_w respectively; then the angle α between the edge and the center of the Snell window satisfies:
sin α = n_a / n_w, i.e. α = arcsin(n_a / n_w)
Because of camera calibration errors and uncertainty in the refractive indices of the media, the imaged size of the Snell window does not strictly follow the theoretical model above in practice. A redundancy radius Δα is therefore set according to the actual water environment and the lens model, and the angular radius r takes values in [α − Δα, α + Δα]. With the angular radius step set to δ, the set of candidate values of r is:
r∈{α-Δα,α-Δα+δ,α-Δα+2δ,…,α-Δα+kδ}(α-Δα+kδ≤α+Δα,k∈N+)
and establishing a three-dimensional Hough space H (m, n, r), wherein (m, n) is pixel coordinates, and initial values of the three-dimensional Hough space are all set to be 0.
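A minimal sketch of step 2 follows; the refractive indices n_a = 1.000 and n_w = 1.333 and the values of Δα and δ are illustrative assumptions, not values specified by the patent.

```python
import numpy as np

N_AIR, N_WATER = 1.000, 1.333                             # assumed refractive indices of air and water

def init_hough_space(image_width, image_height,
                     d_alpha=np.deg2rad(2.0), delta=np.deg2rad(0.2)):
    """Build the candidate angular radii r in [alpha - d_alpha, alpha + d_alpha] and an
    all-zero accumulator H(m, n, r)."""
    alpha = np.arcsin(N_AIR / N_WATER)                    # theoretical half-angle, about 48.6 degrees
    radii = np.arange(alpha - d_alpha, alpha + d_alpha + 1e-9, delta)
    H = np.zeros((image_width, image_height, len(radii)), dtype=np.int32)
    return radii, H
```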
Step 3: using the edge image binarized in step 1, invert each non-zero pixel p_i through the camera lens model Θ to obtain its observation vector in the world coordinate system (the w frame), take this observation vector as the z axis to establish the pixel local coordinate system (the l_i frame), and solve the coordinate transformation matrix C_{l_i}^w between the l_i frame and the w frame for the non-zero pixel p_i. This is implemented as follows:
As shown in fig. 3, a non-zero pixel p_i has coordinates [m_i, n_i] in the pixel coordinate system. The camera lens model Θ(p) is obtained by camera lens calibration. The propagation vector κ_i of the incident ray corresponding to pixel p_i, expressed in the w frame, is:
κ_i = Θ(p_i)   (30)
κ_i is a unit vector. The observation vector v_i, which points in the opposite direction, is expressed in the w frame as v_i = [a_i b_i c_i]^T; then:
[a_i b_i c_i]^T = −Θ(p_i)   (31)
Taking v_i as the z axis, establish the pixel local coordinate system l_i. In the transformation between the w frame and the l_i frame, define the rotation angles about the x axis and the y axis as ω_xi and ω_yi respectively, with the rotation angle about the z axis equal to 0; the transformation between the w frame and the l_i frame can then be expressed through the 3-1-2 Euler angles (0, ω_xi, ω_yi), and the coordinate transformation matrix from the l_i frame to the w frame is:
C_{l_i}^w = \begin{bmatrix} \cos\omega_{yi} & 0 & \sin\omega_{yi} \\ \sin\omega_{yi}\sin\omega_{xi} & \cos\omega_{xi} & -\cos\omega_{yi}\sin\omega_{xi} \\ -\sin\omega_{yi}\cos\omega_{xi} & \sin\omega_{xi} & \cos\omega_{yi}\cos\omega_{xi} \end{bmatrix}
Combining with formula (31), the trigonometric functions of ω_xi and ω_yi can be expressed as:
sin ω_yi = a_i,  cos ω_yi = √(b_i² + c_i²),  sin ω_xi = −b_i / √(b_i² + c_i²),  cos ω_xi = c_i / √(b_i² + c_i²)
Thus, for any non-zero pixel p_i on the image, the pixel local coordinate system l_i, whose z axis is the corresponding observation vector v_i, has the coordinate transformation matrix C_{l_i}^w to the camera world coordinate system given above.
step 4, a non-zero pixel p is usediDetermining a cone set by using the included angle radius r, the redundancy radius delta alpha and the radius stepping delta calculated in the step (2) and taking all generatrix vectors g of the cone set as an axisiReverting to the pixel coordinate system, at each generatrix vector giqVoting in a three-dimensional Hough space H (m, n, r) of corresponding pixel coordinates, whereinTo obtain non-zero pixel point piThe three-dimensional Hough space with the circle center is specifically realized as follows:
establishing a non-zero pixel point piCorresponding observation vector viThe included angle radius r is used as a cone of a cone angle half angle. The circumference is stepped by τ. One of the generatrix vectors g of the coneiqIn liIs represented as follows:
whereinThen the generatrix vector giqCan be calculated from the following formula under the w system:
then, the model of the camera lens is used to calculateImaging pixel coordinates on the image of the incident ray in the direction:
[miq,niq]=round(Θ-1(-gi w)) (39)
where round () denotes rounding all the elements of () to the whole. At this time at corresponding threeMiddle point (m) in dimension Hough space H (m, n, r)iq,niqAnd r) voting once, namely accumulating 1 by the point value. Traversing q and r to obtain non-zero pixel point piA three-dimensional Hough space as a circle center.
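A sketch of the voting loop of step 4 follows, reusing lens_model, inverse_lens_model and rotation_li_to_w from the previous sketch; the circumferential step τ is an assumed value.

```python
import numpy as np

def vote_for_pixel(H, pixel, radii, K, tau=np.deg2rad(1.0)):
    """Cast votes for one edge pixel p_i: for each candidate angular radius r, sweep the cone of
    half-angle r about the pixel's observation vector and vote at the re-projected pixel coordinates."""
    v = -lens_model(pixel, K)                             # observation vector v_i in the w frame
    C_li_w = rotation_li_to_w(v)
    width, height, _ = H.shape
    for r_idx, r in enumerate(radii):
        for q_tau in np.arange(0.0, 2.0 * np.pi, tau):    # q * tau sweeps the circumference
            g_l = np.array([np.sin(r) * np.cos(q_tau),    # generatrix in the l_i frame
                            np.sin(r) * np.sin(q_tau),
                            np.cos(r)])
            g_w = C_li_w @ g_l                            # same generatrix in the w frame
            m, n = np.rint(inverse_lens_model(-g_w, K)).astype(int)
            if 0 <= m < width and 0 <= n < height:
                H[m, n, r_idx] += 1                       # one vote at (m_iq, n_iq, r)
```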
Step 5: perform the operations of step 3 and step 4 for all non-zero pixels; the pixel with the highest vote count in the three-dimensional Hough space H(m, n, r) is the circle center of the Snell window; compute the observation vector of the circle-center pixel coordinates in the world coordinate system and invert the horizontal attitude. This is implemented as follows:
Perform the operations of step 3 and step 4 for all non-zero pixels of the binary edge image, accumulating the votes in the same three-dimensional Hough space H(m, n, r). The circles centered on the non-zero pixels lying on the Snell window edge intersect at the Snell window center, so the coordinates of the Snell window center receive the most votes in the three-dimensional Hough space. Therefore, select the point (m_0, n_0, r_0) corresponding to the maximum value of H(m, n, r), i.e. (m_0, n_0, r_0) = argmax H(m, n, r). Then [m_0, n_0] are the coordinates of the Snell window center in the pixel coordinate system, and r_0 is the angular radius of the Snell window, i.e., the actual angle between the edge of the Snell window and its center.
The observation vector ζ_w of the Snell window center pixel in the w frame is computed from the camera lens model as:
ζ_w = Θ([m_0, n_0])   (40)
The horizontal attitude of the carrier is determined by the roll angle γ and the pitch angle θ. Without considering the heading angle, the coordinate transformation matrix between the carrier coordinate system (the b frame) and the navigation coordinate system (the n frame) is:
C_b^n = \begin{bmatrix} \cos\gamma & 0 & \sin\gamma \\ \sin\gamma\sin\theta & \cos\theta & -\cos\gamma\sin\theta \\ -\sin\gamma\cos\theta & \sin\theta & \cos\gamma\cos\theta \end{bmatrix}
Because the camera world coordinate system (the w frame) and the carrier coordinate system (the b frame) coincide exactly, ζ_w = ζ_b. Since the Snell window center points vertically upward in the n frame, ζ_n = [0 0 1]^T, and from the coordinate transformation relation:
ζ_b = (C_b^n)^T ζ_n
Solving this gives ζ_w = [−cosθ·sinγ  sinθ  cosθ·cosγ]^T, from which the carrier pitch angle θ and roll angle γ are calculated as:
θ = arcsin(ζ_w(2)),  γ = −arctan(ζ_w(1) / ζ_w(3))
where ζ_w(·) denotes the ·-th element of the vector ζ_w. The horizontal attitude is thus determined.
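The following sketch illustrates step 5, again reusing lens_model from the earlier sketch; note that with the sign convention of that stand-in lens model the zenith observation vector of the center pixel is −Θ([m_0, n_0]).

```python
import numpy as np

def solve_horizontal_attitude(H, radii, K):
    """Take the accumulator maximum as the Snell window center, recover its observation vector
    (the zenith direction in the w frame) and invert pitch and roll from
    zeta_w = [-cos(theta)*sin(gamma), sin(theta), cos(theta)*cos(gamma)]^T."""
    m0, n0, r_idx = np.unravel_index(np.argmax(H), H.shape)
    zeta_w = -lens_model((m0, n0), K)                     # observation vector of the center pixel
    pitch = np.arcsin(np.clip(zeta_w[1], -1.0, 1.0))      # theta = arcsin(zeta_w(2))
    roll = np.arctan2(-zeta_w[0], zeta_w[2])              # gamma = arctan(-zeta_w(1) / zeta_w(3))
    return (m0, n0), radii[r_idx], pitch, roll
```

In use, init_hough_space, extract_edge_pixels and vote_for_pixel from the earlier sketches would be chained before this call; the accumulator maximum then yields both the window center [m_0, n_0] and the measured angular radius r_0.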
Although illustrative embodiments of the present invention have been described above to facilitate the understanding of the present invention by those skilled in the art, it should be understood that the present invention is not limited to the scope of the embodiments, but various changes may be apparent to those skilled in the art, and it is intended that all inventive concepts utilizing the inventive concepts set forth herein be protected without departing from the spirit and scope of the present invention as defined and limited by the appended claims.
Claims (5)
1. A horizontal attitude determination method based on underwater Snell window edge identification is characterized by comprising the following steps:
step (1): noise-filtering and binarizing the acquired image, obtaining an edge image from the light intensity information in the image, and extracting the coordinates [m, n] of the non-zero pixels of the edge image in the pixel coordinate system, where an arbitrary non-zero pixel is denoted p_i with coordinates [m_i, n_i], i = 1, 2, …, s, and s is the number of non-zero pixels;
step (2): computing the set of candidate values of the angular radius r, where r is a possible value of the angular radius of the Snell window in the field of view, establishing a three-dimensional Hough space H(m, n, r), and setting all its elements to zero;
step (3): using the edge image binarized in step (1), inverting each non-zero pixel p_i through the camera lens model Θ to obtain its observation vector in the world coordinate system (the w frame), taking this observation vector as the z axis to establish the pixel local coordinate system (the l_i frame), and solving the coordinate transformation matrix C_{l_i}^w between the l_i frame and the w frame for the non-zero pixel p_i;
step (4): with the observation vector of the non-zero pixel p_i as the axis, determining a set of cones from the angular radius r, the redundancy radius Δα and the radius step δ computed in step (2), mapping all generatrix vectors g_iq of the cones back to the pixel coordinate system, and voting in the three-dimensional Hough space H(m, n, r) at the pixel coordinates corresponding to each generatrix vector g_iq, where q = 1, 2, …, 2π/τ and τ is the circumferential step, so as to obtain the three-dimensional Hough space with the non-zero pixel p_i as circle center;
step (5): performing the operations of step (3) and step (4) for all non-zero pixels, wherein the pixel with the highest vote count in the three-dimensional Hough space H(m, n, r) is the circle center of the Snell window, computing the observation vector of the circle-center pixel coordinates in the world coordinate system, and inverting the horizontal attitude.
2. The horizontal attitude determination method based on underwater snell window edge identification as claimed in claim 1, wherein:
in step (2), the set of candidate values of the angular radius r (r being a possible value of the angular radius of the Snell window in the field of view) is computed, a three-dimensional Hough space H(m, n, r) is established, and all its elements are set to zero, implemented as follows:
according to Snell's law of refraction and the lens model, the angle between the edge of the window and its center is determined, this angle is taken as the angular radius in the Hough circle transform, and the redundancy range and the step of the angular radius are determined; letting the refractive indices of air and water be n_a and n_w respectively, the angle α between the edge and the center of the Snell window satisfies:
sin α = n_a / n_w, i.e. α = arcsin(n_a / n_w)
because of camera calibration errors and uncertainty in the refractive indices of the media, the imaged size of the Snell window does not strictly follow the theoretical model above in practice; a redundancy radius Δα is therefore set according to the actual water environment and the lens model, and the angular radius r takes values in [α − Δα, α + Δα]; with the angular radius step set to δ, the set of candidate values of r is:
r∈{α-Δα,α-Δα+δ,α-Δα+2δ,…,α-Δα+kδ}(α-Δα+kδ≤α+Δα,k∈N+)
and establishing a three-dimensional Hough space H (m, n, r), wherein (m, n) is pixel coordinates, and initial values of the three-dimensional Hough space are all set to be 0.
3. The horizontal attitude determination method based on underwater snell window edge identification as claimed in claim 1, wherein:
in step (3), each non-zero pixel p_i of the edge image binarized in step (1) is inverted through the camera lens model Θ to obtain its observation vector in the world coordinate system (the w frame), this observation vector is taken as the z axis to establish the pixel local coordinate system (the l_i frame), and the coordinate transformation matrix C_{l_i}^w between the l_i frame and the w frame is solved for the non-zero pixel p_i, implemented as follows:
in the pixel coordinate system, an arbitrary non-zero pixel p_i has coordinates [m_i, n_i]; the camera lens model Θ(p) is obtained by camera lens calibration, and the propagation vector κ_i of the incident ray corresponding to pixel p_i, expressed in the w frame, is:
κ_i = Θ(p_i)   (2)
κ_i is a unit vector; the observation vector v_i, which points in the opposite direction, is expressed in the w frame as v_i = [a_i b_i c_i]^T; then:
[a_i b_i c_i]^T = −Θ(p_i)   (3)
taking v_i as the z axis, the pixel local coordinate system l_i is established; in the transformation between the w frame and the l_i frame, the rotation angles about the x axis and the y axis are defined as ω_xi and ω_yi respectively, with the rotation angle about the z axis equal to 0, yielding the coordinate transformation matrix from the l_i frame to the w frame:
C_{l_i}^w = \begin{bmatrix} \cos\omega_{yi} & 0 & \sin\omega_{yi} \\ \sin\omega_{yi}\sin\omega_{xi} & \cos\omega_{xi} & -\cos\omega_{yi}\sin\omega_{xi} \\ -\sin\omega_{yi}\cos\omega_{xi} & \sin\omega_{xi} & \cos\omega_{yi}\cos\omega_{xi} \end{bmatrix}
combining with formula (3), the trigonometric functions of ω_xi and ω_yi are expressed as:
sin ω_yi = a_i,  cos ω_yi = √(b_i² + c_i²),  sin ω_xi = −b_i / √(b_i² + c_i²),  cos ω_xi = c_i / √(b_i² + c_i²)
thus, for any non-zero pixel p_i on the image, the pixel local coordinate system l_i, whose z axis is the corresponding observation vector v_i, has the coordinate transformation matrix C_{l_i}^w to the camera world coordinate system given above.
4. the horizontal attitude determination method based on underwater snell window edge identification as claimed in claim 1, wherein:
in step (4), with the observation vector of the non-zero pixel p_i as the axis, a set of cones is determined from the angular radius r, the redundancy radius Δα and the radius step δ computed in step (2), all generatrix vectors g_iq of the cones are mapped back to the pixel coordinate system, and a vote is cast in the three-dimensional Hough space H(m, n, r) at the pixel coordinates corresponding to each generatrix vector g_iq, where q = 1, 2, …, 2π/τ, so as to obtain the three-dimensional Hough space with the non-zero pixel p_i as circle center, implemented as follows:
for an arbitrary non-zero pixel p_i, the cone whose axis is the corresponding observation vector v_i and whose half cone angle is the angular radius r is constructed, with circumferential step τ; one generatrix vector g_iq of the cone is expressed in the l_i frame as:
g_iq^{l_i} = [sin r · cos(qτ), sin r · sin(qτ), cos r]^T
where q = 1, 2, …, 2π/τ; the generatrix vector g_iq in the w frame is then computed as:
g_iq^w = C_{l_i}^w · g_iq^{l_i}
next, the camera lens model is used to compute the pixel coordinates at which an incident ray from the direction g_iq^w images on the image plane:
[m_iq, n_iq] = round(Θ^{-1}(−g_iq^w))
where round(·) rounds every element of its argument to the nearest integer; the point (m_iq, n_iq, r) in the corresponding three-dimensional Hough space H(m, n, r) then receives one vote, i.e., its value is incremented by 1; traversing q and r yields the three-dimensional Hough space with the non-zero pixel p_i as circle center.
5. The horizontal attitude determination method based on underwater snell window edge identification as claimed in claim 1, wherein:
in step (5), the operations of step (3) and step (4) are performed for all non-zero pixels, the pixel with the highest vote count in the three-dimensional Hough space H(m, n, r) is the circle center of the Snell window, the observation vector of the circle-center pixel coordinates in the world coordinate system is solved, and the horizontal attitude is inverted, implemented as follows:
the operations of step (3) and step (4) are performed for all non-zero pixels of the binary edge image, the votes are accumulated in the same three-dimensional Hough space H(m, n, r), and the point (m_0, n_0, r_0) corresponding to the maximum value is selected from H(m, n, r), i.e. (m_0, n_0, r_0) = argmax H(m, n, r); then [m_0, n_0] are the coordinates of the Snell window center in the pixel coordinate system, and r_0 is the angular radius of the Snell window, i.e., the actual angle between the edge of the Snell window and its center;
the observation vector ζ_w of the Snell window center pixel in the w frame is computed from the camera lens model as:
ζ_w = Θ([m_0, n_0])   (11)
the horizontal attitude of the carrier is determined by the roll angle γ and the pitch angle θ; without considering the heading angle, the coordinate transformation matrix between the carrier coordinate system (the b frame) and the navigation coordinate system (the n frame) is:
C_b^n = \begin{bmatrix} \cos\gamma & 0 & \sin\gamma \\ \sin\gamma\sin\theta & \cos\theta & -\cos\gamma\sin\theta \\ -\sin\gamma\cos\theta & \sin\theta & \cos\gamma\cos\theta \end{bmatrix}
since the camera world coordinate system (the w frame) and the carrier coordinate system (the b frame) coincide exactly, ζ_w = ζ_b; since the Snell window center points vertically upward in the n frame, ζ_n = [0 0 1]^T, and from the coordinate transformation relation ζ_b = (C_b^n)^T ζ_n, solving gives ζ_w = [−cosθ·sinγ  sinθ  cosθ·cosγ]^T, from which the carrier pitch angle θ and roll angle γ are calculated as:
θ = arcsin(ζ_w(2)),  γ = −arctan(ζ_w(1) / ζ_w(3))
where ζ_w(·) denotes the ·-th element of the vector ζ_w; the horizontal attitude is thus determined.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202011307276.6A CN112419410B (en) | 2020-11-20 | 2020-11-20 | Horizontal attitude determination method based on underwater Snell window edge identification |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202011307276.6A CN112419410B (en) | 2020-11-20 | 2020-11-20 | Horizontal attitude determination method based on underwater Snell window edge identification |
Publications (2)
Publication Number | Publication Date |
---|---|
CN112419410A true CN112419410A (en) | 2021-02-26 |
CN112419410B CN112419410B (en) | 2021-10-19 |
Family
ID=74774315
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202011307276.6A Active CN112419410B (en) | 2020-11-20 | 2020-11-20 | Horizontal attitude determination method based on underwater Snell window edge identification |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN112419410B (en) |
Patent Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN106225668A (en) * | 2016-07-27 | 2016-12-14 | 大连理工大学 | Wind-tunnel missile high speed pose measuring methods based on many refraction models |
CN107767420A (en) * | 2017-08-16 | 2018-03-06 | 华中科技大学无锡研究院 | A kind of scaling method of underwater stereoscopic vision system |
CN108387206A (en) * | 2018-01-23 | 2018-08-10 | 北京航空航天大学 | A kind of carrier three-dimensional attitude acquisition method based on horizon and polarised light |
CN108535715A (en) * | 2018-04-12 | 2018-09-14 | 西安应用光学研究所 | A kind of seen suitable for airborne photoelectric takes aim at object localization method under the atmospheric refraction of system |
US20200225744A1 (en) * | 2018-12-17 | 2020-07-16 | Tobii Ab | Gaze tracking via tracing of light paths |
CN111220150A (en) * | 2019-12-09 | 2020-06-02 | 北京航空航天大学 | Sun vector calculation method based on underwater polarization distribution mode |
Non-Patent Citations (2)
Title |
---|
SHAI SABBAH et al.: "Experimental and theoretical study of skylight polarization transmitted through Snell's window of a flat water surface", Journal of the Optical Society of America A: Optics, Image Science, and Vision *
ZHU Xiaochen et al.: "Research on a precise sound velocity correction model for multibeam hydrographic surveying", Hydrographic Surveying and Charting *
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN114877898A (en) * | 2022-07-11 | 2022-08-09 | 北京航空航天大学 | Sun dynamic tracking method based on underwater polarization attitude and refraction coupling inversion |
CN114877898B (en) * | 2022-07-11 | 2022-10-18 | 北京航空航天大学 | Sun dynamic tracking method based on underwater polarization attitude and refraction coupling inversion |
Also Published As
Publication number | Publication date |
---|---|
CN112419410B (en) | 2021-10-19 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN110926474B (en) | Satellite/vision/laser combined urban canyon environment UAV positioning and navigation method | |
US9285460B2 (en) | Method and system for estimating information related to a vehicle pitch and/or roll angle | |
CN103822635B (en) | The unmanned plane during flying spatial location real-time computing technique of view-based access control model information | |
CN111220150B (en) | Sun vector calculation method based on underwater polarization distribution mode | |
CN105841688B (en) | A kind of ship auxiliary anchors alongside the shore method and system | |
CN102353377B (en) | High altitude long endurance unmanned aerial vehicle integrated navigation system and navigating and positioning method thereof | |
CN104835115A (en) | Imaging method for aerial camera, and system thereof | |
CN103398710B (en) | Entering and leaving port, naval vessel navigational system under a kind of night fog sky condition and construction method thereof | |
CN111412916B (en) | Astronomical navigation ship position calculation method based on atmospheric polarized light field | |
CN111307139A (en) | Course and attitude determination method based on polarization/astronomical information fusion | |
CN110487266A (en) | A kind of airborne photoelectric passive high-precision localization method suitable for sea-surface target | |
CN104618689A (en) | Method and system for monitoring offshore oil spillage based on UAV | |
CN106767822A (en) | Indoor locating system and method based on camera communication with framing technology | |
CN114877898B (en) | Sun dynamic tracking method based on underwater polarization attitude and refraction coupling inversion | |
CN112419410B (en) | Horizontal attitude determination method based on underwater Snell window edge identification | |
CN105021190A (en) | Anti-satellite navigation fraud method and unmanned system based on the method | |
Chen et al. | Camera geolocation from mountain images | |
Hu et al. | Underwater downwelling radiance fields enable three-dimensional attitude and heading determination | |
CN110887477B (en) | Autonomous positioning method based on north polarization pole and polarized sun vector | |
CN114937075B (en) | Underwater polarized light field autonomous orientation method based on three-dimensional solar meridian plane fitting | |
CN116894936B (en) | Unmanned aerial vehicle vision-based marine target identification and positioning method and system | |
CN107941220B (en) | Unmanned ship sea antenna detection and navigation method and system based on vision | |
Li et al. | Adaptively robust filtering algorithm for maritime celestial navigation | |
Pfeiffer et al. | Detecting beach litter in drone images using deep learning | |
CN110887475B (en) | Static base rough alignment method based on north polarization pole and polarized solar vector |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||
GR01 | Patent grant |