CN113569647B - AIS-based ship high-precision coordinate mapping method - Google Patents

AIS-based ship high-precision coordinate mapping method

Info

Publication number
CN113569647B
CN113569647B, CN202110725329.4A, CN202110725329A
Authority
CN
China
Prior art keywords
camera
coordinates
ship
longitude
picture
Prior art date
Legal status
Active
Application number
CN202110725329.4A
Other languages
Chinese (zh)
Other versions
CN113569647A (en)
Inventor
梁华 (Liang Hua)
李晓威 (Li Xiaowei)
Current Assignee
Guangzhou Fu'an Digital Technology Co., Ltd.
Original Assignee
Guangzhou Fu'an Digital Technology Co., Ltd.
Priority date
Filing date
Publication date
Application filed by Guangzhou Fu'an Digital Technology Co., Ltd.
Priority to CN202110725329.4A
Publication of CN113569647A
Application granted granted Critical
Publication of CN113569647B


Classifications

    • G06F16/29 — Information retrieval; geographical information databases
    • G06F17/15 — Complex mathematical operations; correlation function computation including computation of convolution operations
    • G06F17/16 — Complex mathematical operations; matrix or vector computation, e.g. matrix-matrix or matrix-vector multiplication, matrix factorization
    • G06F18/22 — Pattern recognition; matching criteria, e.g. proximity measures
    • G06T7/80 — Image analysis; analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • Y02A90/30 — Technologies having an indirect contribution to adaptation to climate change; assessment of water resources

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Data Mining & Analysis (AREA)
  • Mathematical Physics (AREA)
  • General Engineering & Computer Science (AREA)
  • Computational Mathematics (AREA)
  • Mathematical Analysis (AREA)
  • Mathematical Optimization (AREA)
  • Pure & Applied Mathematics (AREA)
  • Databases & Information Systems (AREA)
  • Computing Systems (AREA)
  • Algebra (AREA)
  • Software Systems (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Artificial Intelligence (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Evolutionary Biology (AREA)
  • Evolutionary Computation (AREA)
  • Remote Sensing (AREA)
  • Image Processing (AREA)

Abstract

The invention discloses an AIS-based ship high-precision coordinate mapping method, which relates to coordinate mapping in the ship target detection process. The method aligns monitoring camera picture coordinates with AIS longitude and latitude coordinates to establish a low-precision mapping relation between them. The AIS information of all ships within the camera view is obtained once every Δt, the positions of all ships in the camera picture are detected through an image target detection algorithm to obtain the camera picture coordinate set Ω = {<x_i, y_i>} of all ships, and at the same time the longitude and latitude coordinates of the ships are converted into camera picture coordinates through the low-precision mapping relation to obtain the converted coordinate set K = {<x_j, y_j>}. By setting a threshold, Ω and K are matched so that accurate camera picture coordinates correspond one-to-one with longitude and latitude coordinates, achieving a higher-precision coordinate mapping relation and improving the accuracy of coordinate mapping.

Description

AIS-based ship high-precision coordinate mapping method
Technical Field
The invention relates to the field of ship target detection, in particular to a ship high-precision coordinate mapping method based on AIS.
Background
Automatic video analysis for detecting surface ships and extracting their position, size and appearance characteristics is a necessary stage in automatic ship identity recognition. Because it works in real time and requires no manual intervention, it can greatly reduce the labor cost of maritime management, making it an important means of information acquisition in fields such as ship traffic and port management.
However, video-based ship target detection is easily affected by factors such as illumination, weather conditions and occlusion, which reduces detection accuracy. Methods based on manual annotation usually require a large amount of labeling of ship images at the deployment site, followed by incremental training of the image detection algorithm, to improve detection accuracy. To apply specific geographic position information to the video, coordinate mapping is generally needed, but the accuracy of existing coordinate mapping is low.
In recent years, some ship detection methods combining AIS and surveillance video have been proposed, for example:
The invention patent with publication number CN111914049A discloses a method for mapping longitude and latitude coordinates to image coordinates, in which several points in the scene are selected, their longitude and latitude coordinates in physical space and their pixel coordinates in the image are calibrated using known measurement data of the scene or tools such as Google Maps, and the image pixel coordinates and physical coordinates are associated manually. However, because the manually calibrated coordinate values contain errors in practical application, the resulting image coordinates generally contain errors as well, and the process is time-consuming and labor-intensive.
Disclosure of Invention
To solve the above technical problems, the invention aims to provide an AIS-based ship high-precision coordinate mapping method. The method aligns monitoring camera picture coordinates with AIS longitude and latitude coordinates to establish a low-precision mapping relation between them; obtains the AIS information of all ships within the camera view once every Δt and detects the positions of all ships in the camera picture through an image target detection algorithm to obtain the camera picture coordinate set Ω = {<x_i, y_i>} of all ships, while converting the longitude and latitude coordinates of the ships into camera picture coordinates through the low-precision mapping relation to obtain the converted coordinate set K = {<x_j, y_j>}; and then, by setting a threshold, matches Ω and K so that accurate camera picture coordinates correspond one-to-one with longitude and latitude coordinates, achieving a higher-precision coordinate mapping relation.
In order to achieve the above purpose, the technical scheme adopted by the invention is as follows:
a ship high-precision coordinate mapping method based on AIS comprises the following steps:
Step S1: manually measure the external parameters of the camera, namely the position of the camera coordinate system in the world coordinate system, and coarsely calibrate any position of the water surface within the visible range of the camera; align the camera picture coordinates with the AIS longitude and latitude coordinates and establish a low-precision mapping relation between them; because this mapping relation is of low precision, a high-precision mapping relation needs to be further established;
Step S2: detect the positions of all ships in the camera picture through an image target detection algorithm to obtain the camera picture coordinate set Ω = {<x_i, y_i>} of all ships, and convert the longitude and latitude coordinates of all ships into camera picture coordinates using the low-precision mapping relation of step S1 to obtain the converted coordinate set K = {<x_j, y_j>};
Step S3: set a threshold, match the ship camera picture coordinate set Ω obtained by the image target detection algorithm in step S2 with the ship camera picture coordinate set K obtained by conversion through the low-precision mapping relation, and screen out the camera picture coordinates of ships meeting the condition, so that the longitude and latitude coordinates of each ship correspond one-to-one with its camera picture coordinates, determining the position information set M = {<lon_i, lat_i, x_i, y_i>} of ship longitude and latitude coordinates and camera picture coordinates;
Step S4: use the position information set M = {<lon_i, lat_i, x_i, y_i>} of step S3 to establish the high-precision coordinate mapping of the ship.
Preferably, in step S1, the process of establishing the low-precision mapping relation between the camera picture coordinates and the AIS longitude and latitude coordinates specifically includes the following steps:
Step 1.1: according to the Haversine formula, calculate the straight-line horizontal distance d_i (in m) between O', the vertical projection of the camera position onto the horizontal plane, and an arbitrary water-surface position A_i within the visible range of the camera, as well as the longitudinal horizontal distance s_i (in m) between O' and A_i:
where a and b are intermediate variable values, O'(λ_0, ψ_0) is the vertical projection position of the camera on the horizontal plane, A_i(λ_i, ψ_i) is an arbitrary water-surface position within the visible range of the camera, and r is the earth radius in m;
Step 1.2: from step 1.1, calculate the included angle β_i between the line connecting O' and A_i and the geographic true-north direction;
Step 1.3: from step 1.1, O and A are calculated i Included angle theta between the line connecting the line and the vertical line i
Wherein H is the height of the camera from the horizontal plane, and the unit is m;
Step 1.4: calculate the monitoring-camera picture coordinates (x_i, y_i) of A_i,
where X is the pixel width of the image and Y is the pixel height (the values of X and Y follow from the camera image resolution being X × Y), θ is the angle between the camera center line and the vertical line, β is the angle between the projection of the camera center line on the horizontal plane and the geographic true-north direction, ω_x is the horizontal field angle of the camera, and ω_y is the vertical field angle of the camera.
Preferably, in step S2, one frame of image is acquired every Δt, the AIS information of all ships within the camera view at that moment is acquired, the positions of all ships in the camera picture are detected by the image target detection algorithm, and n frames of images are acquired in total, giving the ship camera picture coordinate set Ω = {(x_i, y_i)}, where (x_i, y_i) represents the camera picture coordinates of the i-th ship.
Preferably, the longitude and latitude coordinates of all ships in each frame of image are converted into camera picture coordinates through the low-precision mapping relation of step S1, giving the converted camera picture coordinate set K = {(x'_j, y'_j)}, where (x'_j, y'_j) represents the camera picture coordinates of the j-th ship after longitude-latitude conversion.
Preferably, in the step S3, the processing procedure for each frame of image is as follows:
In each frame of image, take the camera picture coordinate set Ω_n = {(x_i, y_i)} of all ships detected in that frame by the image target detection algorithm and the camera picture coordinate set K_n = {(x'_j, y'_j)} obtained from the AIS after longitude-latitude coordinate conversion, where n denotes the n-th frame of image; the difference (Δx, Δy) between the ship camera picture coordinates obtained by the image target detection algorithm and those converted from the longitude and latitude coordinates is calculated as:
Δx = |x_i - x'_j|
Δy = |y_i - y'_j|
The method for determining the one-to-one correspondence between camera picture coordinates and longitude and latitude coordinates according to (Δx, Δy) is as follows:
Set a threshold (ΔX, ΔY) and screen out the qualified ship picture coordinates (x, y), i.e. those satisfying Δx < ΔX and Δy < ΔY. If, during screening, one ship picture coordinate obtained by the image target detection algorithm corresponds to several qualified camera picture coordinates converted from longitude and latitude coordinates, that ship picture coordinate (x, y) is discarded, so that each ship picture coordinate obtained by the image target detection algorithm corresponds to exactly one qualified camera picture coordinate converted from longitude and latitude coordinates. The qualified ship picture coordinates (x, y) in all frame images are screened out in turn, giving the position information set M = {<lon_i, lat_i, x_i, y_i>} of qualified ship picture coordinates and ship longitude and latitude coordinates, where (lon_i, lat_i) are the longitude and latitude coordinates before the conversion in step S1 and (x_i, y_i) are the ship camera picture coordinates obtained by the image target detection algorithm. This screening method, which enforces a one-to-one correspondence between camera picture coordinates and longitude and latitude coordinates, reduces the probability of matching longitude and latitude coordinates to the wrong camera picture coordinates.
Preferably, in step S4, the transformation matrix parameters are obtained by calculation from the position information set M = {<lon_i, lat_i, x_i, y_i>} of camera picture coordinates and longitude and latitude coordinates obtained in step S3, as follows:
Three groups of data are selected at a time from the position information set M = {<lon_i, lat_i, x_i, y_i>}, and multiple transformation matrices H_i are obtained by inverse-matrix calculation,
where (lon_i1, lat_i1), (lon_i2, lat_i2), (lon_i3, lat_i3) are three groups of ship longitude and latitude coordinates and (x_i1, y_i1), (x_i2, y_i2), (x_i3, y_i3) are the corresponding three groups of ship coordinates in the camera picture;
the average of the multiple transformation matrices H_i is then taken, which reduces the error;
the conversion relation between the longitude and latitude coordinates and the camera picture coordinates is then given by the transformation matrix H,
where (lon, lat) are the longitude and latitude coordinates of a ship and (x, y) are its coordinates in the camera picture.
Compared with the prior art, the invention has the beneficial technical effects that:
1. The invention aligns camera picture coordinates with AIS longitude and latitude coordinates to establish a low-precision mapping relation between them, and on the basis of this low-precision coordinate mapping obtains an accurate one-to-one correspondence between camera picture coordinates and AIS longitude and latitude coordinates, thereby improving the accuracy of coordinate mapping.
2. Using the high-precision coordinate mapping method, the invention can rapidly obtain both the longitude and latitude coordinates and the camera picture coordinates of any ship in the camera picture, thereby reducing measurement errors and improving ship monitoring efficiency.
Drawings
FIG. 1 is a flow chart of an AIS-based ship high-precision coordinate mapping method in an embodiment of the invention;
fig. 2 is a first schematic diagram of the method for calculating the mapping relation between AIS longitude and latitude coordinates and monitoring camera picture coordinates in the embodiment of the present invention;
fig. 3 is a second schematic diagram of the method for calculating the mapping relation between AIS longitude and latitude coordinates and monitoring camera picture coordinates in the embodiment of the present invention;
fig. 4 is a schematic diagram of the coordinate positions of the camera frames of the ship to be detected in the embodiment of the invention.
Detailed Description
For the purpose of making the objects, technical solutions and advantages of the present invention more apparent, the present invention will be further described in detail with reference to the following examples; however, the scope of the present invention is not limited to these specific examples.
Examples
Referring to fig. 1, the embodiment discloses a ship high-precision coordinate mapping method based on AIS, which comprises the following steps:
Step S1: manually measure the external parameters of the camera, namely the position of the camera coordinate system in the world coordinate system, and coarsely calibrate any position of the water surface within the visible range of the camera; align the camera picture coordinates with the AIS longitude and latitude coordinates and establish a low-precision mapping relation between them; because this mapping relation is of low precision, a high-precision mapping relation needs to be further established;
Step S2: obtain the AIS information of all ships within the camera view once every Δt, detect the positions of all ships in the camera picture through an image target detection algorithm to obtain the camera picture coordinate set Ω = {<x_i, y_i>} of all ships, and convert the longitude and latitude coordinates of all ships into camera picture coordinates using the low-precision mapping relation of step S1 to obtain the converted coordinate set K = {<x_j, y_j>};
Step S3: set a threshold, match the ship camera picture coordinate set Ω obtained by the image target detection algorithm in step S2 with the ship camera picture coordinate set K obtained by conversion through the low-precision mapping relation, and screen out the camera picture coordinates of ships meeting the condition, so that the longitude and latitude coordinates of each ship correspond one-to-one with its camera picture coordinates, determining the position information set M = {<lon_i, lat_i, x_i, y_i>} of ship longitude and latitude coordinates and camera picture coordinates;
Step S4: use the position information set M = {<lon_i, lat_i, x_i, y_i>} of step S3 to establish the high-precision coordinate mapping of the ship.
Specifically, in step S1, the process of establishing the low-precision mapping relation between the camera picture coordinates and the AIS longitude and latitude coordinates includes the following steps:
step S1.0: parameter acquisition preparation:
As shown in fig. 2 to 4 (N in fig. 3 indicates the geographic north direction), determine the height H of the camera above the horizontal plane, the included angle θ between the camera center line and the vertical line, the included angle β between the projection of the camera center line on the horizontal plane and the geographic true-north direction, the horizontal field angle ω_x of the camera and the vertical field angle ω_y of the camera, and acquire the camera image resolution X × Y (X is the pixel width of the image, Y the pixel height);
Assume that the coordinate of the center of the camera picture is (0, 0), that the vertical projection of the camera position O onto the horizontal plane is O' with longitude and latitude (λ_0, ψ_0), and that for any water-surface position A_i within the visible range of the camera, its longitude and latitude coordinates (λ_i, ψ_i) can be converted into camera picture coordinates (x_i, y_i) as follows;
Step 1.1: according to the Haverine formula, calculating the vertical projection position O' of the position of the camera on the horizontal plane and the arbitrary position A of the water surface in the visible range of the camera i Straight horizontal distance d of (2) i In the units m, O' and A i Longitude horizontal distance s of (2) i The unit is m:
wherein: a. b are all intermediate variable values, O' (lambda) 0 ,ψ 0 ) A is the vertical projection position of the camera on the horizontal plane ii ,ψ i ) The camera is positioned at any position of the water surface in the visible range, r is the earth radius, and the unit is m;
Step 1.2: from step 1.1, calculate the included angle β_i between the line connecting O' and A_i and the geographic true-north direction;
Step 1.3: from step 1.1, O and A are calculated i Included angle theta between the line connecting the line and the vertical line i
Wherein H is the height of the camera from the horizontal plane, and the unit is m;
Step 1.4: calculate the monitoring-camera picture coordinates (x_i, y_i) of A_i.
That is, the specific process of step S1 can be summarized as follows: by means of manual measurement, a low-precision mapping relation between the monitoring camera picture coordinates and the AIS longitude and latitude coordinates can be established.
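As an illustration of step S1, the following Python sketch converts a longitude/latitude position into camera picture coordinates. The patent's own formula images are not reproduced in this text, so the sketch uses the standard Haversine and forward-bearing expressions and a simple linear angle-to-pixel model for step 1.4; the function and variable names are illustrative only, not the patent's implementation.

    import math

    EARTH_RADIUS_M = 6371000.0  # r, earth radius in m

    def lonlat_to_picture(lam_i, psi_i, lam_0, psi_0,
                          H, theta, beta, omega_x, omega_y, X, Y):
        """Convert a water-surface position A_i(lam_i, psi_i) into monitoring-camera
        picture coordinates (x_i, y_i), following steps 1.1-1.4 of step S1.
        Longitudes, latitudes and angles are in radians, H is in metres, and the
        picture centre is taken as (0, 0) as assumed in step S1.0."""
        # Step 1.1 -- Haversine distance d_i between O'(lam_0, psi_0) and A_i
        a = (math.sin((psi_i - psi_0) / 2.0) ** 2
             + math.cos(psi_0) * math.cos(psi_i)
             * math.sin((lam_i - lam_0) / 2.0) ** 2)
        b = 2.0 * math.atan2(math.sqrt(a), math.sqrt(1.0 - a))
        d_i = EARTH_RADIUS_M * b

        # Step 1.2 -- angle beta_i between the line O'-A_i and geographic true north
        # (standard forward-bearing formula, used here in place of the patent's
        # derivation from d_i and the longitudinal horizontal distance s_i)
        beta_i = math.atan2(
            math.sin(lam_i - lam_0) * math.cos(psi_i),
            math.cos(psi_0) * math.sin(psi_i)
            - math.sin(psi_0) * math.cos(psi_i) * math.cos(lam_i - lam_0))

        # Step 1.3 -- angle theta_i between the line O-A_i and the vertical,
        # obtained from the camera height H and the horizontal distance d_i
        theta_i = math.atan2(d_i, H)

        # Step 1.4 -- map the angular offsets from the camera centre line (theta, beta)
        # onto the X x Y picture; a linear angle-to-pixel model is assumed here
        x_i = (beta_i - beta) * X / omega_x
        y_i = (theta_i - theta) * Y / omega_y
        return x_i, y_i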
The specific process of step S2 is as follows: acquire one frame of image every Δt, acquire the AIS information of all ships within the camera view at that moment, detect the positions of all ships in the camera picture with the image target detection algorithm, and acquire n frames of images in total to obtain the ship camera picture coordinate set Ω = {(x_i, y_i)}, where (x_i, y_i) represents the camera picture coordinates of the i-th ship. At the same time, the received AIS information of passing ships is processed in real time to obtain the ship information set {<λ_i, ψ_i>}, where <λ_i, ψ_i> represents the detected position information of the i-th ship, λ_i its longitude and ψ_i its latitude. The longitude and latitude coordinates of all ships in each frame of image are then converted into camera picture coordinates through the low-precision mapping relation of step S1, giving the converted camera picture coordinate set K = {(x'_j, y'_j)}, where (x'_j, y'_j) represents the camera picture coordinates of the j-th ship after longitude-latitude conversion.
The image target detection algorithm in step S2 is an existing one; reference may be made to the published patent CN109993163A, entitled "non-nameplate identification system based on artificial intelligence and identification method thereof". It detects each frame of image acquired at intervals of Δt from the monitoring video and obtains the ship camera picture coordinate set Ω = {(x_i, y_i)}.
In a specific implementation, AIS targets whose distance d_i from O' exceeds a certain visual-range threshold may be filtered out; the threshold is set in the range of 1 km to 10 km according to the specific parameters of the camera and the field of view at the installation position, so as to reduce the amount of computation in the subsequent matching.
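A minimal sketch of this pre-filter, reusing math and EARTH_RADIUS_M from the sketch above; the (longitude, latitude) tuple layout in radians and the 5 km default are assumptions, the default being only one example within the 1-10 km range.

    def filter_ais_targets(ais_records, lam_0, psi_0, max_range_m=5000.0):
        """Keep only AIS targets whose horizontal distance d_i from the camera
        projection O' is within the visual-range threshold (1-10 km, chosen per
        installation). Each record is assumed to be a (lon, lat) pair in radians."""
        kept = []
        for lon, lat in ais_records:
            a = (math.sin((lat - psi_0) / 2.0) ** 2
                 + math.cos(psi_0) * math.cos(lat)
                 * math.sin((lon - lam_0) / 2.0) ** 2)
            d_i = EARTH_RADIUS_M * 2.0 * math.atan2(math.sqrt(a), math.sqrt(1.0 - a))
            if d_i <= max_range_m:
                kept.append((lon, lat))
        return kept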
In the step S3, the processing procedure for each frame of image is as follows:
In each frame of image, take the camera picture coordinate set Ω_n = {(x_i, y_i)} of all ships detected in that frame by the image target detection algorithm and the camera picture coordinate set K_n = {(x'_j, y'_j)} obtained from the AIS after longitude-latitude coordinate conversion, where n denotes the n-th frame of image; the difference (Δx, Δy) between the ship camera picture coordinates obtained by the image target detection algorithm and those converted from the longitude and latitude coordinates is calculated as:
Δx = |x_i - x'_j|
Δy = |y_i - y'_j|
The method for determining the one-to-one correspondence between camera picture coordinates and longitude and latitude coordinates according to (Δx, Δy) is as follows:
Set a threshold (ΔX, ΔY) and screen out the qualified ship picture coordinates (x, y), i.e. those satisfying Δx < ΔX and Δy < ΔY. If, during screening, one ship picture coordinate obtained by the image target detection algorithm corresponds to several qualified camera picture coordinates converted from longitude and latitude coordinates, that ship picture coordinate (x, y) is discarded, so that each ship picture coordinate obtained by the image target detection algorithm corresponds to exactly one qualified camera picture coordinate converted from longitude and latitude coordinates. The qualified ship picture coordinates (x, y) in all frame images are screened out in turn, giving the position information set M = {<lon_i, lat_i, x_i, y_i>} of qualified ship picture coordinates and ship longitude and latitude coordinates, where (lon_i, lat_i) are the longitude and latitude coordinates before the conversion in step S1 and (x_i, y_i) are the ship camera picture coordinates obtained by the image target detection algorithm. This screening method, which enforces a one-to-one correspondence between camera picture coordinates and longitude and latitude coordinates, reduces the probability of matching longitude and latitude coordinates to the wrong camera picture coordinates.
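The per-frame screening of step S3 can be sketched as follows; a minimal illustration, in which the tuple layouts, the function name and the exactly-one-candidate rule used to discard ambiguous detections are assumptions consistent with the description above.

    def match_frame(detections, ais_projected, dX, dY):
        """Match detected picture coordinates (x_i, y_i) of one frame against
        AIS-converted picture coordinates (x'_j, y'_j) carrying their original
        (lon_j, lat_j), keeping only one-to-one matches within the threshold (dX, dY)."""
        matched = []  # entries <lon, lat, x, y> contributed to the set M
        for x_i, y_i in detections:
            candidates = [(lon_j, lat_j)
                          for xp_j, yp_j, lon_j, lat_j in ais_projected
                          if abs(x_i - xp_j) < dX and abs(y_i - yp_j) < dY]
            if len(candidates) == 1:  # detections with several candidates are discarded
                lon_j, lat_j = candidates[0]
                matched.append((lon_j, lat_j, x_i, y_i))
        return matched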
The specific process of step S4 is as follows: the transformation matrix parameters are obtained by calculation from the position information set M = {<lon_i, lat_i, x_i, y_i>} of camera picture coordinates and longitude and latitude coordinates obtained in step S3:
Three groups of data are selected at a time from the position information set M = {<lon_i, lat_i, x_i, y_i>}, and multiple transformation matrices H_i are obtained by inverse-matrix calculation,
where (lon_i1, lat_i1), (lon_i2, lat_i2), (lon_i3, lat_i3) are three groups of ship longitude and latitude coordinates and (x_i1, y_i1), (x_i2, y_i2), (x_i3, y_i3) are the corresponding three groups of ship coordinates in the camera picture;
the average of the multiple transformation matrices H_i is then taken, which reduces the error;
the conversion relation between the longitude and latitude coordinates and the camera picture coordinates is then given by the transformation matrix H,
where (lon, lat) are the longitude and latitude coordinates of a ship and (x, y) are its coordinates in the camera picture.
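As an illustration of step S4: the patent's matrix equations are not reproduced in this text, so the sketch below assumes an affine-style relation [x, y, 1]^T = H · [lon, lat, 1]^T solved from three correspondences by inverse-matrix calculation; the NumPy dependency, the random selection of triples and the sample count are illustrative choices.

    import numpy as np

    def estimate_transform(M, n_samples=50, seed=0):
        """Estimate the transformation matrix H from the matched set
        M = [(lon, lat, x, y), ...]: repeatedly pick three correspondences,
        solve each H_i by inverse-matrix calculation, and average the H_i."""
        rng = np.random.default_rng(seed)
        data = np.asarray(M, dtype=float)
        mats = []
        for _ in range(n_samples):
            idx = rng.choice(len(data), size=3, replace=False)
            lon, lat, x, y = data[idx, 0], data[idx, 1], data[idx, 2], data[idx, 3]
            L = np.vstack([lon, lat, np.ones(3)])  # columns are (lon_k, lat_k, 1)
            P = np.vstack([x, y, np.ones(3)])      # columns are (x_k, y_k, 1)
            if abs(np.linalg.det(L)) < 1e-12:      # skip (near-)collinear triples
                continue
            mats.append(P @ np.linalg.inv(L))      # H_i such that P = H_i @ L
        return np.mean(mats, axis=0)               # averaging the H_i reduces the error

    def lonlat_to_picture_hp(H, lon, lat):
        """Apply the averaged transform: [x, y, 1]^T ≈ H @ [lon, lat, 1]^T."""
        x, y, _ = H @ np.array([lon, lat, 1.0])
        return x, y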
Finally, the high-precision coordinate mapping relation of the ship is established through step S4, improving the accuracy of coordinate mapping, so that the longitude and latitude coordinates and the camera picture coordinates of any ship in the camera picture can be obtained quickly, thereby reducing measurement errors and improving ship monitoring efficiency.
Variations and modifications to the above would be obvious to persons skilled in the art to which the invention pertains from the foregoing description and teachings. Therefore, the invention is not limited to the specific embodiments disclosed and described above, but some modifications and changes of the invention should be also included in the scope of the claims of the invention. In addition, although specific terms are used in the present specification, these terms are for convenience of description only and do not constitute any limitation on the invention.

Claims (2)

1. The ship high-precision coordinate mapping method based on AIS is characterized by comprising the following steps:
Step S1: manually measure the external parameters of the camera and coarsely calibrate any position of the water surface within the visible range of the camera, align the camera picture coordinates with the AIS longitude and latitude coordinates, and establish a low-precision mapping relation between the camera picture coordinates and the AIS longitude and latitude coordinates;
Step S2: detect the positions of all ships in the camera picture through an image target detection algorithm to obtain the camera picture coordinate set Ω = {<x_i, y_i>} of all ships, and convert the longitude and latitude coordinates of all ships into camera picture coordinates using the low-precision mapping relation of step S1 to obtain the converted coordinate set K = {<x_j, y_j>};
Step S3: set a threshold, match the ship camera picture coordinate set Ω obtained by the image target detection algorithm in step S2 with the ship camera picture coordinate set K obtained by conversion through the low-precision mapping relation, and screen out the camera picture coordinates of ships meeting the condition, so that the longitude and latitude coordinates of each ship correspond one-to-one with its camera picture coordinates, determining the position information set M = {<lon_i, lat_i, x_i, y_i>} of ship longitude and latitude coordinates and camera picture coordinates;
Step S4: use the position information set M = {<lon_i, lat_i, x_i, y_i>} of step S3 to establish the high-precision coordinate mapping of the ship;
in step S1, the process of establishing the low-precision mapping relation between the camera picture coordinates and the AIS longitude and latitude coordinates specifically includes the following steps:
Step 1.1: according to the Haversine formula, calculate the straight-line horizontal distance d_i (in m) between O', the vertical projection of the camera position O onto the horizontal plane, and an arbitrary water-surface position A_i within the visible range of the camera, as well as the longitudinal horizontal distance s_i (in m) between O' and A_i:
where a and b are intermediate variable values, O'(λ_0, ψ_0) is the vertical projection position of the camera on the horizontal plane, A_i(λ_i, ψ_i) is an arbitrary water-surface position within the visible range of the camera, and r is the earth radius in m;
Step 1.2: from step 1.1, calculate the included angle β_i between the line connecting O' and A_i and the geographic true-north direction;
Step 1.3: from step 1.1, O and A are calculated i Included angle theta between the line connecting the line and the vertical line i
Wherein H is the height of the camera from the horizontal plane, and the unit is m;
Step 1.4: calculate the monitoring-camera picture coordinates (x_i, y_i) of A_i,
where X is the pixel width of the image and Y is the pixel height (the values of X and Y follow from the camera image resolution being X × Y), θ is the angle between the camera center line and the vertical line, β is the angle between the projection of the camera center line on the horizontal plane and the geographic true-north direction, ω_x is the horizontal field angle of the camera, and ω_y is the vertical field angle of the camera;
in step S2, one frame of image is acquired every Δt, the AIS information of all ships within the camera view at that moment is acquired, the positions of all ships in the camera picture are detected by the image target detection algorithm, and n frames of images are acquired in total, giving the ship camera picture coordinate set Ω = {(x_i, y_i)}, where (x_i, y_i) represents the camera picture coordinates of the i-th ship;
the longitude and latitude coordinates of all ships in each frame of image are converted into camera picture coordinates through the low-precision mapping relation of step S1, giving the converted ship camera picture coordinate set K = {(x'_j, y'_j)}, where (x'_j, y'_j) represents the camera picture coordinates of the j-th ship after longitude-latitude conversion;
in the step S3, the processing procedure for each frame of image is as follows:
in each frame of image, take the camera picture coordinate set Ω_n = {(x_i, y_i)} of all ships detected in that frame by the image target detection algorithm and the camera picture coordinate set K_n = {(x'_j, y'_j)} obtained from the AIS after longitude-latitude coordinate conversion, where n denotes the n-th frame of image; the difference (Δx, Δy) between the ship camera picture coordinates obtained by the image target detection algorithm and those converted from the longitude and latitude coordinates is calculated as:
Δx = |x_i - x'_j|
Δy = |y_i - y'_j|
the method for determining the one-to-one correspondence between camera picture coordinates and longitude and latitude coordinates according to (Δx, Δy) is as follows:
set a threshold (ΔX, ΔY) and screen out the qualified ship picture coordinates (x, y), i.e. those satisfying Δx < ΔX and Δy < ΔY; if, during screening, one ship picture coordinate obtained by the image target detection algorithm corresponds to several qualified camera picture coordinates converted from longitude and latitude coordinates, that ship picture coordinate (x, y) is discarded, so that each ship picture coordinate obtained by the image target detection algorithm corresponds to exactly one qualified camera picture coordinate converted from longitude and latitude coordinates; the qualified ship picture coordinates (x, y) in all frame images are screened out in turn, giving the position information set M = {<lon_i, lat_i, x_i, y_i>} of qualified ship picture coordinates and ship longitude and latitude coordinates, where (lon_i, lat_i) are the longitude and latitude coordinates before the conversion in step S1 and (x_i, y_i) are the ship camera picture coordinates obtained by the image target detection algorithm.
2. The method according to claim 1, wherein in step S4, the transformation matrix parameters are obtained by calculation from the position information set M = {<lon_i, lat_i, x_i, y_i>} of camera picture coordinates and longitude and latitude coordinates obtained in step S3, as follows:
three groups of data are selected at a time from the position information set M = {<lon_i, lat_i, x_i, y_i>}, and multiple transformation matrices H_i are obtained by inverse-matrix calculation,
where (lon_i1, lat_i1), (lon_i2, lat_i2), (lon_i3, lat_i3) are three groups of ship longitude and latitude coordinates and (x_i1, y_i1), (x_i2, y_i2), (x_i3, y_i3) are the corresponding three groups of ship coordinates in the camera picture;
the average of the multiple transformation matrices H_i is then taken;
the conversion relation between the longitude and latitude coordinates and the camera picture coordinates is then given by the transformation matrix H,
where (lon, lat) are the longitude and latitude coordinates of a ship and (x, y) are its coordinates in the camera picture.
CN202110725329.4A 2021-06-29 2021-06-29 AIS-based ship high-precision coordinate mapping method Active CN113569647B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110725329.4A CN113569647B (en) 2021-06-29 2021-06-29 AIS-based ship high-precision coordinate mapping method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110725329.4A CN113569647B (en) 2021-06-29 2021-06-29 AIS-based ship high-precision coordinate mapping method

Publications (2)

Publication Number Publication Date
CN113569647A CN113569647A (en) 2021-10-29
CN113569647B true CN113569647B (en) 2024-02-20

Family

ID=78162950

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110725329.4A Active CN113569647B (en) 2021-06-29 2021-06-29 AIS-based ship high-precision coordinate mapping method

Country Status (1)

Country Link
CN (1) CN113569647B (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114228973B (en) * 2021-12-31 2024-06-07 中国商用飞机有限责任公司 Aircraft porthole system with transparent display and display method and medium thereof
CN114926522B (en) * 2022-04-29 2024-03-15 湖北国际物流机场有限公司 AIS system and video technology-based target ship detection system and method
CN114821494B (en) * 2022-06-27 2022-10-14 杭州声飞光电技术有限公司 Ship information matching method and device

Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106408542A (en) * 2016-10-10 2017-02-15 四川大学 Rapid geometric correction method of dome visual scene
CN109460740A (en) * 2018-11-15 2019-03-12 上海埃威航空电子有限公司 The watercraft identification recognition methods merged based on AIS with video data
CN109919975A (en) * 2019-02-20 2019-06-21 中国人民解放军陆军工程大学 Wide-area monitoring moving target association method based on coordinate calibration
CN111145545A (en) * 2019-12-25 2020-05-12 西安交通大学 Road traffic behavior unmanned aerial vehicle monitoring system and method based on deep learning
CN111241988A (en) * 2020-01-08 2020-06-05 北京天睿空间科技股份有限公司 Method for detecting and identifying moving target in large scene by combining positioning information
CN111311654A (en) * 2020-02-13 2020-06-19 北京百度网讯科技有限公司 Camera position registration method and device, electronic equipment and storage medium
CN111523465A (en) * 2020-04-23 2020-08-11 中船重工鹏力(南京)大气海洋信息系统有限公司 Ship identity recognition system based on camera calibration and deep learning algorithm
CN111914049A (en) * 2020-07-29 2020-11-10 北京天睿空间科技股份有限公司 Method for mapping longitude and latitude coordinates and image coordinates
CN112598733A (en) * 2020-12-10 2021-04-02 广州市赋安电子科技有限公司 Ship detection method based on multi-mode data fusion compensation adaptive optimization
CN112836737A (en) * 2021-01-29 2021-05-25 同济大学 Roadside combined sensing equipment online calibration method based on vehicle-road data fusion
CN113012047A (en) * 2021-03-26 2021-06-22 广州市赋安电子科技有限公司 Dynamic camera coordinate mapping establishing method and device and readable storage medium


Also Published As

Publication number Publication date
CN113569647A (en) 2021-10-29

Similar Documents

Publication Publication Date Title
CN113569647B (en) AIS-based ship high-precision coordinate mapping method
CN110363158B (en) Millimeter wave radar and visual cooperative target detection and identification method based on neural network
CN112819903B (en) L-shaped calibration plate-based camera and laser radar combined calibration method
CN112598733B (en) Ship detection method based on multi-mode data fusion compensation adaptive optimization
CN110570449B (en) Positioning and mapping method based on millimeter wave radar and visual SLAM
CN111523465A (en) Ship identity recognition system based on camera calibration and deep learning algorithm
CN111241988B (en) Method for detecting and identifying moving target in large scene by combining positioning information
CN113012047B (en) Dynamic camera coordinate mapping establishing method and device and readable storage medium
CN113642463B (en) Heaven and earth multi-view alignment method for video monitoring and remote sensing images
CN113313047B (en) Lane line detection method and system based on lane structure prior
CN112348775B (en) Vehicle-mounted looking-around-based pavement pit detection system and method
CN112687127A (en) Ship positioning and snapshot method based on AIS and image analysis assistance
CN111914049A (en) Method for mapping longitude and latitude coordinates and image coordinates
CN115222819A (en) Camera self-calibration and target tracking method based on multi-mode information reference in airport large-range scene
CN115717867A (en) Bridge deformation measurement method based on airborne double cameras and target tracking
CN113223095B (en) Internal and external parameter calibration method based on known camera position
CN113850868A (en) Wave climbing image identification method
CN114120236A (en) Method for identifying and positioning low-altitude target
CN116989825A (en) Combined calibration method and system for road side laser radar-camera-UTM coordinate system
CN112255604A (en) Method and device for judging accuracy of radar data and computer equipment
CN112284509A (en) Bridge structure vibration mode measuring method based on mobile phone video
CN111207683A (en) Tunnel deformation monitoring method and device and computer readable storage medium
CN113838126B (en) Video monitoring and unmanned aerial vehicle image alignment method
CN111932642B (en) Method, device and equipment for measuring and calculating volume of structural crack and storage medium
CN110044379B (en) Calibration method of mobile measurement system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
CB02 Change of applicant information
Address after: 510000 room 1301 (Location: room 1301-1), No. 68, yueken Road, Wushan street, Tianhe District, Guangzhou City, Guangdong Province (office only)
Applicant after: Guangzhou Fu'an Digital Technology Co., Ltd.
Address before: 510000 No. 1501, 68 yueken Road, Tianhe District, Guangzhou City, Guangdong Province
Applicant before: GUANGZHOU FUAN ELECTRONIC TECHNOLOGY Co., Ltd.
GR01 Patent grant