CN111986312A - Ship track drawing method, terminal device and storage medium - Google Patents


Info

Publication number
CN111986312A
CN111986312A (application CN202010817894.9A)
Authority
CN
China
Prior art keywords: ship, image, pixel point, representing, window
Prior art date
Legal status: Granted
Application number: CN202010817894.9A
Other languages: Chinese (zh)
Other versions: CN111986312B (en)
Inventors: 曹恩广, 薛晗, 方琼林, 柴田
Current assignee: Expedition Port; Jimei University
Original assignee: Expedition Port; Jimei University
Application filed by Expedition Port and Jimei University
Priority to CN202010817894.9A
Publication of CN111986312A
Application granted
Publication of CN111986312B
Status: Expired - Fee Related
Anticipated expiration

Classifications

    • G06T 17/00: Three-dimensional [3D] modelling, e.g. data description of 3D objects
    • G06T 5/70: Image enhancement or restoration; denoising, smoothing
    • G06T 7/73: Determining position or orientation of objects or cameras using feature-based methods
    • G06T 7/80: Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • G06V 10/757: Image or video pattern matching; matching configurations of points or features
    • G06T 2207/10016: Image acquisition modality; video, image sequence


Abstract

The invention relates to a ship track drawing method, a terminal device and a storage medium, wherein the method comprises the following steps: S1: shooting an object with known coordinates by a camera device; S2: calibrating the camera device according to the shot two-dimensional image and the known coordinates of the object; S3: acquiring ship images in continuous time through the calibrated camera device, filtering the ship images through a nonlinear gradient-domain guided filtering algorithm, and extracting the ship features in each ship image; S4: converting the two-dimensional coordinates of the ship features in the image into three-dimensional coordinates in the world coordinate system according to the extracted ship features and the camera calibration data; S5: displaying the three-dimensional coordinates corresponding to the ship features in each ship image in continuous time on the chart and connecting them to form the ship track. The invention displays the longitude and latitude of the monitored ship on the chart.

Description

Ship track drawing method, terminal device and storage medium
Technical Field
The invention relates to the field of track drawing, in particular to a ship track drawing method, terminal equipment and a storage medium.
Background
In order to enhance the display function of the electronic chart, the prior art superimposes AIS information on the electronic chart for display, which helps operators react quickly and accurately according to the displayed information. However, the use of AIS equipment and ship identification codes on some fishing vessels is not standardized: illegal practices such as a ship not matching its code, one ship carrying multiple codes, and one code shared by multiple ships pose huge hazards to the safe sailing and rescue of fishing vessels, and have long been a major threat to the safety of fishermen's lives and property.
Disclosure of Invention
The invention provides a ship track drawing method, a terminal device and a storage medium, which aim to make the navigation tracks of fishing vessels, dredgers and other ships not equipped with AIS perceivable by other vessels around the fairway, so as to increase navigation safety.
The specific scheme is as follows:
a ship track drawing method comprises the following steps:
s1: shooting an object with known coordinates by a camera device;
s2: calibrating the camera device according to the shot two-dimensional image and the known coordinates of the object;
s3: acquiring ship images in continuous time through a calibrated camera device, filtering the ship images through a nonlinear gradient domain guided filtering algorithm, and extracting ship characteristics in each ship image;
s4: converting two-dimensional coordinates of the ship features in the image into three-dimensional coordinates in a world coordinate system according to the extracted ship features and the camera calibration data;
s5: and displaying on the chart according to the three-dimensional coordinates corresponding to the ship features in each ship image in continuous time, and connecting them to form the ship track.
Further, the filtering by the nonlinear gradient domain guided filtering algorithm comprises the following steps:
s31: initializing parameters, wherein the parameters comprise a window size and a regularization parameter lambda;
s32: for each window, its corresponding first and second coefficients are calculated:
a_k = [ (1/|w_k|)·Σ_{i∈w_k} I_i^α·p_i − ((1/|w_k|)·Σ_{i∈w_k} I_i^α)·((1/|w_k|)·Σ_{i∈w_k} p_i) + (λ/Γ_k)·γ_k ] / [ (1/|w_k|)·Σ_{i∈w_k} I_i^(2α) − ((1/|w_k|)·Σ_{i∈w_k} I_i^α)² + λ/Γ_k ]

b_k = (1/|w_k|)·Σ_{i∈w_k} p_i − a_k·(1/|w_k|)·Σ_{i∈w_k} I_i^α
wherein α represents an exponent; w_k represents a window centred on pixel point k; a_k and b_k represent the first coefficient and the second coefficient corresponding to the window centred on pixel point k; k represents a pixel point; γ represents a parameter for distinguishing edges from smooth regions; Γ represents the edge-aware weighting parameter; I_i represents the i-th pixel point in the guide image; p_i represents the filter input corresponding to the i-th pixel point; i represents a pixel point; and |w_k| represents the number of pixels contained in the window w_k;
s33: calculating, for each pixel point, the average value of the first coefficients and the average value of the second coefficients of all windows containing that pixel point, as follows:
ā_i = (1/|w|)·Σ_{k: i∈w_k} a_k

b̄_i = (1/|w|)·Σ_{k: i∈w_k} b_k
wherein ā_i and b̄_i respectively represent the average values of the first coefficient and the second coefficient, and |w| represents the number of windows containing the pixel point i;
s34: the pixel values of the filtered image are calculated according to the following formula:
q_i = ā_i·I_i^α + b̄_i
wherein q_i represents the pixel value of the i-th pixel point in the filtered image.
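The steps S31 to S34 above can be sketched in code. The sketch below is an illustrative numpy implementation, not the patent's reference code: window averages are taken as plain box means, and the edge-aware weight Γ and the edge/smooth parameter γ are fixed to constants (an assumption; the method computes them per pixel from local variances, as described later):

```python
import numpy as np

def box_mean(img, r):
    """Mean over the (2r+1)x(2r+1) window around each pixel, edge-padded."""
    padded = np.pad(img, r, mode="edge")
    h, w = img.shape
    out = np.zeros((h, w))
    for i in range(h):
        for j in range(w):
            out[i, j] = padded[i:i + 2 * r + 1, j:j + 2 * r + 1].mean()
    return out

def nonlinear_gdgf(I, p, r=1, lam=0.01, alpha=1.5, gamma=1.0):
    """Sketch of S31-S34 with constant gamma and Gamma = 1 (assumption)."""
    Ia = I.astype(float) ** alpha                  # I_i^alpha from the guide image
    mean_Ia, mean_p = box_mean(Ia, r), box_mean(p.astype(float), r)
    # S32: first and second coefficients a_k, b_k for every window
    cov = box_mean(Ia * p, r) - mean_Ia * mean_p   # window covariance of I^alpha and p
    var = box_mean(Ia * Ia, r) - mean_Ia ** 2      # window variance of I^alpha
    a = (cov + lam * gamma) / (var + lam)
    b = mean_p - a * mean_Ia
    # S33: average the coefficients over all windows containing each pixel
    mean_a, mean_b = box_mean(a, r), box_mean(b, r)
    # S34: filtered output q_i = mean_a * I_i^alpha + mean_b
    return mean_a * Ia + mean_b
```

For self-guided smoothing, the same image is passed as both I and p; constant regions are then preserved exactly, while λ controls the amount of smoothing elsewhere.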
Further, the calculation formula of the edge-aware weighting parameter Γ is as follows:
Γ(j) = (1/N)·Σ_{j'=1}^{N} (χ(j)+ε)/(χ(j')+ε)
wherein N represents the number of pixels of the image; χ(j) represents the variance of the pixel values of the window in which pixel point j is located; j and j' represent pixel points, the pixel values being those of the image obtained after the guide image undergoes the linear transformation; and ε represents a small positive number that prevents the denominator from being 0.
Further, the calculation formula of the parameter γ for distinguishing edges from smooth areas is:
γ(j) = 1 − 1/(1 + e^(η·(χ(j) − μ)))
η = 4/(μ − min(χ(j)))
where η is the intermediate variable and μ represents the average of all χ (j) values.
A ship track drawing terminal device comprises a processor, a memory and a computer program which is stored in the memory and can run on the processor, wherein the processor executes the computer program to realize the steps of the method of the embodiment of the invention.
A computer-readable storage medium, in which a computer program is stored, which, when being executed by a processor, carries out the steps of the method as described above for an embodiment of the invention.
By adopting the above technical scheme, the invention displays the longitude and latitude of the monitored ship on the chart; and in order to reduce the influence of sea fog on the surveillance video images, a nonlinear gradient-domain guided filtering algorithm is adopted to filter the images, improving their clarity.
Drawings
Fig. 1 is a flowchart illustrating a first embodiment of the present invention.
Detailed Description
To further illustrate the various embodiments, the invention provides the accompanying drawings. The accompanying drawings, which are incorporated in and constitute a part of this disclosure, illustrate embodiments of the invention and, together with the description, serve to explain the principles of the embodiments. Those skilled in the art will appreciate still other possible embodiments and advantages of the present invention with reference to these figures.
The invention will now be further described with reference to the accompanying drawings and detailed description.
The first embodiment is as follows:
an embodiment of the present invention provides a ship track drawing method, as shown in fig. 1, including the following steps.
S1: the object with known coordinates is photographed by the camera.
In this embodiment, the camera device is a video camera. The object with known coordinates may be a ship or another object, which is not limited here.
S2: and calibrating the camera device according to the shot two-dimensional image and the known coordinates of the object.
Calibration of the camera device is the basis for performing three-dimensional reconstruction (3D Reconstruction).
The basic coordinate systems associated with camera imaging include the pixel coordinate system, the image coordinate system, and the world coordinate system. The transformation relationship between the three-dimensional scene and the two-dimensional imaging plane of the video image captured by the camera is as follows:
s·[x, y, 1]^T = K·[R t]·[X_w, Y_w, Z_w, 1]^T,  with K = [[α_x, γ, u_0], [0, α_y, v_0], [0, 0, 1]] and s a scale factor   (1)
In the formula, (X_w, Y_w, Z_w) are the three-dimensional coordinates in the three-dimensional scene; (x, y) are the coordinates in the two-dimensional imaging plane of the image taken by the camera; R is the rotation matrix and t is the translation vector; α_x, α_y, u_0 and v_0 are internal parameters of the camera; and γ is the radial distortion correction.
Expanding the formula (1) as follows:
s·[x, y, 1]^T = M·[X_w, Y_w, Z_w, 1]^T,  M = K·[R t] = [[m_11, m_12, m_13, m_14], [m_21, m_22, m_23, m_24], [m_31, m_32, m_33, m_34]]   (2)
Taking into account the specificity of the sea surface, take Z_w = 0; equation (2) can then be simplified as:
s·[x, y, 1]^T = [[m_11, m_12, m_14], [m_21, m_22, m_24], [m_31, m_32, m_34]]·[X_w, Y_w, 1]^T   (3)
the computer vision technology obtains two-dimensional information of an object or a scene in a space through a camera, and restores three-dimensional information of the object in the space by combining internal and external parameters of the camera, wherein the three-dimensional information comprises the size, the position, the motion state and the like of the object. In the whole process, camera calibration is required to be carried out firstly, namely, various parameters of the camera, including optical parameters and geometric parameters, are solved through calculation. The optical parameters are internal parameters of the camera, and the geometric parameters are external parameters of the camera, including a rotation matrix and a translation vector of the camera in space due to motion.
S3: the ship images in continuous time are collected through the calibrated camera device, and after the ship images are filtered through a nonlinear gradient domain guiding filtering algorithm, ship features in each ship image are extracted.
The following describes the nonlinear gradient domain guided filtering algorithm in detail.
(1) Gradient domain guided filtering algorithm
In the gradient-domain guided filtering algorithm, the filtered image q is assumed to be a linear transformation of the guide image I in a window Ω:
q=aI+b (4)
where a and b represent two coefficients, respectively.
The cost function is defined as:
E = Σ_{j∈Ω} [ (a·I_j + b − X_j)² + (λ/Γ(j))·(a − γ(j))² ]   (5)
wherein X represents the image to be filtered, j represents a pixel point in the image, λ represents the regularization parameter, and Γ represents the edge-aware weighting parameter, which is defined as:
Γ(j) = (1/N)·Σ_{j'=1}^{N} (χ(j)+ε)/(χ(j')+ε)   (6)
wherein N represents the number of pixels of the image; χ(j) represents the variance of the pixel values of the window in which pixel point j is located; j and j' represent pixel points, the pixel values being those of the image obtained after the guide image undergoes the linear transformation; and ε represents a small positive number that prevents the denominator from being 0.
γ represents a coefficient for distinguishing an edge from a smooth area, and is defined as:
γ(j) = 1 − 1/(1 + e^(η·(χ(j) − μ)))   (7)
wherein mu represents the average value of all chi (j) values, and the calculation formula of the intermediate variable eta is as follows:
η = 4/(μ − min(χ(j)))   (8)
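A small numpy sketch of these edge-aware quantities follows. Here χ(j) is taken as the plain variance of the (2r+1)×(2r+1) window around j on the guide image, and ε is a small constant; both are assumptions consistent with the definitions above (the exact forms of (6) to (8) are rendered only as images in the source):

```python
import numpy as np

def local_variance(I, r=1):
    """chi(j): variance of pixel values in the (2r+1)^2 window around each pixel."""
    pad = np.pad(I.astype(float), r, mode="edge")
    h, w = I.shape
    chi = np.zeros((h, w))
    for i in range(h):
        for j in range(w):
            chi[i, j] = pad[i:i + 2 * r + 1, j:j + 2 * r + 1].var()
    return chi

def edge_weights(I, r=1, eps=1e-6):
    """Edge-aware weight Gamma (eq. 6) and edge/smooth indicator gamma (eqs. 7-8)."""
    chi = local_variance(I, r)
    N = chi.size
    # Gamma(j) = (1/N) * sum_j' (chi(j)+eps)/(chi(j')+eps)
    Gamma = (chi + eps) * np.sum(1.0 / (chi + eps)) / N
    mu = chi.mean()
    eta = 4.0 / max(mu - chi.min(), eps)                  # intermediate variable eta
    gamma = 1.0 - 1.0 / (1.0 + np.exp(eta * (chi - mu)))  # ~1 at edges, ~0 in smooth areas
    return Gamma, gamma
```

On an image containing an intensity step, γ comes out larger near the edge than in flat regions, which is the intended discrimination.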
(2) nonlinear gradient domain guided filtering algorithm
Assume that the filtered image q is a non-linear transformation of the guide image I:
q = a·I^α + b   (9)
where α represents an exponent, and a and b represent a first coefficient and a second coefficient, respectively.
To avoid gradient inversion, the following constraints are set:
1≤α≤2 (10)
when α is 1, the conventional gradient-guided filtering algorithm, that is, equation (9) is degenerated to equation (4). Therefore, the conventional gradient-domain guided filtering algorithm is a special case of the non-linear gradient-domain guided filtering algorithm proposed in this embodiment.
Theorem 1: the optimal values for a and b are calculated as follows:
a_k = [ (1/|w_k|)·Σ_{i∈w_k} I_i^α·p_i − ((1/|w_k|)·Σ_{i∈w_k} I_i^α)·((1/|w_k|)·Σ_{i∈w_k} p_i) + (λ/Γ_k)·γ_k ] / [ (1/|w_k|)·Σ_{i∈w_k} I_i^(2α) − ((1/|w_k|)·Σ_{i∈w_k} I_i^α)² + λ/Γ_k ]   (11)

b_k = (1/|w_k|)·Σ_{i∈w_k} p_i − a_k·(1/|w_k|)·Σ_{i∈w_k} I_i^α   (12)
wherein w_k represents a window centred on pixel point k; a_k and b_k represent the first coefficient and the second coefficient corresponding to the window centred on pixel point k; k represents a pixel point; I_i represents the i-th pixel point in the guide image; p_i represents the filter input corresponding to the i-th pixel point; i represents a pixel point; and |w_k| represents the number of pixels contained in the window w_k.
Proof: The noise is defined as
n = q − p   (13)
Substituting (9) into (13) yields:
n = a·I^α + b − p   (14)
the ultimate goal is to minimize this noise. Thus, the cost function can be written as:
E = Σ_{i∈w_k} [ (a_k·I_i^α + b_k − p_i)² + (λ/Γ_k)·(a_k − γ_k)² ]   (15)
Taking the partial derivatives with respect to a_k and b_k gives:
∂E/∂a_k = Σ_{i∈w_k} [ 2·(a_k·I_i^α + b_k − p_i)·I_i^α + 2·(λ/Γ_k)·(a_k − γ_k) ]   (16)

∂E/∂b_k = Σ_{i∈w_k} 2·(a_k·I_i^α + b_k − p_i)   (17)
the following result is therefore true:
Figure BDA0002633387880000082
Figure BDA0002633387880000083
substituting (19) into (18) yields:
a_k·[ (1/|w_k|)·Σ_{i∈w_k} I_i^(2α) − ((1/|w_k|)·Σ_{i∈w_k} I_i^α)² + λ/Γ_k ] = (1/|w_k|)·Σ_{i∈w_k} I_i^α·p_i − ((1/|w_k|)·Σ_{i∈w_k} I_i^α)·((1/|w_k|)·Σ_{i∈w_k} p_i) + (λ/Γ_k)·γ_k   (20)
from formula (20):
a_k = [ (1/|w_k|)·Σ_{i∈w_k} I_i^α·p_i − ((1/|w_k|)·Σ_{i∈w_k} I_i^α)·((1/|w_k|)·Σ_{i∈w_k} p_i) + (λ/Γ_k)·γ_k ] / [ (1/|w_k|)·Σ_{i∈w_k} I_i^(2α) − ((1/|w_k|)·Σ_{i∈w_k} I_i^α)² + λ/Γ_k ]   (21)
thus, the certification of theorem 1 is completed. The above model is applied to the entire image filtering window. But each pixel is contained in multiple windows. For example, if a 3 x 3 window filter is used, all points except the edge region will be contained in nine windows. Therefore, we will get | wkQ | ═ 9iThe value is obtained. Setting:
ā_i = (1/|w|)·Σ_{k: i∈w_k} a_k   (22)

b̄_i = (1/|w|)·Σ_{k: i∈w_k} b_k   (23)
wherein ā_i and b̄_i respectively represent the average values of the first coefficient and the second coefficient, and |w| represents the number of windows containing the pixel point i.
All candidate q_i values are averaged to obtain the final result:
q_i = ā_i·I_i^α + b̄_i   (24)
wherein q_i represents the pixel value of the i-th pixel point in the filtered image.
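The window-membership count used in the averaging step can be made concrete with a few lines of Python (illustrative only): an interior pixel is contained in (2r+1)² = 9 windows for r = 1, while pixels near the image border are contained in fewer.

```python
def windows_containing(pixel, shape, r=1):
    """Centers k of all (2r+1)x(2r+1) windows w_k that contain the given pixel."""
    i, j = pixel
    h, w = shape
    return [(a, b)
            for a in range(max(0, i - r), min(h, i + r + 1))
            for b in range(max(0, j - r), min(w, j + r + 1))]

interior = windows_containing((5, 5), (10, 10))  # interior pixel: 9 windows
corner = windows_containing((0, 0), (10, 10))    # corner pixel: only 4 windows
```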
The features mainly include feature points, feature lines and regions. In most cases, feature points are used as the matching elements. The feature point extraction algorithm can be any common one, such as a method based on directional derivatives, a method based on image brightness contrast relations, or a method based on mathematical morphology.
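As one concrete instance of a brightness-based feature-point extractor (illustrative only; the patent does not prescribe a specific detector), a minimal Harris-style corner response can be sketched as:

```python
import numpy as np

def harris_corners(I, k=0.04, threshold=0.01):
    """Boolean mask of Harris-style corner responses above threshold * max response."""
    I = I.astype(float)
    # image gradients via central differences
    Ix = np.zeros_like(I)
    Iy = np.zeros_like(I)
    Ix[:, 1:-1] = (I[:, 2:] - I[:, :-2]) / 2.0
    Iy[1:-1, :] = (I[2:, :] - I[:-2, :]) / 2.0

    def box3(a):
        # 3x3 box smoothing of the structure tensor (a Gaussian is more typical)
        p = np.pad(a, 1, mode="edge")
        return sum(p[i:i + a.shape[0], j:j + a.shape[1]]
                   for i in range(3) for j in range(3)) / 9.0

    Sxx, Syy, Sxy = box3(Ix * Ix), box3(Iy * Iy), box3(Ix * Iy)
    R = (Sxx * Syy - Sxy ** 2) - k * (Sxx + Syy) ** 2  # Harris response
    if R.max() <= 0:
        return np.zeros_like(R, dtype=bool)
    return R > threshold * R.max()
```

On an image containing a bright square, the response fires near the square's corners and stays silent in flat regions.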
S4: and converting the two-dimensional coordinates of the ship features in the image into three-dimensional coordinates in a world coordinate system according to the extracted ship features and the calibration data of the camera device.
That is, the coordinates (X_w, Y_w, Z_w) in the world coordinate system corresponding to the ship features in each ship image are obtained.
S5: and displaying on the chart according to the three-dimensional coordinates corresponding to the ship features in each ship image in continuous time, and connecting them to form the ship track.
In this embodiment, considering the specificity of the sea surface, Z_w = 0 is taken, and the ship is characterized on the chart by its coordinates (X_w, Y_w) in the world coordinate system.
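Putting S4 and S5 together, consecutive detections can be mapped to sea-surface coordinates and joined into a track. The matrix H below is a made-up stand-in for the solved calibration matrix, not the values from the experiment:

```python
import numpy as np

# Hypothetical calibration result: H maps sea-surface (Xw, Yw, 1) to pixels up to scale s
H = np.array([[2.0, 0.0, 100.0],
              [0.0, 2.0,  50.0],
              [0.0, 0.0,   1.0]])
H_inv = np.linalg.inv(H)

def pixel_to_chart(x, y):
    """Map an image point back to sea-surface world coordinates (Zw = 0)."""
    v = H_inv @ np.array([x, y, 1.0])
    return (v[0] / v[2], v[1] / v[2])

# Detections of the same ship in consecutive frames (illustrative pixel coordinates)
pixels = [(1220, 500), (1250, 510), (1280, 520)]
track = [pixel_to_chart(x, y) for (x, y) in pixels]  # points to plot and connect on the chart
```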
Experimental verification:
Taking the Haicang Bridge in Xiamen as an example, the navigation mark coordinates are shown in Table 1:
TABLE 1
Navigation mark | Latitude | Longitude | Height | Image coordinates
Haicang Bridge No. 1 bridge-culvert mark | 24°29'49.5"N | 118°04'14.7"E | 55 m | (277,250)
Haicang Bridge No. 2 bridge-culvert mark | 24°29'49.6"N | 118°04'06.6"E | 55 m | (950,335)
Haicang Bridge No. 3 bridge-culvert mark | 24°29'49.7"N | 118°03'59.1"E | 55 m | (1206,379)
Niufenjiao (Cow Dung Reef) light beacon | 24°29'56.7"N | 118°04'15.9"E | 0 m | (1269,490)
Substituting the four correspondences of Table 1 into formula (2) yields a system of equations in the entries of the transformation matrix, which is solved to obtain the matrix; the transformation relationship between the three-dimensional scene and the two-dimensional imaging plane of the image captured by the camera is thereby obtained. (The numerical matrices are rendered only as equation images in the original patent.)
the track positions of the ship on the images are (1220,500), (1250,510) and (1280,520) respectively through the monitoring video on the buoys. The longitude and latitude are obtained as shown in table 2:
TABLE 2
Image coordinates | Latitude | Longitude | Height
(1220,500) | 24°29.848'N | 118°4.042'E | 0
(1250,510) | 24°29.798'N | 118°4.03'E | 0
(1280,520) | 24°29.781'N | 118°4.016'E | 0
According to the embodiment of the invention, the longitude and latitude of the monitored ship are displayed on the chart; and in order to reduce the influence of sea fog on the surveillance video images, the images are filtered by the nonlinear gradient-domain guided filtering algorithm, improving their clarity.
Example two:
the invention further provides a ship track drawing terminal device, which comprises a memory, a processor and a computer program stored in the memory and capable of running on the processor, wherein the processor executes the computer program to realize the steps of the method embodiment of the first embodiment of the invention.
Further, as an executable scheme, the ship track drawing terminal device may be a desktop computer, a notebook, a palm computer, a cloud server, or other computing devices. The ship track drawing terminal equipment can comprise, but is not limited to, a processor and a memory. It is understood by those skilled in the art that the composition structure of the ship track drawing terminal device is only an example of the ship track drawing terminal device, and does not constitute a limitation on the ship track drawing terminal device, and may include more or fewer components than the above, or combine some components, or different components, for example, the ship track drawing terminal device may further include an input/output device, a network access device, a bus, and the like, which is not limited in this embodiment of the present invention.
Further, as an executable solution, the Processor may be a Central Processing Unit (CPU), other general purpose Processor, a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA) or other Programmable logic device, a discrete Gate or transistor logic device, a discrete hardware component, and the like. The general-purpose processor may be a microprocessor or the processor may be any conventional processor, and the processor is a control center of the ship track drawing terminal device, and various interfaces and lines are used to connect various parts of the whole ship track drawing terminal device.
The memory may be configured to store the computer program and/or modules, and the processor implements the various functions of the ship track drawing terminal device by running or executing the computer program and/or modules stored in the memory and calling the data stored in the memory. The memory may mainly comprise a program storage area and a data storage area, wherein the program storage area may store an operating system and the application programs required by at least one function, and the data storage area may store data created according to the use of the device. In addition, the memory may include high-speed random access memory, and may also include non-volatile memory, such as a hard disk, a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) card, a Flash memory card, at least one magnetic disk storage device, a Flash memory device, or another non-volatile solid-state storage device.
The invention also provides a computer-readable storage medium, in which a computer program is stored, which, when being executed by a processor, carries out the steps of the above-mentioned method of an embodiment of the invention.
The integrated modules/units of the ship track drawing terminal device may be stored in a computer-readable storage medium if implemented in the form of software functional units and sold or used as stand-alone products. Based on this understanding, all or part of the flow of the method according to the embodiments of the present invention may also be implemented by a computer program, which may be stored in a computer-readable storage medium; when the computer program is executed by a processor, the steps of the method embodiments may be implemented. The computer program comprises computer program code, which may be in the form of source code, object code, an executable file or some intermediate form. The computer-readable medium may include: any entity or device capable of carrying the computer program code, a recording medium, a USB disk, a removable hard disk, a magnetic disk, an optical disk, a computer memory, a Read-Only Memory (ROM), a Random Access Memory (RAM), a software distribution medium, and the like.
While the invention has been particularly shown and described with reference to a preferred embodiment, it will be understood by those skilled in the art that various changes in form and detail may be made therein without departing from the spirit and scope of the invention as defined by the appended claims.

Claims (6)

1. A ship track drawing method is characterized by comprising the following steps:
s1: shooting an object with known coordinates by a camera device;
s2: calibrating the camera device according to the shot two-dimensional image and the known coordinates of the object;
s3: acquiring ship images in continuous time through a calibrated camera device, filtering the ship images through a nonlinear gradient domain guided filtering algorithm, and extracting ship characteristics in each ship image;
s4: converting two-dimensional coordinates of the ship features in the image into three-dimensional coordinates in a world coordinate system according to the extracted ship features and the camera calibration data;
s5: and displaying on the chart according to the three-dimensional coordinates corresponding to the ship features in each ship image in continuous time, and connecting them to form the ship track.
2. The ship trajectory drawing method according to claim 1, characterized in that: the filtering by the nonlinear gradient domain guided filtering algorithm comprises the following steps:
s31: initializing parameters, wherein the parameters comprise a window size and a regularization parameter lambda;
s32: for each window, its corresponding first and second coefficients are calculated:
a_k = [ (1/|w_k|)·Σ_{i∈w_k} I_i^α·p_i − ((1/|w_k|)·Σ_{i∈w_k} I_i^α)·((1/|w_k|)·Σ_{i∈w_k} p_i) + (λ/Γ_k)·γ_k ] / [ (1/|w_k|)·Σ_{i∈w_k} I_i^(2α) − ((1/|w_k|)·Σ_{i∈w_k} I_i^α)² + λ/Γ_k ]

b_k = (1/|w_k|)·Σ_{i∈w_k} p_i − a_k·(1/|w_k|)·Σ_{i∈w_k} I_i^α
wherein α represents an exponent; w_k represents a window centred on pixel point k; a_k and b_k represent the first coefficient and the second coefficient corresponding to the window centred on pixel point k; k represents a pixel point; γ represents a parameter for distinguishing edges from smooth regions; Γ represents the edge-aware weighting parameter; I_i represents the i-th pixel point in the guide image; p_i represents the filter input corresponding to the i-th pixel point; i represents a pixel point; and |w_k| represents the number of pixels contained in the window w_k;
s33: calculating the average value of the first coefficient coefficients and the average value of the second coefficient coefficients of all windows contained in each pixel point according to the following steps:
ā_i = (1/|w|)·Σ_{k: i∈w_k} a_k

b̄_i = (1/|w|)·Σ_{k: i∈w_k} b_k
wherein ā_i and b̄_i respectively represent the average values of the first coefficient and the second coefficient, and |w| represents the number of windows containing the pixel point i;
s34: the pixel values of the filtered image are calculated according to the following formula:
q_i = ā_i·I_i^α + b̄_i
wherein q_i represents the pixel value of the i-th pixel point in the filtered image.
3. The ship trajectory drawing method according to claim 2, characterized in that: the calculation formula of the edge-aware weighting parameter Γ is as follows:
Γ(j) = (1/N)·Σ_{j'=1}^{N} (χ(j)+ε)/(χ(j')+ε)
wherein N represents the number of pixels of the image; χ(j) represents the variance of the pixel values of the window in which pixel point j is located; j and j' represent pixel points, the pixel values being those of the image obtained after the guide image undergoes the linear transformation; and ε represents a small positive number that prevents the denominator from being 0.
4. The ship trajectory drawing method according to claim 3, characterized in that: the formula for calculating the parameter γ for distinguishing edges from smooth areas is:
γ(j) = 1 − 1/(1 + e^(η·(χ(j) − μ)))
η = 4/(μ − min(χ(j)))
where η is the intermediate variable and μ represents the average of all χ (j) values.
5. A ship track drawing terminal device, characterized in that: it comprises a processor, a memory and a computer program stored in the memory and executable on the processor, and the processor implements the steps of the method according to any one of claims 1 to 4 when executing the computer program.
6. A computer-readable storage medium storing a computer program, characterized in that: the computer program, when executed by a processor, implements the steps of the method according to any one of claims 1 to 4.
CN202010817894.9A 2020-08-14 2020-08-14 Ship track drawing method, terminal device and storage medium Expired - Fee Related CN111986312B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010817894.9A CN111986312B (en) 2020-08-14 2020-08-14 Ship track drawing method, terminal device and storage medium


Publications (2)

Publication Number | Publication Date
CN111986312A | 2020-11-24
CN111986312B | 2022-04-22

Family

ID=73434453

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010817894.9A Expired - Fee Related CN111986312B (en) 2020-08-14 2020-08-14 Ship track drawing method, terminal device and storage medium


Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112907728A (en) * 2021-01-27 2021-06-04 北京邮电大学 Ship scene restoration and positioning method and system based on camera and edge calculation

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103996049A (en) * 2014-05-05 2014-08-20 南京大学 Ship overlength and overwidth detection method based on video image
US20140369600A1 (en) * 2013-06-17 2014-12-18 Fujitsu Limited, Filtering method and apparatus for recovering an anti-aliasing edge
US20150063628A1 (en) * 2013-09-04 2015-03-05 Xerox Corporation Robust and computationally efficient video-based object tracking in regularized motion environments
CN106871900A (en) * 2017-01-23 2017-06-20 中国人民解放军海军工程大学 Image matching positioning method in ship magnetic field dynamic detection
CN109188909A (en) * 2018-09-26 2019-01-11 大连海事大学 Adaptive fuzzy method for optimally controlling and system towards ship course nonlinear discrete systems
US20190066334A1 (en) * 2017-08-25 2019-02-28 Boe Technology Group Co., Ltd. Method, apparatus, terminal and system for measuring trajectory tracking accuracy of target
US20190333189A1 (en) * 2018-04-27 2019-10-31 Imam Abdulrahman Bin Faisal University Gradient vector orientation based nonlinear diffusion filter
CN110472607A (en) * 2019-08-21 2019-11-19 上海海事大学 A kind of ship tracking method and system
CN111027459A (en) * 2019-12-06 2020-04-17 江苏海事职业技术学院 Ship track prediction method and system

Non-Patent Citations (8)

* Cited by examiner, † Cited by third party
Title
FEI KOU et al.: "Gradient Domain Guided Image Filtering", IEEE Transactions on Image Processing *
LIU YIFAN et al.: "Performance Evaluation Method for Marine Main Diesel Engines in a Real-Ship Data Environment", Transactions of CSICE *
JIANG BAICHEN et al.: "Ship Trajectory Prediction Algorithm Based on Polynomial Kalman Filtering", Journal of Signal Processing *
JIANG BAICHEN et al.: "Mining and Identification Analysis of Abnormal Ship Behavior in Maritime Traffic", Computer Simulation *
QUAN BO et al.: "Ship Track Prediction Model Based on LSTM", Computer Science *
MAO CHENHAO et al.: "Ship Navigation Trajectory Prediction Based on Gaussian Process Regression", Technology Innovation and Application *
ZOU XIONG et al.: "Ship Track Estimation Method for Autonomous Berthing", Navigation of China *
GUO DONGDONG et al.: "Deep Dredged Channels in Shoals of Open Sea Areas Based on Real-Ship Observations of Large Vessels", China Harbour Engineering *

Also Published As

Publication number Publication date
CN111986312B (en) 2022-04-22

Similar Documents

Publication Publication Date Title
CN109064428B (en) Image denoising processing method, terminal device and computer readable storage medium
CN108694705B (en) Multi-frame image registration and fusion denoising method
CN108765343B (en) Image processing method, device, terminal and computer readable storage medium
CN109035319B (en) Monocular image depth estimation method, monocular image depth estimation device, monocular image depth estimation apparatus, monocular image depth estimation program, and storage medium
Negru et al. Exponential contrast restoration in fog conditions for driving assistance
CN111340077B (en) Attention mechanism-based disparity map acquisition method and device
CN104008538B (en) Based on single image super-resolution method
CN111368717B (en) Line-of-sight determination method, line-of-sight determination device, electronic apparatus, and computer-readable storage medium
CN111860398B (en) Remote sensing image target detection method and system and terminal equipment
US20130136338A1 (en) Methods and Apparatus for Correcting Disparity Maps using Statistical Analysis on Local Neighborhoods
CN107358586A (en) A kind of image enchancing method, device and equipment
US20170358100A1 (en) Image processing apparatus and image processing method
CN110163831B (en) Method and device for dynamically displaying object of three-dimensional virtual sand table and terminal equipment
CN107590811B (en) Scene segmentation based landscape image processing method and device and computing equipment
WO2014070273A1 (en) Recursive conditional means image denoising
CN109410246B (en) Visual tracking method and device based on correlation filtering
WO2023142904A1 (en) Image processing method, electronic device and non-transient computer readable medium
CN113570725A (en) Three-dimensional surface reconstruction method and device based on clustering, server and storage medium
CN113379815A (en) Three-dimensional reconstruction method and device based on RGB camera and laser sensor and server
CN109064402A (en) Based on the single image super resolution ratio reconstruction method for enhancing non local total variation model priori
CN116862812A (en) Infrared image processing method, device, computer equipment, storage medium and product
CN111986312B (en) Ship track drawing method, terminal device and storage medium
KR101362183B1 (en) Depth image noise removal apparatus and method based on camera pose
CN117408886A (en) Gas image enhancement method, gas image enhancement device, electronic device and storage medium
CN113628148B (en) Method and device for reducing noise of infrared image

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20220422