CN114332454A - Image-based ship snapshot method and system - Google Patents
- Publication number
- CN114332454A (application CN202111642614.6A)
- Authority
- CN
- China
- Prior art keywords
- ship
- snapshot
- camera
- image
- detection
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Abstract
The invention discloses an image-based ship snapshot method and system, belonging to the technical field of ship snapshot and addressing the technical problems that existing ship snapshot operation is complex and depends on AIS (Automatic Identification System). The method comprises the following steps: 1) acquiring the internal parameters and initial external parameters of a snapshot camera; 2) acquiring images from a detection camera and the snapshot camera, and estimating the mapping matrix H between the two cameras; 3) predicting the position and size of the ship's rectangular frame in the detection image; 4) mapping the ship rectangular frame from the detection camera into the snapshot camera according to the mapping matrix; 5) calculating the azimuth parameters and the magnification factor for centering the snapshot camera on the ship, and then snapshotting the ship. The method has the advantages of simple, convenient operation and high snapshot accuracy.
Description
Technical Field
The invention mainly relates to the technical field of ship snapshot, in particular to a ship snapshot method and system based on images.
Background
With the frequent occurrence of violations on the water, such as ship overloading and failure to display the ship's name as required (or deliberately obscuring it), the manual snapshot mode is clearly insufficient: its law-enforcement efficiency is low and it wastes a large amount of manpower. At present, snapshot systems at intelligent waterborne traffic gates position and snapshot ships based on the longitude and latitude reported by the Automatic Identification System (AIS). For example, the invention "Ship positioning and snapshot method based on AIS and image analysis assistance" uses the ship longitude and latitude reported by AIS, the position and azimuth of the camera, and the longitude and latitude of a reference point to calculate the pitch angle and horizontal deflection angle that align the camera with the ship. Such methods, which compute the snapshot angle from AIS-reported coordinates, have two drawbacks: on the one hand, a reference point must be set on the water surface, which is highly inconvenient for the operators; on the other hand, when AIS is not switched on, no snapshot can be taken.
Disclosure of Invention
The technical problem to be solved by the invention is as follows: aiming at the problems in the prior art, the invention provides an image-based ship snapshot method and system that are simple and convenient to operate and offer high snapshot accuracy.
In order to solve the technical problems, the technical scheme provided by the invention is as follows:
an image-based ship snapshot method, comprising the steps of:
1) acquiring internal parameters and initial external parameters of a snapshot camera;
2) acquiring images of a detection camera and a snapshot camera, and estimating a mapping matrix H between the detection camera and the snapshot camera;
3) predicting the position and the size of a rectangular frame of the ship in the detection image;
4) mapping the ship rectangular frame in the detection camera to the snapshot camera according to the mapping matrix;
5) and calculating the azimuth parameters and the magnification factor of the center of the snapshot camera aiming at the ship, and then snapshotting the ship.
Preferably, in step 1), the internal parameters of the snapshot camera are the photosensitive device dimensions w and h and the image resolution W and H; the external parameters are the focal length f, the field angle FOV, the horizontal deflection angle α and the pitch angle β in the initial state; and a relation table between the magnification factor and the field angle is established.
Preferably, in step 2), the step of estimating the mapping matrix between the detection camera and the snapshot camera specifically includes:
carrying out graying processing on the detection image and the snapshot image to obtain a grayscale image;
counting the proportion of each pixel value in the two grayscale images and the distribution of each gray level, traversing every candidate gray-level threshold and computing the corresponding between-class variance, and taking the threshold at which the between-class variance is maximal as the image binarization threshold (i.e. Otsu's method);
extracting the closed regions in the binary images and describing each region by its second-order Hu invariant moments; when the distance Dist between two regions is less than 10, they are considered a match, and the point sets making up the matched regions are denoted D(ui, vi) and D′(ui′, vi′).
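By way of illustration only (not part of the disclosure), the threshold-selection step described above — choosing the gray level that maximizes the between-class variance, i.e. Otsu's method — can be sketched in pure NumPy; the function name and the synthetic bimodal image are illustrative:

```python
import numpy as np

def otsu_threshold(gray):
    """Return the gray level that maximizes the between-class variance."""
    hist = np.bincount(gray.ravel(), minlength=256).astype(float)
    p = hist / hist.sum()                  # proportion of each pixel value
    omega = np.cumsum(p)                   # class-0 probability up to level t
    mu = np.cumsum(p * np.arange(256))     # cumulative first moment
    mu_t = mu[-1]                          # global mean gray level
    with np.errstate(divide="ignore", invalid="ignore"):
        sigma_b = (mu_t * omega - mu) ** 2 / (omega * (1.0 - omega))
    sigma_b = np.nan_to_num(sigma_b)       # undefined where one class is empty
    return int(np.argmax(sigma_b))

# A synthetic bimodal image: the threshold lands between the two modes
img = np.concatenate([np.full(500, 40, np.uint8), np.full(500, 200, np.uint8)])
t = otsu_threshold(img)
binary = (img > t).astype(np.uint8)
```

The binary images produced this way are the input to the closed-region extraction and Hu-moment matching described above.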
Preferably, in step 2), the mapping matrix between the cameras is estimated by the CPD (Coherent Point Drift) algorithm, specifically: the mapping transformation between the detection camera and the snapshot camera is defined as an affine transformation model:

u′ = a11·u + a12·v + Δu
v′ = a21·u + a22·v + Δv
wherein (u, v) represents a pixel point in the detection camera, (u′, v′) represents the corresponding pixel point in the snapshot camera, a11, a12, a21 and a22 represent the scaling and rotation factors, and Δu and Δv represent the translation along the u and v axes of the image;
the registration of D(ui, vi) and D′(ui′, vi′) is converted into a fitting problem between the centroid point set of a Gaussian mixture model and the observed data; the likelihood function E(θ, σ²) is maximized by the EM algorithm, and the resulting θ gives the parameters of the H matrix.
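For intuition, once the region correspondences are known the affine parameters θ = (a11, a12, a21, a22, Δu, Δv) can be recovered by least squares. The patent's CPD/EM formulation additionally handles unknown correspondences and noise, so the sketch below, which assumes the point pairs are already matched, is a simplified stand-in rather than the claimed algorithm:

```python
import numpy as np

def fit_affine(src, dst):
    """Least-squares estimate of the 2x3 affine matrix
    H = [[a11, a12, du], [a21, a22, dv]] from matched point pairs."""
    src = np.asarray(src, float)
    dst = np.asarray(dst, float)
    A = np.hstack([src, np.ones((len(src), 1))])   # rows [u, v, 1]
    # Solve A @ X ~= dst for X (3x2), then transpose to the 2x3 form
    X, *_ = np.linalg.lstsq(A, dst, rcond=None)
    return X.T

# Recover a known transform: uniform scale 2 plus translation (5, -3)
src = np.array([[0, 0], [10, 0], [0, 10], [10, 10]])
dst = src * 2 + np.array([5, -3])
H = fit_affine(src, dst)
```

With exact correspondences the fit is exact; with noisy region centroids it returns the least-squares optimum, which is what the EM iteration converges to in the noise-free limit.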
Preferably, in step 3), the position and size of the rectangular frame of the ship in the detection image are predicted through a trained target detection algorithm, specifically:
collecting on-site detection-camera pictures at different times of day and in different weather conditions, and marking the position of the ship's rectangular frame with an annotation tool to establish a data set;
training a target detection model through a data set;
predicting the position box(xmin, ymin, xmax, ymax) and the confidence S of the ship's rectangular frame with the trained detection model, wherein (xmin, ymin) is the coordinate of the upper-left corner of the rectangle and (xmax, ymax) is the coordinate of the lower-right corner.
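As a minimal illustration of consuming the detector's output (the detector itself, trained on the annotated data set, is assumed), one might keep only confident predictions and take the best-scoring box; the function name and threshold are hypothetical:

```python
def best_ship_box(predictions, conf_thresh=0.5):
    """Given detector outputs as (xmin, ymin, xmax, ymax, score) tuples,
    discard low-confidence boxes and return the highest-scoring one,
    or None if nothing confident was detected."""
    kept = [p for p in predictions if p[4] >= conf_thresh]
    if not kept:
        return None
    return max(kept, key=lambda p: p[4])

# Two candidate detections: one confident ship, one low-confidence blob
preds = [(40, 20, 120, 60, 0.92), (300, 200, 350, 230, 0.31)]
box = best_ship_box(preds)
```

The selected box is the one handed to the mapping step below.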
Preferably, in step 4), the ship rectangular frame box′(xmin′, ymin′, xmax′, ymax′) mapped into the snapshot camera is box′ = H × box; the mapped ship center point P′(u′, v′) is given by u′ = (xmin′ + xmax′)/2 and v′ = (ymin′ + ymax′)/2; and the pixel width of the ship is Wbox = xmax′ − xmin′.
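A sketch of this mapping step, assuming H is the 2×3 affine matrix estimated earlier and that the mapping is scale/translation-dominant (so transforming the two opposite corners preserves an axis-aligned box; a strongly rotated H would require mapping all four corners):

```python
import numpy as np

def map_box(box, H):
    """Map a detection-camera box (xmin, ymin, xmax, ymax) into the
    snapshot camera with the 2x3 affine matrix H, then derive the
    ship center point and pixel width used by the aiming step."""
    xmin, ymin, xmax, ymax = box
    corners = np.array([[xmin, ymin, 1.0], [xmax, ymax, 1.0]])
    (xmin2, ymin2), (xmax2, ymax2) = corners @ np.asarray(H).T
    center = ((xmin2 + xmax2) / 2.0, (ymin2 + ymax2) / 2.0)
    w_box = xmax2 - xmin2
    return (xmin2, ymin2, xmax2, ymax2), center, w_box

H = np.array([[1.5, 0.0, 100.0],
              [0.0, 1.5, 50.0]])   # illustrative mapping, not a calibrated one
box2, (u2, v2), w = map_box((40, 20, 120, 60), H)
```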
Preferably, in step 5), the azimuth parameters include a horizontal declination angle α 'and a pitch angle β'.
Preferably, the specific calculation step of the azimuth angle in step 5) includes:
the pixel coordinates of the ship center point in the snapshot camera are P′(u′, v′); its image coordinates are P′(x′, y′), and its coordinates P′(Xc′, Yc′, Zc′) in the initial camera coordinate system are:

x′ = (u′ − W/2)·w/W,  y′ = (v′ − H/2)·h/H
Xc′ = x′,  Yc′ = y′,  Zc′ = f

when the horizontal deflection angle and the pitch angle of the snapshot camera are rotated by Δα and Δβ, the center of the camera is aligned with the ship; the corresponding rotation matrices Rα (about the camera's vertical axis) and Rβ (about its horizontal axis) are:

Rα = [[cos Δα, 0, sin Δα], [0, 1, 0], [−sin Δα, 0, cos Δα]]
Rβ = [[1, 0, 0], [0, cos Δβ, sin Δβ], [0, −sin Δβ, cos Δβ]]

the coordinates of the ship in the camera coordinate system are then:

(Xc″, Yc″, Zc″)ᵀ = Rβ · Rα · (Xc′, Yc′, Zc′)ᵀ

when the center of the snapshot camera is aligned with the ship, the image coordinates of the ship in the snapshot camera are P″(0, 0); combining the three formulas above yields the values of Δα and Δβ, and the camera snapshot azimuth is α′ = α + Δα, β′ = β + Δβ.
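A simplified, self-contained version of this aiming computation under a pinhole model; the axis conventions and the separate horizontal/vertical fields of view are assumptions of the sketch, not the patent's exact derivation:

```python
import math

def aim_offsets(u, v, img_w, img_h, fov_h_deg, fov_v_deg):
    """Pan/tilt increments (da, db, in degrees) that bring pixel (u, v)
    to the image center under a pinhole model: the focal length in
    pixels follows from the field of view, and each offset is the
    angle subtended by the pixel displacement from the principal point."""
    fx = (img_w / 2.0) / math.tan(math.radians(fov_h_deg) / 2.0)
    fy = (img_h / 2.0) / math.tan(math.radians(fov_v_deg) / 2.0)
    da = math.degrees(math.atan((u - img_w / 2.0) / fx))
    db = math.degrees(math.atan((v - img_h / 2.0) / fy))
    return da, db

# A ship at the right edge of a 1920x1080 frame with a 60-degree horizontal FOV
da, db = aim_offsets(1920, 540, 1920, 1080, 60.0, 34.0)
# the new snapshot azimuth is then alpha' = alpha + da, beta' = beta + db
```

A point at the horizontal edge of the frame sits exactly half the field of view away from the optical axis, so the pan offset equals FOV/2.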
Preferably, the specific calculation step of the magnification in step 5) includes:
the required field angle fov when snapshotting the ship is defined as fov = Wbox/W × FOV; according to the snapshot camera's relation table between magnification factor and field angle, the magnification K is taken as the entry whose field angle is closest to fov.
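The magnification lookup can be sketched directly from the definition fov = Wbox/W × FOV; the relation-table values below are invented for illustration only:

```python
def choose_magnification(w_box, img_w, fov_full, table):
    """Pick the magnification K whose tabulated field angle is closest to
    the angle needed to frame the ship: fov = w_box / img_w * fov_full.
    `table` maps magnification -> field angle in degrees."""
    fov = w_box / img_w * fov_full
    return min(table, key=lambda k: abs(table[k] - fov))

# Assumed relation table between magnification and field angle
table = {1: 60.0, 2: 30.0, 4: 15.0, 8: 7.5, 16: 3.75}
K = choose_magnification(w_box=120, img_w=1920, fov_full=60.0, table=table)
```

Here a 120-pixel-wide ship in a 1920-pixel frame needs a 3.75-degree field, so the highest tabulated magnification is selected.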
The invention also discloses a ship snapshot system based on the image, which comprises the following components:
the first program module is used for acquiring internal parameters and initial external parameters of the snapshot camera;
the second program module is used for acquiring images of the detection camera and the snapshot camera and estimating a mapping matrix H between the detection camera and the snapshot camera;
the third program module is used for predicting the position and the size of a rectangular frame of the ship in the detection image;
the fourth program module is used for mapping the ship rectangular frame in the detection camera to the snapshot camera according to the mapping matrix;
and the fifth program module is used for calculating the azimuth parameters of the center of the snapshot camera aiming at the ship and then snapshotting the ship.
Compared with the prior art, the invention has the advantages that:
according to the image-based ship snapshot method and system, the ship is positioned and snapshot in a mode that the image detection camera and the snapshot camera work cooperatively, the operation complexity is reduced, the problem that the ship cannot be snapshot when the AIS is not started is effectively solved, and the snapshot accuracy is high.
Drawings
FIG. 1 is a flow chart of an embodiment of the method of the present invention.
Detailed Description
The invention is further described below with reference to the figures and the specific embodiments of the description.
As shown in fig. 1, the image-based ship snapshot method according to the embodiment of the present invention is implemented based on a detection camera and a snapshot camera, wherein the detection camera and the snapshot camera are installed in a water area to be observed, and the method specifically includes the following steps:
1) acquiring the internal parameters and initial external parameters of the snapshot camera;
2) collecting images of a detection camera and a snapshot camera, and estimating a mapping matrix H between the two cameras;
3) predicting the position and the size of a rectangular frame of the ship in a detection image by using a trained target detection algorithm;
4) mapping the ship rectangular frame in the detection camera to the snapshot camera according to the mapping matrix;
5) calculating the orientation parameters (including the horizontal deflection angle and pitch angle) and the magnification K for aligning the center of the snapshot camera with the ship, and controlling the camera to snapshot the ship.
According to the image-based ship snapshot method, the ship is positioned and snapshot in a mode that the image detection camera and the snapshot camera work cooperatively, the operation complexity is reduced, the problem that the ship cannot be snapshot when the AIS is not started is effectively solved, and the snapshot accuracy is high.
In a specific embodiment, the internal parameters of the snapshot camera in step 1) are the photosensitive device dimensions w and h and the image resolution W and H; the external parameters are the focal length f, the field angle FOV, the horizontal deflection angle α and the pitch angle β in the initial state; and a relation table between the magnification factor and the field angle is established.
In a specific embodiment, the step of estimating the mapping matrix between the detection camera and the snapshot camera in step 2) specifically comprises: graying the detection image and the snapshot image to obtain grayscale images; counting the proportion of each pixel value in the two grayscale images and the distribution of each gray level, traversing every candidate gray-level threshold and computing the corresponding between-class variance, and taking the threshold at which the between-class variance is maximal as the image binarization threshold; extracting the closed regions in the binary images and describing each region by its second-order Hu invariant moments; when the distance Dist between two regions is less than 10, they are considered a match, and the point sets making up the matched regions are denoted D(ui, vi) and D′(ui′, vi′).
In another embodiment, the CPD (Coherent Point Drift) algorithm is used in step 2) to estimate the mapping matrix between the cameras, specifically: the mapping transformation between the detection camera and the snapshot camera is defined as an affine transformation model:

u′ = a11·u + a12·v + Δu
v′ = a21·u + a22·v + Δv
wherein (u, v) represents a pixel point in the detection camera, (u′, v′) represents the corresponding pixel point in the snapshot camera, a11, a12, a21 and a22 represent the scaling and rotation factors, and Δu and Δv represent the translation along the u and v axes of the image. The registration of D(ui, vi) and D′(ui′, vi′) is converted into a fitting problem between the centroid point set of a Gaussian mixture model and the observed data; the likelihood function E(θ, σ²) is maximized by the EM algorithm, and the resulting θ gives the parameters of the H matrix.
In a specific embodiment, the specific steps of step 3) comprise: collecting on-site detection-camera pictures at different times of day and in different weather conditions, and marking the position of the ship's rectangular frame with an annotation tool to establish a data set; training a target detection model with the data set; and predicting the position box(xmin, ymin, xmax, ymax) and the confidence S of the ship's rectangular frame with the detection model, wherein (xmin, ymin) is the coordinate of the upper-left corner of the rectangle and (xmax, ymax) is the coordinate of the lower-right corner.
In a specific embodiment, the ship rectangular frame box′(xmin′, ymin′, xmax′, ymax′) mapped into the snapshot camera in step 4) is box′ = H × box; the mapped ship center point P′(u′, v′) is given by u′ = (xmin′ + xmax′)/2 and v′ = (ymin′ + ymax′)/2; and the pixel width of the ship is Wbox = xmax′ − xmin′.
In a specific embodiment, the specific step of calculating the azimuth angle and the magnification of the ship by the snapshot camera in the step 5) includes:
the pixel coordinates of the ship center point in the snapshot camera are P′(u′, v′); its image coordinates are P′(x′, y′), and its coordinates P′(Xc′, Yc′, Zc′) in the initial camera coordinate system are:

x′ = (u′ − W/2)·w/W,  y′ = (v′ − H/2)·h/H
Xc′ = x′,  Yc′ = y′,  Zc′ = f

when the horizontal deflection angle and the pitch angle of the snapshot camera are rotated by Δα and Δβ, the center of the camera is aligned with the ship; the corresponding rotation matrices Rα (about the camera's vertical axis) and Rβ (about its horizontal axis) are:

Rα = [[cos Δα, 0, sin Δα], [0, 1, 0], [−sin Δα, 0, cos Δα]]
Rβ = [[1, 0, 0], [0, cos Δβ, sin Δβ], [0, −sin Δβ, cos Δβ]]

the coordinates of the ship in the camera coordinate system are then:

(Xc″, Yc″, Zc″)ᵀ = Rβ · Rα · (Xc′, Yc′, Zc′)ᵀ

when the center of the snapshot camera is aligned with the ship, the image coordinates of the ship in the snapshot camera are P″(0, 0); combining the three formulas above yields the values of Δα and Δβ, and the camera snapshot azimuth is α′ = α + Δα, β′ = β + Δβ;
wherein the magnification K is calculated as follows: the required field angle fov when snapshotting the ship is defined as fov = Wbox/W × FOV; according to the snapshot camera's relation table between magnification factor and field angle, K is taken as the entry whose field angle is closest to fov.
Finally, after the azimuth angle and the magnification factor are obtained, the snapshot camera is controlled to rotate to the azimuth and zoom to the magnification, the currently detected ship is snapshotted and the result stored; after the snapshot is finished, the camera returns to its initial state and waits for the next snapshot.
The invention also discloses a ship snapshot system based on the image, which comprises the following components:
the first program module is used for acquiring internal parameters and initial external parameters of the snapshot camera;
the second program module is used for acquiring images of the detection camera and the snapshot camera and estimating a mapping matrix H between the detection camera and the snapshot camera;
the third program module is used for predicting the position and the size of a rectangular frame of the ship in the detection image;
the fourth program module is used for mapping the ship rectangular frame in the detection camera to the snapshot camera according to the mapping matrix;
and the fifth program module is used for calculating the azimuth parameters of the center of the snapshot camera aiming at the ship and then snapshotting the ship.
The image-based ship snapshot system disclosed by the invention corresponds to the snapshot method and has the advantages of the snapshot method.
The invention further discloses a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, performs the steps of the above method. The invention also discloses a computer device comprising a memory and a processor, the memory having stored thereon a computer program which, when executed by the processor, performs the steps of the above method. All or part of the flow of the method of the embodiments may be implemented by a computer program, which may be stored in a computer-readable storage medium and executed by a processor to implement the steps of the method embodiments. The computer program comprises computer program code, which may be in the form of source code, object code, an executable file or some intermediate form. The computer-readable medium may include: any entity or device capable of carrying computer program code, a recording medium, a USB flash disk, a removable hard disk, a magnetic disk, an optical disk, a computer memory, a read-only memory (ROM), a random access memory (RAM), an electrical carrier signal, a telecommunications signal, a software distribution medium, and the like. The memory may be used to store computer programs and/or modules, and the processor implements the various functions by running the computer programs and/or modules stored in the memory and calling the data stored in the memory. The memory may include high-speed random access memory, and may also include non-volatile memory such as a hard disk, a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) card, a flash memory card, at least one magnetic disk storage device, a flash memory device, or other non-volatile solid-state storage device.
The above is only a preferred embodiment of the present invention, and the protection scope of the present invention is not limited to the above-mentioned embodiments, and all technical solutions belonging to the idea of the present invention belong to the protection scope of the present invention. It should be noted that modifications and embellishments within the scope of the invention may be made by those skilled in the art without departing from the principle of the invention.
Claims (10)
1. An image-based ship snapshot method is characterized by comprising the following steps:
1) acquiring internal parameters and initial external parameters of a snapshot camera;
2) acquiring images of a detection camera and a snapshot camera, and estimating a mapping matrix H between the detection camera and the snapshot camera;
3) predicting the position and the size of a rectangular frame of the ship in the detection image;
4) mapping the ship rectangular frame in the detection camera to the snapshot camera according to the mapping matrix;
5) and calculating the azimuth parameters and the magnification factor of the center of the snapshot camera aiming at the ship, and then snapshotting the ship.
2. The image-based ship snapshot method of claim 1, wherein in step 1), the internal parameters of the snapshot camera are the photosensitive device dimensions w and h and the image resolution W and H; the external parameters are the focal length f, the field angle FOV, the horizontal deflection angle α and the pitch angle β in the initial state; and a relation table between the magnification factor and the field angle is established.
3. The image-based ship snapshot method of claim 1, wherein in step 2), the step of estimating the mapping matrix between the detection camera and the snapshot camera specifically comprises:
carrying out graying processing on the detection image and the snapshot image to obtain a grayscale image;
counting the proportion of each pixel value in the two grayscale images and the distribution of each gray level, traversing every candidate gray-level threshold and computing the corresponding between-class variance, and taking the threshold at which the between-class variance is maximal as the image binarization threshold;
extracting the closed regions in the binary images and describing each region by its second-order Hu invariant moments; when the distance Dist between two regions is less than 10, they are considered a match, and the point sets making up the matched regions are denoted D(ui, vi) and D′(ui′, vi′).
4. The image-based ship snapshot method of claim 1, wherein in step 2), the mapping matrix between the cameras is estimated by a CPD algorithm, specifically: the mapping transformation between the detection camera and the snapshot camera is defined as an affine transformation model:

u′ = a11·u + a12·v + Δu
v′ = a21·u + a22·v + Δv
wherein (u, v) represents a pixel point in the detection camera, (u′, v′) represents the corresponding pixel point in the snapshot camera, a11, a12, a21 and a22 represent the scaling and rotation factors, and Δu and Δv represent the translation along the u and v axes of the image;
the registration of D(ui, vi) and D′(ui′, vi′) is converted into a fitting problem between the centroid point set of a Gaussian mixture model and the observed data; the likelihood function E(θ, σ²) is maximized by the EM algorithm, and the resulting θ gives the parameters of the H matrix.
5. The image-based ship snapshot method according to any one of claims 1 to 4, wherein in step 3), the position and size of the rectangular frame of the ship in the detection image are predicted through a trained target detection algorithm, specifically:
collecting on-site detection-camera pictures at different times of day and in different weather conditions, and marking the position of the ship's rectangular frame with an annotation tool to establish a data set;
training a target detection model through a data set;
predicting the position box(xmin, ymin, xmax, ymax) and the confidence S of the ship's rectangular frame with the detection model, wherein (xmin, ymin) is the coordinate of the upper-left corner of the rectangle and (xmax, ymax) is the coordinate of the lower-right corner.
6. The image-based ship snapshot method of claim 5, wherein in step 4), the ship rectangular frame box′(xmin′, ymin′, xmax′, ymax′) mapped into the snapshot camera is box′ = H × box; the mapped ship center point P′(u′, v′) is given by u′ = (xmin′ + xmax′)/2 and v′ = (ymin′ + ymax′)/2; and the pixel width of the ship is Wbox = xmax′ − xmin′.
7. The image-based ship snapshot method of claim 6, wherein in step 5), the azimuth parameters comprise a horizontal declination angle α 'and a pitch angle β'.
8. The image-based ship snapshot method of claim 7, wherein the specific calculation of the azimuth angle in step 5) comprises:
the pixel coordinates of the ship center point in the snapshot camera are P′(u′, v′); its image coordinates are P′(x′, y′), and its coordinates P′(Xc′, Yc′, Zc′) in the initial camera coordinate system are:

x′ = (u′ − W/2)·w/W,  y′ = (v′ − H/2)·h/H
Xc′ = x′,  Yc′ = y′,  Zc′ = f

when the horizontal deflection angle and the pitch angle of the snapshot camera are rotated by Δα and Δβ, the center of the camera is aligned with the ship; the corresponding rotation matrices Rα (about the camera's vertical axis) and Rβ (about its horizontal axis) are:

Rα = [[cos Δα, 0, sin Δα], [0, 1, 0], [−sin Δα, 0, cos Δα]]
Rβ = [[1, 0, 0], [0, cos Δβ, sin Δβ], [0, −sin Δβ, cos Δβ]]

the coordinates of the ship in the camera coordinate system are then:

(Xc″, Yc″, Zc″)ᵀ = Rβ · Rα · (Xc′, Yc′, Zc′)ᵀ

when the center of the snapshot camera is aligned with the ship, the image coordinates of the ship in the snapshot camera are P″(0, 0); combining the three formulas above yields the values of Δα and Δβ, and the camera snapshot azimuth is α′ = α + Δα, β′ = β + Δβ.
9. The image-based ship snapshot method of claim 8, wherein the specific calculation of the magnification in step 5) comprises:
the required field angle fov when snapshotting the ship is defined as fov = Wbox/W × FOV; according to the snapshot camera's relation table between magnification factor and field angle, the magnification K is taken as the entry whose field angle is closest to fov.
10. An image-based marine snapshot system, comprising:
the first program module is used for acquiring internal parameters and initial external parameters of the snapshot camera;
the second program module is used for acquiring images of the detection camera and the snapshot camera and estimating a mapping matrix H between the detection camera and the snapshot camera;
the third program module is used for predicting the position and the size of a rectangular frame of the ship in the detection image;
the fourth program module is used for mapping the ship rectangular frame in the detection camera to the snapshot camera according to the mapping matrix;
and the fifth program module is used for calculating the azimuth parameters of the center of the snapshot camera aiming at the ship and then snapshotting the ship.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202111642614.6A CN114332454A (en) | 2021-12-29 | 2021-12-29 | Image-based ship snapshot method and system |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202111642614.6A CN114332454A (en) | 2021-12-29 | 2021-12-29 | Image-based ship snapshot method and system |
Publications (1)
Publication Number | Publication Date |
---|---|
CN114332454A true CN114332454A (en) | 2022-04-12 |
Family
ID=81016494
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202111642614.6A Pending CN114332454A (en) | 2021-12-29 | 2021-12-29 | Image-based ship snapshot method and system |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN114332454A (en) |
Citations (13)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20090043504A1 (en) * | 2007-05-31 | 2009-02-12 | Amrit Bandyopadhyay | System and method for locating, tracking, and/or monitoring the status of personnel and/or assets both indoors and outdoors |
CN103778441A (en) * | 2014-02-26 | 2014-05-07 | 东南大学 | Dezert-Smaradache Theory (DSmT) and Hidden Markov Model (HMM) aircraft sequence target recognition method |
CN107240128A (en) * | 2017-05-09 | 2017-10-10 | 北京理工大学 | A kind of X-ray film and photochrome method for registering based on contour feature |
CN107514994A (en) * | 2017-07-12 | 2017-12-26 | 浙江工业大学 | A kind of headchute localization method based on error compensation |
CN109460740A (en) * | 2018-11-15 | 2019-03-12 | 上海埃威航空电子有限公司 | The watercraft identification recognition methods merged based on AIS with video data |
US20190114777A1 (en) * | 2017-10-18 | 2019-04-18 | Tata Consultancy Services Limited | Systems and methods for edge points based monocular visual slam |
CN110186383A (en) * | 2019-05-31 | 2019-08-30 | 上海大学 | Monocular camera deflection metrology method based on the variation of the target point elevation angle |
US10515458B1 (en) * | 2017-09-06 | 2019-12-24 | The United States Of America, As Represented By The Secretary Of The Navy | Image-matching navigation method and apparatus for aerial vehicles |
CN111815715A (en) * | 2020-07-03 | 2020-10-23 | 浙江大华技术股份有限公司 | Method and device for calibrating zoom pan-tilt camera and storage medium |
CN112348006A (en) * | 2021-01-11 | 2021-02-09 | 湖南星空机器人技术有限公司 | Unmanned aerial vehicle signal identification method, system, medium and equipment |
CN112687127A (en) * | 2020-12-18 | 2021-04-20 | 华南理工大学 | Ship positioning and snapshot method based on AIS and image analysis assistance |
CN113378606A (en) * | 2020-03-10 | 2021-09-10 | 杭州海康威视数字技术股份有限公司 | Method, device and system for determining labeling information |
CN113657256A (en) * | 2021-08-16 | 2021-11-16 | 大连海事大学 | Unmanned ship-borne unmanned aerial vehicle sea-air cooperative visual tracking and autonomous recovery method |
- 2021-12-29: application CN202111642614.6A filed (publication CN114332454A, status Pending)
Non-Patent Citations (5)
Title |
---|
HONG ZHANG et al.: "Merchant Vessel Classification Based on Scattering Component Analysis for COSMO-SkyMed SAR Images", IEEE Geoscience and Remote Sensing Letters * |
ZHANG WEI: "Moving Target Detection and Tracking Method for Unstabilized Imaging from Unmanned Surface Vehicles", China Master's Theses Full-text Database, Engineering Science and Technology II * |
GUO QIAN et al.: "A Virtual Channel Construction and Ship Yaw Detection Method in a Bridge Anti-collision System", Software Guide * |
YAN SHI: "Research on Key Technologies of Heterogeneous Image Registration", China Master's Theses Full-text Database, Information Science and Technology * |
HUANG PANFENG et al.: "Space Tethered Robot Technology", 31 August 2014, China Astronautic Publishing House * |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN107145874B (en) | Ship target detection and identification method in complex background SAR image | |
CN109785291B (en) | Lane line self-adaptive detection method | |
WO2022126377A1 (en) | Traffic lane line detection method and apparatus, and terminal device and readable storage medium | |
Yan et al. | A method of lane edge detection based on Canny algorithm | |
CN108229475B (en) | Vehicle tracking method, system, computer device and readable storage medium | |
CN109919002B (en) | Yellow stop line identification method and device, computer equipment and storage medium | |
CN109344820B (en) | Digital ammeter reading identification method based on computer vision and deep learning | |
WO2022237272A1 (en) | Road image marking method and device for lane line recognition | |
Youjin et al. | A robust lane detection method based on vanishing point estimation | |
CN109708658B (en) | Visual odometer method based on convolutional neural network | |
WO2021088504A1 (en) | Road junction detection method and apparatus, neural network training method and apparatus, intelligent driving method and apparatus, and device | |
Arulmozhi et al. | Image refinement using skew angle detection and correction for Indian license plates | |
CN105184804A (en) | Sea surface small target detection method based on airborne infrared camera aerially-photographed image | |
CN104463238B (en) | Automobile logo identification method and system | |
CN111191653A (en) | License plate recognition method and device, computer equipment and storage medium | |
Qiu et al. | License plate extraction based on vertical edge detection and mathematical morphology | |
CN110473255B (en) | Ship mooring post positioning method based on multiple grid division | |
CN111369570A (en) | Multi-target detection tracking method for video image | |
Tiwari et al. | Automatic vehicle number plate recognition system using matlab | |
CN114332454A (en) | Image-based ship snapshot method and system | |
CN116994236A (en) | Low-quality image license plate detection method based on deep neural network | |
CN112200850B (en) | ORB extraction method based on mature characteristic points | |
CN113971799A (en) | Vehicle nameplate information position detection method and system | |
Yang et al. | A review of intelligent ship marine object detection based on RGB camera | |
Li et al. | Lane detection and road surface reconstruction based on multiple vanishing points | |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
RJ01 | Rejection of invention patent application after publication | Application publication date: 20220412 |