CN112818789A - Method and device for detecting ship entering and exiting port - Google Patents

Method and device for detecting ship entering and exiting port

Info

Publication number
CN112818789A
CN112818789A (application CN202110095962.XA; granted publication CN112818789B)
Authority
CN
China
Prior art keywords
ship
port
monitoring image
line segment
surface area
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202110095962.XA
Other languages
Chinese (zh)
Other versions
CN112818789B (en)
Inventor
韩月
杨玉玉
王俊伟
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Sanya Hai Lan World Marine Mdt Infotech Ltd
Original Assignee
Sanya Hai Lan World Marine Mdt Infotech Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sanya Hai Lan World Marine Mdt Infotech Ltd filed Critical Sanya Hai Lan World Marine Mdt Infotech Ltd
Priority to CN202110095962.XA
Publication of CN112818789A
Application granted
Publication of CN112818789B
Legal status: Active
Anticipated expiration

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/20 Scenes; Scene-specific elements in augmented reality scenes
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 Pattern recognition
    • G06F18/20 Analysing
    • G06F18/24 Classification techniques
    • G06F18/241 Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches
    • G06F18/2413 Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches based on distances to training or reference patterns
    • G06F18/24133 Distances to prototypes
    • G06F18/24137 Distances to cluster centroids
    • G06F18/2414 Smoothing the distance, e.g. radial basis function networks [RBFN]
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 Computing arrangements based on biological models
    • G06N3/02 Neural networks
    • G06N3/04 Architecture, e.g. interconnection topology
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/20 Image preprocessing
    • G06V10/25 Determination of region of interest [ROI] or a volume of interest [VOI]
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/40 Extraction of image or video features
    • G06V10/56 Extraction of image or video features relating to colour
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/50 Context or environment of the image
    • G06V20/52 Surveillance or monitoring of activities, e.g. for recognising suspicious objects

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Data Mining & Analysis (AREA)
  • General Engineering & Computer Science (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Artificial Intelligence (AREA)
  • Evolutionary Computation (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Computational Linguistics (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Health & Medical Sciences (AREA)
  • Biomedical Technology (AREA)
  • Biophysics (AREA)
  • Evolutionary Biology (AREA)
  • General Health & Medical Sciences (AREA)
  • Molecular Biology (AREA)
  • Computing Systems (AREA)
  • Mathematical Physics (AREA)
  • Software Systems (AREA)
  • Image Analysis (AREA)

Abstract

The invention provides a method and a device for detecting ships entering and exiting a port. The method comprises the following steps: acquiring the sea surface area of a port monitoring image; identifying ships in the port monitoring image through a deep detection network according to the port monitoring image and the sea surface area; acquiring a mapping relation between the feature vector of the ship and the port entry/exit state of the ship; and detecting ships entering and exiting the port according to the mapping relation to obtain a detection result. The scheme of the invention improves the accuracy and stability of judging the port entry/exit state of ships at high-density harbor mouths.

Description

Method and device for detecting ship entering and exiting port
Technical Field
The invention relates to the technical field of ship detection, and in particular to a method and a device for detecting ships entering and exiting a port.
Background
When a ship navigates at sea, common positioning methods include satellite positioning, astronomical positioning, radar positioning, landmark positioning, dead reckoning and the like, and the position of the ship is determined from the positioning result. The position is updated at intervals, and the port entry/exit state of the ship is judged by combining the longitude and latitude of the position with the position of the port.
The above positioning method has at least the following problems:
1. In extreme weather, or in areas that radar cannot cover, the approach of judging port entry and exit by analyzing the ship position obtained through radar fails. The positioning deviation is large, which affects the heading judgment for entering and exiting the port. In such cases, manual intervention is required.
2. In high-density scenes, ships enter and exit the port frequently and some ships linger near the port; judging port entry and exit by position information alone therefore introduces delay.
Disclosure of Invention
The invention aims to provide a method and a device for detecting the port entry/exit state of ships, which improve the accuracy and stability of judging the port entry/exit state of ships at high-density harbor mouths.
In order to solve the technical problems, the technical scheme of the invention is as follows:
A method for detecting ships entering and exiting a port, the method comprising:
acquiring a sea surface area of a port monitoring image;
identifying ships in the port monitoring image through a deep detection network according to the port monitoring image and the sea surface area;
acquiring a mapping relation between the feature vector of the ship and the port entry/exit state of the ship;
and detecting ships entering and exiting the port according to the mapping relation to obtain a detection result.
Optionally, the obtaining of the sea surface area of the port monitoring image includes:
acquiring a port monitoring image;
extracting a sea surface area and a non-sea surface area in the port monitoring image;
and performing grayscale processing on the non-sea-surface area to obtain the sea surface area.
Optionally, identifying ships in the port monitoring image through a deep detection network according to the port monitoring image and the sea surface area includes:
taking the red, green and blue channels of the port monitoring image as the first, second and third dimension features of the deep detection network respectively, taking the sea surface area as the fourth dimension feature of the deep detection network, and identifying ships in the port monitoring image through the deep detection network.
Optionally, identifying ships in the port monitoring image through the deep detection network includes:
obtaining a detection position box of a target ship in the port monitoring image through the deep detection network;
predicting the position at which the target ship will appear in the next video frame of the port monitoring image through a first preset position prediction algorithm to obtain a predicted position box;
and judging the predicted position box against the detection position box through a second preset matching algorithm to determine whether the ships in the two frames are the same ship.
Optionally, judging the predicted position box against the detection position box to determine whether the ships in the two frames are the same ship includes:
computing the intersection-over-union (IoU) of the area of the predicted position box and the area of the detection position box, and, if the IoU exceeds a set threshold, determining that the ships in the two frames are the same ship.
Optionally, acquiring the mapping relation between the feature vector of the ship and the port entry/exit state of the ship includes: tracking the ship to obtain the track of the ship in the port monitoring image;
determining a first line segment, a second line segment and a third line segment in the port monitoring image, wherein: the first line segment is placed at the harbor mouth, its direction is tangential to the flow direction of the seawater, and its two ends are the points where it meets the coastlines on the two sides of the harbor entrance; the second line segment is placed inside the port, parallel to the first line segment and several pixels away from it, with its two ends taken where it meets the coastline; the third line segment is placed outside the port, parallel to the first line segment and several pixels away from it, with its two ends taken where it meets the coastline;
acquiring a first position of the ship when its track intersects the first line segment, a second position when its track intersects the second line segment, and a third position when its track intersects the third line segment;
and generating the mapping relation from the ID value of the ship, the position at which the ship is first detected in the port monitoring image, the first position, the second position and the third position, together with the labeled port entry/exit state of the ship.
Optionally, detecting ships entering and exiting the port according to the mapping relation to obtain a detection result includes:
detecting ships entering and exiting the port according to the mapping relation to obtain heading information of the ships entering and exiting the port.
The embodiment of the invention also provides a device for detecting ships entering and exiting a port, which comprises:
the first acquisition module is used for acquiring a sea surface area of the port monitoring image;
the identification module is used for identifying ships in the port monitoring image through a deep detection network according to the port monitoring image and the sea surface area;
the second acquisition module is used for acquiring the mapping relation between the feature vector of the ship and the port entry/exit state of the ship;
and the detection module is used for detecting the entrance and the exit of the ship according to the mapping relation to obtain a detection result.
Optionally, the first acquisition module is specifically configured to: acquire a port monitoring image; extract the sea surface area and the non-sea-surface area in the port monitoring image; and perform grayscale processing on the non-sea-surface area to obtain the sea surface area.
Embodiments of the present invention also provide a computer-readable storage medium storing instructions that, when executed on a computer, cause the computer to perform the method as described above.
The scheme of the invention at least comprises the following beneficial effects:
acquiring the sea surface area of a port monitoring image; identifying ships in the port monitoring image through a deep detection network according to the port monitoring image and the sea surface area; acquiring a mapping relation between the feature vector of the ship and the port entry/exit state of the ship; and detecting ships entering and exiting the port according to the mapping relation to obtain a detection result. By means of a visual sensor, ship targets are detected and tracked in real time in the monitoring picture, and four-dimensional features are adopted as the input of the deep network, which improves detection stability. In addition, by constructing a mapping relation between the multi-dimensional feature vector and the port entry/exit state of the ship, the accuracy and stability of judging the port entry/exit state of ships at high-density ports are greatly improved.
Drawings
FIG. 1 is a schematic flow chart of a method for detecting the arrival and departure of a ship from a port according to an embodiment of the present invention;
FIG. 2 is a schematic flow chart of a method for detecting the entrance and exit of a ship according to an embodiment of the present invention;
FIG. 3 is a schematic block diagram of a device for detecting ships entering and exiting a port according to an embodiment of the present invention.
Detailed Description
Exemplary embodiments of the present disclosure will be described in more detail below with reference to the accompanying drawings. While exemplary embodiments of the present disclosure are shown in the drawings, it should be understood that the present disclosure may be embodied in various forms and should not be limited to the embodiments set forth herein. Rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the scope of the disclosure to those skilled in the art.
As shown in FIG. 1, an embodiment of the present invention provides a method for detecting ships entering and exiting a port, including:
step 11, acquiring a sea surface area of a port monitoring image;
step 12, identifying ships in the port monitoring image through a deep detection network according to the port monitoring image and the sea surface area;
step 13, acquiring a mapping relation between the feature vector of the ship and the port entry/exit state of the ship;
and step 14, detecting ships entering and exiting the port according to the mapping relation to obtain a detection result.
In this embodiment, ship targets are detected and tracked in real time in the monitoring picture by means of a visual sensor, and four-dimensional features are adopted as the input of the deep network, which improves detection stability. In addition, by constructing a mapping relation between the multi-dimensional feature vector and the port entry/exit state of the ship, the accuracy and stability of judging the port entry/exit state of ships at high-density ports are greatly improved.
In an alternative embodiment of the present invention, step 11 may include:
step 111, acquiring a port monitoring image;
step 112, extracting a sea surface area and a non-sea surface area in the port monitoring image;
and step 113, performing grayscale processing on the non-sea-surface area to obtain the sea surface area.
In this embodiment, when the port monitoring image is acquired, a manual calibration method may be used to distinguish the sea surface area from the non-sea-surface area: the pixels of the non-sea-surface area are set to 0, grayscale processing is performed, and the processed image is output.
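As a rough illustration of this preprocessing step (a sketch only; the patent gives no implementation, and the array layout, the manual sea mask, and the simple channel-mean grayscale are all assumptions), the non-sea pixels can be suppressed with NumPy:

```python
import numpy as np

def extract_sea_surface(frame, sea_mask):
    """Return a single-channel sea-surface map for one surveillance frame.

    frame    : (H, W, 3) uint8 RGB port-monitoring image
    sea_mask : (H, W) bool array, True for sea pixels (assumed here to
               come from the manual calibration the patent describes)
    """
    gray = frame.mean(axis=2).astype(np.uint8)  # simple grayscale
    gray[~sea_mask] = 0                         # zero out shore, buildings, etc.
    return gray
```

The resulting map keeps the sea region's intensity and removes everything that could distract the detector, matching the motivation given later for the fourth input dimension.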
In an alternative embodiment of the present invention, step 12 may include:
and respectively taking the red channel, the green channel and the blue channel of the port monitoring image as a first dimension characteristic, a second dimension characteristic and a third dimension characteristic of the depth detection network, taking the sea surface area as a fourth dimension characteristic of the depth detection network, and identifying the ship in the port monitoring image through the depth detection network.
And taking the red channel, the green channel and the blue channel of the port monitoring image as front three-dimensional characteristics, and taking the sea surface area as a fourth-dimensional characteristic of the depth detection network. Here, the deep inspection network may use a YOLOV 4-like deep inspection network to perform feature learning on the input. By extracting the sea surface area as the fourth dimension of the image, the influence of a coastline or a building and the like appearing in a monitoring picture on ship detection can be avoided, and in addition, the input information of the deep convolutional network can be enriched, so that the deep convolutional network can better learn ship characteristics.
In an optional embodiment of the present invention, in step 12, identifying ships in the port monitoring image through the deep detection network includes:
step 121, obtaining a detection position box of a target ship in the port monitoring image through the deep detection network;
step 122, predicting the position at which the target ship will appear in the next video frame of the port monitoring image through a first preset position prediction algorithm to obtain a predicted position box;
and step 123, judging the predicted position box against the detection position box through a second preset matching algorithm to determine whether the ships in the two frames are the same ship.
In an optional embodiment of the present invention, in step 123, judging the predicted position box against the detection position box to determine whether the ships are the same ship may include:
computing the intersection-over-union (IoU) of the area of the predicted position box and the area of the detection position box, and, if the IoU exceeds a set threshold, determining that the ships in the two frames are the same ship.
In this embodiment, the deep detection network adopts a deep convolutional network to extract target ships from the input image. The network is trained adaptively on ships of different scales, so it can ultimately cover multi-scale ship target detection near the port; its output is the detection position box of the target ship.
Then, the position of the ship in the next video frame is predicted with a first preset position prediction algorithm (such as a Kalman filter). The areas of the predicted position box and the detection position box are then compared by intersection-over-union using a second preset matching algorithm (such as the Hungarian matching algorithm); if the IoU exceeds a set threshold, the ship association between the two frames is considered successful, that is, the ships in the two frames are the same ship.
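The predict-then-match loop can be sketched as follows. This is a deliberate simplification: a constant-velocity step stands in for the Kalman filter, and a greedy IoU match stands in for the Hungarian algorithm (an exact optimal assignment would use e.g. `scipy.optimize.linear_sum_assignment` on a cost matrix of negative IoUs); function names and the threshold are assumptions.

```python
def iou(a, b):
    """IoU of two axis-aligned boxes (x1, y1, x2, y2)."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    union = ((a[2] - a[0]) * (a[3] - a[1]) +
             (b[2] - b[0]) * (b[3] - b[1]) - inter)
    return inter / union if union else 0.0

def predict_box(box, velocity):
    """Constant-velocity prediction of a box one frame ahead;
    a stand-in for the Kalman filter named in the text."""
    dx, dy = velocity
    return (box[0] + dx, box[1] + dy, box[2] + dx, box[3] + dy)

def associate(pred_boxes, det_boxes, threshold=0.5):
    """Greedily pair predicted boxes with detections whose IoU exceeds
    the threshold; returns (pred_index, det_index) pairs. A matched pair
    means the two frames show the same ship, so its track ID carries over."""
    pairs, used = [], set()
    for i, pb in enumerate(pred_boxes):
        best_j, best_iou = -1, threshold
        for j, db in enumerate(det_boxes):
            if j in used:
                continue
            v = iou(pb, db)
            if v > best_iou:
                best_j, best_iou = j, v
        if best_j >= 0:
            pairs.append((i, best_j))
            used.add(best_j)
    return pairs
```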
In an optional embodiment of the present invention, the step 13 may include:
step 131, tracking the ship to obtain the track of the ship in the port monitoring image;
step 132, determining a first line segment, a second line segment and a third line segment in the port monitoring image, wherein: the first line segment is placed at the harbor mouth, its direction is tangential to the flow direction of the seawater, and its two ends are the points where it meets the coastlines on the two sides of the harbor entrance; the second line segment is placed inside the port, parallel to the first line segment and several pixels away from it, with its two ends taken where it meets the coastline; the third line segment is placed outside the port, parallel to the first line segment and several pixels away from it, with its two ends taken where it meets the coastline;
step 133, acquiring a first position of the ship when its track intersects the first line segment, a second position when its track intersects the second line segment, and a third position when its track intersects the third line segment;
step 134, generating the mapping relation from the ID value of the ship, the position at which the ship is first detected in the port monitoring image, the first position, the second position and the third position, together with the labeled port entry/exit state of the ship.
In this embodiment, the mapping relation is obtained by training the models involved in steps 11 and 12 above; during training, the port monitoring image in step 11 is drawn from a port monitoring image sample library.
In this embodiment, an ID is assigned to each tracked ship; through tracking over consecutive frames, the track of the ship in the monitoring picture can be obtained, finally achieving the goal of tracking multiple ship targets in the monitoring picture. Three line segments are designated in the port monitoring image. The first, denoted A, is placed at the harbor mouth, its direction tangential to the flow direction of the seawater, with its two ends at the points where it meets the coastlines on the two sides of the harbor entrance. The second, denoted B, is placed inside the port, parallel to A and several pixels away from it, with its two ends at the points where it meets the coastline. The third, denoted C, is placed outside the port, parallel to A and several pixels away from it, with its two ends at the points where it meets the coastline.
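Deciding whether a tracked ship's pixel track crosses gate line A, B, or C is an ordinary 2-D segment-intersection test; a minimal orientation-based version (coordinates are illustrative, and collinear touching is deliberately not counted as a crossing) is:

```python
def _orient(p, q, r):
    """Sign of the cross product (q - p) x (r - p)."""
    v = (q[0] - p[0]) * (r[1] - p[1]) - (q[1] - p[1]) * (r[0] - p[0])
    return (v > 0) - (v < 0)

def segments_cross(p1, p2, q1, q2):
    """True if segment p1-p2 properly intersects segment q1-q2,
    e.g. one step of the ship track against gate line A, B or C."""
    return (_orient(p1, p2, q1) != _orient(p1, p2, q2) and
            _orient(q1, q2, p1) != _orient(q1, q2, p2))
```

In use, each consecutive pair of track points is tested against each gate line, and the track point at the first crossing is recorded as the first, second or third position of the previous step.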
A mapping relation between the multi-dimensional features and the port entry/exit state of the ship is then constructed: the ID value of the ship, the position at which the ship is first detected in the monitoring picture, and the positions of the ship when its track intersects segments A, B and C are taken as a feature vector, and the port entry/exit state of the ship is taken as the label. The mapping relation is then learned with machine learning.
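The mapping itself could be learned by any supervised classifier; the patent does not name a model, so the sketch below uses a tiny nearest-centroid rule as a stand-in, with the feature layout listed above. The function names and the "enter"/"exit" label values are assumptions.

```python
import numpy as np

def feature_vector(ship_id, first_pos, pos_a, pos_b, pos_c):
    """Flatten the patent's features: ship ID, first-detected position,
    and the positions where the track crosses segments A, B and C."""
    return np.array([ship_id, *first_pos, *pos_a, *pos_b, *pos_c], dtype=float)

class NearestCentroid:
    """Minimal stand-in classifier mapping feature vectors to the
    labelled port entry/exit state."""

    def fit(self, X, y):
        X, y = np.asarray(X, dtype=float), np.asarray(y)
        self.labels_ = sorted(set(y))
        self.centroids_ = {c: X[y == c].mean(axis=0) for c in self.labels_}
        return self

    def predict(self, x):
        x = np.asarray(x, dtype=float)
        return min(self.labels_,
                   key=lambda c: np.linalg.norm(x - self.centroids_[c]))
```

With labelled tracks collected from the sample library, `fit` builds the mapping and `predict` returns the entry/exit state for a new ship's feature vector.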
In an alternative embodiment of the present invention, step 14 may include: and detecting the ship entering and exiting the port according to the mapping relation to obtain the course information of the ship entering and exiting the port.
In this embodiment, according to the mapping relation and the feature vector of the ship, the heading information of the ship entering or exiting the port, that is, whether the ship is entering or exiting the port, can be obtained.
As shown in FIG. 2, a specific implementation flow of the above method is as follows:
collecting port monitoring images through a video sensor; collecting samples to build a port monitoring image sample library; extracting the sea surface area in the image as the fourth dimension of the image; constructing a deep detection network and learning the target ships in the image sample library; constructing a mapping relation between the target ship feature vector and the port entry/exit label of the ship; and finally, judging the port entry/exit state of the ship according to the mapping relation.
According to the embodiment of the invention, by constructing the mapping relation and detecting the feature vector of the ship, the port entry/exit state of the ship can be judged in real time and labor cost is reduced; the stability of heading judgment for ships at high-density harbor mouths is improved.
As shown in FIG. 3, an embodiment of the present invention further provides a device 30 for detecting ships entering and exiting a port, including: a first acquisition module 31, used for acquiring the sea surface area of the port monitoring image;
an identification module 32, used for identifying ships in the port monitoring image through a deep detection network according to the port monitoring image and the sea surface area;
a second acquisition module 33, configured to acquire the mapping relation between the feature vector of the ship and the port entry/exit state of the ship;
and the detection module 34 is used for detecting the ship entering and leaving the port according to the mapping relation to obtain a detection result.
Optionally, the first obtaining module 31 is specifically configured to: acquiring a port monitoring image; extracting a sea surface area and a non-sea surface area in the port monitoring image; and carrying out gray processing on the non-sea surface area to obtain a sea surface area.
Optionally, identifying ships in the port monitoring image through a deep detection network according to the port monitoring image and the sea surface area includes:
taking the red, green and blue channels of the port monitoring image as the first, second and third dimension features of the deep detection network respectively, taking the sea surface area as the fourth dimension feature of the deep detection network, and identifying ships in the port monitoring image through the deep detection network.
Optionally, identifying ships in the port monitoring image through the deep detection network includes:
determining a detection position box of a target ship in the port monitoring image through the deep detection network;
predicting the position at which the target ship will appear in the next video frame of the port monitoring image through a first preset position prediction algorithm to obtain a predicted position box;
and judging the predicted position box against the detection position box through a second preset matching algorithm to determine whether the ships in the two frames are the same ship.
Optionally, judging the predicted position box against the detection position box to determine whether the ships in the two frames are the same ship includes:
computing the intersection-over-union (IoU) of the area of the predicted position box and the area of the detection position box, and, if the IoU exceeds a set threshold, determining that the ships in the two frames are the same ship.
Optionally, acquiring the mapping relation between the feature vector of the ship and the port entry/exit state of the ship includes: tracking the ship to obtain the track of the ship in the port monitoring image;
determining a first line segment, a second line segment and a third line segment in the port monitoring image, wherein: the first line segment is placed at the harbor mouth, its direction is tangential to the flow direction of the seawater, and its two ends are the points where it meets the coastlines on the two sides of the harbor entrance; the second line segment is placed inside the port, parallel to the first line segment and several pixels away from it, with its two ends taken where it meets the coastline; the third line segment is placed outside the port, parallel to the first line segment and several pixels away from it, with its two ends taken where it meets the coastline;
acquiring a first position of the ship when its track intersects the first line segment, a second position when its track intersects the second line segment, and a third position when its track intersects the third line segment;
and generating the mapping relation from the ID value of the ship, the position at which the ship is first detected in the port monitoring image, the first position, the second position and the third position, together with the labeled port entry/exit state of the ship.
Optionally, detecting ships entering and exiting the port according to the mapping relation to obtain a detection result includes:
detecting ships entering and exiting the port according to the mapping relation to obtain heading information of the ships entering and exiting the port.
It should be noted that the apparatus is an apparatus corresponding to the above method, and all the implementations in the above method embodiment are applicable to the embodiment of the apparatus, and the same technical effects can be achieved.
Embodiments of the present invention also provide a computer-readable storage medium comprising instructions which, when executed on a computer, cause the computer to perform the method as described above. All the implementation manners in the above method embodiment are applicable to this embodiment, and the same technical effect can be achieved.
In view of the above defects, the scheme of the invention detects and tracks ship targets in real time in the monitoring picture by means of a visual sensor and adopts four-dimensional features as the input of the deep network, which improves detection stability. In addition, by constructing a mapping relation between the multi-dimensional feature vector and the port entry/exit state of the ship, the accuracy and stability of judging the port entry/exit state of ships at high-density ports are greatly improved.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present invention.
It is clear to those skilled in the art that, for convenience and brevity of description, the specific working processes of the above-described systems, apparatuses and units may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
In the embodiments provided in the present invention, it should be understood that the disclosed apparatus and method may be implemented in other ways. For example, the above-described apparatus embodiments are merely illustrative, and for example, the division of the units is only one logical division, and other divisions may be realized in practice, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present invention may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit.
The functions, if implemented in the form of software functional units and sold or used as a stand-alone product, may be stored in a computer-readable storage medium. Based on such understanding, the technical solution of the present invention may be embodied in the form of a software product, which is stored in a storage medium and includes instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method according to the embodiments of the present invention. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a removable hard disk, a ROM, a RAM, a magnetic disk, or an optical disk.
Furthermore, it is to be noted that in the device and method of the invention, it is obvious that the individual components or steps can be decomposed and/or recombined. These decompositions and/or recombinations are to be regarded as equivalents of the present invention. Also, the steps of performing the series of processes described above may naturally be performed chronologically in the order described, but need not necessarily be performed chronologically, and some steps may be performed in parallel or independently of each other. It will be understood by those skilled in the art that all or any of the steps or elements of the method and apparatus of the present invention may be implemented in any computing device (including processors, storage media, etc.) or network of computing devices, in hardware, firmware, software, or any combination thereof, which can be implemented by those skilled in the art using their basic programming skills after reading the description of the present invention.
Thus, the objects of the invention may also be achieved by running a program or a set of programs on any computing device, which may be a well-known general-purpose device. The object of the invention is thus also achieved merely by providing a program product comprising program code for implementing the method or apparatus; that is, such a program product also constitutes the present invention, as does a storage medium storing such a program product. It is to be understood that the storage medium may be any known storage medium or any storage medium developed in the future.
While the foregoing is directed to the preferred embodiment of the present invention, it will be understood by those skilled in the art that various changes and modifications may be made without departing from the spirit and scope of the invention as defined in the appended claims.

Claims (10)

1. A method for detecting the arrival and departure of a ship, comprising:
acquiring a sea surface area of a port monitoring image;
identifying ships in the port monitoring image through a depth detection network according to the port monitoring image and the sea surface area;
acquiring a mapping relation between the characteristic vector of the ship and the states of the ship entering and leaving the port;
and detecting the ship entering and exiting the port according to the mapping relation to obtain a detection result.
2. The method for detecting ships entering and exiting a port according to claim 1, wherein acquiring a sea surface area of a port monitoring image comprises:
acquiring a port monitoring image;
extracting a sea surface area and a non-sea surface area in the port monitoring image;
and carrying out gray processing on the non-sea surface area to obtain the sea surface area.
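One plausible reading of the gray-processing step in claim 2, sketched here as an assumption: pixels outside the extracted sea surface are replaced by their luminance, so that only the sea surface retains colour information for the later detection stage:

```python
import numpy as np

def gray_out_non_sea(frame_rgb: np.ndarray, sea_mask: np.ndarray) -> np.ndarray:
    """Replace non-sea pixels with their grayscale luminance (ITU-R BT.601
    weights) while keeping sea-surface pixels in full colour."""
    gray = (0.299 * frame_rgb[..., 0]
            + 0.587 * frame_rgb[..., 1]
            + 0.114 * frame_rgb[..., 2]).astype(np.uint8)
    out = frame_rgb.copy()
    non_sea = sea_mask == 0
    out[non_sea] = gray[non_sea][:, None]  # broadcast luminance over the 3 channels
    return out
```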
3. The method for detecting ships entering and exiting a port according to claim 1, wherein identifying the ship in the port monitoring image through a depth detection network according to the port monitoring image and the sea surface area comprises:
and respectively taking the red channel, the green channel and the blue channel of the port monitoring image as a first dimension characteristic, a second dimension characteristic and a third dimension characteristic of the depth detection network, taking the sea surface area as a fourth dimension characteristic of the depth detection network, and identifying the ship in the port monitoring image through the depth detection network.
4. The method for detecting ships entering and exiting a port according to claim 3, wherein the identifying the ships in the port monitoring image through the depth detection network comprises:
obtaining a detection position frame of the target ship in the port monitoring image through the depth detection network;
predicting the position of the target ship appearing in the next video frame of the port monitoring image through a first preset position prediction algorithm to obtain a predicted position frame;
and judging the predicted position frame and the detected position frame through a second preset matching algorithm to determine whether the ships between the two frames are the same ship.
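Claim 4 does not fix the "first preset position prediction algorithm"; one simple stand-in (an assumption, not the patent's stated choice) is constant-velocity extrapolation of the position frame across consecutive frames:

```python
def predict_next_box(prev_box, curr_box):
    """Constant-velocity prediction of the next-frame position box.

    Boxes are (x, y, w, h) tuples; each component is extrapolated
    linearly from its change between the previous two frames.
    """
    return tuple(2 * c - p for c, p in zip(curr_box, prev_box))

# A ship that moved +5 px in x and y is predicted to move +5 px again.
pred = predict_next_box((0, 0, 10, 10), (5, 5, 10, 10))  # (10, 10, 10, 10)
```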
5. The method according to claim 4, wherein judging the predicted position frame and the detected position frame to determine whether the ships between the two frames are the same ship comprises:
and calculating the intersection-over-union (IoU) of the area of the predicted position frame and the area of the detected position frame, and if the intersection-over-union exceeds a set threshold, determining that the ships between the two frames are the same ship.
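The area-overlap test of claim 5 is the intersection-over-union (IoU) of the two frames. A minimal sketch, with the threshold value assumed (the claim does not fix it):

```python
def iou(box_a, box_b):
    """Intersection-over-union of two axis-aligned (x, y, w, h) boxes."""
    ax1, ay1, aw, ah = box_a
    bx1, by1, bw, bh = box_b
    ax2, ay2 = ax1 + aw, ay1 + ah
    bx2, by2 = bx1 + bw, by1 + bh
    iw = max(0, min(ax2, bx2) - max(ax1, bx1))  # overlap width
    ih = max(0, min(ay2, by2) - max(ay1, by1))  # overlap height
    inter = iw * ih
    union = aw * ah + bw * bh - inter
    return inter / union if union else 0.0

def same_ship(pred_box, det_box, threshold=0.5):
    """Same ship between two frames if the IoU exceeds the set threshold."""
    return iou(pred_box, det_box) > threshold
```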
6. The method for detecting the arrival and departure of a ship at a port according to claim 4, wherein the step of obtaining a mapping relationship between the feature vector of the ship and the arrival and departure state of the ship comprises the steps of:
tracking the ship to obtain the track of the ship in the port monitoring image;
determining a first line segment, a second line segment and a third line segment in the port monitoring image; the first line segment is arranged at the port entrance, the direction of the first line segment is tangential to the flow direction of the seawater, and the two ends of the first line segment are the junction points of the coastlines on the two sides of the port at the entrance; the second line segment is arranged inside the port, the direction of the second line segment is parallel to that of the first line segment, and the two ends of the second line segment are taken at the junction points with the coastline at a distance of a plurality of pixels from the first line segment inside the port; the third line segment is arranged outside the port, the direction of the third line segment is parallel to that of the first line segment, and the two ends of the third line segment are taken at the junction points with the coastline at a distance of a plurality of pixels from the first line segment outside the port;
acquiring a first position of the ship when the track is intersected with the first line segment, a second position of the ship when the track is intersected with the second line segment, and a third position of the ship when the track is intersected with the third line segment;
and generating the mapping relationship according to the ID value of the ship, the position at which the ship is first detected in the port monitoring image, the first position, the second position, the third position, and the marked entering and exiting port state of the ship.
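The crossing of the ship track with the first, second and third line segments in claim 6 reduces to a segment-intersection test. A sketch using the standard cross-product orientation test, where points are (x, y) pixel tuples; recording the first post-crossing track point is an illustrative simplification:

```python
def segments_intersect(p1, p2, q1, q2):
    """True if open segment p1-p2 strictly crosses segment q1-q2."""
    def cross(o, a, b):
        # z-component of (a - o) x (b - o)
        return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])
    d1 = cross(q1, q2, p1)
    d2 = cross(q1, q2, p2)
    d3 = cross(p1, p2, q1)
    d4 = cross(p1, p2, q2)
    return (d1 * d2 < 0) and (d3 * d4 < 0)

def crossing_position(track, segment):
    """First track point after the trajectory crosses the gate segment,
    or None if it never crosses; a simplified stand-in for recording the
    first, second and third positions of claim 6."""
    for a, b in zip(track, track[1:]):
        if segments_intersect(a, b, *segment):
            return b
    return None
```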
7. The method for detecting the ship entering and exiting the port according to claim 6, wherein the step of detecting the ship entering and exiting the port according to the mapping relationship to obtain a detection result comprises the following steps:
and detecting the ship entering and exiting the port according to the mapping relation to obtain the course information of the ship entering and exiting the port.
8. A detection device for ships entering and exiting a port, comprising:
the first acquisition module is used for acquiring a sea surface area of the port monitoring image;
the identification module is used for identifying ships in the port monitoring image through a depth detection network according to the port monitoring image and the sea surface area;
the second acquisition module is used for acquiring the mapping relation between the characteristic vector of the ship and the states of the ship entering and leaving the port;
and the detection module is used for detecting the entrance and the exit of the ship according to the mapping relation to obtain a detection result.
9. The device for detecting the arrival and departure of ships from and to port of claim 8, wherein the first acquiring module is specifically configured to: acquiring a port monitoring image; extracting a sea surface area and a non-sea surface area in the port monitoring image; and carrying out gray processing on the non-sea surface area to obtain a sea surface area.
10. A computer-readable storage medium storing instructions that, when executed on a computer, cause the computer to perform the method of any one of claims 1 to 7.
CN202110095962.XA 2021-01-25 2021-01-25 Method and device for detecting arrival and departure of ship Active CN112818789B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110095962.XA CN112818789B (en) 2021-01-25 2021-01-25 Method and device for detecting arrival and departure of ship

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110095962.XA CN112818789B (en) 2021-01-25 2021-01-25 Method and device for detecting arrival and departure of ship

Publications (2)

Publication Number Publication Date
CN112818789A true CN112818789A (en) 2021-05-18
CN112818789B CN112818789B (en) 2024-10-22

Family

ID=75859464

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110095962.XA Active CN112818789B (en) 2021-01-25 2021-01-25 Method and device for detecting arrival and departure of ship

Country Status (1)

Country Link
CN (1) CN112818789B (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113282782A (en) * 2021-05-21 2021-08-20 三亚海兰寰宇海洋信息科技有限公司 Track acquisition method and device based on multi-point phase camera array
CN116634631A (en) * 2023-06-09 2023-08-22 江苏唐城霓虹数码科技有限公司 Intelligent control method and system for double lifting of port high-pole lamp for illumination monitoring

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107731011A (en) * 2017-10-27 2018-02-23 中国科学院深圳先进技术研究院 A kind of harbour is moored a boat monitoring method, system and electronic equipment
CN109725310A (en) * 2018-11-30 2019-05-07 中船(浙江)海洋科技有限公司 A kind of ship's fix supervisory systems based on YOLO algorithm and land-based radar system
CN110807424A (en) * 2019-11-01 2020-02-18 深圳市科卫泰实业发展有限公司 Port ship comparison method based on aerial images
CN111899450A (en) * 2020-07-24 2020-11-06 宁波盛洋电子科技有限公司 Method and system for monitoring ships entering and exiting port and finding dangerous ships


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
LI JIN: "Research on a Port Traffic Flow Prediction Model Based on AIS Ship Data and Artificial Intelligence Algorithms", China Masters' Theses Full-text Database, 15 July 2020 (2020-07-15), pages 17-20 *

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113282782A (en) * 2021-05-21 2021-08-20 三亚海兰寰宇海洋信息科技有限公司 Track acquisition method and device based on multi-point phase camera array
CN116634631A (en) * 2023-06-09 2023-08-22 江苏唐城霓虹数码科技有限公司 Intelligent control method and system for double lifting of port high-pole lamp for illumination monitoring
CN116634631B (en) * 2023-06-09 2024-01-12 江苏唐城霓虹数码科技有限公司 Intelligent control method and system for double lifting of port high-pole lamp for illumination monitoring

Also Published As

Publication number Publication date
CN112818789B (en) 2024-10-22

Similar Documents

Publication Publication Date Title
CN109255317B (en) Aerial image difference detection method based on double networks
CN109636771B (en) Flight target detection method and system based on image processing
CN112883819A (en) Multi-target tracking method, device, system and computer readable storage medium
CN107452015B (en) Target tracking system with re-detection mechanism
US20120328161A1 (en) Method and multi-scale attention system for spatiotemporal change determination and object detection
CN105427342B (en) A kind of underwater Small object sonar image target detection tracking method and system
CN111325769B (en) Target object detection method and device
CN113591968A (en) Infrared weak and small target detection method based on asymmetric attention feature fusion
CN110458198B (en) Multi-resolution target identification method and device
CN105930852B (en) A kind of bubble image-recognizing method
EP3734496A1 (en) Image analysis method and apparatus, and electronic device and readable storage medium
CN110555868A (en) method for detecting small moving target under complex ground background
CN112818789A (en) Method and device for detecting ship entering and exiting port
CN113822352B (en) Infrared dim target detection method based on multi-feature fusion
CN112597877A (en) Factory personnel abnormal behavior detection method based on deep learning
CN113591592B (en) Overwater target identification method and device, terminal equipment and storage medium
US20220386571A1 (en) Fish counting system, fish counting method, and program
CN111402185B (en) Image detection method and device
CN111553184A (en) Small target detection method and device based on electronic purse net and electronic equipment
CN113705380A (en) Target detection method and device in foggy days, electronic equipment and storage medium
CN117745709A (en) Railway foreign matter intrusion detection method, system, equipment and medium
CN117830356A (en) Target tracking method, device, equipment and medium
CN103093481B (en) A kind of based on moving target detecting method under the static background of watershed segmentation
CN116862832A (en) Three-dimensional live-action model-based operator positioning method
CN115294439B (en) Method, system, equipment and storage medium for detecting air weak and small moving target

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
CB02 Change of applicant information

Country or region after: China

Address after: No. 1156, F1, Building 2, Building A, Qingshuiwan International Information Industry Park, No. 1 Lehuo Avenue, Yingzhou Town, Lingshui Li Autonomous County, Hainan Province, 572427

Applicant after: Hainan Hailan Huanyu Ocean Information Technology Co.,Ltd.

Address before: 572000 rooms 425 and 426, 4th floor, building 4, Baitai Industrial Park, yazhouwan science and Technology City, Yazhou District, Sanya City, Hainan Province

Applicant before: Sanya Hai Lan world marine Mdt InfoTech Ltd.

Country or region before: China

GR01 Patent grant