Disclosure of Invention
The invention aims to solve the technical problem of providing a method and a device for detecting the state of a ship entering and exiting a port, which improve the accuracy and stability of judging the entering and exiting states of ships in high-density ports.
In order to solve the technical problems, the technical scheme of the invention is as follows:
a method of detecting the arrival and departure of a ship from a port, the method comprising:
acquiring a sea surface area of a port monitoring image;
identifying ships in the port monitoring image through a deep detection network according to the port monitoring image and the sea surface area;
acquiring a mapping relation between a feature vector of the ship and the entering and exiting states of the ship;
and detecting the ship entering and exiting the port according to the mapping relation to obtain a detection result.
Optionally, the obtaining of the sea surface area of the port monitoring image includes:
acquiring a port monitoring image;
extracting a sea surface area and a non-sea surface area in the port monitoring image;
and graying out the non-sea surface area (setting its pixels to zero) to obtain the sea surface area.
Optionally, identifying, according to the port monitoring image and the sea surface area, a ship in the port monitoring image through a deep detection network includes:
taking the red, green and blue channels of the port monitoring image as the first, second and third dimension features of the deep detection network, taking the sea surface area as the fourth dimension feature of the deep detection network, and identifying the ship in the port monitoring image through the deep detection network.
Optionally, identifying, by the deep detection network, the ship in the port monitoring image includes:
obtaining a detection position frame of the target ship in the port monitoring image through the deep detection network;
predicting the position at which the target ship will appear in the next video frame of the port monitoring image through a first preset position prediction algorithm to obtain a predicted position frame;
and comparing the predicted position frame with the detection position frame through a second preset matching algorithm to determine whether the ships in the two frames are the same ship.
Optionally, comparing the predicted position frame with the detection position frame to determine whether the ships in the two frames are the same ship includes:
computing the intersection-over-union (IoU) of the predicted position frame and the detection position frame, and determining that the ships in the two frames are the same ship if the IoU exceeds a set threshold.
Optionally, obtaining a mapping relation between the feature vector of the ship and the entering and exiting states of the ship includes: tracking the ship to obtain the track of the ship in the port monitoring image;
determining a first line segment, a second line segment and a third line segment in the port monitoring image; the first line segment is arranged at the port entrance, its direction is tangential to the flow direction of the seawater, and its two ends are the points where the coastlines on the two sides of the port meet the entrance; the second line segment is arranged inside the port, parallel to the first line segment and a number of pixels away from it, with its two ends at the points where it meets the coastline; the third line segment is arranged outside the port, parallel to the first line segment and a number of pixels away from it, with its two ends at the points where it meets the coastline;
acquiring a first position of the ship when the track intersects the first line segment, a second position of the ship when the track intersects the second line segment, and a third position of the ship when the track intersects the third line segment;
and generating the mapping relation according to the ID value of the ship, the position at which the ship is first detected in the port monitoring image, the first position, the second position and the third position, together with the labeled entering and exiting states of the ship.
Optionally, according to the mapping relationship, detecting the ship entering and exiting the port to obtain a detection result, including:
and detecting the ship entering and exiting the port according to the mapping relation to obtain heading information of the ship entering and exiting the port.
The embodiment of the invention also provides a detection device for the entrance and exit of ships, which comprises:
the first acquisition module is used for acquiring a sea surface area of the port monitoring image;
the identification module is used for identifying ships in the port monitoring image through a deep detection network according to the port monitoring image and the sea surface area;
the second acquisition module is used for acquiring the mapping relation between the feature vector of the ship and the entering and exiting states of the ship;
and the detection module is used for detecting the ship entering and exiting the port according to the mapping relation to obtain a detection result.
Optionally, the first acquisition module is specifically configured to: acquire a port monitoring image; extract a sea surface area and a non-sea surface area in the port monitoring image; and gray out the non-sea surface area to obtain the sea surface area.
Embodiments of the present invention also provide a computer-readable storage medium storing instructions that, when executed on a computer, cause the computer to perform the method as described above.
The scheme of the invention at least comprises the following beneficial effects:
obtaining a sea surface area of a port monitoring image; identifying ships in the port monitoring image through a deep detection network according to the port monitoring image and the sea surface area; acquiring a mapping relation between a feature vector of the ship and the entering and exiting states of the ship; and detecting the ship entering and exiting the port according to the mapping relation to obtain a detection result. By means of a visual sensor, ship targets are detected and tracked in real time in the monitoring picture, and four-dimensional features are used as the input of the deep network, which improves detection stability. In addition, by constructing a mapping relation between the multi-dimensional feature vector and the entering and exiting states of the ship, the accuracy and stability of judging the entering and exiting states of ships in a high-density port are greatly improved.
Detailed Description
Exemplary embodiments of the present disclosure will be described in more detail below with reference to the accompanying drawings. While exemplary embodiments of the present disclosure are shown in the drawings, it should be understood that the present disclosure may be embodied in various forms and should not be limited to the embodiments set forth herein. Rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the scope of the disclosure to those skilled in the art.
As shown in fig. 1, an embodiment of the present invention provides a method for detecting a ship entering and exiting a port, including:
step 11, acquiring a sea surface area of a port monitoring image;
step 12, identifying ships in the port monitoring image through a deep detection network according to the port monitoring image and the sea surface area;
step 13, acquiring a mapping relation between a feature vector of the ship and the entering and exiting states of the ship;
and step 14, detecting the ship entering and exiting the port according to the mapping relation to obtain a detection result.
In this embodiment, ship targets are detected and tracked in real time in the monitoring picture by means of a visual sensor, and four-dimensional features are used as the input of the deep network, which improves detection stability. In addition, by constructing a mapping relation between the multi-dimensional feature vector and the entering and exiting states of the ship, the accuracy and stability of judging the entering and exiting states of ships in a high-density port are greatly improved.
In an alternative embodiment of the present invention, step 11 may include:
step 111, acquiring a port monitoring image;
step 112, extracting a sea surface area and a non-sea surface area in the port monitoring image;
and step 113, graying out the non-sea surface area to obtain the sea surface area.
In this embodiment, when the port monitoring image is acquired, a manual calibration method may be used to distinguish the sea surface area from the non-sea area; that is, the pixels of the non-sea area are set to 0 (grayed out), and the processed image is output.
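As a minimal sketch of the graying step above (the array names and the use of a boolean mask obtained from the one-time manual calibration are assumptions for illustration):

```python
import numpy as np

def mask_non_sea(image, sea_mask):
    """Zero out the non-sea pixels of an RGB monitoring frame.

    image:    H x W x 3 uint8 port monitoring frame
    sea_mask: H x W boolean array, True where the pixel belongs to the
              sea surface (e.g. from a one-time manual calibration)
    """
    out = image.copy()
    out[~sea_mask] = 0  # non-sea pixels set to 0, as described above
    return out

# toy 2x2 frame: left column is sea, right column is shore
frame = np.full((2, 2, 3), 200, dtype=np.uint8)
mask = np.array([[True, False], [True, False]])
result = mask_non_sea(frame, mask)
```

The mask is computed once per camera view, so this masking adds negligible per-frame cost.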
In an alternative embodiment of the present invention, step 12 may include:
taking the red, green and blue channels of the port monitoring image as the first, second and third dimension features of the deep detection network, taking the sea surface area as the fourth dimension feature of the deep detection network, and identifying the ship in the port monitoring image through the deep detection network.
The red, green and blue channels of the port monitoring image form the first three dimensions of the input, and the sea surface area forms the fourth dimension of the deep detection network. Here, the deep detection network may be a YOLOv4-like network that performs feature learning on this input. Extracting the sea surface area as the fourth dimension of the image avoids the influence of coastlines, buildings and the like appearing in the monitoring picture on ship detection, and also enriches the input information of the deep convolutional network, so that the network can better learn ship features.
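The four-dimensional input described above can be assembled as follows. This is an illustrative sketch only: the function name and the [0, 1] normalisation are assumptions, not the exact pre-processing of the YOLOv4-like network.

```python
import numpy as np

def to_four_channel(image, sea_mask):
    """Stack R, G, B and the binary sea-surface mask into a 4-channel input.

    image:    H x W x 3 uint8 frame (channel order assumed R, G, B)
    sea_mask: H x W boolean sea-surface mask
    Returns an H x W x 4 float32 array normalised to [0, 1], ready to
    feed a detection backbone whose first layer accepts 4 channels.
    """
    rgb = image.astype(np.float32) / 255.0
    sea = sea_mask.astype(np.float32)[..., None]  # add channel axis
    return np.concatenate([rgb, sea], axis=-1)

x = to_four_channel(np.zeros((4, 4, 3), dtype=np.uint8),
                    np.ones((4, 4), dtype=bool))
```

The only network change this requires is widening the first convolution from 3 to 4 input channels.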
In an optional embodiment of the present invention, in step 12, identifying the ship in the port monitoring image through the deep detection network includes:
step 121, obtaining a detection position frame of the target ship in the port monitoring image through the deep detection network;
step 122, predicting the position at which the target ship will appear in the next video frame of the port monitoring image through a first preset position prediction algorithm to obtain a predicted position frame;
and step 123, comparing the predicted position frame with the detection position frame through a second preset matching algorithm to determine whether the ships in the two frames are the same ship.
In an optional embodiment of the present invention, in step 123, determining whether the predicted position frame and the detection position frame correspond to the same ship may include:
computing the intersection-over-union (IoU) of the predicted position frame and the detection position frame, and determining that the ships in the two frames are the same ship if the IoU exceeds a set threshold.
In this embodiment, the deep detection network uses a deep convolutional network to extract the target ship from the input image; ships of different scales are trained adaptively, so that multi-scale ship target detection near the port is covered, and the output of the deep detection network is the detection position frame of the target ship.
Then, the position of the ship in the next video frame is predicted by a first preset position prediction algorithm (such as a Kalman filter), and the predicted position frame and the detection position frame are matched by a second preset matching algorithm (such as the Hungarian matching algorithm) using their IoU; if the IoU exceeds a set threshold, ship association between the two frames is successful, that is, the ships in the two frames are the same ship.
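The IoU association step can be sketched as below. For brevity, a single pairwise check with an assumed threshold of 0.5 stands in for the full Hungarian assignment over all predicted and detected frames:

```python
def iou(box_a, box_b):
    """Intersection-over-union of two boxes given as (x1, y1, x2, y2)."""
    ix1, iy1 = max(box_a[0], box_b[0]), max(box_a[1], box_b[1])
    ix2, iy2 = min(box_a[2], box_b[2]), min(box_a[3], box_b[3])
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    union = area_a + area_b - inter
    return inter / union if union > 0 else 0.0

def same_ship(predicted_box, detected_box, threshold=0.5):
    """Associate a Kalman-predicted box with a detected box when their
    IoU exceeds the threshold (the 0.5 value is an assumed example)."""
    return iou(predicted_box, detected_box) > threshold
```

With many ships per frame, the pairwise IoU matrix would be fed to the Hungarian algorithm to obtain a globally optimal one-to-one assignment.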
In an optional embodiment of the present invention, the step 13 may include:
step 131, tracking the ship to obtain the track of the ship in the port monitoring image;
step 132, determining a first line segment, a second line segment and a third line segment in the port monitoring image; the first line segment is arranged at the port entrance, its direction is tangential to the flow direction of the seawater, and its two ends are the points where the coastlines on the two sides of the port meet the entrance; the second line segment is arranged inside the port, parallel to the first line segment and a number of pixels away from it, with its two ends at the points where it meets the coastline; the third line segment is arranged outside the port, parallel to the first line segment and a number of pixels away from it, with its two ends at the points where it meets the coastline;
step 133, acquiring a first position of the ship when the track intersects with the first line segment, a second position of the ship when the track intersects with the second line segment, and a third position of the ship when the track intersects with the third line segment;
step 134, generating the mapping relation according to the ID value of the ship, the position at which the ship is first detected in the port monitoring image, the first position, the second position and the third position, together with the labeled entering and exiting states of the ship.
In this embodiment, the mapping relation is obtained by training the relevant models of steps 11 and 12 above; during training, the port monitoring image in step 11 is drawn from a port monitoring image sample library.
In this embodiment, an ID is assigned to each tracked ship; the track of the ship in the monitoring picture is obtained by tracking it over consecutive frames, so that multiple ship targets in the monitoring picture can be tracked simultaneously. Three line segments are designated in the port monitoring image: the first, marked A, lies at the port entrance, its direction tangential to the flow direction of the seawater, with its two ends at the points where the coastlines on the two sides of the port meet the entrance; the second, marked B, lies inside the port, parallel to A and a number of pixels away from it, with its two ends at the points where it meets the coastline; the third, marked C, lies outside the port, parallel to A and a number of pixels away from it, with its two ends at the points where it meets the coastline.
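Detecting the moment a ship's track crosses segment A, B or C reduces to a standard 2-D segment intersection test between one track step and the marked segment. A minimal sketch (function and variable names are illustrative, not taken from the original):

```python
def _orient(p, q, r):
    """Sign of the cross product (q - p) x (r - p); points are (x, y)."""
    return (q[0] - p[0]) * (r[1] - p[1]) - (q[1] - p[1]) * (r[0] - p[0])

def crosses(track_p, track_q, seg_a, seg_b):
    """True if the track step from track_p to track_q properly crosses
    the marked segment a-b (endpoints strictly on opposite sides)."""
    d1 = _orient(seg_a, seg_b, track_p)
    d2 = _orient(seg_a, seg_b, track_q)
    d3 = _orient(track_p, track_q, seg_a)
    d4 = _orient(track_p, track_q, seg_b)
    return (d1 * d2 < 0) and (d3 * d4 < 0)
```

Running this test on each consecutive pair of track points against A, B and C yields the three crossing positions used in the feature vector.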
Constructing the mapping relation between the multi-dimensional features and the entering and exiting states of the ship: the ID value of the ship, the position at which the ship is first detected in the monitoring picture, and the positions of the ship when its track intersects segments A, B and C are taken as a feature vector, and the entering or exiting state of the ship is taken as a label. The mapping relation is then learned by machine learning.
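A sketch of assembling one labeled training sample for this mapping relation. The layout of the feature vector (ID followed by four (x, y) pixel positions) follows the description above, while the concrete numbers and the label string are illustrative only; any standard classifier could then be fitted to such (vector, label) pairs.

```python
def build_feature_vector(ship_id, first_pos, pos_a, pos_b, pos_c):
    """Flatten the five items into one feature vector: the ship ID, the
    first-detected position, and the positions where the track crosses
    segments A, B and C (each position an (x, y) pixel pair)."""
    vec = [float(ship_id)]
    for x, y in (first_pos, pos_a, pos_b, pos_c):
        vec.extend([float(x), float(y)])
    return vec

# one labeled sample: feature vector plus entering/exiting label
sample = (build_feature_vector(7, (120, 80), (100, 90), (60, 95), (140, 85)),
          "entering")
```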
In an alternative embodiment of the present invention, step 14 may include: detecting the ship entering and exiting the port according to the mapping relation to obtain heading information of the ship entering and exiting the port.
In this embodiment, according to the mapping relation and the feature vector of the ship, the heading information of the ship, that is, whether the ship is entering or exiting the port, can be obtained.
As shown in fig. 2, it is a specific implementation flow of the above method:
collecting port monitoring images through a video sensor; collecting samples to obtain a port monitoring image sample library; extracting a sea surface area in the image as a fourth dimension of the image; constructing a depth detection network, and learning a target ship in an image sample library; and constructing a mapping relation between the target ship feature vector and the ship port entering and exiting label. And finally, judging the state of the ship entering and leaving the port according to the mapping relation.
According to the embodiment of the invention, by constructing the mapping relation and detecting the feature vector of the ship, the state information of the ship entering and exiting the port can be judged in real time, labor cost is reduced, and the stability of heading determination for ships at high-density port entrances is improved.
As shown in fig. 3, an embodiment of the present invention further provides a detection apparatus 30 for a ship entering and exiting a port, including: the first acquisition module 31 is used for acquiring a sea surface area of the port monitoring image;
the identification module 32 is used for identifying ships in the port monitoring image through a deep detection network according to the port monitoring image and the sea surface area;
a second obtaining module 33, configured to obtain a mapping relationship between the feature vector of the ship and the ship entering and exiting port state;
and the detection module 34 is used for detecting the ship entering and leaving the port according to the mapping relation to obtain a detection result.
Optionally, the first acquisition module 31 is specifically configured to: acquire a port monitoring image; extract a sea surface area and a non-sea surface area in the port monitoring image; and gray out the non-sea surface area to obtain the sea surface area.
Optionally, identifying, according to the port monitoring image and the sea surface area, a ship in the port monitoring image through a deep detection network includes:
taking the red, green and blue channels of the port monitoring image as the first, second and third dimension features of the deep detection network, taking the sea surface area as the fourth dimension feature of the deep detection network, and identifying the ship in the port monitoring image through the deep detection network.
Optionally, identifying, by the deep detection network, the ship in the port monitoring image includes:
determining a detection position frame of a target ship in the port monitoring image through the deep detection network;
predicting the position at which the target ship will appear in the next video frame of the port monitoring image through a first preset position prediction algorithm to obtain a predicted position frame;
and comparing the predicted position frame with the detection position frame through a second preset matching algorithm to determine whether the ships in the two frames are the same ship.
Optionally, comparing the predicted position frame with the detection position frame to determine whether the ships in the two frames are the same ship includes:
computing the intersection-over-union (IoU) of the predicted position frame and the detection position frame, and determining that the ships in the two frames are the same ship if the IoU exceeds a set threshold.
Optionally, obtaining the mapping relation between the feature vector of the ship and the entering and exiting states of the ship includes: tracking the ship to obtain the track of the ship in the port monitoring image;
determining a first line segment, a second line segment and a third line segment in the port monitoring image; the first line segment is arranged at the port entrance, its direction is tangential to the flow direction of the seawater, and its two ends are the points where the coastlines on the two sides of the port meet the entrance; the second line segment is arranged inside the port, parallel to the first line segment and a number of pixels away from it, with its two ends at the points where it meets the coastline; the third line segment is arranged outside the port, parallel to the first line segment and a number of pixels away from it, with its two ends at the points where it meets the coastline;
acquiring a first position of the ship when the track intersects the first line segment, a second position of the ship when the track intersects the second line segment, and a third position of the ship when the track intersects the third line segment;
and generating the mapping relation according to the ID value of the ship, the position at which the ship is first detected in the port monitoring image, the first position, the second position and the third position, together with the labeled entering and exiting states of the ship.
Optionally, according to the mapping relationship, detecting the ship entering and exiting the port to obtain a detection result, including:
and detecting the ship entering and exiting the port according to the mapping relation to obtain heading information of the ship entering and exiting the port.
It should be noted that the apparatus is an apparatus corresponding to the above method, and all the implementations in the above method embodiment are applicable to the embodiment of the apparatus, and the same technical effects can be achieved.
Embodiments of the present invention also provide a computer-readable storage medium comprising instructions which, when executed on a computer, cause the computer to perform the method as described above. All the implementation manners in the above method embodiment are applicable to this embodiment, and the same technical effect can be achieved.
To address the above drawbacks, the ship target is detected and tracked in real time in the monitoring picture by means of a visual sensor, and four-dimensional features are used as the input of the deep network, which improves detection stability. In addition, by constructing a mapping relation between the multi-dimensional feature vector and the entering and exiting states of the ship, the accuracy and stability of judging the entering and exiting states of ships in a high-density port are greatly improved.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present invention.
It is clear to those skilled in the art that, for convenience and brevity of description, the specific working processes of the above-described systems, apparatuses and units may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
In the embodiments provided in the present invention, it should be understood that the disclosed apparatus and method may be implemented in other ways. For example, the above-described apparatus embodiments are merely illustrative, and for example, the division of the units is only one logical division, and other divisions may be realized in practice, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present invention may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit.
The functions, if implemented in the form of software functional units and sold or used as a stand-alone product, may be stored in a computer readable storage medium. Based on such understanding, the technical solution of the present invention may be embodied in the form of a software product, which is stored in a storage medium and includes instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method according to the embodiments of the present invention. And the aforementioned storage medium includes: various media capable of storing program codes, such as a U disk, a removable hard disk, a ROM, a RAM, a magnetic disk, or an optical disk.
Furthermore, it is to be noted that in the device and method of the invention, it is obvious that the individual components or steps can be decomposed and/or recombined. These decompositions and/or recombinations are to be regarded as equivalents of the present invention. Also, the steps of performing the series of processes described above may naturally be performed chronologically in the order described, but need not necessarily be performed chronologically, and some steps may be performed in parallel or independently of each other. It will be understood by those skilled in the art that all or any of the steps or elements of the method and apparatus of the present invention may be implemented in any computing device (including processors, storage media, etc.) or network of computing devices, in hardware, firmware, software, or any combination thereof, which can be implemented by those skilled in the art using their basic programming skills after reading the description of the present invention.
Thus, the objects of the invention may also be achieved by running a program or a set of programs on any computing device. The computing device may be a well-known general purpose device. The object of the invention is thus also achieved merely by providing a program product comprising program code for implementing the method or the apparatus. That is, such a program product also constitutes the present invention, and a storage medium storing such a program product also constitutes the present invention. It is to be understood that the storage medium may be any known storage medium or any storage medium developed in the future.
While the foregoing is directed to the preferred embodiment of the present invention, it will be understood by those skilled in the art that various changes and modifications may be made without departing from the spirit and scope of the invention as defined in the appended claims.