CN113609891A - Ship identification monitoring method and system - Google Patents

Ship identification monitoring method and system

Info

Publication number
CN113609891A
Authority
CN
China
Prior art keywords
ship
monitoring
edge
module
pixel point
Prior art date
Legal status
Pending
Application number
CN202110660132.7A
Other languages
Chinese (zh)
Inventor
王方东
孟凡清
臧永生
Current Assignee
Beijing Liaowang Shenzhou Technology Co ltd
Original Assignee
Beijing Liaowang Shenzhou Technology Co ltd
Priority date
Filing date
Publication date
Application filed by Beijing Liaowang Shenzhou Technology Co ltd filed Critical Beijing Liaowang Shenzhou Technology Co ltd
Priority to CN202110660132.7A priority Critical patent/CN113609891A/en
Publication of CN113609891A publication Critical patent/CN113609891A/en
Pending legal-status Critical Current

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 Computing arrangements based on biological models
    • G06N3/02 Neural networks
    • G06N3/04 Architecture, e.g. interconnection topology
    • G06N3/045 Combinations of networks
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 Computing arrangements based on biological models
    • G06N3/02 Neural networks
    • G06N3/08 Learning methods

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • General Health & Medical Sciences (AREA)
  • Biomedical Technology (AREA)
  • Biophysics (AREA)
  • Computational Linguistics (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Evolutionary Computation (AREA)
  • Artificial Intelligence (AREA)
  • Molecular Biology (AREA)
  • Computing Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Mathematical Physics (AREA)
  • Software Systems (AREA)
  • Health & Medical Sciences (AREA)
  • Image Analysis (AREA)

Abstract

The invention provides a ship identification monitoring method and system, comprising the following steps: acquiring a sea surface environment image of a ship by using 5G data transmission technology; detecting the sea surface environment image where the ship is located based on an edge feature extraction strategy and extracting sea surface ship edge features; combining a deep learning neural network strategy with the YOLO algorithm to construct a ship feature recognition model, and inputting the acquired sea surface ship edge features for training; transmitting the coordinate data of the ship to be monitored by using a positioning sensor, constructing a positioning recognition model based on a binary strategy, and inputting the recognition data output by the trained ship feature recognition model; and calculating and matching the coordinate data and the recognition data to obtain the final ship monitoring data. According to the invention, data information is efficiently acquired through 5G data transmission technology, and the position and state information of the monitored ship is accurately obtained by combining target detection with a binary optimized positioning technology, enabling real-time monitoring and improving the safety maintenance and application universality of marine ships.

Description

Ship identification monitoring method and system
Technical Field
The invention relates to the technical field of image processing and ship identification and positioning, in particular to a ship identification and monitoring method and system.
Background
Marine vessel identification has wide applications, for example detecting a malicious vessel approaching one's own vessel, detecting a foreign vessel entering Chinese territorial waters, intelligent monitoring systems, marine big data, and the like. However, in sea surface vessel identification, storms, heavy fog, illumination and the like all strongly affect recognition, so the accuracy, precision and robustness of vessel identification are of wide concern.
At present, the identification algorithm applied to sea surface ships is mainly the SVM algorithm, which performs classification training and prediction on ship signals. It cannot achieve real-time identification, its running time is too long, its recognition rate is low, and it lacks accurate data support for the observation of existing ocean big data.
Disclosure of Invention
The embodiment of the invention provides a ship identification monitoring method and a system, which can improve the accuracy and efficiency of ship identification and positioning and realize real-time monitoring.
In a first aspect of the embodiments of the present invention, a method for identifying and monitoring a ship is provided, and optionally in a possible implementation manner of the first aspect, the method includes: acquiring a sea surface environment image of a ship by using a 5G data transmission technology; detecting a sea surface environment image where the ship is located based on an edge feature extraction strategy and extracting sea surface ship edge features; combining a deep learning neural network strategy with a YOLO algorithm, constructing a ship feature recognition model, and inputting the obtained edge features of the sea surface ship for training; transmitting coordinate data of a ship to be monitored by using a positioning sensor, constructing a positioning recognition model based on a binary strategy, and inputting recognition data output by the ship feature recognition model after training is finished; and calculating and matching the coordinate data and the identification data to obtain final ship monitoring data.
Optionally, in a possible implementation manner of the first aspect, the acquired sea surface environment image of the ship needs to be preprocessed to form a sample set; the preprocessing comprises graying, geometric transformation and image enhancement processing.
Optionally, in a possible implementation manner of the first aspect, the edge feature extraction strategy includes: performing gradient calculation on each pixel point in the sea surface environment image where the ship is located to obtain the gradient intensity value corresponding to the pixel point; if the gradient intensity of the current pixel point is greater than the gradient intensity of its neighboring pixel points along the positive and negative gradient directions, the current pixel point is an edge pixel point; if the gradient intensity of an edge pixel point is greater than the high edge threshold, the edge pixel point is a strong edge pixel point; if the gradient intensity of an edge pixel point is greater than the low edge threshold and less than the high edge threshold, the edge pixel point is a weak edge pixel point; extracting the neighborhood pixel points around each edge pixel point, and if a strong edge pixel point exists among the neighborhood pixel points, retaining the extracted weak edge pixel point; and connecting the retained strong edge pixel points and weak edge pixel points to form the sea surface ship edge features.
Optionally, in a possible implementation manner of the first aspect, constructing the ship feature recognition model includes building a YOLO algorithm framework on the deep learning neural network structure layers and forming an objective function, that is, the ship feature recognition model, as follows:
[The objective function of the ship feature recognition model is given in the source only as an equation image and is not reproduced here.]
wherein the confidence is the feature recognition trust value, i.e. the confidence that the identified grid cell contains the target object together with the labeled IOU information; if the target object is in the grid cell, the feature recognition trust value is 1, otherwise it is 0; D_r(object) is the confidence and IOU is the intersection-over-union ratio; if the threshold condition (given in the source only as an equation image) is satisfied, the recognition result is correct.
In a first aspect of the embodiments of the present invention, there is provided a ship identification monitoring method; optionally, in a possible implementation manner of the first aspect, constructing the positioning recognition model includes:
[The objective function and constraint equations of the positioning recognition model are given in the source only as equation images; the one constraint reproduced in the text is:]
Z_xy ≤ S_xy, x = 1, 2, …, X, A_nx ∈ {0, 1}
wherein X is the number of edge pixel points in the sea surface environment image where the ship is located, Y is the number of monitoring time periods for the sea surface environment image of the ship, only one edge pixel point is selected for positioning detection for each monitoring request, S_xy is the edge pixel point detection supply matrix, A_nx is the edge pixel point detection demand matrix, B_ny is the edge pixel point selection matrix, Z_xy is the edge pixel point anomaly detection matrix, C_xy is the edge pixel point confidence matrix, t is the monitoring time interval, S is the positioning constraint parameter, and τ is the objective function for identifying and detecting the edge pixel points in the sea surface environment image where the ship is located; the remaining symbol definitions and equations are given in the source only as equation images.
Optionally, in a possible implementation manner of the first aspect, the monitoring result includes: if the value of the edge pixel point anomaly detection matrix Z_xy is greater than that of the edge pixel point detection supply matrix S_xy, the monitored ship needs to be overhauled and maintained.
A second aspect of the embodiments of the present invention provides a ship identification and monitoring system; optionally, in a possible implementation manner of the second aspect, the system includes an acquisition module, an image recognition module, a ship positioning module, and a monitoring management and control module; the acquisition module includes a camera and a vision sensor arranged around the hull of the ship to be monitored, the camera is used for capturing images of the surrounding sea surface environment, and the vision sensor, connected to the camera, is used for capturing image information of the ship to be monitored and changes in the surrounding sea area environment while the ship is underway.
Optionally, in a possible implementation manner of the second aspect, the image recognition module is connected to the acquisition module through a protocol stack, and the image recognition module includes an image decoder and a recognition unit; the ship positioning module and the image recognition module are connected to the acquisition module in parallel, the ship positioning module comprises a positioning sensor, and a GPS positioning algorithm and a grid node positioning technology are carried in the positioning sensor to provide three-dimensional coordinates of a ship to be monitored in real time.
Optionally, in a possible implementation manner of the second aspect, the monitoring management and control module is connected to the image recognition module and the ship positioning module and includes a computing processing center and a control unit; the computing processing center is configured to process the data information obtained by each module in batches and present it in numerical form, and the control unit is configured to read the computation results of the computing processing center and control the state of the monitored ship in real time.
A second aspect of the embodiments of the present invention provides a ship identification and monitoring system; optionally, in a possible implementation manner of the second aspect, the system further includes a data transmission module; the data transmission module is arranged in parallel with and connected to the acquisition module, the image recognition module, the ship positioning module and the monitoring management and control module; it comprises a 5G data transmission unit and a transceiver unit, and is used for providing data transmission services for each module and establishing an information transmission channel.
According to the invention, data information is efficiently acquired through 5G data transmission technology, and the position and state information of the monitored ship is accurately obtained by combining target detection with a binary optimized positioning technology, which improves the accuracy and efficiency of ship identification and positioning; real-time monitoring is carried out through ground platform management and control, improving the safety maintenance and application universality of marine ships.
Drawings
Fig. 1 is a schematic flow chart of a ship identification and monitoring method according to a first embodiment of the present invention;
FIG. 2 is a schematic block diagram of a ship identification and monitoring system according to a second embodiment of the present invention;
fig. 3 is a schematic diagram of a network topology of a ship identification and monitoring system according to a second embodiment of the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present invention clearer, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
The terms "first," "second," "third," "fourth," and the like in the description and in the claims, as well as in the drawings, if any, are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used is interchangeable under appropriate circumstances such that the embodiments of the invention described herein are capable of operation in sequences other than those illustrated or described herein.
It should be understood that, in various embodiments of the present invention, the sequence numbers of the processes do not mean the execution sequence, and the execution sequence of the processes should be determined by the functions and the internal logic of the processes, and should not constitute any limitation on the implementation process of the embodiments of the present invention.
It should be understood that in the present application, "comprising" and "having" and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed, but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
It should be understood that, in the present invention, "a plurality" means two or more. "And/or" merely describes an association between associated objects, indicating that three relationships may exist; for example, "A and/or B" may mean: A exists alone, A and B exist simultaneously, or B exists alone. The character "/" generally indicates that the associated objects before and after it are in an "or" relationship. "Comprises A, B and C" and "comprises A, B, C" mean that all three of A, B and C are comprised; "comprises A, B or C" means that one of A, B and C is comprised; "comprises A, B and/or C" means that any one, any two, or all three of A, B and C are comprised.
It should be understood that, in the present invention, "B corresponding to A", "A corresponds to B", or "B corresponds to A" means that B is associated with A, and B can be determined from A. Determining B from A does not mean determining B from A alone; B may be determined from A and/or other information. The matching of A and B means that the similarity between A and B is greater than or equal to a preset threshold.
As used herein, "if" may be interpreted as "at the time of", "when", "in response to determining", or "in response to detecting", depending on the context.
The technical solution of the present invention will be described in detail below with specific examples. The following several specific embodiments may be combined with each other, and details of the same or similar concepts or processes may not be repeated in some embodiments.
Example 1
Referring to fig. 1, a first embodiment of the present invention provides a ship identification and monitoring method, which specifically includes:
s1: and acquiring the sea surface environment image of the ship by using a 5G data transmission technology. Wherein, it is required to be noted that:
preprocessing the acquired sea surface environment image of the ship to form a sample set;
the preprocessing includes graying, geometric transformation, and image enhancement processing.
S2: detecting the sea surface environment image where the ship is located based on the edge feature extraction strategy and extracting the sea surface ship edge features. It should be noted in this step that the edge feature extraction strategy includes:
performing gradient calculation on each pixel point in the sea surface environment image of the ship to obtain a gradient intensity value corresponding to the pixel point;
if the gradient intensity of the current pixel point is greater than the gradient intensity of its neighboring pixel points along the positive and negative gradient directions, the current pixel point is an edge pixel point;
if the gradient strength of the edge pixel point is greater than the high edge threshold value, the edge pixel point is a strong edge pixel point;
if the gradient strength of the edge pixel point is greater than the low edge threshold value and less than the high edge threshold value, the edge pixel point is a weak edge pixel point;
extracting the neighborhood pixel points around each edge pixel point, and if a strong edge pixel point exists among the neighborhood pixel points, retaining the extracted weak edge pixel point;
and connecting all retained strong edge pixel points and weak edge pixel points to form the sea surface ship edge features.
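The strategy above (per-pixel gradient intensity, high/low thresholds, and keeping a weak edge only when a strong edge lies in its neighborhood) follows the classic Canny-style hysteresis scheme. A simplified sketch, assuming OpenCV/NumPy and illustrative threshold values, and omitting the non-maximum suppression step for brevity:

```python
import cv2
import numpy as np

def extract_ship_edges(gray, low_thr=50.0, high_thr=150.0):
    """Gradient intensity + double threshold + neighborhood-based hysteresis."""
    # Gradient intensity of every pixel (Sobel derivatives)
    gx = cv2.Sobel(gray, cv2.CV_64F, 1, 0, ksize=3)
    gy = cv2.Sobel(gray, cv2.CV_64F, 0, 1, ksize=3)
    magnitude = np.hypot(gx, gy)

    strong = magnitude > high_thr             # strong edge pixels
    weak = (magnitude > low_thr) & ~strong    # weak edge pixels

    # Keep a weak pixel only if a strong pixel exists in its 8-neighborhood
    has_strong_neighbor = cv2.dilate(strong.astype(np.uint8), np.ones((3, 3), np.uint8)) > 0
    edges = strong | (weak & has_strong_neighbor)
    return edges.astype(np.uint8) * 255

# Note: cv2.Canny(gray, low_thr, high_thr) performs the full pipeline, including
# the non-maximum suppression along the gradient direction omitted above.
```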
S3: combining a deep learning neural network strategy with the YOLO algorithm, constructing a ship feature recognition model, and inputting the acquired sea surface ship edge features for training. It should be further noted that building the ship feature recognition model includes:
building a YOLO algorithm framework on the deep learning neural network structure layers and forming an objective function, that is, the ship feature recognition model, as follows:
[The objective function of the ship feature recognition model is given in the source only as an equation image and is not reproduced here.]
wherein the confidence is the feature recognition trust value, i.e. the confidence that the identified grid cell contains the target object together with the labeled IOU information; if the target object is in the grid cell, the feature recognition trust value is 1, otherwise it is 0; D_r(object) is the confidence and IOU is the intersection-over-union ratio; if the threshold condition (given in the source only as an equation image) is satisfied, the recognition result is correct.
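Assuming the objective follows the standard YOLO confidence definition (the trust value D_r(object) multiplied by the IOU between the predicted and labeled boxes), a small sketch of the computation; the 0.5 acceptance threshold and the corner-coordinate box format are illustrative assumptions, not values given in the patent.

```python
def iou(box_a, box_b):
    """Intersection-over-union of two (x1, y1, x2, y2) boxes."""
    ix1, iy1 = max(box_a[0], box_b[0]), max(box_a[1], box_b[1])
    ix2, iy2 = min(box_a[2], box_b[2]), min(box_a[3], box_b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    union = area_a + area_b - inter
    return inter / union if union > 0 else 0.0

def confidence(cell_contains_object, pred_box, truth_box):
    """Confidence = D_r(object) * IOU, with D_r(object) equal to 1 or 0."""
    d_r = 1.0 if cell_contains_object else 0.0
    return d_r * iou(pred_box, truth_box)

# Illustrative acceptance rule: treat the recognition as correct above a threshold
result_ok = confidence(True, (10, 10, 60, 50), (12, 8, 58, 52)) > 0.5
```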
Further, the training specifically comprises:
(1) inputting initialized pre-training data, and training the RPN network independently;
(2) taking the output candidate area of the RPN as the input of a detection recognition network, and independently training a Fast-RCNN network;
(3) training the RPN again, fixing the parameters of the shared part of the network, and only updating the parameters of the part unique to the RPN;
(4) fine-tuning the Fast-RCNN network using the training result of the RPN network, fixing the parameters of the shared part of the network, and only updating the parameters of the part unique to the Fast-RCNN network.
Still further, it should be explained in this embodiment that:
the larger the ground-truth IOU value, the higher the confidence of the detected target object, i.e. the larger the confidence value;
and the larger the confidence value, the more accurate the recognition result.
S4: the coordinate data of the ship to be monitored are transmitted by using the positioning sensor, the positioning recognition model is built based on a binary strategy, and the recognition data output by the ship feature recognition model after training is input. It should be further noted that, the step of constructing the location identification model includes:
[The objective function and constraint equations of the positioning recognition model are given in the source only as equation images; the one constraint reproduced in the text is:]
Z_xy ≤ S_xy, x = 1, 2, …, X, A_nx ∈ {0, 1}
wherein X is the number of edge pixel points in the sea surface environment image where the ship is located, Y is the number of monitoring time periods for the sea surface environment image of the ship, only one edge pixel point is selected for positioning detection for each monitoring request, S_xy is the edge pixel point detection supply matrix, A_nx is the edge pixel point detection demand matrix, B_ny is the edge pixel point selection matrix, Z_xy is the edge pixel point anomaly detection matrix, C_xy is the edge pixel point confidence matrix, t is the monitoring time interval, S is the positioning constraint parameter, and τ is the objective function for identifying and detecting the edge pixel points in the sea surface environment image where the ship is located; the remaining symbol definitions and equations are given in the source only as equation images.
preferably, the step is further detailed as follows:
(1) acquiring coordinate data by using a GPS positioning technology, and performing binary conversion calculation by combining identification data of image target detection;
(2) randomly selecting monitoring data captured within a certain period and performing image coordinate matching calculation; if the edge pixel point anomaly detection matrix Z_xy becomes larger within the monitoring period, the grid map into which the ship image is divided for that period is immediately retrieved, and the image region where the smaller confidence value appears in the image target detection and recognition is determined;
(3) if the image region where the smaller confidence value is located corresponds to the actual position at which the ship image was captured, the matching is successful; otherwise the record is deleted, the whole image is traversed, and the calculation is carried out again until the traversal is finished, as sketched below.
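A heavily simplified sketch of the matching idea in steps (1) to (3), assuming the detection result is available as a grid of per-cell confidence values and the GPS fix has already been projected onto the same grid; the anomaly test and grid values are illustrative stand-ins for the Z_xy and S_xy quantities, not the patent's actual binary model.

```python
import numpy as np

def match_position(confidence_grid, gps_cell, anomaly, supply):
    """Match the GPS-derived grid cell against the low-confidence region of the image.

    confidence_grid : 2-D array of per-cell confidence values from target detection
    gps_cell        : (row, col) cell obtained by projecting the GPS fix onto the grid
    anomaly, supply : stand-ins for the Z_xy and S_xy values of this cell
    """
    if anomaly <= supply:
        return True  # no anomaly in this period; nothing to re-check

    # Locate the image region where the smaller confidence value appears
    low_conf_cell = np.unravel_index(np.argmin(confidence_grid), confidence_grid.shape)

    # Matching succeeds if the low-confidence region coincides with the GPS position
    return low_conf_cell == tuple(gps_cell)

grid = np.array([[0.9, 0.8, 0.7],
                 [0.6, 0.2, 0.8],
                 [0.9, 0.9, 0.9]])
print(match_position(grid, gps_cell=(1, 1), anomaly=3.0, supply=1.0))  # True
```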
S5: calculating and matching the coordinate data and the recognition data to obtain the final ship monitoring data. It should be noted that the monitoring result includes:
if the value of the edge pixel point anomaly detection matrix Z_xy is greater than that of the edge pixel point detection supply matrix S_xy, the monitored ship needs to be overhauled and maintained.
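As a small illustration of this decision rule, assuming Z_xy and S_xy are available as NumPy arrays of the same shape (the values below are made up):

```python
import numpy as np

# Hypothetical anomaly-detection matrix Z and detection-supply matrix S
Z = np.array([[0, 2], [1, 0]])
S = np.array([[1, 1], [1, 1]])

# Flag maintenance wherever the anomaly value exceeds the supply value
needs_maintenance = bool((Z > S).any())
print("overhaul and maintenance required" if needs_maintenance else "normal")
```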
It should also be understood that most existing ship identification methods perform recognition through simple feature extraction and work only when the ship is sailing fairly steadily and the network signal is relatively unobstructed; when facing heavy storms at sea or poor network communication, they cannot provide accurate monitoring for ground personnel. Traditional ships likewise cannot achieve real-time monitoring, because ship monitoring technology is not yet widely mature; a method capable of real-time ship identification and monitoring is therefore urgently needed.
Preferably, this embodiment uses 5G data transmission technology to collect data information efficiently and transmits data through radio and base stations, so that even if the waves are very large and network communication is poor, data can still be transmitted smoothly in real time, thereby avoiding the above problem. The method of the invention carries the YOLO algorithm in a deep learning neural network for parameter training, which further improves the parameter precision of image edge feature extraction, i.e. increases the accuracy of image recognition. The positioning coordinate matching algorithm provided by the invention combines and matches the positioning data with the image recognition data, further verifying and improving the accuracy of ship monitoring on the basis of the preliminarily determined recognition result, so that the ground platform monitoring system can monitor ships in real time with high precision and high efficiency.
In order to better verify and explain the technical effects of the method of the present invention, this embodiment selects the traditional convolutional-neural-network-based ship detection method, the feature-extraction-based ship detection method, and the binarization-based ship detection method for comparison tests, and compares the test results by means of scientific demonstration to verify the real effects of the method of the present invention.
The traditional convolutional-neural-network-based ship detection method, the feature-extraction-based ship detection method, and the binarization-based ship detection method cannot accurately monitor ships in real time; they can only identify ships sailing steadily in normal weather and are therefore highly limited.
Test environment: a ship is run on a simulation platform to simulate sailing as well as severe weather and storms, with the Zhoushan sea area environment as the test sample. The three traditional methods (traditional method 1 is the convolutional-neural-network-based ship detection method, traditional method 2 is the feature-extraction-based ship detection method, and traditional method 3 is the binarization-based ship detection method) and the program of the present method are run respectively, the automatic test equipment is started, and the ship detection test is carried out using MATLAB to obtain test result data. One hundred groups of data are tested for each method, the time and error value of each group are calculated, and the data transmission condition of each method during the test period is observed; the results are shown in the table below.
Table 1: efficiency, accuracy comparison data sheet.
[Table 1 is provided in the source only as an image and is not reproduced here.]
Referring to Table 1, the comparison of the average time and average error over the hundred groups of data tested by each method can be seen intuitively: compared with the three traditional methods, the method of the present invention has a shorter computation time under severe weather and a lower error value, i.e. higher accuracy.
In order to further verify that the method of the present invention achieves the beneficial effect of real-time monitoring, the communication state of each of the four methods was also observed and recorded during the test, as shown in the following table:
table 2: and (5) communication observation record table.
Method | Test early stage | Test middle stage | Test late stage
Conventional method 1 | Transmission normal | Transmission unstable | Transmission interrupted
Conventional method 2 | Transmission normal | Transmission unstable | Transmission interrupted
Conventional method 3 | Transmission normal | Transmission interrupted | /
Method of the invention | Transmission normal | Transmission normal | Transmission normal
Referring to Table 2, the test is divided in this embodiment into an early stage, a middle stage and a late stage; for example, if the test duration is 1 hour, the first 20 minutes are the early stage, the second 20 minutes the middle stage, and the last 20 minutes the late stage. As Table 2 shows, none of the three conventional methods can provide reliable real-time monitoring in a severe environment, and only the method of the present invention transmits data normally throughout.
Example 2
Referring to fig. 2 and 3, a second embodiment of the present invention, which is different from the first embodiment, provides a ship identification and monitoring system, specifically comprising:
the system comprises an acquisition module 100, an image recognition module 200, a ship positioning module 300, a monitoring management and control module 400 and a data transmission module 500.
The acquisition module 100 comprises a camera 101 and a vision sensor 102 arranged around the hull of the ship to be monitored; the camera 101 is used for capturing images of the surrounding sea surface environment, and the vision sensor 102, connected to the camera 101, is used for capturing image information of the ship to be monitored and changes in the surrounding sea area environment while the ship is underway.
The image recognition module 200 is connected to the acquisition module 100 through a protocol stack and comprises an image decoder 201 and a recognition unit 202; the ship positioning module 300 and the image recognition module 200 are connected in parallel to the acquisition module 100; the ship positioning module 300 comprises a positioning sensor 301, which carries a GPS positioning algorithm and a grid node positioning technology to provide the three-dimensional coordinates of the ship to be monitored in real time.
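As an illustration of how a GPS fix from the positioning sensor could be mapped onto a grid of cells for the matching step of Example 1, here is a small sketch; the reference coordinates, grid resolution and grid size are hypothetical values, not taken from the patent.

```python
ORIGIN_LAT, ORIGIN_LON = 29.90, 122.00   # placeholder south-west corner of the monitored area
CELL_DEG = 0.01                          # placeholder grid resolution in degrees
GRID_ROWS, GRID_COLS = 100, 100          # placeholder grid size

def gps_to_grid_cell(lat, lon):
    """Map a (lat, lon) fix from the positioning sensor to a (row, col) grid cell."""
    row = int((lat - ORIGIN_LAT) / CELL_DEG)
    col = int((lon - ORIGIN_LON) / CELL_DEG)
    if not (0 <= row < GRID_ROWS and 0 <= col < GRID_COLS):
        raise ValueError("fix lies outside the monitored grid")
    return row, col

print(gps_to_grid_cell(30.253, 122.304))  # -> (35, 30)
```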
The monitoring management and control module 400 is connected to the image recognition module 200 and the ship positioning module 300, respectively; it comprises a computing processing center 401 and a control unit 402, where the computing processing center 401 is used to process the data information obtained by each module in batches and present it in numerical form, and the control unit 402 is used to read the computation results of the computing processing center 401 and control the state of the monitored ship in real time.
The data transmission module 500 is arranged in parallel with and connected to the acquisition module 100, the image recognition module 200, the ship positioning module 300 and the monitoring management and control module 400; it comprises a 5G data transmission unit 501 and a transceiver 502, and is used for providing data transmission services for each module and establishing an information transmission channel.
Preferably, it should also be noted in this embodiment that the 5G data transmission unit 501 is used for long-distance wireless data transmission. It uses a high-performance processor and a wireless terminal, runs a real-time operating terminal as software to support the ground platform control system, connects directly to serial devices and converts the serial data to 5G transmission through the serial port, thereby achieving efficient long-distance wireless data transmission. The computing processing center 401 packs the serial data into TCP data and remotely transmits it to the server through the transceiver 502; the communication through the image recognition module 200 and the ship positioning module 300 is parallel and highly reliable, and accurately provides monitoring information.
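A minimal sketch of the serial-to-TCP forwarding described above, assuming the pyserial package is available; the serial device name, baud rate, server address and frame size are placeholders, not values given in the patent.

```python
import serial   # pyserial
import socket

SERIAL_PORT = "/dev/ttyUSB0"         # placeholder serial device of the 5G unit
SERVER = ("203.0.113.10", 9000)      # placeholder monitoring-server address

def forward_serial_to_tcp():
    """Read raw serial data and pack it into a TCP stream towards the server."""
    with serial.Serial(SERIAL_PORT, baudrate=115200, timeout=1) as link, \
         socket.create_connection(SERVER) as sock:
        while True:
            chunk = link.read(1024)   # raw bytes from the on-board modules
            if chunk:
                sock.sendall(chunk)   # remote transmission to the computing centre

if __name__ == "__main__":
    forward_serial_to_tcp()
```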
The present embodiment also provides a readable storage medium, in which a computer program is stored, and the computer program is used for implementing the methods provided by the various embodiments described above when being executed by a processor.
Wherein a readable storage medium may be a computer storage medium or a communication medium, including any medium that facilitates transfer of a computer program from one place to another, and which may be any available medium that can be accessed by a general purpose or special purpose computer; for example, a readable storage medium is coupled to the processor such that the processor can read information from, and write information to, the readable storage medium.
Of course, the readable storage medium may be a part of the processor; the processor and the readable storage medium may be located in an Application Specific Integrated Circuit (ASIC), which may be located in the user equipment; alternatively, the processor and the readable storage medium may be present in the communication device as discrete components. The readable storage medium may be a read-only memory (ROM), a random access memory (RAM), a CD-ROM, a magnetic tape, a floppy disk, an optical data storage device, and the like.
The present invention also provides a program product comprising executable instructions stored in a readable storage medium, the executable instructions being readable from the readable storage medium by at least one processor of a device, execution of the executable instructions by the at least one processor causing the device to implement the methods provided by the various embodiments described above.
In the above embodiments of the apparatus, it should be understood that the processor may be a central processing unit (CPU), another general-purpose processor, a digital signal processor (DSP), an application specific integrated circuit (ASIC), a microprocessor, or any conventional processor, and the steps of the method disclosed in the present invention may be directly embodied as being executed and completed by a hardware processor, or executed and completed by a combination of hardware and software modules in the processor.
Finally, it should be noted that: the above embodiments are only used to illustrate the technical solution of the present invention, and not to limit the same; while the invention has been described in detail and with reference to the foregoing embodiments, it will be understood by those skilled in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some or all of the technical features may be equivalently replaced; and the modifications or the substitutions do not make the essence of the corresponding technical solutions depart from the scope of the technical solutions of the embodiments of the present invention.

Claims (10)

1. A method of vessel identification monitoring, comprising:
acquiring a sea surface environment image of a ship by using a 5G data transmission technology;
detecting a sea surface environment image where the ship is located based on an edge feature extraction strategy and extracting sea surface ship edge features;
combining a deep learning neural network strategy with a YOLO algorithm, constructing a ship feature recognition model, and inputting the obtained edge features of the sea surface ship for training;
transmitting coordinate data of a ship to be monitored by using a positioning sensor, constructing a positioning recognition model based on a binary strategy, and inputting recognition data output by the ship feature recognition model after training is finished;
and calculating and matching the coordinate data and the identification data to obtain final ship monitoring data.
2. The vessel identification monitoring method of claim 1, wherein: preprocessing the acquired sea surface environment image of the ship to form a sample set;
the preprocessing comprises graying, geometric transformation and image enhancement processing.
3. The vessel identification monitoring method according to claim 1 or 2, characterized in that: the edge feature extraction strategy comprises the following steps,
performing gradient calculation on each pixel point in the sea surface environment image of the ship to obtain a gradient intensity value corresponding to the pixel point;
if the gradient intensity of the current pixel point is greater than the gradient intensity of its neighboring pixel points along the positive and negative gradient directions, the current pixel point is an edge pixel point;
if the gradient strength of the edge pixel point is greater than the high edge threshold value, the edge pixel point is a strong edge pixel point;
if the gradient strength of the edge pixel point is greater than a low edge threshold value and less than a high edge threshold value, the edge pixel point is a weak edge pixel point;
extracting the neighborhood pixel points around each edge pixel point, and if a strong edge pixel point exists among the neighborhood pixel points, retaining the extracted weak edge pixel point;
and connecting the reserved strong edge pixel points and the reserved weak edge pixel points to form the edge feature of the sea surface ship.
4. The vessel identification monitoring method of claim 3, wherein: constructing the vessel feature recognition model includes,
building a YOLO algorithm framework in the deep learning neural network structure layer for operation to form an objective function, namely the ship feature recognition model, as follows,
[The objective function of the ship feature recognition model is given in the source only as an equation image and is not reproduced here.]
wherein the confidence is the feature recognition trust value, i.e. the confidence that the identified grid cell contains the target object together with the labeled IOU information; if the target object is in the grid cell, the feature recognition trust value is 1, otherwise it is 0; D_r(object) is the confidence and IOU is the intersection-over-union ratio; if the threshold condition (given in the source only as an equation image) is satisfied, the recognition result is correct.
5. The vessel identification monitoring method of claim 4, wherein: constructing the location-identification model includes,
[The objective function and constraint equations of the positioning recognition model are given in the source only as equation images; the one constraint reproduced in the text is:]
Z_xy ≤ S_xy, x = 1, 2, …, X, A_nx ∈ {0, 1}
wherein X is the number of edge pixel points in the sea surface environment image where the ship is located, Y is the number of monitoring time periods for the sea surface environment image of the ship, only one edge pixel point is selected for positioning detection for each monitoring request, S_xy is the edge pixel point detection supply matrix, A_nx is the edge pixel point detection demand matrix, B_ny is the edge pixel point selection matrix, Z_xy is the edge pixel point anomaly detection matrix, C_xy is the edge pixel point confidence matrix, t is the monitoring time interval, S is the positioning constraint parameter, and τ is the objective function for identifying and detecting the edge pixel points in the sea surface environment image where the ship is located; the remaining symbol definitions and equations are given in the source only as equation images.
6. The vessel identification monitoring method of claim 5, wherein: the monitoring result includes,
if the value of the edge pixel point anomaly detection matrix Z_xy is greater than that of the edge pixel point detection supply matrix S_xy, the monitored ship needs to be overhauled and maintained.
7. A ship identification monitoring system applied to the ship identification monitoring method according to claim 6, characterized in that: the system comprises an acquisition module (100), an image recognition module (200), a ship positioning module (300) and a monitoring management and control module (400);
the acquisition module (100) comprises a camera (101) and a visual sensor (102), wherein the camera (101) and the visual sensor (102) are arranged around a ship body of a ship to be monitored, the camera (101) is used for shooting a picture of the surrounding environment of the sea surface, the visual sensor (102) is connected with the camera (101), and the visual sensor is used for capturing the image information of the ship to be monitored and the surrounding sea area environment change in the driving process.
8. The vessel identification monitoring system of claim 7, wherein: the image recognition module (200) is connected with the acquisition module (100) through a protocol stack, and the image recognition module (200) comprises an image decoder (201) and a recognition unit (202);
the ship positioning module (300) and the image recognition module (200) are connected in parallel to the acquisition module (100), the ship positioning module (300) comprises a positioning sensor (301), a GPS positioning algorithm and a grid node positioning technology are carried in the positioning sensor (301), and three-dimensional coordinates of a ship to be monitored are provided in real time.
9. The vessel identification monitoring system of claim 8, wherein: the monitoring and control module (400) is respectively connected with the image recognition module (200) and the ship positioning module (300), the monitoring and control module (400) comprises a calculation processing center (401) and a control unit (402), the calculation processing center (401) is used for processing data information obtained by each module in batches and finally embodying the data information in a numerical value form, and the control unit (402) is used for reading a calculation result of the calculation processing center (401) and controlling the state of a monitoring ship in real time.
10. The vessel identification monitoring system of claim 9, wherein: further comprising a data transmission module (500);
the data transmission module (500) is arranged in parallel and connected with the acquisition module (100), the image recognition module (200), the ship positioning module (300) and the monitoring management and control module (400), the data transmission module comprises 5G data transmission (501) and transceiving (502), and the data transmission module (500) is used for providing data transmission service for each module and building an information transmission channel.
CN202110660132.7A 2021-06-15 2021-06-15 Ship identification monitoring method and system Pending CN113609891A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110660132.7A CN113609891A (en) 2021-06-15 2021-06-15 Ship identification monitoring method and system


Publications (1)

Publication Number Publication Date
CN113609891A true CN113609891A (en) 2021-11-05

Family

ID=78336508

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110660132.7A Pending CN113609891A (en) 2021-06-15 2021-06-15 Ship identification monitoring method and system

Country Status (1)

Country Link
CN (1) CN113609891A (en)

Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN205581646U (en) * 2016-03-24 2016-09-14 天津中翔腾航科技股份有限公司 Unmanned aerial vehicle electric power inspection image acquisition and processing system
CN109902618A (en) * 2019-02-26 2019-06-18 青岛海之声科技有限公司 A kind of sea ship recognition methods and device
CN109919072A (en) * 2019-02-28 2019-06-21 桂林电子科技大学 Fine vehicle type recognition and flow statistics method based on deep learning and trajectory tracking
CN110728650A (en) * 2019-08-27 2020-01-24 深圳大学 Well lid depression detection method based on intelligent terminal and related equipment
CN110992307A (en) * 2019-11-04 2020-04-10 华北电力大学(保定) Insulator positioning and identifying method and device based on YOLO
CN111414807A (en) * 2020-02-28 2020-07-14 浙江树人学院(浙江树人大学) Tidal water identification and crisis early warning method based on YO L O technology
CN111537841A (en) * 2020-06-30 2020-08-14 上海交通大学 Optimization method and system suitable for ground fault type identification
CN111695397A (en) * 2019-12-20 2020-09-22 珠海大横琴科技发展有限公司 Ship identification method based on YOLO and electronic equipment
CN112329768A (en) * 2020-10-23 2021-02-05 上善智城(苏州)信息科技有限公司 Improved YOLO-based method for identifying fuel-discharging stop sign of gas station
CN112382085A (en) * 2020-10-20 2021-02-19 华南理工大学 System and method suitable for intelligent vehicle traffic scene understanding and beyond visual range perception
CN112862818A (en) * 2021-03-17 2021-05-28 合肥工业大学 Underground parking lot vehicle positioning method combining inertial sensor and multi-fisheye camera



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination