CN114612802A - System and method for classifying fine granularity of ship target based on MBCNN - Google Patents

System and method for classifying fine granularity of ship target based on MBCNN Download PDF

Info

Publication number
CN114612802A
Authority
CN
China
Prior art keywords
ship
image
fine
network
ship target
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210508800.9A
Other languages
Chinese (zh)
Inventor
胡泽辰
李超
刁博宇
王京
黄智华
郑新千
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Zhejiang Lab
Original Assignee
Zhejiang Lab
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Zhejiang Lab filed Critical Zhejiang Lab
Priority to CN202210508800.9A priority Critical patent/CN114612802A/en
Publication of CN114612802A publication Critical patent/CN114612802A/en
Pending legal-status Critical Current

Links

Images

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/21Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
    • G06F18/213Feature extraction, e.g. by transforming the feature space; Summarisation; Mappings, e.g. subspace methods
    • G06F18/2135Feature extraction, e.g. by transforming the feature space; Summarisation; Mappings, e.g. subspace methods based on approximation criteria, e.g. principal component analysis
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/21Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
    • G06F18/214Generating training patterns; Bootstrap methods, e.g. bagging or boosting
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/24Classification techniques
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/25Fusion techniques
    • G06F18/253Fusion techniques of extracted features
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/04Architecture, e.g. interconnection topology
    • G06N3/045Combinations of networks
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/08Learning methods

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Data Mining & Analysis (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Artificial Intelligence (AREA)
  • General Physics & Mathematics (AREA)
  • Evolutionary Computation (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Evolutionary Biology (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Health & Medical Sciences (AREA)
  • Biomedical Technology (AREA)
  • Biophysics (AREA)
  • Computational Linguistics (AREA)
  • General Health & Medical Sciences (AREA)
  • Molecular Biology (AREA)
  • Computing Systems (AREA)
  • Mathematical Physics (AREA)
  • Software Systems (AREA)
  • Image Analysis (AREA)

Abstract

The invention discloses a system and a method for fine-grained classification of ship targets based on MBCNN. A visible light remote sensing image containing ships is acquired and input into a pre-trained ship target detection model, which outputs the position coordinate information of all ships through network forward reasoning; the visible light remote sensing image is cropped according to the position coordinate information of the ships to obtain ship images; the ship images are preprocessed to obtain normalized ship images; a ship target fine-grained classification model is constructed and pre-trained; the normalized ship images are input into the pre-trained ship target fine-grained classification model, which outputs the class label of each ship through network forward reasoning. The method offers high accuracy and high recognition efficiency, and addresses the computational drawbacks and fine-grained classification problems of the traditional BCNN.

Description

System and method for classifying fine granularity of ship target based on MBCNN
Technical Field
The invention relates to the field of computer image classification, in particular to a system and a method for fine-grained classification of ship targets based on MBCNN (Mixed Bilinear Convolutional Neural Network).
Background
Most existing classification techniques focus on distinguishing general object categories and pay little attention to the subtle differences among objects of the same category. Compared with general image classification, the categories that fine-grained image classification must distinguish are much finer. Marine ship target recognition requires exactly this kind of fine-grained discrimination. In civil applications it can be used to monitor and manage ship operations, improving the efficiency of offshore work and ensuring its safety.
The ICCV 2015 paper "Bilinear CNNs for Fine-grained Visual Recognition" publicly demonstrated the feasibility and advantages of bilinear networks for fine-grained classification. Patent document CN111860068A discloses a fine-grained bird identification method based on a cross-layer simplified bilinear network: the processed bird image data are input into a VGG-16 convolutional neural network to extract feature maps of the bird image, three groups of simplified bilinear feature representations are extracted from the feature maps of different high-level convolutional layers, normalized, concatenated and sent to a softmax classifier, and the whole network is finally optimized with a cross-entropy loss assisted by a pairwise confusion loss. This technical scheme introduces a bilinear network, but shares the same defects as the paper, namely:
(1) although the bilinear network performs very well, it has a fatal defect: the dimensionality of the result of the matrix outer product operation is very large, which produces a huge number of weight parameters and makes the whole training process very slow;
(2) to remedy point (1), PCA dimensionality reduction is usually applied; however, while PCA works for a normal number of samples, the covariance computation becomes slow and difficult when the number of samples is too large;
(3) the prior art generally uses convolutional neural networks of the same type, either a single network or networks of the same type with different convolution depths, and does not consider using two completely different convolutional neural networks as feature extractors to improve classification accuracy.
Meanwhile, ship target data sets suffer from a serious class imbalance problem, i.e., a long-tailed data distribution. During training this causes the classification model to overfit the classes with more training data, while accuracy on the classes with less training data remains low.
Disclosure of Invention
The invention aims to provide a system and a method for fine-grained classification of ship targets based on MBCNN, addressing the defects of existing ship target fine-grained classification techniques, in particular the poor fine-grained recognition and low recognition efficiency caused by the presence of multiple ships in a marine visible light remote sensing image, intra-class differences larger than inter-class differences in the fine-grained domain, blurred image quality, and similar factors.
In order to achieve the technical purpose, the technical scheme of the invention is as follows: the first aspect of the embodiments of the present invention provides a method for classifying fine granularity of a ship target based on MBCNN, where the method specifically includes the following steps:
acquiring ship images of the visible light remote sensing images, inputting the ship images into a pre-trained ship target detection model, and outputting position coordinate information of all ships through network forward reasoning;
cutting the visible light remote sensing ship image according to the position coordinate information of the ship to obtain a ship image;
preprocessing the ship image to obtain a normalized ship image;
constructing a ship target fine-grained classification model and performing pre-training; and inputting the ship image into a pre-trained ship target fine-grained classification model, and outputting a class label of the ship through network forward reasoning.
Further, the ship target detection model adopts the YOLOv3 target detection algorithm.
Further, the process of preprocessing the ship image specifically comprises: on the premise of keeping the ship image undeformed, padding pixels along the short edge of the ship image so that the aspect ratio becomes 1; then scaling the image to a uniform size and normalizing it, so that the image meets the input requirements of the ship target fine-grained classification model.
Further, the ship target fine-grained classification model is a bilinear hybrid network, which specifically comprises a VGG-16 network and a ResNet-18 network serving as feature extraction networks, a pooling function and an activation function; the VGG-16 network is used for accurately detecting and positioning the ship target in the ship image, and the ResNet-18 network is used for extracting fine-grained information of the ship target positioned by the VGG-16 network.
Further, inputting the normalized ship image into a pre-trained ship target fine-grained classification model, and outputting a class label of the ship through network forward reasoning, wherein the class label is specifically as follows: and inputting the normalized ship image into a pre-trained ship target fine-grained classification model, performing network forward reasoning to obtain a fusion feature vector of the ship, determining a ship target fine-grained classification output category according to the fusion feature vector, and acquiring a type label to which the ship specifically belongs according to a label mapping relation according to the ship target fine-grained classification output category.
Further, the process of obtaining the fusion feature vector of the ship specifically comprises: performing projection dimensionality reduction on the features output by the ResNet-18 network by a principal component analysis method, and simultaneously using singular value decomposition to replace covariance calculation to directly obtain feature values; and inputting the normalized ship image into a pre-trained ship target fine-grained classification model, and performing feature fusion on the outputs of the two feature extraction networks of the VGG-16 network and the ResNet-18 network in a way of outer product multiplication to obtain a fusion feature vector of the ship.
Further, the process of determining the fine-grained classification output category of the ship target according to the fusion feature vector specifically comprises the following steps: inputting the fusion feature vector into a classifier, outputting class probability vectors with the same dimensionality as the number of the fine-grained classes of the ship target, selecting the class corresponding to the maximum probability value, and determining the class as the classification output class of the fine-grained classes of the ship target.
A second aspect of the embodiments of the present invention provides a MBCNN-based ship target fine-grained classification system, including:
the ship target detection module is used for detecting and positioning a ship target in the remote sensing image by using a pre-trained target detection network, acquiring coordinate information of the ship target, cutting out a target ship image according to the coordinate information, and inputting the target ship image into the ship target fine-grained classification module;
the ship target fine-granularity classification module is used for preprocessing a target ship image to obtain a normalized ship image; constructing a ship target fine-grained classification model and performing pre-training; and inputting the ship image into a pre-trained ship target fine-grained classification model to obtain a fusion feature vector, and inputting the fusion feature vector into a classifier to obtain a class label of the ship after network forward reasoning.
A third aspect of embodiments of the present invention provides an electronic device, comprising a memory and a processor, the memory coupled to the processor; the memory is used for storing program data, and the processor is used for executing the program data to realize the fine-grained classification method for the MBCNN-based ship targets.
A fourth aspect of the embodiments of the present invention provides a computer-readable storage medium, on which a computer program is stored, which, when executed by a processor, implements the above-described MBCNN-based ship target fine-grained classification method.
The ship target fine-grained classification system and method based on the bilinear network MBCNN have the following advantages that:
(1) after the marine visible light remote sensing image is obtained, the ships are detected and positioned by a target detection algorithm, and the localized ship images are then input into the ship target fine-grained classification algorithm to obtain fine-grained classification results of the ship types; with the ship detection module and the ship type fine-grained classification module, the poor fine-grained classification caused by multiple ships appearing in a video or image, intra-class differences larger than inter-class differences in fine-grained data, blurred image quality and similar factors can be effectively alleviated.
(2) The ship target detection module and the ship target fine-grained classification module are both built on deep neural networks and offer high accuracy and high recognition efficiency; the ships are first detected and positioned by a deep learning target detection algorithm, and the improved fine-grained classification network of the bilinear hybrid network MBCNN then attends to the detailed information of the ships, so that the ship types are identified more accurately; the ship target detection module and the ship target fine-grained classification module are fused to realize end-to-end reasoning on the image, the process contains no hand-crafted feature design, and the efficiency of fine ship type identification is improved.
(3) The deep learning based target detection algorithm positions and detects the ships, which handles the presence of multiple ships in an image and avoids the poor recognition accuracy that multiple ships would otherwise cause; a fine-grained classification algorithm is then used to finely identify the type of the ship target, addressing the problem that intra-class differences exceed inter-class differences in the fine-grained domain; the strong learning ability of deep learning effectively improves recognition of marine ship targets in visible light remote sensing images, and combining target detection with the fine-grained classification algorithm realizes an end-to-end reasoning process and improves recognition efficiency.
(4) The invention adopts the improved fine-grained classification network of the bilinear hybrid network MBCNN, carries out more effective feature extraction and fusion on the ship target through two different extraction functions, pays attention to the detail information of the ship, and effectively solves the problem that the intra-class difference of data in the fine-grained field is larger than the inter-class difference, thereby improving the fine-grained classification precision of the ship target; the invention uses two different extraction functions instead of only one network or the same type of networks with different convolution depths, thereby improving the classification accuracy.
(5) The invention uses PCA to reduce dimension, and solves the problem of dimension explosion after the matrix outer product operation of the bilinear network; and the invention uses SVD to replace covariance calculation to directly obtain the characteristic value in the calculation process of PCA dimension reduction to avoid the problem that the covariance calculation in PCA dimension reduction becomes slow and difficult to calculate when the number of samples is too much.
(6) In the training stage of the MBCNN fine-grained classification network, data diversity is fully considered and data augmentation operations (such as blurring, brightness perturbation and random cropping) are performed to strengthen model robustness, which effectively alleviates the low recognition rate caused by the blurred quality of visible light remote sensing images of offshore targets; an illustrative augmentation sketch follows.
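The following torchvision pipeline is a minimal sketch of such augmentation operations; the specific transforms and parameter values are assumptions chosen for illustration and are not prescribed by the invention.

```python
# Illustrative training-time augmentation of the kind described above
# (blurring, brightness perturbation, random cropping). All parameter
# values here are assumptions, not values fixed by the invention.
from torchvision import transforms

train_augmentation = transforms.Compose([
    transforms.RandomResizedCrop(224, scale=(0.7, 1.0)),        # random cropping
    transforms.ColorJitter(brightness=0.3),                     # brightness perturbation
    transforms.GaussianBlur(kernel_size=5, sigma=(0.1, 2.0)),   # blurring
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],            # ImageNet statistics
                         std=[0.229, 0.224, 0.225]),
])
```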
Drawings
FIG. 1 is a flow chart of the method of the present invention;
FIG. 2 is a block diagram of an MBCNN network;
FIG. 3 is a block diagram of the system of the present invention;
fig. 4 is a schematic structural diagram of a computer device according to an embodiment of the present invention.
Detailed Description
The fine-grained classification system and method for ship targets based on the bilinear network MBCNN according to the present invention are described in detail below with reference to the drawings and specific embodiments of the specification.
As shown in fig. 1, the present invention provides a method for classifying fine granularity of a ship target based on MBCNN, and the method specifically includes the following steps:
(1) Acquiring a ship image of the visible light remote sensing image, inputting it into the pre-trained ship target detection model, and outputting the position coordinate information of all ships through network forward reasoning.
Preferably, the ship target detection model adopts a YOLO-family target detection algorithm, such as YOLOv1, YOLOv2, YOLOv3, Tiny-YOLO, YOLOv4, YOLOv5, YOLObile, YOLOF or YOLOX; the embodiment of the present invention preferably adopts the YOLOv3 target detection algorithm.
(2) And cutting the visible light remote sensing ship image according to the position coordinate information of the ship to obtain the ship image.
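As a non-authoritative sketch of steps (1) and (2), the snippet below assumes a hypothetical detect_ships wrapper around a pre-trained YOLOv3 model that returns bounding boxes as (x1, y1, x2, y2) pixel coordinates; only the cropping by position coordinates follows the description, the detector interface itself is an assumption.

```python
# Sketch of steps (1)-(2): detect ships, then crop each detected ship from the
# visible light remote sensing image. `detect_ships` is a hypothetical wrapper
# around a pre-trained YOLOv3 model, not an interface defined by the invention.
from PIL import Image

def crop_detected_ships(image_path, detect_ships):
    image = Image.open(image_path).convert("RGB")
    boxes = detect_ships(image)                # assumed: [(x1, y1, x2, y2), ...]
    ship_images = []
    for (x1, y1, x2, y2) in boxes:
        ship_images.append(image.crop((x1, y1, x2, y2)))   # cut out one ship
    return ship_images
```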
(3) And preprocessing the ship image to obtain a normalized ship image.
The process of preprocessing the ship image specifically comprises the following steps: on the premise of keeping the ship image not deformed, the length-width ratio of the image is changed into 1 by adding pixels to the short edge of the ship image; and then scaling the image to a uniform size for normalization processing, so that the image meets the input standard of a ship target fine-grained classification model. The formula is as follows:
\[ \hat{X} = \frac{X - \mu}{\sigma} \]

wherein \(X\) represents the matrix of image pixel values before normalization, \(\hat{X}\) represents the matrix of normalized image pixel values, \(\mu\) represents the mean of the ImageNet dataset, and \(\sigma\) represents the standard deviation of the ImageNet dataset.
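A minimal sketch of this preprocessing step is given below; the 224×224 target size and the zero-valued padding are assumptions, while the ImageNet mean and standard deviation follow the formula above.

```python
# Sketch of the preprocessing step: pad the short edge so the aspect ratio
# becomes 1, scale to a uniform size, and normalize with ImageNet statistics.
# The 224x224 target size and zero-valued padding are assumptions.
import torch
from PIL import Image
from torchvision import transforms

IMAGENET_MEAN = [0.485, 0.456, 0.406]
IMAGENET_STD = [0.229, 0.224, 0.225]

def preprocess_ship_image(ship_image: Image.Image, size: int = 224) -> torch.Tensor:
    w, h = ship_image.size
    side = max(w, h)
    square = Image.new("RGB", (side, side))            # pad without deforming the ship
    square.paste(ship_image, ((side - w) // 2, (side - h) // 2))
    pipeline = transforms.Compose([
        transforms.Resize((size, size)),                # scale to a uniform size
        transforms.ToTensor(),
        transforms.Normalize(mean=IMAGENET_MEAN, std=IMAGENET_STD),
    ])
    return pipeline(square)
```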
(4) Constructing a ship target fine-grained classification model and performing pre-training; and inputting the normalized ship image into a pre-trained ship target fine-granularity classification model, and outputting a class label of the ship through network forward reasoning.
The ship target fine-grained classification model is a bilinear hybrid network MBCNN (Mixed-Bilinear-CNN, an improved low-dimensional hybrid bilinear network). As shown in fig. 2, the MBCNN network evolves from the bilinear network BCNN and uses two different CNNs to extract image features. It specifically comprises a VGG-16 network and a ResNet-18 network serving as feature extraction networks, a pooling function and an activation function; the VGG-16 network is used for accurately detecting and positioning the ship target in the ship image, and the ResNet-18 network is used for extracting fine-grained information of the ship target positioned by the VGG-16 network. In the embodiment of the invention, the target ship image first passes through the two classic convolutional networks VGG-16 and ResNet-18 simultaneously; secondly, PCA dimensionality reduction is applied to the last convolutional layer of ResNet-18, so that the channel dimension of this final convolutional layer becomes k, which is smaller than 512; then, the matrix outer product of the features of the last convolutional layer of VGG-16 and the PCA-reduced ResNet-18 convolutional features yields a fusion feature vector of dimension 512×k; finally, the fusion feature vector is input into a softmax classifier for fine-grained classification.
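The PyTorch sketch below illustrates one possible realization of this forward pass, under stated assumptions: the backbones are taken from torchvision, the PCA projection is a fixed 512×k matrix computed offline (for example by the SVD procedure described in step (4.1.2)), both feature maps share the same 7×7 spatial grid for 224×224 inputs, and k and the class count are placeholders. It is a sketch of the idea, not the inventors' implementation.

```python
# Illustrative MBCNN sketch (not the inventors' exact code). Assumptions:
# torchvision backbones, a fixed offline-computed PCA projection matrix,
# matching spatial grids for both feature maps, k = 64 and 10 ship classes.
import torch
import torch.nn as nn
from torchvision import models

class MBCNN(nn.Module):
    def __init__(self, pca_projection: torch.Tensor, num_classes: int = 10):
        super().__init__()
        self.vgg_features = models.vgg16(weights="IMAGENET1K_V1").features       # (B, 512, 7, 7)
        resnet = models.resnet18(weights="IMAGENET1K_V1")
        self.resnet_features = nn.Sequential(*list(resnet.children())[:-2])       # (B, 512, 7, 7)
        self.register_buffer("pca", pca_projection)                               # (512, k), fixed
        k = pca_projection.shape[1]
        self.classifier = nn.Linear(512 * k, num_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        fa = self.vgg_features(x).flatten(2)                    # (B, 512, H*W)
        fb = self.resnet_features(x).flatten(2)                 # (B, 512, H*W)
        fb = fb.transpose(1, 2) @ self.pca                      # PCA reduction -> (B, H*W, k)
        fused = torch.bmm(fa, fb)                               # outer product + sum pooling -> (B, 512, k)
        return self.classifier(fused.flatten(1))                # logits over fine-grained classes

# softmax is applied by the loss (e.g. nn.CrossEntropyLoss) during training
# and explicitly at inference: probs = torch.softmax(model(batch), dim=-1)
```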
The method specifically comprises the following substeps:
(4.1) acquiring a fusion feature vector;
(4.1.1) constructing a ship target fine-grained classification network by using two networks of VGG-16 and ResNet-18 as a feature extraction network of a bilinear hybrid network; the VGG-16 completes accurate detection and positioning of the ship target in the image, and the other feature extractor ResNet-18 is used for extracting fine-grained information of the ship target positioned by the VGG-16.
The bilinear network model is generally described by a four-tuple equation as shown below:
\[ \mathcal{B} = (f_A, f_B, \mathcal{P}, \mathcal{C}) \]

wherein \(\mathcal{B}\) denotes the bilinear network model, \(f_A\) and \(f_B\) denote the two feature extraction functions (conventionally the same function is used twice, whereas the embodiment of the invention uses two different extraction functions, namely VGG-16 and ResNet-18, with the image feature descriptors extracted by convolutional neural networks), \(\mathcal{P}\) denotes the pooling function, and \(\mathcal{C}\) denotes the final softmax classification function.
(4.1.2) Projection dimensionality reduction is applied to the features output by the ResNet-18 network using Principal Component Analysis (PCA); because the resulting dimensions are completely uncorrelated, the dimensionality reduction accelerates the training process without affecting performance. Meanwhile, in the process of computing this PCA projection of the ResNet-18 features, Singular Value Decomposition (SVD) is used instead of covariance computation to obtain the eigenvalues directly.
The singular value decomposition process can be represented by the following formula
\[ A = U \Sigma V^{T} \]

wherein \(U\) is a matrix of size m×m, \(\Sigma\) is a matrix of size m×n whose elements are all 0 except on the main diagonal, each non-zero element on the main diagonal being called a singular value, and \(V\) is a matrix of size n×n. From the definition of PCA, it suffices to retain the first k columns of the matrix \(V\) to reduce the original data to k dimensions; the specific result can be obtained by the following formula:

\[ A_k = A V_k \]

wherein \(V_k\) consists of the first k columns of \(V\).
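A minimal NumPy sketch of this SVD-based PCA step follows; arranging the ResNet-18 activations as an (n_samples × 512) matrix, with each spatial location treated as one sample, is an assumption about the data layout.

```python
# Sketch: PCA projection obtained via SVD, avoiding an explicit covariance matrix.
# `features` is assumed to be an (n_samples, 512) matrix of ResNet-18 activations.
import numpy as np

def pca_projection_by_svd(features: np.ndarray, k: int) -> np.ndarray:
    centered = features - features.mean(axis=0, keepdims=True)
    # Economy SVD: centered = U @ diag(S) @ Vt; rows of Vt are the principal directions.
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    return vt[:k].T                                   # (512, k): first k columns of V

# Usage: reduced = (features - features.mean(axis=0)) @ pca_projection_by_svd(features, k=64)
```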
(4.1.3) inputting the target ship image preprocessed in the step (3) into the ship target fine-grained classification network built in the step (4.1.1), fusing the output of the two feature extraction networks of the ship target fine-grained classification network by means of outer product multiplication, and outputting a fusion feature vector of the ship. The feature fusion formula is as follows:
For an image \(I\) at position \(l\), the two features \(f_A(l, I)\) and \(f_B(l, I)\) are combined by a matrix outer product to obtain the matrix \(b\):

\[ b(l, I) = f_A(l, I)^{T} f_B(l, I) \]

Sum pooling is then performed over the matrix \(b\) at all positions to obtain the fusion feature matrix \(\Phi(I)\):

\[ \Phi(I) = \sum_{l} b(l, I) \]

The fusion feature matrix \(\Phi(I)\) is vectorized to obtain the fusion feature vector \(\varphi(I)\):

\[ \varphi(I) = \mathrm{vec}(\Phi(I)) \]
(4.2) determining the classification output category of the fine granularity of the ship target according to the obtained fusion feature vector; the method specifically comprises the following steps:
inputting the fusion feature vector into a classifier, outputting class probability vectors with the same dimensionality as the number of the fine-grained classes of the ship target, selecting the class corresponding to the maximum probability value, and determining the class as the classification output class of the fine-grained classes of the ship target.
And (4.3) acquiring a category label to which the ship specifically belongs.
According to the ship target fine-grained classification output category obtained in step (4.2), the specific category label of the ship is acquired through the label mapping relation.
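A hedged sketch of steps (4.2) and (4.3) is shown below; the softmax, maximum-probability selection and label lookup follow the description, while the concrete label table is a hypothetical placeholder.

```python
# Sketch of steps (4.2)-(4.3): class probability vector -> maximum probability
# class -> category label. LABEL_MAP is a hypothetical placeholder mapping.
import torch

LABEL_MAP = {0: "cargo ship", 1: "fishing boat", 2: "warship"}

def predict_label(logits: torch.Tensor) -> str:
    probabilities = torch.softmax(logits, dim=-1)    # one probability per fine-grained class
    class_index = int(probabilities.argmax(dim=-1))  # class with the maximum probability value
    return LABEL_MAP[class_index]                    # label via the label mapping relation
```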
As shown in fig. 3, the present invention further provides a ship target fine-grained classification system based on the bilinear network MBCNN, which includes:
the ship target detection module is used for detecting and positioning a ship target in the remote sensing image by using a pre-trained target detection network, acquiring coordinate information of the ship target, cutting out a target ship image according to the coordinate information, and inputting the target ship image into the ship target fine-grained classification module;
the ship target fine-granularity classification module is used for preprocessing a target ship image to obtain a normalized ship image; constructing a ship target fine-grained classification model and performing pre-training; and inputting the ship image into a pre-trained ship target fine-grained classification model to obtain a fusion feature vector, and inputting the fusion feature vector into a classifier to obtain a class label of the ship after network forward reasoning.
In conclusion, aiming at the problem of fine-grained classification when multiple ships appear in a visible light remote sensing image of offshore targets, the method first detects and positions the ships with a deep learning target detection algorithm and then accurately identifies the detected ship types with the MBCNN fine-grained classification model, thereby improving both the efficiency and the accuracy of fine-grained ship type identification.
Corresponding to the embodiment of the ship target fine-grained classification method based on the MBCNN, the invention also provides an embodiment of a ship target fine-grained classification device based on the MBCNN.
Referring to fig. 4, the apparatus for classifying fine granularity of a ship target based on MBCNN provided in the embodiment of the present invention includes one or more processors, and is configured to implement the method for classifying fine granularity of a ship target based on MBCNN in the embodiment.
The embodiment of the MBCNN-based ship target fine-grained classification device of the present invention can be applied to any device with data processing capability, such as a computer. The device embodiments may be implemented by software, by hardware, or by a combination of hardware and software. Taking software implementation as an example, as a logical device it is formed by the processor of the device with data processing capability reading the corresponding computer program instructions from the non-volatile memory into memory and running them. In terms of hardware, fig. 4 shows a hardware structure diagram of the device with data processing capability on which the MBCNN-based ship target fine-grained classification apparatus is located; in addition to the processor, memory, network interface and non-volatile memory shown in fig. 4, the device may also include other hardware according to its actual functions, which is not described again here.
The implementation process of the functions and actions of each unit in the above device is specifically described in the implementation process of the corresponding step in the above method, and is not described herein again.
For the device embodiments, since they substantially correspond to the method embodiments, reference may be made to the partial description of the method embodiments for relevant points. The above-described embodiments of the apparatus are merely illustrative, and the units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the modules can be selected according to actual needs to achieve the purpose of the scheme of the invention. One of ordinary skill in the art can understand and implement it without inventive effort.
The embodiment of the present invention further provides a computer-readable storage medium, on which a program is stored, and when the program is executed by a processor, the method for classifying the fine granularity of the MBCNN-based ship target in the above embodiments is implemented.
The computer-readable storage medium may be an internal storage unit, such as a hard disk or a memory, of any device with data processing capability described in any of the foregoing embodiments. The computer-readable storage medium may also be an external storage device of the device with data processing capability, such as a plug-in hard disk, a Smart Media Card (SMC), an SD card or a Flash memory card (Flash Card) provided on the device. Further, the computer-readable storage medium may include both an internal storage unit and an external storage device of any device with data processing capability. The computer-readable storage medium is used for storing the computer program and the other programs and data required by the device with data processing capability, and may also be used for temporarily storing data that has been output or is to be output.
Finally, it should be noted that: the above embodiments are only used to illustrate the technical solution of the present invention, and not to limit the same; while the invention has been described in detail and with reference to the foregoing embodiments, it will be understood by those skilled in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some or all of the technical features may be equivalently replaced; and the modifications or the substitutions do not make the essence of the corresponding technical solutions depart from the scope of the technical solutions of the embodiments of the present invention.

Claims (10)

1. A ship target fine-grained classification method based on MBCNN is characterized by comprising the following steps:
acquiring ship images of the visible light remote sensing images, inputting the ship images into a pre-trained ship target detection model, and outputting position coordinate information of all ships through network forward reasoning;
cutting the visible light remote sensing ship image according to the position coordinate information of the ship to obtain a ship image;
preprocessing the ship image to obtain a normalized ship image;
constructing a ship target fine-grained classification model and performing pre-training; and inputting the normalized ship image into a pre-trained ship target fine-granularity classification model, and outputting a class label of the ship through network forward reasoning.
2. The MBCNN-based ship target fine-grained classification method according to claim 1, characterized in that the ship target detection model employs YOLOv3 target detection algorithm.
3. The MBCNN-based ship target fine-grained classification method according to claim 1, wherein the process of preprocessing the ship image specifically comprises: on the premise of keeping the ship image not deformed, the length-width ratio of the image is changed into 1 by adding pixels to the short edge of the ship image; and then scaling the image to a uniform size for normalization processing, so that the image meets the input standard of a ship target fine-grained classification model.
4. The MBCNN-based ship target fine-grained classification method according to claim 1, wherein the ship target fine-grained classification model is a bilinear hybrid network, specifically comprising a VGG-16 network and a ResNet-18 network serving as feature extraction networks, a pooling function and an activation function; the VGG-16 network is used for accurately detecting and positioning the ship target in the ship image, and the ResNet-18 network is used for extracting fine-grained information of the ship target positioned by the VGG-16 network.
5. The MBCNN-based ship target fine-grained classification method according to claim 1, wherein the normalized ship image is input into a pre-trained ship target fine-grained classification model, and through network forward inference, the class label to which the ship belongs is output specifically as follows: inputting the normalized ship image into a pre-trained ship target fine-grained classification model, obtaining a fusion feature vector of the ship through network forward reasoning, determining a ship target fine-grained classification output category according to the fusion feature vector, and obtaining a specific ship category label according to the ship target fine-grained classification output category and a label mapping relation.
6. The MBCNN-based ship target fine-grained classification method according to claim 5, wherein the process of obtaining the fusion feature vector of the ship specifically comprises: performing projection dimensionality reduction on the features output by the ResNet-18 network by a principal component analysis method, and simultaneously using singular value decomposition to replace covariance calculation to directly obtain feature values; and inputting the normalized ship image into a pre-trained ship target fine-grained classification model, and performing feature fusion on the outputs of the two feature extraction networks of the VGG-16 network and the ResNet-18 network in a way of outer product multiplication to obtain a fusion feature vector of the ship.
7. The MBCNN-based ship target fine-grained classification method according to claim 5, wherein the process of determining the ship target fine-grained classification output category according to the fusion feature vector specifically comprises: and inputting the fusion feature vector into a classifier, outputting class probability vectors with the dimension same as the number of the fine-grained classes of the ship target, selecting the class corresponding to the maximum probability value, and determining the class as the classification output class of the fine-grained classes of the ship target.
8. An MBCNN-based ship target fine-grained classification system, characterized by comprising:
the ship target detection module is used for detecting and positioning a ship target in the remote sensing image by using a pre-trained target detection network, acquiring coordinate information of the ship target, cutting out a target ship image according to the coordinate information, and inputting the target ship image into the ship target fine-grained classification module;
the ship target fine-granularity classification module is used for preprocessing a target ship image to obtain a normalized ship image; constructing a ship target fine-grained classification model and performing pre-training; and inputting the ship image into a pre-trained ship target fine-grained classification model to obtain a fusion feature vector, and inputting the fusion feature vector into a classifier to obtain a class label of the ship after network forward reasoning.
9. An electronic device comprising a memory and a processor, wherein the memory is coupled to the processor; wherein the memory is configured to store program data and the processor is configured to execute the program data to implement the MBCNN-based ship target fine-grained classification method of any of claims 1-7.
10. A computer-readable storage medium, on which a computer program is stored, which, when being executed by a processor, implements the MBCNN-based ship target fine-grained classification method according to any one of claims 1 to 7.
CN202210508800.9A 2022-05-11 2022-05-11 System and method for classifying fine granularity of ship target based on MBCNN Pending CN114612802A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210508800.9A CN114612802A (en) 2022-05-11 2022-05-11 System and method for classifying fine granularity of ship target based on MBCNN

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210508800.9A CN114612802A (en) 2022-05-11 2022-05-11 System and method for classifying fine granularity of ship target based on MBCNN

Publications (1)

Publication Number Publication Date
CN114612802A true CN114612802A (en) 2022-06-10

Family

ID=81870523

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210508800.9A Pending CN114612802A (en) 2022-05-11 2022-05-11 System and method for classifying fine granularity of ship target based on MBCNN

Country Status (1)

Country Link
CN (1) CN114612802A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117746193A (en) * 2024-02-21 2024-03-22 之江实验室 Label optimization method and device, storage medium and electronic equipment

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112464792A (en) * 2020-11-25 2021-03-09 北京航空航天大学 Remote sensing image ship target fine-grained classification method based on dynamic convolution
CN113255793A (en) * 2021-06-01 2021-08-13 之江实验室 Fine-grained ship identification method based on contrast learning
US20210264194A1 (en) * 2020-02-24 2021-08-26 Electronics And Telecommunications Research Institute Apparatus and method for identifying warship

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20210264194A1 (en) * 2020-02-24 2021-08-26 Electronics And Telecommunications Research Institute Apparatus and method for identifying warship
CN112464792A (en) * 2020-11-25 2021-03-09 北京航空航天大学 Remote sensing image ship target fine-grained classification method based on dynamic convolution
CN113255793A (en) * 2021-06-01 2021-08-13 之江实验室 Fine-grained ship identification method based on contrast learning

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Zhang Renyu: "Research on Fine-Grained Image Classification Methods Based on Deep Learning", China Master's Theses Full-text Database, Information Science and Technology Series *
Yan Zixu et al.: "Fine-Grained Image Classification Based on YOLOv3 and Bilinear Feature Fusion", Journal of Image and Graphics *

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117746193A (en) * 2024-02-21 2024-03-22 之江实验室 Label optimization method and device, storage medium and electronic equipment
CN117746193B (en) * 2024-02-21 2024-05-10 之江实验室 Label optimization method and device, storage medium and electronic equipment

Similar Documents

Publication Publication Date Title
Qin et al. DeepFish: Accurate underwater live fish recognition with a deep architecture
CN110569782A (en) Target detection method based on deep learning
Hou et al. BSNet: Dynamic hybrid gradient convolution based boundary-sensitive network for remote sensing image segmentation
Wu et al. Typical target detection in satellite images based on convolutional neural networks
Mehrjardi et al. A survey on deep learning-based image forgery detection
CN110852327A (en) Image processing method, image processing device, electronic equipment and storage medium
CN113095333A (en) Unsupervised feature point detection method and unsupervised feature point detection device
CN111199558A (en) Image matching method based on deep learning
Zhao et al. CRAS-YOLO: A novel multi-category vessel detection and classification model based on YOLOv5s algorithm
CN108664968B (en) Unsupervised text positioning method based on text selection model
CN114612802A (en) System and method for classifying fine granularity of ship target based on MBCNN
CN117115632A (en) Underwater target detection method, device, equipment and medium
US11816909B2 (en) Document clusterization using neural networks
CN115984219A (en) Product surface defect detection method and device, electronic equipment and storage medium
Shishkin et al. Implementation of yolov5 for detection and classification of microplastics and microorganisms in marine environment
CN116958615A (en) Picture identification method, device, equipment and medium
Huang et al. Baggage image retrieval with attention-based network for security checks
CN115170854A (en) End-to-end PCANetV 2-based image classification method and system
Zhou et al. LEDet: localization estimation detector with data augmentation for ship detection based on unmanned surface vehicle
CN112733686A (en) Target object identification method and device used in image of cloud federation
Luo Sailboat and kayak detection using deep learning methods
Li et al. Multi-level Pyramid Feature Extraction and Task Decoupling Network for SAR Ship Detection
Alegavi et al. Implementation of deep convolutional neural network for classification of multiscaled and multiangled remote sensing scene
Dai et al. Automatic Identification of Bond Information Based on OCR and NLP.
Zhang et al. Underwater Sea Cucumber Target Detection Based on Edge-Enhanced Scaling YOLOv4

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20220610

RJ01 Rejection of invention patent application after publication