CN110598636A - Ship target identification method based on feature migration

Ship target identification method based on feature migration

Info

Publication number
CN110598636A
CN110598636A (application CN201910866137.8A)
Authority
CN
China
Prior art keywords
hog
domain
target
source domain
target domain
Prior art date
Legal status
Granted
Application number
CN201910866137.8A
Other languages
Chinese (zh)
Other versions
CN110598636B (en)
Inventor
陈浩
郭斌
李宏博
高通
Current Assignee
Harbin Institute of Technology
Original Assignee
Harbin Institute of Technology
Priority date
Filing date
Publication date
Application filed by Harbin Institute of Technology filed Critical Harbin Institute of Technology
Priority to CN201910866137.8A
Publication of CN110598636A
Application granted
Publication of CN110598636B
Legal status: Active

Links

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 Pattern recognition
    • G06F18/20 Analysing
    • G06F18/21 Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
    • G06F18/214 Generating training patterns; Bootstrap methods, e.g. bagging or boosting
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 Pattern recognition
    • G06F18/20 Analysing
    • G06F18/24 Classification techniques
    • G06F18/241 Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches
    • G06F18/2411 Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches based on the proximity to a decision surface, e.g. support vector machines
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/40 Extraction of image or video features
    • G06V10/50 Extraction of image or video features by performing operations within image blocks; by using histograms, e.g. histogram of oriented gradients [HoG]; by summing image-intensity values; Projection analysis
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V2201/00 Indexing scheme relating to image or video recognition or understanding
    • G06V2201/07 Target detection

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • General Physics & Mathematics (AREA)
  • Artificial Intelligence (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Evolutionary Biology (AREA)
  • Evolutionary Computation (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • General Engineering & Computer Science (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Multimedia (AREA)
  • Image Analysis (AREA)

Abstract

A ship target identification method based on feature migration belongs to the field of ship target identification. The method solves the problem that, in the prior art, the target to be recognized differs from the known training target data in appearance and imaging quality, so that the recognition effect on the target to be recognized is poor. The invention extracts HOG features from ship images of different resolutions and, with a transfer learning method based on subspace alignment and probability adaptation, maps the HOG features of the source domain and of the target domain into the same feature space; probability adaptation and instance weight adjustment are then performed in that common space, new source-domain and target-domain vectorized HOG features are regenerated, a support vector machine is trained with the new source-domain vectorized HOG features, and the trained support vector machine is used for target identification of the image to be identified. The method can be applied to the identification of ship targets in remote sensing images.

Description

Ship target identification method based on feature migration
Technical Field
The invention belongs to the field of ship target identification, and particularly relates to a ship target identification method based on feature migration.
Background
For an optical sensor, changes in observation position, altitude, and other conditions cause the resolution of the images obtained of a target to change as well. Because images of the same target acquired at different resolutions tend to follow different distributions, a traditional machine learning method, which generally assumes that the training data and the test data follow the same distribution, handles them poorly: the target to be recognized in an optical remote sensing image often differs from the known training target data in appearance, imaging quality, and other characteristics, so the target cannot be recognized well and the recognition effect on the target to be recognized is poor. How to improve the target recognition rate for images of different resolutions obtained by the same sensor is therefore one of the problems to be solved at present.
Disclosure of Invention
The invention aims to solve the problem that, in existing methods, the recognition effect on the target to be recognized is poor because the target to be recognized differs from the known training target data in appearance and imaging quality, and provides a ship target recognition method based on feature migration for typical remote sensing target recognition.
The technical scheme adopted by the invention for solving the technical problems is as follows: a ship target identification method based on feature migration comprises the following steps:
Step one, selecting high-resolution ship images as training set images, intercepting a target slice from each image in the training set, and taking the high-resolution ship images in the training set as the source domain, thereby obtaining the target slices of the source domain;
Step two, calculating the HOG features of the source-domain target slices obtained in step one, and vectorizing the calculated HOG features to obtain the vectorized HOG features of the source domain;
Step three, for the low-resolution ship image to be identified, intercepting a target slice of the image to be identified, and taking the low-resolution ship image to be identified as the target domain, thereby obtaining the target slices of the target domain;
calculating the HOG features of the target-domain target slices, and vectorizing the calculated HOG features to obtain the vectorized HOG features of the target domain;
Step four, performing standard normalization on the vectorized HOG features of the source domain and of the target domain respectively to obtain the normalized vectorized HOG features of the source domain and of the target domain;
performing PCA transformation on the normalized vectorized HOG features of the source domain and of the target domain respectively to obtain the basis of the source-domain subspace and the basis of the target-domain subspace;
performing subspace alignment on the basis of the source-domain subspace and the basis of the target-domain subspace to generate a new coordinate space; mapping the normalized vectorized HOG features of the source domain and of the target domain into the new coordinate space to obtain the vectorized HOG features of the source domain and the target domain in the new coordinate space;
Step five, performing probability adaptation and instance weight adjustment on the vectorized HOG features of the source domain and the target domain in the new coordinate space, and regenerating new source-domain vectorized HOG features and new target-domain vectorized HOG features;
Step six, inputting the source-domain vectorized HOG features regenerated in step five into a support vector machine for training, and stopping training when the error function of the support vector machine no longer decreases, to obtain a trained support vector machine;
Step seven, inputting the target-domain vectorized HOG features regenerated in step five into the trained support vector machine to obtain the target identification result.
The invention has the following beneficial effects: the invention provides a ship target identification method based on feature migration, which extracts HOG features from ship images of different resolutions, maps the HOG features of the source domain and of the target domain into the same feature space with a transfer learning method based on subspace alignment and probability adaptation, then performs probability adaptation and instance weight adjustment in that common feature space, regenerates new source-domain and target-domain vectorized HOG features, trains a support vector machine with the new source-domain vectorized HOG features, and performs target identification of the image to be identified with the trained support vector machine.
The method of the invention achieves a target recognition accuracy of up to 98% on the image to be recognized, an improvement of at least 3% over existing methods, so the method effectively improves the target recognition effect.
Drawings
FIG. 1 is a flow chart of the feature-migration-based ship target identification method of the present invention;
FIG. 2 is a schematic diagram of a low-resolution destroyer;
FIG. 3 is a schematic diagram of a high-resolution destroyer;
FIG. 4 is a schematic diagram of a low-resolution cruiser;
FIG. 5 is a schematic diagram of a high-resolution cruiser;
FIG. 6 is a schematic diagram of a low-resolution aircraft carrier;
FIG. 7 is a schematic diagram of a high-resolution aircraft carrier;
FIG. 8 compares the recognition accuracy of the method of the present invention with BDA (balanced distribution adaptation), JDA (joint distribution adaptation), SA (subspace alignment), TCA (transfer component analysis), and PCA (principal component analysis) for the destroyer, cruiser, and aircraft carrier at 1 m resolution;
FIG. 9 compares the MMD distance of the method of the present invention with BDA, JDA, SA, TCA, and PCA at 1 m resolution;
FIG. 10 compares the mean recognition accuracy over the three targets for the method of the present invention and for BDA, JDA, SA, TCA, and PCA at 1 m resolution;
that is, the average of the accuracies of the method of the invention on the three targets is computed, and likewise the average of the accuracies of the other five methods on the three targets is computed.
Detailed Description
The first embodiment: as shown in FIG. 1, the ship target identification method based on feature migration comprises the following steps:
Step one, selecting high-resolution ship images as training set images, intercepting a target slice from each image in the training set, and taking the high-resolution ship images in the training set as the source domain, thereby obtaining the target slices of the source domain;
a high-resolution ship image is an image with a resolution of 0.5 m or finer;
Step two, calculating the HOG features of the source-domain target slices obtained in step one, and vectorizing the calculated HOG features to obtain the vectorized HOG features of the source domain (each slice has a corresponding vectorized HOG feature);
Step three, for the low-resolution ship image to be identified, intercepting a target slice of the image to be identified, and taking the low-resolution ship image to be identified as the target domain, thereby obtaining the target slices of the target domain;
calculating the HOG features of the target-domain target slices, and vectorizing the calculated HOG features to obtain the vectorized HOG features of the target domain;
a low-resolution ship image is an image with a resolution of 1 m or coarser;
when extracting HOG features from the target slices of the high-resolution training set images and from the target slices of the low-resolution images to be recognized, the HOG features of a high-resolution target slice are extracted by selecting cell and block sizes suited to the high-resolution target and applying a gradient filter to obtain the gradient map, which finally yields the HOG features of the high-resolution target;
the HOG features of a low-resolution target slice are extracted by selecting cell and block sizes suited to the low-resolution target and applying a gradient filter to obtain the gradient map, which finally yields the HOG features of the low-resolution target.
By selecting appropriate cell and block sizes, the HOG features of the high-resolution target and the HOG features of the low-resolution target are guaranteed to have the same dimension, which improves the accuracy of identifying the targets in the image to be detected; an illustrative sketch follows.
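As an illustration of the HOG extraction described above, the sketch below (in Python, using scikit-image) computes HOG features for a high-resolution and a low-resolution slice with cell sizes chosen so that both feature vectors have the same dimension; the slice sizes (256 and 128 pixels), cell size, block size, and orientation count are illustrative assumptions, not values fixed by the invention.

    # Illustrative sketch only: slice sizes, cell/block sizes and the orientation
    # count are assumptions chosen so that both HOG vectors have equal length.
    import numpy as np
    from skimage.feature import hog

    def hog_feature(slice_img, cell, block=(2, 2), orientations=9):
        """Return the vectorized HOG feature of one target slice."""
        return hog(slice_img,
                   orientations=orientations,
                   pixels_per_cell=cell,
                   cells_per_block=block,
                   feature_vector=True)

    hr_slice = np.random.rand(256, 256)            # stand-in for a high-resolution slice
    lr_slice = np.random.rand(128, 128)            # stand-in for a low-resolution slice
    gamma_s = hog_feature(hr_slice, cell=(16, 16)) # 16 x 16 grid of cells
    gamma_t = hog_feature(lr_slice, cell=(8, 8))   # also a 16 x 16 grid of cells
    assert gamma_s.shape == gamma_t.shape          # same dimension by construction

Because both slices are divided into the same number of cells, the two HOG vectors have identical length, which is the property the method relies on.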
Step four, performing standard normalization on the vectorized HOG features of the source domain and of the target domain respectively to obtain the normalized vectorized HOG features of the source domain and of the target domain;
performing PCA transformation on the normalized vectorized HOG features of the source domain and of the target domain respectively to obtain the basis of the source-domain subspace and the basis of the target-domain subspace;
performing subspace alignment on the basis of the source-domain subspace and the basis of the target-domain subspace to generate a new coordinate space; mapping the normalized vectorized HOG features of the source domain and of the target domain into the new coordinate space to obtain the vectorized HOG features of the source domain and the target domain in the new coordinate space;
Step five, performing probability adaptation and instance weight adjustment on the vectorized HOG features of the source domain and the target domain in the new coordinate space, and regenerating new source-domain vectorized HOG features and new target-domain vectorized HOG features;
Step six, inputting the source-domain vectorized HOG features regenerated in step five into a support vector machine for training, and stopping training when the error function of the support vector machine no longer decreases, to obtain a trained support vector machine;
Step seven, inputting the target-domain vectorized HOG features regenerated in step five into the trained support vector machine to obtain the target identification result.
This embodiment mainly addresses visible-light ship target images acquired by the same optical sensor and improves the ship target recognition rate across different resolutions.
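As a minimal sketch of steps six and seven, assuming the regenerated source-domain features Z_s_new (with labels y_s) and the regenerated target-domain features Z_t_new are already available as NumPy arrays, a support vector machine can be trained and applied as follows; the kernel and C value are illustrative assumptions, and scikit-learn's SVC trains to its own convergence criterion rather than the stopping rule described above.

    # Minimal sketch, assuming Z_s_new, y_s and Z_t_new already exist as arrays.
    from sklearn.svm import SVC

    svm = SVC(kernel='rbf', C=1.0)      # support vector machine classifier
    svm.fit(Z_s_new, y_s)               # train on regenerated source-domain HOG features
    predictions = svm.predict(Z_t_new)  # identify the targets of the images to be recognized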
The second embodiment: this embodiment differs from the first embodiment in that, in step one, the size of the target slice is set according to the size of the targets in the high-resolution ship images.
For the identification of ship targets, the training set images come from a Google Earth data source, the targets in the training set images are divided into aircraft carriers, destroyers, and cruisers, and the target slices intercepted from the training set images are all of the same size; a minimal cropping sketch follows.
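A minimal sketch of the fixed-size slice interception, assuming a 256-pixel slice size and (row, column) target centres; both values are illustrative assumptions rather than values specified by the patent.

    import numpy as np

    def crop_slice(image, centre, size=256):
        """Cut a size x size target slice centred on the target, clamped to the image."""
        half = size // 2
        r = int(np.clip(centre[0], half, image.shape[0] - half))
        c = int(np.clip(centre[1], half, image.shape[1] - half))
        return image[r - half:r + half, c - half:c + half]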
The third embodiment: this embodiment differs from the second embodiment in that the specific process of step four is as follows:
Step 4.1, generating the subspaces:
performing standard normalization (zero mean, unit variance) on the vectorized HOG features of the source domain and of the target domain respectively to obtain the normalized vectorized HOG features of the source domain and of the target domain;
performing PCA (principal component analysis) transformation on the normalized vectorized HOG features of the source domain, selecting the eigenvectors corresponding to the d largest eigenvalues, and taking the selected eigenvectors as the basis B_S of the source-domain subspace;
similarly, performing PCA transformation on the normalized vectorized HOG features of the target domain, selecting the eigenvectors corresponding to the d largest eigenvalues, and taking the selected eigenvectors as the basis B_T of the target-domain subspace;
B_S, B_T ∈ R^(p×d); B_S and B_T are orthonormal matrices (i.e. B_S'B_S = I_d and B_T'B_T = I_d), where I_d is the identity matrix of dimension d and ' denotes transposition;
Step 4.2, solving the transformation matrix between the source domain and the target domain:
each normalized vectorized HOG feature γ_S of the source domain is mapped into the source-domain subspace by computing γ_S B_S, and each normalized vectorized HOG feature γ_T of the target domain is mapped into the target-domain subspace by computing γ_T B_T;
a transformation matrix is learned that aligns the source-subspace coordinate system with the target-subspace coordinate system;
the transformation matrix F is learned by minimizing the Bregman matrix divergence T(F), which yields the optimal transformation matrix F*:

F* = argmin_F T(F) = argmin_F ||B_S F - B_T||_F^2 (1)

where T(F) is the Bregman matrix divergence and ||·||_F is the Frobenius norm; since the Frobenius norm is invariant to orthonormal operations, equation (1) can be written in the form of equation (3):

F* = argmin_F ||B_S'B_S F - B_S'B_T||_F^2 = argmin_F ||F - B_S'B_T||_F^2 (3)

as can be seen from equation (3), the optimal transformation matrix is obtained as F* = B_S'B_T; by means of the optimal solution F*, the source-domain subspace basis is aligned with the target-domain subspace basis and a new coordinate space B_a is generated, where B_a = B_S B_S'B_T; B_a is called the transformation system from the source domain to the target domain;
if the source-domain and target-domain subspace bases are identical, the optimal transformation matrix F* is the identity matrix;
Step 4.3, calculating the vectorized HOG features of the source domain and of the target domain in the new coordinate space respectively:

Z_S = γ_S B_a (4)

where γ_S denotes the normalized vectorized HOG features of the source domain and Z_S denotes the vectorized HOG features of the source domain in the new coordinate space;

Z_T = γ_T B_T (5)

where γ_T denotes the normalized vectorized HOG features of the target domain and Z_T denotes the vectorized HOG features of the target domain in the new coordinate space.
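A minimal sketch of step four, assuming the vectorized HOG features of the source and target domains are stacked row-wise in NumPy arrays X_s and X_t and that scikit-learn is available; the subspace dimension d = 64 is an illustrative choice.

    # Sketch of step four: normalization, PCA subspace bases, subspace alignment
    # (F* = B_S' B_T, B_a = B_S B_S' B_T) and projection into the new space.
    import numpy as np
    from sklearn.preprocessing import StandardScaler
    from sklearn.decomposition import PCA

    d = 64                                                  # illustrative subspace dimension
    gamma_s = StandardScaler().fit_transform(X_s)           # normalized source-domain features
    gamma_t = StandardScaler().fit_transform(X_t)           # normalized target-domain features

    B_s = PCA(n_components=d).fit(gamma_s).components_.T    # p x d basis of the source subspace
    B_t = PCA(n_components=d).fit(gamma_t).components_.T    # p x d basis of the target subspace

    F_star = B_s.T @ B_t          # optimal transformation matrix F* = B_S' B_T
    B_a = B_s @ F_star            # transformation system B_a = B_S B_S' B_T

    Z_s = gamma_s @ B_a           # source-domain features in the new space, eq. (4)
    Z_t = gamma_t @ B_t           # target-domain features in the new space, eq. (5)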
After the subspace alignment algorithm is applied, the source domain and the target domain lie in the same feature space. In the new space, however, the distributions of the source-domain and target-domain vectorized HOG features are still not exactly the same. To improve accuracy, a joint probability adaptation method is used in the new space to adjust the probability distributions of the source domain and the target domain.
The fourth embodiment: this embodiment differs from the third embodiment in that the specific process of step five is as follows:
Step 5.1, after subspace alignment, the labeled source-domain vectorized HOG features in the new coordinate space are denoted {(z_S^i, y_S^i)}, i = 1, 2, …, n, and the unlabeled target-domain vectorized HOG features in the new coordinate space are denoted {z_T^j}, j = 1, 2, …, m, where z_S^i is the i-th sample of the source-domain vectorized HOG features in the new coordinate space, y_S^i is the label of the i-th sample, n is the number of samples of the source-domain vectorized HOG features, z_T^j is the j-th sample of the target-domain vectorized HOG features in the new coordinate space, and m is the number of samples of the target-domain vectorized HOG features;
the marginal distribution P_s(Z_S) of the source-domain vectorized HOG features in the new coordinate space and the marginal distribution P_t(Z_T) of the target-domain vectorized HOG features are not equal, i.e. P_s(Z_S) ≠ P_t(Z_T), and the conditional distribution P_s(y_s|Z_S) of the source-domain vectorized HOG features and the conditional distribution P_t(y_t|Z_T) of the target-domain vectorized HOG features are not equal, i.e. P_s(y_s|Z_S) ≠ P_t(y_t|Z_T);
balanced distribution adaptation is performed by minimizing D(Z_S|Z_T), the weighted sum of the marginal distribution distance and the conditional distribution distance:

D(Z_S|Z_T) = (1 - β) D(P_s(Z_S), P_t(Z_T)) + β D(P_s(y_s|Z_S), P_t(y_t|Z_T)) (6)

where D(P_s(Z_S), P_t(Z_T)) is the marginal distribution distance, D(P_s(y_s|Z_S), P_t(y_t|Z_T)) is the conditional distribution distance, and β is the balance factor;
when β → 0, the distributions of the source-domain and target-domain vectorized HOG features differ greatly, so the adaptation of the marginal distribution is more important; when β → 1, the distributions of the source-domain and target-domain vectorized HOG features are similar, so the adaptation of the conditional distribution is more important; the balance factor β can therefore adaptively adjust the importance of each distribution and achieve good results;
Step 5.2, the marginal distribution distance and the conditional distribution distance are estimated with the MMD (maximum mean discrepancy), and equation (6) is rewritten in the form of equation (7):

D(Z_S|Z_T) ≈ (1 - β) ||(1/n) Σ_{i=1}^{n} z_S^i - (1/m) Σ_{j=1}^{m} z_T^j||_H^2 + β Σ_{c=1}^{C} ||(1/n_c) Σ_{z_S^i ∈ Z_S^(c)} z_S^i - (1/m_c) Σ_{z_T^j ∈ Z_T^(c)} z_T^j||_H^2 (7)

where Z_S^(c) and Z_T^(c) denote the samples belonging to class c in the source domain and in the target domain in the new coordinate space, n_c is the number of source-domain samples belonging to class c, m_c is the number of target-domain samples belonging to class c, c ∈ {1, 2, …, C} is the class label, C is the total number of classes, and H denotes the reproducing kernel Hilbert space;
the first term on the right-hand side of the equation represents the marginal distribution distance between the source domain and the target domain, and the second term represents the conditional distribution distance between the two domains;
Step 5.3, through mathematical transformation and regularization, equation (7) is reduced to the form of equation (8):

min_A tr(A^T Z ((1 - β) K_0 + β Σ_{c=1}^{C} K_c) Z^T A) (8)

where A is the probability-adaptation transformation matrix, T denotes transposition, K_0 is the maximum mean discrepancy matrix of the marginal distribution, K_c is the maximum mean discrepancy matrix of the conditional distribution, and Z is the set of the source-domain vectorized HOG features and the target-domain vectorized HOG features in the new coordinate space;
Z = {Z_S, Z_T}, where Z_S is the set of all samples of the source-domain vectorized HOG features in the new coordinate space and Z_T is the set of all samples of the target-domain vectorized HOG features in the new coordinate space;
K_0 and K_c satisfy:

(K_0)_ij = 1/n^2 if z_i, z_j ∈ Z_S; 1/m^2 if z_i, z_j ∈ Z_T; -1/(nm) otherwise;
(K_c)_ij = 1/n_c^2 if z_i, z_j ∈ Z_S^(c); 1/m_c^2 if z_i, z_j ∈ Z_T^(c); -1/(n_c m_c) if z_i ∈ Z_S^(c) and z_j ∈ Z_T^(c), or z_i ∈ Z_T^(c) and z_j ∈ Z_S^(c); 0 otherwise;

there are two parts in equation (8): the first part is the adaptation of the marginal distribution, and the second part is the adaptation of the conditional distribution;
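A sketch of how the MMD matrices K_0 and K_c of equation (8) can be assembled; the target-domain class assignments y_t_pseudo (in practice pseudo-labels predicted by a base classifier) and the value of the balance factor are assumptions of this sketch, and every class is assumed to appear in both domains.

    # Sketch of the MMD matrices: K0 for the marginal distribution, one Kc per class
    # for the conditional distribution, combined with the balance factor beta.
    import numpy as np

    def mmd_matrix(mask_a, mask_b):
        """MMD coefficient matrix for the two groups of samples selected by the masks."""
        e = np.zeros(mask_a.size)
        e[mask_a] = 1.0 / mask_a.sum()
        e[mask_b] = -1.0 / mask_b.sum()
        return np.outer(e, e)                      # N x N coefficient matrix

    labels = np.concatenate([y_s, y_t_pseudo])     # class label of every sample in Z
    N, n = labels.size, len(y_s)
    src = np.arange(N) < n                         # indicator of source-domain samples
    tgt = ~src

    K0 = mmd_matrix(src, tgt)                      # marginal-distribution MMD matrix
    Kc_sum = sum(mmd_matrix(src & (labels == c), tgt & (labels == c))
                 for c in np.unique(y_s))          # sum of the conditional MMD matrices
    beta = 0.5                                     # illustrative balance factor
    M = (1 - beta) * K0 + beta * Kc_sum            # weighted MMD matrix used in eq. (8)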
Structural sparsity is then introduced to adjust the instance weights. Matching the feature distributions by minimizing the MMD in the above equation matches the low-order and high-order statistics for domain adaptation, but the distributions still do not match exactly. In particular, when the domain difference is large, there will always be some source instances that are unrelated to the target instances. It is therefore important to incorporate an instance-reweighting process into the BDA and to reweight the source instances.
Step 5.4, applying l_{2,1}-norm structural sparsity regularization to the probability-adaptation transformation matrix A, i.e. introducing row sparsity into the probability-adaptation transformation matrix A;
since each row of the probability-adaptation transformation matrix A corresponds to one sample in the set Z of vectorized HOG features (i.e. to one vectorized HOG feature in the set Z), the adaptive instance weights are determined by row sparsity, and the regularization term for instance weight adjustment is defined as:

||A_s||_{2,1} + ||A_t||_F^2 (9)

where A_s = A_{1:n,:} is the part of the probability-adaptation transformation matrix corresponding to the source-domain vectorized HOG features in the new coordinate space, and A_t = A_{n+1:n+m,:} is the part corresponding to the target-domain vectorized HOG features in the new coordinate space;
A_{1:n,:} denotes rows 1 to n (all columns) of A, and A_{n+1:n+m,:} denotes rows n+1 to n+m (all columns) of A;
in equation (9), only the source-domain vectorized HOG features are subject to the l_{2,1}-norm regularization, because the goal is to reweight the source-domain vectorized HOG features according to their correlation with the target-domain vectorized HOG features; by minimizing equation (10), source-domain vectorized HOG features that are correlated (uncorrelated) with the target-domain vectorized HOG features are adaptively reweighted so that they have greater (smaller) importance in the new representation Z = A^T K; with this regularizer, robustness is provided against domain differences caused by uncorrelated vectorized HOG features.
Fifthly, combining and optimizing the formula (8) and the formula (9) to obtain the optimization problem of the formula (10):
s.t.ATZH0ZTA=I
where I represents the identity matrix, tr represents the trace of the matrix, λ is the regularization parameter used to trade-off probability adaptation and instance weight adjustment, H0Is a central matrix; h0=I-(1/N)1,N=n+m;
According to a constraint optimization theory, obtaining a Lagrangian function L:
wherein: phi is the Lagrange multiplier;
step five and six, derivation is carried out on the Lagrange function L, and the order is carried outEquation (12) is obtained:
wherein: g is a diagonal gradient matrix, and a transformation matrix A of probability adaptation is solved according to a formula (12);
wherein, | | As||2,1Is a non-smooth function of zero, | As||2,1Is calculated asG is a diagonal gradient matrix, the ith element G in GiiEqual to:
wherein: z is a radical ofiAs elements of the set Z, aiIs the ith row in the matrix A; since the diagonal gradient matrix G is also unknown, it depends on the matrix a. Therefore, an alternating optimization method is employed, i.e., iteratively fixing one variable and updating the other.
Step five, generating new source domain vectorization HOG characteristics and new target domain vectorization HOG characteristics again respectively:
Z′S=ZSAs (13)
wherein: z'SVectorizing the HOG feature on behalf of the regenerated new source domain;
Z′T=ZTAt (14)
wherein: z'TVectorizing the HOG features on behalf of the regenerated target domain.
The fifth embodiment: this embodiment differs from the fourth embodiment in that, in step five, the value range of the balance factor β is β ∈ [0, 1].
The sixth embodiment: this embodiment differs from the fifth embodiment in that, in step five, the value range of the regularization parameter λ is [0.1, 10].
The following experiments demonstrate the beneficial effects of the present invention:
the source domain of the experimental data set contains 252 destroyers, 160 cruisers, and 192 aircraft carriers at 0.5 m resolution; the target domain contains 160 destroyers, 160 cruisers, and 212 aircraft carriers at 1 m resolution. FIG. 2 to FIG. 7 are schematic diagrams of the source-domain and target-domain images.
The test results in FIG. 8 to FIG. 10 show that the method provided by the invention outperforms both the existing transfer learning methods and the method without transfer learning.
The above examples of the present invention merely explain the computational model and computational flow of the invention in detail and are not intended to limit the embodiments of the invention. Other variations and modifications can be made by those skilled in the art on the basis of the above description; it is not possible to enumerate all embodiments here, and all obvious variations and modifications derived from the technical solution of the present invention fall within the protection scope of the invention.

Claims (6)

1. A ship target identification method based on feature migration, characterized by comprising the following steps:
step one, selecting high-resolution ship images as training set images, intercepting a target slice from each image in the training set, and taking the high-resolution ship images in the training set as the source domain, thereby obtaining the target slices of the source domain;
step two, calculating the HOG features of the source-domain target slices obtained in step one, and vectorizing the calculated HOG features to obtain the vectorized HOG features of the source domain;
step three, for the low-resolution ship image to be identified, intercepting a target slice of the image to be identified, and taking the low-resolution ship image to be identified as the target domain, thereby obtaining the target slices of the target domain;
calculating the HOG features of the target-domain target slices, and vectorizing the calculated HOG features to obtain the vectorized HOG features of the target domain;
step four, performing standard normalization on the vectorized HOG features of the source domain and of the target domain respectively to obtain the normalized vectorized HOG features of the source domain and of the target domain;
performing PCA transformation on the normalized vectorized HOG features of the source domain and of the target domain respectively to obtain the basis of the source-domain subspace and the basis of the target-domain subspace;
performing subspace alignment on the basis of the source-domain subspace and the basis of the target-domain subspace to generate a new coordinate space; mapping the normalized vectorized HOG features of the source domain and of the target domain into the new coordinate space to obtain the vectorized HOG features of the source domain and the target domain in the new coordinate space;
step five, performing probability adaptation and instance weight adjustment on the vectorized HOG features of the source domain and the target domain in the new coordinate space, and regenerating new source-domain vectorized HOG features and new target-domain vectorized HOG features;
step six, inputting the source-domain vectorized HOG features regenerated in step five into a support vector machine for training to obtain a trained support vector machine;
step seven, inputting the target-domain vectorized HOG features regenerated in step five into the trained support vector machine to obtain the target identification result.
2. The method for identifying ship targets based on feature migration according to claim 1, wherein, in step one, the size of the target slice is set according to the size of the target in the high-resolution ship image.
3. The method for identifying ship targets based on feature migration according to claim 2, wherein the specific process of step four is as follows:
step 4.1, generating the subspaces:
performing standard normalization on the vectorized HOG features of the source domain and of the target domain respectively to obtain the normalized vectorized HOG features of the source domain and of the target domain;
performing PCA transformation on the normalized vectorized HOG features of the source domain, selecting the eigenvectors corresponding to the d largest eigenvalues, and taking the selected eigenvectors as the basis B_S of the source-domain subspace;
similarly, performing PCA (principal component analysis) transformation on the normalized vectorized HOG features of the target domain, selecting the eigenvectors corresponding to the d largest eigenvalues, and taking the selected eigenvectors as the basis B_T of the target-domain subspace;
step 4.2, solving the transformation matrix between the source domain and the target domain:
mapping each normalized vectorized HOG feature γ_S of the source domain into the source-domain subspace by computing γ_S B_S, and mapping each normalized vectorized HOG feature γ_T of the target domain into the target-domain subspace by computing γ_T B_T;
learning a transformation matrix F by minimizing the Bregman matrix divergence T(F) to obtain the optimal transformation matrix F*:

F* = argmin_F T(F) = argmin_F ||B_S F - B_T||_F^2 (1)

where T(F) is the Bregman matrix divergence and ||·||_F is the Frobenius norm; since the Frobenius norm is invariant to orthonormal operations, equation (1) is written in the form of equation (3):

F* = argmin_F ||B_S'B_S F - B_S'B_T||_F^2 = argmin_F ||F - B_S'B_T||_F^2 (3)

the optimal transformation matrix is obtained as F* = B_S'B_T; by means of the optimal solution F*, the source-domain subspace basis is aligned with the target-domain subspace basis and a new coordinate space B_a is generated, where B_a = B_S B_S'B_T and B_S' is the transpose of the orthonormal matrix B_S;
step 4.3, calculating the vectorized HOG features of the source domain and of the target domain in the new coordinate space respectively:

Z_S = γ_S B_a (4)

where γ_S denotes the normalized vectorized HOG features of the source domain and Z_S denotes the vectorized HOG features of the source domain in the new coordinate space;

Z_T = γ_T B_T (5)

where γ_T denotes the normalized vectorized HOG features of the target domain and Z_T denotes the vectorized HOG features of the target domain in the new coordinate space.
4. The method for identifying ship targets based on feature migration according to claim 3, wherein the specific process of step five is as follows:
step 5.1, after subspace alignment, the labeled source-domain vectorized HOG features in the new coordinate space are denoted {(z_S^i, y_S^i)}, i = 1, 2, …, n, and the unlabeled target-domain vectorized HOG features in the new coordinate space are denoted {z_T^j}, j = 1, 2, …, m, where z_S^i is the i-th sample of the source-domain vectorized HOG features in the new coordinate space, y_S^i is the label of the i-th sample, n is the number of samples of the source-domain vectorized HOG features, z_T^j is the j-th sample of the target-domain vectorized HOG features in the new coordinate space, and m is the number of samples of the target-domain vectorized HOG features;
the marginal distribution P_s(Z_S) of the source-domain vectorized HOG features in the new coordinate space and the marginal distribution P_t(Z_T) of the target-domain vectorized HOG features are not equal, i.e. P_s(Z_S) ≠ P_t(Z_T), and the conditional distribution P_s(y_s|Z_S) of the source-domain vectorized HOG features and the conditional distribution P_t(y_t|Z_T) of the target-domain vectorized HOG features are not equal, i.e. P_s(y_s|Z_S) ≠ P_t(y_t|Z_T);
balanced distribution adaptation is performed by minimizing D(Z_S|Z_T), the weighted sum of the marginal distribution distance and the conditional distribution distance:

D(Z_S|Z_T) = (1 - β) D(P_s(Z_S), P_t(Z_T)) + β D(P_s(y_s|Z_S), P_t(y_t|Z_T)) (6)

where D(P_s(Z_S), P_t(Z_T)) is the marginal distribution distance, D(P_s(y_s|Z_S), P_t(y_t|Z_T)) is the conditional distribution distance, and β is the balance factor;
step 5.2, estimating the marginal distribution distance and the conditional distribution distance with the MMD, and rewriting equation (6) in the form of equation (7):

D(Z_S|Z_T) ≈ (1 - β) ||(1/n) Σ_{i=1}^{n} z_S^i - (1/m) Σ_{j=1}^{m} z_T^j||_H^2 + β Σ_{c=1}^{C} ||(1/n_c) Σ_{z_S^i ∈ Z_S^(c)} z_S^i - (1/m_c) Σ_{z_T^j ∈ Z_T^(c)} z_T^j||_H^2 (7)

where Z_S^(c) and Z_T^(c) denote the samples belonging to class c in the source domain and in the target domain in the new coordinate space, n_c is the number of source-domain samples belonging to class c, m_c is the number of target-domain samples belonging to class c, c ∈ {1, 2, …, C} is the class label, C is the total number of classes, and H denotes the reproducing kernel Hilbert space;
step 5.3, reducing equation (7) to the form of equation (8) through mathematical transformation and regularization:

min_A tr(A^T Z ((1 - β) K_0 + β Σ_{c=1}^{C} K_c) Z^T A) (8)

where A is the probability-adaptation transformation matrix, T denotes transposition, K_0 is the maximum mean discrepancy matrix of the marginal distribution, K_c is the maximum mean discrepancy matrix of the conditional distribution, and Z is the set of the source-domain vectorized HOG features and the target-domain vectorized HOG features in the new coordinate space;
Z = {Z_S, Z_T}, where Z_S is the set of all samples of the source-domain vectorized HOG features in the new coordinate space and Z_T is the set of all samples of the target-domain vectorized HOG features in the new coordinate space;
step 5.4, applying l_{2,1}-norm structural sparsity regularization to the probability-adaptation transformation matrix A, i.e. introducing row sparsity into the probability-adaptation transformation matrix A;
each row of the probability-adaptation transformation matrix A corresponds to one sample in the vectorized HOG feature set Z, and the adaptive instance weights are determined by row sparsity, so the regularization term for instance weight adjustment is defined as:

||A_s||_{2,1} + ||A_t||_F^2 (9)

where A_s = A_{1:n,:} is the part of the probability-adaptation transformation matrix corresponding to the source-domain vectorized HOG features in the new coordinate space, and A_t = A_{n+1:n+m,:} is the part corresponding to the target-domain vectorized HOG features in the new coordinate space;
step 5.5, combining and optimizing equation (8) and equation (9) yields the optimization problem of equation (10):

min_A tr(A^T Z ((1 - β) K_0 + β Σ_{c=1}^{C} K_c) Z^T A) + λ (||A_s||_{2,1} + ||A_t||_F^2) (10)
s.t. A^T Z H_0 Z^T A = I

where I denotes the identity matrix, tr denotes the trace of a matrix, λ is the regularization parameter that trades off probability adaptation against instance weight adjustment, and H_0 is the centering matrix;
according to constrained optimization theory, the Lagrangian function L is obtained:

L = tr(A^T Z ((1 - β) K_0 + β Σ_{c=1}^{C} K_c) Z^T A) + λ (||A_s||_{2,1} + ||A_t||_F^2) + tr((I - A^T Z H_0 Z^T A) Φ) (11)

where Φ is the Lagrange multiplier;
step 5.6, taking the derivative of the Lagrangian function L with respect to A and setting ∂L/∂A = 0 gives equation (12):

(Z ((1 - β) K_0 + β Σ_{c=1}^{C} K_c) Z^T + λ G) A = Z H_0 Z^T A Φ (12)

where G is a diagonal gradient matrix, and the probability-adaptation transformation matrix A is solved from equation (12);
step 5.7, regenerating the new source-domain vectorized HOG features and the new target-domain vectorized HOG features respectively:

Z'_S = Z_S A_s (13)

where Z'_S denotes the regenerated new source-domain vectorized HOG features;

Z'_T = Z_T A_t (14)

where Z'_T denotes the regenerated new target-domain vectorized HOG features.
5. The method for identifying ship targets based on feature migration according to claim 4, wherein, in step five, the value range of the balance factor β is β ∈ [0, 1].
6. The method for identifying ship targets based on feature migration according to claim 5, wherein, in step five, the value range of the regularization parameter λ is [0.1, 10].
CN201910866137.8A 2019-09-09 2019-09-09 Ship target identification method based on feature migration Active CN110598636B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910866137.8A CN110598636B (en) 2019-09-09 2019-09-09 Ship target identification method based on feature migration

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910866137.8A CN110598636B (en) 2019-09-09 2019-09-09 Ship target identification method based on feature migration

Publications (2)

Publication Number Publication Date
CN110598636A true CN110598636A (en) 2019-12-20
CN110598636B CN110598636B (en) 2023-01-17

Family

ID=68859314

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910866137.8A Active CN110598636B (en) 2019-09-09 2019-09-09 Ship target identification method based on feature migration

Country Status (1)

Country Link
CN (1) CN110598636B (en)


Patent Citations (18)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20100086180A1 (en) * 2008-10-08 2010-04-08 Wallace Chester A System, method and apparatus for exploration
CN102368905A (en) * 2008-11-06 2012-03-07 南卡罗来纳医科大学研究发展基金会 Lysosomotropic inhibitors of acid ceramidase
CN103268363A (en) * 2013-06-06 2013-08-28 哈尔滨工业大学 Elastic HOG (histograms of oriented gradient) feature-based Chinese calligraphy image retrieval method matched with DDTW (Derivative dynamic time wrapping)
CN104751198A (en) * 2013-12-27 2015-07-01 华为技术有限公司 Method and device for identifying target object in image
WO2017070322A1 (en) * 2015-10-21 2017-04-27 Toth, Landy Controlled and precise treatment of cardiac tissues
CN105868794A (en) * 2016-04-19 2016-08-17 哈尔滨工业大学 Method for ship target fuzzy recognition based on inverse synthetic aperture radar (ISAR) image
US20180322385A1 (en) * 2017-05-05 2018-11-08 Intel Corporation Efficient learning and using of topologies of neural networks in machine learning
CN107292246A (en) * 2017-06-05 2017-10-24 河海大学 Infrared human body target identification method based on HOG PCA and transfer learning
KR101914717B1 (en) * 2017-09-28 2018-11-02 전남대학교산학협력단 Human Action Recognition Using Rreproducing Kernel Hilbert Space for Product manifold of Symmetric Positive definite Matrices
CN107832711A (en) * 2017-11-13 2018-03-23 常州大学 A kind of recognition methods again of the pedestrian based on transfer learning
CN108399420A (en) * 2018-01-30 2018-08-14 北京理工雷科电子信息技术有限公司 A kind of visible light naval vessel false-alarm elimination method based on depth convolutional network
CN108647702A (en) * 2018-04-13 2018-10-12 湖南大学 A kind of extensive food materials image classification method based on transfer learning
CN109373438A (en) * 2018-09-11 2019-02-22 太原理工大学 Heating energy-saving control method and system based on transfer learning algorithm
CN109325507A (en) * 2018-10-11 2019-02-12 湖北工业大学 A kind of image classification algorithms and system of combination super-pixel significant characteristics and HOG feature
CN109389174A (en) * 2018-10-23 2019-02-26 四川大学 A kind of crowd massing Sensitive Image Detection Method
CN109359623A (en) * 2018-11-13 2019-02-19 西北工业大学 High spectrum image based on depth Joint Distribution adaptation network migrates classification method
CN109766811A (en) * 2018-12-31 2019-05-17 复旦大学 The end-to-end detection and recognition methods of sea ship in a kind of satellite-borne SAR image
CN110210545A (en) * 2019-05-27 2019-09-06 河海大学 Infrared remote sensing water body classifier construction method based on transfer learning

Non-Patent Citations (5)

* Cited by examiner, † Cited by third party
Title
HONGBO LI: "A Transfer Learning Method For Ship Recognition In Multi-Optical Remote Sensing Satellites", 《2018 IEEE/CIC INTERNATIONAL CONFERENCE ON COMMUNICATIONS IN CHINA (ICCC WORKSHOPS)》 *
HONGBO LI: "A Transfer Learning Method of Ship Identification Based on Weighted Hog Features", 《IGARSS 2019 - 2019 IEEE INTERNATIONAL GEOSCIENCE AND REMOTE SENSING SYMPOSIUM》 *
M. JEDRA: "Recognition of seed varieties using neural networks analysis of electrophoretic images", 《PROCEEDINGS OF THE IEEE-INNS-ENNS INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS》 *
FANG MENGLIANG: "A Ship Target Detection Technique for Optical Remote Sensing Images", 《COMPUTER TECHNOLOGY AND DEVELOPMENT》 *
CHEN HAO ET AL.: "A Novel Image Feature Recognition Method for End-Face Features of Space Launch Vehicles", 《TRANSDUCER AND MICROSYSTEM TECHNOLOGIES》 *

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111583201A (en) * 2020-04-26 2020-08-25 浙江大学 Transfer learning method for constructing super-resolution pathology microscope
CN111583201B (en) * 2020-04-26 2022-04-05 浙江大学 Transfer learning method for constructing super-resolution pathology microscope
CN111931558A (en) * 2020-06-22 2020-11-13 武汉第二船舶设计研究所(中国船舶重工集团公司第七一九研究所) Ship category identification method and system
CN113657541A (en) * 2021-08-26 2021-11-16 电子科技大学长三角研究院(衢州) Domain adaptive target identification method based on deep knowledge integration
CN113657541B (en) * 2021-08-26 2023-10-10 电子科技大学长三角研究院(衢州) Domain self-adaptive target recognition method based on depth knowledge integration
CN116229442A (en) * 2023-01-03 2023-06-06 武汉工程大学 Text image synthesis and instantiation weight transfer learning method
CN116229442B (en) * 2023-01-03 2024-05-28 武汉工程大学 Text image synthesis and instantiation weight transfer learning method

Also Published As

Publication number Publication date
CN110598636B (en) 2023-01-17

Similar Documents

Publication Publication Date Title
CN110598636B (en) Ship target identification method based on feature migration
WO2021120752A1 (en) Region-based self-adaptive model training method and device, image detection method and device, and apparatus and medium
Tuia et al. Domain adaptation for the classification of remote sensing data: An overview of recent advances
CN110569696A (en) Neural network system, method and apparatus for vehicle component identification
CN109544603B (en) Target tracking method based on deep migration learning
WO2022218396A1 (en) Image processing method and apparatus, and computer readable storage medium
CN107392107A (en) A kind of face feature extraction method based on isomery tensor resolution
CN108428220A (en) Satellite sequence remote sensing image sea island reef region automatic geometric correction method
WO2015012136A1 (en) Method for segmenting data
CN108121962B (en) Face recognition method, device and equipment based on nonnegative adaptive feature extraction
CN112818850A (en) Cross-posture face recognition method based on progressive neural network and attention mechanism
Wang et al. A novel sparse boosting method for crater detection in the high resolution planetary image
CN114863348A (en) Video target segmentation method based on self-supervision
CN109886315B (en) Image similarity measurement method based on kernel preservation
CN114565861A (en) Airborne downward-looking target image positioning method based on probability statistic differential homoembryo set matching
Yang et al. Non-rigid point set registration via global and local constraints
CN110443169B (en) Face recognition method based on edge preservation discriminant analysis
Hu et al. An adaptive nonlocal Gaussian prior for hyperspectral image denoising
Zhao et al. Two‐Phase Incremental Kernel PCA for Learning Massive or Online Datasets
CN109815889B (en) Cross-resolution face recognition method based on feature representation set
CN112465062A (en) Clustering method based on manifold learning and rank constraint
CN109543717B (en) Joint collaborative expression hyperspectral classification method based on adaptive neighborhood and dictionary
US11756319B2 (en) Shift invariant loss for deep learning based image segmentation
Yuan et al. A novel hyperspectral unmixing model based on multilayer NMF with Hoyer’s projection
CN115239694A (en) Hyperspectral anomaly detection method fusing robust dictionary and double-cooperative-constraint regular term

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant