CN107346327A - Zero-shot hashing image retrieval method based on supervision transfer - Google Patents
- Publication number
- CN107346327A (application CN201710253104.7A)
- Authority
- CN
- China
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F16/00—Information retrieval; Database structures therefor; File system structures therefor
- G06F16/50—Information retrieval; Database structures therefor; File system structures therefor of still image data
- G06F16/58—Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually
- G06F16/5866—Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually using information manually generated, e.g. tags, keywords, comments, manually generated location and time information
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
- G06F18/24—Classification techniques
- G06F18/241—Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches
- G06F18/2411—Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches based on the proximity to a decision surface, e.g. support vector machines
Abstract
The invention discloses a zero-shot hashing image retrieval method based on supervision transfer, belonging to the field of image hashing. The invention models the labels of the existing training data with an existing natural language processing model, forming a label space. Because the new label space implicitly preserves the relations between labels, a mapping from the image feature space to the label space can be trained. On this basis, the mapping is reflected in the hash codes. The invention places no strict requirements on training samples and has a wide range of applications; in particular, in large databases with many categories, where training samples cannot be found for every category, the invention can markedly improve image retrieval accuracy for such categories.
Description
Technical field
The invention belongs to the field of image hashing, and in particular relates to a method for accurately hashing images, especially a method that can still encode a class of images reasonably when no training samples exist for that class.
Background art
With the generation of more and more multimedia data, hashing has become a powerful tool for large-scale retrieval: it can greatly shorten the time spent searching in massive datasets. Because computers excel at XOR operations, retrieval with hash codes can meet the search demands brought by the big-data era.
Hashing databases is a particularly important task with significant implications for many fields. Hashing has therefore received considerable attention for a long time, and many important algorithms have been proposed: data-independent methods such as Locality Sensitive Hashing (LSH); data-dependent methods such as Iterative Quantization (ITQ) and Isotropic Hashing; and supervised methods such as Supervised Hashing with Kernels (KSH) and Supervised Discrete Hashing (SDH). Supervised hashing in particular, by mining semantic labels and deeply exploring the internal relations between the samples in the database, greatly improves hashing accuracy.
However, in the big-data era, as more and more multimedia data are generated, it is impractical to possess training data for every category. Moreover, traditional hashing methods mostly target static databases and are of limited ability when facing ever-growing databases; traditional supervised hashing is therefore unsuitable for expanding databases. Zero-shot learning is an effective approach to the shortage of samples: by learning a mapping from the sample feature space to a semantic space, it avoids having to model unobserved samples directly. Traditional zero-shot learning, however, often has many limitations, for example ambiguous human attribute annotations and the inability to cope with dataset shift. In recent years, with the development of natural language processing, zero-shot learning has been given new solutions. For example, by mining huge corpora such as Wikipedia, supervision labels can be mapped to label vectors; such label vectors are endowed with richer semantics than conventional labels and can thus guide the modeling of unobserved classes. Well-known methods include that of Socher, who supervises with label vectors to guide classification of unobserved data; see "R. Socher, M. Ganjoo, C. D. Manning, and A. Ng. Zero-shot learning through cross-modal transfer. In NIPS, 2013." Frome, in "A. Frome, G. S. Corrado, J. Shlens, S. Bengio, J. Dean, T. Mikolov, et al. DeViSE: A deep visual-semantic embedding model. In NIPS, 2013.", adopts a similar strategy but uses a different language model and extends to more categories.
The training mode of traditional supervised hashing is to supervise with 1/0 labels for particular categories. The mutual correlation of such 1/0 labels is 0; for labels used as supervision, the difference between any two classes is fixed. As a result, the relevance between categories cannot be embodied, so the trained hash model can only encode the categories in the training set efficiently; it cannot reliably encode a category it has never seen, i.e., it cannot reliably retrieve unseen categories.
Summary of the invention
To overcome the strict requirements of existing supervised hashing algorithms on training data and their weak hashing ability for classes without training data, the invention proposes a new supervised hashing algorithm, namely zero-shot hashing. The invention places no strict requirements on training samples and has a wide range of applications; in particular, in large databases with many categories, where training samples cannot be found for every category, the invention can markedly improve image retrieval accuracy for such categories.
The invention models the labels of the training data with an existing natural language processing model, forming a label space. Because the new label space implicitly preserves the relations between labels (i.e., the spatial distance relations of the labels within the label space), a mapping from the image feature space to the label space can be trained. On this basis, the mapping is reflected in the hash codes.
First, each image is represented by its primitive features, i.e., in the image feature space; then the image annotation is turned into a vector that machine learning algorithms can handle (a label vector): the original image tag is characterized into a label space (also called the corpus space) by an existing natural language processing model trained on a large corpus. In the label space, semantically similar words have smaller spatial distances, while semantically distant labels lie farther apart. Learning the hash function over the image primitive features and the label vectors both retains, to a high degree, the hash-code learning ability for known classes and, through the internal relations (spatial distance relations) of the label space formed by natural language processing, implicitly transfers supervision information for unknown classes, so that data of unknown classes can also be hashed well.
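The role of the label space can be illustrated with a toy sketch (the 4-dimensional vectors and the labels "cat", "dog", "truck" are invented for illustration; the actual method uses label vectors produced by a corpus-trained language model):

```python
import numpy as np

# Toy stand-ins for corpus-trained label vectors; the real method uses a
# word-vector model trained on a large corpus (e.g. Wikipedia), so these
# 4-dimensional vectors are illustrative assumptions only.
label_space = {
    "cat":   np.array([0.9, 0.8, 0.1, 0.0]),
    "dog":   np.array([0.8, 0.9, 0.2, 0.1]),
    "truck": np.array([0.1, 0.0, 0.9, 0.8]),
}

def label_distance(a, b):
    """Euclidean distance between two labels in the label space."""
    return float(np.linalg.norm(label_space[a] - label_space[b]))

# Semantically similar labels lie closer than dissimilar ones,
# which is the property the supervision transfer relies on.
print(label_distance("cat", "dog") < label_distance("cat", "truck"))  # True
```

It is this distance structure, absent from 1/0 labels, that lets supervision flow to classes with no training samples.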
In the hashing algorithm of the invention, the hash function is obtained through discrete code generation, semantic integration, and original-space-preserving integrated learning, which ensures good performance on classes without samples.
The zero-shot hashing image retrieval method based on supervision transfer of the invention generally comprises three parts: preprocessing of images and image tags, hash function training, and hash code generation and evaluation. Preprocessing mainly includes extracting the image feature space for the training set and generating the corpus space from the labels with a natural language processing model. Hash function training mainly consists of learning the hash function with the hashing algorithm proposed by the invention. Hash code generation then encodes all images with the learned hash function to obtain their hash codes. During retrieval, hash codes are compared to complete the image retrieval process. The specific implementation steps of each part are as follows:
Step 1: preprocessing of images and image tags.
Process objects: training samples and objects to be retrieved (i.e., test samples). Based on an existing corpus model, complete the mapping from image feature vectors to label vectors.
Step 2: hash function training.
To learn a good hash function at low complexity, the invention uses the following target equation:

min_{f,W,B,R} ||R^TY - W^TB||_F^2 + λ||W||_F^2 + α||f(X) - B||_F^2 + β||P||_F^2 + γΣ_{i=1}^{n}Σ_{j=1}^{n} S_ij ||f(x_i) - f(x_j)||_F^2   (1)

In the above formula, X denotes the image feature vector set of the training sample set and Y its label vector set, i.e., X = [x_1, x_2, …, x_n] and Y = [y_1, y_2, …, y_n], where x_i and y_i denote the image feature vector and the label vector of the i-th training sample (i = 1, 2, …, n).
R denotes the semantic integration matrix. W denotes the mapping matrix from hash codes to supervision label vectors; W is a real matrix of size l × c, where l denotes the length of the hash codes and c the number of classes. B = [b_1, b_2, …, b_n] ∈ {-1,1}^{l×n} denotes the matrix formed by the hash codes of all training samples, b_i ∈ {-1,1}^l denotes the hash code of the i-th sample, and n denotes the number of samples. I_c is the c × c identity matrix; ||·||_F denotes the Frobenius norm of a matrix and the symbol (·)^T matrix transposition. The hash matrix P is a real matrix of size l × m, where m denotes the dimension of the label vector. S_ij denotes the similarity between the image feature vectors x_i and x_j of training samples i and j. The balance parameters λ, α, β, γ are positive numbers that weight the different penalty terms. The mapping f is the hash function from the feature space to the hash codes, i.e., the hash function f(X) is:
f(X) = P^Tφ(X)   (2)
where f(X) = [f(x_1), f(x_2), …, f(x_n)]. In view of the successful application of kernel methods in the hashing field for handling linearly inseparable problems, φ(·) is defined as a kernel mapping (3) over the image features, where x denotes the image feature vector of any training sample, a_1, a_2, …, a_m denote m kernels randomly selected from the image feature vectors X, and the balance coefficient δ takes values in [-1, 1].
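Formula (3) is not reproduced in the source text, so the kernel mapping can only be sketched under an assumption: a common choice in kernel hashing (e.g., KSH) is a Gaussian kernel over m randomly selected anchor points, which the following sketch adopts (the Gaussian form and the use of δ as a bandwidth are assumptions):

```python
import numpy as np

def kernel_features(X, anchors, delta=1.0):
    """Map each column of X (d x n features) to an m-dimensional kernel
    representation over m anchor points, as in kernel hashing methods
    such as KSH. The Gaussian form and the bandwidth `delta` are
    assumptions; the patent's formula (3) is not reproduced in the text."""
    # squared distances between every sample and every anchor: m x n
    d2 = ((X[:, None, :] - anchors[:, :, None]) ** 2).sum(axis=0)
    return np.exp(-d2 / delta)

rng = np.random.default_rng(0)
X = rng.normal(size=(5, 20))                      # 5-dim features, 20 samples
A = X[:, rng.choice(20, size=8, replace=False)]   # m = 8 random anchors from X
phi = kernel_features(X, A)
print(phi.shape)  # (8, 20)
```

Each column of `phi` is the kernelized representation φ(x) of one sample, ready to be multiplied by P^T.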
To further improve the accuracy of the hash function, three sub-methods are used:
1) Original feature space (image feature space) preservation: the last term of formula (1) preserves the structural information of the original image space through a similarity matrix S. Each element S_ij of S is generated as follows: when x_i lies among the k nearest neighbours of x_j, or x_j among the k nearest neighbours of x_i, S_ij takes a Gaussian similarity value determined by the distance ||x_i - x_j|| and the bandwidth σ; otherwise S_ij = 0. Here k is a preset value with range [5, 10], and σ takes values in the range 0 to 1, with preferred value 1.
The similarity matrix S ensures that images close in the original feature space receive a large similarity, thereby effectively mining the similarity relations of the original feature space.
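A minimal sketch of the similarity-matrix construction, assuming a Gaussian weight exp(-||x_i - x_j||^2/σ) for neighbouring pairs (the exact Gaussian expression is not reproduced in the source; only the neighbourhood rule and σ are stated):

```python
import numpy as np

def knn_similarity(X, k=5, sigma=1.0):
    """Build the symmetric similarity matrix S: S[i, j] is a Gaussian
    similarity when i is among j's k nearest neighbours or vice versa,
    else 0. The exact form exp(-||xi - xj||^2 / sigma) is an assumption."""
    n = X.shape[1]
    d2 = ((X[:, None, :] - X[:, :, None]) ** 2).sum(axis=0)  # n x n sq. dists
    S = np.zeros((n, n))
    # indices of the k nearest neighbours of each sample (excluding itself)
    nn = np.argsort(d2, axis=1)[:, 1:k + 1]
    for i in range(n):
        for j in nn[i]:
            w = np.exp(-d2[i, j] / sigma)
            S[i, j] = S[j, i] = w  # "xi in xj's neighbourhood or vice versa"
    return S

rng = np.random.default_rng(1)
X = rng.normal(size=(4, 12))
S = knn_similarity(X, k=3)
print(np.allclose(S, S.T))  # True: S is symmetric
```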
2) Semantic integration: the first term of formula (1), ||R^TY - W^TB||_F^2, integrates the label space. Because the distribution of the label space differs somewhat from that of the feature space, feeding the feature space directly into the hashing algorithm causes error. A method is therefore needed to make the label space and the feature space consistent. The invention updates the semantic integration matrix R iteratively, which effectively reduces the error of the generated hash codes and also accelerates the convergence of hash function training.
3) Discrete hash code generation: when solving the target equation given by formula (1), the hash codes are solved discretely, reducing quantization error.
Because formula (1) is a non-convex, NP-hard problem, the invention adopts a strategy of optimizing each parameter in turn. The solution of the target equation is decomposed into four steps: step P, step B, step R, and step W. Iterating these four steps repeatedly makes the target equation converge; the optimization of each step is described below.
Step P: fix all variables in (1) except P and rewrite (1) as a subproblem in P, where I denotes the identity matrix, the Laplacian matrix L = D - S, and D is a diagonal matrix whose i-th diagonal element is Σ_j S_ij.
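The rewritten P-subproblem is not reproduced in the source; collecting the P-dependent terms of (1) and setting the gradient to zero suggests one plausible closed form, sketched below (the formula for P is our derivation, not quoted from the patent):

```python
import numpy as np

def laplacian(S):
    """Graph Laplacian L = D - S, with D the diagonal degree matrix
    whose i-th diagonal element is sum_j S_ij."""
    return np.diag(S.sum(axis=1)) - S

def solve_P(phi, B, S, alpha=1.0, beta=1.0, gamma=1.0):
    """Assumed closed-form P-step. Collecting the P-dependent terms of (1),
    alpha*||P^T phi - B||_F^2 + beta*||P||_F^2 + gamma*Tr(P^T phi L phi^T P),
    and zeroing the gradient gives
    P = (alpha*phi phi^T + beta*I + gamma*phi L phi^T)^{-1} alpha*phi B^T.
    This derivation is ours; the patent omits the explicit formula."""
    L = laplacian(S)
    m = phi.shape[0]
    A = alpha * phi @ phi.T + beta * np.eye(m) + gamma * phi @ L @ phi.T
    return np.linalg.solve(A, alpha * phi @ B.T)

rng = np.random.default_rng(2)
phi = rng.normal(size=(6, 10))          # m x n kernel features
B = np.sign(rng.normal(size=(4, 10)))   # l x n hash codes
S = np.abs(rng.normal(size=(10, 10))); S = (S + S.T) / 2
P = solve_P(phi, B, S)
print(P.shape)  # (6, 4)
```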
Step B: fix all variables in (1) except B; rewriting (1) yields subproblem (6):

min_B Tr(B^TWW^TB) - 2Tr(B^TH),  s.t. B ∈ {-1,1}^{l×n}

where Tr(·) denotes the matrix trace and H = WR^TY + αP^Tφ(X). Let b^q, h^q and w^q denote the q-th rows of B, H and W, and let B', H', W' denote the matrices obtained from B, H, W by removing the q-th row. The optimal row-wise solution of (6) is b^q = sign(h^q - w^qW'^TB'); looping over q until convergence yields the optimal B.
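The row-wise update can be sketched as follows; since the explicit expression is not reproduced in the source, the update b^q = sign(h^q - w^q W'^T B') is a reconstruction on the pattern of SDH-style discrete cyclic coordinate descent:

```python
import numpy as np

def dcc_update_B(B, W, H, sweeps=3):
    """Discrete cyclic coordinate descent for the B-subproblem
    min_B Tr(B^T W W^T B) - 2 Tr(B^T H), B in {-1, 1}^{l x n},
    updating one row (one bit across all samples) at a time:
    b_q = sign(h_q - w_q W'^T B'), where the primes drop row q.
    The row-wise formula follows SDH-style DCC and is a reconstruction;
    the patent's explicit expression is not reproduced in the text."""
    B = B.copy()
    l = B.shape[0]
    for _ in range(sweeps):
        for q in range(l):
            rest = [r for r in range(l) if r != q]
            b = np.sign(H[q] - W[q] @ W[rest].T @ B[rest])
            b[b == 0] = 1  # map sign(0) to +1 so codes stay in {-1, 1}
            B[q] = b
    return B

rng = np.random.default_rng(3)
l, c, n = 4, 3, 10
W = rng.normal(size=(l, c))
H = rng.normal(size=(l, n))
B = dcc_update_B(np.sign(rng.normal(size=(l, n))), W, H)
print(set(np.unique(B)) <= {-1.0, 1.0})  # True
```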
Step R: fix the parameters in (1) except R; (1) then reduces to

min_R ||R^TY - W^TB||_F^2,  s.t. R^TR = I_c

Solving the above yields the optimal R, for example by optimizing R with the method in the document "Z. Wen and W. Yin. A feasible method for optimization with orthogonality constraints. Mathematical Programming, 2013.".
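The text delegates this subproblem to Wen & Yin's feasible method; as a simple stand-in, the sketch below uses gradient descent with an SVD (polar) retraction onto the constraint set R^TR = I_c (this substitute algorithm, the step size, and the iteration count are our assumptions, not the cited algorithm):

```python
import numpy as np

def solve_R(Y, M, c, iters=200, lr=0.01, seed=0):
    """Minimize ||R^T Y - M||_F^2 subject to R^T R = I_c by gradient
    descent with a polar (SVD) retraction back onto the constraint set.
    A simple stand-in for the feasible method of Wen & Yin cited in the
    text, not that algorithm itself. M plays the role of W^T B (c x n)
    and Y is the m x n label-vector matrix."""
    rng = np.random.default_rng(seed)
    R, _ = np.linalg.qr(rng.normal(size=(Y.shape[0], c)))  # feasible start
    for _ in range(iters):
        grad = 2 * Y @ (Y.T @ R) - 2 * Y @ M.T   # d/dR ||R^T Y - M||_F^2
        U, _, Vt = np.linalg.svd(R - lr * grad, full_matrices=False)
        R = U @ Vt                               # nearest orthonormal-column matrix
    return R

rng = np.random.default_rng(4)
Y = rng.normal(size=(8, 15))   # m x n label vectors
M = rng.normal(size=(3, 15))   # c x n target (stands in for W^T B)
R = solve_R(Y, M, c=3)
print(np.allclose(R.T @ R, np.eye(3)))  # True: constraint holds
```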
Step W: fix the other parameters in (1) except W; the closed-form solution is

W = (BB^T + λI_l)^{-1}BY^TR

where I_l denotes the l × l identity matrix. When the values of P, B, R, W no longer change, i.e., when the difference between the two most recent iterations meets a preset precision, iteration stops and the current P, B, R, W are output; from the values of the last iteration the hash function f(·) is obtained, completing hash function training.
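The W-step and its optimality can be checked numerically; the closed form is stated in the text, while the gradient check below is ours:

```python
import numpy as np

def solve_W(B, Y, R, lam=1.0):
    """Closed-form W-step: minimizing ||R^T Y - W^T B||_F^2 + lam*||W||_F^2
    over W gives W = (B B^T + lam*I_l)^{-1} B Y^T R, as stated in the text."""
    l = B.shape[0]
    return np.linalg.solve(B @ B.T + lam * np.eye(l), B @ Y.T @ R)

rng = np.random.default_rng(5)
l, c, m, n = 4, 3, 8, 15
B = np.sign(rng.normal(size=(l, n)))
Y = rng.normal(size=(m, n))
R, _ = np.linalg.qr(rng.normal(size=(m, c)))
W = solve_W(B, Y, R)

# Sanity check (lam = 1): the objective's gradient vanishes at the closed form.
grad = 2 * B @ (B.T @ W - Y.T @ R) + 2 * W
print(np.allclose(grad, 0))  # True
```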
Step 3: hash code generation.
All images are hash-encoded with the learned hash function f(·) to generate their hash codes: based on the obtained value of P, compute φ(X) for each image feature vector and generate the hash code of the current image according to formula f(X) = P^Tφ(X).
Step 4: during image retrieval, the hash codes of the images are compared to obtain the retrieval result. For example, based on the hash codes, the k images closest to the target image within the search scope are found and returned.
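Encoding and retrieval can be sketched as follows; binarizing f(X) with sign() and ranking by Hamming distance are standard practice but are assumptions here, since the patent states only the linear form f(X) = P^Tφ(X) and the comparison of hash codes:

```python
import numpy as np

def encode(P, phi):
    """Generate binary hash codes from f(X) = P^T phi(X). Taking sign()
    to binarize the real-valued projection is an assumption; the patent
    states only the linear form."""
    codes = np.sign(P.T @ phi)
    codes[codes == 0] = 1
    return codes

def hamming_topk(query, database, k):
    """Return indices of the k database codes closest to `query` in
    Hamming distance; codes are {-1, 1} columns, and for such codes
    hamming(q, b) = (l - <q, b>) / 2."""
    l = database.shape[0]
    dists = (l - query @ database) // 2
    return np.argsort(dists)[:k]

rng = np.random.default_rng(6)
P = rng.normal(size=(8, 6))            # kernel dim m = 8, code length l = 6
phi_db = rng.uniform(size=(8, 50))     # kernel features of 50 database images
phi_q = rng.uniform(size=(8, 1))       # kernel features of the query image
db_codes = encode(P, phi_db)           # 6 x 50 binary codes
q_code = encode(P, phi_q)[:, 0]        # length-6 query code
top = hamming_topk(q_code, db_codes, k=5)
print(top.shape)  # (5,)
```

Because Hamming distance reduces to XOR and popcount on packed bits, this ranking step is what makes hash-based retrieval fast at scale.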
In summary, by adopting the above technical solution, the beneficial effects of the invention are: even when a certain class has no training data, a hash function can be learned and the corresponding class hashed effectively, thereby realizing zero-shot hashing. The invention is applicable to large databases; when training samples are scarce relative to the objective world, it effectively improves the hashing of classes that have not been observed.
Embodiment
To make the objects, technical solutions and advantages of the invention clearer, the invention is described in further detail below with reference to an embodiment.
Image retrieval according to the invention comprises the following steps:
Step 1: preprocess the images and image tags of the training samples to obtain the image feature vectors X and label vectors Y of the training samples.
The corpus model provided in "R. Socher, M. Ganjoo, C. D. Manning, and A. Ng. Zero-shot learning through cross-modal transfer. In NIPS, 2013." is trained on a free Wikipedia corpus (containing nearly 500 million words) to mine a reasonable lexical representation, and the labels are expressed as label vectors Y through the pre-trained model.
When extracting image primitive features, the activations of a convolutional neural network can be used as the image features: the image is fed into a trained convolutional neural network and the output of the 7th layer of the network is extracted as the image feature, giving the image feature vector X of each training sample. The network structure and parameters used in this embodiment are those of AlexNet.
Step 2: train the hash function.
Input: the image feature vectors X and the corresponding label vectors Y of the training samples.
Step 201: randomly initialize the semantic integration matrix R, the hash code matrix B, the mapping matrix W, and the hash matrix P.
Step 202: randomly select m kernels {a_i} from the image feature vectors X of the training samples, where m is the dimension of the corresponding label vector, and compute φ(X) according to formula (3).
Step 204: build the Laplacian matrix L according to L = D - S.
Step 205: solve the target equation of formula (1) discretely by iteration, optimizing P, B, R, W until the iteration converges; a preferred convergence condition is that the change between the two most recent solutions does not exceed a preset threshold.
Step 3: based on the hash matrix P obtained in step 2 (the value solved in the last optimization iteration), generate the hash codes of the images according to formula f(X) = P^Tφ(X).
Step 4: input the image I_t of the object to be retrieved, and extract its image feature vector x_t and label vector y_t.
Compute φ(x_t) of the current image in the same way as in step 202, using the m randomly selected kernels, then generate the hash code of the current image I_t through formula f(x_t) = P^Tφ(x_t).
Based on the hash codes, the k (k a preset value) images closest to the target image within the search scope are returned as the retrieval result.
Claims (1)
1. A zero-shot hashing image retrieval method based on supervision transfer, characterized by comprising the following steps:
Step 1: preprocess the images and image tags of the training sample set to obtain the image feature vector set X and the label vector set Y of the training sample set; the image feature vector of each training sample is denoted x_i and the corresponding label vector y_i, the subscript being the training sample identifier;
Step 2: train the hash function based on the image feature vector set X and the label vector set Y of the training samples:
Step 201: randomly initialize the semantic integration matrix R, the hash code matrix B, the mapping matrix W, and the hash matrix P;
wherein R^TR = I_c, I_c denotes the c × c identity matrix, c denotes the preset number of classes, and the symbol (·)^T denotes matrix transposition;
the matrix B = [b_1, b_2, …, b_n] ∈ {-1,1}^{l×n} denotes the matrix formed by the hash codes of all training samples, l denotes the length of the hash codes, n denotes the number of training samples, and b_i ∈ {-1,1}^l denotes the hash code of the i-th sample, i = 1, 2, …, n;
the mapping matrix W denotes the mapping from hash codes to supervision label vectors and is a real matrix of size l × c;
the hash matrix P is a real matrix of size l × m, where m denotes the dimension of the label vector;
Step 202: randomly select m kernels a_1, a_2, …, a_m from the image feature vectors x of the training samples and compute φ(x) accordingly, where x denotes the image feature vector of any training sample, the balance coefficient δ takes values in [-1, 1], and m is the dimension of the corresponding label vector;
Step 204: build the Laplacian matrix L according to L = D - S, where D is a diagonal matrix whose i-th diagonal element is Σ_j S_ij, and S denotes the sample similarity matrix whose element S_ij denotes the similarity between the image feature vectors x_i and x_j of training samples i and j; the similarity S_ij is computed as follows: when x_i lies in the preset neighbourhood of x_j, or x_j lies in the preset neighbourhood of x_i, S_ij takes a Gaussian similarity value determined by ||x_i - x_j|| and the bandwidth σ; otherwise S_ij = 0, where σ takes values in the range 0 to 1;
Step 205: solve the target equation discretely, iterating to obtain the optimal semantic integration matrix R, hash code matrix B, mapping matrix W, and hash matrix P, the target equation being:
min_{f,W,B,R} ||R^TY - W^TB||_F^2 + λ||W||_F^2 + α||f(X) - B||_F^2 + β||P||_F^2 + γΣ_{i=1}^{n}Σ_{j=1}^{n} S_ij ||f(x_i) - f(x_j)||_F^2
where ||·||_F denotes the Frobenius norm of a matrix, the balance parameters λ, α, β, γ are positive numbers, and the mapping f is the hash function from the feature space to the hash codes;
Step 3: based on the hash matrix P obtained in step 2, generate the hash code of each training sample according to formula f(x) = P^Tφ(x);
Step 4: input the image I_t of the object to be retrieved, and extract its image feature vector x_t and label vector y_t;
compute φ(x_t) of the current image in the same way as in step 202, then generate the hash code of the current image through formula f(x_t) = P^Tφ(x_t);
based on the hash codes, find the k images closest to the target image within the search scope and return them as the retrieval result, where k is a preset value.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201710253104.7A CN107346327A (en) | 2017-04-18 | 2017-04-18 | The zero sample Hash picture retrieval method based on supervision transfer |
Publications (1)
Publication Number | Publication Date |
---|---|
CN107346327A true CN107346327A (en) | 2017-11-14 |
Family
ID=60254424
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201710253104.7A Pending CN107346327A (en) | 2017-04-18 | 2017-04-18 | The zero sample Hash picture retrieval method based on supervision transfer |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN107346327A (en) |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN105069173A (en) * | 2015-09-10 | 2015-11-18 | 天津中科智能识别产业技术研究院有限公司 | Rapid image retrieval method based on supervised topology keeping hash |
CN105512289A (en) * | 2015-12-07 | 2016-04-20 | 郑州金惠计算机系统工程有限公司 | Image retrieval method based on deep learning and Hash |
CN106033426A (en) * | 2015-03-11 | 2016-10-19 | 中国科学院西安光学精密机械研究所 | A latent semantic min-Hash-based image retrieval method |
Non-Patent Citations (1)
Title |
---|
YANG YANG,YADAN LUO,等: "Zero-Shot Hashing via Transferring Supervised Knowledge", 《MM "16 PROCEEDINGS OF THE 24TH ACM INTERNATIONAL CONFERENCE ON MULTIMEDIA》 * |
Cited By (15)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN108280180A (en) * | 2018-01-23 | 2018-07-13 | 北京航空航天大学 | Semi-supervised Hash algorithm based on topic model |
CN108280180B (en) * | 2018-01-23 | 2022-05-13 | 北京航空航天大学 | Retrieval method of semi-supervised Hash algorithm based on topic model |
CN108763952A (en) * | 2018-05-03 | 2018-11-06 | 阿里巴巴集团控股有限公司 | A kind of data classification method, device and electronic equipment |
US10248664B1 (en) | 2018-07-02 | 2019-04-02 | Inception Institute Of Artificial Intelligence | Zero-shot sketch-based image retrieval techniques using neural networks for sketch-image recognition and retrieval |
CN109344279A (en) * | 2018-12-12 | 2019-02-15 | 山东山大鸥玛软件股份有限公司 | Hand-written English word intelligent identification Method based on Hash retrieval |
CN109344279B (en) * | 2018-12-12 | 2021-08-10 | 山东山大鸥玛软件股份有限公司 | Intelligent handwritten English word recognition method based on Hash retrieval |
CN111460077B (en) * | 2019-01-22 | 2021-03-26 | 大连理工大学 | Cross-modal Hash retrieval method based on class semantic guidance |
CN111460077A (en) * | 2019-01-22 | 2020-07-28 | 大连理工大学 | Cross-modal Hash retrieval method based on class semantic guidance |
CN110297931A (en) * | 2019-04-23 | 2019-10-01 | 西北大学 | A kind of image search method |
CN110297931B (en) * | 2019-04-23 | 2021-12-03 | 西北大学 | Image retrieval method |
CN110222771A (en) * | 2019-06-10 | 2019-09-10 | 成都澳海川科技有限公司 | A kind of classification recognition methods of zero samples pictures |
CN111274424B (en) * | 2020-01-08 | 2021-01-19 | 大连理工大学 | Semantic enhanced hash method for zero sample image retrieval |
CN111274424A (en) * | 2020-01-08 | 2020-06-12 | 大连理工大学 | Semantic enhanced hash method for zero sample image retrieval |
CN117390515A (en) * | 2023-11-01 | 2024-01-12 | 江苏君立华域信息安全技术股份有限公司 | Data classification method and system based on deep learning and SimHash |
CN117390515B (en) * | 2023-11-01 | 2024-04-12 | 江苏君立华域信息安全技术股份有限公司 | Data classification method and system based on deep learning and SimHash |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
RJ01 | Rejection of invention patent application after publication | Application publication date: 20171114 |