CN110110128A - Fast supervised discrete hashing image retrieval system for a distributed architecture - Google Patents


Info

Publication number
CN110110128A
CN110110128A (application number CN201910372377.2A)
Authority
CN
China
Prior art keywords
hash
sample
node
matrix
quickly
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201910372377.2A
Other languages
Chinese (zh)
Other versions
CN110110128B (en)
Inventor
陈枫
刘志锋
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Southwest University
Original Assignee
Southwest University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Southwest University filed Critical Southwest University
Priority to CN201910372377.2A priority Critical patent/CN110110128B/en
Publication of CN110110128A publication Critical patent/CN110110128A/en
Application granted granted Critical
Publication of CN110110128B publication Critical patent/CN110110128B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/50 Information retrieval of still image data
    • G06F16/51 Indexing; Data structures therefor; Storage structures
    • G06F16/55 Clustering; Classification
    • G06F16/58 Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually
    • G06F16/583 Retrieval using metadata automatically derived from the content
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02D CLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT]
    • Y02D10/00 Energy efficient computing, e.g. low power processors, power management or thermal management


Abstract

The invention discloses a fast supervised discrete hashing image retrieval system for a distributed architecture, comprising: a sample database, which stores the sample image data uploaded by each node in a distributed storage network model; a kernelization module, which applies a kernel mapping to obtain the corresponding feature matrices; a distributed fast supervised discrete hash learning module, which constructs a distributed fast supervised discrete hash learning model and, through training and optimization, obtains the optimal hash-function projection matrix and the optimal hash-code matrix; a test-sample acquisition module, which obtains test image data; and an image retrieval module, which determines the final image retrieval result by computing the Hamming distance between the hash codes of the test image data and the optimal hash-code matrix. The effect is: by regressing the label information onto the binary hash codes, the learned hash codes acquire good class attributes, which improves retrieval precision, while the computational complexity is reduced, which shortens the training time.

Description

Fast supervised discrete hashing image retrieval system for a distributed architecture
Technical field
The present invention relates to image classification and retrieval techniques, and more specifically to a fast supervised discrete hashing image retrieval system for a distributed architecture.
Background art
In recent years, hashing methods have been widely studied and applied in fields such as object recognition, computer vision, and image retrieval. By constructing hash functions, hashing maps similar high-dimensional data to similar binary hash codes with small Hamming distances. Representing data by hash codes addresses the large storage and retrieval-time costs of massive high-dimensional data, yielding high efficiency in both storage and computation speed. Existing hashing methods fall into two broad classes: data-independent hashing and data-dependent hashing. Data-independent methods do not learn the binary hash codes and hash functions directly from training data; a representative example is locality-sensitive hashing. Data-dependent methods make full use of the information in the training samples to learn binary compressed codes and hash functions, and are further divided into three classes: unsupervised hashing, semi-supervised hashing, and supervised hashing.
Unsupervised hashing does not need label information but learns hash functions directly from the feature information of the data, e.g., iterative quantization, spectral hashing, and anchor graph hashing. Semi-supervised hashing learns hash functions by combining unlabeled and labeled information, e.g., semi-supervised discriminant hashing and semi-supervised manifold-embedding hashing. Supervised hashing learns hash functions from the label information of the training samples; representative supervised methods include supervised discrete hashing, fast supervised discrete hashing, and others.
The learning methods above apply only to learning hash codes and hash functions on a single node, i.e., centralized hashing. In practical applications, however, data are usually distributed across different locations, as in wireless sensor networks or the distributed environment of the World Wide Web. The distributed hashing methods proposed so far include unsupervised distributed hashing and distributed graph hashing, but because of algorithmic instability and high time complexity, these methods have certain shortcomings in both retrieval precision and training time. How to achieve image retrieval under a distributed environment via supervised hashing, both precisely and quickly, is therefore a problem that researchers in this field need to solve.
Summary of the invention
In view of the problems in current research, the object of the invention is to overcome the limitation of centralized hash learning to single-node application scenarios by providing a fast supervised discrete hashing image retrieval system for a distributed architecture, which not only improves the training speed of the hash optimization problem but also improves the precision of image retrieval.
To achieve the above object, the specific technical solution of the present invention is as follows:
A fast supervised discrete hashing image retrieval system for a distributed architecture, the key being that it comprises:
a sample database, which stores the sample image data uploaded by each node in a distributed storage network model, so as to construct a sample data set;
a kernelization module, which applies a kernel mapping to the sample image data uploaded by each node to obtain the corresponding feature matrix;
a distributed fast supervised discrete hash learning module, which constructs a distributed fast supervised discrete hash learning model and, by training and optimizing on the kernelized sample data set, obtains the optimal hash-function projection matrix and the optimal hash-code matrix corresponding to the sample data set;
a test-sample acquisition module, which obtains test image data;
an image retrieval module, which passes the test image data to the kernelization module to obtain the corresponding feature matrix, applies the optimal hash-function projection matrix to obtain the hash codes of the test image data, and then determines the final image retrieval result by computing the Hamming distance between those hash codes and the optimal hash-code matrix.
Optionally, the kernelization module uses a Gaussian kernel, specifically:
K(x) = [exp(-||x - a_1||² / σ), …, exp(-||x - a_q||² / σ)], where x is the feature vector of a sample, q samples randomly selected from the sample image data serve as anchors, a_1, …, a_q are the feature vectors of those anchors, and σ is the preset spread factor of the Gaussian kernel.
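The Gaussian kernelization step can be sketched in a few lines of NumPy. This is an illustrative sketch, not code from the patent; the anchor count q, the bandwidth σ, and the array shapes are assumptions based on the description above.

```python
import numpy as np

def kernelize(X, anchors, sigma):
    """Map each row x of X to [exp(-||x - a_1||^2 / sigma), ..., exp(-||x - a_q||^2 / sigma)].

    X       : (n, d) sample features
    anchors : (q, d) anchor features, randomly drawn from the training data
    sigma   : spread factor of the Gaussian kernel
    """
    # Squared Euclidean distance between every sample and every anchor, shape (n, q).
    sq_dists = ((X[:, None, :] - anchors[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-sq_dists / sigma)

rng = np.random.default_rng(0)
X = rng.standard_normal((100, 8))                     # 100 samples, 8-dim features
anchors = X[rng.choice(100, size=16, replace=False)]  # q = 16 anchors drawn from the data
K = kernelize(X, anchors, sigma=1.0)                  # (100, 16) kernelized feature matrix
```

Because the anchors are themselves training samples, each anchor row attains the maximum kernel value exp(0) = 1 against itself.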
Optionally, the distributed fast supervised discrete hash learning module performs its optimization with the following as the global objective (loss) function, in which: B_l denotes the hash-code matrix corresponding to the sample image data uploaded by the l-th node; Y_l the corresponding label set; W_l the classifier matrix of the l-th node; P_l the hash-function projection matrix of the l-th node; K(X_l) the kernelized feature matrix of the sample image data uploaded by the l-th node; λ and μ regularization coefficients; W_l = W_s the constraint that the classifier matrices of neighboring nodes be identical; P_l = P_s the constraint that the hash-function projection matrices of neighboring nodes be identical; P the number of nodes over which the samples of the sample database are distributed; and N(l) the neighbor set of node l.
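The objective function itself appears only as an image in the source. Based on the symbol definitions above and on the closed-form update B_l = sign(Y_l W_l + μ K(X_l) P_l) used later in the description, a plausible reconstruction of the global loss, stated here as an assumption rather than the patented formula, is:

```latex
\min_{\{B_l, W_l, P_l\}} \;
\sum_{l=1}^{P} \Big(
    \lVert B_l - Y_l W_l \rVert_F^2
  + \lambda \lVert W_l \rVert_F^2
  + \mu \lVert B_l - K(X_l) P_l \rVert_F^2
\Big)
\quad \text{s.t.}\quad
B_l \in \{-1, 1\}^{n_l \times k},\;
W_l = W_s,\; P_l = P_s \;\; \forall\, s \in N(l)
```

Each term is consistent with the later derivation: minimizing over B_l with W_l and P_l fixed yields exactly sign(Y_l W_l + μ K(X_l) P_l), and the W_l subproblem is the ridge regression solved by ADMM in the embodiment.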
Optionally, the optimization procedure of the distributed fast supervised discrete hash learning module is as follows:
first initialize B_l, P_l and W_l; keeping P_l and W_l fixed, find the locally optimal B_l through the global objective loss function;
then, keeping B_l and P_l fixed, find the locally optimal W_l through the global objective loss function;
then, keeping B_l and W_l fixed, find the locally optimal P_l through the global objective loss function;
iterate the above process until the maximum number of iterations is reached or the algorithm converges, so as to obtain the set {P_1, …, P_P} of optimal hash-function projection matrices of all nodes and the optimal hash-code matrix corresponding to the sample data set; the optimal projection matrix of one node is then chosen at random from {P_1, …, P_P} as the global optimal hash-function projection matrix P.
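In the single-node (centralized) special case, where the consensus constraints are vacuous, each of the three alternating steps has a simple closed form. The sketch below illustrates that alternating scheme under the objective reconstruction assumed here; it is not the patent's distributed ADMM procedure, and the sizes, the small ridge term eps in the P-step, and the random data are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)
n, c, q, k = 200, 5, 32, 16               # samples, classes, anchors, code length
mu, lam, eps = 1e-5, 1.0, 1e-6            # mu, lambda as in the text; eps stabilizes the P solve

K = rng.random((n, q))                    # kernelized features K(X)
Y = np.eye(c)[rng.integers(0, c, n)]      # one-hot labels
B = np.sign(rng.standard_normal((n, k)))  # random ±1 initialization of the hash codes
W = rng.standard_normal((c, k))           # classifier matrix
P = rng.standard_normal((q, k))           # hash-function projection matrix

for _ in range(10):
    # B-step (closed form): B = sign(Y W + mu * K P)
    B = np.sign(Y @ W + mu * K @ P)
    B[B == 0] = 1                         # break exact ties away from zero
    # W-step: ridge regression of the codes on the labels
    W = np.linalg.solve(Y.T @ Y + lam * np.eye(c), Y.T @ B)
    # P-step: least-squares fit of the codes in the kernel space
    P = np.linalg.solve(K.T @ K + eps * np.eye(q), K.T @ B)
```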
Optionally, during the training search, the global objective loss function is expressed in augmented Lagrange multiplier form, and the local optima are determined by the alternating direction method of multipliers (ADMM).
Optionally, the image retrieval module determines the hash codes of the test image data according to B_T = sign(K(X_T) P), where K(X_T) is the kernelized feature matrix of the test image data, P is the global optimal hash-function projection matrix, and sign(·) is the sign function.
The remarkable effects of the invention are:
(1) The strong class-discriminative property of binary discrete hash codes is realized in the distributed architecture, namely: the Hamming distance between hash codes of samples from different classes is as large as possible, while the Hamming distance between hash codes of samples of the same class is as small as possible. In distributed hash learning, by regressing the label information onto the binary hash codes, the hash codes learned by the invention acquire good class attributes, which improves retrieval precision.
(2) The invention learns the hash codes in closed form. Moreover, in the step that updates the local projection matrices, learning the binary codes by regressing the label information reduces the computational complexity and thereby greatly shortens the training time.
Detailed description of the invention
The present invention will be further explained below with reference to the drawings and embodiments, in which:
Fig. 1 is the architecture diagram of the distributed network system used in the embodiment of the invention;
Fig. 2 is the functional block diagram of the image retrieval system based on the distributed fast supervised discrete hash learning model;
Fig. 3 compares the mean precision of each algorithm at different code lengths on the CIFAR-10 training set;
Fig. 4 compares the mean average precision (MAP) of each algorithm at different code lengths on the CIFAR-10 training set;
Fig. 5 compares the mean precision of each algorithm at different code lengths on the MNIST training set;
Fig. 6 compares the mean average precision (MAP) of each algorithm at different code lengths on the MNIST training set.
Specific embodiment
To make the technical problems to be solved, the technical solutions and the advantages of the present invention clearer, the invention is described in detail below with reference to the drawings and specific embodiments. It should be understood that the specific embodiments described here serve only to explain the invention and are not intended to limit it.
The fast supervised discrete hashing image retrieval system for a distributed architecture proposed by the invention is applicable to a distributed network model such as the one shown in Fig. 1: the figure shows 10 nodes, randomly connected to one another, forming a distributed wireless ad-hoc network architecture.
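A randomly connected topology like the one in Fig. 1 can be emulated with a random undirected graph that is forced to be connected by first threading a random spanning tree. The node count and the extra-edge probability below are illustrative choices, not values taken from the patent.

```python
import random

def random_connected_graph(num_nodes, extra_edge_prob, seed=0):
    """Return adjacency sets N(l) for a connected, randomly wired undirected graph."""
    rng = random.Random(seed)
    nodes = list(range(num_nodes))
    rng.shuffle(nodes)
    adj = {v: set() for v in nodes}
    # Spanning tree first: attaching each node to an earlier one guarantees connectivity.
    for i in range(1, num_nodes):
        u, v = nodes[i], nodes[rng.randrange(i)]
        adj[u].add(v)
        adj[v].add(u)
    # Then sprinkle random extra edges for the "randomly connected" look of Fig. 1.
    for u in range(num_nodes):
        for v in range(u + 1, num_nodes):
            if v not in adj[u] and rng.random() < extra_edge_prob:
                adj[u].add(v)
                adj[v].add(u)
    return adj

neighbors = random_connected_graph(10, 0.3)   # neighbor set N(l) for each of the 10 nodes
```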
As shown in Fig. 2, the system includes a sample database: the sample image data collected by nodes 1 to P are stored in their respective sample databases, and this distributed storage builds a distributed training set that serves as the training sample data set;
a kernelization module, which applies a kernel mapping to the sample image data uploaded by each node to obtain the corresponding feature matrix;
a distributed fast supervised discrete hash learning module, which constructs a distributed fast supervised discrete hash learning model and, by training and optimizing on the kernelized sample data set, obtains the optimal hash-function projection matrix and the optimal hash-code matrix corresponding to the sample data set;
a test-sample acquisition module, which obtains test image data;
an image retrieval module, which passes the test image data to the kernelization module to obtain the corresponding feature matrix, applies the optimal hash-function projection matrix to obtain the hash codes of the test image data, and then determines the final image retrieval result by computing the Hamming distance between those hash codes and the optimal hash-code matrix.
In a specific implementation, the distributed fast supervised discrete hash learning module performs its optimization with the following as the global objective (loss) function, in which: B_l denotes the hash-code matrix corresponding to the sample image data uploaded by the l-th node; Y_l the corresponding label set; W_l the classifier matrix of the l-th node; P_l the hash-function projection matrix of the l-th node; K(X_l) the kernelized feature matrix of the sample image data uploaded by the l-th node; λ and μ regularization coefficients; W_l = W_s the constraint that the classifier matrices of neighboring nodes be identical; P_l = P_s the constraint that the hash-function projection matrices of neighboring nodes be identical; and P the number of nodes over which the samples of the sample database are distributed.
In the network model, the training image feature set of the l-th node is denoted X_l ∈ R^(n_l×d), where n_l and d are, respectively, the number of samples at that node and the feature dimension; x_i^l denotes the feature representation of the i-th training image of that node, and R is the set of real numbers. The total training image set is X = [X_1; …; X_P], whose total sample number n is the sum of n_l over all nodes.
The label set corresponding to the training image set of the l-th node is denoted Y_l ∈ {0,1}^(n_l×c), where n_l and c are, respectively, the number of labels (one per sample at that node) and the number of classes; y_i^l denotes the label vector of the i-th training image sample of node l, with {0,1} marking class membership: if x_i^l belongs to class z, the z-th entry of y_i^l is 1, and otherwise 0.
The test image feature set used for retrieval is denoted X_T ∈ R^(m×d), where m is the number of test samples and x_i^T denotes the feature representation of the i-th test image; the corresponding test label set is Y_T.
q samples are randomly selected from the total training image feature set as the anchor set {a_1, …, a_q}, which serves as the reference for the Gaussian kernel mapping of the samples.
The training-sample features and test-sample features of the P nodes are normalized and then mapped into the kernel space by the Gaussian nonlinear kernel, giving the kernelized feature matrices K(X_l) and K(X_T) of the training and test samples, where the Gaussian kernel is:
K(x) = [exp(-||x - a_1||² / σ), …, exp(-||x - a_q||² / σ)], x being the feature vector of a sample, the q anchors being randomly selected from the sample image data, a_1, …, a_q being their feature vectors, and σ the preset spread factor of the Gaussian kernel.
For the training samples on the P nodes, the optimization procedure of the distributed fast supervised discrete hash learning module first initializes the hash-code matrix B_l ∈ {-1,1}^(n_l×k) of node l by random generation, where k denotes the length of the hash codes.
At the same time, the classifier matrix W and the hash-function projection matrix P in the objective function are both generated at random and initialized as matrices shared by all nodes, i.e., for any node l ∈ {1, 2, …, P}, W_l = W and P_l = P; the Lagrange multipliers Θ_l and Γ_l are initialized for the subsequent separate solves of the classifier matrix W and the hash-function projection matrix P; and the regularization coefficients λ, μ, α, β and the maximum number of iterations T are initialized. In a specific implementation, λ, μ, α, β are initialized to 1, 1e-5, 1e-3 and 1e-3, respectively, and the number of iterations is initialized to 10.
According to the global objective loss function, first, keeping P_l and W_l fixed, the locally optimal B_l is sought.
The local objective function of node l can then be reduced to
min ||B_l − Y_l W_l||²_F + μ ||B_l − K(X_l) P_l||²_F, with B_l ∈ {−1,1}^(n_l×k).
Further decomposing and simplifying this local objective (the terms quadratic in B_l are constant on {−1,1}) finally yields the optimization target
max tr(B_lᵀ (Y_l W_l + μ K(X_l) P_l)), B_l ∈ {−1,1}^(n_l×k),
which gives the hash-code matrix in closed form: B_l = sign(Y_l W_l + μ K(X_l) P_l).
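That sign(·) really is the maximizer can be checked numerically: maximizing tr(Bᵀ M) over B ∈ {−1,1} with M = Y_l W_l + μ K(X_l) P_l decomposes entrywise, so B_ij = sign(M_ij) beats every other sign pattern. A brute-force check on a tiny matrix (sizes are illustrative, not from the patent):

```python
import itertools
import numpy as np

rng = np.random.default_rng(2)
M = rng.standard_normal((2, 3))     # stands in for Y_l W_l + mu * K(X_l) P_l
best = np.sign(M)                   # claimed closed-form maximizer of tr(B^T M)

# Enumerate all 2^6 sign matrices and score tr(B^T M) = sum of entrywise products.
scores = []
for bits in itertools.product([-1.0, 1.0], repeat=M.size):
    B = np.array(bits).reshape(M.shape)
    scores.append((B * M).sum())

assert (best * M).sum() >= max(scores) - 1e-12
```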
Updating all nodes in order by the above process finally yields the hash-code matrices {B_1, …, B_P}.
Then, keeping B_l and P_l fixed, the locally optimal W_l is found through the global objective loss function;
the local objective function of node l now becomes
min ||B_l − Y_l W_l||²_F + λ ||W_l||²_F, subject to W_l = W_s for all s ∈ N(l).
To solve this constrained optimization problem, the present invention uses the alternating direction method of multipliers (ADMM). ADMM is a variant of the augmented Lagrangian method; it supports the splitting of variables and offers good convergence. The global objective above is therefore expressed in augmented Lagrange multiplier form,
in which Θ_l^s is the Lagrange multiplier of the constraint W_l = W_s and α is the penalty parameter of the augmented Lagrangian. ADMM solves this optimization by repeating two steps, a primal update of W_l followed by an update of the multipliers,
which after simplification yield a closed-form update for W_l,
where a combined (summed) Lagrange multiplier is substituted during the simplification to further simplify the optimization problem.
Using the above method, the classifier matrices {W_1, …, W_P} and Lagrange multipliers {Θ_1, …, Θ_P} of all nodes are updated in order from node 1 to node P. Because a consensus constraint is imposed on the classifier matrix of every node, theoretical analysis shows that the classifier matrices of the nodes are in practice essentially identical.
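The closed-form W_l update appears only as an image in the source. One standard decentralized consensus-ADMM treatment of the ridge subproblem min ||B_l − Y_l W_l||² + λ||W_l||² s.t. W_l = W_s is sketched below: each node solves a regularized normal equation against its neighbors' previous iterates, then updates its multiplier by the local consensus residual. The exact multiplier bookkeeping in the patent may differ, so treat this as an assumed generic variant, not the patented update; topology, sizes and coefficients are illustrative.

```python
import numpy as np

rng = np.random.default_rng(3)
P_nodes, n_l, c, k = 4, 50, 5, 8
lam, alpha, iters = 1.0, 1e-1, 50

# Ring topology: node l exchanges with its two neighbours.
N = {l: [(l - 1) % P_nodes, (l + 1) % P_nodes] for l in range(P_nodes)}

Y = [np.eye(c)[rng.integers(0, c, n_l)] for _ in range(P_nodes)]       # local labels
B = [np.sign(rng.standard_normal((n_l, k))) for _ in range(P_nodes)]   # local codes
W = [rng.standard_normal((c, k)) for _ in range(P_nodes)]              # local classifiers
theta = [np.zeros((c, k)) for _ in range(P_nodes)]                     # multipliers

def consensus_gap(W):
    return max(np.abs(W[l] - W[s]).max() for l in range(P_nodes) for s in N[l])

gap0 = consensus_gap(W)
for _ in range(iters):
    W_prev = [w.copy() for w in W]
    for l in range(P_nodes):                     # primal update, node by node
        d_l = len(N[l])
        lhs = Y[l].T @ Y[l] + (lam + 2 * alpha * d_l) * np.eye(c)
        rhs = Y[l].T @ B[l] - theta[l] + alpha * sum(W_prev[l] + W_prev[s] for s in N[l])
        W[l] = np.linalg.solve(lhs, rhs)
    for l in range(P_nodes):                     # multiplier update on the consensus residual
        theta[l] = theta[l] + alpha * sum(W[l] - W[s] for s in N[l])
```

After the iterations, the per-node classifiers should be driven toward a common matrix, which mirrors the consensus behavior described in the text.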
Then, keeping B_l and W_l fixed, the locally optimal P_l is found through the global objective loss function;
the local objective function of node l now becomes
min μ ||B_l − K(X_l) P_l||²_F, subject to P_l = P_s for all s ∈ N(l).
To solve this constrained optimization problem, the present invention uses the same alternating direction method of multipliers (ADMM) as in the optimization of W_l, and the global objective is again expressed in augmented Lagrange multiplier form,
in which Γ_l^s is the Lagrange multiplier of the constraint P_l = P_s and β is the penalty parameter of the augmented Lagrangian. To solve this optimization problem efficiently, the invention again uses ADMM with simplified Lagrange multipliers. By a derivation similar to that for W_l, ADMM solves the optimization by repeating two analogous steps,
where a combined (summed) Lagrange multiplier is again substituted during the simplification to further simplify the optimization problem.
The above process yields the hash-function projection matrix P_l and Lagrange multiplier Γ_l of a single node in the network model; using the above method, the hash-function projection matrices {P_1, …, P_P} and Lagrange multipliers {Γ_1, …, Γ_P} of all nodes are updated in order from node 1 to node P. Because a consensus constraint is imposed on the projection matrix of every node, the optimal projection matrix of one node is randomly selected from the projection matrices {P_1, …, P_P} of all nodes as the global optimal hash-function projection matrix P.
Finally, the above process is iterated until the maximum number of iterations is reached or the algorithm converges, yielding the optimal hash-function projection matrices {P_1, …, P_P} and the optimal hash-code matrix B corresponding to the sample data set.
Then, for the kernelized test samples K(X_T), the hash codes of the test samples are obtained with the trained optimal hash-function projection matrix: B_T = sign(K(X_T) P), where sign(·) is the sign function. When the Hamming distance between a training sample's code and a query's code is smaller than the Hamming radius r, the training sample and the query are taken to belong to the same class.
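Retrieval then reduces to Hamming distances between ±1 code matrices: for k-bit codes b1, b2 ∈ {−1,1}^k, the Hamming distance is (k − ⟨b1, b2⟩)/2, so a whole distance matrix is one matrix product. The sketch below uses random stand-ins for the learned projection and kernelized features; all sizes and the radius are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(4)
k, n_train, m = 16, 500, 3
P_proj = rng.standard_normal((32, k))   # stands in for the learned projection matrix P
K_train = rng.random((n_train, 32))     # kernelized training features K(X_l)
K_test = rng.random((m, 32))            # kernelized test features K(X_T)

B_train = np.sign(K_train @ P_proj)     # database codes
B_test = np.sign(K_test @ P_proj)       # B_T = sign(K(X_T) P)

# For codes in {-1, +1}^k, Hamming distance = (k - inner product) / 2.
ham = (k - B_test @ B_train.T) / 2      # (m, n_train) distance matrix

r = 2                                   # Hamming radius
matches = [np.flatnonzero(row <= r) for row in ham]   # candidate set per query
```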
To further demonstrate the computational effect of the invention, this embodiment also downloads the single-label data sets CIFAR-10 and MNIST. The distributed model is built as shown in Fig. 1: the distributed network model is represented by a randomly connected undirected graph whose vertices are the nodes of the network, and the training data are distributed over the nodes.
For the data set CIFAR-10, 1000 samples are chosen as the test set, and the remaining 59000 samples are evenly distributed over the 10 nodes, each node holding 5900 samples. Each sample has 512-dimensional features. The training samples of each node are denoted X_l ∈ R^(n_l×d), here with n_l = 5900 training samples and d = 512 dimensions per sample; the row vector x_i^l denotes the i-th training sample of the l-th node. The test set is X_T ∈ R^(m×d), here with m = 1000 test samples; the row vector x_i^T denotes the i-th test sample. Finally, 1000 samples are chosen from the training samples as anchors, denoted {a_1, …, a_q} with q = 1000.
For the data set MNIST, 1000 samples are chosen as the test set, and the remaining 69000 samples are evenly distributed over the 10 nodes, each node holding 6900 samples. Each sample has 784-dimensional features. The training samples of each node are denoted X_l ∈ R^(n_l×d), here with n_l = 6900 and d = 784; the row vector x_i^l denotes the i-th training sample of the l-th node. The test set is X_T ∈ R^(m×d), here with m = 1000; the row vector x_i^T denotes the i-th test sample. Finally, 1000 samples are chosen from the training samples as anchors, denoted {a_1, …, a_q} with q = 1000.
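The hold-out split and even partition across nodes can be sketched with a small helper. The function itself is an illustrative assumption, not code from the patent; it is shown with the CIFAR-10 numbers (60000 samples, 1000 test, 10 nodes).

```python
import numpy as np

def partition(n_total, n_test, n_nodes, seed=0):
    """Shuffle indices, hold out n_test for testing, split the rest evenly across nodes."""
    rng = np.random.default_rng(seed)
    idx = rng.permutation(n_total)
    test_idx, train_idx = idx[:n_test], idx[n_test:]
    return test_idx, np.array_split(train_idx, n_nodes)

# CIFAR-10 setup: 59000 training samples spread over 10 nodes, 5900 each.
test_idx, node_idx = partition(60000, 1000, 10)
```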
Based on the above data, we compare the invention (denoted DFSDH) with several typical centralized algorithms, AGH, KSH, SDH and FSDH, and with the distributed algorithms SupDisH and DisH, where SupDisH is a supervised distributed hash learning method and DisH an unsupervised distributed hashing method. The two data sets CIFAR-10 and MNIST are used to assess the performance of the invention in terms of training time, mean precision and mean average precision (MAP). For a fair comparison, in the experiments on SupDisH and on the invention the regularization coefficients λ, μ, α, β are initialized to 1, 1e-5, 1e-3 and 1e-3, respectively, the number of iterations is initialized to 10, and the Hamming radius is initialized to r = 2. The distributed network model is represented by a randomly connected undirected graph with 10 vertices. All experiments are run in an environment with a 2.53 GHz Intel Xeon CPU and 16 GB RAM.
Figs. 3 and 4 respectively compare the mean precision and MAP of each algorithm at different code lengths on the CIFAR-10 training set. The invention is similar to SupDisH in retrieval performance and, compared with the centralized methods above, has a significant advantage in both mean precision and MAP. Furthermore, because it uses supervision information, the retrieval performance of the system is markedly better than that of DisH.
Figs. 5 and 6 respectively compare the mean precision and MAP of each algorithm at different code lengths on the MNIST training set. The invention is likewise similar to SupDisH in retrieval performance and, compared with the centralized methods and DisH, has a significant advantage in both mean precision and MAP.
Table 1: training time on the CIFAR-10 training set
Table 2: training time on the MNIST training set
Tables 1 and 2 respectively compare the training time of the invention with that of the SDH, FSDH and SupDisH algorithms on the CIFAR-10 and MNIST training sets. Both experiments show that, compared with SupDisH, the invention saves a large amount of training time.
In conclusion, the invention solves the supervised discrete hash learning problem for image sets in a distributed environment. Moreover, according to the specific examples above, the experimental results fully show that, for the image retrieval problem on single-label images, the method achieves higher retrieval precision and training speed, thereby realizing effective classification of single-label images.
Finally, it should be noted that, as used herein, the terms "include", "comprise" or any other variant thereof are intended to cover non-exclusive inclusion, so that a process, method, article or device comprising a series of elements includes not only those elements but also other elements not explicitly listed, or elements inherent to such a process, method, article or device. In the absence of further limitation, an element defined by the phrase "comprising a …" does not exclude the presence of other identical elements in the process, method, article or device comprising that element.
The serial numbers of the above embodiments are for description only and do not represent the relative merits of the embodiments. Through the description of the above embodiments, those skilled in the art will clearly understand that the methods of the embodiments can be implemented by software plus a necessary general-purpose hardware platform, or of course by hardware, although in many cases the former is the better implementation. Based on this understanding, the technical solution of the present invention, or the part of it contributing to the prior art, may be embodied in the form of a software product stored in a storage medium (such as ROM/RAM, magnetic disk or optical disk) and including instructions for causing a terminal (which may be a mobile phone, computer, server, air conditioner, network device, etc.) to execute the methods described in the embodiments of the present invention.
The embodiments of the present invention have been described above with reference to the drawings, but the invention is not limited to the specific embodiments above, which are merely illustrative and not restrictive. Under the inspiration of the present invention, those skilled in the art can devise many further forms without departing from the scope protected by the purpose of the invention and the claims, all of which fall within the protection of the invention.

Claims (6)

1. A fast supervised discrete hashing image retrieval system for a distributed architecture, characterized by comprising:
a sample database, which stores the sample image data uploaded by each node in a distributed storage network model, so as to construct a sample data set;
a kernelization module, which applies a kernel mapping to the sample image data uploaded by each node to obtain the corresponding feature matrix;
a distributed fast supervised discrete hash learning module, which constructs a distributed fast supervised discrete hash learning model and, by training and optimizing on the kernelized sample data set, obtains the optimal hash-function projection matrix and the optimal hash-code matrix corresponding to the sample data set;
a test-sample acquisition module, which obtains test image data;
an image retrieval module, which passes the test image data to the kernelization module to obtain the corresponding feature matrix, applies the optimal hash-function projection matrix to obtain the hash codes of the test image data, and then determines the final image retrieval result by computing the Hamming distance between those hash codes and the optimal hash-code matrix.
2. The fast supervised discrete hashing image retrieval system for a distributed architecture according to claim 1, characterized in that the kernelization module uses a Gaussian kernel, specifically: K(x) = [exp(-||x - a_1||² / σ), …, exp(-||x - a_q||² / σ)], where x is the feature vector of a sample, q samples randomly selected from the sample image data serve as anchors, a_1, …, a_q are the feature vectors of those anchors, and σ is the preset spread factor of the Gaussian kernel.
3. The fast supervised discrete hash image retrieval system for a distributed architecture according to claim 1, characterized in that the distributed fast supervised discrete hash learning module carries out optimization with the global objective loss function
min_{B_l, W_l, P_l} Σ_{l=1}^{p} ( ||B_l - Y_l W_l||^2 + λ||W_l||^2 + μ||B_l - K(X_l) P_l||^2 ), s.t. B_l ∈ {-1, 1}, W_l = W_s, P_l = P_s for all s ∈ N(l),
in which: B_l denotes the hash code matrix corresponding to the sample image data uploaded by node l, Y_l denotes the label set corresponding to the sample image data uploaded by node l, W_l denotes the classifier matrix of node l, P_l denotes the hash function projection matrix of node l, K(X_l) denotes the kernelized feature matrix of the sample image data uploaded by node l, λ and μ denote regularization coefficients, W_l = W_s indicates that the classifier matrices of neighboring nodes are identical, P_l = P_s indicates that the hash function projection matrices of neighboring nodes are identical, p is the number of nodes corresponding to the samples selected from the sample database, and N(l) denotes the set of neighbor nodes of node l.
4. The fast supervised discrete hash image retrieval system for a distributed architecture according to claim 3, characterized in that the optimization process of the distributed fast supervised discrete hash learning module is as follows:
first initialize B_l, P_l and W_l; with P_l and W_l held fixed, find the local optimum B_l through the global objective loss function;
then, with B_l and P_l held fixed, find the local optimum W_l through the global objective loss function;
then, with B_l and W_l held fixed, find the local optimum P_l through the global objective loss function;
iterate the above process until the maximum number of iterations is reached or the algorithm converges, thereby obtaining the set of optimal hash function projection matrices {P_1, ..., P_p} of all nodes and the optimal hash code matrix corresponding to the sample data set; and randomly select the optimal projection matrix of one node from {P_1, ..., P_p} as the global optimal hash function projection matrix P.
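The alternating loop of claim 4 can be sketched for a single node. This is a simplified, non-distributed sketch, not the patent's algorithm: it assumes an FSDH-style objective min ||B - Y W||^2 + lam*||W||^2 + mu*||B - K P||^2 with B in {-1, +1}, and every function name, shape, and hyperparameter is an illustrative assumption:

```python
import numpy as np

def fsdh_train(K, Y, r, lam=1.0, mu=1e-2, iters=10, seed=0):
    """Alternating optimization for a single-node FSDH-style objective
    min_{B,W,P} ||B - Y W||^2 + lam*||W||^2 + mu*||B - K P||^2,  B in {-1, +1}.

    K: (n, q) kernelized feature matrix, Y: (n, c) one-hot label matrix,
    r: hash code length.  Every subproblem has a closed-form solution.
    """
    rng = np.random.default_rng(seed)
    q = K.shape[1]
    c = Y.shape[1]
    W = rng.normal(size=(c, r))            # initialize classifier matrix
    P = rng.normal(size=(q, r))            # initialize projection matrix
    YtY = Y.T @ Y + lam * np.eye(c)        # cached for the W-step
    KtK = K.T @ K + 1e-6 * np.eye(q)       # small ridge for numerical stability
    for _ in range(iters):
        # B-step (P, W fixed): elementwise sign, the closed form that keeps it fast
        B = np.where(Y @ W + mu * (K @ P) >= 0, 1.0, -1.0)
        # W-step (B, P fixed): regularized least squares against the codes
        W = np.linalg.solve(YtY, Y.T @ B)
        # P-step (B, W fixed): least-squares fit of the codes from kernel features
        P = np.linalg.solve(KtK, K.T @ B)
    return B, W, P

# toy usage: 50 samples, 8-dim kernelized features, 4 classes, 16-bit codes
rng = np.random.default_rng(1)
K = rng.normal(size=(50, 8))
Y = np.eye(4)[rng.integers(0, 4, size=50)]   # one-hot labels
B, W, P = fsdh_train(K, Y, r=16)
print(B.shape)  # (50, 16)
```

The distributed variant of claims 3 to 5 would add consensus terms coupling each node's W_l and P_l with those of its neighbors, typically handled with the augmented Lagrangian and ADMM machinery that claim 5 describes.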
5. The fast supervised discrete hash image retrieval system for a distributed architecture according to claim 4, characterized in that, in the training and optimization process, the global objective loss function is expressed with an augmented Lagrangian multiplier formulation, and the local optimization targets are determined with the alternating direction method of multipliers (ADMM).
6. The fast supervised discrete hash image retrieval system for a distributed architecture according to claim 3, characterized in that the image retrieval module determines the hash code B_T of the test image data according to B_T = sign(K(X_T)P), where K(X_T) is the kernelized feature matrix of the test image data, P is the global optimal hash function projection matrix, and sign(·) is the sign function.
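The retrieval step of claim 6 (binarize the projected test features, then rank database items by Hamming distance) can be sketched as follows; the function names and data shapes are illustrative assumptions:

```python
import numpy as np

def hash_codes(K_X, P):
    """B = sign(K(X) P): project kernelized features with P and binarize to +/-1 codes."""
    return np.where(K_X @ P >= 0, 1, -1)

def retrieve(query_code, db_codes, topk=5):
    """Rank database items by Hamming distance to the query's +/-1 code.

    For +/-1 codes of length r, Hamming distance = (r - <b_query, b_i>) / 2,
    so a single matrix-vector product ranks the whole database.
    """
    r = db_codes.shape[1]
    dist = (r - db_codes @ query_code) // 2
    order = np.argsort(dist, kind="stable")
    return order[:topk], dist[order[:topk]]

# toy usage: 20 database images, 8-dim kernelized features, 16-bit codes
rng = np.random.default_rng(2)
P = rng.normal(size=(8, 16))            # stands in for the learned projection matrix
db = hash_codes(rng.normal(size=(20, 8)), P)
idx, d = retrieve(db[7], db, topk=3)    # query with item 7's own code
print(d[0])  # 0
```

Note that sign(0) must be assigned consistently on both the database and query sides; here it maps to +1 via np.where(x >= 0, 1, -1).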
CN201910372377.2A 2019-05-06 2019-05-06 Fast supervised discrete hash image retrieval system for distributed architecture Active CN110110128B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910372377.2A CN110110128B (en) 2019-05-06 2019-05-06 Fast supervised discrete hash image retrieval system for distributed architecture

Publications (2)

Publication Number Publication Date
CN110110128A true CN110110128A (en) 2019-08-09
CN110110128B CN110110128B (en) 2023-04-07

Family

ID=67488310

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910372377.2A Active CN110110128B (en) 2019-05-06 2019-05-06 Fast supervised discrete hash image retrieval system for distributed architecture

Country Status (1)

Country Link
CN (1) CN110110128B (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102553111A (en) * 2011-11-16 2012-07-11 四川红光汽车机电有限公司 Photoelectric observing and aiming system of fire truck
US20140143251A1 (en) * 2012-11-19 2014-05-22 The Penn State Research Foundation Massive clustering of discrete distributions
CN105868743A (en) * 2016-05-31 2016-08-17 天津中科智能识别产业技术研究院有限公司 Face retrieval method based on rapid supervised discrete hashing
CN109063113A (en) * 2018-07-30 2018-12-21 成都快眼科技有限公司 Fast image retrieval method based on asymmetric deep discrete hashing, retrieval model and model construction method

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
TIECHENG SONG et al.: "Semi-Supervised Manifold-Embedded Hashing with Joint Feature Representation and Classifier Learning", PATTERN RECOGNITION *
ZHU HAN: "Research on Deep Hash Learning Methods for Large-Scale Multimedia Retrieval", CHINA MASTER'S THESES FULL-TEXT DATABASE (INFORMATION SCIENCE AND TECHNOLOGY) *

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111754313A (en) * 2020-07-03 2020-10-09 南京大学 Efficient communication projection-free distributed data online classification method
CN111754313B (en) * 2020-07-03 2023-09-26 南京大学 Efficient communication online classification method for distributed data without projection
CN112905599A (en) * 2021-03-18 2021-06-04 南京邮电大学 Distributed deep hash retrieval method based on end-to-end
CN112905599B (en) * 2021-03-18 2022-10-14 南京邮电大学 Distributed deep hash retrieval method based on end-to-end
CN113191445A (en) * 2021-05-16 2021-07-30 中国海洋大学 Large-scale image retrieval method based on self-supervision countermeasure Hash algorithm
CN113191445B (en) * 2021-05-16 2022-07-19 中国海洋大学 Large-scale image retrieval method based on self-supervision countermeasure Hash algorithm
CN114022701A (en) * 2021-10-21 2022-02-08 南京审计大学 Image classification method based on neighbor supervision discrete discrimination Hash
CN114022701B (en) * 2021-10-21 2022-06-24 南京审计大学 Image classification method based on neighbor supervision discrete discrimination Hash

Also Published As

Publication number Publication date
CN110110128B (en) 2023-04-07

Similar Documents

Publication Publication Date Title
CN110110128A (en) The discrete hashing image searching system of quickly supervision for distributed structure/architecture
Bhandari A novel beta differential evolution algorithm-based fast multilevel thresholding for color image segmentation
Weiss et al. Spectral hashing
CN108132968A (en) Network text is associated with the Weakly supervised learning method of Semantic unit with image
Liaw et al. Fast exact k nearest neighbors search using an orthogonal search tree
CN107943938A (en) A kind of large-scale image similar to search method and system quantified based on depth product
Forero et al. Robust clustering using outlier-sparsity regularization
CN108875076B (en) Rapid trademark image retrieval method based on Attention mechanism and convolutional neural network
Wang et al. Learning a discriminative distance metric with label consistency for scene classification
CN114092747A (en) Small sample image classification method based on depth element metric model mutual learning
CN116580257A (en) Feature fusion model training and sample retrieval method and device and computer equipment
CN110263855A (en) A method of it is projected using cobasis capsule and carries out image classification
Senanayake et al. Self-organizing nebulous growths for robust and incremental data visualization
Kishore et al. A Multi-class SVM Based Content Based Image Retrieval System Using Hybrid Optimization Techniques.
CN113066528B (en) Protein classification method based on active semi-supervised graph neural network
CN107798331B (en) Method and device for extracting characteristics of off-zoom image sequence
CN107133348B (en) Approximate searching method based on semantic consistency in large-scale picture set
CN108549915A (en) Image hash code training pattern algorithm based on two-value weight and classification learning method
Côme et al. Self organizing star (sos) for health monitoring
JP2018195270A (en) Local feature expression learning device and method
CN104700439B (en) The human face portrait synthetic method drawn a portrait based on individual target
Adinugroho et al. Leaves classification using neural network based on ensemble features
Fushimi et al. Accelerating Greedy K-Medoids Clustering Algorithm with Distance by Pivot Generation
CN111709478B (en) Fuzzy clustering method and device based on anchor graph
CN112215272A (en) Bezier curve-based image classification neural network attack method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant