CN110399897A - Image-recognizing method and device - Google Patents
- Publication number: CN110399897A
- Application number: CN201910286523.XA
- Authority: CN (China)
- Prior art keywords: image, matrix, images, parameter, binary
- Legal status: Granted (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
- G06F18/21—Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
- G06F18/214—Generating training patterns; Bootstrap methods, e.g. bagging or boosting
- G06F18/23—Clustering techniques
Landscapes
- Engineering & Computer Science (AREA)
- Data Mining & Analysis (AREA)
- Theoretical Computer Science (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Bioinformatics & Cheminformatics (AREA)
- Bioinformatics & Computational Biology (AREA)
- Artificial Intelligence (AREA)
- Evolutionary Biology (AREA)
- Evolutionary Computation (AREA)
- Physics & Mathematics (AREA)
- General Engineering & Computer Science (AREA)
- General Physics & Mathematics (AREA)
- Life Sciences & Earth Sciences (AREA)
- Image Analysis (AREA)
Abstract
Embodiments of the present invention provide an image recognition method and device. The method includes: selecting a first image group and a second image group from L images; calculating, according to the image features of each image in the first image group and the second image group, the image relevance between each image in the first image group and each image in the second image group; initializing the parameters of an objective function; iteratively updating the parameters of the objective function at least once to obtain the iteratively updated objective function and determine the cluster centres of the L images; binary-coding the image features of an image to be recognized according to the hash function in the updated objective function to obtain binary-coded data; and recognizing the image to be recognized according to its binary-coded data and the binary-coded data of each image in the cluster centres of the L images. The accuracy of image recognition is thereby improved.
Description
Technical field
Embodiments of the present invention relate to image processing technology, and in particular to an image recognition method and device.
Background art
Image recognition technology is a key field of artificial intelligence. It performs object recognition on images in order to identify targets of various modes presented in pictures. For example, if the object in an image of a first mode is a Husky running, and the object in an image of a second mode is an Akita standing, image recognition technology can identify that the object in the image of the first mode and the object in the image of the second mode are both dogs.
Image recognition algorithms can be divided into algorithms based on global features and algorithms based on local features. An algorithm based on local features treats an image as a combination of multiple local blocks, extracts a local feature from each block, and then concatenates these local features into a single vector that represents the image. When distinguishing images, the human perception system discriminates according to the more prominent features of the objects in the images; algorithms based on local features therefore match the way the human perception system understands images.
Image recognition algorithms based on local features include algorithms based on hand-crafted features and algorithms based on feature learning. In the prior art, when a feature-learning-based algorithm trains the objective function with training images, the optimization of the hash function parameters and the optimization of the cluster centres are usually separated: the hash function parameters are optimized first, and the cluster centres are optimized afterwards. As a result, the hash function and cluster centres obtained by training adapt poorly to images, which reduces the accuracy of image recognition when the hash function and cluster centres are applied, leading to recognition errors.
Summary of the invention
Embodiments of the present invention provide an image recognition method and device to solve the problem that prior-art image recognition methods cannot accurately recognize images.
In a first aspect, an embodiment of the present invention provides an image recognition method, comprising: selecting a first image group and a second image group from L images, wherein the first image group includes N images, the second image group includes M images, and the images of the first image group and the images of the second image group are not exactly the same; wherein L is greater than or equal to 2, N is greater than or equal to 1 and less than or equal to L, and M is greater than or equal to 1 and less than or equal to L;
calculating, according to the image features of each image in the first image group and the second image group, the image relevance between each image in the first image group and each image in the second image group;
initializing the parameters of an objective function according to the image relevance between each image in the first image group and each image in the second image group, wherein the objective function includes a hash function and a cluster centre function, the hash function is used to binary-code image features, and the cluster centre function is used to obtain, among the L images, cluster centres with image feature consistency, each cluster centre including at least one image; the parameters of the hash function include a first binary-coding parameter, a second binary-coding parameter, and a prediction scaling variable, and the parameters of the cluster centre function include a first orthogonal projection matrix and a second orthogonal projection matrix;
iteratively updating the parameters of the objective function to obtain the iteratively updated objective function and determine the cluster centres of the L images, the number of iterative updates being at least one;
binary-coding the image features of an image to be recognized according to the hash function in the iteratively updated objective function to obtain binary-coded data;
recognizing the image to be recognized according to its binary-coded data and the binary-coded data of each image in the cluster centres of the L images.
Optionally, initializing the parameters of the objective function according to the image relevance between each image in the first image group and each image in the second image group comprises:

obtaining an image relevance matrix according to the image relevance between each image in the first image group and each image in the second image group;

initializing the parameters of the objective function according to the image relevance matrix;

wherein the image relevance matrix is an N×M matrix in which the element in row i, column j represents the image relevance between the i-th image of the first image group and the j-th image of the second image group; alternatively, the image relevance matrix is an M×N matrix in which the element in row j, column i represents the image relevance between the j-th image of the second image group and the i-th image of the first image group;

wherein i is greater than or equal to 1 and less than or equal to N, and j is greater than or equal to 1 and less than or equal to M.
Optionally, initializing the parameters of the objective function according to the image relevance matrix comprises:

obtaining the transpose of the image relevance matrix;

obtaining the covariance matrix of the image relevance matrix and the covariance matrix of its transpose, respectively;

obtaining partial eigenvectors of the covariance matrix of the image relevance matrix according to the projection matrix of the image relevance matrix, and obtaining partial eigenvectors of the covariance matrix of the transpose according to the projection matrix of the transpose;

initializing the first binary-coding parameter according to the first orthogonal projection matrix and the partial eigenvectors of the covariance matrix of the image relevance matrix, and initializing the second binary-coding parameter according to the second orthogonal projection matrix and the partial eigenvectors of the covariance matrix of the transpose.
Optionally, iteratively updating the parameters of the objective function to obtain the iteratively updated objective function comprises:

at each iterative update, determining one parameter of the objective function as the parameter to be updated, fixing the other parameters, and updating the parameter to be updated so that the objective function satisfies a preset condition;

recording the number of iterations, and when the number of iterations is greater than or equal to a preset number of iterations, stopping the updating of the parameters of the objective function to obtain the iteratively updated objective function.
Optionally, recognizing the image to be recognized according to its binary-coded data and the binary-coded data of each image in the cluster centres of the L images comprises:

calculating the Euclidean distance between the binary-coded data of the image to be recognized and the binary-coded data of each image in the cluster centres of the L images;

recognizing the image to be recognized according to the Euclidean distances.
Optionally, the method further comprises:

dividing the image into at least two local blocks;

obtaining the image features of each local block;

merging the image features of the at least two local blocks to obtain the image features of the image.
Optionally, obtaining the image features of each local block comprises: obtaining the image features of each local block using a histogram of oriented gradients (HOG) algorithm.
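For illustration, the local-block feature extraction described above (divide the image into blocks, build a gradient-orientation histogram per block, concatenate) can be sketched as follows. This is a simplified stand-in for the full HOG algorithm, not the patent's exact implementation; the block and bin counts are illustrative assumptions.

```python
import numpy as np

def block_hog(img, n_blocks=(2, 2), n_bins=9):
    """Minimal HOG-style descriptor: per-block gradient-orientation
    histograms concatenated into one feature vector for the image."""
    gy, gx = np.gradient(img.astype(float))            # row and column gradients
    mag = np.hypot(gx, gy)                             # gradient magnitude
    ang = np.rad2deg(np.arctan2(gy, gx)) % 180.0       # unsigned orientation
    h, w = img.shape
    bh, bw = h // n_blocks[0], w // n_blocks[1]
    feats = []
    for bi in range(n_blocks[0]):
        for bj in range(n_blocks[1]):
            sl = (slice(bi * bh, (bi + 1) * bh), slice(bj * bw, (bj + 1) * bw))
            hist, _ = np.histogram(ang[sl], bins=n_bins, range=(0, 180),
                                   weights=mag[sl])    # magnitude-weighted bins
            feats.append(hist / (np.linalg.norm(hist) + 1e-12))  # block-normalize
    return np.concatenate(feats)                       # one vector per image

img = np.arange(64, dtype=float).reshape(8, 8)         # toy 8x8 "image"
f = block_hog(img)
print(f.shape)   # (36,) = 2*2 blocks x 9 bins
```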
In a second aspect, an embodiment of the present invention provides an image recognition device, comprising:

a selection module, configured to select a first image group and a second image group from L images, wherein the first image group includes N images, the second image group includes M images, and the images of the first image group and the images of the second image group are not exactly the same; wherein L is greater than or equal to 2, N is greater than or equal to 1 and less than or equal to L, and M is greater than or equal to 1 and less than or equal to L;

an association module, configured to calculate, according to the image features of each image in the first image group and the second image group, the image relevance between each image in the first image group and each image in the second image group;

an initialization module, configured to initialize the parameters of an objective function according to the image relevance between each image in the first image group and each image in the second image group, wherein the objective function includes a hash function and a cluster centre function, the hash function is used to binary-code image features, and the cluster centre function is used to obtain, among the L images, cluster centres with image feature consistency, each cluster centre including at least one image; the parameters of the hash function include a first binary-coding parameter, a second binary-coding parameter, and a prediction scaling variable, and the parameters of the cluster centre function include a first orthogonal projection matrix and a second orthogonal projection matrix;

an update module, configured to iteratively update the parameters of the objective function to obtain the iteratively updated objective function and determine the cluster centres of the L images, the number of iterative updates being at least one;

a coding module, configured to binary-code the image features of an image to be recognized according to the hash function in the iteratively updated objective function to obtain binary-coded data;

a recognition module, configured to recognize the image to be recognized according to its binary-coded data and the binary-coded data of each image in the cluster centres of the L images.
Optionally, the initialization module comprises:

a first obtaining module, configured to obtain an image relevance matrix according to the image relevance between each image in the first image group and each image in the second image group;

a sub-initialization module, configured to initialize the parameters of the objective function according to the image relevance matrix;

wherein the image relevance matrix is an N×M matrix in which the element in row i, column j represents the image relevance between the i-th image of the first image group and the j-th image of the second image group; alternatively, the image relevance matrix is an M×N matrix in which the element in row j, column i represents the image relevance between the j-th image of the second image group and the i-th image of the first image group;

wherein i is greater than or equal to 1 and less than or equal to N, and j is greater than or equal to 1 and less than or equal to M.
Optionally, the sub-initialization module is specifically configured to:

obtain the transpose of the image relevance matrix;

obtain the covariance matrix of the image relevance matrix and the covariance matrix of its transpose, respectively;

obtain partial eigenvectors of the covariance matrix of the image relevance matrix according to the projection matrix of the image relevance matrix, and obtain partial eigenvectors of the covariance matrix of the transpose according to the projection matrix of the transpose;

initialize the first binary-coding parameter according to the first orthogonal projection matrix and the partial eigenvectors of the covariance matrix of the image relevance matrix, and initialize the second binary-coding parameter according to the second orthogonal projection matrix and the partial eigenvectors of the covariance matrix of the transpose.
Optionally, the update module comprises:

a sub-update module, configured to, at each iterative update, determine one parameter of the objective function as the parameter to be updated, fix the other parameters, and update the parameter to be updated so that the objective function satisfies a preset condition;

a statistics module, configured to record the number of iterations and, when the number of iterations is greater than or equal to a preset number of iterations, stop updating the parameters of the objective function to obtain the iteratively updated objective function.
Optionally, the recognition module comprises:

a computing module, configured to calculate the Euclidean distance between the binary-coded data of the image to be recognized and the binary-coded data of each image in the cluster centres of the L images;

a sub-recognition module, configured to recognize the image to be recognized according to the Euclidean distances.
Optionally, the device further comprises an extraction module, configured to divide the image into at least two local blocks, obtain the image features of each local block, and merge the image features of the at least two local blocks to obtain the image features of the image.

Optionally, when obtaining the image features of each local block, the extraction module is specifically configured to obtain the image features of each local block using a histogram of oriented gradients (HOG) algorithm.
In a third aspect, an embodiment of the present invention provides an image recognition device, comprising: at least one processor and a memory; the memory stores computer-executable instructions, and the at least one processor executes the computer-executable instructions stored in the memory to perform the method of any one of the first aspect of the embodiments of the present invention.
In a fourth aspect, an embodiment of the present invention provides a computer-readable storage medium storing program instructions which, when executed by a processor, implement the method of any one of the first aspect of the embodiments of the present invention.
In a fifth aspect, an embodiment of the present application provides a program product comprising a computer program stored in a readable storage medium; at least one processor of an image recognition device can read the computer program from the readable storage medium and execute it so that the image recognition device implements the method of any one of the first aspect of the embodiments of the present invention.
Embodiments of the present invention provide an image recognition method and device. By letting the parameters of the hash function influence the cluster centres obtained by the cluster centre function, and by feeding the cluster centres obtained by the cluster centre function back into the optimization of the hash function parameters, the parameters of the cluster centre function and the parameters of the hash function are computed and optimized cooperatively, i.e., the cluster centre function and the hash function are jointly optimized. This not only yields better binary-coded representations of image features from the hash function, but also yields cluster centres highly consistent with the image features, so that the resulting hash function and cluster centres are more robust and adapt better to image features under different modes. Finally, the image features of the image to be recognized are binary-coded according to the hash function, and the image is recognized from the image features of the binary-coded image to be recognized and the image features of each image in the cluster centres, improving recognition accuracy.
Brief description of the drawings
In order to explain the technical solutions in the embodiments of the present invention or in the prior art more clearly, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings in the following description show some embodiments of the present invention; a person of ordinary skill in the art can obtain other drawings from these drawings without creative effort.
Fig. 1 is a flowchart of an image recognition method provided by an embodiment of the present invention;
Fig. 2 is a schematic structural diagram of an image recognition device provided by an embodiment of the present invention;
Fig. 3 is a schematic structural diagram of an image recognition device provided by another embodiment of the present invention.
Specific embodiments
To make the objects, technical solutions, and advantages of the embodiments of the present invention clearer, the technical solutions in the embodiments of the present invention are described below clearly and completely with reference to the accompanying drawings. Obviously, the described embodiments are some, not all, of the embodiments of the present invention. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments of the present invention without creative effort fall within the protection scope of the present invention.
Fig. 1 is a flowchart of an image recognition method provided by an embodiment of the present invention. As shown in Fig. 1, the method of this embodiment may include:

S101: select a first image group and a second image group from L images.

In this embodiment, N images are selected from the candidate L images as the first image group, and M images are selected as the second image group. When the N images of the first image group and the M images of the second image group are selected from the L images, the images of the first image group and the images of the second image group must not be identical, i.e., they differ in at least one image. L is greater than or equal to 2, N is greater than or equal to 1 and less than or equal to L, and M is greater than or equal to 1 and less than or equal to L.
It should be noted that this embodiment does not limit the relationship between N and M.
S102: calculate, according to the image features of each image in the first image group and the second image group, the image relevance between each image in the first image group and each image in the second image group.

In this embodiment, after the first image group and the second image group are obtained, the image features of each image in the first image group are compared with the image features of each image in the second image group to obtain the relevance, or similarity, of each pair of images. The higher the similarity of two images, the stronger their relevance. The relevance between two images can be represented by a number: when the compared image in the first image group and the image in the second image group are the same image, the number representing their relevance is defined as 1; when the compared images are entirely different, the number is defined as 0. Thus the number representing the relevance of two compared images lies between (0, 1), and a larger number indicates a stronger relevance between the two images.
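For illustration only, such a relevance number in [0, 1] can be sketched with cosine similarity between feature vectors. The embodiment does not fix the exact similarity measure, so the mapping below is an assumption, chosen because it yields 1 for identical features and 0 for opposite ones.

```python
import numpy as np

def relevance(f1, f2):
    """Map the similarity of two image-feature vectors to [0, 1]:
    1 when the features are identical, 0 when they are opposite.
    Cosine similarity is an illustrative choice, not the patent's measure."""
    cos = float(np.dot(f1, f2) / (np.linalg.norm(f1) * np.linalg.norm(f2)))
    return 0.5 * (cos + 1.0)          # map cosine in [-1, 1] onto [0, 1]

f = np.array([1.0, 2.0, 3.0])
print(round(relevance(f, f), 6))      # → 1.0 (same image)
print(round(relevance(f, -f), 6))     # → 0.0 (entirely different)
```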
S103: initialize the parameters of the objective function according to the image relevance between each image in the first image group and each image in the second image group.

In this embodiment, the objective function includes a hash function and a cluster centre function. The hash function binary-codes image features, yielding image features represented as binary-coded data; the cluster centre function obtains, from the L images, cluster centres with image feature consistency, each cluster centre including at least one image. For example, when the object of the image to be recognized is a dog, the L images may include both images of dogs and images of other objects (for example, hamsters or buildings), and the cluster centre obtained by the cluster centre function can represent the image features of a dog.

In this embodiment, the parameters of the hash function include a first binary-coding parameter, a second binary-coding parameter, and a prediction scaling variable; the parameters of the cluster centre function include a first orthogonal projection matrix and a second orthogonal projection matrix.
S104: iteratively update the parameters of the objective function to obtain the iteratively updated objective function and determine the cluster centres of the L images.

In this embodiment, the first binary-coding parameter, the second binary-coding parameter, the prediction scaling variable, the first orthogonal projection matrix, and the second orthogonal projection matrix are iteratively updated according to the target of the objective function; the parameters of the objective function after each iterative update give the objective function updated at each iteration, and the number of iterative updates is at least one. For example, when the target of the objective function is to reach a minimum, the parameters are updated so that the objective value decreases, and the parameters corresponding to the minimum objective value are the updated parameters. At each iterative update, each of these five variables is updated once. Moreover, after the parameters of the objective function are obtained at each iterative update, the cluster centre function in the objective function can obtain updated cluster centres according to the updated parameters.
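The update scheme described above (fix all parameters but one, update that one so the objective meets the preset condition, repeat for a preset number of iterations) is block-coordinate descent, and can be sketched as follows. The quadratic objective and the grid search below are stand-ins, since the patent's actual hash and cluster-centre objective is not written out here.

```python
import numpy as np

def objective(x, y):
    """Stand-in objective with a minimum; not the patent's objective."""
    return (x - 3.0) ** 2 + (y + 1.0) ** 2 + 0.5 * x * y

def update_one(params, name, grid):
    """Fix every parameter except `name`; choose the grid value of `name`
    that minimizes the objective (the 'preset condition' here: decrease)."""
    best = min(grid, key=lambda v: objective(**{**params, name: v}))
    return {**params, name: float(best)}

params = {"x": 0.0, "y": 0.0}
grid = np.linspace(-5, 5, 201)
for it in range(10):                  # preset number of iterations
    for name in ("x", "y"):           # one pass updates each parameter once
        params = update_one(params, name, grid)

print(objective(**params) < objective(x=0.0, y=0.0))   # → True
```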
S105: binary-code the image features of the image to be recognized according to the hash function in the iteratively updated objective function.

In this embodiment, when the iterative updating of the parameters of the objective function stops, the objective function and cluster centres corresponding to the last update are obtained, and the image features of the image to be recognized are obtained. The image features of the image to be recognized are binary-coded according to the hash function in the objective function, yielding image features represented as binary-coded data.
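A minimal sketch of sign-based binary coding follows, assuming (as hashing methods commonly do, though the patent does not spell this out) that the hash function projects the feature vector with a learned matrix and thresholds at zero. The random projection and the dimensions are illustrative; in the method the projection would come from the trained parameters.

```python
import numpy as np

rng = np.random.default_rng(0)
R = rng.standard_normal((16, 36))          # assumed: 36-d feature -> 16-bit code

def binary_code(feature, R):
    """Project the feature and threshold at zero to get a {0,1} code."""
    return (R @ feature > 0).astype(np.uint8)

feature = rng.standard_normal(36)          # stand-in image feature
code = binary_code(feature, R)
print(code.shape)                          # (16,)
```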
S106: recognize the image to be recognized according to its binary-coded data and the binary-coded data of each image in the cluster centres of the L images.

In this embodiment, the binary-coded data of the image to be recognized is compared with the binary-coded data of each image in the cluster centres of the L images, and the image to be recognized is recognized according to the result of the comparison.
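The comparison by Euclidean distance mentioned in the optional claims can be sketched as nearest-cluster-centre lookup (for {0,1} codes, Euclidean distance is the square root of the Hamming distance). The codes and labels below are made-up illustrations, not data from the patent.

```python
import numpy as np

centres = np.array([[0, 1, 1, 0],          # e.g. cluster centre code for "dog"
                    [1, 0, 0, 1]])         # e.g. cluster centre code for "building"
labels = ["dog", "building"]
query = np.array([0, 1, 0, 0])             # binary code of the image to recognize

d = np.linalg.norm(centres - query, axis=1)   # Euclidean distance to each centre
print(labels[int(np.argmin(d))])              # → dog
```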
In this embodiment, the parameters of the hash function influence the cluster centres obtained by the cluster centre function, and the cluster centres obtained by the cluster centre function are fed back into the optimization of the hash function parameters, so that the parameters of the cluster centre function and the parameters of the hash function are computed and optimized cooperatively, i.e., the cluster centre function and the hash function are jointly optimized. This not only yields better binary-coded representations of image features from the hash function, but also yields cluster centres highly consistent with the image features, making the resulting hash function and cluster centres more robust and better adapted to image features under different modes. Finally, the image features of the image to be recognized are binary-coded according to the hash function, and the image is recognized from the image features of the binary-coded image to be recognized and the image features of each image in the cluster centres, improving recognition accuracy.
Several specific embodiments are used below to describe the technical solution of the method embodiment shown in Fig. 1 in detail.
In some embodiments, a possible implementation of S103 is as follows:

S1031: obtain an image relevance matrix according to the image relevance between each image in the first image group and each image in the second image group.

S1032: initialize the parameters of the objective function according to the image relevance matrix.

In this embodiment, an image in the first image group is compared with an image in the second image group to obtain the relevance of the two images, which can be represented by a number; the relevance of each pair of images serves as one element of the relevance matrix, so that the image relevance matrix is obtained from the image relevance between each image in the first image group and each image in the second image group.
The image relevance matrix may be an N×M matrix in which the element in row i, column j represents the image relevance between the i-th image of the first image group and the j-th image of the second image group; alternatively, it may be an M×N matrix in which the element in row j, column i represents the image relevance between the j-th image of the second image group and the i-th image of the first image group. The embodiments of the present invention are described taking the N×M image relevance matrix as an example; i is greater than or equal to 1 and less than or equal to N, and j is greater than or equal to 1 and less than or equal to M.
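Building the N×M relevance matrix of S1031 can be sketched as follows: entry U[i, j] holds the pairwise relevance between the i-th image of the first group and the j-th image of the second group. The feature vectors and the cosine-based relevance are stand-ins for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)
first = rng.standard_normal((3, 8))    # N = 3 stand-in image-feature vectors
second = rng.standard_normal((4, 8))   # M = 4 stand-in image-feature vectors

def rel(a, b):
    """Pairwise relevance in [0, 1] (cosine mapped; illustrative choice)."""
    c = float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))
    return 0.5 * (c + 1.0)

# U[i, j]: relevance of i-th image in first group vs j-th image in second group
U = np.array([[rel(a, b) for b in second] for a in first])
print(U.shape)                         # → (3, 4)
```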
In some embodiments, a possible implementation of S1032 is as follows:

S10321: obtain the transpose of the image relevance matrix.

In this embodiment, for example, let the image relevance matrix be U = (u_ij) ∈ R^(N×M), where u_ij represents the relevance between the i-th image of the first image group and the j-th image of the second image group. The transpose of the image relevance matrix U is denoted U^T, an M×N matrix.
S10322: obtain the covariance matrix of the image relevance matrix and the covariance matrix of its transpose, respectively.

In this embodiment, the mean vector u_h of the transpose U^T and the mean vector u_g of the image relevance matrix U are obtained, from which the covariance matrix of U^T can be computed, for example as (U^T − u_h 1^T)^T (U^T − u_h 1^T), and the covariance matrix of U, for example as (U − u_g 1^T)^T (U − u_g 1^T).
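The covariance expression above can be checked numerically: subtract the mean vector (the outer product u_g 1^T broadcasts the mean over columns) and form the Gram matrix of the centered matrix. The 3×4 matrix below is a stand-in.

```python
import numpy as np

rng = np.random.default_rng(2)
U = rng.random((3, 4))                      # stand-in 3x4 relevance matrix

u_g = U.mean(axis=1, keepdims=True)         # mean vector u_g of U
C = (U - u_g).T @ (U - u_g)                 # (U - u_g 1^T)^T (U - u_g 1^T)
print(C.shape)                              # (4, 4)
print(np.allclose(C, C.T))                  # True: a covariance matrix is symmetric
```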
S10323, according to the projection matrix of described image relevance matrix, obtain the association side of described image relevance matrix
The Partial Feature vector of poor matrix, and according to the projection matrix of the transposed matrix, obtain the covariance of the transposed matrix
The Partial Feature vector of matrix.
In the present embodiment, first the projection matrix P_h of the transposed matrix U^T and the projection matrix P_g of the image relevance matrix are obtained. The partial feature vectors of the covariance matrix of U^T are obtained according to P_h and that covariance matrix; for example, the first b feature vectors of the covariance matrix of U^T can be obtained, and the calculation formula is, for example, formula 1. Similarly, the calculation formula for the first b feature vectors of the covariance matrix of the image relevance matrix U is, for example, formula 2:

U_h = P_h(U^T - u_h 1^T)    formula 1

U_g = P_g(U - u_g 1^T)    formula 2
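Formulas 1 and 2 both centre a matrix and then apply a projection. A minimal numpy sketch, with hypothetical sizes N=4, M=3 and code length b=2, and random matrices standing in for P_h and P_g:

```python
import numpy as np

def center_and_project(U, P):
    """Centre a matrix row-wise and apply a projection: P (U - u 1^T),
    where u is the row-mean vector of U (formulas 1/2 above)."""
    u = U.mean(axis=1, keepdims=True)
    return P @ (U - u)

rng = np.random.default_rng(0)
U = rng.standard_normal((4, 3))     # image relevance matrix, N=4 x M=3
Pg = rng.standard_normal((2, 4))    # stand-in projection matrix for U (b x N)
Ph = rng.standard_normal((2, 3))    # stand-in projection matrix for U^T (b x M)
Ug = center_and_project(U, Pg)      # formula 2: b x M
Uh = center_and_project(U.T, Ph)    # formula 1: b x N
print(Ug.shape, Uh.shape)           # (2, 3) (2, 4)
```

Note the resulting shapes match the binary codes below: U_h is b x N (one column per image of the first group) and U_g is b x M.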
S10324. Initialize the first binary-coding parameter according to the first orthogonal projection matrix and the partial feature vectors of the covariance matrix of the image relevance matrix, and initialize the second binary-coding parameter according to the second orthogonal projection matrix and the partial feature vectors of the covariance matrix of the transposed matrix.
In the present embodiment, the first binary-coding parameter G is initialized according to the first orthogonal projection matrix R_g and the partial feature vectors U_g of the covariance matrix of the image relevance matrix; the first binary-coding parameter can, for example, be expressed as G = sign(R_g U_g).
The second binary-coding parameter H is initialized according to the second orthogonal projection matrix R_h and the partial feature vectors U_h of the covariance matrix of the transposed matrix; the second binary-coding parameter can, for example, be expressed as H = sign(R_h U_h).
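The two sign-based initializations can be sketched as follows; the random orthogonal initialization of R_g and R_h reflects the remark below that these matrices may be initialized at random, and all sizes are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(1)
b, N, M = 2, 4, 3
Ug = rng.standard_normal((b, M))   # projected features of U (formula 2)
Uh = rng.standard_normal((b, N))   # projected features of U^T (formula 1)
# Random orthogonal b x b matrices via QR decomposition.
Rg = np.linalg.qr(rng.standard_normal((b, b)))[0]
Rh = np.linalg.qr(rng.standard_normal((b, b)))[0]
G = np.sign(Rg @ Ug)               # G = sign(Rg Ug): b x M binary codes
H = np.sign(Rh @ Uh)               # H = sign(Rh Uh): b x N binary codes
print(G.shape, H.shape)            # (2, 3) (2, 4)
```

Each column of G and H is the initial binary code of one image, with entries in {-1, +1} (for continuous random data, exact zeros from sign are not a practical concern).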
The present embodiment does not limit the manner of initializing the prediction zoom variable, the first orthogonal projection matrix and the second orthogonal projection matrix; for example, the values of the prediction zoom variable, the first orthogonal projection matrix and the second orthogonal projection matrix can be obtained at random.
In the present embodiment, the parameters of the objective function are initialized through the image relevance matrix of the images in the first image group and the second image group and the transposed matrix of the image relevance matrix. This further improves the robustness of the obtained hash function and cluster centres, as well as their adaptability to the image features of images under different modalities.
In some embodiments, a possible implementation of S104 is as follows:
S1041. At each iteration update, determine one parameter from the parameters of the objective function as the parameter to be updated, fix the other parameters, and update the parameter to be updated so that the objective function meets a preset condition.
In the present embodiment, for example, the expression of the objective function is formula 3:
where the first two terms of the objective function represent the hash function, and the third term represents the cluster-centre function.
In E1, the partial feature vectors of the covariance matrix of the transposed matrix U^T are, for example, its first b feature vectors, and the partial feature vectors of the covariance matrix of the image relevance matrix U are, for example, its first b feature vectors. Here U_h can be used to represent the image features of the first image group, and U_g the image features of the second image group. The first orthogonal projection matrix R_g and the second orthogonal projection matrix R_h are both b*b matrices.
In E2, σ is the prediction zoom variable, and co(h_i, g_j) represents the relevance between the first image group and the second image group, where the calculation formula of co(h_i, g_j) is, for example, formula 4:
In E3, C = [c_1, c_2, …, c_K] ∈ R^(b×K) represents the cluster centres of the L images, where K indicates the number of images in the cluster centres, and c_k in formula 3 indicates the image feature represented by the binary-coding data of the k-th image in the cluster centres.
For ease of calculation and simplification, E1 + E2 in the objective function is written in matrix form, as shown in formula 5:
where H ∈ {-1, +1}^(b×n), G ∈ {-1, +1}^(b×m), and J is an n*m matrix in which every element is 1. In order to reduce the redundancy of the binary codes so that the codes carry as much information as possible, it is possible, for example, to let HH^T = nI and GG^T = mI, where I is the identity matrix.
The process of updating the parameters at each iteration is as follows:
(1) First, fix the values of the second binary-coding parameter G, the prediction zoom variable σ, the first orthogonal projection matrix R_g and the second orthogonal projection matrix R_h, and update the first binary-coding parameter H.
The values of the second binary-coding parameter G, the prediction zoom variable σ, the first orthogonal projection matrix R_g and the second orthogonal projection matrix R_h obtained after initialization or after the last iteration update are substituted into formula 5, and formula 5 simplifies to formula 6:
where HH^T = nI and H ∈ {-1, +1}^(b×n). By relaxing the discrete constraint on the first binary-coding parameter H, H can be solved by the singular value decomposition method, from which the solution is known. Therefore, the updated first binary-coding parameter is:
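The relaxed update in step (1) has a standard closed form: maximizing tr(H^T B) subject to HH^T = nI is solved by H = sqrt(n) P Q^T, where B = P S Q^T is the thin SVD of the data-dependent term. Since the concrete expression from formula 6 is not reproduced here, the matrix B below is a hypothetical stand-in for that term; a minimal sketch:

```python
import numpy as np

def update_codes_svd(B, n):
    """Relaxed code update under H H^T = n I.

    B (b x n) stands in for the data term of formula 6. The relaxed
    maximizer of tr(H^T B) is sqrt(n) * P Q^T with B = P S Q^T the thin
    SVD; signs are then taken to return to the binary domain.
    """
    P, _, Qt = np.linalg.svd(B, full_matrices=False)
    H_relaxed = np.sqrt(n) * P @ Qt    # satisfies H H^T = n I exactly
    return np.sign(H_relaxed)

rng = np.random.default_rng(2)
B = rng.standard_normal((2, 5))        # b=2 x n=5 toy surrogate term
H = update_codes_svd(B, 5)
print(H.shape)                         # (2, 5)
```

The same machinery applies to the update of G in step (2), with m in place of n.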
(2) Next, fix the values of the first binary-coding parameter H, the prediction zoom variable σ, the first orthogonal projection matrix R_g and the second orthogonal projection matrix R_h, and update the second binary-coding parameter G.
Since the updated first binary-coding parameter H was obtained in (1), when updating the second binary-coding parameter G, the H updated in (1) and the values of the prediction zoom variable σ, the first orthogonal projection matrix R_g and the second orthogonal projection matrix R_h obtained after initialization or after the last iteration update are substituted into formula 5, and formula 5 simplifies to formula 7:
where G ∈ {-1, +1}^(b×m) and GG^T = mI. By relaxing the discrete constraint on G, G can be solved by the singular value decomposition method, from which the solution is known. Therefore, the updated second binary-coding parameter is:
(3) Next, fix the values of the first binary-coding parameter H, the second binary-coding parameter G, the prediction zoom variable σ and the second orthogonal projection matrix R_h, and update the first orthogonal projection matrix R_g.
Since the updated first binary-coding parameter H was obtained in (1) and the updated second binary-coding parameter G was obtained in (2), when updating the first orthogonal projection matrix R_g, the H updated in (1), the G updated in (2), and the values of the prediction zoom variable σ and the second orthogonal projection matrix R_h obtained after initialization or after the last iteration update are substituted into formula 3, and formula 3 simplifies to formula 8:
The cluster centres of the L images can be obtained from the first binary-coding parameter H, the second binary-coding parameter G and a clustering algorithm (such as the K-Means algorithm); moreover, the cluster centres of the L images are related to the first binary-coding parameter H and the second binary-coding parameter G. For the specific implementation of obtaining the cluster centres of the L images from H, G and the clustering algorithm, reference may be made to the prior art, and details are not described here again.
The updated first orthogonal projection matrix R_g is obtained by the method of stochastic gradient descent; for the specific method of obtaining the updated R_g by stochastic gradient descent, reference may be made to existing methods, and details are not described here again.
(4) Next, fix the values of the first binary-coding parameter H, the second binary-coding parameter G, the prediction zoom variable σ and the first orthogonal projection matrix R_g, and update the second orthogonal projection matrix R_h.
When updating the second orthogonal projection matrix R_h, the H updated in (1), the G updated in (2), the R_g updated in (3), and the value of the prediction zoom variable σ obtained after initialization or after the last iteration update are substituted into formula 3, and formula 3 simplifies to formula 9:
The updated second orthogonal projection matrix R_h is obtained by the method of stochastic gradient descent; for the specific method of obtaining the updated R_h by stochastic gradient descent, reference may be made to existing methods, and details are not described here again.
(5) Finally, fix the values of the first binary-coding parameter H, the second binary-coding parameter G, the first orthogonal projection matrix R_g and the second orthogonal projection matrix R_h, and update the prediction zoom variable σ.
When updating the prediction zoom variable σ, the values of the first binary-coding parameter H, the second binary-coding parameter G, the first orthogonal projection matrix R_g and the second orthogonal projection matrix R_h updated in this iteration are substituted into formula 5, and formula 5 simplifies to formula 10:
where, by introducing a substitution, formula 5 can be further simplified, and an approximate solution for σ can then be obtained.
Through the above process (1)-(5), the parameters of the objective function are updated in one iteration update.
It should be noted that the embodiment of the present invention does not limit the update order of the parameters when the parameters of the objective function are updated at each iteration. For example, the update may also be performed in the order of the prediction zoom variable σ, the second orthogonal projection matrix R_h, the first binary-coding parameter H, the first orthogonal projection matrix R_g and the second binary-coding parameter G.
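The alternating scheme of steps (1)-(5), with its interchangeable update order, can be sketched as a generic coordinate-wise loop; the update functions below are toy stand-ins, not the actual formula-6-to-10 updates:

```python
def alternating_optimise(params, updates, max_iters):
    """One named parameter is updated per step while the others stay fixed.

    params:  dict of current parameter values
    updates: dict mapping parameter name -> fn(params) returning a new value
    """
    for _ in range(max_iters):
        for name in ("H", "G", "Rg", "Rh", "sigma"):
            params[name] = updates[name](params)   # others held fixed
    return params

# Toy stand-in updates that just count how often each parameter is touched.
counts = {k: 0 for k in ("H", "G", "Rg", "Rh", "sigma")}
def bump(name):
    def fn(params):
        counts[name] += 1
        return counts[name]
    return fn

result = alternating_optimise({k: 0 for k in counts},
                              {k: bump(k) for k in counts}, max_iters=3)
print(result["sigma"])   # 3
```

Swapping the tuple of names changes the update order without touching the loop, mirroring the remark above that the order is not limited.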
S1042. Record the number of iterations; when the number of iterations is greater than or equal to a preset number of iterations, stop updating the parameters of the objective function and obtain the iteratively updated objective function.
In the present embodiment, after each iteration update, the number of iteration updates is recorded. When the number of iterations is greater than or equal to the preset number of iterations, the updating of the parameters of the objective function is stopped and the iteratively updated objective function is obtained, so as to obtain the updated hash parameters and the cluster centres of the L images. For the image feature x of any input image, the hash function performs binary coding on x; the formula for obtaining the first binary-coding data is, for example, formula 11, and the formula for obtaining the second binary-coding data is, for example, formula 12:
where i in formula 11 indexes the i-th row of the matrix W, and w_i represents the elements of the i-th row of W; i in formula 12 indexes the i-th row of the matrix V, and v_i represents the elements of the i-th row of V.
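Under the assumption that the learned hash is linear in the feature x (formulas 11 and 12 are not reproduced here; only the rows w_i of W and v_i of V are named in the text), a minimal sketch of the two binary codings could look like:

```python
import numpy as np

def hash_codes(x, W, V):
    """Hypothetical linear-hash sketch: bit i of the first code is
    sign(w_i . x) with w_i the i-th row of W, and likewise sign(v_i . x)
    for the second code."""
    return np.sign(W @ x), np.sign(V @ x)

rng = np.random.default_rng(4)
W = rng.standard_normal((2, 5))        # toy stand-in for the learned W
V = rng.standard_normal((2, 5))        # toy stand-in for the learned V
x = rng.standard_normal(5)             # image feature of a query image
b1, b2 = hash_codes(x, W, V)
print(b1.shape, b2.shape)              # (2,) (2,)
```

Each output is a b-bit code with entries in {-1, +1}, one bit per row of W or V.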
In the present embodiment, at each iteration the parameters of the hash function influence the cluster centres obtained by the cluster-centre function, and the cluster centres obtained by the cluster-centre function are fed back into the optimization of the hash function parameters. The parameters of the cluster-centre function and the parameters of the hash function are thus computed collaboratively and optimized simultaneously, i.e. the cluster-centre function and the hash function are optimized jointly. This improves the robustness of the hash function and the cluster centres, and their adaptability to the image features of images under different modalities, thereby improving the accuracy of image recognition.
In some embodiments, a possible implementation of S106 is as follows:
S1061. Calculate the Euclidean distance between the binary-coding data of the image to be recognized and the binary-coding data of each image in the cluster centres of the L images.
In the present embodiment, according to the hash function, both the image feature of the image to be recognized represented by the first binary-coding data and the image feature represented by the second binary-coding data can be obtained. Therefore, after obtaining these two representations of the image to be recognized, the embodiment of the present invention does not limit which binary-coding representation of the image to be recognized is compared with the binary-coding data of each image in the cluster centres of the L images. For example, the Euclidean distance between the image feature represented by the first binary-coding data and the binary-coding data of each image in the cluster centres of the L images can be calculated; or the Euclidean distance between the image feature represented by the second binary-coding data and the binary-coding data of each image in the cluster centres of the L images can be calculated; or both of these Euclidean distances can be calculated respectively.
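The comparison in S1061/S1062 amounts to a nearest-centre search under Euclidean distance; a minimal sketch with hypothetical b=2 centre codes:

```python
import numpy as np

def nearest_centre(code, centres):
    """Identify a query by the Euclidean distance from its binary code to
    each cluster-centre code (one centre per column of `centres`)."""
    d = np.linalg.norm(centres - code[:, None], axis=0)
    return int(d.argmin()), d

centres = np.array([[1., -1.],
                    [1.,  1.]])         # two b=2 centre codes (columns)
code = np.array([1., 1.])               # binary code of the query image
k, d = nearest_centre(code, centres)
print(k)   # 0  (distance 0 to the first centre, 2 to the second)
```

The query is assigned to the cluster centre with the smallest distance, which then determines the recognition result.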
The binary-coding data can, for example, take the values 1 and 0.
S1062. Identify the image to be recognized according to the Euclidean distance.
In the present embodiment, the image to be recognized is identified according to the Euclidean distance obtained in S1061.
In the present embodiment, the image features represented by the binary-coding data of the image to be recognized are compared with the image features represented by the binary-coding data of the cluster centres, so as to identify the image to be recognized. Since only 1s and 0s need to be compared, the speed of image recognition is improved.
In some embodiments, the method shown in the embodiment of the present invention can also include:
S201. Divide the image into at least two local blocks.
S202. Obtain the image feature of each local block.
Optionally, the Histogram of Oriented Gradients (HOG) algorithm can be used to obtain the image feature of each local block.
S203. Merge the image features of the at least two local blocks to obtain the image feature of the image.
In the present embodiment, when obtaining image features, the image is divided into at least two local blocks, and the image feature of each local block is obtained using the HOG algorithm. For example, 8*8 pixels form a cell, every 2*2 cells form a block, and the stride is 8. Each cell generates 9 features, so a block generates 36 features, i.e. a 36-dimensional feature vector. Finally, the image features of the local blocks are concatenated in a preset order to obtain the image feature of the whole image. For example, if the image is divided into 4 local blocks, each a 36-dimensional feature vector, then the image feature of the whole image is composed of the 4 36-dimensional feature vectors.
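The cell/block arithmetic above determines the length of the final HOG feature; a small sketch that reproduces the 4 x 36 = 144 example (parameter names are illustrative):

```python
# HOG feature-length arithmetic for the layout described above:
# 8x8-pixel cells, 2x2-cell blocks, stride 8 px, 9 orientation bins,
# so each block contributes 2*2*9 = 36 values and the whole-image
# feature concatenates all block vectors in a fixed order.
def hog_feature_len(width, height, cell=8, block_cells=2, stride=8, bins=9):
    block_px = block_cells * cell                      # block side in pixels
    blocks_x = (width - block_px) // stride + 1        # sliding-window count
    blocks_y = (height - block_px) // stride + 1
    return blocks_x * blocks_y * block_cells * block_cells * bins

# e.g. a 24x24 image yields 2x2 = 4 overlapping blocks of 36 features each
print(hog_feature_len(24, 24))   # 144
```

The same formula scales to larger images, e.g. a 32x32 image gives 3x3 blocks and a 324-dimensional feature.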
In the following, the device for implementing the above method is described.
Fig. 2 is a structural schematic diagram of the image recognition device provided by an embodiment of the present invention. As shown in Fig. 2, the image recognition device provided by this embodiment may include: a choosing module 21, a relating module 22, an initialization module 23, an update module 24, a coding module 25 and an identification module 26.
Optionally, the initialization module 23 includes: a first acquisition module 231 and a sub-initialization module 232.
Optionally, the update module 24 includes: a sub-update module 241 and a statistical module 242.
Optionally, the identification module 26 includes: a computing module 261 and a sub-identification module 262.
Optionally, on the basis of the embodiment shown in Fig. 2, the image recognition device can also include: an extraction module 27.
The choosing module 21 is configured to choose a first image group and a second image group from L images; the first image group includes N images, the second image group includes M images, and the images of the first image group and the images of the second image group are not exactly the same; where L is greater than or equal to 2, N is greater than or equal to 1 and less than or equal to L, and M is greater than or equal to 1 and less than or equal to L.
The relating module 22 is configured to calculate, according to the image feature of every image in the first image group and the second image group, the image relevance between each image in the first image group and each image in the second image group.
The initialization module 23 is configured to initialize the parameters of the objective function according to the image relevance between each image in the first image group and each image in the second image group. The objective function includes a hash function and a cluster-centre function, where the hash function is used to perform binary coding on image features, and the cluster-centre function is used to obtain, among the L images, cluster centres with image-feature consistency, the cluster centres including at least one image. The parameters of the hash function include the first binary-coding parameter, the second binary-coding parameter and the prediction zoom variable; the parameters of the cluster-centre function include the first orthogonal projection matrix and the second orthogonal projection matrix.
The update module 24 is configured to iteratively update the parameters of the objective function, obtain the iteratively updated objective function and determine the cluster centres of the L images; the number of iteration updates is at least one.
The coding module 25 is configured to perform binary coding on the image feature of the image to be recognized according to the hash function in the updated objective function, to obtain binary-coding data.
The identification module 26 is configured to identify the image to be recognized according to the binary-coding data of the image to be recognized and the binary-coding data of each image in the cluster centres of the L images.
Optionally, the first acquisition module 231 is configured to obtain an image relevance matrix according to the image relevance between each image in the first image group and each image in the second image group.
The sub-initialization module 232 is configured to initialize the parameters of the objective function according to the image relevance matrix.
The image relevance matrix is an N*M image relevance matrix in which the element in the i-th row and j-th column represents the image relevance between the i-th image in the first image group and the j-th image in the second image group; alternatively, the image relevance matrix is an M*N image relevance matrix in which the element in the j-th row and i-th column represents the image relevance between the j-th image in the second image group and the i-th image in the first image group.
Here, i is greater than or equal to 1 and less than or equal to N, and j is greater than or equal to 1 and less than or equal to M.
Optionally, the sub-initialization module 232 is specifically configured to:
obtain the transposed matrix of the image relevance matrix;
obtain the covariance matrix of the image relevance matrix and the covariance matrix of the transposed matrix of the image relevance matrix, respectively;
obtain the partial feature vectors of the covariance matrix of the image relevance matrix according to the projection matrix of the image relevance matrix, and obtain the partial feature vectors of the covariance matrix of the transposed matrix according to the projection matrix of the transposed matrix;
initialize the first binary-coding parameter according to the first orthogonal projection matrix and the partial feature vectors of the covariance matrix of the image relevance matrix, and initialize the second binary-coding parameter according to the second orthogonal projection matrix and the partial feature vectors of the covariance matrix of the transposed matrix.
Optionally, the sub-update module 241 is configured to, at each iteration update, determine one parameter from the parameters of the objective function as the parameter to be updated, fix the other parameters, and update the parameter to be updated so that the objective function meets a preset condition.
The statistical module 242 is configured to record the number of iterations and, when the number of iterations is greater than or equal to a preset number of iterations, stop updating the parameters of the objective function to obtain the iteratively updated objective function.
Optionally, the computing module 261 is configured to calculate the Euclidean distance between the binary-coding data of the image to be recognized and the binary-coding data of each image in the cluster centres of the L images;
the sub-identification module 262 is configured to identify the image to be recognized according to the Euclidean distance.
Optionally, the extraction module 27 is configured to divide the image into at least two local blocks, obtain the image feature of each local block, and merge the image features of the at least two local blocks to obtain the image feature of the image.
Optionally, when obtaining the image feature of each local block, the extraction module 27 is specifically configured to obtain the image feature of each local block using the Histogram of Oriented Gradients algorithm.
The image recognition device of this embodiment described above can be used to execute the technical solutions in the above method embodiments; its implementation principles and technical effects are similar, and details are not described here again.
Fig. 3 is a structural schematic diagram of an image recognition device provided by another embodiment of the present invention. As shown in Fig. 3, the image recognition device may be a network device or a chip of a network device, and the device may include: at least one processor 31 and a memory 32. Fig. 3 shows the image recognition device taking one processor as an example, where:
the memory 32 is configured to store a program. Specifically, the program may include program code, and the program code includes computer operation instructions. The memory 32 may include high-speed RAM (Random Access Memory), and may also include non-volatile memory, for example at least one disk memory.
The processor 31 is configured to execute the computer execution instructions stored in the memory 32, so as to realize the image recognition method in the above embodiments; its implementation principles and technical effects are similar, and details are not described here.
The processor 31 may be a central processing unit (Central Processing Unit, CPU), an application-specific integrated circuit (Application Specific Integrated Circuit, ASIC), or one or more integrated circuits configured to implement the embodiments of the present application.
Optionally, in a specific implementation, if the communication interface, the memory 32 and the processor 31 are implemented independently, the communication interface, the memory 32 and the processor 31 can be connected to each other through a bus and complete mutual communication. The bus may be an Industry Standard Architecture (ISA) bus, a Peripheral Component Interconnect (PCI) bus, an Extended Industry Standard Architecture (EISA) bus, or the like. The bus can be divided into an address bus, a data bus, a control bus, etc., but this does not mean that there is only one bus or only one type of bus.
Optionally, in a specific implementation, if the communication interface, the memory 32 and the processor 31 are integrated on one chip, then the communication interface, the memory 32 and the processor 31 can complete communication among each other through internal interfaces.
The image recognition device of this embodiment described above can be used to execute the technical solutions in the above method embodiments; its implementation principles and technical effects are similar, and details are not described here again.
Those of ordinary skill in the art will appreciate that all or part of the steps of the above method embodiments can be completed by program instructions and related hardware. The aforementioned program can be stored in a computer-readable storage medium; when the program is executed, the steps of the above method embodiments are executed. The aforementioned storage medium includes various media that can store program code, such as a read-only memory (Read-Only Memory, ROM), a random access memory (Random Access Memory, RAM), a magnetic disk or an optical disk.
Finally, it should be noted that the above embodiments are only used to illustrate the technical solutions of the present invention, not to limit them. Although the present invention has been described in detail with reference to the foregoing embodiments, those skilled in the art should understand that it is still possible to modify the technical solutions described in the foregoing embodiments, or to make equivalent replacements for some or all of the technical features; and these modifications or replacements do not depart the essence of the corresponding technical solutions from the scope of the technical solutions of the embodiments of the present invention.
Claims (10)
1. An image recognition method, characterized by comprising:
choosing a first image group and a second image group from L images, the first image group comprising N images, the second image group comprising M images, and the images of the first image group and the images of the second image group being not exactly the same; where L is greater than or equal to 2, N is greater than or equal to 1 and less than or equal to L, and M is greater than or equal to 1 and less than or equal to L;
calculating, according to the image feature of every image in the first image group and the second image group, the image relevance between each image in the first image group and each image in the second image group;
initializing the parameters of an objective function according to the image relevance between each image in the first image group and each image in the second image group, the objective function comprising a hash function and a cluster-centre function, wherein the hash function is used to perform binary coding on image features, the cluster-centre function is used to obtain, among the L images, cluster centres with image-feature consistency, and the cluster centres comprise at least one image; the parameters of the hash function comprise a first binary-coding parameter, a second binary-coding parameter and a prediction zoom variable, and the parameters of the cluster-centre function comprise a first orthogonal projection matrix and a second orthogonal projection matrix;
iteratively updating the parameters of the objective function to obtain the iteratively updated objective function and determine the cluster centres of the L images, the number of the iteration updates being at least one;
performing binary coding on the image feature of an image to be recognized according to the hash function in the updated objective function, to obtain binary-coding data;
identifying the image to be recognized according to the binary-coding data of the image to be recognized and the binary-coding data of each image in the cluster centres of the L images.
2. The method according to claim 1, characterized in that initializing the parameters of the objective function according to the image relevance between each image in the first image group and each image in the second image group comprises:
obtaining an image relevance matrix according to the image relevance between each image in the first image group and each image in the second image group;
initializing the parameters of the objective function according to the image relevance matrix;
wherein the image relevance matrix is an N*M image relevance matrix in which the element in the i-th row and j-th column represents the image relevance between the i-th image in the first image group and the j-th image in the second image group; alternatively, the image relevance matrix is an M*N image relevance matrix in which the element in the j-th row and i-th column represents the image relevance between the j-th image in the second image group and the i-th image in the first image group;
wherein i is greater than or equal to 1 and less than or equal to N, and j is greater than or equal to 1 and less than or equal to M.
3. The method according to claim 2, characterized in that initializing the parameters of the objective function according to the image relevance matrix comprises:
obtaining the transposed matrix of the image relevance matrix;
obtaining the covariance matrix of the image relevance matrix and the covariance matrix of the transposed matrix of the image relevance matrix, respectively;
obtaining the partial feature vectors of the covariance matrix of the image relevance matrix according to the projection matrix of the image relevance matrix, and obtaining the partial feature vectors of the covariance matrix of the transposed matrix according to the projection matrix of the transposed matrix;
initializing the first binary-coding parameter according to the first orthogonal projection matrix and the partial feature vectors of the covariance matrix of the image relevance matrix, and initializing the second binary-coding parameter according to the second orthogonal projection matrix and the partial feature vectors of the covariance matrix of the transposed matrix.
4. The method according to claim 1, characterized in that iteratively updating the parameters of the objective function to obtain the iteratively updated objective function comprises:
at each iteration update, determining one parameter from the parameters of the objective function as the parameter to be updated, fixing the other parameters, and updating the parameter to be updated so that the objective function meets a preset condition;
recording the number of iterations, and when the number of iterations is greater than or equal to a preset number of iterations, stopping updating the parameters of the objective function to obtain the iteratively updated objective function.
5. The method according to claim 1, characterized in that identifying the image to be recognized according to the binary-coding data of the image to be recognized and the binary-coding data of each image in the cluster centres of the L images comprises:
calculating the Euclidean distance between the binary-coding data of the image to be recognized and the binary-coding data of each image in the cluster centres of the L images;
identifying the image to be recognized according to the Euclidean distance.
6. The method according to any one of claims 1-5, characterized in that the method further comprises:
dividing the image into at least two local blocks;
obtaining the image feature of each local block;
merging the image features of the at least two local blocks to obtain the image feature of the image.
7. The method according to claim 6, wherein the obtaining the image feature of each local block comprises:
obtaining the image feature of each local block using a histogram of oriented gradients (HOG) algorithm.
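Claims 6 and 7 together describe block-wise HOG feature extraction followed by concatenation. A minimal numpy stand-in for a full HOG implementation (block count, bin count, and per-block normalization are illustrative choices, not values fixed by the patent):

```python
import numpy as np

def block_hog_feature(img, blocks=2, bins=9):
    """Divide the image into blocks x blocks local blocks, compute a
    gradient-orientation histogram (magnitude-weighted) per block, and
    merge the per-block histograms into one concatenated image feature."""
    gy, gx = np.gradient(img.astype(float))          # gradients along rows, columns
    mag = np.hypot(gx, gy)
    ang = np.mod(np.arctan2(gy, gx), np.pi)          # unsigned orientation in [0, pi)
    h, w = img.shape
    feats = []
    for bi in range(blocks):
        for bj in range(blocks):
            sl = (slice(bi * h // blocks, (bi + 1) * h // blocks),
                  slice(bj * w // blocks, (bj + 1) * w // blocks))
            hist, _ = np.histogram(ang[sl], bins=bins, range=(0, np.pi),
                                   weights=mag[sl])
            feats.append(hist / (hist.sum() + 1e-12))  # normalize each block
    return np.concatenate(feats)                      # merged image feature
```

A library HOG (e.g. with cell/block normalization and sub-cell interpolation) would replace this in practice; the sketch only shows the divide/describe/merge pipeline of the claims.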
8. An image recognition apparatus, comprising:
a selection module, configured to select a first image group and a second image group from L images, the first image group comprising N images, the second image group comprising M images, and the images of the first image group being different from the images of the second image group; wherein L is greater than or equal to 2, N is greater than or equal to 1 and less than or equal to L, and M is greater than or equal to 1 and less than or equal to L;
an association module, configured to calculate the image relevance between each image in the first image group and each image in the second image group according to the image feature of each image in the first image group and the second image group;
an initialization module, configured to initialize parameters of an objective function according to the image relevance between each image in the first image group and each image in the second image group, the objective function comprising a hash function and a cluster-center function, wherein the hash function is configured to binary-code image features, the cluster-center function is configured to obtain, from the L images, cluster centers having image-feature consistency, and each cluster center comprises at least one image; the parameters of the hash function comprise a first binary-coding parameter, a second binary-coding parameter and a prediction scaling variable, and the parameters of the cluster-center function comprise a first orthogonal projection matrix and a second orthogonal projection matrix;
an update module, configured to iteratively update the parameters of the objective function to obtain the iteratively updated objective function and determine the cluster centers of the L images, the number of iterative updates being at least one;
a coding module, configured to binary-code the image feature of an image to be recognized according to the hash function in the updated objective function to obtain binary-coding data;
an identification module, configured to identify the image to be recognized according to the binary-coding data of the image to be recognized and the binary-coding data of each image in the cluster centers of the L images.
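The apparatus claim maps each method step onto a module. A code-level skeleton of that wiring, with deliberately simplified placeholder bodies (the mean-threshold hash and the module method names are hypothetical; only the module decomposition follows the claim):

```python
from dataclasses import dataclass
import numpy as np

@dataclass
class ImageRecognizer:
    """One method per claimed module; bodies are illustrative stand-ins."""
    codes: np.ndarray = None                     # binary codes of cluster-center images

    def choose_groups(self, images, n, m):       # selection module
        return images[:n], images[n:n + m]

    def encode(self, feature):                   # coding module (sign-style hash sketch)
        return (feature > feature.mean()).astype(float)

    def identify(self, feature):                 # identification module
        code = self.encode(feature)
        dists = np.linalg.norm(self.codes - code, axis=1)
        return int(np.argmin(dists))
```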
9. An image recognition apparatus, comprising a memory and a processor, wherein the memory is configured to store program instructions, and the processor is configured to call the program instructions in the memory to execute the image recognition method according to any one of claims 1-7.
10. A readable storage medium, wherein a computer program is stored on the readable storage medium; when the computer program is executed, the image recognition method according to any one of claims 1-7 is implemented.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201910286523.XA CN110399897B (en) | 2019-04-10 | 2019-04-10 | Image recognition method and device |
Publications (2)
Publication Number | Publication Date |
---|---|
CN110399897A true CN110399897A (en) | 2019-11-01 |
CN110399897B CN110399897B (en) | 2021-11-02 |
Family
ID=68322286
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201910286523.XA Active CN110399897B (en) | 2019-04-10 | 2019-04-10 | Image recognition method and device |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN110399897B (en) |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN105069457A (en) * | 2015-07-15 | 2015-11-18 | 杭州朗和科技有限公司 | Image identification method and device |
CN105930834A (en) * | 2016-07-01 | 2016-09-07 | 北京邮电大学 | Face identification method and apparatus based on spherical hashing binary coding |
CN107341178A (en) * | 2017-05-24 | 2017-11-10 | 北京航空航天大学 | A kind of adaptive binary quantization Hash coding method and device |
US20180039861A1 (en) * | 2016-08-03 | 2018-02-08 | Canon Kabushiki Kaisha | Information processing apparatus, information processing method, and storage medium |
US20180293461A1 (en) * | 2015-10-12 | 2018-10-11 | Commissariat A L'energie Atomique Et Aux Energies Alternatives | Method and device for detecting copies in a stream of visual data |
Non-Patent Citations (2)
Title |
---|
CHUNXIAO FAN et al.: "Sparse projections matrix binary descriptors for face recognition", Neurocomputing * |
ZHANG YUNCHAO et al.: "A Fast Massive Image Retrieval Method Fusing Gravity Information", Acta Automatica Sinica * |
Also Published As
Publication number | Publication date |
---|---|
CN110399897B (en) | 2021-11-02 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN108875904A (en) | Image processing method, image processing apparatus and computer readable storage medium | |
CN111476719B (en) | Image processing method, device, computer equipment and storage medium | |
CN110309856A (en) | Image classification method, the training method of neural network and device | |
KR20200088475A (en) | Simultaneous training of functional networks of neural networks | |
CN109791625A (en) | Face recognition is carried out using artificial neural network | |
CN111860398B (en) | Remote sensing image target detection method and system and terminal equipment | |
CN108229591A (en) | Neural network adaptive training method and apparatus, equipment, program and storage medium | |
CN111754596A (en) | Editing model generation method, editing model generation device, editing method, editing device, editing equipment and editing medium | |
JP7457125B2 (en) | Translation methods, devices, electronic equipment and computer programs | |
CN106855952B (en) | Neural network-based computing method and device | |
KR20180060257A (en) | Metohd and apparatus for object recognition | |
Liu et al. | SparseNet: A sparse DenseNet for image classification | |
CN111914908B (en) | Image recognition model training method, image recognition method and related equipment | |
EP3754503A1 (en) | Allocation system, method and apparatus for machine learning, and computer device | |
CN109523546A (en) | A kind of method and device of Lung neoplasm analysis | |
CN115186821A (en) | Core particle-oriented neural network inference overhead estimation method and device and electronic equipment | |
CN103177414A (en) | Structure-based dependency graph node similarity concurrent computation method | |
CN109685805A (en) | A kind of image partition method and device | |
CN114419378B (en) | Image classification method and device, electronic equipment and medium | |
CN110648309A (en) | Method for generating erythrocyte image complexed by antithetical net based on conditions and related equipment | |
CN114782686A (en) | Image segmentation method and device, terminal equipment and storage medium | |
CN109711315A (en) | A kind of method and device of Lung neoplasm analysis | |
CN110399897A (en) | Image-recognizing method and device | |
CN110502975A (en) | A kind of batch processing system that pedestrian identifies again | |
CN116543257A (en) | Training method and device for target detection model, computer equipment and storage medium |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||
GR01 | Patent grant |