CN108615052A - Image recognition method for the case without same-class training samples - Google Patents

Image recognition method for the case without same-class training samples

Info

Publication number
CN108615052A
CN108615052A (application CN201810335966.9A)
Authority
CN
China
Prior art keywords
attribute
training sample
sample
known class
tested
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201810335966.9A
Other languages
Chinese (zh)
Inventor
吴松松
王堃
孙广成
荆晓远
岳东
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nanjing Post and Telecommunication University
Nanjing University of Posts and Telecommunications
Original Assignee
Nanjing Post and Telecommunication University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nanjing Post and Telecommunication University
Priority to CN201810335966.9A
Publication of CN108615052A
Legal status: Pending (current)

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 Pattern recognition
    • G06F18/20 Analysing
    • G06F18/21 Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
    • G06F18/214 Generating training patterns; Bootstrap methods, e.g. bagging or boosting
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 Pattern recognition
    • G06F18/20 Analysing
    • G06F18/24 Classification techniques
    • G06F18/241 Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches
    • G06F18/2413 Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches, based on distances to training or reference patterns
    • G06F18/24133 Distances to prototypes
    • G06F18/24143 Distances to neighbourhood prototypes, e.g. restricted Coulomb energy networks [RCEN]

Landscapes

  • Engineering & Computer Science (AREA)
  • Data Mining & Analysis (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Artificial Intelligence (AREA)
  • Evolutionary Biology (AREA)
  • Evolutionary Computation (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Image Analysis (AREA)

Abstract

The invention discloses an image recognition method for the case without same-class training samples. The method effectively reduces the influence of the semantic shift problem and of the attribute noise in the known-class training samples during image recognition without same-class training samples. First, we use the normal distribution to virtualize the real attributes of the known-class training samples, thereby reducing the influence of the attribute noise problem. Second, based on the features of the known-class training samples and their virtual real attributes, we learn an attribute predictor with an encoder-decoder model, which effectively reduces the influence of the semantic shift problem. Finally, the labels of the unknown-class test samples are obtained with a nearest-neighbor classifier. Compared with other existing methods, our method achieves significant improvements in both recognition rate and recognition speed.

Description

Image recognition method for the case without same-class training samples
Technical field
The present invention relates to the field of image recognition in pattern recognition, and more particularly to an image recognition method for the case without same-class training samples.
Background technology
In the area of pattern recognition, image recognition is an important branch whose purpose is to let a trained computer extract information from test images and identify them. Image recognition is studied intensively and develops rapidly because of its wide and important application value. Image recognition technology is increasingly mature and is widely used in the recognition of faces, digits and other objects. A large number of same-class training images are needed to obtain a good recognition rate. Providing a large number of labeled same-class training samples is costly, and it may happen that no same-class training samples can be collected for a certain class of images. Therefore, how to solve the problem of image recognition without same-class training samples is particularly important, and this problem is receiving increasing attention from industry.
Most conventional methods for image recognition without same-class training samples are built on attributes. Attributes are defined manually to describe class property information. Direct attribute prediction (DAP) and indirect attribute prediction (IAP) are the two most basic attribute-based methods for image recognition without same-class training samples. DAP uses a support vector machine (SVM) to learn an attribute predictor from the known-class training samples, applies this attribute predictor directly to the unknown-class test samples to obtain their attribute representations, and finally classifies them with a nearest-neighbor classifier. Since direct attribute prediction is based on attributes, the attribute classifiers must be retrained whenever the attributes change, so the direct attribute prediction method lacks flexibility. IAP uses a support vector machine to learn a class classifier on the known-class training samples, applies it to the unknown classes to obtain class probability outputs, and finally uses a Bayesian method to obtain the class labels of the unknown-class images. All of these attribute-based methods for image recognition without same-class training samples encounter the semantic shift problem. Specifically, the attribute predictor learned from the known-class training samples may fit those samples, but since the known-class training samples and the unknown-class test samples have no intersection, the learned attribute predictor applied directly to the unknown classes to be tested is not necessarily suitable for all unknown classes; the real attributes of all unknown-class test samples therefore cannot be obtained, which produces the semantic shift problem and causes the final recognition rate to drop. How to learn an efficient attribute predictor that effectively reduces the influence of the semantic shift problem is an important topic in image recognition research. However, while considering how to learn an efficient attribute predictor, a key element, the attributes themselves, is often ignored: attributes are all defined manually, so there is certainly error, i.e. noise, between them and their real attribute values, and the attribute description given for the known-class training samples certainly deviates from their real attributes. How to effectively reduce the influence of the training sample attribute noise is also worth considering and studying.
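For orientation, the DAP baseline described above can be sketched roughly as follows. This is a minimal illustration assuming binary attribute labels and one linear SVM per attribute via scikit-learn's SVC; the function name, data layout and the distance-based assignment step are illustrative assumptions rather than a definitive implementation of the prior-art method.

```python
import numpy as np
from sklearn.svm import SVC

def dap_predict(X_train, A_train, X_test, unknown_prototypes):
    """Direct Attribute Prediction (DAP) baseline, sketched with one linear SVM per attribute.

    X_train: (Ns, d) known-class training features
    A_train: (Ns, k) binary attribute labels of the training samples
             (assumes every attribute column contains both 0 and 1 values)
    X_test:  (Nt, d) unknown-class test features
    unknown_prototypes: (Ct, k) prototype attribute vectors of the unknown classes
    Returns (Nt,) predicted unknown-class indices.
    """
    k = A_train.shape[1]
    A_pred = np.zeros((X_test.shape[0], k))
    for j in range(k):                       # one attribute classifier per attribute dimension
        clf = SVC(kernel="linear")
        clf.fit(X_train, A_train[:, j])
        A_pred[:, j] = clf.predict(X_test)
    # Nearest-neighbour step: assign each sample to the class with the closest prototype attribute.
    dists = np.linalg.norm(A_pred[:, None, :] - unknown_prototypes[None, :, :], axis=2)
    return dists.argmin(axis=1)
```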
Summary of the invention
Object of the invention: The present invention provides an image recognition method for the case without same-class training samples that effectively reduces the influence of the semantic shift problem and of the training sample attribute noise problem.
Technical solution: An image recognition method for the case without same-class training samples comprises the following steps:
Step 1): for the attribute noise contained in the given known-class training samples, the normal distribution is used to virtualize the real attributes of the known-class training samples, so as to reduce the influence of the attribute noise problem;
Here, the given known-class training samples contain Cs classes with a total of Ns images, and the given known-class training sample set is represented as follows:
The feature representation of the given known-class training samples is as follows:
In the above formula, d is the sample feature dimension;
The attributes of the given known-class training samples are represented as follows:
In the above formula, k is the sample attribute dimension;
The prototype attributes of all the known classes contained in the given known-class training sample set are represented as follows:
According to the attributes of the given known-class training samples, the real attributes of the known-class training samples virtualized with the normal distribution are represented as follows:
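The intent of step 1) can be sketched as follows, assuming the virtual real attribute of each training sample is drawn from a normal distribution centred on its class prototype attribute; the standard deviation sigma, the random seed and the function name virtualize_attributes are illustrative assumptions, since the concrete virtualization formula is not reproduced above.

```python
import numpy as np

def virtualize_attributes(labels, prototypes, sigma=0.1, seed=0):
    """Draw a virtual 'real' attribute vector for each known-class training sample.

    labels:     (Ns,) class index of each training sample, values in 0..Cs-1
    prototypes: (Cs, k) prototype attribute vector of each known class
    sigma:      assumed standard deviation of the attribute noise (not specified in the patent)
    Returns Ys of shape (k, Ns), one virtual attribute column per training sample.
    """
    rng = np.random.default_rng(seed)
    means = prototypes[labels]                # (Ns, k) prototype of each sample's own class
    Ys = rng.normal(loc=means, scale=sigma)   # perturb each prototype with normal noise
    return Ys.T                               # columns are samples, matching W Xs = Ys later
```

Treating each sample's attribute as a noisy draw around its class prototype, rather than as the prototype itself, is what lets the later optimization correct the hand-labelled attribute descriptions.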
Step 2): for the semantic shift problem that exists in image recognition without same-class training samples, according to the real attributes of the given known-class training samples computed in step 1), an encoder is learned that maps the features of the known-class training samples to their real attributes, and then a decoder is learned that maps the real attributes of the known-class training samples back to the feature space; an attribute predictor learned with this encoder-decoder model is used to predict the attributes of the unknown-class test samples. The detailed sub-steps of step 2) are as follows:
Step 2.1): learn an encoder W ∈ Rk×d from the known-class training sample features Xs to their virtual real attributes Ys, i.e. WXs = Ys, and then learn a decoder W* that maps Ys back to the feature space, obtaining a new feature representation of the known-class training samples X′ = W*Ys = W*WXs. To make the error between X′ and Xs as small as possible, the following objective function is written:
where ‖·‖F is the Frobenius norm (F-norm). Formula (6) contains two variables; for ease of simplification it is assumed that W* = Wᵀ, so formula (6) can be rewritten as:
where λ is an over-fitting control coefficient. Combining with step 1), we write the final objective function:
where the virtual real attribute of the i-th known-class training sample, i ∈ {1, 2, ..., Ns}, and the prototype attribute of the j-th known class, j ∈ {1, 2, ..., Cs}, appear in formula (8); mi,j = 1 if the i-th sample belongs to the j-th known class, and mi,j = 0 otherwise; λ1 is an over-fitting control coefficient. Formula (8) involves two variables, so it is decomposed into two optimization problems, formulas (9) and (10), which are solved separately:
For formula (9), taking the derivative and setting it to zero gives:
W = sylvester(A, B, C) (11)
Formula (11) is a Sylvester equation, where
Similarly, for formula (10), taking the derivative and setting it to zero gives:
where the identity matrix involved is of order Ns;
Step 2.2): with the separately optimized expressions for Ys and W found above, we next use the following iterative formula to find the optimal Ys:
where α = 0.01 and Yi is the virtual attribute representation of the known-class training samples at the i-th iteration. After the optimal Ys = Yi is determined, the optimal encoder, i.e. the attribute predictor W, is obtained from formula (11).
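To make the flow of step 2) concrete, the following is a minimal sketch of the alternating optimization under stated assumptions. Since the matrices A, B, C of formula (11) and the exact iterative formula for Ys are not reproduced above, the W-step below uses the standard semantic-autoencoder Sylvester system (Y Yᵀ)W + W(λ X Xᵀ) = (1+λ)Y Xᵀ solved with scipy.linalg.solve_sylvester, and the Ys-step is a plain gradient step with α = 0.01 on the reconstruction, coding and prototype-closeness terms; the function name, iteration count and hyper-parameters are illustrative.

```python
import numpy as np
from scipy.linalg import solve_sylvester

def learn_attribute_predictor(X, Y0, P, labels, lam=1.0, lam1=1.0, alpha=0.01, n_iter=50):
    """Alternating-optimization sketch for the encoder / attribute predictor W.

    X:      (d, Ns) known-class training features, one column per sample
    Y0:     (k, Ns) initial virtual real attributes from step 1)
    P:      (k, Cs) prototype attributes of the known classes
    labels: (Ns,) class index of each training sample
    Returns W of shape (k, d) such that W @ X approximates Y.
    """
    Y = Y0.copy()
    proto = P[:, labels]                                # (k, Ns) prototype of each sample's class
    W = None
    for _ in range(n_iter):
        # W-step: solve (Y Y^T) W + W (lam X X^T) = (1 + lam) Y X^T, a Sylvester equation.
        W = solve_sylvester(Y @ Y.T, lam * (X @ X.T), (1.0 + lam) * (Y @ X.T))
        # Y-step: one gradient step on the reconstruction, coding and prototype-closeness terms.
        grad = -2.0 * W @ (X - W.T @ Y) - 2.0 * lam * (W @ X - Y) + 2.0 * lam1 * (Y - proto)
        Y = Y - alpha * grad
    return W
```

Tying the decoder to the encoder transpose (W* = Wᵀ) keeps only one matrix to solve for, which is why the W-step reduces to a single Sylvester equation.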
Step 3): the attribute predictor obtained in step 2) is used to predict the attributes of the unknown-class test samples, and the predicted attributes of the unknown-class test samples are compared, via a nearest-neighbor classifier, with the prototype attributes of the unknown classes to obtain the labels of the unknown-class test samples. The detailed sub-steps of step 3) are as follows:
Step 3.1): Ct classes of unknown-class test samples are chosen as model test samples; their prototype attributes are known and are represented as follows:
Step 3.2): the attribute predictor obtained in step 2) is used to predict the attributes of the Ct classes of test samples; finally, the predicted attributes of the test samples are compared with the given prototype attributes of the Ct test classes via a nearest-neighbor classifier to obtain the labels of all the unknown-class samples, where the following nearest-neighbor classifier model is used:
where Cj is the label j of the k-th unknown-class test sample, k ∈ {1, 2, ..., t}; the other two quantities in the model are the attribute of the k-th unknown-class test sample and the prototype attribute of the j-th unknown class, respectively.
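A minimal sketch of step 3), assuming Euclidean distance in the nearest-neighbour rule above; the function name classify_unknown is illustrative.

```python
import numpy as np

def classify_unknown(W, X_test, unknown_prototypes):
    """Predict labels of unknown-class test samples with the learned attribute predictor W.

    W:                  (k, d) attribute predictor from step 2)
    X_test:             (d, Nt) features of the unknown-class test samples, one column per sample
    unknown_prototypes: (k, Ct) prototype attributes of the Ct unknown classes
    Returns (Nt,) predicted unknown-class indices.
    """
    A_pred = W @ X_test                                           # (k, Nt) predicted attributes
    # Nearest-neighbour rule: each sample goes to the unknown class with the closest prototype.
    diffs = A_pred[:, :, None] - unknown_prototypes[:, None, :]   # (k, Nt, Ct)
    dists = np.linalg.norm(diffs, axis=0)                         # (Nt, Ct)
    return dists.argmin(axis=1)
```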
Advantageous effects: 1. We use the normal distribution to virtualize the real attributes of the known-class training samples, which effectively reduces the influence of the attribute noise problem of the training set database.
2. We learn an attribute predictor with an encoder-decoder model, which effectively reduces the influence of the semantic shift problem in image recognition without same-class training samples.
3. Compared with existing methods, the proposed method is significantly improved in both recognition rate and recognition speed.
Description of the drawings
Fig. 1 is the work flow diagram of the present invention.
Specific implementation mode
An image recognition method for the case without same-class training samples comprises the following steps:
Step 1): for the attribute noise contained in the given known-class training samples, the normal distribution is used to virtualize the real attributes of the known-class training samples, so as to reduce the influence of the attribute noise problem;
Here, the given known-class training samples contain Cs classes with a total of Ns images, and the given known-class training sample set is represented as follows:
The feature representation of the given known-class training samples is as follows:
In the above formula, d is the sample feature dimension;
The attributes of the given known-class training samples are represented as follows:
In the above formula, k is the sample attribute dimension;
The prototype attributes of all the known classes contained in the given known-class training sample set are represented as follows:
In traditional pattern recognition, samples belonging to the same category follow a normal distribution in the attribute space; therefore, we use the normal distribution to virtualize the real attributes of the known-class training samples. According to the attributes of the known-class training samples given in formula (3), the real attributes of the known-class training samples virtualized with the normal distribution are represented as follows:
Step 2): for the semantic shift problem that exists in image recognition without same-class training samples, according to the real attributes of the given known-class training samples computed in step 1), an encoder is learned that maps the features of the known-class training samples to their real attributes, and then a decoder is learned that maps the real attributes of the known-class training samples back to the feature space; an attribute predictor learned with this encoder-decoder model is used to predict the attributes of the unknown-class test samples. The detailed sub-steps of step 2) are as follows:
Step 2.1): learn an encoder W ∈ Rk×d from the known-class training sample features Xs to their virtual real attributes Ys, i.e. WXs = Ys, and then learn a decoder W* that maps Ys back to the feature space, obtaining a new feature representation of the known-class training samples X′ = W*Ys = W*WXs. To make the error between X′ and Xs as small as possible, the following objective function is written:
where ‖·‖F is the Frobenius norm (F-norm). Formula (6) contains two variables; for ease of simplification it is assumed that W* = Wᵀ, so formula (6) can be rewritten as:
where λ is an over-fitting control coefficient. Combining with step 1), we write the final objective function:
where the virtual real attribute of the i-th known-class training sample, i ∈ {1, 2, ..., Ns}, and the prototype attribute of the j-th known class, j ∈ {1, 2, ..., Cs}, appear in formula (8); mi,j = 1 if the i-th sample belongs to the j-th known class, and mi,j = 0 otherwise; λ1 is an over-fitting control coefficient. Formula (8) involves two variables, so it is decomposed into two optimization problems, formulas (9) and (10), which are solved separately:
For formula (9), taking the derivative and setting it to zero gives:
W = sylvester(A, B, C) (11)
Formula (11) is a Sylvester equation, where
Similarly, for formula (10), taking the derivative and setting it to zero gives:
where the identity matrix involved is of order Ns;
Step 2.2): with the separately optimized expressions for Ys and W found above, we next use the following iterative formula to find the optimal Ys:
where α = 0.01 and Yi is the virtual attribute representation of the known-class training samples at the i-th iteration. After the optimal Ys = Yi is determined, the optimal encoder, i.e. the attribute predictor W, is obtained from formula (11).
Step 3): Ct classes of unknown-class test samples are chosen as test samples; their prototype attributes and class labels are known, and their features can be extracted directly from the samples. The attribute predictor W obtained in step 2) is applied to the features of the Ct classes of unknown-class test samples to predict their attributes, and the predicted attributes of the unknown-class test samples are compared, via a nearest-neighbor classifier, with the prototype attributes of the unknown classes to obtain the class labels of the unknown-class test samples, from which the accuracy of the attribute predictor W is judged. The detailed sub-steps of step 3) are as follows:
Step 3.1): Ct classes of unknown-class test samples are chosen as model test samples; their prototype attributes are known and are represented as follows:
Step 3.2): the attribute predictor obtained in step 2) is used to predict the attributes of the Ct classes of test samples; finally, the predicted attributes of the test samples are compared with the given prototype attributes of the Ct test classes via a nearest-neighbor classifier to obtain the labels of all the unknown-class samples, where the following nearest-neighbor classifier model is used:
where Cj is the label j of the k-th unknown-class test sample, k ∈ {1, 2, ..., t}; the other two quantities in the model are the attribute of the k-th unknown-class test sample and the prototype attribute of the j-th unknown class, respectively.
The image recognition method for the case without same-class training samples of the present invention was tested on the AWA and CUB databases.
The AWA database contains 50 classes of animals with a total of 30,475 images, and each class of images corresponds to an 85-dimensional attribute vector. In the AWA database, we take 40 classes as known-class training samples and the remaining 10 classes as unknown-class test samples. The CUB database contains 200 different species of birds with a total of 11,788 images, and each image corresponds to a 312-dimensional attribute vector. In the CUB database, we take 150 classes as known-class training samples and the remaining 50 classes as unknown-class test samples. The recognition rates of the various methods tested on the two databases are as follows:
Table 1: Comparison of the recognition rates of various methods on the AWA and CUB databases
Method AWA database CUB database
SAE 81.4% 57.4%
RKT 66.2% 38.4%
SMS 74.8% 43.6%
RZSL 65.6% 31.4%
Our Method 82.6% 58.1%
Table 2: Comparison of the recognition speed of each method on the AWA database
Method Training stage Testing stage
RKT 49.71s 6.65s
RZSL 71.13s 15.85s
SMS 59.63s 11.65s
Our Method 13.78s 0.12s
From the results in Table 1 and Table 2, it can be seen that, on the same databases, the method proposed by the present invention significantly outperforms the other methods in both recognition rate and recognition speed.
Those skilled in the art will understand that, unless otherwise defined, all terms used herein (including technical and scientific terms) have the same meaning as commonly understood by one of ordinary skill in the art to which this invention belongs. It should also be understood that terms such as those defined in common dictionaries should be interpreted as having a meaning consistent with their meaning in the context of the prior art and, unless defined as herein, will not be interpreted in an idealized or overly formal sense.
The specific embodiments described above further describe the purpose, technical solution and advantageous effects of the present invention in detail. It should be understood that the foregoing is only a specific embodiment of the present invention and is not intended to limit the present invention; any modifications, equivalent substitutions, improvements and the like made within the spirit and principles of the present invention shall fall within the scope of protection of the present invention.

Claims (4)

1. An image recognition method for the case without same-class training samples, characterized in that it comprises the following steps:
Step 1): for the attribute noise contained in the given known-class training samples, the normal distribution is used to virtualize the real attributes of the known-class training samples;
Step 2): for the semantic shift problem that exists in image recognition without same-class training samples, according to the real attributes of the given known-class training samples computed in step 1), an encoder is learned that maps the features of the known-class training samples to their real attributes, and then a decoder is learned that maps the real attributes of the known-class training samples back to the feature space; an attribute predictor learned with this encoder-decoder model is used to predict the attributes of the unknown-class test samples;
Step 3): the attribute predictor obtained in step 2) is used to predict the attributes of the unknown-class test samples, and the predicted attributes of the unknown-class test samples are compared, via a nearest-neighbor classifier, with the prototype attributes of the unknown classes, so as to obtain the labels of the unknown-class test samples.
2. The image recognition method for the case without same-class training samples according to claim 1, characterized in that the given known-class training samples in step 1) contain Cs classes with a total of Ns images, and the given known-class training sample set is represented as follows:
The feature representation of the given known-class training samples is as follows:
In the above formula, d is the sample feature dimension;
The attributes of the given known-class training samples are represented as follows:
In the above formula, k is the sample attribute dimension;
The prototype attributes of all the known classes contained in the given known-class training sample set are represented as follows:
According to the attributes of the given known-class training samples, the real attributes of the known-class training samples virtualized with the normal distribution are represented as follows:
3. The image recognition method for the case without same-class training samples according to claim 2, characterized in that the detailed sub-steps of step 2) are as follows:
Step 2.1): learn an encoder W ∈ Rk×d from the known-class training sample features Xs to their virtual real attributes Ys, i.e. WXs = Ys, and then learn a decoder W* that maps Ys back to the feature space, obtaining a new feature representation of the known-class training samples X′ = W*Ys = W*WXs; to make the error between X′ and Xs as small as possible, the following objective function is written:
where ‖·‖F is the Frobenius norm (F-norm); formula (6) contains two variables, and for ease of simplification it is assumed that W* = Wᵀ, so formula (6) can be rewritten as:
where λ is an over-fitting control coefficient; combining with step 1), we write the final objective function:
where the virtual real attribute of the i-th known-class training sample, i ∈ {1, 2, ..., Ns}, and the prototype attribute of the j-th known class, j ∈ {1, 2, ..., Cs}, appear in formula (8); mi,j = 1 if the i-th sample belongs to the j-th known class, and mi,j = 0 otherwise; λ1 is an over-fitting control coefficient; formula (8) involves two variables, so it is decomposed into two optimization problems, formulas (9) and (10), which are solved separately:
For formula (9), taking the derivative and setting it to zero gives:
W = sylvester(A, B, C) (11)
Formula (11) is a Sylvester equation, where
Similarly, for formula (10), taking the derivative and setting it to zero gives:
where the identity matrix involved is of order Ns;
Step 2.2): with the separately optimized expressions for Ys and W found above, we next use the following iterative formula to find the optimal Ys:
where α = 0.01 and Yi is the virtual attribute representation of the known-class training samples at the i-th iteration; after the optimal Ys = Yi is determined, the optimal encoder, i.e. the attribute predictor W, is obtained from formula (11).
4. The image recognition method for the case without same-class training samples according to claim 3, characterized in that the detailed sub-steps of step 3) are as follows:
Step 3.1): Ct classes of unknown-class test samples are chosen as model test samples; their prototype attributes are known and are represented as follows:
Step 3.2): the attribute predictor W obtained in step 2) is used to predict the attributes of the Ct classes of test samples; finally, the predicted attributes of the test samples are compared with the given prototype attributes of the Ct test classes via a nearest-neighbor classifier to obtain the labels of all the unknown-class samples, where the following nearest-neighbor classifier model is used:
where Cj is the label j of the k-th unknown-class test sample, k ∈ {1, 2, ..., t}; the other two quantities in the model are the attribute of the k-th unknown-class test sample and the prototype attribute of the j-th unknown class, respectively.
CN201810335966.9A 2018-04-13 2018-04-13 Image recognition method for the case without same-class training samples Pending CN108615052A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810335966.9A CN108615052A (en) 2018-04-13 2018-04-13 Image recognition method for the case without same-class training samples

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201810335966.9A CN108615052A (en) 2018-04-13 2018-04-13 Image recognition method for the case without same-class training samples

Publications (1)

Publication Number Publication Date
CN108615052A true CN108615052A (en) 2018-10-02

Family

ID=63660071

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810335966.9A Pending CN108615052A (en) Image recognition method for the case without same-class training samples

Country Status (1)

Country Link
CN (1) CN108615052A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110909643A (en) * 2019-11-14 2020-03-24 北京航空航天大学 Remote sensing ship image small sample classification method based on nearest neighbor prototype representation
CN113409821A (en) * 2021-05-27 2021-09-17 南京邮电大学 Method for recognizing unknown emotional state of voice signal

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20160292538A1 (en) * 2015-03-31 2016-10-06 Disney Enterprises, Inc. Object Classification Through Semantic Mapping
CN106980875A (en) * 2017-03-13 2017-07-25 南京邮电大学 The zero sample image recognition methods based on attribute low-rank representation
CN107563444A (en) * 2017-09-05 2018-01-09 浙江大学 A kind of zero sample image sorting technique and system

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20160292538A1 (en) * 2015-03-31 2016-10-06 Disney Enterprises, Inc. Object Classification Through Semantic Mapping
CN106980875A (en) * 2017-03-13 2017-07-25 南京邮电大学 The zero sample image recognition methods based on attribute low-rank representation
CN107563444A (en) * 2017-09-05 2018-01-09 浙江大学 A kind of zero sample image sorting technique and system

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
ELYOR KODIROV et al.: "Semantic Autoencoder for Zero-Shot Learning", CVPR *
KUN WANG et al.: "Learning Autoencoder of Attribute Constraint for Zero-Shot Classification", 2017 4th IAPR Asian Conference on Pattern Recognition *
冀中 et al.: "Zero-shot learning based on canonical correlation analysis and distance metric learning", Journal of Tianjin University (Science and Technology) *

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110909643A (en) * 2019-11-14 2020-03-24 北京航空航天大学 Remote sensing ship image small sample classification method based on nearest neighbor prototype representation
CN110909643B (en) * 2019-11-14 2022-10-28 北京航空航天大学 Remote sensing ship image small sample classification method based on nearest neighbor prototype representation
CN113409821A (en) * 2021-05-27 2021-09-17 南京邮电大学 Method for recognizing unknown emotional state of voice signal

Similar Documents

Publication Publication Date Title
Khalil et al. Energy efficiency prediction using artificial neural network
US12061966B2 (en) Relevance score assignment for artificial neural networks
CN109766557B (en) Emotion analysis method and device, storage medium and terminal equipment
CN114912612A (en) Bird identification method and device, computer equipment and storage medium
CN106980876A (en) A kind of zero sample image recognition methods learnt based on distinctive sample attribute
Wang et al. Research on maize disease recognition method based on improved resnet50
CN116310519A (en) Surface defect classification method for semi-supervised deep learning
CN111274396B (en) Visual angle level text emotion classification method and system based on external knowledge
CN116432184A (en) Malicious software detection method based on semantic analysis and bidirectional coding characterization
CN116186250A (en) Multi-mode learning level mining method, system and medium under small sample condition
Rodzin et al. Deep learning techniques for natural language processing
CN108615052A (en) Image recognition method for the case without same-class training samples
CN113779249B (en) Cross-domain text emotion classification method and device, storage medium and electronic equipment
CN109034182A (en) A kind of zero sample image identification new method based on attribute constraint
Vargas et al. Relu-based activations: Analysis and experimental study for deep learning
Makwe et al. An empirical study of neural network hyperparameters
US20230121404A1 (en) Searching for normalization-activation layer architectures
CN112489689B (en) Cross-database voice emotion recognition method and device based on multi-scale difference countermeasure
CN110222737A (en) A kind of search engine user satisfaction assessment method based on long memory network in short-term
CN116975743A (en) Industry information classification method, device, computer equipment and storage medium
CN111598580A (en) XGboost algorithm-based block chain product detection method, system and device
CN106095811A (en) A kind of image search method of the discrete Hash of supervision based on optimum code
CN110851600A (en) Text data processing method and device based on deep learning
Horrace et al. Lasso for stochastic frontier models with many efficient firms
CN117009621A (en) Information searching method, device, electronic equipment, storage medium and program product

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20181002

RJ01 Rejection of invention patent application after publication