CN108460405A - Image steganalysis ensemble classifier optimization method based on deep reinforcement learning - Google Patents

Image steganalysis ensemble classifier optimization method based on deep reinforcement learning

Info

Publication number
CN108460405A
CN108460405A
Authority
CN
China
Prior art keywords
classifier
ensemble classifier
accuracy
optimization method
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201810104070.XA
Other languages
Chinese (zh)
Inventor
冯国瑞
胡丹丹
钟凯
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
University of Shanghai for Science and Technology
Original Assignee
University of Shanghai for Science and Technology
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by University of Shanghai for Science and Technology filed Critical University of Shanghai for Science and Technology
Priority to CN201810104070.XA priority Critical patent/CN108460405A/en
Publication of CN108460405A publication Critical patent/CN108460405A/en
Pending legal-status Critical Current

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 18/00 Pattern recognition
    • G06F 18/20 Analysing
    • G06F 18/24 Classification techniques
    • G06F 18/243 Classification techniques relating to the number of classes
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 18/00 Pattern recognition
    • G06F 18/20 Analysing
    • G06F 18/21 Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
    • G06F 18/217 Validation; Performance evaluation; Active pattern learning techniques

Landscapes

  • Engineering & Computer Science (AREA)
  • Data Mining & Analysis (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Artificial Intelligence (AREA)
  • Evolutionary Biology (AREA)
  • Evolutionary Computation (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Image Analysis (AREA)

Abstract

The present invention relates to an image steganalysis ensemble classifier optimization method based on deep reinforcement learning. The concrete operation steps are as follows: (1) select several base classifiers under a given feature; (2) generate several strong ensemble classifiers using the Bagging ensemble method; (3) fix the number of generated sub-classifiers and, keeping the number of subspaces the same, repeatedly generate different data sets; (4) build a deep reinforcement learning model (DQN), train it on the data sets from step (3), and screen out a classifier set with fewer members whose accuracy is unchanged or better; (5) input the steganographic data to be discriminated into the model and compute the accuracy and the number of classifiers of the optimized ensemble classifier. The present invention can effectively optimize the accuracy and sub-classifier count of an ensemble classifier, is suited to decisions made by a large number of combined classifiers, and achieves better classification performance.

Description

Image steganalysis ensemble classifier optimization method based on deep reinforcement learning
Technical field
The present invention relates to an image steganalysis ensemble classifier optimization method based on deep reinforcement learning.
Background art
Steganography embeds secret information into digital media to carry out covert communication, while steganalysis detects the presence of secretly hidden data in digital media. In most steganalysis methods, sensitive features extracted from original (cover) and stego media are used to train a classifier, which then renders a "normal" or "stego" decision on suspicious media.
In steganalysis, the classifiers applied have evolved from the early binary classifiers to the more recent multi-class and ensemble classifiers, and the performance of an ensemble classifier is far superior to that of a single classifier. Among existing classification techniques, ensemble classification achieves very good results, so ensemble classifiers are used more and more widely in steganalysis. However, an ensemble classifier relies on a large number of base classifiers to make a comprehensive decision, and this classifier set carries a certain redundancy; moreover, as the number of base classifiers in the ensemble grows, the prediction speed of the model drops while the storage space it requires increases sharply.
Summary of the invention
The purpose of the present invention is, in view of the deficiencies of existing steganography detection methods, to propose an image steganalysis ensemble classifier optimization method based on deep reinforcement learning.
To achieve the above objective, the present invention adopts the following technical solution:
An image steganalysis ensemble classifier optimization method based on deep reinforcement learning, whose concrete operation steps are as follows:
(1) select several base classifiers under a given feature;
(2) generate several strong ensemble classifiers using the Bagging ensemble method;
(3) fix the number of generated sub-classifiers and, keeping the number of subspaces the same, repeatedly generate different data sets;
(4) build a deep reinforcement learning model (DQN), train it on the data sets from step (3), and screen out a classifier set with fewer members whose accuracy is unchanged or better;
(5) input the steganographic data to be discriminated into the model and compute the accuracy and number of classifiers of the optimized ensemble classifier.
In step (1), several base classifiers generated by different methods at the same embedding rate are chosen as the initial data.
In step (2), the Bagging ensemble method, also known as bootstrap aggregating, repeatedly draws samples from the data with replacement; on the bootstrap sample set produced by each draw, one base classifier is trained. The trained classifiers then vote, and a test sample is assigned to the class that receives the most votes.
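The patent text contains no code; purely as an illustration, a minimal sketch of the Bagging step described above might look as follows, assuming numpy feature arrays, 0/1 (cover versus stego) labels, and decision trees as stand-in base classifiers. The function names are hypothetical.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

def train_bagging_ensemble(X, y, n_classifiers=50, seed=0):
    """Train one base classifier per bootstrap sample drawn with replacement."""
    rng = np.random.default_rng(seed)
    n = len(X)
    ensemble = []
    for _ in range(n_classifiers):
        idx = rng.choice(n, size=n, replace=True)            # bootstrap sample
        clf = DecisionTreeClassifier(max_depth=5).fit(X[idx], y[idx])
        ensemble.append(clf)
    return ensemble

def vote(ensemble, X):
    """Majority vote of the ensemble; assumes 0/1 labels, ties go to class 1."""
    preds = np.stack([clf.predict(X) for clf in ensemble]).astype(int)
    return (preds.mean(axis=0) >= 0.5).astype(int)
```

In the steganalysis setting of the patent, X would hold the features extracted from cover and stego images and y their 0/1 labels.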
In step (3), the ensemble classifier contains many base classifiers; ensemble learning training is carried out at the same time to obtain the data sets required in step (4).
In step (4), a DQN model is built according to the deep Q-learning algorithm and used to screen out an ideal sub-classifier set; the concrete operation steps are as follows (a simplified code sketch of this selection loop is given after the list):
1) prepare the data set D, i.e. the classifier set, and add a serial number to each row;
2) initialize the classifier set C, and compute the accuracy a and sub-classifier number n of the current data set as the reference values for training;
3) determine a target accuracy A and a target classifier number N through experiments, as the target of the DQN decision learning;
4) through the DQN network, select the classifiers that reach the target values and add them to the set C;
5) adjust the target values within a certain range and repeat step 4), continually screening the set D and selecting the best result;
6) compute the accuracy a' and classifier number n' of the finally selected data set.
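Again purely as an illustration, the selection loop of steps 1) to 6) could be sketched with a small Q-network as follows. Every concrete choice here (the PyTorch network size, the epsilon-greedy exploration, and a reward defined as the gain in voting accuracy minus a size penalty lam) is an assumption made for the sketch rather than something specified in the patent, and the replay buffer and target network of a full DQN are omitted for brevity.

```python
import numpy as np
import torch
import torch.nn as nn

class QNet(nn.Module):
    """Maps the current selection mask (which sub-classifiers are in C) to one Q-value per candidate."""
    def __init__(self, m):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(m, 128), nn.ReLU(), nn.Linear(128, m))
    def forward(self, mask):
        return self.net(mask)

def select_subset_dqn(val_preds, y_val, target_n, episodes=200,
                      eps=0.2, gamma=0.9, lam=0.01, seed=0):
    """DQN-style screening of the classifier set D.

    val_preds: (m, n_val) array of 0/1 decisions of the m base classifiers
    on a validation set; y_val: (n_val,) true labels.
    """
    torch.manual_seed(seed)
    rng = np.random.default_rng(seed)
    m = val_preds.shape[0]
    qnet = QNet(m)
    opt = torch.optim.Adam(qnet.parameters(), lr=1e-3)

    def vote_acc(mask):                       # accuracy a of the subset C given by mask
        if mask.sum() == 0:
            return 0.0
        votes = val_preds[mask.astype(bool)].mean(axis=0) >= 0.5
        return float((votes == y_val).mean())

    best_mask, best_acc = np.zeros(m, dtype=np.float32), -1.0
    for _ in range(episodes):
        mask, acc = np.zeros(m, dtype=np.float32), 0.0
        for _step in range(target_n):         # grow C up to the target size N
            s = torch.from_numpy(mask)
            with torch.no_grad():
                q = qnet(s)
                q[s > 0] = -1e9               # a classifier cannot be added twice
            act = int(rng.integers(m)) if rng.random() < eps else int(q.argmax())
            if mask[act] > 0:
                continue                      # random pick hit an already-chosen classifier
            new_mask = mask.copy(); new_mask[act] = 1.0
            new_acc = vote_acc(new_mask)
            reward = (new_acc - acc) - lam    # accuracy gain minus per-classifier penalty
            with torch.no_grad():
                target = reward + gamma * qnet(torch.from_numpy(new_mask)).max()
            loss = (qnet(s)[act] - target) ** 2        # one-step TD error
            opt.zero_grad(); loss.backward(); opt.step()
            mask, acc = new_mask, new_acc
        if acc > best_acc:
            best_mask, best_acc = mask.copy(), acc
    return best_mask.astype(bool), best_acc            # screened set and its accuracy a'
```

The returned mask identifies the screened sub-classifier set, from which the optimized accuracy a' and classifier number n' of step 6) can be read off.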
Compared with the prior art, the present invention has the following advantages:
The method of the present invention applies deep reinforcement learning to the screening of steganalysis classifiers. By screening the base classifiers, effective base classifiers can be selected and the number of base classifiers reduced while the classification accuracy is improved, which clearly improves the performance of the steganalysis model. The method is suited to decisions made by a large number of combined classifiers, achieves better classification performance, and is better adapted to practical application scenarios.
Description of the drawings
Fig. 1 is the flow chart of the method for the present invention.
Fig. 2 is a structure diagram of the method of the present invention.
Fig. 3 shows the DQN selection process of deep reinforcement learning.
Fig. 4 is the accuracy curve.
Fig. 5 is the sub-classifier count statistics chart.
Detailed description of the embodiments
To facilitate the understanding of those skilled in the art, the present invention is further described below with reference to the accompanying drawings and an embodiment.
As shown in Fig. 1 and Fig. 2, the image steganalysis ensemble classifier optimization method based on deep reinforcement learning proposed by this embodiment mainly includes the following steps:
(1) select several base classifiers under a given feature;
(2) generate several strong ensemble classifiers using the Bagging ensemble method;
(3) fix the number of generated sub-classifiers and, keeping the number of subspaces the same, repeatedly generate different data sets;
(4) build a deep reinforcement learning model (DQN), train it on the data sets from step (3), and screen out a classifier set with fewer members whose accuracy is unchanged or better;
(5) input the steganographic data to be discriminated into the model, compute the accuracy and number of classifiers of the optimized ensemble classifier, and obtain the final result.
The Bagging method used in this embodiment is described in detail below:
From the initial data set of size n, n' samples are independently and randomly selected (with replacement) to form a bootstrap data set, and this process is carried out independently many times until many independent bootstrap data sets have been generated. Each bootstrap data set is then independently used to train one "component classifier", and the decision of the final classifier is chosen by voting over the respective decisions of these component classifiers.
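For reference only, a small helper with hypothetical names that computes the voting accuracy a and the sub-classifier number n of such a set of component classifiers, the two reference values from which the DQN procedure below starts, might look like this:

```python
import numpy as np

def ensemble_stats(preds, y):
    """Return (a, n): majority-vote accuracy and number of component classifiers.

    preds: (n_classifiers, n_samples) array of 0/1 decisions from the
    component classifiers; y: (n_samples,) true labels (0 = cover, 1 = stego).
    """
    n = preds.shape[0]
    votes = (preds.mean(axis=0) >= 0.5).astype(int)   # majority vote, ties go to class 1
    a = float((votes == y).mean())
    return a, n
```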
In this embodiment, with reference to Fig. 3, in step (4) a DQN model is built according to the deep Q-learning algorithm and used to screen out an ideal sub-classifier set; the concrete operation steps are:
1) prepare the data set D, i.e. the classifier set, and add a serial number to each row;
2) initialize the classifier set C, and compute the accuracy a and sub-classifier number n of the current data set as the reference values for training;
3) determine a target accuracy A and a target classifier number N through experiments, as the target of the DQN decision learning;
4) through the DQN network, select the classifiers that reach the target values and add them to the set C;
5) adjust the target values within a certain range and repeat step 4), continually screening the set D and selecting the best result (a sketch of this outer loop follows the list);
6) compute the accuracy a' and classifier number n' of the finally selected data set.
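As a complement to the DQN sketch given earlier, a minimal sketch of the outer loop of step 5), which varies the target classifier number over a range, reruns the selection for each target and keeps the best result, might look as follows. The select_fn argument is assumed to be the DQN-based screening routine sketched above (or any subset selector with the same signature), and the tie-breaking rule of preferring fewer classifiers at equal accuracy is likewise an assumption that matches the stated goal of unchanged or better accuracy with fewer classifiers.

```python
def search_targets(select_fn, val_preds, y_val, n_grid):
    """Step 5): adjust the target value N over n_grid, repeat the selection,
    and keep the screened set with the best accuracy a' (fewest classifiers on ties)."""
    best_mask, best_acc, best_n = None, -1.0, None
    for target_n in n_grid:
        mask, acc = select_fn(val_preds, y_val, target_n)
        n_sel = int(mask.sum())
        if acc > best_acc or (acc == best_acc and n_sel < best_n):
            best_mask, best_acc, best_n = mask, acc, n_sel
    return best_mask, best_acc, best_n   # the set, its accuracy a', and its size n'
```

For example, search_targets(select_subset_dqn, val_preds, y_val, range(10, 60, 10)) would screen the set for target sizes between 10 and 50 and return the best mask together with its accuracy a' and size n'.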
With reference to Fig. 4, the classifier set screened out by the above method performs significantly better than ordinary ensemble learning: not only is the accuracy improved, but an optimized sub-classifier set is also selected whose size, with reference to Fig. 5, is clearly smaller than that of an ordinary ensemble.
In summary, this embodiment applies deep reinforcement learning to the optimization of ensemble learning, using the strategy learned by deep reinforcement learning to select the optimized ensemble classifier, which improves the ensemble accuracy and saves the storage space of online learning.
The above-described embodiment expresses only one of several implementation modes of the present invention; its description is relatively specific and detailed, but this should not therefore be construed as limiting the scope of the claims of the present invention. It should be pointed out that those of ordinary skill in the art may make various modifications and improvements without departing from the concept of the present invention, and these all fall within the protection scope of the present invention. Therefore, the protection scope of this patent shall be determined by the appended claims.

Claims (5)

1. An image steganalysis ensemble classifier optimization method based on deep reinforcement learning, characterized in that the concrete operation steps are as follows:
(1) select several base classifiers under a given feature;
(2) generate several strong ensemble classifiers using the Bagging ensemble method;
(3) fix the number of generated sub-classifiers and, keeping the number of subspaces the same, repeatedly generate different data sets;
(4) build a deep reinforcement learning model (DQN), train it on the data sets from step (3), and screen out a classifier set with fewer members whose accuracy is unchanged or better;
(5) input the steganographic data to be discriminated into the model and compute the accuracy and number of classifiers of the optimized ensemble classifier.
2. The image steganalysis ensemble classifier optimization method based on deep reinforcement learning according to claim 1, characterized in that in step (1), several base classifiers generated by different methods at the same embedding rate are chosen as the initial data.
3. The image steganalysis ensemble classifier optimization method based on deep reinforcement learning according to claim 1, characterized in that in step (2), the Bagging ensemble method, also known as bootstrap aggregating, is a method that repeatedly draws samples from the data with replacement; on the bootstrap sample set produced by each draw, one base classifier is trained; the trained classifiers vote, and a test sample is assigned to the class that receives the most votes.
4. The image steganalysis ensemble classifier optimization method based on deep reinforcement learning according to claim 1, characterized in that in step (3), the ensemble classifier contains many base classifiers, and ensemble learning training is carried out at the same time to obtain the data sets required in step (4).
5. The image steganalysis ensemble classifier optimization method based on deep reinforcement learning according to claim 1, characterized in that in step (4), a DQN model is built according to the deep Q-learning algorithm and used to screen out an ideal sub-classifier set; the concrete operation steps are:
1) prepare the data set D, i.e. the classifier set, and add a serial number to each row;
2) initialize the classifier set C, and compute the accuracy a and sub-classifier number n of the current data set as the reference values for training;
3) determine a target accuracy A and a target classifier number N through experiments, as the target of the DQN decision learning;
4) through the DQN network, select the classifiers that reach the target values and add them to the set C;
5) adjust the target values within a certain range and repeat step 4), continually screening the set D and selecting the best result;
6) compute the accuracy a' and classifier number n' of the finally selected data set.
CN201810104070.XA 2018-02-02 2018-02-02 Image steganalysis ensemble classifier optimization method based on deep reinforcement learning Pending CN108460405A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810104070.XA CN108460405A (en) 2018-02-02 2018-02-02 Image steganalysis ensemble classifier optimization method based on deep reinforcement learning

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201810104070.XA CN108460405A (en) 2018-02-02 2018-02-02 Image steganalysis ensemble classifier optimization method based on deep reinforcement learning

Publications (1)

Publication Number Publication Date
CN108460405A true CN108460405A (en) 2018-08-28

Family

ID=63239329

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810104070.XA Pending CN108460405A (en) 2018-02-02 2018-02-02 A kind of image latent writing analysis Ensemble classifier optimization method based on deeply study

Country Status (1)

Country Link
CN (1) CN108460405A (en)

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20170032245A1 (en) * 2015-07-01 2017-02-02 The Board Of Trustees Of The Leland Stanford Junior University Systems and Methods for Providing Reinforcement Learning in a Deep Learning System
CN105872555A (en) * 2016-03-25 2016-08-17 中国人民武装警察部队工程大学 Steganalysis algorithm specific to H.264 video motion vector information embedment

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
Christos Dimitrakakis et al., "Online adaptive policies for ensemble classifiers", Neurocomputing *
Ioannis Partalas et al., "Pruning an ensemble of classifiers via reinforcement learning", Neurocomputing *
Yang Gao et al., "Learning classifier system ensemble and compact rule set", Connection Science *

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113158043A (en) * 2021-04-20 2021-07-23 湖南海龙国际智能科技股份有限公司 Intelligent tourism resource recommendation system adopting reinforcement learning and integrated learning

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20180828