CN110889456A - Neural network-based co-occurrence matrix feature extraction method, storage medium and terminal - Google Patents


Info

Publication number
CN110889456A
Authority
CN
China
Prior art keywords
neural network
image
feature extraction
convolutional neural
processed
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201911213459.9A
Other languages
Chinese (zh)
Other versions
CN110889456B (en)
Inventor
李广林
李斌
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen University
Original Assignee
Shenzhen University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen University
Priority to CN201911213459.9A
Publication of CN110889456A
Application granted
Publication of CN110889456B
Legal status: Active


Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/21Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
    • G06F18/214Generating training patterns; Bootstrap methods, e.g. bagging or boosting
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/04Architecture, e.g. interconnection topology
    • G06N3/045Combinations of networks

Abstract

The invention discloses a neural network-based co-occurrence matrix feature extraction method, a storage medium, and a terminal. The method comprises: acquiring an image to be processed; preprocessing the image to be processed to obtain image data; and inputting the image data into a pre-trained convolutional neural network model to extract co-occurrence matrix features. A mathematical function capable of extracting the co-occurrence matrix is designed in advance, whose parameters model the quantization centers and quantization ranges of the co-occurrence matrix. An equivalent convolutional neural network is then designed from this function (equivalent in the sense that the convolutional neural network can be written as the same mathematical function), so that the training process of the neural network learns the parameters of the function from data instead of relying on human experience to set them, which improves the accuracy of the extracted co-occurrence matrix features.

Description

Neural network-based co-occurrence matrix feature extraction method, storage medium and terminal
Technical Field
The invention relates to the technical field of neural networks, and in particular to a neural network-based co-occurrence matrix feature extraction method, a storage medium, and a terminal.
Background
The co-occurrence matrix feature is a very important and common statistical feature in image analysis. It differs from histogram statistical features as follows: a histogram feature is the result of counting single elements of an image, i.e., how many times each individual value occurs; a co-occurrence matrix feature is the result of counting tuples of adjacent elements, i.e., how many times each combination of neighboring values occurs in the image.
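The distinction can be made concrete in a few lines of Python: on a toy quantized image, the histogram counts single values, while a (horizontal) co-occurrence feature counts pairs of adjacent values.

```python
from collections import Counter

# A tiny 3x4 grayscale image whose values are already quantized to a few levels.
img = [
    [0, 1, 1, 2],
    [0, 0, 1, 2],
    [2, 1, 0, 0],
]

# Histogram feature: count single elements.
histogram = Counter(v for row in img for v in row)

# Co-occurrence feature: count horizontally adjacent PAIRS of elements.
cooccurrence = Counter(
    (row[j], row[j + 1]) for row in img for j in range(len(row) - 1)
)

print(histogram[0])          # how many pixels equal 0        -> 5
print(cooccurrence[(0, 0)])  # how many horizontal (0,0) pairs -> 2
```

The full co-occurrence matrix is simply this pair count arranged as a 2-D table indexed by the two values.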
Each center of the co-occurrence matrix has a corresponding quantization function (in two dimensions the quantization cell may be a regular hexagon, a rectangle, and so on, with analogous shapes in higher dimensions). When two adjacent elements of the image pass through this function, a one is added at the corresponding co-occurrence matrix position if the pair lies near the center of the quantization function, and a zero is added otherwise. Higher-dimensional co-occurrence matrices work analogously.
Existing co-occurrence matrix extraction methods require manually setting the center positions to be counted, i.e., the center points of the corresponding quantization functions, as well as the quantization ranges, i.e., the regions within which the activation output of the quantization function is 1.
In the prior art these statistical parameters are generally determined from a priori knowledge. However, a priori knowledge cannot define these parameters very accurately, so the resulting statistical features are not optimal for the task at hand.
Accordingly, the prior art is yet to be improved and developed.
Disclosure of Invention
The technical problem to be solved by the present invention is to provide a neural network-based co-occurrence matrix feature extraction method, apparatus, storage medium, and terminal, aiming at solving the problem in the prior art that the accuracy of the extracted co-occurrence matrix features is not high.
As used herein, the singular forms "a", "an", and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will be further understood that the terms "comprises" and/or "comprising," when used in this specification, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof. It will be understood that when an element is referred to as being "connected" or "coupled" to another element, it can be directly connected or coupled to the other element, or intervening elements may be present. Further, "connected" or "coupled" as used herein may include wirelessly connected or wirelessly coupled. As used herein, the term "and/or" includes any and all combinations of one or more of the associated listed items.
It will be understood by those skilled in the art that, unless otherwise defined, all terms (including technical and scientific terms) used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this invention belongs. It will be further understood that terms, such as those defined in commonly used dictionaries, should be interpreted as having a meaning consistent with their meaning in the context of the relevant art, and will not be interpreted in an idealized or overly formal sense unless expressly so defined herein.
The technical scheme adopted by the invention for solving the technical problem is as follows:
in a first aspect, an embodiment of the present invention provides a neural network-based co-occurrence matrix feature extraction method, where the method includes:
acquiring an image to be processed;
preprocessing the image to be processed to obtain image data;
and inputting the image data into a pre-trained convolutional neural network model to extract co-occurrence matrix features.
In the neural network-based co-occurrence matrix feature extraction method, the function for extracting the co-occurrence matrix features is:

C_n = (1 / (M1·M2)) · Σ_{i=1..M1} Σ_{j=1..M2} exp( −(x_{i,j} − m_n)^T · W_n · (x_{i,j} − m_n) )

where x_{i,j} is the vector of adjacent image elements at position (i, j), m_n represents the quantization center, W_n is a symmetric matrix representing the range of data quantization, and M1·M2 is the size of the image.
In the neural network-based co-occurrence matrix feature extraction method, the training process of the convolutional neural network model comprises the following steps:
acquiring residual image data;
dividing the residual image data into a training set and a test set;
and performing convolutional neural network training and classifier training based on the residual image data in the training set, and obtaining a trained convolutional neural network model after the training is completed.
In the neural network-based co-occurrence matrix feature extraction method, the preprocessing of the image to be processed is performed by a high-pass filter.
In the neural network-based co-occurrence matrix feature extraction method, the symmetric matrix W_n is decomposed as W_n = P_n^T · P_n, and the mathematical expression of the convolutional neural network model is:

C_n = (1 / (M1·M2)) · Σ_{i=1..M1} Σ_{j=1..M2} exp( −Σ_v (p_v^T · x_{i,j} + b_v)² )

where p_v denotes the v-th column vector of P_n^T (each p_v acts as a convolution kernel) and b_v = −p_v^T · m_n is the corresponding bias.
the symbiotic matrix characteristic extraction method based on the neural network is characterized in that the classifier is a support vector machine.
In a second aspect, an embodiment of the present invention provides a neural network-based co-occurrence matrix feature extraction apparatus, where the apparatus includes:
the acquisition unit is used for acquiring an image to be processed;
the preprocessing unit is used for preprocessing the image to be processed to obtain residual image data;
and the identification unit is used for inputting the residual image data into a pre-trained convolutional neural network model and extracting co-occurrence matrix features.
In the neural network-based co-occurrence matrix feature extraction apparatus, the preprocessing of the image to be processed is performed by a high-pass filter.
In a third aspect, an embodiment of the present invention provides a terminal, which includes a memory and one or more programs, where the one or more programs are stored in the memory and configured to be executed by one or more processors, the one or more programs including instructions for performing the method described above.
In a fourth aspect, an embodiment of the present invention provides a storage medium, wherein instructions in the storage medium, when executed by a processor of an electronic device, enable the electronic device to perform the method described above.
Advantageous effects: a mathematical function capable of extracting the co-occurrence matrix is designed in advance, each parameter of which corresponds to a quantization center or a quantization range. An equivalent convolutional neural network is designed according to this mathematical function, so that the training process of the neural network learns the parameters of the function from data rather than setting them by human experience, which improves the accuracy and efficiency of co-occurrence matrix feature extraction.
Drawings
Fig. 1 is a flowchart of a neural network-based co-occurrence matrix feature extraction method according to a preferred embodiment of the present invention.
Fig. 2 is a process diagram for extracting the co-occurrence matrix feature function according to an embodiment of the present invention.
Fig. 3 is a diagram illustrating a convolutional neural network.
FIG. 4 is a schematic diagram of a convolutional neural network structure in an embodiment of the present invention.
Fig. 5 is a flowchart of the SRM algorithm in an embodiment of the present invention.
Fig. 6 is a flow chart of converting the SRM algorithm into a convolutional neural network in an embodiment of the present invention.
Fig. 7 is a schematic structural diagram of the convolutional neural network obtained by converting the SRM algorithm in the embodiment of the present invention.
FIG. 8 is a diagram illustrating image cropping and alignment in image texture classification according to an embodiment of the present invention.
FIG. 9 is a schematic structural diagram of the convolutional neural network obtained by converting image texture extraction in the embodiment of the present invention.
Fig. 10 is a functional schematic diagram of a co-occurrence matrix feature extraction device based on a neural network according to the present invention.
Fig. 11 is a functional schematic diagram of a terminal provided by the present invention.
Detailed Description
In order to make the objects, technical solutions, and advantages of the present invention clearer, the present invention is further described in detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit it.
In the prior art, when co-occurrence matrix features are extracted, the center point and quantization range among the statistical parameters of the quantization function must be set, and they are often set according to empirical knowledge. However, empirical knowledge cannot set these statistical parameters accurately, and the resulting statistical features are not optimal for the task at hand.
In order to solve the above problem, in the embodiment of the present invention, a mathematical function capable of simulating co-occurrence matrix statistics is first designed, and a convolutional neural network corresponding to this mathematical function is designed from it. When the convolutional neural network is trained, the parameters of the neural network are optimized, so the statistical parameters of the mathematical function simulating co-occurrence matrix statistics are optimized as well. When training of the convolutional neural network is finished, the statistical parameters of the function are thereby determined. In this way, the statistical parameters of the function simulating co-occurrence matrix statistics are set according to the image data instead of experience, which solves the problem that the accuracy of the extracted co-occurrence matrix features is not high.
Various non-limiting embodiments of the present invention are described in detail below with reference to the accompanying drawings.
Exemplary method
Referring to fig. 1, the present embodiment provides a neural network-based co-occurrence matrix feature extraction method, including the following steps:
and S100, acquiring an image to be processed.
Specifically, the image to be processed may be captured by an electronic device, stored in a storage device in advance, or acquired from a network, and the format of the image may be JPG, PNG, or another format, which is not limited here.
S200, preprocessing the image to be processed to obtain image data.
Specifically, the image to be processed obtained in step S100 may be subjected to noise reduction, and the noise-reduced image is then filtered by a high-pass filter to obtain a residual image. The obtained residual image is processed to obtain residual image data.
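As a sketch of this filtering step, the following NumPy snippet computes a residual image with a single first-order horizontal difference filter. This is a minimal stand-in for illustration only; the bank of 45 SRM high-pass filters mentioned later in the patent is not reproduced here.

```python
import numpy as np

def first_order_residual(image: np.ndarray) -> np.ndarray:
    """Horizontal first-order high-pass residual: r[i, j] = x[i, j+1] - x[i, j].

    A single illustrative stand-in for a high-pass filter bank; the patent's
    actual 45 SRM filters are not reproduced here.
    """
    image = image.astype(np.int64)
    return image[:, 1:] - image[:, :-1]

img = np.array([[10, 12, 12, 9],
                [11, 11, 13, 13]])
residual = first_order_residual(img)
print(residual)
# [[ 2  0 -3]
#  [ 0  2  0]]
```

Flat regions map to residuals near zero, so the residual statistics concentrate on a small range of values, which is what makes co-occurrence counting on residuals effective.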
S300, inputting the image data into a pre-trained convolutional neural network model, and extracting to obtain co-occurrence matrix characteristics.
Specifically, the convolutional neural network model comprises an input layer, convolutional layers, activation functions, a pooling layer, and a fully-connected layer. The mathematical expression of the convolutional neural network model is:

C_n = (1 / (M1·M2)) · Σ_{i=1..M1} Σ_{j=1..M2} exp( −Σ_v (p_v^T · x_{i,j} + b_v)² )

where p_v denotes the v-th column vector of P_n^T (each p_v acts as a convolution kernel) and b_v = −p_v^T · m_n is the corresponding bias.
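As a rough illustration of how such a model computes soft co-occurrence features, here is a minimal NumPy sketch of the forward pass on a toy 1-D residual signal. The matrix P, the quantization centers, and the pair construction are all hypothetical illustrative values, not the patent's trained parameters.

```python
import numpy as np

rng = np.random.default_rng(0)
residual = rng.integers(-2, 3, size=64).astype(float)  # toy 1-D residual signal

# Pairs of adjacent elements: one vector x per position (dimension d = 2).
X = np.stack([residual[:-1], residual[1:]], axis=1)         # shape (63, 2)

# Hypothetical learned parameters: quantization centers m_n, and a matrix P
# with W = P.T @ P describing the quantization range (shared across bins here).
centers = np.array([[-1.0, -1.0], [0.0, 0.0], [1.0, 1.0]])  # (n_bins, 2)
P = np.array([[2.0, 0.0], [0.0, 2.0]])                      # rows act as conv kernels

# "Convolution" layer: inner products p_v . x; bias terms are p_v . m_n.
proj = X @ P.T                        # (63, 2): p_v^T x at each position
bias = centers @ P.T                  # (n_bins, 2): p_v^T m_n per bin
# Square, sum over kernels, exponentiate, then average-pool over positions.
act = np.exp(-((proj[:, None, :] - bias[None, :, :]) ** 2).sum(axis=2))
features = act.mean(axis=0)           # one soft co-occurrence count per bin

print(features.shape)  # (3,)
```

Each output is a differentiable, "soft" version of a co-occurrence bin count, which is what allows the quantization centers and ranges to be learned by backpropagation.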
in this embodiment, the function for extracting the co-occurrence matrix feature is:
Figure BDA0002298812840000081
wherein m isnRepresenting the amount of dataThe center of the chemical reaction is changed,
Figure BDA0002298812840000082
for symmetric matrices representing the range of data quantization, M1M2Is the size of the image.
Further, the function for extracting the co-occurrence matrix characteristic is transformed as follows:
Figure BDA0002298812840000083
Figure BDA0002298812840000084
Figure BDA0002298812840000085
Figure BDA0002298812840000086
from the knowledge of the quadratic form of the matrix, we know that symmetric matrices
Figure BDA0002298812840000087
Can be decomposed into similar matrices
Figure BDA0002298812840000088
The matrix is divided into two parts by a method of multiplication in the matrix
Figure BDA0002298812840000089
That is to say, the
Figure BDA00022988128400000810
Viewed as v column vectors, then
Figure BDA00022988128400000811
The transformed mathematical expression has the following form:
Figure BDA00022988128400000812
wherein
Figure BDA00022988128400000813
Figure BDA00022988128400000814
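The decomposition described above can be checked numerically: with W = P^T·P obtained from a Cholesky factorization, the quadratic form equals a sum of squared inner products. The matrix W, the vector x, and the center m below are arbitrary illustrative values.

```python
import numpy as np

# A symmetric positive-definite "quantization range" matrix W.
W = np.array([[2.0, 0.5],
              [0.5, 1.0]])

# Decompose W = P.T @ P (Cholesky gives W = L @ L.T, so take P = L.T).
L = np.linalg.cholesky(W)
P = L.T

x = np.array([1.5, -0.5])   # a pair of adjacent residual values
m = np.array([1.0, 0.0])    # a quantization center

quadratic_form = (x - m) @ W @ (x - m)
sum_of_squares = np.sum((P @ (x - m)) ** 2)  # sum over terms (p_v^T (x - m))^2

print(np.isclose(quadratic_form, sum_of_squares))  # True
```

Each term p_v^T·(x − m) is a plain inner product, which is exactly what a convolution layer with kernel p_v and bias −p_v^T·m computes.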
According to the neural network convolution formula:

res = ker^T · x

where x is the vector of image elements currently scanned by a convolution kernel, ker is the vector of convolution kernel elements, and res is the convolution result.
By virtue of this formula, the complicated quadratic form in the function for extracting co-occurrence matrix features becomes an inner product of two vectors, i.e., a convolution in a convolutional neural network model.
Thus, every inner product appearing in the function that extracts the co-occurrence matrix features can be realized as a convolution operation in the corresponding convolutional neural network model. In this way, the data quantization centers and quantization ranges of the co-occurrence matrix function are set adaptively.
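This equivalence is easy to see in code: a 2-D "valid" convolution output is nothing but the inner product res = ker^T·x between the kernel vector and the vector of image elements it scans at each position. The snippet below is a plain NumPy sketch, not a framework implementation.

```python
import numpy as np

def conv2d_valid(image: np.ndarray, kernel: np.ndarray) -> np.ndarray:
    """2-D 'valid' correlation (the CNN convention): each output element is the
    inner product of the kernel vector with the image elements it scans."""
    kh, kw = kernel.shape
    H, W = image.shape
    out = np.empty((H - kh + 1, W - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            patch = image[i:i + kh, j:j + kw]
            out[i, j] = np.dot(patch.ravel(), kernel.ravel())  # res = ker^T x
    return out

image = np.arange(16, dtype=float).reshape(4, 4)
kernel = np.array([[0.0, 1.0],
                   [1.0, 0.0]])
print(conv2d_valid(image, kernel))
# [[ 5.  7.  9.]
#  [13. 15. 17.]
#  [21. 23. 25.]]
```

Because the same inner product is what appears inside the transformed co-occurrence function, the whole feature extractor can be stacked out of standard convolution, square, sum, exponential, and pooling layers.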
The method in the embodiment of the present invention is explained with reference to a specific application scenario.
In steganalysis, there is an algorithm called SRM (Spatial Rich Model) in which co-occurrence matrix features are applied.
As shown in fig. 5, the SRM algorithm uses 45 high-pass filters to obtain filtered images called residual images, truncates and quantizes the residual images so that their values become discrete finite values, extracts the co-occurrence matrices, and inputs them into an ensemble classifier that classifies images into stego images and cover images.
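The truncate-and-quantize step can be sketched as follows; the threshold T and quantization step q below are illustrative values, not the SRM's actual settings.

```python
import numpy as np

def truncate_quantize(residual: np.ndarray, q: float = 1.0, T: int = 2) -> np.ndarray:
    """Quantize residual values with step q and truncate to [-T, T], leaving
    2T + 1 discrete levels (q and T are illustrative, not the SRM's values)."""
    return np.clip(np.round(residual / q), -T, T).astype(int)

r = np.array([-7.0, -1.4, -0.4, 0.0, 0.6, 3.2])
print(truncate_quantize(r))  # [-2 -1  0  0  1  2]
```

Restricting residuals to a small discrete alphabet is what keeps the co-occurrence matrices small enough to count (here 5 levels, so a pairwise matrix has 25 bins).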
As shown in figs. 6-7, according to the method provided by the present invention, the processing flow is to filter the image with the 45 high-pass filters to obtain residual images, extract the co-occurrence matrices from the residual images through a convolutional neural network, input the extracted co-occurrence matrices to a fully-connected classifier, and select a suitable loss function as needed. Here X is an input residual image, N = 45 (the number of input images), and M = 625 (the SRM counts 625 co-occurrence matrices); the ensemble classifier of the SRM is replaced with a fully-connected classifier, which facilitates backpropagation of the gradient. The convolution mode is depthwise_conv2d, although conv2d may also be used.
In image texture classification, co-occurrence matrices in four directions are generally required. However, code under the TensorFlow framework cannot easily implement a convolution kernel in an arbitrary direction, so the image can be cropped and aligned before being input into the convolutional neural network; the principle of this operation is to store the elements scanned by the convolution kernel in different matrices, as shown in fig. 8.
In this embodiment, as shown in fig. 9, the network extracts the co-occurrence matrix in one direction, where X(N) is the set of images obtained by cropping and aligning the input image, N represents the number of images, the convolution mode in this case is conv2d, and the classifier is a fully-connected neural network.
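The crop-and-align trick can be sketched with NumPy views: shifting one crop of the image against another yields the element pairs for an arbitrary direction (dy, dx), which an ordinary conv2d can then consume. The helper below is hypothetical, for illustration only.

```python
import numpy as np

def directional_pairs(image: np.ndarray, dy: int, dx: int) -> np.ndarray:
    """Return an (N, 2) array of element pairs (x[i, j], x[i+dy, j+dx]) by
    cropping two aligned views of the image -- the same crop-and-align idea
    used to feed a fixed-direction conv2d an arbitrary co-occurrence direction."""
    H, W = image.shape
    a = image[max(0, -dy):H - max(0, dy), max(0, -dx):W - max(0, dx)]
    b = image[max(0, dy):H + min(0, dy), max(0, dx):W + min(0, dx)]
    return np.stack([a.ravel(), b.ravel()], axis=1)

img = np.array([[1, 2],
                [3, 4]])
print(directional_pairs(img, 0, 1))   # horizontal pairs: [[1 2] [3 4]]
print(directional_pairs(img, 1, 1))   # diagonal pair:    [[1 4]]
```

Counting (or softly quantizing) these pair vectors for (dy, dx) in {(0,1), (1,0), (1,1), (-1,1)} yields the four directional co-occurrence matrices commonly used in texture analysis.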
Exemplary device
Referring to fig. 10, an embodiment of the present invention provides a neural network-based co-occurrence matrix feature extraction apparatus, where the apparatus includes: an acquisition unit 110, a preprocessing unit 120, and an identification unit 130.
Specifically, the acquiring unit 110 is configured to acquire an image to be processed. The preprocessing unit 120 is configured to preprocess the image to be processed to obtain residual image data, wherein the preprocessing is performed by a high-pass filter. The identification unit 130 is configured to input the residual image data into a pre-trained convolutional neural network model and extract co-occurrence matrix features.
Based on the above embodiment, the present invention further provides an intelligent terminal, a schematic block diagram of which may be as shown in fig. 11. The intelligent terminal comprises a processor, a memory, a network interface, a display screen, and a temperature sensor connected through a system bus. The processor of the intelligent terminal provides computing and control capabilities. The memory of the intelligent terminal comprises a non-volatile storage medium and an internal memory; the non-volatile storage medium stores an operating system and a computer program, and the internal memory provides an environment for running them. The network interface of the intelligent terminal is used for communicating with external terminals through a network. The computer program, when executed by the processor, implements the neural network-based co-occurrence matrix feature extraction method. The display screen of the intelligent terminal may be a liquid crystal display or an electronic ink display, and the temperature sensor is arranged inside the intelligent terminal in advance to detect the current operating temperature of the internal components.
It will be understood by those skilled in the art that the block diagram of fig. 11 is only a block diagram of a part of the structure related to the solution of the present invention, and does not constitute a limitation to the intelligent terminal to which the solution of the present invention is applied, and a specific intelligent terminal may include more or less components than those shown in the figure, or combine some components, or have different arrangements of components.
In one embodiment, a terminal is provided that includes a memory and one or more programs, where the one or more programs are stored in the memory and configured to be executed by one or more processors, the one or more programs including instructions for:
acquiring an image to be processed;
preprocessing the image to be processed to obtain image data;
and inputting the image data into a pre-trained convolutional neural network model to extract co-occurrence matrix features.
It will be understood by those skilled in the art that all or part of the processes of the methods of the embodiments described above can be implemented by a computer program instructing the relevant hardware; the program can be stored in a non-volatile computer-readable storage medium, and when executed, can include the processes of the embodiments of the methods described above. Any reference to memory, storage, databases, or other media used in the embodiments provided herein may include non-volatile and/or volatile memory. Non-volatile memory can include read-only memory (ROM), programmable ROM (PROM), electrically programmable ROM (EPROM), electrically erasable programmable ROM (EEPROM), or flash memory. Volatile memory can include random access memory (RAM) or external cache memory. By way of illustration and not limitation, RAM is available in a variety of forms, such as static RAM (SRAM), dynamic RAM (DRAM), synchronous DRAM (SDRAM), double data rate SDRAM (DDR SDRAM), enhanced SDRAM (ESDRAM), synchlink DRAM (SLDRAM), Rambus direct RAM (RDRAM), direct Rambus dynamic RAM (DRDRAM), and Rambus dynamic RAM (RDRAM).
In summary, the present invention discloses a neural network-based co-occurrence matrix feature extraction method, apparatus, intelligent terminal, and storage medium, wherein the method comprises: acquiring an image to be processed; preprocessing the image to be processed to obtain residual image data; and inputting the residual image data into a pre-trained convolutional neural network model to extract co-occurrence matrix features. A mathematical function capable of extracting the co-occurrence matrix is designed in advance, each parameter of which corresponds to a quantization center or a quantization range; an equivalent convolutional neural network is designed according to this function, so that the training process of the neural network learns the parameters from data rather than setting them by human experience, improving the accuracy and efficiency of co-occurrence matrix feature extraction.
It is to be understood that the invention is not limited to the examples described above, but that modifications and variations may be effected thereto by those of ordinary skill in the art in light of the foregoing description, and that all such modifications and variations are intended to be within the scope of the invention as defined by the appended claims.

Claims (10)

1. A neural network-based co-occurrence matrix feature extraction method, characterized by comprising the following steps:
acquiring an image to be processed;
preprocessing the image to be processed to obtain image data;
and inputting the image data into a pre-trained convolutional neural network model to extract co-occurrence matrix features.
2. The neural network-based co-occurrence matrix feature extraction method according to claim 1, wherein the function for extracting the co-occurrence matrix features is:

C_n = (1 / (M1·M2)) · Σ_{i=1..M1} Σ_{j=1..M2} exp( −(x_{i,j} − m_n)^T · W_n · (x_{i,j} − m_n) )

where x_{i,j} is the vector of adjacent image elements at position (i, j), m_n represents the quantization center, W_n is a symmetric matrix representing the range of data quantization, and M1·M2 is the size of the image.
3. The neural network-based co-occurrence matrix feature extraction method according to claim 2, wherein the convolutional neural network model training process is as follows:
acquiring residual image data;
dividing the residual image data into a training set and a test set;
and performing convolutional neural network training and classifier training based on the residual image data in the training set, and obtaining a trained convolutional neural network model after the training is completed.
4. The neural network-based co-occurrence matrix feature extraction method according to claim 1, wherein the preprocessing of the image to be processed is performed by a high-pass filter.
5. The neural network-based co-occurrence matrix feature extraction method according to claim 2, wherein the symmetric matrix W_n is decomposed as W_n = P_n^T · P_n, and the mathematical expression of the convolutional neural network model is:

C_n = (1 / (M1·M2)) · Σ_{i=1..M1} Σ_{j=1..M2} exp( −Σ_v (p_v^T · x_{i,j} + b_v)² )

where p_v denotes the v-th column vector of P_n^T and b_v = −p_v^T · m_n.
6. the neural network-based co-occurrence matrix feature extraction method according to claim 1, wherein the classifier is a fully-connected neural network.
7. A neural network-based co-occurrence matrix feature extraction apparatus, characterized in that the apparatus comprises:
the acquisition unit is used for acquiring an image to be processed;
the preprocessing unit is used for preprocessing the image to be processed to obtain image data;
and the identification unit is used for inputting the image data into a pre-trained convolutional neural network model and extracting co-occurrence matrix features.
8. The neural network-based co-occurrence matrix feature extraction device according to claim 7, wherein the preprocessing of the image to be processed is performed by a high-pass filter.
9. A terminal comprising a memory and one or more programs, wherein the one or more programs are stored in the memory and configured to be executed by one or more processors, the one or more programs including instructions for performing the method of any of claims 1-6.
10. A storage medium, wherein instructions in the storage medium, when executed by a processor of an electronic device, enable the electronic device to perform the method of any of claims 1-6.
CN201911213459.9A 2019-12-02 2019-12-02 Neural network-based co-occurrence matrix feature extraction method and device, storage medium and terminal Active CN110889456B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911213459.9A CN110889456B (en) 2019-12-02 2019-12-02 Neural network-based co-occurrence matrix feature extraction method and device, storage medium and terminal

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201911213459.9A CN110889456B (en) 2019-12-02 2019-12-02 Neural network-based co-occurrence matrix feature extraction method and device, storage medium and terminal

Publications (2)

Publication Number Publication Date
CN110889456A (en) 2020-03-17
CN110889456B (en) 2022-02-18

Family

ID=69749984

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911213459.9A Active CN110889456B (en) 2019-12-02 2019-12-02 Neural network-based co-occurrence matrix feature extraction method and device, storage medium and terminal

Country Status (1)

Country Link
CN (1) CN110889456B (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114037747A (en) * 2021-11-25 2022-02-11 佛山技研智联科技有限公司 Image feature extraction method and device, computer equipment and storage medium

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6070098A (en) * 1997-01-11 2000-05-30 Circadian Technologies, Inc. Method of and apparatus for evaluation and mitigation of microsleep events
US20100034465A1 (en) * 2008-08-08 2010-02-11 Kabushiki Kaisha Toshiba Method and apparatus for calculating pixel features of image data
JP2010198438A (en) * 2009-02-26 2010-09-09 National Institute Of Information & Communication Technology Apparatus for aligning words in pair of sentences with each other, and computer program for the same
CN104008521A (en) * 2014-05-29 2014-08-27 西安理工大学 LSB replacement steganalysis method based on grey co-occurrence matrix statistic features
CN106683031A (en) * 2016-12-30 2017-05-17 深圳大学 Feature extraction method and extraction system for digital image steganalysis
US20180089541A1 (en) * 2016-09-27 2018-03-29 Facebook, Inc. Training Image-Recognition Systems Using a Joint Embedding Model on Online Social Networks
CN108280480A (en) * 2018-01-25 2018-07-13 Wuhan University Steganographic image security evaluation method based on residual co-occurrence probability


Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
JIAN YE et al.: "Deep Learning Hierarchical Representations for Image Steganalysis", IEEE Transactions on Information Forensics and Security *
LI Bin et al.: "Pre-processing layers for deep-learning spatial-domain steganalysis", Journal of Applied Sciences *


Also Published As

Publication number Publication date
CN110889456B (en) 2022-02-18

Similar Documents

Publication Publication Date Title
CN113469073B (en) SAR image ship detection method and system based on lightweight deep learning
CN109754017B (en) Hyperspectral image classification method based on separable three-dimensional residual error network and transfer learning
CN112236779A (en) Image processing method and image processing device based on convolutional neural network
CN109948733B (en) Multi-classification method, classification device and storage medium for digestive tract endoscope image
CN110991511A (en) Sunflower crop seed sorting method based on deep convolutional neural network
CN110427970A (en) Image classification method, device, computer equipment and storage medium
CN110838108A (en) Medical image-based prediction model construction method, prediction method and device
CN110473172B (en) Medical image anatomical centerline determination method, computer device and storage medium
CN112699941B (en) Plant disease severity image classification method, device, equipment and storage medium
Pintea et al. Resolution learning in deep convolutional networks using scale-space theory
CN112560968A (en) HER2 image classification method and system based on convolution and residual error network
CN113869282B (en) Face recognition method, hyper-resolution model training method and related equipment
CN115410050A (en) Tumor cell detection equipment based on machine vision and method thereof
CN113705723A (en) Image classification system, method and computer equipment
CN110889456B (en) Neural network-based co-occurrence matrix feature extraction method and device, storage medium and terminal
CN113255433A (en) Model training method, device and computer storage medium
CN114049935A (en) HER2 image classification system based on multi-convolution neural network
CN110929779B (en) Reconstruction neuron quality detection method, ordered point cloud classification method and device
CN111860582A (en) Image classification model construction method and device, computer equipment and storage medium
CN110866552A (en) Hyperspectral image classification method based on full convolution space propagation network
CN115861305A (en) Flexible circuit board detection method and device, computer equipment and storage medium
CN113255667B (en) Text image similarity evaluation method and device, electronic equipment and storage medium
CN109460777A (en) Picture classification method, device and computer readable storage medium
CN115410017A (en) Seed mildew detection method, device, equipment and storage medium
CN115170456A (en) Detection method and related equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant