CN108182470A - A kind of user identification method based on the recurrent neural network for paying attention to module - Google Patents
- Publication number
- CN108182470A CN108182470A CN201810045103.8A CN201810045103A CN108182470A CN 108182470 A CN108182470 A CN 108182470A CN 201810045103 A CN201810045103 A CN 201810045103A CN 108182470 A CN108182470 A CN 108182470A
- Authority
- CN
- China
- Prior art keywords
- attention
- neural network
- module
- recurrent neural
- att
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Withdrawn
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/04—Architecture, e.g. interconnection topology
- G06N3/045—Combinations of networks
Abstract
The invention proposes a user identification method based on a recurrent neural network with an attention module. Its main components are: preprocessing, brain wave mode decomposition, the attention-module-based recurrent neural network, and classification. The process is as follows: the brain wave data are first analyzed, revealing that the delta mode contains the information most specific to a user; the delta mode is then decomposed out and fed to an encoder-decoder recurrent neural network with an attention module, which assigns different attention values to the different brain wave channels according to their importance; finally, the discriminative features obtained by learning are passed through an extreme gradient boosting classifier to identify the user. The invention uses the attention-module-based recurrent neural network to realize user identification with higher identification accuracy and better robustness and adaptability.
Description
Technical Field
The invention relates to the field of biological identification, in particular to a user identification method of a recurrent neural network based on an attention module.
Background
Biometric identification technology identifies an individual by combining computers with optics, acoustics, biosensors, biostatistics and other high-tech means, exploiting the inherent physiological and behavioral characteristics of the human body. It is very widely applied, covering government, the military, banking, electronic commerce, security and defense, social welfare and other fields. For example, biometric visas have been introduced: biological characteristics such as facial features and fingerprints are unique, safe and confidential; biometric data are collected and stored when a visa is issued or at border inspection, so that the identity of a person entering the country can be verified more accurately and quickly through comparison. Another example is iris-based clock-in: as long as the two eyes are aligned with the screen, the machine records the iris feature template and completes registration; in the later identification step a person wearing glasses need not take them off, and the machine completes comparison and identification within one second, immediately displaying the identity information and clock-in time on the screen. The technology is applied to coal mine worker attendance, prison inmate management, bank vault access control, border security clearance, military security systems, examinee identity verification and other fields. However, even the latest biometric systems are still not sufficiently secure and stable: an artificial mask can deceive a face recognition system, contact lenses can deceive an iris recognition system, a voice encoder can deceive a speech recognition system, and a fingerprint film can deceive a fingerprint recognition system.
Recently emerging brain wave-based user identification technologies can provide a more effective identification system, but still suffer from poor robustness and adaptability.
The invention provides a user identification method based on a recurrent neural network with an attention module. The brain wave data are first analyzed, revealing that the delta mode contains the information most specific to a user. The delta mode is then decomposed out and fed to an encoder-decoder recurrent neural network with an attention module, which assigns different attention values to the different brain wave channels according to their importance. Finally, the discriminative features obtained by learning are passed through an extreme gradient boosting classifier to identify the user. The invention realizes user identification with the attention-module-based recurrent neural network and achieves higher identification accuracy and better robustness and adaptability.
Disclosure of Invention
Aiming at the problem that the robustness and the adaptability of an identification system are poor, the invention aims to provide a user identification method of a recurrent neural network based on an attention module.
In order to solve the above problems, the present invention provides a method for identifying a user of a recurrent neural network based on an attention module, which mainly comprises the following steps:
(I) preprocessing;
(II) brain wave mode decomposition;
(III) a recurrent neural network based on an attention module;
(IV) classification.
In the preprocessing, the original brain wave sample data must be preprocessed to remove the direct-current (DC) offset and normalize the signal. Removing the DC offset is necessary because the brain wave recording headset introduces a constant noise component into the recorded signal data; normalizing the signal is important because it allows feature data of different units or different scales to be handled.
Further, to remove the DC offset and normalize the signal, a DC offset constant is first subtracted from the signal E; then Z-score scaling normalization is applied to obtain the preprocessed data, namely:

E' = ((E - DC) - μ) / σ   (1)

where DC denotes the direct-current offset constant, μ denotes the mean of E - DC, and σ denotes its standard deviation.
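As a hedged illustration, the DC-offset removal and Z-score normalization of Eq. (1) can be sketched in NumPy; the function name `preprocess` and the suggestion of estimating the offset from the signal mean are ours, not specified by the patent:

```python
import numpy as np

def preprocess(E, dc_offset):
    """Remove a constant DC offset, then Z-score normalize (Eq. 1).

    E         : 1-D array of raw EEG samples from one channel
    dc_offset : constant introduced by the recording headset
                (in practice it could be estimated, e.g. as the signal mean)
    """
    centered = E - dc_offset          # E - DC
    mu = centered.mean()              # mean of E - DC
    sigma = centered.std()            # standard deviation of E - DC
    return (centered - mu) / sigma    # E' = ((E - DC) - mu) / sigma
```

Applied to one channel of raw samples, the result E' has zero mean and unit standard deviation regardless of the recording units.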
In the brain wave mode decomposition, because the signal component of the delta frequency band (0.5 Hz to 4 Hz) in the brain wave signal can identify identity very accurately and stably, the brain waves must be decomposed to obtain the delta signal component. To separate out the delta-band component, a third-order Butterworth band-pass filter with a 0.5 Hz to 4 Hz passband is used; its parameters are: order 3, low cut-off frequency 0.5 Hz, high cut-off frequency 4 Hz. All dimensions of the preprocessed data E' are fed to the band-pass filter in turn until the decomposed delta mode is finally obtained.
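A possible realization of this third-order Butterworth delta-band filter, assuming SciPy is available and an illustrative 128 Hz sampling rate (the patent does not specify one):

```python
import numpy as np
from scipy.signal import butter, filtfilt

def delta_band(E_prime, fs=128.0):
    """Extract the 0.5-4 Hz delta component with a 3rd-order Butterworth
    band-pass filter, applied to each channel (dimension) in turn.

    E_prime : array of shape (n_channels, n_samples), preprocessed EEG,
              or a single 1-D channel
    fs      : sampling rate in Hz (128 Hz is our assumption)
    """
    b, a = butter(3, [0.5, 4.0], btype="bandpass", fs=fs)
    # zero-phase filtering avoids shifting the waveform in time
    return np.array([filtfilt(b, a, ch) for ch in np.atleast_2d(E_prime)])
```

Feeding a signal containing both a 2 Hz (in-band) and a 30 Hz (out-of-band) component through this filter keeps the delta component and strongly attenuates the rest.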
The recurrent neural network based on the attention module adds an attention mechanism to an encoder-decoder recurrent neural network model, which mainly comprises three elements: an encoder, an attention module, and a decoder. The encoder is designed to compress the input delta wave into a single intermediate code C. The attention module generates weight sequences W_att over the different dimensions to help the encoder compute a better attention-based code C_att. The decoder receives the attention-based code C_att and decodes it for user identification; the user identity is derived from the prediction of the attention-module-based recurrent neural network.
Further, the encoder comprises several non-recursive fully-connected neural network layers and one recursive long short-term memory (LSTM) layer. The non-recursive layers construct a non-linear function used to extract the delta mode of the input; their data flow is computed by:

X_{i+1} = T(X_i)   (2)

where X_i denotes the data at the i-th layer and T(X_i) = X_i W + b. The LSTM layer compresses the output of the non-recursive layers into a fixed-length sequence, which is the intermediate code C. Assuming the LSTM is the i'-th layer, the intermediate code equals the output of the LSTM, i.e. C = h^{i'}, where each hidden state h_j^{i'} is computed by:

h_j^{i'} = L(h_{j-1}^{i'}, x_j)   (3)

where h_{j-1}^{i'} denotes the hidden state of the (j-1)-th LSTM unit and L(·) denotes the LSTM computation, which can be written as:

f_o = sigm(W_o [h_{j-1}, x_j] + b_o)
f_f = sigm(W_f [h_{j-1}, x_j] + b_f)
f_i = sigm(W_i [h_{j-1}, x_j] + b_i)
f_m = tanh(W_m [h_{j-1}, x_j] + b_m)
c_j = f_f ⊙ c_{j-1} + f_i ⊙ f_m
h_j = f_o ⊙ tanh(c_j)   (4)

where f_o, f_f, f_i and f_m denote the output gate, forget gate, input gate and input modulation gate, respectively.
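The layer function T of Eq. (2) and the LSTM gates of Eqs. (3)-(4) can be sketched in plain NumPy; the dimensions, names, and random initialization below are illustrative assumptions, not the patent's parameters:

```python
import numpy as np

def sigm(z):
    """Logistic sigmoid used by the three sigmoid gates."""
    return 1.0 / (1.0 + np.exp(-z))

def T(X, W, b):
    """Non-recursive fully connected layer of Eq. (2): T(X) = X W + b."""
    return X @ W + b

class LSTMCell:
    """Minimal LSTM cell realizing the four gates of Eq. (4)."""
    def __init__(self, n_in, n_hidden, seed=0):
        rng = np.random.default_rng(seed)
        shape = (n_hidden, n_hidden + n_in)   # each gate acts on [h_{j-1}, x_j]
        self.Wo, self.Wf, self.Wi, self.Wm = (
            0.1 * rng.standard_normal(shape) for _ in range(4)
        )

    def step(self, h_prev, c_prev, x):
        z = np.concatenate([h_prev, x])
        f_o = sigm(self.Wo @ z)        # output gate
        f_f = sigm(self.Wf @ z)        # forget gate
        f_i = sigm(self.Wi @ z)        # input gate
        f_m = np.tanh(self.Wm @ z)     # input modulation gate
        c = f_f * c_prev + f_i * f_m   # cell state update
        h = f_o * np.tanh(c)           # h_j = L(h_{j-1}, x_j), Eq. (3)
        return h, c

def encode(X, cell):
    """Run the LSTM over a sequence and return its final hidden state,
    i.e. the fixed-length intermediate code C."""
    h = np.zeros(cell.Wo.shape[0])
    c = np.zeros_like(h)
    for x in X:
        h, c = cell.step(h, c, x)
    return h
```

Because h = f_o ⊙ tanh(c), every component of the code C stays strictly inside (-1, 1).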
Further, the attention module receives the final hidden state as the non-normalized attention weight W'_att:

W'_att = h_K^{i'}   (5)

where h_K^{i'} denotes the final hidden state of the LSTM layer. The normalized attention weight is then computed:

W_att = softmax(W'_att)   (6)

where the softmax transfer function normalizes the attention weights into the range [0, 1]. Under the attention mechanism the intermediate code C is weighted to obtain C_att, namely:

C_att = C ⊙ W_att   (7)

where C and W_att are obtained by joint training.
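Eqs. (5)-(7) amount to a softmax over the encoder's final hidden state followed by elementwise weighting of the intermediate code; a minimal NumPy sketch (function names are our assumptions):

```python
import numpy as np

def softmax(z):
    """Normalize a weight vector into [0, 1], summing to one."""
    e = np.exp(z - z.max())   # shift for numerical stability
    return e / e.sum()

def attention_code(C, h_final):
    """Eqs. (5)-(7): the final hidden state serves as the raw attention
    weight W'_att; softmax normalizes it; the intermediate code C is
    then weighted elementwise (Hadamard product)."""
    W_att_raw = h_final            # Eq. (5): W'_att = h_K
    W_att = softmax(W_att_raw)     # Eq. (6)
    return C * W_att               # Eq. (7): C_att = C ⊙ W_att
```

Dimensions of C that receive larger attention weights contribute proportionally more to the weighted code C_att.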
Further, the decoder receives the attention-based code C_att and decodes it to predict the user identity Y'. Y' is predicted at the output layer of the attention-module-based recurrent neural network model, that is:

Y' = T(C_att)   (8)

Finally, the prediction cost between the predicted identity Y' and the real identity Y is computed with a cross-entropy function, and an L2 norm is used to prevent overfitting. The cost function is minimized by the Adam optimizer algorithm, with the iteration threshold of the attention-module-based recurrent neural network set to n_iter. The weighted code C_att has a linear relationship with the output layer and the prediction result; if the model is well trained and attains a very small cost value, the weighted code can be considered to represent the user identity with high quality. The learned depth feature X_D is set equal to C_att and is then used in the identification stage to recognize the final user identity.
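The output layer of Eq. (8) scored by the cross-entropy cost with an L2 penalty can be sketched as follows; the shapes, the `l2` coefficient, and the function name are illustrative assumptions:

```python
import numpy as np

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

def prediction_loss(C_att, W_out, b_out, y_true, l2=1e-4):
    """Output layer Y' = T(C_att) (Eq. 8) scored with cross-entropy
    against the true identity, plus an L2 penalty against overfitting.

    C_att  : attention-weighted code (1-D vector)
    y_true : integer index of the real identity Y
    l2     : illustrative regularization strength (not from the patent)
    """
    logits = C_att @ W_out + b_out        # Y' = T(C_att)
    probs = softmax(logits)
    ce = -np.log(probs[y_true] + 1e-12)   # cross-entropy with one-hot Y
    reg = l2 * float((W_out ** 2).sum())  # L2 norm of the weights
    return ce + reg, probs
```

In the full method this cost would be driven down by the Adam optimizer over n_iter iterations, after which X_D = C_att is handed to the classifier.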
The classification step classifies the learned depth feature X_D with an extreme gradient boosting classifier to achieve user identity classification. The extreme gradient boosting classifier links a series of decision trees and regression trees and tries to detect information from the input data in as much detail as possible. It builds multiple trees, each with its own leaves and corresponding scores; moreover, it provides a regularized model that prevents overfitting, and its accurate prediction performance makes it widely applicable.
Further, for the learned depth feature X_D, a series of decision trees and regression trees is trained with X_D to predict a series of user identities. Let x_d ∈ X_D be a single depth-feature sample; the final recognition result for the input x_d is computed by:

y_m = f(x_d)   (9)

where f denotes the classification function of a single tree and y_m denotes the predicted identity of the m-th tree; F denotes the mapping from the single-tree prediction space to the final prediction space, and I_D, the finally recognized user identity based on the brain wave data, is obtained by applying F to the per-tree predictions.
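Eq. (9) and the aggregation F can be illustrated with a toy ensemble of hand-made stumps; in practice an extreme gradient boosting library such as XGBoost would supply the trees, and the majority-vote F below is purely our assumption for illustration:

```python
import numpy as np

# Each "tree" f_m maps a depth-feature sample x_d to a predicted identity y_m.
# These hand-made stumps stand in for trained boosted trees.
trees = [
    lambda x: 0 if x[0] < 0.5 else 1,
    lambda x: 0 if x[1] < 0.3 else 1,
    lambda x: 0 if x[0] + x[1] < 0.9 else 1,
]

def F(per_tree_preds):
    """Map the single-tree prediction space to the final prediction space
    (majority vote here; XGBoost instead sums leaf scores)."""
    return int(np.round(np.mean(per_tree_preds)))

def identify(x_d):
    """Final recognition I_D for one depth-feature sample x_d."""
    y = [f(x_d) for f in trees]   # y_m = f(x_d), Eq. (9)
    return F(y)                   # I_D
```

A sample that all stumps place on the same side is assigned that identity; disagreements are settled by the vote.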
Drawings
Fig. 1 is a system flow chart of a user identification method of a recurrent neural network based on an attention module according to the present invention.
Fig. 2 is a brain wave collection process diagram of a user identification method of a recurrent neural network based on an attention module according to the present invention.
Detailed Description
It should be noted that the embodiments and features of the embodiments in the present application can be combined with each other without conflict, and the present invention is further described in detail with reference to the drawings and specific embodiments.
Fig. 1 is a system flow chart of a user identification method of a recurrent neural network based on an attention module according to the present invention. The method mainly comprises preprocessing, electroencephalogram mode decomposition, attention module-based recurrent neural network and classification.
Fig. 2 is a brain wave collection process diagram of the user identification method of a recurrent neural network based on an attention module according to the present invention. Through the preprocessing operation, the DC offset is removed from the original brain wave sample data and the signal is normalized; removing the DC offset is necessary because the brain wave recording headset introduces a constant noise component into the recorded signal data, and normalizing the signal is important because it allows feature data of different units or different scales to be handled.
Specifically, a DC offset constant is first subtracted from the signal E; then Z-score scaling normalization is applied to obtain the preprocessed data, namely:

E' = ((E - DC) - μ) / σ   (1)

where DC denotes the direct-current offset constant, μ denotes the mean of E - DC, and σ denotes its standard deviation.
Since the signal component of the delta frequency band (0.5 Hz to 4 Hz) in the brain wave signals can identify identity with extreme accuracy and stability, the brain waves are first decomposed to obtain the delta signal component. To separate out the delta-band component, a third-order Butterworth band-pass filter with a 0.5 Hz to 4 Hz passband is used; its parameters are: order 3, low cut-off frequency 0.5 Hz, high cut-off frequency 4 Hz. All dimensions of the preprocessed data E' are fed to the band-pass filter in turn until the decomposed delta mode is finally obtained.
An attention mechanism is added to an encoder-decoder recurrent neural network model, which mainly comprises three elements: an encoder, an attention module, and a decoder. The encoder compresses the input delta wave into a single intermediate code C; the attention module generates weight sequences W_att over the different dimensions to help the encoder compute a better attention-based code C_att; the decoder receives the attention-based code C_att and decodes it for user identification, the user identity being derived from the prediction of the attention-module-based recurrent neural network.
The encoder comprises several non-recursive fully-connected neural network layers and one recursive long short-term memory (LSTM) layer. The non-recursive layers construct a non-linear function used to extract the delta mode of the input; their data flow is computed by:

X_{i+1} = T(X_i)   (2)

where X_i denotes the data at the i-th layer and T(X_i) = X_i W + b. The LSTM layer compresses the output of the non-recursive layers into a fixed-length sequence, namely the intermediate code C. Assuming the LSTM is the i'-th layer, the intermediate code equals the output of the LSTM, i.e. C = h^{i'}, where each hidden state h_j^{i'} is computed by:

h_j^{i'} = L(h_{j-1}^{i'}, x_j)   (3)

where h_{j-1}^{i'} denotes the hidden state of the (j-1)-th LSTM unit and L(·) denotes the LSTM computation, which can be written as:

f_o = sigm(W_o [h_{j-1}, x_j] + b_o)
f_f = sigm(W_f [h_{j-1}, x_j] + b_f)
f_i = sigm(W_i [h_{j-1}, x_j] + b_i)
f_m = tanh(W_m [h_{j-1}, x_j] + b_m)
c_j = f_f ⊙ c_{j-1} + f_i ⊙ f_m
h_j = f_o ⊙ tanh(c_j)   (4)

where f_o, f_f, f_i and f_m denote the output gate, forget gate, input gate and input modulation gate, respectively.
The attention module receives the final hidden state as the non-normalized attention weight W'_att:

W'_att = h_K^{i'}   (5)

where h_K^{i'} denotes the final hidden state of the LSTM layer. The normalized attention weight is then computed:

W_att = softmax(W'_att)   (6)

where the softmax transfer function normalizes the attention weights into the range [0, 1]. Under the attention mechanism the intermediate code C is weighted to obtain C_att, namely:

C_att = C ⊙ W_att   (7)

where C and W_att are obtained by joint training.
The decoder receives the attention-based code C_att and decodes it to predict the user identity Y', which is predicted at the output layer of the attention-module-based recurrent neural network model, that is:

Y' = T(C_att)   (8)

Finally, the prediction cost between the predicted identity Y' and the real identity Y is computed with a cross-entropy function, and an L2 norm is used to prevent overfitting. The cost function is minimized by the Adam optimizer algorithm, with the iteration threshold of the attention-module-based recurrent neural network set to n_iter. The weighted code C_att has a linear relationship with the output layer and the prediction result; if the model is well trained and attains a small cost value, the weighted code can be considered to represent the user identity with high quality. The learned depth feature X_D is set equal to C_att and is then used in the identification stage to recognize the final user identity.
The learned depth feature X_D is classified with an extreme gradient boosting classifier to achieve user identity classification. The extreme gradient boosting classifier links a series of decision trees and regression trees and tries to detect information from the input data in as much detail as possible; it builds multiple trees, each with its own leaves and corresponding scores, and it provides a regularized model that prevents overfitting, its accurate prediction performance making it widely applicable. A series of decision trees and regression trees is trained with X_D to predict a series of user identities. Let x_d ∈ X_D be a single depth-feature sample; the final recognition result for the input x_d is computed by:

y_m = f(x_d)   (9)

where f denotes the classification function of a single tree and y_m the predicted identity of the m-th tree; F denotes the mapping from the single-tree prediction space to the final prediction space, and I_D is the finally recognized user identity based on the brain wave data.
It will be appreciated by persons skilled in the art that the invention is not limited to details of the foregoing embodiments and that the invention can be embodied in other specific forms without departing from the spirit or scope of the invention. In addition, various modifications and alterations of this invention may be made by those skilled in the art without departing from the spirit and scope of this invention, and such modifications and alterations should also be viewed as being within the scope of this invention. It is therefore intended that the following appended claims be interpreted as including preferred embodiments and all such alterations and modifications as fall within the scope of the invention.
Claims (10)
1. A user identification method of a recurrent neural network based on an attention module, characterized by mainly comprising (I) preprocessing; (II) brain wave mode decomposition; (III) a recurrent neural network based on an attention module; and (IV) classification.
2. The preprocessing (I) as claimed in claim 1, characterized in that the original brain wave sample data is preprocessed to remove the DC offset and normalize the signal; removing the DC offset is necessary because the brain wave recording headset introduces a constant noise component into the recorded signal data, and normalizing the signal is important because it allows feature data of different units or different scales to be handled.
3. The removal of the DC offset and normalization of the signal as claimed in claim 2, characterized in that a DC offset constant is first subtracted from the signal E; then Z-score scaling normalization is applied to obtain the preprocessed data, namely:

E' = ((E - DC) - μ) / σ   (1)

where DC denotes the direct-current offset constant, μ denotes the mean of E - DC, and σ denotes its standard deviation.
4. The brain wave mode decomposition (II) as claimed in claim 1, characterized in that, since the signal component of the delta frequency band (0.5 Hz to 4 Hz) in the brain wave signal can identify identity with extreme accuracy and stability, the brain waves are first decomposed to obtain the delta signal component; to separate out the delta-band component, a third-order Butterworth band-pass filter with a 0.5 Hz to 4 Hz passband is used, its parameters being: order 3, low cut-off frequency 0.5 Hz, high cut-off frequency 4 Hz; all dimensions of the preprocessed data E' are fed to the band-pass filter in turn until the decomposed delta mode is finally obtained.
5. The recurrent neural network based on an attention module (III) as claimed in claim 1, characterized in that an attention mechanism is added to an encoder-decoder recurrent neural network model, which mainly comprises three elements: an encoder, an attention module, and a decoder; the encoder compresses the input delta wave into a single intermediate code C; the attention module generates weight sequences W_att over the different dimensions to help the encoder compute a better attention-based code C_att; the decoder receives the attention-based code C_att and decodes it for user identification, the user identity being derived from the prediction of the attention-module-based recurrent neural network.
6. The encoder as claimed in claim 5, characterized by comprising several non-recursive fully-connected neural network layers and one recursive long short-term memory (LSTM) layer; the non-recursive layers construct a non-linear function used to extract the delta mode of the input, their data flow being computed by:

X_{i+1} = T(X_i)   (2)

where X_i denotes the data at the i-th layer and T(X_i) = X_i W + b; the LSTM layer compresses the output of the non-recursive layers into a fixed-length sequence, namely the intermediate code C; assuming the LSTM is the i'-th layer, the intermediate code equals the output of the LSTM, i.e. C = h^{i'}, where each hidden state h_j^{i'} is computed by:

h_j^{i'} = L(h_{j-1}^{i'}, x_j)   (3)

where h_{j-1}^{i'} denotes the hidden state of the (j-1)-th LSTM unit and L(·) denotes the LSTM computation:

f_o = sigm(W_o [h_{j-1}, x_j] + b_o)
f_f = sigm(W_f [h_{j-1}, x_j] + b_f)
f_i = sigm(W_i [h_{j-1}, x_j] + b_i)
f_m = tanh(W_m [h_{j-1}, x_j] + b_m)
c_j = f_f ⊙ c_{j-1} + f_i ⊙ f_m
h_j = f_o ⊙ tanh(c_j)   (4)

where f_o, f_f, f_i and f_m denote the output gate, forget gate, input gate and input modulation gate, respectively.
7. The attention module as claimed in claim 5, characterized in that the final hidden state is received as the non-normalized attention weight W'_att:

W'_att = h_K^{i'}   (5)

the normalized attention weight is then computed:

W_att = softmax(W'_att)   (6)

where the softmax transfer function normalizes the attention weights into the range [0, 1]; under the attention mechanism the intermediate code C is weighted to obtain C_att, namely:

C_att = C ⊙ W_att   (7)

where C and W_att are obtained by joint training.
8. The decoder as claimed in claim 5, characterized in that the decoder receives the attention-based code C_att and decodes it to predict the user identity Y', which is predicted at the output layer of the attention-module-based recurrent neural network model, that is:

Y' = T(C_att)   (8)

finally, the prediction cost between the predicted identity Y' and the real identity Y is computed with a cross-entropy function; an L2 norm is used to prevent overfitting; the cost function is minimized by the Adam optimizer algorithm; the iteration threshold of the attention-module-based recurrent neural network is set to n_iter; the weighted code C_att has a linear relationship with the output layer and the prediction result; if the model is well trained and attains a small cost value, the weighted code can be considered to represent the user identity with high quality; the learned depth feature X_D is set equal to C_att and is then used in the identification stage to recognize the final user identity.
9. The classification (IV) as claimed in claim 1, characterized in that the learned depth feature X_D is classified with an extreme gradient boosting classifier to achieve user identity classification; the extreme gradient boosting classifier links a series of decision trees and regression trees and tries to detect information from the input data in as much detail as possible; it builds multiple trees, each with its own leaves and corresponding scores; moreover, it provides a regularized model that prevents overfitting, and its accurate prediction performance makes it widely applicable.
10. The learned depth feature X_D as claimed in claim 9, characterized in that a series of decision trees and regression trees is trained with X_D to predict a series of user identities; let x_d ∈ X_D be a single depth-feature sample, then the final recognition result for the input x_d is computed by:

y_m = f(x_d)   (9)

where f denotes the classification function of a single tree, y_m denotes the predicted identity of the m-th tree, F denotes the mapping from the single-tree prediction space to the final prediction space, and I_D is the finally recognized user identity based on the brain wave data.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201810045103.8A CN108182470A (en) | 2018-01-17 | 2018-01-17 | A kind of user identification method based on the recurrent neural network for paying attention to module |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201810045103.8A CN108182470A (en) | 2018-01-17 | 2018-01-17 | A kind of user identification method based on the recurrent neural network for paying attention to module |
Publications (1)
Publication Number | Publication Date |
---|---|
CN108182470A true CN108182470A (en) | 2018-06-19 |
Family
ID=62550857
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201810045103.8A Withdrawn CN108182470A (en) | 2018-01-17 | 2018-01-17 | A kind of user identification method based on the recurrent neural network for paying attention to module |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN108182470A (en) |
Cited By (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109124625A (en) * | 2018-09-04 | 2019-01-04 | 大连理工大学 | A kind of driver fatigue state horizontal mipmap method |
CN109846477A (en) * | 2019-01-29 | 2019-06-07 | 北京工业大学 | A kind of brain electricity classification method based on frequency band attention residual error network |
CN109948427A (en) * | 2019-01-24 | 2019-06-28 | 齐鲁工业大学 | A kind of idea recognition methods based on long memory models in short-term |
CN110472395A (en) * | 2019-08-05 | 2019-11-19 | 武汉联影医疗科技有限公司 | Brain wave data processing method, device, equipment, medium and brain wave data processor |
CN110633417A (en) * | 2019-09-12 | 2019-12-31 | 齐鲁工业大学 | Web service recommendation method and system based on service quality |
CN111460892A (en) * | 2020-03-02 | 2020-07-28 | 五邑大学 | Electroencephalogram mode classification model training method, classification method and system |
CN111543988A (en) * | 2020-05-25 | 2020-08-18 | 五邑大学 | Adaptive cognitive activity recognition method and device and storage medium |
CN113472484A (en) * | 2021-06-29 | 2021-10-01 | 哈尔滨工业大学 | Internet of things terminal equipment user feature code identification method based on cross entropy iterative learning |
CN114224361A (en) * | 2021-12-31 | 2022-03-25 | 杭州电子科技大学 | Sleep stage classification method and device based on electroencephalogram signals |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN105975457A (en) * | 2016-05-03 | 2016-09-28 | 成都数联铭品科技有限公司 | Information classification prediction system based on full-automatic learning |
CN107273800A (en) * | 2017-05-17 | 2017-10-20 | 大连理工大学 | A kind of action identification method of the convolution recurrent neural network based on attention mechanism |
- 2018-01-17: application CN201810045103.8A filed in China; patent CN108182470A, status not active (Withdrawn)
Patent Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN105975457A (en) * | 2016-05-03 | 2016-09-28 | 成都数联铭品科技有限公司 | Information classification prediction system based on full-automatic learning |
CN107273800A (en) * | 2017-05-17 | 2017-10-20 | 大连理工大学 | Action recognition method using a convolutional recurrent neural network based on an attention mechanism |
Non-Patent Citations (1)
Title |
---|
XIANG ZHANG: "MindID: Person Identification from Brain Waves through Attention-based Recurrent Neural Network", arXiv:1711.06149v1 * |
Cited By (15)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109124625B (en) * | 2018-09-04 | 2021-07-20 | 大连理工大学 | Driver fatigue state level grading method |
CN109124625A (en) * | 2018-09-04 | 2019-01-04 | 大连理工大学 | Driver fatigue state level grading method |
CN109948427A (en) * | 2019-01-24 | 2019-06-28 | 齐鲁工业大学 | Idea recognition method based on a long short-term memory model |
CN109846477A (en) * | 2019-01-29 | 2019-06-07 | 北京工业大学 | Electroencephalogram classification method based on a frequency-band attention residual network |
CN109846477B (en) * | 2019-01-29 | 2021-08-06 | 北京工业大学 | Electroencephalogram classification method based on a frequency-band attention residual network |
CN110472395A (en) * | 2019-08-05 | 2019-11-19 | 武汉联影医疗科技有限公司 | Brain wave data processing method, device, equipment, medium and brain wave data processor |
CN110472395B (en) * | 2019-08-05 | 2021-04-20 | 武汉联影医疗科技有限公司 | Brain wave data processing method, device, equipment, medium and brain wave data processor |
CN110633417A (en) * | 2019-09-12 | 2019-12-31 | 齐鲁工业大学 | Web service recommendation method and system based on service quality |
CN110633417B (en) * | 2019-09-12 | 2023-04-07 | 齐鲁工业大学 | Web service recommendation method and system based on service quality |
CN111460892A (en) * | 2020-03-02 | 2020-07-28 | 五邑大学 | Electroencephalogram mode classification model training method, classification method and system |
WO2021174618A1 (en) * | 2020-03-02 | 2021-09-10 | 五邑大学 | Training method for electroencephalography mode classification model, classification method and system |
CN111543988A (en) * | 2020-05-25 | 2020-08-18 | 五邑大学 | Adaptive cognitive activity recognition method and device and storage medium |
CN113472484A (en) * | 2021-06-29 | 2021-10-01 | 哈尔滨工业大学 | Internet of things terminal equipment user feature code identification method based on cross entropy iterative learning |
CN113472484B (en) * | 2021-06-29 | 2022-08-05 | 哈尔滨工业大学 | Internet of things equipment user feature code identification method based on cross entropy iterative learning |
CN114224361A (en) * | 2021-12-31 | 2022-03-25 | 杭州电子科技大学 | Sleep stage classification method and device based on electroencephalogram signals |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN108182470A (en) | User identification method based on a recurrent neural network with an attention module | |
Budak et al. | An effective hybrid model for EEG-based drowsiness detection | |
Wayman et al. | An introduction to biometric authentication systems | |
Komeili et al. | Liveness detection and automatic template updating using fusion of ECG and fingerprint | |
Saini et al. | Biometrics in forensic identification: applications and challenges | |
Meuwly et al. | Forensic biometrics: From two communities to one discipline | |
CN112149638B (en) | Personnel identity recognition system construction and use method based on multi-modal biological characteristics | |
CN105989266B (en) | Authentication method, device and system based on electrocardiosignals | |
CN104239766A (en) | Video and audio based identity authentication method and system for nuclear power plants | |
CN103093234A (en) | Identity recognition method based on ground reactive force during walking | |
Jyotishi et al. | An ECG biometric system using hierarchical LSTM with attention mechanism | |
Hu et al. | A pervasive EEG-based biometric system | |
CN107944356A (en) | The identity identifying method of the hierarchical subject model palmprint image identification of comprehensive polymorphic type feature | |
Mohanchandra et al. | Using brain waves as new biometric feature for authenticating a computer user in real-time | |
CN116671919B (en) | Emotion detection reminding method based on wearable equipment | |
Al-Nima et al. | Using hand-dorsal images to reproduce face images by applying back propagation and cascade-forward neural networks | |
Jomaa et al. | A multilayer system to boost the robustness of fingerprint authentication against presentation attacks by fusion with heart-signal | |
Mostayed et al. | Foot step based person identification using histogram similarity and wavelet decomposition | |
Chiu et al. | A micro-control capture images technology for the finger vein recognition based on adaptive image segmentation | |
Alghamdi et al. | Artificial intelligence Techniques based learner authentication in cybersecurity higher education institutions | |
Srinivas et al. | Artificial intelligence based optimal biometric security system using palm veins | |
Ibrahim et al. | Trends in Biometric Authentication: A review | |
CN116861217B (en) | Identity recognition method and system for mobile terminal | |
Nejrs et al. | Face image classification based on feature extraction | |
Allah | Artificial neural networks based fingerprint authentication with clusters algorithm |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
WW01 | Invention patent application withdrawn after publication |
Application publication date: 2018-06-19 |