CN112883355B - Non-contact user identity authentication method based on RFID and convolutional neural network - Google Patents


Info

Publication number
CN112883355B
Authority
CN
China
Prior art keywords
layer
phase
identity authentication
data
neural network
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202110311961.4A
Other languages
Chinese (zh)
Other versions
CN112883355A (en)
Inventor
肖甫
戴纪馨
盛碧云
周剑
刘海猛
Current Assignee
Nanjing University of Posts and Telecommunications
Original Assignee
Nanjing University of Posts and Telecommunications
Priority date
Filing date
Publication date
Application filed by Nanjing University of Posts and Telecommunications
Priority to CN202110311961.4A
Publication of CN112883355A
Application granted
Publication of CN112883355B
Legal status: Active
Anticipated expiration

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 21/00 - Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F 21/30 - Authentication, i.e. establishing the identity or authorisation of security principals
    • G06F 21/31 - User authentication
    • G06F 21/32 - User authentication using biometric data, e.g. fingerprints, iris scans or voiceprints
    • G06K - GRAPHICAL DATA READING; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
    • G06K 17/00 - Methods or arrangements for effecting co-operative working between equipments covered by two or more of main groups G06K1/00 - G06K15/00, e.g. automatic card files incorporating conveying and reading operations
    • G06K 17/0022 - arrangements or provisions for transferring data to distant stations, e.g. from a sensing device
    • G06N - COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 3/00 - Computing arrangements based on biological models
    • G06N 3/02 - Neural networks
    • G06N 3/04 - Architecture, e.g. interconnection topology
    • G06N 3/045 - Combinations of networks
    • G06N 3/08 - Learning methods

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Software Systems (AREA)
  • General Health & Medical Sciences (AREA)
  • Biomedical Technology (AREA)
  • Data Mining & Analysis (AREA)
  • Evolutionary Computation (AREA)
  • Biophysics (AREA)
  • Molecular Biology (AREA)
  • Computing Systems (AREA)
  • Computational Linguistics (AREA)
  • Artificial Intelligence (AREA)
  • Mathematical Physics (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Health & Medical Sciences (AREA)
  • Computer Security & Cryptography (AREA)
  • Computer Hardware Design (AREA)
  • Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)
  • Collating Specific Patterns (AREA)

Abstract

A non-contact user identity authentication method based on RFID and a convolutional neural network deploys an RFID tag matrix imitating the nine-grid unlock pattern of a mobile phone, acquires phase information during hand movement, and realizes user identity authentication with a convolutional neural network. The implementation steps comprise: 1. collect the phase information of the actions used for identity authentication; 2. preprocess the acquired phase information; 3. extract the start-point information of each action sample from the preprocessed phase information; 4. make the acquired phase information into a data set, then train on the data with a deep-learning convolutional neural network to obtain a CNN model for authentication; 5. when the user performs identity authentication, recognize the user's actions with the trained model; 6. obtain the authentication result and judge whether the authentication succeeds. The invention uses a convolutional neural network (CNN) to extract features from the phase for action recognition and thus identity authentication, with good accuracy, security and robustness.

Description

Non-contact user identity authentication method based on RFID and convolutional neural network
Technical Field
The invention relates to the technical field of action recognition, in particular to a non-contact user identity authentication method based on RFID and convolutional neural network.
Background
Wireless sensing has developed rapidly in recent years across fields such as activity sensing, indoor positioning, smart homes and behavioral analysis. More and more households deploy Internet-of-Things devices, yet the security problems these devices bring are often ignored alongside their intelligence and convenience. Some Internet-of-Things devices control building access and environment, and audio and video devices may even monitor users, so identity authentication is important.
Conventional contact-based recognition techniques require the user to carry a dedicated device, so the concept of non-contact recognition, which does not rely on the recognized subject carrying a dedicated device, has become a research focus in recent years. Such methods mainly exploit the interference and reflection that user actions impose on the wireless signals between transceiver devices, extract features from the received signals, and then perform matching and recognition with a designed model. Non-contact action recognition is easy to operate, easy to deploy and applicable to many scenarios, and can be widely applied in smart homes, medical care, industrial manufacturing and other areas.
Convolutional neural networks (CNNs), one of the representative algorithms of deep learning, are feed-forward neural networks with a deep structure that include convolution computations. They have succeeded in numerous areas, such as image recognition, image segmentation, speech recognition and natural language processing. Their principle is that features can be learned automatically from data and the result generalized to unknown data of the same type. They can therefore be applied to the field of action recognition with high efficiency and accuracy.
Disclosure of Invention
The main aim of the invention is to provide a non-contact user identity authentication method based on RFID and a convolutional neural network, which deploys a three-row, three-column tag matrix imitating the nine-grid unlock pattern of a mobile phone, collects the phase information of the recognition action, and uses a convolutional neural network algorithm to match it against pre-trained identity authentication actions, so as to recognize the user identity safely and effectively.
A non-contact user identity authentication method based on RFID and convolutional neural network comprises the following steps:
step 1: arranging a passive RFID tag matrix in an indoor environment, wherein the arrangement is a 3×3 nine-grid structure, and collecting phase data for identity authentication;
step 2: preprocessing the phase information acquired in the step 1;
step 3: acquiring information of a starting point and an ending point of each action sample from the phase information preprocessed in the step 2;
step 4: the phase information data are made into an identity authentication data set, and deep learning convolutional neural network CNN is adopted to train the data, so that a CNN model for identity authentication is obtained;
step 5: when a user performs identity authentication, monitoring the action of the user, collecting phase information by a tag matrix, and then sending data into a CNN model obtained in the step 4 after the processing of the step 2 and the step 3 to identify the identity of the user;
step 6: judging whether the user identity authentication is successful according to the result of step 5.
Further, in step 1, the arranged passive RFID tag matrix consists of 9 tags placed equidistantly on a plane in three rows and three columns, with a spacing of 12.5 cm between adjacent tags; the antenna is placed 1.2 m behind the tags; the phase information of the tags in the matrix is acquired through communication between the RFID reader and the tags.
Further, in the step 2, the preprocessing is performed on the collected phase information, which specifically includes the following steps:
step 2-1, phase unwrapping: the original phase signal is a periodic function in the range of 0-2π rad; when the phase value approaches 0 or 2π, a phase jump occurs; let the phase vector of the original sampled signal of tag m be θ_m = [θ_{m,1}, θ_{m,2}, ..., θ_{m,n}], m = 1, 2, ..., 9; suppose the phase values at two consecutive times are θ_{m,i} and θ_{m,i+1}; due to the periodic function, the actual signals should be φ_{m,i} = θ_{m,i} + 2πN_{m,i} and φ_{m,i+1} = θ_{m,i+1} + 2πN_{m,i+1}; the phase-unwrapping problem is thus converted into solving for N_{m,i}, i = 1, 2, ..., n; since the phase lies in the range 0-2π rad, a jump of more than π between consecutive raw phase values indicates a wrap, so ΔN_{m,i} = N_{m,i+1} - N_{m,i} is calculated as:

ΔN_{m,i} = -1,  if θ_{m,i+1} - θ_{m,i} > π
ΔN_{m,i} = +1,  if θ_{m,i+1} - θ_{m,i} < -π
ΔN_{m,i} = 0,   otherwise

N_{m,i} is then calculated by accumulation, with N_{m,1} = 0:

N_{m,i} = Σ_{j=1}^{i-1} ΔN_{m,j}

According to φ_{m,i} = θ_{m,i} + 2πN_{m,i}, the phase vector after unwrapping is derived as φ_m = [φ_{m,1}, φ_{m,2}, ..., φ_{m,n}].
Step 2-2, data normalization processing: acquiring a phase sequence with the length of t for each tag in a static state, and solving an average phase value of each tag in the static state; when the phase data is processed later, the average value is subtracted from the phase value of each tag, so that the phase of each tag in a static state is normalized to 0.
Step 2-3, filtering the phase using a butterworth low pass filter: when the signal filtering is carried out, the low-frequency signal generated by the motion normally passes through, and the high-frequency signal with noise is weakened or shielded; the butterworth low pass filter is expressed in terms of square of amplitude versus frequency as follows:
Figure BDA0002989764390000041
where n is the order, ω of the filter c For cut-off frequency omega p Is the value of the passband edge; when ω=ω p Time of day
Figure BDA0002989764390000042
2 Is the value of the passband edge.
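The preprocessing of steps 2-1 to 2-3 can be sketched in a few lines of Python (an illustrative re-implementation, not the patent's code; the function names are our own, and the π wrap-detection threshold is the standard choice for unwrapping):

```python
import math

def unwrap_phase(theta):
    """Unwrap a phase sequence reported in [0, 2*pi) by tracking the integer
    cycle count N: a jump of more than +pi between consecutive raw samples
    means the true phase wrapped downward (N decreases by 1); a jump below
    -pi means it wrapped upward (N increases by 1)."""
    phi = [theta[0]]  # N_{m,1} = 0
    n = 0
    for prev, cur in zip(theta, theta[1:]):
        if cur - prev > math.pi:
            n -= 1
        elif cur - prev < -math.pi:
            n += 1
        phi.append(cur + 2 * math.pi * n)
    return phi

def normalize_static(phi, static_phi):
    """Subtract the tag's average static-state phase so rest reads as 0."""
    mean = sum(static_phi) / len(static_phi)
    return [p - mean for p in phi]

def butterworth_gain_sq(omega, omega_c, order):
    """Squared magnitude response |H(w)|^2 = 1 / (1 + (w/w_c)^(2n))."""
    return 1.0 / (1.0 + (omega / omega_c) ** (2 * order))
```

At the cut-off frequency the squared gain is exactly 0.5 and falls off faster for higher orders, which is why the low-frequency motion components pass while high-frequency noise is attenuated.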
Further, the step 3 specifically includes the following steps:
step 3-1, determining the approximate range of the start or end of an action with a KL-divergence adaptive segmentation algorithm: set a sliding window, calculate the discrete probability distribution function (PDF) of the data in each sliding window, let P and Q be the PDFs of two consecutive sliding windows, and calculate the KL divergence of P and Q:

D_KL(P || Q) = Σ_i P(i) · log( P(i) / Q(i) )

when the divergence value is large, the two windows exhibit different motion states, which is used for action segmentation;
step 3-2, determining the exact time at which the action starts or ends: expand the sliding window obtained in step 3-1 forward and backward by 5 times, obtaining a time period 10 times the previous sliding window; perform a sliding-window operation over this period for each tag and determine each tag's start or end time; the sliding window uses the amplitude and frequency of the signal as features; with L the length of the sliding window and φ_{i,k} the k-th data point of the i-th window, the amplitude value A_i and frequency F_i of the i-th window are computed over the L samples of that window, and the measurement difference function G is calculated as:

G(i) = C_A·|A_{i+1} - A_i| + C_F·|F_{i+1} - F_i|

where C_A and C_F are two constants that weight the two terms; the start and end times of each tag are thus obtained, the time at which the first tag starts to change being the time at which the action starts; the action end time is obtained similarly.
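The two-stage segmentation above can be sketched as follows (illustrative only: the histogram binning and the concrete amplitude/frequency statistics are our assumptions, since the patent does not fix the exact forms of A_i and F_i here):

```python
import math
from collections import Counter

def pdf(window, bins=10, lo=0.0, hi=2 * math.pi):
    """Discrete probability distribution of the samples in one sliding window."""
    counts = Counter(min(int((x - lo) / (hi - lo) * bins), bins - 1)
                     for x in window)
    n = len(window)
    return [counts.get(b, 0) / n for b in range(bins)]

def kl_divergence(p, q, eps=1e-12):
    """D_KL(P || Q) = sum_i P(i) * log(P(i) / Q(i)); eps avoids log(0)."""
    return sum(pi * math.log((pi + eps) / (qi + eps)) for pi, qi in zip(p, q))

def diff_function(win_a, win_b, c_a=1.0, c_f=1.0):
    """G = C_A*|A_{i+1}-A_i| + C_F*|F_{i+1}-F_i|, with assumed statistics:
    A_i = mean phase of the window, F_i = mean absolute sample-to-sample
    change (a proxy for signal frequency)."""
    def amp(w):
        return sum(w) / len(w)
    def freq(w):
        return sum(abs(b - a) for a, b in zip(w, w[1:])) / max(len(w) - 1, 1)
    return c_a * abs(amp(win_b) - amp(win_a)) + c_f * abs(freq(win_b) - freq(win_a))
```

A still window compared with itself yields zero divergence and zero G, while a still window compared with a moving one yields large values, which is the threshold criterion used for segmentation.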
Further, in step 4, phase samples are clipped and extracted based on the action start-point information, and a data set for identity authentication is constructed. Specifically, a phase matrix is built, generating an m×n matrix P, where m is the number of tags and n is the number of sampling points:

P = [ φ_{1,1}  φ_{1,2}  ...  φ_{1,n} ]
    [ φ_{2,1}  φ_{2,2}  ...  φ_{2,n} ]
    [   ...      ...    ...    ...   ]
    [ φ_{m,1}  φ_{m,2}  ...  φ_{m,n} ]

Then the imresize(A, m) function in MATLAB is used, where A is the original matrix and m is the normalized size, normalizing the tag matrix into a data set of fixed size.
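The resize to a fixed input size can be sketched with nearest-neighbour interpolation (a simplified stand-in for MATLAB's imresize, whose default is bicubic; the target size below is an arbitrary example):

```python
def resize_nearest(matrix, out_rows, out_cols):
    """Nearest-neighbour resize of a 2-D list of numbers to (out_rows, out_cols)."""
    in_rows, in_cols = len(matrix), len(matrix[0])
    return [
        [matrix[min(int(r * in_rows / out_rows), in_rows - 1)]
               [min(int(c * in_cols / out_cols), in_cols - 1)]
         for c in range(out_cols)]
        for r in range(out_rows)
    ]
```

This maps every output cell back to the closest input cell, so a 9×n phase matrix of any duration n becomes a fixed-size array suitable as CNN input.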
Further, in step 4, the CNN model includes: an input layer, convolution layers, max-pooling layers, a flattening layer, fully connected layers and a classification layer;
the convolution layers comprise a first convolution layer and a second convolution layer; the max-pooling layers comprise a first max-pooling layer and a second max-pooling layer; the fully connected layers comprise a first fully connected layer and a second fully connected layer;
the CNN model is structured as follows: the input layer is at the front end and is connected to the first convolution layer; the first convolution layer is connected to the first max-pooling layer; the first max-pooling layer is connected to the second convolution layer; the second convolution layer is connected to the second max-pooling layer; the second max-pooling layer is connected to the flattening layer; the flattening layer is connected to the first fully connected layer; the first fully connected layer is connected to the second fully connected layer;
the input layer completes the processing of the input data; the convolution layers use convolution kernels for feature extraction and feature mapping; the max-pooling layers effectively reduce the number of parameters and thus the network complexity; the flattening layer compresses the data into a one-dimensional array; the fully connected layers map the learned distributed feature representation to the sample label space; the classification layer completes the classification of the identity-authentication recognition actions;
the processed training data set is fed into the constructed CNN model; the model learns continuously from the large number of data samples in the training set, and finally a CNN model meeting the requirements is trained.
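The layer pipeline described above (convolution, max-pooling, convolution, max-pooling, flatten, fully connected) can be illustrated with a minimal plain-Python forward pass; the kernel, weights and sizes below are placeholders, since the patent does not specify them:

```python
def conv2d(x, k):
    """Valid 2-D convolution (cross-correlation) of matrix x with kernel k."""
    kh, kw = len(k), len(k[0])
    return [
        [sum(x[i + a][j + b] * k[a][b] for a in range(kh) for b in range(kw))
         for j in range(len(x[0]) - kw + 1)]
        for i in range(len(x) - kh + 1)
    ]

def maxpool2(x):
    """2x2 max pooling with stride 2 (trailing odd row/column dropped)."""
    return [
        [max(x[i][j], x[i][j + 1], x[i + 1][j], x[i + 1][j + 1])
         for j in range(0, len(x[0]) - 1, 2)]
        for i in range(0, len(x) - 1, 2)
    ]

def flatten(x):
    """Compress a 2-D feature map into a one-dimensional array."""
    return [v for row in x for v in row]

def dense(v, weights, bias):
    """Fully connected layer: one output per weight row."""
    return [sum(wi * vi for wi, vi in zip(w, v)) + b
            for w, b in zip(weights, bias)]
```

A 9×9 single-channel input (for instance a resized tag-phase matrix) shrinks to 7×7 after a 3×3 valid convolution and to 3×3 after 2×2 pooling; in the real model this repeats once more before flattening into the fully connected layers and the classifier.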
Further, in step 5, the user begins user identity authentication, and the tag matrix collects the phase data of the user's authentication action; the raw data are processed by step 2 and step 3 to obtain the data related to the authentication action; the processed data are made into a data set by the method of step 4 and then fed into the trained CNN model for classification and authentication of the action.
Further, in step 6, the action is evaluated with the trained CNN model, and it is judged whether the recognized action matches the trained action; if they match, the identity authentication succeeds.
Compared with the prior art, the invention has the following beneficial effects. The invention is a non-contact user identity authentication technique based on RFID and a convolutional neural network, which addresses the lack of security authentication in existing Internet-of-Things systems. Compared with traditional identity authentication, it requires no expensive equipment: only a single reader, a single antenna and 9 commercial passive RFID tags are used, so the cost is low and deployment is simple; it does not violate user privacy, and it improves system security. Compared with traditional action segmentation methods, a new signal segmentation method is designed and implemented that segments actions accurately and effectively. In addition, the invention has broad expansion prospects: it can serve as a terminal for operating the home Internet of Things and merits continued research. To improve recognition accuracy, the invention uses a convolutional neural network to recognize actions. The invention was implemented in an indoor room simulating a home user identity authentication scenario, where the action recognition accuracy reached 96.3%.
Drawings
Fig. 1 is a system framework of a non-contact user identity authentication technology based on RFID and convolutional neural network according to an embodiment of the present invention.
Fig. 2 is a label matrix layout of a non-contact user identity authentication technique based on RFID and convolutional neural network according to an embodiment of the present invention.
Fig. 3 is a layout view of an RFID device according to the non-contact user identity authentication technology based on an RFID and a convolutional neural network in an embodiment of the present invention.
Fig. 4 illustrates a non-contact user identity authentication technique recognition action based on RFID and convolutional neural network according to an embodiment of the present invention.
Fig. 5 is a CNN model structure of a non-contact user identity authentication technology based on RFID and convolutional neural network according to an embodiment of the present invention.
Detailed Description
The technical scheme of the invention is further described in detail below with reference to the attached drawings.
Referring to fig. 1, the method includes a phase preprocessing flow, an action extraction flow and an identity authentication flow. The passive RFID tag matrix collects the phase data of the user's recognition actions; the positional layout of the tags is shown in figs. 2 and 3. The collected phase data cover 5 motion patterns used for identity authentication; the collected motion patterns are shown in fig. 4. The antenna sends signals to the passive RFID tags, receives their return signals and forwards them to the RFID reader; the RFID reader modulates and demodulates the signals and decodes the data packets; the phase preprocessing unit performs phase unwrapping, normalization and data smoothing on the raw data; the action extraction unit determines the start and end time of each action, including determining the action range and the start or end time of each tag; the identity authentication unit trains a suitable CNN model, shown in fig. 5, from a large amount of phase data of the recognition actions and classifies the actions to be recognized.
The phase preprocessing flow comprises the following steps:
Phase unwrapping: the original phase signal is a periodic function in the range of 0-2π rad. When the phase value approaches 0 or 2π, a phase jump may occur. Let the phase vector of the original sampled signal of tag m be θ_m = [θ_{m,1}, θ_{m,2}, ..., θ_{m,n}]. Suppose the phase values at two consecutive times are θ_{m,i} and θ_{m,i+1}. For reasons of the periodic function, their actual signals should be φ_{m,i} = θ_{m,i} + 2πN_{m,i} and φ_{m,i+1} = θ_{m,i+1} + 2πN_{m,i+1}. The phase-unwrapping problem is converted into solving N_{m,i}, i = 1, 2, ..., n. Since the phase is in the range 0-2π rad, a jump of more than π between consecutive raw phase values indicates a wrap. Thus ΔN_{m,i} = N_{m,i+1} - N_{m,i} can be calculated as:

ΔN_{m,i} = -1,  if θ_{m,i+1} - θ_{m,i} > π
ΔN_{m,i} = +1,  if θ_{m,i+1} - θ_{m,i} < -π
ΔN_{m,i} = 0,   otherwise

Then N_{m,i} can be calculated by accumulation, with N_{m,1} = 0:

N_{m,i} = Σ_{j=1}^{i-1} ΔN_{m,j}

According to φ_{m,i} = θ_{m,i} + 2πN_{m,i}, the phase vector after unwrapping can be derived as φ_m = [φ_{m,1}, φ_{m,2}, ..., φ_{m,n}].
Data normalization: collect t readings of each tag in the static state and obtain the average phase value of each tag at rest:

φ̄_m = (1/t) · Σ_{j=1}^{t} φ_{m,j}

When the phase data are processed later, this average value is subtracted from each tag's phase values, so that each tag's phase in the static state is normalized to 0.
The phase is filtered with a Butterworth low-pass filter. The principle is that the phase signal generated by human motion lies at low frequency: when the Butterworth filter filters the signal, the low-frequency signal generated by the motion passes through normally, while the high-frequency signal where the noise lies is attenuated or blocked. The Butterworth low-pass filter can be expressed by the squared magnitude of its frequency response:

|H(ω)|² = 1 / (1 + (ω/ω_c)^{2n})

where n is the order of the filter, ω_c is the cut-off frequency and ω_p is the passband edge; when ω = ω_p, |H(ω_p)|² is the value at the passband edge.
The action extraction flow comprises the following steps:
Determining the approximate range of the start or end of an action: using the KL-divergence adaptive segmentation algorithm, set a sliding window, calculate the discrete probability distribution function (PDF) of the data in each sliding window, let P and Q be the PDFs of two consecutive sliding windows, and calculate the KL divergence of P and Q:

D_KL(P || Q) = Σ_i P(i) · log( P(i) / Q(i) )

When the divergence value is large, the two windows exhibit different motion states, which is used for action segmentation. Determining the exact time at which the action starts or ends: expand the sliding window obtained in the previous step forward and backward by 5 times, obtaining a time period 10 times the previous sliding window. Perform a sliding-window operation over this period for each tag and determine each tag's start or end time. The sliding window uses the amplitude and frequency of the signal as features.
With L the length of the sliding window and φ_{i,k} the k-th data point of the i-th window, the amplitude value A_i and frequency F_i of the i-th window are computed over the L samples of that window, and the measurement difference function G can be calculated as:

G(i) = C_A·|A_{i+1} - A_i| + C_F·|F_{i+1} - F_i|

where C_A and C_F are two constants that weight the two terms. The start and end times of each tag are thus obtained, the time at which the first tag starts to change being the time at which the action starts. The action end time is obtained similarly.
The identity authentication flow comprises the following steps:
The phase data are made into an action data set for identity authentication. The construction method is as follows: build a phase matrix, generating an m×n matrix P, where m is the number of tags and n is the number of sampling points:

P = [ φ_{1,1}  φ_{1,2}  ...  φ_{1,n} ]
    [ φ_{2,1}  φ_{2,2}  ...  φ_{2,n} ]
    [   ...      ...    ...    ...   ]
    [ φ_{m,1}  φ_{m,2}  ...  φ_{m,n} ]

Then the imresize(A, m) function in MATLAB is used, where A is the original matrix and m is the normalized size, normalizing the tag matrix into a data set of fixed size.
The CNN model includes: an input layer, convolution layers, max-pooling layers, a flattening layer and fully connected layers.
The convolution layers comprise a first convolution layer and a second convolution layer; the max-pooling layers comprise a first max-pooling layer and a second max-pooling layer; the fully connected layers comprise a first fully connected layer and a second fully connected layer.
The framework of the CNN model is shown in fig. 5. The input layer is at the front end of the model and is connected to the first convolution layer; the first convolution layer is connected to the first max-pooling layer; the first max-pooling layer is connected to the second convolution layer; the second convolution layer is connected to the second max-pooling layer; the second max-pooling layer is connected to the flattening layer; the flattening layer is connected to the first fully connected layer; the first fully connected layer is connected to the second fully connected layer.
The input layer completes the processing of the input data; the convolution layers use convolution kernels for feature extraction and feature mapping; the max-pooling layers effectively reduce the number of parameters and thus the network complexity; the flattening layer compresses the data into a one-dimensional array; the fully connected layers map the learned "distributed feature representation" to the sample label space; the classification layer completes the classification of the identity-authentication recognition actions.
And feeding the processed training data set into a constructed CNN model, wherein the CNN model is continuously learned based on a large number of data samples in the training data set, and finally a CNN model meeting the requirements is trained.
The user starts to carry out user identity authentication, and the tag matrix collects phase data of user authentication actions; the original data is subjected to phase preprocessing and action extraction to obtain data information related to authentication actions; and preparing the processed data into a data set, and then sending the data set into the trained CNN model to perform classification authentication on the actions. And evaluating the action by using the trained CNN model, and judging whether the identified action is matched with the trained action. If the two types of information are matched, the identity authentication is successful.
The above description is merely of preferred embodiments of the present invention, and the scope of the present invention is not limited to the above embodiments, but all equivalent modifications or variations according to the present disclosure will be within the scope of the claims.

Claims (7)

1. A non-contact user identity authentication method based on RFID and convolutional neural network is characterized in that: the method comprises the following steps:
step 1: arranging a passive RFID tag matrix in an indoor environment, wherein the arrangement is a 3×3 nine-grid structure, and collecting phase data for identity authentication;
step 2: preprocessing the phase information acquired in the step 1;
in the step 2, the collected phase information is preprocessed, which specifically includes the following steps:
step 2-1, phase unwrapping: the original phase signal is a periodic function in the range of 0-2π rad; let the phase vector of the original sampled signal of tag m be θ_m = [θ_{m,1}, θ_{m,2}, ..., θ_{m,n}], m = 1, 2, ..., 9; suppose the phase values at two consecutive times are θ_{m,i} and θ_{m,i+1}; due to the periodic function, the actual signals should be φ_{m,i} = θ_{m,i} + 2πN_{m,i} and φ_{m,i+1} = θ_{m,i+1} + 2πN_{m,i+1}; the phase-unwrapping problem is converted into solving for N_{m,i}, i = 1, 2, ..., n; from the phase range 0-2π rad, a jump of more than π between consecutive raw phase values indicates a wrap; thus ΔN_{m,i} = N_{m,i+1} - N_{m,i} is calculated as:

ΔN_{m,i} = -1,  if θ_{m,i+1} - θ_{m,i} > π
ΔN_{m,i} = +1,  if θ_{m,i+1} - θ_{m,i} < -π
ΔN_{m,i} = 0,   otherwise

N_{m,i} is calculated by accumulation, with N_{m,1} = 0:

N_{m,i} = Σ_{j=1}^{i-1} ΔN_{m,j}

according to φ_{m,i} = θ_{m,i} + 2πN_{m,i}, the phase vector after unwrapping is derived as φ_m = [φ_{m,1}, φ_{m,2}, ..., φ_{m,n}];
Step 2-2, data normalization processing: acquiring a phase sequence with the length of t for each tag in a static state, and solving an average phase value of each tag in the static state; when the phase data is processed subsequently, subtracting an average value from the phase value of each tag, so that the phase of each tag in a static state is normalized to 0;
step 2-3, filtering the phase using a butterworth low pass filter: when the signal filtering is carried out, the low-frequency signal generated by the motion normally passes through, and the high-frequency signal with noise is weakened or shielded; the butterworth low pass filter is expressed in terms of square of amplitude versus frequency as follows:
|H(ω)|² = 1 / (1 + (ω/ω_c)^{2n})

where n is the order of the filter, ω_c is the cut-off frequency and ω_p is the passband edge; when ω = ω_p, |H(ω_p)|² is the value at the passband edge;
step 3: acquiring information of a starting point and an ending point of each action sample from the phase information preprocessed in the step 2;
step 4: the phase information is made into an identity authentication data set, and a deep learning convolutional neural network CNN is adopted to train the data, so that a CNN model for identity authentication is obtained;
step 5: when a user performs identity authentication, monitoring the action of the user, collecting phase information by a tag matrix, and then sending data into a CNN model obtained in the step 4 after the processing of the step 2 and the step 3 to identify the identity of the user;
step 6: judging whether the user identity authentication is successful according to the result of step 5.
2. The non-contact user identity authentication method based on RFID and convolutional neural network according to claim 1, characterized in that: in step 1, the arranged passive RFID tag matrix consists of 9 tags placed equidistantly on a plane in three rows and three columns, with a spacing of 12.5 cm between adjacent tags; the antenna is placed 1.2 m behind the tags; the phase information of the tags in the matrix is acquired through communication between the RFID reader and the tags.
3. The non-contact user identity authentication method based on the RFID and the convolutional neural network as claimed in claim 1, wherein the method comprises the following steps: the step 3 specifically comprises the following steps:
step 3-1, determining a range of starting or ending actions by using a KL divergence adaptive segmentation algorithm: setting a sliding window, calculating a discrete probability distribution function PDF of data in each sliding window, setting P and Q as PDF values of two continuous sliding windows, and calculating KL dispersion of the P and Q:
D_KL(P‖Q) = Σ_x P(x) · log( P(x) / Q(x) )
when the divergence value is large, the two windows exhibit different motion states, and this is used for action segmentation;
step 3-2, determining the exact time at which the action starts or ends: extending the sliding window obtained in step 3-1 by five window lengths both forwards and backwards to obtain a time period ten times the window length; performing a sliding-window operation for each tag within this period to determine the start or end time of each tag; the sliding window uses the amplitude and frequency of the signal as features; the window length is denoted by L, and the amplitude value and frequency of the i-th window are denoted as:
A_i (formula image FDA0003980654320000032) and F_i (formula image FDA0003980654320000033)
where φ_{i,k} denotes the k-th data point of the i-th sliding window; the measurement difference function G is calculated as:
G(i) = C_A · |A_{i+1} − A_i| + C_F · |F_{i+1} − F_i|
where C_A and C_F are two constants weighting the two terms; the start and end time of each tag is thus obtained, the time at which the first tag begins to change being the time at which the action starts; the action end time is obtained in the same way.
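The segmentation of claim 3 can be sketched as follows. The KL divergence and the G metric follow the claim directly; the exact amplitude A_i and frequency F_i definitions are in the patent's formula images, so the mean value and mean change rate used below are plausible stand-ins, not the patented formulas:

```python
import math

def kl_divergence(p, q, eps=1e-12):
    """Discrete KL divergence D(P||Q) between the PDFs of two
    consecutive sliding windows (step 3-1)."""
    return sum(pi * math.log((pi + eps) / (qi + eps)) for pi, qi in zip(p, q))

def window_features(phase, L):
    """Per-window amplitude A_i and frequency F_i over windows of length L.
    Assumed definitions: mean value and mean absolute change rate; the
    patent's exact formulas are in unavailable images."""
    A, F = [], []
    for s in range(0, len(phase) - L + 1, L):
        w = phase[s:s + L]
        A.append(sum(w) / L)
        F.append(sum(abs(w[k] - w[k - 1]) for k in range(1, L)) / L)
    return A, F

def g_metric(A, F, i, C_A=0.5, C_F=0.5):
    """Measurement difference G(i) = C_A|A_{i+1}-A_i| + C_F|F_{i+1}-F_i|."""
    return C_A * abs(A[i + 1] - A[i]) + C_F * abs(F[i + 1] - F[i])
```

A large G(i) between consecutive windows marks the per-tag start or end instant within the extended search period.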
4. The non-contact user identity authentication method based on the RFID and the convolutional neural network as claimed in claim 1, wherein in step 4, the phase samples are clipped and extracted based on the action start-point information and a data set for identity authentication is constructed; specifically, a phase matrix is constructed to generate an m×n matrix P, where m is the number of tags and n is the number of sampling points, namely:
P = [ φ_{1,1}  φ_{1,2}  …  φ_{1,n}
      φ_{2,1}  φ_{2,2}  …  φ_{2,n}
       ⋮        ⋮       ⋱    ⋮
      φ_{m,1}  φ_{m,2}  …  φ_{m,n} ]
and the phase matrix is normalized to a data set of fixed size using the imresize(A, m) function in MATLAB, where A is the original matrix and m is the normalized size.
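The data-set construction of claim 4 can be sketched in Python as follows. The patent does not state the interpolation method of its resize step, so the nearest-neighbour stand-in below is an assumption (MATLAB's imresize defaults to a different interpolation):

```python
def phase_matrix(tag_streams):
    """Stack per-tag phase streams into the m-by-n matrix P of claim 4
    (m tags, n sampling points). Streams are truncated to the shortest."""
    n = min(len(s) for s in tag_streams)
    return [s[:n] for s in tag_streams]

def resize_nearest(P, out_rows, out_cols):
    """Nearest-neighbour stand-in for normalizing P to a fixed size; the
    patent only names MATLAB's imresize, so this is illustrative only."""
    m, n = len(P), len(P[0])
    return [[P[int(r * m / out_rows)][int(c * n / out_cols)]
             for c in range(out_cols)] for r in range(out_rows)]
```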
5. The non-contact user identity authentication method based on the RFID and the convolutional neural network as claimed in claim 1, wherein in step 4, the CNN model comprises: an input layer, convolution layers, maximum pooling layers, a flattening layer, fully connected layers, and a classification layer;
the convolution layers comprise a first convolution layer and a second convolution layer; the maximum pooling layers comprise a first maximum pooling layer and a second maximum pooling layer; the fully connected layers comprise a first fully connected layer and a second fully connected layer;
the CNN model is structured with the input layer at the front: the input layer is connected to the first convolution layer; the first convolution layer is connected to the first maximum pooling layer; the first maximum pooling layer is connected to the second convolution layer; the second convolution layer is connected to the second maximum pooling layer; the second maximum pooling layer is connected to the flattening layer; the flattening layer is connected to the first fully connected layer; and the first fully connected layer is connected to the second fully connected layer;
the input layer processes the input data; the convolution layers use convolution kernels for feature extraction and feature mapping; the maximum pooling layers reduce the number of parameters and thus the network complexity; the flattening layer compresses the data into a one-dimensional array; the fully connected layers map the learned distributed feature representation to the sample label space; and the classification layer completes the classification of the identity authentication and identification actions;
the processed training data set is fed into the constructed CNN model, which learns from the data samples in the training set, finally yielding a CNN model that meets the requirements.
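The layer stack of claim 5 can be traced with simple shape arithmetic. The kernel sizes, strides, and input size below are assumptions (the patent does not state them); the code only illustrates how the spatial size shrinks through conv1 → pool1 → conv2 → pool2 → flatten:

```python
def conv2d_shape(h, w, k, stride=1):
    """Output size of a 'valid' convolution with a k-by-k kernel."""
    return (h - k) // stride + 1, (w - k) // stride + 1

def pool2d_shape(h, w, k=2):
    """Output size of non-overlapping k-by-k max pooling."""
    return h // k, w // k

def cnn_output_size(h, w, k1=3, k2=3):
    """Trace the spatial size through the claim-5 stack; kernel sizes
    k1, k2 are assumptions, not stated in the patent."""
    h, w = conv2d_shape(h, w, k1)   # first convolution layer
    h, w = pool2d_shape(h, w)       # first maximum pooling layer
    h, w = conv2d_shape(h, w, k2)   # second convolution layer
    h, w = pool2d_shape(h, w)       # second maximum pooling layer
    return h * w                    # flattened length per channel
```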
6. The non-contact user identity authentication method based on the RFID and the convolutional neural network as claimed in claim 1, wherein in step 5, the user begins identity authentication and the tag matrix collects phase data of the user's authentication action; the raw data are processed by step 2 and step 3 to obtain the data related to the authentication action; the processed data are made into a data set via step 4 and then fed into the trained CNN model for classification and authentication of the action.
7. The non-contact user identity authentication method based on the RFID and the convolutional neural network as claimed in claim 1, wherein in step 6, the action is evaluated using the CNN model trained in step 4, and it is judged whether the identified action matches the trained actions; if they match, the identity authentication succeeds.
CN202110311961.4A 2021-03-24 2021-03-24 Non-contact user identity authentication method based on RFID and convolutional neural network Active CN112883355B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110311961.4A CN112883355B (en) 2021-03-24 2021-03-24 Non-contact user identity authentication method based on RFID and convolutional neural network

Publications (2)

Publication Number Publication Date
CN112883355A CN112883355A (en) 2021-06-01
CN112883355B true CN112883355B (en) 2023-05-02

Family

ID=76042053

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110311961.4A Active CN112883355B (en) 2021-03-24 2021-03-24 Non-contact user identity authentication method based on RFID and convolutional neural network

Country Status (1)

Country Link
CN (1) CN112883355B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113993134B (en) * 2021-12-27 2022-03-22 广州优刻谷科技有限公司 IoT (Internet of things) equipment secure access method and system based on RFID (radio frequency identification) signals

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108664909A (en) * 2018-04-28 2018-10-16 上海爱优威软件开发有限公司 A kind of auth method and terminal
CN108776788A (en) * 2018-06-05 2018-11-09 电子科技大学 A kind of recognition methods based on brain wave
CN109299697A (en) * 2018-09-30 2019-02-01 泰山学院 Deep neural network system and method based on underwater sound communication Modulation Mode Recognition
CN110516740A (en) * 2019-08-28 2019-11-29 电子科技大学 A kind of fault recognizing method based on Unet++ convolutional neural networks
CN110598734A (en) * 2019-08-05 2019-12-20 西北工业大学 Driver identity authentication method based on convolutional neural network and support vector field description



Similar Documents

Publication Publication Date Title
US10061389B2 (en) Gesture recognition system and gesture recognition method
CN106658590B (en) Design and implementation of multi-person indoor environment state monitoring system based on WiFi channel state information
CN107968689B (en) Perception identification method and device based on wireless communication signals
CN111505632B (en) Ultra-wideband radar action attitude identification method based on power spectrum and Doppler characteristics
CN110288018A (en) A kind of WiFi personal identification method merging deep learning model
CN104346503A (en) Human face image based emotional health monitoring method and mobile phone
CN110287863A (en) A kind of gesture identification method based on WiFi signal
CN111160424B (en) NFC equipment fingerprint authentication method and system based on CNN image identification
CN111698258B (en) WiFi-based environmental intrusion detection method and system
CN111144522B (en) Power grid NFC equipment fingerprint authentication method based on hardware intrinsic difference
CN111597991A (en) Rehabilitation detection method based on channel state information and BilSTM-Attention
CN113935373A (en) Human body action recognition method based on phase information and signal intensity
CN111142668B (en) Interaction method based on Wi-Fi fingerprint positioning and activity gesture joint recognition
CN105224066A (en) A kind of gesture identification method based on high in the clouds process
CN113609976A (en) Direction-sensitive multi-gesture recognition system and method based on WiFi (Wireless Fidelity) equipment
CN113609977A (en) Pedestrian gait recognition method based on channel state information quotient distance
CN112883355B (en) Non-contact user identity authentication method based on RFID and convolutional neural network
Zhang et al. A dynamic hand gesture recognition algorithm based on CSI and YOLOv3
CN111797849B (en) User activity recognition method and device, storage medium and electronic equipment
CN114154532A (en) Unmanned aerial vehicle individual multi-dimensional domain electromagnetic signal feature deep learning identification method
Pandey et al. Csi-based joint location and activity monitoring for covid-19 quarantine environments
CN112086165B (en) Upper limb rehabilitation monitoring method and system based on deep learning
CN113406589B (en) Multi-target action identification method based on SIMO Doppler radar
CN114764580A (en) Real-time human body gesture recognition method based on no-wearing equipment
CN115310473A (en) Multi-person identity recognition method based on commercial WiFi signal

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant