CN114077723B - User identity verification method for tracking human body posture by using flexible sensor
- Publication number
- CN114077723B (application CN202010813151.4A)
- Authority
- CN
- China
- Prior art keywords
- arm
- sensors
- human
- motion
- user
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F21/00—Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
- G06F21/30—Authentication, i.e. establishing the identity or authorisation of security principals
- G06F21/31—User authentication
- G06F21/316—User authentication by observing the pattern of computer usage, e.g. typical user behaviour
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01B—MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
- G01B7/00—Measuring arrangements characterised by the use of electric or magnetic techniques
- G01B7/16—Measuring arrangements characterised by the use of electric or magnetic techniques for measuring the deformation in a solid, e.g. by resistance strain gauge
- G01B7/18—Measuring arrangements characterised by the use of electric or magnetic techniques for measuring the deformation in a solid, e.g. by resistance strain gauge using change in resistance
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
- G06F18/24—Classification techniques
- G06F18/241—Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches
- G06F18/2415—Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches based on parametric or probabilistic models, e.g. based on likelihood ratio or false acceptance rate versus a false rejection rate
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F21/00—Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
- G06F21/30—Authentication, i.e. establishing the identity or authorisation of security principals
- G06F21/31—User authentication
- G06F21/32—User authentication using biometric data, e.g. fingerprints, iris scans or voiceprints
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/04—Architecture, e.g. interconnection topology
- G06N3/045—Combinations of networks
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/04—Architecture, e.g. interconnection topology
- G06N3/049—Temporal neural networks, e.g. delay elements, oscillating neurons or pulsed inputs
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/08—Learning methods
Abstract
The invention provides a user identity verification method that tracks human body posture with flexible sensors, comprising: first, arranging a plurality of stretch sensors on a tight-fitting garment to capture the movement patterns of the human arm; second, capturing arm motion data with the stretch sensors while the user wears the garment; third, classifying the arm motion with an LSTM-FCN based on the motion data captured by the stretch sensors; and fourth, matching the classification result against the user's predefined motion pattern to determine the authentication result.
Description
Technical Field
The invention relates to the field of intelligent authentication, and in particular to a user identity verification method that tracks human body posture using flexible sensors.
Background
Existing user identity authentication methods explore a variety of biometric data such as fingerprints, irises, faces, and ECG signals, but they require specialized (typically rigid and cumbersome) sensors to capture the data. Biometric authentication refers to identification based on human physiological and/or behavioral characteristics. The main physiological modalities currently in use are fingerprint, iris, palm, face, and finger recognition; mainstream behavioral modalities include voice, handwriting, and keystroke dynamics. Beyond these traditional techniques, new means of authentication have emerged in recent years, such as ear imaging and arm, head, and gait movement analysis. Researchers have also proposed fusing coarse-grained physical activity (step counts) with physiological data (heart rate, calorie consumption, and metabolic equivalent of task) for user authentication. Despite this substantial effort on body-movement-based authentication, existing devices remain expensive or too bulky for convenient use.
Smart garments have received wide attention due to the rapid development of flexible electronics. Ideally, they can connect people with the world without interfering much with daily activities. Before smart garments can serve as personal devices, however, user authentication becomes a challenge. Explicit authentication methods (e.g., pattern locks) suffer from several limitations, including the need for a visual display. An authentication mechanism suited to smart-garment users is therefore needed.
Disclosure of Invention
The technical problem solved by the invention is to provide, against the defects of the prior art, a method that effectively verifies user identity while minimizing interference with the user's activity.
According to the present invention, there is provided a user identity verification method for human body posture tracking using flexible sensors, comprising:
First step: arranging, on a tight-fitting garment, a plurality of stretch sensors for capturing the movement patterns of the human arm;
Second step: capturing motion data of the human arm using the plurality of stretch sensors while the user wears the garment;
Third step: classifying the arm motion with an LSTM-FCN based on the motion data captured by the plurality of stretch sensors;
Fourth step: matching the classification result against the user's predefined motion pattern to determine the authentication result.
Preferably, the plurality of stretch sensors comprises five sensors arranged on a single side of the body: four placed at the shoulder joint and one at the elbow joint.
Preferably, the plurality of stretch sensors are sewn onto the tight-fitting garment.
Preferably, each stretch sensor is a conductive rubber cord stretch sensor.
Preferably, joint rotation is tracked as motion data by monitoring changes in the resistance of the stretch sensors.
Preferably, the input of the LSTM-FCN is the arm posture trajectory data captured by the plurality of stretch sensors, and the output indicates whether the current trajectory belongs to the motion pattern to be verified.
Preferably, all stretch-sensor data are padded to a length of 146, and one sample in the data is a time series; the LSTM-FCN consists of a fully convolutional block and an LSTM block, and the time series passes through both blocks.
Preferably, the arm movement patterns comprise: raising the arm in the coronal plane; rotating the arm along a conical surface; extending the arm to the left in the horizontal plane; punching the fist forward from below; swinging the arm with a bent elbow in the horizontal plane; raising the arm in the vertical plane; and bending and straightening the elbow.
Drawings
The invention will be more fully understood and its attendant advantages and features will be more readily understood by reference to the following detailed description, when taken in conjunction with the accompanying drawings, in which:
Fig. 1 schematically shows an overall flow chart of a user authentication method for human body posture tracking using flexible sensors according to a preferred embodiment of the present invention.
Fig. 2A to 2G show seven movement patterns.
Fig. 3 shows the resistance distribution of the five sensors during the motion pattern of fig. 2A.
Figure 4 shows statistics of sample duration.
FIG. 5 shows the results of a comparison of the three methods DTW, DTW-D and LSTM-FCN.
It should be noted that the drawings are for illustrating the invention and are not to be construed as limiting the invention. Note that the drawings representing structures may not be drawn to scale. Also, in the drawings, the same or similar elements are denoted by the same or similar reference numerals.
Detailed Description
In order that the invention may be more readily understood, a detailed description of the invention is provided below along with specific embodiments and accompanying figures.
Recent developments in sensing technology have paved the way for monitoring human movement with wearable sensors. Mainstream systems typically use optical, acoustic, or electromagnetic sensors; overviews of the sensors and techniques used to estimate human upper-limb motion are available in the literature. Optical systems for human motion tracking are relatively expensive, require adequate illumination, and suffer from occlusion. Acoustic systems have poor real-time performance and are susceptible to environmental interference. Electromagnetic systems are vulnerable to magnetic-field interference: metal objects near the field cause distortions that severely degrade accuracy.
Motion tracking with inertial measurement units (IMUs) has attracted increasing interest and use. One method fuses a gyroscope, an accelerometer, and a force sensor to estimate full-body posture and applies it to cycling. Another combines a single hand-held camera with a set of IMUs attached to the limbs to estimate 3D pose in the field. Both approaches, however, must employ additional constraints or filtering algorithms to correct IMU drift. Other work fuses IMU data with a Kinect to provide stable hand-position information without long-term drift. Although direct integration of the gyroscope signal can keep the output angle accurate over short periods, output errors accumulate over time.
More recent work uses soft sensors for motion tracking. Researchers have used wearable sensing garments with flexible sensors to capture movements of the whole body, elbows, fingers, and upper body. One recent work introduced silicone-based strain and force sensors composed of a novel biocompatible conductive liquid for motion capture; another proposed a stretch-sensing soft glove that interactively captures hand gestures with high accuracy without external optical systems. Wearable soft sensors are non-invasive and can accurately track body posture in unconstrained environments; motion tracking is now one of the common features of smart clothing. This unconstrained quality suggests a direct solution for user authentication: verifying the user's identity through a user-defined motion trajectory.
Such a human motion tracking system is convenient to wear, unrestricted in movement space, and low in cost. Flexible sensors can be integrated seamlessly into a garment, maximizing convenience for the user. However, the user authentication problem on smart clothing has not previously been studied.
The present invention proposes using flexible stretch sensors to track human posture and using the captured posture sequence for user authentication, a solution that is low-cost and user-friendly. Unlike finger-drawn patterns on a phone touchscreen, however, human body poses performed in 3D exhibit large intra-personal variation: multiple attempts at the same pose sequence may differ considerably from one another.
The integration of traditional clothing with flexible electronics is a promising combination that could serve as a next-generation computing platform, yet the user authentication problem on this new platform has not been fully studied. This work uses flexible sensors to track human posture and achieve user authentication. The invention captures human movement patterns with four stretch sensors placed on the shoulder and one on the elbow, and introduces a long short-term memory fully convolutional network (LSTM-FCN) that takes noisy, sparse sensor data directly as input and verifies its consistency with the user's predefined motion pattern. The method can identify the user by matching motion patterns even in the presence of large intra-personal variation: the authentication accuracy of the LSTM-FCN reaches 98.0%, which is 10.7% and 6.5% higher than dynamic time warping (DTW) and dependent dynamic time warping (DTW-D), respectively.
The invention directly addresses user identity verification on the smart-clothing platform. Authentication is decomposed into two key sub-problems: 1) tracking the human body pose using flexible sensors, and 2) verifying the consistency between the current motion-trajectory attempt and a pre-stored trajectory. This work contributes in two ways:
The present invention provides a complete hardware and software solution for human posture tracking and trajectory authentication. Flexible sensors minimize interference with user activity and ensure maximum wearing comfort.
The present invention introduces an LSTM-FCN that takes noisy, sparse sensor data directly as input and matches it against a predefined motion pattern. Compared with representative time-series analysis methods (DTW, DTW-D), the method shows advantages in authentication accuracy and requires less parameter tuning.
Specific method example
Fig. 1 schematically shows the overall flow of a user identity verification method for human posture tracking using flexible sensors according to a preferred embodiment of the present invention. As shown in fig. 1, the method includes:
First step S1: arranging, on a tight-fitting garment, a plurality of stretch sensors for capturing the movement patterns of the human arm.
Preferably, the plurality of stretch sensors comprises five sensors on a single side of the body (e.g., the right side): four placed at the shoulder and one at the elbow.
Preferably, the plurality of stretch sensors are sewn onto the garment.
The garment is a tight-fitting sportswear style, which ensures that the sensors stretch fully and consistently across different attempts. Preferably, the garment fabric is 80% polyester fiber and 20% polyurethane fiber. In practice, the invention uses conductive rubber cord stretch sensors manufactured by Adafruit: each sensor is 2 mm in diameter and made of carbon-black-impregnated rubber, with a relaxed resistance of approximately 350 ohms per inch. Joint rotation is tracked by monitoring changes in the sensors' resistance; for example, when the user bends the elbow, the corresponding sensor is stretched and its resistance increases accordingly. The stretch signal is sampled at 32 Hz.
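The resistance-to-stretch conversion described above can be sketched as follows. This is a minimal illustration, not the patent's implementation; the rest resistance comes from the sensor datasheet figure quoted in the text, and the normalization formula and function names are assumptions.

```python
# Hypothetical sketch: converting raw resistance readings from the five
# conductive-rubber stretch sensors into normalized stretch signals.
# REST_RESISTANCE (~350 ohm/inch when relaxed) is from the text; the
# normalization scheme itself is an assumption for illustration.

REST_RESISTANCE = 350.0  # ohms per inch in the relaxed state
SAMPLE_RATE_HZ = 32      # stretch information is sampled at 32 Hz

def normalize_reading(resistance_ohms: float) -> float:
    """Relative stretch: 0.0 when relaxed, positive when elongated."""
    return (resistance_ohms - REST_RESISTANCE) / REST_RESISTANCE

def normalize_frame(frame):
    """One 5-channel sample (4 shoulder sensors + 1 elbow sensor)."""
    return [normalize_reading(r) for r in frame]
```

A bent elbow stretches its sensor, so its normalized value rises above zero while relaxed channels stay near zero.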
For example, a server computer receives the sensor data, performs the authentication, and feeds the result back to the smart garment. The same server is also used below to train and test the predictive model.
Second step S2: capturing motion data of the human arm using the plurality of stretch sensors while the user wears the garment.
Third step S3: classifying the arm motion with an LSTM-FCN based on the motion data captured by the plurality of stretch sensors.
Fourth step S4: matching the classification result against the user's predefined motion pattern to determine the authentication result.
This work uses arm motion data to verify whether a motion matches the password (the pre-stored motion pattern). The prototype is inexpensive and convenient to use, and is unaffected by lighting or cluttered backgrounds. Moreover, the verification method imposes little interference on the wearer. Although tracking body motion resembles monitoring stroke trajectories on a touch screen, motions performed in the unconstrained 3D world vary significantly more than pattern locks drawn on a 2D screen.
The present invention developed a smart-clothing hardware prototype with flexible, stretchable sensors. When a person wears the smart garment and performs a specific action, the sensors monitor the joint movements. The long short-term memory fully convolutional network (LSTM-FCN) takes the noisy, sparse sensor data directly as input and matches it against the user's predefined motion pattern. Experimental results demonstrate that the method outperforms representative time-series classification methods (DTW and DTW-D).
Regarding applications, the system may be deployed in the real world, for example to unlock a mobile phone. Another direction is to verify the method on finer movements, such as finger motions, which may require a more elaborate sensor arrangement and more powerful sensing algorithms. Such applications would further validate the method's effectiveness in practical human-computer interaction.
Specific examples and experiments
For data collection, users exercised freely according to their own preferences. Figs. 2A to 2G show the seven movement patterns. The data collected in each session were automatically uploaded to the server and manually labeled with the corresponding pattern tag. Specifically, the arm movement patterns are: raising the arm in the coronal plane (fig. 2A); rotating the arm along a conical surface (fig. 2B); extending the arm to the left in the horizontal plane (fig. 2C); punching the fist forward from below (fig. 2D); swinging the bent-elbow arm in the horizontal plane (fig. 2E); raising the arm in the vertical plane (fig. 2F); and bending and straightening the elbow (fig. 2G).
The present invention employs the LSTM-FCN as a binary (positive/negative) classifier, with a separate classifier for each pattern. After the system is configured with a password pattern, the network decides whether verification succeeds based on its classification of the new motion input. The input is the human posture trajectory captured by the flexible sensors; the output indicates whether the current trajectory belongs to the motion pattern to be verified.
The LSTM-FCN consists of a fully convolutional block and an LSTM block; the input time series passes through both.
In the convolutional layers, a set of one-dimensional filters is applied to capture how the input signal changes over the course of the action. The filters of each layer are parameterized by a weight tensor W(l) and a bias b(l), where l ∈ {1, 2, 3} is the layer index, d is the filter width, and L is the length of the input feature vector, set here to 146. For layer l and each time step t, the (un-normalized) activation a_t(l) is a function of the previous layer's (normalized) activations over a window of width d:

a_t(l) = f( W(l) · a_{t : t+d}(l−1) + b(l) ),

where f(·) is the ReLU operation.
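The temporal-convolution activation above can be worked through in a few lines. This is a minimal sketch of the formula only (a single univariate filter with illustrative weights), not the network's actual multi-channel layers.

```python
# Minimal sketch of the activation formula: a one-dimensional filter of
# width d slides over the previous layer's activations; the dot product
# plus bias passes through a ReLU. Weights here are illustrative only.
def relu(x):
    return max(x, 0.0)

def temporal_conv1d(signal, weights, bias):
    """Valid 1-D convolution followed by ReLU, per the activation formula."""
    d = len(weights)
    return [relu(sum(w * s for w, s in zip(weights, signal[t:t + d])) + bias)
            for t in range(len(signal) - d + 1)]

# Example: a width-2 difference filter on a short univariate series.
y = temporal_conv1d([1.0, 3.0, 2.0, 5.0], weights=[1.0, -1.0], bias=0.0)
# y == [0.0, 1.0, 0.0]: negative differences are clipped by the ReLU.
```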
The fully convolutional block consists of three stacked temporal convolution blocks with 128, 256, and 128 filters, respectively. Each convolution block comprises a temporal convolution layer, batch normalization, and a ReLU activation. Global average pooling is applied after the last convolution block. The LSTM block consists of a conventional LSTM layer followed by a dropout layer; the input time series passes through a dimension-shuffle layer, whose transposed output feeds the LSTM block. The outputs of the global pooling layer and the LSTM block are concatenated and passed to a softmax classification layer.
Data processing and flow: the smart garment has five sensors, and all data are padded to length 146, so one collected sample is a time series of shape (146, 5). The fully convolutional module receives the data as a series of 146 time steps. The dimension-shuffle layer, by contrast, transposes the time dimension so that the LSTM block receives a multivariate series with a single time step per channel.
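The architecture described above can be sketched in Keras (which the text says is used for training). This is a hedged reconstruction, not the patent's exact network: the filter counts (128/256/128), batch normalization, ReLU, global average pooling, dimension shuffle, LSTM, 80% dropout, and softmax head all come from the text, while the kernel sizes (8/5/3) and the LSTM unit count are assumptions borrowed from the common LSTM-FCN recipe.

```python
# Hedged Keras sketch of the LSTM-FCN, assuming input shape
# (146 time steps, 5 sensor channels) and 2 output classes.
from tensorflow.keras import layers, Model

def build_lstm_fcn(timesteps=146, channels=5, num_classes=2, lstm_units=8):
    inp = layers.Input(shape=(timesteps, channels))

    # FCN branch: three temporal convolution blocks (128/256/128 filters),
    # each Conv1D + BatchNorm + ReLU, then global average pooling.
    x = inp
    for filters, kernel in [(128, 8), (256, 5), (128, 3)]:  # kernels assumed
        x = layers.Conv1D(filters, kernel, padding="same")(x)
        x = layers.BatchNormalization()(x)
        x = layers.Activation("relu")(x)
    x = layers.GlobalAveragePooling1D()(x)

    # LSTM branch: dimension shuffle (transpose time and channel axes),
    # then an LSTM with the 80% dropout mentioned in the text.
    y = layers.Permute((2, 1))(inp)
    y = layers.LSTM(lstm_units)(y)
    y = layers.Dropout(0.8)(y)

    # Concatenate both branches and classify with softmax.
    out = layers.Dense(num_classes, activation="softmax")(
        layers.concatenate([x, y]))
    return Model(inp, out)

model = build_lstm_fcn()
```

One classifier of this shape is instantiated per movement pattern, matching the per-pattern binary-verification design described above.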
Training and testing: for each pattern, a number of samples are randomly selected to build the training data set, and all remaining samples are used for testing. For pattern i, three samples are taken as training (password) samples, and the remaining samples of all patterns serve as test samples. To initialize the authentication system on the smart garment, the user performs the password trajectory three times; the purpose of the repetition is to capture as much as possible of the natural variation in a person's execution of the same action.
Training hyperparameters: training runs for 1000 epochs with batch size 128 and initial learning rate 1e-3. A high dropout rate of 80% is applied after the LSTM layer to counter overfitting. The models are trained with the Keras library (TensorFlow backend) and the Adam optimizer, and all convolution kernels are initialized. If the validation score does not improve, the learning rate is reduced every 100 epochs until it reaches 1e-4.
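The stepwise learning-rate decay above can be sketched as a pure-Python schedule. Two caveats: the exact reduction factor is not legible in the text, so 0.5 here is an assumed placeholder, and the text ties the decay to validation-score plateaus, whereas this sketch applies it unconditionally every 100 epochs for simplicity.

```python
# Sketch of the learning-rate schedule: start at 1e-3, shrink every
# 100 epochs (assumed factor 0.5; the true factor is garbled in the
# source), and never drop below the 1e-4 floor stated in the text.
def learning_rate(epoch, initial_lr=1e-3, factor=0.5, step=100, min_lr=1e-4):
    return max(initial_lr * factor ** (epoch // step), min_lr)
```

In Keras this behavior would typically be wired up via a learning-rate callback during `model.fit`.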
The invention collected 69 samples of the seven motion patterns, with between nine and eleven samples per pattern. Fig. 3 shows the resistance distribution of the five sensors during the pattern of fig. 2A, in which the arm is raised in the coronal plane: the two sensors on the shoulder joint change greatly due to the shoulder rotation, and the pattern is also accompanied by noticeable secondary motion at the elbow joint (elbow sensor). The plot reflects the large intra-personal variation when the user repeats the same movement pattern.
The duration of each sample action is between 2.788 s and 4.998 s; fig. 4 shows statistics of these durations. Even for the same pattern, the durations of different attempts can vary widely, by more than 10%. The invention addresses this by padding, converting each signal into a segment of fixed length.
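The fixed-length conversion above can be sketched as follows. The target length of 146 and the five channels come from the text; zero post-padding is an assumption, since the text only says the data are "filled" to that length.

```python
# Sketch: padding variable-length 5-channel recordings to the fixed
# length of 146 time steps expected by the network. Zero post-padding
# is assumed; truncation handles over-long recordings.
PAD_LENGTH = 146
NUM_CHANNELS = 5

def pad_sample(sample, length=PAD_LENGTH, channels=NUM_CHANNELS):
    """sample: list of per-timestep frames, each with `channels` readings."""
    padded = sample[:length]
    padded += [[0.0] * channels for _ in range(length - len(padded))]
    return padded

clip = [[0.1] * 5 for _ in range(90)]  # a 90-step recording (~2.8 s at 32 Hz)
fixed = pad_sample(clip)               # now shape (146, 5)
```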
The experimental results confirm that the sensor signals also vary greatly within an individual. The reasons are likely twofold: temporal factors appear as the varying durations of different attempts at the same pattern, while spatial factors stem from the unconstrained 3D trajectory. Both are closely tied to the user's freedom in designing and executing the pattern. The proposed method has proven to handle this variation effectively, without manual feature engineering.
Comparison with DTW/DTW-D
Dynamic time warping (DTW) is a reference method for time-series signal processing. DTW-D strengthens the correlation among the five channels during matching and improves matching accuracy, but the performance of both depends on a threshold. The invention first compares the proposed method against DTW and DTW-D configured for their highest matching accuracy.
Success criteria for DTW and DTW-D: for DTW, a match succeeds if the distance of every channel to the samples in the password set is within a threshold. For both DTW and DTW-D, when the password set contains multiple passwords, a match succeeds as long as the distance between the test sample and any one password is below the threshold.
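The per-channel DTW matching rule above can be sketched with a textbook DTW implementation. This is an illustrative baseline, not the comparison code used in the experiments, and the threshold value in any real use would have to be tuned as the text notes.

```python
# Minimal sketch of the DTW matching rule: a test sample matches a stored
# password only if the DTW distance of every sensor channel is within a
# threshold. dtw_distance is the classic O(n*m) dynamic program.
def dtw_distance(a, b):
    """Dynamic time warping distance between two univariate series."""
    inf = float("inf")
    cost = [[inf] * (len(b) + 1) for _ in range(len(a) + 1)]
    cost[0][0] = 0.0
    for i in range(1, len(a) + 1):
        for j in range(1, len(b) + 1):
            d = abs(a[i - 1] - b[j - 1])
            cost[i][j] = d + min(cost[i - 1][j],      # insertion
                                 cost[i][j - 1],      # deletion
                                 cost[i - 1][j - 1])  # match
    return cost[len(a)][len(b)]

def dtw_match(test_channels, password_channels, threshold):
    """True if every sensor channel is within `threshold` of the password."""
    return all(dtw_distance(t, p) <= threshold
               for t, p in zip(test_channels, password_channels))
```

With multiple stored passwords, a sample would be accepted if `dtw_match` succeeds against any one of them, per the success criterion above.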
Fig. 5 compares the three methods: DTW, DTW-D, and LSTM-FCN. Regardless of the number of samples in the training dataset, the LSTM-FCN model of the invention consistently outperforms DTW and DTW-D. Increasing the size of the training dataset raises the authentication accuracy to 98.0%.
In terms of time performance, the LSTM-FCN takes about 27.10 s to train, but it is trained only once, and verification takes 0.26 s. DTW and DTW-D require no offline training, and their online authentication takes 0.74 s and 0.001 s, respectively.
Regarding the effect of the number of samples in the password set: as the training dataset grows, the false negatives (FN) of DTW and DTW-D decrease, meaning more positive samples of the pattern are correctly verified. The likely reason is that adding training samples expands the state space to include samples previously excluded from it. But this has a side effect: samples that do not belong to the pattern are also erroneously matched. Overall, enlarging the training dataset contributes little to the authentication accuracy of DTW and DTW-D; in fact, the accuracy of DTW degrades moderately (about 2%) when the training dataset is increased from 4 to 5 samples. The results indicate that increasing the number of samples in the password set is not an effective strategy for improving the accuracy of DTW and DTW-D.
In contrast, as the training dataset grows, both the "true password mismatched" (false rejection) and "non-password authenticated" (false acceptance) cases of the LSTM-FCN decrease. This shows that the LSTM-FCN of the invention can extract consistent latent features from newly added samples without introducing anomalous information, ensuring that positive samples are authenticated and negative samples are rejected. The improvement is pronounced when the training dataset grows from 1 to 3 samples, and minor once it reaches 4 or more. Even with only one sample as the password, the LSTM-FCN achieves higher authentication accuracy than DTW and DTW-D; this advantage widens as the training dataset grows, with the accuracy of the LSTM-FCN exceeding 98%. Unlike DTW and DTW-D, the invention achieves a tradeoff between "true password mismatched" and "non-password authenticated" without having to tune the size of the training dataset.
When the pattern shown in fig. 2F is set as the password, its tenth sample cannot be correctly matched regardless of the number of training samples. To find the reason, the invention changed the two-class classification network into a seven-class one. As the number of training samples per movement pattern increased from one to five, the tenth sample of the fig. 2F pattern was classified as the fig. 2G pattern in 4 of the cases. According to the pattern trace diagrams (figs. 2F and 2G), if the shoulder is not raised vertically to a sufficient extent, the tenth sample of the fig. 2F pattern is numerically similar to the fig. 2G pattern. The invention also used DTW-D to measure the distance between the tenth sample and both the remaining samples of the fig. 2F pattern and all samples of the fig. 2G pattern. The sample closest to the tenth sample of the fig. 2F pattern is P7S4: the distance between P6S10 and P7S4 (7.99) is less than the average distance (9.36) between the tenth sample of the fig. 2F pattern and the other samples of that pattern. This finding demonstrates that the method of the invention can reliably identify outliers in a password set.
It should be noted that, unless specifically stated otherwise, the terms "first," "second," "third," and the like in the specification are used merely as a distinction between various components, elements, steps, etc. in the specification, and are not used to denote a logical or sequential relationship between various components, elements, steps, etc.
It will be appreciated that although the invention has been described above in terms of preferred embodiments, those embodiments are not intended to limit it. Those skilled in the art may make many possible variations and modifications, or equivalent substitutions, without departing from the scope of the disclosed technology. Therefore, any simple modification, equivalent variation, or refinement of the above embodiments made according to the technical substance of the present invention still falls within the scope of the technical solution of the present invention.
Claims (5)
1. A user identity verification method for human body posture tracking using flexible sensors, comprising:
First step: arranging, on a tight-fitting garment, a plurality of stretch sensors for capturing motion patterns of a human arm;
Second step: capturing motion data of the human arm using the plurality of stretch sensors while a user wears the tight-fitting garment;
Third step: classifying the motion of the human arm with an LSTM-FCN according to the motion data of the human arm captured by the plurality of stretch sensors;
Fourth step: matching the classification result with a predefined motion pattern of the user to determine a user identity verification result;
wherein the plurality of stretch sensors comprises five stretch sensors arranged on a single side of the body, four placed at the shoulder joint and one placed at the elbow joint;
and wherein rotation of the joints is tracked as motion data by monitoring changes in the resistance values of the stretch sensors; the input of the LSTM-FCN is the motion data of the human arm posture trajectory captured by the plurality of stretch sensors, and the output is whether the current trajectory belongs to the motion pattern to be verified.
2. The user identity verification method for human body posture tracking using flexible sensors of claim 1, wherein the plurality of stretch sensors are sewn onto the tight-fitting garment.
3. The user identity verification method for human body posture tracking using flexible sensors according to claim 1 or 2, wherein each stretch sensor is a conductive rubber wire stretch sensor.
4. The user identity verification method for human body posture tracking using flexible sensors according to claim 1 or 2, wherein the padding length of all stretch-sensor data is 146, and one sample in the data is a time series; the LSTM-FCN consists of a fully convolutional block and an LSTM block, and the time series passes through both the convolutional block and the LSTM block.
5. The user identity verification method for human body posture tracking using flexible sensors according to claim 1 or 2, wherein the movement patterns of the arm are divided into: a mode in which the arm is raised in the coronal plane, a mode in which the arm turns along a conical surface, a mode in which the arm extends to the left in the horizontal plane, a mode in which the arm punches forward from below, a mode in which the bent-elbow arm swings in the horizontal plane, a mode in which the arm is raised in the vertical plane, and a mode in which the straightened arm is bent.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202010813151.4A CN114077723B (en) | 2020-08-13 | 2020-08-13 | User identity verification method for tracking human body posture by using flexible sensor |
Publications (2)
Publication Number | Publication Date |
---|---|
CN114077723A CN114077723A (en) | 2022-02-22 |
CN114077723B true CN114077723B (en) | 2024-06-07 |
Family
ID=80280374
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202010813151.4A Active CN114077723B (en) | 2020-08-13 | 2020-08-13 | User identity verification method for tracking human body posture by using flexible sensor |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN114077723B (en) |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN104462906A (en) * | 2014-12-19 | 2015-03-25 | 中国农业银行股份有限公司 | Mouse with information collection function and identity recognition method and system |
CN105124893A (en) * | 2015-08-13 | 2015-12-09 | 上海箩箕技术有限公司 | Biological recognition suitcase |
CN106164920A (en) * | 2014-04-04 | 2016-11-23 | 高通股份有限公司 | Assist the method and apparatus of wearable identity manager |
CN106411952A (en) * | 2016-12-01 | 2017-02-15 | 安徽工业大学 | Telekinetic-dynamic-gesture-based user identity authentication method and apparatus |
CN106567435A (en) * | 2016-11-09 | 2017-04-19 | 中科院合肥技术创新工程院 | Intelligent detection system and method for intelligent and healthy toilet |
CN110609621A (en) * | 2019-09-17 | 2019-12-24 | 南京茂森电子技术有限公司 | Posture calibration method and human motion capture system based on micro-sensor |
Family Cites Families (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8762733B2 (en) * | 2006-01-30 | 2014-06-24 | Adidas Ag | System and method for identity confirmation using physiologic biometrics to determine a physiologic fingerprint |
- 2020-08-13: CN202010813151.4A filed; patent CN114077723B/en active Active
Non-Patent Citations (3)
Title |
---|
Multi-template combined mobile phone identity authentication method based on the DTW algorithm; Li Xiangyu; Fujian Computer; 2018-04-25 (No. 04); full text *
Improved fuzzy-immune hybrid control and vibration suppression based on state observation for flexible-base, flexible-joint space robots; Chen Zhiyong; Li Zhenhan; Zhang Tingting; Journal of Vibration and Shock; 2018-10-15 (No. 19); full text *
Technology and the Olympics: applications of sensor technology in the Olympic Games; Sensor World; 2008-09-25 (No. 09); full text *
Similar Documents
Publication | Publication Date | Title |
---|---|---|
Ehatisham-ul-Haq et al. | Continuous authentication of smartphone users based on activity pattern recognition using passive mobile sensing | |
Lamiche et al. | A continuous smartphone authentication method based on gait patterns and keystroke dynamics | |
Li et al. | Whose move is it anyway? Authenticating smart wearable devices using unique head movement patterns | |
Lu et al. | Unobtrusive gait verification for mobile phones | |
Vamsikrishna et al. | Computer-vision-assisted palm rehabilitation with supervised learning | |
US10970374B2 (en) | User identification and authentication with neuromuscular signatures | |
Acien et al. | Smartphone sensors for modeling human-computer interaction: General outlook and research datasets for user authentication | |
US20140089672A1 (en) | Wearable device and method to generate biometric identifier for authentication using near-field communications | |
Fan et al. | Emgauth: An emg-based smartphone unlocking system using siamese network | |
Zhang et al. | Recognizing hand gestures with pressure-sensor-based motion sensing | |
Yu et al. | Thumbup: Identification and authentication by smartwatch using simple hand gestures | |
Li et al. | A new deep anomaly detection-based method for user authentication using multichannel surface EMG signals of hand gestures | |
Fahim et al. | A visual analytic in deep learning approach to eye movement for human-machine interaction based on inertia measurement | |
Liu et al. | Recent advances in biometrics-based user authentication for wearable devices: A contemporary survey | |
Al-Naffakh et al. | Continuous user authentication using smartwatch motion sensor data | |
Zhang et al. | DeepKey: an EEG and gait based dual-authentication system | |
Behera et al. | A robust biometric authentication system for handheld electronic devices by intelligently combining 3D finger motions and cerebral responses | |
Chen et al. | Human posture tracking with flexible sensors for motion recognition | |
Derawi | Smartphones and biometrics: Gait and activity recognition | |
Li et al. | Adaptive deep feature fusion for continuous authentication with data augmentation | |
Li et al. | Handwritten signature authentication using smartwatch motion sensors | |
Panda et al. | Hand gesture recognition using flex sensor and machine learning algorithms | |
Luo et al. | Activity-based person identification using multimodal wearable sensor data | |
Wijewickrama et al. | Write to know: on the feasibility of wrist motion based user-authentication from handwriting | |
Ray-Dowling et al. | Stationary mobile behavioral biometrics: A survey |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant |