CN114077723A - User identity verification method for tracking human body posture by using flexible sensor - Google Patents

User identity verification method for tracking human body posture by using flexible sensor

Info

Publication number
CN114077723A
CN114077723A (application CN202010813151.4A)
Authority
CN
China
Prior art keywords
arm
motion
human body
sensors
stretch
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202010813151.4A
Other languages
Chinese (zh)
Inventor
郭诗辉
林俊聪
高星
廖明宏
陈志勇
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Xiamen University
Original Assignee
Xiamen University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Xiamen University filed Critical Xiamen University
Priority to CN202010813151.4A priority Critical patent/CN114077723A/en
Publication of CN114077723A publication Critical patent/CN114077723A/en
Pending legal-status Critical Current

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 21/00 Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F 21/30 Authentication, i.e. establishing the identity or authorisation of security principals
    • G06F 21/31 User authentication
    • G06F 21/316 User authentication by observing the pattern of computer usage, e.g. typical user behaviour
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01B MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B 7/00 Measuring arrangements characterised by the use of electric or magnetic techniques
    • G01B 7/16 Measuring arrangements characterised by the use of electric or magnetic techniques for measuring the deformation in a solid, e.g. by resistance strain gauge
    • G01B 7/18 Measuring arrangements characterised by the use of electric or magnetic techniques for measuring the deformation in a solid, e.g. by resistance strain gauge using change in resistance
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 18/00 Pattern recognition
    • G06F 18/20 Analysing
    • G06F 18/24 Classification techniques
    • G06F 18/241 Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches
    • G06F 18/2415 Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches based on parametric or probabilistic models, e.g. based on likelihood ratio or false acceptance rate versus a false rejection rate
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 21/00 Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F 21/30 Authentication, i.e. establishing the identity or authorisation of security principals
    • G06F 21/31 User authentication
    • G06F 21/32 User authentication using biometric data, e.g. fingerprints, iris scans or voiceprints
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 3/00 Computing arrangements based on biological models
    • G06N 3/02 Neural networks
    • G06N 3/04 Architecture, e.g. interconnection topology
    • G06N 3/045 Combinations of networks
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 3/00 Computing arrangements based on biological models
    • G06N 3/02 Neural networks
    • G06N 3/04 Architecture, e.g. interconnection topology
    • G06N 3/049 Temporal neural networks, e.g. delay elements, oscillating neurons or pulsed inputs
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 3/00 Computing arrangements based on biological models
    • G06N 3/02 Neural networks
    • G06N 3/08 Learning methods

Abstract

The invention provides a user identity verification method that tracks human body posture with flexible sensors. The method comprises the following steps. First step: a plurality of stretch sensors for capturing the motion pattern of a human arm are arranged on a tight-fitting garment. Second step: while the user wears the garment, motion data of the arm are captured with the plurality of stretch sensors. Third step: the arm motion is classified with an LSTM-FCN according to the motion data captured by the stretch sensors. Fourth step: the classification result is matched against the user's predefined motion pattern to determine the identity verification result.

Description

User identity verification method for tracking human body posture by using flexible sensor
Technical Field
The invention relates to the field of intelligent authentication, in particular to a user identity verification method for tracking human body posture by using a flexible sensor.
Background
Existing user authentication methods explore the use of various kinds of biometric data, such as fingerprints, irises, faces, and ECG. However, these methods require specialized (often rigid and cumbersome) sensors to capture the biometric data. Biometric authentication refers to identification based on human physiological and/or behavioral characteristics. The mainstream physiological modalities for biometric identification are the fingerprint, iris, palm, face, and finger. Mainstream approaches also employ a variety of behavioral characteristics, including speech, handwriting, and keystroke dynamics. Beyond these traditional biometric techniques, new authentication modalities have emerged in recent years, such as ear imaging and arm, head, and gait movements. Researchers first proposed fusing coarse-grained, minute-level physical activity (step counts) with physiological data (heart rate, calorie expenditure, and metabolic equivalents of tasks) for user authentication. Despite the large amount of work on recognizing body-part motion for authentication, existing devices are either expensive or not compact enough to be convenient to use.
Owing to the rapid development of flexible electronics, smart apparel has received a great deal of attention. Ideally it should allow people to interact with the world without much intervention in their daily activities. User authentication has therefore become a challenge that must be addressed before smart apparel can serve as a personal device. Explicit authentication methods (e.g., pattern locks) suffer from several limitations, including the need for a visual display. There is therefore a need for an authentication mechanism for smart-garment users.
Disclosure of Invention
The technical problem to be solved by the present invention is to provide a method for effectively implementing user identity authentication under the condition that the intervention on user activities is minimized, aiming at the defects existing in the prior art.
According to the present invention, there is provided a user authentication method for human body posture tracking using a flexible sensor, comprising:
the first step: arranging, on a tight-fitting garment, a plurality of stretch sensors for capturing the motion pattern of a human arm;
the second step: capturing motion data of the human arm with the plurality of stretch sensors while the user wears the garment;
the third step: classifying the arm motion with an LSTM-FCN according to the motion data captured by the plurality of stretch sensors;
the fourth step: matching the classification result against the user's predefined motion pattern to determine the identity verification result.
Preferably, the plurality of stretch sensors comprises five stretch sensors arranged on one side of the body, of which four are placed at the shoulder joint in the shoulder region and one is placed at the elbow joint in the elbow region.
Preferably, the plurality of stretch sensors are sewn to the tight-fitting garment.
Preferably, each stretch sensor is a conductive rubber cord stretch sensor.
Preferably, the rotation of a joint is tracked as the motion data by monitoring the change in resistance of the stretch sensors.
Preferably, the input of the LSTM-FCN is the motion data of the arm posture trajectory captured by the plurality of stretch sensors, and the output indicates whether the current trajectory belongs to the motion pattern to be verified.
Preferably, all data from the stretch sensors are padded to a length of 146, and one sample in the data is a time series; the LSTM-FCN is composed of a fully convolutional block and an LSTM block, and the time series passes through both blocks.
Preferably, the motion patterns of the arm are divided into: raising the arm in the coronal plane, circling the arm along a cone, extending the arm to the left in the horizontal plane, punching forward from bottom to top, swinging the arm with a bent elbow in the horizontal plane, raising the arm overhead in the vertical plane, and bending and then straightening the arm.
Drawings
A more complete understanding of the present invention, and the attendant advantages and features thereof, will be more readily understood by reference to the following detailed description when considered in conjunction with the accompanying drawings wherein:
fig. 1 schematically shows a general flowchart of a user authentication method for human body posture tracking using a flexible sensor according to a preferred embodiment of the present invention.
Fig. 2A to 2G show seven movement patterns.
Fig. 3 shows the resistance distributions of the five sensors for the motion pattern of Fig. 2A.
Fig. 4 shows statistics of sample durations.
FIG. 5 shows the results of comparing the three methods DTW, DTW-D and LSTM-FCN.
It is to be noted, however, that the appended drawings illustrate rather than limit the invention. It is noted that the drawings representing structures may not be drawn to scale. Also, in the drawings, the same or similar elements are denoted by the same or similar reference numerals.
Detailed Description
In order that the present disclosure may be more clearly and readily understood, reference will now be made in detail to the present disclosure as illustrated in the accompanying drawings.
Recent developments in sensing technology pave the way to monitor human body motion using wearable sensors. Mainstream systems typically use optical, acoustic and electromagnetic sensors. Reference is made to the available literature for a general description of the use of different sensors and techniques for estimating the motion of the upper limbs of the human body. Optical-based systems for human motion tracking are relatively expensive, require adequate illumination, and have occlusion problems. Acoustic based systems have poor real-time performance and are susceptible to environmental interference. Electromagnetic systems are susceptible to magnetic field disturbances, which severely affect accuracy due to magnetic field distortions caused by metallic objects surrounding the site.
The use of Inertial Measurement Units (IMUs) for motion tracking has attracted increasing interest. One method integrates a gyroscope, an accelerometer and a force sensor to estimate the overall posture of the human body and applies it to bicycling. Another combines a single hand-held camera with a set of IMUs attached to the limbs to estimate 3D pose in the field. However, both approaches must employ additional constraints or filtering algorithms to correct for IMU sensor drift. Another effort fuses IMU data with a Kinect to provide stable hand position information without long-term drift. Although direct integration of the gyroscope signal can ensure accuracy of the output angle over a short time, the output error accumulates as time passes.
More and more recent work uses soft sensors for motion tracking. Researchers have used wearable sensing suits with flexible sensors to capture movements of the whole body, elbows, fingers, and upper body. A recent work introduced silicone-based strain and force sensors consisting of novel biocompatible conductive fluids for motion capture. Also, researchers have proposed a stretch-sensitive soft glove that can interactively capture gestures with high accuracy without the use of external optical devices. Wearable soft sensors are non-invasive and can accurately track human body gestures in an unrestricted environment. Currently, one of the common features of smart clothing is for motion tracking. This unrestricted feature provides a straightforward solution for user authentication, i.e. user identity is verified by means of a user-defined motion profile.
Such a human motion tracking system has the advantages of being convenient to wear, imposing no limit on the movement space, and being low in cost. Since flexible sensors can be seamlessly integrated into a garment, they offer maximum convenience to the user. However, the problem of user authentication on smart clothing has not been studied.
The present invention proposes to use flexible stretch sensors to track human body postures and to use the captured posture sequence for user authentication, which is a low-cost and user-friendly solution. However, unlike finger- or stylus-drawn patterns on a mobile phone touch screen, human postures performed in 3D exhibit large intra-personal variation, i.e., multiple attempts at the same posture sequence may differ from each other to a large extent.
The integration of traditional apparel with flexible electronics is a promising solution that can serve as a next-generation computing platform. However, the problem of user authentication on this new platform has not been fully studied. This work uses flexible sensors to track human body postures and achieve user authentication. The present invention captures human motion patterns through four stretch sensors placed on the shoulder and one stretch sensor on the elbow. The invention introduces a long short-term memory fully convolutional network (LSTM-FCN) that takes noisy and sparse sensor data directly as input and verifies its consistency with the user's predefined motion pattern. The method can identify the user by matching the motion pattern even in the presence of large intra-personal variation. The authentication accuracy of the LSTM-FCN reaches 98.0%, which is 10.7% and 6.5% higher than that of Dynamic Time Warping (DTW) and of the DTW-based DTW-D, respectively.
The invention directly solves the problem of user identity authentication on an intelligent clothing platform. In the present invention, authentication is defined as two key sub-problems: 1) tracking the body posture using a flexible sensor, 2) verifying the consistency between the current motion trajectory attempt and a pre-stored trajectory. This work contributes in two ways:
the invention provides a complete solution for hardware and software for human body posture tracking and trajectory authentication. The use of flexible sensors can minimize intervention in user activity and ensure maximum comfort of the user experience.
The invention introduces an LSTM-FCN that directly takes noisy and sparse sensor data as input and matches predefined motion patterns. Compared with representative methods in time series analysis (DTW, DTW-D), the method of the present invention shows advantages in both authentication accuracy and ease of parameter adjustment.
[ Specific method example ]
Fig. 1 schematically shows a general flowchart of a user authentication method for human body posture tracking using a flexible sensor according to a preferred embodiment of the present invention. As shown in fig. 1, a user authentication method for human body posture tracking using a flexible sensor according to a preferred embodiment of the present invention includes:
first step S1: arranging, on a tight-fitting garment, a plurality of stretch sensors for capturing the motion pattern of a human arm;
preferably, the plurality of stretch sensors comprises: five stretch sensors are arranged on one side (e.g., the right side) of the body, four of which are placed at shoulder joint locations in the shoulder region and one at elbow joint locations in the elbow region.
Preferably, the plurality of stretch sensors are sewn to the tight-fitting garment.
The garment is a tight-fitting athletic garment, which ensures that the sensors are fully and consistently stretched across different attempts. Preferably, the garment fabric is made of 80% polyester fiber and 20% polyurethane fiber. In a specific implementation, the present invention uses conductive rubber cord stretch sensors manufactured by Adafruit. The sensor, 2 mm in diameter, is made of carbon-black-impregnated rubber. In the relaxed state, the sensor resistance is about 350 ohms per inch. The rotation of a joint is tracked by monitoring the change in resistance of the stretch sensor. For example, when the user bends his or her elbow joint, the sensor is stretched and its resistance increases accordingly. The stretch information is sampled at 32 Hz.
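For illustration only, the following Python sketch shows one way the raw resistance stream might be turned into normalized stretch signals before classification; the function and variable names (normalize_window, rest_resistance) and the relative-change scheme are assumptions made for this example and are not specified in the patent.

```python
# Minimal sketch (not from the patent): converting raw resistance readings from the
# five stretch sensors into normalized stretch signals. The resting-resistance
# baseline and the relative-change formula are illustrative assumptions.
import numpy as np

SAMPLE_RATE_HZ = 32          # sampling frequency of the stretch information
NUM_CHANNELS = 5             # four shoulder sensors + one elbow sensor

def normalize_window(resistance: np.ndarray, rest_resistance: np.ndarray) -> np.ndarray:
    """Convert a (T, 5) window of raw resistances (ohms) into relative stretch.

    Stretching the conductive rubber cord increases its resistance, so the
    relative change (R - R_rest) / R_rest serves as a simple stretch proxy.
    """
    assert resistance.shape[1] == NUM_CHANNELS
    return (resistance - rest_resistance) / rest_resistance

# Example: 3 seconds of readings around a resting value of about 350 ohms
rest = np.full(NUM_CHANNELS, 350.0)
raw = rest + np.random.rand(3 * SAMPLE_RATE_HZ, NUM_CHANNELS) * 40.0
stretch = normalize_window(raw, rest)
print(stretch.shape)  # (96, 5)
```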
For example, the server computer receives the sensor data, authenticates it, and then feeds back the results to the smart garment. For example, in the following, the server is also used for training and testing the predictive model.
second step S2: capturing motion data of the human arm with the plurality of stretch sensors while the user wears the garment;
third step S3: classifying the motion of the human arms by adopting LSTM-FCN according to the motion data of the human arms captured by the stretching sensors;
fourth step S4: and matching the classification result with the predefined motion pattern of the user to determine the user identity verification result.
This work uses the arm's motion data to verify that the motion matches a password (a pre-stored motion pattern). The prototype is inexpensive, convenient to use, and unaffected by lighting or cluttered backgrounds. Moreover, the identity verification method introduces little interference with the wearer. Although tracking body motion is similar to monitoring stroke trajectories on a touch screen, motions performed in the unconstrained 3D world show significantly more variation than pattern locking on a 2D screen.
The present invention develops a hardware prototype of a smart garment with flexible, stretchable sensors. When a person wears the smart garment and performs specific actions, the sensors monitor the joint movements of the human body. The present invention introduces a long short-term memory fully convolutional network (LSTM-FCN) that directly takes noisy and sparse sensor data as input and matches the user's predefined motion pattern. The experimental results demonstrate the advantage of the method over representative time series classification methods (DTW and DTW-D).
As for applications, the system of the invention may be deployed in the real world, for example to unlock a mobile phone. Another direction is to validate the method on more subtle activities such as finger movements, which may require a more elaborate sensor setup and more powerful sensing algorithms. Such applications would further verify the effectiveness of the method in real human-computer interaction scenarios.
[ concrete examples and experiments ]
To collect data, users were free to move according to their preference or the requirements. Figs. 2A to 2G show the seven movement patterns of the user. The data collected from each session are automatically uploaded to the server of the present invention and manually labeled with the corresponding pattern tags. Specifically, the movement patterns of the arm are: raising the arm in the coronal plane as shown in Fig. 2A, circling the arm along a cone as shown in Fig. 2B, extending the arm to the left in the horizontal plane as shown in Fig. 2C, punching forward from bottom to top as shown in Fig. 2D, swinging the arm with a bent elbow in the horizontal plane as shown in Fig. 2E, raising the arm overhead in the vertical plane as shown in Fig. 2F, and bending and then straightening the arm as shown in Fig. 2G.
The present invention employs the LSTM-FCN as a binary (positive or negative) classifier. Each pattern is associated with a separate classifier. After a password pattern is configured for the system, the network determines whether verification succeeds based on the classification of the new motion input. The input is the human body posture trajectory captured by the flexible sensors, and the output indicates whether the current trajectory belongs to the motion pattern to be verified.
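As an illustrative sketch only (not part of the original disclosure), the per-pattern verification logic described above could be wrapped as follows; the class and method names are hypothetical, and the model is assumed to be a trained binary LSTM-FCN such as the one sketched after the architecture description below.

```python
# Minimal sketch: one binary classifier per password pattern; verification succeeds
# if the classifier assigns the "positive" class to the captured trajectory.
import numpy as np

class PatternVerifier:
    def __init__(self, model):
        self.model = model  # a trained binary (two-class softmax) LSTM-FCN

    def verify(self, trajectory: np.ndarray) -> bool:
        """trajectory: padded sensor window of shape (146, 5)."""
        probs = self.model.predict(trajectory[np.newaxis, ...], verbose=0)[0]
        return bool(np.argmax(probs) == 1)  # class 1 = belongs to the password pattern
```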
The LSTM-FCN consists of one fully convolutional block and one LSTM block. The input time series passes through both the convolutional block and the LSTM block.
In the convolutional layers, a set of one-dimensional filters is applied to capture how the input signal varies over the course of the motion. The filters of each layer are parameterized by a weight tensor W^(l) and a bias vector b^(l), where l ∈ {1, ..., 3} is the layer index and d is the filter duration; L is the length of the input feature vector, set to 146 here. For the l-th layer and each time step t, the (unnormalized) activation â_t^(l) is a function of the (normalized) activation matrix A^(l−1) coming in from the previous layer:

â_t^(l) = f( b^(l) + Σ_{t′=1}^{d} W^(l)_{·,t′,·} A^(l−1)_{·, t+d−t′} ),

where f(·) is the ReLU operation.
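A minimal NumPy sketch of the activation computation above is given below; the tensor shape convention and the left zero-padding of the input are assumptions made for the example.

```python
# Minimal sketch implementing the temporal-convolution activation with NumPy.
# Shapes and padding are illustrative assumptions; f is the ReLU.
import numpy as np

def temporal_conv_layer(A_prev: np.ndarray, W: np.ndarray, b: np.ndarray) -> np.ndarray:
    """A_prev: (F_prev, T) incoming activations; W: (F, d, F_prev); b: (F,)."""
    F, d, F_prev = W.shape
    _, T = A_prev.shape
    A_pad = np.concatenate([np.zeros((F_prev, d - 1)), A_prev], axis=1)  # pad on the left
    out = np.empty((F, T))
    for t in range(T):
        acc = b.copy()
        for tp in range(d):  # tp plays the role of t' in the formula
            acc = acc + W[:, tp, :] @ A_pad[:, t + d - 1 - tp]
        out[:, t] = np.maximum(acc, 0.0)  # ReLU
    return out

# Tiny example: 5 input channels, 8 filters of width d = 3, 146 time steps
rng = np.random.default_rng(0)
A0 = rng.standard_normal((5, 146))
W1 = rng.standard_normal((8, 3, 5)) * 0.1
b1 = np.zeros(8)
print(temporal_conv_layer(A0, W1, b1).shape)  # (8, 146)
```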
The fully convolutional block consists of three stacked temporal convolution blocks with 128, 256 and 128 filters, respectively. Each convolution block consists of a temporal convolution layer, batch normalization and a ReLU activation function. Global average pooling is applied after the last convolution block. The LSTM block consists of a conventional LSTM layer and a dropout layer. The input time series is passed to a dimension shuffle layer, and the output of the dimension shuffle transformation is passed to the LSTM block. The outputs of the global pooling layer and the LSTM block are concatenated and passed to the softmax classification layer.
Data processing and flow: the smart garment of the present invention has five sensors, and all data are padded to length 146. Thus, one sample in the collected data is a time series of shape (146, 5). The fully convolutional block treats the series as a sequence of 146 time steps. In contrast, the dimension shuffle layer transposes the time dimension, converting the series into a multi-dimensional series with a single time step, and feeds the output to the LSTM block.
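The following Keras sketch assembles an architecture along the lines described above; it is illustrative only. The convolution kernel widths (8, 5, 3) and the number of LSTM units are not stated in the text and are borrowed from the published LSTM-FCN literature, and the dimension shuffle is implemented here as a transpose of the time and channel axes, which is one common reading of the description.

```python
# Minimal Keras sketch of the described LSTM-FCN (assumptions flagged in comments).
import tensorflow as tf
from tensorflow.keras import layers, models

def build_lstm_fcn(time_steps: int = 146, channels: int = 5, num_classes: int = 2,
                   lstm_units: int = 8) -> tf.keras.Model:
    inp = layers.Input(shape=(time_steps, channels))

    # Fully convolutional block: three temporal convolution blocks with
    # 128, 256 and 128 filters, each followed by batch normalization and ReLU.
    x = inp
    for n_filters, kernel in zip((128, 256, 128), (8, 5, 3)):  # kernel widths assumed
        x = layers.Conv1D(n_filters, kernel, padding="same")(x)
        x = layers.BatchNormalization()(x)
        x = layers.Activation("relu")(x)
    x = layers.GlobalAveragePooling1D()(x)      # global average pooling after last block

    # LSTM block: dimension shuffle (transpose time and channel axes), LSTM, dropout.
    y = layers.Permute((2, 1))(inp)
    y = layers.LSTM(lstm_units)(y)
    y = layers.Dropout(0.8)(y)                  # high dropout rate per the text

    out = layers.Dense(num_classes, activation="softmax")(layers.concatenate([x, y]))
    return models.Model(inp, out)

model = build_lstm_fcn()
model.compile(optimizer=tf.keras.optimizers.Adam(1e-3),
              loss="sparse_categorical_crossentropy", metrics=["accuracy"])
model.summary()
```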
Training and testing: the present invention randomly selects a number of samples of each pattern to construct the training data set, while all other remaining samples are used for testing. For pattern i, three samples are taken as training samples and the remaining samples are used as test samples; the total number of samples is the sum of the number of training-set samples and the number of test-set samples.
To initialize the authentication system on the smart garment, the user performs the password trajectory three times. The purpose of the repetition is to capture, as far as possible, the variation a person exhibits when performing the same action.
Training hyper-parameters: the number of training epochs is 1000. The batch size and initial learning rate are 128 and 1e-3, respectively. A high dropout rate of 80% is used after the LSTM layer to mitigate overfitting. The present invention trains the models using the Keras library with a TensorFlow backend and the Adam optimizer. All convolution kernels are initialized. If the verification score does not improve, the learning rate is reduced by a fixed factor every 100 epochs until its value reaches 1e-4.
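A possible Keras training setup for the hyper-parameters listed above is sketched below; the learning-rate reduction factor, the monitored quantity and the placeholder training arrays are assumptions, since those details are not fully specified in the text.

```python
# Minimal training sketch for the hyper-parameters described above.
import numpy as np
import tensorflow as tf

X_train = np.random.rand(6, 146, 5).astype("float32")   # placeholder padded samples
y_train = np.array([1, 1, 1, 0, 0, 0])                  # 1 = password pattern, 0 = other

model = build_lstm_fcn()                                 # from the previous sketch
model.compile(optimizer=tf.keras.optimizers.Adam(1e-3),
              loss="sparse_categorical_crossentropy", metrics=["accuracy"])

reduce_lr = tf.keras.callbacks.ReduceLROnPlateau(
    monitor="loss",          # assumed; the text says "verification score"
    factor=0.5,              # assumed reduction factor
    patience=100,            # reduce every 100 epochs without improvement
    min_lr=1e-4)             # until the learning rate reaches 1e-4

model.fit(X_train, y_train, epochs=1000, batch_size=128,
          callbacks=[reduce_lr], verbose=0)
```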
The present invention collected 69 samples of the seven motion patterns, with between nine and eleven samples per pattern. Fig. 3 shows the resistance distributions of the five sensors for the motion pattern of Fig. 2A. That pattern raises the arm in the coronal plane, so the two sensors located on the shoulder joint undergo large changes due to shoulder rotation; the pattern is also accompanied by significant secondary motion at the elbow joint (sensor). The figure reflects the large intra-personal variation when a user performs the same movement pattern.
The duration of each sample motion is between 2.788 s and 4.998 s, and statistics of these sample durations are shown in Fig. 4. Even for the same pattern, the duration of different attempts can vary widely, by more than 10%. The present invention addresses this problem by padding, converting each signal into a segment of fixed length.
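As a minimal sketch of this padding step, assuming zero post-padding (the exact padding scheme is not specified in the text), the variable-length recordings can be brought to the fixed length of 146 as follows.

```python
# Minimal sketch: pad variable-duration recordings to the fixed length of 146 samples.
import numpy as np
from tensorflow.keras.preprocessing.sequence import pad_sequences

def pad_trials(trials, target_len=146):
    """trials: list of arrays of shape (T_i, 5) with T_i varying per attempt."""
    return pad_sequences(trials, maxlen=target_len, dtype="float32",
                         padding="post", truncating="post")

# Durations between 2.788 s and 4.998 s at 32 Hz give roughly 89 to 160 raw samples.
trials = [np.random.rand(int(32 * d), 5) for d in (2.8, 3.5, 5.0)]
padded = pad_trials(trials)
print(padded.shape)  # (3, 146, 5)
```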
The experimental results confirm that the sensor signal also varies greatly within an individual. The reason may be twofold: the temporal factor is reflected in the varying durations of different attempts at the same pattern, while the spatial factor is due to the unconstrained 3D trajectory. These two factors are closely related to the freedom the user has in designing and executing the pattern. The proposed method has proven to handle such variation effectively and does not require manual feature engineering.
Comparison with DTW/DTW-D
Dynamic Time Warping (DTW) is a reference method for time series signal processing. DTW-D additionally exploits the correlation among the five channels of the time series during matching and improves the matching accuracy. However, the performance of both depends on a threshold. The present invention therefore compares its method against DTW and DTW-D operating at their highest matching accuracy.
Success conditions for DTW and DTW-D: for DTW, a match is successful if the distance of every channel to a sample in the password set is within a threshold. For both DTW and DTW-D, once there are multiple passwords in the password set, matching succeeds as long as the distance between the test sample and any one of the passwords is below the threshold.
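For reference, a minimal sketch of this success condition follows, using a generic textbook DTW implementation; the threshold value is an illustrative placeholder, not the one used in the experiments.

```python
# Minimal sketch of the DTW baseline's success condition: a test attempt matches a
# stored password sample if the DTW distance of every channel is within a threshold.
import numpy as np

def dtw_distance(a: np.ndarray, b: np.ndarray) -> float:
    """Classic O(len(a)*len(b)) dynamic-programming DTW between two 1-D signals."""
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return float(D[n, m])

def dtw_match(test: np.ndarray, password: np.ndarray, threshold: float) -> bool:
    """test, password: arrays of shape (T, 5); all five channels must be within threshold."""
    return all(dtw_distance(test[:, c], password[:, c]) <= threshold
               for c in range(test.shape[1]))

def dtw_match_any(test, password_set, threshold):
    """With several passwords enrolled, matching any one of them counts as success."""
    return any(dtw_match(test, p, threshold) for p in password_set)
```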
Fig. 5 shows the results of comparing the three methods: DTW, DTW-D and LSTM-FCN. The LSTM-FCN model of the present invention is consistently superior to DTW and DTW-D regardless of the number of samples in the training data set. Increasing the size of the training data set improves the authentication accuracy to 98.0%.
In terms of time performance, the LSTM-FCN takes approximately 27.10 s to train. However, it is trained only once, and verification takes 0.26 s. DTW and DTW-D require no offline training, and their online verification takes 0.74 s and 0.001 s, respectively.
Regarding the effect of the number of samples in the password set on the results: as the training data set grows, the false negatives (FN) of DTW and DTW-D become fewer, meaning that more positive samples of the pattern are correctly verified. The reason should be that adding training samples expands the state space and covers samples that were previously excluded from it. However, this has a side effect: samples that do not belong to the pattern are also erroneously matched. Enlarging the training data set therefore does not contribute much to the authentication accuracy of DTW and DTW-D. In fact, the accuracy of DTW decreases moderately (by about 2%) when the training data set is increased from 4 to 5. The results show that increasing the number of samples in the password set is not an effective strategy for improving the accuracy of DTW and DTW-D.
In contrast, as the training data set grows, both the "true password mismatched" and "non-password authenticated" cases of the LSTM-FCN decrease. This shows that the LSTM-FCN of the present invention can efficiently extract consistent latent features from newly added samples without introducing anomalous information, which ensures that positive samples are authenticated and negative samples are rejected. The performance improvement is very significant when the training data set is increased from 1 to 3, and slight when the training data set is 4 or larger. Compared with DTW and DTW-D, the LSTM-FCN achieves higher authentication accuracy even when only one sample is used as the password. This advantage grows further as the training data set increases, and the accuracy of the LSTM-FCN reaches above 98%. Unlike DTW and DTW-D, the present method can find the trade-off between "true password mismatched" and "non-password authenticated" without hand-picking values for the training data set.
When the pattern shown in Fig. 2F is set as the password, the tenth sample of that pattern cannot be correctly matched regardless of the number of training samples. To find the reason, the invention changes the binary classification network into a seven-class classification network. When the number of training samples per exercise pattern increases from one to five, the tenth sample of the pattern of Fig. 2F is classified as the pattern of Fig. 2G in 4 of the 5 cases. As the pattern trajectory plots (Figs. 2F and 2G) show, the tenth sample of the pattern of Fig. 2F is numerically similar to the pattern of Fig. 2G if the shoulder is not raised vertically to a sufficient degree. The present invention also uses DTW-D to measure the distance between that tenth sample and the remaining samples of the pattern of Fig. 2F as well as all samples of the pattern of Fig. 2G. The sample closest to the tenth sample of the pattern of Fig. 2F is P7S4. The distance between P6S10 and P7S4 (7.99) is smaller than the average distance (9.36) between the tenth sample of the pattern of Fig. 2F and the other samples of that pattern. This finding demonstrates that the method of the present invention can reliably identify outliers in a password set.
It should be noted that the terms "first", "second", "third", and the like in the description are used for distinguishing various components, elements, steps, and the like in the description, and are not used for indicating a logical relationship or a sequential relationship between the various components, elements, steps, and the like, unless otherwise specified.
It is to be understood that while the present invention has been described in conjunction with its preferred embodiments, it is not intended to be limited to those embodiments. It will be apparent to those skilled in the art that many changes, modifications and equivalent substitutions can be made to the embodiments of the invention without departing from the scope of the invention. Therefore, any simple modification, equivalent change or modification made to the above embodiments according to the technical essence of the present invention still falls within the scope of protection of the technical solution of the present invention, provided it does not depart from the content of the technical solution.

Claims (8)

1. A user identity verification method for tracking human body posture by using flexible sensors, characterized by comprising the following steps:
the first step: arranging, on a tight-fitting garment, a plurality of stretch sensors for capturing the motion pattern of a human arm;
the second step: capturing motion data of the human arm with the plurality of stretch sensors while the user wears the garment;
the third step: classifying the arm motion with an LSTM-FCN according to the motion data captured by the plurality of stretch sensors;
the fourth step: matching the classification result against the user's predefined motion pattern to determine the identity verification result.
2. The method according to claim 1, characterized in that the plurality of stretch sensors comprises five stretch sensors arranged on one side of the body, of which four are placed at the shoulder joint in the shoulder region and one is placed at the elbow joint in the elbow region.
3. The method according to claim 1 or 2, characterized in that the plurality of stretch sensors are sewn to the tight-fitting garment.
4. The method according to claim 1 or 2, characterized in that each stretch sensor is a conductive rubber cord stretch sensor.
5. The method according to claim 1 or 2, characterized in that the rotation of a joint is tracked as the motion data by monitoring the change in resistance of the stretch sensors.
6. The method according to claim 1 or 2, characterized in that the input of the LSTM-FCN is the motion data of the arm posture trajectory captured by the plurality of stretch sensors, and the output indicates whether the current trajectory belongs to the motion pattern to be verified.
7. The method according to claim 1 or 2, characterized in that all data from the stretch sensors are padded to a length of 146, and one sample in the data is a time series; the LSTM-FCN is composed of a fully convolutional block and an LSTM block, and the time series passes through both blocks.
8. The method according to claim 1 or 2, characterized in that the motion patterns of the arm are divided into: raising the arm in the coronal plane, circling the arm along a cone, extending the arm to the left in the horizontal plane, punching forward from bottom to top, swinging the arm with a bent elbow in the horizontal plane, raising the arm overhead in the vertical plane, and bending and then straightening the arm.
CN202010813151.4A 2020-08-13 2020-08-13 User identity verification method for tracking human body posture by using flexible sensor Pending CN114077723A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010813151.4A CN114077723A (en) 2020-08-13 2020-08-13 User identity verification method for tracking human body posture by using flexible sensor

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010813151.4A CN114077723A (en) 2020-08-13 2020-08-13 User identity verification method for tracking human body posture by using flexible sensor

Publications (1)

Publication Number Publication Date
CN114077723A true CN114077723A (en) 2022-02-22

Family

ID=80280374

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010813151.4A Pending CN114077723A (en) 2020-08-13 2020-08-13 User identity verification method for tracking human body posture by using flexible sensor

Country Status (1)

Country Link
CN (1) CN114077723A (en)


Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20070177770A1 (en) * 2006-01-30 2007-08-02 Derchak P A System and method for identity confirmation using physiologic biometrics to determine a physiologic fingerprint
CN106164920A (en) * 2014-04-04 2016-11-23 高通股份有限公司 Assist the method and apparatus of wearable identity manager
CN104462906A (en) * 2014-12-19 2015-03-25 中国农业银行股份有限公司 Mouse with information collection function and identity recognition method and system
CN105124893A (en) * 2015-08-13 2015-12-09 上海箩箕技术有限公司 Biological recognition suitcase
CN106567435A (en) * 2016-11-09 2017-04-19 中科院合肥技术创新工程院 Intelligent detection system and method for intelligent and healthy toilet
CN106411952A (en) * 2016-12-01 2017-02-15 安徽工业大学 Telekinetic-dynamic-gesture-based user identity authentication method and apparatus
CN110609621A (en) * 2019-09-17 2019-12-24 南京茂森电子技术有限公司 Posture calibration method and human motion capture system based on micro-sensor

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
"Technology and the Olympics: Applications of Sensor Technology in the Olympic Games" (科技与奥运―传感器技术在奥运中的应用), Sensor World (传感器世界), no. 09, 25 September 2008 (2008-09-25) *
李翔宇: "A multi-template combination mobile phone identity authentication method based on the DTW algorithm" (基于DTW算法的多模板组合手机身份认证方法), Fujian Computer (福建电脑), no. 04, 25 April 2018 (2018-04-25) *
陈志勇; 李振汉; 张婷婷: "Research on state-observation-based improved fuzzy-immune hybrid control and vibration suppression for space robots with flexible bases and flexible hinges" (柔性基、柔性铰空间机器人基于状态观测的改进模糊免疫混合控制及抑振研究), Journal of Vibration and Shock (振动与冲击), no. 19, 15 October 2018 (2018-10-15) *

Similar Documents

Publication Publication Date Title
Ehatisham-ul-Haq et al. Continuous authentication of smartphone users based on activity pattern recognition using passive mobile sensing
Ahmed et al. DTW-based kernel and rank-level fusion for 3D gait recognition using Kinect
US9754149B2 (en) Fingerprint based smart phone user verification
Vamsikrishna et al. Computer-vision-assisted palm rehabilitation with supervised learning
US10121049B2 (en) Fingerprint based smart phone user verification
US9432366B2 (en) Fingerprint based smartphone user verification
Kumar et al. Gait recognition based on vision systems: A systematic survey
Hu et al. Cyberphysical system with virtual reality for intelligent motion recognition and training
US10970374B2 (en) User identification and authentication with neuromuscular signatures
Fan et al. Emgauth: An emg-based smartphone unlocking system using siamese network
Derawi Smartphones and biometrics: Gait and activity recognition
Geng Research on athlete’s action recognition based on acceleration sensor and deep learning
Li et al. Adaptive deep feature fusion for continuous authentication with data augmentation
Luo et al. Activity-based person identification using multimodal wearable sensor data
Chen et al. Human posture tracking with flexible sensors for motion recognition
Hossain et al. Incorporating deep learning into capacitive images for smartphone user authentication
Zhang et al. Artificial intelligence in physiological characteristics recognition for internet of things authentication
Chen et al. Cnn-lstm model for recognizing video-recorded actions performed in a traditional chinese exercise
Dong et al. Finger vein verification with vein textons
CN114077723A (en) User identity verification method for tracking human body posture by using flexible sensor
Sun et al. IoT Motion Tracking System for Workout Performance Evaluation: A Case Study on Dumbbell
Fan et al. EmgAuth: Unlocking smartphones with EMG signals
Ma et al. Sports competition assistant system based on fuzzy big data and health exercise recognition algorithm
Shin et al. Electromyogram-based algorithm using bagged trees for biometric person authentication and motion recognition
Bastico et al. Continuous Person Identification and Tracking in Healthcare by Integrating Accelerometer Data and Deep Learning Filled 3D Skeletons

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination