CN113516063B - Motion pattern recognition method based on K-Means and gait cycle similarity - Google Patents

Motion pattern recognition method based on K-Means and gait cycle similarity

Info

Publication number
CN113516063B
CN113516063B CN202110728234.8A
Authority
CN
China
Prior art keywords
data
similarity
feature vector
gait cycle
template
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202110728234.8A
Other languages
Chinese (zh)
Other versions
CN113516063A (en)
Inventor
尹业成
闫国栋
朱晓荣
于志远
曾思
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Research Institute of Precise Mechatronic Controls
Original Assignee
Beijing Research Institute of Precise Mechatronic Controls
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Research Institute of Precise Mechatronic Controls filed Critical Beijing Research Institute of Precise Mechatronic Controls
Priority to CN202110728234.8A priority Critical patent/CN113516063B/en
Publication of CN113516063A publication Critical patent/CN113516063A/en
Application granted granted Critical
Publication of CN113516063B publication Critical patent/CN113516063B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/22Matching criteria, e.g. proximity measures
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/23Clustering techniques
    • G06F18/232Non-hierarchical techniques
    • G06F18/2321Non-hierarchical techniques using statistics or function optimisation, e.g. modelling of probability density functions
    • G06F18/23213Non-hierarchical techniques using statistics or function optimisation, e.g. modelling of probability density functions with fixed number of clusters, e.g. K-means clustering

Landscapes

  • Engineering & Computer Science (AREA)
  • Data Mining & Analysis (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Artificial Intelligence (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Evolutionary Biology (AREA)
  • Evolutionary Computation (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Probability & Statistics with Applications (AREA)
  • Image Analysis (AREA)
  • Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)

Abstract

A motion pattern recognition method based on K-Means and gait cycle similarity first generates template vectors for each category offline from collected data. During recognition, the data acquired in real time is compared with the template vectors through similarity analysis, the motion mode is classified according to the similarity result, and the probability of the current result is given according to the similarity. This addresses the problems in the prior art that exoskeleton robot motion pattern recognition mostly relies either on machine learning, with a large computational load, or on template matching, with limited accuracy. The feature vector set can be extracted semi-automatically, real-time performance is good, and the accuracy is higher than that of the traditional template matching method.

Description

Motion pattern recognition method based on K-Means and gait cycle similarity
Technical Field
The invention relates to a motion pattern recognition method based on K-Means and gait cycle similarity, and belongs to the field of exoskeleton robots.
Background
Motion intent perception is the basis and premise of exoskeleton robot assistance, and motion pattern recognition is an important component of motion intent perception. The human motion mode refers to states such as standing, level-ground walking, level-ground running, ascending stairs and descending stairs during exoskeleton-assisted movement. Quickly identifying the current motion mode is therefore important for adjusting the assistance strategy in real time and improving the assistance effect of the exoskeleton.
At present, motion pattern recognition mostly adopts machine learning or template matching: the former suffers from a large computational load and is difficult to deploy on low-power embedded platforms; the latter requires templates to be generated manually for each practical situation and has limited accuracy.
Disclosure of Invention
The technical problem solved by the invention: addressing the prior-art problems that traditional exoskeleton robot motion pattern recognition mostly adopts either machine learning, with a large computational load, or template matching, with limited accuracy, a motion pattern recognition method based on K-Means and gait cycle similarity is provided.
The invention solves the above technical problem through the following technical solution:
a motion pattern recognition method based on K-Means and gait cycle similarity comprises the following steps:
(1) Collecting data offline and generating a template vector;
(2) During recognition, the data acquired in real time is processed to obtain a real-time feature vector, and similarity analysis is carried out between the real-time feature vector and the template vectors obtained in step (1);
(3) Motion pattern classification is carried out according to the similarity analysis result, and the probability of the current classification result is given.
In the step (1), the specific step of generating the template vector is as follows:
after offline data acquisition and processing, gait cycle division is performed, feature vectors are generated through feature extraction, and template vectors are generated according to the feature vectors.
In the step (2), the specific step of generating the real-time feature vector is as follows:
after the real-time data are collected and processed, gait cycle division is carried out, and feature vectors are generated through feature extraction.
The offline or real-time data is compressed according to the motion periodicity to obtain gait cycle data slices, completing the gait cycle division.
Downsampling is carried out on the gait cycle data slice. The specific method is as follows:
p data points x = {x(mi) | i = 1, …, p} are uniformly sampled from the gait cycle data {x(k) | k = 1, …, n} and processed to obtain the feature vector, where mi = 1 + [(i−1)·n/p];
wherein x(k) is the data point with index k in the current data slice; n is the total number of data points in the current data slice; mi is the index of the i-th sampling point; [x] denotes the largest integer not greater than x.
The sampled vector x = {x(k) | k = 1, …, p} is normalized and unit-shifted to obtain the feature vector x' = x/‖x‖ + E;
wherein ‖x‖ is the modulus (norm) of x; E is the unit vector.
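As an illustrative aid (not part of the patent text), the following minimal Python sketch shows one way the downsampling and normalization described above could be implemented; the function name extract_feature, the choice of p = 20, and the interpretation of E as the all-ones vector are assumptions.

```python
import numpy as np

def extract_feature(slice_data: np.ndarray, p: int = 20) -> np.ndarray:
    """Downsample one gait cycle slice to p points, then normalize and unit-shift."""
    n = len(slice_data)
    # mi = 1 + [(i-1)*n/p] in 1-based indexing, i.e. [(i-1)*n/p] as a 0-based index
    idx = np.array([(i - 1) * n // p for i in range(1, p + 1)])
    x = slice_data[idx].astype(float)
    # x' = x/||x|| + E, with E taken here as the all-ones vector (an assumption)
    return x / np.linalg.norm(x) + np.ones(p)

# Example: a synthetic 200-sample gait cycle slice reduced to a 20-point feature vector
cycle = 1.0 + 0.5 * np.sin(np.linspace(0.0, 2.0 * np.pi, 200))
print(extract_feature(cycle, p=20).shape)  # (20,)
```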
The feature vector set is clustered with the K-Means clustering method, each cluster center is matched to a motion mode, and the standard vector set corresponding to each motion mode is taken as that motion mode's template vector set with classification labels.
The K-Means-based clustering method specifically comprises the following steps (an illustrative sketch is given after the list):
(1) Setting an initial value k=m of a category number k according to the motion mode category M;
(2) Selecting initialized k samples from the feature vector set as initial clustering centers;
(3) Calculating the distance between each sample in the feature vector set and the k cluster centers, and assigning each sample to the class of the nearest cluster center;
(4) Recalculating each cluster center;
(5) Repeating step (3) and step (4) until no feature vector is reassigned to a different cluster or the number of iterations reaches a specified threshold;
(6) Evaluating each cluster center to judge whether it is representative; if not, increasing k and returning to step (2); if so, proceeding to step (7);
(7) Obtaining the k cluster centers and matching them to the motion modes to obtain a labelled template vector set.
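The following minimal Python sketch illustrates how the K-Means-based template generation above might look in practice; it is not the patent's reference implementation. The use of scikit-learn's KMeans, the silhouette score as the "representativeness" check of step (6), and the cap max_k are assumptions; matching the returned cluster centers to motion-mode labels is still done by hand, which is why the extraction is only semi-automatic.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.metrics import silhouette_score

def build_templates(features: np.ndarray, n_modes: int, max_k: int = 10) -> np.ndarray:
    """Cluster feature vectors; the returned centers are matched to motion modes afterwards."""
    k = n_modes                                           # step (1): start with k = number of motion modes
    while k <= max_k:
        km = KMeans(n_clusters=k, n_init=10, random_state=0).fit(features)   # steps (2)-(5)
        # step (6): crude representativeness check (0.3 threshold is arbitrary)
        if silhouette_score(features, km.labels_) >= 0.3 or k == max_k:
            return km.cluster_centers_                    # step (7): centers become the templates
        k += 1                                            # clusters overlap too much: increase k

# Example with synthetic features (3 motion modes, 20-dimensional feature vectors)
rng = np.random.default_rng(0)
feats = np.vstack([rng.normal(loc=c, scale=0.05, size=(50, 20)) for c in (0.8, 1.0, 1.2)])
print(build_templates(feats, n_modes=3).shape)  # (3, 20)
```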
The similarity analysis specifically comprises the following steps:
The feature vector obtained from the current real-time data is matched against the template vector set according to a similarity measurement function, which is specifically:
f(p,q)=var((Mp)./(Mq))
wherein var is the variance function; M is a 0-1 weight matrix, tuned so as to maximize the classification accuracy of the motion modes over the feature vector set.
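A minimal sketch of the variance-based similarity measurement function follows, assuming the 0-1 weight matrix M acts as a mask that selects which elements of the two vectors enter the element-wise ratio (positions weighted 0 are excluded, which also avoids 0/0); this interpretation is an assumption, not taken from the patent text.

```python
import numpy as np

def similarity(p: np.ndarray, q: np.ndarray, mask: np.ndarray) -> float:
    """f(p, q) = var((Mp) ./ (Mq)), restricted to the positions selected by the 0-1 mask."""
    sel = mask.astype(bool)
    ratio = p[sel] / q[sel]            # element-wise ratio over the selected positions
    return float(np.var(ratio))        # proportional vectors give a ratio variance of 0

# Example: compare a real-time feature vector against one template
p_vec = np.array([1.1, 1.2, 0.9, 1.0])
q_vec = np.array([1.0, 1.1, 0.8, 0.9])
m = np.array([1, 1, 1, 0])             # ignore the last element, e.g. near the cycle boundary
print(similarity(p_vec, q_vec, m))
```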
Compared with the prior art, the invention has the advantages that:
(1) The invention provides a motion pattern recognition method based on K-Means and gait cycle similarity. Combining the characteristics of human motion, it takes the data of a whole gait cycle as the object and compresses the data, which solves the problem of the large computational load of conventional approaches. It further provides a K-Means-based method for extracting the standard template set, which enables semi-automatic generation of the standard templates, and a variance-based similarity measurement function whose measurement length can be adjusted through the weights, which solves the problem that uncertainty in the start and end points of the gait cycle reduces the accuracy of the similarity discrimination;
(2) The invention generates template vectors of the corresponding categories offline from collected data; during recognition, similarity analysis is carried out between the data acquired in real time and the template vectors, the motion mode is classified according to the similarity result, the probability of the current result is given according to the similarity, and the feature vector set can be extracted semi-automatically; the computational load of recognition is low, and because the data of the whole gait cycle is processed as the object, the accuracy is higher than that of traditional methods that directly apply template matching.
Drawings
FIG. 1 is a flow chart of the invention for pattern recognition based on template matching;
FIG. 2 is a schematic diagram showing the comparison of the original vector and the processed feature vector according to the present invention;
FIG. 3 is a schematic diagram of k-means clustering results provided by the invention;
FIG. 4 is a schematic diagram of a motion pattern template vector provided by the invention;
Detailed Description
In the motion pattern recognition method based on K-Means and gait cycle similarity, template vectors of the corresponding categories are first generated offline from collected data; during recognition, similarity analysis is carried out between the data acquired in real time and the template vectors, the motion mode is classified according to the similarity result, and the probability of the current result is given according to the similarity. The specific steps of the recognition method are as follows:
(1) Collecting data offline and generating a template vector;
the specific steps of generating the template vector are as follows:
after offline data acquisition and processing, gait cycle division is carried out, feature vectors are generated through feature extraction, and template vectors are generated according to the feature vectors;
According to the motion periodicity, the offline or real-time data is compressed to obtain gait cycle data slices, completing the gait cycle division;
downsampling is then carried out on the gait cycle data slice. The specific method is as follows:
p data points x = {x(mi) | i = 1, …, p} are uniformly sampled from the gait cycle data {x(k) | k = 1, …, n} and processed to obtain the feature vector, where mi = 1 + [(i−1)·n/p];
wherein x(k) is the data point with index k in the current data slice; n is the total number of data points in the current data slice; mi is the index of the i-th sampling point; [x] denotes the largest integer not greater than x;
It should be noted that different sampling schemes may be selected according to the data source or its characteristics. For example, when a MEMS inertial navigation unit is used, the sampling density in the first half of the motion cycle is increased and the sampling density in the second half is reduced, as in the sketch below.
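A minimal sketch of such a non-uniform sampling scheme, assuming (arbitrarily) that two thirds of the p sample points are placed in the first half of the cycle; the function name and the split ratio are illustrative only and not specified by the patent.

```python
import numpy as np

def nonuniform_sample(slice_data: np.ndarray, p: int = 21) -> np.ndarray:
    """Sample the first half of the cycle more densely than the second half."""
    n = len(slice_data)
    p1 = 2 * p // 3                                        # two thirds of the points in the first half
    p2 = p - p1
    idx_first = np.linspace(0, n // 2 - 1, p1).astype(int)
    idx_second = np.linspace(n // 2, n - 1, p2).astype(int)
    return slice_data[np.concatenate([idx_first, idx_second])]

sampled = nonuniform_sample(np.sin(np.linspace(0.0, 2.0 * np.pi, 200)))
print(sampled.shape)  # (21,)
```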
The sampled vector x = {x(k) | k = 1, …, p} is normalized and unit-shifted to obtain the feature vector x' = x/‖x‖ + E;
wherein ‖x‖ is the modulus (norm) of x; E is the unit vector;
the feature vector set is clustered with the K-Means clustering method, each cluster center is matched to a motion mode, and the standard vector set corresponding to each motion mode is taken as that motion mode's template vector set with classification labels;
The K-Means clustering method specifically comprises the following steps:
1) Setting an initial value k=m of a category number k according to the motion mode category M;
2) Selecting initialized k samples from the feature vector set as initial clustering centers;
3) Calculating the distance between each sample in the feature vector set and the k cluster centers, and assigning each sample to the class of the nearest cluster center;
4) Recalculating each cluster center;
5) Repeating step 3) and step 4) until no feature vector is reassigned to a different cluster or the number of iterations reaches a specified threshold;
6) Evaluating each cluster center to judge whether it is representative; if not, increasing k and returning to step 2); if so, proceeding to step 7);
7) Obtaining the k cluster centers and matching them to the motion modes to obtain a labelled template vector set.
(2) During recognition, the data acquired in real time is processed to obtain a real-time feature vector, and similarity analysis is carried out between this feature vector and the template vectors obtained in step (1);
The specific steps of generating the real-time feature vector are as follows:
After the real-time data are collected and processed, gait cycle division is carried out, and feature vectors are generated through feature extraction;
The feature vector is shown in fig. 2.
(3) Motion pattern classification is carried out according to the similarity analysis result, and the probability of the current classification result is given, wherein:
The similarity analysis is specifically as follows:
The feature vector obtained from the current real-time data is matched against the template vector set according to a similarity measurement function, which is specifically:
f(p,q)=var((Mp)./(Mq))
wherein var is the variance function; M is a 0-1 weight matrix, tuned so as to maximize the classification accuracy of the motion modes over the feature vector set.
The template vector class with the highest similarity to the feature vector obtained from the current real-time data is taken as the current motion mode, and the probability that the classification result is correct is given by the following formula:
wherein fi is the similarity between the feature vector obtained from the current real-time data and the i-th template vector, and fm is the maximum value among the fi.
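A minimal classification sketch follows, under the assumption that a smaller value of the variance-based metric indicates a closer match; the template labels below are hypothetical, and the patent's probability formula is not reproduced here.

```python
import numpy as np

def similarity(p: np.ndarray, q: np.ndarray, mask: np.ndarray) -> float:
    """Variance of the element-wise ratio over the positions selected by the 0-1 mask."""
    sel = mask.astype(bool)
    return float(np.var(p[sel] / q[sel]))

def classify(feature: np.ndarray, templates: dict, mask: np.ndarray) -> str:
    """Return the label of the template that best matches the real-time feature vector."""
    scores = {label: similarity(feature, tmpl, mask) for label, tmpl in templates.items()}
    return min(scores, key=scores.get)     # lowest ratio variance taken as the best match

# Example with two hypothetical templates
templates = {"level walking": np.array([1.0, 1.1, 0.9, 1.0]),
             "upstairs":      np.array([1.4, 0.7, 1.2, 0.8])}
print(classify(np.array([1.02, 1.08, 0.92, 0.98]), templates, np.ones(4)))
```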
Further description of specific embodiments follows:
In this embodiment, the flow of the motion pattern recognition method is shown in fig. 1: template vectors are generated offline from collected data; during recognition, the data acquired in real time is processed and compared with the template vectors through similarity analysis, the motion mode is classified according to the similarity result, and the probability of the current result is given according to the similarity;
the method compresses the original data according to the motion periodicity to obtain gait cycle data slices, and then downsamples the obtained gait cycle data. The sampling method is as follows:
p data points x = {x(mi) | i = 1, …, p} are uniformly sampled from the gait cycle data {x(k) | k = 1, …, n} and processed into the feature vector, where
mi = 1 + [(i−1)·n/p]
wherein x(k) is the data point with index k in the current data slice; n is the total number of data points in the current data slice; mi is the index of the i-th sampling point; [x] denotes the largest integer not greater than x;
The method performs normalization and unit offset processing on the vector x = {x(k) | k = 1, …, p}:
x' = x/‖x‖ + E
wherein ‖x‖ is the modulus (norm) of x; E is the unit vector;
The method adopts K-Means to cluster the feature vector set, matches each cluster center to a motion mode, and uses the standard vector set corresponding to each motion mode as that motion mode's template vector set, finally forming a template vector set with classification labels;
The K-Means clustering method comprises the following steps:
(1) Setting an initial value k=m of a category number k according to the motion mode category M;
(2) Selecting initialized k samples from the feature vector set as initial clustering centers;
(3) Calculating the distance between each sample in the feature vector set and the k cluster centers, and assigning each sample to the class of the nearest cluster center;
(4) Recalculating each cluster center based on the classification of step (3);
(5) Repeating step (3) and step (4) until no feature vector is reassigned to a different cluster or the number of iterations reaches a specified threshold;
(6) Evaluating each cluster center to judge whether it is representative; if not, increasing k and returning to step (2); if so, proceeding to step (7);
(7) Obtaining the k cluster centers and matching them to the motion modes to obtain a labelled template vector set.
Data containing level-ground walking, stair-ascending and stair-descending processes collected at a certain time were clustered; the results are shown in figs. 3 and 4, where fig. 4 shows the template vectors formed by the cluster centers.
The method classifies the current data by matching it against the template vectors based on similarity. The similarity measurement function is as follows:
f(p,q)=var((Mp)./(Mq))
wherein var is the variance function; M is a 0-1 weight matrix, tuned so as to maximize the classification accuracy of the motion modes over the feature vector set.
Although the present invention has been described in terms of the preferred embodiments, it is not limited to those embodiments. Any person skilled in the art can make possible variations and modifications to the technical solution of the present invention using the methods and technical content disclosed above without departing from the spirit and scope of the present invention; therefore, any simple modifications, equivalent variations and adaptations of the above embodiments made according to the technical substance of the present invention fall within the scope of protection of the present invention.
What is not described in detail in the present specification belongs to the known technology of those skilled in the art.

Claims (2)

1. A motion pattern recognition method based on K-Means and gait cycle similarity is characterized by comprising the following steps:
(1) Collecting data offline and generating a template vector;
(2) During recognition, processing the data acquired in real time to obtain a real-time feature vector, and carrying out similarity analysis between the real-time feature vector and the template vectors obtained in step (1);
(3) Performing motion pattern classification according to the similarity analysis result, and giving the probability of the current classification result;
in the step (1), the specific step of generating the template vector is as follows:
after offline data acquisition and processing, gait cycle division is carried out, feature vectors are generated through feature extraction, and template vectors are generated according to the feature vectors;
In the step (2), the specific step of generating the real-time feature vector is as follows:
After the real-time data are collected and processed, gait cycle division is carried out, and feature vectors are generated through feature extraction;
According to the motion periodicity, performing data compression on offline data or real-time data to obtain gait cycle data slices, and completing gait cycle division;
downsampling is carried out on the gait cycle data slice, the specific method being as follows:
p data points x = {x(mi) | i = 1, …, p} are uniformly sampled from the gait cycle data {x(k) | k = 1, …, n} and processed to obtain the feature vector, where mi = 1 + [(i−1)·n/p];
wherein x(k) is the data point with index k in the current data slice; n is the total number of data points in the current data slice; mi is the index of the i-th sampling point; [x] denotes the largest integer not greater than x;
the sampled vector x = {x(k) | k = 1, …, p} is normalized and unit-shifted to obtain the feature vector x' = x/‖x‖ + E;
wherein ‖x‖ is the modulus (norm) of x; E is the unit vector.
2. The motion pattern recognition method based on K-Means and gait cycle similarity according to claim 1, characterized in that:
the feature vector set is clustered with a K-Means clustering method, each cluster center is matched to a motion mode, and the vector set corresponding to each motion mode is taken as that motion mode's template vector set with classification labels;
The K-Means based clustering method specifically comprises the following steps:
(1) Setting an initial value k=m of a category number k according to the motion mode category M;
(2) Selecting initialized k samples from the feature vector set as initial clustering centers;
(3) Calculating the distance between each sample in the feature vector set and the k cluster centers, and assigning each sample to the class of the nearest cluster center;
(4) Recalculating each cluster center;
(5) Repeating step (3) and step (4) until no feature vector is reassigned to a different cluster or the number of iterations reaches a specified threshold;
(6) Evaluating each cluster center to judge whether it is representative; if not, increasing k and returning to step (2); if so, proceeding to step (7);
(7) Obtaining the k cluster centers and matching them to the motion modes to obtain a labelled template vector set;
the similarity analysis specifically comprises the following steps:
the feature vector obtained from the current real-time data is matched against the template vector set according to a similarity measurement function, which is specifically:
f(p,q)=var((Mp)./(Mq))
wherein var is the variance function; M is a 0-1 weight matrix, tuned so as to maximize the classification accuracy of the motion modes over the feature vector set.
CN202110728234.8A 2021-06-29 2021-06-29 Motion pattern recognition method based on K-Means and gait cycle similarity Active CN113516063B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110728234.8A CN113516063B (en) 2021-06-29 2021-06-29 Motion pattern recognition method based on K-Means and gait cycle similarity

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110728234.8A CN113516063B (en) 2021-06-29 2021-06-29 Motion pattern recognition method based on K-Means and gait cycle similarity

Publications (2)

Publication Number Publication Date
CN113516063A CN113516063A (en) 2021-10-19
CN113516063B true CN113516063B (en) 2024-07-12

Family

ID=78066382

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110728234.8A Active CN113516063B (en) 2021-06-29 2021-06-29 Motion pattern recognition method based on K-Means and gait cycle similarity

Country Status (1)

Country Link
CN (1) CN113516063B (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114913595B (en) * 2022-04-20 2023-11-17 中国科学院自动化研究所 Motion mode identification method and device, electronic equipment and storage medium
CN116110584B (en) * 2023-02-23 2023-09-22 江苏万顶惠康健康科技服务有限公司 Human health risk assessment early warning system
CN117059227B (en) * 2023-10-13 2024-01-30 华南师范大学 Motion monitoring method and device based on gait data and electronic equipment

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111973191A (en) * 2020-08-10 2020-11-24 北京海益同展信息科技有限公司 Motion state identification method, device and system and storage medium

Family Cites Families (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2008175559A (en) * 2007-01-16 2008-07-31 Yokogawa Electric Corp Walking analysis system
US9981193B2 (en) * 2009-10-27 2018-05-29 Harmonix Music Systems, Inc. Movement based recognition and evaluation
CN103049741A (en) * 2012-12-21 2013-04-17 中国科学院合肥物质科学研究院 Foot-to-ground acting force-based gait feature extraction method and gait identification system
CN103473539B (en) * 2013-09-23 2015-07-15 智慧城市系统服务(中国)有限公司 Gait recognition method and device
CN106052675B (en) * 2016-05-27 2020-06-16 中国人民解放军总医院第六医学中心 Human body motion and posture monitoring device and method
CN106850955B (en) * 2016-12-20 2019-07-02 陕西尚品信息科技有限公司 A kind of mobile phone identity verification method based on Gait Recognition
CN106991291B (en) * 2017-04-14 2020-01-10 北京工业大学 Computer simulation conversion method for real-time conversion of fetal electrocardiosignals into fetal sounds
CN109858351B (en) * 2018-12-26 2021-05-14 中南大学 Gait recognition method based on hierarchy real-time memory
EP3782547B1 (en) * 2019-08-21 2024-04-10 The Swatch Group Research and Development Ltd Method and system for gait detection of a person
CN111368762A (en) * 2020-03-09 2020-07-03 金陵科技学院 Robot gesture recognition method based on improved K-means clustering algorithm
US11006860B1 (en) * 2020-06-16 2021-05-18 Motionize Israel Ltd. Method and apparatus for gait analysis
CN112488773A (en) * 2020-12-18 2021-03-12 四川长虹电器股份有限公司 Smart television user classification method, computer equipment and storage medium
CN112949676B (en) * 2020-12-29 2022-07-08 武汉理工大学 Self-adaptive motion mode identification method of flexible lower limb assistance exoskeleton robot

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111973191A (en) * 2020-08-10 2020-11-24 北京海益同展信息科技有限公司 Motion state identification method, device and system and storage medium

Also Published As

Publication number Publication date
CN113516063A (en) 2021-10-19

Similar Documents

Publication Publication Date Title
CN113516063B (en) Motion pattern recognition method based on K-Means and gait cycle similarity
Zhang et al. Human activity recognition based on motion sensor using u-net
CN105184325B (en) Mobile intelligent terminal
CN108268838B (en) Facial expression recognition method and facial expression recognition system
CN105242779B (en) A kind of method and mobile intelligent terminal of identification user action
CN105320937B (en) Traffic police's gesture identification method based on Kinect
CN110555417A (en) Video image recognition system and method based on deep learning
CN111860278B (en) Human behavior recognition algorithm based on deep learning
CN110659682A (en) Data classification method based on MCWD-KSMOTE-AdaBoost-DenseNet algorithm
CN111178288A (en) Human body posture recognition method and device based on local error layer-by-layer training
CN110008847B (en) Swimming stroke identification method based on convolutional neural network
CN111551893A (en) Deep learning and neural network integrated indoor positioning method
CN114384999A (en) User irrelevant myoelectricity gesture recognition system based on self-adaptive learning
Suganya et al. Ultrasound ovary cyst image classification with deep learning neural network with Support vector machine
CN112487902B (en) Exoskeleton-oriented gait phase classification method based on TCN-HMM
CN107729863B (en) Human finger vein recognition method
CN105657653A (en) Indoor positioning method based on fingerprint data compression
CN111191510B (en) Relation network-based remote sensing image small sample target identification method in complex scene
CN105160336B (en) Face identification method based on Sigmoid functions
CN115862639A (en) Artificial intelligence voice analysis method based on K-means clustering analysis
CN109948686B (en) Swimming stroke identification method based on nine-axis sensing signal statistical characteristics
CN110210454B (en) Human body action pre-judging method based on data fusion
CN110575177B (en) Gait classification and quantification method based on Mahalanobis distance
CN110766754B (en) Urban rail train pantograph target positioning method
CN114722850A (en) Motor imagery electroencephalogram signal identification method based on probability learning Riemann space quantization

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant