CN109635754A - Gait feature fusion method based on non-linear coupling metric learning
- Publication number: CN109635754A
- Application number: CN201811540264.0A
- Authority: CN (China)
- Legal status: Pending (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Classifications
- G06V40/103 (G—PHYSICS; G06—COMPUTING; CALCULATING OR COUNTING; G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING; G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data; G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; G06V40/103—Static body considered as a whole, e.g. static pedestrian or occupant recognition)
- G06F18/2411 (G—PHYSICS; G06—COMPUTING; CALCULATING OR COUNTING; G06F—ELECTRIC DIGITAL DATA PROCESSING; G06F18/00—Pattern recognition; G06F18/20—Analysing; G06F18/24—Classification techniques; G06F18/241—Classification techniques relating to the classification model; G06F18/2411—based on the proximity to a decision surface, e.g. support vector machines)
Abstract
A gait feature fusion method based on non-linear coupling metric learning, belonging to the field of pattern recognition, which addresses the low recognition performance of prior-art gait representation methods. The steps of the method are as follows. Step 1: obtain the binarized silhouette sequence of a person from a gait video stream using the codebook detection method, and standardize and center every frame. Step 2: detect the non-frontal gait cycle from the periodicity of the separation of the two legs during walking, and extract the gait energy image feature and the active energy image feature over one cycle. Step 3: apply non-linear coupling metric learning to the gait energy image feature and the active energy image feature, and project them into the kernel coupled space, obtaining two new features. Step 4: fuse the two new feature vectors by weighting to obtain the new gait feature in the kernel coupled space. The method is effective for non-frontal periodic gait sequences and achieves a high recognition rate.
Description
Technical field
The invention belongs to the field of pattern recognition, and in particular relates to a gait feature fusion method based on non-linear coupling metric learning.
Background technique
Gait recognition has in recent years attracted considerable attention in computer vision and biometrics. Compared with other biometric technologies, gait is the only biometric that can identify a person at a distance [1]. Being contactless and hard to disguise, gait has great application prospects in intelligent video surveillance.
However, a pedestrian's walk is affected by the external environment and by personal factors, such as different road surfaces, different viewing angles, and different clothing. Under such conditions, differences in gait representation make gait recognition difficult. By fusing gait features obtained in different ways, as much identity-relevant gait information as possible can be extracted, addressing the low recognition rate under varying walking conditions.
Many gait representation methods based on class energy images have been proposed at home and abroad; a class energy image is the accumulation, according to some rule, of periodic spatio-temporal gait features. Typical class-energy-image methods include the following. A. Information accumulation: Han et al. [2] proposed the gait energy image (GEI), which superimposes the normalized image energy of one cycle, so that the intensity of a pixel represents the energy of human motion at that position; however, the method loses the motion linkage between consecutive frames. To remedy this loss, Zhang et al. [3] proposed the active energy image (AEI), which yields more discriminative features; but AEI captures only the dynamic information of human motion and discards the static information entirely. Lee et al. [4] proposed a statistical description of the gait motion pattern that computes a binomial distribution for each pixel of every frame in a gait cycle; accumulating the means and variances of all pixels yields a gait probability image, which likewise lacks inter-frame temporal characteristics. B. Dynamic-information methods: Wang et al. [5] proposed a temporal-information-preserving energy image that extracts gait features with timing; Kusakunniran [6] proposed characterizing gait with spatio-temporal interest points; but both methods are computationally expensive. C. Information fusion: Hofmann et al. [7] fused depth gradient histogram energy images with gait energy images at the decision level, but this method loses part of the gait information.
Publications relevant to the invention include:
[1] Ben X Y, Xu S, Wang K J. A survey of pedestrian gait feature expression and recognition [J]. Pattern Recognition and Artificial Intelligence, 2012, 25(1): 71-81. (in Chinese)
[2] Han J, Bhanu B. Statistical feature fusion for gait-based human recognition [C]. Proceedings of the 2004 IEEE Computer Society Conference on Computer Vision and Pattern Recognition (CVPR), Washington, DC. IEEE, 2004: II-842-II-847.
[3] Zhang E, Zhao Y, Xiong W. Active energy image plus 2DLPP for gait recognition [J]. Signal Processing, 2010, 90(7): 2295-2302.
[4] Lee C P, Tan A W C, Tan S C. Gait probability image: An information-theoretic model of gait representation [J]. Journal of Visual Communication and Image Representation, 2014, 25(6): 1489-1492.
[5] Wang C, Zhang J, Wang L, et al. Human identification using temporal information preserving gait template [J]. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2012, 34(11): 2164-2176.
[6] Kusakunniran W. Attribute-based learning for gait recognition using spatio-temporal interest points [J]. Image and Vision Computing, 2014, 32(12): 1117-1126.
[7] Hofmann M, Geiger J, Bachmann S, et al. The TUM gait from audio, image and depth (GAID) database: Multimodal recognition of subjects and traits [J]. Journal of Visual Communication and Image Representation, 2014, 25(1): 195-206.
Summary of the invention
The present invention aims to solve the technical problem that prior-art gait representation methods capture too little identity-relevant gait information, resulting in a low recognition rate that drops markedly when the walking state of the probe gait does not match that of the registered gait. To this end, a gait feature fusion method based on non-linear coupling metric learning is provided.
The technical solution adopted by the present invention to solve the above technical problem is as follows.
The gait feature fusion method based on non-linear coupling metric learning of the invention comprises the following steps:
Step 1: obtain the binarized silhouette sequence of a person from a gait video stream using the codebook detection method, and standardize and center every frame;
Step 2: detect the non-frontal gait cycle from the periodicity of the separation of the two legs during walking, and extract the gait energy image feature and the active energy image feature over one cycle;
Step 3: apply non-linear coupling metric learning to the gait energy image feature and the active energy image feature, and project them into the kernel coupled space, obtaining a new gait energy image feature and a new active energy image feature;
Step 4: fuse the new gait energy image feature vector and the new active energy image feature vector by weighting to obtain the new gait feature in the kernel coupled space.
Further, in step 1, standardizing and centering every frame means centering the human body within the binarized silhouette and resizing every frame to a uniform size.
Further, in step 2, the non-frontal gait cycle detection formula is:

G_i = (1 / (h_1 − h_2)) Σ_{j=h_2}^{h_1} (R_j − L_j)

where G_i is the mean width of the leg region in the i-th frame, h_1 and h_2 are the anthropometric heights of the ankle and the knee, respectively, of the person in the frame's foreground image, and R_j and L_j are the rightmost and leftmost foreground pixel positions in row j, respectively.
Further, in step 2, the extraction formula of the gait energy image feature is:

E_GEI(x, y) = (1/N) Σ_{i=1}^{N} B_i(x, y)

where E_GEI is the gait energy image feature, N is the number of gait frames in one gait cycle, (x, y) are two-dimensional image plane coordinates, and B_i(x, y) is the i-th binary image.
Further, in step 2, the extraction formula of the active energy image feature is:

D_i(x, y) = |B_{i+1}(x, y) − B_i(x, y)|,  E_AEI(x, y) = (1/N) Σ_{i=1}^{N} D_i(x, y)

where E_AEI is the active energy image feature, N is the number of gait frames in one gait cycle, (x, y) are two-dimensional image plane coordinates, B_i(x, y) and B_{i+1}(x, y) are the i-th and (i+1)-th binary images, and D_i(x, y) is the pixel-wise difference of two consecutive binary images.
Further, in step 3, non-linear coupling metric learning is applied to the gait energy image feature and the active energy image feature by minimizing:

Tr(A_x^T φ(X)^T F_x φ(X) A_x + A_y^T φ(Y)^T F_y φ(Y) A_y − 2 A_x^T φ(X)^T C φ(Y) A_y)

where Tr(·) denotes the trace of a matrix; A_x and A_y are the transformation matrices of X and Y, with φ(X) A_x = φ(Y) A_y; φ(X) and φ(Y) are the results of mapping the sample matrices X and Y into the high-dimensional Hilbert space F; T denotes transposition; F_x and F_y are diagonal matrices whose diagonal elements are the row sums and column sums, respectively, of the correlation matrix C of X and Y; and K_x and K_y are kernel matrices:

K_x(x_i, x_j) = exp(−‖x_i − x_j‖² / (2σ²)),  K_y(y_i, y_j) = exp(−‖y_i − y_j‖² / (2σ²))

where x_i and x_j are samples of the sample matrix X, y_i and y_j are samples of the sample matrix Y, and σ is the kernel width parameter.
Further, in step 3, the gait energy image feature and the active energy image feature are projected into the kernel coupled space to obtain the new gait energy image feature and the new active energy image feature:

E'_GEI = A_x^T K̃_x,  E'_AEI = A_y^T K̃_y

where E'_GEI is the new gait energy image feature, E'_AEI is the new active energy image feature, T denotes transposition, and K̃_x and K̃_y are the centered versions of K_x and K_y:

K̃ = K − (1/n) 1_nn K − (1/n) K 1_nn + (1/n²) 1_nn K 1_nn

where 1_nn is the n × n matrix of all ones and n is the number of samples in each of the sample matrices X and Y.
Further, in step 4, the new gait energy image feature vector and the new active energy image feature vector are fused by weighting to obtain the new gait feature in the kernel coupled space:

E' = α E'_GEI + (1 − α) E'_AEI (10)

where E' is the new gait feature in the kernel coupled space, E'_GEI is the new gait energy image feature, E'_AEI is the new active energy image feature, and α is the weighting coefficient.
Compared with the prior art, the invention has the following benefits. The gait feature fusion method based on non-linear coupling metric learning effectively removes the redundant information of the original gait energy image and active energy image while retaining the class information beneficial to identification; it solves the problem that a single feature characterizes gait information insufficiently, is effective for characterizing non-frontal periodic gait sequences, and achieves a high recognition rate.
Detailed description of the invention
Fig. 1 is the flow chart of the gait feature fusion method based on non-linear coupling metric learning of the invention;
Fig. 2 shows the human gait silhouettes extracted by the method;
Fig. 3 illustrates the anthropometric heights used by the method;
Fig. 4 is the curve obtained after gait cycle detection;
Fig. 5 is the binarized silhouette sequence of one gait cycle;
Fig. 6 is the gait energy image generated from Fig. 5;
Fig. 7 is the active energy image generated from Fig. 5.
Specific embodiment
For a further understanding of the present invention, embodiments are described below. It should be understood that these descriptions merely further illustrate the features and advantages of the invention, and do not limit the claims of the invention.
As shown in Fig. 1, the gait feature fusion method based on non-linear coupling metric learning of the invention proceeds as follows.
Step 1: image acquisition and processing
Capture gait video streams of people under different walking conditions, cluster the foreground by building a codebook model to obtain the foreground region, and normalize each frame to an image of uniform size with the human silhouette centered, as shown in Fig. 2.
The image size is chosen according to actual needs, e.g. 64 × 64 pixels or 128 × 88 pixels.
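The standardization of step 1 can be sketched as follows; this is a minimal numpy-only sketch in which a nearest-neighbour rescale stands in for a real image-resize routine, and the function name and the 64 × 64 canvas are illustrative, not from the patent:

```python
import numpy as np

def standardize_silhouette(mask, out_h=64, out_w=64):
    # Crop the binary silhouette to its bounding box, rescale it so the body
    # height fills the canvas, and centre it horizontally (step 1's
    # "standardized and centered" frame).
    ys, xs = np.nonzero(mask)
    crop = mask[ys.min():ys.max() + 1, xs.min():xs.max() + 1]
    h, w = crop.shape
    scale = out_h / h
    new_w = max(1, min(out_w, int(round(w * scale))))
    # nearest-neighbour resampling via index arithmetic
    rows = (np.arange(out_h) / scale).astype(int).clip(0, h - 1)
    cols = (np.arange(new_w) / (new_w / w)).astype(int).clip(0, w - 1)
    resized = crop[rows][:, cols]
    canvas = np.zeros((out_h, out_w), dtype=mask.dtype)
    left = (out_w - new_w) // 2          # horizontal centring
    canvas[:, left:left + new_w] = resized
    return canvas

# toy example: a 10 x 4 foreground block inside a larger frame
frame = np.zeros((20, 20), dtype=np.uint8)
frame[5:15, 8:12] = 1
std = standardize_silhouette(frame)
```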
Step 2: extracting holistic gait features
Step 2.1: after every frame of the gait video stream has been standardized and centered, detect the non-frontal gait cycle from the periodicity of the separation of the two legs during walking. The detection formula is:

G_i = (1 / (h_1 − h_2)) Σ_{j=h_2}^{h_1} (R_j − L_j) (1)

In formula (1), as shown in Fig. 3, G_i is the mean width of the leg region in the i-th gait frame; h_1 and h_2 are the anthropometric heights of the ankle and the knee, respectively, of the person in the frame's foreground image; R_j and L_j are the rightmost and leftmost foreground pixel positions in row j, respectively. Fig. 4 shows the curve after gait cycle detection, and Fig. 5 the binarized silhouette sequence of one cycle.
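The cycle detection of step 2.1 can be sketched as follows, assuming numpy; the leg band is taken as the rows between the knee height h2 and the ankle height h1, and the period is recovered from the autocorrelation of the width signal (the autocorrelation step is our assumption, since formula (1) only defines the quasi-periodic width signal G_i):

```python
import numpy as np

def leg_width_signal(frames, h1, h2):
    # One value per frame: the mean silhouette width over the rows between
    # knee (h2) and ankle (h1), i.e. the G_i of formula (1).
    widths = []
    for f in frames:
        per_row = []
        for j in range(h2, h1):
            xs = np.nonzero(f[j])[0]
            per_row.append(xs.max() - xs.min() if xs.size else 0)
        widths.append(np.mean(per_row))
    return np.asarray(widths)

def estimate_period(signal):
    # Gait period = lag of the strongest autocorrelation peak beyond lag 1.
    s = signal - signal.mean()
    ac = np.correlate(s, s, mode="full")[len(s) - 1:]
    return int(np.argmax(ac[2:]) + 2)

# toy sequence whose leg width oscillates with period 8
T, H, W = 32, 20, 30
frames = np.zeros((T, H, W), dtype=np.uint8)
for i in range(T):
    w = int(round(6 + 4 * np.cos(2 * np.pi * i / 8)))
    frames[i, 10:18, 15 - w // 2:15 - w // 2 + w] = 1
sig = leg_width_signal(frames, h1=18, h2=10)
period = estimate_period(sig)
```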
Step 2.2: extract the holistic gait feature of one cycle using the GEI:

E_GEI(x, y) = (1/N) Σ_{i=1}^{N} B_i(x, y) (2)

In formula (2), E_GEI is the GEI feature, N is the number of gait frames in one gait cycle, (x, y) are two-dimensional image plane coordinates, and B_i(x, y) is the i-th binary image. The GEI is shown in Fig. 6.
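Formula (2) is a pixel-wise average, a one-liner in numpy (the function name is ours):

```python
import numpy as np

def gait_energy_image(frames):
    # GEI, formula (2): E_GEI(x, y) = (1/N) * sum_i B_i(x, y) over the N
    # binary silhouettes of one gait cycle.
    return np.asarray(frames, dtype=np.float64).mean(axis=0)

cycle = np.stack([np.eye(4), np.ones((4, 4)), np.zeros((4, 4))])
gei = gait_energy_image(cycle)
```

Pixels covered in every frame approach 1, pixels never covered stay 0.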
Step 2.3: extract the holistic gait feature of one cycle using the AEI:

D_i(x, y) = |B_{i+1}(x, y) − B_i(x, y)|,  E_AEI(x, y) = (1/N) Σ_{i=1}^{N} D_i(x, y) (3)

In formula (3), E_AEI is the AEI feature, N is the number of gait frames in one gait cycle, (x, y) are two-dimensional image plane coordinates, B_i(x, y) and B_{i+1}(x, y) are the i-th and (i+1)-th binary images, and D_i(x, y) is the pixel-wise difference of two consecutive binary images. The AEI is shown in Fig. 7.
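Formula (3) can be sketched the same way; in this sketch the N frames yield N − 1 consecutive differences that are averaged (a minimal sketch, not necessarily the patent's exact normalization), so static pixels are suppressed and only moving ones survive:

```python
import numpy as np

def active_energy_image(frames):
    # AEI, formula (3): average of D_i = |B_{i+1} - B_i|, the absolute
    # differences of consecutive binary silhouettes.
    frames = np.asarray(frames, dtype=np.float64)
    return np.abs(frames[1:] - frames[:-1]).mean(axis=0)

frames = np.zeros((3, 4, 4))
frames[:, 0, 0] = 1                     # static "torso" pixel in all frames
frames[0, 3, 1] = frames[2, 3, 1] = 1   # "leg" pixel swinging between
frames[1, 3, 2] = 1                     # columns 1 and 2
aei = active_energy_image(frames)
```

The static pixel contributes nothing to the AEI, while the swinging pixel does.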
Step 3: non-linear coupling metric learning to obtain two groups of new features
Step 3.1: apply non-linear coupling metric learning to the GEI feature matrix X and the AEI feature matrix Y, minimizing:

Σ_{i,j} C_ij ‖A_x^T φ(x_i) − A_y^T φ(y_j)‖² = Tr(A_x^T φ(X)^T F_x φ(X) A_x + A_y^T φ(Y)^T F_y φ(Y) A_y − 2 A_x^T φ(X)^T C φ(Y) A_y) (4)

In formula (4), Tr(·) denotes the trace of a matrix; x_i and y_i are samples of the two coupled sample matrices, i.e. x_i ∈ X, y_i ∈ Y; φ denotes the mapping φ: R^n → F, x_i → φ(x_i), y_i → φ(y_i), so that φ(X) and φ(Y) are the images of the sample matrices X and Y in the high-dimensional Hilbert space F; T denotes transposition; F_x and F_y are diagonal matrices whose diagonal elements are the row sums and column sums, respectively, of the correlation matrix C of X and Y; C_ij is an element of C, with C_ij = 1 when c^x_i = c^y_j and C_ij = 0 when c^x_i ≠ c^y_j, where c is the class label.
With the projection matrix P = φ(X) A_x = φ(Y) A_y, where A_x and A_y are the transformation matrices of X and Y, formula (4) simplifies, with K = diag(K_x, K_y) and Γ = [F_x, −C; −C^T, F_y], to:

min_a Tr(a^T K Γ K^T a) (5)

In formula (5), K_x and K_y are kernel matrices, K_x = (φ(x_i), φ(x_j)), K_y = (φ(y_i), φ(y_j)); the Gaussian kernel is chosen:

K_x(x_i, x_j) = exp(−‖x_i − x_j‖² / (2σ²)),  K_y(y_i, y_j) = exp(−‖y_i − y_j‖² / (2σ²)) (6)

In formula (6), x_i and x_j are samples of the sample matrix X, y_i and y_j are samples of the sample matrix Y, and σ is the kernel width parameter.
The minimization problem of formula (5) can be regarded as the following generalized eigenvalue problem:

(K Γ K^T) a = λ (K K^T) a (7)

In formula (7), a is the eigenvector corresponding to the eigenvalue λ; the matrix A is composed of the eigenvectors corresponding to the D_c smallest eigenvalues.
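The kernel matrices of formula (6) can be sketched as follows, assuming the standard Gaussian radial-basis form with a 2σ² denominator (the function name is ours):

```python
import numpy as np

def gaussian_kernel(X, sigma):
    # Gram matrix of formula (6): K_ij = exp(-||x_i - x_j||^2 / (2 sigma^2)),
    # with the rows of X as samples.
    sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-sq / (2.0 * sigma ** 2))

X = np.array([[0.0, 0.0], [3.0, 4.0]])   # two points at distance 5
K = gaussian_kernel(X, sigma=5.0)
```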
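The generalized eigenproblem of formula (7) can be solved as sketched below, assuming numpy; the eps-regularization of K K^T is our addition (kernel Gram products are often rank-deficient), and the toy K and Γ are illustrative:

```python
import numpy as np

def coupled_projection(K, Gamma, dim, eps=1e-8):
    # Solve (K Gamma K^T) a = lambda (K K^T) a, formula (7), and keep the
    # eigenvectors of the `dim` smallest eigenvalues as the matrix A.
    A = K @ Gamma @ K.T
    B = K @ K.T + eps * np.eye(K.shape[0])
    # reduce to a standard symmetric eigenproblem via B^{-1/2}
    w, V = np.linalg.eigh(B)
    B_inv_sqrt = V @ np.diag(1.0 / np.sqrt(w)) @ V.T
    evals, evecs = np.linalg.eigh(B_inv_sqrt @ A @ B_inv_sqrt)  # ascending
    return B_inv_sqrt @ evecs[:, :dim]

rng = np.random.default_rng(0)
K = rng.standard_normal((6, 6))
K = K @ K.T                      # toy symmetric "kernel" matrix
Gamma = np.diag([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
P = coupled_projection(K, Gamma, dim=2)

# verify the generalized eigen-relation for the first returned vector
A_mat, B_mat = K @ Gamma @ K.T, K @ K.T + 1e-8 * np.eye(6)
a0 = P[:, 0]
lam0 = (a0 @ A_mat @ a0) / (a0 @ B_mat @ a0)
resid = np.linalg.norm(A_mat @ a0 - lam0 * (B_mat @ a0))
```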
Step 3.2: obtaining two groups of new features
Project the GEI feature and the AEI feature into the kernel coupled space to obtain the new GEI feature and the new AEI feature:

E'_GEI = A_x^T K̃_x,  E'_AEI = A_y^T K̃_y (8)

In formula (8), E'_GEI is the new gait energy image feature, E'_AEI is the new active energy image feature, T denotes transposition, and K̃_x and K̃_y are the centered versions of K_x and K_y:

K̃ = K − (1/n) 1_nn K − (1/n) K 1_nn + (1/n²) 1_nn K 1_nn (9)

In formula (9), 1_nn is the n × n matrix of all ones, and n is the number of samples in each of the sample matrices X and Y.
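The centering of formula (9) can be sketched as follows, reading 1_nn as the n × n all-ones matrix (our reading; an identity matrix in its place would not make the Gram matrix zero-mean):

```python
import numpy as np

def center_kernel(K):
    # Formula (9): K~ = K - (1/n) 1 K - (1/n) K 1 + (1/n^2) 1 K 1,
    # with 1 the n x n matrix of all ones.
    n = K.shape[0]
    one = np.full((n, n), 1.0 / n)
    return K - one @ K - K @ one + one @ K @ one

rng = np.random.default_rng(0)
X = rng.standard_normal((5, 3))
K = np.exp(-((X[:, None] - X[None, :]) ** 2).sum(-1) / 2.0)  # Gaussian Gram
Kc = center_kernel(K)
```

A correctly centered Gram matrix has zero row and column sums.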
Step 4: weighted fusion of the two groups of new feature vectors
The new GEI feature vector and the new AEI feature vector are fused by weighting to obtain the new gait feature in the kernel coupled space:

E' = α E'_GEI + (1 − α) E'_AEI (10)

In formula (10), E' is the new gait feature in the kernel coupled space, E'_GEI is the new gait energy image feature, E'_AEI is the new active energy image feature, and α is the weighting coefficient, typically taking the values 0.1, 0.3, 0.5, 0.7 or 0.9.
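Formula (10) itself is a convex combination of the two projected features:

```python
import numpy as np

def fuse(E_gei, E_aei, alpha):
    # Formula (10): E' = alpha * E'_GEI + (1 - alpha) * E'_AEI.
    return alpha * E_gei + (1 - alpha) * E_aei

E = fuse(np.array([1.0, 0.0]), np.array([0.0, 1.0]), alpha=0.7)
```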
To demonstrate the validity of the proposed gait feature fusion method based on non-linear coupling metric learning, the fusion method of the invention is tested in the following embodiment.
Embodiment 1
1.1 Test samples: the USF HumanID gait data set is used (outdoor gait video shot at a distance against complex backgrounds, so the extracted side silhouettes are of relatively poor quality). Video data of 122 people were collected under different conditions: acquisition view (R/L; the optical axes of the left and right cameras differ by about 30°), walking on concrete or grass (C/G), carrying a briefcase or not (BF/NB), wearing different shoes (A/B), and different acquisition times (T). The data are divided into several groups, as shown in Table 1.
Table 1. Overview of the USF database
1.2 Test method: for each gait sequence, the experiment takes a double-support stance as the starting point of the gait cycle, collects several class energy images, and averages them into a single-cycle GEI and AEI. On this basis, the corresponding feature extraction methods are applied (the fusion method of the invention, the existing GEI-only method, and the existing AEI-only method), and the results are finally fed to a nearest neighbor classifier for identification.
Experimental parameters of the fusion method of the invention: the weighting coefficient α takes the values 0.1, 0.3, 0.5, 0.7 and 0.9; the kernel parameter σ takes, for test groups A to L, the values [141, 121, 141, 121, 134, 121, 134, 136, 134, 138, 121, 121]; D_c is the dimension of the non-linear coupled projection that yields the best recognition rate for a given σ, and its value differs across weighting coefficients α and test groups.
To reduce the amount of computation, the existing GEI-only and AEI-only methods first apply singular value decomposition (SVD) for dimensionality reduction before feature recognition, retaining the eigenvectors corresponding to 99.9% of the sum of all eigenvalues.
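The SVD reduction can be sketched as follows; reading "99.9% of the sum of all eigenvalues" as retaining 99.9% of the singular-value mass of the mean-centered feature matrix is our assumption, as is the helper name:

```python
import numpy as np

def svd_reduce(F, energy=0.999):
    # Project the (samples x dims) feature matrix onto the leading right
    # singular directions that retain `energy` of the singular-value mass.
    U, s, Vt = np.linalg.svd(F - F.mean(axis=0), full_matrices=False)
    k = int(np.searchsorted(np.cumsum(s) / s.sum(), energy)) + 1
    return F @ Vt[:k].T, k

rng = np.random.default_rng(1)
F = rng.standard_normal((20, 5)) @ np.diag([100.0, 10.0, 1.0, 1e-4, 1e-5])
Fr, k = svd_reduce(F)
```

With the two near-zero directions discarded, only the three energetic ones remain.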
1.3 The test results are shown in Table 2.
Table 2. Recognition performance comparison between the invention and prior-art gait representation methods
The data in Table 2 show that, compared with AEI or GEI, the proposed fusion method significantly improves the recognition rate on test groups A to L. For example, on group A with α = 0.7 (or 0.9), the proposed method improves on AEI and GEI by 11% and 9%, respectively; on group E with α = 0.9, it improves on AEI and GEI by 22% and 17%; and on group G with α = 0.3 (or 0.5, 0.7, 0.9), it improves on AEI and GEI by 27% and 18%.
1.4 Further analysis of the experimental results
Test groups A to L are divided into three classes. Class I (A, B, C): identification relies on static information. Class III (H, I, J, K, L): identification relies mainly on dynamic and temporal information. Class II (D, E, F, G) lies between Classes I and III: static, dynamic and temporal information are of equal importance. The results are shown in Table 3.
Table 3. Average recognition rates of the algorithm of the invention
In Table 3, Avg. denotes the average recognition rate over the 12 test groups.
The data in Table 3 show that the proposed fusion method achieves the highest recognition rates for Classes I, II and III and for Avg. For example, with α = 0.9, on the Class II tests the proposed method improves on AEI and GEI by 21% and 13.7%, respectively; with α = 0.1, on the Class III tests it improves on AEI and GEI by 9.4% and 12%; and with α = 0.1, on the overall average Avg. it likewise improves on AEI and GEI by 9.4% and 12%.
In summary, the proposed gait feature fusion method captures more information beneficial to identification and achieves the best recognition performance.
Claims (8)
1. A gait feature fusion method based on non-linear coupling metric learning, characterized by comprising the following steps:
Step 1: obtaining the binarized silhouette sequence of a person from a gait video stream using the codebook detection method, and standardizing and centering every frame;
Step 2: detecting the non-frontal gait cycle from the periodicity of the separation of the two legs during walking, and extracting the gait energy image feature and the active energy image feature over one cycle;
Step 3: applying non-linear coupling metric learning to the gait energy image feature and the active energy image feature, and projecting them into the kernel coupled space to obtain a new gait energy image feature and a new active energy image feature;
Step 4: fusing the new gait energy image feature vector and the new active energy image feature vector by weighting to obtain the new gait feature in the kernel coupled space.
2. The gait feature fusion method based on non-linear coupling metric learning according to claim 1, characterized in that, in step 1, standardizing and centering every frame means centering the human body within the binarized silhouette and resizing every frame to a uniform size.
3. The gait feature fusion method based on non-linear coupling metric learning according to claim 1, characterized in that, in step 2, the non-frontal gait cycle detection formula is:

G_i = (1 / (h_1 − h_2)) Σ_{j=h_2}^{h_1} (R_j − L_j)

where G_i is the mean width of the leg region in the i-th frame, h_1 and h_2 are the anthropometric heights of the ankle and the knee, respectively, of the person in the frame's foreground image, and R_j and L_j are the rightmost and leftmost foreground pixel positions in row j, respectively.
4. The gait feature fusion method based on non-linear coupling metric learning according to claim 1, characterized in that, in step 2, the extraction formula of the gait energy image feature is:

E_GEI(x, y) = (1/N) Σ_{i=1}^{N} B_i(x, y)

where E_GEI is the gait energy image feature, N is the number of gait frames in one gait cycle, (x, y) are two-dimensional image plane coordinates, and B_i(x, y) is the i-th binary image.
5. The gait feature fusion method based on non-linear coupling metric learning according to claim 1, characterized in that, in step 2, the extraction formula of the active energy image feature is:

D_i(x, y) = |B_{i+1}(x, y) − B_i(x, y)|,  E_AEI(x, y) = (1/N) Σ_{i=1}^{N} D_i(x, y)

where E_AEI is the active energy image feature, N is the number of gait frames in one gait cycle, (x, y) are two-dimensional image plane coordinates, B_i(x, y) and B_{i+1}(x, y) are the i-th and (i+1)-th binary images, and D_i(x, y) is the pixel-wise difference of two consecutive binary images.
6. The gait feature fusion method based on non-linear coupling metric learning according to claim 1, characterized in that, in step 3, non-linear coupling metric learning is applied to the gait energy image feature and the active energy image feature by minimizing:

Tr(A_x^T φ(X)^T F_x φ(X) A_x + A_y^T φ(Y)^T F_y φ(Y) A_y − 2 A_x^T φ(X)^T C φ(Y) A_y)

where Tr(·) denotes the trace of a matrix; A_x and A_y are the transformation matrices of X and Y, with φ(X) A_x = φ(Y) A_y; φ(X) and φ(Y) are the results of mapping the sample matrices X and Y into the high-dimensional Hilbert space F; T denotes transposition; F_x and F_y are diagonal matrices whose diagonal elements are the row sums and column sums, respectively, of the correlation matrix C of X and Y; and K_x and K_y are kernel matrices:

K_x(x_i, x_j) = exp(−‖x_i − x_j‖² / (2σ²)),  K_y(y_i, y_j) = exp(−‖y_i − y_j‖² / (2σ²))

where x_i and x_j are samples of the sample matrix X, y_i and y_j are samples of the sample matrix Y, and σ is the kernel width parameter.
7. The gait feature fusion method based on non-linear coupling metric learning according to claim 1, characterized in that, in step 3, the gait energy image feature and the active energy image feature are projected into the kernel coupled space to obtain the new gait energy image feature and the new active energy image feature:

E'_GEI = A_x^T K̃_x,  E'_AEI = A_y^T K̃_y

where E'_GEI is the new gait energy image feature, E'_AEI is the new active energy image feature, T denotes transposition, and K̃_x and K̃_y are the centered versions of K_x and K_y:

K̃ = K − (1/n) 1_nn K − (1/n) K 1_nn + (1/n²) 1_nn K 1_nn

where 1_nn is the n × n matrix of all ones and n is the number of samples in each of the sample matrices X and Y.
8. The gait feature fusion method based on non-linear coupling metric learning according to claim 1, characterized in that, in step 4, the new gait energy image feature vector and the new active energy image feature vector are fused by weighting to obtain the new gait feature in the kernel coupled space:

E' = α E'_GEI + (1 − α) E'_AEI (10)

where E' is the new gait feature in the kernel coupled space, E'_GEI is the new gait energy image feature, E'_AEI is the new active energy image feature, and α is the weighting coefficient.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201811540264.0A CN109635754A (en) | 2018-12-17 | 2018-12-17 | Gait feature fusion method based on Non-linear coupling metric learning |
Publications (1)
Publication Number | Publication Date |
---|---|
CN109635754A true CN109635754A (en) | 2019-04-16 |
Family
ID=66074513
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201811540264.0A Pending CN109635754A (en) | 2018-12-17 | 2018-12-17 | Gait feature fusion method based on Non-linear coupling metric learning |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN109635754A (en) |
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2013017858A1 (en) * | 2011-07-29 | 2013-02-07 | University Of Ulster | Gait recognition methods and systems |
CN104899604A (en) * | 2015-06-08 | 2015-09-09 | 哈尔滨工程大学 | Feature-level fusion method based on data set merging |
CN109002785A (en) * | 2018-07-05 | 2018-12-14 | 西安交通大学 | Gait recognition method based on movement timing energy diagram |
Non-Patent Citations (3)
Title |
---|
Yang Qi. Research on Human Gait and Behavior Recognition Technology [M]. Liaoning Science and Technology Press, 31 January 2014. (in Chinese) * |
Wang Jun et al. Research on multi-view gait recognition with feature fusion [J]. Journal of China University of Metrology. (in Chinese) * |
Yan Tao. Research on coupled distance metric learning and its application in gait recognition [D]. China Doctoral Dissertations Full-text Database, Information Science and Technology. (in Chinese) * |
Cited By (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110236523A (en) * | 2019-06-17 | 2019-09-17 | 杭州电子科技大学 | Gait-Cardiac RR interval the correlating method returned based on Gauss |
CN111126470A (en) * | 2019-12-18 | 2020-05-08 | 创新奇智(青岛)科技有限公司 | Image data iterative clustering analysis method based on depth metric learning |
CN111126470B (en) * | 2019-12-18 | 2023-05-02 | 创新奇智(青岛)科技有限公司 | Image data iterative cluster analysis method based on depth measurement learning |
CN111553935A (en) * | 2020-05-14 | 2020-08-18 | 广东第二师范学院 | Human motion form obtaining method based on increment dimension reduction projection position optimization |
CN111553935B (en) * | 2020-05-14 | 2020-12-15 | 广东第二师范学院 | Human motion form obtaining method based on increment dimension reduction projection position optimization |
CN111860291A (en) * | 2020-07-16 | 2020-10-30 | 上海交通大学 | Multi-mode pedestrian identity recognition method and system based on pedestrian appearance and gait information |
Legal Events
Date | Code | Title | Description
---|---|---|---
| PB01 | Publication | |
| SE01 | Entry into force of request for substantive examination | |
| RJ01 | Rejection of invention patent application after publication | Application publication date: 20190416 |