CN116740811A - Gait recognition method, medium and device of intelligent watch - Google Patents
- Publication number: CN116740811A (application number CN202310709903.6A)
- Authority: CN (China)
- Prior art keywords: target; priority; gait; user; position information
- Legal status: Pending (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Classifications
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/103—Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
- A61B5/11—Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
- A61B5/112—Gait analysis
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/20—Movements or behaviour, e.g. gesture recognition
Abstract
The application discloses a gait recognition method, medium and device for a smart watch. The method comprises: obtaining a target user ID list; obtaining a target position information set and a target posture characteristic value set based on the smart watch worn by each target user; obtaining a first target priority set and a second target priority set from these; obtaining a candidate abnormal gait when a first target priority is smaller than a preset first priority threshold and/or the second target priority for the same time period is smaller than a preset second priority threshold; and determining a target gait when the first target priority is not smaller than the preset first priority threshold and the second target priority for the same time period is not smaller than the preset second priority threshold, thereby achieving gait recognition.
Description
Technical Field
The application relates to the technical field of computers, in particular to a gait recognition method, medium and device of an intelligent watch.
Background
With the wide application and development of mobile communication, current intelligent mobile terminals integrate many powerful sensor devices; a smart watch, for example, carries a global positioning system, a gyroscope, and an acceleration sensor. A user's activity information can be mined and identified from the sensor data of such a terminal.
In the prior art, a gait recognition method proceeds as follows: acquire the user's speed over a certain time period, the corresponding acceleration over the same period, and the user's energy consumption, and identify the user's gait from these three quantities.
The above method for gait recognition has the following problems: recognition relies on speed and acceleration alone, without combining the different data sources available in the smart watch to identify and judge the user's gait, so recognition accuracy is low; moreover, when an abnormal condition occurs, data from other devices cannot be combined for recognition, so the recognition range is narrow.
Disclosure of Invention
Aiming at the defects in the prior art, the application provides a gait recognition method, medium and device for a smart watch, with the following technical scheme:
In one aspect, a gait recognition method of a smart watch comprises the following steps:
S100, obtaining a target user ID list A = {A_1, A_2, …, A_i, …, A_n}, where A_i is the i-th target user ID, i = 1, 2, …, n, and n is the number of target user IDs.
S200, based on the smart watch worn by each target user, acquiring a target position information set B = {B_1, B_2, …, B_i, …, B_n} corresponding to A, with B_i = {B_i1, B_i2, …, B_ij, …, B_im}, and a target posture characteristic value set C = {C_1, C_2, …, C_i, …, C_n} corresponding to A, with C_i = {C_i1, C_i2, …, C_ij, …, C_im}, where B_ij is the j-th target position information in the target position information list corresponding to A_i, C_ij is the j-th target posture characteristic value in the target posture characteristic value list corresponding to A_i, j = 1, 2, …, m, and m is the number of target position information items in the target position information list.
S300, according to B and C, obtaining a first target priority set D = {D_1, D_2, …, D_i, …, D_n} corresponding to A, with D_i = {D_i1, D_i2, …, D_ij, …, D_im}, and a second target priority set G = {G_1, G_2, …, G_i, …, G_n} corresponding to A, with G_i = {G_i1, G_i2, …, G_ij, …, G_im}, where D_ij is the j-th first target priority in the first target priority list corresponding to A_i, G_ij is the j-th second target priority in the second target priority list corresponding to A_i, D_i1 = 0, D_ij is the distance difference between B_ij and B_i(j-1), G_i1 = 0, and G_ij = C_ij − C_i(j-1).
S400, when D_ij < D_0 and/or G_ij < G_0, obtaining the candidate abnormal gait of A_i in the time period corresponding to D_ij, so as to achieve gait recognition of the smart watch according to the obtained candidate abnormal gait, where D_0 is a preset first priority threshold and G_0 is a preset second priority threshold.
S500, when D_ij ≥ D_0 and G_ij ≥ G_0, determining that A_i is in the target gait in the time period corresponding to D_ij, so as to achieve gait recognition of the smart watch according to the obtained target gait.
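The flow of S100–S500 can be sketched as follows for a single target user. This is an illustrative sketch only: planar (x, y) coordinates in meters stand in for the watch's GPS fixes, `D0`/`G0` are assumed to be given, and the function and label names are this sketch's own, not the patent's.

```python
import math

def recognize_intervals(positions, heights, D0, G0):
    """For one target user: compute the per-second 'first target priority'
    (distance difference between consecutive positions) and 'second target
    priority' (height difference), then threshold them to split each second
    into target gait (walking) vs. candidate abnormal gait (S400/S500)."""
    labels = []
    for j in range(1, len(positions)):
        d_ij = math.hypot(positions[j][0] - positions[j - 1][0],
                          positions[j][1] - positions[j - 1][1])  # D_ij
        g_ij = heights[j] - heights[j - 1]                        # G_ij
        labels.append("target" if d_ij >= D0 and g_ij >= G0 else "candidate")
    return labels
```

For example, a user who covers 2 m with a 0.3 m height change in one second and then nearly stops would be labeled `["target", "candidate"]` under thresholds of 1 m and 0.2 m.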
In another aspect, a non-transitory computer-readable storage medium stores at least one instruction or at least one program, which is loaded and executed by a processor to implement the method described above.
In another aspect, an electronic device includes a processor and the above non-transitory computer-readable storage medium.
The beneficial effects of the application are as follows. The method obtains a target user ID list; obtains, based on the smart watch worn by each target user, a target position information set and a target posture characteristic value set corresponding to the list; and obtains from these a first target priority set and a second target priority set. When a first target priority is smaller than the preset first priority threshold and/or the second target priority for the same time period is smaller than the preset second priority threshold, the candidate abnormal gait of the target user ID in the corresponding time period is obtained; when the first target priority is not smaller than the preset first priority threshold and the second target priority for the same period is not smaller than the preset second priority threshold, the target user ID is determined to be in the target gait in the corresponding period, thereby achieving gait recognition of the smart watch.
Drawings
In order to more clearly illustrate the embodiments of the present application or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art will be briefly described below. Like elements or portions are generally identified by like reference numerals throughout the several figures. In the drawings, elements or portions thereof are not necessarily drawn to scale.
Fig. 1 is a flowchart of a gait recognition method of a smart watch according to an embodiment of the present application.
Detailed Description
Embodiments of the technical scheme of the present application will be described in detail below with reference to the accompanying drawings. The following examples are only for more clearly illustrating the technical aspects of the present application, and thus are merely examples, and are not intended to limit the scope of the present application.
It is noted that unless otherwise indicated, technical or scientific terms used herein should be given the ordinary meaning as understood by one of ordinary skill in the art to which this application belongs.
Examples
This embodiment provides a gait recognition method for a smart watch, comprising the following steps, as shown in Fig. 1:
S100, obtaining a target user ID list A = {A_1, A_2, …, A_i, …, A_n}, where A_i is the i-th target user ID, i = 1, 2, …, n, and n is the number of target user IDs.
Specifically, the target user ID is a unique identifier representing the identity of the target user, where the target user is a user wearing the smart watch.
Further, those skilled in the art know that any method for setting an identity in the prior art falls within the protection scope of the present application, and will not be described herein.
S200, based on the smart watch worn by each target user, acquiring a target position information set B = {B_1, B_2, …, B_i, …, B_n} corresponding to A, with B_i = {B_i1, B_i2, …, B_ij, …, B_im}, and a target posture characteristic value set C = {C_1, C_2, …, C_i, …, C_n} corresponding to A, with C_i = {C_i1, C_i2, …, C_ij, …, C_im}, where B_ij is the j-th target position information in the target position information list corresponding to A_i, C_ij is the j-th target posture characteristic value in the target posture characteristic value list corresponding to A_i, j = 1, 2, …, m, and m is the number of target position information items in the target position information list.
Specifically, the target position information is the position information of the target user, acquired once per second on the current day from the smart watch worn by the target user. As known to those skilled in the art, any prior-art method for obtaining position information through the smart watch falls within the protection scope of the present application and is not described here, for example: through the GPS sensor of the smart watch.
Specifically, the target posture characteristic value is the height above the ground of the center point of the smart watch worn by the target user, acquired once per second on the current day.
S300, according to B and C, obtaining a first target priority set D = {D_1, D_2, …, D_i, …, D_n} corresponding to A, with D_i = {D_i1, D_i2, …, D_ij, …, D_im}, and a second target priority set G = {G_1, G_2, …, G_i, …, G_n} corresponding to A, with G_i = {G_i1, G_i2, …, G_ij, …, G_im}, where D_ij is the j-th first target priority in the first target priority list corresponding to A_i, G_ij is the j-th second target priority in the second target priority list corresponding to A_i, D_i1 = 0, D_ij is the distance difference between B_ij and B_i(j-1), G_i1 = 0, and G_ij = C_ij − C_i(j-1).
Specifically, the first target priority is the distance difference between adjacent target position information items.
Specifically, the second target priority is the difference between adjacent target posture characteristic values.
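A minimal sketch of S300 for one user follows. The patent only says D_ij is the "distance difference" between consecutive position fixes; the haversine great-circle formula used here for (lat, lon) pairs is one common assumption, not the patent's stated choice, and the function names are this sketch's own.

```python
import math

def haversine_m(p, q):
    """Great-circle distance in meters between (lat, lon) points p and q."""
    R = 6371000.0  # mean Earth radius, meters
    lat1, lon1, lat2, lon2 = map(math.radians, (*p, *q))
    a = (math.sin((lat2 - lat1) / 2) ** 2
         + math.cos(lat1) * math.cos(lat2) * math.sin((lon2 - lon1) / 2) ** 2)
    return 2 * R * math.asin(math.sqrt(a))

def target_priorities(positions, heights):
    """S300 for one user: D_i1 = 0 and D_ij = distance(B_ij, B_i(j-1));
    G_i1 = 0 and G_ij = C_ij - C_i(j-1)."""
    D = [0.0] + [haversine_m(positions[j - 1], positions[j])
                 for j in range(1, len(positions))]
    G = [0.0] + [heights[j] - heights[j - 1] for j in range(1, len(heights))]
    return D, G
```

Both lists are anchored at 0 for the first second, matching D_i1 = 0 and G_i1 = 0 in the text.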
S400, when D_ij < D_0 and/or G_ij < G_0, obtaining the candidate abnormal gait of A_i in the time period corresponding to D_ij, so as to achieve gait recognition of the smart watch according to the obtained candidate abnormal gait, where D_0 is a preset first priority threshold and G_0 is a preset second priority threshold.
Specifically, D_0 and G_0 are obtained in S400 through the following steps:
S401, acquiring a sample user ID list M = {M_1, M_2, …, M_r, …, M_s}, where M_r is the r-th sample user ID, r = 1, 2, …, s, and s is the number of sample user IDs.
Specifically, the sample user ID is a unique identifier of the sample user's identity, where the sample user is a collected user who wears the smart watch and has a walking gait.
S403, based on the smart watch worn by each sample user, acquiring a sample position information set H = {H_1, H_2, …, H_r, …, H_s} corresponding to M, with H_r = {H_r1, H_r2, …, H_rv, …, H_rb}, and a sample posture characteristic value set J = {J_1, J_2, …, J_r, …, J_s} corresponding to M, with J_r = {J_r1, J_r2, …, J_rv, …, J_rb}, where H_rv is the v-th sample position information in the sample position information list corresponding to M_r, J_rv is the v-th sample posture characteristic value in the sample posture characteristic value list corresponding to M_r, v = 1, 2, …, b, and b is the number of sample position information items in the sample position information list.
Specifically, the sample position information is the position information of the sample user, acquired once per second within a preset time period from the smart watch worn by the sample user.
Further, as known to those skilled in the art, the preset time period can be selected according to actual requirements, which falls within the protection scope of the present application and is not described here.
Further, the sample position information is obtained in the same way as the target position information, and the sample posture characteristic value is the height above the ground of the center point of the smart watch worn by the sample user, acquired once per second within the preset time period.
S405, according to H and J, obtaining the preset first priority threshold D_0 and the preset second priority threshold G_0. D_0 satisfies a condition given in the original as a formula image (not reproduced in this text), in which ΔH_rv is the distance difference between H_rv and H_r(v-1), and ΔH_r0 = 0. G_0 satisfies a condition likewise given as a formula image, in which ΔJ_rv = J_rv − J_r(v-1), and ΔJ_r0 = 0.
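Because the formula images for D_0 and G_0 are not reproduced in this text, the sketch below only illustrates the shape of S401–S405: collect the per-second differences ΔH_rv and ΔJ_rv over all walking-gait sample users, then aggregate each list into one threshold. The mean aggregator and the planar (x, y) meter coordinates are placeholder assumptions of this sketch, not the patented formulas.

```python
import math

def sketch_thresholds(sample_positions, sample_heights):
    """Collect dH (per-second distance differences) and dJ (per-second
    height differences) across all sample users, then aggregate each list
    into a single threshold. Mean aggregation is an assumption standing in
    for the source's unreproduced formula."""
    dH = [math.hypot(p[v][0] - p[v - 1][0], p[v][1] - p[v - 1][1])
          for p in sample_positions for v in range(1, len(p))]
    dJ = [h[v] - h[v - 1]
          for h in sample_heights for v in range(1, len(h))]
    D0 = sum(dH) / len(dH)  # placeholder for the source's D_0 condition
    G0 = sum(dJ) / len(dJ)  # placeholder for the source's G_0 condition
    return D0, G0
```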
Specifically, those skilled in the art know that any method for obtaining a distance difference according to two points in the prior art falls into the protection scope of the present application, and is not described herein.
Specifically, in S400, the candidate abnormal gait is obtained through the following steps:
S1, when D_ij ≥ D_0 and G_ij < G_0, determining that A_i is in the first candidate abnormal gait in the time period corresponding to D_ij.
Specifically, the first candidate abnormal gait includes walking with arms folded and walking with hands behind the back, which can be understood as follows: when the first target priority for a certain time period is not smaller than the preset first priority threshold while the second target priority for the same period is smaller than the preset second priority threshold, the target user is walking during that period but the height of the smart watch changes little, so the target user can be considered to be walking with arms folded or with hands behind the back.
S3, when D_ij < D_0 and G_ij < G_0, obtaining the first candidate priority P_i corresponding to A_i and the second candidate priority Q_i corresponding to A_i.
Specifically, the first candidate priority is the step count for the time period corresponding to D_ij, obtained through the smart watch worn by the target user.
Specifically, the second candidate priority is the step count for the time period corresponding to D_ij, obtained through the mobile phone device corresponding to the target user.
S5, according to P_i and Q_i, obtaining the candidate abnormal gait corresponding to A_i, where in S5 the candidate abnormal gait is obtained through the following steps:
S51, when P_i > Q_i, determining that A_i is in the first intermediate abnormal gait in the time period corresponding to D_ij.
Specifically, the first intermediate abnormal gait is a static exercise gait, where static exercise gaits include yoga, pull-ups, and the like.
S53, when P_i < Q_i, determining that A_i is in the second intermediate abnormal gait in the time period corresponding to D_ij.
Specifically, the second intermediate abnormal gait is exercising on exercise apparatus, where the exercise apparatus includes a treadmill and apparatus with a function similar to that of a treadmill.
S55, when P_i = Q_i, determining that A_i is in a static gait.
S7, when D_ij < D_0 and G_ij ≥ G_0, determining that A_i is in the second candidate abnormal gait in the time period corresponding to D_ij.
Specifically, the second candidate abnormal gait is an in-place exercise gait, which can be understood as follows: when the first target priority for a certain time period is smaller than the preset first priority threshold while the second target priority for the same period is not smaller than the preset second priority threshold, the target user's position shifts little but the height of the smart watch changes considerably, so the target user can be considered to be performing in-place exercise within a certain geographic range during that period, such as rope skipping, jumping exercises, or patting the shoulders and legs.
When the first target priority is smaller than the preset first priority threshold and/or the second target priority for the same time period is smaller than the preset second priority threshold, the first candidate priority and the second candidate priority corresponding to the target user ID are obtained, and the gait corresponding to the target user ID is identified from them. When the step count obtained by the smart watch alone cannot identify the user's gait, the watch's step count is compared with the step count of the user's mobile phone device over the same period to determine the corresponding gait. Different gaits of the user can thus be distinguished, which widens the gait recognition range of the smart watch.
S500, when D_ij ≥ D_0 and G_ij ≥ G_0, determining that A_i is in the target gait in the time period corresponding to D_ij, so as to achieve gait recognition of the smart watch according to the obtained target gait.
Specifically, the target gait is a walking gait.
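The branching of S400/S500 together with S1–S7 and S51–S55 can be collected into one decision function. The label strings below are this sketch's own shorthand for the gaits named in the text, not the patent's wording; `watch_steps` and `phone_steps` correspond to the first and second candidate priorities P_i and Q_i.

```python
def classify_gait(d_ij, g_ij, D0, G0, watch_steps=0, phone_steps=0):
    """Classify one time period from the two 'priorities' and, when both
    fall below their thresholds, from the watch vs. phone step counts."""
    if d_ij >= D0 and g_ij >= G0:
        return "target gait (walking)"                          # S500
    if d_ij >= D0:                                              # S1
        return "first candidate: arms folded / hands behind back"
    if g_ij >= G0:                                              # S7
        return "second candidate: in-place exercise"
    if watch_steps > phone_steps:                               # S51
        return "first intermediate: static exercise (e.g. yoga)"
    if watch_steps < phone_steps:                               # S53
        return "second intermediate: treadmill-type apparatus"
    return "static gait"                                        # S55
```

Note the branch order mirrors the text: the step-count comparison is consulted only when both D_ij < D_0 and G_ij < G_0 (the S3/S5 branch).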
In the gait recognition method above, gait information is determined from the first and second target priorities corresponding to the target user ID according to the different conditions they satisfy. The user's gait is identified and judged by combining different data sources in the smart watch, and different methods are used to determine gait information under different conditions, which improves the accuracy of the smart watch's gait recognition.
Embodiments of the present application also provide a non-transitory computer-readable storage medium, which may be disposed in an electronic device and stores at least one instruction or at least one program; the at least one instruction or the at least one program is loaded and executed by a processor to implement the method provided by the embodiments above.
Embodiments of the present application also provide an electronic device comprising a processor and the non-transitory computer readable storage medium described above.
The embodiments of the application provide a gait recognition method, medium and device for a smart watch. The method obtains a target user ID list; obtains, based on the smart watch worn by each target user, a target position information set and a target posture characteristic value set corresponding to the list; and obtains from these a first target priority set and a second target priority set. When a first target priority is smaller than the preset first priority threshold and/or the second target priority for the same time period is smaller than the preset second priority threshold, the candidate abnormal gait of the target user ID in the corresponding period is obtained; when the first target priority is not smaller than the preset first priority threshold and the second target priority for the same period is not smaller than the preset second priority threshold, the target user ID is determined to be in the target gait in the corresponding period, thereby achieving gait recognition of the smart watch.
Finally, it should be noted that: the above embodiments are only for illustrating the technical solution of the present application, and not for limiting the same; although the application has been described in detail with reference to the foregoing embodiments, it will be understood by those of ordinary skill in the art that: the technical scheme described in the foregoing embodiments can be modified or some or all of the technical features thereof can be replaced by equivalents; such modifications and substitutions do not depart from the spirit of the application, and are intended to be included within the scope of the appended claims and description.
Claims (9)
1. A gait recognition method of a smart watch, the method comprising the steps of:
S100, obtaining a target user ID list A = {A_1, A_2, …, A_i, …, A_n}, where A_i is the i-th target user ID, i = 1, 2, …, n, and n is the number of target user IDs;
S200, based on the smart watch worn by each target user, acquiring a target position information set B = {B_1, B_2, …, B_i, …, B_n} corresponding to A, with B_i = {B_i1, B_i2, …, B_ij, …, B_im}, and a target posture characteristic value set C = {C_1, C_2, …, C_i, …, C_n} corresponding to A, with C_i = {C_i1, C_i2, …, C_ij, …, C_im}, where B_ij is the j-th target position information in the target position information list corresponding to A_i, C_ij is the j-th target posture characteristic value in the target posture characteristic value list corresponding to A_i, j = 1, 2, …, m, and m is the number of target position information items in the target position information list;
S300, according to B and C, obtaining a first target priority set D = {D_1, D_2, …, D_i, …, D_n} corresponding to A, with D_i = {D_i1, D_i2, …, D_ij, …, D_im}, and a second target priority set G = {G_1, G_2, …, G_i, …, G_n} corresponding to A, with G_i = {G_i1, G_i2, …, G_ij, …, G_im}, where D_ij is the j-th first target priority in the first target priority list corresponding to A_i, G_ij is the j-th second target priority in the second target priority list corresponding to A_i, D_i1 = 0, D_ij is the distance difference between B_ij and B_i(j-1), G_i1 = 0, and G_ij = C_ij − C_i(j-1);
S400, when D_ij < D_0 and/or G_ij < G_0, obtaining the candidate abnormal gait of A_i in the time period corresponding to D_ij, so as to achieve gait recognition of the smart watch according to the obtained candidate abnormal gait, where D_0 is a preset first priority threshold and G_0 is a preset second priority threshold;
S500, when D_ij ≥ D_0 and G_ij ≥ G_0, determining that A_i is in the target gait in the time period corresponding to D_ij, so as to achieve gait recognition of the smart watch according to the obtained target gait.
2. The method of claim 1, wherein the target user ID is a unique identification characterizing the target user identity, wherein the target user is a user wearing a smart watch.
3. The method of claim 1, wherein the target position information is the position information of the target user, acquired once per second on the current day from the smart watch worn by the target user.
4. The method of claim 1, wherein the target posture characteristic value is the height above the ground of the center point of the smart watch worn by the target user, acquired once per second on the current day.
5. The method according to claim 1, wherein D_0 and G_0 are obtained in S400 through the following steps:
S401, acquiring a sample user ID list M = {M_1, M_2, …, M_r, …, M_s}, where M_r is the r-th sample user ID, r = 1, 2, …, s, and s is the number of sample user IDs;
S403, based on the smart watch worn by each sample user, acquiring a sample position information set H = {H_1, H_2, …, H_r, …, H_s} corresponding to M, with H_r = {H_r1, H_r2, …, H_rv, …, H_rb}, and a sample posture characteristic value set J = {J_1, J_2, …, J_r, …, J_s} corresponding to M, with J_r = {J_r1, J_r2, …, J_rv, …, J_rb}, where H_rv is the v-th sample position information in the sample position information list corresponding to M_r, J_rv is the v-th sample posture characteristic value in the sample posture characteristic value list corresponding to M_r, v = 1, 2, …, b, and b is the number of sample position information items in the sample position information list;
S405, according to H and J, obtaining a preset first priority threshold D_0 and a preset second priority threshold G_0, wherein D_0 satisfies a condition given in the original as a formula image (not reproduced here), in which ΔH_rv is the distance difference between H_rv and H_r(v-1) and ΔH_r0 = 0, and G_0 satisfies a condition likewise given as a formula image, in which ΔJ_rv = J_rv − J_r(v-1) and ΔJ_r0 = 0.
6. The method according to claim 1, wherein in S400 the candidate abnormal gait is obtained through the following steps:
S1, when D_ij ≥ D_0 and G_ij < G_0, determining that A_i is in the first candidate abnormal gait in the time period corresponding to D_ij;
S3, when D_ij < D_0 and G_ij < G_0, obtaining the first candidate priority P_i corresponding to A_i and the second candidate priority Q_i corresponding to A_i;
S5, according to P_i and Q_i, obtaining the candidate abnormal gait corresponding to A_i;
S7, when D_ij < D_0 and G_ij ≥ G_0, determining that A_i is in the second candidate abnormal gait in the time period corresponding to D_ij.
7. The method according to claim 6, wherein the candidate abnormal gait is obtained in S5 through the following steps:
S51, when P_i > Q_i, determining that A_i is in the first intermediate abnormal gait in the time period corresponding to D_ij;
S53, when P_i < Q_i, determining that A_i is in the second intermediate abnormal gait in the time period corresponding to D_ij;
S55, when P_i = Q_i, determining that A_i is in a static gait.
8. A non-transitory computer readable storage medium having stored therein at least one instruction or at least one program loaded and executed by a processor to implement the method of any one of claims 1-7.
9. An electronic device comprising a processor and the non-transitory computer readable storage medium of claim 8.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title
---|---|---|---
CN202310709903.6A | 2023-06-15 | 2023-06-15 | Gait recognition method, medium and device of intelligent watch
Publications (1)
Publication Number | Publication Date
---|---
CN116740811A | 2023-09-12