CN116784838B - Steering identification system, method, equipment and medium based on wearable inertial sensor - Google Patents
- CN116784838B CN116784838B CN202311069783.4A CN202311069783A CN116784838B CN 116784838 B CN116784838 B CN 116784838B CN 202311069783 A CN202311069783 A CN 202311069783A CN 116784838 B CN116784838 B CN 116784838B
- Authority
- CN
- China
- Prior art keywords
- steering
- azimuth
- marking
- time window
- distance
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/103—Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
- A61B5/11—Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
- A61B5/1116—Determining posture transitions
Abstract
The invention relates to a steering recognition system, method, device, and medium based on a wearable inertial sensor, comprising: an azimuth generation module, which performs quaternion calculation and azimuth integration according to the angular velocity data input from the gyroscope and outputs the azimuth vector sequence of the wearing part during movement; a data segmentation module, which takes the moments at which the angular velocity data input from the gyroscope start being stationary as the start-stop marks of windows, yielding time windows; and a steering marking module, which marks all azimuth vectors as steering or non-steering according to the time windows, a contiguous azimuth vector sequence marked as steering being called a steering event, thereby obtaining the steering marking result. The invention is based only on the angular velocity data of the gyroscope and requires no initialization action; it can overcome the influence of high-frequency signals, noise drift, environmental interference, and similar data-signal issues, and is easy to popularize. It places low requirements on the wearing position, can handle wearing positions with intense motion amplitude, and improves the robustness and effectiveness of activity monitoring in complex environments.
Description
Technical Field
The invention relates to a steering identification system, method, equipment and medium based on a wearable inertial sensor, and belongs to the technical field of wearable devices.
Background
Wearable technology offers new possibilities for long-range health monitoring applications such as health evaluation and motion monitoring. Among wearable systems, the inertial measurement unit (IMU), which measures acceleration and angular velocity signals, is widely adopted because it reflects motion information and is convenient to carry. Key event recognition (e.g., walking, falling, turning) is a common topic in human motion recognition; among these, turning recognition is valued in clinical testing, home care, and similar settings because it benefits trajectory inference and activity assessment.
Early IMU practitioners developed sophisticated commercial motion capture systems, such as the millimeter-level-precision inertial motion capture systems offered by Xsens and others, which have been used in film, television, animation, and related fields. However, such motion capture systems are complex to wear and operate, each set costs upwards of tens of thousands of yuan, the drift problem still exists, and they are difficult to popularize in home scenarios.
Lightweight IMU technology has developed in recent years, offering the potential for in-field activity evaluation. Common IMU-based steering recognition technologies are divided into single-modal and multi-modal according to modality, and into continuous trajectory restoration and key event recognition according to whether the restored content is discrete.
A conventional single-modal IMU provides 3-axis acceleration and 3-axis angular velocity signals. Because of high-frequency noise, orientation drift and trajectory drift are the IMU's most troublesome problems. Filtering techniques mitigate the influence of high-frequency noise at the cost of high-frequency information, but cannot thoroughly solve the drift problem, so trajectory restoration from a single IMU is difficult to use at fine scales. Envelope methods based on acceleration or angular velocity can directly identify the moments at which the motion mode switches, i.e., key events. These techniques require a firm fit, are often limited to the relatively stable torso region between chest and hip, are difficult to use at wearing positions with intense motion amplitude, such as the foot, knee, elbow, and wrist, and remain sensitive to high-frequency noise.
Multi-modal technology combines the IMU with other modalities to enhance its effect; common modalities include the magnetometer (MIMU) and the global positioning system (GPS). MIMU technology effectively suppresses the high-frequency noise problem in interference-free environments through magnetic field signals and can obtain millimeter-level position and posture information. However, the MIMU requires an initialization operation every time, is sensitive to the magnetic field environment, and remains difficult to deploy indiscriminately in the field. IMU technology combined with GPS can acquire meter-level position information in outdoor environments with strong satellite signals, but unfortunately performs well only in outdoor scenes.
Disclosure of Invention
The invention solves the following technical problem: meeting the requirements of long-range health monitoring, it provides a steering recognition system, method, device, and medium based on a wearable inertial sensor. Based only on the angular velocity data of a gyroscope and requiring no initialization action, it can overcome the influence of high-frequency signals, noise drift, environmental interference, and similar data-signal issues, and is easy to popularize. It places low requirements on the wearing position, can handle wearing positions with intense motion amplitude such as the feet, knees, elbows, and wrists, and improves the robustness and effectiveness of activity monitoring in complex environments.
The technical proposal of the invention is as follows:
in a first aspect, the present invention provides a wearable inertial sensor-based steering identification system, comprising: the system comprises an azimuth generation module, a data segmentation module and a steering marking module;
the azimuth generation module: according to angular velocity data input into the gyroscope, quaternion calculation and azimuth integration are carried out, and an azimuth vector sequence of the wearable part during movement is output;
and a data segmentation module: taking whether angular velocity data input into a gyroscope start to be stationary or not as a start-stop sign of a window to obtain a time window;
and a steering marking module: marking all azimuth vectors as turning or non-turning according to the time window; a sequence of azimuth vectors marked as steering is called a steering event, and a steering marking result is obtained.
In order to further optimize the technical scheme of the system, the invention also adopts the following technical measures.
Further, the steering identification system further comprises a result verification module; the result verification module calculates the pairwise time-domain distances of all steering events to obtain the two steering events with the minimum time-domain distance; if the time-domain distance between these two steering events is smaller than a set threshold value, their time-domain distance scores are compared, and the azimuth vectors contained in the steering event with the smaller time-domain distance score are marked as non-steering; the two steering events with the minimum time-domain distance in the new marking result are then checked; if the time-domain distance between the two closest steering events is not smaller than the set threshold value, the verification ends, and the verified steering marking result is obtained.
Further, the specific implementation process of the steering marking module is as follows:
(1) Checking the metric distance between the azimuth vector sequence of each time window and those of the preceding and following time windows, and merging the time windows whose metric distance is smaller than a preset threshold value; stopping checking if there is no time window to be merged; otherwise, checking again;
(2) Comparing the lengths of the merged time windows; marking the azimuth vectors in the time windows whose length is larger than the preset threshold value as non-steering, and marking the azimuth vectors in the time windows whose length is smaller than the preset threshold value as steering; all azimuth vectors are thereby marked, unsupervised, as steering or non-steering.
In a second aspect, the invention provides a steering recognition method based on a wearable inertial sensor, comprising the following steps:
(1) According to angular velocity data input into the gyroscope, quaternion calculation and azimuth integration are carried out, and an azimuth vector sequence of the wearable part during movement is output;
(2) Taking whether angular velocity data input into a gyroscope start to be stationary or not as a start-stop sign of a window to obtain a time window;
(3) Marking all azimuth vectors as turning or non-turning according to the time window; a sequence of azimuth vectors marked as steering is called a steering event, and a steering marking result is obtained.
In order to further optimize the technical scheme of the method, the invention also adopts the following technical measures.
Further, step (3) of the method is followed by a result verification step (4);
the result verification step (4) is implemented as follows: calculating the pairwise time-domain distances of all steering events to obtain the two steering events with the minimum time-domain distance; if the time-domain distance between these two steering events is smaller than a set threshold value, comparing their time-domain distance scores and marking the azimuth vectors contained in the steering event with the smaller time-domain distance score as non-steering; then checking the two steering events with the minimum time-domain distance in the new marking result; and if the time-domain distance between the two closest steering events is not smaller than the set threshold value, ending the verification to obtain the verified steering marking result.
In a third aspect, the present invention provides an electronic device (computer, server, smart phone, etc.) comprising a processor and a memory;
a memory for storing a computer program;
and the processor is used for executing the computer program stored in the memory and realizing a steering identification method based on the wearable inertial sensor when executing.
In a fourth aspect, the present invention provides a computer readable storage medium (e.g., ROM/RAM, magnetic disk, optical disk) having stored thereon a computer program which, when executed by a processor, implements a wearable inertial sensor based turn recognition method.
Compared with the prior art, the invention has the advantages that:
(1) The invention uses only a triaxial gyroscope and executes no initialization action; it can handle wearing positions with intense motion amplitude and complex signal frequency content, such as the feet, knees, elbows, and wrists, and achieves high-accuracy steering identification.
(2) The steering marking module provided by the invention segments dynamic windows using gait events as markers and realizes unsupervised steering marking; it is insensitive to high-frequency signals, noise drift, magnetic field interference, and the like, improving the stability and robustness of the steering identification effect.
(3) The result verification module provided by the invention can remove external interference from the environment, other people, irrelevant behaviors, and the like; it realizes recognition of specified steering actions in round-trip walking capability tests and specified-trajectory walking tests, improving the accuracy and robustness of steering recognition during motion health monitoring.
Drawings
FIG. 1 is a flow chart of a wearable inertial sensor-based steering identification system of the present invention;
FIG. 2 is a schematic diagram of a foot sensing system, (a) is a schematic diagram of a sensor mounting position for capturing foot motion information, a small square box represents an inertial sensor mounted inside, and (b) is a schematic diagram of an azimuth vector obtained according to the angular velocity of a gyroscope, wherein u, v, w are coordinate axes in a world reference frame;
FIG. 3 is a schematic diagram of the x-axis angular velocity data of a gyroscope, wherein broken lines represent the original sampled data of the x-axis angular velocity, and dots are marked as gait events;
fig. 4 is a schematic diagram of a window merging and turning marking flow, in which a indicates all azimuth vectors after the dynamic window is divided, B indicates azimuth vectors of non-turning marks, C indicates azimuth vectors of accurate turning marks, D indicates azimuth vectors of turning marks determined to be abnormal (too close in time domain distance), and F indicates azimuth vectors of turning marks remaining after verification.
Detailed Description
The present invention will be described in detail with reference to the accompanying drawings and examples.
The present invention will be described in further detail below by way of specific embodiments in order to make the objects, technical solutions and advantages of the present invention more apparent.
As shown in fig. 1, a steering recognition system based on a wearable inertial sensor of the present invention includes: the system comprises an azimuth generation module, a data segmentation module, a steering marking module and a result verification module;
the embodiment of the invention uses the foot angular velocity data of the gyroscope during walking, the sampling frequency is 60Hz, and the data is obtained from an inertial sensor fixed on the shoe, as shown in (a) of fig. 2. When walking, the motion amplitude of the foot is larger, the signal is distributed in a frequency domain, so that the inertial signal of the foot has greater difficulty in analysis; as shown in fig. 3, the angular velocity data of the gyroscope in the x-axis direction during walking is zero at the foot rest time and non-zero at the foot swing time, and cycle information exists.
To obtain a steering recognition result from the angular velocity data of the foot gyroscope during walking, an azimuth generation module, a data segmentation module, and a steering marking module are required. In the azimuth generation module, quaternion calculation and azimuth integration are carried out according to the angular velocity data input from the gyroscope, and the azimuth vector sequence of the wearing part during movement is output. In the data segmentation module, the moments at which the input gyroscope angular velocity data start being stationary are taken as the start-stop marks of windows, yielding time windows. In the steering marking module, all azimuth vectors are marked as steering or non-steering according to the time windows; a contiguous azimuth vector sequence marked as steering is called a steering event, and the steering marking result is obtained.
In scenarios such as round-trip walking capability tests and specified-trajectory walking tests, not all steering actions need to be retained: only specified steering actions, such as turning around or right-angle turns, need to be recognized, and external interference from the environment, other people, irrelevant behaviors, and the like must be removed; a result verification module is therefore required. In the result verification module, unnecessary steering events are removed from the obtained steering marking result, yielding the verified marking result.
1. The azimuth generation module: according to the angular velocity data input from the gyroscope, quaternion calculation and azimuth integration are carried out, and the azimuth vector sequence of the wearing part during movement is output. Each azimuth vector consists of coordinate values (u, v, w) in the world reference frame and is a unit vector, as shown in fig. 2 (b). The sequence is initialized so that the initial azimuth vector is a fixed unit reference vector.
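The azimuth generation step can be sketched as follows. This is a minimal illustration under stated assumptions, not the patent's implementation: the 60 Hz sampling step, the initial vector (1, 0, 0), and all function names are assumptions.

```python
import numpy as np

def rotate(q, v):
    """Rotate vector v by unit quaternion q = (w, x, y, z)."""
    qw, qv = q[0], q[1:]
    return v + 2.0 * np.cross(qv, np.cross(qv, v) + qw * v)

def integrate_orientation(gyro, dt=1.0 / 60, v0=(1.0, 0.0, 0.0)):
    """Integrate 3-axis angular velocity (rad/s) into a sequence of
    world-frame azimuth (unit direction) vectors."""
    q = np.array([1.0, 0.0, 0.0, 0.0])       # identity quaternion (w, x, y, z)
    out = []
    for wx, wy, wz in np.asarray(gyro, dtype=float):
        qw, qx, qy, qz = q
        # quaternion kinematics: q_dot = 0.5 * q (x) (0, omega)
        q_dot = 0.5 * np.array([
            -qx * wx - qy * wy - qz * wz,
             qw * wx + qy * wz - qz * wy,
             qw * wy - qx * wz + qz * wx,
             qw * wz + qx * wy - qy * wx,
        ])
        q = q + q_dot * dt
        q = q / np.linalg.norm(q)             # keep the quaternion unit-length
        out.append(rotate(q, np.asarray(v0, dtype=float)))
    return np.array(out)
```

With this convention, a constant yaw rate of pi/2 rad/s held for one second turns the initial vector by a quarter turn in the world frame.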
2. The data segmentation module: the moments at which the input gyroscope angular velocity data start being stationary are taken as the start-stop marks of windows, yielding time windows. In the gyroscope angular velocity data used in the embodiment of the invention, the x-axis angular velocity has sharp peaks before and after the foot is stationary: a shorter peak before the foot becomes stationary corresponds to a foot touchdown event, and a taller peak after the stationary phase corresponds to a foot lift-off event, as shown by the dot marks in fig. 3. The peak before each gait cycle is selected as the segmentation marker, yielding the time windows.
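A simple stationarity-onset segmentation consistent with this description can be sketched as below. The stillness threshold and function name are illustrative assumptions; the embodiment itself uses the peak before each gait cycle as the marker.

```python
import numpy as np

def segment_windows(gyro_x, still_thresh=0.2):
    """Cut a 1-D angular velocity recording into time windows whose
    boundaries are the moments at which the signal starts being stationary.
    still_thresh (rad/s) is an assumed per-sample stillness threshold."""
    still = np.abs(np.asarray(gyro_x, dtype=float)) < still_thresh
    # a window boundary is placed wherever a stationary run begins
    starts = [i for i in range(1, len(still)) if still[i] and not still[i - 1]]
    bounds = [0] + starts + [len(still)]
    return list(zip(bounds[:-1], bounds[1:]))
```

Each returned pair is a half-open sample range; consecutive windows tile the whole recording.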
3. The steering marking module: all azimuth vectors are marked as steering or non-steering according to the time windows; a contiguous azimuth vector sequence marked as steering is called a steering event, and the steering marking result is obtained.
(31) Compute the average vector of the azimuth vector sequence within each time window, where i is the time window index and T_i denotes the duration of the i-th time window; the Euclidean distance between the average vectors of the azimuth vector sequences of different time windows is used as the metric distance. Check the metric distance between each time window and the azimuth vector sequences of the preceding and following time windows, and merge time windows whose metric distance is smaller than 0.35; stop checking if there is no time window to be merged; otherwise, check again;
(32) Compare the lengths of the merged time windows; mark the azimuth vectors in time windows longer than 3.3 s as non-steering and the azimuth vectors in time windows shorter than 3.3 s as steering, corresponding to the non-steering-marked azimuth vectors B in fig. 4. All azimuth vectors are now marked, unsupervised, as steering or non-steering, corresponding to the accurately steering-marked azimuth vectors C in fig. 4 and the steering-marked azimuth vectors D determined to be abnormal (too close in time-domain distance).
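Steps (31)-(32) can be sketched as follows, using the 0.35 metric-distance threshold and the 3.3 s window-length threshold from the embodiment; the function signature and the 60 Hz rate are assumptions.

```python
import numpy as np

def mark_turns(vectors, windows, merge_dist=0.35, max_turn_len=3.3, fs=60):
    """Merge adjacent windows whose mean azimuth vectors are closer than
    merge_dist (Euclidean), then mark windows shorter than max_turn_len
    seconds as steering. Returns per-sample boolean labels and the merged
    windows."""
    wins = list(windows)
    merged = True
    while merged:
        merged = False
        for k in range(len(wins) - 1):
            a, b = wins[k], wins[k + 1]
            ma = vectors[a[0]:a[1]].mean(axis=0)
            mb = vectors[b[0]:b[1]].mean(axis=0)
            if np.linalg.norm(ma - mb) < merge_dist:
                wins[k:k + 2] = [(a[0], b[1])]    # merge this adjacent pair
                merged = True
                break                              # restart the scan
    labels = np.zeros(len(vectors), dtype=bool)
    for a, b in wins:
        if (b - a) / fs < max_turn_len:
            labels[a:b] = True                     # short window -> steering
    return labels, wins
```

Long merged windows correspond to straight walking (stable mean direction); the short windows left between them are the candidate steering events.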
4. The result verification module: unnecessary steering events are removed according to the obtained steering marking result, yielding the verified marking result.
The time-domain distance D(i, j) of two steering events is defined as the time difference between the middle moments of the two events:
D(i, j) = |t_i − t_j|,
where i, j are event labels and t_i is the middle moment of the i-th steering event. If an event is the first steering event, its preceding distance is measured from the start of walking, i.e., taken as t_i; if an event is the last steering event, its following distance is taken as t_end − t_i, where t_end is the moment at which walking ends.
The time-domain distance score S_i of a steering event is defined as the sum of its time-domain distances to the nearest steering events before and after it:
S_i = D(i, i − 1) + D(i, i + 1).
The pairwise time-domain distances of all steering events are calculated to obtain the two steering events M and N with the minimum time-domain distance. If the time-domain distance of this closest pair is smaller than the set threshold, the time-domain distance scores of the two steering events are compared, and the azimuth vectors contained in the steering event with the smaller score are marked as non-steering; the two steering events with the minimum time-domain distance in the new marking result are then checked again. If the time-domain distance of the closest pair is not smaller than the set threshold, the verification ends, and the verified steering marking result is obtained. The verified steering marks are shown in fig. 4: after processing by the result verification module, the final steering marks are given by the accurately steering-marked azimuth vectors C and the retained steering-marked azimuth vectors F.
Based on the same inventive concept, another embodiment of the present invention provides an electronic device (computer, server, smart phone, etc.) comprising a processor and a memory, the memory storing a computer program configured to be executed by the processor, the computer program comprising instructions for performing the steps of the method of the invention.
Based on the same inventive concept, another embodiment of the present invention provides a computer readable storage medium (e.g., ROM/RAM, magnetic disk, optical disk) storing a computer program which, when executed by a computer, implements the steps of the inventive method.
The above examples are provided for the purpose of describing the present invention only and are not intended to limit the scope of the present invention. The scope of the invention is defined by the appended claims. Various equivalents and modifications that do not depart from the spirit and principles of the invention are intended to be included within the scope of the invention.
Claims (6)
1. A wearable inertial sensor-based steering recognition system, comprising: the system comprises an azimuth generation module, a data segmentation module and a steering marking module;
the azimuth generation module: according to angular velocity data input into the gyroscope, quaternion calculation and azimuth integration are carried out, and an azimuth vector sequence of the wearable part during movement is output;
and a data segmentation module: taking whether angular velocity data input into a gyroscope start to be stationary or not as a start-stop sign of a window to obtain a time window;
and a steering marking module: marking all azimuth vectors as steering or non-steering according to the time window; a contiguous azimuth vector sequence marked as steering is called a steering event, and a steering marking result is obtained;
the specific implementation process of the steering marking module is as follows:
(1) Computing the average vector of the azimuth vector sequence within each time window, where i is the time window index and the i-th time window has a corresponding duration; taking the Euclidean distance between the average vectors of the azimuth vector sequences of different time windows as the metric distance; checking the metric distance between each time window and the azimuth vector sequences in the preceding and following time windows, and merging the time windows whose metric distance is smaller than a preset threshold value; stopping checking if there is no time window to be merged; otherwise, checking again;
(2) Comparing the lengths of the merged time windows; marking the azimuth vectors in the time windows whose length is larger than the preset threshold value as non-steering, and marking the azimuth vectors in the time windows whose length is smaller than the preset threshold value as steering; all azimuth vectors are thereby marked, unsupervised, as steering or non-steering.
2. The wearable inertial sensor-based steering recognition system of claim 1, wherein: the steering identification system further comprises a result verification module; the result verification module calculates the pairwise time-domain distances of all steering events to obtain the two steering events with the minimum time-domain distance; if the time-domain distance between these two steering events is smaller than a set threshold value, their time-domain distance scores are compared, and the azimuth vectors contained in the steering event with the smaller time-domain distance score are marked as non-steering; the two steering events with the minimum time-domain distance in the new marking result are then checked; and if the time-domain distance between the two closest steering events is not smaller than the set threshold value, the verification ends, and the verified steering marking result is obtained.
3. The steering recognition method based on the wearable inertial sensor is characterized by comprising the following steps of:
(1) According to angular velocity data input into the gyroscope, quaternion calculation and azimuth integration are carried out, and an azimuth vector sequence of the wearable part during movement is output;
(2) Taking whether angular velocity data input into a gyroscope start to be stationary or not as a start-stop sign of a window to obtain a time window;
(3) Marking all azimuth vectors as turning or non-turning according to the time window; a sequence of azimuth vectors marked as steering is called a steering event, a steering marking result is obtained,
the specific implementation process of the step (3) is as follows:
(31) Computing the average vector of the azimuth vector sequence within each time window, where i is the time window index and the i-th time window has a corresponding duration; taking the Euclidean distance between the average vectors of the azimuth vector sequences of different time windows as the metric distance; checking the metric distance between each time window and the azimuth vector sequences in the preceding and following time windows, and merging the time windows whose metric distance is smaller than a preset threshold value; stopping checking if there is no time window to be merged; otherwise, checking again;
(32) Comparing the lengths of the merged time windows; marking the azimuth vectors in the time windows whose length is larger than the preset threshold value as non-steering, and marking the azimuth vectors in the time windows whose length is smaller than the preset threshold value as steering; all azimuth vectors are thereby marked, unsupervised, as steering or non-steering.
4. A method of steering identification based on wearable inertial sensing according to claim 3, characterized in that: the step (3) of the method is further followed by a result verification step (4);
the result verification step (4) is implemented as follows: calculating the pairwise time-domain distances of all steering events to obtain the two steering events with the minimum time-domain distance; if the time-domain distance between these two steering events is smaller than a set threshold value, comparing their time-domain distance scores and marking the azimuth vectors contained in the steering event with the smaller time-domain distance score as non-steering; then checking the two steering events with the minimum time-domain distance in the new marking result; and if the time-domain distance between the two closest steering events is not smaller than the set threshold value, ending the verification to obtain the verified steering marking result.
5. An electronic device comprising a processor and a memory;
a memory for storing a computer program;
a processor for executing a computer program stored on a memory, the execution of which implements the method of claim 3 or 4.
6. A computer readable storage medium, on which a computer program is stored, characterized in that the computer program, when being executed by a processor, implements the method of claim 3 or 4.
Priority Applications (1)

| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CN202311069783.4A (CN116784838B) | 2023-08-24 | 2023-08-24 | Steering identification system, method, equipment and medium based on wearable inertial sensor |
Publications (2)
Publication Number | Publication Date |
---|---|
CN116784838A CN116784838A (en) | 2023-09-22 |
CN116784838B true CN116784838B (en) | 2024-01-09 |
Family
ID=88045075
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202311069783.4A Active CN116784838B (en) | 2023-08-24 | 2023-08-24 | Steering identification system, method, equipment and medium based on wearable inertial sensor |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN116784838B (en) |
Citations (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH07301541A (en) * | 1994-05-06 | 1995-11-14 | Hitachi Ltd | Navigation device |
CN106510640A (en) * | 2016-12-13 | 2017-03-22 | 哈尔滨理工大学 | Sleep quality detection method based on overturning detection |
CN107677267A (en) * | 2017-08-22 | 2018-02-09 | 重庆邮电大学 | Indoor pedestrian navigation course feedback modifiers method based on MEMS IMU |
CN107966161A (en) * | 2017-11-09 | 2018-04-27 | 内蒙古大学 | Walking detection method based on FFT |
CN108692730A (en) * | 2018-05-21 | 2018-10-23 | 同济大学 | Pedestrian steering recognition algorithm for inertial navigation |
CN110316201A (en) * | 2018-03-30 | 2019-10-11 | 中科院微电子研究所昆山分所 | Sharp-turn recognition method, device and system |
CN110495896A (en) * | 2019-07-31 | 2019-11-26 | 武汉理工大学 | Wearable knee joint monitoring device and monitoring method based on GPRS communication |
CN114271812A (en) * | 2021-12-06 | 2022-04-05 | 南京大学 | Three-dimensional gait analysis system and method based on inertial sensor |
CN115486837A (en) * | 2022-09-22 | 2022-12-20 | 北京戴来科技有限公司 | Gait analysis method and system and device for improving walking disorder |
WO2023044372A1 (en) * | 2021-09-17 | 2023-03-23 | Dolby Laboratories Licensing Corporation | Efficient orientation tracking with future orientation prediction |
CN116186514A (en) * | 2022-12-29 | 2023-05-30 | 王琪 | Control method of integrated device |
KR20230081878A (en) * | 2021-11-30 | 2023-06-08 | 주식회사 비플렉스 | System for analyzing motion using a sensor worn on the user's head |
Family Cites Families (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8876739B2 (en) * | 2009-07-24 | 2014-11-04 | Oregon Health & Science University | System for clinical assessment of movement disorders |
ES2846821T3 (en) * | 2010-07-14 | 2021-07-29 | Ecole Polytechnique Fed Lausanne Epfl | System and method for 3D gait assessment |
JP6083279B2 (en) * | 2013-03-25 | 2017-02-22 | セイコーエプソン株式会社 | Movement status information calculation method and movement status information calculation device |
US10408622B2 (en) * | 2016-11-29 | 2019-09-10 | Hrl Laboratories, Llc | System for incremental trajectory estimation based on real time inertial sensing |
CN113218395B (en) * | 2017-06-23 | 2024-06-11 | 北京方位捷讯科技有限公司 | Pedestrian walking track detection method, device and system |
- 2023-08-24: CN application CN202311069783.4A filed; granted as patent CN116784838B (status: Active)
Non-Patent Citations (1)
Title |
---|
Freezing-of-gait detection system for patients with Parkinson's disease based on an inertial measurement unit; Ma Luan, Li Bochen, He Juanjuan, Yao Zhiming, Yang Xianjun, Liang Dong; Chinese Journal of Medical Instrumentation (Issue 4); 238-242 *
Also Published As
Publication number | Publication date |
---|---|
CN116784838A (en) | 2023-09-22 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
Yan et al. | Ronin: Robust neural inertial navigation in the wild: Benchmark, evaluations, and new methods | |
US11030918B2 (en) | Identification and analysis of movement using sensor devices | |
Sabatini | Quaternion-based extended Kalman filter for determining orientation by inertial and magnetic sensing | |
Cho et al. | Design and implementation of practical step detection algorithm for wrist-worn devices | |
CN110501011A (en) | Determine position of the mobile device in geographic area | |
Fang et al. | Development of a wearable device for motion capturing based on magnetic and inertial measurement units | |
Dong et al. | An optical-tracking calibration method for MEMS-based digital writing instrument | |
JP5750742B2 (en) | Mobile object state estimation device | |
Li et al. | Real-time human motion capture based on wearable inertial sensor networks | |
Wang et al. | Pose-invariant inertial odometry for pedestrian localization | |
Kim et al. | Recognition of sign language with an inertial sensor-based data glove | |
Lin et al. | Development of an ultra-miniaturized inertial measurement unit WB-3 for human body motion tracking | |
Fatmi et al. | American Sign Language Recognition using Hidden Markov Models and Wearable Motion Sensors. | |
Malawski | Depth versus inertial sensors in real-time sports analysis: A case study on fencing | |
Luo et al. | Deep motion network for freehand 3D ultrasound reconstruction | |
Lin | RETRACTED ARTICLE: Research on film animation design based on inertial motion capture algorithm | |
Panahandeh et al. | Chest-mounted inertial measurement unit for pedestrian motion classification using continuous hidden Markov model | |
Wang et al. | Effective inertial hand gesture recognition using particle filtering based trajectory matching | |
CN116784838B (en) | Steering identification system, method, equipment and medium based on wearable inertial sensor | |
CN109758154A (en) | Motion state determination method, apparatus, device and storage medium |
CN112907633A (en) | Dynamic characteristic point identification method and application thereof | |
CN111382701A (en) | Motion capture method, motion capture device, electronic equipment and computer-readable storage medium | |
CN107847187A (en) | Apparatus and method for carrying out motion tracking at least part of limbs | |
US11854214B2 (en) | Information processing apparatus specifying a relationship between a sensor and an object included in image data, and method and non-transitory computer-readable storage medium | |
JP6147446B1 (en) | Inertial sensor initialization using soft constraints and penalty functions |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||