CN116784838A - Steering identification system, method, equipment and medium based on wearable inertial sensor - Google Patents

Steering identification system, method, equipment and medium based on wearable inertial sensor

Info

Publication number
CN116784838A
Authority
CN
China
Prior art keywords
steering
azimuth
marking
time domain
events
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202311069783.4A
Other languages
Chinese (zh)
Other versions
CN116784838B (en)
Inventor
应奇峻
程敬原
陈康玉
王齐
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
University of Science and Technology of China USTC
Anhui Provincial Hospital First Affiliated Hospital of USTC
Original Assignee
University of Science and Technology of China USTC
Anhui Provincial Hospital First Affiliated Hospital of USTC
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by University of Science and Technology of China USTC, Anhui Provincial Hospital First Affiliated Hospital of USTC filed Critical University of Science and Technology of China USTC
Priority to CN202311069783.4A priority Critical patent/CN116784838B/en
Publication of CN116784838A publication Critical patent/CN116784838A/en
Application granted granted Critical
Publication of CN116784838B publication Critical patent/CN116784838B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/103Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B5/11Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
    • A61B5/1116Determining posture transitions
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/103Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B5/11Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb

Abstract

The invention relates to a steering recognition system, method, device and medium based on a wearable inertial sensor, comprising: an azimuth generation module, which carries out quaternion calculation and azimuth integration on the angular velocity data input from the gyroscope and outputs the azimuth vector sequence of the wearing part during movement; a data segmentation module, which takes the onset of stillness in the gyroscope angular velocity data as the start-stop marker of a window to obtain time windows; and a steering marking module, which marks all azimuth vectors as steering or non-steering according to the time windows, a contiguous sequence of azimuth vectors marked as steering being called a steering event, giving the steering marking result. The invention relies only on gyroscope angular velocity data and requires no initialization action, can overcome high-frequency signals, noise drift, environmental interference and other signal disturbances, and is easy to popularize; it places low requirements on the wearing position, can handle wearing positions with severe motion amplitude, and improves the robustness and effectiveness of activity monitoring in complex environments.

Description

Steering identification system, method, equipment and medium based on wearable inertial sensor
Technical Field
The invention relates to a steering identification system, method, equipment and medium based on a wearable inertial sensor, and belongs to the technical field of wearable devices.
Background
Wearable technology provides new possibilities for long-range health monitoring applications such as health evaluation and motion monitoring. The inertial measurement unit (IMU), which measures acceleration and angular velocity signals, is widely adopted in wearable systems because it reflects motion information and is convenient to carry. Key-event recognition (e.g., walking, falling, turning) is a common topic in human motion recognition; among these, turning recognition is important for clinical testing, home care and similar settings because it benefits trajectory estimation and activity assessment.
Early IMU practitioners developed sophisticated commercial motion capture systems, such as the millimeter-accuracy inertial motion capture systems offered by Xsens, Nordstem and others, which have been used in film, television, animation and related fields. However, such motion capture systems are complex to wear and operate, each set costs upwards of tens of thousands of yuan, the drift problem remains, and they are therefore difficult to popularize in home scenarios.
Lightweight IMU technology has developed in recent years, offering potential for in-field activity evaluation. Common IMU-based steering recognition technologies are divided into single-modality and multi-modality by sensing mode, and into continuous trajectory restoration and key-event recognition according to whether the restored content is discrete.
A conventional single-modality IMU provides 3-axis acceleration and 3-axis angular velocity signals. High-frequency noise causes the IMU to suffer orientation drift and trajectory drift, its most troublesome problem. Filtering techniques relieve the influence of high-frequency noise at the cost of losing high-frequency information and cannot thoroughly solve the drift problem, so trajectory restoration from a single IMU is difficult to use at fine scales. Envelope methods based on acceleration or angular velocity can directly identify the course of motion-mode switching, i.e., key events. These techniques require a firm fit and are often limited to the relatively stable torso region between chest and hip; they are difficult to use at wearing positions with severe motion amplitude, such as the foot, knee, elbow and wrist, and remain sensitive to high-frequency noise.
Multi-modality technology combines the IMU with other modalities to enhance the effect; common additional modalities are the magnetometer (MIMU) and the Global Positioning System (GPS). MIMU technology effectively suppresses the high-frequency-noise problem in interference-free environments through magnetic-field signals and can obtain millimeter-level position and posture information. However, the MIMU requires an initialization operation on every use, is sensitive to the magnetic-field environment, and is still difficult to popularize indiscriminately in the field. IMU technology combined with GPS can acquire meter-accuracy position information in outdoor environments with strong satellite signals but, unfortunately, performs well only in outdoor scenes.
Disclosure of Invention
The invention solves the following technical problem: in response to the requirements of long-range health monitoring, a steering recognition system, method, device and medium based on a wearable inertial sensor are provided which rely only on gyroscope angular velocity data and require no initialization action, can overcome high-frequency signals, noise drift, environmental interference and other signal disturbances, and are easy to popularize; the system places low requirements on the wearing position, is qualified for wearing positions with severe motion amplitude such as the foot, knee, elbow and wrist, and improves the robustness and effectiveness of activity monitoring in complex environments.
The technical solution of the invention is as follows:
In a first aspect, the present invention provides a steering recognition system based on a wearable inertial sensor, comprising: an azimuth generation module, a data segmentation module and a steering marking module;
the azimuth generation module: carries out quaternion calculation and azimuth integration on the angular velocity data input from the gyroscope, and outputs the azimuth vector sequence of the wearing part during movement;
the data segmentation module: takes the onset of stillness in the input gyroscope angular velocity data as the start-stop marker of a window to obtain time windows;
the steering marking module: marks all azimuth vectors as steering or non-steering according to the time windows; a contiguous sequence of azimuth vectors marked as steering is called a steering event, giving the steering marking result.
In order to further optimize the technical scheme of the system, the invention also adopts the following technical measures.
Further, the steering recognition system also comprises a result verification module; the result verification module calculates the pairwise time-domain distances of all steering events to obtain the two steering events with the minimum time-domain distance; if the time-domain distance of these two closest events is smaller than a set threshold, their time-domain distance scores are compared and the azimuth vectors contained in the steering event with the smaller score are re-marked as non-steering; the two steering events with the minimum time-domain distance in the new marking result are then checked; if the time-domain distance of the two closest events is not smaller than the set threshold, the verification ends and the verified steering marking result is obtained.
Further, the steering marking module is specifically implemented as follows:
(1) Check the metric distance between each time window and the azimuth vector sequences of its preceding and following time windows, and merge time windows whose metric distance is smaller than a preset threshold; stop checking if there is no time window to be merged, otherwise check again;
(2) Compare the lengths of the merged time windows; mark the azimuth vectors in time windows longer than a preset threshold as non-steering and those in time windows shorter than the threshold as steering. At this point all azimuth vectors have been marked, unsupervised, as steering or non-steering.
In a second aspect, the invention provides a steering recognition method based on a wearable inertial sensor, comprising the following steps:
(1) According to the angular velocity data input from the gyroscope, carry out quaternion calculation and azimuth integration, and output the azimuth vector sequence of the wearing part during movement;
(2) Take the onset of stillness in the input gyroscope angular velocity data as the start-stop marker of a window to obtain time windows;
(3) Mark all azimuth vectors as steering or non-steering according to the time windows; a contiguous sequence of azimuth vectors marked as steering is called a steering event, giving the steering marking result.
In order to further optimize the technical scheme of the method, the invention also adopts the following technical measures.
Further, step (3) of the method is followed by a result verification step (4);
the result verification step (4) is implemented as follows: calculate the pairwise time-domain distances of all steering events to obtain the two steering events with the minimum time-domain distance; if the time-domain distance of these two closest events is smaller than a set threshold, compare their time-domain distance scores and re-mark the azimuth vectors contained in the steering event with the smaller score as non-steering; then check the two steering events with the minimum time-domain distance in the new marking result; if the time-domain distance of the two closest events is not smaller than the set threshold, end the verification to obtain the verified steering marking result.
In a third aspect, the present invention provides an electronic device (computer, server, smart phone, etc.) comprising a processor and a memory;
a memory for storing a computer program;
and the processor is used for executing the computer program stored in the memory, implementing the steering recognition method based on the wearable inertial sensor when the program is executed.
In a fourth aspect, the present invention provides a computer readable storage medium (e.g., ROM/RAM, magnetic disk, optical disk) having stored thereon a computer program which, when executed by a processor, implements a wearable inertial sensor based turn recognition method.
Compared with the prior art, the invention has the advantages that:
(1) The invention uses only a triaxial gyroscope and requires no initialization action; it can handle wearing positions with severe motion amplitude and complex signal frequency content, such as the foot, knee, elbow and wrist, achieving high-accuracy steering recognition.
(2) The steering marking module of the invention segments dynamic windows using gait events as markers and realizes unsupervised steering marking; it is insensitive to high-frequency signals, noise drift, magnetic-field interference and the like, improving the stability and robustness of the steering recognition effect.
(3) The result verification module of the invention can remove external interference from the environment, other people, irrelevant behaviors and the like, realizes recognition of specified steering actions in round-trip walking capability tests and specified-trajectory walking tests, and improves the accuracy and robustness of steering recognition during motion health monitoring.
Drawings
FIG. 1 is a flow chart of a wearable inertial sensor-based steering identification system of the present invention;
FIG. 2 is a schematic diagram of the foot sensing system: (a) shows the sensor mounting position for capturing foot motion information, where the small square box represents the inertial sensor mounted inside; (b) shows the azimuth vector obtained from the gyroscope angular velocity, where u, v, w are the coordinate axes of the world reference frame;
FIG. 3 is a schematic diagram of the x-axis angular velocity data of the gyroscope, where the broken line represents the original sampled x-axis angular velocity data and the dots mark gait events;
FIG. 4 is a schematic diagram of the window merging and steering marking flow, in which A indicates all azimuth vectors after dynamic-window segmentation, B indicates azimuth vectors with non-steering marks, C indicates azimuth vectors with accurate steering marks, D indicates steering-marked azimuth vectors judged abnormal (too close in time-domain distance), and F indicates steering-marked azimuth vectors retained after verification.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, the present invention is described in detail below with reference to the accompanying drawings and specific embodiments.
As shown in fig. 1, a steering recognition system based on a wearable inertial sensor of the present invention includes: an azimuth generation module, a data segmentation module, a steering marking module and a result verification module.
the embodiment of the invention uses the foot angular velocity data of the gyroscope during walking, the sampling frequency is 60Hz, and the data is obtained from an inertial sensor fixed on the shoe, as shown in (a) of fig. 2. When walking, the motion amplitude of the foot is larger, the signal is distributed in a frequency domain, so that the inertial signal of the foot has greater difficulty in analysis; as shown in fig. 3, the angular velocity data of the gyroscope in the x-axis direction during walking is zero at the foot rest time and non-zero at the foot swing time, and cycle information exists.
To obtain the steering recognition result from the angular velocity data of the foot gyroscope during walking, the azimuth generation module, the data segmentation module and the steering marking module are required. In the azimuth generation module, quaternion calculation and azimuth integration are carried out on the angular velocity data input from the gyroscope, and the azimuth vector sequence of the wearing part during movement is output; in the data segmentation module, the onset of stillness in the input gyroscope angular velocity data is taken as the start-stop marker of a window to obtain time windows; in the steering marking module, all azimuth vectors are marked as steering or non-steering according to the time windows; a contiguous sequence of azimuth vectors marked as steering is called a steering event, giving the steering marking result.
In the scenarios of the round-trip walking capability test and the specified-trajectory walking test, not all steering actions need to be retained: only specified steering actions such as turning around and right-angle turns need to be recognized, and external interference from the environment, other people, irrelevant behaviors and the like must be removed, so a result verification module is required. In the result verification module, unnecessary steering events are removed from the obtained steering marking result to obtain the verified marking result.
1. The azimuth generation module: according to the angular velocity data input from the gyroscope, quaternion calculation and azimuth integration are carried out, and the azimuth vector sequence $\{\vec{p}_t=(u_t,v_t,w_t)\}$ of the wearing part during movement is output, where $(u,v,w)$ are the coordinate values in the world reference frame and each azimuth vector is a unit vector, $\|\vec{p}_t\|=1$, as shown in fig. 2(b). The azimuth vectors are initialized so that the initial azimuth vector $\vec{p}_0$ is a fixed unit reference vector.
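As an illustration only, the following Python sketch shows one way this step could be realized: the gyroscope angular velocity is integrated into an orientation quaternion (Hamilton convention) and a fixed initial unit vector is rotated into the world frame at each sample. The 60 Hz sampling rate comes from the embodiment; the function names, axis conventions and the default initial vector p0 are assumptions, not taken from the patent.

```python
import numpy as np

def quat_mul(q, r):
    """Hamilton product of two quaternions stored as (w, x, y, z)."""
    w1, x1, y1, z1 = q
    w2, x2, y2, z2 = r
    return np.array([
        w1*w2 - x1*x2 - y1*y2 - z1*z2,
        w1*x2 + x1*w2 + y1*z2 - z1*y2,
        w1*y2 - x1*z2 + y1*w2 + z1*x2,
        w1*z2 + x1*y2 - y1*x2 + z1*w2,
    ])

def azimuth_sequence(gyro, fs=60.0, p0=(1.0, 0.0, 0.0)):
    """gyro: (N, 3) body-frame angular velocity in rad/s.
    Returns an (N, 3) sequence of unit azimuth vectors in the world frame."""
    dt = 1.0 / fs
    q = np.array([1.0, 0.0, 0.0, 0.0])        # identity orientation
    out = np.empty((len(gyro), 3))
    for t, w in enumerate(gyro):
        theta = np.linalg.norm(w) * dt         # rotation angle over one sample
        if theta > 0.0:
            axis = w / np.linalg.norm(w)
            dq = np.concatenate(([np.cos(theta / 2)], np.sin(theta / 2) * axis))
            q = quat_mul(q, dq)                # integrate body-frame rotation
            q /= np.linalg.norm(q)             # re-normalize against numeric drift
        # rotate p0 into the world frame: p = q * (0, p0) * conj(q)
        pq = quat_mul(quat_mul(q, np.concatenate(([0.0], p0))),
                      q * np.array([1.0, -1.0, -1.0, -1.0]))
        out[t] = pq[1:]
    return out
```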
2. The data segmentation module: the onset of stillness in the input gyroscope angular velocity data is taken as the start-stop marker of a window to obtain time windows. In the gyroscope angular velocity data used in this embodiment, the x-axis angular velocity shows sharp peaks before and after the foot is stationary: a shorter peak before the foot becomes stationary corresponds to a foot-touchdown event, and a taller peak after the foot becomes stationary corresponds to a foot-off (toe-off) event, as shown by the dot marks in fig. 3. The peak before each gait cycle is selected as the segmentation marker, yielding the time windows.
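A minimal sketch of the segmentation step, under the assumption that the onset of stillness can be detected as the angular-velocity magnitude staying below a small threshold for a short hold time; the threshold and hold-time values are illustrative (the embodiment instead marks the x-axis peak before each gait cycle):

```python
import numpy as np

def segment_windows(gyro, fs=60.0, thresh=0.2, min_hold=0.1):
    """Return a list of (start, end) sample-index windows, delimited by
    the samples at which the sensor first becomes stationary."""
    mag = np.linalg.norm(gyro, axis=1)
    still = mag < thresh                        # candidate stationary samples
    hold = int(min_hold * fs)                   # stillness must persist this long
    onsets = [t for t in range(1, len(still) - hold)
              if not still[t - 1] and still[t:t + hold].all()]
    edges = [0] + onsets + [len(gyro)]
    return [(a, b) for a, b in zip(edges[:-1], edges[1:]) if b > a]
```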
3. The steering marking module: all azimuth vectors are marked as steering or non-steering according to the time windows; a contiguous sequence of azimuth vectors marked as steering is called a steering event, giving the steering marking result.
(31) Compute the average vector of the azimuth vector sequence within each time window, $\bar{p}_i = \frac{1}{T_i}\sum_{t \in W_i}\vec{p}_t$, where $i$ is the time-window index and $T_i$ is the duration of the $i$-th time window $W_i$; the Euclidean distance between the average vectors of different time windows is used as the metric distance. Check the metric distance between each time window and the azimuth vector sequences of its preceding and following time windows, and merge time windows whose metric distance is smaller than 0.35; stop checking if there is no time window to be merged, otherwise check again;
(32) Compare the lengths of the merged time windows; mark the azimuth vectors in time windows longer than 3.3 s as non-steering and those in time windows shorter than 3.3 s as steering, corresponding to the non-steering-marked azimuth vectors B in fig. 4. At this point all azimuth vectors have been marked, unsupervised, as steering or non-steering, corresponding in fig. 4 to the accurately steering-marked azimuth vectors C and the steering-marked azimuth vectors D judged abnormal (too close in time-domain distance).
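The merging-and-marking procedure of steps (31)-(32) could look like the following sketch. The 0.35 merge distance and the 3.3 s length threshold are the embodiment's values; the data structures and function names are assumptions for illustration.

```python
import numpy as np

def mark_steering(azimuth, windows, fs=60.0, merge_dist=0.35, max_len=3.3):
    """azimuth: (N, 3) unit vectors; windows: list of (start, end) indices.
    Returns per-sample steering labels (True = steering) and merged windows."""
    wins = [list(w) for w in windows]
    merged = True
    while merged:                               # re-check until no pair merges
        merged = False
        for k in range(len(wins) - 1):
            a, b = wins[k], wins[k + 1]
            ma = azimuth[a[0]:a[1]].mean(axis=0)
            mb = azimuth[b[0]:b[1]].mean(axis=0)
            if np.linalg.norm(ma - mb) < merge_dist:
                wins[k] = [a[0], b[1]]          # merge neighboring windows
                del wins[k + 1]
                merged = True
                break
    labels = np.zeros(len(azimuth), dtype=bool)
    for s, e in wins:
        labels[s:e] = (e - s) / fs < max_len    # short windows -> steering
    return labels, wins
```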
4. The result verification module: unnecessary steering events are removed from the obtained steering marking result to obtain the verified marking result.
Define the time-domain distance $D_{i,j}$ of two steering events as the time difference between the middle moments of the two events:

$D_{i,j} = \left| m_i - m_j \right|$

where $i, j$ are event labels and $m_i$ is the middle moment of the $i$-th steering event. If an event is the first steering event, let the middle moment of its preceding neighbor be $0$; if an event is the last steering event, let the middle moment of its following neighbor be $t_{\mathrm{end}}$, where $t_{\mathrm{end}}$ is the moment at which walking ends.
Define the time-domain distance score $S_i$ of a steering event as the sum of its time-domain distances to the nearest preceding and following steering events:

$S_i = D_{i,i-1} + D_{i,i+1}$
calculating the time domain distance of each other for all steering events to obtain two steering events M and N with the minimum time domain distance; if the time domain distance of the two steering events with the smallest distance is smaller thanThen the temporal distance scores of the two steering events are compared +.>Marking the azimuth vector contained in the steering event with the small time domain distance score as non-steering; checking two steering events with minimum time domain distance in the new marking result; if the time domain distance of the two steering events with the smallest distance is not less than +.>And (5) finishing the verification to obtain a verified steering mark result. The result of the checked steering mark is shown in fig. 4, and the steering mark obtained finally is shown by the azimuth vector C of the accurate steering mark and the azimuth vector F of the reserved steering mark through the processing of the result checking module.
Based on the same inventive concept, another embodiment of the present invention provides an electronic device (computer, server, smartphone, etc.) comprising a memory and a processor, the memory storing a computer program configured to be executed by the processor, the computer program comprising instructions for performing the steps of the method of the invention.
Based on the same inventive concept, another embodiment of the present invention provides a computer readable storage medium (e.g., ROM/RAM, magnetic disk, optical disk) storing a computer program which, when executed by a computer, implements the steps of the inventive method.
The above examples are provided for the purpose of describing the present invention only and are not intended to limit the scope of the present invention. The scope of the invention is defined by the appended claims. Various equivalents and modifications that do not depart from the spirit and principles of the invention are intended to be included within the scope of the invention.

Claims (7)

1. A wearable inertial sensor-based steering recognition system, comprising: an azimuth generation module, a data segmentation module and a steering marking module;
the azimuth generation module: carries out quaternion calculation and azimuth integration on the angular velocity data input from the gyroscope, and outputs the azimuth vector sequence of the wearing part during movement;
the data segmentation module: takes the onset of stillness in the input gyroscope angular velocity data as the start-stop marker of a window to obtain time windows;
the steering marking module: marks all azimuth vectors as steering or non-steering according to the time windows; a contiguous sequence of azimuth vectors marked as steering is called a steering event, giving the steering marking result.
2. The wearable inertial sensor-based steering recognition system of claim 1, wherein: the steering recognition system further comprises a result verification module; the result verification module calculates the pairwise time-domain distances of all steering events to obtain the two steering events with the minimum time-domain distance; if the time-domain distance of these two closest events is smaller than a set threshold, their time-domain distance scores are compared and the azimuth vectors contained in the steering event with the smaller score are re-marked as non-steering; the two steering events with the minimum time-domain distance in the new marking result are then checked; if the time-domain distance of the two closest events is not smaller than the set threshold, the verification ends and the verified steering marking result is obtained.
3. The wearable inertial sensor-based steering recognition system of claim 1 or 2, wherein: the steering marking module is specifically implemented as follows:
(1) check the metric distance between each time window and the azimuth vector sequences of its preceding and following time windows, and merge time windows whose metric distance is smaller than a preset threshold; stop checking if there is no time window to be merged, otherwise check again;
(2) compare the lengths of the merged time windows; mark the azimuth vectors in time windows longer than a preset threshold as non-steering and those in time windows shorter than the threshold as steering; at this point all azimuth vectors have been marked, unsupervised, as steering or non-steering.
4. A steering recognition method based on a wearable inertial sensor, characterized by comprising the following steps:
(1) according to the angular velocity data input from the gyroscope, carry out quaternion calculation and azimuth integration, and output the azimuth vector sequence of the wearing part during movement;
(2) take the onset of stillness in the input gyroscope angular velocity data as the start-stop marker of a window to obtain time windows;
(3) mark all azimuth vectors as steering or non-steering according to the time windows; a contiguous sequence of azimuth vectors marked as steering is called a steering event, giving the steering marking result.
5. The steering recognition method based on a wearable inertial sensor according to claim 4, characterized in that: step (3) of the method is followed by a result verification step (4);
the result verification step (4) is implemented as follows: calculate the pairwise time-domain distances of all steering events to obtain the two steering events with the minimum time-domain distance; if the time-domain distance of these two closest events is smaller than a set threshold, compare their time-domain distance scores and re-mark the azimuth vectors contained in the steering event with the smaller score as non-steering; then check the two steering events with the minimum time-domain distance in the new marking result; if the time-domain distance of the two closest events is not smaller than the set threshold, end the verification to obtain the verified steering marking result.
6. An electronic device comprising a processor and a memory;
a memory for storing a computer program;
a processor for executing the computer program stored in the memory, implementing the method of claim 4 or 5 when the program is executed.
7. A computer readable storage medium, on which a computer program is stored, characterized in that the computer program, when being executed by a processor, implements the method of claim 4 or 5.
CN202311069783.4A 2023-08-24 2023-08-24 Steering identification system, method, equipment and medium based on wearable inertial sensor Active CN116784838B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311069783.4A CN116784838B (en) 2023-08-24 2023-08-24 Steering identification system, method, equipment and medium based on wearable inertial sensor

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202311069783.4A CN116784838B (en) 2023-08-24 2023-08-24 Steering identification system, method, equipment and medium based on wearable inertial sensor

Publications (2)

Publication Number Publication Date
CN116784838A true CN116784838A (en) 2023-09-22
CN116784838B CN116784838B (en) 2024-01-09

Family

ID=88045075

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202311069783.4A Active CN116784838B (en) 2023-08-24 2023-08-24 Steering identification system, method, equipment and medium based on wearable inertial sensor

Country Status (1)

Country Link
CN (1) CN116784838B (en)

Patent Citations (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH07301541A (en) * 1994-05-06 1995-11-14 Hitachi Ltd Navigation device
US20110092860A1 (en) * 2009-07-24 2011-04-21 Oregon Health & Science University System for clinical assessment of movement disorders
US20130123665A1 (en) * 2010-07-14 2013-05-16 Ecole Polytechnique Federale De Lausanne (Epfl) System and method for 3d gait assessment
US20160146610A1 (en) * 2013-03-25 2016-05-26 Seiko Epson Corporation Movement state information calculation method and movement state information calculation device
US20180149480A1 (en) * 2016-11-29 2018-05-31 Hrl Laboratories, Llc System for incremental trajectory estimation based on real time inertial sensing
CN106510640A (en) * 2016-12-13 2017-03-22 哈尔滨理工大学 Sleep quality detection method based on overturning detection
US20180372500A1 (en) * 2017-06-23 2018-12-27 Beijing Fine Way Technology Co., Ltd. Method and device for detecting pedestrian stride length and walking path
CN107677267A (en) * 2017-08-22 2018-02-09 重庆邮电大学 Indoor pedestrian navigation course feedback modifiers method based on MEMS IMU
CN107966161A (en) * 2017-11-09 2018-04-27 内蒙古大学 Walking detection method based on FFT
CN110316201A (en) * 2018-03-30 2019-10-11 中科院微电子研究所昆山分所 A kind of zig zag recognition methods, device, system
CN108692730A (en) * 2018-05-21 2018-10-23 同济大学 Pedestrian applied to inertial navigation turns to recognizer
CN110495896A (en) * 2019-07-31 2019-11-26 武汉理工大学 A kind of wearable knee joint monitoring device and monitoring method based on GPRS communication
WO2023044372A1 (en) * 2021-09-17 2023-03-23 Dolby Laboratories Licensing Corporation Efficient orientation tracking with future orientation prediction
KR20230081878A * 2021-11-30 2023-06-08 주식회사 비플렉스 System for analyzing motion using sensor worn on the user's head
CN114271812A (en) * 2021-12-06 2022-04-05 南京大学 Three-dimensional gait analysis system and method based on inertial sensor
CN115486837A (en) * 2022-09-22 2022-12-20 北京戴来科技有限公司 Gait analysis method and system and device for improving walking disorder
CN116186514A (en) * 2022-12-29 2023-05-30 王琪 Control method of integrated device

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
马銮, 李波陈, 何娟娟, 姚志明, 杨先军, 梁栋: "IMU-based freezing-of-gait detection system for patients with Parkinson's disease", Chinese Journal of Medical Instrumentation, no. 4, pp. 238-242 *

Also Published As

Publication number Publication date
CN116784838B (en) 2024-01-09

Similar Documents

Publication Publication Date Title
Cho et al. Design and implementation of practical step detection algorithm for wrist-worn devices
Huang et al. Exploiting cyclic features of walking for pedestrian dead reckoning with unconstrained smartphones
Yun et al. Design, implementation, and experimental results of a quaternion-based Kalman filter for human body motion tracking
Yan et al. Ronin: Robust neural inertial navigation in the wild: Benchmark, evaluations, and new methods
CN110501011A (en) Determine position of the mobile device in geographic area
Dong et al. An optical-tracking calibration method for MEMS-based digital writing instrument
JP6592245B2 (en) Estimating the direction of user movement on mobile devices
Li et al. Real-time human motion capture based on wearable inertial sensor networks
Wang et al. Pose-invariant inertial odometry for pedestrian localization
Lin et al. Development of an ultra-miniaturized inertial measurement unit WB-3 for human body motion tracking
Malawski Depth versus inertial sensors in real-time sports analysis: A case study on fencing
Fatmi et al. American Sign Language Recognition using Hidden Markov Models and Wearable Motion Sensors.
Luo et al. Deep motion network for freehand 3d ultrasound reconstruction
Lin RETRACTED ARTICLE: Research on film animation design based on inertial motion capture algorithm
Li et al. Adaptive threshold based ZUPT for single IMU enabled wearable pedestrian localization
Panahandeh et al. Chest-mounted inertial measurement unit for pedestrian motion classification using continuous hidden Markov model
CN116784838B (en) Steering identification system, method, equipment and medium based on wearable inertial sensor
CN109758154A (en) A kind of motion state determines method, apparatus, equipment and storage medium
CN112907633A (en) Dynamic characteristic point identification method and application thereof
CN111382701A (en) Motion capture method, motion capture device, electronic equipment and computer-readable storage medium
US11854214B2 (en) Information processing apparatus specifying a relationship between a sensor and an object included in image data, and method and non-transitory computer-readable storage medium
JP6147446B1 (en) Inertial sensor initialization using soft constraints and penalty functions
Zhang et al. PCA & HMM based arm gesture recognition using inertial measurement unit
Bao et al. Improved PCA based step direction estimation for dead-reckoning localization
Sessa et al. Ultra-miniaturized WB-3 Inertial Measurement Unit: Performance evaluation of the attitude estimation

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant