CN107865662A - A kind of method and device for identifying limb action - Google Patents
Classifications
- A61B5/1121—Determining geometric values, e.g. centre of rotation or angular range of movement
- A61B5/1126—Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb, using a particular sensing technique
- A61B5/6801—Arrangements of detecting, measuring or recording means, e.g. sensors, specially adapted to be attached to or worn on the body surface
- A61B5/72—Signal processing specially adapted for physiological signals or for diagnostic purposes
- A61B5/7203—Signal processing for noise prevention, reduction or removal
- G06F17/10—Complex mathematical operations
- G06F17/11—Complex mathematical operations for solving equations, e.g. nonlinear equations, general mathematical optimization problems
Abstract
The present invention relates to a method for identifying limb actions, comprising the following steps: obtaining the sensor data sequence output by an inertial sensor as the limb moves; pre-processing the obtained sensor data sequence; obtaining the acceleration value sequence from the pre-processed sensor data sequence and integrating it to obtain the coordinates of the sensor's position after displacement; and obtaining the action type of the limb from the inertial sensor's coordinates before and after the displacement. The invention further relates to a device implementing the above method. The method and device for identifying limb actions of the present invention have the following beneficial effects: the usage environment is unrestricted, no offline training is required, and the recognition rate is high.
Description
Technical field
The present invention relates to the field of artificial-intelligence recognition, and more specifically to a method and device for identifying limb actions.
Background art
Human action recognition is an important branch of the intelligent-recognition field. Traditionally, human actions have been recognized from video or images; with the development of science and technology and of the electronics industry, wearable sensor devices began to be used for action recognition; and as wireless technology developed and its coverage expanded, wireless signals have also been used to recognize human actions, with good results. Prior-art approaches that recognize human actions with wearable sensors all follow the same flow: first collect data, then denoise and process the collected data, then extract features, then train and classify, and finally recognize the action. Although these prior techniques can recognize human actions and, to a certain extent, meet the demand, the video- and image-based techniques suffer from line-of-sight and lighting limitations and cannot protect user privacy; recognition based on wireless signals (for example, WiSee and WiTrack) requires offline training before running online; moreover, the prior art can only classify known postures, and even the same posture performed by different people, or the same action performed with a different amplitude, may fail to be recognized. The prior art therefore suffers from defects such as low recognition resolution and restricted usage environments.
Content of the invention
The technical problem to be solved by the present invention is to address the above defects of the prior art (restricted usage environment, required offline training, and low recognition rate) by providing a method and device for identifying limb actions whose usage environment is unrestricted, which requires no offline training, and whose recognition rate is high.
The technical solution adopted by the present invention to solve the technical problem is to construct a method for identifying limb actions, comprising the following steps:
A) obtaining the sensor data output by an inertial sensor worn on a limb as the limb moves, the inertial sensor outputting in sequence, through a wireless communication module and at a set time interval, the multiple tri-axial acceleration data produced during the limb's movement, so as to obtain a sensor data sequence ordered by the time of the limb's motion;
B) pre-processing the obtained sensor data sequence, performing a coordinate conversion on it and eliminating the influence of gravity in the conversion;
C) obtaining the acceleration value sequence from the pre-processed sensor data sequence and integrating it to obtain the coordinates of the sensor's position after the displacement, and computing a correction factor for each of the sensor's three axes to correct the displacement;
D) combining a human body model with the inertial sensor's coordinates before and after the displacement to obtain the action type of the limb.
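The four steps A) to D) above can be sketched end-to-end as follows. This is an illustrative Python sketch, not the patent's implementation: it uses plain cumulative-sum integration rather than the Simpson-style update described later, and a toy threshold classifier stands in for the human-body-model comparison; all function names are hypothetical.

```python
import numpy as np

def recognize_limb_action(accel_seq, dt, calib, start_pos):
    """Minimal sketch of steps A-D: raw tri-axial samples in, action type out."""
    # Step B: per-axis linear calibration, then gravity removal on the world z axis
    a = np.asarray(accel_seq, dtype=float)            # shape (n, 3)
    a = calib['k'] * a + calib['b']                   # hardware-error linear fit
    a[:, 2] -= 9.81                                   # subtract gravity (world z)
    # Step C: double integration -> displacement of the sensor
    v = np.cumsum(a, axis=0) * dt
    s = np.cumsum(v, axis=0) * dt
    end_pos = np.asarray(start_pos) + s[-1]
    # Step D: compare start/end positions (a toy stand-in for the body-model lookup)
    return classify(start_pos, end_pos)

def classify(p0, p1):
    """Hypothetical classifier: decide by the vertical displacement only."""
    dz = p1[2] - p0[2]
    return 'raise' if dz > 0.1 else ('lower' if dz < -0.1 else 'hold')
```

With a constant upward net acceleration, the sketch reports a rising displacement and classifies the motion as a raise.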
Further, before performing the recognition, a correction factor is computed for each of the three axes of every sensor, and the signal it outputs is corrected to eliminate the error caused by hardware in its output signal.
Further, step D) also comprises the following step:
D1) optimizing the post-movement coordinates obtained in the above steps according to the position information transmitted, before and after the limb's movement, by multiple inertial sensors arranged on different joints of the moving limb and the fixed positional relationship of the joints on the limb, and using the optimized coordinates to confirm or correct the action type of the limb.
Further, in step A) linear fitting is used to eliminate the hardware acceleration error in each axial direction of every datum in the sensor data sequence; in step C), the sensor data sequence is converted from the carrier coordinate system to the real-world coordinate system, and the influence of gravitational acceleration is subtracted from the converted acceleration data sequence to obtain the actual acceleration data sequence.
Further, step D1) further comprises the following steps:
D11) for an acceleration datum missing from the multiple acceleration data sequences arranged in time order, restoring it by setting it equal to the average of the adjacent preceding and following acceleration data;
D12) removing the random noise carried by each acceleration datum in the acceleration data sequence using repeated smoothing;
D13) performing two successive first-order integrations on each acceleration datum in the acceleration data sequence to obtain the displacement corresponding to that acceleration datum;
D14) applying a previously obtained correction factor to the obtained displacement to obtain the corrected displacement;
D15) mapping the corrected displacement onto the three axes and adding the mapped values to the sensor's position before the acceleration data were produced, to obtain the position of the sensor after the limb action that produced the acceleration data.
Further, the integration in step D13) includes:

v_i = v_{i-1} + ((a_{i-1} + 4a_i + a_{i+1})/6)·Δt
s_i = s_{i-1} + ((v_{i-1} + 4v_i + v_{i+1})/6)·Δt

where a is an acceleration value obtained at the time interval, v is the velocity value obtained from the acceleration values, s is the displacement obtained from the acceleration values, i is the index of the current acceleration value in the sequence, i = 1, 2, …, n, n is the number of acceleration values included in the acceleration data sequence obtained during the limb action, and Δt is the time interval at which the acceleration is obtained.
Further, step D15) further comprises:

w′_x = w_x + s_nx
w′_y = w_y + s_ny
w′_z = w_z + s_nz

where the initial coordinate is w(w_x, w_y, w_z), the coordinate after the limb's movement is w′(w′_x, w′_y, w′_z), and s_nx, s_ny and s_nz are the components on the x, y and z axes of the final displacement s_n obtained by double-integrating each acceleration datum in the acceleration data sequence in turn.
Further, the correction factor includes a correction factor for each axial direction, each used to correct the movement distance along that axis. The correction factor for an axis is obtained by moving the limb a set distance along that axis, computing the calculated value of that set distance from the sensor data, subtracting the real set distance from its calculated value, and dividing by the set distance. When correcting the movement distance along an axis, the movement distance along that axis is divided by the sum of the correction factor for that axis and the constant 1 to obtain the corrected movement distance along that axis.
The invention further relates to a device implementing the above method, including:
a sensor data sequence acquisition unit, for obtaining the sensor data output by an inertial sensor worn on a limb as the limb moves, the inertial sensor outputting in sequence, through a wireless communication module and at a set time interval, the multiple tri-axial acceleration data produced during the limb's movement, so as to obtain a sensor data sequence ordered by the time of the limb's motion;
a pre-processing unit, for pre-processing the obtained sensor data sequence, eliminating its hardware linearity error, performing a coordinate conversion on it and eliminating the influence of gravity in the conversion;
a coordinate acquisition unit, for obtaining the acceleration value sequence from the pre-processed sensor data sequence and integrating it to obtain the coordinates of the sensor's position after the displacement;
an action judging unit, for combining a human body model with the inertial sensor's coordinates before and after the displacement to obtain the action type of the limb.
Further, the device also includes:
a correction unit, for optimizing the post-movement coordinates obtained above according to the position information transmitted, before and after the limb's movement, by multiple inertial sensors arranged on different joints of the moving limb and the fixed positional relationship of the joints on the limb, and for using the optimized coordinates to confirm or correct the action type of the limb.
Implementing the method and device for identifying limb actions of the present invention has the following beneficial effects. A sensor worn on the limb transmits out, by wireless signal, the acceleration data produced as the limb moves; the acceleration data sequence obtained during the limb's movement (the data sequence formed by the sensor continually producing the above acceleration data at the set time interval during the movement) is then processed to obtain the sensor's position after the movement, which is compared with the original position to obtain the action type of the limb. There is therefore no lighting limitation and no invasion of privacy. At the same time, the limb action is judged from the positions at the start of and after the movement, which have a real physical meaning, so no prior offline training is needed. The usage environment is thus unrestricted, no offline training is required, and the recognition rate is high.
Brief description of the drawings
Fig. 1 is a flow chart of the recognition method in an embodiment of the method and device for identifying limb actions of the present invention;
Fig. 2 is a detailed flow chart, in the embodiment, of obtaining the sensor coordinates after the limb moves;
Fig. 3 is a schematic structural diagram of the device in the embodiment.
Embodiment
The embodiments of the present invention are further described below with reference to the accompanying drawings.
As shown in Fig. 1, in an embodiment of the method and device for identifying limb actions of the present invention, the method includes the following steps.
Step S11, eliminating each sensor's hardware error: in this embodiment, when a sensor is first used to identify limb actions, correction factors for its three axes are computed, so that during measurement the sensor's output signal can be processed with the correction factors (for example, by superposition) to correct it and eliminate the error caused by hardware in its output signal. It should be noted that in some cases of this embodiment, for example when the sensor is confirmed to have no hardware error, or when the error is so small that it does not affect the measurement result, this step may be omitted and the influence of the error on the output signal ignored in subsequent steps. When the above hardware error is large, this step is performed to obtain the correction factors, which are then applied to the signal in subsequent steps (for example, when the data are pre-processed in step S12), so that the output signal objectively and correctly represents the actual state of the sensor.
Step S12, obtaining the sensor output data sequence: in this embodiment, the method of identifying limb actions is illustrated with sensors worn on a human arm. There are multiple sensors, worn on the wrist, elbow and shoulder joints of the arm. The sensor on the wrist is the one mainly used for judging the action; although the elbow and shoulder sensors can also serve as judging sensors, they are mainly used to correct the position of the wrist sensor. Of course, in some cases of this embodiment the data from the elbow and shoulder sensors may be left unused. In this step, the sensor data output by the inertial sensor worn on the limb as the limb moves are obtained: the inertial sensor outputs in sequence, through a wireless communication module and at a set time interval, the multiple tri-axial acceleration data produced during the limb's movement, yielding a sensor data sequence ordered by the time of the limb's motion.
Step S13, pre-processing the data sequence: in this step, the obtained sensor data sequence is pre-processed: a coordinate conversion is performed on it and the influence of gravity in the conversion is eliminated. This step can be subdivided as follows: using linear fitting, eliminate the hardware acceleration error in each axial direction of every datum in the sensor data sequence; then convert the sensor data sequence from the carrier coordinate system to the real-world coordinate system, and subtract the influence of gravitational acceleration from the converted acceleration data sequence to obtain the actual acceleration data sequence.
Ideally, the three axes of an inertial sensor should be mutually orthogonal, but in practice, for reasons such as soldering precision, the curvature and roughness of the circuit board, and machining precision, device and soldering errors arise, so the measurement axes inside a real sensor are not fully orthogonal. Because of these errors, when the sensor is at rest the accelerometer reading on a single axis should have a maximum of |g| and a minimum of −|g|, but in reality the absolute values of the maximum and minimum are not equal to the local gravitational acceleration g. To eliminate this error, for each inertial sensor the actual acceleration values a_xT, a_yT and a_zT in the carrier coordinate system are recovered using the following linear-fit equations:

a_xT = k_x·a_x + b_x
a_yT = k_y·a_y + b_y
a_zT = k_z·a_z + b_z

where a_xT, a_yT and a_zT are the actual acceleration data on the x, y and z axes of the carrier coordinate system, a_x, a_y and a_z are the data obtained by the accelerometer on the x, y and z axes of the carrier coordinate system, and k_x and b_x, k_y and b_y, k_z and b_z are the accelerometer correction factors for the x, y and z axes.
Taking the x axis as an example, the values of k_x and b_x can be obtained from the following equations:

k_x = 2g/(x_max − x_min)
b_x = g − k_x·x_max

where x_max and x_min are the maximum and minimum values read by the accelerometer along the x axis over n rounds of random measurement under static conditions; the correction factors for the y and z axes can be computed in the same way.
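Under the stated formulas, the per-axis calibration can be computed and applied as follows. This is a minimal Python sketch; the function names are assumptions, and g = 9.81 m/s² is assumed for the local gravitational acceleration.

```python
def axis_calibration(x_max, x_min, g=9.81):
    """Correction factor k and offset b for one axis, from the formulas
    k = 2g/(x_max - x_min) and b = g - k*x_max."""
    k = 2.0 * g / (x_max - x_min)
    b = g - k * x_max
    return k, b

def calibrate(raw, k, b):
    """Recover the actual acceleration a_T = k*a + b on that axis."""
    return k * raw + b
```

By construction, a raw static reading of x_max maps to +g and a raw reading of x_min maps to −g.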
Generally, when the sensor is at rest, gravity acts on the z axis of the world coordinate system (the real-world coordinate system), so the gravitational acceleration must be removed to obtain the real acceleration. In the carrier coordinate system, gravitational acceleration affects all three axes; to eliminate its influence, the acceleration data are first transformed from the carrier coordinate system to the world coordinate system, and then g is subtracted directly from the acceleration data on the z axis to obtain the real acceleration data.
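A minimal sketch of this gravity-removal step, assuming the carrier-to-world rotation matrix R is already available from an attitude estimate (the patent does not detail how R is obtained; the function name is hypothetical):

```python
import numpy as np

def to_world_and_remove_gravity(a_body, R, g=9.81):
    """Rotate one body-frame sample into the world frame with R (assumed
    known), then subtract gravity from the world z axis."""
    a_world = R @ np.asarray(a_body, dtype=float)
    a_world[2] -= g   # gravity acts only on world z
    return a_world
```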
Step S14, obtaining the sensor's position after the limb moves: in this step, the acceleration value sequence in the pre-processed sensor data sequence is obtained and integrated to obtain the coordinates of the sensor's position after the displacement. In this embodiment, the random noise of the acceleration data sequence is removed, two successive first-order integrations are performed on the acceleration data to obtain the displacement, the displacement is corrected, and the post-movement coordinate position is obtained from the displacement.
Step S15, correcting the obtained position: in this step, the post-movement coordinates obtained in the above steps are optimized according to the position information transmitted, before and after the limb's movement, by the multiple inertial sensors arranged on different joints of the moving limb and the fixed positional relationship of the joints on the limb. In this embodiment, the positions obtained above are corrected using the positions of the sensors on the shoulder and elbow joints and the constant distances between those joints. It should be noted that in some cases of this embodiment this step may be skipped, going directly from step S14 to step S16; generally, whether to use this step can be decided according to whether the amplitude of the action is large.
Step S16, judging the limb action from the corrected position: in this step, the action type of the limb is obtained from the inertial sensor's coordinates before and after the displacement.
Fig. 2 shows, for one case of this embodiment, the specific steps of obtaining the post-movement sensor position from the acceleration data sequence output by the sensor; to a certain extent these steps further refine steps S14 and S15 above. In this embodiment, the specific steps include:
Step S21, filling in missing data: because the data are transmitted wirelessly, the quality of the signal or channel may vary across environments, so packets may be lost; for example, five data that should form a data sequence may arrive as only four, with one datum lost in the middle, in which case the datum must be filled in. Of course, if no packet loss occurs, this step may be skipped and the next step performed directly. An acceleration datum missing from the multiple acceleration data sequences arranged in time order is restored by setting it equal to the average of the adjacent preceding and following acceleration data; in other words, in this embodiment the lost data are filled in by interpolation.
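The neighbour-averaging restoration can be sketched as follows (a hypothetical single-axis helper; missing samples are represented as None):

```python
def fill_missing(seq):
    """Replace each missing sample (None) with the mean of its two
    neighbours, as described for lost packets."""
    out = list(seq)
    for i, v in enumerate(out):
        if v is None and 0 < i < len(out) - 1:
            out[i] = (out[i - 1] + out[i + 1]) / 2.0
    return out
```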
Step S22, removing random noise: in this step, the random noise carried by each acceleration datum in the acceleration data sequence is removed using repeated smoothing, for example five-point cubic smoothing. Assume a_{i-2}, a_{i-1}, a_i, a_{i+1} and a_{i+2} are any five consecutive acceleration data: the first two values a_1 and a_2 are obtained with the first and second of the method's five formulas, the last two values a_{n-1} and a_n with the fourth and fifth formulas, and every other a_i with the third formula (the formulas themselves are the standard five-point cubic smoothing formulas).
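One smoothing pass can be sketched as follows; the coefficients used are the standard five-point cubic smoothing ones, taken here as an assumption to the extent the patent's own formulas may differ:

```python
import numpy as np

def five_point_cubic_smooth(a):
    """One pass of five-point cubic smoothing (standard coefficients,
    assumed here); the patent applies the pass repeatedly."""
    a = np.asarray(a, dtype=float)
    n = len(a)
    if n < 5:
        return a.copy()   # too short for a five-point window
    out = np.empty(n)
    # first and second formulas (left edge)
    out[0] = (69*a[0] + 4*a[1] - 6*a[2] + 4*a[3] - a[4]) / 70
    out[1] = (2*a[0] + 27*a[1] + 12*a[2] - 8*a[3] + 2*a[4]) / 35
    # third formula (all interior points)
    for i in range(2, n - 2):
        out[i] = (-3*a[i-2] + 12*a[i-1] + 17*a[i] + 12*a[i+1] - 3*a[i+2]) / 35
    # fourth and fifth formulas (right edge)
    out[-2] = (2*a[-5] - 8*a[-4] + 12*a[-3] + 27*a[-2] + 2*a[-1]) / 35
    out[-1] = (-a[-5] + 4*a[-4] - 6*a[-3] + 4*a[-2] + 69*a[-1]) / 70
    return out
```

A useful property for checking the coefficients: data already lying on a straight line pass through unchanged.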
Step S23, integrating the acceleration data to obtain the displacement: in this step, two successive first-order integrations are performed on each acceleration datum in the acceleration data sequence to obtain the displacement corresponding to that acceleration datum. The integration in this step includes:

v_i = v_{i-1} + ((a_{i-1} + 4a_i + a_{i+1})/6)·Δt
s_i = s_{i-1} + ((v_{i-1} + 4v_i + v_{i+1})/6)·Δt

where a is an acceleration value obtained at the time interval, v is the velocity value obtained from the acceleration values, s is the displacement obtained from the acceleration values, i is the index of the current acceleration value in the sequence, i = 1, 2, …, n, and n is the number of acceleration values included in the acceleration data sequence obtained during the limb action.
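The two update formulas can be implemented directly for one axis. In this sketch the treatment of the first and last samples, which lack a neighbour on one side, is an assumption the patent does not specify:

```python
def double_integrate(a, dt):
    """Displacement from an acceleration sequence using the Simpson-style
    updates v_i = v_{i-1} + ((a_{i-1}+4a_i+a_{i+1})/6)dt and
    s_i = s_{i-1} + ((v_{i-1}+4v_i+v_{i+1})/6)dt, on one axis."""
    n = len(a)
    v = [0.0] * n
    for i in range(1, n - 1):
        v[i] = v[i - 1] + ((a[i - 1] + 4 * a[i] + a[i + 1]) / 6.0) * dt
    v[-1] = v[-2]   # last sample has no a[i+1]; holding the value is an assumption
    s = [0.0] * n
    for i in range(1, n - 1):
        s[i] = s[i - 1] + ((v[i - 1] + 4 * v[i] + v[i + 1]) / 6.0) * dt
    s[-1] = s[-2]
    return s[-1]    # final displacement on this axis
```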
Step S24, correcting the obtained displacement: in this step, the previously obtained correction factor is applied to the obtained displacement to obtain the corrected displacement. It should be noted that in this embodiment this step is used in some cases; in others it may be skipped, going directly to step S25. In this embodiment, the correction factor includes a correction factor for each axial direction, each used to correct the movement distance along that axis. The correction factor for an axis is obtained by moving the limb a set distance along that axis, computing the calculated value of that set distance from the sensor data, subtracting the real set distance from its calculated value, and dividing by the set distance; when correcting the movement distance along an axis, the movement distance along that axis is divided by the sum of the correction factor for that axis and the constant 1 to obtain the corrected movement distance.
That is, different sensor devices have different sensing differences; even on the same sensor, the sensitivities of the x, y and z axes differ. To eliminate such differences, a correction factor u_p and u_n can be computed for the positive and negative direction of each axis of each sensor. Taking the x axis as an example, u_xp and u_xn are u_p and u_n mapped onto the positive and negative directions of the x axis, and can be obtained from the following formulas:

u_xp = (d_cxp − d_rxp)/d_rxp
u_xn = (d_cxn − d_rxn)/d_rxn

Here d_r denotes the distance the sensor really moves and d_c the distance computed for the sensor by integration; in practice, the same distance may be moved several times (for example, 10 times) and the average of the results taken as d_c. d_rxp and d_cxp are d_r and d_c mapped onto the positive direction of the x axis, and d_rxn and d_cxn are d_r and d_c mapped onto the negative direction of the x axis. The displacement on the x axis can therefore be corrected by u_xp and u_xn as follows:

s_uxp = s_nxp/(1 + u_xp)
s_uxn = s_nxn/(1 + u_xn)

In the formulas above, s_nxp and s_nxn correspond to the positive and negative directions of s_nx respectively. Suppose a wearable device M has original coordinates M(M_x, M_y, M_z) and the corrected point coordinates of the device are M′(M′_x, M′_y, M′_z); then after correction the x axis can be expressed as:

M′_x = M_x + s_ux

where s_ux is s_uxp or s_uxn, determined by the sign of s_nx: if s_nx > 0, s_ux is s_uxp; otherwise, s_ux is s_uxn.
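The correction-factor computation and the sign-dependent correction can be sketched as follows (hypothetical helper names; in practice d_calc would be the average over repeated movements of the known distance):

```python
def axis_correction_factor(d_calc, d_real):
    """u = (d_calculated - d_real)/d_real for one axis direction, from a
    known set-distance movement."""
    return (d_calc - d_real) / d_real

def correct_displacement(s, u_pos, u_neg):
    """Corrected displacement s/(1+u), choosing u by the sign of s."""
    u = u_pos if s > 0 else u_neg
    return s / (1.0 + u)
```

If the integration overestimates a 1.0 m movement as 1.2 m, u = 0.2 and the correction maps a raw 1.2 back to 1.0.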
Step S25, obtaining the sensor's current position: in this step, the corrected displacement is mapped onto the three axes, and the mapped values are added to the sensor's position before the acceleration data were produced, to obtain the position of the sensor after the limb action that produced the acceleration data. That is, in this step the sensor's current position is obtained by the following computation:

w′_x = w_x + s_nx
w′_y = w_y + s_ny
w′_z = w_z + s_nz

where the initial coordinate is w(w_x, w_y, w_z), the coordinate after the limb's movement is w′(w′_x, w′_y, w′_z), and s_nx, s_ny and s_nz are the components on the x, y and z axes of the final displacement s_n obtained by double-integrating each acceleration datum in the acceleration data sequence in turn.
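The final position update is a per-axis addition; a trivial sketch (hypothetical function name):

```python
def update_position(w, s):
    """New sensor position: previous coordinate plus the corrected
    displacement components mapped onto x, y and z."""
    wx, wy, wz = w
    sx, sy, sz = s
    return (wx + sx, wy + sy, wz + sz)
```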
This embodiment further relates to a device implementing the above method; Fig. 3 shows its structure. In Fig. 3, the device includes a sensor data sequence acquisition unit 1, a pre-processing unit 2, a coordinate acquisition unit 3, a correction unit 4 and an action judging unit 5. The sensor data sequence acquisition unit 1 obtains the sensor data output by the inertial sensor worn on the limb as the limb moves: the inertial sensor outputs in sequence, through a wireless communication module and at a set time interval, the multiple tri-axial acceleration data produced during the limb's movement, yielding a sensor data sequence ordered by the time of the limb's motion. The pre-processing unit 2 pre-processes the obtained sensor data sequence, eliminating its hardware linearity error, performing a coordinate conversion on it and eliminating the influence of gravity in the conversion. The coordinate acquisition unit 3 obtains the acceleration value sequence in the pre-processed sensor data sequence and integrates it to obtain the coordinates of the sensor's position after the displacement. The correction unit 4 optimizes the post-movement coordinates obtained in the above steps according to the position information transmitted, before and after the limb's movement, by the multiple inertial sensors arranged on the different joints of the moving limb and the fixed positional relationship of the joints on the limb, and uses the optimized coordinates to confirm or correct the action type of the limb. The action judging unit 5 obtains the action type of the limb from the inertial sensor's coordinates before and after the displacement.
The embodiments described above express only several implementations of the present invention, and their description is relatively specific and detailed, but they should not therefore be construed as limiting the scope of the claims of the present invention. It should be pointed out that those of ordinary skill in the art can make various modifications and improvements without departing from the concept of the present invention, and these all fall within the protection scope of the present invention. The protection scope of this patent should therefore be determined by the appended claims.
Claims (10)
- A kind of 1. method for identifying limb action, it is characterised in that comprise the following steps:A the sensing data that the inertial sensor being worn on limbs exports with the limb motion) is obtained, the inertia passes Sensor is sequentially output its caused multiple three axle in limbs movement by wireless communication module according to setting time interval Acceleration information, obtain a sensing data sequence being arranged in order by the priority of the limb motion time;B) the sensing data sequence of acquirement is pre-processed, Coordinate Conversion is carried out to it and is eliminated in the transfer process Gravity influences;C the acceleration magnitude sequence in pretreated sensing data sequence) is obtained, integral operation is carried out to it, obtains the biography The coordinate value of position after sensor displacement, the correction factor for calculating each axle of sensor three go to correct displacement;D manikin) is combined, the coordinate value after coordinate value and displacement before being subjected to displacement according to the inertial sensor, Obtain the type of action of the limbs.
- 2. The method for identifying a limb action according to claim 1, characterized by further comprising the following step: before performing the identification, computing the correction factors of the three axes for each sensor and correcting the signals it outputs, thereby eliminating output signal errors caused by its hardware.
- 3. The method for identifying a limb action according to claim 2, characterized in that step D) further comprises the following step:
D1) optimizing the post-movement coordinates obtained in the above steps according to the position information transmitted before and after the limb movement by the multiple inertial sensors arranged on different joint positions of the moving limb, together with the fixed positional relationship of the joints on the limb, the optimized coordinates being used to confirm or correct the action type of the limb.
- 4. The method for identifying a limb action according to claim 3, characterized in that in step A), linear fitting is used to eliminate the hardware acceleration error in each axial direction of every datum in the sensor data sequence; and in step C), the sensor data sequence is transformed from the carrier coordinate system into the actual coordinate system, and the influence of gravitational acceleration is subtracted from the converted acceleration data sequence to obtain the actual acceleration data sequence.
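The carrier-to-world conversion of step C) can be sketched as follows. This is a minimal illustration under stated assumptions, not the patent's implementation: the claim does not say how the body-to-world rotation is obtained (an attitude estimate would normally supply it), so the rotation matrix is taken as given, and the function name `to_world_frame` and the constant `G` are illustrative.

```python
import numpy as np

G = 9.81  # gravitational acceleration in m/s^2 (assumed value)

def to_world_frame(accel_body, rotation):
    """Transform one triaxial acceleration sample from the carrier
    (body) coordinate system into the actual (world) coordinate
    system and subtract gravity, as described for step C).

    accel_body : length-3 acceleration in the sensor's own frame
    rotation   : 3x3 rotation matrix from body frame to world frame
                 (how it is obtained is outside the claim's scope)
    """
    accel_world = rotation @ np.asarray(accel_body, dtype=float)
    accel_world[2] -= G  # gravity acts along the world z axis
    return accel_world

# A sensor at rest with its axes aligned to the world (identity
# rotation) reads +g on its z axis; after conversion the actual
# acceleration is zero.
rest = to_world_frame([0.0, 0.0, G], np.eye(3))
```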
- 5. The method for identifying a limb action according to claim 4, characterized in that step D1) further comprises the following steps:
D11) restoring any acceleration datum missing from the multiple time-ordered acceleration data sequences by setting it equal to the average of the adjacent preceding acceleration datum and following acceleration datum;
D12) removing the random noise carried by each acceleration datum in the acceleration data sequences by repeated smoothing;
D13) performing two successive first-order integrations on each acceleration datum in the acceleration data sequence in turn, obtaining the displacement corresponding to that acceleration datum;
D14) applying the correction factor obtained in advance to the obtained displacement, obtaining the corrected displacement;
D15) mapping the corrected displacement onto the three axes and adding the mapped values to the position of the sensor before the limb action that produced the acceleration data, obtaining the position of that sensor after the action.
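Steps D11) and D12) can be sketched as below. The claim fixes the interpolation rule (average of the two neighbours) but not the smoothing kernel, so the three-point moving average is an assumed choice; both function names are illustrative, and single isolated gaps are assumed.

```python
def fill_missing(acc):
    """D11): restore a missing acceleration datum (None here) as the
    average of the adjacent preceding and following data. Single,
    isolated gaps are assumed."""
    out = list(acc)
    for i in range(1, len(out) - 1):
        if out[i] is None:
            out[i] = (out[i - 1] + out[i + 1]) / 2.0
    return out

def smooth(acc, passes=2):
    """D12): repeated smoothing to remove random noise. The claim does
    not fix the kernel; a three-point moving average is assumed, with
    the end samples left unchanged."""
    s = list(acc)
    for _ in range(passes):
        s = [s[0]] + [(s[i - 1] + s[i] + s[i + 1]) / 3.0
                      for i in range(1, len(s) - 1)] + [s[-1]]
    return s

filled = fill_missing([1.0, None, 3.0, 4.0])   # the gap becomes 2.0
denoised = smooth([0.0, 1.0, 0.0, 1.0, 0.0])   # alternation is damped
```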
- 6. The method for identifying a limb action according to claim 5, characterized in that the integral operation in step D13) comprises:

v_i = v_{i-1} + ((a_{i-1} + 4·a_i + a_{i+1}) / 6) · Δt

s_i = s_{i-1} + ((v_{i-1} + 4·v_i + v_{i+1}) / 6) · Δt

where a is the acceleration value obtained in one time interval, v is the velocity value obtained from the acceleration value, s is the displacement obtained from the acceleration value, i is the index of the current acceleration value in the sequence, i = 1, 2, …, n, n is the number of acceleration values contained in the acceleration data sequence obtained during the limb action, and Δt is the time interval at which the acceleration is obtained.
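The two recurrences of claim 6 (a Simpson-like weighted update) can be run directly. The claim leaves the boundary samples, where a_{i-1} or a_{i+1} does not exist, undefined; the sketch below pads the sequences by repeating their end values, which is an assumption, as is the zero initial velocity and the function name.

```python
def integrate_displacement(acc, dt):
    """Run the two recurrences of claim 6 and return the final
    displacement s_n:
        v_i = v_{i-1} + ((a_{i-1} + 4*a_i + a_{i+1}) / 6) * dt
        s_i = s_{i-1} + ((v_{i-1} + 4*v_i + v_{i+1}) / 6) * dt
    Boundary samples are padded by repeating end values (assumed)."""
    a = [acc[0]] + list(acc) + [acc[-1]]   # pad a_0 and a_{n+1}
    v = [0.0]                              # start from rest (assumed)
    for i in range(1, len(a) - 1):
        v.append(v[-1] + ((a[i - 1] + 4 * a[i] + a[i + 1]) / 6.0) * dt)
    v = [v[0]] + v + [v[-1]]               # pad v likewise
    s = 0.0
    for i in range(1, len(v) - 1):
        s += ((v[i - 1] + 4 * v[i] + v[i + 1]) / 6.0) * dt
    return s

# Sanity check: constant acceleration a over time t gives s close to
# a*t^2/2, i.e. roughly 1.0 m for a = 2 m/s^2 over 1 s.
disp = integrate_displacement([2.0] * 100, 0.01)
```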
- 7. The method for identifying a limb action according to claim 6, characterized in that step D15) further comprises:

w′_x = w_x + s_nx

w′_y = w_y + s_ny

w′_z = w_z + s_nz

where the initial coordinate is w(w_x, w_y, w_z), the coordinate after the limb movement is w′(w′_x, w′_y, w′_z), and s_nx, s_ny and s_nz are the components on the x, y and z axes, respectively, of the final displacement s_n calculated by performing the double integration in turn on each acceleration datum in the acceleration data sequence.
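The update of claim 7 is plain componentwise addition; a one-line sketch (function name illustrative):

```python
def apply_displacement(w, s_n):
    """Claim 7 / step D15): add the x, y, z components of the final
    displacement s_n to the initial coordinate w to obtain the
    coordinate w' after the limb movement."""
    wx, wy, wz = w
    snx, sny, snz = s_n
    return (wx + snx, wy + sny, wz + snz)

moved = apply_displacement((1.0, 2.0, 3.0), (0.5, -0.5, 0.0))
```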
- 8. The method for identifying a limb action according to claim 7, characterized in that the correction factors include a correction factor for each axial direction, each such factor being used to correct the displacement in that axial direction; the correction factor for an axial direction is obtained by making the limb move a set distance along that axis, computing a calculated value of the set distance from the sensor data, subtracting the set distance from the calculated value, and dividing the difference by the set distance; and when correcting a displacement in an axial direction, the corrected displacement is obtained by dividing the displacement in that axial direction by the sum of that axis's correction factor and the constant 1.
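Claim 8's calibration arithmetic, written out as a sketch; the function names and the example numbers are illustrative, not from the patent.

```python
def correction_factor(calculated, set_distance):
    """Claim 8: move the limb a known set distance along one axis,
    compute that distance from the sensor data, and take
        k = (calculated - set_distance) / set_distance."""
    return (calculated - set_distance) / set_distance

def corrected_displacement(s, k):
    """Correct a later displacement s on that axis by dividing it by
    the sum of the correction factor and the constant 1."""
    return s / (1.0 + k)

# If the sensor data yield 1.1 m for a true 1.0 m calibration move,
# k = 0.1 and later displacements on that axis are scaled down by 1.1:
k = correction_factor(1.1, 1.0)
fixed = corrected_displacement(0.55, k)  # close to 0.5
```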
- 9. A device for implementing the method for identifying a limb action according to claim 1, characterized by comprising:
a sensor data sequence acquisition unit, for acquiring the sensor data that an inertial sensor worn on a limb outputs as the limb moves, wherein the inertial sensor sequentially outputs, via a wireless communication module and at a set time interval, the multiple triaxial acceleration data it produces during the limb movement, yielding a sensor data sequence ordered by the time of the limb movement;
a preprocessing unit, for preprocessing the acquired sensor data sequence, eliminating its hardware linearity error, performing a coordinate conversion on it, and eliminating the influence of gravity during the conversion;
a coordinate acquisition unit, for obtaining the acceleration value sequence in the preprocessed sensor data sequence and performing an integral operation on it to obtain the coordinates of the sensor's position after the displacement;
an action judging unit, for obtaining the action type of the limb by combining a human body model with the coordinates of the inertial sensor before the displacement and the coordinates after the displacement.
- 10. The device according to claim 9, characterized by further comprising:
a correction unit, for optimizing the post-movement coordinates obtained above according to the position information transmitted before and after the limb movement by the multiple inertial sensors arranged on different joint positions of the moving limb, together with the fixed positional relationship of the joints on the limb, the optimized coordinates being used to confirm or correct the action type of the limb.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201710979685.2A CN107865662A (en) | 2017-10-19 | 2017-10-19 | A kind of method and device for identifying limb action |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201710979685.2A CN107865662A (en) | 2017-10-19 | 2017-10-19 | A kind of method and device for identifying limb action |
Publications (1)
Publication Number | Publication Date |
---|---|
CN107865662A true CN107865662A (en) | 2018-04-03 |
Family
ID=61753134
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201710979685.2A Pending CN107865662A (en) | 2017-10-19 | 2017-10-19 | A kind of method and device for identifying limb action |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN107865662A (en) |
Citations (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20090024353A1 (en) * | 2007-07-19 | 2009-01-22 | Samsung Electronics Co., Ltd. | Method of measuring pose of mobile robot and method and apparatus for measuring position of mobile robot using the same |
CN101535772A (en) * | 2006-11-07 | 2009-09-16 | 关键安全体系股份有限公司 | Linear displacement sensor |
CN101685308A (en) * | 2008-09-22 | 2010-03-31 | 鸿富锦精密工业(深圳)有限公司 | Robot state perception system |
CN101886927A (en) * | 2010-06-25 | 2010-11-17 | 武汉大学 | Three-dimensional motion tracking system and method based on inertial sensor and geomagnetic sensor |
CN102323854A (en) * | 2011-03-11 | 2012-01-18 | 中国科学院研究生院 | Human motion capture device |
CN102645974A (en) * | 2012-02-24 | 2012-08-22 | 姜展伟 | Positioning identification system and method of three-dimensional motions |
CN103115596A (en) * | 2013-03-08 | 2013-05-22 | 兰州大学 | Wireless sensor network node for displacement measurement and test method |
CN102667672B (en) * | 2009-07-07 | 2014-04-02 | 闫文闻 | Acceleration motion identify method and system thereof |
CN104461013A (en) * | 2014-12-25 | 2015-03-25 | 中国科学院合肥物质科学研究院 | Human body movement reconstruction and analysis system and method based on inertial sensing units |
US20150360080A1 (en) * | 2013-01-18 | 2015-12-17 | Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung e.V. | Determining a speed of a multidimensional motion in a global coordinate system |
CN106679649A (en) * | 2016-12-12 | 2017-05-17 | 浙江大学 | Hand movement tracking system and tracking method |
2017-10-19: Application filed — CN CN201710979685.2A patent/CN107865662A/en, status: active Pending
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109091151A (en) * | 2018-09-06 | 2018-12-28 | 中国人民解放军战略支援部队信息工程大学 | A kind of pedestrian's fall detection method and device based on MIMU |
CN109540132A (en) * | 2018-11-22 | 2019-03-29 | 中国矿业大学 | Localization method of the movable equipment based on sensor fusion on the person |
CN112729317A (en) * | 2020-12-17 | 2021-04-30 | 大陆投资(中国)有限公司 | Method for locating a vehicle and on-board system |
CN112729317B (en) * | 2020-12-17 | 2023-09-19 | 大陆投资(中国)有限公司 | Method for locating a vehicle and in-vehicle system |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN206479647U (en) | Alignment system and automobile | |
CN105102928B (en) | Inertial device, methods and procedures | |
CN107865662A (en) | A kind of method and device for identifying limb action | |
CN108061855B (en) | MEMS sensor based spherical motor rotor position detection method | |
CN106705968A (en) | Indoor inertial navigation algorithm based on posture recognition and step length model | |
CN107941217A (en) | A kind of robot localization method, electronic equipment, storage medium, device | |
CN108253963A (en) | A kind of robot active disturbance rejection localization method and alignment system based on Multi-sensor Fusion | |
CN102915130A (en) | Moving trajectory calibration method and moving trajectory generation method | |
CN103487011B (en) | A kind of attitude angle detection method of data glove | |
CN106445130A (en) | Motion capture glove for gesture recognition and calibration method thereof | |
CN107767425A (en) | A kind of mobile terminal AR methods based on monocular vio | |
CN109798891A (en) | Inertial Measurement Unit calibration system based on high-precision motion capture system | |
CN103175502A (en) | Attitude angle detecting method based on low-speed movement of data glove | |
CN113220119A (en) | Motion capture device of inertial sensor | |
CN105910606A (en) | Direction adjustment method based on angular velocity difference | |
CN107092882B (en) | Behavior recognition system based on sub-action perception and working method thereof | |
WO2021147391A1 (en) | Map generation method and device based on fusion of vio and satellite navigation system | |
CN107923740A (en) | Sensor device, sensing system and information processing equipment | |
CN112697131A (en) | Underground mobile equipment positioning method and system based on vision and inertial navigation system | |
CN104101293A (en) | Measurement machine station coordinate system unification system and method | |
CN111158482B (en) | Human body motion gesture capturing method and system | |
CN205430495U (en) | Augmented reality equipment and system | |
CN107402004B (en) | Attitude information acquisition method and device of sensor | |
CN112907633A (en) | Dynamic characteristic point identification method and application thereof | |
CN109343713B (en) | Human body action mapping method based on inertial measurement unit |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
RJ01 | Rejection of invention patent application after publication | ||
Application publication date: 20180403 |