CN115540899A - Step counting method, step counting device, step counting equipment and storage medium - Google Patents


Info

Publication number
CN115540899A
CN115540899A (application number CN202110744360.2A)
Authority
CN
China
Prior art keywords
leg, detected, data, vector, sampling
Prior art date
Legal status
Pending
Application number
CN202110744360.2A
Other languages
Chinese (zh)
Inventor
陈朝喜
Current Assignee
Beijing Xiaomi Mobile Software Co Ltd
Original Assignee
Beijing Xiaomi Mobile Software Co Ltd
Priority date
Filing date
Publication date
Application filed by Beijing Xiaomi Mobile Software Co Ltd filed Critical Beijing Xiaomi Mobile Software Co Ltd
Priority: CN202110744360.2A
Publication: CN115540899A
Legal status: Pending

Classifications

    • G — PHYSICS
    • G01 — MEASURING; TESTING
    • G01C — MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C 22/00 — Measuring distance traversed on the ground by vehicles, persons, animals or other moving solid bodies, e.g. using odometers, using pedometers
    • G01C 22/006 — Pedometers

Landscapes

  • Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)

Abstract

The application discloses a step counting method, a step counting device, step counting equipment and a storage medium, wherein the method comprises the following steps: acquiring sampling data of i groups of sensors of an object to be detected, wherein i is greater than or equal to 3; constructing a sampling matrix and a unit basis vector according to the sampling data of the i groups of sensors; detecting leg lifting actions and leg placing actions of the object to be detected based on the sampling matrix and the unit basis vector; and counting the steps of the object to be detected according to the leg lifting actions and the leg placing actions. This technical solution is not limited to a software-based detection mode: leg lifting actions can be determined accurately from the data sampled by the sensors, and accurate step counting can be achieved even when the system is powered off or shut down, so step counting is more flexible and system power consumption can be reduced.

Description

Step counting method, step counting device, step counting equipment and storage medium
Technical Field
The present invention relates to the field of terminal device technology, and in particular, to a step counting method, apparatus, device, and storage medium.
Background
With the development of science and technology and the increasing popularity of Micro-Electro-Mechanical Systems (MEMS) sensors, more and more electronic devices such as pedometers, mobile phones and smart watches are equipped with gyroscopes and accelerometers, and thus gain more and more functions, such as a step counting function.
At present, the related art performs step counting by means of software detection inside the system. However, this scheme requires the system to be in a powered-on state for step detection, which results in poor detection flexibility, inaccurate detection and high power consumption.
Disclosure of Invention
In view of the above-mentioned drawbacks and deficiencies of the prior art, it is desirable to provide a step counting method, apparatus, device and storage medium.
In a first aspect, the present application provides a step counting method, including:
acquiring sampling data of i groups of sensors of an object to be detected, wherein i is more than or equal to 3;
constructing a sampling matrix and a unit basis vector according to the sampling data of the i groups of sensors;
detecting leg lifting actions and leg releasing actions of the object to be detected based on the sampling matrix and the unit basis vector;
and counting steps of the object to be detected according to the leg lifting action and the leg placing action.
In a second aspect, the present application provides a step counting device comprising:
the acquisition module is used for acquiring sampling data of i groups of sensors of an object to be detected, wherein i is more than or equal to 3;
the construction module is used for constructing a sampling matrix and a unit basis vector according to the sampling data of the i groups of sensors;
the detection module is used for detecting leg lifting actions and leg releasing actions of the object to be detected based on the sampling matrix and the unit basis vector;
and the step counting module is used for counting steps of the object to be detected according to the leg lifting action and the leg placing action.
In a third aspect, an embodiment of the present application provides an electronic device, which includes a memory and a processor, where the memory stores a computer program, and the processor implements the step counting method when executing the computer program.
In a fourth aspect, an embodiment of the present application provides a computer-readable storage medium, on which a computer program is stored, where the computer program, when executed by a processor, implements the step counting method described above.
According to the step counting method, device, equipment and storage medium provided by the application, sampling data of i groups of sensors of an object to be detected are obtained, a sampling matrix and a unit basis vector are constructed according to the sampling data of the i groups of sensors, leg lifting actions and leg placing actions of the object to be detected are detected based on the sampling matrix and the unit basis vector, and the steps of the object to be detected are counted according to the leg lifting actions and the leg placing actions. This technical solution is not limited to a software-based detection mode: leg lifting actions can be determined accurately from the data sampled by the sensors, and accurate step counting can be achieved even when the system is powered off or shut down, so step counting is more flexible and system power consumption can be reduced.
Drawings
Other features, objects and advantages of the present application will become more apparent upon reading of the following detailed description of non-limiting embodiments thereof, made with reference to the accompanying drawings in which:
FIG. 1 is a schematic flow chart of a step counting method according to an embodiment of the present application;
FIG. 2 is a schematic diagram of the relationship between the Acc-x axis and sampling time according to an embodiment of the present application;
FIG. 3 is a schematic diagram of the relationship between the Acc-y axis and sampling time according to an embodiment of the present application;
FIG. 4 is a schematic diagram of the relationship between the Acc-z axis and sampling time according to an embodiment of the present application;
FIG. 5 is a schematic diagram of the relationship between the Gyr-x axis and sampling time according to an embodiment of the present application;
FIG. 6 is a schematic diagram of the relationship between the Gyr-y axis and sampling time according to an embodiment of the present application;
FIG. 7 is a schematic diagram of the relationship between the Gyr-z axis and sampling time according to an embodiment of the present application;
FIG. 8 is a schematic diagram of the relationship between Pressure data and sampling time according to an embodiment of the present application;
FIG. 9 is a schematic diagram of the relationship between light-sensor data and sampling time according to an embodiment of the present application;
FIG. 10 is a schematic diagram of the relationship between the Compass-x axis and sampling time according to an embodiment of the present application;
FIG. 11 is a schematic diagram of the relationship between the Compass-y axis and sampling time according to an embodiment of the present application;
FIG. 12 is a schematic diagram of the relationship between the Compass-z axis and sampling time according to an embodiment of the present application;
FIG. 13 is a schematic diagram of the relationship between the Grv-x axis and sampling time according to an embodiment of the present application;
FIG. 14 is a schematic diagram of the relationship between the Grv-y axis and sampling time according to an embodiment of the present application;
FIG. 15 is a schematic diagram of the relationship between the Grv-z axis and sampling time according to an embodiment of the present application;
FIG. 16 is a schematic diagram of the relationship between the Rot-x axis and sampling time according to an embodiment of the present application;
FIG. 17 is a schematic diagram of the relationship between the Rot-y axis and sampling time according to an embodiment of the present application;
FIG. 18 is a schematic diagram of the relationship between the Rot-z axis and sampling time according to an embodiment of the present application;
FIG. 19 is a schematic diagram of the relationship between the Ori-x axis and sampling time according to an embodiment of the present application;
FIG. 20 is a schematic diagram of the relationship between the Ori-y axis and sampling time according to an embodiment of the present application;
FIG. 21 is a schematic diagram of the relationship between the Ori-z axis and sampling time according to an embodiment of the present application;
fig. 22 is a schematic flowchart of a method for detecting a leg raising and lowering action of an object to be detected according to an embodiment of the present application;
fig. 23 is a schematic flowchart of a method for determining a movement distance of an object to be detected according to an embodiment of the present application;
fig. 24 is a schematic flowchart of a method for determining motion data of an object to be detected according to an embodiment of the present application;
FIG. 25 is a schematic structural diagram of a step counter according to an embodiment of the present application;
FIG. 26 is a schematic structural diagram of a step-counting device according to another embodiment of the present application;
fig. 27 is a schematic structural diagram of a computer system according to an embodiment of the present application.
Detailed Description
The present application will be described in further detail with reference to the following drawings and examples. It is to be understood that the specific embodiments described herein are merely illustrative of the relevant invention and not restrictive of the invention. It should be noted that, for convenience of description, only the portions related to the present invention are shown in the drawings.
It should be noted that the embodiments and features of the embodiments in the present application may be combined with each other without conflict. The present application will be described in detail below with reference to the embodiments with reference to the attached drawings.
It is understood that, with the advancement of technology and the improvement of living standards, MEMS sensors, with their advantages of small size, light weight, low power consumption and low cost, have been widely used as essential components of electronic devices. MEMS sensors can be used for step counting and can provide guidance data for users' health and training.
It should be noted that MEMS is a new multidisciplinary field that integrates microelectronics and precision machining. Its objective is to combine information acquisition, processing and execution into a multifunctional micro-system which can be integrated into larger systems, thereby greatly improving the automation, intelligence and reliability of those systems.
As mentioned in the Background, the related art performs step counting by means of software detection inside the system; however, step detection must be performed while the system is powered on, which results in poor detection flexibility, inaccurate detection and high power consumption.
In view of these defects, the present application provides a step counting method. Compared with the related art, this technical solution can accurately determine leg lifting and leg placing actions from the data sampled by the sensors without being limited to a software detection mode, and can achieve accurate step counting even when the system is powered off, so step counting is more flexible and system power consumption can be reduced.
The step counting method provided by the embodiment of the application can be applied to an application scene of step counting detection of the terminal. Alternatively, in the exemplary embodiments described below, the terminal may be a Mobile terminal, which may also be referred to as a User Equipment (UE), a Mobile Station (MS), or the like.
Optionally, the terminal may include a smart phone, a tablet computer, a palmtop computer, a notebook computer, a Mobile Internet Device (MID), a wearable device, a Virtual Reality (VR) device, an Augmented Reality (AR) device, and the like, and may also include a smart watch, a smart band, smart glasses, and the like. This is not particularly limited in the present application. The terminal may be provided with a plurality of sensor elements, which may include one or more sensors such as an acceleration sensor, a gyroscope, a barometer, a light sensor, a magnetometer, and a MEMS sensor.
For convenience of understanding and explanation, the step counting method, apparatus, device and storage medium provided in the embodiments of the present application are described in detail below with reference to fig. 1 to 27.
Fig. 1 is a schematic flowchart of a step counting method provided in an embodiment of the present application, and as shown in fig. 1, the step counting method is applied to a terminal, and includes:
s101, acquiring sampling data of i groups of sensors of an object to be detected, wherein i is more than or equal to 3.
Specifically, the sampling data of the i groups of sensors are acquired through the sensors in the terminal while the user performs leg lifting and leg placing actions; i is the number of sensors, and more sensors yield a more accurate step count. A sensor can be arranged on one hand or foot of the object to be detected, on both hands or feet, on wearable clothing via a signal line, or directly on the skin, as long as the sampling data of the object to be detected can be acquired through the sensor. The sensors may include one or more of an acceleration sensor, a gyroscope, and a magnetic field sensor. Optionally, the acceleration sensor may be a three-axis acceleration sensor, the gyroscope a three-axis gyroscope, and the magnetic field sensor a three-axis magnetic field sensor.
It should be noted that, as shown in fig. 2 to fig. 21, when the sensor is disposed on the hand, it may detect sampling data corresponding to hand-raising and hand-lowering actions. The sampling data may include, for example, Acc motion sampling data, Gyr angular velocity sampling data, Pressure sampling data, light-sensor values, Compass sampling data, Grv sampling data, Rot sampling data, or Ori sampling data.
The Acc motion sampling data can be obtained from the acceleration sensor, and comprises Acc-x axis, Acc-y axis and Acc-z axis motion data, i.e. acceleration values corresponding to the x-axis, y-axis and z-axis directions. The Gyr angular velocity sampling data can be obtained from the gyroscope, and comprises the Gyr-x axis, Gyr-y axis and Gyr-z axis angular velocities. The Pressure sampling data can be detected by the barometer, and comprises the detected air pressure value. The light-sensor value can be detected by the light sensor, and comprises the detected light intensity value. The Compass sampling data can be obtained from the magnetic field sensor, and comprises Compass-x axis, Compass-y axis and Compass-z axis motion data, i.e. the magnetic field strengths corresponding to the x-axis, y-axis and z-axis directions. The Grv sampling data comprises amplitude information of the three coordinate axes Grv-x, Grv-y and Grv-z. The Rot sampling data comprises amplitude information of the three coordinate axes Rot-x, Rot-y and Rot-z. The Ori sampling data comprises amplitude information of the three coordinate axes Ori-x, Ori-y and Ori-z.
And S102, constructing a sampling matrix and a unit basis vector according to the sampling data of the i groups of sensors.
Specifically, after the sampling data of the i groups of sensors of the object to be detected is acquired, the data points in the sampling data of each group of sensors can be determined; the sampling data of each group of sensors includes j data points, with j greater than or equal to 3. Then, based on the j data points corresponding to each group of sensors, the row vectors and column vectors of the sampling matrix are determined, and the sampling matrix is constructed according to a preset rule. The sampling matrix can be represented by the following formula:

S = [ S_11  S_12  …  S_1j ]
    [ S_21  S_22  …  S_2j ]
    [  ⋮     ⋮          ⋮  ]
    [ S_i1  S_i2  …  S_ij ]

where i represents the number of sensors, j represents the number of data points included in the sampling data of each group of sensors, and S_ij represents the j-th data point in the sampling data of the i-th group of sensors. Each row vector represents the sampling data of one group of sensors. For example, the sampling data corresponding to Acc-x, Acc-y, Acc-z, Gyr-x, Gyr-y and Gyr-z each form one row vector, the data points of the same sampling instant across sensors form one column vector, and the data points vary with time; the Acc-x row vector, for instance, contains S_11, S_12, …, S_1j.
Because the trend of the sampled data points from lowest to highest and back to lowest constitutes one period, if a row vector has fewer than 3 column elements, a complete action cannot be judged from the sampling matrix; therefore the number of column vectors j must be greater than or equal to 3. If there are fewer than three row vectors, relying on the detection results of only one or two sensors may make the step count inaccurate; therefore the number of row vectors must also be greater than or equal to 3.
It should be noted that, since the sensors come from different manufacturers, their sampling rates differ according to the characteristics of their internal circuits. In order to judge a leg lifting or leg placing action completely, it is necessary either to sample data points at the highest sampling rate among the sensors, or to set all sensors to a common sampling rate f for sampling, where f = n × j. Here the sampling rate f is the number of data points sampled per unit time, expressed in Hz; n represents the sampling frequency (the number of motion periods per unit time); and j represents the number of data points.
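As a concrete sketch of the construction described above, the following illustrative Python (function names and guard values are not from the original text) stacks i sensor channels, each truncated to j data points, into the i×j sampling matrix:

```python
import numpy as np

def build_sampling_matrix(sensor_streams, j):
    """Stack i sensor channels into an i x j sampling matrix S.

    sensor_streams: one 1-D sequence of samples per sensor channel
    j: data points kept per channel (at least 3, i.e. one full period)
    """
    if len(sensor_streams) < 3 or j < 3:
        raise ValueError("need i >= 3 sensor channels and j >= 3 samples")
    # Row k holds the sampled data of sensor k (S_k1 ... S_kj in the text);
    # column t holds all sensors' readings for sampling instant t.
    return np.array([np.asarray(s[:j], dtype=float) for s in sensor_streams])

# Three channels (e.g. Acc-x, Acc-y, Acc-z), three samples each.
S = build_sampling_matrix([[0.1, 0.9, 0.2],
                           [0.0, 1.1, 0.3],
                           [0.2, 0.8, 0.1]], j=3)
```

Each row of `S` then corresponds to one row vector of the matrix formula above, and each column to one sampling instant.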
It will be appreciated that walking is a periodic movement: one step comprises a leg lifting action and a leg placing action, forming one movement period, and each column vector spans at least one movement period. Each row vector of the sampling matrix is normalized to construct a unit column vector, which can be represented by the following formula:

E_j = [e_1 … e_j]^T

where e_j denotes the j-th sample element and E_j denotes the unit column vector. The modulus of the unit column vector is 1, i.e. |e_1|^2 + |e_2|^2 + … + |e_j|^2 = 1.
For any row vector, the variation trend between preceding and following elements can be judged: one rising trend followed by one falling trend represents one period, each row vector represents the sampling data of one sensor type, and each detected variation period is counted as one step. For the elements of the unit column vector, if a preceding element is smaller than the following element, that is

e_(k-1) < e_k (k = 2, …, j),

the data shows an increasing trend and a leg lifting action can be detected; if a preceding element is larger than the following element, that is

e_(k-1) > e_k (k = 2, …, j),

the data shows a decreasing trend and a leg placing action can be detected.
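The normalization and trend test described above can be sketched as follows (a hedged reading: `unit_vector` and `trend` are illustrative names, and the strictly-increasing / strictly-decreasing test is one simple way to realize the element-by-element comparison):

```python
import numpy as np

def unit_vector(row):
    # Scale one row of the sampling matrix so that
    # |e_1|^2 + |e_2|^2 + ... + |e_j|^2 = 1.
    row = np.asarray(row, dtype=float)
    n = np.linalg.norm(row)
    return row / n if n > 0 else row

def trend(e):
    # Consecutive-element comparison: strictly rising elements suggest
    # the leg-lifting half-period, strictly falling the leg-placing one.
    d = np.diff(e)
    if np.all(d > 0):
        return "raising"
    if np.all(d < 0):
        return "lowering"
    return "mixed"

e = unit_vector([1.0, 2.0, 3.0])
phase = trend(e)
```

Since normalization preserves the ordering of the elements, the trend can be read off either the raw row or its unit vector.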
Further, each column vector of the sampling matrix may be normalized to construct a unit row vector, which may be represented by the following formula:

E_i = [e_1 … e_i]

where e_i denotes the i-th sample element and E_i denotes the unit row vector. The modulus of the unit row vector is 1, i.e. |e_1|^2 + |e_2|^2 + … + |e_i|^2 = 1, and E_i · E_i^T = 1.
s103, detecting leg lifting actions and leg releasing actions of the object to be detected based on the sampling matrix and the unit basis vector.
And S104, counting steps of the object to be detected according to the leg lifting action and the leg placing action.
Optionally, on the basis of the foregoing embodiment, please refer to fig. 22, where the step S103 may include the following steps:
s201, determining a leg lifting and placing distance vector of the object to be detected based on the sampling matrix and the unit column vector.
S202, comparing the leg lifting and placing distance vector with a preset judgment threshold value vector to determine a first quantity value.
And S203, calculating a matrix operation result based on the leg lifting and releasing distance vector and the unit row vector.
And S204, detecting leg lifting actions and leg releasing actions of the object to be detected based on the matrix operation result and the first quantity value.
In this step, after the sampling matrix, the unit column vector and the unit row vector are obtained, the leg lifting and placing distance vector of the object to be detected can be determined based on the sampling matrix and the unit column vector, which can be represented by the following formula:

R = S · E_j = [R_1 … R_i]^T, with R_i = S_i1·e_1 + S_i2·e_2 + … + S_ij·e_j

where R_i represents the i-th leg lifting and placing element value, e_j denotes the j-th sample element, E_j denotes the unit column vector, and S_ij denotes the j-th data point in the sampling data of the i-th group of sensors. The leg lifting and placing distance vector thus contains i leg lifting and placing element values.
After the leg lifting and releasing distance vector of the object to be detected is determined, the leg lifting and releasing distance vector is compared with a preset judgment threshold value vector, and a first quantity value is determined.
It should be noted that the above judgment threshold vector is obtained in advance by training on a large amount of sample data of leg lifting and leg placing actions, and includes i judgment thresholds, one for each group of sensors in the i groups; the judgment thresholds of different sensors may be the same or different. For example, suppose there are 10 sensor types, sensor1 to sensor10, and sample data is acquired from ten thousand walks of a sample object, each sample containing one leg lifting action and one leg placing action. If the statistics show that a leg lifting action should be recognized when the sensor1 value is greater than 1 (and a leg placing action otherwise), and that a leg lifting action should be recognized when the sensor2 value is greater than 2 (and a leg placing action otherwise), then the judgment thresholds corresponding to sensor1 and sensor2 are determined to be 1 and 2 respectively. Similarly, the judgment threshold of each of the 10 sensor types can be obtained, and the 10 judgment thresholds are combined to obtain the judgment threshold vector.
Optionally, in comparing the leg lifting and placing distance vector with the preset judgment threshold vector to determine the first quantity value, each of the i leg lifting and placing element values may be compared with its corresponding judgment threshold, the number of element values greater than their corresponding thresholds may be counted, and that count taken as the first quantity value.
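The comparison that yields the first quantity value can be sketched as follows (names are illustrative; the thresholds are assumed to come from the offline training described above):

```python
import numpy as np

def first_quantity_value(distance_vector, thresholds):
    # Count the leg lifting/placing element values R_1..R_i that exceed
    # their per-sensor judgment thresholds; the count is the first
    # quantity value m.
    R = np.asarray(distance_vector, dtype=float)
    t = np.asarray(thresholds, dtype=float)
    return int(np.sum(R > t))

m = first_quantity_value([1.2, 0.5, 2.4], [1.0, 1.0, 2.0])  # 2 of 3 exceed
```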
After the first quantity value is obtained, a matrix operation result is calculated based on the leg lifting and placing distance vector and the unit row vector, and the leg lifting and leg placing actions of the object to be detected are then detected based on the matrix operation result and the first quantity value. Optionally, a second quantity value may be determined from the first quantity value and the number of sensors, and the matrix operation result compared with the second quantity value: when the matrix operation result is greater than or equal to the second quantity value, a leg lifting action of the object to be detected is detected; when the matrix operation result is smaller than the second quantity value, a leg placing action is detected. This can be represented by the following formula:

R = E_i · [R_1 … R_i]^T = E_i · S · E_j ≥ m / i  (leg lifting action)
R = E_i · [R_1 … R_i]^T = E_i · S · E_j < m / i  (leg placing action)

where R represents the matrix operation result, m represents the first quantity value, i represents the number of sensors, e_i denotes the i-th sample element, E_i denotes the unit row vector, e_j denotes the j-th sample element, E_j denotes the unit column vector, S_ij denotes the j-th data point in the sampling data of the i-th group of sensors, m/i represents the second quantity value, and R_i represents the leg lifting and placing distance vector.
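Putting the pieces together, one plausible reading of the decision rule can be sketched as follows. The text does not fully specify how E_i and E_j are constructed, so the mean-based unit vectors below are an assumption, as are all names:

```python
import numpy as np

def detect_action(S, thresholds):
    """Hedged sketch: build unit row/column vectors from S, compute the
    distance vector R_i = S @ E_j, count threshold exceedances (m), and
    compare R = E_i . S . E_j with the second value m / i."""
    S = np.asarray(S, dtype=float)
    i, _ = S.shape
    Ej = S.mean(axis=0)
    Ej = Ej / np.linalg.norm(Ej)      # unit column vector, |E_j| = 1
    Ei = S.mean(axis=1)
    Ei = Ei / np.linalg.norm(Ei)      # unit row vector, |E_i| = 1
    R_vec = S @ Ej                    # leg lifting/placing distance vector
    m = int(np.sum(R_vec > np.asarray(thresholds, dtype=float)))
    R = float(Ei @ R_vec)             # matrix operation result E_i . S . E_j
    return "raise" if R >= m / i else "place"
```

With uniformly positive samples and unreachable thresholds the rule reports a lift; with uniformly negative samples and zero thresholds it reports a place.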
In this step, after the leg lifting action of the first half period is detected, the leg placing action can be detected by a similar method. One leg lifting action followed by one leg placing action is recorded as one step, so detecting one pair of leg lifting and leg placing actions completes one count, and counting the number of such pairs realizes step counting for the object to be detected.
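The pairing rule just described (one lift followed by one place equals one step) can be sketched as a small state machine (names illustrative):

```python
def count_steps(actions):
    """Count one step per leg lift followed by a leg place, mirroring the
    rule that each lift/place pair in sequence is recorded as one step."""
    steps = 0
    expecting = "raise"
    for action in actions:
        if action == expecting:
            if action == "place":
                steps += 1
            # Alternate: after a lift we wait for a place, and vice versa.
            expecting = "place" if action == "raise" else "raise"
    return steps

n = count_steps(["raise", "place", "raise", "place", "raise"])  # 2 full steps
```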
It should be noted that the above detection process assumes the sensor is on one leg. If sensors are placed on both legs, data features may be extracted from each leg's sensors separately: a leg lifting and placing pair in one sensor's data is counted once, while the other leg's data remains stationary during that period. The same counting method applies equally to sensors worn on one hand or both hands.
According to the step counting method provided by the embodiment of the application, the sampling data of the i groups of sensors of the object to be detected are obtained, the sampling matrix and the unit basis vector are constructed according to the sampling data of the i groups of sensors, the leg lifting and leg placing actions of the object to be detected are detected based on the sampling matrix and the unit basis vector, and the steps of the object to be detected are counted according to the leg lifting and leg placing actions. This technical solution is not limited to a software-based detection mode: leg lifting actions can be determined accurately from the data sampled by the sensors, and accurate step counting can be achieved even when the system is powered off or shut down, so step counting is more flexible and system power consumption can be reduced.
Further, as an optional implementation manner, on the basis of the foregoing embodiment, fig. 23 is a schematic flowchart of a method for determining a movement distance provided in the embodiment of the present application, and as shown in fig. 23, the method includes:
s301, recording the period value of the motion of the object to be detected according to the preset fluctuation frequency corresponding to the sensor.
S302, acquiring data of sex, height and weight of the object to be detected.
S303, determining the movement step length of the object to be detected based on the sex, the height and the weight data of the object to be detected and a pre-trained path model.
S304, determining the movement distance of the object to be detected based on the movement step length and the period value.
In this embodiment, stride length and step frequency differ among users of different genders, ages, heights and weights. After the steps of the object to be detected are counted, the period value of its motion can be recorded according to the preset fluctuation frequency corresponding to the sensor; its gender, height and weight data are acquired; its movement step length is determined from these data and a pre-trained path model; and its movement distance is determined from the movement step length and the period value.
Specifically, a fused sensor period signal is determined based on the fused sensor sampling rate f, and the period value T of the motion of the object to be detected is recorded according to the fluctuation frequency F of the period signal, where f = n × F. After the gender, height and weight data of the object to be detected are acquired, the movement step length l of the object to be detected is determined from these data and the pre-trained path model, and the movement step length l is multiplied by the period value T to determine the movement distance D of the object to be detected, i.e. D = l × T.
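The final distance computation D = l × T is a single multiplication; a minimal sketch (the sample values are hypothetical):

```python
def movement_distance(step_length_m, period_count):
    # D = l * T: the trained stride length (here in metres) times the
    # number of recorded motion periods, i.e. the counted steps.
    return step_length_m * period_count

distance = movement_distance(0.7, 1000)  # 0.7 m stride over 1000 periods
```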
It should be noted that the route model may be constructed by the following steps: the method comprises the steps of firstly collecting sex, height, weight data and step length data of a sample object, determining a path model to be trained according to the sex, the height, the weight data and the step length data, and then calculating model parameters in the path model to be trained to obtain the path model.
For example, the gender s, height h, and weight w of ten thousand sample objects may be collected, and the step length l corresponding to each sample object counted. Taking the gender s, height h, and weight w data as independent variables and the step length l data as the dependent variable, the variation of the step length with the gender, height, and weight data can be obtained; the data may be plotted in a coordinate system to reveal the trend, from which a route model to be trained, l = s × (a × h + b × w), is determined. The gender s, height h, weight w, and step length l data of three sample objects may then be substituted into the route model to be trained to calculate the model parameters a and b, thereby obtaining the route model.
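Because l = s × (a × h + b × w) is linear in the parameters a and b once rewritten as l = a·(s·h) + b·(s·w), the parameter calculation described above can be done with an ordinary least-squares solve. A hedged sketch; the sample values and the numeric coding of gender s are illustrative assumptions:

```python
import numpy as np

def fit_route_model(s, h, w, l):
    """Fit l = s * (a*h + b*w) for parameters a, b by least squares.
    s, h, w, l are 1-D arrays over the sample objects; gender s is
    assumed to be already encoded as a numeric factor."""
    X = np.column_stack([s * h, s * w])  # model is linear in a and b
    (a, b), *_ = np.linalg.lstsq(X, l, rcond=None)
    return a, b

# Synthetic samples generated with a = 0.3, b = 0.1 recover the parameters.
s = np.array([1.0, 1.0, 0.9])
h = np.array([1.70, 1.80, 1.60])
w = np.array([65.0, 80.0, 55.0])
l = s * (0.3 * h + 0.1 * w)
a, b = fit_route_model(s, h, w, l)
```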
In this embodiment, the model coefficients of the route model can be determined, so that the corresponding route model is constructed with high universality; the movement distance of the object to be detected can then be accurately counted according to the personal data of the object to be detected, realizing a distance counting function.
Further, as an optional implementation based on the foregoing embodiment, fig. 24 is a schematic flowchart of a method for determining motion data provided in an embodiment of the present application. As shown in fig. 24, the method includes:
S401, determining the motion state of the object to be detected according to the fluctuation frequency.
In this embodiment, the motion state of the object to be detected can be determined according to the fluctuation frequency; motion states include running, walking, labor and the like. For example, when the fluctuation frequency is greater than a first preset frequency threshold, the motion state can be determined to be a labor state; when the fluctuation frequency is not greater than the first preset frequency threshold but greater than a second preset frequency threshold, the motion state can be determined to be a running state; and when the fluctuation frequency is not greater than the second preset frequency threshold, the motion state can be determined to be a walking state. Different motion states consume different amounts of energy: the walking state consumes the least, the running state the second most, and the labor state the most.
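The two-threshold classification described above can be sketched as a simple comparison chain. The threshold values below are illustrative assumptions, not figures from the patent:

```python
def motion_state(fluctuation_freq_hz: float,
                 first_threshold: float = 3.5,
                 second_threshold: float = 2.0) -> str:
    """Above the first threshold -> labor; between the two thresholds
    -> running; at or below the second -> walking.
    Requires first_threshold > second_threshold."""
    if fluctuation_freq_hz > first_threshold:
        return "labor"
    if fluctuation_freq_hz > second_threshold:
        return "running"
    return "walking"
```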
S402, determining the total consumed energy of the object to be detected in the motion state based on the sex, the height and the weight data of the object to be detected and the pre-trained energy model in the motion state.
And S403, determining the motion data of the object to be detected based on the total consumed energy.
In this embodiment, after the motion state of the object to be detected is determined, the gender, height and weight data of the object to be detected can be obtained; the total consumed energy of the object to be detected in the motion state is determined based on those data and the energy model trained in advance for the motion state; and the motion data of the object to be detected is determined according to the total consumed energy. Optionally, the motion data may be data such as a motion plan recommended for the object to be detected.
Specifically, the gender s, weight w and height h of the object to be detected can be obtained; the energy model trained in advance for the motion state is determined according to the unit consumed energy e of that motion state; and the gender s, weight w and height h data of the object to be detected are substituted into the energy model corresponding to the motion state, obtaining the total consumed energy E of the object to be detected in the motion state.
It should be noted that the energy model may be constructed by the following steps: the method comprises the steps of firstly obtaining unit consumption energy of a sample object in a movement path, collecting gender, height and weight data of the sample object and total consumption energy in the movement path, then determining an energy model to be trained according to the gender, the height, the weight data, the unit consumption energy and the total consumption energy in the movement path, and calculating model parameters in the energy model to be trained to obtain the energy model.
For example, the gender s, height h, and weight w of ten thousand sample objects may be collected, together with the corresponding movement distance l (for example, one kilometer). The unit consumed energy e of a sample object moving one kilometer in a given motion state, and its total consumed energy E, may be detected by a standard energy detection device; the total consumed energy E of a sample object equals the movement distance l multiplied by the unit consumed energy e. Taking the gender s, height h, weight w data and the unit consumed energy e as independent variables and the total consumed energy E as the dependent variable, the variation of the total consumed energy with these data can be obtained; the data may be plotted in a coordinate system to reveal the trend, from which an energy model to be trained, E = l × e = s × (a × h + b × w) × e, is determined, where the unit consumed energy e differs across motion states. The gender s, height h, weight w data and unit consumed energy e of three sample objects may then be substituted into the energy model to be trained to calculate the model parameters a and b, thereby obtaining the energy model corresponding to each motion state.
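The same least-squares treatment applies to the energy model, since E = s(a·h + b·w)·e is linear in a and b after rewriting as E = a·(s·h·e) + b·(s·w·e). A sketch under the same illustrative assumptions as before (numeric gender coding, synthetic sample values, kcal-per-km units):

```python
import numpy as np

def fit_energy_model(s, h, w, e, E):
    """Fit E = s * (a*h + b*w) * e for parameters a, b by least squares.
    e is the unit consumed energy for the motion state; E is the total
    consumed energy over the movement distance."""
    X = np.column_stack([s * h * e, s * w * e])  # linear in a and b
    (a, b), *_ = np.linalg.lstsq(X, E, rcond=None)
    return a, b

# Synthetic samples generated with a = 0.5, b = 0.02 recover the parameters.
s = np.array([1.0, 0.9, 1.0])
h = np.array([1.75, 1.60, 1.85])
w = np.array([70.0, 55.0, 90.0])
e = np.array([60.0, 60.0, 60.0])   # assumed unit energy, kcal per km
E = s * (0.5 * h + 0.02 * w) * e
a_fit, b_fit = fit_energy_model(s, h, w, e, E)
```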
In this embodiment, the model coefficients of the energy model can be determined, so that the energy model is accurately constructed with high universality. The total consumed energy of the object to be detected in different motion states can then be accurately counted according to the detected motion state and the personal data of the object to be detected, providing data guidance for the user's health and exercise recommendations; motion detection can thus be personalized based on the personal data of the object to be detected, further improving user experience.
It should be noted that while the operations of the methods of the present invention are depicted in the drawings in a particular order, this does not require or imply that these operations must be performed in that particular order, or that all of the illustrated operations must be performed, to achieve desirable results. Rather, the steps depicted in the flowcharts may be executed in a different order. Additionally or alternatively, certain steps may be omitted, multiple steps may be combined into one, and/or one step may be broken down into multiple steps.
On the other hand, fig. 25 is a schematic structural diagram of a step counting device according to an embodiment of the present application. As shown in fig. 25, the apparatus may include:
the acquisition module 510 is used for acquiring sampling data of i groups of sensors of an object to be detected, wherein i is more than or equal to 3;
a constructing module 520, configured to construct a sampling matrix and a unit basis vector according to the sampling data of the i groups of sensors;
a detection module 530, configured to detect leg lifting and leg releasing actions of an object to be detected based on the sampling matrix and the unit basis vector;
and the step counting module 540 is configured to count steps of the object to be detected according to the leg raising action and the leg placing action.
Optionally, as shown in fig. 26, the building module 520 includes:
the first determining unit 521 is configured to determine data points in the sampled data of each group of sensors in the sampled data of i groups of sensors, where the sampled data of each group of sensors includes j data points, and j is greater than or equal to 3;
a second determining unit 522, configured to determine a row vector and a column vector of the sampling matrix based on j data points corresponding to each group of sampling data;
a first constructing unit 523 configured to construct a sampling matrix according to a preset rule based on the row vector and the column vector;
a second constructing unit 524, configured to perform normalization processing on each row vector of the sampling matrix to construct a unit column vector;
the third constructing unit 525 is configured to perform normalization processing on each column vector of the sampling matrix to construct a unit row vector.
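The construction module's units can be sketched as follows: stack the i groups of j data points into an i×j sampling matrix, then normalize each row and each column to unit length. The "preset rule" for assembling the matrix is not detailed in the text, so simple row stacking is assumed here, and the names are illustrative:

```python
import numpy as np

def build_sampling_matrices(samples):
    """samples: i x j array-like, one row per sensor group (i, j >= 3).
    Returns the sampling matrix together with its row-normalized and
    column-normalized forms, standing in for the unit column vector
    and unit row vector construction."""
    A = np.asarray(samples, dtype=float)
    unit_rows = A / np.linalg.norm(A, axis=1, keepdims=True)  # each row -> length 1
    unit_cols = A / np.linalg.norm(A, axis=0, keepdims=True)  # each column -> length 1
    return A, unit_rows, unit_cols

A, unit_rows, unit_cols = build_sampling_matrices(
    [[1.0, 2.0, 2.0], [3.0, 0.0, 4.0], [0.0, 5.0, 12.0]])
```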
Optionally, the detecting module 530 includes:
a third determining unit 531, configured to determine a leg lifting and dropping distance vector of the object to be detected based on the sampling matrix and the unit column vector;
a fourth determining unit 532, configured to compare the leg lifting and dropping distance vector with a preset judgment threshold vector, and determine a first quantity value;
a calculating unit 533, configured to calculate a matrix operation result based on the leg lifting and lowering distance vector and the unit row vector;
the detecting unit 534 is configured to detect leg raising and leg releasing actions of the object to be detected based on the matrix operation result and the first quantitative value.
Optionally, the fourth determining unit 532 is specifically configured to:
determining i lifting and releasing leg element values from the lifting and releasing leg distance vector;
comparing each lifting and releasing leg element value in the i lifting and releasing leg element values with a corresponding judgment threshold value;
determining the number of elements with the leg lifting and releasing element values larger than the corresponding judgment threshold value from the i leg lifting and releasing element values;
the number of elements is taken as a first quantity value.
Optionally, the detecting unit 534 is specifically configured to:
determining a second quantity value based on the first quantity value and the number of the sensors;
comparing the matrix operation result with the second numerical value;
when the matrix operation result is larger than or equal to the second numerical value, detecting that the object to be detected is in leg lifting action;
and when the matrix operation result is smaller than the second numerical value, detecting that the object to be detected is the leg placing action.
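The detection units above can be sketched as follows: count the elements of the leg lift/lower distance vector exceeding their per-sensor thresholds (the first quantity value), derive a second value from that count and the sensor count, and compare it with the matrix operation result. The patent does not give the exact combination formula or matrix operation, so a simple ratio and a dot product with the unit row vector are assumed:

```python
import numpy as np

def detect_action(distance_vec, thresholds, unit_row, sensor_count):
    """Return "leg lift" or "leg lower" for one detection window.
    first_value: number of distance elements above their thresholds.
    second_value: assumed to be first_value / sensor_count.
    result: assumed matrix operation (dot product with unit row vector)."""
    d = np.asarray(distance_vec, dtype=float)
    first_value = int(np.sum(d > np.asarray(thresholds)))  # first quantity value
    second_value = first_value / sensor_count              # assumed combination rule
    result = float(np.dot(d, np.asarray(unit_row)))        # matrix operation result
    return "leg lift" if result >= second_value else "leg lower"
```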
Optionally, the apparatus is further configured to:
recording the periodic value of the motion of the object to be detected according to the preset fluctuation frequency corresponding to the sensor;
acquiring data of gender, height and weight of a to-be-detected object;
determining the movement step length of the object to be detected based on the sex, the height and the weight data of the object to be detected and a pre-trained path model;
and determining the movement distance of the object to be detected based on the movement step length and the period value.
Optionally, the route model is constructed by the following steps:
collecting sex, height, weight data and step length data of the sample object;
determining a path model to be trained according to the gender, the height, the weight data and the step length data;
and calculating model parameters in the path model to be trained to obtain the path model.
Optionally, the apparatus is further configured to:
determining the motion state of the object to be detected according to the fluctuation frequency;
determining the total energy consumption of the object to be detected in the motion state based on the sex, the height and the weight data of the object to be detected and an energy model which is trained in advance in the motion state;
and determining the motion data of the object to be detected based on the total consumed energy.
Optionally, the energy model is constructed by the following steps:
acquiring unit consumption energy of the sample object in the movement path;
collecting gender, height and weight data of the sample object and total consumed energy in the movement distance;
determining an energy model to be trained according to the sex, the height, the weight data, the unit consumption energy and the total consumption energy in the movement distance;
and calculating model parameters in the energy model to be trained to obtain the energy model.
It can be understood that the functions of the functional modules of the step counting device provided in this embodiment may be specifically implemented according to the method in the foregoing method embodiment, and the specific implementation process may refer to the related description of the foregoing method embodiment, which is not described herein again.
On the other hand, fig. 27 is a schematic diagram of a hardware structure of a terminal device according to an embodiment of the present application. The terminal device provided by the embodiment of the present application comprises a memory, a processor, and a computer program stored on the memory and executable on the processor; when the processor executes the program, the step counting method described in the foregoing embodiments is implemented. Referring to fig. 27, fig. 27 is a schematic structural diagram of a computer system of a terminal device according to an embodiment of the present application.
As shown in fig. 27, the computer system 1300 includes a Central Processing Unit (CPU) 1301 that can perform various appropriate actions and processes according to a program stored in a Read-Only Memory (ROM) 1302 or a program loaded from a storage portion 1308 into a Random Access Memory (RAM) 1303. The RAM 1303 also stores various programs and data necessary for the operation of the system 1300. The CPU 1301, the ROM 1302, and the RAM 1303 are connected to each other via a bus 1304. An input/output (I/O) interface 1305 is also connected to the bus 1304.
The following components are connected to the I/O interface 1305: an input portion 1306 including a keyboard, a mouse, and the like; an output portion 1307 including a display such as a Cathode Ray Tube (CRT) or a Liquid Crystal Display (LCD), and a speaker; a storage portion 1308 including a hard disk and the like; and a communication section 1309 including a network interface card such as a LAN card or a modem. The communication section 1309 performs communication processing via a network such as the Internet. A drive 1310 is also connected to the I/O interface 1305 as needed. A removable medium 1311, such as a magnetic disk, an optical disk, a magneto-optical disk, or a semiconductor memory, is mounted on the drive 1310 as necessary, so that a computer program read therefrom is installed into the storage portion 1308 as necessary.
In particular, according to embodiments of the present application, the processes described above with reference to the flowcharts may be implemented as computer software programs. For example, embodiments of the present application include a computer program product comprising a computer program embodied on a machine-readable medium, the computer program comprising program code for performing the method illustrated by the flowchart. In such an embodiment, the computer program may be downloaded and installed from a network through the communication section 1309 and/or installed from the removable medium 1311. When executed by the Central Processing Unit (CPU) 1301, the computer program performs the above-described functions defined in the system of the present application.
It should be noted that the computer readable medium shown in the present application may be a computer readable signal medium or a computer readable storage medium or any combination of the two. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples of the computer readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the context of this application, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. In this application, however, a computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: wireless, wire, fiber optic cable, RF, etc., or any suitable combination of the foregoing.
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present application. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems that perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The units or modules described in the embodiments of the present application may be implemented by software or hardware. The described units or modules may also be provided in a processor, which may be described as: a processor comprising an acquisition module, a construction module, a detection module, and a step counting module. The names of these units or modules do not limit the units or modules themselves; for example, the acquisition module may also be described as "a module for acquiring the sampling data of i groups of sensors of an object to be detected, i ≧ 3".
As another aspect, the present application also provides a computer-readable storage medium, which may be included in the electronic device described in the above embodiments; or may be separate and not incorporated into the electronic device. The computer-readable storage medium stores one or more programs that, when executed by one or more processors, perform the step-counting method described in the present application:
acquiring sampling data of i groups of sensors of an object to be detected, wherein i is more than or equal to 3;
constructing a sampling matrix and a unit basis vector according to the sampling data of the i groups of sensors;
detecting leg lifting actions and leg releasing actions of the object to be detected based on the sampling matrix and the unit basis vector;
and counting the steps of the object to be detected according to the leg lifting action and the leg placing action.
To sum up, the step counting method, device, equipment and storage medium provided by the embodiments of the present application acquire the sampling data of i groups of sensors of the object to be detected, construct a sampling matrix and a unit basis vector from that sampling data, detect the leg raising and leg releasing actions of the object to be detected based on the sampling matrix and the unit basis vector, and count the steps of the object to be detected according to those actions. This technical scheme is not limited by a software-based detection mode: the leg raising action can be accurately determined from the sampling data detected by the sensors, and accurate step counting can be achieved even when the system is powered off or shut down, so that step counting is more flexible while system power consumption is reduced.
The above description is only a preferred embodiment of the application and is illustrative of the principles of the technology employed. It will be appreciated by a person skilled in the art that the scope of the invention as referred to in the present application is not limited to the embodiments with a specific combination of the above-mentioned features, but also covers other embodiments with any combination of the above-mentioned features or their equivalents without departing from the inventive concept. For example, the above features may be replaced with (but not limited to) features having similar functions disclosed in the present application.

Claims (12)

1. A method of step counting, the method comprising:
acquiring sampling data of i groups of sensors of an object to be detected, wherein i is more than or equal to 3;
constructing a sampling matrix and a unit basis vector according to the sampling data of the i groups of sensors;
detecting leg lifting actions and leg releasing actions of the object to be detected based on the sampling matrix and the unit basis vector;
and counting the steps of the object to be detected according to the leg lifting action and the leg placing action.
2. The method of claim 1, wherein the unit basis vectors comprise unit column vectors and unit row vectors, and wherein constructing a sampling matrix and a unit basis vector from the sampled data of the i groups of sensors comprises:
determining data points in the sampling data of each group of sensors in the i groups of sensors, wherein the sampling data of each group of sensors comprises j data points, and j is more than or equal to 3;
determining a row vector and a column vector of the sampling matrix based on j data points corresponding to each group of sampling data;
constructing the sampling matrix according to a preset rule according to the row vector and the column vector;
normalizing each row vector of the sampling matrix to construct the unit column vector;
and carrying out normalization processing on each column vector of the sampling matrix to construct the unit row vector.
3. The method according to claim 2, wherein detecting leg raising and leg lowering actions of the object to be detected based on the sampling matrix and the unit basis vector comprises:
determining a leg lifting and releasing distance vector of the object to be detected based on the sampling matrix and the unit column vector;
comparing the leg lifting and releasing distance vector with a preset judgment threshold vector to determine a first quantity value;
calculating a matrix operation result based on the leg lifting and releasing distance vector and the unit row vector;
and detecting leg lifting action and leg releasing action of the object to be detected based on the matrix operation result and the first quantity value.
4. The method of claim 3, wherein the decision threshold vector comprises i decision thresholds corresponding to each of the i groups of sensors, and comparing the leg-lift distance vector with a predetermined decision threshold vector to determine a first quantity value comprises:
determining i leg lifting and releasing element values from the leg lifting and releasing distance vector;
comparing each lifting and releasing leg element value in the i lifting and releasing leg element values with a corresponding judgment threshold value;
determining the number of elements with the leg lifting and releasing element values larger than the corresponding judgment threshold value from the i leg lifting and releasing element values;
taking the element number as a first number value.
5. The method according to claim 3, wherein detecting leg raising and leg releasing actions of the object to be detected based on the matrix operation result and the first quantitative value comprises:
determining a second numerical value based on the first numerical value and the number of the sensors;
comparing the matrix operation result with the second numerical value;
when the matrix operation result is larger than or equal to the second numerical value, detecting that the object to be detected is in leg lifting action;
and when the matrix operation result is smaller than the second numerical value, detecting that the object to be detected is a leg placing action.
6. The method according to claim 1, wherein after counting the steps of the object to be detected, the method further comprises:
recording the periodic value of the motion of the object to be detected according to the preset fluctuation frequency corresponding to the sensor;
acquiring data of the sex, the height and the weight of the object to be detected;
determining the movement step length of the object to be detected based on the gender, height and weight data of the object to be detected and a pre-trained path model;
and determining the movement distance of the object to be detected based on the movement step length and the period value.
7. The method according to claim 6, wherein the route model is constructed by:
collecting sex, height, weight data and step length data of a sample object;
determining a path model to be trained according to the gender, the height, the weight data and the step length data;
and calculating model parameters in the path model to be trained to obtain the path model.
8. The method of claim 7, wherein after counting the steps of the object to be detected, the method further comprises:
determining the motion state of the object to be detected according to the fluctuation frequency;
determining the total energy consumption of the object to be detected in the motion state based on the sex, the height and the weight data of the object to be detected and an energy model trained in advance in the motion state;
and determining the motion data of the object to be detected based on the total consumed energy.
9. The method of claim 8, wherein the energy model is constructed by:
acquiring unit consumption energy of the sample object in the movement distance;
collecting gender, height, weight data and total energy expenditure within the movement path of the sample subject;
determining an energy model to be trained according to the sex, the height, the weight data, the unit consumption energy and the total consumption energy in the movement distance;
and calculating model parameters in the energy model to be trained to obtain the energy model.
10. A step counting device, characterized in that the device comprises:
the acquisition module is used for acquiring sampling data of i groups of sensors of an object to be detected, wherein i is more than or equal to 3;
the construction module is used for constructing a sampling matrix and a unit basis vector according to the sampling data of the i groups of sensors;
the detection module is used for detecting leg lifting actions and leg releasing actions of the object to be detected based on the sampling matrix and the unit basis vector;
and the step counting module is used for counting steps of the object to be detected according to the leg lifting action and the leg placing action.
11. A computer device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, characterized in that the processor implements the step counting method according to any one of claims 1 to 9 when executing the program.
12. A computer-readable storage medium, on which a computer program is stored, which, when being executed by a processor, carries out a step counting method according to any one of claims 1 to 9.
CN202110744360.2A 2021-06-30 2021-06-30 Step counting method, step counting device, step counting equipment and storage medium Pending CN115540899A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110744360.2A CN115540899A (en) 2021-06-30 2021-06-30 Step counting method, step counting device, step counting equipment and storage medium

Publications (1)

Publication Number Publication Date
CN115540899A true CN115540899A (en) 2022-12-30

Family

ID=84722970

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110744360.2A Pending CN115540899A (en) 2021-06-30 2021-06-30 Step counting method, step counting device, step counting equipment and storage medium

Country Status (1)

Country Link
CN (1) CN115540899A (en)


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination