CN114602155B - Swimming information statistical method, computer-readable storage medium and electronic device - Google Patents


Info

Publication number
CN114602155B
Authority
CN
China
Prior art keywords
swimming
user
data
electronic device
electronic equipment
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202210509907.5A
Other languages
Chinese (zh)
Other versions
CN114602155A (en)
Inventor
邸皓轩
李丹洪
张晓武
陈政
Current Assignee
Honor Device Co Ltd
Original Assignee
Honor Device Co Ltd
Priority date
Filing date
Publication date
Application filed by Honor Device Co Ltd
Priority to CN202210509907.5A
Publication of CN114602155A
Application granted
Publication of CN114602155B
Status: Active
Anticipated expiration

Classifications

    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63B APPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B69/00 Training appliances or apparatus for special sports
    • A63B71/00 Games or sports accessories not covered in groups A63B1/00 - A63B69/00
    • A63B71/06 Indicating or scoring devices for games or players, or for other sports activities
    • A63B71/0619 Displays, user interfaces and indicating devices, specially adapted for sport equipment, e.g. display mounted on treadmills
    • A63B71/0622 Visual, audio or audio-visual systems for entertaining, instructing or motivating the user
    • A63B2071/0625 Emitting sound, noise or music
    • A63B2071/063 Spoken or verbal instructions
    • A63B2071/065 Visualisation of specific exercise parameters
    • A63B2071/0658 Position or arrangement of display
    • A63B2071/0661 Position or arrangement of display arranged on the user
    • A63B2071/0663 Position or arrangement of display arranged on the user worn on the wrist, e.g. wrist bands
    • A63B2220/00 Measuring of physical parameters relating to sporting activity
    • A63B2220/40 Acceleration

Landscapes

  • Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Physical Education & Sports Medicine (AREA)
  • Human Computer Interaction (AREA)
  • Multimedia (AREA)
  • Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)

Abstract

The present application relates to the technical field of wearable devices, and discloses a swimming information statistics method, a computer-readable storage medium, and an electronic device. The method is applied to a system comprising a first electronic device and a second electronic device, where the first electronic device is worn on a first part of a user, the second electronic device is worn on a second part of the user, and, across different swimming strokes, the motion trajectories of the first part differ less than the motion trajectories of the second part. The method comprises the following steps: the first electronic device acquires first motion data of the user; the first electronic device acquires second motion data collected by the second electronic device while the user swims; the first electronic device integrates the first motion data and the second motion data to obtain integrated motion data; and the first electronic device counts the user's swimming information based on the integrated motion data. With this technical scheme, the swimming information counted by the first electronic device is more accurate, which helps improve the user experience.

Description

Swimming information statistical method, computer-readable storage medium and electronic device
Technical Field
The present application relates to the technical field of wearable devices, and in particular to a swimming information statistics method, a computer-readable storage medium, and an electronic device.
Background
As living standards improve, people's awareness of exercise and health is gradually growing. Swimming offers many benefits, such as improving cardiopulmonary function, improving body coordination, and strengthening the respiratory muscles, so more and more people choose swimming to build fitness. At the same time, people want to accurately monitor and record their swimming-related data in order to understand their exercise condition more comprehensively.
Smart watches can monitor swimming data; they are relatively inexpensive, convenient to wear, and widely adopted. However, their sensors are limited, and differences in users' wearing habits easily affect the accuracy of the monitored swimming data.
Disclosure of Invention
In view of the above, the present application provides a swimming information statistics method, a computer-readable storage medium, and an electronic device.
In the technical scheme of the present application, a first electronic device worn on a first part of the user (for example, a sports watch worn on the user's wrist) integrates first motion data of the user collected by itself with second motion data of the user acquired from a second electronic device worn on a second part of the user (for example, a sports headset worn on the user's ear), and then counts the user's swimming information. Across different swimming strokes, the motion trajectories of the first part differ less than the motion trajectories of the second part; for example, even when the user's hand movements are similar across strokes, the head movements differ considerably. Therefore, after integrating the motion data collected by the first electronic device with the motion data collected by the second electronic device, the user's swimming information can be counted more accurately; for example, the user's swimming stroke, stroke count, stroke frequency, and so on can be distinguished more accurately.
In a first aspect, an embodiment of the present application provides a swimming information statistics method applied to a system including a first electronic device and a second electronic device, where the first electronic device is worn on a first part of a user, the second electronic device is worn on a second part of the user, and, under different swimming strokes, the difference between the motion trajectories of the first part is smaller than the difference between the motion trajectories of the second part. The method includes:
the first electronic device acquires first motion data of the user during swimming;
the first electronic device acquires second motion data collected by the second electronic device while the user swims;
the first electronic device integrates the first motion data and the second motion data to obtain integrated motion data;
the first electronic device counts the user's swimming information based on the integrated motion data.
In a possible implementation of the first aspect, the integrating, by the first electronic device, the first motion data and the second motion data to obtain integrated motion data includes:
the first electronic device performs framing on the first motion data and the second motion data to obtain the integrated motion data.
In a possible implementation of the first aspect, the framing, by the first electronic device, the first motion data and the second motion data to obtain integrated motion data includes:
the first electronic device performs down-sampling on the first motion data according to the frame rate of the second motion data to obtain down-sampled first motion data, where the frame rate of the down-sampled first motion data is the same as that of the second motion data.
In a possible implementation of the first aspect, the framing, by the first electronic device, the first motion data and the second motion data to obtain integrated motion data includes:
the first electronic device performs interpolation on the second motion data according to the frame rate of the first motion data to obtain interpolated second motion data, where the frame rate of the interpolated second motion data is the same as that of the first motion data.
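Taken together, the two framing implementations above amount to aligning the two sensor streams to a common frame rate. The sketch below illustrates both directions, assuming a simple integer-ratio decimation for down-sampling and linear interpolation for up-sampling; the function names and example rates are illustrative, not from the patent:

```python
import numpy as np

def downsample(watch_data: np.ndarray, watch_hz: int, target_hz: int) -> np.ndarray:
    """Keep every k-th sample so the watch stream matches the lower earphone frame rate."""
    step = watch_hz // target_hz  # assumes an integer rate ratio
    return watch_data[::step]

def interpolate(ear_data: np.ndarray, ear_hz: int, target_hz: int) -> np.ndarray:
    """Linearly interpolate the earphone stream up to the watch frame rate."""
    n = len(ear_data)
    old_t = np.arange(n) / ear_hz
    new_t = np.arange(int(n * target_hz / ear_hz)) / target_hz
    return np.interp(new_t, old_t, ear_data)

# Example: a 100 Hz watch stream and a 25 Hz earphone stream, both 4 s long
watch = np.arange(400, dtype=float)
ear = np.arange(100, dtype=float)
aligned_down = downsample(watch, 100, 25)  # 100 samples at 25 Hz
aligned_up = interpolate(ear, 25, 100)     # 400 samples at 100 Hz
```

Either way, the two streams end up sample-aligned so that one frame of integrated motion data contains both devices' readings for the same time span.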
In a possible implementation of the first aspect, the user's swimming information includes the user's swimming stroke, and the counting, by the first electronic device, of the user's swimming information based on the integrated motion data includes:
the first electronic device performs feature extraction on the integrated motion data to obtain time-domain features of the integrated motion data;
the first electronic device inputs the time-domain features of the integrated motion data into a preset swimming stroke recognition model to obtain a swimming stroke recognition result;
the first electronic device counts the user's swimming stroke based on the swimming stroke recognition result.
In a possible implementation of the first aspect, the preset swimming stroke recognition model is a binary classification model.
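As a rough illustration of the time-domain feature extraction described above, the sketch below computes a few common per-axis statistics over one frame of the integrated data and concatenates them into a feature vector. The specific features, window size, and channel layout are assumptions for illustration, and the preset swimming stroke recognition model itself is left abstract:

```python
import numpy as np

def time_domain_features(window: np.ndarray) -> np.ndarray:
    """Per-channel time-domain features over one frame of integrated data.

    `window` has shape (samples, channels); the channels could be, e.g., the
    watch's ACC/GYRO axes plus the earphone's ACC/GYRO axes after frame-rate
    alignment. Returns one flat feature vector for the classifier.
    """
    feats = [
        window.mean(axis=0),                            # mean
        window.std(axis=0),                             # standard deviation
        window.max(axis=0) - window.min(axis=0),        # peak-to-peak range
        np.abs(np.diff(window, axis=0)).mean(axis=0),   # mean absolute delta
    ]
    return np.concatenate(feats)

# One 2 s frame at 100 Hz with 12 channels (hypothetical layout)
window = np.random.default_rng(0).normal(size=(200, 12))
features = time_domain_features(window)  # 4 features x 12 channels
```

The resulting vector would then be fed to the recognition model to produce the stroke label for that frame.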
In a possible implementation of the first aspect, the user's swimming information further includes the user's stroke actions, and the counting, by the first electronic device, of the user's swimming information based on the integrated motion data includes:
the first electronic device determines, according to the counted swimming stroke of the user, first target data in the integrated motion data for counting the user's stroke actions, where the first target data include first target sub-data collected by the first electronic device and second target sub-data collected by the second electronic device;
the first electronic device determines a plurality of coarsely screened stroke actions according to the first target sub-data;
the first electronic device counts the user's stroke actions from the plurality of coarsely screened stroke actions according to the second target sub-data.
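A minimal sketch of this two-stage counting, assuming the coarse screen is a simple peak search on a watch-side signal and the refinement keeps only candidates with matching activity in the headset-side signal (the thresholds, window lengths, and signal choice are illustrative assumptions):

```python
import numpy as np

def coarse_strokes(watch_signal, min_height=1.0, min_gap=10):
    """Coarse screen: local maxima of a watch gyroscope/accelerometer signal."""
    peaks, last = [], -min_gap
    for i in range(1, len(watch_signal) - 1):
        if (watch_signal[i] > min_height
                and watch_signal[i] >= watch_signal[i - 1]
                and watch_signal[i] >= watch_signal[i + 1]
                and i - last >= min_gap):
            peaks.append(i)
            last = i
    return peaks

def confirm_strokes(candidates, ear_signal, window=5, threshold=1.0):
    """Refinement: keep candidates where the earphone accelerometer also moves."""
    confirmed = []
    for i in candidates:
        seg = ear_signal[max(0, i - window): i + window]
        if seg.size and seg.max() - seg.min() > threshold:
            confirmed.append(i)
    return confirmed

# Synthetic example: three candidate peaks on the watch side, but the
# headset only shows matching head motion around the first and third.
watch = np.zeros(100); watch[[20, 50, 80]] = 5.0
ear = np.zeros(100); ear[[20, 80]] = 3.0
candidates = coarse_strokes(watch)          # [20, 50, 80]
strokes = confirm_strokes(candidates, ear)  # [20, 80]
```

The coarse pass keeps the count from missing strokes; the headset-side check discards arm swings that are not accompanied by the corresponding head motion.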
In a possible implementation of the first aspect, the first electronic device includes a gyroscope and an accelerometer, and the first target sub-data includes data acquired by the gyroscope of the first electronic device or data acquired by the accelerometer of the first electronic device.
In a possible implementation of the first aspect, the second electronic device includes an accelerometer, and the second target sub-data includes data collected by the accelerometer of the second electronic device.
In a possible implementation of the first aspect, the user's swimming information includes the user's turning actions, and the counting, by the first electronic device, of the user's swimming information based on the integrated motion data includes:
the first electronic device determines an initial stroke action of the user;
the first electronic device determines the user's initial swimming direction according to the second motion data corresponding to a first number of consecutive stroke actions including the initial stroke action;
the first electronic device determines, according to the second motion data corresponding to each stroke action after the first number of consecutive stroke actions, the swimming direction corresponding to that stroke action;
when the first electronic device judges, from the difference between the swimming directions corresponding to two adjacent stroke actions after the first number of consecutive stroke actions, that the user has turned, the first electronic device determines the target stroke action at which the user turned;
the first electronic device determines, according to the second motion data corresponding to a second number of consecutive stroke actions after the target stroke action, the target swimming direction corresponding to those stroke actions;
when the first electronic device determines that the difference between the target swimming direction and the initial swimming direction reaches a threshold, the first electronic device confirms that the user turned at the target stroke action, and counts the user's turning action.
In a possible implementation of the first aspect, the method further includes:
the first electronic device updates the initial swimming direction to the target swimming direction.
In a possible implementation of the first aspect, the second electronic device includes a gyroscope, an accelerometer, and a magnetometer, and the second motion data includes data acquired by the gyroscope of the second electronic device, data acquired by the accelerometer of the second electronic device, and data acquired by the magnetometer of the second electronic device.
In one possible implementation of the first aspect described above, the first electronic device is a sports watch.
In one possible implementation of the first aspect described above, the second electronic device is a sports headset.
In a second aspect, an embodiment of the present application provides a computer-readable storage medium having instructions stored thereon, where the instructions, when executed on an electronic device, cause the electronic device to perform the swimming information statistics method of the first aspect and any one of the possible implementations of the first aspect.
In a third aspect, an embodiment of the present application provides an electronic device, including:
a memory for storing instructions for execution by one or more processors of the electronic device; and
a processor for performing the swimming information statistics method of the first aspect and any one of the possible implementations of the first aspect when the instructions are executed by the one or more processors.
Drawings
Fig. 1 illustrates an application scenario where a user wears a watch 100 and an earphone 200 for swimming, according to some embodiments of the present application;
FIG. 2A illustrates a block diagram of a hardware configuration of the watch 100 shown in FIG. 1, according to some embodiments of the present application;
fig. 2B illustrates a block diagram of a hardware configuration of the headset 200 shown in fig. 1, according to some embodiments of the present application;
fig. 3 illustrates a flow diagram of swim gesture recognition for watch 100, according to some embodiments of the present application;
FIG. 4A illustrates a flow diagram of swim gesture recognition for another watch 100, according to some embodiments of the present application;
FIG. 4B illustrates a schematic diagram of a distribution of barometric pressure values, according to some embodiments of the present application;
fig. 4C illustrates a schematic structural diagram of a swimming stroke recognition model, according to some embodiments of the present application;
FIG. 4D illustrates a schematic diagram of several canonical swimming gestures, according to some embodiments of the present application;
FIG. 5A illustrates a flow diagram for a watch 100 performing stroke recognition, according to some embodiments of the present application;
FIG. 5B illustrates a method flow diagram for the coarse screen action implementation of FIG. 5A, according to some embodiments of the present application;
FIG. 5C illustrates a time domain profile of a watch 100 side signal and a headset 200 side signal, according to some embodiments of the present application;
FIG. 6A illustrates a turn-around identification flow diagram for watch 100, according to some embodiments of the present application;
FIG. 6B illustrates a time domain distribution of yaw angle values for the headset 200, according to some embodiments of the present application;
FIG. 7 illustrates a user interface diagram of swim information statistics of the watch 100, according to some embodiments of the present application;
fig. 8 illustrates a software logic block diagram of the watch 100 shown in fig. 1, according to some embodiments of the present application.
Detailed Description
Illustrative embodiments of the present application include, but are not limited to, a swimming information statistics method, a computer-readable storage medium, and an electronic device. Various aspects of the illustrative embodiments will be described using terms commonly employed by those skilled in the art.
Fig. 1 shows an exemplary application scenario of an embodiment of the present application. In fig. 1, a user wears a wristwatch 100 on his arm during swimming. The watch 100 may monitor the user's swimming data, e.g., monitoring the user's swimming stroke, number of strokes, etc., for subsequent viewing by the user.
The wristwatch 100 generally measures the movement of the user's arm with built-in sensors (e.g., an acceleration sensor and a gyroscope) and determines swimming data such as the swimming stroke and stroke count from the measured motion data. However, the accuracy of the resulting swimming data is constrained by the watch 100's limited measurement capabilities.
For example, the user's swimming stroke may be breaststroke, butterfly, freestyle, backstroke, and the like. In freestyle and backstroke, the user's arm movements are similar, so it may be difficult to accurately distinguish freestyle from backstroke based on the motion data measured by the watch 100 alone, resulting in less accurate swimming data.
To this end, the present application provides a swimming data monitoring system based on the association of multiple devices. The system includes a first wearable device, such as a sports watch (hereinafter, the watch), which is easy to put on and take off, and a second wearable device, such as a sports headset (hereinafter, the headset). The user wears both devices while swimming; for example, in the application scenario shown in fig. 1, the watch 100 is worn on the user's wrist and the headset 200 is worn on the user's ear. Both wearable devices are equipped with sensors such as an accelerometer (ACC), a magnetometer (MAG), and a gyroscope (GYRO), and can exchange data in real time over Bluetooth. While the user swims, both devices collect sensor data in real time; one device transmits its sensor data to the other device with the stronger processing capability, which jointly analyzes its own sensor data together with the sensor data received from the other device to determine swimming information such as the user's swimming stroke, stroke count, lap count, and stroke frequency.
The swimming strokes may include the four standard strokes of breaststroke, butterfly, freestyle, and backstroke, as well as medley and the like; the stroke count may be the number of arm-swing actions while swimming; the lap count may be the number of turns in a swimming lane; and the stroke frequency may be the frequency of the arm-swing actions while swimming. It should be understood that the watch and the headset in the embodiments of the present application may be waterproof.
Embodiments of the present application jointly analyze the user's motion data collected by a wearable device worn on the user's wrist and by a wearable device worn on the user's head (for example, on the user's ear). Across different swimming strokes, even when the user's hand movements are similar, the head movements differ considerably. Therefore, jointly analyzing the motion data from the wrist-worn device and the head-worn device makes the counted swimming information more accurate; for example, the user's swimming stroke, stroke count, stroke frequency, and so on can be distinguished more accurately.
In addition, in some cases, swimming information monitoring equipment is professional-grade, costly, and difficult to wear. By contrast, in the embodiments of the present application, the first and second wearable devices can be low-cost, portable wearable devices. The technical scheme of the present application thus offers a convenient, easy-to-wear experience while providing accurate swimming information during the user's swim, improving the user experience.
It should be understood that the wearable device to which the technical solution of the present application is applicable includes, but is not limited to, a sports watch, a sports headset, wherein the first wearable device may also be a smart band, a smart ring, or the like. The second wearable device may also be a head-mounted device, such as an augmented reality device or the like.
It should be noted that the application scenario shown in fig. 1 is only an exemplary application scenario for explaining the technical solution of the present application, and only includes one watch 100 worn on the wrist of the user and a pair of earphones 200 worn on the ears of the user. In some embodiments of the present application, the user may also wear more than two wearable devices, for example, a ring or the like worn on the ankle or other parts of the user, in addition to the watch 100 worn on the wrist of the user and the pair of earphones 200 worn on the ear of the user.
In order to facilitate understanding of the technical solution of the present application, the technical solution of the present application will be described in detail below with reference to an application scenario shown in fig. 1, in which a user wears a watch 100 and an earphone 200 for swimming.
First, the hardware structures of the wristwatch 100 and the headset 200 shown in fig. 1 will be described with reference to fig. 2A and 2B, respectively. As shown in fig. 2A, a hardware structure diagram of a watch 100 according to an embodiment of the present application is shown, where the hardware structure of the watch 100 includes: accelerometer 101, gyroscope 102, magnetometer 103, barometer 104, processor 105, memory 106, communication module 107, and power module 108.
Although not shown, the watch 100 may further include a display screen, an ambient temperature sensor, an antenna, a wireless-fidelity (Wi-Fi) module, a Near Field Communication (NFC) module, a Global Positioning System (GPS) module, a speaker, and the like.
The accelerometer 101 may detect the magnitude of the watch 100's acceleration in various directions (typically along three axes). While the user swims, the signals on the accelerometer 101's three coordinate axes behave differently as the user's arm swings under different swimming strokes; analyzing these signals therefore helps distinguish the user's swimming stroke, the stroke actions corresponding to different strokes, and so on.
The gyroscope 102 may detect the watch 100's angular velocity or angular acceleration in various directions (e.g., along three axes). While the user swims, the signals on the gyroscope 102's three coordinate axes likewise behave differently as the user's arm swings under different swimming strokes; analyzing these signals also helps distinguish the user's swimming stroke, the stroke actions corresponding to different strokes, and so on.
The magnetometer 103 can detect the direction of the earth's magnetic field, from which the movement direction of the watch 100 may be determined. While swimming, the user turns around at the end of each lap before starting the next, so the movement direction of the watch 100 worn on the user's wrist changes considerably; the movement-direction data collected by the magnetometer 103 therefore helps identify the user's turning actions.
The barometer 104 may be used to detect the pressure of the environment in which the watch 100 is located. Because the pressure values collected by the barometer 104 differ between water and air, analyzing the barometer 104's data facilitates detecting when the user enters or leaves the water.
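For instance, since each centimetre of water depth adds roughly 1 hPa of hydrostatic pressure on top of the ambient air pressure, a simple threshold against a surface reference is enough to sketch in/out-of-water detection; the margin value and reference handling below are illustrative assumptions, not the patent's method:

```python
def in_water(pressure_hpa: float, surface_hpa: float, margin_hpa: float = 5.0) -> bool:
    """Hypothetical water-entry check: a submerged wrist reads noticeably more
    than the surface air pressure, roughly +1 hPa per cm of depth."""
    return pressure_hpa - surface_hpa > margin_hpa

# ~10 cm under water vs. at the surface (surface reference ~1013 hPa)
submerged = in_water(1023.5, 1013.2)
on_surface = in_water(1013.8, 1013.2)
```

In practice the surface reference would be tracked adaptively, since ambient pressure drifts with weather and altitude.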
The processor 105 performs system scheduling; it may control the aforementioned sensors, such as the accelerometer 101, the gyroscope 102, the magnetometer 103, and the barometer 104, and operate on the data they collect. For example, in some embodiments of the present application, the processor 105 is configured to frame the data collected by the accelerometer 101 and the gyroscope 102 together with the sensor data acquired from the headset 200, extract features, and then perform swimming stroke recognition based on the extracted features. As another example, in some embodiments of the present application, the processor 105 is configured to identify stroke actions based on data collected by the accelerometer 101 and/or the gyroscope 102 of the watch 100, together with the accelerometer data collected by the headset 200. As another example, in some embodiments of the present application, the processor 105 is configured to perform turn recognition based on the accelerometer, gyroscope, and magnetometer data obtained from the headset 200.
The memory 106 is used to store software programs and data; the processor 105 executes the watch 100's various functional applications and data processing by running the software programs and data stored in the memory. In some embodiments of the present application, the memory 106 may store the data collected by the above-mentioned sensors, such as the accelerometer 101, the gyroscope 102, the magnetometer 103, and the barometer 104, as well as some sensor data obtained from the headset 200.
Through the communication module 107, the watch 100 may establish connections with other electronic devices; for example, in this embodiment, the watch 100 may establish a Bluetooth connection with the user's headset 200 and receive from it some of the sensor data the headset 200 collects.
The power module 108 may be used to power the various components of the watch 100 described above.
It is to be understood that the structure illustrated in this embodiment does not constitute a specific limitation on the watch 100. In other embodiments of the present application, the watch 100 may include more or fewer components than shown, or some components may be combined, some components may be split, or the components may be arranged differently. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
Fig. 2B is a hardware structure diagram of an earphone 200 according to an embodiment of the present application, where the hardware structure of the earphone 200 includes: accelerometer 201, gyroscope 202, magnetometer 203, memory 204, processor 205, communication module 206, power module 207, audio module 210. The audio module 210 includes a speaker 211 and a microphone 212.
The accelerometer 201 can detect the magnitude of acceleration of the headset 200 in various directions (typically three axes).
The gyroscope 202 may be used to determine the motion pose of the headset 200.
The magnetometer 203 may be used to detect the direction of motion of the headset 200.
The memory 204 is used for storing software programs and data, and the processor 205 implements various functional applications and data processing of the headset 200 by running the software programs and data stored in the memory. In some embodiments of the present application, the memory 204 may store data collected by the accelerometer 201, the gyroscope 202, the magnetometer 203, and the like.
The processor 205 is used for performing system scheduling, and may be used for controlling the aforementioned sensors such as the accelerometer 201, the gyroscope 202, and the magnetometer 203 to perform data acquisition.
The communication module 206, the headset 200 may establish a connection with other electronic devices through the communication module 206, for example, in this embodiment, the headset 200 may establish a bluetooth connection with the watch 100 of the user and transmit some of the sensor data collected by the headset 200 to the watch 100.
The power module 207 may be used to power the components of the headset 200.
The speaker 211, also called a "horn", is used to convert an audio electrical signal into a sound signal. The user can listen to music or to a hands-free conversation through the speaker 211 of the headset 200. When a call or voice message is received through the headset 200, the voice can be heard by placing the speaker 211 close to the ear.
The microphone 212, also called a "mic" or "mouthpiece", is used to convert sound signals into electrical signals. When making a call or sending voice information, the user can speak close to the microphone 212 to input a sound signal into it. The headset 200 may be provided with at least one call microphone 212. In other embodiments, the headset 200 may be provided with two call microphones which, in addition to collecting sound signals, implement a noise reduction function.
It is to be understood that the illustrated structure of the embodiment of the present invention does not specifically limit the earphone 200. In other embodiments of the present application, the headset 200 may include more or fewer components than shown, or some components may be combined, some components may be split, or a different arrangement of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
The functions of swimming stroke recognition, paddling action recognition, turning recognition and the like which can be realized by the technical scheme of the present application will be further described in detail below with reference to the hardware structure block diagrams of the watch 100 and the headset 200.
An exemplary implementation of swimming stroke recognition is first introduced. Fig. 3 is a flowchart of the watch 100 performing swimming stroke recognition according to some embodiments of the present application, and the execution subject of each step in the flowchart shown in fig. 3 may be the watch 100. Referring to fig. 3, the swimming stroke recognition method provided by the present application includes the following steps:
s301: and determining water entry.
For example, whether the user is underwater may be determined based on the air pressure data collected by the barometer 104 of the watch 100: when the user enters the water, i.e., when the watch 100 is underwater, the air pressure value collected by the barometer 104 suddenly increases. When the air pressure value meets a certain threshold condition, it can be determined that the user has entered the water. The details will be described later.
S302: motion data of a user is collected.
For example, the watch 100 utilizes the accelerometer 101 to collect acceleration data of the watch 100 in various directions (typically three axes). Angular velocities or angular accelerations of the watch 100 in various directions (e.g., three-axis directions) are collected using a gyroscope 102. The direction of the earth's magnetic field can be acquired by means of the magnetometer 103.
S303: the user's motion data collected by the headset 200 is acquired.
For example, the watch 100 transmits a data acquisition request to the headset 200, and the headset 200 transmits the user's motion data acquired by the headset 200 through the accelerometer 201, the gyroscope 202, and the magnetometer 203 to the watch 100 in response to the request.
S304: the motion data acquired by the sensors of the watch 100 and the motion data acquired from the headset 200 are integrated to obtain integrated motion data.
For example, the watch 100 obtains the accelerometer data (hereinafter referred to as ACC data) and gyroscope data (hereinafter referred to as GYRO data) collected by the headset 200, and performs data packaging processing on them together with the ACC data and GYRO data collected by the watch 100 itself to obtain combined data; how the data packaging processing is performed will be described in detail below.
S305: and determining the swimming stroke of the user based on the integrated motion data.
For example, the watch 100 performs time domain feature extraction on the combined data obtained above to obtain time domain features such as the window mean, variance, mean absolute deviation, and peak-valley features of the gyroscope and accelerometer data. The extracted time domain features are then input into a pre-trained swimming stroke recognition model to determine the swimming stroke of the user. Specific implementation details will be set forth below.
Fig. 3 illustrates a flow of a method for swimming stroke recognition according to an embodiment of the present application, and a detailed flow of the swimming stroke recognition performed on the watch 100 will be described below by way of an example with the specific example shown in fig. 4A. Referring to a flow chart of a watch 100 for swimming stroke recognition shown in fig. 4A, the swimming stroke recognition method of the present application includes the following steps:
s401: and (6) detecting water in and out.
For example, entry and exit water detection is performed using barometric pressure data collected from the barometer 104 of the watch 100. When the watch 100 enters water, the air pressure value collected by the air pressure gauge 104 will suddenly increase, and when the watch 100 goes out of water, the air pressure value collected by the air pressure gauge 104 will suddenly decrease.
S402: and judging whether the watch 100 enters water or not, and if the watch 100 enters water, entering S406 to extract time domain features of the ACC data and the GYRO data acquired by the watch 100 and the ACC data and the GYRO data acquired by the earphone 200. If the wristwatch 100 is judged not to be immersed, indicating that the user has not started swimming, the process proceeds to S403 to wait for immersion.
Illustratively, as shown in fig. 4B, when the air pressure value (i.e., barometric pressure data) collected by the barometer 104 of the watch 100 suddenly increases and exceeds a threshold Threshold1, it is preliminarily determined that the watch 100 is underwater, i.e., the user has started swimming. The air pressure data collected over a period of time may then be further processed to verify again that the watch 100 is submerged. For example, air pressure data is continuously acquired for a duration T1 at 50 frames per second; with 4 seconds as one window, each window contains 200 frames of air pressure data. The average of the 200 frames in each window is calculated to obtain the mean air pressure of each window, and the differences between the mean air pressures of adjacent windows are then computed. If the mean air pressure of every window is greater than the threshold Threshold1 and the difference between the mean air pressures of adjacent windows is greater than a threshold Threshold2, the watch 100 is judged to have entered the water, that is, it is confirmed again that the user has started swimming; if these two conditions are not both met, the watch 100 is judged not to have entered the water, that is, it is finally determined that the user has not started swimming.
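The windowed verification described above can be sketched as follows. This is a minimal illustration, not the patented implementation: the function name and the concrete values of Threshold1 and Threshold2 are hypothetical.

```python
def confirm_water_entry(pressure, frame_rate=50, window_s=4,
                        threshold1=1013.0, threshold2=0.5):
    """Verify water entry from barometer samples covering duration T1.

    pressure: list of air pressure samples (e.g., in hPa).
    Every window mean must exceed threshold1 (Threshold1) and every
    adjacent-window difference must exceed threshold2 (Threshold2);
    both threshold values here are hypothetical placeholders.
    """
    frames_per_window = frame_rate * window_s  # 200 frames per 4 s window
    means = []
    for start in range(0, len(pressure) - frames_per_window + 1, frames_per_window):
        window = pressure[start:start + frames_per_window]
        means.append(sum(window) / len(window))
    if len(means) < 2:
        return False
    if any(m <= threshold1 for m in means):
        return False
    # pressure should keep rising as the watch goes deeper into the water
    return all(b - a > threshold2 for a, b in zip(means, means[1:]))
```

For instance, two windows whose means rise from 1014 to 1015 hPa would confirm water entry, whereas a flat pressure trace would not.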
In some embodiments, if the barometric pressure data returns to within 1.5 times the pre-entry air pressure and remains so for a period of time T2, e.g., 30 seconds, the watch 100 is determined to be out of the water.
S403: waiting for water to enter.
That is, if the wristwatch 100 is judged not to be immersed in water, the wristwatch waits for immersion in water, and the barometer 104 of the wristwatch 100 can continue to collect barometric pressure data all the time in the process.
S404: Frame the data.
That is, once the swimming stroke recognition module 111 of the watch 100 determines from the data collected by the barometer 104 that the watch 100 has entered the water, which indicates that the user has started swimming, the swimming stroke recognition module 111 in the watch 100 frames the data collected by the accelerometer 101 and the gyroscope 102 (denoted as ACC data and GYRO data, respectively) together with the data collected by the accelerometer 201 and the gyroscope 202 of the earphone 200 and transmitted back by the earphone 200 in real time.
For example, the watch 100 acquires 100 frames of ACC data and GYRO data per second while the earphone 200 acquires 50 frames of ACC data and GYRO data per second. The swimming stroke recognition module 111 averages every two frames of the 100 frames of ACC data acquired by the watch 100 to obtain 50 new frames of ACC data, and averages every two frames of the 100 frames of GYRO data acquired by the watch 100 to obtain 50 new frames of GYRO data. The 50 new frames of ACC data and 50 new frames of GYRO data are then aligned in time with the 50 frames of ACC data and GYRO data returned by the headset 200 to obtain 50 frames of combined data. As another example, because the sampling rate at which the watch 100 acquires ACC data and GYRO data is higher than that of the earphone 200, the ACC data and GYRO data of the watch 100 may be down-sampled, e.g., from the original 100 frames per second to 50 frames per second; alternatively, the ACC data and GYRO data of the headset 200 may be interpolated. Either way, the ACC data and GYRO data of the two devices, the watch 100 and the headset 200, can be aligned in time with a consistent number of frames, which facilitates subsequent processing.
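The pairwise-averaging alignment described above can be sketched as follows. This is a simplified illustration with scalar samples (real ACC/GYRO frames are three-axis vectors); the function names are hypothetical.

```python
def average_pairs(samples):
    """Down-sample by averaging every two adjacent frames (100 Hz -> 50 Hz)."""
    return [(samples[i] + samples[i + 1]) / 2 for i in range(0, len(samples) - 1, 2)]

def frame_and_merge(watch_acc, watch_gyro, headset_acc, headset_gyro):
    """Align one second of watch data (100 frames) with headset data (50 frames).

    Returns 50 combined frames, each holding the time-aligned watch ACC,
    watch GYRO, headset ACC and headset GYRO samples.
    """
    acc_50 = average_pairs(watch_acc)    # 100 watch ACC frames -> 50
    gyro_50 = average_pairs(watch_gyro)  # 100 watch GYRO frames -> 50
    # zip truncates to the shortest stream, keeping the frame counts consistent
    return list(zip(acc_50, gyro_50, headset_acc, headset_gyro))
```

Interpolating the headset stream up to 100 Hz instead would be the mirror-image choice; either way the goal is equal frame counts on a common time base.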
S405: and performing sliding window processing on the framed number.
Illustratively, the data obtained through S404 is subjected to sliding window processing with a set duration (e.g., 4 seconds) as one window and with a set step size (e.g., 0.5 seconds).
It should be understood that in a specific application, the window is a data window of fixed duration customized as required; for example, for the aforementioned 50 frames of combined data, a time span of 5 seconds, 6 seconds, etc. may also be used as one window, and the step size may likewise be customized, e.g., 1 second. In this embodiment, each second of framed data contains 50 frames of ACC data and GYRO data in total; with 4 seconds as one window, there are 200 frames of ACC data and GYRO data in one window. Feature extraction is subsequently performed on the data within one window (i.e., the data over a period of time), which yields a higher feature extraction accuracy than a single frame of data.
S406: and (5) extracting time domain features.
For example, the window mean, variance, mean absolute deviation, peak-valley feature, and so on of the gyroscope and accelerometer data are extracted from the combined data obtained above.
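The four window statistics named above can be computed straightforwardly. A minimal sketch for one scalar signal channel follows; the function name and the dictionary keys are hypothetical, and a real implementation would run this per axis of the ACC and GYRO data.

```python
def time_domain_features(window):
    """Window mean, variance, mean absolute deviation, and peak-valley span
    for one channel of a 200-frame (4 s) window."""
    n = len(window)
    mean = sum(window) / n
    variance = sum((x - mean) ** 2 for x in window) / n
    mad = sum(abs(x - mean) for x in window) / n          # mean absolute deviation
    peak_valley = max(window) - min(window)               # peak-valley feature
    return {"mean": mean, "var": variance, "mad": mad, "pv": peak_valley}
```

The resulting feature vector (concatenated over all channels) is what gets fed to the stroke recognition model in S407.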
S407: and (5) scoring by using a decision tree.
That is, after the time domain features of the ACC data and GYRO data are extracted in S406, the swimming stroke recognition module 111 may use these time domain features and a strategy of scoring with a plurality of decision trees (for example, binary classification models) to obtain a score for the user's swimming stroke. In other words, the time domain features are recognized by a preset swimming stroke recognition model, which may be the decision tree model shown in fig. 4C. Since the standard swimming strokes usually include the four types shown in fig. 4D, namely breaststroke, backstroke, butterfly stroke and freestyle stroke, the decision tree model includes at least a "breaststroke model", a "butterfly stroke model", a "freestyle stroke model" and a "backstroke model", and an "unknown stroke model" corresponding to strokes other than the four standard ones may also be used in the classification. That is, since there are 5 results to be recognized in total, a total of 5 stroke recognition models need to be trained.
Each of the 5 models is used for identifying one of breaststroke, backstroke, butterfly stroke, freestyle stroke and unknown swimming stroke, each model has a plurality of trees, each tree can output the probability of a certain result and the probability of a non-certain result (for example, the probability of breaststroke is 0.7, and the probability of non-breaststroke is 0.3), and the total probability of identifying the certain swimming stroke is obtained by adding the probabilities of identifying the certain swimming stroke in each tree. And then selecting the swimming posture with the maximum total probability from the 5 models as a recognition result.
S408: and obtaining a recognition result.
For example, in some embodiments, some time-domain features corresponding to ACC data and GYRO data within 4s may be input into the decision tree model, and the swimming posture with the highest total probability may be taken as the recognition result. For example, if the total probabilities of breaststroke, backstroke, butterfly stroke, freestyle stroke, and unknown stroke obtained by the above models are respectively 50%, 60%, 65%, 85%, and 20% in this order, the freestyle stroke with the highest total probability can be identified as the stroke of the user.
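The scoring rule of S407-S408 — sum each model's per-tree probabilities, then take the stroke with the highest total — can be sketched as follows. The function name and the example probabilities are illustrative, not taken from the patent.

```python
def classify_stroke(per_model_tree_probs):
    """Pick the stroke whose summed per-tree probabilities are highest.

    per_model_tree_probs maps each of the five stroke models to the list
    of probabilities its individual trees assigned to that stroke.
    """
    totals = {stroke: sum(probs) for stroke, probs in per_model_tree_probs.items()}
    return max(totals, key=totals.get)
```

With illustrative totals of 0.50 (breaststroke), 0.60 (backstroke), 0.65 (butterfly), 0.85 (freestyle) and 0.20 (unknown), the argmax would return freestyle, matching the example in the text.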
It is understood that the execution sequence of steps S401 to S408 is only an example, and in other embodiments, other execution sequences may also be adopted, and some steps may also be split or merged, which is not limited herein. For example, S404 and S405 may be performed before S401, or may be performed simultaneously with S401. S401 and S402 may also be combined into one step.
An exemplary implementation of the stroke motion recognition will be described below. Fig. 5A is a flowchart of a stroke motion recognition according to some embodiments of the present application, and an execution subject of each step in the flowchart shown in fig. 5A may be a watch 100. Referring to fig. 5A, the flow includes the following:
s501: and (5) determining the swimming posture.
For example, the swimming stroke of the user is determined according to the method provided in steps S401-S408, so as to accurately determine the swimming stroke of the user. In other embodiments, the swimming stroke may also be determined by other methods (e.g., a method of determining the swimming stroke based on the sensor data of the watch 100).
S502: and selecting axes according to the swimming postures.
The GYRO (gyroscope) data includes three coordinate axes X, Y and Z, and the signals on different coordinate axes of the GYRO data behave differently for different swimming strokes; moreover, when the user swims with different strokes, the hand movements differ greatly while the head movements differ little. Therefore, to improve recognition accuracy, the paddling action recognition module 112 may select the data of one coordinate axis of the GYRO data (comprising the three coordinate axes X, Y and Z) collected by the watch 100 for analysis according to the determined swimming stroke.
In some embodiments, if the user's swimming stroke is determined to be breaststroke, the Y-axis signal of the GYRO data collected by the watch 100 is selected for analysis; if the user's swimming stroke is determined to be backstroke, the Z-axis signal of the GYRO data collected by the watch 100 is analyzed; if the user's swimming stroke is determined to be butterfly, the Z-axis signal of the GYRO data collected by the watch 100 is analyzed; and if the user's swimming stroke is determined to be freestyle or unknown, the modulus of the signals on the three coordinate axes of the ACC data (accelerometer data) collected by the watch 100 is analyzed. For example, the data of the three coordinate axes of the ACC data are each squared, the squared results of the three axes are summed, and the square root of the sum is taken; the modulus obtained by this square root operation is used as the data source for the subsequent waveform analysis of the watch 100 data.
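The per-stroke axis selection and the three-axis ACC modulus can be sketched as follows. The stroke-to-axis mapping mirrors the embodiment above but the table, keys and function name are hypothetical.

```python
import math

# Hypothetical mapping from recognized stroke to the signal analyzed.
AXIS_FOR_STROKE = {"breaststroke": "gyro_y", "backstroke": "gyro_z",
                   "butterfly": "gyro_z"}

def select_signal(stroke, gyro_xyz, acc_xyz):
    """Return the per-frame signal chosen for the given stroke.

    gyro_xyz / acc_xyz: lists of (x, y, z) tuples per frame.
    Strokes not in the table (freestyle / unknown) fall through to the
    ACC modulus: sqrt(x^2 + y^2 + z^2) per frame.
    """
    axis = AXIS_FOR_STROKE.get(stroke)
    if axis == "gyro_y":
        return [g[1] for g in gyro_xyz]
    if axis == "gyro_z":
        return [g[2] for g in gyro_xyz]
    # freestyle / unknown stroke: magnitude of the three ACC axes
    return [math.sqrt(x * x + y * y + z * z) for x, y, z in acc_xyz]
```

The returned one-dimensional signal is what S503 then filters and searches for peaks.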
S503: and (5) performing peak analysis on watch data.
That is, the coordinate-axis data of the GYRO data or of the ACC data selected in S502 according to the determined swimming stroke is analyzed. For example, the selected data to be analyzed may first be Gaussian filtered, and the peaks and valleys of the filtered signal then determined.
S504: and (4) performing coarse screening.
That is, after the peaks and valleys are determined from the filtered signal, the paddling actions can be preliminarily screened according to the determined peaks and valleys. For example, the flow of coarse-screening actions illustrated in fig. 5B includes the following:
s5041: and determining an extreme point.
For example, the extreme points A1 and A2 of the signal on the watch 100 side shown in fig. 5C are determined.
S5042: and judging whether the extreme point is larger than a threshold value or not. If the extreme point is judged to be larger than the threshold value, S5043 is entered, and the extreme point larger than the threshold value is determined as a primary screening peak point; if the extreme point is less than or equal to the threshold, the process proceeds to S4044, and the corresponding extreme point is discarded.
S5043: and determining the primary screening peak point.
In some embodiments, a peak threshold may be preset, and when a found extreme point is greater than this threshold, the corresponding extreme point is taken as a candidate action point, that is, determined as a primary screening peak point. For example, in the embodiment shown in fig. 5C, the extreme points A1 and A2 are found, and since both are greater than the peak threshold Threshold3, the extreme points A1 and A2 may be used as primary screening peak points.
S5044: the corresponding extreme point is discarded.
That is, if the searched extreme point is smaller than the aforementioned peak threshold, the extreme point is discarded, that is, the extreme point is not used in the subsequent stroke motion recognition.
S5045: and judging whether the peak distance is larger than a time threshold value. If the wave peak distance is larger than the time threshold value, the step S5046 is carried out, and an initial screening action point is determined; and if the wave peak distance is smaller than or equal to the time threshold, entering S4067 and discarding the corresponding wave peak point.
S5046: and determining a primary screening action point.
That is, the preliminarily determined paddling action points are further screened according to the time interval between peaks. Illustratively, the time interval t between two adjacent peaks is determined; when t is greater than a time threshold T0, the peak is taken as a valid peak, and when the time interval t between two adjacent peaks is less than T0, the larger of the two adjacent peaks is selected as the valid peak. For example, as shown in fig. 5C, if the time interval t1 between two adjacent extreme points (i.e., two adjacent peaks) A1 and A2 is greater than the time threshold T0, the extreme points A1 and A2 may be used as primary screening action points.
S5047: the corresponding peak points are discarded.
That is, among the determined primary screening peak points, those whose peak distance does not satisfy the time threshold are treated as invalid peak points and discarded.
According to the scheme of the embodiment, the interference of invalid paddling motion on the statistical data can be avoided, and accurate paddling frequency statistics is facilitated.
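The two-stage coarse screening of S5041-S5047 — keep local maxima above the peak threshold, then merge peaks closer together than the time threshold by keeping the larger one — can be sketched as follows. Threshold values, sampling rate and the function name are hypothetical.

```python
def coarse_screen(signal, peak_threshold, time_threshold, frame_rate=50):
    """Return frame indices of coarse-screened paddling action points.

    Stage 1 (S5041-S5044): local maxima above peak_threshold become
    primary screening peak points.
    Stage 2 (S5045-S5047): peaks whose spacing is <= time_threshold
    seconds are merged, keeping the larger of the pair.
    """
    peaks = [i for i in range(1, len(signal) - 1)
             if signal[i] > signal[i - 1] and signal[i] > signal[i + 1]
             and signal[i] > peak_threshold]
    kept = []
    for i in peaks:
        if kept and (i - kept[-1]) / frame_rate <= time_threshold:
            if signal[i] > signal[kept[-1]]:
                kept[-1] = i            # keep the larger of two close peaks
        else:
            kept.append(i)
    return kept
```

A production version would run this on the Gaussian-filtered signal from S503 rather than the raw samples.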
S505: the headset 200 data peak analysis.
The method of searching for peaks in the modulus of the ACC data collected by the headset 200 is similar to that in S503-S504, except that the peak threshold and the time threshold differ, and will not be described again here.
S506: and (5) confirming the action.
Although a preliminary screening of paddling action points can be obtained from the peak analysis of the watch 100 data, the watch 100 may erroneously determine that the user is swimming when the user is merely shaking his or her hands in the swimming pool rather than swimming, and may thereby mistake the hand shaking for paddling actions. The coarse-screened actions determined in S504 therefore need to be further confirmed by combining the ACC data or GYRO data of the watch 100 with the ACC data of the headset 200.
For example, it may be determined whether a peak point of the headset 200 ACC data exists between two coarse-screening action points obtained from the ACC data or GYRO data of the watch 100; if so, the two coarse-screening action points are determined to constitute one complete paddling action. For example, in the embodiment shown in fig. 5C, a peak point B1 of the signal on the headset 200 side lies between the peak points A1 and A2 of the signal on the watch 100 side, so it can be confirmed that one complete paddling action exists between the peak points A1 and A2.
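The cross-device confirmation just described reduces to an interval check. A minimal sketch, with hypothetical function name; the inputs are the frame indices produced by the peak analyses of S503-S505.

```python
def confirm_strokes(watch_peaks, headset_peaks):
    """Confirm complete paddling actions across the two devices.

    Between each pair of adjacent watch-side coarse-screening action
    points, require at least one headset ACC peak (e.g., B1 between
    A1 and A2 in fig. 5C) before counting the pair as one stroke.
    """
    strokes = []
    for a, b in zip(watch_peaks, watch_peaks[1:]):
        if any(a < h < b for h in headset_peaks):
            strokes.append((a, b))      # one complete paddling action
    return strokes
```

Hand shaking that produces watch-side peaks but no matching head-motion peak on the headset is thus filtered out.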
As can be seen from the above description of the stroke motion recognition, after the ACC data or GYRO data of the watch 100 is analyzed to obtain the primary screening peak point, the stroke motion is further confirmed by combining the ACC data of the earphone 200, and the screened stroke motion is more accurate.
It is understood that the execution sequence of the steps S501 to S506 is only an example, in other embodiments, other execution sequences may be adopted, and some steps may be split or combined, which is not limited herein.
An exemplary implementation of turn-around recognition will be described below. Fig. 6A is a flowchart of turn-around recognition according to some embodiments of the present disclosure, and the main execution body of each step in fig. 6A may be a watch 100. Referring to fig. 6A, the turn-around identification process includes the following steps:
s601: and identifying the stroke action.
For example, after the paddling action recognition module 112 of the watch 100 recognizes the user's paddling actions, the recognition result data is transmitted to the turn recognition module 113 of the watch 100, and the turn recognition module 113 performs turn recognition based on the received result data. In other embodiments, the paddling actions may also be identified by other methods.
S602: and judging whether the action number is larger than the set number N1. Hereinafter, N1=3 is described as an example. But the application is not limited thereto. In other embodiments, N may be 2,5, among other quantities.
If the number of the paddling actions is more than 3, the process goes to S603, and the mean value of the yaw angles of the earphones 200 corresponding to the previous 3 paddling actions is calculated. If the number of stroke operations is determined to be less than or equal to 3, the process returns to S601.
S603: the yaw Angle average Angle0 for the first 3 action headphones 200 is recorded. Therefore, the starting direction of swimming of the user can be determined according to the average value Angle0 (specifically, refer to fig. 6B) of the yaw angles of the headset 200 corresponding to the first 3 paddling actions.
For example, when the paddling action recognition module 112 recognizes the paddling actions, it may record the starting time of each paddling action during the user's swimming; once 3 paddling actions have been recognized, the mean yaw angle Angle0 of the headset 200 over these 3 consecutive paddling action periods is taken as the starting swimming orientation.
The mean yaw angle of the headset 200 may be calculated as follows: a nine-axis fusion calculation is performed on the ACC data (three axes), GYRO data (three axes) and MAG data (three axes) collected by the headset 200 to obtain the attitude angle of the headset 200 during swimming, and the yaw component of the attitude angle is taken as the yaw angle of the headset 200. The yaw angle of the user's head can also be deduced from this attitude angle.
S604: the mean yaw Angle1 of the headset 200 during each subsequent stroke is obtained.
For example, nine-axis fusion calculation is performed on the ACC data, GYRO data, and MAG data corresponding to each paddling action after the first 3 following the start of swimming, so as to obtain the mean yaw angle Angle1 of the headset 200 during each such paddling action.
S605: and judging whether the difference between the yaw angle average values of two adjacent actions is larger than a threshold delta0. If the difference between the yaw angle average values of the two adjacent actions is larger than the threshold delta0, the step S606 is entered, and the user is preliminarily judged to have a turning action between the two adjacent actions. Otherwise, return to S604.
S606: the pre-judgment is turning. That is, after it is determined that there are more than 3 paddling actions, the mean values of the yaw angles of every two adjacent paddling actions after the 3 paddling actions can be compared, and if the difference value of the mean values of the yaw angles of the two adjacent paddling actions is greater than the difference threshold value delta0, it can be preliminarily inferred that the swimmer has a turn-around action.
S607: and judging whether the number of the pre-judged after-turning actions is larger than a set number N2. Hereinafter, N2=3 is described as an example. But the application is not limited thereto. In other embodiments, N may be 2,4, among other numbers.
That is, determine whether more than 3 paddling actions have occurred after the preliminarily judged turning action. If the number of paddling actions after the pre-judged turn is greater than 3, proceed to S608 to obtain the mean yaw angle of the 3 paddling actions after the pre-judged turn. Otherwise, proceed to S609 and wait until 3 actions have occurred after the pre-judged turn.
S608: and acquiring the mean value Angle2 of the yaw Angle in 3 continuous paddling action periods after turning.
For example, nine-axis fusion calculation is performed on ACC data, GYRO data, and MAG data corresponding to 3 consecutive paddling actions after the pre-determination and turning to obtain yaw angles of the headset 200 corresponding to the 3 paddling actions, and then the yaw angles corresponding to the 3 paddling actions are averaged to obtain a yaw Angle average value Angle2 (refer to fig. 6B).
S609: and 3 action numbers are reached after the pre-judged turning.
S610: judging | Angle2-Angle0| > delta1. That is, it is determined whether the absolute value of the difference between the mean yaw Angle2 of the 3 consecutive paddling actions after the turning and the mean yaw Angle0 of the 3 consecutive paddling actions of the user is greater than the difference threshold delta1. If the difference is greater than the difference threshold delta1, the process proceeds to S611, and it is determined that the user performs a turn-around action, otherwise, the process proceeds to S612, and it is determined that the user performs a false turn-around, that is, the user does not perform a turn-around.
S611: and determining the turning.
In some embodiments, when it is determined that the swimmer has turned, the mean Angle2 of the yaw angles of the three consecutive paddling actions after the turn can be used as the new swimming orientation; that is, the value of Angle0, which originally represented the initial orientation of the headset 200, is updated to the value of Angle2, so that the direction represented by the updated Angle0 serves as the reference direction for subsequently judging turns.
S612: a false turn is determined.
As can be seen from the above description of turning action recognition, when the user turns during swimming, a relatively large difference is reflected in the signals of the ACC data, GYRO data, and MAG data collected by the headset 200. The watch 100 calculates the yaw angle using the ACC data, GYRO data, and MAG data received from the headset 200, obtaining a more accurate yaw angle and thereby a more accurate turn recognition result. In some embodiments, after the user's turning actions are identified, the total number of turns can be counted, and the number of swimming laps can then be calculated from the total number of turns. For example, if n turns are recognized, the total number of swimming laps is counted as n + 1.
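The turn-counting logic of S601-S612 can be sketched as a small loop over per-stroke mean yaw angles. This is a simplified sketch: the nine-axis fusion producing the yaws is omitted, and the function name, N1 = N2 = n = 3, delta0 and delta1 values are illustrative placeholders.

```python
def detect_turns(yaws, n=3, delta0=60.0, delta1=60.0):
    """Count turns from the per-stroke mean yaw angles (degrees).

    yaws: mean headset yaw for each recognized paddling action, in order.
    A turn is pre-judged (S605-S606) when adjacent strokes differ by more
    than delta0, then confirmed (S610-S611) when the mean of the next n
    strokes differs from the current reference heading by more than delta1;
    on confirmation the reference heading Angle0 is updated to Angle2.
    """
    if len(yaws) < n:
        return 0
    angle0 = sum(yaws[:n]) / n          # starting orientation (Angle0)
    turns = 0
    i = n
    while i < len(yaws):
        if abs(yaws[i] - yaws[i - 1]) > delta0:     # pre-judge a turn
            after = yaws[i:i + n]
            if len(after) == n:
                angle2 = sum(after) / n             # Angle2 after the turn
                if abs(angle2 - angle0) > delta1:   # confirmed turn
                    turns += 1
                    angle0 = angle2                 # new reference heading
                    i += n
                    continue
        i += 1
    return turns
```

For a pool swim, the lap count would then be `detect_turns(...) + 1`.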
In some embodiments, the number of strokes per lap may also be calculated; for example, the number of strokes in the nth lap (assuming n > 1) may be obtained as the number of paddling actions between the end of the (n-1)th turn and the starting time point of the nth turn.
In some embodiments, statistics may also be gathered on the swimming strokes used in each lap; for example, it is determined how many swimming strokes one lap includes and the duration proportion of each, and the stroke with the largest duration proportion is taken as the main stroke of that lap.
In some embodiments, the user's stroke frequency may also be calculated. For example, the total stroke time obtained by summing the durations of the individual paddling actions is totalTime, and if the total number of strokes is strokeCnt, the average single-stroke time is meanStrokeTime = totalTime/strokeCnt. The average single-stroke time is then converted into the number of strokes per minute, namely the stroke frequency: strokeFreq = (60 × 1000)/meanStrokeTime.
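The stroke-frequency arithmetic above (totalTime, strokeCnt, meanStrokeTime, strokeFreq) translates directly into code; the function name is hypothetical and durations are assumed to be in milliseconds.

```python
def stroke_frequency(stroke_durations_ms):
    """Strokes per minute from per-stroke durations in milliseconds."""
    total_time = sum(stroke_durations_ms)        # totalTime
    stroke_cnt = len(stroke_durations_ms)        # strokeCnt
    mean_stroke_time = total_time / stroke_cnt   # meanStrokeTime (ms)
    return (60 * 1000) / mean_stroke_time        # strokeFreq, strokes/min
```

For example, strokes averaging 2000 ms each correspond to 30 strokes per minute.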
In some embodiments, after counting the user's number of swimming laps, the number of strokes per lap, the swimming stroke, and the stroke frequency, the watch 100 may also present these statistics to the user through a display screen; reference may be made to the interface diagram shown in fig. 7.
It is understood that the execution sequence of steps S601 to S612 is only an example, and in other embodiments, other execution sequences may also be adopted, and some steps may also be split or merged, which is not limited herein.
The software logic block diagram of the watch 100 described above will now be described with reference to fig. 8. As shown in fig. 8, the watch 100 includes a swimming stroke recognition module 111, a stroke recognition module 112, a turn recognition module 113, and a result statistics module 114.
The swimming stroke recognition module 111 may be configured to recognize the swimming stroke of the user, for example, to recognize that the swimming stroke of a certain lap is one of breaststroke, butterfly, backstroke, and freestyle as shown in fig. 4D. When the user swims with these strokes, the accelerometer and gyroscope signals collected by the headset 200 and the watch 100 show distinct characteristic patterns; a swimming stroke recognition model can be trained on these characteristics, and the trained model can then be used to recognize the swimming stroke. In some embodiments, the swimming stroke recognition model may be a decision tree model.
The swimming stroke recognition module 111 may include a data framing sub-module M101, a feature extraction sub-module M102, and a decision tree swimming stroke recognition sub-module M103.
For example, the data framing sub-module M101 may be configured to perform data framing. That is, once the watch 100 determines from the data collected by the barometer 104 that the user has started swimming, the swimming stroke recognition module 111 in the watch 100 frames the data collected by the accelerometer 101 and the gyroscope 102 of the watch (denoted ACC data and GYRO data, respectively) together with the data collected by the accelerometer 201 and the gyroscope 202 of the headset 200 and transmitted back by the headset 200 in real time. For example, the watch 100 acquires 100 frames of ACC data and GYRO data per second, while the headset 200 acquires 50 frames per second. The swimming stroke recognition module 111 averages every two frames of the watch's 100 frames of ACC data to obtain 50 new frames of ACC data, and likewise averages every two frames of the watch's GYRO data to obtain 50 new frames of GYRO data. The new 50 frames of ACC data and GYRO data are then aligned in time with the 50 frames of ACC data and GYRO data returned by the headset 200, yielding 50 frames of combined data.
The feature extraction sub-module M102 may perform feature extraction on the combined data obtained above, for example extracting the window mean (i.e., the mean of the ACC data and GYRO data within one window), the variance, the mean absolute deviation, and the peak/valley features of the gyroscope and accelerometer data. A window here is a data window of fixed duration, customizable as needed; for example, for the 50-frame-per-second combined data, data spanning 4 seconds may be used as one window, and the step size may likewise be customized, for example to 0.5 seconds. In the foregoing embodiment, each second of framed data contains 50 frames of ACC data and GYRO data in total, so a 4-second window contains 200 frames of ACC data and GYRO data. Feature extraction is thus performed once per window (i.e., over a period of data), which yields higher accuracy than operating on a single frame. In some embodiments, the gyroscope-related and accelerometer-related features are extracted separately, i.e., the ACC data and the GYRO data in each window are processed separately to obtain the accelerometer and gyroscope features of that window.
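The time-domain features named above (window mean, variance, mean absolute deviation, peak/valley) and the 4-second window with 0.5-second step might be computed as in this sketch; window and step values follow the example, while the feature dictionary layout is an illustrative assumption.

```python
def window_features(window):
    """Time-domain features over one window of sensor samples."""
    n = len(window)
    mean = sum(window) / n
    variance = sum((x - mean) ** 2 for x in window) / n
    mad = sum(abs(x - mean) for x in window) / n   # mean absolute deviation
    return {"mean": mean, "var": variance, "mad": mad,
            "peak": max(window), "valley": min(window)}

def sliding_windows(samples, rate_hz=50, window_s=4.0, step_s=0.5):
    """Yield fixed-duration windows over the 50 Hz combined data."""
    win = int(rate_hz * window_s)    # 200 frames per 4 s window
    step = int(rate_hz * step_s)     # 25-frame (0.5 s) step
    for start in range(0, len(samples) - win + 1, step):
        yield samples[start:start + win]
```

In practice ACC and GYRO streams would be fed through this separately, per axis, to obtain each window's feature vector.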
The decision tree swimming stroke recognition sub-module M103 may perform swimming stroke recognition based on the features extracted above. That is, the extracted feature data is used as input to a pre-trained decision tree model, and the inference result of the model is the swimming stroke recognition result.
The stroke action recognition module 112 may include a peak analysis sub-module M104 and a stroke action confirmation sub-module M105. Illustratively, the peak analysis sub-module M104 may be used to identify the stroke actions of the user. For example, after the swimming stroke recognition module 111 recognizes the swimming stroke of the user, the recognition result is sent to the peak analysis sub-module M104. GYRO (gyroscope) data covers the three coordinate axes X, Y, and Z, and different swimming strokes are reflected differently on the different axes; moreover, hand movements differ greatly between strokes while head movements differ little. Therefore, to improve recognition accuracy, the peak analysis sub-module M104 can perform peak analysis and preliminary stroke action recognition on the per-axis GYRO data collected by the watch 100 according to the recognized swimming stroke, after which the stroke action confirmation sub-module M105 confirms the stroke actions using peak analysis of the ACC data collected by the headset 200. It should be understood that since each paddle of the arms corresponds to one stroke action, recognizing stroke actions makes it convenient to count how many strokes the user takes in one lap.
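The two-stage scheme above (coarse peak screening on a watch-gyroscope axis, then confirmation against headset-accelerometer peaks) can be sketched as follows. The thresholds, minimum peak spacing, and matching tolerance are hypothetical tuning parameters, not values from the disclosure.

```python
def find_peaks(signal, threshold, min_gap):
    """Indices of local maxima above `threshold`, at least `min_gap`
    samples apart (coarse screening of candidate stroke actions)."""
    peaks = []
    for i in range(1, len(signal) - 1):
        if (signal[i] >= threshold
                and signal[i] > signal[i - 1]
                and signal[i] >= signal[i + 1]):
            if not peaks or i - peaks[-1] >= min_gap:
                peaks.append(i)
    return peaks

def confirm_strokes(gyro_axis, ear_acc, threshold=1.0, min_gap=10, tolerance=5):
    """Keep only watch-gyro peaks that coincide (within `tolerance`
    samples) with a headset-ACC peak — the confirmation step."""
    candidates = find_peaks(gyro_axis, threshold, min_gap)
    acc_peaks = find_peaks(ear_acc, threshold, min_gap)
    return [p for p in candidates
            if any(abs(p - q) <= tolerance for q in acc_peaks)]
```

The length of the confirmed list is then the stroke count for the analyzed segment.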
The turn-around recognition module 113 may include a nine-axis fusion attitude calculation sub-module M106 and a turn-around recognition sub-module M107, and may be used to identify the turn-around behavior of the user. It should be understood that a user typically makes one turn at the end of each lap in order to start the next, so identifying the user's turns makes it convenient to count how many laps the user has swum. During swimming, the watch 100 moves through space with the user's paddling arm, whereas the headset 200 is worn at the user's ear; the head moves little relative to the arms during normal swimming, but the attitude of the headset 200 changes greatly when the user turns around. The turn-around recognition module 113 can therefore analyze the headset's ACC data, GYRO data, and MAG data (data collected by the magnetometer 203 of the headset 200) within the aforementioned combined data. For example, the nine-axis fusion attitude calculation sub-module M106 applies a nine-axis fusion algorithm that fuses the headset's ACC data (three axes), GYRO data (three axes), and MAG data (three axes) to calculate the attitude angle of the headset 200 corresponding to each stroke action, and the turn-around recognition sub-module M107 then performs turn-around recognition based on the calculated attitude angle of each stroke action. It should be understood that a large change in the attitude angle of the headset 200 between two adjacent stroke actions indicates that the user has turned around.
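A full nine-axis fusion (e.g., a complementary or Madgwick-style filter) also integrates the gyroscope; the sketch below shows only the accelerometer/magnetometer part — a tilt-compensated compass heading — plus a wrap-around-safe heading-difference test for turn detection. The 90° threshold is a hypothetical choice, not a value from the disclosure.

```python
import math

def yaw_from_acc_mag(acc, mag):
    """Tilt-compensated compass heading (yaw, radians) from one
    accelerometer sample and one magnetometer sample."""
    ax, ay, az = acc
    roll = math.atan2(ay, az)
    pitch = math.atan2(-ax, math.hypot(ay, az))
    mx, my, mz = mag
    # Rotate the magnetic vector back into the horizontal plane
    mx_h = mx * math.cos(pitch) + mz * math.sin(pitch)
    my_h = (mx * math.sin(roll) * math.sin(pitch)
            + my * math.cos(roll)
            - mz * math.sin(roll) * math.cos(pitch))
    return math.atan2(-my_h, mx_h)

def is_turn(yaw_prev, yaw_curr, threshold_rad=math.pi / 2):
    """Flag a turn when the heading between two adjacent stroke
    actions changes by more than the threshold (wrap-around safe)."""
    diff = abs((yaw_curr - yaw_prev + math.pi) % (2 * math.pi) - math.pi)
    return diff > threshold_rad
```

Per-stroke yaw angles computed this way can be compared pairwise: a near-180° jump between adjacent strokes marks the turn, matching the pool-turn behavior described above.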
The result statistics module 114 may be configured to count the user's swimming stroke, stroke frequency, number of laps, and number of strokes per lap (i.e., to perform lap counting and stroke counting) according to the swimming stroke identified by the swimming stroke recognition module 111, the stroke actions identified by the stroke action recognition module 112, and the turn-around behavior identified by the turn-around recognition module 113, so that the watch 100 can display the statistics to the user for reference.
In some cases, the disclosed embodiments may be implemented in hardware, firmware, software, or any combination thereof. The disclosed embodiments may also be implemented as instructions carried by or stored on one or more transitory or non-transitory machine-readable (e.g., computer-readable) storage media, which may be read and executed by one or more processors. For example, the instructions may be distributed via a network or via other computer-readable storage media. Thus, a machine-readable storage medium may include any mechanism for storing or transmitting information in a form readable by a machine (e.g., a computer), including, but not limited to, floppy diskettes, optical disks, compact disc read-only memories (CD-ROMs), magneto-optical disks, read-only memories (ROMs), random access memories (RAMs), erasable programmable read-only memories (EPROMs), electrically erasable programmable read-only memories (EEPROMs), magnetic or optical cards, flash memory, or tangible machine-readable memory used to transmit information over the Internet in the form of electrical, optical, acoustical, or other propagated signals (e.g., carrier waves, infrared signals, digital signals, etc.). Thus, a machine-readable storage medium includes any type of machine-readable storage medium suitable for storing or transmitting electronic instructions or information in a form readable by a machine (e.g., a computer).
In the drawings, some features of the structures or methods may be shown in a particular arrangement and/or order. However, it is to be understood that such specific arrangement and/or ordering may not be required; rather, in some embodiments, the features may be arranged in a manner and/or order different from that shown in the illustrative figures. In addition, the inclusion of a structural or methodical feature in a particular figure does not imply that such a feature is required in all embodiments; in some embodiments it may be omitted or combined with other features.
It should be noted that, in the apparatus embodiments of the present application, each unit/module is a logical unit/module. Physically, one logical unit/module may be one physical unit/module, a part of one physical unit/module, or a combination of multiple physical units/modules; the physical implementation of the logical units/modules themselves is not essential, and it is the combination of functions implemented by these logical units/modules that is key to solving the technical problem addressed by the present application. Furthermore, in order to highlight the innovative part of the present application, the above apparatus embodiments do not introduce units/modules that are less closely related to solving the technical problem addressed herein, which does not indicate that no other units/modules exist in the above apparatus embodiments.
It is noted that, in the examples and descriptions of this patent, relational terms such as first and second are used solely to distinguish one entity or action from another and do not necessarily require or imply any actual relationship or order between such entities or actions. Also, the terms "comprises," "comprising," and any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but possibly also other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a" does not exclude the presence of additional identical elements in the process, method, article, or apparatus that comprises the element.
While the present application has been shown and described with reference to certain preferred embodiments thereof, it will be understood by those of ordinary skill in the art that various changes in form and details may be made therein without departing from the spirit and scope of the present application.

Claims (13)

1. A swimming information statistical method is applied to a system comprising a first electronic device and a second electronic device, wherein the first electronic device is worn on a first part of a user, the second electronic device is worn on a second part of the user, and the difference between the motion tracks of the first part is smaller than the difference between the motion tracks of the second part under different swimming postures, and the method comprises the following steps:
the first electronic equipment acquires first motion data of the user in a swimming process;
the first electronic equipment acquires second motion data acquired by the second electronic equipment in the swimming process of the user;
the first electronic device performs data framing on the first motion data and the second motion data to obtain framed motion data, wherein the data framing mode comprises:
according to the frame rate of the second motion data, performing down-sampling processing on the first motion data, or according to the frame rate of the first motion data, performing interpolation processing on the second motion data so as to enable the frame rates of the first motion data and the second motion data to be the same;
the first electronic equipment performs sliding window processing on the framed motion data and extracts time domain features;
the first electronic equipment determines a swimming stroke recognition result based on the extracted time domain characteristics and a preset swimming stroke recognition model;
the first electronic equipment counts the swimming stroke of the user based on the swimming stroke recognition result, wherein the swimming information of the user comprises the swimming stroke of the user.
2. The method according to claim 1, wherein the first electronic device determines a swimming stroke recognition result based on the extracted time domain features and a preset swimming stroke recognition model, and the method comprises:
and the first electronic equipment inputs the time domain characteristics of the framed motion data into a preset swimming stroke recognition model to obtain a swimming stroke recognition result.
3. The method according to claim 2, wherein the preset swimming stroke recognition model is composed of at least one two-class decision tree, and,
the first electronic equipment determines a swimming stroke recognition result based on the extracted time domain features and a preset swimming stroke recognition model, and the method comprises the following steps:
adopting the at least one two-classification decision tree and a preset scoring strategy to score the extracted time domain characteristics respectively to obtain a swimming stroke scoring result of the user;
and determining the swimming stroke corresponding to the maximum score in the swimming stroke scoring result as a swimming stroke identification result.
4. The method of claim 3, wherein the swimming information of the user further comprises a user's stroke, and the first electronic device counts the swimming information of the user based on the framed motion data, comprising:
the first electronic device determines first target data used for counting the paddling action of the user in the framed motion data according to the counted swimming stroke of the user, wherein the first target data comprises first target subdata collected by the first electronic device and second target subdata collected by the second electronic device;
the first electronic equipment determines a plurality of roughly screened paddling actions according to the first target subdata;
and the first electronic equipment counts the paddling actions of the user from the plurality of roughly screened paddling actions according to the second target subdata.
5. The method of claim 4, wherein the first electronic device comprises a gyroscope and an accelerometer, and wherein the first target subdata comprises data collected by the gyroscope of the first electronic device or data collected by the accelerometer of the first electronic device.
6. The method of claim 4, wherein the first electronic device includes an accelerometer and the second target subdata includes data collected by an accelerometer of the second electronic device.
7. The method according to any one of claims 4 to 6, wherein the swimming information of the user comprises a turn-around action of the user, and the first electronic device counts the swimming information of the user based on the framed motion data, comprising:
the first electronic equipment determines an initial paddling action of the user;
the first electronic equipment determines the initial swimming direction of the user according to the second motion data corresponding to the continuous first number of paddling actions including the initial paddling action;
the first electronic equipment determines the swimming direction corresponding to each paddling action after the continuous first number of paddling actions according to the second motion data corresponding to each paddling action after the continuous first number of paddling actions;
when the first electronic equipment judges that the user turns according to the difference between the swimming directions corresponding to every two adjacent paddling actions after the first continuous number of paddling actions, the first electronic equipment determines the corresponding target paddling action when the user turns;
the first electronic equipment determines target swimming directions corresponding to a second number of continuous paddling actions after the target paddling action according to the second motion data corresponding to the second number of continuous paddling actions after the target paddling action;
when the first electronic device determines that the difference between the target swimming direction and the initial swimming direction reaches a threshold value, the first electronic device confirms that the user turns around during the target paddling action, and counts the turning-around action of the user.
8. The method of claim 7, further comprising:
the first electronic device updates the initial swimming direction to the target swimming direction.
9. The method of claim 8, wherein the second electronic device comprises a gyroscope, an accelerometer, and a magnetometer, and wherein the second motion data comprises data collected by the gyroscope of the second electronic device, data collected by the accelerometer of the second electronic device, and data collected by the magnetometer of the second electronic device.
10. The method of claim 1, wherein the first electronic device is a sports watch.
11. The method of claim 1, wherein the second electronic device is a sports headset.
12. A computer-readable storage medium having stored thereon instructions that, when executed on an electronic device, cause the electronic device to perform the swimming information statistics method of any one of claims 1-11.
13. An electronic device, comprising:
a memory for storing instructions for execution by one or more processors of an electronic device, and,
a processor for performing the swim information statistics method of any of claims 1-11 when the instructions are executed by one or more processors.
CN202210509907.5A 2022-05-11 2022-05-11 Swimming information statistical method, computer-readable storage medium and electronic device Active CN114602155B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210509907.5A CN114602155B (en) 2022-05-11 2022-05-11 Swimming information statistical method, computer-readable storage medium and electronic device


Publications (2)

Publication Number Publication Date
CN114602155A CN114602155A (en) 2022-06-10
CN114602155B true CN114602155B (en) 2023-02-21

Family

ID=81870362


Country Status (1)

Country Link
CN (1) CN114602155B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115155044A (en) * 2022-07-13 2022-10-11 杭州光粒科技有限公司 Method, device, equipment and medium for determining swimming turn-around time

Family Cites Families (19)

KR20060020396A (en) * 2004-08-31 2006-03-06 성준태 Health machine for swimming
CN106334307B (en) * 2015-07-07 2018-07-31 天彩电子(深圳)有限公司 A kind of swimming monitoring method
US20170038848A1 (en) * 2015-08-07 2017-02-09 Fitbit, Inc. User identification via collected sensor data from a wearable fitness monitor
GB2550394B8 (en) * 2016-05-19 2020-10-21 Polar Electro Oy Enhancing monitoring of swimming
CN106175781B (en) * 2016-08-25 2019-08-20 歌尔股份有限公司 Utilize the method and wearable device of wearable device monitoring swimming state
BR112019003561B1 (en) * 2016-08-31 2022-11-22 Apple Inc SYSTEM AND METHOD FOR IMPROVING THE ACCURACY OF A BODY WEAR DEVICE AND DETERMINING A USER'S ARM MOVEMENT
CN106251584B (en) * 2016-10-14 2018-07-06 山西大学 Multifunctional intellectual swimming bracelet and swimming state monitoring device and method
JP2018068705A (en) * 2016-10-31 2018-05-10 セイコーエプソン株式会社 Electronic apparatus, program, method and recording medium
CN107115653B (en) * 2016-11-03 2023-04-28 京东方科技集团股份有限公司 Device for adjusting swimming stroke, swimming stroke information processing system and swimming stroke information processing method
CN207950567U (en) * 2018-01-24 2018-10-12 西安科技大学 A kind of swimming position monitoring system
CN108379818A (en) * 2018-04-20 2018-08-10 国家体育总局体育科学研究所 A kind of technology analysis system and method for swimming exercise
JP7458650B2 (en) * 2018-04-26 2024-04-01 オムニバス 157 プロプリエタリー リミテッド System and method for systematically representing swimmer motion performance metrics
CN108452504B (en) * 2018-07-03 2020-04-14 昆山快乐岛运动电子科技有限公司 Swimming posture analysis device and method based on sensor
CN108939512B (en) * 2018-07-23 2020-05-19 大连理工大学 Swimming posture measuring method based on wearable sensor
US11097177B1 (en) * 2020-08-25 2021-08-24 Orkus Swim Llc Repulsion-based swim system and methods for use thereof
CN112587901B (en) * 2020-11-24 2022-05-31 安徽华米健康科技有限公司 Swimming gesture recognition method, device, system and storage medium
CN112542030A (en) * 2020-12-02 2021-03-23 英华达(上海)科技有限公司 Intelligent wearable device, method and system for detecting gesture and storage medium
CN113713358B (en) * 2021-06-28 2023-12-26 深圳市奋达智能技术有限公司 Swimming monitoring method, device, storage medium and program product based on multi-sensor fusion
CN114153576A (en) * 2021-11-09 2022-03-08 上海卓菡科技有限公司 Multi-task scheduling method and device based on wearable device and electronic device



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant