CN114265502A - Swimming mode automatic identification method and wearable device - Google Patents

Swimming mode automatic identification method and wearable device

Info

Publication number
CN114265502A
Authority
CN
China
Prior art keywords
swimming
data
state
user
stroke
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202111582182.4A
Other languages
Chinese (zh)
Inventor
卜祝亮
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Xian Wingtech Electronic Technology Co Ltd
Original Assignee
Xian Wingtech Electronic Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Xian Wingtech Electronic Technology Co Ltd filed Critical Xian Wingtech Electronic Technology Co Ltd
Priority to CN202111582182.4A priority Critical patent/CN114265502A/en
Publication of CN114265502A publication Critical patent/CN114265502A/en
Pending legal-status Critical Current

Abstract

The application discloses a swimming mode automatic identification method and a wearable device. The swimming mode automatic identification method comprises the following steps: monitoring state data of a user in real time, and matching the state data of the user against template data corresponding to each swimming stroke state of the user, wherein the template data comprise swimming action data and/or physiological data; and when the state data of the user are successfully matched with the template data, determining the swimming stroke state of the user according to the matched template data and automatically starting a swimming mode. Because the template data corresponding to each swimming stroke state comprise swimming action data and/or physiological data, they truly reflect the behavior habits corresponding to the individual user's swimming stroke states, making the template data more reliable and accurate. The corresponding template data are automatically identified and matched against the user state data monitored in real time, so the user's swimming stroke state is determined and the swimming mode is started automatically without manual operation, which improves the user experience.

Description

Swimming mode automatic identification method and wearable device
Technical Field
The present disclosure relates generally to the field of smart wearable technologies, and in particular, to an automatic swimming mode identification method and a wearable device.
Background
Nowadays people pay more and more attention to health and exercise, and swimming has become one of the most popular forms of aerobic exercise. With the progress of smart watch waterproofing technology, smart watches are gradually being used to monitor swimming. By monitoring swimming states and providing reminders through the smart watch, people can understand their own exercise condition more intuitively.
At present, in the smart watch field, the wearer has to start the swimming mode manually while the watch is running. The operation path is: raise the wrist to light the screen, tap the watch face, tap the workout entry, scroll the list to select swimming, and tap start. The operation path for starting the swimming mode on an existing watch is long, operating in water is many times more difficult, and the user experience is poor.
Disclosure of Invention
In view of the above-mentioned drawbacks and deficiencies of the prior art, it is desirable to provide an automatic swimming mode identification method and a wearable device.
In a first aspect, a swimming mode automatic identification method is provided, which includes:
monitoring the state data of a user in real time, and matching the state data of the user with template data corresponding to each swimming posture state of the user; the template data comprises swimming action data and/or physiological data;
and when the state data of the user is successfully matched with the template data, determining the swimming stroke state corresponding to the user according to the matched template data, and automatically starting a swimming mode.
Preferably, the method for acquiring template data corresponding to each swimming posture state of the user includes:
collecting state data corresponding to each swimming posture state of a user in the swimming process; the status data comprises swim motion data and/or physiological data;
and performing data analysis on the state data corresponding to each swimming stroke state, and determining the template data corresponding to each swimming stroke state of the user.
Further preferably, after the data analysis of the state data corresponding to each swimming stroke state, the method further includes:
and collecting state data corresponding to each swimming posture state of the user in the swimming process for multiple times, and correcting the template data according to the state data collected for multiple times.
Preferably, the swimming stroke data includes at least one of: paddling sound, paddling count and paddling frequency; the physiological data includes at least one of: heart rate, calories.
Preferably, the swimming posture state comprises at least one of: breaststroke posture state, freestyle stroke posture state, butterfly stroke posture state and backstroke posture state.
In a second aspect, there is provided a wearable device comprising:
the matching module is used for monitoring the state data of the user in real time and matching the state data of the user with the template data corresponding to each swimming stroke state of the user; the template data comprises swimming action data and/or physiological data;
and the swimming posture state determining module is used for determining the swimming posture state corresponding to the user according to the matched template data and automatically starting a swimming mode when the state data of the user is successfully matched with the template data.
Preferably, the wearable device further comprises an obtaining module, and the obtaining module is configured to obtain template data corresponding to each swimming posture state of the user.
Preferably, the obtaining module includes:
the acquisition unit is used for acquiring state data corresponding to each swimming posture state of a user in the swimming process; the status data comprises swim motion data and/or physiological data;
and the data analysis unit is used for carrying out data analysis on the state data corresponding to each swimming posture state and determining the template data corresponding to each swimming posture state of the user.
Further preferably, the obtaining module further includes:
and the correction unit is used for collecting the state data corresponding to each swimming posture state of the user in the swimming process for multiple times, and correcting the template data according to the state data collected for multiple times.
Preferably, the swimming stroke data includes at least one of: paddling sound, paddling count and paddling frequency; the physiological data includes at least one of: heart rate, calories.
Preferably, the swimming posture state comprises at least one of: breaststroke posture state, freestyle stroke posture state, butterfly stroke posture state and backstroke posture state.
In a third aspect, an embodiment of the present application provides an electronic device, which includes a memory and a processor, where the memory stores a computer program, and the processor implements the steps of the automatic swimming mode identification method provided in any embodiment of the present application when executing the computer program.
In a fourth aspect, the present application provides a computer-readable storage medium, on which a computer program is stored, where the computer program is executed by a processor to implement the steps of the automatic swimming mode identification method provided in any of the embodiments of the present application.
According to the swimming mode automatic identification method and the wearable device, the template data comprise swimming action data and/or physiological data. Combining swimming action data with physiological data reflects the behavior habits corresponding to the individual user's swimming stroke states more truly, makes the template data more reliable and accurate, and allows swimming stroke state identification to be performed for the individual user in a targeted manner. The corresponding template data are automatically identified and matched against the user state data monitored in real time, so the user's swimming stroke state is determined and the swimming mode is started automatically; the swimming mode does not need to be started manually, which improves the user experience.
Drawings
Other features, objects and advantages of the present application will become more apparent upon reading of the following detailed description of non-limiting embodiments thereof, made with reference to the accompanying drawings in which:
fig. 1 is an exemplary flowchart of a swimming mode automatic identification method provided in an embodiment of the present application;
fig. 2 is a schematic process diagram of acquiring template data corresponding to each swimming stroke state of a user according to an embodiment of the present application;
fig. 3 is an exemplary structural block diagram of an implementation manner of a wearable device provided in an embodiment of the present application;
fig. 4 is an exemplary structural block diagram of another preferred embodiment of the wearable device provided in the embodiments of the present application;
FIG. 5 is a diagram of automatic recognition of swimming patterns provided by an embodiment of the present application;
fig. 6 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
Detailed Description
The present application will be described in further detail with reference to the following drawings and examples. It is to be understood that the specific embodiments described herein are merely illustrative of the relevant invention and not restrictive of the invention. It should be noted that, for convenience of description, only the portions related to the present invention are shown in the drawings.
It should be noted that the embodiments and features of the embodiments in the present application may be combined with each other without conflict. The present application will be described in detail below with reference to the embodiments with reference to the attached drawings.
Fig. 1 is an exemplary flowchart of a swimming mode automatic identification method according to an embodiment of the present disclosure. As shown in fig. 1, in this embodiment, the swimming mode automatic identification method provided by the present invention includes:
s10: monitoring the state data of a user in real time, and matching the state data of the user with template data corresponding to each swimming posture state of the user; the template data comprises swimming action data and/or physiological data;
s20: and when the state data of the user is successfully matched with the template data, determining the swimming stroke state corresponding to the user according to the matched template data, and automatically starting a swimming mode.
Specifically, the template data provided by the embodiment of the application are determined according to the user's behavior habits while swimming, and comprise swimming action data and/or physiological data. Combining swimming action data with physiological data reflects the behavior habits corresponding to the user's personal swimming stroke states more truly, makes the template data more reliable and accurate, and allows swimming stroke state identification to be performed for the user in a targeted manner. In the intelligent recognition and response stage of the swimming mode automatic identification method, the corresponding template data are automatically identified and matched against the user state data monitored in real time, so the user's swimming stroke state is determined and the swimming mode is started automatically without manual operation, which improves the user experience.
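As a reading aid (not part of the original patent text), steps S10 and S20 can be pictured as a small real-time matching loop. The sketch below is illustrative only: the template values, the feature layout (stroke-sound energy, paddling frequency, heart rate), the distance metric, the threshold and the helper callables are all assumptions, since the application does not prescribe them.

import numpy as np

# Hypothetical template store: one feature vector per stroke state,
# learned beforehand from the user's own swimming data (see S101-S103).
TEMPLATES = {
    "breaststroke": np.array([0.62, 1.05, 78.0]),   # [sound energy, paddling freq (Hz), heart rate]
    "freestyle":    np.array([0.88, 1.60, 95.0]),
    "backstroke":   np.array([0.71, 1.30, 85.0]),
    "butterfly":    np.array([0.95, 1.45, 102.0]),
}
MATCH_THRESHOLD = 0.15  # assumed normalized-distance threshold for a "successful" match


def match_state(state_vector):
    """S10: compare monitored state data against every stroke template."""
    best_state, best_dist = None, float("inf")
    for stroke, template in TEMPLATES.items():
        # Normalized Euclidean distance; the application only says "match",
        # so the metric here is an assumption.
        dist = np.linalg.norm((state_vector - template) / template)
        if dist < best_dist:
            best_state, best_dist = stroke, dist
    return best_state if best_dist < MATCH_THRESHOLD else None


def monitoring_loop(read_state_vector, start_swim_mode):
    """S20: once a template matches, report the stroke state and start swim mode."""
    while True:
        state = read_state_vector()          # e.g. [stroke-sound energy, paddling frequency, heart rate]
        stroke = match_state(np.asarray(state, dtype=float))
        if stroke is not None:
            start_swim_mode(stroke)          # swim mode starts without any manual operation
            break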
It should be noted that the swimming posture state includes at least one of the following items: breaststroke posture state, freestyle stroke posture state, butterfly stroke posture state and backstroke posture state.
In one embodiment, the swimming stroke data comprises at least one of: paddling sound, paddling count and paddling frequency; the physiological data includes at least one of: heart rate, calories.
Specifically, the process of collecting sound is called sound pickup. In the embodiment of the present application, the paddling sound is used as swimming action data for monitoring and analysis because a voiceprint has both specificity and relative stability. A voiceprint is the spectrum of sound waves, displayed by electro-acoustic instruments, that carries speech information. After adulthood, a person's voice remains relatively stable over a long period. Experiments show that whether a speaker deliberately imitates another person's voice and tone or speaks in a whisper, the speaker's voiceprint remains distinct even when the imitation is lifelike. Relying on these two characteristics, investigators can compare a collected voiceprint of a criminal with that of a suspect through voiceprint identification technology, quickly identify the criminal, and provide reliable evidence for investigation and case solving. The paddling voiceprints corresponding to different swimming strokes are different (the specificity of voiceprints), while the paddling voiceprints corresponding to the same stroke are the same (the relative stability of voiceprints), so the method distinguishes different swimming stroke states according to the paddling sound. The paddling count, paddling frequency and the like are also determined from the paddling sound, further helping to determine the swimming action data corresponding to each swimming stroke state.
The heart rate and calorie consumption of the user vary within different ranges for each swimming stroke. For example, in freestyle the speed is higher and more calories are consumed per unit time, so the heart rate changes over a larger range and more quickly. Different swimming strokes also produce clearly different paddling sounds, so the user's heart-rate and calorie changes are collected synchronously to further assist in distinguishing the different swimming stroke states.
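To illustrate how the paddling sound and the synchronously sampled physiological data could be fused into one comparable state vector, the following sketch computes a few crude spectral features and appends the heart rate and calories; the specific features, window handling and function names are assumptions, not details from the application.

import numpy as np

def sound_features(audio, sample_rate):
    """Crude 'voiceprint' summary of one window of paddling sound:
    overall energy, spectral centroid and dominant frequency."""
    audio = np.asarray(audio, dtype=float)
    spectrum = np.abs(np.fft.rfft(audio))
    freqs = np.fft.rfftfreq(len(audio), d=1.0 / sample_rate)
    energy = float(np.sum(spectrum ** 2))
    centroid = float(np.sum(freqs * spectrum) / (np.sum(spectrum) + 1e-9))
    dominant = float(freqs[np.argmax(spectrum)])
    return np.array([energy, centroid, dominant])


def state_vector(audio, sample_rate, heart_rate_bpm, calories_per_min):
    """Combine swimming action data (sound) and physiological data into one vector,
    mirroring the 'action data and/or physiological data' wording of the method."""
    return np.concatenate([sound_features(audio, sample_rate),
                           [heart_rate_bpm, calories_per_min]])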
In an embodiment, referring to fig. 2, fig. 2 is a schematic process diagram of a preferred implementation of the method for acquiring template data corresponding to each swimming stroke state of the user in step S10, where the method for acquiring template data corresponding to each swimming stroke state of the user specifically includes:
s101: collecting state data corresponding to each swimming posture state of a user in the swimming process; the status data comprises swim motion data and/or physiological data;
s102: and performing data analysis on the state data corresponding to each swimming stroke state, and determining the template data corresponding to each swimming stroke state of the user.
Specifically, during swimming, different users produce different swimming action data and physiological data even in the same swimming stroke state. Therefore, in this application the template data corresponding to each of the user's swimming stroke states are acquired through machine learning, which learns the behavior habits corresponding to each of the user's swimming stroke states, determines the template data for each stroke state, and thereby distinguishes the behavior habits of different users while swimming. In acquiring the template data, both the swimming action data and the physiological data are considered; combining the two reflects the behavior habits corresponding to the individual user's swimming stroke states more truly, makes the acquired template data more reliable and accurate, and allows swimming stroke state identification to be performed for each user in a targeted manner.
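A minimal illustration of steps S101-S102 is to group the state vectors collected for each stroke and take a per-stroke mean as the template; this centroid "model" merely stands in for the machine learning described here, whose exact form the application does not specify.

from collections import defaultdict
import numpy as np

def build_templates(labeled_samples):
    """labeled_samples: iterable of (stroke_name, state_vector) pairs recorded while
    the user swims each stroke (S101).  Returns one template per stroke (S102)."""
    grouped = defaultdict(list)
    for stroke, vector in labeled_samples:
        grouped[stroke].append(np.asarray(vector, dtype=float))
    # The "data analysis" here is just a per-stroke mean; a real device could fit
    # a more elaborate model, which the application does not prescribe.
    return {stroke: np.mean(vectors, axis=0) for stroke, vectors in grouped.items()}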
In the swimming process, the swimming action data and the physiological data of the same user in different swimming stroke states show different change rules, and the swimming action data and the physiological data of the same user in the same swimming stroke state show the same change rule, such as:
In the breaststroke state: the arm movement of breaststroke consists mainly of three actions performed in the water, the out-sweep, the in-sweep and the forward extension; the in-sweep is an accelerated movement, and its acceleration clearly distinguishes it from the other two actions. Therefore, during the whole sound pickup process, the sounds generated by the arm movements in the water are recorded separately to form a breaststroke sound combination.
In the freestyle state: the arm movement of freestyle consists of an above-water part and an underwater part, with the two arms paddling alternately; after leaving the water the arm is momentarily stationary above the surface, and its movement out of the water is essentially at constant speed. Underwater, the arm movement divides into two phases, a slow catch and an accelerated push, which produce different sounds. Therefore, during the whole sound pickup process, the sounds generated by the arm above and under the water are recorded separately to form a freestyle sound combination.
In the backstroke state: the arm movement of backstroke is similar to freestyle, with the two arms paddling alternately above and under the water; the difference is that the two arms stroke continuously at a constant speed with no pause in between, so they produce continuous, regular above-water and underwater sounds. Therefore, during the whole sound pickup process, the sounds generated by the arm above and under the water are recorded separately to form a backstroke sound combination.
In the embodiment of the application, in the process of acquiring the template data corresponding to each swimming stroke state of a user, the in-water audio of the user in the various swimming stroke states is collected, a large amount of stroke-related information such as paddling sound, paddling count and paddling frequency is gathered, and the physiological data generated by the corresponding strokes, such as the personalized heart rate and calories, are collected synchronously. After collection, the data are computed and organized to obtain the template data corresponding to each personalized swimming stroke state; machine learning is thus completed and a swimming stroke data model is obtained, from which the swimming stroke can be identified automatically. Because both the user's swimming action data (paddling sound, paddling frequency and the like) and personal physiological data (heart rate, calories and the like) are considered, and the model is obtained through machine learning, the swimming stroke data model truly reflects the behavior habits corresponding to the user's personal swimming stroke states and identifies each user's swimming stroke state in a targeted manner.
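One way to picture the per-stroke "sound combinations" described above is to segment the pickup signal into loud and quiet runs and keep their durations as a signature; breaststroke, freestyle and backstroke then leave visibly different patterns. The framing and thresholding below are assumed for illustration and are not taken from the application.

import numpy as np

def sound_combination(audio, sample_rate, frame_ms=50):
    """Split the paddling audio into short frames, mark the loud ones, and return
    the sequence of (is_loud, duration_in_seconds) runs forming the stroke's signature."""
    audio = np.asarray(audio, dtype=float)
    frame_len = int(sample_rate * frame_ms / 1000)
    n_frames = len(audio) // frame_len
    if n_frames == 0:
        return []
    frames = audio[: n_frames * frame_len].reshape(n_frames, frame_len)
    energy = np.sum(frames ** 2, axis=1)
    loud = energy > 0.5 * np.median(energy) + 1e-9  # assumed adaptive loudness threshold

    pattern, run_value, run_len = [], bool(loud[0]), 0
    for flag in loud:
        flag = bool(flag)
        if flag == run_value:
            run_len += 1
        else:
            pattern.append((run_value, round(run_len * frame_ms / 1000.0, 3)))
            run_value, run_len = flag, 1
    pattern.append((run_value, round(run_len * frame_ms / 1000.0, 3)))
    return pattern  # e.g. [(True, 0.4), (False, 0.2), (True, 0.6), ...]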
In a preferred embodiment, as shown in fig. 2, step S102 is followed by:
s103: and collecting state data corresponding to each swimming posture state of the user in the swimming process for multiple times, and correcting the template data according to the state data collected for multiple times.
Specifically, after the swimming action data and physiological data have been collected and processed to obtain the template data corresponding to the personalized swimming stroke states, the user's swimming state data still need to be collected continuously, for example by repeatedly recording stroke audio and heart-rate changes, and the template data are corrected accordingly to reduce data deviation.
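The correction step can be illustrated by nudging each template toward newly collected state vectors with an exponential moving average; the learning rate below is an assumed parameter, and the application does not specify the correction formula.

import numpy as np

def correct_template(template, new_samples, learning_rate=0.1):
    """Move an existing stroke template toward freshly collected state vectors,
    so repeated swims gradually reduce the deviation between template and user."""
    corrected = np.asarray(template, dtype=float).copy()
    for sample in new_samples:
        corrected = (1.0 - learning_rate) * corrected + learning_rate * np.asarray(sample, dtype=float)
    return corrected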
It should be understood that although the various steps in the flowcharts of figs. 1-2 are shown in the order indicated by the arrows, they are not necessarily performed in that order. Unless explicitly stated otherwise herein, the steps are not strictly limited to this order and may be performed in other orders. Moreover, at least some of the steps in fig. 1 may include multiple sub-steps or stages that are not necessarily performed at the same time but may be performed at different times, and these sub-steps or stages are not necessarily performed sequentially but may be performed in turn or alternately with other steps or with at least part of the sub-steps or stages of other steps.
Fig. 3 is a block diagram illustrating an exemplary structure of a wearable device provided in an embodiment of the present application, and as shown in fig. 3, a wearable device 400 provided in an embodiment of the present application includes:
a matching module 402, configured to monitor state data of a user in real time, and match the state data of the user with template data corresponding to each swimming stroke state of the user; the template data comprises swimming action data and/or physiological data;
and a swimming posture state determining module 403, configured to determine, according to the matched template data, a swimming posture state corresponding to the user and automatically start a swimming mode when the state data of the user is successfully matched with the template data.
In particular, the wearable device 400 may be a watch. In the intelligent recognition and response stage, the user wears the watch and enters the swimming pool; the matching module 402 monitors the user's state data in real time and automatically identifies and responds to the corresponding template data, and the swimming stroke state determining module 403 finally determines the user's swimming stroke state and automatically starts the swimming mode. This minimizes user operation while achieving automatic identification of the swimming mode, avoids the problem of slow touch interaction on a wet watch screen, and improves the user experience. The template data comprise swimming action data and/or physiological data, so they reflect the behavior habits corresponding to the individual user's swimming stroke states more truly, are more reliable and accurate, and allow swimming stroke state identification to be performed for the individual user in a targeted manner.
For specific limitations of the wearable device 400, reference may be made to the above limitations of the automatic swimming pattern recognition method, which are not described herein again. The various modules in the wearable device described above may be implemented in whole or in part by software, hardware, and combinations thereof. The modules can be embedded in a hardware form or independent from a processor in the computer device, and can also be stored in a memory in the computer device in a software form, so that the processor can call and execute operations corresponding to the modules.
In a preferred embodiment, referring to fig. 4, the wearable device 400 further includes an obtaining module 401, configured to obtain template data corresponding to each swimming stroke state of the user.
Specifically, in the machine learning stage, the obtaining module 401 acquires, through machine learning, the template data corresponding to each swimming stroke state and to the user's individual behavior habits.
In one embodiment, the obtaining module 401 includes:
the acquisition unit is used for acquiring state data corresponding to each swimming posture state of a user in the swimming process; the status data comprises swim motion data and/or physiological data;
and the data analysis unit is used for carrying out data analysis on the state data corresponding to each swimming posture state and determining the template data corresponding to each swimming posture state of the user.
Specifically, the acquisition unit includes a microphone (mic), an optical heart rate sensor, and the like. The working principle of the microphone is as follows: sound vibrates the diaphragm through the air, and the electromagnetic coil attached to the diaphragm cuts the magnetic field of the magnet surrounding the moving-coil capsule, generating a weak fluctuating current; feeding this current to a loudspeaker converts the fluctuating current back into sound by the reverse process.
In the machine learning stage, the watch collects data for the various swimming stroke states: through the microphone it gathers a large amount of stroke-related information such as paddling sound, stroke count and stroke frequency, and through the optical heart rate sensor it synchronously collects the physiological data generated by the corresponding strokes, such as the personalized heart rate and calories. After collection, the data are computed and organized to obtain the template data corresponding to the personalized swimming stroke states; machine learning is completed and a swimming stroke data model is obtained, from which the swimming stroke can be identified automatically. In acquiring the template data, both the swimming action data and the physiological data are considered; combining the two reflects the behavior habits corresponding to the individual user's swimming stroke states more truly, makes the acquired template data more reliable and accurate, and allows swimming stroke state identification to be performed for each user in a targeted manner.
In the intelligent recognition and response stage, as shown in fig. 5, the user wears the watch and enters the swimming pool while the watch is in a low-power mic monitoring state. When the user begins to paddle, the watch's mic, optical heart rate sensor and the like monitor in real time and acquire state data such as paddling sound and heart rate; the state data are feature-analyzed, matched against the template data corresponding to the swimming stroke states and intelligently analyzed to determine the corresponding swimming stroke state, and the swimming mode is started automatically, completing the automatic identification function. After the watch starts the swimming mode, it automatically displays information such as the swimming stroke state, total stroke count, stroke time and heart rate in graphic and text form on the watch.
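The watch-side behavior in fig. 5 can be sketched as a small state machine: a low-power mic monitoring state, an analysis state that also reads the optical heart rate sensor, and the swim mode itself. The state names, the polling interval and the helper callables below are illustrative assumptions, not identifiers from the application.

import time
from enum import Enum, auto


class WatchState(Enum):
    LOW_POWER_MIC_MONITORING = auto()
    ANALYZING = auto()
    SWIM_MODE = auto()


def run_watch(sound_detected, read_state_vector, match_state, show_swim_screen,
              poll_interval_s=1.0):
    """Loop through the fig. 5 stages: listen cheaply, analyze when paddling is heard,
    and start swim mode (stroke state, stroke count, heart rate on screen) on a match."""
    state = WatchState.LOW_POWER_MIC_MONITORING
    while state is not WatchState.SWIM_MODE:
        if state is WatchState.LOW_POWER_MIC_MONITORING:
            if sound_detected():                       # mic only, heart-rate sensor still idle
                state = WatchState.ANALYZING
        elif state is WatchState.ANALYZING:
            stroke = match_state(read_state_vector())  # mic + optical heart-rate sensor
            if stroke is not None:
                show_swim_screen(stroke)               # stroke state, totals, heart rate, etc.
                state = WatchState.SWIM_MODE
            else:
                state = WatchState.LOW_POWER_MIC_MONITORING
        time.sleep(poll_interval_s)
    return state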
In summary, accurate identification of the swimming mode is achieved on the watch terminal device through mic monitoring, sound pickup, big-data integration and machine learning, reducing redundant user operations and improving the user experience while also guaranteeing a certain degree of identification accuracy.
It should be noted that the watch leaves the factory with a learning algorithm model installed; by acquiring a large amount of swimming state data generated by the user, it starts the swimming mode more accurately and automatically.
In a preferred embodiment, the obtaining module 401 further includes:
and the correction unit is used for collecting the state data corresponding to each swimming posture state of the user in the swimming process for multiple times, and correcting the template data according to the state data collected for multiple times.
Specifically, after the swimming action data and physiological data have been collected and processed to obtain the template data corresponding to the personalized swimming stroke states, the user's swimming state data still need to be collected continuously, and the correction unit corrects the template data to reduce data deviation.
In one embodiment, an electronic device is provided, the internal structure of which may be as shown in FIG. 6. The electronic device comprises a processor, a memory, a communication interface, a display screen and an input device connected through a system bus. The processor of the electronic device is configured to provide computing and control capabilities. The memory of the electronic device comprises a non-volatile storage medium and an internal memory. The non-volatile storage medium stores an operating system and a computer program. The internal memory provides an environment for running the operating system and the computer program in the non-volatile storage medium. The communication interface of the electronic device is used for wired or wireless communication with an external terminal; the wireless communication can be realized through WIFI, an operator network, Near Field Communication (NFC) or other technologies. The computer program is executed by the processor to implement an automatic swimming mode identification method. The display screen of the electronic device can be a liquid crystal display screen or an electronic ink display screen, and the input device of the electronic device can be a touch layer covering the display screen, a key, a trackball or a touch pad arranged on the housing of the electronic device, or an external keyboard, touch pad or mouse.
Those skilled in the art will appreciate that the configuration shown in fig. 6 is a block diagram of only a portion of the configuration associated with the present application, and does not constitute a limitation on the electronic device to which the present application is applied, and a particular electronic device may include more or less components than those shown in the drawings, or may combine certain components, or have a different arrangement of components.
In one embodiment, the wearable device provided herein may be implemented in the form of a computer program that is executable on an electronic device such as that shown in fig. 6. The memory of the electronic device may store various program modules constituting the wearable device, such as an acquisition module 401, a matching module 402, and a swimming stroke state determination module 403 shown in fig. 4. The respective program modules constitute computer programs that cause a processor to execute the steps in the automatic swimming pattern recognition methods of the respective embodiments of the present application described in the present specification.
For example, the electronic device shown in fig. 6 may perform step S10 through the matching module 402, monitor the status data of the user in real time, and match the status data of the user with the template data corresponding to each swimming posture status of the user; the template data comprises swim motion data and/or physiological data. The electronic device may execute step S20 through the swimming stroke status determining module 403, and when the status data of the user is successfully matched with the template data, determine the swimming stroke status corresponding to the user according to the matched template data, and automatically start the swimming mode. The electronic device may execute steps S101-S103 through the obtaining module 401 to obtain template data corresponding to each swimming posture state of the user.
In one embodiment, an electronic device is provided, comprising a memory storing a computer program and a processor implementing steps S10, S20 when executing the computer program.
In one embodiment, the processor, when executing the computer program, further performs the steps of:
s101: collecting state data corresponding to each swimming posture state of a user in the swimming process; the status data comprises swim motion data and/or physiological data;
s102: performing data analysis on the state data corresponding to each swimming stroke state, and determining template data corresponding to each swimming stroke state of the user;
s103: and collecting state data corresponding to each swimming posture state of the user in the swimming process for multiple times, and correcting the template data according to the state data collected for multiple times.
In one embodiment, a computer readable storage medium is also provided, on which a computer program is stored, which when executed by a processor, performs the following steps S10, S20.
It will be understood by those skilled in the art that all or part of the processes of the methods of the embodiments described above can be implemented by a computer program instructing related hardware; the computer program can be stored in a non-volatile computer-readable storage medium, and when executed can include the processes of the embodiments of the methods described above. Any reference to memory, database, or other medium used in the embodiments provided herein may include at least one of non-volatile and volatile memory. Non-volatile memory may include read-only memory (ROM), magnetic tape, floppy disk, flash memory, optical storage, or the like. Volatile memory can include random access memory (RAM) or external cache memory. By way of illustration and not limitation, RAM is available in many forms, such as static random access memory (SRAM), dynamic random access memory (DRAM), and the like.
The technical features of the above embodiments can be arbitrarily combined, and for the sake of brevity, all possible combinations of the technical features in the above embodiments are not described, but should be considered as the scope of the present specification as long as there is no contradiction between the combinations of the technical features.
The above-mentioned embodiments only express several embodiments of the present application, and the description thereof is more specific and detailed, but not construed as limiting the scope of the invention. It should be noted that, for a person skilled in the art, several variations and modifications can be made without departing from the concept of the present application, which falls within the scope of protection of the present application. Therefore, the protection scope of the present patent shall be subject to the appended claims.

Claims (10)

1. A swimming mode automatic identification method is characterized by comprising the following steps:
monitoring the state data of a user in real time, and matching the state data of the user with template data corresponding to each swimming posture state of the user; the template data comprises swimming action data and/or physiological data;
and when the state data of the user is successfully matched with the template data, determining the swimming stroke state corresponding to the user according to the matched template data, and automatically starting a swimming mode.
2. The method for automatically identifying swimming patterns according to claim 1, wherein the method for acquiring the template data corresponding to each swimming stroke state of the user comprises the following steps:
collecting state data corresponding to each swimming posture state of a user in the swimming process; the status data comprises swim motion data and/or physiological data;
and performing data analysis on the state data corresponding to each swimming stroke state, and determining the template data corresponding to each swimming stroke state of the user.
3. The method for automatically identifying swimming patterns according to claim 2, wherein the step of analyzing the state data corresponding to each swimming stroke state further comprises:
and collecting state data corresponding to each swimming posture state of the user in the swimming process for multiple times, and correcting the template data according to the state data collected for multiple times.
4. A swimming pattern automatic recognition method according to any of claims 1-3, wherein the swimming stroke data comprises at least one of: paddling sound, paddling count and paddling frequency; the physiological data includes at least one of: heart rate, calories.
5. A swimming pattern automatic recognition method according to any of claims 1-3, wherein the swimming stroke status comprises at least one of: breaststroke posture state, freestyle stroke posture state, butterfly stroke posture state and backstroke posture state.
6. A wearable device, comprising:
the matching module is used for monitoring the state data of the user in real time and matching the state data of the user with the template data corresponding to each swimming stroke state of the user; the template data comprises swimming action data and/or physiological data;
and the swimming posture state determining module is used for determining the swimming posture state corresponding to the user according to the matched template data and automatically starting a swimming mode when the state data of the user is successfully matched with the template data.
7. The wearable device according to claim 6, further comprising an acquisition module for acquiring template data corresponding to each swimming stroke state of the user.
8. The wearable device of claim 7, wherein the acquisition module comprises:
the acquisition unit is used for acquiring state data corresponding to each swimming posture state of a user in the swimming process; the status data comprises swim motion data and/or physiological data;
and the data analysis unit is used for carrying out data analysis on the state data corresponding to each swimming posture state and determining the template data corresponding to each swimming posture state of the user.
9. An electronic device, characterized in that it comprises a memory and a processor, said memory storing a computer program, characterized in that said processor, when executing said computer program, implements the steps of the automatic swimming mode recognition method according to any one of claims 1-5.
10. A computer-readable storage medium, on which a computer program is stored, which, when being executed by a processor, carries out the steps of the automatic swimming pattern recognition method according to any one of claims 1-5.
CN202111582182.4A 2021-12-22 2021-12-22 Swimming mode automatic identification method and wearable device Pending CN114265502A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111582182.4A CN114265502A (en) 2021-12-22 2021-12-22 Swimming mode automatic identification method and wearable device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111582182.4A CN114265502A (en) 2021-12-22 2021-12-22 Swimming mode automatic identification method and wearable device

Publications (1)

Publication Number Publication Date
CN114265502A true CN114265502A (en) 2022-04-01

Family

ID=80828819

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111582182.4A Pending CN114265502A (en) 2021-12-22 2021-12-22 Swimming mode automatic identification method and wearable device

Country Status (1)

Country Link
CN (1) CN114265502A (en)

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination