CN117034095A - Yoga action detection method, device and system - Google Patents
- Publication number: CN117034095A
- Application number: CN202210461386.0A
- Authority: CN (China)
- Prior art keywords: user, yoga, electronic device, feature, data
- Legal status: Pending (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Landscapes
- Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)
Abstract
The embodiment of the application provides a yoga action detection method, device, and system, where the method can be applied to an electronic device. The method includes the following steps: the electronic device obtains biological information, sent by a first wearable device, of a user during yoga exercise, where the biological information includes the user's myoelectric data and motion data. The electronic device detects the user's yoga action characteristics from the biological information, so that the user's yoga action performance can be comprehensively monitored in real time, which facilitates a comprehensive evaluation of the user's yoga actions.
Description
Technical Field
The embodiment of the application relates to the field of electronic equipment, in particular to a yoga action detection method, device and system.
Background
At present, in the process of learning and training yoga actions, real-time guidance of a user's yoga actions is mainly provided in two ways: on-site guidance by an offline professional coach, and online real-time guidance based on visual images.
The offline on-site coaching approach is easily limited by the availability of offline yoga venues, and the user generally needs to purchase coaching courses at considerable expense, so the dependence on the coach is high and the cost performance is low. The online visual-image-based approach is easily influenced by external factors such as the user's location and environment, so the acquired visual images may not be accurate enough; in addition, this approach can only detect the user's obvious physical actions, which may lead to inaccurate and incomplete guidance of the user's yoga actions and a poor user experience.
Disclosure of Invention
The embodiment of the application provides a yoga action detection method, device, and system, which can detect the user's yoga action characteristics during yoga exercise by using the user's motion data and myoelectric data, so that the user's yoga action performance can be comprehensively monitored in real time.
In a first aspect, a method for yoga motion detection is provided, which is applied to an electronic device, and includes:
the electronic device acquires biological information of a user during yoga exercise, sent by a first wearable device, where the biological information includes motion data and myoelectric data;
the electronic device determines the user's yoga action characteristics according to the biological information.
According to the embodiment of the application, the electronic equipment can detect the yoga action characteristics of the user through the motion data and the myoelectricity data of the user from the first wearable equipment, so that the performance of the yoga action of the user can be comprehensively monitored in real time, and the comprehensive evaluation of the yoga action of the user is facilitated.
In addition, in the embodiment of the application, monitoring of the user's yoga action performance is not limited by offline yoga venues, and the user does not need to purchase training courses at considerable expense, so the cost performance is higher and the user experience is improved. Moreover, when the first wearable device acquires the user's motion data and myoelectric data, it is not influenced by external factors such as the user's location and environment. This can improve the accuracy of detecting the user's yoga action characteristics and makes it possible to provide professional guidance for the user's yoga actions.
With reference to the first aspect, in certain implementations of the first aspect, the yoga motion feature includes one or more of a gesture feature, a force feature, and a respiration feature.
According to the embodiment of the application, the electronic equipment can comprehensively monitor the gesture characteristics, the force-giving characteristics and the breathing characteristics of the user in the yoga motion process through the motion data and the myoelectricity data of the user in the yoga motion process. This helps to realize carrying out comprehensive evaluation and professional guidance to user's yoga action.
With reference to the first aspect, in certain implementations of the first aspect, the posture feature and the breathing feature are determined from the motion data, and the stress feature is determined from the myoelectric data.
According to the embodiment of the application, the electronic device can detect the user's posture characteristics and breathing characteristics during yoga exercise from the motion data, and can detect the user's force-applying characteristics from the myoelectric data, so that the user's yoga action performance can be comprehensively monitored and evaluated.
With reference to the first aspect, in certain implementations of the first aspect, when the yoga action feature includes the gesture feature, the method further includes:
The electronic equipment classifies and identifies yoga actions of the user according to the gesture features and a preset yoga database.
According to the embodiment of the application, the electronic device can classify and identify the user's yoga actions as a whole according to the user's posture characteristics and the preset yoga database. This helps the electronic device monitor and evaluate the user's performance of each yoga pose more accurately and comprehensively.
With reference to the first aspect, in certain implementations of the first aspect, the gesture feature includes one or more of a gesture angle, a gesture duration, and a gesture stability of a target body part of the user, wherein the target body part is a body part wearing the first wearable device.
According to the embodiment of the application, the electronic device can detect, from the user's motion data during yoga exercise, the rotation angle, duration, and stability of the posture of different body parts during each yoga pose. This helps realize comprehensive monitoring and evaluation of the user's action posture during yoga exercise.
With reference to the first aspect, in certain implementations of the first aspect, the force-generating features include a location of a force-generating muscle and a magnitude of the force.
According to the embodiment of the application, the electronic device can determine the location of the user's force-applying muscles and the magnitude of the applied force from the user's myoelectric data during yoga exercise, which facilitates comprehensive monitoring and evaluation of the user's force application during yoga exercise.
With reference to the first aspect, in certain implementation manners of the first aspect, the yoga action feature further includes a concentration feature of the user, and the method further includes:
the electronic equipment acquires brain wave data of the user in the yoga movement process, which is sent by the second wearable equipment;
the electronic device determines a concentration characteristic of the user based on the brain wave data.
In the embodiment of the application, the electronic equipment can detect the concentration degree characteristic of the user through the brain wave data of the user from the second wearable equipment, thereby being beneficial to monitoring and evaluating the mental concentration condition of the user in the yoga exercise process.
With reference to the first aspect, in certain implementation manners of the first aspect, the determining, by the electronic device, a concentration characteristic of the user according to the brain wave data includes:
the electronic equipment classifies the concentration degree of the user according to the brain wave data and the preset yoga database.
According to the embodiment of the application, the electronic device can grade the user's concentration according to the user's brain wave data and the preset yoga database, so the user's mental concentration during yoga exercise can be evaluated more accurately.
With reference to the first aspect, in certain implementations of the first aspect, the method further includes: and outputting prompt information according to the preset yoga database and the yoga action characteristics, wherein the prompt information is used for guiding the yoga action of the user.
According to the embodiment of the application, the electronic device can comprehensively evaluate the user's yoga actions according to the detected yoga action characteristics and the preset yoga database. For example, the electronic device may evaluate whether the user's action posture is standard according to the posture characteristics, whether the user's breathing rhythm matches the pose according to the breathing characteristics, whether the user's force-applying muscles and force magnitude are correct according to the force characteristics, or whether the user is concentrating on yoga according to the concentration grade. Based on the evaluation result, the electronic device can output prompt information in real time to provide professional guidance for the user's yoga actions, so that the user can correct nonstandard yoga actions in time, which improves the user experience.
In some embodiments, the prompt information may be at least one of a text prompt, a vibration prompt, or a voice prompt, and the user can adjust his or her yoga actions according to the prompt information.
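The evaluation-to-prompt step described above can be sketched as follows. This is a minimal illustration only: the field names, defaults, and prompt messages are invented for the sketch and are not taken from the patent.

```python
def build_prompts(evaluation: dict) -> list[str]:
    """Turn a per-feature evaluation result into user-facing prompt messages.

    Keys absent from `evaluation` are treated as "no problem detected",
    so no prompt is emitted for them.
    """
    prompts = []
    if not evaluation.get("posture_standard", True):
        prompts.append("Adjust your posture toward the reference pose.")
    if not evaluation.get("breathing_matched", True):
        prompts.append("Slow your breathing to match the pose rhythm.")
    if not evaluation.get("force_correct", True):
        prompts.append("Engage the correct muscle group and ease the force.")
    if evaluation.get("concentration", "high") == "low":
        prompts.append("Refocus your attention on the current pose.")
    return prompts
```

Each prompt string could then be shown as text, played as voice, or paired with a vibration pattern, matching the prompt forms mentioned above.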
In a second aspect, there is provided an apparatus for yoga motion detection, the apparatus comprising:
an acquisition unit, configured to acquire biological information of a user during yoga exercise, sent by a first wearable device, where the biological information includes motion data and myoelectric data;
and the processing unit is used for determining yoga action characteristics of the user according to the biological information.
With reference to the second aspect, in certain implementations of the second aspect, the yoga motion feature includes one or more of a gesture feature, a force feature, and a respiration feature.
With reference to the second aspect, in certain implementations of the second aspect, the posture feature and the breathing feature are determined from the motion data, and the stress feature is determined from the myoelectric data.
With reference to the second aspect, in certain implementations of the second aspect, when the yoga action feature includes the gesture feature, the processing unit is further configured to:
classifying and identifying the user's yoga actions according to the posture characteristics and a preset yoga database.
With reference to the second aspect, in certain implementations of the second aspect, the gesture feature includes one or more of a gesture angle, a gesture duration, and a gesture stability of the user's target body part rotation,
wherein the target body part is the body part wearing the first wearable device.
With reference to the second aspect, in certain implementations of the second aspect, the force-generating features include a location of a force-generating muscle and a magnitude of the force.
With reference to the second aspect, in certain implementations of the second aspect, the yoga action feature further includes a concentration feature of the user,
the acquisition unit is also used for acquiring brain wave data of the user in the yoga movement process sent by the second wearable equipment,
the processing unit is also used for determining the concentration degree characteristic of the user according to the brain wave data.
With reference to the second aspect, in certain implementations of the second aspect, the processing unit is specifically configured to:
and grading the concentration degree of the user according to the brain wave data and the preset yoga database.
With reference to the second aspect, in certain implementations of the second aspect, the processing unit is further configured to:
and outputting prompt information according to the preset yoga database and the yoga action characteristics, wherein the prompt information is used for guiding the yoga action of the user.
In a third aspect, an electronic device is provided that includes one or more processors and one or more memories; the one or more memories store one or more computer programs comprising instructions which, when executed by the one or more processors, cause the electronic device to perform the method in the first aspect or any possible implementation of the first aspect.
In a fourth aspect, a system for yoga motion detection is provided, where the system includes the electronic device of the third aspect and a first wearable device, where the first wearable device is configured to:
acquiring biological information of a user in a yoga exercise process, wherein the biological information comprises exercise data and myoelectricity data;
the biometric information is transmitted to the electronic device.
In one possible implementation, the system further includes a second wearable device, where the second wearable device is configured to:
Acquiring brain wave data of the user in the yoga exercise process;
and sending the brain wave data to the electronic equipment.
In a fifth aspect, there is provided a computer readable storage medium comprising computer instructions which, when run on an electronic device, cause the electronic device to perform the method of the first aspect or any one of the possible implementations of the first aspect.
In a sixth aspect, there is provided a computer program product for, when run on an electronic device, causing the electronic device to perform the method of the first aspect or any one of the possible implementations of the first aspect.
In a seventh aspect, a chip is provided that includes a processor and a memory, where the memory stores instructions and the processor is configured to execute the instructions stored in the memory; when the instructions are executed, the processor performs the method of the first aspect or any possible implementation of the first aspect.
For the advantages of the second aspect to the seventh aspect, refer to the advantages of the first aspect above; the description is not repeated.
Drawings
Fig. 1 is a schematic diagram of a yoga motion detection system architecture to which the yoga motion detection method according to the embodiment of the present application is applicable.
Fig. 2 is an application scenario of a yoga motion detection method provided by the embodiment of the application.
Fig. 3 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
Fig. 4 is a schematic structural diagram of a first wearable device according to an embodiment of the present application.
Fig. 5 is a schematic structural diagram of a second wearable device according to an embodiment of the present application.
FIG. 6 is a set of GUIs provided in an embodiment of the present application.
FIG. 7 is another set of GUIs provided in an embodiment of the present application.
Fig. 8 is another GUI provided by an embodiment of the present application.
FIG. 9 is a set of GUIs provided in an embodiment of the present application.
FIG. 10 is a set of GUIs provided in an embodiment of the present application.
Fig. 11 is a schematic flow chart of a yoga motion detection method according to an embodiment of the present application.
Fig. 12 is a schematic flow chart of a yoga motion detection method according to an embodiment of the present application.
Fig. 13 is a schematic flow chart of another yoga motion detection method according to an embodiment of the present application.
Fig. 14 is a flow chart of another yoga motion detection method according to an embodiment of the present application.
Fig. 15 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
Fig. 16 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be described below with reference to the accompanying drawings.
The terminology used in the following examples is for the purpose of describing particular embodiments only and is not intended to limit the application. As used in the specification of the application and the appended claims, the singular forms "a," "an," and "the" are intended to include expressions such as "one or more," unless the context clearly indicates otherwise. It should also be understood that in the following embodiments of the present application, "at least one" and "one or more" mean one, two, or more than two. The term "and/or" describes an association relationship between associated objects and indicates that three relationships may exist; for example, A and/or B may represent: A alone, both A and B, and B alone, where A and B may be singular or plural. The character "/" generally indicates that the associated objects are in an "or" relationship.
Reference in the specification to "one embodiment" or "some embodiments" or the like means that a particular feature, structure, or characteristic described in connection with the embodiment is included in one or more embodiments of the application. Thus, appearances of the phrases "in one embodiment," "in some embodiments," "in other embodiments," and the like in the specification are not necessarily all referring to the same embodiment, but mean "one or more but not all embodiments" unless expressly specified otherwise. The terms "comprising," "including," "having," and variations thereof mean "including but not limited to," unless expressly specified otherwise.
Yoga is an ancient and easily mastered practice that can improve people's physiological, psychological, emotional, and mental condition; it is a form of exercise that harmonizes body, mind, and spirit, and is popular among people whose pace of life keeps accelerating. Yoga is practiced through body position adjustment, breathing adjustment, meditation, and the like, so as to achieve the integration of body and mind. Therefore, in the process of learning and training yoga, a person's body, breathing, mind, and other aspects need to be monitored in real time so that comprehensive guidance can be given.
At present, in the process of yoga learning and training, real-time guidance of a user's yoga actions is mainly provided in two ways: on-site guidance by an offline professional coach, and online real-time guidance based on visual images. With on-site instruction by an offline coach, although the trainee receives professional and comprehensive instruction, the trainee must go to a dedicated yoga classroom and is easily limited by the availability of offline yoga venues. In addition, this approach depends heavily on the coach, and the user generally needs to purchase coaching courses at high cost, so the cost performance is low and the user experience suffers.
An online visual-image-based approach, such as a yoga lesson in artificial intelligence (AI) fitness, can monitor a user's yoga activity based on visual images, with the user training along with video lessons on a smart screen. Compared with on-site instruction by an offline coach, the online mode does not require the user to purchase courses at high cost, so the cost performance is higher. However, this approach is susceptible to factors such as the user's location and environment, and the acquired visual images may be inaccurate. In addition, it can only monitor the user's obvious physical actions and cannot comprehensively monitor the user's breathing, mental state, and other conditions during yoga exercise. This may result in inaccurate and incomplete guidance of the user's yoga actions and a poor user experience.
Based on the above, the embodiment of the application provides a yoga action detection method, which can be applied to electronic equipment. When a user performs yoga exercise, the electronic device can detect action characteristics of the user in the process of the yoga exercise by acquiring exercise data and myoelectricity data of the user sent by the first wearable device, so that yoga action performance of the user can be monitored comprehensively in real time. This helps to achieve comprehensive and professional guidance of the user's yoga action.
Fig. 1 illustrates the architecture of a yoga motion detection system to which the yoga motion detection method provided by the embodiment of the present application is applicable. As shown in fig. 1, the system architecture may include an electronic device 100 and a plurality of first wearable devices 200. The electronic device 100 and the first wearable device 200 may be connected by wireless communication. The connection may be established, for example, by at least one of the following wireless technologies: Bluetooth (BT), near field communication (NFC), or wireless fidelity (Wi-Fi).
In the embodiment of the present application, the electronic device 100 and the first wearable device 200 are illustrated as examples through bluetooth connection.
The first wearable device 200 may be a wearable device worn on different body parts of the user, and may take different product forms according to the body part on which it is worn. For example, as shown in fig. 1, the product form of the first wearable device 200 may include a smart ankle band, a smart wristband, a smart headband, or smart glasses, etc., which the present application does not limit.
The first wearable device 200 may acquire the user's biological information in real time through sensors and transmit the biological information to the electronic device 100. The biological information may include motion data and myoelectric data, and the motion data may include acceleration data and angular velocity data.
In one example, the acceleration data may include an acceleration signal and/or characteristic information of the acceleration signal. For example, the first wearable device 200 may collect an acceleration signal of the user through the acceleration sensor, and may further process the acceleration signal to extract characteristic information of the acceleration signal, such as a time domain characteristic and a frequency domain characteristic of the acceleration signal.
In yet another example, the angular velocity data may include an angular velocity signal and/or characteristic information of the angular velocity signal. For example, the first wearable device 200 may collect an angular velocity signal of the user through a gyro sensor, and may also process the angular velocity signal to extract characteristic information of the angular velocity signal, such as a time domain characteristic and a frequency domain characteristic of the angular velocity signal.
In another example, the myoelectrical data may include a myoelectrical signal of the user and/or characteristic information of the myoelectrical signal. For example, the first wearable device 200 may collect the myoelectric signal of the user through the myoelectric sensor, and may also process the myoelectric signal to extract characteristic information of the myoelectric signal, such as a time domain characteristic and a frequency domain characteristic of the myoelectric signal.
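The time-domain and frequency-domain feature extraction mentioned in the three examples above could be sketched as follows. This is a minimal illustration using NumPy; the specific feature set (mean, standard deviation, RMS, dominant frequency) is an assumption for the sketch, not a feature list stated in the patent.

```python
import numpy as np

def extract_features(signal: np.ndarray, fs: float) -> dict:
    """Compute simple time- and frequency-domain features of a 1-D sensor window.

    `signal` is a window of acceleration, angular-velocity, or EMG samples;
    `fs` is the sampling rate in Hz.
    """
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    return {
        "mean": float(np.mean(signal)),             # time domain: average level
        "std": float(np.std(signal)),               # time domain: variability
        "rms": float(np.sqrt(np.mean(signal**2))),  # time domain: signal energy
        # frequency domain: strongest non-DC spectral component
        "dominant_freq": float(freqs[np.argmax(spectrum[1:]) + 1]),
    }
```

In practice such features would be computed per sliding window on either the wearable device or the electronic device, then fed to the downstream detection logic.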
The electronic device 100 may be a portable electronic device that also includes other functionality, such as personal digital assistant and/or music player functionality, for example a cell phone, a tablet computer, or a wearable electronic device with wireless communication capability (e.g., a smart watch). Exemplary embodiments of the portable electronic device include, but are not limited to, devices running various operating systems. The portable electronic device may also be another portable electronic device, such as a laptop computer. It should also be appreciated that in other embodiments, the electronic device may not be a portable electronic device but a desktop computer.
The electronic device 100 may receive the bio-information transmitted by the first wearable device 200 and determine a yoga action characteristic of the user according to the bio-information. The yoga motion characteristics of the user may include one or more of a gesture characteristic, a force characteristic, and a respiration characteristic of the user, among others. In some embodiments, the electronic device 100 may further perform comprehensive evaluation on the yoga action of the user according to the yoga action characteristics of the user and a preset yoga database, and output prompt information to guide the yoga action of the user.
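One conceivable way to match detected posture features against a preset yoga database is a nearest-reference comparison over joint angles, sketched below. The database contents, pose names, body parts, and angle values are all invented for the illustration; the patent does not specify the matching algorithm.

```python
import math

# Hypothetical "preset yoga database": reference posture angles (degrees)
# per pose, keyed by body part. Names and values are illustrative only.
YOGA_DATABASE = {
    "tree_pose":   {"left_knee": 45.0, "right_knee": 175.0},
    "warrior_two": {"left_knee": 90.0, "right_knee": 175.0},
}

def classify_pose(measured: dict[str, float]) -> str:
    """Return the database pose whose reference angles are closest
    (Euclidean distance over the pose's body parts) to the measured angles."""
    def distance(ref: dict[str, float]) -> float:
        return math.sqrt(sum((measured[part] - angle) ** 2
                             for part, angle in ref.items()))
    return min(YOGA_DATABASE, key=lambda pose: distance(YOGA_DATABASE[pose]))
```

The per-part angle deviations from the matched reference could also feed the evaluation and prompt output described below in the description.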
It should be noted that, in order to implement the technical solution of the present application, the number of electronic devices 100 and first wearable devices 200 may each be one or more, which is not limited in the present application.
As shown in fig. 1, in some embodiments, the yoga motion detection system may further include a second wearable device 300, which is communicatively connected to the electronic device 100. The second wearable device 300 may be a wearable device worn on the head of the user, and its product form may include a smart headband or a smart helmet, which is not limited by the present application.
The second wearable device 300 may acquire brain wave data of the user in real time through a brain wave sensor. For example, the second wearable device 300 may collect brain wave signals of the user through the brain wave sensor, and may also extract characteristic information of the brain wave signals, such as time domain and frequency domain characteristics of the brain wave signals. In addition, the second wearable device 300 may also transmit the brain wave data to the electronic device 100, so that the electronic device 100 may determine a concentration characteristic of the user from the brain wave data.
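The application does not specify how the concentration characteristic is derived from the brain wave data; one common heuristic, shown here purely as an assumed sketch (band limits and the score formula are not taken from the application), rates concentration by the ratio of beta-band power to alpha- and theta-band power:

```python
import numpy as np

def band_power(sig, fs, lo, hi):
    """Power of `sig` within the [lo, hi] Hz band, estimated from the FFT."""
    spectrum = np.abs(np.fft.rfft(sig - np.mean(sig))) ** 2
    freqs = np.fft.rfftfreq(len(sig), d=1.0 / fs)
    mask = (freqs >= lo) & (freqs <= hi)
    return float(np.sum(spectrum[mask]))

def concentration_score(eeg, fs):
    """Beta / (alpha + theta) power ratio as a rough concentration index:
    relatively strong beta activity is commonly read as a sign of focus."""
    theta = band_power(eeg, fs, 4, 8)
    alpha = band_power(eeg, fs, 8, 13)
    beta = band_power(eeg, fs, 13, 30)
    return beta / (alpha + theta + 1e-12)

fs = 256
t = np.arange(0, 4, 1 / fs)
# Synthetic "focused" trace: strong 20 Hz beta plus weak 10 Hz alpha.
focused = 1.0 * np.sin(2 * np.pi * 20 * t) + 0.2 * np.sin(2 * np.pi * 10 * t)
# Synthetic "relaxed" trace: strong alpha, weak beta.
relaxed = 0.2 * np.sin(2 * np.pi * 20 * t) + 1.0 * np.sin(2 * np.pi * 10 * t)
```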
It should be noted that when the first wearable device 200 is worn on the head of the user and can also obtain brain wave data of the user through a brain wave sensor, the first wearable device 200 and the second wearable device 300 may be the same wearable device.
In other embodiments, the yoga motion detection system may further include a smart screen 400, which is communicatively connected to the electronic device 100. When the electronic device 100 comprehensively evaluates the yoga action of the user according to the yoga action characteristics of the user and the preset yoga database, the result of the comprehensive evaluation can be displayed to the user in real time on the display screen of the smart screen 400. For example, a multi-dimensional evaluation graph of the user's yoga action may be displayed on the display screen, or a score of the user's yoga action may be displayed. In addition, when the electronic device 100 needs to output prompt information according to the yoga action characteristics of the user and the preset yoga database, the prompt information may be sent to the smart screen 400, and the smart screen 400 may display the prompt information to the user in text form or play it in voice form, so as to guide the yoga action of the user.
Fig. 2 shows an application scenario of the yoga motion detection method provided by an embodiment of the present application.
Illustratively, as shown in fig. 2, a user may select a yoga course on the smart screen 400 and perform yoga movements according to a course video played on the smart screen 400. The electronic device 100 may be connected to the smart screen 400 through Bluetooth, and in response to the user selecting the yoga course on the smart screen 400, the electronic device 100 may detect the yoga actions of the user in real time.
While the user performs yoga, the electronic device 100 may be worn on the user's arm; alternatively, the user may hold the electronic device 100 or place it nearby, which is not limited by the present application.
A plurality of first wearable devices 200 may be worn on different parts of the user's body and connected to the electronic device 100 through Bluetooth. For example, the first wearable devices 200 may be worn on the wrist and leg of the user, or on the chest and leg, or on the upper arm, waist, and leg, or on the wrist, chest, and leg. It will be appreciated that the locations where the plurality of first wearable devices 200 are worn are merely examples and are not limiting of the application.
In order to enable the electronic device 100 to distinguish between first wearable devices 200 worn on different body parts, the wearing parts of the first wearable devices 200 may be calibrated before the yoga movement begins. For example, the electronic device 100 may prompt the user to lift the left arm; in response to the prompt, the first wearable device 200 worn on the left arm may collect an angular velocity signal of the user through its gyro sensor and transmit the angular velocity signal to the electronic device 100. The electronic device 100 may then mark the wearing position of the first wearable device 200 that transmitted the angular velocity signal as the left arm. It will be appreciated that the above calibration method for the wearing parts of the first wearable devices 200 is only an example and is not a limitation of the present application.
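A minimal sketch of the calibration step described above (the device ids, threshold, and signal shapes are assumptions for illustration): the device whose gyroscope trace carries clearly more motion energy during the prompted movement is taken to be the one on the lifted arm.

```python
import numpy as np

def identify_worn_device(gyro_signals, threshold=1.0):
    """Given angular-velocity traces from several devices recorded during a
    prompted movement (e.g. "lift your left arm"), return the id of the single
    device that moved, or None if the result is ambiguous.

    gyro_signals: dict mapping device id -> 1-D array of angular velocity.
    """
    energies = {dev: float(np.mean(np.square(sig)))
                for dev, sig in gyro_signals.items()}
    moving = [dev for dev, e in energies.items() if e > threshold]
    # Exactly one device should have moved; otherwise calibration should be retried.
    if len(moving) != 1:
        return None
    return moving[0]

signals = {
    "band_A": 0.05 * np.random.default_rng(0).standard_normal(200),  # resting limb
    "band_B": 3.0 * np.sin(np.linspace(0, np.pi, 200)),              # lifted arm
}
# band_B carries far more motion energy, so it is marked as the left-arm device.
```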
As shown in fig. 2, when the user performs yoga, the electronic device 100 may receive the biological information of the user transmitted by the first wearable devices 200, the biological information including the motion data and myoelectric data acquired by the first wearable devices 200. The electronic device 100 may determine yoga action characteristics of the user using the motion data and the myoelectric data, so as to comprehensively monitor the user's yoga performance in real time. For example, the electronic device 100 may determine the posture and respiration characteristics of the user from the motion data, and may determine the force characteristics of the user from the myoelectric data, such as which muscles are exerting force and the magnitude of the force.
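As one hedged example of deriving a respiration characteristic from the motion data (the band limits, sampling rate, and chest-worn placement are assumptions, not taken from the application), a chest acceleration trace could be reduced to a breaths-per-minute estimate by locating the dominant frequency in the typical respiration band:

```python
import numpy as np

def breathing_rate_bpm(chest_accel, fs):
    """Estimate breaths per minute from a chest-worn acceleration trace by
    finding the dominant frequency within an assumed respiration band of
    0.1-0.7 Hz (roughly 6-42 breaths per minute)."""
    sig = np.asarray(chest_accel, dtype=float)
    spectrum = np.abs(np.fft.rfft(sig - sig.mean()))
    freqs = np.fft.rfftfreq(sig.size, d=1.0 / fs)
    band = (freqs >= 0.1) & (freqs <= 0.7)
    dominant = freqs[band][np.argmax(spectrum[band])]
    return float(dominant * 60.0)

fs = 50
t = np.arange(0, 60, 1 / fs)
# Simulated chest motion: ~0.25 Hz breathing (15 breaths/min) plus sensor noise.
accel = (np.sin(2 * np.pi * 0.25 * t)
         + 0.1 * np.random.default_rng(1).standard_normal(t.size))
```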
In some embodiments, the electronic device 100 may comprehensively evaluate the yoga action of the user according to the yoga action characteristics of the user and the preset yoga database, and output prompt information to guide the yoga action of the user. For example, the electronic device 100 may send the prompt to the smart screen 400, and the smart screen 400 may display the prompt in text form or play the prompt in voice form.
In some embodiments, as shown in fig. 2, the user may also wear the second wearable device 300 on the head during the yoga movement, the second wearable device 300 and the electronic device 100 being connected through Bluetooth. The second wearable device 300 may acquire brain wave data of the user and transmit the brain wave data to the electronic device 100. The electronic device 100 may determine a concentration characteristic of the user during the yoga movement from the received brain wave data.
In some embodiments, in the application scenario shown in fig. 2, the yoga motion detection system may further include an audio playback device (not shown in the figure), which may be communicatively connected to the electronic device 100. The audio playback device may include an earphone, a speaker, or the like. When the electronic device 100 comprehensively evaluates the yoga action of the user and outputs prompt information, it can generate a corresponding voice prompt according to the prompt information and send the voice prompt to the audio playback device, and the audio playback device can play the corresponding voice according to the voice prompt.
The structures of the electronic device 100, the first wearable device 200, and the second wearable device 300 provided by the embodiment of the present application are described below with reference to fig. 3 to 5.
By way of example, fig. 3 shows a schematic structural diagram of the electronic device 100. The electronic device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (universal serial bus, USB) interface 130, a charge management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, a sensor module 180, keys 190, a motor 191, an indicator 192, a camera 193, a display 194, a subscriber identity module (subscriber identification module, SIM) card interface 195, and so on. The sensor module 180 may include a pressure sensor 180A, a gyro sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, and the like.
It should be understood that the illustrated structure of the embodiment of the present application does not constitute a specific limitation on the electronic device 100. In other embodiments of the application, electronic device 100 may include more or fewer components than shown, or certain components may be combined, or certain components may be split, or different arrangements of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
The processor 110 may include one or more processing units, such as: the processor 110 may include an application processor (application processor, AP), a modem processor, a graphics processor (graphics processing unit, GPU), an image signal processor (image signal processor, ISP), a controller, a memory, a video codec, a digital signal processor (digital signal processor, DSP), a baseband processor, and/or a neural network processor (neural-network processing unit, NPU), etc. Wherein the different processing units may be separate devices or may be integrated in one or more processors.
The controller may be a neural hub and a command center of the electronic device 100, among others. The controller can generate operation control signals according to the instruction operation codes and the time sequence signals to finish the control of instruction fetching and instruction execution.
A memory may also be provided in the processor 110 for storing instructions and data. In some embodiments, the memory in the processor 110 is a cache memory. The memory may hold instructions or data that the processor 110 has just used or uses cyclically. If the processor 110 needs to use the instruction or data again, it can be called directly from the memory. This avoids repeated accesses and reduces the waiting time of the processor 110, thereby improving the efficiency of the system.
In some embodiments, the processor 110 may include one or more interfaces. The interfaces may include an integrated circuit (inter-integrated circuit, I2C) interface, an integrated circuit built-in audio (inter-integrated circuit sound, I2S) interface, a pulse code modulation (pulse code modulation, PCM) interface, a universal asynchronous receiver transmitter (universal asynchronous receiver/transmitter, UART) interface, a mobile industry processor interface (mobile industry processor interface, MIPI), a general-purpose input/output (GPIO) interface, a subscriber identity module (subscriber identity module, SIM) interface, and/or a universal serial bus (universal serial bus, USB) interface, among others.
The wireless communication function of the electronic device 100 may be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, a modem processor, a baseband processor, and the like.
The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals. Each antenna in the electronic device 100 may be used to cover a single or multiple communication bands. Different antennas may also be multiplexed to improve the utilization of the antennas. For example: the antenna 1 may be multiplexed into a diversity antenna of a wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
The mobile communication module 150 may provide a solution for wireless communication including 2G/3G/4G/5G, etc., applied to the electronic device 100. The mobile communication module 150 may include at least one filter, switch, power amplifier, low noise amplifier (low noise amplifier, LNA), etc. The mobile communication module 150 may receive electromagnetic waves from the antenna 1, perform processes such as filtering, amplifying, and the like on the received electromagnetic waves, and transmit the processed electromagnetic waves to the modem processor for demodulation. The mobile communication module 150 can amplify the signal modulated by the modem processor, and convert the signal into electromagnetic waves through the antenna 1 to radiate. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be disposed in the processor 110. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be provided in the same device as at least some of the modules of the processor 110.
The modem processor may include a modulator and a demodulator. The modulator is used for modulating the low-frequency baseband signal to be transmitted into a medium-high frequency signal. The demodulator is used for demodulating the received electromagnetic wave signal into a low-frequency baseband signal. The demodulator then transmits the demodulated low frequency baseband signal to the baseband processor for processing. The low frequency baseband signal is processed by the baseband processor and then transferred to the application processor. The application processor outputs sound signals through an audio device (not limited to the speaker 170A, the receiver 170B, etc.), or displays images or video through the display screen 194. In some embodiments, the modem processor may be a stand-alone device. In other embodiments, the modem processor may be provided in the same device as the mobile communication module 150 or other functional module, independent of the processor 110.
The wireless communication module 160 may provide solutions for wireless communication including wireless local area network (wireless local area networks, WLAN) (e.g., wireless fidelity (wireless fidelity, wi-Fi) network), bluetooth (BT), global navigation satellite system (global navigation satellite system, GNSS), frequency modulation (frequency modulation, FM), near field wireless communication technology (near field communication, NFC), infrared technology (IR), etc., as applied to the electronic device 100. The wireless communication module 160 may be one or more devices that integrate at least one communication processing module. The wireless communication module 160 receives electromagnetic waves via the antenna 2, modulates the electromagnetic wave signals, filters the electromagnetic wave signals, and transmits the processed signals to the processor 110. The wireless communication module 160 may also receive a signal to be transmitted from the processor 110, frequency modulate it, amplify it, and convert it to electromagnetic waves for radiation via the antenna 2.
In some embodiments, antenna 1 and mobile communication module 150 of electronic device 100 are coupled, and antenna 2 and wireless communication module 160 are coupled, such that electronic device 100 may communicate with a network and other devices through wireless communication techniques. The wireless communication techniques may include the Global System for Mobile communications (global system for mobile communications, GSM), general packet radio service (general packet radio service, GPRS), code division multiple access (code division multiple access, CDMA), wideband code division multiple access (wideband code division multiple access, WCDMA), time division code division multiple access (time-division code division multiple access, TD-SCDMA), long term evolution (long term evolution, LTE), BT, GNSS, WLAN, NFC, FM, and/or IR techniques, among others. The GNSS may include a global satellite positioning system (global positioning system, GPS), a global navigation satellite system (global navigation satellite system, GLONASS), a beidou satellite navigation system (beidou navigation satellite system, BDS), a quasi zenith satellite system (quasi-zenith satellite system, QZSS) and/or a satellite based augmentation system (satellite based augmentation systems, SBAS).
The electronic device 100 implements display functions through a GPU, a display screen 194, an application processor, and the like. The GPU is a microprocessor for image processing, and is connected to the display 194 and the application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. Processor 110 may include one or more GPUs that execute program instructions to generate or change display information.
The display screen 194 is used to display images, videos, and the like. The display 194 includes a display panel. The display panel may employ a liquid crystal display (liquid crystal display, LCD), an organic light-emitting diode (organic light-emitting diode, OLED), an active-matrix organic light-emitting diode (active-matrix organic light-emitting diode, AMOLED), a flexible light-emitting diode (flexible light-emitting diode, FLED), a Mini-LED, a Micro-LED, a Micro-OLED, a quantum dot light-emitting diode (quantum dot light emitting diodes, QLED), or the like. In some embodiments, the electronic device 100 may include 1 or N display screens 194, N being a positive integer greater than 1.
The electronic device 100 may implement photographing functions through an ISP, a camera 193, a video codec, a GPU, a display screen 194, an application processor, and the like.
The ISP is used to process data fed back by the camera 193. For example, when photographing, the shutter is opened and light is transmitted to the camera's photosensitive element through the lens; the optical signal is converted into an electrical signal, and the photosensitive element transmits the electrical signal to the ISP for processing, which converts it into an image visible to the naked eye. The ISP can also optimize the noise, brightness, and skin color of the image, as well as parameters such as the exposure and color temperature of a shooting scene. In some embodiments, the ISP may be provided in the camera 193.
The camera 193 is used to capture still images or video. The object generates an optical image through the lens and projects the optical image onto the photosensitive element. The photosensitive element may be a charge coupled device (charge coupled device, CCD) or a Complementary Metal Oxide Semiconductor (CMOS) phototransistor. The photosensitive element converts the optical signal into an electrical signal, which is then transferred to the ISP to be converted into a digital image signal. The ISP outputs the digital image signal to the DSP for processing. The DSP converts the digital image signal into an image signal in a standard RGB, YUV, or the like format. In some embodiments, electronic device 100 may include 1 or N cameras 193, N being a positive integer greater than 1.
The digital signal processor is used for processing digital signals, and can process other digital signals in addition to digital image signals. For example, when the electronic device 100 selects a frequency point, the digital signal processor is used to perform a Fourier transform on the frequency point energy.
Video codecs are used to compress or decompress digital video. The electronic device 100 may support one or more video codecs. In this way, the electronic device 100 may play or record video in a variety of encoding formats, such as: moving picture experts group (moving picture experts group, MPEG) 1, MPEG2, MPEG3, MPEG4, etc.
The NPU is a neural-network (NN) computing processor, and can rapidly process input information by referencing a biological neural network structure, for example, referencing a transmission mode between human brain neurons, and can also continuously perform self-learning. Applications such as intelligent awareness of the electronic device 100 may be implemented through the NPU, for example: image recognition, face recognition, speech recognition, text understanding, etc.
The external memory interface 120 may be used to connect an external memory card, such as a Micro SD card, to enable expansion of the memory capabilities of the electronic device 100. The external memory card communicates with the processor 110 through an external memory interface 120 to implement data storage functions. For example, files such as music, video, etc. are stored in an external memory card.
The internal memory 121 may be used to store computer executable program code including instructions. The processor 110 executes various functional applications of the electronic device 100 and data processing by executing instructions stored in the internal memory 121. The internal memory 121 may include a storage program area and a storage data area. The storage program area may store an application program (such as a sound playing function, an image playing function, etc.) required for at least one function of the operating system, etc. The storage data area may store data created during use of the electronic device 100 (e.g., audio data, phonebook, etc.), and so on. In addition, the internal memory 121 may include a high-speed random access memory, and may further include a nonvolatile memory such as at least one magnetic disk storage device, a flash memory device, a universal flash memory (universal flash storage, UFS), and the like.
The electronic device 100 may implement audio functions through an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, an application processor, and the like. Such as music playing, recording, etc.
The audio module 170 is used to convert digital audio information into an analog audio signal output and also to convert an analog audio input into a digital audio signal. The audio module 170 may also be used to encode and decode audio signals. In some embodiments, the audio module 170 may be disposed in the processor 110, or a portion of the functional modules of the audio module 170 may be disposed in the processor 110.
The speaker 170A, also referred to as a "horn," is used to convert audio electrical signals into sound signals. The electronic device 100 may listen to music, or to hands-free conversations, through the speaker 170A.
A receiver 170B, also referred to as an "earpiece", is used to convert the audio electrical signal into a sound signal. When the electronic device 100 is answering a telephone call or a voice message, voice may be received by placing the receiver 170B close to the human ear.
Microphone 170C, also referred to as a "mic", is used to convert sound signals into electrical signals. When making a call or transmitting voice information, the user can speak with the mouth near the microphone 170C, inputting a sound signal to the microphone 170C. The electronic device 100 may be provided with at least one microphone 170C. In other embodiments, the electronic device 100 may be provided with two microphones 170C, and may implement a noise reduction function in addition to collecting sound signals. In other embodiments, the electronic device 100 may also be provided with three, four, or more microphones 170C to enable collection of sound signals, noise reduction, identification of sound sources, directional recording functions, etc.
The pressure sensor 180A is used to sense a pressure signal, and may convert the pressure signal into an electrical signal. In some embodiments, the pressure sensor 180A may be disposed on the display screen 194. There are many types of pressure sensors 180A, such as resistive pressure sensors, inductive pressure sensors, and capacitive pressure sensors. A capacitive pressure sensor may comprise at least two parallel plates with conductive material. The capacitance between the electrodes changes when a force is applied to the pressure sensor 180A, and the electronic device 100 determines the strength of the press from the change in capacitance. When a touch operation acts on the display screen 194, the electronic device 100 detects the intensity of the touch operation according to the pressure sensor 180A. The electronic device 100 may also calculate the location of the touch based on the detection signal of the pressure sensor 180A. In some embodiments, touch operations that act on the same touch location but with different touch operation intensities may correspond to different operation instructions. For example, when a touch operation with a touch operation intensity greater than or equal to a first pressure threshold acts on the alarm clock application icon, an instruction to create a new alarm clock is executed.
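The threshold behavior described above can be sketched as follows (the threshold value and the instruction names are hypothetical, chosen only to mirror the alarm clock example):

```python
def touch_instruction(capacitance_change, first_pressure_threshold=0.5):
    """Map a touch's capacitance change (used here as a proxy for press
    strength) to an operation instruction, mimicking the distinction between
    an ordinary tap and a firm press on the alarm clock icon."""
    if capacitance_change >= first_pressure_threshold:
        return "create_new_alarm"   # intensity >= first pressure threshold
    return "open_alarm_app"         # ordinary tap below the threshold
```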
The fingerprint sensor 180H is used to collect a fingerprint. The electronic device 100 may utilize the collected fingerprint feature to unlock the fingerprint, access the application lock, photograph the fingerprint, answer the incoming call, etc. For example, when the mobile phone detects a touch operation of a user on the screen locking interface, the mobile phone can collect fingerprint information of the user through the fingerprint sensor 180H and match the collected fingerprint information with fingerprint information preset in the mobile phone. If the matching is successful, the mobile phone can enter the non-screen locking interface from the screen locking interface.
The touch sensor 180K, also referred to as a "touch panel". The touch sensor 180K may be disposed on the display screen 194, and the touch sensor 180K and the display screen 194 form a touch screen, which is also called a "touch screen". The touch sensor 180K is for detecting a touch operation acting thereon or thereabout. The touch sensor may communicate the detected touch operation to the application processor to determine the touch event type. Visual output related to touch operations may be provided through the display 194. In other embodiments, the touch sensor 180K may also be disposed on the surface of the electronic device 100 at a different location than the display 194.
Fig. 4 schematically illustrates a structural diagram of a first wearable device 200 according to an embodiment of the present application.
As shown in fig. 4, the first wearable device 200 may include a processor 210, a memory 220, a sensor module 230, a mobile communication processing module 240, a wireless communication processing module 250, and a power module 260. These components may be connected by a bus, wherein:
processor 210 may include a controller, an arithmetic unit, and registers. The controller may be responsible for instruction decoding and for issuing control signals for the operations corresponding to the instructions. The arithmetic unit may be responsible for performing fixed-point or floating-point arithmetic operations, shift operations, logic operations, and the like, as well as address operations and translations. The registers are mainly responsible for temporarily storing register operands, intermediate operation results, and the like during instruction execution.
In some embodiments, the processor 210 may be configured to parse signals received by the mobile communication processing module 240 and the wireless communication processing module 250. For example, when a user initiates a yoga movement, the processor 210 may be used to parse a request sent by the electronic device 100 to the first wearable device 200 to obtain biological information. The processor 210 may send sensor signals detected by the sensor module 230, such as acceleration signals, angular velocity signals, or myoelectric signals, to the electronic device 100. Alternatively, the processor 210 may perform corresponding analysis processing on the acquired sensor signals, for example extracting the time-domain and frequency-domain features of the acceleration signal and of the angular velocity signal, and send the analysis results to the electronic device 100.
Memory 220 is coupled to processor 210 for storing at least one of various software programs or sets of instructions. In some embodiments, memory 220 may include high-speed random access memory, and may also include non-volatile memory, such as one or more disk storage devices, flash memory devices, or other non-volatile solid state storage devices. Memory 220 may store an operating system and may also store communication programs that may be used to communicate with electronic device 100, one or more servers, or additional devices.
The sensor module 230 may include an acceleration sensor 230A, a gyro sensor 230B, and a myoelectric sensor 230C. Wherein,
the acceleration sensor 230A may be used to detect the magnitude of acceleration of the first wearable device 200 in various directions (typically three axes, i.e., x, y, and z axes). The magnitude and direction of gravity may also be detected when the first wearable device 200 is stationary.
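As an illustrative aside (not part of the application's text), the gravity measurement mentioned above is what makes static posture estimation possible; a standard sketch computes the device's pitch and roll from the gravity vector measured on the three accelerometer axes:

```python
import math

def tilt_from_gravity(ax, ay, az):
    """Pitch and roll (in degrees) of a stationary device, computed from the
    gravity components measured on its x, y, and z accelerometer axes
    (e.g. in m/s^2)."""
    pitch = math.degrees(math.atan2(-ax, math.hypot(ay, az)))
    roll = math.degrees(math.atan2(ay, az))
    return pitch, roll

# Device lying flat: gravity falls entirely on the z axis -> zero pitch and roll.
pitch, roll = tilt_from_gravity(0.0, 0.0, 9.81)
```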
The gyroscopic sensor 230B may be used to determine a motion pose of the first wearable device 200.
The electromyographic sensor 230C may be used to detect the user's electromyographic signal.
The mobile communication processing module 240 may provide a solution for wireless communication including 2G/3G/4G/5G, etc., applied to the first wearable device 200, for performing cellular communication and/or data communication. For example, the mobile communication processing module 240 may include a circuit-switched module ("CS" module) for performing cellular communication and a packet-switched module ("PS" module) for performing data communication. In the present application, the mobile communication processing module 240 may communicate with other devices (e.g., the electronic device 100) through the fourth generation mobile communication technology (4th generation mobile networks) or the fifth generation mobile communication technology (5th generation mobile networks).
The wireless communication processing module 250 may include one or more of a Bluetooth communication processing module 250A and a WLAN communication processing module 250B. In some embodiments, one or more of the Bluetooth communication processing module and the WLAN communication processing module may monitor signals transmitted by other devices (e.g., the electronic device 100), such as probe requests and scan signals, and may send response signals, such as probe responses and scan responses, so that the other devices can discover the first wearable device 200 and establish a wireless communication connection with it to communicate via one or more wireless communication technologies of Bluetooth or WLAN. In other embodiments, one or more of the Bluetooth communication processing module and the WLAN communication processing module may also transmit signals, such as broadcast Bluetooth signals or beacon signals, so that other devices (e.g., the electronic device 100) can discover the first wearable device 200 and establish a wireless communication connection with it to communicate via one or more wireless communication technologies of Bluetooth or WLAN.
The power module 260 may include a charge management module and a power management module. Wherein the charge management module may be configured to receive a charge input from a charger. The charger can be a wireless charger or a wired charger. In some wired charging embodiments, the charging management module may receive a charging input of the wired charger through the USB interface. In some wireless charging embodiments, the charging management module may receive a wireless charging input through a wireless charging coil of the first wearable device 200, which may be housed in a wireless charging module. The charging management module may charge the battery while also powering the first wearable device 200 through the power management module.
The power management module may be used to connect the battery, the charge management module and the processor 210. The power management module receives input from the battery and/or charge management module and provides power to the processor 210 and other components, etc. In other embodiments, the power management module may be disposed in the processor 210. In other embodiments, the power management module and the charge management module may be disposed in the same device.
It will be appreciated that the structure illustrated in fig. 4 does not constitute a specific limitation on the first wearable device 200. In other embodiments of the present application, the first wearable device 200 may include more or fewer components than illustrated, some components may be combined or split, or the components may be arranged differently. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
Fig. 5 schematically illustrates a structural diagram of a second wearable device 300 according to an embodiment of the present application.
As shown in fig. 5, the wearable device 300 may include a processor 310, a memory 320, a sensor module 330, a mobile communication processing module 340, a wireless communication processing module 350, and a power module 360. These components may be connected by a bus, wherein:
the sensor module 330 may include an acceleration sensor 330A, a gyro sensor 330B, and a brain wave sensor 330C. The acceleration sensor 330A and the gyro sensor 330B are the same as the acceleration sensor 230A and the gyro sensor 230B shown in fig. 4; for details, refer to the embodiment shown in fig. 4, which is not repeated here.
The brain wave sensor 330C may detect brain wave signals of the user in real time, so that concentration characteristics of the user may be determined from the brain wave signals.
In addition, the processor 310, the memory 320, the mobile communication processing module 340, the wireless communication processing module 350, and the power module 360 are the same as the processor 210, the memory 220, the mobile communication processing module 240, the wireless communication processing module 250, and the power module 260 shown in fig. 4; for details, refer to the embodiment shown in fig. 4, which is not repeated here.
It will be appreciated that the structure illustrated in fig. 5 does not constitute a specific limitation on the second wearable device 300. In other embodiments of the present application, the second wearable device 300 may include more or fewer components than illustrated, some components may be combined or split, or the components may be arranged differently. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
The following describes a yoga motion detection method provided by the embodiment of the present application with reference to fig. 6 to 14.
Fig. 6 shows a set of graphical user interfaces (graphical user interface, GUI) provided by an embodiment of the present application. In this embodiment, the electronic device 100 is described by taking a mobile phone as an example. Fig. 6 (a) and fig. 6 (b) show a user interface 601 and a user interface 603 of the electronic device 100, respectively.
For example, as shown in fig. 6 (a), the user may click the "start movement" button 601 of the user interface 601 to prepare to start yoga exercise. As shown in fig. 6 (b), in response to the user's click on the button 601, the electronic device 100 may display a user interface 603, and may display, in a dialog box 604 of the user interface 603, prompt information asking the user whether to turn on the yoga action detection function. For example, the prompt in the dialog box 604 may be text information as shown in fig. 6 (b), such as "whether motion detection is required". In response to a gesture operation (e.g., a click) of the user on the confirmation control 605, the electronic device 100 turns on the yoga action detection function.
The electronic device 100 may send, based on the Bluetooth connection, an indication to the first wearable device 200 to start acquiring the user's biological information, or may send an indication to the second wearable device 300 to start acquiring the user's brain wave data. Then, based on the yoga action detection method provided by the embodiment of the present application, the electronic device 100 may receive the biological information of the user sent by the first wearable device 200 and determine the yoga action features of the user from the biological information.
For example, the electronic device 100 may determine the posture feature and the breathing feature of the user according to the motion data in the biological information, and may determine the force feature of the user, such as the location of the exerting muscles and the magnitude of force, according to the myoelectric data in the biological information. Optionally, the electronic device 100 may also determine the user's concentration feature from the received brain wave data. In addition, the electronic device 100 may comprehensively evaluate the yoga action of the user according to the yoga action features of the user and a preset yoga database, and output prompt information to guide the yoga action of the user.
Fig. 7 is a user interface 700 of the electronic device 100 according to an embodiment of the present application when outputting a prompt message.
For example, as shown in fig. 7, when the electronic device 100 determines, according to the breathing features of the user and the preset yoga database, that the breathing rhythm of the user does not match the current pose, for example, the breathing frequency of the user is too fast, the electronic device 100 may display a user interface 700, and may display prompt information in a dialog box 701 of the user interface 700 reminding the user to adjust the breathing. For example, the prompt information in the dialog box 701 may be a text message as shown in fig. 7, such as "the current breathing rhythm is 20 bpm, which is inconsistent with the target breathing rhythm; please adjust your breathing rhythm following the prompt music". A prompt indicating that the user interface 700 will be closed after a few seconds may also be displayed in a dialog box 702 of the user interface 700. For example, the prompt message in the dialog box 702 may be a text message as shown in fig. 7, such as "this prompt will close automatically after 10 seconds". If the user considers that the prompt is not needed, the electronic device 100 closes the user interface 700 in response to a gesture operation (e.g., a click) of the user on the confirmation control 703.
Fig. 8 is another user interface 800 of the electronic device 100 according to the embodiment of the present application when outputting a prompt message.
For example, as shown in fig. 8, when the electronic device 100 determines, according to the force features of the user and the preset yoga database, that the user's exerting muscles are too tense, the electronic device 100 may display a user interface 800, and may display prompt information in a dialog box 801 of the user interface 800 reminding the user to relax the tense body part. For example, the prompt in the dialog box 801 may be a text message as shown in fig. 8, such as "the current leg muscles are too tense; please relax". A prompt indicating that the user interface 800 will be closed after a few seconds may also be displayed in a dialog box 802 of the user interface 800. For example, the prompt message in the dialog box 802 may be a text message as shown in fig. 8, such as "this prompt will close automatically after 10 seconds". If the user considers that the prompt is not needed, the electronic device 100 closes the user interface 800 in response to a gesture operation (e.g., a click) of the user on the confirmation control 803.
Fig. 9 is a further user interface 900 of the electronic device 100 according to the embodiment of the present application when outputting a prompt message.
For example, as shown in fig. 9, when the electronic device 100 classifies and identifies the yoga action of the user and detects that the yoga action is a meditation action, the electronic device 100 may determine, according to the breathing features and concentration features of the user and the preset yoga database, that the user is not focused enough. At this time, the electronic device 100 may display a user interface 900, and may display prompt information in a dialog box 901 of the user interface 900 prompting the user to clear the mind. For example, the prompt information in the dialog box 901 may be text information as shown in fig. 9, such as "please empty your mind, let go of distracting thoughts, and enter a state of self-forgetfulness". A prompt indicating that the user interface 900 will be closed after a few seconds may also be displayed in a dialog box 902 of the user interface 900. For example, the prompt in the dialog box 902 may be text as shown in fig. 9, such as "this prompt will close automatically after 10 seconds". If the user considers that the prompt is not needed, the electronic device 100 closes the user interface 900 in response to a gesture operation (e.g., a click) of the user on the confirmation control 903.
FIG. 10 illustrates another set of GUIs provided by an embodiment of the present application.
As shown in fig. 10 (a), the electronic device 100 may be bluetooth-connected to the smart screen 400. When the electronic device 100 and the smart screen 400 are bluetooth connected, the electronic device 100 may display the window 1001.
The user may select a desired yoga course from among a plurality of workout courses displayed in the "intelligent workout" window interface of the smart screen 400; for example, the user may select a beginner yoga course by operating the icon 1002. The operation on the icon 1002 may be performed with a remote control of the smart screen 400, or by a touch operation or a click operation.
In response to the user's operation at icon 1002, intelligent screen 400 may play the video content of the primary yoga course, and may also send, based on the bluetooth connection, to electronic device 100, indication information to turn on the yoga action detection function, and electronic device 100 turns on the yoga action detection function. When the user performs yoga according to the played course video, the electronic device 100 may send an instruction for acquiring biological information of the user to the first wearable device 200, or may send an instruction for starting to acquire brain wave data of the user to the second wearable device 300. The electronic device 100 may then determine yoga motion characteristics of the user using the received biometric information and brain wave data based on the yoga motion detection method provided in the embodiments of the present application. In addition, the electronic device 100 may perform comprehensive evaluation on the yoga action of the user according to the yoga action characteristics of the user and the preset yoga database, and may send the result of the comprehensive evaluation to the intelligent screen 400. In addition, the electronic device 100 may also send the generated prompt information for guiding the yoga action of the user to the intelligent screen 400.
Referring to fig. 10 (b), in response to the result of the comprehensive evaluation of the user's yoga action sent by the electronic device 100, the smart screen 400 may display an evaluation window 1003 on the display interface that is playing the yoga course video. The evaluation window 1003 may display a multi-dimensional evaluation chart obtained by comprehensively evaluating the yoga action of the user, and a score for the yoga action derived from that chart. For example, as shown in fig. 10 (b), the evaluation window 1003 may display a six-dimensional evaluation chart for the user's yoga action and a current action score of "75 points". The six dimensions may include a balance dimension, a coordination dimension, a softness dimension, an action completion dimension, a force skill dimension, and a concentration dimension.
In addition, in response to the prompt information sent by the electronic device 100, the smart screen 400 may display a dialog box 1004 on a display interface for playing yoga course video, where the dialog box 1004 may display prompt information for guiding the user's yoga action. For example, the prompt in dialog 1004 may be a text message as shown in fig. 10 (b), such as "current leg muscles are too tight, asking the user to relax.
The user can correct the nonstandard yoga action according to the guidance of the prompt information in the dialog box 1004. After the user corrects the nonstandard yoga action, the electronic device 100 may detect the corrected yoga action, perform a new comprehensive evaluation on it, and may send the result of the new comprehensive evaluation and new prompt information to the smart screen 400.
Referring to fig. 10 (c), in response to the new comprehensive evaluation result and the new prompt message sent by the electronic device 100, the smart screen 400 may display an evaluation window 1005 and a dialog box 1006 on the display interface that is playing the yoga course video. For example, the evaluation window 1005 may display a multi-dimensional evaluation chart comprehensively evaluating the corrected yoga action, and a score for the corrected action derived from that chart. For example, as shown in fig. 10 (c), the evaluation window 1005 may display a six-dimensional evaluation chart for the user's yoga action and a current action score of "95 points". The dialog box 1006 may display new prompt information for guiding the user's yoga action. For example, the prompt in the dialog box 1006 may be text information as shown in fig. 10 (c), such as "the current action is standard, please keep it up".
The following specifically describes a yoga motion detection method provided by the embodiment of the present application with reference to fig. 11 to 14.
Fig. 11 shows a schematic flow chart of a method 1100 for yoga motion detection, which may be performed by an electronic device, according to an embodiment of the present application. The method 1100 includes:
s1101, the electronic device obtains biological information from the user sent by the first wearable device, the biological information may include movement data and myoelectricity data.
Specifically, the first wearable device may acquire biological information of the user during yoga and then transmit the biological information to the electronic device. The motion data may include acceleration data and angular velocity data of the user, among others. By way of example, the acceleration data may include an acceleration signal of the user and/or characteristic information of the acceleration signal, and the angular velocity data may include an angular velocity signal of the user and/or characteristic information of the angular velocity signal, to which the present application is not limited.
In some embodiments, the first wearable device may collect an acceleration signal and an angular velocity signal of the user during yoga motion, respectively, through an acceleration sensor and a gyroscope sensor.
In one possible example, the first wearable device may send the acquired acceleration signal and angular velocity signal directly to the electronic device. In another possible example, the first wearable device may process the acquired acceleration signal and angular velocity signal to obtain characteristic information of the two signals, and send the characteristic information to the electronic device. For example, the characteristic information of the acceleration signal may include one or more of a time domain characteristic, a frequency domain characteristic, or a waveform characteristic of the acceleration signal. The characteristic information of the angular velocity signal may include one or more of a time domain characteristic, a frequency domain characteristic, or a waveform characteristic of the angular velocity signal.
Furthermore, the myoelectric data may include myoelectric signals of the user and/or characteristic information of the myoelectric signals. In some embodiments, the first wearable device may collect myoelectric signals of the user during yoga movements by means of a myoelectric sensor.
In one possible example, the first wearable device may send the acquired electromyographic signals directly to the electronic device. In another possible example, the first wearable device may process the collected electromyographic signals to obtain characteristic information of the electromyographic signals, and may send the characteristic information to the electronic device. For example, the characteristic information of the electromyographic signal may comprise a time domain characteristic and/or a frequency domain characteristic of the electromyographic signal. The processing method may include signal preprocessing of the acquired electromyographic signals to remove noise interference, thereby extracting time domain features and frequency domain features of the electromyographic signals.
Illustratively, the time domain features of the signal may include one or more of peak points, maxima, minima, and zero crossings. The frequency domain characteristics of the signals may include dominant frequencies or magnitudes, etc., as the application is not limited in this regard.
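As an illustrative sketch only (the embodiment does not specify an implementation), the time domain and frequency domain features named above might be extracted from a sampled 1-D sensor signal as follows; the feature set and dictionary keys are assumptions:

```python
import numpy as np

def extract_features(signal, fs):
    """Extract illustrative time- and frequency-domain features from a
    1-D sensor signal sampled at fs Hz."""
    signal = np.asarray(signal, dtype=float)
    # Time-domain features: extrema and zero crossings.
    maximum = float(np.max(signal))
    minimum = float(np.min(signal))
    zero_crossings = int(np.sum(np.diff(np.signbit(signal).astype(int)) != 0))
    # Frequency-domain feature: dominant frequency of the mean-removed signal.
    spectrum = np.abs(np.fft.rfft(signal - np.mean(signal)))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    dominant = float(freqs[np.argmax(spectrum)])
    return {"max": maximum, "min": minimum,
            "zero_crossings": zero_crossings, "dominant_freq": dominant}
```

For a 2 Hz sinusoid sampled at 100 Hz, `dominant_freq` evaluates to 2.0, matching the signal's main frequency component.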
S1102, the electronic equipment determines yoga action characteristics of the user according to the biological information.
The yoga motion characteristics of the user may include one or more of a gesture characteristic, a force characteristic, and a respiration characteristic, among others.
In some embodiments, the electronic device may determine the gesture features of the user from the motion data. Specifically, the electronic device may process the received acceleration signal and angular velocity signal to obtain characteristic information of the signals, or may use the received characteristic information of the acceleration signal and angular velocity signal directly. From this characteristic information, the electronic device may detect key points of the user's yoga action (for example, an action end point or an action transition point), segment the yoga action into sub-actions accordingly, and calculate the gesture features of each sub-action.
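A minimal sketch of such key-point detection, under the assumption that key points coincide with transitions between held (low-motion) poses and movement; the window length and threshold are illustrative, not values from the embodiment:

```python
import numpy as np

def find_key_points(accel_mag, fs, hold_s=0.5, thresh=0.05):
    """Detect candidate key points (action end points / transition points)
    as the boundaries between held poses (low motion energy) and movement."""
    win = max(1, int(hold_s * fs))
    # Sample-to-sample motion magnitude, smoothed over a hold-length window.
    motion = np.abs(np.diff(accel_mag, prepend=accel_mag[0]))
    energy = np.convolve(motion, np.ones(win) / win, mode="same")
    quiet = energy < thresh
    # Transitions between quiet and active regions mark key points.
    edges = np.flatnonzero(np.diff(quiet.astype(int)) != 0) + 1
    return edges.tolist()
```

For a signal that is still, then moves, then is still again, the two detected key points bracket the movement segment, which can then be handed to the per-sub-action feature calculation.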
Alternatively, the gesture feature may comprise one or more of a gesture angle, a gesture duration, and a gesture stability of a target body part rotation of the user, as the application is not limited in this respect.
It should be noted that the target body part of the user may be understood as a body part of the user wearing the first wearable device.
In some embodiments, the electronic device may further classify and identify the yoga action of the user according to the gesture features and a preset yoga database. The preset yoga database may be generated from pre-collected standard yoga motion data of a professional coach. After segmenting the user's yoga action into sub-actions, the electronic device may classify and identify the yoga action using a maximum likelihood estimation method, based on the calculated gesture features combined with the data in the preset yoga database. For example, the yoga action of the user may be identified as a cat pose, a dog pose, a boat pose, etc.
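The maximum likelihood classification might be sketched as follows, assuming each pose in the preset yoga database is summarized by a Gaussian template (per-feature mean and variance) derived from the coach's standard motion data; the pose names, feature vector, and template values below are hypothetical:

```python
import math

# Hypothetical per-pose templates: mean and variance of each gesture feature,
# as might be derived from a coach's pre-collected standard motion data.
POSE_TEMPLATES = {
    "cat":  {"mean": [30.0, 5.0],  "var": [25.0, 4.0]},
    "dog":  {"mean": [60.0, 8.0],  "var": [36.0, 4.0]},
    "boat": {"mean": [45.0, 12.0], "var": [16.0, 9.0]},
}

def classify_pose(features):
    """Maximum-likelihood classification: pick the pose whose Gaussian
    template assigns the highest log-likelihood to the feature vector."""
    def log_lik(f, tpl):
        return sum(-0.5 * math.log(2 * math.pi * v) - (x - m) ** 2 / (2 * v)
                   for x, m, v in zip(f, tpl["mean"], tpl["var"]))
    return max(POSE_TEMPLATES,
               key=lambda p: log_lik(features, POSE_TEMPLATES[p]))
```

A feature vector near a template's mean is assigned to that pose; the diagonal-Gaussian model is one simple instance of the maximum likelihood approach, not necessarily the one used by the embodiment.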
In some embodiments, the electronic device may determine the breathing features of the user from motion data acquired by a first wearable device worn on the wrist or chest. The electronic device may band-pass filter the chest or wrist acceleration signal and angular velocity signal according to their waveform characteristics, extract the time domain and frequency domain characteristics of the filtered signals, and calculate from these characteristics the respiration rate corresponding to the user's yoga action.
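A minimal sketch of such a respiration-rate calculation, with the band-pass step performed in the frequency domain; the breathing band limits (0.1-0.7 Hz) are an assumed typical range, not values given by the embodiment:

```python
import numpy as np

def respiration_rate(accel, fs, lo=0.1, hi=0.7):
    """Estimate breaths per minute: restrict the spectrum of the motion
    signal to an assumed breathing band and take the dominant frequency."""
    accel = np.asarray(accel, dtype=float)
    spectrum = np.abs(np.fft.rfft(accel - np.mean(accel)))
    freqs = np.fft.rfftfreq(len(accel), d=1.0 / fs)
    band = (freqs >= lo) & (freqs <= hi)
    return 60.0 * float(freqs[band][np.argmax(spectrum[band])])
```

Restricting the search to the breathing band rejects faster motion components (e.g., limb movement) outside that band; a time-domain band-pass filter followed by peak counting would be an equivalent alternative.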
In one possible implementation, when the first wearable device is worn by both the wrist and chest of the user, the electronic device may perform data fusion on the motion data acquired from the chest and on the motion data acquired from the wrist, thereby calculating the respiration rate of the user. Specifically, the electronic device may determine the corresponding first weight coefficient according to the confidence level of the motion data acquired from the chest, and may determine the corresponding second weight coefficient according to the confidence level of the motion data acquired from the wrist. The electronic device may then compare the relative magnitudes of the first and second weight coefficients, and select motion data corresponding to the relatively greater weight coefficient of the two to calculate the respiration rate of the user. Optionally, the electronic device may determine the first weight coefficient and the second weight coefficient according to the acceleration signals and the angular velocity signals acquired from the wrist and the chest, and may also determine the first weight coefficient and the second weight coefficient according to the characteristic information of the acceleration signals and the characteristic information of the angular velocity signals acquired from the wrist and the chest.
Still alternatively, a first wearable device worn on the wrist may calculate a first respiration rate based on the acquired motion data and send the first respiration rate to the electronic device. The chest-worn first wearable device may calculate a second respiration rate based on the acquired motion data and send the second respiration rate to the electronic device. The electronic device may determine a third weight coefficient based on the confidence level of the first respiration rate and a fourth weight coefficient based on the confidence level of the second respiration rate. The electronic device may then compare the relative magnitudes of the third and fourth weight coefficients, and select a respiration rate corresponding to the relatively greater weight coefficient of the two as the respiration rate of the user. It will be appreciated that the manner in which the above-described electronic device calculates the user's respiration rate is merely an example and is not a limitation of the present application.
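The selection scheme of this paragraph might be sketched as follows, assuming the weight coefficient is simply the confidence value itself (the embodiment does not specify the mapping from confidence to weight):

```python
def fuse_respiration(rate_wrist, conf_wrist, rate_chest, conf_chest):
    """Pick the respiration rate whose confidence-derived weight is larger.
    Taking the weight to equal the confidence is an assumption."""
    return rate_wrist if conf_wrist > conf_chest else rate_chest
```

Here the fusion degenerates to a selection, matching the text; a weighted average of the two rates would be another plausible reading of "data fusion".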
It should be noted that the types of acceleration signal and angular velocity signal acquired by each first wearable device worn on a target body part of the user may be the same, and the electronic device may store a plurality of algorithms for processing these signals. The electronic device may apply different data processing algorithms to the signals collected by different first wearable devices, depending on where each first wearable device is worn and on the purpose of the processing. For example, a first data processing algorithm is applied to the acceleration signal and angular velocity signal acquired by a first wearable device worn on the waist to calculate the gesture features of the user, while a second data processing algorithm is applied to the acceleration signal and angular velocity signal acquired by a first wearable device worn on the chest to calculate the breathing rate of the user. The first data processing algorithm is different from the second data processing algorithm.
In some embodiments, the electronic device may determine a force characteristic of the user from the myoelectric data.
Specifically, the electronic device may perform signal preprocessing on the received electromyographic signal to remove noise interference, so as to extract characteristic information, such as time domain characteristics and frequency domain characteristics, of the electromyographic signal, and calculate the stress characteristics of the user based on the characteristic information. Or, the electronic device may directly calculate the stress characteristics of the user according to the received characteristic information of the electromyographic signals. Alternatively, the force-generating feature may include a location of the force-generating muscle and a magnitude of the force.
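A sketch of one conventional way to compute a force feature from a (pre-filtered) EMG channel, using the RMS envelope as a proxy for contraction intensity; the thresholds mapping RMS to a force level are illustrative assumptions, not calibrated values from the embodiment:

```python
import numpy as np

def emg_force_features(emg, fs, win_s=0.25):
    """Compute an RMS envelope of the EMG channel and map its peak to an
    illustrative force level."""
    emg = np.asarray(emg, dtype=float)
    win = max(1, int(win_s * fs))
    centered = emg - np.mean(emg)
    rms = np.sqrt(np.convolve(centered ** 2, np.ones(win) / win, mode="same"))
    peak = float(np.max(rms))
    # Thresholds below are illustrative, not calibrated values.
    level = "high" if peak > 0.5 else "moderate" if peak > 0.1 else "low"
    return {"rms_peak": peak, "force_level": level}
```

Which muscle is exerting force would follow from which electrode channel (i.e., which wearing position) produced the high envelope, a detail omitted from this single-channel sketch.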
In one possible implementation, the electronic device may also calculate the force characteristics of the user based on the acceleration signal and the angular velocity signal acquired by the first wearable device instead of via myoelectric data, which the present application is not limited to.
In some embodiments, the method 1100 may further comprise:
s1103, the electronic equipment outputs prompt information according to the preset yoga database and the yoga action characteristics, and the prompt information is used for guiding the yoga action of the user.
Specifically, the electronic device may comprehensively evaluate the yoga action of the user by combining the preset yoga database with the yoga action features of the user. For example, the electronic device may evaluate, from the yoga action features, whether the user's action posture is standard, whether the breathing rhythm matches the current pose, and whether the location of the exerting muscles and the magnitude of force are correct. The electronic device may then output prompt information based on the comprehensive evaluation to guide the yoga action of the user.
According to the embodiment of the application, the electronic equipment can detect the yoga action characteristics of the user through the motion data and the myoelectricity data of the user acquired by the first wearable equipment, so that the performance of the yoga action of the user can be comprehensively monitored in real time, and the comprehensive evaluation of the yoga action of the user is facilitated.
In addition, in the embodiment of the present application, monitoring of the user's yoga performance is not limited to an offline yoga studio, and the user does not need to purchase expensive training courses, which makes the solution cost-effective. Moreover, the first wearable device's acquisition of the user's motion data and myoelectric data is not affected by external factors such as the place and environment where the user is located, which improves the accuracy of detecting the user's yoga action features and allows professional guidance suggestions to be provided for the user's yoga actions.
Fig. 12 shows a schematic flow chart of another yoga motion detection method 1200 provided by an embodiment of the present application, and as shown in fig. 12, the method 1200 may be performed by an electronic device. The method 1200 includes:
s1201, the electronic device acquires biological information of the user during the yoga exercise, where the biological information includes exercise data and myoelectricity data.
In particular, the electronic device may receive biometric information acquired by a first wearable device worn on a target body part of a user.
It can be appreciated that the process of the electronic device acquiring the biological information of the user during the yoga exercise may refer to the description related to S1101 in the method 1100, and the disclosure is not repeated here for avoiding repetition.
S1202, the electronic device determines gesture features of the user according to the motion data.
In particular, the motion data may include acceleration data and angular velocity data of the user. The acceleration data may include, among other things, an acceleration signal of the user and/or characteristic information of the acceleration signal. The angular velocity data may comprise an angular velocity signal of the user and/or characteristic information of the angular velocity signal.
It can be appreciated that the process of determining the gesture feature of the user by the electronic device according to the motion data can refer to the description related to S1102 in the method 1100, and the present application is not repeated here for avoiding repetition.
S1203, the electronic device determines a breathing characteristic of the user from the motion data.
In particular, the electronic device may determine the breathing characteristics of the user from motion data acquired by a first wearable device worn on the wrist or chest.
It will be appreciated that the process of determining the breathing characteristics of the user according to the motion data by the electronic device may refer to the description related to S1102 in the method 1100, and the present application is not repeated here for avoiding repetition.
And S1204, the electronic equipment determines the stress characteristics of the user according to the myoelectricity data.
It can be appreciated that the process of the electronic device determining the stress characteristics of the user according to the myoelectric data can refer to the description related to S1102 in the method 1100, and the present application is not repeated here for avoiding repetition.
S1205, the electronic equipment classifies and identifies yoga actions of the user according to the preset yoga database and the gesture characteristics.
After segmenting the user's yoga action into sub-actions, the electronic device can classify and identify the yoga action using a maximum likelihood estimation method, based on the calculated gesture features combined with the data in the preset yoga database. For example, the yoga action of the user may be identified as a cat pose, a dog pose, a boat pose, etc.
S1206, the electronic equipment comprehensively evaluates yoga actions of the user.
In some embodiments, the electronic device may comprehensively evaluate the yoga action of the user in three aspects, namely posture, breathing, and force exertion, according to a preset yoga database and the gesture feature, the breathing feature, and the stress feature.
Specifically, it may include:
s1207, the electronic device judges whether the action gesture of the user is standard according to the preset yoga database and the gesture characteristics.
S1208, the electronic device judges whether the breathing rhythm of the user matches the current pose according to the preset yoga database and the breathing features.
S1209, the electronic equipment judges whether the force muscle parts and the force are correct according to the preset yoga database and the force characteristics.
In other embodiments, the electronic device may perform data fusion on the gesture feature, the breathing feature, and the stress feature in combination with the preset yoga database, so as to perform a comprehensive multi-dimensional chart evaluation of the user's yoga action. Illustratively, the multi-dimensional chart may include at least two of a balance dimension, a coordination dimension, a softness dimension, an action completion dimension, and a force skill dimension, and the application is not limited in this regard.
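The multi-dimensional fusion might be sketched as a weighted combination of per-dimension scores; the dimension names follow the six-dimension chart of fig. 10, while the equal default weights and the 0-100 score scale are assumptions:

```python
def evaluate(scores, weights=None):
    """Fuse per-dimension scores (assumed 0-100) into an overall action
    score; equal weighting is an assumption."""
    dims = ["balance", "coordination", "softness",
            "completion", "force_skill", "concentration"]
    if weights is None:
        weights = {d: 1.0 for d in dims}  # equal weights by default
    total_w = sum(weights[d] for d in dims)
    return round(sum(scores[d] * weights[d] for d in dims) / total_w)
```

The per-dimension scores themselves would double as the radii of the evaluation chart, while the fused value plays the role of the "current action score" shown in the evaluation window.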
S1210, the electronic equipment outputs prompt information according to the comprehensive evaluation of the yoga action of the user, wherein the prompt information is used for guiding the yoga action of the user.
The electronic device may evaluate the yoga action of the user according to the preset yoga database and the posture feature, the breathing feature, and the force feature, and generate a yoga guidance suggestion. For example, the guidance suggestion may be: adjust the breathing rhythm, adjust the force exerted by tense body parts, and so on. The electronic device can output prompt information according to the guidance suggestion, so that professional guidance can be provided for the user in real time.
In some embodiments, the prompt may be at least one of a text-form prompt, a vibration-form prompt, or a voice-form prompt, which is not limited by the present application. The user can correct the yoga action of the user according to the prompt information, so that the experience of the user is improved.
Fig. 13 shows a schematic flow chart of another yoga motion detection method 1300 provided by an embodiment of the present application.
As shown in fig. 13, method 1300 may be performed by an electronic device. The method 1300 includes:
S1301, the electronic device acquires brain wave data of the user during the yoga exercise process, the brain wave data being sent by the second wearable device.
Specifically, the second wearable device may be worn on the head of the user to acquire brain wave data of the user during yoga exercise and send the brain wave data to the electronic device.
The brain wave data may include brain wave signals of the user and/or characteristic information of the brain wave signals. The characteristic information of the brain wave signal may include a time domain characteristic and/or a frequency domain characteristic of the brain wave signal.
In some embodiments, the second wearable device may collect brain wave signals of the user during yoga movements through a brain wave sensor.
In one possible example, the second wearable device may send the acquired brain wave signals directly to the electronic device. In another possible example, the second wearable device may process the acquired brain wave signals to obtain characteristic information of the brain wave signals, and may send the characteristic information to the electronic device. The processing method can include signal preprocessing of the acquired brain wave signals to remove noise interference, so that time domain features and frequency domain features of the brain wave signals can be extracted.
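A minimal sketch of this feature extraction is given below. The time-domain features (mean, RMS) and the theta/alpha/beta band limits are conventional EEG choices, not values specified by the source, and a naive DFT stands in for whatever spectral method the device actually uses.

```python
import math

def band_power(signal, fs, lo, hi):
    """Naive DFT band power: sum |X[k]|^2 over bins with frequency in [lo, hi)."""
    n = len(signal)
    power = 0.0
    for k in range(1, n // 2):
        f = k * fs / n
        if lo <= f < hi:
            re = sum(x * math.cos(-2 * math.pi * k * i / n) for i, x in enumerate(signal))
            im = sum(x * math.sin(-2 * math.pi * k * i / n) for i, x in enumerate(signal))
            power += (re * re + im * im) / n
    return power

def eeg_features(signal, fs):
    """Time-domain (mean, RMS) and frequency-domain (band power) features."""
    n = len(signal)
    mean = sum(signal) / n
    detrended = [x - mean for x in signal]  # crude offset removal
    rms = math.sqrt(sum(x * x for x in detrended) / n)
    return {
        "mean": mean,
        "rms": rms,
        "theta": band_power(detrended, fs, 4, 8),
        "alpha": band_power(detrended, fs, 8, 13),
        "beta": band_power(detrended, fs, 13, 30),
    }
```

A real preprocessing chain would also band-pass filter and reject motion artifacts before computing these features.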
S1302, the electronic device acquires biological information of the user during the yoga exercise process sent by the first wearable device, where the biological information may include movement data and myoelectricity data of the user.
It can be appreciated that the process of the electronic device acquiring the biological information of the user during yoga exercise may refer to the description related to S1101 in the method 1100, which is not repeated here to avoid repetition.
S1303, the electronic equipment determines yoga action characteristics of the user according to the biological information and the brain wave data.
In some embodiments, the yoga action features of the user may include one or more of a gesture feature, a force feature, a respiration feature, and a concentration feature.
In one possible example, the electronic device may determine a concentration characteristic of the user from the brain wave data.
Specifically, the electronic device may perform signal preprocessing on the received brain wave signal to remove noise interference, so as to extract time domain features and frequency domain features of the brain wave signal, and calculate concentration features of the user based on the time domain features and the frequency domain features. Alternatively, the electronic device may directly calculate the concentration characteristics of the user according to the characteristic information (such as time domain and frequency domain characteristics) of the received brain wave signals.
In the embodiment of the application, the electronic equipment can detect the concentration characteristic of the user through the brain wave data of the user acquired by the second wearable equipment, thereby being beneficial to monitoring and evaluating the mental concentration condition of the user in the yoga exercise process.
In some embodiments, the concentration feature may include a concentration level. Specifically, the electronic device may classify the concentration of the user by using a machine learning classification model based on the time domain features and frequency domain features of the brain wave signals, in combination with a preset yoga database. For example, the electronic device may classify the concentration of the user into 10 levels according to different degrees of brain wave fluctuation, where levels 1 to 3 may represent low concentration, levels 4 to 7 may represent moderate concentration, and levels 8 to 10 may represent high concentration. It will be appreciated that the above grading of user concentration is merely an example and does not limit the present application.
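The grading step could be sketched as follows. A deployed system would use a trained classifier as the text describes; the beta/theta "attention index" and the fixed level boundaries below are stand-in assumptions, not the patent's model.

```python
def concentration_level(beta_power, theta_power):
    """Map a hypothetical attention index (beta/theta band-power ratio)
    onto concentration levels 1..10.

    A trained classifier would learn these boundaries from labeled data;
    a clamped linear mapping stands in for it here.
    """
    ratio = beta_power / max(theta_power, 1e-9)
    # Clamp the ratio into [0, 2] and scale linearly onto the 10 levels.
    level = 1 + int(min(max(ratio, 0.0), 2.0) / 2.0 * 9)
    return min(level, 10)

def concentration_label(level):
    """Coarse labels matching the 1-3 / 4-7 / 8-10 grouping in the text."""
    if level <= 3:
        return "low"
    if level <= 7:
        return "moderate"
    return "high"
```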
It can be appreciated that, the process of determining the yoga action characteristic of the user by the electronic device according to the biological information can refer to S1102 in the method 1100, and the disclosure is not repeated here for avoiding repetition.
S1304, the electronic equipment outputs prompt information according to a preset yoga database and the yoga action characteristics, wherein the prompt information is used for guiding the yoga action of the user.
Specifically, the electronic device can comprehensively evaluate the yoga action of the user by combining a preset yoga database with the yoga action features of the user. For example, according to the yoga action features, the electronic device can evaluate whether the action posture of the user is standard, whether the breathing rhythm matches the pose, whether the force-exerting muscle parts and magnitude of force are correct, and whether the user is concentrating during yoga. The electronic device may then output prompt information based on the comprehensive evaluation to guide the yoga action of the user.
The electronic device may evaluate the yoga actions of the user according to a preset yoga database and the above posture feature, breathing feature, force feature, and concentration level, and generate a yoga guidance suggestion. For example, the guidance suggestion may be: adjust the breathing rhythm, adjust the force exerted by tense body parts, or adjust attention, and so on. The electronic device can output prompt information according to the guidance suggestion, so that professional guidance can be provided for the user.
The description of the prompt message may refer to S1210 in the method 1200, and the description is omitted herein for avoiding repetition.
According to the embodiment of the application, the electronic device can grade the concentration of the user according to the brain wave data of the user and the preset yoga database, so that the mental concentration of the user during yoga exercise can be evaluated more accurately.
Fig. 14 shows a schematic flow chart of another yoga motion detection method 1400 provided by an embodiment of the present application. As shown in fig. 14, method 1400 may be performed by an electronic device. The method 1400 includes:
S1401, the electronic device acquires biological information and brain wave data of the user during the yoga exercise process, the biological information including movement data and myoelectricity data.
Specifically, the electronic device may receive the biological information acquired by a first wearable device worn on a target body part of the user, and the brain wave data acquired by a second wearable device worn on the head of the user.
It may be appreciated that the process of the electronic device receiving the biological information of the user during yoga exercise may refer to the description related to S1101 in the above method 1100, and the process of the electronic device acquiring the brain wave data of the user during yoga exercise may refer to the description related to S1301 in the above method 1300; the details are not repeated here to avoid repetition.
S1402, the electronic device determines gesture features of the user according to the motion data.
In particular, the motion data may include acceleration data and angular velocity data of the user. The acceleration data may include, among other things, an acceleration signal of the user and/or characteristic information of the acceleration signal. The angular velocity data may comprise an angular velocity signal of the user and/or characteristic information of the angular velocity signal.
It can be appreciated that the process of determining the gesture feature of the user by the electronic device according to the motion data can refer to the description related to S1102 in the method 1100, and the present application is not repeated here for avoiding repetition.
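One common way to turn acceleration and angular velocity data into a posture angle is a complementary filter, which could be sketched as follows. This is a generic illustration, not the patent's actual method; the sample layout, sampling interval, and blending factor are assumptions.

```python
import math

def tilt_angles(samples, dt=0.02, alpha=0.98):
    """Complementary filter: fuse the integrated gyroscope angle (responsive
    but drifting) with the accelerometer tilt (noisy but drift-free) into a
    stable posture angle.

    samples: list of (ax, az, gyro_rate) tuples -- acceleration along two
    body axes in g, and angular rate in degrees/second (illustrative layout).
    """
    angle = 0.0
    angles = []
    for ax, az, gyro_rate in samples:
        accel_angle = math.degrees(math.atan2(ax, az))  # tilt from gravity
        angle = alpha * (angle + gyro_rate * dt) + (1 - alpha) * accel_angle
        angles.append(angle)
    return angles
```

Features such as the posture angle, posture duration, and posture stability described earlier could then be derived from this angle series.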
S1403, the electronic device determines a breathing characteristic of the user from the motion data.
In particular, the electronic device may determine the breathing characteristics of the user from motion data acquired by a first wearable device worn on the wrist or chest.
It will be appreciated that the process of determining the breathing characteristics of the user according to the motion data by the electronic device may refer to the description related to S1102 in the method 1100, and the present application is not repeated here for avoiding repetition.
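Illustratively, a breathing rhythm could be recovered from chest- or wrist-worn motion data as sketched below. The zero-crossing approach and axis choice are generic assumptions, not the patent's stated algorithm.

```python
def breaths_per_minute(accel_axis, fs):
    """Estimate respiratory rate from one axis of a chest- or wrist-worn
    accelerometer: remove the constant gravity offset, then count
    positive-going zero crossings of the slow breathing oscillation
    (one crossing per breath cycle).
    """
    n = len(accel_axis)
    mean = sum(accel_axis) / n
    detrended = [x - mean for x in accel_axis]
    crossings = sum(1 for a, b in zip(detrended, detrended[1:]) if a < 0 <= b)
    minutes = n / fs / 60.0
    return crossings / minutes
```

A real implementation would band-pass filter around typical breathing frequencies (roughly 0.1-0.5 Hz) first so that body motion does not corrupt the count.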
S1404, the electronic device determines a stress characteristic of the user according to the myoelectric data.
It can be appreciated that the process of the electronic device determining the force feature of the user according to the myoelectric data may refer to the description related to S1102 in the method 1100, which is not repeated here to avoid repetition.
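A minimal sketch of deriving a force feature (force-exerting muscle part and magnitude of force) from multi-channel surface EMG is given below. The RMS envelope and the %MVC normalization are standard sEMG conventions; the channel names and calibration values are invented for illustration.

```python
import math

def emg_rms(window):
    """Root-mean-square amplitude of an sEMG window."""
    return math.sqrt(sum(x * x for x in window) / len(window))

def force_feature(channels, mvc_rms):
    """Hypothetical force feature from multi-channel surface EMG.

    channels: muscle name -> signal window (mV).
    mvc_rms: muscle name -> RMS recorded at maximum voluntary contraction
    (a calibration assumption).
    Returns the most active muscle and its activation as a percentage of MVC.
    """
    activation = {m: emg_rms(sig) / mvc_rms[m] for m, sig in channels.items()}
    muscle = max(activation, key=activation.get)
    return muscle, round(100 * activation[muscle], 1)
```

Comparing the returned muscle and %MVC against the preset yoga database would then support the "force-exerting muscle part and magnitude correct?" judgment in S1410.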
S1405, the electronic equipment classifies the concentration degree of the user according to the brain wave data and the preset yoga database.
Specifically, the electronic device may classify the concentration of the user by using a machine learning classification model based on time domain and frequency domain characteristics of brain wave information and in combination with a preset yoga database.
It can be appreciated that the process of grading the concentration of the user by the electronic device may refer to the description related to S1303 in the method 1300, and the present application is not repeated here for avoiding repetition.
S1406, the electronic equipment classifies and identifies yoga actions of the user according to the preset yoga database and the gesture characteristics.
It can be appreciated that the process of classifying and identifying the yoga action of the user by the electronic device may refer to the description related to S1205 in the method 1200, and the disclosure is not repeated here for avoiding repetition.
S1407, the electronic equipment comprehensively evaluates yoga actions of the user.
In some embodiments, the electronic device may comprehensively evaluate the yoga actions of the user from the four aspects of body, breath, force, and mind according to a preset yoga database and the above posture feature, breathing feature, force feature, and concentration level.
Specifically, it may include:
S1408, the electronic device judges whether the action posture of the user is standard according to the preset yoga database and the posture feature.
S1409, the electronic device judges whether the breathing rhythm of the user matches the pose according to the preset yoga database and the breathing feature.
S1410, the electronic device judges whether the force-exerting muscle parts and the magnitude of force are correct according to the preset yoga database and the force feature.
S1411, the electronic device judges whether the user is concentrating during the yoga exercise according to the preset yoga database and the concentration level.
In other embodiments, the electronic device may perform data fusion on the posture feature, the breathing feature, the force feature, and the concentration level in combination with the preset yoga database, so as to perform a multi-dimensional chart comprehensive evaluation of the yoga action of the user. Illustratively, the multi-dimensional chart may be a six-dimensional chart including a balance dimension, a coordination dimension, a flexibility dimension, an action completion dimension, a force skill dimension, and a concentration dimension.
And S1412, the electronic equipment outputs prompt information according to the comprehensive evaluation of the yoga action of the user, wherein the prompt information is used for guiding the yoga action of the user.
It is to be understood that the process of outputting the prompt message by the electronic device may refer to the description related to S1210 in the method 1200, and the description is omitted herein for avoiding repetition.
According to the embodiment of the application, the electronic device can comprehensively evaluate the yoga action of the user according to the yoga action features and the preset yoga database, and generate a yoga guidance suggestion based on the comprehensive evaluation. The electronic device can output prompt information according to the guidance suggestion, so that professional guidance can be provided for the user in real time.
The yoga motion detection method provided by the embodiments of the application has been described above mainly from the perspective of the electronic device. It will be appreciated that the electronic device, in order to achieve the above-described functions, includes corresponding hardware structures and/or software modules that perform the respective functions. Those of skill in the art will readily appreciate that the various illustrative algorithm steps described in connection with the embodiments disclosed herein may be implemented as hardware or combinations of hardware and computer software. Whether a function is implemented as hardware or computer-software-driven hardware depends upon the particular application and design constraints of the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
The embodiment of the application can divide the functional modules of the processor in the electronic device according to the method example, for example, each functional module can be divided corresponding to each function, and two or more functions can be integrated in one processing module. The integrated modules may be implemented in hardware or in software functional modules. It should be noted that, in the embodiment of the present application, the division of the modules is schematic, which is merely a logic function division, and other division manners may be implemented in actual implementation.
In the case of dividing each functional module by function, fig. 15 shows a schematic structural diagram of a yoga motion detection apparatus 1500 provided in an embodiment of the present application. As shown in fig. 15, the apparatus 1500 may include: an acquisition unit 1510 and a processing unit 1520.
The obtaining unit 1510 may be configured to obtain biological information of the user during yoga sent by the first wearable device.
The processing unit 1520 may be configured to determine yoga motion characteristics of the user based on the biometric information.
The processing unit 1520 may be further configured to classify and identify a yoga action of the user according to the gesture feature and a preset yoga database.
The obtaining unit 1510 may be further configured to obtain brain wave data of the user during yoga sent by the second wearable device.
The processing unit 1520 may also be configured to determine a concentration profile of the user from the brain wave data.
The processing unit 1520 may be specifically configured to rank the concentration of the user according to the brain wave data and a preset yoga database.
The processing unit 1520 may be further configured to output prompt information according to a preset yoga database and yoga action characteristics.
It should be noted that, all relevant contents of each step related to the above method embodiment may be cited to the functional description of the corresponding functional module, which is not described herein.
Fig. 16 shows a schematic structural diagram of an electronic device 1600 provided by an embodiment of the present application. As shown in fig. 16, the electronic device includes: one or more processors 1610, one or more memories 1620, the one or more memories 1620 storing one or more computer programs comprising instructions. The instructions, when executed by the one or more processors 1610, cause the electronic device 1600 to perform the electronic device-side aspects of the embodiments described above.
An embodiment of the present application provides a computer program product, which, when run on an electronic device, causes the electronic device to execute the technical solution in the above embodiments. The implementation principle and technical effects are similar to those of the related method embodiments and are not repeated here.
An embodiment of the present application provides a readable storage medium containing instructions which, when executed by an electronic device, cause the electronic device to execute the technical solution of the foregoing embodiments. The implementation principle and technical effects are similar and are not repeated here.
The embodiment of the application provides a chip for executing instructions, and when the chip runs, the technical scheme in the embodiment is executed. The implementation principle and technical effect are similar, and are not repeated here.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
It will be clear to those skilled in the art that, for convenience and brevity of description, specific working procedures of the above-described systems, apparatuses and units may refer to corresponding procedures in the foregoing method embodiments, and are not repeated herein.
In the several embodiments provided by the present application, it should be understood that the disclosed systems, devices, and methods may be implemented in other manners. For example, the apparatus embodiments described above are merely illustrative, e.g., the division of the units is merely a logical function division, and there may be additional divisions when actually implemented, e.g., multiple units or components may be combined or integrated into another system, or some features may be omitted or not performed. Alternatively, the coupling or direct coupling or communication connection shown or discussed with each other may be an indirect coupling or communication connection via some interfaces, devices or units, which may be in electrical, mechanical or other form.
The units described as separate units may or may not be physically separate, and units shown as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, each functional unit in the embodiments of the present application may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit.
The functions, if implemented in the form of software functional units and sold or used as a stand-alone product, may be stored in a computer-readable storage medium. Based on this understanding, the technical solution of the present application may be embodied essentially or in a part contributing to the prior art or in a part of the technical solution, in the form of a software product stored in a storage medium, comprising several instructions for causing a computer device (which may be a personal computer, a server, a network device, etc.) to perform all or part of the steps of the method according to the embodiments of the present application. And the aforementioned storage medium includes: a U-disk, a removable hard disk, a Read-Only Memory (ROM), a random access Memory (Random Access Memory, RAM), a magnetic disk, or an optical disk, or other various media capable of storing program codes.
The foregoing is merely illustrative of the present application, and the present application is not limited thereto, and any person skilled in the art will readily recognize that variations or substitutions are within the scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.
Claims (24)
1. A yoga action detection method, applied to an electronic device, characterized by comprising:
the electronic equipment acquires biological information of a user in a yoga movement process, wherein the biological information comprises movement data and myoelectricity data, and the biological information is sent by a first wearable device;
and the electronic equipment determines yoga action characteristics of the user according to the biological information.
2. The method of claim 1, wherein the yoga motion feature comprises one or more of a gesture feature, a force feature, and a respiration feature.
3. The method of claim 2, wherein the posture feature and the respiration feature are determined from the motion data and the force-producing feature is determined from the myoelectric data.
4. A method according to claim 2 or 3, wherein when the yoga action feature comprises the gesture feature, the method further comprises:
and the electronic equipment classifies and identifies the yoga actions of the user according to the gesture characteristics and a preset yoga database.
5. The method of claim 4, wherein the gesture feature comprises one or more of a gesture angle, a gesture duration, and a gesture stability of a target body part rotation of the user,
Wherein the target body part is a body part wearing the first wearable device.
6. The method of claim 4 or 5, wherein the force generating features include a location of a force generating muscle and a magnitude of force generation.
7. The method of any one of claims 4 to 6, wherein the yoga action feature further comprises a concentration feature of the user, the method further comprising:
the electronic equipment acquires brain wave data of the user in the yoga movement process, which is sent by the second wearable equipment;
and the electronic equipment determines the concentration degree characteristic of the user according to the brain wave data.
8. The method of claim 7, wherein the electronic device determining the concentration profile of the user from the brain wave data comprises:
and the electronic equipment classifies the concentration degree of the user according to the brain wave data and the preset yoga database.
9. The method according to any one of claims 4 to 8, further comprising:
the electronic equipment outputs prompt information according to the preset yoga database and the yoga action characteristics, and the prompt information is used for guiding the yoga action of the user.
10. A device for yoga motion detection, the device comprising:
the device comprises an acquisition unit, a control unit and a control unit, wherein the acquisition unit is used for acquiring biological information of a user in a yoga movement process, which is sent by a first wearable device, and the biological information comprises movement data and myoelectricity data;
and the processing unit is used for determining yoga action characteristics of the user according to the biological information.
11. The apparatus of claim 10, wherein the yoga motion feature comprises one or more of a gesture feature, a force feature, and a respiration feature.
12. The apparatus of claim 11, wherein the posture feature and the respiration feature are determined from the motion data and the force-producing feature is determined from the myoelectric data.
13. The apparatus of claim 11 or 12, wherein when the yoga motion feature comprises the gesture feature, the processing unit is further to:
and classifying and identifying yoga actions of the user according to the gesture characteristics and a preset yoga database.
14. The apparatus of claim 13, wherein the gesture feature comprises one or more of a gesture angle, a gesture duration, and a gesture stability of a target body part rotation of the user,
Wherein the target body part is a body part wearing the first wearable device.
15. The device of claim 13 or 14, wherein the force generating features include a location of a force generating muscle and a magnitude of force generation.
16. The apparatus of any one of claims 13 to 15, wherein the yoga action feature further comprises a concentration feature of the user,
the acquisition unit is also used for acquiring brain wave data of the user in the yoga process sent by the second wearable equipment,
the processing unit is further used for determining concentration characteristics of the user according to the brain wave data.
17. The apparatus according to claim 16, wherein the processing unit is specifically configured to:
and grading the concentration degree of the user according to the brain wave data and the preset yoga database.
18. The apparatus according to any one of claims 13 to 17, wherein the processing unit is further configured to:
and outputting prompt information according to the preset yoga database and the yoga action characteristics, wherein the prompt information is used for guiding the yoga action of the user.
19. An electronic device, comprising:
one or more processors;
one or more memories;
the one or more memories store one or more computer programs comprising instructions that, when executed by the one or more processors, cause the method of any of claims 1-9 to be performed.
20. A system for yoga action detection, comprising the electronic device of claim 19 and a first wearable device, wherein the first wearable device is to:
acquiring biological information of a user in a yoga exercise process, wherein the biological information comprises exercise data and myoelectricity data;
and sending the biological information to the electronic equipment.
21. The system of claim 20, further comprising a second wearable device for:
acquiring brain wave data of the user in a yoga exercise process;
and sending the brain wave data to the electronic equipment.
22. A computer readable storage medium comprising computer instructions which, when run on an electronic device, cause the electronic device to perform the method of any one of claims 1 to 9.
23. A computer program product, characterized in that the computer program product, when run on an electronic device, causes the electronic device to perform the method of any one of claims 1 to 9.
24. A chip, comprising a processor and a memory, wherein the memory stores instructions and the processor is configured to execute the instructions stored in the memory; when the instructions are executed, the processor performs the method of any one of claims 1 to 9.
Priority Applications (1)

| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CN202210461386.0A | 2022-04-28 | 2022-04-28 | Yoga action detection method, device and system |
Publications (1)

| Publication Number | Publication Date |
|---|---|
| CN117034095A | 2023-11-10 |
Family
ID=88643526
Cited By (1)

| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN118277768A | 2024-05-24 | 2024-07-02 | 小舟科技有限公司 | Brain electrical signal separation and encoding method and device and computer equipment |
Similar Documents

| Publication | Title |
|---|---|
| WO2021036568A1 | Fitness-assisted method and electronic apparatus |
| CN109710080A | A kind of screen control and sound control method and electronic equipment |
| CN112446832A | Image processing method and electronic equipment |
| CN113655935B | User determination method, electronic device and computer readable storage medium |
| CN112783330A | Electronic equipment operation method and device and electronic equipment |
| CN111566693A | Wrinkle detection method and electronic equipment |
| CN114242037A | Virtual character generation method and device |
| EP4310724A1 | Method for determining exercise guidance information, electronic device, and exercise guidance system |
| WO2022068650A1 | Auscultation position indication method and device |
| CN117034095A | Yoga action detection method, device and system |
| WO2021036562A1 | Prompting method for fitness training, and electronic device |
| CN112308880B | Target user locking method and electronic equipment |
| CN111557007B | Method for detecting opening and closing states of eyes and electronic equipment |
| WO2022214004A1 | Target user determination method, electronic device and computer-readable storage medium |
| US20230402150A1 | Adaptive Action Evaluation Method, Electronic Device, and Storage Medium |
| WO2021254091A1 | Method for determining number of motions and terminal |
| WO2021233018A1 | Method and apparatus for measuring muscle fatigue degree after exercise, and electronic device |
| WO2021254092A1 | Rowing stroke frequency recommendation method, apparatus and device |
| CN115445170B | Exercise reminding method and related equipment |
| CN113380374B | Auxiliary motion method based on motion state perception, electronic equipment and storage medium |
| WO2021238338A1 | Speech synthesis method and device |
| WO2024222544A1 | Vibration feedback method, related apparatus and communication system |
| CN117982126A | Prompting method, prompting device, electronic equipment and storage medium |
| CN115203524A | Fitness recommendation method and electronic equipment |
| CN118366211A | Gesture recognition method and device and electronic equipment |
Legal Events

| Code | Title |
|---|---|
| PB01 | Publication |
| SE01 | Entry into force of request for substantive examination |