CN113380374A - Exercise assistance method based on motion state perception, electronic device, and storage medium

Exercise assistance method based on motion state perception, electronic device, and storage medium

Info

Publication number: CN113380374A
Application number: CN202110503121.8A
Authority: CN (China)
Prior art keywords: user, motion, exercise, auxiliary, motion state
Legal status: Granted; currently Active
Other languages: Chinese (zh)
Other versions: CN113380374B (en)
Inventor: 刘文浩 (Liu Wenhao)
Current Assignee: Shanghai Glory Smart Technology Development Co., Ltd.
Original Assignee: Honor Device Co., Ltd.
Filing: application filed by Honor Device Co., Ltd.; priority to CN202110503121.8A
Publications: CN113380374A (application), CN113380374B (grant)

Classifications

    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04M - TELEPHONIC COMMUNICATION
    • H04M 1/00 - Substation equipment, e.g. for use by subscribers
    • H04M 1/72 - Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M 1/724 - User interfaces specially adapted for cordless or mobile telephones
    • H04M 1/72448 - User interfaces with means for adapting the functionality of the device according to specific conditions
    • H04M 1/72454 - User interfaces adapting functionality according to context-related or environment-related conditions
    • G - PHYSICS
    • G16 - INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H - HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H 20/00 - ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance
    • G16H 20/30 - ICT for therapies or health-improving plans relating to physical therapies or activities, e.g. physiotherapy, acupressure or exercising
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04L - TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L 67/00 - Network arrangements or protocols for supporting network services or applications
    • H04L 67/50 - Network services
    • H04L 67/55 - Push-based network services

Landscapes

  • Engineering & Computer Science (AREA)
  • Signal Processing (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Epidemiology (AREA)
  • Physical Education & Sports Medicine (AREA)
  • Medical Informatics (AREA)
  • Primary Health Care (AREA)
  • Public Health (AREA)
  • Environmental & Geological Engineering (AREA)
  • Human Computer Interaction (AREA)
  • Biophysics (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Telephone Function (AREA)

Abstract

The present application provides an exercise assistance method based on motion state perception, an electronic device, and a storage medium, and relates to the field of terminal technologies. The method is applied to the electronic device and includes: acquiring the motion state of a user; determining whether the motion state meets a preset exercise assistance condition; if the motion state meets the preset exercise assistance condition, determining, according to interaction with the user, whether to generate an instruction to start exercise assistance; if it is determined that the instruction is generated, acquiring, in response to the instruction, exercise data corresponding to the user; and generating a corresponding assistance prompt according to the exercise data. Embodiments of the present application can improve the efficiency of exercise assistance.

Description

Exercise assistance method based on motion state perception, electronic device, and storage medium
Technical Field
The present application relates to the field of terminal technologies, and in particular to an exercise assistance method based on motion state perception, an electronic device, and a storage medium.
Background
As living standards improve, people pay increasing attention to exercise and health, and many sports applications have appeared on the market that can run on an electronic device to monitor a user's motion state during exercise. However, although such an application can obtain exercise data, the user must manually select the exercise type and start the monitoring function. The operation is cumbersome and inconvenient, which makes exercise assistance inefficient.
Disclosure of Invention
In view of the foregoing, it is desirable to provide an exercise assistance method based on motion state perception, an electronic device, and a storage medium that improve the efficiency with which the electronic device interacts with and assists an exercising user.
In a first aspect, the present application provides an exercise assistance method based on motion state perception, applied to an electronic device. The method includes: acquiring the motion state of a user; determining whether the motion state meets a preset exercise assistance condition; if the motion state meets the preset exercise assistance condition, determining, according to interaction with the user, whether to generate an instruction to start exercise assistance; if it is determined that the instruction is generated, acquiring, in response to the instruction, exercise data corresponding to the user; and generating a corresponding assistance prompt according to the exercise data. Acquiring the motion state of the user and determining whether it meets the preset exercise assistance condition includes: the electronic device enables a motion state fence to sense the user's motion state; after the user is determined to have entered a certain motion state, the electronic device enables a time fence and starts timing; and if the user is still sensed to be in that motion state after a preset duration, the electronic device confirms that the user has entered the motion state and determines that the motion state meets the preset exercise assistance condition.
With this solution, the electronic device uses a double judgment to accurately sense whether the user has entered a motion state that meets the preset exercise assistance condition, and generates an exercise assistance prompt only when the user has, thereby improving the efficiency of exercise assistance.
In one possible implementation, the method further includes: counting the number of exercise sessions whose motion state meets the preset exercise assistance condition; if the number is greater than or equal to a preset threshold, determining whether a target exercise application is installed; if the target exercise application is not installed, generating installation prompt information for it; if it is installed, determining whether its exercise assistance function is enabled; and if the function is not enabled, generating recommendation information for it. With this solution, the electronic device generates recommendations that match the user's actual situation, which improves recommendation accuracy and, in turn, the efficiency of exercise assistance.
In one possible implementation, determining, according to interaction with the user, whether to generate the instruction to start exercise assistance when the motion state meets the preset exercise assistance condition includes: determining whether the user's headset is connected to the electronic device; and if it is, controlling the headset to play a query audio signal asking whether the user wants to turn on the exercise assistance function, and determining, according to the received reply to the query, whether to generate the instruction. With this solution, the voice assistant is started only after the headset connection is confirmed, which avoids startling the user with sound played abruptly from the phone, makes exercise assistance more intelligent, and improves the user experience.
In one possible implementation, controlling the headset to play the query audio signal when the user's headset is connected to the electronic device includes: determining whether the user is wearing the headset; and playing the query audio signal only if the headset is being worn. This avoids playing the query when the headset is not being worn and improves the efficiency of starting exercise assistance.
In one possible implementation, determining, according to interaction with the user, whether to generate the instruction to start exercise assistance when the motion state meets the preset exercise assistance condition includes: acquiring the user's current position data; determining whether the user's headset is connected to the electronic device; determining, according to the current position data, whether a preset loudspeaker play-out condition is met; and if the play-out condition is met, playing the query audio signal through the loudspeaker to ask whether the user wants to turn on the exercise assistance function, and determining, according to the received reply to the query, whether to generate the instruction. With this solution, when the play-out condition is met the query is played directly, which improves the efficiency of starting exercise assistance.
In one possible implementation, the method further includes: if the play-out condition is not met, determining whether the user's headset is connected to the electronic device; if it is, determining whether the user is wearing the headset; and if the headset is being worn, controlling it to play the query audio signal asking whether the user wants to turn on the exercise assistance function, and determining, according to the received reply, whether to generate the instruction. With this solution, when the play-out condition is not met, the headset connection is checked first, which avoids startling the user with sound played abruptly from the phone and makes exercise assistance more intelligent. Checking that the headset is actually being worn also avoids playing the query into an unworn headset, improving the efficiency of starting exercise assistance.
In one possible implementation, the reply signal includes an audio reply signal and/or a shake reply signal; the audio reply signal is generated from the user's voice data, and the shake reply signal is generated from shake data of a target device. Providing these two interaction modes makes exercise assistance more intelligent.
In one possible implementation, determining, according to interaction with the user, whether to generate the instruction to start exercise assistance includes: generating prompt information according to a preset prompt rule; sending the prompt information to a target display screen for display; and if a trigger operation by the user on the prompt information is detected, determining to generate the instruction. Displaying the prompt on the target display screen and generating the instruction from the user's operation on it makes interaction more convenient and exercise assistance more intelligent.
In a second aspect, an embodiment of the present application provides an electronic device that includes a memory and a processor; the memory is configured to store program instructions, and the processor is configured to read the program instructions stored in the memory to implement the exercise assistance method based on motion state perception described above.
In a third aspect, the present application provides a computer storage medium, in which computer readable instructions are stored, and when executed by a processor, the computer readable instructions implement the method for assisting exercise based on motion state perception as described above.
The technical effects of the second and third aspects are as described for the corresponding methods above and are not repeated here.
Drawings
Fig. 1 is a flowchart of a method for assisting exercise based on motion state perception according to an embodiment of the present application.
Fig. 2 is a flowchart for sensing that a user enters a certain motion state according to an embodiment of the present disclosure.
Fig. 3 is a flowchart of synchronized motion bracelet information provided in an embodiment of the present application.
Fig. 4 is a flowchart illustrating enabling an auxiliary exercise function according to an embodiment of the present application.
Fig. 5 is a flowchart for sensing exit from the exercise state according to an embodiment of the present application.
Fig. 6 is a flowchart of user recommendation provided in an embodiment of the present application.
Fig. 7 is a flowchart of another exercise assisting method based on motion state sensing according to an embodiment of the present disclosure.
Fig. 8 is a flowchart of an instruction for generating an opening assist movement according to an embodiment of the present application.
Fig. 9 is a flowchart of another instruction for generating an opening assist movement according to an embodiment of the present application.
Fig. 10 is a scene schematic diagram of a prompt message according to an embodiment of the present application.
Fig. 11 is a flowchart of a sports application recommendation method according to an embodiment of the present application.
Fig. 12 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
Detailed Description
In the following, the terms "first", "second" are used for descriptive purposes only and are not to be understood as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defined as "first" or "second" may explicitly or implicitly include one or more of that feature. In the description of the embodiments of the present application, words such as "exemplary" or "for example" are used to indicate examples, illustrations or illustrations. Any embodiment or design described herein as "exemplary" or "e.g.," is not necessarily to be construed as preferred or advantageous over other embodiments or designs. Rather, use of the word "exemplary" or "such as" is intended to present concepts related in a concrete fashion.
Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this application belongs. The terminology used in the description is for describing particular embodiments only and is not intended to limit the application. It should be understood that in this application, "/" means "or" unless otherwise indicated; for example, A/B may represent A or B. "And/or" describes an association between associated objects and indicates that three relations may exist; for example, A and/or B may represent: A alone, both A and B, or B alone. "At least one" means one or more, and "plurality" means two or more. For example, at least one of a, b, or c may represent: a; b; c; a and b; a and c; b and c; or a, b, and c.
The operation of starting the exercise assistance function of an existing exercise application is as follows. Before exercising, the user must start the exercise application, after which the user interface of the electronic device displays the application's exercise assistance interface, which lists various motion states such as running, walking, and riding. The user selects one of the displayed motion states as the target motion state according to the upcoming exercise, for example selecting "walking". After "walking" is determined as the target motion state, the user interface displays several exercise scenes corresponding to it, such as walking, hiking, and mountain climbing. When the user taps the "walking" exercise scene, the exercise assistance interface corresponding to that scene is entered.
However, to start the exercise assistance function the user must go through several steps: starting the exercise application, selecting the upcoming exercise as the target motion state, selecting the exercise scene corresponding to it, and tapping the button that starts that scene. The process is long, inconvenient, and interaction-heavy, which makes exercise assistance inefficient.
To address the problem that starting the exercise assistance function of a fitness application is too cumbersome and therefore inefficient, an embodiment of the present application provides an exercise assistance method based on motion state perception. The method can be applied to electronic devices. The electronic device may be a mobile phone, a tablet computer, a desktop computer, a laptop computer, a handheld computer, a notebook computer, an ultra-mobile personal computer (UMPC), a netbook, a cellular phone, a personal digital assistant (PDA), an augmented reality (AR) device, a virtual reality (VR) device, an artificial intelligence (AI) device, a wearable device, a vehicle-mounted device, a smart home device, and/or a smart city device; the specific type of the electronic device is not limited in the embodiments of the present application.
Some embodiments of the present application propose an exercise assistance method based on motion state perception. When the motion state fence senses that the user has started to exercise (running, fast walking, riding, etc.), the time fence starts timing. When the exercise has lasted long enough and the user is wearing a headset, the exercise assistance function is activated to help the user adjust the exercise tempo. After the user is detected to have exited the exercise state, an exercise report is generated automatically together with optimization suggestions. For users who exercise frequently but do not use the function, a recommendation is pushed. The motion state fence detects the user's motion state; the time fence keeps time and can be used to measure how long the user has exercised continuously.
This exercise assistance method can be embodied in an exercise assistance application based on motion state perception. Its main functions include an exercise report function, an exercise assistance function, and a recommendation function, which are controlled automatically through motion state perception, Internet of Things, and human-computer interaction technologies. For example, as shown in fig. 1, once the multi-stage fence technology (a motion state fence plus a time fence) determines that the user has started a certain exercise (running, fast walking, riding, etc.), the phone's GPS, acceleration, and other sensors are enabled to collect the user's step count, geographic position, speed, and similar information in real time. If the user wears a sports bracelet, the information it collects, such as step count and heart rate, is synchronized to the phone in real time. If the user is detected to be wearing a headset, the voice assistant is started to ask whether to turn on the exercise assistance function. If the user confirms, the function is turned on to help the user adjust exercise speed and rhythm. When the user is sensed to have exited the exercise state, the function is turned off, an exercise report and an exercise plan are generated, and optimization suggestions are provided. In addition, the application or function is recommended to users who exercise frequently but have not installed or used it.
Sensing that the user has entered a certain motion state can use the multi-stage fence technology: the outer fence is the motion state fence and the inner fence is the time fence, which together accurately sense entry into the motion state. As shown in fig. 2, the parameter T0 should take a small value, for example 10 seconds; it improves the accuracy of the motion state judgment and can be set according to the actual situation.
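Purely as an illustration (not a defined implementation of this application), the following Python sketch shows one way the double judgment could be organized; SUPPORTED_STATES, T0, and the read_state callable are assumed names introduced here.

```python
import time

SUPPORTED_STATES = {"running", "fast_walking", "riding"}  # example states only
T0 = 10  # seconds; a small preset value for the inner time fence

def sense_motion_state_entry(read_state):
    """Outer motion-state fence detects a candidate state; the inner time fence
    confirms that the state persists before assistance is considered."""
    state = read_state()                 # candidate state from the sensor classifier
    if state not in SUPPORTED_STATES:
        return None                      # outer fence not triggered
    time.sleep(T0)                       # time fence: wait the preset duration
    if read_state() == state:            # still in the same state: double judgment passed
        return state
    return None                          # transient movement; avoid a false trigger
```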
Synchronizing the sports bracelet information may specifically include: synchronizing the step count, heart rate, and other information collected by the bracelet to the phone application in real time through the Internet of Things (for example over Bluetooth), which further improves the effect and precision of exercise assistance. The synchronization process may be as shown in fig. 3.
Illustratively, whether to turn on the exercise assistance function can be decided through human-computer interaction, which improves the user experience and avoids false triggering. After assistance is started, the user's exercise speed and rhythm can be prompted by voice, improving exercise efficiency and keeping the user in an aerobic state as far as possible. For example, during the warm-up phase and the final stretch of the exercise, the user may be prompted to reduce speed and rhythm. As shown in fig. 4, it is first determined whether the exercise assistance application is installed on the user's phone; if so, it is determined whether the phone is connected to a headset and the user is wearing it; if both are true, the voice assistant is started and asks whether to turn on the exercise assistance function. If the user confirms, voice prompts help the user maintain speed and rhythm according to information such as the user's speed, heart rate, and exercise plan. Confirming the headset connection and that the headset is being worn before starting the voice assistant avoids startling the user with sound played abruptly from the phone, makes exercise assistance more intelligent, and improves the user experience.
Sensing exit from the exercise state may specifically include: confirming, through the multi-stage fences (the outer motion state fence and the inner time fence), that the user has exited the motion state, and stopping the exercise assistance function once exit is confirmed. The process may be as shown in fig. 5, where the parameter T1 should take a small value, for example 10 seconds; it improves the accuracy of the judgment and can be set according to the actual situation.
The exercise report generated by the exercise assistance function may include exercise duration, distance in kilometers, calories burned, average speed, aerobic exercise time, travel track, and other information. An exercise plan can also be generated automatically from the user's exercise history (manual correction is supported); it includes exercise time, distance, calories, speed in different periods, and similar information, and is optimized periodically.
Accurate recommendations can also be made according to the user's exercise habits and use of the exercise assistance function, such as recommending installation of an application or use of a function. For example, as shown in fig. 6, the number of exercise sessions is counted; when it reaches a preset number N, it is checked whether the exercise assistance software is installed. If not, the software is recommended to the user. If it is installed but the assistance function has not been enabled, an introduction to the function is pushed to the user, prompting the user to enable it.
For example, when a user runs, the electronic device acquires that the motion state of the user is running, and then determines that the motion state (running) of the user meets a preset condition for starting exercise assistance. And then, the electronic equipment generates interaction behaviors with the user according to a preset interaction rule, such as voice interaction, touch interaction, gesture interaction and the like. The electronic equipment judges whether to start the auxiliary motion mode according to the interaction behavior with the user. When the auxiliary exercise mode is determined to be started, acquiring exercise data corresponding to the user, and assisting the user to exercise by combining the exercise data, such as adjusting the exercise speed and rhythm of the user.
Referring to fig. 7, a flowchart of a method for assisting exercise based on motion state perception according to an embodiment of the present application is shown. The method can be applied to different types of electronic equipment and specifically comprises the following steps.
In step S31, the motion state of the user is acquired.
In this embodiment, taking a mobile phone as an example, the motion state of the user may be determined based on data collected by a sensor of the mobile phone. The sensors may include acceleration sensors and/or gyroscope sensors. The motion states may include running, fast walking, riding, climbing stairs, skipping ropes, and/or climbing.
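The embodiment does not disclose how the sensor readings are mapped to a state, so the thresholds and the cadence input in the sketch below are assumptions, included only to illustrate a coarse sensor-based classification.

```python
from statistics import pstdev

def classify_motion_state(accel_magnitudes, steps_per_minute):
    """Guess a coarse motion state from accelerometer magnitudes (m/s^2) and cadence.
    Thresholds are illustrative, not values given by the embodiment."""
    if not accel_magnitudes or pstdev(accel_magnitudes) < 0.5:
        return "idle"                    # little variation: the user is not exercising
    if steps_per_minute >= 140:
        return "running"
    if steps_per_minute >= 100:
        return "fast_walking"
    return "walking"
```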
And step S32, determining whether the motion state meets a preset motion assistance condition.
If the motion state does not meet the preset motion assistance condition, go to step S31, and if the motion state meets the preset motion assistance condition, go to step S33.
In the embodiments provided in the present application, the exercise assistance condition may be preset according to the user's requirements and may include, but is not limited to: the motion state being one of several preset motion states; the motion state lasting for a preset duration; or both.
For example, before determining whether the exercise state meets the preset exercise assisting condition, the electronic device may first determine whether the exercise state is an exercise state supported by an exercise assisting function, for example, determine whether an option meeting the exercise state is included in an exercise assisting function provided by a fitness application installed in the electronic device. For another example, whether an option corresponding to the motion state is included in the auxiliary motion function carried in the operating system of the electronic device is judged.
And if the motion state is not the motion state supported by the auxiliary motion function, determining that the motion state does not meet a preset motion auxiliary condition. And if the motion state is supported by the auxiliary motion function, judging whether the user maintains the motion state according to a preset time interval. And if the user maintains the motion state, determining that the motion state meets a preset motion auxiliary condition. And if the user does not maintain the motion state, determining that the motion state does not meet the preset motion auxiliary condition. The preset time interval may be set according to actual requirements, for example, set according to characteristics of a motion state, which is not limited herein.
A timed task may be generated according to the preset time interval, and when the task expires it is determined whether the user has maintained the exercise state. For example, with a preset interval of 5 seconds, a 5-second timed task is set, and when it expires, whether the user has maintained the motion state is checked.
In different embodiments, the time intervals corresponding to different motion states may be preset according to the difference of the motion states. Different motion states can have different characteristics (for example, running corresponds to a longer duration, and skipping ropes correspond to a shorter duration), and corresponding different time intervals can be set for different characteristics, for example, running corresponds to a time interval longer than skipping ropes. By setting different time intervals for different motion states, the accuracy of sensing that the user enters a certain supported motion state can be further improved.
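The interval values below are only examples chosen to mirror the running-versus-rope-skipping remark above; the timed-task pattern follows the description of the preceding paragraphs.

```python
import threading

CONFIRM_INTERVAL_S = {      # assumed per-state confirmation intervals, in seconds
    "running": 30,          # longer-lasting activity, longer interval
    "fast_walking": 20,
    "rope_skipping": 5,     # short bursts, shorter interval
}

def schedule_confirmation(state, still_in_state, on_confirmed):
    """Start a timed task; when it expires, confirm the state only if it was maintained."""
    interval = CONFIRM_INTERVAL_S.get(state, 10)
    def check():
        if still_in_state(state):
            on_confirmed(state)          # condition met: assistance may be offered
    timer = threading.Timer(interval, check)
    timer.start()
    return timer
```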
By setting the motion state determination and the motion state duration determination, the time when the user enters a certain motion state can be accurately sensed, the efficiency of sensing the motion state is improved, and the motion assistance function is prevented from being triggered by mistake, so that the motion assistance efficiency is improved.
And step S33, if the motion state meets the preset motion assisting condition, judging whether to generate an instruction for starting assisting motion according to the interaction behavior with the user.
In the embodiment provided by the application, if the motion state meets a preset motion assistance condition, an interaction behavior with the user is generated according to a preset interaction rule, and whether an instruction for starting assistance motion is generated is judged according to the interaction behavior. The type of the interactive behavior can comprise voice interaction, touch interaction and the like, the type of the interactive behavior can be determined according to the current environment condition and a preset interaction rule, and the interactive behavior is generated according to the type of the interactive behavior. For example, when the voice interaction type is not suitable, the type of the interaction behavior is determined to be the touch interaction, and the interaction behavior corresponding to the touch interaction is generated.
The interaction may occur between both the electronic device and the user, or may occur between the electronic device, other electronic devices with which the electronic device establishes communication, and the user.
For a detailed description of a scenario to which step S33 applies, reference may be made to the detailed description of fig. 8 to 10 below.
Step S34, if it is determined that the instruction to start the auxiliary exercise is generated, acquiring exercise data corresponding to the user in response to the instruction to start the auxiliary exercise.
And after an auxiliary movement starting instruction is generated, responding to the auxiliary movement starting instruction, and acquiring movement data corresponding to the user at regular time according to a preset time interval.
In a plurality of embodiments of the present application, the motion data corresponding to the user may be acquired based on a sensor of an electronic device, or the motion data corresponding to the user may be acquired based on a wearable device that establishes communication with the electronic device, or the motion data corresponding to the user may be acquired based on the sensor of the electronic device and the wearable device that establishes communication with the electronic device. Wherein the sensors of the electronic device include a Global Positioning System (GPS) sensor and an acceleration sensor, and the motion data may include: the movement speed, the movement time length, the movement distance, the movement heat and the like.
For example, it may be determined whether the user wears a wearable device, and when the user wears the wearable device, the electronic device establishes communication with the wearable device to acquire user data acquired by the wearable device; and generating the user data and the user motion data acquired by the sensor in the electronic equipment into the motion data corresponding to the user. The wearable device can acquire user data including motion data and physiological data. The motion data corresponding to the user is generated through the data collected by the sensor in the electronic equipment and the user data collected by the wearable equipment worn by the user, the motion data can be enriched and improved, the effectiveness of the motion data is improved, and the accuracy and the efficiency of motion assistance are improved.
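A minimal sketch of enriching phone-derived data with bracelet data; the MotionData fields and the bracelet dictionary keys are assumptions for illustration, not a data format defined by the embodiment.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class MotionData:
    speed_kmh: float = 0.0
    distance_km: float = 0.0
    duration_s: int = 0
    calories_kcal: float = 0.0
    steps: Optional[int] = None          # filled in from the wearable, if worn
    heart_rate_bpm: Optional[int] = None

def merge_motion_data(phone: MotionData, bracelet: Optional[dict]) -> MotionData:
    """Enrich phone sensor data (GPS, acceleration) with wearable data when available."""
    if bracelet:
        phone.steps = bracelet.get("steps", phone.steps)
        phone.heart_rate_bpm = bracelet.get("heart_rate", phone.heart_rate_bpm)
    return phone
```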
And step S35, generating a corresponding auxiliary prompt according to the motion data.
The assistance prompt can be used to prompt the user to adjust exercise speed and rhythm, improving the user's exercise efficiency. The assistance prompt may include a voice prompt.
For example, the generating a corresponding auxiliary prompt according to the motion data may specifically include: determining a target motion plan corresponding to the motion state; and generating a corresponding auxiliary prompt according to the motion data and the target motion plan.
Exercise plans corresponding to different motion states can be preset; the plan corresponding to the current motion state is the target exercise plan. The plan is used to assist the user and may include an amount of exercise and an exercise rhythm. For example, running may correspond to a 3 km running plan, riding to a 5 km riding plan, and rope skipping to a 500-skip plan. Each plan may include a rhythm; for example, in the 3 km running plan, the first 0.4 km is run at 5 km/h, 0.4 km to 2.6 km at 7 km/h, and the last 0.4 km at 5 km/h.
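The structure below simply encodes the 3 km running example as data; the 1 km/h tolerance and the prompt wording are assumptions added for illustration.

```python
RUNNING_PLAN_3KM = {
    "total_km": 3.0,
    "segments": [
        {"from_km": 0.0, "to_km": 0.4, "target_kmh": 5},   # warm-up
        {"from_km": 0.4, "to_km": 2.6, "target_kmh": 7},   # main aerobic segment
        {"from_km": 2.6, "to_km": 3.0, "target_kmh": 5},   # cool-down
    ],
}

def pace_hint(plan, distance_km, current_kmh, tolerance_kmh=1.0):
    """Return an assistance prompt comparing the current speed with the planned pace."""
    for seg in plan["segments"]:
        if seg["from_km"] <= distance_km < seg["to_km"]:
            if current_kmh > seg["target_kmh"] + tolerance_kmh:
                return "Slow down a little to stay in your aerobic range."
            if current_kmh < seg["target_kmh"] - tolerance_kmh:
                return "Pick up the pace slightly to match your plan."
            return "Good pace, keep this rhythm."
    return None                          # outside the planned distance
```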
Specifically, multiple exercise plans corresponding to each exercise state may be preset according to different user conditions, where different user conditions correspond to different exercise plans. And determining a target motion plan in the multiple motion plans corresponding to the motion state according to the user condition of the user, namely the user condition corresponding to the user using the motion assisting function at present. Wherein the user condition may include physical fitness and athletic goals, which may include fat reduction, muscle enhancement, shaping, and the like.
For example, if the user has poor physical quality, the running plan corresponding to running is 2 KM; the physical quality of the user is better, and the running plan corresponding to running is 5 KM. The exercise plan can be set by self according to the requirements of the user, and is not limited at all.
The user's physical fitness can be determined in several ways: several fitness categories can be preset and chosen by user input or selection; the appropriate category can be selected automatically by measuring the user's physiological parameters; or it can be determined from the user's historical exercise data, which may include how often the user exercises, how long each session lasts, the exercise speed, and/or the average daily amount of exercise. The physiological parameters may be obtained from a measuring instrument, such as a sphygmomanometer, communicatively connected to the electronic device, or entered by the user.
In some embodiments of this embodiment, after responding to the instruction to initiate the assist movement, the method further comprises: determining whether the user exits the motion state; and if the user is determined to exit the motion state, generating an instruction for closing the auxiliary motion.
By determining the current state of the user in time, the auxiliary exercise can be automatically closed when the user is confirmed to exit the exercise state, so that the intelligence of the auxiliary exercise and the accuracy of exercise data are improved, and the follow-up analysis on the exercise of the user is facilitated.
Illustratively, when the user is detected to have left the motion state, the device waits a preset time interval and checks whether the user has resumed the previous continuous motion state; if not, the user is confirmed to have exited. The preset interval may be set according to actual requirements, for example 5 or 10 seconds, and is not limited here. A timed task may be generated for the interval, and after it expires, whether the user is still out of the exercise state is checked. This multi-step judgment determines more accurately whether the user has exited and prevents exercise assistance from being closed by mistake, improving its efficiency.
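A short sketch of the exit debounce just described, assuming a 10-second grace period and placeholder callbacks.

```python
import threading

def on_possible_exit(state, still_in_state, close_assistance, grace_s=10):
    """When leaving the state is detected, wait the preset interval and close
    assistance only if the previous continuous motion state has not resumed."""
    def recheck():
        if not still_in_state(state):    # did not resume: confirmed exit
            close_assistance()           # stop assistance, then generate the report
    threading.Timer(grace_s, recheck).start()
```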
Illustratively, after turning off the auxiliary exercise function, the method further comprises: and generating a motion report according to the motion data corresponding to the motion state. The exercise report form may include exercise conditions corresponding to the exercise of the user, exercise optimization suggestions for the exercise of the user, and/or exercise plans corresponding to the user. The exercise data may include exercise duration, kilometers, heat consumption, average speed, aerobic exercise time, travel track, and other information. And automatically generating a motion plan corresponding to the user according to the motion condition of the user. The content included in the sports report can be correspondingly set and optimized according to the actual requirement of the user.
Some applicable scenarios of step S33 in the method for assisting exercise based on motion state perception according to this embodiment will be described in detail below with reference to the drawings.
For example, the electronic device (e.g., a mobile phone) determines that the motion state of the user is running, and the motion state meets a preset motion assistance condition. And inquiring whether the electronic equipment is connected with the earphone of the user. When the electronic equipment is determined to be connected with the earphone of the user, the earphone is controlled to play an inquiry audio signal so as to inquire whether the user starts the auxiliary exercise function, and whether an auxiliary exercise starting instruction is generated or not is judged according to a reply signal corresponding to the received play inquiry audio signal. The reply signal comprises an audio reply signal, i.e. the audio reply signal is generated from the speech of the user. Wherein, the connection comprises a wired connection and a wireless connection, and the wireless connection comprises a Bluetooth connection.
Starting the voice assistant only after the connection between the electronic device and the user's headset is confirmed avoids startling the user with sound played abruptly from the phone, makes exercise assistance more intelligent, and improves the user experience.
Illustratively, the controlling the earphone to play the query audio information specifically includes: judging whether a user wears the earphone or not; and if the user wears the earphone, controlling the earphone to play the inquiry audio information. For example, whether a user wears the earphone is judged, if the user wears the earphone, a voice assistant is started, and the user is inquired whether to start an auxiliary exercise function or not based on the voice assistant; and if the user does not wear the earphone, the earphone is not controlled to play the inquiry audio information. Determining to generate an instruction to turn on the auxiliary motion when the user determines to turn on the auxiliary motion function (e.g., a voice confirmation of the user is received); when the user determines not to turn on the auxiliary exercise function, it is determined not to generate an instruction to turn on the auxiliary exercise. Wherein whether the user wears the headset may be determined by a sensor in the headset. For example, when the user says "yes", "open", "good", etc., it is determined that the user indicates that the auxiliary exercise function is opened; when the user says "no", etc., it is determined that the user indicates not to turn on the auxiliary exercise function.
By judging whether the user wears the earphone or not, the user is prevented from playing an inquiry audio signal under the condition that the user does not wear the earphone, and the efficiency of starting auxiliary exercise is improved.
The reply signal further includes a shake reply signal, which may be generated according to a shake of the target electronic device, the shake reply signal being used to determine a shake frequency of the electronic device. After the inquiry audio signal is played, a shaking reply signal of the user is generated according to the shaking of the user on the target electronic equipment, and the shaking frequency of the target electronic equipment is determined according to the shaking reply signal. If the shaking frequency of the target electronic equipment is greater than or equal to a preset shaking threshold value, determining to generate an instruction for starting auxiliary motion; and if the shaking frequency of the target electronic equipment is smaller than a preset shaking threshold value, determining that the instruction for starting the auxiliary motion is not generated.
The target electronic device may be a mobile phone, an electronic bracelet, or the like.
The following description takes an electronic bracelet as an example of the target electronic device. After the query audio signal is played through the headset or by the voice assistant, the shake reply signal sent by the target electronic device is acquired, and whether to generate the instruction to start exercise assistance is determined from that signal.
For example, after a query such as "Would you like to start exercise assistance? If yes, please shake the electronic bracelet" is played, the shake reply signal sent by the bracelet is acquired and used to determine whether the user shook it. A shake threshold is preset: if the shake frequency derived from the reply signal is greater than or equal to the threshold, the user is considered to have shaken the bracelet and the instruction to start exercise assistance is generated; if it is below the threshold, the instruction is not generated.
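A sketch of turning shake events into a start/no-start decision; the 2 Hz threshold and 3-second window are assumptions, since the embodiment only requires comparing the shake frequency with a preset threshold.

```python
SHAKE_THRESHOLD_HZ = 2.0                 # assumed preset shake threshold

def should_start_assistance(shake_timestamps_s, window_s=3.0):
    """Return True if the shake frequency within the recent window reaches the threshold."""
    if not shake_timestamps_s:
        return False
    latest = shake_timestamps_s[-1]
    recent = [t for t in shake_timestamps_s if t >= latest - window_s]
    return len(recent) / window_s >= SHAKE_THRESHOLD_HZ
```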
The user can generate a reply signal through shaking of the target electronic equipment, the reply signal is used for determining to start the auxiliary movement, and the intelligence of the auxiliary movement is improved.
Fig. 8 is a flowchart of generating the instruction to start exercise assistance according to this embodiment, and is a first detailed flow of step S33.
Step S51, if the motion state meets a preset motion assistance condition, obtaining current position data of the user.
For example, the current location data of the user may be obtained based on sensors of the electronic device, including a GPS sensor.
And step S52, determining whether a preset sound play-out condition is satisfied according to the current position data.
Loudspeaker play-out means that the electronic device plays sound directly through its built-in speaker, without an external playback device such as a headset or earphones. The play-out condition may be preset according to actual requirements and is not limited here. For example, it may include that the user is outdoors, that the crowd density at the user's current location is below a target level, that loudspeaker play-out is not prohibited at that location, and/or that the location is not sound-restricted. For example, a library may restrict sound, including prohibiting loud noise, so that readers can read quietly.
If the preset sound playing condition is not met, executing step S53, determining whether the user 'S earphone is connected to the electronic device, and if the user' S earphone is not connected to the electronic device, ending the process. If the user' S earphone is connected to the electronic device, go to step S54. And step S54, judging whether the user wears earphones or not. If the user does not wear the earphone, the process is ended; if the user wears the headset, step S55 is executed to control the headset to play the inquiry audio signal to inquire whether the user starts the auxiliary exercise function, and determine whether to generate an instruction to start the auxiliary exercise according to the reply audio signal corresponding to the received play inquiry audio signal.
For example, a voice assistant of the electronic device may be turned on, generate an inquiry audio signal for asking the user whether to turn on the auxiliary exercise function, and control the headset to play the inquiry audio signal. And acquiring the voice reply of the user to the inquiry audio signal by a voice acquisition device based on the electronic equipment to obtain a reply audio signal. Wherein the sound collection device may comprise a microphone device. Whether the user turns on the auxiliary exercise function may be determined based on the voice recognition function recognizing the received reply audio signal.
And if the preset sound external condition is met, externally outputting an inquiry audio signal to inquire whether a user starts an auxiliary exercise function, and judging whether an auxiliary exercise starting instruction is generated according to a received reply signal corresponding to the inquiry audio signal. For example, if the preset sound output condition is met, the voice assistant of the electronic device is turned on to generate an inquiry audio signal, and the inquiry audio signal is output to inquire whether the user turns on the auxiliary motion function. And receiving a reply audio signal of the user to the inquiry audio signal, and determining whether the user indicates to start the auxiliary motion function according to the received reply audio signal. If the user indicates to start the auxiliary exercise function, determining to generate an instruction for starting the auxiliary exercise; and if the user does not indicate that the auxiliary motion function is started, determining not to generate an instruction for starting the auxiliary motion.
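The decision flow of steps S51 to S55 can be summarized as follows; the predicates are placeholders for the checks described above, and the accepted reply phrases are examples only.

```python
def choose_query_channel(loudspeaker_allowed, headset_connected, headset_worn):
    """Prefer the loudspeaker when the play-out condition is met; otherwise fall back
    to a connected and worn headset; otherwise do not ask at all."""
    if loudspeaker_allowed():                       # e.g. outdoors, no sound restriction
        return "loudspeaker"
    if headset_connected() and headset_worn():      # avoid startling the user
        return "headset"
    return None

def ask_to_start_assistance(channel, play_query, listen_for_reply):
    """Play the query on the chosen channel and turn the reply into a yes/no decision."""
    if channel is None:
        return False                                # no instruction is generated
    play_query(channel, "Would you like to turn on exercise assistance?")
    return listen_for_reply() in {"yes", "ok", "turn it on"}
```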
Some specific implementation manners of step S53, step S54, and step S55 may refer to the descriptions in the foregoing embodiments, and are not described herein again.
If the current position information meets the preset sound playing condition, an inquiry audio signal is directly played to inquire whether the user starts the auxiliary exercise function, and the auxiliary exercise starting efficiency is improved.
For example, when the motion state of the user meets a preset motion assistance condition, the geographic position data of the user is acquired. And determining that the user is in a park according to the geographic position data of the user, wherein the park has no sound limitation, and does not prohibit the tourist from using the electronic equipment to play sound outwards, namely, the preset sound outwards playing condition of the electronic equipment is met. The voice assistant of the electronic equipment is started, and the user is asked whether to start the auxiliary motion function through the voice assistant, for example, a query audio signal is played based on the voice assistant, and a reply audio signal of the user to the query audio signal is received. And determining whether the user indicates to start the auxiliary motion function according to the received reply audio signal. If the user indicates to start the auxiliary exercise function, determining to generate an instruction for starting the auxiliary exercise; and if the user does not indicate that the auxiliary motion function is started, determining not to generate an instruction for starting the auxiliary motion.
Fig. 9 is a flowchart of another way of generating the instruction to start exercise assistance according to this embodiment, and is a second detailed flow of step S33.
And step S71, generating corresponding prompt information according to a preset prompt rule.
In this embodiment, the prompt rule may include an information format and/or information content corresponding to the generated prompt information, and the prompt rule may be modified according to the requirement of the user. The prompt message is used for prompting the user whether to start the auxiliary exercise function. The prompt rule is preset, so that the speed of generating the prompt information can be increased, and the exercise assisting efficiency is increased.
And step S72, sending the prompt information to a target display screen so that the target display screen displays the prompt information. The target display screen may be a display screen of the electronic device, or may be a display screen of another electronic device that establishes communication with the electronic device, where the another electronic device may include a wearable device with a display screen, such as an electronic bracelet and an electronic watch.
Step S73, if the trigger operation of the user on the prompt information is detected, determining to generate an instruction to start the auxiliary motion.
For example, the trigger operation of the user on the prompt message may be generated according to the click of the user on a certain area in the prompt message.
The trigger operation of the user on the prompt message is used for judging whether an instruction for starting the auxiliary movement is generated. And if the trigger operation is not detected within the preset time interval, determining that the command for starting the auxiliary motion is not generated.
Fig. 10 is a schematic view of a scene of a prompt message according to an embodiment of the present application. The prompt information is displayed on a display screen of the electronic equipment, if the user clicks 'yes', a trigger operation corresponding to the prompt information is generated, and the electronic equipment determines to generate an instruction for starting auxiliary movement according to the trigger operation; if the user clicks 'no', generating a rejection instruction, and determining not to generate an instruction for starting auxiliary movement; and if the user does not click the prompt message, no trigger operation is generated, the electronic equipment does not receive the trigger operation within a preset time interval, and the instruction for starting the auxiliary movement is determined not to be generated.
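A sketch of the prompt-and-timeout behavior of fig. 10, assuming a 30-second wait and a show_prompt callback supplied by the display side.

```python
import threading

def prompt_for_assistance(show_prompt, timeout_s=30):
    """Display the prompt; tapping "Yes" generates the start instruction, while
    tapping "No" or letting the prompt time out does not."""
    decision = {"start": False}
    answered = threading.Event()

    def on_tap(choice):                  # invoked by the target display screen
        decision["start"] = (choice == "yes")
        answered.set()

    show_prompt("Start exercise assistance?", on_tap)
    answered.wait(timeout_s)             # no trigger operation within the interval
    return decision["start"]
```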
Displaying the prompt information on the target display screen and generating the corresponding instruction from the user's operation on it makes interaction more convenient and exercise assistance more intelligent.
Fig. 11 is a flowchart of an exercise application recommendation method according to this embodiment. In this embodiment, with reference to the flow shown in Fig. 7, after step S33 is completed, the exercise application recommendation method shown in Fig. 11 may further be executed, which specifically includes the following steps:
Step S91: count the number of exercise sessions whose motion state meets the preset exercise assistance condition.
The number of such exercise sessions may be counted over a preset time period, which can be set according to actual requirements, for example 1 week, 2 weeks, 3 weeks or 1 month, and is not limited here. After step S33 is completed and it has been determined whether the motion state meets the preset exercise assistance condition, the count is updated according to the result; for example, if the motion state meets the condition, 1 is added to the count to obtain the updated number of exercise sessions.
Step S92: determine whether the number of exercise sessions is greater than or equal to a preset count threshold. For example, it may be determined whether the number of sessions within the preset time period reaches the threshold. Different count thresholds may be preset for different time periods, for example a threshold of 2 for 1 week and a threshold of 5 for 2 weeks; setting different thresholds for different time periods can improve the efficiency of the exercise application recommendation.
If the number of exercise sessions is less than the preset count threshold, the process returns to step S91 and counting continues.
If the number of exercise sessions is greater than or equal to the preset count threshold, step S93 is executed to determine whether a target exercise application has been installed. Using a target exercise application can improve the efficiency of the exercise assistance. If no application with an exercise-assistance function is installed on the electronic device, the motion state perception-based auxiliary exercise method of this embodiment is realized only through the exercise-assistance capability of the device's own system, and the target exercise application may be any application with an exercise-assistance function; if such an application is already installed, the target exercise application may be an application that is more fully featured than the one already installed.
If the target exercise application is not installed, step S94 is executed to generate installation prompt information corresponding to the target exercise application; that is, when the target exercise application is not installed on the electronic device, the user is recommended to install it. For example, the generated installation prompt information may be shown on a display interface of the electronic device, such as its display screen.
If the target exercise application has been installed, step S95 is executed to determine whether its auxiliary exercise function has been enabled. For example, it is determined whether the user has enabled the target exercise application, or whether the user has enabled its auxiliary exercise function for exercise assistance. If the user has not enabled the target exercise application, it is determined that its auxiliary exercise function is not enabled.
If the user has not enabled the auxiliary exercise function of the target exercise application, step S96 is executed to generate recommendation information corresponding to that function; that is, the user is recommended to use the auxiliary exercise function of the target exercise application. For example, the generated recommendation information may be shown on a display interface of the electronic device, such as its display screen.
By counting the exercise sessions in which the user's motion state meets the preset exercise assistance condition and, once the count reaches the preset threshold, checking whether the user has enabled the auxiliary exercise function of the target exercise application, corresponding recommendation information can be generated when that function is not enabled, improving the effectiveness of the exercise assistance.
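The following is a minimal, illustrative Python sketch of the Fig. 11 flow under stated assumptions: qualifying exercise sessions are counted inside a preset window, compared with a window-specific threshold, and an install prompt or a recommendation is produced depending on whether the target exercise application is installed and its assistance function enabled. The window lengths, threshold values and the TargetApp data model are hypothetical examples, not values or interfaces defined in this application.

from dataclasses import dataclass
from datetime import datetime, timedelta
from typing import List, Optional

THRESHOLDS = {timedelta(weeks=1): 2, timedelta(weeks=2): 5}   # assumed per-window presets

@dataclass
class TargetApp:
    installed: bool
    assist_enabled: bool

def count_in_window(events: List[datetime], window: timedelta, now: datetime) -> int:
    # S91: count qualifying exercise sessions inside the given time window.
    return sum(1 for t in events if now - t <= window)

def exceeds_any_threshold(events: List[datetime], now: datetime) -> bool:
    # S92: compare each window's count against its preset threshold.
    return any(count_in_window(events, w, now) >= n for w, n in THRESHOLDS.items())

def recommend(events: List[datetime], now: datetime, app: TargetApp) -> Optional[str]:
    if not exceeds_any_threshold(events, now):
        return None                                                   # below threshold: keep counting
    if not app.installed:
        return "install prompt for the target exercise application"   # S94
    if not app.assist_enabled:
        return "recommend enabling its auxiliary exercise function"   # S96
    return None                                                       # nothing to recommend

# Example usage with three qualifying sessions in the past week:
now = datetime(2021, 5, 8)
events = [now - timedelta(days=d) for d in (1, 3, 6)]
print(recommend(events, now, TargetApp(installed=True, assist_enabled=False)))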
Fig. 12 is a schematic structural diagram of an electronic device according to an embodiment of the present application. Referring to fig. 12, the electronic device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a Universal Serial Bus (USB) interface 130, a charging management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, a sensor module 180, a button 190, a motor 191, an indicator 192, a camera 193, a display screen 194, a Subscriber Identification Module (SIM) card interface 195, and the like. The sensor module 180 may include a pressure sensor 180A, a gyroscope sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity light sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, and the like.
It is to be understood that the illustrated structure of the embodiment of the present invention does not specifically limit the electronic device 100. In other embodiments of the present application, electronic device 100 may include more or fewer components than shown, or some components may be combined, some components may be split, or a different arrangement of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
Processor 110 may include one or more processing units, such as: the processor 110 may include an Application Processor (AP), a modem processor, a Graphics Processing Unit (GPU), an Image Signal Processor (ISP), a controller, a video codec, a Digital Signal Processor (DSP), a baseband processor, and/or a neural-Network Processing Unit (NPU), etc. The different processing units may be separate devices or may be integrated into one or more processors.
The controller can generate an operation control signal according to the instruction operation code and the timing signal to complete the control of instruction fetching and instruction execution.
A memory may also be provided in the processor 110 for storing instructions and data. In some embodiments, the memory in the processor 110 is a cache. This memory may hold instructions or data that the processor 110 has just used or uses cyclically. If the processor 110 needs the instruction or data again, it can be fetched directly from this memory, which avoids repeated accesses, reduces the waiting time of the processor 110, and thereby improves system efficiency.
In some embodiments, processor 110 may include one or more interfaces. The interface may include an integrated circuit (I2C) interface, an integrated circuit built-in audio (I2S) interface, a Pulse Code Modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a Mobile Industry Processor Interface (MIPI), a general-purpose input/output (GPIO) interface, a Subscriber Identity Module (SIM) interface, and/or a Universal Serial Bus (USB) interface, etc.
The I2C interface is a bi-directional synchronous serial bus that includes a serial data line (SDA) and a Serial Clock Line (SCL). In some embodiments, processor 110 may include multiple sets of I2C buses. The processor 110 may be coupled to the touch sensor 180K, the charger, the flash, the camera 193, etc. through different I2C bus interfaces, respectively. For example: the processor 110 may be coupled to the touch sensor 180K via an I2C interface, such that the processor 110 and the touch sensor 180K communicate via an I2C bus interface to implement the touch functionality of the electronic device 100.
The I2S interface may be used for audio communication. In some embodiments, processor 110 may include multiple sets of I2S buses. The processor 110 may be coupled to the audio module 170 via an I2S bus to enable communication between the processor 110 and the audio module 170. In some embodiments, the audio module 170 may communicate audio signals to the wireless communication module 160 via the I2S interface, enabling answering of calls via a bluetooth headset.
The PCM interface may also be used for audio communication, sampling, quantizing and encoding analog signals. In some embodiments, the audio module 170 and the wireless communication module 160 may be coupled by a PCM bus interface. In some embodiments, the audio module 170 may also transmit audio signals to the wireless communication module 160 through the PCM interface, so as to implement a function of answering a call through a bluetooth headset. Both the I2S interface and the PCM interface may be used for audio communication.
The UART interface is a universal serial data bus used for asynchronous communications. The bus may be a bidirectional communication bus. It converts the data to be transmitted between serial communication and parallel communication. In some embodiments, a UART interface is generally used to connect the processor 110 with the wireless communication module 160. For example: the processor 110 communicates with a bluetooth module in the wireless communication module 160 through a UART interface to implement a bluetooth function. In some embodiments, the audio module 170 may transmit the audio signal to the wireless communication module 160 through a UART interface, so as to realize the function of playing music through a bluetooth headset.
MIPI interfaces may be used to connect processor 110 with peripheral devices such as display screen 194, camera 193, and the like. The MIPI interface includes a Camera Serial Interface (CSI), a Display Serial Interface (DSI), and the like. In some embodiments, processor 110 and camera 193 communicate through a CSI interface to implement the capture functionality of electronic device 100. The processor 110 and the display screen 194 communicate through the DSI interface to implement the display function of the electronic device 100.
The GPIO interface may be configured by software. The GPIO interface may be configured as a control signal and may also be configured as a data signal. In some embodiments, a GPIO interface may be used to connect the processor 110 with the camera 193, the display 194, the wireless communication module 160, the audio module 170, the sensor module 180, and the like. The GPIO interface may also be configured as an I2C interface, an I2S interface, a UART interface, a MIPI interface, and the like.
The USB interface 130 is an interface conforming to the USB standard specification, and may specifically be a Mini USB interface, a Micro USB interface, a USB Type-C interface, or the like. The USB interface 130 may be used to connect a charger to charge the electronic device 100, and may also be used to transmit data between the electronic device 100 and a peripheral device. It may also be used to connect a headset and play audio through the headset. The interface may further be used to connect other electronic devices, such as AR devices.
It should be understood that the connection relationships between the modules illustrated in this embodiment are merely examples and do not constitute a limitation on the structure of the electronic device 100. In other embodiments of the present application, the electronic device 100 may also adopt an interface connection manner different from those in the above embodiments, or a combination of multiple interface connection manners.
The charging management module 140 is configured to receive charging input from a charger. The charger may be a wireless charger or a wired charger. In some wired charging embodiments, the charging management module 140 may receive charging input from a wired charger via the USB interface 130. In some wireless charging embodiments, the charging management module 140 may receive a wireless charging input through a wireless charging coil of the electronic device 100. The charging management module 140 may also supply power to the electronic device 100 through the power management module 141 while charging the battery 142.
The power management module 141 is used to connect the battery 142, the charging management module 140 and the processor 110. The power management module 141 receives input from the battery 142 and/or the charge management module 140, and supplies power to the processor 110, the internal memory 121, the display 194, the camera 193, the wireless communication module 160, and the like. The power management module 141 may also be used to monitor parameters such as battery capacity, battery cycle count, battery state of health (leakage, impedance), etc. In some other embodiments, the power management module 141 may also be disposed in the processor 110. In other embodiments, the power management module 141 and the charging management module 140 may be disposed in the same device.
The wireless communication function of the electronic device 100 may be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, a modem processor, a baseband processor, and the like.
The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals. Each antenna in the electronic device 100 may be used to cover a single or multiple communication bands. Different antennas can also be multiplexed to improve the utilization of the antennas. For example: the antenna 1 may be multiplexed as a diversity antenna of a wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
The mobile communication module 150 may provide a solution including 2G/3G/4G/5G wireless communication applied to the electronic device 100. The mobile communication module 150 may include at least one filter, a switch, a power amplifier, a Low Noise Amplifier (LNA), and the like. The mobile communication module 150 may receive the electromagnetic wave from the antenna 1, filter, amplify, etc. the received electromagnetic wave, and transmit the electromagnetic wave to the modem processor for demodulation. The mobile communication module 150 may also amplify the signal modulated by the modem processor, and convert the signal into electromagnetic wave through the antenna 1 to radiate the electromagnetic wave. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be disposed in the processor 110. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be disposed in the same device as at least some of the modules of the processor 110.
The modem processor may include a modulator and a demodulator. The modulator is used for modulating a low-frequency baseband signal to be transmitted into a medium-high frequency signal. The demodulator is used for demodulating the received electromagnetic wave signal into a low-frequency baseband signal. The demodulator then passes the demodulated low frequency baseband signal to a baseband processor for processing. The low frequency baseband signal is processed by the baseband processor and then transferred to the application processor. The application processor outputs a sound signal through an audio device (not limited to the speaker 170A, the receiver 170B, etc.) or displays an image or video through the display screen 194. In some embodiments, the modem processor may be a stand-alone device. In other embodiments, the modem processor may be provided in the same device as the mobile communication module 150 or other functional modules, independent of the processor 110.
The wireless communication module 160 may provide a solution for wireless communication applied to the electronic device 100, including Wireless Local Area Networks (WLANs) (e.g., wireless fidelity (Wi-Fi) networks), bluetooth (bluetooth, BT), Global Navigation Satellite System (GNSS), Frequency Modulation (FM), Near Field Communication (NFC), Infrared (IR), and the like. The wireless communication module 160 may be one or more devices integrating at least one communication processing module. The wireless communication module 160 receives electromagnetic waves via the antenna 2, performs frequency modulation and filtering processing on electromagnetic wave signals, and transmits the processed signals to the processor 110. The wireless communication module 160 may also receive a signal to be transmitted from the processor 110, perform frequency modulation and amplification on the signal, and convert the signal into electromagnetic waves through the antenna 2 to radiate the electromagnetic waves.
In some embodiments, antenna 1 of the electronic device 100 is coupled to the mobile communication module 150 and antenna 2 is coupled to the wireless communication module 160, so that the electronic device 100 can communicate with networks and other devices through wireless communication technologies. The wireless communication technologies may include global system for mobile communications (GSM), general packet radio service (GPRS), code division multiple access (CDMA), wideband code division multiple access (WCDMA), time-division code division multiple access (TD-SCDMA), long term evolution (LTE), BT, GNSS, WLAN, NFC, FM, and/or IR technologies, etc. The GNSS may include the global positioning system (GPS), the global navigation satellite system (GLONASS), the BeiDou navigation satellite system (BDS), the quasi-zenith satellite system (QZSS), and/or satellite based augmentation systems (SBAS).
The electronic device 100 implements display functions via the GPU, the display screen 194, and the application processor. The GPU is a microprocessor for image processing, and is connected to the display screen 194 and an application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. The processor 110 may include one or more GPUs that execute program instructions to generate or alter display information.
The display screen 194 is used to display images, videos, and the like. The display screen 194 includes a display panel. The display panel may be a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a Mini LED, a Micro LED, a Micro-OLED, a quantum dot light-emitting diode (QLED), or the like. In some embodiments, the electronic device 100 may include 1 or N display screens 194, where N is a positive integer greater than 1.
The electronic device 100 may implement a shooting function through the ISP, the camera 193, the video codec, the GPU, the display 194, the application processor, and the like.
The ISP is used to process the data fed back by the camera 193. For example, when a photo is taken, the shutter is opened, light is transmitted to the camera photosensitive element through the lens, the optical signal is converted into an electrical signal, and the camera photosensitive element transmits the electrical signal to the ISP for processing and converting into an image visible to naked eyes. The ISP can also carry out algorithm optimization on the noise, brightness and skin color of the image. The ISP can also optimize parameters such as exposure, color temperature and the like of a shooting scene. In some embodiments, the ISP may be provided in camera 193.
The camera 193 is used to capture still images or video. The object generates an optical image through the lens and projects the optical image to the photosensitive element. The photosensitive element may be a Charge Coupled Device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor. The light sensing element converts the optical signal into an electrical signal, which is then passed to the ISP where it is converted into a digital image signal. And the ISP outputs the digital image signal to the DSP for processing. The DSP converts the digital image signal into image signal in standard RGB, YUV and other formats. In some embodiments, the electronic device 100 may include 1 or N cameras 193, N being a positive integer greater than 1.
The digital signal processor is used for processing digital signals, and can process digital image signals and other digital signals. For example, when the electronic device 100 selects a frequency bin, the digital signal processor is used to perform fourier transform or the like on the frequency bin energy.
Video codecs are used to compress or decompress digital video. The electronic device 100 may support one or more video codecs. In this way, the electronic device 100 may play or record video in a variety of encoding formats, such as: moving Picture Experts Group (MPEG) 1, MPEG2, MPEG3, MPEG4, and the like.
The NPU is a neural-network (NN) computing processor that processes input information quickly by using a biological neural network structure, for example, by using a transfer mode between neurons of a human brain, and can also learn by itself continuously. Applications such as intelligent recognition of the electronic device 100 can be realized through the NPU, for example: image recognition, face recognition, speech recognition, text understanding, and the like.
The internal memory 121 may include one or more Random Access Memories (RAMs) and one or more non-volatile memories (NVMs).
The random access memory may include static random-access memory (SRAM), dynamic random-access memory (DRAM), synchronous dynamic random-access memory (SDRAM), double data rate synchronous dynamic random-access memory (DDR SDRAM), such as fifth generation DDR SDRAM generally referred to as DDR5 SDRAM, and the like;
the nonvolatile memory may include a magnetic disk storage device, a flash memory (flash memory).
The FLASH memory may include NOR FLASH, NAND FLASH, 3D NAND FLASH, etc. according to the operation principle, may include single-level cells (SLC), multi-level cells (MLC), three-level cells (TLC), four-level cells (QLC), etc. according to the level order of the memory cells, and may include universal FLASH memory (UFS), embedded multimedia memory cards (eMMC), etc. according to the storage specification.
The random access memory may be read and written directly by the processor 110, may be used to store executable programs (e.g., machine instructions) of an operating system or other programs in operation, and may also be used to store data of users and applications, etc.
The nonvolatile memory may also store executable programs, data of users and application programs, and the like, and may be loaded into the random access memory in advance for the processor 110 to directly read and write.
The external memory interface 120 may be used to connect an external nonvolatile memory to extend the storage capability of the electronic device 100. The external non-volatile memory communicates with the processor 110 through the external memory interface 120 to implement data storage functions. For example, files such as music, video, etc. are saved in an external nonvolatile memory.
The electronic device 100 may implement audio functions via the audio module 170, the speaker 170A, the receiver 170B, the microphone 170C, the headphone interface 170D, and the application processor. Such as music playing, recording, etc.
The audio module 170 is used to convert digital audio information into an analog audio signal output and also to convert an analog audio input into a digital audio signal. The audio module 170 may also be used to encode and decode audio signals. In some embodiments, the audio module 170 may be disposed in the processor 110, or some functional modules of the audio module 170 may be disposed in the processor 110.
The speaker 170A, also called a "horn", is used to convert the audio electrical signal into an acoustic signal. The electronic apparatus 100 can listen to music through the speaker 170A or listen to a handsfree call.
The receiver 170B, also called "earpiece", is used to convert the electrical audio signal into an acoustic signal. When the electronic apparatus 100 receives a call or voice information, it can receive voice by placing the receiver 170B close to the ear of the person.
The microphone 170C, also referred to as a "mike" or "mic", is used to convert sound signals into electrical signals. When making a call or sending voice information, the user can input a sound signal into the microphone 170C by speaking with the mouth close to the microphone 170C. The electronic device 100 may be provided with at least one microphone 170C. In other embodiments, the electronic device 100 may be provided with two microphones 170C, which, in addition to collecting sound signals, can implement a noise reduction function. In other embodiments, the electronic device 100 may further be provided with three, four, or more microphones 170C to collect sound signals, reduce noise, identify sound sources, implement directional recording, and so on.
The headset interface 170D is used to connect a wired headset. The headset interface 170D may be the USB interface 130, or may be a 3.5 mm open mobile terminal platform (OMTP) standard interface or a Cellular Telecommunications Industry Association of the USA (CTIA) standard interface.
The pressure sensor 180A is used for sensing a pressure signal, and converting the pressure signal into an electrical signal. In some embodiments, the pressure sensor 180A may be disposed on the display screen 194. The pressure sensor 180A can be of a wide variety, such as a resistive pressure sensor, an inductive pressure sensor, a capacitive pressure sensor, and the like. The capacitive pressure sensor may be a sensor comprising at least two parallel plates having an electrically conductive material. When a force acts on the pressure sensor 180A, the capacitance between the electrodes changes. The electronic device 100 determines the strength of the pressure from the change in capacitance. When a touch operation is applied to the display screen 194, the electronic apparatus 100 detects the intensity of the touch operation according to the pressure sensor 180A. The electronic apparatus 100 may also calculate the touched position from the detection signal of the pressure sensor 180A. In some embodiments, the touch operations that are applied to the same touch position but different touch operation intensities may correspond to different operation instructions. For example: and when the touch operation with the touch operation intensity smaller than the first pressure threshold value acts on the short message application icon, executing an instruction for viewing the short message. And when the touch operation with the touch operation intensity larger than or equal to the first pressure threshold value acts on the short message application icon, executing an instruction of newly building the short message.
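As a hedged illustration only, the following Python fragment sketches the intensity-dependent behaviour described above, where the same touch position yields different instructions on either side of a first pressure threshold; the threshold value and instruction names are assumed for the example and are not specified by this application.

FIRST_PRESSURE_THRESHOLD = 0.5    # normalized touch intensity, assumed example value

def instruction_for_touch(target: str, intensity: float) -> str:
    # Map a touch on the short-message application icon to an instruction by intensity.
    if target != "sms_icon":
        return "ignore"
    if intensity < FIRST_PRESSURE_THRESHOLD:
        return "view_messages"        # lighter press: view the short message
    return "compose_new_message"      # press at or above the threshold: create a new short message

print(instruction_for_touch("sms_icon", 0.2))   # view_messages
print(instruction_for_touch("sms_icon", 0.8))   # compose_new_message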
The gyro sensor 180B may be used to determine the motion attitude of the electronic device 100. In some embodiments, the angular velocity of electronic device 100 about three axes (i.e., the x, y, and z axes) may be determined by gyroscope sensor 180B. The gyro sensor 180B may be used for photographing anti-shake. For example, when the shutter is pressed, the gyro sensor 180B detects a shake angle of the electronic device 100, calculates a distance to be compensated for by the lens module according to the shake angle, and allows the lens to counteract the shake of the electronic device 100 through a reverse movement, thereby achieving anti-shake. The gyroscope sensor 180B may also be used for navigation, somatosensory gaming scenes.
The air pressure sensor 180C is used to measure air pressure. In some embodiments, electronic device 100 calculates altitude, aiding in positioning and navigation, from barometric pressure values measured by barometric pressure sensor 180C.
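The application only states that altitude is calculated from the measured barometric pressure and does not give a formula; purely as an illustration, one common approximation is the international barometric formula sketched below in Python, with an assumed sea-level reference pressure of 1013.25 hPa.

def altitude_from_pressure(pressure_hpa: float, sea_level_hpa: float = 1013.25) -> float:
    # Approximate altitude in metres from barometric pressure (standard-atmosphere model).
    return 44330.0 * (1.0 - (pressure_hpa / sea_level_hpa) ** (1.0 / 5.255))

print(round(altitude_from_pressure(899.0)))   # roughly 1000 m above sea level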
The magnetic sensor 180D includes a Hall sensor. The electronic device 100 may use the magnetic sensor 180D to detect the opening and closing of a flip holster. In some embodiments, when the electronic device 100 is a flip phone, the electronic device 100 may detect the opening and closing of the flip cover according to the magnetic sensor 180D. Features such as automatic unlocking upon opening can then be set according to the detected open or closed state of the holster or of the flip cover.
The acceleration sensor 180E may detect the magnitude of acceleration of the electronic device 100 in various directions (typically three axes). The magnitude and direction of gravity can be detected when the electronic device 100 is stationary. The method can also be used for identifying the posture of the electronic equipment 100, and is applied to horizontal and vertical screen switching, pedometers and other applications.
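As an illustrative sketch only, the following Python fragment shows one simple way such three-axis acceleration data could be used to recognize posture for portrait/landscape switching: when the device is roughly still, gravity dominates and the dominant axis indicates the orientation. The axis convention and the stillness band around 9.8 m/s² are assumptions for the example, not values from this application.

def posture(ax: float, ay: float, az: float) -> str:
    # Recognize a coarse device posture from three-axis acceleration in m/s^2.
    g = (ax * ax + ay * ay + az * az) ** 0.5
    if not 8.0 < g < 11.6:            # total acceleration far from 1 g: device is moving
        return "moving"
    if abs(az) > max(abs(ax), abs(ay)):
        return "flat"                 # gravity mostly along z: device lying flat
    return "portrait" if abs(ay) > abs(ax) else "landscape"

print(posture(0.1, 9.7, 0.3))   # portrait
print(posture(9.6, 0.2, 0.5))   # landscape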
The distance sensor 180F is used to measure distance. The electronic device 100 may measure distance by infrared or laser. In some embodiments, in a shooting scene, the electronic device 100 may use the distance sensor 180F to measure distance for fast focusing.
The proximity light sensor 180G may include, for example, a light-emitting diode (LED) and a light detector such as a photodiode. The light-emitting diode may be an infrared LED. The electronic device 100 emits infrared light outward through the LED and uses the photodiode to detect infrared light reflected from nearby objects. When sufficient reflected light is detected, it can be determined that there is an object near the electronic device 100; when insufficient reflected light is detected, the electronic device 100 may determine that there is no object nearby. The electronic device 100 can use the proximity light sensor 180G to detect that the user is holding the device close to the ear during a call, so as to automatically turn off the screen and save power. The proximity light sensor 180G can also be used in holster mode and pocket mode to automatically unlock and lock the screen.
The ambient light sensor 180L is used to sense the ambient light level. Electronic device 100 may adaptively adjust the brightness of display screen 194 based on the perceived ambient light level. The ambient light sensor 180L may also be used to automatically adjust the white balance when taking a picture. The ambient light sensor 180L may also cooperate with the proximity light sensor 180G to detect whether the electronic device 100 is in a pocket to prevent accidental touches.
The fingerprint sensor 180H is used to collect a fingerprint. The electronic device 100 can utilize the collected fingerprint characteristics to unlock the fingerprint, access the application lock, photograph the fingerprint, answer an incoming call with the fingerprint, and so on.
The temperature sensor 180J is used to detect temperature. In some embodiments, electronic device 100 implements a temperature processing strategy using the temperature detected by temperature sensor 180J. For example, when the temperature reported by the temperature sensor 180J exceeds a threshold, the electronic device 100 performs a reduction in performance of a processor located near the temperature sensor 180J, so as to reduce power consumption and implement thermal protection. In other embodiments, the electronic device 100 heats the battery 142 when the temperature is below another threshold to avoid the low temperature causing the electronic device 100 to shut down abnormally. In other embodiments, when the temperature is lower than a further threshold, the electronic device 100 performs boosting on the output voltage of the battery 142 to avoid abnormal shutdown due to low temperature.
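The tiered strategy above can be summarized by the small illustrative Python fragment below; the three temperature thresholds and the action names are assumed example values, not figures disclosed in this application.

def thermal_actions(temp_c: float, high: float = 45.0, low: float = 0.0, very_low: float = -10.0) -> list:
    # Tiered temperature strategy: throttle when hot, warm or boost the battery when cold.
    actions = []
    if temp_c > high:
        actions.append("reduce performance of the processor near the sensor")
    if temp_c < low:
        actions.append("heat the battery")
    if temp_c < very_low:
        actions.append("boost the battery output voltage")
    return actions

print(thermal_actions(50.0))    # throttle only
print(thermal_actions(-15.0))   # heat the battery and boost its output voltage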
The touch sensor 180K is also called a "touch device". The touch sensor 180K may be disposed on the display screen 194, and the touch sensor 180K and the display screen 194 form a touch screen, which is also called a "touch screen". The touch sensor 180K is used to detect a touch operation applied thereto or nearby. The touch sensor can communicate the detected touch operation to the application processor to determine the touch event type. Visual output associated with the touch operation may be provided through the display screen 194. In other embodiments, the touch sensor 180K may be disposed on a surface of the electronic device 100, different from the position of the display screen 194.
The bone conduction sensor 180M may acquire a vibration signal. In some embodiments, the bone conduction sensor 180M may acquire a vibration signal of the human vocal part vibrating the bone mass. The bone conduction sensor 180M may also contact the human pulse to receive the blood pressure pulsation signal. In some embodiments, the bone conduction sensor 180M may also be disposed in a headset, integrated into a bone conduction headset. The audio module 170 may analyze a voice signal based on the vibration signal of the bone mass vibrated by the sound part acquired by the bone conduction sensor 180M, so as to implement a voice function. The application processor can analyze heart rate information based on the blood pressure beating signal acquired by the bone conduction sensor 180M, so as to realize the heart rate detection function.
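As an illustration of the last point, the following Python fragment shows one simple way heart-rate information could be derived from a sequence of detected pulse beats, such as the blood pressure beating signal mentioned above: convert the mean inter-beat interval into beats per minute. Beat detection itself and the timestamps used are assumed for the example and are not part of this application.

from typing import List

def heart_rate_bpm(beat_times_s: List[float]) -> float:
    # Heart rate from at least two detected beat timestamps, in seconds.
    if len(beat_times_s) < 2:
        raise ValueError("need at least two detected beats")
    intervals = [b - a for a, b in zip(beat_times_s, beat_times_s[1:])]
    return 60.0 / (sum(intervals) / len(intervals))

print(round(heart_rate_bpm([0.0, 0.8, 1.6, 2.4, 3.2]), 1))   # 75.0 beats per minute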
The keys 190 include a power key, a volume key, and the like. The keys 190 may be mechanical keys or touch keys. The electronic device 100 may receive key inputs and generate key signal inputs related to user settings and function control of the electronic device 100.
The motor 191 may generate a vibration cue. The motor 191 may be used for incoming call vibration cues, as well as for touch vibration feedback. For example, touch operations applied to different applications (e.g., photographing, audio playing, etc.) may correspond to different vibration feedback effects. The motor 191 may also respond to different vibration feedback effects for touch operations applied to different areas of the display screen 194. Different application scenes (such as time reminding, receiving information, alarm clock, game and the like) can also correspond to different vibration feedback effects. The touch vibration feedback effect may also support customization.
Indicator 192 may be an indicator light that may be used to indicate a state of charge, a change in charge, or a message, missed call, notification, etc.
The SIM card interface 195 is used to connect a SIM card. A SIM card can be brought into or out of contact with the electronic device 100 by inserting it into or pulling it out of the SIM card interface 195. The electronic device 100 may support 1 or N SIM card interfaces, where N is a positive integer greater than 1. The SIM card interface 195 may support a Nano SIM card, a Micro SIM card, a standard SIM card, and the like. Multiple cards may be inserted into the same SIM card interface 195 at the same time, and the cards may be of the same type or of different types. The SIM card interface 195 may also be compatible with different types of SIM cards and with external memory cards. The electronic device 100 interacts with the network through the SIM card to implement functions such as calls and data communication. In some embodiments, the electronic device 100 uses an eSIM, that is, an embedded SIM card, which can be embedded in the electronic device 100 and cannot be separated from it.
The present embodiment also provides a computer storage medium, which stores computer instructions, and when the computer instructions are executed on the electronic device 100, the electronic device 100 executes the above related method steps to implement the auxiliary exercise method based on motion state perception in the above embodiments.
The present embodiment also provides a computer program product, which when running on a computer, causes the computer to execute the above related steps to implement the auxiliary exercise method based on motion state perception in the above embodiments.
In addition, embodiments of the present application also provide an apparatus, which may be specifically a chip, a component or a module, and may include a processor and a memory connected to each other; the memory is used for storing computer execution instructions, and when the device runs, the processor can execute the computer execution instructions stored in the memory, so that the chip executes the auxiliary motion method based on motion state perception in the above-mentioned method embodiments.
The electronic device 100, the computer storage medium, the computer program product, or the chip provided in this embodiment are all configured to execute the corresponding methods provided above, so that the beneficial effects achieved by the electronic device may refer to the beneficial effects in the corresponding methods provided above, and are not described herein again.
Through the above description of the embodiments, it is clear to those skilled in the art that, for convenience and simplicity of description, the foregoing division of the functional modules is merely used as an example, and in practical applications, the above function distribution may be completed by different functional modules according to needs, that is, the internal structure of the device may be divided into different functional modules to complete all or part of the above described functions.
In the several embodiments provided in the present application, it should be understood that the disclosed apparatus and method may be implemented in other ways. For example, the above-described device embodiments are merely illustrative, and for example, the division of the module or unit is only one logical division, and there may be other divisions when actually implemented, for example, a plurality of units or components may be combined or integrated into another device, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may be one physical unit or a plurality of physical units, that is, may be located in one place, or may be distributed to a plurality of different places. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated unit, if implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a readable storage medium. Based on such understanding, the technical solutions of the embodiments of the present application may be essentially or partially contributed to by the prior art, or all or part of the technical solutions may be embodied in the form of a software product, where the software product is stored in a storage medium and includes several instructions to enable a device (which may be a single chip, a chip, or the like) or a processor (processor) to execute all or part of the steps of the methods of the embodiments of the present application. And the aforementioned storage medium includes: various media capable of storing program codes, such as a usb disk, a removable hard disk, a Read-only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, or an optical disk.
Finally, it should be noted that the above embodiments are only used for illustrating the technical solutions of the present application and not for limiting, and although the present application is described in detail with reference to the preferred embodiments, it should be understood by those skilled in the art that modifications or equivalent substitutions can be made on the technical solutions of the present application without departing from the spirit and scope of the technical solutions of the present application.

Claims (10)

1. An auxiliary motion method based on motion state perception, applied to an electronic device, wherein the method comprises the following steps:
acquiring a motion state of a user;
judging whether the motion state meets a preset motion assisting condition or not;
if the motion state meets the preset motion assisting condition, judging whether an instruction for starting assisting motion is generated or not according to the interaction behavior with the user;
if the command for starting the auxiliary exercise is determined to be generated, responding to the command for starting the auxiliary exercise and acquiring exercise data corresponding to the user;
generating a corresponding auxiliary prompt according to the motion data;
the obtaining of the motion state of the user and the determining whether the motion state meets a preset motion assistance condition comprise: the electronic device enables a motion state fence to sense the motion state of the user; after determining that the user has entered a certain motion state, the electronic device enables a time fence and starts timing; and if the user is still sensed to be in that motion state after a preset duration, the electronic device confirms that the user has entered the motion state and determines that the motion state meets the preset motion assistance condition.
2. A method of assisting exercise based on motion state perception according to claim 1, further comprising:
counting the movement times of which the movement state meets a preset movement auxiliary condition;
if the number of times of the movement is larger than or equal to a preset number threshold, determining whether a target movement application is installed;
if the target motion application is not installed, generating installation prompt information corresponding to the target motion application;
if the target motion application is installed, determining whether an auxiliary motion function of the target motion application is enabled;
and if the auxiliary motion function of the target motion application is not enabled, generating recommendation information corresponding to the auxiliary motion function of the target motion application.
3. The exercise state perception-based auxiliary exercise method according to claim 1 or 2, wherein the judging whether to generate an instruction to start auxiliary exercise according to the interaction behavior with the user if the exercise state meets a preset exercise auxiliary condition includes:
if the motion state meets the preset motion assisting condition, judging whether the earphone of the user is connected with the electronic equipment;
and if the earphone of the user is connected with the electronic equipment, controlling the earphone to play an inquiry audio signal so as to inquire whether the user starts the auxiliary exercise function, and judging whether to generate an instruction for starting the auxiliary exercise according to a reply signal corresponding to the received inquiry audio signal.
4. A method as claimed in claim 3, wherein if the user's headset is connected to the electronic device, controlling the headset to play an audio signal comprises:
if the earphone of the user is connected with the electronic equipment, judging whether the user wears the earphone;
and if the user wears the earphone, controlling the earphone to play an inquiry audio signal.
5. The exercise state perception-based auxiliary exercise method according to claim 1, wherein the judging whether to generate an instruction to start auxiliary exercise according to the interaction behavior with the user if the exercise state meets a preset exercise auxiliary condition includes:
if the motion state meets the preset motion assistance condition, acquiring the current position data of the user;
judging whether the earphone of the user is connected with the electronic equipment;
judging whether a preset sound playing condition is met or not according to the current position data;
and if the sound external-playing condition is met, externally playing an inquiry audio signal to inquire whether a user starts an auxiliary exercise function, and judging whether an auxiliary exercise starting instruction is generated according to a received reply signal corresponding to the inquiry audio signal.
6. A method of assisting exercise based on motion state perception according to claim 5, further comprising:
if the sound playing condition is not met, judging whether the earphone of the user is connected with the electronic equipment or not;
if the earphone of the user is connected with the electronic equipment, judging whether the user wears the earphone;
and if the user wears the earphone, controlling the earphone to play an inquiry audio signal so as to inquire whether the user starts the auxiliary exercise function, and judging whether to generate an instruction for starting the auxiliary exercise according to a reply signal corresponding to the received inquiry audio signal.
7. A method of assisting exercise based on motion state perception according to any one of claims 3-6, wherein the reply signal comprises an audio reply signal and/or a shake reply signal; the audio reply signal is generated according to voice data of a user, and the shaking reply signal is generated according to shaking data of the target device.
8. A method for assisting exercise based on motion state perception according to any one of claims 1-7, wherein the determining whether to generate an instruction for turning on assisting exercise according to the interaction with the user comprises:
generating corresponding prompt information according to a preset prompt rule;
sending the prompt information to a target display screen to enable the target display screen to display the prompt information;
and if the trigger operation of the user on the prompt message is detected, determining to generate an instruction for starting the auxiliary movement.
9. An electronic device, comprising a memory and a processor;
the memory to store program instructions;
the processor is configured to read the program instructions stored in the memory to implement the method for assisting exercise based on motion state perception according to any one of claims 1 to 8.
10. A computer-readable storage medium, wherein computer-readable instructions are stored in the computer-readable storage medium, and when executed by a processor, implement the method for assisting exercise based on motion state perception according to any one of claims 1 to 8.
CN202110503121.8A 2021-05-08 2021-05-08 Auxiliary motion method based on motion state perception, electronic equipment and storage medium Active CN113380374B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110503121.8A CN113380374B (en) 2021-05-08 2021-05-08 Auxiliary motion method based on motion state perception, electronic equipment and storage medium

Publications (2)

Publication Number Publication Date
CN113380374A true CN113380374A (en) 2021-09-10
CN113380374B CN113380374B (en) 2022-05-13

Family

ID=77570805

Country Status (1)

Country Link
CN (1) CN113380374B (en)

Patent Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2019051845A1 (en) * 2017-09-18 2019-03-21 Microsoft Technology Licensing, Llc Fitness assistant chatbots
US20200242305A1 (en) * 2017-09-18 2020-07-30 Microsoft Technology Licensing, Llc Fitness assistant chatbots
US20200057779A1 (en) * 2018-08-15 2020-02-20 Chiun Mai Communication Systems, Inc. Electronic device and digital content managing method
CN109712686A (en) * 2018-11-26 2019-05-03 Oppo广东移动通信有限公司 Body-building control method and relevant apparatus
CN109829107A (en) * 2019-01-23 2019-05-31 华为技术有限公司 A kind of recommended method and electronic equipment based on user movement state
CN109686146A (en) * 2019-02-19 2019-04-26 北京儒博科技有限公司 A kind of switching method of study module, device, storage medium and electronic equipment
CN112447273A (en) * 2019-08-30 2021-03-05 华为技术有限公司 Method and electronic device for assisting fitness
CN110750722A (en) * 2019-10-21 2020-02-04 出门问问信息科技有限公司 Method for pushing audio content through earphone, computing equipment and pushing system
CN112052325A (en) * 2020-09-16 2020-12-08 南通沃特光电科技有限公司 Voice interaction method and device based on dynamic perception

Also Published As

Publication number Publication date
CN113380374B (en) 2022-05-13

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
TR01 Transfer of patent right

Effective date of registration: 20230911

Address after: 201306 building C, No. 888, Huanhu West 2nd Road, Lingang New Area, Pudong New Area, Shanghai

Patentee after: Shanghai Glory Smart Technology Development Co.,Ltd.

Address before: Unit 3401, unit a, building 6, Shenye Zhongcheng, No. 8089, Hongli West Road, Donghai community, Xiangmihu street, Futian District, Shenzhen, Guangdong 518040

Patentee before: Honor Device Co.,Ltd.
