WO2021036568A1 - Method and electronic device for assisting fitness - Google Patents

Method and electronic device for assisting fitness

Info

Publication number
WO2021036568A1
Authority
WO
WIPO (PCT)
Prior art keywords
limb
action
electronic device
user
motion
Prior art date
Application number
PCT/CN2020/102394
Other languages
English (en)
French (fr)
Inventor
姜永航 (Jiang Yonghang)
Original Assignee
华为技术有限公司 (Huawei Technologies Co., Ltd.)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Huawei Technologies Co., Ltd. (华为技术有限公司)
Priority to JP2022513339A (published as JP2022546453A)
Priority to EP20857343.6A (published as EP4020491A4)
Publication of WO2021036568A1
Priority to US17/680,967 (published as US20220176200A1)

Classifications

    • G16H 20/30: ICT specially adapted for therapies or health-improving plans relating to physical therapies or activities, e.g. physiotherapy, acupressure or exercising
    • A63B 24/0006: Computerised comparison for qualitative assessment of motion sequences or the course of a movement
    • A63B 24/0021: Tracking a path or terminating locations
    • G06V 40/23: Recognition of whole body movements, e.g. for sport training
    • A63B 2024/0025: Tracking the path or location of one or more users, e.g. players of a game
    • A63B 2220/10: Measuring of physical parameters relating to sporting activity; positions
    • A63B 2225/20: Sport apparatus, devices or equipment with means for remote communication, e.g. internet or the like
    • G16H 50/20: ICT specially adapted for medical diagnosis, medical simulation or medical data mining, for computer-aided diagnosis, e.g. based on medical expert systems

Definitions

  • This application relates to the field of artificial intelligence (AI), in particular to image processing, and provides a method and device for assisting fitness.
  • the present application provides a method and an electronic device for assisting fitness, which can accurately identify fitness actions and improve user experience.
  • In a first aspect, a method for assisting fitness is provided, including: an electronic device acquires a user action; the electronic device determines, from the user action, a candidate action in which the movement trajectory of a first limb satisfies a first preset condition; the electronic device determines the motion change range of a second limb in the candidate action; and the electronic device determines, according to the motion change range, to output guidance information.
  • In this way, whether to output the guidance information can be accurately determined, which improves the user experience.
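The two-stage decision described in this aspect can be sketched as follows. This is an illustrative reconstruction, not the patent's implementation: the `Frame` fields, the choice of hip height as the first limb's trajectory and knee angle as the second limb's motion, and both thresholds are assumptions.

```python
# Illustrative two-stage gating: (1) accept a candidate action when the
# first limb's trajectory satisfies a preset condition, (2) output guidance
# only when the second limb's motion change range is large enough.
# Field names and thresholds are assumptions, not taken from the patent.
from dataclasses import dataclass

@dataclass
class Frame:
    first_limb_y: float       # e.g. vertical hip position (first limb)
    second_limb_angle: float  # e.g. knee angle in degrees (second limb)

def is_candidate(frames, min_descent=0.2):
    """First preset condition: the first limb must descend far enough."""
    ys = [f.first_limb_y for f in frames]
    return max(ys) - min(ys) >= min_descent

def motion_change_range(frames):
    """Motion change range of the second limb within the candidate action."""
    angles = [f.second_limb_angle for f in frames]
    return max(angles) - min(angles)

def should_output_guidance(frames, min_descent=0.2, min_angle_range=45.0):
    """Second preset condition: guidance only for plausible fitness reps."""
    if not is_candidate(frames, min_descent):
        return False  # irrelevant movement: no evaluation, no guidance
    return motion_change_range(frames) >= min_angle_range

# A squat-like sequence triggers guidance; casual walking does not.
squat = [Frame(1.0, 175), Frame(0.7, 120), Frame(0.5, 90), Frame(1.0, 172)]
walk = [Frame(1.0, 175), Frame(0.98, 170), Frame(1.0, 173)]
```

Gating on the first limb's trajectory first keeps irrelevant movements (walking, picking something up) from ever being scored, which is the experience problem the aspect addresses.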
  • the method further includes: the electronic device acquires input information; and the electronic device determines the first preset condition according to the input information.
  • In this way, the first preset condition is determined according to the input information, which reduces the amount of calculation and improves the accuracy of the output guidance information.
  • The electronic device determines first evaluation information corresponding to first position information, where the first position information includes at least one of the motion change range of the second limb, the action start position of the second limb, the action end position of the second limb, and the movement trajectory of the second limb; the action start position and the action end position of the second limb are determined according to the movement trajectory; and the electronic device outputs guidance information according to the first evaluation information.
  • In this way, the first evaluation information corresponding to the first position information of the user's limb in the user action is determined, so that guidance can be provided for the user's fitness actions according to the position information of the limb, which improves applicability.
  • The method further includes: the electronic device determines second evaluation information corresponding to second position information of the user, where the second position information includes at least one of the motion change range of a third limb, the action start position of the third limb, the action end position of the third limb, and the movement trajectory of the third limb; the action start position and the action end position of the third limb are determined according to the movement trajectory of the first limb; and the electronic device outputs the guidance information according to the second evaluation information and the first evaluation information.
  • In this way, the second evaluation information corresponding to the second position information in the user action can be determined according to the correspondence between position information and evaluation information, thereby providing more comprehensive and detailed guidance for the user's fitness actions.
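A minimal sketch of how position information might map to evaluation information and how first and second evaluation information might combine into guidance, assuming a squat-like action; the angle ranges and messages are illustrative, not from the patent.

```python
# Illustrative mapping from position information to evaluation information,
# and combination of first and second evaluation information into guidance.
# Angle ranges and messages are assumptions for a squat-like action.
def evaluate_position(knee_angle_at_bottom):
    """First evaluation information from the second limb's end position."""
    if knee_angle_at_bottom <= 100:
        return "good depth"
    if knee_angle_at_bottom <= 130:
        return "squat a little deeper"
    return "movement too shallow"

def combine_guidance(first_eval, second_eval=None):
    """Guidance output from first and optional second evaluation info."""
    parts = [first_eval] + ([second_eval] if second_eval else [])
    return "; ".join(parts)
```

Keeping the position-to-evaluation correspondence in a lookup like this is what lets the second limb (and a third limb) contribute separate, composable feedback.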
  • joint points in the user motion are identified to determine the first limb and the second limb in the user motion.
  • the amount of calculation can be reduced.
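As one way to realize the joint-point step above, limb angles can be computed from detected keypoints. The COCO-style keypoint names below are an assumption; the patent does not prescribe a keypoint format.

```python
# Illustrative limb-angle extraction from detected joint points.
# Keypoint names follow the common COCO convention (an assumption).
import math

def joint_angle(a, b, c):
    """Angle at joint b (degrees) formed by points a-b-c in image coords."""
    v1 = (a[0] - b[0], a[1] - b[1])
    v2 = (c[0] - b[0], c[1] - b[1])
    dot = v1[0] * v2[0] + v1[1] * v2[1]
    cos = dot / (math.hypot(*v1) * math.hypot(*v2))
    return math.degrees(math.acos(max(-1.0, min(1.0, cos))))

def knee_angle(keypoints):
    """Second-limb angle for a squat: hip-knee-ankle of the left leg."""
    return joint_angle(keypoints["left_hip"],
                       keypoints["left_knee"],
                       keypoints["left_ankle"])
```

Once joint points identify the first and second limbs, only these few angle computations per frame are needed, which is why the calculation amount stays small.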
  • The method further includes: the electronic device acquires input information, where the input information is used to indicate a fitness action; and the electronic device determines the first limb corresponding to the fitness action.
  • the amount of calculation can be reduced.
  • The electronic device determines to output guidance information according to the motion change range, including: the electronic device determines to output the guidance information according to the motion change range and the action start position of the second limb in the candidate action; or, the electronic device determines to output the guidance information according to the motion change range and the action end position of the second limb in the candidate action; or, the electronic device determines to output the guidance information according to the motion change range and both the action start position and the action end position of the second limb in the candidate action; the action start position and the action end position of the second limb are determined according to the movement trajectory of the first limb.
  • In this way, whether to output the guidance information is determined according to the motion change range together with one or more of the action start position and the action end position of the second limb in the candidate action; that is, user actions are judged using more position information of the second limb, which improves the accuracy of fitness action recognition, allows more accurate guidance information to be output, and improves the user experience.
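The variants above all read the second limb's start and end positions off the first limb's trajectory. A hedged sketch of one such indexing rule, assuming a squat-like action where the trajectory's lowest point marks the action end:

```python
# Illustrative sketch: the action start/end positions of the second limb
# are read at indices chosen from the first limb's trajectory. Here the
# start is the first frame and the end is the trajectory's lowest point,
# a hypothetical rule suited to a squat-like action.
def start_end_positions(first_limb_ys, second_limb_angles):
    start_idx = 0
    end_idx = min(range(len(first_limb_ys)), key=first_limb_ys.__getitem__)
    return second_limb_angles[start_idx], second_limb_angles[end_idx]
```

Anchoring the indices on the first limb's trajectory means the start/end positions need no separate detection pass over the second limb.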
  • In a second aspect, an apparatus for assisting fitness is provided, which includes: an acquisition module, configured to acquire a user action; and a determining module, configured to determine, from the user action, a candidate action in which the movement trajectory of the first limb satisfies the first preset condition. The determining module is further configured to determine the motion change range of the second limb in the candidate action, and to determine, according to the motion change range, to output guidance information.
  • the device further includes a judging module, configured to judge that the motion change range satisfies a second preset condition; the determining module is configured to determine to output guidance information.
  • the acquiring module is further configured to acquire input information; the determining module is further configured to determine the first preset condition corresponding to the input information.
  • The determining module is further configured to determine the first evaluation information corresponding to the first position information, where the first position information includes at least one of the motion change range of the second limb, the action start position of the second limb, the action end position of the second limb, and the movement trajectory of the second limb; the action start position and the action end position of the second limb are determined according to the movement trajectory of the first limb. The apparatus further includes an output module configured to output guidance information according to the first evaluation information.
  • The determining module is further configured to determine second evaluation information corresponding to the second position information of the user, where the second position information includes at least one of the motion change range of the third limb, the action start position of the third limb, the action end position of the third limb, and the movement trajectory of the third limb; the action start position and the action end position of the third limb are determined according to the movement trajectory of the first limb. The output module is further configured to output the guidance information according to the second evaluation information and the first evaluation information.
  • The apparatus further includes a recognition module for recognizing the joint points in the user action, so as to determine the first limb and the second limb in the user action.
  • The determining module is further configured to determine to output the guidance information according to the motion change range and the action start position of the second limb in the candidate action; or according to the motion change range and the action end position of the second limb in the candidate action; or according to the motion change range and both the action start position and the action end position of the second limb in the candidate action, where the action start position and the action end position of the second limb are determined according to the movement trajectory.
  • In a third aspect, a device for assisting fitness is provided, which includes a processor and a communication interface.
  • The communication interface is used to obtain a user action. The processor is used to: determine, from the user action, a candidate action in which the movement trajectory of the first limb satisfies the first preset condition; determine the motion change range of the second limb in the candidate action; and determine, according to the motion change range, to output guidance information.
  • the processor is configured to: determine that the motion change range satisfies a second preset condition; determine to output guidance information.
  • the communication interface is further used to obtain input information; the processor is further used to determine the first preset condition corresponding to the input information.
  • The processor is configured to: determine the first evaluation information corresponding to the first position information, where the first position information includes at least one of the motion change range of the second limb, the action start position of the second limb, the action end position of the second limb, and the movement trajectory of the second limb; the action start position and the action end position of the second limb are determined according to the movement trajectory of the first limb; and determine the guidance information according to the first evaluation information.
  • The processor is configured to: determine second evaluation information corresponding to the second position information of the user, where the second position information includes at least one of the motion change range of the third limb, the action start position of the third limb, the action end position of the third limb, and the movement trajectory of the third limb; the action start position and the action end position of the third limb are determined according to the movement trajectory of the first limb; and determine the guidance information according to the second evaluation information and the first evaluation information.
  • the processor is further configured to: identify key points in the user action to determine the first limb and the second limb in the user action.
  • The processor is further configured to: determine to output the guidance information according to the motion change range and the action start position of the second limb in the candidate action; or determine to output the guidance information according to the motion change range and the action end position of the second limb in the candidate action; or determine to output the guidance information according to the motion change range and both the action start position and the action end position of the second limb in the candidate action, where the action start position and the action end position of the second limb are determined according to the movement trajectory of the first limb.
  • In a fourth aspect, a computer storage medium is provided, which stores computer instructions that, when run on an electronic device, cause the electronic device to execute the method described in the first aspect.
  • In a fifth aspect, a chip system is provided, which includes at least one processor; when program instructions are executed by the at least one processor, the chip system is caused to execute the method described in the first aspect.
  • Figure 1 is a schematic diagram of the hardware structure of an electronic device.
  • Figure 2 is a schematic diagram of the software structure of an electronic device.
  • Figure 3 is a schematic flowchart of a method for assisting fitness provided by an embodiment of the present application.
  • Figure 4 is a schematic flowchart of a method for assisting fitness according to another embodiment of the present application.
  • Figure 5 is a schematic diagram of a user interface for assisting fitness provided by an embodiment of the present application.
  • Figure 6 is a schematic diagram of a user interface for assisting fitness provided by another embodiment of the present application.
  • Figure 7 is a schematic flowchart of a method for assisting fitness according to another embodiment of the present application.
  • Figure 8 is a schematic diagram of a squat exercise.
  • Figure 9 is a schematic structural diagram of an electronic device provided by an embodiment of the present application.
  • Figure 10 is a schematic structural diagram of an electronic device according to another embodiment of the present application.
  • Recognizing the user’s fitness actions through images can record the number of times the user completes the actions, evaluate the user’s completion quality according to key indicators of fitness actions, and point out wrong actions and ways to improve, which can provide users with scientific guidance.
  • The user's body joint point information can be collected, and the user's posture can be compared with the standard action posture based on the user's joint point information and the standard action's joint point information, so as to determine the difference between the user's actions and the standard actions and provide feedback and guidance for the user.
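The per-joint comparison just described can be sketched as follows; representing postures as joint-angle vectors and the 15-degree tolerance are assumptions for illustration, not the patent's choices.

```python
# Illustrative per-joint comparison of the user's posture with a standard
# posture. Joint-angle vectors and the 15-degree tolerance are assumptions.
def posture_difference(user_angles, standard_angles):
    """Mean absolute per-joint angle difference, in degrees."""
    diffs = [abs(u - s) for u, s in zip(user_angles, standard_angles)]
    return sum(diffs) / len(diffs)

def deviating_joints(user_angles, standard_angles, tolerance=15.0):
    """Indices of joints whose angle deviates beyond the tolerance."""
    return [i for i, (u, s) in enumerate(zip(user_angles, standard_angles))
            if abs(u - s) > tolerance]
```

Reporting which joints deviate, rather than only an overall score, is what enables the feedback ("point out wrong actions and ways to improve") described above.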
  • During exercise, the user may perform some actions that are not related to fitness, such as picking up objects, answering the phone, or walking. Treating these irrelevant actions as non-standard fitness actions and providing evaluation and guidance for them will result in a poor user experience.
  • To avoid this, when the similarity between the user's action and the standard action is low, the evaluation and guidance of the user's action may not be performed.
  • However, the movements of fitness beginners may be very non-standard, especially for complex movements that require the coordination of different parts of the body. The movements performed by beginners may not meet the similarity requirement and are therefore judged to be irrelevant movements, so that no effective guidance can be provided and the user experience is poor.
  • In this case, the overall similarity is still low and may be lower than the preset threshold, so that the action is judged to be an irrelevant action and does not trigger evaluation and guidance.
  • User actions can be continuous actions within a period of time.
  • When a user action is compared with a standard action, the user action at a certain time point or within a certain time period is compared with a static standard action.
  • Alternatively, dynamic time warping (DTW) can be used to compare a sequence of user actions with a standard action sequence.
  • One approach is to determine the start time and end time of the user action, use them as the start and end points of a time window, respectively, and compare the user action within the time window with the standard action.
  • However, the duration of an action differs across users and actions, and it is difficult to accurately determine the start time and end time of a user action; that is, the time window is difficult to determine. If the window is too large, the computational cost is higher and the computed similarity is low; if the window is too small, it may be impossible to identify whether the user is performing a fitness exercise. Moreover, this method compares the similarity between the posture of the user's action and the posture of the standard action, and cannot accurately identify whether the user is performing a fitness action.
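Dynamic time warping, referenced in this passage, aligns whole sequences of differing durations without a fixed time window. A textbook scalar DTW sketch follows (a real system would compare per-frame joint-angle vectors rather than scalars):

```python
# Textbook dynamic time warping over scalar per-frame features.
# Distances are absolute differences; this is illustrative only.
def dtw_distance(seq_a, seq_b):
    n, m = len(seq_a), len(seq_b)
    inf = float("inf")
    d = [[inf] * (m + 1) for _ in range(n + 1)]
    d[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(seq_a[i - 1] - seq_b[j - 1])
            # Best of: match, skip a frame in either sequence.
            d[i][j] = cost + min(d[i - 1][j], d[i][j - 1], d[i - 1][j - 1])
    return d[n][m]
```

Because DTW stretches one sequence against the other, a slower repetition of the same movement can still score as similar, which partially addresses the window-size dilemma described above.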
  • the present application provides a method for assisting fitness, which can identify whether a user performs fitness actions.
  • the method can be executed in an electronic device.
  • FIG. 1 shows a schematic structural diagram of an electronic device 100.
  • The electronic device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (USB) interface 130, a charging management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone jack 170D, a sensor module 180, buttons 190, a motor 191, an indicator 192, a camera 193, a display screen 194, a subscriber identification module (SIM) card interface 195, and so on.
  • The sensor module 180 may include a pressure sensor 180A, a gyroscope sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity light sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, and so on.
  • the structure illustrated in the embodiment of the present application does not constitute a specific limitation on the electronic device 100.
  • the electronic device 100 may include more or fewer components than shown, or combine certain components, or split certain components, or arrange different components.
  • the illustrated components can be implemented in hardware, software, or a combination of software and hardware.
  • the electronic device in the embodiment of the present application may include a processor 110, an audio module 170, a speaker 170A, a Bluetooth module communication in the wireless communication module 160, a display screen 194, a camera 193, an internal memory 121, and the like.
  • the processor 110 may include one or more processing units.
  • The processor 110 may include at least one of an application processor (AP), a modem processor, a graphics processing unit (GPU), an image signal processor (ISP), a controller, a memory, a video codec, a digital signal processor (DSP), a baseband processor, and a neural-network processing unit (NPU).
  • the different processing units may be independent devices or integrated in one or more processors.
  • the controller may be the nerve center and command center of the electronic device 100.
  • the controller can generate operation control signals according to the instruction operation code and timing signals to complete the control of fetching and executing instructions.
  • a memory may also be provided in the processor 110 to store instructions and data.
  • the memory in the processor 110 is a cache memory.
  • The memory can store instructions or data that the processor 110 has just used or uses cyclically. If the processor 110 needs to use the instructions or data again, it can call them directly from the memory, which avoids repeated accesses, reduces the waiting time of the processor 110, and improves system efficiency.
  • the processor 110 may include one or more interfaces.
  • The interface may include at least one of an inter-integrated circuit (I2C) interface, an inter-integrated circuit sound (I2S) interface, a pulse code modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a mobile industry processor interface (MIPI), a general-purpose input/output (GPIO) interface, a subscriber identity module (SIM) interface, and a universal serial bus (USB) interface.
  • the I2C interface is a bidirectional synchronous serial bus, which includes a serial data line (SDA) and a serial clock line (SCL).
  • the processor 110 may include multiple sets of I2C buses.
  • the processor 110 may couple the touch sensor 180K, the charger, the flash, the camera 193, etc., respectively through different I2C bus interfaces.
  • the processor 110 may couple the touch sensor 180K through an I2C interface, so that the processor 110 and the touch sensor 180K communicate through an I2C bus interface to implement the touch function of the electronic device 100.
  • the I2S interface can be used for audio communication.
  • the processor 110 may include multiple sets of I2S buses.
  • the processor 110 may be coupled with the audio module 170 through an I2S bus to implement communication between the processor 110 and the audio module 170.
  • the audio module 170 may transmit audio signals to the wireless communication module 160 through an I2S interface, so as to realize the function of answering calls through a Bluetooth headset.
  • the PCM interface can also be used for audio communication to sample, quantize and encode analog signals.
  • the audio module 170 and the wireless communication module 160 may be coupled through a PCM bus interface.
  • the audio module 170 may also transmit audio signals to the wireless communication module 160 through the PCM interface, so as to realize the function of answering calls through the Bluetooth headset. Both the I2S interface and the PCM interface can be used for audio communication.
  • the UART interface is a universal serial data bus used for asynchronous communication.
  • the bus can be a two-way communication bus. It converts the data to be transmitted between serial communication and parallel communication.
  • the UART interface is generally used to connect the processor 110 and the wireless communication module 160.
  • the processor 110 communicates with the Bluetooth module in the wireless communication module 160 through the UART interface to realize the Bluetooth function.
  • the audio module 170 may transmit audio signals to the wireless communication module 160 through a UART interface, so as to realize the function of playing music through a Bluetooth headset.
  • the MIPI interface can be used to connect the processor 110 with the display screen 194, the camera 193 and other peripheral devices.
  • the MIPI interface includes a camera serial interface (camera serial interface, CSI), a display serial interface (display serial interface, DSI), and so on.
  • the processor 110 and the camera 193 communicate through a CSI interface to implement the shooting function of the electronic device 100.
  • the processor 110 and the display screen 194 communicate through a DSI interface to realize the display function of the electronic device 100.
  • the GPIO interface can be configured through software.
  • the GPIO interface can be configured as a control signal or as a data signal.
  • the GPIO interface can be used to connect the processor 110 with the camera 193, the display screen 194, the wireless communication module 160, the audio module 170, the sensor module 180, and so on.
  • the GPIO interface can also be configured as an I2C interface, I2S interface, UART interface, MIPI interface, etc.
  • the USB interface 130 is an interface that complies with the USB standard specification, and specifically may be a Mini USB interface, a Micro USB interface, a USB Type C interface, and so on.
  • the USB interface 130 can be used to connect a charger to charge the electronic device 100, and can also be used to transfer data between the electronic device 100 and peripheral devices. It can also be used to connect earphones and play audio through earphones. This interface can also be used to connect to other electronic devices, such as AR devices.
  • the interface connection relationship between the modules illustrated in the embodiment of the present application is merely a schematic description, and does not constitute a structural limitation of the electronic device 100.
  • the electronic device 100 may also adopt different interface connection modes in the foregoing embodiments, or a combination of multiple interface connection modes.
  • the charging management module 140 is used to receive charging input from the charger.
  • the charger can be a wireless charger or a wired charger.
  • the charging management module 140 may receive the charging input of the wired charger through the USB interface 130.
  • the charging management module 140 may receive the wireless charging input through the wireless charging coil of the electronic device 100. While the charging management module 140 charges the battery 142, it can also supply power to the electronic device through the power management module 141.
  • the power management module 141 is used to connect the battery 142, the charging management module 140 and the processor 110.
  • the power management module 141 receives input from the battery 142 and/or the charge management module 140, and supplies power to the processor 110, the internal memory 121, the external memory, the display screen 194, the camera 193, and the wireless communication module 160.
  • the power management module 141 can also be used to monitor parameters such as battery capacity, battery cycle times, and battery health status (leakage, impedance).
  • the power management module 141 may also be provided in the processor 110.
  • the power management module 141 and the charging management module 140 may also be provided in the same device.
  • the wireless communication function of the electronic device 100 can be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, the modem processor, and the baseband processor.
  • the antenna 1 and the antenna 2 are used to transmit and receive electromagnetic wave signals.
  • Each antenna in the electronic device 100 can be used to cover a single or multiple communication frequency bands. Different antennas can also be reused to improve antenna utilization.
  • Antenna 1 can be multiplexed as a diversity antenna of a wireless local area network.
  • the antenna can be used in combination with a tuning switch.
  • the mobile communication module 150 can provide a wireless communication solution including 2G/3G/4G/5G and the like applied to the electronic device 100.
  • the mobile communication module 150 may include at least one filter, switch, power amplifier, low noise amplifier (LNA), etc.
  • the mobile communication module 150 can receive electromagnetic waves by the antenna 1, and perform processing such as filtering, amplifying and transmitting the received electromagnetic waves to the modem processor for demodulation.
  • the mobile communication module 150 can also amplify the signal modulated by the modem processor, and convert it into electromagnetic wave radiation via the antenna 1.
  • at least part of the functional modules of the mobile communication module 150 may be provided in the processor 110.
  • at least part of the functional modules of the mobile communication module 150 and at least part of the modules of the processor 110 may be provided in the same device.
  • the modem processor may include a modulator and a demodulator.
  • the modulator is used to modulate the low frequency baseband signal to be sent into a medium and high frequency signal.
  • the demodulator is used to demodulate the received electromagnetic wave signal into a low-frequency baseband signal. Then the demodulator transmits the demodulated low-frequency baseband signal to the baseband processor for processing. After the low-frequency baseband signal is processed by the baseband processor, it is passed to the application processor.
  • the application processor outputs a sound signal through an audio device (not limited to the speaker 170A, the receiver 170B, etc.), or displays an image or video through the display screen 194.
  • the modem processor may be an independent device. In other embodiments, the modem processor may be independent of the processor 110 and be provided in the same device as the mobile communication module 150 or other functional modules.
  • the wireless communication module 160 can provide wireless communication solutions applied to the electronic device 100, including wireless local area network (WLAN) (such as a wireless fidelity (Wi-Fi) network), bluetooth (BT), global navigation satellite system (GNSS), frequency modulation (FM), near field communication (NFC), infrared (IR) technology, and the like.
  • the wireless communication module 160 may be one or more devices integrating at least one communication processing module.
  • the wireless communication module 160 receives electromagnetic waves via the antenna 2, frequency modulates and filters the electromagnetic wave signals, and sends the processed signals to the processor 110.
  • the wireless communication module 160 may also receive the signal to be sent from the processor 110, perform frequency modulation, amplify it, and convert it into electromagnetic waves to radiate through the antenna 2.
  • the antenna 1 of the electronic device 100 is coupled with the mobile communication module 150, and the antenna 2 is coupled with the wireless communication module 160, so that the electronic device 100 can communicate with the network and other devices through wireless communication technology.
  • the wireless communication technology may include at least one of global system for mobile communications (GSM), general packet radio service (GPRS), code division multiple access (CDMA), wideband code division multiple access (WCDMA), time-division code division multiple access (TD-SCDMA), long term evolution (LTE), BT, GNSS, WLAN, NFC, FM, IR technology, etc.
  • the GNSS may include the global positioning system (GPS), the global navigation satellite system (GLONASS), the Beidou navigation satellite system (BDS), the quasi-zenith satellite system (QZSS), and/or satellite-based augmentation systems (SBAS).
  • the electronic device 100 implements a display function through a GPU, a display screen 194, an application processor, and the like.
  • the GPU is an image processing microprocessor, which is connected to the display screen 194 and the application processor.
  • the GPU is used to perform mathematical and geometric calculations for graphics rendering.
  • the processor 110 may include one or more GPUs, which execute program instructions to generate or change display information.
  • the display screen 194 is used to display images, videos, and the like.
  • the display screen 194 includes a display panel.
  • the display panel can adopt a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a Miniled, a MicroLed, a Micro-oLed, a quantum dot light-emitting diode (QLED), and the like.
  • the electronic device 100 may include one or N display screens 194, and N is a positive integer greater than one.
  • the electronic device 100 can realize a shooting function through an ISP, a camera 193, a video codec, a GPU, a display screen 194, and an application processor.
  • the ISP is used to process the data fed back by the camera 193. For example, when taking a picture, the shutter is opened, the light is transmitted to the photosensitive element of the camera through the lens, the light signal is converted into an electrical signal, and the photosensitive element of the camera transmits the electrical signal to the ISP for processing and is converted into an image visible to the naked eye.
  • ISP can also optimize the image noise, brightness, and skin color. ISP can also optimize the exposure, color temperature and other parameters of the shooting scene.
  • the ISP may be provided in the camera 193.
  • the camera 193 is used to capture still images or videos.
  • the object generates an optical image through the lens and is projected to the photosensitive element.
  • the photosensitive element may be a charge coupled device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor.
  • the photosensitive element converts the optical signal into an electrical signal, and then transfers the electrical signal to the ISP to convert it into a digital image signal.
  • ISP outputs digital image signals to DSP for processing.
  • DSP converts digital image signals into standard RGB, YUV and other formats of image signals.
  • the electronic device 100 may include one or N cameras 193, and N is a positive integer greater than one.
  • the camera may collect user action videos.
  • the photosensitive element converts the collected optical signal into an electrical signal, and then transfers the electrical signal to the ISP to convert it into a digital image signal.
  • the ISP outputs the digital image signal to the DSP for related image processing.
  • Digital signal processors are used to process digital signals. In addition to digital image signals, they can also process other digital signals. For example, when the electronic device 100 selects a frequency point, the digital signal processor is used to perform Fourier transform on the energy of the frequency point.
  • Video codecs are used to compress or decompress digital video.
  • the electronic device 100 may support one or more video codecs. In this way, the electronic device 100 can play or record videos in multiple encoding formats, such as: moving picture experts group (MPEG) 1, MPEG2, MPEG3, MPEG4, and so on.
  • the NPU is a neural-network (NN) computing processor.
  • with the NPU, applications such as intelligent cognition of the electronic device 100 can be realized, for example, image recognition, face recognition, voice recognition, text understanding, and so on.
  • the external memory interface 120 may be used to connect an external memory card, such as a Micro SD card, to expand the storage capacity of the electronic device 100.
  • the external memory card communicates with the processor 110 through the external memory interface 120 to realize the data storage function. For example, save music, video and other files in an external memory card.
  • the internal memory 121 may be used to store computer executable program code, where the executable program code includes instructions.
  • the processor 110 executes various functional applications and data processing of the electronic device 100 by running instructions stored in the internal memory 121.
  • the internal memory 121 may include a storage program area and a storage data area.
  • the storage program area can store an operating system, at least one application program (such as a sound playback function, an image playback function, etc.) required by at least one function.
  • the data storage area can store data (such as audio data, phone book, etc.) created during the use of the electronic device 100.
  • the internal memory 121 may include a high-speed random access memory, and may also include a non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, a universal flash storage (UFS), and the like.
  • the electronic device 100 can implement audio functions through the audio module 170, the speaker 170A, the receiver 170B, the microphone 170C, the earphone interface 170D, and the application processor. For example, music playback, recording, etc.
  • the audio module 170 is used to convert digital audio information into an analog audio signal for output, and is also used to convert an analog audio input into a digital audio signal.
  • the audio module 170 can also be used to encode and decode audio signals.
  • the audio module 170 may be provided in the processor 110, or part of the functional modules of the audio module 170 may be provided in the processor 110.
  • the speaker 170A, also called a "horn", is used to convert audio electrical signals into sound signals.
  • the electronic device 100 can listen to music through the speaker 170A, or listen to a hands-free call.
  • the receiver 170B, also called an "earpiece", is used to convert audio electrical signals into sound signals.
  • when the electronic device 100 answers a call or receives a voice message, the voice can be heard by bringing the receiver 170B close to the human ear.
  • the microphone 170C, also called a "mic" or "mike", is used to convert sound signals into electrical signals.
  • when making a sound, the user can speak with the mouth close to the microphone 170C to input the sound signal into the microphone 170C.
  • the electronic device 100 may be provided with at least one microphone 170C. In other embodiments, the electronic device 100 may be provided with two microphones 170C, which can implement noise reduction functions in addition to collecting sound signals. In other embodiments, the electronic device 100 may also be provided with three, four or more microphones 170C to collect sound signals, reduce noise, identify sound sources, and realize directional recording functions.
  • the earphone interface 170D is used to connect wired earphones.
  • the earphone interface 170D may be a USB interface 130, a 3.5mm open mobile terminal platform (OMTP) standard interface, or a cellular telecommunications industry association of the USA (CTIA) standard interface.
  • the pressure sensor 180A is used to sense the pressure signal and can convert the pressure signal into an electrical signal.
  • the pressure sensor 180A may be provided on the display screen 194.
  • the capacitive pressure sensor may include at least two parallel plates with conductive material. When a force is applied to the pressure sensor 180A, the capacitance between the electrodes changes.
  • the electronic device 100 determines the intensity of the pressure according to the change in capacitance.
  • the electronic device 100 detects the intensity of the touch operation according to the pressure sensor 180A.
  • the electronic device 100 may also calculate the touched position according to the detection signal of the pressure sensor 180A.
  • touch operations that act on the same touch position but have different intensities may correspond to different operation instructions. For example, when a touch operation whose intensity is less than the first pressure threshold is applied to the short message application icon, an instruction to view the short message is executed; when a touch operation whose intensity is greater than or equal to the first pressure threshold is applied to the same icon, an instruction to create a new short message is executed.
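As a hedged illustration of the pressure-dependent dispatch described above, the following Python sketch maps touch intensity on the short message icon to an instruction. The threshold value and instruction names are illustrative assumptions, not values from this application.

```python
# Hypothetical sketch: touches at the same position trigger different
# instructions depending on touch-operation intensity.
FIRST_PRESSURE_THRESHOLD = 0.5  # assumed, normalized pressure units

def dispatch_touch_on_sms_icon(pressure: float) -> str:
    """Return the instruction executed for a touch on the short-message icon."""
    if pressure < FIRST_PRESSURE_THRESHOLD:
        return "view_short_message"
    # intensity greater than or equal to the first pressure threshold
    return "create_new_short_message"
```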
  • the gyro sensor 180B may be used to determine the movement posture of the electronic device 100.
  • the angular velocity of the electronic device 100 around three axes (i.e., the x, y, and z axes) can be determined by the gyro sensor 180B.
  • the gyro sensor 180B can be used for image stabilization.
  • the gyro sensor 180B detects the shake angle of the electronic device 100, calculates the distance that the lens module needs to compensate according to the angle, and allows the lens to counteract the shake of the electronic device 100 through reverse movement to achieve anti-shake.
  • the gyro sensor 180B can also be used for navigation and somatosensory game scenes.
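The anti-shake computation above (detect a shake angle, derive a compensation distance, move the lens in reverse) can be sketched as follows. The lens-shift model and the focal length parameter are assumptions for illustration only; the application does not specify the compensation formula.

```python
import math

# Hypothetical sketch: compute the lens shift that counteracts a detected
# shake angle. Image displacement is modeled as focal_length * tan(angle);
# the lens moves by the same amount in the opposite direction.
def lens_compensation(shake_angle_rad: float, focal_length_mm: float) -> float:
    """Signed lens shift (mm) that cancels the detected shake."""
    return -focal_length_mm * math.tan(shake_angle_rad)
```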
  • the air pressure sensor 180C is used to measure air pressure.
  • the electronic device 100 calculates the altitude based on the air pressure value measured by the air pressure sensor 180C to assist positioning and navigation.
  • the magnetic sensor 180D includes a Hall sensor.
  • the electronic device 100 may use the magnetic sensor 180D to detect the opening and closing of the flip holster.
  • the electronic device 100 can detect the opening and closing of the flip cover according to the magnetic sensor 180D, and then set features such as automatic unlocking upon opening according to the detected opening and closing state of the holster or of the flip cover.
  • the acceleration sensor 180E can detect the magnitude of the acceleration of the electronic device 100 in various directions (generally three axes). When the electronic device 100 is stationary, the magnitude and direction of gravity can be detected. It can also be used to identify the posture of electronic devices, and apply to applications such as horizontal and vertical screen switching, pedometers and so on.
  • the distance sensor 180F is used to measure distance. The electronic device 100 can measure distance by infrared or laser. In some embodiments, such as when shooting a scene, the electronic device 100 may use the distance sensor 180F to measure the distance to achieve fast focusing.
  • the proximity light sensor 180G may include, for example, a light emitting diode (LED) and a light detector such as a photodiode.
  • the light emitting diode may be an infrared light emitting diode.
  • the electronic device 100 emits infrared light to the outside through the light emitting diode.
  • the electronic device 100 uses a photodiode to detect infrared reflected light from nearby objects. When sufficient reflected light is detected, it can be determined that there is an object near the electronic device 100. When insufficient reflected light is detected, the electronic device 100 can determine that there is no object near the electronic device 100.
  • the electronic device 100 can use the proximity light sensor 180G to detect that the user holds the electronic device 100 close to the ear to talk, so as to automatically turn off the screen to save power.
  • the proximity light sensor 180G can also be used in holster mode and pocket mode to automatically unlock and lock the screen.
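The reflected-light decision described above reduces to a simple threshold test, sketched below. The normalized threshold and function names are assumed for illustration; no specific values appear in this application.

```python
# Assumed, normalized detection threshold for reflected infrared light.
REFLECTION_THRESHOLD = 0.3

def object_nearby(reflected_light: float) -> bool:
    """True when sufficient reflected light indicates a nearby object."""
    return reflected_light >= REFLECTION_THRESHOLD

def should_turn_off_screen(reflected_light: float, in_call: bool) -> bool:
    """Screen off when the device is held close to the ear during a call."""
    return in_call and object_nearby(reflected_light)
```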
  • the ambient light sensor 180L is used to sense the brightness of the ambient light.
  • the electronic device 100 can adaptively adjust the brightness of the display screen 194 according to the perceived brightness of the ambient light.
  • the ambient light sensor 180L can also be used to automatically adjust the white balance when taking pictures.
  • the ambient light sensor 180L can also cooperate with the proximity light sensor 180G to detect whether the electronic device 100 is in the pocket to prevent accidental touch.
  • the fingerprint sensor 180H is used to collect fingerprints.
  • the electronic device 100 can use the collected fingerprint characteristics to realize fingerprint unlocking, access application locks, fingerprint photographs, fingerprint answering calls, and so on.
  • the temperature sensor 180J is used to detect temperature.
  • the electronic device 100 uses the temperature detected by the temperature sensor 180J to execute a temperature processing strategy. For example, when the temperature reported by the temperature sensor 180J exceeds a threshold value, the electronic device 100 reduces the performance of the processor located near the temperature sensor 180J, so as to reduce power consumption and implement thermal protection.
  • the electronic device 100 when the temperature is lower than another threshold, the electronic device 100 heats the battery 142 to avoid abnormal shutdown of the electronic device 100 due to low temperature.
  • the electronic device 100 boosts the output voltage of the battery 142 to avoid abnormal shutdown caused by low temperature.
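A minimal sketch of the temperature-processing strategy just described, assuming illustrative threshold values (the application does not specify any):

```python
# Assumed thresholds; the application only says "a threshold", "another
# threshold", and a still lower one, without giving numbers.
HIGH_TEMP_THRESHOLD_C = 45.0   # throttle the processor near the sensor
LOW_TEMP_THRESHOLD_C = 0.0     # heat the battery
CRITICAL_LOW_TEMP_C = -10.0    # boost the battery output voltage

def thermal_actions(temperature_c: float) -> list:
    """Return the protective actions taken at a reported temperature."""
    actions = []
    if temperature_c > HIGH_TEMP_THRESHOLD_C:
        actions.append("reduce_processor_performance")
    if temperature_c < LOW_TEMP_THRESHOLD_C:
        actions.append("heat_battery")
    if temperature_c < CRITICAL_LOW_TEMP_C:
        actions.append("boost_battery_voltage")
    return actions
```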
  • the touch sensor 180K is also called a "touch panel".
  • the touch sensor 180K may be disposed on the display screen 194, and the touch sensor 180K and the display screen 194 form a touchscreen, also called a "touch screen".
  • the touch sensor 180K is used to detect touch operations acting on or near it.
  • the touch sensor can pass the detected touch operation to the application processor to determine the type of touch event.
  • the visual output related to the touch operation can be provided through the display screen 194.
  • the touch sensor 180K may also be disposed on the surface of the electronic device 100, which is different from the position of the display screen 194.
  • the bone conduction sensor 180M can acquire vibration signals.
  • the bone conduction sensor 180M can obtain the vibration signal of the bone that vibrates when a person speaks.
  • the bone conduction sensor 180M can also contact the human pulse and receive the blood pressure pulse signal.
  • the bone conduction sensor 180M may also be provided in the earphone, combined with the bone conduction earphone.
  • the audio module 170 can parse the voice signal based on the vibration signal of the vibrating bone block of the voice obtained by the bone conduction sensor 180M, and realize the voice function.
  • the application processor can analyze the heart rate information based on the blood pressure beating signal obtained by the bone conduction sensor 180M, and realize the heart rate detection function.
  • the button 190 includes a power-on button, a volume button, and so on.
  • the button 190 may be a mechanical button. It can also be a touch button.
  • the electronic device 100 may receive key input, and generate key signal input related to user settings and function control of the electronic device 100.
  • the motor 191 can generate vibration prompts.
  • the motor 191 can be used for incoming call vibration notification, and can also be used for touch vibration feedback.
  • touch operations that act on different applications can correspond to different vibration feedback effects.
  • for touch operations acting on different areas of the display screen 194, the motor 191 can also produce different vibration feedback effects.
  • different application scenarios (for example, time reminders, receiving information, alarm clocks, games, etc.) can also correspond to different vibration feedback effects.
  • the touch vibration feedback effect can also support customization.
  • the indicator 192 may be an indicator light, which may be used to indicate the charging status, power change, or to indicate messages, missed calls, notifications, and so on.
  • the SIM card interface 195 is used to connect to the SIM card.
  • the SIM card can be inserted into the SIM card interface 195 or pulled out from the SIM card interface 195 to achieve contact and separation with the electronic device 100.
  • the electronic device 100 may support 1 or N SIM card interfaces, and N is a positive integer greater than 1.
  • the SIM card interface 195 can support Nano SIM cards, Micro SIM cards, SIM cards, etc.
  • the same SIM card interface 195 can insert multiple cards at the same time. The types of the multiple cards can be the same or different.
  • the SIM card interface 195 can also be compatible with different types of SIM cards.
  • the SIM card interface 195 may also be compatible with external memory cards.
  • the electronic device 100 interacts with the network through the SIM card to implement functions such as call and data communication.
  • the electronic device 100 adopts an eSIM, that is, an embedded SIM card.
  • the eSIM card can be embedded in the electronic device 100 and cannot be separated from the electronic device 100.
  • the software system of the electronic device 100 may adopt a layered architecture, an event-driven architecture, a microkernel architecture, a microservice architecture, or a cloud architecture.
  • the embodiment of the present application takes an Android system with a layered architecture as an example to illustrate the software structure of the electronic device 100 by way of example.
  • FIG. 2 is a block diagram of the software structure of the electronic device 100 according to an embodiment of the present application.
  • the layered architecture divides the software into several layers, and each layer has a clear role and division of labor. Communication between layers through software interface.
  • the Android system is divided into four layers, from top to bottom, the application layer, the application framework layer, the Android runtime and system library, and the kernel layer.
  • the application layer can include a series of application packages.
  • the application package can include applications such as camera, gallery, calendar, call, map, navigation, WLAN, Bluetooth, music, video, short message, etc.
  • the application framework layer provides an application programming interface (application programming interface, API) and a programming framework for applications in the application layer.
  • the application framework layer includes some predefined functions.
  • algorithms for processing images can all be included in the application framework layer.
  • the application framework layer can include a window manager, a content provider, a view system, a phone manager, a resource manager, and a notification manager.
  • the window manager is used to manage window programs.
  • the window manager can obtain the size of the display, determine whether there is a status bar, lock the screen, take a screenshot, etc.
  • the content provider is used to store and retrieve data and make these data accessible to applications.
  • the data may include videos, images, audios, phone calls made and received, browsing history and bookmarks, phone book, etc.
  • the content provider can acquire the image collected in the preview interface in real time, and display the processed image in the preview interface.
  • the view system includes visual controls, such as controls that display text, controls that display pictures, and so on.
  • the view system can be used to build applications.
  • the display interface can be composed of one or more views.
  • a display interface that includes a short message notification icon may include a view that displays text and a view that displays pictures.
  • content such as "user action video”, “standard fitness action”, and “guidance information" displayed on the display interface can be displayed by the view system receiving the instructions of the processor.
  • the phone manager is used to provide the communication function of the electronic device 100. For example, the management of the call status (including connecting, hanging up, etc.).
  • the resource manager provides various resources for the application, such as localized strings, icons, pictures, layout files, video files, and so on.
  • the notification manager enables the application to display notification information in the status bar, which can be used to convey notification-type messages, and it can disappear automatically after a short stay without user interaction.
  • the notification manager is used to notify download completion, message reminders, and so on.
  • the notification manager can also display a notification in the status bar at the top of the system in the form of a chart or scroll-bar text, such as a notification of an application running in the background, or display a notification on the screen in the form of a dialog window. For example, text information is prompted in the status bar, a prompt sound is made, the electronic device vibrates, or the indicator light flashes.
  • Android runtime includes core libraries and virtual machines. Android runtime is responsible for the scheduling and management of the Android system.
  • the core library consists of two parts: one part is the functions that the Java language needs to call, and the other part is the core library of Android.
  • the application layer and the application framework layer run in a virtual machine.
  • the virtual machine converts the java files of the application layer and the application framework layer into binary files and executes them.
  • the virtual machine is used to perform functions such as object life cycle management, stack management, thread management, security and exception management, and garbage collection.
  • the system library can include multiple functional modules. For example: surface manager (surface manager), media library (media libraries), 3D graphics processing library (for example: OpenGL ES), 2D graphics engine (for example: SGL), etc.
  • the surface manager is used to manage the display subsystem and provides a combination of 2D and 3D layers for multiple applications.
  • the media library supports playback and recording of a variety of commonly used audio and video formats, as well as still image files.
  • the media library can support a variety of audio and video encoding formats, such as: MPEG4, H.264, MP3, AAC, AMR, JPG, PNG, etc.
  • the 3D graphics processing library is used to realize 3D graphics drawing, image rendering, synthesis, and layer processing.
  • the 2D graphics engine is a drawing engine for 2D drawing.
  • the kernel layer is the layer between hardware and software.
  • the kernel layer contains at least display driver, camera driver, audio driver, and sensor driver.
  • the method for assisting fitness provided by the embodiments of this application can be applied to electronic devices such as televisions, mobile phones, tablet computers, wearable devices, in-vehicle devices, augmented reality (AR)/virtual reality (VR) devices, notebook computers, ultra-mobile personal computers (UMPC), netbooks, and personal digital assistants (PDAs); the embodiments of this application do not impose any restrictions on the specific type of the electronic device.
  • Fig. 3 is a schematic flowchart of a method for assisting fitness provided by an embodiment of the present application.
  • step S301 the electronic device acquires user actions.
  • the user action may be a user action in the real-time video collected by the camera. It can also be said that the electronic device can obtain user actions collected by the camera.
  • a video containing user actions acquired by an electronic device may be referred to as a user action video.
  • the user can select a specific fitness action from at least one fitness action in the action evaluation index set.
  • the user's action is processed according to the specific fitness action.
  • the action evaluation index set can be determined based on professional knowledge. That is, the electronic device can obtain input information, and the input information is used to indicate a fitness action.
  • step S302 the electronic device recognizes the user's skeleton node in the user's motion.
  • the electronic device can identify the user's skeleton node in each frame of image in the user's action, and can also identify the user's skeleton node in multiple frames of images with fixed or unfixed intervals in the user's action.
  • the user's bone node can be used to represent the user's limbs.
  • the user bone nodes may include all bone nodes on the user's body in the image.
  • the user skeleton node may include a first key node corresponding to the first standard node of a specific fitness action in the action evaluation index set, a second key node corresponding to the second standard node, and a third key node corresponding to the third standard node.
  • the first key node, the second key node, the third key node, and the fourth key node are all bone nodes on the user's body in the image.
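The per-frame versus interval-sampled identification in step S302 can be sketched as below. The pose-estimation step itself is left abstract, since the application does not name a model; only the frame-selection logic is shown.

```python
def frames_to_process(num_frames: int, interval: int = 1) -> list:
    """Indices of video frames in which skeleton nodes are identified.

    interval == 1 identifies nodes in every frame; a larger (fixed)
    interval identifies them only in sampled frames, as the text allows.
    """
    if interval < 1:
        raise ValueError("interval must be >= 1")
    return list(range(0, num_frames, interval))
```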
  • step S303 the electronic device matches the trajectory of the first key node.
  • the trajectory of the first key node in the user's skeleton node is matched.
  • the electronic device may also determine at least one of the horizontal direction and the vertical direction of the user's action.
  • the position of the user's limbs, the change of the position, the direction of the movement trajectory, etc. can be determined.
  • the gyroscope sensor and other devices in the electronic equipment can be used to determine the horizontal and vertical directions of the user's action during the process of collecting the user's action.
  • the horizontal direction and the vertical direction of the user's action can be determined according to at least one of the default horizontal direction and vertical direction.
  • the expert can determine, through professional knowledge, at least one first standard node and the motion trajectory of each of the at least one first standard node in the fitness action, and save the motion trajectory of the first standard node in the action evaluation index set.
  • the action evaluation index set includes the corresponding relationship between at least one fitness action and the motion track of at least one first standard node.
  • the movement trajectory of the first standard node is the movement trajectory of the bone node of the human body in the movement of the specific fitness movement standard.
  • the bone nodes of the human body include the first standard node.
  • the first standard node can be understood as a node of the user's limbs.
  • the first key node is a bone node corresponding to the first standard node in the user action, or in other words, the first key node is a bone node in the same part as the first standard node in the user action.
  • the electronic device may compare the movement trajectory of the first key node with the movement trajectory of the first standard node. If the difference between the two trajectories is greater than a preset value, the matching fails: it is considered that the user has not performed a fitness action, and step S301 is performed to obtain the user action again. If the difference is less than or equal to the preset value, the matching is successful.
  • the electronic device can determine whether the movement trajectory of the first key node meets the preset characteristic, that is, whether the movement trajectory of the first key node has the preset characteristic.
  • the preset feature can also be understood as the preset condition.
  • the electronic device may determine, from the user's actions, an alternative action that meets the preset motion trajectory feature of the limb corresponding to the first standard node.
  • the preset feature may be the shape of the motion track, the periodicity of the motion track, etc., for example, may be a periodic position change law.
  • the preset feature can also be understood as the periodic position-change law of the first standard node.
  • the electronic device can determine an alternative action. Different fitness actions can correspond to the same or different alternative action selection rules.
  • the electronic device may take the video images corresponding to one period of the movement track of the first key node as the candidate action; that is, the user action within that period is determined as the candidate action.
  • the electronic device may also use a video image in the user action corresponding to the first key node at a certain point or a certain range in the movement track of the first key node as an alternative action.
  • the action evaluation index set may include a preset feature, and the preset feature is used to indicate the manner in which the position of the first standard node changes.
  • the way the position of the first standard node changes may be its movement direction, for example upward or downward movement, or the whole, or only the upward part, of an up-and-down reciprocating movement.
  • the way the position of the first standard node changes can also be the shape of the motion trajectory, for example triangular, circular, arc-shaped, or a polyline, or a part of the complete shape traced by the first standard node, such as the case where the trajectory of the first standard node is a triangle and the node is moving along a particular side of it.
  • the determined candidate action can be, for example, the segment of the user action corresponding to the upward movement of the first key node during an up-and-down reciprocating movement, or, where the movement trajectory of the first key node is a triangle, the segment corresponding to the first key node moving along a particular side of the triangle.
  • step S304 is performed.
  • in steps S304-S307, the electronic device processes the candidate actions.
  • the first standard node may be the hip node.
  • the electronic device can recognize the user's hip node in each frame of image as the first key node in the user's action.
  • the height of the hip nodes presents periodic fluctuations.
  • the electronic device can match the movement trajectory of the first key node against that of the first standard node, determine the start and end points of a cycle of the hip node's up-and-down movement, and take the video between those start and end points as a candidate action, or take the video within one cycle in which the hip node moves upward (or downward) as a candidate action. That is, the preset motion trajectory feature may be one full cycle of up-and-down movement of the hip node, or the upward movement of the hip node within such a cycle.
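The segmentation of candidate actions from the hip node's periodic up-and-down movement can be sketched as follows. This is an illustrative assumption rather than the patent's implementation: `heights` is the hip node's height per video frame, consecutive local maxima (the standing position) are treated as cycle start and end points, and the `min_amplitude` jitter filter is invented for the example.

```python
def find_cycle_boundaries(heights, min_amplitude=0.1):
    # Local maxima of the hip height mark the standing position; each pair of
    # consecutive maxima brackets one down-and-up cycle (one candidate action).
    peaks = [i for i in range(1, len(heights) - 1)
             if heights[i] >= heights[i - 1] and heights[i] > heights[i + 1]]
    cycles = []
    for start, end in zip(peaks, peaks[1:]):
        lowest = min(heights[start:end + 1])
        # Keep only cycles whose amplitude looks like a real squat, not jitter.
        if heights[start] - lowest >= min_amplitude:
            cycles.append((start, end))
    return cycles
```

The frame ranges returned here would index into the video to cut out the candidate-action clips.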
  • the trigger condition is that the movement trajectory of the first key node matches the movement trajectory of the first standard node successfully.
  • step S304 is performed.
  • in step S304, the electronic device determines that the user performs a fitness action.
  • the electronic device can determine, through the recognition condition, whether the user action is a specific fitness action.
  • the recognition condition is the preset condition.
  • The determined candidate actions are processed to decide whether they meet the recognition conditions of the specific action in the action evaluation index set. When a candidate action meets the recognition condition, it is determined that the candidate action is the specific action performed by the user. When a candidate action does not meet the recognition condition, it is determined that the user has not performed the specific action.
  • the identification condition includes a condition satisfied by the second key node corresponding to the second standard node.
  • the second standard node may include all or part of the first standard node, and the second standard node may also include other bone nodes.
  • the identification condition may include position change information of the second standard node.
  • the position change information of the second standard node may include a position change range of the second standard node in the video.
  • the position change information of the second standard node may also be the range of the relative position change between the second standard nodes.
  • the position change information of the second standard node can be used to indicate the range of movement changes of the limbs.
  • the movement change range of the limbs may include the change range of the limb angle change, and may also include the change range of the relative position between the limbs.
  • the change range of the limb angle change can be understood as the interval between the maximum and minimum values of the angle between the limb and the horizontal or vertical direction in the user's actions.
  • the change range of the relative position between the limbs can be understood as the interval between the maximum and minimum distances between the limbs.
  • the distance between two limbs can be determined according to the length of the limbs, for example expressed as a multiple or ratio of the length of one of the limbs.
  • the identification conditions can be determined based on professional knowledge.
  • the second key node is a bone node corresponding to the second standard node in the user action, or in other words, the second key node is a bone node in the same part as the second standard node in the user action.
  • the recognition condition of the squat movement may include that the change of the thigh angle satisfies the first preset range.
  • the second standard node may include the hip node and the nodes at both ends of the thigh (that is, the knee node and the hip node), and the thigh angle can be determined from the nodes at the two ends of the thigh in a candidate action. That is to say, when the change of the thigh angle in the candidate action satisfies the first preset range, the user is considered to be performing a squat exercise.
  • as shown in FIG. 8, it is determined that the user is performing a squat exercise.
  • in Fig. 8(a) the user stands, and in Fig. 8(b) the user squats to the lowest position.
  • the user's hip node A moves up and down, and the angle between the thigh and the horizontal direction changes, that is, the thigh angle changes. The movement of the thighs is reflected by the change in the thigh angle.
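The thigh angle discussed above, i.e. the angle between the hip-knee segment and the horizontal direction, can be computed directly from two bone-node coordinates. The following is a minimal sketch; the function names and the 2D coordinate convention are assumptions, not part of the patent.

```python
import math

def limb_angle_deg(joint_a, joint_b):
    """Angle between the segment joint_a -> joint_b and the horizontal, in [0, 90] degrees."""
    dx = joint_b[0] - joint_a[0]
    dy = joint_b[1] - joint_a[1]
    angle = abs(math.degrees(math.atan2(dy, dx)))
    # Fold angles above 90 back, so only the inclination to the horizontal remains.
    return 180.0 - angle if angle > 90.0 else angle

def thigh_angle(hip, knee):
    # The thigh is represented by its two end nodes, the hip node and the knee node.
    return limb_angle_deg(hip, knee)
```

Standing gives a roughly vertical thigh (near 90 degrees); a deep squat gives a roughly horizontal thigh (near 0 degrees).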
  • the preset feature in the action evaluation index set may be the downward movement of the hip node A during the up and down movement. According to the preset feature, alternative actions among the user actions can be determined.
  • the recognition conditions in the action evaluation index set may include the range of the amplitude of the thigh angle.
  • the electronic device can determine whether the amplitude of the thigh angle in the candidate action meets the recognition condition to determine whether to output the guidance information.
  • the action evaluation index set may include the position of the motion trajectory corresponding to the maximum value of the thigh angle, and the position of the motion trajectory corresponding to the minimum value of the thigh angle.
  • the action evaluation index set may include the position of the hip node in the motion trajectory corresponding to the starting position of the thigh motion, or the position of the hip node in the motion trajectory corresponding to the end position of the thigh motion.
  • the electronic device can determine the thigh angle. During the user's movement, the electronic device determines whether the movement change of the thigh meets the maximum and minimum requirements of the preset movement change range. That is, the electronic device determines the range of motion change of the thigh, the maximum value of the range meets the requirement of the maximum value of the preset motion change range, and the minimum value of the range meets the requirement of the minimum value of the preset motion change range.
  • the maximum value requirement may be an interval, that is, the electronic device determines that the maximum value of the range of motion change of the thigh is within the interval of the maximum value of the preset motion change range.
  • the minimum requirement may also be an interval, and the electronic device determines that the minimum value of the range of motion change of the thigh is within the interval of the preset minimum value of the motion change range.
  • dotted lines indicate the range of the direction of thigh motion corresponding to the minimum range of the preset motion change range, and the range of the direction of thigh motion corresponding to the maximum range of the preset motion change range.
  • the electronic device can determine that the user performs a squat movement.
  • when the maximum value of the thigh's movement change is within the maximum-value interval of the preset movement change range, and the minimum value is within the minimum-value interval, the movement change of the user's thigh meets the maximum and minimum requirements of the preset movement change range. If there are no other recognition conditions, it can be determined that the user in the picture is performing a squat action.
  • the recognition condition of the squat movement may also include the variation range of the relative distance between the hip node and the ankle node.
  • the second standard node may include a hip node, a knee node, and an ankle node.
  • according to the distance between the hip node and the knee node and the distance between the knee node and the ankle node, the variation range of the relative distance between the hip node and the ankle node is determined.
  • the relative distance between the hip node and the ankle node can be calculated by dividing the distance between the hip node and the ankle node by the leg length, that is, the sum of the hip-knee distance and the knee-ankle distance.
  • this ratio of the node distance to the leg length expresses the relative distance between the hip node and the ankle node.
  • the interval between the minimum and the maximum of this ratio over the user's action is the variation range of the relative distance between the hip node and the ankle node.
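The normalisation just described, dividing the hip-ankle distance by the leg length so that the measure does not depend on the user's height or distance from the camera, can be sketched as follows (function names are illustrative):

```python
import math

def dist(p, q):
    return math.hypot(p[0] - q[0], p[1] - q[1])

def relative_hip_ankle_distance(hip, knee, ankle):
    # Normalise by the leg length (thigh + calf), so a fully extended leg
    # yields 1.0 regardless of the user's size or distance from the camera.
    leg_length = dist(hip, knee) + dist(knee, ankle)
    return dist(hip, ankle) / leg_length

def hip_ankle_range(frames):
    # frames: a sequence of (hip, knee, ankle) node tuples over one candidate action.
    ratios = [relative_hip_ankle_distance(h, k, a) for h, k, a in frames]
    return min(ratios), max(ratios)
```

The returned (min, max) pair is the variation range that the recognition condition would compare against its preset interval.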
  • if the result of the judgment is that the user performs a fitness action, proceed to step S305; otherwise, return to step S301 to reacquire the user action.
  • the guidance system for each action can include core indicators and secondary indicators.
  • the core indicators are used to determine the basic scoring and guidance of user actions, and the secondary indicators are used to modify the scoring and provide comprehensive guidance.
  • in step S305, the electronic device performs scoring and evaluation according to the core indicators.
  • the action evaluation index set includes core indexes.
  • Core indicators can include evaluation criteria for the position information of certain limbs. Limbs are represented by bone nodes. That is, the core indicators may include evaluation criteria for the third standard node. According to the core indicators, the score and evaluation corresponding to the third standard node in the user's action are obtained.
  • the position information of the limbs may include one or more of the movement range of the limbs, the limit value of the movement change range of the limbs, and the movement trajectory of the limbs.
  • the limit value of the movement change range of the limb is the maximum or minimum value of the movement change range of the limb.
  • the core index may include the corresponding relationship between the movement range of the limb and the evaluation information, the corresponding relationship between the limit value of the movement change range of the limb and the evaluation information, and the correspondence between the movement trajectory of the limb and the evaluation information.
  • Scoring and evaluation are performed according to the core indicators: the relative positions and position changes of the third key nodes corresponding to the third standard nodes in the candidate action can be scored and evaluated.
  • the third key node is the bone node corresponding to the third standard node in the user action, or in other words, the third key node is the bone node in the same part of the user action as the third standard node.
  • the core indicator may include correspondences between the location information of various third standard nodes and various scores and/or evaluations.
  • the core indicator may also include correspondences between the position information of various third standard nodes and various score increases or decreases, and/or correspondences between the position information of various third standard nodes and various evaluations. According to the position information of the third standard node, the score can be increased or decreased from a preset base score to determine the final score. An evaluation can also be understood as guidance or a suggestion.
  • the position information of the third standard node may include the relative position between the third standard nodes and/or the position change of each third standard node.
  • the third standard node may include all or part of the first standard node, the third standard node may include all or part of the second standard node, and the third standard node may also include other bone nodes.
  • the third standard node may be a part of the first standard node and the second standard node.
  • the electronic device may determine one or more frames of images in the candidate actions or user actions according to the movement trajectory of the first key node.
  • the electronic device can determine one or more frames of images in the alternative actions or user actions according to the movement trajectory of the third key node.
  • the electronic device may obtain the corresponding score or evaluation according to the relative position between the third key nodes in the one or more frames of images.
  • the electronic device can determine the selection method of the one or more frames according to the fitness action.
  • the relative position between the third key nodes can reflect the position of the user's limbs, such as the angle of the limbs, the distance between the limbs, and so on.
  • the electronic device may obtain a score and/or evaluation corresponding to the motion trajectory of the third key node according to the motion trajectory of the third key node.
  • the electronic device may determine the change range of the position of the third key node in an alternative action or a user action.
  • the score and/or evaluation corresponding to the change range or the limit value of the change range of the position of the third key node can be obtained.
  • the core indicators may include the correspondence between the minimum value of the thigh angle range and the score, and the correspondence between the thigh angle range and the evaluation.
  • when the minimum value of the angle between the thigh and the ground is less than 75 degrees, the user is considered to have completed a squat action, and when the angle reaches 0 degrees, the indicator is considered to have reached the best degree of completion.
  • the range in which the minimum angle between the thigh and the ground (that is, the horizontal direction) is less than 75 degrees can be divided into several intervals, for example four: less than 0 degrees, 0-25 degrees (excluding 25 degrees), 25-50 degrees (excluding 50 degrees), and 50-75 degrees (excluding 75 degrees). Each interval corresponds to a different score, and each interval corresponds to the same or a different evaluation.
  • when the thigh-ground angle determined according to the third key node takes a certain value, the interval to which the value belongs is determined, and the corresponding score and evaluation are determined so as to give the user guidance. For example, if the minimum thigh-ground angle determined according to the third key node is 32 degrees, the interval to which it belongs is 25-50 degrees, the score corresponding to this interval is 80 points, and the evaluation corresponding to this interval is "squat deeper".
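The interval lookup described above can be sketched as a table-driven function. Only the 32-degrees example (interval 25-50, score 80, "squat deeper") comes from the text; the other thresholds, scores, and evaluation strings are invented placeholders for what the action evaluation index set would store.

```python
def squat_core_score(min_thigh_angle_deg):
    # Intervals are (lower bound inclusive, upper bound exclusive).
    # Real thresholds and texts would come from the action evaluation index set.
    intervals = [
        (float("-inf"), 0, 100, "Excellent depth"),
        (0, 25, 90, "Very good depth"),
        (25, 50, 80, "Squat deeper"),
        (50, 75, 70, "Squat much deeper"),
    ]
    for lower, upper, score, evaluation in intervals:
        if lower <= min_thigh_angle_deg < upper:
            return score, evaluation
    # An angle of 75 degrees or more does not meet the minimum completion index.
    return None, "Not recognised as a completed squat"
```

The same pattern applies to any core indicator whose value is mapped to score/evaluation intervals.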
  • in step S306, the electronic device performs scoring and evaluation according to the secondary indicators.
  • the action evaluation index set includes secondary indexes.
  • the secondary index may include the evaluation standard of the fourth standard node. According to the secondary indicators, the electronic device can score and evaluate the relative positions and position changes of the fourth key nodes in the candidate action.
  • the fourth key node is a bone node corresponding to the fourth standard node in the user action, or in other words, the fourth key node is a bone node in the same part of the user action as the fourth standard node.
  • the secondary index may include the corresponding relationship between the location information of the fourth standard node and the score reduction and/or evaluation. It can also be understood that the secondary index includes the correspondence between the position information of the limb corresponding to the fourth standard node and the score reduction and/or evaluation.
  • the position information of the fourth standard node may include the relative position between the fourth standard nodes and/or the position change of each fourth standard node.
  • the fourth standard node may include all or part of the third standard nodes, and may also include other bone nodes other than the third standard nodes.
  • the position change situation may be, for example, the magnitude, range, and movement trajectory of the position change.
  • the secondary indicators may include the correspondence between the calf angle range and the score, and the correspondence between the calf angle range and the evaluation.
  • the fourth standard node may include a knee node and an ankle node. According to the relative position between the knee node and the ankle node, the electronic device can determine the calf angle.
  • Secondary indicators can also include the correspondence between the trunk angle range and the score, the correspondence between the trunk angle range and the evaluation, and so on. For example, a secondary indicator may require that the angle between the calf and the ground be greater than 50 degrees.
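Applying secondary indicators as score corrections on top of the core score can be sketched as follows. Only the "calf angle greater than 50 degrees" condition comes from the text; the trunk-angle condition, the deduction amounts, and the evaluation strings are assumptions standing in for entries of the action evaluation index set.

```python
def apply_secondary_indicators(core_score, calf_angle_deg, trunk_angle_deg):
    # Each violated secondary indicator deducts points and adds an evaluation.
    score, evaluations = core_score, []
    if calf_angle_deg <= 50:          # calf should stay > 50 degrees to the ground
        score -= 10
        evaluations.append("Keep your knees from travelling too far forward")
    if trunk_angle_deg < 45:          # illustrative trunk-angle condition
        score -= 5
        evaluations.append("Keep your torso more upright")
    return max(score, 0), evaluations
```

A fully compliant action keeps the core score unchanged and produces no extra evaluations.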
  • for different fitness actions, the evaluation indicators can be completely different, and the influence of each limb movement also differs between fitness actions (for example, arm movements are usually irrelevant in squats, but in dumbbell curls the arm movement is the core and the lower-limb movement is unimportant). If the movements of every limb were judged, evaluated, and guided, too much useless information would be sent to the user, which would harm the user experience.
  • through steps S305-S306, the evaluation corresponding to a specific fitness exercise can be determined, and effective guidance can be provided to the user.
  • in step S307, the electronic device feeds back the score and evaluation.
  • the electronic device may output feedback information, which includes the score determined in step S306 and the evaluations determined in steps S305-S306, so as to feed the score and evaluation back to the user. For example, the score and evaluation can be fed back by popping up text on the screen, playing prompt voices, and so on.
  • the electronic device can play the score and/or evaluation by voice through the Bluetooth headset, or display the score and/or evaluation on the screen.
  • the score and/or evaluation can also be played aloud through the speaker.
  • the electronic device can display the score and/or evaluation on the screen.
  • the electronic device may perform image recognition to determine whether the user performing the fitness action wears a Bluetooth headset.
  • when the Bluetooth headset is connected to the auxiliary fitness device through Bluetooth, the score and/or evaluation are played through the Bluetooth headset.
  • the electronic device may also obtain feedback mode indication information, where the feedback mode indication information is used to indicate a manner of feedback, that is, a manner of outputting a score and/or an evaluation.
  • the electronic device can count the user's fitness actions according to the movement trajectory of the first key node, and record the number of fitness actions completed by the user.
  • the electronic device may store the corresponding relationship between the number of cycles of the first standard node and the number of fitness actions in the exercise evaluation index set.
  • one cycle of the movement trajectory of the first key node corresponds to the completion of a fitness action by the user.
  • multiple cycles of the movement track of the first key node correspond to the user completing a fitness action, or one cycle of the movement track of the first key node corresponds to the user completing a fitness action.
  • the electronic device can also count the user's fitness actions according to the movement trajectory of the second key node or the third key node.
  • the electronic device can determine multiple alternative actions. According to the number of candidate actions, the electronic device can determine the number of times the user completes the fitness action. The number of completed fitness actions can be fed back to the user.
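The repetition counting described above can be sketched as follows. Per the text, the action evaluation index set may map one or several trajectory cycles of the first key node to one completed fitness action; the function below assumes one boolean recognition result per detected cycle, which is an illustrative interface.

```python
def count_fitness_actions(recognised_flags, cycles_per_action=1):
    """Count completed repetitions from per-cycle recognition results.

    recognised_flags: one boolean per detected trajectory cycle of the first
    key node, True when that cycle satisfied the recognition condition.
    cycles_per_action: from the action evaluation index set; some actions need
    several trajectory cycles to count as one repetition.
    """
    return sum(recognised_flags) // cycles_per_action
```

The resulting count is what the device would feed back to the user alongside the score and evaluation.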
  • as long as the recognition condition is met, it is determined that the user is performing the specific fitness action.
  • even if other parts of the action are very non-standard, it is considered that the user is trying to perform the fitness action, just not in a standard way.
  • the embodiments of the present application can accurately identify whether the user action is a fitness action. Even if the user's actions are not standard, the method provided in the embodiments of the present application can determine that the user performs fitness actions.
  • the starting and ending time of the user's fitness action can be determined, so that the user's action within the starting and ending time can be judged.
  • Determining whether the user performs a fitness action is separated from scoring the user's action: whether the user performs the fitness action is determined based on the minimum completion index of the specific fitness action, that is, the recognition condition.
  • the actions that do not meet the recognition condition are filtered out, that is, the user's non-fitness actions are not scored and evaluated, and only the actions that meet the recognition condition are scored and evaluated, and guidance is provided to the user for the action.
  • the electronic device may determine whether the first key node includes all the first standard nodes and whether the second key node includes all the second standard nodes.
  • step S303 is performed.
  • step S301 is performed to obtain a user action again.
  • the electronic device may output first prompt information, which is used to prompt the user to adjust the range of the user's body collected by the user action.
  • the first prompt information may be used to remind the user to adjust the relative position with the camera.
  • the first prompt information may also include information about the first standard nodes not included in the first key nodes and/or the second standard nodes not included in the second key nodes; or, the first prompt information may include information about all of the first standard nodes and the second standard nodes, to remind the user to adjust the relative position with the camera so that the user action includes all of the first standard nodes and the second standard nodes, and those nodes do not exceed the collection range of the user action.
  • the electronic device may determine whether there is a third standard node in the candidate action, that is, whether the third key node includes all third standard nodes.
  • if not all of the third standard nodes are present, step S301 may be performed to obtain the user action again; if all of them are present in the candidate action, step S305 is performed. If the third standard nodes are all or part of the first standard nodes and the second standard nodes, then once it is determined that the first key nodes include all the first standard nodes and the second key nodes include all the second standard nodes, the electronic device can conclude that all of the third standard nodes exist in the candidate action.
  • if none of the third standard nodes exists in the candidate action, the electronic device may determine that the score of the candidate action is the lowest score. If all or part of the third standard nodes exist, step S305 can be performed.
  • for a third standard node that does not exist in the candidate action, its influence can be considered when determining the score, that is, the score corresponding to the node can be deducted.
  • the evaluation corresponding to the third standard node that does not exist can no longer be determined.
  • the second prompt information can be output, and the second prompt information is used to prompt the user to adjust the range of the user's body collected by the user action.
  • the second prompt information is used to remind the user to adjust the relative position with the camera.
  • the second prompt information may include information about the third standard nodes not included in the third key nodes, to remind the user to adjust the relative position with the camera so that the user action includes all of the third standard nodes (the third key nodes correspond one-to-one with the third standard nodes) and the body parts corresponding to the third standard nodes do not exceed the spatial collection range of the user action.
  • the electronic device may determine whether the fourth standard nodes are present in the candidate action. If none of them is present, it can be determined that the score of the candidate action is the lowest score. If all or part of them are present, step S306 can be performed. Where some of the fourth standard nodes are absent, the total score can be appropriately reduced: the electronic device can reduce the score calculated from the core indicators by the score-reduction amount, recorded in the action evaluation index set, corresponding to the absent nodes.
  • the electronic device may output third prompt information, which is used to prompt the user to adjust the user's body range collected by the user action.
  • the third prompt information is used to remind the user to adjust the relative position with the camera.
  • the third prompt information may include information about the fourth standard nodes not included in the fourth key nodes, to remind the user to adjust the relative position with the camera so that the user action includes all of the fourth standard nodes and the fourth standard nodes do not exceed the collection range of the user action.
  • the electronic device can consider the influence of the node when determining the score, that is, deduct the score corresponding to the node. The electronic device may no longer determine the evaluation corresponding to the non-existent fourth standard node.
  • the electronic device may output fourth prompt information, which is used to remind the user that the feedback information is incomplete and that not all standard nodes have been evaluated.
  • the standard nodes that have not been evaluated may be third standard nodes or fourth standard nodes. That is to say, when the electronic device can recognize that the user is performing a fitness action but determines that some of the fourth standard nodes are missing from the fourth key nodes, it may remind the user.
  • a fault-tolerant mechanism is provided when part of the user's body is not within the user's action.
  • when part of the user's body is outside the picture, recognition of the fitness action may otherwise be impossible; for example, when the user is close to the camera, the ankles fall outside the picture and are not visible.
  • in this case, the method provided by the embodiments of this application can still count, score, and evaluate the user's fitness actions, avoiding the recognition and evaluation errors caused by some of the user's bone nodes falling outside the scope of image recognition.
  • For each fitness action, a minimum joint point set that can recognize the action is established, that is, a point set composed of the first key nodes and the second key nodes.
  • a basic guide point set is also established, that is, a point set composed of the third standard nodes.
  • the basic guide point set may be the same as or different from the minimum joint point set.
  • the basic guide point set and the minimum joint point set can be composed of hip nodes and knee nodes.
  • the electronic device can provide the user with basic scoring and guidance.
  • the electronic device can output reminder information to remind the user to adjust the collection range of the user action, so that the collected user action includes the basic guide point set and / Or all nodes in the smallest joint point set.
  • an extended guidance point set is also established, that is, a point set composed of the fourth standard nodes.
  • when not all of the fourth standard nodes are located within the user action, it cannot be determined whether the user action meets the secondary indicators; the evaluations corresponding to the secondary indicators are then no longer output, and the score can be appropriately reduced.
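The fault-tolerance logic built on these point sets can be sketched as a visibility check that decides how much feedback the device can still give. The return labels and set names are illustrative; the tiered behaviour (prompt, recognise only, core score only, full feedback) follows the description above.

```python
def evaluate_visibility(detected_nodes, minimal_set, basic_set, extended_set):
    """Decide what feedback is possible given which bone nodes are on screen.

    minimal_set: minimum joint point set needed to recognise the action.
    basic_set: basic guide point set needed for core scoring and guidance.
    extended_set: extended guidance point set needed for secondary indicators.
    """
    detected = set(detected_nodes)
    if not minimal_set <= detected:
        return "prompt_adjust"          # cannot recognise the action at all
    if not basic_set <= detected:
        return "recognise_only"         # count reps, but no basic scoring
    if not extended_set <= detected:
        return "score_core_only"        # core score, skip secondary indicators
    return "full_feedback"
```

For a squat, for example, the minimal and basic sets could both be the hip and knee nodes, with the ankle node only in the extended set, so a hidden ankle still allows core scoring.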
  • through steps S301-S307, it is possible to accurately determine whether the user performs a fitness action, record the number of times the user completes the action, evaluate the quality of the completed actions, identify incorrect partial movements, and give the user feedback and guidance.
  • the information related to the position of the user's limbs can indicate either the position in two-dimensional space, that is, the position in the user action, or the position in three-dimensional space, that is, the position in three-dimensional space determined according to the user action; this is not limited in the embodiments of the present application.
  • Fig. 4 is a method for assisting fitness provided by an embodiment of the present application.
  • the method can be executed by an auxiliary fitness device, which is an electronic device.
  • the auxiliary fitness device includes a camera, a processor, a memory, a display/speaker/Bluetooth communication module, etc.
  • the processor includes a CPU, and may also include a GPU, an NPU, and so on.
  • step S201 the camera obtains a user action video.
  • step S202 the CPU/GPU/NPU runs a skeletal node identification algorithm to identify the user's skeletal node in the user's action video.
  • the memory stores the action evaluation index set.
  • the memory may be ROM, for example.
  • step S204 the CPU judges the fitness action and performs scoring and evaluation.
  • the CPU judges whether the user is performing a fitness action according to the action evaluation index set stored in the memory and the identified skeletal node, and when determining that the user is performing the fitness action, it scores and evaluates the user's action.
  • step S205 the display/speaker/Bluetooth headset, etc., output feedback information.
  • the Bluetooth communication module can send feedback information to the Bluetooth headset, and the Bluetooth headset can output the feedback information.
  • the feedback information may include ratings and evaluations of the user's actions.
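As a rough sketch, the S201-S205 pipeline described above might be organized as below; the callable interfaces (`identify`, `evaluate`, `emit`) are assumptions for illustration, not the patent's actual interfaces.

```python
def assist_fitness_loop(frames, identify, evaluate, emit):
    """S201-S205: for each frame of the user action video, recognize the
    skeletal nodes, judge/score the action against the stored action
    evaluation index set, and output feedback information."""
    for frame in frames:          # S201: camera supplies the user action video
        nodes = identify(frame)   # S202: skeletal node recognition (CPU/GPU/NPU)
        result = evaluate(nodes)  # S204: judge the fitness action, score, evaluate
        if result is not None:
            emit(result)          # S205: display/speaker/Bluetooth headset output
```

Here `evaluate` returns `None` when no fitness action is recognized, so feedback is only emitted for recognized actions.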
  • the graphical user interface can be seen in Figure 5.
  • Fig. 5 is a schematic diagram of a graphical user interface (GUI) provided by an embodiment of the present application.
  • The auxiliary fitness device is described taking a mobile phone as an example.
  • Figure 5(a) shows that, with the mobile phone unlocked, the screen display system of the mobile phone displays the currently output interface content 501, which is the main interface of the mobile phone.
  • The interface content 501 displays a variety of third-party applications (application, App), such as Alipay, task card store, Weibo, photo album, WeChat, card package, settings, fitness, etc. It should be understood that the interface content 501 may also include other applications, which is not limited in this application.
  • the fitness application interface 503 may include various fitness actions.
  • When the mobile phone detects that the user clicks on a fitness action on the fitness application interface 503, it can display the guidance interface of the fitness action as shown in Figure 5(c).
  • The guidance interface may include standard fitness actions 504, user actions collected by the camera in real time, that is, user action videos 505, guidance information 506 such as evaluations, user action counts 507, and so on.
  • Fig. 6 is a schematic diagram of a graphical user interface provided by an embodiment of the present application.
  • the television can display the guidance interface of the fitness action as shown in FIG. 6.
  • FIGS. 5 and 6 are only exemplary illustrations; other electronic devices with display functions, such as tablet computers and personal computer displays, can also display the guidance interface for fitness actions shown in Figure 5(c) and Figure 6.
  • FIG. 7 is a schematic flowchart of a method for assisting fitness provided by an embodiment of the present application.
  • step S601 the electronic device acquires a user action.
  • the electronic device can acquire real-time images of user actions.
  • the electronic device can obtain the user's action video.
  • In the user action video, the user is performing a user action.
  • the user action video may be a video formed by an image collected by an electronic device in real time.
  • the electronic device may recognize the joint points in the user motion to determine the first limb and the second limb in the user motion.
  • the electronic device can identify the user's skeleton node in the user's action.
  • the bone nodes can also be called joint points.
  • The user's bone nodes can represent the user's limbs. Through changes in the positions of the user's bone nodes, the electronic device can determine the position and movement of the user's limbs. Alternatively, other methods can be used to identify the user's limbs. Limbs can also be understood as body parts.
  • the electronic device can recognize the first limb and the second limb according to the joint points.
  • step S602 the electronic device determines the movement track of the first limb in the user action.
  • the first limb can be a certain part or parts of the body.
  • The motion trajectory of the first limb may refer to the spatial characteristics of the action, composed of the route that the first limb traverses from the start position to the end position.
  • the motion trajectory may include the direction of the motion trajectory, the shape of the motion trajectory, and the like.
  • the movement track direction of the first limb may be the movement direction formed by the first limb when the user moves.
  • the shape of the motion trajectory can be a straight line, a curve, or a combination of the two.
  • the electronic device may determine an alternative action from the user's actions according to the movement trajectory of the first limb.
  • the electronic device determines whether the motion trajectory of the first limb meets the first preset condition, and uses a segment of user action corresponding to the motion trajectory that meets the first preset condition as an alternative action.
  • the candidate action is a user action in which the movement trajectory of the first limb meets the first preset condition.
  • the first preset condition can also be understood as a preset position change feature of the first limb.
  • The electronic device may determine, as an alternative action, a segment of user action in which the movement trajectory of the user's first limb conforms to the preset position change feature. That is, the electronic device can determine, from the user's actions, an alternative action that meets the preset motion track feature of the first limb; for example, it may determine the candidate action from the user action video according to the characteristics of the movement track of the first limb.
  • the preset position change characteristics may include one or more of the shape of the motion track, the periodicity of the motion track, and the like.
  • the electronic device may select a piece of video in which the movement trajectory of the first limb meets the preset characteristics as the candidate action, that is, the candidate action is a piece of video whose movement trajectory has a certain characteristic.
  • the electronic device may determine alternative actions from the user's actions according to the periodicity of the movement track of the first limb.
  • The electronic device may select, as an alternative action, a segment of user action in which the deviation between the motion trajectory of the first limb and the preset trajectory is less than a preset value.
  • The alternative action can be a video corresponding to one cycle of the movement trajectory of the first limb, or a video or image corresponding to a certain segment of the movement trajectory of the first limb within one cycle, for example, the video or image corresponding to the part of the trajectory in which the first limb is located within a specific position range during one cycle.
  • the selection method of the alternative exercise may be determined according to a specific fitness exercise.
  • the preset feature can be used to indicate the way the position of the bone node corresponding to the first limb changes.
  • the way of changing the position of the first limb may be a change in the direction of movement of the first limb.
  • the change of the movement direction of the first limb may be, for example, upward or downward movement, or up and down reciprocating movement.
  • the direction of movement of the first limb can also be understood as the direction of movement of all or part of the bone nodes of the first limb.
  • the first key node is the bone node in the first limb.
  • the way of changing the position of the first key node may also be the shape of the motion trajectory, for example, the motion trajectory is a triangle, circle, arc, or broken line shape.
  • The angle change of the first limb refers to the angle change in the first limb's action, that is, changes in the direction between the nodes of the first limb and changes in their relative positions.
  • the movement trajectory of the first limb can be understood as the movement trajectory of the first key node in the first limb.
  • The determined alternative action can be, for example, the segment of user action corresponding to the upward movement of the first key node during an up-and-down reciprocating movement, or, when the movement trajectory of the first key node is a triangle, the segment of user action corresponding to the first key node moving along a certain side of the triangle.
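A minimal sketch of selecting candidate segments from an up-and-down reciprocating trajectory of the first key node might look like this; the per-frame vertical coordinate input and the y-up convention are assumptions for illustration.

```python
def candidate_segments(y_positions):
    """Return (start, end) frame-index pairs for the maximal runs in which
    the first key node's vertical coordinate is increasing, i.e. the
    'upward movement' phases of an up-and-down reciprocating trajectory."""
    segments, start = [], None
    for i in range(1, len(y_positions)):
        rising = y_positions[i] > y_positions[i - 1]
        if rising and start is None:
            start = i - 1                        # an upward phase begins
        elif not rising and start is not None:
            segments.append((start, i - 1))      # the upward phase ends
            start = None
    if start is not None:                        # trajectory ends mid-phase
        segments.append((start, len(y_positions) - 1))
    return segments
```

Each returned pair delimits one candidate action; the same idea extends to other preset trajectory features (e.g. one side of a triangular trajectory).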
  • step S603 the electronic device determines the motion change range of the second limb in the candidate motion.
  • the electronic device may determine the range of change in the movement of the second limb in the alternative movement.
  • The movement range of the second limb can also be understood as the position change range of the second limb, that is, the maximum extent of the change in the second limb's position.
  • The range of motion changes can include changing angles, changing distances, and so on. That is to say, the movement range of the second limb can be the difference between the maximum value and the minimum value of the angle between the second limb and the horizontal or vertical direction; it can also be the farthest distance between the positions passed through while the second limb's position changes; or it can be the maximum value of the change in distance between second limbs.
  • the electronic device may determine that the movement change of the second limb in the user movement meets the maximum value and minimum value requirements of the preset movement change range, so as to determine that the user movement is a fitness movement.
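One way to sketch the maximum- and minimum-value requirement check described above, assuming the motion change is measured as a per-frame angle in degrees and with purely illustrative interval thresholds:

```python
def motion_change_ok(angles, max_range=(80.0, 100.0), min_range=(0.0, 20.0)):
    """angles: per-frame angle (degrees) between the second limb and the
    vertical direction within the candidate action. Returns True when the
    observed maximum and minimum each fall inside the preset interval
    ranges, i.e. the user action is judged to be a fitness action."""
    hi, lo = max(angles), min(angles)
    return max_range[0] <= hi <= max_range[1] and min_range[0] <= lo <= min_range[1]
```

For a squat-like action this would require the knee angle to reach roughly 80-100 degrees at the bottom and return to within 0-20 degrees at the top.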
  • the second limb may include one or more body parts.
  • the second limb may include all or part of the first limb.
  • the second limb may also be a limb other than the first limb of the user.
  • the electronic device may also determine that the user action satisfies a preset movement change range of the second limb to determine that the user action is a fitness action.
  • The motion change range of the second limb in the candidate motion may include the amplitude of the motion change of the second limb in the candidate motion.
  • the action change range of the second limb in the alternative action may include the action start position of the second limb in the alternative action, and the position of the second limb in the alternative action At least one of the action termination positions.
  • the action start position and action end position of the second limb in the candidate action can be determined according to the movement trajectory of the first limb.
  • The action starting position of the second limb in the candidate action may be the position of the second limb in the first frame of the candidate action.
  • The action termination position of the second limb in the candidate action may be the position of the second limb in the last frame image of the candidate action.
  • it may be determined that the position of the second limb in the candidate action image corresponding to a certain point on the movement trajectory of the first limb is the action starting position or the action ending position of the second limb.
  • the electronic device may store the corresponding relationship between the position of the first limb on its movement track and the movement start position or movement end position of the second limb.
  • The change range of the position of the second limb in the alternative action may be the amplitude corresponding to the change of the second limb's position in the alternative action.
  • the change range of the position of the second limb is the difference between the action start position and the action end position of the second limb in the alternative action.
  • the change range of the position of the second limb in the candidate action is used to indicate that the position of the second limb is in the last frame image of the candidate action relative to the first frame image of the candidate action The change.
  • The maximum value requirement of the preset action variation range is an interval range for the maximum value of the preset action variation range; the minimum value requirement is an interval range for the minimum value of the preset action variation range.
  • the electronic device can determine whether the user performs a fitness action according to the range of change in the position of the second limb.
  • The electronic device can obtain the change range of the position of the second limb according to the user's action, and determine whether the user performs a fitness exercise according to the stored recognition condition corresponding to the fitness exercise.
  • the recognition condition may be the range of change in the position of the second limb or the like.
  • the range of motion changes can include the way the position changes.
  • the way of changing the position may be, for example, up and down movement, horizontal movement, circular movement, angle change, and the like.
  • the movement change range of the second limb may include the change range of the angle change of the second limb, and may also include the change range of the relative position between the second limbs.
  • the electronic device may identify the bone node in the user's motion to determine the second limb in the user's motion.
  • the electronic device may determine that the second limb in the user's action satisfies the preset movement change range of the second limb.
  • The electronic device may determine the type of fitness exercise performed by the user according to the change range of the position of the second limb; according to the recognition condition satisfied by that change, the type of fitness exercise corresponding to the recognition condition can be determined.
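The per-exercise recognition conditions could be sketched as a simple lookup; the exercise names, measured quantities, and thresholds below are purely illustrative assumptions.

```python
RECOGNITION_CONDITIONS = {
    # exercise: (measured quantity, (min amplitude, max amplitude) in degrees)
    "squat":  ("knee_angle",  (60.0, 120.0)),
    "sit_up": ("torso_angle", (40.0, 90.0)),
}

def classify_action(observed):
    """observed: dict mapping a measured quantity to the amplitude of the
    second limb's position change. Returns the exercise whose recognition
    condition is satisfied, or None when no condition matches."""
    for exercise, (measure, (lo, hi)) in RECOGNITION_CONDITIONS.items():
        amplitude = observed.get(measure)
        if amplitude is not None and lo <= amplitude <= hi:
            return exercise
    return None
```

Once the type is known, the device only needs to check that exercise's own condition on subsequent actions, which matches the reduced-calculation point made below.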
  • the electronic device may obtain input information before step S601 or before step S602, and the input information is used to indicate a fitness action.
  • the electronic device determines the first preset condition according to the input information, that is, the preset condition that the motion trajectory of the first limb satisfies.
  • the electronic device may determine at least one of the first limb, the second limb, the third limb, the fourth limb, etc. corresponding to the fitness action.
  • the electronic device can determine the fitness exercise.
  • the electronic device can determine the type of fitness exercise based on the input information.
  • the second limb can be a different body part.
  • The recognition condition may be the change range of the position of the second limb; therefore, the recognition conditions can differ for different fitness actions.
  • the electronic device can determine whether the user performs the fitness action according to the corresponding recognition condition of the type of fitness exercise.
  • the recognition condition may include the motion change range of the second limb, and the recognition condition may also include at least one of the motion start position of the second limb and the motion end position of the second limb.
  • The electronic device can determine whether the user performs the fitness action only according to the recognition condition of the second limb corresponding to that fitness action, without judging whether the user's action meets the recognition conditions of other fitness actions, reducing the amount of calculation.
  • The movement change range of the second limb may include the amplitude of the second limb's movement change, and may also include the movement start position and the movement end position of the second limb.
  • the action start position of the second limb and the action end position of the second limb may be determined according to the movement trajectory of the first limb.
  • the action start position of the second limb may be the action start position of the second limb in the candidate action, that is, the position of the second limb in the first frame of image corresponding to the candidate action.
  • the action starting position of the second limb may also be the position of the second limb in the image of the candidate action corresponding to the point when the first limb is located at a certain point in the movement track.
  • The change range of the position of the user's second limb may be the change range over the entire user action, that is, the maximum range of the change of the second limb's position in the user action.
  • The electronic device can select any two frames of images in the user's action, compare the positions of the second limb in them, and, through multiple selections of image frames and comparisons of the second limb's position, determine the maximum value of the change of the second limb's position.
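The pairwise frame comparison just described can be sketched as follows, assuming the second limb's position is summarized as one scalar per frame (e.g. an angle); this is an illustration, not the patent's implementation.

```python
from itertools import combinations

def max_position_change(positions):
    """positions: per-frame scalar position of the second limb.
    Compare every pair of frames and return the largest difference,
    i.e. the maximum value of the second limb's position change."""
    return max(abs(a - b) for a, b in combinations(positions, 2))
```

In practice only the running minimum and maximum need to be tracked, but the pairwise form mirrors the two-frame comparison described in the text.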
  • the change range of the position of the user's second limb may also be the change range of the position in the candidate action in the user action.
  • the alternative action is a video of the user's action.
  • The electronic device may determine whether the user's first limb is included in the user's motion.
  • When the first limb is included in the user's motion, step S602 is performed.
  • When the first limb is outside the user's motion, the electronic device may output reminder information.
  • The reminder information may be used to remind the user to adjust the image collection range of the user action. At this time, the electronic device may not perform the subsequent steps.
  • the electronic device may also determine whether the user's second limb is included in the user's motion. When the second limb is outside the user action, a reminder message can be output. The reminder information may be used to remind the user to adjust the image collection range of the user action.
  • the electronic device may also determine the horizontal direction or the vertical direction in the user's motion. According to the horizontal direction or the vertical direction, the electronic device can determine the position of the user's limbs, the change of the position, the movement trajectory, and the like.
  • step S604 the electronic device determines to output guidance information according to the magnitude of the motion change.
  • the electronic device can evaluate and guide the user's action.
  • the second preset condition may also be referred to as an identification condition.
  • When the electronic device determines that the motion change range meets the second preset condition, it determines to output the guidance information.
  • When the movement range of the second limb satisfies the second preset condition, it is considered that the user is performing a fitness movement.
  • the electronic device can compare the similarity between the user's movement and the standard movement of the fitness movement, and evaluate and guide the user's movement.
  • the electronic device may save the correspondence between user actions and evaluation information.
  • the electronic device may indicate the user's actions through the position information of the user's limbs.
  • the electronic device may store the correspondence between the position information of the user's first limb and the evaluation information.
  • the position information of the second limb may be used to indicate the position of the second limb, and the position of the second limb may be a change range of the position, or may be a position at a specific point in time.
  • the position information of the second limb includes at least one of the movement range of the second limb, the movement start position of the second limb, the movement end position of the second limb, and the movement track of the second limb.
  • The electronic device can determine the time point when the second limb is located at a certain position according to the movement trajectory of the first limb, determine the user action at that time point, and determine the position information of the second limb at that time point, thereby determining the corresponding evaluation information according to the correspondence between the position information of the second limb and the evaluation information.
  • The electronic device may determine the action start position and action end position of the second limb according to the saved correspondence between the action start position of the second limb, the action end position of the second limb, and the position of the first limb in its movement track. According to whether the action start position and action end position of the second limb meet the recognition condition, it determines whether to output the guidance information.
  • The electronic device can determine the time period during which the first limb is within a certain range according to the movement trajectory of the first limb, and determine the position information of the second limb in that time period, so as to determine the corresponding evaluation information according to the correspondence between the position information of the second limb and the evaluation information.
  • the electronic device may determine the corresponding evaluation information according to the change of the position of the first limb.
  • The position of the second limb may be the maximum or minimum value of the angle between the second limb and the horizontal or vertical direction in the user action or alternative action, or it may be the range of the angle change, that is, the difference between the maximum and minimum values of the angle.
  • The position of the second limb can also be the ratio of the movement distance of the second limb to the length of the second limb, or the relative positional relationship between the second limb and other limbs, or the relative relationship between second limbs, etc.
  • the first evaluation information corresponding to the first position information is determined.
  • the first position information is used to indicate the position of the second limb in the user action.
  • the position information of the second limb can be in one-to-one correspondence with the evaluation information.
  • the position information of the second limb may include the movement change range of the second limb, the movement change range amplitude of the second limb, the limit value of the movement change range of the second limb, and the movement trajectory of the second limb At least one of them.
  • The limit value of the movement change range of the second limb is the maximum or minimum value of that range.
  • It may include the maximum or minimum value of the angle of the second limb, and it may also include the maximum or minimum distance between second limbs, that is, the maximum or minimum distance between one body part and another body part, and so on.
  • The amplitude of the motion change range of the second limb may also be referred to as the range of motion change, that is, the difference between the maximum and minimum values of the motion change range of the second limb.
  • the correspondence between the movement trajectory of the second limb and the evaluation information may be, for example, the correspondence between the shape and period of the movement trajectory of the second limb and the evaluation information.
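A hypothetical correspondence table between second-limb position information and evaluation information might look like this; the amplitude bands, scores, and comments are invented for illustration and are not the patent's actual tables.

```python
EVALUATION_TABLE = [
    # (amplitude lower bound, amplitude upper bound, score, comment)
    (90.0, 181.0, 100, "full range of motion"),
    (60.0,  90.0,  80, "go a little deeper"),
    ( 0.0,  60.0,  50, "range of motion too small"),
]

def first_evaluation(amplitude):
    """Map the second limb's motion-change amplitude (degrees) to the
    first evaluation information, a (score, comment) pair."""
    for lo, hi, score, comment in EVALUATION_TABLE:
        if lo <= amplitude < hi:
            return score, comment
    return None
```

The same table shape could hold trajectory-based keys (shape, period) instead of amplitude bands.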
  • the electronic device may output guidance information based on the first evaluation information.
  • the first evaluation information may include scores and/or reviews.
  • the guide information may be the same as or different from the first evaluation information.
  • the electronic device may adjust the first evaluation information according to the completion of the actions of the user's other limbs to obtain the guidance information.
  • the electronic device may determine the second evaluation information corresponding to the second location information of the user.
  • the second position information is used to indicate the position of the third limb in the user action.
  • the second evaluation information may be, for example, a score and/or an evaluation.
  • the second position information may include one of the movement change range of the third limb, the movement change range of the third limb, the limit value of the movement change range of the third limb, the movement trajectory of the third limb, etc. Or multiple.
  • the electronic device may output guidance information based on the second evaluation information and the first evaluation information.
  • the guidance information may include the evaluation in the second evaluation information.
  • When the score in the first evaluation information corresponding to the second limb's position is greater than or equal to a preset value, the electronic device can output the evaluation in the second evaluation information; conversely, when that score is less than the preset value, the guidance information may only include the evaluation in the first evaluation information.
  • the second evaluation information may not be determined.
  • the electronic device may adjust the score in the first evaluation information according to the score in the second evaluation information, so as to determine the score in the guidance information.
  • the score in the second evaluation information may be an increased or decreased score value, and the score value may be increased or decreased on the basis of the score in the first evaluation information.
  • the score in the second evaluation information may also be the score of the action of the third limb.
  • the electronic device may increase or decrease the value obtained by multiplying the score in the second evaluation information by the weight on the basis of the score in the first evaluation information according to the weight of the third limb.
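The weighted score adjustment just described might be sketched as below; the weight value and the 0-100 scoring scale are assumptions for illustration.

```python
def guidance_score(first_score, third_limb_score_delta, third_limb_weight=0.2):
    """Adjust the first evaluation's score by the third limb's score
    delta multiplied by the third limb's weight, clamping the result
    to an assumed 0-100 scoring scale."""
    score = first_score + third_limb_weight * third_limb_score_delta
    return max(0.0, min(100.0, score))
```

A positive delta (third limb well placed) raises the score; a negative delta lowers it, without letting the result leave the scale.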
  • the third limb may include one or more body parts.
  • the third limb may be a limb other than the second limb of the user.
  • the electronic device can determine whether the user's third limb is included in the user's motion. When the third limb is outside the user action, the electronic device may output reminder information. The reminder information may be used to remind the user to adjust the image collection range of the user action.
  • Through steps S601-S604, it is possible to accurately determine whether the user performs a fitness action, so as to provide guidance when the user performs a fitness action and improve user experience.
  • FIG. 9 is a schematic structural diagram of an electronic device provided by an embodiment of the present application.
  • the device 700 includes an acquiring module 701 and a determining module 702.
  • the obtaining module 701 is used to obtain user actions.
  • the determining module 702 is configured to determine, from the user motion, the candidate motion of the first limb in the user motion that satisfies the first preset condition.
  • the determining module 702 is further configured to determine the motion change range of the second limb in the candidate motion.
  • the determining module 702 is further configured to determine to output guidance information according to the change range of the action.
  • the device 700 further includes a judging module for judging that the motion change range satisfies a second preset condition.
  • the determining module 702 is also used to determine the output guidance information.
  • the obtaining module 701 is also used to obtain input information.
  • the determining module 702 is further configured to determine the first preset condition corresponding to the input information.
  • the determining module 702 is further configured to determine the first evaluation information corresponding to the first position information, where the first position information includes the movement range of the second limb, the movement starting position of the second limb, At least one of the action end position of the second limb and the movement track of the second limb, the action start position of the second limb and the action end position of the second limb are based on the first The trajectory of the limbs is determined.
  • the device 700 further includes an output module for outputting guidance information according to the first evaluation information.
  • the determining module 702 is further configured to determine second evaluation information corresponding to the second position information of the user, where the second position information includes the movement range of the third limb and the movement of the third limb. At least one of the starting position, the motion ending position of the third limb, and the motion trajectory of the third limb. The motion starting position of the third limb and the motion ending position of the third limb are based on The movement trajectory of the first limb is determined.
  • the output module is further configured to output the guidance information according to the second evaluation information and the first evaluation information.
  • the device 700 further includes a recognition module for recognizing joint points in the user motion to determine the first limb and the second limb in the user motion.
  • The determining module 702 is further configured to determine to output the guidance information according to the motion change range and the motion start position of the second limb in the candidate motion; or, to determine to output the guidance information according to the motion change range and the motion termination position of the second limb in the candidate motion; or, to determine to output the guidance information according to the motion change range and the action start position and action end position of the second limb in the candidate motion.
  • the action start position of the second limb and the action end position of the second limb are based on the The trajectory of the first limb is determined.
  • FIG. 10 is a schematic structural diagram of an electronic device provided by an embodiment of the present application.
  • the device 800 includes a processor 801 and a communication interface 802.
  • the communication interface 802 is used to obtain user actions.
  • The processor 801 is configured to: determine, from the user action, an alternative action for which the movement trajectory of the first limb in the user action satisfies a first preset condition; determine the movement range of the second limb in the alternative action; and determine to output guidance information according to the change range of the action.
  • the processor 801 is configured to: determine that the motion change range satisfies a second preset condition; determine to output guidance information.
  • the communication interface 802 is also used to obtain input information.
  • the processor 801 is further configured to determine the first preset condition corresponding to the input information.
  • The processor 801 is configured to: determine the first evaluation information corresponding to the first position information, where the first position information includes at least one of the movement range of the second limb, the movement starting position of the second limb, the action ending position of the second limb, and the movement track of the second limb; the action starting position of the second limb and the action ending position of the second limb are determined according to the movement trajectory of the first limb; and determine the guidance information according to the first evaluation information.
  • the processor 801 is configured to: determine second evaluation information corresponding to second position information of the user, where the second position information includes at least one of the motion change range of the third limb, the motion start position of the third limb, the motion end position of the third limb, and the movement trajectory of the third limb, the motion start position and the motion end position of the third limb being determined according to the movement trajectory of the first limb; and determine the guidance information according to the second evaluation information and the first evaluation information.
  • the processor 801 is further configured to: identify joint points in the user motion to determine the first limb and the second limb in the user motion.
  • the processor 801 is further configured to: determine to output the guidance information according to the motion change range and the motion start position of the second limb in the candidate action; or according to the motion change range and the motion end position of the second limb in the candidate action; or according to the motion change range and both the motion start position and the motion end position of the second limb in the candidate action, where the motion start position and the motion end position of the second limb are determined according to the movement trajectory of the first limb.
  • An embodiment of the present application further provides an electronic device, including at least one processor and a communication interface, the communication interface being used for the electronic device to exchange information with other devices; when program instructions are executed in the at least one processor, the electronic device is caused to execute the above method.
  • An embodiment of the present application also provides a computer program storage medium, characterized in that the computer program storage medium has program instructions which, when executed directly or indirectly, cause the foregoing method to be implemented.
  • An embodiment of the present application further provides a chip system, characterized in that the chip system includes at least one processor; when program instructions are executed in the at least one processor, the foregoing method is implemented.
  • The terms "first" and "second" are used for descriptive purposes only and cannot be understood as indicating or implying relative importance or implicitly indicating the number of the indicated technical features. Therefore, a feature defined with "first" or "second" may explicitly or implicitly include one or more such features.
  • the disclosed system, device, and method may be implemented in other ways.
  • the device embodiments described above are merely illustrative. For example, the division of the units is only a logical function division; in actual implementation there may be other divisions, for example, multiple units or components may be combined or integrated into another system, or some features may be ignored or not implemented.
  • the displayed or discussed mutual coupling or direct coupling or communication connection may be indirect coupling or communication connection through some interfaces, devices or units, and may be in electrical, mechanical or other forms.
  • the units described as separate components may or may not be physically separated, and the components displayed as units may or may not be physical units, that is, they may be located in one place, or they may be distributed on multiple network units. Some or all of the units may be selected according to actual needs to achieve the objectives of the solutions of the embodiments.
  • the functional units in the various embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units may be integrated into one unit.
  • if the functions are implemented in the form of a software functional unit and sold or used as an independent product, they can be stored in a computer-readable storage medium.
  • the technical solution of the present application essentially, or the part that contributes to the existing technology, or a part of the technical solution, can be embodied in the form of a software product. The computer software product is stored in a storage medium and includes several instructions used to cause a computer device (which may be a personal computer, a server, or a network device, etc.) to execute all or part of the steps of the methods described in the various embodiments of the present application.
  • the aforementioned storage media include media that can store program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disc.

Landscapes

  • Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Physical Education & Sports Medicine (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Biophysics (AREA)
  • Epidemiology (AREA)
  • Medical Informatics (AREA)
  • Primary Health Care (AREA)
  • Public Health (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Psychiatry (AREA)
  • Social Psychology (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • User Interface Of Digital Computer (AREA)
  • Image Analysis (AREA)
  • Medical Treatment And Welfare Office Work (AREA)
  • Telephone Function (AREA)

Abstract

A method for assisting fitness in the field of artificial intelligence, including: an electronic device obtains a user action (S601); the electronic device determines, from the user action, a candidate action in which the movement trajectory of a first limb in the user action satisfies a first preset condition (S602); the electronic device determines the motion change range of a second limb in the candidate action (S603); and the electronic device determines, according to the motion change range, to output guidance information (S604). By determining candidate actions whose limb trajectory satisfies the first preset condition, and using the limb motion change range within the candidate action, it can be accurately determined that the user is performing a fitness action. Outputting guidance information only when the user is determined to be performing a fitness action improves the accuracy of the output guidance information and the user experience.

Description

Method and electronic apparatus for assisting fitness
This application claims priority to Chinese Patent Application No. 201910817978.X, filed with the China National Intellectual Property Administration on August 30, 2019 and entitled "Method and Electronic Apparatus for Assisting Fitness", which is incorporated herein by reference in its entirety.
Technical Field
This application relates to the field of artificial intelligence (AI), in particular to the field of image processing, and especially to a method and apparatus for assisting fitness.
Background
Fitness training requires professional guidance; otherwise it is not only difficult to achieve the training effect, but serious sports injuries may also occur. Professional coaching is expensive and cannot meet the needs of most fitness enthusiasts. By recognizing a user's fitness actions from images, evaluating the completion quality according to key indicators of the action, and pointing out incorrect movements and ways to improve, guidance can be provided so that the user can exercise scientifically.
During a workout, the user may perform other actions unrelated to fitness. Evaluating those actions requires considerable computation and results in a poor user experience. Comparing a template of the user's action with a standard template and judging whether the user is performing a fitness action by similarity or difference yields low accuracy.
Summary
This application provides a method and electronic apparatus for assisting fitness that can accurately recognize fitness actions and improve user experience.
In a first aspect, a method for assisting fitness is provided, including: an electronic device obtains a user action; the electronic device determines, from the user action, a candidate action in which the movement trajectory of a first limb in the user action satisfies a first preset condition; the electronic device determines the motion change range of a second limb in the candidate action; and the electronic device determines, according to the motion change range, to output guidance information.
By determining candidate actions in which the limb trajectory satisfies the first preset condition, and using the limb motion change range within the candidate action, the output of guidance information can be determined accurately, improving user experience.
With reference to the first aspect, in some possible implementations, the method further includes: the electronic device obtains input information; the electronic device determines the first preset condition according to the input information.
Determining the first preset condition from the obtained input information reduces computation and improves the accuracy of the output guidance information.
With reference to the first aspect, in some possible implementations, the electronic device determines first evaluation information corresponding to first position information, where the first position information includes at least one of the motion change range of the second limb, the motion start position of the second limb, the motion end position of the second limb, and the movement trajectory of the second limb, the motion start position and the motion end position of the second limb being determined according to the movement trajectory; and the electronic device outputs guidance information according to the first evaluation information.
Through the correspondence between the position information of the first limb and evaluation information, the first evaluation information corresponding to the first position information of the user's first limb in the user action video is determined, thereby providing guidance for the user's fitness action. Guidance can be provided even when only the position information of the first limb is available, improving applicability.
With reference to the first aspect, in some possible implementations, the method further includes: the electronic device determines second evaluation information corresponding to second position information of the user, where the second position information includes at least one of the motion change range of a third limb, the motion start position of the third limb, the motion end position of the third limb, and the movement trajectory of the third limb, the motion start position and the motion end position of the third limb being determined according to the movement trajectory of the first limb; and the outputting, by the electronic device, guidance information according to the first evaluation information includes: outputting, by the electronic device, the guidance information according to the second evaluation information and the first evaluation information.
When the user action further includes the second limb, the second evaluation information corresponding to the second position information of the user's second limb in the user action video can be determined according to the correspondence between the position information of the second limb and evaluation information, thereby providing more comprehensive and detailed guidance for the user's fitness action.
With reference to the first aspect, in some possible implementations, joint points in the user action are recognized to determine the first limb and the second limb in the user action.
Determining the first limb by recognizing skeletal nodes reduces computation.
With reference to the first aspect, in some possible implementations, the method further includes: the electronic device obtains input information, where the input information indicates the fitness action; and the electronic device determines the first limb corresponding to the fitness action.
Determining the first limb from input information indicating the fitness action reduces computation.
With reference to the first aspect, in some possible implementations, the determining, by the electronic device according to the motion change range, to output guidance information includes: determining, by the electronic device, to output the guidance information according to the motion change range and the motion start position of the second limb in the candidate action; or determining, by the electronic device, to output the guidance information according to the motion change range and the motion end position of the second limb in the candidate action; or determining, by the electronic device, to output the guidance information according to the motion change range and both the motion start position and the motion end position of the second limb in the candidate action, where the motion start position and the motion end position of the second limb are determined according to the movement trajectory of the first limb.
Determining to output the guidance information from the motion change range together with one or both of the motion start position and the motion end position of the second limb in the candidate action, that is, judging the user action with more position information of the second limb, improves the accuracy of fitness-action recognition, enables more accurate guidance information to be output, and improves user experience.
In a second aspect, an apparatus for assisting fitness is provided, including: an obtaining module configured to obtain a user action; and a determining module configured to determine, from the user action, a candidate action in which the movement trajectory of a first limb in the user action satisfies a first preset condition. The determining module is further configured to determine the motion change range of a second limb in the candidate action, and to determine, according to the motion change range, to output guidance information.
With reference to the second aspect, the apparatus further includes a judging module configured to judge that the motion change range satisfies a second preset condition; the determining module is configured to determine to output guidance information.
With reference to the second aspect, in some possible implementations, the obtaining module is further configured to obtain input information; the determining module is further configured to determine the first preset condition corresponding to the input information.
With reference to the second aspect, in some possible implementations, the determining module is further configured to determine first evaluation information corresponding to first position information, where the first position information includes at least one of the motion change range of the second limb, the motion start position of the second limb, the motion end position of the second limb, and the movement trajectory of the second limb, the motion start position and the motion end position of the second limb being determined according to the movement trajectory of the first limb; and the apparatus further includes an output module configured to output guidance information according to the first evaluation information.
With reference to the second aspect, in some possible implementations, the determining module is further configured to determine second evaluation information corresponding to second position information of the user, where the second position information includes at least one of the motion change range of a third limb, the motion start position of the third limb, the motion end position of the third limb, and the movement trajectory of the third limb, the motion start position and the motion end position of the third limb being determined according to the movement trajectory of the first limb; the output module is further configured to output the guidance information according to the second evaluation information and the first evaluation information.
With reference to the second aspect, in some possible implementations, the apparatus further includes a recognition module configured to recognize joint points in the user action to determine the first limb and the second limb in the user action.
With reference to the second aspect, in some possible implementations, the determining module is further configured to determine to output the guidance information according to the motion change range and the motion start position of the second limb in the candidate action; or according to the motion change range and the motion end position of the second limb in the candidate action; or according to the motion change range and both the motion start position and the motion end position of the second limb in the candidate action, where the motion start position and the motion end position of the second limb are determined according to the movement trajectory.
In a third aspect, an apparatus for assisting fitness is provided, including a processor and a communication interface. The communication interface is configured to obtain a user action; the processor is configured to: determine, from the user action, a candidate action in which the movement trajectory of a first limb in the user action satisfies a first preset condition; determine the motion change range of a second limb in the candidate action; and determine, according to the motion change range, to output guidance information.
With reference to the third aspect, in some possible implementations, the processor is configured to: judge that the motion change range satisfies a second preset condition; and determine to output guidance information.
With reference to the third aspect, in some possible implementations, the communication interface is further configured to obtain input information; the processor is further configured to determine the first preset condition corresponding to the input information.
With reference to the third aspect, in some possible implementations, the processor is configured to: determine first evaluation information corresponding to first position information, where the first position information includes at least one of the motion change range of the first limb, the motion start position of the first limb, the motion end position of the first limb, and the movement trajectory of the first limb, the motion start position and the motion end position of the first limb being determined according to the movement trajectory of the first limb; and determine the guidance information according to the first evaluation information.
With reference to the third aspect, in some possible implementations, the processor is configured to: determine second evaluation information corresponding to second position information of the user, where the second position information includes at least one of the motion change range of a third limb, the motion start position of the third limb, the motion end position of the third limb, and the movement trajectory of the third limb, the motion start position and the motion end position of the third limb being determined according to the movement trajectory of the first limb; and determine the guidance information according to the second evaluation information and the first evaluation information.
With reference to the third aspect, in some possible implementations, the processor is further configured to recognize joint points in the user action to determine the first limb and the second limb in the user action.
With reference to the third aspect, in some possible implementations, the processor is further configured to determine to output the guidance information according to the motion change range and the motion start position of the second limb in the candidate action; or according to the motion change range and the motion end position of the second limb in the candidate action; or according to the motion change range and both the motion start position and the motion end position of the second limb in the candidate action, where the motion start position and the motion end position of the second limb are determined according to the movement trajectory of the first limb.
In a fourth aspect, a computer storage medium is provided; when the computer instructions run on an electronic device, the electronic device is caused to perform the method of the first aspect.
In a fifth aspect, a chip system is provided, including at least one processor; when program instructions are executed in the at least one processor, the chip system is caused to perform the method of the first aspect.
Brief Description of Drawings
FIG. 1 is a schematic diagram of the hardware structure of an electronic device.
FIG. 2 is a schematic diagram of the software structure of an electronic device.
FIG. 3 is a schematic flowchart of a method for assisting fitness according to an embodiment of this application.
FIG. 4 is a schematic flowchart of a method for assisting fitness according to another embodiment of this application.
FIG. 5 is a schematic diagram of a user interface for assisting fitness according to an embodiment of this application.
FIG. 6 is a schematic diagram of a user interface for assisting fitness according to another embodiment of this application.
FIG. 7 is a schematic flowchart of a method for assisting fitness according to yet another embodiment of this application.
FIG. 8 is a schematic diagram of a squat exercise.
FIG. 9 is a schematic structural diagram of an electronic apparatus according to an embodiment of this application.
FIG. 10 is a schematic structural diagram of an electronic apparatus according to another embodiment of this application.
Description of Embodiments
The technical solutions in this application are described below with reference to the accompanying drawings.
The community of fitness enthusiasts keeps growing, and fitness requires professional guidance; otherwise it is not only difficult to achieve the training effect, but serious sports injuries may occur. Professional personal trainers are few and expensive, and cannot meet the needs of all fitness enthusiasts.
By recognizing the user's fitness action from images, the number of completed repetitions can be recorded, the completion quality can be evaluated according to key indicators of the action, incorrect movements and ways to improve can be pointed out, and scientific guidance can be provided. From captured images of the user's fitness action, information about the joint points of the user's body can be collected; the user's posture is compared with the posture of the standard action according to the user's joint-point information and the standard action's joint-point information, so as to determine the difference between the user's action and the standard action and provide feedback and guidance.
During a workout, the user may perform some actions unrelated to fitness, such as picking something up, answering a call, or walking around. When the user is not performing a fitness action, treating these unrelated actions as non-standard fitness actions and providing evaluation and guidance leads to a poor user experience.
When the similarity between the user's action and the standard action is below a preset value, evaluation and guidance of the user action may be skipped.
Different body parts affect the completion of different actions to different degrees. Different weights can be set for different body parts, and only the postures of the body parts relevant to the action are evaluated. For example, for the squat, the three most important evaluation indicators are the shank angle, the thigh angle and the torso angle; the angles of the other limbs hardly affect the correctness of the squat. The weights of the other limbs' postures on the degree of action completion are set to 0, and only the shank, thigh and torso have weights greater than 0. Differences between the other limbs' postures and the standard action then do not affect the evaluation of the squat, which avoids an overly low similarity caused by large differences in the other limbs and allows the correct evaluation and guidance to be triggered.
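The weighted evaluation described above can be sketched as follows. This is a minimal illustration, not the patent's implementation: the limb names, weight values, and the 180-degree normalization are assumptions made for the example.

```python
def weighted_pose_similarity(user_angles, standard_angles, weights):
    """Similarity in [0, 1] between a user pose and a standard pose.

    Angles are in degrees; a limb whose weight is 0 (e.g. arms in a
    squat) contributes nothing, so even a wildly off-standard arm
    cannot drag the score down.
    """
    total_w = sum(weights.values())
    if total_w == 0:
        return 1.0
    diff = 0.0
    for limb, w in weights.items():
        # Normalize each angular difference by the 180-degree span.
        diff += w * min(abs(user_angles[limb] - standard_angles[limb]), 180) / 180
    return 1.0 - diff / total_w

# Squat: only shank, thigh and torso matter; the arm weight is 0.
weights = {"shank": 1.0, "thigh": 2.0, "torso": 1.0, "arm": 0.0}
standard = {"shank": 70, "thigh": 10, "torso": 45, "arm": 90}
user = {"shank": 70, "thigh": 10, "torso": 45, "arm": 0}  # arms far off
score = weighted_pose_similarity(user, standard, weights)
```

With the arm weight set to 0, `score` stays at 1.0 despite the 90-degree arm error, which is exactly the effect the zero-weight scheme aims for.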
In practice, however, a beginner's movements may be far from standard, especially for complex actions requiring coordination of different body parts. A beginner's action may fail to meet the similarity requirement and be judged as an unrelated action, so no effective guidance is provided and the user experience is poor.
Taking the squat again as an example: when the user's action is very non-standard, for example the torso leans forward excessively and the shanks lean forward excessively, the overall similarity is still low and may fall below the preset threshold, so the action is judged as unrelated and no evaluation or guidance is triggered.
Therefore, for fitness scenarios, statically comparing the similarity between the posture of the user action and the posture of the standard action cannot accurately identify whether the user is performing a fitness action.
A user action may be a continuous movement over a period of time. When a user action is compared with a standard action, the user action at a certain time point or within a time period is compared with the static standard action. To compare the similarity between the posture of the user action and that of the standard action, dynamic time warping (DTW) is usually needed to determine a time window and compare the user action within the window with the standard action. Ideally, the start time and end time of the user action are determined and used as the start and end of the time window. In practice, the duration of an action varies between users and actions, and the start and end times of a user action are hard to determine accurately; that is, the time window is hard to choose. If the window is too large, the computational cost is high and the computed similarity is low; if it is too small, it may be impossible to identify whether the user is performing a fitness action. Moreover, this method compares posture similarity and therefore cannot accurately identify whether the user is performing a fitness action.
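For reference, a minimal unbounded DTW over two 1-D traces looks like the sketch below; a production system would additionally constrain the warping window, which is precisely the window-size trade-off discussed above. This is a textbook illustration, not code from the patent.

```python
def dtw_distance(a, b):
    """Dynamic time warping distance between two 1-D sequences.

    Classic O(len(a) * len(b)) dynamic program: d[i][j] is the best
    cumulative cost of aligning a[:i] with b[:j].
    """
    inf = float("inf")
    n, m = len(a), len(b)
    d = [[inf] * (m + 1) for _ in range(n + 1)]
    d[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            d[i][j] = cost + min(d[i - 1][j], d[i][j - 1], d[i - 1][j - 1])
    return d[n][m]

# The same up-down movement performed at half speed still aligns
# perfectly, which is what makes DTW attractive despite its cost.
standard = [0, 1, 2, 1, 0]
slow_user = [0, 0, 1, 1, 2, 2, 1, 1, 0, 0]
```

`dtw_distance(standard, slow_user)` is 0.0 here because every slowed-down sample can be matched to an equal-valued sample on a monotone warping path.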
To solve the above problems, this application provides a method for assisting fitness that can identify whether a user is performing a fitness action. The method may be executed in an electronic device.
By way of example, FIG. 1 shows a schematic structural diagram of an electronic device 100. The electronic device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (USB) interface 130, a charging management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, a headset jack 170D, a sensor module 180, a button 190, a motor 191, an indicator 192, a camera 193, a display 194, a subscriber identification module (SIM) card interface 195, and the like. The sensor module 180 may include a pressure sensor 180A, a gyroscope sensor 180B, a barometric pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity light sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, and the like.
It can be understood that the structure illustrated in the embodiments of this application does not constitute a specific limitation on the electronic device 100. In other embodiments of this application, the electronic device 100 may include more or fewer components than shown, combine some components, split some components, or arrange the components differently. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
For example, the electronic device in the embodiments of this application may include the processor 110, the audio module 170, the speaker 170A, the Bluetooth module in the wireless communication module 160, the display 194, the camera 193, the internal memory 121, and the like.
The processor 110 may include one or more processing units. For example, the processor 110 may include at least one of an application processor (AP), a modem processor, a graphics processing unit (GPU), an image signal processor (ISP), a controller, a memory, a video codec, a digital signal processor (DSP), a baseband processor, and a neural-network processing unit (NPU). Different processing units may be independent devices or may be integrated in one or more processors.
The controller may be the nerve center and command center of the electronic device 100. The controller may generate operation control signals according to instruction opcodes and timing signals, and control instruction fetching and execution.
A memory may further be provided in the processor 110 to store instructions and data. In some embodiments, the memory in the processor 110 is a cache, which may store instructions or data that the processor 110 has just used or uses cyclically. If the processor 110 needs the instructions or data again, it can call them directly from the memory, avoiding repeated accesses, reducing the waiting time of the processor 110, and thus improving system efficiency.
In some embodiments, the processor 110 may include one or more interfaces. The interfaces may include at least one of an inter-integrated circuit (I2C) interface, an inter-integrated circuit sound (I2S) interface, a pulse code modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a mobile industry processor interface (MIPI), a general-purpose input/output (GPIO) interface, a subscriber identity module (SIM) interface, and a universal serial bus (USB) interface.
The I2C interface is a bidirectional synchronous serial bus including a serial data line (SDA) and a serial clock line (SCL). In some embodiments, the processor 110 may include multiple groups of I2C buses, and may be separately coupled to the touch sensor 180K, a charger, a flash, the camera 193, and the like through different I2C bus interfaces. For example, the processor 110 may be coupled to the touch sensor 180K through an I2C interface, so that the processor 110 and the touch sensor 180K communicate through the I2C bus interface to implement the touch function of the electronic device 100.
The I2S interface may be used for audio communication. In some embodiments, the processor 110 may include multiple groups of I2S buses, and may be coupled to the audio module 170 through an I2S bus to implement communication between the processor 110 and the audio module 170. In some embodiments, the audio module 170 may transmit audio signals to the wireless communication module 160 through the I2S interface to implement answering calls through a Bluetooth headset.
The PCM interface may also be used for audio communication, sampling, quantizing and encoding analog signals. In some embodiments, the audio module 170 and the wireless communication module 160 may be coupled through a PCM bus interface. In some embodiments, the audio module 170 may also transmit audio signals to the wireless communication module 160 through the PCM interface to implement answering calls through a Bluetooth headset. Both the I2S interface and the PCM interface may be used for audio communication.
The UART interface is a universal serial data bus for asynchronous communication. The bus may be a bidirectional communication bus that converts the data to be transmitted between serial and parallel communication. In some embodiments, the UART interface is usually used to connect the processor 110 and the wireless communication module 160; for example, the processor 110 communicates with the Bluetooth module in the wireless communication module 160 through the UART interface to implement the Bluetooth function. In some embodiments, the audio module 170 may transmit audio signals to the wireless communication module 160 through the UART interface to implement playing music through a Bluetooth headset.
The MIPI interface may be used to connect the processor 110 with peripheral devices such as the display 194 and the camera 193. The MIPI interface includes a camera serial interface (CSI), a display serial interface (DSI), and the like. In some embodiments, the processor 110 and the camera 193 communicate through the CSI interface to implement the shooting function of the electronic device 100, and the processor 110 and the display 194 communicate through the DSI interface to implement the display function of the electronic device 100.
The GPIO interface may be configured by software, as a control signal or as a data signal. In some embodiments, the GPIO interface may be used to connect the processor 110 with the camera 193, the display 194, the wireless communication module 160, the audio module 170, the sensor module 180, and the like. The GPIO interface may also be configured as an I2C interface, an I2S interface, a UART interface, a MIPI interface, and so on.
The USB interface 130 is an interface conforming to the USB standard specification, and may specifically be a Mini USB interface, a Micro USB interface, a USB Type-C interface, or the like. The USB interface 130 may be used to connect a charger to charge the electronic device 100, to transfer data between the electronic device 100 and peripheral devices, or to connect a headset and play audio through it. The interface may also be used to connect other electronic devices, such as AR devices.
It can be understood that the interface connection relationships between the modules illustrated in the embodiments of this application are only schematic and do not constitute a structural limitation on the electronic device 100. In other embodiments of this application, the electronic device 100 may use interface connection manners different from those in the above embodiments, or a combination of multiple interface connection manners.
The charging management module 140 is configured to receive charging input from a charger, which may be a wireless or wired charger. In some wired charging embodiments, the charging management module 140 may receive charging input from a wired charger through the USB interface 130. In some wireless charging embodiments, the charging management module 140 may receive wireless charging input through a wireless charging coil of the electronic device 100. While charging the battery 142, the charging management module 140 may also supply power to the electronic device through the power management module 141.
The power management module 141 connects the battery 142, the charging management module 140 and the processor 110. The power management module 141 receives input from the battery 142 and/or the charging management module 140, and supplies power to the processor 110, the internal memory 121, an external memory, the display 194, the camera 193, the wireless communication module 160, and the like. The power management module 141 may also be used to monitor parameters such as battery capacity, battery cycle count, and battery health (leakage, impedance). In some other embodiments, the power management module 141 may be provided in the processor 110. In still other embodiments, the power management module 141 and the charging management module 140 may be provided in the same device.
The wireless communication function of the electronic device 100 may be implemented through the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, the modem processor, the baseband processor, and the like.
The antenna 1 and the antenna 2 are configured to transmit and receive electromagnetic wave signals. Each antenna in the electronic device 100 may cover one or more communication frequency bands. Different antennas may also be multiplexed to improve antenna utilization; for example, the antenna 1 may be multiplexed as a diversity antenna of a wireless local area network. In some other embodiments, an antenna may be used in combination with a tuning switch.
The mobile communication module 150 may provide wireless communication solutions applied to the electronic device 100, including 2G/3G/4G/5G, and may include at least one filter, switch, power amplifier, low noise amplifier (LNA), and the like. The mobile communication module 150 may receive electromagnetic waves via the antenna 1, filter and amplify them, and transmit them to the modem processor for demodulation. It may also amplify signals modulated by the modem processor and radiate them out as electromagnetic waves via the antenna 1. In some embodiments, at least some functional modules of the mobile communication module 150 may be provided in the processor 110. In some embodiments, at least some functional modules of the mobile communication module 150 may be provided in the same device as at least some modules of the processor 110.
The modem processor may include a modulator and a demodulator. The modulator modulates the low-frequency baseband signal to be transmitted into a medium/high-frequency signal; the demodulator demodulates the received electromagnetic wave signal into a low-frequency baseband signal and then transmits it to the baseband processor for processing. After being processed by the baseband processor, the low-frequency baseband signal is passed to the application processor, which outputs sound signals through audio devices (not limited to the speaker 170A and the receiver 170B) or displays images or videos through the display 194. In some embodiments, the modem processor may be an independent device. In other embodiments, the modem processor may be independent of the processor 110 and provided in the same device as the mobile communication module 150 or other functional modules.
The wireless communication module 160 may provide wireless communication solutions applied to the electronic device 100, including wireless local area networks (WLAN) (such as a wireless fidelity (Wi-Fi) network), Bluetooth (BT), the global navigation satellite system (GNSS), frequency modulation (FM), near field communication (NFC), and infrared (IR) technologies. The wireless communication module 160 may be one or more devices integrating at least one communication processing module. The wireless communication module 160 receives electromagnetic waves via the antenna 2, frequency-modulates and filters the electromagnetic wave signals, and sends the processed signals to the processor 110. The wireless communication module 160 may also receive signals to be transmitted from the processor 110, frequency-modulate and amplify them, and radiate them out as electromagnetic waves via the antenna 2.
In some embodiments, the antenna 1 of the electronic device 100 is coupled to the mobile communication module 150 and the antenna 2 is coupled to the wireless communication module 160, so that the electronic device 100 can communicate with networks and other devices through wireless communication technologies. The wireless communication technologies may include at least one of global system for mobile communications (GSM), general packet radio service (GPRS), code division multiple access (CDMA), wideband code division multiple access (WCDMA), time-division code division multiple access (TD-SCDMA), long term evolution (LTE), BT, GNSS, WLAN, NFC, FM, and IR. The GNSS may include the global positioning system (GPS), the global navigation satellite system (GLONASS), the BeiDou navigation satellite system (BDS), the quasi-zenith satellite system (QZSS), and satellite based augmentation systems (SBAS).
The electronic device 100 implements the display function through the GPU, the display 194, the application processor, and the like. The GPU is a microprocessor for image processing, connecting the display 194 and the application processor, and performs mathematical and geometric computation for graphics rendering. The processor 110 may include one or more GPUs that execute program instructions to generate or change display information.
The display 194 is configured to display images, videos, and the like, and includes a display panel. The display panel may use a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a MiniLED, a MicroLED, a Micro-OLED, quantum dot light emitting diodes (QLED), and the like. In some embodiments, the electronic device 100 may include 1 or N displays 194, where N is a positive integer greater than 1.
The electronic device 100 may implement the shooting function through the ISP, the camera 193, the video codec, the GPU, the display 194, the application processor, and the like.
The ISP processes data fed back by the camera 193. For example, when taking a photo, the shutter opens, light is transmitted through the lens to the camera's photosensitive element, the optical signal is converted into an electrical signal, and the photosensitive element passes the electrical signal to the ISP for processing, converting it into an image visible to the naked eye. The ISP may also perform algorithmic optimization on the noise, brightness and skin tone of the image, and may optimize parameters such as the exposure and color temperature of the shooting scene. In some embodiments, the ISP may be provided in the camera 193.
The camera 193 captures static images or videos. An object passes through the lens to generate an optical image projected onto the photosensitive element, which may be a charge coupled device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor. The photosensitive element converts the optical signal into an electrical signal, which is then passed to the ISP to be converted into a digital image signal. The ISP outputs the digital image signal to the DSP for processing, and the DSP converts the digital image signal into an image signal in a standard format such as RGB or YUV. In some embodiments, the electronic device 100 may include 1 or N cameras 193, where N is a positive integer greater than 1.
For example, in the method for assisting fitness provided in this application, the camera may capture a video of the user action. The photosensitive element converts the collected optical signal into an electrical signal, which is then passed to the ISP to be converted into a digital image signal; the ISP outputs the digital image signal to the DSP for related image processing.
The digital signal processor processes digital signals; besides digital image signals, it can process other digital signals. For example, when the electronic device 100 selects a frequency point, the digital signal processor performs a Fourier transform and the like on the frequency-point energy.
The video codec compresses or decompresses digital video. The electronic device 100 may support one or more video codecs, so that the electronic device 100 can play or record videos in multiple coding formats, such as moving picture experts group (MPEG) 1, MPEG2, MPEG3, and MPEG4.
The NPU is a neural-network (NN) computing processor. By drawing on the structure of biological neural networks, for example the transfer mode between neurons in the human brain, it processes input information quickly and can also keep learning. Applications such as intelligent cognition of the electronic device 100, for example image recognition, face recognition, speech recognition and text understanding, can be implemented through the NPU.
The external memory interface 120 may be used to connect an external memory card, for example a Micro SD card, to expand the storage capacity of the electronic device 100. The external memory card communicates with the processor 110 through the external memory interface 120 to implement the data storage function, for example saving music and video files in the external memory card.
The internal memory 121 may be used to store computer-executable program code, which includes instructions. By running the instructions stored in the internal memory 121, the processor 110 executes the various functional applications and data processing of the electronic device 100. The internal memory 121 may include a program storage area and a data storage area. The program storage area may store an operating system and applications required for at least one function (such as a sound playback function and an image playback function); the data storage area may store data created during use of the electronic device 100 (such as audio data and a phone book). In addition, the internal memory 121 may include a high-speed random access memory, and may also include a non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, or a universal flash storage (UFS).
The electronic device 100 may implement audio functions, such as music playback and recording, through the audio module 170, the speaker 170A, the receiver 170B, the microphone 170C, the headset jack 170D, the application processor, and the like.
The audio module 170 converts digital audio information into analog audio signal output, and converts analog audio input into digital audio signals. The audio module 170 may also be used to encode and decode audio signals. In some embodiments, the audio module 170 may be provided in the processor 110, or some functional modules of the audio module 170 may be provided in the processor 110.
The speaker 170A, also called a "horn", converts audio electrical signals into sound signals. The electronic device 100 can play music or hands-free calls through the speaker 170A.
The receiver 170B, also called an "earpiece", converts audio electrical signals into sound signals. When the electronic device 100 answers a call or a voice message, the receiver 170B can be placed close to the ear to listen to the voice.
The microphone 170C, also called a "mic" or "mouthpiece", converts sound signals into electrical signals. When making a call or sending a voice message, the user can speak with the mouth close to the microphone 170C to input the sound signal into the microphone 170C. The electronic device 100 may be provided with at least one microphone 170C. In other embodiments, the electronic device 100 may be provided with two microphones 170C to implement noise reduction in addition to collecting sound signals. In still other embodiments, the electronic device 100 may be provided with three, four or more microphones 170C to collect sound signals, reduce noise, identify the sound source, implement directional recording, and so on.
The headset jack 170D is used to connect a wired headset. The headset jack 170D may be the USB interface 130, a 3.5 mm open mobile terminal platform (OMTP) standard interface, or a cellular telecommunications industry association of the USA (CTIA) standard interface.
The pressure sensor 180A senses pressure signals and can convert them into electrical signals. In some embodiments, the pressure sensor 180A may be provided on the display 194. There are many kinds of pressure sensors, such as resistive, inductive and capacitive pressure sensors. A capacitive pressure sensor may include at least two parallel plates with conductive material; when force acts on the pressure sensor 180A, the capacitance between the electrodes changes, and the electronic device 100 determines the strength of the pressure from the change in capacitance. When a touch operation acts on the display 194, the electronic device 100 detects the strength of the touch operation through the pressure sensor 180A, and may also calculate the touch position from the detection signal of the pressure sensor 180A. In some embodiments, touch operations acting at the same touch position but with different strengths may correspond to different operation instructions. For example, when a touch operation whose strength is less than a first pressure threshold acts on the short message application icon, an instruction to view the short message is executed; when a touch operation whose strength is greater than or equal to the first pressure threshold acts on the short message application icon, an instruction to create a new short message is executed.
The gyroscope sensor 180B may be used to determine the motion posture of the electronic device 100. In some embodiments, the angular velocities of the electronic device 100 around three axes (i.e., the x, y and z axes) may be determined through the gyroscope sensor 180B. The gyroscope sensor 180B may be used for image stabilization during shooting: when the shutter is pressed, the gyroscope sensor 180B detects the angle at which the electronic device 100 shakes, calculates the distance the lens module needs to compensate according to the angle, and lets the lens counteract the shaking of the electronic device 100 through reverse motion to achieve stabilization. The gyroscope sensor 180B may also be used for navigation and motion-sensing game scenarios.
The barometric pressure sensor 180C measures air pressure. In some embodiments, the electronic device 100 calculates altitude from the pressure value measured by the barometric pressure sensor 180C to assist positioning and navigation.
The magnetic sensor 180D includes a Hall sensor. The electronic device 100 may use the magnetic sensor 180D to detect the opening and closing of a flip holster. In some embodiments, when the electronic device 100 is a flip phone, the electronic device 100 can detect the opening and closing of the flip according to the magnetic sensor 180D, and then set features such as automatic unlocking upon flip opening according to the detected opening/closing state of the holster or flip.
The acceleration sensor 180E can detect the magnitude of the acceleration of the electronic device 100 in all directions (generally three axes), and can detect the magnitude and direction of gravity when the electronic device 100 is stationary. It can also be used to identify the posture of the electronic device, and is applied to landscape/portrait switching, pedometers, and similar applications.
The distance sensor 180F is used to measure distance. The electronic device 100 may measure distance by infrared or laser. In some embodiments, in a shooting scene, the electronic device 100 may use the distance sensor 180F to measure distance to implement fast focusing.
The proximity light sensor 180G may include, for example, a light-emitting diode (LED) and a light detector such as a photodiode. The LED may be an infrared LED. The electronic device 100 emits infrared light outward through the LED and uses the photodiode to detect infrared light reflected from nearby objects. When sufficient reflected light is detected, it can be determined that there is an object near the electronic device 100; when insufficient reflected light is detected, the electronic device 100 can determine that there is no object nearby. The electronic device 100 may use the proximity light sensor 180G to detect that the user holds the device close to the ear during a call, so as to automatically turn off the screen to save power. The proximity light sensor 180G may also be used for automatic unlocking and screen locking in holster mode and pocket mode.
The ambient light sensor 180L senses ambient light brightness. The electronic device 100 may adaptively adjust the brightness of the display 194 according to the perceived ambient brightness. The ambient light sensor 180L may also be used to automatically adjust the white balance when taking photos, and may cooperate with the proximity light sensor 180G to detect whether the electronic device 100 is in a pocket, so as to prevent accidental touches.
The fingerprint sensor 180H collects fingerprints. The electronic device 100 may use the collected fingerprint characteristics to implement fingerprint unlocking, accessing an application lock, fingerprint photographing, answering incoming calls with a fingerprint, and so on.
The temperature sensor 180J detects temperature. In some embodiments, the electronic device 100 executes a temperature processing policy using the temperature detected by the temperature sensor 180J. For example, when the temperature reported by the temperature sensor 180J exceeds a threshold, the electronic device 100 reduces the performance of a processor located near the temperature sensor 180J, so as to lower power consumption and implement thermal protection. In other embodiments, when the temperature is below another threshold, the electronic device 100 heats the battery 142 to avoid abnormal shutdown of the electronic device 100 caused by low temperature. In still other embodiments, when the temperature is below yet another threshold, the electronic device 100 boosts the output voltage of the battery 142 to avoid abnormal shutdown caused by low temperature.
The touch sensor 180K is also called a "touch panel". The touch sensor 180K may be provided on the display 194; the touch sensor 180K and the display 194 form a touchscreen, also called a "touch screen". The touch sensor 180K detects touch operations acting on or near it, and may pass the detected touch operation to the application processor to determine the touch event type. Visual output related to the touch operation may be provided through the display 194. In other embodiments, the touch sensor 180K may also be provided on the surface of the electronic device 100 at a position different from that of the display 194.
The bone conduction sensor 180M can obtain vibration signals. In some embodiments, the bone conduction sensor 180M can obtain the vibration signal of the vibrating bone mass of the human voice part. The bone conduction sensor 180M can also contact the human pulse and receive the blood-pressure beating signal. In some embodiments, the bone conduction sensor 180M may also be provided in a headset, combined into a bone conduction headset. The audio module 170 may parse out a voice signal based on the vibration signal of the vibrating bone mass obtained by the bone conduction sensor 180M, to implement the voice function. The application processor may parse heart-rate information based on the blood-pressure beating signal obtained by the bone conduction sensor 180M, to implement the heart-rate detection function.
The button 190 includes a power button, a volume button, and the like. The button 190 may be a mechanical button or a touch button. The electronic device 100 may receive button input and generate key signal input related to user settings and function control of the electronic device 100.
The motor 191 can generate vibration prompts. The motor 191 may be used for incoming-call vibration prompts as well as touch vibration feedback. For example, touch operations acting on different applications (such as photographing and audio playback) may correspond to different vibration feedback effects, and touch operations acting on different areas of the display 194 may also correspond to different vibration feedback effects of the motor 191. Different application scenarios (for example, time reminders, receiving messages, alarm clocks, and games) may also correspond to different vibration feedback effects. The touch vibration feedback effect may also be customized.
The indicator 192 may be an indicator light, which may be used to indicate the charging state and battery changes, as well as messages, missed calls, notifications, and the like.
The SIM card interface 195 is used to connect a SIM card. A SIM card can be inserted into or pulled out of the SIM card interface 195 to achieve contact with or separation from the electronic device 100. The electronic device 100 may support 1 or N SIM card interfaces, where N is a positive integer greater than 1. The SIM card interface 195 may support a Nano SIM card, a Micro SIM card, a SIM card, and so on. Multiple cards may be inserted into the same SIM card interface 195 at the same time; the cards may be of the same or different types. The SIM card interface 195 may also be compatible with different types of SIM cards, and with external memory cards. The electronic device 100 interacts with the network through the SIM card to implement functions such as calls and data communication. In some embodiments, the electronic device 100 uses an eSIM, i.e., an embedded SIM card; the eSIM card can be embedded in the electronic device 100 and cannot be separated from the electronic device 100.
The software system of the electronic device 100 may adopt a layered architecture, an event-driven architecture, a microkernel architecture, a microservices architecture, or a cloud architecture. The embodiments of this application take the Android system with a layered architecture as an example to illustrate the software structure of the electronic device 100.
FIG. 2 is a block diagram of the software structure of the electronic device 100 according to an embodiment of this application. The layered architecture divides the software into several layers, each with a clear role and division of labor, and the layers communicate with each other through software interfaces. In some embodiments, the Android system is divided into four layers: from top to bottom, the application layer, the application framework layer, the Android runtime and system libraries, and the kernel layer. The application layer may include a series of application packages.
As shown in FIG. 2, the application packages may include applications such as Camera, Gallery, Calendar, Phone, Maps, Navigation, WLAN, Bluetooth, Music, Video, and Messages.
The application framework layer provides an application programming interface (API) and a programming framework for applications in the application layer. The application framework layer includes some predefined functions.
For example, in this application, algorithms for processing images and the like may be included in the application framework layer.
As shown in FIG. 2, the application framework layer may include a window manager, content providers, a view system, a phone manager, a resource manager, a notification manager, and the like.
The window manager is used to manage window programs. The window manager can obtain the display size, determine whether there is a status bar, lock the screen, capture the screen, and so on.
Content providers are used to store and retrieve data and make it accessible to applications. The data may include videos, images, audio, calls made and received, browsing history and bookmarks, the phone book, and the like.
For example, in this application, the content controller may obtain the image captured in the preview interface in real time and display the processed image in the preview interface.
The view system includes visual controls, such as controls for displaying text and controls for displaying pictures. The view system may be used to build applications. A display interface may consist of one or more views; for example, a display interface including a short-message notification icon may include a view for displaying text and a view for displaying pictures.
For example, in this application, content such as the "user action video", the "standard fitness action" and the "guidance information" shown on the display interface may be displayed by the view system upon instruction from the processor.
The phone manager is used to provide the communication functions of the electronic device 100, for example management of call states (connected, hung up, and so on).
The resource manager provides applications with various resources, such as localized strings, icons, pictures, layout files, and video files.
The notification manager enables applications to display notification information in the status bar, and can be used to convey notification-type messages that disappear automatically after a short stay without user interaction. For example, the notification manager is used to notify download completion, message reminders, and so on. The notification manager may also present notifications that appear in the status bar at the top of the system in the form of charts or scroll-bar text, such as notifications of applications running in the background, or notifications that appear on the screen in the form of dialog windows, for example prompting text in the status bar, sounding a prompt tone, vibrating the electronic device, or blinking the indicator light.
The Android runtime includes core libraries and a virtual machine. The Android runtime is responsible for scheduling and management of the Android system.
The core libraries consist of two parts: one part is the function functions that the Java language needs to call, and the other part is the core libraries of Android.
The application layer and the application framework layer run in the virtual machine. The virtual machine executes the Java files of the application layer and the application framework layer as binary files. The virtual machine performs functions such as object lifecycle management, stack management, thread management, security and exception management, and garbage collection.
The system libraries may include multiple functional modules, for example: a surface manager, media libraries, a three-dimensional graphics processing library (e.g., OpenGL ES), and a 2D graphics engine (e.g., SGL).
The surface manager manages the display subsystem and provides fusion of 2D and 3D layers for multiple applications.
The media libraries support playback and recording of many common audio and video formats, as well as static image files. The media libraries can support multiple audio and video coding formats, for example MPEG4, H.264, MP3, AAC, AMR, JPG, and PNG.
The three-dimensional graphics processing library is used to implement three-dimensional graphics drawing, image rendering, composition, layer processing, and the like.
The 2D graphics engine is a drawing engine for 2D drawing.
The kernel layer is the layer between hardware and software. The kernel layer contains at least the display driver, the camera driver, the audio driver, and the sensor drivers.
For ease of understanding, the following embodiments of this application take an electronic device with the structures shown in FIG. 1 and FIG. 2 as an example, and describe the method for assisting fitness provided by the embodiments of this application in detail, with reference to the accompanying drawings and application scenarios.
The method for assisting fitness provided by the embodiments of this application can be applied to electronic devices such as televisions, mobile phones, tablet computers, wearable devices, in-vehicle devices, augmented reality (AR)/virtual reality (VR) devices, laptops, ultra-mobile personal computers (UMPC), netbooks, and personal digital assistants (PDA); the embodiments of this application place no limitation on the specific type of electronic device.
FIG. 3 is a schematic flowchart of a method for assisting fitness according to an embodiment of this application.
In step S301, the electronic device obtains a user action.
The user action may be a user action in real-time video captured by a camera. In other words, the electronic device may obtain the user action captured by the camera. The video containing the user action obtained by the electronic device may be called the user action video.
Before step S302 or step S303, the user may select a specific fitness action from at least one fitness action in an action evaluation indicator set. In the subsequent steps, the user action is processed according to that specific fitness action. The action evaluation indicator set may be determined based on professional knowledge. That is, the electronic device may obtain input information, and the input information is used to indicate the fitness action.
In step S302, the electronic device recognizes the user's skeletal nodes in the user action.
The electronic device may recognize the user's skeletal nodes in every frame of the user action, or in multiple frames at fixed or varying intervals within the user action. The user's skeletal nodes may be used to represent the user's limbs.
The user skeletal nodes may include all skeletal nodes on the user's body in the image. Alternatively, the user skeletal nodes may include one or more of: first key nodes corresponding to the first standard nodes of the specific fitness action in the action evaluation indicator set, second key nodes corresponding to the second standard nodes, third key nodes corresponding to the third standard nodes, and fourth key nodes corresponding to the fourth standard nodes. The first key nodes, second key nodes, third key nodes and fourth key nodes are all skeletal nodes on the user's body in the image.
In step S303, the electronic device matches the trajectory of the first key nodes.
The trajectory of the first key nodes among the user's skeletal nodes is matched according to the action evaluation indicator set.
Before step S303, the electronic device may also determine at least one of the horizontal direction and the vertical direction of the user action. From the horizontal or vertical direction, the direction of the position of the user's limbs, of position changes, of movement trajectories, and so on can be determined. A device such as the gyroscope sensor in the electronic device may be used to determine the horizontal and vertical directions of the user action during capture. For electronic devices that are rarely moved, such as televisions and desktop computers, default horizontal and vertical directions may be set for the captured image, and the horizontal and vertical directions of the user action may be determined according to at least one of the default horizontal and vertical directions.
For a specific fitness action, experts can use professional knowledge to determine at least one first standard node and the movement trajectory of each of the at least one first standard node during the fitness action, and save the trajectories of the first standard nodes in the action evaluation indicator set. The action evaluation indicator set includes the correspondence between at least one fitness action and the movement trajectory of at least one first standard node. The movement trajectory of a first standard node is the trajectory of a skeletal node of the human body in the standard performance of the specific fitness action. The skeletal nodes of the human body include the first standard nodes. A first standard node can be understood as a node of the user's limb.
A first key node is the skeletal node in the user action corresponding to a first standard node; in other words, a first key node is the skeletal node in the user action at the same body part as the first standard node.
In some embodiments, the electronic device may compare the trajectory of the first key nodes with the trajectory of the first standard nodes. If the similarity with the trajectory of the first standard nodes is greater than a similarity preset value, the match fails, the user is considered not to be performing a fitness action, and step S301 is performed to obtain the user action again. If the similarity with the trajectory of the first standard nodes is less than or equal to the similarity preset value, the match succeeds.
In other embodiments, the electronic device may judge whether the trajectory of the first key nodes conforms to a preset feature, that is, whether the trajectory of the first key nodes has the preset feature. The preset feature can also be understood as a preset condition. The electronic device may determine, from the user action, candidate actions that conform to the preset movement-trajectory feature of the limb corresponding to the first standard nodes.
The preset feature may be the shape of the trajectory, the periodicity of the trajectory, and so on, for example a periodic position-change pattern. The preset periodic position-change pattern of the first standard node can also be understood as the periodic position-change pattern of the first standard node.
Based on the periodicity of the position change of the first standard node, the electronic device can determine candidate actions. Different fitness actions may correspond to the same or different selection rules for candidate actions. According to the selection rule for candidate actions, the electronic device may take the segment of video corresponding to one period of the first key node's trajectory as a candidate action, that is, determine the user's candidate action within it. Alternatively, the electronic device may take as a candidate action a segment of the user action corresponding to the first key node being at a certain point or within a certain range of its trajectory.
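One way to realize the period-based selection rule above is to segment the first key node's height trace at successive local maxima (the standing positions in a squat). This is a simplified sketch under stated assumptions: the trace is already converted so that larger values mean higher, it is noise-free (a real trace would need smoothing), and the peak rule is illustrative.

```python
def candidate_segments(hip_y):
    """Split a hip-node height trace into candidate-action segments.

    Frame indices of successive local maxima delimit one period of the
    hip node's up-and-down movement, bounding one candidate action.
    """
    peaks = [i for i in range(1, len(hip_y) - 1)
             if hip_y[i] >= hip_y[i - 1] and hip_y[i] > hip_y[i + 1]]
    return [(peaks[k], peaks[k + 1]) for k in range(len(peaks) - 1)]

# One squat: the hip rises to frame 3, dips, and rises again to frame 9.
trace = [3, 6, 9, 10, 9, 6, 3, 6, 9, 10, 9, 6, 3]
segs = candidate_segments(trace)
```

Here `segs` is `[(3, 9)]`: the frames between the two standing peaks form one candidate action, which the later steps (S304 onward) would then check against the recognition condition.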
The action evaluation indicator set may include the preset feature, which is used to indicate the manner in which the position of the first standard node changes. The manner of position change may be the direction of motion of the first standard node, such as upward or downward movement, the whole of an up-and-down reciprocating process, or the upward portion of it. The manner of position change may also be the shape of the trajectory, such as a triangle, circle, arc or polyline, or a portion of the complete shape traced by the first standard node; for example, when the first standard node's trajectory is a triangle, the movement of the first standard node along one of its sides.
The determined candidate action may be, for example, the segment of the user action corresponding to the upward movement of the first key node during up-and-down reciprocation; or, where the first key node's trajectory is a triangle, the segment of the user action corresponding to its movement along one side of the triangle.
After the candidate action is determined, step S304 is performed. In steps S304-S307, the electronic device processes the candidate action.
For example, for the squat, the first standard node may be the hip node. The electronic device may recognize the user's hip node in each frame of the user action as the first key node. During a squat, the height of the hip node rises and falls periodically. The electronic device may match the trajectory of the first key node against that of the first standard node, determine the start and end points of a period of the hip node's up-and-down movement, and take the video segment between the start and end points of the period as the candidate action, or take the segment within one period in which the hip node moves upward or downward as the candidate action. That is, the preset trajectory feature may be one full period of the hip node's up-and-down movement, or the upward movement of the hip node within one period of its up-and-down movement.
For a specific fitness exercise, the trigger condition is that the trajectory of the first key node successfully matches the trajectory of the first standard node. When the trigger condition is met, step S304 is performed.
In step S304, the electronic device judges whether the user is performing the fitness action.
The electronic device may use a recognition condition to determine whether the user action in the video is the specific fitness action. The recognition condition is a preset condition.
The determined candidate action is processed, judging whether the user's candidate action satisfies the recognition condition of the specific action in the action evaluation indicator set. When the candidate action satisfies the recognition condition, it is determined that the candidate action is the specific action performed by the user. When the candidate action does not satisfy the recognition condition, it is determined that the candidate action is not the specific action performed by the user.
The recognition condition includes a condition to be satisfied by the second key nodes corresponding to the second standard nodes. The second standard nodes may include all or some of the first standard nodes, and the second standard nodes may also include other skeletal nodes. The recognition condition may include position-change information of the second standard nodes. The position-change information of the second standard nodes may include the range of position change of the second standard nodes in the video, or the range of change of the relative positions between second standard nodes.
The position-change information of the second standard nodes may be used to indicate the motion change range of a limb. The motion change range of a limb may include the range of change of the limb's angle, or the range of change of the relative positions between limbs. The range of change of the limb angle can be understood as the interval between the maximum and minimum of the angle between the limb and the horizontal or vertical direction during the user action. The range of change of the relative positions between limbs can be understood as the interval between the maximum and minimum distance between the limbs. The distance between two limbs may be determined from limb lengths and the like; for example, the distance between two limbs may be expressed as a multiple or proportion of the length of one of the limbs.
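The limb-angle change range just described can be computed directly from two keypoints per frame. A minimal sketch, assuming (x, y) keypoint coordinates from a pose estimator; the frame data here is invented for illustration.

```python
import math

def limb_angle(p1, p2):
    """Angle in degrees, in [0, 90], between the limb segment p1->p2
    (e.g. hip->knee for the thigh) and the horizontal direction."""
    dx, dy = p2[0] - p1[0], p2[1] - p1[1]
    return math.degrees(math.atan2(abs(dy), abs(dx)))

def motion_change_range(frames):
    """(min, max) of the limb angle over the frames of a candidate action.

    Each frame is a (p1, p2) pair of keypoint coordinates.
    """
    angles = [limb_angle(a, b) for a, b in frames]
    return min(angles), max(angles)

# Thigh going from vertical (standing) to horizontal (deep squat).
frames = [((0, 0), (0, -2)), ((0, 0), (2, -2)), ((0, 0), (2, 0))]
rng = motion_change_range(frames)
```

For these three frames the angles are 90, 45 and 0 degrees, so `rng` is `(0.0, 90.0)`; this interval is what the recognition condition later compares against its preset bands.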
The recognition condition may be determined based on professional knowledge. A second key node is the skeletal node in the user action corresponding to a second standard node; in other words, a second key node is the skeletal node in the user action at the same body part as the second standard node.
For example, for the squat, as the height of the hip node goes from peak to trough, the thigh angle changes gradually from vertical toward horizontal. The recognition condition of the squat may include the thigh-angle change satisfying a first preset range. The second standard nodes may include the hip node and the nodes at the two ends of the thigh (i.e., the knee node and the hip node); the thigh angle can be determined from the nodes at the two ends of the thigh in the candidate action. That is, when the thigh-angle change in the candidate action satisfies the first preset range, the user is considered to be performing a squat.
Referring to FIG. 8, judging that the user is performing a squat: in (a) of FIG. 8, the user stands; in (b) of FIG. 8, the user squats to the lowest position. During the squat, the user's hip node A moves up and down, and the angle between the thigh and the horizontal direction changes, that is, the thigh angle changes. The change of the thigh angle reflects the change of the thigh's motion.
For the squat, the preset feature in the action evaluation indicator set may be the downward portion of the up-and-down movement of hip node A. Based on this preset feature, the candidate action in the user action can be determined.
For the squat, the recognition condition in the action evaluation indicator set may include a range for the amplitude of the thigh angle.
The electronic device may judge whether the amplitude of the thigh angle in the candidate action satisfies the recognition condition, to determine whether to output guidance information.
The action evaluation indicator set may include the position on the trajectory corresponding to the maximum thigh angle, and the position on the trajectory corresponding to the minimum thigh angle. Alternatively, the action evaluation indicator set may include the position of the hip node on the trajectory corresponding to the start position of the thigh's motion, or the position of the hip node on the trajectory corresponding to the end position of the thigh's motion. From the trajectory of hip node A, the thigh position in the first frame of the candidate action can be determined as the thigh's motion start position, and the thigh position in the last frame of the candidate action as the thigh's motion end position.
From hip node A and knee node B, the electronic device can determine the thigh angle. As the user action proceeds, the electronic device determines whether the thigh's motion change satisfies the maximum and minimum requirements of the preset motion change range. That is, the electronic device determines the range of the thigh's motion change, such that the maximum of that range satisfies the requirement on the maximum of the preset motion change range, and the minimum of that range satisfies the requirement on the minimum of the preset motion change range.
The maximum requirement may be an interval: the electronic device judges that the maximum of the thigh's motion change range lies within the interval for the maximum of the preset motion change range. The minimum requirement may likewise be an interval: the electronic device judges that the minimum of the thigh's motion change range lies within the interval for the minimum of the preset motion change range.
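The two-interval check just described reduces to a few comparisons. The band values below are illustrative assumptions for a squat, not thresholds taken from the patent.

```python
def meets_recognition_condition(angle_range, min_band, max_band):
    """True when an observed (min, max) limb-angle range satisfies the
    preset bands: the observed minimum must fall inside min_band and
    the observed maximum inside max_band."""
    lo, hi = angle_range
    return (min_band[0] <= lo <= min_band[1]) and (max_band[0] <= hi <= max_band[1])

# Illustrative squat bands: the thigh should bottom out between 0 and
# 45 degrees and return to between 70 and 90 degrees (near vertical).
ok = meets_recognition_condition((20, 85), (0, 45), (70, 90))
shallow = meets_recognition_condition((50, 85), (0, 45), (70, 90))
```

`ok` is True (a recognizable squat) while `shallow` is False: the user never got the thigh angle below 45 degrees, so the candidate action is filtered out rather than scored.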
In the figure, dashed lines indicate the range of thigh directions corresponding to the minimum band of the preset motion change range, and the range of thigh directions corresponding to the maximum band. When, in the user action, the thigh's direction at the smallest thigh-to-ground angle lies within the directions corresponding to the minimum band, and the thigh's direction at the largest thigh-to-ground angle lies within the directions corresponding to the maximum band, the electronic device may determine that the user is performing a squat.
For the user in the figure, the maximum of the thigh's motion change lies within the maximum band of the preset motion change range, and the minimum of the thigh's motion change lies within the minimum band; that is, the user's thigh motion change satisfies both the maximum and minimum requirements of the preset motion change range. If there is no other recognition condition, it can be determined that the user in the figure is performing a squat.
The recognition condition of the squat may also include a range of change of the relative distance of the hip node with respect to the ankle node. The second standard nodes may include the hip node, the knee node and the ankle node. From the distance between the hip node and the knee node, and the distance between the knee node and the ankle node, the range of change of the relative distance of the hip node with respect to the ankle node is determined. The relative distance of the hip node with respect to the ankle node may be expressed as the distance between the hip node and the ankle node divided by the sum of the hip-knee distance and the knee-ankle distance, that is, the ratio of the hip-ankle distance to the leg length. The interval between the minimum and maximum of this ratio during the user action is the range of change of the relative distance of the hip node with respect to the ankle node.
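The normalized hip-ankle distance above is scale-free, so it works regardless of how large the user appears in the frame. A minimal sketch with invented coordinates:

```python
import math

def hip_ankle_ratio(hip, knee, ankle):
    """Distance from hip to ankle divided by the leg length
    (hip-knee plus knee-ankle): the relative distance described above."""
    def dist(p, q):
        return math.hypot(p[0] - q[0], p[1] - q[1])
    return dist(hip, ankle) / (dist(hip, knee) + dist(knee, ankle))

standing = hip_ankle_ratio((0, 2), (0, 1), (0, 0))    # straight leg
deep_squat = hip_ankle_ratio((1, 1), (1, 0), (0, 0))  # folded leg
```

For the straight leg the ratio is exactly 1.0; in the folded configuration it drops to about 0.71. The interval swept between the two is the change range the recognition condition would test.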
If the judgment result is that the user is performing the fitness action, step S305 is performed; otherwise, step S301 is performed to obtain the user action again.
When it is determined that the user is performing a fitness action, different guidance systems can be designed for different fitness actions to evaluate how standard the fitness action is. The guidance system of each action may include core indicators and secondary indicators. The core indicators are used to determine the basic score and guidance for the user action; the secondary indicators are used to correct the score and provide comprehensive guidance.
In step S305, the electronic device scores and evaluates according to the core indicators.
The action evaluation indicator set includes the core indicators. The core indicators may include evaluation standards for the position information of certain limbs. Limbs are represented by skeletal nodes; that is, the core indicators may include evaluation standards for the third standard nodes. According to the core indicators, the score and evaluation corresponding to the third standard nodes in the user action are obtained.
The position information of a limb may include one or more of the limb's motion change range, the limit values of the limb's motion change range, and the limb's movement trajectory. The limit value of the motion change range is the maximum or minimum of the motion change range.
The core indicators may include the correspondence between the limb's motion change range and evaluation information, the correspondence between the limit values of the limb's motion change range and evaluation information, and the correspondence between the limb's movement trajectory and evaluation information.
Scoring and evaluating according to the core indicators may score and evaluate the relative positions and position changes of the third key nodes corresponding to the third standard nodes in the candidate video. A third key node is the skeletal node in the user action corresponding to a third standard node; in other words, a third key node is the skeletal node in the user action at the same body part as the third standard node.
The core indicators may include correspondences between multiple kinds of position information of the third standard nodes and multiple scores and/or evaluations. The core indicators may also include correspondences between multiple kinds of position information of the third standard nodes and multiple amounts by which the score is increased or decreased, and/or correspondences between multiple kinds of position information of the third standard nodes and multiple evaluations. According to the position information of the third standard nodes, the score can be determined by increasing or decreasing a preset score. An evaluation can also be understood as a guidance suggestion.
The position information of the third standard nodes may include the relative positions between the third standard nodes and/or the position change of each third standard node. The third standard nodes may include all or some of the first standard nodes, may include all or some of the second standard nodes, and may also include other skeletal nodes. The third standard nodes may be some of the first standard nodes and the second standard nodes.
The electronic device may determine one or more frames in the candidate action or the user action according to the trajectory of the first key nodes, or according to the trajectory of the third key nodes. The electronic device may obtain the corresponding score or evaluation according to the relative positions of the third key nodes in those one or more frames, and may determine the way of selecting the one or more frames according to the fitness action. The relative positions of the third key nodes can reflect the positions of the user's limbs, such as the limb's angle and the distances between limbs.
The electronic device may obtain, according to the trajectory of the third key nodes, the score and/or evaluation corresponding to that trajectory.
The electronic device may determine, in the candidate action or the user action, the range of change of the third key nodes' positions, and may obtain the score and/or evaluation corresponding to that range of change or to its limit values.
For example, for the squat, the core indicators may include the correspondence between the minimum of the thigh-angle range and the score, and the correspondence between the thigh-angle range and the evaluation. When the minimum angle between the thigh and the ground is less than 75 degrees, the user is considered to have completed one squat; when the angle is 0 degrees, this indicator is considered to have reached the best completion. The range where the minimum thigh-to-ground (i.e., horizontal) angle is less than 75 degrees may be divided into several intervals (for example, four intervals: below 0 degrees, 0-25 degrees (excluding 25 degrees), 25-50 degrees (excluding 50 degrees), and 50-75 degrees (excluding 75 degrees)), each interval corresponding to a different score and to the same or different evaluations. When the thigh-to-ground angle determined from the third key nodes is a certain value, the interval to which the value belongs is determined, and the corresponding score and evaluation are determined, so as to provide corresponding guidance to the user (for example, if the minimum thigh-to-ground angle determined from the third key nodes is 32 degrees, the interval it belongs to is 25-50 degrees, the score corresponding to that interval is 80 points, and the evaluation corresponding to that range is "squat a bit deeper").
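The interval-to-score lookup in the squat example can be sketched as a small table scan. The interval boundaries follow the example above (below 75 degrees counts as a completed squat; smaller is better); the score values and tip texts are illustrative assumptions.

```python
def squat_feedback(min_thigh_angle):
    """Map the minimum thigh-to-ground angle of one squat to (score, tip),
    or None when the squat is too shallow to count."""
    if min_thigh_angle >= 75:
        return None  # not recognized as a completed squat
    bands = [
        (0, 100, "Excellent depth."),
        (25, 90, "Good, go slightly deeper."),
        (50, 80, "Squat a bit deeper."),
        (75, 60, "Much deeper: bring the thighs toward parallel."),
    ]
    for upper, score, tip in bands:
        if min_thigh_angle < upper:
            return score, tip
```

For the worked example in the text, `squat_feedback(32)` lands in the 25-50 degree interval and returns the 80-point score with the "squat a bit deeper" tip, while `squat_feedback(80)` returns None and triggers no evaluation.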
In step S306, the electronic device scores and evaluates according to the secondary indicators.
The action evaluation indicator set includes the secondary indicators. The secondary indicators may include evaluation standards for the fourth standard nodes. Scoring and evaluating according to the secondary indicators, the electronic device may score and evaluate the relative positions and position changes of the fourth key nodes in the candidate video. A fourth key node is the skeletal node in the user action corresponding to a fourth standard node; in other words, a fourth key node is the skeletal node in the user action at the same body part as the fourth standard node.
The secondary indicators may include correspondences between the position information of the fourth standard nodes and score deductions and/or evaluations. This can also be understood as the secondary indicators including correspondences between the position information of the limbs corresponding to the fourth standard nodes and score deductions and/or evaluations. When the user action fails to meet a secondary indicator's threshold, the score calculated from the core indicators can be reduced, and corresponding guidance can be determined.
The position information of the fourth standard nodes may include the relative positions between the fourth standard nodes and/or the position change of each fourth standard node. The fourth standard nodes may include all or some of the third standard nodes, and the fourth key nodes may also include skeletal nodes other than the third standard nodes. The position change may be, for example, the amplitude of the change, the range of the change, the movement trajectory, and so on.
For example, for the squat, the secondary indicators may include the correspondence between shank-angle ranges and scores, and the correspondence between shank-angle ranges and evaluations. The fourth standard nodes may include the knee node and the ankle node; from the relative position of the knee node and the ankle node, the electronic device can determine the shank angle. The secondary indicators may also include the correspondence between torso-angle ranges and scores, the correspondence between torso-angle ranges and evaluations, and so on. A secondary indicator may require the angle between the shank and the ground to be greater than 50 degrees. When the candidate action does not meet a secondary indicator, a corresponding evaluation can be provided to guide the user, for example reminding the user not to lean the shanks forward excessively.
For different fitness actions, the evaluation indicators can be completely different, and each limb's movement has a different influence in different fitness actions (for example, in a squat the arm movement is usually irrelevant, while in a dumbbell curl the arm movement is the core and the lower-limb movement is unimportant). Judging, evaluating and guiding every limb's movement would deliver too much useless information to the user and hurt the user experience. Through steps S305-S306, the evaluation corresponding to the specific fitness exercise can be determined, providing the user with effective guidance.
In step S307, the electronic device feeds back the score and evaluation.
The electronic device may output feedback information including the score determined in step S306 and the evaluations determined in steps S305-S306, thereby feeding the score and evaluation back to the user, for example by popping up text on the screen or playing a prompt voice.
When the user's scenario is unsuitable for playing the voice report over the loudspeaker, the electronic device may play the score and/or evaluation through a Bluetooth headset, or display the score and/or evaluation on the screen. Playing over the loudspeaker means playing through the speaker.
For example, when image recognition determines that the user action includes multiple people, the electronic device may display the score and/or evaluation on the screen.
When determining not to play the score and/or evaluation through the speaker, the electronic device may perform image recognition to determine whether the exercising user is wearing a Bluetooth headset. When it is determined that the user is wearing a Bluetooth headset and the headset is connected to the fitness-assisting apparatus via Bluetooth, the score and/or evaluation are played through the Bluetooth headset.
The electronic device may also obtain feedback-manner indication information, which indicates the manner of feedback, that is, the manner of outputting the score and/or evaluation.
The electronic device may count the user's fitness actions according to the trajectory of the first key node and record the number of fitness actions the user has completed. The electronic device may save, in the action evaluation indicator set, the correspondence between the number of periods of the first standard node and the number of fitness actions.
For example, one period of the first key node's trajectory corresponds to one completed fitness action. For different fitness actions, multiple periods of the first key node's trajectory may correspond to one completed fitness action, or one period of the first key node's trajectory may correspond to one completed fitness action.
The electronic device may also count the user's fitness actions according to the trajectory of the second key nodes or the third key nodes.
Through steps S301-S304, the electronic device can determine multiple candidate actions. From the number of candidate actions, the electronic device can determine the number of times the user has completed the fitness action, and the number of completed fitness actions can be fed back to the user.
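The period-based counting described above can be sketched as a hysteresis counter on the first key node's height trace: one repetition is the hip leaving the rest level, dropping below the depth level, and returning. The two threshold values are illustrative assumptions.

```python
def count_reps(hip_height, rest_level, depth_level):
    """Count repetitions from the first key node's height trace.

    A rep is counted only after the node has gone below depth_level
    and then come back up to rest_level, so jitter around a single
    threshold cannot double-count.
    """
    reps, down = 0, False
    for h in hip_height:
        if not down and h <= depth_level:
            down = True            # descended into the squat
        elif down and h >= rest_level:
            down = False           # returned to standing: one full period
            reps += 1
    return reps

# Two squats of different depth and tempo.
trace = [10, 6, 2, 6, 10, 9, 3, 1, 4, 10, 10]
reps = count_reps(trace, rest_level=9, depth_level=3)
```

`reps` is 2 for this trace; the same counter reports 0 for a user who merely stands still, which is the filtering behavior the text aims for.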
通过最核心的动作评判规则,即识别条件,确定用户进行的是特定的健身动作。当用户的动作满足识别条件时,即使其他部位很不标准,也认为用户在尝试学习该健身动作,只是动作不标准。而如果用户的动作不满足识别条件,即使肢体与该健身动作的标准动作相似度很高,也不是健身动作,避免了健身动作的误识别问题。因此,本申请实施例能够准确识别用户动作是否为健身动作。即使用户的动作不标准,通过本申请实施例提供的方法,能够确定用户进行健身动作。
通过第一关键节点运动轨迹与第一标准节点运动轨迹的匹配,可以确定用户进行健身动作的起止时间,从而能够度该起止时间内的用户动作进行判断。判断用户是否进行健身动作并对用户的动作进行评分,即通过特定健身动作的最低完成指标即识别条件判断用户是否进行健身动作。将不满足该识别条件的动作过滤掉,即对于用户的非健身动作不进行评分和评价,仅对满足该识别条件的动作进行评分和评价,对于该动作为用户提供指导。
由于环境限制,可能存在部分肢体超出屏幕的情况,即用户动作可能不包括用户的一些骨骼节点。
在步骤S303之前,电子设备可以确定第一关键节点是否包括全部的第一标准节点,第二关键节点是否包括全部的第二标准节点。
若第一关键节点包含第一标准节点中的所有节点,且第二关键节点包含第二标准节点中的所有节点,进行步骤S303。
若第一关键节点不包含第一标准节点中的任一个节点,或第二关键节点不包含第二标准节点中的任一个节点,进行步骤S301,重新获取用户动作。这种情况下,电子设备可以输出第一提示信息,第一提示信息用于提示用户调整用户动作采集的用户的身体的范围。
第一提示信息可以用于提醒用户调整与摄像头之间的相对位置。第一提示信息还可以包括第一关键节点不包含的第一标准节点和/或第二关键节点不包含的第二标准节点的信息，或者，第一提示信息还可以包括全部第一标准节点和第二标准节点的信息，以提醒用户调整与摄像头之间的相对位置，使得用户动作包括第一标准节点和第二标准节点中的全部节点，第一标准节点和第二标准节点不超出用户动作采集的范围。
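上述"关键节点是否包含全部标准节点"的判断可以用如下代码草图示意。其中节点名称仅为示例，提示文案为便于说明的假设：

```python
# 示意性草图：判断识别到的关键节点是否包含全部标准节点，
# 缺失时据此生成第一提示信息（节点名称与提示文案均为示例）

def check_nodes(detected: set, required: set):
    """返回(是否齐全, 缺失节点集合)。缺失集合可用于生成提示信息。"""
    missing = required - detected
    return (not missing), missing

detected = {"hip", "knee"}
required = {"hip", "knee", "ankle"}
ok, missing = check_nodes(detected, required)
if not ok:
    print("请调整与摄像头的相对位置，确保以下部位在画面内：", sorted(missing))
```

同一函数可分别用于第一、第二标准节点（步骤S303之前）以及第三、第四标准节点（步骤S305、S306之前）的检查。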
在步骤S305之前,电子设备可以确定备选动作中是否存在第三标准节点,即第三关键节点是否包含全部的第三标准节点。
在一些实施例中，若备选动作中不存在第三标准节点中的任何一个节点，可以进行步骤S301，重新获取用户动作。若备选动作中存在第三标准节点中的所有节点，进行步骤S305。如果第三标准节点是第一标准节点和第二标准节点中的全部或部分节点，在确定第一关键节点包括全部的第一标准节点，第二关键节点包括全部的第二标准节点时，电子设备就可以确定备选动作中存在第三标准节点中的所有节点。
在另一些实施例中,若不存在第三标准节点中的任何一个节点,电子设备可以确定备选动作的评分为最低评分。若存在第三标准节点中的全部或部分节点,可以进行步骤S305。
对于备选动作中不存在的第三标准节点,可以在确定评分时考虑该节点的影响,即扣除该节点对应的评分。可以不再确定不存在的第三标准节点对应的评价。
若不存在第三标准节点中的全部或部分节点,可以输出第二提示信息,第二提示信息用于提示用户调整用户动作采集的用户的身体的范围。也就是说,第二提示信息用于提醒用户调整与摄像头之间的相对位置。第二提示信息可以包括第三关键节点不包括的第三标准节点的信息,以提醒用户调整与摄像头之间的相对位置,使得用户动作包括第三标准节点中的全部节点,第三关键节点与第三标准节点一一对应,第三标准节点对应的身体部位不超出用户动作采集的空间范围。
在步骤S306之前，电子设备可以确定备选动作中是否存在第四标准节点。若备选动作中不存在第四标准节点中的任何一个节点，可以确定备选动作的评分为最低评分。若存在第四标准节点中的全部或部分节点，可以进行步骤S306。对于备选动作中缺少部分第四标准节点的情况，可以适当降低总评分。电子设备可以根据动作评价指标集合中缺失的第四标准节点对应的评分减少量，降低通过核心指标计算出的评分。
若备选动作中缺少第四标准节点中的全部或部分节点，电子设备可以输出第三提示信息，第三提示信息用于提示用户调整用户动作采集的用户的身体范围。也就是说，第三提示信息用于提醒用户调整与摄像头之间的相对位置。第三提示信息可以包括第四关键节点不包括的第四标准节点的信息，以提醒用户调整与摄像头之间的相对位置，使得用户动作包括第四标准节点中的全部节点，第四标准节点不超出用户动作采集的范围。
对于备选动作中不存在的第四标准节点，即第四关键节点不包含的第四标准节点，电子设备可以在确定评分时考虑该节点的影响，即扣除该节点对应的评分。电子设备可以不再确定不存在的第四标准节点对应的评价。
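对缺失的第四标准节点按"评分减少量"扣分的做法可以用如下代码草图示意。其中节点名称与减少量数值均为便于说明的假设：

```python
# 示意性草图：对备选动作中缺失的第四标准节点，按动作评价指标集合中
# 记录的评分减少量扣分（节点名称与减少量均为假设值）

def deduct_for_missing(core_score: float, detected: set, reductions: dict):
    """reductions: 第四标准节点 -> 缺失时的评分减少量。
    对每个缺失的节点扣除相应分数，并不再确定其对应的评价。"""
    score = core_score
    for node, reduction in reductions.items():
        if node not in detected:
            score -= reduction
    return max(score, 0)  # 评分不低于最低分0

reductions = {"ankle": 5, "shoulder": 3}
print(deduct_for_missing(80, {"hip", "knee", "shoulder"}, reductions))  # 75
```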
电子设备可以输出第四提示信息，第四提示信息用于提醒用户反馈信息不完整，未对所有标准节点进行评价。未进行评价的标准节点可以是第三标准节点或第四标准节点。也就是说，在能够识别用户进行健身动作的情况下，确定第四关键节点信息中缺少部分第四标准节点时，电子设备可以对用户进行提醒。
通过上述方式,提供了用户部分肢体不在用户动作内时的容错机制。对于基于图像的健身辅助场景,尤其是在家庭中的使用场景,由于空间有限,以及图像获取设备通常位置和角度是固定的,导致用户在做健身动作的过程中,部分肢体可能在屏幕外无法识别(比如距离较近,脚踝在屏幕外看不到)。当部分肢体由于环境限制导致超出用户动作范围,即屏幕范围或视频采集范围时,采用本申请实施例提供的方法,仍能对用户进行的健身动作进行计数、评分和评价,避免用户部分骨骼节点不在图像识别范围内造成的识别和评价错误。
对于每个健身动作,建立一个可以识别该动作的最小关节点集,即第一关键节点和第二关键节点组成的点集。当用户动作中存在该最小关节点集中的所有节点时,可以识别用户进行健身动作,可以对用户完成的健身动作进行计数。
对于每个健身动作,建立一个基础指导点集,即第三标准节点组成的点集。当用户动作中存在该基础指导点集中的所有节点时,可以对用户进行的健身动作进行基本的评分和评价。基础指导点集可以与最小关节点集相同或不同。如深蹲运动,基础指导点集与最小关节点集均可以由臀部节点和膝盖节点组成。
如果用户动作包含基础指导点集中的全部节点时,电子设备可以为用户提供基础的评分和指导。
如果用户动作中不包括基础指导点集和/或最小关节点集中的全部节点,电子设备可以输出提醒信息,以提醒用户调整用户动作的采集范围,从而使得采集的用户动作包括基础指导点集和/或最小关节点集中的全部节点。
对于每个健身动作，建立一个扩展指导点集，即第四标准节点组成的点集。当第四标准节点不是全部位于用户动作中时，无法确定用户的动作是否满足次要指标，可以不再输出次要指标对应的评价，并可以适当降低评分。
通过步骤S301-S307,能够准确判断用户是否进行健身动作,记录用户完成动作的次数,评价用户完成动作的质量,识别出错误的局部动作,并给予用户反馈和指导。
应当理解,位置变化范围、运动轨迹等与用户肢体的位置相关的信息,可以用于指示二维空间的位置,即在用户动作中的位置,也可以用于指示三维空间中的位置,即根据用户动作确定的在三维空间中的位置,本申请实施例对此不作限定。
图4是本申请实施例提供的一种辅助健身的方法的示意图。该方法可以由辅助健身装置执行，辅助健身装置是一种电子设备。辅助健身装置包括摄像头、处理器、存储器、显示器/扬声器/蓝牙通信模块等。处理器包括CPU，还可以包括GPU、NPU等。
在步骤S201,摄像头获取用户动作视频。
在步骤S202,CPU/GPU/NPU运行骨骼节点识别算法,以识别用户动作视频中的用户的骨骼节点。
在步骤S203,存储器存储动作评价指标集合。存储器例如可以是ROM。
在步骤S204,CPU判断健身动作并进行评分和评价。CPU根据存储器存储的动作评价指标集合和识别的骨骼节点,判断用户是否进行健身动作,并在确定用户进行健身动作时,对用户的动作进行评分和评价。
在步骤S205,显示器/扬声器/蓝牙耳机等,输出反馈信息。蓝牙通信模块可以将反馈信息发送至蓝牙耳机,蓝牙耳机可以输出反馈信息。反馈信息可以包括用户的动作的评分和评价。通过显示器输出反馈信息,图形用户界面可以参见图5。
图5是本申请实施例提供的一种图形用户界面(graphical user interface,GUI)的示意图。本申请以辅助健身装置是手机为例进行说明。
图5中的(a)图示出了手机的解锁模式下,手机的屏幕显示系统显示了当前输出的界面内容501,该界面内容501为手机的主界面。该界面内容501显示了多款第三方应用程序(application,App),例如支付宝、任务卡商店、微博、相册、微信、卡包、设置、健身等。应理解,界面内容501还可以包括其他更多的应用程序,本申请对此不作限定。
当手机检测到用户点击主界面501上的健身应用的图标502的操作后,可以启动健身应用,显示如图5中的(b)图所示的健身应用界面503。健身应用界面503上可以包括多种健身动作。
当手机检测到用户点击健身应用界面503上的一种健身动作后,可以显示如图5中的(c)图所示的该健身动作的指导界面。指导界面中可以包括标准的健身动作504,摄像头实时采集的用户动作即用户动作视频505,以及评价即指导信息506,用户动作计数507等。
图6是本申请实施例提供的一种图形用户界面的示意图。电视机可以显示如图6所示的该健身动作的指导界面。
应当理解,图5和图6仅是示例性的说明,其他具有显示功能的电子设备如平板电脑、个人计算机的显示器等也可以显示如图5中的(c)图、图6所示的该健身动作的指导界面。
图7是本申请实施例提供的一种辅助健身的方法的流程示意图。
在步骤S601,电子设备获取用户动作。
电子设备可以获取用户动作的实时图像。或者说,电子设备可以获取用户动作视频。用户动作视频中,用户正在进行用户动作。用户动作视频可以是电子设备实时采集的图像形成的视频。
电子设备可以识别所述用户动作中的关节点,以确定所述用户动作中的所述第一肢体和所述第二肢体。
电子设备可以识别用户动作中的用户的骨骼节点。骨骼节点也可以称为关节点。用户的骨骼节点可以表示用户的肢体。通过用户的骨骼节点位置的变化,电子设备可以确定用户的肢体的位置、动作等。或者,也可以通过其他方式,识别用户的肢体。肢体也可以理解为身体部位。
电子设备可以根据关节点识别第一肢体和第二肢体。
在步骤S602,电子设备确定所述用户动作中第一肢体的运动轨迹。
第一肢体可以是身体的某个或某些部位。
第一肢体的运动轨迹可以指第一肢体从开始位置到结束为止所经过的路线组成的动作的空间特征。运动轨迹可以包括运动轨迹方向、运动轨迹形状等。第一肢体的运动轨迹方向可以是第一肢体在进行用户动作时所形成的移动方向。运动轨迹形状可以是直线、曲线或者两者的组合。
电子设备可以根据第一肢体的运动轨迹,从用户动作中确定备选动作。电子设备判断第一肢体的运动轨迹是否满足第一预设条件,将满足第一预设条件的运动轨迹对应的一段用户动作作为备选动作。备选动作即第一肢体的运动轨迹满足第一预设条件的用户动作。
第一预设条件也可以理解为对第一肢体的预设的位置变化特征。在用户动作中,电子设备可以确定所述用户的第一肢体的运动轨迹符合预设的位置变化特征的一段用户动作作为备选动作。即电子设备可以从用户动作中确定符合所述第一肢体的预设运动轨迹特征的备选动作。电子设备可以根据所述第一肢体的运动轨迹的特征,从所述用户动作视频中确定备选动作。
预设的位置变化特征可以包括运动轨迹的形状、运动轨迹的周期性等中的一种或多种。
电子设备可以选取所述第一肢体的运动轨迹符合预设特征的一段视频作为备选动作,即备选动作是运动轨迹具有某一特征的一段视频。
例如电子设备可以根据第一肢体的运动轨迹的周期性，从用户动作中确定备选动作。或者，电子设备可以选取第一肢体的运动轨迹与预设轨迹的相似度大于预设值的一段用户动作作为备选动作。
备选动作可以是对应于第一肢体的运动轨迹一个周期的视频,可以是对应于第一肢体的运动轨迹一个周期中的某一段的视频或图像,例如一个周期中第一肢体位于特定位置范围的轨迹对应的视频或图像。备选动作的选取方式可以是根据特定的健身动作确定的。
预设特征可以用于指示第一肢体对应的骨骼节点的位置变化的方式。第一肢体的位置变化的方式可以是第一肢体的运动方向变化。
第一肢体的运动方向变化例如可以是如向上或向下运动,或者上下往复运动等。第一肢体的运动方向,也可以理解为第一肢体的全部或部分骨骼节点的运动方向。第一关键节点为第一肢体中的骨骼节点。第一关键节点的位置变化的方式也可以是运动轨迹的形状,如运动轨迹为三角形、圆形、弧形、折线形等形状。
第一肢体的位置变化的方式还可以是第一肢体的角度变化，即第一肢体动作的角度变化，例如第一肢体的节点之间的方向改变、相对位置改变。
第一肢体的运动轨迹可以理解为第一肢体中的第一关键节点的运动轨迹。确定的备选动作例如可以是第一关键节点在上下往复运动过程中,向上运动的一段对应的用户动作,或者第一关键节点运动轨迹为三角形,第一关键节点在三角形的某一条边运动时对应的一段用户动作。
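从用户动作中按第一肢体运动轨迹截取备选动作的过程，可以用如下代码草图示意。这里以"先下降再回升的往复运动"作为第一预设条件的示例，幅度阈值为便于说明的假设值：

```python
# 示意性草图：按第一肢体运动轨迹是否满足第一预设条件（此处以
# "下降后回升"的往复运动为例）从用户动作序列中截取备选动作
# 幅度阈值min_drop为假设值

def find_candidate(y, min_drop=0.2):
    """y: 每帧第一关键节点的纵坐标。
    返回第一段"下降后回升"运动对应的帧区间(start, end)，找不到返回None。"""
    start = 0
    top = y[0]
    bottom_i = None
    for i, v in enumerate(y):
        if bottom_i is None:
            if v > top:
                top, start = v, i        # 更新下降前的最高点
            elif top - v >= min_drop:
                bottom_i, bottom = i, v  # 已下降足够幅度
        else:
            if v < bottom:
                bottom_i, bottom = i, v
            elif v - bottom >= min_drop:
                return start, i          # 回升完成，截取备选动作
    return None

y = [1.0, 0.9, 0.5, 0.3, 0.5, 0.9]
print(find_candidate(y))  # (0, 4)
```

截取出的帧区间即一段备选动作，后续步骤在该区间内判断第二肢体的动作变化。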
在步骤S603,电子设备确定第二肢体在所述备选动作中的动作变化幅度。
电子设备可以在备选动作中确定第二肢体的动作变化幅度。
第二肢体的动作变化幅度，也可以理解为第二肢体的位置变化幅度，即第二肢体的位置变化最大范围之间的差值。动作变化幅度可以包括变化角度、变化距离等。也就是说，第二肢体的动作变化幅度可以是第二肢体与水平方向或竖直方向的夹角的最大值与最小值的差值，也可以是第二肢体在位置变化过程中经过的各个位置之间的最远距离，还可以是第二肢体各部位之间距离变化量的最大值。
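以"夹角的最大值与最小值的差值"这一种动作变化幅度为例，可以用如下代码草图示意，输入的角度序列为假设数据：

```python
# 示意性草图：计算第二肢体在备选动作中的动作变化幅度
# 以每帧"第二肢体与水平方向夹角"序列为输入，角度数据为假设值

def motion_amplitude(angles):
    """动作变化幅度取夹角最大值与最小值之差。"""
    return max(angles) - min(angles)

angles = [10.0, 35.0, 62.0, 40.0, 12.0]
print(motion_amplitude(angles))  # 52.0
```

对于以距离衡量的动作变化幅度，可将角度序列替换为逐帧的位置坐标并改用距离计算。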
电子设备可以判断所述用户动作中第二肢体的动作变化满足预设动作变化范围的最大值要求和最小值要求,以确定所述用户动作为健身动作。
第二肢体可以包括一个或多个身体部位。第二肢体可以包括第一肢体中的全部或部分。第二肢体也可以是用户的第一肢体之外的肢体。
电子设备还可以判断所述用户动作满足预设的第二肢体的动作变化范围,以确定所述用户动作为健身动作。
第二肢体在所述备选动作中的动作变化范围,可以包括第二肢体在备选动作中的动作变化幅度。第二肢体在所述备选动作中的动作变化范围,可以包括所述第二肢体的在所述备选动作中的动作起始位置,所述第二肢体的在所述备选动作中的动作终止位置中的至少一种。
可以根据第一肢体的运动轨迹确定第二肢体在所述备选动作中的动作起始位置和动作终止位置。例如，第二肢体在所述备选动作中的动作起始位置可以是备选动作第一帧图像中第二肢体的位置，第二肢体在所述备选动作中的动作终止位置可以是备选动作最后一帧图像中第二肢体的位置。或者，可以确定第一肢体位于其运动轨迹上的某一点对应的备选动作中的图像中第二肢体的位置为第二肢体的动作起始位置或动作终止位置。
电子设备可以保存有第一肢体在其运动轨迹上的位置与第二肢体的动作起始位置或动作终止位置的对应关系。
备选动作中的所述第二肢体的位置的变化范围可以是在备选动作中第二肢体的位置的变化幅度对应的范围。通过备选动作的选取,使得第二肢体的位置的变化幅度为第二肢体在备选动作中的动作起始位置和动作终止位置对应的差值。或者,备选动作中的所述第二肢体的位置的变化范围用于指示所述第二肢体的位置在所述备选动作的最后一帧图像相对于所述备选动作的第一帧图像的变化。
预设动作变化范围的最大值要求,即预设动作变化范围的最大值的区间范围;预设动作变化范围的最小值要求,即预设动作变化范围的最小值的区间范围。当第二肢体的动作变化范围的最大值在预设动作变化范围的最大值区间内,且第二肢体的动作变化范围的最小值在预设动作变化范围的最小值区间内,可以认为用户动作为健身动作,即该用户进行健身动作。
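按"最大值要求"和"最小值要求"识别健身动作的判断逻辑，可以用如下代码草图示意。其中两个预设区间的数值均为便于说明的假设：

```python
# 示意性草图：判断第二肢体动作变化范围的最大值、最小值是否分别
# 落在预设区间内，从而确定用户动作为健身动作（区间数值为假设）

def is_fitness_action(angle_track, max_range=(60, 120), min_range=(0, 20)):
    """angle_track: 备选动作中第二肢体夹角的逐帧序列。
    最大值落在max_range内且最小值落在min_range内时，识别为健身动作。"""
    hi, lo = max(angle_track), min(angle_track)
    return max_range[0] <= hi <= max_range[1] and min_range[0] <= lo <= min_range[1]

print(is_fitness_action([10, 40, 80, 30, 12]))  # True
print(is_fitness_action([10, 30, 40, 30, 12]))  # False
```

第二个示例中动作幅度不足（最大值未达到预设区间），因此不识别为健身动作，对应正文中过滤非健身动作的情形。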
第二肢体可以包括一个或多个身体部位。电子设备可以根据第二肢体的位置的变化的范围,确定用户是否进行健身动作。电子设备可以根据用户动作获取第二肢体的位置的变化范围。根据保存的健身运动对应识别条件,电子设备确定用户是否进行健身动作。
识别条件可以是第二肢体的位置的变化的范围等。
动作变化范围可以包括位置的变化方式。位置的变化的方式例如可以是上下运动、水平运动、环形移动、角度变化等。第二肢体的动作变化范围可以包括第二肢体角度变化的变化范围,也可以包括第二肢体之间相对位置的变化范围。
电子设备可以识别所述用户动作中的骨骼节点,以确定所述用户动作中的第二肢体。电子设备可以判断所述用户动作中的第二肢体满足所述预设的第二肢体的动作变化范围。
电子设备可以根据第二肢体的位置的变化范围,确定用户进行的健身运动的种类。根据第二肢体的位置的变化满足的识别条件,可以确定该识别条件对应的健身运动的种类。
电子设备可以在步骤S601之前,或者在步骤S602之前,获取输入信息,输入信息用于指示健身动作。所述电子设备根据所述输入信息确定第一预设条件,即第一肢体的运动轨迹满足的预设条件。
电子设备可以确定与所述健身动作对应的第一肢体、第二肢体、第三肢体、第四肢体等中的至少一种。
根据输入信息,电子设备可以确定健身运动。也就是说,电子设备可以根据输入信息,确定健身运动的种类。对于不同的健身动作,第二肢体可以是不同的身体部位。识别条件可以是第二肢体的位置的变化范围。因此,对于不同的健身动作,识别条件可以不同。电子设备可以根据该种类的健身运动对应识别条件,确定用户是否进行该健身动作。
识别条件可以包括第二肢体的动作变化幅度,识别条件还可以包括第二肢体的动作起始位置、第二肢体的动作终止位置中的至少一种。当用户动作中的第二肢体满足识别条件,可以确定用户动作为健身动作,即用户在进行健身动作。
通过输入信息指示健身动作,电子设备可以仅根据该健身动作对应的第二肢体的范围,确定用户是否进行健身动作,无需判断用户的动作是否满足其他健身动作的识别条件,减少计算量。第二肢体的动作变化范围可以包括第二肢体的动作变化幅度,还可以包括第二肢体的动作起始位置、第二肢体的动作终止位置。
第二肢体的动作起始位置、第二肢体的动作终止位置可以根据第一肢体的运动轨迹确定。第二肢体的动作起始位置可以是备选动作中第二肢体的动作起始位置,即备选动作对应的第一帧图像中第二肢体的位置。第二肢体的动作起始位置也可以是第一肢体位于运动轨迹中某一点时,该点对应的备选动作的图像中第二肢体的位置。
用户的第二肢体的位置的变化范围，可以是第二肢体在整个用户动作中的位置变化范围，即可以是所述用户动作中第二肢体的位置的变化幅度的最大值对应的范围。电子设备可以在用户动作中每次选取任意两帧图像，将第二肢体的位置进行比较，通过多次的图像帧的选取和第二肢体的位置的比较，从而确定第二肢体的位置的变化幅度的最大值。
或者,用户的第二肢体的位置的变化范围也可以是在用户动作中备选动作中的位置变化范围。备选动作是用户动作中的一段视频。
在步骤S602之前,电子设备可以确定用户动作中是否包含用户的第一肢体。
当用户动作中包含用户的第一肢体,进行步骤S602。
当所述第一肢体位于所述用户动作之外，电子设备可以输出提醒信息。所述提醒信息可以用于提醒所述用户调整所述用户动作的图像采集范围。此时，电子设备可以不再进行后续步骤。
电子设备还可以确定用户动作中是否包含用户的第二肢体。当所述第二肢体位于所述用户动作之外,可以输出提醒信息。所述提醒信息可以用于提醒所述用户调整所述用户动作的图像采集范围。
在步骤S602之前,电子设备还可以确定用户动作中的水平方向或竖直方向。根据水平方向或竖直方向,电子设备可以确定用户的肢体的位置、位置的变化、运动轨迹等。
在步骤S604,电子设备根据所述动作变化幅度,确定输出指导信息。
当所述用户进行健身动作时,电子设备可以对用户的动作进行评价和指导。
可以根据第二肢体的动作变化幅度是否满足第二预设条件,确定是否输出指导信息。第二预设条件也可以称为识别条件。
电子设备判断所述动作变化幅度满足第二预设条件;电子设备确定输出指导信息。当第二肢体的动作变化幅度满足第二预设条件时,即认为用户进行健身动作。
在一些实施例中,电子设备可以比较用户动作与该健身动作的标准动作的相似度,对用户的动作进行评价和指导。
在另一些实施例中，对于健身动作，电子设备可以保存用户动作与评价信息的对应关系。电子设备可以通过用户的肢体的位置信息表示用户的动作。电子设备可以保存有用户的第二肢体的位置信息与评价信息的对应关系。
第二肢体的位置信息可以用于指示第二肢体的位置,第二肢体的位置可以是位置的变化范围,或者可以是具体某一时间点的位置。
第二肢体的位置信息包括第二肢体的动作变化幅度、所述第二肢体的动作起始位置、所述第二肢体的动作终止位置、所述第二肢体的运动轨迹中的至少一种。
电子设备可以根据第一肢体的运动轨迹，确定第一肢体位于某一位置的时间点，确定该时间点的用户动作，确定该时间点第二肢体的位置信息，从而根据第二肢体的位置信息与评价信息的对应关系，确定对应的评价信息。
电子设备可以根据保存的所述第二肢体的动作起始位置、所述第二肢体的动作终止位置与第一肢体在其运动轨迹中的位置的对应关系,确定所述第二肢体的动作起始位置、所述第二肢体的动作终止位置。根据第二肢体的动作起始位置、所述第二肢体的动作终止位置是否满足识别条件,确定是否输出指导信息。
电子设备可以根据第一肢体的运动轨迹,确定当第一肢体位于某一范围的时间段,确定该时间段第二肢体的位置信息,从而根据第二肢体的位置信息与评价信息的对应关系,确定对应的评价信息。
电子设备可以根据第一肢体的位置变化的情况,确定对应的评价信息。
例如，第二肢体的位置可以是第二肢体在用户动作或备选动作中与水平或竖直方向的夹角的最大值或最小值，可以是夹角的变化范围，即夹角的最小值至最大值；第二肢体的位置也可以是第二肢体的运动距离与第二肢体的长度等尺寸的比值；第二肢体的位置还可以是第二肢体与其他肢体的相对位置关系或第二肢体各部位之间的相对关系等。
当所述用户进行健身动作时,确定与第一位置信息对应的第一评价信息。所述第一位置信息用于指示在所述用户动作中所述第二肢体的位置。
第二肢体的位置信息可以与评价信息一一对应。第二肢体的位置信息可以包括所述第二肢体的动作变化范围、所述第二肢体的动作变化范围幅度、所述第二肢体的动作变化范围的极限值、所述第二肢体的运动轨迹中的至少一种。
第二肢体的动作变化范围的极限值，即第二肢体的动作变化范围的最大值或最小值，例如可以包括第二肢体的角度的最大值或最小值，也可以包括第二肢体各部位之间距离的最大值或最小值，即一个身体部位与另一个身体部位之间距离的最大值或最小值等。
第二肢体的动作变化幅度，也可以称为动作变化范围的幅度，即第二肢体的动作变化范围的最大值与最小值之间的差值。
第二肢体的运动轨迹与评价信息的对应关系,例如可以是第二肢体的运动轨迹的形状、周期等与评价信息的对应关系。
电子设备可以根据所述第一评价信息,输出指导信息。第一评价信息可以包括评分和/或评价。指导信息可以与第一评价信息相同或不同。电子设备可以根据用户的其他肢体的动作完成情况,对第一评价信息进行调整,以得到指导信息。
电子设备可以确定与所述用户的第二位置信息对应的第二评价信息。所述第二位置信息用于指示在所述用户动作中所述第三肢体的位置。第二评价信息例如可以是评分和/或评价。第二位置信息可以包括所述第三肢体的动作变化范围、第三肢体的动作变化幅度、所述第三肢体的动作变化范围的极限值、所述第三肢体的运动轨迹等中的一种或多种。
电子设备可以根据所述第二评价信息和所述第一评价信息,输出指导信息。
在第二肢体位置对应的评分大于预设值时,指导信息可以包括第二评价信息中的评价。电子设备可以在第二肢体位置对应的第一评价信息中的评分大于或等于预设值时,输出第二评价信息中的评价;反之,在第二肢体位置对应的第一评价信息中的评分小于预设值时,指导信息可以仅包括第一评价信息中的评价。
对于健身动作，可以在用户的主要肢体动作完成得不标准时，仅对主要肢体动作进行指导；在用户的主要肢体动作完成得较为标准时，再对其他肢体的动作进行指导。可以在评分小于预设值时，不进行第二评价信息的确定。
电子设备可以根据第二评价信息中的评分对第一评价信息中的评分进行调整,从而确定指导信息中的评分。
第二评价信息中的评分可以是增加或减少的分数数值,可以在第一评价信息中的评分基础上增加或减少该分数数值。或者,第二评价信息中的评分也可以是第三肢体的动作的分值。电子设备可以根据第三肢体的权重,在第一评价信息中的评分基础上增加或减少对第二评价信息中的评分乘以权重得到的数值。
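按权重叠加第三肢体评分的调整方式，可以用如下代码草图示意。其中权重与分值均为便于说明的假设：

```python
# 示意性草图：根据第三肢体的分值及其权重，在第一评价信息的评分
# 基础上增加或减少相应数值（权重与分值均为假设）

def adjust_score(primary_score, third_limb_score, weight=0.2):
    """在主要评分基础上，叠加第三肢体分值乘以权重得到的数值，
    得到指导信息中的最终评分。"""
    return primary_score + third_limb_score * weight

# 第三肢体动作较差（分值为负），小幅拉低总评分
print(adjust_score(80, -10))  # 78.0
```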
第三肢体可以包括一个或多个身体部位。第三肢体可以是用户的第二肢体之外的肢体。
电子设备可以确定用户动作中是否包含用户的第三肢体。当所述第三肢体位于所述用户动作之外,电子设备可以输出提醒信息。所述提醒信息可以用于提醒所述用户调整所述用户动作的图像采集范围。
通过步骤S601-S604，能够准确判断用户是否进行健身动作，从而能够在用户进行健身动作时提供指导，提高用户体验。
图9是本申请实施例提供的一种电子装置的示意性结构图。装置700包括获取模块701,确定模块702。
获取模块701,用于获取用户动作。
确定模块702,用于从所述用户动作中确定所述用户动作中第一肢体的运动轨迹满足第一预设条件的备选动作。
确定模块702还用于,确定第二肢体在所述备选动作中的动作变化幅度。
确定模块702还用于,根据所述动作变化幅度,确定输出指导信息。
可选地,装置700还包括判断模块,用于判断所述动作变化幅度满足第二预设条件。
确定模块702还用于,确定输出指导信息。
可选地,获取模块701还用于,获取输入信息。
确定模块702还用于,确定所述输入信息对应的所述第一预设条件。
可选地,确定模块702还用于,确定与第一位置信息对应的第一评价信息,所述第一位置信息包括第二肢体的动作变化幅度、所述第二肢体的动作起始位置、所述第二肢体的动作终止位置、所述第二肢体的运动轨迹中的至少一种,所述第二肢体的动作起始位置和所述第二肢体的动作终止位置是根据所述第一肢体的运动轨迹确定的。
装置700还包括输出模块,用于根据所述第一评价信息,输出指导信息。
可选地,确定模块702还用于,确定与所述用户的第二位置信息对应的第二评价信息,所述第二位置信息包括第三肢体的动作变化幅度、所述第三肢体的动作起始位置、所述第三肢体的动作终止位置、所述第三肢体的运动轨迹中的至少一种,所述第三肢体的动作起始位置和所述第三肢体的动作终止位置是根据所述第一肢体的运动轨迹确定的。
输出模块还用于,根据所述第二评价信息和所述第一评价信息,输出所述指导信息。
可选地,装置700还包括识别模块,用于识别所述用户动作中的关节点,以确定所述用户动作中的所述第一肢体和所述第二肢体。
可选地,确定模块702还用于,根据所述动作变化幅度,以及所述第二肢体的在所述备选动作中的动作起始位置,确定输出所述指导信息;或者,所述电子设备根据所述动作变化幅度和所述第二肢体的在所述备选动作中的动作终止位置,确定输出所述指导信息;或者,所述电子设备根据所述动作变化幅度、所述第二肢体的在所述备选动作中的动作起始位置和动作终止位置,确定输出所述指导信息,所述第二肢体的动作起始位置和所述第二肢体的动作终止位置是根据所述第一肢体的运动轨迹确定的。
图10是本申请实施例提供的一种电子装置的示意性结构图。装置800包括处理器801,通信接口802。
通信接口802用于,获取用户动作。
处理器801用于:从所述用户动作中确定所述用户动作中第一肢体的运动轨迹满足第一预设条件的备选动作;确定第二肢体在所述备选动作中的动作变化幅度;根据所述动作变化幅度,确定输出指导信息。
可选地,处理器801用于:判断所述动作变化幅度满足第二预设条件;确定输出指导信息。
可选地,通信接口802还用于,获取输入信息。
处理器801还用于,确定所述输入信息对应的所述第一预设条件。
可选地,处理器801用于:确定与第一位置信息对应的第一评价信息,所述第一位置信息包括第二肢体的动作变化幅度、所述第二肢体的动作起始位置、所述第二肢体的动作终止位置、所述第二肢体的运动轨迹中的至少一种,所述第二肢体的动作起始位置和所述第二肢体的动作终止位置是根据所述第一肢体的运动轨迹确定的;根据所述第一评价信息,确定所述指导信息。
可选地，处理器801用于：确定与所述用户的第二位置信息对应的第二评价信息，所述第二位置信息包括第三肢体的动作变化幅度、所述第三肢体的动作起始位置、所述第三肢体的动作终止位置、所述第三肢体的运动轨迹中的至少一种，所述第三肢体的动作起始位置和所述第三肢体的动作终止位置是根据所述第一肢体的运动轨迹确定的；根据所述第二评价信息和所述第一评价信息，确定所述指导信息。
可选地,处理器801还用于:识别所述用户动作中的关节点,以确定所述用户动作中的所述第一肢体和所述第二肢体。
可选地,处理器801还用于:根据所述动作变化幅度,以及所述第二肢体的在所述备选动作中的动作起始位置,确定输出所述指导信息,或者,根据所述动作变化幅度和所述第二肢体的在所述备选动作中的动作终止位置,确定输出所述指导信息;或者,根据所述动作变化幅度、所述第二肢体的在所述备选动作中的动作起始位置和动作终止位置,确定输出所述指导信息,所述第二肢体的动作起始位置和所述第二肢体的动作终止位置是根据所述第一肢体的运动轨迹确定的。
本申请实施例还提供一种电子装置,包括:至少一个处理器和通信接口,所述通信接口用于所述电子装置与其他装置进行信息交互,当程序指令在所述至少一个处理器中执行时,使得所述电子装置执行上文中的方法。
本申请实施例还提供一种计算机程序存储介质,其特征在于,所述计算机程序存储介质具有程序指令,当所述程序指令被直接或者间接执行时,使得前文中的方法得以实现。
本申请实施例还提供一种芯片系统,其特征在于,所述芯片系统包括至少一个处理器,当程序指令在所述至少一个处理器中执行时,使得前文中的方法得以实现。
本领域普通技术人员可以意识到,结合本文中所公开的实施例描述的各示例的单元及算法步骤,能够以电子硬件、或者计算机软件和电子硬件的结合来实现。这些功能究竟以硬件还是软件方式来执行,取决于技术方案的特定应用和设计约束条件。专业技术人员可以对每个特定的应用来使用不同方法来实现所描述的功能,但是这种实现不应认为超出本申请的范围。
所属领域的技术人员可以清楚地了解到,为描述的方便和简洁,上述描述的系统、装置和单元的具体工作过程,可以参考前述方法实施例中的对应过程,在此不再赘述。
在本申请实施例的描述中,除非另有说明,“/”表示或的意思,例如,A/B可以表示A或B;本文中的“和/或”仅仅是一种描述关联对象的关联关系,表示可以存在三种关系,例如,A和/或B,可以表示:单独存在A,同时存在A和B,单独存在B这三种情况。另外,在本申请实施例的描述中,除非另有说明,“多个”是指两个或多于两个。
术语“第一”、“第二”仅用于描述目的,而不能理解为指示或暗示相对重要性或者隐含指明所指示的技术特征的数量。由此,限定有“第一”、“第二”的特征可以明示或者隐含地包括一个或者更多个该特征。
在本申请所提供的几个实施例中,应该理解到,所揭露的系统、装置和方法,可以通过其它的方式实现。例如,以上所描述的装置实施例仅仅是示意性的,例如,所述单元的划分,仅仅为一种逻辑功能划分,实际实现时可以有另外的划分方式,例如多个单元或组件可以结合或者可以集成到另一个系统,或一些特征可以忽略,或不执行。另一点,所显示或讨论的相互之间的耦合或直接耦合或通信连接可以是通过一些接口,装置或单元的间接耦合或通信连接,可以是电性,机械或其它的形式。
所述作为分离部件说明的单元可以是或者也可以不是物理上分开的，作为单元显示的部件可以是或者也可以不是物理单元，即可以位于一个地方，或者也可以分布到多个网络单元上。可以根据实际的需要选择其中的部分或者全部单元来实现本实施例方案的目的。
另外,在本申请各个实施例中的各功能单元可以集成在一个处理单元中,也可以是各个单元单独物理存在,也可以两个或两个以上单元集成在一个单元中。
所述功能如果以软件功能单元的形式实现并作为独立的产品销售或使用时,可以存储在一个计算机可读取存储介质中。基于这样的理解,本申请的技术方案本质上或者说对现有技术做出贡献的部分或者该技术方案的部分可以以软件产品的形式体现出来,该计算机软件产品存储在一个存储介质中,包括若干指令用以使得一台计算机设备(可以是个人计算机,服务器,或者网络设备等)执行本申请各个实施例所述方法的全部或部分步骤。而前述的存储介质包括:U盘、移动硬盘、只读存储器(read-only memory,ROM)、随机存取存储器(random access memory,RAM)、磁碟或者光盘等各种可以存储程序代码的介质。
以上所述,仅为本申请的具体实施方式,但本申请的保护范围并不局限于此,任何熟悉本技术领域的技术人员在本申请揭露的技术范围内,可轻易想到变化或替换,都应涵盖在本申请的保护范围之内。因此,本申请的保护范围应以所述权利要求的保护范围为准。

Claims (16)

  1. 一种辅助健身的方法,其特征在于,包括:
    电子设备获取用户动作;
    所述电子设备从所述用户动作中确定所述用户动作中第一肢体的运动轨迹满足第一预设条件的备选动作;
    所述电子设备确定第二肢体在所述备选动作中的动作变化幅度;
    所述电子设备根据所述动作变化幅度,确定输出指导信息。
  2. 根据权利要求1所述的方法,其特征在于,所述电子设备根据所述动作变化幅度,确定输出指导信息,包括:
    所述电子设备判断所述动作变化幅度满足第二预设条件;
    所述电子设备确定输出所述指导信息。
  3. 根据权利要求1或2所述的方法,其特征在于,所述方法还包括:
    所述电子设备获取输入信息;
    所述电子设备确定所述输入信息对应的所述第一预设条件。
  4. 根据权利要求1-3中任一项所述的方法,其特征在于,所述方法还包括:
    所述电子设备确定与第一位置信息对应的第一评价信息,所述第一位置信息包括所述第二肢体的动作变化幅度、所述第二肢体的动作起始位置、所述第二肢体的动作终止位置、所述第二肢体的运动轨迹中的至少一种,所述第二肢体的动作起始位置和所述第二肢体的动作终止位置是根据所述第一肢体动作的运动轨迹确定的;
    所述电子设备根据所述第一评价信息,输出所述指导信息。
  5. 根据权利要求4所述的方法,其特征在于,所述方法还包括:所述电子设备确定与所述用户的第二位置信息对应的第二评价信息,所述第二位置信息包括第三肢体的动作变化幅度、所述第三肢体的动作起始位置、所述第三肢体的动作终止位置、所述第三肢体的运动轨迹中的至少一种,所述第三肢体的动作起始位置和所述第三肢体的动作终止位置是根据所述第一肢体动作的运动轨迹确定的;
    所述电子设备根据所述第一评价信息,输出指导信息,包括:所述电子设备根据所述第二评价信息和所述第一评价信息,输出所述指导信息。
  6. 根据权利要求1-5中任一项所述的方法,其特征在于,所述方法还包括:识别所述用户动作中的关节点,以确定所述用户动作中的所述第一肢体和所述第二肢体。
  7. 根据权利要求1-6中任一项所述的方法,其特征在于,所述电子设备根据所述动作变化幅度,确定输出指导信息,包括:
    所述电子设备根据所述动作变化幅度和所述第二肢体的在所述备选动作中的动作起始位置,确定输出所述指导信息;
    或者,所述电子设备根据所述动作变化幅度和所述第二肢体的在所述备选动作中的动作终止位置,确定输出所述指导信息;
    或者,所述电子设备根据所述动作变化幅度、所述第二肢体的在所述备选动作中的动作起始位置和动作终止位置,确定输出所述指导信息,
    所述第二肢体的动作起始位置和所述第二肢体的动作终止位置是根据所述第一肢体的运动轨迹确定的。
  8. 一种电子装置,其特征在于,包括处理器和通信接口;
    所述通信接口用于,获取用户动作;
    所述处理器用于:
    从所述用户动作中确定所述用户动作中第一肢体的运动轨迹满足第一预设条件的备选动作;
    确定第二肢体在所述备选动作中的动作变化幅度;
    根据所述动作变化幅度,确定输出指导信息。
  9. 根据权利要求8所述的装置,其特征在于,所述处理器用于:
    判断所述动作变化幅度满足第二预设条件;
    确定输出所述指导信息。
  10. 根据权利要求8或9所述的装置,其特征在于,
    所述通信接口还用于,获取输入信息;
    所述处理器还用于,根据所述输入信息确定所述第一预设条件。
  11. 根据权利要求8-10中任一项所述的装置,其特征在于,所述处理器用于:
    确定与第一位置信息对应的第一评价信息,所述第一位置信息包括所述第二肢体的动作变化幅度、所述第二肢体的动作起始位置、所述第二肢体的动作终止位置、所述第二肢体的运动轨迹中的至少一种,所述第二肢体的动作起始位置和所述第二肢体的动作终止位置是根据所述第一肢体的运动轨迹确定的;
    根据所述第一评价信息,确定所述指导信息。
  12. 根据权利要求11所述的装置,其特征在于,所述处理器用于:
    确定与所述用户的第二位置信息对应的第二评价信息，所述第二位置信息包括第三肢体的动作变化幅度、所述第三肢体的动作起始位置、所述第三肢体的动作终止位置、所述第三肢体的运动轨迹中的至少一种，所述第三肢体的动作起始位置和所述第三肢体的动作终止位置是根据所述第一肢体动作的运动轨迹确定的；
    根据所述第二评价信息和所述第一评价信息,确定所述指导信息。
  13. 根据权利要求8-12中任一项所述的装置,其特征在于,所述处理器还用于:识别所述用户动作中的关节点,以确定所述用户动作中的所述第一肢体和所述第二肢体。
  14. 根据权利要求8-13中任一项所述的装置,其特征在于,所述处理器还用于:
    根据所述动作变化幅度和所述第二肢体的在所述备选动作中的动作起始位置,确定输出所述指导信息,
    或者,根据所述动作变化幅度和所述第二肢体的在所述备选动作中的动作终止位置,确定输出所述指导信息;
    或者,根据所述动作变化幅度、所述第二肢体的在所述备选动作中的动作起始位置和动作终止位置,确定输出所述指导信息,
    所述第二肢体的动作起始位置和所述第二肢体的动作终止位置是根据所述第一肢体的运动轨迹确定的。
  15. 一种计算机存储介质，其特征在于，当所述计算机指令在电子设备上运行时，使得所述电子设备执行权利要求1-7中任一项所述的方法。
  16. 一种芯片系统,所述芯片系统包括至少一个处理器,当程序指令在所述至少一个处理器中执行时,使得所述芯片系统执行权利要求1-7中任一项所述的方法。
PCT/CN2020/102394 2019-08-30 2020-07-16 辅助健身的方法和电子装置 WO2021036568A1 (zh)

Priority Applications (3)

Application Number Priority Date Filing Date Title
JP2022513339A JP2022546453A (ja) 2019-08-30 2020-07-16 フィットネス支援方法および電子装置
EP20857343.6A EP4020491A4 (en) 2019-08-30 2020-07-16 FITNESS ASSISTED PROCEDURE AND ELECTRONIC DEVICE
US17/680,967 US20220176200A1 (en) 2019-08-30 2022-02-25 Method for Assisting Fitness and Electronic Apparatus

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201910817978.X 2019-08-30
CN201910817978.XA CN112447273A (zh) 2019-08-30 2019-08-30 辅助健身的方法和电子装置

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US17/680,967 Continuation US20220176200A1 (en) 2019-08-30 2022-02-25 Method for Assisting Fitness and Electronic Apparatus

Publications (1)

Publication Number Publication Date
WO2021036568A1 true WO2021036568A1 (zh) 2021-03-04

Family

ID=74233975

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2020/102394 WO2021036568A1 (zh) 2019-08-30 2020-07-16 辅助健身的方法和电子装置

Country Status (5)

Country Link
US (1) US20220176200A1 (zh)
EP (1) EP4020491A4 (zh)
JP (1) JP2022546453A (zh)
CN (2) CN112447273A (zh)
WO (1) WO2021036568A1 (zh)

Families Citing this family (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111488824B (zh) * 2020-04-09 2023-08-08 北京百度网讯科技有限公司 运动提示方法、装置、电子设备和存储介质
CN113655935A (zh) * 2021-01-30 2021-11-16 华为技术有限公司 一种用户确定方法、电子设备和计算机可读存储介质
US20220245836A1 (en) * 2021-02-03 2022-08-04 Altis Movement Technologies, Inc. System and method for providing movement based instruction
CN113380374B (zh) * 2021-05-08 2022-05-13 荣耀终端有限公司 基于运动状态感知的辅助运动方法、电子设备及存储介质
US20230141420A1 (en) * 2021-07-20 2023-05-11 Colette Booker-Bell Squat Exercise System
FR3137203A1 (fr) * 2022-06-22 2023-12-29 Ai Bright Systeme et methode d’assistance a la realisation de mouvements physiques
CN115049967B (zh) * 2022-08-12 2022-11-11 成都信息工程大学 体操学习动作检测方法、装置以及电子设备
US20240057893A1 (en) * 2022-08-17 2024-02-22 August River, Ltd Co Remotely tracking range of motion measurement
CN115798676B (zh) * 2022-11-04 2023-11-17 中永(广东)网络科技有限公司 一种基于vr技术的互动体验分析管理方法及系统

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20150279230A1 (en) * 2014-03-26 2015-10-01 Wai Lana Productions, Llc Method for yoga instruction with media
CN109144247A (zh) * 2018-07-17 2019-01-04 尚晟 视频交互的方法以及基于可交互视频的运动辅助系统
CN109621331A (zh) * 2018-12-13 2019-04-16 深圳壹账通智能科技有限公司 辅助健身方法、装置及存储介质、服务器
CN110038274A (zh) * 2019-05-21 2019-07-23 福建工程学院 一种智能家庭无人指导健身方法

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101964047B (zh) * 2009-07-22 2012-10-10 深圳泰山在线科技有限公司 一种基于多跟踪点的人体动作识别方法
US10049595B1 (en) * 2011-03-18 2018-08-14 Thomas C. Chuang Athletic performance and technique monitoring
JP6359343B2 (ja) * 2013-07-01 2018-07-18 キヤノンメディカルシステムズ株式会社 動作情報処理装置及び方法
US11037369B2 (en) * 2017-05-01 2021-06-15 Zimmer Us, Inc. Virtual or augmented reality rehabilitation
CN108519818A (zh) * 2018-03-29 2018-09-11 北京小米移动软件有限公司 信息提示方法及装置
CN110151187B (zh) * 2019-04-09 2022-07-05 缤刻普达(北京)科技有限责任公司 健身动作识别方法、装置、计算机设备及存储介质

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP4020491A4

Also Published As

Publication number Publication date
US20220176200A1 (en) 2022-06-09
CN112447273A (zh) 2021-03-05
EP4020491A1 (en) 2022-06-29
JP2022546453A (ja) 2022-11-04
EP4020491A4 (en) 2022-10-19
CN112259191A (zh) 2021-01-22


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20857343

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2022513339

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

ENP Entry into the national phase

Ref document number: 2020857343

Country of ref document: EP

Effective date: 20220322