CN115202531A - Interface interaction method and system and electronic device - Google Patents

Interface interaction method and system and electronic device

Info

Publication number
CN115202531A
CN115202531A
Authority
CN
China
Prior art keywords
user
interface
animation
display interface
planned
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210593676.0A
Other languages
Chinese (zh)
Inventor
金凌琳
余锋
陈思屹
谭李慧
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Dangqu Network Technology Hangzhou Co Ltd
Original Assignee
Dangqu Network Technology Hangzhou Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Dangqu Network Technology Hangzhou Co Ltd filed Critical Dangqu Network Technology Hangzhou Co Ltd
Priority to CN202210593676.0A priority Critical patent/CN115202531A/en
Publication of CN115202531A publication Critical patent/CN115202531A/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/04815Interaction with a metaphor-based environment or interaction object displayed as three-dimensional, e.g. changing the user viewpoint with respect to the environment or object
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F9/00Arrangements for program control, e.g. control units
    • G06F9/06Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F9/44Arrangements for executing specific programs
    • G06F9/451Execution arrangements for user interfaces

Abstract

The present application relates to an interface interaction method, system, and electronic device. After an instruction from a user to enter a planned course is obtained, the screen display interface enters a guidance animation interface; the guidance animation is played on that interface once a human skeleton is recognized, and the interface of the planned course is entered once skeleton recognition detects that the user has completed the guidance animation.

Description

Interface interaction method and system and electronic device
Technical Field
The present application relates to the field of interface interaction technologies, and in particular, to an interface interaction method, system and electronic device.
Background
With the development of large-screen terminals such as smart televisions and projectors, interaction between large-screen terminals and users has become increasingly rich. In one such interaction, the user selects any planned course to learn through an application on the large-screen terminal, and the terminal records the user's completion of the planned course and scores the user; for example, in a fitness application, a fitness training course can be selected for training. In the related art, however, training and learning begin immediately after the user selects a planned course. At that moment the user may not be ready, or the recognition method and recognition state of the planned course may be unclear to the user, so the user cannot be given intuitive guidance and indication, and the user experience is poor.
At present, no effective solution has been proposed for the problem in the related art that guidance and indication in application-interface interaction on large-screen terminals are not intuitive, resulting in poor user experience.
Disclosure of Invention
Embodiments of the present application provide an interface interaction method, an interface interaction system, and an electronic device, so as to at least solve the problem in the related art that guidance and indication in application-interface interaction on large-screen terminals are not intuitive, resulting in poor user experience.
In a first aspect, an embodiment of the present application provides an interface interaction method, where the method includes:
after an instruction from the user to enter a planned course is obtained, a screen display interface enters a guidance animation interface;
and playing a guidance animation on the guidance animation interface when a human skeleton is recognized, and entering the interface of the planned course when human skeleton recognition detects that the user has completed the guidance animation.
In some of these embodiments, human skeleton recognition detecting that the user has completed the guidance animation includes:
while detecting that the user is completing the guidance animation, displaying a recognition progress bar for the guidance animation, where the progress bar reaching the end indicates that the user has completed the guidance animation.
In some embodiments, displaying the recognition progress bar for the guidance animation includes: if detection fails, the recognition progress bar returns to zero and recognition is performed again.
In some embodiments, after the instruction for the user to enter the planned course is obtained, the method includes:
dividing the screen display interface into a first display interface and a second display interface, where the first display interface displays the introduction of the planned course and the second display interface is the guidance animation interface.
In some embodiments, after the human bone recognition detects that the user completes the guiding animation, the method further comprises:
the first display interface plays the planned course video, and the second display interface plays the user movement video.
In some of these embodiments, after human skeleton recognition detects that the user has completed the guidance animation, the screen display interface further includes a third display interface that displays planned course information, including the current course name, the next course name, and the score.
In some embodiments, while the second display interface plays the user movement video, the user's skeleton recognition result is displayed on the user movement video, and the training score is obtained according to the user's skeleton recognition result.
In some of these embodiments, during the user movement, the method further comprises:
if the user's human skeleton recognition is lost, pausing the planned course video, where the second display interface becomes the guidance animation interface again, and the interface of the planned course is re-entered when the user completes the guidance animation.
In a second aspect, an embodiment of the present application provides a system for interface interaction, where the system includes an obtaining module and a playing module,
the obtaining module is configured to obtain the user's instruction to enter a planned course, after which the screen display interface enters a guidance animation interface;
the playing module is configured to play the guidance animation on the guidance animation interface when a human skeleton is recognized, and to enter the interface of the planned course when human skeleton recognition detects that the user has completed the guidance animation.
In a third aspect, an embodiment of the present application provides an electronic device, which includes a memory, a processor, and a computer program stored in the memory and executable on the processor, where the processor, when executing the computer program, implements the interface interaction method according to the first aspect.
Compared with the related art, in the interface interaction method provided by the embodiments of the present application, after the user's instruction to enter a planned course is obtained, the screen display interface enters a guidance animation interface; the guidance animation is played on that interface once a human skeleton is recognized, and the interface of the planned course is entered once skeleton recognition detects that the user has completed the guidance animation. The user is thus given intuitive guidance and indication, improving the user experience.
Drawings
The accompanying drawings, which are included to provide a further understanding of the application and are incorporated in and constitute a part of this application, illustrate embodiment(s) of the application and together with the description serve to explain the application and not to limit the application. In the drawings:
FIG. 1 is a flow diagram of a method of interface interaction according to an embodiment of the present application;
FIG. 2 is a schematic diagram of a guidance animation interface according to an embodiment of the application;
FIG. 3 is a schematic diagram of a second screen display interface according to an embodiment of the present application;
FIG. 4 is a schematic illustration of a third screen display interface according to an embodiment of the present application;
FIG. 5 is a schematic view of a fourth screen display interface in accordance with an embodiment of the present application;
FIG. 6 is a schematic diagram of a fifth screen display interface in accordance with embodiments of the present application;
fig. 7 is a block diagram of a system for interface interaction according to an embodiment of the present application.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, the present application will be described and illustrated below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the present application and are not intended to limit the present application. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments provided in the present application without any inventive step are within the scope of protection of the present application. Moreover, it should be appreciated that in the development of any such actual implementation, as in any engineering or design project, numerous implementation-specific decisions must be made to achieve the developers' specific goals, such as compliance with system-related and business-related constraints, which may vary from one implementation to another.
Reference in the specification to "an embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment can be included in at least one embodiment of the specification. The appearances of the phrase in various places in the specification are not necessarily all referring to the same embodiment, nor are separate or alternative embodiments mutually exclusive of other embodiments. It is to be expressly and implicitly understood by one of ordinary skill in the art that the embodiments described herein may be combined with other embodiments without conflict.
Unless otherwise defined, technical or scientific terms referred to herein should have the same meaning as commonly understood by one of ordinary skill in the art to which this application belongs. Reference to "a," "an," "the," and similar words throughout this application are not to be construed as limiting in number, and may refer to the singular or the plural. The use of the terms "including," "comprising," "having," and any variations thereof herein, is meant to cover a non-exclusive inclusion; for example, a process, method, system, article, or apparatus that comprises a list of steps or modules (elements) is not limited to the listed steps or elements, but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus. Reference to "connected," "coupled," and the like in this application is not intended to be limited to physical or mechanical connections, but may include electrical connections, whether direct or indirect. Reference herein to "a plurality" means greater than or equal to two. "and/or" describes an association relationship of associated objects, meaning that three relationships may exist, for example, "A and/or B" may mean: a exists alone, A and B exist simultaneously, and B exists alone. Reference herein to the terms "first," "second," "third," and the like, are merely to distinguish similar objects and do not denote a particular ordering for the objects.
The present embodiment provides a method for interface interaction, and fig. 1 is a flowchart of a method for interface interaction according to an embodiment of the present application, and as shown in fig. 1, the method includes the following steps:
Step S101: after an instruction from the user to enter a planned course is obtained, the screen display interface enters a guidance animation interface. In the related art, once the large-screen terminal obtains the user's instruction to enter a planned course, the screen display interface directly starts the planned course. In this embodiment, the screen display interface first enters a guidance animation interface, and the user performs actions along with the guidance animation. The user thereby learns the recognition method and recognition state of the planned course clearly, which prevents the poor experience that occurs when a user who is not ready, or who does not understand how recognition works, is taken directly into the planned course without intuitive guidance and indication.
Step S102: the guidance animation is played on the guidance animation interface when a human skeleton is recognized, and the interface of the planned course is entered when human skeleton recognition detects that the user has completed the guidance animation. In this embodiment, after the guidance animation interface is entered, a human skeleton is recognized once the user stands within the camera's field of view, indicating that the user is ready to follow the guidance animation; the guidance animation then starts playing, optionally accompanied by a voice prompt guiding the user's actions. After human skeleton recognition detects that the user has completed the guidance animation, the interface of the planned course is entered.
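Steps S101 and S102 amount to a small state machine. The sketch below is illustrative only; the class, state, and event names (`GuidanceFlow`, `on_skeleton_recognized`, and so on) are assumptions, not taken from the patent:

```python
from enum import Enum, auto

class Screen(Enum):
    GUIDANCE = auto()   # guidance animation interface (step S101)
    COURSE = auto()     # planned-course interface (entered in step S102)

class GuidanceFlow:
    """State machine for the guidance-animation step before a planned course."""
    def __init__(self):
        self.screen = Screen.GUIDANCE
        self.animation_playing = False

    def on_skeleton_recognized(self):
        # A skeleton in the camera's view means the user is ready:
        # start playing the guidance animation.
        if self.screen is Screen.GUIDANCE:
            self.animation_playing = True

    def on_animation_completed(self):
        # Skeleton recognition confirmed the user followed the animation:
        # enter the planned-course interface.
        if self.animation_playing:
            self.screen = Screen.COURSE
            self.animation_playing = False

flow = GuidanceFlow()
flow.on_skeleton_recognized()
flow.on_animation_completed()
print(flow.screen)  # Screen.COURSE
```

Note that completing the animation has no effect until a skeleton has been recognized, matching the ordering of the two conditions in step S102.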
Optionally, fig. 2 is a schematic diagram of a guidance animation interface according to an embodiment of the present application. As shown in fig. 2, the guidance animation interface may be displayed full screen or on part of the screen. The guidance animation shown has both arms raised to the sides with the forearms bent downward; when human skeleton recognition detects that the user has completed the action to the standard of the guidance animation, the planned course interface is entered. Multiple different types of guidance animation may also be provided, from which the user can choose according to preference; the animations may likewise be set to play automatically in sequence each time, or configured in other user-defined ways.
Through steps S101 to S102, and in contrast to the related art in which guidance and indication in application-interface interaction on large-screen terminals are not intuitive and the user experience is therefore poor, in this embodiment the screen display interface enters a guidance animation interface after the user's instruction to enter a planned course is obtained; the guidance animation is played once a human skeleton is recognized, and the planned course interface is entered once skeleton recognition detects that the user has completed the guidance animation. This solves the problem of non-intuitive guidance and indication and improves the user's experience while completing a planned course.
In some embodiments, fig. 3 is a schematic diagram of a second screen display interface according to an embodiment of the present application. As shown in fig. 3, detecting through human skeleton recognition that the user has completed the guidance animation includes:
and in the process of detecting that the user finishes guiding the animation, displaying a recognition progress bar finishing the guiding animation, wherein the recognition progress bar indicates that the user finishes guiding the animation after finishing executing the recognition progress bar. In practical application, the identification progress bar can be divided into 100 parts, if the guiding animation needs to last for 3 seconds and is identified once every 500 milliseconds, the detection is performed for 6 times within 1,3 seconds every 30 milliseconds, and the identification progress bar is displayed, so that a user can clearly know the identification progress.
Optionally, while the recognition progress bar for the guidance animation is being displayed, if the user stops moving, the movement is not standard, or the human skeleton cannot be recognized, the detection fails: the recognition progress bar returns to zero and recognition is performed again. In this embodiment, a failed detection suggests that the user has not yet fully grasped the recognition rules, so recognition is repeated until the user is familiar with them, preventing errors later while completing the planned course.
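The progress-bar behavior described above — a 100-part bar driven by periodic detections that zeroes on a failed detection — can be sketched as follows. The class and parameter names are illustrative assumptions, not from the patent:

```python
class RecognitionProgressBar:
    """Recognition progress bar split into 100 parts, advanced by periodic detections."""
    def __init__(self, duration_s=3.0, interval_ms=500):
        # 3 s animation checked every 500 ms -> 6 detections in total.
        self.total_checks = int(duration_s * 1000 / interval_ms)
        self.checks_passed = 0

    @property
    def percent(self):
        # Map completed checks onto the 100-part bar.
        return self.checks_passed * 100 // self.total_checks

    def detect(self, action_ok: bool):
        if action_ok:
            self.checks_passed += 1
        else:
            # Failed detection: bar returns to zero and recognition restarts.
            self.checks_passed = 0
        return self.percent

bar = RecognitionProgressBar()
for _ in range(6):
    bar.detect(True)
print(bar.percent)  # 100
```

A failed check anywhere in the sequence sends `percent` back to 0, so the user must hold the action for the full duration before the course interface opens.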
If the guidance animation interface were displayed full screen, the screen display interface would be concise but too monotonous. To make the screen display interface richer, in some embodiments, fig. 4 is a schematic diagram of a third screen display interface according to an embodiment of the present application. As shown in fig. 4, after the instruction for the user to enter a planned course is obtained, the screen display interface is divided into a first display interface and a second display interface; the first display interface shows the introduction of the planned course, and the second display interface is the guidance animation interface. In this embodiment, if the user wants to read the introduction of the planned course, the guidance animation may be paused first and played after the user has finished reading, further improving the interaction between the large-screen terminal and the user and thereby the user experience.
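The division of the screen into a first region (course introduction) and a second region (guidance animation) can be sketched as a simple layout computation. The 50/50 split and the coordinate scheme are assumptions, since the patent does not specify proportions:

```python
def split_screen(width, height, intro_ratio=0.5):
    """Divide the screen into a first (course-intro) and second (guidance) region.

    The side-by-side arrangement and the default 50/50 ratio are illustrative
    assumptions; the patent only says the screen is divided into two interfaces.
    """
    first = {"x": 0, "y": 0, "w": int(width * intro_ratio), "h": height}
    second = {"x": first["w"], "y": 0, "w": width - first["w"], "h": height}
    return first, second

first, second = split_screen(1920, 1080)
print(first["w"], second["w"])  # 960 960
```

Because the second region's width is computed as the remainder, the two regions always tile the full screen width exactly, even for ratios that do not divide evenly.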
In some of these embodiments, fig. 5 is a schematic diagram of a fourth screen display interface according to an embodiment of the present application. As shown in fig. 5, after human skeleton recognition detects that the user has completed the guidance animation, the first display interface plays the planned course video and the second display interface plays the user movement video. After the actions of the current course are completed, a voice prompt announces entry into the next course, and the relevant introduction of the next course is displayed.
Optionally, fig. 6 is a schematic diagram of a fifth screen display interface according to an embodiment of the present application. As shown in fig. 6, after human skeleton recognition detects that the user has completed the guidance animation, the screen display interface further includes a third display interface that shows planned course information, including the current course name, the next course name, and the score; if the planned course is a fitness training course, the information further includes the calories consumed by training. By adding the third display interface, the user can quickly see the training situation.
In some embodiments, as shown in fig. 6, while the second display interface plays the user movement video, the user's skeleton recognition result is displayed on the user movement video, and the training score is calculated from the user's skeleton recognition result.
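The patent does not specify how the training score is derived from the skeleton recognition result. One hypothetical metric, sketched below with illustrative joint names and normalized coordinates, scores the fraction of recognized joints that fall close to the reference pose:

```python
import math

def pose_score(user_joints, reference_joints, tol=0.1):
    """Hypothetical scoring: percentage of reference joints whose user position
    lies within `tol` (normalized screen units) of the reference position.

    This metric is an illustrative assumption; the patent only states that the
    score is obtained from the skeleton recognition result.
    """
    matched = 0
    for name, (rx, ry) in reference_joints.items():
        # A joint missing from the user's skeleton counts as unmatched.
        ux, uy = user_joints.get(name, (math.inf, math.inf))
        if math.hypot(ux - rx, uy - ry) <= tol:
            matched += 1
    return round(100 * matched / len(reference_joints))

ref = {"l_wrist": (0.2, 0.5), "r_wrist": (0.8, 0.5), "head": (0.5, 0.1)}
user = {"l_wrist": (0.22, 0.52), "r_wrist": (0.9, 0.7), "head": (0.5, 0.12)}
print(pose_score(user, ref))  # 67
```

In this example the right wrist is too far from the reference pose, so 2 of 3 joints match and the score rounds to 67.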
In some embodiments, during the user's movement, if the user's human skeleton recognition is lost, the planned course video is paused, the second display interface becomes the guidance animation interface again, and the interface of the planned course is re-entered when the user completes the guidance animation. Specifically, when part or all of the user's body is outside the camera's field of view, the user's human skeleton is lost; the planned course video then pauses automatically, preventing the video from continuing to play after the user has stopped training and sparing the user from having to rewind it manually. The second display interface becomes the guidance animation interface again, and when the user wants to resume training, the user completes the guidance animation again and re-enters the planned course, where the planned course video automatically resumes playing from the current progress. This embodiment improves the interaction between the large-screen terminal and the user and further improves the user experience.
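The pause-and-resume behavior on skeleton loss can be sketched as a small player model. The method names, the `tick(dt, ...)` update scheme, and the state flags are illustrative assumptions:

```python
class CoursePlayer:
    """Pauses the planned-course video when the user's skeleton is lost and
    resumes from the same position once the guidance animation is re-completed."""
    def __init__(self):
        self.position_s = 0.0          # current playback position, in seconds
        self.playing = True
        self.showing_guidance = False  # second interface shows guidance again

    def tick(self, dt, skeleton_visible):
        if not skeleton_visible and self.playing:
            # Body left the camera's field of view: pause, show guidance.
            self.playing = False
            self.showing_guidance = True
        elif self.playing:
            self.position_s += dt      # video advances only while playing

    def on_guidance_completed(self):
        # User re-completed the guidance animation: resume at current progress.
        if self.showing_guidance:
            self.showing_guidance = False
            self.playing = True

player = CoursePlayer()
player.tick(5.0, True)        # plays 5 s of the course video
player.tick(1.0, False)       # skeleton lost -> auto-pause
player.tick(3.0, True)        # still paused until guidance is re-completed
player.on_guidance_completed()
player.tick(2.0, True)
print(player.position_s)      # 7.0
```

The key property is that `position_s` never advances while paused, so playback resumes exactly where the user stopped training rather than requiring a manual rewind.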
The following describes the embodiments of the present invention in detail with reference to the application scenario of fitness training:
as shown in fig. 2 and 4, after the instruction of the user to enter the fitness training course is obtained, the screen display interface enters the guidance animation interface, the guidance animation interface may be displayed in a full screen or a partial screen, and when the partial screen is displayed, the other partial screen may display the introduction of the fitness training course.
And under the condition that the human skeleton is identified, playing a guiding animation on a screen display interface, and under the condition that the human skeleton identification detects that the user finishes guiding the animation, entering a fitness training course. As shown in fig. 5 and 6, after entering the fitness training course, the first display interface plays a video of the fitness training course, the second display interface plays a video of the user movement, the human skeleton recognition condition of the user is displayed on the video of the user movement, the training score is calculated according to the human skeleton recognition condition of the user, and the name of the current fitness training course, the name of the next fitness training course, the score condition and the consumed calories are displayed on the third display interface.
It should be noted that the steps illustrated in the above-described flow diagrams or in the flow diagrams of the figures may be performed in a computer system, such as a set of computer-executable instructions, and that, although a logical order is illustrated in the flow diagrams, in some cases, the steps illustrated or described may be performed in an order different than here.
The embodiment also provides an interface interaction system, which is used for implementing the above embodiments and preferred embodiments, and the description of the system is omitted for brevity. As used below, the terms "module," "unit," "sub-unit," and the like may implement a combination of software and/or hardware of predetermined functions. Although the means described in the embodiments below are preferably implemented in software, an implementation in hardware, or a combination of software and hardware is also possible and contemplated.
Fig. 7 is a block diagram of the structure of an interface interaction system according to an embodiment of the present application. As shown in fig. 7, the system includes an obtaining module 71 and a playing module 72. The obtaining module 71 is configured to obtain the user's instruction to enter a planned course, after which the screen display interface enters the guidance animation interface; the playing module 72 is configured to play the guidance animation when a human skeleton is recognized, and to enter the planned course interface when skeleton recognition detects that the user has completed the guidance animation. This solves the problem in the related art that guidance and indication in application-interface interaction on large-screen terminals are not intuitive and the user experience is therefore poor, and improves the user's experience while completing a planned course.
It should be noted that the above modules may be functional modules or program modules, and may be implemented by software or hardware. For a module implemented by hardware, the modules may be located in the same processor; or the modules can be respectively positioned in different processors in any combination.
The present embodiment also provides an electronic device comprising a memory having a computer program stored therein and a processor configured to execute the computer program to perform the steps of any of the above method embodiments.
Optionally, the electronic apparatus may further include a transmission device and an input/output device, wherein the transmission device is connected to the processor, and the input/output device is connected to the processor.
It should be noted that, for specific examples in this embodiment, reference may be made to examples described in the foregoing embodiments and optional implementations, and details of this embodiment are not described herein again.
In addition, in combination with the interface interaction method in the foregoing embodiments, the embodiments of the present application may provide a storage medium to implement. The storage medium having stored thereon a computer program; the computer program, when executed by a processor, implements any of the interface interaction methods of the above embodiments.
In one embodiment, a computer device is provided, which may be a terminal. The computer device includes a processor, a memory, a network interface, a display screen, and an input device connected by a system bus. Wherein the processor of the computer device is configured to provide computing and control capabilities. The memory of the computer device comprises a nonvolatile storage medium and an internal memory. The non-volatile storage medium stores an operating system and a computer program. The internal memory provides an environment for the operation of an operating system and computer programs in the non-volatile storage medium. The network interface of the computer device is used for communicating with an external terminal through a network connection. The computer program is executed by a processor to implement a method of interface interaction. The display screen of the computer equipment can be a liquid crystal display screen or an electronic ink display screen, and the input device of the computer equipment can be a touch layer covered on the display screen, a key, a track ball or a touch pad arranged on the shell of the computer equipment, an external keyboard, a touch pad or a mouse and the like.
It will be understood by those skilled in the art that all or part of the processes of the methods in the embodiments described above can be implemented by a computer program instructing the relevant hardware; the computer program can be stored in a non-volatile computer-readable storage medium and, when executed, can include the processes of the embodiments of the methods described above. Any reference to memory, storage, a database, or another medium used in the embodiments provided herein can include non-volatile and/or volatile memory. Non-volatile memory can include read-only memory (ROM), programmable ROM (PROM), electrically programmable ROM (EPROM), electrically erasable programmable ROM (EEPROM), or flash memory. Volatile memory can include random access memory (RAM) or external cache memory. By way of illustration and not limitation, RAM is available in a variety of forms, such as static RAM (SRAM), dynamic RAM (DRAM), synchronous DRAM (SDRAM), double data rate SDRAM (DDR SDRAM), enhanced SDRAM (ESDRAM), synchronous link DRAM (SLDRAM), Rambus direct RAM (RDRAM), direct Rambus dynamic RAM (DRDRAM), and Rambus dynamic RAM (RDRAM).
It should be understood by those skilled in the art that various features of the above-described embodiments can be combined in any combination, and for the sake of brevity, all possible combinations of features in the above-described embodiments are not described in detail, but rather, all combinations of features which are not inconsistent with each other should be construed as being within the scope of the present disclosure.
The above embodiments express only several implementations of the present application, and their description is relatively specific and detailed, but they are not to be construed as limiting the scope of the application. It should be noted that a person of ordinary skill in the art can make several variations and improvements without departing from the concept of the present application, and these fall within the protection scope of the present application. Therefore, the protection scope of this patent shall be subject to the appended claims.

Claims (10)

1. A method of interface interaction, the method comprising:
after an instruction from the user to enter a planned course is obtained, a screen display interface enters a guidance animation interface;
and playing a guidance animation on the guidance animation interface when a human skeleton is recognized, and entering the interface of the planned course when human skeleton recognition detects that the user has completed the guidance animation.
2. The method of claim 1, wherein human skeleton recognition detecting that the user has completed the guidance animation comprises:
while detecting that the user is completing the guidance animation, displaying a recognition progress bar for the guidance animation, wherein completion of the recognition progress bar indicates that the user has completed the guidance animation.
3. The method of claim 2, wherein displaying the recognition progress bar process of completing the guidance animation comprises: and if the detection fails, the identification progress bar returns to zero, and identification is carried out again.
4. The method of claim 1, wherein after obtaining the instruction from the user to enter the planned course, the method comprises:
dividing the screen display interface into a first display interface and a second display interface, wherein the first display interface displays an introduction to the planned course, and the second display interface is the guidance animation interface.
5. The method of claim 4, wherein after human skeleton recognition detects that the user has completed the guidance animation, the method further comprises:
playing a planned course video on the first display interface, and playing a user motion video on the second display interface.
6. The method of claim 5, wherein after human skeleton recognition detects that the user has completed the guidance animation, the screen display interface further comprises a third display interface displaying planned course information, the planned course information comprising a current course name, a next course name, and a score.
7. The method of claim 5, wherein while playing the user motion video, the second display interface displays the user's human skeleton recognition status on the user motion video, and a training score is obtained according to the user's human skeleton recognition status.
8. The method of claim 7, wherein during the user's movement, the method further comprises:
if the user's human skeleton recognition is lost, pausing the planned course video, wherein the second display interface becomes the guidance animation interface, and the user re-enters the interface of the planned course upon completing the guidance animation.
9. A system for interface interaction, comprising an acquisition module and a playing module, wherein
the acquisition module is configured to cause a screen display interface to enter a guidance animation interface after obtaining an instruction from a user to enter a planned course; and
the playing module is configured to play a guidance animation on the guidance animation interface when a human skeleton is recognized, and to enter an interface of the planned course when human skeleton recognition detects that the user has completed the guidance animation.
10. An electronic device, comprising a memory and a processor, wherein the memory stores a computer program, and the processor is configured to execute the computer program to perform the method of interface interaction of any one of claims 1 to 8.
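The flow recited in claims 1 to 3 and claim 8 (a guidance animation gated by skeleton recognition, a progress bar that resets when detection fails, and pausing the course when the skeleton is lost) can be sketched as a small state machine. This is an illustrative sketch only; the class name, state names, and the per-frame progress model are assumptions for exposition, not part of the patent.

```python
class WorkoutUI:
    """Illustrative state machine for the claimed interaction flow (names assumed)."""

    GUIDE, COURSE = "guide", "course"

    def __init__(self, guide_steps=3):
        # Entering a planned course first shows the guidance animation (claim 1).
        self.state = self.GUIDE
        self.guide_steps = guide_steps  # frames of successful recognition required
        self.progress = 0               # the recognition progress bar (claim 2)

    def on_skeleton_frame(self, pose_ok: bool):
        """Consume one frame of human-skeleton recognition results."""
        if self.state == self.GUIDE:
            if pose_ok:
                self.progress += 1  # advance the recognition progress bar
                if self.progress >= self.guide_steps:
                    # Bar completed: the user finished the guidance animation,
                    # so enter the planned-course interface (claim 1).
                    self.state = self.COURSE
            else:
                # Claim 3: failed detection resets the bar and recognition restarts.
                self.progress = 0
        elif self.state == self.COURSE and not pose_ok:
            # Claim 8: skeleton lost mid-course -> pause the course video and
            # show the guidance animation interface again.
            self.state = self.GUIDE
            self.progress = 0
```

Under this sketch, completing the guidance animation again after a mid-course loss returns the user to the course, matching the re-entry behavior of claim 8.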
CN202210593676.0A 2022-05-27 2022-05-27 Interface interaction method and system and electronic device Pending CN115202531A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210593676.0A CN115202531A (en) 2022-05-27 2022-05-27 Interface interaction method and system and electronic device


Publications (1)

Publication Number Publication Date
CN115202531A (en) 2022-10-18

Family

ID=83575603

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210593676.0A Pending CN115202531A (en) 2022-05-27 2022-05-27 Interface interaction method and system and electronic device

Country Status (1)

Country Link
CN (1) CN115202531A (en)


Patent Citations (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20160130085A (en) * 2015-04-30 2016-11-10 모다정보통신 주식회사 Exercising Method and System Using a Smart Mirror
JP2022518970A (en) * 2019-03-15 2022-03-17 株式会社ソニー・インタラクティブエンタテインメント Reinforcement learning to train characters using heterogeneous target animation data
CN112399234A (en) * 2019-08-18 2021-02-23 聚好看科技股份有限公司 Interface display method and display equipment
CN114432683A (en) * 2019-08-30 2022-05-06 华为技术有限公司 Intelligent voice playing method and equipment
CN113591523A (en) * 2020-04-30 2021-11-02 聚好看科技股份有限公司 Display device and experience value updating method
CN113721758A (en) * 2020-05-26 2021-11-30 华为技术有限公司 Fitness guiding method and electronic equipment
WO2021238356A1 (en) * 2020-05-26 2021-12-02 华为技术有限公司 Fitness guidance method and electronic device
CN113808446A (en) * 2020-06-11 2021-12-17 华为技术有限公司 Fitness course interaction method and related device
CN112272324A (en) * 2020-10-15 2021-01-26 聚好看科技股份有限公司 Follow-up mode control method and display device
CN113505662A (en) * 2021-06-23 2021-10-15 广州大学 Fitness guidance method, device and storage medium
CN113990440A (en) * 2021-10-22 2022-01-28 成都医云科技有限公司 Human skeleton rehabilitation training method and device, electronic equipment and storage medium
CN114422861A (en) * 2022-01-24 2022-04-29 北京优酷科技有限公司 Interaction method, interaction device, electronic equipment and storage medium

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Gfan Tech Video (机锋科技视频): "Hands-on with the new-generation Huawei Smart Screen V series: the big screen pairs better with exercise", pages 1 - 4, Retrieved from the Internet <URL:《https://www.iqiyi.com/v_19yermi8dpk.html》> *

Similar Documents

Publication Publication Date Title
CN107551521B (en) Fitness guidance method and device, intelligent equipment and storage medium
US11872445B2 (en) Fitness management method, device, and computer readable storage medium
US20170103670A1 (en) Interactive Cognitive Recognition Sports Training System and Methods
US20170294136A1 (en) Graphical user interface for interactive cognitive recognition sports training system
US10600018B2 (en) Data processing systems for processing and analyzing data regarding self-awareness and executive function
US8905761B2 (en) Training system, training device, and program recording medium and playback control method
KR101667281B1 (en) Apparatus and method for training recognition ability, and computer-readable storage medium in which the method is recorded
WO2015100295A1 (en) Systems and methods for a self-directed working memory task for enhanced cognition
JP4972527B2 (en) Movie display system, movie display method, and computer program
CN112527171B (en) Multimedia file playing method, device, equipment and medium
CN110019757A (en) Books point reads interaction device and its control method, computer readable storage medium
US10191830B1 (en) Data processing systems for processing and analyzing data regarding self-awareness and executive function
EP4167067A1 (en) Interaction method and apparatus, device, and readable medium
CN115202531A (en) Interface interaction method and system and electronic device
CN112364478A (en) Virtual reality-based testing method and related device
US20140113719A1 (en) Computing device and video game direction method
CN115588485A (en) Adaptive intervention method, system, device and medium based on social story training
US20220246060A1 (en) Electronic device and method for eye-contact training
CN116324700A (en) Display equipment and media asset playing method
KR102095647B1 (en) Comparison of operation using smart devices Comparison device and operation Comparison method through dance comparison method
CN111260678A (en) Gymnastics assistant learning method and device, storage medium and terminal equipment
JP7314624B2 (en) Information processing device and program
US20240123317A1 (en) Fitness Coaching Method, System and Terminal
US10870058B2 (en) Data processing systems for processing and analyzing data regarding self-awareness and executive function
JP7338210B2 (en) Information processing device and program

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination