CN116899199A - Dance training method, device, equipment and readable storage medium - Google Patents

Dance training method, device, equipment and readable storage medium

Info

Publication number
CN116899199A
CN116899199A
Authority
CN
China
Prior art keywords
dance
training
data
scene
analysis result
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202310897136.6A
Other languages
Chinese (zh)
Inventor
张吉松 (Zhang Jisong)
夏勇峰 (Xia Yongfeng)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Beehive Century Technology Co ltd
Original Assignee
Beijing Beehive Century Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Beehive Century Technology Co ltd
Priority to CN202310897136.6A
Publication of CN116899199A
Legal status: Pending


Classifications

    • A: HUMAN NECESSITIES
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63B: APPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B69/00: Training appliances or apparatus for special sports
    • A63B24/00: Electric or electronic controls for exercising apparatus of preceding groups; controlling or monitoring of exercises, sportive games, training or athletic performances
    • A63B24/0062: Monitoring athletic performances, e.g. for determining the work of a user on an exercise apparatus, the completed jogging or cycling distance
    • A63B24/0075: Means for generating exercise programs or schemes, e.g. computerized virtual trainer, e.g. using expert databases
    • A63B71/00: Games or sports accessories not covered in groups A63B1/00 - A63B69/00
    • A63B71/06: Indicating or scoring devices for games or players, or for other sports activities
    • A63B71/0619: Displays, user interfaces and indicating devices, specially adapted for sport equipment, e.g. display mounted on treadmills
    • A63B2244/00: Sports without balls
    • A63B2244/22: Dancing
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011: Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00: Manipulating 3D models or images for computer graphics
    • G06T19/006: Mixed reality


Abstract

The present application provides a dance training method, apparatus, device and readable storage medium. The method includes: rendering a virtual dance scene corresponding to a target dance; obtaining dance data of a performer in the virtual dance scene, where the dance data include a plurality of motion parameters for each of a plurality of body parts; and analyzing the dance data and displaying the analysis result, together with a dance training method corresponding to the analysis result, through the display screen of an AR device. The method can improve the accuracy of dance movements during dance training.

Description

Dance training method, device, equipment and readable storage medium
Technical Field
The present application relates to the field of intelligent AR, and in particular, to a dance training method, apparatus, device, and readable storage medium.
Background
Currently, with the continuous development of AR technology, AR applications are being widely used in a growing number of fields. Meanwhile, dance is becoming popular with more and more people, both as fashion culture and as a form of exercise. However, for dance training a user can currently only pay to invite a professional teacher or train from videos over the network.
This traditional street dance training mode suffers from several problems: training time and space are limited, movement accuracy is difficult to monitor, and the training effect is often unsatisfactory.
Therefore, how to improve the accuracy of dance movements during dance training is a technical problem to be solved.
Disclosure of Invention
The embodiments of the present application aim to provide a dance training method whose technical scheme can improve the accuracy of dance movements during dance training.
In a first aspect, an embodiment of the present application provides a dance training method, including: rendering a virtual dance scene corresponding to a target dance; obtaining dance data of a performer in the virtual dance scene, where the dance data include a plurality of motion parameters for each of a plurality of body parts; and analyzing the dance data and displaying the analysis result, together with a dance training method corresponding to the analysis result, through the display screen of an AR device.
In this embodiment of the application, a performer can dance within the rendered virtual dance scene; the AR device analyzes the performer's dance movements and then provides a dance training method based on the analysis result. The performer can watch both their own movements and the training method on the AR device's display screen, which improves the accuracy of dance movements during training.
In some embodiments, analyzing the dance data and displaying the analysis result and a corresponding dance training method through the display screen of the AR device includes: analyzing the dance data with a preset dance analysis model to obtain the analysis result, where the dance analysis model is obtained by training a base model on a plurality of training samples, and the training samples are obtained by labeling multiple items of dance data from different users; determining from the analysis result which of the performer's dance movements do not meet preset requirements; and displaying, on the display screen, the standard dance movements, dance training suggestions and analysis results corresponding to the dance movements that do not meet the preset requirements.
In this embodiment of the application, the performer's dance data can be analyzed by the pre-trained dance analysis model, and because the model is obtained through continued neural-network learning, the analysis result it produces is more accurate.
In some embodiments, before rendering the virtual dance scene corresponding to the target dance, the method further includes: collecting multiple items of dance data from different users while they dance; labeling the items of dance data to obtain a plurality of training samples, where each training sample includes multiple items of dance data and the corresponding labels; and training the base model on the training samples to obtain the dance analysis model.
In this embodiment of the application, training samples are obtained by labeling dance data, and the dance analysis model trained on these samples can accurately analyze the performer's dance data to produce an accurate analysis result.
In some embodiments, rendering the virtual dance scene corresponding to the target dance includes: acquiring a scene template corresponding to the target dance; and rendering the data in the scene template and combining it with the real scene to obtain the virtual dance scene.
In this embodiment of the application, the virtual dance scene for the current setting can be rendered quickly by combining the preset scene template with the scene on display.
In some embodiments, obtaining dance data of the performer in the virtual dance scene includes: acquiring the acceleration, angular velocity and angular displacement of a plurality of the performer's body parts as transmitted by an inertial sensor device; and/or acquiring the position, velocity and acceleration of a plurality of the performer's body parts as transmitted by an optical sensor device.
In this embodiment of the application, multiple items of data about the performer's body parts can be acquired through sensors, so the dance data are captured accurately.
In some embodiments, the dance training method includes at least one of: dance training difficulty selection, music selection and action guidance.
In this embodiment of the application, dance difficulty, music and guiding actions can be rendered for the performer's training, so the dance training method can be customized to different requirements.
In some embodiments, after analyzing the dance data and displaying the analysis result and the corresponding dance training method through the display screen of the AR device, the method further includes: establishing a training plan corresponding to the dance training method; and, while the performer trains according to the plan, monitoring the performer's dance movements in real time and giving adjustment suggestions.
In this embodiment of the application, the performer's movements can be monitored in real time and corresponding adjustment suggestions given, so dance movements are corrected promptly.
In a second aspect, an embodiment of the present application provides a dance training apparatus, including:
a rendering module, configured to render the virtual dance scene corresponding to the target dance;
an acquisition module, configured to obtain dance data of the performer in the virtual dance scene, where the dance data include a plurality of motion parameters for each of a plurality of body parts;
an analysis module, configured to analyze the dance data and display the analysis result and a corresponding dance training method through the display screen of the augmented reality (AR) device.
Optionally, the analysis module is specifically configured to:
analyze the dance data with the preset dance analysis model to obtain the analysis result, where the dance analysis model is obtained by training a base model on a plurality of training samples, and the training samples are obtained by labeling multiple items of dance data from different users;
determine from the analysis result which of the performer's dance movements do not meet preset requirements;
and display, on the display screen, the standard dance movements, dance training suggestions and analysis results corresponding to the dance movements that do not meet the preset requirements.
Optionally, the apparatus further includes:
a training module, configured to collect multiple items of dance data from different users while they dance, before the rendering module renders the virtual dance scene corresponding to the target dance;
label the items of dance data to obtain a plurality of training samples, where each training sample includes multiple items of dance data and the corresponding labels;
and train the base model on the training samples to obtain the dance analysis model.
Optionally, the rendering module is specifically configured to:
acquire the scene template corresponding to the target dance;
and render the data in the scene template and combine it with the real scene to obtain the virtual dance scene.
Optionally, the acquisition module is specifically configured to:
acquire the acceleration, angular velocity and angular displacement of a plurality of the performer's body parts as transmitted by an inertial sensor device;
and/or
acquire the position, velocity and acceleration of a plurality of the performer's body parts as transmitted by an optical sensor device.
Optionally, the dance training method includes:
at least one of dance training difficulty selection, music selection and action guidance.
Optionally, the apparatus further includes:
a monitoring module, configured to establish a training plan corresponding to the dance training method after the analysis module analyzes the dance data and displays the analysis result and the corresponding dance training method through the display screen of the AR device;
and, while the performer trains according to the plan, to monitor the performer's dance movements in real time and give adjustment suggestions.
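The three modules of the second aspect can be sketched as a minimal apparatus skeleton. The class name, constructor and the wiring in `run` are illustrative assumptions for exposition, not the application's actual implementation:

```python
class DanceTrainingApparatus:
    """Illustrative skeleton mirroring the rendering, acquisition and
    analysis modules; each module is injected as a callable stub."""

    def __init__(self, render_fn, acquire_fn, analyze_fn):
        self.rendering_module = render_fn     # renders the virtual dance scene
        self.acquisition_module = acquire_fn  # obtains the performer's dance data
        self.analysis_module = analyze_fn     # analyzes data and drives the display

    def run(self, target_dance):
        # The data flow follows steps 110-130 of the method.
        scene = self.rendering_module(target_dance)
        data = self.acquisition_module(scene)
        return self.analysis_module(data)
```

In an actual AR device these callables would wrap the rendering engine, the sensor interfaces and the dance analysis model respectively.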
In a third aspect, an embodiment of the present application provides an electronic device comprising a processor and a memory storing computer readable instructions which, when executed by the processor, perform the steps of the method as provided in the first aspect above.
In a fourth aspect, embodiments of the present application provide a readable storage medium having stored thereon a computer program which, when executed by a processor, performs the steps of the method as provided in the first aspect above.
Additional features and advantages of the application will be set forth in the description which follows, and in part will be apparent from the description, or may be learned by practice of the embodiments of the application. The objectives and other advantages of the application will be realized and attained by the structure particularly pointed out in the written description and claims thereof as well as the appended drawings.
Drawings
To more clearly illustrate the technical solutions of the embodiments of the present application, the drawings needed in the embodiments are briefly described below. It should be understood that the following drawings illustrate only some embodiments of the present application and should not be considered as limiting its scope; a person skilled in the art can derive other related drawings from them without inventive effort.
FIG. 1 is a flowchart of a method for dance training according to an embodiment of the present application;
FIG. 2 is a flowchart of a method for performing dance training according to an embodiment of the present application;
FIG. 3 is a schematic block diagram of a dance training apparatus according to an embodiment of the present application;
fig. 4 is a schematic structural diagram of a dance training device according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application are described below clearly and completely with reference to the accompanying drawings. Evidently, the described embodiments are only some, not all, of the embodiments of the present application. The components of the embodiments, as generally described and illustrated in the figures, may be arranged and designed in a wide variety of configurations. Thus, the following detailed description is not intended to limit the scope of the claimed application but merely represents selected embodiments. All other embodiments obtained by a person skilled in the art without inventive effort fall within the scope of the present application.
It should be noted that like reference numerals and letters denote like items in the figures; once an item is defined in one figure, it needs no further definition or explanation in subsequent figures. In the description of the present application, the terms "first", "second" and the like are used only to distinguish between descriptions and are not to be construed as indicating or implying relative importance.
Some of the terms involved in the embodiments of the present application will be described first to facilitate understanding by those skilled in the art.
Terminal device: a mobile, stationary or portable terminal, for example a mobile handset, station, unit, device, multimedia computer, multimedia tablet, internet node, communicator, desktop computer, laptop computer, notebook computer, netbook computer, tablet computer, personal communications system device, personal navigation device, personal digital assistant, audio/video player, digital camera/camcorder, positioning device, television receiver, radio broadcast receiver, electronic book device or game device, or any combination thereof, including the accessories and peripherals of these devices. A terminal device can also support any type of user interface (e.g., a wearable device).
Server: an independent physical server, a server cluster or distributed system composed of multiple physical servers, or a cloud server providing basic cloud computing services such as cloud services, cloud databases, cloud computing, cloud functions, cloud storage, network services, cloud communication, middleware services, domain name services, security services, big data and artificial intelligence platforms.
AR (Augmented Reality): a technology that seamlessly fuses virtual information with the real world. It draws widely on multimedia, three-dimensional modeling, real-time tracking and registration, intelligent interaction, sensing and other techniques to simulate computer-generated virtual information such as text, images, three-dimensional models, music and video and apply it to the real world, so that the two kinds of information complement each other and the real world is thereby enhanced.
The method of the present application is applied to scenarios in which dance is trained with an intelligent augmented reality (AR) device; in a specific scenario, the AR device analyzes a performer's dance movements and gives specific training suggestions according to the analysis results.
Currently, with the continuous development of AR technology, AR applications are being widely used in a growing number of fields. Meanwhile, dance is becoming popular with more and more people, both as fashion culture and as a form of exercise. However, for dance training a user can currently only pay to invite a professional teacher or train from videos over the network. This traditional street dance training mode suffers from several problems: training time and space are limited, movement accuracy is difficult to monitor, and the training effect is often unsatisfactory.
Therefore, the present application renders a virtual dance scene corresponding to a target dance; obtains dance data of a performer in the virtual dance scene, where the dance data include a plurality of motion parameters for each of a plurality of body parts; and analyzes the dance data and displays the analysis result, together with a corresponding dance training method, through the display screen of the AR device. In the rendered virtual dance scene, the performer can dance; the AR device analyzes the performer's dance movements and provides a dance training method based on the analysis result; and the performer can watch both their own movements and the training method on the AR device's display screen, which improves the accuracy of dance movements during training.
In the embodiments of the application, the execution subject may be a dance training device in the dance training system; in practical applications the dance training device may be an electronic device such as a terminal device or a server, which is not limited here.
The following describes the dance training method according to the embodiment of the present application in detail with reference to fig. 1.
Referring to FIG. 1, a flowchart of a dance training method according to an embodiment of the present application, the method includes:
step 110: and rendering a virtual dance scene corresponding to the target dance.
Wherein the target dance means a dance specially performed by a performer or the same type of dance such as street dance, classical dance, ballet dance, national dance, folk dance, modern dance, kick dance, jazz dance, etc., the present application is not limited thereto. The virtual dance scene may be a combination of rendering the virtual scene and the real scene by the AR device. The AR device may be AR glasses or AR helmets, etc.
In some embodiments of the present application, before rendering the virtual dance scene corresponding to the target dance, the method shown in FIG. 1 further includes: collecting multiple items of dance data from different users while they dance; labeling the items of dance data to obtain a plurality of training samples, where each training sample includes multiple items of dance data and the corresponding labels; and training the base model on the training samples to obtain the dance analysis model.
In this way, training samples are obtained by labeling dance data, and the dance analysis model trained on them can accurately analyze the performer's dance data to produce an accurate analysis result.
Here, the multiple items of dance data capture the movement of each of the user's body parts, e.g. the acceleration, angular velocity, angular displacement, position and velocity of a body part. The corresponding label is a manual analysis annotation of the dance data, including information such as dance accuracy, action standard and data differences. The base model may be a basic information comparison model, an information matching model or a similar model.
In one embodiment, after the multiple items of dance data from different users are collected, noise and abnormal values are cleaned and screened out, and the data are preprocessed by filtering, noise reduction, normalization and similar techniques. Dance-related data features also need to be extracted from the data for use in model training.
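As a sketch of this preprocessing step, the following assumes motion samples arranged as a (frames, channels) array; the 3-sigma outlier rule, 5-frame moving-average filter and min-max normalization are illustrative choices, not prescribed by the application:

```python
import numpy as np

def preprocess_dance_data(samples: np.ndarray) -> np.ndarray:
    """Clean and normalize raw motion samples of shape (frames, channels).

    Channels might hold, per body part, quantities such as acceleration,
    angular velocity and angular displacement."""
    # Screen abnormal values: drop any frame in which some channel lies
    # more than 3 standard deviations from that channel's mean.
    mean, std = samples.mean(axis=0), samples.std(axis=0)
    cleaned = samples[(np.abs(samples - mean) <= 3 * std).all(axis=1)]

    # Noise reduction: simple moving-average filter over 5 frames.
    kernel = np.ones(5) / 5
    smoothed = np.apply_along_axis(
        lambda ch: np.convolve(ch, kernel, mode="same"), 0, cleaned)

    # Normalization: min-max scale each channel into [0, 1].
    lo, hi = smoothed.min(axis=0), smoothed.max(axis=0)
    return (smoothed - lo) / np.where(hi - lo == 0, 1, hi - lo)
```

Feature extraction for model training would then operate on the returned normalized array.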
In one embodiment, after the base model is trained on the training samples to obtain the dance analysis model, the method further includes evaluating the training result to determine the model's accuracy and reliability, for example by testing the model on test data and adjusting the model parameters according to the test results.
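The base "information matching model" and its evaluation can be sketched as a nearest-centroid matcher trained on labeled samples. The class and function names, the label values and the centroid-matching rule are illustrative assumptions:

```python
import numpy as np

class DanceAnalysisModel:
    """Minimal information-matching base model (illustrative only): each
    label (e.g. 'standard', 'non-standard') is represented by the centroid
    of its training samples; analysis matches a new sample to the nearest
    centroid."""

    def fit(self, samples, labels):
        self.centroids_ = {
            label: np.mean([s for s, l in zip(samples, labels) if l == label],
                           axis=0)
            for label in set(labels)}
        return self

    def analyze(self, sample):
        # Return the label whose centroid is closest to the sample.
        return min(self.centroids_,
                   key=lambda lbl: np.linalg.norm(sample - self.centroids_[lbl]))

def evaluate(model, test_samples, test_labels):
    """Accuracy on held-out test data, used to decide whether the model
    needs adjusting."""
    hits = sum(model.analyze(s) == l for s, l in zip(test_samples, test_labels))
    return hits / len(test_labels)
```

A neural-network base model, as the embodiments suggest, would replace this matcher while keeping the same fit/analyze/evaluate workflow.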
In some embodiments of the present application, rendering the virtual dance scene corresponding to the target dance includes: acquiring a scene template corresponding to the target dance; and rendering the data in the scene template and combining it with the real scene to obtain the virtual dance scene.
Through this process, the virtual dance scene for the current setting can be rendered quickly by combining the preset scene template with the scene on display.
The scene template may consist of elements of a professional dance setting such as lights, a stage and characters; the virtual dance scene can be displayed directly through these template elements on the AR device's display screen, which may be the device's physical screen or a virtual screen rendered by the AR device. The real scene is the actual setting in which the performer is located, including its people and objects.
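A minimal sketch of the template-based rendering step, assuming a hypothetical template registry and a dictionary standing in for the captured real scene (all names, template fields and example values are illustrative):

```python
from dataclasses import dataclass, field

@dataclass
class SceneTemplate:
    """Hypothetical scene template: lights, stage and character elements
    for a target dance, to be rendered by the AR device."""
    dance_type: str
    lights: list = field(default_factory=list)
    stage: str = "default-stage"
    characters: list = field(default_factory=list)

# Small registry standing in for the preset template store.
TEMPLATES = {
    "ballet": SceneTemplate("ballet", lights=["spotlight"], stage="theatre"),
    "street dance": SceneTemplate("street dance", lights=["strobe"], stage="plaza"),
}

def render_virtual_dance_scene(target_dance: str, real_scene: dict) -> dict:
    """Fetch the template for the target dance, render its data and
    combine it with the captured real scene."""
    template = TEMPLATES[target_dance]
    return {
        "virtual": {"lights": template.lights, "stage": template.stage,
                    "characters": template.characters},
        "real": real_scene,  # people and objects captured by the AR camera
    }
```

On a real AR device the "virtual" part would be drawn by the rendering engine and overlaid on the camera view rather than returned as a dictionary.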
Step 120: obtaining dance data of the performer in the virtual dance scene.
Here, the dance data comprise a plurality of motion parameters for each of a plurality of the performer's body parts, such as the acceleration, angular velocity, angular displacement, position and velocity of a body part.
In some embodiments of the present application, obtaining dance data of the performer in the virtual dance scene includes: acquiring the acceleration, angular velocity and angular displacement of a plurality of the performer's body parts as transmitted by an inertial sensor device; and/or acquiring the position, velocity and acceleration of a plurality of the performer's body parts as transmitted by an optical sensor device.
Through this process, the application can acquire multiple items of data about the performer's body parts through sensors, capturing the dance data accurately.
Here, the inertial sensor and the optical sensor may be separate sensor devices that measure multiple items of data about the user's body parts and transmit them to the AR device. The inertial sensor measures the motion state of an object based on the principle of inertia: using the inertia of a mass in motion, it measures the object's acceleration, angular velocity, angular displacement and so on. It generally consists of an accelerometer, which measures the object's acceleration, and a gyroscope, which measures its angular velocity and angular displacement.
Accelerometer measurement: the accelerometer measures acceleration in three directions, the X, Y and Z axes of the spatial coordinate system; from the values output on these three axes, the magnitude and direction of the object's acceleration can be calculated. Gyroscope measurement: the gyroscope measures angular velocity and angular displacement in the same three directions; from its X, Y and Z outputs, the object's angular velocity and angular displacement can be calculated.
The optical sensor measures the motion state of an object using optical principles: the reflection or transmission of light is used to measure the object's position, velocity, acceleration and other parameters. An optical sensor typically includes a photodiode that emits an optical signal and a photosensitive element that receives the signal and converts it into an electrical signal for processing.
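The accelerometer calculation described above, obtaining the magnitude and direction of the acceleration from the X, Y and Z axis outputs, can be written directly (function names are illustrative):

```python
import math

def acceleration_magnitude(ax: float, ay: float, az: float) -> float:
    """Combine the accelerometer's three axis readings (spatial X, Y, Z)
    into a single acceleration magnitude (Euclidean norm)."""
    return math.sqrt(ax * ax + ay * ay + az * az)

def acceleration_direction(ax: float, ay: float, az: float) -> tuple:
    """Unit vector giving the direction of the measured acceleration."""
    m = acceleration_magnitude(ax, ay, az)
    return (ax / m, ay / m, az / m)
```

The gyroscope's angular velocity and angular displacement outputs can be combined the same way, axis by axis.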
In one embodiment, when the user's dance data are acquired, the performer's dancing can be recorded by one or more cameras of the AR device, and multiple items of data about the performer's body parts can be analyzed directly by the processor.
Step 130: analyzing the dance data, and displaying the analysis result and the corresponding dance training method through the display screen of the AR device.
The analysis result may be a comparison of the performer's dance data with standard dance data, a judgment of whether the performer's movements are standard, or the like.
In some embodiments of the present application, the dance training method includes at least one of: dance training difficulty selection, music selection and action guidance.
Through this process, dance difficulty, music and guiding actions can be rendered for the performer's training, so the dance training method can be customized to different requirements.
The dance training difficulty can be selected according to the performer's needs and comprises several levels.
In some embodiments of the present application, analyzing the dance data and displaying the analysis result and a corresponding dance training method through the display screen of the AR device includes: analyzing the dance data with the preset dance analysis model to obtain the analysis result, where the dance analysis model is obtained by training a base model on a plurality of training samples, and the training samples are obtained by labeling multiple items of dance data from different users; determining from the analysis result which of the performer's dance movements do not meet preset requirements; and displaying, on the display screen, the standard dance movements, dance training suggestions and analysis results corresponding to the dance movements that do not meet the preset requirements.
Through this process, the performer's dance data can be analyzed by the pre-trained dance analysis model, and because the model is obtained through continued neural-network learning, the analysis result it produces is more accurate.
The preset requirement may be set as needed, for example that the user's angular velocity fails to reach a preset angular velocity or that its value exceeds a preset threshold. Dance training suggestions include the movements of the body part being trained and the standard information for those movements, such as the standard acceleration, angular velocity and angular displacement.
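The threshold check described above can be sketched as follows; the action names, the parameter dictionaries and the relative-deviation rule are illustrative assumptions, not the application's prescribed comparison:

```python
def find_nonstandard_actions(performer: dict, standard: dict,
                             threshold: float = 0.2) -> dict:
    """Compare the performer's motion parameters against standard values
    and return the actions whose deviation exceeds the preset threshold.

    `performer` and `standard` map an action name to a dict of parameters,
    e.g. {'angular_velocity': ..., 'acceleration': ...}."""
    failing = {}
    for action, params in performer.items():
        for name, value in params.items():
            ref = standard[action][name]
            # Flag the parameter when its relative deviation from the
            # standard value exceeds the threshold.
            if ref and abs(value - ref) / abs(ref) > threshold:
                failing.setdefault(action, []).append(
                    f"{name}: measured {value}, standard {ref}")
    return failing
```

The returned mapping would drive the display of standard movements and training suggestions for each failing action.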
In some embodiments of the present application, after analyzing the dance data and displaying the analysis result and the corresponding dance training method through the display screen of the AR device, the method shown in fig. 1 further includes: establishing a training plan corresponding to the dance training method; and, while the performer trains through the training plan, monitoring the performer's dance movements in real time and giving adjustment suggestions.
In this process, the application can monitor the performer's movements in real time and give corresponding adjustment suggestions, so that dance movements are corrected promptly.
The training plan includes the training time, the training actions, the acceleration, angular velocity and angular displacement of the standard actions, and the training difficulty level. The adjustment suggestion may be to bring the performer's movement as close as possible to the standard dance movement.
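One way to picture such a training plan record and a per-measurement adjustment suggestion is sketched below; the field names, tolerance and suggestion strings are assumptions for illustration, not details from the disclosure:

```python
# Hypothetical sketch of a training-plan record and a comparison of a
# measured value against the standard-action value; all field names
# and the 10% tolerance are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class TrainingPlan:
    training_minutes: int
    action: str
    std_acceleration: float          # m/s^2
    std_angular_velocity: float      # deg/s
    std_angular_displacement: float  # deg
    difficulty_level: int

def adjustment_suggestion(plan, measured_angular_velocity, tol=0.1):
    """Suggest moving the measured angular velocity toward the
    standard value when it deviates by more than the tolerance."""
    diff = measured_angular_velocity - plan.std_angular_velocity
    if abs(diff) <= tol * plan.std_angular_velocity:
        return "on target"
    return "slow down" if diff > 0 else "speed up"

plan = TrainingPlan(30, "spin", 2.0, 180.0, 360.0, 2)
```

Running the comparison once per captured frame would give the real-time monitoring behavior described above.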
In the process shown in fig. 1, the virtual dance scene corresponding to the target dance is rendered; dance data of the performer in the virtual dance scene are obtained, where the dance data include a plurality of motion parameters of each of a plurality of body parts; and the dance data are analyzed, with the analysis result and a dance training method corresponding to the analysis result displayed through the display screen of the AR device. In the rendered virtual dance scene, the performer can dance while the AR device analyzes the dance movements; a dance training method is then given according to the analysis result, and the performer can view both the movements and the training method on the display screen of the AR device, thereby improving the accuracy of dance movements during training.
The following describes the method for implementing dance training according to the embodiment of the present application in detail with reference to fig. 2.
Referring to fig. 2, fig. 2 is a flowchart of a method for implementing dance training according to an embodiment of the present application, where the method for implementing dance training shown in fig. 2 includes:
step 210: training a dance analysis model.
Specifically: collecting multiple items of dance data from different users while they dance; marking the items of dance data respectively to obtain a plurality of training samples; and training the basic model on the training samples to obtain the dance analysis model.
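The collect, mark and train steps above can be sketched end to end. As a loose illustration only, the toy nearest-centroid classifier below stands in for the "basic model"; the feature layout and labels are assumptions rather than details from the disclosure:

```python
# Hypothetical sketch of the collect -> mark -> train pipeline with a
# toy nearest-centroid classifier standing in for the basic model.
# Feature layout (acceleration, angular velocity) and label names are
# illustrative assumptions.

def train_dance_model(labeled_samples):
    """labeled_samples: list of (feature_vector, label) pairs.
    Returns a dict mapping each label to its feature centroid."""
    sums, counts = {}, {}
    for features, label in labeled_samples:
        acc = sums.setdefault(label, [0.0] * len(features))
        for i, v in enumerate(features):
            acc[i] += v
        counts[label] = counts.get(label, 0) + 1
    return {lbl: [v / counts[lbl] for v in vec] for lbl, vec in sums.items()}

def classify(model, features):
    """Assign the label whose centroid is closest (squared Euclidean)."""
    def dist(lbl):
        return sum((a - b) ** 2 for a, b in zip(model[lbl], features))
    return min(model, key=dist)

# Marked (acceleration, angular_velocity) samples from different users.
samples = [([1.0, 100.0], "standard"), ([1.2, 110.0], "standard"),
           ([3.0, 250.0], "too_fast"), ([2.8, 240.0], "too_fast")]
model = train_dance_model(samples)
```

A production system would of course use a neural network trained on far more data, as the embodiment describes; the sketch only shows the shape of the pipeline.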
Step 220: and rendering a virtual dance scene corresponding to the target dance.
Specifically: acquiring a scene template corresponding to the target dance; and rendering the data in the scene template and combining it with the real scene to obtain the virtual dance scene.
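As an illustration of the template lookup and combination step, the sketch below pairs a scene template with a live camera frame. The template names and fields are assumptions, since the disclosure does not specify a template format:

```python
# Hypothetical sketch: look up a scene template for the target dance
# and pair it with the live camera frame to form the virtual scene.
# Template names and fields are illustrative assumptions.
SCENE_TEMPLATES = {
    "waltz": {"floor": "ballroom", "lighting": "warm"},
    "hip_hop": {"floor": "street", "lighting": "neon"},
}

def build_virtual_scene(target_dance, real_frame):
    template = SCENE_TEMPLATES.get(target_dance)
    if template is None:
        raise KeyError(f"no scene template for {target_dance!r}")
    # A real AR pipeline would render the template assets and composite
    # them over the camera frame; here we only combine the records.
    return {"template": template, "background": real_frame}

scene = build_virtual_scene("waltz", real_frame="camera_frame_0")
```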
Step 230: and obtaining dance data of the performer in the virtual dance scene.
Specifically: acquiring a plurality of motion parameters for each of a plurality of body parts.
Step 240: and analyzing the dance data, and displaying the analysis result and a dance training method corresponding to the analysis result through a display screen of the AR equipment.
Specifically: analyzing the dance data through the preset dance analysis model to obtain the analysis result; examining the analysis result to determine the performer's dance movements that do not meet the preset requirement; and displaying, on the display screen, the standard dance movements corresponding to the non-compliant dance movements, dance training suggestions and the analysis result.
In addition, the specific method and steps shown in fig. 2 may refer to the method shown in fig. 1, which is not repeated herein.
The dance training method has been described above with reference to figs. 1 and 2; the dance training apparatus is described below in conjunction with figs. 3 and 4.
Referring to fig. 3, a schematic block diagram of an apparatus 300 for dance training according to an embodiment of the present application is shown, where the apparatus 300 may be a module, a program segment, or a code on an electronic device. The apparatus 300 corresponds to the embodiment of the method of fig. 1 described above, and is capable of performing the steps involved in the embodiment of the method of fig. 1. Specific functions of the apparatus 300 will be described below, and detailed descriptions thereof will be omitted herein as appropriate to avoid redundancy.
Optionally, the apparatus 300 includes:
a rendering module 310, configured to render a virtual dance scene corresponding to the target dance;
an obtaining module 320, configured to obtain dance data of a performer in a virtual dance scene, where the dance data includes a plurality of motion parameters of each of a plurality of body parts;
and the analysis module 330 is used for analyzing the dance data and displaying the analysis result and a dance training method corresponding to the analysis result through a display screen of the augmented reality AR device.
Optionally, the analysis module is specifically configured to:
analyzing the dance data through a preset dance analysis model to obtain the analysis result, where the dance analysis model is obtained by training a basic model on a plurality of training samples, and the training samples are obtained by marking multiple items of dance data from different users; examining the analysis result to determine the performer's dance movements that do not meet a preset requirement; and displaying, on the display screen, the standard dance movements corresponding to the non-compliant dance movements, dance training suggestions and the analysis result.
Optionally, the apparatus further includes:
a training module, configured to collect multiple items of dance data from different users while they dance, before the rendering module renders the virtual dance scene corresponding to the target dance; to mark the items of dance data respectively to obtain a plurality of training samples, where each training sample includes items of dance data and the corresponding marks; and to train the basic model on the training samples to obtain the dance analysis model.
Optionally, the rendering module is specifically configured to:
acquiring a scene template corresponding to a target dance; rendering data in the scene template, and combining the data with the real scene to obtain a virtual dance scene.
Optionally, the acquiring module is specifically configured to:
acquiring the accelerations, angular velocities and angular displacements of a plurality of body parts of the performer transmitted by the inertial sensor device; and/or acquiring the positions, velocities and accelerations of a plurality of body parts of the performer transmitted by the optical sensor device.
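Since the two sensor families report overlapping parameters (both can yield acceleration), the acquisition step implies merging their readings per body part. One minimal sketch of such a merge, with an assumed record layout and the assumed policy that optical readings override inertial ones where both exist:

```python
# Hypothetical sketch: merge inertial readings (acceleration, angular
# velocity, angular displacement) with optical readings (position,
# velocity, acceleration) into one per-body-part record. The record
# layout and override policy are illustrative assumptions.
def merge_readings(inertial, optical):
    """inertial / optical: dicts keyed by body part name.
    Optical values, when present, override inertial ones."""
    merged = {}
    for part in set(inertial) | set(optical):
        record = dict(inertial.get(part, {}))
        record.update(optical.get(part, {}))
        merged[part] = record
    return merged

inertial = {"left_arm": {"acceleration": 1.9, "angular_velocity": 120.0,
                         "angular_displacement": 45.0}}
optical = {"left_arm": {"position": (0.3, 1.1, 0.2), "velocity": 0.8,
                        "acceleration": 2.0}}
data = merge_readings(inertial, optical)
```

Real systems typically fuse the two streams with filtering (e.g. a Kalman filter) rather than a plain override; the sketch only shows how the "and/or" acquisition could populate one record per body part.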
Optionally, the dance training method includes: at least one of dance training difficulty selection, music selection and action guidance.
Optionally, the apparatus further includes:
a monitoring module, configured to establish a training plan corresponding to the dance training method after the analysis module analyzes the dance data and displays the analysis result and the corresponding dance training method through the display screen of the AR device; and, while the performer trains through the training plan, to monitor the performer's dance movements in real time and give adjustment suggestions.
Referring to fig. 4, which is a schematic structural diagram of a dance training apparatus according to an embodiment of the application, the apparatus may include a memory 410 and a processor 420. Optionally, the apparatus may further include: a communication interface 430 and a communication bus 440. The apparatus corresponds to the embodiment of the method of fig. 1 described above and is capable of performing the steps involved in that embodiment; specific functions of the apparatus are described below.
In particular, the memory 410 is used to store computer readable instructions.
The processor 420 is configured to execute the readable instructions stored in the memory so as to perform the various steps of the method in fig. 1.
The communication interface 430 is used for signaling or data communication with other node devices, for example with a server, a terminal, or other device nodes; embodiments of the application are not limited in this regard.
A communication bus 440 for enabling direct connection communication of the above-described components.
The memory 410 may be a high-speed RAM memory or a non-volatile memory, such as at least one disk memory. Optionally, the memory 410 may also be at least one storage device located remotely from the aforementioned processor. The memory 410 stores computer readable instructions which, when executed by the processor 420, perform the method process described above in fig. 1. The processor 420 may also be used in the apparatus 300 to perform the functions described in the present application. By way of example, the processor 420 may be a general purpose processor, a digital signal processor (DSP), an application specific integrated circuit (ASIC), a field programmable gate array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, or discrete hardware components; the embodiments of the application are not limited in this regard.
Embodiments of the present application also provide a readable storage medium storing a computer program which, when executed by a processor, performs the method process performed by the electronic device in the method embodiment shown in fig. 1.
It will be clear to those skilled in the art that, for convenience and brevity of description, reference may be made to the corresponding procedure in the foregoing method for the specific working procedure of the apparatus described above, and this will not be repeated here.
In summary, the embodiments of the present application provide a method, an apparatus, an electronic device, and a readable storage medium for dance training, where the method includes rendering a virtual dance scene corresponding to a target dance; obtaining dance data of a performer in a virtual dance scene, wherein the dance data comprises a plurality of motion parameters of each body part in a plurality of body parts; and analyzing the dance data, and displaying the analysis result and a dance training method corresponding to the analysis result through a display screen of the AR equipment. The method can achieve the effect of improving the accuracy of dance movements in the dance training process.
In the several embodiments provided in the present application, it should be understood that the disclosed apparatus and method may be implemented in other manners. The apparatus embodiments described above are merely illustrative, for example, of the flowcharts and block diagrams in the figures that illustrate the architecture, functionality, and operation of possible implementations of apparatus, methods and computer program products according to various embodiments of the present application. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
In addition, functional modules in the embodiments of the present application may be integrated together to form a single part, or each module may exist alone, or two or more modules may be integrated to form a single part.
The functions, if implemented in the form of software functional modules and sold or used as a stand-alone product, may be stored in a computer-readable storage medium. Based on this understanding, the technical solution of the present application may be embodied essentially or in a part contributing to the prior art or in a part of the technical solution, in the form of a software product stored in a storage medium, comprising several instructions for causing a computer device (which may be a personal computer, a server, a network device, etc.) to perform all or part of the steps of the method according to the embodiments of the present application. And the aforementioned storage medium includes: a U-disk, a removable hard disk, a Read-Only Memory (ROM), a random access Memory (RAM, random Access Memory), a magnetic disk, or an optical disk, or other various media capable of storing program codes.
The above description is only an example of the present application and is not intended to limit the scope of the present application, and various modifications and variations will be apparent to those skilled in the art. Any modification, equivalent replacement, improvement, etc. made within the spirit and principle of the present application should be included in the protection scope of the present application. It should be noted that: like reference numerals and letters denote like items in the following figures, and thus once an item is defined in one figure, no further definition or explanation thereof is necessary in the following figures.
The foregoing is merely illustrative of the present application, and the present application is not limited thereto, and any person skilled in the art will readily recognize that variations or substitutions are within the scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.
It is noted that relational terms such as first and second, and the like are used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. Moreover, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a …" does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises the element.

Claims (10)

1. A method of dance training applied to an augmented reality AR device, comprising:
rendering a virtual dance scene corresponding to the target dance;
obtaining dance data of a performer in the virtual dance scene, wherein the dance data comprises a plurality of motion parameters of each body part in a plurality of body parts;
analyzing the dance data, and displaying an analysis result and a dance training method corresponding to the analysis result through a display screen of the AR equipment.
2. The method of claim 1, wherein analyzing the dance data and displaying the analysis result and a dance training method corresponding to the analysis result through a display screen of the AR device, comprises:
analyzing the dance data through a preset dance analysis model to obtain an analysis result, wherein the dance analysis model is obtained by training a basic model through a plurality of training samples, and the training samples are obtained by marking a plurality of dance data of different users;
analyzing the analysis result to determine dance movements of the performer which do not meet preset requirements;
and displaying the standard dance movements, dance training suggestions and the analysis results corresponding to the dance movements which do not meet the preset requirements on the display screen.
3. The method of claim 2, wherein prior to rendering the virtual dance scene corresponding to the target dance, the method further comprises:
collecting the multiple items of dance data of different users when dancing;
marking the multiple items of dance data respectively to obtain multiple training samples, wherein each training sample in the multiple training samples comprises multiple items of dance data and corresponding marks;
and training the basic model through the training samples to obtain the dance analysis model.
4. The method of any one of claims 1-3, wherein rendering the virtual dance scene corresponding to the target dance comprises:
acquiring a scene template corresponding to the target dance;
and rendering the data in the scene template and combining the data with the real scene to obtain the virtual dance scene.
5. A method according to any one of claims 1-3, wherein said obtaining dance data of a performer in the virtual dance scene comprises:
acquiring accelerations, angular velocities and angular displacements of a plurality of body parts of the performer transmitted by an inertial sensor device;
and/or
the position, velocity and acceleration of a plurality of body parts of the performer transmitted by an optical sensor device are acquired.
6. A method according to any one of claims 1-3, wherein the dance training method comprises:
at least one of dance training difficulty selection, music selection and action guidance.
7. The method of any one of claims 1-3, wherein after said analyzing said dance data and displaying the analysis result and a dance training method corresponding to the analysis result through a display screen of said AR device, said method further comprises:
establishing a training plan corresponding to the dance training method;
and when the performer trains dance through the training program, monitoring dance movements of the performer in real time and giving adjustment suggestions.
8. A dance training apparatus, comprising:
the rendering module is used for rendering the virtual dance scene corresponding to the target dance;
an acquisition module, configured to obtain dance data of a performer in the virtual dance scene, wherein the dance data comprises a plurality of motion parameters of each body part in a plurality of body parts;
and the analysis module is used for analyzing the dance data and displaying an analysis result and a dance training method corresponding to the analysis result through a display screen of the augmented reality AR equipment.
9. An electronic device, comprising:
a memory and a processor, the memory storing computer readable instructions that, when executed by the processor, perform the steps in the method of any of claims 1-7.
10. A computer-readable storage medium, comprising:
computer program which, when run on a computer, causes the computer to perform the method according to any of claims 1-7.
CN202310897136.6A 2023-07-20 2023-07-20 Dance training method, device, equipment and readable storage medium Pending CN116899199A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310897136.6A CN116899199A (en) 2023-07-20 2023-07-20 Dance training method, device, equipment and readable storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202310897136.6A CN116899199A (en) 2023-07-20 2023-07-20 Dance training method, device, equipment and readable storage medium

Publications (1)

Publication Number Publication Date
CN116899199A true CN116899199A (en) 2023-10-20

Family

ID=88352659

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310897136.6A Pending CN116899199A (en) 2023-07-20 2023-07-20 Dance training method, device, equipment and readable storage medium

Country Status (1)

Country Link
CN (1) CN116899199A (en)

Similar Documents

Publication Publication Date Title
US10339972B2 (en) Systems and methods of interactive exercising
US20130177296A1 (en) Generating metadata for user experiences
CN104012038A (en) Sensor fusion interface for multiple sensor input
TWI681798B (en) Scoring method and system for exercise course and computer program product
Chen et al. Using real-time acceleration data for exercise movement training with a decision tree approach
CN104252712A (en) Image generating apparatus and image generating method
Zou et al. Intelligent fitness trainer system based on human pose estimation
CN110148072B (en) Sport course scoring method and system
Wei et al. Performance monitoring and evaluation in dance teaching with mobile sensing technology
Fei et al. Flow-pose Net: An effective two-stream network for fall detection
KR20010095900A (en) 3D Motion Capture analysis system and its analysis method
CN114543797A (en) Pose prediction method and apparatus, device, and medium
CN116958584A (en) Key point detection method, regression model training method and device and electronic equipment
Xu et al. [Retracted] An Inertial Sensing‐Based Approach to Swimming Pose Recognition and Data Analysis
WO2021157691A1 (en) Information processing device, information processing method, and information processing program
CN113449945B (en) Sport course scoring method and system
Peng Research on dance teaching based on motion capture system
CN116899199A (en) Dance training method, device, equipment and readable storage medium
US20190293779A1 (en) Virtual reality feedback device, and positioning method, feedback method and positioning system thereof
WO2022085069A1 (en) Exercise improvement instruction device, exercise improvement instruction method, and exercise improvement instruction program
TW202314249A (en) Positioning method, electronic equipment and computer-readable storage medium
Kerdvibulvech et al. Guitarist fingertip tracking by integrating a Bayesian classifier into particle filters
JP7044840B2 (en) Exercise course scoring method, exercise course scoring system, and program
TW202005407A (en) System for displaying hint in augmented reality to play continuing film and method thereof
Li et al. Real‐Time Capture of Snowboarder’s Skiing Motion Using a 3D Vision Sensor

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination