WO2022017181A1 - Interaction method and apparatus, device, and readable medium - Google Patents

Interaction method and apparatus, device, and readable medium

Info

Publication number
WO2022017181A1
WO2022017181A1 (PCT/CN2021/104899)
Authority
WO
WIPO (PCT)
Prior art keywords
user
icon
music
action
highlighted
Prior art date
Application number
PCT/CN2021/104899
Other languages
English (en)
French (fr)
Inventor
Wang Hongdan (王虹丹)
Original Assignee
Beijing Bytedance Network Technology Co., Ltd. (北京字节跳动网络技术有限公司)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Bytedance Network Technology Co., Ltd.
Priority to EP21846566.4A (published as EP4167067A4)
Priority to US18/005,812 (published as US20230298384A1)
Publication of WO2022017181A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/20 Movements or behaviour, e.g. gesture recognition
    • G06V40/28 Recognition of hand or arm movements, e.g. recognition of deaf sign language
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/04817 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance using icons
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/20 Input arrangements for video game devices
    • A63F13/21 Input arrangements for video game devices characterised by their sensors, purposes or types
    • A63F13/214 Input arrangements for video game devices characterised by their sensors, purposes or types for locating contacts on a surface, e.g. floor mats or touch pads
    • A63F13/2145 Input arrangements for video game devices characterised by their sensors, purposes or types for locating contacts on a surface, e.g. floor mats or touch pads the surface being also a display device, e.g. touch screens
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/40 Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment
    • A63F13/42 Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle
    • A63F13/426 Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle involving on-screen location information, e.g. screen coordinates of an area at which the player is aiming with a light gun
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/50 Controlling the output signals based on the game progress
    • A63F13/53 Controlling the output signals based on the game progress involving additional visual information provided to the game scene, e.g. by overlay to simulate a head-up display [HUD] or displaying a laser sight in a shooting game
    • A63F13/537 Controlling the output signals based on the game progress involving additional visual information provided to the game scene, e.g. by overlay to simulate a head-up display [HUD] or displaying a laser sight in a shooting game using indicators, e.g. showing the condition of a game character on screen
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/90 Constructional details or arrangements of video game devices not provided for in groups A63F13/20 or A63F13/25, e.g. housing, wiring, connections or cabinets
    • A63F13/92 Video game devices specially adapted to be hand-held while playing
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G PHYSICS
    • G10 MUSICAL INSTRUMENTS; ACOUSTICS
    • G10H ELECTROPHONIC MUSICAL INSTRUMENTS; INSTRUMENTS IN WHICH THE TONES ARE GENERATED BY ELECTROMECHANICAL MEANS OR ELECTRONIC GENERATORS, OR IN WHICH THE TONES ARE SYNTHESISED FROM A DATA STORE
    • G10H1/00 Details of electrophonic musical instruments
    • G10H1/0008 Associated control or indicating means
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/10 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals
    • A63F2300/1068 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals being specially adapted to detect the point of contact of the player on a surface, e.g. floor mat, touch pad
    • A63F2300/1075 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals being specially adapted to detect the point of contact of the player on a surface, e.g. floor mat, touch pad using a touch screen
    • G PHYSICS
    • G10 MUSICAL INSTRUMENTS; ACOUSTICS
    • G10H ELECTROPHONIC MUSICAL INSTRUMENTS; INSTRUMENTS IN WHICH THE TONES ARE GENERATED BY ELECTROMECHANICAL MEANS OR ELECTRONIC GENERATORS, OR IN WHICH THE TONES ARE SYNTHESISED FROM A DATA STORE
    • G10H2220/00 Input/output interfacing specifically adapted for electrophonic musical tools or instruments
    • G10H2220/091 Graphical user interface [GUI] specifically adapted for electrophonic musical instruments, e.g. interactive musical displays, musical instrument icons or menus; Details of user interactions therewith
    • G10H2220/101 Graphical user interface [GUI] specifically adapted for electrophonic musical instruments, e.g. interactive musical displays, musical instrument icons or menus; Details of user interactions therewith for graphical creation, edition or control of musical data or parameters
    • G10H2220/106 Graphical user interface [GUI] specifically adapted for electrophonic musical instruments, e.g. interactive musical displays, musical instrument icons or menus; Details of user interactions therewith for graphical creation, edition or control of musical data or parameters using icons, e.g. selecting, moving or linking icons, on-screen symbols, screen regions or segments representing musical elements or parameters

Definitions

  • the embodiments of the present disclosure relate to the technical field of application development, and in particular, to an interaction method, apparatus, device, and readable medium.
  • in some existing applications, new gesture icons slide into the interface at random and old gesture icons slide out while the music plays, and the user interacts with the application by matching them. However, because the sliding of the gesture icons occurs randomly and has no connection with the music being played, the user's actions during the interaction are also relatively random, lacking rhythm and aesthetics, and the form of the interactive content is relatively simple.
  • the embodiments of the present disclosure provide an interaction method, apparatus, device, and readable medium, which can realize the interaction between the user and the played music, and can also improve the diversity and interest of the interactive content as well as the coherence and aesthetics of the user's actions.
  • an embodiment of the present disclosure provides an interaction method, the method comprising:
  • the interaction completion degree of the user is determined according to the determined final matching result.
  • an embodiment of the present disclosure provides an interactive device, the device comprising:
  • a music playing module for playing the first music
  • an icon highlighting module for highlighting at least one action indication icon among a plurality of action indication icons displayed at a predetermined position in the user interface according to the target music beat in the first music
  • a body movement acquisition module, configured to acquire the body movements performed by the user
  • a limb matching judging module configured to determine whether the limb movement performed by the user matches the icon feature of the highlighted action indication icon
  • the interaction module is configured to determine the interaction completion degree of the user according to the determined final matching result after the first music finishes playing.
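  • The cooperation of the modules listed above can be sketched in Python. This is a minimal illustrative sketch; the class and method names, and the string-comparison stand-in for limb matching, are assumptions and are not taken from the claims:

```python
class InteractionApparatus:
    """Minimal sketch of the described apparatus (illustrative names)."""

    def __init__(self, icon_features):
        # icon_features: mapping from icon id to the body action it indicates
        self.icon_features = icon_features
        self.match_results = []  # one boolean per highlighted icon

    def on_icon_highlighted(self, icon_id, user_action):
        # limb matching judging module: compare the acquired body action
        # with the icon feature of the highlighted action indication icon
        matched = user_action == self.icon_features[icon_id]
        self.match_results.append(matched)
        return matched

    def interaction_completion(self):
        # interaction module: final matching result after playback ends
        if not self.match_results:
            return 0.0
        return sum(self.match_results) / len(self.match_results)
```

For example, two highlights with one successful match would yield a completion degree of 0.5.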
  • an embodiment of the present disclosure further provides an electronic device, the electronic device comprising:
  • one or more processors
  • memory for storing one or more programs
  • when the one or more programs are executed by the one or more processors, the one or more processors implement the interaction method as described in any embodiment of the present disclosure.
  • an embodiment of the present disclosure provides a computer-readable storage medium on which a computer program is stored, and when the computer program is executed by a processor, implements the interaction method described in any of the embodiments of the present disclosure.
  • an embodiment of the present disclosure further provides a computer program product, the computer program product comprising a computer program stored in a readable storage medium, wherein one or more processors of an electronic device can read the computer program from the readable storage medium and execute it, so that the electronic device executes the interaction method according to any embodiment of the present disclosure.
  • an embodiment of the present disclosure further provides a computer program stored in a readable storage medium, wherein one or more processors of an electronic device can read the computer program from the readable storage medium and execute it, so that the electronic device executes the interaction method according to any embodiment of the present disclosure.
  • An interaction method, apparatus, device, and readable medium provided by the embodiments of the present disclosure highlight, according to the target music beat in the played music, at least one action indication icon among a plurality of action indication icons displayed at predetermined positions in the user interface, so that the user can perform the corresponding physical action according to the highlighted action indication icon, thereby realizing the interaction between the user and both the played music and the user interface. At the same time, each time an icon is highlighted, it is determined whether the physical action performed by the user matches the icon feature of the highlighted action indication icon, and after the music finishes playing, the user's interaction completion degree is determined according to the final matching result. Highlighting at least one action indication icon according to the music beat can improve the diversity and interest of the interactive content, and also improve the coherence and aesthetics of the user's actions.
  • FIG. 1 shows a flowchart of an interaction method provided by an embodiment of the present disclosure
  • FIG. 2 shows a schematic interface diagram of displaying a plurality of action indication icons at predetermined positions and their highlighting process in a method provided by an embodiment of the present disclosure
  • FIG. 3 shows a flowchart of another interaction method provided by an embodiment of the present disclosure
  • FIG. 4 shows a schematic structural diagram of an interactive device provided by an embodiment of the present disclosure
  • FIG. 5 shows a schematic structural diagram of an electronic device provided by an embodiment of the present disclosure.
  • the term "including" and variations thereof are open-ended inclusions, i.e., "including but not limited to".
  • the term “based on” is “based at least in part on.”
  • the term “one embodiment” means “at least one embodiment”; the term “another embodiment” means “at least one additional embodiment”; the term “some embodiments” means “at least some embodiments”. Relevant definitions of other terms will be given in the description below.
  • FIG. 1 shows a flowchart of an interaction method provided by an embodiment of the present disclosure, and the embodiment of the present disclosure can be applied to any application program.
  • An interaction method provided by an embodiment of the present disclosure may be executed by an interactive apparatus provided by an embodiment of the present disclosure, and the apparatus may be implemented in software and/or hardware, and integrated in an electronic device that executes the method.
  • the electronic device that executes the method may include mobile terminals, such as smart phones, PDAs, tablet computers, wearable devices with display screens, etc., and may also include computer devices, such as desktop computers, notebook computers, all-in-one computers, and the like.
  • the interaction method provided in the embodiment of the present disclosure may include the following steps:
  • the present disclosure can solve the problems that the interaction form in existing application programs is relatively simple, that the user cannot interact with the played music and pictures, or that corresponding auxiliary equipment needs to be additionally provided for interaction.
  • the technical solution can realize the interaction between the user and the played music and the user interface, and also enables the user to record a corresponding interactive video when interacting, thereby improving the user's interactive experience.
  • the played first music in this embodiment may include pre-configured fixed music, or may include any music selected by the user from a music application with which a connection has been established in advance.
  • since any piece of music is composed of strong beats and weak beats combined under a fixed rule, and each music beat may have a different playback duration, in order to realize the interaction between the user and the played first music, at least one action indication icon among the plurality of action indication icons displayed at predetermined positions in the user interface can be highlighted for the user according to beat characteristics of the first music such as its strong beats, weak beats, and playback durations, so that the user performs different body movements under different music beats, thereby realizing the interaction between the user and both the played first music and the user interface.
  • the first music may include music that needs to be played during the current interactive execution process selected by the user from a pre-configured fixed music collection or from a pre-connected music application.
  • the beat detection of the first music may be performed in advance to determine the target music beats in the first music, so that the first music is matched with the accurate target music beats beforehand. The electronic device can then determine all the target music beats existing in the first music according to this pre-matching, so as to determine during subsequent playback whether the current playback has reached a target music beat.
  • the electronic device may also detect the current playing beat of the first music in real time during playback, and determine whether the current playing beat is a target music beat.
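  • As a sketch of this pre-matching, the timestamps of the target beats could be precomputed and the current playback position tested against them. The bar-based beat spacing and the tolerance value below are assumptions for illustration, not details from the patent:

```python
def target_beat_times(bpm, beats_per_bar, duration_s):
    """Timestamps of the strong beat (first beat of each bar)."""
    bar_len = beats_per_bar * 60.0 / bpm
    times, t = [], 0.0
    while t < duration_s:
        times.append(round(t, 3))
        t += bar_len
    return times

def is_target_beat(position_s, beat_times, tolerance_s=0.05):
    """True if playback has reached a pre-detected target music beat."""
    return any(abs(position_s - t) <= tolerance_s for t in beat_times)
```

For a 120 BPM track in 4/4 time, `target_beat_times(120, 4, 8.0)` gives a strong beat every 2 seconds: `[0.0, 2.0, 4.0, 6.0]`.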
  • the target music beat can be set as the strong beat of the music.
  • at each strong beat in the first music, at least one action indication icon among the plurality of action indication icons displayed at predetermined positions in the user interface is highlighted, so that the user performs the corresponding physical action on the strong beats but does not need to perform any physical action on the weak beats, thereby improving the coherence and aesthetics of the user's actions.
  • in order to ensure accurate interaction between the user and the first music, in this embodiment, according to the user's requirements for music interaction, action indication icons adapted to each music beat can be highlighted at predetermined positions in the user interface under different music beats, so as to instruct the user to perform the physical actions indicated by the corresponding action indication icons under different music beats, thereby enabling the user to interact with the played first music.
  • the electronic device determines, according to the playback progress of the first music, whether a target music beat is currently being played; each time it detects that a target music beat of the first music is being played, it selects and highlights at least one of the plurality of action indication icons displayed at predetermined positions in the user interface, according to the music attributes of the first music and the icons displayed at those positions.
  • the electronic device can also enhance the variety and interest of the interactive content by highlighting at least one action indicating icon displayed at a predetermined position in the user interface.
  • when the target music beat is the strong beat, the electronic device does not highlight any action indication icon when the first music is played to a weak beat; when the first music is played to a strong beat, the electronic device highlights at least one action indication icon, so as to realize the interactive operation between the user and the first music.
  • action indication icons representing different body movements may be set at different predetermined positions of the user interface, respectively.
  • the action indication icons may be gesture icons.
  • corresponding gesture icons are respectively set at these positions, and when detecting that the current playing beat of the first music is a strong beat, the electronic device can highlight at least one gesture icon among the gesture icons displayed at predetermined positions in the user interface, so that the user interacts with the music being played by moving the corresponding palm to match the highlighted gesture icon.
  • the music attribute in the present disclosure may include music characteristics such as the music type, music rhythm, and music volume of the first music.
  • at least one action indication icon can be selected and highlighted from a plurality of action indication icons displayed at predetermined positions in the user interface.
  • the user performs the corresponding action according to the highlighted action indication icon, which not only conforms to the musical attribute of the first music, but also has the aesthetic feeling of the action. For example, when the rhythm of the first music is slow, the variation range of each action of the user may be small. At this time, the position change between the action indication icons highlighted by the electronic device each time is also correspondingly small.
  • under strong beat 1, the electronic device highlights icon A at the lowermost middle position, and under strong beat 2, the electronic device highlights icon B at the lower left adjacent to icon A.
  • in order to ensure that the user can successfully perform the corresponding physical action, if each of the plurality of action indication icons displayed at predetermined positions in the user interface requires interaction through user gestures, that is, if the action indication icons are gesture icons, then at most two action indication icons are selected for highlighting, that is, one or two action indication icons are highlighted, so that the user can successfully perform the corresponding physical action.
  • the present disclosure does not limit the number of highlighted action indication icons, and corresponding highlighted action indication icons can be set according to different music and actions.
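  • A possible selection rule consistent with these constraints can be sketched as follows. The tempo threshold, the adjacency rule for slow music, and the random choice among candidates are assumptions for illustration only:

```python
import random

def pick_highlight_icons(n_icons, prev_idx, tempo_bpm, max_icons=2):
    """Pick one or two icons to highlight for this strong beat.

    For a slow tempo, keep the positional change small by choosing an
    icon adjacent to the previously highlighted one.
    """
    if tempo_bpm < 90 and prev_idx is not None:
        # only icons directly adjacent to the previous one
        candidates = [i for i in range(n_icons) if 0 < abs(i - prev_idx) <= 1]
        count = 1
    else:
        candidates = list(range(n_icons))
        count = random.randint(1, max_icons)
    return random.sample(candidates, min(count, len(candidates)))
```

Capping `max_icons` at 2 mirrors the text's point that a user has at most two hands available for gesture icons.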
  • the user will perform a corresponding physical action according to the highlighted action indication icon, so as to match the highlighted action indication icon and interact with the played first music.
  • the physical action performed by the user may be acquired to determine whether the physical action performed by the user matches the icon feature of the highlighted action indicating icon.
  • S140 Determine whether the physical action performed by the user matches the icon feature of the highlighted action indication icon.
  • At least one action indication icon is highlighted from among a plurality of action indication icons displayed at predetermined positions in the user interface.
  • the user can perform the corresponding physical action according to the highlighted action indication icon.
  • each time at least one action indication icon is highlighted, the electronic device needs to determine the match between the physical action performed by the user and the icon features of the highlighted action indication icon.
  • the icon feature may include at least one of a recognition range pre-specified by the highlighted action indication icon and a specific action indicated by the highlighted action indication icon. Therefore, after each time highlighting at least one action indication icon among the plurality of action indication icons displayed at a predetermined position in the user interface, the electronic device needs to determine the physical action performed by the user when the action indication icon is highlighted this time. Whether it matches the icon feature of the highlighted action indication icon, and then the final matching result at the end of music playback is determined through the matching result at each highlighted display.
  • when the icon feature includes the recognition range pre-specified by the highlighted action indication icon, the electronic device determines accordingly whether the body action performed by the user matches the icon feature of the highlighted action indication icon.
  • when the icon feature includes the specific action indicated by the highlighted action indication icon, the physical action performed by the user is matched against the action indicated by the highlighted action indication icon to determine whether the physical action performed by the user matches the icon feature of the highlighted action indication icon.
  • by judging, each time at least one action indication icon is highlighted, whether the physical action performed by the user successfully matches the highlighted action indication icon, it is possible to determine the success rate of the physical actions the user performed according to the highlighted action indication icons, that is, the user's interaction completion degree.
  • the end of playing the first music may include the following two situations: 1) the playback ends automatically after the first music finishes; 2) the playback is ended passively during the playing process according to an exit operation issued by the user.
  • the electronic device may determine the interaction completion degree of the user interaction. The electronic device may, according to the matching result between the physical action performed by the user and the icon feature of the highlighted action indicating icon when at least one action indicating icon is highlighted each time under the target music beat at different playback moments, determine that the user is in the The final match result during this interaction.
  • during the playback of the first music, from the number of occurrences N of highlighted action indication icons and the number of successful matches P between the physical actions performed by the user and the icon features of the highlighted action indication icons, the electronic device can determine the final matching result (for example, P/N) of this interaction, which can accurately represent the success rate of the user in completing this interaction, and then determine the user's current interaction completion degree according to the determined final matching result.
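  • The final matching result described here reduces to a simple ratio; the sketch below also shows it scaled to a 0-100 score, which is one assumed display choice (the text leaves the display form open):

```python
def interaction_completion(n_highlights, n_successes, as_score=False):
    """Final matching result P/N; optionally shown as a 0-100 score."""
    if n_highlights == 0:
        return 0.0
    ratio = n_successes / n_highlights
    return round(ratio * 100) if as_score else ratio
```

For instance, 15 successful matches over 20 highlights yields a ratio of 0.75, or a score of 75.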
  • the interaction completion degree of the user may be displayed in different forms such as an interaction score or an interaction ratio, which is not limited in this disclosure.
  • each matching result obtained after an action indication icon is highlighted may be summed and averaged, and the average matching result may be used as the user's interaction completion degree.
  • At least one action indication icon among a plurality of action indication icons displayed at predetermined positions in the user interface is highlighted according to the target music beat in the played music, so that the user can perform the corresponding physical action according to the highlighted action indication icon, thereby realizing the interaction between the user and both the played music and the user interface. At the same time, each time an icon is highlighted, it is determined whether the physical action performed by the user matches the icon feature of the highlighted action indication icon, and after the music finishes playing, the user's interaction completion degree is determined according to the final matching result. Highlighting at least one action indication icon according to the music beat can improve the diversity and interest of the interactive content, and also improve the coherence and aesthetics of the user's actions.
  • the judging process of determining whether the body motion performed by the user matches the icon feature of the highlighted action indication icon in the interaction method provided by the embodiment of the present disclosure is further described.
  • the icon feature includes a pre-specified identification range of the highlighted action indication icon
  • in the above interaction method, determining whether the limb action performed by the user matches the icon feature of the highlighted action indication icon may include: detecting whether a specific limb of the user is located in the pre-display area of the highlighted action indication icon within a preset highlighting period; and determining, according to the detection result, whether the physical action performed by the user matches the icon feature of the highlighted action indication icon.
  • since each action indication icon is set with a corresponding pre-display area in the user interface, when the user performs the corresponding physical action according to the highlighted action indication icon, the electronic device can prompt the user to move the limb corresponding to the action indication icon into the pre-display area where the highlighted action indication icon is located.
  • this embodiment also presets The corresponding preset highlighting period prompts the user to complete the execution of the body movement corresponding to the highlighted action indication icon within the preset highlighting period. If the electronic device does not detect that the specific limb of the user is located in the pre-display area of the highlighted action indication icon within the preset highlighting period, it means that the physical action performed by the user is the same as that indicated by the highlighted action indication icon. There was no successful match between the icon characteristics of .
  • the specific limb of the user is located in the pre-display area of the highlighted action indication icon within the preset highlighting period, it means that the user has moved the specific limb to the highlighted display during the preset highlighting period
  • the pre-display area of the action indication icon it is determined that the physical action performed by the user is successfully matched with the highlighted action indication icon; and if the specific limb of the user is not detected in the highlighted display period within the preset highlighting period.
  • the pre-display area of the action indication icon it means that the user cannot move a specific limb into the pre-display area of the highlighted action indication icon within the preset highlight period. The highlighted action indicates that the icon was not successfully matched.
Detecting whether the specific limb of the user is located in the pre-display area of the highlighted action indication icon within the preset highlighting period may include: collecting the user's body motion images in real time during the preset highlighting period, and determining from these images whether the specific limb is located in the pre-display area. Each body motion image records a different position of the user's moving limb, so the electronic device can track the limb across the collected images.
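The pre-display-area check can be sketched as a simple containment test over the limb positions extracted from the body motion images. `Rect` and `limb_entered_area` are illustrative names; in practice the coordinates would come from a camera-based limb tracker.

```python
from dataclasses import dataclass
from typing import Iterable, Tuple

@dataclass
class Rect:
    """Pre-display area of a highlighted icon, in screen coordinates."""
    x: float
    y: float
    w: float
    h: float

    def contains(self, px: float, py: float) -> bool:
        return self.x <= px <= self.x + self.w and self.y <= py <= self.y + self.h

def limb_entered_area(limb_track: Iterable[Tuple[float, float]], area: Rect) -> bool:
    """Return True if any sampled limb position during the preset
    highlighting period falls inside the icon's pre-display area.
    `limb_track` holds (x, y) positions extracted from the body motion
    images collected in real time."""
    return any(area.contains(x, y) for (x, y) in limb_track)
```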
When the icon features include a specific action pre-specified by the highlighted action indication icon, the process is as follows. After detecting that the first music has currently played to a target music beat, the electronic device highlights at least one of the action indication icons displayed at predetermined positions in the user interface, and the user performs the corresponding physical action according to the specific action indicated by the highlighted icon. Each time at least one icon is highlighted and the physical action performed by the user is obtained, it is determined within the preset highlighting period whether that action is consistent with the specific action indicated by the highlighted icon, so as to judge the degree of matching between them.

Specifically, the electronic device collects the user's body motion images in real time; each image records the body action performed by the user. For each image collected during the preset highlighting period, the electronic device extracts the key feature points of the specific action indicated by the highlighted icon as well as the key feature points of the physical action performed by the user, and then analyzes the matching relationship between these key feature points to determine the action posture indicated by the highlighted icon and the action posture performed by the user. By judging whether the two postures are consistent, it can determine whether the physical action performed by the user matches the specific action indicated by the highlighted action indication icon.
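The key-feature-point comparison can be sketched as a per-joint distance test between the user's pose and the pose specified by the icon. The joint names, normalized coordinates, and tolerance below are assumptions for illustration; a production system would use the landmark set of an actual pose estimator.

```python
import math
from typing import Dict, Tuple

def pose_matches(
    user_kps: Dict[str, Tuple[float, float]],
    icon_kps: Dict[str, Tuple[float, float]],
    tol: float = 0.2,
) -> bool:
    """Compare key feature points of the user's action with those of the
    specific action indicated by the highlighted icon. Both inputs map a
    joint name to an (x, y) point in normalized coordinates; the postures
    are considered consistent when every shared joint lies within `tol`."""
    shared = set(user_kps) & set(icon_kps)
    if not shared:
        return False
    return all(math.dist(user_kps[j], icon_kps[j]) <= tol for j in shared)
```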
FIG. 3 shows a flowchart of another interaction method provided by an embodiment of the present disclosure. This embodiment is optimized on the basis of the optional solutions provided by the foregoing embodiments; specifically, it introduces additional interaction processes in detail. The method in this embodiment may include the following steps:
S310: Recognize a specific limb of the user in the user interface, and display an initial interface according to the recognition result, where the initial interface includes a plurality of action indication icons displayed at predetermined positions.

The electronic device can detect whether the specific limb specified by the action indication icons has been placed within the user interface. When the electronic device recognizes a specific limb of the user in the user interface, it displays the preset initial interface to the user, which may include the plurality of action indication icons displayed at predetermined positions; at the same time, the electronic device starts to play the first music so that the user can participate in this music interaction. If the specific limb of the user is not recognized in the user interface, the electronic device can display corresponding prompt information to the user, prompting the user to place the specific limb within the user interface to facilitate the interaction.
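The start-up check in S310 can be sketched as a small state decision, assuming a limb detector that reports which limbs are currently visible; the state names below are purely illustrative.

```python
from typing import Set

def initial_interface_state(detected_limbs: Set[str], required: str = "hand") -> str:
    """Sketch of the S310 start-up check: if the required limb is
    recognized in the interface, show the initial icon layout and start
    the music; otherwise show a prompt asking the user to move the limb
    into view. Returns a UI state name."""
    if required in detected_limbs:
        return "show_initial_interface_and_play_music"
    return "prompt_place_limb_in_interface"
```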
To improve the display diversity of the multiple action indication icons displayed at predetermined positions in the user interface, the present disclosure can also detect the playback rhythm of the first music in real time within the preset highlighting period of the highlighted icons, set a corresponding flicker frequency for the highlighted icons according to that rhythm, and then control the highlighted icons to flicker at that frequency. Controlling the flicker frequency of the highlighted icons according to the music rhythm improves the interactive interest between the user and the first music.
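A minimal sketch of mapping the detected playback rhythm to a flicker frequency, assuming the rhythm is available as a tempo in beats per minute; the flashes-per-beat factor is an illustrative assumption.

```python
def flicker_frequency(bpm: float, flashes_per_beat: int = 2) -> float:
    """Map the playback rhythm of the first music (beats per minute) to a
    blink frequency in Hz for the highlighted icon, so that the icon
    flickers in step with the music."""
    beats_per_second = bpm / 60.0
    return beats_per_second * flashes_per_beat
```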
FIG. 4 shows a schematic structural diagram of an interactive apparatus provided by an embodiment of the present disclosure. This embodiment of the present disclosure is applicable to any application program. The apparatus can be implemented by software and/or hardware and integrated in the electronic device that executes this method. Specifically, the interactive apparatus in this embodiment of the present disclosure may include:
a music playing module 410, configured to play the first music;

an icon highlighting module 420, configured to highlight at least one of a plurality of action indication icons displayed at predetermined positions in the user interface according to the target music beat in the first music;

a body motion acquisition module 430, configured to acquire physical actions performed by the user;

a limb matching judgment module 440, configured to determine whether the physical action performed by the user matches the icon features of the highlighted action indication icon; and

an interaction module 450, configured to determine the user's interaction completion degree according to the final matching results after the first music finishes playing.
The icon highlighting module 420 may be specifically configured to: highlight the at least one action indication icon according to the music attributes of the first music and the positional association of the plurality of action indication icons displayed at predetermined positions in the user interface.
The limb matching judgment module 440 may be specifically configured to: detect, within a preset highlighting period, whether a specific limb of the user is located in the pre-display area of the highlighted action indication icon, and determine, according to the detection result, whether the physical action performed by the user matches the icon features of the highlighted icon.
The limb matching judgment module 440 may also be specifically configured to: collect the user's body motion images in real time within the preset highlighting period, and determine, according to the body motion images, whether the specific limb of the user is located in the pre-display area of the highlighted action indication icon.
The limb matching judgment module 440 may also be specifically configured to: match the physical action performed by the user against the action identified in the highlighted action indication icon, so as to determine whether the physical action performed by the user matches the icon features of the highlighted icon.
The above interactive apparatus may also include: an icon flickering module, configured to control the flicker frequency of the highlighted action indication icon according to the playback rhythm of the first music within a preset highlighting period.
The above interactive apparatus may also include: a limb recognition module, configured to recognize a specific limb of the user in the user interface and display an initial interface according to the recognition result, where the initial interface includes a plurality of action indication icons displayed at predetermined positions.
The above interactive apparatus may also include: a limb recognition prompt module, configured to display corresponding prompt information to the user in the user interface if the specific limb of the user is not recognized in the user interface.
The interaction module 450 may be specifically configured to: determine, after the first music finishes playing, the user's interaction completion degree according to the matching results determined at each highlighting.

The interactive apparatus provided by this embodiment of the present disclosure and the interaction method provided by the above embodiments belong to the same inventive concept, and this embodiment has the same beneficial effects as the above embodiments.
Referring now to FIG. 5, it shows a schematic structural diagram of an electronic device 500 suitable for implementing an embodiment of the present disclosure. Electronic devices in the embodiments of the present disclosure may include, but are not limited to, mobile terminals such as mobile phones, notebook computers, digital broadcast receivers, PDAs (personal digital assistants), PADs (tablets), PMPs (portable multimedia players), vehicle-mounted terminals (e.g., vehicle navigation terminals), and wearable electronic devices, as well as stationary terminals such as digital TVs and desktop computers. The electronic device shown in FIG. 5 is only an example and should not impose any limitation on the functions and scope of use of the embodiments of the present disclosure.
As shown in FIG. 5, the electronic device 500 may include a processing device (e.g., a central processing unit, a graphics processing unit, etc.) 501, which can perform various appropriate actions and processes according to a program stored in a read-only memory (ROM) 502 or a program loaded from a storage device 508 into a random access memory (RAM) 503. The RAM 503 also stores various programs and data required for the operation of the electronic device 500. The processing device 501, the ROM 502, and the RAM 503 are connected to each other through a bus 504, and an input/output (I/O) interface 505 is also connected to the bus 504.

Generally, the following devices can be connected to the I/O interface 505: input devices 506 including, for example, a touch screen, touch pad, keyboard, mouse, camera, microphone, accelerometer, and gyroscope; output devices 507 including, for example, a liquid crystal display (LCD), a speaker, and a vibrator; storage devices 508 including, for example, a magnetic tape and a hard disk; and a communication device 509. The communication device 509 may allow the electronic device 500 to communicate wirelessly or by wire with other devices to exchange data. While FIG. 5 shows the electronic device 500 having various devices, it should be understood that not all of the illustrated devices are required to be implemented or provided; more or fewer devices may alternatively be implemented or provided.
In particular, embodiments of the present disclosure include a computer program product comprising a computer program carried on a non-transitory computer-readable medium, the computer program containing program code for performing the method illustrated in the flowchart. The computer program may be downloaded and installed from a network via the communication device 509, or installed from the storage device 508, or installed from the ROM 502. When the computer program is executed, the steps in the method of the embodiments of the present disclosure are executed, so as to realize the above functions defined by the computer program.
It should be noted that the computer-readable medium mentioned above in the present disclosure may be a computer-readable signal medium or a computer-readable storage medium, or any combination of the two. A computer-readable storage medium may be, for example, but is not limited to, an electrical, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the above. More specific examples of computer-readable storage media may include, but are not limited to, an electrical connection with one or more wires, a portable computer disk, a hard disk, random access memory (RAM), read-only memory (ROM), erasable programmable read-only memory (EPROM or flash memory), an optical fiber, portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the above. In the present disclosure, a computer-readable storage medium may be any tangible medium that contains or stores a program that can be used by or in conjunction with an instruction execution system, apparatus, or device.
A computer-readable signal medium may include a data signal propagated in baseband or as part of a carrier wave, with computer-readable program code embodied thereon. Such a propagated data signal may take a variety of forms, including but not limited to an electromagnetic signal, an optical signal, or any suitable combination of the foregoing. A computer-readable signal medium can also be any computer-readable medium other than a computer-readable storage medium that can send, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. The program code embodied on the computer-readable medium may be transmitted by any suitable medium, including but not limited to an electric wire, an optical fiber cable, radio frequency (RF), or any suitable combination of the above.
In some embodiments, electronic devices can communicate using any currently known or future developed network protocol, such as HTTP (HyperText Transfer Protocol), and can be interconnected with digital data communication in any form or medium (e.g., a communication network). Examples of communication networks include a local area network ("LAN"), a wide area network ("WAN"), an internet (e.g., the Internet), a peer-to-peer network (e.g., an ad hoc peer-to-peer network), and any currently known or future developed network.
The above computer-readable medium may be included in the above electronic device, or may exist alone without being assembled into the electronic device. The above computer-readable medium carries one or more programs, and when the one or more programs are executed by the electronic device, the electronic device is caused to: play first music; highlight, according to a target music beat in the first music, at least one of a plurality of action indication icons displayed at predetermined positions in a user interface; acquire a physical action performed by the user; determine whether the physical action performed by the user matches the icon features of the highlighted action indication icon; and after the first music finishes playing, determine the user's interaction completion degree according to the determined final matching results.
Computer program code for performing the operations of the present disclosure may be written in one or more programming languages or a combination thereof, including but not limited to object-oriented programming languages such as Java, Smalltalk, and C++, as well as conventional procedural programming languages such as the "C" language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on a remote computer or server. Where a remote computer is involved, the remote computer may be connected to the user's computer through any kind of network, including a local area network (LAN) or a wide area network (WAN), or may be connected to an external computer (e.g., through the Internet using an Internet service provider).
Each block in the flowcharts or block diagrams may represent a module, a program segment, or a portion of code that contains one or more executable instructions for implementing the specified logical functions. It should also be noted that, in some alternative implementations, the functions noted in the blocks may occur out of the order noted in the figures; for example, two blocks shown in succession may in fact be executed substantially concurrently, or sometimes in the reverse order, depending upon the functions involved. Each block of the block diagrams and/or flowcharts, and combinations of blocks in the block diagrams and/or flowcharts, can be implemented by a dedicated hardware-based system that performs the specified functions or operations, or by a combination of dedicated hardware and computer instructions.
The units involved in the embodiments of the present disclosure may be implemented in software or in hardware, and the name of a unit does not, under certain circumstances, constitute a limitation of the unit itself. For example, without limitation, exemplary types of hardware logic components that can be used include: field-programmable gate arrays (FPGAs), application-specific integrated circuits (ASICs), application-specific standard products (ASSPs), systems on chip (SOCs), complex programmable logic devices (CPLDs), and so on.
In the context of the present disclosure, a machine-readable medium may be a tangible medium that can contain or store a program for use by or in connection with an instruction execution system, apparatus, or device. The machine-readable medium may be a machine-readable signal medium or a machine-readable storage medium, and may include, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. More specific examples of machine-readable storage media would include an electrical connection based on one or more wires, a portable computer disk, a hard disk, random access memory (RAM), read-only memory (ROM), erasable programmable read-only memory (EPROM or flash memory), an optical fiber, compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
An interaction method provided according to one or more embodiments of the present disclosure includes: playing first music; highlighting, according to a target music beat in the first music, at least one of a plurality of action indication icons displayed at predetermined positions in a user interface; acquiring a physical action performed by the user; determining whether the physical action performed by the user matches the icon features of the highlighted action indication icon; and after the first music finishes playing, determining the user's interaction completion degree according to the determined final matching results.
Optionally, highlighting at least one of the plurality of action indication icons displayed at predetermined positions in the user interface includes: highlighting the at least one action indication icon according to the music attributes of the first music and the positional association of the plurality of action indication icons displayed at predetermined positions in the user interface.
Optionally, determining whether the physical action performed by the user matches the icon features of the highlighted action indication icon includes: detecting, within a preset highlighting period, whether a specific limb of the user is located in the pre-display area of the highlighted action indication icon; and determining, according to the detection result, whether the physical action performed by the user matches the icon features of the highlighted icon.
Optionally, detecting within the preset highlighting period whether the specific limb of the user is located in the pre-display area of the highlighted action indication icon includes: collecting the user's body motion images in real time within the preset highlighting period, and determining, according to the body motion images, whether the specific limb of the user is located in the pre-display area of the highlighted action indication icon.
Optionally, determining whether the physical action performed by the user matches the icon features of the highlighted action indication icon includes: matching the physical action performed by the user against the action identified in the highlighted action indication icon to determine whether the physical action performed by the user matches the icon features of the highlighted icon.
Optionally, the above method further includes: controlling, within a preset highlighting period, the flicker frequency of the highlighted action indication icon according to the playback rhythm of the first music.
Optionally, the above method further includes: recognizing a specific limb of the user in the user interface, and displaying an initial interface according to the recognition result, where the initial interface includes a plurality of action indication icons displayed at predetermined positions.

Optionally, the above method further includes: displaying corresponding prompt information to the user in the user interface if the specific limb of the user is not recognized in the user interface.

Optionally, determining the user's interaction completion degree according to the determined final matching results includes: determining, after the first music finishes playing, the interaction completion degree according to the matching results determined at each highlighting.
An interactive apparatus provided according to one or more embodiments of the present disclosure includes:

a music playing module, configured to play first music;

an icon highlighting module, configured to highlight at least one of a plurality of action indication icons displayed at predetermined positions in a user interface according to a target music beat in the first music;

a body motion acquisition module, configured to acquire physical actions performed by the user;

a limb matching judgment module, configured to determine whether the physical action performed by the user matches the icon features of the highlighted action indication icon; and

an interaction module, configured to determine the user's interaction completion degree according to the determined final matching results after the first music finishes playing.
Optionally, the icon highlighting module is specifically configured to: highlight the at least one action indication icon according to the music attributes of the first music and the positional association of the plurality of action indication icons displayed at predetermined positions in the user interface.

Optionally, the limb matching judgment module is specifically configured to: detect, within a preset highlighting period, whether a specific limb of the user is located in the pre-display area of the highlighted action indication icon, and determine, according to the detection result, whether the physical action performed by the user matches the icon features of the highlighted icon.

Optionally, the limb matching judgment module is also specifically configured to: collect the user's body motion images in real time within the preset highlighting period, and determine, according to the body motion images, whether the specific limb of the user is located in the pre-display area of the highlighted action indication icon.

Optionally, the limb matching judgment module is also specifically configured to: match the physical action performed by the user against the action identified in the highlighted action indication icon to determine whether the physical action performed by the user matches the icon features of the highlighted icon.
Optionally, the above apparatus further includes: an icon flickering module, configured to control the flicker frequency of the highlighted action indication icon according to the playback rhythm of the first music within a preset highlighting period.

Optionally, the above apparatus further includes: a limb recognition module, configured to recognize a specific limb of the user in the user interface and display an initial interface according to the recognition result, where the initial interface includes a plurality of action indication icons displayed at predetermined positions.

Optionally, the above apparatus further includes: a limb recognition prompt module, configured to display corresponding prompt information to the user in the user interface if the specific limb of the user is not recognized in the user interface.

Optionally, the above interaction module is specifically configured to: determine, after the first music finishes playing, the user's interaction completion degree according to the matching results determined at each highlighting.
An electronic device provided according to one or more embodiments of the present disclosure includes:

one or more processors; and

a memory for storing one or more programs;

where, when the one or more programs are executed by the one or more processors, the one or more processors implement the interaction method described in any embodiment of the present disclosure.
A computer-readable storage medium is provided according to one or more embodiments of the present disclosure, on which a computer program is stored; when the computer program is executed by a processor, the interaction method described in any embodiment of the present disclosure is implemented.
A computer program product provided according to one or more embodiments of the present disclosure includes a computer program stored in a readable storage medium; one or more processors of an electronic device can read the computer program from the readable storage medium and execute it, so that the electronic device performs the solution provided by any of the foregoing embodiments.

A computer program is also provided according to one or more embodiments of the present disclosure. The computer program is stored in a readable storage medium; one or more processors of an electronic device can read the computer program from the readable storage medium and execute it, so that the electronic device performs the interaction method described in any embodiment of the present disclosure.


Abstract

An interaction method, apparatus, device, and readable medium. The method includes: playing first music (S110); highlighting, according to a target music beat in the first music, at least one of a plurality of action indication icons displayed at predetermined positions in a user interface (S120); acquiring a physical action performed by a user (S130); determining whether the physical action performed by the user matches the icon features of the highlighted action indication icon (S140); and after the first music finishes playing, determining the user's interaction completion degree according to the determined final matching results (S150). The method avoids device complexity during interaction; at the same time, by displaying different action indication icons at predetermined positions in the user interface and highlighting at least one of them according to the music beat, it improves the diversity and interest of the interactive content as well as the coherence and aesthetics of the user's actions.

Description

Interaction method, apparatus, device, and readable medium
Cross-Reference to Related Application
This application claims priority to Chinese Patent Application No. 202010724931.1, filed with the Chinese Patent Office on July 24, 2020 and entitled "Interaction Method, Apparatus, Device, and Readable Medium", the entire contents of which are incorporated herein by reference.
Technical Field
Embodiments of the present disclosure relate to the technical field of application development, and in particular to an interaction method, apparatus, device, and readable medium.
Background
With the rapid development of Internet technology, applications of all kinds commonly develop a large number of entertainment games or props for users in order to increase user retention and improve the service effect of the application.
At present, among the existing entertainment games or props developed within applications, new gesture icons typically slide into the interface at random as music plays while old gesture icons slide out, and the interaction between the user and the application is realized on this basis. However, the sliding gesture icons in current applications appear randomly and bear no relationship to the music being played. As a result, the user's actions during the interaction are also rather random and lack rhythm and aesthetics, and the form of the interactive content is relatively monotonous.
Summary
In view of this, embodiments of the present disclosure provide an interaction method, apparatus, device, and readable medium, which realize interaction between a user and the played music, and can also improve the diversity and interest of the interactive content as well as the coherence and aesthetics of the user's actions.
In a first aspect, an embodiment of the present disclosure provides an interaction method, including:
playing first music;
highlighting, according to a target music beat in the first music, at least one of a plurality of action indication icons displayed at predetermined positions in a user interface;
acquiring a physical action performed by a user;
determining whether the physical action performed by the user matches the icon features of the highlighted action indication icon; and
after the first music finishes playing, determining the user's interaction completion degree according to the determined final matching results.
In a second aspect, an embodiment of the present disclosure provides an interactive apparatus, including:
a music playing module, configured to play first music;
an icon highlighting module, configured to highlight, according to a target music beat in the first music, at least one of a plurality of action indication icons displayed at predetermined positions in a user interface;
a body motion acquisition module, configured to acquire a physical action performed by a user;
a limb matching judgment module, configured to determine whether the physical action performed by the user matches the icon features of the highlighted action indication icon; and
an interaction module, configured to determine, after the first music finishes playing, the user's interaction completion degree according to the determined final matching results.
In a third aspect, an embodiment of the present disclosure further provides an electronic device, including:
one or more processors; and
a memory for storing one or more programs;
where, when the one or more programs are executed by the one or more processors, the one or more processors implement the interaction method described in any embodiment of the present disclosure.
In a fourth aspect, an embodiment of the present disclosure provides a computer-readable storage medium on which a computer program is stored; when the computer program is executed by a processor, the interaction method described in any embodiment of the present disclosure is implemented.
In a fifth aspect, an embodiment of the present disclosure further provides a computer program product, including a computer program stored in a readable storage medium; one or more processors of an electronic device can read the computer program from the readable storage medium and execute it, so that the electronic device performs the interaction method described in any embodiment of the present disclosure.
In a sixth aspect, an embodiment of the present disclosure further provides a computer program stored in a readable storage medium; one or more processors of a device can read the computer program from the readable storage medium and execute it, so that the electronic device performs the interaction method described in any embodiment of the present disclosure.
With the interaction method, apparatus, device, and readable medium provided by the embodiments of the present disclosure, at least one of a plurality of action indication icons displayed at predetermined positions in a user interface is highlighted according to a target music beat in the played music, so that the user performs the corresponding physical action according to the highlighted icon, thereby realizing interaction between the user, the played music, and the user interface. At each highlighting, it is determined whether the physical action performed by the user matches the icon features of the highlighted action indication icon, and after the music finishes playing, the user's interaction completion degree is determined according to the determined final matching results. With this technical solution, the user needs no additional auxiliary equipment when interacting with the terminal electronic device, avoiding device complexity during the interaction. At the same time, by displaying different action indication icons at predetermined positions in the user interface and highlighting at least one of them according to the music beat, the diversity and interest of the interactive content are improved, as are the coherence and aesthetics of the user's actions.
Brief Description of the Drawings
The above and other features, advantages, and aspects of the embodiments of the present disclosure will become more apparent with reference to the following detailed description taken in conjunction with the accompanying drawings. Throughout the drawings, the same or similar reference numerals denote the same or similar elements. It should be understood that the drawings are schematic and that components and elements are not necessarily drawn to scale.
FIG. 1 shows a flowchart of an interaction method provided by an embodiment of the present disclosure;
FIG. 2 shows a schematic diagram of an interface in the method provided by an embodiment of the present disclosure, in which a plurality of action indication icons are displayed at predetermined positions and highlighted;
FIG. 3 shows a flowchart of another interaction method provided by an embodiment of the present disclosure;
FIG. 4 shows a schematic structural diagram of an interactive apparatus provided by an embodiment of the present disclosure;
FIG. 5 shows a schematic structural diagram of an electronic device provided by an embodiment of the present disclosure.
具体实施方式
下面将参照附图更详细地描述本公开的实施例。虽然附图中显示了本公开的某些实施例,然而应当理解的是,本公开可以通过各种形式来实现,而且不应该被解释为限于这里阐述的实施例,相反提供这些实施例是为了更加透彻和完整地理解本公开。应当理解的是,本公开的附图及实施例仅用于示例性作用,并非用于限制本公开的保护范围。
应当理解,本公开的方法实施方式中记载的各个步骤可以按照不同的顺序执行,和/或并行执行。此外,方法实施方式可以包括附加的步骤和/或省略执行示出的步骤。本公开的范围在此方面不受限制。
本文使用的术语“包括”及其变形是开放性包括,即“包括但不限于”。术语“基于”是“至少部分地基于”。术语“一个实施例”表示“至少一个实施例”;术语“另一实施例”表示“至少一个另外的实施例”;术语“一些实施例”表示“至少一些实施例”。其他术语的相关定义将在下文描述中给出。
需要注意,本公开中提及的“第一”、“第二”等概念仅用于对不同的装置、模块或单元进行区分,并非用于限定这些装置、模块或单元所执行的功能的顺序或者相互依存关系。
需要注意,本公开中提及的“一个”、“多个”的修饰是示意性而非限制性的,本领域技术人员应当理解,除非在上下文另有明确指出,否则应该理解为“一个或多个”。
本公开实施方式中的多方之间所交互的消息或者信息的名称仅用于说明性的目的,而并不是用于对这些消息或信息的范围进行限制。
图1示出了本公开实施例提供的一种互动方法的流程图,本公开实施例可适用于任一种应用程序中。本公开实施例提供的一种互动方法可以由本公开实施例提供的互动装置来执行,该装置可以通过软件和/或硬件的方式来实现,并集成在执行本方法的电子设备中,在本实施例中执行本方法的电子设备可以包括移动终端,例如智能手机、掌上电脑、平板电脑、带显示屏的可穿戴设备等等,还可以包括计算机设备,如台式机、笔记本电脑、一体机等。
具体的,如图1所示,本公开实施例中提供的互动方法可以包括如下步骤:
S110,播放第一音乐。
具体的,针对现有的应用程序中互动形式较为单一、无法使用户与所播放的音乐和画面进行互动,或者还需要额外设置相应的辅助设备才能进行互动的问题,通过采用本公开的技术方案,可以实现用户与播放的音乐和用户界面之间的互动,还可以使得用户在进行互动时录制相应的互动视频,提高了用户的交互体验。
其中,为了提高互动音乐的多样性,本实施例中所播放的第一音乐可以包括预先配置好的固定音乐,也可以包括预先与某一音乐应用建立连接后,用户通过访问所连接的音乐应用而选定的任一音乐。
S120,根据第一音乐中的目标音乐节拍,高亮显示用户界面中按预定位置展示的多个动作指示图标中的至少一个动作指示图标。
由于任一种音乐都是由固定组合规律下的重拍和弱拍共同组成的,而且每一音乐节拍可能存在不同的播放音长,因此,为了实现用户与所播放的第一音乐之间的互动,可以针对第一音乐播放时的重拍、弱拍以及播放音长等音乐节拍特点,面向用户分别高亮显示用户界面中按预定位置展示的多个动作指示图标中的至少一个动作指示图标,以使用户在不同的音乐节拍下执行不同的肢体动作,进而实现用户与所播放的第一音乐之间以及用户界面的互动。
具体的,第一音乐可以包括用户从预先配置的固定音乐集合或者从预先建立连接的音乐应用中选定的当前互动执行过程中需要播放的音乐。在一个实施例中,可以在播放第一音乐之前,预先进行第一音乐节拍的检测处理,确定第一音乐中的目标音乐节拍,使得该第一音乐中会预先匹配有准确的目标音乐节拍,从而在用户选定第一音乐后,电子设备能够根据第一音乐与目标音乐节拍之间的预先匹配情况,确定出该第一音乐中存在的全部目标音乐节拍,从而在后续播放过程中,确定当前是否播放到目标音乐节拍。在另一个实施例中,电子设备还可以在第一音乐的播放过程中,实时检测第一音乐的当前播放节拍,并判断当前播放节拍是否为目标音乐节拍。
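示例性的,上述按能量区分目标音乐节拍(如重拍)的预检测思路,可以用如下Python代码给出一个极简示意(仅为帮助理解的假设性草图,并非本公开限定的实现;其中假设各节拍的能量值已由节拍检测得到,函数名与阈值参数均为说明用途而设):

```python
def detect_target_beats(beat_energies, ratio=0.8):
    """从各节拍的能量值中筛选目标音乐节拍(此处以重拍为例,假设性示意)。

    beat_energies: 按节拍顺序排列的能量值列表(假设已由节拍检测得到)。
    返回被判定为目标音乐节拍(重拍)的节拍下标列表。
    """
    if not beat_energies:
        return []
    # 以最大能量的一定比例作为重拍阈值,高于阈值的节拍视为重拍
    threshold = ratio * max(beat_energies)
    return [i for i, e in enumerate(beat_energies) if e >= threshold]

# 示例:能量较高的第 0、2 拍被判定为重拍
print(detect_target_beats([1.0, 0.3, 0.9, 0.2]))  # [0, 2]
```

据此,电子设备在播放过程中只需比对当前播放进度是否落在预先求得的重拍下标集合内,即可决定是否触发高亮显示。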
需要说明的是,由于音乐重拍相比音乐弱拍更加能够带动用户的娱乐性,因此本实施例中可以设定目标音乐节拍为音乐重拍,根据第一音乐中的音乐重拍,高亮显示用户界面中按预定位置展示的多个动作指示图标中的至少一个动作指示图标,使得用户可以在音乐重拍下执行相应的肢体动作,而在音乐弱拍下不必执行任何肢体动作,从而增加了用户动作的连贯性和美感。
在一个实施例中,为了确保用户与第一音乐之间的准确互动,本实施例可以根据用户对音乐互动需求,在不同的音乐节拍下,面向用户分别在用户界面中按照预定位置高亮显示与各音乐节拍适配的动作指示图标,并展示给用户,以指示用户在不同音乐节拍下执行符合对应动作指示图标指示的肢体动作,从而使用户与所播放的第一音乐进行互动。
本实施例中,在确定出第一音乐中的各个目标音乐节拍(如音乐重拍)后,电子设备根据第一音乐的播放进度判断当前是否播放到目标音乐节拍,在每次检测到当前播放到第一音乐的某个目标音乐节拍时,则会在该用户界面中按预定位置展示的多个动作指示图标中,根据第一音乐的音乐属性以及用户界面中按预定位置展示的多个动作指示图标的位置关联性选择至少一个动作指示图标,并高亮显示所选择出的至少一个动作指示图标,来指示用户在当前目标音乐节拍下所需要执行的肢体动作,使得用户按照该高亮显示的动作指示图标所指示的肢体动作执行对应的肢体动作,从而实现用户与音乐之间的互动。此外,电子设备通过高亮显示用户界面内按预定位置展示的至少一个动作指示图标,还可以提升互动内容的多样性和趣味性。在一个实施例中,当目标音乐节拍为音乐重拍时,则在第一音乐播放到音乐弱拍时,电子设备不高亮显示任何动作指示图标;在第一音乐播放到音乐重拍时,电子设备高亮显示至少一个动作指示图标,从而实现用户与第一音乐之间的互动操作。
示例性的,本实施例可以在用户界面的不同预定位置上分别设置代表不同肢体动作的动作指示图标,如图2所示,该动作指示图标可以是手势图标,在用户界面内的多个固定位置上分别设定对应的手势图标,当检测到第一音乐的当前播放节拍为音乐重拍时,电子设备可以从用户界面中按预定位置展示的各个手势图标中高亮显示至少一个手势图标,使得用户通过移动对应手掌并与高亮显示的手势图标进行匹配,从而实现用户与所播放的音乐之间的互动。
同时,为了确保用户按照高亮显示的动作指示图标执行相应的肢体动作具有美感,本实施例中高亮显示用户界面中按预定位置展示的多个动作指示图标中的至少一个动作指示图标,可以包括:按照第一音乐的音乐属性以及用户界面中按预定位置展示的多个动作指示图标的位置关联性,高亮显示至少一个动作指示图标。
具体的,本公开中的该音乐属性可以包括第一音乐的音乐类型、音乐节奏、音乐音量等音乐特点,通过检测第一音乐的音乐属性并进一步根据用户界面中按预定位置展示的各个动作指示图标之间的位置关联性,可以从用户界面中按预定位置展示的多个动作指示图标中,选择至少一个动作指示图标进行高亮显示。用户根据该高亮显示的动作指示图标执行相应的动作,既可以符合第一音乐的音乐属性,又具有动作的美感。例如,当第一音乐节奏缓慢时,用户每次的动作变化幅度可以很小,此时,电子设备每次高亮显示的动作指示图标之间的位置变化也相应地很小,例如,依次高亮显示位置上相邻的动作指示图标。例如,图2中,在音乐重拍1下,电子设备高亮显示最下方中间位置处的图标A,在音乐重拍2下,电子设备高亮显示与图标A相邻的左下方的图标B。此外,在一个实施例中,为了保证用户执行对应肢体动作的成功性,在用户界面中按预定位置展示的多个动作指示图标中,如果每一动作指示图标均要求通过用户手势来互动,也就是用户界面中按预定位置展示的多个动作指示图标均为手势图标,那么可以选择至多两个动作指示图标进行高亮显示,也就是选择一个或者两个动作指示图标进行高亮显示,使得用户成功执行对应的肢体动作。以上仅是示例性说明,本公开不对高亮显示的动作指示图标的数目进行限制,可以根据不同的音乐和动作设置相应的高亮显示的动作指示图标。
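示例性的,按照音乐属性与图标位置关联性选择待高亮图标的过程,可以用如下Python代码示意(假设性草图,并非本公开限定的实现:其中以相邻关系字典近似表示图标的位置关联性,节奏缓慢时仅在与上一次高亮的图标相邻的图标中选择,且手势图标至多同时高亮两个;相关名称均为说明用途的假设):

```python
import random

def choose_icons_to_highlight(current, adjacency, slow_tempo, rng=None):
    """根据节奏快慢从按预定位置展示的图标中选择待高亮的图标(示意)。

    current: 上一次高亮的图标标识;adjacency: 图标位置相邻关系字典。
    节奏缓慢时只在相邻图标中选择,使相邻两次高亮的位置变化较小。
    """
    rng = rng or random.Random(0)
    # 节奏缓慢时限定候选为相邻图标;否则可从全部图标中选择
    candidates = adjacency[current] if slow_tempo else list(adjacency.keys())
    # 手势图标至多同时高亮两个,保证用户能够成功完成对应动作
    k = min(2, len(candidates))
    return rng.sample(candidates, k)

adjacency = {"A": ["B", "C"], "B": ["A"], "C": ["A"]}
print(choose_icons_to_highlight("A", adjacency, slow_tempo=True))
```

该草图仅体现"位置变化小、数目受限"这一选择约束,实际实现中还可结合音乐类型、音量等属性加权选择。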
S130,获取用户执行的肢体动作。
具体的,在每次高亮显示用户界面中按预定位置展示的多个动作指示图标中的至少一个动作指示图标之后,用户会按照高亮显示的动作指示图标执行对应的肢体动作,以与所播放的第一音乐进行互动。此时在每次高亮显示至少一个动作指示图标之后,可以获取用户执行的肢体动作,以确定用户执行的肢体动作是否与高亮显示的动作指示图标的图标特征匹配。
S140,确定用户执行的肢体动作是否与高亮显示的动作指示图标的图标特征匹配。
具体的,在第一音乐的播放过程中会在不同播放时刻根据确定的目标音乐节拍,从用户界面中按预定位置展示的多个动作指示图标中,高亮显示至少一个动作指示图标。用户可以按照高亮显示的动作指示图标执行对应的肢体动作。为了确定用户与第一音乐之间的互动完成度,电子设备需要在每次高亮显示至少一个动作指示图标后,确定用户所执行的肢体动作与本次高亮显示的动作指示图标的图标特征之间的匹配度。
该图标特征可以包括高亮显示的动作指示图标预先指定的识别范围和该高亮显示的动作指示图标所指示的具体动作中的至少一个。因此在每次高亮显示用户界面中按预定位置展示的多个动作指示图标中的至少一个动作指示图标后,电子设备均需要确定用户在本次高亮显示动作指示图标时所执行的肢体动作是否与高亮显示的动作指示图标的图标特征相匹配,进而通过每次高亮显示时的匹配结果确定出音乐播放结束时的最终匹配结果。当图标特征包括高亮显示的动作指示图标预先指定的识别范围时,电子设备通过确定用户执行的肢体动作是否位于高亮显示的动作指示图标的预展示区域内,确定用户执行的肢体动作是否与高亮显示的动作指示图标的图标特征相匹配。当图标特征包括该高亮显示的动作指示图标所指示的具体动作时,通过将用户执行的肢体动作与高亮显示的动作指示图标中所指示的动作进行匹配,以确定用户执行的肢体动作是否与高亮显示的动作指示图标的图标特征匹配。
示例性的,本实施例可以在每次高亮显示至少一个动作指示图标时,通过判断用户执行的肢体动作与本次的高亮显示的动作指示图标是否成功匹配,来分析用户按照高亮显示的动作指示图标所执行的肢体动作的成功率,即用户的交互完成度。
S150,在第一音乐结束播放后,根据确定的最终匹配结果,确定用户的互动完成度。
可选的,本实施例中第一音乐结束播放可以包括如下两种情况:1)第一音乐播放完毕后自动结束播放;2)第一音乐在播放过程中根据用户发出的退出操作被动结束播放。当第一音乐结束播放后,电子设备可以确定用户互动的互动完成度。电子设备可以根据在不同播放时刻的目标音乐节拍下每次高亮显示至少一个动作指示图标时,用户执行的肢体动作与高亮显示的动作指示图标的图标特征之间的匹配结果,确定用户在本次互动过程中的最终匹配结果。电子设备可以通过在第一音乐的播放过程中高亮显示的动作指示图标的出现次数N、用户执行的肢体动作与高亮显示的动作指示图标的图标特征之间成功匹配的次数P,确定用户在本次互动过程中所执行的肢体动作与高亮显示的动作指示图标的图标特征之间的最终匹配结果(例如,P/N),该最终匹配结果能够准确表示用户完成本次互动的成功率,进而按照确定的最终匹配结果,确定用户本次的互动完成度。
需要说明的是,本公开中可以通过互动得分或者互动比例等不同形式来展示用户的互动完成度,本公开对此不作限定。
示例性的,本实施例中确定用户的互动完成度时,可以将每次高亮显示动作指示图标后的各个匹配结果求和并进行平均处理,并将平均匹配结果作为用户的互动完成度。
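示例性的,上述对各次匹配结果求和并平均、即成功匹配次数P与高亮显示次数N之比(P/N)的计算,可以用如下Python代码示意(假设性草图,匹配结果以True/False表示每次高亮显示后的匹配是否成功):

```python
def interaction_completion(match_results):
    """根据每次高亮显示后的匹配结果计算互动完成度(成功次数P / 高亮次数N)。

    match_results: 每次高亮显示至少一个动作指示图标后的匹配结果列表。
    布尔值求和即成功匹配次数P,列表长度即高亮显示次数N。
    """
    if not match_results:
        return 0.0
    return sum(match_results) / len(match_results)

# 4 次高亮显示中成功匹配 3 次,互动完成度为 0.75
print(interaction_completion([True, True, False, True]))  # 0.75
```

该完成度既可以直接以比例形式展示,也可以按一定规则换算为互动得分。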
本公开实施例提供的技术方案,根据播放的音乐中的目标音乐节拍,高亮显示用户界面中按预定位置展示的多个动作指示图标中的至少一个动作指示图标,以使用户按照高亮显示的动作指示图标执行对应的肢体动作,从而实现用户与播放的音乐和用户界面之间的互动,同时在每次高亮显示时确定用户执行的肢体动作是否与高亮显示的动作指示图标的图标特征匹配,并在音乐结束播放后,根据确定的最终匹配结果,确定用户的互动完成度。通过采用该技术方案,用户在和终端电子设备交互时无需额外设置相应的辅助设备,避免互动时的设备复杂性,同时通过在用户界面内按预定位置展示不同的动作指示图标并根据音乐节拍高亮显示至少一个动作指示图标,可以提升互动内容的多样性和趣味性,也提高了用户动作的连贯性和美观度。
在上述实施例提供的技术方案的基础上,对于本公开实施例提供的互动方法中确定用户执行的肢体动作是否与高亮显示的动作指示图标的图标特征匹配的判断过程进行进一步说明。当图标特征包括高亮显示的动作指示图标预先指定的识别范围时,上述互动方法的在高亮显示用户界面中按预定位置展示的多个动作指示图标中的至少一个动作指示图标之后,确定用户执行的肢体动作是否与高亮显示的动作指示图标的图标特征匹配,可以包括:在预设高亮显示时段内检测用户的特定肢体是否位于高亮显示的动作指示图标的预展示区域内;根据检测结果确定用户执行的肢体动作是否与高亮显示的动作指示图标所指示的图标特征匹配。
在检测到第一音乐当前播放到目标音乐节拍后,电子设备高亮显示用户界面中按预定位置展示的多个动作指示图标中的至少一个动作指示图标后,用户会按照高亮显示的动作指示图标来执行对应的肢体动作。由于每一动作指示图标在用户界面内均会设置有对应的预展示区域,在用户按照高亮显示的动作指示图标执行对应的肢体动作时,电子设备可以通过用户界面上的提示信息提示用户将动作指示图标对应的肢体移动到高亮显示的动作指示图标所在的预展示区域内,同时为了确保用户按照高亮显示的动作指示图标执行对应肢体动作的及时性,本实施例还会预先设定对应的预设高亮显示时段,提示用户在预设高亮显示时段内,完成该高亮显示的动作指示图标对应的肢体动作的执行。如果电子设备在预设高亮显示时段内未检测到用户的特定肢体位于该高亮显示的动作指示图标的预展示区域内,则说明用户执行的肢体动作与高亮显示的动作指示图标所指示的图标特征之间未成功匹配。因此,本实施例可以通过在预设高亮显示时段内检测用户的特定肢体是否位于高亮显示的动作指示图标的预展示区域内的检测结果,来分析用户执行的肢体动作是否与高亮显示的动作指示图标所指示的图标特征匹配。如果在预设高亮显示时段内能够检测到用户的特定肢体位于高亮显示的动作指示图标的预展示区域内,说明用户在预设高亮显示时段内已经将特定肢体移动到该高亮显示的动作指示图标的预展示区域内,则确定用户执行的肢体动作与该高亮显示的动作指示图标成功匹配;而如果在预设高亮显示时段内未检测到用户的特定肢体位于高亮显示的动作指示图标的预展示区域内,说明用户在预设高亮显示时段内未能够将特定肢体移动到该高亮显示的动作指示图标的预展示区域内,因此确定用户执行的肢体动作与该高亮显示的动作指示图标未成功匹配。
示例性的,本实施例中在预设高亮显示时段内检测用户的特定肢体是否位于高亮显示的动作指示图标的预展示区域内,可以包括:在预设高亮显示时段内实时采集用户的肢体动作图像;根据肢体动作图像判断用户的特定肢体是否位于高亮显示的动作指示图标的预展示区域内。
具体的,在每次高亮显示用户界面中按预定位置展示的多个动作指示图标中的至少一个动作指示图标后的预设高亮显示时段内,电子设备实时采集用户的肢体动作图像,此时每一肢体动作图像中均记录了用户移动特定肢体时所处的不同移动位置,通过识别预设高亮显示时段内所采集的每一肢体动作图像中表示该特定肢体的特征点,以及表示高亮显示的动作指示图标的预展示区域的区域边界特征点,通过分析特定肢体的特征点与区域边界特征点之间的位置关系,可以判断用户的特定肢体是否移动到高亮显示的动作指示图标的预展示区域内。
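示例性的,判断特定肢体的特征点是否位于高亮显示的动作指示图标的预展示区域内的过程,可以用如下Python代码示意(假设性草图,并非本公开限定的实现:将预展示区域简化为矩形边界,特征点坐标假设已从肢体动作图像中识别得到):

```python
def limb_in_region(limb_point, region):
    """判断肢体特征点是否位于高亮图标的预展示区域内(以矩形区域示意)。

    limb_point: 从肢体动作图像中识别出的特定肢体特征点坐标 (x, y)。
    region: 预展示区域边界 (左, 上, 右, 下)。
    """
    x, y = limb_point
    left, top, right, bottom = region
    return left <= x <= right and top <= y <= bottom

def matched_in_period(limb_points, region):
    """预设高亮显示时段内实时采集的多帧特征点,任一帧进入区域即判定匹配成功。"""
    return any(limb_in_region(p, region) for p in limb_points)

print(matched_in_period([(10, 10), (55, 42)], (50, 40, 80, 60)))  # True
```

实际实现中,预展示区域也可以是圆形或任意多边形,此时只需替换相应的点与区域的包含判断即可。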
当图标特征包括高亮显示的动作指示图标预先指定的具体动作时,在检测到第一音乐当前播放到目标音乐节拍、电子设备高亮显示用户界面中按预定位置展示的多个动作指示图标中的至少一个动作指示图标后,用户会按照高亮显示的动作指示图标所指示的具体动作来执行对应的肢体动作,该具体动作可以例如为左手或者右手作出的“ok”或“剪刀手”等手势动作,用户按照高亮显示的至少一个动作指示图标执行对应的具体动作,进而在每次高亮显示至少一个动作指示图标,并获取用户执行的肢体动作后,在预设高亮显示时段内确定用户执行的具体动作是否与高亮显示的动作指示图标所指示的具体动作一致,来判断用户执行的具体动作与高亮显示的动作指示图标之间的匹配度。
示例性的,在每次高亮显示用户界面中按预定位置展示的多个动作指示图标中的至少一个动作指示图标后的预设高亮显示时段内,电子设备实时采集用户的肢体动作图像,该肢体动作图像中记录了高亮显示的动作指示图标所指示的具体动作以及用户执行的肢体动作,识别预设高亮显示时段内所采集的每一肢体动作图像中表示高亮显示的动作指示图标所指示的具体动作的关键特征点,以及用户执行的肢体动作的关键特征点,然后通过分析关键特征点之间的匹配关系分别确定高亮显示的动作指示图标的动作姿态和用户执行的肢体动作的动作姿态,进而判断两个动作姿态是否一致,可以确定用户执行的肢体动作与高亮显示的动作指示图标所指示的具体动作之间的匹配度。
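示例性的,通过关键特征点比较两个动作姿态是否一致的过程,可以用如下Python代码示意(假设性草图,并非本公开限定的实现:将关键特征点坐标展平为向量后以余弦相似度衡量姿态一致性,阈值取值仅为示意):

```python
import math

def pose_similarity(pose_a, pose_b):
    """以余弦相似度衡量两个动作姿态的一致程度(示意)。

    pose_a / pose_b: 关键特征点坐标展平后的向量,假设两者点数与顺序一致。
    """
    dot = sum(a * b for a, b in zip(pose_a, pose_b))
    na = math.sqrt(sum(a * a for a in pose_a))
    nb = math.sqrt(sum(b * b for b in pose_b))
    return dot / (na * nb) if na and nb else 0.0

def gesture_matches(user_pose, icon_pose, threshold=0.95):
    """相似度超过阈值时,判定用户动作与图标指示的具体动作一致。"""
    return pose_similarity(user_pose, icon_pose) >= threshold

print(gesture_matches([1.0, 2.0, 3.0], [1.0, 2.0, 3.0]))  # True
```

实际实现中,关键特征点通常需要先做平移与尺度归一化,以消除用户在画面中位置和远近的影响。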
图3示出了本公开实施例提供的另一种互动方法的流程图,本实施例在上述实施例提供的各个可选方案的基础上进行优化。具体的,本实施中主要对于还存在的其他互动过程进行详细的介绍。
可选的,如图3所示,本实施例中的方法可以包括如下步骤:
S310,识别用户界面内的用户的特定肢体,并根据识别结果展示初始界面,该初始界面包括按预定位置展示的多个动作指示图标。
为了确保互动的有效执行,在检测到用户启动用于执行本次互动的应用程序后,电子设备可以检测在用户界面中按预定位置展示的多个动作指示图标所指定的特定肢体是否已被置于该用户界面内,当电子设备在用户界面内识别到用户的特定肢体时,向用户展示出预先设定的初始界面,该初始界面可以包括按预定位置展示的多个动作指示图标,同时电子设备开始播放第一音乐,以使用户参与本次音乐互动。
S320,如果在用户界面内未识别到用户的特定肢体,则在用户界面中向用户展示相应的提示信息。
可选的,如果在用户界面内未识别到用户的特定肢体,说明用户当前还未准备好执行游戏,电子设备可以在用户界面中向用户展示相应的提示信息,以提示用户将特定肢体放置于该用户界面内,便于执行该交互。
S330,如果在用户界面内识别到用户的特定肢体,则播放第一音乐。
S340,根据第一音乐中的目标音乐节拍,高亮显示用户界面中按预定位置展示的多个动作指示图标中的至少一个动作指示图标。
S350,在预设高亮显示时段内,根据第一音乐的播放节奏控制高亮显示的动作指示图标的闪烁频率。
可选的,在第一音乐每次播放到某个目标音乐节拍并高亮显示用户界面中按预定位置展示的多个动作指示图标中的至少一个动作指示图标后,为了提升用户界面中按预定位置展示的多个动作指示图标的高亮显示多样性,本公开还可以在高亮显示的动作指示图标的预设高亮显示时段内,实时检测第一音乐的播放节奏,并按照该播放节奏为高亮显示的动作指示图标设置对应的闪烁频率,进而控制该高亮显示的动作指示图标按照对应的闪烁频率进行闪烁。根据音乐节奏控制高亮显示的动作指示图标的闪烁频率,可以提高用户与第一音乐之间的互动趣味性。
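示例性的,根据第一音乐的播放节奏为高亮显示的图标设置闪烁频率的映射关系,可以用如下Python代码示意(假设性草图:以BPM表示播放节奏,beats_per_blink为示意参数,表示每次闪烁对应的节拍数):

```python
def blink_frequency(bpm, beats_per_blink=1.0):
    """按照第一音乐的播放节奏(BPM)为高亮图标设置闪烁频率(单位Hz,示意)。

    节奏越快,闪烁频率越高;beats_per_blink 为每次闪烁对应的节拍数(假设参数)。
    """
    return bpm / 60.0 / beats_per_blink

# 120 BPM 的音乐对应每秒闪烁 2 次
print(blink_frequency(120))  # 2.0
```

据此,渲染层只需按该频率周期性切换高亮图标的显示状态,即可实现闪烁与音乐节奏的同步。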
S360,获取用户执行的肢体动作。
S370,确定用户执行的肢体动作是否与高亮显示的动作指示图标的图标特征匹配。
S380,在第一音乐结束播放后,根据确定的最终匹配结果,确定用户的互动完成度。
上述S330-S340,S360-S380的执行过程与本公开中S110-S150的执行过程一致,具体的执行方案在S110-S150中已经描述过,在此不再进行具体说明。
本公开实施例提供的技术方案,根据播放的音乐中的目标音乐节拍,高亮显示用户界面中按预定位置展示的多个动作指示图标中的至少一个动作指示图标,以使用户按照高亮显示的动作指示图标执行对应的肢体动作,从而实现用户与播放的音乐和用户界面之间的互动,同时在每次高亮显示时确定用户执行的肢体动作是否与高亮显示的动作指示图标的图标特征匹配,并在音乐结束播放后,根据确定的最终匹配结果,确定用户的互动完成度。通过采用该技术方案,用户在和终端电子设备交互时无需额外设置相应的辅助设备,避免互动时的设备复杂性,同时通过在用户界面内按预定位置展示不同的动作指示图标并根据音乐节拍高亮显示至少一个动作指示图标,可以提升互动内容的多样性和趣味性,也提高了用户动作的连贯性和美观度。
图4示出了本公开实施例提供的一种互动装置的结构示意图,本公开实施例可适用于任一种应用程序中,该装置可以通过软件和/或硬件来实现,并集成在执行本方法的电子设备中。如图4所示,本公开实施例中的互动装置,具体可以包括:
音乐播放模块410,用于播放第一音乐;
图标高亮显示模块420,用于根据第一音乐中的目标音乐节拍,高亮显示用户界面中按预定位置展示的多个动作指示图标中的至少一个动作指示图标;
肢体动作获取模块430,用于获取用户执行的肢体动作;
肢体匹配判断模块440,用于确定所述用户执行的肢体动作是否与所述高亮显示的动作指示图标的图标特征匹配;
互动模块450,用于在所述第一音乐结束播放后,根据确定的最终匹配结果,确定所述用户的互动完成度。
本公开实施例提供的技术方案,根据播放的音乐中的目标音乐节拍,高亮显示用户界面中按预定位置展示的多个动作指示图标中的至少一个动作指示图标,以使用户按照高亮显示的动作指示图标执行对应的肢体动作,从而实现用户与播放的音乐和用户界面之间的互动,同时在每次高亮显示时确定用户执行的肢体动作是否与高亮显示的动作指示图标的图标特征匹配,并在音乐结束播放后,根据确定的最终匹配结果,确定用户的互动完成度。通过采用该技术方案,用户在和终端电子设备交互时无需额外设置相应的辅助设备,避免互动时的设备复杂性,同时通过在用户界面内按预定位置展示不同的动作指示图标并根据音乐节拍高亮显示至少一个动作指示图标,可以提升互动内容的多样性和趣味性,也提高了用户动作的连贯性和美观度。
进一步的,上述图标高亮显示模块420,可以具体用于:
按照所述第一音乐的音乐属性以及所述用户界面中按预定位置展示的多个动作指示图标的位置关联性,高亮显示所述至少一个动作指示图标。
进一步的,上述肢体匹配判断模块440,可以具体用于:
在预设高亮显示时段内检测所述用户的特定肢体是否位于所述高亮显示的动作指示图标的预展示区域内;
根据所述检测结果确定所述用户执行的肢体动作是否与所述高亮显示的动作指示图标所指示的图标特征匹配。
进一步的,上述肢体匹配判断模块440,还可以具体用于:
在预设高亮显示时段内实时采集所述用户的肢体动作图像;
根据所述肢体动作图像判断所述用户的特定肢体是否位于所述高亮显示的动作指示图标的预展示区域内。
进一步的,上述肢体匹配判断模块440,还可以具体用于:
将所述用户执行的肢体动作与所述高亮显示的动作指示图标中标识的动作进行匹配,以确定所述用户执行的肢体动作是否与所述高亮显示的动作指示图标的图标特征匹配。
进一步的,上述互动装置,还可以包括:
图标闪烁模块,用于在预设高亮显示时段内,根据所述第一音乐的播放节奏控制所述高亮显示的动作指示图标的闪烁频率。
进一步的,上述互动装置,还可以包括:
肢体识别模块,用于识别所述用户界面内的所述用户的特定肢体,并根据识别结果展示初始界面,所述初始界面包括所述按预定位置展示的多个动作指示图标。
进一步的,上述互动装置,还可以包括:
肢体识别提示模块,用于如果在所述用户界面内未识别到所述用户的特定肢体,则在所述用户界面中向所述用户展示相应的提示信息。
进一步的,上述互动模块450,可以具体用于:
将每次高亮显示所述至少一个动作指示图标后的全部匹配结果进行平均处理,得到所述用户的互动完成度。
本公开实施例提供的互动装置,与上述实施例提供的互动方法属于同一发明构思,未在本公开实施例中详尽描述的技术细节可参见上述实施例,并且本公开实施例与上述实施例具有相同的有益效果。
下面参考图5,其示出了适于用来实现本公开实施例的电子设备500的结构示意图。本公开实施例中的电子设备可以包括但不限于诸如移动电话、笔记本电脑、数字广播接收器、PDA(个人数字助理)、PAD(平板电脑)、PMP(便携式多媒体播放器)、车载终端(例如车载导航终端)、可穿戴电子设备等等的移动终端以及诸如数字TV、台式计算机等等的固定终端。图5示出的电子设备仅仅是一个示例,不应对本公开实施例的功能和使用范围带来任何限制。
如图5所示,电子设备500可以包括处理装置(例如中央处理器、图形处理器等)501,其可以根据存储在只读存储器(Read only memory,ROM)502中的程序或者从存储装置508加载到随机访问存储器(Random access memory,RAM)503中的程序而执行各种适当的动作和处理。在RAM 503中,还存储有电子设备500操作所需的各种程序和数据。处理装置501、ROM 502以及RAM 503通过总线504彼此相连。输入/输出(Input/output,I/O)接口505也连接至总线504。
通常,以下装置可以连接至I/O接口505:包括例如触摸屏、触摸板、键盘、鼠标、摄像头、麦克风、加速度计、陀螺仪等的输入装置506;包括例如液晶显示器(Liquid crystal display,LCD)、扬声器、振动器等的输出装置507;包括例如磁带、硬盘等的存储装置508;以及通信装置509。通信装置509可以允许电子设备500与其他设备进行无线或有线通信以交换数据。虽然图5示出了具有各种装置的电子设备500,但是应理解的是,并不要求实施或具备所有示出的装置。可以替代地实施或具备更多或更少的装置。
特别地,根据本公开的实施例,上文参考流程图描述的过程可以被实现为计算机软件程序。例如,本公开的实施例包括一种计算机程序产品,其包括承载在非暂态计算机可读介质上的计算机程序,该计算机程序包含用于执行流程图所示的方法的程序代码。在这样的实施例中,该计算机程序可以通过通信装置509从网络上被下载和安装,或者从存储装置508被安装,或者从ROM 502被安装。在该计算机程序被处理装置501执行时,执行本公开实施例的方法中的步骤,以实现其限定的上述功能。
需要说明的是,本公开上述的计算机可读介质可以是计算机可读信号介质或者计算机可读存储介质或者是上述两者的任意组合。计算机可读存储介质例如可以是但不限于电、磁、光、电磁、红外线、或半导体的系统、装置或器件,或者任意以上的组合。计算机可读存储介质的更具体的例子可以包括但不限于:具有一个或多个导线的电连接、便携式计算机磁盘、硬盘、随机访问存储器(RAM)、只读存储器(ROM)、可擦式可编程只读存储器(Electrical programmable read only memory,EPROM或闪存)、光纤、便携式紧凑磁盘只读存储器(Compact disc read only memory,CD-ROM)、光存储器件、磁存储器件、或者上述的任意合适的组合。在本公开中,计算机可读存储介质可以是任何包含或存储程序的有形介质,该程序可以被指令执行系统、装置或者器件使用或者与其结合使用。而在本公开中,计算机可读信号介质可以包括在基带中或者作为载波一部分传播的数据信号,其中承载了计算机可读的程序代码。这种传播的数据信号可以采用多种形式,包括但不限于电磁信号、光信号或上述的任意合适的组合。计算机可读信号介质还可以是计算机可读存储介质以外的任何计算机可读介质,该计算机可读信号介质可以发送、传播或者传输用于由指令执行系统、装置或者器件使用或者与其结合使用的程序。计算机可读介质上包含的程序代码可以用任何适当的介质传输,包括但不限于:电线、光缆、射频(Radio Frequency,RF)等等,或者上述的任意合适的组合。
在一些实施方式中,电子设备可以利用诸如HTTP(HyperText Transfer Protocol,超文本传输协议)之类的任何当前已知或未来研发的网络协议进行通信,并且可以与任意形式或介质的数字数据通信(例如,通信网络)互连。通信网络的示例包括局域网(Local area network,“LAN”),广域网(Wide area network,“WAN”),网际网(例如,互联网)以及端对端网络(例如,ad hoc端对端网络),以及任何当前已知或未来研发的网络。
上述计算机可读介质可以是上述电子设备中所包含的;也可以是单独存在,而未装配入该电子设备中。
上述计算机可读介质承载有一个或者多个程序,当上述一个或者多个程序被该电子设备执行时,使得该电子设备:播放第一音乐;根据第一音乐中的目标音乐节拍,高亮显示用户界面中按预定位置展示的多个动作指示图标中的至少一个动作指示图标;获取用户执行的肢体动作;确定所述用户执行的肢体动作是否与所述高亮显示的动作指示图标的图标特征匹配;在所述第一音乐结束播放后,根据确定的最终匹配结果,确定所述用户的互动完成度。
可以以一种或多种程序设计语言或其组合来编写用于执行本公开的操作的计算机程序代码,上述程序设计语言包括但不限于面向对象的程序设计语言,诸如Java、Smalltalk、C++,还包括常规的过程式程序设计语言,诸如“C”语言或类似的程序设计语言。程序代码可以完全地在用户计算机上执行、部分地在用户计算机上执行、作为一个独立的软件包执行、部分在用户计算机上部分在远程计算机上执行、或者完全在远程计算机或服务器上执行。在涉及远程计算机的情形中,远程计算机可以通过任意种类的网络,包括局域网(LAN)或广域网(WAN),连接到用户计算机,或者,可以连接到外部计算机(例如利用因特网服务提供商来通过因特网连接)。
附图中的流程图和框图,图示了按照本公开各种实施例的系统、方法和计算机程序产品的可能实现的体系架构、功能和操作。在这点上,流程图或框图中的每个方框可以代表一个模块、程序段、或代码的一部分,该模块、程序段、或代码的一部分包含一个或多个用于实现规定的逻辑功能的可执行指令。也应当注意,在有些作为替换的实现中,方框中所标注的功能也可以以不同于附图中所标注的顺序发生。例如,两个接连地表示的方框实际上可以基本并行地执行,它们有时也可以按相反的顺序执行,这依所涉及的功能而定。也要注意的是,框图和/或流程图中的每个方框、以及框图和/或流程图中的方框的组合,可以用执行规定的功能或操作的专用的基于硬件的系统来实现,或者可以用专用硬件与计算机指令的组合来实现。
描述于本公开实施例中所涉及到的单元可以通过软件的方式实现,也可以通过硬件的方式来实现。其中,单元的名称在某种情况下并不构成对该单元本身的限定。
本文中以上描述的功能可以至少部分地由一个或多个硬件逻辑部件来执行。例如,非限制性地,可以使用的示范类型的硬件逻辑部件包括:现场可编程门阵列(Field programmable gate array,FPGA)、专用集成电路(Application specific integrated circuit,ASIC)、专用标准产品(Application specific standard parts,ASSP)、片上系统(System on chip,SOC)、复杂可编程逻辑设备(Complex programmable logic device,CPLD)等等。
在本公开的上下文中,机器可读介质可以是有形的介质,其可以包含或存储以供指令执行系统、装置或设备使用或与指令执行系统、装置或设备结合地使用的程序。机器可读介质可以是机器可读信号介质或机器可读储存介质。机器可读介质可以包括但不限于电子的、磁性的、光学的、电磁的、红外的、或半导体系统、装置或设备,或者上述内容的任何合适组合。机器可读存储介质的更具体示例会包括基于一个或多个线的电气连接、便携式计算机盘、硬盘、随机存取存储器(RAM)、只读存储器(ROM)、可擦除可编程只读存储器(EPROM或快闪存储器)、光纤、便捷式紧凑盘只读存储器(CD-ROM)、光学储存设备、磁储存设备、或上述内容的任何合适组合。
根据本公开的一个或多个实施例提供的一种互动方法,该方法包括:
播放第一音乐;
根据第一音乐中的目标音乐节拍,高亮显示用户界面中按预定位置展示的多个动作指示图标中的至少一个动作指示图标;
获取用户执行的肢体动作;
确定所述用户执行的肢体动作是否与所述高亮显示的动作指示图标的图标特征匹配;
在所述第一音乐结束播放后,根据确定的最终匹配结果,确定所述用户的互动完成度。
根据本公开的一个或多个实施例,上述方法中,所述高亮显示用户界面中按预定位置展示的多个动作指示图标中的至少一个动作指示图标,包括:
按照所述第一音乐的音乐属性以及所述用户界面中按预定位置展示的多个动作指示图标的位置关联性,高亮显示所述至少一个动作指示图标。
根据本公开的一个或多个实施例,上述方法中,所述确定所述用户执行的肢体动作是否与所述高亮显示的动作指示图标的图标特征匹配,包括:
在预设高亮显示时段内检测所述用户的特定肢体是否位于所述高亮显示的动作指示图标的预展示区域内;
根据所述检测结果确定所述用户执行的肢体动作是否与所述高亮显示的动作指示图标所指示的图标特征匹配。
根据本公开的一个或多个实施例,上述方法中,
所述在预设高亮显示时段内检测所述用户的特定肢体是否位于所述高亮显示的动作指示图标的预展示区域内,包括:
在预设高亮显示时段内实时采集所述用户的肢体动作图像;
根据所述肢体动作图像判断所述用户的特定肢体是否位于所述高亮显示的动作指示图标的预展示区域内。
根据本公开的一个或多个实施例,上述方法中,所述确定所述用户执行的肢体动作是否与所述高亮显示的动作指示图标的图标特征匹配,包括:
将所述用户执行的肢体动作与所述高亮显示的动作指示图标中标识的动作进行匹配,以确定所述用户执行的肢体动作是否与所述高亮显示的动作指示图标的图标特征匹配。
根据本公开的一个或多个实施例,上述方法中,还包括:
在预设高亮显示时段内,根据所述第一音乐的播放节奏控制所述高亮显示的动作指示图标的闪烁频率。
根据本公开的一个或多个实施例,上述方法中,还包括:
识别所述用户界面内的所述用户的特定肢体,并根据识别结果展示初始界面,所述初始界面包括所述按预定位置展示的多个动作指示图标。
根据本公开的一个或多个实施例,上述方法中,还包括:
如果在所述用户界面内未识别到所述用户的特定肢体,则在所述用户界面中向所述用户展示相应的提示信息。
根据本公开的一个或多个实施例,上述方法中,所述根据确定的最终匹配结果,确定所述用户的互动完成度,包括:
将每次高亮显示所述至少一个动作指示图标后的全部匹配结果进行平均处理,得到所述用户的互动完成度。
根据本公开的一个或多个实施例提供的一种互动装置,该装置包括:
音乐播放模块,用于播放第一音乐;
图标高亮显示模块,用于根据第一音乐中的目标音乐节拍,高亮显示用户界面中按预定位置展示的多个动作指示图标中的至少一个动作指示图标;
肢体动作获取模块,用于获取用户执行的肢体动作;
肢体匹配判断模块,用于确定所述用户执行的肢体动作是否与所述高亮显示的动作指示图标的图标特征匹配;
互动模块,用于在所述第一音乐结束播放后,根据确定的最终匹配结果,确定所述用户的互动完成度。
根据本公开的一个或多个实施例,上述装置中,所述图标高亮显示模块,具体用于:
按照所述第一音乐的音乐属性以及所述用户界面中按预定位置展示的多个动作指示图标的位置关联性,高亮显示所述至少一个动作指示图标。
根据本公开的一个或多个实施例,所述肢体匹配判断模块,具体用于:
在预设高亮显示时段内检测所述用户的特定肢体是否位于所述高亮显示的动作指示图标的预展示区域内;
根据所述检测结果确定所述用户执行的肢体动作是否与所述高亮显示的动作指示图标所指示的图标特征匹配。
根据本公开的一个或多个实施例,上述肢体匹配判断模块,具体用于:
在预设高亮显示时段内实时采集所述用户的肢体动作图像;
根据所述肢体动作图像判断所述用户的特定肢体是否位于所述高亮显示的动作指示图标的预展示区域内。
根据本公开的一个或多个实施例,上述肢体匹配判断模块,具体用于:
将所述用户执行的肢体动作与所述高亮显示的动作指示图标中标识的动作进行匹配,以确定所述用户执行的肢体动作是否与所述高亮显示的动作指示图标的图标特征匹配。
根据本公开的一个或多个实施例,上述装置中,还包括:
图标闪烁模块,用于在预设高亮显示时段内,根据所述第一音乐的播放节奏控制所述高亮显示的动作指示图标的闪烁频率。
根据本公开的一个或多个实施例,上述装置中,还包括:
肢体识别模块,用于识别所述用户界面内的所述用户的特定肢体,并根据识别结果展示初始界面,所述初始界面包括所述按预定位置展示的多个动作指示图标。
根据本公开的一个或多个实施例,上述装置中,还包括:
肢体识别提示模块,用于如果在所述用户界面内未识别到所述用户的特定肢体,则在所述用户界面中向所述用户展示相应的提示信息。
根据本公开的一个或多个实施例,上述互动模块,具体用于:
将每次高亮显示所述至少一个动作指示图标后的全部匹配结果进行平均处理,得到所述用户的互动完成度。
根据本公开的一个或多个实施例提供的一种电子设备,该电子设备包括:
一个或多个处理器;
存储器,用于存储一个或多个程序;
当所述一个或多个程序被所述一个或多个处理器执行,使得所述一个或多个处理器实现如本公开任一实施例中所述的互动方法。
根据本公开的一个或多个实施例提供的一种计算机可读存储介质,其上存储有计算机程序,所述计算机程序被处理器执行时实现如本公开任一实施例中所述的互动方法。
根据本公开的一个或多个实施例提供的一种计算机程序产品,该计算机程序产品包括:计算机程序,该计算机程序存储在可读存储介质中,电子设备的一个或多个处理器可以从所述可读存储介质读取所述计算机程序,所述一个或多个处理器执行所述计算机程序,使得所述电子设备执行上述任一实施例提供的方案。
根据本公开的一个或多个实施例提供的一种计算机程序,该计算机程序存储在可读存储介质中,电子设备的一个或多个处理器可以从所述可读存储介质中读取所述计算机程序,所述一个或多个处理器执行所述计算机程序,使得所述电子设备执行上述任一实施例提供的方案。
以上描述仅为本公开的较佳实施例以及对所运用技术原理的说明。本领域技术人员应当理解,本公开中所涉及的公开范围,并不限于上述技术特征的特定组合而成的技术方案,同时也应涵盖在不脱离上述公开构思的情况下,由上述技术特征或其等同特征进行任意组合而形成的其它技术方案。例如上述特征与本公开中公开的(但不限于)具有类似功能的技术特征进行互相替换而形成的技术方案。
此外,虽然采用特定次序描绘了各操作,但是这不应当理解为要求这些操作以所示出的特定次序或以顺序次序执行来执行。在一定环境下,多任务和并行处理可能是有利的。同样地,虽然在上面论述中包含了若干具体实现细节,但是这些不应当被解释为对本公开的范围的限制。在单独的实施例的上下文中描述的某些特征还可以组合地实现在单个实施例中。相反地,在单个实施例的上下文中描述的各种特征也可以单独地或以任何合适的子组合的方式实现在多个实施例中。
尽管已经采用特定于结构特征和/或方法逻辑动作的语言描述了本主题,但是应当理解所附权利要求书中所限定的主题未必局限于上面描述的特定特征或动作。相反,上面所描述的特定特征和动作仅仅是实现权利要求书的示例形式。

Claims (14)

  1. 一种互动方法,其特征在于,包括:
    播放第一音乐;
    根据第一音乐中的目标音乐节拍,高亮显示用户界面中按预定位置展示的多个动作指示图标中的至少一个动作指示图标;
    获取用户执行的肢体动作;
    确定所述用户执行的肢体动作是否与所述高亮显示的动作指示图标的图标特征匹配;
    在所述第一音乐结束播放后,根据确定的最终匹配结果,确定所述用户的互动完成度。
  2. 根据权利要求1所述的方法,其特征在于,所述高亮显示用户界面中按预定位置展示的多个动作指示图标中的至少一个动作指示图标,包括:
    按照所述第一音乐的音乐属性以及所述用户界面中按预定位置展示的多个动作指示图标的位置关联性,高亮显示所述至少一个动作指示图标。
  3. 根据权利要求1或2所述的方法,其特征在于,所述确定所述用户执行的肢体动作是否与所述高亮显示的动作指示图标的图标特征匹配,包括:
    在预设高亮显示时段内检测所述用户的特定肢体是否位于所述高亮显示的动作指示图标的预展示区域内;
    根据所述检测结果确定所述用户执行的肢体动作是否与所述高亮显示的动作指示图标所指示的图标特征匹配。
  4. 根据权利要求3所述的方法,其特征在于,所述在预设高亮显示时段内检测所述用户的特定肢体是否位于所述高亮显示的动作指示图标的预展示区域内,包括:
    在预设高亮显示时段内实时采集所述用户的肢体动作图像;
    根据所述肢体动作图像判断所述用户的特定肢体是否位于所述高亮显示的动作指示图标的预展示区域内。
  5. 根据权利要求1或2所述的方法,其特征在于,所述确定所述用户执行的肢体动作是否与所述高亮显示的动作指示图标的图标特征匹配,包括:
    将所述用户执行的肢体动作与所述高亮显示的动作指示图标中标识的动作进行匹配,以确定所述用户执行的肢体动作是否与所述高亮显示的动作指示图标的图标特征匹配。
  6. 根据权利要求1-5中任一项所述的方法,其特征在于,所述方法还包括:
    在预设高亮显示时段内,根据所述第一音乐的播放节奏控制所述高亮显示的动作指示图标的闪烁频率。
  7. 根据权利要求1-6中任一项所述的方法,其特征在于,所述方法还包括:
    识别所述用户界面内的所述用户的特定肢体,并根据识别结果展示初始界面,所述初始界面包括所述按预定位置展示的多个动作指示图标。
  8. 根据权利要求7所述的方法,其特征在于,所述方法还包括:
    如果在所述用户界面内未识别到所述用户的特定肢体,则在所述用户界面中向所述用户展示相应的提示信息。
  9. 根据权利要求1-8中任一项所述的方法,其特征在于,所述根据确定的最终匹配结果,确定所述用户的互动完成度,包括:
    将每次高亮显示所述至少一个动作指示图标后的全部匹配结果进行平均处理,得到所述用户的互动完成度。
  10. 一种互动装置,其特征在于,包括:
    音乐播放模块,用于播放第一音乐;
    图标高亮显示模块,用于根据第一音乐中的目标音乐节拍,高亮显示用户界面中按预定位置展示的多个动作指示图标中的至少一个动作指示图标;
    肢体动作获取模块,用于获取用户执行的肢体动作;
    肢体匹配判断模块,用于确定所述用户执行的肢体动作是否与所述高亮显示的动作指示图标的图标特征匹配;
    互动模块,用于在所述第一音乐结束播放后,根据确定的最终匹配结果,确定所述用户的互动完成度。
  11. 一种电子设备,其特征在于,所述电子设备包括:
    一个或多个处理器;
    存储器,用于存储一个或多个程序;
    当所述一个或多个程序被所述一个或多个处理器执行,使得所述一个或多个处理器实现如权利要求1-9中任一项所述的互动方法。
  12. 一种计算机可读存储介质,其上存储有计算机程序,其特征在于,所述计算机程序被处理器执行时实现如权利要求1-9中任一项所述的互动方法。
  13. 一种计算机程序产品,其特征在于,包括计算机程序指令,所述计算机程序指令使得计算机执行如权利要求1-9中任一项所述的互动方法。
  14. 一种计算机程序,其特征在于,所述计算机程序使得计算机执行如权利要求1-9中任一项所述的互动方法。
PCT/CN2021/104899 2020-07-24 2021-07-07 一种互动方法、装置、设备和可读介质 WO2022017181A1 (zh)

Priority Applications (2)

Application Number Priority Date Filing Date Title
EP21846566.4A EP4167067A4 (en) 2020-07-24 2021-07-07 INTERACTION METHOD AND APPARATUS, DEVICE AND READABLE MEDIUM
US18/005,812 US20230298384A1 (en) 2020-07-24 2021-07-07 Interaction method and apparatus, device and readable medium

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202010724931.1A CN111857482B (zh) 2020-07-24 2020-07-24 一种互动方法、装置、设备和可读介质
CN202010724931.1 2020-07-24

Publications (1)

Publication Number Publication Date
WO2022017181A1 true WO2022017181A1 (zh) 2022-01-27

Family

ID=72950946

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2021/104899 WO2022017181A1 (zh) 2020-07-24 2021-07-07 一种互动方法、装置、设备和可读介质

Country Status (4)

Country Link
US (1) US20230298384A1 (zh)
EP (1) EP4167067A4 (zh)
CN (1) CN111857482B (zh)
WO (1) WO2022017181A1 (zh)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111857482B (zh) * 2020-07-24 2022-05-17 北京字节跳动网络技术有限公司 一种互动方法、装置、设备和可读介质
CN112988027B (zh) * 2021-03-15 2023-06-27 北京字跳网络技术有限公司 对象控制方法及设备

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20080280680A1 (en) * 2007-05-08 2008-11-13 Disney Enterprises, Inc. System and method for using a touchscreen as an interface for music-based gameplay
CN108536293A (zh) * 2018-03-29 2018-09-14 北京字节跳动网络技术有限公司 人机交互系统、方法、计算机可读存储介质及交互装置
CN108815845A (zh) * 2018-05-15 2018-11-16 百度在线网络技术(北京)有限公司 人机交互的信息处理方法及装置、计算机设备及可读介质
CN111857482A (zh) * 2020-07-24 2020-10-30 北京字节跳动网络技术有限公司 一种互动方法、装置、设备和可读介质

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP5319750B2 (ja) * 2011-10-05 2013-10-16 株式会社コナミデジタルエンタテインメント ゲーム装置、ゲーム装置の制御方法、及びプログラム
CN104598867B (zh) * 2013-10-30 2017-12-01 中国艺术科技研究所 一种人体动作自动评估方法及舞蹈评分系统
CN104754421A (zh) * 2014-02-26 2015-07-01 苏州乐聚一堂电子科技有限公司 互动节拍特效系统及互动节拍特效处理方法
KR101895691B1 (ko) * 2016-12-13 2018-09-05 계명대학교 산학협력단 사용자 동작 기반의 지휘 게임 장치 및 그것을 이용한 지휘 게임 방법
CN109799903A (zh) * 2018-12-21 2019-05-24 段新 基于虚拟现实的打击乐方法、终端设备及系统
CN110544301A (zh) * 2019-09-06 2019-12-06 广东工业大学 一种三维人体动作重建系统、方法和动作训练系统

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20080280680A1 (en) * 2007-05-08 2008-11-13 Disney Enterprises, Inc. System and method for using a touchscreen as an interface for music-based gameplay
CN108536293A (zh) * 2018-03-29 2018-09-14 北京字节跳动网络技术有限公司 人机交互系统、方法、计算机可读存储介质及交互装置
CN108815845A (zh) * 2018-05-15 2018-11-16 百度在线网络技术(北京)有限公司 人机交互的信息处理方法及装置、计算机设备及可读介质
CN111857482A (zh) * 2020-07-24 2020-10-30 北京字节跳动网络技术有限公司 一种互动方法、装置、设备和可读介质

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP4167067A4 *

Also Published As

Publication number Publication date
EP4167067A1 (en) 2023-04-19
CN111857482B (zh) 2022-05-17
EP4167067A4 (en) 2023-12-13
CN111857482A (zh) 2020-10-30
US20230298384A1 (en) 2023-09-21


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21846566

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2021846566

Country of ref document: EP

Effective date: 20230112

NENP Non-entry into the national phase

Ref country code: DE