CN113448483A - Interaction method, interaction device, electronic equipment and computer storage medium - Google Patents

Interaction method, interaction device, electronic equipment and computer storage medium

Info

Publication number
CN113448483A
CN113448483A
Authority
CN
China
Prior art keywords
chord
playing
interactive
current
interaction
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202010224423.7A
Other languages
Chinese (zh)
Inventor
饶荣庆
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangzhou Huancheng Culture Media Co., Ltd.
Original Assignee
Beijing Wall Breaker Technology Co., Ltd.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Wall Breaker Technology Co., Ltd.
Priority to CN202010224423.7A
Publication of CN113448483A
Legal status: Pending

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04886Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Electrophonic Musical Instruments (AREA)

Abstract

An embodiment of the invention provides an interaction method, an interaction device, electronic equipment and a computer storage medium. The interaction method includes: acquiring a first interactive operation performed by a user on the playing interaction area of the current chord in the chord sequence of a played track; determining the chord next to the current chord from the chord sequence according to the first interactive operation; and presenting the playing interaction area of the next chord. Because the next chord is determined according to the first interactive operation and its playing interaction area is then presented, the chord-playing interaction is simple and quick, the demand on the user's chord creation ability is reduced, and user friendliness is improved.

Description

Interaction method, interaction device, electronic equipment and computer storage medium
Technical Field
Embodiments of the invention relate to the field of communication technology, and in particular to an interaction method, an interaction device, electronic equipment and a computer storage medium.
Background
Singing on a mobile phone is a new form of entertainment. Typically, mobile singing software gathers a limited number of chords for the melody fragments of a song and displays them as keys; the user taps a key to sound the chord, so that the user can sing while tapping chords as accompaniment and then complete recording and synthesis. For an ordinary user, however, this interactive singing mode demands too much creative ability, and the user usually gives up after a few attempts.
Disclosure of Invention
Embodiments of the present invention provide an interaction method, an interaction apparatus, an electronic device, and a computer storage medium to solve or alleviate the above problems.
According to a first aspect of the embodiments of the present invention, there is provided an interaction method, including: acquiring a first interactive operation performed by a user on the playing interaction area of the current chord in the chord sequence of a played track; determining the chord next to the current chord from the chord sequence according to the first interactive operation; and presenting the playing interaction area of the next chord.
According to a second aspect of the embodiments of the present invention, there is provided an interaction method, including: acquiring a first interactive operation performed by a user, on an interactive interface, on the playing interaction area of the current chord in the chord sequence of a played track; determining from the chord sequence, according to the first interactive operation, whether a chord next to the current chord exists; and if so, presenting the playing interaction area of the next chord on the interactive interface.
According to a third aspect of the embodiments of the present invention, there is provided an interaction method, including: presenting a session interface of a plurality of accounts engaged in instant video communication, wherein the plurality of accounts include a first account and a plurality of second accounts, and the session interface at least includes a video window queue of the plurality of second accounts; acquiring a prompt operation performed by the first account on the video window, in the video window queue, of a target second account among the plurality of second accounts; and, in response to the prompt operation, moving the video window of the target second account to the front of the video window queue so as to update the video window queue.
According to a fourth aspect of the embodiments of the present invention, there is provided an interactive apparatus, including: a presentation module that presents a session interface of a plurality of accounts engaged in instant video communication, wherein the plurality of accounts include a first account and a plurality of second accounts, and the session interface at least includes a video window queue of the plurality of second accounts; an acquisition module that acquires a prompt operation performed by the first account on the video window, in the video window queue, of a target second account among the plurality of second accounts; and a moving module that, in response to the prompt operation, moves the video window of the target second account to the front of the video window queue so as to update the video window queue.
According to a fifth aspect of the embodiments of the present invention, there is provided an interactive apparatus, including: the acquisition module is used for acquiring a first interaction operation which is performed by a user aiming at a playing interaction area of a current chord in the chord sequence of the playing track; the determining module is used for determining the next chord of the current chord from the chord sequence according to the first interactive operation; and the presentation module is used for presenting the playing interaction area of the next chord.
According to a sixth aspect of the embodiments of the present invention, there is provided an interactive apparatus, including: the system comprises an acquisition module, a determination module and a presentation module, wherein the acquisition module is used for acquiring a first interaction operation which is performed by a user on an interaction interface aiming at a playing interaction area of a current chord in a chord sequence of a playing track; the determining module is configured to determine whether a next chord of the current chord exists in the chord sequence according to the first interaction operation, and if the next chord exists, present a playing interaction area of the next chord on the interaction interface through the presenting module.
According to a seventh aspect of embodiments of the present invention, there is provided an electronic apparatus, the apparatus including: one or more processors; a computer readable medium configured to store one or more programs which, when executed by the one or more processors, cause the one or more processors to carry out the method of any one of the first to third aspects.
According to an eighth aspect of embodiments of the present invention, there is provided a computer readable medium, on which a computer program is stored, which when executed by a processor, implements the method according to any one of the first to third aspects.
The scheme of the embodiment of the invention can acquire a first interactive operation performed by a user on the playing interaction area of the current chord in the chord sequence of a played track; determine the chord next to the current chord from the chord sequence according to the first interactive operation; and present the playing interaction area of the next chord. Because the next chord is determined according to the first interactive operation and its playing interaction area is then presented, the chord-playing interaction is simple and quick, the demand on the user's chord creation ability is reduced, and user friendliness is improved.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art are briefly described below. The drawings in the following description are only some of the embodiments described in the embodiments of the present invention, and a person skilled in the art can obtain other drawings based on them.
FIG. 1A is a schematic flowchart of an interaction method according to a first embodiment of the present invention;
FIG. 1B is a schematic diagram of an interaction method according to the first embodiment of the present invention;
FIG. 1C is a schematic diagram of an interaction method according to the first embodiment of the present invention;
FIG. 1D is a schematic diagram of an interaction method according to the first embodiment of the present invention;
FIG. 2 is a schematic flowchart of an interaction method according to a second embodiment of the present invention;
FIG. 3 is a schematic block diagram of an interactive apparatus according to a third embodiment of the present invention;
FIG. 4 is a schematic block diagram of an interactive apparatus according to a fourth embodiment of the present invention;
FIG. 5A is a schematic flowchart of an interaction method according to a fifth embodiment of the present invention;
FIG. 5B is a schematic diagram of an exemplary interaction method according to the fifth embodiment of the present invention;
FIG. 5C is a schematic diagram of another exemplary interaction method according to the fifth embodiment of the present invention;
FIG. 6 is a schematic block diagram of an interactive apparatus according to a sixth embodiment of the present invention;
FIG. 7 is a schematic structural diagram of an electronic device according to a seventh embodiment of the present invention;
FIG. 8 is a schematic diagram of the hardware structure of an electronic device according to an eighth embodiment of the present invention.
Detailed Description
In order to enable those skilled in the art to better understand the technical solutions in the embodiments of the present invention, the technical solutions in the embodiments of the present invention are described clearly and completely below with reference to the drawings. The described embodiments are only some, not all, of the embodiments of the present invention. All other embodiments obtained by a person skilled in the art based on the embodiments of the present invention shall fall within the scope of protection of the embodiments of the present invention.
The following further describes specific implementation of the embodiments of the present invention with reference to the drawings.
FIG. 1A is a schematic flow chart of an interaction method according to a first embodiment of the invention; the interaction method of fig. 1A includes:
110: acquiring a first interactive operation performed by a user on the playing interaction area of the current chord in the chord sequence of a played track;
120: determining the chord next to the current chord from the chord sequence according to the first interactive operation;
130: and presenting the playing interaction area of the next chord.
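For concreteness, a minimal sketch of steps 110 to 130 is given below: a tap on the current chord's playing interaction area advances through the chord sequence. The class and function names (Chord, ChordSession, onPlayAreaTapped) are assumptions made for this sketch and do not come from the patent.

```kotlin
// Hypothetical names throughout; this only mirrors steps 110-130 of the flow above.
data class Chord(val name: String)

class ChordSession(private val sequence: List<Chord>) {
    private var index = 0
    val currentChord: Chord get() = sequence[index]

    // Step 110: a first interactive operation lands on the current chord's play area.
    // Step 120: the next chord is determined from the chord sequence.
    // Step 130: the caller presents the returned chord's playing interaction area.
    fun onPlayAreaTapped(): Chord? {
        if (index + 1 >= sequence.size) return null   // no next chord in the sequence
        index += 1
        return currentChord
    }
}

fun main() {
    val session = ChordSession(listOf(Chord("C"), Chord("G"), Chord("Am"), Chord("F")))
    println(session.currentChord.name)            // C
    println(session.onPlayAreaTapped()?.name)     // G
}
```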
It should be understood that the interaction area of the embodiments of the present invention can be implemented in various forms: it can be a fixed-position area or a variable area that, for example, moves or blinks on the interactive interface, or changes as the user interacts. The embodiment of the present invention is not limited in this respect. The playing interaction operation of the embodiments of the present invention is used to determine or select the chord that the playing interaction area represents. Playing interaction areas of the embodiments of the present invention may also be overlaid as layers. In short, the embodiments of the present invention do not limit the form of interaction, the interactive operation, or the interaction area.
It should also be appreciated that the interaction areas of embodiments of the present invention may be implemented in the form of keys, although other forms are also possible. In addition, the playing interactive operation may take any form, such as a touch operation, a gesture operation, or a voice interactive operation. For a touch operation, for example, the duration, trajectory and force of the touch may be combined in any way to form different operations.
It should also be understood that tracks of embodiments of the present invention include, but are not limited to, songs, accompaniment tracks, instrumental tracks, and the like. A played track may be determined in any way: it may be obtained locally (e.g., from memory); it may be obtained from another device, for example when a user shares songs between devices via near-field communication; or it may be obtained from a server (such as a cloud server) over a communication connection. Furthermore, such obtaining or storing includes, but is not limited to, the track itself as well as the track in the form of a chord sequence. For example, the track may be acquired and then processed into a chord sequence, or the chord sequence may be acquired directly. In short, the embodiments of the present invention do not limit the range of tracks, the manner of obtaining or storing them, or the manner of obtaining or storing the chord sequence.
For example, the interaction method of the embodiment of the present invention further includes: obtaining a chord sequence of the playing track; and presenting the playing interactive area of the current chord in the chord sequence of the playing track.
Because the next chord is determined according to the first interactive operation and the playing interactive region of the next chord is presented, the playing interactive process of the chord is simple and quick, the requirement on the chord creation capability of the user is reduced, and the user friendliness is improved.
In addition, compared with the multiple chord choices offered at once in the prior art, the scheme of the embodiment of the invention also helps users with limited color vision (including but not limited to color blindness and color weakness) to play, so that they can press the next chord smoothly and accurately. In other words, the ease of playing is improved, thereby improving user friendliness.
For example, the obtaining of the first interactive operation performed by the user on the playing interaction area of the current chord in the chord sequence of the played track includes: acquiring a first interactive operation performed by the user on the playing interaction area of the current chord during playback of the chord previous to the current chord, wherein the interaction method further includes: in response to the first interactive operation, interrupting the playback of the chord previous to the current chord in the chord sequence so as to start playback of the current chord. Interrupting the previous chord in response to the user's interactive operation switches directly to the current chord, which increases the user's freedom in chord creation.
For example, the obtaining of the first interactive operation performed by the user on the playing interaction area of the current chord in the chord sequence of the played track includes: acquiring a first interactive operation performed by the user on the playing interaction area of the current chord during playback of the chord previous to the current chord, wherein the method further includes: in response to the first interactive operation, starting playback of the current chord after playback of the previous chord is finished. For example, the chord selected by the interaction is loaded in response to the operation, and its playback starts once the previous chord finishes. Starting the current chord only after the previous chord ends makes the chord playback smooth, improves the auditory interactive experience, and makes the interaction easier to take part in.
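The two playback behaviors just described (interrupting the previous chord versus waiting for it to finish) could be modeled roughly as below. The Player interface and every other name here are assumptions for this sketch, not APIs from the patent or from any particular audio library.

```kotlin
// Illustrative sketch of the two switching behaviours described above.
interface Player {
    fun stop()                                          // cut off the chord that is sounding
    fun play(chord: String, onDone: () -> Unit = {})    // sound a chord, then invoke onDone
}

enum class SwitchMode { INTERRUPT, WAIT_FOR_PREVIOUS }

class ChordPlayback(private val player: Player, private val mode: SwitchMode) {
    private var previousFinished = true
    private var pending: String? = null

    // Called when the first interactive operation selects a chord while another chord plays.
    fun onFirstInteraction(selectedChord: String) {
        when (mode) {
            SwitchMode.INTERRUPT -> {            // interrupt the previous chord, start immediately
                player.stop()
                start(selectedChord)
            }
            SwitchMode.WAIT_FOR_PREVIOUS -> {    // load now, start once the previous chord ends
                if (previousFinished) start(selectedChord) else pending = selectedChord
            }
        }
    }

    private fun start(chord: String) {
        previousFinished = false
        player.play(chord) {
            previousFinished = true
            pending?.let { queued -> pending = null; start(queued) }
        }
    }
}
```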
For example, the interaction method further comprises: and acquiring second interactive operation performed by the user aiming at the playing interactive area of the current chord at the second presentation position, and repeatedly playing the current chord according to the second interactive operation. Because the user can repeat the current chord through the interactive operation of the second presentation position, the chord creation freedom of the user is improved. In addition, when the user exercises by adopting the interaction method of the embodiment of the invention, the current chord is played repeatedly, which is beneficial to self-correction and improves the learning effect.
In one implementation manner of the present invention, the playing interaction area of the next chord can be presented at any presentation position. For example, on the interactive interface, it may be superimposed beneath the playing interaction area of the current chord, and the user can move the upper area to another position through an interactive operation to reveal the lower one. Alternatively, it may be presented at a presentation position different from that of the current chord's playing interaction area.
It should be understood that the number of playing interaction areas in the embodiments of the present invention may be one, or two or more. With a single playing interaction area, the playing interaction area of the newly selected chord replaces the one previously shown. With multiple playing interaction areas, for example, the playing interaction areas of the previous chord, the current chord, and the next chord may all be presented.
For example, when the playing interaction area of the current chord and that of the next chord are presented at two presentation positions, the interaction areas may be presented alternately. As shown in FIG. 1B, when the key on the left side (one presentation form of the playing interaction area) is clicked (an interactive operation), the next chord is presented on the key on the right side; when the user clicks the key on the right side to obtain the chord after that, it is presented on the key on the left side.
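The alternating presentation of FIG. 1B could be sketched as follows, with each tap revealing the next chord on the opposite key. The names and the two-sided model are assumptions for illustration only.

```kotlin
// Minimal sketch of the alternating presentation in FIG. 1B: two on-screen keys, and
// each tap reveals the next chord on the opposite key. All names are illustrative.
enum class Side { LEFT, RIGHT }

class AlternatingKeys(private val chords: List<String>) {
    private var index = 0
    var activeSide = Side.LEFT          // the key currently showing the current chord
        private set

    // Tapping the active key (an interactive operation) presents the next chord,
    // if any, on the opposite key and returns where it was placed.
    fun tapActiveKey(): Pair<Side, String>? {
        if (index + 1 >= chords.size) return null
        index += 1
        activeSide = if (activeSide == Side.LEFT) Side.RIGHT else Side.LEFT
        return activeSide to chords[index]
    }
}

fun main() {
    val keys = AlternatingKeys(listOf("C", "G", "Am"))
    println(keys.tapActiveKey())        // (RIGHT, G)
    println(keys.tapActiveKey())        // (LEFT, Am)
}
```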
In one implementation manner of the present invention, obtaining the first interactive operation performed by the user on the playing interaction area of the current chord in the chord sequence of the played track includes: acquiring a first interactive operation performed by the user on the playing interaction area of the current chord at a first presentation position, wherein presenting the playing interaction area of the next chord includes: at the first presentation position, presenting the playing interaction area of the next chord in place of that of the current chord. For example, the first presentation position may be a position that can be moved by the user's interactive operation. It may be the only position on the interactive interface, or one of a plurality of presentation positions on the interactive interface. For example, as shown in FIG. 1C, when the user clicks the key on the right side as the first presentation position, the next chord is presented on that key. The current chord may be presented on the key on the left side as another position, or it may be covered or shielded so that it is not presented, or it may be revealed when the user moves the upper layer away.
In one implementation manner of the present invention, obtaining a first interaction operation performed by a user for a play interaction area of a current chord in a chord sequence of a play track includes: acquiring a first interactive operation performed by a user for a playing interactive area of a current chord at a first presentation position, wherein the interactive method further comprises the following steps: and moving the playing interaction area of the current chord from the first presentation position to a second presentation position different from the first presentation position according to the first interaction operation.
Because the interaction method is implemented with a first presentation position and a second presentation position, compared with a dense arrangement of keys (for example, seven chords), the scheme of the embodiment of the invention can present relatively large playing interaction areas on electronic equipment with a limited screen size. This makes the playing process convenient and fast and reduces the probability that a finger mistakenly touches another key or that a tap fails to register.
For example, three or more presentation positions may also be presented as keys. As shown in FIG. 1D, the key on the left side (the third presentation position) may be used to obtain the chord previous to the current chord, the middle key (the second presentation position) may be used to repeat the current chord, and the key on the right side (the first presentation position) may be used to obtain the next chord.
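A rough sketch of the three-key layout of FIG. 1D is shown below: the left key recalls the previous chord, the middle key repeats the current chord, and the right key advances to the next one. All identifiers are hypothetical.

```kotlin
// Hypothetical controller for the three keys of FIG. 1D. Names are illustrative.
class ThreeKeyController(private val chords: List<String>) {
    private var index = 0
    val current: String get() = chords[index]

    // Left key (third presentation position): go back to the previous chord, if any.
    fun onLeftKey(): String {
        if (index > 0) index -= 1
        return current
    }

    // Middle key (second presentation position): repeat the current chord.
    fun onMiddleKey(): String = current

    // Right key (first presentation position): advance to the next chord, if any.
    fun onRightKey(): String {
        if (index + 1 < chords.size) index += 1
        return current
    }
}
```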
It should be understood that the scheme of the embodiment of the present invention does not limit the number of presentation positions. Even if the number of keys in the embodiment of the present invention is no smaller than the number of keys in the prior art, the scheme still presents interaction areas only for the current chord and the next chord, so the difficulty of chord creation for the user is still reduced. Meanwhile, thanks to the convenience of this operation, interaction functions such as moving the playing interaction area can be added, which increases interactivity.
In one implementation manner of the present invention, the playing interaction area is presented in a first color at the first presentation position and in a second color at the second presentation position. In other words, each presentation position is associated with a color, and different presentation positions may be presented in different colors. Distinguishing the presentation positions by color further facilitates the user's creation. In addition, because the number of presentation positions is small and they are distinguished by color, the operation threshold is lowered, particularly for people with limited vision.
In one embodiment of the present invention, presenting the playing interaction area of the current chord in the chord sequence of the played track includes presenting it in a third color, and presenting the playing interaction area of the next chord includes presenting it in a fourth color. In other words, different colors are used for the interaction areas of the current chord and the next chord (or the previous chord). Since a specific color is associated with a specific chord's interaction area, the color does not change even if the position of that interaction area changes, which further facilitates the user's creation and improves interactivity during the creation process.
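One way such a color scheme could be organized is sketched below: a fixed color per presentation position, plus a stable color per chord so that a chord keeps its color even when its playing interaction area moves. The concrete color values and names are assumptions, not taken from the patent.

```kotlin
// Illustrative color mapping only; the color values and names are assumptions.
data class Rgb(val r: Int, val g: Int, val b: Int)

object PresentationColors {
    // First presentation position -> first color, second position -> second color.
    val byPosition: Map<Int, Rgb> = mapOf(
        1 to Rgb(0x4C, 0xAF, 0x50),   // e.g. green for the first position
        2 to Rgb(0x21, 0x96, 0xF3)    // e.g. blue for the second position
    )

    // Each distinct chord keeps a stable color of its own (third color, fourth color, ...),
    // so moving its playing interaction area does not change its color.
    private val palette = listOf(Rgb(0xF4, 0x43, 0x36), Rgb(0xFF, 0xC1, 0x07), Rgb(0x9C, 0x27, 0xB0))
    private val byChord = mutableMapOf<String, Rgb>()

    fun colorFor(chord: String): Rgb =
        byChord.getOrPut(chord) { palette[byChord.size % palette.size] }
}
```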
In one prior-art playing interaction (i.e., each key represents a specific chord), a song segment contains multiple chords; in other words, playing the song requires many keys side by side. When there are many chords, even if different chords are distinguished by color and the chord color is marked where the lyrics correspond to the start of the chord, the colors remain difficult to tell apart. In addition, in product design, consecutive keys are usually given gradient colors for the sake of the aesthetics of the key arrangement.
In other words, when a song segment contains multiple chords, the color distinction between keys is small (e.g., blue versus light blue), and it is hard to find the right one. Compared with this prior-art scheme, the scheme of the embodiment of the invention greatly reduces the number of presentation positions and solves the problem of key color display: while playing and singing with the interaction method of the invention, the user can more easily find the chord color corresponding to the lyric being sung, match the color, and press (touch) the corresponding chord key. This greatly lowers the learning and operating thresholds for playing while singing.
In addition, the interaction method provided by the embodiment of the invention is more user-friendly to people with color blindness, color weakness or limited mobility, because it makes the key colors easier to distinguish.
In another implementation manner of the present invention, presenting the playing interaction area of the current chord in the chord sequence of the song includes: presenting the playing interaction area of the current chord together with its chord identifier, and presenting the playing interaction area of the next chord includes: presenting the playing interaction area of the next chord together with its chord identifier. Because the chord identifier is presented together with the playing interaction area, it is easier for the user to remember the chord type, which further improves the learning value of the interaction method.
In another implementation manner of the present invention, the interaction method further includes: presenting the lyrics of the target song. For example, the lyrics of the target song are divided into a plurality of words and sentences, and all of them are presented without distinction so that the user can practice freely.
In another implementation manner of the present invention, the interaction method further includes: acquiring a third interactive operation performed by the user on the word-and-sentence prompt interaction area of the current chord; and determining, according to the third interactive operation, the current words and sentences in the lyrics that correspond to the current chord, wherein presenting the lyrics of the target song includes: displaying the lyrics of the target song with the current words and sentences highlighted. For example, highlighting may be done by brightening and enlarging, which the embodiments of the present invention do not limit. Because specific words and sentences can be highlighted, the user can easily find the current passage, for example after losing track of the current progress, which makes learning and creating more convenient.
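As an illustration of this word-and-sentence prompt, the sketch below maps the current chord to its lyric phrase and marks that phrase for emphasis. The alignment map and the marker syntax are assumptions for this sketch; a real interface would brighten or enlarge the phrase instead.

```kotlin
// Hypothetical sketch: the third interactive operation looks up the lyric phrase aligned
// with the current chord and returns the lyrics with that phrase marked for emphasis.
class LyricHighlighter(
    private val phrases: List<String>,          // lyrics split into words and sentences
    private val chordToPhrase: Map<Int, Int>    // chord index -> phrase index (assumed alignment)
) {
    fun render(currentChordIndex: Int): List<String> {
        val highlighted = chordToPhrase[currentChordIndex]
        return phrases.mapIndexed { i, phrase ->
            // ">> <<" stands in for brightening/enlarging in a real UI.
            if (i == highlighted) ">> $phrase <<" else phrase
        }
    }
}
```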
In another implementation manner of the present invention, at least one of the first presentation position and the second presentation position may be settable, and at least one of them may be movable. For example, the at least one presentation position is determined in response to a presentation-position setting instruction input by the user, or moved in response to a presentation-position movement instruction input by the user. The setting instruction may include target position information; the movement instruction may include source position information and target position information, or only target position information. For example, the movement instruction may be obtained by clicking the source position and the target position, or determined from a gesture trajectory input by the user, in which case the start point and end point of the gesture trajectory may be taken as the source position information and the target position information, respectively.
It should also be appreciated that moving a presentation position and moving a playing interaction area may or may not be coupled. For example, the playing interaction area may remain unchanged while the presentation position is moved, or the playing interaction area may move together with the presentation position; the embodiment of the present invention is not limited in this respect. In addition, triggering the playing interaction area and moving or setting a presentation position may use similar but distinct gestures: for example, the playing interaction area is triggered by a single click, while the presentation position is moved by a double click. Moving a presentation position may use the same gesture as setting one, or a different gesture.
It should also be understood that movement of different presentation positions may take place in the same layer or in different layers. For example, a plurality of presentation positions may correspond to a plurality of layers, and the layers may have different priorities; when presentation positions in different layers overlap, the transparency of the overlapping positions may be changed. When different presentation positions are in the same layer, a collision-effect trigger component or an ejection-effect trigger component can be arranged around a presentation position. Such a component is configured with collision parameters, which may include a launch distance, a collision distance, motion trajectory parameters, motion time parameters, and the like. For example, when another presentation position moves into the periphery of a presentation position, a collision between the two is displayed. Different presentation positions may have different sizes, which characterize the "mass" of a presentation position. For a target collision effect, the above parameters may be determined based on the law of conservation of momentum, with the size of a presentation position standing in for the mass of the colliding object; in other words, while one presentation position collides, the other is struck. For a target ejection effect, the struck presentation position may stay where it is, and the moved presentation position (the one that triggered the ejection-effect component) may be bounced back to its initial position or to another position, for example bouncing again after hitting the border of the display interface. It should be appreciated that this motion can be simulated from physical laws and the above parameters to determine the presentation parameters of the motion, and at least part of the motion can be presented with amplified effects to increase interactive participation. It should also be understood that the display modes described above are merely exemplary, and the various trigger modes and motion presentations can be combined in many ways.
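The collision effect mentioned above, parameterized by conservation of momentum with the size of a presentation position standing in for mass, could be derived as in the following sketch. It is purely illustrative; the patent does not specify these formulas or names.

```kotlin
// Purely illustrative; formulas are the standard 1-D elastic collision, with a
// presentation position's size standing in for the mass named in the text above.
data class PositionBody(val sizePx: Float, val velocity: Float)

// Returns the post-collision velocities of the moved body and the struck body,
// which an animation could then use as its motion parameters.
fun collide(moving: PositionBody, struck: PositionBody): Pair<Float, Float> {
    val m1 = moving.sizePx
    val m2 = struck.sizePx
    val v1 = moving.velocity
    val v2 = struck.velocity
    val v1After = ((m1 - m2) * v1 + 2 * m2 * v2) / (m1 + m2)
    val v2After = ((m2 - m1) * v2 + 2 * m1 * v1) / (m1 + m2)
    return v1After to v2After
}

fun main() {
    // A small dragged key hitting a larger, resting one mostly bounces back.
    val dragged = PositionBody(sizePx = 80f, velocity = 300f)
    val target = PositionBody(sizePx = 160f, velocity = 0f)
    println(collide(dragged, target))   // (-100.0, 200.0)
}
```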
As an example, the obtaining of the first interactive operation performed by the user on the playing interaction area of the current chord in the chord sequence of the played track includes: acquiring a first interactive operation performed by the user on the playing interaction area of the current chord at a first presentation position, wherein the method further includes: moving the first presentation position from its current position to a target position in response to a presentation-position movement operation input by the user.
As an example, the method further comprises: responding to a presentation position setting operation input by a user, and determining the first presentation position; and presenting the playing interaction area of the current chord at the first presentation position.
FIG. 2 is a schematic flow chart of an interaction method according to a second embodiment of the present invention; the interaction method of fig. 2 includes:
210: acquiring a first interactive operation which is performed by a user on an interactive interface aiming at a playing interactive area of a current chord in a chord sequence of a playing track;
220: according to the first interactive operation, determining whether a chord next to the current chord exists from the chord sequence, and if the chord next exists, executing 230:
230: and presenting the playing interaction area of the next chord on the interaction interface.
Because the next chord is determined according to the first interactive operation and the playing interactive region of the next chord is presented, the playing interactive process of the chord is simple and quick, the requirement on the chord creation capability of the user is reduced, and the user friendliness is improved.
In another implementation manner of the present invention, obtaining a first interactive operation performed by a user on an interactive interface for a play interactive region of a current chord in a chord sequence of a play track includes: acquiring a first interactive operation of a user on an interactive interface aiming at a playing interactive area of a current chord at a first presentation position, wherein the interactive method further comprises the following steps: and according to the first interactive operation, moving the playing interactive area of the current chord from the first presentation position to a second presentation position different from the first presentation position on the interactive interface.
In another implementation manner of the present invention, the interaction method further includes: and if the next chord does not exist, presenting the playing interaction area of the current chord at the first presentation position.
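A compact sketch of steps 210 to 230, together with the fallback just described (keeping the current chord when no next chord exists), might look like this; the function and its signature are assumptions.

```kotlin
// Sketch of steps 210-230 plus the fallback above: present the next chord's play
// area if the sequence has one, otherwise keep presenting the current chord at the
// first presentation position. Purely illustrative names.
fun handleFirstInteraction(sequence: List<String>, currentIndex: Int): Pair<Int, String> =
    if (currentIndex + 1 < sequence.size)
        (currentIndex + 1) to sequence[currentIndex + 1]   // next chord exists: present it
    else
        currentIndex to sequence[currentIndex]             // no next chord: keep the current one
```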
Fig. 3 is a schematic block diagram of an interactive apparatus according to a third embodiment of the present invention. The interactive apparatus of fig. 3 includes:
the first obtaining module 310 obtains a first interaction operation performed by a user for a play interaction region of a current chord in the chord sequence of the played track;
the determining module 320 determines a chord next to the current chord from the chord sequence according to the first interactive operation;
the presenting module 330 presents the playing interaction area of the next chord.
Because the next chord is determined according to the first interactive operation and the playing interactive region of the next chord is presented, the playing interactive process of the chord is simple and quick, the requirement on the chord creation capability of the user is reduced, and the user friendliness is improved.
In another implementation manner of the present invention, the interaction device further includes: a first play control module that, in response to the first interactive operation, interrupts the playback of the chord previous to the current chord in the chord sequence so as to start playback of the current chord.
In another implementation manner of the present invention, the interaction device further includes: a second play control module that, in response to the first interactive operation, starts playback of the current chord after playback of the chord previous to the current chord in the chord sequence is finished.
In another implementation manner of the present invention, the first obtaining module 310 is specifically configured to: obtaining a first interaction operation performed by a user for a play interaction region of a current chord at a first presentation position, where the presentation module 330 is specifically configured to: and at the first presentation position, replacing the playing interaction area of the current chord to present the playing interaction area of the next chord.
In another implementation manner of the present invention, the first obtaining module 310 is specifically configured to: acquiring a first interaction operation performed by a user for a playing interaction area of a current chord at a first presentation position, wherein the interaction device further comprises: and the moving module moves the playing interaction area of the current chord from the first presentation position to a second presentation position different from the first presentation position according to the first interaction operation.
In another implementation manner of the present invention, the interaction device further includes: and the third playing control module repeatedly plays the current chord according to the second interactive operation.
In another implementation manner of the present invention, the presenting module 330 is specifically configured to: and presenting the playing interaction area by adopting a first color at the first presentation position, and presenting the playing interaction area by adopting a second color at the second presentation position.
In another implementation manner of the present invention, the interaction device further includes: the third obtaining module obtains the chord sequence of the playing track; the presentation module 330 is further configured to: and presenting the playing interactive area of the current chord in the chord sequence of the playing track.
In another implementation manner of the present invention, the presenting module 330 is specifically configured to: and presenting the playing interaction area of the current chord by adopting a third color, and presenting the playing interaction area of the next chord by adopting a fourth color.
In another implementation manner of the present invention, the presenting module 330 is specifically configured to: and presenting the playing interaction area of the current chord together with the chord identifier of the playing interaction area of the current chord, and presenting the playing interaction area of the next chord together with the chord identifier of the playing interaction area of the next chord.
In another implementation manner of the present invention, the playing track is an accompaniment of the target song, and the presenting module 330 is further specifically configured to: the lyrics of the target song are presented.
In another implementation manner of the present invention, the interaction device further includes: the fourth acquisition module is used for acquiring third interaction operation which is performed by the user aiming at the word and sentence prompt interaction area of the current chord; the determining module 320 is specifically configured to: according to the third interactive operation, a current word corresponding to the current chord in the lyrics is determined, wherein the presentation module 330 is specifically configured to: and displaying the lyrics of the target song by highlighting the current words and sentences.
Fig. 4 is a schematic block diagram of an interactive apparatus according to a fourth embodiment of the present invention. The interactive apparatus of fig. 4 includes:
an acquisition module 410, a determination module 420, and a presentation module 430, wherein,
the obtaining module 410 is configured to obtain a first interaction operation performed by a user on the interaction interface for a playing interaction area of a current chord in the chord sequence of the playing track;
the determining module 420 is configured to determine whether a next chord of the current chord exists from the chord sequence according to the first interactive operation, and if the next chord exists, present a playing interactive area of the next chord on the interactive interface through the presenting module 430.
Because the next chord is determined according to the first interactive operation and the playing interactive region of the next chord is presented, the playing interactive process of the chord is simple and quick, the requirement on the chord creation capability of the user is reduced, and the user friendliness is improved.
In another implementation manner of the present invention, the obtaining module 410 is specifically configured to: acquiring a first interactive operation of a user on an interactive interface aiming at a playing interactive area of a current chord at a first presentation position, wherein the interactive device further comprises: and the moving module is used for moving the playing interaction area of the current chord from the first presentation position to a second presentation position different from the first presentation position on the interaction interface according to the first interaction operation.
In another implementation of the invention, the presentation module 430 is further configured to: and if the next chord does not exist, presenting the playing interaction area of the current chord at the first presentation position.
Fig. 5A is a schematic flowchart of an interaction method according to a fifth embodiment of the present invention. The interaction method of fig. 5A includes:
510: presenting a session interface of a plurality of accounts engaged in instant video communication, wherein the plurality of accounts include a first account and a plurality of second accounts, and the session interface at least includes a video window queue of the plurality of second accounts;
520: acquiring a prompt operation performed by the first account on the video window, in the video window queue, of a target second account among the plurality of second accounts;
530: and, in response to the prompt operation, moving the video window of the target second account to the front of the video window queue so as to update the video window queue.
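A minimal sketch of steps 510 to 530 follows: when the first account prompts a target second account, that account's video window is moved to the front of the queue while the other windows keep their relative order. VideoWindow and the queue representation are assumptions made for this sketch.

```kotlin
// Illustrative sketch of steps 510-530; data structures are assumptions, not from the patent.
data class VideoWindow(val accountId: String)

class VideoWindowQueue(initial: List<VideoWindow>) {
    private val windows = initial.toMutableList()
    val snapshot: List<VideoWindow> get() = windows.toList()

    // Step 530: move the target account's window to the head of the queue, keeping the
    // relative order of the remaining windows unchanged.
    fun moveToFront(targetAccountId: String) {
        val i = windows.indexOfFirst { it.accountId == targetAccountId }
        if (i > 0) windows.add(0, windows.removeAt(i))
    }
}

fun main() {
    val queue = VideoWindowQueue(listOf("acc1", "acc2", "acc3", "acc4", "acc5").map(::VideoWindow))
    queue.moveToFront("acc4")
    println(queue.snapshot.map { it.accountId })   // [acc4, acc1, acc2, acc3, acc5]
}
```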
The scheme of the embodiment of the invention can, in response to a prompt operation on a target account, move the video window of that target account to the front of the video window queue, which facilitates interaction among multiple users in instant video communication.
In another implementation manner of the present invention, the session interface further includes a video window of the voice-initiating account, wherein the method further includes: moving the video window of the target second account from its current position in the video window queue to the video window of the voice-initiating account when the target second user replies by voice to the prompt message.
FIG. 5B is a schematic diagram of an exemplary interaction method according to the fifth embodiment of the present invention. As shown, accounts 1 to 5 correspond to the plurality of second accounts. When the first account clicks on account 4, its terminal device sends a prompt message to the terminal device of account 4. For example, the prompt message prompts account 4 to comment on the topic being discussed in the current instant video communication, to pay attention to the topic currently being discussed, to speak after the current speaker finishes, or to provide information about the current instant video communication. It should be understood that the video windows of varying sizes shown in the figure may serve as the video window queue described herein. For example, the first window in the video window queue is highlighted, e.g., rendered at a larger window size than the other windows; several of the front windows in the queue may likewise be rendered at larger sizes. The first window may be the video window of the current voice-initiating account. It should be understood that the voice-initiating account may be the account that initiated the instant video communication session, or the account currently producing voice output. For example, in a video-conference scenario, the account of the person currently speaking may be the voice-initiating account; when several people are speaking at once, all of their accounts may be treated as voice-initiating accounts, or only the account of a target speaker, which may be determined from the speakers' voice-initiation frequency or voice-output duration. A priority for acting as the voice-initiating account may also be preset. The session interface includes the video window of at least one voice-initiating account, and this window may or may not be part of the video window queue described above. As shown, when the user clicks on the video window of account 4, that window moves to the highlighted video window; when the highlighted window is part of the video window queue, it can also serve as the voice-initiation window.
FIG. 5C is a schematic diagram of another exemplary interaction method according to the fifth embodiment of the present invention. As shown, when the user triggers the video window of account 4, for example by clicking it, that window moves toward the video window of account 1 (the highlighted main window). For example, the terminal device may generate a prompt message for account 4 in response to the touch and send it to the device of account 4. The user of account 4 may then trigger a corresponding component, for example by clicking, as a reply to the prompt message, and in the session interface the video window of account 4 (the target second account) may be moved to the front of the video window queue in response to that reply. It should be appreciated that as the video window of account 4 moves, the other windows in the queue may be reordered while keeping their original relative order; alternatively, the positions of the other accounts' video windows may be left unchanged. For example, when the user or conference participant corresponding to account 4 produces voice (e.g., speaks or outputs audio) in response to the prompt message, the terminal device may move the video window of account 4 from its current position to the highlighted window (corresponding to the voice-initiating account) in the session interface; the terminal device may also move the window of account 4 to the highlighted window in response to a message identifying account 4 as the voice-initiating account. In other words, in this example the window of account 4 is moved from the upper-right position to the upper-left position. It should be understood that in this example the highlighted window is not part of the video window queue, but this is merely exemplary; in other examples the highlighted window may be part of the queue. For example, in the window-fronting process described above, the window of account 4 may be moved to a non-top position in the video window queue (e.g., when the top position is reserved for the voice-initiating account).
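The flow of FIG. 5B and FIG. 5C, in which the prompted account's window moves into the highlighted slot of the voice-initiating account once it replies by voice, could be modeled roughly as below. The session-state shape, and the choice to drop the previous initiator back to the front of the queue, are assumptions for this sketch.

```kotlin
// Assumed session-state model; not from the patent.
data class VideoSession(
    val voiceInitiatorWindow: String?,   // highlighted window outside the queue (FIG. 5C)
    val queue: List<String>              // remaining account windows, front first
)

// Called when the target account replies to the prompt by voice (e.g. starts speaking).
fun promoteToVoiceInitiator(session: VideoSession, targetAccount: String): VideoSession {
    if (targetAccount !in session.queue) return session
    val remaining = session.queue.filterNot { it == targetAccount }
    // Assumption for this sketch: the previous initiator, if any, rejoins the queue at the front.
    return VideoSession(
        voiceInitiatorWindow = targetAccount,
        queue = listOfNotNull(session.voiceInitiatorWindow) + remaining
    )
}

fun main() {
    val before = VideoSession("acc1", listOf("acc2", "acc3", "acc4", "acc5"))
    println(promoteToVoiceInitiator(before, "acc4"))
    // VideoSession(voiceInitiatorWindow=acc4, queue=[acc1, acc2, acc3, acc5])
}
```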
Fig. 6 is a schematic block diagram of an interaction device according to a sixth embodiment of the present invention, where the interaction device in fig. 6 includes:
a presentation module 610, configured to present a session interface of multiple accounts for performing instant video communication, where the multiple accounts include a first account and multiple second accounts, and the session interface at least includes a video window queue in the multiple second accounts;
the obtaining module 620 obtains a prompt operation of the first account in the video window queue for a video window of a target second account in the plurality of second accounts;
the moving module 630, in response to the prompt operation, moves the video window of the target second account to the front of the video window queue so as to update the video window queue.
The scheme of the embodiment of the invention can, in response to a prompt operation on a target account, move the video window of that target account to the front of the video window queue, which facilitates interaction among multiple users in instant video communication.
In another implementation manner of the present invention, the session interface further includes a video window of the voice-initiating account, wherein the moving module is further configured to: move the video window of the target second account from its current position in the video window queue to the video window of the voice-initiating account when the target second user replies by voice to the prompt message.
Fig. 7 is a schematic structural diagram of an electronic device in a fifth embodiment of the present application; the electronic device may include:
one or more processors 701;
a computer-readable medium 702, which may be configured to store one or more programs,
when executed by the one or more processors, cause the one or more processors to implement the methods as described in the embodiments above.
Fig. 8 is a hardware structure of an electronic device according to a sixth embodiment of the present application; as shown in fig. 8, the hardware structure of the electronic device may include: a processor 801, a communication interface 802, a computer-readable medium 803, and a communication bus 804;
wherein the processor 801, the communication interface 802, and the computer readable medium 803 communicate with each other via a communication bus 804;
alternatively, the communication interface 802 may be an interface of a communication module;
the processor 801 may be specifically configured to: acquiring a first interactive operation performed by a user aiming at a playing interactive region of a current chord in a chord sequence of a playing track; determining a chord next to the current chord from the chord sequence according to the first interactive operation; presenting a play interaction area of the next chord, or,
acquiring a first interactive operation which is performed by a user on an interactive interface aiming at a playing interactive area of a current chord in a chord sequence of a playing track; determining whether a next chord of the current chord exists from the chord sequence according to the first interactive operation, if so, presenting a playing interactive area of the next chord on the interactive interface, or,
presenting a session interface of a plurality of accounts for instant video communication, wherein the plurality of accounts comprise a first account and a plurality of second accounts, and the session interface at least comprises a video window queue in the plurality of second accounts; acquiring prompt operation of a first account in the video window queue for a video window of a target second account in the plurality of second accounts; and responding to the prompt operation, and preposing the video window of the target second account in the video window queue to update the video window queue.
The processor 801 may be a general-purpose processor, including a Central Processing Unit (CPU), a Network Processor (NP), and the like; it may also be a Digital Signal Processor (DSP), an Application-Specific Integrated Circuit (ASIC), a Field-Programmable Gate Array (FPGA) or other programmable logic device, discrete gate or transistor logic, or discrete hardware components. The various methods, steps, and logic blocks disclosed in the embodiments of the present application may be implemented or performed by it. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor or the like.
The computer-readable medium 803 may be, but is not limited to, a Random Access Memory (RAM), a Read-Only Memory (ROM), a Programmable Read-Only Memory (PROM), an Erasable Programmable Read-Only Memory (EPROM), an Electrically Erasable Programmable Read-Only Memory (EEPROM), and the like.
In particular, according to an embodiment of the present disclosure, the processes described above with reference to the flowcharts may be implemented as computer software programs. For example, embodiments of the present disclosure include a computer program product comprising a computer program embodied on a computer readable medium, the computer program comprising program code configured to perform the method illustrated by the flow chart. In such an embodiment, the computer program may be downloaded and installed from a network via the communication section, and/or installed from a removable medium. The computer program, when executed by a Central Processing Unit (CPU), performs the above-described functions defined in the method of the present application. It should be noted that the computer readable medium described herein can be a computer readable signal medium or a computer readable storage medium or any combination of the two. The computer readable medium can be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples of the computer readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a random access storage media (RAM), a read-only storage media (ROM), an erasable programmable read-only storage media (EPROM or flash memory), an optical fiber, a portable compact disc read-only storage media (CD-ROM), an optical storage media piece, a magnetic storage media piece, or any suitable combination of the foregoing. In the present application, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. In this application, however, a computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: wireless, wire, fiber optic cable, RF, etc., or any suitable combination of the foregoing.
Computer program code for carrying out operations of the present application may be written in any combination of one or more programming languages, including object-oriented programming languages such as Java, Smalltalk, and C++, as well as conventional procedural programming languages such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or server. In the case of a remote computer, the remote computer may be connected to the user's computer through any type of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet Service Provider).
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods, and computer program products according to various embodiments of the present application. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). Although specific precedence relationships are provided in the above embodiments, these relationships are only exemplary; in particular implementations the steps may be fewer or more, or the execution order may be modified. That is, in some alternative implementations, the functions noted in a block may occur out of the order noted in the figures. For example, two blocks shown in succession may in fact be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special-purpose hardware-based systems that perform the specified functions or acts, or by combinations of special-purpose hardware and computer instructions.
The modules described in the embodiments of the present application may be implemented by software or hardware. The modules described may also be provided in a processor.
As another aspect, the present application also provides a computer-readable medium on which a computer program is stored, which when executed by a processor implements the method as described in the above embodiments.
As another aspect, the present application also provides a computer-readable medium, which may be contained in the apparatus described in the above embodiments or may exist separately without being assembled into the apparatus. The computer-readable medium carries one or more programs which, when executed by the apparatus, cause the apparatus to perform one of the following.
In a first alternative: acquiring a first interactive operation performed by a user for the play interaction area of a current chord in the chord sequence of a playing track; determining, from the chord sequence according to the first interactive operation, the chord next to the current chord; and presenting the play interaction area of that next chord.
In a second alternative: acquiring a first interactive operation performed by a user on an interactive interface for the play interaction area of a current chord in the chord sequence of a playing track; determining, from the chord sequence according to the first interactive operation, whether a chord next to the current chord exists; and, if so, presenting the play interaction area of that next chord on the interactive interface.
In a third alternative: presenting a session interface of a plurality of accounts for instant video communication, wherein the plurality of accounts comprise a first account and a plurality of second accounts, and the session interface comprises at least a video window queue of the plurality of second accounts; acquiring a prompt operation of the first account for the video window of a target second account among the plurality of second accounts in the video window queue; and, in response to the prompt operation, moving the video window of the target second account to the front of the video window queue so as to update the video window queue.
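As a non-authoritative illustration of the first two alternatives, the following minimal sketch shows how a touch handler might advance through a chord sequence and present the next chord's play interaction area. It assumes a UI layer that reports taps and a renderer that can draw a play area; the names ChordPlayer, on_play_area_tapped, present_play_area, and play_chord are hypothetical and are not part of the disclosed embodiments.

# Minimal sketch, assuming a UI layer that reports taps on the current chord's
# play interaction area and a renderer that can present a chord's play area.
# All class and function names here are illustrative, not from the disclosure.

class ChordPlayer:
    def __init__(self, chord_sequence, present_play_area, play_chord):
        self.chords = list(chord_sequence)           # e.g. ["C", "G", "Am", "F"]
        self.index = 0                               # position of the current chord
        self.present_play_area = present_play_area   # callback: draw a chord's play area
        self.play_chord = play_chord                 # callback: sound a chord

    def on_play_area_tapped(self):
        """First interactive operation: the user taps the current chord's play area."""
        current = self.chords[self.index]
        self.play_chord(current)

        # Determine whether a next chord exists in the chord sequence.
        if self.index + 1 < len(self.chords):
            self.index += 1
            # Present the play interaction area of the next chord.
            self.present_play_area(self.chords[self.index])
        else:
            # No next chord: keep presenting the current chord's play area.
            self.present_play_area(current)


# Usage example with stub callbacks:
player = ChordPlayer(
    ["C", "G", "Am", "F"],
    present_play_area=lambda chord: print(f"show play area for {chord}"),
    play_chord=lambda chord: print(f"play {chord}"),
)
player.on_play_area_tapped()  # plays C, then shows the play area for G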
The expressions "first", "second", "said first" or "said second" used in various embodiments of the present disclosure may modify various components regardless of order and/or importance, but these expressions do not limit the respective components. The above description is only configured for the purpose of distinguishing elements from other elements. For example, the first user equipment and the second user equipment represent different user equipment, although both are user equipment. For example, a first element could be termed a second element, and, similarly, a second element could be termed a first element, without departing from the scope of the present disclosure.
When an element (e.g., a first element) is referred to as being "(operably or communicatively) coupled" or "connected" to another element (e.g., a second element), it should be understood that the element may be directly connected to the other element or indirectly connected to the other element via yet another element (e.g., a third element). In contrast, when an element (e.g., a first element) is referred to as being "directly connected" or "directly coupled" to another element (e.g., a second element), no further element (e.g., a third element) is interposed therebetween.
The above description is only a preferred embodiment of the application and is illustrative of the principles of the technology employed. It will be appreciated by those skilled in the art that the scope of the invention herein disclosed is not limited to the particular combination of features described above, but also encompasses other arrangements formed by any combination of the above features or their equivalents without departing from the spirit of the invention. For example, the above features may be replaced with (but not limited to) features having similar functions disclosed in the present application.

Claims (24)

1. An interaction method, comprising:
acquiring a first interactive operation performed by a user for a play interaction area of a current chord in a chord sequence of a playing track;
determining, from the chord sequence according to the first interactive operation, a chord next to the current chord; and
presenting a play interaction area of the next chord.
2. The method of claim 1, wherein acquiring the first interactive operation performed by the user for the play interaction area of the current chord in the chord sequence of the playing track comprises:
acquiring the first interactive operation performed by the user for the play interaction area of the current chord during playing of a chord previous to the current chord, and wherein the method further comprises:
in response to the first interactive operation, interrupting the playing of the previous chord to start playing the current chord.
3. The method of claim 1, wherein acquiring the first interactive operation performed by the user for the play interaction area of the current chord in the chord sequence of the playing track comprises:
acquiring the first interactive operation performed by the user for the play interaction area of the current chord during playing of a chord previous to the current chord, and wherein the method further comprises:
in response to the first interactive operation, starting playing the current chord after the playing of the previous chord is finished.
4. The method of claim 1, wherein acquiring the first interactive operation performed by the user for the play interaction area of the current chord in the chord sequence of the playing track comprises:
acquiring the first interactive operation performed by the user for the play interaction area of the current chord at a first presentation position, and wherein presenting the play interaction area of the next chord comprises:
presenting, at the first presentation position, the play interaction area of the next chord in place of the play interaction area of the current chord.
5. The method of claim 1, wherein acquiring the first interactive operation performed by the user for the play interaction area of the current chord in the chord sequence of the playing track comprises:
acquiring the first interactive operation performed by the user for the play interaction area of the current chord at a first presentation position, and wherein the method further comprises:
moving the play interaction area of the current chord from the first presentation position to a second presentation position different from the first presentation position according to the first interactive operation.
6. The method of claim 5, further comprising:
acquiring a second interactive operation performed by the user for the play interaction area of the current chord at the second presentation position; and
repeatedly playing the current chord according to the second interactive operation.
7. The method of claim 5, wherein the play interaction area is presented in a first color at the first presentation position and in a second color at the second presentation position.
8. The method of claim 1, further comprising:
obtaining the chord sequence of the playing track; and
presenting the play interaction area of the current chord in the chord sequence of the playing track.
9. The method of claim 8, wherein presenting the play interaction area of the current chord in the chord sequence of the playing track comprises:
presenting the play interaction area of the current chord in a third color, and wherein presenting the play interaction area of the next chord comprises:
presenting the play interaction area of the next chord in a fourth color.
10. The method of claim 8, wherein presenting the play interaction area of the current chord in the chord sequence of the playing track comprises:
presenting the play interaction area of the current chord together with the chord identifier of the play interaction area of the current chord, and wherein presenting the play interaction area of the next chord comprises:
presenting the play interaction area of the next chord together with the chord identifier of the play interaction area of the next chord.
11. The method of claim 1, wherein the playing track is an accompaniment of a target song, and the method further comprises:
presenting lyrics of the target song.
12. The method of claim 11, further comprising:
acquiring a third interactive operation performed by the user for a word-and-sentence prompt interaction area of the current chord; and
determining, according to the third interactive operation, a current word or sentence in the lyrics corresponding to the current chord, wherein presenting the lyrics of the target song comprises:
displaying the lyrics of the target song with the current word or sentence highlighted.
13. The method of claim 1, wherein acquiring the first interactive operation performed by the user for the play interaction area of the current chord in the chord sequence of the playing track comprises:
acquiring the first interactive operation performed by the user for the play interaction area of the current chord at a first presentation position, and wherein the method further comprises:
moving the first presentation position from a current position to a target position in response to a presentation-position moving operation input by the user.
14. The method of claim 13, further comprising:
determining the first presentation position in response to a presentation-position setting operation input by the user; and
presenting the play interaction area of the current chord at the first presentation position.
15. An interaction method, comprising:
presenting a session interface of a plurality of accounts for instant video communication, wherein the plurality of accounts comprise a first account and a plurality of second accounts, and the session interface comprises at least a video window queue of the plurality of second accounts;
acquiring a prompt operation of the first account for a video window of a target second account among the plurality of second accounts in the video window queue; and
in response to the prompt operation, moving the video window of the target second account to the front of the video window queue so as to update the video window queue.
16. The method of claim 15, wherein the session interface further comprises a video window of a voice-initiating account, and the method further comprises:
moving the video window of the target second account from its current window position in the video window queue to the video window of the voice-initiating account according to a voice response made by the target second user to the prompt message.
17. An interaction method, comprising:
acquiring a first interactive operation performed by a user on an interactive interface for a play interaction area of a current chord in a chord sequence of a playing track; and
determining, from the chord sequence according to the first interactive operation, whether a chord next to the current chord exists, and if so, presenting a play interaction area of the next chord on the interactive interface.
18. The method of claim 17, wherein acquiring the first interactive operation performed by the user on the interactive interface for the play interaction area of the current chord in the chord sequence of the playing track comprises:
acquiring the first interactive operation performed by the user on the interactive interface for the play interaction area of the current chord at a first presentation position, and wherein the method further comprises:
moving the play interaction area of the current chord from the first presentation position to a second presentation position different from the first presentation position on the interactive interface according to the first interactive operation.
19. The method of claim 18, further comprising:
presenting the play interaction area of the current chord at the first presentation position if the next chord does not exist.
20. An interaction device, comprising:
a presentation module configured to present a session interface of a plurality of accounts for instant video communication, wherein the plurality of accounts comprise a first account and a plurality of second accounts, and the session interface comprises at least a video window queue of the plurality of second accounts;
an acquisition module configured to acquire a prompt operation of the first account for a video window of a target second account among the plurality of second accounts in the video window queue; and
a moving module configured to, in response to the prompt operation, move the video window of the target second account to the front of the video window queue so as to update the video window queue.
21. An interaction device, comprising:
an acquisition module configured to acquire a first interactive operation performed by a user for a play interaction area of a current chord in a chord sequence of a playing track;
a determination module configured to determine, from the chord sequence according to the first interactive operation, a chord next to the current chord; and
a presentation module configured to present a play interaction area of the next chord.
22. An interaction device, comprising:
an acquisition module, a determination module, and a presentation module, wherein
the acquisition module is configured to acquire a first interactive operation performed by a user on an interactive interface for a play interaction area of a current chord in a chord sequence of a playing track; and
the determination module is configured to determine, from the chord sequence according to the first interactive operation, whether a chord next to the current chord exists, and if so, to present a play interaction area of the next chord on the interactive interface through the presentation module.
23. An electronic device, comprising:
one or more processors; and
a computer-readable medium configured to store one or more programs,
wherein the one or more programs, when executed by the one or more processors, cause the one or more processors to implement the method of any one of claims 1 to 19.
24. A computer-readable medium having a computer program stored thereon, wherein the computer program, when executed by a processor, implements the method of any one of claims 1 to 19.
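Claims 15, 16, and 20 describe moving a target second account's video window to the front of a video window queue in response to a prompt operation by the first account. The following is a minimal, non-authoritative sketch of that queue update; the names VideoWindowQueue and on_prompt are illustrative assumptions and do not appear in the claims or the description.

# Minimal sketch of the video-window-queue update described in claims 15 and 20.
# Names are illustrative assumptions; no real API of the disclosure is implied.

class VideoWindowQueue:
    def __init__(self, second_accounts):
        # Ordered queue of second-account identifiers; front of the list = front of the queue.
        self.windows = list(second_accounts)

    def on_prompt(self, target_account):
        """Prompt operation by the first account for a target second account:
        move that account's video window to the front of the queue."""
        if target_account in self.windows:
            self.windows.remove(target_account)
            self.windows.insert(0, target_account)
        return self.windows  # updated queue, used to re-render the session interface


# Usage example:
queue = VideoWindowQueue(["account_B", "account_C", "account_D"])
print(queue.on_prompt("account_C"))  # ['account_C', 'account_B', 'account_D']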
CN202010224423.7A 2020-03-26 2020-03-26 Interaction method, interaction device, electronic equipment and computer storage medium Pending CN113448483A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010224423.7A CN113448483A (en) 2020-03-26 2020-03-26 Interaction method, interaction device, electronic equipment and computer storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010224423.7A CN113448483A (en) 2020-03-26 2020-03-26 Interaction method, interaction device, electronic equipment and computer storage medium

Publications (1)

Publication Number Publication Date
CN113448483A true CN113448483A (en) 2021-09-28

Family

ID=77807173

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010224423.7A Pending CN113448483A (en) 2020-03-26 2020-03-26 Interaction method, interaction device, electronic equipment and computer storage medium

Country Status (1)

Country Link
CN (1) CN113448483A (en)


Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101313477A (en) * 2005-12-21 2008-11-26 Lg电子株式会社 Music generating device and operating method thereof
US20100288108A1 (en) * 2009-05-12 2010-11-18 Samsung Electronics Co., Ltd. Music composition method and system for portable device having touchscreen
CN103038781A (en) * 2010-05-05 2013-04-10 数字标记公司 Hidden image signaling
US20160140944A1 (en) * 2013-06-04 2016-05-19 Berggram Development Oy Grid based user interference for chord presentation on a touch screen device
US20180144732A1 (en) * 2015-06-05 2018-05-24 Qluge As Methods, Devices and Computer Program Products for Interactive Musical Improvisation Guidance

Similar Documents

Publication Publication Date Title
US20220044661A1 (en) Network musical instrument
US20200125920A1 (en) Interaction method and apparatus of virtual robot, storage medium and electronic device
US20140036022A1 (en) Providing a conversational video experience
JP2002351806A (en) Chat information, information recording medium, computer system, and server system
CN108337380A (en) Adjust automatically user interface is for hands-free interaction
US20140028780A1 (en) Producing content to provide a conversational video experience
CN104240703A (en) Voice message processing method and device
CN109671429B (en) Voice interaction method and device
WO2021244257A1 (en) Song processing method and apparatus, electronic device, and readable storage medium
CN113748425A (en) Auto-completion for content expressed in video data
US11521653B2 (en) Video sequence layout method, electronic device and storage medium
CN112527171A (en) Multimedia file playing method, device, equipment and medium
CN1937002A (en) Intelligent man-machine conversation system and implementation method thereof
CN109224445A (en) Game control method, device, equipment and storage medium in intelligent wearable equipment
CN111797271A (en) Method and device for realizing multi-person music listening, storage medium and electronic equipment
CN108810241B (en) Audio data-based sound modification display method and device
CN107562724A (en) For guiding method, equipment, server and the computer-readable recording medium of chat
US20240207735A1 (en) Interactive information display method, apparatus, electronic device and storage medium
JP2019215502A (en) Server, sound data evaluation method, program, and communication system
CN113448483A (en) Interaction method, interaction device, electronic equipment and computer storage medium
CN112306603A (en) Information prompting method and device, electronic equipment and storage medium
CN115963963A (en) Interactive novel generation method, presentation method, device, equipment and medium
JP2013092912A (en) Information processing device, information processing method, and program
JP6824547B1 (en) Active learning system and active learning program
WO2013181633A1 (en) Providing a converstional video experience

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right

Effective date of registration: 20220318

Address after: 510627 room 1701, No. 163, Pingyun Road, Tianhe District, Guangzhou City, Guangdong Province (Location: self compiled room 01) (office only)

Applicant after: Guangzhou Huancheng culture media Co.,Ltd.

Address before: 100102 901, floor 9, building 9, zone 4, Wangjing Dongyuan, Chaoyang District, Beijing

Applicant before: Beijing wall breaker Technology Co.,Ltd.