CN111665936A - Music collection method and device, electronic equipment and medium - Google Patents

Music collection method and device, electronic equipment and medium

Info

Publication number
CN111665936A
CN111665936A
Authority
CN
China
Prior art keywords
music
gesture information
song list
song
gesture
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202010427346.5A
Other languages
Chinese (zh)
Inventor
叶慧敏 (Ye Huimin)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Vivo Mobile Communication Co Ltd
Original Assignee
Vivo Mobile Communication Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Vivo Mobile Communication Co Ltd filed Critical Vivo Mobile Communication Co Ltd
Priority to CN202010427346.5A priority Critical patent/CN111665936A/en
Publication of CN111665936A publication Critical patent/CN111665936A/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017Gesture based interaction, e.g. based on a set of recognized hand gestures
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/60Information retrieval; Database structures therefor; File system structures therefor of audio data
    • G06F16/68Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually
    • G06F16/686Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually using information manually generated, e.g. tags, keywords, comments, title or artist information, time, location or usage information, user ratings
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/0482Interaction with lists of selectable items, e.g. menus

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Library & Information Science (AREA)
  • Multimedia (AREA)
  • Data Mining & Analysis (AREA)
  • Databases & Information Systems (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The application discloses a music collection method and device, an electronic device, and a medium, belonging to the field of music software. The method comprises: receiving a gesture input from a user while target music is playing; acquiring first gesture information in response to the gesture input; and, when the first gesture information matches preset gesture information, collecting the target music into a first song list, wherein the first song list is associated with the preset gesture information. The method and device address the problem that music collection operations are cumbersome.

Description

Music collection method and device, electronic equipment and medium
Technical Field
The embodiments of the present application relate to the field of music software, and in particular to a music collection method, a music collection device, an electronic device, and a medium.
Background
At present, when a user plays music on an electronic device, the music application in use generally provides a song-list function: the user can collect favorite music into a created song list and later open that song list to play the music in it.
However, when a user wants to collect the music currently playing, the user must unlock the screen of the electronic device, open the playback interface of the music application, and tap a collection button before the music is collected. The whole process is cumbersome.
Disclosure of Invention
An object of the embodiments of the present application is to provide a music collection method, device, electronic device, and medium that solve the problem of cumbersome music collection operations.
In order to solve the technical problem, the present application is implemented as follows:
in a first aspect, an embodiment of the present application provides a music collection method, including:
receiving a gesture input from a user while target music is playing;
acquiring first gesture information in response to the gesture input;
and, when the first gesture information matches preset gesture information, collecting the target music into a first song list, wherein the first song list corresponds to the preset gesture information.
In a second aspect, an embodiment of the present application further provides a music collection device, including:
a first receiving module, configured to receive a gesture input from a user while target music is playing;
an information acquisition module, configured to acquire first gesture information in response to the gesture input;
and a first collection module, configured to collect the target music into a first song list when the first gesture information matches preset gesture information, wherein the first song list corresponds to the preset gesture information.
In a third aspect, an embodiment of the present application provides an electronic device, including a processor, a memory, and a program or instructions stored on the memory and executable on the processor, where the program or instructions, when executed by the processor, implement the steps of the music collection method according to the first aspect.
In a fourth aspect, the present application provides a readable storage medium, on which a program or instructions are stored, which when executed by a processor implement the steps of the music collection method according to the first aspect.
In a fifth aspect, an embodiment of the present application provides a chip, where the chip includes a processor and a communication interface, where the communication interface is coupled to the processor, and the processor is configured to execute a program or instructions to implement the method according to the first aspect.
In the embodiments of the present application, a gesture input from the user is received while the target music is playing; if the first gesture information corresponding to the gesture input matches preset gesture information, the currently playing target music is collected into the first song list associated with that preset gesture information. The user thus collects music into a song list simply by making a gesture input on the electronic device, so the collection operation is simple and more convenient. Moreover, because each song list is associated in advance with its own preset gesture information, the user can store music into different song lists by entering different gestures. In other words, a simple gesture input sorts music into categories, which simplifies the steps needed to file music and allows it to be classified quickly.
Drawings
Fig. 1 is a flowchart illustrating a music collection method according to an embodiment of the present application;
fig. 2 is a schematic diagram of an editing interface provided in an embodiment of the present application;
fig. 3 is a schematic diagram of a song list provided in an embodiment of the present application;
FIG. 4 is a schematic diagram of a setup interface provided in an embodiment of the present application;
FIG. 5 is a schematic structural diagram of a music collection apparatus according to an embodiment of the present disclosure;
fig. 6 is a schematic diagram of a hardware structure of an electronic device according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be described clearly and completely below with reference to the drawings. Obviously, the described embodiments are only some, not all, of the embodiments of the present application. All other embodiments obtained by a person skilled in the art from the embodiments given herein without creative effort fall within the protection scope of the present application.
The terms "first", "second", and the like in the description and claims of the present application are used to distinguish between similar elements and do not necessarily describe a particular sequence or chronological order. It should be understood that terms so used are interchangeable under appropriate circumstances, so that the embodiments of the application can operate in sequences other than those illustrated or described herein. In addition, "and/or" in the specification and claims denotes at least one of the connected objects, and the character "/" generally indicates that the objects before and after it are in an "or" relationship.
As mentioned in the Background section, when a user plays music on an electronic device, the music application generally provides a song-list function. Users typically want to sort their favorite music into corresponding song lists and later open different song lists to suit their mood and the occasion.
The current music collection process mainly comprises the following steps: waking the screen, entering the music application, and tapping a collection button to collect the music into a favorites list; then entering the favorites list and sorting the music there into different song lists. Alternatively, it may be: waking the screen, entering the music application, and directly collecting the music into the corresponding song list.
Users typically listen to music during leisure, travel, or rest, at times that do not interfere with normal thinking, work, or study. When listening in such a scenario, the user may not want to turn the screen on again to perform the series of collection operations; but if collection is abandoned, the music must later be searched for in the playback history, which is inconvenient and makes for a poor user experience.
In order to solve the above problem, an embodiment of the present application provides a music collection method, which is described in detail below with reference to the accompanying drawings through specific embodiments and their application scenarios.
Referring to fig. 1, fig. 1 is a schematic flowchart of a music collection method provided by an embodiment of the present application. The method is applied to an electronic device and comprises the following steps:
s101, receiving gesture input of a user under the condition of playing target music;
the gesture input here may include a slide gesture input, a click gesture input, or a combination of slide and click gesture inputs, etc., which are input in the touch screen by the user. For example, the input gesture may be a five-pointed star gesture where the user slides out of the touch screen, or a double or triple tap gesture on the screen, etc. The application does not limit the specific content of the gesture input.
S102, acquiring first gesture information in response to the gesture input;
the first gesture information here may include type information of the gesture input, an input position in the screen, and the like, and the type information of the gesture input may include a number of clicks, a slide trajectory, and the like. The present application does not limit the content of the first gesture information.
S103, collecting the target music into a first song list when the first gesture information matches preset gesture information, where the first song list is associated with the preset gesture information.
The first song list is a song list associated with gesture information for music collection. It may be a song list with a specific theme or keyword, such as an "ancient-style" song list or an "electronic music" song list. Alternatively, the first song list may have no specific theme or keyword, i.e., it may hold music of any type, such as a "new song list" or an "unnamed song list". For example, the electronic device may provide a function such as "enable single-finger triple-tap to collect songs to a newly created song list"; see fig. 2, a schematic view of a setting interface provided in an embodiment of the present application.
In addition, because song lists such as a newly created song list or an unnamed song list are special, specific gesture information can be set for them independently. Then, regardless of whether the newly created song list has been associated with specific gesture information, the target music is collected into it whenever the first gesture information matches that specific gesture information. Of course, this is only one implementation; it can equally be required that music can be collected into a song list only if the list has associated gesture information.
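The S101–S103 flow can be sketched as follows. This is a minimal illustration, not the application's implementation; the song-list dictionary layout (`name`, `gesture`, `songs`) is an assumption, and it also shows the multi-match case discussed below, where one gesture is bound to several lists:

```python
def collect_music(first_gesture, song_lists, target_music):
    """Compare the first gesture information against each song list's
    preset gesture and add the playing track to every matching list.
    Returns the names of the lists the track was collected into."""
    matched = []
    for playlist in song_lists:
        # playlist.get("gesture") is None for lists without a bound gesture
        if first_gesture is not None and playlist.get("gesture") == first_gesture:
            playlist["songs"].append(target_music)
            matched.append(playlist["name"])
    return matched
```

For example, if both an "electronic" and a "pop" list are bound to a double-tap, `collect_music("double_tap", lists, track)` files the track into both.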
In the embodiments of the present application, a gesture input from the user is received while the target music is playing; if the first gesture information corresponding to the gesture input matches preset gesture information, the currently playing target music is collected into the first song list associated with that preset gesture information. The user thus collects music into a song list simply by making a gesture input on the electronic device, so the collection operation is simple and more convenient. Moreover, because each song list is associated in advance with its own preset gesture information, the user can store music into different song lists by entering different gestures. In other words, a simple gesture input sorts music into categories, which simplifies the steps needed to file music and allows it to be classified quickly.
Each song list may correspond to the same gesture information or to different gesture information. Optionally, therefore, the first gesture information may match the gesture information of several song lists, in which case the target music may be collected into each successfully matched list. For example, if the gesture information corresponding to both the "electronic music" list and the "pop music" list is a double-tap, then when the user's double-tap is received, the music being played is collected into both lists.
Alternatively, in other embodiments, when the first gesture information matches the preset gesture information of several song lists, the target music is collected into the first song list, where the first song list may be a song list satisfying a preset condition.
That is, in this embodiment, when the preset gesture information corresponds to several song lists, the list satisfying the preset condition is selected from among them as the first song list. By setting the preset condition in this way, a single gesture input stores the target music into a song list the user frequently chooses or one that fits the target music, which meets the user's listening needs while reducing the memory space occupied by cached songs.
In a further embodiment, the preset condition may include: a song list that has been played within a preset time period. Because a song list played within a preset period, for example within the last 3 days, is one the user plays frequently, storing the target music there makes it more likely that the user will hear it again, better matching the user's recent preferences.
In another embodiment, the preset condition may include: among the several song lists, the one containing the largest number of pieces of music in the same category as the target music. That is, the category of the music in each song list that matched the first gesture information is determined, and the list containing the most music of the target music's category is taken as the first song list. In this case, the music already stored in the first song list is similar in category to the target music, so storing the target music there better fits the user's classification habits and makes later playback more convenient.
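The two preset conditions above can be sketched together: prefer a matched list played within the preset window, and otherwise fall back to the list holding the most music of the target track's category. This is an illustration under assumed field names (`last_played` as a Unix timestamp, `category` per song), not the application's implementation:

```python
def pick_first_song_list(matched_lists, target_category, now,
                         window_seconds=3 * 24 * 3600):
    """Select the 'first song list' among several gesture-matched lists.
    Condition 1: a list played within the preset window (default 3 days);
    among those, the most recently played wins.
    Condition 2 (fallback): the list with the most same-category music."""
    recent = [pl for pl in matched_lists
              if now - pl.get("last_played", 0) <= window_seconds]
    if recent:
        return max(recent, key=lambda pl: pl["last_played"])
    return max(matched_lists,
               key=lambda pl: sum(1 for s in pl["songs"]
                                  if s.get("category") == target_category))
```

Passing `now` explicitly (rather than calling `time.time()` inside) keeps the selection deterministic and testable.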
Since the current music collection approach requires the electronic device to be unlocked, the operation is cumbersome, and the user may be unwilling or unable to unlock the device conveniently.
Based on this, in some embodiments of the present application, while the target music application plays the target music, the above S101 may include:
receiving a gesture input from the user while the electronic device is in screen-off mode.
In this embodiment, music can be collected by receiving the user's gesture input while the electronic device's screen is off. This approach does not require unlocking the electronic device to collect music, which improves convenience and simplifies the user's collection operation.
To implement the above method embodiment, song lists must be created in the electronic device in advance, and the created lists edited. The editing process may include naming the song list, associating gesture information, and the like. Optionally, before S101, the method may further include the following operations:
receiving a first input from the user on an editing control of a song list to be edited;
in response to the first input, displaying an editing interface for the song list to be edited, the editing interface comprising at least two candidate gesture controls;
receiving a second input from the user on a target gesture control among the at least two candidate gesture controls;
and, in response to the second input, associating the gesture information corresponding to the target gesture control with the song list being edited.
In this embodiment, the user sets the gesture information for the song list being edited by selecting one of the candidate gesture controls, which makes setting gesture information convenient. In other embodiments, the user may instead be allowed to customize the gesture information: for example, the editing interface may provide an input box in which the user enters custom gesture information to associate with the song list being edited. The specific manner used does not limit the present application.
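The end result of the editing steps above is simply a binding between a gesture and a song list. A minimal sketch, under the same assumed dictionary layout as before:

```python
def associate_gesture(song_lists, list_name, gesture):
    """Editing-step sketch: bind the gesture chosen via the target
    gesture control to the song list being edited. Returns False if
    no song list with that name exists."""
    for playlist in song_lists:
        if playlist["name"] == list_name:
            playlist["gesture"] = gesture
            return True
    return False
```

After `associate_gesture(lists, "favorites", "double_tap")`, a later double-tap during playback would collect the current track into "favorites".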
In addition, the above editing interface may also provide a selection control such as "enable tap-on-screen collection", as shown in fig. 3, which illustrates a schematic diagram of an editing interface provided by an embodiment of the present application. Only after the user activates this selection control can the candidate gesture controls be selected.
In other embodiments of the present application, the user may have multiple song lists, only some of which have corresponding gesture information. Optionally, therefore, in the song-list view of the target music application, a preset identifier is displayed on the name of each song list that has corresponding gesture information, with different gesture information corresponding to different preset identifiers. See fig. 4, a schematic diagram of a song-list view provided in an embodiment of the present application.
In this way, when browsing the song lists, the user can quickly see which lists have gesture information set, and therefore knows, while music is playing, into which song list the first gesture information will collect it, improving convenience.
Since only some of the song lists created by the user may have corresponding gesture information, and the user may also misremember a gesture, in other embodiments of the present application, after S102, the method may further include:
and collecting the target music to a second song list under the condition that the matching of the first gesture information and the gesture information related to all song lists fails.
In this embodiment, when the first gesture information entered by the user fails to match the gesture information associated with any song list, the user has probably misremembered the gesture for the intended list; for example, the list's gesture is a double-tap, but the user entered a single-finger triple-tap. Nevertheless, the user's intention in entering the first gesture information was to collect the music, so the music may be temporarily collected into a second song list. The second song list may be a list that is not associated with gesture information and can receive music of any type, such as an unnamed or newly created song list, or it may be a specific list the user selected in advance, such as "song list 3". The present application does not limit the second song list.
Further and optionally, collecting the target music into the second song list here may include:
when a second song list already exists in the target music application, collecting the target music into it;
and, when no second song list exists in the target music application, creating one, collecting the target music into the newly created list, and establishing an association between the first gesture information and the second song list.
In this embodiment, when the user enters an incorrect collection gesture, the target music needs to be saved into the second song list; since that list may not yet exist, it must be created, and the first gesture information the user entered by mistake is then associated with it. Because a mistakenly entered gesture is usually one the user remembers strongly, associating the first gesture information with the newly created second song list (when it corresponds to no other function) lets the user collect music with that gesture in the future.
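The fallback behavior just described can be sketched as follows; the list name and dictionary layout are assumptions for illustration:

```python
def collect_to_second_list(song_lists, first_gesture, target_music,
                           second_name="unnamed song list"):
    """When the gesture matches no song list, store the music in a
    'second song list', creating it on demand and binding the mistyped
    gesture to it so the same gesture collects music next time."""
    for playlist in song_lists:
        if playlist["name"] == second_name:
            playlist["songs"].append(target_music)
            return playlist
    new_list = {"name": second_name,
                "gesture": first_gesture,   # adopt the mistyped gesture
                "songs": [target_music]}
    song_lists.append(new_list)
    return new_list
```

Note that, as the paragraph after this one cautions, a real implementation would first check that `first_gesture` is not already assigned to some other device function before adopting it.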
It should be noted that, since some gesture inputs may already correspond to other functions of the electronic device, when the first gesture information fails to match the gesture information associated with any song list, it must further be determined whether the first gesture information matches gesture information preassigned to other functions. Only when that match also fails is the first gesture information treated as a mistakenly entered music collection gesture.
In some other embodiments of the present application, after S103, the method may further include:
and outputting first prompt information, wherein the first prompt information is used for prompting the user that the music collection is successful.
In this embodiment, after the music collection is completed, the first prompt information is output to inform the user, so that the user learns the collection result in time.
In a specific embodiment, optionally, the manner of outputting the first prompt message may include at least one of the following:
outputting a voice prompt indicating that the music collection succeeded, for example an alert tone such as "ding";
and displaying, after the user enters the interface of the target music application, a pop-up prompt box indicating successful music collection.
The first prompt information output in this embodiment serves only to inform the user that the current music has been successfully collected; any output achieving this purpose suffices, and the application does not limit its specific content or the specific manner of outputting it.
In still other embodiments of the present application, after S103, the method may further include:
music in the first song list is played.
Because the user collected the music, the user evidently likes the currently playing target music; on that basis, the other music in the first song list is likely music the user wants to hear now. Playing the first song list directly after collecting the target music therefore spares the user from unlocking the electronic device to choose a song list, improving convenience.
It should be noted that the execution subject of the music collection method provided in the embodiments of the present application may be a music collection device, or a control module within such a device for executing the method. The embodiments of the present application take a music collection device executing the method as an example to describe the music collection method provided herein.
Based on the music collection method provided by the foregoing embodiments, an embodiment of the present application correspondingly provides a music collection device. Referring to fig. 5, fig. 5 shows a schematic structural diagram of a music collection device provided by an embodiment of the present application. The device includes:
a first receiving module 201, configured to receive a gesture input from a user while target music is playing;
an information acquisition module 202, configured to acquire first gesture information in response to the gesture input;
and a first collection module 203, configured to collect the target music into a first song list when the first gesture information matches preset gesture information, where the first song list is associated with the preset gesture information.
In the embodiments of the present application, a gesture input from the user is received while the target music is playing; if the first gesture information corresponding to the gesture input matches preset gesture information, the currently playing target music is collected into the first song list associated with that preset gesture information. The user thus collects music into a song list simply by making a gesture input on the electronic device, so the collection operation is simple and more convenient. Moreover, because each song list is associated in advance with its own preset gesture information, the user can store music into different song lists by entering different gestures. In other words, a simple gesture input sorts music into categories, which simplifies the steps needed to file music and allows it to be classified quickly.
Each song list may correspond to the same gesture information or to different gesture information. Optionally, therefore, the first gesture information may match the gesture information corresponding to several song lists, in which case the target music may be collected into each successfully matched list, i.e., the first song list comprises several song lists.
Or, in other embodiments, the first collection module 203 is specifically configured to:
collecting the target music into the first song list when the first gesture information matches the preset gesture information of several song lists, where the first song list may be a song list satisfying a preset condition.
That is, in this embodiment, when the first gesture information matches several song lists, the list satisfying the preset condition is selected from among the successfully matched lists as the first song list. By setting the preset condition in this way, a single gesture input stores the target music into a song list the user frequently chooses or one that fits the target music, which meets the user's listening needs while reducing the memory space occupied by cached songs.
In a further embodiment, the preset condition may include: the played song list is played in a preset time period. Since the played song list is played for the user with a higher frequency within a preset time period, for example, within 3 days, the target music is stored in the song list, and the probability that the subsequent user hears the target music is higher, so that the preference of the user in a recent time period is better met.
In another embodiment, the preset condition may include: the song list, among the plurality of song lists, containing the largest number of pieces of music in the same category as the target music. That is, the category of the music contained in each song list successfully matched with the first gesture information is determined, and the song list containing the most music of the same category as the target music is taken as the first song list. In this case, the music already stored in the first song list is similar in category to the target music, so storing the target music there better fits the user's own classification scheme and makes subsequent playback more convenient.
In the current manner of collecting music, the electronic device must first be unlocked, which makes the operation complex; moreover, the user may not want to unlock the device, or unlocking may be inconvenient.
Based on this, in some embodiments of the present application, in the process that the target music application plays the target music, the first receiving module 201 may be configured to: with the electronic device in the screen-off mode, a gesture input of a user is received.
In this embodiment, music can be collected by receiving the user's gesture input while the electronic device is in the screen-off mode. In this manner the electronic device does not need to be unlocked in order to collect music, which improves the user's convenience and simplifies the collection operation.
To implement the above method embodiments, song lists need to be created in advance on the electronic device, and the created song lists need to be edited. The editing process may include naming the song list, associating gesture information with it, and the like.
Optionally, the apparatus may further include:
the editing module is configured to receive a first input from the user on an editing control of a song list to be edited; display, in response to the first input, an editing interface of the song list to be edited, the editing interface including at least two candidate gesture controls; receive a second input from the user on a target gesture control among the at least two candidate gesture controls; and, in response to the second input, associate the gesture information corresponding to the target gesture control with the song list to be edited.
In this embodiment, the user sets the gesture information for the song list to be edited by selecting one of the candidate gesture controls, which makes the setting convenient. In other embodiments, the user may instead be allowed to customize the specific content of the gesture information: for example, the editing interface may provide an input box in which the user enters custom gesture information to be associated with the song list to be edited. The specific manner used is not a limitation of the present application.
In addition, a selection control for enabling screen-off collection can be provided in the editing interface, and the user can select a candidate gesture control after checking this selection control.
In other embodiments of the present application, the user may be provided with multiple song lists of which only some are associated with gesture information. On this basis, optionally, in the song list page of the target music application, a preset identifier is displayed beside the name of each song list associated with gesture information, and different gesture information corresponds to different preset identifiers.

In this way, when browsing the song lists, the user can quickly determine which song lists have corresponding gesture information set, and thus knows, while music is playing, into which song list the first gesture information will collect the music, which improves the user's convenience.
In other embodiments of the present application, the apparatus may further comprise:
and the second collection module is used for collecting the target music to the second song list under the condition that the matching of the first gesture information and the gesture information related to all the song lists fails.
In this embodiment, when the first gesture information input by the user fails to match the gesture information associated with any song list, the user has probably misremembered the previously set gesture information. However, the purpose of the gesture input is still to collect the music, so in this case the music may be temporarily collected into a second song list. The second song list may be a song list that is not associated with gesture information and can receive music of various types, such as an unnamed song list or a newly created song list, or it may be a specific song list selected by the user in advance, such as "song list 3"; the present application does not limit the second song list.
Furthermore, optionally, the second collection module may include:
the song list establishing unit is used for establishing a second song list under the condition that the matching of the first gesture information and the gesture information related to all song lists fails;
a collection unit for collecting the target music to a second song list;
and the association unit is used for establishing the association relationship between the first gesture information and the second song list.
In addition, the second collection module may further include:
the judging unit is configured to judge, in the case where the first gesture information fails to match the gesture information associated with all song lists, whether a second song list is currently set; in the case where the second song list is not set, the song list establishing unit is triggered, and in the case where the second song list is set, the collection unit is triggered.

In this embodiment, when the user inputs a music collection gesture incorrectly, the target music needs to be saved into the second song list; however, the second song list may not yet exist, so it must first be created, and the first gesture information that the user input by mistake is then associated with it. Because a gesture input by mistake is usually one the user remembers well, the first gesture information, provided it does not correspond to any function, can be associated with the newly created second song list, so that the user can subsequently collect music with that same gesture.

For the situations in the above embodiments, because some gesture inputs on the electronic device may correspond to other functions, when the first gesture information fails to match the preset gesture information of all song lists it is also necessary to judge whether the first gesture information matches gesture information preset for other functions; only when it does not is the first gesture information treated as a music collection gesture that was input by mistake.
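Combining the second collection module's fallback with the additional check against other functions, the decision flow might look like the sketch below. This is an illustration under assumptions: the dictionary-based song-list model, the `is_second` flag, and all names are invented for the example.

```python
def collect_with_fallback(track: str, gesture: str,
                          playlists: list,
                          other_function_gestures: set):
    """Collect a track by gesture; on a total match failure, fall back to a
    second song list, creating it if needed and binding the mistaken
    gesture to it so the same gesture works directly next time."""
    matched = [p for p in playlists if p.get("gesture") == gesture]
    if matched:
        matched[0].setdefault("tracks", []).append(track)
        return matched[0]
    # The gesture may be preset for another device function; if so, it is
    # not a mistaken collection gesture, and nothing is collected here.
    if gesture in other_function_gestures:
        return None
    # Judging unit: is a second song list already set?
    second = next((p for p in playlists if p.get("is_second")), None)
    if second is None:
        # Song list establishing unit: create the second song list.
        second = {"name": "second song list", "is_second": True, "tracks": []}
        playlists.append(second)
    # Collection unit and association unit: store the track, bind the gesture.
    second["tracks"].append(track)
    second["gesture"] = gesture
    return second
```

After the association step, a repeat of the same mistaken gesture matches the second song list directly, which is the behavior the embodiment describes.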
In some other embodiments of the present application, the apparatus may further include:
the prompting module is configured to output first prompt information, the first prompt information being used to prompt the user that the music has been collected successfully.

In this embodiment, the first prompt information is output after the music collection is completed so as to inform the user, allowing the user to learn of the collection result in time.
In a specific embodiment, optionally, the prompt module may be configured to:
outputting a voice prompt indicating that the music was collected successfully, and/or popping up a prompt box indicating successful collection after the user next enters an interface of the target music application.

The first prompt information output in this embodiment serves to inform the user that the current music has been collected successfully; any output achieving this purpose suffices, and the present application does not limit the specific content of the first prompt information or the specific manner in which it is output.
In still other embodiments of the present application, the apparatus may further comprise:
and the playing module is used for playing the music in the first song list.
Because the user has collected the music, the user clearly likes the currently played target music; on that basis, the other music in the first song list is likely music the user currently wants to hear. Playing the music in the first song list directly after the target music is collected therefore spares the user from unlocking the electronic device to select a song list, improving the user's convenience.
The apparatus provided in the embodiment of the present application can implement each method step implemented in the method embodiment of fig. 1, and is not described here again to avoid repetition.
The music collection device in the embodiment of the present application may be a device, or may be a component, an integrated circuit, or a chip in a terminal. The device may be a mobile electronic device or a non-mobile electronic device. By way of example, the mobile electronic device may be a mobile phone, a tablet computer, a notebook computer, a palmtop computer, a vehicle-mounted electronic device, a wearable device, an ultra-mobile personal computer (UMPC), a netbook, or a personal digital assistant (PDA), and the non-mobile electronic device may be a server, a network attached storage (NAS), a personal computer (PC), a television (TV), an automated teller machine, a self-service machine, or the like; the embodiments of the present application are not specifically limited.

The music collection device in the embodiment of the application may be a device with an operating system. The operating system may be an Android operating system, an iOS operating system, or another possible operating system; the embodiments of the present application are not specifically limited.
The music collection device provided in the embodiment of the present application can implement each process implemented by the music collection device in the method embodiment of fig. 1, and is not described herein again to avoid repetition.
Optionally, an embodiment of the present application further provides an electronic device, which includes a processor, a memory, and a program or an instruction stored in the memory and capable of being executed on the processor, where the program or the instruction is executed by the processor to implement each process of the above music collection method embodiment, and can achieve the same technical effect, and in order to avoid repetition, details are not described here again.
It should be noted that the electronic devices in the embodiments of the present application include the mobile electronic devices and the non-mobile electronic devices described above.
Fig. 6 is a schematic diagram of a hardware structure of an electronic device implementing an embodiment of the present application.
The electronic device 300 includes, but is not limited to: radio frequency unit 301, network module 302, audio output unit 303, input unit 304, sensor 305, display unit 306, user input unit 307, interface unit 308, memory 309, and processor 310.
Those skilled in the art will appreciate that the electronic device 300 may further include a power source (e.g., a battery) for supplying power to the various components; the power source may be logically connected to the processor 310 through a power management system, so that charging, discharging, and power-consumption management functions are implemented through the power management system. The electronic device structure shown in fig. 6 does not constitute a limitation of the electronic device; the electronic device may include more or fewer components than shown, combine some components, or use a different arrangement of components, which is not described again here.
A user input unit 307, configured to receive a gesture input of a user in a case where the target music is played;
a processor 310 for acquiring first gesture information in response to a gesture input; and collecting the target music to a first song list under the condition that the first gesture information is matched with the preset gesture information, wherein the first song list is associated with the preset gesture information.
In the embodiment of the application, during playback of the target music, a gesture input from the user is received; if first gesture information corresponding to the gesture input matches preset gesture information, the currently played target music is collected into the first song list associated with that preset gesture information. In this embodiment, the user can collect music into a song list simply by performing a gesture input on the electronic device, so the collection operation is simple and the convenience of collecting music is improved. Moreover, because each song list is associated in advance with its own preset gesture information, the user can store music into different song lists by inputting different gestures; that is, music can be stored by category through simple gesture inputs, which simplifies the operation steps of classified storage and allows music to be sorted into song lists quickly.
Optionally, the user input unit 307 is specifically configured to: with the electronic device in the screen-off mode, a gesture input of a user is received.
In this embodiment, music can be collected by receiving the user's gesture input while the electronic device is in the screen-off mode. In this manner the electronic device does not need to be unlocked in order to collect music, which improves the user's convenience and simplifies the collection operation.
Optionally, the processor 310 is further configured to: and collecting the target music to a second song list under the condition that the matching of the first gesture information and the gesture information related to all song lists fails.
In this embodiment, when the first gesture information input by the user fails to match the gesture information associated with any song list, the user has probably misremembered the gesture information corresponding to the song lists. However, the purpose of inputting the first gesture information is still to collect the music, so in this case the music may be temporarily collected into a second song list.
Optionally, the processor 310 is further configured to: under the condition that the matching of the first gesture information and gesture information related to all the song lists fails, establishing a second song list; and collecting the target music to the second song list, and establishing the association relation between the first gesture information and the second song list.
In this embodiment, when the user inputs a music collection gesture incorrectly, the target music needs to be saved into the second song list; however, the second song list may not yet exist, so it must first be created, and the first gesture information that the user input by mistake is then associated with it. Because a gesture input by mistake is usually one the user remembers well, the first gesture information, provided it does not correspond to any function, can be associated with the newly created second song list, so that the user can subsequently collect music with that same gesture.
Optionally, the processor 310 is further configured to: collecting the target music to a first song list under the condition that the first gesture information is matched with the preset gesture information of the plurality of song lists; the first song list is: playing the played song list within a preset time period; or, the song list containing the largest amount of music of the same category as the target music among the plurality of song lists.
In this embodiment, in a case where the preset gesture information corresponds to a plurality of song sheets, a song sheet satisfying a preset condition is selected as the first song sheet from the plurality of song sheets. According to the mode, the preset conditions are set, so that the target music can be stored in the song list frequently selected by the user or attached to the target music only by performing gesture input on the target music subsequently, the song listening requirement of the user is guaranteed, and the occupation of a song cache on a memory space is reduced.
The embodiment of the present application further provides a readable storage medium, where a program or an instruction is stored on the readable storage medium, and when the program or the instruction is executed by a processor, the program or the instruction implements each process of the above music collection method embodiment, and can achieve the same technical effect, and in order to avoid repetition, details are not repeated here.
The processor is the processor in the electronic device described in the above embodiment. The readable storage medium includes a computer readable storage medium, such as a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk, and so on.
The embodiment of the present application further provides a chip, where the chip includes a processor and a communication interface, the communication interface is coupled to the processor, and the processor is configured to run a program or an instruction to implement each process of the above music collection method embodiment, and can achieve the same technical effect, and the details are not repeated here to avoid repetition.
It should be understood that the chips mentioned in the embodiments of the present application may also be referred to as system-on-chip, system-on-chip or system-on-chip, etc.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising an … …" does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises the element. Further, it should be noted that the scope of the methods and apparatus of the embodiments of the present application is not limited to performing the functions in the order illustrated or discussed, but may include performing the functions in a substantially simultaneous manner or in a reverse order based on the functions involved, e.g., the methods described may be performed in an order different than that described, and various steps may be added, omitted, or combined. In addition, features described with reference to certain examples may be combined in other examples.
Through the above description of the embodiments, those skilled in the art will clearly understand that the method of the above embodiments can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware, but in many cases, the former is a better implementation manner. Based on such understanding, the technical solutions of the present application may be embodied in the form of a software product, which is stored in a storage medium (such as ROM/RAM, magnetic disk, optical disk) and includes instructions for enabling a terminal (such as a mobile phone, a computer, a server, an air conditioner, or a network device) to execute the method according to the embodiments of the present application.
While the present embodiments have been described with reference to the accompanying drawings, it is to be understood that the invention is not limited to the precise embodiments described above, which are meant to be illustrative and not restrictive, and that various changes may be made therein by those skilled in the art without departing from the spirit and scope of the invention as defined by the appended claims.

Claims (12)

1. A music collection method, comprising:
receiving gesture input of a user under the condition of playing target music;
in response to the gesture input, acquiring first gesture information;
collecting the target music to a first song list under the condition that the first gesture information is matched with preset gesture information, wherein the first song list is associated with the preset gesture information.
2. The method of claim 1, wherein receiving a gesture input by a user comprises:
receiving the gesture input of the user when the electronic device is in a screen-off mode.
3. The method according to claim 1 or 2, further comprising, after the acquiring the first gesture information:
and under the condition that the first gesture information is unsuccessfully matched with the gesture information associated with all the song sheets, collecting the target music to a second song sheet.
4. The method of claim 3, wherein collecting the target music to a second song list in the event that the first gesture information fails to match gesture information associated with all song lists comprises:
under the condition that the matching of the first gesture information and gesture information related to all the song lists fails, establishing a second song list;
and collecting the target music to the second song list, and establishing the association relation between the first gesture information and the second song list.
5. The method of claim 1, wherein collecting the target music to a first song list if the first gesture information matches preset gesture information comprises:
under the condition that the first gesture information is matched with preset gesture information of a plurality of song lists, collecting the target music to the first song list;
the first song list is:
playing the played song list within a preset time period;
or, the song list, among the plurality of song lists, containing the largest number of pieces of music of the same category as the target music.
6. A music collection apparatus, comprising:
the first receiving module is used for receiving gesture input of a user under the condition of playing target music;
the information acquisition module is used for responding to the gesture input and acquiring first gesture information;
the first collecting module is used for collecting the target music to a first song list under the condition that the first gesture information is matched with preset gesture information, wherein the first song list is associated with the preset gesture information.
7. The apparatus of claim 6, wherein the first receiving module is specifically configured to: receiving the gesture input of the user when the electronic device is in a screen-off mode.
8. The apparatus of claim 6 or 7, further comprising:
and the second collection module is used for collecting the target music to a second song list under the condition that the matching of the first gesture information and the gesture information related to all song lists fails.
9. The apparatus of claim 8, wherein the second collection module comprises:
the song list establishing unit is used for establishing the second song list under the condition that the matching of the first gesture information and gesture information related to all song lists fails;
a collection unit for collecting the target music to the second song list;
and the association unit is used for establishing the association relationship between the first gesture information and the second song list.
10. The apparatus of claim 6, wherein the first collection module is specifically configured to:
collecting the target music into the first song list under the condition that the first gesture information matches preset gesture information of a plurality of song lists, the first song list being: a song list played within a preset time period; or, the song list, among the plurality of song lists, containing the largest number of pieces of music of the same category as the target music.
11. An electronic device comprising a processor, a memory, and a program or instructions stored on the memory and executable on the processor, the program or instructions when executed by the processor implementing the steps of the music collection method of any of claims 1 to 5.
12. A readable storage medium, on which a program or instructions are stored, which when executed by a processor, carry out the steps of the music collection method of any one of claims 1 to 5.
CN202010427346.5A 2020-05-19 2020-05-19 Music collection method and device, electronic equipment and medium Pending CN111665936A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010427346.5A CN111665936A (en) 2020-05-19 2020-05-19 Music collection method and device, electronic equipment and medium


Publications (1)

Publication Number Publication Date
CN111665936A true CN111665936A (en) 2020-09-15



Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103577060A (en) * 2012-08-07 2014-02-12 联想(北京)有限公司 Data processing method and electronic equipment
CN104267898A (en) * 2014-09-16 2015-01-07 北京数字天域科技股份有限公司 Method and device for quick triggering application program or application program function
CN105446637A (en) * 2014-08-29 2016-03-30 宇龙计算机通信科技(深圳)有限公司 Image storage method and system and terminal
US20160187992A1 (en) * 2014-04-03 2016-06-30 Honda Motor Co., Ltd. Smart tutorial for gesture control system
CN106484302A (en) * 2016-10-31 2017-03-08 维沃移动通信有限公司 A kind of playback of songs method and mobile terminal
US20170068320A1 (en) * 2014-03-03 2017-03-09 Nokia Technologies Oy An Input Axis Between an Apparatus and A Separate Apparatus
CN107577734A (en) * 2017-08-24 2018-01-12 维沃移动通信有限公司 The method and mobile terminal of a kind of song collection
CN108073280A (en) * 2016-11-16 2018-05-25 汤姆逊许可公司 The selecting object in enhancing or reality environment


Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112256233A (en) * 2020-10-22 2021-01-22 北京字节跳动网络技术有限公司 Music playing method and device
US11934632B2 (en) 2020-10-22 2024-03-19 Beijing Bytedance Network Technology Co., Ltd. Music playing method and apparatus
CN112286421A (en) * 2020-10-30 2021-01-29 维沃移动通信有限公司 Playlist processing method and device and electronic equipment
CN113485600A (en) * 2021-07-19 2021-10-08 维沃移动通信(杭州)有限公司 Singing list sharing method and device and electronic equipment
CN114564604A (en) * 2022-03-01 2022-05-31 北京字节跳动网络技术有限公司 Media collection generation method and device, electronic equipment and storage medium
CN114564604B (en) * 2022-03-01 2023-08-08 抖音视界有限公司 Media collection generation method and device, electronic equipment and storage medium


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication (application publication date: 20200915)