WO2022111260A1 - Music screening method, apparatus, device and medium - Google Patents

Music screening method, apparatus, device and medium

Info

Publication number
WO2022111260A1
Authority
WO
WIPO (PCT)
Prior art keywords
music
attribute
dimension
coordinate system
dimensional coordinate
Prior art date
Application number
PCT/CN2021/129233
Other languages
English (en)
French (fr)
Inventor
何珂
Original Assignee
腾讯科技(深圳)有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 腾讯科技(深圳)有限公司 filed Critical 腾讯科技(深圳)有限公司
Publication of WO2022111260A1 publication Critical patent/WO2022111260A1/zh

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/60Information retrieval; Database structures therefor; File system structures therefor of audio data
    • G06F16/63Querying
    • G06F16/638Presentation of query results

Definitions

  • the present application relates to the field of Internet technologies, and in particular, to a music screening method, apparatus, device and medium.
  • the screening function in an application program usually requires the user to enter a keyword for the target music in the screening column, thereby screening out a set of candidate music, from which the user then selects the target music.
  • music screening performed in this way often requires the user to provide fairly detailed key information and cannot target a specific attribute of the desired music, so the screening results fail to satisfy the user's fuzzy preferences and fuzzy needs.
  • the embodiments of the present application provide a music screening method, apparatus, device and medium, which can realize fuzzy screening of music through a triggering operation on a two-dimensional coordinate system.
  • the technical solution at least includes the following technical solutions:
  • a music screening method comprising:
  • the music screening interface is displayed, and a two-dimensional coordinate system is displayed on the music screening interface.
  • the first dimension of the two-dimensional coordinate system corresponds to the first music attribute
  • the second dimension of the two-dimensional coordinate system corresponds to the second music attribute
  • the first music attribute and the second music attribute are different music attributes;
  • the filtered target music is displayed, the first music attribute of the target music corresponds to the coordinates of the trigger position in the first dimension, and the second music attribute of the target music corresponds to the coordinates of the trigger position in the second dimension.
  • a music screening device comprising:
  • the display module is used to display the music screening interface.
  • a two-dimensional coordinate system is displayed on the music screening interface.
  • the first dimension of the two-dimensional coordinate system corresponds to the first music attribute
  • the second dimension of the two-dimensional coordinate system corresponds to the second music attribute.
  • the first musical attribute and the second musical attribute are different musical attributes;
  • an acquisition module used for acquiring the triggering position of the triggering operation in response to the triggering operation on the two-dimensional coordinate system
  • the display module is also used to display the selected target music, the first music attribute of the target music corresponds to the coordinates of the trigger position in the first dimension, and the second music attribute of the target music corresponds to the coordinates of the trigger position in the second dimension.
  • a computer device includes a processor and a memory, the memory stores at least one piece of program code, the program code is loaded by the processor and executes the above music screening method.
  • a computer-readable storage medium is provided, and at least one piece of program code is stored in the computer-readable storage medium, and the program code is loaded and executed by a processor to implement the above music screening method.
  • a computer program product or computer program comprising computer instructions stored in a computer-readable storage medium.
  • the processor of the computer device reads the computer instructions from the computer-readable storage medium, and the processor executes the computer instructions to cause the computer device to perform the music screening method as above.
  • FIG. 1 is a schematic diagram of an interface change of a music screening method provided by an exemplary embodiment of the present application
  • FIG. 2 is a block diagram of a computer system provided by an exemplary embodiment of the present application.
  • FIG. 3 is a flowchart of a music screening method provided by an exemplary embodiment of the present application.
  • FIG. 4a is a schematic interface diagram of a music screening interface provided by an exemplary embodiment of the present application.
  • 4b is a schematic interface diagram of a music screening interface provided by another exemplary embodiment of the present application.
  • FIG. 5 is a schematic interface diagram of a music screening interface provided by another exemplary embodiment of the present application.
  • FIG. 6 is a flowchart of a music screening method provided by an exemplary embodiment of the present application.
  • Fig. 7 is a flow chart of steps of melody localization provided by an exemplary embodiment of the present application.
  • FIG. 8 is a technical flow chart of a music screening method provided by an exemplary embodiment of the present application.
  • FIG. 9 is a schematic structural diagram of a music screening apparatus provided by an exemplary embodiment of the present application.
  • FIG. 10 is a schematic structural diagram of a computer device provided by an exemplary embodiment of the present application.
  • Music screening interface refers to the program interface presented to the user for performing music screening and/or displaying results.
  • Two-dimensional coordinate system refers to a coordinate system formed by two number axes with a common origin on the same plane, usually with two dimensions.
  • Music attributes: music is formed by the sounds produced by musical instruments through percussion, friction, blowing, etc. and/or by human voices.
  • Music attributes refer to the special properties of the formed music, mainly involving two aspects: the melody characteristics of the music and the types of musical instruments used in the music.
  • Attribute tag refers to the classification tag marked after classifying music according to the aforementioned music attributes.
  • a piece of music has at least one attribute tag of a music attribute, and the number of attribute tags of one music attribute of a piece of music is not limited.
  • for example, a piece of music may have attribute tags of two music attributes, tempo and instrument type, where the instrument type attribute has attribute tags for three instrument types: pipa, guzheng and xiao.
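  • as a concrete illustration of the tagging scheme described above, the following is a minimal sketch of how a piece of music and its attribute tags could be represented in memory; the dictionary layout, field names and helper function are illustrative assumptions and not part of the claimed method.

```python
# Minimal sketch of the attribute-tag data model described above.
# The field names ("beat", "instrument_type") and the example values are
# illustrative assumptions; the embodiment only requires that a piece of music
# carry at least one attribute tag per music attribute, with no upper limit.

candidate_music = {
    "title": "Song A",
    "attribute_tags": {
        "beat": ["slow rhythm"],                          # one tag for the tempo attribute
        "instrument_type": ["pipa", "guzheng", "xiao"],   # three instrument-type tags
    },
}

def has_tag(music: dict, attribute: str, tag: str) -> bool:
    """Return True if the music carries the given tag under the given attribute."""
    return tag in music["attribute_tags"].get(attribute, [])

print(has_tag(candidate_music, "instrument_type", "guzheng"))  # True
```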
  • the embodiment of the present application provides a music screening method, which displays the screened target music through a trigger operation on a two-dimensional coordinate system, reduces the time and steps required for the user to perform music screening, makes it possible for the user to screen music according to vague requirements, and enhances the user experience.
  • a two-dimensional coordinate system 111 is displayed on the music screening interface 110 .
  • the two-dimensional coordinate system 111 is a rectangular coordinate system, and the target music is displayed in response to a trigger operation on the two-dimensional coordinate system 111 .
  • a search bar control and a two-dimensional coordinate system 111 are displayed on the music screening interface 110 .
  • the user can perform an input operation in the search bar control, and implement music filtering by inputting key information; or, the user can perform music filtering by clicking on the two-dimensional coordinate system 111 .
  • taking the two-dimensional coordinate system 111 as an example of a rectangular coordinate system, when the user double-clicks a certain coordinate point in the rectangular coordinate system, the program interface jumps from the music screening interface 110 to the music playing interface 120.
  • a song "Song A" is displayed on the music playing interface 120, and the song is in a playing state. That is, after the user performs a double-click operation on the Cartesian coordinate system, the target music is directly played.
  • the selected target music is determined through human-computer interaction between the user and the terminal.
  • the user needs to use a screening tool to determine the desired music through operations performed on the screening tool, and the terminal displays the target music according to the user's operation.
  • the embodiment of the present application provides an interactive solution for music screening based on a two-dimensional coordinate system. Through a user's triggering operation on the two-dimensional coordinate system, the screened target music is displayed, which can meet the user's fuzzy screening requirements.
  • FIG. 2 shows a block diagram of a computer system provided by an exemplary embodiment of the present application.
  • the computer system 200 includes a server 210 , a terminal 220 , a cloud database 230 and a local database 240 .
  • the server 210 may be a server, or a server cluster composed of several servers, or a virtualization platform, or a cloud computing service center.
  • the server 210 may be a server that provides background support for music applications, and the server 210 may be composed of one or more functional units.
  • a plurality of terminals 220 are connected to the server 210 through a wireless or wired network.
  • a music application program is installed and running on the terminal 220, and the application program has the function of supporting music screening.
  • the application program may be a music player program, a video player program, a radio player program or other music application programs.
  • the terminal 220 may be at least one of a computer, a smart phone, a tablet computer, an e-book reader, an MP3 player, an MP4 player, a laptop computer, a desktop computer, a smart TV, a smart car, and a smart device.
  • the cloud database 230 and the local database 240 are connected to the server 210 through a wireless or wired network, and are used to store music-related data, such as music title, music duration, music performer information, types of music attributes, and attribute tags of each music.
  • the music screening method provided by the embodiment of the present application provides the convenience of fuzzy query for the user to perform music screening.
  • as shown in the schematic flowchart of Figure 3, the execution subject of the method is a computer device, for example, a terminal running a music application program, such as the terminal 220 in Figure 2; the method includes the following steps:
  • Step 302 Display the music screening interface.
  • a two-dimensional coordinate system is displayed on the music screening interface, the first dimension of the two-dimensional coordinate system corresponds to the first music attribute, and the second dimension of the two-dimensional coordinate system corresponds to the second music attribute.
  • the first music attribute and the second music attribute are different music attributes.
  • Music screening interface refers to the program interface for music screening and/or result display.
  • the music screening interface may be at least one of a screening function interface, a search function interface, a query function interface, and an identification function interface.
  • at least a two-dimensional coordinate system is displayed in the music screening interface, and the two-dimensional coordinate system is used for the user to perform operations related to music screening.
  • at least one of a search bar control, a query bar control, and an audio recognition control may also be displayed in the music screening interface.
  • a search bar control is also displayed, and input operations can be performed on the search bar control. That is, the music screening performed by the user in the music screening interface 110 may be performed on the two-dimensional coordinate system 111 or on the search bar control.
  • a two-dimensional coordinate system refers to a coordinate system formed by two number axes with a common origin on the same plane, usually with two dimensions.
  • the two-dimensional coordinate system involved in the embodiments of the present application includes at least one of a polar coordinate system and a rectangular coordinate system.
  • Polar coordinate system refers to a coordinate system composed of a pole, a polar axis and a polar radius in a plane, including a radius dimension and an azimuth dimension.
  • a fixed point O is determined on the plane, called the pole; a ray Ox is drawn from the pole O, called the polar axis; and a unit length is selected, so that the length of a line segment OP is the polar radius.
  • angles are measured from the polar axis, taking the counterclockwise direction as positive.
  • the position of any point P in the plane can be determined by the length ρ of the line segment OP and the angle θ between the polar axis Ox and the line segment OP; the ordered pair (ρ, θ) is called the polar coordinates of the point P, where ρ is the radius coordinate of the point P and θ is the azimuth coordinate of the point P.
  • a two-dimensional coordinate system 411 is displayed in the music screening interface 410, and the two-dimensional coordinate system 411 is a polar coordinate system.
  • the two-dimensional coordinate system 411 is a polar coordinate system.
  • a Cartesian coordinate system refers to a coordinate system composed of two number axes that are perpendicular to each other and have a common origin in a plane, including the x-axis dimension and the y-axis dimension.
  • the plane of the rectangular coordinate system is called the coordinate plane
  • the common origin is called the origin of the rectangular coordinate system
  • the two axes are called the x-axis and the y-axis.
  • the x-axis and the y-axis divide the coordinate plane into four quadrants, and the quadrants are bounded by the number axis.
  • starting from the upper-right quadrant and proceeding counterclockwise, the quadrants are the first quadrant, the second quadrant, the third quadrant and the fourth quadrant.
  • a two-dimensional coordinate system 511 is displayed on the music screening interface 510 , and the two-dimensional coordinate system 511 is a rectangular coordinate system.
  • the two-dimensional coordinate system 511 is a rectangular coordinate system.
  • the Cartesian coordinates are marked (-2, 1).
  • part of the content of the two-dimensional coordinate system may be hidden in the music screening interface.
  • the two-dimensional coordinate system includes at least one of a first dimension axis, a second dimension axis, an origin, a first dimension unit, a second dimension unit, a plurality of quadrant regions, grid lines, and circular lines.
  • part of the content of the two-dimensional coordinate system can be hidden according to the user's needs.
  • a two-dimensional coordinate system is displayed in the music screening interface, and the two-dimensional coordinate system is a polar coordinate system.
  • the polar coordinate system includes an orientation dimension number axis, an origin, a radius dimension unit, and an orientation dimension unit.
  • the radius dimension unit only displays “slow rhythm” and "fast rhythm”
  • the azimuth dimension unit displays "scale one", "scale two", "scale three", "scale four", "scale five", "scale six" and "scale seven". That is, the circular line and part of the radius dimension unit of the polar coordinate system are hidden in the music screening interface.
  • a two-dimensional coordinate system is displayed in the music screening interface.
  • the two-dimensional coordinate system is a rectangular coordinate system.
  • the rectangular coordinate system includes the x-axis number axis, the y-axis number axis, the origin, the x-axis dimension units "fast rhythm" and "normal rhythm", the y-axis dimension units "treble" and "midrange", and the first quadrant area. That is, the grid lines of the rectangular coordinate system, part of the x-axis and y-axis dimension units, the second quadrant area, the third quadrant area, and the fourth quadrant area are hidden in the music screening interface.
  • Music is formed by the sounds produced by musical instruments and/or human voices through percussion, friction, blowing, etc.
  • the music involved in the embodiments of the present application includes, but is not limited to, at least one of the following: songs, playlists, audio, video, and radio.
  • the characteristics related to the music melody include but are not limited to at least one of beat, scale, dynamics, speed, musical form, pitch, timbre, and range.
  • Beat refers to the rule by which strong and weak beats are combined, and specifically to the total length of the notes in each measure of the score.
  • Scale refers to the mode formed by arranging the tones of the music stepwise, beginning and ending with the tonic; the mode can be understood as the melodic character of the music.
  • Dynamics refers to the strength of the sound in the music.
  • Speed refers to how fast or slow the music goes.
  • Form refers to the horizontal organization of music.
  • Pitch refers to the high or low frequency of a sound.
  • Timbre refers to the single or mixed use of vocals and instruments in music.
  • the vocal range is the range from the lowest to the highest notes that a human voice and/or instrument can reach.
  • keyboard instruments can be divided into piano, pipe organ, accordion, electronic organ, etc.
  • plucked instruments can be divided into pipa, zheng, dulcimer, lyre, donbula, liuqin, ruan, etc.
  • the specific types of musical instruments listed above are only illustrative examples, which can be adjusted according to actual needs and are not limited in this application.
  • the music attributes involved in the embodiments of the present application include:
  • the two-dimensional coordinate system includes two dimensions, namely the first dimension and the second dimension.
  • the first dimension corresponds to the first music attribute
  • the second dimension corresponds to the second music attribute
  • the first music attribute and the second music attribute are different music attributes.
  • the first dimension corresponds to the tempo attribute
  • the second dimension corresponds to the pitch attribute
  • the first dimension corresponds to the instrument type of the plucked instrument
  • the second dimension corresponds to the instrument type of the keyboard instrument
  • the first dimension corresponds to the scale attribute
  • the second dimension corresponds to the musical instrument type of the keyboard instrument.
  • the second dimension corresponds to the percussion instrument type.
  • a two-dimensional coordinate system 411 is displayed on the music screening interface 410, and the two-dimensional coordinate system 411 is a polar coordinate system.
  • the radius dimension of polar coordinates corresponds to the beat attribute
  • the azimuth dimension corresponds to the scale attribute.
  • a two-dimensional coordinate system 511 is displayed on the music screening interface 510 , and the two-dimensional coordinate system 511 is a rectangular coordinate system.
  • the x-axis dimension of the rectangular coordinate system corresponds to the beat attribute
  • the y-axis dimension corresponds to the vocal range attribute.
  • other dimension-related elements may also be displayed in the two-dimensional coordinate system, for example, auxiliary information of the dimension is displayed below the first dimension and/or the second dimension.
  • the two-dimensional coordinate system 511 is a rectangular coordinate system
  • the x-axis dimension units "slow tempo" and "fast tempo" are displayed in the Cartesian coordinate system
  • the y-axis dimension units "bass" and "high pitch" are displayed in the Cartesian coordinate system
  • four instrument names are displayed under the above four dimensional units to help users understand the dimensional units. For example, "violin” displayed under the x-axis dimension unit "slow tempo” is used to prompt the user that the beat of the slow tempo is similar to the melody played by a violin.
  • Step 304 In response to the triggering operation on the two-dimensional coordinate system, obtain the triggering position of the triggering operation.
  • the trigger operation on the two-dimensional coordinate system includes, but is not limited to, at least one of the following operations: a sliding operation, a touch operation, a single-click operation, and a double-click operation within the scope of the two-dimensional coordinate system.
  • the trigger position of the trigger operation refers to the detailed coordinates of the trigger operation on the two-dimensional coordinate system.
  • the terminal can obtain the trigger position of the user's trigger operation on the touch screen.
  • the touch chip obtains the touch event, and reports the touch event to the processor of the terminal, and the processor obtains the trigger position according to the reported touch event.
  • touch events include a Touch Start event, a Touch Move event, and a Touch End event.
  • the Touch Start event is used to indicate the touch coordinates of the finger on the touch screen
  • the Touch Move event is used to indicate the continuous touch coordinates when the finger continuously slides on the touch screen
  • the Touch End event is used to indicate the touch coordinates of the finger leaving the touch screen.
  • the above-mentioned touch coordinates are acquired by the touch sensor according to the touch position of the user on the touch screen.
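  • the following is a minimal sketch of how the touch coordinates reported by the touch events could be converted into a trigger position in the displayed two-dimensional coordinate system; the event structure and the on-screen placement and scale of the axes are assumptions for illustration only.

```python
# Hypothetical sketch: map a touch point reported by the touch chip to
# coordinates of the displayed two-dimensional coordinate system.
# The event fields and the pixel placement of the axes are assumptions.

from dataclasses import dataclass

@dataclass
class TouchEvent:
    kind: str    # "touch_start", "touch_move" or "touch_end"
    x_px: float  # touch coordinates on the touch screen, in pixels
    y_px: float

@dataclass
class CoordinateSystemView:
    origin_x_px: float    # where the origin of the coordinate system is drawn
    origin_y_px: float
    px_per_unit_x: float  # scale of the first-dimension axis
    px_per_unit_y: float  # scale of the second-dimension axis

    def trigger_position(self, event: TouchEvent) -> tuple[float, float]:
        """Convert screen pixels to coordinates in the two-dimensional coordinate system."""
        u = (event.x_px - self.origin_x_px) / self.px_per_unit_x
        v = (self.origin_y_px - event.y_px) / self.px_per_unit_y  # screen y grows downwards
        return (u, v)

view = CoordinateSystemView(origin_x_px=540, origin_y_px=960,
                            px_per_unit_x=100, px_per_unit_y=100)
print(view.trigger_position(TouchEvent("touch_end", x_px=340, y_px=860)))  # (-2.0, 1.0)
```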
  • the two-dimensional coordinate system 411 is a polar coordinate system.
  • the radius dimension of polar coordinates is the beat dimension
  • the azimuth dimension is the scale dimension.
  • the sliding button 412 is displayed on the polar coordinate system
  • the trigger position refers to the coordinates of the sliding button 412 on the polar coordinate system.
  • the sliding button 412 can be slid within the range of the two-dimensional coordinate system 411, and the sliding button 412 is a mark used to determine the specific position of the trigger operation, and may not be displayed in the two-dimensional coordinate system 411.
  • the trigger position of the click operation at the sliding button 412 is obtained as (radius coordinates, azimuth coordinates).
  • in the rhythm dimension, five beat levels are divided: slow rhythm, sub-slow rhythm, normal rhythm, sub-fast rhythm, and fast rhythm.
  • the obtained coordinates of the trigger position are (normal rhythm, scale four).
  • the two-dimensional coordinate system includes at least one of a polar coordinate system and a rectangular coordinate system.
  • step 304 may adopt at least one of the following two optional manners:
  • the two-dimensional coordinate system is a polar coordinate system.
  • Step 304 may include:
  • the radius coordinates are determined as the coordinates of the trigger position in the first dimension, and the azimuth coordinates are determined as the coordinates of the trigger position in the second dimension.
  • the two-dimensional coordinate system 411 is a polar coordinate system.
  • the first dimension is the radius dimension
  • the second dimension is the orientation dimension
  • the position of the sliding button 412 is the trigger position of the trigger operation.
  • the radius dimension corresponds to the beat attribute
  • the orientation dimension corresponds to the scale attribute.
  • in the rhythm dimension, five beat levels are divided: slow rhythm, sub-slow rhythm, normal rhythm, sub-fast rhythm, and fast rhythm.
  • the radius coordinate of the position of the sliding button 412 is obtained as the normal rhythm
  • the azimuth coordinate is scale four
  • the normal rhythm is determined as the coordinate of the position of the sliding button 412 in the radius dimension
  • scale four is determined as the coordinate of the position of the sliding button 412 in the azimuth dimension.
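  • the following is a minimal sketch of one possible way to quantize a polar trigger position into a beat level and a scale level as in the example above; the numeric level boundaries are assumptions, since the embodiment only defines five beat levels and seven scale levels.

```python
# Hypothetical sketch: quantize a polar trigger position (radius, azimuth)
# into a beat level and a scale level. The numeric ranges are assumptions;
# the embodiment only specifies five beat levels and seven scale levels.

BEAT_LEVELS = ["slow rhythm", "sub-slow rhythm", "normal rhythm",
               "sub-fast rhythm", "fast rhythm"]
SCALE_LEVELS = ["scale one", "scale two", "scale three", "scale four",
                "scale five", "scale six", "scale seven"]

def polar_trigger_to_tags(radius: float, azimuth_deg: float,
                          max_radius: float = 1.0) -> tuple[str, str]:
    """Map the radius to one of five beat levels and the azimuth to one of seven scales."""
    r = min(max(radius / max_radius, 0.0), 1.0)
    beat_idx = min(int(r * len(BEAT_LEVELS)), len(BEAT_LEVELS) - 1)
    scale_idx = int((azimuth_deg % 360.0) / (360.0 / len(SCALE_LEVELS)))
    return BEAT_LEVELS[beat_idx], SCALE_LEVELS[scale_idx]

# A point halfway out, at about 180 degrees, falls on (normal rhythm, scale four).
print(polar_trigger_to_tags(radius=0.5, azimuth_deg=180.0))
```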
  • the two-dimensional coordinate system is a rectangular coordinate system.
  • Step 304 may include:
  • the x-axis coordinates are determined as the coordinates of the trigger position in the first dimension, and the y-axis coordinates are determined as the coordinates of the trigger position in the second dimension.
  • the two-dimensional coordinate system 511 is a rectangular coordinate system.
  • the first dimension is the x-axis dimension
  • the second dimension is the y-axis dimension
  • the position of the sliding button 512 is the trigger position of the trigger operation.
  • the x-axis dimension corresponds to the beat dimension
  • the y-axis dimension corresponds to the vocal range dimension.
  • in the rhythm dimension, nine beat levels are divided: -4-level rhythm, -3-level rhythm, -2-level rhythm, -1-level rhythm, 0-level rhythm, 1-level rhythm, 2-level rhythm, 3-level rhythm and 4-level rhythm.
  • the sound range dimension is divided into nine ranges: -4-level range, -3-level range, -2-level range, -1-level range, 0-level range, 1-level range, 2-level range, 3-level range and 4-level range.
  • the x-axis coordinate of the position of the sliding button 512 is obtained as the -2-level rhythm
  • the y-axis coordinate is the 1-level sound range
  • the -2-level rhythm is determined as the coordinate of the position of the sliding button 512 in the x-axis dimension
  • the 1-level sound range is determined as the coordinate of the position of the sliding button 512 in the y-axis dimension.
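  • analogously, the following sketch quantizes a rectangular-coordinate trigger position into one of the nine beat levels and one of the nine range levels; the rounding and clamping choices are assumptions for illustration.

```python
# Hypothetical sketch: quantize an (x, y) trigger position into one of the
# nine beat levels (-4 .. 4) and nine range levels (-4 .. 4). Simple rounding
# and clamping are assumptions for illustration.

def quantize_level(value: float, lo: int = -4, hi: int = 4) -> int:
    """Round a continuous axis coordinate to the nearest integer level in [lo, hi]."""
    return max(lo, min(hi, round(value)))

def cartesian_trigger_to_tags(x: float, y: float) -> tuple[str, str]:
    beat = quantize_level(x)
    rng = quantize_level(y)
    return f"{beat}-level rhythm", f"{rng}-level sound range"

# A trigger position near (-2, 1) maps to (-2-level rhythm, 1-level sound range).
print(cartesian_trigger_to_tags(-2.1, 0.8))
```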
  • Step 306 Display the filtered target music.
  • the first music attribute of the target music corresponds to the coordinates of the trigger position in the first dimension
  • the second music attribute of the target music corresponds to the coordinates of the trigger position in the second dimension
  • the target music refers to music selected by the terminal after screening and having an attribute label corresponding to the coordinates of the trigger position in the two-dimensional coordinate system.
  • the target music may be one or more.
  • they can be expressed in the form of playlists, radio stations, lists, and the like.
  • the program interface to which the target music belongs is a music screening interface or a second program interface
  • the second program interface and the music screening interface are different interfaces.
  • the second program interface is a music playing interface, or other functional interfaces.
  • a target song and a target playlist are displayed, wherein the target songs include "song 1" and "song 2".
  • the program interface to which the target song and the target playlist belong is the music screening interface 410.
  • the music screening interface may display information related to some candidate music.
  • a list item of candidate music is displayed around the position where the sliding button 412 is located, and the list item includes "song 1", "song 2", "Playlist 1" and "Playlist 2"; in addition, the user can also expand the list item by triggering "", for example, clicking "" expands the list item.
  • in response to a triggering operation on the polar coordinate system in the music screening interface 410, such as a double-click operation, a music playing control is displayed at the bottom of the music screening interface 410, and song 1 is played at the same time.
  • step 306 includes the following steps:
  • the first music whose first music attribute has the first attribute tag and the second music attribute has the second attribute tag is filtered out in the music library.
  • the first attribute tag and the second attribute tag are different attribute tags.
  • the attribute label refers to a classification label that is labeled after classifying each music according to the music attribute.
  • a piece of music has at least one attribute tag of a music attribute, and the number of attribute tags of a piece of music attribute of a piece of music is not limited.
  • music 1 has the following attribute tags: slow rhythm, scale 2, scale 3, guzheng, and pipa.
  • Slow rhythm belongs to the attribute tag of beat attribute
  • scale 2 and scale 3 belong to the attribute tag of scale attribute
  • guzheng and pipa belong to the attribute tags of the musical instrument type attribute; the number of attribute tags of one music attribute of a piece of music is not limited.
  • the music library refers to a database for storing music-related information, and the music library includes at least one candidate music.
  • the terminal finally determines the selected target music by screening music in the music library.
  • the music library may be stored in the local database, or in the cloud database, or in both the local database and the cloud database. For example, after the terminal first filters the music library 1 in the local database, the target music is not found; then, the terminal finds the target music after filtering the music library 2 in the cloud database.
  • the trigger position corresponds to a coordinate point in the two-dimensional coordinate system, and attribute labels of two different music attributes corresponding to the coordinate point can be obtained. That is, the first attribute label is obtained according to the coordinates of the trigger position in the first dimension, and the second attribute label is obtained according to the coordinates of the trigger position in the second dimension.
  • the terminal can filter out at least one target music in the music library, and display the target music.
  • the terminal filters out the first music in the music library, where the first music attribute of the first music has the first attribute tag and the second music attribute of the first music has the second attribute tag.
  • the two-dimensional coordinate system 511 is a rectangular coordinate system
  • the position where the user performs the trigger operation on the rectangular coordinate system is the position where the sliding button 512 is located.
  • the trigger position of the trigger operation is obtained as (-2-level beat, 1-level sound range).
  • the first attribute label of the trigger position is -2-level beat
  • the second attribute label is 1-level sound range.
  • the terminal filters the music library and finds that the song "Song A" has the attribute tags "-2-level beat", "scale three", "1-level range", "piano", and "guzheng"; the song "Song A" is therefore determined as the target song, and the interface jumps from the music screening interface 510 to the music playing interface 520 to display the song "Song A".
  • step 306 further includes the following steps:
  • in the case where the first music does not exist, the second music is filtered out of the music library as the target music, wherein the attribute tag of the first music attribute of the second music is closest in distance to the first attribute tag, and the second music attribute of the second music has the second attribute tag;
  • or, the third music is filtered out of the music library as the target music, wherein the first music attribute of the third music has the first attribute tag, and the attribute tag of the second music attribute of the third music is closest in distance to the second attribute tag; or, the fourth music is filtered out of the music library as the target music, wherein the attribute tag of the first music attribute of the fourth music is closest in distance to the first attribute tag, and the attribute tag of the second music attribute of the fourth music is closest in distance to the second attribute tag.
  • the distance between the attribute label of the first music attribute and the first attribute label refers to the distance between the attribute label of the first music attribute of the second music and the first attribute label.
  • the definitions of the rest of the distances are similar, and are not repeated here.
  • the first dimension and the second dimension determined according to the first music attribute and the second music attribute constitute a two-dimensional coordinate system, and there is at least one coordinate point in the two-dimensional coordinate system corresponding to one candidate music.
  • the distance may be determined by the distance between mathematical coordinate points in the two-dimensional coordinate system, or determined according to other preset distance comparison rules.
  • the coordinates of the trigger position are (1, 2), and the coordinates of the candidate music are (3, 5), then the first attribute label is 1, and the second attribute label is 2.
  • the distance between the first music attribute of the candidate music and the first attribute label is 2, and the distance between the second music attribute and the second attribute label is 3.
  • if the coordinates of the trigger position are (fast rhythm, scale 5) and the coordinates of the candidate music are (fast rhythm, scale 1), then the first attribute tag is fast rhythm and the second attribute tag is scale 5.
  • the first music attribute of the candidate music has the first attribute tag, and the distance between the second music attribute and the second attribute tag is four scales.
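  • the following is a minimal sketch of the screening logic described above: an exact match on both attribute tags is preferred (the first music), and if no exact match exists, the candidate whose tags are closest in distance to the trigger position is returned; encoding tags as level indices so that distance is a simple difference is an assumption for illustration.

```python
# Hypothetical sketch of the screening step: prefer music whose two attributes
# carry exactly the first and second attribute tags (the "first music"); if no
# such music exists, fall back to the candidate whose tags are closest in
# distance to the trigger position. Encoding tags as level indices so that
# distance is a simple difference is an assumption for illustration.

def screen_by_tags(library: list[dict], first_tag: int, second_tag: int) -> dict | None:
    """Return an exact match if one exists, otherwise the nearest candidate."""
    exact = [m for m in library
             if first_tag in m["first_attr_tags"] and second_tag in m["second_attr_tags"]]
    if exact:                        # the "first music"
        return exact[0]

    def distance(m: dict) -> float:  # fallback: second / third / fourth music
        d1 = min(abs(t - first_tag) for t in m["first_attr_tags"])
        d2 = min(abs(t - second_tag) for t in m["second_attr_tags"])
        return d1 + d2

    return min(library, key=distance) if library else None

library = [
    {"title": "song 1", "first_attr_tags": [1], "second_attr_tags": [2]},
    {"title": "song 2", "first_attr_tags": [3], "second_attr_tags": [5]},
]
print(screen_by_tags(library, first_tag=1, second_tag=2)["title"])  # song 1 (exact match)
print(screen_by_tags(library, first_tag=4, second_tag=5)["title"])  # song 2 (nearest)
```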
  • the music screening method provided by the embodiments of the present application can display the target music selected through a triggering operation on the two-dimensional coordinate system, realize fuzzy screening of music, reduce the time and energy users spend on music screening, and make the selected target music meet the user's fuzzy screening needs.
  • an embodiment of the present application provides a method for tagging an attribute tag of candidate music in a music library, and a method for generating a two-dimensional coordinate system.
  • the execution body of the method is a computer device, for example, a terminal running a music application program, such as the terminal 220 in FIG. 2 , the method includes the following steps:
  • Step 601 Mark the attribute tags of each candidate music in the music library according to the music attributes.
  • music attributes mainly involve two aspects.
  • it relates to the characteristics of music melody, including but not limited to at least one of beat, scale, dynamics, speed, musical form, pitch, timbre, and range;
  • it relates to the types of musical instruments used in the music, including but not limited to at least one of wind instruments, plucked instruments, percussion instruments, bowed string instruments, string instruments, woodwind instruments, brass instruments, and keyboard instruments.
  • the music attributes involved in the embodiments of the present application include:
  • the attribute label refers to the classification mark marked after classifying each music according to the music attribute.
  • a piece of music has at least one attribute tag of a music attribute, and the number of attribute tags of one music attribute of a piece of music is not limited.
  • the music library includes at least one candidate music.
  • the terminal matches the first attribute tag and the second attribute tag obtained according to the trigger position with the attribute tags of the candidate music in the music library. Therefore, it is necessary to classify the attribute labels for each music attribute of each candidate music in the music library.
  • melody localization technology can be used as the basis for marking attribute tags: according to the audio waveform information of the candidate music, the melody is extracted based on pitch saliency by means of frequency measurement, and the relevant music attribute information is obtained.
  • the execution body of the melody recognition technology is a computer device, for example, a terminal running a music application program, such as the terminal 220 in FIG. 2 , including the following steps:
  • Step 701 Preprocess the input music content.
  • Step 702 Perform time-frequency transformation and spectral processing on the preprocessed music content.
  • Step 703 Perform calculation through the pitch saliency function.
  • Step 704 Track the pitch.
  • Step 705 Melody positioning.
  • the preprocessed music content is converted into frequency information; the frequency information is then evaluated by the pitch saliency function and mapped to different pitches; the different pitches are tracked, and the melody is finally located.
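  • the following is a minimal sketch of the melody localization pipeline of steps 701 to 705, using the open-source librosa library as a stand-in; librosa's pYIN tracker replaces the pitch saliency computation described in the embodiment, so this is an assumption for illustration rather than the claimed implementation.

```python
# Minimal sketch of the melody localization pipeline of steps 701-705, using the
# open-source librosa library as a stand-in. librosa's pYIN tracker is used here
# in place of the pitch-saliency computation described in the embodiment; this
# is an assumption for illustration, not the claimed implementation.
import librosa
import numpy as np

def locate_melody(path: str) -> np.ndarray:
    # Step 701: preprocess the input music content (load, mix to mono, resample).
    y, sr = librosa.load(path, sr=22050, mono=True)
    # Step 702: time-frequency transformation is handled internally by the tracker.
    # Steps 703-704: estimate and track the dominant pitch frame by frame.
    f0, voiced_flag, _ = librosa.pyin(y,
                                      fmin=librosa.note_to_hz("C2"),
                                      fmax=librosa.note_to_hz("C7"),
                                      sr=sr)
    # Step 705: melody positioning - keep only the voiced frames as the melody line.
    melody_hz = f0[voiced_flag & ~np.isnan(f0)]
    return librosa.hz_to_midi(melody_hz)  # per-frame melody pitches for later tagging

# melody_midi = locate_melody("candidate_music.wav")  # hypothetical file path
```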
  • the basis for marking attribute labels may also be other classification methods and/or techniques, such as simple weighting methods.
  • step 601 includes:
  • according to the beat feature of the candidate music, the beat label of the candidate music is marked as one of five beat levels.
  • the five beat levels include slow tempo, sub-slow tempo, normal tempo, sub-fast tempo, and fast tempo.
  • the beat features are divided into five types: weak, sub-weak, average, sub-strong, and strong, respectively corresponding to five beat levels of slow, sub-slow, normal, sub-fast, and fast.
  • in the case that the beat feature of the candidate music conforms to the weak beat type, the beat label of the candidate music is marked as slow tempo; or, in the case that the beat feature conforms to the sub-weak beat type, the beat label is marked as sub-slow tempo; or, in the case that the beat feature conforms to the average beat type, the beat label is marked as normal tempo; or, in the case that the beat feature conforms to the sub-strong beat type, the beat label is marked as sub-fast tempo; or, in the case that the beat feature conforms to the strong beat type, the beat label is marked as fast tempo.
  • the classification of beat levels can also be performed according to the combination rule of strong and weak beats; for example, the beat characteristics can be divided into time signatures such as 1/4, 2/4, 3/4, 4/4, 3/8, 6/8, 7/8, 9/8 and 12/8.
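  • the following is a minimal sketch of the beat labelling step: the tempo of the candidate music is estimated and bucketed into one of the five beat levels; the BPM thresholds and the use of librosa's beat tracker are assumptions, since the embodiment only defines the five levels and their beat-feature types.

```python
# Hypothetical sketch of the beat labelling step: estimate the tempo of the
# candidate music and bucket it into one of the five beat levels. The BPM
# thresholds and the use of librosa's beat tracker are illustrative assumptions;
# the embodiment only defines the five levels and their beat-feature types.
import librosa
import numpy as np

BEAT_THRESHOLDS = [(70, "slow rhythm"), (90, "sub-slow rhythm"),
                   (110, "normal rhythm"), (140, "sub-fast rhythm")]

def beat_label(path: str) -> str:
    y, sr = librosa.load(path, sr=22050, mono=True)
    tempo, _ = librosa.beat.beat_track(y=y, sr=sr)  # estimated tempo in beats per minute
    bpm = float(np.atleast_1d(tempo)[0])            # beat_track may return a 1-element array
    for bpm_limit, label in BEAT_THRESHOLDS:
        if bpm < bpm_limit:
            return label
    return "fast rhythm"

# print(beat_label("candidate_music.wav"))  # e.g. "normal rhythm" (hypothetical file path)
```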
  • step 601 includes:
  • according to the number of occurrences of the notes in the candidate music, the scale label of the candidate music is marked as at least one of seven scale levels.
  • the seven scale levels include scale one, scale two, scale three, scale four, scale five, scale six, and scale seven.
  • the notes include the seven notes 1, 2, 3, 4, 5, 6 and 7, respectively corresponding to the seven scale levels scale one, scale two, scale three, scale four, scale five, scale six and scale seven.
  • in the case where the 1 note occurs most frequently in the candidate music, the scale label of the candidate music is marked as scale one; or, in the case where the 2 note occurs most frequently, the scale label is marked as scale two; or, in the case where the 3 note occurs most frequently, the scale label is marked as scale three; or, in the case where the 4 note occurs most frequently, the scale label is marked as scale four; or, in the case where the 5 note occurs most frequently, the scale label is marked as scale five; or, in the case where the 6 note occurs most frequently, the scale label is marked as scale six; or, in the case where the 7 note occurs most frequently, the scale label is marked as scale seven.
  • in the case where a first note and a second note occur the same number of times and both occur more often than the remaining notes, the scale labels of the candidate music are marked as the scale levels corresponding to the first note and the second note.
  • for example, if the 3 note, the 4 note and the 5 note occur the same number of times in the candidate music and all occur more often than the remaining notes, the scale labels of the candidate music are marked as scale three, scale four and scale five.
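  • the following is a minimal sketch of the scale labelling step: the seven notes are counted over the extracted melody and the most frequent note(s) determine the scale label(s); mapping pitch classes to the degrees 1 to 7 of C major is an illustrative assumption.

```python
# Hypothetical sketch of the scale labelling step: count how often each of the
# seven notes (1..7) occurs in the extracted melody and label the candidate
# music with the scale level(s) of the most frequent note(s). Mapping pitch
# classes to the degrees 1..7 of C major is an illustrative assumption.
from collections import Counter
import numpy as np

PITCH_CLASS_TO_DEGREE = {0: 1, 2: 2, 4: 3, 5: 4, 7: 5, 9: 6, 11: 7}  # C major degrees
SCALE_LABELS = {1: "scale one", 2: "scale two", 3: "scale three", 4: "scale four",
                5: "scale five", 6: "scale six", 7: "scale seven"}

def scale_labels(melody_midi: np.ndarray) -> list[str]:
    pitch_classes = np.round(melody_midi).astype(int) % 12
    degrees = [PITCH_CLASS_TO_DEGREE[pc] for pc in pitch_classes.tolist()
               if pc in PITCH_CLASS_TO_DEGREE]
    counts = Counter(degrees)
    if not counts:
        return []
    top = max(counts.values())
    # Ties produce several labels, as in the "scale three, four and five" example.
    return [SCALE_LABELS[d] for d, c in sorted(counts.items()) if c == top]

print(scale_labels(np.array([60, 62, 62, 64, 64])))  # ['scale two', 'scale three']
```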
  • step 601 includes:
  • the pitch labels of the candidate music are labeled as one of seven pitch levels.
  • the seven tone levels include very low frequency, low frequency, low mid frequency, mid frequency, high mid frequency, high frequency, and very high frequency.
  • pitch is primarily determined by the frequency of the sound, and is therefore affected by the type of instrument and the combined frequency of the human voice.
  • in the case that the music frequency of the candidate music belongs to the very low frequency, the pitch label of the candidate music is marked as very low frequency; or, in the case that the music frequency belongs to the low frequency, the pitch label is marked as low frequency; or, in the case that the music frequency belongs to the low mid frequency, the pitch label is marked as low mid frequency; or, in the case that the music frequency belongs to the mid frequency, the pitch label is marked as mid frequency; or, in the case that the music frequency belongs to the high mid frequency, the pitch label is marked as high mid frequency; or, in the case that the music frequency belongs to the high frequency, the pitch label is marked as high frequency; or, in the case that the music frequency belongs to the very high frequency, the pitch label is marked as very high frequency.
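  • the following is a minimal sketch of the pitch labelling step: the overall music frequency is bucketed into one of the seven pitch levels; using the median melody frequency and these particular Hz boundaries are assumptions for illustration.

```python
# Hypothetical sketch of the pitch labelling step: bucket the overall music
# frequency into one of the seven pitch levels. Using the median melody
# frequency and these particular Hz boundaries are illustrative assumptions.
import numpy as np

PITCH_BANDS = [(80.0, "very low frequency"), (160.0, "low frequency"),
               (320.0, "low mid frequency"), (640.0, "mid frequency"),
               (1280.0, "high mid frequency"), (2560.0, "high frequency")]

def pitch_label(melody_hz: np.ndarray) -> str:
    freq = float(np.median(melody_hz))  # overall music frequency estimate
    for upper_hz, label in PITCH_BANDS:
        if freq < upper_hz:
            return label
    return "very high frequency"

print(pitch_label(np.array([200.0, 250.0, 300.0])))  # "low mid frequency"
```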
  • step 601 includes:
  • in the case where the appearance duration of the first musical instrument in the candidate music is longer than the appearance durations of the remaining musical instruments, the instrument type label of the candidate music is marked as the first musical instrument.
  • for example, the musical instruments that appear in a candidate music are piano, accordion, pipa, guitar, snare drum, flute, and oboe, and their appearance durations are 35 seconds, 22 seconds, 35 seconds, 27 seconds, 12 seconds, 19 seconds, and 26 seconds, respectively. Since the appearance durations of the piano and the pipa are both 35 seconds, longer than the appearance durations of the remaining musical instruments, the instrument type labels of the candidate music are marked as piano and pipa.
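  • the following is a minimal sketch of the instrument labelling step, reproducing the piano/pipa tie in the example above; how the per-instrument appearance durations are obtained is outside the scope of this sketch.

```python
# Hypothetical sketch of the instrument labelling step: given the appearance
# duration of each instrument detected in the candidate music (how these
# durations are obtained is outside this sketch), label the music with every
# instrument whose duration equals the maximum, reproducing the piano/pipa tie
# in the example above.

def instrument_labels(durations_s: dict[str, float]) -> list[str]:
    longest = max(durations_s.values())
    return [name for name, duration in durations_s.items() if duration == longest]

durations = {"piano": 35, "accordion": 22, "pipa": 35, "guitar": 27,
             "snare drum": 12, "flute": 19, "oboe": 26}
print(instrument_labels(durations))  # ['piano', 'pipa']
```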
  • Step 602 Display a selection function interface.
  • At least two dimension strings are displayed on the selection function interface, and the dimension strings are generated according to music attributes.
  • the dimension string is used to generate a two-dimensional coordinate system, which is achieved by triggering operations on the dimension string.
  • the first dimension of the two-dimensional coordinates corresponds to the first musical attribute
  • the second dimension corresponds to the second music attribute; therefore, the number of dimension strings is not less than the number of music attributes, and the dimension strings include at least two strings corresponding to the music attributes.
  • the dimension strings include the string "beat", the string "scale", the string "pitch", the string "form", the string "percussion", the string "keyboard", and the string "wind instrument".
  • Step 603 In response to the selection operation on the dimension string, generate a two-dimensional coordinate system.
  • the selection operation on the dimension string includes, but is not limited to, at least one of the following operations: a single-click operation, a double-click operation, and a drag operation performed on the dimension string.
  • step 603 at least includes the following steps:
  • in response to the first selection operation on the first dimension string, a radius dimension is generated; in response to the second selection operation on the second dimension string, an azimuth dimension is generated; according to the radius dimension and the azimuth dimension, a polar coordinate system is generated;
  • the first dimension string is the string “beat” and the second dimension string is the string “scale”. Click the string “beat” to generate the radius dimension, and click the string “scale” to generate the azimuth dimension, thus generating the polar coordinate system.
  • the first dimension string is the string "percussion instrument”
  • the second dimension string is the string "sound range”. Double-click the string "percussion” to generate the x-axis dimension, and double-click the string "range” to generate the y-axis dimension, thereby generating a Cartesian coordinate system.
  • the first dimension string is the string "pitch" and the second dimension string is the string "musical form". Drag the string "pitch" to the specified position to generate the radius dimension, and drag the string "musical form" to the specified position to generate the azimuth dimension, thereby generating the polar coordinate system.
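  • the following is a minimal sketch of how a two-dimensional coordinate system could be generated from two selected dimension strings; the data class and the polar/rectangular flag are assumptions for illustration.

```python
# Hypothetical sketch of generating a two-dimensional coordinate system from two
# selected dimension strings. The data class and the "polar"/"rectangular" flag
# are assumptions for illustration; the embodiment only requires that the first
# selection produce the first dimension and the second selection the second.
from dataclasses import dataclass

@dataclass
class CoordinateSystem:
    kind: str              # "polar" or "rectangular"
    first_dimension: str   # music attribute bound to the radius / x-axis dimension
    second_dimension: str  # music attribute bound to the azimuth / y-axis dimension

def generate_coordinate_system(first_string: str, second_string: str,
                               kind: str = "polar") -> CoordinateSystem:
    return CoordinateSystem(kind=kind,
                            first_dimension=first_string,
                            second_dimension=second_string)

# Clicking "beat" and then "scale" yields a polar system (radius = beat, azimuth = scale).
print(generate_coordinate_system("beat", "scale", kind="polar"))
```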
  • Step 604 Display the music screening interface.
  • a two-dimensional coordinate system is displayed on the music screening interface, the first dimension of the two-dimensional coordinate system corresponds to the first music attribute, and the second dimension of the two-dimensional coordinate system corresponds to the second music attribute.
  • the first music attribute and the second music attribute are different music attributes.
  • Step 605 In response to the triggering operation on the two-dimensional coordinate system, obtain the triggering position of the triggering operation.
  • the trigger operation on the two-dimensional coordinate system includes, but is not limited to, at least one of the following operations: a sliding operation, a touch operation, a single-click operation, and a double-click operation within the scope of the two-dimensional coordinate system.
  • Step 606 Display the filtered target music.
  • the first music attribute of the target music corresponds to the coordinates of the trigger position in the first dimension
  • the second music attribute of the target music corresponds to the coordinates of the trigger position in the second dimension
  • Step 604, step 605 and step 606 are the same as step 302, step 304 and step 306, which can be used for reference and will not be repeated here.
  • an embodiment of the present application provides a technical flow chart of a music screening method, and the specific steps are as follows:
  • Step 801 The server performs song recording.
  • the server classifies the music attributes of each candidate music in the music library, marks the attribute labels of each candidate music, and records the marked songs.
  • on the one hand, the server uploads the classified resources to the cloud database for subsequent re-classification, multiple screenings and data invocation; on the other hand, it saves the attribute tag data in the local database so that the terminal can call it at any time.
  • Step 802 The terminal generates a two-dimensional coordinate system.
  • Step 803 The terminal sends the relevant information of the first dimension and the second dimension to the server.
  • Step 804 The server performs two-dimensional coordinate system analysis.
  • the user selects two dimension strings according to their own needs to generate the first dimension and the second dimension respectively, and the terminal generates a two-dimensional coordinate system according to the first dimension and the second dimension .
  • the server receives the first dimension and the second dimension in the two-dimensional coordinate system sent by the terminal, and analyzes the two-dimensional coordinate system.
  • the server calls the data in the music library according to the first dimension and the second dimension, and the music library is stored in the cloud database and/or the local database.
  • the server imports the candidate music that conforms to the first dimension and the second dimension into the two-dimensional coordinate system, ensuring that there is at least one coordinate point corresponding to one candidate music in the two-dimensional coordinate system.
  • Step 805 The terminal performs a triggering operation on the two-dimensional coordinate system.
  • Step 806 The terminal sends the coordinates of the trigger position to the server.
  • in response to the triggering operation performed on the two-dimensional coordinate system, the terminal obtains the triggering position of the triggering operation and sends the coordinates of the triggering position to the server; that is, the coordinates of the triggering position in the first dimension and in the second dimension of the two-dimensional coordinate system are sent to the server.
  • Step 807 The server performs music screening.
  • Step 808 The server sends the filtered target music to the terminal.
  • Step 809 The terminal displays the target music.
  • the music screening starts.
  • the server searches the music library according to the coordinates of the trigger position; after finding the first candidate music, it determines whether the first music attribute of the candidate music matches the coordinates of the trigger position in the first dimension and whether the second music attribute matches the coordinates of the trigger position in the second dimension. If they do not match, the search is expanded outwards within the range of the two-dimensional coordinate system, with the coordinates of the trigger position as the center.
  • after the server finds matching candidate music, that is, the target music, the server sends the filtered target music to the terminal, and the terminal displays the target music.
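  • the following is a minimal sketch of the server-side search of steps 807 and 808: the search starts at the coordinate point of the trigger position and expands outwards ring by ring until matching candidate music is found; indexing the music library by integer grid coordinates is an assumption for illustration.

```python
# Hypothetical sketch of the server-side search: start at the coordinate point
# of the trigger position and, if no candidate music matches there, expand the
# search outwards ring by ring within the two-dimensional coordinate system.
# Indexing the music library by integer grid coordinates is an assumption.

def screen_music(library: dict[tuple[int, int], list[str]],
                 trigger: tuple[int, int], max_radius: int = 4) -> list[str]:
    cx, cy = trigger
    for radius in range(max_radius + 1):
        hits: list[str] = []
        for (x, y), songs in library.items():
            if max(abs(x - cx), abs(y - cy)) == radius:  # ring at this distance
                hits.extend(songs)
        if hits:
            return hits  # closest matching candidate music found
    return []

library = {(-2, 1): ["Song A"], (3, 3): ["Song B"]}
print(screen_music(library, trigger=(-2, 1)))  # ['Song A'] (exact match)
print(screen_music(library, trigger=(0, 0)))   # ['Song A'] (found after expanding)
```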
  • by marking the attribute tags of candidate music, the music screening method provided by the embodiment of the present application subdivides the music attributes of each candidate music into finer classifications, so that the basis for music screening is more accurate and the granularity of music filtering is reduced.
  • by providing dimension strings, the music screening method provided by the embodiment of the present application enables the user to select the dimensions of the two-dimensional coordinate system according to his or her own needs, that is, the user can perform fuzzy screening on specific music attributes according to his or her own needs, which satisfies the user's usage needs and enhances the user's experience to a certain extent.
  • the present application provides a music screening apparatus, which can be implemented as all or a part of a terminal through software, hardware or a combination of the two.
  • the music screening apparatus includes: a marking module 920 , a displaying module 940 , a generating module 960 and an obtaining module 980 .
  • the marking module 920 is used for marking the attribute tags of each candidate music in the music library according to the music attributes.
  • the display module 940 is configured to display a selection function interface, where at least two dimension strings are displayed on the selection function interface, and the dimension strings are generated according to music attributes.
  • the generating module 960 is configured to generate a two-dimensional coordinate system in response to a selection operation on the dimension string.
  • the display module 940 is also used to display a music screening interface, where a two-dimensional coordinate system is displayed on the music screening interface, the first dimension of the two-dimensional coordinate system corresponds to the first music attribute, the second dimension of the two-dimensional coordinate system corresponds to the second music attribute, and the first music attribute and the second music attribute are different music attributes.
  • the obtaining module 980 is configured to obtain the triggering position of the triggering operation in response to the triggering operation on the two-dimensional coordinate system.
  • the display module 940 is also used to display the filtered target music, where the first music attribute of the target music corresponds to the coordinates of the trigger position in the first dimension, and the second music attribute of the target music corresponds to the coordinates of the trigger position in the second dimension.
  • the two-dimensional coordinate system is a polar coordinate system
  • the obtaining module 980 is further configured to: in response to a trigger operation on the polar coordinate system, obtain the radius coordinates and azimuth coordinates of the trigger position; determine the radius coordinates as the coordinates of the trigger position in the first dimension, and determine the azimuth coordinates as the coordinates of the trigger position in the second dimension.
  • the two-dimensional coordinate system is a rectangular coordinate system
  • the obtaining module 980 is further configured to: in response to a trigger operation on the rectangular coordinate system, obtain the x-axis coordinate of the trigger position and the y-axis coordinate; the x-axis coordinate is determined as the coordinate of the trigger position in the first dimension, and the y-axis coordinate is determined as the coordinate of the trigger position in the second dimension.
  • the display module 940 is further configured to: determine the first attribute tag according to the coordinates of the trigger position in the first dimension; determine the second attribute tag according to the coordinates of the trigger position in the second dimension; and filter out the first music with the first attribute tag and the second attribute tag in the music library as the target music, where the first attribute tag and the second attribute tag are different attribute tags.
  • the display module 940 is further configured to: in the case where the first music does not exist, filter out the second music in the music library as the target music , wherein the attribute tag of the first music attribute of the second music has the closest distance to the first attribute tag, and the second music attribute of the second music has the second attribute tag.
  • the display module 940 is further configured to: in the case where the first music does not exist, filter out a third music in the music library as the target music , wherein the first music attribute of the third music has the first attribute tag, and the second music attribute of the third music has the attribute tag that is closest to the second attribute tag.
  • the display module 940 is further configured to: filter out a fourth music from the music library as the target music when the first music does not exist, wherein the attribute tag of the first music attribute of the fourth music is closest in distance to the first attribute tag, and the attribute tag of the second music attribute of the fourth music is closest in distance to the second attribute tag.
  • the music attributes include at least two of tempo, scale, velocity, speed, musical form, pitch, timbre, and range; or at least two instrument types; or at least one of tempo, scale, velocity, speed, musical form, pitch, timbre, and range together with at least one instrument type.
  • in one possible implementation, the music attribute includes tempo.
  • the marking module 920 is further configured to: mark the tempo label of the candidate music as one of five tempo levels according to the tempo feature of the candidate music (a labeling sketch follows this list).
  • the five tempo levels are slow tempo, sub-slow tempo, normal tempo, sub-fast tempo, and fast tempo.
  • in one possible implementation, the music attribute includes scale.
  • the marking module 920 is further configured to: mark the scale label of the candidate music as at least one of seven scale levels according to the number of occurrences of notes in the candidate music; the seven scale levels are scale one, scale two, scale three, scale four, scale five, scale six, and scale seven (see the scale-counting sketch after this list).
  • in one possible implementation, the music attribute includes instrument type.
  • the marking module 920 is further configured to: determine the instruments appearing in the candidate music and the appearance duration of each instrument; and, in the case where the appearance duration of a first instrument in the candidate music is greater than the appearance durations of the remaining instruments, mark the instrument type label of the candidate music as the first instrument.
  • in one possible implementation, the music attribute includes instrument type.
  • the marking module 920 is further configured to: determine the instruments appearing in the candidate music and the appearance duration of each instrument; and, in the case where the appearance durations of a first instrument and a second instrument in the candidate music are the same, and both are greater than the appearance durations of the remaining instruments, mark the instrument type label of the candidate music as the first instrument and the second instrument (see the instrument-labeling sketch after this list).
  • the generating module 960 is further configured to: generate a radius dimension in response to a first selection operation on a first dimension string; generate an azimuth dimension in response to a second selection operation on a second dimension string; and generate the polar coordinate system according to the radius dimension and the azimuth dimension.
  • the generating module 960 is further configured to: generate an x-axis dimension in response to the first selection operation on the first dimension string; generate a y-axis dimension in response to the second selection operation on the second dimension string; and generate a rectangular coordinate system according to the x-axis dimension and the y-axis dimension (a configuration sketch follows this list).
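To make the coordinate handling above concrete, the following Python sketch shows one way a raw touch point could be converted into the polar (radius, azimuth) or rectangular (x, y) coordinates of the displayed coordinate system. It is only an illustration: the function names, the pixel-based inputs, and the choice of degrees for the azimuth are assumptions, not details disclosed by the application.

```python
import math

def to_polar(touch_x, touch_y, origin_x, origin_y):
    """Map a touch point (pixels) to (radius, azimuth) relative to the origin of the
    polar coordinate system; azimuth is measured counter-clockwise from the polar
    axis, in degrees (illustrative convention)."""
    dx = touch_x - origin_x
    dy = origin_y - touch_y            # screen y grows downward
    radius = math.hypot(dx, dy)
    azimuth = math.degrees(math.atan2(dy, dx)) % 360.0
    return radius, azimuth

def to_rectangular(touch_x, touch_y, origin_x, origin_y, unit_px):
    """Map a touch point (pixels) to (x, y) coordinates of the rectangular
    coordinate system, expressed in axis units."""
    x = (touch_x - origin_x) / unit_px
    y = (origin_y - touch_y) / unit_px
    return x, y

# Example: a tap 90 px to the right of and 52 px above the origin
print(to_polar(390, 448, 300, 500))            # ~ (103.9, 30.0)
print(to_rectangular(390, 448, 300, 500, 26))  # ~ (3.46, 2.0)
```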
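The filtering bullets above (first attribute label, second attribute label, and the fallback to the nearest label when no exact match exists) can be read as quantize-then-match logic. The sketch below expresses that reading with a hypothetical in-memory library and an index-difference distance rule; the application leaves the distance rule open (coordinate distance or another preset comparison rule), so treat this as one possible interpretation rather than the claimed implementation.

```python
TEMPO_LEVELS = ["slow", "sub-slow", "normal", "sub-fast", "fast"]   # first-dimension labels
SCALE_LEVELS = [f"scale {i}" for i in range(1, 8)]                  # second-dimension labels

def label_from_coordinate(coord, levels, max_coord):
    """Quantize a dimension coordinate in [0, max_coord] to one of the ordered labels."""
    idx = min(int(coord / max_coord * len(levels)), len(levels) - 1)
    return levels[idx]

def label_distance(a, b, levels):
    """Distance between two labels of one attribute: difference of their positions."""
    return abs(levels.index(a) - levels.index(b))

def filter_target_music(library, first_label, second_label):
    """Exact match first (the 'first music' case); if no candidate carries both
    labels, fall back to the candidates whose labels are closest to the request."""
    exact = [m for m in library
             if first_label in m["tempo"] and second_label in m["scale"]]
    if exact:
        return exact
    best, best_cost = [], None
    for m in library:
        d1 = min(label_distance(t, first_label, TEMPO_LEVELS) for t in m["tempo"])
        d2 = min(label_distance(s, second_label, SCALE_LEVELS) for s in m["scale"])
        cost = d1 + d2
        if best_cost is None or cost < best_cost:
            best, best_cost = [m], cost
        elif cost == best_cost:
            best.append(m)
    return best

library = [
    {"title": "Song A", "tempo": {"normal"}, "scale": {"scale 4"}},
    {"title": "Song B", "tempo": {"fast"},   "scale": {"scale 1"}},
]

first = label_from_coordinate(2.6, TEMPO_LEVELS, 5.0)    # -> "normal"
second = label_from_coordinate(3.4, SCALE_LEVELS, 7.0)   # -> "scale 4"
print([m["title"] for m in filter_target_music(library, first, second)])        # ['Song A']
print([m["title"] for m in filter_target_music(library, "slow", "scale 1")])    # ['Song B']
```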
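For the tempo labeling, one plausible reading is that a measured tempo feature (here an estimated BPM) is bucketed into the five levels. The thresholds below are invented for illustration; the application only states that the label is one of the five levels.

```python
def tempo_label(bpm):
    """Bucket an estimated BPM into the five tempo levels named in the description
    (the numeric thresholds are illustrative assumptions)."""
    if bpm < 70:
        return "slow"
    if bpm < 90:
        return "sub-slow"
    if bpm < 110:
        return "normal"
    if bpm < 140:
        return "sub-fast"
    return "fast"

print(tempo_label(96))   # -> "normal"
```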
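The scale labeling keeps the scale degree(s) that occur most often, which also covers the tie case described above. A minimal sketch, assuming the candidate music has already been reduced to a sequence of scale-degree numbers 1 to 7:

```python
from collections import Counter

def scale_labels(degrees):
    """degrees: iterable of scale-degree numbers (1..7) extracted from the candidate
    music. Returns every degree whose count equals the maximum, so a tie produces
    multiple scale labels."""
    counts = Counter(degrees)
    top = max(counts.values())
    return [f"scale {d}" for d in sorted(counts) if counts[d] == top]

print(scale_labels([3, 4, 5, 3, 4, 5, 1]))  # -> ['scale 3', 'scale 4', 'scale 5']
```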
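The two instrument-type bullets amount to keeping every instrument whose total appearance duration equals the maximum. The sketch below reuses the worked example from the description (piano and pipa both appearing for 35 seconds); the dictionary representation is an assumption.

```python
def instrument_labels(durations):
    """durations: mapping from instrument name to its total appearance time in
    seconds. Returns all instruments tied for the longest appearance time."""
    longest = max(durations.values())
    return [name for name, secs in durations.items() if secs == longest]

durations = {"piano": 35, "accordion": 22, "pipa": 35, "guitar": 27,
             "snare drum": 12, "flute": 19, "oboe": 26}
print(instrument_labels(durations))   # -> ['piano', 'pipa']
```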
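Finally, the generating module bullets can be pictured as building a small configuration object from the two dimension strings selected by the user. The class and field names below are invented purely to make the polar/rectangular distinction concrete; they are not part of the claimed apparatus.

```python
from dataclasses import dataclass

@dataclass
class CoordinateSystem:
    kind: str               # "polar" or "rectangular"
    first_dimension: str    # music attribute bound to the radius / x-axis
    second_dimension: str   # music attribute bound to the azimuth / y-axis

def build_polar(first_string, second_string):
    # first selection -> radius dimension, second selection -> azimuth dimension
    return CoordinateSystem("polar", first_string, second_string)

def build_rectangular(first_string, second_string):
    # first selection -> x-axis dimension, second selection -> y-axis dimension
    return CoordinateSystem("rectangular", first_string, second_string)

print(build_polar("tempo", "scale"))
print(build_rectangular("percussion instruments", "range"))
```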
  • FIG. 10 shows a structural block diagram of the computer device 1000 provided by an exemplary embodiment of the present application.
  • the computer device 1000 can be a portable mobile terminal, such as a smart phone, a tablet computer, an MP3 (Moving Picture Experts Group Audio Layer III) player, or an MP4 (Moving Picture Experts Group Audio Layer IV) player.
  • the computer device 1000 may also be referred to by other names such as user equipment, portable terminal, and the like.
  • computer device 1000 includes: processor 1001 and memory 1002 .
  • the processor 1001 may include one or more processing cores, such as a 4-core processor, an 8-core processor, and the like.
  • the processor 1001 can be implemented in at least one of the following hardware forms: DSP (Digital Signal Processing), FPGA (Field-Programmable Gate Array), and PLA (Programmable Logic Array).
  • the processor 1001 may also include a main processor and a coprocessor.
  • the main processor is a processor used to process data in the wake-up state, also called a CPU (Central Processing Unit); the coprocessor is a low-power processor used to process data in the standby state.
  • the processor 1001 may be integrated with a GPU (Graphics Processing Unit), which is used for rendering and drawing the content that needs to be displayed on the display screen.
  • the processor 1001 may further include an AI (Artificial Intelligence) processor, which is used to process computing operations related to machine learning.
  • Memory 1002 may include one or more computer-readable storage media, which may be tangible and non-transitory. Memory 1002 may also include high-speed random access memory and non-volatile memory, such as one or more disk storage devices or flash storage devices. In some embodiments, a non-transitory computer-readable storage medium in memory 1002 is used to store at least one instruction, and the at least one instruction is executed by processor 1001 to implement the music screening method provided in this application.
  • the computer device 1000 may further include: a peripheral device interface 1003 and at least one peripheral device.
  • the peripheral device includes: at least one of a radio frequency circuit 1004 , a touch display screen 1005 , a camera assembly 1006 , an audio circuit 1007 , a positioning assembly 1008 and a power supply 1009 .
  • the peripheral device interface 1003 may be used to connect at least one peripheral device related to I/O (Input/Output) to the processor 1001 and the memory 1002 .
  • in some embodiments, the processor 1001, the memory 1002, and the peripheral device interface 1003 are integrated on the same chip or circuit board; in some other embodiments, any one or two of the processor 1001, the memory 1002, and the peripheral device interface 1003 can be implemented on a separate chip or circuit board, which is not limited in this embodiment.
  • the radio frequency circuit 1004 is used for receiving and transmitting RF (Radio Frequency, radio frequency) signals, also called electromagnetic signals.
  • the radio frequency circuit 1004 communicates with the communication network and other communication devices through electromagnetic signals.
  • the radio frequency circuit 1004 converts electrical signals into electromagnetic signals for transmission, or converts received electromagnetic signals into electrical signals.
  • the radio frequency circuit 1004 includes an antenna system, an RF transceiver, one or more amplifiers, a tuner, an oscillator, a digital signal processor, a codec chipset, a subscriber identity module card, and the like.
  • the radio frequency circuit 1004 may communicate with other terminals through at least one wireless communication protocol.
  • the wireless communication protocol includes but is not limited to: the World Wide Web, metropolitan area networks, intranets, various generations of mobile communication networks (2G, 3G, 4G, 5G, or combinations thereof), wireless local area networks and/or WiFi (Wireless Fidelity) networks.
  • the radio frequency circuit 1004 may further include a circuit related to NFC (Near Field Communication), which is not limited in this application.
  • the touch screen 1005 is used to display UI (User Interface, user interface).
  • the UI can include graphics, text, icons, video, and any combination thereof.
  • the touch display 1005 also has the ability to acquire touch signals on or over the surface of the touch display 1005 .
  • the touch signal can be input to the processor 1001 as a control signal for processing.
  • the touch screen 1005 is used to provide virtual buttons and/or a virtual keyboard, also referred to as soft buttons and/or soft keyboard.
  • there may be one touch display screen 1005, provided on the front panel of the computer device 1000; in other embodiments, there may be at least two touch display screens 1005, respectively provided on different surfaces of the computer device 1000 or in a folded design.
  • in some embodiments, the touch display 1005 may be a flexible display disposed on a curved or folded surface of the computer device 1000. The touch display screen 1005 can even be set as a non-rectangular irregular figure, that is, a special-shaped screen.
  • the touch display screen 1005 can be made of materials such as LCD (Liquid Crystal Display, liquid crystal display), OLED (Organic Light-Emitting Diode, organic light emitting diode).
  • the camera assembly 1006 is used to capture images or video.
  • the camera assembly 1006 includes a front camera and a rear camera.
  • the front camera is used for video calls or selfies
  • the rear camera is used for photo or video shooting.
  • there are at least two rear cameras, each being any one of a main camera, a depth-of-field camera, and a wide-angle camera, so that the main camera and the depth-of-field camera can be fused to realize the background blur function, and the main camera and the wide-angle camera can be fused to realize panoramic shooting and VR (Virtual Reality) shooting functions.
  • the camera assembly 1006 may also include a flash.
  • the flash can be a single color temperature flash or a dual color temperature flash. Dual color temperature flash refers to the combination of warm light flash and cold light flash, which can be used for light compensation under different color temperatures.
  • Audio circuitry 1007 is used to provide an audio interface between the user and computer device 1000 .
  • Audio circuitry 1007 may include a microphone and speakers.
  • the microphone is used to collect the sound waves of the user and the environment, convert the sound waves into electrical signals, and input them to the processor 1001 for processing, or to the radio frequency circuit 1004 to realize voice communication.
  • the microphone may also be an array microphone or an omnidirectional collection microphone.
  • the speaker is used to convert the electrical signal from the processor 1001 or the radio frequency circuit 1004 into sound waves.
  • the loudspeaker can be a traditional thin-film loudspeaker or a piezoelectric ceramic loudspeaker.
  • when the speaker is a piezoelectric ceramic speaker, it can not only convert electrical signals into sound waves audible to humans, but also convert electrical signals into sound waves inaudible to humans for purposes such as distance measurement.
  • the audio circuit 1007 may also include a headphone jack.
  • the positioning component 1008 is used to locate the current geographic location of the computer device 1000 to implement navigation or LBS (Location Based Service).
  • the positioning component 1008 may be a positioning component based on the GPS (Global Positioning System) of the United States, the BeiDou system of China, or the Galileo system of the European Union.
  • Power supply 1009 is used to power various components in computer device 1000 .
  • the power source 1009 may be alternating current, direct current, disposable batteries or rechargeable batteries.
  • the rechargeable battery may be a wired rechargeable battery or a wireless rechargeable battery. Wired rechargeable batteries are batteries that are charged through wired lines, and wireless rechargeable batteries are batteries that are charged through wireless coils.
  • the rechargeable battery can also be used to support fast charging technology.
  • the computer device 1000 also includes one or more sensors 1010 .
  • the one or more sensors 1010 include, but are not limited to, an acceleration sensor 1011 , a gyro sensor 1012 , a pressure sensor 1013 , a fingerprint sensor 1014 , an optical sensor 1015 and a proximity sensor 1016 .
  • the acceleration sensor 1011 can detect the magnitude of acceleration on the three coordinate axes of the coordinate system established by the computer device 1000 .
  • the acceleration sensor 1011 can be used to detect the components of the gravitational acceleration on the three coordinate axes.
  • the processor 1001 can control the touch display screen 1005 to display the user interface in a landscape view or a portrait view according to the gravitational acceleration signal collected by the acceleration sensor 1011 .
  • the acceleration sensor 1011 can also be used for game or user movement data collection.
  • the gyroscope sensor 1012 can detect the body direction and rotation angle of the computer device 1000 , and the gyroscope sensor 1012 can cooperate with the acceleration sensor 1011 to collect the 3D actions of the user on the computer device 1000 .
  • the processor 1001 can implement the following functions according to the data collected by the gyro sensor 1012 : motion sensing (such as changing the UI according to the user's tilt operation), image stabilization during shooting, game control, and inertial navigation.
  • the pressure sensor 1013 may be disposed on the side frame of the computer device 1000 and/or on the lower layer of the touch display screen 1005 .
  • when the pressure sensor 1013 is disposed on the side frame of the computer device 1000, it can detect the user's holding signal on the computer device 1000, and left/right-hand identification or shortcut operations can be performed according to the holding signal.
  • when the pressure sensor 1013 is disposed on the lower layer of the touch display screen 1005, the operability controls on the UI can be controlled according to the user's pressure operation on the touch display screen 1005.
  • the operability controls include at least one of button controls, scroll bar controls, icon controls, and menu controls.
  • the fingerprint sensor 1014 is used to collect the user's fingerprint to identify the user's identity according to the collected fingerprint.
  • when the user's identity is identified as a trusted identity, the processor 1001 authorizes the user to perform relevant sensitive operations, including unlocking the screen, viewing encrypted information, downloading software, making payments, and changing settings.
  • Fingerprint sensor 1014 may be provided on the front, back, or side of computer device 1000 .
  • when a physical button or a manufacturer logo is provided on the computer device 1000, the fingerprint sensor 1014 can be integrated with the physical button or the manufacturer logo.
  • the optical sensor 1015 is used to collect ambient light intensity.
  • the processor 1001 may control the display brightness of the touch display screen 1005 according to the ambient light intensity collected by the optical sensor 1015 . Specifically, when the ambient light intensity is high, the display brightness of the touch display screen 1005 is increased; when the ambient light intensity is low, the display brightness of the touch display screen 1005 is decreased.
  • the processor 1001 may also dynamically adjust the shooting parameters of the camera assembly 1006 according to the ambient light intensity collected by the optical sensor 1015 .
  • Proximity sensor 1016, also referred to as a distance sensor, is typically provided on the front of computer device 1000.
  • Proximity sensor 1016 is used to collect the distance between the user and the front of computer device 1000 .
  • when the proximity sensor 1016 detects that the distance between the user and the front of the computer device 1000 gradually decreases, the processor 1001 controls the touch display screen 1005 to switch from the bright-screen state to the off-screen state; when the proximity sensor 1016 detects that the distance between the user and the front of the computer device 1000 gradually increases, the processor 1001 controls the touch display screen 1005 to switch from the off-screen state to the bright-screen state (a sensor-handling sketch follows this section).
  • those skilled in the art can understand that the structure shown in FIG. 10 does not constitute a limitation on the computer device 1000, which may include more or fewer components than shown, combine some components, or adopt a different component arrangement.
  • the present application also provides a computer device. The computer device includes a processor and a memory, the memory stores at least one piece of program code, and the program code is loaded and executed by the processor to implement the music screening method provided by the above method embodiments.
  • the present application also provides a computer-readable storage medium, where at least one piece of program code is stored in the storage medium, and the program code is loaded and executed by a processor to implement the music screening method provided by the above method embodiments.
  • references herein to "a plurality" mean two or more.
  • "And/or" describes the association relationship of associated objects and indicates that three kinds of relationships may exist; for example, "A and/or B" can mean that A exists alone, that A and B exist at the same time, or that B exists alone.
  • the character “/” generally indicates that the associated objects are an "or" relationship.
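As a rough illustration of the sensor-driven display behavior described in this section (ambient-light-based brightness and proximity-based screen state), the sketch below uses invented thresholds and function names; it is not a platform API, only a reading of the two bullets above.

```python
def brightness_from_ambient_light(lux, current_brightness):
    """Raise brightness under strong ambient light, lower it under weak light
    (the lux thresholds and step size are illustrative assumptions)."""
    if lux > 500:
        return min(1.0, current_brightness + 0.1)
    if lux < 50:
        return max(0.1, current_brightness - 0.1)
    return current_brightness

def screen_state_from_proximity(previous_mm, current_mm, state):
    """Turn the screen off while the measured distance decreases, and back on
    once it increases again."""
    if current_mm < previous_mm:
        return "off"
    if current_mm > previous_mm:
        return "on"
    return state

print(brightness_from_ambient_light(800, 0.5))       # -> 0.6
print(screen_state_from_proximity(120, 40, "on"))    # -> "off"
```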

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Multimedia (AREA)
  • Data Mining & Analysis (AREA)
  • Databases & Information Systems (AREA)
  • Electrophonic Musical Instruments (AREA)

Abstract

一种音乐筛选方法、装置、设备及介质,涉及互联网技术领域。该方法包括:显示音乐筛选界面,音乐筛选界面上显示有二维坐标系(302),二维坐标系的第一维度对应第一音乐属性,二维坐标系的第二维度对应第二音乐属性,第一音乐属性与第二音乐属性是不同的音乐属性;响应于二维坐标系上的触发操作,获取触发操作的触发位置(304);及,显示筛选出的目标音乐,目标音乐的第一音乐属性与触发位置在第一维度的坐标对应,目标音乐的第二音乐属性与触发位置在第二维度的坐标对应(306)。

Description

音乐筛选方法、装置、设备及介质
本申请要求于2020年11月26日提交中国专利局、申请号202011343928.1、申请名称为“音乐筛选方法、装置、设备及介质”的中国专利申请的优先权。
技术领域
本申请涉及互联网技术领域,特别涉及一种音乐筛选方法、装置、设备及介质。
发明背景
随着网络技术的发展,用户可以通过不同的应用程序搜索各类音乐资源。由于网络中的音乐资源的数量增长速度极快,导致用户搜索自身喜欢的音乐需要耗费较长的时间和精力,因此用户需要借助音乐播放客户端中的筛选功能以获取相关音乐资源。
相关技术中,应用程序中的筛选功能通常是通过需要用户在筛选栏中输入目标音乐的关键词,从而筛选出部分候选音乐集合,由用户在候选音乐集合中进行选择出目标音乐。
然而采用上述方法进行的音乐筛选,往往需要用户提供较为详细的关键信息,无法满足用户对目标音乐的某一特定属性的要求,导致筛选结果无法满足用户的模糊偏好和模糊需求。
发明内容
本申请实施例提供了一种音乐筛选方法、装置、设备及介质,通过二维坐标系上的触发操作,能够实现音乐的模糊筛选。所述技术方案至少包括如下技术方案:
根据本申请的一个方面,提供了一种音乐筛选方法,该方法包括:
显示音乐筛选界面,音乐筛选界面上显示有二维坐标系,二维坐标系的第一维度对应第一音乐属性,二维坐标系的第二维度对应第二音乐属性,第一音乐属性与第二音乐属性是不同的音乐属性;
响应于二维坐标系上的触发操作,获取触发操作的触发位置;及,
显示筛选出的目标音乐,目标音乐的第一音乐属性与触发位置在第一维度的坐标对应,目标音乐的第二音乐属性与触发位置在第二维度的坐标对应。
根据本申请的一个方面,提供了一种音乐筛选装置,该装置包括:
显示模块,用于显示音乐筛选界面,音乐筛选界面上显示有二维坐标系,二维坐标系的第一维度对应第一音乐属性,二维坐标系的第二维度对应第二音乐属性,第一音乐属性与第二音乐属性是不同的音乐属性;
获取模块,用于响应于二维坐标系上的触发操作,获取触发操作的触发位置;
其中,显示模块,还用于显示筛选出的目标音乐,目标音乐的第一音乐属性与触发位置在第一维度的坐标对应,目标音乐的第二音乐属性与触发位置在第二维度的坐标对应。
根据本申请的一个方面,提供了一种计算机设备,该计算机设备包括处理器和存储器,存储器中存储有至少一条程序代码,程序代码由处理器加载并执行如上的音乐筛选方法。
根据本申请的一个方面,提供了一种计算机可读存储介质,该计算机可读存储介质中存储有至少一条程序代码,程序代码由处理器加载并执行以实现如上的音乐筛选方法。
根据本申请的一个方面,提供了一种计算机程序产品或计算机程序,该计算机程序产品或计算机程序包括计算机指令,该计算机指令存储在计算机可读存储介质中。计算机设备的处理器从计算机可读存储介质读取该计算机指令,处理器执行该计算机指令,使得该计算机设备执行如上的音乐筛选方法。
附图简要说明
为了更清楚地说明本申请实施例中的技术方案,下面将对实施例描述中所需要使用的附图作简单地介绍,显而易见地,下面描述中的附图仅仅是本申请的一些实施例,对于本领域普通技术人员来讲,在不付出创造性劳动的前提下,还可以根据这些附图获得其他的附图。
图1是本申请一个示例性实施例提供的音乐筛选方法的界面变化示意图;
图2是本申请一个示例性实施例提供的计算机系统的框图;
图3是本申请一个示例性实施例提供的音乐筛选方法的流程图;
图4a是本申请一个示例性实施例提供的音乐筛选界面的界面示意图;
图4b是本申请另一个示例性实施例提供的音乐筛选界面的界面示意图;
图5是本申请又一个示例性实施例提供的音乐筛选界面的界面示意图;
图6是本申请一个示例性实施例提供的音乐筛选方法的流程图;
图7是本申请一个示例性实施例提供的旋律定位的步骤流程图;
图8是本申请一个示例性实施例提供的音乐筛选方法的技术流程图;
图9是本申请一个示例性实施例提供的音乐筛选装置的结构示意图;
图10是本申请一个示例性实施例提供的计算机设备的结构示意图。
实施方式
为使本申请的目的、技术方案和优点更加清楚,下面将结合附图对本申请实施方式作进一步地详细描述。
为便于理解,以下对本申请涉及的名词做出解释:
音乐筛选界面:是指呈现于用户面前,用于进行音乐筛选和/或结果显示的程序界面。
二维坐标系:是指在同一平面上有公共原点的两条数轴构成的坐标系,通常具有两个维度。
音乐属性:乐器经过敲击、摩擦、吹奏等方式发出的声音和/或人的声音形成音乐。音乐属性是指形成的音乐中所具有的特殊性质,主要涉及两方面,分别 是音乐的旋律特征、音乐中使用的乐器的类型。
属性标签:是指根据前述音乐属性,对音乐进行分类后标注的分类标记。一个音乐至少具有一个音乐属性的属性标签,一个音乐的一个音乐属性所具有的属性标签的个数不限。比如,一个音乐具有节拍和乐器类型两类音乐属性的属性标签,其中的乐器类型属性具有琵琶、古筝和萧三个乐器类型的属性标签。
本申请实施例提供了一种音乐筛选方法,通过在二维坐标系上进行的触发操作,显示筛选出的目标音乐,减少了用户进行音乐筛选的时间和步骤,给予用户根据模糊需求进行音乐筛选的可能性,增强了用户的体验感。
示意性的如图1所示,音乐筛选界面110上显示有二维坐标系111,比如该二维坐标系111是直角坐标系,响应于二维坐标系111上的触发操作,显示目标音乐。
示意性的,音乐筛选界面110上显示有搜索栏控件和二维坐标系111。用户可以在搜索栏控件中进行输入操作,通过输入关键信息实现音乐筛选;或者,用户通过在二维坐标系111上的点击操作进行音乐筛选。以二维坐标系111是直角坐标系为例,用户在直角坐标系中的某一坐标点上进行双击操作,程序界面从音乐筛选界面110跳转到音乐播放界面120。在音乐播放界面120上显示有歌曲“歌曲A”,该歌曲处于播放状态。也即,用户在直角坐标系上进行双击操作后,直接播放目标音乐。
音乐的筛选过程中,通过用户与终端进行人机交互确定筛选出的目标音乐。通常情况下,用户需要借助筛选工具,通过筛选工具上进行的操作,确定所需要的音乐,终端根据用户操作显示目标音乐。本申请实施例提供了一种基于二维坐标系的音乐筛选交互方案,通过用户在二维坐标系上的触发操作,显示筛选出的目标音乐,可以满足用户的模糊筛选需求。
图2示出了本申请一个示例性实施例提供的计算机系统的框图。该计算机系统200包括服务器210、终端220、云端数据库230和本地数据库240。
服务器210可以是一台服务器,或者是若干台服务器组成的服务器集群,或者是一个虚拟化平台,或者是一个云计算服务中心。示意性的,服务器210可以是为音乐类应用程序提供后台支持的服务器,服务器120可以由一个或多个功能单元组成。
多个终端220通过无线或有线网络与服务器210连接。
终端220上安装和运行有音乐类应用程序,该应用程序具有支持音乐筛选的功能,该应用程序可以是音乐播放程序、视频播放程序、电台播放程序或者其他音乐类应用程序。终端220可以是电脑、智能手机、平板电脑、电子书阅读器、MP3播放器、MP4播放器、膝上型便携计算机、台式计算机、智能电视、智能车载、智能设备中的至少一种。
云端数据库230和本地数据库240通过无线或有线网络与服务器210连接,用于存储音乐的相关数据,比如音乐名称、音乐时长、音乐演唱者信息、音乐属性的类型以及每个音乐的属性标签。
本申请实施例提供的音乐筛选方法,为用户进行音乐筛选提供了模糊查询的便利。示意性的如图3所示的流程图,该方法的执行主体为计算机设备,例如,为运行有音乐类应用程序的终端,如图2中的终端220,该方法包括如下步骤:
步骤302:显示音乐筛选界面。
示意性的,音乐筛选界面上显示有二维坐标系,二维坐标系的第一维度对应第一音乐属性,二维坐标系的第二维度对应第二音乐属性。其中,第一音乐属性与第二音乐属性是不同的音乐属性。
音乐筛选界面是指进行音乐筛选和/或结果显示的程序界面,以音乐类应用程序为例,音乐筛选界面可以是筛选功能界面、搜索功能界面、查询功能界面、识别功能界面中的至少一种。示意性的,音乐筛选界面中至少显示有二维坐标系,该二维坐标系用于用户进行音乐筛选的相关操作。可选的,音乐筛选界面中还可以显示有搜索栏控件、查询栏控件、音频识别控件中的至少一种。示意性的如图1所示,音乐筛选界面110中处显示有二维坐标系111之外,还显示有搜索栏控件,该搜索栏控件上可以进行输入操作。也即,用户在音乐筛选界面110中进行的音乐筛选,可以在二维坐标系111上,也可以在搜索栏控件上。
二维坐标系是指在同一平面上有公共原点的两条数轴构成的坐标系,通常具有两个维度。本申请实施例中涉及的二维坐标系,包括极坐标系和直角坐标系中的至少一种。
极坐标系是指在平面内由极点、极轴和极径组成的坐标系,包括半径维度和方位维度。在平面上确定一点O,称为极点;从极点O出发引出一条射线Ox,称为极轴;再确定一个单位长度OP,称为极径。在一示例中,规定以极轴为界,取逆时针方向为正。由此,在平面内的任一点的位置可以用线段OP的长度ρ、极轴Ox与线段OP的角度θ来确定,有序数对(ρ,θ)称为P点的极坐标,ρ为P点的半径坐标,θ为P点的方位坐标。
示意性的,如图4a所示,音乐筛选界面410中显示有二维坐标系411,该二维坐标系411是极坐标系。极坐标系内有滑动按键412,以半径单位是1为例,该滑动按键412所在位置的半径坐标为3,方位坐标为154.29°,故该滑动案件412的极坐标应记为(3,154.29°)。
直角坐标系是指在平面内由两条互相垂直且有公共原点的数轴组成的坐标,包括x轴维度和y轴维度。直角坐标系所在平面称为坐标平面,公共原点称为直角坐标系的原点,两条数轴称为x轴和y轴。其中,x轴和y轴将坐标平面分为四个象限,象限以数轴为界,从右上角的象限逆时针方向分别为第一象限、第二象限、第三象限和第四象限。
示意性的,如图5所示,音乐筛选界面510上显示有二维坐标系511,该二维坐标系511是直角坐标系。直角坐标系中有滑动按键512,以x轴维度和y轴维度的单位均为1为例,该滑动按键512所在位置的x轴坐标为-2,y轴坐标为1,故该滑动按键512的直角坐标记为(-2,1)。
示意性的,音乐筛选界面中可以隐藏二维坐标系的部分内容。二维坐标系包括第一维度数轴、第二维度数轴、原点、第一维度单位、第二维度单位、多个象限区域、网格线、圆环线中的至少一种。音乐筛选界面中可以根据用户的需要,隐藏二维坐标系的部分内容。
比如,音乐筛选界面中显示有二维坐标系,该二维坐标系是极坐标系,极坐标系包括方位维度数轴、原点、半径维度单位、方位维度单位。其中,半径维度单位仅显示“慢节奏”和“快节奏”,方位维度单位显示“音阶一”、“音阶二”、“音阶三”、“音阶四”、“音阶五”、“音阶六”和“音阶七”。也即,音乐筛选界面中隐藏了极坐标系的圆环线和部分半径维度单位。
再如,音乐筛选界面中显示有二维坐标系,该二维坐标系是直角坐标系,直角坐标系包括x轴数轴、y轴数轴、原点、x轴维度单位“快节奏”和“一般节奏”、y轴维度单位“高音”和“中音”、第一象限区域。也即,音乐筛选界面中隐藏了直角坐标系的网格线、部分x轴维度单位、y轴维度单位、第二象限区域、第三象限区域和第四象限区域。
乐器经过敲击、摩擦、吹奏等方式发出的声音和/或人的声音形成音乐。示意性的,本申请实施例涉及的音乐,包括但不限于如下中的至少一种:歌曲、歌单、音频、视频、电台。
音乐属性是指形成的音乐中所具有的特殊性质,主要涉及两方面内容。
一方面,涉及音乐旋律的特征,包括但不限于节拍、音阶、力度、速度、曲式、音调、音色、音域中的至少一种。其中,节拍是指强拍和弱拍的组合规律,具体是指在乐谱中每一小节的音符总长度。音阶是指在音乐中,从主音开始到主音结束,对音进行阶梯状排列后形成的调式形态,调式可理解为音乐的旋律。力度是指音乐中音的强弱程度。速度是指音乐进行的快慢。曲式是指音乐的横向组织结构。音调是指声音频率的高低。音色是指音乐中的人声和乐器声的单一使用或混合使用方式。音域是指人声和/或乐器所能达到的最低音至最高音的范围。
另一方面,涉及音乐中使用到的乐器类型,包括但不限于吹奏乐器、弹拨乐器、打击乐器、拉弦乐器、弦乐器、木管乐器、铜管乐器、键盘乐器中的至少一种。以上分类又可以细分为多个种类。比如,键盘乐器可分为钢琴、管风琴、手风琴、电子琴等;又如,弹拨乐器可分为琵琶、筝、扬琴、七弦琴、冬不拉、柳琴、阮等。对于乐器类型的具体种类,以上仅为示意性举例,可根据实际需要进行调整,本申请在此不做限定。
示意性的,本申请实施例中涉及的音乐属性包括:
节拍、音阶、力度、速度、曲式、音调、音色、音域中的至少两种;
或,乐器类型中的至少两种;
或,节拍、音阶、力度、速度、曲式、音调、音色、音域中的至少一种和乐器类型中的至少一种。
根据前述内容,二维坐标系中包括两个维度,分别是第一维度和第二维度。其中,第一维度对应第一音乐属性,第二维度对应第二音乐属性,第一音乐属性和第二音乐属性是不同的音乐属性。比如,第一维度对应节拍属性,第二维度对应音调属性;又如,第一维度对应弹拨乐器的乐器类型,第二维度对应键盘乐器的乐器类型;又如,第一维度对应音阶属性,第二维度对应打击乐器的乐器类型。
示意性的,如图4a所示,音乐筛选界面410上显示有二维坐标系411,该二维坐标系411是极坐标系。其中,极坐标的半径维度对应节拍属性,方位维度对应音阶属性。
示意性的,如图5所示,音乐筛选界面510上显示有二维坐标系511,该二维坐标系511是直角坐标系。其中,直角坐标系的x轴维度对应节拍属性,y轴维度对应音域属性。示意性的,二维坐标系中还可以显示维度相关的其他元素,比如在第一维度和/或第二维度的下方显示维度的辅助信息。
示意性的,如图5所示,二维坐标系511是直角坐标系,该直角坐标系中显示有x轴维度单位“慢节奏”和“快节奏”,y轴维度单位“低音”和“高音”。同时,在上述四个维度单位下分别显示有四种乐器名称,以帮助用户理解该维度单位。比如,x轴维度单位“慢节奏”下显示的“小提琴”,用于提示用户,慢节奏的节拍与小提琴演奏的旋律类似。
步骤304:响应于二维坐标系上的触发操作,获取触发操作的触发位置。
示意性的,二维坐标系上的触发操作包括但不限于如下操作中的至少一种:在二维坐标系范围内的滑动操作、触摸操作、单击操作、双击操作。
触发操作的触发位置是指触发操作位于二维坐标系上的详细坐标。
以终端是触摸屏为例,运用压力触控技术和/或悬浮触控技术,终端可以获取到用户在触摸屏上的触发操作的触发位置。用户在触摸屏上进行触摸时,触控芯片获取触摸事件,并将触摸事件上报给终端的处理器,处理器根据上报的触摸事件获取触发位置。
示意性的,触摸事件包括Touch Start事件、Touch Move事件和Touch End事件。其中,Touch Start事件用于指示手指在触摸屏上的触摸坐标,Touch Move事件用于指示手指在触摸屏上的连续滑动时的连续触摸坐标,Touch End事件用于指示手指从触摸屏上离开的触摸坐标,上述触摸坐标由触摸感应器根据用户在触摸屏上的触摸位置获取。
示意性的,如图4a所示,二维坐标系411是极坐标系。其中,极坐标的半径维度是节拍维度,方位维度是音阶维度。该极坐标系上显示有滑动按键412,触发位置是指滑动按键412在极坐标系上的坐标。示意性的,滑动按键412可在二维坐标系411的范围内进行滑动,滑动按键412是用于确定触发操作的具体 位置的标志,也可以不显示于二维坐标系411内。响应于二维坐标系411上的点击操作,获取点击操作在滑动按键412处的触发位置为(半径坐标,方位坐标)。以节拍维度分为慢节奏、次慢节奏、一般节奏、次快节奏、快节奏的五个节拍等级,音阶维度分为音阶一、音阶二、音阶三、音阶四、音阶五、音阶六、音阶七的七个音阶等级为例,获取到的触发位置的坐标为(一般节奏,音阶四)。
根据前述内容,二维坐标系包括极坐标系和直角坐标系中的至少一种。基于此,步骤304可采用如下两种可选方式中的至少一种:
一、二维坐标系是极坐标系。
步骤304可包括:
响应于极坐标系上的触发操作,获取触发位置的半径坐标和方位坐标;
将半径坐标确定为触发位置在第一维度的坐标,将方位坐标确定为触发位置在第二维度的坐标。
示意性的,如图4a所示,二维坐标系411是极坐标。其中,第一维度是半径维度,第二维度是方位维度,滑动按键412所在位置即为触发操作的触发位置。根据图4a所示,半径维度对应节拍属性,方位维度对应音阶属性。以节拍维度分为慢节奏、次慢节奏、一般节奏、次快节奏、快节奏的五个节拍等级,音阶维度分为音阶一、音阶二、音阶三、音阶四、音阶五、音阶六、音阶七的七个音阶等级为例,响应于滑动按键412处的点击操作,获取滑动按键412所在位置的半径坐标为一般节奏,方位坐标为音阶四;将一般节奏确定为滑动按键412所在位置在半径维度的坐标,将音阶四确定为滑动按键412所在位置在方位维度的坐标。
二、二维坐标系是直角坐标系。
步骤304可包括:
响应于直角坐标系上的触发操作,获取触发位置的x轴坐标和y轴坐标;
将x轴坐标确定为触发位置在第一维度的坐标,将y轴坐标确定为触发位置在第二维度的坐标。
示意性的,如图5所示,二维坐标系511是直角坐标。其中,第一维度是x轴维度,第二维度是y轴维度,滑动按键512所在位置即为触发操作的触发位置。根据图5所示,x轴维度对应节拍维度,y轴维度对应音域维度。以节拍维度分为-4等级节奏、-3等级节奏、-2等级节奏、-1等级节奏、0等级节奏、1等级节奏、2等级节奏、3等级节奏和4等级节奏的9个节拍等级,音域维度分为4等级音域、-3等级音域、-2等级音域、-1等级音域、0等级音域、1等级音域、2等级音域、3等级音域和4等级音域的9个音域等级为例。比如,响应于滑动按键512处的点击操作,获取滑动按键512所在位置的x轴坐标为-2等级节奏,y轴坐标为1等级音域;将-2等级节奏确定为滑动按键512所在位置在x轴维度的坐标,将1等级音域确定为滑动按键512所在位置在y轴维度的坐标。
步骤306:显示筛选出的目标音乐。
示意性的,目标音乐的第一音乐属性与触发位置在第一维度的坐标对应,目标音乐的第二音乐属性与触发位置在第二维度的坐标对应。
目标音乐是指终端通过筛选后选出的,具有与触发位置在二维坐标系中的坐标相对应的属性标签的音乐。示意性的,目标音乐可以是一个或多个。在存在多个目标音乐时,可以采用歌单、电台、列表等形式表示。
示意性的,目标音乐所属的程序界面是音乐筛选界面或者第二程序界面,第二程序界面与音乐筛选界面是不同的界面。示例性的,第二程序界面是音乐播放界面,或者是其他功能界面。
示意性的,如图4a所示,响应于音乐筛选界面410中极坐标系上的触发操作,显示目标歌曲和目标歌单,其中目标歌曲包括“歌曲1”和“歌曲2”,目标歌单包括“歌单1”和“歌单2”,该目标歌曲和目标歌单所属的程序界面是音乐筛选界面410。
可选的,音乐筛选界面上还可以显示目标音乐的其他显示元素,比如,音乐筛选界面上显示出部分候选音乐的相关信息。示意性的,如图4b所示,在滑动按键412滑动至该坐标点时,在滑动按键412所在的位置的周边显示候选音乐的列表项,列表项内包括“歌曲1”、“歌曲2”、“歌单1”和“歌单2”;另外,用户还可以通过触发“…”展开列表项,比如,点击“...”展开列表项。响应于音乐筛选界面410中极坐标系上的触发操作,比如双击操作,在音乐筛选界面410的底部显示音乐播放控件,同时播放歌曲1。
示意性的,步骤306包括如下步骤:
根据触发位置在第一维度的坐标,确定第一属性标签;
根据触发位置在第二维度的坐标,确定第二属性标签;
在音乐库中筛选出第一音乐属性具有第一属性标签,且第二音乐属性具有第二属性标签的第一音乐,作为目标音乐,第一属性标签和第二属性标签是不同的属性标签。
属性标签是指,根据音乐属性对每个音乐进行分类后标注的分类标记。示意性的,一个音乐至少具有一个音乐属性的属性标签,一个音乐的一个音乐属性所具有的属性标签的个数不限。比如,音乐1具有慢节奏、音阶二、音阶三、古筝、琵琶四个属性标签,其中慢节奏属于节拍属性的属性标签,音阶二和音阶三属于音阶属性的属性标签,古筝和琵琶属于乐器类型属性中的弹奏乐器类型的属性标签。
音乐库是指用于储存音乐相关信息的数据库,音乐库中至少包括一个候选音乐。终端通过在音乐库中进行音乐的筛选,最终确定筛选出的目标音乐。示意性的,音乐库可以保存在本地数据库中,或者保存在云端数据库中,或者是在本地数据库和云端数据库中均有保存。比如,终端首先在本地数据库中的音乐库1进行筛选后,未找到目标音乐;随后,终端在云端数据库中的音乐库2进行筛选后找到目标音乐。
触发位置在二维坐标系中对应一个坐标点,通过该坐标点可以获取到对应的两个不同的音乐属性的属性标签。也即,根据触发位置在第一维度的坐标得到第一属性标签,根据触发位置在第二维度的坐标得到第二属性标签。依据第一属性标签和第二属性标签,终端可以在音乐库中筛选出至少一个目标音乐,并将目标音乐进行显示。示意性的,根据获取到的第一属性标签和第二属性标签,终端在音乐库中筛选出第一音乐,第一音乐的第一音乐属性具有第一属性标签,第二音乐属性具有第二属性标签。
示意性的,如图5所示,二维坐标系511是直角坐标系,用户在直角坐标系上进行触发操作的位置是滑动按键512所在的位置。响应于直角坐标系上的触发操作,获取触发操作的触发位置为(-2等级节拍,1等级音域)。根据触发操作的触发位置,可以确定,触发位置的第一属性标签为-2等级节拍,第二属性标签为1等级音域。终端在音乐库中进行筛选,找到歌曲“歌曲A”具有“-2等级节拍”、“音阶三”、“1等级音域”、“钢琴”、“古筝”的属性标签,终端将歌曲“歌曲A”确定为目标歌曲,并从音乐筛选界面510跳转音乐播放界面520显示歌曲“歌曲A”。
在音乐筛选的过程中,并非任何时候均能筛选出第一音乐。因此,本申请实施例提供的音乐筛选方法中,在不存在第一音乐的情况下,步骤306还包括如下步骤:
在所述音乐库中筛选出第二音乐作为所述目标音乐,其中,所述第二音乐的第一音乐属性所具有的属性标签与所述第一属性标签的距离最近,所述第二音乐的第二音乐属性具有所述第二属性标签;或,在所述音乐库中筛选出第三音乐作为所述目标音乐,其中,所述第三音乐的第一音乐属性具有所述第一属性标签,所述第三音乐的第二音乐属性所具有的属性标签与所述第二属性标签的距离最近;或,在所述音乐库中筛选出第四音乐作为所述目标音乐,其中,所述第四音乐的第一音乐属性所具有的属性标签与所述第一属性标签的距离最近,所述第四音乐的第二音乐属性所具有的属性标签与所述第二属性标签的距离最近。
示意性的,第一音乐属性具有的属性标签与第一属性标签的距离是指,第二音乐的第一音乐属性所具有的属性标签与第一属性标签的距离。其余的距离的定义与之类似,在此不再赘述。在音乐筛选界面中,根据第一音乐属性和第二音乐属性确定的第一维度和第二维度构成二维坐标系,在二维坐标系中至少存在一个坐标点对应有一个候选音乐。示意性的,该距离可以是通过二维坐标系中的数学坐标点之间的距离决定的,或者是根据其他预设的距离比较规则决定的。
比如,在直角坐标系中,触发位置的坐标为(1,2),候选音乐的坐标为(3,5),则第一属性标签为1,第二属性标签为2,依据数学坐标点的距离计算方法,候选音乐的第一音乐属性与第一属性标签的距离为2,第二音乐属性与第二属性标签的距离为3。
又如,在极坐标系中,触发位置的坐标为(快节奏,音阶五),候选音乐的 坐标为(快节奏,音阶一),则第一属性标签为快节奏,第二属性标签为音阶五,根据预设的音阶距离比较规则,候选音乐的第一音乐属性具有第一属性标签,第二音乐属性与第二属性标签的距离是四个音阶。
综上所述,本申请实施例提供的音乐筛选方法,通过在二维坐标系上进行的触发操作,可以显示筛选出的目标音乐,实现了音乐的模糊筛选,减少了用户进行音乐筛选所耗费的时间和精力,使得筛选出的目标音乐能够满足用户的模糊筛选需求。
示意性的如图6所示,本申请实施例提供了一种在音乐库中对候选音乐标记属性标签的方法,以及一种生成二维坐标系的方法。该方法的执行主体为计算机设备,例如,为运行有音乐类应用程序的终端,如图2中的终端220,方法包括如下步骤:
步骤601:根据音乐属性,标记音乐库中的各个候选音乐的属性标签。
根据前述内容,音乐属性主要涉及两方面内容。一方面涉及音乐旋律的特征,包括但不限于节拍、音阶、力度、速度、曲式、音调、音色、音域中的至少一种;一方面涉及音乐中使用到的乐器类型,包括但不限于吹奏乐器、弹拨乐器、打击乐器、拉弦乐器、弦乐器、木管乐器、铜管乐器、键盘乐器中的至少一种。
示意性的,本申请实施例中涉及的音乐属性包括:
节拍、音阶、力度、速度、曲式、音调、音色、音域中的至少两种;
或,乐器类型中的至少两种;
或,节拍、音阶、力度、速度、曲式、音调、音色、音域中的至少一种和乐器类型中的至少一种。
属性标签是指根据音乐属性,对每个音乐进行分类后标注的分类标记。一个音乐至少具有一个音乐属性的属性标签,一个音乐的一个音乐属性所具有的属性标签的个数不限。
音乐库中包括有至少一个候选音乐,终端在进行音乐筛选时,将根据触发位置获取到的第一属性标签和第二属性标签,与音乐库中的候选音乐的属性标签进行匹配。因此,需要对音乐库中的各个候选音乐的每个音乐属性均需要进行属性标签的分类。
示意性的,如图7所示,可以采用旋律定位技术作为标记属性标签的基础,根据候选音乐的音频波形信息,运用测频的方法,基于音高显著度进行旋律提取,获取相关音乐属性的信息。旋律识别技术的执行主体为计算机设备,例如,为运行有音乐类应用程序的终端,如图2中的终端220,包括如下步骤:
步骤701:对输入的音乐内容进行预处理。
步骤702:对预处理后的音乐内容进行时频变换及谱处理。
步骤703:通过音高显著度函数进行计算。
步骤704:对音高进行跟踪。
步骤705:旋律定位。
通过时频变换及谱处理,将预处理后的音乐内容转化为频率信息,再通过音高显著度函数对频率信息进行计算,将频率信息计算为不同的音高,对不同的音高进行跟踪,最终实现旋律定位。
示意性的,标记属性标签的基础还可以是其他分类方法和/或技术,比如简单加权方法。
根据音乐属性进行的属性标签的标记,可以有多种标记规则,以下仅为本申请提供的几种示例性实施例,终端可根据实际需要进行调整:
1、标记节拍属性的属性标签。
在音乐属性包括节拍的情况下,步骤601包括:
根据候选音乐的节拍特征,标记候选音乐的节拍标签为五个节拍等几种的一种。示意性的,五个节拍等级包括慢节奏、次慢节奏、一般节奏、次快节奏和快节奏。
示意性的,将节拍特征分为弱、次弱、一般、次强、强的五种类型,分别对应慢节奏、次慢节奏、一般节奏、次快节奏、快节奏的五个节拍等级。
在候选音乐的节拍特征符合弱节拍类型的情况下,标记候选音乐的节拍标签为慢节奏;或,在候选音乐的节拍特征符合次弱节拍类型的情况下,标记候选音乐的节拍标签为次慢节奏;或,在候选音乐的节拍特征符合一般节拍类型的情况下,标记候选音乐的节拍标签为一般节奏;或,在候选音乐的节拍特征符合次强节拍类型的情况下,标记候选音乐的节拍标签为次快节奏;或,在候选音乐的节拍特征符合强节拍类型的情况下,标记候选音乐的节拍标签为快节奏。
另外,节拍等级的分类也可以根据强拍和弱拍的组合规律进行。根据强拍和弱拍的组合规律,节拍特征可以分为1/4、2/4、3/4、4/4、3/8、6/8、7/8、9/8、12/8的八个类型,由此可以产生八个节拍等级。
2、标记音阶属性的属性标签。
在音乐属性包括音阶的情况下,步骤601包括:
根据候选音乐中的音符的出现次数,标记候选音乐的音阶标签为七个音阶等级中的至少一种。示意性的,七个音阶等级包括音阶一、音阶二、音阶三、音阶四、音阶五、音阶六、音阶七。
示意性的,音符包括1音符、2音符、3音符、4音符、5音符、6音符、7音符的七个音符,分别对应音阶一、音阶二、音阶三、音阶四、音阶五、音阶六、音阶七的七个音阶等级。
在候选音乐中的1音符的出现次数最多的情况下,标记候选音乐的音阶标签为音阶一;或,在候选音乐中的2音符的出现次数最多的情况下,标记候选音乐的音阶标签为音阶二;或,在候选音乐中的3音符的出现次数最多的情况下,标记候选音乐的音阶标签为音阶三;或,在候选音乐中的4音符的出现次数最多的情况下,标记候选音乐的音阶标签为音阶四;或,在候选音乐中的5音符的 出现次数最多的情况下,标记候选音乐的音阶标签为音阶五;或,在候选音乐中的6音符的出现次数最多的情况下,标记候选音乐的音阶标签为音阶六;或,在候选音乐中的7音符的出现次数最多的情况下,标记候选音乐的音阶标签为音阶七。
示意性的,在候选音乐的第一音符和第二音符的出现次数相同,且均大于剩余音符的出现次数的情况下,对应第一音符和第二音符,标记候选音乐的属性标签。比如,候选音乐中的3音符、4音符和5音符的出现次数相同,且均大于剩余音符,标记候选音乐的音阶标签为音阶三、音阶四、音阶五。
3、标记音调属性的属性标签。
在音乐属性包括音调的情况下,步骤601包括:
根据候选音乐的音频特征,将候选音乐的音调标签标记为七个音调等级中的一种。示意性的,七个音调等级包括极低频、低频、中低频、中频、中高频、高频、极高频。
音调主要由声音的频率决定,因此受到乐器类型和人声的综合频率的影响。示意性的,根据声音频率的高低,在候选音乐的音乐频率属于极低频的情况下,标记候选音乐的音调标签为极低频;或,在候选音乐的音乐频率属于低频的情况下,标记候选音乐的音调标签为低频;或,在候选音乐的音乐频率属于中低频的情况下,标记候选音乐的音调标签为中低频;或,在候选音乐的音乐频率属于中频的情况下,标记候选音乐的音调标签为中频;或,在候选音乐的音乐频率属于中高频的情况下,标记候选音乐的音调标签为中高频;或,在候选音乐的音乐频率属于高频的情况下,标记候选音乐的音调标签为高频;或,在候选音乐的音乐频率属于极高频的情况下,标记候选音乐的音调标签为极高频。
4、标记乐器类型属性的属性标签。
在音乐属性包括乐器类型的情况下,步骤601包括:
确定候选音乐中出现的乐器以及乐器的出现时长;
在候选音乐中的第一乐器的出现时长大于剩余乐器的出现时长的情况下,标记候选音乐的乐器类型标签为第一乐器;
或,在候选音乐中的第一乐器与第二乐器的出现时长相同,且第一乐器与第二乐器的出现时长均大于剩余乐器的出现时长的情况下,标记候选音乐的乐器类型标签为第一乐器和第二乐器。
比如,某一候选音乐中出现的乐器有钢琴、手风琴、琵琶、吉他、小鼓、长笛、双簧管,其出现时长分别是35秒、22秒、35秒、27秒、12秒、19秒、26秒。由于钢琴和琵琶的出现时长均为35秒,且大于剩余乐器的出现时长,故将该候选音乐的乐器类型标签标记为钢琴和琵琶。
步骤602:显示选择功能界面。
示意性的,选择功能界面上显示有至少两个维度字符串,维度字符串根据音乐属性生成。
维度字符串用于生成二维坐标系,通过在维度字符串上的触发操作实现。根据前述内容,二维坐标的第一维度对应第一音乐属性,第二维度对应第二音乐属性,因此,维度字符串的数量不小于音乐属性的数量,且至少包括两个音乐属性对应的字符串。比如,维度字符串包括字符串“节拍”、字符串“音阶”、字符串“音调”、字符串“曲式”、字符串“打击乐器”、字符串“键盘乐器”、字符串“吹奏乐器”。
步骤603:响应于维度字符串上的选择操作,生成二维坐标系。
示意性的,维度字符串上的选择操作包括但不限于如下操作中的至少一种:在维度字符串上进行的单击操作、双击操作、拖动操作。
根据前述内容,二维坐标系包括极坐标系和直角坐标系中的至少一种。因此,步骤603至少包括如下步骤:
响应于第一维度字符串上的第一选择操作,生成半径维度;响应于第二维度字符串上的第二选择操作,生成方位维度;根据半径维度和方位维度,生成极坐标系;
或,响应于第一维度字符串上的第一选择操作,生成x轴维度;响应于第二维度字符串上的第二选择操作,生成y轴维度,根据x轴维度和y轴维度,生成直角坐标系。
比如,第一维度字符串是字符串“节拍”,第二维度字符串是字符串“音阶”。单击字符串“节拍”生成半径维度,单击字符串“音阶”生成方位维度,由此生成极坐标系。
又如,第一维度字符串是字符串“打击乐器”,第二维度字符串是字符串“音域”。双击字符串“打击乐器”生成x轴维度,双击字符串“音域”生成y轴维度,由此生成直角坐标系。
又如,第一维度字符串是字符串“曲式”,第二维度字符串是字符串“音调”。拖动字符串“音调”到指定位置生成半径维度,拖动字符串“曲式”到指定位置生成方位坐标,由此生成极坐标系。
步骤604:显示音乐筛选界面。
示意性的,音乐筛选界面上显示有二维坐标系,二维坐标系的第一维度对应第一音乐属性,二维坐标系的第二维度对应第二音乐属性。其中,第一音乐属性与第二音乐属性是不同的音乐属性。
步骤605:响应于二维坐标系上的触发操作,获取触发操作的触发位置。
示意性的,二维坐标系上的触发操作包括但不限于如下操作中的至少一种:在二维坐标系范围内的滑动操作、触摸操作、单击操作、双击操作。
步骤606:显示筛选出的目标音乐。
示意性的,目标音乐的第一音乐属性与触发位置在第一维度的坐标对应,目标音乐的第二音乐属性与触发位置在第二维度的坐标对应。
步骤604、步骤605和步骤606与步骤302、步骤304和步骤306相同,可 作参考,在此不再赘述。
示意性的如图8所示,本申请实施例提供了音乐筛选方法的技术流程图,具体如下步骤:
步骤801:服务器进行歌曲录入。
服务器对音乐库中的各个候选音乐进行音乐属性的分类,并标记各个候选音乐的属性标签,将标记后的歌曲进行录入。服务器一方面将分类资源上传至云端数据库,以便后续的再次归类、多次筛选和数据调用;一方面将属性标签的数据保存在本地数据库,以便终端随时调用。
步骤802:终端生成二维坐标系。
步骤803:终端向服务器发送第一维度和第二维度的相关信息。
步骤804:服务器进行二维坐标系解析。
根据选择功能界面上显示的至少两个维度字符串,用户依据自身需要从中选择两个维度字符串,分别生成第一维度和第二维度,终端根据第一维度和第二维度生成二维坐标系。
服务器接收到终端发送的二维坐标系中的第一维度和第二维度,进行二维坐标系解析。服务器根据第一维度和第二维度,调用音乐库中的数据,音乐库保存在云端数据库和/或本地数据库中。服务器将符合第一维度和第二维度的候选音乐导入到二维坐标系中,保证二维坐标系中存在至少一个坐标点对应一个候选音乐。
步骤805:终端在二维坐标系上进行触发操作。
步骤806:终端向服务器发送触发位置的坐标。
响应于二维坐标系上进行的触发操作,终端获取到触发操作的触发位置,并将触发位置的坐标发送至服务器,也即,将触发位置在二维坐标系上的第一维度的坐标和第二维度的坐标发送至服务器。
步骤807:服务器进行音乐筛选。
步骤808:服务器向终端发送筛选出的目标音乐。
步骤809:终端显示目标音乐。
服务器获取到触发位置的坐标后开始进行音乐筛选。服务器依据触发位置的坐标在音乐库中进行搜索,在搜索到第一候选音乐后,判断第一候选音乐的第一音乐属性与触发位置在第一维度的坐标是否匹配,以及第二音乐属性与触发位置在第二维度的坐标是否匹配。若不匹配,则以触发位置的坐标为中心在二维坐标系的范围内向四周扩大搜索。
在服务器搜索到匹配的候选音乐后,也即搜索到目标音乐,服务器向终端发送筛选出的目标音乐,终端对目标音乐进行显示。
综上所述,本申请实施例提供的音乐筛选方法,通过对候选音乐的属性标签进行标记,将每个候选音乐的音乐属性进行细化分类,使得音乐筛选的依据更为 准确,降低了音乐的筛选粒度。另外,本申请实施例提供的音乐筛选方法,通过提供维度字符串的方式,使得用户可以根据自己的需求选择二维坐标系的维度,也即,用户可以根据自己的需求进行特定音乐属性的模糊筛选,满足了用户的使用需求,一定程度上增强了用户的使用体验感。
示意性的如图9所示,本申请提供了一种音乐筛选装置,该装置可以通过软件、硬件或者两者的结合实现成为终端的全部或一部分。该音乐筛选装置包括:标记模块920、显示模块940、生成模块960和获取模块980。
标记模块920,用于根据音乐属性,标记音乐库中的各个候选音乐的属性标签。
显示模块940,用于显示选择功能界面,选择功能界面上显示有至少两个维度字符串,维度字符串根据音乐属性生成。
生成模块960,用于响应于维度字符串上的选择操作,生成二维坐标系。
所述显示模块940,还用于显示音乐筛选界面,音乐筛选界面上显示有二维坐标系,二维坐标系的第一维度对应第一音乐属性,二维坐标系的第二维度对应第二音乐属性,第一音乐属性与第二音乐属性是不同的音乐属性。
获取模块980,用于响应于二维坐标系上的触发操作,获取触发操作的触发位置。
所述显示模块940,还用于显示筛选出的目标音乐,目标音乐的第一音乐属性与触发位置在第一维度的坐标对应,目标音乐的第二音乐属性与触发位置在第二维度的坐标对应。
在本申请的一种可以实现的方式中,所述二维坐标系是极坐标系,所述获取模块980,还用于:响应于极坐标系上的触发操作,获取触发位置的半径坐标和方位坐标;将半径坐标确定为触发位置在第一维度的坐标,将方位坐标确定为触发位置在第二维度的坐标。
在本申请的一种可以实现的方式中,所述二维坐标系是直角坐标系,所述获取模块980,还用于:响应于直角坐标系上的触发操作,获取触发位置的x轴坐标和y轴坐标;将x轴坐标确定为触发位置在第一维度的坐标,将y轴坐标确定为触发位置在第二维度的坐标。
在本申请的一种可以实现的方式中,所述显示模块940,还用于:根据触发位置在第一维度的坐标,确定第一属性标签;根据触发位置在第二维度的坐标,确定第二属性标签;在音乐库中筛选出第一音乐属性具有第一属性标签,且第二音乐属性具有第二属性标签的第一音乐,作为目标音乐,第一属性标签和第二属性标签是不同的属性标签。
在本申请的一种可以实现的方式中,所述显示模块940,还用于:在不存在所述第一音乐的情况下,在所述音乐库中筛选出第二音乐作为所述目标音乐,其中,所述第二音乐的第一音乐属性所具有的属性标签与所述第一属性标签的距离最近, 且所述第二音乐的第二音乐属性具有所述第二属性标签。
在本申请的一种可以实现的方式中,所述显示模块940,还用于:在不存在所述第一音乐的情况下,在所述音乐库中筛选出第三音乐作为所述目标音乐,其中,所述第三音乐的第一音乐属性具有所述第一属性标签,且所述第三音乐的第二音乐属性所具有的属性标签与所述第二属性标签的距离最近。
在本申请的一种可以实现的方式中,所述显示模块940,还用于:在不存在所述第一音乐的情况下,在所述音乐库中筛选出第四音乐作为所述目标音乐,其中,所述第四音乐的第一音乐属性所具有的属性标签与所述第一属性标签的距离最近,且所述第四音乐的第二音乐属性所具有的属性标签与所述第二属性标签的距离最近。
在本申请的一种可以实现的方式中,音乐属性包括节拍、音阶、力度、速度、曲式、音调、音色、音域中的至少两种;或,乐器类型中的至少两种;或,节拍、音阶、力度、速度、曲式、音调、音色、音域中的至少一种和乐器类型中的至少一种。
在本申请的一种可以实现的方式中,所述音乐属性包括节拍,所述标记模块920,还用于:根据候选音乐的节拍特征,标记候选音乐的节拍标签为五个节拍等级中的一种,五个节拍等级包括慢节奏、次慢节奏、一般节奏、次快节奏和快节奏中的一种。
在本申请的一种可以实现的方式中,所述音乐属性包括音阶,所述标记模块920,还用于:根据候选音乐中的音符的出现次数,标记候选音乐的音阶标签为七个音阶等级中的至少一种,七个音阶等级包括音阶一、音阶二、音阶三、音阶四、音阶五、音阶六、音阶七中的至少一种。
在本申请的一种可以实现的方式中,所述音乐属性包括乐器类型,所述标记模块920,还用于:确定候选音乐中出现的乐器以及乐器的出现时长;在候选音乐中的第一乐器的出现时长大于剩余乐器的出现时长的情况下,标记候选音乐的乐器类型标签性为第一乐器。
在本申请的一种可以实现的方式中,所述音乐属性包括乐器类型,所述标记模块920,还用于:确定候选音乐中出现的乐器以及乐器的出现时长;在候选音乐中的第一乐器与第二乐器的出现时长相同,且第一乐器与第二乐器的出现时长均大于剩余乐器的出现时长的情况下,标记候选音乐的乐器类型标签为第一乐器和第二乐器。
在本申请的一种可以实现的方式中,所述生成模块960,还用于:响应于第一维度字符串上的第一选择操作,生成半径维度;响应于第二维度字符串上的第二选择操作,生成方位维度;根据半径维度和方位维度,生成极坐标系。
在本申请的一种可以实现的方式中,所述生成模块960,还用于:响应于第一维度字符串上的第一选择操作,生成x轴维度;响应于第二维度字符串上的第二选择操作,生成y轴维度,根据x轴维度和y轴维度,生成直角坐标系。
下面是对本申请应用的计算机设备进行说明,图10其示出了本申请一个示例性实施例提供的计算机设备1000的结构框图。该计算机设备1000可以是便携式移动终端,比如:智能手机、平板电脑、MP3播放器(Moving Picture Experts Group Audio Layer III,动态影像专家压缩标准音频层面3)、MP4(Moving Picture Experts Group Audio Layer IV,动态影像专家压缩标准音频层面4)播放器。计算机设备1000还可能被称为用户设备、便携式终端等其他名称。
通常,计算机设备1000包括有:处理器1001和存储器1002。
处理器1001可以包括一个或多个处理核心,比如4核心处理器、8核心处理器等。处理器1001可以采用DSP(Digital Signal Processing,数字信号处理)、FPGA(Field-Programmable Gate Array,现场可编程门阵列)、PLA(Programmable Logic Array,可编程逻辑阵列)中的至少一种硬件形式来实现。处理器1001也可以包括主处理器和协处理器,主处理器是用于对在唤醒状态下的数据进行处理的处理器,也称CPU(Central Processing Unit,中央处理器);协处理器是用于对在待机状态下的数据进行处理的低功耗处理器。在一些实施例中,处理器1001可以在集成有GPU(Graphics Processing Unit,图像处理器),GPU用于负责显示屏所需要显示的内容的渲染和绘制。一些实施例中,处理器1001还可以包括AI(Artificial Intelligence,人工智能)处理器,该AI处理器用于处理有关机器学习的计算操作。
存储器1002可以包括一个或多个计算机可读存储介质,该计算机可读存储介质可以是有形的和非暂态的。存储器1002还可包括高速随机存取存储器,以及非易失性存储器,比如一个或多个磁盘存储设备、闪存存储设备。在一些实施例中,存储器1002中的非暂态的计算机可读存储介质用于存储至少一个指令,该至少一个指令用于被处理器1001所执行以实现本申请中提供的生成影集视频的方法。
在一些实施例中,计算机设备1000还可包括有:外围设备接口1003和至少一个外围设备。具体地,外围设备包括:射频电路1004、触摸显示屏1005、摄像头组件1006、音频电路1007、定位组件1008和电源1009中的至少一种。
外围设备接口1003可被用于将I/O(Input/Output,输入/输出)相关的至少一个外围设备连接到处理器1001和存储器1002。在一些实施例中,处理器1001、存储器1002和外围设备接口1003被集成在同一芯片或电路板上;在一些其他实施例中,处理器1001、存储器1002和外围设备接口1003中的任意一个或两个可以在单独的芯片或电路板上实现,本实施例对此不加以限定。
射频电路1004用于接收和发射RF(Radio Frequency,射频)信号,也称电磁信号。射频电路1004通过电磁信号与通信网络以及其他通信设备进行通信。射频电路1004将电信号转换为电磁信号进行发送,或者,将接收到的电磁信号转换为电信号。可选地,射频电路1004包括:天线系统、RF收发器、一个或多个放大器、调谐器、振荡器、数字信号处理器、编解码芯片组、用户身份模块卡 等等。射频电路1004可以通过至少一种无线通信协议来与其他终端进行通信。该无线通信协议包括但不限于:万维网、城域网、内联网、各代移动通信网络(2G,或3G,或4G,或5G,或它们的组合)、无线局域网和/或WiFi(Wireless Fidelity,无线保真)网络。在一些实施例中,射频电路1004还可以包括NFC(Near Field Communication,近距离无线通信)有关的电路,本申请对此不加以限定。
触摸显示屏1005用于显示UI(User Interface,用户界面)。该UI可以包括图形、文本、图标、视频及其他们的任意组合。触摸显示屏1005还具有采集在触摸显示屏1005的表面或表面上方的触摸信号的能力。该触摸信号可以作为控制信号输入至处理器1001进行处理。触摸显示屏1005用于提供虚拟按钮和/或虚拟键盘,也称软按钮和/或软键盘。在一些实施例中,触摸显示屏1005可以为一个,设置计算机设备1000的前面板;在另一些实施例中,触摸显示屏1005可以为至少两个,分别设置在计算机设备1000的不同表面或呈折叠设计;在一些实施例中,触摸显示屏1005可以是柔性显示屏,设置在计算机设备1000的弯曲表面上或折叠面上。甚至,触摸显示屏1005还可以设置成非矩形的不规则图形,也即异形屏。触摸显示屏1005可以采用LCD(Liquid Crystal Display,液晶显示器)、OLED(Organic Light-Emitting Diode,有机发光二极管)等材质制备。
摄像头组件1006用于采集图像或视频。可选地,摄像头组件1006包括前置摄像头和后置摄像头。通常,前置摄像头用于实现视频通话或自拍,后置摄像头用于实现照片或视频的拍摄。在一些实施例中,后置摄像头为至少两个,分别为主摄像头、景深摄像头、广角摄像头中的任意一种,以实现主摄像头和景深摄像头融合实现背景虚化功能,主摄像头和广角摄像头融合实现全景拍摄以及VR(Virtual Reality,虚拟现实)拍摄功能。在一些实施例中,摄像头组件1006还可以包括闪光灯。闪光灯可以是单色温闪光灯,也可以是双色温闪光灯。双色温闪光灯是指暖光闪光灯和冷光闪光灯的组合,可以用于不同色温下的光线补偿。
音频电路1007用于提供用户和计算机设备1000之间的音频接口。音频电路1007可以包括麦克风和扬声器。麦克风用于采集用户及环境的声波,并将声波转换为电信号输入至处理器1001进行处理,或者输入至射频电路1004以实现语音通信。出于立体声采集或降噪的目的,麦克风可以为多个,分别设置在计算机设备1000的不同部位。麦克风还可以是阵列麦克风或全向采集型麦克风。扬声器则用于将来自处理器1001或射频电路1004的电信号转换为声波。扬声器可以是传统的薄膜扬声器,也可以是压电陶瓷扬声器。当扬声器是压电陶瓷扬声器时,不仅可以将电信号转换为人类可听见的声波,也可以将电信号转换为人类听不见的声波以进行测距等用途。在一些实施例中,音频电路1007还可以包括耳机插孔。
定位组件1008用于定位计算机设备1000的当前地理位置,以实现导航或LBS(Location Based Service,基于位置的服务)。定位组件1008可以是基于美国的GPS(Global Positioning System,全球定位系统)、中国的北斗系统或俄罗 斯的伽利略系统的定位组件。
电源1009用于为计算机设备1000中的各个组件进行供电。电源1009可以是交流电、直流电、一次性电池或可充电电池。当电源1009包括可充电电池时,该可充电电池可以是有线充电电池或无线充电电池。有线充电电池是通过有线线路充电的电池,无线充电电池是通过无线线圈充电的电池。该可充电电池还可以用于支持快充技术。
在一些实施例中,计算机设备1000还包括有一个或多个传感器1010。该一个或多个传感器1010包括但不限于:加速度传感器1011、陀螺仪传感器1012、压力传感器1013、指纹传感器1014、光学传感器1015以及接近传感器1016。
加速度传感器1011可以检测以计算机设备1000建立的坐标系的三个坐标轴上的加速度大小。比如,加速度传感器1011可以用于检测重力加速度在三个坐标轴上的分量。处理器1001可以根据加速度传感器1011采集的重力加速度信号,控制触摸显示屏1005以横向视图或纵向视图进行用户界面的显示。加速度传感器1011还可以用于游戏或者用户的运动数据的采集。
陀螺仪传感器1012可以检测计算机设备1000的机体方向及转动角度,陀螺仪传感器1012可以与加速度传感器1011协同采集用户对计算机设备1000的3D动作。处理器1001根据陀螺仪传感器1012采集的数据,可以实现如下功能:动作感应(比如根据用户的倾斜操作来改变UI)、拍摄时的图像稳定、游戏控制以及惯性导航。
压力传感器1013可以设置在计算机设备1000的侧边框和/或触摸显示屏1005的下层。当压力传感器1013设置在计算机设备1000的侧边框时,可以检测用户对计算机设备1000的握持信号,根据该握持信号进行左右手识别或快捷操作。当压力传感器1013设置在触摸显示屏1005的下层时,可以根据用户对触摸显示屏1005的压力操作,实现对UI界面上的可操作性控件进行控制。可操作性控件包括按钮控件、滚动条控件、图标控件、菜单控件中的至少一种。
指纹传感器1014用于采集用户的指纹,以根据采集到的指纹识别用户的身份。在识别出用户的身份为可信身份时,由处理器1001授权该用户执行相关的敏感操作,该敏感操作包括解锁屏幕、查看加密信息、下载软件、支付及更改设置等。指纹传感器1014可以被设置计算机设备1000的正面、背面或侧面。当计算机设备1000上设置有物理按键或厂商Logo时,指纹传感器1014可以与物理按键或厂商Logo集成在一起。
光学传感器1015用于采集环境光强度。在一个实施例中,处理器1001可以根据光学传感器1015采集的环境光强度,控制触摸显示屏1005的显示亮度。具体地,当环境光强度较高时,调高触摸显示屏1005的显示亮度;当环境光强度较低时,调低触摸显示屏1005的显示亮度。在另一个实施例中,处理器1001还可以根据光学传感器1015采集的环境光强度,动态调整摄像头组件1006的拍摄参数。
接近传感器1016,也称距离传感器,通常设置在计算机设备1000的正面。接近传感器1016用于采集用户与计算机设备1000的正面之间的距离。在一个实施例中,当接近传感器1016检测到用户与计算机设备1000的正面之间的距离逐渐变小时,由处理器1001控制触摸显示屏1005从亮屏状态切换为息屏状态;当接近传感器1016检测到用户与计算机设备1000的正面之间的距离逐渐变大时,由处理器1001控制触摸显示屏1005从息屏状态切换为亮屏状态。
本领域技术人员可以理解,图10中示出的结构并不构成对计算机设备1000的限定,可以包括比图示更多或更少的组件,或者组合某些组件,或者采用不同的组件布置。
本申请还提供了一种计算机设备,该计算机设备包括处理器和存储器,该存储器中存储有至少一条程序代码,该程序代码由处理器加载并执行以实现上述各方法实施例提供的音乐筛选方法。
本申请还提供了一种计算机可读存储介质,该存储介质中存储有至少一条程序代码,该程序代码由处理器加载并执行以实现上述各方法实施例提供的音乐筛选方法。
应当理解的是,在本文中提及的“多个”是指两个或两个以上。“和/或”,描述关联对象的关联关系,表示可以存在三种关系,例如,A和/或B,可以表示:单独存在A,同时存在A和B,单独存在B这三种情况。字符“/”一般表示前后关联对象是一种“或”的关系。
本领域普通技术人员可以理解实现上述实施例的全部或部分步骤可以通过硬件来完成,也可以通过程序来指令相关的硬件完成,所述的程序可以存储于一种计算机可读存储介质中,上述提到的存储介质可以是只读存储器,磁盘或光盘等。
以上所述仅为本申请的可选实施例,并不用以限制本申请,凡在本申请的精神和原则之内,所作的任何修改、等同替换、改进等,均应包含在本申请的保护范围之内。

Claims (23)

  1. 一种音乐筛选方法,由计算机设备执行,所述方法包括:
    显示音乐筛选界面,所述音乐筛选界面上显示有二维坐标系,所述二维坐标系的第一维度对应第一音乐属性,所述二维坐标系的第二维度对应第二音乐属性,所述第一音乐属性与所述第二音乐属性是不同的音乐属性;
    响应于所述二维坐标系上的触发操作,获取所述触发操作的触发位置;及,
    显示筛选出的目标音乐,所述目标音乐的第一音乐属性与所述触发位置在所述第一维度的坐标对应,所述目标音乐的第二音乐属性与所述触发位置在所述第二维度的坐标对应。
  2. 根据权利要求1所述的音乐筛选方法,其中,所述二维坐标系是极坐标系,所述响应于所述二维坐标系上的触发操作,获取所述触发操作的触发位置包括:
    响应于所述极坐标系上的触发操作,获取所述触发位置的半径坐标和方位坐标;
    将所述半径坐标确定为所述触发位置在所述第一维度的坐标,将所述方位坐标确定为所述触发位置在所述第二维度的坐标。
  3. 根据权利要求1所述的音乐筛选方法,其中,所述二维坐标系是直角坐标系,所述响应于所述二维坐标系上的触发操作,获取所述触发操作的触发位置包括:
    响应于所述直角坐标系上的触发操作,获取所述触发位置的x轴坐标和y轴坐标;
    将所述x轴坐标确定为所述触发位置在所述第一维度的坐标,将所述y轴坐标确定为所述触发位置在所述第二维度的坐标。
  4. 根据权利要求1至3任一所述的音乐筛选方法,其中,所述显示筛选出的目标音乐,包括:
    根据所述触发位置在所述第一维度的坐标,确定第一属性标签;
    根据所述触发位置在所述第二维度的坐标,确定第二属性标签;
    在音乐库中筛选出所述第一音乐属性具有所述第一属性标签,且所述第二音乐属性具有所述第二属性标签的第一音乐,作为所述目标音乐,所述第一属性标签和所述第二属性标签是不同的属性标签。
  5. 根据权利要求4所述的方法,其中,所述方法还包括:
    在不存在所述第一音乐的情况下,在所述音乐库中筛选出第二音乐作为所述目标音乐,其中,所述第二音乐的第一音乐属性所具有的属性标签与所述第一属性标签的距离最近,且所述第二音乐的第二音乐属性具有所述第二属性标签。
  6. 根据权利要求4所述的方法,其中,所述方法还包括:
    在不存在所述第一音乐的情况下,在所述音乐库中筛选出第三音乐作为所述目标音乐,其中,所述第三音乐的第一音乐属性具有所述第一属性标签,且所述第三音乐的第二音乐属性所具有的属性标签与所述第二属性标签的距离最近。
  7. 根据权利要求4所述的方法,其中,所述方法还包括:
    在不存在所述第一音乐的情况下,在所述音乐库中筛选出第四音乐作为所述目标音乐,其中,所述第四音乐的第一音乐属性所具有的属性标签与所述第一属性标签的距离最近,且所述第四音乐的第二音乐属性所具有的属性标签与所述第二属性标签的距离最近。
  8. 根据权利要求1至3任一所述的音乐筛选方法,其中,所述音乐属性包括:
    节拍、音阶、力度、速度、曲式、音调、音色、音域中的至少两种;
    或,
    乐器类型中的至少两种;
    或,
    所述节拍、所述音阶、所述力度、所述速度、所述曲式、所述音调、所述音色、所述音域中的至少一种和所述乐器类型中的至少一种。
  9. 根据权利要求8所述的音乐筛选方法,其中,所述方法还包括:
    根据所述音乐属性,标记音乐库中的各个候选音乐的属性标签。
  10. 根据权利要求9所述的音乐筛选方法,其中,所述音乐属性包括节拍,所述根据所述音乐属性,标记音乐库中的各个候选音乐的属性标签,包括:
    根据所述候选音乐的节拍特征,标记所述候选音乐的节拍标签为五个节拍等级中的一种,所述五个节拍等级包括慢节奏、次慢节奏、一般节奏、次快节奏、快节奏。
  11. 根据权利要求9所述的音乐筛选方法,其中,所述音乐属性包括音阶,所述根据所述音乐属性,标记音乐库中的各个候选音乐的属性标签,包括:
    根据所述候选音乐中音符的出现次数,标记所述候选音乐的音阶标签为七个音阶等级中的至少一种,所述七个音节等级包括音阶一、音阶二、音阶三、音阶四、音阶五、音阶六、音阶七。
  12. 根据权利要求9所述的音乐筛选方法,其中,所述音乐属性包括乐器类型,所述根据所述音乐属性,标记音乐库中的各个候选音乐的属性标签,包括:
    确定所述候选音乐中出现的乐器以及所述乐器的出现时长;
    在所述候选音乐中的第一乐器的出现时长大于剩余乐器的出现时长的情况下,标记所述候选音乐的乐器类型标签为所述第一乐器。
  13. 根据权利要求9所述的音乐筛选方法,其中,所述音乐属性包括乐器类型,所述根据所述音乐属性,标记音乐库中的各个候选音乐的属性标签,包括:
    确定所述候选音乐中出现的乐器以及所述乐器的出现时长;
    在所述候选音乐中的第一乐器与第二乐器的出现时长相同,且所述第一乐器与所述第二乐器的出现时长均大于剩余乐器的出现时长的情况下,标记所述候选音乐的乐器类型标签为所述第一乐器和所述第二乐器。
  14. 根据权利要求1至3任一所述的音乐筛选方法,其中,所述方法还包括:
    显示选择功能界面,所述选择功能界面上显示有至少两个维度字符串,所述维度字符串根据音乐属性生成;
    响应于所述维度字符串上的选择操作,生成所述二维坐标系。
  15. 根据权利要求14所述的音乐筛选方法,其中,所述响应于所述维度字符串上的选择操作,生成所述二维坐标系,包括:
    响应于第一维度字符串上的第一选择操作,生成半径维度;
    响应于第二维度字符串上的第二选择操作,生成方位维度;
    根据所述半径维度和所述方位维度,生成极坐标系。
  16. 根据权利要求14所述的音乐筛选方法,其中,所述响应于所述维度字符串上的选择操作,生成所述二维坐标系,包括:
    响应于所述第一维度字符串上的第一选择操作,生成x轴维度;
    响应于所述第二维度字符串上的第二选择操作,生成y轴维度;
    根据所述x轴维度和所述y轴维度,生成直角坐标系。
  17. 一种音乐筛选装置,所述装置包括:
    显示模块,用于显示音乐筛选界面,所述音乐筛选界面上显示有二维坐标系,所述二维坐标系的第一维度对应第一音乐属性,所述二维坐标系的第二维度对应第二音乐属性,所述第一音乐属性与所述第二音乐属性是不同的音乐属性;
    获取模块,用于响应于所述二维坐标系上的触发操作,获取所述触发操作的触发位置;
    其中,所述显示模块还用于,显示筛选出的目标音乐,所述目标音乐的第一音乐属性与所述触发位置在所述第一维度的坐标对应,所述目标音乐的第二音乐属性与所述触发位置在所述第二维度的坐标对应。
  18. 根据权利要求17所述的音乐筛选装置,其中,所述显示模块用于,根据所述触发位置在所述第一维度的坐标,确定第一属性标签;根据所述触发位置在所述第二维度的坐标,确定第二属性标签;在音乐库中筛选出所述第一音乐属性具有所述第一属性标签,且所述第二音乐属性具有所述第二属性标签的第一音乐,作为所述目标音乐,所述第一属性标签和所述第二属性标签是不同的属性标签。
  19. 根据权利要求17所述的音乐筛选装置,其中,所述音乐属性包括:
    节拍、音阶、力度、速度、曲式、音调、音色、音域中的至少两种;
    或,
    乐器类型中的至少两种;
    或,
    所述节拍、所述音阶、所述力度、所述速度、所述曲式、所述音调、所述音色、所述音域中的至少一种和所述乐器类型中的至少一种。
  20. 根据权利要求19所述的音乐筛选装置,其中,所述装置还包括:
    标记模块,用于根据所述音乐属性,标记音乐库中的各个候选音乐的属性标签。
  21. 根据权利要求17所述的音乐筛选装置,其中,所述显示模块还用于,显示选择功能界面,所述选择功能界面上显示有至少两个维度字符串,所述维度字符串根据音乐属性生成;
    所述装置还包括:
    生成模块,用于响应于所述维度字符串上的选择操作,生成所述二维坐标系。
  22. 一种计算机设备,所述计算机设备包括处理器和存储器,所述存储器中存储有至少一条程序代码,所述程序代码由所述处理器加载并执行以实现如权利要求1至16中任一所述的音乐筛选方法。
  23. 一种计算机可读存储介质,所述计算机可读存储介质中存储有至少一条程序代码,所述程序代码由处理器加载并执行以实现如权利要求1至16中任一所述的音乐筛选方法。
PCT/CN2021/129233 2020-11-26 2021-11-08 音乐筛选方法、装置、设备及介质 WO2022111260A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202011343928.1A CN113515209B (zh) 2020-11-26 2020-11-26 音乐筛选方法、装置、设备及介质
CN202011343928.1 2020-11-26

Publications (1)

Publication Number Publication Date
WO2022111260A1 true WO2022111260A1 (zh) 2022-06-02

Family

ID=78060663

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2021/129233 WO2022111260A1 (zh) 2020-11-26 2021-11-08 音乐筛选方法、装置、设备及介质

Country Status (2)

Country Link
CN (1) CN113515209B (zh)
WO (1) WO2022111260A1 (zh)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113515209B (zh) * 2020-11-26 2023-07-25 腾讯科技(深圳)有限公司 音乐筛选方法、装置、设备及介质
CN114302253B (zh) * 2021-11-25 2024-03-12 北京达佳互联信息技术有限公司 媒体数据处理方法、装置、设备及存储介质

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1326303A (zh) * 2000-05-25 2001-12-12 雅马哈株式会社 具有音乐合成能力的便携式通信终端装置
CN101059746A (zh) * 2005-12-20 2007-10-24 索尼株式会社 内容选择方法和内容选择装置
CN106599114A (zh) * 2016-11-30 2017-04-26 上海斐讯数据通信技术有限公司 音乐推荐方法及系统
CN113515209A (zh) * 2020-11-26 2021-10-19 腾讯科技(深圳)有限公司 音乐筛选方法、装置、设备及介质

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7750224B1 (en) * 2007-08-09 2010-07-06 Neocraft Ltd. Musical composition user interface representation
CN103092854B (zh) * 2011-10-31 2017-02-08 深圳光启高等理工研究院 一种音乐数据分类方法
WO2013144993A1 (ja) * 2012-03-26 2013-10-03 パイオニア株式会社 表示方法、選曲方法、表示装置、音響装置およびプログラム
CN105824686B (zh) * 2016-03-11 2019-03-22 中国联合网络通信集团有限公司 一种虚拟机宿主机的选择方法和选择系统
US20180032611A1 (en) * 2016-07-29 2018-02-01 Paul Charles Cameron Systems and methods for automatic-generation of soundtracks for live speech audio
CN107038198B (zh) * 2016-12-08 2020-04-07 阿里巴巴集团控股有限公司 数据的可视化处理方法及装置
US20200301962A1 (en) * 2017-12-09 2020-09-24 Shubhangi Mahadeo Jadhav System and Method For Recommending Visual-Map Based Playlists
US11481181B2 (en) * 2018-12-03 2022-10-25 At&T Intellectual Property I, L.P. Service for targeted crowd sourced audio for virtual interaction

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1326303A (zh) * 2000-05-25 2001-12-12 雅马哈株式会社 具有音乐合成能力的便携式通信终端装置
CN101059746A (zh) * 2005-12-20 2007-10-24 索尼株式会社 内容选择方法和内容选择装置
CN106599114A (zh) * 2016-11-30 2017-04-26 上海斐讯数据通信技术有限公司 音乐推荐方法及系统
CN113515209A (zh) * 2020-11-26 2021-10-19 腾讯科技(深圳)有限公司 音乐筛选方法、装置、设备及介质

Also Published As

Publication number Publication date
CN113515209A (zh) 2021-10-19
CN113515209B (zh) 2023-07-25

Similar Documents

Publication Publication Date Title
CN107978323B (zh) 音频识别方法、装置及存储介质
US9480927B2 (en) Portable terminal with music performance function and method for playing musical instruments using portable terminal
CN110556127B (zh) 语音识别结果的检测方法、装置、设备及介质
WO2022111260A1 (zh) 音乐筛选方法、装置、设备及介质
CN111524501B (zh) 语音播放方法、装置、计算机设备及计算机可读存储介质
WO2020103550A1 (zh) 音频信号的评分方法、装置、终端设备及计算机存储介质
CN109657236B (zh) 引导信息获取方法、装置、电子装置及存储介质
CN112735429B (zh) 确定歌词时间戳信息的方法和声学模型的训练方法
WO2022111168A1 (zh) 视频的分类方法和装置
CN111428079B (zh) 文本内容处理方法、装置、计算机设备及存储介质
JP2020046500A (ja) 情報処理装置、情報処理方法および情報処理プログラム
CN111625682A (zh) 视频的生成方法、装置、计算机设备及存储介质
CN110867194B (zh) 音频的评分方法、装置、设备及存储介质
CN111081277B (zh) 音频测评的方法、装置、设备及存储介质
CN109189978B (zh) 基于语音消息进行音频搜索的方法、装置及存储介质
WO2019223393A1 (zh) 生成歌词、显示歌词的方法、装置、电子设备及存储介质
CN113220590A (zh) 语音交互应用的自动化测试方法、装置、设备及介质
CN112667844A (zh) 检索音频的方法、装置、设备和存储介质
CN111933098A (zh) 伴奏音乐的生成方法、装置及计算机可读存储介质
CN108831423B (zh) 提取音频数据中主旋律音轨的方法、装置、终端及存储介质
CN112786025B (zh) 确定歌词时间戳信息的方法和声学模型的训练方法
CN111640432B (zh) 语音控制方法、装置、电子设备及存储介质
CN108763521A (zh) 存储歌词注音的方法和装置
CN113343022A (zh) 歌曲教学方法、装置、终端和存储介质
CN109635153B (zh) 迁移路径生成方法、装置及存储介质

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21896761

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

32PN Ep: public notification in the ep bulletin as address of the adressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 03.11.2023)