CN101488128B - Music search method and system based on rhythm mark - Google Patents

Music search method and system based on rhythm mark

Info

Publication number
CN101488128B
CN101488128B (Application No. CN200810000482A)
Authority
CN
China
Prior art keywords
music
rhythm
melody
mark
humming
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Fee Related
Application number
CN 200810000482
Other languages
Chinese (zh)
Other versions
CN101488128A (en)
Inventor
邓菁
朱璇
史媛媛
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Samsung Telecommunications Technology Research Co Ltd
Samsung Electronics Co Ltd
Original Assignee
Beijing Samsung Telecommunications Technology Research Co Ltd
Samsung Electronics Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Samsung Telecommunications Technology Research Co Ltd, Samsung Electronics Co Ltd filed Critical Beijing Samsung Telecommunications Technology Research Co Ltd
Priority to CN 200810000482 priority Critical patent/CN101488128B/en
Publication of CN101488128A publication Critical patent/CN101488128A/en
Application granted granted Critical
Publication of CN101488128B publication Critical patent/CN101488128B/en
Expired - Fee Related legal-status Critical Current
Anticipated expiration legal-status Critical

Abstract

The invention provides a music search method and system based on rhythm marks (melody imprints). The method comprises the following steps: a music segment to be searched is input; a humming rhythm mark is created based on the music segment to be searched and a pre-stored reference model; rhythm mark matching is performed between the created humming rhythm mark and pre-stored music rhythm marks; melody matching is performed according to the result of the rhythm mark matching; and the matching result is output.

Description

Music search method and system based on rhythm mark
Technical field
The present invention relates to a method and system for searching music by melody, and more particularly to a music search method and system based on rhythm marks, in which the music library is pre-screened using rhythm marks. This accelerates the search so that the method and system can be applied to embedded devices.
Background art
United States Patent Application Publication No. 2007/0131094 A1 concerns the use of a three-dimensional search algorithm in music retrieval; it expresses the similarity between the humming and the music in both melody and lyrics by computing a similarity score. The search is based on a triple (t, S, H), where t denotes time, S denotes an acoustic feature, and H denotes a UDS string. The system of that publication is note-based and uses a speech-recognition module, so its processing speed cannot satisfy embedded devices.
United States Patent No. 6,188,010 B1 lets the user enter the query melody in a keyboard-like manner in order to find a specific song.
United States Patent No. 6,121,530 presents a melody retrieval method for the World Wide Web. The melody is first expressed as adjacent-note pitch differences and adjacent-note duration ratios, then converted into a spectrogram, and finally the song to which the input melody belongs is decided by a distance computation.
United States Patent No. 5,874,686 mainly improves the precision of melody extraction by means of the wavelet transform. The melody is converted into adjacent-note differences, i.e. a UDS string, and the search uses a string-matching algorithm that allows at most k mismatches.
United States Patent Application Publication No. 2007/0163425 A1 mainly concerns how to perform note segmentation in a music retrieval system, using an SED indicator, pitch variation and energy variation. The matching score is determined by comparing fundamental frequency and beat.
From the prior-art music retrieval methods and systems above, it can be seen that a conventional humming search comprises the following steps: the fundamental frequency is extracted from the humming with a fundamental frequency extraction method and converted into semitones; note-based humming search methods perform note segmentation on the extracted melody, with note durations usually ignored, whereas frame-based humming search methods generally need no note segmentation; the humming and the music melodies are finally represented as adjacent-note differences, UDS strings, mean-subtracted melodies, and the like; and the melody search is performed, with typical melody search algorithms including string matching, dynamic time warping (DTW), Viterbi search, Linear Alignment and Recursive Alignment.
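For illustration only (this sketch is not part of the cited patents or of the present invention), the following Python fragment shows the kind of adjacent-note-difference / UDS string conversion referred to above; the function name and the note values are hypothetical.

```python
def to_uds(semitones):
    """Illustrative conversion of a note sequence (in semitones) to the
    UDS (Up/Down/Same) string used by several of the cited patents."""
    uds = []
    for prev, cur in zip(semitones, semitones[1:]):
        if cur > prev:
            uds.append('U')
        elif cur < prev:
            uds.append('D')
        else:
            uds.append('S')
    return ''.join(uds)

# Example with a hypothetical contour: rises, repeats, rises, falls.
print(to_uds([60, 62, 62, 64, 60]))   # -> "USUD"
```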
However, the processing speed of existing humming search methods and systems is slow, and they are not suited to embedded devices.
Therefore, there is a need for a method and system that accelerate the processing of a humming search system so that it can be applied to embedded devices (such as mobile phones and MP3 players).
Summary of the invention
According to an aspect of the present invention, a music search method and system based on rhythm marks are provided. By comparing the rhythm marks of the humming and of the music, the music is pre-screened, which reduces the scale of melody matching, improves the speed of the music search, and allows the method and system to be applied to embedded devices.
According to an aspect of the present invention, a music search method based on rhythm marks is provided, comprising the following steps: inputting a music segment to be searched; creating a humming rhythm mark based on the music segment to be searched and a pre-stored reference model; performing rhythm mark matching between the created humming rhythm mark and pre-stored music rhythm marks; performing melody matching according to the result of the rhythm mark matching; and outputting the matching result.
According to a further aspect of the present invention, a music search system based on rhythm marks is provided, comprising: an input unit that inputs the music segment to be searched; a rhythm mark creating unit that creates a humming rhythm mark based on the music segment to be searched and a reference model; a rhythm mark matching unit that performs rhythm mark matching between the humming rhythm mark created by the rhythm mark creating unit and pre-stored music rhythm marks; a melody matching unit that performs melody matching according to the matching result of the rhythm mark matching unit; and an output unit that outputs the matching result.
Description of the drawings
The above and other features and aspects of the present invention will become clearer from the following detailed description of exemplary embodiments with reference to the accompanying drawings, in which:
Fig. 1 is a block diagram of a music search system according to an exemplary embodiment of the present invention;
Fig. 2 is a flowchart of melody extraction;
Fig. 3 shows a reference model according to an exemplary embodiment of the present invention;
Fig. 4 shows the structure of the music rhythm marks of a music melody according to an exemplary embodiment of the present invention;
Fig. 5 is a flowchart of rhythm mark matching according to an exemplary embodiment of the present invention;
Fig. 6 is a flowchart of a music search method according to an exemplary embodiment of the present invention.
Detailed description of the embodiments
Exemplary embodiments of the present invention will now be described in detail, examples of which are illustrated in the accompanying drawings, wherein like reference numerals denote like elements throughout. The exemplary embodiments are described below with reference to the drawings in order to explain the present invention.
Fig. 1 is a block diagram of a music search system according to an exemplary embodiment of the present invention. As shown in Fig. 1, the music search system comprises a melody extraction unit 100, a rhythm mark creating unit 200, a rhythm mark matching unit 300, a melody matching unit 400 and an output unit 500.
The user's humming is input to the melody extraction unit 100. The melody extraction unit 100 may be the melody extraction module used in a note-based music search system, the melody extraction module used in a frame-based music search system, or any other melody extraction module. Below, the melody extraction module used in a frame-based music search system is taken as an example to explain how the melody is extracted from the user's humming. As an exemplary embodiment of the present invention, Fig. 2 shows the concrete operation by which the melody extraction unit 100 extracts the melody.
Referring to Fig. 2, in step 201 the melody extraction unit 100 extracts the fundamental frequency from the user's humming. Any prior-art fundamental frequency extraction method may be used; it is not described in detail here.
Then, in step 202, the melody extraction unit 100 may smooth the extracted fundamental frequency using a smoothing method (for example, median filtering or linear filtering) to remove errors produced by the extraction algorithm.
In step 203, the smoothed fundamental frequency is converted into semitones according to the following formula (1):

semitone = 12 × log₂(freq / 440) + 69        (1)
Because the user's humming differs to some extent from the original music (for example, in key), the resulting semitone sequence needs to be transformed. Therefore, in step 204, the melody is represented by the mean-subtracted melody and adjacent-note differences commonly used in frame-based music search systems.
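As a rough, non-normative sketch of steps 202-204 (the invention does not mandate any particular implementation), the fragment below smooths a pitch contour with a median filter, applies formula (1), and subtracts the melody mean; the function name, the kernel size, the voiced-frame handling and the use of NumPy/SciPy are assumptions made only for illustration.

```python
import numpy as np
from scipy.signal import medfilt

def pitch_to_melody(f0_hz):
    """Illustrative sketch of steps 202-204 of Fig. 2: median smoothing,
    semitone conversion via formula (1), and melody-mean subtraction."""
    f0 = np.asarray(f0_hz, dtype=float)
    f0 = f0[f0 > 0]                                  # drop unvoiced frames (assumption)
    f0 = medfilt(f0, kernel_size=5)                  # step 202: smoothing
    semitone = 12.0 * np.log2(f0 / 440.0) + 69.0     # step 203: formula (1)
    return semitone - semitone.mean()                # step 204: mean-subtracted melody
```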
The melody extraction unit 100 outputs the transformed melody to the rhythm mark creating unit 200. In addition, the reference model and the music library are input to the rhythm mark creating unit 200. In an exemplary embodiment of the present invention, the reference model is the set of melody classes 1 to N of a plurality of typical pieces of music; a reference model is shown in Fig. 3. The music library is composed of the melodies of many songs, and these melodies may be extracted from MIDI, obtained from musical scores, or extracted from files in existing music formats (such as MP3 or WMA).
In the music search system according to an exemplary embodiment of the present invention, the above description of the units and inputs is only exemplary, and the present invention is not limited thereto. For example, if the melodies of the music files (i.e. the music library described in the exemplary embodiments) or the reference model are already stored in the music search system, the music library or the reference model need not be input to the rhythm mark creating unit 200. If the music search system stores the music files themselves rather than their melodies (i.e. the music library described in the exemplary embodiments), the music library likewise need not be input; it suffices to extract the melodies of the stored music files with any prior-art melody extraction algorithm, and the set of extracted melodies is the music library referred to in the exemplary embodiments of the present invention. Furthermore, the user input may be either the user's humming or a segment of melody. If a segment of melody replaces the humming, the melody extraction unit 100 may be omitted; that is, the melody segment input by the user is fed directly to the rhythm mark creating unit 200.
The rhythm mark creating unit 200 creates the humming rhythm mark and the music rhythm marks, respectively.
The rhythm mark creating unit 200 creates the humming rhythm mark from the melody input by the melody extraction unit 100 and the reference model. Using any prior-art melody matching algorithm applicable to music search systems, the humming melody is matched against each model in the reference model, and the resulting matching results (for example, matching scores) are arranged by the sequence number of the corresponding reference model and recorded as a vector; this vector is called the "humming rhythm mark". For example, the vector [S1, S2, ..., Si, ..., SN] is used as the humming rhythm mark, where i denotes melody class i of the reference model and N denotes the number of melody classes in the reference model.
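The following minimal sketch shows how such a humming rhythm mark vector could be assembled; melody_score is merely a placeholder for whichever prior-art melody matching algorithm is chosen, and all names are illustrative assumptions.

```python
import numpy as np

def melody_score(query, ref):
    """Placeholder melody matching score (higher = more similar); the
    invention allows any prior-art melody matching algorithm here."""
    n = max(len(query), len(ref))
    q = np.interp(np.linspace(0, 1, n), np.linspace(0, 1, len(query)), query)
    r = np.interp(np.linspace(0, 1, n), np.linspace(0, 1, len(ref)), ref)
    return -float(np.linalg.norm(q - r))

def humming_rhythm_mark(humming_melody, reference_model):
    """reference_model: list of N melody-class contours (Fig. 3).
    Returns the vector [S1, ..., SN], ordered by reference class number."""
    return np.array([melody_score(humming_melody, ref) for ref in reference_model])
```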
The rhythm mark creating unit 200 creates the music rhythm marks from the input music library and reference model. Using any prior-art melody matching algorithm applicable to music search systems, each measure of each melody in the music library is matched against each model in the reference model, and the resulting matching results (for example, matching scores) are arranged by the sequence number of the corresponding reference model and recorded as a vector; the vector corresponding to each measure of each melody in the music library is called a "music rhythm mark". Fig. 4 shows the structure of the music rhythm marks corresponding to the measures of one music melody.
As shown in Fig. 4, the music library contains a plurality of music melodies, each composed of a plurality of measures. Fig. 4 shows the structure of the music rhythm marks of the current music melody in the library, which consists of M measures. In Fig. 4, Si,j denotes the matching result (for example, the matching score) of the i-th measure of the current music against the j-th melody class of the reference model, N denotes the number of melody classes in the reference model, and M denotes the number of measures of the current music.
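Continuing the sketch above, the per-measure music rhythm marks of Fig. 4 can be arranged as an M × N array; again, melody_score only stands in for the chosen melody matching algorithm, and the names are illustrative.

```python
import numpy as np

def music_rhythm_marks(measures, reference_model):
    """measures: the M per-measure melody contours of one song in the library.
    Returns an M x N array whose entry [i, j] plays the role of Si,j in
    Fig. 4: the matching score of measure i against melody class j."""
    return np.array([[melody_score(measure, ref) for ref in reference_model]
                     for measure in measures])
```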
In the music search system according to an exemplary embodiment of the present invention, the above description of the rhythm mark creating unit 200 is only exemplary, and the present invention is not limited thereto. For example, if the music library and the reference model are stored in the music search system in advance, the music rhythm marks can be created and stored in advance, so that the rhythm mark creating unit 200 does not need to create them again when music is searched.
The rhythm mark matching unit 300 matches the humming rhythm mark output by the rhythm mark creating unit 200 against each music rhythm mark.
The operation by which the rhythm mark matching unit 300 matches the humming rhythm mark against the music rhythm marks is described in detail below with reference to Fig. 5.
In step 501, the whole music library is traversed to determine whether every music melody in the library has been through rhythm mark matching.
If it is determined in step 501 that some music melodies have not yet been matched, the process proceeds to step 502, where one music melody is selected from the melodies that have not been through rhythm mark matching.
Then, in step 503, it is determined whether all music rhythm marks of the selected unmatched music melody have been through rhythm mark matching.
If some music rhythm marks have not yet been matched, the process proceeds to step 504, where one music rhythm mark is selected from the unmatched music rhythm marks of the music melody selected in step 502.
As can be seen from Fig. 4, each measure of a music melody has a corresponding music rhythm mark, so step 504 may select the music rhythm marks of the melody in turn in the order of its measures.
The process then proceeds to step 505, where the distance between two vectors, the selected music rhythm mark and the humming rhythm mark, is calculated. In the system according to an exemplary embodiment of the present invention, the distance between the two vectors is taken as an example, but the present invention is not limited thereto; the similarity between the two vectors may be calculated instead, and any vector similarity or distance criterion may be used.
Then, in step 506, it is determined whether the distance calculated in step 505 is less than a predetermined threshold. If it is, the process proceeds to step 507, where the selected music rhythm mark is stored, and then returns to step 503.
If the distance calculated in step 505 is not less than the threshold, the process returns to step 503.
If it is determined in step 503 that all music rhythm marks of the unmatched music melody selected in step 502 have been through rhythm mark matching, the process returns to step 501.
If it is determined in step 501 that all music melodies in the music library have been through music rhythm mark matching, the rhythm mark matching ends.
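A compact sketch of the screening loop of steps 501-507 is given below; the dictionary layout of the library, the Euclidean distance and the threshold handling are illustrative assumptions, not requirements of the invention.

```python
import numpy as np

def rhythm_mark_matching(humming_mark, library_marks, threshold):
    """Steps 501-507 as a sketch: library_marks maps a song id to its M x N
    array of per-measure music rhythm marks; every mark closer to the humming
    rhythm mark than the threshold is stored as a candidate (step 507)."""
    candidates = []
    for song_id, marks in library_marks.items():          # steps 501-502
        for measure_idx, mark in enumerate(marks):        # steps 503-504
            dist = np.linalg.norm(humming_mark - mark)    # step 505
            if dist < threshold:                          # step 506
                candidates.append((song_id, measure_idx)) # step 507
    return candidates   # an empty list means no matching music was found
```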
After the above operations, the rhythm mark matching unit 300 outputs the music rhythm marks stored in step 507 to the melody matching unit 400. If no music rhythm mark was stored in step 507, the rhythm mark matching unit 300 outputs the matching result (namely, that no matching music file was found) to the output unit 500.
The melody matching unit 400 uses any prior-art melody matching algorithm applicable to music search systems to match the music melodies corresponding to the received music rhythm marks against the humming melody, and outputs the melody matching result to the output unit 500. If the melody matching unit 400 finds a music melody that matches the humming melody, the output unit 500 may output the music file corresponding to that melody, report that matching music was found by voice prompt, text display or the like, or directly output the title and other information of the matching music as text. If no music matching the humming melody is found, the output unit 500 may report by voice prompt or text display that no matching music was found. The output unit 500 may be an audio player, a display, or the like.
The music search system shown in Fig. 1 is only exemplary, and the present invention is not limited thereto. The music search system according to an exemplary embodiment of the present invention may comprise other components, or the components shown in Fig. 1 may be integrated into fewer components. For example, the music search system may further comprise: an input unit through which the humming segment, the reference model, the music library and the like can be input (the input unit may be an input device such as a microphone); and a storage unit for storing the music input to the system and intermediate values it produces (for example, the humming segment, the reference model, the humming rhythm mark or the music rhythm marks), or for pre-storing the reference model, the music library, or the music rhythm marks created from the reference model and the music library.
Fig. 6 is a flowchart of a music search method based on rhythm marks according to an exemplary embodiment of the present invention. The method is described below with reference to Fig. 6.
As shown in Fig. 6, in step 601 the user's humming segment, the reference model and the music library are input. In the exemplary embodiment of the present invention, this input operation is only exemplary, and the present invention is not limited thereto. For example, if the melodies of the music files (i.e. the music library described in the exemplary embodiments) are stored in the memory of the system that uses this search method, the music library need not be input; only the user's humming segment, or the humming segment and the reference model, need be input. If the memory stores the music files themselves rather than their melodies (i.e. the music library described in the exemplary embodiments), the music library likewise need not be input; it suffices to extract the melodies of the stored music files with any prior-art melody extraction algorithm, and the set of extracted melodies is the music library referred to in the exemplary embodiments of the present invention.
Alternatively, the reference model, or the reference model and the music library, may be input only at step 603. Preferably, the reference model and the music library can be stored in advance, so that they need not be input again when music is searched.
In step 602, the melody is extracted from the input user's humming. The concrete operation of melody extraction has been described in detail above, and its detailed description is omitted here.
In step 603, the humming rhythm mark and the music rhythm marks are created. How to create the humming rhythm mark and the music rhythm marks has been described in detail with reference to the rhythm mark creating unit of Fig. 1 and is not repeated here. Preferably, if the reference model and the music library are stored in advance, the music rhythm marks created from them can also be stored in advance, so that they need not be created again when music is searched; in step 604, the created humming rhythm mark is then matched against the pre-stored music rhythm marks.
In step 604, rhythm mark matching is performed. For the concrete operation of rhythm mark matching, refer to the detailed description of the operations of Fig. 5 above.
The process then proceeds to step 605, where it is determined whether a music rhythm mark matching the humming rhythm mark exists. If it is determined in step 605 that such a music rhythm mark exists, the process proceeds to step 606, where melody matching is performed: using any prior-art melody matching algorithm applicable to music search systems, the music melodies corresponding to the music rhythm marks that match the humming rhythm mark are matched against the humming melody.
The process then proceeds to output step 607.
If it is determined in step 605 that there is no music rhythm mark matching the humming rhythm mark, the process proceeds directly to output step 607.
If it is determined in step 605 that no music rhythm mark matches the humming rhythm mark, or if no music corresponding to the user's hummed melody is found in step 606, information such as "no matching music file found" can be output by voice prompt, or displayed as text on an output unit (for example, a display screen). If, according to the melody matching result of step 606, music matching the hummed melody is found, that music can be output in step 607, or information such as "matching music file found" can be output by voice prompt or the like.
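Putting the sketches above together, one hedged illustration of the overall flow of steps 603-607 might look as follows; library_melodies, the final scoring loop and the None return for "no match" are assumptions made only for illustration, and the helper functions are the placeholders defined in the earlier sketches.

```python
def search(humming_melody, reference_model, library_marks, library_melodies,
           threshold):
    """Illustrative end-to-end flow: create the humming rhythm mark (603),
    screen the library by rhythm mark (604), then run melody matching on the
    surviving songs (606); None means 'no matching music found' (607)."""
    hum_mark = humming_rhythm_mark(humming_melody, reference_model)        # 603
    candidates = rhythm_mark_matching(hum_mark, library_marks, threshold)  # 604
    if not candidates:                                                     # 605
        return None
    best_song, best_score = None, float('-inf')
    for song_id in {song for song, _ in candidates}:                       # 606
        score = melody_score(humming_melody, library_melodies[song_id])
        if score > best_score:
            best_song, best_score = song_id, score
    return best_song                                                       # 607
```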
It should be noted that the above operation steps of the music search method based on rhythm marks are only exemplary, and the present invention is not limited thereto. For example, as mentioned for step 601, if the memory of the system using this search method stores the music files themselves rather than their melodies (i.e. the music library described in the exemplary embodiments), the music library still need not be input; the melodies of the stored music files are simply extracted with any prior-art melody extraction algorithm. In that case, step 602 extracts not only the melody of the user's humming segment but also the music melodies from the stored music files. Alternatively, the music melodies of the music files in the music library may be stored in advance, so that no music melody extraction is needed in step 602.
In addition, if the user input is a segment of melody, the search method according to an exemplary embodiment of the present invention may omit step 602; that is, no melody extraction is needed and the process goes directly from step 601 to step 603.
The music search method and system based on rhythm marks according to exemplary embodiments of the present invention may be note-based or frame-based, may use any melody representation for music retrieval, may use any music retrieval algorithm as the search algorithm, and may take either humming or a segment of melody as the user input. In addition, the present invention accelerates the processing of the music search system so that it can be applied to embedded devices (such as mobile phones and MP3 players).
Those skilled in the art will appreciate that various changes in form and detail may be made without departing from the spirit and scope of the present invention. Therefore, the exemplary embodiments described above are provided for illustrative purposes only and should not be construed as limiting the present invention. The scope of the present invention is defined by the claims.

Claims (6)

1. A music search method based on rhythm marks, comprising the following steps:
inputting a music segment to be searched;
creating a humming rhythm mark based on the music segment to be searched and a pre-stored reference model, wherein the segment to be searched is the user's humming or a segment of melody, and the reference model is a set of melody classes of a plurality of typical pieces of music;
performing rhythm mark matching between the created humming rhythm mark and pre-stored music rhythm marks;
performing melody matching according to the result of the rhythm mark matching; and
outputting the matching result,
wherein, if the segment to be searched is the user's humming, a melody is extracted from the user's humming,
the step of creating the humming rhythm mark comprises: matching the extracted melody or the input melody against each model in the reference model; and arranging the resulting matching results by the sequence number of the corresponding reference model and recording them as a vector, this vector being called the humming rhythm mark,
the step of rhythm mark matching comprises: selecting an unmatched piece of music or music melody from a music library; selecting an unmatched music rhythm mark from the selected unmatched piece of music or music melody; calculating the distance between the humming rhythm mark and the selected music rhythm mark; and determining, based on the calculated distance, whether the humming rhythm mark and the selected music rhythm mark match,
wherein the music rhythm marks are created based on the music library and the reference model, and if the music library is composed of the melodies of many songs, the step of creating the music rhythm marks further comprises:
matching each measure of each melody in the music library against each model in the reference model; and
arranging the resulting matching results by the sequence number of the corresponding reference model and recording them as a vector, the vector corresponding to each measure of each melody in the music library being called a music rhythm mark.
2. The method of claim 1, wherein the music library is composed of many songs or of the melodies of many songs.
3. The method of claim 1, wherein, if the music library is composed of many songs, the step of creating the music rhythm marks further comprises:
extracting the melody of each song in the music library;
matching each measure of the extracted melody of each song in the music library against each model in the reference model; and
arranging the resulting matching results by the sequence number of the corresponding reference model and recording them as a vector, the vector corresponding to each measure of each song in the music library being called a music rhythm mark.
4. A music search system based on rhythm marks, comprising:
an input unit that inputs a music segment to be searched;
a rhythm mark creating unit that creates a humming rhythm mark based on the music segment to be searched and a reference model, wherein the segment to be searched is the user's humming or a segment of melody, and the reference model is a set of melody classes of a plurality of typical pieces of music;
a rhythm mark matching unit that performs rhythm mark matching between the humming rhythm mark created by the rhythm mark creating unit and pre-stored music rhythm marks;
a melody matching unit that performs melody matching according to the matching result of the rhythm mark matching unit; and
an output unit that outputs the matching result,
wherein, if the segment to be searched is the user's humming, the rhythm mark creating unit extracts a melody from the user's humming, and
the rhythm mark creating unit matches the extracted melody or the input melody against each model in the reference model, arranges the resulting matching results by the sequence number of the corresponding reference model, and records them as a vector, this vector being called the humming rhythm mark,
the rhythm mark matching unit selects an unmatched piece of music or music melody from a music library, selects an unmatched music rhythm mark from the selected unmatched piece of music or music melody, calculates the distance between the humming rhythm mark and the selected music rhythm mark, and determines, based on the calculated distance, whether the humming rhythm mark and the selected music rhythm mark match,
wherein, if the music library is composed of the melodies of many songs, the rhythm mark creating unit matches each measure of each melody in the music library against each model in the reference model, arranges the resulting matching results by the sequence number of the corresponding reference model, and stores them as a vector, the vector corresponding to each measure of each melody in the music library being called a music rhythm mark.
5. The system of claim 4, wherein the music library is composed of many songs or of the melodies of many songs.
6. The system of claim 4, wherein, if the music library is composed of many songs, the rhythm mark creating unit extracts the melody of each song in the music library, matches each measure of the extracted melody of each song against each model in the reference model, arranges the resulting matching results by the sequence number of the corresponding reference model, and stores them as a vector, the vector corresponding to each measure of each song in the music library being called a music rhythm mark.
CN 200810000482 2008-01-14 2008-01-14 Music search method and system based on rhythm mark Expired - Fee Related CN101488128B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN 200810000482 CN101488128B (en) 2008-01-14 2008-01-14 Music search method and system based on rhythm mark

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN 200810000482 CN101488128B (en) 2008-01-14 2008-01-14 Music search method and system based on rhythm mark

Publications (2)

Publication Number Publication Date
CN101488128A CN101488128A (en) 2009-07-22
CN101488128B 2013-06-12

Family

ID=40891024

Family Applications (1)

Application Number Title Priority Date Filing Date
CN 200810000482 Expired - Fee Related CN101488128B (en) 2008-01-14 2008-01-14 Music search method and system based on rhythm mark

Country Status (1)

Country Link
CN (1) CN101488128B (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102522083B (en) * 2011-11-29 2014-03-05 北京百纳威尔科技有限公司 Method for searching hummed song by using mobile terminal and mobile terminal thereof
CN104484426A (en) * 2014-12-18 2015-04-01 天津讯飞信息科技有限公司 Multi-mode music searching method and system
CN105447199B (en) * 2015-12-29 2019-06-14 小米科技有限责任公司 Audio-frequency information acquisition methods and device
CN107436953B (en) * 2017-08-15 2020-07-10 中国联合网络通信集团有限公司 Music searching method and system

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5510572A (en) * 1992-01-12 1996-04-23 Casio Computer Co., Ltd. Apparatus for analyzing and harmonizing melody using results of melody analysis
CN1737798A (en) * 2005-09-08 2006-02-22 上海交通大学 Music rhythm sectionalized automatic marking method based on eigen-note
CN1737796A (en) * 2005-09-08 2006-02-22 上海交通大学 Across type rapid matching method for digital music rhythm
CN1752970A (en) * 2005-09-08 2006-03-29 上海交通大学 Leap over type high speed matching device of numerical music melody
CN1940926A (en) * 2006-03-15 2007-04-04 中国人民大学 Efficient musical database query method based on humming

Also Published As

Publication number Publication date
CN101488128A (en) 2009-07-22

Similar Documents

Publication Publication Date Title
CN100429656C (en) Music search system and music search apparatus
KR101082121B1 (en) System and method for storing and retrieving non-text-based information
CN103823867B (en) Humming type music retrieval method and system based on note modeling
CN101398827B (en) Method and device for singing search
CN105070283B (en) The method and apparatus dubbed in background music for singing voice
US7488886B2 (en) Music information retrieval using a 3D search algorithm
CN101471068B (en) Method and system for searching music files based on wave shape through humming music rhythm
CN101689225B (en) Generating music thumbnails and identifying related song structure
CN101625864B (en) Voice recognition apparatus, voice recognition method
CN100573518C (en) A kind of efficient musical database query method based on humming
JPH09293083A (en) Music retrieval device and method
CN109326280B (en) Singing synthesis method and device and electronic equipment
CN102760426A (en) Performance data search using a query indicative of a tone generation pattern
CN101740025A (en) Singing score evaluation method and karaoke apparatus using the same
CN103559309B (en) A kind of music retrieval and commending system accelerating based on GPU
CN101488128B (en) Music search method and system based on rhythm mark
JP2000187671A (en) Music retrieval system with singing voice using network and singing voice input terminal equipment to be used at the time of retrieval
CN102841932A (en) Content-based voice frequency semantic feature similarity comparative method
JPH11272274A (en) Method for retrieving piece of music by use of singing voice
Kroher et al. Computational ethnomusicology: a study of flamenco and Arab-Andalusian vocal music
JP5085577B2 (en) Playlist creation device, music playback device, playlist creation method, and playlist creation program
Ching et al. Instrument role classification: Auto-tagging for loop based music
CN112837698A (en) Singing or playing evaluation method and device and computer readable storage medium
JP2022033579A (en) Music structure analyzing device
Schuller et al. Multimodal music retrieval for large databases

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20130612

Termination date: 20160114