EP2351017B1 - Method for detecting note patterns in pieces of music - Google Patents

Method for detecting note patterns in pieces of music

Info

Publication number
EP2351017B1
EP2351017B1 (application EP09755830A)
Authority
EP
European Patent Office
Prior art keywords
list
channel
patterns
instances
pattern
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Not-in-force
Application number
EP09755830A
Other languages
German (de)
English (en)
Other versions
EP2351017A1 (fr)
Inventor
Brigitte Rafael
Stefan M. Oertl
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Individual
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Individual filed Critical Individual
Priority to EP09755830A priority Critical patent/EP2351017B1/fr
Publication of EP2351017A1 publication Critical patent/EP2351017A1/fr
Application granted granted Critical
Publication of EP2351017B1 publication Critical patent/EP2351017B1/fr
Not-in-force legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Images

Classifications

    • G PHYSICS
    • G10 MUSICAL INSTRUMENTS; ACOUSTICS
    • G10H ELECTROPHONIC MUSICAL INSTRUMENTS; INSTRUMENTS IN WHICH THE TONES ARE GENERATED BY ELECTROMECHANICAL MEANS OR ELECTRONIC GENERATORS, OR IN WHICH THE TONES ARE SYNTHESISED FROM A DATA STORE
    • G10H1/00 Details of electrophonic musical instruments
    • G10H1/0033 Recording/reproducing or transmission of music for electrophonic musical instruments
    • G10H1/0041 Recording/reproducing or transmission of music for electrophonic musical instruments in coded form
    • G10H1/0058 Transmission between separate instruments or between individual components of a musical system
    • G10H1/0066 Transmission between separate instruments or between individual components of a musical system using a MIDI interface
    • G PHYSICS
    • G10 MUSICAL INSTRUMENTS; ACOUSTICS
    • G10H ELECTROPHONIC MUSICAL INSTRUMENTS; INSTRUMENTS IN WHICH THE TONES ARE GENERATED BY ELECTROMECHANICAL MEANS OR ELECTRONIC GENERATORS, OR IN WHICH THE TONES ARE SYNTHESISED FROM A DATA STORE
    • G10H2210/00 Aspects or methods of musical processing having intrinsic musical character, i.e. involving musical theory or musical parameters or relying on musical knowledge, as applied in electrophonic musical tools or instruments
    • G10H2210/031 Musical analysis, i.e. isolation, extraction or identification of musical elements or musical parameters from a raw acoustic signal or from an encoded audio signal
    • G10H2210/061 Musical analysis, i.e. isolation, extraction or identification of musical elements or musical parameters from a raw acoustic signal or from an encoded audio signal for extraction of musical phrases, isolation of musically relevant segments, e.g. musical thumbnail generation, or for temporal structure analysis of a musical piece, e.g. determination of the movement sequence of a musical work
    • G PHYSICS
    • G10 MUSICAL INSTRUMENTS; ACOUSTICS
    • G10H ELECTROPHONIC MUSICAL INSTRUMENTS; INSTRUMENTS IN WHICH THE TONES ARE GENERATED BY ELECTROMECHANICAL MEANS OR ELECTRONIC GENERATORS, OR IN WHICH THE TONES ARE SYNTHESISED FROM A DATA STORE
    • G10H2250/00 Aspects of algorithms or signal processing methods without intrinsic musical character, yet specifically adapted for or used in electrophonic musical processing
    • G10H2250/131 Mathematical functions for musical analysis, processing, synthesis or composition
    • G10H2250/135 Autocorrelation

Definitions

  • the present invention relates to a method of recognizing similar recurring patterns of notes in a piece of music containing note sequences distributed in parallel channels.
  • the invention sets itself the goal of creating such a method.
  • the method of the invention thus takes into account, for the first time and to a significant extent, the parallel structural information of a multi-channel piece of music that may be hidden in the temporal coincidences of potential patterns ("candidate patterns") of different channels, and combines this with an assessment of the robustness of the found candidate patterns based on the similarities between their instances, their so-called "fitness".
  • "candidate patterns": potential patterns
  • the term "channel" used herein for a multi-channel piece of music is to be understood in its most general sense, i.e. both as a single (monophonic) voice of a polyphonic setting, as a (possibly polyphonic) instrumental part, such as a bass, trumpet, string, percussion or piano part, etc., and as a technical channel such as a MIDI channel, which can contain monophonic as well as polyphonic voices, parts or combinations thereof, e.g. a drum pattern, a chord progression, a string arrangement, etc.
  • the channel-related pattern recognition is thus placed on two equally weighted foundations, an identity detection and a similarity detection, for which different methods can be used in different variants.
  • an implicit combination of the two methods arises in the subsequent list evaluation by means of the self-similarity and coincidence values, because the results of the two methods compete there.
  • the method of the invention thus becomes "self-adaptive" for different types of input signals which respond differently to different types of recognition methods.
  • in step a1), the detection of identical patterns is carried out by means of the correlation matrix method known per se from Hsu Jia-Lien et al. (supra).
  • in step a1), the best-covering patterns are selected by iteratively picking the respectively most frequent and/or longest pattern from the detected patterns.
  • in step a), the segment length is varied in multiples of the beat unit of the piece of music, which limits the possible variations to a reasonable level and saves computing time. It is particularly advantageous if the segment length is varied from twice the average note duration of the piece of music up to half the length of the piece of music.
  • in step a), the determination of segments which are similar to one another is carried out by aligning the notes of two segments with one another, determining a degree of agreement of the two segments and recognizing similarity if the degree of agreement exceeds a predetermined threshold value.
  • the alignment of the notes preferably takes place by means of the "Dynamic Programming" method known per se, as described in Kilian Jürgen et al. (supra) or Hu Ning et al. (supra, with further references).
  • the self-similarity value is calculated in step b) in that, for each candidate pattern of the list, a similarity matrix of its instances is set up, the values of which are combined into the self-similarity value of the list, preferably weighted by the channel coverage of the candidate patterns of the list. It has been found that this embodiment leads to a fast and stable implementation.
  • this predetermined threshold value is adaptive, in particular a percentage of the highest self-similarity value of all lists of the channel, particularly preferably at least 70%. In a practically particularly suitable embodiment, the threshold is about 85%.
  • in step c), for a specific candidate pattern of a list, only the overlaps with those instances of the other list with which the longest overlaps in time exist are taken into account. Practical experiments have shown that this leads to a satisfactory recognition rate and simplifies the method in this step.
  • in the combination of step d), for each list of each channel only those coincidence values with respect to the lists of the other channels are taken into account which represent the highest value there, which further improves the recognition rate.
  • the coincidence values taken into account for a list are in each case summed up, and the summed coincidence values are particularly preferably multiplied by the self-similarity value of the list to obtain said total value.
  • Fig. 1 shows a section of a piece of music which contains note sequences q1, q2 and q3 (generally qp) distributed over parallel channels ch1, ch2 and ch3 (generally chp), which are shown schematically in Fig. 2.
  • the channels chp are, for example, separate MIDI channels for the various instruments or voices of the piece of music, although this is not mandatory, as explained above.
  • step a1) can optionally be dispensed with, with a correspondingly limited range of applications of the method, as discussed above.
  • Steps a1) to d) will now be described in detail.
  • in step a1), for detecting the note patterns recurring identically in a channel chp (identical "loops"), a correlation matrix according to Hsu Jia-Lien et al. (supra) is first set up for each channel chp.
  • Fig. 4 shows an example of such a correlation matrix: the first row and the first column each contain the entire note sequence of a channel in which patterns are to be detected; and only one triangle of the matrix is relevant.
  • the first entry "1" in a row means that a note in the sequence already occurs the second time ; an entry “2" means that the pattern of length 2 ("2-loop") consisting of this and the preceding note occurs a second time; the entry “3” indicates that the pattern of length 3 ("3-loop") consisting of this, the previous and the previous notes appears a second time in this line, etc.
  • a provisional list can be set according to Fig. 5 in which identity patterns m I , m II , m III , m IV , etc. found as identically identifiable, with the positions of their occurrence or occurrence in the note sequence q p , ie their so-called “instances”, as well as their length and frequency are listed.
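As an illustration of the correlation-matrix bookkeeping described above, a minimal Python sketch follows; it assumes notes have already been reduced to comparable symbols (e.g. pitch strings), and the function names are illustrative rather than taken from the patent or from Hsu Jia-Lien et al.

    def correlation_matrix(notes):
        # C[i][j] (with i < j) holds the length of the longest identical pattern
        # that ends both at position i and at position j of the note sequence.
        n = len(notes)
        C = [[0] * n for _ in range(n)]
        for i in range(n):
            for j in range(i + 1, n):
                if notes[i] == notes[j]:
                    C[i][j] = (C[i - 1][j - 1] + 1) if i > 0 else 1
        return C

    def identical_loops(notes, min_len=2):
        # Collect identically recurring patterns ("loops") together with the
        # start positions of their instances in the note sequence.
        C, n, found = correlation_matrix(notes), len(notes), {}
        for i in range(n):
            for j in range(i + 1, n):
                length = C[i][j]
                if length >= min_len:
                    pattern = tuple(notes[j - length + 1:j + 1])
                    found.setdefault(pattern, set()).update(
                        {i - length + 1, j - length + 1})
        return {p: sorted(s) for p, s in found.items()}

    # e.g. identical_loops(["C4", "E4", "G4", "C4", "E4", "G4", "A4"])
    # -> {("C4", "E4"): [0, 3], ("C4", "E4", "G4"): [0, 3]}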
  • step a1) thus yields, in each channel chp, a first list L1 of candidate patterns m1a, m1b (generally m1x) which cover the channel chp, or its note sequence qp, without overlap and, as far as possible, without gaps, see Fig. 8.
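The selection of best-covering patterns can be sketched as a greedy cover, ranking loops by frequency and then by length as described above; the data layout and the tie-breaking rule are assumptions, not the patent's exact procedure.

    def select_covering_patterns(loops, channel_length):
        # loops: dict mapping pattern (tuple of notes) -> sorted start positions.
        # Greedily place the most frequent (then longest) patterns so that the
        # channel is covered without overlaps and with as few gaps as possible.
        covered = [False] * channel_length
        selected = []
        ranked = sorted(loops.items(),
                        key=lambda kv: (len(kv[1]), len(kv[0])), reverse=True)
        for pattern, starts in ranked:
            placed = []
            for s in starts:
                span = range(s, s + len(pattern))
                if s + len(pattern) <= channel_length and \
                        all(not covered[pos] for pos in span):
                    for pos in span:
                        covered[pos] = True
                    placed.append(s)
            if placed:
                selected.append((pattern, placed))
        return selected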
  • in step a), a second approach is followed.
  • Each channel chp (or its note sequence qp) is repeatedly segmented in different ways, each time with varying segment length and segment start.
  • Fig. 9 shows five exemplary segmentation types I - V, in which the segment length is varied in multiples of the beat unit of the piece of music, i.e. the duration of one beat of the piece of music; in 4/4 time, for example, the beat unit is a quarter note.
  • segmentation types I and II shown are based on a segmentation into segments with a length of two beats, wherein in segmentation II the beginning of the segment has been offset by one beat.
  • the segmentation types III - V are based on a segment length of three beats and a successive offset of the beginning of the segment by one beat at a time.
  • the segment length is varied from twice the average note duration of the piece of music up to a maximum of half the length of the entire piece of music, since the maximum length of a note pattern can be at most half the length of the piece of music. If desired, the variation could also be terminated earlier to shorten the method, i.e. the segment length can be varied, for example, only up to a predetermined number of beats.
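A sketch of the repeated segmentation, assuming the channel has been pre-grouped into beats; the segment length runs over multiples of the beat unit between the bounds given above, and the start offset is shifted beat by beat.

    def segmentations(channel_beats, min_len_beats, max_len_beats):
        # channel_beats: the channel's notes grouped per beat (list of lists).
        # Yields (segment_length, offset, segments) for every segmentation type.
        n = len(channel_beats)
        for length in range(min_len_beats, max_len_beats + 1):
            for offset in range(length):  # shift the segment start beat by beat
                segments = [channel_beats[s:s + length]
                            for s in range(offset, n, length)]
                yield length, offset, segments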
  • the similarity of the segments Ss and St is then evaluated using a suitably chosen scoring scheme between 0% (dissimilar) and 100% (identical), for example based on the number of identical notes, the number of gaps, the pitch deviation of differing notes, etc.
  • Two segments Ss, St are then recognized as "similar" if their similarity value determined in this way is above a predetermined threshold value, preferably above 50%.
  • segments recognized as similar form the instances ii of a candidate pattern, which results from the note sequence of one (e.g. the first) of these segments.
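A minimal sketch of the alignment-based similarity measure, using a generic Needleman-Wunsch style dynamic program with placeholder scores; the actual scoring schemes of Kilian et al. and Hu et al. are not reproduced here.

    def segment_similarity(seg_s, seg_t, match=1.0, mismatch=-0.25, gap=-0.5):
        # Global alignment of two note-symbol sequences, normalised to 0..100 %.
        n, m = len(seg_s), len(seg_t)
        if max(n, m) == 0:
            return 100.0
        dp = [[0.0] * (m + 1) for _ in range(n + 1)]
        for i in range(1, n + 1):
            dp[i][0] = dp[i - 1][0] + gap
        for j in range(1, m + 1):
            dp[0][j] = dp[0][j - 1] + gap
        for i in range(1, n + 1):
            for j in range(1, m + 1):
                s = match if seg_s[i - 1] == seg_t[j - 1] else mismatch
                dp[i][j] = max(dp[i - 1][j - 1] + s,
                               dp[i - 1][j] + gap,
                               dp[i][j - 1] + gap)
        best, worst = max(n, m) * match, max(n, m) * gap
        return 100.0 * (dp[n][m] - worst) / (best - worst)

    def is_similar(seg_s, seg_t, threshold=50.0):
        # Two segments count as "similar" above the chosen threshold (cf. above).
        return segment_similarity(seg_s, seg_t) > threshold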
  • the candidate patterns thus found for a segmentation type of a channel are stored in the form of a further list L2 of candidate patterns m2a, m2b, etc. with their respective instances i1, i2, etc., see Fig. 13.
  • for each list Ln, a self-similarity value En is calculated on the basis of similarity matrices for all candidate patterns mna, mnb, etc. (generally mnx) of the list Ln.
  • Fig. 16 shows an exemplary similarity matrix for the instances i1, i2, i3 and i4 of a candidate pattern mn of the list Ln:
  • the cells of the matrix again give the degree of similarity, for example as determined by the "Dynamic Programming" alignment of step a); here, for example, the similarity between instance i1 and instance i3 is 80%.
  • the self-similarity value Enx of the candidate pattern mnx is also referred to as the "loop fitness" of the candidate pattern mnx.
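A sketch of the "loop fitness" and list self-similarity computation: mean pairwise instance similarity per candidate pattern, combined over the list weighted by channel coverage; the exact combination rule is an assumption based on the preferred embodiment described above.

    from itertools import combinations

    def loop_fitness(instances, similarity):
        # Mean pairwise similarity (0..100 %) of a candidate pattern's instances;
        # `similarity` can be e.g. segment_similarity from the sketch above.
        pairs = list(combinations(instances, 2))
        if not pairs:
            return 100.0
        return sum(similarity(a, b) for a, b in pairs) / len(pairs)

    def list_self_similarity(candidates, channel_note_count, similarity):
        # candidates: list of (instances, covered_note_count) per candidate
        # pattern; E_n weights each pattern's fitness by its channel coverage.
        return sum(loop_fitness(inst, similarity) * (cov / channel_note_count)
                   for inst, cov in candidates)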
  • after determining the self-similarity values En of the lists Ln, for example immediately after step b), all lists Ln of a specific channel chp whose self-similarity values En do not reach a predetermined threshold value can be deleted.
  • the threshold value can preferably be specified adaptively or dynamically, for example as a percentage of the highest self-similarity value En of all lists Ln of the channel chp, e.g. as at least 70% or particularly preferably as approximately 85% of the highest self-similarity value En of all lists Ln of the channel chp.
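The adaptive pruning step reduces to a simple filter; a minimal sketch, assuming the 85 % fraction of the channel's best value mentioned above:

    def prune_lists(lists_with_E, fraction=0.85):
        # lists_with_E: (list_id, E_n) pairs of one channel; keep only lists whose
        # self-similarity reaches the chosen fraction of the channel's maximum.
        e_max = max(e for _, e in lists_with_E)
        return [(lid, e) for lid, e in lists_with_E if e >= fraction * e_max]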
  • in step c), coincidence values are calculated for each list Ln, namely between each list Ln of each channel chp and each list Ln' of every other channel chp', as outlined in Figs. 17 and 18.
  • Fig. 18 shows, representative of all these coincidence value calculations, the first list L21 of the channel ch2, which is compared in each case with all other lists (but not with the lists of its own channel ch2) in order to obtain coincidence values K21-12, K21-31, etc., generally Kpn-p'n' (with p' ≠ p), from which a total coincidence value Kpn is then determined for each list Lpn, as described below.
  • according to Fig. 17, a coincidence value is calculated from the temporal overlaps u of the instances ii of two lists to be compared, which in Fig. 17 are for simplicity referred to only as L1 and L2:
  • the coincidence value Kpn-p'n' is the sum of all time durations ti of those instance overlaps u which are taken into account as explained below, relative to the time duration T of the entire channel chp under consideration.
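Written out (a minimal formalization consistent with the text, before any bonus for exactly coinciding instance boundaries):

    K_{pn\text{-}p'n'} = \frac{1}{T} \sum_i t_i

where the sum runs over the instance overlaps u taken into account and T is the duration of the channel chp.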
  • the candidate pattern m1b (i.e. its three instances i1, i2, i3) overlaps three times with instances of one and the same candidate pattern of the second list L2, namely with the three instances i1, i2 and i5 of the candidate pattern m2a at the overlap times t1, t2 and t5; and only these overlap times are taken into account for the candidate pattern m1b.
  • the coincidence value Kpn-p'n' can optionally be additionally increased for instances coinciding exactly in their beginning or end, for example incremented by a predetermined "bonus value" for each such coincidence; in the example shown in Fig. 17 these are the coincident beginnings of the first instances i1 of the candidate patterns m1b and m2a, the coincident ends of the third instances i3 of m1a and m2a, and the coincident beginnings of the fourth instances i4 of m1a and m2a.
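A sketch of the pairwise coincidence computation between two lists, applying the longest-overlap rule and the optional boundary bonus described above; the interval representation and the bonus size are assumptions.

    def overlap(a, b):
        # Temporal overlap of two instances given as (start, end) time intervals.
        return max(0.0, min(a[1], b[1]) - max(a[0], b[0]))

    def coincidence_value(list_a, list_b, channel_duration, bonus=0.0):
        # list_a, list_b: lists of candidate patterns, each given as a list of
        # (start, end) instances.  For every pattern of list_a only the overlaps
        # with the pattern of list_b yielding the longest total overlap count.
        total = 0.0
        for pattern_a in list_a:
            best_overlap, best_bonus = 0.0, 0.0
            for pattern_b in list_b:
                t = sum(overlap(ia, ib) for ia in pattern_a for ib in pattern_b)
                bns = sum(bonus for ia in pattern_a for ib in pattern_b
                          if ia[0] == ib[0] or ia[1] == ib[1])
                if t > best_overlap:
                    best_overlap, best_bonus = t, bns
            total += best_overlap + best_bonus
        return total / channel_duration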
  • the self-similarity values Epn and coincidence values Kpn-p'n' determined for each list Lpn are combined into a total value Gpn of the list Lpn, for example by summation, multiplication or other mathematical operations.
  • the following relationship is applied: as shown in Fig. 18, for a list, for example the first list L21 of the second channel ch2, only those coincidence values K21-p'n' with respect to the lists Lp'n' of the other channels chp' are taken into account which represent the highest value in the respective channel, i.e.
  • K_{pn} = \sum_{p' \neq p} \max_{n'} K_{pn\text{-}p'n'}
  • the candidate patterns mpx of the lists Lp thus represent, for each channel chp, the similar recurring note patterns of the channel that are best recognized taking into account its structural relationships with all other channels, as shown in Fig. 20.
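Finally, the combination into total values and the per-channel selection can be sketched as follows (multiplicative combination as in the preferred embodiment; the nested-dict layout is an assumption):

    def total_values(E, K):
        # E[p][n]         : self-similarity of list n of channel p.
        # K[p][n][p2][n2] : coincidence value between lists (p, n) and (p2, n2).
        # G[p][n] = E_pn * sum over other channels of their best coincidence value.
        G = {}
        for p, lists in E.items():
            G[p] = {}
            for n, e in lists.items():
                k_sum = sum(max(K[p][n][p2].values())
                            for p2 in E if p2 != p and K[p][n].get(p2))
                G[p][n] = e * k_sum
        return G

    def best_list_per_channel(G):
        # The list with the highest total value wins in each channel.
        return {p: max(lists, key=lists.get) for p, lists in G.items()}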

Landscapes

  • Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Acoustics & Sound (AREA)
  • Multimedia (AREA)
  • Auxiliary Devices For Music (AREA)
  • Information Retrieval, Db Structures And Fs Structures Therefor (AREA)

Claims (15)

  1. A method for detecting similarly recurring note patterns in a piece of music which contains note sequences (q) distributed over parallel channels (ch), comprising the steps of:
    a) repeatedly segmenting each channel (ch) while varying the segment length and segment start and, for each segmentation type, determining segments (S) which are similar to one another and recording them in lists (L) of candidate patterns (m) together with their respective instances (i), in each case in one list per segmentation type and channel;
    b) calculating for each list (L) a self-similarity value (E) which is based on the similarities between the instances (i) of each candidate pattern (m) of a list;
    c) calculating coincidence values (K) for each list (L) of each channel (ch) with respect to the lists of all other channels, each based on the overlaps (u) of instances (i) of a candidate pattern (m) of one list (L) with instances (i) of a candidate pattern (m) of the other list (L) if these overlap at least twice; and
    d) combining the self-similarity and coincidence values (E, K) of each list (L) into a total value (G) per list and using the candidate patterns (m) of the lists (L) with the highest total value (G) in each channel (ch) as the detected note patterns of that channel.
  2. The method according to claim 1, characterized in that in step a) the following step is additionally carried out:
    a1) detecting the patterns (m) recurring identically in a channel (ch), selecting therefrom the patterns best covering the channel and recording them in a further list (L) of candidate patterns (m) with their respective instances (i) per channel.
  3. The method according to claim 2, characterized in that in step a1) the detection of identically recurring patterns (m) is carried out by means of the correlation matrix method known per se.
  4. The method according to claim 2 or 3, characterized in that in step a1) the selection of the best-covering patterns (m) is carried out by iteratively selecting the respectively most frequent and/or longest pattern from the detected patterns.
  5. The method according to any one of claims 1 to 4, characterized in that in step a) the segment length is varied in multiples of the beat unit of the piece of music.
  6. The method according to claim 5, characterized in that the segment length is varied from twice the average note duration of the piece of music up to half the length of the piece of music.
  7. The method according to any one of claims 1 to 6, characterized in that in step a) the determination of segments (S) which are similar to one another is carried out by mutually aligning the notes of two segments, determining a degree of agreement of the two segments and recognizing similarity if the degree of agreement exceeds a predetermined threshold value.
  8. The method according to claim 7, characterized in that the alignment of the notes is carried out by means of the "dynamic programming" method known per se.
  9. The method according to any one of claims 1 to 8, characterized in that in step b), for each candidate pattern (m) of the list (L), a similarity matrix of its instances (i) is set up, the values of which are combined into a self-similarity value (E) of the list (L), preferably weighted by the channel coverage (P) of the candidate patterns (m) of the list (L).
  10. The method according to any one of claims 1 to 9, characterized in that at the end of step b) those lists (L) of a channel (ch) whose self-similarity value (E) does not reach a predetermined threshold value are deleted.
  11. The method according to claim 10, characterized in that the predetermined threshold value is a percentage of the highest self-similarity value (E) of all lists (L) of the channel (ch), preferably amounting to at least 70%, particularly preferably to about 85%.
  12. The method according to any one of claims 1 to 11, characterized in that in step c), for a given candidate pattern of a list (L), only the overlaps (u) with those instances (i) of the other list (L) with which the longest overlaps occur are taken into account.
  13. The method according to any one of claims 1 to 12, characterized in that for the combination of step d), for each list (L) of each channel (ch), only those coincidence values (K) with respect to the lists (L) of the other channels (ch) are taken into account which represent the respectively highest value there.
  14. The method according to any one of claims 1 to 13, characterized in that in the combination of step d) the coincidence values (K) taken into account for a list (L) are respectively summed up.
  15. The method according to claim 14, characterized in that in the combination of step d) the summed coincidence values (K) are multiplied by the self-similarity value (E) of the list (L) to obtain said total value (G).
EP09755830A 2008-10-22 2009-10-15 Method for detecting note patterns in pieces of music Not-in-force EP2351017B1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
EP09755830A EP2351017B1 (fr) 2008-10-22 2009-10-15 Method for detecting note patterns in pieces of music

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
EP08450164A EP2180463A1 (fr) 2008-10-22 2008-10-22 Method for recognizing note patterns in pieces of music
EP09755830A EP2351017B1 (fr) 2008-10-22 2009-10-15 Method for detecting note patterns in pieces of music
PCT/AT2009/000401 WO2010045665A1 (fr) 2008-10-22 2009-10-15 Method for detecting note patterns in pieces of music

Publications (2)

Publication Number Publication Date
EP2351017A1 EP2351017A1 (fr) 2011-08-03
EP2351017B1 true EP2351017B1 (fr) 2013-01-02

Family

ID=40365403

Family Applications (2)

Application Number Title Priority Date Filing Date
EP08450164A Withdrawn EP2180463A1 (fr) 2008-10-22 2008-10-22 Method for recognizing note patterns in pieces of music
EP09755830A Not-in-force EP2351017B1 (fr) 2008-10-22 2009-10-15 Method for detecting note patterns in pieces of music

Family Applications Before (1)

Application Number Title Priority Date Filing Date
EP08450164A Withdrawn EP2180463A1 (fr) 2008-10-22 2008-10-22 Method for recognizing note patterns in pieces of music

Country Status (3)

Country Link
US (1) US8283548B2 (fr)
EP (2) EP2180463A1 (fr)
WO (1) WO2010045665A1 (fr)

Families Citing this family (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP5935503B2 (ja) * 2012-05-18 2016-06-15 Yamaha Corporation Music analysis device and music analysis method
JP5799977B2 (ja) * 2012-07-18 2015-10-28 Yamaha Corporation Note sequence analysis device
US9263013B2 (en) * 2014-04-30 2016-02-16 Skiptune, LLC Systems and methods for analyzing melodies
US11132983B2 (en) 2014-08-20 2021-09-28 Steven Heckenlively Music yielder with conformance to requisites
JP6160599B2 (ja) 2014-11-20 2017-07-12 Casio Computer Co., Ltd. Automatic music composition device, method, and program
JP6160598B2 (ja) * 2014-11-20 2017-07-12 Casio Computer Co., Ltd. Automatic music composition device, method, and program
US9804818B2 (en) 2015-09-30 2017-10-31 Apple Inc. Musical analysis platform
US9852721B2 (en) 2015-09-30 2017-12-26 Apple Inc. Musical analysis platform
US9824719B2 (en) 2015-09-30 2017-11-21 Apple Inc. Automatic music recording and authoring tool
US9672800B2 (en) * 2015-09-30 2017-06-06 Apple Inc. Automatic composer
US10074350B2 (en) 2015-11-23 2018-09-11 Adobe Systems Incorporated Intuitive music visualization using efficient structural segmentation
US11615772B2 (en) * 2020-01-31 2023-03-28 Obeebo Labs Ltd. Systems, devices, and methods for musical catalog amplification services

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TW333644B (en) * 1995-10-30 1998-06-11 Victor Company Of Japan The method for recording musical data and its reproducing apparatus
JP2001042866A (ja) 1999-05-21 2001-02-16 Yamaha Corporation Method and system for providing content via a network
US6225546B1 (en) * 2000-04-05 2001-05-01 International Business Machines Corporation Method and apparatus for music summarization and creation of audio summaries
US6747201B2 (en) * 2001-09-26 2004-06-08 The Regents Of The University Of Michigan Method and system for extracting melodic patterns in a musical piece and computer-readable storage medium having a program for executing the method
JP3613254B2 (ja) * 2002-03-20 2005-01-26 Yamaha Corporation Method for compressing music data
DE102004047068A1 (de) * 2004-09-28 2006-04-06 Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung e.V. Device and method for grouping temporal segments of a piece of music

Also Published As

Publication number Publication date
EP2351017A1 (fr) 2011-08-03
EP2180463A1 (fr) 2010-04-28
US20110259179A1 (en) 2011-10-27
WO2010045665A1 (fr) 2010-04-29
US8283548B2 (en) 2012-10-09

Similar Documents

Publication Publication Date Title
EP2351017B1 (fr) Method for detecting note patterns in pieces of music
EP1371055B1 (fr) Device for analyzing an audio signal with regard to rhythm information of the signal by means of an autocorrelation function
EP1523719B1 (fr) System and method for characterizing an information signal
EP1797552B1 (fr) Method and device for extracting a melody underlying an audio signal
EP1368805B1 (fr) Method and device for characterizing a signal and method and device for producing an indexed signal
EP1407446B1 (fr) Method and device for characterizing a signal and for producing an indexed signal
DE10123366C1 (de) Device for analyzing an audio signal with regard to rhythm information
DE60303993T2 (de) Music structure detection apparatus and method
EP1794745A1 (fr) Device and method for modifying the segmentation of an audio piece
EP2099024A1 (fr) Method for sound-object-oriented analysis and note-object-oriented processing of polyphonic sound recordings
WO2006039993A1 (fr) Method and device for smoothing a melody line segment
EP1794743B1 (fr) Device and method for grouping temporal segments of a piece of music
WO2006039992A1 (fr) Extraction of a melody underlying an audio signal
DE102004028693B4 (de) Apparatus and method for determining a chord type underlying a test signal
WO2006005448A1 (fr) Method and device for rhythmically shaping audio signals
EP1377924B1 (fr) Method and device for extracting a signal identifier, method and device for creating a database from signal identifiers, and method and device for referencing a search time signal
EP1671315B1 (fr) Method and device for characterizing an audio signal
EP1743324B1 (fr) Device and method for analyzing an information signal
DE10253868B3 (de) Method and arrangement for synchronizing test and reference patterns, and a corresponding computer program product and a corresponding computer-readable storage medium
DE102006014507B4 (de) Method and device for classifying and assessing musical instruments of the same instrument groups
EP1381024B1 (fr) Method for retrieving a note sequence

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

17P Request for examination filed

Effective date: 20110517

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO SE SI SK SM TR

DAX Request for extension of the european patent (deleted)
GRAP Despatch of communication of intention to grant a patent

Free format text: ORIGINAL CODE: EPIDOSNIGR1

GRAP Despatch of communication of intention to grant a patent

Free format text: ORIGINAL CODE: EPIDOSNIGR1

GRAS Grant fee paid

Free format text: ORIGINAL CODE: EPIDOSNIGR3

GRAA (expected) grant

Free format text: ORIGINAL CODE: 0009210

AK Designated contracting states

Kind code of ref document: B1

Designated state(s): AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO SE SI SK SM TR

REG Reference to a national code

Ref country code: GB

Ref legal event code: FG4D

Free format text: NOT ENGLISH

REG Reference to a national code

Ref country code: AT

Ref legal event code: REF

Ref document number: 591988

Country of ref document: AT

Kind code of ref document: T

Effective date: 20130115

Ref country code: CH

Ref legal event code: EP

REG Reference to a national code

Ref country code: IE

Ref legal event code: FG4D

Free format text: LANGUAGE OF EP DOCUMENT: GERMAN

REG Reference to a national code

Ref country code: DE

Ref legal event code: R096

Ref document number: 502009005893

Country of ref document: DE

Effective date: 20130314

REG Reference to a national code

Ref country code: NL

Ref legal event code: VDEP

Effective date: 20130102

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: SI

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20130102

REG Reference to a national code

Ref country code: LT

Ref legal event code: MG4D

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: NO

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20130402

Ref country code: ES

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20130413

Ref country code: SE

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20130102

Ref country code: IS

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20130502

Ref country code: LT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20130102

Ref country code: CZ

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20130102

Ref country code: BG

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20130402

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: FI

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20130102

Ref country code: NL

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20130102

Ref country code: LV

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20130102

Ref country code: PL

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20130102

Ref country code: PT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20130502

Ref country code: GR

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20130403

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: HR

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20130102

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: DK

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20130102

Ref country code: EE

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20130102

Ref country code: RO

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20130102

Ref country code: SK

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20130102

PLBE No opposition filed within time limit

Free format text: ORIGINAL CODE: 0009261

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: NO OPPOSITION FILED WITHIN TIME LIMIT

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: CY

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20130102

26N No opposition filed

Effective date: 20131003

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: IT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20130102

REG Reference to a national code

Ref country code: DE

Ref legal event code: R097

Ref document number: 502009005893

Country of ref document: DE

Effective date: 20131003

BERE Be: lapsed

Owner name: OERTL, STEFAN M.

Effective date: 20131031

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: MC

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20130102

REG Reference to a national code

Ref country code: CH

Ref legal event code: PL

REG Reference to a national code

Ref country code: IE

Ref legal event code: MM4A

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: LI

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20131031

Ref country code: CH

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20131031

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: BE

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20131031

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: IE

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20131015

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: SM

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20130102

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: TR

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20130102

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: MK

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20130102

Ref country code: LU

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20131015

Ref country code: HU

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT; INVALID AB INITIO

Effective date: 20091015

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: MT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20130102

REG Reference to a national code

Ref country code: FR

Ref legal event code: PLFP

Year of fee payment: 7

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: GB

Payment date: 20151026

Year of fee payment: 7

Ref country code: DE

Payment date: 20151022

Year of fee payment: 7

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: AT

Payment date: 20151021

Year of fee payment: 7

Ref country code: FR

Payment date: 20151026

Year of fee payment: 7

REG Reference to a national code

Ref country code: DE

Ref legal event code: R119

Ref document number: 502009005893

Country of ref document: DE

REG Reference to a national code

Ref country code: AT

Ref legal event code: MM01

Ref document number: 591988

Country of ref document: AT

Kind code of ref document: T

Effective date: 20161015

GBPC Gb: european patent ceased through non-payment of renewal fee

Effective date: 20161015

REG Reference to a national code

Ref country code: FR

Ref legal event code: ST

Effective date: 20170630

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: GB

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20161015

Ref country code: FR

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20161102

Ref country code: DE

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20170503

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: AT

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20161015