EP3579223A1 - Method, device and computer program product for scrolling a musical score


Info

Publication number
EP3579223A1
Authority
EP
European Patent Office
Prior art keywords
tempo
frame
value
score
dominant
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
EP18382392.1A
Other languages
German (de)
English (en)
Other versions
EP3579223B1 (fr)
Inventor
Ainhoa ESTENOZ ABENDAÑO
Miroslav ZIVANOVIC JEREMIC
Current Assignee
NewMusicNow SL
Original Assignee
NewMusicNow SL
Priority date
Filing date
Publication date
Application filed by NewMusicNow SL filed Critical NewMusicNow SL
Priority to EP18382392.1A (granted as EP3579223B1)
Priority to PCT/EP2019/064153 (published as WO2019233886A1)
Publication of EP3579223A1
Application granted
Publication of EP3579223B1
Legal status: Active

Classifications

    • G10H1/0008: Details of electrophonic musical instruments; associated control or indicating means
    • G10G1/00: Means for the representation of music
    • G10H1/40: Accompaniment arrangements; rhythm
    • G10H2210/051: Musical analysis of an audio signal for extraction or detection of onsets of musical sounds or notes (note attack timings)
    • G10H2210/076: Musical analysis of an audio signal for extraction of timing and tempo; beat detection
    • G10H2210/091: Musical analysis of an audio signal for performance evaluation (judging, grading or scoring the musical qualities or faithfulness of a performance)
    • G10H2220/015: Musical staff, tablature or score displays, e.g. for score reading during a performance

Definitions

  • the present invention relates to the field of digital signal processing and, in particular, to the processing of digital signals corresponding to a musical performance and to methods and systems for recognizing music as it is performed.
  • the invention also relates to methods and systems for displaying and scrolling musical scores on a display screen.
  • a page of music image data from a music database is defined; next, ordered logical sections within that page are defined; then, the mapping is stored in a memory for selective retrieval; finally, the video display of the music responsive to the mapping and the storing is provided.
  • European patent EP2919228B1 discloses a method for scrolling a musical score on a screen of a device, in which musical signs are scrolled on the screen by continuously showing on the screen additional signs of music while the already scrolled ones disappear from the screen. The scrolling speed is adjusted according to the music content being displayed on the screen.
  • United States patent US8530735B2 describes a method for displaying music on a display screen, in which a tempo of the user's performance is supposed to be detected, from which the time period required by the player to complete the performance of a displayed portion of musical notes is calculated. At the end of the calculated time period, the portion of musical notes displayed on the screen is automatically replaced with a subsequent portion of musical notes.
  • M.F. McKinney et al., in Evaluation of Audio Beat Tracking and Music Tempo Extraction Algorithms, Journal of New Music Research, 2007, Vol. 36, No. 1, pp. 1-16, provide an extended analysis of eight different algorithms for musical tempo extraction and beat tracking. While obtaining the tempo of a musical record has been successfully achieved for recorded musical pieces, extrapolating current methods to real-time live performances has proved unsuccessful due to noise and other disturbances.
  • US6156964 refers to a method of displaying a musical score in which a portion of the music score data corresponding to the playing position of the musician is displayed on a display device. The playing position of the musician in order to display the appropriate portion of the score on the screen is determined by comparing tone frequency data of the music score with tone frequency data of the music being played.
  • Another attempt at performing real-time music note recognition is disclosed in US2005/0015258A1 , in which a played note is identified and compared with a reference note by identifying the starting and ending edges in the time domain of each note.
  • US8660678B1 refers to a method for following a score based on Markov chains. Instead of focusing on analysing the detected audio signal, probabilistic techniques are used, both for reducing the processing workload and for trying to avoid the problems derived from audio-signal analysis. The most likely current location in the score and the most likely current tempo are estimated. However, again, this method cannot unequivocally identify the music played by any instrument and requires training a software application beforehand in order to generate Markov models.
  • the present disclosure provides a method for scrolling a digital musical score on a screen of an electronic device based on real-time music recognition which overcomes the mentioned disadvantages.
  • the scrolling speed is adjusted in real time according to the real time tempo at which music is being played by a musician.
  • the method described herein is mainly designed to run on an electronic device, such as a personal digital assistant (PDA), a portable reader device, a tablet, a cell phone containing a display or any device comprising a memory, a processor and a screen or display.
  • An audio sensing means such as a sound sensing means or a vibration sensing means, is also required in order to capture the sound produced by the one or more musicians.
  • the sound may be produced by traditional instrument(s) or digital one(s), such as a MIDI board.
  • the sensing means may be embedded in the electronic device or may be a separate device connected to the electronic device by means of a wired or wireless connection.
  • Non-limiting examples of audio sensing means are a microphone or any other sound or vibration capturing means, such as piezo-electric capturing means.
  • the term "audio signal” refers to the signal as captured by the audio sensing means. The capture takes place in real time, that is to say, as music is being played. The captured signal is typically an analog signal. While it is captured, it may be converted into a digital signal.
  • the term "audio signal” may refer to the already digitized analog signal by means of an A/D converter (analog-to-digital converter), preferably embedded in the electronic device.
  • the audio sensing means may be comprised in the electronic device or may be independent therefrom, in which case the audio sensing means is connected to the electronic device.
  • the execution of the current method does not require a high computational workload, as a consequence of which the method is especially indicated for being executed on low- and mid-range electronic devices, such as a personal digital assistant (PDA), a portable reader device, a tablet or a cell phone.
  • the method is preferably implemented as a software application (APP).
  • the method may be designed to run simultaneously in a plurality of such devices, for example when an orchestra or any other group of musicians is playing together.
  • the method is implemented as computer program instructions/code which runs on one or more of the previously mentioned devices. It also requires storage means for storing the music scores in the form of digital files. This storage can be local or distributed, such as in the cloud.
  • additional hardware can be used, such as pedals for hands-free operation.
  • a “reference musical figure” can be any of the aforementioned notes (whole note, half note, etc.) which is taken as a reference along a score or a portion of a score. For example, if the quarter note is taken as the “reference figure”, then the whole note is made of four reference figures.
  • An empty measure sign also has time duration of a certain number of beats, indicated in the score or by the conductor or musician.
  • musical notes and rests can be dotted in order to lengthen their duration.
  • Tempo, which is normally expressed as beats per minute (BPM or bpm), controls the rate at which the musical signs in a line (or, in general, in a score) of music are played.
  • the musical score is in a digital format for representing, understanding and/or providing musical notation, that is to say, in a format which makes it possible to unequivocally obtain all the symbols comprised in a score.
  • the format must be a musical notation format, such as MusicXML format or Standard MIDI File (SMF) format or MXL format, which are well-known formats for representing musical notation, unlike other digital formats, such as PDF, TIFF, JPG, BMP, EPS, PostScript or others.
  • the MusicXML format is a fully and openly documented XML-based format for representing musical notation.
  • the MusicXML standard contains information such as title, author, number of measures, number of systems, instrument number and name, position and duration of notes, and, generally, the same information as provided by a paper score.
  • MIDI stands for Musical Instrument Digital Interface.
  • MIDI carries event messages that specify notation, pitch and velocity, control signals for parameters such as volume, vibrato, audio panning, cues, and clock signals that set and synchronize tempo between multiple devices.
  • the Standard MIDI File (SMF) is a file format that provides a standardized way for sequences to be saved, transported, and opened in other systems.
  • the contents of the digital score may be adapted to the screen of the device.
  • file refers to a file in a musical notation format comprising a musical score.
  • the file is preferably loaded in the device and stored locally in a buffer within the memory of the device.
  • digital score refers to a musical score in a musical notation format.
  • a method for displaying a musical score on a screen of a device comprising: loading a file having a digital score in a piece or part of memory of the device; scrolling the digital score on the screen of the device; capturing an audio signal corresponding to the musical score being played by a musician; repeatedly selecting frames of the captured audio signal and, for each selected frame: obtaining a dominant tempo value at which the music contained in said frame is played; from the dominant tempo value obtained from said frame and from a reference tempo comprising a reference figure and a reference tempo value, estimating the tempo at which the musician is playing, said estimated tempo comprising said reference figure and a normalized tempo value with respect to the dominant tempo value; adjusting the scrolling speed of the digital score according to the estimated tempo.
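  • As an illustrative, non-authoritative sketch of the loop described above (all names here are hypothetical, not taken from the patent), the per-frame tempo estimate can be turned into a scrolling-speed factor as follows:

```python
def scrolling_speeds(frames, reference_tempo_value, estimate_tempo):
    """For each captured audio frame, estimate the tempo at which the
    musician is playing and derive a scrolling-speed factor relative
    to the reference tempo value."""
    speeds = []
    for frame in frames:
        estimated_bpm = estimate_tempo(frame)  # normalized tempo value, in BPM
        speeds.append(estimated_bpm / reference_tempo_value)
    return speeds
```

A factor of 1.0 means the musician plays exactly at the reference tempo; higher factors speed up the scrolling proportionally.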
  • the scrolling speed is adjusted in real time, according to the current tempo of the user (the performing musician) who is actually playing.
  • real-time should be understood as guaranteeing a response within certain time constraints.
  • “real time” is understood as a time comprised within a time range varying between a lower value Vmin and an upper value Vmax.
  • the upper value Vmax may be in the order of seconds, such as equal to or lower than, for example, 10 seconds, or equal to or lower than 5 seconds, or equal to or lower than 2 seconds, or equal to or lower than 1 second.
  • the lower value Vmin may be, in a non-limiting way, equal to or higher than 1 µs (microsecond, 10^-6 seconds), such as equal to or higher than 0.1 ms (millisecond, 10^-3 seconds), or equal to or higher than 1 ms, or equal to or higher than 50 ms, or equal to or higher than 100 ms.
  • the musical score may be scrolled continuously (also referred to as dynamically), that is to say, as a continuous string of musical signs, drawn on the screen vertically, horizontally or obliquely, without interruption.
  • the musical score may be scrolled continuously or dynamically, vertically, horizontally or obliquely, with interruptions, such as soft interruptions, when required.
  • the scrolling speed is adjusted (increased or decreased) taking into account the music being played by the musician.
  • a portion of the digital score may be stopped or interrupted for a certain time on the screen and started to be continuously scrolled again afterwards.
  • the dominant tempo value at which the music contained in said frame is played is obtained as follows: detecting an onset function of said frame; finding a dominant tempo value in said onset function.
  • the onset function is detected as follows: obtaining a spectrogram from the captured frame; obtaining the onset function from the spectrogram.
  • the dominant tempo value is obtained by applying an autocorrelation function to the onset function.
  • the estimated tempo obtained from the dominant tempo value and from a reference tempo is obtained as follows: applying a set of scaling values to the obtained dominant tempo value, thus obtaining a set of scaled dominant tempo values, calculating the absolute difference between each value of the set of scaled dominant tempo values and the reference tempo value of the reference tempo, selecting the scaling value of the set of scaling values corresponding to the lowest absolute difference, multiplying the dominant tempo value by the selected scaling value, thus obtaining a normalized tempo value of the estimated tempo, the estimated tempo comprising said normalized tempo value and the reference figure of the reference tempo.
  • the method further comprises: after calculating the absolute difference, selecting the two lowest values V 1 , V 2 , wherein V 1 is the lowest value of the two values and V 2 is the highest value of the two values; if V 1 /V 2 ⁇ R, wherein R is a ratio established for limiting the deviation of the current estimated tempo, the estimation is considered correct, if V 1 /V 2 > R, then the estimation is considered incorrect.
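  • The normalization and plausibility check described in the two bullets above can be sketched as follows; the particular scale set and the ratio R are illustrative assumptions, not values fixed by the patent:

```python
import numpy as np

def normalize_tempo(dominant_bpm, reference_bpm,
                    scales=(0.25, 0.5, 1.0, 2.0, 4.0), ratio_limit=0.5):
    """Scale the dominant tempo value toward the reference tempo value,
    then compare the two lowest absolute differences V1 <= V2 against a
    ratio R to decide whether the estimate is considered correct.
    Returns (normalized_bpm, is_reliable)."""
    diffs = np.abs(np.array(scales) * dominant_bpm - reference_bpm)
    order = np.argsort(diffs)
    v1, v2 = diffs[order[0]], diffs[order[1]]  # two lowest differences
    reliable = bool(v2 > 0 and v1 / v2 <= ratio_limit)
    return float(dominant_bpm * scales[order[0]]), reliable
```

For example, a dominant value of 60 BPM against a 120 BPM reference is scaled by 2, yielding a normalized tempo of 120 BPM.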
  • the estimated tempo for that frame is obtained from a number of previously obtained estimated tempos for corresponding previous frames.
  • the reference tempo used for obtaining the estimated tempo is obtained in one of the following ways: it is manually fixed by the musician, obtained from the digital score, provided by the algorithm by default, or extracted from a database.
  • the reference tempo is obtained from the digital score as follows: as a metronome mark, as a value associated to a word included in the score, as a time signature, as indication of metric changes along the score, or as a combination of the former.
  • the scrolling speed of said musical score on the screen is readjusted every time a new frame of the audio signal is captured.
  • the method further comprises, prior to continuously capturing frames of said audio signal, verifying that the first played notes correspond to a starting point identified in the digital score.
  • said digital score is scrolled on the screen of the electronic device as a continuous string of musical signs, drawn on the screen in a consecutive way, by showing on the screen additional musical signs of music while the already scrolled musical signs disappear from the screen, in such a way that additional musical signs start to gradually appear on the screen while the already scrolled musical signs start to gradually disappear from the screen.
  • a device comprising means for carrying out the method according to any preceding claim, said device being a personal digital assistant (PDA), a portable reader device, a tablet, a cell phone or any device which comprises a memory, a processor and a screen or display.
  • a computer program product comprising computer program instructions/code, for performing the method already disclosed.
  • a computer-readable memory/medium that stores program instructions/code, for performing the method already disclosed.
  • the method of displaying on the screen of an electronic device a score kept in a digital file is as follows: preferably, a file having a digital score has been loaded in the memory of the device, and the contents of the file stored in the buffer are read. Then, the total length of the score may be calculated in order, by default, to display the full score, for example. Alternatively, only a first portion of the total length of the score may be calculated, in order to display the first portion of the score. In this case, a second portion of the total length of the score is calculated before the first portion has been played, and then displayed. This calculation and display of subsequent portions of the score may be repeated until a last portion of the whole score is calculated and displayed.
  • the width of the digital score may be adapted to that of the screen on which it is displayed. In other words, by default, as many music lines as required may be shown/drawn, in order to show on the screen all the notes of the score along the width of the screen. Since, however, for practical reasons, only a certain number of "lines" can be shown on the screen (for the user to be able to read them), a scrolling or displacing function is activated.
  • the repetitions may be expanded.
  • those measures -or in general, musical signs- that should be played more than once are concatenated in a row as many times as repetitions marked in the score, according to a specific notation in the score.
  • the annotations corresponding to repetitions are marked in the digital file. Thanks to these marks, the algorithm, embedded in processing means, knows which portions must be expanded and how many times they must be expanded, that is to say, copied in a concatenated way.
  • This process may fill the buffer with the score fully "expanded". In this process, a pre-buffer may be stored in a temporary buffer for subsequent use.
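  • A minimal sketch of this repeat-expansion step, assuming a simplified data layout (an assumption for illustration) in which each repeat mark is given as a (start measure index, end measure index) pair with a play count:

```python
def expand_repeats(measures, repeats):
    """Concatenate repeated spans of measures into a linear ("expanded")
    sequence, so the buffer holds the score exactly as it will be played."""
    expanded, i = [], 0
    while i < len(measures):
        span = next(((s, e, n) for (s, e), n in repeats.items() if s == i), None)
        if span:
            s, e, n = span
            expanded.extend(measures[s:e + 1] * n)  # play the span n times
            i = e + 1
        else:
            expanded.append(measures[i])
            i += 1
    return expanded
```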
  • An example of vertical scrolling is shown in figures 2A to 2D , wherein four sequences of a digital score being scrolled from bottom to top are illustrated.
  • An example of horizontal scrolling is shown in figures 3A to 3E , wherein five sequences of a digital score being scrolled from right to left are illustrated.
  • in vertical scrolling, musical symbols or signs move along a "y" axis (along the height of the screen), while in horizontal scrolling musical symbols or signs move along an "x" axis (along the width of the screen).
  • Figure 4 shows a virtual representation of the continuous scrolling according to embodiments of the invention, in which the notes or measures move along the screen (either from bottom to top or from right to left, that is to say, along a dimension of the screen).
  • the method of the present disclosure is then performed. Next, it is explained how the scrolling speed is adjusted in real time to the music being played by the musician.
  • the algorithm adapts the speed at which the digital score is shown, that is to say, scrolled on the screen of the electronic device, based on an estimated tempo at which the musician(s) is(are) playing the score.
  • the algorithm is capable of estimating the tempo at which the musician(s) is(are) playing and of adjusting the scrolling speed of the digital score to the estimated tempo.
  • the expression "the musician” generally refers to a single musician or to a group of musicians playing together.
  • the musical signs scroll on the screen at the tempo at which the musician is playing.
  • the algorithm calculates the speed at which music should move on the screen, either vertically or horizontally, in such a way that the user is able to read it and interpret it, thus playing his/her instrument without interruptions and in a linear way, as illustrated for example in Figure 4 .
  • an algorithm for signal processing is applied, as explained next.
  • a musician starts playing the song.
  • the musician may then activate the algorithm for scrolling the digital score, for example by pressing a "start" button on the display prior to starting to play or by stepping on a pedal.
  • a method is performed for continuously obtaining or estimating a tempo at which the musician is playing, in order to adjust the scrolling speed to the estimated tempo.
  • the term "continuously” refers to repeatedly recalculating the estimated tempo for frames of audio signal of certain time duration, as explained next.
  • the music being played is sound waves.
  • the music (sound waves) being played is captured by an audio sensing means, such as a microphone, embedded in or connected to the electronic device on whose screen the digital score is being displayed.
  • the audio sensing means converts the captured sound into an analog audio signal. While it is captured, the analog audio signal is converted into a digital audio signal for example by means of an A/D converter.
  • Some electronic devices may comprise processing means for producing a digital audio signal in a single step, transparent to the user.
  • the audio sensing means may be integrated or embedded together with analog-to-digital conversion means. From now on in this disclosure, the term "audio signal" refers to the already digitized analog signal.
  • Figure 5 shows a general block diagram of the method stages for obtaining an estimated tempo 504 at which the musician is playing and for adjusting the scrolling speed 53 taking into account the estimated tempo 504.
  • Figure 5 represents a signal processing block for treating an audio signal 501 as captured by the audio sensing means and duly digitized, and corresponding to a musical performance. The method is executed in three stages: in a first stage 51, the rate at which music is being played is obtained; in other words, a dominant tempo value 502, also referred to as dominant rate, expressed in beats per time unit, typically BPM (beats per minute), is continuously obtained from the audio signal 501.
  • an estimated tempo of the performance 504 is continuously obtained from the dominant tempo value 502 and from a reference tempo 503.
  • the dominant tempo value 502 is continuously normalized to a reference tempo 503, as a result of which the estimated tempo 504 of the music played by the musician is continuously obtained.
  • Figure 6 shows a block diagram of a stage 51 for obtaining or detecting a dominant tempo value 502 or dominant rate, such as dominant BPM, from an audio signal 501 corresponding to the music being played by the musician, according to a possible embodiment of the invention.
  • the term "dominant”, applied to a tempo value, rate or BPM, means "the most repeated”, such as the most repeated tempo value, obtained from the most repeated time interval between the strongest peaks in an onset function, in other words, from the most repeated periodicity, as will be explained later in this disclosure.
  • a stage of onset function detection 511 is continuously applied to portions of the audio signal 501 being captured by the audio sensing means.
  • a stage of detection 512 of the dominant tempo value, also referred to as dominant period detection, is performed.
  • the onset detection 511 is applied in the form of a loop that lasts for the duration of the score.
  • frames frame_i of the audio signal 501 of certain time duration are captured and then analyzed.
  • the capture of frames is represented by reference 510 in figure 6 .
  • the time duration of the captured frames may be constant or non-constant.
  • the time duration of these frames may be selected to be between 1 and 20 s (seconds), such as between 2 and 15 s, such as between 2 and 10 s.
  • This selected time duration may be different for different users, electronic devices or other circumstances. For example, it may vary, depending on the processing resources of the electronic device, among other reasons.
  • the analog signal 501 is digitized, for example by means of an analog-to-digital converter (A/D converter) either prior to the capture of frames 510 or after such capture.
  • the audio sensing means (not shown) includes, or is embedded together with, an A/D converter, as a consequence of which the audio signal 501 is already a digital signal.
  • the algorithm may check whether or not the musician has started to play the score displayed on the screen of the device. This verification may be done in different ways. For example, but not in a limiting way, it may be done by comparing energy levels, such as by comparing the mean energy of an audio signal frame with the mean energy of a reference audio signal frame or with the mean energy of a group of audio signal frames. If the result (difference) of this comparison is above a certain threshold, it may be determined that the musician has started to play. Alternatively, it may be done by knowing the frequency of the first note in the score and the tuning of the instrument being played. Other ways of verification may be used. The way this verification is performed is out of the scope of the present disclosure.
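  • The energy-comparison variant of this verification could be sketched as follows; the decibel threshold is an illustrative assumption:

```python
import numpy as np

def playing_started(frame, reference_frame, threshold_db=10.0):
    """Compare the mean energy of the current audio frame with that of a
    reference (e.g. silence) frame; if the difference exceeds a threshold,
    assume the musician has started to play."""
    e_frame = np.mean(np.square(frame))
    e_ref = np.mean(np.square(reference_frame)) + 1e-12  # avoid div by zero
    return bool(10.0 * np.log10(e_frame / e_ref + 1e-12) > threshold_db)
```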
  • an analysis of the captured frame frame_i is performed in order to detect the onset 511 of each played note within said frame frame_i.
  • An onset occurs every time a musical note starts to play.
  • An onset is represented as a peak in the temporal domain.
  • an onset function 61 is obtained, the onset function being a vector having the detected onsets and having the same temporal duration as the frame frame_i.
  • a spectrogram is computed 71 from each frame frame_i into which the audio signal 501 is divided.
  • the spectrogram represents the spectrum of frequencies in the audio signal (or rather, in a frame frame_i of the audio signal). It represents how the frequencies vary with time (frequency on the vertical axis, time on the horizontal axis).
  • the spectrogram may be obtained by means of Fourier transform.
  • the spectrogram may be obtained, for example and without limitation, using the FFT (Fast Fourier Transform).
  • the spectrogram represents a time-frequency energy distribution of the audio signal in a given analysis frame. Analytically, it can be obtained as a squared modulus of short-time Fourier transform (STFT) of the audio signal in a given analysis frame.
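  • The computation described above can be sketched as a squared-modulus STFT; the window and hop sizes here are illustrative assumptions:

```python
import numpy as np

def spectrogram(frame, n_fft=1024, hop=512):
    """Power spectrogram (squared modulus of the STFT) of one audio frame:
    frequency on the rows, time on the columns."""
    window = np.hanning(n_fft)
    n_cols = 1 + max(0, (len(frame) - n_fft) // hop)
    spec = np.empty((n_fft // 2 + 1, n_cols))
    for c in range(n_cols):
        segment = frame[c * hop:c * hop + n_fft] * window
        spec[:, c] = np.abs(np.fft.rfft(segment)) ** 2  # squared modulus
    return spec
```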
  • In figure 7B , a frame frame_i of the audio signal is shown, from which a spectrogram is calculated in block 71.
  • the resulting spectrogram is also illustrated; it has been obtained by Fourier transform, such as a Fast Fourier transform (FFT), of the frame shown in figure 7B .
  • when the frames frame_i correspond to an analog audio signal, the time-domain frames are sampled for digital conversion and then a Fourier transform is performed on each group of samples.
  • a third dimension indicating the amplitude of a particular frequency at a particular time is represented by the intensity of color (in grey scale) of each point in the image.
  • an onset function 61 is obtained (block 72 in figure 7A ).
  • each spectrogram is processed as follows: First, a vector of weights is generated in order to recombine the samples of the spectrogram into Mel-frequency bands. For example, the samples may be recombined into 40-channel Mel-frequency bands.
  • the recombination yields a Mel-spectrogram, as disclosed for example by Haytham Fayek in Speech Processing for Machine Learning: Filter banks, Mel-Frequency Cepstral Coefficients (MFCCs) and What's In-Between, April 21, 2016 (http://haythamfayek.com/2017/04/21/speech-processing-for-machine-learning.html).
  • the Mel-spectrogram is logarithmically compressed.
  • the Mel-spectrogram samples may be calculated in decibels, and those smaller than a certain threshold are rejected (set to zero).
  • the threshold may be fixed, for example, to -80 decibels.
  • the result of this operation is called Mel-log-spectrogram.
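A minimal sketch of the Mel-log-spectrogram computation; the 40 bands and the -80 dB threshold come from the text above, while the triangular band weights, the mel-scale formula and the 44.1 kHz sample rate are common-practice assumptions:

```python
import numpy as np

def hz_to_mel(f):
    return 2595.0 * np.log10(1.0 + f / 700.0)

def mel_to_hz(m):
    return 700.0 * (10.0 ** (m / 2595.0) - 1.0)

def mel_log_spectrogram(spec, sr=44100, n_mels=40, threshold_db=-80.0):
    """Recombine spectrogram rows into `n_mels` Mel-frequency bands with
    a vector of triangular weights, compress logarithmically (decibels),
    and set samples below `threshold_db` to zero."""
    n_fft_bins = spec.shape[0]
    fft_freqs = np.linspace(0, sr / 2, n_fft_bins)
    mel_points = np.linspace(hz_to_mel(0), hz_to_mel(sr / 2), n_mels + 2)
    hz_points = mel_to_hz(mel_points)
    weights = np.zeros((n_mels, n_fft_bins))  # vector of weights per band
    for b in range(n_mels):
        lo, center, hi = hz_points[b], hz_points[b + 1], hz_points[b + 2]
        rising = (fft_freqs - lo) / (center - lo)
        falling = (hi - fft_freqs) / (hi - center)
        weights[b] = np.maximum(0.0, np.minimum(rising, falling))
    mel_spec = weights @ spec
    mel_db = 10.0 * np.log10(mel_spec + 1e-12)  # logarithmic compression
    mel_db[mel_db < threshold_db] = 0.0         # reject samples below threshold
    return mel_db
```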
  • Figure 8C graphically represents the autocorrelation signal 55 obtained after applying an autocorrelation function (stage 512 for dominant period detection in figure 6) to the onset function 61 shown in figure 8B, in turn obtained from a frame frame_i of the audio signal 501 shown in figure 8A.
  • the autocorrelation function may be a discrete time autocorrelation function.
  • the strongest peak in the autocorrelation signal represents the dominant periodicity and therefore the dominant tempo value 502 of the audio signal analyzed in each frame.
  • a dominant tempo value (current tempo value of the performance as captured in frame_i) 502 is obtained for every frame frame_i of audio signal 501.
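The dominant-period detection by autocorrelation can be sketched as follows; the 30-300 bpm search range and the onset-function sampling rate are illustrative assumptions:

```python
import numpy as np

def dominant_tempo(onset_function, fs_onset):
    """Estimate the dominant tempo (bpm) of one frame from its onset
    function, sampled at `fs_onset` Hz: the lag of the strongest
    autocorrelation peak gives the dominant periodicity."""
    x = onset_function - np.mean(onset_function)
    ac = np.correlate(x, x, mode="full")[len(x) - 1:]  # non-negative lags
    # restrict the peak search to plausible beat periods (assumed 30-300 bpm)
    min_lag = int(fs_onset * 60.0 / 300.0)
    max_lag = int(fs_onset * 60.0 / 30.0)
    lag = min_lag + int(np.argmax(ac[min_lag:max_lag]))
    return 60.0 * fs_onset / lag
```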
  • in order to match the dominant tempo value 502 with the content of the digital score being played, and therefore to adjust the speed at which the digital score scrolls, the dominant tempo value 502 must be normalized, for each frame frame_i, with respect to a reference tempo 503.
  • an estimated tempo 504 of the performance as represented by frame_i must be obtained from the dominant tempo value 502 and from a reference tempo 503. That is to say, the actual tempo of the music being played must be obtained.
  • Figure 9 shows the tempo value (bpm) of audio frames of a musical performance.
  • audio frames frame_i are represented (1, 2, 3, 4...) as vertical lines.
  • the tempo value (bpm) is represented on the "x" axis.
  • a theoretical constant tempo value of a portion of the musical performance is represented.
  • the actual tempo value, at which the musician is playing, which is not constant, is represented.
  • a spot represents the dominant tempo value 502 calculated prior to the dominant tempo value normalization stage 52 by the method of the present disclosure.
  • the errors in the estimation of frames 1 and 3 are most likely errors caused by the tempo estimation itself (because a musician cannot follow a determined tempo value with absolute precision), while the errors in the estimation of frames 2 and 4 are most likely errors caused by the rhythmic ambiguity of the frame.
  • in order to compensate for these errors (frames 2 and 4), the dominant tempo value 502 must be normalized to a reference figure.
  • the calculated dominant tempo value 502 is normalized to a reference tempo 503, in order to update the estimated tempo 504 of the performance.
  • a reference tempo (tempo value & reference figure) is often indicated or suggested.
  • a reference tempo may be suggested on the score, typically using an Italian word (Andante...) associated with a certain predetermined or well-known value, or a reference tempo may be imposed by the conductor.
  • the player chooses the reference tempo at which he/she is going to play.
  • the reference tempo may be manually fixed by the musician, for example by typing it on the screen of the electronic device in order for the algorithm to be aware of the reference tempo. This option is available in the APP implementing the algorithm.
  • a reference figure extracted from a time signature in the score may be taken into account, or alternatively the user may freely establish the reference tempo in BPM at his/her will.
  • the reference tempo may also be obtained by combining the two former possibilities, that is to say, using the Italian word indicated on the score together with a metronome mark, a time signature, or an indication of metric changes along the score, among other ways of obtaining the reference tempo from the digital score.
  • the reference tempo may also be provided by the APP by default.
  • the reference tempo may also be extracted from a data base.
  • the denominator indicates "eighth note". Therefore, because 3 eighth notes must be grouped in each beat, the reference figure is the sum of 3 eighth notes, that is to say, a dotted quarter note.
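The grouping rule in this example can be illustrated with a hypothetical helper (`reference_figure` is not part of the disclosure), returning the reference figure's length in whole-note units:

```python
from fractions import Fraction

def reference_figure(numerator, denominator):
    """Reference figure length in whole-note units. For compound time
    signatures (numerator a multiple of 3 greater than 3, e.g. 6/8),
    three denominator figures are grouped per beat, giving a dotted
    figure; otherwise the denominator figure itself is returned."""
    beat_note = Fraction(1, denominator)
    if numerator % 3 == 0 and numerator > 3:
        return 3 * beat_note  # e.g. 3 eighth notes = a dotted quarter note
    return beat_note
```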
  • the reference tempo indicated in any of the above-mentioned ways, or in any other way, is the reference tempo 503 for the first audio signal frame frame_1.
  • the audio sensing means such as a microphone
  • the estimated tempo 504 of the first portion of the performance is obtained.
  • the reference tempo 503 may be the estimated tempo of the previous frame frame_i-1. Alternatively, it may be an average estimated tempo calculated taking into account a certain number N of previous frames, or it may be indicated to the algorithm by any of the ways already enumerated.
  • a set of scaling values are applied to the dominant tempo value 502 in order to normalize the dominant tempo value 502 to the reference tempo 503.
  • the dominant tempo value 502 is multiplied by all the values of the set of scaling values, thus obtaining a set of scaled dominant tempo values.
  • the absolute difference between each value of the set of scaled dominant tempo values and the reference figure of the reference tempo 503 is calculated:
  • the modulus (positive value) of the difference is considered.
  • the lowest absolute difference indicates the scaling value by which the dominant tempo value 502 must be multiplied in order to obtain the tempo value of the estimated tempo 504, the reference figure of the estimated tempo 504 being the reference figure of the reference tempo 503.
  • when the dominant tempo value 502 is multiplied by the selected scaling value, the dominant tempo value 502 becomes normalized to the reference figure 503.
  • the tempo value of the estimated tempo 504 is the scaled dominant tempo value (scaled by the selected scaling value), and the reference figure of the estimated tempo 504 is the reference figure of the reference tempo 503.
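Using the set of scaling values {3, 2, 3/2, 1, 1/2, 1/3} given later in the worked example, the normalization step can be sketched as:

```python
SCALING_VALUES = [3, 2, 3 / 2, 1, 1 / 2, 1 / 3]

def normalize_tempo(dominant_tempo_value, reference_tempo_value):
    """Multiply the dominant tempo value by every scaling value and keep
    the scaled value whose absolute difference to the reference tempo
    value is lowest; that value is the tempo of the estimated tempo."""
    scaled = [s * dominant_tempo_value for s in SCALING_VALUES]
    differences = [abs(v - reference_tempo_value) for v in scaled]
    return scaled[differences.index(min(differences))]
```

With the figures of the worked examples, `normalize_tempo(60, 120)` selects scaling value 2 and returns 120, and `normalize_tempo(123, 120)` selects scaling value 1 and returns 123.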
  • the reference tempo 503 may be based on the previously normalized tempos 504.
  • the reference tempo 503 for frame_i may be the estimated tempo for frame_i-1, or an average tempo calculated taking into account the last N frames (frame_i-1, frame_i-2, ...frame_i-N).
  • it may be decided that the reference tempo 503 for frames other than the first one frame_1 is based on an indicated reference tempo, for example indicated to the algorithm by any of the ways already enumerated.
  • tempo estimation 504 is disclosed, in which the set of scaling values is {3, 2, 3/2, 1, 1/2, 1/3}.
  • the musician has just started playing and therefore the first frame frame_1 of the audio signal 501 has just been captured.
  • a reference tempo 503 has been provided by the musician (for example by typing it on a window opened with that purpose on the screen of the device).
  • the value of the dominant tempo value 502 has been computed and is 60.
  • the dominant tempo value 502 (60 in this example) must be multiplied by the scaling value 2.
  • the reference figure of the estimated tempo 504 is the reference figure of the reference tempo 503, that is to say, "quarter note".
  • the dominant tempo value 502 is thus normalized to the reference figure 503.
  • tempo estimation 504 is disclosed.
  • the musician keeps on playing and a second frame frame_2 of the audio signal 501 has just been captured.
  • for frame_2, the value of the dominant tempo value 502 is 123.
  • the scaled dominant tempo values are {369, 246, 184.5, 123, 61.5, 41}; the absolute difference between each of them and the tempo value (in this case, 120) of the reference tempo 503 is calculated: {249, 126, 64.5, 3, 58.5, 79}.
  • the dominant tempo value 502 (123 in this example) must be multiplied by the scaling value 1.
  • the reference figure of the estimated tempo 504 is the reference figure of the reference tempo 503, that is to say, quarter note.
  • the digital score is scrolled on the screen of the electronic device at a speed adjusted 53 from the estimated tempo 504 at which the player is actually playing.
  • the scrolling speed is recalculated -adjusted- every time a new frame frame_i of the audio signal is captured.
  • the scrolling speed is recalculated in real time, since a new frame is captured and analyzed every few seconds or even milliseconds.
  • when a severe error is detected in the estimation, the algorithm reacts to this mistake and discards the estimation performed for the current frame (for example frame_j).
  • the estimated tempo for the current frame frame_j is replaced, for example, by a mean value of all the previous estimated tempos (from frame_1 to frame_j-1), by a mean value of the N previously estimated tempos (from frame_j-N-1 to frame_j-1), or by the lastly estimated tempo (that of frame_j-1).
  • the two lowest values of the set of absolute differences are selected.
  • two musical figures are selected as candidates (for example the musical figures represented by "2" and "3/2").
  • a ratio R that limits the deviation of the current estimated tempo from an average value of previous estimated tempos is then used.
  • if, on the contrary, V1/V2 > R, then it is considered that the estimated tempo value has deviated too much from any potential tempo as a consequence of a severe error.
  • the estimated tempo value is then considered to be a different one, for example a mean value of all the previous estimated tempo values (from frame_1 to frame_j-1), a mean value of the N previously estimated tempo values (from frame_j-N-1 to frame_j-1) or the lastly estimated tempo value (that of frame_j-1).
  • the ratio R may be empirically obtained.
  • the ratio R may be selected to be within the range 0.6 ≤ R ≤ 0.9.
  • if V1/V2 ≤ R, it is considered that there is no error and the tempo value of the estimated tempo 504 is calculated following the general method. If, on the contrary, V1/V2 > R, the correction disclosed in this paragraph is applied.
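A sketch of this correction, assuming V1 and V2 are the two lowest values of the set of absolute differences, and using the mean of the previous estimated tempos as replacement (one of the options described above); R = 0.8 is an illustrative choice within the suggested 0.6-0.9 range:

```python
def check_severe_error(v1, v2, candidate_tempo, previous_tempos, R=0.8):
    """Keep `candidate_tempo` when V1/V2 <= R (no severe error);
    otherwise replace it by the mean of the previous estimated tempos."""
    if v2 > 0 and v1 / v2 > R:  # estimation deviated too much: severe error
        return sum(previous_tempos) / len(previous_tempos)
    return candidate_tempo
```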
  • an adjusted estimated tempo 504 is calculated.
  • the digital score is scrolled on the screen of the electronic device at a speed adjusted 53 from the adjusted estimated tempo 504 at which the player is actually playing.
  • the scrolling speed is recalculated every time a new frame frame_i of the audio signal is captured.
  • the scrolling speed is adjusted according to the musical signs being displayed on the screen at each time instant (for example, every time a new frame frame_i is captured) and to the obtained estimated tempo 504. For each displayed sign, the time length or time duration required for playing the displayed sign is calculated as the amount of reference musical figures in the sign divided by the tempo value.
  • the time length or time duration needed by the sign to cover the length or width (depending on whether the scroll is vertical or horizontal) of the screen is calculated as the amount of reference musical figures in the sign divided by the tempo value. If required, the dimensions (length or width) of the screen (space to be covered by a musical sign) may be obtained from the electronic device.
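The speed adjustment can be sketched as follows; the conversion from beats per minute to seconds (factor 60) and the pixel-based screen extent are assumptions for illustration:

```python
def sign_play_time(reference_figures_in_sign, tempo_bpm):
    """Seconds needed to play a displayed sign: the amount of reference
    musical figures it contains divided by the tempo value (bpm)."""
    return 60.0 * reference_figures_in_sign / tempo_bpm

def scrolling_speed(screen_extent_px, reference_figures_in_sign, tempo_bpm):
    """Pixels per second so that the sign crosses the screen extent
    (height or width, for vertical or horizontal scrolling) in exactly
    the time needed to play it."""
    return screen_extent_px / sign_play_time(reference_figures_in_sign, tempo_bpm)
```

For example, a sign containing 4 reference figures at 120 bpm takes 2 seconds to play, so on an 800-pixel screen extent the score scrolls at 400 pixels per second.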
  • the invention provides many advantages over prior-art methods of scrolling musical scores. Some advantages are recited next. It is not necessary to establish the exact point of reading, but only the beats per minute at which the player is playing; therefore, different rhythmic variations are detected without causing a reaction critical point. Thus, the scrolling speed variations necessary to adapt the displayed portion of the score to the actual tempo of the player have a wide margin and can be soft, without compromising at any moment the reading of the score.
  • the flow of the digital score on the screen itself acts as a "time line", but it leaves the player, at all times, a wide margin for reading both ahead of and behind the exact point of interpretation. The tempo of any performance can be detected, even a performance obtained from a recording or from a live concert.
  • the method can be carried out by a plurality of users playing simultaneously the same score, but different particellas.
  • each user has a device of the ones already described (at least with a processor, memory and screen), the digital score being shown in a device of each user.
  • one of the devices can work as a master one, in the sense that the other devices synchronize with respect to this one.
  • each electronic device may scroll the corresponding score (particella) independently -that is to say, not synchronized- from the scroll of the other devices in the musical group.
  • the software application also permits the user to purchase scores. Preferably, once a score has been purchased, it is stored in an external system restricted to a particular classification of metadata.
  • the term “approximately” and terms of its family should be understood as indicating values very near to those which accompany the aforementioned term. That is to say, a deviation within reasonable limits from an exact value should be accepted, because a skilled person in the art will understand that such a deviation from the values indicated is inevitable due to measurement inaccuracies, etc. The same applies to the terms “about” and “around” and “substantially”.
EP18382392.1A 2018-06-04 2018-06-04 Procédé, dispositif et produit de programme informatique pour faire défiler une partition musicale Active EP3579223B1 (fr)

Priority Applications (2)

Application Number Priority Date Filing Date Title
EP18382392.1A EP3579223B1 (fr) 2018-06-04 2018-06-04 Procédé, dispositif et produit de programme informatique pour faire défiler une partition musicale
PCT/EP2019/064153 WO2019233886A1 (fr) 2018-06-04 2019-05-30 Procédé, dispositif et produit-programme informatique pour faire défiler une partition musicale

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
EP18382392.1A EP3579223B1 (fr) 2018-06-04 2018-06-04 Procédé, dispositif et produit de programme informatique pour faire défiler une partition musicale

Publications (2)

Publication Number Publication Date
EP3579223A1 true EP3579223A1 (fr) 2019-12-11
EP3579223B1 EP3579223B1 (fr) 2021-01-13

Family

ID=62716005

Family Applications (1)

Application Number Title Priority Date Filing Date
EP18382392.1A Active EP3579223B1 (fr) 2018-06-04 2018-06-04 Procédé, dispositif et produit de programme informatique pour faire défiler une partition musicale

Country Status (2)

Country Link
EP (1) EP3579223B1 (fr)
WO (1) WO2019233886A1 (fr)

Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6156964A (en) 1999-06-03 2000-12-05 Sahai; Anil Apparatus and method of displaying music
US20010023635A1 (en) 2000-03-22 2001-09-27 Hideaki Taruguchi Method and apparatus for detecting performance position of real-time performance data
US20050015258A1 (en) 2003-07-16 2005-01-20 Arun Somani Real time music recognition and display system
US7098392B2 (en) 1996-07-10 2006-08-29 Sitrick David H Electronic image visualization system and communication methodologies
US8530735B2 (en) 2009-12-04 2013-09-10 Stephen Maebius System for displaying and scrolling musical notes
US8660678B1 (en) 2009-02-17 2014-02-25 Tonara Ltd. Automatic score following
US20140358265A1 (en) * 2013-05-31 2014-12-04 Dolby Laboratories Licensing Corporation Audio Processing Method and Audio Processing Apparatus, and Training Method
US9280960B1 (en) 2014-12-15 2016-03-08 Amazon Technologies, Inc. Navigating music using an index including musical symbols
EP2919228B1 (fr) 2014-03-12 2016-10-19 NewMusicNow, S.L. Procédé, dispositif et programme informatique pour faire défiler une partition musicale.
US20170110102A1 (en) * 2014-06-10 2017-04-20 Makemusic Method for following a musical score and associated modeling method
US9747876B1 (en) 2015-07-28 2017-08-29 Amazon Technologies, Inc. Adaptive layout of sheet music in coordination with detected audio


Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
ARSHIA CONT: "Modeling musical anticipation: From the time of music to the music of time", 16 October 2008 (2008-10-16), XP055516200, ISBN: 978-0-549-90641-4, Retrieved from the Internet <URL:https://tel.archives-ouvertes.fr/tel-00417565/document> [retrieved on 20181017] *
ARZT A ET AL: "Automatic page turning for musicians via real-time machine listening", 18TH EUROPEAN CONFERENCE ON ARTIFICIAL INTELLIGENCE, GHALLAB METAL, PATRAS GREECE, 25 July 2008 (2008-07-25), XP001544133, ISBN: 978-1-58603-891-5 *
HAYTHAM FAYEK: "Speech Processing for Machine Learning: Filter banks", MEL-FREQUENCY CEPSTRAL COEFFICIENTS (MFCCS) AND WHAT'S IN-BETWEEN, 21 April 2016 (2016-04-21), Retrieved from the Internet <URL:http://haythamfayek.com/2016/04/21/speech-processing-for-machine-learning.html>
M.F. MCKINNEY ET AL.: "Evaluation of Audio Beat Tracking and Music Tempo Extraction Algorithms", JOURNAL OF NEW MUSIC RESEARCH, vol. 36, no. 1, 2007, pages 1 - 16, XP055178414, DOI: doi:10.1080/09298210701653252

Also Published As

Publication number Publication date
EP3579223B1 (fr) 2021-01-13
WO2019233886A1 (fr) 2019-12-12


Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION HAS BEEN PUBLISHED

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

AX Request for extension of the european patent

Extension state: BA ME

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE

17P Request for examination filed

Effective date: 20200611

RBV Designated contracting states (corrected)

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

GRAP Despatch of communication of intention to grant a patent

Free format text: ORIGINAL CODE: EPIDOSNIGR1

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: GRANT OF PATENT IS INTENDED

RIN1 Information on inventor provided before grant (corrected)

Inventor name: ZIVANOVIC JEREMIC, MIROSLAV

Inventor name: ESTENOZ ABENDANO, AINHOA

INTG Intention to grant announced

Effective date: 20200806

GRAS Grant fee paid

Free format text: ORIGINAL CODE: EPIDOSNIGR3

GRAA (expected) grant

Free format text: ORIGINAL CODE: 0009210

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE PATENT HAS BEEN GRANTED

AK Designated contracting states

Kind code of ref document: B1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

REG Reference to a national code

Ref country code: GB

Ref legal event code: FG4D

REG Reference to a national code

Ref country code: CH

Ref legal event code: EP

REG Reference to a national code

Ref country code: IE

Ref legal event code: FG4D

REG Reference to a national code

Ref country code: DE

Ref legal event code: R096

Ref document number: 602018011820

Country of ref document: DE

REG Reference to a national code

Ref country code: AT

Ref legal event code: REF

Ref document number: 1355136

Country of ref document: AT

Kind code of ref document: T

Effective date: 20210215

REG Reference to a national code

Ref country code: AT

Ref legal event code: MK05

Ref document number: 1355136

Country of ref document: AT

Kind code of ref document: T

Effective date: 20210113

REG Reference to a national code

Ref country code: NL

Ref legal event code: MP

Effective date: 20210113

REG Reference to a national code

Ref country code: LT

Ref legal event code: MG9D

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: LT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20210113

Ref country code: PT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20210513

Ref country code: NO

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20210413

Ref country code: GR

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20210414

Ref country code: HR

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20210113

Ref country code: FI

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20210113

Ref country code: BG

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20210413

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: RS

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20210113

Ref country code: PL

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20210113

Ref country code: LV

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20210113

Ref country code: AT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20210113

Ref country code: SE

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20210113

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: IS

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20210513

REG Reference to a national code

Ref country code: DE

Ref legal event code: R097

Ref document number: 602018011820

Country of ref document: DE

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: SM

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20210113

Ref country code: EE

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20210113

Ref country code: CZ

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20210113

PLBE No opposition filed within time limit

Free format text: ORIGINAL CODE: 0009261

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: NO OPPOSITION FILED WITHIN TIME LIMIT

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: DK

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20210113

Ref country code: SK

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20210113

Ref country code: RO

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20210113

26N No opposition filed

Effective date: 20211014

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: AL

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20210113

Ref country code: MC

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20210113

Ref country code: ES

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20210113

REG Reference to a national code

Ref country code: CH

Ref legal event code: PL

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: SI

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20210113

REG Reference to a national code

Ref country code: BE

Ref legal event code: MM

Effective date: 20210630

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: LU

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20210604

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: LI

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20210630

Ref country code: IT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20210113

Ref country code: IE

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20210604

Ref country code: CH

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20210630

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: IS

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20210513

Ref country code: FR

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20210630

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: BE

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20210630

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: NL

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20210113

Ref country code: CY

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20210113

P01 Opt-out of the competence of the unified patent court (upc) registered

Effective date: 20230526

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: HU

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT; INVALID AB INITIO

Effective date: 20180604

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: DE

Payment date: 20230626

Year of fee payment: 6

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: GB

Payment date: 20230529

Year of fee payment: 6