CN111613198B - Rhythm type identification method and application of MIDI - Google Patents

Rhythm type identification method and application of MIDI

Info

Publication number
CN111613198B
Authority
CN
China
Prior art keywords
midi
occurrence
rhythm
notes
region
Prior art date
Legal status
Active
Application number
CN202010399290.7A
Other languages
Chinese (zh)
Other versions
CN111613198A (en)
Inventor
李晨啸
张克俊
Current Assignee
Fuyiyue Technology Hangzhou Co ltd
Zhejiang University ZJU
Original Assignee
Fuyiyue Technology Hangzhou Co ltd
Zhejiang University ZJU
Priority date
Filing date
Publication date
Application filed by Fuyiyue Technology Hangzhou Co ltd, Zhejiang University ZJU filed Critical Fuyiyue Technology Hangzhou Co ltd
Priority to CN202010399290.7A
Publication of CN111613198A
Application granted
Publication of CN111613198B

Classifications

    • GPHYSICS
    • G10MUSICAL INSTRUMENTS; ACOUSTICS
    • G10HELECTROPHONIC MUSICAL INSTRUMENTS; INSTRUMENTS IN WHICH THE TONES ARE GENERATED BY ELECTROMECHANICAL MEANS OR ELECTRONIC GENERATORS, OR IN WHICH THE TONES ARE SYNTHESISED FROM A DATA STORE
    • G10H1/00Details of electrophonic musical instruments
    • G10H1/36Accompaniment arrangements
    • G10H1/40Rhythm
    • GPHYSICS
    • G10MUSICAL INSTRUMENTS; ACOUSTICS
    • G10LSPEECH ANALYSIS TECHNIQUES OR SPEECH SYNTHESIS; SPEECH RECOGNITION; SPEECH OR VOICE PROCESSING TECHNIQUES; SPEECH OR AUDIO CODING OR DECODING
    • G10L25/00Speech or voice analysis techniques not restricted to a single one of groups G10L15/00 - G10L21/00
    • G10L25/48Speech or voice analysis techniques not restricted to a single one of groups G10L15/00 - G10L21/00 specially adapted for particular use
    • GPHYSICS
    • G10MUSICAL INSTRUMENTS; ACOUSTICS
    • G10HELECTROPHONIC MUSICAL INSTRUMENTS; INSTRUMENTS IN WHICH THE TONES ARE GENERATED BY ELECTROMECHANICAL MEANS OR ELECTRONIC GENERATORS, OR IN WHICH THE TONES ARE SYNTHESISED FROM A DATA STORE
    • G10H2210/00Aspects or methods of musical processing having intrinsic musical character, i.e. involving musical theory or musical parameters or relying on musical knowledge, as applied in electrophonic musical tools or instruments
    • G10H2210/031Musical analysis, i.e. isolation, extraction or identification of musical elements or musical parameters from a raw acoustic signal or from an encoded audio signal
    • G10H2210/071Musical analysis, i.e. isolation, extraction or identification of musical elements or musical parameters from a raw acoustic signal or from an encoded audio signal for rhythm pattern analysis or rhythm style recognition
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02BCLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO BUILDINGS, e.g. HOUSING, HOUSE APPLIANCES OR RELATED END-USER APPLICATIONS
    • Y02B20/00Energy efficient lighting technologies, e.g. halogen lamps or gas discharge lamps
    • Y02B20/40Control techniques providing energy savings, e.g. smart controller or presence detection

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Acoustics & Sound (AREA)
  • Multimedia (AREA)
  • Computational Linguistics (AREA)
  • Signal Processing (AREA)
  • Health & Medical Sciences (AREA)
  • Audiology, Speech & Language Pathology (AREA)
  • Human Computer Interaction (AREA)
  • Auxiliary Devices For Music (AREA)

Abstract

The invention discloses a rhythm type identification method for MIDI, comprising the following steps: splitting a MIDI fragment into a bass region and a mid-high pitch region according to its pitch-range distribution; dividing the MIDI fragment into N units of a certain unit length, dividing the duration of each unit into a number of time steps, and counting, separately for the bass region and the mid-high pitch region, the frequency of note occurrences in each time step; for the bass region and the mid-high pitch region respectively, counting the average frequency of note occurrences over the corresponding time steps of the N units; the average frequencies, arranged as a sequence over time steps, constitute the rhythm type feature. A rhythm-type-based music style determination method and a rhythm-type-based performance mode determination method are also disclosed. All three methods are simple and provide a new approach to automatic composition and automatic accompaniment.

Description

Rhythm type identification method and application of MIDI
Technical Field
The invention belongs to the field of MIDI music, and particularly relates to a rhythm type identification method of MIDI, a rhythm-type-based music style determination method, and a rhythm-type-based performance mode determination method.
Background
The rhythm pattern of music generally refers to the regular, repeating feel that melodies, textures, drum parts and the like create for the listener. In jazz, it can be felt as the quality of a continuously repeating rhythmic unit that arises from the interplay of the music played by a band's rhythm section (e.g. drums, electric bass or double bass, guitar and keyboards). Such rhythm patterns are an important feature of popular music and can be found in many genres, including salsa, funk, rock, fusion and soul. A typical rock rhythm pattern, for example, has the bass drum and low percussion marking beats 1 and 3, higher-pitched decorative strokes on beats 2 and 4, and eighth notes added on the half-beat positions.
From a broader ethnomusicological perspective, a rhythm pattern has been described as something sustained in a distinctive, regular and attractive way — a particular yet orderly quality that draws the listener in. A sense of rhythm is the understanding or feeling of such a pattern, an intuitive sense of the cyclical motion of the music.
Different music styles each have highly characteristic rhythm patterns, and people often judge the stylistic attributes of a piece of music by distinguishing between different rhythmic features. The rhythms of blues, hip hop, jazz, reggae and the like each revolve around one or two core rhythm patterns, from which countless musical works have been derived.
MIDI is a technology standard that describes communication protocols, digital interfaces, and specifications that connect various electronic musical instruments, computers, and related audio equipment for playing, editing, and recording music.
Notes played on an instrument and expressed in MIDI format are a symbolic representation of the performance content. The most basic MIDI note format does not carry specific timbre information; it only records, as a time series, which keys are pressed and when, with what velocity, and when they are released. MIDI is widely used in modern recording, analysis and production workflows to capture instrumental performance information. In computer-based production and analysis of modern electronic music, a large number of algorithms have emerged, and demands such as automatic generation of melodies and textures and automatic accompaniment continue to grow. The identification and analysis of rhythm patterns is of great significance to these tasks.
Early music sequencers, drum machines and electronic instruments already took the rhythm type attribute of MIDI fragments into account, realizing different rhythmic effects through preset templates or programmable control modules. Later music workstations, synthesizers and computer virtual-instrument plug-ins (VSTs) likewise added support for rhythm selection, control and customization.
However, existing algorithms generally generate MIDI notes from preset rhythm templates; schemes that analyze the rhythm pattern of a given MIDI fragment, infer its style, and then generate similar or deliberately differentiated material are still relatively scarce.
Disclosure of Invention
The invention aims to provide a rhythm type identification method of MIDI, which comprises the following steps:
splitting a MIDI fragment into a bass region and a mid-high pitch region according to its pitch-range distribution;
dividing the MIDI fragment into N units of a certain unit length, dividing the duration of each unit into a number of time steps, and counting the frequency of note occurrences in each time step of the bass region and of the mid-high pitch region respectively;
for the bass region and the mid-high pitch region respectively, counting the average frequency of note occurrences over the corresponding time steps of the N units; the average frequencies, arranged as a sequence over time steps, constitute the rhythm type feature.
The rhythm type identification method of MIDI can derive the rhythm type feature from an analysis of the MIDI fragment; it is simple and reliable, and provides a new approach to automatic composition and automatic accompaniment.
Another object of the present invention is to provide a rhythm-type-based music style determination method, comprising the following steps:
determining the rhythm type feature of a MIDI fragment using the above rhythm type identification method;
calculating the similarity between this rhythm type feature and the rhythm type feature corresponding to each music style, and selecting the music style with the maximum similarity as the music style of the MIDI fragment.
In this rhythm-type-based music style determination method, the music style of a MIDI fragment is identified by the similarity between the rhythm type feature of each music style and the rhythm type feature of the fragment to be classified; the process is simple and the style determination is accurate.
Still another object of the present invention is to provide a rhythm-type-based performance mode determination method, comprising the following steps:
determining the rhythm type feature of a MIDI fragment using the above rhythm type identification method;
according to the rhythm type feature of the mid-high pitch region, counting the total number of time steps in which notes occur and the total frequency of note occurrences in the mid-high pitch region;
determining the performance mode of the mid-high pitch region according to the ratio of the total frequency of note occurrences to the total number of time steps in which notes occur.
The performance mode is determined directly from the note-frequency profile presented by the rhythm type feature of the MIDI fragment.
Drawings
To more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings required for describing the embodiments or the prior art are briefly introduced below. Obviously, the drawings in the following description show only some embodiments of the present invention, and a person skilled in the art may obtain other drawings from them without inventive effort.
Fig. 1 is a flowchart of a rhythm type recognition method of MIDI provided by the embodiment;
FIG. 2 is a schematic diagram of a MIDI fragment split into a bass region and a mid-high pitch region according to the pitch-range distribution, provided by an embodiment;
FIG. 3 is a diagram visualizing the counted note-occurrence frequency of each time step in the bass region and the mid-high pitch region, provided by the embodiment;
FIG. 4 is a diagram visualizing the counted average frequency of note occurrences over the corresponding time steps of N units, provided by the embodiment;
fig. 5 is a flowchart of the rhythm-type-based music style determination method provided by the embodiment;
fig. 6 is a flowchart of the rhythm-type-based performance mode determination method provided by the embodiment.
Detailed Description
The present invention will be described in further detail with reference to the drawings and examples, in order to make the objects, technical solutions and advantages of the present invention more apparent. It should be understood that the detailed description is presented by way of example only and is not intended to limit the scope of the invention.
As shown in fig. 1, an embodiment provides a rhythm type recognition method of MIDI, which includes the following steps:
s101, dividing MIDI fragments into a bass region and a middle-high pitch region according to the threshold distribution.
The MIDI fragment is generally a MIDI performance fragment of a texture (accompaniment) instrument, e.g. a piano, a string instrument, a guitar or an electronic synthesizer. When dividing the pitch regions, the highest and lowest pitches among all notes of the whole MIDI fragment are first analyzed to determine the pitch range, and the MIDI fragment is then split into a bass region and a mid-high pitch region according to this pitch-range distribution. The lower 1/4 to 1/3 of the pitch range, adjacent to the lowest pitch, can be taken as the bass region, with the remainder forming the mid-high pitch region. Specifically, the lowest quarter of the pitch range can be assigned directly to the bass region, leaving the remaining range (i.e. the 3/4 of the range nearer the highest pitch) as the mid-high pitch region, as shown in FIG. 2. Once the two regions are determined, notes falling in the bass region are bass notes and notes falling in the mid-high pitch region are mid-high notes. Since the mid-high pitch region consists of notes played by the texture instrument, it can also be called the mid-high texture region, and its notes texture notes.
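The pitch-range split can be expressed compactly in code. The following Python sketch only illustrates the rule just described; the representation of notes as (pitch, start_beat, duration_beats) tuples, the function name split_registers and the default 1/4 ratio are assumptions made for this example, not part of the patent.

def split_registers(notes, bass_ratio=0.25):
    # notes: list of (pitch, start_beat, duration_beats) tuples (assumed layout).
    # bass_ratio: fraction of the pitch range, measured from the lowest pitch,
    # used as the bass region (1/4 to 1/3 according to the description).
    pitches = [pitch for pitch, _, _ in notes]
    lowest, highest = min(pitches), max(pitches)
    cutoff = lowest + (highest - lowest) * bass_ratio
    bass = [note for note in notes if note[0] <= cutoff]
    mid_high = [note for note in notes if note[0] > cutoff]
    return bass, mid_high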
S102, dividing the MIDI fragment into N units of a certain unit length, dividing the duration of each unit into a number of time steps, and counting the frequency of note occurrences in each time step of the bass region and of the mid-high pitch region respectively.
The rhythm pattern of a MIDI fragment generally repeats cyclically, with one or several phrases forming one rhythm type; the length of the minimum rhythm type is therefore taken as the unit length. Extensive statistics show that MIDI fragments are typically divided into units of 2 or 4 bars. For a 32-bar MIDI fragment with 2-bar units, the fragment is divided into 16 units, and the frequency of note occurrences is counted for the bass region and the mid-high pitch region within these 16 units.
For the frequency statistics, each unit is divided into time steps, i.e. the duration of each unit is split into a number of time steps. The granularity of the time step is determined by the time signature and the note values of the MIDI fragment; it is generally required that one time step is at most as long as the shortest note in the MIDI fragment. For example, if a MIDI fragment is in 4/4 time and its shortest note is a quarter note, one time step is at most the duration of one quarter note. If the duration of a 16th note is used as one time step, each beat is divided into 4 time steps.
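A small helper makes the granularity rule concrete. The Python sketch below assumes that one time step equals the duration of a 1/step_note note and one beat equals a 1/beat_note note; the function and parameter names are invented for this illustration.

def steps_per_unit(beats_per_bar=4, bars_per_unit=2, step_note=16, beat_note=4):
    # Example: 4/4 time, 2-bar units, 16th-note time steps -> 4 * (16 // 4) * 2 = 32 steps.
    steps_per_beat = step_note // beat_note
    return beats_per_bar * steps_per_beat * bars_per_unit

print(steps_per_unit())  # 32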
In this embodiment, the statistics record whether notes occur in each time step of the bass region and the mid-high pitch region, and how many notes occur. In the MIDI fragment of FIG. 2, exactly one note appears in each time step of the bass region, so the counted frequency is 1; for time steps of the mid-high pitch region in which no note appears, the counted frequency is 0, and for time steps in which 3 notes appear, it is 3. In this way, using the presence of notes in specific time steps as the criterion, the MIDI fragment is converted into a rhythm expressed numerically.
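The per-time-step counting could be tabulated as in the sketch below; the (pitch, start_beat, duration_beats) note layout carried over from the earlier sketch and the choice to count note onsets are assumptions for illustration.

def count_onsets(notes, n_units, unit_steps, steps_per_beat=4):
    # notes: tuples for one region (bass or mid-high pitch), assumed layout as above.
    # Returns counts[j][i]: number of notes starting in time step i of unit j.
    counts = [[0] * unit_steps for _ in range(n_units)]
    for _, start_beat, _ in notes:
        global_step = int(start_beat * steps_per_beat)
        unit, step = divmod(global_step, unit_steps)
        if 0 <= unit < n_units:
            counts[unit][step] += 1
    return counts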
To present the statistics more intuitively, after the note-occurrence frequency of each time step in the bass region and the mid-high pitch region has been counted, the result may be visualized, which specifically includes:
each time step is initialized as an empty particle drawn in a light color; all empty particles are arranged in time order; the particles of the time steps in which notes occur are drawn in a dark color; and the note-occurrence frequency is written directly onto each dark particle.
In this embodiment, the empty particles may have any shape, such as a circle or a square, which is not limited here.
For example, for a MIDI fragment in 4/4 time with a 16th note as the time step, the visualization of the note-occurrence frequencies of the bass region and the mid-high pitch region of a 1-bar MIDI fragment is shown in FIG. 3.
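A console-only stand-in for the particle view of FIG. 3 might look like the sketch below; the characters ('.' for a light, empty particle, a digit for a dark particle filled with its count) and the sample counts are invented for this example and are not the patent's graphical rendering.

def render_row(step_counts):
    # '.' marks an empty (light) particle, a digit marks a dark particle with its count.
    return ' '.join('.' if count == 0 else str(count) for count in step_counts)

# One bar of 4/4 at 16th-note resolution (16 steps), made-up counts:
print(render_row([1, 0, 0, 0, 2, 0, 1, 0, 3, 0, 0, 0, 2, 0, 1, 0]))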
S103, for the bass region and the mid-high pitch region respectively, counting the average frequency of note occurrences over the corresponding time steps of the N units; the average frequencies, arranged as a sequence over time steps, constitute the rhythm type feature, N being a natural number greater than 2.
In this embodiment, when the units are divided according to the rhythm pattern, every unit has the same duration and the same number of time steps, so the note-occurrence states at corresponding time steps of the different units can be aggregated. Specifically, the average frequency n_i of note occurrences at each time step over the N units is counted according to the following formula:
n_i = (1/N) · Σ_{j=1}^{N} l_{i,j}
where n_i denotes the average frequency of note occurrences at the i-th time step, l_{i,j} denotes the frequency of note occurrences in the i-th time step of the j-th unit, i = 1, 2, 3, ..., M, and M is a natural number greater than 2 denoting the total number of time steps per unit.
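The averaging step maps directly onto code. The sketch below assumes the counts[unit][step] layout produced by the earlier counting sketch and simply implements n_i as the mean of l_{i,j} over the N units.

def average_rhythm_feature(counts):
    # counts[j][i]: note count in time step i of unit j; returns [n_1, ..., n_M].
    n_units = len(counts)
    n_steps = len(counts[0])
    return [sum(counts[j][i] for j in range(n_units)) / n_units
            for i in range(n_steps)]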
Similarly, for visualization, after the average frequency of note occurrences over the corresponding time steps of the N units has been counted, the averaged result may be visualized, which specifically includes:
each time step is initialized as an empty particle drawn in a light color and all empty particles are arranged in time order; the maximum average frequency of the time steps is assigned the dark color, average frequencies from the maximum down to zero are assigned gradient colors running from dark to light, and each empty particle is filled with the gradient color corresponding to its average frequency.
For example, suppose the empty particles correspond to a light color with gray value 55 and the dark color has gray value 255, so that the maximum average frequency corresponds to gray value 255. With a maximum of 5 and a minimum of 0, if the average frequencies from the maximum of 5 down to the minimum of 0 are 5, 4, 3, 2, 1, 0, then gradient colors with gray values 200, 150, 100 and 50 are assigned to the average frequencies 4, 3, 2 and 1 respectively, as shown in FIG. 4.
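A linear gray-value mapping is one possible way to realize such a gradient. The sketch below interpolates between the light value 55 and the dark value 255 from the example; its intermediate grays (215, 175, 135, 95) differ slightly from the 200, 150, 100, 50 quoted above, so it is an approximation rather than the exact assignment of the embodiment.

def gray_value(avg_frequency, max_avg, light=55, dark=255):
    # Map an average frequency in [0, max_avg] linearly onto the [light, dark] gray range.
    if max_avg == 0:
        return light
    return round(light + (dark - light) * (avg_frequency / max_avg))

print([gray_value(a, 5) for a in range(6)])  # [55, 95, 135, 175, 215, 255]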
The rhythm type identification method of MIDI can derive the rhythm type feature from an analysis of the MIDI fragment; it is simple and reliable, and provides a new approach to automatic composition and automatic accompaniment.
As shown in fig. 5, the embodiment further provides a rhythm-type-based music style determination method, which comprises the following steps:
S501, determining the rhythm type feature of the MIDI fragment using the above rhythm type identification method of MIDI.
The steps of the method for calculating the rhythm type characteristics of the MIDI fragment adopted in S501 are the same as those of S101 to S103 in the rhythm type identification method of MIDI, and are not repeated here.
S502, calculating the similarity between the rhythm type feature and the rhythm type feature corresponding to each music style, and selecting the music style with the maximum similarity as the music style of the MIDI fragment.
Music style, i.e. the genre of music, refers to the characteristic overall appearance that a musical work presents, including rock, soul, jazz, electronic pop and so on. The rhythm of each music style exhibits certain rhythm type features, so the music style of a MIDI fragment can be determined from its rhythm type feature.
In this embodiment, the method for determining the rhythm type feature of each music style is as follows:
after collecting MIDI fragment samples belonging to the same music style, determining the rhythm type characteristic of each MIDI fragment sample by adopting the rhythm type identification method of MIDI, and taking the average value of the rhythm type characteristics of all MIDI fragment samples as the rhythm type characteristic of the music style.
In this embodiment, when determining similarity, the similarity between the rhythm type feature of the MIDI fragment and the rhythm type feature corresponding to each music style is calculated using clustering, relative entropy or the Wasserstein distance.
When clustering is used, the rhythm type feature of each music style serves as a cluster center; the rhythm type feature of the MIDI fragment is assigned to a cluster according to its distance to the cluster centers, and the music style corresponding to the cluster it belongs to is taken as the determined music style.
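As one of the similarity options mentioned above, relative entropy (KL divergence) between normalized rhythm type features could be computed as sketched below; the normalization, the smoothing constant eps and the dictionary of style templates are assumptions of this example, and a smaller divergence is treated as a higher similarity.

import math

def relative_entropy(p, q, eps=1e-9):
    # KL divergence D(p || q) after normalizing both feature sequences to distributions;
    # eps avoids log(0) and division by zero for empty time steps.
    p = [value + eps for value in p]
    q = [value + eps for value in q]
    p_sum, q_sum = sum(p), sum(q)
    p = [value / p_sum for value in p]
    q = [value / q_sum for value in q]
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))

def best_style(feature, style_templates):
    # style_templates: dict mapping a style name to its rhythm type feature.
    return min(style_templates,
               key=lambda style: relative_entropy(feature, style_templates[style]))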
In this rhythm-type-based music style determination method, the music style of a MIDI fragment is identified by the similarity between the rhythm type feature of each music style and the rhythm type feature of the fragment to be classified; the process is simple and the style determination is accurate.
As shown in fig. 6, the embodiment further provides a rhythm-type-based performance mode determination method, which comprises the following steps:
s601, determining the rhythm type characteristics of the MIDI fragments by adopting the rhythm type recognition method of MIDI.
The steps of the method for calculating the rhythm type characteristics of the MIDI fragment adopted in S601 are the same as those in S101 to S103 in the rhythm type identification method of MIDI, and are not described here again.
S602, according to the rhythm type feature of the mid-high pitch region, counting the total number of time steps in which notes occur and the total frequency of note occurrences in the mid-high pitch region.
For example, suppose a MIDI fragment has 16 time steps in total and, in the mid-high pitch region, the average occurrence frequency is 2.3 at time step 1, 2.6 at time step 5 and 2.9 at time step 13; then the total number of time steps in which notes occur is 3 and the total frequency of note occurrences is 2.3 + 2.6 + 2.9 = 7.8.
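The two totals follow directly from the rhythm type feature of the region. The short sketch below reproduces the worked example, with the 1-indexed time steps of the text mapped to 0-based list indices.

def rhythm_totals(feature):
    # Number of time steps with notes, and total average frequency, for one region.
    occupied = [value for value in feature if value > 0]
    return len(occupied), sum(occupied)

feature = [0.0] * 16
feature[0], feature[4], feature[12] = 2.3, 2.6, 2.9  # time steps 1, 5 and 13 (1-indexed)
print(rhythm_totals(feature))  # (3, 7.8) up to floating-point rounding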
S603, determining the performance mode of the mid-high pitch region according to the ratio of the total frequency of note occurrences to the total number of time steps in which notes occur.
In this embodiment, the performance mode is determined by comparing the ratio of the total frequency of note occurrences to the total number of time steps in which notes occur against set thresholds, specifically as follows (a code sketch of this classification is given after the three cases):
a first threshold and a second threshold for performance mode judgment are set, the first threshold being smaller than the second threshold; experiments show that values of about 1.3 and 2.0 for the first and second thresholds perform well;
when the ratio of the total frequency of note occurrences to the total number of time steps in which notes occur is smaller than the first threshold, the performance mode of the mid-high pitch region is determined to be a broken-chord (arpeggiated) performance mode;
when the ratio is greater than the second threshold, the performance mode of the mid-high pitch region is determined to be a block-chord performance mode;
when the ratio lies between the first threshold and the second threshold, the performance mode of the mid-high pitch region is determined to be a mixed performance mode.
The performance mode is thus determined directly from the note-frequency profile presented by the rhythm type feature of the MIDI fragment.
The foregoing detailed description of the preferred embodiments and of the advantages of the invention is merely illustrative of presently preferred embodiments; it should be understood that it does not limit the invention, and that any changes, additions, substitutions and equivalents made within the spirit and principles of the invention are intended to fall within its scope.

Claims (8)

1. A method for identifying a rhythm type of MIDI comprising the steps of:
splitting a MIDI fragment into a bass region and a mid-high pitch region according to its pitch-range distribution;
dividing the MIDI fragment into N units of a certain unit length, dividing the duration of each unit into a number of time steps, counting the frequency of note occurrences in each time step of the bass region and of the mid-high pitch region respectively, and visualizing the counting result, which comprises: initializing each time step as an empty particle drawn in a light color, arranging all empty particles in time order, drawing the particles of the time steps in which notes occur in a dark color, and writing the note-occurrence frequency directly onto the dark particles;
for the bass region and the mid-high pitch region respectively, counting the average frequency of note occurrences over the corresponding time steps of the N units, and visualizing the counted average frequencies, which comprises: initializing each time step as an empty particle drawn in a light color, arranging all empty particles in time order, assigning the dark color to the maximum average frequency of the time steps, assigning gradient colors from dark to light to average frequencies from the maximum down to zero, and filling each empty particle with the gradient color corresponding to its average frequency;
wherein the average frequencies, arranged as a sequence over time steps, constitute the rhythm type feature, and N is a natural number greater than 2.
2. The rhythm type identification method of MIDI according to claim 1, characterized in that the 1/4 to 1/3 of the pitch range nearest the lowest pitch is taken as the bass region and the rest as the mid-high pitch region;
and the MIDI fragment is divided into units with a unit length of 2 or 4 bars.
3. The rhythm type identification method of MIDI according to claim 1, wherein the average frequency n_i of note occurrences over the time steps corresponding to the N units is counted according to the following formula:
n_i = (1/N) · Σ_{j=1}^{N} l_{i,j}
wherein n_i denotes the average frequency of note occurrences at the i-th time step, l_{i,j} denotes the frequency of note occurrences in the i-th time step of the j-th unit, i = 1, 2, 3, ..., M, and M is a natural number greater than 2 denoting the total number of time steps per unit.
4. A rhythm-type-based music style determination method, characterized by comprising the following steps:
determining a rhythm type characteristic of a MIDI fragment using the rhythm type recognition method of MIDI as set forth in any one of claims 1 to 3;
calculating the similarity between the rhythm type feature and the rhythm type feature corresponding to each music style, and selecting the music style with the maximum similarity as the music style of the MIDI fragment.
5. A rhythm-based music style determination method according to claim 4 wherein the determination method of rhythm-type characteristics of each music style is:
after collecting MIDI fragment samples belonging to the same music style, determining the rhythm type characteristic of each MIDI fragment sample by adopting the rhythm type recognition method of the MIDI as set forth in any one of claims 1 to 3, and taking the average value of the rhythm type characteristics of all MIDI fragment samples as the rhythm type characteristic of the music style.
6. The rhythm-type-based music style determination method according to claim 4, wherein the similarity between the rhythm type feature and that of each music style is calculated using clustering, relative entropy or the Wasserstein distance.
7. A rhythm-type-based performance mode determination method, characterized by comprising the following steps:
determining a rhythm type characteristic of a MIDI fragment using the rhythm type recognition method of MIDI as set forth in any one of claims 1 to 3;
according to the rhythm type feature of the mid-high pitch region, counting the total number of time steps in which notes occur and the total frequency of note occurrences in the mid-high pitch region;
determining the performance mode of the mid-high pitch region according to the ratio of the total frequency of note occurrences to the total number of time steps in which notes occur.
8. The rhythm-based performance mode determination method according to claim 7, further comprising:
setting a first threshold and a second threshold for performance mode judgment, the first threshold being smaller than the second threshold;
when the ratio of the total frequency of note occurrences to the total number of time steps in which notes occur is smaller than the first threshold, determining that the performance mode of the mid-high pitch region is a broken-chord (arpeggiated) performance mode;
when the ratio is greater than the second threshold, determining that the performance mode of the mid-high pitch region is a block-chord performance mode;
when the ratio is between the first threshold and the second threshold, determining that the performance mode of the mid-high pitch region is a mixed performance mode.
CN202010399290.7A 2020-05-12 2020-05-12 Rhythm type identification method and application of MIDI Active CN111613198B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010399290.7A CN111613198B (en) 2020-05-12 2020-05-12 Rhythm type identification method and application of MIDI

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010399290.7A CN111613198B (en) 2020-05-12 2020-05-12 Rhythm type identification method and application of MIDI

Publications (2)

Publication Number Publication Date
CN111613198A CN111613198A (en) 2020-09-01
CN111613198B true CN111613198B (en) 2023-09-08

Family

ID=72205152

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010399290.7A Active CN111613198B (en) 2020-05-12 2020-05-12 Rhythm type identification method and application of MIDI

Country Status (1)

Country Link
CN (1) CN111613198B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112331170B (en) * 2020-10-28 2023-09-15 平安科技(深圳)有限公司 Method, device, equipment and storage medium for analyzing Buddha music melody similarity

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20130039944A (en) * 2011-10-13 2013-04-23 김영옥 A novel score and apparatus for displaying the same
CN105374347A (en) * 2015-09-22 2016-03-02 中国传媒大学 A mixed algorithm-based computer-aided composition method for popular tunes in regions south of the Yangtze River

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3557917B2 (en) * 1998-09-24 2004-08-25 ヤマハ株式会社 Automatic composer and storage medium
JP4670423B2 (en) * 2005-03-24 2011-04-13 ヤマハ株式会社 Music information analysis and display device and program

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20130039944A (en) * 2011-10-13 2013-04-23 김영옥 A novel score and apparatus for displaying the same
CN105374347A (en) * 2015-09-22 2016-03-02 中国传媒大学 A mixed algorithm-based computer-aided composition method for popular tunes in regions south of the Yangtze River

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
A Comparison of Three Different Performance Versions of Schumann's "Symphonic Etudes"; Wang Lingzi; Journal of Bengbu University; 2018-08-20 (No. 04); full text *

Also Published As

Publication number Publication date
CN111613198A (en) 2020-09-01

Similar Documents

Publication Publication Date Title
Bittner et al. Medleydb: A multitrack dataset for annotation-intensive mir research.
US9053696B2 (en) Searching for a tone data set based on a degree of similarity to a rhythm pattern
US5792971A (en) Method and system for editing digital audio information with music-like parameters
US9563701B2 (en) Sound data processing device and method
US5736666A (en) Music composition
US9875304B2 (en) Music selection and organization using audio fingerprints
CN102760426B (en) Searched for using the such performance data for representing musical sound generation mode
US10225328B2 (en) Music selection and organization using audio fingerprints
US20150220633A1 (en) Music selection and organization using rhythm, texture and pitch
EP1340219A4 (en) Method for analyzing music using sounds of instruments
CN1750116A (en) Automatic rendition style determining apparatus and method
US11948542B2 (en) Systems, devices, and methods for computer-generated musical note sequences
CN111613198B (en) Rhythm type identification method and application of MIDI
JP2806351B2 (en) Performance information analyzer and automatic arrangement device using the same
EP2342708A1 (en) Method for analyzing a digital music audio signal
KR20030067377A (en) Method and apparatus for searching of musical data based on melody
Das et al. Analyzing and classifying guitarists from rock guitar solo tablature
JP7033365B2 (en) Music processing system, music processing program, and music processing method
Vatolkin et al. An evolutionary multi-objective feature selection approach for detecting music segment boundaries of specific types
Moroni et al. Evolutionary computation applied to algorithmic composition
Tian et al. Music structural segmentation across genres with Gammatone features
WO2022019268A1 (en) Music processing system, music processing program, and music processing method
Tan et al. Rhythm analysis for personal and social music applications using drum loop patterns
Savelsberg Visualizing the Structure of Music
Adli et al. A content dependent visualization system for symbolic representation of piano stream

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant