CN111613198A - MIDI rhythm type identification method and application - Google Patents
- Publication number
- CN111613198A (Application No. CN202010399290.7A)
- Authority
- CN
- China
- Prior art keywords
- midi
- notes
- rhythm
- occurrence
- frequency
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- G—PHYSICS
- G10—MUSICAL INSTRUMENTS; ACOUSTICS
- G10H—ELECTROPHONIC MUSICAL INSTRUMENTS; INSTRUMENTS IN WHICH THE TONES ARE GENERATED BY ELECTROMECHANICAL MEANS OR ELECTRONIC GENERATORS, OR IN WHICH THE TONES ARE SYNTHESISED FROM A DATA STORE
- G10H1/00—Details of electrophonic musical instruments
- G10H1/36—Accompaniment arrangements
- G10H1/40—Rhythm
-
- G—PHYSICS
- G10—MUSICAL INSTRUMENTS; ACOUSTICS
- G10L—SPEECH ANALYSIS OR SYNTHESIS; SPEECH RECOGNITION; SPEECH OR VOICE PROCESSING; SPEECH OR AUDIO CODING OR DECODING
- G10L25/00—Speech or voice analysis techniques not restricted to a single one of groups G10L15/00 - G10L21/00
- G10L25/48—Speech or voice analysis techniques not restricted to a single one of groups G10L15/00 - G10L21/00 specially adapted for particular use
-
- G—PHYSICS
- G10—MUSICAL INSTRUMENTS; ACOUSTICS
- G10H—ELECTROPHONIC MUSICAL INSTRUMENTS; INSTRUMENTS IN WHICH THE TONES ARE GENERATED BY ELECTROMECHANICAL MEANS OR ELECTRONIC GENERATORS, OR IN WHICH THE TONES ARE SYNTHESISED FROM A DATA STORE
- G10H2210/00—Aspects or methods of musical processing having intrinsic musical character, i.e. involving musical theory or musical parameters or relying on musical knowledge, as applied in electrophonic musical tools or instruments
- G10H2210/031—Musical analysis, i.e. isolation, extraction or identification of musical elements or musical parameters from a raw acoustic signal or from an encoded audio signal
- G10H2210/071—Musical analysis, i.e. isolation, extraction or identification of musical elements or musical parameters from a raw acoustic signal or from an encoded audio signal for rhythm pattern analysis or rhythm style recognition
-
- Y—GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y02—TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
- Y02B—CLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO BUILDINGS, e.g. HOUSING, HOUSE APPLIANCES OR RELATED END-USER APPLICATIONS
- Y02B20/00—Energy efficient lighting technologies, e.g. halogen lamps or gas discharge lamps
- Y02B20/40—Control techniques providing energy savings, e.g. smart controller or presence detection
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- Acoustics & Sound (AREA)
- Multimedia (AREA)
- Computational Linguistics (AREA)
- Signal Processing (AREA)
- Health & Medical Sciences (AREA)
- Audiology, Speech & Language Pathology (AREA)
- Human Computer Interaction (AREA)
- Auxiliary Devices For Music (AREA)
Abstract
The invention discloses a MIDI rhythm type identification method, which comprises the following steps: dividing a MIDI fragment into a bass region and a mid-high region according to the pitch-threshold distribution; dividing the MIDI fragment into N units of a certain unit length, dividing the duration of each unit into several time steps, and counting the frequency of note occurrence in each time step for the bass region and the mid-high region respectively; for each region, counting the average frequency of note occurrence at each time step across the N units; the average frequencies, arranged in sequence by time step, represent the rhythm-type feature. A rhythm-type-based music style determination method and a rhythm-type-based playing mode determination method are also disclosed. All three methods are simple and provide a new approach to automatic composition and automatic accompaniment.
Description
Technical Field
The invention belongs to the field of MIDI music, and particularly relates to a MIDI rhythm type identification method, a rhythm-type-based music style determination method, and a rhythm-type-based playing mode determination method.
Background
The rhythm type (groove) of music generally refers to a regular, repeating pattern perceived in the melody, texture, drum kit, and so on. In jazz, it can be regarded as the quality of a continuously repeating rhythmic unit, produced by the interaction of the music played by the band's rhythm section (e.g., drums, electric bass or double bass, guitar, and keyboard). Rhythm types are an important feature of popular music and can be found in many genres, including salsa, funk, rock, fusion, and soul music. In a typical rock-and-roll rhythm type, the bass drum and the bass of the texture instrument fall on beats 1 and 3, the snare drum and higher decorative figures of the texture instrument fall on beats 2 and 4, and melodic eighth notes are added at the half-beat positions.
From the broader perspective of ethnomusicologists, a rhythm type is described as something sustained in a distinctive, regular and attractive way, an unspecifiable but ordered sense that draws the listener in. A rhythm type is an understanding or feeling of a rhythmic pattern, an intuitive sense of the cyclic motion of the music.
Different music styles each have very characteristic rhythm types. People often judge the stylistic attributes of a piece of music by distinguishing between different rhythm-type features. Blues, hip-hop, jazz and similar styles each revolve around one or two core rhythm types, from which countless pieces of music are derived.
MIDI is a technical standard describing the communications protocol, digital interface, and specifications for connecting a wide variety of electronic musical instruments, computers, and related audio devices to play, edit, and record music.
Notes in the MIDI format are a symbolic representation of a performance. The most basic MIDI note data carries no timbre information; it simply lists, in time order, when each key is pressed, how hard, and when it is released. In modern music recording, analysis and production, MIDI is widely used to record instrument performance information. In computer-assisted production and analysis of modern electronic music, many algorithms have emerged, and demands such as automatic generation of melody and texture and automatic accompaniment continue to grow. The identification and analysis of rhythm types is therefore of great significance.
In early sequencers, drum machines, and electronic organs, the rhythmic attributes of MIDI fragments were already considered, and different rhythmic effects were achieved through predefined templates or program-controlled modules. Later music workstations, synthesizers, and computer virtual-instrument plug-ins (VSTs) also added support for rhythm-type selection, control, and customization.
However, existing algorithms generally generate MIDI notes from a preset rhythm template; schemes that analyze the rhythm type of given MIDI, infer the style of an existing MIDI fragment, and then generate material of the same class or with deliberate differences are relatively lacking.
Disclosure of Invention
The invention aims to provide a MIDI rhythm type identification method, which comprises the following steps:
dividing a MIDI fragment into a bass region and a mid-high region according to the pitch-threshold distribution;
dividing the MIDI fragment into N units of a certain unit length, dividing the duration of each unit into several time steps, and counting the frequency of note occurrence in each time step for the bass region and the mid-high region respectively;
for the bass region and the mid-high region respectively, counting the average frequency of note occurrence at each time step across the N units; the average frequencies, arranged in sequence by time step, represent the rhythm-type feature.
The MIDI rhythm type identification method derives rhythm-type features from analysis of a MIDI fragment; it is simple and reliable, and provides a new approach to automatic composition and automatic accompaniment.
Another object of the present invention is to provide a rhythm-type-based music style determination method, comprising the steps of:
determining the rhythm-type features of a MIDI fragment using the MIDI rhythm type identification method described above;
calculating the similarity between those rhythm-type features and the rhythm-type features corresponding to each music style, and selecting the music style with the maximum similarity as the music style of the MIDI fragment.
The rhythm-type-based music style determination method identifies the music style of a MIDI fragment through the similarity between each style's rhythm-type features and those of the fragment; the process is simple and the style determination is accurate.
It is still another object of the present invention to provide a rhythm-type-based playing mode determination method, comprising the steps of:
determining the rhythm-type features of a MIDI fragment using the MIDI rhythm type identification method described above;
according to the rhythm-type features of the mid-high region, counting the total number of time steps at which notes occur in the mid-high region and the total frequency of note occurrence;
determining the playing mode of the mid-high region according to the ratio of the total frequency of note occurrence to the total number of time steps at which notes occur.
The rhythm-type-based playing mode determination method determines the playing mode directly from the note-occurrence frequencies presented by the rhythm-type features of the MIDI fragment, and is simple, convenient and reliable.
Drawings
To illustrate the embodiments of the present invention or the technical solutions in the prior art more clearly, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. It is obvious that the drawings in the following description show only some embodiments of the present invention, and that those skilled in the art can derive other drawings from them without creative effort.
FIG. 1 is a flowchart of a MIDI rhythm type identification method provided by an embodiment;
FIG. 2 is a schematic diagram of a MIDI fragment divided into a bass region and a mid-high region according to the pitch-threshold distribution;
FIG. 3 is a diagram illustrating a statistical visualization of the frequency of note occurrence at each time step in the bass region and the mid-high region according to an embodiment;
FIG. 4 is a diagram illustrating a statistical visualization of the average frequency of note occurrence over the time steps of N units according to an embodiment;
FIG. 5 is a flowchart of a rhythm-type-based music style determination method according to an embodiment;
FIG. 6 is a flowchart of a rhythm-type-based playing mode determination method according to an embodiment.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, the present invention will be further described in detail below with reference to the accompanying drawings and examples. It should be understood that the specific embodiments described herein are for purposes of illustration only and are not intended to limit the scope of the invention.
As shown in fig. 1, an embodiment provides a method for recognizing a rhythm of a MIDI, comprising the steps of:
s101, dividing the MIDI fragment into a bass region and a treble region according to the distribution of the tone threshold.
The MIDI fragment can be a performance fragment of a texture instrument (e.g., a struck-string instrument such as a piano, a plucked-string instrument such as a guitar, an electronic synthesizer, etc.). For register division, the highest and lowest pitches among all notes of the whole MIDI fragment are first found to determine the pitch-threshold range, and the fragment is then divided into a bass region and a mid-high region according to the pitch-threshold distribution. The lower 1/4 to 1/3 of the pitch range, nearest the lowest pitch, can be taken as the bass region, and the rest as the mid-high region. Specifically, the lowest 1/4 of the range can be assigned directly to the bass region, with the remaining threshold range (i.e., the 3/4 nearest the highest pitch) forming the mid-high region, as shown in FIG. 2. Once the two regions are determined, notes in the bass region are bass notes and notes in the mid-high region are mid-high notes. When the mid-high region consists of notes played by a texture instrument, it can be called the mid-high texture region and its notes texture notes.
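As an illustration of this register split, the following minimal Python sketch (not part of the patent text; the function and variable names are hypothetical) divides a list of (pitch, onset) pairs at a threshold placed 1/4 of the way up the fragment's pitch range:

```python
def split_registers(notes, bass_fraction=0.25):
    """Split MIDI notes into a bass region and a mid-high region.

    `notes` is a list of (pitch, onset_beats) pairs; the lowest
    `bass_fraction` of the fragment's pitch range is treated as bass.
    """
    pitches = [p for p, _ in notes]
    lo, hi = min(pitches), max(pitches)
    threshold = lo + (hi - lo) * bass_fraction  # boundary between regions
    bass = [n for n in notes if n[0] <= threshold]
    mid_high = [n for n in notes if n[0] > threshold]
    return bass, mid_high

# Example: for a pitch range of 36..84 the boundary falls at 48
notes = [(36, 0.0), (40, 1.0), (60, 0.5), (84, 2.0)]
bass, mid_high = split_registers(notes)
```

Raising `bass_fraction` toward 1/3 moves the boundary upward, matching the 1/4 to 1/3 range the text allows.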
S102, dividing the MIDI fragment into N units of a certain unit length, dividing the duration of each unit into several time steps, and counting the frequency of note occurrence in each time step for the bass region and the mid-high region respectively.
The rhythm of a MIDI fragment generally repeats cyclically: one or more phrases form a rhythm type, so the minimum rhythm type spans one unit length. Statistics over a large corpus show that a unit length of 2 bars or 4 bars is typical for dividing a MIDI fragment into units. For a 32-bar MIDI fragment divided into units of 2 bars, there are 16 units, and the frequency of note occurrence is counted for the bass region and the mid-high region in each of the 16 units.
For the frequency statistics, each unit is divided into time steps, i.e., the duration of each unit is split into several time steps. The granularity of the time step is determined by the meter and the note values of the MIDI fragment; in general, a time step should be no longer than the shortest note that occurs in the fragment. For example, if a MIDI fragment is in 4/4 time and its shortest note is a quarter note, a time step is at most the duration of a quarter note. If the duration of a sixteenth note is taken as one time step, each beat is divided into 4 time steps.
In this embodiment, whether notes occur in each time step of the bass region and the mid-high region, and how many, is counted. As shown in fig. 2, if exactly one note occurs at a time step in the bass region, the counted frequency is 1; if no note occurs at some time steps in the mid-high region, the counted frequency there is 0; and if 3 notes occur at some time steps, the counted frequency is 3. The MIDI fragment can thus be converted into a numerically expressed rhythm type, using the occurrence of notes at specific time steps as the criterion.
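Step S102 can be sketched as follows, assuming note onsets are already expressed in beats; the grid chosen here (two 1-bar units of 4/4 on a sixteenth-note step) and all names are illustrative, not from the patent:

```python
def count_onsets_per_step(onsets, n_units, steps_per_unit, unit_beats):
    """Count how many note onsets fall in each time step of each unit.

    `onsets` are note start times in beats; returns an
    (n_units x steps_per_unit) table of occurrence frequencies.
    """
    step_beats = unit_beats / steps_per_unit  # e.g. a 16th-note grid
    table = [[0] * steps_per_unit for _ in range(n_units)]
    for t in onsets:
        unit = int(t // unit_beats)
        step = int((t % unit_beats) // step_beats)
        if unit < n_units:  # ignore onsets past the analyzed span
            table[unit][step] += 1
    return table

# 2 units of 1 bar (4 beats each) on a 16-step grid
table = count_onsets_per_step([0.0, 0.0, 1.0, 4.0], 2, 16, 4.0)
```

Two onsets at beat 0 give a frequency of 2 in the first step; the onset at beat 4 falls in the first step of the second unit.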
To present the statistics more intuitively, after counting the frequency of note occurrence at each time step in the bass region and the mid-high region, the statistical result may be represented visually, specifically:
each time step is initialized as an empty particle shown in a light color; all empty particles are arranged in time order; particles whose time step contains notes are shown in a dark color; and each dark particle is filled directly with the frequency of the notes occurring in it.
In this embodiment, the shape of the empty particles may be any shape, such as a circle or a square, without limitation.
For example, for a MIDI fragment in 4/4 time with a sixteenth note taken as one time step, the frequency of note occurrence in the bass region and the mid-high region over a 1-bar length is visualized as shown in fig. 3.
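The particle visualization can be approximated in plain text, rendering an empty (light) particle as "." and an occupied (dark) particle as its frequency; this textual rendering is a hypothetical stand-in for the patent's graphical one:

```python
def render_row(freqs):
    """Render one unit's per-step frequencies as 'particles':
    '.' for an empty time step, the count digit for an occupied one."""
    return " ".join("." if f == 0 else str(f) for f in freqs)

print(render_row([1, 0, 0, 2, 1, 0, 3, 0]))  # -> "1 . . 2 1 . 3 ."
```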
S103, for the bass region and the mid-high region respectively, counting the average frequency of note occurrence at each time step across the N units; the average frequencies, arranged in sequence by time step, represent the rhythm-type feature, N being a natural number greater than 2.
In this embodiment, the units obtained by dividing according to the rhythmic period all have the same duration and the same number of time steps, so the occurrence of notes at corresponding time steps can be accumulated across units. Specifically, the average frequency n_i of note occurrence at the time steps across the N units is counted according to the following formula:

n_i = (1/N) * sum_{j=1}^{N} l_{i,j}

wherein n_i represents the average frequency of note occurrence at the i-th time step, l_{i,j} represents the frequency of note occurrence at the i-th time step of the j-th unit, i = 1, 2, 3, ..., M, and M, the total number of time steps in each unit, is a natural number greater than 2.
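The averaging step can be sketched directly, taking the per-unit, per-step counts produced in S102 (function and variable names are illustrative):

```python
def average_frequencies(table):
    """Compute n_i = (1/N) * sum_j l_{i,j}: the average note frequency
    at each time step across all N units. Read in time-step order,
    the result is the rhythm-type feature vector."""
    n_units = len(table)
    n_steps = len(table[0])
    return [sum(table[j][i] for j in range(n_units)) / n_units
            for i in range(n_steps)]

# Two units of 4 steps each
feature = average_frequencies([[2, 0, 1, 0], [2, 0, 3, 0]])
```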
Similarly, for visualization, after counting the average frequency of note occurrence at the time steps across the N units, the counted averages may be represented visually, specifically:
each time step is initialized as an empty particle in a light color and all empty particles are arranged in time order; the maximum average frequency occurring at any time step is assigned a dark color; values from zero up to the maximum are assigned gradient colors ranging from the light color to the dark color; and each particle is filled with the gradient color corresponding to its average frequency, thereby realizing the visualization.
For example, suppose the empty particle's light color has a gray value of 55 and the dark color for the maximum average frequency has a gray value of 255. If the average frequency ranges from a maximum of 5 down to a minimum of 0, with intermediate values 4, 3, 2 and 1, those intermediate values are assigned gradient gray values of 200, 150, 100 and 50 respectively, as shown in fig. 4.
The MIDI rhythm type identification method derives rhythm-type features from analysis of a MIDI fragment; it is simple and reliable, and provides a new approach to automatic composition and automatic accompaniment.
As shown in fig. 5, the embodiment further provides a rhythm-type-based music style determination method, comprising the following steps:
S501, determining the rhythm-type features of the MIDI fragment using the MIDI rhythm type identification method described above.
The procedure used in S501 to compute the rhythm-type features of the MIDI fragment is the same as S101 to S103 of the MIDI rhythm type identification method above and is not repeated here.
S502, calculating the similarity between the rhythm-type features and the rhythm-type features corresponding to each music style, and selecting the music style with the maximum similarity as the music style of the MIDI fragment.
A music style, i.e. a music genre, refers to the distinctive overall character that a musical work presents; examples include rock, soul, jazz and electronic pop. The rhythm of each music style presents certain characteristic rhythm-type features, so the music style of a piece of MIDI can be determined from its rhythm-type features.
In this embodiment, the rhythm-type feature of each music style is determined as follows:
MIDI fragment samples belonging to the same music style are collected; the rhythm-type features of each sample are determined with the MIDI rhythm type identification method described above; and the average of the rhythm-type features over all samples is taken as the rhythm-type feature of that music style.
In this embodiment, when judging similarity, clustering, relative entropy, or the Wasserstein distance is used to calculate the similarity between the rhythm-type features and those corresponding to each music style.
When clustering is used, the rhythm-type feature of each music style serves as a cluster center; the distance from the MIDI fragment's rhythm-type features to each cluster center determines which cluster the fragment belongs to, and the music style corresponding to that cluster is taken as the determined music style.
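A minimal sketch of this nearest-center assignment, using Euclidean distance as one possible measure (the text equally allows relative entropy or the Wasserstein distance; the style names and center vectors below are invented examples):

```python
import math

def nearest_style(feature, style_centers):
    """Assign `feature` to the style whose rhythm-feature centroid
    is closest; Euclidean distance stands in for the similarity
    measure here."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(style_centers, key=lambda s: dist(feature, style_centers[s]))

# Hypothetical per-style rhythm-feature centroids
centers = {"rock": [2.0, 0.0, 2.0, 0.0], "jazz": [1.0, 1.0, 1.0, 1.0]}
style = nearest_style([2.1, 0.0, 1.9, 0.1], centers)
```

A fragment whose feature vector lies near the "rock" centroid is assigned the rock style.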
The rhythm-type-based music style determination method identifies the music style of a MIDI fragment through the similarity between each style's rhythm-type features and those of the fragment; the process is simple and the style determination is accurate.
As shown in fig. 6, the embodiment further provides a rhythm-type-based playing mode determination method, comprising the steps of:
S601, determining the rhythm-type features of the MIDI fragment using the MIDI rhythm type identification method described above.
The procedure used in S601 to compute the rhythm-type features of the MIDI fragment is the same as S101 to S103 of the MIDI rhythm type identification method above and is not repeated here.
S602, according to the rhythm-type features of the mid-high region, counting the total number of time steps at which notes occur in the mid-high region and the total frequency of note occurrence.
Suppose a MIDI fragment has 16 time steps in total. If, in the mid-high region, the average frequency of note occurrence is 2.3 at the 1st time step, 2.6 at the 5th and 2.9 at the 13th, with no notes at the remaining time steps, then the total number of time steps at which notes occur is 3 and the total frequency of note occurrence is 2.3 + 2.6 + 2.9 = 7.8.
S603, determining the playing mode of the mid-high region according to the ratio of the total frequency of note occurrence to the total number of time steps at which notes occur.
In this embodiment, the playing mode is determined by comparing this ratio against set thresholds. Specifically:
a first threshold and a second threshold for judging the playing mode are set, the first smaller than the second; experiments show that values of about 1.3 and 2.0 respectively perform well;
when the ratio of the total frequency of note occurrence to the total number of time steps at which notes occur is smaller than the first threshold, the playing mode of the mid-high region is determined to be a broken-chord (decomposition) playing mode;
when the ratio is greater than the second threshold, the playing mode of the mid-high region is determined to be a block-chord (columnar) playing mode;
and when the ratio is between the first and second thresholds, the playing mode of the mid-high region is determined to be a mixed playing mode.
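The threshold rule of S603 can be sketched as follows, using the reported working values of roughly 1.3 and 2.0; the mode labels stand for the broken-chord (decomposition), block-chord (columnar) and mixed playing modes, and the function name is illustrative:

```python
def playing_mode(feature, low=1.3, high=2.0):
    """Classify the mid-high region's playing mode from its rhythm
    feature: the ratio of the total note frequency to the number of
    time steps at which notes occur, compared against two thresholds."""
    occupied = [f for f in feature if f > 0]
    if not occupied:
        return "silent"  # no notes at all; not covered by the text
    ratio = sum(occupied) / len(occupied)
    if ratio < low:
        return "broken-chord"   # mostly single notes per step
    if ratio > high:
        return "block-chord"    # several simultaneous notes per step
    return "mixed"

# The text's example: totals 7.8 over 3 occupied steps, ratio 2.6
mode = playing_mode([2.3, 0, 0, 0, 2.6, 0, 0, 0, 0, 0, 0, 0, 2.9, 0, 0, 0])
```

A ratio of 2.6 exceeds the second threshold, so the example fragment is classified as block-chord playing.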
The rhythm-type-based playing mode determination method determines the playing mode directly from the note-occurrence frequencies presented by the rhythm-type features of the MIDI fragment, and is simple, convenient and reliable.
The above-mentioned embodiments are intended to illustrate the technical solutions and advantages of the present invention, and it should be understood that the above-mentioned embodiments are only the most preferred embodiments of the present invention, and are not intended to limit the present invention, and any modifications, additions, equivalents, etc. made within the scope of the principles of the present invention should be included in the scope of the present invention.
Claims (10)
1. A MIDI rhythm type identification method, characterized by comprising the following steps:
dividing a MIDI fragment into a bass region and a mid-high region according to the pitch-threshold distribution;
dividing the MIDI fragment into N units of a certain unit length, dividing the duration of each unit into several time steps, and counting the frequency of note occurrence in each time step for the bass region and the mid-high region respectively;
for the bass region and the mid-high region respectively, counting the average frequency of note occurrence at each time step across the N units; the average frequencies, arranged in sequence by time step, represent the rhythm-type feature, N being a natural number greater than 2.
2. The MIDI rhythm type identification method of claim 1, wherein the 1/4 to 1/3 of the pitch-threshold range nearest the lowest pitch is taken as the bass region and the rest as the mid-high region;
and the MIDI fragment is divided into units using 2 bars or 4 bars as the unit length.
3. The MIDI rhythm type identification method of claim 1, wherein after the frequency of note occurrence at each time step in the bass region and the mid-high region is counted, the statistical result may be represented visually, specifically comprising:
initializing each time step as an empty particle shown in a light color, arranging all empty particles in time order, showing particles whose time step contains notes in a dark color, and filling each dark particle directly with the frequency of the notes occurring in it.
4. The MIDI rhythm type identification method of claim 1, wherein the average frequency n_i of note occurrence at the time steps across the N units is counted according to the following formula:

n_i = (1/N) * sum_{j=1}^{N} l_{i,j}

wherein n_i represents the average frequency of note occurrence at the i-th time step, l_{i,j} represents the frequency of note occurrence at the i-th time step of the j-th unit, i = 1, 2, 3, ..., M, and M, the total number of time steps in each unit, is a natural number greater than 2.
5. The MIDI rhythm type identification method of claim 1, wherein after the average frequency of note occurrence at the time steps across the N units is counted, performing a visual representation of the counted averages specifically comprises:
initializing each time step as an empty particle in a light color and arranging all empty particles in time order; assigning a dark color to the maximum average frequency occurring at any time step; assigning gradient colors, ranging from the light color to the dark color, to the values from zero up to the maximum; and filling each particle with the gradient color corresponding to its average frequency, thereby realizing the visualization.
6. A music style determination method based on rhythm type is characterized by comprising the following steps:
determining the rhythm-type features of a MIDI fragment using the MIDI rhythm type identification method of any one of claims 1-5;
and calculating the similarity between those rhythm-type features and the rhythm-type features corresponding to each music style, and selecting the music style with the maximum similarity as the music style of the MIDI fragment.
7. The rhythm-type-based music style determination method of claim 6, wherein the rhythm-type feature of each music style is determined by:
collecting MIDI fragment samples belonging to the same music style, determining the rhythm-type features of each sample using the MIDI rhythm type identification method of any one of claims 1-5, and taking the average of the rhythm-type features of all samples as the rhythm-type feature of that music style.
8. The rhythm-type-based music style determination method of claim 6, wherein clustering, relative entropy, or the Wasserstein distance is used to calculate the similarity between the rhythm-type features and the rhythm-type features corresponding to each music style.
9. A rhythm-type-based playing mode determination method, characterized by comprising the following steps:
determining the rhythm-type features of a MIDI fragment using the MIDI rhythm type identification method of any one of claims 1-5;
according to the rhythm-type features of the mid-high region, counting the total number of time steps at which notes occur in the mid-high region and the total frequency of note occurrence;
and determining the playing mode of the mid-high region according to the ratio of the total frequency of note occurrence to the total number of time steps at which notes occur.
10. The rhythm-type-based playing mode determination method of claim 9, characterized by further comprising:
setting a first threshold and a second threshold for judging the playing mode, the first threshold being smaller than the second threshold;
when the ratio of the total frequency of note occurrence to the total number of time steps at which notes occur is smaller than the first threshold, determining the playing mode of the mid-high region to be a broken-chord (decomposition) playing mode;
when the ratio is greater than the second threshold, determining the playing mode of the mid-high region to be a block-chord (columnar) playing mode;
and when the ratio is between the first and second thresholds, determining the playing mode of the mid-high region to be a mixed playing mode.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202010399290.7A CN111613198B (en) | 2020-05-12 | 2020-05-12 | Rhythm type identification method and application of MIDI |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202010399290.7A CN111613198B (en) | 2020-05-12 | 2020-05-12 | Rhythm type identification method and application of MIDI |
Publications (2)
Publication Number | Publication Date |
---|---|
CN111613198A true CN111613198A (en) | 2020-09-01 |
CN111613198B CN111613198B (en) | 2023-09-08 |
Family
ID=72205152
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202010399290.7A Active CN111613198B (en) | 2020-05-12 | 2020-05-12 | Rhythm type identification method and application of MIDI |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN111613198B (en) |
Patent Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20020007722A1 (en) * | 1998-09-24 | 2002-01-24 | Eiichiro Aoki | Automatic composition apparatus and method using rhythm pattern characteristics database and setting composition conditions section by section |
US20060219089A1 (en) * | 2005-03-24 | 2006-10-05 | Yamaha Corporation | Apparatus for analyzing music data and displaying music score |
KR20130039944A (en) * | 2011-10-13 | 2013-04-23 | 김영옥 | A novel score and apparatus for displaying the same |
CN105374347A (en) * | 2015-09-22 | 2016-03-02 | 中国传媒大学 | A mixed algorithm-based computer-aided composition method for popular tunes in regions south of the Yangtze River |
Non-Patent Citations (1)
Title |
---|
Wang Lingzi: "A comparison of three different performance versions of Schumann's Symphonic Etudes", Journal of Bengbu University (《蚌埠学院学报》) * |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN112331170A (en) * | 2020-10-28 | 2021-02-05 | Ping An Technology (Shenzhen) Co., Ltd. | Method, apparatus, device and storage medium for analyzing Buddhist music melody similarity |
WO2021203713A1 (en) * | 2020-10-28 | 2021-10-14 | Ping An Technology (Shenzhen) Co., Ltd. | Buddhist music melody similarity analysis method, apparatus and device, and storage medium |
CN112331170B (en) * | 2020-10-28 | 2023-09-15 | Ping An Technology (Shenzhen) Co., Ltd. | Method, apparatus, device and storage medium for analyzing Buddhist music melody similarity |
Also Published As
Publication number | Publication date |
---|---|
CN111613198B (en) | 2023-09-08 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
Bittner et al. | Medleydb: A multitrack dataset for annotation-intensive mir research. | |
US5792971A (en) | Method and system for editing digital audio information with music-like parameters | |
JP4199097B2 (en) | Automatic music classification apparatus and method | |
US6703549B1 (en) | Performance data generating apparatus and method and storage medium | |
US7737354B2 (en) | Creating music via concatenative synthesis | |
CN102760426B (en) | Performance data search using performance data representing a musical sound generation pattern | |
JP2010538335A (en) | Automatic accompaniment for voice melody | |
WO2002047064A1 (en) | Method for analyzing music using sounds of instruments | |
CN1750116A (en) | Automatic rendition style determining apparatus and method | |
WO1997035299A1 (en) | Music composition | |
CN102760428B (en) | Performance data search using a query representing a musical sound generation pattern | |
US5900567A (en) | System and method for enhancing musical performances in computer based musical devices | |
KR100784075B1 (en) | System, method and computer readable medium for online composition | |
CA2436679C (en) | Performance data processing and tone signal synthesizing methods and apparatus | |
CN111613198B (en) | Rhythm type identification method and application of MIDI | |
Das et al. | Analyzing and classifying guitarists from rock guitar solo tablature | |
Rona | The MIDI companion | |
Vatolkin | Evolutionary approximation of instrumental texture in polyphonic audio recordings | |
Yasuraoka et al. | Changing timbre and phrase in existing musical performances as you like: manipulations of single part using harmonic and inharmonic models | |
JP3812510B2 (en) | Performance data processing method and tone signal synthesis method | |
Sarmento et al. | Shredgp: Guitarist style-conditioned tablature generation | |
Romo | MIDI: A Standard for Music in the Ever Changing Digital Age | |
Tutzer | Drum rhythm retrieval based on rhythm and sound similarity | |
Shlien | A Statistical Analysis of a Midi File Database | |
JP3775249B2 (en) | Automatic composer and automatic composition program |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||