WO2007066819A1 - Music editing device and music editing method (音楽編集装置及び音楽編集方法) - Google Patents
Music editing device and music editing method — Download PDF Info
- Publication number
- WO2007066819A1 WO2007066819A1 PCT/JP2006/324890 JP2006324890W WO2007066819A1 WO 2007066819 A1 WO2007066819 A1 WO 2007066819A1 JP 2006324890 W JP2006324890 W JP 2006324890W WO 2007066819 A1 WO2007066819 A1 WO 2007066819A1
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- music
- remix
- song
- metadata
- songs
- Prior art date
Links
- 238000000034 method Methods 0.000 title claims description 47
- 230000001360 synchronised effect Effects 0.000 claims abstract description 43
- 238000012545 processing Methods 0.000 claims description 53
- 230000008569 process Effects 0.000 claims description 15
- 230000005236 sound signal Effects 0.000 claims description 12
- 238000004891 communication Methods 0.000 claims description 8
- 230000004044 response Effects 0.000 claims description 5
- 239000000203 mixture Substances 0.000 abstract description 17
- 230000015654 memory Effects 0.000 abstract description 7
- 230000000694 effects Effects 0.000 description 26
- 238000010586 diagram Methods 0.000 description 25
- 241001342895 Chorus Species 0.000 description 12
- 230000008859 change Effects 0.000 description 7
- 230000006870 function Effects 0.000 description 7
- 239000000463 material Substances 0.000 description 7
- 239000011295 pitch Substances 0.000 description 6
- 238000005516 engineering process Methods 0.000 description 5
- 238000013500 data storage Methods 0.000 description 3
- 230000007704 transition Effects 0.000 description 3
- 230000001133 acceleration Effects 0.000 description 2
- 239000013028 medium composition Substances 0.000 description 2
- 238000005562 fading Methods 0.000 description 1
- 230000008676 import Effects 0.000 description 1
- 238000003780 insertion Methods 0.000 description 1
- 230000037431 insertion Effects 0.000 description 1
- 230000009191 jumping Effects 0.000 description 1
- 239000004973 liquid crystal related substance Substances 0.000 description 1
- 238000003825 pressing Methods 0.000 description 1
- 230000008929 regeneration Effects 0.000 description 1
- 238000011069 regeneration method Methods 0.000 description 1
- 230000033764 rhythmic process Effects 0.000 description 1
- 230000002194 synthesizing effect Effects 0.000 description 1
- 238000004260 weight control Methods 0.000 description 1
Classifications
-
- G—PHYSICS
- G11—INFORMATION STORAGE
- G11B—INFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
- G11B27/00—Editing; Indexing; Addressing; Timing or synchronising; Monitoring; Measuring tape travel
- G11B27/02—Editing, e.g. varying the order of information signals recorded on, or reproduced from, record carriers
- G11B27/031—Electronic editing of digitised analogue information signals, e.g. audio or video signals
-
- G—PHYSICS
- G10—MUSICAL INSTRUMENTS; ACOUSTICS
- G10H—ELECTROPHONIC MUSICAL INSTRUMENTS; INSTRUMENTS IN WHICH THE TONES ARE GENERATED BY ELECTROMECHANICAL MEANS OR ELECTRONIC GENERATORS, OR IN WHICH THE TONES ARE SYNTHESISED FROM A DATA STORE
- G10H1/00—Details of electrophonic musical instruments
- G10H1/0008—Associated control or indicating means
- G10H1/0025—Automatic or semi-automatic music composition, e.g. producing random music, applying rules from music theory or modifying a musical piece
-
- G—PHYSICS
- G10—MUSICAL INSTRUMENTS; ACOUSTICS
- G10H—ELECTROPHONIC MUSICAL INSTRUMENTS; INSTRUMENTS IN WHICH THE TONES ARE GENERATED BY ELECTROMECHANICAL MEANS OR ELECTRONIC GENERATORS, OR IN WHICH THE TONES ARE SYNTHESISED FROM A DATA STORE
- G10H1/00—Details of electrophonic musical instruments
- G10H1/36—Accompaniment arrangements
- G10H1/40—Rhythm
-
- G—PHYSICS
- G10—MUSICAL INSTRUMENTS; ACOUSTICS
- G10L—SPEECH ANALYSIS TECHNIQUES OR SPEECH SYNTHESIS; SPEECH RECOGNITION; SPEECH OR VOICE PROCESSING TECHNIQUES; SPEECH OR AUDIO CODING OR DECODING
- G10L19/00—Speech or audio signals analysis-synthesis techniques for redundancy reduction, e.g. in vocoders; Coding or decoding of speech or audio signals, using source filter models or psychoacoustic analysis
-
- G—PHYSICS
- G11—INFORMATION STORAGE
- G11B—INFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
- G11B20/00—Signal processing not specific to the method of recording or reproducing; Circuits therefor
- G11B20/10—Digital recording or reproducing
-
- G—PHYSICS
- G10—MUSICAL INSTRUMENTS; ACOUSTICS
- G10H—ELECTROPHONIC MUSICAL INSTRUMENTS; INSTRUMENTS IN WHICH THE TONES ARE GENERATED BY ELECTROMECHANICAL MEANS OR ELECTRONIC GENERATORS, OR IN WHICH THE TONES ARE SYNTHESISED FROM A DATA STORE
- G10H2210/00—Aspects or methods of musical processing having intrinsic musical character, i.e. involving musical theory or musical parameters or relying on musical knowledge, as applied in electrophonic musical tools or instruments
- G10H2210/031—Musical analysis, i.e. isolation, extraction or identification of musical elements or musical parameters from a raw acoustic signal or from an encoded audio signal
- G10H2210/076—Musical analysis, i.e. isolation, extraction or identification of musical elements or musical parameters from a raw acoustic signal or from an encoded audio signal for extraction of timing, tempo; Beat detection
-
- G—PHYSICS
- G10—MUSICAL INSTRUMENTS; ACOUSTICS
- G10H—ELECTROPHONIC MUSICAL INSTRUMENTS; INSTRUMENTS IN WHICH THE TONES ARE GENERATED BY ELECTROMECHANICAL MEANS OR ELECTRONIC GENERATORS, OR IN WHICH THE TONES ARE SYNTHESISED FROM A DATA STORE
- G10H2210/00—Aspects or methods of musical processing having intrinsic musical character, i.e. involving musical theory or musical parameters or relying on musical knowledge, as applied in electrophonic musical tools or instruments
- G10H2210/101—Music Composition or musical creation; Tools or processes therefor
- G10H2210/105—Composing aid, e.g. for supporting creation, edition or modification of a piece of music
-
- G—PHYSICS
- G10—MUSICAL INSTRUMENTS; ACOUSTICS
- G10H—ELECTROPHONIC MUSICAL INSTRUMENTS; INSTRUMENTS IN WHICH THE TONES ARE GENERATED BY ELECTROMECHANICAL MEANS OR ELECTRONIC GENERATORS, OR IN WHICH THE TONES ARE SYNTHESISED FROM A DATA STORE
- G10H2210/00—Aspects or methods of musical processing having intrinsic musical character, i.e. involving musical theory or musical parameters or relying on musical knowledge, as applied in electrophonic musical tools or instruments
- G10H2210/101—Music Composition or musical creation; Tools or processes therefor
- G10H2210/125—Medley, i.e. linking parts of different musical pieces in one single piece, e.g. sound collage, DJ mix
-
- G—PHYSICS
- G10—MUSICAL INSTRUMENTS; ACOUSTICS
- G10H—ELECTROPHONIC MUSICAL INSTRUMENTS; INSTRUMENTS IN WHICH THE TONES ARE GENERATED BY ELECTROMECHANICAL MEANS OR ELECTRONIC GENERATORS, OR IN WHICH THE TONES ARE SYNTHESISED FROM A DATA STORE
- G10H2210/00—Aspects or methods of musical processing having intrinsic musical character, i.e. involving musical theory or musical parameters or relying on musical knowledge, as applied in electrophonic musical tools or instruments
- G10H2210/101—Music Composition or musical creation; Tools or processes therefor
- G10H2210/131—Morphing, i.e. transformation of a musical piece into a new different one, e.g. remix
- G10H2210/136—Morphing interpolation, i.e. interpolating in pitch, harmony or time, tempo or rhythm, between two different musical pieces, e.g. to produce a new musical work
-
- G—PHYSICS
- G10—MUSICAL INSTRUMENTS; ACOUSTICS
- G10H—ELECTROPHONIC MUSICAL INSTRUMENTS; INSTRUMENTS IN WHICH THE TONES ARE GENERATED BY ELECTROMECHANICAL MEANS OR ELECTRONIC GENERATORS, OR IN WHICH THE TONES ARE SYNTHESISED FROM A DATA STORE
- G10H2210/00—Aspects or methods of musical processing having intrinsic musical character, i.e. involving musical theory or musical parameters or relying on musical knowledge, as applied in electrophonic musical tools or instruments
- G10H2210/375—Tempo or beat alterations; Music timing control
- G10H2210/391—Automatic tempo adjustment, correction or control
-
- G—PHYSICS
- G10—MUSICAL INSTRUMENTS; ACOUSTICS
- G10H—ELECTROPHONIC MUSICAL INSTRUMENTS; INSTRUMENTS IN WHICH THE TONES ARE GENERATED BY ELECTROMECHANICAL MEANS OR ELECTRONIC GENERATORS, OR IN WHICH THE TONES ARE SYNTHESISED FROM A DATA STORE
- G10H2240/00—Data organisation or data communication aspects, specifically adapted for electrophonic musical tools or instruments
- G10H2240/011—Files or data streams containing coded musical information, e.g. for transmission
- G10H2240/016—File editing, i.e. modifying musical data files or streams as such
-
- G—PHYSICS
- G10—MUSICAL INSTRUMENTS; ACOUSTICS
- G10H—ELECTROPHONIC MUSICAL INSTRUMENTS; INSTRUMENTS IN WHICH THE TONES ARE GENERATED BY ELECTROMECHANICAL MEANS OR ELECTRONIC GENERATORS, OR IN WHICH THE TONES ARE SYNTHESISED FROM A DATA STORE
- G10H2240/00—Data organisation or data communication aspects, specifically adapted for electrophonic musical tools or instruments
- G10H2240/011—Files or data streams containing coded musical information, e.g. for transmission
- G10H2240/046—File format, i.e. specific or non-standard musical file format used in or adapted for electrophonic musical instruments, e.g. in wavetables
- G10H2240/061—MP3, i.e. MPEG-1 or MPEG-2 Audio Layer III, lossy audio compression
-
- G—PHYSICS
- G10—MUSICAL INSTRUMENTS; ACOUSTICS
- G10H—ELECTROPHONIC MUSICAL INSTRUMENTS; INSTRUMENTS IN WHICH THE TONES ARE GENERATED BY ELECTROMECHANICAL MEANS OR ELECTRONIC GENERATORS, OR IN WHICH THE TONES ARE SYNTHESISED FROM A DATA STORE
- G10H2240/00—Data organisation or data communication aspects, specifically adapted for electrophonic musical tools or instruments
- G10H2240/011—Files or data streams containing coded musical information, e.g. for transmission
- G10H2240/046—File format, i.e. specific or non-standard musical file format used in or adapted for electrophonic musical instruments, e.g. in wavetables
- G10H2240/066—MPEG audio-visual compression file formats, e.g. MPEG-4 for coding of audio-visual objects
-
- G—PHYSICS
- G10—MUSICAL INSTRUMENTS; ACOUSTICS
- G10H—ELECTROPHONIC MUSICAL INSTRUMENTS; INSTRUMENTS IN WHICH THE TONES ARE GENERATED BY ELECTROMECHANICAL MEANS OR ELECTRONIC GENERATORS, OR IN WHICH THE TONES ARE SYNTHESISED FROM A DATA STORE
- G10H2240/00—Data organisation or data communication aspects, specifically adapted for electrophonic musical tools or instruments
- G10H2240/091—Info, i.e. juxtaposition of unrelated auxiliary information or commercial messages with or between music files
-
- G—PHYSICS
- G10—MUSICAL INSTRUMENTS; ACOUSTICS
- G10H—ELECTROPHONIC MUSICAL INSTRUMENTS; INSTRUMENTS IN WHICH THE TONES ARE GENERATED BY ELECTROMECHANICAL MEANS OR ELECTRONIC GENERATORS, OR IN WHICH THE TONES ARE SYNTHESISED FROM A DATA STORE
- G10H2240/00—Data organisation or data communication aspects, specifically adapted for electrophonic musical tools or instruments
- G10H2240/121—Musical libraries, i.e. musical databases indexed by musical parameters, wavetables, indexing schemes using musical parameters, musical rule bases or knowledge bases, e.g. for automatic composing methods
- G10H2240/131—Library retrieval, i.e. searching a database or selecting a specific musical piece, segment, pattern, rule or parameter set
-
- G—PHYSICS
- G10—MUSICAL INSTRUMENTS; ACOUSTICS
- G10H—ELECTROPHONIC MUSICAL INSTRUMENTS; INSTRUMENTS IN WHICH THE TONES ARE GENERATED BY ELECTROMECHANICAL MEANS OR ELECTRONIC GENERATORS, OR IN WHICH THE TONES ARE SYNTHESISED FROM A DATA STORE
- G10H2240/00—Data organisation or data communication aspects, specifically adapted for electrophonic musical tools or instruments
- G10H2240/171—Transmission of musical instrument data, control or status information; Transmission, remote access or control of music data for electrophonic musical instruments
- G10H2240/281—Protocol or standard connector for transmission of analog or digital data to or from an electrophonic musical instrument
- G10H2240/295—Packet switched network, e.g. token ring
- G10H2240/305—Internet or TCP/IP protocol use for any electrophonic musical instrument data or musical parameter transmission purposes
Definitions
- The present invention relates to a music editing device and a music editing method that generate new music content by using, as material, music content obtained by subdividing existing songs into units such as bars and beats.
- With the spread of memory-type audio players, it has become routine for users to carry large numbers of songs. With audio players that let the listener play as many songs as desired without swapping media, a listening style has become widespread in which the user does not merely listen to an original CD album in order, but reconstructs songs across albums into playlists and listens only to those. It could be said that users have gained the freedom to listen to the songs they want, in the order they want, without being bound by the album.
- FIG. 23 shows conventional playback using song A and song B. Each song is played at its own original tempo, and, naturally, there is a gap of silence between the songs.
- A known publication discloses a technique in which multiple sound materials are combined and music is interactively edited, played back, and enjoyed in response to the user's key operations.
- In it, sound pieces such as sound patterns and phrases are assigned to keys as "chips".
- When a key is pressed, the chip corresponding to it is played.
- When multiple keys are pressed at the same time, the corresponding chips are synthesized; by pressing keys in order, the chips are combined with one another to compose a piece of music.
- The purpose of the present invention is to provide a music editing device and a music editing method that make it possible to edit, as material, the large number of songs recorded on a storage medium such as a hard disk or flash memory. A further aim is to provide a music editing device and a music editing method with which whole songs, or parts of them, can be connected in real time, beat-aligned, and played back on the spot.
- To solve the problems described above, the music editing device according to the present invention comprises: a remix processing section that performs remix processing based on metadata, generated in advance in correspondence with the music data and containing at least time-axis beat positions, and on a file that determines a remix pattern; a synchronous playback control unit that generates a master beat, sends this master beat to the remix processing section, and plays back multiple songs in accordance with the remix processing that the remix processing section instructs on the basis of the remix pattern and the metadata, each song synchronized to the master beat; and a mixing section that mixes the songs played back by the synchronous playback control unit.
- Likewise, the music editing method according to the present invention comprises: a remix processing step of performing remix processing based on metadata, generated in advance in correspondence with the music data and containing at least time-axis beat positions, and on a file that determines a remix pattern; a synchronous playback control step of generating a master beat, sending the master beat to the remix processing step, and playing back multiple songs in accordance with the instructed remix processing, each song synchronized to the master beat; and a mixing step of mixing the songs played back in the synchronous playback control step.
- FIG. 2 is a block diagram showing the detailed structure of the synchronous playback unit and the audio mixing unit;
- FIG. 3 is a functional block diagram of the music editing device;
- FIG. 4 is a flowchart showing the procedure of a music editing program executed by the music editing device;
- FIG. 5 is a diagram showing metadata on the time axis;
- FIG. 6 is a diagram showing a concrete example of time-axis metadata;
- FIG. 7 is a diagram showing another concrete example of time-axis metadata;
- FIGS. 8A, 8B, and 8C are diagrams showing methods of holding metadata;
- FIG. 9 is a diagram showing an example of a remix pattern file;
- FIG. 10 is a diagram for explaining music playback according to the present invention;
- FIG. 11 is a diagram for explaining connection by cross-fade;
- FIG. 12 is a diagram for explaining connection by cut-in;
- FIG. 13 is a diagram for explaining connection using a sound effect (SE);
- FIG. 14 is a diagram for explaining connection by cross-fade using a sound effect;
- FIG. 15 is a diagram for explaining simultaneous playback;
- FIG. 16 is a diagram for explaining the application of an effect;
- FIG. 17 is a diagram for explaining partial playback;
- FIG. 18 is a diagram showing the configuration of a music editing device provided with a network communication unit;
- FIG. 19 is a functional block diagram of a music editing device provided with a network communication unit;
- FIG. 20 is a diagram showing the configuration of a music editing device provided with a sensor;
- FIG. 21 is a functional block diagram of a music editing device provided with a sensor;
- FIGS. 22A and 22B are flowcharts showing the procedure of a music editing device provided with a sensor;
- FIG. 23 is a diagram for explaining conventional music playback.
Best Mode for Carrying Out the Invention
- FIG. 1 is a block diagram of a music editing device 1 to which the device and method according to the present invention are applied.
- The music editing device 1 has a central processing unit (CPU) 2 connected via a bus 3 to a storage unit 4 and a synchronous playback unit 8. Also connected to the CPU 2 via the bus 3 are a ROM 13, a RAM 14, a user operation I/F unit 15, and a user interface display unit 16.
- The CPU 2 decides, in real time, how songs are to be connected, and gives the synchronous playback unit 8 the necessary song material at the required timing. In response to user operations, it also indicates the tempo and beat synchronization to the synchronous playback unit 8.
- The storage unit 4 is composed of a music storage device 5, a metadata storage device 6, and a remix pattern storage device 7.
- The music storage device 5 is a storage device that holds multiple pieces of music data. It may be a storage device such as the hard disk of a stationary player or the flash memory that a portable player has.
- The metadata storage device 6 may be any storage device, such as a hard disk or flash memory, and stores the time-axis metadata that is added to each song. The music data stored in the music storage device 5 may be either uncompressed or compressed.
- The metadata is supplementary data added to a song along the time axis; it describes not only the tempo but also beat information, bar-head (below, simply "head") information, and melody information such as the prelude (intro) and the theme (chorus, "sabi").
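As a rough illustration of such time-axis metadata (the class and field names below are my own assumption, not the patent's file format), each entry can tie a musical position — bar, beat, and an optional melody label — to a sample offset in the audio:

```python
from dataclasses import dataclass

# Hypothetical representation of time-axis metadata: each entry ties a
# musical position (bar, beat, optional melody label) to a sample offset.
@dataclass
class BeatEntry:
    bar: int          # bar (measure) number, 1-based
    beat: int         # beat within the bar, 1-based
    sample_pos: int   # position in the audio, in samples
    label: str = ""   # optional melody label: "intro", "A", "B", "chorus", ...

@dataclass
class SongMetadata:
    bpm: float               # original tempo of the song
    sample_rate: int         # e.g. 44100
    beats: list[BeatEntry]

def bar_heads(meta: SongMetadata) -> list[BeatEntry]:
    """Return only the entries that mark the head (beat 1) of a bar."""
    return [b for b in meta.beats if b.beat == 1]

# Tiny example: two bars of 4/4 at 120 BPM, 44.1 kHz
# (one beat = 0.5 s = 22050 samples).
meta = SongMetadata(
    bpm=120.0,
    sample_rate=44100,
    beats=[BeatEntry(bar=1 + i // 4, beat=1 + i % 4,
                     sample_pos=i * 22050,
                     label="intro" if i == 0 else "")
           for i in range(8)],
)
heads = bar_heads(meta)
print([(b.bar, b.sample_pos) for b in heads])  # bar heads at samples 0 and 88200
```

The point of the structure is exactly what the text describes: given only the metadata, the device can locate bar heads and melody sections without decoding or analyzing the audio itself.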
- The remix pattern storage device 7 need not be limited to a particular kind of storage device. It holds a remix pattern file that indicates how the remix is to be performed (the details of the remix pattern are described later). In the file is written not only the order of the songs, but also how they are to be combined — for example, which part of song A is used, and where it is to be combined with which part of song B.
- The synchronous playback unit 8 is a block for automatically playing back music material; it plays the music material indicated by the remix control function of the CPU 2 in synchronization with a reference beat. It is made up of a synchronous playback control unit 9, an audio mixing unit 10, a D/A converter 11, and an audio output unit 12.
- The synchronous playback control unit 9 has multiple audio signal generation sections, and plays multiple audio signals in synchronization with a clock signal generated within itself.
- It also constantly monitors the position of the song currently being played, on the basis of the metadata from the metadata storage device 6, and returns to the remix processing function of the CPU 2 the current playback position — for example, which sample of the song is being played — and which beat of which bar is being played.
- The audio mixing unit 10 combines and outputs the audio signals generated by the multiple audio signal generation sections of the synchronous playback control unit 9. The D/A 11 converts the digital signal played by the audio mixing unit 10 into an analog signal. The audio output unit 12 amplifies the analog signal from the D/A 11 and outputs it to a speaker or headphones.
- The ROM 13 stores a music editing program composed of processing procedures according to the method of the present invention. It also stores various kinds of default data.
- The RAM 14 serves as the work area when the CPU 2 executes the program. It also stores various kinds of updated data used while the program runs.
- The user operation I/F unit 15 accepts operations by the user; examples are a keyboard, a mouse, and a touchpad.
- The U/I display unit 16 is a display — for example, a liquid crystal display — that shows the current operating situation and the music editing situation, or a touch panel that also enables operation by the user.
- FIG. 2 is a block diagram showing the detailed structure of the synchronous playback control unit 9 and the audio mixing unit 10.
- The synchronous playback control unit 9 consists of a master beat generation unit 90 and three systems of audio signal generation sections.
- The master beat generation unit 90 generates a clock equivalent to a beat. In detail, it outputs the tempo for the remix and a beat signal synchronized with that tempo.
- The master beat generation unit 90 also generates and outputs a bar-head signal and other ordinary beat signals in accordance with a designated time signature (4/4 time, 3/4 time, and so on).
- The three tracks are used, as a typical example, for song A, song B, and a sound effect (SE). Of course, four tracks, five tracks, or more may be used according to the number of songs.
- Using the sync signal — the clock, or beat — generated by the master beat generation unit 90, each track performs synchronized playback matched to the bar and beat positions of the master beat.
- Each track is given a decoder 91a, 91b, 91c and a time stretch unit 92a, 92b, 92c.
- The decoders decode compressed audio such as MP3. Since an SE is short in length and small in data size, it does not necessarily need to be compressed, and the decoder on the SE track may therefore be omitted.
- The time stretch units 92a, 92b, and 92c convert the playback rate while keeping the pitch constant. Using the metadata from the metadata storage device 6, material with a different tempo is matched to the tempo of the reference beat: the playback rate is changed in real time according to the ratio between the song's original tempo and the master beat tempo. In this way the song's original tempo can be made to match the master beat tempo. As noted, the pitch is not changed in the process.
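As a minimal sketch of that tempo-matching rule (the function name is mine, not the patent's), the time-stretch ratio is simply the ratio of the master tempo to the song's original tempo, applied without changing pitch:

```python
def playback_rate(original_bpm: float, master_bpm: float) -> float:
    """Time-stretch rate that makes a song's tempo match the master beat.
    rate > 1.0 means playing faster (the song was slower than the master)."""
    if original_bpm <= 0:
        raise ValueError("original tempo must be positive")
    return master_bpm / original_bpm

# Song A at 100 BPM and song B at 140 BPM, remixed on a 120 BPM master beat:
print(playback_rate(100.0, 120.0))  # 1.2 -> song A is sped up by 20%
print(playback_rate(140.0, 120.0))  # ~0.857 -> song B is slowed down
```

The song's original tempo comes from the metadata, so this ratio can be computed and updated in real time without analyzing the audio during playback.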
- A pitch shifter function may also be built into these audio signal generation sections. A pitch shifter changes the pitch while keeping the playback rate constant. It is used to blend, musically, materials with different keys or pitches; it is an additional function rather than a necessary one.
- The audio mixing unit 10 has, corresponding to the three tracks, three effect processing sections and three volume adjustment sections. Their outputs are mixed by an audio mix section, amplified by an audio output section, and output to the external speaker or headphones. The unit is thus configured so that effect processing and volume adjustment can be applied to the output signal from each track.
- The synchronous playback control unit 9 and the audio mixing unit 10 are connected to the CPU 2 via the CPU bus or a local bus within the synchronous playback unit.
- FIG. 3 is a functional block diagram of the music editing device 1.
- In the figure, the function of the CPU 2 appears as the remix processing section 20.
- The remix processing section 20 is further divided into a metadata processing unit 21 and a remix pattern reading unit 22.
- The remix processing section 20 processes, with the metadata processing unit 21, the metadata stored in the metadata storage device 6. As described above, time-axis metadata is added to every song; it holds not only the tempo but also beat information, bar information, and melody information.
- The metadata processing unit 21 reads the time-axis metadata corresponding to each song, and investigates the song's position information in accordance with the instructions of the remix pattern information read by the remix pattern reading unit 22. For example, knowing where the beats are now, and where the beats of the songs to be combined are, it decides in what manner — and at which points — the multiple songs and the sound effects are to be played.
- The remix processing section 20 also reads, with the remix pattern reading unit 22, the remix pattern file 7a stored in the remix pattern storage device 7.
- The remix pattern file 7a is a file that specifies, for instance, whether to fade or to cut in, and what SE to apply at which point.
- The remix pattern may be a data sequence deliberately directed by the user or by a third party (instructions as to how this part of the song is to be connected), or it may be created by an automatic algorithm (a remix left, as it were, to the machine's decision).
- The synchronous playback control unit 9 generates a master beat with the master beat generation unit 90, sends this master beat to the remix processing section 20, and plays back multiple songs, using the metadata from the metadata storage device 6, in accordance with the remix processing instructed by the remix processing section 20 on the basis of the remix pattern and the metadata.
- FIG. 4 shows the procedure of the music editing program that the music editing device 1 executes by means of the CPU 2. This program is one example of the music editing method according to the present invention.
- First, the remix processing section 20 of the CPU 2 reads and acquires the remix pattern file 7a from the remix pattern storage device 7 with the remix pattern reading unit 22 (step S1).
- Next, the synchronous playback control unit 9 is made to acquire a song — for example, the first song (step S2). If there is a next song (step S3), the tempo of the master beat generation unit 90 of the synchronous playback control unit 9 is determined (step S4). This may be fixed at, say, 140, or may be specified by the user.
- Next, the connection pattern (which is also written in the pattern file) is acquired (step S5).
- The metadata of the song — here, song A's metadata — is acquired (step S6).
- It is determined from the remix pattern whether effect processing is necessary (step S7); if necessary (YES), the effect processing section is enabled so that the appropriate effect is applied (step S8).
- It is determined from the remix pattern whether volume fade control is necessary (step S9). For example, one selects whether to raise or lower the volumes when song A and song B are edited and stacked on top of one another. If necessary, the fade parameters are set (step S10); this is written on the assumption that the volumes can be raised or lowered dynamically, and the parameters are set accordingly.
- The original tempo of the song is set in the master beat generation unit 90 (step S11). The original tempo of the song is attached to the metadata.
- Then a free audio signal generation section within the synchronous playback control unit 9 is acquired (step S12). In the example above there were three channels, so a free one is obtained and the song to be played back is set in it (step S13).
- After the current playback position of the song is obtained (step S14), it is determined whether the point at which the next song must be prepared has been reached (step S15). If the connection is a cross-fade, it ends some bars before that section; if it is a cut-in, playback starts immediately, so preparation slightly in advance is still in time. Simultaneous playback is, of course, playback at exactly the same time. If the point at which the next song is prepared has not been reached, the process returns to step S14 and waits until it is reached. If it has been reached, the process returns to step S2.
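The loop in steps S14–S15 amounts to polling the current playback position and arming the next song a fixed margin before the switch point. A deliberately simplified, non-real-time sketch of that control flow (all names are my own, and position is modeled as a beat count rather than an audio clock):

```python
# Simplified model of the FIG. 4 loop: walk through a playlist and, a fixed
# number of beats before each song ends, "prepare" the next track (in the
# real device: assign it to a free signal generation section and arm the
# cross-fade or cut-in).
def schedule(playlist, prepare_margin_beats=8):
    events = []
    for i, song in enumerate(playlist):
        trigger = max(0, song["length_beats"] - prepare_margin_beats)
        for beat in range(song["length_beats"]):           # step S14: poll position
            if beat == trigger and i + 1 < len(playlist):  # step S15: point reached?
                events.append(("prepare", playlist[i + 1]["name"], beat))
        events.append(("done", song["name"], song["length_beats"]))
    return events

playlist = [{"name": "A", "length_beats": 32},
            {"name": "B", "length_beats": 16}]
for ev in schedule(playlist):
    print(ev)
# ('prepare', 'B', 24) -> B is set up 8 beats before A ends
# ('done', 'A', 32)
# ('done', 'B', 16)
```

The margin plays the role described in the text: a cross-fade needs the next song running before the current one ends, while a cut-in only needs it armed.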
- FIG. 5 is a diagram showing metadata 30 on the time axis.
- The metadata 30 is auxiliary data added to a song along the time axis; not only the tempo but also beat information, bar information, and melody information such as the intro and the chorus are described in it.
- The bar/beat field 31 represents the beat count: "11" means the first beat of the first bar, "14" the fourth beat of the first bar, and likewise "21" the first beat of the second bar. In particular, "11" and "21" mark the beat at the head of a bar.
- The attribute field 32 shows what is at that position: the head of a bar, a normal beat, a melody section, a theme (chorus), and so on. "01" indicates the head of a bar; "11" and "21" therefore carry the bar-head attribute. If the song is sampled at, for example, 44.1 kHz, there are tens of thousands of samples between beats, so each of these positions is described as a sample position.
- The metadata 30 shown in FIG. 5 can be described in a text form or in a binary form.
- FIG. 6 shows a concrete example of time-axis metadata. The time-axis metadata 40 is displayed together with the beats of the audio signal: the long lines 41 indicate bar heads, and the short lines 42 indicate normal beats.
- The bar-head and other beat positions — for 4/4 time, the timing of the quarter notes — are held in correspondence with the sample positions of the song. FIG. 7 is a diagram showing another concrete example of time-axis metadata. It is possible to attach to an audio signal 50 not only beat positions 55 but also the melody composition of the song, such as the intro 51, A melodies 52 and 53, B melody 54, and theme (chorus, "sabi"). This information can be used to find the positions of the bars of a target song and the position of a specific melody.
- Metadata of this kind is stored in the metadata storage device 6.
- Methods of holding the metadata are shown in FIGS. 8A, 8B, and 8C.
- FIG. 8A is an example in which the metadata 71 and the music data 72 are logically distinct but physically exist in the same medium, as with MP3.
- FIG. 8B is an example in which the metadata 73 is mixed in with the music data 74, as with MPEG-4.
- FIG. 8C is an example in which the metadata 75 corresponding to the music data 76 is retrieved via a network. The music data and the metadata are logically and physically separate. This applies when the music editing device is equipped with a network communication unit, as in the music editing devices 80 and 110 described later, and is connected to a network such as the Internet.
- The metadata on a site on the network can then be downloaded according to a song ID.
- FIG. 9 is a diagram showing a concrete example of a remix pattern file.
- A song's metadata is tied to that song, but the remix pattern can be created freely by the user and has no dependence on any particular song: it is simply a description of how songs are to be connected.
- The song ID (file) column 61 of the pattern may hold the same identifiers as the song names A to E; it may instead hold an absolute file name or file path.
- The playback part column 62 shows where to play, for each song: "chorus" is specified for A, "intro" for B, "8th to 20th bar" for C, "whole" for D, and "chorus" for E.
- The playback effect column 63 specifies what effect to apply to each playing part.
- The connection method (pattern) column 64 shows "cross-fade" for A, meaning that when connecting to B, the end of A is joined with a cross-fade.
- The connection effect column 65 specifies effects — reverb, low-cut, distortion, and the like — to be applied at the moment of connection.
- The connection SE column 66 specifies sound effects. When the pattern shown in FIG. 9 is specified, the actual playback runs roughly as follows: the chorus of A is played; near the end of the chorus, the intro of B is brought in with a cross-fade; as B's intro ends, playback of the section of C from the 8th to the 20th bar begins, with an SE played at the same time; and playback continues in the same way through D and E.
- A remix pattern is thus a conventional playlist with deliberate instructions added about where and how the songs are to be connected.
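Interpreting FIG. 9 loosely, a remix pattern of this kind can be written down as a small data structure with one entry per song. The field names below are my own guesses at the pattern's columns (ID, playback part, playback effect, connection method, connection SE), not the patent's actual file format:

```python
# Hypothetical encoding of the FIG. 9 remix pattern: the songs and parts
# mirror the example in the text (chorus of A, intro of B, bars 8-20 of C,
# all of D, chorus of E); the key names and fillers are illustrative only.
remix_pattern = [
    {"id": "A", "part": "chorus",    "effect": None,     "connect": "cross-fade", "connect_se": None},
    {"id": "B", "part": "intro",     "effect": None,     "connect": "cross-fade", "connect_se": "SE1"},
    {"id": "C", "part": "bars 8-20", "effect": None,     "connect": "cut-in",     "connect_se": None},
    {"id": "D", "part": "whole",     "effect": "reverb", "connect": "cross-fade", "connect_se": None},
    {"id": "E", "part": "chorus",    "effect": None,     "connect": None,         "connect_se": None},
]

# Like a playlist, the pattern fixes the order -- but it also says *which
# part* of each song to use and *how* to join it to the next one.
order = [entry["id"] for entry in remix_pattern]
print(order)  # ['A', 'B', 'C', 'D', 'E']
```

This makes the "playlist plus connection instructions" framing concrete: stripping every field except `id` recovers an ordinary playlist.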
- The tempos of songs A, B, and C are adjusted to a tempo determined by the user or by the system, regardless of each song's original tempo, and the bar (beat) positions of the three songs are aligned as well. Played in this way, songs A, B, and C are connected seamlessly and non-stop, without the beat ever breaking.
- To achieve this, the playback tempo and beat positions of songs A, B, and C must be controlled during playback. To know each song's original tempo and beat positions, the music editing device uses the time-axis metadata paired with each of songs A, B, and C. At playback time, the playback rate is changed based on the ratio of each song's original tempo to the current master beat tempo, and playback is started with the bar heads of the songs aligned.
- In this way, the tempos and beat positions of multiple songs are handled accurately, and their playback positions are controlled in real time for synchronized playback. The technique of connecting songs seamlessly described here presupposes the use of this synchronous playback method.
- FIG. 10 shows an example in which song A and song B are overlapped with their beats aligned and are connected by cross-fade processing.
- The current song A is faded out (its volume is gradually decreased) while the next song B is faded in (its volume is gradually increased); by playing both at the same time, the effect of a seamless switch between the two songs is obtained.
- A cross-fade is a method that has traditionally been used in FM radio broadcasting and elsewhere; the point of the present invention is that the tempos and beat positions of songs A and B are aligned, so the songs connect without any feeling of incongruity.
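A cross-fade of this kind reduces to two gain ramps applied to beat-aligned signals. A minimal sketch, assuming a linear fade (the text does not specify the fade curve) and using my own function names:

```python
def crossfade_gains(n_steps: int):
    """Linear cross-fade: song A's gain falls 1 -> 0 while song B's rises 0 -> 1.
    Returns (gain_a, gain_b) pairs for each step of the overlap region."""
    return [(1 - i / (n_steps - 1), i / (n_steps - 1)) for i in range(n_steps)]

def mix(a_samples, b_samples):
    """Overlap two beat-aligned excerpts of equal length using those gains."""
    gains = crossfade_gains(len(a_samples))
    return [ga * a + gb * b for (ga, gb), a, b in zip(gains, a_samples, b_samples)]

# Over the overlap, a constant-level song A (1.0) hands over to song B (0.5):
out = mix([1.0] * 5, [0.5] * 5)
print([round(x, 3) for x in out])  # [1.0, 0.875, 0.75, 0.625, 0.5]
```

Because both excerpts are already tempo-matched and beat-aligned by the synchronous playback mechanism, the fade itself is this simple: the gains do all the remaining work.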
- Effects make still more natural and attractive connection methods possible: for example, playing back song A with reverb applied while song B is played with a low-cut filter applied, with the various connection methods described above applied at the same time.
- The unit to which these connection methods are applied need not be a whole song; it may be the chorus of a song, or the intro, as in FIG. 17. If parts of songs are connected, it becomes possible, for example, to remix only the choruses.
- The tempo may be a tempo specified by the user, or the tempo of a main song may be used throughout.
- FIG. 18 shows another music editing device 80 to which the present invention is applied.
- FIG. 19 is a functional block diagram of the music editing device 80.
- This music editing device 80 has a network communication unit 81 and can be connected to the Internet 82.
- Because the metadata described above can be obtained from a site on the network, the user can bring the metadata from the Internet even when only the music data is at hand, as in FIG. 18.
- The remix pattern file is likewise not limited to one created by an individual user; it may be provided by a content service side.
- With the Internet connection it becomes possible to share a remix pattern a user has created with other people, for multiple people to create and exchange remixes, and to enjoy a new music-centered communication in which remixes are valued by a community. Also, if the ID of a song is known, the song's metadata can be brought from a server on the Internet.
- FIGS. 10 and 11 show the configuration and a functional block diagram of yet another music editing device 110.
- This music editing device 110 acquires a sensor value through an A/D converter and is functionally equipped with a sensor value acquisition unit.
- For example, the technique of detecting the walking tempo with an acceleration sensor and changing the tempo of the music to match it (2003-39359) can be applied, as can the invention of selecting music in accordance with the tempo of walking or jogging (2005-303099).
- To apply these techniques a sensor is indispensable; by combining the sensor with the algorithm of this invention, songs can be selected according to the user's state and then remixed, making nonstop playback possible.
- A sensor mode is provided in addition to the mix-pattern mode, and the processing differs depending on which the user selects. In sensor mode, the sensor detects a pattern such as walking or jogging, and the beat is changed according to that pattern.
- Step S21 determines whether the sensor mode or the pattern mode is selected.
- In the sensor mode, songs are selected by the sensor according to the user's jogging or walking; the order of the songs, and the song selection itself, are not necessarily determined in advance but change dynamically. The idea is that a pattern is created dynamically from the sensor instead of reading a pattern file prepared beforehand.
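Detecting the walking tempo from the sensor can be reduced to measuring the interval between footfalls. A minimal sketch, assuming footfall timestamps have already been extracted (for example as peaks in accelerometer magnitude); the function name and input shape are illustrative:

```python
def step_bpm(step_times):
    """Estimate walking tempo in steps per minute from a list of footfall
    timestamps (seconds). Returns None if fewer than two steps are seen."""
    if len(step_times) < 2:
        return None
    intervals = [b - a for a, b in zip(step_times, step_times[1:])]
    mean_interval = sum(intervals) / len(intervals)  # average time per step
    return 60.0 / mean_interval
```

A walker producing a step every 0.5 s would yield 120 steps per minute, which the device can then use directly as the remix tempo.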
- When pattern priority rather than sensor priority is selected in step S21, the processing is the same as that of the figure described above. Next, the case where the sensor mode is selected in steps S21 and S22 will be described: music is automatically selected according to the jogging, and the songs are automatically connected to match.
- The songs and the tempo are determined from the sensor input (step S23). If it is determined in step S24 that there is a next song, the tempo is set in step S25; in this case the tempo detected from the walking tempo is used. Since the way the songs are connected is not always determined in advance, it is determined automatically (step S26): in jogging mode, for example, everything may simply be connected with a cross-fade, or, after looking at the metadata of the next song, the songs may simply be overlapped, since the tempo is already fixed. The connected songs are then played back (step S27).
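The decision flow above can be sketched as follows. The step numbers in the comments follow the source; the mode strings, the sensor interface, and the function names are assumptions made for illustration, not the patent's implementation.

```python
def run_remix(mode, pattern_file, sensor):
    """Skeleton of the mode decision: in pattern-priority mode a prepared
    pattern file drives playback; in sensor-priority mode the song list
    and tempo are derived from sensor input on the fly."""
    playlist = []
    if mode == "pattern":                    # S21: pattern priority selected
        playlist = list(pattern_file)        # play the prepared pattern as-is
    elif mode == "sensor":                   # S21 -> S22: sensor priority
        while sensor.has_next():             # S24: is there a next song?
            tempo = sensor.current_tempo()   # S23/S25: tempo from walking pace
            song = sensor.pick_song(tempo)   # song chosen to fit that tempo
            playlist.append((song, tempo))   # S26/S27: joined and played later
    return playlist
```

The contrast the source draws is visible here: the pattern branch is static, while the sensor branch builds the playlist dynamically, one song at a time.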
- In this way, methods of connecting songs can be published on the Internet and shared with other people, or remixes can be made by several people together, enabling a new form of music-centered communication. Metadata corresponding to a piece of music can also be downloaded from an Internet site, and with the sensor the user can listen to music that suits his or her situation and condition.
- The remix position and arbitrary timing mentioned above are characterized in that they can be set in particular at bar, beat, and melody-component positions.
- Part or all of another song can be played back simultaneously with the currently playing song, at any timing.
- A further feature is that the volume of each song being played simultaneously can be controlled individually; at the time of the simultaneous playback described above, the volume of the currently playing song can be lowered while that of the next song is raised, and the two can be synthesized and played at the same time.
- In the simultaneous playback described above, the timing is controlled on the basis of pre-recorded metadata corresponding to each song, such as the tempo, time signature, downbeat positions, and melody structure.
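Controlling timing from bar/beat metadata amounts to converting a musical position into a playback time. A minimal sketch, assuming 1-indexed bars and beats and a default 4/4 time signature (the function name and defaults are illustrative):

```python
def beat_time(bar, beat, bpm, beats_per_bar=4):
    """Seconds from the song start to a (bar, beat) position, given the
    tempo and time signature recorded in the metadata (both 1-indexed)."""
    beats_elapsed = (bar - 1) * beats_per_bar + (beat - 1)
    return beats_elapsed * 60.0 / bpm
```

With positions expressed this way, "start the next song at bar 17, beat 1" becomes a concrete timestamp the playback engine can schedule against.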
- The metadata described above is recorded in the same file as the music, or recorded as a separate file whose correspondence to the music is managed using a database.
- A separate pattern file, describing the playback start timing, the volume control method, and the way each song is generated at the time of the simultaneous playback described above, is prepared, and real-time playback is performed on the basis of it.
- The tempo described above is characterized by being specified by the user and/or determined automatically by the device.
- As the automatically determined tempo, data from a sensor can be used.
- As for the sensor, any type can be used and added, for example an acceleration sensor, a body-motion sensor, or an electroencephalogram sensor.
- The pattern file need not only be stored in the local storage device; it can also be obtained from an external storage device or via the Internet.
- As described above, according to the present invention it is possible to connect the whole or parts of many recorded songs held on a medium such as a hard disk or flash memory so that their beats are aligned, and to play them back nonstop; the whole or parts of songs can also be connected in real time and played back on the spot.
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Physics & Mathematics (AREA)
- Acoustics & Sound (AREA)
- Theoretical Computer Science (AREA)
- Signal Processing (AREA)
- Computational Linguistics (AREA)
- Health & Medical Sciences (AREA)
- Audiology, Speech & Language Pathology (AREA)
- Human Computer Interaction (AREA)
- Electrophonic Musical Instruments (AREA)
- Management Or Editing Of Information On Record Carriers (AREA)
- Signal Processing For Digital Recording And Reproducing (AREA)
Abstract
Description
Claims
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP20060834645 EP1959429A4 (en) | 2005-12-09 | 2006-12-07 | MUSIC EDITING DEVICE AND MUSIC EDITING PROCEDURE |
JP2007549215A JPWO2007066819A1 (ja) | 2005-12-09 | 2006-12-07 | 音楽編集装置及び音楽編集方法 |
CN2006800457218A CN101322180B (zh) | 2005-12-09 | 2006-12-07 | 音乐编辑装置和音乐编辑方法 |
US12/092,641 US7855333B2 (en) | 2005-12-09 | 2006-12-07 | Music edit device and music edit method |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2005-356825 | 2005-12-09 | ||
JP2005356825 | 2005-12-09 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2007066819A1 true WO2007066819A1 (ja) | 2007-06-14 |
Family
ID=38122953
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2006/324890 WO2007066819A1 (ja) | 2005-12-09 | 2006-12-07 | 音楽編集装置及び音楽編集方法 |
Country Status (6)
Country | Link |
---|---|
US (1) | US7855333B2 (ja) |
EP (1) | EP1959429A4 (ja) |
JP (1) | JPWO2007066819A1 (ja) |
KR (1) | KR20080074977A (ja) |
CN (1) | CN101322180B (ja) |
WO (1) | WO2007066819A1 (ja) |
Cited By (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2009038225A1 (ja) * | 2007-09-19 | 2009-03-26 | Sony Corporation | コンテンツ再生装置及びコンテンツ再生方法 |
JP2010531464A (ja) * | 2007-06-25 | 2010-09-24 | ソニー エリクソン モバイル コミュニケーションズ, エービー | 電子装置を用いて複数のソングを自動的にビートミックスするシステム及び方法 |
JP2011227212A (ja) * | 2010-04-17 | 2011-11-10 | Nl Giken Kk | 電子オルゴール |
JP2012103603A (ja) * | 2010-11-12 | 2012-05-31 | Sony Corp | 情報処理装置、楽曲区間抽出方法、及びプログラム |
JP2016085332A (ja) * | 2014-10-24 | 2016-05-19 | オンキヨー株式会社 | 音楽編集装置、及び、音楽編集プログラム |
CN108202334A (zh) * | 2018-03-22 | 2018-06-26 | 东华大学 | 一种能够识别音乐节拍和风格的舞蹈机器人 |
JP7564189B2 (ja) | 2019-08-08 | 2024-10-08 | ハーマン インターナショナル インダストリーズ インコーポレイテッド | インテリジェントマッチングを実現し、リズミックトラックを追加する方法及びシステム |
Families Citing this family (32)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2002089111A1 (fr) * | 2001-04-17 | 2002-11-07 | Kabushiki Kaisha Kenwood | Systeme de transfert d'informations sur un attribut, par exemple, de disque compact |
WO2007066813A1 (ja) * | 2005-12-09 | 2007-06-14 | Sony Corporation | 音楽編集装置、音楽編集情報の作成方法、並びに音楽編集情報が記録された記録媒体 |
JP5259083B2 (ja) * | 2006-12-04 | 2013-08-07 | ソニー株式会社 | マッシュアップ用データの配布方法、マッシュアップ方法、マッシュアップ用データのサーバ装置およびマッシュアップ装置 |
JP5007563B2 (ja) * | 2006-12-28 | 2012-08-22 | ソニー株式会社 | 音楽編集装置および方法、並びに、プログラム |
JP4311466B2 (ja) * | 2007-03-28 | 2009-08-12 | ヤマハ株式会社 | 演奏装置およびその制御方法を実現するプログラム |
US7956274B2 (en) * | 2007-03-28 | 2011-06-07 | Yamaha Corporation | Performance apparatus and storage medium therefor |
US7915512B2 (en) * | 2008-10-15 | 2011-03-29 | Agere Systems, Inc. | Method and apparatus for adjusting the cadence of music on a personal audio device |
US9024166B2 (en) * | 2010-09-09 | 2015-05-05 | Harmonix Music Systems, Inc. | Preventing subtractive track separation |
JP5594052B2 (ja) * | 2010-10-22 | 2014-09-24 | ソニー株式会社 | 情報処理装置、楽曲再構成方法及びプログラム |
US9070352B1 (en) * | 2011-10-25 | 2015-06-30 | Mixwolf LLC | System and method for mixing song data using measure groupings |
US9111519B1 (en) * | 2011-10-26 | 2015-08-18 | Mixwolf LLC | System and method for generating cuepoints for mixing song data |
GB201202565D0 (en) | 2012-02-15 | 2012-03-28 | British American Tobacco Co | Packaging |
JP2014052469A (ja) * | 2012-09-06 | 2014-03-20 | Sony Corp | 音声処理装置、音声処理方法、及び、プログラム |
US9070351B2 (en) * | 2012-09-19 | 2015-06-30 | Ujam Inc. | Adjustment of song length |
CN103928037B (zh) * | 2013-01-10 | 2018-04-13 | 先锋高科技(上海)有限公司 | 一种音频切换方法及终端设备 |
US8927846B2 (en) * | 2013-03-15 | 2015-01-06 | Exomens | System and method for analysis and creation of music |
US9798974B2 (en) | 2013-09-19 | 2017-10-24 | Microsoft Technology Licensing, Llc | Recommending audio sample combinations |
US9280313B2 (en) * | 2013-09-19 | 2016-03-08 | Microsoft Technology Licensing, Llc | Automatically expanding sets of audio samples |
US9372925B2 (en) | 2013-09-19 | 2016-06-21 | Microsoft Technology Licensing, Llc | Combining audio samples by automatically adjusting sample characteristics |
US9257954B2 (en) | 2013-09-19 | 2016-02-09 | Microsoft Technology Licensing, Llc | Automatic audio harmonization based on pitch distributions |
US9613605B2 (en) * | 2013-11-14 | 2017-04-04 | Tunesplice, Llc | Method, device and system for automatically adjusting a duration of a song |
CN103902647A (zh) * | 2013-12-27 | 2014-07-02 | 上海斐讯数据通信技术有限公司 | 一种应用于智能设备上的乐谱识别方法及智能设备 |
EP4218975A3 (en) | 2015-05-19 | 2023-08-30 | Harmonix Music Systems, Inc. | Improvised guitar simulation |
US10101960B2 (en) | 2015-05-19 | 2018-10-16 | Spotify Ab | System for managing transitions between media content items |
GB2581032B (en) * | 2015-06-22 | 2020-11-04 | Time Machine Capital Ltd | System and method for onset detection in a digital signal |
US9773486B2 (en) | 2015-09-28 | 2017-09-26 | Harmonix Music Systems, Inc. | Vocal improvisation |
US9799314B2 (en) | 2015-09-28 | 2017-10-24 | Harmonix Music Systems, Inc. | Dynamic improvisational fill feature |
US9502017B1 (en) * | 2016-04-14 | 2016-11-22 | Adobe Systems Incorporated | Automatic audio remixing with repetition avoidance |
GB2571340A (en) * | 2018-02-26 | 2019-08-28 | Ai Music Ltd | Method of combining audio signals |
CN108831425B (zh) | 2018-06-22 | 2022-01-04 | 广州酷狗计算机科技有限公司 | 混音方法、装置及存储介质 |
KR102128315B1 (ko) | 2018-06-25 | 2020-06-30 | 서울시립대학교 산학협력단 | 가상 악기 시각화 기반 미디 음악 편곡 장치, 이를 위한 방법 및 이 방법을 수행하는 프로그램이 기록된 컴퓨터 판독 가능한 기록매체 |
CN110867174A (zh) * | 2018-08-28 | 2020-03-06 | 努音有限公司 | 自动混音装置 |
Citations (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH06342282A (ja) * | 1993-04-08 | 1994-12-13 | Sony Corp | 音楽出力装置 |
JPH07295589A (ja) * | 1994-04-22 | 1995-11-10 | Yamaha Corp | 波形処理装置 |
JP2000056780A (ja) * | 1998-08-05 | 2000-02-25 | Yamaha Corp | カラオケ装置 |
JP2001109470A (ja) * | 1999-10-13 | 2001-04-20 | Yamaha Corp | 自動演奏装置及び方法 |
JP2003044046A (ja) | 2001-07-30 | 2003-02-14 | Sony Corp | 情報処理装置及び情報処理方法、並びに記憶媒体 |
JP2003050588A (ja) * | 2001-08-06 | 2003-02-21 | Pioneer Electronic Corp | コンテンツ提供システムの管理サーバ装置、および端末装置 |
JP2003108132A (ja) * | 2001-09-28 | 2003-04-11 | Pioneer Electronic Corp | オーディオ情報再生装置及びオーディオ情報再生システム |
JP2004198759A (ja) * | 2002-12-19 | 2004-07-15 | Sony Computer Entertainment Inc | 楽音再生装置及び楽音再生プログラム |
JP2005156641A (ja) | 2003-11-20 | 2005-06-16 | Sony Corp | 再生態様制御装置及び再生態様制御方法 |
JP2005303099A (ja) | 2004-04-14 | 2005-10-27 | Hitachi High-Technologies Corp | プラズマ処理装置およびプラズマ処理方法 |
Family Cites Families (22)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2956569B2 (ja) * | 1996-02-26 | 1999-10-04 | ヤマハ株式会社 | カラオケ装置 |
JP3861381B2 (ja) * | 1997-06-13 | 2006-12-20 | ヤマハ株式会社 | カラオケ装置 |
JP2000244811A (ja) * | 1999-02-23 | 2000-09-08 | Makino Denki:Kk | ミキシング方法およびミキシング装置 |
JP3570309B2 (ja) * | 1999-09-24 | 2004-09-29 | ヤマハ株式会社 | リミックス装置および記憶媒体 |
JP4293712B2 (ja) * | 1999-10-18 | 2009-07-08 | ローランド株式会社 | オーディオ波形再生装置 |
JP3843688B2 (ja) | 2000-03-15 | 2006-11-08 | ヤマハ株式会社 | 楽曲データ編集装置 |
JP3870671B2 (ja) | 2000-06-26 | 2007-01-24 | ヤマハ株式会社 | 携帯端末装置 |
JP2003022660A (ja) * | 2001-07-05 | 2003-01-24 | Kenwood Corp | 記録再生装置および記録モード表示方法 |
JP2003114678A (ja) | 2001-07-30 | 2003-04-18 | Sony Corp | 情報処理装置及び情報処理方法、並びに記憶媒体 |
JP2003114677A (ja) | 2001-07-30 | 2003-04-18 | Sony Corp | 情報処理装置及び情報処理方法、並びに記憶媒体 |
US6933432B2 (en) | 2002-03-28 | 2005-08-23 | Koninklijke Philips Electronics N.V. | Media player with “DJ” mode |
US20030205124A1 (en) * | 2002-05-01 | 2003-11-06 | Foote Jonathan T. | Method and system for retrieving and sequencing music by rhythmic similarity |
JP2003345351A (ja) | 2002-05-27 | 2003-12-03 | Nec Corp | デジタルコンテンツ編集システムおよびデジタルコンテンツ編集方法 |
US7208672B2 (en) * | 2003-02-19 | 2007-04-24 | Noam Camiel | System and method for structuring and mixing audio tracks |
US7521623B2 (en) * | 2004-11-24 | 2009-04-21 | Apple Inc. | Music synchronization arrangement |
US7189913B2 (en) * | 2003-04-04 | 2007-03-13 | Apple Computer, Inc. | Method and apparatus for time compression and expansion of audio data with dynamic tempo change during playback |
US20040254660A1 (en) * | 2003-05-28 | 2004-12-16 | Alan Seefeldt | Method and device to process digital media streams |
US7026536B2 (en) * | 2004-03-25 | 2006-04-11 | Microsoft Corporation | Beat analysis of musical signals |
US7592534B2 (en) * | 2004-04-19 | 2009-09-22 | Sony Computer Entertainment Inc. | Music composition reproduction device and composite device including the same |
US7518053B1 (en) * | 2005-09-01 | 2009-04-14 | Texas Instruments Incorporated | Beat matching for portable audio |
US20070074618A1 (en) * | 2005-10-04 | 2007-04-05 | Linda Vergo | System and method for selecting music to guide a user through an activity |
US20080097633A1 (en) * | 2006-09-29 | 2008-04-24 | Texas Instruments Incorporated | Beat matching systems |
- 2006
- 2006-12-07 KR KR1020087013806A patent/KR20080074977A/ko active IP Right Grant
- 2006-12-07 WO PCT/JP2006/324890 patent/WO2007066819A1/ja active Application Filing
- 2006-12-07 JP JP2007549215A patent/JPWO2007066819A1/ja active Pending
- 2006-12-07 CN CN2006800457218A patent/CN101322180B/zh not_active Expired - Fee Related
- 2006-12-07 US US12/092,641 patent/US7855333B2/en not_active Expired - Fee Related
- 2006-12-07 EP EP20060834645 patent/EP1959429A4/en not_active Withdrawn
Patent Citations (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH06342282A (ja) * | 1993-04-08 | 1994-12-13 | Sony Corp | 音楽出力装置 |
JPH07295589A (ja) * | 1994-04-22 | 1995-11-10 | Yamaha Corp | 波形処理装置 |
JP2000056780A (ja) * | 1998-08-05 | 2000-02-25 | Yamaha Corp | カラオケ装置 |
JP2001109470A (ja) * | 1999-10-13 | 2001-04-20 | Yamaha Corp | 自動演奏装置及び方法 |
JP2003044046A (ja) | 2001-07-30 | 2003-02-14 | Sony Corp | 情報処理装置及び情報処理方法、並びに記憶媒体 |
JP2003050588A (ja) * | 2001-08-06 | 2003-02-21 | Pioneer Electronic Corp | コンテンツ提供システムの管理サーバ装置、および端末装置 |
JP2003108132A (ja) * | 2001-09-28 | 2003-04-11 | Pioneer Electronic Corp | オーディオ情報再生装置及びオーディオ情報再生システム |
JP2004198759A (ja) * | 2002-12-19 | 2004-07-15 | Sony Computer Entertainment Inc | 楽音再生装置及び楽音再生プログラム |
JP2005156641A (ja) | 2003-11-20 | 2005-06-16 | Sony Corp | 再生態様制御装置及び再生態様制御方法 |
JP2005303099A (ja) | 2004-04-14 | 2005-10-27 | Hitachi High-Technologies Corp | プラズマ処理装置およびプラズマ処理方法 |
Non-Patent Citations (1)
Title |
---|
See also references of EP1959429A4 |
Cited By (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2010531464A (ja) * | 2007-06-25 | 2010-09-24 | ソニー エリクソン モバイル コミュニケーションズ, エービー | 電子装置を用いて複数のソングを自動的にビートミックスするシステム及び方法 |
WO2009038225A1 (ja) * | 2007-09-19 | 2009-03-26 | Sony Corporation | コンテンツ再生装置及びコンテンツ再生方法 |
CN101802920B (zh) * | 2007-09-19 | 2012-08-29 | 索尼公司 | 内容再现设备和内容再现方法 |
JP2011227212A (ja) * | 2010-04-17 | 2011-11-10 | Nl Giken Kk | 電子オルゴール |
JP2012103603A (ja) * | 2010-11-12 | 2012-05-31 | Sony Corp | 情報処理装置、楽曲区間抽出方法、及びプログラム |
JP2016085332A (ja) * | 2014-10-24 | 2016-05-19 | オンキヨー株式会社 | 音楽編集装置、及び、音楽編集プログラム |
CN108202334A (zh) * | 2018-03-22 | 2018-06-26 | 东华大学 | 一种能够识别音乐节拍和风格的舞蹈机器人 |
JP7564189B2 (ja) | 2019-08-08 | 2024-10-08 | ハーマン インターナショナル インダストリーズ インコーポレイテッド | インテリジェントマッチングを実現し、リズミックトラックを追加する方法及びシステム |
Also Published As
Publication number | Publication date |
---|---|
US7855333B2 (en) | 2010-12-21 |
EP1959429A1 (en) | 2008-08-20 |
CN101322180B (zh) | 2012-01-11 |
JPWO2007066819A1 (ja) | 2009-05-21 |
EP1959429A4 (en) | 2011-08-31 |
CN101322180A (zh) | 2008-12-10 |
US20090133568A1 (en) | 2009-05-28 |
KR20080074977A (ko) | 2008-08-13 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2007066819A1 (ja) | 音楽編集装置及び音楽編集方法 | |
KR101287984B1 (ko) | 음악 편집 장치 및 음악 편집 방법 | |
US7678983B2 (en) | Music edit device, music edit information creating method, and recording medium where music edit information is recorded | |
Bartlett et al. | Practical Recording Techniques: The step-by-step approach to professional audio recording | |
US20110112672A1 (en) | Systems and Methods of Constructing a Library of Audio Segments of a Song and an Interface for Generating a User-Defined Rendition of the Song | |
JP2004212473A (ja) | カラオケ装置及びカラオケ再生方法 | |
JP4412128B2 (ja) | 再生装置および再生方法 | |
US20160054976A1 (en) | Method for producing media contents in duet mode and apparatus used therein | |
WO2007060605A2 (en) | Device for and method of processing audio data items | |
KR101029483B1 (ko) | 멀티채널 오디오 파일을 이용한 음악 ucc 제작방법 및 그 장치 | |
JP4481225B2 (ja) | 重唱曲における模範歌唱の再生制御に特徴を有するカラオケ装置 | |
JP3859200B2 (ja) | ポータブルミキシング記録装置及びその制御方法並びにプログラム | |
JP2008216681A (ja) | 録音した自分の歌声と模範歌唱とを厳しく比較できるカラオケ装置 | |
KR100552605B1 (ko) | 멀티 트랙 오디오 포맷 변환/재생 방법 및 시스템 | |
JP2005033826A (ja) | ポータブルミキシング記録装置及びプログラム | |
KR101562041B1 (ko) | 듀엣 모드의 미디어 콘텐츠 제작 방법 및 이에 사용되는 미디어 콘텐츠 제작 장치 | |
JP2005107285A (ja) | 楽曲再生装置 | |
JP3900576B2 (ja) | 音楽情報再生装置 | |
JP4267513B2 (ja) | カラオケ録音装置 | |
KR20110071192A (ko) | 미디어 편집 장치, 미디어 편집 서비스 제공 방법, 및 이에 사용되는 웹서버 | |
JP2007079413A (ja) | オーディオ再生装置、オーディオ配信システム、オーディオ再生プログラムおよびオーサリングプログラム | |
JP2007133440A (ja) | 音楽情報再生装置 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
WWE | Wipo information: entry into national phase |
Ref document number: 200680045721.8 Country of ref document: CN |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application | ||
ENP | Entry into the national phase |
Ref document number: 2007549215 Country of ref document: JP Kind code of ref document: A |
|
WWE | Wipo information: entry into national phase |
Ref document number: 12092641 Country of ref document: US |
|
WWE | Wipo information: entry into national phase |
Ref document number: 4011/DELNP/2008 Country of ref document: IN |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2006834645 Country of ref document: EP |
|
WWE | Wipo information: entry into national phase |
Ref document number: 1020087013806 Country of ref document: KR |
|
NENP | Non-entry into the national phase |
Ref country code: DE |