US10540950B2 - Electrophonic chordophone system, apparatus and method - Google Patents
Electrophonic chordophone system, apparatus and method
- Publication number: US10540950B2
- Application number: US16/063,246
- Authority: US (United States)
- Prior art keywords: notes, tonal, melody, strings, processor
- Legal status: Expired - Fee Related (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Classifications
- G—PHYSICS
- G10—MUSICAL INSTRUMENTS; ACOUSTICS
- G10H—ELECTROPHONIC MUSICAL INSTRUMENTS; INSTRUMENTS IN WHICH THE TONES ARE GENERATED BY ELECTROMECHANICAL MEANS OR ELECTRONIC GENERATORS, OR IN WHICH THE TONES ARE SYNTHESISED FROM A DATA STORE
- G10H1/00—Details of electrophonic musical instruments
- G10H1/36—Accompaniment arrangements
- G10H1/38—Chord
- G10H1/383—Chord detection and/or recognition, e.g. for correction, or automatic bass generation
- G10H1/386—One-finger or one-key chord systems
- G10H3/00—Instruments in which the tones are generated by electromechanical means
- G10H3/12—Instruments in which the tones are generated by electromechanical means using mechanical resonant generators, e.g. strings or percussive instruments, the tones of which are picked up by electromechanical transducers, the electrical signals being further manipulated or amplified and subsequently converted to sound by a loudspeaker or equivalent instrument
- G10H3/125—Extracting or recognising the pitch or fundamental frequency of the picked up signal
- G10H3/14—Instruments in which the tones are generated by electromechanical means using mechanical resonant generators, e.g. strings or percussive instruments, the tones of which are picked up by electromechanical transducers, the electrical signals being further manipulated or amplified and subsequently converted to sound by a loudspeaker or equivalent instrument using mechanically actuated vibrators with pick-up means
- G10H3/18—Instruments in which the tones are generated by electromechanical means using mechanical resonant generators, e.g. strings or percussive instruments, the tones of which are picked up by electromechanical transducers, the electrical signals being further manipulated or amplified and subsequently converted to sound by a loudspeaker or equivalent instrument using mechanically actuated vibrators with pick-up means using a string, e.g. electric guitar
- G10H3/186—Means for processing the signal picked up from the strings
- G10H3/188—Means for processing the signal picked up from the strings for converting the signal to digital format
- G10H7/00—Instruments in which the tones are synthesised from a data store, e.g. computer organs
- G10H2210/00—Aspects or methods of musical processing having intrinsic musical character, i.e. involving musical theory or musical parameters or relying on musical knowledge, as applied in electrophonic musical tools or instruments
- G10H2210/005—Musical accompaniment, i.e. complete instrumental rhythm synthesis added to a performed melody, e.g. as output by drum machines
- G10H2210/031—Musical analysis, i.e. isolation, extraction or identification of musical elements or musical parameters from a raw acoustic signal or from an encoded audio signal
- G10H2210/056—Musical analysis, i.e. isolation, extraction or identification of musical elements or musical parameters from a raw acoustic signal or from an encoded audio signal for extraction or identification of individual instrumental parts, e.g. melody, chords, bass; Identification or separation of instrumental parts by their characteristic voices or timbres
- G10H2210/555—Tonality processing, involving the key in which a musical piece or melody is played
- G10H2220/00—Input/output interfacing specifically adapted for electrophonic musical tools or instruments
- G10H2220/155—User input interfaces for electrophonic musical instruments
- G10H2220/165—User input interfaces for electrophonic musical instruments for string input, i.e. special characteristics in string composition or use for sensing purposes, e.g. causing the string to become its own sensor
Definitions
- This invention relates to an electrophonic chordophone system, apparatus and associated computer-implemented electrophonic chordophone method.
- The present invention lies in the field of music in general.
- A musical note is essentially one audible frequency, with a scale being a set of frequency intervals connecting one note to the same note at double the initial frequency.
- A chord occurs when multiple notes, or frequencies, are played simultaneously, in one or more scales. A progression of such chords then produces music.
- Some common elements of music are pitch (which governs melody and harmony), rhythm (including concepts such as tempo, meter, and articulation), dynamics (loudness and softness), and the sonic qualities of timbre and texture.
- MIDI (Musical Instrument Digital Interface)
- MIDI was developed so that musical instruments could communicate with each other and so that one instrument can control another.
- a single MIDI link can carry up to sixteen channels of information, each of which can be routed to a separate device.
- MIDI typically carries event messages that specify notation, pitch and velocity, control signals for parameters such as volume, vibrato, audio panning, cues, and clock signals that set and synchronize tempo between multiple devices. These messages are typically transmitted via a MIDI connection, e.g. cable, wireless, etc., to other devices where they control sound generation and other features.
- This data can also be recorded into a hardware or software device called a sequencer, which can be used to edit the data and to play it back at a later time.
- MIDI data files store MIDI messages in a standardized and persistent manner, so that the stored messages can later be recalled and transmitted to MIDI devices.
- An example use of a MIDI data file would be where a pop song is stored as a sequence of MIDI messages held in a MIDI data file, whereby recalling the MIDI messages and transmitting them to a musical keyboard would cause said keyboard to play the pop song.
- MIDI data files are static after creation in that they do not change, evolve or comprise any dynamic elements. MIDI data files can be edited by a user using technologies relevant to MIDI music composition.
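For background, the event messages described above have a simple byte-level structure defined by the MIDI standard. The following is a minimal sketch (not part of the patent) of how note-on and note-off channel-voice messages are composed:

```python
# Illustrative sketch of raw MIDI channel-voice messages (background only,
# not taken from the patent). A note-on message is three bytes: a status
# byte 0x90 combined with the channel number, then the note number, then
# the velocity. Note number 69 is A4 (440 Hz); values range 0-127.

def note_on(channel: int, note: int, velocity: int) -> bytes:
    """Build a 3-byte MIDI note-on message for the given channel (0-15)."""
    assert 0 <= channel <= 15 and 0 <= note <= 127 and 0 <= velocity <= 127
    return bytes([0x90 | channel, note, velocity])

def note_off(channel: int, note: int) -> bytes:
    """Build a 3-byte MIDI note-off message (release velocity 0)."""
    return bytes([0x80 | channel, note, 0])

msg = note_on(0, 69, 100)   # A4 at moderate velocity on channel 1
```

A sequencer or MIDI data file is, at heart, a timestamped stream of such messages.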
- Looping pedals provide an ability to overdub repeated recordings of live guitar playing in order to form a layer of sounds.
- However, the use of looping pedals is not practical in the context of mainstream live guitar performances and is generally limited to the most technical and accomplished of players.
- Pre-recorded audio can serve as backing tracks, digitally encoding a musical performance for repeated replay.
- However, every time a pre-recorded audio file is used, it inherently sounds identical.
- Once a guitarist has used such a backing track several times, its usefulness as a composition tool or source of inspiration diminishes due to its invariant nature.
- Pre-recorded backing tracks of any kind always deliver the same chords in the same sequence using the same sounds, and are of little use as an inspirational platform for one seeking to compose or play ad lib.
- US2015/0221297 discloses a method substantially constrained to the formation of drum rhythms, based on an analysis of a musical performance. The process involves a sequence of intrinsically linked steps: an analysis of the musical performance must complete before the next step of the sequence can begin; a subsequent step then ascertains an accent pattern, and until said accent pattern is ascertained the next step cannot begin; a further step uses said accent pattern to identify a reference pattern, and until said reference pattern is identified the next step cannot begin. A musical accompaniment is generated in this manner, but it is not dynamically responsive, owing to the intrinsic reliance on this step-wise interlocked sequence, which must complete in series before said musical accompaniment can be produced.
- This prior art method does not create a dynamic musical accompaniment that would be inspiring and responsive to the performance of a musician in real time. At best, it provides novelty value in creating endless static loops formed from a static musical performance; there is no real-time dynamic content after the accompaniment is generated.
- US2015/0221297 teaches a method of creating a rhythmic output which does not comprise melodic elements.
- An example embodiment of US2015/0221297 is demonstrated in the GarageBand app, which in turn exemplifies a core constraint of the prior art, where a musician is required to manually create a backing track in the form of an infinitely repeating musical loop that never varies from what the musician input in the first instance.
- GarageBand does not create a melodic accompaniment in response to continuously and randomly selected chords performed by a guitar player.
- GarageBand is so complex that entire books are published to teach novice users how to create the most rudimentary musical backing tracks.
- U.S. Pat. No. 7,309,829 describes a system which can generate a different sound for each string of a guitar, and further can generate multiple sounds, which can be seen as timbral modification.
- U.S. Pat. No. 7,309,829 is embodied through custom electronics without which no sound will be generated or heard. It requires the creation of complex circuits, adding the considerable complexity, expense and inconvenience of building and interconnecting a multitude of ICs and other circuitry.
- This prior art system does not generate an accompaniment that would be observed by any musician to be a real-time accompaniment, e.g. would not produce a sound like a rock band from a single performer.
- U.S. Pat. No. 5,663,517 discloses a system which has a significant constraint in its reliance on MIDI input and its responsiveness to all notes appearing as input—there is no provision for analogue signal input. Such an arrangement inherently means that if the input stops, then output correspondingly stops.
- the system also requires selection among a plethora of MIDI files and operation of multiple complex user interface elements, which can be daunting for a novice musician.
- the system does not comprise the technology necessary to execute the mathematical models (e.g. Fast Fourier Transform) required in determining the multitude of polyphonic pitches and intents inherent in an analogue signal source.
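For context, the kind of spectral analysis referred to above can be sketched in its simplest monophonic form: the dominant bin of an FFT magnitude spectrum approximates a single note's fundamental frequency. Real polyphonic pitch detection, which the passage says the prior system lacks, is considerably harder; the following is an assumed illustration only, not the patent's method:

```python
# Minimal sketch (assumed, not from the patent) of FFT-based pitch
# estimation for a single monophonic note: the strongest bin of the
# magnitude spectrum approximates the fundamental frequency.
import numpy as np

def estimate_pitch(samples: np.ndarray, sample_rate: int) -> float:
    """Return the frequency (Hz) of the strongest spectral component."""
    spectrum = np.abs(np.fft.rfft(samples))
    spectrum[0] = 0.0                      # ignore the DC component
    peak_bin = int(np.argmax(spectrum))
    return peak_bin * sample_rate / len(samples)

# Example: a pure 440 Hz tone, one second at an 8192 Hz sample rate,
# giving a convenient 1 Hz bin resolution.
sr = 8192
t = np.arange(sr) / sr
tone = np.sin(2 * np.pi * 440.0 * t)
# estimate_pitch(tone, sr) is close to 440.0
```

Polyphonic signals superimpose several such fundamentals plus harmonics, which is why separating simultaneous string pitches requires substantially more machinery than this single-peak approach.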
- the Fishman MIDI pickup is one example of many devices in the prior art which produce MIDI messages in response to notes played on a guitar, whereby said MIDI messages can be transmitted to a MIDI keyboard.
- the conversion of guitar notes to MIDI messages is a concept central to the present invention. However, the primary use of any MIDI pickup is to enable a guitarist to emulate another musical instrument, e.g. a synthesiser, essentially replacing the tone of the guitar with the tone of another musical instrument, which Applicant believes is not desirable in many musical applications.
- Applicant has identified a need in the art for allowing a guitarist to experience a dynamic and responsive backing track, equivalent to playing with a group of musicians, controlled by the notes or chords played on a guitar, whilst also optionally playing melody ad lib, in order to release a guitarist from the constraints experienced by pre-recorded backing tracks and furthermore enable a single guitarist to control an entire band formed from sequenced and synthesized instruments.
- the present invention seeks to propose possible improvements, at least in part, in amelioration of the known shortcomings in the art.
- reference in this specification to a ‘guitar’ generally includes reference to a chordophone, being any musical instrument capable of producing sounds through the vibration of at least one string tensioned between two points on the instrument.
- reference to the term ‘guitar’ should be construed as reference to any multi-stringed chordophone as far as practical.
- an electrophonic chordophone system comprising:
- a sensor operatively responsive to respective strings of a guitar
- non-transitory processor-readable storage means containing first and second user-configurable tonal formats
- a processor arranged in signal communication with the sensor and storage means, said processor adaptable to associate a melody group of notes producible by the strings with the first tonal format in a one-to-one mapping, and to associate a control group of notes producible by the strings with the second tonal format in a one-to-many mapping; and
- a synthesiser arranged in signal communication with the processor, said synthesiser configured to produce both the first and second tonal formats simultaneously in substantial real-time, the first tonal format actuatable via the melody group of notes and the second tonal format dynamically selectable via the control group of notes and actuatable via the melody group of notes, so that a melody is producible via the first tonal format and an independent dynamic backing track comprising multiple timbres is producible via the second tonal format.
- the first tonal format comprises a collection of notes.
- the one-to-one mapping generally refers to a direct correlation. Accordingly, it is to be appreciated that such a direct correlation between the melody group of notes and the first tonal format is a correlation in which each melody note is associated with a single corresponding first tonal format note.
- the second tonal format comprises a collection of chords and/or timbres.
- the one-to-many mapping generally refers to an indirect correlation. Accordingly, it is to be appreciated that such an indirect correlation between the control group of notes and the second tonal format is a correlation in which a small number of control notes is generally associated with a large number of second tonal format notes, chords and/or timbres.
- control group of notes forms a subset of the melody group of notes.
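The two mapping styles described in the bullets above can be pictured as ordinary lookup tables. The sketch below is illustrative only; the note names, chords and timbres are assumed placeholders, not values taken from the patent:

```python
# Hypothetical illustration of the two mappings (all names are placeholders).

# One-to-one (first tonal format): each melody-group note maps to exactly
# one output note/sound.
melody_map = {
    "E2": "piano:E2",
    "A2": "piano:A2",
    "E4": "piano:E4",
    "G4": "piano:G4",
}

# One-to-many (second tonal format): a single control-group note selects a
# whole collection of chords and timbres for the backing track.
control_map = {
    "E2": {"chord": ["E2", "B2", "E3", "G#3"], "timbres": ["strings", "bass"]},
    "A2": {"chord": ["A2", "E3", "A3", "C#4"], "timbres": ["organ", "bass"]},
}

# The control group is a subset of the melody group, so a low E can both
# sound its own mapped note and select the E-major backing collection.
```

The point of the asymmetry is that one plucked note on the control side fans out into many backing sounds, while each note on the melody side stays a single voice.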
- ‘substantial real-time’ within this specification is to be understood as meaning an instance of time which may include a delay typically resulting from processing and/or transmission times inherent in processing systems or communication signal transmissions. These processing and transmission times, albeit generally of short duration, can introduce some delay, typically milli- or microseconds, but the tonal formats are generally produced with a human-imperceptible delay between string actuation and synthesiser reproduction, i.e. within ‘substantial real-time’.
- the sensor may comprise a transducer configured to capture mechanical vibrations from the strings and convert same to an electrical signal.
- the sensor may be configured to digitise analogue notes produced by the strings.
- the processor may be configured to digitise analogue notes produced by the strings.
- the first and second tonal formats may be user-configurable by means of a software application executed by the processor.
- notes comprising the melody group of notes are generally user-configurable.
- notes comprising the control group of notes are generally user-configurable.
- the melody group of notes is user-configurable.
- the control group of notes is typically user-configurable.
- the melody group of notes and the control group of notes may be user-configurable by means of a software application executed by the processor.
- the processor may be adapted to associate the melody group of notes producible by the strings with the first tonal format in a direct correlation by means of a software application executed by the processor.
- the processor may be adapted to associate the control group of notes producible by the strings with the second tonal format in an indirect correlation by means of a software application executed by the processor.
- the synthesiser comprises an electronic musical instrument that converts electric signals to sound.
- the synthesiser may include at least one speaker.
- an electrophonic chordophone apparatus comprising:
- non-transitory processor-readable storage means containing first and second user-configurable tonal formats
- a processor arranged in signal communication with the input and storage means;
- the processor is adapted to associate the melody group of notes producible by the strings with the first tonal format in a one-to-one mapping, associate the control group of notes producible by the strings with the second tonal format in a one-to-many mapping, and output both tonal formats simultaneously in substantial real-time.
- the input may include a transducer configured to capture mechanical vibrations from the strings and convert same to an electrical signal.
- the input may be configured to digitise analogue notes produced by the strings.
- the processor may be configured to digitise analogue notes produced by the strings.
- control group of notes may form a subset of the melody group of notes.
- the first and second tonal formats may be user-configurable by means of a software application executed by the processor.
- notes comprising the melody group of notes are generally user-configurable.
- notes comprising the control group of notes are generally user-configurable.
- the melody group of notes is user-configurable.
- the control group of notes is typically user-configurable.
- the melody group of notes and the control group of notes may be user-configurable by means of a software application executed by the processor.
- the processor may be adapted to associate the melody group of notes producible by the strings with the first tonal format in a one-to-one mapping by means of a software application executed by the processor.
- the processor may be adapted to associate the control group of notes producible by the strings with the second tonal format in a one-to-many mapping by means of a software application executed by the processor.
- a computer-implemented electrophonic chordophone method comprising the steps of: sensing respective strings of a guitar; associating a melody group of notes producible by the strings with a first tonal format in a one-to-one mapping; associating a control group of notes producible by the strings with a second tonal format in a one-to-many mapping; and, in response to actuation of the strings, synthesising both tonal formats simultaneously in substantial real-time.
- the step of sensing is performed by means of a transducer configured to capture mechanical vibrations from the strings and convert same to an electrical signal.
- both steps of associating the melody group and control group are performed by means of a processor executing a software application.
- control group of notes may form a subset of the melody group of notes.
- the step of synthesising is performed by means of a synthesiser.
- the first tonal format comprises a collection of notes.
- the second tonal format comprises a collection of chords and/or timbres.
- FIG. 1 is a diagrammatic representation of an electrophonic chordophone system and associated apparatus, in accordance with one aspect of the invention;
- FIG. 2 is a diagrammatic representation of steps comprising an electrophonic chordophone method, in accordance with one aspect of the invention;
- FIG. 3 is a diagrammatic representation of a specific example of the electrophonic chordophone system of FIG. 1; and
- FIG. 4 is a diagrammatic representation of a specific example of the electrophonic chordophone apparatus of FIG. 1.
- the system 20 generally comprises a sensor 24 which has been configured to be operatively responsive to respective strings of a guitar 22 .
- the sensor 24 generally comprises a transducer which is configured to capture mechanical vibrations from the strings of the guitar 22 and convert such signals to an electrical signal, such as a MIDI signal, or the like.
- the sensor 24 may include a pickup, a microphone, and/or the like.
- the sensor 24 may be configured to digitise analogue notes produced by the strings.
- a processor 32 of the system 20 may be configured to digitise analogue notes produced by the strings.
- the system 20 also includes non-transitory processor-readable storage means 26 which contains first and second user-configurable tonal formats 28 and 30 , respectively. Also included in system 20 is processor 32 arranged in signal communication with the sensor 24 and storage means 26 .
- the processor 32 is generally adapted to associate a melody group of notes producible by the strings with the first tonal format 28 in a one-to-one mapping or direct correlation, and to associate a control group of notes producible by the strings with the second tonal format 30 in a one-to-many mapping or indirect correlation.
- the system 20 generally includes a synthesiser 34 arranged in signal communication with the processor 32, as shown.
- the synthesiser 34 is, in turn, configured to produce both the first and second tonal formats 28 and 30 simultaneously in substantial real-time.
- the first tonal format 28 is actuatable via the melody group of notes and the second tonal format 30 is dynamically selectable via the control group of notes and also typically actuatable via the melody group of notes. In this manner, a melody is producible via the first tonal format 28 with a dynamic backing track producible via the second tonal format 30 , all using one guitar 22 .
- the first tonal format 28 typically comprises a collection of distinct notes. Accordingly, it is to be appreciated that the direct correlation between the melody group of notes and the first tonal format 28 is typically a 1:1 mapping or correlation, i.e. one melody group note produces one tonal format note.
- the first tonal format 28 typically alters or changes the ‘sound’ of the melody group notes.
- for example, instead of an acoustic guitar ‘sound’ produced by plucking a string on the guitar 22, the first tonal format 28 directs the system 20 to change the sound produced by such plucking to a note produced by another instrument, or to a distorted note typically produced by an electric guitar, and/or the like.
- Such functionality of a direct correlation between string pluck and resulting sound is known in the art of musical synthesisers.
- the second tonal format 30 typically comprises a collection of chords, chord progressions, and/or timbres. Accordingly, it is to be appreciated that the one-to-many mapping or indirect correlation between the control group of notes and the second tonal format 30 generally results in a single control group note selecting or activating various second tonal format notes, chords, chord progressions, timbres, etc. In addition, the reproduction of such various second tonal format notes, once selected or activated by the control group notes, is actuated by the melody group notes.
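As a rough illustration of the select-then-actuate behaviour just described, a control-group note could update which backing collection is active, while every incoming note actuates the melody voice and the currently selected backing. All class and sound names below are assumptions for illustration, not the patented implementation:

```python
# Hypothetical sketch of the select-then-actuate behaviour (not the
# patented implementation). A control note changes the selected backing
# state; melody notes trigger both the melody voice and that backing.

class ChordophoneDispatcher:
    def __init__(self, melody_map, control_map):
        self.melody_map = melody_map      # one-to-one: note -> melody sound
        self.control_map = control_map    # one-to-many: note -> backing sounds
        self.active_backing = None        # currently selected backing set

    def on_note(self, note):
        """Handle one incoming note; return the sounds to synthesise now."""
        if note in self.control_map:      # control group: select new backing
            self.active_backing = self.control_map[note]
        sounds = []
        if note in self.melody_map:       # melody group: actuate first format
            sounds.append(self.melody_map[note])
        if self.active_backing:           # actuate the selected second format
            sounds.extend(self.active_backing)
        return sounds

d = ChordophoneDispatcher(
    melody_map={"E2": "lead:E2", "G4": "lead:G4"},
    control_map={"E2": ["pad:Emaj", "bass:E1"]},
)
d.on_note("E2")   # selects the E-major backing and plays the lead note
d.on_note("G4")   # lead G4 plus the still-active E-major backing
```

Note how the backing persists between control notes, which is what lets a single guitarist keep a full accompaniment sounding while playing melody ad lib over it.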
- this arrangement enables a guitarist to dynamically control an entire band formed from synthesized instruments at the same time as playing his/her own guitar.
- the first and second tonal formats 28 and 30 are generally user-configurable by means of a software application executed by the processor 32 .
- the notes comprising the melody group of notes along with the notes comprising the control group of notes are generally user-configurable.
- the melody group of notes and the control group of notes are user-configurable by means of a software application executed by the processor 32 .
- Such user-configurability allows a user to select which notes form part of the melody group and which notes part of the control group.
- the notes, chords, chord progressions, timbres, rhythms, melodies, and other sound characteristics forming the first and second tonal formats 28 and 30 can be pre-selected by a user, allowing an infinite number of musical compositions or arrangements to be formed.
- control group of notes may form a subset of the melody group of notes.
- a user, such as a guitarist, may configure a particular note of a melody or song which starts a chorus of such song as a control group note (which is also a melody group note), so that the second tonal format enlivens or amplifies the chorus as backing whilst the melody or song is played via the first tonal format.
- the user may decide not to assign or select any notes for the melody group of notes and only assign the control group of notes, such that the first tonal format essentially comprises a null-value.
- This arrangement allows a melody to be reproducible via only the control group of notes, which may be desirable in certain circumstances.
- the synthesiser 34 generally comprises any electronic musical instrument that is able to convert electric signals to sound.
- the synthesiser 34 includes at least one speaker 36 , as shown, via which the sound is broadcast.
- the present invention also provides for an associated electrophonic chordophone apparatus 40 .
- This apparatus 40 generally comprises an input 42 for receiving notes produced by strings of the guitar 22 , as well as the non-transitory processor-readable storage means 26 which contains the first and second user-configurable tonal formats 28 and 30 .
- Apparatus 40 further includes the processor 32 arranged in signal communication with the input and storage means, along with an output 44 via which the processor 32 is able to output signals to the synthesiser 34 .
- the processor 32 is adapted to (i) associate the melody group of notes producible by the strings with the first tonal format 28 in a direct correlation, (ii) associate the control group of notes producible by the strings with the second tonal format 30 in an indirect correlation, and (iii) output, to the synthesiser 34 , both the first and second tonal formats 28 and 30 simultaneously in substantial real-time.
- Such adaptation is typically achieved via a suitable software application.
- the first tonal format 28 is actuatable via the melody group of notes and the second tonal format 30 is dynamically selectable via the control group of notes and, once so selected, also actuatable via the melody group of notes.
- a melody is producible by the synthesiser 34 via the first tonal format 28 , and a dynamic backing track via the second tonal format 30 .
- the melody group of notes, the control group of notes, the first tonal format 28 and the second tonal format 30 are user-configurable by means of a software application executed by the processor 32 .
- The method 48 generally comprises the steps of sensing respective strings of the guitar 22 (indicated by process step 50), associating a melody group of notes producible by the strings with the first tonal format 28 in a direct correlation (indicated by process step 52), and associating a control group of notes producible by the strings with the second tonal format in an indirect correlation (indicated by process step 54).
- The method 48 further includes the step of, in response to actuation of the strings, synthesising both the first and second tonal formats 28 and 30 simultaneously in substantial real-time (indicated by process step 56).
- The first tonal format 28 is actuatable via the melody group of notes, and the second tonal format 30 is dynamically selectable via the control group of notes and actuatable via the melody group of notes. In this manner, a melody is producible via the first tonal format 28 and a dynamic backing track via the second tonal format 30.
- The step of sensing 50 is generally performed by means of a transducer configured to capture mechanical vibrations from the strings and convert same to an electrical signal.
- Both steps of associating the melody group and control group, 52 and 54, are performed by means of the processor 32 executing a suitable software application.
- The step of synthesising 56 is typically performed by means of the synthesiser 34, as described above.
- In FIGS. 3 and 4 of the accompanying drawings, particular examples of the system 20 and apparatus 40 are provided.
- Guitar 1 and MIDI instrument 7 are generally prior art known to those of ordinary skill in the art of musical instruments.
- Software module 3 is configured to firstly receive source musical notes from an input 42 , and in the context of FIG. 3 , guitar 1 represents said input. Secondly, software module 3 processes said source musical notes using data extracted from arrangement 9 in order for software module 3 to correspondingly create output 250 . Thirdly, software module 3 transmits output 250 to MIDI instrument 7 such that melodic and rhythmic sounds emanate from MIDI instrument 7 .
- Guitar 1 is an example of an input, where input means a stringed instrument having at least either an ability to convey an analogue signal to software module 3 or an ability to convey MIDI messages to software module 3.
- FIG. 3 depicts guitar 1 comprising both MIDI pickup 2 and electromagnetic pickup 10 for the purposes of demonstrating two examples of the present embodiment. If MIDI pickup 2 is connected to software module 3, then electromagnetic pickup 10 is generally not connected to software module 3. Conversely, if electromagnetic pickup 10 is connected to software module 3, then MIDI pickup 2 is generally not connected to software module 3.
- One embodiment is formed with guitar 1 comprising MIDI pickup 2 connected to software module 3, which is in turn connected to MIDI instrument 7.
- A second embodiment is formed with guitar 1 comprising electromagnetic pickup 10 connected to software module 3, which is in turn connected to MIDI instrument 7.
- In the first embodiment, software module 3 receives MIDI messages from MIDI pickup 2, which are used by internal processes in software module 3.
- In the second embodiment, software module 3 receives an analogue signal from electromagnetic pickup 10, which pitch detector 11 converts to MIDI messages that are used by internal processes in software module 3.
- Guitar 1 comprises strings 4. Any musical note arising from strings 4 being plucked is said to be a plucked note. Strings 4 are divided into two groups: firstly a control group 5 and secondly a melody group 6, i.e. the control group notes and the melody group notes. Any plucked note occurring from control group 5 is said to be a control message. Any plucked note occurring from melody group 6 is said to be a melody message.
- Software module 3 is generally configured to respond only to a control message, meaning a control message is processed by software module 3 whilst software module 3 disregards melody messages.
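The routing rule above can be expressed as a small dispatch function. The string-to-group assignments below are illustrative assumptions; the point is only that control messages are processed while melody messages are disregarded by the module.

```python
# Hypothetical classification of plucked notes into control and melody
# messages. Only control messages drive the backing-track logic; melody
# messages are disregarded by software module 3 (they instead drive the
# melody voice directly).

CONTROL_GROUP = {"E2", "A2"}              # assumed control strings (group 5)
MELODY_GROUP = {"D3", "G3", "B3", "E4"}   # assumed melody strings (group 6)

def route(plucked_string, midi_note):
    """Classify a plucked note and return the module's response to it."""
    if plucked_string in CONTROL_GROUP:
        return ("control", midi_note)  # processed: selects the backing chord
    return ("melody", None)            # disregarded by software module 3
```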
- The purpose of MIDI pickup 2 is to translate plucked notes into MIDI messages and transmit them to software module 3.
- The purpose of electromagnetic pickup 10 is to convey a monophonic analogue signal to software module 3, wherein pitch detector 11 converts said analogue signal to a frequency expressed in hertz, which is subsequently converted into a MIDI note message having a note identification in the range of 0 to 127.
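The hertz-to-note-number conversion performed after pitch detection is not specified in detail here, but the standard equal-temperament mapping (A4 = 440 Hz = MIDI note 69) is a reasonable sketch of it:

```python
import math

def hz_to_midi(freq_hz):
    """Convert a detected fundamental frequency to a MIDI note number
    in the range 0 to 127, using the standard equal-temperament mapping
    with A4 = 440 Hz = MIDI note 69."""
    note = round(69 + 12 * math.log2(freq_hz / 440.0))
    return max(0, min(127, note))  # clamp into the valid MIDI range
```

For example, 261.63 Hz (middle C) maps to MIDI note 60.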
- Electromagnetic pickup 10 senses at least one string 4.
- Any string 4 sensed by electromagnetic pickup 10 can be comprised by control group 5 and can subsequently give rise to a control message.
- Output 250 is formed by software module 3 continuously drawing MIDI messages from arrangement 9 and manipulating said MIDI messages in accordance with the control message; correspondingly, output 250 is a collection of MIDI messages.
- Output 250 is a collection of MIDI messages repeatedly formed momentarily in a polling loop prior to being transmitted to MIDI instrument 7. If output 250 were observed just prior to transmission to MIDI instrument 7, multiple MIDI messages would be found encoding melodic and rhythmic MIDI note on and MIDI note off events, which in turn operate MIDI instrument 7.
- Software module 3 can be realised as a conventional MIDI sequencer, and output 250 can be realised as a dynamic collection of MIDI messages correspondingly forming a MIDI sequence.
- A distinguishing feature of the present invention is that software module 3 can be controlled with a single string pluck, causing output 250 to be formed in accordance with a pre-set musical style encoded in arrangement 9. Where successive string plucks represent notes of differing pitch, output 250 is heard to comprise chord changes corresponding to said notes of differing pitch, and in this way software module 3 affords a performer the ability to control an entire band formed from synthesized instruments.
- FIG. 3 shows an input embodied by guitar 1 connected to software module 3 such that control messages and melody messages are receivable by software module 3 .
- Software module 3, through use of a polling loop, continuously derives and manipulates information from arrangement 9 in response to control messages in order to form output 250, which is subsequently transmitted to MIDI instrument 7, causing MIDI instrument 7 to emit sound.
- Attributes 104 are populated both with data from arrangement 9 and data gathered from a user. Attributes 104 , when loaded from arrangement 9 , can generally be modified by a user.
- Attributes 104 typically comprise tempo, voice patches assigned to MIDI output channels, number of notes per chord, chord type and a key and scale, etc.
- Tempo means beats per minute.
- Voice patches means the timbres used to form sounds.
- Number of notes per chord means how many individual notes occur when a chord is sounded, whereby a chord of C major can comprise at least 3 notes voiced across multiple octaves.
- Chord type means, for example, a triad.
- Key and scale mean a tonic and scale formula from which notes, chords and modes are derived.
- MIDI output channels, voice patches, tempo, chords, triads, sevenths, ninths, chord formulas, modes, scales, key, octave and tonic are all terms known to one of ordinary skill.
- Where control message 190 encodes a note of C and attributes 104 indicate the key of C major, a performer would expect software module 3 to assign selected chord 200 a chord value of C major.
- Where control message 190 encodes a note of D, selected chord 200 would correspondingly be assigned a value of D minor.
- Software module 3 automatically assigns a correct chord according to a scale degree encoded as a note in control message 190, where the correct chord is determined based upon information encoded in attributes 104.
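This diatonic chord assignment can be sketched as follows, assuming a major key (the function names and data layout are illustrative, not the patented logic). In C major it yields C major for a plucked C and D minor for a plucked D, matching the examples above.

```python
# Diatonic chord qualities for the degrees of a major scale:
# I maj, ii min, iii min, IV maj, V maj, vi min, vii dim.
MAJOR_SCALE = [0, 2, 4, 5, 7, 9, 11]           # semitone offsets from tonic
DEGREE_QUALITY = ["maj", "min", "min", "maj", "maj", "min", "dim"]
NOTE_NAMES = ["C", "C#", "D", "D#", "E", "F",
              "F#", "G", "G#", "A", "A#", "B"]

def assign_chord(control_note, key_root=0):
    """Map the note in a control message to its diatonic chord in the
    given major key (key_root is a pitch class, 0 = C)."""
    pitch_class = (control_note - key_root) % 12
    if pitch_class not in MAJOR_SCALE:
        return None  # non-diatonic note: no chord assigned in this sketch
    degree = MAJOR_SCALE.index(pitch_class)
    root_name = NOTE_NAMES[(key_root + pitch_class) % 12]
    return root_name + " " + DEGREE_QUALITY[degree]
```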
- Selected chord 200 is a software variable comprised by software module 3 which is subsequently used as a control parameter for sub-modules 150 .
- Sub-modules 150 comprise a collection of processing logic exemplified by the rhythm 100, melody 101, autochord 102 and magicchord 103 functions.
- The minimal characteristics of any logic comprised by sub-modules 150 are firstly the ability to examine the present value of selected chord 200 and the values of attributes 104, and secondly the ability to output MIDI messages which are harmonically related to either selected chord 200 or values encoded in attributes 104.
- Polling loop 120, comprised by software module 3, periodically triggers sub-modules 150, where polling loop 120 is realised by one of ordinary skill as a system timer common to software development languages. Trigger means that sub-modules 150 have an opportunity, several times per second, to analyse selected chord 200 and attributes 104, and can make a determination, also based on data extracted from arrangement 9, as to whether or not to emit MIDI note on and MIDI note off messages having a harmonically correct pitch and occurring at a point in time determined by arrangement 9.
- For example, arrangement 9 could inform rhythm 100 to trigger a kick drum on every first beat of a bar and a hi-hat on every second and third beat of a bar.
- Arrangement 9 generally comprises data which, in the field of Digital Audio Workstations (DAW), forms a step sequence.
- Said step sequence is a common method of encoding musical events which are read and interpreted before output by a step sequencer.
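The polling-loop and step-sequence behaviour described above can be sketched together. Sub-modules are modelled as plain callables, the drum pattern follows the kick/hi-hat example (General MIDI percussion notes 36 and 42), and a real implementation would pace the loop with a system timer at the tempo in attributes 104; all names and data layouts here are assumptions.

```python
# Illustrative step sequence in the spirit of arrangement 9: a 4-beat bar
# with a kick (GM note 36) on beat 1 and a closed hi-hat (GM note 42) on
# beats 2 and 3.
KICK, HIHAT = 36, 42
ARRANGEMENT_9 = {0: [KICK], 1: [HIHAT], 2: [HIHAT], 3: []}

def rhythm_100(selected_chord, beat):
    """Rhythm sub-module: read this beat's drum hits from the step sequence."""
    return [("note_on", note) for note in ARRANGEMENT_9[beat % 4]]

def melody_101(selected_chord, beat):
    """Melody sub-module: sound the chord root on the first beat of the bar."""
    return [("note_on", selected_chord)] if beat % 4 == 0 else []

def poll_once(sub_modules, selected_chord, beat):
    """One trigger of polling loop 120: gather every sub-module's MIDI
    messages into the collection that forms output 250 for this tick."""
    output_250 = []
    for module in sub_modules:
        output_250.extend(module(selected_chord, beat))
    return output_250
```

On beat 0 with a chord root of 60, one poll gathers the kick and the chord root into output 250; on beat 3 it gathers nothing.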
- Output 250 depicts the collective output of sub-modules 150 as each polling loop 120 event occurs; therefore, as polling loop 120 triggers, sub-modules 150 correspondingly emit a collection of MIDI messages to be transmitted to MIDI instrument 7.
- Rhythm aspect 100 generally outputs MIDI messages, comprised by output 250, which are read from arrangement 9; by varying arrangement 9, rhythm 100 is correspondingly caused to output different styles of rhythm.
- Melody aspect 101 generally extracts notes from selected chord 200 so that notes can be duplicated, transposed and harmonized to dynamically create melody parts that are melodically correct in the context of selected chord 200.
- For example, melody 101 could output the notes C, E and G where selected chord 200 is assigned a value of C major, said notes being output in sequence with a time delay between each, such that MIDI instrument 7 emits a melody corresponding to the first, third and fifth scale degrees of the scale of C major.
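The sequenced C-E-G behaviour amounts to arpeggiating the chord tones with a fixed delay. A minimal sketch, with timing expressed in beats and the half-beat spacing an assumed value:

```python
def arpeggiate(chord_notes, delay_beats=0.5):
    """Space the chord tones in sequence, returning (time, note) pairs
    that a sequencer could schedule, e.g. C, E, G for a C major chord."""
    return [(i * delay_beats, note) for i, note in enumerate(chord_notes)]
```

Applied to MIDI notes 60, 64 and 67 (C, E, G), it yields events at beats 0.0, 0.5 and 1.0.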
- Autochord aspect 102 generally outputs selected chord 200 with an enhanced voicing, meaning that multiple notes are voiced across a number of octaves extending the harmonic range of chords formulated by autochord 102 .
- Magicchord aspect 103 generally reads names of chords from arrangement 9 in the same aforementioned style of step sequence. For example, sequential steps assigned to magicchord 103 from arrangement 9 can convey a sequence of chords such as C major, A minor and G major in turn.
- Autochord aspect 102 differs in purpose from magicchord 103 insofar as autochord 102 is driven by the value assigned to selected chord 200, which is correspondingly driven by input from control message 190 arising from the actions of a musical performer. Conversely, magicchord 103 outputs chords defined by arrangement 9 and therefore outputs predefined chords regardless of the value assigned to selected chord 200.
- Aspects of MIDI events, including velocity and event time, are stored such that the volume level of MIDI events (being notes) can be encoded and replayed, but also randomly varied to a degree in order to humanize a performance, meaning that the step sequence sounds less quantized.
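The velocity humanization described above can be sketched as a small random perturbation of the stored value; the ±10 spread is an illustrative assumption, and the result is clamped to the valid MIDI velocity range.

```python
import random

def humanize_velocity(velocity, spread=10, rng=random):
    """Randomly vary a stored MIDI velocity by up to +/- spread, clamped
    to 1-127, so replayed steps sound less quantized."""
    varied = velocity + rng.randint(-spread, spread)
    return max(1, min(127, varied))
```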
- Arrangement 9 also typically comprises short step sequences equivalent to a small number of measures, for example 2 bars, which in common time is 8 beats. It is an aspect therefore of software module 3 to use arrangement 9 as a source of loops. Loops are known in the art of DAW to be building blocks of electronic music and when used repeatedly can rapidly lead to construction of musical passages. Loops are collections of MIDI messages as opposed to digitized sound samples.
- Software module 3 generally coordinates the formation of collections of MIDI messages arising from processes executed in sub-modules 150 , whereby said collections of MIDI messages are gathered into output 250 and subsequently fed to MIDI instrument 7 resulting in the output of a backing track in accordance with selected chord 200 which governs the tonality of MIDI messages generated by software module 3 .
- Software module 3 generally responds substantially instantaneously to changes in assigned values of selected chord 200 and by doing so, differs significantly from a pre-recorded backing track.
- Software module 3 also outputs MIDI messages through MIDI instrument 7, using timbres defined in arrangement 9, only for notes arising from control events. In doing so, by not responding to melody events, it differs substantially from prior-art software that enables a guitar to simply transmit MIDI notes verbatim to a connected MIDI instrument, where MIDI events arise from every note played on every string of the guitar.
- Applicant believes it particularly advantageous that the present invention enables a user to activate/deactivate the second tonal format without interrupting the natural action of strumming a guitar. This allows, for example, a backing track to sound different in a chorus than in a verse.
- By means of the control group notes, it is possible to turn aspects of a backing track on or off by monitoring how the user strums the strings, creating a distinction between a chorus and a verse and enriching the whole experience.
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- Acoustics & Sound (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- General Engineering & Computer Science (AREA)
- Electrophonic Musical Instruments (AREA)
Abstract
Description
-
- i) allow a user to associate a melody group of notes producible by the strings with the first tonal format in a one-to-one mapping, and
- ii) allow the user to associate a control group of notes producible by the strings with the second tonal format in a one-to-many mapping; and
-
- (i) associate a melody group of notes producible by the strings with the first tonal format in a one-to-one mapping,
- (ii) associate a control group of notes producible by the strings with the second tonal format in a one-to-many mapping, and
- (iii) output, to the synthesiser, both the first and second tonal formats simultaneously in substantial real-time, the first tonal format actuatable via the melody group of notes and the second tonal format dynamically selectable via the control group of notes and actuatable via the melody group of notes,
so that a melody is producible by the synthesiser via the first tonal format and a dynamic backing track comprising multiple timbres is producible via the second tonal format.
Claims (20)
Applications Claiming Priority (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
AU2015905253A AU2015905253A0 (en) | 2015-12-17 | Electrophonic chordophone system, apparatus and method | |
AU2015905253 | 2015-12-17 | ||
PCT/AU2016/051239 WO2017100850A1 (en) | 2015-12-17 | 2016-12-15 | Electrophonic chordophone system, apparatus and method |
Publications (2)
Publication Number | Publication Date |
---|---|
US20190012998A1 US20190012998A1 (en) | 2019-01-10 |
US10540950B2 true US10540950B2 (en) | 2020-01-21 |
Family
ID=59055385
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US16/063,246 Expired - Fee Related US10540950B2 (en) | 2015-12-17 | 2016-12-15 | Electrophonic chordophone system, apparatus and method |
Country Status (3)
Country | Link |
---|---|
US (1) | US10540950B2 (en) |
AU (1) | AU2016374495B2 (en) |
WO (1) | WO2017100850A1 (en) |
Families Citing this family (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2019046487A1 (en) * | 2017-08-29 | 2019-03-07 | Intelliterran, Inc. | Apparatus, system, and method for recording and rendering multimedia |
US10482858B2 (en) * | 2018-01-23 | 2019-11-19 | Roland VS LLC | Generation and transmission of musical performance data |
DE102021006036B4 (en) | 2021-12-08 | 2024-03-07 | Klaus Eigenbrodt | Electronic musical instrument that controls devices that have a MIDI input via contactors arranged in functional groups |
Citations (15)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5663517A (en) | 1995-09-01 | 1997-09-02 | International Business Machines Corporation | Interactive system for compositional morphing of music in real-time |
US5986201A (en) * | 1996-10-30 | 1999-11-16 | Light And Sound Design, Ltd. | MIDI monitoring |
US6175070B1 (en) * | 2000-02-17 | 2001-01-16 | Musicplayground Inc. | System and method for variable music notation |
JP2001092449A (en) | 1999-09-22 | 2001-04-06 | Kawai Musical Instr Mfg Co Ltd | Electronic instrument |
US20040089131A1 (en) | 2002-11-12 | 2004-05-13 | Alain Georges | Systems and methods for creating, modifying, interacting with and playing musical compositions |
US20050045027A1 (en) | 2002-07-16 | 2005-03-03 | Celi Peter J. | Stringed instrument with embedded DSP modeling for modeling acoustic stringed instruments |
US6995310B1 (en) * | 2001-07-18 | 2006-02-07 | Emusicsystem | Method and apparatus for sensing and displaying tablature associated with a stringed musical instrument |
US7309829B1 (en) | 1998-05-15 | 2007-12-18 | Ludwig Lester F | Layered signal processing for individual and group output of multi-channel electronic musical instruments |
US20090100991A1 (en) * | 2007-02-05 | 2009-04-23 | U.S. Music Corporation | Music Processing System Including Device for Converting Guitar Sounds to Midi Commands |
US7667126B2 (en) | 2007-03-12 | 2010-02-23 | The Tc Group A/S | Method of establishing a harmony control signal controlled in real-time by a guitar input signal |
JP2010266680A (en) | 2009-05-14 | 2010-11-25 | Nec Fielding Ltd | System, method, and program for generating accompaniment |
US20130025437A1 (en) | 2009-06-01 | 2013-01-31 | Matt Serletic | System and Method for Producing a More Harmonious Musical Accompaniment |
US20150013527A1 (en) * | 2013-07-13 | 2015-01-15 | Apple Inc. | System and method for generating a rhythmic accompaniment for a musical performance |
US20150013533A1 (en) * | 2013-07-13 | 2015-01-15 | Apple Inc. | System and method for determining an accent pattern for a musical performance |
US20150143978A1 (en) | 2013-11-25 | 2015-05-28 | Samsung Electronics Co., Ltd. | Method for outputting sound and apparatus for the same |
Family Cites Families (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2015010234A1 (en) * | 2013-07-22 | 2015-01-29 | Texas Instruments Incorporated | Hybrid controller for brushless dc motor |
-
2016
- 2016-12-15 US US16/063,246 patent/US10540950B2/en not_active Expired - Fee Related
- 2016-12-15 WO PCT/AU2016/051239 patent/WO2017100850A1/en active Application Filing
- 2016-12-15 AU AU2016374495A patent/AU2016374495B2/en active Active
Patent Citations (16)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5663517A (en) | 1995-09-01 | 1997-09-02 | International Business Machines Corporation | Interactive system for compositional morphing of music in real-time |
US5986201A (en) * | 1996-10-30 | 1999-11-16 | Light And Sound Design, Ltd. | MIDI monitoring |
US7309829B1 (en) | 1998-05-15 | 2007-12-18 | Ludwig Lester F | Layered signal processing for individual and group output of multi-channel electronic musical instruments |
JP2001092449A (en) | 1999-09-22 | 2001-04-06 | Kawai Musical Instr Mfg Co Ltd | Electronic instrument |
US6175070B1 (en) * | 2000-02-17 | 2001-01-16 | Musicplayground Inc. | System and method for variable music notation |
US6995310B1 (en) * | 2001-07-18 | 2006-02-07 | Emusicsystem | Method and apparatus for sensing and displaying tablature associated with a stringed musical instrument |
US20050045027A1 (en) | 2002-07-16 | 2005-03-03 | Celi Peter J. | Stringed instrument with embedded DSP modeling for modeling acoustic stringed instruments |
US20040089131A1 (en) | 2002-11-12 | 2004-05-13 | Alain Georges | Systems and methods for creating, modifying, interacting with and playing musical compositions |
US20090100991A1 (en) * | 2007-02-05 | 2009-04-23 | U.S. Music Corporation | Music Processing System Including Device for Converting Guitar Sounds to Midi Commands |
US7667126B2 (en) | 2007-03-12 | 2010-02-23 | The Tc Group A/S | Method of establishing a harmony control signal controlled in real-time by a guitar input signal |
JP2010266680A (en) | 2009-05-14 | 2010-11-25 | Nec Fielding Ltd | System, method, and program for generating accompaniment |
US20130025437A1 (en) | 2009-06-01 | 2013-01-31 | Matt Serletic | System and Method for Producing a More Harmonious Musical Accompaniment |
US20150013527A1 (en) * | 2013-07-13 | 2015-01-15 | Apple Inc. | System and method for generating a rhythmic accompaniment for a musical performance |
US20150013533A1 (en) * | 2013-07-13 | 2015-01-15 | Apple Inc. | System and method for determining an accent pattern for a musical performance |
US20150221297A1 (en) | 2013-07-13 | 2015-08-06 | Apple Inc. | System and method for generating a rhythmic accompaniment for a musical performance |
US20150143978A1 (en) | 2013-11-25 | 2015-05-28 | Samsung Electronics Co., Ltd. | Method for outputting sound and apparatus for the same |
Non-Patent Citations (2)
Title |
---|
"What Is MIDI Guitar?"; Jam Origin; 2018; 8 pages. |
Garay, Peter; "International Search Report"; prepared for application No. PCT/AU2016/051239; dated Feb. 21, 2017; 11 pages. |
Also Published As
Publication number | Publication date |
---|---|
AU2016374495B2 (en) | 2021-07-29 |
US20190012998A1 (en) | 2019-01-10 |
AU2016374495A1 (en) | 2018-06-28 |
WO2017100850A1 (en) | 2017-06-22 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
Rothstein | MIDI: A comprehensive introduction | |
JP3598598B2 (en) | Karaoke equipment | |
US5792971A (en) | Method and system for editing digital audio information with music-like parameters | |
CN1750116B (en) | Automatic rendition style determining apparatus and method | |
US8314320B2 (en) | Automatic accompanying apparatus and computer readable storing medium | |
US11875763B2 (en) | Computer-implemented method of digital music composition | |
US10540950B2 (en) | Electrophonic chordophone system, apparatus and method | |
EP2074382A1 (en) | A percussion assembly, as well as drumsticks and input means for use in said percussion assembly | |
JP6175812B2 (en) | Musical sound information processing apparatus and program | |
JP3829780B2 (en) | Performance method determining device and program | |
JP5292702B2 (en) | Music signal generator and karaoke device | |
US8378200B1 (en) | Source-dependent acoustic, musical and/or other instrument processing and feedback system | |
RU2145121C1 (en) | Method for translating accords | |
US7381882B2 (en) | Performance control apparatus and storage medium | |
JP2006301019A (en) | Pitch-notifying device and program | |
CN113140201A (en) | Accompaniment sound generation device, electronic musical instrument, accompaniment sound generation method, and accompaniment sound generation program | |
Juusela | The Berklee Contemporary Dictionary of Music | |
JP3613062B2 (en) | Musical sound data creation method and storage medium | |
JP3618203B2 (en) | Karaoke device that allows users to play accompaniment music | |
JPH08227296A (en) | Sound signal processor | |
Souvignier | Loops and grooves: The musician's guide to groove machines and loop sequencers | |
Pandey | Encyclopaedic dictionary of music | |
JP6981239B2 (en) | Equipment, methods and programs | |
JP5034471B2 (en) | Music signal generator and karaoke device | |
JP3861886B2 (en) | Musical sound waveform data creation method and storage medium |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
FEPP | Fee payment procedure |
Free format text: ENTITY STATUS SET TO UNDISCOUNTED (ORIGINAL EVENT CODE: BIG.); ENTITY STATUS OF PATENT OWNER: MICROENTITY |
|
AS | Assignment |
Owner name: IN8BEATS PTY LTD, AUSTRALIA Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:NURSE, CHRISTOPHER ANDREW;REEL/FRAME:046113/0752 Effective date: 20180612 |
|
FEPP | Fee payment procedure |
Free format text: ENTITY STATUS SET TO MICRO (ORIGINAL EVENT CODE: MICR); ENTITY STATUS OF PATENT OWNER: MICROENTITY |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: NON FINAL ACTION MAILED |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: NOTICE OF ALLOWANCE MAILED -- APPLICATION RECEIVED IN OFFICE OF PUBLICATIONS |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: PUBLICATIONS -- ISSUE FEE PAYMENT RECEIVED |
|
STCF | Information on status: patent grant |
Free format text: PATENTED CASE |
|
FEPP | Fee payment procedure |
Free format text: MAINTENANCE FEE REMINDER MAILED (ORIGINAL EVENT CODE: REM.); ENTITY STATUS OF PATENT OWNER: MICROENTITY |
|
LAPS | Lapse for failure to pay maintenance fees |
Free format text: PATENT EXPIRED FOR FAILURE TO PAY MAINTENANCE FEES (ORIGINAL EVENT CODE: EXP.); ENTITY STATUS OF PATENT OWNER: MICROENTITY |
|
STCH | Information on status: patent discontinuation |
Free format text: PATENT EXPIRED DUE TO NONPAYMENT OF MAINTENANCE FEES UNDER 37 CFR 1.362 |
|
FP | Lapsed due to failure to pay maintenance fee |
Effective date: 20240121 |