WO2023107431A1 - A handheld musical instrument with gesture control - Google Patents

A handheld musical instrument with gesture control

Info

Publication number
WO2023107431A1
Authority
WO
WIPO (PCT)
Prior art keywords
musical instrument
tempo
handheld musical
handheld
program
Prior art date
Application number
PCT/US2022/051935
Other languages
French (fr)
Inventor
Arne Schulze
Original Assignee
Arne Schulze
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from US 18/074,421 (US11893969B2)
Priority claimed from US 18/075,295 (US20230178056A1)
Application filed by Arne Schulze
Publication of WO2023107431A1

Classifications

    • G PHYSICS
    • G10 MUSICAL INSTRUMENTS; ACOUSTICS
    • G10H ELECTROPHONIC MUSICAL INSTRUMENTS; INSTRUMENTS IN WHICH THE TONES ARE GENERATED BY ELECTROMECHANICAL MEANS OR ELECTRONIC GENERATORS, OR IN WHICH THE TONES ARE SYNTHESISED FROM A DATA STORE
    • G10H3/00 Instruments in which the tones are generated by electromechanical means
    • G10H3/12 Instruments in which the tones are generated by electromechanical means using mechanical resonant generators, e.g. strings or percussive instruments, the tones of which are picked up by electromechanical transducers, the electrical signals being further manipulated or amplified and subsequently converted to sound by a loudspeaker or equivalent instrument
    • G10H3/14 Instruments in which the tones are generated by electromechanical means using mechanical resonant generators, e.g. strings or percussive instruments, the tones of which are picked up by electromechanical transducers, the electrical signals being further manipulated or amplified and subsequently converted to sound by a loudspeaker or equivalent instrument using mechanically actuated vibrators with pick-up means
    • G10H3/143 Instruments in which the tones are generated by electromechanical means using mechanical resonant generators, e.g. strings or percussive instruments, the tones of which are picked up by electromechanical transducers, the electrical signals being further manipulated or amplified and subsequently converted to sound by a loudspeaker or equivalent instrument using mechanically actuated vibrators with pick-up means characterised by the use of a piezoelectric or magneto-strictive transducer
    • G PHYSICS
    • G10 MUSICAL INSTRUMENTS; ACOUSTICS
    • G10H ELECTROPHONIC MUSICAL INSTRUMENTS; INSTRUMENTS IN WHICH THE TONES ARE GENERATED BY ELECTROMECHANICAL MEANS OR ELECTRONIC GENERATORS, OR IN WHICH THE TONES ARE SYNTHESISED FROM A DATA STORE
    • G10H1/00 Details of electrophonic musical instruments
    • G10H1/36 Accompaniment arrangements
    • G10H1/40 Rhythm
    • G10H1/42 Rhythm comprising tone forming circuits
    • G PHYSICS
    • G10 MUSICAL INSTRUMENTS; ACOUSTICS
    • G10H ELECTROPHONIC MUSICAL INSTRUMENTS; INSTRUMENTS IN WHICH THE TONES ARE GENERATED BY ELECTROMECHANICAL MEANS OR ELECTRONIC GENERATORS, OR IN WHICH THE TONES ARE SYNTHESISED FROM A DATA STORE
    • G10H3/00 Instruments in which the tones are generated by electromechanical means
    • G10H3/12 Instruments in which the tones are generated by electromechanical means using mechanical resonant generators, e.g. strings or percussive instruments, the tones of which are picked up by electromechanical transducers, the electrical signals being further manipulated or amplified and subsequently converted to sound by a loudspeaker or equivalent instrument
    • G10H3/14 Instruments in which the tones are generated by electromechanical means using mechanical resonant generators, e.g. strings or percussive instruments, the tones of which are picked up by electromechanical transducers, the electrical signals being further manipulated or amplified and subsequently converted to sound by a loudspeaker or equivalent instrument using mechanically actuated vibrators with pick-up means
    • G10H3/146 Instruments in which the tones are generated by electromechanical means using mechanical resonant generators, e.g. strings or percussive instruments, the tones of which are picked up by electromechanical transducers, the electrical signals being further manipulated or amplified and subsequently converted to sound by a loudspeaker or equivalent instrument using mechanically actuated vibrators with pick-up means using a membrane, e.g. a drum; Pick-up means for vibrating surfaces, e.g. housing of an instrument
    • G PHYSICS
    • G10 MUSICAL INSTRUMENTS; ACOUSTICS
    • G10H ELECTROPHONIC MUSICAL INSTRUMENTS; INSTRUMENTS IN WHICH THE TONES ARE GENERATED BY ELECTROMECHANICAL MEANS OR ELECTRONIC GENERATORS, OR IN WHICH THE TONES ARE SYNTHESISED FROM A DATA STORE
    • G10H2210/00 Aspects or methods of musical processing having intrinsic musical character, i.e. involving musical theory or musical parameters or relying on musical knowledge, as applied in electrophonic musical tools or instruments
    • G10H2210/375 Tempo or beat alterations; Music timing control
    • G10H2210/385 Speed change, i.e. variations from preestablished tempo, tempo change, e.g. faster or slower, accelerando or ritardando, without change in pitch
    • G PHYSICS
    • G10 MUSICAL INSTRUMENTS; ACOUSTICS
    • G10H ELECTROPHONIC MUSICAL INSTRUMENTS; INSTRUMENTS IN WHICH THE TONES ARE GENERATED BY ELECTROMECHANICAL MEANS OR ELECTRONIC GENERATORS, OR IN WHICH THE TONES ARE SYNTHESISED FROM A DATA STORE
    • G10H2220/00 Input/output interfacing specifically adapted for electrophonic musical tools or instruments
    • G10H2220/155 User input interfaces for electrophonic musical instruments
    • G10H2220/221 Keyboards, i.e. configuration of several keys or key-like input devices relative to one another
    • G10H2220/251 Keyboards, i.e. configuration of several keys or key-like input devices relative to one another arranged as 2D or 3D arrays; Keyboards ergonomically organised for playing chords or for transposing, e.g. Janko keyboard
    • G PHYSICS
    • G10 MUSICAL INSTRUMENTS; ACOUSTICS
    • G10H ELECTROPHONIC MUSICAL INSTRUMENTS; INSTRUMENTS IN WHICH THE TONES ARE GENERATED BY ELECTROMECHANICAL MEANS OR ELECTRONIC GENERATORS, OR IN WHICH THE TONES ARE SYNTHESISED FROM A DATA STORE
    • G10H2220/00 Input/output interfacing specifically adapted for electrophonic musical tools or instruments
    • G10H2220/155 User input interfaces for electrophonic musical instruments
    • G10H2220/395 Acceleration sensing or accelerometer use, e.g. 3D movement computation by integration of accelerometer data, angle sensing with respect to the vertical, i.e. gravity sensing.
    • G PHYSICS
    • G10 MUSICAL INSTRUMENTS; ACOUSTICS
    • G10H ELECTROPHONIC MUSICAL INSTRUMENTS; INSTRUMENTS IN WHICH THE TONES ARE GENERATED BY ELECTROMECHANICAL MEANS OR ELECTRONIC GENERATORS, OR IN WHICH THE TONES ARE SYNTHESISED FROM A DATA STORE
    • G10H2230/00 General physical, ergonomic or hardware implementation of electrophonic musical tools or instruments, e.g. shape or architecture
    • G10H2230/045 Special instrument [spint], i.e. mimicking the ergonomy, shape, sound or other characteristic of a specific acoustic musical instrument category
    • G10H2230/251 Spint percussion, i.e. mimicking percussion instruments; Electrophonic musical instruments with percussion instrument features; Electrophonic aspects of acoustic percussion instruments, MIDI-like control therefor
    • G10H2230/275 Spint drum
    • G10H2230/281 Spint drum assembly, i.e. mimicking two or more drums or drumpads assembled on a common structure, e.g. drum kit

Definitions

  • One or more embodiments of the invention generally relate to handheld electronic percussion instruments. More particularly, certain embodiments of the invention relate to gesture controlling a tempo associated with the handheld electronic percussion instrument.
  • Electronic percussion instruments mimic traditional acoustic drum kits with electronic triggers corresponding to the various drums and cymbals of an acoustic kit. Striking a pad triggers a drum machine to play a percussion sound or sounds assigned to the pad or pads. A percussionist plays these types of electronic percussion instruments with sticks in the same way that an acoustic drum kit would be played. Further, during a live musical performance, a supplementing percussion, such as, without limitation, a MIDI (Musical Instrument Digital Interface) sequencer needs to be synchronized with the live performance.
  • a MIDI Musical Instrument Digital Interface
  • A MIDI-compatible sequencer listens to and detects the output of a live drummer’s kick or snare drum’s audio channel and derives a tempo in real time to generate a MIDI time code that synchronizes the MIDI sequence with the live performance. Any abrupt deviation in tempo may cause a disturbance in synchronizing the MIDI sequence.
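  • By way of a non-limiting illustration only (not part of the disclosure), the sketch below shows one way such a sequencer might derive a tempo from detected kick or snare onsets; the onset timestamps, the median-interval smoothing, and the one-onset-per-beat mapping are all assumptions.

```python
# Illustrative sketch only: estimating a tempo (BPM) from kick/snare onset times.
# The median inter-onset interval and one-onset-per-beat mapping are assumptions.
from statistics import median

def tempo_from_onsets(onset_times_s, window=8):
    """Estimate BPM from the most recent inter-onset intervals (in seconds)."""
    if len(onset_times_s) < 2:
        return None
    recent = onset_times_s[-(window + 1):]
    intervals = [b - a for a, b in zip(recent, recent[1:])]
    return 60.0 / median(intervals)

# Onsets roughly 0.5 s apart correspond to about 120 BPM.
print(tempo_from_onsets([0.0, 0.51, 1.0, 1.49, 2.01]))
```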
  • FIG. 1 illustrates a handheld electronic musical instrument, in accordance with some embodiments of the present invention
  • FIG. 2 illustrates a schematic of a handheld electronic musical instrument for tempo control, in accordance with some embodiments of the present invention
  • FIG. 3 illustrates moving the handheld electronic musical instrument for a desired tempo while triggering percussion sounds, in accordance with some embodiments of the present invention
  • FIG. 4 illustrates a method for changing the tempo of an audio output from a handheld electronic musical instrument, in accordance with an embodiment of the present invention.
  • FIG. 5 is a block diagram depicting an exemplary client/server system which may be used by an exemplary web-enabled/networked embodiment of the present invention. Unless otherwise indicated, illustrations in the figures are not necessarily drawn to scale.
  • a reference to “a step” or “a means” is a reference to one or more steps or means and may include sub-steps and subservient means. All conjunctions used are to be understood in the most inclusive sense possible. Thus, the word “or” should be understood as having the definition of a logical “or” rather than that of a logical “exclusive or” unless the context clearly necessitates otherwise. Structures described herein are to be understood also to refer to functional equivalents of such structures. Language that may be construed to express approximation should be so understood unless the context clearly dictates otherwise.
  • references to a "device,” an “apparatus,” a “system,” etc., in the preamble of a claim should be construed broadly to mean “any structure meeting the claim terms” except for any specific structure(s)/type(s) that has/(have) been explicitly disavowed or excluded or admitted/implied as prior art in the present specification or incapable of enabling an object/aspect/goal of the invention.
  • where the present specification discloses an object, aspect, function, goal, result, or advantage of the invention that a specific prior art structure and/or method step is similarly capable of performing yet in a very different way, the present invention disclosure is intended to and shall also implicitly include and cover additional corresponding alternative embodiments that are otherwise identical to that explicitly disclosed except that they exclude such prior art structure(s)/step(s), and shall accordingly be deemed as providing sufficient disclosure to support a corresponding negative limitation in a claim claiming such alternative embodiment(s), which exclude such very different prior art structure(s)/step(s) way(s).
  • references to "one embodiment,” “an embodiment,” “example embodiment,” “various embodiments,” “some embodiments,” “embodiments of the invention,” etc., may indicate that the embodiment(s) of the invention so described may include a particular feature, structure, or characteristic, but not every possible embodiment of the invention necessarily includes the particular feature, structure, or characteristic. Further, repeated uses of the phrases “in one embodiment,” “in an exemplary embodiment,” or “an embodiment” do not necessarily refer to the same embodiment, although they may.
  • references to “user”, or any similar term, as used herein, may mean a human or nonhuman user thereof.
  • “user”, or any similar term, as used herein, unless expressly stipulated otherwise, is contemplated to mean users at any stage of the usage process, to include, without limitation, direct user(s), intermediate user(s), indirect user(s), and end user(s).
  • the meaning of “user”, or any similar term, as used herein, should not be otherwise inferred, or induced by any pattern(s) of description, embodiments, examples, or referenced prior-art that may (or may not) be provided in the present patent.
  • references to “end user”, or any similar term, as used herein, is generally intended to mean late-stage user(s) as opposed to early-stage user(s). Hence, it is contemplated that there may be a multiplicity of different types of “end user” near the end stage of the usage process.
  • examples of an “end user” may include, without limitation, a “consumer”, “buyer”, “customer”, “purchaser”, “shopper”, “enjoyer”, “viewer”, or individual person or non-human thing benefiting in any way, directly or indirectly, from use of, or interaction with, some aspect of the present invention.
  • some embodiments of the present invention may provide beneficial usage to more than one stage or type of usage in the foregoing usage process.
  • references to “end user”, or any similar term, as used therein are generally intended to not include the user that is the furthest removed, in the foregoing usage process, from the final user therein of an embodiment of the present invention.
  • intermediate user(s) may include, without limitation, any individual person or non-human thing benefiting in any way, directly or indirectly, from use of, or interaction with, some aspect of the present invention with respect to selling, vending, Original Equipment Manufacturing, marketing, merchandising, distributing, service providing, and the like thereof.
  • the mechanisms/units/circuits/components used with the "configured to” or “operable for” language include hardware--for example, mechanisms, structures, electronics, circuits, memory storing program instructions executable to implement the operation, etc. Reciting that a mechanism/unit/circuit/component is "configured to” or “operable for” perform(ing) one or more tasks is expressly intended not to invoke 35 U.S.C. §112, sixth paragraph, for that mechanism/unit/circuit/component. "Configured to” may also include adapting a manufacturing process to fabricate devices or components that are adapted to implement or perform one or more tasks.
  • the term “based on,” as used herein, describes one or more factors that affect a determination. This term does not foreclose additional factors that may affect a determination. That is, a determination may be solely based on those factors or based, at least in part, on those factors.
  • the phrase “consisting of” excludes any element, step, or ingredient not specified in the claim.
  • where the phrase “consists of” (or variations thereof) appears in a clause of the body of a claim, rather than immediately following the preamble, it limits only the element set forth in that clause; other elements are not excluded from the claim as a whole.
  • the phrases “consisting essentially of” and “consisting of” limit the scope of a claim to the specified elements or method steps, plus those that do not materially affect the basic and novel characteristic(s) of the claimed subject matter (see Norian Corp. v. Stryker Corp., 363 F.3d 1321, 1331-32, 70 USPQ2d 1508, Fed. Cir. 2004).
  • any claim limitation phrased in functional limitation terms covered by 35 U.S.C. §112(6) (post-AIA 112(f)) which has a preamble invoking the closed terms "consisting of,” or “consisting essentially of,” should be understood to mean that the corresponding structure(s) disclosed herein define the exact metes and bounds of what the so claimed invention embodiment(s) consists of, or consists essentially of, to the exclusion of any other elements which do not materially affect the intended purpose of the so claimed embodiment(s).
  • any statement(s), identification(s), or reference(s) to a structure(s) and/or element(s) that corresponds to and/or supports a claim limitation(s) phrased in functional limitation terms covered by 35 U.S.C. §112(6) (post-AIA 112(f)) should be understood to be identified by way of example and not limitation, and as such, should not be interpreted to mean that such recited structure and/or element is/are the only structure(s) and/or element(s) disclosed in this patent application that corresponds to and/or supports such claim limitations phrased in functional limitation terms.
  • This claims interpretation intention also applies to any such subsequent statements made by Applicant during prosecution.
  • Devices or system modules that are in at least general communication with each other need not be in continuous communication with each other, unless expressly specified otherwise.
  • devices or system modules that are in at least general communication with each other may communicate directly or indirectly through one or more intermediaries.
  • any system components described or named in any embodiment or claimed herein may be grouped or sub-grouped (and accordingly implicitly renamed) in any combination or sub-combination as those skilled in the art can imagine as suitable for the particular application, and still be within the scope and spirit of the claimed embodiments of the present invention.
  • a commercial implementation in accordance with the spirit and teachings of the present invention may be configured according to the needs of the particular application, whereby any aspect(s), feature(s), function(s), result(s), component(s), approach(es), or step(s) of the teachings related to any described embodiment of the present invention may be suitably omitted, included, adapted, mixed and matched, or improved and/or optimized by those skilled in the art, using their average skills and known techniques, to achieve the desired implementation that addresses the needs of the particular application.
  • Coupled may mean that two or more elements are in direct physical or electrical contact. However, “coupled” may also mean that two or more elements are not in direct contact with each other, but yet still cooperate or interact with each other.
  • a "computer” may refer to one or more apparatus and/or one or more systems that are capable of accepting a structured input, processing the structured input according to prescribed rules, and producing results of the processing as output.
  • Examples of a computer may include: a computer; a stationary and/or portable computer; a computer having a single processor, multiple processors, or multi-core processors, which may operate in parallel and/or not in parallel; a general purpose computer; a supercomputer; a mainframe; a super mini-computer; a mini-computer; a workstation; a micro-computer; a server; a client; an interactive television; a web appliance; a telecommunications device with internet access; a hybrid combination of a computer and an interactive television; a portable computer; a tablet personal computer (PC); a personal digital assistant (PDA); a portable telephone; and application-specific hardware to emulate a computer and/or software, such as, for example, a digital signal processor (DSP), a field-programmable gate array (FPGA), an application-specific integrated circuit (ASIC), and the like.
  • embodiments of the disclosure may be practiced in network computing environments with many types of computer system configurations, including personal computers, hand-held devices, multi-processor systems, microprocessor-based or programmable consumer electronics, network PCs, minicomputers, mainframe computers, and the like. Where appropriate, embodiments may also be practiced in distributed computing environments where tasks are performed by local and remote processing devices that are linked (either by hardwired links, wireless links, or by a combination thereof) through a communications network. In a distributed computing environment, program modules may be located in both local and remote memory storage devices.
  • Software may refer to prescribed rules to operate a computer. Examples of software may include: code segments in one or more computer-readable languages; graphical and/or textual instructions; applets; pre-compiled code; interpreted code; compiled code; and computer programs.
  • the example embodiments described herein can be implemented in an operating environment comprising computer-executable instructions (e.g., software) installed on a computer, in hardware, or in a combination of software and hardware.
  • the computer-executable instructions can be written in a computer programming language or can be embodied in firmware logic. If written in a programming language conforming to a recognized standard, such instructions can be executed on a variety of hardware platforms and for interfaces to a variety of operating systems.
  • Examples of suitable languages and formats include, without limitation, HTML (Hypertext Markup Language), XML (Extensible Markup Language), XSL (Extensible Stylesheet Language), DSSSL (Document Style Semantics and Specification Language), CSS (Cascading Style Sheets), SMIL (Synchronized Multimedia Integration Language), WML (Wireless Markup Language), Java™, Jini™, C, C++, Smalltalk, Perl, UNIX Shell, Visual Basic or Visual Basic Script, and VRML (Virtual Reality Markup Language).
  • Computer program code for carrying out operations for aspects of the present invention may be written in any combination of one or more programming languages, including an object-oriented programming language such as Java, Smalltalk, C++, or the like and conventional procedural programming languages, such as the "C" programming language or similar programming languages.
  • the program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer or entirely on the remote computer or server.
  • the remote computer may be connected to the user's computer through any type of network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet Service Provider).
  • LAN local area network
  • WAN wide area network
  • Internet Service Provider for example, AT&T, MCI, Sprint, EarthLink, MSN, GTE, etc.
  • a network is a collection of links and nodes (e.g., multiple computers and/or other devices connected together) arranged so that information may be passed from one part of the network to another over multiple links and through various nodes.
  • networks include the Internet, the public switched telephone network, the global Telex network, computer networks (e.g., an intranet, an extranet, a local-area network, or a wide-area network), wired networks, and wireless networks.
  • the Internet is a worldwide network of computers and computer networks arranged to allow the easy and robust exchange of information between computer users.
  • ISPs Internet Service Providers
  • Content providers e.g., website owners or operators
  • multimedia information e.g., text, graphics, audio, video, animation, and other forms of data
  • websites comprise a collection of connected, or otherwise related, webpages.
  • the combination of all the websites and their corresponding webpages on the Internet is generally known as the World Wide Web (WWW) or simply the Web.
  • each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s).
  • the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved.
  • These computer program instructions may also be stored in a computer readable medium that can direct a computer, other programmable data processing apparatus, or other devices to function in a particular manner, such that the instructions stored in the computer readable medium produce an article of manufacture including instructions which implement the function/act specified in the flowchart and/or block diagram block or blocks.
  • Non-volatile media include, for example, optical or magnetic disks and other persistent memory.
  • Volatile media include dynamic random-access memory (DRAM), which typically constitutes the main memory.
  • Transmission media include coaxial cables, copper wire and fiber optics, including the wires that comprise a system bus coupled to the processor.
  • Transmission media may include or convey acoustic waves, light waves, and electromagnetic emissions, such as those generated during radio frequency (RF) and infrared (IR) data communications.
  • RF radio frequency
  • IR infrared
  • Common forms of computer-readable media include, for example, a floppy disk, a flexible disk, hard disk, magnetic tape, any other magnetic medium, a CD-ROM, DVD, any other optical medium, punch cards, paper tape, any other physical medium with patterns of holes, a RAM, a PROM, an EPROM, a FLASH-EEPROM, removable media, flash memory, a "memory stick", any other memory chip or cartridge, a carrier wave as described hereinafter, or any other medium from which a computer can read.
  • sequences of instruction (i) may be delivered from RAM to a processor, (ii) may be carried over a wireless transmission medium, and/or (iii) may be formatted according to numerous formats, standards, or protocols, such as Bluetooth, TDMA, CDMA, 3G.
  • a "computer system” may refer to a system having one or more computers, where each computer may include a computer-readable medium embodying software to operate the computer or one or more of its components.
  • Examples of a computer system may include: a distributed computer system for processing information via computer systems linked by a network; two or more computer systems connected together via a network for transmitting and/or receiving information between the computer systems; a computer system including two or more processors within a single computer; and one or more apparatuses and/or one or more systems that may accept data, may process data in accordance with one or more stored software programs, may generate results, and typically may include input, output, storage, arithmetic, logic, and control units.
  • a "network” may refer to a number of computers and associated devices that may be connected by communication facilities.
  • a network may involve permanent connections such as cables or temporary connections such as those made through telephone or other communication links.
  • a network may further include hard-wired connections (e.g., coaxial cable, twisted pair, optical fiber, waveguides, etc.) and/or wireless connections (e.g., radio frequency waveforms, free-space optical waveforms, acoustic waveforms, etc.).
  • Examples of a network may include: an internet, such as the Internet; an intranet; a local area network (LAN); a wide area network (WAN); and a combination of networks, such as an internet and an intranet.
  • client-side application should be broadly construed to refer to an application, a page associated with that application, or some other resource or function invoked by a client-side request to the application.
  • a "browser” as used herein is not intended to refer to any specific browser (e.g., Internet Explorer, Safari, FireFox, or the like), but should be broadly construed to refer to any client-side rendering engine that can access and display Internet- accessible resources.
  • a “rich” client typically refers to a non-HTTP based client-side application, such as an SSH or CIFS client. Further, while typically the client-server interactions occur using HTTP, this is not a limitation either.
  • the client-server interaction may be formatted to conform to the Simple Object Access Protocol (SOAP) and travel over HTTP (over the public Internet); FTP or any other reliable transport mechanism (such as IBM® MQSeries® technologies and CORBA, for transport over an enterprise intranet) may also be used.
  • SOAP Simple Object Access Protocol
  • Any application or functionality described herein may be implemented as native code, by providing hooks into another application, by facilitating use of the mechanism as a plug-in, by linking to the mechanism, and the like.
  • Exemplary networks may operate with any of a number of protocols, such as Internet protocol (IP), asynchronous transfer mode (ATM), and/or synchronous optical network (SONET), user datagram protocol (UDP), IEEE 802.x, etc.
  • IP Internet protocol
  • ATM asynchronous transfer mode
  • SONET synchronous optical network
  • UDP user datagram protocol
  • IEEE 802.x IEEE 802.x
  • Embodiments of the present invention may include apparatuses for performing the operations disclosed herein.
  • An apparatus may be specially constructed for the desired purposes, or it may comprise a general-purpose device selectively activated or reconfigured by a program stored in the device.
  • Embodiments of the invention may also be implemented in one or a combination of hardware, firmware, and software. They may be implemented as instructions stored on a machine-readable medium, which may be read and executed by a computing platform to perform the operations described herein.
  • aspects of the present invention may be embodied as a system, method, or computer program product. Accordingly, aspects of the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment (including firmware, resident software, micro-code, etc.) or an embodiment combining software and hardware aspects that may all generally be referred to herein as a "circuit,” “module” or “system.” Furthermore, aspects of the present invention may take the form of a computer program product embodied in one or more computer readable medium(s) having computer readable program code embodied thereon.
  • computer program medium and “computer readable medium” may be used to generally refer to media such as, but not limited to, removable storage drives, a hard disk installed in hard disk drive, and the like.
  • These computer program products may provide software to a computer system. Embodiments of the invention may be directed to such computer program products.
  • An algorithm is here, and generally, considered to be a self-consistent sequence of acts or operations leading to a desired result. These include physical manipulations of physical quantities. Usually, though not necessarily, these quantities take the form of electrical or magnetic signals capable of being stored, transferred, combined, compared, and otherwise manipulated. It has proven convenient at times, principally for reasons of common usage, to refer to these signals as bits, values, elements, symbols, characters, terms, numbers or the like. It should be understood, however, that all of these and similar terms are to be associated with the appropriate physical quantities and are merely convenient labels applied to these quantities.
  • the phrase "configured to” or “operable for” can include generic structure (e.g., generic circuitry) that is manipulated by software and/or firmware (e.g., an FPGA or a general-purpose processor executing software) to operate in a manner that is capable of performing the task(s) at issue. "Configured to” may also include adapting a manufacturing process (e.g., a semiconductor fabrication facility) to fabricate devices (e.g., integrated circuits) that are adapted to implement or perform one or more tasks.
  • a manufacturing process e.g., a semiconductor fabrication facility
  • devices e.g., integrated circuits
  • processor may refer to any device or portion of a device that processes electronic data from registers and/or memory to transform that electronic data into other electronic data that may be stored in registers and/or memory.
  • a “computing platform” may comprise one or more processors.
  • Embodiments within the scope of the present disclosure may also include tangible and/or non-transitory computer-readable storage media for carrying or having computer-executable instructions or data structures stored thereon.
  • Such non-transitory computer-readable storage media can be any available media that can be accessed by a general purpose or special purpose computer, including the functional design of any special purpose processor as discussed above.
  • non-transitory computer-readable media can include RAM, ROM, EEPROM, CD-ROM or other optical disk storage, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to carry or store desired program code means in the form of computer-executable instructions, data structures, or processor chip design.
  • non-transitory computer readable medium includes, but is not limited to, a hard drive, compact disc, flash memory, volatile memory, random access memory, magnetic memory, optical memory, semiconductor-based memory, phase change memory, periodically refreshed memory, and the like; the non-transitory computer readable medium, however, does not include a pure transitory signal per se; i.e., where the medium itself is transitory.
  • Embodiments of the present invention disclose a handheld electronic musical instrument, such as, without limitations, a percussion instrument.
  • the handheld electronic musical instrument comprises a body part and a handle part.
  • the handle part may be connected to the body part in various ways as discussed in detail in the parent application 18/074,421, which is incorporated completely herein by reference per above.
  • the body part comprises a set of pads, for example, without limitations, rubber pads.
  • the rubber pads are attached to pressure sensitive sensors, such as, without limitations, a piezoelectric sensor.
  • When a user of the handheld percussion instrument strikes the rubber pads with a finger or hand, the piezoelectric sensor is triggered and generates sound files, for example, without limitations, digital drum sound files.
  • the handle part of the handheld electronic percussion instrument comprises buttons enabling the user to select different program styles, such as, without limitation, a drum and percussion style program, a Musical Instrument Digital Interface (MIDI) percussion or musical sequences.
  • the handheld electronic musical instrument comprises controlling the tempo of the selected program style based on detecting gesture information from a hand movement of a user of the handheld electronic musical instrument. For example, without limitations, an accelerometer or a mechanical means may be implemented for gesture detection.
  • the mechanical means may include a small metal or plastic ball attached to one end of a thin flexible arm, with the opposite end of the arm fastened to the handheld electronic musical instrument, wherein with a back and forth gesture the thin flexible arm would sway back and forth to hit and trigger opposite side sensors generating a MIDI clock tempo.
  • the MIDI clock tempo generated is based on the pulse speed generated by the gesture.
  • a small plastic or metal ball may be placed inside a small tube fastened to the handheld electronic musical instrument, such that with a back and forth gesture the ball would travel back and forth allowing it to hit and trigger opposite side sensors generating a MIDI clock tempo with a speed generated by the gesture.
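  • As a non-limiting sketch only (the trigger timestamps, the two-hits-per-beat mapping, and the class name are assumptions rather than the disclosed implementation), sensor triggers such as those described above might be converted into a MIDI clock tempo as follows:

```python
# Hypothetical sketch: each hit of an opposite-side sensor is one pulse, and the
# interval between pulses (the "pulse speed") sets the MIDI clock tempo.
import time

class PulseTempo:
    def __init__(self, pulses_per_beat=2):   # assumed: one back-and-forth = one beat
        self.pulses_per_beat = pulses_per_beat
        self.last_pulse = None
        self.bpm = 120.0                      # assumed default tempo

    def on_sensor_hit(self, now=None):
        """Call whenever the ball or flexible arm triggers either side sensor."""
        now = time.monotonic() if now is None else now
        if self.last_pulse is not None:
            interval = now - self.last_pulse
            if interval > 0:
                self.bpm = 60.0 / (interval * self.pulses_per_beat)
        self.last_pulse = now
        return self.bpm
```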
  • the percussion sound obtained by striking the sensor pads, combined with gesture detection obtained from the movement of the handheld musical instrument, provides a continuous MIDI clock, for example, without limitations, a time code enabling manual control of the tempo of audio output from the instrument, such as, without limitations, preprogrammed musical or percussion audio sequence files, or MIDI sequence files.
  • when the detected gesture is a slow shaky movement, the playback tempo is made slower; conversely, when the shake is fast, the playback tempo is made faster.
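  • One non-limiting way such a slow-shake/fast-shake mapping could be realized is sketched below; the tempo limits, the smoothing factor, and the one-shake-per-beat assumption are illustrative assumptions rather than values from the disclosure.

```python
# Illustrative mapping of shake rate to playback tempo: longer shake intervals
# lower the tempo, shorter intervals raise it. Limits and smoothing are assumed.
def update_playback_bpm(current_bpm, shake_interval_s,
                        min_bpm=40.0, max_bpm=240.0, smoothing=0.3):
    target_bpm = 60.0 / shake_interval_s            # assumed: one shake per beat
    target_bpm = max(min_bpm, min(max_bpm, target_bpm))
    # Smooth toward the target to avoid abrupt tempo deviations.
    return current_bpm + smoothing * (target_bpm - current_bpm)

bpm = 120.0
for interval in (0.8, 0.8, 0.4, 0.4):               # slow shakes, then faster shakes
    bpm = update_playback_bpm(bpm, interval)
    print(round(bpm, 1))
```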
  • the sounds may be amplified and broadcast through loudspeaker controllers connected to an internal sound module of the musical instrument.
  • the loudspeaker controller may be wired or wirelessly connected to the internal sound module.
  • the music performance from the handheld musical instrument may be converted to MIDI and be transmitted wirelessly to an external MIDI sound module, a drum machine, or a computer, and may be amplified and broadcast through loudspeaker controllers or recorded to a storage medium such as, without limitations, a computer.
  • “time code” (alternatively “timecode”), as may be used herein, unless stated otherwise, generally refers to an invisible digital address that may be recorded along with an audio or video file for viable synchronization purposes.
  • a MIDI clock is tempo dependent; that is, typically, MIDI clock events are sent at a rate of 24 pulses per quarter note. Those events are used to control and maintain a tempo for BPM-dependent files, e.g., without limitation, MIDI sequence files.
  • accordingly, a MIDI clock/code should be understood as a MIDI clock/code that is at least in part tempo dependent, thereby facilitating conversion of a user’s gesture to a MIDI tempo-dependent time code/clock as employed in at least some embodiments of the present invention.
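  • As a concrete, non-limiting illustration of the 24-pulses-per-quarter-note relationship, the sketch below emits MIDI real-time Clock bytes (status 0xF8) at the rate implied by a given BPM; the send function is a hypothetical stand-in for whatever MIDI output is actually used.

```python
# Sketch: emit MIDI real-time Clock messages (0xF8) at 24 pulses per quarter
# note for a given BPM. `send_midi_byte` is a placeholder, not a real transport.
import time

MIDI_CLOCK = 0xF8  # MIDI real-time Clock status byte

def send_midi_byte(b):
    print(f"MIDI out: 0x{b:02X}")

def run_midi_clock(get_bpm, pulses=48):
    """Send `pulses` clock ticks, re-reading the tempo before each tick."""
    for _ in range(pulses):
        tick_seconds = 60.0 / (get_bpm() * 24)  # 24 clock pulses per quarter note
        send_midi_byte(MIDI_CLOCK)
        time.sleep(tick_seconds)

# Example: two beats' worth of clock at a fixed 120 BPM.
run_midi_clock(lambda: 120.0, pulses=48)
```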
  • FIG. 1 illustrates a handheld electronic musical instrument 100.
  • Fig. 1 illustrates a front view of instrument 100.
  • Instrument 100 includes a body part 110 and a handle part 120.
  • Body part 110 and handle part 120 may be connected in different ways, as discussed in detail in the parent application 18/074,421, which is incorporated completely herein by reference per above.
  • Body part 110 includes one or more sensor pads 130.
  • Sensor pads 130 are for example, without limitation, piezoelectric sensor pads. Each piezoelectric sensor pad comprises a rubber pad, a metal plate, a piezoelectric sensor, and a foam layer. Striking one or more sensor pads 130 produces a percussion sound.
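  • A rough, non-limiting sketch of how a strike on such a pad might be turned into a triggered sound is shown below; the ADC read, the threshold, and the sample-playback call are hypothetical stand-ins rather than the disclosed sensor circuitry.

```python
# Hypothetical pad-trigger sketch: a piezo peak above a threshold fires the
# drum sound assigned to that pad, with loudness scaled by the strike strength.
# `read_pad_peak` and `play_sample` are placeholders for the real hardware calls.
THRESHOLD = 0.05      # assumed minimum peak that counts as a strike
FULL_SCALE = 1.0      # assumed peak value corresponding to maximum velocity

def read_pad_peak(pad_index):
    """Placeholder for the A/D peak value captured from pad `pad_index`."""
    return 0.6

def play_sample(pad_index, velocity):
    print(f"pad {pad_index}: play assigned sound at velocity {velocity}")

def scan_pads(num_pads=6):
    for pad in range(num_pads):
        peak = read_pad_peak(pad)
        if peak >= THRESHOLD:
            velocity = min(127, int(127 * peak / FULL_SCALE))
            play_sample(pad, velocity)

scan_pads()
```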
  • Instrument 100 also enables the user to select different program styles using a button, such as, without limitations, a thumb button, a push button, or a joystick.
  • the buttons may be associated with handle 120 of instrument 100.
  • Different program styles include, without limitation, a drum and percussion style program, or Musical Instrument Digital Interface (MIDI) percussion or musical sequences.
  • instrument 100 comprises controlling the tempo of the selected program style based on detecting gesture information from a user of the handheld electronic musical instrument, wherein the gesture information is associated with a hand movement of the user.
  • gesture information includes a fast or a slow shaking of instrument 100 and controlling the tempo includes increasing the tempo for a fast shake and decreasing the tempo for a slow shake.
  • FIG. 2 illustrates a schematic of a handheld electronic musical instrument for tempo control, in accordance with some embodiments of the present invention.
  • a schematic block diagram 200 of instrument 100 is shown; the various blocks of instrument 100 include, without limitations, a piezo sensor distribution module and circuitry 205 connected to a common ground 210, a memory management module 215, a battery management module 220, a MIDI sequence loop & drum fill selector 225, an accelerometer 230, a pulse-code modulation & sound module 240, a MIDI Bluetooth transmitter & receiver module 245, a display 235, a digital to analog audio Wi-Fi & Bluetooth transmitter 250, a USB in/out connector 255, a solid-state USB drive 260, a public address system (PA system) 265, an external drum machine 275, and/or a computer 270.
  • PA system public address system
  • piezoelectric sensor pads 130 are connected to piezoelectric sensor distribution module and circuitry 205. Sensor pads 130 are triggered by the hand or finger tapping of the user. The triggers from sensor pads 130 are individually wired to piezoelectric sensor distribution module and circuitry 205 for analog to digital signal conversion. Sensor pads 130 are connected to a common ground 210.
  • Midi Sequence Loop & Drum Fill Selector 225 includes a set of preprogrammed MIDI sequences, for example, without limitations, hi hat, shakers, ride cymbals, or cowbell. The playback tempo of the MIDI sequences is controlled based on the MIDI clock tempo identified by accelerometer 230.
  • accelerometer 230 detects a movement of the handheld instrument 100 and converts the detected movement to a readable MIDI clock (time code), and the MIDI clock is applied to control the tempo of the preprogrammed musical MIDI or percussion sequence files, wherein the sequences are obtained by means of a finger or hand tapping assigned sensor pads 130, by using Joystick 160, or by the finger and thumb sensors of handle 120.
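  • A minimal, non-limiting sketch of one way accelerometer samples could be reduced to gesture pulses that feed such a MIDI clock is given below; the sampling rate, threshold, and debounce values are assumptions rather than the disclosed design of accelerometer 230.

```python
# Hypothetical sketch: detect shake pulses from accelerometer magnitude peaks.
# Each detected pulse can then drive a pulse-to-tempo estimator such as the
# earlier sketch. Threshold and debounce values are assumed, not disclosed.
import math

def detect_gesture_pulses(samples, sample_rate_hz=100.0,
                          threshold_g=1.5, debounce_s=0.15):
    """Return timestamps (in seconds) at which a shake pulse is detected.

    `samples` is a sequence of (x, y, z) acceleration readings in g.
    """
    pulses, last_pulse = [], -debounce_s
    for i, (x, y, z) in enumerate(samples):
        t = i / sample_rate_hz
        magnitude = math.sqrt(x * x + y * y + z * z)
        if magnitude > threshold_g and (t - last_pulse) >= debounce_s:
            pulses.append(t)
            last_pulse = t
    return pulses
```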
  • Each individual sequence may be represented in any suitable format depending upon the needs of the application; however, in the present embodiment, an individual sequence may be represented as a MIDI file playing PCM (Pulse-Code Modulation) digital audio files that may be stored in Sound Module 240, a computer 270, or external drum machine 275.
  • PCM Pulse-Code Modulation
  • the individual sequence (e.g., a MIDI file) may instead or also be transmitted wirelessly.
  • a standard MIDI file is a file format that provides a standardized way for music sequences to be saved, transported, and opened in other systems.
  • SMFs do not contain actual audio data as do regular audio files such as MP3s or WAVs.
  • SMFs instead explain what notes are played, when they're played, and how long or loud each note should be.
  • Files in this format are basically instructions that explain how the sound should be produced once attached to a playback device or loaded into a particular software program that knows how to interpret the data.
  • SMF playback tempo is typically controlled by a variable MIDI clock (tempo dependent time code) setting identified by BPM (Beats Per Minute).
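  • For reference only, a Standard MIDI File expresses tempo as microseconds per quarter note in its set-tempo meta event (FF 51 03), so a BPM value and the stored tempo value convert as sketched below; the helper names are illustrative.

```python
# Conversion between BPM and the Standard MIDI File set-tempo value
# (microseconds per quarter note), carried in the FF 51 03 meta event.
def bpm_to_us_per_quarter(bpm):
    return round(60_000_000 / bpm)

def us_per_quarter_to_bpm(us_per_quarter):
    return 60_000_000 / us_per_quarter

def set_tempo_meta_event(bpm):
    """Return the raw bytes of an SMF set-tempo meta event for `bpm`."""
    return bytes([0xFF, 0x51, 0x03]) + bpm_to_us_per_quarter(bpm).to_bytes(3, "big")

print(bpm_to_us_per_quarter(120))        # 500000 microseconds per quarter note
print(set_tempo_meta_event(120).hex())   # ff510307a120
```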
  • the time code typically may also include a digital time code.
  • Accelerometer 230 detecting gestures to generate time code and controlling the tempo associated with MIDI sequences from Midi Sequence Loop & Drum Fill Selector 225 may collectively be called a MIDI Sequence storage module.
  • MIDI Bluetooth transmitter & receiver 245 transmits MIDI information, such as, without limitations, individual MIDI notes, MIDI Clock Tempo, and MIDI Sequences directly to an external drum machine 275 or a computer 270.
  • Digital to Analog Audio Wi-Fi & Bluetooth transmitter 250 transmits the audio performance from instrument 100 to PA System 265, such as, without limitations, a loudspeaker.
  • USB In/Out connector 255 may connect to computer 270 or storage device (not shown) through solid state USB drive 260 for loading and saving PCM data during a data transfer.
  • Display 235 may be used to display, for example, without limitations, the selected preset programs, sensor and button assignments, library sequences from MIDI sequence loop and drum fill selector 225, functions of memory management module 215, setup and status of Wi-Fi and Bluetooth transmitter 250, setup and status of MIDI Bluetooth transmitter 245, functions of USB In/Out port 255, a charging status of battery management module 220, MIDI Clock Tempo by BPM (Beats Per Minute) values from accelerometer 230, and a BPM setting for manually auditioning loaded sequences.
  • BPM Beats Per Minute
  • gesture detection is performed through mechanical means.
  • the mechanical means may include a small metal or plastic ball attached to one end of a thin flexible arm, with the opposite end of the arm fastened to the handheld electronic musical instrument, wherein with a back and forth gesture the thin flexible arm would sway back and forth to hit and trigger opposite side sensors generating a MIDI clock tempo.
  • the MIDI clock tempo generated is based on the pulse speed generated by the gesture.
  • FIG. 3 illustrates moving the handheld electronic musical instrument for a desired tempo while triggering percussion sounds, in accordance with some embodiments of the present invention.
  • Fig. 3 shows an example of a user moving the instrument 100 while tapping the sensor pads 130.
  • Instrument 100 includes a body 110 and handle 120, wherein body 110 includes sensor pads 130.
  • Handle 120 includes a thumb operable joystick button 160 and a display 180.
  • Joystick button 160 enables selecting different program styles, such as, without limitation, a drum and percussion style program, or MIDI percussion or musical sequences, thereby changing the audio program settings.
  • Display 180 includes an LED (light emitting diode) or an LCD (liquid crystal display) display.
  • the selected program may be displayed on display 180.
  • the user of the instrument holds the handle 120 with a right hand 310 and selects a program using joystick button 160.
  • the user taps the sensor pad 130 using a left hand 320. Apart from the hold and tap action, the user also provides gestures by moving instrument 100 back and forth, thereby controlling the tempo of percussion sounds produced by tapping the sensor pad 130 based on the gestures.
  • the tempo of instrument 100 may be modified based on different types of gestures, for example, without limitations, shaking of the instrument 100 side to side, back-and-forth movement, or twisting the instrument 100 similar to the shaking of a tambourine.
  • FIG. 4 illustrates a method for changing the tempo of an audio output from a handheld electronic musical instrument, in accordance with an embodiment of the present invention.
  • a flow chart describes method 400 for changing the tempo of the audio output from the handheld electronic musical instrument 100.
  • instrument 100 is prepared for playing different musical sounds.
  • a user selects a type of audio to be output or played by instrument 100.
  • the audio output to be played includes such as, without limitation, a drum and percussion style program, a MIDI percussion, or musical sequences.
  • the selected audio output is played.
  • a gesture is detected based on a movement made by the user of instrument 100.
  • an accelerometer 230 as shown in Fig.2 or other mechanical means may be used to detect the movement.
  • Gestures include, for example, without limitations, shaking of the instrument 100 side to side, back-and-forth movement or twisting the instrument 100 like shaking of a tambourine.
  • the audio signal output may be wireless and may be compatible with Bluetooth or 2.4 GHz Wi-Fi standards.
  • Gesture detection enables the handheld electronic musical instrument to play live music while simultaneously controlling in real time the tempo of supplementing active MIDI sequences, such as, without limitations, hi hat, shakers, ride cymbals, or cowbell, in synchronization with the live performance, and provides the ability to physically move freely during an onstage performance.
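  • To make this interplay concrete, the following non-limiting sketch advances a supplementing pattern (e.g., a hi-hat figure) at whatever tempo the gesture detection currently reports, while pad strikes are handled separately by the trigger path; all names, timing values, and the pattern itself are assumptions.

```python
# Illustrative only: a supplementing sequence is stepped at the tempo currently
# reported by gesture detection, independently of the live pad strikes.
import time

def run_supplementing_sequence(get_bpm, play_step, steps_per_beat=2, total_steps=16):
    for step in range(total_steps):
        play_step(step)
        time.sleep(60.0 / (get_bpm() * steps_per_beat))

# Example usage with a fixed 110 BPM gesture tempo and a printed "hi-hat" step.
run_supplementing_sequence(lambda: 110.0, lambda s: print(f"hi-hat step {s}"))
```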
  • FIG. 5 is a block diagram depicting an exemplary client/server system which may be used by an exemplary web-enabled/networked embodiment of the present invention.
  • a communication system 500 includes a multiplicity of clients with a sampling of clients denoted as a client 502 and a client 504, a multiplicity of local networks with a sampling of networks denoted as a local network 506 and a local network 508, a global network 510 and a multiplicity of servers with a sampling of servers denoted as a server 512 and a server 514.
  • Client 502 may communicate bi-directionally with local network 506 via a communication channel 516.
  • Client 504 may communicate bi-directionally with local network 508 via a communication channel 518.
  • Local network 506 may communicate bi-directionally with global network 510 via a communication channel 520.
  • Local network 508 may communicate bidirectionally with global network 510 via a communication channel 522.
  • Global network 510 may communicate bi-directionally with server 512 and server 514 via a communication channel 524.
  • Server 512 and server 514 may communicate bi-directionally with each other via communication channel 524.
  • clients 502, 504, local networks 506, 508, global network 510 and servers 512, 514 may each communicate bi-directionally with each other.
  • global network 510 may operate as the Internet. It will be understood by those skilled in the art that communication system 500 may take many different forms. Nonlimiting examples of forms for communication system 500 include local area networks (LANs), wide area networks (WANs), wired telephone networks, wireless networks, or any other network supporting data communication between respective entities.
  • LANs local area networks
  • WANs wide area networks
  • Clients 502 and 504 may take many different forms.
  • Non-limiting examples of clients 502 and 504 include personal computers, personal digital assistants (PDAs), cellular phones and smartphones.
  • PDAs personal digital assistants
  • Client 502 includes a CPU 526, a pointing device 528, a keyboard 530, a microphone 532, a printer 534, a memory 536, a mass memory storage 538, a GUI 540, a video camera 542, an input/output interface 544 and a network interface 546.
  • CPU 526, pointing device 528, keyboard 530, microphone 532, printer 534, memory 536, mass memory storage 538, GUI 540, video camera 542, input/output interface 544 and network interface 546 may communicate in a unidirectional manner or a bi-directional manner with each other via a communication channel 548.
  • Communication channel 548 may be configured as a single communication channel or a multiplicity of communication channels.
  • CPU 526 may be comprised of a single processor or multiple processors.
  • CPU 526 may be of various types including micro-controllers (e.g., with embedded RAM/ROM) and microprocessors such as programmable devices (e.g., RISC or CISC based, or CPLDs and FPGAs) and devices not capable of being programmed such as gate array ASICs (Application Specific Integrated Circuits) or general-purpose microprocessors.
  • micro-controllers e.g., with embedded RAM/ROM
  • memory 536 is used typically to transfer data and instructions to CPU 526 in a bi-directional manner.
  • Memory 536 may include any suitable computer-readable media, intended for data storage, such as those described above excluding any wired or wireless transmissions unless specifically noted.
  • Mass memory storage 538 may also be coupled bi-directionally to CPU 526 and provides additional data storage capacity and may include any of the computer-readable media described above.
  • Mass memory storage 538 may be used to store programs, data and the like and is typically a secondary storage medium such as a hard disk. It will be appreciated that the information retained within mass memory storage 538, may, in appropriate cases, be incorporated in standard fashion as part of memory 536 as virtual memory.
  • CPU 526 may be coupled to GUI 540.
  • GUI 540 enables a user to view the operation of computer operating system and software.
  • CPU 526 may be coupled to pointing device 528.
  • Examples of pointing device 528 include a computer mouse, a trackball, and a touchpad.
  • Pointing device 528 enables a user with the capability to maneuver a computer cursor about the viewing area of GUI 540 and select areas or features in the viewing area of GUI 540.
  • CPU 526 may be coupled to keyboard 530.
  • Keyboard 530 enables a user with the capability to input alphanumeric textual information to CPU 526.
  • CPU 526 may be coupled to microphone 532.
  • Microphone 532 enables audio produced by a user to be recorded, processed, and communicated by CPU 526.
  • CPU 526 may be connected to printer 534.
  • Printer 534 enables a user with the capability to print information to a sheet of paper.
  • CPU 526 may be connected to video camera 542.
  • Video camera 542 enables video produced or captured by user to be recorded, processed, and communicated by CPU 526.
  • CPU 526 may also be coupled to input/output interface 544 that connects to one or more input/output devices such as CD-ROM, video monitors, track balls, mice, keyboards, microphones, touch-sensitive displays, transducer card readers, magnetic or paper tape readers, tablets, styluses, voice or handwriting recognizers, or other well-known input devices such as, of course, other computers.
  • CPU 526 optionally may be coupled to network interface 546 which enables communication with an external device such as a database or a computer or telecommunications or internet network using an external connection shown generally as communication channel 516, which may be implemented as a hardwired or wireless communications link using suitable conventional technologies. With such a connection, CPU 526 might receive information from the network, or might output information to a network in the course of performing the method steps described in the teachings of the present invention.
  • Such computers referenced and/or described in this disclosure may be any kind of computer, either general purpose, or some specific purpose computer such as, but not limited to, a workstation, a mainframe, GPU, ASIC, etc.
  • the programs may be written in C, or Java, Brew, or any other suitable programming language.
  • the programs may be resident on a storage medium, e.g., magnetic, or optical, e.g., without limitation, the computer hard drive, a removable disk, or media such as, without limitation, a memory stick or SD media, or other removable medium.
  • the programs may also be run over a network, for example, with a server or other machine sending signals to the local machine, which allows the local machine to carry out the operations described herein.
  • Gesture control of a tempo associated with the handheld electronic percussion instrument may vary depending upon the particular context or application.
  • The gesture control of a tempo associated with the handheld electronic percussion instrument described in the foregoing was principally directed to implementations controlling a tempo of MIDI sequences during a live performance; however, similar techniques may instead be applied to real-time tempo control for a lightweight handheld musical instrument that provides free movement during an on-stage performance to control and trigger, e.g., lighting scenes, by wirelessly transmitting MIDI to, e.g., a MIDI-to-DMX processor (DMX being the acronym for Digital Multiplex, the standard digital communication protocol used to remotely control intelligent, e.g., stage lighting fixtures), thereby providing the user, at the same time, the ability to synchronize, e.g., a light show, to the tempo of his or a musical group's live stage performance (see the illustrative sketch following this list). Such implementations are contemplated as within the scope of the present invention.
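By way of non-limiting illustration only, the following minimal Python sketch shows how a gesture-derived tempo could pace MIDI Timing Clock bytes (status 0xF8, sent 24 times per quarter note) that a downstream MIDI-to-DMX processor may follow for lighting synchronization; send_midi_byte() is a hypothetical placeholder for whatever wireless MIDI transport a particular embodiment provides, and is not a feature of any specific library.

import time

MIDI_CLOCK = 0xF8            # MIDI real-time Timing Clock status byte
PULSES_PER_QUARTER = 24      # MIDI clock pulses per quarter note

def send_midi_byte(b: int) -> None:
    # Hypothetical transport stub; a real build would hand the byte to a MIDI interface.
    print(f"MIDI out: 0x{b:02X}")

def run_clock(bpm: float, beats: int) -> None:
    # Emit clock pulses at the gesture-derived tempo so MIDI-to-DMX gear can follow it.
    pulse_interval = 60.0 / (bpm * PULSES_PER_QUARTER)
    for _ in range(beats * PULSES_PER_QUARTER):
        send_midi_byte(MIDI_CLOCK)
        time.sleep(pulse_interval)

run_clock(bpm=120.0, beats=1)   # 120 BPM -> one pulse roughly every 20.8 ms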

Abstract

A handheld musical instrument for playing a variety of audio programs. The handheld musical instrument includes a main body portion with one or more sensors; a handle portion engaged with said main body portion, wherein said handle portion includes one or more control buttons to select a type of program to be played by said handheld musical instrument; and a gesture detection module for detecting gestures associated with said handheld musical instrument and controlling a tempo associated with said program played by said handheld musical instrument based on said detected gestures.

Description

A handheld musical instrument with gesture control
CROSS- REFERENCE TO RELATED APPLICATIONS
[0001] The present PCT patent application claims priority benefit of the continuation-in-part patent application number 18/075,295, entitled “A Handheld musical instrument with control buttons”, filed on 05-DEC-2022, and further claims priority benefit of the U.S. patent application number 18/074,421, entitled “A Handheld musical instrument”, filed on 02-DEC-2022 under 35 U.S.C. 111(a), and further claims priority benefit of the U.S. provisional patent application number 63/286,105, entitled “Handheld Electronic Percussion Instrument”, filed on 06-DEC-2021 under 35 U.S.C. 119(e). The contents of these foregoing related patent applications are incorporated herein by reference for all purposes to the extent that such subject matter is not inconsistent herewith or limiting hereof.
BACKGROUND
[0002] One or more embodiments of the invention generally relate to handheld electronic percussion instruments. More particularly, certain embodiments of the invention relate to gesture controlling a tempo associated with the handheld electronic percussion instrument.
[0003] The following background information may present examples of specific aspects of the prior art (e.g., without limitation, approaches, facts, or common wisdom) that, while expected to be helpful to further educate the reader as to additional aspects of the prior art, is not to be construed as limiting the present invention, or any embodiments thereof, to anything stated or implied therein or inferred thereupon.
[0004] Electronic percussion instruments mimic traditional acoustic drum kits with electronic triggers corresponding to the various drums and cymbals of an acoustic kit. Striking a pad triggers a drum machine to play a percussion sound or sounds assigned to the pad or pads. A percussionist plays these types of electronic percussion instruments with sticks in the same way that an acoustic drum kit would be played. Further, during a live musical performance, a supplementing percussion, such as, without limitation, a MIDI (Musical Instrument Digital Interface) sequencer needs to be synchronized with the live performance.
[0005] The following is an example of a specific aspect in the prior art that, while expected to be helpful to further educate the reader as to additional aspects of the prior art, is not to be construed as limiting the present invention, or any embodiments thereof, to anything stated or implied therein or inferred thereupon. By way of educational background, another aspect of the prior art generally useful to be aware of is that a MIDI compatible sequencer listens and detects the output of a live drummer’s kick or snare drum’s audio channel and derives a tempo in real time to generate a MIDI time code to synchronize the MIDI sequence with a live performance. Any abrupt deviation in tempo may cause a disturbance in synchronizing the MIDI sequence.
[0006] In view of the foregoing, it is clear that these traditional techniques are not perfect and leave room for more optimal approaches.
BRIEF DESCRIPTION OF THE DRAWINGS
[0007] The present invention is illustrated by way of example, and not by way of limitation, in the figures of the accompanying drawings and in which like reference numerals refer to similar elements and in which:
[0008] FIG. 1 illustrates a handheld electronic musical instrument, in accordance with some embodiments of the present invention;
[0009] FIG. 2 illustrates a schematic of a handheld electronic musical instrument for tempo control, in accordance with some embodiments of the present invention;
[0010] FIG. 3 illustrates moving the handheld electronic musical instrument for a desired tempo while triggering percussion sounds, in accordance with some embodiments of the present invention;
[0011] FIG. 4 illustrates a method for changing the tempo of an audio output from a handheld electronic musical instrument, in accordance with an embodiment of the present invention; and,
[0012] FIG. 5 is a block diagram depicting an exemplary client/server system which may be used by an exemplary web-enabled/networked embodiment of the present invention. Unless otherwise indicated, illustrations in the figures are not necessarily drawn to scale.
DETAILED DESCRIPTION
[0013] The present invention is best understood by reference to the detailed figures and description set forth herein.
[0014] Embodiments of the invention are discussed below with reference to the Figures. However, those skilled in the art will readily appreciate that the detailed description given herein with respect to these figures is for explanatory purposes as the invention extends beyond these limited embodiments. For example, it should be appreciated that those skilled in the art will, in light of the teachings of the present invention, recognize a multiplicity of alternate and suitable approaches, depending upon the needs of the particular application, to implement the functionality of any given detail described herein, beyond the particular implementation choices in the following embodiments described and shown. That is, there are modifications and variations of the invention that are too numerous to be listed but that all fit within the scope of the invention. Also, singular words should be read as plural and vice versa and masculine as feminine and vice versa, where appropriate, and alternative embodiments do not necessarily imply that the two are mutually exclusive.
[0015] It is to be further understood that the present invention is not limited to the particular methodology, compounds, materials, manufacturing techniques, uses, and applications, described herein, as these may vary. It is also to be understood that the terminology used herein is used for the purpose of describing particular embodiments only, and is not intended to limit the scope of the present invention. It must be noted that as used herein and in the appended claims, the singular forms "a," "an," and "the" include the plural reference unless the context clearly dictates otherwise. Thus, for example, a reference to "an element" is a reference to one or more elements and includes equivalents thereof known to those skilled in the art. Similarly, for another example, a reference to "a step" or "a means" is a reference to one or more steps or means and may include sub-steps and subservient means. All conjunctions used are to be understood in the most inclusive sense possible. Thus, the word "or" should be understood as having the definition of a logical "or" rather than that of a logical "exclusive or" unless the context clearly necessitates otherwise. Structures described herein are to be understood also to refer to functional equivalents of such structures. Language that may be construed to express approximation should be so understood unless the context clearly dictates otherwise.
[0016] All words of approximation as used in the present disclosure and claims should be construed to mean "approximate," rather than "perfect," and may accordingly be employed as a meaningful modifier to any other word, specified parameter, quantity, quality, or concept. Words of approximation include, yet are not limited to, terms such as "substantial", "nearly", "almost", "about", "generally", "largely", "essentially", "closely approximate", etc.
[0017] As will be established in some detail below, it is well settled law, as early as 1939, that words of approximation are not indefinite in the claims even when such limits are not defined or specified in the specification.
[0018] Unless defined otherwise, all technical and scientific terms used herein have the same meanings as commonly understood by one of ordinary skill in the art to which this invention belongs. Preferred methods, techniques, devices, and materials are described, although any methods, techniques, devices, or materials similar or equivalent to those described herein may be used in the practice or testing of the present invention. Structures described herein are to be understood also to refer to functional equivalents of such structures. The present invention will be described in detail below with reference to embodiments thereof as illustrated in the accompanying drawings.
[0019] References to a "device," an "apparatus," a "system," etc., in the preamble of a claim should be construed broadly to mean "any structure meeting the claim terms" except for any specific structure(s)/type(s) that has/(have) been explicitly disavowed or excluded or admitted/implied as prior art in the present specification or incapable of enabling an object/aspect/goal of the invention. Furthermore, where the present specification discloses an object, aspect, function, goal, result, or advantage of the invention that a specific prior art structure and/or method step is similarly capable of performing yet in a very different way, the present invention disclosure is intended to and shall also implicitly include and cover additional corresponding alternative embodiments that are otherwise identical to that explicitly disclosed except that they exclude such prior art structure(s)/step(s), and shall accordingly be deemed as providing sufficient disclosure to support a corresponding negative limitation in a claim claiming such alternative embodiment(s), which exclude such very different prior art structure(s)/step(s) way(s).
[0020] From reading the present disclosure, other variations and modifications will be apparent to persons skilled in the art. Such variations and modifications may involve equivalent and other features which are already known in the art, and which may be used instead of or in addition to features already described herein.
[0021] Although Claims have been formulated in this Application to particular combinations of features, it should be understood that the scope of the disclosure of the present invention also includes any novel feature or any novel combination of features disclosed herein either explicitly or implicitly or any generalization thereof, whether or not it relates to the same invention as presently claimed in any Claim and whether or not it mitigates any or all of the same technical problems as does the present invention.
[0022] Features which are described in the context of separate embodiments may also be provided in combination in a single embodiment. Conversely, various features which are, for brevity, described in the context of a single embodiment, may also be provided separately or in any suitable subcombination. The Applicants hereby give notice that new Claims may be formulated to such features and/or combinations of such features during the prosecution of the present Application or of any further Application derived therefrom.
[0023] References to "one embodiment," "an embodiment," "example embodiment," "various embodiments," “some embodiments,” “embodiments of the invention,” etc., may indicate that the embodiment(s) of the invention so described may include a particular feature, structure, or characteristic, but not every possible embodiment of the invention necessarily includes the particular feature, structure, or characteristic. Further, repeated use of the phrase "in one embodiment," or "in an exemplary embodiment," “an embodiment,” do not necessarily refer to the same embodiment, although they may. Moreover, any use of phrases like “embodiments” in connection with “the invention” are never meant to characterize that all embodiments of the invention must include the particular feature, structure, or characteristic, and should instead be understood to mean “at least some embodiments of the invention” include the stated particular feature, structure, or characteristic.
[0024] References to “user”, or any similar term, as used herein, may mean a human or nonhuman user thereof. Moreover, “user”, or any similar term, as used herein, unless expressly stipulated otherwise, is contemplated to mean users at any stage of the usage process, to include, without limitation, direct user(s), intermediate user(s), indirect user(s), and end user(s). The meaning of “user”, or any similar term, as used herein, should not be otherwise inferred, or induced by any pattern(s) of description, embodiments, examples, or referenced prior-art that may (or may not) be provided in the present patent.
[0025] References to “end user”, or any similar term, as used herein, are generally intended to mean late-stage user(s) as opposed to early-stage user(s). Hence, it is contemplated that there may be a multiplicity of different types of “end user” near the end stage of the usage process. Where applicable, especially with respect to distribution channels of embodiments of the invention comprising consumed retail products/services thereof (as opposed to sellers/vendors or Original Equipment Manufacturers), examples of an “end user” may include, without limitation, a “consumer”, “buyer”, “customer”, “purchaser”, “shopper”, “enjoyer”, “viewer”, or individual person or non-human thing benefiting in any way, directly or indirectly, from use of, or interaction with, some aspect of the present invention.
[0026] In some situations, some embodiments of the present invention may provide beneficial usage to more than one stage or type of usage in the foregoing usage process. In such cases where multiple embodiments targeting various stages of the usage process are described, references to “end user”, or any similar term, as used therein, are generally intended to not include the user that is the furthest removed, in the foregoing usage process, from the final user therein of an embodiment of the present invention.
[0027] Where applicable, especially with respect to retail distribution channels of embodiments of the invention, intermediate user(s) may include, without limitation, any individual person or non-human thing benefiting in any way, directly or indirectly, from use of, or interaction with, some aspect of the present invention with respect to selling, vending, Original Equipment Manufacturing, marketing, merchandising, distributing, service providing, and the like thereof.
[0028] References to “person”, “individual”, "human", "a party", “animal”, “creature”, or any similar term, as used herein, even if the context or particular embodiment implies living user, maker, or participant, it should be understood that such characterizations are solely by way of example, and not limitation, in that it is contemplated that any such usage, making, or participation by a living entity in connection with making, using, and/or participating, in any way, with embodiments of the present invention may be substituted by similar performance by a suitably configured non-living entity, to include, without limitation, automated machines, robots, humanoids, computational systems, information processing systems, artificially intelligent systems, and the like. It is further contemplated that those skilled in the art will readily recognize the practical situations where such living makers, users, and/or participants with embodiments of the present invention may be in whole, or in part, replaced with such non-living makers, users, and/or participants with embodiments of the present invention. Likewise, when those skilled in the art identify such practical situations where such living makers, users, and/or participants with embodiments of the present invention may be in whole, or in part, replaced with such non-living makers, it will be readily apparent in light of the teachings of the present invention how to adapt the described embodiments to be suitable for such non-living makers, users, and/or participants with embodiments of the present invention. The invention is thus to also cover all such modifications, equivalents, and alternatives falling within the spirit and scope of such adaptations and modifications, at least in part, for such non-living entities.
[0029] Headings provided herein are for convenience and are not to be taken as limiting the disclosure in any way.
[0030] The enumerated listing of items does not imply that any or all of the items are mutually exclusive, unless expressly specified otherwise.
[0031] It is understood that the use of specific component, device and/or parameter names are for example only and not meant to imply any limitations on the invention. The invention may thus be implemented with different nomenclature/terminology utilized to describe the mechanisms/units/structures/components/devices/parameters herein, without limitation. Each term utilized herein is to be given its broadest interpretation given the context in which that term is utilized.
[0032] Terminology. The following paragraphs provide definitions and/or context for terms found in this disclosure (including the appended claims):
[0033] "Comprising" And “contain” and variations of them- Such terms are open-ended and mean “including but not limited to”. When employed in the appended claims, this term does not foreclose additional structure or steps. Consider a claim that recites: "A memory controller comprising a system cache . . . . " Such a claim does not foreclose the memory controller from including additional components (e.g., a memory channel unit, a switch).
[0034] "Configured To." Various units, circuits, or other components may be described or claimed as "configured to" perform a task or tasks. In such contexts, "configured to" or “operable for” is used to connote structure by indicating that the mechanisms/units/circuits/components include structure (e.g., circuitry and/or mechanisms) that performs the task or tasks during operation. As such, the mechanisms/unit/circuit/component can be said to be configured to (or be operable) for perform(ing) the task even when the specified mechanisms/unit/circuit/component is not currently operational (e.g., is not on). The mechanisms/units/circuits/components used with the "configured to" or “operable for” language include hardware--for example, mechanisms, structures, electronics, circuits, memory storing program instructions executable to implement the operation, etc. Reciting that a mechanism/unit/circuit/component is "configured to" or “operable for” perform(ing) one or more tasks is expressly intended not to invoke 35 ll.S.C... sctn.112, sixth paragraph, for that mechanism/unit/circuit/component. "Configured to" may also include adapting a manufacturing process to fabricate devices or components that are adapted to implement or perform one or more tasks.
[0035] "Based On." As used herein, this term is used to describe one or more factors that affect a determination. This term does not foreclose additional factors that may affect a determination. That is, a determination may be solely based on those factors or based, at least in part, on those factors. Consider the phrase "determine A based on B." While B may be a factor that affects the determination of A, such a phrase does not foreclose the determination of A from also being based on C. In other instances, A may be determined based solely on B.
[0036] The terms "a", "an" and "the" mean "one or more", unless expressly specified otherwise.
[0037] All terms of exemplary language (e.g., including, without limitation, “such as”, “like”, “for example”, “for instance”, “similar to”, etc.) are not exclusive of any other, potentially, unrelated, types of examples; thus, implicitly mean "by way of example, and not limitation...", unless expressly specified otherwise.
[0038] Unless otherwise indicated, all numbers expressing conditions, concentrations, dimensions, and so forth used in the specification and claims are to be understood as being modified in all instances by the term "about." Accordingly, unless indicated to the contrary, the numerical parameters set forth in the following specification and attached claims are approximations that may vary depending at least upon a specific analytical technique.
[0053] The term "comprising," which is synonymous with "including," "containing," or "characterized by" is inclusive or open-ended and does not exclude additional, unrecited elements or method steps. "Comprising" is a term of art used in claim language which means that the named claim elements are essential, but other claim elements may be added and still form a construct within the scope of the claim.
[0054] As used herein, the phrase "consisting of" excludes any element, step, or ingredient not specified in the claim. When the phrase "consists of" (or variations thereof) appears in a clause of the body of a claim, rather than immediately following the preamble, it limits only the element set forth in that clause; other elements are not excluded from the claim as a whole. As used herein, the phrases "consisting essentially of" and "consisting of" limit the scope of a claim to the specified elements or method steps, plus those that do not materially affect the basis and novel characteristic(s) of the claimed subject matter (see Norian Corp. v. Stryker Corp., 363 F.3d 1321, 1331-32, 70 USPQ2d 1508, Fed. Cir. 2004). Moreover, for any claim of the present invention which claims an embodiment "consisting essentially of" or "consisting of" a certain set of elements of any herein described embodiment it shall be understood as obvious by those skilled in the art that the present invention also covers all possible varying scope variants of any described embodiment(s) that are each exclusively (i.e., “consisting essentially of”) functional subsets or functional combinations thereof such that each of these plurality of exclusive varying scope variants each consists essentially of any functional subset(s) and/or functional combination(s) of any set of elements of any described embodiment(s) to the exclusion of any others not set forth therein. That is, it is contemplated that it will be obvious to those skilled how to create a multiplicity of alternate embodiments of the present invention that simply consist essentially of a certain functional combination of elements of any described embodiment(s) to the exclusion of any others not set forth therein, and the invention thus covers all such exclusive embodiments as if they were each described herein.
[0055] With respect to the terms "comprising," "consisting of," and "consisting essentially of," where one of these three terms is used herein, the disclosed and claimed subject matter may include the use of either of the other two terms. Thus, in some embodiments not otherwise explicitly recited, any instance of "comprising" may be replaced by "consisting of" or, alternatively, by "consisting essentially of", and thus, for the purposes of claim support and construction for "consisting of" format claims, such replacements operate to create yet other alternative embodiments "consisting essentially of" only the elements recited in the original "comprising" embodiment to the exclusion of all other elements.
[0056] Moreover, any claim limitation phrased in functional limitation terms covered by 35 USC §112(6) (post AIA 112(f)) which has a preamble invoking the closed terms "consisting of," or "consisting essentially of," should be understood to mean that the corresponding structure(s) disclosed herein define the exact metes and bounds of what the so claimed invention embodiment(s) consists of, or consists essentially of, to the exclusion of any other elements which do not materially affect the intended purpose of the so claimed embodiment(s). Furthermore, any statement(s), identification(s), or reference(s) to a structure(s) and/or element(s) that corresponds to and/or supports a claim limitation(s) phrased in functional limitation terms covered by 35 USC §112(6) (post AIA 112(f)) should be understood to be identified by way of example and not limitation, and as such, should not be interpreted to mean that such recited structure and/or element is/are the only structure(s) and/or element(s) disclosed in this patent application that corresponds to and/or supports such claim limitations phrased in functional limitation terms. This claims interpretation intention also applies to any such subsequent statements made by Applicant during prosecution.
[0057] Devices or system modules that are in at least general communication with each other need not be in continuous communication with each other, unless expressly specified otherwise. In addition, devices or system modules that are in at least general communication with each other may communicate directly or indirectly through one or more intermediaries. Moreover, it is understood that any system components described or named in any embodiment or claimed herein may be grouped or sub-grouped (and accordingly implicitly renamed) in any combination or sub-combination as those skilled in the art can imagine as suitable for the particular application, and still be within the scope and spirit of the claimed embodiments of the present invention. For an example of what this means, if the invention was a controller of a motor and a valve and the embodiments and claims articulated those components as being separately grouped and connected, applying the foregoing would mean that such an invention and claims would also implicitly cover the valve being grouped inside the motor and the controller being a remote controller with no direct physical connection to the motor or internalized valve, as such the claimed invention is contemplated to cover all ways of grouping and/or adding of intermediate components or systems that still substantially achieve the intended result of the invention.
[0058] A description of an embodiment with several components in communication with each other does not imply that all such components are required. On the contrary a variety of optional components are described to illustrate the wide variety of possible embodiments of the present invention.
[0059] As is well known to those skilled in the art, many careful considerations and compromises typically must be made when designing for the optimal manufacture of a commercial implementation of any system, and in particular, the embodiments of the present invention. A commercial implementation in accordance with the spirit and teachings of the present invention may be configured according to the needs of the particular application, whereby any aspect(s), feature(s), function(s), result(s), component(s), approach(es), or step(s) of the teachings related to any described embodiment of the present invention may be suitably omitted, included, adapted, mixed and matched, or improved and/or optimized by those skilled in the art, using their average skills and known techniques, to achieve the desired implementation that addresses the needs of the particular application.
[0060] In the following description and claims, the terms "coupled" and "connected," along with their derivatives, may be used. It should be understood that these terms are not intended as synonyms for each other. Rather, in particular embodiments, "connected" may be used to indicate that two or more elements are in direct physical or electrical contact with each other. "Coupled" may mean that two or more elements are in direct physical or electrical contact. However, "coupled" may also mean that two or more elements are not in direct contact with each other, but yet still cooperate or interact with each other.
[0061] A "computer" may refer to one or more apparatus and/or one or more systems that are capable of accepting a structured input, processing the structured input according to prescribed rules, and producing results of the processing as output. Examples of a computer may include: a computer; a stationary and/or portable computer; a computer having a single processor, multiple processors, or multi-core processors, which may operate in parallel and/or not in parallel; a general purpose computer; a supercomputer; a mainframe; a super mini-computer; a mini-computer; a workstation; a micro-computer; a server; a client; an interactive television; a web appliance; a telecommunications device with internet access; a hybrid combination of a computer and an interactive television; a portable computer; a tablet personal computer (PC); a personal digital assistant (PDA); a portable telephone; application-specific hardware to emulate a computer and/or software, such as, for example, a digital signal processor (DSP), a field-programmable gate array (FPGA), an application specific integrated circuit (ASIC), an application specific instruction-set processor (ASIP), a chip, chips, a system on a chip, or a chip set; a data acquisition device; an optical computer; a quantum computer; a biological computer; and generally, an apparatus that may accept data, process data according to one or more stored software programs, generate results, and typically include input, output, storage, arithmetic, logic, and control units.
[0062] Those of skill in the art will appreciate that where appropriate, some embodiments of the disclosure may be practiced in network computing environments with many types of computer system configurations, including personal computers, hand-held devices, multi-processor systems, microprocessor-based or programmable consumer electronics, network PCs, minicomputers, mainframe computers, and the like. Where appropriate, embodiments may also be practiced in distributed computing environments where tasks are performed by local and remote processing devices that are linked (either by hardwired links, wireless links, or by a combination thereof) through a communications network. In a distributed computing environment, program modules may be located in both local and remote memory storage devices.
[0063] "Software" may refer to prescribed rules to operate a computer. Examples of software may include: code segments in one or more computer-readable languages; graphical and or/textual instructions; applets; pre-compiled code; interpreted code; compiled code; and computer programs.
[0064] While embodiments herein may be discussed in terms of a processor having a certain number of bit instructions/data, those skilled in the art will know others that may be suitable, such as 16-bit, 32-bit, 64-bit, 128-bit, or 256-bit processors or processing, which can usually alternatively be used. Where a specified logical sense is used, the opposite logical sense is also intended to be encompassed.
[0065] The example embodiments described herein can be implemented in an operating environment comprising computer-executable instructions (e.g., software) installed on a computer, in hardware, or in a combination of software and hardware. The computer-executable instructions can be written in a computer programming language or can be embodied in firmware logic. If written in a programming language conforming to a recognized standard, such instructions can be executed on a variety of hardware platforms and for interfaces to a variety of operating systems. Although not limited thereto, computer software program code for carrying out operations for aspects of the present invention can be written in any combination of one or more suitable programming languages, including object-oriented programming languages and/or conventional procedural programming languages, and/or programming languages such as, for example, Hypertext Markup Language (HTML), Dynamic HTML, Extensible Markup Language (XML), Extensible Stylesheet Language (XSL), Document Style Semantics and Specification Language (DSSSL), Cascading Style Sheets (CSS), Synchronized Multimedia Integration Language (SMIL), Wireless Markup Language (WML), Java™, Jini™, C, C++, Smalltalk, Perl, UNIX Shell, Visual Basic or Visual Basic Script, Virtual Reality Markup Language (VRML), ColdFusion™, or other compilers, assemblers, interpreters or other computer languages or platforms.
[0066] Computer program code for carrying out operations for aspects of the present invention may be written in any combination of one or more programming languages, including an object-oriented programming language such as Java, Smalltalk, C++, or the like and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer or entirely on the remote computer or server. In the latter scenario, the remote computer may be connected to the user's computer through any type of network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet Service Provider).
[0067] A network is a collection of links and nodes (e.g., multiple computers and/or other devices connected together) arranged so that information may be passed from one part of the network to another over multiple links and through various nodes. Examples of networks include the Internet, the public switched telephone network, the global Telex network, computer networks (e.g., an intranet, an extranet, a local-area network, or a wide-area network), wired networks, and wireless networks.
[0068] The Internet is a worldwide network of computers and computer networks arranged to allow the easy and robust exchange of information between computer users. Hundreds of millions of people around the world have access to computers connected to the Internet via Internet Service Providers (ISPs). Content providers (e.g., website owners or operators) place multimedia information (e.g., text, graphics, audio, video, animation, and other forms of data) at specific locations on the Internet referred to as webpages. Websites comprise a collection of connected, or otherwise related, webpages. The combination of all the websites and their corresponding webpages on the Internet is generally known as the World Wide Web (WWW) or simply the Web.
[0069] Aspects of the present invention are described below with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to embodiments of the invention. It will be understood that each block of the flowchart illustrations and/or block diagrams, and combinations of blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general-purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks.
[0070] The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods, and computer program products according to various embodiments. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems that perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
[0071] These computer program instructions may also be stored in a computer readable medium that can direct a computer, other programmable data processing apparatus, or other devices to function in a particular manner, such that the instructions stored in the computer readable medium produce an article of manufacture including instructions which implement the function/act specified in the flowchart and/or block diagram block or blocks.
[0072] Further, although process steps, method steps, algorithms or the like may be described in a sequential order, such processes, methods, and algorithms may be configured to work in alternate orders. In other words, any sequence or order of steps that may be described does not necessarily indicate a requirement that the steps be performed in that order. The steps of processes described herein may be performed in any order practical. Further, some steps may be performed simultaneously.
[0073] It will be readily apparent that the various methods and algorithms described herein may be implemented by, e.g., appropriately programmed general purpose computers and computing devices. Typically, a processor (e.g., a microprocessor) will receive instructions from a memory or like device, and execute those instructions, thereby performing a process defined by those instructions. Further, programs that implement such methods and algorithms may be stored and transmitted using a variety of known media.
[0074] When a single device or article is described herein, it will be readily apparent that more than one device/article (whether or not they cooperate) may be used in place of a single device/article. Similarly, where more than one device or article is described herein (whether or not they cooperate), it will be readily apparent that a single device/article may be used in place of the more than one device or article.
[0075] The functionality and/or the features of a device may be alternatively embodied by one or more other devices which are not explicitly described as having such functionality/features. Thus, other embodiments of the present invention need not include the device itself.
[0076] The term "computer-readable medium" as used herein refers to any medium that participates in providing data (e.g., instructions) which may be read by a computer, a processor or a like device. Such a medium may take many forms, including but not limited to, non-volatile media, volatile media, and transmission media. Non-volatile media include, for example, optical or magnetic disks and other persistent memory. Volatile media include dynamic random-access memory (DRAM), which typically constitutes the main memory. Transmission media include coaxial cables, copper wire and fiber optics, including the wires that comprise a system bus coupled to the processor. Transmission media may include or convey acoustic waves, light waves, and electromagnetic emissions, such as those generated during radio frequency (RF) and infrared (IR) data communications. Common forms of computer-readable media include, for example, a floppy disk, a flexible disk, hard disk, magnetic tape, any other magnetic medium, a CD-ROM, DVD, any other optical medium, punch cards, paper tape, any other physical medium with patterns of holes, a RAM, a PROM, an EPROM, a FLASH-EEPROM, removable media, flash memory, a "memory stick", any other memory chip or cartridge, a carrier wave as described hereinafter, or any other medium from which a computer can read.
[0077] Various forms of computer readable media may be involved in carrying sequences of instructions to a processor. For example, sequences of instruction (i) may be delivered from RAM to a processor, (ii) may be carried over a wireless transmission medium, and/or (iii) may be formatted according to numerous formats, standards, or protocols, such as Bluetooth, TDMA, CDMA, 3G.
[0078] Where databases are described, it will be understood by one of ordinary skill in the art that (i) alternative database structures to those described may be readily employed, (ii) other memory structures besides databases may be readily employed. Any schematic illustrations and accompanying descriptions of any sample databases presented herein are exemplary arrangements for stored representations of information. Any number of other arrangements may be employed besides those suggested by the tables shown. Similarly, any illustrated entries of the databases represent exemplary information only; those skilled in the art will understand that the number and content of the entries can be different from those illustrated herein. Further, despite any depiction of the databases as tables, an object-based model could be used to store and manipulate the data types of the present invention and likewise, object methods or behaviors can be used to implement the processes of the present invention.
[0079] A "computer system" may refer to a system having one or more computers, where each computer may include a computer-readable medium embodying software to operate the computer or one or more of its components. Examples of a computer system may include: a distributed computer system for processing information via computer systems linked by a network; two or more computer systems connected together via a network for transmitting and/or receiving information between the computer systems; a computer system including two or more processors within a single computer; and one or more apparatuses and/or one or more systems that may accept data, may process data in accordance with one or more stored software programs, may generate results, and typically may include input, output, storage, arithmetic, logic, and control units.
[0080] A "network" may refer to a number of computers and associated devices that may be connected by communication facilities. A network may involve permanent connections such as cables or temporary connections such as those made through telephone or other communication links. A network may further include hard-wired connections (e.g., coaxial cable, twisted pair, optical fiber, waveguides, etc.) and/or wireless connections (e.g., radio frequency waveforms, free- space optical waveforms, acoustic waveforms, etc.). Examples of a network may include: an internet, such as the Internet; an intranet; a local area network (LAN); a wide area network (WAN); and a combination of networks, such as an internet and an intranet. [0081] As used herein, the "client-side" application should be broadly construed to refer to an application, a page associated with that application, or some other resource or function invoked by a client-side request to the application. A "browser" as used herein is not intended to refer to any specific browser (e.g., Internet Explorer, Safari, FireFox, or the like), but should be broadly construed to refer to any client-side rendering engine that can access and display Internet- accessible resources. A "rich" client typically refers to a non-HTTP based client-side application, such as an SSH or CFIS client. Further, while typically the client-server interactions occur using HTTP, this is not a limitation either. The client server interaction may be formatted to conform to the Simple Object Access Protocol (SOAP) and travel over HTTP (over the public Internet), FTP, or any other reliable transport mechanism (such as IBM.RTM. MQSeries.RTM. technologies and CORBA, for transport over an enterprise intranet) may be used. Any application or functionality described herein may be implemented as native code, by providing hooks into another application, by facilitating use of the mechanism as a plug-in, by linking to the mechanism, and the like.
[0082] Exemplary networks may operate with any of a number of protocols, such as Internet protocol (IP), asynchronous transfer mode (ATM), and/or synchronous optical network (SONET), user datagram protocol (UDP), IEEE 802.x, etc.
[0083] Embodiments of the present invention may include apparatuses for performing the operations disclosed herein. An apparatus may be specially constructed for the desired purposes, or it may comprise a general-purpose device selectively activated or reconfigured by a program stored in the device.
[0084] Embodiments of the invention may also be implemented in one or a combination of hardware, firmware, and software. They may be implemented as instructions stored on a machine- readable medium, which may be read and executed by a computing platform to perform the operations described herein.
[0085] More specifically, as will be appreciated by one skilled in the art, aspects of the present invention may be embodied as a system, method, or computer program product. Accordingly, aspects of the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment (including firmware, resident software, micro-code, etc.) or an embodiment combining software and hardware aspects that may all generally be referred to herein as a "circuit," "module" or "system." Furthermore, aspects of the present invention may take the form of a computer program product embodied in one or more computer readable medium(s) having computer readable program code embodied thereon.
[0086] In the following description and claims, the terms "computer program medium" and "computer readable medium" may be used to generally refer to media such as, but not limited to, removable storage drives, a hard disk installed in hard disk drive, and the like. These computer program products may provide software to a computer system. Embodiments of the invention may be directed to such computer program products.
[0087] An algorithm is here, and generally, considered to be a self-consistent sequence of acts or operations leading to a desired result. These include physical manipulations of physical quantities. Usually, though not necessarily, these quantities take the form of electrical or magnetic signals capable of being stored, transferred, combined, compared, and otherwise manipulated. It has proven convenient at times, principally for reasons of common usage, to refer to these signals as bits, values, elements, symbols, characters, terms, numbers or the like. It should be understood, however, that all of these and similar terms are to be associated with the appropriate physical quantities and are merely convenient labels applied to these quantities.
[0088] Unless specifically stated otherwise, and as may be apparent from the following description and claims, it should be appreciated that throughout the specification descriptions utilizing terms such as "processing," "computing," "calculating," "determining," or the like, refer to the action and/or processes of a computer or computing system, or similar electronic computing device, that manipulate and/or transform data represented as physical, such as electronic, quantities within the computing system's registers and/or memories into other data similarly represented as physical quantities within the computing system's memories, registers or other such information storage, transmission or display devices.
[0089] Additionally, the phrase "configured to" or “operable for” can include generic structure (e.g., generic circuitry) that is manipulated by software and/or firmware (e.g., an FPGA or a general- purpose processor executing software) to operate in a manner that is capable of performing the task(s) at issue. "Configured to" may also include adapting a manufacturing process (e.g., a semiconductor fabrication facility) to fabricate devices (e.g., integrated circuits) that are adapted to implement or perform one or more tasks.
[0090] In a similar manner, the term "processor" may refer to any device or portion of a device that processes electronic data from registers and/or memory to transform that electronic data into other electronic data that may be stored in registers and/or memory. A "computing platform" may comprise one or more processors.
[0091] Embodiments within the scope of the present disclosure may also include tangible and/or non-transitory computer-readable storage media for carrying or having computer-executable instructions or data structures stored thereon. Such non-transitory computer-readable storage media can be any available media that can be accessed by a general purpose or special purpose computer, including the functional design of any special purpose processor as discussed above. By way of example, and not limitation, such non-transitory computer-readable media can include RAM, ROM, EEPROM, CD-ROM or other optical disk storage, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to carry or store desired program code means in the form of computer-executable instructions, data structures, or processor chip design. When information is transferred or provided over a network or another communications connection (either hardwired, wireless, or combination thereof) to a computer, the computer properly views the connection as a computer-readable medium. Thus, any such connection is properly termed a computer-readable medium. Combinations of the above should also be included within the scope of the computer-readable media.
[0092] While a non-transitory computer readable medium includes, but is not limited to, a hard drive, compact disc, flash memory, volatile memory, random access memory, magnetic memory, optical memory, semiconductor-based memory, phase change memory, periodically refreshed memory, and the like; the non-transitory computer readable medium, however, does not include a pure transitory signal per se; i.e., where the medium itself is transitory.
[0093] Embodiments of the present invention disclose a handheld electronic musical instrument, such as, without limitations, a percussion instrument. The handheld electronic musical instrument comprises a body part and a handle part. The handle part may be connected to the body part in various ways, as discussed in detail in the parent application 18/074,421, which is incorporated completely herein by reference per above. The body part comprises a set of pads, for example, without limitations, rubber pads. The rubber pads are attached to pressure sensitive sensors, such as, without limitations, a piezoelectric sensor. When a user of the handheld percussion instrument strikes the rubber pads by finger or by hand, the piezoelectric sensor is triggered and generates sound files, for example, without limitations, digital drum sound files. The handle part of the handheld electronic percussion instrument comprises buttons enabling the user to select different program styles, such as, without limitation, a drum and percussion style program, or Musical Instrument Digital Interface (MIDI) percussion or musical sequences. Further, the handheld electronic musical instrument provides for controlling the tempo of the selected program style based on detecting gesture information from hand movement of a user of the handheld electronic musical instrument. For example, without limitations, an accelerometer or a mechanical means may be implemented for gesture detection. For example, without limitations, the mechanical means may include a small metal or plastic ball attached to one end of a thin flexible arm, with the opposite end of the arm fastened to the handheld electronic musical instrument, wherein with a back and forth gesture the thin flexible arm would sway back and forth to hit and trigger opposite side sensors, generating a MIDI clock tempo. The MIDI clock tempo generated is based on the pulse speed generated by the gesture. In another example, a small plastic or metal ball may be placed inside a small tube fastened to the handheld electronic musical instrument, such that with a back and forth gesture the ball would travel back and forth, allowing it to hit and trigger opposite side sensors, generating a MIDI clock tempo with a speed generated by the gesture. The percussion sound obtained by striking the sensor pads, combined with gesture detection obtained from the movement of the handheld musical instrument, provides a continuous MIDI clock, for example, without limitations, a time code enabling manual control of the tempo of audio output from the instrument, such as, without limitations, preprogrammed musical or percussion audio sequence files, or MIDI sequence files. For example, without limitations, if the detected gesture is a slow shaky movement, the playback tempo is made slow; on the other hand, when the shake is fast, the playback tempo is made faster. The sounds may be amplified and broadcast through loudspeaker controllers connected to an internal sound module of the musical instrument. The loudspeaker controller may be wired or wirelessly connected to the internal sound module. For example, without limitations, the music performance from the handheld musical instrument may be converted to MIDI and transmitted wirelessly to an external MIDI sound module, a drum machine, or a computer, and may be amplified and broadcast through loudspeaker controllers or recorded to a storage medium such as, without limitations, a computer.
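By way of non-limiting illustration only, and not as a description of any particular embodiment, the following minimal Python sketch shows one way the timestamps of detected gesture pulses (e.g., accelerometer threshold crossings, or the described ball striking an end sensor) could be converted into a beats-per-minute value for such gesture-driven tempo control; the function name, the clamping range, and the assumption that one full back-and-forth gesture corresponds to one beat are illustrative assumptions only.

from statistics import median

def bpm_from_pulse_times(pulse_times, min_bpm=40.0, max_bpm=240.0):
    # Return a clamped BPM estimate from pulse timestamps (in seconds), or None
    # if there are not yet enough pulses to estimate a tempo.
    if len(pulse_times) < 2:
        return None
    intervals = [t1 - t0 for t0, t1 in zip(pulse_times, pulse_times[1:])]
    beat_period = median(intervals)        # median tolerates one irregular swing
    if beat_period <= 0:
        return None
    bpm = 60.0 / beat_period               # assumption: one gesture pulse per beat
    return max(min_bpm, min(max_bpm, bpm))

print(bpm_from_pulse_times([0.00, 0.51, 1.00, 1.52, 2.01]))   # about 120 BPM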
[0094] Regarding the term “time code” (alternatively “timecode”) as may be used herein, unless stated otherwise, a “time code” is generally an invisible digital address that may be recorded along with an audio or video file for viable synchronization purposes. In the context of the present invention, there is contemplated a distinction between a general “time code” and a “MIDI time code”, with the “MIDI clock being tempo dependent”. That is, typically, MIDI clock events are sent at a rate of 24 pulses per quarter note. Those events are used to control and maintain a tempo for BPM-dependent, e.g., without limitation, MIDI sequence files. In at least some embodiments described herein, the term MIDI clock/code should be understood as a MIDI clock/code that is at least in part tempo dependent, thereby facilitating translation of a user’s gesture into a MIDI tempo-dependent time code/clock as employed in at least some embodiments of the present invention.
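As a non-limiting numerical illustration of the relationships just described, the short Python helpers below relate a tempo in BPM to the MIDI clock pulse rate at 24 pulses per quarter note and to the microseconds-per-quarter-note value carried by a standard MIDI file Set Tempo event; they are offered only as a sketch of the underlying arithmetic.

def clock_pulses_per_second(bpm: float) -> float:
    return bpm * 24 / 60.0          # MIDI clock pulses per second at 24 PPQN

def smf_tempo_microseconds(bpm: float) -> int:
    return round(60_000_000 / bpm)  # Set Tempo value: microseconds per quarter note

print(clock_pulses_per_second(120.0))   # 48.0 pulses per second
print(smf_tempo_microseconds(120.0))    # 500000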
[0095] FIG. 1 illustrates a handheld electronic musical instrument 100. FIG. 1 illustrates a front view of instrument 100. Instrument 100 includes a body part 110 and a handle part 120. Body part 110 and handle part 120 may be connected in different ways, as discussed in detail in the parent application 18/074,421, which is incorporated completely herein by reference per above. Body part 110 includes one or more sensor pads 130. Sensor pads 130 are, for example, without limitation, piezoelectric sensor pads. Each piezoelectric sensor pad comprises a rubber pad, a metal plate, a piezoelectric sensor, and a foam layer. Striking one or more sensor pads 130 produces a percussion sound. Instrument 100 also enables the user to select different program styles using a button, such as, without limitations, a thumb button, a push button, or a joystick. The buttons may be associated with handle 120 of instrument 100. Different program styles include, without limitation, a drum and percussion style program and Musical Instrument Digital Interface (MIDI) percussion or musical sequences. Further, instrument 100 provides for controlling the tempo of the selected program style based on detecting gesture information from a user of the handheld electronic musical instrument, wherein the gesture information is associated with a hand movement of the user. For example, without limitation, gesture information includes a fast or a slow shaking of instrument 100, and controlling the tempo includes increasing the tempo for a fast shake and decreasing the tempo for a slow shake.
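By way of non-limiting illustration only, the following minimal Python sketch shows one possible mapping from the peak level of a piezoelectric sensor pad strike to a velocity-style value (1-127) used to trigger a percussion sound; the threshold and full-scale constants are illustrative assumptions, not measured characteristics of any described sensor pad.

def strike_to_velocity(peak, threshold=0.05, full_scale=1.0):
    # Return a 1-127 velocity for a pad strike, or None for readings below threshold.
    if peak < threshold:
        return None                              # ignore noise and light brushes
    scaled = (peak - threshold) / (full_scale - threshold)
    return max(1, min(127, round(1 + scaled * 126)))

print(strike_to_velocity(0.10))   # soft tap  -> low velocity (about 8)
print(strike_to_velocity(0.90))   # hard hit  -> high velocity (about 114)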
[0096] FIG. 2 illustrates a schematic of a handheld electronic musical instrument for tempo control, in accordance with some embodiments of the present invention. In Fig. 2 a schematic block diagram 200 of instrument 100 is shown. The various blocks of instrument 100 include, without limitations, a piezo sensor distribution module and circuitry 205 connected to a common ground 210, a memory management module 215, a battery management module 220, a MIDI sequence loop & drum fill selector 225, an accelerometer 230, a pulse-code modulation & sound module 240, a MIDI Bluetooth transmitter & receiver module 245, a display 235, a digital to analog audio Wi-Fi & Bluetooth transmitter 250, a USB in/out connector 255, a solid-state USB drive 260, a public address system (PA system) 265, an external drum machine 275, and/or a computer 270.
[0097] In Fig. 2, piezoelectric sensor pads 130 are connected to piezoelectric sensor distribution module and circuitry 205. Sensor pads 130 are triggered by the hand or finger tapping of the user. The triggers from sensor pads 130 are individually wired to piezoelectric sensor distribution module and circuitry 205 for analog to digital signal conversion. Sensor pads 130 are connected to a common ground 210. Midi Sequence Loop & Drum Fill Selector 225 includes a set of preprogrammed MIDI sequences, for example, without limitations, hi hat, shakers, ride cymbals, and cowbell. The playback tempo of the MIDI sequences is controlled based on the MIDI clock tempo identified by accelerometer 230. For example, without limitations, accelerometer 230 detects a movement of the handheld instrument 100 and converts the detected movement to a readable MIDI clock (time code), and the MIDI clock is applied to control the tempo of the preprogrammed musical MIDI or percussion sequence files, wherein the sequences are obtained by means of a finger or hand tapping assigned sensor pads 130, by using Joystick 160, or by using finger and thumb sensors of handle 120. Each individual sequence may be represented in any suitable format depending upon the needs of the application; however, in the present embodiment, an individual sequence may be represented as a MIDI file playing PCM (Pulse-Code Modulation) digital audio files that may be stored in Sound Module 240, in a computer 270, or in external drum machine 275. In some embodiments, the individual sequence (e.g., a MIDI file) may instead or also be transmitted wirelessly. In the present embodiment, a standard MIDI file (SMF) is a file format that provides a standardized way for music sequences to be saved, transported, and opened in other systems. Generally, as is well known to those skilled in the art, SMFs do not contain actual audio data as do regular audio files such as MP3s or WAVs. Instead, SMFs explain what notes are played, when they're played, and how long or loud each note should be. Files in this format are basically instructions that explain how the sound should be produced once attached to a playback device or loaded into a particular software program that knows how to interpret the data. SMF playback tempo is typically controlled by a variable MIDI clock (tempo dependent time code) setting identified by BPM (Beats Per Minute). The time code may typically also include a digital time code.
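One plausible way to turn accelerometer 230's movement readings into the readable MIDI clock described above, sketched below in Python, is to time successive shakes and smooth the resulting beat interval into a BPM estimate. The class name, magnitude threshold, history length, and re-arming logic are assumptions for illustration only, not details taken from this disclosure.

# --- Illustrative sketch (Python); threshold and history are assumed values ---
from collections import deque

class GestureTempoEstimator:
    # Hypothetical helper: estimate a BPM tempo from accelerometer magnitude samples.
    def __init__(self, threshold=1.5, history=4):
        self.threshold = threshold      # assumed magnitude (in g) that counts as a shake
        self.intervals = deque(maxlen=history)
        self.last_shake_time = None
        self.armed = True               # re-arm only after magnitude falls below threshold

    def update(self, magnitude, timestamp_s):
        # Feed one sample; return a smoothed BPM estimate, or None if not yet known.
        if magnitude < self.threshold:
            self.armed = True
        elif self.armed:
            self.armed = False
            if self.last_shake_time is not None:
                self.intervals.append(timestamp_s - self.last_shake_time)
            self.last_shake_time = timestamp_s
        if not self.intervals:
            return None
        avg_interval = sum(self.intervals) / len(self.intervals)
        return 60.0 / avg_interval      # one detected shake is treated as one beat
# --- end sketch ---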
[0098] Accelerometer 230, which detects gestures to generate a time code and thereby controls the tempo associated with the MIDI sequences, and Midi Sequence Loop & Drum Fill Selector 225 may collectively be called a MIDI sequence storage module. MIDI Bluetooth transmitter & receiver 245 transmits MIDI information, such as, without limitations, individual MIDI notes, MIDI clock tempo, and MIDI sequences, directly to an external drum machine 275 or a computer 270. Further, Digital to Analog Audio Wi-Fi & Bluetooth transmitter 250 transmits the audio performance from instrument 100 to PA System 265, such as, without limitations, loudspeakers. USB In/Out connector 255 may connect to computer 270 or a storage device (not shown) through solid state USB drive 260 for loading and saving PCM data during a data transfer. [0099] Display 235 may be used to display, for example, without limitations, the selected preset programs, sensor and button assignments, library sequences from MIDI sequence loop and drum fill selector 225, functions of memory management module 215, setup and status of Wi-Fi and Bluetooth transmitter 250, setup and status of MIDI Bluetooth transmitter 245, functions of USB In/Out port 255, a charging status of battery management module 220, MIDI clock tempo by BPM (Beats Per Minute) values from accelerometer 230, and a BPM setting for manually auditioning loaded sequences.
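For context on the MIDI information that MIDI Bluetooth transmitter & receiver 245 is described as sending, the sketch below frames standard MIDI bytes: a real-time Timing Clock and a percussion Note On/Off pair on MIDI channel 10. The byte values follow the general MIDI conventions; the transport object, the snare note number 38, and the helper names are assumptions rather than details from this disclosure.

# --- Illustrative sketch (Python); transport, note, and velocity defaults are assumed ---
def midi_timing_clock():
    return bytes([0xF8])                      # MIDI real-time Timing Clock status byte

def midi_note_on(note, velocity, channel=9):
    # channel 9 (zero-based) is MIDI channel 10, conventionally used for percussion
    return bytes([0x90 | (channel & 0x0F), note & 0x7F, velocity & 0x7F])

def midi_note_off(note, channel=9):
    return bytes([0x80 | (channel & 0x0F), note & 0x7F, 0x00])

def send_pad_hit(transport, note=38, velocity=100):
    # Forward one pad strike as a Note On followed by a matching Note Off.
    transport.write(midi_note_on(note, velocity))
    transport.write(midi_note_off(note))
# --- end sketch ---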
[00100] In some embodiments, gesture detection is performed through mechanical means. For example, without limitations, the mechanical means may include a small metal or plastic ball attached to one end of a thin flexible arm, with the opposite end of the arm fastened to the handheld electronic musical instrument, wherein with a back and forth gesture the thin flexible arm would sway back and forth to hit and trigger opposite side sensors generating a MIDI clock tempo. The MIDI clock tempo generated is based on the pulse speed generated by the gesture. In another example, a small plastic or metal ball may be placed inside a small tube fastened to the handheld electronic musical instrument, such that with a back and forth gesture the ball would travel back and forth allowing it to hit and trigger opposite side sensors generating a MIDI clock tempo with a speed generated by the gesture. FIG. 3 illustrates moving the handheld electronic musical instrument for a desired tempo while triggering percussion sounds, in accordance with some embodiments of the present invention. Fig. 3 shows an example of a user moving the instrument 100 while tapping the sensor pads 130. Instrument 100 includes a body 110 and handle 120, wherein body 110 includes sensor pads 130. Handle 120 includes a thumb operable joystick button 160 and a display 180. Joystick button 160 enables selecting different program styles, such as, without limitation, a drum and percussion style program, or MIDI percussion or musical sequences, and changing the audio program settings. Display 180 includes an LED (light emitting diode) or an LCD (liquid crystal display) display. The selected program may be displayed on display 180. In an example operation, the user of the instrument holds handle 120 with a right hand 310 and selects a program using joystick button 160. The user taps the sensor pad 130 using a left hand 320. Apart from the hold and tap action, the user also provides a gesture by moving instrument 100 back and forth, thereby controlling the tempo of percussion sounds produced by tapping the sensor pad 130 based on the gestures. When the back-and-forth movement is fast, the tempo of the percussion sound is increased; on the other hand, when the back-and-forth movement is slow, the tempo is decreased. The tempo of instrument 100 may be modified based on different types of gestures, for example, without limitations, shaking of the instrument 100 side to side, back-and-forth movement, or twisting the instrument 100 similar to shaking of a tambourine.
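The mechanical ball-and-sensor arrangement described above can be illustrated with a small sketch that treats each trigger of an opposite-side sensor as one half of a swing. The class name, the sensor labels 'A' and 'B', and the one-full-swing-equals-one-beat assumption are illustrative only, not details from this disclosure.

# --- Illustrative sketch (Python); sensor labels and beat mapping are assumptions ---
class MechanicalSwingClock:
    # Derive a tempo from alternating hits on two opposite-side sensors.
    def __init__(self):
        self.last_side = None
        self.last_time = None
        self.bpm = None

    def hit(self, side, timestamp_s):
        # Register a trigger on sensor 'A' or 'B'; only alternating hits are counted.
        if self.last_side is not None and side != self.last_side and self.last_time is not None:
            half_swing = timestamp_s - self.last_time    # time to travel from one side to the other
            self.bpm = 60.0 / (2.0 * half_swing)         # a full back-and-forth swing is one beat
        self.last_side, self.last_time = side, timestamp_s
        return self.bpm
# --- end sketch ---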
[00101] FIG. 4 illustrates a method for changing the tempo of an audio output from a handheld electronic musical instrument, in accordance with an embodiment of the present invention. In Fig. 4 a flow chart describes method 400 for changing the tempo of the audio output from the handheld electronic musical instrument 100. At initial step 410, instrument 100 is prepared for playing different musical sounds. At step 420, a user selects a type of audio to be output or played by instrument 100. The audio output to be played includes, without limitation, a drum and percussion style program, a MIDI percussion, or musical sequences. At step 430, the selected audio output is played. At step 440, a gesture is detected based on a movement made by the user of instrument 100. For example, an accelerometer 230 as shown in Fig. 2 or other mechanical means may be used to detect the movement. Gestures include, for example, without limitations, shaking of the instrument 100 side to side, back-and-forth movement, or twisting the instrument 100 like shaking of a tambourine. At step 450, a determination is made whether the movement created by the user's gesture is a fast movement or a slow movement. If the detected movement is a fast movement, the tempo of the audio output is increased at step 470 and the audio continues to play with the increased tempo. On the other hand, if the detected movement is a slow movement, the tempo of the audio output is decreased at step 460 and the audio continues to play with the reduced tempo. The audio signal output may be wireless and may be compatible with Bluetooth or 2.4 GHz Wi-Fi standards.
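A hedged sketch of the decision made at steps 450-470: the gesture-derived tempo is classified as fast or slow and the playback tempo is nudged up or down accordingly. The thresholds, step size, and bounds are illustrative assumptions, not values taken from this disclosure.

# --- Illustrative sketch (Python); thresholds, step, and bounds are assumed values ---
def adjust_tempo(current_bpm, gesture_bpm,
                 fast_threshold=130.0, slow_threshold=90.0,
                 step=5.0, lo=40.0, hi=240.0):
    # Steps 450-470 as a sketch: classify the gesture as fast or slow, then nudge the tempo.
    if gesture_bpm >= fast_threshold:          # fast movement -> increase tempo (step 470)
        return min(hi, current_bpm + step)
    if gesture_bpm <= slow_threshold:          # slow movement -> decrease tempo (step 460)
        return max(lo, current_bpm - step)
    return current_bpm                         # in between: leave the tempo unchanged
# --- end sketch ---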
[00102] Gesture detection enables the handheld electronic musical instrument to play live music while simultaneously controlling, in real time, the tempo of supplemental active MIDI sequences, such as, without limitations, hi hat, shakers, ride cymbals, and cowbell, in synchronization with the live performance, and provides the ability to physically move freely during an onstage performance.
[00103] FIG. 5 is a block diagram depicting an exemplary client/server system which may be used by an exemplary web-enabled/networked embodiment of the present invention.
[00104] A communication system 500 includes a multiplicity of clients with a sampling of clients denoted as a client 502 and a client 504, a multiplicity of local networks with a sampling of networks denoted as a local network 506 and a local network 508, a global network 510 and a multiplicity of servers with a sampling of servers denoted as a server 512 and a server 514.
[00105] Client 502 may communicate bi-directionally with local network 506 via a communication channel 516. Client 504 may communicate bi-directionally with local network 508 via a communication channel 518. Local network 506 may communicate bi-directionally with global network 510 via a communication channel 520. Local network 508 may communicate bidirectionally with global network 510 via a communication channel 522. Global network 510 may communicate bi-directionally with server 512 and server 514 via a communication channel 524. Server 512 and server 514 may communicate bi-directionally with each other via communication channel 524. Furthermore, clients 502, 504, local networks 506, 508, global network 510 and servers 512, 514 may each communicate bi-directionally with each other.
[00106] In one embodiment, global network 510 may operate as the Internet. It will be understood by those skilled in the art that communication system 500 may take many different forms. Nonlimiting examples of forms for communication system 500 include local area networks (LANs), wide area networks (WANs), wired telephone networks, wireless networks, or any other network supporting data communication between respective entities.
[00107] Clients 502 and 504 may take many different forms. Non-limiting examples of clients 502 and 504 include personal computers, personal digital assistants (PDAs), cellular phones and smartphones.
[00108] Client 502 includes a CPU 526, a pointing device 528, a keyboard 530, a microphone 532, a printer 534, a memory 536, a mass memory storage 538, a GUI 540, a video camera 542, an input/output interface 544 and a network interface 546.
[00109] CPU 526, pointing device 528, keyboard 530, microphone 532, printer 534, memory 536, mass memory storage 538, GUI 540, video camera 542, input/output interface 544 and network interface 546 may communicate in a unidirectional manner or a bi-directional manner with each other via a communication channel 548. Communication channel 548 may be configured as a single communication channel or a multiplicity of communication channels.
[00110] CPU 526 may be comprised of a single processor or multiple processors. CPU 526 may be of various types including micro-controllers (e.g., with embedded RAM/ROM) and microprocessors such as programmable devices (e.g., RISC or CISC based, or CPLDs and FPGAs) and devices not capable of being programmed such as gate array ASICs (Application Specific Integrated Circuits) or general-purpose microprocessors.
[00111] As is well known in the art, memory 536 is used typically to transfer data and instructions to CPU 526 in a bi-directional manner. Memory 536, as discussed previously, may include any suitable computer-readable media, intended for data storage, such as those described above excluding any wired or wireless transmissions unless specifically noted. Mass memory storage 538 may also be coupled bi-directionally to CPU 526 and provides additional data storage capacity and may include any of the computer-readable media described above. Mass memory storage 538 may be used to store programs, data and the like and is typically a secondary storage medium such as a hard disk. It will be appreciated that the information retained within mass memory storage 538, may, in appropriate cases, be incorporated in standard fashion as part of memory 536 as virtual memory.
[00112] CPU 526 may be coupled to GUI 540. GUI 540 enables a user to view the operation of the computer operating system and software. CPU 526 may be coupled to pointing device 528. Nonlimiting examples of pointing device 528 include a computer mouse, a trackball, and a touchpad.
Pointing device 528 enables a user with the capability to maneuver a computer cursor about the viewing area of GUI 540 and select areas or features in the viewing area of GUI 540. CPU 526 may be coupled to keyboard 530. Keyboard 530 enables a user with the capability to input alphanumeric textual information to CPU 526. CPU 526 may be coupled to microphone 532. Microphone 532 enables audio produced by a user to be recorded, processed, and communicated by CPU 526. CPU 526 may be connected to printer 534. Printer 534 enables a user with the capability to print information to a sheet of paper. CPU 526 may be connected to video camera 542. Video camera 542 enables video produced or captured by user to be recorded, processed, and communicated by CPU 526.
[00113] CPU 526 may also be coupled to input/output interface 544 that connects to one or more input/output devices such as CD-ROM, video monitors, track balls, mice, keyboards, microphones, touch-sensitive displays, transducer card readers, magnetic or paper tape readers, tablets, styluses, voice or handwriting recognizers, or other well-known input devices such as, of course, other computers.
[00114] Finally, CPU 526 optionally may be coupled to network interface 546, which enables communication with an external device such as a database, a computer, or a telecommunications or internet network using an external connection shown generally as communication channel 516, which may be implemented as a hardwired or wireless communications link using suitable conventional technologies. With such a connection, CPU 526 might receive information from the network, or might output information to a network in the course of performing the method steps described in the teachings of the present invention. [00115] Those skilled in the art will readily recognize, in light of and in accordance with the teachings of the present invention, that any of the foregoing steps and/or system modules may be suitably replaced, reordered, or removed, and additional steps and/or system modules may be inserted, depending upon the needs of the particular application; that the systems of the foregoing embodiments may be implemented using any of a wide variety of suitable processes and system modules; and that such systems are not limited to any particular computer hardware, software, middleware, firmware, microcode, and the like. For any method steps described in the present application that can be carried out on a computing machine, a typical computer system can, when appropriately configured or designed, serve as a computer system in which those aspects of the invention may be embodied. Such computers referenced and/or described in this disclosure may be any kind of computer, either general purpose or some specific purpose computer such as, but not limited to, a workstation, a mainframe, a GPU, an ASIC, etc. The programs may be written in C, Java, Brew, or any other suitable programming language. The programs may be resident on a storage medium, e.g., magnetic or optical, such as, without limitation, the computer hard drive, a removable disk, or media such as, without limitation, a memory stick or SD media, or other removable medium. The programs may also be run over a network, for example, with a server or other machine sending signals to the local machine, which allows the local machine to carry out the operations described herein.
[00116] All the features disclosed in this specification, including any accompanying abstract and drawings, may be replaced by alternative features serving the same, equivalent, or similar purpose, unless expressly stated otherwise. Thus, unless expressly stated otherwise, each feature disclosed is one example only of a generic series of equivalent or similar features.
[00117] Having fully described at least one embodiment of the present invention, other equivalent or alternative methods of implementing gesture control of a tempo associated with the handheld electronic percussion instrument according to the present invention will be apparent to those skilled in the art. Various aspects of the invention have been described above by way of illustration, and the specific embodiments disclosed are not intended to limit the invention to the particular forms disclosed. The particular implementation of the gesture control of a tempo associated with the handheld electronic percussion instrument may vary depending upon the particular context or application. By way of example, and not limitation, the gesture control of a tempo associated with the handheld electronic percussion instrument described in the foregoing was principally directed to controlling a tempo of MIDI sequences during live performance implementations; however, similar techniques may instead be applied to real time tempo control for a lightweight handheld musical instrument that provides free movement during an onstage performance to control and trigger, e.g., lighting scenes, by wirelessly transmitting MIDI to, e.g., a MIDI-to-DMX processor (DMX being the acronym for Digital Multiplex, the standard digital communication protocol used to remotely control intelligent fixtures, e.g., stage lighting fixtures), providing the user, at the same time, the ability to synchronize, e.g., a light show, with the tempo of his or her, or a musical group's, live stage performance, which implementations of the present invention are contemplated as within the scope of the present invention. The invention is thus to cover all modifications, equivalents, and alternatives falling within the spirit and scope of the following claims. It is to be further understood that not all of the disclosed embodiments in the foregoing specification will necessarily satisfy or achieve each of the objects, advantages, or improvements described in the foregoing specification.
[00118] Claim elements and steps herein may have been numbered and/or lettered solely as an aid in readability and understanding. Any such numbering and lettering in itself is not intended to and should not be taken to indicate the ordering of elements and/or steps in the claims.
[00119] The corresponding structures, materials, acts, and equivalents of all means or step plus function elements in the claims below are intended to include any structure, material, or act for performing the function in combination with other claimed elements as specifically claimed.
[00120] The description of the present invention has been presented for purposes of illustration and description, but is not intended to be exhaustive or limited to the invention in the form disclosed. Many modifications and variations will be apparent to those of ordinary skill in the art without departing from the scope and spirit of the invention. The embodiment was chosen and described in order to best explain the principles of the invention and the practical application, and to enable others of ordinary skill in the art to understand the invention for various embodiments with various modifications as are suited to the particular use contemplated.
[00121] The Abstract is provided to comply with 37 C.F.R. Section 1.72(b) requiring an abstract that will allow the reader to ascertain the nature and gist of the technical disclosure. That is, the Abstract is provided merely to introduce certain concepts and not to identify any key or essential features of the claimed subject matter. It is submitted with the understanding that it will not be used to limit or interpret the scope or meaning of the claims.
[00122] The following claims are hereby incorporated into the detailed description, with each claim standing on its own as a separate embodiment.
[00123] Only those claims which employ the words "means for" or “steps for” are to be interpreted under 35 USC 112, sixth paragraph (pre-AIA) or 35 USC 112(f) post-AIA. Otherwise, no limitations from the specification are to be read into any claims, unless those limitations are expressly included in the claims.

Claims

What is claimed is:
1. A handheld musical instrument, said handheld musical instrument comprising: a main body portion with one or more sensors; a handle portion engaged with said main body portion, wherein said handle portion includes one or more control buttons to select a type of program to be played by said handheld musical instrument; and a gesture detection module for detecting gestures associated with said handheld musical instrument and controlling a tempo associated with said program played by said handheld musical instrument based on said detected gestures.
2. The handheld musical instrument of claim 1, wherein said gesture detection module detects physical movements associated with said handheld musical instrument.
3. The handheld musical instrument of claim 2, wherein said physical movements comprise at least one of shaking side to side, moving back and forth, or twisting of said handheld musical instrument.
4. The handheld musical instrument of claim 2, wherein said gesture detection module comprises an accelerometer to detect said physical movements.
5. The handheld musical instrument of claim 2, wherein said gesture detection module comprises a mechanical means to detect said physical movements.
6. The handheld musical instrument of claim 1, further comprising said gesture detection module converting said detected gestures to a tempo dependent time code.
7. The handheld musical instrument of claim 6, wherein said gesture detection module controls a tempo associated with said program played by said handheld musical instrument based on said tempo dependent time code.
8. The handheld musical instrument of claim 6, wherein said gesture detection module controls a tempo associated with an audio output produced by triggering said sensors in said main body portion of said handheld musical instrument based on said tempo dependent time code.
9. The handheld musical instrument of claim 7, wherein said time code is a digital tempo dependent time code.
10. The handheld musical instrument of claim 7, wherein said program played by said handheld musical instrument is connected to a public addressing system.
11. A method comprising the steps of: selecting a type of program to be played by a handheld musical instrument; detecting gestures associated with said handheld musical instrument; and controlling a tempo associated with said program played by said handheld musical instrument based on said detected gestures and/or a manual control means.
12. The method of claim 11, wherein the step of detecting gestures further comprises the step of detecting physical movements associated with said handheld musical instrument.
13. The method of claim 12, wherein the step of detecting physical movements further comprises detecting at least one of shaking side to side, moving back and forth, or twisting of said handheld musical instrument.
14. The method of claim 12 further comprising the steps of: increasing said tempo associated with said program played when said detected physical movements are fast movements; and playing said program with said increased tempo.
15. The method of claim 12 further comprising the steps of: reducing said tempo associated with said program played when said detected physical movements are slow movements; and playing said program with said reduced tempo.
16. The method of claim 12, wherein the step of detecting physical movements is performed by a mechanical means.
17. The method of claim 11 further comprising the step of converting said detected gestures to a tempo dependent time code.
18. The method of claim 11, wherein the step of controlling a tempo associated with said program played by said handheld musical instrument further comprises controlling said tempo based at least in part on signals received from a gesture sensitive accelerometer comprised with the handheld musical instrument.
19. The method of claim 11, wherein said manual control means is by way of receiving signals from a joystick comprised with the handheld musical instrument, wherein manual manipulation of said joystick generates unique output signals, and wherein said step of tempo controlling is at least in part based upon said joystick output signals.
20. An apparatus comprising: means for selecting a type of program to be played by a handheld musical instrument; means for detecting gestures associated with said handheld musical instrument; and means for controlling a tempo associated with said program played by said handheld musical instrument.
PCT/US2022/051935 2021-12-06 2022-12-06 A handheld musical instrument with gesture control WO2023107431A1 (en)

Applications Claiming Priority (6)

Application Number Priority Date Filing Date Title
US202163286105P 2021-12-06 2021-12-06
US63/286,105 2021-12-06
US18/074,421 US11893969B2 (en) 2021-12-06 2022-12-02 Handheld musical instrument
US18/074,421 2022-12-02
US18/075,295 US20230178056A1 (en) 2021-12-06 2022-12-05 Handheld musical instrument with control buttons
US18/075,295 2022-12-05

Publications (1)

Publication Number Publication Date
WO2023107431A1 true WO2023107431A1 (en) 2023-06-15

Family

ID=86731096

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2022/051935 WO2023107431A1 (en) 2021-12-06 2022-12-06 A handheld musical instrument with gesture control

Country Status (1)

Country Link
WO (1) WO2023107431A1 (en)

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20180188850A1 (en) * 2016-12-30 2018-07-05 Jason Francesco Heath Sensorized Spherical Input and Output Device, Systems, and Methods

Similar Documents

Publication Publication Date Title
US9542920B2 (en) Modular wireless sensor network for musical instruments and user interfaces for use therewith
US8111241B2 (en) Gestural generation, sequencing and recording of music on mobile devices
US10955984B2 (en) Step sequencer for a virtual instrument
EP2760014A1 (en) Method for making audio file and terminal device
KR102270633B1 (en) Apparatus, system, and method for transferring data from a terminal to an electromyography device
US20030159567A1 (en) Interactive music playback system utilizing gestures
US20010035087A1 (en) Interactive music playback system utilizing gestures
CN109791740A (en) Intelligent measurement and feedback system for intelligent piano
US9779710B2 (en) Electronic apparatus and control method thereof
KR20060126727A (en) Advanced control device for home entertainment utilizing three dimensional motion technology
JP2014507829A5 (en)
WO2017028686A1 (en) Information processing method, terminal device and computer storage medium
CN110796918A (en) Training method and device and mobile terminal
US20230178059A1 (en) Handheld musical instrument with gesture control
CN104822095A (en) Composite beat special effect system and composite beat special effect processing method
CN112435641B (en) Audio processing method, device, computer equipment and storage medium
WO2023107431A1 (en) A handheld musical instrument with gesture control
Reid et al. Minimally Invasive Gesture Sensing Interface (MIGSI) for Trumpet.
US9176610B1 (en) Audiovisual sampling for percussion-type instrument with crowd-sourced content sourcing and distribution
CN104822094A (en) Polyrhythm special-effect system and polyrhythm special-effect processing method
CN104584537B (en) Improved method for searching and device for content playback
EP3518230A1 (en) Generation and transmission of musical performance data
Bouillot et al. A Mobile Wireless Augmented Guitar.
Martin Touchless gestural control of concatenative sound synthesis
Ren et al. Interactive virtual percussion instruments on mobile devices

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22904990

Country of ref document: EP

Kind code of ref document: A1