EP2630557A1 - Methods, devices and systems for creating control signals - Google Patents

Methods, devices and systems for creating control signals

Info

Publication number
EP2630557A1
EP2630557A1 (application EP11833636.1A)
Authority
EP
European Patent Office
Prior art keywords
interface
pitch
exemplary embodiments
user
digit
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
EP11833636.1A
Other languages
German (de)
English (en)
Inventor
Joshua Michael Young
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Individual
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from PCT/AU2010/001409 (WO2011047438A1)
Priority claimed from AU2010905631A (AU2010905631A0)
Application filed by Individual
Priority claimed from PCT/AU2011/001341 (WO2012051664A1)
Publication of EP2630557A1
Legal status: Withdrawn

Classifications

    • A: HUMAN NECESSITIES
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63F: CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00: Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/20: Input arrangements for video game devices
    • A63F13/21: Input arrangements for video game devices characterised by their sensors, purposes or types
    • A63F13/211: Input arrangements for video game devices characterised by their sensors, purposes or types using inertial sensors, e.g. accelerometers or gyroscopes
    • A: HUMAN NECESSITIES
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63F: CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00: Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/20: Input arrangements for video game devices
    • A63F13/21: Input arrangements for video game devices characterised by their sensors, purposes or types
    • A63F13/212: Input arrangements for video game devices characterised by their sensors, purposes or types using sensors worn by the player, e.g. for measuring heart beat or leg activity
    • A: HUMAN NECESSITIES
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63F: CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00: Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/20: Input arrangements for video game devices
    • A63F13/24: Constructional details thereof, e.g. game controllers with detachable joystick handles
    • A63F13/245: Constructional details thereof, e.g. game controllers with detachable joystick handles specially adapted to a particular type of game, e.g. steering wheels
    • A: HUMAN NECESSITIES
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63F: CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00: Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/40: Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment
    • A63F13/44: Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment involving timing of operations, e.g. performing an action within a time slot
    • A: HUMAN NECESSITIES
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63F: CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00: Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/80: Special adaptations for executing a specific game genre or game mode
    • A63F13/814: Musical performances, e.g. by evaluating the player's ability to follow a notation
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011: Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/014: Hand-worn input/output arrangements, e.g. data gloves
    • H: ELECTRICITY
    • H01: ELECTRIC ELEMENTS
    • H01H: ELECTRIC SWITCHES; RELAYS; SELECTORS; EMERGENCY PROTECTIVE DEVICES
    • H01H13/00: Switches having rectilinearly-movable operating part or parts adapted for pushing or pulling in one direction only, e.g. push-button switch
    • H01H13/70: Switches having rectilinearly-movable operating part or parts adapted for pushing or pulling in one direction only, e.g. push-button switch, having a plurality of operating members associated with different sets of contacts, e.g. keyboard
    • H01H13/84: Switches having rectilinearly-movable operating part or parts adapted for pushing or pulling in one direction only, e.g. push-button switch, having a plurality of operating members associated with different sets of contacts, e.g. keyboard, characterised by ergonomic functions, e.g. for miniature keyboards; characterised by operational sensory functions, e.g. sound feedback
    • A: HUMAN NECESSITIES
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63F: CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00: Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/30: Interconnection arrangements between game servers and game devices; Interconnection arrangements between game devices; Interconnection arrangements between game servers
    • A63F13/32: Interconnection arrangements between game servers and game devices; Interconnection arrangements between game devices; Interconnection arrangements between game servers using local area network [LAN] connections
    • A63F13/327: Interconnection arrangements between game servers and game devices; Interconnection arrangements between game devices; Interconnection arrangements between game servers using local area network [LAN] connections using wireless networks, e.g. Wi-Fi® or piconet
    • A: HUMAN NECESSITIES
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63F: CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00: Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/10: Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game, characterized by input arrangements for converting player-generated signals into game device control signals
    • A63F2300/105: Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game, characterized by input arrangements for converting player-generated signals into game device control signals using inertial sensors, e.g. accelerometers, gyroscopes
    • A: HUMAN NECESSITIES
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63F: CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00: Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/80: Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game, specially adapted for executing a specific type of game
    • A63F2300/8047: Music games
    • H: ELECTRICITY
    • H01: ELECTRIC ELEMENTS
    • H01H: ELECTRIC SWITCHES; RELAYS; SELECTORS; EMERGENCY PROTECTIVE DEVICES
    • H01H9/00: Details of switching devices, not covered by groups H01H1/00 - H01H7/00
    • H01H9/02: Bases, casings, or covers
    • H01H9/0214: Hand-held casings
    • H01H2009/0221: Hand-held casings with the switches being fixed to the operator's hand, e.g. integrated in a glove or fixed to a ring

Definitions

  • This disclosure generally relates to machine interfaces, and, more particularly, to methods, devices and/or systems for creating control signals in response to a user's actions such as the coordinated or independent movement of one or more of the user's digits (fingers/thumb), hand(s), and/or arm(s).
  • Another known interface attempts to create control signals using linear or rotational velocity, acceleration, or the time-derivative of acceleration to control electronic musical sounds.
  • Another known interface, which is utilized for the purpose of playing musical video games, uses accelerometers and gyroscopes for data input and includes buttons that can be used to elicit binary (on-off) control signals.
  • Another known interface generates control signals via two digit touch sensors assigned to each digit.
  • the two touch sensors assigned to a digit are each actuated by contact with a different area of the digit.
  • It would be desirable to provide an interface that captures movements and postures of a user's hand and/or arm in a way that is intuitive, high-resolution, and easy to learn. Accordingly, it would be desirable to provide machine interfaces, and methods, devices and/or systems for creating control signals in response to a user's actions, to address one or more other problems in the art and/or provide one or more advantages.
  • Exemplary embodiments relate to machine interfaces and/or methods, devices and/or systems for creating control signals in response to a user's actions.
  • these actions may include, without limitation, the coordinated or independent movement of one or more of the user's digits (fingers/thumb), hand(s), and/or arm(s).
  • Exemplary embodiments of the methods, devices and/or systems may be used to control audio and visual information and/or outputs.
  • Exemplary embodiments may provide rapid, substantially concurrent, and/or temporally-precise access to a wide range of discrete output events.
  • the output events may be used, for example, to produce melodic, harmonic, and/or rhythmic outcomes.
  • a hand-held device may provide at least 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, and/or 15 finger/thumb operated buttons or activation points.
  • the device may be capable of providing access to at least 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, and/or 15 discrete output events.
  • the duration of time needed to play a C major scale may be measured.
  • the time may start from when the first tone (C4) is triggered and end when the last tone (C5) is triggered.
  • tone pitch may be assigned to the digit buttons chromatically (e.g., as illustrated in FIG. 12A) or diatonically (e.g., as illustrated in FIG. 12B).
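As an illustration only (not part of the claimed subject matter), the chromatic and diatonic assignments of tone pitches to digit buttons can be sketched as follows. The button ordering and MIDI note numbers are assumptions; the actual layouts are those shown in FIG. 12A and 12B.

```python
# Sketch of chromatic vs. diatonic pitch assignment to digit buttons.
# Button ordering and MIDI numbering are illustrative assumptions.

C4 = 60  # MIDI note number for middle C


def chromatic_assignment(n_buttons=13, root=C4):
    """Assign consecutive semitones to consecutive buttons (cf. FIG. 12A)."""
    return [root + i for i in range(n_buttons)]


def diatonic_assignment(n_buttons=8, root=C4):
    """Assign C major scale degrees to consecutive buttons (cf. FIG. 12B)."""
    major_steps = [0, 2, 4, 5, 7, 9, 11]  # semitone offsets within one octave
    return [root + 12 * (i // 7) + major_steps[i % 7] for i in range(n_buttons)]


print(chromatic_assignment())  # 13 semitones, C4 through C5
print(diatonic_assignment())   # C major scale, C4 through C5
```

With 13 buttons the chromatic layout spans exactly one octave (C4 to C5), which is consistent with the chromatic-scale test described below.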
  • users with an intermediate level of experience may be able to complete the C major scale in approximately 1 second (e.g., about 0.75, 0.8, 0.85, 0.9, 0.95, 1.05, 1.1, 1.15, 1.2, or 1.25 seconds) in the chromatic configuration and in approximately 0.8 seconds (e.g., about 0.6, 0.65, 0.7, 0.75, 0.85, 0.9, 0.95, 1.05, or 1.1 seconds) in the diatonic configuration.
  • a similar test may be performed measuring the duration of time needed to play the pitches of a chromatic scale starting at C (e.g., C4, Db4, D4, Eb4, E4, F4, Gb4, G4, Ab4, A4, Bb4, B4, and C5) with the time starting from when the first tone (C4) is triggered to when the last tone (C5) is triggered.
  • a chromatic assignment of pitches to the digit buttons may be used (e.g., as illustrated in FIG. 12A).
  • Tests may also assess the concurrent access to discrete output events provided by exemplary embodiments, that is, having more than one discrete output in a triggered state at a time. This may be assessed by measuring the duration of time a user requires to activate a harmonic set of musical tones (also referred to as a "chord"). For example, after a start signal is presented to a user, the test may measure how long it takes to trigger the tones of a chord (without deactivating any of those tones). In this test a chromatic assignment of pitches to the digit buttons may be used (e.g., as illustrated in FIG. 12A). For users with an intermediate level of experience using an exemplary embodiment, the following triggering times are anticipated:
  • C4, E4, and G4: 0.4 seconds (e.g., about 0.2, 0.3, 0.5, or 0.6 seconds); D4, F4, and A4: 0.3 seconds; F4, A4, and C5: 0.3 seconds (e.g., about 0.1 or 0.2 seconds).
  • C4, E4, and G4: 1.5 seconds (e.g., 1, 1.1, 1.2, 1.3, 1.4, 1.6, 1.7, 1.8, 1.9, or 2 seconds); D4, F4, and A4: 0.5 seconds (e.g., about 0.2, 0.3, 0.4, 0.6, 0.7, 0.8, 0.9, or 1 seconds); F4, A4, and C5: 0.4 seconds (e.g., about 0.1, 0.2, 0.3, 0.5, 0.6, 0.7, or 0.8 seconds).
  • C4, E4, and G4: 0.3 seconds (e.g., about 0.1, 0.2, 0.4, or 0.5 seconds); D4, F4, and A4: 0.2 seconds (e.g., about 0.1, 0.3, 0.4, or 0.5 seconds); F4, A4, and C5: 0.2 seconds (e.g., about 0.1, 0.3, 0.4, or 0.5 seconds).
  • Tests may also assess the temporal precision with which discrete output events can be triggered by exemplary embodiments. This may be assessed by measuring, for example, how accurately a user can reproduce a rhythm using the onsets of musical sounds triggered using exemplary embodiments. For example, a test rhythm of 4 beats per measure (or "bar") at a tempo of 100 beats per minute may be made audible to a user, and the user may be required to emulate this rhythm while it is playing by repeatedly triggering a musical sound via a single digit button on an exemplary embodiment.
  • the time interval between each sound in a test rhythm may be 0.6 seconds, and the time interval between each sound triggered by the user may be measured and subtracted from 0.6 seconds to determine how close on average the user is to producing the test rhythm.
  • the resulting average value may be divided by the test interval of 0.6 seconds and then multiplied by 100 to give a percentage error.
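The averaging and percentage-error computation described in the two preceding points can be sketched as follows. The onset times used in the example are invented for illustration only.

```python
# Sketch of the rhythm-accuracy metric described above: mean absolute
# deviation of user inter-onset intervals from the 0.6 s test interval
# (100 beats per minute), expressed as a percentage of that interval.


def rhythm_percent_error(onsets, test_interval=0.6):
    """Mean |interval - test_interval| as a percentage of test_interval."""
    intervals = [b - a for a, b in zip(onsets, onsets[1:])]
    mean_dev = sum(abs(iv - test_interval) for iv in intervals) / len(intervals)
    return 100 * mean_dev / test_interval


# Hypothetical user onsets roughly 0.6 s apart
user_onsets = [0.0, 0.61, 1.19, 1.82, 2.40]
print(round(rhythm_percent_error(user_onsets), 2))
```

A perfectly reproduced rhythm gives 0% error; the expected 2-4% errors quoted below correspond to mean deviations on the order of 0.012-0.024 seconds per interval.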
  • Distal button - thumb: 0.013 seconds (e.g., about 2%, 3%, or 4% error);
  • Proximal button - middle finger: 0.016 seconds (e.g., about 2%, 3%, or 4% error);
  • Distal button - middle finger: 0.022 seconds (e.g., about 3%, 4%, or 5% error).
  • Distal button - thumb: 0.025 seconds (e.g., about 3%, 4%, or 5% error); Proximal button - middle finger: 0.032 seconds (e.g., about 4%, 5%, or 6% error); Distal button - middle finger: 0.046 seconds (e.g., about 7%, 8%, or 9% error).
  • Proximal button - middle finger: 0.011 seconds (e.g., about 2%, 3%, or 4% error);
  • Distal button - middle finger: 0.015 seconds (e.g., about 2%, 3%, or 4% error).
  • Tests may also assess how intuitive the interface is to use by measuring the ease with which a user can learn to use it to perform particular tasks.
  • the interface may be used to control an emulation of a sustained-tone instrument such as a saxophone, whereby notes are triggered using the digit buttons and the rate of rotation of the interface around its vertical (yaw) axis is used to emulate the effect of blowing intensity (i.e., the force of blowing into a saxophone) on these tones.
  • the user may be required to use the digit buttons to ascend melodically through the C major scale, and while keeping each new note actuated the interface may be swung in a plane approximately horizontal to the ground (from left to right or vice versa) in order to provide that note with a "fully voiced" tone.
  • the user actuates the next note in the scale shortly before each horizontal swing, with each swing moving in the direction opposite to the preceding swing.
  • notes may be assigned to the digit buttons chromatically (e.g., as illustrated in FIG. 12A) or diatonically (e.g., as illustrated in FIG. 12B).
  • the learning time users will require to play a C major scale in the manner prescribed above is expected to be less than approximately 15 minutes (e.g., less than 10, 12, 14, 16, 18, or 20 minutes) in the diatonic configuration and less than 20 minutes (e.g., less than 15, 17, 19, 22, 24, or 26 minutes) in the chromatic configuration.
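A minimal sketch of the sustained-tone ("blowing intensity") mapping described above, assuming the yaw rotation rate is available in degrees per second and that intensity saturates at an assumed full-voice rate; both the units and the threshold are illustrative assumptions, not values from the patent.

```python
# Illustrative sketch (not the patented implementation) of mapping the
# interface's yaw rotation rate to a "blowing intensity" for a
# sustained-tone instrument emulation. The 180 deg/s full-voice
# threshold is an assumed value.


def blowing_intensity(yaw_rate_dps, full_voice_rate=180.0):
    """Map |yaw rate| (degrees/s) to a 0..1 intensity, clipped at full voice."""
    return min(abs(yaw_rate_dps) / full_voice_rate, 1.0)


print(blowing_intensity(0.0))    # stationary: silent/soft tone
print(blowing_intensity(90.0))   # half voiced
print(blowing_intensity(400.0))  # fast swing: fully voiced (clipped)
```

Under this mapping, each horizontal swing of the interface drives the intensity toward 1.0, giving the currently actuated note its "fully voiced" tone.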
  • Tests may also assess the overall convenience of using exemplary embodiments. For example, the duration of time required to fasten an exemplary embodiment to a user's hand may be measured. In such a test a start signal may be given, after which a user must fasten an exemplary embodiment to their hand and actuate a single digit button. For intermediate and expert level users (as defined above) it is expected that they could fasten an exemplary embodiment to their hand and actuate a single digit button in approximately 8 seconds (e.g., about 5, 6, 7, 9, or 10 sec).
  • Other measures of the convenience of exemplary embodiments may include their weight. Exemplary embodiments are anticipated to weigh 200-300 grams (e.g., about 175, 200, 225, 250, 275, 300, or 325 grams). Exemplary embodiments may also be used without interfering with conventional clothing worn by a user.
  • Exemplary embodiments may include a device with 15 finger operated buttons which gives the user rapid, substantially concurrent, and temporally-precise access to 15 discrete output events.
  • Exemplary embodiments may include a device with 3, 5, 7, 8, 12, 13, 14, or 15 finger/thumb operated buttons.
  • the interface may include at least 3, 5, 7, 8, 12, 13, 14, or 15 finger/thumb operated buttons.
  • the user may be provided with access to at least 3, 5, 7, 8, 12, 13, 14, or 15 discrete output events.
  • the buttons may be operated individually and/or in combination to create a harmonic arrangement of triggered notes.
  • the device may be configured to allow the user to move between octaves by changing the orientation of the device around its lateral axis.
  • Exemplary embodiments may provide for a combination of melodic, harmonic, and/or rhythmic capacities with motion, orientation, and/or position sensing that is more precise, repeatable, intuitive, convenient, easy to learn, less costly, or combinations thereof.
  • Exemplary embodiments may provide for a hand-operated device that combines motion, orientation, and/or position sensing with digit (finger and thumb) buttons.
  • the device may include multiple buttons (e.g., 3, 5, 7, 8, 12, 13, 14, or 15 buttons).
  • one or more of the buttons may be designed to be actuated only by the end segments of the digits or by other parts of the digits as well.
  • Exemplary embodiments may include a device with motion, orientation, and/or position sensing and no finger operated buttons or activation points.
  • the motion, orientation, and/or position sensing technology may be embodied in numerous ways.
  • the device may use any combination of acceleration sensing, angular rotation rate sensing, magnetic field sensing, video motion capture, ultrasound, time of flight cameras, etc.
  • the device may combine motion, orientation, and/or position sensing with a "multi-phalangeal" interface.
  • the device may have multiple buttons (for example 3, 5, 7, 8, 12, 13, 14, or 15 buttons) some of which are positioned to be actuated by phalanges other than the distal phalanx (tip of finger).
  • the device may combine motion and/or orientation sensing with a multi-phalangeal interface that has at least 3 touch sensors per finger or at least 3 touch sensors per digit (fingers and thumb). In exemplary embodiments the device may combine motion and/or orientation sensing with a multi-phalangeal interface that has at least 4 touch sensors per finger or at least 4 touch sensors per digit (fingers and thumb). In exemplary embodiments the device may combine motion and/or orientation sensing with a multi-phalangeal interface that has at least 2 touch sensors per finger or at least 2 touch sensors per digit (fingers and thumb).
  • the device may combine motion and/or orientation sensing with a multi-phalangeal interface that has at least 1, 2, 3, 4, or 5 touch sensors per finger or at least 1, 2, 3, 4, or 5 touch sensors per digit (fingers and thumb). Certain embodiments may have different combinations of touch sensors per finger or touch sensors per digit (fingers and thumb). For example, at least one digit may have 4 touch sensors and at least one digit may have 2 sensors. Other combinations are also contemplated.
  • a hand operated input device including a series of activation points activated by the fingers and/or a thumb of a user; a positioning component measuring a current motion, orientation, and/or position of the device, and a processor interconnected to the activation points and the positioning component for outputting a series of currently active activation points and the current motion, orientation, and/or position of the input device.
  • the number of activation points per finger and/or thumb may be at least 2. In exemplary embodiments, the activation points may be spaced apart from one another for interaction with different portions of a user's finger and/or thumb.
  • the number of activation points per finger may be at least 3.
  • a series (e.g., at least 2 or 3) of activation points may also be accommodated for the thumb.
  • the positioning component may include one or more orientation sensors for sensing the rotational orientation of the device.
  • orientation sensors may output a roll, pitch and/or yaw angle of the device.
  • the positioning component may include one or more angular rate sensors for sensing the rate of angular rotation of the device.
  • the positioning component may include position sensors either internal or external to the device which sense the position of the device.
  • the device may include a weighted elongated portion counterbalancing the activation points when in use by a user.
  • the relative position of the activation points may be adjustable for each finger.
  • the activation points may be formed from switches that can be actuated by a finger or thumb.
  • the processor may be wirelessly interconnected to an external device.
  • the interconnection may also be a wired connection or an infrared connection.
  • the activation points may be actuated either individually or in combination with other activation points.
  • the distal, medial or proximal activation points assigned to different fingers may be actuated at the same time or at substantially the same time.
  • the distal and proximal activation points assigned to the same finger may be actuated at substantially the same time, or the distal and medial activation points assigned to the same finger may be actuated at substantially the same time.
  • the systems, devices, and methods may be utilized as a music input device.
  • the activation points may be mapped to notes on a chromatic or diatonic scale; one axis of the orientation of the device can be mapped to a series of zones that control the octave of a note's pitch; one axis of the orientation of the device can be used to control gradated pitch; one axis of the orientation of the device can be used to control one or more sound effects; one axis of the orientation of the device can be used to control the rate of playback of audio or video samples; and one axis of the orientation of the device can be used to control audio volume.
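As an illustrative sketch of one of these mappings, the division of one orientation axis into octave-controlling zones might look as follows. The zone width, axis, and base octave are assumptions for illustration, not values specified in the patent.

```python
# Sketch of one orientation axis of the device divided into fixed-width
# zones, each zone selecting the octave of a note's pitch. A 30-degree
# zone width and base octave of 4 are assumed values.


def octave_from_pitch_angle(pitch_deg, zone_width=30.0, base_octave=4):
    """Map an orientation angle to an octave via fixed-width zones
    centred on multiples of zone_width."""
    zone = int((pitch_deg + zone_width / 2) // zone_width)
    return base_octave + zone


print(octave_from_pitch_angle(0.0))    # level: base octave
print(octave_from_pitch_angle(35.0))   # tilted up: one octave higher
print(octave_from_pitch_angle(-40.0))  # tilted down: one octave lower
```

Combined with the button-to-note mapping, this lets a small set of digit buttons address several octaves of pitches by reorienting the device.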
  • each device may include a series of activation points activated by the fingers of a user; a positioning component measuring a current motion, orientation, and/or position of the device; and a processor interconnected to the activation points and the positioning component for outputting a series of currently active activation points and the current motion, orientation, and/or position of the input device.
  • at least one additional processor may be interconnected to the processor of each device for calculating a differential output between at least two hand-operated input devices.
  • Exemplary embodiments may relate to a hand operated input device comprising a plurality of modules, each module being configured for operation by a digit (finger or thumb) of a user; a plurality of activation points configured to be activated by the digits of the user; at least one positioning component sensor for measuring a motion, position, or orientation value of the input device; and a processor interconnected to the activation points and the positioning component sensor for outputting a series of currently active activation points and the motion, position, or orientation value of the input device.
  • each of the plurality of modules comprises at least one activation point capable of being modulated by a distal portion of a finger, a medial portion of a finger, or a proximal portion of a finger; and the activation points are mapped to musical notes. Certain embodiments may have various combinations of modules and activation points.
  • FIG. 1 illustrates an exemplary embodiment of an interface from a front left perspective
  • FIG. 2 illustrates an exemplary embodiment of an interface from the front right perspective
  • FIG. 3 illustrates an exemplary embodiment of an interface from a lower left side perspective
  • FIG. 4A illustrates an exemplary embodiment of a single finger digit array from a front left perspective in isolation
  • FIG. 4B, 4C, and 4D illustrate an exemplary embodiment of a single finger digit array from a left perspective in isolation and methods of actuating finger digit buttons using a finger;
  • FIG. 5 illustrates an exemplary embodiment of a single finger digit array from the rear right side perspective in isolation with the side panels of the proximal and distal enclosures removed, and the top section of the medial enclosure removed;
  • FIG. 6 illustrates an exemplary embodiment of a digit array track and a digit array track connector in isolation from a front-left perspective
  • FIG. 7A illustrates an exemplary embodiment of a thumb digit array in isolation from a lower left perspective with the lower portion of the thumb digit array's enclosure housing removed;
  • FIG. 7B, 7C, and 7D illustrate an exemplary embodiment of a thumb digit array in isolation from a lower left rotated perspective and methods of actuating thumb digit buttons using a thumb, wherein the independent actuation of the distal thumb, medial thumb, and proximal thumb buttons are illustrated in FIG. 7B, 7C, and 7D respectively;
  • FIG. 8 illustrates in block diagram form an exemplary embodiment of an interface's electronics
  • FIG. 9 illustrates in block diagram form an exemplary embodiment of a program that may be used by the digit button sensor relay component of the electronics
  • FIG. 10 illustrates in block diagram form an exemplary embodiment of the actuation sequence filter subroutine referred to in FIG. 9;
  • FIG. 11 illustrates in block diagram form an exemplary embodiment of a program that may be used by the processor component of the electronics
  • FIG. 12A and 12B illustrate exemplary assignments of tone pitches to interface digit buttons
  • FIG. 13A illustrates an exemplary embodiment of an interface from a lower perspective in which electronics from the rear enclosure are placed in the palm enclosure and the rear enclosure is absent;
  • FIG. 13B illustrates an exemplary embodiment of an interface from a front left perspective in which the rear enclosure is reduced in size
  • FIG. 13C illustrates an exemplary embodiment of an interface from a lower front left perspective in which the height of the upper surface of the palm enclosure is adjustable
  • FIG. 14A illustrates an exemplary embodiment of an interface from an upper perspective illustrating a strap and clasp hand fastening mechanism
  • FIG. 14B, 14C, and 14D illustrate an exemplary embodiment of an interface from an upper perspective that includes an attachment mechanism comprising material that stretches over the hand, is threaded under a buckle, and attaches back on to the material on the back of the hand or wrist;
  • FIG. 14E illustrates an exemplary embodiment of an interface from right perspective in which the upper surface of the palm enclosure is shown in isolation and includes air ventilation holes.
  • FIG. 15 illustrates an exemplary embodiment of a single finger digit array from a front left perspective in isolation without a distal button
  • FIG. 16 illustrates an exemplary embodiment of a single finger digit array from a front left perspective in isolation without a medial button
  • FIG. 17 illustrates an exemplary embodiment of a single finger digit array from a front left perspective in isolation without a proximal button
  • FIG. 18 illustrates an exemplary embodiment of a thumb digit array from a left perspective in isolation without a medial button
  • FIG. 19 illustrates an exemplary embodiment of a thumb digit array from a left perspective in isolation without a proximal button
  • FIG. 20 illustrates an exemplary embodiment of a thumb digit array from a left perspective with a medial button positioned on the outside of the thumb rather than on the inside of the thumb;
  • FIG. 21 illustrates an exemplary embodiment of an interface from a front left perspective without a thumb digit array
  • FIG. 22 illustrates an exemplary embodiment of an interface from a front left perspective without a little finger digit array
  • FIG. 23 illustrates an exemplary embodiment from a lower perspective of an interface including a speaker under the palm allowing sound production of audio synthesized on the interface;
  • FIG. 24 illustrates an exemplary embodiment of an interface from a lower perspective including a speaker on the rear enclosure allowing sound production of audio synthesized on the interface;
  • FIG. 25 illustrates an exemplary embodiment of an interface from a front left perspective wherein the digit array positions are fixed relative to each other;
  • FIG. 26A and 26B illustrate, from a front left perspective, exemplary embodiments of an interface with only five or nine buttons, which may nonetheless contain the same electronics as interfaces described elsewhere in this specification;
  • FIG. 27A, 27B, 27C, 27D, and 27E illustrate exemplary embodiments of an interface from a front left perspective
  • FIG. 28A and 28B illustrate an exemplary embodiment of a gaming functionality achievable with exemplary embodiments of the methods, devices, and systems described herein;
  • FIG. 29A illustrates an exemplary embodiment of a gaming functionality achievable with exemplary embodiments of the methods, devices, and systems described herein;
  • FIG. 29B illustrates an exemplary embodiment of a gaming functionality achievable with exemplary embodiments of the methods, devices, and systems described herein;
  • FIG. 30A is an exemplary embodiment of components involved in achieving gaming functionality
  • FIG. 30B is an exemplary embodiment of content involved in achieving gaming functionality
  • FIG. 31 illustrates an exemplary embodiment of components involved in achieving audio control functionality
  • FIG. 32 illustrates an exemplary embodiment of components involved in achieving gradated pitch control functionality
  • FIG. 33 illustrates an exemplary embodiment of components involved in manipulating audio and/or visual content
  • FIG. 34 illustrates an exemplary embodiment of algorithms involved in manipulating audio and/or visual content.
  • Exemplary embodiments may include a device with 15 digit-operated buttons, which give the user rapid and rhythmically precise access to 15 notes.
  • the buttons may be operated individually and/or in combination (thereby creating melody and/or harmony).
  • the device may be configured to allow the user to move between octaves by changing the orientation of the device around its lateral axis.
  • Exemplary embodiments may provide for a combination of melodic, harmonic, and/or rhythmic capacities with a motion and/or orientation sensing that is more precise, repeatable, intuitive, convenient, and easier to learn.
  • Access to at least 13 pitches means the user may be able to play through all, or substantially all, the notes of standard divisions of an octave, for example the "western" chromatic scale.
  • a user can access most or all the diatonic scales derived from the chromatic scale (e.g., major and minor scales) without needing to change the assignment of notes to the interface. Due to this consistency, combined with the temporal precision and repeatability of note-triggering, exemplary embodiments provide a highly effective and easy-to-learn musical controller system.
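The note access described above can be sketched in code. The following is a hypothetical illustration only: the button-to-semitone assignment, base note, and tilt thresholds are assumptions for the sketch, not values taken from this specification.

```python
# Hypothetical mapping of the 15 digit buttons to consecutive semitones,
# with the device's orientation around its lateral (pitch) axis selecting
# the octave. Base note and tilt thresholds are illustrative assumptions.

BASE_MIDI_NOTE = 60  # middle C (C4), an assumed reference pitch

def octave_offset(pitch_angle_deg: float) -> int:
    """Map device pitch orientation to an octave shift (assumed thresholds)."""
    if pitch_angle_deg > 30:
        return 1    # tilted up: shift one octave higher
    if pitch_angle_deg < -30:
        return -1   # tilted down: shift one octave lower
    return 0

def button_to_midi(button_index: int, pitch_angle_deg: float) -> int:
    """Convert a button index (0-14) to a MIDI note number."""
    if not 0 <= button_index <= 14:
        raise ValueError("the interface has 15 digit buttons")
    return BASE_MIDI_NOTE + button_index + 12 * octave_offset(pitch_angle_deg)
```

Under this assumed assignment, buttons 0 and 12 differ by exactly one octave, which illustrates why access to at least 13 pitches covers a full chromatic octave.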
  • locations on the human hand and arm mentioned in the following description refer to an anatomical position of the right arm in which the upper arm hangs parallel to the upright body with the elbow bent and with the forearm and hand horizontal to the ground and pointing forwards.
  • the forearm is pronated such that the palm of the right hand is facing the ground at a slight angle (i.e., with the palm lifted up slightly towards the user's body).
  • a variety of angles may be used, and for this exemplary embodiment an angle of approximately 25 degrees from the ground plane is prescribed.
  • this anatomical position will be referred to as the "neutral operating position".
  • Other exemplary embodiments may use pronation angles of -30, -15, 0, 15, 30, 45, 60, 75, 90 (thumb pointing up), 105, 120, or 135 degrees.
  • the device's axes of roll, pitch, and yaw are defined approximately relative to the user's hand: With fingers outstretched in the same plane as the palm, rotating the hand and forearm around the axis of the middle finger is defined as rotating within the roll plane (i.e., rotating around the longitudinal axis). Bending at the elbow is defined as rotating within the pitch plane (i.e., rotating around the lateral axis). Perpendicular to both the roll and the pitch planes is the yaw plane (i.e., the vertical axis).
  • pitch may be used in the sense of the pitch of a sound as it is perceived by a listener, rather than as a strict reference to the fundamental frequency of a sound.
  • pitch is largely synonymous with the term "note” (for example, a pitch of C is meant to refer to the note C in any octave).
  • Scientific pitch notation may also be used to describe both pitch and octave.
  • the pitch A4 refers to the note A in octave number 4.
  • the term continuous may be used in reference to sensor measurements, and is intended to describe sensor values that have more than one value over time and are substantially gradated in character.
  • Exemplary embodiments of a hand-operated device are illustrated in FIG. 1 to FIG. 12B. These exemplary embodiments are designed to interact with the right hand of the user, and the terms "left" and "right" used in this description are also defined relative to the user. However, it should be readily understood that the embodiments described herein are not limited to right-hand devices. Methods, devices, and systems described herein may also be used with the left hand or with both hands. In exemplary embodiments, the device may be constructed to be used interchangeably with the left and right hands.
  • FIG. 1 illustrates an exemplary interface from a front-left perspective.
  • Four modules 110, 111, 112, and 113, referred to as "finger digit arrays", are positioned for operation by the little finger (110), ring finger (111), middle finger (112), and index finger (113) of the user's right hand respectively.
  • Each finger digit array is connected to the rest of the structure by a rail or track 114 (the "digit array track").
  • This track is connected to a region of the structure, referred to as the "palm enclosure” 115, which is designed to sit under the palm of the user's hand.
  • A module referred to as the "thumb digit array" 118 is positioned for operation by the thumb.
  • Attached to the right-hand side of the palm enclosure 115 of this exemplary embodiment and reaching over the top of the user's hand is a "palm clasp" 116. Attached to the left-hand side of the palm enclosure 115 and reaching over the top of the user's hand is a "hand strap" 117.
  • the section of the hand strap attached to the palm enclosure may be flexible and elastic.
  • the lower surface of the opposite end of the hand strap attaches to the upper surface of the palm clasp 116.
  • a variety of different mechanisms may be used to attach the hand strap to the palm clasp, including means like press studs or buckles, etc.
  • a hook and loop mechanism may be used, and, in exemplary embodiments, the areas of the hand strap and palm clasp covered by the hook and loop mechanism may be made sufficiently large to allow the attachment position to be varied while maintaining a secure attachment. In exemplary embodiments, this variation may allow the tightness of the attachment of the interface to the hand to be adjusted; however, additional tightness adjustment means may also be used.
  • Sitting inside the palm clasp of this exemplary embodiment is a soft, detachable cushioning section 119, referred to as the "hand clasp spacer".
  • Located behind the palm enclosure 115 is the "rear enclosure" 120.
  • a power switch 121 for turning the electronics of the interface on and off may be located on the rear enclosure.
  • the rear enclosure may be angled slightly downwards away from the plane formed by the top of the palm enclosure, which may assist in preventing the rear enclosure from colliding with the user's forearm if the wrist is flexed. As it descends from the palm enclosure, the rear enclosure may also fall slightly rightwards (relative to the palm enclosure). In exemplary embodiments, this angle may be such that when the hand and arm are in the neutral operating position, the rear enclosure of the interface lies beneath the forearm (rather than to its left or right).
  • FIG. 2 illustrates an exemplary embodiment of an interface from a front-right perspective.
  • a data cable port 210 (e.g., USB, MIDI, Firewire, Thunderbolt, or another suitable connector type)
  • the hand clasp spacer 119 may be held in place by a protrusion 211 which projects into a frame formed by the palm clasp 116.
  • the hand clasp spacer may be swapped-out for a different-sized spacer that projects more or less leftwards into the area above the palm enclosure 115, or the spacer may be removed entirely.
  • an opening 212 at the front of the palm enclosure may act as a recess for the rear-most sections of the finger digit arrays (110, 111, 112, and 113).
  • FIG. 3 shows an exemplary embodiment of an interface from a lower-left side perspective.
  • Three buttons may be located on the thumb digit array 118: a "distal" thumb button 310, a "medial" thumb button 311, and a "proximal" thumb button 312.
  • the underside of the rear enclosure 120 may include a socket for receiving a power cable 314.
  • Illustrated in FIG. 4A is an embodiment of a finger digit array, from a front-left perspective, in isolation from the rest of the exemplary interface.
  • the finger digit array may include a distal finger button 410, a medial finger button 411, and a proximal finger button 416.
  • the medial finger button may be mounted in a combined structure formed by a "medial" enclosure 412 and the rear portion of the distal finger button 410.
  • the distal finger button may be mounted in a "distal" enclosure 413.
  • some or all of the finger digit arrays may not be identical. For example, they may be different sizes and/or include different buttons. For both the fingers and the thumb, each button may be referred to as a "digit button", and may include a "digit button sensor" or "button sensor".
  • the distal enclosure may be mounted on a "distal" shaft 414, such that the distal enclosure can slide up and down, as well as around, the distal shaft.
  • the distal shaft may be connected to a "proximal” enclosure 415, and the proximal enclosure may also be the structure in which the proximal finger button 416 is mounted.
  • the proximal enclosure may be connected to a "proximal” shaft 417.
  • the exposed rear portion of the proximal shaft may be mounted in a "digit array track connector" 421, such that the proximal shaft can slide in and out of, as well as rotate within, the digit array track connector.
  • Threaded into a cylindrical "digit array track connector clamp" 418 may be a "connector bolt" 420, and under the head of the bolt may be a washer 419.
  • the upper end of the connector bolt may interface with, and can be tightened/loosened by, an appropriately sized Allen or Hex key.
  • a variety of methods for tightening and loosening the connector bolt may be used, including, for example, an outward protruding key head on the bolt that is accessible to, and can be manipulated by, the user's fingers, or conventional screws, etc.
  • FIG. 5 illustrates an exemplary embodiment of a finger digit array in isolation from a rear right side perspective, with side sections of the proximal and distal enclosures removed, as well as the top section of the medial enclosure removed.
  • the proximal shaft 417 and the distal shaft 414 may both be hollow, allowing electrical wiring to enter the digit array at the rear-end 510 of the proximal shaft and exit at a portal 512 within the proximal enclosure or a portal 520 in the distal enclosure.
  • a threaded bolt 511 may extend through the underside of the tubular section of the digit array track connector 421 (bolt thread not shown in figure).
  • a rubber plug may be attached to the inner end of this bolt that makes contact with the proximal shaft; thus, screwing the bolt inwards may act to immobilize the proximal shaft relative to the digit array track connector.
  • a threaded bolt 515 may extend through the underside of the distal enclosure 413 (bolt thread not shown in figure), and screwing the bolt inwards may act to immobilize the distal enclosure relative to the distal shaft.
  • each of these bolts may interface with, and can be tightened/loosened by, an appropriately sized Allen or Hex key.
  • Alternatively, a variety of methods for tightening and loosening these bolts may be used, including, for example, a large outward-protruding key head on the bolt that is accessible to, and can be manipulated by, the user's fingers, or conventional screws, etc.
  • a "proximal" microswitch 513 may be positioned for actuation by the proximal finger button 416.
  • the microswitch may be used to provide operating and/or return force for the button, and/or haptic feedback indicating the trigger point has been reached. In exemplary embodiments, this may be the case for all (or at least some) of the microswitches and their respective buttons used in the finger and thumb digit arrays.
  • Axle protrusions from the proximal enclosure housing may be inserted into an axle cavity 514 and its matching axle cavity on the other side of the proximal finger button. These components would form an axle mechanism around which the proximal finger button rotates during its actuation.
  • a method of reducing the relative force transmitted to the axle mechanism by the actuating finger may be used. For example, as can be seen in FIG. 5, the height of the proximal button above the axle cavity 514 is reduced relative to the rear portion of the button. As a result, more of the force of the actuating finger may be translated into the rear of the button than the front axle area, thereby making the button easier to actuate.
  • the overall height of the button can also be adjusted with a removable "button cover" 516.
  • This cover may slide over the top of the proximal finger button and be kept in place using standard methods (e.g., by friction between the cover and the button resulting from a tight fit, or a clipping mechanism formed by overhanging sections of the cover, etc.). Once in place, the cover may allow normal operation of the button, but with the contact surface now being closer to the actuating finger.
  • a "medial" microswitch 517 may be positioned for actuation by the medial finger button 411.
  • the medial finger button axle protrusion 519 and its matching axle protrusion on the lower side of the medial finger button may insert into axle cavities in the medial enclosure housing and the top of the distal button 410. These components would form an axle mechanism around which the medial finger button rotates during its actuation.
  • the medial finger button may use the force-to-axle reduction method described for the proximal finger button above.
  • a "distal" microswitch 521 may be positioned for actuation by the distal finger button 410.
  • the distal finger button axle protrusion 518 and its matching axle protrusion on the other side of the distal finger button may insert into axle cavities in the distal enclosure housing.
  • these components may form an axle mechanism around which the distal finger button rotates during its actuation.
  • actuation of the distal finger button would also rotate the medial enclosure and its components around the distal finger button's axle mechanism.
  • the medial finger button's finger-contact area may be relatively thin (as measured between its top and bottom edges) and/or rounded.
  • the finger-contact area of the distal finger button may be relatively long, as measured from its axle mechanism to its front edge.
  • the three microswitches on the finger digit array may be orientated in such a way that their hinges are positioned towards the axles of their respective buttons, thus the microswitch levers would actuate in the same arc as their respective buttons.
  • the positive, ground, and signal wires from the medial microswitch 517 may descend through a cavity in the distal finger button into the distal enclosure 413.
  • the positive and ground connections of the medial and distal microswitches may be combined, and the positive, ground, and two signal wires may enter the distal shaft via a wiring portal 520.
  • the signal wires from the distal and medial microswitches may extend back through the distal and proximal shafts to the wiring portal 510.
  • the positive and ground connections of some or all three microswitches may be combined in the proximal enclosure and, combined with the signal wire of the proximal microswitch, extend back through the proximal shaft to the wiring portal 510.
  • FIG. 6 illustrates an exemplary embodiment of the digit array track 114 and a digit array track connector 421 in isolation from a front-left perspective.
  • there may be a recessed fin section 610 within the digit array track against which the lower face of the connector bolt washer 419 and the upper face of the connector clamp 418 press.
  • the connector bolt 420 may pass through a channel 611 running between the fin parts on either side. Tightening the connector bolt would press the washer and the connector clamp against the fin parts 610, effectively immobilizing the digit array track connector's location and orientation on the digit array track.
  • FIG. 7A illustrates an embodiment of the thumb digit array in isolation from below, with the lower portion of the thumb digit array's enclosure housing removed.
  • the medial thumb button 311 may have an axle protrusion 710. This protrusion, and its matching axle protrusion on the other side of the medial thumb button, would insert into axle cavities in the thumb digit array enclosure housing. These components would form an axle mechanism around which the medial thumb button rotates during its actuation.
  • a "medial" thumb microswitch 711 may be positioned for actuation by an extension 712 of the medial thumb button.
  • the extension is on the opposite side of the medial thumb button's axle mechanism, thus actuating (depressing) the medial thumb button would rotate the extension towards the medial thumb microswitch.
  • the microswitch may be oriented such that the tip of its lever makes contact with the extension and the hinge of the microswitch is positioned towards the left of the interface (which in FIG. 7A is also towards the left of the figure), thus the microswitch lever would actuate in an arc orthogonal to that of the extension.
  • a "distal" thumb microswitch 713 may be positioned for actuation by the distal thumb button 310.
  • the distal thumb button axle protrusion 714, and its matching axle protrusion on the other side of the distal thumb button, may insert into axle cavities in the thumb digit array enclosure housing. These components would form an axle mechanism around which the distal thumb button rotates during its actuation.
  • the distal thumb microswitch may be orientated in such a way that its hinge is positioned towards the axle of the distal thumb button (i.e., towards the right of FIG. 7A), thus the microswitch lever would actuate in the same arc as the distal thumb button.
  • a "proximal" thumb microswitch 715 may be positioned for actuation by the proximal thumb button 312.
  • the proximal thumb button axle protrusion 716 and its matching axle protrusion on the other side of the proximal thumb button 312 may insert into axle cavities in the thumb digit array enclosure housing. These components would form an axle mechanism around which the proximal thumb button rotates during its actuation.
  • the proximal thumb microswitch may be orientated in such a way that its hinge is positioned towards the axle of the proximal thumb button (i.e., towards the right of FIG. 7A), thus the microswitch lever would actuate in the same arc as the proximal thumb button.
  • the proximal thumb button may use the force-to-axle reduction method described for the proximal finger and medial finger buttons above. While not illustrated in FIG. 7A, this button may also incorporate a removable button cover (as described for the proximal finger button above) to adjust the distance of the contact surface of the button from the thumb.
  • the rear enclosure 120 is designed to house electronics and to use the weight of these electronics and its own structure to act as a counterweight against the weight of the interface's sections that are positioned in front of the user's wrist.
  • This counterweight effect can be used to modify or eliminate the muscular activity required by the user wearing the interface to keep their wrist straight in the neutral operating position (as defined earlier).
  • The concept of a "balance point" (i.e., the place from which the interface can be suspended and remain in balance) may be utilized.
  • the balance point may lie approximately at the middle of the user's palm (i.e., approximately the middle of the palm enclosure 115). In exemplary embodiments that have no rear enclosure (see FIG. 13A) or a relatively short rear enclosure (see FIG. 13B), the balance point may be closer to the front of the interface. For uses in which it is desirable to move the balance point back (further towards the rear), exemplary embodiments may include additional weight in the rear enclosure and/or distance weight in the rear enclosure further away from the wrist, possibly by extending the rear enclosure.
  • some or all of the additional electronics may be located in the rear enclosure. In exemplary embodiments, some or all of the additional electronics may be located elsewhere in the interface or not be located in the interface at all. In each of these alternatives, however, the electronics may perform the following tasks. One task may be to convert the signals coming from the digit buttons into a single digital data stream that can be passed on to another device in a useful form. Another task may be to measure the interface's motion, orientation, and/or position and pass these measurements on to another device in a useable form.
  • FIG. 8 illustrates a functional block diagram of an exemplary embodiment of electronics that may be used in conjunction with the structure described herein.
  • signals from the digit button sensors 811 may be passed on to a relay 812 that has multiple input channels.
  • This relay may convert these multiple input signals into a single digital data stream which is passed on to a processor 817.
  • a variety of devices may perform the functions required of this relay, including a microcontroller.
  • the button sensor relay may supply the required positive and ground connections as well as the required signal.
  • the button sensor relay may also be able to pass on the collected digit button data via an output port (e.g., a TX pin).
  • An example of the type of algorithm that may be employed by the button sensor relay to perform its task is illustrated in FIG. 9 and described below.
  • Also illustrated in FIG. 8 are the electronics of this exemplary embodiment that may be used to measure the interface's motion and orientation.
  • These components include three types of sensors: (1) a sensor that measures the interface's dynamic and static gravity acceleration in one, two, or three dimensions 814 (e.g., an accelerometer), (2) a sensor that measures the angular rate of the interface's rotation around at least one of the pitch (lateral), yaw (vertical), and roll (longitudinal) axes 815 (e.g., a gyroscope), and (3) a sensor that measures magnetic fields around the interface in one, two, or three dimensions 816 (e.g., a magnetometer).
  • the data from these three sensor types may then be passed on to the processor 817 that can convert the data into a form that is appropriate for transmitting to an internal wireless link 818.
  • a variety of devices for performing the functions of these sensors (814, 815, and 816) and the processor 817 are available. For example, in an exemplary embodiment an integrated inertial measurement unit 813 comprising a microcontroller, one or more accelerometers, one or more gyroscopes, and one or more magnetometers may be suitable.
  • the unit may be able to receive data from the button sensor relay 812 via an input serial port (e.g., an RX pin). This unit may also be able to process and pass its accelerometer/gyroscope/magnetometer data, along with the digit button data, on to the internal wireless link 818 via an output port (e.g., a TX pin).
  • these sensors may be oriented within the rear enclosure such that they are approximately horizontal to the ground when the interface is in its neutral operating position.
  • While FIG. 8 shows sensors 814, 815, and 816, in exemplary embodiments various combinations of these sensors may be present. For example, in exemplary embodiments, only sensor 814, 815, or 816 may be present. In exemplary embodiments, sensors 814 and 815 may be present, or 814 and 816, or 815 and 816.
  • FIG. 8 illustrates the wireless link 818 internal to the interface 810.
  • This internal wireless link may be configured to wirelessly transmit the combined digit button and motion/orientation/position sensor data to a wireless link 819 that is external to the interface.
  • This external wireless link may then transfer the data it has received to a recipient device 820.
  • any number of wireless systems would be suitable for acting as the internal and external wireless links.
  • additional standard components may be utilized to pass data to and from these wireless links in an appropriate form.
  • the wireless link components 818 and 819 may be made additionally capable of transferring data from the recipient device to the interface.
  • Data from the interface may be used by any number of devices, and in exemplary embodiments the recipient device 820 shown in FIG. 8 may be a computer or mobile computing device. In such embodiments the recipient device may receive the interface's data via a cabled connection (e.g., USB, MIDI, Firewire, or Thunderbolt) from the external wireless link 819, and may be running music software. The data received from the interface may be used to control aspects of this software, the playing of software-based musical sounds being but one example. This software may be one of the many music programs available.
  • the external wireless link may perform whatever conversion is required to make the interface's data useable by the computer.
  • the external wireless link may act as a MIDI or OSC device that converts the interface's data to MIDI or OSC data that may then be used by the recipient device's software using standard methods.
  • the external wireless link may provide the data in another format (e.g., using the cabled connection as a serial port) and an additional program may be installed on the recipient device for accessing this data and providing it to be used by other programs on the recipient device.
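As a purely assumed illustration of such a conversion, an external link acting as a MIDI device might translate a tagged button value (using the relay's encoding, in which values 0-14 report releases of buttons 1-15 and values 15-29 report presses) into a 3-byte MIDI message. The note assignment and velocity below are illustrative choices, not part of this specification.

```python
# Assumed sketch: translate a tagged button value from the interface into a
# 3-byte MIDI note-on/note-off message. The tagged encoding (0-14 = release
# of buttons 1-15, 15-29 = press) follows the relay scheme described in this
# specification; base note and velocity are illustrative only.

def tagged_value_to_midi(value: int, base_note: int = 60,
                         channel: int = 0) -> bytes:
    """Build a MIDI note-on (0x90) or note-off (0x80) channel message."""
    actuated = value >= 15
    button_index = value - 15 if actuated else value
    status = (0x90 if actuated else 0x80) | channel
    velocity = 100 if actuated else 0
    return bytes([status, base_note + button_index, velocity])
```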
  • the user may also have the option of using a left- handed version of the interface (which may be essentially a mirror image of the right-handed version) or using right- and left-handed interface versions simultaneously.
  • the data from the two interfaces may be passed on to the recipient device 820 (see FIG. 8) via a substantially similar external wireless link 819.
  • an extra type of data may also be generated through a comparison of the actions of the two interfaces.
  • algorithms for processing such comparative data may be included in a program running on the recipient device, or by an additional processing component included on the external wireless link.
  • Examples of data comparisons include differences in orientation on the lateral, longitudinal, and/or vertical axes of each interface (i.e., pitch, roll, and yaw) or differences in rates of rotation on these axes; differences in choice and timing of button actuation; or differences in relative position (in exemplary embodiments that measure relative position).
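A minimal sketch of one such comparison follows; the dictionary structure and function name are assumptions for illustration only.

```python
# Assumed sketch: per-axis orientation difference between two interfaces,
# given each interface's estimated orientation in degrees.

def orientation_difference(left: dict, right: dict) -> dict:
    """Return right-minus-left differences on the pitch, roll, and yaw axes."""
    return {axis: right[axis] - left[axis] for axis in ("pitch", "roll", "yaw")}
```

Such a difference signal could, for example, be used to modulate a sound parameter as the two hands diverge in orientation.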
  • the two interfaces may communicate with each other directly via wireless link, allowing them to directly compare their data. Direct communication may be especially useful for interfaces that produce their own sound or provide the user with haptic feedback, and in these exemplary embodiments such a data comparison may be used to control or modulate the output of these features.
  • a battery 821 may provide some or all the electricity required by the interface's electronics, the supply of which might be gated by a power switch 121 (see FIG. 1). Depending on the battery's voltage, standard methods of voltage conversion may be required for supplying an appropriate voltage to the interface's components. While a variety of battery types can be used, in exemplary embodiments the battery may be a rechargeable lithium polymer type, which can be charged by a standard charging device (using conventional methods of power supply) that is connected to the external power socket 314 (see FIG. 3). Alternatively, a replaceable battery system may be used, with a standard apparatus for swapping the battery/batteries in and out of the rear enclosure.
  • An external port 822 may be incorporated in exemplary embodiments of the interface.
  • This port, which may connect to an external data cable, may be used for data communication with, and updating the software of, the processor 817 and/or the button sensor relay 812.
  • a USB connector 210 may act as the connector for port 822, however, other connector types may be used including MIDI, Firewire, Thunderbolt, or another suitable connector type.
  • a cable connected to the port 822 can act as the communication link to the recipient device 820 and perform the task of the wireless components 818 and 819. This cable can also supply power to the interface from the recipient device, to power the interface's electronics and/or to charge its battery.
  • a cable-dependent interface requiring no onboard battery and/or wireless link system may be implemented.
  • An exemplary embodiment of an algorithm that may be performed by the button sensor relay 812 (see FIG. 8) is illustrated in FIG. 9.
  • this algorithm may be utilized to collate the signals from the multiple digit button sensor inputs to the relay, and report digit button sensor state changes to the processor 817 via a single data-channel.
  • button 1 may be represented as unactuated with a value of 0 and actuated with a value of 15, while button 2 may be represented as unactuated with a value of 1 and actuated with a value of 16, and so on.
  • a filtering step 914 then takes place which will be described in detail in the next section.
  • the new tagged state value of button X is then passed on (915) to the next component, which in this embodiment is the processor 817 (see FIG. 8).
  • the program then iterates to X+1 and returns to step 910.
  • the forms and positioning of the distal finger button 410 and proximal finger button 416 (see FIG. 4A) belonging to the same digit array may allow their assigned finger to actuate them either individually or in combination with each other. This is also the case for the distal finger button and medial finger button 411 belonging to the same digit array.
  • the actuation sequence filter 914 shown in FIG. 9 may allow the output events assigned to the medial and proximal finger buttons of a digit array to be used in combination with each other through specific sequences of button actuation. By doing so, every possible combination, or a substantial number of possible combinations, of simultaneous "on" signals among a finger digit array's three buttons becomes possible. A detailed description of how this functionality can be used is provided herein.
  • the actuation sequence filter can also be applied to signals originating from the thumb digit array, but in exemplary embodiments, this may be less necessary as thumb button combinations can be achieved manually by some or all users.
  • This actuation sequence filter subroutine may be achieved via a variety of methods, and an exemplary embodiment is illustrated in FIG. 10.
  • The subroutine begins when a new button state is received, and it checks whether the new state belongs to any of the distal finger buttons (1010). If not, the new data is passed out of the subroutine (1011), without filtering, to the next stage of the program (915) illustrated in FIG. 9. If the new state was triggered by a distal button, the subroutine checks whether the stored state of the proximal button belonging to the same digit array is actuated (1012).
  • If it is, the filter will "hold" the report of the proximal button changing to an unactuated state, but will pass on the most recent such "held" report when the distal button of that digit array is unactuated (1013). Meanwhile, the actuated state of the distal button is passed out of the subroutine (1011). If the proximal button is not actuated, the subroutine checks whether the stored state of the medial button belonging to the same digit array is actuated (1014). If yes, the filter will hold the report of the medial button changing to an unactuated state, but will pass on the most recent such "held" report when the distal button of that digit array is unactuated (1015).
  • In this case, the report of the distal button being actuated will not be passed on, and no reports of its actuation will be passed on until the distal and medial buttons are unactuated (1015). After the distal and medial buttons are unactuated, subsequent reports of distal button actuation will be allowed through the filter. If the answer at step 1014 is no, the distal button actuation report is passed out of the subroutine (1011), without filtering, to the next stage of the program (915) illustrated in FIG. 9. The use of this subroutine can be made optional, with its activation being controlled using physical controls on the interface or via commands sent from the recipient device 820 via the wireless link system (see FIG. 8).
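The branching described for FIG. 10 can be sketched as a small per-digit-array state machine. The following Python sketch is one interpretation of the described behavior, not the actual subroutine; the names `emit`, `hold_target`, and `suppress` are assumptions.

```python
class ActuationSequenceFilter:
    """Illustrative sketch of the actuation sequence filter (FIG. 10) for a
    single finger digit array. `emit` stands in for the next program stage
    (step 915 in FIG. 9)."""

    def __init__(self, emit):
        self.emit = emit
        self.state = {"distal": False, "medial": False, "proximal": False}
        self.hold_target = None    # button whose release report is to be held
        self.held_release = False  # True once a release report has been held
        self.suppress = False      # distal pressed after medial: swallow it

    def process(self, button, actuated):
        self.state[button] = actuated
        if button == "distal":
            if actuated:
                if self.state["proximal"]:         # step 1012 -> 1013
                    self.hold_target = "proximal"
                    self.emit(button, True)
                elif self.state["medial"]:         # step 1014 -> 1015
                    self.hold_target = "medial"
                    self.suppress = True           # distal press not reported
                else:
                    self.emit(button, True)        # step 1011: pass through
            else:
                if not self.suppress:
                    self.emit(button, False)
                if self.held_release:              # pass the held report now
                    self.emit(self.hold_target, False)
                self.hold_target, self.held_release = None, False
        elif not actuated and button == self.hold_target and self.state["distal"]:
            self.held_release = True               # hold this release report
        else:
            self.emit(button, actuated)            # step 1011: pass through
        # distal suppression ends once distal and medial are both unactuated
        if self.suppress and not (self.state["distal"] or self.state["medial"]):
            self.suppress = False
```

For example, pressing the proximal button, then the distal button, then releasing the proximal button sustains the proximal "on" signal until the distal button is released.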
  • The accelerometer, gyroscope, and/or magnetometer data may be used to estimate the interface's orientation in at least one of the pitch, roll, and yaw axes.
  • This task may be performed by software running on a processor 817 (see FIG. 8).
  • A technique that utilizes a "direction cosine matrix" may be used, with a program structure like that described in FIG. 11.
  • The initial step in this program is to read the accelerometer, gyroscope, and magnetometer data from the relevant sensors (1110).
  • The current estimates for pitch and roll are then used to compensate for the effect on magnetometer readings of the magnetometer not being orthogonal to the ground, and then a heading is calculated relative to the Earth's magnetic field (1111).
  • The angular rate (i.e., gyroscope) sensor data are then used to update the direction cosine matrix (DCM) values.
  • Corrections are then made to ensure that the estimated reference axes (x, y, and z) for the interface remain orthogonal to each other, then the accelerometer and magnetometer data are used to correct errors that have developed over time in the angular rate-based direction cosine matrix values (1113).
  • The direction cosine matrix values are then translated into estimates of pitch, roll, and yaw.
  • Button and motion/orientation/position data are then outputted (1116) to the internal wireless link 818 (see FIG. 8).
  • A variety of motion/orientation/position data combinations may be outputted to the internal wireless link.
  • The combination may include: button state values; pitch, roll, and yaw orientation values; as well as angular rate of rotation (gyroscope) and acceleration data.
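One plausible shape for a single iteration of the FIG. 11 loop is sketched below in Python with NumPy. The axis conventions, the correction gain `k_corr`, and the use of SVD for the orthogonality correction are all assumptions, and the magnetometer heading correction (step 1111) is omitted for brevity; this is a sketch of the technique, not the device's actual software.

```python
import numpy as np

def dcm_step(R, gyro, accel, mag, dt, k_corr=0.02):
    """One iteration of a direction-cosine-matrix orientation estimator.
    R is the 3x3 DCM, gyro is in rad/s, accel points along measured gravity.
    `mag` is accepted but unused here (yaw correction omitted for brevity)."""
    # Integrate the angular rates into the DCM (small-angle update)
    wx, wy, wz = gyro * dt
    Omega = np.array([[0.0, -wz, wy], [wz, 0.0, -wx], [-wy, wx, 0.0]])
    R = R @ (np.eye(3) + Omega)
    # 1113: keep the estimated x, y, and z axes orthogonal to each other
    U, _, Vt = np.linalg.svd(R)
    R = U @ Vt
    # 1113: nudge the DCM toward the accelerometer's gravity reference to
    # cancel drift accumulated from the angular-rate integration
    g_est = R[2]
    g_meas = accel / np.linalg.norm(accel)
    cx, cy, cz = np.cross(g_est, g_meas) * k_corr
    C = np.array([[0.0, -cz, cy], [cz, 0.0, -cx], [-cy, cx, 0.0]])
    R = R @ (np.eye(3) + C)
    # Translate the DCM into pitch, roll, and yaw estimates
    pitch = -np.arcsin(R[2, 0])
    roll = np.arctan2(R[2, 1], R[2, 2])
    yaw = np.arctan2(R[1, 0], R[0, 0])
    return R, (pitch, roll, yaw)
```

Running this at the sensor sample rate, starting from `R = np.eye(3)`, yields a continuously updated pitch/roll/yaw estimate of the kind described above.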
  • There may be fifteen touch-activated buttons located on the interface, and three buttons may be assigned to each digit (the fingers and thumb). Each of these groups of three buttons, referred to as a "digit array", may be economically positioned along the main plane of flexion of a single digit. As described above, each button (distal, medial, proximal, etc.) may be referred to as a "digit button". As part of the normal operation of the interface, each digit may only be required to interact with one digit array of digit buttons.
  • A digit button may refer to any substantially switch-like mechanism that can be actuated through interaction with one or more of a user's digits, to produce either a binary (on/off) or graduated (i.e., variable beyond two values) output.
  • Exemplary embodiments may employ button mechanisms including but not limited to microswitches (or other electromechanical switches), capacitance and resistance touch switches, photo sensor-based switches, dome and other membrane switches, or Hall effect sensor switches, etc., or combinations thereof.
  • The user's right hand may be placed between the palm enclosure 115 and the hand clasp 116, and the hand strap 117 may be attached to the upper surface of the hand clasp at a position that causes the interface to remain firmly but comfortably attached to the hand despite the arm and hand being moved around in space.
  • The palm may be positioned such that the user's little, ring, middle, and index fingers can comfortably access the buttons on the finger digit arrays 110, 111, 112, and 113, respectively.
  • The user's thumb may be positioned so it can comfortably access the buttons on the thumb digit array 118.
  • The hand clasp spacer 119 may be swapped for one of a different size or removed entirely.
  • The distal finger button 410 and medial finger button 411 may be positioned to be actuated independently or concurrently by contact with the finger.
  • Actuation of the distal finger button may be achieved mainly through flexion at the finger's middle knuckle 423 (proximal interphalangeal joint) and/or base knuckle 424 (metacarpophalangeal joint).
  • Actuation of the medial finger button 411 may be achieved by curling the finger, mainly via flexion at the top knuckle 425 (distal interphalangeal joint) and middle knuckle 423.
  • The proximal finger button (obscured in this image by a "proximal finger button cover" 516 - see FIG. 5) may be positioned to be actuated by the middle segment 426 (intermediate phalanx) and/or base segment 427 (proximal phalanx) of the finger. Actuation of the proximal finger button may be achieved mainly via flexion at the base knuckle 424 and may include extension at the middle knuckle 423 and/or the top knuckle 425.
  • The operation of each finger digit array may be more or less identical, or at least substantially identical, for some or all of the four fingers.
  • Likewise, the operation of one or more finger digit arrays may be more or less identical, or at least substantially identical, for 2, 3, or 4 fingers.
  • The distal thumb button 310 and medial thumb button 311 may be positioned to be actuated independently or concurrently by contact with the thumb, or more specifically, mainly by contact with the thumb's tip segment 717 (distal phalanx). As illustrated in FIG. 7B, independent actuation of the distal thumb button may be achieved mainly through flexion at the top knuckle 718 (distal interphalangeal joint). As shown in FIG. 7C, the medial thumb button 311 may be independently actuated mainly through movement (adduction) of the thumb towards the hand, which may occur mainly through flexion at the base knuckle 719 (metacarpophalangeal joint) and/or the joint connecting the thumb to the hand 720.
  • The proximal thumb button 312 may be positioned to be actuated mainly through contact with the lower surface of the base segment 721 (proximal phalanx) and/or palmar segment 722 (metacarpal) of the thumb.
  • Independent actuation of the proximal thumb button 312 may be achieved mainly through flexion at the thumb's base knuckle 719 and/or the joint connecting the thumb to the hand 720, and may also involve extension at the thumb's top knuckle 718.
  • An advantage of this exemplary arrangement of buttons on the thumb digit array may be that any combination of simultaneous actuation of these buttons is possible through operation with the user's thumb alone.
  • In order for the user to be able to comfortably and effectively operate the digit buttons on the interface, a variety of mechanisms may be present for adjusting the locations and orientations of these buttons.
  • The location of each finger digit array on the digit array track may be adjustable. As illustrated in FIG. 6, this may be achieved by unscrewing the connector bolt 420 until the pressure of the washer 419 and the connector clamp 418 against the channel fin parts 610 is reduced enough for the position of the digit array track connector 421 (and the rest of the digit array) along the length of the available track 114 to be altered.
  • Loosening the connector bolt in this way may also allow the rotation of the digit array track connector, relative to the digit array track, to be adjusted.
  • The track connector can be immobilized again by re-tightening the connector bolt.
  • In exemplary embodiments, further adjustment of the locations and orientations of a finger's digit buttons may be made possible, whereby the user can unscrew the proximal shaft bolt 511 and/or the distal shaft bolt 515.
  • By unscrewing the proximal shaft bolt 511, pressure on the rubber pad lying against the proximal shaft 417 is relieved, and the proximal shaft is able to slide forwards and rearwards within the tubular section of the digit array track connector 421. In so far as may be possible without colliding with the neighboring finger digit arrays, rotation of the proximal shaft within the digit array track connector can also take place.
  • The forms and positioning of buttons belonging to the same digit array may allow these buttons to be actuated either individually or in combination with each other by a single digit.
  • If the buttons are used to trigger musical tones, such combinations would allow specific harmonies to occur, thereby extending the range of harmonies that can be produced beyond that of combinations of buttons belonging to separate digit arrays.
  • The contact surface of the medial finger button 411 is curved and relatively thin (measured between its top and bottom edges), and is mounted on top of the distal finger button 410.
  • The user can, while maintaining actuation of the medial finger button, push down (on the distal and/or medial finger button) and actuate the distal finger button.
  • Conversely, the user can, while maintaining actuation of the distal finger button, pull their finger back and actuate the medial finger button.
  • The distal and proximal finger buttons belonging to the same digit array can also be actuated either individually or in combination with each other by a single digit.
  • The distal button's length means that the user can actuate it with either a partially curled or an outstretched finger. In the latter case, the lower pad of the finger's distal segment (distal phalanx) may make contact at the front end of the button. This posture makes it easier for the user to maintain actuation of the distal button while actuating the proximal button, and vice versa.
  • The user may have the option of having each digit array's sequence of button activation algorithmically interpreted in real-time, or substantially in real-time, to selectively allow the combination of the medial and proximal button output events to occur. This may be achieved using an actuation sequence filter subroutine 914 (see FIG. 9 and FIG. 10).
  • Maintaining actuation of the proximal button while actuating the distal button allows the output signal of the proximal button to be sustained despite the proximal button being released (steps 1010, 1012, and 1013 in FIG. 10).
  • While the distal button remains actuated, the output signals of the distal and proximal buttons will be sustained concurrently. While keeping the distal button actuated, the user can then actuate the medial button, thereby causing the output signals of the distal, medial and proximal buttons to be sustained concurrently.
  • Another feature of exemplary embodiments of the filter is that if the distal button is actuated after the medial button is actuated (while the medial button's actuation is maintained), then the distal button's output signal will not trigger a response (steps 1010, 1014, and 1015). If the medial button is then released while actuation of the distal button is maintained, then the output signal of the medial button will continue uninterrupted. The user can then actuate the proximal button, while keeping the distal button actuated, thereby allowing the output signals of the medial and proximal buttons to be sustained concurrently.
  • The proximal, medial, and distal buttons of the finger digit arrays and thumb digit array may have the principal function of providing discrete on and off signals that can be translated by the recipient device 820 (see FIG. 8) into sounds, such as musical tones.
  • Each of the fifteen digit buttons may be assigned to one of the twelve tones of the chromatic scale, with the remaining three buttons assigned to notes above or below the chosen scale.
  • Alternatively, two octaves of a diatonic scale may be assigned to the fifteen digit buttons. Examples of such arrangements are shown in FIG. 12A and 12B.
  • FIG. 12A illustrates an example of a chromatic arrangement: Starting at a C note on the distal thumb button, the notes ascend first through the distal buttons, then through the proximal buttons, then through the medial buttons, finally reaching a D note (in the octave above) on the medial button of the little finger digit array.
  • FIG. 12B shows an example of a diatonic arrangement (a C major scale): Starting again at a C note on the distal thumb button, the notes ascend first through the distal buttons, then through the proximal buttons, then through the medial buttons, finally reaching a C note (two octaves up) on the medial button of the little finger digit array.
  • Any number of note assignments to the digit buttons is possible (including note assignments derived from scales other than the "western" chromatic scale), and these are not limited to those described herein.
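The two arrangements described above can be written out as button-to-note mappings. The button naming scheme and the use of MIDI note numbers (C4 = 60) below are illustrative assumptions, not part of the specification.

```python
# Button order follows the described ascent: distal, then proximal, then
# medial buttons, each row running thumb -> index -> middle -> ring -> little.
BUTTONS = [f"{row}_{digit}"
           for row in ("distal", "proximal", "medial")
           for digit in ("thumb", "index", "middle", "ring", "little")]

# Chromatic arrangement (FIG. 12A): 15 consecutive semitones, C4 up to D5
chromatic = {b: 60 + i for i, b in enumerate(BUTTONS)}

# Diatonic arrangement (FIG. 12B, C major): 15 scale steps, C4 up to C6
MAJOR_STEPS = [0, 2, 4, 5, 7, 9, 11]  # semitone offsets of a major scale
diatonic = {b: 60 + 12 * (i // 7) + MAJOR_STEPS[i % 7]
            for i, b in enumerate(BUTTONS)}
```

Both mappings start at C on the distal thumb button; the chromatic one ends on the D an octave above (MIDI 74) and the diatonic one on the C two octaves above (MIDI 84), matching the figure descriptions.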
  • Exemplary embodiments may provide the user with a configuration whereby single digit buttons may trigger more than one note, and the notes that are triggered by a digit button may have harmonic interval relationships. For example, actuating the distal thumb button may trigger the notes C, E, and G.
  • Exemplary embodiments similar to those illustrated in FIG. 1 may include 12, 13, or 14 digit buttons. Exemplary embodiments with 13 or 12 buttons may be sufficient to allow the user to play a chromatic scale from the tonic note to the tonic note an octave higher (13 buttons) or from the tonic note to the note one below the octave-higher tonic note (12 buttons). Exemplary embodiments with 14 buttons may also be used to play chromatic scales, but may also allow the user to play 2 octaves of a diatonic scale (without employing an octave selection feature) from the tonic note to the note one below a tonic note that is 2 octaves higher.
  • The interface may provide the user with a variety of options with regard to how the interface's angular rate of rotation, orientation (pitch, roll, and yaw), and/or position data are used.
  • This may include using these data to modulate the recipient device's processing of input from the interface's buttons.
  • For example, the recipient device may respond to digit button input by producing tones or sounds resembling those of a sustained-tone instrument (e.g., cello, violin, saxophone, flute, organ, lead synthesizer sound, etc.), and the angular rate of interface rotation around the vertical (yaw) and/or lateral (pitch) axes may be used to emulate the effect of bowing or blowing intensity on these tones.
  • The user may generate changes in the rate of angular rotation in the yaw plane by swinging the interface from side to side (from the neutral operating position), mainly by rotation at the shoulder joint and bending at the elbow.
  • Such swinging may produce a compound movement of the interface (e.g., involving rotational and translational motion); where rotation of an exemplary interface around an axis is referred to, it is assumed that the user's motion includes, but is not necessarily restricted to, rotational motion around the axis in question.
  • Actuation of a digit button on one interface may select the starting pitch of a tone and actuation of a digit button on the other may select the end pitch of the tone, and reducing the orientation difference between the two interfaces (for example, in the lateral axis) may slide the pitch of the tone from the start pitch to the end pitch.
  • Exemplary embodiments may utilize interface-based portamento control and/or vibrato control to modulate the pitch of musical tones, in a manner similar to that described elsewhere in this specification.
  • A large variety of additional alternative effects on musical sounds may be configured to be controlled via an interface, and this should not be considered a complete list.
  • Exemplary embodiments may allow the user to exert "contextual control" via an interface whereby one form of control is used to modulate another form of control.
  • For example, when a digit button triggering a musical tone is actuated, the orientation of the interface around the lateral axis (pitch axis) at the moment of said actuation may be recorded by the system, and changes in the lateral axis orientation relative to said recorded orientation may be used to control a modulatory sound effect on the musical tone.
  • Increasing the lateral axis orientation after digit button actuation (i.e., raising the front of the interface upwards) may, for example, be used to control such a modulatory effect.
  • In a contextual control configuration similar to the example described above, a variety of alternative interface outputs may be used to control a variety of other effects.
  • Exemplary embodiments may also provide the user with an "octave selection" option based on interface orientation.
  • This option may control the octave value of the tones triggered by the digit buttons.
  • The user may choose one of the orientation axes, for example the lateral axis (pitch axis), to be divided into multiple zones. If a total of three angle zones around the lateral axis were chosen (e.g., down, middle, and up), then the lateral axis angle of the interface relative to these zones would determine the octave values of the notes triggered by the digit buttons.
  • An example of the borders between these three zones might be (assuming 0 degrees as horizontal) -40 degrees and 40 degrees, whereby the down zone is -40 degrees and below, the middle zone is greater than -40 degrees and less than 40 degrees, and the up zone is 40 degrees and above.
  • Three tones in three adjacent octaves may be produced simultaneously, but their respective volumes may be determined by the interface's lateral axis angle relative to the down, middle, and up zones at the time of triggering.
  • Actuating a digit button corresponding to the note C while the interface is in the down zone might be set up to trigger the notes C3, C4, and C5, but only C3 would have an audible volume.
  • The user may be given the option of attributing crossfaded volumes to the borders of these zones, such that actuating the C digit button near the border of the down and middle zones would again trigger the C tone in all three octaves, but both the C3 and C4 tones would have an audible volume.
  • The user may also be given the option of using this octave selection in a dynamic or constant mode. In the dynamic mode, maintaining activation of the C digit button while moving the interface from the down zone to the middle zone would dynamically crossfade the volumes of the C3 and C4 tones, such that the former would fade and the latter would increase.
  • In the constant mode, tones may retain the zone-based volume level assigned at the time they were triggered; thus actuation of the C digit button in the down zone followed by moving the interface to the middle zone would result in the volume of the C3 tone being maintained at the same level throughout the movement (while possibly being subject to volume-modulation by other aspects of the system).
  • In this mode, effectively only one of the notes (in this case C3) in the octave group (in this case C3, C4, and C5) is triggered at a time, and the selection of which note is triggered depends on the zone the interface is in at the time of triggering.
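The zone-plus-crossfade scheme above can be sketched as a function from pitch angle to per-octave volumes. The border angles match the -40/40 degree example given earlier; the crossfade width, the linear fade shape, and the function name are illustrative assumptions.

```python
def octave_volumes(pitch_deg, borders=(-40.0, 40.0), fade=10.0):
    """Map the interface's lateral-axis (pitch) angle to volumes for the
    down/middle/up octave tones (e.g., C3, C4, C5), crossfading linearly
    within `fade` degrees of each zone border."""
    def blend(angle, border):
        # 0.0 well below the border, 1.0 well above, linear in between
        t = (angle - border + fade / 2) / fade
        return max(0.0, min(1.0, t))
    mid_down = blend(pitch_deg, borders[0])  # how far out of the "down" zone
    up_mid = blend(pitch_deg, borders[1])    # how far into the "up" zone
    return {"down": 1.0 - mid_down,
            "middle": mid_down - up_mid,
            "up": up_mid}
```

For instance, at -60 degrees only the down-zone octave is audible, at 0 degrees only the middle octave, and exactly at the -40 degree border the down and middle octaves crossfade at equal volume.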
  • The processing required to perform the octave selection described above may be performed by a variety of components, including the processor 817 (see FIG. 8), a processing component added to the external wireless link 819, or an additional program installed on the recipient device 820.
  • More generally, an axis of orientation may be used to select from a range of options (a range of octaves in this instance).
  • Exemplary embodiments may instead use directions of translational and/or rotational motion to select from different options.
  • Zones of interface rotation direction may be configured such that rotating the interface in a specific direction (e.g., rotating an interface rightwards around the vertical axis) may select a specific option from a range of options, for example a specific frequency of oscillation for a sound effect on a musical tone (e.g., a modulating volume gate or frequency filter, etc.).
  • The phase of these oscillations may also be synched to external events, the tempo of a piece of music being but one example.
  • For instance, an oscillation that lasts for one musical bar may be synched to "start" (e.g., cross zero into the positive phase of the oscillation) on the first beat of the bar.
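A bar-synched oscillation of this kind can be sketched as a function of elapsed time and tempo. The sine shape, the 4-beats-per-bar default, and the function name are illustrative assumptions.

```python
import math

def lfo_value(time_s, tempo_bpm, beats_per_bar=4):
    """A bar-length oscillation synched to tempo: the value crosses zero
    into its positive phase on the first beat of every bar."""
    bar_s = beats_per_bar * 60.0 / tempo_bpm     # duration of one bar
    phase = (time_s % bar_s) / bar_s             # 0.0 at each bar start
    return math.sin(2 * math.pi * phase)
```

At 120 BPM in 4/4 a bar lasts two seconds, so the oscillation is zero at every even second and peaks a quarter-bar later.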
  • Directional control may be used to control a variety of options and parameters.
  • The recipient device may be a device on which the user may play a video game (e.g., the Microsoft Xbox, Sony Playstation, Nintendo Wii, or a personal computer/mobile computing device, etc.) where the user may participate in the game through their operation of the interface.
  • Equipment that is designed to generate musical sounds in response to external commands (e.g., MIDI messages) may act as the recipient device, with hardware synthesizers being but one example.
  • The recipient device may be a lighting system, whereby a user's operation of the interface may control the actions of the lighting system.
  • The recipient device may be a lighting system at a live performance venue.
  • The recipient device may be a system that may be remotely controlled by a user's operation of the interface, for example a vehicle or robot.
  • A recipient device 820 may act as a data-entry device (e.g., a personal computer or mobile computing device, etc.), where the range of different discrete output signals the interface can produce may be mapped to a specific data set (e.g., letters, numbers, etc.).
  • The range of different output signals the interface can produce may be expanded beyond what can be achieved by actuating individual digit buttons by making the events triggered by digit button actuation dependent on the interface's orientation and/or motion (in a similar way to the octave selection option described above).
  • Additional specific events may be triggered through specific combinations of digit button actuation. For example, in the case of an interface with 15 digit buttons, these buttons may be assigned event 1, event 2, event 3, and so on through to event 15. However, pairs of buttons actuated substantially at the same time may be configured to trigger more events beyond the initial 15. For example, actuating the distal thumb and distal index finger buttons at substantially the same time may trigger event 16, and the distal index and distal middle finger buttons together may trigger event 17, and so on. Combinations of more than two buttons may also be employed. In this example the events may be musical tones with specific pitches, or characters from an alphabet, etc. Such a "combinatorial configuration" may be utilized for a variety of exemplary embodiments, including interfaces with different amounts of buttons and different button configurations.
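The combinatorial configuration can be sketched as a lookup table: 15 single-button events plus one extra event per pair of simultaneously actuated buttons. The specific ordering of the pair events below (and therefore which pair gets event 17) is an illustrative assumption, not the specification's numbering.

```python
from itertools import combinations

N = 15  # digit buttons, assigned events 1..15 individually

# Single-button events 1..15, then one event per button pair (16..120).
EVENTS = {(b,): b for b in range(1, N + 1)}
for event, pair in enumerate(combinations(range(1, N + 1), 2), start=N + 1):
    EVENTS[pair] = event

def event_for(pressed):
    """Look up the event for buttons actuated at substantially the same time."""
    return EVENTS.get(tuple(sorted(pressed)))
```

With 15 buttons this yields 105 two-button events on top of the 15 single-button events; larger combinations would extend the table further in the same way.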
  • One or more interface buttons may be assigned a modal role, whereby said modal button primarily modulates the events triggered by other buttons.
  • For example, the thumb button may be assigned a modal role whereby the finger buttons, while the thumb button remains unactuated, may be able to trigger events 1 to 4. While the thumb button is actuated, the finger buttons may be able to trigger events 5 to 8.
  • Such an embodiment may allow all the pitches of a C major scale to be played on an interface with only five buttons.
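The five-button modal configuration can be sketched as two banks of four events selected by the thumb. Mapping the eight events onto the pitches of a C major scale is the document's own example; the note names and function name are illustrative assumptions.

```python
# Four finger buttons select within a bank; the modal thumb button
# selects which bank is active.
BANKS = {
    False: ["C4", "D4", "E4", "F4"],   # thumb unactuated: events 1-4
    True:  ["G4", "A4", "B4", "C5"],   # thumb actuated:   events 5-8
}

def note_for(finger_index, thumb_actuated):
    """finger_index 0-3 picks a note from the bank chosen by the thumb."""
    return BANKS[thumb_actuated][finger_index]
```

Together the two banks cover C4 through C5 of the C major scale, so eight pitches are reachable from five physical buttons.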
  • If an interface had two buttons per finger and at least one button for the thumb (see FIG. ...), a musical mode (e.g., C major) may be assigned to the finger buttons, whereby actuating said thumb button may cause any finger-triggered note to be one semitone lower (or higher) than would be the case if the thumb button were unactuated. This may allow all the pitches of a musical mode to be played on an interface with only nine buttons. Such a "modal configuration" may be utilized for a variety of exemplary embodiments.
  • Exemplary embodiments of the interface may include a different number of digit buttons and/or a different arrangement of those buttons.
  • Some embodiments may include only medial buttons (finger digit array: 411; thumb digit array: 311) and proximal buttons (finger digit array: 416; thumb digit array: 312), with no distal buttons (finger digit array: 410; thumb digit array: 310).
  • An example of this arrangement for a finger digit array is illustrated in FIG. 15.
  • Some embodiments may include only distal buttons and proximal buttons, with no medial buttons (see e.g., FIG. 16 and FIG. 18).
  • Some embodiments may include only distal buttons and medial buttons, with no proximal buttons (see e.g., FIG. 17 and FIG. 19).
  • Exemplary embodiments may include a thumb digit array with a medial button on the outside of the thumb rather than on the inside of the thumb (see e.g., FIG. 20).
  • Exemplary embodiments that use digit button arrangements similar to those illustrated in FIG. 15, FIG. 16, FIG. 17, FIG. 18, or FIG. 19 may include 8 or 7 digit buttons, and this number of buttons may be sufficient to allow a user to play a diatonic scale (e.g., C4, D4, E4, F4, G4, A4, B4, C5) from the tonic note to the tonic note one octave above (8 digit buttons) or from the tonic note to one note below the octave-higher tonic note (7 digit buttons).
  • In exemplary embodiments, additional buttons per digit may be provided on the interface. Such additional digit buttons may be positioned to be actuated through sideways movement of the digit, or extension of the digit.
  • Some embodiments may not include a thumb digit array 118 (see e.g., FIG. 21). In exemplary embodiments that do not include a thumb digit array, the thumb may be given the task of keeping the interface in contact with the hand, via an appropriate structure against which the thumb may grip or press.
  • Exemplary embodiments may not include digit arrays for other digits. For example, as illustrated in FIG. 22, some embodiments may not include a digit array for the little finger. Other exemplary embodiments may not include one or more digit arrays for other digits.
  • Exemplary embodiments similar to those illustrated in FIG. 21 and FIG. 22 may include 12 or 13 digit buttons. Exemplary embodiments with 13 or 12 buttons may be sufficient to allow the user to play a chromatic scale from the tonic note to the tonic note an octave higher (13 buttons) or from the tonic note to the note one below the octave-higher tonic note (12 buttons).
  • Exemplary embodiments that include fewer digit buttons may utilize a different overall form.
  • exemplary embodiments may utilize a form that the user's hand and fingers can more readily wrap around.
  • Such embodiments may include no buttons or some buttons for the digits of the user's hand, including the thumb.
  • As illustrated in FIG. 26A, exemplary embodiments may include four digit buttons 2601 to be operated by the user's fingers, and one digit button 2602 to be operated by the user's thumb.
  • As illustrated in FIG. 26B, exemplary embodiments may include eight digit buttons 2604 to be operated by the user's fingers, and one digit button 2602 to be operated by the user's thumb.
  • Exemplary embodiments similar to that illustrated in FIG. 26A may include digit buttons to be operated by the fingers, and no digit button to be operated by the thumb, thereby having 4 digit buttons in total.
  • Exemplary embodiments similar to that illustrated in FIG. 26A may include digit buttons to be operated by the index, middle and ring fingers, and no digit button to be operated by the thumb, thereby having 3 digit buttons in total.
  • Exemplary embodiments similar to that illustrated in FIG. 26A may include 5 digit buttons in total, and this number of buttons may be sufficient to allow a user to select commonly used harmonic pitch intervals.
  • Exemplary embodiments similar to that illustrated in FIG. 26B may include digit buttons to be operated by the fingers, and no digit button to be operated by the thumb, thereby having 8 digit buttons in total.
  • Exemplary embodiments similar to that illustrated in FIG. 26B may include 8 or 7 digit buttons, and this number of buttons may be sufficient to allow a user to play a diatonic scale from the tonic note to the tonic note one octave above (e.g., C major: C4, D4, E4, F4, G4, A4, B4, and C5 assigned to 8 digit buttons) or from the tonic note to one note below the octave-higher tonic note (e.g., C major: C4, D4, E4, F4, G4, A4, and B4 assigned to 7 digit buttons).
  • Exemplary embodiments may use similar hand-attachment mechanisms to those mentioned in descriptions of other embodiments.
  • A hand strap 2603 may be employed, whereby the user's palm may rest against the interface and the strap may run across the back of the hand.
  • These hand-attachment mechanisms may be configured to provide an adjustable fit to the user's hand.
  • The hand-attachment mechanism at either end of the interface may be able to swing around the long axis of the interface.
  • The strap may be rotated from its attachment points at either end of the interface, thus allowing the strap to rotate closer to, or away from, the digit buttons 2601 or 2604. This would allow the angle of the faces of the buttons relative to the user's fingers to be changed.
  • A benefit of this adjustment mechanism may be that users with different sized hands would be able to choose the most comfortable and effective locations on their fingers with which to make contact with the buttons.
  • Exemplary embodiments may include some, none, or all of the motion, orientation, and/or position sensors mentioned in descriptions of other embodiments.
  • Embodiments may include an acceleration sensor 814 with one or more axes and/or an angular rate sensor 815 with one or more axes.
  • Some embodiments may lack axes in the roll plane for the acceleration sensor 814 and angular rate sensor 815, or may lack a magnetic field sensor 816 entirely.
  • Exemplary embodiments may employ other forms of motion tracking.
  • Active or passive infrared markers may be attached to the interface and tracked by an external stationary infrared camera.
  • The interface may be tracked with a time-of-flight camera.
  • The interface may include components that emit an ultrasonic signal, and the spatial location of the signal may be tracked by an external stationary ultrasonic receiver array.
  • The interface may include components that emit a magnetic field, and the spatial location of the emitter may be tracked by an external stationary magnetic field detector; alternatively, the detector may be attached to the interface and the emitter may be external and stationary.
  • Exemplary embodiments may have other additional sensors included in the interface, like a GPS receiver, or a receiver for higher-resolution positioning signals.
  • buttons with more detailed measurement capabilities may be used in exemplary embodiments.
  • the digit buttons of the finger and thumb digit arrays may be equipped with sensors that feature velocity and/or after touch sensitivities, similar to the keys found on many MIDI piano keyboards.
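As an illustrative sketch only (the function names, channel usage, and normalized force values are assumptions, not part of this description), velocity- and aftertouch-sensitive digit-button events of the kind mentioned above could be encoded as standard MIDI channel voice messages:

```python
# Hypothetical sketch: encoding velocity-sensitive digit-button events as
# MIDI messages, assuming a button sensor reports strike force and ongoing
# pressure normalized to the range 0.0-1.0.

def note_on(channel: int, note: int, force: float) -> bytes:
    """Build a MIDI Note On message; velocity scales with strike force."""
    velocity = max(1, min(127, round(force * 127)))  # floor of 1: velocity 0 means Note Off
    return bytes([0x90 | (channel & 0x0F), note & 0x7F, velocity])

def aftertouch(channel: int, note: int, pressure: float) -> bytes:
    """Build a MIDI Polyphonic Key Pressure (aftertouch) message."""
    value = max(0, min(127, round(pressure * 127)))
    return bytes([0xA0 | (channel & 0x0F), note & 0x7F, value])

msg = note_on(0, 60, 0.5)   # middle C, medium strike force
print(msg.hex())            # 903c40
```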
  • Some embodiments may include buttons that have multiple axes of actuation, thereby producing additional independent streams of data output from the interface.
  • buttons may be included that can be actuated up, down, forwards, backwards, left, and right, or only some of these directions. Standard electromechanical sensor designs understood by those skilled in the art may be used for these purposes, and changes to the data processing and communications apparatus of the interface may be made to accommodate this additional data.
  • Exemplary embodiments may include digit buttons that are designed to be actuated largely or exclusively by the end segments (distal phalanges) of the digits. Such embodiments may have the advantage of needing fewer or no adjustability mechanisms to maintain usability among users with different hand sizes.
  • some embodiments may include three rows of digit buttons 2701 comprising four digit buttons per row. Each of these twelve digit buttons may be designed to be actuated by the end segment of one of the user's four fingers. In order to reach the digit buttons in one of these three rows with the end segment of a finger the user may need to flex or extend that finger.
  • Exemplary embodiments may include three digit buttons 2702 designed to be actuated by the end segment of the user's thumb. In order to individually actuate one of these three thumb digit buttons the user may be required to either abduct or adduct their thumb relative to the hand.
  • some embodiments may include two rows of digit buttons 2703 comprising four digit buttons per row. Furthermore, certain embodiments may include two digit buttons 2704 designed to be actuated by the user's thumb.
  • Exemplary embodiments may have finger digit buttons mounted on a curved surface.
  • the angle of a button's digit contact surface may be dependent on that button's position on the curved surface.
  • some embodiments may include three rows of digit buttons 2705 comprising four digit buttons per row.
  • An advantage of having finger digit buttons mounted on a curved surface may be that it reduces the extent of flexion or extension of the fingers required to reach each of the finger digit buttons.
  • exemplary embodiments may include two rows of digit buttons 2706 on a curved surface comprising four digit buttons per row.
  • Exemplary embodiments similar to those illustrated in FIG. 27A and FIG. 27C may include 12, 13, 14, or 15 digit buttons. Exemplary embodiments with 13 or 12 buttons may allow the user to play a chromatic scale from the tonic note to the tonic note an octave higher (13 buttons) or from the tonic note to the note one below the octave-higher tonic note (12 buttons).
  • buttons may also be used to play chromatic scales, but may also allow the user to play 2 octaves of a diatonic scale (without requiring the use of an octave selection feature) from the tonic note to the tonic note 2 octaves higher (15 buttons) or from the tonic note to the note one below the 2-octave-higher tonic note (14 buttons).
  • Exemplary embodiments similar to that illustrated in FIG. 27B and FIG. 27D may include 8 or 7 digit buttons, and this number of buttons may be sufficient to allow a user to play a diatonic scale (e.g., C4, D4, E4, F4, G4, A4, B4, C5) from the tonic note to the tonic note one octave above (8 digit buttons) or from the tonic note to one note below the octave-higher tonic note (7 digit buttons).
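The button counts described above can be related to pitch mappings with a short sketch. This is illustrative only (the layout tables and function name are assumptions, not part of this description):

```python
# Mapping digit-button indices to MIDI note numbers for the layouts above.
CHROMATIC = list(range(13))                  # 13 buttons: tonic up to tonic + 12 semitones
MAJOR_STEPS = [0, 2, 4, 5, 7, 9, 11, 12]     # 8 buttons: one diatonic (major) octave

def button_to_note(index, tonic, layout):
    """Return the MIDI note for a button index, given a tonic note."""
    return tonic + layout[index]

# A C major diatonic scale from C4 (MIDI 60) on an 8-button layout:
notes = [button_to_note(i, 60, MAJOR_STEPS) for i in range(8)]
print(notes)  # [60, 62, 64, 65, 67, 69, 71, 72]  i.e. C4 D4 E4 F4 G4 A4 B4 C5
```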
  • exemplary embodiments may include rows of digit buttons that are non-straight in their horizontal arrangement.
  • the distance of a digit button from the palm enclosure 115 may be proportional to the relative length of the digit which that button is designed to be actuated by.
  • the digit buttons that are designed to be actuated by the little finger may be on average positioned closer to the palm enclosure than the digit buttons that are designed to be actuated by the middle finger.
  • exemplary embodiments may include digit buttons 2707 that have at least three directions of actuation. These digit buttons may be designed to be actuated by the end segments of the digits (distal phalanges) and the directions of actuation may be: retraction (pulling the digit button towards the palm of the hand), extension (pushing the digit button away from the palm of the hand), and pressing (pushing the digit button down into the enclosure it is mounted on).
  • a three-direction digit button 2708 may also be included for the thumb, to be actuated by the end segment of the thumb (distal phalanx).
  • This thumb button may have the actuation directions of adduction (pulling the digit button towards the main body of the interface), abduction (pushing the digit button away from the main body of the interface), and pressing (pushing the digit button down into the enclosure it is mounted on).
  • exemplary embodiments may include digit buttons 2707 that have at least two directions of actuation (push and pull).
  • Five three-direction digit buttons would allow the user to produce at least fifteen discrete output signals.
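The size of this output-signal space can be shown with a trivial sketch (the button and direction names are illustrative assumptions):

```python
# Five three-direction digit buttons yield fifteen distinct discrete
# (button, direction) events.
from itertools import product

BUTTONS = ["index", "middle", "ring", "little", "thumb"]
DIRECTIONS = ["retract", "extend", "press"]  # adduct/abduct/press in the thumb's case

events = list(product(BUTTONS, DIRECTIONS))
print(len(events))  # 15
```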
  • Embodiments of this type may include adjustability whereby the base location that each button is actuated from can be adjusted.
  • Such adjustability may assist in allowing an embodiment to maintain usability among users with different hand sizes and finger lengths.
  • the finger digit buttons 2707 may be adjustable in their distance from the palm of the user's hand (i.e., forwards and backwards).
  • the thumb digit button 2708 may also have the capacity to have its base position distance from the proximal segments of the thumb altered.
  • Exemplary embodiments may incorporate different forms of adjustment.
  • an adjustable component may be built into the thumb digit array 118 (see e.g., FIG. 3) whereby the distance between the proximal button 312 and the section that includes the distal and medial buttons (310 and 311) can be altered by the user.
  • a mechanism may be included that allows the position and/or angle of the entire thumb digit array relative to the palm enclosure to be adjusted.
  • the ranges of adjustment mechanisms mentioned in the above description may be increased or reduced, or various types of adjustment may be eliminated entirely.
  • some embodiments may have no separation between the finger digit arrays (see e.g., FIG. 25), where some or all the digit buttons for the fingers are positioned on one or more enclosures.
  • Some embodiments may be produced in different sizes to fit different-sized hands.
  • Certain embodiments may use a modular design, where the rear enclosure 120 (see e.g., FIG. 1), including its contents, may be detachable from the rest of the interface.
  • this detachable rear enclosure may be compatible with a range of front sections of the interface (palm enclosure 115 and the finger and thumb digit buttons, etc.) designed to fit different sized hands.
  • the rear enclosure may use conventional methods to form a secure structural and electronic connection with these front sections.
  • the finger and thumb digit arrays may be made in different sizes, and/or with or without some or all of the adjustability mechanisms described for the finger digit arrays in other embodiments.
  • these different-sized digit arrays may be interchangeable and swapped in and out of the interface to provide a better fit for an individual user.
  • the finger digit arrays may be swapped in/out at their connection to the digit array track 114. This would assist not only in accommodating a large range of hand sizes, but also the size differences between the fingers of an individual hand.
  • conventional connectors may be used to attach the sensor wiring of the digit buttons to other parts of the interface's electronics.
  • FIG. 14A illustrates an exemplary embodiment as shown in FIG. 1, 2, and 3, whereby an interface is fastened to a user's right hand using the palm clasp 116 and hand strap 117.
  • a strap 1401 may be included that extends from the thumb side of the palm enclosure over the user's hand.
  • the strap may consist of material that is elastic in character.
  • the strap may thread under a loop 1402 or buckle trim or equivalent structure (not shown in image).
  • the end portion 1406 of the strap 1401 may then thread back over the user's hand to attach to the outer surface of the preceding section of the strap 1401 running over the top of the user's hand.
  • exemplary embodiments may utilize a longer strap 1405 that threads under the loop 1402 and then extends to, and wraps around, the user's wrist (starting on the thumb side and then travelling under the wrist to the opposite side and then over the top of the wrist).
  • the end portion 1406 of the strap 1405 may then wrap over the outer surface of the preceding section of the strap on the wrist and attach to this strap surface. Attachment of the end of the strap to the surface of the preceding section of strap (e.g., on the top of the hand or around the wrist) may be made using a hook and loop, press stud, side release buckle, or button mechanism, or any equivalent mechanism.
  • the strap 1405 may consist of material that is elastic in character.
  • In FIG. 14B, 14C, and 14D the end of the strap is shown extended upwards and not attached to the preceding section of strap (as it would be during normal use).
  • exemplary embodiments may include a hand strap 1401 that attaches at a lower point on the thumb side of the interface and contacts the side of the index finger knuckle and surrounding area, thereby providing a different attachment fit to the hand.
  • the strap may be comprised of material that can act as an attachment partner in a hook and loop mechanism.
  • some or all of the outer surface of the strap 1401 wrapped over the user's hand may include loop components and a section of material 1404 on the end portion 1406 of the strap may provide the hook components.
  • Exemplary embodiments may include mechanisms that reduce the accumulation of sweat on the user's hand when using the device. As illustrated in FIG. 14E, exemplary embodiments may include ventilation holes 1407 that run through the upper surface of the palm enclosure 1301 (illustrated here in isolation from all the other components of the interface).
  • Exemplary embodiments may utilize different electronics in the interface.
  • the data processing functions performed by the processor 817 (see e.g., FIG. 8) and/or the digit button sensor relay 812 may be performed by a processor component added to the external wireless link 819 and/or additional software installed on the recipient device 820 (in the instance where that device is a computer and/or processor of some type).
  • the data sent from the interface may be in a less processed state, but one that may allow some or all the necessary processing to take place at these subsequent points in the data chain.
  • Embodiments of this kind may have the advantage of reducing the interface's power consumption and making changes to the data-processing algorithms more convenient for the user.
  • the electronics housed in the rear enclosure 120 may be moved to the palm enclosure 115, and the rear enclosure may be reduced in size or eliminated.
  • An illustration of an embodiment that does not include a rear enclosure is shown in FIG. 13A.
  • An illustration of an embodiment that includes a relatively short and thin rear enclosure 120 is shown in FIG. 13B.
  • the counterweight effect of the rear enclosure may be lessened, but these embodiments may be useful for applications where the physical presence of a rear enclosure is undesirable and/or unnecessary.
  • Exemplary embodiments may include mechanisms that allow the location and/or orientation of the contact surface for the user's palm to be adjusted relative to the rest of the interface.
  • An exemplary embodiment of this kind is illustrated in FIG. 13C, where components that wrap around the user's hand (e.g., the palm clasp 116, hand strap 117, and hand clasp spacer 119) are not illustrated in the figure for the sake of clarity.
  • the upper surface of the palm enclosure 115 may have multiple degrees of movement freedom relative to the lower portion of the palm enclosure.
  • a mobile upper surface 1301 of the palm enclosure may be able to be raised or lowered relative to the lower portion of the palm enclosure, supported by projections that slide in and out of indentations within the lower portion of the palm enclosure.
  • a front-right rod 1302, a rear rod 1303, and a front-left rod 1304 may be attached to the mobile upper surface 1301 and may slide in and out of cylinders that form part of the structure of the lower portion of the palm enclosure.
  • the mobile upper surface may be raised or lowered by turning bolts that pass through the rods into the lower portion of the palm enclosure 115 (the bolts being accessed through holes on top of the mobile upper surface).
  • the rods of the mobile upper surface may slide freely within the cylinders and then be fixed in place through a locking system where horizontal pins within the lower portion of the palm enclosure 115 are inserted into one of many holes running along the length of the rods.
  • Other adjustment mechanisms may also be used, for example, multiple mobile upper surfaces of different rod lengths may be available to the user and chosen to be fitted to the device depending on which length provides the best fit.
  • the components that wrap around the hand e.g. the palm clasp 116, hand strap 117, or alternative hand strap 1401 may attach to the mobile upper surface 1301, the lower portion of the palm enclosure 115, or a combination of the two.
  • Exemplary embodiments may have a reduced number of axes of
  • Exemplary embodiments may include audio synthesis/production components within the interface itself.
  • the interface may be able to produce audible musical sounds with little or no assistance from other devices.
  • a speaker 2310 or other sound producing component may be located on the palm enclosure.
  • a speaker 2310 or other sound producing component may be located on the rear enclosure, or in any other suitable position on the interface.
  • Exemplary embodiments may include audio synthesis components, but require an external amplification device (e.g., a guitar amplifier) to be made audible.
  • Exemplary embodiments may include a system within the interface that provides haptic feedback to the user.
  • one or more vibration motors may be included within the palm enclosure 115 (see e.g., FIG. 1) and information may be provided to the user through their activation. This information may be generated on board the interface by its processing components (e.g., the processor 817, see FIG. 8) or other sources (e.g., the recipient device 820, or a processing component added to the external wireless link 819, etc.).
  • an interface may be used to manipulate the aural and/or visual elements of a video, or other types of visual and/or audio content.
  • Exemplary embodiments may involve an interface being used to manipulate the aural and/or visual elements of a music video.
  • game characteristics may be used whereby achieving specific outcomes through use of the interface is rewarded by one or more measures of achievement (e.g., points).
  • a variety of interfaces may be used to play game-like embodiments including the exemplary interfaces explicitly described herein.
  • the exemplary embodiments may be configured to function with exemplary interfaces similar to those illustrated in FIG. 1, FIG. 25, FIG. 26A, FIG. 26B, or FIG. 27A, B, C, D, or E.
  • Exemplary embodiments may be configured to provide a game suitable to the specific capabilities of one or more interfaces.
  • Exemplary embodiments may include the use of interface orientations, positions, and/or motions to provide one or more substantially continuous values, and/or digit buttons to provide one or more discrete values.
  • Appropriate input may include input that can provide one or more discrete input values (for triggering individual pitches or notes, for example) and/or one or more substantially continuous values (e.g., a number that may take values between 0 and 100, and can perform the same role as, for example, data derived from a sensor component that measures angular rotation rate or orientation around a vertical axis).
  • moving or orienting a motion, orientation, and/or position sensitive mobile device like a cell phone, PDA, hand-held video game device, or tablet computer, etc.
  • moving a finger across a touch sensitive screen may also provide one or more substantially continuous values, while contacting specific points on said touch screen may elicit discrete output events.
  • some or all of the system of exemplary embodiments described herein may be implemented on a mobile computing device (e.g., cell phone, PDA, hand-held video game device, or tablet computer, etc.), video game platform (e.g., the Microsoft Xbox, Sony Playstation, or Nintendo Wii, etc.) or other computer, either in association with, or independent from, the exemplary interfaces described herein.
  • Exemplary embodiments may involve the manipulation of audio or musical audio only, while others may involve the manipulation of video only.
  • Possible sources of pre-recorded video include live action video (e.g., a music video), computer-generated video, or animated video.
  • computer graphics may be used in conjunction with or instead of pre-recorded video.
  • some or all the audio may be synthesized in real-time, rather than some or all of the audio relying on pre-made recordings.
  • some or all of the components of the video's audio may be configured to be manipulated by the user.
  • some or all of the elements of a video's visual component also may be configured to be manipulated by the user.
  • Exemplary embodiments may include the benefit of providing the user with an enhanced experience of engagement with musical audio or visual images or both due to the
  • FIG. 28A and 28B illustrate some of the visual elements that may be presented to a user while playing exemplary embodiments.
  • a variety of instructive visual elements may be presented to the user in a display panel.
  • multiple "section blocks" 2802 of different sizes may also be presented. These blocks may be set to correspond to specific sections in the audio or video samples or both. In some exemplary embodiments these sample sections and their corresponding section blocks may be consecutive. In other words, playing through each sample section one after another would advance smoothly through the entire sample.
  • An example of audio that might be configured for control via an interface is a singer's voice singing a song, and a section block may be set to correspond to one musical bar of that singing (i.e., in the case of a song with a time signature of 4/4, one bar would consist of four beats occurring at a rate determined by the tempo of the song, often expressed in beats per minute).
  • a smaller block may be set to correspond to a shorter section, for example, one half of a bar of singing.
  • the term "control audio sample" may be considered synonymous with the term "control video sample".
  • a control audio sample 3020 may be divided up into multiple sample sections 3019 of varying sizes.
  • visual elements termed “section blocks” 3018 may be created that possess timings and durations proportional to their corresponding sample sections.
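The proportionality between sample sections and section blocks might be sketched as follows (the pixels-per-second scroll scale is an assumed tuning parameter, not taken from this description):

```python
# Deriving section-block start times and heights from the durations of the
# sample sections they correspond to, so block size is proportional to
# section length.

def build_section_blocks(section_durations, px_per_second=40.0):
    """Return (start_time_s, height_px) for each consecutive sample section."""
    blocks, t = [], 0.0
    for dur in section_durations:
        blocks.append((t, dur * px_per_second))
        t += dur
    return blocks

# Two one-bar sections and one half-bar section at 120 BPM in 4/4 time:
print(build_section_blocks([2.0, 2.0, 1.0]))
# [(0.0, 80.0), (2.0, 80.0), (4.0, 40.0)]
```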
  • the section blocks 2802 may move towards and pass through a "play line” 2807.
  • the section blocks may move together (arranged in sequence one after the other) at a constant speed from the top to the bottom of the visual instruction display.
  • the play line position may be fixed throughout the duration of the game, and may be set at any position in the visual instructions display.
  • a section block may pass through the play line 2807 as the section in the control audio sample to which it corresponds is made audible to the user, and the location of the play line on this section block may represent the play back position of the section block's corresponding control audio sample section.
  • the section block passing through the play line is referred to as being the "active" section block and is numbered in FIG. 28A as 2806.
  • the user may operate an interface to control an audio sample, the visual of a video sample, or both.
  • rotation of an interface around its vertical (yaw) axis back and forth may be used to advance an audio sample (termed the "control audio sample") forward in time.
  • other axes of rotation or trajectories of movement may be used for this purpose.
  • the system may be configured to achieve an auditory effect whereby the listener perceives that the control audio sample remains audible even if it is not being advanced in time.
  • An auditory effect may also be implemented such that variations in the rate or direction of the control audio sample's playback from the normal rate and direction do not cause changes in the pitch of the control audio sample.
  • the audio processing methods that are capable of producing such effects are presented later in this description.
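A minimal sketch of the rotation-driven advancement described above, assuming an angular rate sensor sampled at a fixed interval and an assumed gain constant mapping rotation to audio time (pitch preservation itself would be handled by a separate time-stretching stage, as noted above):

```python
# Integrating gyroscope yaw-rate samples into a playback position. The gain
# (seconds of audio per degree of rotation) is an assumed tuning constant.

def advance_playback(position_s, yaw_rate_dps, dt_s,
                     required_sign=+1, gain=0.01):
    """Advance playback time from one yaw-rate sample; motion opposite to
    the direction required by the active section block contributes nothing,
    so the sample can never be played backwards."""
    step = yaw_rate_dps * dt_s * gain * required_sign
    return position_s + max(0.0, step)

pos = 0.0
for rate in [90.0, 90.0, -45.0, 90.0]:   # deg/s samples at 100 Hz
    pos = advance_playback(pos, rate, dt_s=0.01)
print(round(pos, 4))  # 0.027 -- the backward-motion sample contributed nothing
```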
  • other audio or video samples may be played back at a normal constant rate during the game and not subject to control via an interface. For example, if the control audio sample is a lead vocal track associated with a song featuring other musical sounds or instruments, these other musical sounds or instruments may be played back at a normal constant rate during the game and not change in response to actions performed on an interface.
  • These non-manipulated audio components may be referred to as the "backing music" or the "constant audio sample”.
  • the two types of audio sample may be provided to an exemplary embodiment as separate samples (e.g., as a vocal sample and a backing music sample).
  • audio pre-processing may be used to separate the required audio components into two separate samples prior to the user engaging in the game.
  • an "ideal" rate of vertical axis rotation that exists such that, if performed by the user, may advance the control audio sample in time (or "in sync") with the constant audio sample.
  • in order to progress the playback of the control audio sample, the user may be required to rotate the interface around its vertical axis in a specific direction.
  • the "active" section block 2806 may indicate to the user
  • RO/AU that they are required to rotate the interface from left to right around the vertical axis (e.g., with a clockwise motion of the forearm running approximately parallel to the ground) in order to advance the playback of the control audio sample.
  • a visual indicator 2803 inside the active section block 2806 may provide this direction information by pointing in the required rotation direction.
  • the vertical axis rotation may be measured by the interface's angular rate sensor.
  • Compound movement (including for example rotational and translational movement) of the interface would therefore provide usable input to the system as long as that movement included vertical axis rotation.
  • Exemplary embodiments may utilize other or additional types of interface motion and/or orientation for controlling a game, and may utilize measurements coming from other sensor types associated with an interface.
  • the user may be required to begin rotating the interface approximately when the lower edge of a section block 2802 reaches the play line 2807. Achieving this movement timing, along with achieving the ideal rate of vertical axis rotation, would cause a control audio sample to be correctly synced with a constant audio sample.
  • because the section blocks move at a constant rate, a visual indication of the ideal rate of movement may be formed by the combination of the height of the active section block 2806 and the speed with which the active section block is travelling downwards through the play line.
  • one ongoing game objective may be that the user has progressed through the entire segment of the control audio sample assigned to the active section block by the time the top edge of the active section block reaches the play line.
  • additional visual indicators may be used to guide the user's actions. While the active section block 2806 passes through the play line 2807 and the user rotates an interface around its vertical axis in the direction specified by the direction indicator 2803, the direction indicator may itself move in the specified direction at a rate proportional to the rate of the interface's vertical axis rotation.
  • An additional visual indicator may be used, for example, a rectangle 2804 that begins as a line and then expands behind and in concert with the moving direction indicator, at a rate proportional to the rate of the interface's rotation.
  • the direction indicator 2803 and section advancement indicator 2804 may be programmed to cease their respective movements.
  • exemplary embodiments may be configured such that no further advancement through the control audio sample is possible until the next section block becomes active. Furthermore, advancement may also not be possible unless the interface is moved in the direction specified by the next active section block's direction indicator. Exemplary embodiments may also be configured to not allow the control audio sample (or the visual indicators) to advance in the direction opposite to the direction specified by the active section block's direction indicator. In other words, in such embodiments this would effectively mean that the control audio sample would not be able to be played backwards.
  • if the user has not fully advanced through a section block (termed "section block A") when the next section block (termed "section block B") becomes active, the system may be configured to advance the control audio sample from its playback position in section block A as the interface is moved in the new direction specified by section block B. If an objective is that a control audio sample and a constant audio sample remain synchronized, the user would need to cause the control audio sample to "catch up" with the constant audio sample by increasing the control audio sample's playback rate above the ideal rate through an increased rate of movement of the interface.
  • the system may be configured to continue advancing from the start of the sample section corresponding to the next section block to become “active” (effectively “skipping" a part of the control audio sample).
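The two recovery policies described above (catching up from the lagging position, or skipping ahead to the next section) might be sketched as follows; the function and policy names are illustrative assumptions:

```python
# Choosing a playback position when a new section block becomes active
# while the user is still behind the constant audio sample.

def on_section_change(current_pos_s, new_section_start_s, policy="catch_up"):
    if policy == "catch_up":
        return current_pos_s          # user must rotate faster to resync
    elif policy == "skip":
        return new_section_start_s    # skip the unplayed remainder
    raise ValueError(policy)

print(on_section_change(3.2, 4.0, "catch_up"))  # 3.2
print(on_section_change(3.2, 4.0, "skip"))      # 4.0
```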
  • a further visual indicator of the ideal rate of an interface's vertical axis rotation may be presented to the user, comprising a visual component that may remain perceivable to the user while superimposed on either of the elements 2803 or 2804.
  • an ideal rate indicator 2805 may move at the ideal rate regardless of whether the user is moving the interface. If, as shown in FIG. 28A, the ideal rate indicator 2805 is visually similar to the direction indicator 2803 then the user may be aided in achieving the ideal rate by rotating the interface around its vertical axis such that the direction indicator 2803 and the ideal rate indicator 2805 remain superimposed while the ideal rate indicator moves sideways.
  • each section block may have a direction indicator 2803 pointing in a direction opposite to the direction indicator belonging to the previous section block. If so the user would be able to follow a vertical axis rotation of the interface as specified by a section block with a vertical axis rotation in the opposite direction when the next section block becomes active.
  • elements of the video image may be made visible to the user and may also be under the user's control.
  • the video image sample may also be divided.
  • the timing and duration of these video sections may be made identical to the control audio sample sections, such that playback advancement of synchronized control audio and video samples may be simultaneously controlled by movement of an interface.
  • Each pair of matching control audio and video samples may also have their control visualized through a single section block and its associated components.
  • the video image 2808 may be displayed in close proximity to or superimposed with the visual instructions display, allowing the user to conveniently receive visual feedback from both sources.
  • the video image 2808 and visual instructions display may be presented on the same visual device (e.g., a TV screen, computer monitor, projected image, etc.).
  • the video image 2808 is illustrated as small relative to the visual instructions display 2801, however, in exemplary embodiments the video image 2808 may be large relative to the visual instructions display.
  • the video image 2808 and visual instructions display may be partially or completely superimposed, and the visual instructions display may be overlaid on top of the video image in a position (e.g., on the left hand side) and visual configuration (e.g., partially transparent) that minimizes the visual instructions display's occlusion of the video image.
  • a "perspective view" of the visual instructions display 2901 may be included.
  • This visual instructions display may include the visual components as discussed for FIG. 28A and 28B, and the moving components may advance from the "back" towards the "front" of the visual instructions display 2901, as if they were coming towards the user.
  • the video image 2808 and visual instructions display 2901 may be partially or completely superimposed, and the visual instructions display may be overlaid on top of the video image in a position and visual configuration (e.g., partially transparent) that minimizes the visual instructions display's occlusion of the video image.
  • one of the benefits of the game to the user may be that moving the interface at the correct rate causes the control audio sample to combine pleasantly with the constant audio sample, in a way that sounds enjoyably familiar to a user who knows the song.
  • a correct rate of interface movement may also cause motion within the video image to combine pleasantly with the constant audio sample. Both of these pleasant effects can occur in spite of, or due to, variations in the rate and timing of interface movement from an ideal rate and timing (where
  • features of the game may allow the user to achieve game objectives with less reliance on visual instructions and visual feedback.
  • the sample sections corresponding to the section blocks 2802 may each begin at a rhythmically identifiable moment, for example the start of each musical bar or measure. If each sample section lasts for a single bar and each section block requires a direction of interface movement opposite to the previous section block, then the user may anticipate that they may need to change movement direction at the beginning of each bar.
  • the rate of interface movement required to produce an ideal rate of sample advancement may be configured to remain constant throughout the game.
  • the user may begin to rely on their own sense of the required rate of movement that is acquired through playing the game.
  • the sound of the control audio sample may also provide helpful feedback for achieving the desired rate of sample advancement. This effect may be enhanced if the user is familiar with how the control audio sample sounds at the ideal advancement rate (i.e., as it sounds in the original complete recording). Audio sounds not originating from the original audio sample may also be used to provide feedback to the user.
  • a displayed video sample is being controlled by the user this may also provide feedback to the user that is relevant to achieving the ideal rate of advancement. This effect may be enhanced if the user is familiar with how the video sample looks at the ideal advancement rate (i.e., the normal speed of playback). Additional visual elements may also be added to the video sample to provide useful feedback to the user.
  • Exemplary embodiments may also utilize digit button presses (i.e., actuation) on an interface as part of playing the game.
  • button indicators 2809 may be incorporated into the visual instruction display and may move at the same rate and in the same direction as the section blocks 2802 and their associated visual components.
  • the user may be required to press a button on the interface that corresponds to the button indicator.
  • the system may be configured in such a way that pressing the wrong digit button, or pressing the right button too soon or too late may result in audio or visual feedback or both indicating the digit button press attempt failed.
  • the system may also be configured such that this failure prevents the control audio sample from being heard for a specific section of time, or cause it to be audibly modulated.
  • Exemplary embodiments may be configured to require specific button presses at any time during operation, thus a button indicator may be aligned with the beginning of a section block, or may be positioned part way through a section block.
  • a variety of visual features may be provided to allow the user to identify which digit button is being signaled by a button indicator as needing to be pressed as part of the game.
  • digit buttons and button indicators may be matched by location (e.g., left to right order, or up-down order), color, or by common identifying marks or symbols.
  • Exemplary embodiments may use motion, orientation, and/or position sensing to control the pitch of a control audio sample.
  • the interface's orientation around its lateral axis may be used to select from a range of pitch choices specified by the system for each sample section 3019 (see FIG. 30B).
  • the pitch choices available for each sample section may be illustrated for the user in the section's corresponding section block (2802, and 3018 in FIG. 28A and 30B respectively).
  • the vertical dimension of each block may abstractly represent pitch and rectangular-shaped "pitch blocks" may represent the timing and duration of each required pitch.
  • pitch blocks may be arranged sequentially across a section block (in an order specified by the section block's direction indicator) and one or more pitch blocks may be available to choose from for each sub-section of the corresponding sample section.
  • the lateral axis orientation of the interface may be represented by a visual indicator within a section block, whereby the visual indicator's position in the vertical dimension of the section block may be proportional to the interface's orientation around its lateral axis.
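The mapping described above, from lateral-axis orientation to a selected pitch choice and to the indicator's vertical position, can be sketched as follows. This is a minimal illustration rather than anything specified in the application: the function name, the assumed usable tilt range of ±30 degrees, and the linear mapping are all assumptions.

```python
def pitch_choice_from_orientation(angle_deg, num_choices,
                                  min_deg=-30.0, max_deg=30.0):
    # Clamp the lateral-axis tilt to an assumed usable range.
    angle = max(min_deg, min(max_deg, angle_deg))
    # Normalize to 0..1; this value can also position the visual
    # indicator in the vertical dimension of the section block.
    t = (angle - min_deg) / (max_deg - min_deg)
    # Divide the range evenly among the pitch choices on offer.
    index = min(int(t * num_choices), num_choices - 1)
    return index, t
```

With three pitch choices, tilting fully down selects choice 0, a level interface selects choice 1, and tilting fully up selects choice 2, while `t` drives the indicator's vertical position.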
  • Exemplary embodiments may include an alternative form of button indicator termed a "word-button indicator". These word-button indicators may appear within a visual instructions display (e.g., 2801 or 2901). As illustrated in FIG. 29B, these word-button indicators 2902 may appear in association with section blocks 2802, for example, inside or in the vicinity of a section block. Alternatively, in exemplary embodiments the word-button indicators may be presented without substantial association with section blocks.
  • a word-button indicator may be associated with a specific control audio sample section corresponding to a sung or spoken word, word fragment, sequence of words or non-lexical vocables (i.e., wordless vocal sounds).
  • the text a word-button indicator is associated with may be signaled by visualized text in the vicinity of the word-button indicator.
  • the length (from left to right) of a word-button indicator may be proportional to the duration of the sample section it is associated with.
  • a word-button indicator may be paired with a digit button on an interface. As illustrated in FIG. 29B, multiple word-button indicators may be associated with a single section block 2802, and each of these word-button indicators may be paired to a different digit button. A variety of visual features may be provided to allow the user to identify which digit button is paired with a word-button indicator. For example, digit buttons and word-button indicators may be matched by location (e.g., left to right order, or up-down order), color, or by common identifying marks or symbols.
  • Actuating a digit button may allow the control audio sample section corresponding to the paired word-button indicator to be progressed through (i.e., made audible) via the interface movements described herein. Re-actuating the same digit button may allow the same control audio sample section to be progressed through again via interface movement. In this way, the user may be provided with the opportunity to progress through control audio sample sections non-sequentially, as well as repeating sections, and avoiding some sections entirely. This functionality may be useful for making game-play more challenging, and/or adding elements of improvisation and creativity to the game-play.
  • the location and/or specific visual features of a word-button indicator may indicate to the user when (relative to the progress of an ideal rate indicator) said word-button indicator's paired digit button should be pressed as part of the game.
  • these location and/or specific visual features may indicate digit button actuation timings that may contribute to the control audio sample sounding as if it is being played back at the ideal rate.
  • the location and/or specific visual features of word-button indicators may indicate a musically-interesting way to rearrange the playback of a control audio sample section.
  • An exemplary embodiment is illustrated in FIG. 30A.
  • the components illustrated in FIG. 30A may be implemented by software or hardware or a combination of both.
  • Some components may be classified as "content” 3001, in that they are materials that may be supplied to an exemplary embodiment for use during its operation.
  • Such content may be "offline” in origin, meaning that the content may be created prior to the user operating the system.
  • the content may be created with or without the involvement of some exemplary component described herein. Included in this content may be a video sample 3002, for example, the visual component of a music video (also referred to above as the video image).
  • Additional content may include sequence data 3003.
  • Sequence data may describe game elements that are intended to act in sync with visual and audio samples. Examples of sequence data include the timing and duration of section blocks relative to visual and audio sample content, the timing of button indicators and the
  • composition components 3001 may include a control audio sample 3004 and a constant audio sample 3005.
  • control audio sample may have the rate and timing of its playback controlled by the user via an interface, while the constant audio sample may be played back at a normal constant speed.
  • these samples may be associated, along with the video sample 3002, with the same piece of music.
  • control audio sample may be a vocal track from a piece of music
  • constant audio sample may be the "backing instruments" from that same piece of music.
  • video sample may be the visual component of a music video made to accompany that same piece of music.
  • FIG. 30A another form of input that may be provided to the system originates from the user via some form of interface 3006.
  • This interface input may include one or more continuous control signals that may direct the timing and rate of visual or audio playback or both, as well as any other feedback elements relating to playback.
  • This interface input may also include discrete control signals capable of controlling a range of individual and independent events.
  • one or more interfaces that are detailed in this description may be employed to provide interface input 3006.
  • the continuous control signals may originate from motion, orientation and/or position sensing included in an interface, and the discrete control signals may originate from the digit buttons of an interface.
  • the sequence data 3003 and interface input 3006 may be provided to a processing component 3007.
  • the sequence data may specify what and when actions should be performed on the interface by the user, while the interface input may describe what interface actions are actually occurring.
  • Component 3007 may include the "rules" of a game in algorithmic form which allow the sequence data and interface input to be combined and compared, with the results of that comparison to be fed back to the user via subsequent components as visual or aural elements or both.
  • the continuous control signals from an interface may include continuously-updated values that represent rates of some kind and may be "gated" by sequence data.
  • if an interface as detailed in this description is acting as the interface for this application, a rate of vertical axis rotation with a directional sign (plus or minus, i.e., clockwise or anticlockwise) may act as a continuous control signal. If rotation occurs at the correct time and in the right direction (as specified by section blocks) the continuous control signals may be allowed to pass on to subsequent components in the system.
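The gating of a continuous control signal by section-block data can be sketched as follows; the function name, the dict representation of a section block, and the sign convention (+1 for clockwise) are hypothetical, not taken from the application.

```python
def gate_continuous_signal(rotation_rate, current_block):
    # No active section block: nothing is passed on.
    if current_block is None:
        return 0.0
    # current_block["direction"] is +1 (clockwise) or -1 (anticlockwise).
    required = current_block["direction"]
    # Matching signs mean the user is rotating in the required direction,
    # so the rate is allowed through to the playback components.
    if rotation_rate * required > 0:
        return abs(rotation_rate)
    # Wrong direction (or no movement): the signal is gated off.
    return 0.0
```

A comparison component could call this once per sensor update, forwarding the returned advancement rate to the video and audio playback components.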
  • digit button actuation that is correctly selected and timed relative to sequence data (i.e., button indicators) may be allowed to trigger events in subsequent components in the system, and may also act as an additional required permission for continuous control signals to be passed on to these components.
  • digit button actuation may also be employed to trigger pitch alterations in the control audio sample.
  • Comparison of sequence data and interface input may also be used by component 3007 to assess the user's performance, the results of which may be fed back to the user via subsequent components as visual or aural elements or both.
  • an employed interface has the capacity to provide visual, aural, or haptic feedback to the user
  • instructions or feedback originating from the "comparison" component 3007 may be provided to the user via these channels 3016.
  • the continuous control signal may be passed on to visual and audio playback components 3008 and 3011. These components may be configured to buffer the video sample 3002 and control audio sample 3004 respectively, and may play these samples back at rates and times specified by the comparison component 3007 (through its mediation of interface input).
  • the audio playback component 3011 may employ timescale-pitch control methods to allow the rate of playback to be varied without altering the sample's pitch. In embodiments that allow the user to control the pitch of the control audio sample, timescale-pitch control methods may be employed by component 3011 to shift the pitch of the control audio sample without affecting the sample's playback rate.
  • aspects of the directed audio playback performed by component 3011 may be fed back 3017 to comparison component 3007 to contribute to an assessment of the user's performance. These aspects may include the rhythmic or melodic qualities of the control audio sample as directed by the user. Alternatively, in exemplary embodiments, rhythmic and melodic features provided by the control audio sample may be extracted "offline", included as part of the sequence data 3003, and compared to interface input 3006 to contribute to a performance assessment performed by the comparison component 3007 (without requiring feedback from playback component 3011 ).
  • audio playback component 3012 may be configured to buffer the constant audio sample 3005. However, playback component 3012 may be configured to play back the constant audio sample at a constant rate, independent of input from an interface.
  • the comparison component 3007 may also pass its output on to a visual instruction and feedback generator 3009. This component may generate visual instructions to be provided to the user (e.g., the elements of the visual instructions display 2801 or 2901 - see FIG. 28A and 29A) as well as feedback on their actions (e.g., the section advancement indicator 2804).
  • Comparison component 3007 may also pass its output on to an audio instruction and feedback generator 3010. This component may generate aural instructions to be provided to the user as well as feedback on their actions (e.g., digit button actuation mistimed relative to a button indicator may result in the sound effect of a vocalist failing to sing correctly).
  • Visual components 3008 and 3009 may supply video and graphics data to a visual production component 3014 that can make these elements visible (e.g., a TV screen, computer monitor, projected image, etc.) or record them for viewing at a later time, or both.
  • audio components 3010, 3011, and 3012 may supply audio sample and sound effect data to an audio production component 3015 that can make these elements audible (e.g., speakers, headphones, etc.) or record them for listening at a later time, or both.
  • pressing a button may cause the pitch of the controlled audio sample to match a pitch assigned to that button. For example, if the control audio sample is of a singer's voice, pressing a digit button may cause the pitch of the singer's voice to be shifted to match the pitch assigned to the pressed button.
  • This pitch controlling function may be of benefit to users who would like the opportunity to improvise with the melody of the control audio sample or would like to recreate the original melody under their control.
  • visual guidance may be provided to the user to assist them in achieving specific melodies.
  • Some embodiments of this type may also allow the user to create harmonies with the control audio sample by pressing more than one button at a time.
  • the performance of the user playing the game may be assessed and this assessment may be provided to the user as feedback.
  • One example of an assessable aspect of user performance includes the accuracy of timing the beginning of a sample-controlling movement of the interface or, in the case of a section block immediately following another section block, the accuracy of the timing in the change in the direction of movement of the interface between those section blocks.
  • Characteristics of the rate of movement of the interface may also be assessed by exemplary embodiments, including the consistency of the rate and how close the rate value is to that of an ideal value (the rate that is required to reproduce the control audio sample as it sounds in the original complete sample played at normal speed).
  • Exemplary embodiments may also be configured to identify and assess user-generated rhythmic variations in the playback of the control audio sample. For example, high amplitude transients in the control audio sample may be repositioned (by the user's movements of the interface) to occur at new rhythmically-classifiable timings. Through recognizing that these new timings fit into a conventional rhythmic structure (that differs from the audio sample played continuously at the ideal rate) exemplary embodiments may be configured to increase the positivity of their assessment of the user's performance.
  • The timing and selection of digit button presses relative to button indicators 2809 are another example aspect of user performance exemplary embodiments may assess.
  • Another example is the accuracy with which the user, by pressing the correct buttons at the correct times, reproduces the melody of the original control audio sample.
  • Other embodiments may be configured to use conventional rules of composition to assess a user's improvisation with the pitch of the control audio sample.
  • an effect may be employed whereby slowing down or speeding up the control audio sample does not alter the control audio sample's pitch. Furthermore, this effect may also allow the control audio sample to be halted entirely, while remaining continuously audible, as if the sound is "frozen" in time.
  • Audio timescale-pitch modification or "audio time stretching”.
  • Methods for audio time stretching include two techniques termed "time domain harmonic scaling" and "phase vocoding".
  • phase vocoding can produce audio from an audio track that matches the perceived pitch of that audio track played at normal speed despite the audio track being played through faster or slower relative to normal speed, or in reverse.
  • these techniques allow the audio track to be halted part way through being played, with a constant sound being produced that is representative of the sound at that audio track position when the audio track is being played through at normal speed.
  • audio time stretching techniques can be incorporated into the hardware or software of exemplary embodiments by persons skilled in the art.
  • the listener may perceive the sample's sound as having a quality of consistency regardless of how fast or slow the control audio sample is played through, or whether it is played in reverse, or halted altogether. Described another way, this audio processing contributes to the perception that, within the audio sample, time is being sped up, slowed down, reversed, or halted altogether.
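A toy illustration of the frame/hop idea underlying audio time stretching is sketched below. Real systems would use time domain harmonic scaling or a phase vocoder (with phase correction between frames); this simple overlap-add version only shows how reading frames from the input at one hop and writing them at another changes duration without resampling, and the function name and parameters are assumptions.

```python
import math

def time_stretch_ola(samples, rate, frame=512):
    # Output frames overlap by 50%, where a Hann window sums to unity.
    hop_out = frame // 2
    # Reading input frames at a hop scaled by `rate` changes duration:
    # rate < 1 stretches (slows), rate > 1 compresses (speeds up).
    hop_in = max(1, int(round(hop_out * rate)))
    window = [0.5 - 0.5 * math.cos(2 * math.pi * n / frame)
              for n in range(frame)]
    n_frames = max(0, (len(samples) - frame) // hop_in + 1)
    if n_frames == 0:
        return []
    out = [0.0] * ((n_frames - 1) * hop_out + frame)
    for f in range(n_frames):
        src, dst = f * hop_in, f * hop_out
        for n in range(frame):
            out[dst + n] += samples[src + n] * window[n]
    return out
```

At rate 1.0 the output length equals the input length; at rate 0.5 the output is roughly twice as long while the local waveform within each frame (and hence the perceived pitch) is preserved.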
  • the system may be configured to pre-process the control audio sample prior to operation. If the control audio sample is monophonic (for example a human voice) and its pitch varies little throughout its duration it may be desirable to tune the entire sample to a single pitch. If the range of pitches within the control audio sample is large it may be desirable instead to tune the sample to a sequence of constant pitches, with each constant pitch at a frequency centered on the pitch frequencies it is replacing. If the control audio sample is polyphonic the pitch processing may be configured to make each pitch in the polyphony continuous for the duration of the sample. In each case the processed control audio sample is passed on with data specifying which pitch (or pitches) the sample is tuned to and, if the pitch varies, at which sample time positions the pitch changes occur.
  • the pre-processed control audio sample will have a more constant or completely constant pitch, and the pitch value or values will already be known.
  • the pitch difference between the current pitch of the processed control audio sample and the desired pitch (or pitches) may be calculated. This pitch difference may then be used to shift the current pitch of the audio track to the desired pitch, subject to any pre-set pitch glide effects that may be utilized.
  • Some pitch shifting methods incorporate a technique termed "formant preservation", which is described in more detail elsewhere in this application. Exemplary embodiments may include formant-preserving pitch shifting methods, since these can assist in making shifted pitches sound more "natural” or less “artificial” to a listener.
  • Pitch shifting techniques, including those that incorporate formant preservation, can be incorporated into the hardware or software of exemplary embodiments by persons skilled in the art.
  • Exemplary embodiments may include systems whereby the user can operate an interface to manipulate one or more audio streams. These audio streams may be prerecorded, or be captured in real-time via a mechanism designed to assimilate information for the purpose of sound creation and/or sound amplification (e.g., a microphone or a guitar pick-up), or be produced in real-time by analog or digital synthesis. Exemplary embodiments may use one or more of the exemplary interfaces detailed herein. Exemplary embodiments may include the use of interface orientations and/or motions to provide one or more substantially continuous values, and/or digit buttons to provide one or more discrete values.
  • Appropriate input may include input that can provide one or more discrete input values for triggering individual pitches or notes, as well as one or more substantially continuous values (e.g., a number that may take values between 0 and 100) that can perform the same role as, for example, data derived from a sensor component that measures angular rotation rate or orientation around a vertical axis.
  • a MIDI keyboard equipped with a MIDI control wheel may provide discrete output events via the keyboard keys and substantially continuous values via the MIDI control wheel.
  • moving or orienting a motion, orientation, and/or position sensitive mobile device like a cell phone, PDA, hand-held video game device, or tablet computer, etc.
  • moving a finger across a touch sensitive screen may also provide one or more substantially continuous values, while contacting specific points on said touch screen may elicit discrete output events.
  • some or all of the system of exemplary embodiments described herein may be implemented on a mobile computing device (e.g., cell phone, PDA, hand-held video game device, or tablet computer, etc.), video game platform (e.g., the Microsoft Xbox, Sony Playstation, or Nintendo Wii, etc.), or other computer, either in association with, or independent from, the exemplary interfaces described herein.
  • a user may capture their voice or another's voice via one or more microphones and manipulate the vocal sound via an interface.
  • An example of manipulation may be to alter the pitch of the vocal sound.
  • Exemplary embodiments may make audible or record more than one audio stream.
  • one audio stream may be a vocal sound in a non- or partially-manipulated state (which will be referred to as the "source" audio stream), while another may be a duplicate or substantially duplicate manipulated version of the same vocal sound (which may be referred to as the "duplicate audio stream"). If exemplary embodiments of this type use pitch-manipulation of one or more duplicate audio streams, then the source audio stream may act in concert with the duplicate audio stream(s) to create harmonies.
  • the pitch of a duplicate audio stream may be controlled by the user via the digit buttons on an interface. Additional mechanisms for pitch selection detailed elsewhere in this description may also be employed. Additional sensor data from an interface may also be used to manipulate the audio streams, for example, controlling the volume of a duplicate audio stream. In addition to the human voice, any other form of audio derived from acoustic oscillation or synthesis may act as a source audio stream.
  • exemplary embodiments may be configured to produce one duplicate audio stream for each actuated digit button.
  • each digit button may also specify a pitch, or a pitch change amount, that the duplicate audio stream it elicits should be shifted to.
  • Exemplary embodiments may be configured to not make the source audio stream audible.
  • the system may be configured to produce one duplicate audio stream for each actuated digit button. Additionally, in exemplary embodiments, the system may be configured to shift some or all the simultaneous pitches in an audio stream by a single value, with this value being specified by actuation of one or more digit buttons. For example, if a source audio stream contained two pitches C4 and E4, then selecting a pitch change value of five semitones higher (e.g., via one or more digit buttons on an interface) may result in a duplicate audio stream having the pitches F4 and A4.
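The two-pitch example above (C4 and E4 shifted five semitones higher to F4 and A4) can be reproduced with a small note-arithmetic sketch; the helper names and the MIDI-style note numbering are illustrative assumptions.

```python
NOTE_NAMES = ["C", "C#", "D", "D#", "E", "F",
              "F#", "G", "G#", "A", "A#", "B"]

def note_to_number(name):
    # "C4" -> 60 in MIDI-style numbering (octave digit is the last char).
    return (int(name[-1]) + 1) * 12 + NOTE_NAMES.index(name[:-1])

def number_to_note(num):
    return NOTE_NAMES[num % 12] + str(num // 12 - 1)

def shift_pitches(names, semitones):
    # Shift every pitch in the stream's pitch set by the same amount.
    return [number_to_note(note_to_number(n) + semitones) for n in names]
```

Here `shift_pitches(["C4", "E4"], 5)` yields `["F4", "A4"]`, matching the example in the text.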
  • Exemplary embodiments may also be configured to respond to digit button actuation by shifting pitch by an amount relative to the current pitch of an audio stream. This configuration may be referred to as the "relative pitch selection method”. Other exemplary embodiments may be configured to respond to digit button actuation by shifting pitch to a specific absolute pitch (that may be referred to as the "target pitch”). This configuration may be referred to as the "absolute pitch selection method”. In either configuration the pitch of the source or duplicate audio streams or both may be detected.
  • the pitch shift amount and direction specified by digit button actuation may be referred to as an "interval". This interval may be compared to the pitch of the duplicate audio stream (prior to pitch shifting) in order to calculate the target pitch (the pitch that is to be achieved by the pitch shift). In either pitch selection method the pre-shift pitch of the duplicate audio stream may be compared to the target pitch in order to calculate the required pitch shift factor.
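The target-pitch and shift-factor calculation described above can be sketched in equal temperament; the function names are hypothetical.

```python
def target_from_interval(current_hz, semitones):
    # Relative pitch selection: a signed interval in semitones
    # (equal temperament) applied to the detected pre-shift pitch.
    return current_hz * 2.0 ** (semitones / 12.0)

def pitch_shift_factor(current_hz, target_hz):
    # Frequency ratio the pitch shifter must apply to reach the target.
    return target_hz / current_hz
```

An interval of +12 semitones from 440 Hz gives a target of 880 Hz, i.e. a shift factor of 2.0; in the absolute pitch selection method the target frequency is known directly and only the second function is needed.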
  • more than one digit button may be actuated at one time, thereby producing multiple duplicate audio streams with each stream being produced with its own pitch (as specified by the corresponding digit button).
  • the relative pitch selection method may be especially useful for interfaces that incorporate a small number of digit buttons.
  • the most commonly used pitch intervals above the root pitch are a “3rd”, “4th”, “5th”, “6th”, and “unison” (same pitch as the root pitch).
  • These intervals are commonly defined relative to diatonic musical "scales” or “keys” (e.g., major or minor scales).
  • each digit button may be configured to elicit a duplicate audio stream shifted by one of these intervals (while a root pitch is produced by the source audio stream).
  • an interface similar to that illustrated in FIG. 26A may be used in conjunction with a relative pitch selection method; however, other interface designs may also be used in conjunction with this method.
  • any combination of intervals may be included to be triggered by any number and arrangement of digit buttons.
  • multiple digit buttons may be actuated at one time, thereby producing multiple duplicate audio streams at different pitches.
  • an interface with nine digit buttons may be set to elicit intervals including (relative to the root note) a 6th below, a 5th below, a 4th below, a 3rd below, a unison, a 3rd above, a 4th above, a 5th above, and a 6th above.
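One way to compute the semitone offsets for such a diatonic button layout is sketched below; the major-scale table, the assumed button-to-scale-step layout, and the function name are illustrative assumptions rather than anything specified in the application.

```python
MAJOR_SCALE = [0, 2, 4, 5, 7, 9, 11]  # semitone offsets of a major scale

# Assumed nine-button layout: scale steps relative to the root, covering
# a 6th/5th/4th/3rd below, unison, and a 3rd/4th/5th/6th above.
BUTTON_STEPS = [-5, -4, -3, -2, 0, 2, 3, 4, 5]

def diatonic_offset(root_degree, step):
    # Semitone offset of a note `step` scale-steps away from the scale
    # degree `root_degree` (0-based), staying within the major scale.
    octave, idx = divmod(root_degree + step, 7)
    return MAJOR_SCALE[idx] + 12 * octave - MAJOR_SCALE[root_degree]
```

From a root on the first scale degree the nine buttons yield offsets of [-8, -7, -5, -3, 0, 4, 5, 7, 9] semitones; the same layout from a root on the third degree yields different (e.g. minor rather than major) interval sizes, which is the point of defining the intervals diatonically.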
  • an interface similar to that illustrated in FIG. 26B may be used in conjunction with a relative pitch selection method, however, other interface designs may also be used in conjunction with this method.
  • an absolute pitch selection method may be beneficial.
  • an interface with seven or more buttons may be able to access the pitches of a diatonic scale (e.g., a major or minor scale).
  • the system may accept a user's instruction to set the useable collection of pitches to, for example, the pitches in a C natural minor scale (C, D, Eb, F, G, Ab, and Bb).
  • Any number of different scales with different tonic pitches (first pitch of the scale) may be provided for the user to choose from.
  • each of the digit buttons may be set to elicit one of the pitches in the C natural minor scale.
  • the interface may also be used to choose which octave each pitch should be produced in.
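The absolute pitch selection described above, with an octave choice, can be sketched as a button-to-frequency mapping. The scale tables and function name are assumptions, and equal temperament with A4 = 440 Hz is assumed.

```python
# C natural minor, in an assumed button order (buttons 0-6).
C_MINOR_ORDER = ["C", "D", "Eb", "F", "G", "Ab", "Bb"]
C_MINOR_SEMITONES = {"C": 0, "D": 2, "Eb": 3, "F": 5,
                     "G": 7, "Ab": 8, "Bb": 10}

def button_to_frequency(button, octave, a4=440.0):
    # Absolute pitch selection: the digit button picks a scale pitch,
    # and a separate interface control picks the octave.
    midi = (octave + 1) * 12 + C_MINOR_SEMITONES[C_MINOR_ORDER[button]]
    return a4 * 2.0 ** ((midi - 69) / 12.0)
```

Button 0 with octave 4 then produces C4 (about 261.63 Hz), and holding several buttons at once would request several target pitches, one per duplicate audio stream.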
  • As in the relative pitch selection method, in the absolute pitch selection method multiple digit buttons may be actuated at one time, thereby producing multiple duplicate audio streams at different pitches.
  • exemplary interfaces with more than seven digit buttons may have a larger number of pitches assigned to them.
  • For example, if the user chose the scale D major, an interface with eight digit buttons may include the pitches D4, E4, F#4, G4, A4, B4, C#5, and D5.
  • If the user chose the scale D major, an interface with fifteen digit buttons may include the pitches D4, E4, F#4, G4, A4, B4, C#5, D5, E5, F#5, G5, A5, B5, C#6, and D6.
  • An example of an arrangement similar to this is shown in FIG. 12B.
  • Exemplary embodiments that include interfaces with twelve or more digit buttons may be configured to use the absolute pitch selection method in conjunction with a chromatic arrangement of pitch assignments on the digit buttons.
  • each of the digit buttons may be set to elicit one of the pitches C4, Db4, D4, Eb4, E4, F4, Gb4, G4, Ab4, A4, Bb4, or B4.
  • Exemplary interfaces with more than twelve digit buttons may include a greater range of pitches.
  • an interface with fifteen digit buttons may use the arrangement C4, Db4, D4, Eb4, E4, F4, Gb4, G4, Ab4, A4, Bb4, B4, C5, Db5, and D5.
  • An example of this kind of arrangement is shown in FIG. 12A.
  • the interface may also be used to choose which octave each pitch should be produced in.
  • pitches may be assigned to the digit buttons, and the system may provide the user with the option of varying the assignment of pitches to the digit buttons.
  • Exemplary embodiments may include pitch correction on either the source or duplicate audio streams or both.
  • embodiments of this kind may be configured to correct any pitch that lies too far between the pitches of a chromatic scale, a correction sometimes referred to as "pitch quantization".
  • Such "off-center" pitches are sometimes described by listeners as being "sharp" or "flat" and may be undesirable in a musical context.
  • the system may be set up to shift the frequency of this tone to 440 Hz (the frequency of pitch A4). This is because 445 Hz is closer to 440 Hz than 466 Hz (the frequency of pitch A#4). Because the relationship between a change in pitch frequency and perceived pitch is non-linear, the term "closer" is used here in reference to perceived pitch.
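The 445 Hz example above can be reproduced with a small pitch-quantization sketch that measures "closeness" in log-frequency, matching the note that the relationship between frequency and perceived pitch is non-linear; the function name is hypothetical.

```python
import math

def quantize_to_chromatic(freq_hz, a4=440.0):
    # Distance to the nearest equal-tempered pitch is measured in
    # semitones (log-frequency), i.e. in perceived pitch rather than
    # raw frequency difference.
    semis_from_a4 = 12.0 * math.log2(freq_hz / a4)
    return a4 * 2.0 ** (round(semis_from_a4) / 12.0)
```

A 445 Hz tone is about 0.2 semitones above A4 and so snaps to 440 Hz, as in the example.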
  • Exemplary embodiments may be configured to perform pitch correction on a source audio stream, either before it becomes a duplicate audio stream or before it is made audible or recorded. Exemplary embodiments may be configured to perform pitch correction on one or more duplicate audio streams only. Pitch correction of a duplicate audio stream may be desirable if it has "inherited" "sharp" or "flat" pitched sounds from its source audio stream. Pitch correction of duplicate audio streams may be integrated into the pitch shifting functionality described thus far, whereby the pitch shifting involved in pitch correction and reaching the target pitch is performed in the same processing step. For example, if the source audio stream is producing a tone with a pitch corresponding to a frequency of 445 Hz
  • Exemplary embodiments may prevent certain pitches from being produced at all, a feature that will be referred to as "pitch scale filtering".
  • the user may choose to constrain some or all pitches produced by an exemplary embodiment to those found in C major, or D minor, or any other musical scale. This constraint may be especially useful in exemplary embodiments where a relative pitch selection method is used, where each digit button on an interface may be used to elicit a specific interval.
• pitch scale filtering would be where the user is provided with a choice of tonic pitch and musical scale (e.g., major, minor, and so on) and this scale may be used to filter the pitches that can be produced by the filtered audio stream. In such a configuration, pitches that are not present in the chosen scale may be shifted to the closest pitch within that scale. In other words, if the user chose the scale C major, then the set of "permitted" pitches would be C, D, E, F, G, A, and B (in any octave). If an audio stream contained the pitch D#, this pitch may be shifted to either D or E.
  • the direction of the shift may be determined by the frequency of the pitch in the audio stream. For example, if the frequency of the pitch were closer (in the sense of perceived pitch) to the pitch center of D than E then the audio stream's pitch may be shifted to D.
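The scale-filtering rule described in these two bullets can be illustrated with a short sketch (names and structure are hypothetical, not from the specification). Distances are measured in semitones so the shift direction follows perceived pitch, as the text specifies:

```python
import math

C4 = 440.0 * 2 ** (-9 / 12)        # ~261.63 Hz; C4 used as the reference
C_MAJOR = {0, 2, 4, 5, 7, 9, 11}   # pitch classes C, D, E, F, G, A, B (C = 0)

def filter_to_scale(freq_hz, permitted=C_MAJOR):
    """Shift a frequency to the nearest pitch whose pitch class is in
    `permitted`, measuring distance in semitones (perceived pitch)."""
    semis = 12.0 * math.log2(freq_hz / C4)  # semitones above/below C4
    centre = round(semis)
    # search the chromatic neighbourhood for permitted pitches
    candidates = [s for s in range(centre - 6, centre + 7) if s % 12 in permitted]
    best = min(candidates, key=lambda s: abs(s - semis))
    return C4 * 2.0 ** (best / 12.0)
```

For a slightly flat D# (around 308 Hz) this returns D4 (~293.66 Hz) rather than E4, because the input is closer, in perceived pitch, to the pitch centre of D — the direction rule described above.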
  • the pitch scale filtering method may be configured to select target pitches according to intervals specified by a diatonic scale.
• the user may choose a specific musical scale for use with the pitch scale filter, for example, C major (comprising the pitches C, D, E, F, G, A, and B).
  • a source audio stream may be producing a C-pitched tone and the user may have, via the interface, specified that a duplicate audio stream should be produced at a pitch a "3rd" higher than the tone in the source audio stream.
  • Exemplary embodiments that use a pitch scale filter similar to that described above may restrict the types of intervals that can be created by the system.
  • the pitches C and E form a "major 3rd” (four semitones), while the pitches D and F form a "minor 3rd” (three semitones).
  • the system may allow the user to specify that certain intervals, like a minor 3rd, are not permitted.
  • the system may be configured to silence the duplicate audio stream as long as shifting its pitch would cause a minor 3rd interval harmony (D and F) to be created.
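The interval bookkeeping behind this restriction is simple modular arithmetic over pitch classes. A hypothetical sketch (the function names and the forbidden set are illustrative, not from the specification):

```python
PITCH_CLASS = {"C": 0, "D": 2, "E": 4, "F": 5, "G": 7, "A": 9, "B": 11}

def interval_semitones(lower, upper):
    """Semitones from `lower` up to the next occurrence of `upper`."""
    return (PITCH_CLASS[upper] - PITCH_CLASS[lower]) % 12

def duplicate_permitted(source, target, forbidden=frozenset({3})):
    """False if shifting the duplicate stream to `target` would create a
    forbidden harmony with `source` (3 semitones = minor 3rd by default),
    in which case the system would silence the duplicate stream."""
    return interval_semitones(source, target) not in forbidden
```

With this sketch, C and E (a major 3rd, four semitones) are a permitted pair, while D and F (a minor 3rd, three semitones) are not, matching the example above.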
  • Exemplary embodiments may utilize additional output data from an interface.
  • the system may be configured to use measurements from an angular rate sensor to control aspects of manipulation of one or more duplicate audio streams.
  • One example of this manipulation may be to control the volume of one or more duplicate audio streams with the rate of an interface's vertical (yaw) axis rotation (where the user's forearm is approximately parallel to the ground plane and the clockwise or anticlockwise movement of the forearm also runs approximately parallel to the ground plane).
  • increasing the rate of vertical axis rotation may increase the volume (possibly from a non-audible starting point) of one or more duplicate audio streams.
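The mapping from angular rate to volume can be sketched as a clamped linear function. This is a minimal illustration; the full-scale rotation rate is an assumed calibration constant, not a value from the specification:

```python
def yaw_rate_to_volume(yaw_rate_dps, full_scale_dps=180.0):
    """Map the magnitude of vertical (yaw) axis rotation rate, in degrees
    per second, onto a volume in [0.0, 1.0]. Zero rotation leaves the
    duplicate stream non-audible; faster rotation raises the volume.
    `full_scale_dps` is an assumed calibration constant.
    """
    return max(0.0, min(1.0, abs(yaw_rate_dps) / full_scale_dps))
```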
  • Exemplary embodiments may utilize other or additional types of interface movement/orientation as control input, and may utilize measurements coming from other sensor types. For example, with the user's forearm approximately parallel to the ground, the "roll" angle of an interface (as controlled by, in the neutral operating position, forearm rotation and measured by an acceleration sensor 814) may be used to control the volume of additional duplicate audio streams. In this example, if the relative pitch selection method (see above) was in use and a duplicate audio stream at an interval of a 3rd above was elicited by the user, then rolling the interface such that the thumb is moved to face upwards may cause an additional duplicate audio stream to be made audible at a pitch that is a 3rd below the pitch of the source audio stream.
  • Exemplary embodiments may utilize interface-based portamento control and/or vibrato control to modulate the pitch of one or more duplicate audio streams, in a manner similar to that described elsewhere in this specification.
• Exemplary embodiments may utilize interface-based contextual control and directional control, including oscillation rate control effects employing frequency filters and/or volume gates, in a manner similar to that described elsewhere in this specification.
  • Exemplary embodiments described thus far may utilize real-time pitch detection, that is, the estimation of the pitch or fundamental frequency of an audio signal as it is perceived by a listener.
  • the term "real-time” is used here in the sense that the audio stream processing is taking place approximately as the stream is being recorded or played back. Numerous methods are available for performing real-time pitch detection and can be implemented by persons skilled in the art.
  • Exemplary embodiments described herein may employ real-time pitch shifting.
• in an absolute pitch selection method, as a new digit button actuation event is received, the pitch difference between the corresponding target pitch and the pitch of the duplicate audio stream (prior to shifting) may be calculated. This difference may then be used to calculate the required pitch shift factor.
  • the pitch of the duplicate audio stream may be used to calculate the target pitch.
  • pitch shifting may be achieved by using a fixed shift factor specific to each interval.
• calculating the post-shift pitch may be useful in conjunction with pitch scale filtering for determining if a post-shift pitch would fall within the permitted pitch set. This may ensure that only pitches "permitted" by the pitch scale filter may be produced by pitch shifting. After filtering, the resulting target pitch may be used in calculating the required pitch shift factor.
• For both the absolute and relative methods of pitch selection, once the pitch shift factor has been finalized it may then be used to shift the current pitch of a duplicate audio stream, subject to any pre-set pitch glide effects that may be employed by the system. Pitch correction may be performed before, after, or as part of the main pitch shifting process.
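Both selection methods reduce to a frequency ratio handed to the pitch shifter. A minimal sketch of the two calculations (illustrative names, not from the specification):

```python
def pitch_shift_factor(pre_shift_hz, target_hz):
    """Absolute selection: ratio handed to the pitch shifter. A factor of
    2.0 raises the duplicate stream one octave; 0.5 lowers it one octave."""
    return target_hz / pre_shift_hz

def interval_shift_factor(semitones):
    """Relative selection: fixed per-interval factor, e.g. +4 semitones
    (a major 3rd) gives a factor of about 1.26."""
    return 2.0 ** (semitones / 12.0)
```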
  • Some pitch shifting methods incorporate a technique termed "formant preservation” which is described in more detail elsewhere in this application.
  • Exemplary embodiments may include formant-preserving pitch shifting methods, since these can assist in making shifted pitches sound more "natural” or less “artificial” to a listener.
• Real-time pitch shifting techniques, including those that incorporate formant preservation, can be implemented by persons skilled in the art.
• A diagram representing the processing components involved in exemplary embodiments is shown in FIG. 31.
• a source audio stream 3101 may be reproduced as a duplicate audio stream 3102.
  • the duplicate audio stream's pitch (or pitches) may be estimated by a pitch detector 3103 and this "pre-shift" pitch estimate may then be passed on to a target pitch calculator 3104.
  • input from the digit buttons 3105 may be combined with the pitch estimate to determine the target pitch.
• the target pitch (or pitches) and the pre-shift pitch estimate may then be passed on to a pitch scale filter 3106.
  • the digit button input may also include other information relevant to calculating the target pitch, for example, input from an interface's octave selection mechanism (as detailed elsewhere in this description).
• a pitch scale filter 3106 may be used to determine if the target pitch belongs to the set of "permitted" pitches (e.g., a scale or key) previously chosen by the user 3107. This choice of musical scale may be made by the user prior to engaging in the audio control process, and may be made via the interface (for example by selecting an option on a video display using the digit buttons) or another user interface included in the system. If the target pitch does belong to the permitted set of pitches it may be passed on unaltered to the next system component (along with the pre-shift pitch estimate). If it does not belong to the set, the pitch scale filter may employ one or more algorithms (see above for description) to decide what the altered target pitch should be.
  • target pitches may be selected according to interval choices specified by a diatonic scale (see above for description). Once finalized, the target pitch may then be passed on to the next system component (along with a pre-shift pitch estimate).
  • a pitch corrector 3108 may be used to identify a "sharp” or “flat” target pitch and correct its value (sometimes referred to as "pitch quantization”).
• the target pitch calculator 3104, the pitch scale filter 3106, or both may not be employed.
• digit button input 3105 and a pre-shift pitch estimate may be provided directly to a pitch corrector 3108.
  • each digit button may correspond to a specific target pitch (subject to any octave selection mechanism).
  • the target pitch may be passed on, along with a pre-shift pitch estimate, to a pitch shift calculator 3109.
  • This pitch shift calculator may compare the pre-shift pitch estimate with the target pitch and calculate the shift amount required to make the pitch of the former match that of the latter.
  • This calculated "pitch shift factor” may then be passed on to a pitch shifter 3110 component, which then shifts the duplicate audio stream as directed by the pitch shift factor.
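The FIG. 31 chain from target pitch to shift factor can be condensed into a short sketch. This is an illustration only (for brevity the chromatic correction is applied before the scale filter here, whereas the components above may be ordered differently), with hypothetical names:

```python
import math

C4 = 440.0 * 2 ** (-9 / 12)        # ~261.63 Hz; C4 as the reference
C_MAJOR = {0, 2, 4, 5, 7, 9, 11}   # permitted pitch classes (C = 0)

def shift_factor_for(pre_shift_hz, target_hz, permitted=C_MAJOR):
    """Condensed sketch of the FIG. 31 chain: the target pitch is snapped
    to the chromatic grid (cf. pitch corrector 3108), moved into the
    permitted set if necessary (cf. pitch scale filter 3106), and compared
    against the pre-shift estimate (cf. pitch shift calculator 3109) to
    yield the factor passed to the pitch shifter 3110."""
    semis = round(12.0 * math.log2(target_hz / C4))       # snap to grid
    if permitted and semis % 12 not in permitted:
        semis += 1 if (semis + 1) % 12 in permitted else -1  # scale filter
    final_hz = C4 * 2.0 ** (semis / 12.0)
    return final_hz / pre_shift_hz
```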
  • the duplicate audio stream may then be subjected to additional modulation 3111 (e.g., volume control) as directed by sensor input from an interface 3112.
• additional effects (e.g., compression, reverb, etc.) may also be applied to the duplicate audio stream.
• the pitch detector 3103 may receive an audio signal via components separate from those that provide an audio signal to the pitch shifter 3110.
  • This alternative audio stream 3114 may originate from the same source (e.g. a singer's voice) but the method of transducing the source into a usable signal may be different.
  • the alternative audio stream may be generated through signals obtained from one or more contact microphones (or any other device that measures vibration through direct contact) worn on the singer's body.
• a contact microphone (also referred to as a piezoelectric microphone) may be worn on a singer's neck, chest, or head (e.g. in contact with bone inside the outer ear).
  • these contact microphone signals may undergo amplification and frequency filtering prior to being supplied to the pitch detector 3103.
  • the pitch detector may not require input from the duplicate audio stream 3102 because the signal for measuring the pitch of the sound source (e.g. a singer's voice) may be supplied by the alternative audio stream 3114.
  • the calculation at stage 3109 of the required pitch shift may be based on signals from the alternative audio stream, the actual audio that would undergo pitch shifting may be that of the duplicate audio stream.
  • the advantage of this exemplary embodiment may be that the alternative audio stream 3114 carries much less signal from sounds extraneous to that of the desired sound source (e.g. unwanted sounds emanating from other musical instruments), due to the low sensitivity of the alternative transduction method (e.g. contact microphone) to airborne vibration.
• This "cleaner" signal may allow a more accurate measurement of the pitch of the desired sound source by the pitch detector 3103.
  • Exemplary embodiments may allow the user to exert substantially gradated, as well as discrete, control over the pitches of sounds.
  • exemplary embodiments may comprise three components.
  • the first component may be a user interface 3210, through which the user may create control signals that are used to direct the audio effects.
  • Exemplary embodiments may use one or more of the exemplary interfaces described herein.
  • Exemplary embodiments may include the use of interface orientations and/or motions to provide one or more substantially continuous values, and/or digit buttons to provide one or more discrete values.
• Appropriate input may include input that can provide one or more discrete input values (for triggering individual pitches or notes, for example) and/or one or more substantially continuous values (e.g., a number that may take values between 0 and 100, and can perform the same role as, for example, data derived from a sensor component that measures angular rotation rate or orientation around a vertical axis).
  • a MIDI keyboard equipped with a MIDI control wheel may provide discrete output events via the keyboard keys and substantially continuous values via the MIDI control wheel.
• moving or orienting a motion, orientation, and/or position sensitive mobile device (e.g., a cell phone, PDA, hand-held video game device, or tablet computer) may provide one or more substantially continuous values suitable for use in exemplary embodiments.
  • moving a finger across a touch sensitive screen may also provide one or more substantially continuous values, while contacting specific points on said touch screen may elicit discrete output events.
• Exemplary embodiments may be implemented on a mobile computing device (e.g., cell phone, PDA, hand-held video game device, or tablet computer, etc.), a video game platform (e.g., the Microsoft Xbox, Sony Playstation, or Nintendo Wii, etc.), or another computer, either in association with, or independent from, the exemplary interfaces described herein.
  • the second component in exemplary embodiments may be a data processor 3211 which may receive control signals from the user interface, convert these control signals into audio data, and pass on the processed information to the audio production device 3212.
  • the audio production device may either make the audio data perceivable to the user and/or their audience via conventional methods, or may record these data for later use. Methods for presenting the audio information may include audio speakers, headphones, etc.
  • the data processor 3211 may also employ components for receiving commands from the user that modify its overall operation, providing the option to turn a specific modulatory sound effect on or off, for example.
• the following is a summary of an audio effect achieved by some exemplary embodiments: allowing the user to trigger specific musical sounds and to control the pitch of these sounds in a gradated manner.
  • the user interface may employ components to measure its orientation and movement within multiple axes in space.
  • Exemplary embodiments may use an interface's orientation or rotation around the vertical (yaw) axis to control said gradated pitch shifting of a musical sound (however, orientation in either the pitch or roll axes may be used for this purpose instead).
  • a data processor 3211 may be configured to produce a variety of different musical sound data to be modulated by the pitch shift mechanism (and then made audible by the audio production device 3212).
  • exemplary embodiments may include the capacity to produce musical sounds that have the sound qualities of an electric slide guitar.
• a user interface may be employed by the user to activate ("trigger") and/or deactivate the musical sound generated by the data processor.
  • the total rotation (in either direction from the start point) around the yaw axis that may be required to reach the pitch of the second note may be configured to be proportional to the pitch difference between the first and second notes.
  • the total required rotation may also be subject to a pre-set value chosen by the user to scale the required rotation to suit their preference.
  • the user may be able to specify that once the required extent of rotation (to shift from the first to the second note) has been reached the pitch will remain at the pitch of the second note despite continued rotation, unless the user rotates back towards the start point (the yaw orientation at the time the first note was triggered), thereby shifting the pitch back to that of the first note. If the user rotates the interface back from reaching the pitch of the second note (the end point) towards the start point, the system may be configured such that rotating past the start point will not shift the pitch further beyond that of the first triggered note.
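The rotation-to-pitch mapping described in these bullets can be sketched as a clamped interpolation. This is an illustration under assumed names; per the text above, the full rotation required would in practice be proportional to the pitch difference between the two notes, scaled by a user preference:

```python
def glide_pitch(start_hz, end_hz, rotation_deg, full_rotation_deg):
    """Pitch during a yaw-controlled glide from a first note to a second.

    `rotation_deg` is the rotation (in either direction) from the start
    point; it is clamped so that rotating past either end has no further
    effect, matching the behaviour described above. Interpolation is in
    log frequency so the glide is perceptually even.
    """
    t = max(0.0, min(1.0, abs(rotation_deg) / full_rotation_deg))
    return start_hz * (end_hz / start_hz) ** t
```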
  • the user may be given the option of allowing additional effects to occur once the pitch of the second note is reached. For example, once this end point is reached a tremolo effect that is controlled by the velocity of rotation around the pitch axis may be automatically activated. As would be apparent to a person skilled in the art, a large number of different audio effects may be assigned to the various control signals of the user interface, providing the user with a greater range of control over the produced musical sounds.
  • the user may un-actuate the first note on the interface (while keeping the second note active) and trigger a third note. Rotation around the yaw axis in either direction may then gradually shift from the pitch of the second note to that of the third note. Obviously this process may be carried on ad infinitum, starting with the second note being un-actuated and a fourth note being triggered and so on.
  • the user may have access to a configuration whereby actuating a digit button on an interface may trigger more than one sound, each with its own pitch.
• These pitches may have harmonic interval relationships, and rotation around the yaw axis may cause the harmonic set of "first" pitches to shift in unison to reach a harmonic set of "second" pitches.
  • the pitch shifting described above may be controlled via a comparison of the motion and/or orientation of the two interfaces. For example, actuation of a button on one interface may select the first note (start point) and actuation of a button on the other interface may select the second note (end point). If the user begins by holding the two interfaces at different orientations (e.g., on the lateral or vertical axes), then reducing the orientation difference between them may be configured to gradually shift the pitch of the start note to that of the end note. Alternatively, increasing the orientation difference between the two interfaces may be configured to gradually shift the pitch of the start note to that of the end note.
  • a "portamento effect" may be achieved that does not require more than one digit button to be actuated simultaneously.
• the start note and end note of the pitch shift may be continually redefined based on the order in which digit buttons are actuated. For any digit button actuation that occurs after the first actuation in a session of use, the pitch of the musical sound that is elicited may correspond to the pitch assigned to the previously-actuated digit button.
• the pitch of the elicited sound may gradually shift to the pitch assigned to the currently-actuated digit button, with said pitch shift occurring at a rate proportional to the rate of rotation.
• assuming the distal thumb button is assigned a pitch of C and the distal index finger button is assigned a pitch of D (and that at least one digit button actuation has already occurred), actuating the distal thumb button may elicit a musical sound with the pitch of the previously actuated digit button.
  • the pitch of the musical sound may gradually shift to C.
• the system may be configured to prevent further pitch shifting from occurring as a consequence of continued vertical axis rotation in the same, or both, directions.
• actuating the distal index finger button may then elicit a musical sound with a pitch of C; rotating the interface left or right around the vertical axis, while maintaining actuation of the distal index finger button, may then gradually shift the pitch of the musical sound to D.
  • This process may be continued indefinitely, allowing the user to play musical sounds with a portamento effect.
• the system may also be configured to modulate the activation and/or speed of such a portamento effect via one or more other control parameters. For example, rotating the interface beyond a certain angle around the longitudinal (roll) axis may activate the portamento effect, and rotating further beyond this angle may modulate the speed of the effect.
  • Exemplary embodiments described herein may employ real-time pitch shifting.
• the method by which pitch shifting is achieved may depend on the nature of the audio to be shifted. For example, if the audio is the product of hardware or software synthesis, pitch shifting may be achieved by changing actual synthesis parameters (i.e., whereby the interface is used to control the pitch or pitches at which the audio is synthesized in an ongoing process). In another example, if the audio is derived from recorded audio samples then real-time pitch shifting methods may be employed. Some pitch shifting methods, including those that employ "formant preservation", are described in more detail elsewhere in this application, and can be incorporated into the hardware or software of exemplary embodiments by persons skilled in the art.
  • the data processing required for the functions described above may be performed by the data processor 3211 (see FIG. 32).
  • the data processor may be a personal computer that communicates with the user interface either wirelessly or via a cable connection.
  • the required data processing and audio data generation described here are achievable by conventional methods that may be implemented in software, hardware, or a combination of the two, and are hence achievable by persons skilled in the art.
  • orientation, motion, or position of an interface may be used to control other aspects of sound in addition to pitch.
  • orientation or motion around the yaw, pitch, or roll axes may be assigned to modulatory sound effects.
  • the velocity of rotation around the yaw axis may be assigned to modulate the musical sound with a "wah-wah” effect, similar to the effects processing that takes place in "wah-wah” effects pedals (controlled by motion of the player's foot) used to process electric guitar signals.
  • the larger the rotation velocity the stronger the wah-wah effect may be configured to become.
• Exemplary embodiments may allow the user to control recorded or synthesized audio; or the visual component of recorded video or synthesized visual data; or both. As illustrated in FIG. 33, exemplary embodiments may comprise four components. One of these components may be a user interface 3310, through which the user creates control signals that may be used to direct audio and/or visual effects generated by the system. Exemplary embodiments may use one or more of the exemplary interfaces detailed elsewhere in this description. Exemplary embodiments may include the use of interface orientations and/or motions to provide one or more substantially continuous values, and/or digit buttons to provide one or more discrete values.
  • Appropriate input may include input that can provide one or more discrete input values (for triggering individual pitches or notes, for example) and/or one or more substantially continuous values (e.g., a number that may take values between 0 and 100, and can perform the same role as, for example, data derived from a sensor component that measures angular rotation rate or orientation around a vertical axis).
  • a MIDI keyboard equipped with a MIDI control wheel may provide discrete output events via the keyboard keys and substantially continuous values via the MIDI control wheel.
  • moving or orienting a motion, orientation, and/or position sensitive mobile device may provide one or more substantially continuous values suitable for use in exemplary embodiments.
  • moving a finger across a touch sensitive screen may also provide one or more substantially continuous values, while contacting specific points on said touch screen may elicit discrete output events.
• Exemplary embodiments may be implemented on a mobile computing device (e.g., cell phone, PDA, hand-held video game device, or tablet computer, etc.), a video game platform (e.g., the Microsoft Xbox, Sony Playstation, or Nintendo Wii, etc.), or another computer, either in association with, or independent from, the exemplary interfaces described herein.
  • an additional component may be a data processor 3311 which may receive audio and visual information from a video sample 3312 and control signals from an interface.
  • the data processor may process the information from these two sources and pass on the processed information to an audio/visual production device 3313.
  • the data processor 3311 may be a personal computer that communicates with the interface either wirelessly or via a cable connection or equivalent method.
  • the audio/visual production device may make the audio and/or visual video information perceivable to the user and/or their audience via conventional methods, or record this information for later use.
• Methods for presenting the video information may include a television, or computer screen, or light projector, etc.
  • Methods for presenting the audio information include audio speakers, or headphones, etc.
  • the data processor 3311 may also possess the capacity to receive commands from the user that modify its overall operation, providing the option to turn a specific modulatory sound effect on or off, for example.
  • the interface may possess the capacity to measure its orientation and movement within multiple axes in space.
  • the interface's orientation around the yaw axis may be used to control the video sample's "track position" (however, orientation in either the pitch or roll axes may be used for this purpose instead).
• the term "track position" refers to the part or point in a sample that is currently being made audible or "played", and for the visual and audio components of a video sample a track position value may refer to a matching position in the two components.
  • the video track position may be progressed gradually from beginning to end for the visual and/or audio components of the video. For example, if a video sample has 25 frames per second with a duration of 6 seconds, it will contain 150 frames in total. If the interface's control range for yaw rotation is pre-set by the user to be north to north-east, then rotating the interface from north to north-east would gradually switch through the video frames 0 to 150 (i.e., from 0 seconds to 6 seconds). Conversely, rotating the interface from north-east to north would gradually switch through the video frames 150 to 0. Thus the user may choose to move in either direction through the video and at any rate.
  • This interface- based control means they may also pause at any frame within the video, and change direction of movement through the video at any frame.
  • the audio component of a video sample may also have its playback controlled in the same way, in sync with the visual component.
  • the system may be configured such that moving beyond the two pre-selected limits within the yaw rotation range of the interface (i.e., from north towards north-west or from north-east towards east) may have no further effect on the visual and audio components of the video.
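The yaw-to-frame mapping in the example above (north to north-east over a 150-frame sample, with no effect beyond the pre-selected limits) can be sketched as follows; the angle values are from the example, the function itself is illustrative:

```python
def yaw_to_frame(yaw_deg, start_deg=0.0, end_deg=45.0, total_frames=150):
    """Map a yaw orientation onto a video frame index.

    The defaults model the example above: a north (0 degrees) to
    north-east (45 degrees) control range over a 150-frame (6 s at
    25 fps) sample. Orientations outside the range are clamped, so
    rotating past either limit has no further effect.
    """
    t = (yaw_deg - start_deg) / (end_deg - start_deg)
    return round(max(0.0, min(1.0, t)) * total_frames)
```

Rotating from north to north-east sweeps the result from frame 0 to frame 150, and rotating back sweeps it from 150 to 0, at whatever rate and direction the user chooses.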
  • Exemplary embodiments that use the interface's orientation around the yaw axis to control a video sample's track position may do so using measurements from one or more angular rate sensors or one or more magnetic field sensors or a combination of the measurements from the two sensor types.
  • track position control may be based on angular distance travelled rather than estimating absolute yaw values (e.g., north, south, etc.). In other words, estimates of relative yaw orientation may be used. In exemplary embodiments angular rate and magnetic field sensing estimates of absolute yaw orientation may be used.
• Exemplary embodiments may employ audio processing methods that achieve audio that is substantially pitch-constant and continuously-audible regardless of the rate at which the video is played through. Halting progress at a particular track position may render the image motionless, and this image may be perceived to have consistency with the moving images that appeared when the video was being played through (either backwards or forwards).
  • the audio component of the video (termed “audio track”), however, may become far less perceptually-consistent when the rate at which the video is played through changes from normal speed.
• audio tracks require being "played through" (i.e., progressed either forwards or backwards) to allow the modulating pressure waves that are perceived as audible sound to be produced at all.
  • the rate at which an audio track is played through may also affect the perceived pitch of the audio. Techniques for overcoming the dependence of audibility and pitch on audio playback rate are described below.
• Techniques for achieving the audio effects of pitch-constancy and continuous-audibility are often described as "audio timescale-pitch modification" or "audio time stretching". These techniques include two methods termed "time domain harmonic scaling" and "phase vocoding". These techniques can produce audio that matches the pitch (sound frequency) of an audio track played at normal speed despite the audio track being played through faster or slower relative to normal speed, and/or in reverse. These techniques may also be used to shift the pitch (or pitches) of an audio track by a chosen amount.
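The idea behind time stretching (changing duration without changing pitch) can be illustrated with a deliberately naive overlap-add sketch. This is not one of the methods named above; it is a toy illustration only, and practical systems would use time domain harmonic scaling (e.g. WSOLA) or a phase vocoder to avoid the artifacts this version produces:

```python
import math

def ola_time_stretch(samples, stretch, frame=256, hop=64):
    """Toy overlap-add time stretch: the output lasts `stretch` times the
    input duration, while pitch stays roughly constant because each
    windowed grain is replayed at its original sample rate; only the read
    position advances slower (stretch > 1) or faster (stretch < 1)."""
    out_len = int(len(samples) * stretch)
    out = [0.0] * (out_len + frame)
    # Hann window, so overlapping grains cross-fade smoothly
    window = [0.5 - 0.5 * math.cos(2 * math.pi * n / frame) for n in range(frame)]
    pos = 0
    while pos + frame <= out_len:
        in_pos = int(pos / stretch)  # read head moves at 1/stretch speed
        if in_pos + frame > len(samples):
            break
        for n in range(frame):
            out[pos + n] += samples[in_pos + n] * window[n]
        pos += hop
    return out[:out_len]
```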
  • Pitch shifting methods may incorporate a technique termed "formant preservation".
• Formants are prominent frequency regions produced by resonances in an instrument's or vocal tract's structure that have a strong influence on the timbre of its sound. If the pitch of an audio track is shifted, the formants will be moved, thereby producing an altered quality of sound that a listener may consider very different from the original.
• For the audio timescale-pitch modification techniques mentioned above, corresponding methods are available for changing the formants to compensate for the side effects of pitch shifting and thereby "preserve" the formants.
  • Exemplary embodiments may include formant-preserving methods as part of their audio timescale-pitch modification.
  • Audio timescale-pitch modification may be implemented in hardware and/or software by persons skilled in the art.
  • the audio timescale-pitch modification may be performed by the data processor.
• this audio processing may contribute to the perception that, within the events of the video, time is being sped up, slowed down, reversed, or halted altogether.
  • the audio timescale-pitch modification will be referred to as the "time stretch algorithm”.
• an interface 3310 may also provide a user with the opportunity to control when they would like the audio track of the video sample to be made audible and the pitch at which they would like this audio to be made audible.
  • the employed interface includes one or more digit buttons
  • exemplary embodiments may be configured such that the audio of the video may only be audible when one or more digit buttons are actuated.
  • the pitch (or pitches) of the audio may be specified by the user's choice of which digit button (or buttons) to actuate.
  • the user may also be given control over when the audio track of the video is audible and at what pitch. This may allow, for example, the user to create melodies using the sound from the video's audio track.
  • exemplary embodiments may allow more than one stream of audio to be activated at one time and at different pitches. In this configuration the user may actuate more than one digit button at a time, thereby initiating multiple streams of the audio track to be produced at the pitches specified by the actuated digit buttons. This feature may allow, for example, the user to create pitch harmonies.
  • a video sample used with an exemplary embodiment was of an individual singing one or more words
  • the user may be able to control the rate and direction in which those words were sung.
  • rotating the interface from north to north-east may produce synchronized visual and audio video components of said individual singing the phrase at a rate proportional to the speed of the rotation from north to north-east.
  • rotating from north-east to north may produce synchronized visual and audio video components of said individual singing the phrase backwards at a rate proportional to the speed of the rotation from north-east to north.
  • the user may also be able to pause at any track position, during a vowel sound for example, and a sound that is representative of the vowel at that track position may continue to be produced (along with the halted visual image at that track position).
  • the user may have control over when the audio track is audible (i.e., when at least one audio stream is active).
  • the pitch of initiated audio streams (e.g., via one or more digit buttons)
  • the user may also control the pitch (or pitches) at which this audio is played. In the "singer" video example, the pitch and track position controls provided by the interface may contribute to the perception that the user is controlling (in terms of phrasing and pitch) how the individual in the video is singing the phrase.
  • any video material may be used by exemplary embodiments to create interesting visual and audio effects using methods similar to those described above.
  • the user may also be given the opportunity to preset a "pitch glide" value that may modulate the pitch of audio streams initiated via an interface. For example, if an audio stream is triggered soon after a previously triggered audio stream has been deactivated (or, if only one audio stream is permitted at a time, prior to deactivation), the pitch of the newly-triggered audio stream may shift (either up or down) from the pitch of the previous audio stream to the designated pitch of the newly-triggered audio stream. By choosing the pitch glide value the user may determine over what duration this shift takes place.
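A minimal sketch of the pitch glide described above, assuming a linear interpolation in semitones over the user's pre-set glide duration (the specification does not prescribe a particular glide curve):

```python
def glide_pitch(t, t_trigger, prev_semitones, target_semitones, glide):
    """Pitch (in semitones) of a newly triggered audio stream at time t,
    gliding linearly from the previous stream's pitch to the new stream's
    designated pitch over the pre-set `glide` duration (seconds)."""
    if glide <= 0 or t >= t_trigger + glide:
        return float(target_semitones)  # glide complete (or disabled)
    frac = max(0.0, (t - t_trigger) / glide)
    return prev_semitones + frac * (target_semitones - prev_semitones)
```

With a glide of 0.5 s, a stream triggered a perfect fifth (7 semitones) above the previous one reaches half the interval after 0.25 s.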
  • the user may also be given the opportunity to pre-set the "attack" and/or "decay" aspects of the audio stream triggering, whereby the user may choose how rapidly the audio volume rises after triggering (attack) and/or how rapidly the audio volume diminishes after an audio stream is deactivated (decay).
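The pre-set attack and decay controls can be sketched as a simple volume envelope. The linear shape is an assumption; exponential curves would serve equally well.

```python
def envelope_gain(t, note_on, note_off=None, attack=0.05, decay=0.2):
    """Volume gain in [0, 1] at time t (seconds) for an audio stream
    triggered at note_on and, if note_off is given, deactivated then."""
    if t < note_on:
        return 0.0
    # Gain reached by the end of the active period (linear rise over `attack`).
    end = t if note_off is None else min(t, note_off)
    gain = min(1.0, (end - note_on) / attack) if attack > 0 else 1.0
    if note_off is not None and t > note_off:
        # Linear fall over `decay` seconds after deactivation.
        gain *= max(0.0, 1.0 - (t - note_off) / decay) if decay > 0 else 0.0
    return gain
```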
  • a variety of additional effects may be configured to be controlled via data generated from an interface 3310 (see FIG. 33).
  • a tremolo effect applied to an audio stream may be configured to be controlled by the rotational velocity of the interface around its lateral axis (i.e., the "pitch" angle of the interface).
  • the brightness of the video image may be configured to be reduced while no audio streams are active.
  • the volume of the audio may be configured to be reduced when the video is being played in a reverse direction, as opposed to when it is being played in a forward direction.
  • the volume of the audio may be configured to be controlled by an axis of rotation on the interface, for example, the longitudinal axis (i.e., the "roll" angle of the interface).
  • Exemplary embodiments may utilize interface-based portamento control and/or vibrato control to modulate the pitch of the audio track of a video sample in a manner similar to that described elsewhere in this specification.
  • Exemplary embodiments may utilize interface-based contextual control and directional control including oscillation rate control effects employing frequency filters and/or volume gates, in a manner similar to that described elsewhere in this specification.
  • a large variety of additional alternative audio and visual effects may be configured to be controlled via an interface, and this should not be considered a complete list.
  • Exemplary embodiments may use the data processor 3311 (see FIG. 33) to execute an algorithm as described in the following text and in FIG. 34.
  • Two preliminary procedures 3410 may be performed prior to initiating an ongoing real-time procedure 3414; these may include extracting an audio track from a video sample 3411 and modifying the pitch of this audio track 3412.
  • the pitch of the audio track may be modified such that its pitch is set to a single pitch for the duration of the audio track, or to multiple consecutive constant pitches that change at defined track positions. If the audio is monophonic (for example a human voice) and its pitch varies little during the audio track, it may be desirable to tune the entire sample to a single pitch. If the pitch varies significantly it may be desirable instead to tune the audio track to multiple consecutive pitches. If the audio track is polyphonic the pitch processing may be configured to make each pitch in the polyphony continuous for the duration of the audio track.
  • the processed audio sample may be passed on with data specifying which pitch (or pitches) the audio track is tuned to and, if the pitch varies, at which track positions the pitch changes occur.
  • Numerous methods are available for performing pitch detection, including those that analyze audio signals in the frequency or time domain, and these can be implemented by persons skilled in the art.
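The retuning in step 3412 can be sketched with equal-tempered semitone arithmetic: once the pitch contour has been detected, a per-frame correction maps each detected frequency onto the chosen constant target pitch. The correction scheme shown is an illustrative assumption.

```python
import math

def flatten_contour(detected_hz, target_hz):
    """Per-frame pitch correction, in semitones, that retunes a varying
    detected pitch contour to a single constant target pitch.

    A positive value means the frame must be shifted up; a negative
    value means it must be shifted down.
    """
    return [12 * math.log2(target_hz / f) for f in detected_hz]
```

A frame already at the target needs no correction; a frame an octave below needs +12 semitones.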
  • The next step 3413 in the algorithm may be to load the pitch-shifted audio track into a time-stretch algorithm buffer (along with the audio track's pitch information) and to load the visual component of the video sample into the video buffer.
  • the triggered audio streams may be the only audible sound produced by the system, and the original audio track in the video sample may not be made audible.
  • the first performed step may be to retrieve the current commands from the interface 3415. These commands may include updates on audio stream activation, pitch selection, track position, and additional effects. Due to the processing in step 3412, the pitch of the pre-processed audio track may be known for some or all track positions.
  • the pitch difference between the known current pitch of the audio track and the pitch (or pitches) specified by the interface may be calculated 3416. This pitch difference may then be used to shift the current pitch of the audio track to the desired pitch 3417, potentially subject to any pre-set pitch glide effect.
  • triggering an audio stream via the user interface may produce a sound that is "representative" of the sound at that track position (i.e., substantially similar to the sound of the audio track at that position when it is being played through at normal speed, aside from a chosen shift in pitch).
  • the next step in the real-time procedure 3414 may be to apply additional effects to the current audio and visual video data 3418 in accordance with the current commands received from the user interface in step 3415.
  • the pre-set rise or decay in volume of active or recently deactivated audio streams may be taken into account when calculating the required audio volume level (or levels, in the case of simultaneously active audio streams).
  • the updated visual and audio video data may be transferred to the audio/visual production device 3313 (see FIG. 33) to be made visible and audible (steps 3419 and 3420).
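One pass of the real-time procedure 3414 (steps 3415 to 3420) can be sketched as below. The command dictionary and the simple shift arithmetic are illustrative assumptions, not the patent's concrete data structures.

```python
def realtime_step(commands, track_pitch_semitones):
    """One pass of the real-time procedure: given the latest interface
    commands (step 3415) and the known pitch of the pre-processed track
    at the current position, compute the pitch shift each active audio
    stream requires (step 3416) before the shift is applied (step 3417).
    """
    shifts = [target - track_pitch_semitones
              for target in commands["stream_pitches"]]
    return {
        "shifts": shifts,                 # semitones, one per active stream
        "audible": bool(shifts),          # silent when no stream is active
        "position": commands["position"], # forwarded to steps 3418-3420
    }
```

Effects processing (step 3418) and audio/visual output (steps 3419 and 3420) would consume the returned values each pass.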
  • a MIDI keyboard equipped with a MIDI control wheel may act as the interface in the system. Audio stream/pitch commands may be elicited via the keyboard keys and track position may be controlled via the MIDI control wheel.
  • the visual component may be omitted such that only the audio streams are produced and made audible and/or recorded. In exemplary embodiments the audio component may be omitted such that only the visual component is made visible and/or recorded.
  • the interface may be used to rapidly select between individual audio or video samples, and/or select between positions within an audio or video sample.
  • rotation of the interface around its vertical axis may be configured to advance (either forward or backwards) through a sample's duration and the digit buttons may allow the user to select which sample is to undergo said advancement.
  • the distal thumb button may be configured to select audio sample A, the distal index finger button to select audio sample B, the distal middle finger button to select audio sample C, and so on.
  • the beginning point of advancement for a sample may reset to the beginning of the sample each time its corresponding digit button is actuated.
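The selection and reset behaviour described in the last few bullets can be sketched as a small state machine. The button-to-sample assignments and the rotation-to-seconds conversion are hypothetical.

```python
class SampleSelector:
    """Sketch of digit-button sample selection: each button picks a
    sample and resets that sample's advancement point to its beginning,
    while rotation around the vertical axis advances (or rewinds)
    playback through the sample's duration."""

    BUTTONS = {"thumb": "A", "index": "B", "middle": "C"}  # assumed mapping

    def __init__(self):
        self.current = None
        self.position = 0.0  # seconds into the selected sample

    def press(self, button):
        self.current = self.BUTTONS[button]
        self.position = 0.0  # reset to the start of the sample

    def rotate(self, delta_seconds):
        # Positive deltas advance forwards, negative deltas rewind;
        # playback cannot rewind past the beginning of the sample.
        self.position = max(0.0, self.position + delta_seconds)
```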
  • Rotating the interface either left or right around the vertical axis may be configured to cause the audio sample to advance forwards through the sample's duration.
  • a variety of other configurations are also possible, including rightwards rotation advancing the sample forwards and leftwards rotation advancing it backwards through the sample.
  • other axes of rotational or translational motion may be used to control sample advancement.
  • the rate of advancement may be proportional to the rate of motion, whereby the perceived pitch of an audio sample would be lower if the motion were slower and higher if the motion were faster.
  • the perceived pace of events within a video sample would be slower if the motion were slower and faster if the motion were faster.
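The proportionality between motion rate and advancement rate, and its effect on perceived pitch, can be sketched as follows. The tuning constant `scale` is an assumed value; the octave-per-doubling relation holds for resampling-style playback.

```python
import math

def playback_rate(angular_velocity_dps, scale=0.01):
    """Sample advancement rate (as a fraction of normal speed)
    proportional to the interface's rotation speed in degrees/second.
    `scale` is an assumed tuning constant."""
    return angular_velocity_dps * scale

def perceived_semitone_offset(rate):
    """Pitch offset heard when audio is advanced at `rate` times normal
    speed: doubling the rate raises the perceived pitch by an octave,
    halving it lowers the pitch by an octave."""
    return 12 * math.log2(rate)
```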
  • Exemplary embodiments of the kinds described above would allow the user to produce audio and visual effects similar to "turntablism" hardware and software, but with the advantage of combining rapid sample selection and advancement into a single interface that can be operated with one hand and has strong live performance appeal.
  • Exemplary embodiments may utilize interface-based contextual control and directional control effects to modulate the selected samples, including oscillation rate-control effects employing frequency filters and/or volume gates, in a manner similar to that described elsewhere in this specification. As would be understood by a person skilled in the art, a large variety of additional alternative effects modulating selected samples may be configured to be controlled via an interface, and this should not be considered a complete list.
  • a hand operated input device comprising: a plurality of activation means configured to be activated by the digits of the user; and an output means for outputting a series of currently active activation means;
  • the hand operated input device wherein said device includes at least one sensor means for measuring a current motion, position, or orientation value of the input device, and that can pass these measurements on to said output means.
  • attachment means secure the device to the user's hand.
  • the hand operated input device wherein the device is designed to remain in close contact with the hand during operation.
  • the hand operated input device wherein, when the device is in a fixed position relative to the user's hand, said device includes at least one activation means capable of being actuated by contact with a surface of one of the user's digits and at least one more activation means capable of being actuated by contact with a different surface of the same digit.
  • the hand operated input device wherein, when the device is in a fixed position relative to the user's hand, said device includes a first activation means capable of being actuated by contact with a first surface of one of the user's digits, a second activation means capable of being actuated by contact with a second different surface of the same digit, and a third activation means capable of being actuated by contact with a third different surface of the same digit.
  • the hand operated input device wherein, when the device is in a fixed position relative to the user's hand, said device includes at least one activation means capable of being actuated by contact with the distal phalanx of one of the user's digits and at least one more activation means capable of being actuated by contact with a segment of the same digit other than its distal phalanx.
  • the hand operated input device wherein the activation means are mapped to audio or video samples, or different time points within audio or video samples.
  • the hand operated input device wherein the activation means are located on a plurality of module means, each module means being configured for access by a single user digit;
  • the hand operated input device wherein the number of activation means per finger is at least 2 and there is one or no activation means for the thumb.
  • the hand operated input device wherein the number of activation means per digit is at least 1.
  • the hand operated input device wherein the number of activation means per digit is at least 2.
  • the hand operated input device wherein the number of activation means per digit is at least 3.
  • the hand operated input device wherein the digits include the fingers and thumb of a user.
  • the hand operated input device wherein said sensor means include at least one angular rate sensor measuring the rate of angular rotation of the device around the lateral, longitudinal, or vertical axis of the device.
  • the hand operated input device wherein said sensor means include at least one orientation sensor measuring the orientation of the device around the lateral, longitudinal, or vertical axis of the device.
  • the hand operated input device wherein said sensor means measure the orientation of the device around the lateral, longitudinal, and vertical axes of the device.
  • the hand operated input device wherein said sensor means measure the orientation of the device around the lateral and longitudinal axes of the device.
  • the hand operated input device wherein the sensor means measure at least one position value of the device.
  • the hand operated input device wherein said device further includes an elongated portion counterbalancing the weight of the activation means when in use by a user.
  • the hand operated input device wherein the position of one or more activation means is adjustable.
  • the hand operated input device wherein the distance of one or more activation means from the user's palm is adjustable.
  • the hand operated input device wherein the lateral position of one or more activation means relative to the user's palm is adjustable.
  • the hand operated input device wherein the distance of the device's contact surface for the user's attached hand relative to the rest of the device is adjustable.
  • the hand operated input device wherein the device's contact surface for the user's attached hand includes ventilation means.
  • the hand operated input device wherein said output means includes a wireless transmission means for wireless transmission of the output.
  • each of the activation means can be actuated either individually or in combination with other activation means.
  • the hand operated device wherein at least one axis of the orientation of the device is mapped to output the octave of a sound's perceived pitch.
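The octave mapping in the preceding clause can be sketched as a quantization of one orientation axis; the base octave and the degrees-per-octave step are hypothetical parameters.

```python
def octave_from_angle(angle_deg, base_octave=4, degrees_per_octave=30.0):
    """Hypothetical mapping from one orientation axis of the device to
    the octave of a sound's perceived pitch: every `degrees_per_octave`
    of tilt shifts the output up or down one octave."""
    return base_octave + int(angle_deg // degrees_per_octave)
```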
  • the hand operated input device wherein the direction of rotational or translational motion of the device acts as a method for selecting specific audio or visual outcomes.
  • The hand operated input device wherein at least one measurement of rotational motion, translational motion, orientation, or position of the device acts to modulate audio or visual outcomes controlled by another measurement of rotational motion, translational motion, orientation, or position.
  • the hand operated device wherein one or more axes of the orientation of the device are mapped to a series of zones.
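The zone mapping can be sketched by dividing an orientation axis into contiguous ranges; the zone count and span are illustrative choices.

```python
def zone_for_angle(angle_deg, num_zones=8, span_deg=360.0):
    """Map an orientation angle to one of `num_zones` equally sized
    contiguous zones covering `span_deg` degrees (values assumed)."""
    return int((angle_deg % span_deg) / (span_deg / num_zones))
```

Angles wrap around, so -45° falls in the same zone as 315°.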
  • the hand operated device wherein the device is used to interact with a video game.
  • the hand operated device wherein the device is used to control a lighting system.
  • the hand operated device wherein the device is used to remotely control a robot or vehicle.
  • the hand operated device wherein the device provides haptic feedback to the user.
  • the hand operated device wherein the device sends input to audio or visual processing software on a computer.
  • the hand operated device wherein the device is used to modify at least one of an audio signal and a video signal.
  • the hand operated device wherein the sensor means comprises at least one of an accelerometer that measures static acceleration, an accelerometer that measures dynamic acceleration, a gyroscope that measures rotational motion, or a magnetometer that measures magnetic fields.
  • the hand operated device wherein the position of the device is estimated based on the interaction between a signal emitter and a signal receiver, one of which is located in the device and the other of which is physically separate from the device.
  • the hand operated device wherein sounds controlled by the device can be modulated by a portamento effect controlled by the sequence of actuation of activation means and/or motion, orientation, or position of the device.
  • the hand operated device wherein sounds controlled by the device can be modulated by a vibrato effect controlled by motion, orientation, or position of the device after the actuation of activation means.
  • the hand operated device wherein sounds controlled by the device can be modulated by a tempo-synced oscillation rate-based effect controlled by the orientation or position of the device and/or directions of motion of the device.
  • bow velocity modulates the sound of a stringed instrument or breath velocity modulates the sound of a wind instrument.
  • activation means are mapped to letters or numbers and motion, position, or orientation modulates this mapping.
  • the hand operated input device wherein the device includes an arrangement of activation points subdivided into sets assigned to each digit, the number of sets being at least four.
  • the hand operated input device wherein the device includes an arrangement of activation points subdivided into sets assigned to each digit, the number of sets being at least three.
  • a hand operated input device comprising: a plurality of activation points configured to be activated by the digits of the user; at least one sensor means for measuring a current motion, position, or orientation value of the input device; and an output means connected to the activation points and the sensor means for outputting a series of currently active activation points and at least one of the motion, position, or orientation values of the input device.
  • control audio sample is a person's sung or spoken voice.
  • control audio sample is a sound that can be controlled for musical effect.
  • the hand operated device wherein one or more distinct audio samples is simultaneously played back at a constant rate that is not controlled via the input device.
  • The hand operated device wherein actuation of activation points is used to select between control audio samples or playback start points within control audio samples.
  • the hand operated device wherein an axis of orientation of the device is used to control the pitch of the control audio sample.
  • the hand operated device wherein visual and/or audio elements provide feedback on a user's performance of control thereby imbuing a game-like quality to the task.
  • An entertainment system comprising: a user input device providing a series of user-controlled input data streams comprising substantially continuous input values and substantially discrete input values; and an output component connected to said user input data streams; wherein said output component outputs said input data streams for playback control of an audio sample (the "control audio sample").
  • control audio sample is a person's sung or spoken voice.
  • control audio sample is a sound that can be controlled for musical effect.
  • control over a visual video component sample associated with the control audio sample is simultaneously exerted by user-controlled substantially continuous input data.
  • The system wherein user-controlled discrete input values are used to gate the audibility of the control audio sample.
  • control of one or more sequential sections of the control audio sample requires a direction-specific user action, with the required direction indicated visually.
  • a hand operated input device comprising: a plurality of activation points configured to be activated by the digits of the user; at least one sensor means for measuring a current motion, position, or orientation value of the input device; and an output means interconnected to the activation points and the sensor means for outputting a series of currently active activation points and at least one motion, position, or orientation value of the input device; wherein movement of the device modulates one or more duplicate audio streams derived from an audio source (e.g., a voice recorded by a microphone).
  • an audio source e.g., a voice recorded by a microphone
  • the hand operated device wherein the activation points and/or device movement is used to control the volume of one or more duplicate audio streams.
  • the hand operated device wherein the audio source and one or more duplicate audio streams are made audible (and/or recordable) at the same time to produce harmony.
  • the hand operated device wherein motion, orientation, or position of the device is used to control the volume and/or other audio qualities of one or more duplicate audio streams.
  • the hand operated device wherein the pitch of one or more duplicate audio streams is selected by a musical pitch interval relative to the pitch of the audio source, whereby each specific pitch interval is triggered by a specific activation point.
  • the hand operated device wherein the pitch of one or more duplicate audio streams and/or the source audio is quantized.
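The interval-relative harmony and pitch quantization described in the two preceding clauses can be sketched with equal-tempered arithmetic. The A4 = 440 Hz reference is an assumption.

```python
import math

def harmony_frequency(source_hz, interval_semitones):
    """Frequency of a duplicate stream pitched a fixed musical interval
    (in semitones, positive or negative) relative to the detected pitch
    of the audio source."""
    return source_hz * 2 ** (interval_semitones / 12)

def quantize_to_semitone(freq_hz, ref_hz=440.0):
    """Snap a frequency to the nearest equal-tempered semitone,
    assuming an A4 = 440 Hz reference."""
    n = round(12 * math.log2(freq_hz / ref_hz))
    return ref_hz * 2 ** (n / 12)
```

A slightly sharp source near 445 Hz quantizes back to 440 Hz, so duplicate streams built on it stay in tune.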
  • the hand operated device wherein supplementary transduction of the audio source is achieved using a contact microphone and the resulting signal is analyzed to detect one or more pitches within the audio source.
  • the hand operated device wherein the pitch of one or more duplicate audio streams can be modulated by a portamento effect controlled by the sequence of actuation of activation points and/or motion, orientation, or position of the device.
  • the hand operated device wherein the pitch of one or more duplicate audio streams can be modulated by a vibrato effect controlled by the motion, orientation, or position of the device after actuation of an activation point.
  • the hand operated device wherein sounds controlled by the device can be modulated by a tempo-synchronised oscillation rate effect controlled by the orientation or position of the device and/or directions of motion of the device.
  • An entertainment system comprising: a user input device providing a series of user-controlled input data streams comprising substantially continuous input values and substantially discrete input values; and an output component interconnected to said user input data streams; wherein said output component outputs said input data streams for modulation of one or more duplicate audio streams derived from an audio source (e.g., a voice recorded by a microphone).
  • an audio source e.g., a voice recorded by a microphone
  • the system wherein the pitch of one or more duplicate audio streams can be modulated by a portamento effect controlled by the sequence of user-controlled discrete input values and/or user-controlled substantially continuous input data.
  • the system wherein the pitch of one or more duplicate audio streams can be modulated by a vibrato effect that responds to specific combinations of user-controlled discrete values and substantially continuous input data.
  • the system wherein the sound of one or more duplicate audio streams can be modulated by a tempo-synced oscillation rate-based effect that responds to user- controlled substantially continuous input data.
  • a hand operated input device comprising: a plurality of activation points configured to be activated by the digits of the user; at least one sensor for measuring a current motion, position, or orientation value of the input device; and an output means interconnected to the activation points and the sensor for outputting a series of currently active activation points and at least one motion, position, or orientation value of the input device; wherein movement of the device controls the substantially gradated change in the pitch of a sound between a start pitch and an end pitch.
  • the hand operated device wherein, after selection of the start and end pitches, motion of the device controls the substantially gradated change in the pitch of a sound between the start pitch and the end pitch.
  • the hand operated device wherein a user may operate left and right-handed versions of the hand operated device simultaneously and differences in at least the relative motion, position, or orientation of the two devices is used to control the substantially gradated change in the pitch of a sound between a start pitch and an end pitch.
  • An entertainment system comprising: a user input device providing a series of user-controlled input data streams comprising substantially continuous input values and substantially discrete input values; and an output component interconnected to said input data streams; wherein said output component outputs said input data streams for controlling the substantially gradated change in the pitch of a sound between a start pitch and an end pitch.
  • a hand operated input device comprising: a plurality of activation points configured to be activated by the digits of the user; at least one sensor for measuring a current motion, position, or orientation value of the input device; and an output means interconnected to the activation points and the sensor for outputting a series of currently active activation points and at least one of the motion, position, or orientation values of the input device; wherein movement of the device controls the playback of an audio sample and/or an associated visual video component sample.
  • the hand operated device wherein the audio sample is pre-processed to partially or completely reduce its pitch variability, after which the pitch or pitches of the audio sample is detected at one or more points in the duration of the audio sample.
  • the hand operated device wherein the audio and/or an associated visual video component sample can be played forwards and backwards at any rate.
  • the hand operated device wherein motion, position, and/or orientation values of the input device, and/or activation points of the input device, control additional modulation of the audio sample.
  • The hand operated device wherein motion, position, and/or orientation values of the input device, and/or activation points of the input device, control additional modulation of the visual video component sample.
  • the hand operated device wherein the pitch of the audio sample can be modulated by a portamento effect controlled by the sequence of actuation of activation points and/or motion, orientation, or position of the device.
  • the hand operated device wherein the pitch of the audio sample can be modulated by a vibrato effect controlled by motion, orientation, or position of the device after actuation of one or more activation points.
  • the hand operated device wherein the sound of the audio sample can be modulated by a tempo-synced oscillation rate effect controlled by the orientation or position of the device and/or directions of motion of the device.
  • An entertainment system comprising: a user input device providing a series of user-controlled input data streams comprising substantially continuous input values and substantially discrete input values; and an output component interconnected to said user input data streams; wherein said output component outputs said input data streams for controlling the playback of an audio and/or an associated visual video component sample.
  • the system wherein the pitch of the audio sample can be modulated by a vibrato effect that responds to specific combinations of user-controlled discrete values and substantially continuous input data.
  • the system wherein the audio sample can be modulated by a tempo-synced oscillation rate effect that responds to user-controlled substantially continuous input data.
  • An entertainment system comprising: a user input device providing a series of user-controlled input data streams derived from a current device movement or orientation; and an output component interconnected to said user input device, said output component outputting musical sound audio data with substantially gradated pitch control depending on said data streams of the user input device.
  • the system wherein the input device comprises: a plurality of activation points configured to be activated by the digits of the user; at least one sensor component for measuring a current motion, position, or orientation value of the hand of a user; and a processing means interconnected to the activation points and the sensor component for outputting a series of currently active activation points and at least one of the motion, position, or orientation values of the input device.
  • the music entertainment system wherein the start and end pitches of said substantially gradated pitch control depend on current discrete data events initiated by the user via controls provided by the user interface.
  • a method of producing an interactive musical sound including the steps of: (a) providing a user input device providing a series of user-controlled input data streams derived from a current device movement, position, or orientation; (b) processing said user input device data, to output musical sound audio data with substantially gradated pitch control depending on said data streams of the user input device.
  • start and end pitches of said substantially gradated pitch control depend on current discrete data events initiated by the user via controls provided by the user interface.
  • An entertainment system comprising: a user input device providing a series of user-controlled input data streams derived from a current device movement, position, or orientation; a video stream having both audio and associated video information; and a processor interconnected to said user input device and said video stream, said processor outputting video at a specific position in the video stream, dependent on said movement, position, or orientation data streams of the user input device, and a current audio output derived from audio at said specific position in the video stream.
  • the user input device comprises: a plurality of activation points configured to be activated by the digits of the user; at least one sensor component for measuring a current motion, position, or orientation value of the interface device; and an output component interconnected to the activation points and the position sensors for outputting a series of currently active activation points and at least one of the motion, position, or orientation values of the input device.
  • a method of producing an interactive video image including the steps of: (a) providing a user input device providing a series of user-controlled input data streams derived from a current device movement, position, or orientation; (b) providing a video stream having both audio and associated video information; and (c) processing said video stream, to output video at a specific position in said video stream, dependent on said movement, position, or orientation data streams of the user input device, and to output audio derived from audio at said specific position in the video stream.
  • a hand operated input device comprising: a plurality of activation points configured to be activated by the digits of the user; at least one sensor means for measuring a current motion, position, or orientation value of the input device; and an output means interconnected to the activation points and the sensor means for outputting a series of currently active activation points and at least one of the motion, position, or orientation values of the input device.
  • the hand operated input device wherein the activation points are located on a plurality of module means, each module being configured for access by a single user digit;
  • each of the plurality of modules comprises at least one activation point capable of being modulated by a distal portion of a digit, a medial portion of a digit, or a proximal portion of a digit.
  • the hand operated input device wherein the number of activation points per finger is at least 2.
  • the hand operated input device wherein the digits include the fingers and the thumb of a user.
  • the hand operated input device wherein the sensors include at least one angular rate sensor sensing the rate of angular rotation of the device.
  • the hand operated input device wherein the sensor means measure at least one position value of the device.
  • the hand operated input device wherein the sensor means measure at least one movement value of the device.
  • the hand operated input device wherein said device further includes an elongated portion counterbalancing the weight of the activation points when in use by a user.
  • the hand operated input device wherein the positions of the activation points are adjustable for one or more digits.
  • the hand operated input device wherein the activation points are formed from electromechanical switches.
  • each of the activation points can be actuated either individually or in combination with other activation points.
  • the hand operated device wherein the device is used to interact with a video game.
  • the hand operated device wherein the device is used to modify at least one of an audio signal and a video signal.
  • the hand operated device wherein the positioning sensor comprises at least one of an accelerometer that measures static acceleration, an accelerometer that measures dynamic acceleration, a gyroscope that measures rotational motion, or a magnetometer that measures magnetic fields.
  • the hand operated input device wherein the device is designed to remain in close contact with the hand during movement.
  • the hand operated input device wherein the device includes an arrangement of activation points subdivided into sets assigned to each digit, the number of sets being at least four.
  • the hand operated input device wherein the device includes an arrangement of activation points subdivided into sets assigned to each digit, the number of sets being at least three.
  • a method for manipulating audio/visual content comprising:
  • said input device wherein, when the input device is in a fixed position relative to the user's hand, said input device includes at least one activation point capable of being actuated by contact with the distal phalanx of one of the user's digits and at least one further activation point capable of being actuated by contact with a segment of the same digit other than its distal phalanx.
  • the method further comprising transmitting the output data.
  • each of the activation points can be actuated either individually or in combination with other activation points.
  • a hand operated input device comprising: a plurality of activation means configured to be activated by the digits of the user; at least one sensor means for measuring a current motion, position, or orientation value of the input device; and an output means interconnected to the activation points and the sensor means for outputting a series of currently active activation points and at least one motion, position, or orientation value of the input device.
  • the hand operated input device wherein the activation means are mapped to audio or video samples, or different time points within audio or video samples.
  • the hand operated device wherein movement of the device controls the rate of playback of audio or video samples from the time points selected by actuation of the activation means.
  • the hand operated device wherein one direction of angular rotation around the vertical axis of the device advances the playback of the selected audio or video sample forwards at a rate proportional to the rotation, while the other direction advances the playback of the selected audio or video sample backwards at a rate proportional to the rotation.
  • inventive aspects may lie in less than all features of a single foregoing disclosed embodiment.
  • the claims following the Detailed Description are hereby expressly incorporated into this Description, with each claim standing on its own as a separate embodiment of this disclosure.
  • the terms "comprising", "comprised of" or "which comprises" are open terms that mean including at least the elements/features that follow, but not excluding others.
  • the term "comprising", when used in the claims, should not be interpreted as being limitative to the means or elements or steps listed thereafter.
  • the scope of the expression "a device comprising A and B" should not be limited to devices consisting only of elements A and B.
  • any one of the terms "including", "which includes" or "that includes" as used herein is also an open term that also means including at least the elements/features that follow the term, but not excluding others. Thus, "including" is synonymous with and means "comprising".
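The gradated pitch control of the first embodiments above can be sketched minimally: discrete events from the interface's controls choose the start and end pitches, while the continuous movement/position/orientation stream supplies a normalised glide parameter. The function name and the choice of geometric interpolation are illustrative assumptions, not taken from the specification.

```python
def gradated_pitch(start_hz: float, end_hz: float, t: float) -> float:
    """Return a pitch part-way through a glide from start_hz to end_hz.

    t is a normalised control value in [0, 1] derived from the device's
    movement/position/orientation stream.  Geometric interpolation keeps
    the glide perceptually even, since pitch perception is logarithmic.
    """
    t = max(0.0, min(1.0, t))          # clamp the sensor-derived control
    return start_hz * (end_hz / start_hz) ** t

# Gliding from A3 (220 Hz) to A4 (440 Hz): the midpoint lands on the
# equal-tempered tritone, 220 * sqrt(2) ≈ 311.1 Hz.
```

A real implementation would re-evaluate this on every sensor sample and feed the result to an oscillator; the interpolation curve is a design choice, not mandated by the claims.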
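The video-stream embodiments output video, and audio taken at the same position, from a point in the stream selected by the device's movement, position, or orientation data. A minimal sketch, assuming yaw (rotation about the vertical axis) is the controlling value and a fixed ±90° working range; both assumptions are illustrative.

```python
def scrub_frame(yaw_deg: float, n_frames: int,
                min_deg: float = -90.0, max_deg: float = 90.0) -> int:
    """Map a device yaw angle onto a frame index in the video stream.

    Audio and video are both read at the returned index, so the sound
    stays locked to the picture while the user scrubs by turning the
    device.
    """
    t = (yaw_deg - min_deg) / (max_deg - min_deg)   # normalise to [0, 1]
    t = max(0.0, min(1.0, t))                        # clamp out-of-range angles
    return int(round(t * (n_frames - 1)))
```

Angles outside the working range simply pin playback to the first or last frame rather than wrapping, one of several reasonable design choices.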
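The output component of the hand-operated device repeatedly emits the currently active activation points together with at least one motion, position, or orientation value. A minimal sketch of one such output sample follows; the field names, digit names, and per-digit layout are hypothetical, since the specification does not fix a wire format.

```python
from dataclasses import dataclass


@dataclass(frozen=True)
class InterfaceReport:
    """One output sample from the hand-operated interface (illustrative)."""
    active_points: frozenset   # currently active activation points
    orientation: tuple         # (yaw, pitch, roll) in degrees
    angular_rate: tuple        # (wx, wy, wz) in degrees/second


# Activation points subdivided into sets assigned to each digit, with at
# least four sets and at least two points per finger, as in the
# corresponding embodiments above.
LAYOUT = {
    "index":  ("index_distal", "index_medial"),
    "middle": ("middle_distal", "middle_medial"),
    "ring":   ("ring_distal", "ring_medial"),
    "little": ("little_distal", "little_medial"),
}

report = InterfaceReport(
    active_points=frozenset({"index_distal", "ring_medial"}),
    orientation=(12.0, -3.5, 0.0),
    angular_rate=(0.0, 45.0, 0.0),
)
```

Because activation points can be actuated individually or in combination, `active_points` is a set rather than a single value.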
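In the sample-playback embodiments, actuating an activation means selects a time point within an audio or video sample, and rotation about the vertical axis then advances playback forward or backward at a rate proportional to the rotation. A sketch of that proportional scrub, with the function name and gain constant as assumptions:

```python
def advance_playback(position_s: float, yaw_rate_dps: float, dt_s: float,
                     gain: float = 0.01, length_s=None) -> float:
    """Advance a playback position at a rate proportional to rotation.

    Positive rotation about the vertical axis scrubs forward, negative
    rotation scrubs backward; `gain` converts degrees/second of rotation
    into seconds of media per second of real time.
    """
    position_s += gain * yaw_rate_dps * dt_s
    if length_s is not None:
        position_s = max(0.0, min(length_s, position_s))  # stay in sample
    return position_s
```

Called once per sensor update, this yields playback that tracks the hand: holding the device still freezes the sample at the selected time point, and reversing the rotation reverses playback.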

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Human Computer Interaction (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Biophysics (AREA)
  • Cardiology (AREA)
  • General Health & Medical Sciences (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

An interface comprising a hand-operated input device having a series of activation points activated by the digits (fingers and/or thumb) of a user; a sensor component measuring a current movement, orientation and/or position of the input device; and an output component interconnected to the activation points and the sensor component for outputting, as a series, the currently active activation points and the current movement, orientation and/or position of the input device.
EP11833636.1A 2010-10-22 2011-10-21 Procédés, dispositifs et systèmes permettant de créer des signaux de commande Withdrawn EP2630557A1 (fr)

Applications Claiming Priority (5)

Application Number Priority Date Filing Date Title
PCT/AU2010/001409 WO2011047438A1 (fr) 2009-10-22 2010-10-22 Dispositif d'interface de machine humaine
AU2010905631A AU2010905631A0 (en) 2010-12-23 Music entertainment system
AU2010905630A AU2010905630A0 (en) 2010-12-23 Entertainment system
US201161478278P 2011-04-22 2011-04-22
PCT/AU2011/001341 WO2012051664A1 (fr) 2010-10-22 2011-10-21 Procédés, dispositifs et systèmes permettant de créer des signaux de commande

Publications (1)

Publication Number Publication Date
EP2630557A1 true EP2630557A1 (fr) 2013-08-28

Family

Family ID: 48808120

Family Applications (1)

Application Number Title Priority Date Filing Date
EP11833636.1A Withdrawn EP2630557A1 (fr) 2010-10-22 2011-10-21 Procédés, dispositifs et systèmes permettant de créer des signaux de commande

Country Status (1)

Country Link
EP (1) EP2630557A1 (fr)

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See references of WO2012051664A1 *

Similar Documents

Publication Publication Date Title
US10895914B2 (en) Methods, devices, and methods for creating control signals
Miranda et al. New digital musical instruments: control and interaction beyond the keyboard
US11011145B2 (en) Input device with a variable tensioned joystick with travel distance for operating a musical instrument, and a method of use thereof
US7541536B2 (en) Multi-sound effect system including dynamic controller for an amplified guitar
US9502012B2 (en) Drumstick controller
US9558727B2 (en) Performance method of electronic musical instrument and music
CN104582530A (zh) 用于定位输入装置并产生控制信号的方法和装置和系统
CN105741639B (zh) 一种模拟弓弦类乐器的微感掌上乐器
Paradiso et al. Interactive music for instrumented dancing shoes
AU2004245773B2 (en) Multi-sound effect system including dynamic controller for an amplified guitar
KR20170106889A (ko) 지능형 인터페이스를 구비한 악기
JP3654143B2 (ja) 時系列データの読出制御装置、演奏制御装置、映像再生制御装置、および、時系列データの読出制御方法、演奏制御方法、映像再生制御方法
CN205486954U (zh) 一种模拟弓弦类乐器的微感掌上乐器
Jessop The Vocal Augmentation and Manipulation Prosthesis (VAMP): A Conducting-Based Gestural Controller for Vocal Performance.
US20120209560A1 (en) Human machine interface device
Kim et al. Developing humanoids for musical interaction
EP2630557A1 (fr) Procédés, dispositifs et systèmes permettant de créer des signaux de commande
Todoroff Control of digital audio effects
Overholt Advancements in violin-related human-computer interaction
Kim et al. Enabling humanoid musical interaction and performance
Zanini The Augmented Drumstick
Casciato On the choice of gestural controllers for musical applications: an evaluation of the Lightning II and the Radio Baton
Boyt Gesture-Sensing Technology for the Bow: A Relevant and Accessible Digital Interface for String Instruments
Murphy The electronic sensor bow: a new gestural control interface
Burtner Controllers for Computers and Musical Instruments (Past)

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

17P Request for examination filed

Effective date: 20130502

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

DAX Request for extension of the european patent (deleted)
STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION IS DEEMED TO BE WITHDRAWN

18D Application deemed to be withdrawn

Effective date: 20160503