GB2559815A - Music control device - Google Patents

Music control device

Info

Publication number
GB2559815A
GB2559815A
Authority
GB
United Kingdom
Prior art keywords
events
movement
accelerations
devices
intensity
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
GB1702815.0A
Other versions
GB201702815D0 (en)
Inventor
Philip Pisani Justin
Robert Gordon Simon
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Individual
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Individual
Priority to GB1702815.0A
Publication of GB201702815D0
Publication of GB2559815A
Status: Withdrawn


Classifications

    • G PHYSICS
    • G10 MUSICAL INSTRUMENTS; ACOUSTICS
    • G10H ELECTROPHONIC MUSICAL INSTRUMENTS; INSTRUMENTS IN WHICH THE TONES ARE GENERATED BY ELECTROMECHANICAL MEANS OR ELECTRONIC GENERATORS, OR IN WHICH THE TONES ARE SYNTHESISED FROM A DATA STORE
    • G10H 1/00 Details of electrophonic musical instruments
    • G10H 1/0008 Associated control or indicating means
    • G10H 1/0033 Recording/reproducing or transmission of music for electrophonic musical instruments
    • G10H 1/0041 Recording/reproducing or transmission of music for electrophonic musical instruments in coded form
    • G10H 1/0058 Transmission between separate instruments or between individual components of a musical system
    • G10H 1/0066 Transmission between separate instruments or between individual components of a musical system using a MIDI interface
    • G10H 3/00 Instruments in which the tones are generated by electromechanical means
    • G10H 3/12 Instruments in which the tones are generated by electromechanical means using mechanical resonant generators, e.g. strings or percussive instruments, the tones of which are picked up by electromechanical transducers, the electrical signals being further manipulated or amplified and subsequently converted to sound by a loudspeaker or equivalent instrument
    • G10H 2220/00 Input/output interfacing specifically adapted for electrophonic musical tools or instruments
    • G10H 2220/091 Graphical user interface [GUI] specifically adapted for electrophonic musical instruments, e.g. interactive musical displays, musical instrument icons or menus; Details of user interactions therewith
    • G10H 2220/096 Graphical user interface [GUI] specifically adapted for electrophonic musical instruments, using a touch screen
    • G10H 2220/155 User input interfaces for electrophonic musical instruments
    • G10H 2220/265 Key design details; Special characteristics of individual keys of a keyboard; Key-like musical input devices, e.g. finger sensors, pedals, potentiometers, selectors
    • G10H 2220/311 Key design details with controlled tactile or haptic feedback effect; output interfaces therefor
    • G10H 2220/321 Garment sensors, i.e. musical control means with trigger surfaces or joint angle sensors, worn as a garment by the player, e.g. bracelet, intelligent clothing
    • G10H 2220/326 Control glove or other hand or palm-attached control device
    • G10H 2220/331 Ring or other finger-attached control device
    • G10H 2220/336 Control shoe or boot, i.e. sensor-equipped lower part of lower limb, e.g. shoe, toe ring, sock, ankle bracelet or leg control attachment
    • G10H 2220/391 Angle sensing for musical purposes, using data from a gyroscope, gyrometer or other angular velocity or angular movement sensing device
    • G10H 2220/395 Acceleration sensing or accelerometer use, e.g. 3D movement computation by integration of accelerometer data, angle sensing with respect to the vertical, i.e. gravity sensing

Abstract

One or more devices (Fig 1, 1) may be worn on parts of a user's body (Fig 5). Said devices may detect movement of parts of the wearer's body in multiple axes by use of accelerometers 13, which may also incorporate gyroscopes 14 or a compass. Outputs may be filtered 17. Associated microprocessor controls analyse said movement and convert it into an output which is sent wired or wirelessly. Movement of the wearer may control musical tempo 19 or intensity 24 of the music to reflect the ambulation of the wearer. Multiple users wearing multiple devices may be linked in an ensemble or group performance, and outputs may be sent via a communications interface 37 to a mixing controller or a public address system (Figure 6).

Description

(54) Title of the Invention: Music control device
Abstract Title: Music control device having accelerometer measurement of wearer's movement and note control
(57) One or more devices (Fig 1, 1) may be worn on parts of a user's body (Fig 5). Said devices may detect movement of parts of the wearer's body in multiple axes by use of accelerometers 13, which may also incorporate gyroscopes 14 or a compass. Outputs may be filtered 17. Associated microprocessor controls analyse said movement and convert it into an output which is sent wired or wirelessly. Movement of the wearer may control musical tempo 19 or intensity 24 of the music to reflect the ambulation of the wearer. Multiple users wearing multiple devices may be linked in an ensemble or group performance, and outputs may be sent via a communications interface 37 to a mixing controller or a public address system (Figure 6).
Figure 1 - Music control system
Figure 2 - Music control device
Figure 3 - Activity sensor control block diagram
Figure 4 - Activity Sensor Receiving Function (axis label: increasing movement intensity)
Figure 5 - Multiple Activity Sensors
Figure 6 - Multiple co-ordinated sensor devices (showing mixing function, public address and audio performance equipment)
Intellectual Property Office
Application No. GB1702815.0
Date: 26 July 2017
The following terms are registered trade marks and should be read as such wherever they occur in this document:
MIDI, Novation, Launchkey, Bluetooth, Apple, Duracell, Texas Instruments
Intellectual Property Office is an operating name of the Patent Office www.gov.uk/ipo
Music control device
Description
Methods to remotely control electronic musical instruments are well established. A common approach is to use an interface protocol known as the Musical Instrument Digital Interface (MIDI™). This protocol is used to generate sounds from instruments which support the interface, such as synthesisers and electronic drum machines. A control device sends commands encoded in MIDI™ messages, usually triggered by a user input on the control device. The commands are often analogous to the pressing and releasing of keys on a piano keyboard (e.g. Novation Launchkey 25, Novation Ltd, UK), and this is a common format for MIDI™ control devices.
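For concreteness, the minimal sketch below (an illustration with assumed helper names, not anything specified in the patent) builds the three-byte MIDI™ note-on and note-off channel messages that correspond to pressing and releasing a key:

```python
# Illustrative only: raw byte layout of MIDI note-on/note-off channel messages.
def note_on(note: int, velocity: int, channel: int = 0) -> bytes:
    """Build a 3-byte MIDI note-on message (status 0x90 | channel)."""
    return bytes([0x90 | (channel & 0x0F), note & 0x7F, velocity & 0x7F])

def note_off(note: int, channel: int = 0) -> bytes:
    """Build a 3-byte MIDI note-off message (status 0x80 | channel)."""
    return bytes([0x80 | (channel & 0x0F), note & 0x7F, 0])

# Pressing and releasing middle C (note 60) at a moderate velocity:
print(note_on(60, 96).hex())   # '903c60'
print(note_off(60).hex())      # '803c00'
```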
Wearable personal computing devices are now also commonplace. Often these devices use microminiature accelerometers (e.g. ST Microelectronics LIS2DH) or gyroscopes to detect accelerative forces when worn by a user. With the advent of wearable wireless technology and standards to embed MIDI™ control over Bluetooth™ LE links (e.g. Apple Inc., Apple Bluetooth™ Low Energy MIDI™ specification), it is possible to use motion sensors to trigger MIDI™ messages and enable movement to be used to trigger sounds.
Most wearable MIDI™ control devices have a fixed mapping from the physical gesture or event to a single MIDI™ event, and hence are limited in the range of events which can be generated without additional user intervention on the instrument, for example to select different sound ranges or types mid-performance. Such intervention can make performing with these devices difficult, as it requires interrupting the performance to change sounds. Some wearable control devices allow different MIDI™ events to be created for the same physical action via the operation of buttons on the device, but this too can create an unnatural feel for the performer and a less spontaneous experience. Hence, a limitation of this approach is that it can be difficult to create more complex musical arrangements with such a device.
This invention provides means to overcome these limitations in wearable music control devices and enable a performer or ensemble to generate multiple note events depending on the context and features of the motion sensed by the control device or devices.
Preferred Embodiment
The invention is now described with reference to the following figures:
Figure 1 shows an example of the music control system.
Figure 2 shows a block diagram of the music control device.
Figure 3 shows how to derive MIDI™ events from the sensed data.
Figure 4 shows how multiple sounds may be triggered from events.
Figure 5 shows how a plurality of music control devices may be used by a performer.
Figure 6 shows how a plurality of performers can use the music control devices.
Referring to Figure 1, the music control device is mounted in a form suitable to attach to the performer's body, in this case a wrist strap [1]. The same device may also be attached to the body in other ways, for example, but not limited to, belt clips, lanyards or necklaces.
The device can communicate wirelessly using a radio link [3], for example using the Bluetooth™ Low Energy protocol, to a computing device [2] in order to transfer information about the actions it senses. The computing device is arranged to contain a sound generating function such as an electronic synthesiser application and may be, for example but not limited to, a smartphone, computer tablet or personal computer with a wireless connection means.
Figure 2 shows a block diagram of the music control device. The mechanism contains a microprocessor [8] and wireless transceiver [7]. Those skilled in the art will be aware that these functions may be achieved using a single electronic component such as a wireless system-on-chip (e.g. CC2540, Texas Instruments, USA). The device measures acceleration forces using an accelerometer [5] as well as other environmental parameters such as ambient temperature [12], ambient sound via a microphone [9] and user inputs via a touch sensor or button [10]. The unit may be powered by a battery [11] such as a generic CR2032 coin cell (Duracell Inc, USA) or by a rechargeable cell. For the latter, a recharging mechanism may also be incorporated to allow wired or wireless charging of the cell.
The accelerometer [5] provides measures of acceleration in 3 or more axes. Optionally the accelerometer [5] may also contain a gyroscope or compass to enable angular rotation or heading to be measured.
User feedback and status of the sensor device may be provided by a display [4] and optionally by haptic feedback via a vibrating device [6].
Referring to Figure 3, motion sensing is achieved by the accelerometer [13] measuring variations in acceleration in its axes. In the example shown, three axes, x, y and z, are used. The acceleration values are recorded at a frequency suitable for movement detection and transferred [15] periodically into a filter stage [17]. The filter stage may apply different filtering characteristics to the signals [15] to match the particular features intended to be extracted from the signal. Different features can be extracted concurrently, enabling multiple events to be generated simultaneously.
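As an illustration of such a filter stage, the sketch below feeds the same axis samples through two differently tuned one-pole filters in parallel; the filter type and the alpha values are assumptions made for this sketch, not parameters given in the patent:

```python
# A minimal sketch of the filter stage [17]: the same raw samples are passed
# through differently tuned filters so several features can be extracted
# concurrently. Filter choice and coefficients are illustrative assumptions.
class OnePoleLowPass:
    def __init__(self, alpha: float):
        self.alpha = alpha          # 0 < alpha <= 1; smaller = heavier smoothing
        self.y = 0.0
    def step(self, x: float) -> float:
        self.y += self.alpha * (x - self.y)
        return self.y

slow = OnePoleLowPass(alpha=0.05)   # tuned for the overall activity band
fast = OnePoleLowPass(alpha=0.5)    # tuned to track sharp gesture peaks

for x in [0.0, 0.1, 2.0, 1.8, 0.2, 0.0]:     # one accelerometer axis, in g
    smoothed = slow.step(x)                   # feeds the activity measure
    transient = x - fast.step(x)              # crude high-pass for gestures
    print(f"x={x:+.2f}  activity-band={smoothed:+.3f}  gesture-band={transient:+.3f}")
```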
A measure of the overall level of activity [24] may be derived by summing and weighting the variation in acceleration on the filtered acceleration signals [20]. This may then be scaled to provide an index of activity whose value changes depending on the short-term accelerations measured. The time period over which these are measured may be chosen to match the movement types being detected but could be of the order of a few seconds. Hence, it can be seen that such a method can be used to define the range of ambulation achieved by a performer, from static to high ambulation levels. Such a classification may be used, for example, to discriminate the intensity of activity being undertaken by a dancer during a performance.
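A minimal sketch of such an activity index follows, assuming a sliding window of a few seconds, equal axis weights and a clamp to a MIDI™-friendly 0 to 127 range; all three choices are illustrative:

```python
# Hedged sketch of the activity measure [24]: sum the short-term variation of
# the filtered acceleration per axis over a window, weight, and scale it.
from collections import deque
import statistics

class ActivityIndex:
    def __init__(self, window_samples: int = 100, weights=(1.0, 1.0, 1.0),
                 scale: float = 10.0):
        self.buffers = [deque(maxlen=window_samples) for _ in range(3)]
        self.weights, self.scale = weights, scale

    def update(self, x: float, y: float, z: float) -> float:
        for buf, v in zip(self.buffers, (x, y, z)):
            buf.append(v)
        total = sum(w * statistics.pstdev(buf) if len(buf) > 1 else 0.0
                    for w, buf in zip(self.weights, self.buffers))
        return min(total * self.scale, 127.0)   # clamp to a MIDI-friendly range

activity = ActivityIndex()
print(activity.update(0.1, -0.05, 0.98))   # near zero while the wearer is still
```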
In addition, it is possible to filter out unwanted frequency components of the signal, take the filtered signal [17] and perform processing to deduce an average periodicity of the movement, using this to provide an estimate of the dominant tempo of the movement [23].
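One plausible way to realise this tempo estimate is autocorrelation over the filtered signal, as sketched below; the sample rate, BPM search range and synthetic test signal are assumptions for the example:

```python
# Sketch of the tempo extraction [23]: estimate the dominant period of the
# movement signal by autocorrelation and convert it to beats per minute.
import math

def estimate_tempo_bpm(signal, fs=50.0, min_bpm=40, max_bpm=200):
    """Return the BPM whose period best matches the signal's self-similarity."""
    n = len(signal)
    mean = sum(signal) / n
    xs = [s - mean for s in signal]
    best_lag, best_score = None, 0.0
    lo = int(fs * 60.0 / max_bpm)              # shortest period to consider
    hi = min(int(fs * 60.0 / min_bpm), n - 1)  # longest period to consider
    for lag in range(lo, hi + 1):
        score = sum(xs[i] * xs[i - lag] for i in range(lag, n))
        if score > best_score:
            best_lag, best_score = lag, score
    return 60.0 * fs / best_lag if best_lag else None

# A synthetic 2 Hz (120 BPM) swaying motion sampled at 50 Hz:
sway = [math.sin(2 * math.pi * 2.0 * t / 50.0) for t in range(250)]
print(estimate_tempo_bpm(sway))   # ~120
```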
A further use of the accelerometer signal, again after appropriate filtering, is to deduce specific gesture events [25] from the relationship of the three axes' values and the presence of peaks in acceleration. For example, for a wrist-worn sensor, hand movements in specific directions may be deduced [25].
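A toy classifier in this spirit is sketched below: it fires only on an acceleration peak above a threshold and labels the movement by the dominant axis and its sign. The threshold value and the direction labels are illustrative assumptions:

```python
# Illustrative gesture classifier in the spirit of [25].
def classify_gesture(sample, threshold=1.5):
    """sample = (ax, ay, az) in g, gravity removed; returns a label or None."""
    ax, ay, az = sample
    if max(abs(ax), abs(ay), abs(az)) < threshold:
        return None                       # no peak: not a deliberate gesture
    axis = max(range(3), key=lambda i: abs(sample[i]))
    names = [("left", "right"), ("back", "forward"), ("down", "up")]
    return names[axis][sample[axis] > 0]  # dominant axis and its sign

print(classify_gesture((0.1, 0.2, 2.3)))    # 'up'
print(classify_gesture((-1.9, 0.0, 0.1)))   # 'left'
```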
Optionally, a measure of rotation angle and speed may be deduced from a gyroscope sensor [14] and, again after filtering [17], a classifier [26] may extract events to indicate changes in the angular rotation and rotation speed of the sensor.
Those skilled in the art will be aware that such processing is possible using digital signal processing techniques which may be undertaken on the sensor's microprocessor (Figure 2 [8]).
The different level and event detections performed [23], [24], [25], [26] may then be presented to a note/event lookup function [32] and used to generate specific note events, for example as MIDI™ note-on messages. These notes may be presented to the communications interface [37] for onward communication to the intended instrument or instruments hosted on the computing device (Figure 1 [2]). The precise events transferred will depend on the configuration of the lookup function [32]. This may be configured by the user to suit the intended performance and stored as data inside the control device after being transferred to it via the communications interface [37].
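The sketch below illustrates one possible shape for such a lookup function, with a user-editable table mapping detected features to MIDI™ note-on bytes; the table contents and function names are placeholders, not a mapping defined in the patent:

```python
# A minimal sketch of the note/event lookup [32]: a configurable table maps
# each detected feature to a MIDI note-on message. Values are placeholders.
LOOKUP = {
    ("activity", "low"):  36,   # kick-drum style note for gentle movement
    ("activity", "high"): 38,   # snare style note for vigorous movement
    ("gesture", "up"):    60,   # middle C for an upward hand movement
    ("gesture", "right"): 64,   # E above middle C for a rightward movement
}

def to_note_event(kind: str, value: str, velocity: int = 100):
    note = LOOKUP.get((kind, value))
    if note is None:
        return None                                      # unmapped: dropped
    return bytes([0x90, note & 0x7F, velocity & 0x7F])   # note-on, channel 0

print(to_note_event("gesture", "up").hex())   # '903c64'
```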
Furthermore, the lookup function [32] may apply additional criteria to the note event selection based on the user's performance preference. For example, the lookup may select different notes, randomly or cyclically, for each event to provide variation in output.
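For example, a small sketch of cyclic or random note variation for a repeated event might look like the following (the note set shown is an arbitrary choice):

```python
# Sketch of the variation rule: the same physical event yields different
# notes, chosen cyclically or randomly from a user-defined set.
import itertools
import random

class VaryingNote:
    def __init__(self, notes, mode="cyclic"):
        self.notes, self.mode = notes, mode
        self.cycle = itertools.cycle(notes)

    def next(self) -> int:
        return random.choice(self.notes) if self.mode == "random" else next(self.cycle)

arpeggio = VaryingNote([60, 64, 67])        # C major arpeggio, cycled per event
print([arpeggio.next() for _ in range(4)])  # [60, 64, 67, 60]
```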
The rate at which the lookup function [32] generates note events may vary depending on user preference. The rate may be controlled by a tempo co-ordination function [31] which takes tempo information from the internal tempo extraction function [23] or from an external tempo signal transferred to the sensor device via the communications interface [37].
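A simplified model of such a tempo co-ordination function is sketched below; the one-event-per-beat queueing rule is an assumption made for the illustration:

```python
# Sketch of the tempo co-ordination function [31]: queued note events are
# released on a beat clock whose BPM may come from the internal tempo
# estimate [23] or from an external tempo signal received via [37].
import time

class TempoCoordinator:
    def __init__(self, bpm: float = 120.0):
        self.period = 60.0 / bpm
        self.pending = []

    def set_tempo(self, bpm: float):
        self.period = 60.0 / bpm      # internal estimate or external override

    def queue(self, event: bytes):
        self.pending.append(event)

    def run_beats(self, beats: int, send):
        for _ in range(beats):
            if self.pending:
                send(self.pending.pop(0))   # release at most one event per beat
            time.sleep(self.period)

coord = TempoCoordinator(bpm=100)
coord.queue(bytes([0x90, 60, 100]))         # a queued note-on
coord.run_beats(2, send=lambda msg: print("sent", msg.hex()))
```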
Within commonplace musical arrangements, a musical piece consists of a number of sub-elements or parts which may be categorised broadly as rhythm, accompaniment and melody.
Referring to Figure 4, the receiving sound generation function operating on the connected computing device (Figure 1 [2]) may be used to trigger different types and durations of sound samples depending on the note events received.
By arrangement, note events can be used to trigger sound events associated with these rhythm, accompaniment and melody categories. Hence, when combined with the music creation sensor described in this embodiment, it is possible for a performer to generate multi-part sound including rhythm, accompaniment and melody using a combination of movement periodicity, movement intensity and specific gesture movements.
Activity level note events (Fig 3 [24]) may be used to select rhythmic sound samples [43], [44], [45], [46] varying in style and intensity to suit the current activity level. Hence, the overall movement of the performer may be used to select a specific rhythmic component of the performance. Furthermore, this rhythmic sound may be varied by the performer by adjusting the strength of the activity they are performing.
Accompaniment may be selected by the same performer via specific movements or gestures which trigger additional note events (Figure 3 [25], [26]) which, when received [42], may be used to trigger sound samples suitable for accompaniment (for example musical chords). These note events could trigger a fixed accompaniment, for example a specific chordal sound [38], [39], or a pre-arranged accompaniment progression such as a cyclic chord progression. The performer may add variation to the accompaniment by varying the intensity of the gestures used to generate the events.
Furthermore, melody may be added by additional gestures or movements, for example sharp movements such as shaking or raising limbs, and these events used to trigger appropriate melody sound samples [40], [41].
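To illustrate the receiving side of Figure 4, the sketch below routes received note numbers to rhythm, accompaniment and melody sample banks and scales playback level by velocity (i.e. movement intensity); the note numbers and sample names are invented for the example:

```python
# Illustrative receiving-side dispatch in the spirit of Figure 4: received
# notes are routed to rhythm, accompaniment or melody sample banks.
SAMPLE_BANKS = {
    "rhythm":        {36: "kick_soft", 37: "kick_hard", 38: "snare", 39: "hat"},
    "accompaniment": {48: "chord_C", 50: "chord_Dm", 52: "chord_Em"},
    "melody":        {60: "lead_C4", 62: "lead_D4", 64: "lead_E4"},
}

def dispatch(note: int, velocity: int):
    for category, bank in SAMPLE_BANKS.items():
        if note in bank:
            # Velocity (movement/gesture intensity) scales the playback level.
            return f"play {bank[note]} ({category}) at level {velocity / 127:.2f}"
    return None

print(dispatch(38, 110))   # 'play snare (rhythm) at level 0.87'
print(dispatch(48, 64))    # 'play chord_C (accompaniment) at level 0.50'
```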
Referring to Figure 5, it is possible to increase the scope of note generation by the user wearing multiple control devices on different limbs or locations on the body [50], [51], [52], [53], which communicate wirelessly [55] with the device containing the receiving instrument [54].
A further use of the device is in ensemble or multi-performer settings. Referring to Figure 6, performers using the sensing device [56], [57] may transmit note events to one or more computing devices [58], [59] which contain different instruments with a sound generating function. By mutual co-ordination, the performers may generate complementary sounds to create a combined musical piece.
Alternatively, the music control devices [56], [57] or the receiving devices [58], [59] may forward the note events generated to a mixing function [60] which combines these note events and uses them to generate sounds via a public address system [61]. The mixing function may apply additional criteria to the received events in order to manipulate the output [64] sent to the public address system [61], for example, but not limited to, tempo, volume or tonal characteristics.
The mixing function may provide an additional output [63] to connect to audio performance equipment [62] which is playing sound from an alternative source such as a music recording. For example, but not limited to, this may include a disc jockey music performance system. Hence, a pleasing interaction between the performers and the music playback system can be achieved. For example, the system could estimate the overall intensity of the dancing movement of a number of performers wearing the music control devices and, as a result of combining this information [63], modify the characteristics of the sound being generated [64] in order to provide pleasing audible or audio-visual feedback to the performers or audience.
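As a sketch of this combining step, the example below averages the activity indices reported by several wearers into an ensemble intensity and maps it to playback volume and tempo; the averaging rule and the mapping constants are assumptions:

```python
# Hedged sketch of the mixing function [60]/[63]: combine per-wearer activity
# indices and adjust playback on the audio equipment [62] accordingly.
def ensemble_intensity(per_device_activity: dict) -> float:
    """Mean of per-device activity indices, each in the range 0..127."""
    values = list(per_device_activity.values())
    return sum(values) / len(values) if values else 0.0

def playback_adjustment(intensity: float) -> dict:
    # Louder, slightly faster playback as the dancers' combined movement grows.
    return {"volume": 0.4 + 0.6 * (intensity / 127.0),
            "tempo_scale": 0.9 + 0.2 * (intensity / 127.0)}

readings = {"wrist_A": 90.0, "ankle_A": 110.0, "wrist_B": 70.0}
print(playback_adjustment(ensemble_intensity(readings)))
```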

Claims (11)

  Claims
  We claim:
  1. A control device equipped with:
    a. a power source, an accelerometer and a microprocessor means to record accelerations
    b. a means to analyse accelerations and categorise them into one or more note event outputs
    c. a wired or wireless communications means
    d. a means to attach the device to a body.
  2. A device as defined in claim 1, where the accelerations may be processed concurrently to categorise physical movements from a wearer into:
    a. tempo of movement or ambulation
    b. intensity of movement or ambulation
    c. tempo of pre-defined motion patterns
    d. intensity of pre-defined motion patterns.
  3. A device as defined in claim 2, where additional sensed information is used to determine the events, which may include:
    a. environmental temperature
    b. audible sound level
    c. light intensity
    d. user interface inputs from the performer.
  4. A device as defined in claim 1, which uses a vibrating motor to generate user feedback.
  5. A device as defined in claim 1, which uses a gyroscope or compass to sense angular rotation as a means to discriminate additional gestures.
  6. A device as defined in claim 1, where the device includes a wired or wireless means to recharge its power source.
  7. A device as defined in claim 1, where the events generated may be predefined by the user and stored in the control device.
  8. A device as defined in claim 1 or 2, where the events generated conform to the Musical Instrument Digital Interface (MIDI™) protocol.
  9. A device as defined in claim 1, where the events generated for similar accelerations may vary in a predefined pattern defined by the user.
  10. A system comprising a plurality of control devices, where the events may be combined using one or more of the following mechanisms to create a multi-sensor performance:
    a. an individual receiving instrument with a sound generating function
    b. a combined receiving instrument which communicates with two or more control devices with a sound generating function.
  11. A system as defined in claim 10, where the events generated are processed and used to interact with a pre-recorded or alternate music performance.
GB1702815.0A 2017-02-21 2017-02-21 Music control device Withdrawn GB2559815A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
GB1702815.0A GB2559815A (en) 2017-02-21 2017-02-21 Music control device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
GB1702815.0A GB2559815A (en) 2017-02-21 2017-02-21 Music control device

Publications (2)

Publication Number Publication Date
GB201702815D0 GB201702815D0 (en) 2017-04-05
GB2559815A true GB2559815A (en) 2018-08-22

Family

ID=58486894

Family Applications (1)

Application Number Title Priority Date Filing Date
GB1702815.0A Withdrawn GB2559815A (en) 2017-02-21 2017-02-21 Music control device

Country Status (1)

Country Link
GB (1) GB2559815A (en)


Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5290964A (en) * 1986-10-14 1994-03-01 Yamaha Corporation Musical tone control apparatus using a detector
US5166463A (en) * 1991-10-21 1992-11-24 Steven Weber Motion orchestration system
US20110132181A1 (en) * 2009-12-07 2011-06-09 Neven Kockovic Wearable Trigger Electronic Percussion Music System
US20150077234A1 (en) * 2011-07-12 2015-03-19 Aliphcom System of wearable devices with sensors for synchronization of body motions based on haptic prompts
WO2016094057A1 (en) * 2014-12-12 2016-06-16 Intel Corporation Wearable audio mixing

Also Published As

Publication number Publication date
GB201702815D0 (en) 2017-04-05

Similar Documents

Publication Publication Date Title
US20210248986A1 (en) Stick Controller
US10895914B2 (en) Methods, devices, and methods for creating control signals
KR101528118B1 (en) Method and apparatus for distributing haptic synchronous signals
JP2698320B2 (en) Permanent input system, Permanent intention communication system, Permanent music keyboard system, Permanent Braille input / output system
US20150293590A1 (en) Method, Apparatus, And Computer Program Product For Haptically Providing Information Via A Wearable Device
CN108008930A (en) The method and apparatus for determining K song score values
WO2006070044A1 (en) A method and a device for localizing a sound source and performing a related action
GB2379016A (en) Portable apparatus monitoring reaction of user to music
JP2018011201A (en) Information processing apparatus, information processing method, and program
EP3786941B1 (en) Musical instrument controller and electronic musical instrument system
Papetti et al. Rhythm'n'Shoes: a Wearable Foot Tapping Interface with Audio-Tactile Feedback.
CN107407961A (en) The hand-held controller of computer, the control system of computer and computer system
US20190220085A1 (en) System and Method for Generating Wireless Signals and Controlling Digital Responses from Physical Movement
US20220351708A1 (en) Arrangement and method for the conversion of at least one detected force from the movement of a sensing unit into an auditory signal
US9959854B2 (en) Performance enhancing device and related methods
JP2010175754A (en) Attitude evaluating device, attitude evaluating system and program
US20050211068A1 (en) Method and apparatus for making music and article of manufacture thereof
GB2559815A (en) Music control device
WO2022237362A1 (en) Method for detecting user action on basis of music beats, and device
CN214344200U (en) Human body limb movement and instrument movement simulated sound motion capability enhancing device
GB2395282A (en) System monitoring reaction of users to a performance
Aylward et al. 2006: Sensemble: A Wireless, Compact, Multi-user Sensor System for Interactive Dance
JP2018049052A (en) Program, device, and method for virtual musical instrument performance
Cobb An accelerometer based gestural capture system for performer based music composition
KR100887980B1 (en) Rooters tool and control method for the same

Legal Events

Date Code Title Description
WAP Application withdrawn, taken to be withdrawn or refused ** after publication under section 16(1)