US20180315405A1 - Sensor driven enhanced visualization and audio effects - Google Patents

Sensor driven enhanced visualization and audio effects

Info

Publication number
US20180315405A1
Authority
US
United States
Prior art keywords
effect
drum
gesture
visualization
sensor data
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
US15/581,957
Other versions
US10102835B1 (en)
Inventor
Saurin Shah
Swarnendu Kar
Lakshman Krishnamurthy
Arthur D. Webb
Manan Goel
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Intel Corp
Original Assignee
Intel Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Intel Corp filed Critical Intel Corp
Priority to US15/581,957 priority Critical patent/US10102835B1/en
Assigned to INTEL CORPORATION reassignment INTEL CORPORATION ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: WEBB, ARTHUR D., KRISHNAMURTHY, LAKSHMAN, GOEL, Manan, KAR, Swarnendu, SHAH, SAURIN
Application granted granted Critical
Publication of US10102835B1 publication Critical patent/US10102835B1/en
Publication of US20180315405A1 publication Critical patent/US20180315405A1/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G10MUSICAL INSTRUMENTS; ACOUSTICS
    • G10HELECTROPHONIC MUSICAL INSTRUMENTS; INSTRUMENTS IN WHICH THE TONES ARE GENERATED BY ELECTROMECHANICAL MEANS OR ELECTRONIC GENERATORS, OR IN WHICH THE TONES ARE SYNTHESISED FROM A DATA STORE
    • G10H1/00Details of electrophonic musical instruments
    • G10H1/0008Associated control or indicating means
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017Gesture based interaction, e.g. based on a set of recognized hand gestures
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/0346Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/16Sound input; Sound output
    • G06F3/165Management of the audio stream, e.g. setting of volume, audio stream path
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/16Sound input; Sound output
    • G06F3/167Audio in a user interface, e.g. using voice commands for navigating, audio feedback
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T11/002D [Two Dimensional] image generation
    • G06T11/60Editing figures and text; Combining figures or text
    • GPHYSICS
    • G10MUSICAL INSTRUMENTS; ACOUSTICS
    • G10HELECTROPHONIC MUSICAL INSTRUMENTS; INSTRUMENTS IN WHICH THE TONES ARE GENERATED BY ELECTROMECHANICAL MEANS OR ELECTRONIC GENERATORS, OR IN WHICH THE TONES ARE SYNTHESISED FROM A DATA STORE
    • G10H1/00Details of electrophonic musical instruments
    • G10H1/0033Recording/reproducing or transmission of music for electrophonic musical instruments
    • G10H1/0041Recording/reproducing or transmission of music for electrophonic musical instruments in coded form
    • G10H1/0058Transmission between separate instruments or between individual components of a musical system
    • G10H1/0066Transmission between separate instruments or between individual components of a musical system using a MIDI interface
    • GPHYSICS
    • G10MUSICAL INSTRUMENTS; ACOUSTICS
    • G10HELECTROPHONIC MUSICAL INSTRUMENTS; INSTRUMENTS IN WHICH THE TONES ARE GENERATED BY ELECTROMECHANICAL MEANS OR ELECTRONIC GENERATORS, OR IN WHICH THE TONES ARE SYNTHESISED FROM A DATA STORE
    • G10H1/00Details of electrophonic musical instruments
    • G10H1/0033Recording/reproducing or transmission of music for electrophonic musical instruments
    • G10H1/0041Recording/reproducing or transmission of music for electrophonic musical instruments in coded form
    • G10H1/0058Transmission between separate instruments or between individual components of a musical system
    • G10H1/0066Transmission between separate instruments or between individual components of a musical system using a MIDI interface
    • G10H1/0075Transmission between separate instruments or between individual components of a musical system using a MIDI interface with translation or conversion means for unvailable commands, e.g. special tone colors
    • GPHYSICS
    • G10MUSICAL INSTRUMENTS; ACOUSTICS
    • G10HELECTROPHONIC MUSICAL INSTRUMENTS; INSTRUMENTS IN WHICH THE TONES ARE GENERATED BY ELECTROMECHANICAL MEANS OR ELECTRONIC GENERATORS, OR IN WHICH THE TONES ARE SYNTHESISED FROM A DATA STORE
    • G10H1/00Details of electrophonic musical instruments
    • G10H1/0033Recording/reproducing or transmission of music for electrophonic musical instruments
    • G10H1/0083Recording/reproducing or transmission of music for electrophonic musical instruments using wireless transmission, e.g. radio, light, infrared
    • GPHYSICS
    • G10MUSICAL INSTRUMENTS; ACOUSTICS
    • G10HELECTROPHONIC MUSICAL INSTRUMENTS; INSTRUMENTS IN WHICH THE TONES ARE GENERATED BY ELECTROMECHANICAL MEANS OR ELECTRONIC GENERATORS, OR IN WHICH THE TONES ARE SYNTHESISED FROM A DATA STORE
    • G10H2220/00Input/output interfacing specifically adapted for electrophonic musical tools or instruments
    • G10H2220/005Non-interactive screen display of musical or status data
    • GPHYSICS
    • G10MUSICAL INSTRUMENTS; ACOUSTICS
    • G10HELECTROPHONIC MUSICAL INSTRUMENTS; INSTRUMENTS IN WHICH THE TONES ARE GENERATED BY ELECTROMECHANICAL MEANS OR ELECTRONIC GENERATORS, OR IN WHICH THE TONES ARE SYNTHESISED FROM A DATA STORE
    • G10H2220/00Input/output interfacing specifically adapted for electrophonic musical tools or instruments
    • G10H2220/155User input interfaces for electrophonic musical instruments
    • G10H2220/185Stick input, e.g. drumsticks with position or contact sensors
    • GPHYSICS
    • G10MUSICAL INSTRUMENTS; ACOUSTICS
    • G10HELECTROPHONIC MUSICAL INSTRUMENTS; INSTRUMENTS IN WHICH THE TONES ARE GENERATED BY ELECTROMECHANICAL MEANS OR ELECTRONIC GENERATORS, OR IN WHICH THE TONES ARE SYNTHESISED FROM A DATA STORE
    • G10H2220/00Input/output interfacing specifically adapted for electrophonic musical tools or instruments
    • G10H2220/155User input interfaces for electrophonic musical instruments
    • G10H2220/191Plectrum or pick sensing, e.g. for detection of string striking or plucking
    • GPHYSICS
    • G10MUSICAL INSTRUMENTS; ACOUSTICS
    • G10HELECTROPHONIC MUSICAL INSTRUMENTS; INSTRUMENTS IN WHICH THE TONES ARE GENERATED BY ELECTROMECHANICAL MEANS OR ELECTRONIC GENERATORS, OR IN WHICH THE TONES ARE SYNTHESISED FROM A DATA STORE
    • G10H2220/00Input/output interfacing specifically adapted for electrophonic musical tools or instruments
    • G10H2220/155User input interfaces for electrophonic musical instruments
    • G10H2220/201User input interfaces for electrophonic musical instruments for movement interpretation, i.e. capturing and recognizing a gesture or a specific kind of movement, e.g. to control a musical instrument
    • GPHYSICS
    • G10MUSICAL INSTRUMENTS; ACOUSTICS
    • G10HELECTROPHONIC MUSICAL INSTRUMENTS; INSTRUMENTS IN WHICH THE TONES ARE GENERATED BY ELECTROMECHANICAL MEANS OR ELECTRONIC GENERATORS, OR IN WHICH THE TONES ARE SYNTHESISED FROM A DATA STORE
    • G10H2220/00Input/output interfacing specifically adapted for electrophonic musical tools or instruments
    • G10H2220/155User input interfaces for electrophonic musical instruments
    • G10H2220/201User input interfaces for electrophonic musical instruments for movement interpretation, i.e. capturing and recognizing a gesture or a specific kind of movement, e.g. to control a musical instrument
    • G10H2220/206Conductor baton movement detection used to adjust rhythm, tempo or expressivity of, e.g. the playback of musical pieces
    • GPHYSICS
    • G10MUSICAL INSTRUMENTS; ACOUSTICS
    • G10HELECTROPHONIC MUSICAL INSTRUMENTS; INSTRUMENTS IN WHICH THE TONES ARE GENERATED BY ELECTROMECHANICAL MEANS OR ELECTRONIC GENERATORS, OR IN WHICH THE TONES ARE SYNTHESISED FROM A DATA STORE
    • G10H2220/00Input/output interfacing specifically adapted for electrophonic musical tools or instruments
    • G10H2220/155User input interfaces for electrophonic musical instruments
    • G10H2220/365Bow control in general, i.e. sensors or transducers on a bow; Input interface or controlling process for emulating a bow, bowing action or generating bowing parameters, e.g. for appropriately controlling a specialised sound synthesiser
    • GPHYSICS
    • G10MUSICAL INSTRUMENTS; ACOUSTICS
    • G10HELECTROPHONIC MUSICAL INSTRUMENTS; INSTRUMENTS IN WHICH THE TONES ARE GENERATED BY ELECTROMECHANICAL MEANS OR ELECTRONIC GENERATORS, OR IN WHICH THE TONES ARE SYNTHESISED FROM A DATA STORE
    • G10H2220/00Input/output interfacing specifically adapted for electrophonic musical tools or instruments
    • G10H2220/155User input interfaces for electrophonic musical instruments
    • G10H2220/395Acceleration sensing or accelerometer use, e.g. 3D movement computation by integration of accelerometer data, angle sensing with respect to the vertical, i.e. gravity sensing.
    • GPHYSICS
    • G10MUSICAL INSTRUMENTS; ACOUSTICS
    • G10HELECTROPHONIC MUSICAL INSTRUMENTS; INSTRUMENTS IN WHICH THE TONES ARE GENERATED BY ELECTROMECHANICAL MEANS OR ELECTRONIC GENERATORS, OR IN WHICH THE TONES ARE SYNTHESISED FROM A DATA STORE
    • G10H2230/00General physical, ergonomic or hardware implementation of electrophonic musical tools or instruments, e.g. shape or architecture
    • G10H2230/045Special instrument [spint], i.e. mimicking the ergonomy, shape, sound or other characteristic of a specific acoustic musical instrument category
    • G10H2230/251Spint percussion, i.e. mimicking percussion instruments; Electrophonic musical instruments with percussion instrument features; Electrophonic aspects of acoustic percussion instruments, MIDI-like control therefor
    • G10H2230/275Spint drum
    • G10H2230/281Spint drum assembly, i.e. mimicking two or more drums or drumpads assembled on a common structure, e.g. drum kit

Definitions

  • Historically, live performances featured little in the way of visual effects. More recently, live performances have been augmented by video, lighting effects, pyrotechnics, and props. While these effects have been entertaining, they do not let the audience experience the musician's point of view. These effects are further limited in that they are not controllable by the musician during the performance.
  • FIG. 1 illustrates a pair of drum sticks for creating visualization or audio effects in accordance with some embodiments.
  • FIGS. 2A-2C illustrate example visualization effects in accordance with some embodiments.
  • FIGS. 3A-3C illustrate instrumentation objects for creating visualization or audio effects in accordance with some embodiments.
  • FIG. 4 illustrates a system for providing effects corresponding to movement of an instrumentation object in accordance with some embodiments.
  • FIG. 5 illustrates a flowchart showing a technique for providing effects corresponding to movement of an instrumentation object in accordance with some embodiments.
  • FIG. 6 illustrates generally an example of a block diagram of a machine upon which any one or more of the techniques (e.g., methodologies) discussed herein may perform in accordance with some embodiments.
  • Systems and methods for providing virtual instrument visualization effects corresponding to movement of physical objects, such as drum sticks, a violin bow, a guitar pick, a conductor baton, or the like are described herein.
  • the systems and methods described herein are used to augment and enrich experiences from traditional musical instruments by communicating with a device to perform motion sensing, gesture detection, and wireless communication.
  • the systems and methods described herein are used for music performance with enhanced experiences.
  • Motion sensing is added to one or more musical instruments, such as a drum stick, a violin/viola/cello bow, a guitar pick, an end of a guitar (e.g., headstock), a conductor's baton, or the like.
  • the musical experience of an artist or audience may be enhanced, and may include real instruments or virtual instruments, such as those used in virtual reality (VR) or augmented reality (AR) systems.
  • Some traditional systems may use one or more cameras to track musical instruments. An issue with these traditional systems is that the tracked instrument may be occluded from the camera's view by a player's own hand.
  • the presently described systems and methods use a sensor, such as a nine-axis gyroscope and accelerometer, which may measure rotation, location, or orientation.
  • the sensor may be used without a camera, which allows uninterrupted rotation, location, or orientation information to be available while a user plays, without concern for a camera line of sight.
  • the systems and methods described herein provide audible and visual feedback to be played and displayed, respectively, when an action, motion, or gesture occurs at a device (e.g., a drum stick, a violin/viola/cello bow, a guitar pick, an end of a guitar, a conductor's baton, or the like).
  • the audible feedback may include sound created at a musical instrument by the action, motion, or gesture (e.g., when a violin bow traverses a violin string, the string vibrates, causing sound to be produced), which may be augmented (e.g., amplified and played by a speaker, changed, or distorted) or the created sound may be added to by the audible feedback (e.g., additional sound may be played not caused by the musical instrument).
  • the audible feedback may include sound created by a computer system, such as a Musical Instrument Digital Interface (MIDI) system, a drum kit, etc. This type of audible feedback may be created by, for example, tracking a gesture of one or more drum sticks (e.g., without hitting an actual drum), tracking movement of a conductor's baton, etc.
  • the visual feedback provided in the system may be displayed for an audience, a performer, a remote viewer, or a combination thereof.
  • the visual feedback may include using a display screen, a VR or AR display, etc.
  • Other visual feedback may include fireworks, animated or video sequences, lighting effects on a wearable device, or the like.
  • the visual or audible feedback described herein may be based on additional activity of a user, such as dancing, gestures, predetermined effects, or the like.
  • FIG. 1 illustrates a pair of drum sticks 100 for creating visualization or audio effects in accordance with some embodiments.
  • the pair of drum sticks 100 includes a first drum stick 102 A and a second drum stick 102 B.
  • Each of the pair of drum sticks 100 includes a respective motion sensor 104 A and 104 B and respective circuitry 106 A and 106 B.
  • Although a single sensor 104 A, 104 B is illustrated on each drum stick in FIG. 1, it is understood that multiple sensors may be used on each drum stick 102 A and 102 B.
  • the circuitry 106 A and 106 B may include a transceiver, a processor, memory, or a system on a chip.
  • the first drum stick 102 A may be a parent and the second drum stick 102 B may be a child.
  • the parent may receive information from the sensor 104 B of the child, and the parent may forward that sensor information along with information from the sensor 104 A of the parent to a device, such as a mobile device, a computer, a server, etc., for further processing.
  • the pair of drum sticks 100 may independently send information to a device.
  • the pair of drum sticks 100 may be paired by assigning one to a left hand of a user and one to a right hand of a user (or simulating such).
  • the left hand assigned drum stick may be assigned to output a first set of audible or visual feedback and the right hand assigned drum stick may be assigned to output a second set of audible or visual feedback.
  • either drum stick may be used to cause drum sound as audible feedback based on the location and motion of the drum stick, and one of the drum sticks may be assigned to a first visual effect (e.g., a flashing light on a proximate wearable device), and the other of the drum sticks may be assigned to a second visual effect (e.g., a visual effect on a display screen).
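As a concrete illustration of the left/right assignment described above, the sketch below keeps a small mapping from a stick's assigned role to its feedback set. The role names and effect labels are hypothetical placeholders, not terms from the patent.

```python
# Minimal sketch: each stick of a paired set is assigned its own feedback.
# Both sticks trigger the drum sound, but the left-hand stick drives wearable
# lights while the right-hand stick drives an on-screen effect (illustrative).
STICK_EFFECTS = {
    "left":  {"audio": "snare", "visual": "wearable_flash"},
    "right": {"audio": "snare", "visual": "screen_burst"},
}

def effects_for(stick_role):
    # Fall back to audio-only feedback for an unassigned stick.
    return STICK_EFFECTS.get(stick_role, {"audio": "snare", "visual": None})

print(effects_for("left"))
print(effects_for("right"))
```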
  • the sensors 104 A, 104 B may include a magnetometer, accelerometer, or gyroscope.
  • the sensors 104 A, 104 B may include a nine-axis sensor with a magnetometer, accelerometer, and a gyroscope for detecting location, position, and movement.
  • the sensors 104 A, 104 B may be initiated at a starting location, position, or orientation, such that the sensors 104 A, 104 B may determine relative locations, positions, movement, or orientations in response to changes by the pair of drum sticks 100 .
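The following sketch shows one conventional way such relative orientation tracking could be implemented in software: a complementary filter that fuses gyroscope and accelerometer samples into pitch and roll angles relative to the initialized starting pose. The class, axis conventions, and sample values are illustrative assumptions rather than the patent's implementation.

```python
# Minimal sketch: estimating a drum stick's pitch and roll relative to a
# starting pose by fusing gyroscope and accelerometer readings with a
# complementary filter. Names and sample values are hypothetical.
import math

class OrientationTracker:
    def __init__(self, alpha=0.98):
        self.alpha = alpha      # weight given to the integrated gyro estimate
        self.pitch = 0.0        # radians, relative to the initial pose
        self.roll = 0.0

    def update(self, gyro, accel, dt):
        """gyro: (gx, gy, gz) in rad/s; accel: (ax, ay, az) in g; dt: seconds."""
        gx, gy, gz = gyro
        ax, ay, az = accel
        # Integrate angular rate to propagate the previous estimate.
        pitch_gyro = self.pitch + gy * dt
        roll_gyro = self.roll + gx * dt
        # Accelerometer gives an absolute (but noisy) gravity reference.
        pitch_acc = math.atan2(-ax, math.sqrt(ay * ay + az * az))
        roll_acc = math.atan2(ay, az)
        # Blend: gyro for short-term smoothness, accelerometer to cancel drift.
        self.pitch = self.alpha * pitch_gyro + (1 - self.alpha) * pitch_acc
        self.roll = self.alpha * roll_gyro + (1 - self.alpha) * roll_acc
        return self.pitch, self.roll

tracker = OrientationTracker()
# One 10 ms sample: stick rotating about its y-axis while roughly level.
print(tracker.update(gyro=(0.0, 1.2, 0.0), accel=(0.05, 0.0, 1.0), dt=0.01))
```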
  • the sensor 104 A may provide data based on movement of the drum stick 102 A.
  • the transceiver of circuitry 106 A may transmit the sensor data to a device, such as a wearable device (e.g., a smart watch), a mobile device, a computer, a remote server, or the like.
  • the device receiving the sensor data may include a processor to recognize a gesture from the sensor data.
  • the gesture may be used to determine a visualization effect corresponding to the gesture or an audio effect including a drum sound corresponding to the gesture.
  • the gesture may include movement from the pair of drum sticks 100 in coordination with each other. In an example, timing information may be sent to coordinate displaying the visualization effect and playing the audio effect.
  • the gesture may be based on an orientation, location, movement, or force of one or both of the pair of drum sticks 100 , for example as determined by one or more of the first sensor 104 A or the second sensor 104 B.
  • the gesture may be determined based on additional input, such as an additional sensor attached to an ankle or a foot of a user controlling the drum stick 102 A.
  • the additional sensor may output data to cause a second audio effect, such as a second drum sound (e.g., a bass drum sound) corresponding to a second gesture.
  • the second audio effect may be played by a speaker.
  • the gesture may include one or more of a linear movement, a tapping movement, a sweeping movement, a minimum acceleration, a minimum deceleration, or the like.
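A tapping or striking gesture of the kind listed above could, for example, be recognized from the acceleration stream as a fast swing followed by a sharp deceleration spike. The sketch below shows that idea with hypothetical thresholds and a hypothetical sample format.

```python
# Minimal sketch of one way to recognize a drum-strike gesture: a rapid
# downward swing followed by an abrupt deceleration spike. Thresholds and the
# sensor stream format are illustrative assumptions, not from the patent.
def detect_strike(samples, swing_threshold=-8.0, spike_threshold=25.0):
    """samples: list of (timestamp_s, vertical_accel_m_s2) tuples.
    Returns the timestamp of a detected strike, or None."""
    swinging = False
    for t, a in samples:
        if a < swing_threshold:                  # fast downward acceleration: swing phase
            swinging = True
        elif swinging and a > spike_threshold:   # abrupt reversal: impact or stop
            return t
    return None

stream = [(0.00, -1.0), (0.01, -9.5), (0.02, -12.0), (0.03, 30.0), (0.04, 2.0)]
print(detect_strike(stream))  # -> 0.03
```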
  • a display device may be used to display the visualization effect or a speaker may be used to play the audio effect.
  • the speaker may be controlled by a MIDI player.
  • the audio effect may include Multidimensional Polyphonic Expression instructions for use by the MIDI player.
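To make the audio path concrete, the sketch below sends a drum note to a MIDI synthesizer using the mido library (which requires a MIDI backend such as python-rtmidi and an available output port). The velocity mapping is an illustrative assumption; a full Multidimensional Polyphonic Expression setup would additionally assign each note its own channel, which is omitted here.

```python
# Minimal sketch: turn a recognized drum gesture into an audio effect by
# sending a MIDI note. Note number and velocity scaling are assumptions.
import time
import mido

def play_drum_hit(port, strike_speed):
    # Map gesture intensity to MIDI velocity; channel 9 (zero-based) is the
    # General MIDI drum channel, note 38 is an acoustic snare.
    velocity = max(1, min(127, int(strike_speed * 10)))
    port.send(mido.Message('note_on', channel=9, note=38, velocity=velocity))
    time.sleep(0.1)                      # let the hit ring briefly
    port.send(mido.Message('note_off', channel=9, note=38))

with mido.open_output() as port:         # opens the default MIDI output port
    play_drum_hit(port, strike_speed=9.5)
```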
  • the processor of circuitry 106 A may be used to determine the gesture.
  • a visualization effect may be determined based on one or more previously recognized gestures (e.g., a series).
  • the visualization effect may include using a plurality of wearable devices within a predetermined proximity of the drum stick 102 A to display the visualization effect. Devices within the predetermined proximity may include devices on a network, within range of a Bluetooth device or devices, within a specified distance, within a room, etc.
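One plausible shape for fanning a visualization effect out to nearby wearables is shown below: filter a device registry by estimated distance and push the effect to each device in range. The Wearable class, distance field, and send_effect transport are hypothetical placeholders.

```python
# Minimal sketch of broadcasting a lighting effect to wearable devices within
# a configured proximity of the drum stick. Transport details are hypothetical.
from dataclasses import dataclass

@dataclass
class Wearable:
    device_id: str
    distance_m: float            # estimated distance from the drum stick

    def send_effect(self, effect):
        # Placeholder for a Bluetooth/Wi-Fi push to the device.
        print(f"{self.device_id}: {effect}")

def broadcast_lighting(devices, effect, max_distance_m=20.0):
    for device in devices:
        if device.distance_m <= max_distance_m:
            device.send_effect(effect)

audience = [Wearable("wrist-001", 3.2), Wearable("wrist-002", 18.7), Wearable("wrist-003", 45.0)]
broadcast_lighting(audience, {"type": "flash", "color": "#00c0ff", "duration_ms": 250})
```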
  • FIGS. 2A-2C illustrate example visualization effects 200 A- 200 C in accordance with some embodiments.
  • the first visualization effect 200 A includes sensor visualization effects 202 and orientation visualization effects 204 and 206 .
  • the second visualization effect 200 B includes a front-facing view of a drum set and a drummer.
  • the third visualization effect 200 C includes a point-of-view perspective display of a drum set 208 .
  • the components of the visualization effects 200 A- 200 C may include virtual components or augmented reality components.
  • the drum sets displayed in visualization effects 200 B- 200 C may be virtual (e.g., displayed using a VR display).
  • Motion data or gesture primitives detected by a sensor may be used to create visualizations or visual effects to accompany a performance, such as to enhance an audience experience.
  • enhanced visual experiences may be shown, such as capturing activity level (e.g., sensor visualization effects 202 related to an accelerometer or gyroscope of a drum stick or other musical instrument), orientation of the musical instrument device (e.g., drum sticks, as shown in the orientation visualization effects 204 - 206 ), or other aspects of a performance.
  • a system to predetermine the visualization effect 200 A may include a customizable platform to select a background or visual effects, which may be manipulated by a user. Audience interaction may be enabled, such as at wearable devices (e.g., wrist bands) where lights may turn on and off, which may be controlled using the musical instrument device (e.g., drum sticks).
  • the visualization effects 200 A- 200 B may be presented to an audience of a user controlling a musical instrument device.
  • a display screen may be used to display visualization effects 200 A or 200 B.
  • the point-of-view drum set 208 may be presented, in an example, on a display screen to an audience.
  • the visualization effect 200 C may be presented using a VR display to a controller of the musical instrument device (e.g., drum sticks).
  • the visualization effect 200 C within the VR display may show the point-of-view drum set 208 and may include virtual drum sticks 210 or a virtual pedal 212 .
  • the virtual drum sticks 210 may be displayed virtually at a location corresponding to real drum sticks based on sensors and location information of the real drum sticks.
  • the user of the real drum sticks, while wearing the VR display, may see the virtual drum sticks 210 as if the user were holding the virtual drum sticks 210 (and hands may also be shown to further this effect).
  • the user may wear an ankle or foot device with a sensor to detect motion of the ankle or foot. The detected motion may cause the foot pedal 212 to move (e.g., in the VR environment) and may cause an audible or visual effect to occur.
  • the user wearing the VR display may play the drum set 208 virtually with the virtual drum sticks 210 by controlling the real drum sticks (and optionally the foot pedal 212 ).
  • the drum set 208 may move or display a visualization according to motion of the real drum sticks (e.g., the cymbals may crash, a drum head may appear to vibrate, a played drum may light up, etc.).
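For the virtual drum set, the struck drum could be resolved by mapping the tracked stick-tip position onto zones of the virtual kit, as in the sketch below; the zone layout, radii, and MIDI note numbers are illustrative assumptions.

```python
# Minimal sketch: map a tracked drum-stick tip position onto zones of a
# virtual drum set so the struck drum can light up and the matching sound
# can be selected. Layout values are hypothetical.
import math

VIRTUAL_KIT = {
    "snare":  {"center": (0.0, 0.4),   "radius": 0.18, "midi_note": 38},
    "hi_tom": {"center": (0.25, 0.6),  "radius": 0.15, "midi_note": 48},
    "crash":  {"center": (-0.35, 0.8), "radius": 0.20, "midi_note": 49},
}

def struck_drum(tip_xy):
    """Return the name of the virtual drum under the stick tip, if any."""
    for name, zone in VIRTUAL_KIT.items():
        cx, cy = zone["center"]
        if math.hypot(tip_xy[0] - cx, tip_xy[1] - cy) <= zone["radius"]:
            return name
    return None

print(struck_drum((0.05, 0.45)))   # -> "snare"
print(struck_drum((1.0, 1.0)))     # -> None
```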
  • FIGS. 3A-3C illustrate instrumentation objects 300 A- 300 C for creating visualization or audio effects in accordance with some embodiments.
  • the instrumentation object 300 A may be a violin bow, viola bow, cello bow, or other stringed instrument bow.
  • the instrumentation object 300 A includes a sensor 304 and may include circuitry 306 , such as a transceiver, a processor, memory, or a system on a chip.
  • the sensor 304 is located at the tip of the instrumentation object 300 A.
  • the sensor 304 may be located in the middle of the instrumentation object 300 A or at the frog.
  • the sensor 304 may include a gyroscope, an accelerometer, or a magnetometer to determine position, orientation, or movement of the instrumentation object 300 A.
  • a plurality of sensors may be disposed on the instrumentation object 300 A (e.g., one at the tip, one in the middle, and one at the frog).
  • the sensor 304 may track back and forth movement of the instrumentation object 300 A.
  • the sensor 304 may track bow tapping movements (e.g. in a perpendicular or partially perpendicular movement to the back and forth traditional bow movement on a stringed instrument).
  • the tracked movement (or position or orientation) of the instrumentation object 300 A may be used to create or identify visual effects to be shown.
  • the visual effects may be matched to the music created by playing the stringed instrument with the instrumentation object 300 A.
  • augmented audible feedback may be created or identified by the tracked movement, which may be played in addition to the music created.
  • a real violin may be played using the instrumentation object 300 A as a bow, and the sensor on the bow may add to a performance experience by integrating visual or audio effects in addition to the music created by playing the violin.
  • when the bow moves in a first direction, a first visual or audible effect may be created or identified, and when the bow moves in a second direction, a second visual or audible effect may be created or identified.
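A direction-dependent effect of this kind could be selected by classifying the bow's velocity along its long axis, as in the sketch below. The axis convention, dead zone, and effect names are illustrative assumptions.

```python
# Minimal sketch: classify bow travel as an up-bow or down-bow from velocity
# along the bow's long axis and pick a different effect for each direction.
def classify_bow_stroke(axis_velocity, dead_zone=0.05):
    """axis_velocity: signed speed (m/s) along the bow axis; positive = down-bow."""
    if axis_velocity > dead_zone:
        return "down_bow"
    if axis_velocity < -dead_zone:
        return "up_bow"
    return "idle"

EFFECTS = {
    "down_bow": {"visual": "warm_glow",   "audio": "reverb_swell"},
    "up_bow":   {"visual": "cool_ripple", "audio": "delay_tail"},
    "idle":     None,
}

for v in (0.4, -0.3, 0.01):
    stroke = classify_bow_stroke(v)
    print(stroke, EFFECTS[stroke])
```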
  • mixers may be used to add in augmented sound.
  • a Multidimensional Polyphonic Expression for use with a MIDI player may be used to create or play the augmented sound.
  • a player of an instrument using the instrumentation object 300 A may have a sensor on a finger or fingers of the player.
  • a violin player may place a sensor or sensors on one or more fingers used to play violin (e.g., a fourth finger of the player's left hand).
  • the movement of the pinky finger may indicate a particular visual effect.
  • notes may often be played interchangeably with different fingers (e.g., the fourth finger in first position on a first string, an open second string, or a second finger in a third position on the first string may all correspond to a single note).
  • based on which finger is used to play the note, a specific visual (or audible) effect may be identified or created.
  • the instrumentation object 300 A may be in communication with a server.
  • the server may include a processor to receive sensor data from the sensor 304 of the instrumentation object 300 A, the sensor data may be based on movement of the instrumentation object 300 A.
  • the processor may recognize a gesture from the sensor data, such as a back or forth movement, a tapping of the instrumentation object 300 A on a string, etc.
  • the processor may determine, such as from the gesture, a visualization effect corresponding to the gesture or an audio effect corresponding to the gesture.
  • the visualization effect may be determined using a visualization engine.
  • the processor may cause the visualization effect or the audio effect to be output in response to the determination.
  • the audio effect may include a natural sound caused by the movement of the instrumentation object 300 A.
  • causing the visualization effect to be output may include sending the visualization effect to a virtual reality headset of a user controlling the instrumentation object 300 A, for example, to be displayed on the VR headset.
  • Causing the audio effect to be output may include sending the audio effect to a speaker to play the audio effect.
  • the processor may send the visualization effect to a display to be displayed in coordination with the audio effect played by the speaker.
  • the processor may receive data from the sensor 304 indicating an initial position of the instrumentation object 300 A and recognize the gesture based on a determined final position of the instrumentation object 300 A.
  • the visualization effect or the audio effect may be determined based on an orientation of the instrumentation object 300 A identified in the sensor data.
  • the visualization effect may be based on one or more (e.g., a series) of previously recognized gestures.
  • the visualization effect may include a lighting effect.
  • Outputting the visualization effect may include sending the lighting effect to a plurality of wearable devices, such as within a predetermined proximity of the instrumentation object 300 A to be displayed, for example at the plurality of wearable devices.
  • Devices within the predetermined proximity may include devices on a wi-fi network, within range of a Bluetooth device or devices, within a specified distance, within a room, etc.
  • the gesture may include a linear movement, a tapping movement, a sweeping movement, a minimum or maximum acceleration, a maximum or minimum deceleration, or the like.
  • the audio effect may include a Multidimensional Polyphonic Expression instruction for a MIDI player.
  • the instrumentation object 300 B may be a guitar pick.
  • the instrumentation object 300 B includes a sensor 310 and may include circuitry 312 , such as a transceiver, a processor, memory, or a system on a chip.
  • the instrumentation object 300 B may be used to strum a stringed instrument, such as a guitar. Movement of the instrumentation object 300 B may correspond with a visual or audible effect to be produced. For example, when the instrumentation object 300 B is used to strum a guitar upward, a first visual or audible effect may be identified and when the instrumentation object 300 B is used to strum the guitar downward, a second visual or audible effect may be identified and used.
  • the instrumentation object 300 B may be in communication with a server.
  • the server may include a processor to receive sensor data from the sensor 310 of the instrumentation object 300 B, the sensor data may be based on movement of the instrumentation object 300 B.
  • the processor (e.g., circuitry 312 ) may recognize a gesture from the sensor data, such as a strumming movement, a slapping movement, etc.
  • the processor may determine, such as from the gesture, a visualization effect corresponding to the gesture or an audio effect corresponding to the gesture.
  • the visualization effect may be determined using a visualization engine.
  • the processor may cause the visualization effect or the audio effect to be output in response to the determination.
  • the audio effect may include a natural sound caused by the movement of the instrumentation object 300 B.
  • causing the visualization effect to be output may include sending the visualization effect to a virtual reality headset of a user controlling the instrumentation object 300 B, for example, to be displayed on the VR headset.
  • Causing the audio effect to be output may include sending the audio effect to a speaker to play the audio effect.
  • the processor may send the visualization effect to a display to be displayed in coordination with the audio effect played by the speaker.
  • the processor may receive data from the sensor 310 indicating an initial position of the instrumentation object 300 B and recognize the gesture based on a determined final position of the instrumentation object 300 B.
  • the visualization effect or the audio effect may be determined based on an orientation of the instrumentation object 300 B identified in the sensor data.
  • the visualization effect may be based on one or more (e.g., a series) of previously recognized gestures.
  • the visualization effect may include a lighting effect.
  • Outputting the visualization effect may include sending the lighting effect to a plurality of wearable devices, such as within a predetermined proximity of the instrumentation object 300 B to be displayed, for example at the plurality of wearable devices.
  • Devices within the predetermined proximity may include devices on a wi-fi network, within range of a Bluetooth device or devices, within a specified distance, within a room, etc.
  • the gesture may include a linear movement, a tapping movement, a sweeping movement, a minimum or maximum acceleration, a maximum or minimum deceleration, or the like.
  • the audio effect may include a Multidimensional Polyphonic Expression instruction for a MIDI player.
  • the instrumentation object 300 C may be a conductor's baton.
  • the instrumentation object 300 C includes a sensor 316 and may include circuitry 318 , such as a transceiver, a processor, memory, or a system on a chip.
  • the instrumentation object 300 C may be used to conduct an orchestra, either real or virtual.
  • the real orchestra may play music in response to movement of the instrumentation object 300 C or orchestral sound may be created in response to movement of the instrumentation object 300 C with a virtual orchestra.
  • a visual effect or audible effect may be created or identified in response to movement of the instrumentation object 300 C.
  • the instrumentation object 300 C may be in communication with a server.
  • the server may include a processor to receive sensor data from the sensor 316 of the instrumentation object 300 C, the sensor data may be based on movement of the instrumentation object 300 C.
  • the processor (e.g., circuitry 318 ) may recognize a gesture from the sensor data, such as an up-and-down or left-and-right movement, a conducting cadence movement (e.g., based on a tempo of music being played, such as 3/4, 4/4, 7/8, etc.), or the like.
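As one example of interpreting a conducting cadence, the sketch below estimates a tempo from the timestamps of recognized down-beat gestures; the beat times shown are illustrative, and real input would come from the gesture recognizer.

```python
# Minimal sketch: estimate a conducting tempo from the times at which
# down-beat gestures of the baton are recognized.
def estimate_tempo_bpm(beat_times):
    """beat_times: ascending timestamps (seconds) of detected down-beats."""
    if len(beat_times) < 2:
        return None
    intervals = [b - a for a, b in zip(beat_times, beat_times[1:])]
    mean_interval = sum(intervals) / len(intervals)
    return 60.0 / mean_interval

print(round(estimate_tempo_bpm([0.0, 0.52, 1.01, 1.49, 2.02]), 1))  # ~118.8 BPM
```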
  • the processor may determine, such as from the gesture, a visualization effect corresponding to the gesture or an audio effect corresponding to the gesture.
  • the visualization effect may be determined using a visualization engine.
  • the processor may cause the visualization effect or the audio effect to be output in response to the determination.
  • causing the visualization effect to be output may include sending the visualization effect to a virtual reality headset of a user controlling the instrumentation object 300 C, for example, to be displayed on the VR headset.
  • Causing the audio effect to be output may include sending the audio effect to a speaker to play the audio effect.
  • the processor may send the visualization effect to a display to be displayed in coordination with the audio effect played by the speaker.
  • the processor may receive data from the sensor 316 indicating an initial position of the instrumentation object 300 C and recognize the gesture based on a determined final position of the instrumentation object 300 C.
  • the visualization effect or the audio effect may be determined based on an orientation of the instrumentation object 300 C identified in the sensor data.
  • the visualization effect may be based on one or more (e.g., a series) of previously recognized gestures.
  • the visualization effect may include a lighting effect.
  • Outputting the visualization effect may include sending the lighting effect to a plurality of wearable devices, such as within a predetermined proximity of the instrumentation object 300 C to be displayed, for example at the plurality of wearable devices.
  • Devices within the predetermined proximity may include devices on a wi-fi network, within range of a Bluetooth device or devices, within a specified distance, within a room, etc.
  • the gesture may include a linear movement, a tapping movement, a sweeping movement, a minimum or maximum acceleration, a maximum or minimum deceleration, or the like.
  • the audio effect may include a Multidimensional Polyphonic Expression instruction for a MIDI player.
  • FIG. 4 illustrates a system 400 for providing effects corresponding to movement of an instrumentation object in accordance with some embodiments.
  • Remote instrumentation devices may include a drum stick 410 , a violin bow 416 , a guitar pick 422 , a conductor's baton 428 , or the like.
  • the instrumentation devices 410 , 416 , 422 , and 428 may include respective sensors (e.g., 412 , 418 , 424 , 430 ) and optionally respective transceivers or processors (e.g., 414 , 420 , 426 , 432 ).
  • the system 400 includes a server 401 in communication with one or more remote instrumentation devices (e.g., 410 , 416 , 422 , 428 ), or a wearable device 408 .
  • the server 401 includes a processor 402 , memory 404 , and a visualization engine 406 .
  • the processor 402 may receive sensor data from a sensor ( 412 , 418 , 424 , or 430 ) of one or more of the remote instrumentation devices (e.g., 410 , 416 , 422 , 428 ), such as the drum stick sensor 412 .
  • the drum stick 410 may be paired with a second drum stick, and the pair may include a parent and a child drum stick.
  • the child drum stick may have limited communication capabilities (e.g., capable of communicating with the parent drum stick, but incapable of communicating with another remote device).
  • the parent drum stick may have the processor 414 or a transceiver, for example to communicate with a mobile device, wearable device, or remote device.
  • the pair of drum sticks may be used together.
  • the processor 402 may receive the sensor data from one of the pair of drum sticks (e.g., a parent) or both (e.g., individually, or via the parent). In an example, the sensor data is based on movement of the drum stick 410 .
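The parent/child relay described above might look roughly like the following: the parent buffers its own samples and the child's, then flushes a combined packet to the server. The packet layout and the send_to_server transport are hypothetical.

```python
# Minimal sketch of the parent/child pairing: the child stick forwards its
# samples to the parent, and the parent batches both streams to the server.
import json
import time

class ParentStick:
    def __init__(self, stick_id, send_to_server):
        self.stick_id = stick_id
        self.send_to_server = send_to_server
        self.pending = []

    def on_own_sample(self, sample):
        self.pending.append({"stick": self.stick_id, **sample})

    def on_child_sample(self, child_id, sample):
        # The child cannot reach the server directly; relay on its behalf.
        self.pending.append({"stick": child_id, **sample})

    def flush(self):
        packet = {"timestamp": time.time(), "samples": self.pending}
        self.send_to_server(json.dumps(packet))
        self.pending = []

parent = ParentStick("stick-A", send_to_server=print)
parent.on_own_sample({"accel": [0.1, -9.6, 0.3]})
parent.on_child_sample("stick-B", {"accel": [0.0, -2.1, 0.1]})
parent.flush()
```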
  • the processor 402 may recognize a gesture from the sensor data.
  • the gesture may include a drum strike, a violin playing movement, a conductor baton conducting movement, a guitar strum, etc.
  • the gesture may include one or more of a linear movement, a tapping movement, a sweeping movement, a minimum or maximum acceleration, a minimum or maximum deceleration, or the like.
  • the processor 402 may determine, for example using the gesture, a visualization effect corresponding to the gesture.
  • the visualization effect may be determined using the visualization engine 406 .
  • the processor 402 is to determine the visualization effect based on a series of previously recognized gestures.
  • the visualization effect may include a lighting effect, such as a flashing light or light sequence on a screen, a virtual reality light effect, or a light effect sent for display to a plurality of wearable devices (e.g., the wearable device 408 ).
  • the plurality of wearable devices may be identified within a proximity of the remote instrumentation devices (e.g., 410 , 416 , 422 , 428 ).
  • the processor 402 may receive wearable sensor data from the plurality of wearable devices (e.g., the wearable device 408 ), which may be within a predetermined proximity of the remote instrumentation devices (e.g., 410 , 416 , 422 , 428 ).
  • the visualization effect may be modified, for example, based on the wearable sensor data.
  • Devices within the predetermined proximity may include devices on a wi-fi network, within range of a Bluetooth device or devices, within a specified distance, within a room, etc.
  • the processor 402 may determine an audio effect corresponding to the gesture including, for example, a drum sound, a violin sound (or other stringed instrument sound), a guitar sound, an orchestral sound (e.g., a combination of sounds from a plurality of instruments), or the like.
  • the visualization effect and the audio effect are determined based on an orientation of the remote instrumentation devices (e.g., 410 , 416 , 422 , 428 ) identified in the sensor data.
  • the audio effect may include Multidimensional Polyphonic Expression instructions for a musical instrument digital interface (MIDI) player.
  • the processor 402 may cause the visualization effect or the audio effect to be output, such as in response to the determination.
  • the visualization effect may be output to a virtual reality headset 434 , which may display the visualization effect using a virtual reality display.
  • the virtual reality headset 434 may be on a user who is controlling a remote instrumentation device (e.g., 410 , 416 , 422 , 428 ).
  • the visualization effect may be displayed in coordination with the audio effect played, for example, by a speaker.
  • the speaker may be used to play the audio effect.
  • the processor 402 may receive data from the sensor indicating an initial position of the remote instrumentation devices (e.g., 410 , 416 , 422 , 428 ). For example, the processor 402 may determine a final position of the remote instrumentation devices (e.g., 410 , 416 , 422 , 428 ), such as a drum stick 410 . In an example, the drum stick 410 may be used to generate the drum sound (e.g., without striking a drum), which may be determined based on the initial position and the final position.
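One simple way the initial and final positions could drive the generated drum sound is to convert the tip's average downward speed into a MIDI velocity, as sketched below; the scaling constant is an illustrative assumption.

```python
# Minimal sketch: derive a drum-hit velocity from the stick tip's initial and
# final vertical positions over the gesture duration, scaled to MIDI range.
def midi_velocity_from_motion(initial_z_m, final_z_m, duration_s, scale=40.0):
    """Average downward speed of the tip mapped to the 1..127 MIDI range."""
    speed = abs(final_z_m - initial_z_m) / max(duration_s, 1e-6)
    return max(1, min(127, int(speed * scale)))

print(midi_velocity_from_motion(initial_z_m=0.35, final_z_m=0.05, duration_s=0.12))  # -> 100
```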
  • the processor 402 may receive additional sensor data from a second sensor attached to an ankle or a foot of a user who is controlling the drum stick.
  • a second gesture may be recognized from the additional sensor data.
  • the processor 402 may determine from the second gesture, a second audio effect or a second visualization effect.
  • the second audio effect may include a second drum sound corresponding to the second gesture.
  • the processor 402 may cause the second audio effect or the second visualization effect to be output with the visualization effect or the audio effect.
  • FIG. 5 illustrates a flowchart showing a technique 500 for providing effects corresponding to movement of an instrumentation object in accordance with some embodiments.
  • the technique 500 includes an operation 502 to receive sensor data from a sensor of at least one drum stick of a pair of drum sticks.
  • the sensor data may be based on movement of the at least one drum stick.
  • the sensor data may include data from sensors of both the pair of drum sticks.
  • the gesture may include movement of the pair of drum sticks in coordination with each other.
  • the technique 500 includes an operation 504 to recognize a gesture.
  • the gesture may include a linear movement, a tapping movement, a sweeping movement, a minimum or maximum acceleration, a minimum or maximum deceleration, or the like.
  • the technique 500 includes an operation 506 to determine a visualization effect and an audio effect.
  • determining the visualization effect includes using a visualization engine.
  • determining the visualization effect includes determining the visualization effect based on one or more (e.g., a series) of previously recognized gestures.
  • the audio effect may include Multidimensional Polyphonic Expression instructions for a MIDI player.
  • the technique 500 includes an operation 508 to output the visualization effect and the audio effect, such as in response to the determination.
  • Outputting the visualization effect may include sending the visualization effect to a virtual reality headset of a user controlling the pair of drum sticks to be displayed on the virtual reality headset.
  • Outputting the audio effect may include sending the audio effect to a speaker to play the audio effect.
  • the visualization effect and the audio effect may be displayed and played, respectively, in coordination.
  • the technique 500 may include receiving data from the sensor indicating an initial position of the drum stick, and recognizing the gesture may include determining a final position of the drum stick.
  • the drum sound may be determined based on the initial position or the final position.
  • the visualization effect corresponding to the gesture, or the audio effect including the drum sound corresponding to the gesture, may be determined based on an orientation of the drum stick identified in the sensor data.
  • the technique 500 may include receiving additional sensor data from a second sensor attached to an ankle or a foot of a user controlling the drum stick, recognizing a second gesture from the additional sensor data, determining, from the second gesture, a second audio effect including a second drum sound corresponding to the second gesture, and causing the second audio effect to be output with the visualization effect and the audio effect.
  • the technique 500 may include receiving wearable sensor data from a plurality of wearable devices, such as within a predetermined proximity of the drum stick.
  • the visualization effect may be modified based on the wearable sensor data.
  • the visualization effect may include a lighting effect, which may be sent to a plurality of wearable devices within a predetermined proximity of the drum stick, such as to be displayed at the plurality of wearable devices.
  • Devices within the predetermined proximity may include devices on a wi-fi network, within range of a Bluetooth device or devices, within a specified distance, within a room, etc.
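Tying operations 502-508 together, the sketch below runs a simplified end-to-end pass: take a batch of sensor samples, recognize a gesture, look up the paired visualization and audio effects, and output both. Every component here is a simplified stand-in, not the patent's implementation.

```python
# Minimal end-to-end sketch of the technique: receive sensor data, recognize
# a gesture, determine the paired effects, and output both.
EFFECT_TABLE = {
    "strike": {"visual": "drum_flash", "audio": ("note_on", 38)},
    "sweep":  {"visual": "light_wave", "audio": ("note_on", 49)},
}

def recognize_gesture(samples):
    # Stand-in recognizer: a large deceleration spike means a strike,
    # a longer run of samples without a spike means a sweep.
    if any(a > 25.0 for _, a in samples):
        return "strike"
    if len(samples) > 3:
        return "sweep"
    return None

def run_pipeline(samples, display, speaker):
    gesture = recognize_gesture(samples)          # operation 504
    if gesture is None:
        return
    effects = EFFECT_TABLE[gesture]               # operation 506
    display(effects["visual"])                    # operation 508: visual output
    speaker(effects["audio"])                     # operation 508: audio output

run_pipeline([(0.0, -9.0), (0.01, -12.0), (0.02, 31.0)],
             display=lambda v: print("display:", v),
             speaker=lambda a: print("speaker:", a))
```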
  • FIG. 6 illustrates generally an example of a block diagram of a machine 600 upon which any one or more of the techniques (e.g., methodologies) discussed herein may perform in accordance with some embodiments.
  • the machine 600 may operate as a standalone device or may be connected (e.g., networked) to other machines.
  • the machine 600 may operate in the capacity of a server machine, a client machine, or both in server-client network environments.
  • the machine 600 may act as a peer machine in peer-to-peer (P2P) (or other distributed) network environment.
  • the machine 600 may be a personal computer (PC), a tablet PC, a set-top box (STB), a personal digital assistant (PDA), a mobile telephone, a web appliance, a network router, switch or bridge, or any machine capable of executing instructions (sequential or otherwise) that specify actions to be taken by that machine.
  • machine shall also be taken to include any collection of machines that individually or jointly execute a set (or multiple sets) of instructions to perform any one or more of the methodologies discussed herein, such as cloud computing, software as a service (SaaS), other computer cluster configurations.
  • Examples, as described herein, may include, or may operate on, logic or a number of components, modules, or mechanisms.
  • Modules are tangible entities (e.g., hardware) capable of performing specified operations when operating.
  • a module includes hardware.
  • the hardware may be specifically configured to carry out a specific operation (e.g., hardwired).
  • the hardware may include configurable execution units (e.g., transistors, circuits, etc.) and a computer readable medium containing instructions, where the instructions configure the execution units to carry out a specific operation when in operation. The configuring may occur under the direction of the execution units or a loading mechanism. Accordingly, the execution units are communicatively coupled to the computer readable medium when the device is operating.
  • the execution units may be a member of more than one module.
  • the execution units may be configured by a first set of instructions to implement a first module at one point in time and reconfigured by a second set of instructions to implement a second module.
  • Machine 600 may include a hardware processor 602 (e.g., a central processing unit (CPU), a graphics processing unit (GPU), a hardware processor core, or any combination thereof), a main memory 604 and a static memory 606 , some or all of which may communicate with each other via an interlink (e.g., bus) 608 .
  • the machine 600 may further include a display unit 610 , an alphanumeric input device 612 (e.g., a keyboard), and a user interface (UI) navigation device 614 (e.g., a mouse).
  • the display unit 610 , alphanumeric input device 612 and UI navigation device 614 may be a touch screen display.
  • the machine 600 may additionally include a storage device (e.g., drive unit) 616 , a signal generation device 618 (e.g., a speaker), a network interface device 620 , and one or more sensors 621 , such as a global positioning system (GPS) sensor, compass, accelerometer, or other sensor.
  • the machine 600 may include an output controller 628 , such as a serial (e.g., universal serial bus (USB), parallel, or other wired or wireless (e.g., infrared (IR), near field communication (NFC), etc.) connection to communicate or control one or more peripheral devices (e.g., a printer, card reader, etc.).
  • the storage device 616 may include a non-transitory machine readable medium 622 on which is stored one or more sets of data structures or instructions 624 (e.g., software) embodying or utilized by any one or more of the techniques or functions described herein.
  • the instructions 624 may also reside, completely or at least partially, within the main memory 604 , within static memory 606 , or within the hardware processor 602 during execution thereof by the machine 600 .
  • one or any combination of the hardware processor 602 , the main memory 604 , the static memory 606 , or the storage device 616 may constitute machine readable media.
  • While the machine readable medium 622 is illustrated as a single medium, the term “machine readable medium” may include a single medium or multiple media (e.g., a centralized or distributed database, or associated caches and servers) configured to store the one or more instructions 624 .
  • machine readable medium may include any medium that is capable of storing, encoding, or carrying instructions for execution by the machine 600 and that cause the machine 600 to perform any one or more of the techniques of the present disclosure, or that is capable of storing, encoding or carrying data structures used by or associated with such instructions.
  • Non-limiting machine readable medium examples may include solid-state memories, and optical and magnetic media.
  • machine readable media may include: non-volatile memory, such as semiconductor memory devices (e.g., Electrically Programmable Read-Only Memory (EPROM), Electrically Erasable Programmable Read-Only Memory (EEPROM)) and flash memory devices; magnetic disks, such as internal hard disks and removable disks; magneto-optical disks; and CD-ROM and DVD-ROM disks.
  • the instructions 624 may further be transmitted or received over a communications network 626 using a transmission medium via the network interface device 620 utilizing any one of a number of transfer protocols (e.g., frame relay, Internet protocol (IP), transmission control protocol (TCP), user datagram protocol (UDP), hypertext transfer protocol (HTTP), etc.).
  • Example communication networks may include a local area network (LAN), a wide area network (WAN), a packet data network (e.g., the Internet), mobile telephone networks (e.g., cellular networks), Plain Old Telephone (POTS) networks, and wireless data networks (e.g., Institute of Electrical and Electronics Engineers (IEEE) 802.11 family of standards known as Wi-Fi®, IEEE 802.16 family of standards known as WiMax®), IEEE 802.15.4 family of standards, peer-to-peer (P2P) networks, among others.
  • the network interface device 620 may include one or more physical jacks (e.g., Ethernet, coaxial, or phone jacks) or one or more antennas to connect to the communications network 626 .
  • the network interface device 620 may include a plurality of antennas to wirelessly communicate using at least one of single-input multiple-output (SIMO), multiple-input multiple-output (MIMO), or multiple-input single-output (MISO) techniques.
  • transmission medium shall be taken to include any intangible medium that is capable of storing, encoding or carrying instructions for execution by the machine 600 , and includes digital or analog communications signals or other intangible medium to facilitate communication of such software.
  • Example 1 is a server in communication with a pair of drum sticks, the server comprising: a processor to: receive sensor data from a sensor of at least one drum stick of the pair of drum sticks, the sensor data based on movement of the at least one drum stick; recognize a gesture from the sensor data; determine, from the gesture, a visualization effect corresponding to the gesture and an audio effect including a drum sound corresponding to the gesture; and cause the visualization effect and the audio effect to be output in response to the determination.
  • Example 2 the subject matter of Example 1 optionally includes wherein to determine the visualization effect, the processor is to use a visualization engine.
  • Example 3 the subject matter of any one or more of Examples 1-2 optionally include wherein the sensor data includes data from sensors of both of the pair of drum sticks and wherein the gesture includes movement of the pair of drum sticks in coordination with each other.
  • Example 4 the subject matter of any one or more of Examples 1-3 optionally include wherein to cause the visualization effect to be output, the processor is to send the visualization effect to a virtual reality headset of a user controlling the pair of drum sticks to be displayed on the virtual reality headset.
  • Example 5 the subject matter of any one or more of Examples 1-4 optionally include wherein to cause the audio effect to be output, the processor is to send the audio effect to a speaker to play the audio effect.
  • Example 6 the subject matter of Example 5 optionally includes wherein to cause the visualization effect to be output, the processor is to send the visualization effect to a display to be displayed in coordination with the audio effect played by the speaker.
  • Example 7 the subject matter of any one or more of Examples 1-6 optionally include wherein the processor is further to receive data from the sensor indicating an initial position of the drum stick, and wherein to recognize the gesture, the processor is to determine a final position of the drum stick.
  • Example 8 the subject matter of Example 7 optionally includes wherein the drum sound is determined based on the initial position and the final position.
  • Example 9 the subject matter of any one or more of Examples 1-8 optionally include wherein the visualization effect corresponding to the gesture and the audio effect including the drum sound corresponding to the gesture are determined based on an orientation of the drum stick identified in the sensor data.
  • Example 10 the subject matter of any one or more of Examples 1-9 optionally include wherein to determine the visualization effect, the processor is to determine the visualization effect based on a series of previously recognized gestures.
  • Example 11 the subject matter of any one or more of Examples 1-10 optionally include wherein the processor is further to: receive additional sensor data from a second sensor attached to an ankle or a foot of a user controlling the drum stick; recognize a second gesture from the additional sensor data; determine, from the second gesture, a second audio effect including a second drum sound corresponding to the second gesture; and cause the second audio effect to be output with the visualization effect and the audio effect.
  • Example 12 the subject matter of any one or more of Examples 1-11 optionally include wherein the processor is further to: receive wearable sensor data from a plurality of wearable devices within a predetermined proximity of the drum stick; and modify the visualization effect based on the wearable sensor data.
  • Example 13 the subject matter of any one or more of Examples 1-12 optionally include wherein the visualization effect includes a lighting effect, and wherein to cause the visualization effect to be output, the processor is to send the lighting effect to a plurality of wearable devices within a predetermined proximity of the drum stick to be displayed at the plurality of wearable devices.
  • Example 14 the subject matter of any one or more of Examples 1-13 optionally include wherein the gesture includes at least one of a linear movement, a tapping movement, a sweeping movement, a minimum acceleration, or a minimum deceleration.
  • Example 15 the subject matter of any one or more of Examples 1-14 optionally include wherein the audio effect includes Multidimensional Polyphonic Expression instructions for a musical instrument digital interface (MIDI) player.
  • Example 16 is a method for providing effects corresponding to movement of drum sticks, the method comprising: receiving sensor data from a sensor of at least one drum stick of a pair of drum sticks, the sensor data based on movement of the at least one drum stick; recognizing a gesture from the sensor data; determining, from the gesture, a visualization effect corresponding to the gesture and an audio effect including a drum sound corresponding to the gesture; and causing the visualization effect and the audio effect to be output in response to the determination.
  • Example 17 the subject matter of Example 16 optionally includes wherein determining the visualization effect includes using a visualization engine.
  • Example 18 the subject matter of any one or more of Examples 16-17 optionally include wherein the sensor data includes data from sensors of both of the pair of drum sticks and wherein the gesture includes movement of the pair of drum sticks in coordination with each other.
  • Example 19 the subject matter of any one or more of Examples 16-18 optionally include wherein causing the visualization effect to be output includes sending the visualization effect to a virtual reality headset of a user controlling the pair of drum sticks to be displayed on the virtual reality headset.
  • Example 20 the subject matter of any one or more of Examples 16-19 optionally include wherein causing the audio effect to be output includes sending the audio effect to a speaker to play the audio effect.
  • Example 21 the subject matter of Example 20 optionally includes wherein causing the visualization effect to be output includes sending the visualization effect to a display to be displayed in coordination with the audio effect played by the speaker.
  • Example 22 the subject matter of any one or more of Examples 16-21 optionally include receiving data from the sensor indicating an initial position of the drum stick, and wherein recognizing the gesture includes determining a final position of the drum stick.
  • Example 23 the subject matter of Example 22 optionally includes wherein the drum sound is determined based on the initial position and the final position.
  • Example 24 the subject matter of any one or more of Examples 16-23 optionally include wherein the visualization effect corresponding to the gesture and the audio effect including the drum sound corresponding to the gesture are determined based on an orientation of the drum stick identified in the sensor data.
  • Example 25 the subject matter of any one or more of Examples 16-24 optionally include wherein determining the visualization effect includes determining the visualization effect based on a series of previously recognized gestures.
  • Example 26 the subject matter of any one or more of Examples 16-25 optionally include receiving additional sensor data from a second sensor attached to an ankle or a foot of a user controlling the drum stick; recognizing a second gesture from the additional sensor data; determining, from the second gesture, a second audio effect including a second drum sound corresponding to the second gesture; and causing the second audio effect to be output with the visualization effect and the audio effect.
  • Example 27 the subject matter of any one or more of Examples 16-26 optionally include receiving wearable sensor data from a plurality of wearable devices within a predetermined proximity of the drum stick; and modifying the visualization effect based on the wearable sensor data.
  • Example 28 the subject matter of any one or more of Examples 16-27 optionally include wherein the visualization effect includes a lighting effect, and wherein causing the visualization effect to be output includes sending the lighting effect to a plurality of wearable devices within a predetermined proximity of the drum stick to be displayed at the plurality of wearable devices.
  • Example 29 the subject matter of any one or more of Examples 16-28 optionally include wherein the gesture includes at least one of a linear movement, a tapping movement, a sweeping movement, a minimum acceleration, or a minimum deceleration.
  • Example 30 the subject matter of any one or more of Examples 16-29 optionally include wherein the audio effect includes Multidimensional Polyphonic Expression instructions for a musical instrument digital interface (MIDI) player.
  • Example 31 is at least one machine-readable medium including instructions for operation of a computing system, which when executed by a machine, cause the machine to perform operations of any of the methods of Examples 16-30.
  • Example 32 is an apparatus comprising means for performing any of the methods of Examples 16-30.
  • Example 33 is at least one machine-readable medium including instructions for providing effects corresponding to movement of drum sticks, which when executed by a machine, cause the machine to: receive sensor data from a sensor of at least one drum stick of a pair of drum sticks, the sensor data based on movement of the at least one drum stick; recognize a gesture from the sensor data; determine, from the gesture, a visualization effect corresponding to the gesture and an audio effect including a drum sound corresponding to the gesture; and cause the visualization effect and the audio effect to be output in response to the determination.
  • Example 34 the subject matter of Example 33 optionally includes wherein the instructions to determine the visualization effect include instructions to use a visualization engine.
  • Example 35 the subject matter of any one or more of Examples 33-34 optionally include wherein the sensor data includes data from sensors of both of the pair of drum sticks and wherein the gesture includes movement of the pair of drum sticks in coordination with each other.
  • Example 36 the subject matter of any one or more of Examples 33-35 optionally include wherein the instructions to cause the visualization effect to be output include instructions to send the visualization effect to a virtual reality headset of a user controlling the pair of drum sticks to be displayed on the virtual reality headset.
  • Example 37 the subject matter of any one or more of Examples 33-36 optionally include wherein the instructions to cause the audio effect to be output include instructions to send the audio effect to a speaker to play the audio effect.
  • Example 38 the subject matter of Example 37 optionally includes wherein the instructions to cause the visualization effect to be output include instructions to send the visualization effect to a display to be displayed in coordination with the audio effect played by the speaker.
  • Example 39 the subject matter of any one or more of Examples 33-38 optionally include instructions to receive data from the sensor indicating an initial position of the drum stick, and wherein the instructions to recognize the gesture include instructions to determine a final position of the drum stick.
  • Example 40 the subject matter of Example 39 optionally includes wherein the drum sound is determined based on the initial position and the final position.
  • Example 41 the subject matter of any one or more of Examples 33-40 optionally include wherein the visualization effect corresponding to the gesture and the audio effect including the drum sound corresponding to the gesture are determined based on an orientation of the drum stick identified in the sensor data.
  • Example 42 the subject matter of any one or more of Examples 33-41 optionally include wherein the instructions to determine the visualization effect include instructions to determine the visualization effect based on a series of previously recognized gestures.
  • Example 43 the subject matter of any one or more of Examples 33-42 optionally include instructions to: receive additional sensor data from a second sensor attached to an ankle or a foot of a user controlling the drum stick; recognize a second gesture from the additional sensor data; determine, from the second gesture, a second audio effect including a second drum sound corresponding to the second gesture; and cause the second audio effect to be output with the visualization effect and the audio effect.
  • Example 44 the subject matter of any one or more of Examples 33-43 optionally include instructions to: receive wearable sensor data from a plurality of wearable devices within a predetermined proximity of the drum stick; and modify the visualization effect based on the wearable sensor data.
  • Example 45 the subject matter of any one or more of Examples 33-44 optionally include wherein the visualization effect includes a lighting effect, and wherein the instructions to cause the visualization effect to be output include instructions to send the lighting effect to a plurality of wearable devices within a predetermined proximity of the drum stick to be displayed at the plurality of wearable devices.
  • Example 46 the subject matter of any one or more of Examples 33-45 optionally include wherein the gesture includes at least one of a linear movement, a tapping movement, a sweeping movement, a minimum acceleration, or a minimum deceleration.
  • Example 47 the subject matter of any one or more of Examples 33-46 optionally include wherein the audio effect includes Multidimensional Polyphonic Expression instructions for a musical instrument digital interface (MIDI) player.
  • Example 48 is an apparatus for providing effects corresponding to movement of drum sticks, the apparatus comprising: means for receiving sensor data from a sensor of at least one drum stick of a pair of drum sticks, the sensor data based on movement of the at least one drum stick; means for recognizing a gesture from the sensor data; means for determining, from the gesture, a visualization effect corresponding to the gesture and an audio effect including a drum sound corresponding to the gesture; and means for causing the visualization effect and the audio effect to be output in response to the determination.
  • Example 49 the subject matter of Example 48 optionally includes wherein the means for determining the visualization effect include means for using a visualization engine.
  • Example 50 the subject matter of any one or more of Examples 48-49 optionally include wherein the sensor data includes data from sensors of both of the pair of drum sticks and wherein the gesture includes movement of the pair of drum sticks in coordination with each other.
  • Example 51 the subject matter of any one or more of Examples 48-50 optionally include wherein the means for causing the visualization effect to be output include means for sending the visualization effect to a virtual reality headset of a user controlling the pair of drum sticks to be displayed on the virtual reality headset.
  • Example 52 the subject matter of any one or more of Examples 48-51 optionally include wherein the means for causing the audio effect to be output include means for sending the audio effect to a speaker to play the audio effect.
  • Example 53 the subject matter of Example 52 optionally includes wherein the means for causing the visualization effect to be output include means for sending the visualization effect to a display to be displayed in coordination with the audio effect played by the speaker.
  • Example 54 the subject matter of any one or more of Examples 48-53 optionally include means for receiving data from the sensor indicating an initial position of the drum stick, and wherein the means for recognizing the gesture include means for determining a final position of the drum stick.
  • Example 55 the subject matter of Example 54 optionally includes wherein the drum sound is determined based on the initial position and the final position.
  • Example 56 the subject matter of any one or more of Examples 48-55 optionally include wherein the visualization effect corresponding to the gesture and the audio effect including the drum sound corresponding to the gesture are determined based on an orientation of the drum stick identified in the sensor data.
  • Example 57 the subject matter of any one or more of Examples 48-56 optionally include wherein the means for determining the visualization effect include means for determining the visualization effect based on a series of previously recognized gestures.
  • Example 58 the subject matter of any one or more of Examples 48-57 optionally include means for receiving additional sensor data from a second sensor attached to an ankle or a foot of a user controlling the drum stick; means for recognizing a second gesture from the additional sensor data; means for determining, from the second gesture, a second audio effect including a second drum sound corresponding to the second gesture; and means for causing the second audio effect to be output with the visualization effect and the audio effect.
  • Example 59 the subject matter of any one or more of Examples 48-58 optionally include means for receiving wearable sensor data from a plurality of wearable devices within a predetermined proximity of the drum stick; and means for modifying the visualization effect based on the wearable sensor data.
  • Example 60 the subject matter of any one or more of Examples 48-59 optionally include wherein the visualization effect includes a lighting effect, and wherein the means for causing the visualization effect to be output include means for sending the lighting effect to a plurality of wearable devices within a predetermined proximity of the drum stick to be displayed at the plurality of wearable devices.
  • Example 61 the subject matter of any one or more of Examples 48-60 optionally include wherein the gesture includes at least one of a linear movement, a tapping movement, a sweeping movement, a minimum acceleration, or a minimum deceleration.
  • Example 62 the subject matter of any one or more of Examples 48-61 optionally include wherein the audio effect includes Multidimensional Polyphonic Expression instructions for a musical instrument digital interface (MIDI) player.
  • Example 63 is a virtual drum set system comprising: a pair of drum sticks each including: a sensor to provide data based on movement of the drum stick; and a transceiver to transmit the sensor data; a device including a processor to: recognize a gesture from the sensor data; and determine, from the gesture, a visualization effect corresponding to the gesture and an audio effect including a drum sound corresponding to the gesture; and a display device to display the visualization effect; and a speaker to play the audio effect.
  • Example 64 the subject matter of Example 63 optionally includes wherein the device is a mobile device.
  • Example 65 the subject matter of any one or more of Examples 63-64 optionally include wherein the device further includes a device transceiver to receive the sensor data.
  • Example 66 the subject matter of any one or more of Examples 63-65 optionally include wherein the display device is a virtual reality headset and the visualization effect includes a virtual drum set.
  • Example 67 the subject matter of any one or more of Examples 63-66 optionally include wherein the sensor includes a nine-axis sensor including a magnetometer, an accelerometer, and a gyroscope.
  • Example 68 the subject matter of any one or more of Examples 63-67 optionally include wherein the speaker includes headphones.
  • Example 69 the subject matter of any one or more of Examples 63-68 optionally include wherein one of the pair of drum sticks is a parent drum stick and the transceiver of the parent drum stick is configured to receive child sensor data from the other of the pair of drum sticks and wherein the transceiver of the parent drum stick is to send combined sensor data to the device.
  • Example 70 the subject matter of any one or more of Examples 63-69 optionally include wherein the sensor data includes data from sensors of both of the pair of drum sticks and wherein the gesture includes movement of the pair of drum sticks in coordination with each other.
  • Example 71 the subject matter of any one or more of Examples 63-70 optionally include wherein the processor is to send timing information to the display device and the speaker to coordinate displaying the visualization effect and playing the audio effect.
  • Example 72 the subject matter of any one or more of Examples 63-71 optionally include wherein the visualization effect corresponding to the gesture and the audio effect including the drum sound corresponding to the gesture are determined based on an orientation of the drum stick identified in the sensor data.
  • Example 73 the subject matter of any one or more of Examples 63-72 optionally include wherein to determine the visualization effect, the processor is to determine the visualization effect based on a series of previously recognized gestures.
  • Example 74 the subject matter of any one or more of Examples 63-73 optionally include wherein the system further comprises an additional sensor attached to an ankle or a foot of a user controlling the drum stick; and wherein the processor is further to determine, from the additional sensor data of the additional sensor, a second audio effect including a second drum sound corresponding to the second gesture.
  • Example 75 the subject matter of Example 74 optionally includes wherein the speaker is to play the second audio effect.
  • Example 76 the subject matter of any one or more of Examples 63-75 optionally include wherein the display device includes a plurality of wearable devices within a predetermined proximity of the drum stick, the visualization effect to be displayed at the plurality of wearable devices.
  • Example 77 the subject matter of any one or more of Examples 63-76 optionally include wherein the gesture includes at least one of a linear movement, a tapping movement, a sweeping movement, a minimum acceleration, or a minimum deceleration.
  • Example 78 the subject matter of any one or more of Examples 63-77 optionally include wherein the speaker is controlled by a musical instrument digital interface (MIDI) player and wherein the audio effect includes Multidimensional Polyphonic Expression instructions for use by the MIDI player.
  • Example 79 is a server in communication with a violin bow, the server comprising: a processor to: receive sensor data from a sensor of the violin bow, the sensor data based on movement of the violin bow; recognize a gesture from the sensor data; determine, from the gesture, a visualization effect corresponding to the gesture and an audio effect corresponding to the gesture; and cause the visualization effect and the audio effect to be output in response to the determination, the audio effect including a natural sound caused by the movement of the violin bow.
  • Example 80 the subject matter of Example 79 optionally includes wherein to determine the visualization effect, the processor is to use a visualization engine.
  • Example 81 the subject matter of any one or more of Examples 79-80 optionally include wherein to cause the visualization effect to be output, the processor is to send the visualization effect to a virtual reality headset of a user controlling the violin bow to be displayed on the virtual reality headset.
  • Example 82 the subject matter of any one or more of Examples 79-81 optionally include wherein to cause the audio effect to be output, the processor is to send the audio effect to a speaker to play the audio effect.
  • Example 83 the subject matter of Example 82 optionally includes wherein to cause the visualization effect to be output, the processor is to send the visualization effect to a display to be displayed in coordination with the audio effect played by the speaker.
  • Example 84 the subject matter of any one or more of Examples 79-83 optionally include wherein the processor is further to receive data from the sensor indicating an initial position of the violin bow, and wherein to recognize the gesture, the processor is to determine a final position of the violin bow.
  • Example 85 the subject matter of any one or more of Examples 79-84 optionally include wherein the visualization effect corresponding to the gesture and the audio effect corresponding to the gesture are determined based on an orientation of the violin bow identified in the sensor data.
  • Example 86 the subject matter of any one or more of Examples 79-85 optionally include wherein to determine the visualization effect, the processor is to determine the visualization effect based on a series of previously recognized gestures.
  • Example 87 the subject matter of any one or more of Examples 79-86 optionally include wherein the visualization effect includes a lighting effect, and wherein to cause the visualization effect to be output, the processor is to send the lighting effect to a plurality of wearable devices within a predetermined proximity of the violin bow to be displayed at the plurality of wearable devices.
  • Example 88 the subject matter of any one or more of Examples 79-87 optionally include wherein the gesture includes at least one of a linear movement, a tapping movement, a sweeping movement, a minimum acceleration, or a minimum deceleration.
  • Example 89 the subject matter of any one or more of Examples 79-88 optionally include wherein the audio effect includes Multidimensional Polyphonic Expression instructions for a musical instrument digital interface (MIDI) player.
  • Example 90 is a server in communication with a guitar pick, the server comprising: a processor to: receive sensor data from a sensor of the guitar pick, the sensor data based on movement of the guitar pick; recognize a gesture from the sensor data; determine, from the gesture, a visualization effect corresponding to the gesture and an audio effect corresponding to the gesture; and cause the visualization effect and the audio effect to be output in response to the determination, the audio effect including a natural sound caused by the movement of the guitar pick.
  • Example 91 the subject matter of Example 90 optionally includes wherein to determine the visualization effect, the processor is to use a visualization engine.
  • Example 92 the subject matter of any one or more of Examples 90-91 optionally include wherein to cause the visualization effect to be output, the processor is to send the visualization effect to a virtual reality headset of a user controlling the guitar pick to be displayed on the virtual reality headset.
  • Example 93 the subject matter of any one or more of Examples 90-92 optionally include wherein to cause the audio effect to be output, the processor is to send the audio effect to a speaker to play the audio effect.
  • Example 94 the subject matter of Example 93 optionally includes wherein to cause the visualization effect to be output, the processor is to send the visualization effect to a display to be displayed in coordination with the audio effect played by the speaker.
  • Example 95 the subject matter of any one or more of Examples 90-94 optionally include wherein the processor is further to receive data from the sensor indicating an initial position of the guitar pick, and wherein to recognize the gesture, the processor is to determine a final position of the guitar pick.
  • Example 96 the subject matter of any one or more of Examples 90-95 optionally include wherein the visualization effect corresponding to the gesture and the audio effect corresponding to the gesture are determined based on an orientation of the guitar pick identified in the sensor data.
  • Example 97 the subject matter of any one or more of Examples 90-96 optionally include wherein to determine the visualization effect, the processor is to determine the visualization effect based on a series of previously recognized gestures.
  • Example 98 the subject matter of any one or more of Examples 90-97 optionally include wherein the visualization effect includes a lighting effect, and wherein to cause the visualization effect to be output, the processor is to send the lighting effect to a plurality of wearable devices within a predetermined proximity of the guitar pick to be displayed at the plurality of wearable devices.
  • Example 99 the subject matter of any one or more of Examples 90-98 optionally include wherein the gesture includes at least one of a linear movement, a tapping movement, a sweeping movement, a minimum acceleration, or a minimum deceleration.
  • Example 100 the subject matter of any one or more of Examples 90-99 optionally include wherein the audio effect includes Multidimensional Polyphonic Expression instructions for a musical instrument digital interface (MIDI) player.
  • Example 101 is a server in communication with a conductor baton, the server comprising: a processor to: receive sensor data from a sensor of the conductor baton, the sensor data based on movement of the conductor baton; recognize a gesture from the sensor data; determine, from the gesture, a visualization effect corresponding to the gesture and an audio effect corresponding to the gesture; and cause the visualization effect and the audio effect to be output in response to the determination, the audio effect to be played with corresponding natural sounds directed by the movement of the conductor baton.
  • Example 102 the subject matter of Example 101 optionally includes wherein to determine the visualization effect, the processor is to use a visualization engine.
  • Example 103 the subject matter of any one or more of Examples 101-102 optionally include wherein to cause the visualization effect to be output, the processor is to send the visualization effect to a virtual reality headset of a user controlling the conductor baton to be displayed on the virtual reality headset.
  • Example 104 the subject matter of any one or more of Examples 101-103 optionally include wherein to cause the audio effect to be output, the processor is to send the audio effect to a speaker to play the audio effect.
  • Example 105 the subject matter of Example 104 optionally includes wherein to cause the visualization effect to be output, the processor is to send the visualization effect to a display to be displayed in coordination with the audio effect played by the speaker.
  • Example 106 the subject matter of any one or more of Examples 101-105 optionally include wherein the processor is further to receive data from the sensor indicating an initial position of the conductor baton, and wherein to recognize the gesture, the processor is to determine a final position of the conductor baton.
  • Example 107 the subject matter of any one or more of Examples 101-106 optionally include wherein the visualization effect corresponding to the gesture and the audio effect corresponding to the gesture are determined based on an orientation of the conductor baton identified in the sensor data.
  • Example 108 the subject matter of any one or more of Examples 101-107 optionally include wherein to determine the visualization effect, the processor is to determine the visualization effect based on a series of previously recognized gestures.
  • Example 109 the subject matter of any one or more of Examples 101-108 optionally include wherein the visualization effect includes a lighting effect, and wherein to cause the visualization effect to be output, the processor is to send the lighting effect to a plurality of wearable devices within a predetermined proximity of the conductor baton to be displayed at the plurality of wearable devices.
  • Example 110 the subject matter of any one or more of Examples 101-109 optionally include wherein the gesture includes at least one of a linear movement, a tapping movement, a sweeping movement, a minimum acceleration, or a minimum deceleration.
  • Example 111 the subject matter of any one or more of Examples 101-110 optionally include wherein the audio effect includes Multidimensional Polyphonic Expression instructions for a musical instrument digital interface (MIDI) player.
  • Example 112 is at least one machine-readable medium including instructions, which when executed by a machine, cause the machine to perform operations of any of the operations of Examples 1-111.
  • Example 113 is an apparatus comprising means for performing any of the operations of Examples 1-111.
  • Example 114 is a system to perform the operations of any of the Examples 1-111.
  • Example 115 is a method to perform the operations of any of the Examples 1-111.
  • Method examples described herein may be machine or computer-implemented at least in part. Some examples may include a computer-readable medium or machine-readable medium encoded with instructions operable to configure an electronic device to perform methods as described in the above examples.
  • An implementation of such methods may include code, such as microcode, assembly language code, a higher-level language code, or the like. Such code may include computer readable instructions for performing various methods. The code may form portions of computer program products. Further, in an example, the code may be tangibly stored on one or more volatile, non-transitory, or non-volatile tangible computer-readable media, such as during execution or at other times.
  • Examples of these tangible computer-readable media may include, but are not limited to, hard disks, removable magnetic disks, removable optical disks (e.g., compact disks and digital video disks), magnetic cassettes, memory cards or sticks, random access memories (RAMs), read only memories (ROMs), and the like.

Abstract

Systems and methods may be used to provide effects corresponding to movement of instrument objects or other objects. A method may include receiving sensor data from an object based on movement of the object, recognizing a gesture from the sensor data, and determining an effect, such as a visualization or audio effect corresponding to the gesture. The method may include causing the effect to be output in response to the determination.

Description

    BACKGROUND
  • Playing and listening to live music has been captivating humans for millennia. Traditionally, live performances featured little in the way of visual effects. More recently, live performances have been augmented by video, lighting effects, pyrotechnics, and props. While these effects have been entertaining, they do not let the audience experience the musician's point of view. These effects are further limited in that they are not controllable by the musician during the performance.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • In the drawings, which are not necessarily drawn to scale, like numerals may describe similar components in different views. Like numerals having different letter suffixes may represent different instances of similar components. The drawings illustrate generally, by way of example, but not by way of limitation, various embodiments discussed in the present document.
  • FIG. 1 illustrates a pair of drum sticks for creating visualization or audio effects in accordance with some embodiments.
  • FIGS. 2A-2C illustrate example visualization effects in accordance with some embodiments.
  • FIGS. 3A-3C illustrate instrumentation objects for creating visualization or audio effects in accordance with some embodiments.
  • FIG. 4 illustrates a system for providing effects corresponding to movement of an instrumentation object in accordance with some embodiments.
  • FIG. 5 illustrates a flowchart showing a technique for providing effects corresponding to movement of an instrumentation object in accordance with some embodiments.
  • FIG. 6 illustrates generally an example of a block diagram of a machine upon which any one or more of the techniques (e.g., methodologies) discussed herein may perform in accordance with some embodiments.
  • DETAILED DESCRIPTION
  • Systems and methods for providing virtual instrument visualization effects corresponding to movement of physical objects, such as drum sticks, a violin bow, a guitar pick, a conductor baton, or the like are described herein. The systems and methods described herein are used to augment and enrich experiences from traditional musical instruments by communicating with a device to perform motion sensing, gesture detection, and wireless communication.
  • The systems and methods described herein are used for music performance with enhanced experiences. Motion sensing is added to one or more musical instruments, such as a drum stick, a violin/viola/cello bow, a guitar pick, an end of a guitar (e.g., headstock), a conductor's baton, or the like. The musical experience of an artist or audience may be enhanced, and may include real instruments or virtual instruments, such as those used in virtual reality (VR) or augmented reality (AR) systems.
  • Some traditional systems may use one or more cameras to track musical instruments. An issue with these traditional systems is that the tracked instruments may be occluded by a player's own hand. The presently described systems and methods use a sensor, such as a nine-axis sensor with a gyroscope and accelerometer, which may measure rotation, location, or orientation. The sensor may be used without a camera, which allows uninterrupted rotation, location, or orientation information to be available while a user plays, without concern for a camera line of sight.
  • In an example, the systems and methods described herein provide audible and visual feedback to be played and displayed, respectively, when an action, motion, or gesture occurs at a device (e.g., a drum stick, a violin/viola/cello bow, a guitar pick, an end of a guitar, a conductor's baton, or the like). In an example, the audible feedback may include sound created at a musical instrument by the action, motion, or gesture (e.g., when a violin bow traverses a violin string, the string vibrates, causing sound to be produced), which may be augmented (e.g., amplified and played by a speaker, changed, or distorted) or added to by the audible feedback (e.g., additional sound not caused by the musical instrument may be played). In another example, the audible feedback may include sound created by a computer system, such as a Musical Instrument Digital Interface (MIDI) system, a drum kit, etc. This type of audible feedback may be created by, for example, tracking a gesture of one or more drum sticks (e.g., without hitting an actual drum), tracking movement of a conductor's baton, etc.
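  • By way of a non-limiting illustration, the following Python sketch shows one way a recognized drum-stick gesture could be turned into a MIDI-style percussion message for such a computer system; the gesture names, the velocity scaling, and the particular General MIDI percussion note numbers are assumptions made only for this example.

      # Illustrative sketch only: map a recognized gesture to a MIDI-style
      # note-on message. Note numbers follow the General MIDI percussion map
      # (36 = bass drum, 38 = snare, 42 = closed hi-hat, 49 = crash cymbal).
      GESTURE_TO_NOTE = {
          "snare_tap": 38,
          "bass_kick": 36,
          "hi_hat_tap": 42,
          "cymbal_sweep": 49,
      }

      def gesture_to_midi(gesture, peak_acceleration):
          """Build a note-on message; velocity scales with stroke intensity."""
          velocity = max(1, min(127, int(peak_acceleration * 4)))
          return {"type": "note_on", "channel": 9,  # GM percussion channel
                  "note": GESTURE_TO_NOTE[gesture], "velocity": velocity}

      print(gesture_to_midi("snare_tap", 25.0))  # velocity 100 for a firm stroke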
  • The visual feedback provided in the system may be displayed for an audience, a performer, a remote viewer, or a combination thereof. The visual feedback may include using a display screen, a VR or AR display, etc. Other visual feedback may include fireworks, animated or video sequences, lighting effects on a wearable device, or the like. The visual or audible feedback described herein may be based on additional activity of a user, such as dancing, gestures, predetermined effects, or the like.
  • FIG. 1 illustrates a pair of drum sticks 100 for creating visualization or audio effects in accordance with some embodiments. The pair of drum sticks 100 includes a first drum stick 102A and a second drum stick 102B. Each of the pair of drum sticks 100 includes a respective motion sensor 104A and 104B and respective circuitry 106A and 106B. Although a single sensor 104A, 104B is illustrated on each drum stick in FIG. 1, it is understood that multiple sensors may be used on each drum stick 102A and 102B. The circuitry 106A and 106B may include a transceiver, a processor, memory, or a system on a chip. In an example, the first drum stick 102A may be a parent and the second drum stick 102B may be a child. The parent may receive information from the sensor 104B of the child, and the parent may forward that sensor information along with information from the sensor 104A of the parent to a device, such as a mobile device, a computer, a server, etc., for further processing. In another example, the pair of drum sticks 100 may independently send information to a device.
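  • The parent/child forwarding described above could be prototyped as in the following non-limiting Python sketch; the packet fields and the JSON-over-the-air format are assumptions rather than a defined protocol.

      import json
      import time

      def combine_samples(parent_sample, child_sample):
          """Parent stick tags both sensor samples and forwards one combined packet."""
          packet = {
              "timestamp": time.time(),
              "parent": parent_sample,  # e.g. {"accel": [...], "gyro": [...]}
              "child": child_sample,
          }
          return json.dumps(packet).encode("utf-8")

      # The receiving device decodes both streams for two-stick gesture detection.
      combined = combine_samples({"accel": [0.1, 9.8, 0.0]}, {"accel": [5.2, 1.1, 0.3]})
      print(json.loads(combined))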
  • The pair of drum sticks 100 may be paired by assigning one to a left hand of a user and one to a right hand of a user (or simulating such). The left hand assigned drum stick may be assigned to output a first set of audible or visual feedback and the right hand assigned drum stick may be assigned to output a second set of audible or visual feedback. For example, either drum stick may be used to cause drum sound as audible feedback based on the location and motion of the drum stick, and one of the drum sticks may be assigned to a first visual effect (e.g., a flashing light on a proximate wearable device), and the other of the drum sticks may be assigned to a second visual effect (e.g., a visual effect on a display screen).
  • The sensors 104A, 104B may include a magnetometer, accelerometer, or gyroscope. For example, the sensors 104A, 104B may include a nine-axis sensor with a magnetometer, accelerometer, and a gyroscope for detecting location, position, and movement. The sensors 104A, 104B may be initiated at a starting location, position, or orientation, such that the sensors 104A, 104B may determine relative locations, positions, movement, or orientations in response to movement of the pair of drum sticks 100.
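  • As a non-limiting sketch of how relative orientation might be derived from such a sensor after initialization, the following Python example integrates gyroscope angular rates; a practical implementation would typically fuse accelerometer and magnetometer data as well, and the sample values are assumptions.

      def integrate_gyro(samples, dt):
          """samples: (wx, wy, wz) angular rates in rad/s; dt: sample period in s.
          Returns the accumulated rotation about each axis relative to the
          initialization pose (a coarse, drift-prone estimate)."""
          roll = pitch = yaw = 0.0
          for wx, wy, wz in samples:
              roll += wx * dt
              pitch += wy * dt
              yaw += wz * dt
          return roll, pitch, yaw

      # One second of a steady 1 rad/s pitch rotation sampled at 100 Hz:
      print(integrate_gyro([(0.0, 1.0, 0.0)] * 100, dt=0.01))  # ~(0.0, 1.0, 0.0)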
  • The sensor 104A may provide data based on movement of the drum stick 102A. The transceiver of circuitry 106A may transmit the sensor data to a device, such as a wearable device (e.g., a smart watch), a mobile device, a computer, a remote server, or the like. The device receiving the sensor data may include a processor to recognize a gesture from the sensor data. The gesture may be used to determine a visualization effect corresponding to the gesture or an audio effect including a drum sound corresponding to the gesture. The gesture may include movement from the pair of drum sticks 100 in coordination with each other. In an example, timing information may be sent to coordinate displaying the visualization effect and playing the audio effect. The gesture may be based on an orientation, location, movement, or force of one or both of the pair of drum sticks 100, for example as determined by one or more of the first sensor 104A or the second sensor 104B.
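  • One non-limiting way the receiving device could pair a recognized gesture with both a visualization effect and a drum sound, and stamp both with shared timing information for coordinated output, is sketched below in Python; the effect names are illustrative assumptions.

      import time

      EFFECTS = {
          "downward_strike": {"visual": "flash_red", "drum": "snare"},
          "sweep_left": {"visual": "wave_blue", "drum": "ride_cymbal"},
      }

      def handle_gesture(gesture):
          """Return (visualization effect, audio effect), both sharing a timestamp."""
          effect = EFFECTS.get(gesture)
          if effect is None:
              return None
          stamp = time.time()  # shared timing information for coordinated output
          return ({"effect": effect["visual"], "at": stamp},
                  {"sound": effect["drum"], "at": stamp})

      print(handle_gesture("downward_strike"))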
  • The gesture may be determined based on additional input, such as an additional sensor attached to an ankle or a foot of a user controlling the drum stick 102A. The additional sensor may output data to cause a second audio effect, such as a second drum sound (e.g., a bass drum sound) corresponding to a second gesture. The second audio effect may be played by a speaker. The gesture may include one or more of a linear movement, a tapping movement, a sweeping movement, a minimum acceleration, a minimum deceleration, or the like.
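  • A minimal, non-limiting sketch of one such gesture primitive, detecting a tap from a sharp deceleration in the accelerometer stream, is shown below; the threshold and re-arm level are assumed values.

      def detect_tap(accel_magnitudes, threshold=30.0):
          """accel_magnitudes: per-sample |a| in m/s^2. Returns sample indices of taps."""
          taps, armed = [], True
          for i, a in enumerate(accel_magnitudes):
              if armed and a >= threshold:
                  taps.append(i)
                  armed = False  # ignore ringing immediately after the impact
              elif a < threshold / 3:
                  armed = True
          return taps

      print(detect_tap([2, 5, 12, 34, 8, 3, 2, 31, 6]))  # -> [3, 7]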
  • A display device may be used to display the visualization effect or a speaker may be used to play the audio effect. The speaker may be controlled by a MIDI player. The audio effect may include Multidimensional Polyphonic Expression instructions for use by the MIDI player. In an example, the processor of circuitry 106A may be used to determine the gesture. In an example, a visualization effect may be determined based on one or more previously recognized gestures (e.g., a series). The visualization effect may include using a plurality of wearable devices within a predetermined proximity of the drum stick 102A to display the visualization effect. Devices within the predetermined proximity may include devices on a network, within range of a Bluetooth device or devices, within a specified distance, within a room, etc.
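  • For illustration, the following non-limiting Python sketch pushes a lighting effect to wearable devices over a local network; the device addresses, port, and JSON message format are assumptions standing in for whatever proximity-based discovery a deployment uses.

      import json
      import socket

      WEARABLE_ADDRESSES = [("192.168.1.50", 9000), ("192.168.1.51", 9000)]  # assumed

      def send_lighting_effect(color, duration_ms):
          """Send the same lighting command to every nearby wearable device."""
          message = json.dumps({"effect": "light", "color": color,
                                "duration_ms": duration_ms}).encode("utf-8")
          with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
              for address in WEARABLE_ADDRESSES:
                  sock.sendto(message, address)

      send_lighting_effect("purple", 250)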
  • FIGS. 2A-2C illustrate example visualization effects 200A-200C in accordance with some embodiments. The first visualization effect 200A includes sensor visualization effects 202 and orientation visualization effects 204 and 206. The second visualization effect 200B includes a front-facing view of a drum set and a drummer. The third visualization effect 200C includes a point-of-view perspective display of a drum set 208. The components of the visualization effects 200A-200C may include virtual components or augmented reality components. For example, the drum sets displayed in visualization effects 200B-200C may be virtual (e.g., displayed using a VR display).
  • Motion data or gesture primitives detected by a sensor may be used to create visualizations or visual effects to accompany a performance, such as to enhance an audience experience. For example, enhanced visual experiences may be shown, such as capturing activity level (e.g., sensor visualization effects 202 related to an accelerometer or gyroscope of a drum stick or other musical instrument), orientation of the musical instrument device (e.g., drum sticks, as shown in the orientation visualization effects 204-206), or other aspects of a performance. A system to predetermine the visualization effect 200A may include a customizable platform to select a background or visual effects, which may be manipulated by a user. Audience interaction may be enabled, such as at wearable devices (e.g., wrist bands) where lights may turn on and off, which may be controlled using the musical instrument device (e.g., drum sticks).
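  • As a non-limiting example of capturing activity level, the Python sketch below maps recent accelerometer magnitudes to a 0.0-1.0 intensity that could drive a visualization or wrist-band brightness; the resting and full-scale constants are assumptions.

      def activity_level(recent_magnitudes, resting=9.8, full_scale=40.0):
          """Map the average accelerometer magnitude above rest to the range 0.0-1.0."""
          if not recent_magnitudes:
              return 0.0
          avg = sum(recent_magnitudes) / len(recent_magnitudes)
          level = (avg - resting) / (full_scale - resting)
          return max(0.0, min(1.0, level))

      print(activity_level([12.0, 25.0, 33.0]))  # moderate activity, roughly 0.45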
  • In an example, the visualization effects 200A-200B may be presented to an audience of a user controlling a musical instrument device. For example, a display screen may be used to display visualization effects 200A or 200B. For visualization effect 200C, the point-of-view drum set 208 may be presented, in an example, on a display screen to an audience. In an example, the visualization effect 200C may be presented using a VR display to a controller of the musical instrument device (e.g., drum sticks). The visualization effect 200C within the VR display may show the point-of-view drum set 208 and may include virtual drum sticks 210 or a virtual pedal 212. The virtual drum sticks 210 may be displayed virtually at a location corresponding to real drum sticks based on sensors and location information of the real drum sticks. The user of the real drum sticks, while wearing the VR display, may see the virtual drum sticks 210 as if the user was holding the virtual drum sticks 210 (and hands may also be shown to further this effect). The user may wear an ankle or foot device with a sensor to detect motion of the ankle or foot. The detected motion may cause the foot pedal 212 to move (e.g., in the VR environment) and may cause an audible or visual effect to occur. The user wearing the VR display may play the drum set 208 virtually with the virtual drum sticks 210 by controlling the real drum sticks (and optionally the foot pedal 212). The drum set 208 may move or display a visualization according to motion of the real drum sticks (e.g., the cymbals may crash, a drum head may appear to vibrate, a played drum may light up, etc.).
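  • One non-limiting way to decide which virtual drum of the drum set 208 a stick tip has struck is to test the tip position reported by the sensors against fixed zones in the headset's coordinate frame, as sketched below in Python; the zone layout and radii are assumptions for an example kit.

      DRUM_ZONES = {
          "snare": {"center": (0.0, 0.8, 0.4), "radius": 0.18},
          "hi_hat": {"center": (-0.35, 0.9, 0.4), "radius": 0.15},
          "crash": {"center": (0.4, 1.2, 0.5), "radius": 0.20},
      }

      def struck_drum(tip_position):
          """Return the name of the drum zone containing the stick tip, if any."""
          x, y, z = tip_position
          for name, zone in DRUM_ZONES.items():
              cx, cy, cz = zone["center"]
              distance = ((x - cx) ** 2 + (y - cy) ** 2 + (z - cz) ** 2) ** 0.5
              if distance <= zone["radius"]:
                  return name
          return None

      print(struck_drum((0.05, 0.75, 0.38)))  # -> "snare"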
  • FIGS. 3A-3C illustrate instrumentation objects 300A-300C for creating visualization or audio effects in accordance with some embodiments.
  • For example, the instrumentation object 300A may be a violin bow, viola bow, cello bow, or other stringed instrument bow. The instrumentation object 300A includes a sensor 304 and may include circuitry 306, such as a transceiver, a processor, memory, or a system on a chip. In the example shown in FIG. 3A, the sensor 304 is located at the tip of the instrumentation object 300A. In other examples, the sensor 304 may be located in the middle of the instrumentation object 300A or at the frog. The sensor 304 may include a gyroscope, an accelerometer, or a magnetometer to determine position, orientation, or movement of the instrumentation object 300A. In an example, a plurality of sensors may be disposed on the instrumentation object 300A (e.g., one at the tip, one in the middle, and one at the frog).
  • The sensor 304 may track back and forth movement of the instrumentation object 300A. The sensor 304 may track bow tapping movements (e.g., in a perpendicular or partially perpendicular movement to the back and forth traditional bow movement on a stringed instrument). The tracked movement (or position or orientation) of the instrumentation object 300A may be used to create or identify visual effects to be shown. The visual effects may be matched to the music created by playing the stringed instrument with the instrumentation object 300A. In another example, augmented audible feedback may be created or identified by the tracked movement, which may be played in addition to the music created. For example, a real violin may be played using the instrumentation object 300A as a bow, and the sensor on the bow may add to a performance experience by integrating visual or audio effects in addition to the music created by playing the violin. For example, when the bow moves in a first direction, a first visual or audible effect may be created or identified and when the bow moves in a second direction, a second visual or audible effect may be created or identified. In an example, mixers may be used to add in augmented sound. For example, a Multidimensional Polyphonic Expression for use with a MIDI player may be used to create or play the augmented sound.
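  • A non-limiting Python sketch of the direction-dependent behavior described above is shown below; the sign convention for along-bow velocity, the dead-band value, and the effect names are assumptions.

      def bow_effect(along_bow_velocity, dead_band=0.05):
          """Positive velocity is treated as a down-bow, negative as an up-bow."""
          if along_bow_velocity > dead_band:
              return {"visual": "warm_glow", "audio": "reverb_swell"}
          if along_bow_velocity < -dead_band:
              return {"visual": "cool_shimmer", "audio": "delay_tail"}
          return None  # bow essentially stationary: no added effect

      print(bow_effect(0.3))   # down-bow effect
      print(bow_effect(-0.2))  # up-bow effect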
  • In an example, a player of an instrument using the instrumentation object 300A may have a sensor on a finger or fingers of the player. For example, a violin player may place a sensor or sensors on one or more fingers used to play violin (e.g., a fourth finger of the player's left hand). The movement of that finger may indicate a particular visual effect. For example, when playing a stringed instrument, notes may often be played interchangeably with different fingers (e.g., the fourth finger in first position on a first string, an open second string, or a second finger in a third position on the first string may all correspond to a single note). By playing with a particular finger, a specific visual (or audible) effect may be identified or created.
  • The instrumentation object 300A may be in communication with a server. The server may include a processor to receive sensor data from the sensor 304 of the instrumentation object 300A, the sensor data may be based on movement of the instrumentation object 300A. The processor may recognize a gesture from the sensor data, such as a back or forth movement, a tapping of the instrumentation object 300A on a string, etc. The processor may determine, such as from the gesture, a visualization effect corresponding to the gesture or an audio effect corresponding to the gesture. The visualization effect may be determined using a visualization engine. In an example, the processor may cause the visualization effect or the audio effect to be output in response to the determination. The audio effect may include a natural sound caused by the movement of the instrumentation object 300A.
  • In an example, causing the visualization effect to be output may include sending the visualization effect to a virtual reality headset of a user controlling the instrumentation object 300A, for example, to be displayed on the VR headset. Causing the audio effect to be output may include sending the audio effect to a speaker to play the audio effect. The processor may send the visualization effect to a display to be displayed in coordination with the audio effect played by the speaker. The processor may receive data from the sensor 304 indicating an initial position of the instrumentation object 300A and recognize the gesture based on a determined final position of the instrumentation object 300A. The visualization effect or the audio effect may be determined based on an orientation of the instrumentation object 300A identified in the sensor data. The visualization effect may be based on one or more (e.g., a series) of previously recognized gestures. The visualization effect may include a lighting effect. Outputting the visualization effect may include sending the lighting effect to a plurality of wearable devices, such as within a predetermined proximity of the instrumentation object 300A to be displayed, for example at the plurality of wearable devices. Devices within the predetermined proximity may include devices on a wi-fi network, within range of a Bluetooth device or devices, within a specified distance, within a room, etc. The gesture may include a linear movement, a tapping movement, a sweeping movement, a minimum or maximum acceleration, a maximum or minimum deceleration, or the like. The audio effect may include a Multidimensional Polyphonic Expression instruction for a MIDI player.
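  • On the server side, the flow described above could be prototyped as a simple receive-and-dispatch loop, as in the following non-limiting Python sketch; the UDP transport, port number, and packet layout are assumptions rather than a defined interface.

      import json
      import socket

      def serve(recognize, dispatch, port=9100, max_packets=100):
          """Receive sensor packets, recognize gestures, and dispatch effects."""
          with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
              sock.bind(("0.0.0.0", port))
              for _ in range(max_packets):
                  data, _addr = sock.recvfrom(1024)
                  sample = json.loads(data)  # e.g. {"accel": [...], "gyro": [...]}
                  gesture = recognize(sample)
                  if gesture is not None:
                      dispatch(gesture)  # emit visualization and audio effects

      # serve(my_recognizer, my_dispatcher)  # blocking loop; run on the server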
  • The instrumentation object 300B may be a guitar pick. The instrumentation object 300B includes a sensor 310 and may include circuitry 312, such as a transceiver, a processor, memory, or a system on a chip. The instrumentation object 300B may be used to strum a stringed instrument, such as a guitar. Movement of the instrumentation object 300B may correspond with a visual or audible effect to be produced. For example, when the instrumentation object 300B is used to strum a guitar upward, a first visual or audible effect may be identified and when the instrumentation object 300B is used to strum the guitar downward, a second visual or audible effect may be identified and used.
  • The instrumentation object 300B may be in communication with a server. The server may include a processor to receive sensor data from the sensor 310 of the instrumentation object 300B, the sensor data may be based on movement of the instrumentation object 300B. The processor (e.g., circuitry 312) may recognize a gesture from the sensor data, such as a strumming movement, a slapping movement, etc. The processor may determine, such as from the gesture, a visualization effect corresponding to the gesture or an audio effect corresponding to the gesture. The visualization effect may be determined using a visualization engine. In an example, the processor may cause the visualization effect or the audio effect to be output in response to the determination. The audio effect may include a natural sound caused by the movement of the instrumentation object 300B.
  • In an example, causing the visualization effect to be output may include sending the visualization effect to a virtual reality headset of a user controlling the instrumentation object 300B, for example, to be displayed on the VR headset. Causing the audio effect to be output may include sending the audio effect to a speaker to play the audio effect. The processor may send the visualization effect to a display to be displayed in coordination with the audio effect played by the speaker. The processor may receive data from the sensor 310 indicating an initial position of the instrumentation object 300B and recognize the gesture based on a determined final position of the instrumentation object 300B. The visualization effect or the audio effect may be determined based on an orientation of the instrumentation object 300B identified in the sensor data. The visualization effect may be based on one or more (e.g., a series) of previously recognized gestures. The visualization effect may include a lighting effect. Outputting the visualization effect may include sending the lighting effect to a plurality of wearable devices, such as within a predetermined proximity of the instrumentation object 300B to be displayed, for example at the plurality of wearable devices. Devices within the predetermined proximity may include devices on a wi-fi network, within range of a Bluetooth device or devices, within a specified distance, within a room, etc. The gesture may include a linear movement, a tapping movement, a sweeping movement, a minimum or maximum acceleration, a maximum or minimum deceleration, or the like. The audio effect may include a Multidimensional Polyphonic Expression instruction for a MIDI player.
  • The instrumentation object 300C may be a conductor's baton. The instrumentation object 300C includes a sensor 316 and may include circuitry 318, such as a transceiver, a processor, memory, or a system on a chip. The instrumentation object 300C may be used to conduct an orchestra, either real or virtual. The real orchestra may play music in response to movement of the instrumentation object 300C or orchestral sound may be created in response to movement of the instrumentation object 300C with a virtual orchestra. A visual effect or audible effect may be created or identified in response to movement of the instrumentation object 300C.
  • The instrumentation object 300C may be in communication with a server. The server may include a processor to receive sensor data from the sensor 316 of the instrumentation object 300C, the sensor data based on movement of the instrumentation object 300C. The processor (e.g., circuitry 318) may recognize a gesture from the sensor data, such as an up-and-down or left-and-right movement, a conducting cadence movement (e.g., based on a tempo of music being played, such as 3/4, 4/4, 7/8, etc.), or the like. The processor may determine, such as from the gesture, a visualization effect corresponding to the gesture or an audio effect corresponding to the gesture. The visualization effect may be determined using a visualization engine. In an example, the processor may cause the visualization effect or the audio effect to be output in response to the determination.
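The cadence recognition mentioned above can be approximated by timing the baton's downbeats: the interval between successive downbeat detections gives a beat period, and hence a tempo estimate that a visualization or a virtual orchestra could follow. The function below is a simplified sketch; detecting the downbeats themselves from the sensor's acceleration profile is assumed rather than shown.

```python
# Sketch: estimate tempo (BPM) from timestamps of detected baton downbeats.
from typing import List


def estimate_bpm(downbeat_times_s: List[float]) -> float:
    """Average inter-beat interval over recent downbeats -> beats per minute."""
    if len(downbeat_times_s) < 2:
        return 0.0
    intervals = [b - a for a, b in zip(downbeat_times_s, downbeat_times_s[1:])]
    mean_interval = sum(intervals) / len(intervals)
    return 60.0 / mean_interval if mean_interval > 0 else 0.0


if __name__ == "__main__":
    # Downbeats roughly every 0.5 s -> about 120 BPM; a 4/4 conducting
    # pattern would repeat every four of these beats.
    beats = [0.00, 0.51, 1.02, 1.49, 2.01]
    print(round(estimate_bpm(beats), 1), "BPM")
```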
  • In an example, causing the visualization effect to be output may include sending the visualization effect to a virtual reality headset of a user controlling the instrumentation object 300C, for example, to be displayed on the VR headset. Causing the audio effect to be output may include sending the audio effect to a speaker to play the audio effect. The processor may send the visualization effect to a display to be displayed in coordination with the audio effect played by the speaker. The processor may receive data from the sensor 316 indicating an initial position of the instrumentation object 300C and recognize the gesture based on a determined final position of the instrumentation object 300C. The visualization effect or the audio effect may be determined based on an orientation of the instrumentation object 300C identified in the sensor data. The visualization effect may be based on one or more (e.g., a series) of previously recognized gestures. The visualization effect may include a lighting effect. Outputting the visualization effect may include sending the lighting effect to a plurality of wearable devices, such as within a predetermined proximity of the instrumentation object 300C to be displayed, for example at the plurality of wearable devices. Devices within the predetermined proximity may include devices on a wi-fi network, within range of a Bluetooth device or devices, within a specified distance, within a room, etc. The gesture may include a linear movement, a tapping movement, a sweeping movement, a minimum or maximum acceleration, a maximum or minimum deceleration, or the like. The audio effect may include a Multidimensional Polyphonic Expression instruction for a MIDI player.
  • FIG. 4 illustrates a system 400 for providing effects corresponding to movement of an instrumentation object in accordance with some embodiments. Remote instrumentation devices may include a drum stick 410, a violin bow 416, a guitar pick 422, a conductor's baton 428, or the like. The instrumentation devices 410, 416, 422, and 428 may include respective sensors (e.g., 412, 418, 424, 430) and optionally respective transceivers or processors (e.g., 414, 420, 426, 432).
  • The system 400 includes a server 401 in communication with one or more remote instrumentation devices (e.g., 410, 416, 422, 428), or a wearable device 408. The server 401 includes a processor 402, memory 404, and a visualization engine 406. The processor 402 may receive sensor data from a sensor (412, 418, 424, or 430) of one or more of the remote instrumentation devices (e.g., 410, 416, 422, 428), such as the drum stick sensor 412. The drum stick 410 may be paired with a second drum stick, and the pair may include a parent and a child drum stick. For example, the child drum stick may have limited communication capabilities (e.g., capable of communicating with the parent drum stick, but incapable of communicating with another remote device). The parent drum stick may have the processor 414 or a transceiver, for example to communicate with a mobile device, wearable device, or remote device. The pair of drum sticks may be used together. The processor 402 may receive the sensor data from one of the pair of drum sticks (e.g., a parent) or both (e.g., individually, or via the parent). In an example, the sensor data is based on movement of the drum stick 410.
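The parent/child pairing described above amounts to the parent stick tagging and forwarding both its own samples and those relayed by the child, so the server sees a single combined stream. The sketch below is an assumption-laden illustration: the class names are hypothetical and an in-memory buffer stands in for the actual radio link.

```python
# Sketch of a parent drum stick combining its own sensor samples with
# samples relayed from a child stick before forwarding them upstream.
from dataclasses import dataclass
from typing import List, Tuple


@dataclass
class Sample:
    stick_id: str                       # "parent" or "child"
    timestamp_ms: int
    accel: Tuple[float, float, float]   # (x, y, z) acceleration, illustrative


class ParentStick:
    def __init__(self) -> None:
        self.buffer: List[Sample] = []

    def read_local_sensor(self, timestamp_ms: int, accel: Tuple[float, float, float]) -> None:
        self.buffer.append(Sample("parent", timestamp_ms, accel))

    def receive_from_child(self, timestamp_ms: int, accel: Tuple[float, float, float]) -> None:
        # In the real device this would arrive over a short-range radio link.
        self.buffer.append(Sample("child", timestamp_ms, accel))

    def flush_to_server(self) -> List[Sample]:
        # Forward the combined, time-ordered stream and clear the buffer.
        combined = sorted(self.buffer, key=lambda s: s.timestamp_ms)
        self.buffer = []
        return combined


if __name__ == "__main__":
    parent = ParentStick()
    parent.read_local_sensor(10, (0.1, 9.8, 0.0))
    parent.receive_from_child(8, (0.0, 9.7, 0.2))
    for sample in parent.flush_to_server():
        print(sample)
```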
  • In an example, the processor 402 may recognize a gesture from the sensor data. For example, the gesture may include a drum strike, a violin playing movement, a conductor baton conducting movement, a guitar strum, etc. The gesture may include one or more of a linear movement, a tapping movement, a sweeping movement, a minimum or maximum acceleration, a minimum or maximum deceleration, or the like. The processor 402 may determine, for example using the gesture, a visualization effect corresponding to the gesture. The visualization effect may be determined using the visualization engine 406. In an example, to determine the visualization effect, the processor 402 is to determine the visualization effect based on a series of previously recognized gestures. The visualization effect may include a lighting effect, such as a flashing light or light sequence on a screen, a virtual reality light effect, or a light effect sent for display to a plurality of wearable devices (e.g., the wearable device 408). The plurality of wearable devices may be identified within a proximity of the remote instrumentation devices (e.g., 410, 416, 422, 428). The processor 402 may receive wearable sensor data from the plurality of wearable devices (e.g., the wearable device 408), which may be within a predetermined proximity of the remote instrumentation devices (e.g., 410, 416, 422, 428). The visualization effect may be modified, for example, based on the wearable sensor data. Devices within the predetermined proximity may include devices on a wi-fi network, within range of a Bluetooth device or devices, within a specified distance, within a room, etc.
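Basing the visualization on a series of previously recognized gestures, as mentioned above, can be as simple as keeping a short gesture history and escalating the effect when several strikes land within a small time window. This is an illustrative sketch only; the gesture label, window length, and effect names are placeholders rather than anything specified in the disclosure.

```python
# Sketch: choose a visualization effect from a short history of recognized
# gestures, escalating when strikes arrive in quick succession.
from collections import deque
from typing import Deque, Tuple

HISTORY_LEN = 8
BURST_WINDOW_S = 1.0


class VisualizationSelector:
    def __init__(self) -> None:
        # Each entry is (gesture_name, timestamp_s).
        self.history: Deque[Tuple[str, float]] = deque(maxlen=HISTORY_LEN)

    def add_gesture(self, name: str, timestamp_s: float) -> str:
        self.history.append((name, timestamp_s))
        recent_strikes = [
            t for g, t in self.history
            if g == "drum_strike" and timestamp_s - t <= BURST_WINDOW_S
        ]
        # Placeholder effect names; a real visualization engine would render these.
        if len(recent_strikes) >= 4:
            return "strobe_burst"
        if len(recent_strikes) >= 2:
            return "double_pulse"
        return "single_pulse"


if __name__ == "__main__":
    selector = VisualizationSelector()
    for t in (0.0, 0.2, 0.4, 0.6):
        print(t, selector.add_gesture("drum_strike", t))
```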
  • The processor 402 may determine an audio effect corresponding to the gesture including, for example, a drum sound, a violin sound (or other stringed instrument sound), a guitar sound, an orchestral sound (e.g., a combination of sounds from a plurality of instruments), or the like. In an example, the visualization effect and the audio effect are determined based on an orientation of the remote instrumentation devices (e.g., 410, 416, 422, 428) identified in the sensor data. In an example, the audio effect may include Multidimensional Polyphonic Expression instructions for a musical instrument digital interface (MIDI) player.
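The Multidimensional Polyphonic Expression (MPE) mention above suggests per-note expression sent over MIDI, where each sounding note is given its own channel so pressure and pitch bend can be applied independently. The sketch below uses the third-party `mido` library, which is an assumption (the disclosure does not name a MIDI library), and maps an illustrative strike velocity and stick tilt onto note velocity and channel pressure.

```python
# Hedged sketch of sending MPE-style per-note expression to a MIDI player.
# Assumes the third-party `mido` library (pip install mido python-rtmidi).
import mido

SNARE_NOTE = 38          # General MIDI acoustic snare
MEMBER_CHANNEL = 1       # MPE reserves channel 0 as the master channel


def send_strike(port, velocity: int, pressure: int) -> None:
    """Send one drum strike as a note with its own per-note expression."""
    velocity = max(1, min(127, velocity))
    pressure = max(0, min(127, pressure))
    # Note-on with velocity derived from the strike strength.
    port.send(mido.Message("note_on", channel=MEMBER_CHANNEL,
                           note=SNARE_NOTE, velocity=velocity))
    # Channel pressure on the note's own channel, e.g. derived from stick tilt.
    port.send(mido.Message("aftertouch", channel=MEMBER_CHANNEL, value=pressure))
    port.send(mido.Message("note_off", channel=MEMBER_CHANNEL, note=SNARE_NOTE))


if __name__ == "__main__":
    with mido.open_output() as port:      # default MIDI output port
        send_strike(port, velocity=110, pressure=40)
```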
  • The processor 402 may cause the visualization effect or the audio effect to be output, such as in response to the determination. In an example, the visualization effect may be output to a virtual reality headset 434, which may display the visualization effect using a virtual reality display. The virtual reality headset 434 may be worn by a user who is controlling a remote instrumentation device (e.g., 410, 416, 422, 428). The visualization effect may be displayed in coordination with the audio effect, which may be played, for example, by a speaker.
  • In an example, the processor 402 may receive data from the sensor indicating an initial position of the remote instrumentation devices (e.g., 410, 416, 422, 428). For example, the processor 402 may determine a final position of the remote instrumentation devices (e.g., 410, 416, 422, 428), such as a drum stick 410. In an example, the drum stick 410 may be used to generate the drum sound (e.g., without striking a drum), which may be determined based on the initial position and the final position.
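Generating a drum sound without striking a physical drum, as described above, can be sketched as mapping the stick's final pointing direction to a virtual drum zone and the swing between the initial and final positions to a hit velocity. The zone boundaries, zone names, and scaling factor below are purely illustrative assumptions.

```python
# Sketch: pick a virtual drum from the stick's final orientation and derive
# a hit velocity from how far it swung from its initial position.
from dataclasses import dataclass


@dataclass
class StickPose:
    yaw_deg: float    # left/right pointing direction
    pitch_deg: float  # downward tilt


def drum_for_pose(pose: StickPose) -> str:
    # Illustrative zone layout; a real system would be calibrated per user.
    if pose.pitch_deg < -30:
        return "kick"
    if pose.yaw_deg < -20:
        return "hi_hat"
    if pose.yaw_deg > 20:
        return "floor_tom"
    return "snare"


def hit_velocity(initial: StickPose, final: StickPose) -> int:
    # Larger swings map to louder hits, clamped to the MIDI velocity range.
    swing = abs(final.pitch_deg - initial.pitch_deg)
    return max(1, min(127, int(swing * 2)))


if __name__ == "__main__":
    start = StickPose(yaw_deg=5.0, pitch_deg=20.0)
    end = StickPose(yaw_deg=4.0, pitch_deg=-25.0)
    print(drum_for_pose(end), hit_velocity(start, end))
```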
  • In an example, the processor 402 may receive additional sensor data from a second sensor attached to an ankle or a foot of a user who is controlling the drum stick. A second gesture may be recognized from the additional sensor data. The processor 402 may determine from the second gesture, a second audio effect or a second visualization effect. The second audio effect may include a second drum sound corresponding to the second gesture. The processor 402 may cause the second audio effect or the second visualization effect to be output with the visualization effect or the audio effect.
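Combining the ankle or foot sensor with the stick, as described above, is essentially a merge of two independently recognized event streams, so that a kick and a stick hit detected close together are output as one coordinated moment. A minimal sketch of that merge step follows; the grouping tolerance is an assumed value.

```python
# Sketch: merge stick-hit and foot-tap events so near-simultaneous events
# are grouped for coordinated audio/visual output.
from typing import List, Tuple

GROUP_TOLERANCE_S = 0.05  # events within 50 ms are treated as one moment


def merge_events(stick: List[float], foot: List[float]) -> List[Tuple[float, List[str]]]:
    """Return (timestamp, [labels]) groups from two lists of event timestamps."""
    tagged = sorted([(t, "stick_hit") for t in stick] + [(t, "kick") for t in foot])
    groups: List[Tuple[float, List[str]]] = []
    for t, label in tagged:
        if groups and t - groups[-1][0] <= GROUP_TOLERANCE_S:
            groups[-1][1].append(label)
        else:
            groups.append((t, [label]))
    return groups


if __name__ == "__main__":
    # The hit at 1.00 s and the kick at 1.02 s fall into one group.
    print(merge_events(stick=[1.00, 2.00], foot=[1.02, 3.00]))
```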
  • FIG. 5 illustrates a flowchart showing a technique 500 for providing effects corresponding to movement of an instrumentation object in accordance with some embodiments. The technique 500 includes an operation 502 to receive sensor data from a sensor of at least one drum stick of a pair of drum sticks. The sensor data may be based on movement of the at least one drum stick. The sensor data may include data from sensors of both of the pair of drum sticks. The gesture may include movement of the pair of drum sticks in coordination with each other.
  • The technique 500 includes an operation 504 to recognize a gesture. The gesture may include a linear movement, a tapping movement, a sweeping movement, a minimum or maximum acceleration, a minimum or maximum deceleration, or the like. The technique 500 includes an operation 506 to determine a visualization effect and an audio effect, the effects corresponding to the gesture. In an example, determining the visualization effect includes using a visualization engine. In an example, determining the visualization effect includes determining the visualization effect based on one or more (e.g., a series) of previously recognized gestures. The audio effect may include Multidimensional Polyphonic Expression instructions for a MIDI player.
  • The technique 500 includes an operation 508 to output the visualization effect and the audio effect, such as in response to the determination. Outputting the visualization effect may include sending the visualization effect to a virtual reality headset of a user controlling the pair of drum sticks to be displayed on the virtual reality headset. Outputting the audio effect may include sending the audio effect to a speaker to play the audio effect. The visualization effect and the audio effect may be displayed and played, respectively, in coordination.
  • In an example, the technique 500 may include receiving data from the sensor indicating an initial position of the drum stick, and recognizing the gesture may include determining a final position of the drum stick. The drum sound may be determined based on the initial position or the final position. The visualization effect corresponding to the gesture or the audio effect including the drum sound corresponding to the gesture may be determined based on an orientation of the drum stick identified in the sensor data.
  • The technique 500 may include receiving additional sensor data from a second sensor attached to an ankle or a foot of a user controlling the drum stick, recognizing a second gesture from the additional sensor data, determining, from the second gesture, a second audio effect including a second drum sound corresponding to the second gesture, and causing the second audio effect to be output with the visualization effect and the audio effect. The technique 500 may include receiving wearable sensor data from a plurality of wearable devices, such as within a predetermined proximity of the drum stick. The visualization effect may be modified based on the wearable sensor data. The visualization effect may include a lighting effect, which may be sent to a plurality of wearable devices within a predetermined proximity of the drum stick, such as to be displayed at the plurality of wearable devices. Devices within the predetermined proximity may include devices on a wi-fi network, within range of a Bluetooth device or devices, within a specified distance, within a room, etc.
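Selecting which wearable devices receive the lighting effect, as described above, can be sketched as a simple filter over reported device positions or link information: anything within the configured radius (or on the same network segment) is included. The distance threshold and device records below are illustrative assumptions, not values from the disclosure.

```python
# Sketch: select wearable devices within a predetermined proximity of the
# drum stick, here approximated by a straight-line distance threshold.
from dataclasses import dataclass
from math import dist
from typing import List, Tuple

PROXIMITY_RADIUS_M = 10.0  # illustrative "within a room" distance


@dataclass
class Wearable:
    device_id: str
    position_m: Tuple[float, float]  # 2-D position, e.g. from indoor positioning


def nearby_wearables(stick_pos: Tuple[float, float],
                     devices: List[Wearable]) -> List[str]:
    """IDs of wearables close enough to receive the lighting effect."""
    return [d.device_id for d in devices
            if dist(stick_pos, d.position_m) <= PROXIMITY_RADIUS_M]


if __name__ == "__main__":
    audience = [Wearable("band-1", (2.0, 1.0)),
                Wearable("band-2", (25.0, 3.0)),
                Wearable("band-3", (7.5, 6.0))]
    print(nearby_wearables((0.0, 0.0), audience))
```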
  • FIG. 6 illustrates generally an example of a block diagram of a machine 600 upon which any one or more of the techniques (e.g., methodologies) discussed herein may perform in accordance with some embodiments. In alternative embodiments, the machine 600 may operate as a standalone device or may be connected (e.g., networked) to other machines. In a networked deployment, the machine 600 may operate in the capacity of a server machine, a client machine, or both in server-client network environments. In an example, the machine 600 may act as a peer machine in a peer-to-peer (P2P) (or other distributed) network environment. The machine 600 may be a personal computer (PC), a tablet PC, a set-top box (STB), a personal digital assistant (PDA), a mobile telephone, a web appliance, a network router, switch or bridge, or any machine capable of executing instructions (sequential or otherwise) that specify actions to be taken by that machine. Further, while only a single machine is illustrated, the term “machine” shall also be taken to include any collection of machines that individually or jointly execute a set (or multiple sets) of instructions to perform any one or more of the methodologies discussed herein, such as cloud computing, software as a service (SaaS), or other computer cluster configurations.
  • Examples, as described herein, may include, or may operate on, logic or a number of components, modules, or mechanisms. Modules are tangible entities (e.g., hardware) capable of performing specified operations when operating. A module includes hardware. In an example, the hardware may be specifically configured to carry out a specific operation (e.g., hardwired). In an example, the hardware may include configurable execution units (e.g., transistors, circuits, etc.) and a computer readable medium containing instructions, where the instructions configure the execution units to carry out a specific operation when in operation. The configuring may occur under the direction of the execution units or a loading mechanism. Accordingly, the execution units are communicatively coupled to the computer readable medium when the device is operating. In this example, the execution units may be a member of more than one module. For example, under operation, the execution units may be configured by a first set of instructions to implement a first module at one point in time and reconfigured by a second set of instructions to implement a second module.
  • Machine (e.g., computer system) 600 may include a hardware processor 602 (e.g., a central processing unit (CPU), a graphics processing unit (GPU), a hardware processor core, or any combination thereof), a main memory 604 and a static memory 606, some or all of which may communicate with each other via an interlink (e.g., bus) 608. The machine 600 may further include a display unit 610, an alphanumeric input device 612 (e.g., a keyboard), and a user interface (UI) navigation device 614 (e.g., a mouse). In an example, the display unit 610, alphanumeric input device 612 and UI navigation device 614 may be a touch screen display. The machine 600 may additionally include a storage device (e.g., drive unit) 616, a signal generation device 618 (e.g., a speaker), a network interface device 620, and one or more sensors 621, such as a global positioning system (GPS) sensor, compass, accelerometer, or other sensor. The machine 600 may include an output controller 628, such as a serial (e.g., universal serial bus (USB), parallel, or other wired or wireless (e.g., infrared (IR), near field communication (NFC), etc.) connection to communicate or control one or more peripheral devices (e.g., a printer, card reader, etc.).
  • The storage device 616 may include a machine readable medium 622 that is non-transitory on which is stored one or more sets of data structures or instructions 624 (e.g., software) embodying or utilized by any one or more of the techniques or functions described herein. The instructions 624 may also reside, completely or at least partially, within the main memory 604, within static memory 606, or within the hardware processor 602 during execution thereof by the machine 600. In an example, one or any combination of the hardware processor 602, the main memory 604, the static memory 606, or the storage device 616 may constitute machine readable media.
  • While the machine readable medium 622 is illustrated as a single medium, the term “machine readable medium” may include a single medium or multiple media (e.g., a centralized or distributed database, or associated caches and servers) configured to store the one or more instructions 624.
  • The term “machine readable medium” may include any medium that is capable of storing, encoding, or carrying instructions for execution by the machine 600 and that cause the machine 600 to perform any one or more of the techniques of the present disclosure, or that is capable of storing, encoding or carrying data structures used by or associated with such instructions. Non-limiting machine readable medium examples may include solid-state memories, and optical and magnetic media. Specific examples of machine readable media may include: non-volatile memory, such as semiconductor memory devices (e.g., Electrically Programmable Read-Only Memory (EPROM), Electrically Erasable Programmable Read-Only Memory (EEPROM)) and flash memory devices; magnetic disks, such as internal hard disks and removable disks; magneto-optical disks; and CD-ROM and DVD-ROM disks.
  • The instructions 624 may further be transmitted or received over a communications network 626 using a transmission medium via the network interface device 620 utilizing any one of a number of transfer protocols (e.g., frame relay, Internet protocol (IP), transmission control protocol (TCP), user datagram protocol (UDP), hypertext transfer protocol (HTTP), etc.). Example communication networks may include a local area network (LAN), a wide area network (WAN), a packet data network (e.g., the Internet), mobile telephone networks (e.g., cellular networks), Plain Old Telephone (POTS) networks, and wireless data networks (e.g., Institute of Electrical and Electronics Engineers (IEEE) 802.11 family of standards known as Wi-Fi®, IEEE 802.16 family of standards known as WiMax®), IEEE 802.15.4 family of standards, peer-to-peer (P2P) networks, among others. In an example, the network interface device 620 may include one or more physical jacks (e.g., Ethernet, coaxial, or phone jacks) or one or more antennas to connect to the communications network 626. In an example, the network interface device 620 may include a plurality of antennas to wirelessly communicate using at least one of single-input multiple-output (SIMO), multiple-input multiple-output (MIMO), or multiple-input single-output (MISO) techniques. The term “transmission medium” shall be taken to include any intangible medium that is capable of storing, encoding or carrying instructions for execution by the machine 600, and includes digital or analog communications signals or other intangible medium to facilitate communication of such software.
  • VARIOUS NOTES & EXAMPLES
  • Each of these non-limiting examples may stand on its own, or may be combined in various permutations or combinations with one or more of the other examples.
  • Example 1 is a server in communication with a pair of drum sticks, the server comprising: a processor to: receive sensor data from a sensor of at least one drum stick of the pair of drum sticks, the sensor data based on movement of the at least one drum stick; recognize a gesture from the sensor data; determine, from the gesture, a visualization effect corresponding to the gesture and an audio effect including a drum sound corresponding to the gesture; and cause the visualization effect and the audio effect to be output in response to the determination.
  • In Example 2, the subject matter of Example 1 optionally includes wherein to determine the visualization effect, the processor is to use a visualization engine.
  • In Example 3, the subject matter of any one or more of Examples 1-2 optionally include wherein the sensor data includes data from sensors of both of the pair of drum sticks and wherein the gesture includes movement of the pair of drum sticks in coordination with each other.
  • In Example 4, the subject matter of any one or more of Examples 1-3 optionally include wherein to cause the visualization effect to be output, the processor is to send the visualization effect to a virtual reality headset of a user controlling the pair of drum sticks to be displayed on the virtual reality headset.
  • In Example 5, the subject matter of any one or more of Examples 1-4 optionally include wherein to cause the audio effect to be output, the processor is to send the audio effect to a speaker to play the audio effect.
  • In Example 6, the subject matter of Example 5 optionally includes wherein to cause the visualization effect to be output, the processor is to send the visualization effect to a display to be displayed in coordination with the audio effect played by the speaker.
  • In Example 7, the subject matter of any one or more of Examples 1-6 optionally include wherein the processor is further to receive data from the sensor indicating an initial position of the drum stick, and wherein to recognize the gesture, the processor is to determine a final position of the drum stick.
  • In Example 8, the subject matter of Example 7 optionally includes wherein the drum sound is determined based on the initial position and the final position.
  • In Example 9, the subject matter of any one or more of Examples 1-8 optionally include wherein the visualization effect corresponding to the gesture and the audio effect including the drum sound corresponding to the gesture are determined based on an orientation of the drum stick identified in the sensor data.
  • In Example 10, the subject matter of any one or more of Examples 1-9 optionally include wherein to determine the visualization effect, the processor is to determine the visualization effect based on a series of previously recognized gestures.
  • In Example 11, the subject matter of any one or more of Examples 1-10 optionally include wherein the processor is further to: receive additional sensor data from a second sensor attached to an ankle or a foot of a user controlling the drum stick; recognize a second gesture from the additional sensor data; determine, from the second gesture, a second audio effect including a second drum sound corresponding to the second gesture; and cause the second audio effect to be output with the visualization effect and the audio effect.
  • In Example 12, the subject matter of any one or more of Examples 1-11 optionally include wherein the processor is further to: receive wearable sensor data from a plurality of wearable devices within a predetermined proximity of the drum stick; and modify the visualization effect based on the wearable sensor data.
  • In Example 13, the subject matter of any one or more of Examples 1-12 optionally include wherein the visualization effect includes a lighting effect, and wherein to cause the visualization effect to be output, the processor is to send the lighting effect to a plurality of wearable devices within a predetermined proximity of the drum stick to be displayed at the plurality of wearable devices.
  • In Example 14, the subject matter of any one or more of Examples 1-13 optionally include wherein the gesture includes at least one of a linear movement, a tapping movement, a sweeping movement, a minimum acceleration, or a minimum deceleration.
  • In Example 15, the subject matter of any one or more of Examples 1-14 optionally include wherein the audio effect includes Multidimensional Polyphonic Expression instructions for a musical instrument digital interface (MIDI) player.
  • Example 16 is a method for providing effects corresponding to movement of drum sticks, the method comprising: receiving sensor data from a sensor of at least one drum stick of a pair of drum sticks, the sensor data based on movement of the at least one drum stick; recognizing a gesture from the sensor data; determining, from the gesture, a visualization effect corresponding to the gesture and an audio effect including a drum sound corresponding to the gesture; and causing the visualization effect and the audio effect to be output in response to the determination.
  • In Example 17, the subject matter of Example 16 optionally includes wherein determining the visualization effect includes using a visualization engine.
  • In Example 18, the subject matter of any one or more of Examples 16-17 optionally include wherein the sensor data includes data from sensors of both of the pair of drum sticks and wherein the gesture includes movement of the pair of drum sticks in coordination with each other.
  • In Example 19, the subject matter of any one or more of Examples 16-18 optionally include wherein causing the visualization effect to be output includes sending the visualization effect to a virtual reality headset of a user controlling the pair of drum sticks to be displayed on the virtual reality headset.
  • In Example 20, the subject matter of any one or more of Examples 16-19 optionally include wherein causing the audio effect to be output includes sending the audio effect to a speaker to play the audio effect.
  • In Example 21, the subject matter of Example 20 optionally includes wherein causing the visualization effect to be output includes sending the visualization effect to a display to be displayed in coordination with the audio effect played by the speaker.
  • In Example 22, the subject matter of any one or more of Examples 16-21 optionally include receiving data from the sensor indicating an initial position of the drum stick, and wherein recognizing the gesture includes determining a final position of the drum stick.
  • In Example 23, the subject matter of Example 22 optionally includes wherein the drum sound is determined based on the initial position and the final position.
  • In Example 24, the subject matter of any one or more of Examples 16-23 optionally include wherein the visualization effect corresponding to the gesture and the audio effect including the drum sound corresponding to the gesture are determined based on an orientation of the drum stick identified in the sensor data.
  • In Example 25, the subject matter of any one or more of Examples 16-24 optionally include wherein determining the visualization effect includes determining the visualization effect based on a series of previously recognized gestures.
  • In Example 26, the subject matter of any one or more of Examples 16-25 optionally include receiving additional sensor data from a second sensor attached to an ankle or a foot of a user controlling the drum stick; recognizing a second gesture from the additional sensor data; determining, from the second gesture, a second audio effect including a second drum sound corresponding to the second gesture; and causing the second audio effect to be output with the visualization effect and the audio effect.
  • In Example 27, the subject matter of any one or more of Examples 16-26 optionally include receiving wearable sensor data from a plurality of wearable devices within a predetermined proximity of the drum stick; and modifying the visualization effect based on the wearable sensor data.
  • In Example 28, the subject matter of any one or more of Examples 16-27 optionally include wherein the visualization effect includes a lighting effect, and wherein causing the visualization effect to be output includes sending the lighting effect to a plurality of wearable devices within a predetermined proximity of the drum stick to be displayed at the plurality of wearable devices.
  • In Example 29, the subject matter of any one or more of Examples 16-28 optionally include wherein the gesture includes at least one of a linear movement, a tapping movement, a sweeping movement, a minimum acceleration, or a minimum deceleration.
  • In Example 30, the subject matter of any one or more of Examples 16-29 optionally include wherein the audio effect includes Multidimensional Polyphonic Expression instructions for a musical instrument digital interface (MIDI) player.
  • Example 31 is at least one machine-readable medium including instructions for operation of a computing system, which when executed by a machine, cause the machine to perform operations of any of the methods of Examples 16-30.
  • Example 32 is an apparatus comprising means for performing any of the methods of Examples 16-30.
  • Example 33 is at least one machine-readable medium including instructions for providing effects corresponding to movement of drum sticks, which when executed by a machine, cause the machine to: receive sensor data from a sensor of at least one drum stick of a pair of drum sticks, the sensor data based on movement of the at least one drum stick; recognize a gesture from the sensor data; determine, from the gesture, a visualization effect corresponding to the gesture and an audio effect including a drum sound corresponding to the gesture; and cause the visualization effect and the audio effect to be output in response to the determination.
  • In Example 34, the subject matter of Example 33 optionally includes wherein the instructions to determine the visualization effect include instructions to use a visualization engine.
  • In Example 35, the subject matter of any one or more of Examples 33-34 optionally include wherein the sensor data includes data from sensors of both of the pair of drum sticks and wherein the gesture includes movement of the pair of drum sticks in coordination with each other.
  • In Example 36, the subject matter of any one or more of Examples 33-35 optionally include wherein the instructions to cause the visualization effect to be output include instructions to send the visualization effect to a virtual reality headset of a user controlling the pair of drum sticks to be displayed on the virtual reality headset.
  • In Example 37, the subject matter of any one or more of Examples 33-36 optionally include wherein the instructions to cause the audio effect to be output include instructions to send the audio effect to a speaker to play the audio effect.
  • In Example 38, the subject matter of Example 37 optionally includes wherein the instructions to cause the visualization effect to be output include instructions to send the visualization effect to a display to be displayed in coordination with the audio effect played by the speaker.
  • In Example 39, the subject matter of any one or more of Examples 33-38 optionally include instructions to receive data from the sensor indicating an initial position of the drum stick, and wherein the instructions to recognize the gesture include instructions to determine a final position of the drum stick.
  • In Example 40, the subject matter of Example 39 optionally includes wherein the drum sound is determined based on the initial position and the final position.
  • In Example 41, the subject matter of any one or more of Examples 33-40 optionally include wherein the visualization effect corresponding to the gesture and the audio effect including the drum sound corresponding to the gesture are determined based on an orientation of the drum stick identified in the sensor data.
  • In Example 42, the subject matter of any one or more of Examples 33-41 optionally include wherein the instructions to determine the visualization effect include instructions to determine the visualization effect based on a series of previously recognized gestures.
  • In Example 43, the subject matter of any one or more of Examples 33-42 optionally include instructions to: receive additional sensor data from a second sensor attached to an ankle or a foot of a user controlling the drum stick; recognize a second gesture from the additional sensor data; determine, from the second gesture, a second audio effect including a second drum sound corresponding to the second gesture; and cause the second audio effect to be output with the visualization effect and the audio effect.
  • In Example 44, the subject matter of any one or more of Examples 33-43 optionally include instructions to: receive wearable sensor data from a plurality of wearable devices within a predetermined proximity of the drum stick; and modify the visualization effect based on the wearable sensor data.
  • In Example 45, the subject matter of any one or more of Examples 33-44 optionally include wherein the visualization effect includes a lighting effect, and wherein the instructions to cause the visualization effect to be output include instructions to send the lighting effect to a plurality of wearable devices within a predetermined proximity of the drum stick to be displayed at the plurality of wearable devices.
  • In Example 46, the subject matter of any one or more of Examples 33-45 optionally include wherein the gesture includes at least one of a linear movement, a tapping movement, a sweeping movement, a minimum acceleration, or a minimum deceleration.
  • In Example 47, the subject matter of any one or more of Examples 33-46 optionally include wherein the audio effect includes Multidimensional Polyphonic Expression instructions for a musical instrument digital interface (MIDI) player.
  • Example 48 is an apparatus for providing effects corresponding to movement of drum sticks, the apparatus comprising: means for receiving sensor data from a sensor of at least one drum stick of a pair of drum sticks, the sensor data based on movement of the at least one drum stick; means for recognizing a gesture from the sensor data; means for determining, from the gesture, a visualization effect corresponding to the gesture and an audio effect including a drum sound corresponding to the gesture; and means for causing the visualization effect and the audio effect to be output in response to the determination.
  • In Example 49, the subject matter of Example 48 optionally includes wherein the means for determining the visualization effect include means for using a visualization engine.
  • In Example 50, the subject matter of any one or more of Examples 48-49 optionally include wherein the sensor data includes data from sensors of both of the pair of drum sticks and wherein the gesture includes movement of the pair of drum sticks in coordination with each other.
  • In Example 51, the subject matter of any one or more of Examples 48-50 optionally include wherein the means for causing the visualization effect to be output include means for sending the visualization effect to a virtual reality headset of a user controlling the pair of drum sticks to be displayed on the virtual reality headset.
  • In Example 52, the subject matter of any one or more of Examples 48-51 optionally include wherein the means for causing the audio effect to be output include means for sending the audio effect to a speaker to play the audio effect.
  • In Example 53, the subject matter of Example 52 optionally includes wherein the means for causing the visualization effect to be output include means for sending the visualization effect to a display to be displayed in coordination with the audio effect played by the speaker.
  • In Example 54, the subject matter of any one or more of Examples 48-53 optionally include means for receiving data from the sensor indicating an initial position of the drum stick, and wherein the means for recognizing the gesture include means for determining a final position of the drum stick.
  • In Example 55, the subject matter of Example 54 optionally includes wherein the drum sound is determined based on the initial position and the final position.
  • In Example 56, the subject matter of any one or more of Examples 48-55 optionally include wherein the visualization effect corresponding to the gesture and the audio effect including the drum sound corresponding to the gesture are determined based on an orientation of the drum stick identified in the sensor data.
  • In Example 57, the subject matter of any one or more of Examples 48-56 optionally include wherein the means for determining the visualization effect include means for determining the visualization effect based on a series of previously recognized gestures.
  • In Example 58, the subject matter of any one or more of Examples 48-57 optionally include means for receiving additional sensor data from a second sensor attached to an ankle or a foot of a user controlling the drum stick; means for recognizing a second gesture from the additional sensor data; means for determining, from the second gesture, a second audio effect including a second drum sound corresponding to the second gesture; and means for causing the second audio effect to be output with the visualization effect and the audio effect.
  • In Example 59, the subject matter of any one or more of Examples 48-58 optionally include means for receiving wearable sensor data from a plurality of wearable devices within a predetermined proximity of the drum stick; and means for modifying the visualization effect based on the wearable sensor data.
  • In Example 60, the subject matter of any one or more of Examples 48-59 optionally include wherein the visualization effect includes a lighting effect, and wherein the means for causing the visualization effect to be output include means for sending the lighting effect to a plurality of wearable devices within a predetermined proximity of the drum stick to be displayed at the plurality of wearable devices.
  • In Example 61, the subject matter of any one or more of Examples 48-60 optionally include wherein the gesture includes at least one of a linear movement, a tapping movement, a sweeping movement, a minimum acceleration, or a minimum deceleration.
  • In Example 62, the subject matter of any one or more of Examples 48-61 optionally include wherein the audio effect includes Multidimensional Polyphonic Expression instructions for a musical instrument digital interface (MIDI) player.
  • Example 63 is a virtual drum set system comprising: a pair of drum sticks each including: a sensor to provide data based on movement of the drum stick; and a transceiver to transmit the sensor data; a device including a processor to: recognize a gesture from the sensor data; and determine, from the gesture, a visualization effect corresponding to the gesture and an audio effect including a drum sound corresponding to the gesture; and a display device to display the visualization effect; and a speaker to play the audio effect.
  • In Example 64, the subject matter of Example 63 optionally includes wherein the device is a mobile device.
  • In Example 65, the subject matter of any one or more of Examples 63-64 optionally include wherein the device further includes a device transceiver to receive the sensor data.
  • In Example 66, the subject matter of any one or more of Examples 63-65 optionally include wherein the display device is a virtual reality headset and the visualization effect includes a virtual drum set.
  • In Example 67, the subject matter of any one or more of Examples 63-66 optionally include wherein the sensor includes a nine-axis sensor including a magnetometer, an accelerometer, and a gyroscope.
  • In Example 68, the subject matter of any one or more of Examples 63-67 optionally include wherein the speaker includes headphones.
  • In Example 69, the subject matter of any one or more of Examples 63-68 optionally include wherein one of the pair of drum sticks is a parent drum stick and the transceiver of the parent drum stick is configured to receive child sensor data from the other of the pair of drum sticks and wherein the transceiver of the parent drum stick is to send combined sensor data to the device.
  • In Example 70, the subject matter of any one or more of Examples 63-69 optionally include wherein the sensor data includes data from sensors of both of the pair of drum sticks and wherein the gesture includes movement of the pair of drum sticks in coordination with each other.
  • In Example 71, the subject matter of any one or more of Examples 63-70 optionally include wherein the processor is to send timing information to the display device and the speaker to coordinate displaying the visualization effect and playing the audio effect.
  • In Example 72, the subject matter of any one or more of Examples 63-71 optionally include wherein the visualization effect corresponding to the gesture and the audio effect including the drum sound corresponding to the gesture are determined based on an orientation of the drum stick identified in the sensor data.
  • In Example 73, the subject matter of any one or more of Examples 63-72 optionally include wherein to determine the visualization effect, the processor is to determine the visualization effect based on a series of previously recognized gestures.
  • In Example 74, the subject matter of any one or more of Examples 63-73 optionally include wherein the system further comprises an additional sensor attached to an ankle or a foot of a user controlling the drum stick; and wherein the processor is further to recognize a second gesture from additional sensor data of the additional sensor and to determine, from the second gesture, a second audio effect including a second drum sound corresponding to the second gesture.
  • In Example 75, the subject matter of Example 74 optionally includes wherein the speaker is to play the second audio effect.
  • In Example 76, the subject matter of any one or more of Examples 63-75 optionally include wherein the display device includes a plurality of wearable devices within a predetermined proximity of the drum stick, the visualization effect to be displayed at the plurality of wearable devices.
  • In Example 77, the subject matter of any one or more of Examples 63-76 optionally include wherein the gesture includes at least one of a linear movement, a tapping movement, a sweeping movement, a minimum acceleration, or a minimum deceleration.
  • In Example 78, the subject matter of any one or more of Examples 63-77 optionally include wherein the speaker is controlled by a musical instrument digital interface (MIDI) player and wherein the audio effect includes Multidimensional Polyphonic Expression instructions for use by the MIDI player.
  • Example 79 is a server in communication with a violin bow, the server comprising: a processor to: receive sensor data from a sensor of the violin bow, the sensor data based on movement of the violin bow; recognize a gesture from the sensor data; determine, from the gesture, a visualization effect corresponding to the gesture and an audio effect corresponding to the gesture; and cause the visualization effect and the audio effect to be output in response to the determination, the audio effect including a natural sound caused by the movement of the violin bow.
  • In Example 80, the subject matter of Example 79 optionally includes wherein to determine the visualization effect, the processor is to use a visualization engine.
  • In Example 81, the subject matter of any one or more of Examples 79-80 optionally include wherein to cause the visualization effect to be output, the processor is to send the visualization effect to a virtual reality headset of a user controlling the violin bow to be displayed on the virtual reality headset.
  • In Example 82, the subject matter of any one or more of Examples 79-81 optionally include wherein to cause the audio effect to be output, the processor is to send the audio effect to a speaker to play the audio effect.
  • In Example 83, the subject matter of Example 82 optionally includes wherein to cause the visualization effect to be output, the processor is to send the visualization effect to a display to be displayed in coordination with the audio effect played by the speaker.
  • In Example 84, the subject matter of any one or more of Examples 79-83 optionally include wherein the processor is further to receive data from the sensor indicating an initial position of the violin bow, and wherein to recognize the gesture, the processor is to determine a final position of the violin bow.
  • In Example 85, the subject matter of any one or more of Examples 79-84 optionally include wherein the visualization effect corresponding to the gesture and the audio effect corresponding to the gesture are determined based on an orientation of the violin bow identified in the sensor data.
  • In Example 86, the subject matter of any one or more of Examples 79-85 optionally include wherein to determine the visualization effect, the processor is to determine the visualization effect based on a series of previously recognized gestures.
  • In Example 87, the subject matter of any one or more of Examples 79-86 optionally include wherein the visualization effect includes a lighting effect, and wherein to cause the visualization effect to be output, the processor is to send the lighting effect to a plurality of wearable devices within a predetermined proximity of the violin bow to be displayed at the plurality of wearable devices.
  • In Example 88, the subject matter of any one or more of Examples 79-87 optionally include wherein the gesture includes at least one of a linear movement, a tapping movement, a sweeping movement, a minimum acceleration, or a minimum deceleration.
  • In Example 89, the subject matter of any one or more of Examples 79-88 optionally include wherein the audio effect includes Multidimensional Polyphonic Expression instructions for a musical instrument digital interface (MIDI) player.
  • Example 90 is a server in communication with a guitar pick, the server comprising: a processor to: receive sensor data from a sensor of the guitar pick, the sensor data based on movement of the guitar pick; recognize a gesture from the sensor data; determine, from the gesture, a visualization effect corresponding to the gesture and an audio effect corresponding to the gesture; and cause the visualization effect and the audio effect to be output in response to the determination, the audio effect including a natural sound caused by the movement of the guitar pick.
  • In Example 91, the subject matter of Example 90 optionally includes wherein to determine the visualization effect, the processor is to use a visualization engine.
  • In Example 92, the subject matter of any one or more of Examples 90-91 optionally include wherein to cause the visualization effect to be output, the processor is to send the visualization effect to a virtual reality headset of a user controlling the guitar pick to be displayed on the virtual reality headset.
  • In Example 93, the subject matter of any one or more of Examples 90-92 optionally include wherein to cause the audio effect to be output, the processor is to send the audio effect to a speaker to play the audio effect.
  • In Example 94, the subject matter of Example 93 optionally includes wherein to cause the visualization effect to be output, the processor is to send the visualization effect to a display to be displayed in coordination with the audio effect played by the speaker.
  • In Example 95, the subject matter of any one or more of Examples 90-94 optionally include wherein the processor is further to receive data from the sensor indicating an initial position of the guitar pick, and wherein to recognize the gesture, the processor is to determine a final position of the guitar pick.
  • In Example 96, the subject matter of any one or more of Examples 90-95 optionally include wherein the visualization effect corresponding to the gesture and the audio effect corresponding to the gesture are determined based on an orientation of the guitar pick identified in the sensor data.
  • In Example 97, the subject matter of any one or more of Examples 90-96 optionally include wherein to determine the visualization effect, the processor is to determine the visualization effect based on a series of previously recognized gestures.
  • In Example 98, the subject matter of any one or more of Examples 90-97 optionally include wherein the visualization effect includes a lighting effect, and wherein to cause the visualization effect to be output, the processor is to send the lighting effect to a plurality of wearable devices within a predetermined proximity of the guitar pick to be displayed at the plurality of wearable devices.
  • In Example 99, the subject matter of any one or more of Examples 90-98 optionally include wherein the gesture includes at least one of a linear movement, a tapping movement, a sweeping movement, a minimum acceleration, or a minimum deceleration.
  • In Example 100, the subject matter of any one or more of Examples 90-99 optionally include wherein the audio effect includes Multidimensional Polyphonic Expression instructions for a musical instrument digital interface (MIDI) player.
  • Example 101 is a server in communication with a conductor baton, the server comprising: a processor to: receive sensor data from a sensor of the conductor baton, the sensor data based on movement of the conductor baton; recognize a gesture from the sensor data; determine, from the gesture, a visualization effect corresponding to the gesture and an audio effect corresponding to the gesture; and cause the visualization effect and the audio effect to be output in response to the determination, the audio effect to be played with corresponding natural sounds directed by the movement of the conductor baton.
  • In Example 102, the subject matter of Example 101 optionally includes wherein to determine the visualization effect, the processor is to use a visualization engine.
  • In Example 103, the subject matter of any one or more of Examples 101-102 optionally include wherein to cause the visualization effect to be output, the processor is to send the visualization effect to a virtual reality headset of a user controlling the conductor baton to be displayed on the virtual reality headset.
  • In Example 104, the subject matter of any one or more of Examples 101-103 optionally include wherein to cause the audio effect to be output, the processor is to send the audio effect to a speaker to play the audio effect.
  • In Example 105, the subject matter of Example 104 optionally includes wherein to cause the visualization effect to be output, the processor is to send the visualization effect to a display to be displayed in coordination with the audio effect played by the speaker.
  • In Example 106, the subject matter of any one or more of Examples 101-105 optionally include wherein the processor is further to receive data from the sensor indicating an initial position of the conductor baton, and wherein to recognize the gesture, the processor is to determine a final position of the conductor baton.
  • In Example 107, the subject matter of any one or more of Examples 101-106 optionally include wherein the visualization effect corresponding to the gesture and the audio effect corresponding to the gesture are determined based on an orientation of the conductor baton identified in the sensor data.
  • In Example 108, the subject matter of any one or more of Examples 101-107 optionally include wherein to determine the visualization effect, the processor is to determine the visualization effect based on a series of previously recognized gestures.
  • In Example 109, the subject matter of any one or more of Examples 101-108 optionally include wherein the visualization effect includes a lighting effect, and wherein to cause the visualization effect to be output, the processor is to send the lighting effect to a plurality of wearable devices within a predetermined proximity of the conductor baton to be displayed at the plurality of wearable devices.
  • In Example 110, the subject matter of any one or more of Examples 101-109 optionally include wherein the gesture includes at least one of a linear movement, a tapping movement, a sweeping movement, a minimum acceleration, or a minimum deceleration.
  • In Example 111, the subject matter of any one or more of Examples 101-110 optionally include wherein the audio effect includes Multidimensional Polyphonic Expression instructions for a musical instrument digital interface (MIDI) player.
  • Example 112 is at least one machine-readable medium including instructions, which when executed by a machine, cause the machine to perform operations of any of the operations of Examples 1-111.
  • Example 113 is an apparatus comprising means for performing any of the operations of Examples 1-111.
  • Example 114 is a system to perform the operations of any of the Examples 1-111.
  • Example 115 is a method to perform the operations of any of the Examples 1-111.
  • Method examples described herein may be machine or computer-implemented at least in part. Some examples may include a computer-readable medium or machine-readable medium encoded with instructions operable to configure an electronic device to perform methods as described in the above examples. An implementation of such methods may include code, such as microcode, assembly language code, a higher-level language code, or the like. Such code may include computer readable instructions for performing various methods. The code may form portions of computer program products. Further, in an example, the code may be tangibly stored on one or more volatile, non-transitory, or non-volatile tangible computer-readable media, such as during execution or at other times. Examples of these tangible computer-readable media may include, but are not limited to, hard disks, removable magnetic disks, removable optical disks (e.g., compact disks and digital video disks), magnetic cassettes, memory cards or sticks, random access memories (RAMs), read only memories (ROMs), and the like.

Claims (25)

1. A server in communication with a pair of drum sticks, the server comprising:
a processor to:
receive sensor data from a sensor of at least one drum stick of the pair of drum sticks, the sensor data based on movement of the at least one drum stick;
recognize a gesture from the sensor data;
determine, from the gesture, a visualization effect corresponding to the gesture and an audio effect including a drum sound corresponding to the gesture; and
cause the visualization effect and the audio effect to be output in response to the determination, wherein the visualization effect includes a virtual drum set and is output to a display to be displayed in coordination with the audio effect.
2. The server of claim 1, wherein to determine the visualization effect, the processor is to use a visualization engine.
3. The server of claim 1, wherein the sensor data includes data from sensors of both of the pair of drum sticks and wherein the gesture includes movement of the pair of drum sticks in coordination with each other.
4. The server of claim 1, wherein to cause the visualization effect to be output, the processor is to send the visualization effect to a virtual reality headset of a user controlling the pair of drum sticks to be displayed on the virtual reality headset.
5. The server of claim 1, wherein to cause the audio effect to be output, the processor is to send the audio effect to a speaker to play the audio effect.
6. The server of claim 1, wherein the output to the display includes captured video of a person performing the movement with the at least one drumstick.
7. The server of claim 1, wherein the processor is further to receive data from the sensor indicating an initial position of the at least one drum stick, and wherein to recognize the gesture, the processor is to determine a final position of the at least one drum stick.
8. The server of claim 7, wherein the drum sound is determined based on the initial position and the final position.
9. The server of claim 1, wherein the visualization effect corresponding to the gesture and the audio effect including the drum sound corresponding to the gesture are determined based on an orientation of the at least one drum stick identified in the sensor data.
10. The server of claim 1, wherein to determine the visualization effect, the processor is to determine the visualization effect based on a series of previously recognized gestures.
11. The server of claim 1, wherein the processor is further to:
receive additional sensor data from a second sensor attached to an ankle or a foot of a user controlling the at least one drum stick;
recognize a second gesture from the additional sensor data;
determine, from the second gesture, a second audio effect including a second drum sound corresponding to the second gesture; and
cause the second audio effect to be output with the visualization effect and the audio effect.
12. The server of claim 1, wherein the processor is further to:
receive wearable sensor data from a plurality of wearable devices within a predetermined proximity of the at least one drum stick; and
modify the visualization effect based on the wearable sensor data.
13. The server of claim 1, wherein the visualization effect includes a lighting effect, and wherein to cause the visualization effect to be output, the processor is to send the lighting effect to a plurality of wearable devices within a predetermined proximity of the at least one drum stick to be displayed at the plurality of wearable devices.
14. The server of claim 1, wherein the gesture includes at least one of a linear movement, a tapping movement, a sweeping movement, a minimum acceleration, or a minimum deceleration.
15. The server of claim 1, wherein the audio effect includes Multidimensional Polyphonic Expression instructions for a musical instrument digital interface (MIDI) player.
16. A method for providing effects corresponding to movement of drum sticks, the method comprising:
receiving sensor data from a sensor of at least one drum stick of a pair of drum sticks, the sensor data based on movement of the at least one drum stick;
recognizing a gesture from the sensor data;
determining, from the gesture, a visualization effect corresponding to the gesture and an audio effect including a drum sound corresponding to the gesture; and
causing the visualization effect and the audio effect to be output in response to the determination, wherein the visualization effect includes a virtual drum set and is output to a display to be displayed in coordination with the audio effect.
17. The method of claim 16, wherein determining the visualization effect includes using a visualization engine.
18. The method of claim 16, wherein the sensor data includes data from sensors of both of the pair of drum sticks and wherein the gesture includes movement of the pair of drum sticks in coordination with each other.
19. The method of claim 16, wherein causing the visualization effect to be output includes sending the visualization effect to a virtual reality headset of a user controlling the pair of drum sticks to be displayed on the virtual reality headset.
20. At least one non-transitory machine-readable medium including instructions for providing effects corresponding to movement of drum sticks, which when executed by a machine, cause the machine to:
receive sensor data from a sensor of at least one drum stick of a pair of drum sticks, the sensor data based on movement of the at least one drum stick;
recognize a gesture from the sensor data;
determine, from the gesture, a visualization effect corresponding to the gesture and an audio effect including a drum sound corresponding to the gesture; and
cause the visualization effect and the audio effect to be output in response to the determination, wherein the visualization effect includes a virtual drum set and is output to a display to be displayed in coordination with the audio effect.
21. The at least one non-transitory machine-readable medium of claim 20, further comprising instructions to:
receive wearable sensor data from a plurality of wearable devices within a predetermined proximity of the at least one drum stick; and
modify the visualization effect based on the wearable sensor data.
22. The at least one non-transitory machine-readable medium of claim 20, wherein the visualization effect includes a lighting effect, and wherein the instructions to cause the visualization effect to be output include instructions to send the lighting effect to a plurality of wearable devices within a predetermined proximity of the at least one drum stick to be displayed at the plurality of wearable devices.
23. A virtual drum set system comprising:
a pair of drum sticks each including:
a sensor to provide data based on movement of a drum stick of the pair of drum sticks; and
a transceiver to transmit the sensor data;
a device including a processor to:
recognize a gesture from the sensor data; and
determine, from the gesture, a visualization effect corresponding to the gesture and an audio effect including a drum sound corresponding to the gesture; and
a display device to display the visualization effect, wherein the visualization effect includes a virtual drum set and is output to a display to be displayed in coordination with the audio effect; and
a speaker to play the audio effect.
24. The virtual drum set system of claim 23, wherein the display device is a virtual reality headset and the visualization effect includes a virtual drum set.
25. The virtual drum set system of claim 23, wherein the sensor includes a nine-axis sensor including a magnetometer, an accelerometer, and a gyroscope.
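The following non-limiting sketch illustrates one way the orientation of a drum stick might be estimated from the nine-axis sensor of claim 25 (accelerometer, gyroscope, and magnetometer) and used, in the spirit of claims 7-9, to choose a drum sound from the initial and final positions of a stroke. The axis conventions, filter constant, and zone boundaries are assumptions made only for the example.

    # Hypothetical illustration only: a complementary filter for stick attitude,
    # a tilt-compensated heading, and an assumed mapping from stroke positions
    # to drum sounds. Axis conventions and thresholds are assumptions.
    import math

    def estimate_pitch_roll(accel, gyro, prev, dt, alpha=0.98):
        """Blend integrated gyroscope rates with accelerometer tilt estimates."""
        ax, ay, az = accel
        gx, gy, _ = gyro
        accel_pitch = math.atan2(-ax, math.hypot(ay, az))
        accel_roll = math.atan2(ay, az)
        pitch = alpha * (prev[0] + gy * dt) + (1 - alpha) * accel_pitch
        roll = alpha * (prev[1] + gx * dt) + (1 - alpha) * accel_roll
        return pitch, roll

    def heading_from_magnetometer(mag, pitch, roll):
        """Tilt-compensated heading (yaw) from the magnetometer reading."""
        mx, my, mz = mag
        xh = mx * math.cos(pitch) + mz * math.sin(pitch)
        yh = (mx * math.sin(roll) * math.sin(pitch) + my * math.cos(roll)
              - mz * math.sin(roll) * math.cos(pitch))
        return math.atan2(-yh, xh)

    def drum_for_positions(initial_heading, final_heading):
        """Pick a drum sound from where the stroke started and where it ended."""
        swing = math.degrees(final_heading - initial_heading)
        if swing < -20:
            return "hi-hat"     # assumed zone to the player's left
        if swing > 20:
            return "floor tom"  # assumed zone to the player's right
        return "snare"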
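As a further non-limiting illustration of claim 15, a drum sound might be emitted as raw MIDI bytes with a per-note channel in the style of Multidimensional Polyphonic Expression, so that per-note expression such as pitch bend can accompany each hit. The note numbers, channel assignments, and bend values below are assumptions for the example.

    # Hypothetical sketch only: raw MIDI messages with per-note member channels.
    def mpe_note_on(note, velocity, member_channel):
        """Build a MIDI note-on message on a dedicated per-note member channel."""
        status = 0x90 | (member_channel & 0x0F)
        return bytes([status, note & 0x7F, velocity & 0x7F])

    def mpe_pitch_bend(member_channel, bend):
        """14-bit pitch bend (0..16383, 8192 = center) on the note's own channel."""
        status = 0xE0 | (member_channel & 0x0F)
        return bytes([status, bend & 0x7F, (bend >> 7) & 0x7F])

    # Example: a snare hit (General MIDI percussion note 38) whose strike strength
    # maps to velocity, with a small per-note bend carrying expressive nuance.
    messages = [mpe_note_on(note=38, velocity=112, member_channel=2),
                mpe_pitch_bend(member_channel=2, bend=8192 + 300)]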
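Finally, claims 13 and 22 contemplate sending a lighting effect to wearable devices within a predetermined proximity of the drum sticks. A minimal sketch of that broadcast follows, assuming a hypothetical Wearable object that exposes an estimated distance and a send method; the proximity value and payload format are likewise assumptions.

    # Hypothetical sketch only: the Wearable object, the proximity estimate, and
    # the payload format are assumptions made for the example.
    import json

    PROXIMITY_METERS = 10.0   # assumed value of the "predetermined proximity"

    def broadcast_lighting_effect(wearables, color, pulse_ms):
        """Send a lighting effect to wearable devices near the drum sticks."""
        payload = json.dumps({"type": "lighting", "color": color,
                              "pulse_ms": pulse_ms}).encode()
        for device in wearables:
            if device.estimated_distance_m <= PROXIMITY_METERS:
                device.send(payload)   # e.g. over a wireless link to the wearable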
US15/581,957 2017-04-28 2017-04-28 Sensor driven enhanced visualization and audio effects Active US10102835B1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US15/581,957 US10102835B1 (en) 2017-04-28 2017-04-28 Sensor driven enhanced visualization and audio effects

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
US15/581,957 US10102835B1 (en) 2017-04-28 2017-04-28 Sensor driven enhanced visualization and audio effects

Publications (2)

Publication Number Publication Date
US10102835B1 US10102835B1 (en) 2018-10-16
US20180315405A1 (en) 2018-11-01

Family

ID=63761309

Family Applications (1)

Application Number Title Priority Date Filing Date
US15/581,957 Active US10102835B1 (en) 2017-04-28 2017-04-28 Sensor driven enhanced visualization and audio effects

Country Status (1)

Country Link
US (1) US10102835B1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
IT202200002378A1 (en) * 2022-02-14 2023-08-14 Art Communication S R L DEVICE AND RELATED METHOD FOR LIVE INTERACTIVE ELECTRONICS BASED ON THE EVOLUTION OF THE CONDUCTOR'S BATON

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
FR3061797B1 (en) * 2017-01-11 2021-06-18 Jerome Dron EMULATION OF AT LEAST ONE SOUND OF A DRUM-KIT-TYPE PERCUSSION INSTRUMENT
EP3848103B1 (en) * 2018-01-08 2022-12-07 Kids II Hape Joint Venture Limited Children's toys with capacitive touch interactivity
US11688374B2 (en) 2019-12-04 2023-06-27 Nicholas J. Macias Motion/position-sensing responsive light-up musical instrument
GB2597462B (en) * 2020-07-21 2023-03-01 Rt Sixty Ltd Evaluating percussive performances
US20230221433A1 (en) * 2022-01-12 2023-07-13 Freedrum Studio AB System and a method for determining positions of sensor units

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102024140B (en) 2009-09-16 2012-07-11 深圳泰山在线科技有限公司 Drumbeating action identification method based on computer

Patent Citations (32)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4341140A (en) * 1980-01-31 1982-07-27 Casio Computer Co., Ltd. Automatic performing apparatus
US5290964A (en) * 1986-10-14 1994-03-01 Yamaha Corporation Musical tone control apparatus using a detector
US4968877A (en) * 1988-09-14 1990-11-06 Sensor Frame Corporation VideoHarp
US5414256A (en) * 1991-10-15 1995-05-09 Interactive Light, Inc. Apparatus for and method of controlling a device by sensing radiation having an emission space and a sensing space
US5475214A (en) * 1991-10-15 1995-12-12 Interactive Light, Inc. Musical sound effects controller having a radiated emission space
USRE37654E1 (en) * 1996-01-22 2002-04-16 Nicholas Longo Gesture synthesizer for electronic sound device
US6028594A (en) * 1996-06-04 2000-02-22 Alps Electric Co., Ltd. Coordinate input device depending on input speeds
US20080122786A1 (en) * 1997-08-22 2008-05-29 Pryor Timothy R Advanced video gaming methods for education and play using camera based inputs
US6492775B2 (en) * 1998-09-23 2002-12-10 Moshe Klotz Pre-fabricated stage incorporating light-actuated triggering means
US6222465B1 (en) * 1998-12-09 2001-04-24 Lucent Technologies Inc. Gesture-based computer interface
US20010035087A1 (en) * 2000-04-18 2001-11-01 Morton Subotnick Interactive music playback system utilizing gestures
US6388183B1 (en) * 2001-05-07 2002-05-14 Leh Labs, L.L.C. Virtual musical instruments with user selectable and controllable mapping of position input to sound output
US6960715B2 (en) * 2001-08-16 2005-11-01 Humanbeams, Inc. Music instrument system and methods
US20030159567A1 (en) * 2002-10-18 2003-08-28 Morton Subotnick Interactive music playback system utilizing gestures
US7723604B2 (en) * 2006-02-14 2010-05-25 Samsung Electronics Co., Ltd. Apparatus and method for generating musical tone according to motion
US20170103742A1 (en) * 2006-03-03 2017-04-13 Gregory A. Piccionelli Drumstick controller
US20100009746A1 (en) * 2008-07-14 2010-01-14 Raymond Jesse B Music video game with virtual drums
US8198526B2 (en) * 2009-04-13 2012-06-12 745 Llc Methods and apparatus for input devices for instruments and/or game controllers
US20150356957A1 (en) * 2009-07-30 2015-12-10 Gregory A. Piccionelli Drumstick controller
US20110102568A1 (en) * 2009-10-30 2011-05-05 Medical Motion, Llc Systems and methods for comprehensive human movement analysis
US20110239847A1 (en) * 2010-02-04 2011-10-06 Craig Small Electronic drumsticks system
US20120144979A1 (en) * 2010-12-09 2012-06-14 Microsoft Corporation Free-space gesture musical instrument digital interface (midi) controller
US20130118339A1 (en) * 2011-11-11 2013-05-16 Fictitious Capital Limited Computerized percussion instrument
US20130228062A1 (en) * 2012-03-02 2013-09-05 Casio Computer Co., Ltd. Musical performance device, method for controlling musical performance device and program storage medium
US20130239780A1 (en) * 2012-03-14 2013-09-19 Casio Computer Co., Ltd. Musical performance device, method for controlling musical performance device and program storage medium
US20130239785A1 (en) * 2012-03-15 2013-09-19 Casio Computer Co., Ltd. Musical performance device, method for controlling musical performance device and program storage medium
US9018508B2 (en) * 2012-04-02 2015-04-28 Casio Computer Co., Ltd. Playing apparatus, method, and program recording medium
US9360206B2 (en) * 2013-10-24 2016-06-07 Grover Musical Products, Inc. Illumination system for percussion instruments
US9720509B2 (en) * 2013-11-05 2017-08-01 Moff, Inc. Gesture detection system, gesture detection apparatus, and mobile communication terminal
US20160203806A1 (en) * 2015-01-08 2016-07-14 Muzik LLC Interactive instruments and other striking objects
US9799315B2 (en) * 2015-01-08 2017-10-24 Muzik, Llc Interactive instruments and other striking objects
US20170229103A1 (en) * 2016-02-05 2017-08-10 William R. Benner, Jr. Device For Reducing Vibration In Impact Tools And Associated Methods

Also Published As

Publication number Publication date
US10102835B1 (en) 2018-10-16

Similar Documents

Publication Publication Date Title
US10102835B1 (en) Sensor driven enhanced visualization and audio effects
CN103325363B (en) Music performance apparatus and method
US10234956B2 (en) Dynamic effects processing and communications for wearable devices
US8759659B2 (en) Musical performance device, method for controlling musical performance device and program storage medium
US11557269B2 (en) Information processing method
JP6728168B2 (en) Wearable voice mixing
JP2019507389A (en) Apparatus, system and method for generating music
CN103258529A (en) Performance method of electronic musical instrument and music
CN105389013A (en) Gesture-based virtual playing system
JP2022115956A (en) Information processing method, information processing device and program
JP2008134295A (en) Concert system
US20130239781A1 (en) Musical instrument, method and recording medium
Young et al. Aobachi: A new interface for Japanese drumming
US8784205B2 (en) Game device, method for controlling game device, program, and information storage medium
US20180137770A1 (en) Musical instrument indicator apparatus, system, and method to aid in learning to play musical instruments
US9508329B2 (en) Method for producing audio file and terminal device
CN102789712A (en) Laser marking musical instrument teaching system and laser marking musical instrument teaching method based on spherical ultrasonic motor
JP6888351B2 (en) Input device, speech synthesizer, input method, and program
US10606547B2 (en) Electronic device
JP7432127B2 (en) Information processing method, information processing system and program
JP7249859B2 (en) karaoke system
US20230062315A1 (en) Live venue performance sensor capture and visualization over game network
JP5109127B2 (en) Ensemble system
Suen et al. Mobile and sensor integration for increased interactivity and expandability in mobile gaming and virtual instruments
JP2016194629A (en) Plucked string instrument performance evaluation device, music performance device, and plucked string instrument performance evaluation program

Legal Events

Date Code Title Description
AS Assignment

Owner name: INTEL CORPORATION, CALIFORNIA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:SHAH, SAURIN;KAR, SWARNENDU;KRISHNAMURTHY, LAKSHMAN;AND OTHERS;SIGNING DATES FROM 20170707 TO 20180806;REEL/FRAME:046642/0663

STCF Information on status: patent grant

Free format text: PATENTED CASE

CC Certificate of correction
MAFP Maintenance fee payment

Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1551); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

Year of fee payment: 4