WO2014125033A1 - PDA with motion controlled user interface for music phrases selection - Google Patents

PDA with motion controlled user interface for music phrases selection

Info

Publication number
WO2014125033A1
WO2014125033A1 (PCT/EP2014/052836)
Authority
WO
WIPO (PCT)
Prior art keywords
mobile device
sound
axis
orientation
phrase
Prior art date
Application number
PCT/EP2014/052836
Other languages
French (fr)
Inventor
Stian HAUGE
Kjartan VESTVIK
Original Assignee
Rattlejam As
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Rattlejam As filed Critical Rattlejam As
Publication of WO2014125033A1 publication Critical patent/WO2014125033A1/en

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F1/00 Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F1/16 Constructional details or arrangements
    • G06F1/1613 Constructional details or arrangements for portable computers
    • G06F1/1633 Constructional details or arrangements of portable computers not specific to the type of enclosures covered by groups G06F1/1615 - G06F1/1626
    • G06F1/1684 Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F1/1635 - G06F1/1675
    • G06F1/1694 Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F1/1635 - G06F1/1675 the I/O peripheral being a single or a set of motion sensors for pointer control or gesture input obtained by sensing movements of the portable computer
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033 Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/0346 Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
    • G PHYSICS
    • G10 MUSICAL INSTRUMENTS; ACOUSTICS
    • G10H ELECTROPHONIC MUSICAL INSTRUMENTS; INSTRUMENTS IN WHICH THE TONES ARE GENERATED BY ELECTROMECHANICAL MEANS OR ELECTRONIC GENERATORS, OR IN WHICH THE TONES ARE SYNTHESISED FROM A DATA STORE
    • G10H1/00 Details of electrophonic musical instruments
    • G10H1/0008 Associated control or indicating means
    • G10H1/0025 Automatic or semi-automatic music composition, e.g. producing random music, applying rules from music theory or modifying a musical piece
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/10 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals
    • A63F2300/105 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals using inertial sensors, e.g. accelerometers, gyroscopes
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/80 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game specially adapted for executing a specific type of game
    • A63F2300/8047 Music games
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F2200/00 Indexing scheme relating to G06F1/04 - G06F1/32
    • G06F2200/16 Indexing scheme relating to G06F1/16 - G06F1/18
    • G06F2200/163 Indexing scheme relating to constructional details of the computer
    • G06F2200/1637 Sensing arrangement for detection of housing movement or orientation, e.g. for controlling scrolling or cursor movement on the display of an handheld computer
    • G PHYSICS
    • G10 MUSICAL INSTRUMENTS; ACOUSTICS
    • G10H ELECTROPHONIC MUSICAL INSTRUMENTS; INSTRUMENTS IN WHICH THE TONES ARE GENERATED BY ELECTROMECHANICAL MEANS OR ELECTRONIC GENERATORS, OR IN WHICH THE TONES ARE SYNTHESISED FROM A DATA STORE
    • G10H2210/00 Aspects or methods of musical processing having intrinsic musical character, i.e. involving musical theory or musical parameters or relying on musical knowledge, as applied in electrophonic musical tools or instruments
    • G10H2210/101 Music Composition or musical creation; Tools or processes therefor
    • G10H2210/151 Music Composition or musical creation; Tools or processes therefor using templates, i.e. incomplete musical sections, as a basis for composing
    • G PHYSICS
    • G10 MUSICAL INSTRUMENTS; ACOUSTICS
    • G10H ELECTROPHONIC MUSICAL INSTRUMENTS; INSTRUMENTS IN WHICH THE TONES ARE GENERATED BY ELECTROMECHANICAL MEANS OR ELECTRONIC GENERATORS, OR IN WHICH THE TONES ARE SYNTHESISED FROM A DATA STORE
    • G10H2220/00 Input/output interfacing specifically adapted for electrophonic musical tools or instruments
    • G10H2220/155 User input interfaces for electrophonic musical instruments
    • G10H2220/391 Angle sensing for musical purposes, using data from a gyroscope, gyrometer or other angular velocity or angular movement sensing device
    • G PHYSICS
    • G10 MUSICAL INSTRUMENTS; ACOUSTICS
    • G10H ELECTROPHONIC MUSICAL INSTRUMENTS; INSTRUMENTS IN WHICH THE TONES ARE GENERATED BY ELECTROMECHANICAL MEANS OR ELECTRONIC GENERATORS, OR IN WHICH THE TONES ARE SYNTHESISED FROM A DATA STORE
    • G10H2220/00 Input/output interfacing specifically adapted for electrophonic musical tools or instruments
    • G10H2220/155 User input interfaces for electrophonic musical instruments
    • G10H2220/395 Acceleration sensing or accelerometer use, e.g. 3D movement computation by integration of accelerometer data, angle sensing with respect to the vertical, i.e. gravity sensing.
    • G PHYSICS
    • G10 MUSICAL INSTRUMENTS; ACOUSTICS
    • G10H ELECTROPHONIC MUSICAL INSTRUMENTS; INSTRUMENTS IN WHICH THE TONES ARE GENERATED BY ELECTROMECHANICAL MEANS OR ELECTRONIC GENERATORS, OR IN WHICH THE TONES ARE SYNTHESISED FROM A DATA STORE
    • G10H2230/00 General physical, ergonomic or hardware implementation of electrophonic musical tools or instruments, e.g. shape or architecture
    • G10H2230/005 Device type or category
    • G10H2230/015 PDA [personal digital assistant] or palmtop computing devices used for musical purposes, e.g. portable music players, tablet computers, e-readers or smart phones in which mobile telephony functions need not be used
    • G PHYSICS
    • G10 MUSICAL INSTRUMENTS; ACOUSTICS
    • G10H ELECTROPHONIC MUSICAL INSTRUMENTS; INSTRUMENTS IN WHICH THE TONES ARE GENERATED BY ELECTROMECHANICAL MEANS OR ELECTRONIC GENERATORS, OR IN WHICH THE TONES ARE SYNTHESISED FROM A DATA STORE
    • G10H2240/00 Data organisation or data communication aspects, specifically adapted for electrophonic musical tools or instruments
    • G10H2240/171 Transmission of musical instrument data, control or status information; Transmission, remote access or control of music data for electrophonic musical instruments
    • G10H2240/175 Transmission of musical instrument data, control or status information; Transmission, remote access or control of music data for electrophonic musical instruments for jam sessions or musical collaboration through a network, e.g. for composition, ensemble playing or repeating; Compensation of network or internet delays therefor

Abstract

A method of controlling a user interface for a mobile device 2 having a visual display 3, the user interface being for playing pre-existing musical material via the mobile device 2; wherein the orientation of the mobile device about a first axis is used to select a sound phrase from a plurality of sound phrases associated with a music track from the pre-existing musical material, and wherein orientation of the mobile device 2 about a second axis is used to determine whether the sound phrase is to be played.

Description

PDA WITH MOTION CONTROLLED USER INTERFACE FOR MUSIC PHRASES SELECTION
The present invention relates to a motion-controlled user interface for a mobile device, the user interface being for playing sounds via the mobile device.
Applications comprising motion-controlled user interfaces for mobile devices (e.g. mobile telephones, PDAs, tablets) make use of existing gyroscopes and accelerometers provided in the mobile device in order to detect direction and motion. It is known to use motion control as an interface to allow the user to control games and other software on the mobile device. It is also known to use the motion of the mobile device to control music. For example, String Trio, provided by greySox Inc. via Apple iTunes, is a mobile device application which allows a user to simulate playing a violin, cello or viola. Tilting, swinging and sliding the mobile device, and varying the speed and strength of the movements, changes the sound produced by the application. Another example is iBaton, provided by Vanke Software Inc. via Apple iTunes, which allows a user to select a piece of music and wave the mobile device like a conductor's baton, changing the tempo of the music as desired.
According to a first aspect of the present invention, there is provided a method of controlling a user interface for a mobile device having a visual display, the user interface being for playing sounds via the mobile device; wherein in the method the orientation of the mobile device about a first axis is used to select a sound phrase from a plurality of sound phrases associated with a music track, and the orientation of the mobile device about a second axis is used to determine whether the sound phrase is to be played.
The invention also extends to a mobile device supporting the user interface, and to a related computer programme product.
According to a second aspect, there is provided a mobile device comprising: a visual display for displaying a visual output of a user interface; a speaker for playing sound output by the user interface; an accelerometer for measuring linear acceleration of the mobile device; a gyroscope for measuring angular rotation of the mobile device; and a processor, wherein the processor is configured to: receive information from the accelerometer and gyroscope to determine the orientation of the mobile device; select a sound phrase from a plurality of sound phrases associated with a music track depending on the orientation of the mobile device about a first axis; and determine whether the sound phrase is to be played depending on the orientation of the mobile device about a second axis.
According to a third aspect, there is provided a computer programme product for a mobile device, the computer programme product comprising instructions that, when executed, will configure the mobile device such that it provides a user interface operable in accordance with the method of the first aspect. The computer programme product may be a downloadable file.
The following preferable features of the present invention are applicable to all of the foregoing aspects.
As noted above, applications comprising motion-controlled user interfaces for mobile devices (e.g. mobile telephones, PDAs, tablets) make use of existing gyroscopes and accelerometers provided in the mobile device in order to detect direction and motion. The present invention utilises this information to allow a user to interact with previously recorded musical material by means of a combination of control movements and a breakdown of the musical material. This combination of features is not found in the prior art.
The music track may be any musical material that is separable into appropriate tracks and sound phrases. It may for example include the sound of a particular instrument or combination of instruments, or may include the recordings of bands or other musical ensembles. In preferred embodiments, known musical material is divided into small pieces (parts, fragments or sequences) such that the user can interact with these "building blocks", which form the sound phrases. The user interface gives the user access to these building blocks of the pre-existing musical material, such that the user can control which building block is played, and when.
As is well known in the art, musical material can be recorded in a multi-track format recording. In multi-tracking, multiple musical instruments (and vocals) can be recorded, either one at a time or simultaneously, onto individual tracks, so that the sounds thus recorded can be accessed, processed and manipulated individually to produce the desired results. References to a "track" herein may hence refer to a single track of a multi-track recording, although it should also be understood that the term "track" extends to any sound recordings/data representing sound recordings of an equivalent nature to the multi-track recording. The advantages of the sequence of movements used by the invention apply to any appropriate sound as the music track.
Preferably, each music track corresponds to a different instrument or vocal performance. Non-limiting examples of music tracks are: drum track, keyboard/piano track, guitar track, bass guitar track, saxophone track, vocal track, and so on. Each track has associated therewith a plurality of sound phrases. Each sound phrase may comprise a rhythmic pattern and at least one sample derived from the pre-existing musical material. For example, the different sound phrases may be the sounds of a single instrument played in a different way, using different notes, tones and/or chords.
The user may control one music track at a time. The music track is controlled by the orientation of the mobile device in 3D space, i.e. by moving the mobile device along or around two or more axes. Preferably, these axes are orthogonal.
Preferably, a graphical user interface displayed on the display of the mobile device reflects how the movement affects the music track.
The mobile device will generally be a rectangular cuboid, with a centre of gravity approximately defined by the point of intersection of three axes that pass through the central point of each face of the cuboid. These axes are referred to herein as the longitudinal, lateral and vertical axes. The three axes are defined in an equilibrium state where, in preferred embodiments, the mobile device is held flat by the user (i.e. horizontally) with the visual display facing upwards such that it is visible to the user. When held in this manner, one of the axes of the mobile device will be vertical (the vertical axis), and the other two (longitudinal and lateral) will be in the horizontal plane, orthogonal to one another and to the vertical axis. It should be clear that the axes are fixed in space and are invariant; they are defined with respect to the equilibrium position of the device, but do not move with the device.
The visual display of the mobile device will generally be defined by two longer sides and two shorter sides. Generally, information can be displayed either oriented in the direction of the longer sides (i.e. portrait orientation) or oriented in the direction of the shorter sides (i.e. landscape orientation). Whichever is the case, the longitudinal axis is preferably the axis oriented in the same direction as the orientation of the visual display. The lateral axis is defined orthogonally to the longitudinal axis. That is, if the mobile device of the preferred embodiment is held in portrait orientation, the longitudinal axis is parallel to the longer sides of the device and the lateral axis is parallel to the shorter sides. If the mobile device of the preferred embodiment is held in landscape orientation, the longitudinal axis is parallel to the shorter sides of the device and the lateral axis is parallel to the longer sides.
The axes may also be referred to as the pitch, roll and yaw axes. As used herein, these terms have the standard meanings known in the field of aircraft, for example. In this nomenclature, the yaw axis is the vertical axis, and the yaw of the mobile device is the angle about this axis. The roll axis is the longitudinal axis and the pitch axis is the lateral axis. Again, the pitch, yaw and roll axes are fixed in space and are invariant; they are defined with respect to the equilibrium position of the device, but do not move with the device.
In preferred embodiments, the first axis is the vertical axis (i.e. the yaw axis) and the second axis is either the longitudinal or lateral axis (i.e. the pitch or roll axis).
Thus, in one preferred embodiment, the orientation about the lateral axis (pitch) or the longitudinal axis (roll) is used to control whether the sound phrase is playing or not. The lateral axis is preferred as this is considered to provide a more ergonomic movement for the user's hand and allows for a comfortable movement for both left and right handed users. A specific threshold angle is preferably defined (for instance 45° up from horizontal position about the axis). Moving the device to one side of this angle (either above or below the angle) makes the sound phrase play, and moving the device to the other side of the threshold angle stops the sound phrase from playing. For example, in preferred embodiments, if the mobile device is held in an upright position, the sound phrase will not play. If the mobile device is held flat, the sound phrase will play. It is preferable that the sound phrase plays when the mobile device is held flat with the visual display facing upwards towards the user, and the sound phrase does not play when the phone is orientated such that the visual display is oriented vertically and not easily visible to the user.
The orientation about the lateral axis or longitudinal axis may also be used to determine the volume at which the sound phrase is played. For example, when the orientation of the mobile device passes the threshold angle the sound phrase may begin to play at a low volume, and as the mobile device is angled further past the threshold the volume may be increased.
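The tilt behaviour just described reduces to a comparison and a linear map. The following Objective-C (plain C) sketch shows one way to express it, assuming the pitch angle is available in degrees above the flat equilibrium position; the 45° threshold echoes the example above, while the function names and the linear volume ramp are illustrative assumptions, not taken from the source.

    #include <stdbool.h>

    // Illustrative tilt helpers; pitchDeg is the tilt up from the flat
    // (horizontal) equilibrium position, in degrees.
    static const double kThresholdDeg = 45.0; // example threshold from the text

    // The phrase plays while the device is held flat, i.e. below the
    // threshold angle, and stops once tilted upright past it.
    static bool phraseShouldPlay(double pitchDeg) {
        return pitchDeg < kThresholdDeg;
    }

    // Optional tilt-to-volume mapping: quiet just inside the threshold,
    // rising towards full volume as the device is angled back to flat.
    static double phraseVolume(double pitchDeg) {
        if (pitchDeg >= kThresholdDeg) return 0.0; // past threshold: muted
        if (pitchDeg <= 0.0) return 1.0;           // flat: full volume
        return 1.0 - (pitchDeg / kThresholdDeg);   // linear in between
    }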
The sound phrases of each music track are preferably made up of a rhythmic pattern and one or several samples triggered by the rhythmic pattern. The rhythmic pattern itself is synchronized with an underlying sequencer. The sequencer ensures that the sound phrase is always playing in time with the beat and is synchronized with other users in a jamming session. For example, the sequencer may count from 0 to 16 repeatedly (a one-bar loop). An exemplary rhythmic pattern starts samples on the sequencer's counts of 4, 8 and 14, thus triggering three samples during one bar. To ensure that the sound phrase is played in time with the beat and is synchronized with any other users in the jamming session, a delay may occur from the moment the user changes the sound phrase to the moment the first sample starts playing.
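The sequencer logic can likewise be sketched. The patent gives no implementation, so the structure below is an assumption: it reads "count from 0 to 16" as a 16-step bar (counts 0 to 15), and it defers a queued phrase change until the next hit of the rhythmic pattern, which is exactly where the delay described above comes from; all names are illustrative.

    #include <stdbool.h>

    #define STEPS_PER_BAR 16

    typedef struct {
        bool hits[STEPS_PER_BAR]; // rhythmic pattern: which counts trigger a sample
        int  currentPhrase;       // phrase currently looping
        int  pendingPhrase;       // phrase queued by a yaw change, -1 if none
    } Sequencer;

    // Called once per sequencer count, driven by a clock shared across the
    // jam session so that every device stays on the same beat.
    static void sequencerTick(Sequencer *seq, int count) {
        if (!seq->hits[count]) return;
        // A queued phrase change takes effect only on a beat of the
        // pattern; until that beat arrives, the user perceives a delay.
        if (seq->pendingPhrase >= 0) {
            seq->currentPhrase = seq->pendingPhrase;
            seq->pendingPhrase = -1;
        }
        // triggerSample(seq->currentPhrase, count); // hypothetical audio call
    }

Setting hits[4], hits[8] and hits[14] reproduces the exemplary pattern: three samples fire during each bar.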
Optionally, each sound phrase represents samples played with one specific chord. The orientation of the device about the vertical axis (yaw) is preferably used to control the choice of what sound phrase of the selected music track is played. The available sound phrases may be evenly distributed along a virtual 180° arc about the vertical axis. In one example, the music track has six sound phrases associated with it, and each phrase is distributed about the vertical axis (yaw) in a sector bounded by a 30° arc. Each music track available via the interface could have the same number of sound phrases associated with it, or there may be a different number of sound phrases for different music tracks. Of course, each music track may have more or fewer than six sound phrases associated with it. When the device is oriented such that it is pointed within one of these sectors, the corresponding sound phrase will play. Preferably, whilst the device remains oriented in the same sector/arc, the sound phrase will continue to play in a loop. The default position is the middle of the virtual arc (i.e. 90° if the arc is 180°).
When the device is moved from one arc/sector to another, the playing sound phrase will stop and the sound phrase corresponding to the new arc will start playing. The moment when this sound phrase will start may be determined by the rhythmic pattern for this sound phrase such that the sound phrase may start playing on the next occurring beat of the pattern.
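The yaw-to-sector mapping implied by the last two paragraphs is simple arithmetic. As a minimal sketch, assuming angles in degrees, six sectors as in the example, and that the heading captured at track selection becomes the middle of the 180° arc (names are illustrative):

    // Map the current yaw to a sector index (0..sectorCount-1) on a 180°
    // arc centred on arcCenterDeg, the heading stored at track selection.
    static int sectorForYaw(double yawDeg, double arcCenterDeg, int sectorCount) {
        const double arc = 180.0;
        // The stored heading maps to the middle of the arc (90°); turning
        // either way moves the position towards the ends of the arc.
        double pos = (yawDeg - arcCenterDeg) + arc / 2.0;
        if (pos < 0.0)  pos = 0.0;          // clamp at the ends of the arc
        if (pos >= arc) pos = arc - 1e-6;
        return (int)(pos / (arc / sectorCount)); // e.g. 30° per sector for 6
    }

When the returned index differs from that of the previous call, the new phrase would be queued (the pendingPhrase of the sequencer sketch above) so that it starts on the next beat of its pattern; resetting the arc on track selection then amounts to storing the current yaw as arcCenterDeg.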
The distribution of sound phrases about the vertical axis (yaw) may be shown in a graphical user interface on the display of the mobile device, for example as an arc with markings showing how the sound phrases are separated. Preferably, this graphical representation rotates as the device is rotated, indicating how the device is moved within the 180° arc, and a "needle" indicates the current position of the device within this arc.
Optionally, the state of the music track, i.e. active (playing) or passive (not playing) is reflected in the graphical user interface of the application, displayed on the display of the mobile device. For example, a graphical object may be greyed out or ghosted to indicate a passive state (i.e. the music track is not playing), and coloured or highlighted to indicate an active state (i.e. the music track is playing).
In preferred embodiments, each time the user selects a new music track, the position of the sound phrases about the vertical axis is reset. In this way, the current position of the device becomes the default middle position (i.e. 90° within a 180° arc) of the new music track.
The sound phrases for the music tracks are preferably structured in a way that makes it possible for a plurality of users to recreate together a certain part of the original recording, which might be a recording of a band made up of different instruments and vocals. For a plurality of music tracks, the sound phrases in the corresponding arc may be selected so that they belong together in a musical way; for example they are either parts of one musical motif or phrase, or they play in the same chord. Thus it may be the case that, to recreate a part of the original recording, each user should play a sound phrase from the same sector simultaneously. Alternatively or in addition, to recreate a part of the original recording, users may play sound phrases from different sectors if the sound phrases in different sectors nevertheless belong together. For example, a bass track might have a sector containing phrases within one chord, whereas the guitar track might have two different phrases, in two sectors, both belonging within the same chord and thus belonging together with the single bass sector.
In some preferred embodiments, multiple users can connect with each other and aggregated data from all the connected devices are sent to one master device. In this way, data from each device may be synchronized and merged into a single jam session. The aggregated data comprises audio data, and can also comprise video data. The video and audio data from a jam session can be merged on one device and converted to a music video recorded from all connected devices.
Optionally, a reference template/matrix is provided for each different pre-existing musical recording to give clues to the users as to what music tracks and associated sound phrases should be played at a given time. Clues are given to guide users, and may be provided via the display, audibly or by vibration of the mobile device. The users' jam session may be compared against the reference template/matrix in order to assess how accurately the users recreated the original song. Based on this assessment, a score may be provided to the users.
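The comparison and scoring rule are left open by the text. One possible reading, shown here purely as an illustrative assumption, is a template listing the expected (track, phrase) pair per bar, scored as the fraction of bars the users matched:

    typedef struct {
        int track;  // which music track, e.g. bass, guitar, vocals
        int phrase; // which sound phrase of that track
    } PhraseChoice;

    // Hypothetical scoring rule: the fraction of bars (0.0 to 1.0) in which
    // the users played exactly the phrase the reference template expects.
    static double scoreJam(const PhraseChoice *expected,
                           const PhraseChoice *played, int numBars) {
        if (numBars <= 0) return 0.0;
        int matches = 0;
        for (int bar = 0; bar < numBars; bar++) {
            if (expected[bar].track == played[bar].track &&
                expected[bar].phrase == played[bar].phrase) {
                matches++;
            }
        }
        return (double)matches / numBars;
    }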
The interface may allow for the user to select a different music track by means of motion of the mobile device or by interaction with an input device on the mobile device, such as the buttons or a touch screen. In one preferred embodiment the device has a touch screen and a different music track is selected by a swiping movement on the touch screen.
Suitable programming tools and languages can be used to create software to implement the invention and preferred embodiments, for example by means of a downloadable application or "app" for the mobile device. Some non-limiting examples of programming languages that could be used are: C, Objective-C, C++, and Pure Data. Similarly, some non-limiting examples of programming tools are Apple Xcode and Pure Data. Where the mobile device is an Apple product, the Core Motion framework of Apple's iOS may be used to access the orientation of the device.
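On an Apple device, reading the orientation via the Core Motion framework mentioned above might look like the following Objective-C sketch. CMMotionManager and its device-motion API are genuine iOS interfaces; the update rate is an arbitrary choice, and the calls named in the handler refer to the hypothetical helpers sketched earlier.

    #import <CoreMotion/CoreMotion.h>
    #include <math.h>

    // Start listening for orientation updates; a minimal sketch.
    static CMMotionManager *startMotionUpdates(void) {
        CMMotionManager *mgr = [[CMMotionManager alloc] init];
        mgr.deviceMotionUpdateInterval = 1.0 / 30.0; // 30 Hz, an arbitrary rate
        [mgr startDeviceMotionUpdatesToQueue:[NSOperationQueue mainQueue]
                                 withHandler:^(CMDeviceMotion *motion, NSError *error) {
            if (motion == nil) return;
            // Attitude angles arrive in radians; the sketches above use degrees.
            double pitchDeg = motion.attitude.pitch * 180.0 / M_PI;
            double yawDeg   = motion.attitude.yaw   * 180.0 / M_PI;
            // e.g. phraseShouldPlay(pitchDeg); phraseVolume(pitchDeg);
            //      sectorForYaw(yawDeg, arcCenterDeg, 6);
            (void)pitchDeg; (void)yawDeg;
        }];
        return mgr; // keep a strong reference while updates are needed
    }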
A preferred embodiment of the invention will now be described by way of example only and with reference to the accompanying drawing in which: Figure 1 shows a mobile device and example axes of movement.
As shown in Figure 1, a user holds a mobile device 2 in their hand 1. The mobile device 2 comprises a screen 3, i.e. a visual display capable of displaying the output of the user interface. In the example shown, the user holds the mobile device 2 in the portrait orientation, such that a longitudinal axis is defined along the length (longer sides) of the mobile device 2, and a lateral axis is defined along the width (shorter sides).
The user first selects a piece of pre-existing musical material to interact with. The user can select a piece to play by selection from a menu or other selection tool of a user interface of the mobile device. In the example shown in Figure 1, the mobile device 2 has a touch screen 3 that the user can use to select a piece of pre-existing musical material. The user can then select a music track from those available for that material. The music track is selected in a similar way to the selection of the piece of pre-existing musical material.
On selection of a music track, the user is presented with a graphical user interface on the screen 3 showing the available sound phrases associated with the music track. In this example, the music track has six sound phrases associated with it. The sound phrases are evenly distributed along a virtual 180° arc about the vertical axis, such that each sound phrase is within a sector bounded by a 30° arc. The default position is the middle of the virtual arc (i.e. 90°). The 180° arc and division into sectors are displayed in the graphical user interface, along with an indication of the associated sound phrases, such as a name of the sound phrase. A needle points to the selected arc/sector and the graphical representation rotates as the mobile device 2 is rotated.
When the user orients the mobile device 2 such that it is pointed within a given sector, the corresponding sound phrase will be available to play. Whilst the mobile device 2 remains oriented in the same sector/arc, the sound phrase can continue to play in a loop. When the user rotates the mobile device 2 about the vertical axis, the orientation of the device 2 changes from pointing to one arc/sector to another. When that happens, the playing sound phrase will stop and the sound phrase corresponding to the new arc/sector will start playing.
If the user selects a new music track, the position of the sound phrases about the vertical axis is reset. In this way, the current position of the device 2 becomes the middle position (i.e. 90°) within the 180° arc of the new music track.
To change the music track/sound phrase from an active state (playing) to a passive state (not playing, or muted), or vice versa, the user changes the orientation of the mobile device 2 relative to the lateral axis. That is, the user tilts the phone upward above a threshold angle to stop the sound phrase playing, and tilts the phone to a horizontal position to make the sound phrase play.
The state of the music track/sound phrase, i.e. active (playing) or passive (not playing) is reflected in the graphical user interface of the application, shown on the display 3 of the mobile device 2. In this example, a graphical object may be greyed out or ghosted to indicate a passive state (i.e. the music track is not playing), and coloured or highlighted to indicate an active state (i.e. the music track is playing).
The user may receive hints from the user interface regarding which phrase should be played, and at what time. These hints may be in the form of audio or visual clues, or by vibration of the mobile device.

Claims

1. A method of controlling a user interface for a mobile device having a visual display, the user interface being for playing sounds via the mobile device; wherein in the method the orientation of the mobile device about a first axis is used to select a sound phrase from a plurality of sound phrases associated with a music track, and the orientation of the mobile device about a second axis is used to determine whether the sound phrase is to be played.
2. The method of claim 1, wherein the first axis is a vertical axis and the second axis is a longitudinal or lateral axis.
3. The method of claim 1 or 2, wherein to determine whether the sound phrase is to be played the orientation about the second axis is compared with a predetermined threshold angle; and preferably wherein the threshold angle is 45° to horizontal.
4. The method of claim 1, 2 or 3, wherein the orientation about the first axis is divided into evenly distributed sectors along an arc, each sector corresponding to selection of a different sound phrase, and preferably wherein the arc is a 180° arc.
5. The method of claim 4, wherein the sound phrase is played in a loop while the orientation of the device about the first axis remains within the same sector.
6. The method of claim 4 or 5, wherein when the orientation of the device about the first axis moves from a first sector to a second sector, a first sound phrase associated with the first sector stops playing and a second sound phrase associated with the second sector starts playing.
7. The method of claim 4, 5 or 6, comprising displaying, on the visual display, the sectors and an indication of the sound phrase associated with each sector.
8. The method of any preceding claim, wherein when a new music track is selected, the orientation about the first axis is reset such that a current orientation of the device becomes a middle position of the arc of the new music track.
9. The method of any preceding claim, comprising displaying whether the sound phrase is being played or not on the visual display.
10. The method of any preceding claim, wherein each sound phrase comprises a rhythmic pattern and one or several samples played by that music track, preferably wherein the sample is played with one chord.
11. The method of claim 10, wherein the sound phrases are derived from a pre-existing recorded piece of music.
12. The method of claim 11, wherein a plurality of users can recreate the pre-existing recorded piece of music by each playing one of a plurality of music tracks and sound phrases associated with the pre-existing recorded piece of music at the same time.
13. The method of claim 12 wherein the multiple users can connect each of their devices with each other and aggregated data from all the connected devices are sent to a device that is designated as a master device.
14. The method of any preceding claim wherein a reference template is provided for each different pre-existing musical recording to give clues to the users as to what music tracks and associated sound phrases should be played at a given time; and preferably wherein clues may be provided via the visual display, audibly or by vibration of the mobile device.
15. A mobile device comprising: a visual display for displaying a visual output of a user interface; a speaker for playing sound output by the user interface; an accelerometer for measuring linear acceleration of the mobile device; a gyroscope for measuring angular rotation of the mobile device; and a processor;
wherein the processor is configured to: receive information from the accelerometer and gyroscope to determine the orientation of the mobile device; select a sound phrase from a plurality of sound phrases associated with a music track depending on the orientation of the mobile device about a first axis; and determine whether the sound phrase is to be played depending on the orientation of the mobile device about a second axis.
16. An apparatus as claimed in claim 15 wherein the processor is configured to carry out the method of any of claims 1 to 14.
17. A computer programme product for a mobile device, the computer programme product comprising instructions that, when executed, will configure the mobile device such that it provides a user interface operable in accordance with the method of any of claims 1 to 14.
PCT/EP2014/052836 2013-02-13 2014-02-13 Pda with motion controlled user interface for music phrases selection WO2014125033A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
GB1302506.9A GB2510825A (en) 2013-02-13 2013-02-13 Motion controlled user interface
GB1302506.9 2013-02-13

Publications (1)

Publication Number Publication Date
WO2014125033A1 (en)

Family

ID=47999035

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2014/052836 WO2014125033A1 (en) 2013-02-13 2014-02-13 Pda with motion controlled user interface for music phrases selection

Country Status (2)

Country Link
GB (1) GB2510825A (en)
WO (1) WO2014125033A1 (en)

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE10233608A1 (en) * 2002-07-24 2004-02-12 Siemens Ag Input device for a terminal
US20090027338A1 (en) * 2007-07-24 2009-01-29 Georgia Tech Research Corporation Gestural Generation, Sequencing and Recording of Music on Mobile Devices
US20100245234A1 (en) * 2009-03-31 2010-09-30 Motorola, Inc. Portable Electronic Device with Low Dexterity Requirement Input Means

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
APPLE CORP: "GarageBand Getting Started, GarageBand at a Glance; Using Apple Loops; Working in the Timeline; Working with Real Instruments; Keyboard Shortcuts; Connecting Music Equipment To Your Computer", INTERNET CITATION, 2 January 2005 (2005-01-02), pages 10 - 13,27, XP002402773, Retrieved from the Internet <URL:http://manuals.info.apple.com/en_US/GarageBand2_GettingStarted.pdf> [retrieved on 20061011] *

Also Published As

Publication number Publication date
GB2510825A (en) 2014-08-20
GB201302506D0 (en) 2013-03-27

Similar Documents

Publication Publication Date Title
JP5890968B2 (en) Game device, game program
US20150103019A1 (en) Methods and Devices and Systems for Positioning Input Devices and Creating Control
US20070221046A1 (en) Music playing apparatus, storage medium storing a music playing control program and music playing control method
KR100708411B1 (en) Apparatus and method for analyzing movement of portable production
US11557269B2 (en) Information processing method
US9898249B2 (en) System and methods for simulating real-time multisensory output
EP2786371A2 (en) Determining the characteristic of a played chord on a virtual instrument
JP2011011008A (en) Gaming device, game processing method and program
US9861892B2 (en) Music game which changes sound based on the quality of players input
GB2475339A (en) Optical bowing sensor for emulation of bowed stringed musical instruments
Miller et al. Wiiolin: a Virtual Instrument Using the Wii Remote.
CN112883223A (en) Audio display method and device, electronic equipment and computer storage medium
JP6154597B2 (en) Information processing program, information processing apparatus, information processing system, and information processing method
JP6411412B2 (en) Program, game apparatus, and game progress method
WO2014125033A1 (en) Pda with motion controlled user interface for music phrases selection
JP5510207B2 (en) Music editing apparatus and program
WO2014174621A1 (en) Recording medium, gaming apparatus and game progress method
US20080223199A1 (en) Instant Rehearseless Conducting
Dolhansky et al. Designing an expressive virtual percussion instrument
US8912420B2 (en) Enhancing music
JP5399831B2 (en) Music game system, computer program thereof, and method of generating sound effect data
Dannenberg et al. Human-computer music performance: From synchronized accompaniment to musical partner
JP2017038957A (en) Game device and program
Migneco et al. An audio processing library for game development in Flash
JP2019025346A (en) Program, game device, and game progress method

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 14704164

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 14704164

Country of ref document: EP

Kind code of ref document: A1