WO2021069630A1 - Digital audio workstation - Google Patents

Digital audio workstation

Info

Publication number
WO2021069630A1
Authority
WO
WIPO (PCT)
Prior art keywords
user interface
track
group
volume
audio
Application number
PCT/EP2020/078346
Other languages
French (fr)
Inventor
Ryan STABLES
Nicholas JILLINGS
Brecht DE MAN
Sean ENDERBY
Original Assignee
Semantic Audio Limited
Application filed by Semantic Audio Limited
Publication of WO2021069630A1

Classifications

    • G PHYSICS
    • G10 MUSICAL INSTRUMENTS; ACOUSTICS
    • G10H ELECTROPHONIC MUSICAL INSTRUMENTS; INSTRUMENTS IN WHICH THE TONES ARE GENERATED BY ELECTROMECHANICAL MEANS OR ELECTRONIC GENERATORS, OR IN WHICH THE TONES ARE SYNTHESISED FROM A DATA STORE
    • G10H1/00 Details of electrophonic musical instruments
    • G10H1/46 Volume control
    • G PHYSICS
    • G10 MUSICAL INSTRUMENTS; ACOUSTICS
    • G10H ELECTROPHONIC MUSICAL INSTRUMENTS; INSTRUMENTS IN WHICH THE TONES ARE GENERATED BY ELECTROMECHANICAL MEANS OR ELECTRONIC GENERATORS, OR IN WHICH THE TONES ARE SYNTHESISED FROM A DATA STORE
    • G10H1/00 Details of electrophonic musical instruments
    • G10H1/02 Means for controlling the tone frequencies, e.g. attack or decay; Means for producing special musical effects, e.g. vibratos or glissandos
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04H BROADCAST COMMUNICATION
    • H04H60/00 Arrangements for broadcast applications with a direct linking to broadcast information or broadcast space-time; Broadcast-related systems
    • H04H60/02 Arrangements for generating broadcast information; Arrangements for generating broadcast-related information with a direct linking to broadcast information or to broadcast space-time; Arrangements for simultaneous generation of broadcast information and broadcast-related information
    • H04H60/04 Studio equipment; Interconnection of studios
    • G PHYSICS
    • G10 MUSICAL INSTRUMENTS; ACOUSTICS
    • G10H ELECTROPHONIC MUSICAL INSTRUMENTS; INSTRUMENTS IN WHICH THE TONES ARE GENERATED BY ELECTROMECHANICAL MEANS OR ELECTRONIC GENERATORS, OR IN WHICH THE TONES ARE SYNTHESISED FROM A DATA STORE
    • G10H2220/00 Input/output interfacing specifically adapted for electrophonic musical tools or instruments
    • G10H2220/091 Graphical user interface [GUI] specifically adapted for electrophonic musical instruments, e.g. interactive musical displays, musical instrument icons or menus; Details of user interactions therewith
    • G PHYSICS
    • G10 MUSICAL INSTRUMENTS; ACOUSTICS
    • G10H ELECTROPHONIC MUSICAL INSTRUMENTS; INSTRUMENTS IN WHICH THE TONES ARE GENERATED BY ELECTROMECHANICAL MEANS OR ELECTRONIC GENERATORS, OR IN WHICH THE TONES ARE SYNTHESISED FROM A DATA STORE
    • G10H2220/00 Input/output interfacing specifically adapted for electrophonic musical tools or instruments
    • G10H2220/091 Graphical user interface [GUI] specifically adapted for electrophonic musical instruments, e.g. interactive musical displays, musical instrument icons or menus; Details of user interactions therewith
    • G10H2220/101 Graphical user interface [GUI] specifically adapted for electrophonic musical instruments, e.g. interactive musical displays, musical instrument icons or menus; Details of user interactions therewith for graphical creation, edition or control of musical data or parameters
    • G PHYSICS
    • G10 MUSICAL INSTRUMENTS; ACOUSTICS
    • G10H ELECTROPHONIC MUSICAL INSTRUMENTS; INSTRUMENTS IN WHICH THE TONES ARE GENERATED BY ELECTROMECHANICAL MEANS OR ELECTRONIC GENERATORS, OR IN WHICH THE TONES ARE SYNTHESISED FROM A DATA STORE
    • G10H2220/00 Input/output interfacing specifically adapted for electrophonic musical tools or instruments
    • G10H2220/091 Graphical user interface [GUI] specifically adapted for electrophonic musical instruments, e.g. interactive musical displays, musical instrument icons or menus; Details of user interactions therewith
    • G10H2220/101 Graphical user interface [GUI] specifically adapted for electrophonic musical instruments, e.g. interactive musical displays, musical instrument icons or menus; Details of user interactions therewith for graphical creation, edition or control of musical data or parameters
    • G10H2220/106 Graphical user interface [GUI] specifically adapted for electrophonic musical instruments, e.g. interactive musical displays, musical instrument icons or menus; Details of user interactions therewith for graphical creation, edition or control of musical data or parameters using icons, e.g. selecting, moving or linking icons, on-screen symbols, screen regions or segments representing musical elements or parameters
    • G PHYSICS
    • G10 MUSICAL INSTRUMENTS; ACOUSTICS
    • G10H ELECTROPHONIC MUSICAL INSTRUMENTS; INSTRUMENTS IN WHICH THE TONES ARE GENERATED BY ELECTROMECHANICAL MEANS OR ELECTRONIC GENERATORS, OR IN WHICH THE TONES ARE SYNTHESISED FROM A DATA STORE
    • G10H2220/00 Input/output interfacing specifically adapted for electrophonic musical tools or instruments
    • G10H2220/091 Graphical user interface [GUI] specifically adapted for electrophonic musical instruments, e.g. interactive musical displays, musical instrument icons or menus; Details of user interactions therewith
    • G10H2220/101 Graphical user interface [GUI] specifically adapted for electrophonic musical instruments, e.g. interactive musical displays, musical instrument icons or menus; Details of user interactions therewith for graphical creation, edition or control of musical data or parameters
    • G10H2220/106 Graphical user interface [GUI] specifically adapted for electrophonic musical instruments, e.g. interactive musical displays, musical instrument icons or menus; Details of user interactions therewith for graphical creation, edition or control of musical data or parameters using icons, e.g. selecting, moving or linking icons, on-screen symbols, screen regions or segments representing musical elements or parameters
    • G10H2220/111 Graphical user interface [GUI] specifically adapted for electrophonic musical instruments, e.g. interactive musical displays, musical instrument icons or menus; Details of user interactions therewith for graphical creation, edition or control of musical data or parameters using icons, e.g. selecting, moving or linking icons, on-screen symbols, screen regions or segments representing musical elements or parameters for graphical orchestra or soundstage control, e.g. on-screen selection or positioning of instruments in a virtual orchestra, using movable or selectable musical instrument icons
    • G PHYSICS
    • G10 MUSICAL INSTRUMENTS; ACOUSTICS
    • G10H ELECTROPHONIC MUSICAL INSTRUMENTS; INSTRUMENTS IN WHICH THE TONES ARE GENERATED BY ELECTROMECHANICAL MEANS OR ELECTRONIC GENERATORS, OR IN WHICH THE TONES ARE SYNTHESISED FROM A DATA STORE
    • G10H2220/00 Input/output interfacing specifically adapted for electrophonic musical tools or instruments
    • G10H2220/091 Graphical user interface [GUI] specifically adapted for electrophonic musical instruments, e.g. interactive musical displays, musical instrument icons or menus; Details of user interactions therewith
    • G10H2220/101 Graphical user interface [GUI] specifically adapted for electrophonic musical instruments, e.g. interactive musical displays, musical instrument icons or menus; Details of user interactions therewith for graphical creation, edition or control of musical data or parameters
    • G10H2220/116 Graphical user interface [GUI] specifically adapted for electrophonic musical instruments, e.g. interactive musical displays, musical instrument icons or menus; Details of user interactions therewith for graphical creation, edition or control of musical data or parameters for graphical editing of sound parameters or waveforms, e.g. by graphical interactive control of timbre, partials or envelope
    • G PHYSICS
    • G10 MUSICAL INSTRUMENTS; ACOUSTICS
    • G10H ELECTROPHONIC MUSICAL INSTRUMENTS; INSTRUMENTS IN WHICH THE TONES ARE GENERATED BY ELECTROMECHANICAL MEANS OR ELECTRONIC GENERATORS, OR IN WHICH THE TONES ARE SYNTHESISED FROM A DATA STORE
    • G10H2220/00 Input/output interfacing specifically adapted for electrophonic musical tools or instruments
    • G10H2220/091 Graphical user interface [GUI] specifically adapted for electrophonic musical instruments, e.g. interactive musical displays, musical instrument icons or menus; Details of user interactions therewith
    • G10H2220/101 Graphical user interface [GUI] specifically adapted for electrophonic musical instruments, e.g. interactive musical displays, musical instrument icons or menus; Details of user interactions therewith for graphical creation, edition or control of musical data or parameters
    • G10H2220/131 Graphical user interface [GUI] specifically adapted for electrophonic musical instruments, e.g. interactive musical displays, musical instrument icons or menus; Details of user interactions therewith for graphical creation, edition or control of musical data or parameters for abstract geometric visualisation of music, e.g. for interactive editing of musical parameters linked to abstract geometric figures


Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Acoustics & Sound (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Circuit For Audible Band Transducer (AREA)

Abstract

A system for adjusting one or more parameters of one or more audio tracks, wherein the one or more audio tracks together comprise a group track having a first spatial position value and a volume value, wherein each of the one or more audio tracks has a first spatial position value and a volume value, the system comprising: a digital audio workstation comprising a user interface, wherein the digital audio workstation is configured to display, in a stage view of the user interface, one or more group track user interface objects, wherein each group track user interface object represents a group track; means for receiving user input, via the user interface, to adjust the first spatial position value and/or a volume value of the group track to a second spatial position value and/or a volume value of the group track, wherein the digital audio workstation is configured to calculate a second spatial position value and/or volume value of each of the one or more audio tracks based on the second spatial position and/or volume of the group track, and to display, on the stage view user interface, audio track user interface objects, wherein each audio track user interface object represents an audio track, wherein the position of each of the audio track user interface objects in the stage of the stage view user interface represents the second spatial position value and/or volume value of the track.

Description

Digital Audio Workstation
Technical field of the invention
The present invention relates to a digital audio workstation for the intelligent processing of audio data, and, more specifically, a digital audio workstation having enhanced session organisation features.
Background to the invention
There are many digital tools available for audio manipulation, including audio effects processors, synthesizers, and mixers. A comprehensive collection of such tools is often referred to as a Digital Audio Workstation (DAW). Different DAWs are often specifically aimed at users of different production experience and/or intended for use for specific types of projects. Existing DAWs include GarageBand (Apple Inc., US), Live (Ableton Inc., Germany), and Pro Tools (Avid Technology, US).
Many DAWs are aimed at the experienced user with extensive audio processing skills. The increased interest in independent, individual audio generation and production, such as podcasts and music, often referred to as the democratisation of audio processing, has highlighted the need for DAWs which are accessible to such users and which are easy to use. There has been much academic research on the visualisation of audio parameters to facilitate ease of use. In particular, it has previously been identified that a 'stage view' of tracks, in which tracks are presented to a user as user interface objects in a stage-like space, provides an intuitive way of visualising such tracks. This is discussed in the paper 'Comparing Stage Metaphor Interfaces as a Controller for Stereo Position and Level' (Brecht De Man, Nicholas Jillings and Ryan Stables, Proceedings of the 4th Workshop on Intelligent Music Production, Huddersfield, UK, 14 September 2018). Currently available stage view controllers include Line 6 StageScape and, as a plugin, iZotope Visual Mixer. However, the stage view controllers of existing DAWs do not facilitate effective session management for complex mixing arrangements.
Grouping and sub-grouping audio tracks is a known technique to efficiently process audio tracks that are similar, such as those which relate to similar instruments. By grouping tracks together into groups, an audio engineer can (for example) apply a processing effect to a group which is propagated to the individual tracks in the group. The process of routing audio channels to groups can itself be automated by intelligent audio processing tools. A routing methodology which utilises semantic labels is described in the paper 'Automatic Channel Routing Using Musical Instrument Linked Data' (Nicholas Jillings and Ryan Stables, Proceedings of the 3rd Workshop on Intelligent Music Production, Salford, UK, 15th September 2017). However, existing DAWs do not facilitate the visualisation (and manipulation) of a hierarchical grouping structure in an effective way. It is an aim of the present invention to address, or at least mitigate, deficiencies of the prior art by provision of a DAW interface which facilitates effective session organisation in a hierarchical grouping structure via a stage view interface.
Summary of the invention
According to a first aspect of the invention, there is provided a system for adjusting one or more parameters of one or more audio tracks, wherein the one or more audio tracks together comprise a group track having a first spatial position value and a volume value, wherein each of the one or more audio tracks has a first spatial position value and a volume value, the system comprising: a server and a user application in communication with the server, wherein the user application comprises a user interface, wherein the user application is configured to display, in a stage view of the user interface, one or more group track user interface objects, wherein each group track user interface object represents a group track; means for receiving user input, via the user interface, to adjust the first spatial position value and/or a volume value of the group track to a second spatial position value and/or a volume value of the group track, wherein the user application is configured to calculate a second spatial position value and/or volume value of each of the one or more audio tracks based on the second spatial position and/or volume of the group track, and to display, on the stage view user interface, audio track user interface objects, wherein each audio track user interface object represents an audio track, wherein the position of each of the audio track user interface objects in the stage of the stage view user interface represents the second spatial position value and/or volume value of the track. Preferably, the user application is a digital audio workstation.
According to a second aspect of the invention, there is provided a method of adjusting one or more parameters of one or more audio tracks, wherein the one or more audio tracks together comprise a group track having a first spatial position value and a volume value, wherein each of the one or more audio tracks has a first spatial position value and a volume value, the method comprising: displaying, in a stage view user interface of a digital audio workstation, a group track user interface object, wherein the group track user interface object represents a group track; receiving, via the user interface, a user input to adjust the first spatial position and/or volume of the group track to a second spatial position and/or volume of the group track; calculating, by the digital audio workstation, a second spatial position and/or volume of each of the one or more audio tracks based on the second spatial position and/or volume of the group track; and outputting, to the stage view user interface, audio track user interface objects, wherein the position of each of the one or more audio track user interface objects in the stage of the stage view user interface represents the second spatial position and/or volume of the one or more audio tracks. According to a third aspect of the invention, there is provided a method of adjusting one or more parameters of a group track, wherein the group track comprises one or more audio tracks, the method comprising: receiving user input instructions to specify adjustment of the one or more parameters of the group track over a time period; adjusting, by a user application, the one or more parameters of the group track according to the instructions; and displaying, in a stage view of a user interface, one or more track user interface objects, wherein each user interface object represents each of the one or more audio tracks, wherein the positions of each of the one or more audio user interface objects change in real time during playback according to the input instructions.
According to a fourth aspect of the invention, there is provided a system for adjusting one or more parameters of a group of one or more audio tracks, wherein the one or more audio tracks together comprise a group track having a first spatial position value and a volume value, wherein each of the one or more audio tracks has a first spatial position value and a volume value, the system comprising: a user application, wherein the user application is a digital audio workstation comprising a user interface, wherein the user application is configured to display, in a stage view of the user interface, one or more audio track user interface objects, wherein each audio track user interface object represents an audio track, and a highlighted stage segment having a boundary, wherein the boundary of the highlighted stage segment represents maximum and minimum values of the spatial position and volume of a group track, means for receiving user input, via the user interface, to adjust the boundary, wherein the user application is configured to calculate a second spatial position and/or volume value of the group track based on the adjustment of the boundary, and output, to the stage view user interface, a group track user interface object, wherein the group track user interface object represents a group track and wherein the position of the group track user interface object on the stage of the stage view user interface represents the second spatial position and/or volume value of the group track.
According to a fifth aspect of the invention, there is provided a system for adjusting one or more parameters of one or more audio tracks, wherein the one or more audio tracks together comprise a group track having a first spatial position value and a volume value, wherein each of the one or more audio tracks has a first spatial position value and a volume value, the system comprising: a user application, wherein the user application is a digital audio workstation comprising a user interface, wherein the user application is configured to display, in a stage view of the user interface, one or more group track user interface objects, wherein each group track user interface object represents a group track; means for receiving user input, via the user interface, to adjust the first spatial position value and/or a volume value of the group track to a second spatial position value and/or a volume value of the group track, wherein the user application is configured to calculate a second spatial position value and/or volume value of each of the one or more audio tracks based on the second spatial position and/or volume of the group track, and to display, on the stage view user interface, audio track user interface objects, wherein each audio track user interface object represents an audio track, wherein the position of each of the audio track user interface objects in the stage of the stage view user interface represents the second spatial position value and/or volume value of the track. Preferably, the digital audio workstation is a user application programmed in the system.
Further preferable features are defined in the appended dependent claims.
Brief description of the figures
Embodiments of the invention will be described with reference to the figures in which:
Figure 1 is a screenshot of an exemplary timeline view showing hierarchical track groupings in a DAW according to an embodiment;
Figure 2 is a screenshot of an exemplary mixer view showing hierarchical track groupings in a DAW according to an embodiment;
Figure 3 is a screenshot of an exemplary stage view showing hierarchical track groupings in a DAW according to an embodiment;
Figure 4 is a screenshot of an exemplary stage view showing tracks in a stage segment in a DAW according to an embodiment;
Figure 5 is a screenshot of an exemplary stage view showing tracks and a group track in a stage segment in a DAW according to an embodiment;
Figure 6 is a screenshot of an exemplary stage view showing tracks of a group in a stage segment in a DAW according to an embodiment;
Figure 7 is a diagram showing the effect of altering a group's stereo position in a stage view of a DAW according to an embodiment;
Figure 8 is a screenshot of an exemplary stage view showing groups and sub-groups in a stage segment in a DAW according to an embodiment.
Detailed description
A digital audio workstation ('DAW') according to an embodiment of the invention having specific session management features will be described. The DAW according to a preferred embodiment of the invention is a web-based tool; i.e. it is hosted by a server and accessible to a user via a browser. However, it may alternatively be a stand-alone user application which can be run on any suitable hardware. Each user may be required to have registered as a user of the platform to enable secure access to the DAW and may be required to purchase access to specific features. Means for registration and the purchase of all or some features are known in the art.
The DAW of the present invention comprises a set of tools which allow a user to generate audio mixes using recorded and synthesised audio. The audio is generally processed client-side, rather than server-side, via the user's browser, a desktop application or mobile application. Although most processing is done at the client, some specific tasks are done at the server, in which case one or more APIs running in the browser or application allow for client-server communication.
Once logged in to the DAW, a user can upload an audio file for processing. Audio files, whether processed, partially processed or unprocessed, can be stored at the server and accessed and downloaded by a user when requested. Using the DAW's interface, the user essentially instructs changes to the audio to be made (which may be made at the client or at the server). The changes are then returned to the client and, upon the user's confirmation to maintain the changes, the audio is updated to reflect the changes.
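As a rough illustration of this client-server split, a change might be sent to the server and echoed back for the user to confirm. The endpoint, payload shape and flow below are hypothetical sketches, not taken from the patent; the source says only that one or more APIs allow for client-server communication.

```typescript
// Hypothetical sketch of sending a change to the server and receiving the
// result back; the endpoint and payload are assumptions, not from the source.

interface ChangeRequest {
  trackName: string;
  param: 'pan' | 'volume';
  value: number;
}

async function applyChangeOnServer(change: ChangeRequest): Promise<ChangeRequest> {
  // Hypothetical endpoint; the patent does not name any API routes.
  const response = await fetch('/api/session/changes', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify(change),
  });
  if (!response.ok) throw new Error(`server rejected change: ${response.status}`);
  // The change is returned to the client; per the description above, the
  // audio is only updated once the user confirms they want to keep it.
  return response.json();
}
```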
An audio mix generally comprises a number of audio tracks. The DAW of the present invention comprises a number of functions which allow for complex audio arrangements to be manipulated more efficiently. One such function is the automatic grouping of similar tracks in the audio. A track is a single audio channel, and is often, for a complex musical arrangement, a recording of a single instrument in a multi-instrumental arrangement. There may be multiple tracks in the audio which are similar in some way and can be grouped together based on the similarity. A group may be 'strings', 'percussion', 'vocals' etc. Groups are represented in the DAW as tracks that take other tracks (or sub-groups) as inputs, and output to other groups or the master channel. A group track, sub-group track or sub-sub-group track (hereinafter referred to generally as a group track) is a collection of tracks that have been sub-mixed into a single track. For example, a group track (which may be referred to as a 'grandparent' track) may comprise one or more sub-group ('parent') tracks which in turn comprise one or more individual ('child') tracks. The 'master' track is the root-level track, representing the complete mix of all groups, sub-groups and non-grouped tracks. Whilst it is not common, it is possible to have a high number of nested groups; however, this is often limited due to UI constraints.
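This hierarchy lends itself to a simple tree representation. The following TypeScript sketch is illustrative only; the type and field names are assumptions, not taken from the patent.

```typescript
// Illustrative sketch of the hierarchical track structure described above:
// groups are tracks that take other tracks (or sub-groups) as inputs and
// output to their parent group or the master track.

interface TrackNode {
  name: string;
  kind: 'track' | 'group' | 'master';
  children: TrackNode[]; // empty for individual tracks
}

// Example session: master -> strings (group) -> violins (sub-group) -> tracks.
const session: TrackNode = {
  name: 'master', kind: 'master',
  children: [
    {
      name: 'strings', kind: 'group',
      children: [
        { name: 'violins', kind: 'group',
          children: [{ name: 'violin 1', kind: 'track', children: [] }] },
        { name: 'cello', kind: 'track', children: [] },
      ],
    },
    { name: 'vocals', kind: 'track', children: [] },
  ],
};

// A group's output is the sub-mix of its children, so collecting the
// individual tracks under any node is a simple recursive walk.
function leafTracks(node: TrackNode): TrackNode[] {
  if (node.kind === 'track') return [node];
  return node.children.flatMap(leafTracks);
}
```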
Various methods for automatically grouping tracks are known in the art. In the DAW described herein, grouping is based on a semantic analysis of metadata of instrument tags and is discussed in further detail in the paper 'Automatic Channel Routing Using Musical Instrument Linked Data' by Nicholas Jillings and Ryan Stables (as mentioned above), the contents of which are incorporated herein by reference where permitted.
In the DAW of the present invention, a user is able to manipulate the audio tracks via three main 'views' or interface arrangements: timeline (Figure 1), mixer (Figure 2) and stage (Figure 3). In an embodiment of the invention, the DAW allows a user to switch between these three views at any point. As the audio is processed via a particular view, the other views are automatically updated to reflect the changes, such that the user can seamlessly switch between different views and each one will reflect the current state of audio processing.
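The patent does not specify how the views are kept consistent; one plausible sketch is a shared session model that notifies each view of parameter changes. The names and structure below are assumptions.

```typescript
// Illustrative sketch of keeping the timeline, mixer and stage views in
// sync: all three subscribe to one shared session model, so an edit made
// in any view is immediately reflected in the others.

type ViewListener = (trackName: string, param: 'pan' | 'volume', value: number) => void;

class SessionModel {
  private listeners: ViewListener[] = [];

  subscribe(listener: ViewListener): void {
    this.listeners.push(listener);
  }

  // Called by whichever view the user is currently editing in.
  setParam(trackName: string, param: 'pan' | 'volume', value: number): void {
    for (const l of this.listeners) l(trackName, param, value);
  }
}

const model = new SessionModel();
model.subscribe((t, p, v) => console.log(`timeline: ${t} ${p}=${v}`));
model.subscribe((t, p, v) => console.log(`mixer: ${t} ${p}=${v}`));
model.subscribe((t, p, v) => console.log(`stage: ${t} ${p}=${v}`));
model.setParam('vocals', 'pan', 0.25); // all three views update
```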
The timeline view 100 of Figure 1 allows for the visualisation of the progression of a mix over time, as is known in the art. Figure 1 shows a collection of tracks stacked vertically. The tracks are grouped in a hierarchical structure: the group track (blue, comprising a single track) comprises a sub-group (orange, comprising two tracks), which in turn comprises a sub-sub-group (pink, comprising a single track). In the timeline and mixer views, groups are shown as collapsible tracks, whereby individual tracks are indented (and coloured the same as the group to which they belong).
Figure 2 shows mixer view 200. The mixer view represents the tracks analogously to a mixing desk. A plurality of tracks are arranged in a group, sub-group and sub-sub-group hierarchical structure. Group 201 comprises two individual tracks as well as sub-groups 202 and 203. Sub-group 203 comprises sub-sub-group 204. The audio mix also includes individual tracks 205. Parameters of the master track can be manipulated using channel strip 206.
In the stage view 300 (shown generally at Figure 3), each individual track, sub-group or group is represented as a user interface object on a representation of a stage. The user interface objects include an icon representative of the instrument or feature to which they relate. The objects can be moved around on 2-dimensional stage 301 under user control. Group, sub-group and sub-sub-group track objects are shaded to distinguish them from individual tracks. The colour of the shaded group objects is the same as the colour of the outline of each track within the group. The colours are persistent across all views and provide a quick way to identify the tracks.
The bottom-centre position of the stage represents the position of a nominal listener. The volume of a track can be adjusted by moving the track object relative to the listener position; the closer the track icon is to the listener position, the higher the volume. The stereo position (pan) of the track is altered by rotating the track icon about the listening position and is calculated by Cartesian-to-polar mapping based on the pseudo-position of the track on a rectangular stage. Users are able to modify parameters of the groups and the parameters of the individual tracks within the same user interface. Throughout this disclosure, 'stereo position' is synonymous with 'spatial position' and implies no limitation on the number of output channels.
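The source does not give the exact mapping, so the following TypeScript sketch illustrates one plausible Cartesian-to-polar conversion of the kind described; the coordinate origin, the dB range (taken from equation (1) below) and the angle-to-pan scaling are assumptions.

```typescript
// Illustrative sketch of the stage-position-to-parameter mapping described
// above: a rectangular stage with the nominal listener at the bottom-centre,
// where distance from the listener controls volume and angle controls pan.

interface StageParams {
  pan: number;      // -1 (hard left) .. +1 (hard right)
  volumeDb: number; // -200 dB .. +10 dB, per the range given in equation (1)
}

const MIN_DB = -200;
const MAX_DB = 10;

function stageToParams(
  x: number, // horizontal position, 0 (left edge) .. 1 (right edge)
  y: number, // vertical position, 0 (front, at the listener) .. 1 (back)
): StageParams {
  // Translate so the listener at bottom-centre is the origin.
  const dx = x - 0.5;
  const dy = y;

  // Cartesian -> polar: angle about the listener gives pan,
  // radial distance gives volume (closer = louder).
  const angle = Math.atan2(dx, dy);  // 0 = straight ahead of the listener
  const radius = Math.hypot(dx, dy);

  // Scale the angle (-pi/2 .. +pi/2 across the stage) into pan -1 .. +1.
  const pan = Math.max(-1, Math.min(1, angle / (Math.PI / 2)));

  // Map radius 0 .. 1 onto volume MAX_DB .. MIN_DB (linear in dB here;
  // a real implementation might use a perceptual curve instead).
  const r = Math.max(0, Math.min(1, radius));
  const volumeDb = MAX_DB + r * (MIN_DB - MAX_DB);

  return { pan, volumeDb };
}
```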
A track-list region on the left side of the user interface shows the hierarchical arrangement of tracks and sub-groups, presenting a list of all available tracks and groups arranged in a hierarchical tree structure. Multiple tracks or groups can be selected in this region and the corresponding objects will be highlighted in the stage view (and vice versa). The stage view for a group can also be entered here by double-clicking a group object. If a track inside a group is selected, the contents of the group (the 'track view') will be shown on the stage, with the selected track object highlighted (as per track 302). The colour of the stage segment in a group view represents the colour of the parent group track (as per segment 313 of Figure 4). The track view can also be invoked by double-clicking the group object. If a group is selected, the corresponding group object, and other tracks or objects at the same hierarchical level (which may, at the highest level, be the master track), will be shown (the 'group view'). Shaded objects 304, 305 and 306 correspond to groups 307, 308 and 309 respectively. Group 310, shown in the track-list region, is a sub-group of group 307. Because the current view of the stage objects is the group view (in this case, the master view), there is no object on the stage which represents group 310. If group 310 were selected, or group 307 were double-clicked, an object representing group 310 would be shown on the stage, along with another object representing track 311 in the track view.
In some embodiments, a 'tooltip' menu is displayed above a track or group user interface object when it is selected. The tooltip menu is a user interface object which displays the volume and pan values for the selected track or group, and allows a user to toggle on/off any automation settings that have been applied to volume or pan (as discussed further below).
Channel strip 303 is positioned on the right of the stage view interface and mimics a channel strip of the mixer view. Channel strip 303 corresponds to the track, group or sub-group that is currently selected (i.e. the strip changes to show the parameters of the selected object). The parameters of strip 303 change in real time when the corresponding track or group object on stage 301 is moved, and similarly the parameters of the track can be manipulated via channel strip 303 which causes the corresponding track or group object to move on stage 301 as necessary.
Figure 4 shows a group view 330 for group 310. Objects corresponding to the four individual tracks present in group 310 are shown on the stage. Shaded stage segment 313 denotes the extent to which the volume and pan of the individual tracks can be altered, as limited by the parameters of group track 310, as will be explained further below. Track 312 is highlighted in the track-list region and therefore the corresponding object is also highlighted on the stage. The stage view 340 of Figure 5 is a further example of a hierarchical grouping structure. Group 341 is highlighted in the track-list region and its corresponding object 342 is also highlighted on the stage. Because group 341, a top-level group, is selected, the other objects on the stage represent the other groups and track in the master track. Channel strip 344 corresponds to the selected group track, i.e. group 341.
Figure 6 illustrates the effect of selecting a track or sub-group within a group. In the stage view 345 of Figure 6, track 346 is highlighted in the track-list region and the corresponding object 347 is highlighted on the stage. Shaded group object 349 is also shown on the stage, corresponding to sub-group 348. Since track 346 is selected, its parameters are represented in channel strip 350. The stage is shaded in the colour of group track 351 to denote that modification of group 351 will affect the individual tracks in the group. This is discussed in more detail below with reference to Figure 7.
The positions of the individual tracks or sub-group tracks are normalised to the stereo position and volume of the group track. The pan or volume of the individual track or sub-group track (output_value) is determined by:
output_value = ((input_value - t_min) * (p_max - p_min)) / (t_max - t_min) + p_min     (1)

where input_value is the starting value of pan or volume of the individual or sub-group track; t_min and t_max are the minimum and maximum values of the individual track or sub-group track (-1 and +1 for pan, or -200 dB and +10 dB for volume); and p_min and p_max are the minimum and maximum values of pan or volume of the group track.
The mapping to polar coordinates to determine the location on the stage at which to display the individual track or sub-group track having output_value will be apparent to those skilled in the art.
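As a concrete illustration, equation (1) might be implemented as follows; a minimal TypeScript sketch with illustrative function and parameter names.

```typescript
// Sketch of equation (1): normalise a child track's pan or volume into the
// range allowed by its parent group. Ranges are as stated above:
// -1..+1 for pan, -200 dB..+10 dB for volume.

function normaliseToGroup(
  inputValue: number, // child track's own pan or volume
  tMin: number,       // track-level minimum (-1 for pan, -200 dB for volume)
  tMax: number,       // track-level maximum (+1 for pan, +10 dB for volume)
  pMin: number,       // group's minimum pan or volume
  pMax: number,       // group's maximum pan or volume
): number {
  return ((inputValue - tMin) * (pMax - pMin)) / (tMax - tMin) + pMin;
}

// Example: a track panned hard left (-1) inside a group whose pan is
// restricted to -0.5..+0.5 ends up at the group's left limit, -0.5.
const out = normaliseToGroup(-1, -1, 1, -0.5, 0.5); // -0.5
```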
Accordingly, adjusting a parameter of a group track causes the values of p_min and/or p_max to change, which changes the output_value (and therefore the position on the stage) of the individual or sub-group tracks, as well as the shaded/highlighted stage segment (discussed further with reference to Figures 7a-7d). The highlighted stage segment represents the limits/bounds of the parameters of individual tracks or sub-groups imposed by the parameters of the group track, because the boundaries of the highlighted stage segment denote the maximum and minimum volume and the maximum and minimum pan of the group. Whilst a group track has specific values of volume and pan, the extent to which the group volume and pan can be adjusted is, in turn, set by the volume and pan of the group above it (which may be the master track). Moving individual tracks or sub-group tracks beyond the boundary of the shaded stage segment, or dragging a boundary of the shaded stage segment to make the segment larger or smaller, causes the group object to move, since p_min and/or p_max have been adjusted.
Figures 7a-7d illustrate group-to-track interaction and the effect that adjustment of the group track has on the parameters of the tracks and sub-groups. Figures 7a and 7c are 'group views' and Figures 7b and 7d are 'track views'. Figures 7b and 7d show the individual tracks and sub-group 702 present in group 701 of Figures 7a and 7c. Figures 7c and 7d show the effect of adjusting the stereo position of group 701. In Figure 7c, the pan of group 701 is adjusted to the right (the volume remains the same). Figure 7d shows the effect of this on the contents of group 701; the pan position of each of the tracks and sub-group 702 has been shifted to the right to reflect the normalisation to group 701, and the boundaries of the highlighted stage segment have also been adjusted to reflect that p_min has changed. The volume and pan of the tracks represented by the objects of Figure 7d are now limited to the values specified by the boundary of the highlighted stage segment.
Adjustment of the parameters of individual tracks and sub-groups can, conversely, affect the parameters of a group (track-to-group interaction). The volume or pan position of a group object (and consequently its parameters) can be adjusted from within the track view (i.e. the segmented stage view) by dragging one or more edges of the stage segment (i.e. dragging left or right about the listener position to affect pan, or dragging the lower edge of the stage segment to increase or decrease the volume). Altering the limits of the stage segment in this way alters the values of p_min and/or p_max of the group track. The individual and sub-group track objects can also be dragged beyond the edge of the stage segment, which indirectly drags the boundary of the segment. When one or more parameters of the individual and sub-group tracks reach their maximum value, the corresponding parameter of the group track increases until it reaches the limit set by the group it belongs to (which may be the master track). This is discussed further with reference to Figure 8.
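A minimal sketch of this track-to-group interaction follows, assuming the group's bounds are stored as a pair (p_min, p_max) and are clamped by the bounds the group inherits from its own parent or the master track; the names are illustrative.

```typescript
// Dragging a child object past the segment edge pushes p_min/p_max of the
// group outward, clamped by the bounds inherited from the group's parent.
interface Bounds { pMin: number; pMax: number; }

function dragChildBeyondBoundary(
  requested: number,     // value the user dragged the child's pan/volume to
  group: Bounds,         // current bounds imposed by the group track
  parentLimit: Bounds    // bounds inherited from the group's own parent
): Bounds {
  return {
    pMin: Math.max(parentLimit.pMin, Math.min(group.pMin, requested)),
    pMax: Math.min(parentLimit.pMax, Math.max(group.pMax, requested)),
  };
}

// Dragging a child to pan +1.2 widens the group's pan bound only as far as
// the parent allows: result is { pMin: -0.5, pMax: 1 }.
dragChildBeyondBoundary(1.2, { pMin: -0.5, pMax: 0.5 }, { pMin: -1, pMax: 1 });
```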
Adjustment of group track volume affects the volume of the individual and sub-group tracks within the group, and vice versa. For example, altering the volume of the master track (which can, in some embodiments, be achieved via a slider UI object in the upper area of the user interface) affects the volume of the groups within the master track, according to (1). As the master volume is adjusted, the stage view UI adapts to reflect the total available gain. As a further example, if the volume of a group track is adjusted, the volume of the individual tracks and sub-group tracks within the group will be adjusted, and so the positions of the track and sub-group objects on the stage (in the track view/segmented stage view) will change, as will the boundary of the highlighted stage segment. Conversely, if the volume of the individual or sub-group tracks within a group is adjusted, either by dragging the volume boundary of the highlighted stage segment or by dragging the individual track/sub-group track objects past the boundary (which causes the boundary of the stage segment to change), p_min/p_max of the group track, and hence the position of the group object in the group view, will change accordingly.
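As a worked illustration of the master-volume example, equation (1) can be re-applied with the master's narrowed range as (p_min, p_max) to obtain the group's displayed volume. The values below are hypothetical.

```typescript
// Equation (1) at the master level: pulling the master's gain ceiling down
// from +10 dB to 0 dB re-normalises a group set to -6 dB into the narrower
// range, so its object moves down the stage.
const normalise = (v: number, tMin: number, tMax: number, pMin: number, pMax: number) =>
  ((v - tMin) * (pMax - pMin)) / (tMax - tMin) + pMin;

const displayedGroupVolume = normalise(-6, -200, 10, -200, 0); // about -15.2 dB
```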
Figure 8 shows a stage view interface 800 in which a three-level grouping structure has group 801, which includes sub-group 802, which in turn includes sub-sub-group 803. Track 804 of sub-sub-group 803 is selected and therefore the individual track view/segmented stage view is displayed. Highlighted stage segment 805 denotes the limits of pan and volume that can be applied to track 804 based on the pan and volume of group tracks 802 and 801. The effect of the volume, at each level, is cumulative, in that each stage segment represents the cumulative effect of the volume. For example, if a track is in a sub-sub-group having two higher-level groups, and a 2dB volume reduction is applied to each higher-level group, the volume boundary (i.e. the lower curved edge) of the stage segment for the track will reflect a 4dB reduction. In Figure 8, the volume limits shown on the stage are +10dB and -200dB. The lower curved boundary of the highlighted stage segment, however, is somewhere between the two, as volume reduction has been applied to the higher-level groups (i.e. p_max has been adjusted for those groups). In a preferred embodiment, it is possible to increase volume by 12dB at each level, such that, for a three-level group hierarchy, the stage outline at each level will represent a volume limit of +12dB. So, for a sub-sub-group track, whilst the volume limit on the stage outline will read +12dB (or +10dB in the case of Figure 8), since +12dB is the maximum variation in volume for that track, the cumulative effect of the maximum volume of the sub-group and group means that the maximum volume of the track is in effect +36dB. There is no cumulative effect for pan adjustment.
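The cumulative volume arithmetic can be made concrete with a short sketch; the per-level +12dB headroom follows the preferred embodiment above, while the function names and the offset representation are assumptions of this example.

```typescript
// Each grouping level offers up to +12 dB, so three levels give a track an
// effective ceiling of 3 * 12 = +36 dB.
function cumulativeVolumeLimitDb(levels: number, perLevelDb = 12): number {
  return levels * perLevelDb;
}
cumulativeVolumeLimitDb(3); // 36

// The segment's volume boundary reflects reductions applied above the track:
// two higher-level groups each cut by 2 dB lower the boundary by 4 dB.
function effectiveVolumeBoundDb(groupOffsetsDb: number[], trackMaxDb = 12): number {
  return trackMaxDb + groupOffsetsDb.reduce((sum, dB) => sum + dB, 0);
}
effectiveVolumeBoundDb([-2, -2]); // 8, i.e. a 4 dB reduction from +12 dB
```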
The DAW of the present invention allows adjustments to pan and volume to be automated over the course of audio playback. To apply an automation feature, a user, in the timeline view, is able to select either pan or volume automation and 'draw' in the timeline the adjustment to be made to the pan or volume of a track or group track over time. When the track is played back in the stage view, the user interface object representing the automated track will move around the stage in accordance with the automation, as set by the user.
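A drawn automation curve can be represented as time/value breakpoints that are interpolated during playback; the sketch below assumes piecewise-linear interpolation, which the embodiment does not mandate, and the names are illustrative.

```typescript
// Drawn automation stored as breakpoints; the interpolated value repositions
// the track's stage object on every playback frame.
interface Breakpoint { timeSec: number; value: number; } // pan or volume

function automationValueAt(envelope: Breakpoint[], tSec: number): number {
  if (envelope.length === 0) throw new Error("empty envelope");
  if (tSec <= envelope[0].timeSec) return envelope[0].value;
  const last = envelope[envelope.length - 1];
  if (tSec >= last.timeSec) return last.value;
  for (let i = 1; i < envelope.length; i++) {
    const a = envelope[i - 1], b = envelope[i];
    if (tSec <= b.timeSec) {
      const f = (tSec - a.timeSec) / (b.timeSec - a.timeSec);
      return a.value + f * (b.value - a.value); // linear interpolation
    }
  }
  return last.value; // unreachable; satisfies the type checker
}

// Sweep a track from hard left to hard right over four seconds of playback:
const panEnvelope: Breakpoint[] = [{ timeSec: 0, value: -1 }, { timeSec: 4, value: 1 }];
automationValueAt(panEnvelope, 2); // 0: centre stage halfway through
```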

Claims
1. A system for adjusting one or more parameters of a group of one or more audio tracks, wherein the one or more audio tracks together comprise a group track having a first spatial position value and a volume value, wherein each of the one or more audio tracks has a first spatial position value and a volume value, the system comprising: a user application, wherein the user application is a digital audio workstation comprising a user interface, wherein the user application is configured to display, in a stage view of the user interface, one or more audio track user interface objects, wherein each audio track user interface object represents an audio track, and a highlighted stage segment having a boundary, wherein the boundary of the highlighted stage segment represents maximum and minimum values of the spatial position and volume of a group track, means for receiving user input, via the user interface, to adjust the boundary, wherein the user application is configured to calculate a second spatial position and/or volume value of the group track based on the adjustment of the boundary, and output, to the stage view user interface, a group track user interface object, wherein the group track user interface object represents a group track and wherein the position of the group track user interface object on the stage of the stage view user interface represents the second spatial position and/or volume value of the group track.
2. The system of claim 1, wherein the user application is further configured to calculate second spatial position and/or volume values for each of the one or more audio tracks and output, to the stage view user interface, audio track user interface objects, wherein each audio track user interface object represents each of the one or more audio tracks, wherein the position of each of the audio track user interface objects represents the second spatial position and volume of each of the audio tracks.
3. The system of any of claims 1 or 2, further comprising a server in communication with the user application.
4. A method of adjusting one or more parameters of a group of one or more audio tracks, wherein the one or more audio tracks together comprise a group track having a first spatial position value and a volume value, wherein each of the one or more audio tracks has a first spatial position value and a volume value, the method comprising: displaying, in a stage view user interface of a digital audio workstation, one or more audio track user interface objects, wherein each audio track user interface object represents an audio track, and a highlighted stage segment having a boundary, wherein the boundary of the highlighted stage segment represents maximum and minimum values of the spatial position and volume of a group track; receiving, via the user interface, a user input to adjust the boundary; calculating, by the digital audio workstation, a second spatial position and/or volume value of the group track based on the adjustment of the boundary; and outputting, to the stage view user interface, a group track user interface object, wherein the group track user interface object represents a group track and wherein the position of the group track user interface object on the stage of the stage view user interface represents the second spatial position and/or volume value of the group track.
5. The method of claim 4, wherein the user input comprises dragging the boundary of the highlighted stage segment.
6. The method of claim 4, wherein the user input comprises dragging an audio track user interface object beyond the boundary of the highlighted stage segment.
7. A system for adjusting one or more parameters of one or more audio tracks, wherein the one or more audio tracks together comprise a group track having a first spatial position value and a volume value, wherein each of the one or more audio tracks has a first spatial position value and a volume value, the system comprising: a digital audio workstation comprising a user interface, wherein the digital audio workstation is configured to display, in a stage view of the user interface, one or more group track user interface objects, wherein each group track user interface object represents a group track; means for receiving user input, via the user interface, to adjust the first spatial position value and/or a volume value of the group track to a second spatial position value and/or a volume value of the group track, wherein the digital audio workstation is configured to calculate a second spatial position value and/or volume value of each of the one or more audio tracks based on the second spatial position and/or volume of the group track, and to display, on the stage view user interface, audio track user interface objects, wherein each audio track user interface object represents an audio track, wherein the position of each of the audio track user interface objects in the stage of the stage view user interface represents the second spatial position value and/or volume value of the track.
8. The system of claim 7, wherein the digital audio workstation is further arranged to display, in the stage view user interface, a channel strip, wherein the channel strip is arranged to display the second spatial position and/or volume value of the group.
9. A method of adjusting one or more parameters of one or more audio tracks, wherein the one or more audio tracks together comprise a group track having a first spatial position value and a volume value, wherein each of the one or more audio tracks has a first spatial position value and a volume value, the method comprising: displaying, in a stage view user interface of a digital audio workstation, a group track user interface object, wherein the group track user interface object represents a group track; receiving, via the user interface, a user input to adjust the first spatial position and/or volume of the group track to a second spatial position and/or volume of the group track; calculating, by the digital audio workstation, a second spatial position and/or volume of each of the one or more audio tracks based on the second spatial position and/or volume of the group track; and outputting, to the stage view user interface, audio track user interface objects, wherein the position of each of the one or more audio track user interface objects in the stage of the stage view user interface represents the second spatial position and/or volume of the one or more audio tracks.
10. The method of claim 9, further comprising selecting, by a user, an option to view the audio track user interface objects in stage view, and displaying, in the stage view user interface, the one or more audio tracks, wherein a segment of the stage is highlighted to denote the maximum and minimum spatial position and/or volume of the one or more audio tracks based on the second spatial position and/or volume of the group track.
11. The method of claim 10, further comprising: receiving, via the user interface, a user input to adjust the second spatial position and/or volume of the group track to a third spatial position and/or volume of the group track; calculating, by the digital audio workstation, a third spatial position and/or volume of each of the one or more audio tracks based on the third spatial position and/or volume of the group track; outputting, to the stage view user interface, audio track user interface objects, wherein the position of each of the one or more audio track user interface objects in the stage of the stage view user interface represents the third spatial position and/or volume of the one or more audio tracks; and adjusting the area of the highlighted stage segment to denote the extent to which the volume or spatial position of the one or more audio tracks can be adjusted based on the third spatial position and/or volume of the group track.
12. The method of claim 11, wherein the adjusting is done in real-time.
13. The method of claim 9, further comprising: receiving a user input to adjust the master volume; calculating an adjusted volume value of the group track based on the adjustment to the master volume; and outputting, to the stage view user interface, a group track user interface object, wherein the position of the group track user interface object on the stage is based on the adjustment to the master volume, and wherein the position of the group track user interface object represents the adjusted volume value of the group track.
14. The method of claim 13, further comprising calculating the minimum and maximum volume of the audio tracks associated with the group track based on the adjusted volume of the group track; and adjusting the area of the highlighted stage segment to denote the extent to which the volume of the one or more audio tracks can be adjusted based on the adjusted volume of the group track.
15. The method of claim 14, wherein the adjusting is done in real-time.
16. The method of claim 9, wherein the group track is a sub-group of a higher-level group, and wherein the method further comprises: receiving, via the user interface, a user input to adjust a first spatial position and/or volume value of the higher-level group to a second spatial position and/or volume value of the higher-level group; calculating, by the digital audio workstation, a second spatial position and/or volume value of the group track based on the second spatial position and/or volume value of the higher-level group; and outputting, to the stage view user interface, a group track user interface object, wherein the position of the group track user interface object in the stage of the stage view user interface represents the second spatial position and/or volume value of the group track.
17. A method of adjusting one or more parameters of a group track, wherein the group track comprises one or more audio tracks, the method comprising: receiving user input instructions to specify adjustment of the one or more parameters of the group track over a time period; adjusting, by a user application, the one or more parameters of the group track according to the instructions; and displaying, in a stage view of a user interface, one or more track user interface objects, wherein each user interface object represents one of the one or more audio tracks, and wherein the position of each of the one or more audio track user interface objects changes in real time during playback according to the input instructions.
18. The method of claim 17, wherein the one or more audio tracks together comprise a group track, wherein the method further comprises displaying, in a stage view of a user interface, one or more audio track user interface objects corresponding to the one or more audio tracks and a highlighted stage segment, wherein adjustment of the parameters of the group track causes the area of the highlighted stage segment to change in real time during playback, and wherein the boundary of the highlighted stage segment represents the spatial position limits and volume limits of the one or more tracks.
19. A computer readable medium comprising executable instructions which, when executed by a processor, perform the method according to any of claims 4 to 6 and 9 to 18.
PCT/EP2020/078346 2019-10-09 2020-10-08 Digital audio workstation WO2021069630A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
GB1914571.3 2019-10-09
GB1914571.3A GB2588137A (en) 2019-10-09 2019-10-09 Digital audio workstation

Publications (1)

Publication Number Publication Date
WO2021069630A1 (en)

Family

ID=68541244

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2020/078346 WO2021069630A1 (en) 2019-10-09 2020-10-08 Digital audio workstation

Country Status (2)

Country Link
GB (1) GB2588137A (en)
WO (1) WO2021069630A1 (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB2606538A (en) * 2021-05-11 2022-11-16 Altered States Tech Ltd Method and system for manipulating an audio track being presented at an output

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8073160B1 (en) * 2008-07-18 2011-12-06 Adobe Systems Incorporated Adjusting audio properties and controls of an audio mixer

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8085269B1 (en) * 2008-07-18 2011-12-27 Adobe Systems Incorporated Representing and editing audio properties


Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20210048868A1 (en) * 2017-01-09 2021-02-18 Inmusic Brands, Inc. Systems and methods for generating a visual color display of audio-file data
US11567552B2 (en) * 2017-01-09 2023-01-31 Inmusic Brands, Inc. Systems and methods for generating a visual color display of audio-file data
CN113986191A (en) * 2021-12-27 2022-01-28 广州酷狗计算机科技有限公司 Audio playing method and device, terminal equipment and storage medium
CN113986191B (en) * 2021-12-27 2022-06-07 广州酷狗计算机科技有限公司 Audio playing method and device, terminal equipment and storage medium

Also Published As

Publication number Publication date
GB2588137A (en) 2021-04-21
GB201914571D0 (en) 2019-11-20

Similar Documents

Publication Publication Date Title
WO2021069630A1 (en) Digital audio workstation
US9420394B2 (en) Panning presets
US8068105B1 (en) Visualizing audio properties
US20040199395A1 (en) Interface for providing modeless timelines based selection of an audio or video file
US8085269B1 (en) Representing and editing audio properties
US8073160B1 (en) Adjusting audio properties and controls of an audio mixer
US9530396B2 (en) Visually-assisted mixing of audio using a spectral analyzer
US9459771B2 (en) Method and apparatus for modifying attributes of media items in a media editing application
US7869892B2 (en) Audio file editing system and method
US6789109B2 (en) Collaborative computer-based production system including annotation, versioning and remote interaction
US6480194B1 (en) Computer-related method, system, and program product for controlling data visualization in external dimension(s)
US9014544B2 (en) User interface for retiming in a media authoring tool
US20130308051A1 (en) Method, system, and non-transitory machine-readable medium for controlling a display in a first medium by analysis of contemporaneously accessible content sources
US20080002844A1 (en) Sound panner superimposed on a timeline
KR20170057736A (en) Virtual-Reality EDUCATIONAL CONTENT PRODUCTION SYSTEM AND METHOD OF CONTRLLING THE SAME
CN109314499B (en) Audio equalization system and method
JP2019533195A (en) Method and related apparatus for editing audio signals using isolated objects
Phillips et al. Sonification workstation
US9477674B2 (en) Merging and splitting of media composition files
Garcia et al. Interactive-compositional authoring of sound spatialization
US20080229200A1 (en) Graphical Digital Audio Data Processing System
US7484201B2 (en) Nonlinear editing while freely selecting information specific to a clip or a track
CN104750059B (en) Lamp light control method
US10770045B1 (en) Real-time audio signal topology visualization
JP2017016275A (en) Control method

Legal Events

Code Title Description
121 Ep: the EPO has been informed by WIPO that EP was designated in this application. Ref document number: 20789816; Country of ref document: EP; Kind code of ref document: A1.
NENP Non-entry into the national phase. Ref country code: DE.
122 Ep: PCT application non-entry in European phase. Ref document number: 20789816; Country of ref document: EP; Kind code of ref document: A1.