GB2538733A - Control system for media manipulation - Google Patents

Control system for media manipulation

Info

Publication number
GB2538733A
GB2538733A GB1508953.5A GB201508953A GB2538733A GB 2538733 A GB2538733 A GB 2538733A GB 201508953 A GB201508953 A GB 201508953A GB 2538733 A GB2538733 A GB 2538733A
Authority
GB
United Kingdom
Prior art keywords
effect
point
map
control point
user interface
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
GB1508953.5A
Other versions
GB201508953D0 (en)
Inventor
Bernard Streater Stephen
Musa Abdelaziz
Wikholm Jens
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Blackbird PLC
Original Assignee
Forbidden Technologies PLC
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Forbidden Technologies PLC filed Critical Forbidden Technologies PLC
Priority to GB1508953.5A priority Critical patent/GB2538733A/en
Publication of GB201508953D0 publication Critical patent/GB201508953D0/en
Priority to PCT/GB2016/051491 priority patent/WO2016189296A1/en
Publication of GB2538733A publication Critical patent/GB2538733A/en

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04845Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range for image manipulation, e.g. dragging, rotation, expansion or change of colour
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04847Interaction techniques to control parameter settings, e.g. interaction with sliders or dials
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Processing Or Creating Images (AREA)

Abstract

A user interface for modifying multiple characteristics of visual or audio media comprising an effect map having multiple effect points each representing transformation of a particular characteristic of the target media, and an input map wherein a control point is moveable by a user. The characteristic transformations represented by each effect point are effected according to the relative location of the control point to the effect point. The input map does not indicate the location or identity of the effect points within the effect map. The input map may be displayed on a touch screen and the effect map may be overlaid upon the input map. The level of modification of the characteristics represented by each effect point may be dependent upon the position of the control point within an effect point radius. The aim of the invention is to provide a simpler and more intuitive control method for manipulating photo, video and audio media by masking any labelling of the effects to be applied. The user is encouraged to rely upon visual/audio feedback of the effects as they are applied in real time so that the desired result may be achieved.

Description

CONTROL SYSTEM FOR MEDIA MANIPULATION
The present invention relates to a control system and user interface to allow an end user to easily and efficiently apply a multitude of effects to visual or audio media. Specifically, the present invention provides a novel and intuitive way for the end user to achieve complex effects using a very simple interface.
Recent advances in consumer technology have given more people the means to easily record media (specifically photo, video and audio clips) on to devices including mobile phones, hand held devices, tablets and personal computers, or use such devices to access previously recorded media over a network. With greater access comes the desire to be able to easily and quickly manipulate/enhance/edit the media as the end user desires. Increasingly, it is also desirable that users with limited or no understanding of traditional media manipulation and editing parameters are able to achieve results based on fast feedback from their devices.
Media manipulation, such as video and image manipulation, using controls to vary individual image characteristics such as hue, saturation and luminance, is well known in the art. It is also known to manipulate a combination of individual image or audio characteristics by using one or more weighted preset filters to achieve an overall desired effect. Individual parameters within a particular effect/filter may be manipulated using individual settings. However, manually changing these individual settings requires time and expertise, and novice users may find it difficult and time consuming.
Tools such as a colour wheel allow the user to vary two separate characteristics simultaneously by moving a control within a confined control area/map. However, such 2D maps are typically restricted to controlling only two parameters at a time owing to the two-dimensional nature of the control. Further parameters must be adjusted using additional controls. For instance, the lightness and intensity of a particular image/video may be adjusted using a 2D control map incorporating lightness along one axis and intensity along the other. Further parameters (such as hue) must be adjusted using additional controllers, perhaps a separate slider.
Overall, whilst multiple effects may be applied and fine-tuned to photo and video media, the user must make an informed decision about which effects they wish to apply, or which characteristics they wish to manipulate.
The present invention adds several dimensions of sophistication to previously known media manipulation methods by mixing the preset effects together in an intuitive way, allowing users, regardless of experience, to achieve a desired outcome based on rapid feedback from the device displaying the modified media. Unlike known solutions, the input device does not necessarily provide the user with all of the up-front knowledge as to the effect that any given point on a control map will have, or perhaps any information at all. Therefore the user does not need to rely on any knowledge of image, audio or video (henceforth referred to collectively as "media") manipulation. This simpler and more intuitive control method for manipulating media is achieved by masking any labelling of the effects to be applied so the user can rely upon visual or audio feedback of the effects as they are applied, preferably in real time, until a desired result is achieved.
The present invention makes it easy for an end user to apply an effect, or blend multiple effects and presets to a media file (video, image or audio), in a fraction of a second with a single touch or movement of a finger on a control. By providing instant feedback based on at least one of the current position, path or pressure of the input control, inexpert users may achieve a desirable end result without needing knowledge of complex filtering methods or any upfront understanding of how to achieve an end result based on such filters.
The user may change rapidly to another outcome, or fine tune a chosen outcome, by moving the control point within the control map or by carrying out smaller movements within the vicinity of a currently selected point respectively. The simple controls can be expressed on a small device and potentially with a less sensitive input method such as that provided by a small touch screen. The present invention replaces a large number of controls which could otherwise take a large amount of display space with a single multi-dimensional control.
Statement of Invention
In its broadest aspect the present invention provides a user interface for modifying multiple characteristics of visual or audio media, the interface comprising an effect map, the map being a pre-defined region comprising multiple effect points located within the effect map, each effect point representing transformation of a particular characteristic, or group of characteristics, of the media, an input map being a predefined region wherein a control point is moveable by a user within the bounds of the input map, whereby said characteristic transformations represented by each effect-point are effected according to the relative location of the control point to the effect-point, and wherein the input map does not indicate the location or identity of the effect points within the effect map.
In accordance with one aspect the input map is displayed on a touch screen.
Preferably the effect map is overlaid upon the input map.
Preferably the user interface further comprises a corresponding visual or audio output providing feedback representing the outcome due to the transformations being applied to the target media.
Preferably the output is substantially instant.
In accordance with one aspect the output is to a display screen, and the input map is displayed on the same screen.
In accordance with a further aspect the moveable control point has an effect radius, whereby any effect-point only affects the transformation if it falls within the effect-radius of the control point.
In accordance with a still further aspect the effect points have an effect radius, whereby the control point only affects the transformation if the control point falls within the effect-radius of an effect point.
Preferably the level of modification of the characteristics represented by each effect point is dependent upon either or both of the position within the effect-point-radius of the control point, and the position within the control-point-radius of the effect point.
Preferably the effect map may be modified by a pre-determined path carried out by the control point, such as rotation of the control point about the input map.
List of Figures
The present invention will now be described, by way of example only, with reference to the accompanying drawings, in which:
Figure 1 provides an example of image manipulation controls provided in the prior art.
Figure 2 provides further examples of image manipulation controls provided in the prior art.
Figure 3 shows the control map according to a first embodiment of the present invention.
Figure 4 shows the control map according to a second embodiment of the present invention.
Figure 5 shows the control map according to a third embodiment of the present invention.
Figure 6 shows the control map according to a fourth embodiment of the present invention.
As shown in Figure 1, methods of manipulating image and video media in the prior art are known to incorporate single on/off controls A,B,C,D and sliding variable controls E,F to apply single or multiple effects, or adjust image characteristics to varying degrees. For example, a black and white filter may be applied to an image or video by simply selecting or deselecting the black and white effect. Adjustable variables such as hue, intensity, and opacity may be adjusted using sliding variables which apply or manipulate each characteristic along a variable scale.
Figures 2a and 2b provide examples of 2D control maps which are able to combine the control of two variable characteristics. One such example is the colour wheel in Fig 2a, which adjusts intensity along the radius of the colour wheel G and hue along the circumference and angle G'. Another example, as shown in Fig 2b, controls variables along two perpendicular axes H, H' within the map, such as intensity vs hue.
However, both methods rely on the end user being aware of the effects that each control will have, and are restricted to applying multiple filters using single on/off controls.
Whilst known video and photo effect solutions may provide multiple effects which may be adjusted manually within multiple controls, in modern hand held devices where screen real estate may be restricted, the use of single preset effects available in a single button is preferred.
As shown in Figures 3 to 6, the present invention provides a user interface for manipulating, enhancing or changing effects (such as colours and layered effects) in a video, image or animation. Whilst the examples provided are primarily directed towards image and video manipulation, it will be understood by the person skilled in the art that the invention could easily be employed in any media manipulation, including audio: for example, adjusting sound effects and frequencies of an input audio file and outputting the effects in real time via an audio output system such as a speaker.
The present invention provides an interface comprising a representation of a physical region, known as an input map (for example a defined space on a device screen, or the area above and around it) with a control point moveable within the input map region under a user's control (for example by touch or mouse input, or any other suitable means). The system can interpret the position (for example on, or around, a screen), path and even pressure (for example on a screen) of the control point within the input map region.
There also exists an effect map region comprising a number of preset variables at different points within the effect map. Each of these preset points defines an effect to be applied to the target media, transforming the original attributes (comparable to a filter). These effect points incorporate a wide array of settings including, but not limited to, RGB, YUV, luminance, brightness, contrast, hues, saturation, textures, layers, enhancement settings, pitch and playback speed, amongst others. As the control point is moved about the input map, its proximity to the multitude of effect points within the effect map will influence the transformation of the input media owing to the corresponding effect, and the resultant modified media is outputted, for example on a display screen or via speakers. The user interface tracks the position (or pressure) of the control point and calculates which effect is to be applied (and to what extent) to an input image or video depending on the distance between the control point within the control map and one or more preset effect points within the effect map.
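The proximity-weighted application of preset effects described above can be sketched in a few lines. The data model below is an illustrative assumption (the patent does not prescribe one): each preset is a dictionary of characteristic deltas, and the weighted deltas are applied to the source media's base characteristics.

```python
def blend_effects(weights, presets, base):
    """Illustrative sketch, not the patent's implementation: each preset
    effect is a dict of characteristic deltas, e.g. {"saturation": -1.0};
    the control point's proximity yields a weight per preset, and the
    weighted deltas are accumulated onto the base characteristics."""
    out = dict(base)
    for name, weight in weights.items():
        for characteristic, delta in presets[name].items():
            out[characteristic] = out.get(characteristic, 0.0) + weight * delta
    return out
```

How the per-preset weights are derived from the control point's position is the subject of the embodiments that follow.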
It is important to note that whilst it is contemplated that the input map and effect map may overlap fully, this is not necessarily the case. They may overlap only partially, or even not at all. For ease of understanding the following embodiments describe a system where both the input and effect maps fully overlap. However it is understood by the skilled person that this is not necessarily the case, for example the effect points may exist outside of the input map, and therefore may not be directly reachable by the user input control point. In such cases, their influence may still be felt owing to the varying proximity of the control point to the effect points. This will become more apparent later on.
In a preferred embodiment the output display screen may be the same as the screen upon which the user interface is provided (such as a touch screen), such that the interface (e.g. the control map) overlays the output image or video.
A single control point allows the user to engage with the interface single-handedly.
Whilst the preset effect points distributed about the effect map can be seen in Figures 3 to 6, a key element of the invention is that the user interface does not necessarily display these effect points to the end user. The input map is bereft of any indication as to which effects are available, where they are located (on the effect map), and how much the control point will interact with each of the effect points. Instead, the user is reliant upon feedback received directly from an output device. The user may fine tune the outcome by moving the control point within the input map about the immediate vicinity of the point representing the desired outcome. Moving the control point further is likely to bring about greater changes to the effect(s) applied. Once the user has found a combination that provides a desirable result, they may apply ("fix") the filter(s)/effect(s) to the source media. The user's choice is made upon direct feedback from the output device (be it visual or audio) providing an almost immediate representation of how the selected combination of effects will affect the source media. The user is not forced to adjust multiple individual characteristics, and does not need to identify, or understand, a range of filter methods.
The first embodiment of the invention, as shown in Figure 3, incorporates both a circular input and a circular effect map, aligned with each other 300 (although as discussed, this need not always be the case). Multiple effect points 301,302,303,304,305,306,307,308 are distributed about the circumference of the (input/effect) map 300. The centre of the map represents a "null point" 310. When the user's control point is in the centre, no effects are applied to the source media. As the user moves the control point out of the centre and towards the outer circumference (radially), the different effect points will begin to come into play. If the control point is moved directly outwards along a line of radius towards an effect point, the corresponding effect (for example fading to black & white) will be applied gradually until the control point is situated on the effect point, applying the effect fully. The rate at which the effect is applied as the control point travels towards it may not be linear.
Furthermore, if the control point is maneuvered towards the outside on a heading between two adjacent effect points, each effect is applied gradually as the control point is moved outwards, and the two effects are blended depending upon the circumferential position of the control point between the two. For example, with the control point sat directly over a single effect point, the corresponding effect will apply 100%. However, as the control point is moved circumferentially towards an adjacent effect point (maintaining the radius) the original effect decreases as the effect of the adjacent effect point increases. If the control point maintains its circumferential position between two effect points but is maneuvered radially either away or towards the central "null point", the relative contribution of each effect remains, but their combined strength is reduced/increased depending on the radial position of the control point.
In the given examples, control point "A" is situated on the circumference, equally between effect points 301 and 302. Therefore the contribution by both effects is equal, 50% strength each to provide 100% effect strength in total.
Control point "B" is situated on a radial line directly between the null point 310 and effect point 303. Therefore only effect 303 is applied, and to roughly half strength (depending on the rate of change applied radially).
Control point "C" is nearer to point 306 than to effect point 305, and therefore the contribution from effect 306 will be greater. The control point is also found about 75% of the way out from the centre 310, so both effects will be tempered accordingly.
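The radial and circumferential blending illustrated by control points "A", "B" and "C" can be sketched numerically. The function below is an illustrative assumption (eight evenly spaced rim effects and a linear radial ramp), not the patent's prescribed implementation; the description explicitly notes the radial rate need not be linear.

```python
def polar_blend_weights(r, theta_deg, n_points=8):
    """Hypothetical sketch of the first embodiment's blending rule:
    effect points sit every 360/n_points degrees on the rim; the control
    point's angle blends the two adjacent effects, and its radius scales
    their combined strength (0 at the null point, 1 at the rim).
    A linear radial ramp is assumed for simplicity."""
    spacing = 360.0 / n_points
    theta = theta_deg % 360.0
    lower = int(theta // spacing)        # effect point "behind" the control point
    upper = (lower + 1) % n_points       # adjacent effect point "ahead" of it
    frac = (theta - lower * spacing) / spacing
    weights = {i: 0.0 for i in range(n_points)}
    weights[lower] = (1.0 - frac) * r    # angularly closer -> stronger
    weights[upper] = frac * r
    return weights
```

Under this sketch, a control point on the rim midway between two adjacent effects (point "A") yields a 50/50 blend at full combined strength, while one halfway along a radius toward a single effect (point "B") yields that effect alone at half strength.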
The spatial arrangement of the preset effect points is not required to be uniform, and they may be located at any point (provided it is within the pre-defined effect map area). Calculation of the influence of each preset effect point is also not required to be uniform or linear. Furthermore, the size and shape of effect-points is not restricted. Whilst Figure 3 shows the effect-points as single points dotted about the map, any effect-point may instead be of a larger size, or of irregular shape, provided it remains within the pre-defined map area.
The above example provides a system where only two effects are applied at once. However, further embodiments envisage a system where more than two effects may come into play, depending upon the "effect radius" of each effect point.
The "effect radius" of an effect point is the region within which that effect begins to apply: the effect contributes only once the control point is inside the effect radius of that particular effect point. For the first embodiment, the effect radius of each effect point can be considered to be a 90° section of the circle, circumferentially centered on the effect point (e.g. shaded region 317 about point 307). Therefore each effect point will play a role provided the control point does not cross the adjacent direct centre-to-circumference line (at 45° from the effect point) in either direction. The effect radii may of course overlap, allowing for multiple effects to be applied at once.
In the second embodiment, shown in Figure 4, the system provides for effect points not only on the edge of the input/effect map 400, but also located inside the region. Eight effect points 401,402,403,404,405,406,407,408 are located around the circumference of the map 400, and eight effect points 421,422,423,424,425,426,427,428 are located more towards the centre of the map 400. Five example effect radii 423′, 404′, 424′, 405′, 425′ are displayed.
As the user moves the control point about the input map, it is likely to fall within the effect radii of more than one effect point. It is important to note that each effect radius may be of a different size, and can be of any shape, not necessarily uniform. The strength gradient of the effect is also not required to be uniform, other than beginning at the edge of the corresponding effect radius and growing to a maximum at the effect point within it. Given the range of shapes available, the effect point may or may not be in the centre of its effect radius. The separate effect radii are likely to overlap to allow for combined effects, but may also be mutually exclusive of each other in situations where independent filters and effects are offered.
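A minimal numeric sketch of these overlapping influence regions follows. The circular regions and linear ramp are simplifying assumptions; as noted above, the patent allows arbitrary shapes, off-centre effect points and non-linear gradients.

```python
import math

def effect_radius_weights(control, effect_regions):
    """Sketch of the second embodiment's influence regions. Each entry in
    effect_regions is (x, y, radius) for one effect point; influence ramps
    linearly from 0 at the edge of the region to 1 when the control point
    sits directly on the effect point. Regions may overlap, so several
    weights can be non-zero at once."""
    cx, cy = control
    return [max(0.0, 1.0 - math.hypot(cx - x, cy - y) / r)
            for (x, y, r) in effect_regions]
```

A control point falling inside two overlapping regions simply receives two non-zero weights, which can then be fed into whatever blending step combines the presets.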
This allows the present invention to provide a truly unique media manipulation experience as the user pilots the control point around the input map, engaging with the seemingly "invisible" effect points as they do so. Each different direction will increase/decrease the strength of any number of effects as the control point is brought near to/further from the effect points.
In a third embodiment, and as shown in Figure 5, it is the control point that incorporates an effect-radius surrounding it, and not the effect points. Each preset effect point will have no effect upon the input image or video until it falls within the pre-defined effect-radius of the control point as the control-point is moved about the map. The strength of the effect applied may also depend on the distance of the effect-point from the centre of the control point effect-radius to its outer limit. For instance, the strength of the effect may scale from minimal at the edge of the effect-radius to maximal at the centre of the effect-radius (i.e. when the control point is directly over the effect point). Therefore, as the user moves the control point (by their finger, mouse pointer or other suitable means) around the map, the output image or video will smoothly transition between preset effects as they pass in and out of the effect-radius of the control point.
Depending on the effect-radius of the control point and the location of the individual effect-points, the control point may interact with more than one effect-point at once, and thus provide an overall effect which can be a combination of a multiple parameters which would not otherwise be possible to express on a standard two-dimensional map other than by using a large number of individual controls.
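This combination might be sketched as follows, with the effect radius now attached to the control point; the names and the linear scaling are illustrative assumptions only.

```python
import math

def effects_within_control_radius(control, control_radius, effect_points):
    """Sketch of the third embodiment: the control point carries the effect
    radius. Every named effect point inside that radius contributes, scaled
    from maximal when the control point sits over it down to minimal at the
    edge, so several otherwise-independent effects combine into one
    multi-parameter transform."""
    cx, cy = control
    active = {}
    for name, (x, y) in effect_points.items():
        d = math.hypot(cx - x, cy - y)
        if d < control_radius:
            active[name] = 1.0 - d / control_radius
    return active
```

Two effect points captured by the same control radius thus yield a blend that a conventional two-axis control could only express with additional, separate controls.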
Furthermore, the invention may include effect radii for both the effect points and the control points in combination.
In a fourth embodiment, and as seen in Figure 6, the effect map 600 may be adjusted by tracking the path of the control input and providing different effect points once the control point has been rotated fully about the input map. Instead of interpreting the input map to be a flat surface about which the control point may be moved, in the fourth embodiment the input map can be seen as a helicoid shape (i.e. a filled-in helix). Therefore, once the user has completed a full rotation of the input map with the control point, the control point is left at a different position in relation to the effect map, with a different set of local effect points. Therefore a 2D user input, such as a touch screen, can be utilized to provide many more effect points than the input area might normally allow by allowing full rotation (or any other recognized path about the input map) to affect the effects given by the effect map.
This provides another way of adjusting a particular characteristic, e.g. the overall hue of the input image or video.
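One way such rotation tracking could be realized is sketched below. The class name, the angle-unwrapping scheme and the per-turn "layers" of effect points are all assumptions for illustration; the patent only requires that a recognized path (such as a full rotation) changes which effect points are presented.

```python
import math

class HelicalInputMap:
    """Illustrative sketch of the fourth embodiment: the input map is
    treated as a helicoid, so completing a full rotation with the control
    point moves the user onto a new turn with a different set of effect
    points."""

    def __init__(self, layers_of_effect_points):
        self.layers = layers_of_effect_points
        self.total_angle = 0.0   # cumulative signed rotation, in degrees
        self.prev_theta = None

    def move(self, theta_deg):
        """Record a new angular position and return the currently
        active layer of effect points."""
        if self.prev_theta is not None:
            # Accumulate the signed change in angle, unwrapping across 0/360.
            delta = (theta_deg - self.prev_theta + 180.0) % 360.0 - 180.0
            self.total_angle += delta
        self.prev_theta = theta_deg
        turn = int(self.total_angle // 360.0) % len(self.layers)
        return self.layers[turn]
```

After one complete clockwise rotation the same 2D screen position maps onto a different layer, multiplying the number of reachable effect points without enlarging the input area.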
Whilst the present embodiments and examples provide the input and effect map overlapping each other, the skilled person will understand how they may be skewed, or indeed fully exclusive from each other. Therefore, whilst the user might manoeuvre the control point within the input map towards an effect point on the effect map, if they are exclusive to each other, the control may only ever approach, but may not reach the effect point.
As has been mentioned previously, the skilled person will understand that control of the control point may not be restricted to movement upon an input map, but other inputs including user pressure (for example upon a touch screen) or distance from a surface (for example hovering above a proximity sensing surface) may also be used to control the control point and therefore effect change in the source media.

Claims (11)

  1. A user interface for modifying multiple characteristics of visual or audio media, the interface comprising: an effect map, the map being a pre-defined region comprising multiple effect points located within the effect map, each effect point representing transformation of a particular characteristic, or group of characteristics, of the target media; an input map being a predefined region wherein a control point is moveable by a user within the bounds of the input map; whereby said characteristic transformations represented by each effect-point are effected according to the relative location of the control point to the effect-point, and wherein the input map does not indicate the location or identity of the effect points within the effect map.
  2. 2. The user interface according to claim 1, whereby the input map is displayed on a touch screen.
  3. 3. The user interface according to claim 1 or 2, whereby the effect map is overlaid upon the input map.
  4. 4. The user interface according to any preceding claim, further comprising a corresponding visual or audio output providing feedback representing the outcome due to the transformations being applied to the target media.
  5. 5. The user interface according to claim 4 wherein the output is substantially instant.
  6. 6. The user interface according to claim 4 or 5, whereby the output is to a display screen, and the input map is displayed on the same screen.
  7. 7. The user interface according to any preceding claim whereby the moveable control point has an effect radius, whereby any effect-point only affects the transformation if it falls within the effect-radius of the control point.
  8. 8. The user interface according to any preceding claim whereby the effect points have an effect radius, whereby the control point only affects the transformation if the control point falls within the effect-radius of an effect point.
  9. 9. The user interface according to claim 7 or 8 whereby the level of modification of the characteristics represented by each effect point is dependent upon the position within the effect-point-radius of the control point, and/or the position within the control-point-radius of the effect point.
  10. 10. The user interface according to any preceding claim wherein the effect map may be modified by a pre-determined path carried out by the control point, such as rotation of the control point about the input map.
  11. 11. The user interface substantially as described herein and with reference to figures 3 to 6.
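The distance-weighted behaviour set out in claims 1 and 7 to 9 can be sketched as follows. All identifiers and the linear falloff are assumptions chosen for illustration; the patent does not specify a particular weighting function.

```python
# Sketch of the effect-blending described in claims 1 and 7-9:
# an effect point contributes only when it lies within the control
# point's effect radius (claim 7), and its contribution scales with
# proximity (one possible reading of claim 9). Linear falloff is an
# assumption, not taken from the patent.
import math

class EffectPoint:
    def __init__(self, x, y, characteristic, strength=1.0):
        self.x, self.y = x, y
        self.characteristic = characteristic  # e.g. "brightness"
        self.strength = strength

def blend_effects(control_x, control_y, effect_points, effect_radius):
    """Return per-characteristic weights for a control point position."""
    weights = {}
    for p in effect_points:
        d = math.hypot(control_x - p.x, control_y - p.y)
        if d < effect_radius:
            # Full weight at zero distance, falling linearly to zero
            # at the edge of the effect radius.
            w = p.strength * (1.0 - d / effect_radius)
            weights[p.characteristic] = weights.get(p.characteristic, 0.0) + w
    return weights

# A control point sitting exactly on an effect point receives full
# weight; one outside the radius contributes nothing.
points = [EffectPoint(0.5, 0.5, "brightness"),
          EffectPoint(0.9, 0.9, "contrast")]
print(blend_effects(0.5, 0.5, points, 0.2))
```

Note that the input map side of claim 1 (the region in which the control point moves, without revealing effect-point locations) is purely a presentation concern: the blending above depends only on coordinates, so the same function works whether or not the effect map is displayed.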
GB1508953.5A 2015-05-26 2015-05-26 Control system for media manipulation Withdrawn GB2538733A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
GB1508953.5A GB2538733A (en) 2015-05-26 2015-05-26 Control system for media manipulation
PCT/GB2016/051491 WO2016189296A1 (en) 2015-05-26 2016-05-24 Control system for media manipulation

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
GB1508953.5A GB2538733A (en) 2015-05-26 2015-05-26 Control system for media manipulation

Publications (2)

Publication Number Publication Date
GB201508953D0 GB201508953D0 (en) 2015-07-01
GB2538733A true GB2538733A (en) 2016-11-30

Family

ID=53506310

Family Applications (1)

Application Number Title Priority Date Filing Date
GB1508953.5A Withdrawn GB2538733A (en) 2015-05-26 2015-05-26 Control system for media manipulation

Country Status (2)

Country Link
GB (1) GB2538733A (en)
WO (1) WO2016189296A1 (en)

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110074809A1 (en) * 2009-09-30 2011-03-31 Nokia Corporation Access to control of multiple editing effects
US20140181671A1 (en) * 2012-10-05 2014-06-26 Photographica Limited System, method and computer-accessible medium for manipulating a plurality of components using a single gesture or motion

Family Cites Families (4)

Publication number Priority date Publication date Assignee Title
US6490359B1 (en) * 1992-04-27 2002-12-03 David A. Gibson Method and apparatus for using visual images to mix sound
US7843427B2 (en) * 2006-09-06 2010-11-30 Apple Inc. Methods for determining a cursor position from a finger contact with a touch screen display
WO2010138309A1 (en) * 2009-05-26 2010-12-02 Dolby Laboratories Licensing Corporation Audio signal dynamic equalization processing control
US20140173519A1 (en) * 2011-05-24 2014-06-19 Nokia Corporation Apparatus with an audio equalizer and associated method

Patent Citations (2)

Publication number Priority date Publication date Assignee Title
US20110074809A1 (en) * 2009-09-30 2011-03-31 Nokia Corporation Access to control of multiple editing effects
US20140181671A1 (en) * 2012-10-05 2014-06-26 Photographica Limited System, method and computer-accessible medium for manipulating a plurality of components using a single gesture or motion

Also Published As

Publication number Publication date
GB201508953D0 (en) 2015-07-01
WO2016189296A1 (en) 2016-12-01

Similar Documents

Publication Publication Date Title
JP5268919B2 (en) Color selection input apparatus and method
US9495087B2 (en) Two-dimensional slider control
US9917987B2 (en) Media editing with overlaid color adjustment tools
US9105121B2 (en) Image editing with user interface controls overlaid on image
JP5990600B2 (en) Color adjuster for color classification
US8823726B2 (en) Color balance
US9594493B2 (en) Graphical user interface with dial control for a parameter
US9563972B2 (en) Methods and apparatus for providing color palette management within a graphical user interface
CN104247390A (en) User interface tools for cropping and straightening image
US20150109323A1 (en) Interactive black and white image editing
US7586499B1 (en) Method and apparatus for adjusting the color of a digital image
WO2016080011A1 (en) Color attribute display device, method, and program for digital image and image processing apparatus
US10168880B2 (en) System, method and computer-accessible medium for manipulating a plurality of components using a single gesture or motion
US20150212721A1 (en) Information processing apparatus capable of being operated by multi-touch
GB2538733A (en) Control system for media manipulation
EP3046340B1 (en) User interface device, sound control apparatus, sound system, sound control method, and program
CN106648311B (en) Color adjusting method for mobile phone display screen and smart phone
JP7364381B2 (en) Image processing device
KR20180053883A (en) Apparatus and method for providing changeable controller user interface based of user information and contents attribute

Legal Events

Date Code Title Description
WAP Application withdrawn, taken to be withdrawn or refused ** after publication under section 16(1)