GB2498508A - Positional manipulation of an object in a GUI beyond that permitted by an associated application - Google Patents

Positional manipulation of an object in a GUI beyond that permitted by an associated application

Info

Publication number
GB2498508A
GB2498508A
Authority
GB
United Kingdom
Prior art keywords
text
user input
manipulation
displayed
controller
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
GB201118752A
Other versions
GB201118752D0 (en)
Inventor
Nigel Pearce
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Promethean Ltd
Original Assignee
Promethean Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Promethean Ltd filed Critical Promethean Ltd
Priority to GB201118752A priority Critical patent/GB2498508A/en
Publication of GB201118752D0 publication Critical patent/GB201118752D0/en
Priority to PCT/EP2012/070626 priority patent/WO2013064378A1/en
Publication of GB2498508A publication Critical patent/GB2498508A/en
Withdrawn legal-status Critical Current

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F 3/04812 Interaction techniques based on cursor appearance or behaviour, e.g. being affected by the presence of displayed objects
    • G06F 3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F 3/0486 Drag-and-drop

Abstract

A graphical user interface in which a displayed object is manipulated according to user input, in accordance with a positional manipulation permitted by the application which displays the object. A condition under which further positional manipulation of the object is prevented by the application is detected; in response, further user input is monitored and, if that further user input meets a predetermined condition, further positional manipulation of the object is allowed. The manipulation may include moving, resizing or rotating the object, and the user input may include selecting an object at a user input position. The condition may relate to a positional deviation between the user input and the displayed object, and further manipulation may be allowed when the deviation is maintained for a predetermined length of time or exceeds a predetermined distance. The user interface is preferably for an interactive display system including a touch-sensitive display.

Description

GUI OBJECT MANIPULATION PREVENTION AND OVER-RIDE
BACKGROUND TO THE INVENTION:
Field of the Invention:
The present invention relates to the manipulation of a displayed object in a graphical user interface, where the manipulation of the object is in certain circumstances prevented by the software application under which manipulation is taking place.
Description of the Prior Art:
Computer software applications often permit certain specific displayed object manipulations on a graphical user interface (GUI) using one or more inputs. Examples of permitted object manipulations include: moving displayed files and folders; increasing the size of a displayed window; rotating a displayed image; and moving a displayed vertical scrollbar to navigate through a document.
During a specific displayed object manipulation, the manipulation can sometimes become prevented or altered by the software application. Examples of restricting a manipulation include: a scrollbar movement being prevented when it hits the scroll end stop; a window movement being prevented when it meets the edge of the display screen; an object resize manipulation being prevented when it reaches a pre-determined size; and a 'free' object movement being altered into a vertical only movement as a result of the object meeting a horizontal restriction.
In some situations, when a previously permitted object manipulation becomes prevented or altered by the application, the user may nevertheless require the prevented or altered manipulation to continue, or some other action to occur on the object.
When this situation arises, it is normally the case that the user must stop the current manipulation action attempt, and then perform some other action before starting a new manipulation on the object.
As an example, an object has a maximum height property. If an object size manipulation causes the object to reach its maximum height, the application will prevent further sizing in height. If the user wishes to make the object any larger, they must first alter the maximum height property, then start a new size manipulation on the object.
It is an aim of the invention to provide for improved manipulation of a displayed object in a graphical user interface, where the manipulation of the object is in certain circumstances prevented by the software application under which manipulation is taking place.
SUMMARY OF THE INVENTION:
The invention provides a user with a technique to optionally allow a prevented or altered manipulation of a displayed object to continue, or to perform some other action on the displayed object, after the manipulation has been prevented or altered and as part of the ongoing manipulation attempt, i.e. without having to first perform some other action. The manipulation that may be prevented or altered is typically associated with a default operation of a software application under the control of which the manipulation is being carried out.
There is provided a method for manipulating a displayed object of a graphical user interface, comprising: manipulating the object under the control of a user input in accordance with a positional manipulation permitted by an application under the control of which the object is displayed; and detecting a condition under which the further positional manipulation of the object is prevented by the application, and in response thereto: monitoring further user input; and in dependence upon the further user input meeting a predetermined condition, further positionally manipulating the object.
The step of detecting a condition under which further manipulation of the object is prevented may comprise detecting a deviation in the positional relationship between the user input and the displayed object. The user input may be associated with a position on the graphical user interface, and the step of positionally manipulating the object under the control of a user input may comprise selecting the object by locating the position associated with the user input on the object at an object selection point, wherein the step of detecting deviation comprises the step of detecting a deviation between the position associated with the user input and the object selection point on the object.
The step of positionally manipulating the object may be under the control of movement of the user input.
The step of positionally manipulating the object may comprise one or more of: moving the object, resizing the object, or rotating the object.
The method may further comprise, on detecting the condition under which the further manipulation of the object is prevented, storing the positional relationship between the user input and the displayed object, wherein the step of monitoring the further user input comprises monitoring a subsequent positional relationship between the user input and the displayed object.
The step of further manipulating the object may comprise enabling an action determined by the further user input.
The step of further manipulating the object may comprise restoring the relationship between the user input and the displayed object.
The predetermined condition may comprise maintaining the deviation in the positional relationship between the user input and the displayed object for a predetermined length of time.
The predetermined condition may comprise determining if the deviation in the positional relationship between the user input and the displayed object exceeds a predetermined distance.
The step of further manipulating the object may be further dependent upon a selection of the further manipulation.
If the selection is not made the positional relationship between the user input and the displayed object may be restored.
The invention further provides means for carrying out the stated method steps.
The invention provides a computer program which, when run on a computer, carries out the stated method steps.
The invention provides a computer program product for storing computer program code which, when run on a computer, carries out the stated method steps.
The invention further provides apparatus adapted to perform the method steps.
BRIEF DESCRIPTION OF THE FIGURES:
The invention is now described by way of example with reference to the accompanying Figures, in which:
Figures 1(a) to 1(e) illustrate the principles of object containment;
Figures 2(a) and 2(b) illustrate a first prior art problem associated with object containment;
Figures 3(a) to 3(c) illustrate a second problem associated with object containment;
Figure 4 illustrates a process flow in accordance with the general principles of the invention;
Figures 5(a) to 5(d) illustrate a solution to the problems of object containment in accordance with an embodiment of the invention;
Figure 6 illustrates an exemplary implementation of circuitry for implementing the method of Figure 4; and
Figure 7 illustrates an exemplary interactive display system in which the invention may be implemented.
DESCRIPTION OF THE PREFERRED EMBODIMENTS:
The invention is generally applicable to any computer system associated with a display on which objects are displayed, and which objects are controlled by a computer application running on the computer system.
Such a computer system is provided with an input device. An input device is any device which allows the computer system, or an application running on the computer system, to be controlled.
An input device may be, for example: a mouse, a keyboard, a stylus, a touch sensitive screen etc. A computer system may have one or more types of input device, and one or more of any input device type.
Other terminology associated with input devices and inputs is used in the following description.
An input event is a computer instruction derived from an input device and received by an application. An example of an input event associated with a touch sensitive display surface is "TouchMove", which represents movement of a detected touch across the surface by one unit of distance.
An input event stream is a set of input events derived from the same input device and received in sequence by an application. An example of an input event stream associated with a touch sensitive display surface is "TouchDown; TouchMove; TouchMove; TouchUp". This sequence of events represents: detection of a touch input at the surface; detection of movement of one unit distance across the surface; detection of movement of one further unit distance across the surface; and detection of release of the touch input (i.e. release of contact with the surface). The terms 'input device', 'input', 'input event' and 'input event stream' are well known to one skilled in the art, and the above definitions are well-understood definitions of the terms, included herein for completeness.
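By way of illustration only, the following TypeScript sketch models an input event and an input event stream for a touch sensitive display surface; the type name and the example values are hypothetical and are not taken from the described embodiments.

```typescript
// Illustrative model of input events from a touch sensitive display
// surface; names and values are hypothetical.
type InputEvent =
  | { kind: "TouchDown"; x: number; y: number }   // touch detected at (x, y)
  | { kind: "TouchMove"; dx: number; dy: number } // movement by one unit of distance
  | { kind: "TouchUp" };                          // release of contact with the surface

// An input event stream: events from the same device, received in sequence.
const exampleStream: InputEvent[] = [
  { kind: "TouchDown", x: 10, y: 10 },
  { kind: "TouchMove", dx: 1, dy: 0 },
  { kind: "TouchMove", dx: 1, dy: 0 },
  { kind: "TouchUp" },
];
```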
An embodiment of the invention is described with reference to an example of removing a contained object from a bounded container object.
A 'container object' is an object that is designed to hold other objects, such that when the container object is manipulated, the contained objects are also manipulated accordingly. An example of a container object is a "File View" object, which displays individual objects that each represent a contained file.
A 'contained object' is an object that is held within a container object. An example of a contained object is a file which is represented within a "File View" window.
The terms 'container object' and 'contained object' are well known to one skilled in the art, and the above definitions are well-understood definitions of the terms included herein for completeness.
The example with respect to which the embodiment of the invention is described is a non-limiting example, and one skilled in the art will appreciate that the invention is more broadly applicable than the specifically described example.
Computer applications in general often implement the concept of object containment. This concept typically involves a container object and one or more contained objects.
When an object is contained by a container object, one feature of this arrangement is that when the container object is manipulated, the contained object is also manipulated in the same way as the container.
A contained object is typically displayed over the top of and within the boundary of the container object.
In order to make an object a contained object within a container object, one approach is to 'drag and drop' the object onto the container object.
The term 'drag' refers to the manipulation of an object such that the object changes position.
The term 'drop' refers to an object receiving a release input event. An example is a "TouchUp" event.
The term 'manipulate' refers to altering the position, size and/or angle of a displayed object using an input event stream from an input device to move, resize or rotate the displayed object. In general the invention applies to any manipulation of a displayed object, whether as part of a drag and drop operation or otherwise. This manipulation may more specifically be referred to as 'positional manipulation' of an object.
The terms 'drag', 'drop' and 'manipulate' in the context of displayed object manipulation are well known to one skilled in the art, and the above definitions are well-understood definitions of the terms included herein for completeness.
Figure 1(a) illustrates two displayed objects: Object A, denoted by reference numeral 102, and Object B, denoted by reference numeral 104. Neither Object A nor Object B is contained within a container object, and therefore neither is a contained object.
Using an input device (e.g. a finger on a touch sensitive display), Object A 102 is dragged and dropped onto Object B 104.
As shown in Figure 1(b), Object A 102 is now contained within Object B 104. Object A 102 is now a contained object and Object B 104 is now a container object.
The conditions for allowing containment to occur are well known. Such conditions may typically include one or more of the following (a test of this kind is sketched below, after the list):
* Upon dropping the object, the boundary of the dropped object must be fully located within the boundary of the container object. The boundary of an object may be defined as:
  o The bounding rectangle of the object (a non-rectangular object may have a bounding rectangle associated therewith which contains the object, although that bounding rectangle may not be displayed); or
  o The actual outline perimeter of the object (which may be non-rectangular).
* Upon dropping the object, the input position of the input device mapped to the display must be located within the boundary of the container object.
* Upon dropping the object, a pre-determined percentage of the area of the dragged object (e.g. >50%) must intersect or overlap the container object.
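A minimal sketch of such a containment test, assuming rectangular boundaries (the bounding-rectangle definition above), is given below; the function names and the 50% figure are illustrative assumptions, not prescribed by the described embodiments. The inverse of the same checks yields the un-containment conditions discussed further below.

```typescript
// Bounding rectangles only; a non-rectangular object would be approximated
// by its (possibly undisplayed) bounding rectangle. Names are illustrative.
interface Rect { x: number; y: number; w: number; h: number; }

// Condition 1: the dropped object lies fully within the container boundary.
function fullyInside(obj: Rect, container: Rect): boolean {
  return obj.x >= container.x && obj.y >= container.y &&
         obj.x + obj.w <= container.x + container.w &&
         obj.y + obj.h <= container.y + container.h;
}

// Condition 2: the display input position lies within the container boundary.
function pointInside(px: number, py: number, r: Rect): boolean {
  return px >= r.x && py >= r.y && px <= r.x + r.w && py <= r.y + r.h;
}

// Condition 3: a pre-determined fraction of the dragged object's area
// (e.g. more than 50%) intersects or overlaps the container.
function overlapFraction(obj: Rect, container: Rect): number {
  const ix = Math.max(0, Math.min(obj.x + obj.w, container.x + container.w)
                       - Math.max(obj.x, container.x));
  const iy = Math.max(0, Math.min(obj.y + obj.h, container.y + container.h)
                       - Math.max(obj.y, container.y));
  return (ix * iy) / (obj.w * obj.h);
}
```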
Referring again to Figure 1(b), in this arrangement if a user now manipulates Object B 104, Object A 102 will also be manipulated.
For example, as illustrated in Figure 1(c), Object B 104 is rotated by a certain angle, and Object A 102 rotates by the same amount, retaining the same position and orientation with respect to Object B 104.
The application may allow for continued manipulation of Object A 102 whilst it is contained within Object B 104. With reference to Figure 1(d), and in comparison to Figure 1(b), Object A 102 has been moved to a new position within Object B 104, yet it continues to be a contained object within the container defined by Object B 104.
In some applications it is possible to remove a contained object from a container object by simply manipulating the contained object such that it is moved away from the container object. In this situation, the conditions for allowing un-containment to occur may typically include one or more of:
* Upon dropping the object, the boundary of the dropped object must not intersect or overlay the boundary of the container object.
* Upon dropping the object, the input position of the input device mapped to the display must not be contained within the boundary of the container object.
* Upon dropping the object, a pre-determined percentage of the area of the dragged object (e.g. >50%) must not be contained within the container object.
With reference to Figure 1(e), Object A 102 has been dragged and dropped to a place that does not intersect the container object 104 and is therefore no longer contained.
A problem associated with object containment is that some computer applications that provide containment may force objects to remain contained by restricting the manipulation of the contained object to be within the boundary of its container object (or by some other means) under certain conditions.
A first problem can be understood with reference to Figure 2(a). Object A 102 is contained within Object B 104 consistent with Figure 1(b) above. An input device then attempts to drag Object A 102 to a position generally denoted by reference numeral 106 and marked with a cross in Figure 2(a), outside of the container object 104. However, the manipulation of Object A 102 is prevented when Object A 102 meets the boundary of Object B 104, as illustrated in Figure 2(b). As the user drags the display input position further toward position 106, Object A 102 remains in a fixed position adjacent the boundary of Object B 104, and the displayed object does not move to follow the ongoing movement of the input device.
It should be noted that the input device is associated with an input position on the display surface. The input position may correspond directly to the position of the input device, where for example inputs are provided by a touch input on a touch sensitive display. The input position may correspond to the position of the input device by a mapping of coordinate systems, where for example inputs are provided by a mouse. In general it can be understood that the input device is associated with a display input position on the display surface, the display input position being moved by a corresponding movement of the input device. Thus in this description the term 'display input position' is used to refer to a position on the display associated with the relative position of the input device. It should be noted that the 'display input position' is not necessarily actually displayed (e.g. by a cursor) on the display, as this may be unnecessary, for example in a touch input system.
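As an illustration of this mapping, and under the assumption of a simple linear coordinate mapping (the scale factor is hypothetical), the display input position for a relative device such as a mouse might be derived as follows; for a touch input, the display input position is the touch point itself.

```typescript
interface Point { x: number; y: number; }

// For a relative input device (e.g. a mouse), map device movement into a
// new display input position; the scale factor is an illustrative assumption.
function nextDisplayInputPosition(prev: Point, deviceDx: number,
                                  deviceDy: number, scale = 1.0): Point {
  return { x: prev.x + deviceDx * scale, y: prev.y + deviceDy * scale };
}
```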
In the situation as exemplified by Figures 2(a) and 2(b), it is not possible for the user to drag Object A 102 to position 106 without stopping manipulation and performing some other action. For example, the user may have to set a property for Object A 102 as "Contained = False" within a menu option to allow Object A 102 to be moved to position 106.
A second problem can be understood with reference to Figures 3(a) to 3(c). Rather than the manipulation of Object A 102 being stopped, Object A 102 may undergo a change in a specific manipulation. With reference to Figure 3(a), again the situation of Figure 1(b) is established and Object A 102 is contained within Object B 104. A user attempts to drag Object A to a location denoted by reference numeral 108, and marked in Figure 3(a) by a cross. A vector 110 illustrates the path a user intends Object A 102 to follow as it is dragged under the control of the input device, i.e. vector 110 represents the path travelled by the display input position.
As illustrated in Figure 3(b), for the first part of the movement of the display input position, Object A 102 is freely manipulated along the path 110. This occurs until the perimeter of Object A 102 abuts the perimeter of Object B 104.
As illustrated in Figure 3(c), for the further part of the display input position movement along path 110, Object A 102 is itself restricted to movement within the boundary of Object B 104, and in this example moves vertically up along a vector path 112 to finish restricted in a higher position corresponding to a vertical coordinate of the location 108, but does not move further in the horizontal direction.
Therefore with this second problem, the initially free manipulation of Object A 102 is changed to a vertical only manipulation. The application allows movement in any direction provided that movement retains the contained object within the container object.
In accordance with the invention and its embodiments, in the example scenarios as described above, a user is provided with the means to optionally allow a contained object to become uncontained and to be moved to a position such as position 106 in Figures 2(a) and 2(b) or position 108 in Figures 3(a) to 3(c), after the point at which the application would ordinarily prevent or alter the positional manipulation, as part of the ongoing positional manipulation attempt and without having to stop the positional manipulation attempt in order to perform some other action.
The foregoing examples set out problems associated with object positional manipulation of a specific type, relating to contained objects. The invention provides a solution to these problems, but is not limited to this type of positional manipulation scenario. Further hereinbelow it will be described how the invention addresses the problems for the specific type of positional manipulation scenario described above.
However, with reference to Figure 4, there is illustrated a process flow in accordance with the general principles of the invention, without being restricted to a particular type of positional manipulation scenario.
The invention relies on the feature that during unrestricted object manipulation, the display input position associated with an input device does not normally alter with respect to its position on the displayed object which is being positionally manipulated. That is, the location of the display input position on the object does not normally alter, which position may be referred to as the object selection position or object pick-up point.
Thus the invention relies upon the characteristic that during unrestricted positional manipulation of the object, the object selection position or object pick-up point on the object remains positioned directly under the display input position associated with the input device as the device moves. For example, if the input device picks up the center of an object and then moves, the center point of the object remains coincident with the display input position as it moves under control of the input device.
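One common way to realise this characteristic, sketched below under illustrative names, is to record the offset between the display input position and the object's origin at pick-up, and to apply the same offset on every subsequent movement so that the object selection point remains under the display input position.

```typescript
interface Point { x: number; y: number; }

// At pick-up, record where on the object the input landed: the object
// selection point, expressed as an offset from the object's origin.
function grabOffset(objectOrigin: Point, inputPos: Point): Point {
  return { x: inputPos.x - objectOrigin.x, y: inputPos.y - objectOrigin.y };
}

// On each movement, reposition the object so that the selection point
// stays directly under the current display input position.
function followInput(inputPos: Point, offset: Point): Point {
  return { x: inputPos.x - offset.x, y: inputPos.y - offset.y };
}
```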
With reference to Figure 4, in a step 200 a software application under the control of which an object is displayed detects selection of the object by a user input. For the purposes of this description, it is assumed that all inputs are provided under the control of a user of a software application in which positional manipulation of objects is performed.
In a step 202, the software application detects that the object is being positionally manipulated under the control of the user input. The invention is concerned with a positional manipulation which is associated with the input selecting and positionally adjusting the object, e.g. by movement, resizing or rotation. The display input position of the input device is coincident with a part of the object, and maintains the same relative position with respect to the object as the object moves such that the object moves with the display input position.
In a step 204 the software application then monitors the positional manipulation of the object.
In a step 206 the software application determines if the positional manipulation of the object is permitted. As long as the positional manipulation is permitted, the software application continues to monitor the positional manipulation in step 204.
If in step 206 it is determined at any instant that the positional manipulation under the control of the user input is not a permitted positional manipulation, the process progresses to step 208.
In step 208 the software application stops the positional manipulation of the object.
In step 210 the software application monitors any further user input. More particularly the software application monitors the user input for input events associated with positional manipulation, i.e. movement of the display input position.
In a step 212 the application stores the location of the display input position at the instant the positional manipulation was stopped by the software application.
In a step 214, the difference between the current location of the display input position and the stored location of the display input position at the instant positional manipulation was prevented is optionally determined. The difference is compared to a threshold distance value D. If the distance is greater than D, the process moves to step 220. Otherwise the process moves to step 216.
In a step 216, the time elapsed since the positional manipulation of the object was stopped is optionally determined. The time elapsed is compared to a threshold time T. If the time elapsed is greater than T, the process moves to step 220. Otherwise the process moves to step 218.
Steps 214 and 216 provide options in which a determination is made to take some further action based on either a distance or time threshold being exceeded. In embodiments only one or the other, or both, may be implemented. In other embodiments alternative criteria may be provided and implemented in isolation or in combination.
In step 218 it is determined if the user input has been continuously maintained since the manipulation was terminated.
If not, the process proceeds to step 224. If so, the process returns to step 214.
In a touch display system, step 218 may be used to monitor that a contact is maintained with the display surface. The process may be terminated on detection of a 'touch up' condition. For other input devices some other condition may be monitored, for example a mouse button being held depressed in a mouse input scenario.
In step 220 the application optionally displays to a user options associated with taking further action, i.e. a positional manipulation further to the original positional manipulation.
In this described example step 220 allows a user to confirm that an additional action is to be carried out. In an alternative embodiment the user may be given no such option, with the further action implemented automatically on determination of the relevant condition.
In a step 222 it is then determined whether the user selects the additional action. If the additional action is selected then in step 226 the further action is enabled.
Otherwise the process proceeds to step 224. After further action is enabled in step 226, the process reverts to step 200 and repeats.
In step 224 the manipulation is terminated on the basis of the status of the object at the time positional manipulation was stopped, in accordance with the conventional operation of the software application.
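The decision logic of steps 206 to 226 may be sketched as follows. This is a minimal illustration assuming a pixel distance threshold D and a millisecond time threshold T; all names and values are hypothetical rather than prescribed by the described embodiments.

```typescript
interface Point { x: number; y: number; }

const D = 40;   // step 214: illustrative distance threshold (pixels)
const T = 1000; // step 216: illustrative time threshold (milliseconds)

let stoppedAt: Point | null = null; // step 212: input position when stopped
let stoppedTime = 0;

// Steps 208/212: the application stops the manipulation and stores the
// display input position and the time at that instant.
function onManipulationPrevented(inputPos: Point): void {
  stoppedAt = { x: inputPos.x, y: inputPos.y };
  stoppedTime = Date.now();
}

// Steps 210-218: called for each further input event. Returns "override"
// when a threshold is exceeded (proceed to step 220), "terminate" when the
// input is released (step 224), and "waiting" otherwise.
function onFurtherInput(inputPos: Point,
                        inputMaintained: boolean): "override" | "terminate" | "waiting" {
  const anchor = stoppedAt;                 // step 212 value, captured locally
  if (anchor === null) return "waiting";
  if (!inputMaintained) return "terminate"; // step 218 fails -> step 224
  const dist = Math.hypot(inputPos.x - anchor.x, inputPos.y - anchor.y);
  if (dist > D) return "override";                     // step 214 -> step 220
  if (Date.now() - stoppedTime > T) return "override"; // step 216 -> step 220
  return "waiting";                                    // loop back to step 214
}
```

On an "override" result the application may, per steps 220 and 222, either prompt the user to confirm the further action or, in the alternative embodiment, enable it automatically (step 226).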
The general principles of the invention can be further understood with reference to its application to the earlier described problems, and with further reference to Figures 5(a) to 5(d). As shown in Figure 5(a), a finger input 114 is placed on Object A 102 such that contact with the object is made at a specific location 116 of Object A 102. As the finger input is moved, Object A 102 moves accordingly, and the position of Object A with respect to the finger input is such that the contact point 116 will continue to remain directly under the finger input. In the example scenario of Figure 5(a) the display input position corresponds to the finger contact point.
Returning to the problems described above, and assuming Object A 102 is being positionally manipulated with the single finger input 114 at contact point 116 as shown in Figure 5(a), the following adaptation is made.
When Object A 102 reaches the boundary of Object B 104, as illustrated by the path 118 traversed by Object A in Figure 5(b), the ongoing specific positional manipulation of the display of Object A is prevented or altered by the application as described with reference to the two respective problems discussed above. Object A 102 will no longer move in the direction of the input. At this point the position of Object A 102 and the display input position are noted.
The input device (i.e. the finger 114) then continues to move toward the destination point 106, as illustrated by the vector 120 in Figure 5(c). Due to the restriction of the containing object, the display input position moves away from, i.e. becomes disconnected from, its position on Object A 102, i.e. position 116.
During this operation, the input device continues to generate positional manipulation events to the software application. Together with the fact that Object A 102 has stopped moving in the direction of the input device, i.e. along vector 120, this allows the application to determine that the display input position no longer coincides with location 116 on Object A 102, i.e. it has become disconnected.
An appropriate condition as described in Figure 4 is detected, and as shown in Figure 5(d), Object A 102 is positionally manipulated such that its displayed position is moved to a location where the original contact point 116 is again coincident with the target point 106, which corresponds to the current display input position.
The invention is enabled on detection of the condition that the display input position no longer coincides with the original object selection position or pick-up position.
Once the display input position has become disconnected from the object selection position (its original position with respect to Object A 102), and Object A 102 is no longer moving with the display input position, a further test can be made to determine whether to free Object A 102 from the Object B 104 container, such as steps 214 and 216 of Figure 4.
Examples of such tests, which may be carried out independently or in combination, include:
* Checking that the position of the disconnected input no longer falls within any part of the container object (i.e. Object B 104).
* Checking that the distance between the position of the disconnected input and the position of the location 116 on the contained object (the disconnect distance) is greater than a pre-determined distance (step 214 of Figure 4).
* Checking that the elapsed time of any disconnect is larger than a predetermined time (step 216 of Figure 4).
* Checking for any combinations of the above. For example, checking that the disconnect distance is greater than a pre-determined distance for a predetermined time.
The invention is not limited to the nature of any subsequent action or actions responsive to detection of a suitable condition. In the above described example, the subsequent action is preferably for Object A 102 to become uncontained from Object B 104, and then for its initial ongoing manipulation to continue. Such a further action is clearly associated with the exemplary description relating to an object containment scenario.
In another example in the object containment scenario, the subsequent action may be, for example, to delete Object A, or for an uncontained copy of Object A to be made and moved along with the input device.
Other actions may be provided in other scenarios of object positional manipulation, other than object containment scenarios.
A user may be given a visual notification of the potential further action on the object, corresponding to step 220 of Figure 4, due to the positional manipulation of the object in the expected manner being stopped. At this point the user may have the option of returning the display input position to the original object selection point or pick-up point (the position of 'no disconnect') on the object before a further test condition is met. By doing this, it is possible for the user to prevent or cancel any further action on the object.
Examples are:
* Once Object A 102 stops moving, the user may have to move the display input position at least a certain number of pixels away from the original object selection point on the object in order for Object A to become uncontained, or return the display input position back to the contact point on the object to keep it contained.
* Once Object A stops moving, the user may have a predefined time period to keep the display input position disconnected from the original object selection point on Object A in order for Object A to become uncontained, or to return the display input position back to the contact point on Object A to keep it contained.
With reference to Figure 6 there is illustrated an exemplary implementation of a computer system for performing the functionality of the method of Figure 4. Only those functional elements are illustrated which are associated with the implementation of an embodiment of the invention, and additional functional elements will be required to fully implement a computer system. The exemplary computer system includes a touch sensitive display surface, but the invention is not limited to such a display device.
The exemplary computer system comprises a touch sensitive display driver 302; coordinate detection circuitry 304; a positional manipulation controller 306, such as a processor; a reset control circuit 324; a memory 308; a comparator circuit 310; an elapsed time circuit 312; a countdown clock circuit 314; a logic circuit 316; an object rendering circuit 318; an object parameter store 320; and a display driver 322.
The touch sensitive display driver 302 receives input data from the touch sensitive display surface (not shown), which data is passed to the coordinate detection circuitry 304 for determination of a position on the display associated with the received data. The positional information associated with the received data is forwarded to the positional manipulation controller 306, the reset control circuit 324, the object rendering circuit 318, and the comparator 310.
In normal operation under the control of an application, the object rendering circuit 318 renders a displayed object based on object parameters provided by the object parameter store 320 (which may be part of the memory 308), and provides the rendered object to the display driver 322 for display on the touch sensitive display. The positional information of detected inputs received by the object rendering circuit is used to update the position of the displayed object on the display.
In accordance with embodiments of the invention the positional manipulation controller 306 receives the position information from the coordinate detection circuitry, and determines whether an input is associated with a positional manipulation which, under the control of the software application, is to be prevented or altered.
In the event that the positional manipulation controller 306 determines that an input is associated with a positional manipulation which is to be prevented or altered, then the positional manipulation controller 306 sets a control signal INHIBIT which is transmitted to the logic circuit 316. The logic circuit then sets an enable/disable signal EN/DIS to the object rendering circuit 318, to disable further updating of the position of the displayed object (or to restrict it from being altered in a particular way).
The control signal INHIBIT is also provided to the comparator circuit 310, the elapsed time circuit 312, and the countdown clock circuit 314 to enable the operation of these circuits.
The comparator 310 provides the functionality of step 214 of Figure 4, and receives the current position information of the input and the stored position information at the time the control signal INHIBIT is set, which is stored in the memory 308. The comparator 310 compares the difference between these two positions to the distance D, and sets a first override signal OR1 to the logic circuit 316 on determination of the condition that D is exceeded.
The elapsed time circuit 312 performs the functionality of step 216 of Figure 4, and receives the value of the countdown clock circuit 314, which is initiated when the control signal INHIBIT is set. The elapsed time circuit 312 compares the elapsed time to the time T, and sets a second override signal OR2 to the logic circuit 316 on determination of the condition that T is exceeded.
The logic circuit 316 is adapted, in accordance with the desired implementation, to over-ride the control signal INHIBIT from the positional manipulation controller 306 when one or both of the override signals OR1 and OR2 are set. When over-ridden, the logic circuit 316 then sets the enable/disable signal EN/DIS to the object rendering circuit 318, to enable further updating of the position of the displayed object in accordance with the position of the received inputs.
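The behaviour of the logic circuit 316 reduces to simple combinational logic; a one-line sketch (an illustration of that logic, not a description of specific hardware) is:

```typescript
// EN/DIS output of logic circuit 316: updating of the displayed object is
// enabled unless INHIBIT is set, except that either override signal (OR1:
// distance D exceeded; OR2: time T exceeded) over-rides the inhibit.
function enableSignal(inhibit: boolean, or1: boolean, or2: boolean): boolean {
  return !inhibit || or1 || or2;
}
```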
The invention and its embodiments are broadly applicable to any computer system including a display, and in which an object displayed on a graphical user interface on the display can be positionally manipulated under the control of one or more user inputs.
In Figure 7, there is illustrated an exemplary environment in which the principles of the present invention may be utilised in an interactive display system. Figure 7 illustrates an interactive display system generally denoted by reference numeral 400, including a touch-sensitive display system 401, and a computer 408.
The touch-sensitive display system 401 includes a touch-sensitive display surface 402, on which is displayed graphical elements such as displayed objects as described herein in various embodiments. Also illustrated in Figure 7 is a hand 404 of a user being used at the touch-sensitive display surface 402 to provide inputs.
The computer 408 is connected to the touch-sensitive display system 401 via a connection 416, which may be wireless or wired. Computer software, including applications running on the computer system 408, controls the display of graphical information on the touch-sensitive display surface 402, and is controlled by inputs detected at the touch-sensitive display surface 402. The computer system 408 may be provided with its own monitor 410, which may display the same information as displayed on the interactive display surface 402 or different information. As illustrated in Figure 7, the computer system 408 may also be connected to other types of input device, such as a mouse input device 412 and a keyboard input device 414, and receive and process inputs from such devices.
An interactive display system 400 as illustrated in Figure 7 may be provided in a classroom environment for educational purposes.
The touch-sensitive display system 401 may be provided as a large vertical display surface, which is viewable by a large number of users in, for example, a classroom. The touch-sensitive display system 401 may be provided as a horizontal display surface comprising an interactive table, which is usable by a number of users in a classroom.
The touch-sensitive display system 401 provides for collaborative working by a number of users.
The provision of touch inputs at the touch-sensitive display surface 402 may be provided by a teacher or students in a classroom.
Figure 7 represents only one exemplary implementation of the invention.
The touch-sensitive display system 401 may also be provided as a small-scale device, such as a portable or hand-held device, with a touch-sensitive display surface.
The invention is not limited to any particular type of display technology. The touch-sensitive display system 401 of Figure 7 may, for example, comprise an emissive display surface.
The touch-sensitive display system 401 of Figure 7 may comprise a surface onto which images are projected by a projector.
Further, and as noted hereinabove, the invention is not limited in its applicability to touch-sensitive display systems, and in general is applicable to any computer system associated with a display, whether an interactive display system or otherwise.
As will be clear to one skilled in the art, numerous embodiments of interactive display systems may be used to practice the present invention, e.g. to run the methods described herein as part of an interactive program stored on storage media such as computer memory storage or peripheral memory storage which is accessible by a computer, and the above description of an exemplary application in an interactive display system is not intended to limit the breadth of coverage provided by the invention. For example, the program software for practicing the present invention may be stored on a disk (such as a CD or other data storage media), may be downloaded into a computer's memory to be run, or may run from disk.
In general the invention may be implemented in software, the software comprising computer program code which, when executed on a computer, causes the computer to operate in accordance with methods as described herein. The computer program code may be stored on any type of computer readable medium.
In general the invention may be implemented in a computer system, by providing and/or adapting computer hardware to operate in accordance with functionality as described herein.
All examples and embodiments described herein may be combined in various combinations, and are not mutually exclusive.
The invention has been described herein by way of reference to particular examples and exemplary embodiments. One skilled in the art will appreciate that the invention is not limited to the details of the specific examples and exemplary embodiments set forth. Numerous other embodiments may be envisaged without departing from the scope of the invention, which is defined by the appended claims.

Claims (1)

1. A method for manipulating a displayed object of a graphical user interface, comprising: manipulating the object under the control of a user input in accordance with a positional manipulation permitted by an application under the control of which the object is displayed; and detecting a condition under which the further positional manipulation of the object is prevented by the application, and in response thereto: monitoring further user input; and in dependence upon the further user input meeting a predetermined condition, further positionally manipulating the object.

2. The method according to claim 1 wherein the step of detecting a condition under which further manipulation of the object is prevented comprises detecting a deviation in the positional relationship between the user input and the displayed object.

3. The method of claim 2 wherein the user input is associated with a position on the graphical user interface, and wherein the step of positionally manipulating the object under the control of a user input comprises selecting the object by locating the position associated with the user input on the object at an object selection point, wherein the step of detecting deviation comprises the step of detecting a deviation between the position associated with the user input and the object selection point on the object.

4. The method of any one of claims 1 to 3 wherein the step of positionally manipulating the object is under the control of movement of the user input.

5. The method of any one of claims 1 to 4 wherein the step of positionally manipulating the object comprises one or more of: moving the object, resizing the object, or rotating the object.

6. The method according to any one of claims 1 to 5 further comprising, on detecting the condition under which the further manipulation of the object is prevented, storing the positional relationship between the user input and the displayed object, wherein the step of monitoring the further user input comprises monitoring a subsequent positional relationship between the user input and the displayed object.

7. The method of any one of claims 1 to 6 wherein the step of further manipulating the object comprises enabling an action determined by the further user input.

8. The method of any one of claims 2 to 7 wherein the step of further manipulating the object comprises restoring the relationship between the user input and the displayed object.

9. The method of any one of claims 2 to 8 wherein the predetermined condition comprises maintaining the deviation in the positional relationship between the user input and the displayed object for a predetermined length of time.

10. The method of any one of claims 2 to 9 wherein the predetermined condition comprises determining if the deviation in the positional relationship between the user input and the displayed object exceeds a predetermined distance.

11. The method of any preceding claim wherein the step of further manipulating the object is further dependent upon a selection of the further manipulation.

12. The method of claim 11 when dependent upon any one of claims 2 to 10, wherein if the selection is not made the positional relationship between the user input and the displayed object is restored.

13. A computer program which, when run on a computer, performs the method of any one of claims 1 to 12.

14. A computer program product for storing computer program code which, when run on a computer, performs the method of any one of claims 1 to 12.

15. A controller for a graphical user interface, adapted to: manipulate an object displayed in the graphical user interface under the control of a user input in accordance with a positional manipulation permitted by an application under the control of which the object is displayed; and detect a condition under which the further positional manipulation of the object is prevented by the application, and in response thereto the controller is further adapted to: monitor further user input; and in dependence upon the further user input meeting a predetermined condition, further positionally manipulate the object.

16. The controller according to claim 15 further adapted to detect a deviation in the positional relationship between the user input and the displayed object in order to detect a condition under which further manipulation of the object is prevented.

17. The controller of claim 16 wherein the user input is associated with a position on the graphical user interface, the controller further being adapted to select the object by locating the position associated with the user input on the object at an object selection point to positionally manipulate the object under the control of a user input, and to detect a deviation between the position associated with the user input and the object selection point on the object to detect deviation.

18. The controller of any one of claims 15 to 17 further adapted to control the positional manipulation of the object under the control of movement of the user input.

19. The controller of any one of claims 15 to 18 further adapted to positionally manipulate the object by one or more of: moving the object, resizing the object, or rotating the object.

20. The controller according to any one of claims 15 to 19 further adapted, on detecting the condition under which the further manipulation of the object is prevented, to store the positional relationship between the user input and the displayed object, and adapted to monitor a subsequent positional relationship between the user input and the displayed object to monitor the further user input.

21. The controller of any one of claims 15 to 20 further adapted to enable an action determined by the further user input for the further manipulation of the object.

22. The controller of any one of claims 15 to 21 further adapted to restore the relationship between the user input and the displayed object as the further manipulation of the object.

23. The controller of any one of claims 16 to 22 further adapted to define the predetermined condition as maintaining the deviation in the positional relationship between the user input and the displayed object for a predetermined length of time.

24. The controller of any one of claims 16 to 23 further adapted to define the predetermined condition as determining if the deviation in the positional relationship between the user input and the displayed object exceeds a predetermined distance.

25. The controller of any one of claims 15 to 24 further adapted to further manipulate the object in dependence upon a selection of the further manipulation.

26. The controller of claim 25 when dependent upon any one of claims 16 to 24, wherein if the selection is not made the positional relationship between the user input and the displayed object is restored.

27. A computer system comprising a display and a controller according to any one of claims 15 to 25.

28. An interactive display system including a controller according to any one of claims 15 to 26.

29. An interactive display system comprising a touch sensitive interactive display surface and a controller according to any one of claims 15 to 26.

30. An interactive display system according to claim 29 wherein the interactive display surface is horizontally disposed.

31. A controller for controlling the positional manipulation of a displayed object of a graphical user interface, comprising: means for manipulating the object under the control of a user input in accordance with a positional manipulation permitted by an application under the control of which the object is displayed; and means for detecting a condition under which the further positional manipulation of the object is prevented by the application, and in response thereto, the controller further comprising: means for monitoring further user input; and in dependence upon the further user input meeting a predetermined condition, means for further positionally manipulating the object.

32. A controller according to claim 31 wherein the means for detecting a condition under which further manipulation of the object is prevented comprises means for detecting a deviation in the positional relationship between the user input and the displayed object.

33. A controller according to claim 32 wherein the user input is associated with a position on the graphical user interface, and wherein the means for positionally manipulating the object under the control of a user input comprises means for selecting the object by locating the position associated with the user input on the object at an object selection point, wherein the means for detecting deviation comprises means for detecting a deviation between the position associated with the user input and the object selection point on the object.

34. An apparatus for controlling the positional manipulation of a displayed object of a graphical user interface, comprising: a positional manipulation controller for manipulating the object under the control of a user input in accordance with a positional manipulation permitted by an application under the control of which the object is displayed, and for detecting a condition under which the further positional manipulation of the object is prevented by the application, the apparatus further comprising: circuitry for monitoring further user input; and in dependence upon the further user input meeting a predetermined condition, circuitry for further positionally manipulating the object.

35. An apparatus according to claim 34 wherein the controller for detecting a condition under which further manipulation of the object is prevented is arranged to detect a deviation in the positional relationship between the user input and the displayed object.

36. An apparatus according to claim 35 wherein the user input is associated with a position on the graphical user interface, and wherein the controller for positionally manipulating the object under the control of a user input comprises circuitry for selecting the object by locating the position associated with the user input on the object at an object selection point, wherein the circuitry for detecting deviation comprises circuitry for detecting a deviation between the position associated with the user input and the object selection point on the object.
GB201118752A 2011-10-31 2011-10-31 Positional manipulation of an object in a GUI beyond that permitted by an associated application Withdrawn GB2498508A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
GB201118752A GB2498508A (en) 2011-10-31 2011-10-31 Positional manipulation of an object in a GUI beyond that permitted by an associated application
PCT/EP2012/070626 WO2013064378A1 (en) 2011-10-31 2012-10-18 Gui object manipulation prevention and over-ride

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
GB201118752A GB2498508A (en) 2011-10-31 2011-10-31 Positional manipulation of an object in a GUI beyond that permitted by an associated application

Publications (2)

Publication Number Publication Date
GB201118752D0 GB201118752D0 (en) 2011-12-14
GB2498508A true GB2498508A (en) 2013-07-24

Family

ID=45375566

Family Applications (1)

Application Number Title Priority Date Filing Date
GB201118752A Withdrawn GB2498508A (en) 2011-10-31 2011-10-31 Positional manipulation of an object in a GUI beyond that permitted by an associated application

Country Status (2)

Country Link
GB (1) GB2498508A (en)
WO (1) WO2013064378A1 (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1674977A2 (en) * 2004-12-21 2006-06-28 Microsoft Corporation Pressure sensitive graphical controls
US20090015568A1 (en) * 2007-07-12 2009-01-15 Koski David A Method and Apparatus for Implementing Slider Detents
US20100005420A1 (en) * 2008-07-07 2010-01-07 International Business Machines Corporation Notched slider control for a graphical user interface
WO2010024969A1 (en) * 2008-08-26 2010-03-04 Apple Inc. Dynamic control of list navigation based on list item properties
US20110169753A1 (en) * 2010-01-12 2011-07-14 Canon Kabushiki Kaisha Information processing apparatus, information processing method thereof, and computer-readable storage medium

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2005346583A (en) * 2004-06-04 2005-12-15 Canon Inc Image display apparatus, multi-display system, coordinate information output method, and control program thereof
TWI358028B (en) * 2007-12-25 2012-02-11 Htc Corp Electronic device capable of transferring object b
JP2010176332A (en) * 2009-01-28 2010-08-12 Sony Corp Information processing apparatus, information processing method, and program
JP5606686B2 (en) * 2009-04-14 2014-10-15 ソニー株式会社 Information processing apparatus, information processing method, and program

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1674977A2 (en) * 2004-12-21 2006-06-28 Microsoft Corporation Pressure sensitive graphical controls
US20090015568A1 (en) * 2007-07-12 2009-01-15 Koski David A Method and Apparatus for Implementing Slider Detents
US20100005420A1 (en) * 2008-07-07 2010-01-07 International Business Machines Corporation Notched slider control for a graphical user interface
WO2010024969A1 (en) * 2008-08-26 2010-03-04 Apple Inc. Dynamic control of list navigation based on list item properties
US20110169753A1 (en) * 2010-01-12 2011-07-14 Canon Kabushiki Kaisha Information processing apparatus, information processing method thereof, and computer-readable storage medium

Also Published As

Publication number Publication date
WO2013064378A1 (en) 2013-05-10
GB201118752D0 (en) 2011-12-14

Similar Documents

Publication Publication Date Title
CA2729392C (en) Method of manipulating assets shown on a touch-sensitive display
JP5230733B2 (en) Pointer control unit
KR102052771B1 (en) Cross-slide gesture to select and rearrange
US20170185261A1 (en) Virtual reality device, method for virtual reality
US20130063384A1 (en) Electronic apparatus, display method, and program
US20100053221A1 (en) Information processing apparatus and operation method thereof
US20110157027A1 (en) Method and Apparatus for Performing an Operation on a User Interface Object
US8769409B2 (en) Systems and methods for improving object detection
US10359904B2 (en) Graphic user interface pointer control
US20170109026A1 (en) Dial control for touch screen navigation
US20180188919A1 (en) System and method to control a touchscreen user interface
US10310705B2 (en) Menu display control
US20150220242A1 (en) System and method for remote controlling computing device
EP3044665B1 (en) System and method for remote computer control
US20110199517A1 (en) Method of showing video on a touch-sensitive display
US10073617B2 (en) Touchscreen precise pointing gesture
KR20110094693A (en) Apparatus and method for providing user interface
JP6876557B2 (en) Display control program, display control method and display control device
GB2498508A (en) Positional manipulation of an object in a GUI beyond that permitted by an associated application
US10698601B2 (en) Second touch zoom control
US20140380188A1 (en) Information processing apparatus
US9213555B2 (en) Off-screen window controls
WO2021062948A1 (en) Element adding method and apparatus, and electronic device
JP2017157046A (en) Information processor, information processing program and information processing method
KR20200094583A (en) Method and apparatus for providing contents platform using vr interface

Legal Events

Date Code Title Description
732E Amendments to the register in respect of changes of name or changes affecting rights (sect. 32/1977)

Free format text: REGISTERED BETWEEN 20130919 AND 20130925

WAP Application withdrawn, taken to be withdrawn or refused ** after publication under section 16(1)